WorldWideScience

Sample records for priori defined reference

  1. 'Aussie normals': an a priori study to develop clinical chemistry reference intervals in a healthy Australian population.

    Science.gov (United States)

    Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E

    2015-02-01

    Development of reference intervals is difficult, time-consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire and exclusion was based on conditions such as pregnancy, diabetes, renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECT ci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices and those deemed unsuitable after clinical evaluation were removed from the database. Reference intervals were partitioned based on the method of Harris and Boyd into three scenarios: combined genders; males and females separately; and partitioned by age and gender. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturers' analytical methods using method transference.
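
    As a rough illustration of the kind of computation behind a direct a priori reference interval study, the sketch below derives nonparametric 95% reference limits (2.5th and 97.5th percentiles) from a hypothetical data set and applies the Harris-Boyd z-test to decide whether male and female intervals should be partitioned. The column names and simulated analyte are illustrative assumptions, not details taken from the Aussie Normals study; the critical value uses the commonly quoted Harris-Boyd rule.

      # Minimal sketch (not the Aussie Normals pipeline): nonparametric reference
      # limits with a Harris-Boyd partitioning check. All data are hypothetical.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "sex": rng.choice(["F", "M"], size=1000),
          "result": rng.normal(loc=14.0, scale=1.3, size=1000),  # e.g. a haemoglobin-like analyte
      })

      def reference_interval(values):
          """Nonparametric 2.5th-97.5th percentile reference interval."""
          return np.percentile(values, [2.5, 97.5])

      def harris_boyd(a, b):
          """Harris-Boyd z statistic and the commonly quoted critical value."""
          z = abs(a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
          z_crit = 3 * np.sqrt((len(a) + len(b)) / 240)
          return z, z_crit

      females = df.loc[df.sex == "F", "result"].to_numpy()
      males = df.loc[df.sex == "M", "result"].to_numpy()
      z, z_crit = harris_boyd(females, males)
      if z > z_crit:
          print("Partition by sex:", reference_interval(females), reference_interval(males))
      else:
          print("Combined interval:", reference_interval(df["result"].to_numpy()))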

  2. Art Priori = Art Priori / Kristel Jakobson

    Index Scriptorium Estoniae

    Jakobson, Kristel, 1983-

    2015-01-01

    Restaurant Art Priori in Tallinn's Old Town, Olevimägi 7. Interior design by Kristel Jakobson (Haka Disain). Annual award of the Estonian Association of Interior Architects 2014/2015 for the best restaurant. Includes a brief profile of Kristel Jakobson.

  3. Total Evidence, Uncertainty and A Priori Beliefs

    NARCIS (Netherlands)

    Bewersdorf, Benjamin; Felline, Laura; Ledda, Antonio; Paoli, Francesco; Rossanese, Emanuele

    2016-01-01

    Defining the rational belief state of an agent in terms of her initial or a priori belief state as well as her total evidence can help to address a number of important philosophical problems. In this paper, I discuss how this strategy can be applied to cases in which evidence is uncertain. I argue

  4. Time-discrete higher order ALE formulations: a priori error analysis

    KAUST Repository

    Bonito, Andrea; Kyza, Irene; Nochetto, Ricardo H.

    2013-01-01

    We derive optimal a priori error estimates for discontinuous Galerkin (dG) time discrete schemes of any order applied to an advection-diffusion model defined on moving domains and written in the Arbitrary Lagrangian Eulerian (ALE) framework. Our

  5. The Mediterranean Diet: its definition and evaluation of a priori dietary indexes in primary cardiovascular prevention.

    Science.gov (United States)

    D'Alessandro, Annunziata; De Pergola, Giovanni

    2018-01-18

    We have analysed the definition of the Mediterranean Diet in 28 studies included in six meta-analyses evaluating the relation between the Mediterranean Diet and primary prevention of cardiovascular disease. Some typical foods of this dietary pattern, like whole cereals, olive oil and red wine, were taken into account only in a few a priori indexes, and the dietary pattern defined as Mediterranean showed many differences among the studies and compared with the traditional Mediterranean Diet of the early 1960s. Altogether, the analysed studies show a protective effect of the Mediterranean Diet against cardiovascular disease but present different effects against specific conditions such as cerebrovascular disease and coronary heart disease. These different effects might depend on the definition of the Mediterranean Diet and on the indexes used to measure adherence to it. To compare the effects of the Mediterranean Diet against cardiovascular disease, coronary heart disease and stroke, a univocal model of the Mediterranean Diet should be established as a reference, and it might be represented by the Modern Mediterranean Diet Pyramid. The a priori index used to evaluate adherence to the Mediterranean Diet might be the Mediterranean-Style Dietary Pattern Score, which has some advantages in comparison with the other a priori indexes.

  6. Solution of underdetermined systems of equations with gridded a priori constraints.

    Science.gov (United States)

    Stiros, Stathis C; Saltogianni, Vasso

    2014-01-01

    The TOPINV, Topological Inversion algorithm (or TGS, Topological Grid Search) initially developed for the inversion of highly non-linear redundant systems of equations, can solve a wide range of underdetermined systems of non-linear equations. This approach is a generalization of a previous conclusion that this algorithm can be used for the solution of certain integer ambiguity problems in Geodesy. The overall approach is based on additional (a priori) information for the unknown variables. In the past, such information was used either to linearize equations around approximate solutions, or to expand systems of observation equations solved on the basis of generalized inverses. In the proposed algorithm, the a priori additional information is used in a third way, as topological constraints to the unknown n variables, leading to an R^n grid containing an approximation of the real solution. The TOPINV algorithm does not focus on point-solutions, but exploits the structural and topological constraints in each system of underdetermined equations in order to identify an optimal closed space in R^n containing the real solution. The centre of gravity of the grid points defining this space corresponds to global, minimum-norm solutions. The rationale and validity of the overall approach are demonstrated on the basis of examples and case studies, including fault modelling, in comparison with SVD solutions and true (reference) values, in an accuracy-oriented approach.
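
    The following toy sketch illustrates the general idea described above, though it is not the authors' TOPINV implementation: every point of a grid over the unknowns is tested against the observation equations within a tolerance, and the centre of gravity of the accepted points is returned as the approximate solution. The example system, bounds and tolerance are hypothetical.

      # Toy grid search in the spirit of a topological grid search: keep grid points
      # consistent with all observation equations, return their centre of gravity.
      import itertools
      import numpy as np

      def grid_search_solution(equations, bounds, tol, n=50):
          """equations: callables f(x) expected to be ~0 at the solution.
          bounds: (lo, hi) per unknown; tol: acceptance tolerance per equation."""
          axes = [np.linspace(lo, hi, n) for lo, hi in bounds]
          accepted = [np.array(p) for p in itertools.product(*axes)
                      if all(abs(f(np.array(p))) <= tol for f in equations)]
          return np.mean(accepted, axis=0) if accepted else None

      # Underdetermined example: one equation, two unknowns, a priori bounds on both.
      eqs = [lambda x: x[0] + 2.0 * x[1] - 3.0]
      print(grid_search_solution(eqs, bounds=[(-5, 5), (-5, 5)], tol=0.05))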

  7. Koncepce a priori Mikela Dufrenna

    Directory of Open Access Journals (Sweden)

    Felix Borecký

    2016-06-01

    Full Text Available The principal aim of this essay is to present Mikel Dufrenne’s conception of the a priori as an effort to overcome the sociological and historical relativism that dominates in aesthetics as in all other human sciences. Dufrenne endeavours to show that the understanding of sense cannot be derived only from an empirical framework. It does not suffice to consider only what one has experienced or the habits, norms and values of the society in which one has grown up and by means of which one perceives reality (common sense); human beings also have a priori constants that are of crucial significance for cognition. The second section of the article presents an interpretation of how Dufrenne delimits his conception of the a priori against epistemology, and, mainly, in the third section, against Kant. The article continues, in the fourth and the sixth sections respectively, with an endeavour to overcome subjectivism and intellectualism, and focuses, in the fifth section, on the a priori as a historical concept that is inseparably connected with the imaginary (discussed in the seventh section). All these points are completely heterogeneous with the traditional delimitations of the a priori according to the rules of logic. Dufrenne’s notion of the a priori constitutes an original contribution to solving the problem of cognitio sensitiva, which deals with the question of the extent to which it is possible to recognize universal truths on a sensuous individual being. Dufrenne is convinced that the aesthetic aspects of experience, such as the sensuous, the imaginary, and the corporeal, are the truest guide to recognition of the deep truths about human being in the world. Here is manifested the a priori basis that is common both to human beings and the world and guaranteed by Nature. The supreme examples of aesthetic phenomena are works of art and the supreme type of experience is the aesthetic experience.

  8. Conventional Principles in Science: On the foundations and development of the relativized a priori

    Science.gov (United States)

    Ivanova, Milena; Farr, Matt

    2015-11-01

    The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the

  9. LandScape: a simple method to aggregate p--Values and other stochastic variables without a priori grouping

    DEFF Research Database (Denmark)

    Wiuf, Carsten; Pallesen, Jonatan; Foldager, Leslie

    2016-01-01

    ...and the results might depend on the chosen criteria. Methods that summarize, or aggregate, test statistics or p-values, without relying on a priori criteria, are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method...
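
    The fragment above describes aggregation of test statistics or p-values without pre-defined groups. The sketch below is not the authors' LandScape method, only a generic illustration of the same idea: Fisher's combination over sliding windows of consecutive p-values, with the best window assessed by permutation. Window length and data are arbitrary.

      # Generic sliding-window aggregation of p-values (illustration only, not LandScape).
      import numpy as np

      def window_fisher(pvals, w):
          """Fisher-combined statistic -2*sum(log p) for every window of length w."""
          stat = -2 * np.log(pvals)
          return np.array([stat[i:i + w].sum() for i in range(len(pvals) - w + 1)])

      def best_window_significance(pvals, w, n_perm=1000, seed=0):
          """Permutation p-value for the largest windowed statistic (family-wise)."""
          rng = np.random.default_rng(seed)
          obs = window_fisher(pvals, w).max()
          null = [window_fisher(rng.permutation(pvals), w).max() for _ in range(n_perm)]
          return (1 + sum(n >= obs for n in null)) / (n_perm + 1)

      pvals = np.random.default_rng(1).uniform(size=200)
      pvals[50:55] = 1e-4          # a short run of small p-values
      print(best_window_significance(pvals, w=5))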

  10. A set-theoretic model reference adaptive control architecture for disturbance rejection and uncertainty suppression with strict performance guarantees

    Science.gov (United States)

    Arabi, Ehsan; Gruenwald, Benjamin C.; Yucelen, Tansel; Nguyen, Nhan T.

    2018-05-01

    Research in adaptive control algorithms for safety-critical applications is primarily motivated by the fact that these algorithms have the capability to suppress the effects of adverse conditions resulting from exogenous disturbances, imperfect dynamical system modelling, degraded modes of operation, and changes in system dynamics. Although government and industry agree on the potential of these algorithms in providing safety and reducing vehicle development costs, a major issue is the inability to achieve a priori, user-defined performance guarantees with adaptive control algorithms. In this paper, a new model reference adaptive control architecture for uncertain dynamical systems is presented to address disturbance rejection and uncertainty suppression. The proposed framework is predicated on a set-theoretic adaptive controller construction using generalised restricted potential functions. The key feature of this framework allows the system error bound between the state of an uncertain dynamical system and the state of a reference model, which captures a desired closed-loop system performance, to be less than an a priori, user-defined worst-case performance bound, and hence, it has the capability to enforce strict performance guarantees. Examples are provided to demonstrate the efficacy of the proposed set-theoretic model reference adaptive control architecture.
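
    The set-theoretic construction with generalised restricted potential functions is beyond a short snippet, but the sketch below shows the basic model reference adaptive control loop that such architectures build on, for a scalar plant with one unknown parameter, and simply monitors whether the tracking error stays inside a user-chosen bound (here the bound is only checked, not enforced as in the paper). Plant, gains and bound are hypothetical.

      # Minimal scalar MRAC sketch (illustration of the underlying loop only; the
      # paper's set-theoretic, bound-enforcing construction is not reproduced).
      a_true = 1.5           # unknown plant parameter: x_dot = a_true*x + u
      a_m, gamma = 2.0, 5.0  # reference-model pole and adaptation gain
      dt, T = 0.001, 10.0
      eps = 0.2              # user-defined worst-case error bound (monitored only)

      x = x_m = a_hat = 0.0
      worst_error = 0.0
      for k in range(int(T / dt)):
          t = k * dt
          r = 1.0 if (t % 4.0) < 2.0 else -1.0       # square-wave reference command
          u = -(a_hat + a_m) * x + a_m * r           # matches the model when a_hat = a_true
          e = x - x_m
          a_hat += gamma * e * x * dt                # Lyapunov-based adaptation law
          x += (a_true * x + u) * dt                 # forward-Euler plant integration
          x_m += (-a_m * x_m + a_m * r) * dt         # reference model
          worst_error = max(worst_error, abs(e))

      print(f"worst |x - x_m| = {worst_error:.3f} (user bound eps = {eps})")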

  11. Paleolimnological assessment of nutrient enrichment on diatom assemblages in a priori defined nitrogen- and phosphorus-limited lakes downwind of the Athabasca Oil Sands, Canada

    Directory of Open Access Journals (Sweden)

    Kathleen R. Laird

    2017-04-01

    Full Text Available As the industrial footprint of the Athabasca Oil Sands Region (AOSR) continues to expand, concern about the potential impacts of pollutants on the surrounding terrestrial and aquatic ecosystems needs to be assessed. An emerging issue is whether recent increases in lake production downwind of the development can be linked to AOSR activities, and/or whether changing climatic conditions are influencing lake nutrient status. To decipher the importance of pollutants, particularly atmospheric deposition of reactive nitrogen (Nr), and the effects of climate change as potential sources of increasing lake production, lakes from both within and outside of the nitrogen deposition zone were analyzed for historical changes in diatom assemblages. Lake sediment cores were collected from a priori defined nitrogen (N)- and phosphorus (P)-limited lakes within and outside the N plume associated with the AOSR. Diatom assemblages were quantified at sub-decadal resolution since ca. 1890 to compare conditions prior to oil sands expansion and regional climate warming with the more recent conditions in each group of lakes (Reference and Impacted, N- and P-limited lakes). Analyses of changes in assemblage similarity and species turnover indicate that changes in diatom assemblages were minimal both within and across all lake groups. Small changes in percent composition of planktonic taxa, particularly small centric taxa (Discostella and Cyclotella species) and pennate taxa, such as Asterionella formosa and Fragilaria crotonensis, occurred in some of the lakes. While these changes were consistent with potential climate effects on algal growth, water column stability and other factors, the timing and direction of biotic changes were variable among sites, suggesting that any apparent response to climate was lake dependent. The absence of a consistent pattern of diatom changes associated with receipt of reactive nitrogen or intrinsic nutrient-limitation status of the lake

  12. Analytical performance, reference values and decision limits. A need to differentiate between reference intervals and decision limits and to define analytical quality specifications

    DEFF Research Database (Denmark)

    Petersen, Per Hyltoft; Jensen, Esther A; Brandslund, Ivan

    2012-01-01

    of the values of analytical components measured on reference samples from reference individuals. Decision limits are based on guidelines from national and international expert groups defining specific concentrations of certain components as limits for decision about diagnosis or well-defined specific actions....... Analytical quality specifications for reference intervals have been defined for bias since the 1990s, but in the recommendations specified in the clinical guidelines analytical quality specifications are only scarcely defined. The demands for negligible biases are, however, even more essential for decision...... limits, as the choice is no longer left to the clinician, but emerge directly from the concentration. Even a small bias will change the number of diseased individuals, so the demands for negligible biases are obvious. A view over the analytical quality as published gives a variable picture of bias...
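
    A small worked illustration of the point about bias and decision limits (all numbers hypothetical): for a normally distributed analyte, even a modest analytical bias visibly changes the fraction of results exceeding a fixed decision limit, and hence the number of individuals flagged as diseased.

      # Hypothetical illustration of how a small analytical bias shifts the fraction
      # of results above a fixed decision limit (not data from the paper).
      from scipy.stats import norm

      mean, sd = 5.5, 0.6           # population mean and SD, e.g. in mmol/L (made up)
      limit = 7.0                   # decision limit, mmol/L
      for bias in (0.0, 0.1, 0.2):  # analytical bias in mmol/L
          frac = 1 - norm.cdf(limit, loc=mean + bias, scale=sd)
          print(f"bias {bias:.1f} mmol/L -> {100 * frac:.2f}% of results above the limit")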

  13. Optimal phase estimation with arbitrary a priori knowledge

    International Nuclear Information System (INIS)

    Demkowicz-Dobrzanski, Rafal

    2011-01-01

    The optimal-phase estimation strategy is derived when partial a priori knowledge on the estimated phase is available. The solution is found with the help of the most famous result from the entanglement theory: the positive partial transpose criterion. The structure of the optimal measurements, estimators, and the optimal probe states is analyzed. This Rapid Communication provides a unified framework bridging the gap in the literature on the subject which until now dealt almost exclusively with two extreme cases: almost perfect knowledge (local approach based on Fisher information) and no a priori knowledge (global approach based on covariant measurements). Special attention is paid to a natural a priori probability distribution arising from a diffusion process.

  14. Why Even Mind? -- On The A Priori Value Of “Life”

    Directory of Open Access Journals (Sweden)

    Amien Kacou

    2008-10-01

    Full Text Available This article presents an analysis of the matter of the “meaning” of life in terms of whether it should even be lived in the first place. It begins with an attempt at defining the question as an inquiry on the a priori value of attention in general, and develops into an axiological reflection distantly inspired from Martin Heidegger’s notion of “care.” The main objective of the article is (1) to “answer” the question (or to proceed as if the question could be answered objectively) by “playing along” with its naïve logic, that is, by finding a basis for comparing the good that can be found a priori in life (mainly, pleasure) with the good that can be found a priori in death (mainly, the absence of pain), and, then, (2) to suggest why we have no good reason to feel dissatisfied with where this leaves us (i.e., possibly facing a certain specter of ethical foundationalism: the question of the “value of value”). Its basic conclusion is, assuming we are committed to assigning value to life in general, that we should be able to say that life is good irrespective of any explanation for its existence.

  15. Elimination of hidden a priori information from remotely sensed profile data

    Directory of Open Access Journals (Sweden)

    T. von Clarmann

    2007-01-01

    Full Text Available Profiles of atmospheric state variables retrieved from remote measurements often contain a priori information which causes complication in the statistical use of data and in the comparison with other measured or modeled data. For such applications it often is desirable to remove the a priori information from the data product. If the retrieval involves an ill-posed inversion problem, formal removal of the a priori information requires resampling of the data on a coarser grid, which in some sense, however, is a prior constraint in itself. The fact that the trace of the averaging kernel matrix of a retrieval is equivalent to the number of degrees of freedom of the retrieval is used to define an appropriate information-centered representation of the data where each data point represents one degree of freedom. Since regridding implies further degradation of the data and thus causes additional loss of information, a re-regularization scheme has been developed which allows resampling without additional loss of information. For a typical ClONO2 profile retrieved from spectra as measured by the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), the constrained retrieval has 9.7 degrees of freedom. After application of the proposed transformation to a coarser information-centered altitude grid, there are exactly 9 degrees of freedom left, and the averaging kernel on the coarse grid is unity. Pure resampling on the information-centered grid without re-regularization would reduce the degrees of freedom to 7.1 (6.7) for a staircase (triangular) representation scheme.
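
    A synthetic numerical illustration of the quantity used above, the trace of the averaging kernel matrix as the number of degrees of freedom of a constrained retrieval (the matrices are random stand-ins, not MIPAS ClONO2 kernels):

      # Degrees of freedom of a regularised retrieval as trace of the averaging kernel
      # A = (K^T Se^-1 K + R)^-1 K^T Se^-1 K, with a synthetic Jacobian K and constraint R.
      import numpy as np

      rng = np.random.default_rng(0)
      n_alt, n_meas = 30, 20
      K = rng.normal(size=(n_meas, n_alt))      # Jacobian (synthetic)
      Se_inv = np.eye(n_meas)                   # inverse measurement error covariance
      R = 5.0 * np.eye(n_alt)                   # a priori / regularisation term

      G = np.linalg.solve(K.T @ Se_inv @ K + R, K.T @ Se_inv)   # gain matrix
      A = G @ K                                                 # averaging kernel
      print("degrees of freedom =", np.trace(A))                # < n_alt due to the constraint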

  16. A priori imaginace

    Directory of Open Access Journals (Sweden)

    Mikel Dufrenne

    2016-06-01

    Full Text Available In this article, Dufrenne argues that imagination need not be only the subjective capacity to invent the unreal (dreams, fantasies, but that it is actually capable of revealing images that bring human beings closer to the hidden plenitude of Nature. Among these a priori images, such as heaven, water, blood, and earth, Dufrenne emphasizes the elementary, power, depth, and purity, which he believes are the most fundamental of them. He considers a potential classification of these images on the principle of ontological quality.

  17. Time-discrete higher order ALE formulations: a priori error analysis

    KAUST Repository

    Bonito, Andrea

    2013-03-16

    We derive optimal a priori error estimates for discontinuous Galerkin (dG) time discrete schemes of any order applied to an advection-diffusion model defined on moving domains and written in the Arbitrary Lagrangian Eulerian (ALE) framework. Our estimates hold without any restrictions on the time steps for dG with exact integration or Reynolds' quadrature. They involve a mild restriction on the time steps for the practical Runge-Kutta-Radau methods of any order. The key ingredients are the stability results shown earlier in Bonito et al. (Time-discrete higher order ALE formulations: stability, 2013) along with a novel ALE projection. Numerical experiments illustrate and complement our theoretical results. © 2013 Springer-Verlag Berlin Heidelberg.
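
    For readers unfamiliar with the terminology, an "optimal a priori error estimate" for a dG(q) time discretization is a bound of the schematic type below, with time step τ and a constant independent of the discrete solution; the precise norms, ALE-specific terms and constants in the cited work are more detailed than this sketch.

      % Schematic form of an optimal a priori bound for a dG(q) time discretization
      % (illustrative only, not the exact statement proved in the paper).
      \| u - U \|_{L^\infty(0,T;L^2(\Omega))} \;\le\; C(u)\, \tau^{\,q+1}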

  18. A priori a matematika u Berkeleyho

    Czech Academy of Sciences Publication Activity Database

    Tomeček, Marek

    2014-01-01

    Vol. 62, No. 3 (2014), pp. 369-385. ISSN 0015-1831. R&D Projects: GA ČR(CZ) GAP401/11/0371. Institutional support: RVO:67985955. Keywords: a priori * mathematics * empiricism * George Berkeley. Subject RIV: AA - Philosophy; Religion

  19. Defining reference conditions for acidified waters using a modern analogue approach

    International Nuclear Information System (INIS)

    Simpson, Gavin L.; Shilland, Ewan M.; Winterbottom, Julie M.; Keay, Janey

    2005-01-01

    Analogue matching is a palaeolimnological technique that aims to find matches for fossil sediment samples from a set of modern surface sediment samples. Modern analogues were identified that closely matched the pre-disturbance conditions of eight of the UK Acid Waters Monitoring Network (AWMN) lakes using diatom- and cladoceran-based analogue matching. These analogue sites were assessed in terms of hydrochemistry, aquatic macrophytes and macro-invertebrates as to their suitability for defining wider hydrochemical and biological reference conditions for acidified sites within the AWMN. The analogues identified for individual AWMN sites show a close degree of similarity in terms of their hydrochemical characteristics, aquatic macrophytes and, to a lesser extent, macro-invertebrate fauna. The reference conditions of acidified AWMN sites are inferred to be less acidic than today and to support a wider range of acid-sensitive aquatic macrophyte and macro-invertebrate taxa than that recorded in the AWMN lakes over the period of monitoring since 1988. - The use of a palaeolimnological technique to identify modern ecological reference analogues for acidified lakes is demonstrated
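
    A toy sketch of the dissimilarity computation behind analogue matching: each fossil (pre-disturbance) assemblage is compared with every modern surface-sediment assemblage, and the closest matches are taken as analogues. Squared chord distance is one dissimilarity measure commonly used for diatom percentage data; the assemblages below are random, not AWMN data.

      # Toy analogue matching: nearest modern sample to a fossil sample by
      # squared chord distance on species proportions (random data).
      import numpy as np

      def squared_chord(a, b):
          return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

      rng = np.random.default_rng(0)
      modern = rng.dirichlet(np.ones(12), size=50)   # 50 modern samples, 12 taxa
      fossil = rng.dirichlet(np.ones(12))            # one pre-disturbance fossil sample

      distances = np.array([squared_chord(fossil, m) for m in modern])
      print("closest modern analogue:", distances.argmin(), "distance:", distances.min())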

  20. Kant, Reichenbach, and the Fate of A Priori Principles

    OpenAIRE

    de Boer, Karin

    2011-01-01

    This article contends that the relation of early logical empiricism to Kant was more complex than is often assumed. It argues that Reichenbach’s early work on Kant and Einstein, entitled The Theory of Relativity and A Priori Knowledge (1920), aimed to transform rather than to oppose Kant’s Critique of Pure Reason. On the one hand, I argue that Reichenbach’s conception of coordinating principles, derived from Kant’s conception of synthetic a priori principles, offers a valuable way of accounti...

  1. Verdad y apertura de mundo. El problema de los juicios sintéticos a priori tras el giro lingüístico

    Directory of Open Access Journals (Sweden)

    Cristina LAFONT

    2009-11-01

    Full Text Available This paper analyzes the impact of the linguistic turn on the transformation of the Kantian conception of synthetic a priori judgments. It focuses on two contemporary conceptions of the synthetic a priori, namely Heidegger's hermeneutic a priori and Putnam's contextual a priori, and brings out both their strikingly similar features and their important differences: whereas Heidegger's conception retains Kant's transcendental idealism via the hermeneutic assumption that meaning determines reference, Putnam's conception breaks entirely with the Kantian theory of synthetic a priori judgments, according to which they cannot be revised by experience alone, while rejecting the Kantian assumption that the synthetic a priori and a posteriori are permanent statuses of judgments. In contradistinction to Heidegger, Putnam is thus able to show that synthetic a priori judgments are revisable under special conditions.

  2. A priori knowledge and the Kochen-Specker theorem

    International Nuclear Information System (INIS)

    Brunet, Olivier

    2007-01-01

    We introduce and formalize a notion of 'a priori knowledge' about a quantum system, and show some properties about this form of knowledge. Finally, we show that the Kochen-Specker theorem follows directly from this study

  3. Quantitative estimation of diphtheria and tetanus toxoids. 4. Toxoids as international reference materials defining Lf-units for diphtheria and tetanus toxoids.

    Science.gov (United States)

    Lyng, J

    1990-01-01

    The Lf-unit, which is used in the control of diphtheria and tetanus toxoid production and in some countries also to follow immunization of horses for production of antitoxins, has hitherto been defined by means of antitoxin preparations. A diphtheria toxoid and a tetanus toxoid preparation, both freeze-dried, were examined in an international collaborative study for their suitability to serve as reference reagents in the flocculation tests and for defining the Lf-units. It was shown that flocculation tests using the reference toxoids are very reproducible and reliable and the WHO Expert Committee on Biological Standardization established: the toxoid called DIFT as the International Reference Reagent of Diphtheria Toxoid for Flocculation Test with a defined content of 900 Lf-units of diphtheria toxoid per ampoule; and the toxoid called TEFT as the International Reference Reagent of Tetanus Toxoid for Flocculation Test with a defined content of 1000 Lf-units of tetanus toxoid per ampoule.

  4. A Priori Regularity of Parabolic Partial Differential Equations

    KAUST Repository

    Berkemeier, Francisco

    2018-01-01

    In this thesis, we consider parabolic partial differential equations such as the heat equation, the Fokker-Planck equation, and the porous media equation. Our aim is to develop methods that provide a priori estimates for solutions with singular

  5. Learning to improve medical decision making from imbalanced data without a priori cost.

    Science.gov (United States)

    Wan, Xiang; Liu, Jiming; Cheung, William K; Tong, Tiejun

    2014-12-05

    In a medical data set, data are commonly composed of a minority (positive or abnormal) group and a majority (negative or normal) group, and the cost of misclassifying a minority sample as a majority sample is very high. This is the so-called imbalanced classification problem. The traditional classification functions can be seriously affected by the skewed class distribution in the data. To deal with this problem, people often use a priori cost to adjust the learning process in the pursuit of an optimal classification function. However, this a priori cost is often unknown and hard to estimate in medical decision making. In this paper, we propose a new learning method, named RankCost, to classify imbalanced medical data without using a priori cost. Instead of focusing on improving the class-prediction accuracy, RankCost aims to maximize the difference between the minority class and the majority class by using a scoring function, which translates the imbalanced classification problem into a partial ranking problem. The scoring function is learned via a non-parametric boosting algorithm. We compare RankCost to several representative approaches on four medical data sets varying in size, imbalance ratio, and dimension. The experimental results demonstrate that unlike the currently available methods that often perform unevenly with different a priori costs, RankCost shows comparable performance in a consistent manner. It is a challenging task to learn an effective classification model based on imbalanced data in medical data analysis. The traditional approaches often use a priori cost to adjust the learning of the classification function. This work presents a novel approach, namely RankCost, for learning from medical imbalanced data sets without using a priori cost. The experimental results indicate that RankCost performs very well in imbalanced data classification and can be a useful method in real-world applications of medical decision making.
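
    The sketch below is not RankCost itself; it only illustrates the underlying idea of replacing class-prediction accuracy with a scoring function optimised for ranking, here a linear scorer trained with a pairwise logistic (AUC-style) loss on an imbalanced toy data set. Data, model and hyperparameters are made up.

      # Toy pairwise-ranking scorer for imbalanced data (illustration only, not RankCost).
      import numpy as np
      from scipy.special import expit

      rng = np.random.default_rng(0)
      n_min, n_maj, d = 30, 470, 5
      X_pos = rng.normal(loc=0.8, size=(n_min, d))   # minority (abnormal) class
      X_neg = rng.normal(loc=0.0, size=(n_maj, d))   # majority (normal) class

      w = np.zeros(d)
      lr = 0.1
      for _ in range(200):
          diff = X_pos[:, None, :] - X_neg[None, :, :]          # all minority/majority pairs
          margins = diff @ w                                     # pairwise score differences
          # gradient of the pairwise logistic loss sum log(1 + exp(-margin))
          grad = -(diff * expit(-margins)[..., None]).mean(axis=(0, 1))
          w -= lr * grad

      scores_pos, scores_neg = X_pos @ w, X_neg @ w
      auc = (scores_pos[:, None] > scores_neg[None, :]).mean()
      print(f"empirical AUC of the learned scorer: {auc:.3f}")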

  6. Practical interior tomography with radial Hilbert filtering and a priori knowledge in a small round area.

    Science.gov (United States)

    Tang, Shaojie; Yang, Yi; Tang, Xiangyang

    2012-01-01

    The interior tomography problem can be solved using the so-called differentiated backprojection-projection onto convex sets (DBP-POCS) method, which requires a priori knowledge within a small area interior to the region of interest (ROI) to be imaged. In theory, the small area wherein the a priori knowledge is required can be in any shape, but most of the existing implementations carry out the Hilbert filtering either horizontally or vertically, leading to a vertical or horizontal strip that may extend across a large area of the object. In this work, we implement a practical DBP-POCS method with radial Hilbert filtering and thus the small area with the a priori knowledge can be roughly round (e.g., a sinus or ventricles among other anatomic cavities in the human or animal body). We also conduct an experimental evaluation to verify the performance of this practical implementation. We specifically re-derive the reconstruction formula in the DBP-POCS fashion with radial Hilbert filtering to assure that only a small round area with the a priori knowledge be needed (the radial DBP-POCS method henceforth). The performance of the practical DBP-POCS method with radial Hilbert filtering and a priori knowledge in a small round area is evaluated with projection data of the standard and modified Shepp-Logan phantoms simulated by computer, followed by a verification using real projection data acquired by a computed tomography (CT) scanner. The preliminary performance study shows that, if a priori knowledge in a small round area is available, the radial DBP-POCS method can solve the interior tomography problem in a more practical way at high accuracy. In comparison to the implementations of the DBP-POCS method demanding the a priori knowledge in a horizontal or vertical strip, the radial DBP-POCS method requires the a priori knowledge within a small round area only. Such a relaxed requirement on the availability of a priori knowledge can be readily met in practice, because a variety of small

  7. Introduction of a priori information in the elastic linearized inversion of seismic data before stacking; Introduction d'informations a priori dans l'inversion linearisee elastique de donnees sismiques de surface avant sommation

    Energy Technology Data Exchange (ETDEWEB)

    Tonellot, Th.L.

    2000-03-24

    In this thesis, we propose a method which takes into account a priori information (geological, diagraphic and stratigraphic knowledge) in linearized pre-stack seismic data inversion. The approach is based on a formalism in which the a priori information is incorporated in an a priori model of elastic parameters - density, P and S impedances - and a model covariance operator which describes the uncertainties in the model. The first part of the thesis is dedicated to the study of this covariance operator and to the norm associated to its inverse. We have generalized the exponential covariance operator in order to describe the uncertainties in the a priori model elastic parameters and their correlations at each location. We give the analytical expression of the covariance operator inverse in 1-D, 2-D, and 3-D, and we discretized the associated norm with a finite element method. The second part is dedicated to synthetic and real examples. In a preliminary step, we have developed a pre-stack data well calibration method which allows the estimation of the source signal. The impact of different a priori information is then demonstrated on synthetic and real data. (author)
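
    As a simple illustration of the covariance operator discussed in the first part, the sketch below builds a 1-D exponential covariance matrix over depth for a single elastic parameter; depth sampling, uncertainty level and correlation length are hypothetical, and the operator studied in the thesis is a generalized, multi-parameter version with analytical inverses in 1-D, 2-D and 3-D.

      # 1-D exponential covariance matrix for an a priori model (illustrative values).
      import numpy as np

      z = np.arange(0.0, 500.0, 5.0)       # depth samples, m (hypothetical)
      sigma = np.full_like(z, 0.08)        # a priori standard deviation per sample
      corr_len = 40.0                      # correlation length, m (hypothetical)

      C = np.outer(sigma, sigma) * np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)
      # The inverse of C would weight the "distance to the a priori model" term
      # in the objective function of a linearized inversion.
      print(C.shape, np.linalg.cond(C))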

  8. Intention to use a fully automated car: attitudes and a priori acceptability

    OpenAIRE

    PAYRE, William; CESTAC, Julien; DELHOMME, Patricia

    2014-01-01

    While previous research has studied the acceptability of partially or highly automated driving, few studies have focused on fully automated driving (FAD), including the ability to master longitudinal control, lateral control and maneuvers. The present study analyzes a priori acceptability, attitudes, personality traits and intention to use a fully automated vehicle. 421 French drivers (153 males, M = 40.2 years, age range 19-73) answered an online questionnaire. 68.1% of the sample a priori accepted FAD. P...

  9. Sensitivity of the OMI ozone profile retrieval (OMO3PR) to a priori assumptions

    NARCIS (Netherlands)

    Mielonen, T.; De Haan, J.F.; Veefkind, J.P.

    2014-01-01

    We have assessed the sensitivity of the operational OMI ozone profile retrieval (OMO3PR) algorithm to a number of a priori assumptions. We studied the effect of stray light correction, surface albedo assumptions and a priori ozone profiles on the retrieved ozone profile. Then, we studied how to

  10. A Priori and a Posteriori Dietary Patterns during Pregnancy and Gestational Weight Gain: The Generation R Study

    Directory of Open Access Journals (Sweden)

    Myrte J. Tielemans

    2015-11-01

    Full Text Available Abnormal gestational weight gain (GWG) is associated with adverse pregnancy outcomes. We examined whether dietary patterns are associated with GWG. Participants included 3374 pregnant women from a population-based cohort in the Netherlands. Dietary intake during pregnancy was assessed with food-frequency questionnaires. Three a posteriori-derived dietary patterns were identified using principal component analysis: a “Vegetable, oil and fish”, a “Nuts, high-fiber cereals and soy”, and a “Margarine, sugar and snacks” pattern. The a priori-defined dietary pattern was based on national dietary recommendations. Weight was repeatedly measured around 13, 20 and 30 weeks of pregnancy; pre-pregnancy and maximum weight were self-reported. Normal weight women with high adherence to the “Vegetable, oil and fish” pattern had higher early-pregnancy GWG than those with low adherence (43 g/week (95% CI 16; 69) for highest vs. lowest quartile (Q4 vs. Q1)). Adherence to the “Margarine, sugar and snacks” pattern was associated with a higher prevalence of excessive GWG (OR 1.45 (95% CI 1.06; 1.99), Q4 vs. Q1). Normal weight women with higher scores on the “Nuts, high-fiber cereals and soy” pattern had more moderate GWG than women with lower scores (−0.01 (95% CI −0.02; −0.00) per SD). The a priori-defined pattern was not associated with GWG. To conclude, specific dietary patterns may play a role in early pregnancy but are not consistently associated with GWG.
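
    The a posteriori patterns mentioned above are obtained from principal component analysis of food-frequency data; the toy sketch below shows only that derivation step on random data (number of food groups, sample size and component count are arbitrary, and no claim is made about the details of the Generation R analysis).

      # Toy derivation of a posteriori dietary patterns via PCA on a food-frequency matrix.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      ffq = rng.poisson(lam=3.0, size=(500, 20)).astype(float)   # 500 women x 20 food groups
      ffq = (ffq - ffq.mean(axis=0)) / ffq.std(axis=0)           # standardise intakes

      pca = PCA(n_components=3)
      scores = pca.fit_transform(ffq)    # each woman's adherence score on each pattern
      print(pca.explained_variance_ratio_, scores.shape)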

  11. Developing an A Priori Database for Passive Microwave Snow Water Retrievals Over Ocean

    Science.gov (United States)

    Yin, Mengtao; Liu, Guosheng

    2017-12-01

    A physically optimized a priori database is developed for Global Precipitation Measurement Microwave Imager (GMI) snow water retrievals over ocean. The initial snow water content profiles are derived from CloudSat Cloud Profiling Radar (CPR) measurements. A radiative transfer model in which the single-scattering properties of nonspherical snowflakes are based on the discrete dipole approximation results is employed to simulate brightness temperatures and their gradients. Snow water content profiles are then optimized through a one-dimensional variational (1D-Var) method. The standard deviations of the difference between observed and simulated brightness temperatures are of a similar magnitude to the observation errors defined for the observation error covariance matrix after the 1D-Var optimization, indicating that this variational method is successful. This optimized database is applied in a Bayesian snow water retrieval algorithm. The retrieval results indicated that the 1D-Var approach has a positive impact on the GMI-retrieved snow water content profiles by improving the physical consistency between snow water content profiles and observed brightness temperatures. Global distribution of snow water contents retrieved from the a priori database is compared with CloudSat CPR estimates. Results showed that the two estimates have a similar pattern of global distribution, and the difference of their global means is small. In addition, we investigate the impact of using physical parameters to subset the database on snow water retrievals. It is shown that using total precipitable water to subset the database with 1D-Var optimization is beneficial for snow water retrievals.
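
    The 1D-Var optimisation mentioned above minimises, in general form, a cost function of the standard variational type shown below (notation generic, not necessarily the paper's):

      % Generic 1D-Var cost function: x = state (snow water content profile),
      % x_a = a priori profile, B = a priori error covariance, y = observed brightness
      % temperatures, H(x) = radiative transfer model, R = observation error covariance.
      J(x) = \tfrac{1}{2}(x - x_a)^{\mathsf T} B^{-1} (x - x_a)
           + \tfrac{1}{2}\bigl(y - H(x)\bigr)^{\mathsf T} R^{-1} \bigl(y - H(x)\bigr)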

  12. Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes

    Science.gov (United States)

    D’Alessandro, Annunziata; De Pergola, Giovanni

    2015-01-01

    The aim of this paper is to analyze the a priori dietary indexes used in the studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but studies show quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. A priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and, therefore, it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should consider classifying whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently. PMID:26389950

  13. Choice of Reference Serum Creatinine in Defining AKI

    Science.gov (United States)

    Siew, Edward D.; Matheny, Michael E.

    2015-01-01

    Background/Aims The study of acute kidney injury (AKI) has expanded with the increasing availability of electronic health records and the use of standardized definitions. Understanding the impact of AKI between settings is limited by heterogeneity in the selection of reference creatinine to anchor the definition of AKI. In this mini-review, we discuss different approaches used to select reference creatinine and their relative merits and limitations. Methods We reviewed the literature to obtain representative examples of published baseline creatinine definitions when pre-hospital data were not available, as well as literature evaluating estimation of baseline renal function, using PubMed and reference back-tracing within known works. Results 1) Prehospital creatinine values are useful in determining reference creatinine, and in high-risk populations, the mean outpatient serum creatinine value 7-365 days before hospitalization closely approximates nephrology adjudication, 2) in patients without pre-hospital data, the eGFR 75 approach does not reliably estimate true AKI incidence in most at-risk populations, 3) using the lowest inpatient serum creatinine may be reasonable, especially in those with preserved kidney function, but may overestimate AKI incidence and severity and miss community-acquired AKI that does not fully resolve, 4) using more specific definitions of AKI (e.g., KDIGO stages 2 and 3) may help to reduce the effects of misclassification when using surrogate values, and 5) leveraging available clinical data may help refine the estimate of reference creatinine. Conclusions Choosing reference creatinine for AKI calculation is important for AKI classification and study interpretation. We recommend obtaining data on pre-hospital kidney function, wherever possible. In studies where surrogate estimates are used, transparency in how they are applied and discussion that informs the reader of potential biases should be provided. Further work to refine the
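
    A minimal sketch of the baseline choice described in result 1 above (mean outpatient creatinine 7-365 days before admission) combined with a simplified staging by peak-to-baseline ratio; the thresholds follow commonly cited KDIGO criteria, but the code is illustrative rather than a validated implementation, and the 48-hour absolute-rise rule is deliberately simplified away.

      # Illustrative only: reference creatinine from outpatient values 7-365 days
      # before admission, then a simplified KDIGO-style stage from peak/baseline.
      from datetime import date

      def reference_creatinine(outpatient_values, admission):
          """outpatient_values: list of (date, serum creatinine in mg/dL)."""
          window = [v for d, v in outpatient_values if 7 <= (admission - d).days <= 365]
          return sum(window) / len(window) if window else None

      def aki_stage(baseline, peak):
          ratio = peak / baseline
          if peak >= 4.0 or ratio >= 3.0:
              return 3
          if ratio >= 2.0:
              return 2
          if ratio >= 1.5 or (peak - baseline) >= 0.3:  # 48 h window for the 0.3 rise ignored here
              return 1
          return 0

      history = [(date(2023, 1, 10), 0.9), (date(2023, 5, 2), 1.0), (date(2023, 11, 28), 0.95)]
      baseline = reference_creatinine(history, admission=date(2023, 12, 20))
      print(baseline, aki_stage(baseline, peak=1.6))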

  14. Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes

    Directory of Open Access Journals (Sweden)

    Annunziata D'Alessandro

    2015-09-01

    Full Text Available The aim of this paper is to analyze the a priori dietary indexes used in the studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but studies show quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. A priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and, therefore, it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should consider classifying whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently.

  15. Razonamiento a priori y argumento ontológico en Antonio Rosmini

    Directory of Open Access Journals (Sweden)

    Juan F. Franck

    2013-11-01

    Full Text Available Rosmini’s criticism of the ontological argument finds its place between those of Aquinas and of Kant. With the former he shares the denial of the evidence of God’s essence quoad nos, and with the latter, his acknowledgment of the decisive character of the nucleus of the ontological argument for all other proofs of God’s existence. Such nucleus consists for Rosmini in the possibility of developing an a priori reasoning, different from the ontological one, be it in its Anselmian, Cartesian or Leibnizian form, which would justify the validity of the other proofs and the fascination exercised by the ontological argument. Key words: A Priori Reasoning, Ontological Argument, Antonio Rosmini.

  16. A Priori Knowledge and Heuristic Reasoning in Architectural Design.

    Science.gov (United States)

    Rowe, Peter G.

    1982-01-01

    It is proposed that the various classes of a priori knowledge incorporated in heuristic reasoning processes exert a strong influence over architectural design activity. Some design problems require exercise of some provisional set of rules, inference, or plausible strategy which requires heuristic reasoning. A case study illustrates this concept.…

  17. A concept for automated nanoscale atomic force microscope (AFM) measurements using a priori knowledge

    International Nuclear Information System (INIS)

    Recknagel, C; Rothe, H

    2009-01-01

    The nanometer coordinate measuring machine (NCMM) is developed for comparatively fast large area scans with high resolution. The system combines a metrological atomic force microscope (AFM) with a precise positioning system. The sample is moved under the probe system via the positioning system achieving a scan range of 25 × 25 × 5 mm³ with a resolution of 0.1 nm. A concept for AFM measurements using a priori knowledge is implemented. The a priori knowledge is generated through measurements with a white light interferometer and the use of CAD data. Dimensional markup language is used as a transfer and target format for a priori knowledge and measurement data. Using the a priori knowledge and template matching algorithms combined with the optical microscope of the NCMM, the region of interest can automatically be identified. In the next step the automatic measurement of the part coordinate system and the measurement elements with the AFM sensor of the NCMM is done. The automatic measurement involves intelligent measurement strategies, which are adapted to specific geometries of the measurement feature to reduce measurement time and drift effects.

  18. Methods for improving limited field-of-view radiotherapy reconstructions using imperfect a priori images

    International Nuclear Information System (INIS)

    Ruchala, Kenneth J.; Olivera, Gustavo H.; Kapatoes, Jeffrey M.; Reckwerdt, Paul J.; Mackie, Thomas R.

    2002-01-01

    There are many benefits to having an online CT imaging system for radiotherapy, as it helps identify changes in the patient's position and anatomy between the time of planning and treatment. However, many current online CT systems suffer from a limited field-of-view (LFOV) in that collected data do not encompass the patient's complete cross section. Reconstruction of these data sets can quantitatively distort the image values and introduce artifacts. This work explores the use of planning CT data as a priori information for improving these reconstructions. Methods are presented to incorporate this data by aligning the LFOV with the planning images and then merging the data sets in sinogram space. One alignment option is explicit fusion, producing fusion-aligned reprojection (FAR) images. For cases where explicit fusion is not viable, FAR can be implemented using the implicit fusion of normal setup error, referred to as normal-error-aligned reprojection (NEAR). These methods are evaluated for multiday patient images showing both internal and skin-surface anatomical variation. The iterative use of NEAR and FAR is also investigated, as are applications of NEAR and FAR to dose calculations and the compensation of LFOV online MVCT images with kVCT planning images. Results indicate that NEAR and FAR can utilize planning CT data as imperfect a priori information to reduce artifacts and quantitatively improve images. These benefits can also increase the accuracy of dose calculations and be used for augmenting CT images (e.g., MVCT) acquired at different energies than the planning CT

  19. Methodology to define biological reference values in the environmental and occupational fields: the contribution of the Italian Society for Reference Values (SIVR).

    Science.gov (United States)

    Aprea, Maria Cristina; Scapellato, Maria Luisa; Valsania, Maria Carmen; Perico, Andrea; Perbellini, Luigi; Ricossa, Maria Cristina; Pradella, Marco; Negri, Sara; Iavicoli, Ivo; Lovreglio, Piero; Salamon, Fabiola; Bettinelli, Maurizio; Apostoli, Pietro

    2017-04-21

    Biological reference values (RVs) explore the relationships between humans and their environment and habits. RVs are fundamental in the environmental field for assessing illnesses possibly associated with environmental pollution, and also in the occupational field, especially in the absence of established biological or environmental limits. The Italian Society for Reference Values (SIVR) determined to test criteria and procedures for the definition of RVs to be used in the environmental and occupational fields. The paper describes the SIVR methodology for defining RVs of xenobiotics and their metabolites. Aspects regarding the choice of population sample, the quality of analytical data, statistical analysis and control of variability factors are considered. The simultaneous interlaboratory circuits involved can be expected to increasingly improve the quality of the analytical data. Examples of RVs produced by SIVR are presented. In particular, levels of chromium, mercury, ethylenethiourea, 3,5,6-trichloro-2-pyridinol, 2,5-hexanedione, 1-hydroxypyrene and t,t-muconic acid measured in urine and expressed in micrograms/g creatinine (μg/g creat) or micrograms/L (μg/L) are reported. With the proposed procedure, SIVR intends to make its activities known to the scientific community in order to increase the number of laboratories involved in the definition of RVs for the Italian population. More research is needed to obtain further RVs in different biological matrices, such as hair, nails and exhaled breath. It is also necessary to update and improve the present reference values and broaden the portfolio of chemicals for which RVs are available. In the near future, SIVR intends to expand its scientific activity by using a multivariate approach for xenobiotics that may have a common origin, and to define RVs separately for children who may be exposed more than adults and be more vulnerable.

  20. A soil sampling reference site: The challenge in defining reference material for sampling

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto; Perk, Marcel van der

    2008-01-01

    In the frame of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fraction. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site, characterised in terms of trace elements, can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  1. A soil sampling reference site: The challenge in defining reference material for sampling

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell'Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, Rome 100-00128 (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell'Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, Rome 100-00128 (Italy); Fajgelj, Ales [International Atomic Energy Agency (IAEA), Agency's Laboratories Seibersdorf, Vienna A-1400 (Austria); Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto [Jozef Stefan Institute, Jamova 39, Ljubljana 1000 (Slovenia); Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, TC Utrecht 3508 (Netherlands)

    2008-11-15

    In the frame of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fraction. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site, characterised in terms of trace elements, can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  2. A soil sampling reference site: the challenge in defining reference material for sampling.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto; van der Perk, Marcel

    2008-11-01

    In the frame of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fraction. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site, characterised in terms of trace elements, can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  3. Incorporating a priori knowledge into initialized weights for neural classifier

    NARCIS (Netherlands)

    Chen, Zhe; Feng, T.J.; Feng, Tian-Jin; Houkes, Z.

    2000-01-01

    Artificial neural networks (ANN), especially, multilayer perceptrons (MLP) have been widely used in pattern recognition and classification. Nevertheless, how to incorporate a priori knowledge in the design of ANNs is still an open problem. The paper tries to give some insight on this topic

  4. Psicoanálisis y filosofía: el problema del a priori de la investigación en Heidegger y Winnicott = Psychoanalysis and philosophy: the problem of a priori research in Heidegger and Winnicott

    Directory of Open Access Journals (Sweden)

    Julieta Bareiro

    2011-12-01

    Full Text Available This paper seeks to present the relationship between psychoanalytic knowledge and philosophy through a dialogue between Winnicott and Heidegger. To this end, two argumentative reconstructions are carried out. First, the way in which Heidegger determines the a priori character of his investigation is set out. Second, Winnicott's work is surveyed in order to determine how he conceives the link between psychoanalysis and philosophy and, fundamentally, to establish whether there is room in his thinking for an a priori that would allow a link with philosophy to be drawn.

  5. Time-Dependent Selection of an Optimal Set of Sources to Define a Stable Celestial Reference Frame

    Science.gov (United States)

    Le Bail, Karine; Gordon, David

    2010-01-01

    Temporal statistical position stability is required for VLBI sources to define a stable Celestial Reference Frame (CRF) and has been studied in many recent papers. This study analyzes the sources from the latest realization of the International Celestial Reference Frame (ICRF2) with the Allan variance, in addition to taking into account the apparent linear motions of the sources. Focusing on the 295 defining sources shows how they are a good compromise among different criteria, such as statistical stability and sky distribution, as well as having a sufficient number of sources, despite the fact that the most stable sources of the entire ICRF2 are mostly in the Northern Hemisphere. Nevertheless, the selection of a stable set is not unique: studying different solutions (GSF005a and AUG24 from GSFC and OPA from the Paris Observatory) over different time periods (1989.5 to 2009.5 and 1999.5 to 2009.5) leads to selections that can differ in up to 20% of the sources. Improvements in observing, recording, and networks are among the causes, showing better stability for the CRF over the last decade than over the last twenty years. But this may also be explained by the assumption of stationarity, which is not necessarily valid for some sources.
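
    The Allan variance used above to rank source position stability can be computed directly from a coordinate time series. Below is a minimal Python sketch of the non-overlapping estimator; the regular sampling, units and toy series are illustrative assumptions, not taken from the paper.

        import numpy as np

        def allan_variance(x, tau_samples):
            """Non-overlapping Allan variance of a position series x for an
            averaging window of tau_samples points."""
            n_bins = len(x) // tau_samples
            if n_bins < 2:
                raise ValueError("series too short for this averaging time")
            means = x[:n_bins * tau_samples].reshape(n_bins, tau_samples).mean(axis=1)
            # half the mean squared difference between consecutive window means
            return 0.5 * np.mean(np.diff(means) ** 2)

        rng = np.random.default_rng(0)
        t = np.arange(200)                                   # epochs
        stable = 0.1 * rng.standard_normal(200)              # mas, white noise only
        drifting = stable + 0.005 * t                        # mas, plus a linear drift
        for tau in (5, 20, 50):
            print(tau, allan_variance(stable, tau), allan_variance(drifting, tau))

    Plotted against the averaging time, the Allan variance of a white-noise-like (stable) source keeps decreasing, whereas a drifting source flattens out or grows, which is the behaviour used to separate stable from unstable sources.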

  6. A priori data-driven multi-clustered reservoir generation algorithm for echo state network.

    Directory of Open Access Journals (Sweden)

    Xiumin Li

    Full Text Available Echo state networks (ESNs) with multi-clustered reservoir topology perform better in reservoir computing and robustness than those with random reservoir topology. However, these ESNs have a complex reservoir topology, which leads to difficulties in reservoir generation. This study focuses on the reservoir generation problem when an ESN is used in environments with sufficient a priori data available. Accordingly, an a priori data-driven multi-cluster reservoir generation algorithm is proposed. The a priori data in the proposed algorithm are used to evaluate reservoirs by calculating the precision and standard deviation of ESNs. The reservoirs are produced using the clustering method; only a reservoir with a better evaluation performance takes the place of a previous one. The final reservoir is obtained when its evaluation score reaches the preset requirement. The prediction experiment results obtained using the Mackey-Glass chaotic time series show that the proposed reservoir generation algorithm provides ESNs with extra prediction precision and increases the structural complexity of the network. Further experiments also reveal the appropriate values of the number of clusters and time window size to obtain optimal performance. The information entropy of the reservoir reaches the maximum when the ESN gains the greatest precision.

  7. Monte Carlo full-waveform inversion of crosshole GPR data using multiple-point geostatistical a priori information

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2012-01-01

    We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear...... inverse problems, such as full-waveform inversion. Sequential Gibbs sampling is a method that allows efficient sampling of a priori probability densities described by geostatistical algorithms based on either two-point (e.g., Gaussian) or multiple-point statistics. We outline the theoretical framework......) Based on a posteriori realizations, complicated statistical questions can be answered, such as the probability of connectivity across a layer. (3) Complex a priori information can be included through geostatistical algorithms. These benefits, however, require more computing resources than traditional...
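
    The sampling strategy described above can be outlined generically: a candidate model is obtained by re-simulating a subset of the current model from the a priori sampler (the sequential Gibbs step), and it is accepted on the likelihood ratio alone because the prior ratio cancels. Below is a minimal Python sketch in which an i.i.d. Gaussian prior and a linear forward operator stand in for the geostatistical algorithm and the full-waveform solver; all names and sizes are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def prior_resimulate(m, frac=0.1):
            """Stand-in for sequential Gibbs resimulation: redraw a random subset of
            parameters from the (here i.i.d. Gaussian) a priori distribution."""
            m_new = m.copy()
            idx = rng.random(m.size) < frac
            m_new[idx] = rng.standard_normal(int(idx.sum()))
            return m_new

        def log_likelihood(m, G, d_obs, sigma=0.05):
            resid = G @ m - d_obs
            return -0.5 * np.sum((resid / sigma) ** 2)

        # toy linear forward problem standing in for full-waveform modelling
        n_model, n_data = 50, 30
        G = rng.standard_normal((n_data, n_model)) / np.sqrt(n_model)
        m_true = rng.standard_normal(n_model)
        d_obs = G @ m_true + 0.05 * rng.standard_normal(n_data)

        m = rng.standard_normal(n_model)            # start from a prior realization
        logL = log_likelihood(m, G, d_obs)
        samples = []
        for it in range(5000):
            m_prop = prior_resimulate(m)
            logL_prop = log_likelihood(m_prop, G, d_obs)
            # extended Metropolis: the prior ratio cancels, accept on likelihood ratio
            if np.log(rng.random()) < logL_prop - logL:
                m, logL = m_prop, logL_prop
            samples.append(m.copy())
        post_mean = np.mean(samples[2000:], axis=0)
        print("posterior-mean squared error:", float(np.mean((post_mean - m_true) ** 2)))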

  8. New a priori estimates for mean-field games with congestion

    KAUST Repository

    Evangelista, David; Gomes, Diogo A.

    2016-01-01

    We present recent developments in crowd dynamics models (e.g. pedestrian flow problems). Our formulation is given by a mean-field game (MFG) with congestion. We start by reviewing earlier models and results. Next, we develop our model. We establish new a priori estimates that give partial regularity of the solutions. Finally, we discuss numerical results.

  9. New a priori estimates for mean-field games with congestion

    KAUST Repository

    Evangelista, David

    2016-01-06

    We present recent developments in crowd dynamics models (e.g. pedestrian flow problems). Our formulation is given by a mean-field game (MFG) with congestion. We start by reviewing earlier models and results. Next, we develop our model. We establish new a priori estimates that give partial regularity of the solutions. Finally, we discuss numerical results.

  10. A Frequency Matching Method for Generation of a Priori Sample Models from Training Images

    DEFF Research Database (Denmark)

    Lange, Katrine; Cordua, Knud Skou; Frydendall, Jan

    2011-01-01

    This paper presents a Frequency Matching Method (FMM) for generation of a priori sample models based on training images and illustrates its use by an example. In geostatistics, training images are used to represent a priori knowledge or expectations of models, and the FMM can be used to generate...... new images that share the same multi-point statistics as a given training image. The FMM proceeds by iteratively updating voxel values of an image until the frequency of patterns in the image matches the frequency of patterns in the training image; making the resulting image statistically...... indistinguishable from the training image....
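
    The iterative voxel updating at the core of the FMM can be illustrated on a small binary image: flip a random voxel and keep the flip only if it reduces the distance between the pattern-frequency histograms of the image and of the training image. The Python sketch below uses 2x2 binary patterns and a squared-difference mismatch; both are illustrative simplifications of the published method.

        import numpy as np

        rng = np.random.default_rng(2)

        def pattern_hist(img):
            """Relative frequency of the 16 possible 2x2 binary patterns."""
            codes = img[:-1, :-1] + 2 * img[:-1, 1:] + 4 * img[1:, :-1] + 8 * img[1:, 1:]
            return np.bincount(codes.ravel(), minlength=16) / codes.size

        def mismatch(img, h_train):
            return float(np.sum((pattern_hist(img) - h_train) ** 2))

        train = (rng.random((40, 40)) < 0.3).astype(int)    # stand-in training image
        img = (rng.random((40, 40)) < 0.5).astype(int)      # initial random image
        h_train = pattern_hist(train)
        f = mismatch(img, h_train)
        for it in range(20000):
            i, j = rng.integers(0, 40, size=2)
            img[i, j] ^= 1                                  # propose a voxel update
            f_new = mismatch(img, h_train)
            if f_new <= f:
                f = f_new                                   # keep: frequencies match better
            else:
                img[i, j] ^= 1                              # revert the update
        print("final pattern-frequency mismatch:", f)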

  11. Plasma ascorbic acid, a priori diet quality score, and incident hypertension

    NARCIS (Netherlands)

    Buijsse, Brian; Jacobs, D.R.; Steffen, L.M.; Kromhout, Daan; Gross, M.D.

    2015-01-01

    Vitamin C may reduce risk of hypertension, either in itself or by marking a healthy diet pattern. We assessed whether plasma ascorbic acid and the a priori diet quality score relate to incident hypertension and whether they explain each other's predictive abilities. Data were from 2884 black and

  12. Improving multiple-point-based a priori models for inverse problems by combining Sequential Simulation with the Frequency Matching Method

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Lange, Katrine

    In order to move beyond simplified covariance based a priori models, which are typically used for inverse problems, more complex multiple-point-based a priori models have to be considered. By means of marginal probability distributions ‘learned’ from a training image, sequential simulation has...... proven to be an efficient way of obtaining multiple realizations that honor the same multiple-point statistics as the training image. The frequency matching method provides an alternative way of formulating multiple-point-based a priori models. In this strategy the pattern frequency distributions (i.......e. marginals) of the training image and a subsurface model are matched in order to obtain a solution with the same multiple-point statistics as the training image. Sequential Gibbs sampling is a simulation strategy that provides an efficient way of applying sequential simulation based algorithms as a priori...

  13. Importance of A Priori Vertical Ozone Profiles for TEMPO Air Quality Retrievals

    Science.gov (United States)

    Johnson, M. S.; Sullivan, J. T.; Liu, X.; Zoogman, P.; Newchurch, M.; Kuang, S.; McGee, T. J.; Leblanc, T.

    2017-12-01

    Ozone (O3) is a toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is conducted using in situ measurement networks. This is partially due to high-quality information related to air quality being limited from space-borne platforms due to coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address the limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data of total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm is suggested to use a priori O3 profile information from a climatological data-base developed from long-term ozone-sonde measurements (tropopause-based (TB-Clim) O3 climatology). This study evaluates the TB-Clim dataset and model simulated O3 profiles, which could potentially serve as a priori O3 profile information in TEMPO retrievals, from near-real-time data assimilation model products (NASA GMAO's operational GEOS-5 FP model and reanalysis data from MERRA2) and a full chemical transport model (CTM), GEOS-Chem. In this study, vertical profile products are evaluated with surface (0-2 km) and tropospheric (0-10 km) TOLNet observations and the theoretical impact of individual a priori profile sources on the accuracy of TEMPO O3 retrievals in the troposphere and at the surface are presented. Results indicate that while the TB-Clim climatological dataset can replicate seasonally-averaged tropospheric O3 profiles, model-simulated profiles from a full CTM resulted in more accurate tropospheric and surface-level O3 retrievals from

  14. Choice of Reference Serum Creatinine in Defining Acute Kidney Injury.

    Science.gov (United States)

    Siew, Edward D; Matheny, Michael E

    2015-01-01

    The study of acute kidney injury (AKI) has expanded with the increasing availability of electronic health records and the use of standardized definitions. Understanding the impact of AKI between settings is limited by heterogeneity in the selection of reference creatinine to anchor the definition of AKI. In this mini-review, we discuss different approaches used to select reference creatinine and their relative merits and limitations. We reviewed the literature to obtain representative examples of published baseline creatinine definitions when pre-hospital data were not available, as well as literature evaluating the estimation of baseline renal function, using PubMed and reference back-tracing within known works. (1) Pre-hospital creatinine values are useful in determining reference creatinine, and in high-risk populations, the mean outpatient serum creatinine value 7-365 days before hospitalization closely approximates nephrology adjudication, (2) in patients without pre-hospital data, the eGFR 75 approach does not reliably estimate true AKI incidence in most at-risk populations, (3) using the lowest inpatient serum creatinine may be reasonable, especially in those with preserved kidney function, but may overestimate AKI incidence and severity and miss community-acquired AKI that does not fully resolve, (4) using more specific definitions of AKI (e.g., KDIGO stages 2 and 3) may help to reduce the effects of misclassification when using surrogate values and (5) leveraging available clinical data may help refine the estimate of reference creatinine. Choosing reference creatinine for AKI calculation is important for AKI classification and study interpretation. We recommend obtaining data on pre-hospital kidney function, wherever possible. In studies where surrogate estimates are used, transparency in how they are applied and discussion that informs the reader of potential biases should be provided. Further work to refine the estimation of reference creatinine
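
    The first point above (mean outpatient creatinine 7-365 days before admission, with the lowest inpatient value as a fallback) is straightforward to operationalize on electronic health record data. A minimal pandas sketch follows; the column names, units and the exact fallback rule are illustrative assumptions rather than a published specification.

        import pandas as pd

        def reference_creatinine(scr, admission):
            """scr: one patient's creatinine history with columns
            'date' (datetime), 'value' (mg/dL) and 'setting' ('outpatient'/'inpatient')."""
            window = scr[(scr["setting"] == "outpatient")
                         & (scr["date"] >= admission - pd.Timedelta(days=365))
                         & (scr["date"] <= admission - pd.Timedelta(days=7))]
            if len(window):
                return window["value"].mean()               # preferred: pre-hospital mean
            inpatient = scr[(scr["setting"] == "inpatient") & (scr["date"] >= admission)]
            # fallback: lowest inpatient value (may overestimate AKI incidence/severity)
            return inpatient["value"].min() if len(inpatient) else float("nan")

        scr = pd.DataFrame({
            "date": pd.to_datetime(["2023-01-10", "2023-06-02", "2023-11-20", "2023-11-21"]),
            "value": [0.9, 1.0, 1.4, 1.6],
            "setting": ["outpatient", "outpatient", "inpatient", "inpatient"],
        })
        print(reference_creatinine(scr, pd.Timestamp("2023-11-20")))   # 0.95 mg/dL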

  15. Image reconstruction in electrostatic tomography using a priori knowledge from ECT

    International Nuclear Information System (INIS)

    Zhou Bin; Zhang Jianyong; Xu Chuanlong; Wang Shimin

    2011-01-01

    Research highlights: → A dual-mode sensor technique based on ECT and EST is proposed. → The interference of charged particles with ECT can be eliminated. → A priori knowledge from ECT improves the inversion accuracy. - Abstract: In gas-solid two-phase flow, the charge distribution is a very important process parameter which is useful for the study of electrostatic adhesion. Electrostatic tomography (EST) is a relatively new non-intrusive technique which can be used to acquire the charge distribution. However, due to the limited number of measurements, the quality of image reconstruction is poor. In this paper, a dual-mode sensor technique based on electrical capacitance tomography (ECT) and EST is proposed. The theoretical analysis and the numerical simulation results reveal that the permittivity distribution obtained from ECT can provide a priori knowledge for the inversion calculation of EST, so that the accuracy of the spatial sensitivity calculation in EST can be improved. The proposed technique is expected to be promising for industrial applications and will also be beneficial to research on the fluid dynamics of gas-solid two-phase flow.

  16. Profile reconstruction from neutron reflectivity data and a priori knowledge

    International Nuclear Information System (INIS)

    Leeb, H.

    2008-01-01

    The problem of incomplete and noisy information in profile reconstruction from neutron reflectometry data is considered. In particular, methods of Bayesian statistics in combination with modelling or inverse scattering techniques are considered in order to properly include the required a priori knowledge and obtain quantitatively reliable estimates of the reconstructed profiles. Applying Bayes' theorem, the results of different experiments on the same sample can be consistently included in the profile reconstruction

  17. Full 3-D stratigraphic inversion with a priori information: a powerful way to optimize data integration

    Energy Technology Data Exchange (ETDEWEB)

    Grizon, L.; Leger, M.; Dequirez, P.Y.; Dumont, F.; Richard, V.

    1998-12-31

    Integration between seismic and geological data is crucial to ensure that a reservoir study is accurate and reliable. To reach this goal, a post-stack stratigraphic inversion with a priori information is used. The global cost-function combines two types of constraints: one is relevant to the seismic amplitudes, and the other to an a priori impedance model. This paper presents this flexible and interpretative inversion to determine acoustic impedances constrained by seismic data, log data and geologic information. 5 refs., 8 figs.
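
    The global cost-function described above can be written as J(m) = ||Gm - d||^2 + lam*||m - m_prior||^2, i.e. a seismic-amplitude misfit plus an a priori impedance misfit, and it can be minimized in closed form once the forward model is linearized. Below is a minimal Python sketch with a toy convolutional forward model acting on log-impedance; the wavelet, noise level and lam are illustrative placeholders, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 120
        # toy post-stack forward model: reflectivity = D @ m (m = log-impedance),
        # trace = wavelet convolution of the reflectivity, i.e. s = G @ m with G = C @ D
        D = np.eye(n) - np.eye(n, k=-1)
        D[0, 0] = 0.0                                        # no reflector at the first sample
        wavelet = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
        C = np.zeros((n, n))
        for i in range(n):
            for j in range(max(0, i - 10), min(n, i + 11)):
                C[i, j] = wavelet[j - i + 10]
        G = C @ D

        m_true = np.log(np.where(np.arange(n) > 60, 3000.0, 2500.0))   # impedance step
        d_obs = G @ m_true + 0.005 * rng.standard_normal(n)
        m_prior = np.full(n, np.log(2600.0))                 # a priori impedance model

        lam = 0.05
        # minimize ||G m - d||^2 + lam * ||m - m_prior||^2 via the normal equations
        m_hat = np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ d_obs + lam * m_prior)
        print("rms log-impedance error:", float(np.sqrt(np.mean((m_hat - m_true) ** 2))))

    Raising lam pulls the solution toward the a priori impedance model where the seismic data are uninformative (for instance the low-frequency trend), which is the role the geological constraint plays in this kind of inversion.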

  18. Relations between water physico-chemistry and benthic algal communities in a northern Canadian watershed: defining reference conditions using multiple descriptors of community structure.

    Science.gov (United States)

    Thomas, Kathryn E; Hall, Roland I; Scrimgeour, Garry J

    2015-09-01

    Defining reference conditions is central to identifying environmental effects of anthropogenic activities. Using a watershed approach, we quantified reference conditions for benthic algal communities and their relations to physico-chemical conditions in rivers in the South Nahanni River watershed, NWT, Canada, in 2008 and 2009. We also compared the ability of three descriptors that vary in terms of analytical costs to define algal community structure based on relative abundances of (i) all algal taxa, (ii) only diatom taxa, and (iii) photosynthetic pigments. Ordination analyses showed that variance in algal community structure was strongly related to gradients in environmental variables describing water physico-chemistry, stream habitats, and sub-watershed structure. Water physico-chemistry and local watershed-scale descriptors differed significantly between algal communities from sites in the Selwyn Mountain ecoregion compared to sites in the Nahanni-Hyland ecoregions. Distinct differences in algal community types between ecoregions were apparent irrespective of whether algal community structure was defined using all algal taxa, diatom taxa, or photosynthetic pigments. Two algal community types were highly predictable using environmental variables, a core consideration in the development of Reference Condition Approach (RCA) models. These results suggest that assessments of environmental impacts could be completed using RCA models for each ecoregion. We suggest that use of algal pigments, a high through-put analysis, is a promising alternative compared to more labor-intensive and costly taxonomic approaches for defining algal community structure.

  19. LMFBR safety criteria: cost-benefit considerations under the constraint of an a priori risk criterion

    International Nuclear Information System (INIS)

    Hartung, J.

    1979-01-01

    The role of cost-benefit considerations and a priori risk criteria as determinants of Core Disruptive Accident (CDA)-related safety criteria for large LMFBR's is explored with the aid of quantitative risk and probabilistic analysis methods. A methodology is described which allows a large number of design and siting alternatives to be traded off against each other with the goal of minimizing energy generation costs subject to the constraint of both an a priori risk criterion and a cost-benefit criterion. Application of this methodology to a specific LMFBR design project is described and the results are discussed. 5 refs

  20. Comparison of a priori versus provisional heparin therapy on radial artery occlusion after transradial coronary angiography and patent hemostasis (from the PHARAOH Study).

    Science.gov (United States)

    Pancholy, Samir B; Bertrand, Olivier F; Patel, Tejas

    2012-07-15

    Systemic anticoagulation decreases the risk of radial artery occlusion (RAO) after transradial catheterization and standard occlusive hemostasis. We compared the efficacy and safety of provisional heparin use only when the technique of patent hemostasis was not achievable to standard a priori heparin administration after radial sheath introduction. Patients referred for coronary angiography were randomized in 2 groups. In the a priori group, 200 patients received intravenous heparin (50 IU/kg) immediately after sheath insertion. In the provisional group, 200 patients did not receive heparin during the procedure. After sheath removal, hemostasis was obtained using a TR band (Terumo corporation, Tokyo, Japan) with a plethysmography-guided patent hemostasis technique. In the provisional group, no heparin was given if radial artery patency could be obtained and maintained. If radial patency was not achieved, a bolus of heparin (50 IU/kg) was given. Radial artery patency was evaluated at 24 hours (early RAO) and 30 days after the procedure (late RAO) by plethysmography. Patent hemostasis was obtained in 67% in the a priori group and 74% in the provisional group (p = 0.10). Incidence of RAO remained similar in the 2 groups at the early (7.5% vs 7.0%, p = 0.84) and late (4.5% vs 5.0%, p = 0.83) evaluations. Women, patients with diabetes, patients having not received heparin, and patients without radial artery patency during hemostasis had more RAO. By multivariate analysis, patent radial artery during hemostasis (odds ratio [OR] 0.03, 95% confidence interval [CI] 0.004 to 0.28, p = 0.002) and diabetes (OR 11, 95% CI 3 to 38,p patent hemostasis is maintained. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Realism, functions, and the a priori: Ernst Cassirer's philosophy of science.

    Science.gov (United States)

    Heis, Jeremy

    2014-12-01

    This paper presents the main ideas of Cassirer's general philosophy of science, focusing on the two aspects of his thought that--in addition to being the most central ideas in his philosophy of science--have received the most attention from contemporary philosophers of science: his theory of the a priori aspects of physical theory, and his relation to scientific realism.

  2. Diffuse optical tomography with physiological and spatial a priori constraints

    International Nuclear Information System (INIS)

    Intes, Xavier; Maloux, Clemence; Guven, Murat; Yazici, Birzen; Chance, Britton

    2004-01-01

    Diffuse optical tomography is a typical inverse problem plagued by ill-conditioning. To overcome this drawback, regularization or constraining techniques are incorporated in the inverse formulation. In this work, we investigate the enhancement in recovering functional parameters obtained by using physiological and spatial a priori constraints. More accurate recovery of the two main functional parameters, blood volume and relative saturation, is demonstrated through simulations by using our method compared with existing techniques. (note)

  3. Approximate deconvolution model for the simulation of turbulent gas-solid flows: An a priori analysis

    Science.gov (United States)

    Schneiderbauer, Simon; Saeedipour, Mahdi

    2018-02-01

    Highly resolved two-fluid model (TFM) simulations of gas-solid flows in vertical periodic channels have been performed to study closures for the filtered drag force and the Reynolds-stress-like contribution stemming from the convective terms. An approximate deconvolution model (ADM) for the large-eddy simulation of turbulent gas-solid suspensions is detailed and subsequently used to reconstruct those unresolved contributions in an a priori manner. With such an approach, an approximation of the unfiltered solution is obtained by repeated filtering allowing the determination of the unclosed terms of the filtered equations directly. A priori filtering shows that predictions of the ADM model yield fairly good agreement with the fine grid TFM simulations for various filter sizes and different particle sizes. In particular, strong positive correlation (ρ > 0.98) is observed at intermediate filter sizes for all sub-grid terms. Additionally, our study reveals that the ADM results moderately depend on the choice of the filters, such as box and Gaussian filter, as well as the deconvolution order. The a priori test finally reveals that ADM is superior compared to isotropic functional closures proposed recently [S. Schneiderbauer, "A spatially-averaged two-fluid model for dense large-scale gas-solid flows," AIChE J. 63, 3544-3562 (2017)].
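
    The a priori test described above can be reproduced in outline: filter a resolved field with an explicit filter G, reconstruct an approximately deconvolved field with the truncated van Cittert series u* = sum_{k=0..N} (I - G)^k u_bar, and correlate the exact and modelled sub-filter contributions. The Python sketch below uses 1-D synthetic fields, a box filter and deconvolution order 3; all of this is a simplified stand-in for the two-fluid-model quantities in the paper.

        import numpy as np

        rng = np.random.default_rng(4)

        def box_filter(u, w=9):
            """Explicit top-hat filter of width w (periodic boundaries)."""
            k = np.ones(w) / w
            padded = np.concatenate([u[-w:], u, u[:w]])
            return np.convolve(padded, k, mode="same")[w:-w]

        def adm_deconvolve(u_bar, order=3, w=9):
            """Truncated van Cittert series: u* = sum_{k=0..order} (I - G)^k u_bar."""
            u_star = u_bar.copy()
            term = u_bar.copy()
            for _ in range(order):
                term = term - box_filter(term, w)
                u_star = u_star + term
            return u_star

        # synthetic resolved fields standing in for solids fraction and gas velocity
        n = 2048
        x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        phi = 0.3 + sum(0.05 / k * np.sin(k * x + rng.random()) for k in range(1, 40))
        ug = 1.0 + sum(0.10 / k * np.cos(k * x + rng.random()) for k in range(1, 40))

        # a priori test: exact sub-filter correlation vs. its ADM reconstruction
        exact = box_filter(phi * ug) - box_filter(phi) * box_filter(ug)
        phi_s, ug_s = adm_deconvolve(box_filter(phi)), adm_deconvolve(box_filter(ug))
        model = box_filter(phi_s * ug_s) - box_filter(phi) * box_filter(ug)
        print("a priori correlation coefficient:", float(np.corrcoef(exact, model)[0, 1]))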

  4. Development and validation of an MRI reference criterion for defining a positive SIJ MRI in spondyloarthritis

    DEFF Research Database (Denmark)

    Weber, Ulrich; Zubler, Veronika; Pedersen, Susanne J

    2012-01-01

    OBJECTIVE: To validate an MRI reference criterion for a positive SIJ MRI based on the level of confidence in classification of spondyloarthritis (SpA) by expert MRI readers. METHODS: Four readers assessed SIJ MRI in two inception cohorts (A/B) of 157 consecutive back pain patients ≤50 years, and in 20 healthy controls. Patients were classified according to clinical examination and pelvic radiography as having non-radiographic axial SpA (n=51), ankylosing spondylitis (n=34), or non-specific back pain (n=72). Readers recorded their level of confidence in the classification of SpA on a 0-10 scale...... Using two inception cohorts and comparing clinical and MRI-based classification supports the case for including both erosion and BME to define a positive SIJ MRI for the classification of axial SpA. © 2012 by the American College of Rheumatology.

  5. The Sample Size Influence in the Accuracy of the Image Classification of the Remote Sensing

    Directory of Open Access Journals (Sweden)

    Thomaz C. e C. da Costa

    2004-12-01

    Full Text Available Land use/land cover maps produced by classification of remote sensing images incorporate uncertainty. This uncertainty is measured by accuracy indices computed from reference samples. The size of the reference sample is commonly defined by a binomial approximation without the use of a pilot sample; in this way the accuracy is not estimated but fixed a priori. In case of divergence between the estimated and the a priori accuracy, the sampling error will deviate from the expected error. Determining the sample size from a pilot sample, the theoretically correct procedure, is justified when no estimate of accuracy is available for the study area, with regard to the intended use of the remote sensing product.
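
    The binomial approximation mentioned above is the usual normal-approximation sample-size formula for a map accuracy fixed a priori, n = z^2 p(1-p)/E^2. The short Python sketch below contrasts the two situations discussed in the abstract; the numbers are illustrative.

        from math import ceil

        def sample_size(p_expected, half_width, z=1.96):
            """Binomial (normal-approximation) reference sample size for estimating a
            map accuracy p_expected with confidence half-width half_width at ~95%."""
            return ceil(z * z * p_expected * (1.0 - p_expected) / half_width ** 2)

        # accuracy fixed a priori at 85%, +/- 5 percentage points, no pilot sample
        print(sample_size(0.85, 0.05))    # 196 reference samples
        # a pilot sample suggesting ~70% accuracy implies a larger reference sample
        print(sample_size(0.70, 0.05))    # 323: the a priori choice would under-sample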

  6. Evaluating A Priori Ozone Profile Information Used in TEMPO Tropospheric Ozone Retrievals

    Science.gov (United States)

    Johnson, Matthew S.; Sullivan, John T.; Liu, Xiong; Newchurch, Mike; Kuang, Shi; McGee, Thomas J.; Langford, Andrew O'Neil; Senff, Christoph J.; Leblanc, Thierry; Berkoff, Timothy; hide

    2016-01-01

    Ozone (O3) is a greenhouse gas and toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is primarily conducted using in situ measurement networks. This is partially due to high-quality information related to air quality being limited from space-borne platforms due to coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address these limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data of total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm uses a priori O3 profile information from a climatological data-base developed from long-term ozone-sonde measurements (tropopause-based (TB) O3 climatology). It has been shown that satellite O3 retrievals are sensitive to a priori O3 profiles and covariance matrices. During this work we investigate the climatological data to be used in TEMPO algorithms (TB O3) and simulated data from the NASA GMAO Goddard Earth Observing System (GEOS-5) Forward Processing (FP) near-real-time (NRT) model products. These two data products will be evaluated with ground-based lidar data from the Tropospheric Ozone Lidar Network (TOLNet) at various locations of the US. This study evaluates the TB climatology, GEOS-5 climatology, and 3-hourly GEOS-5 data compared to lower tropospheric observations to demonstrate the accuracy of a priori information to potentially be used in TEMPO O3 algorithms. Here we present our initial analysis and the theoretical impact on TEMPO retrievals in the lower troposphere.

  7. Evaluating A Priori Ozone Profile Information Used in TEMPO Tropospheric Ozone Retrievals

    Science.gov (United States)

    Johnson, M. S.; Sullivan, J. T.; Liu, X.; Newchurch, M.; Kuang, S.; McGee, T. J.; Langford, A. O.; Senff, C. J.; Leblanc, T.; Berkoff, T.; Gronoff, G.; Chen, G.; Strawbridge, K. B.

    2016-12-01

    Ozone (O3) is a greenhouse gas and toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is primarily conducted using in situ measurement networks. This is partially due to high-quality information related to air quality being limited from space-borne platforms due to coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address these limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data of total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm uses a priori O3 profile information from a climatological data-base developed from long-term ozone-sonde measurements (tropopause-based (TB) O3 climatology). It has been shown that satellite O3 retrievals are sensitive to a priori O3 profiles and covariance matrices. During this work we investigate the climatological data to be used in TEMPO algorithms (TB O3) and simulated data from the NASA GMAO Goddard Earth Observing System (GEOS-5) Forward Processing (FP) near-real-time (NRT) model products. These two data products will be evaluated with ground-based lidar data from the Tropospheric Ozone Lidar Network (TOLNet) at various locations of the US. This study evaluates the TB climatology, GEOS-5 climatology, and 3-hourly GEOS-5 data compared to lower tropospheric observations to demonstrate the accuracy of a priori information to potentially be used in TEMPO O3 algorithms. Here we present our initial analysis and the theoretical impact on TEMPO retrievals in the lower troposphere.

  8. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.

  9. A priori and a posteriori approaches in human reliability

    International Nuclear Information System (INIS)

    Griffon-Fouco, M.; Gagnolet, P.

    1981-09-01

    The French Atomic Energy Commission (CEA) and the French electric utility (EDF) have carried out joint studies on human factors in nuclear safety. This paper deals with these studies, which combine two approaches: - An a posteriori approach, aimed at establishing the rate of human errors and their causes: an analysis of incident data banks and an analysis of human errors on a simulator are presented. - An a priori approach, aimed at identifying the potential factors of human errors: an analysis of control room design and an analysis of the writing of procedures are presented. The possibility of taking these two approaches into account to prevent and quantify human errors is discussed

  10. Towards Improving Satellite Tropospheric NO2 Retrieval Products: Impacts of the spatial resolution and lighting NOx production from the a priori chemical transport model

    Science.gov (United States)

    Smeltzer, C. D.; Wang, Y.; Zhao, C.; Boersma, F.

    2009-12-01

    Polar orbiting satellite retrievals of tropospheric nitrogen dioxide (NO2) columns are important to a variety of scientific applications. These NO2 retrievals rely on a priori profiles from chemical transport models and radiative transfer models to derive the vertical columns (VCs) from slant column measurements. In this work, we compare the retrieval results using a priori profiles from a global model (TM4) and a higher resolution regional model (REAM) at the OMI overpass hour of 1330 local time, implementing the Dutch OMI NO2 (DOMINO) retrieval. We also compare the retrieval results using a priori profiles from REAM model simulations with and without lightning NOx (NO + NO2) production. A priori model resolution and lightning NOx production are both found to have a large impact on satellite retrievals by altering the satellite sensitivity to a particular observation through shifts in the NO2 vertical distribution interpreted by the radiation model. The retrieved tropospheric NO2 VCs may increase by 25-100% in urban regions and be reduced by 50% in rural regions if the a priori profiles from REAM simulations are used during the retrievals instead of the profiles from TM4 simulations. The a priori profiles with lightning NOx may result in a 25-50% reduction of the retrieved tropospheric NO2 VCs compared to the a priori profiles without lightning. As a first priority, a priori vertical NO2 profiles from a high-resolution chemical transport model, which can better simulate urban-rural NO2 gradients in the boundary layer and make use of observation-based parameterizations of lightning NOx production, should be implemented to obtain more accurate NO2 retrievals over the United States, where NOx source regions are spatially separated and lightning NOx production is significant. Then as a consequence of a priori NO2 profile variabilities resulting from lightning and model resolution dynamics, geostationary satellite, daylight observations would further promote the next

  11. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps.

    Science.gov (United States)

    Varikuti, Deepthi P; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T; Eickhoff, Simon B

    2017-04-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that gray matter masking improved the reliability of connectivity estimates, whereas denoising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources.
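
    Test-retest reliability of connectivity estimates is conventionally summarized with an intraclass correlation coefficient computed across subjects and sessions. Below is a minimal Python sketch of ICC(3,1) on a subjects-by-sessions matrix; the two-session layout and synthetic data are illustrative and not the paper's exact pipeline.

        import numpy as np

        def icc_3_1(x):
            """ICC(3,1): two-way mixed effects, consistency, single measurement.
            x is an (n_subjects, k_sessions) array of connectivity estimates."""
            n, k = x.shape
            grand = x.mean()
            ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)
            ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)
            ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
            ms_rows = ss_rows / (n - 1)
            ms_err = ss_err / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

        rng = np.random.default_rng(5)
        true_fc = rng.normal(0.4, 0.15, size=100)        # per-subject "true" connectivity
        sessions = np.stack([true_fc + 0.05 * rng.standard_normal(100) for _ in range(2)],
                            axis=1)
        print("test-retest ICC:", round(float(icc_3_1(sessions)), 3))   # close to 0.9 here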

  12. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    Science.gov (United States)

    Varikuti, Deepthi P.; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T.; Eickhoff, Simon B.

    2016-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that grey matter masking improved the reliability of connectivity estimates, whereas de-noising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources. PMID:27550015

  13. Content analyses of a priori qualitative phantom limb pain descriptions and emerging categories in mid-southerners with limb loss.

    Science.gov (United States)

    Evans, Cecile B

    2014-01-01

    The purposes of this descriptive study were (a) to identify the relative frequencies of a priori categories of phantom limb pain (PLP) quality descriptors reported by Mid-Southerners with limb loss, (b) to analyze their descriptions for emerging categories of PLP, and (c) to identify the relative frequencies of the emerging categories. This cross-sectional descriptive verbal survey assessed PLP descriptors. A content analyses determined relative frequencies of a priori PLP descriptors as well as emerging categories that were identified. The most common a priori PLP quality descriptors reported by 52 amputees with PLP were intermittent, tingling/needles/numb, sharp, cramping, burning, and stabbing. The most common emerging categories reported were pain compared to illness/injury, electrical cyclical, and manipulated/positional. The detailed descriptions of PLP provide insight into the vivid experiences of PLP. Rehabilitation nurses can use this information with PLP assessment, patient teaching, and counseling. © 2013 Association of Rehabilitation Nurses.

  14. A Priori Regularity of Parabolic Partial Differential Equations

    KAUST Repository

    Berkemeier, Francisco

    2018-05-13

    In this thesis, we consider parabolic partial differential equations such as the heat equation, the Fokker-Planck equation, and the porous media equation. Our aim is to develop methods that provide a priori estimates for solutions with singular initial data. These estimates are obtained by understanding the time decay of norms of solutions. First, we derive regularity results for the heat equation by estimating the decay of Lebesgue norms. Then, we apply similar methods to the Fokker-Planck equation with suitable assumptions on the advection and diffusion. Finally, we conclude by extending our techniques to the porous media equation. The sharpness of our results is confirmed by examining known solutions of these equations. The main contribution of this thesis is the use of functional inequalities to express decay of norms as differential inequalities. These are then combined with ODE methods to deduce estimates for the norms of solutions and their derivatives.
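
    A representative instance of the norm-decay estimates referred to above, stated here for orientation rather than quoted from the thesis, is the classical $L^{q}$-$L^{p}$ smoothing bound for the heat equation on $\mathbb{R}^{N}$:

        $\|u(\cdot,t)\|_{L^{p}} \le (4\pi t)^{-\frac{N}{2}\left(\frac{1}{q}-\frac{1}{p}\right)} \|u_{0}\|_{L^{q}}, \qquad 1 \le q \le p \le \infty,\ t>0,$

    which follows from writing $u(\cdot,t)=\Gamma(\cdot,t)\ast u_{0}$ with the Gaussian heat kernel $\Gamma$ and applying Young's convolution inequality; singular (e.g. $L^{1}$) initial data thus become bounded for every $t>0$, with a quantified rate at which the norms grow as $t\to 0^{+}$.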

  15. VBE reference framework

    NARCIS (Netherlands)

    Afsarmanesh, H.; Camarinha-Matos, L.M.; Ermilova, E.; Camarinha-Matos, L.M.; Afsarmanesh, H.; Ollus, M.

    2008-01-01

    Defining a comprehensive and generic "reference framework" for Virtual organizations Breeding Environments (VBEs), addressing all their features and characteristics, is challenging. While the definition and modeling of VBEs has become more formalized during the last five years, "reference models"

  16. An Empirical Study of the Weigl-Goldstein-Scheerer Color-Form Test According to a Developmental Frame of Reference.

    Science.gov (United States)

    Strauss, Helen; Lewin, Isaac

    1982-01-01

    Analyzed the Weigl-Goldstein-Scheerer Color-Form Test using a sample of Danish children. Distinguished three dimensions: configuration of sorting, verbalization of the sorting principle, and the flexibility of switching the sorting principle. The three dimensions proved to constitute the a-priori-defined gradients. Results indicated a…

  17. Global a priori estimates for the inhomogeneous Landau equation with moderately soft potentials

    Science.gov (United States)

    Cameron, Stephen; Silvestre, Luis; Snelson, Stanley

    2018-05-01

    We establish a priori upper bounds for solutions to the spatially inhomogeneous Landau equation in the case of moderately soft potentials, with arbitrary initial data, under the assumption that mass, energy and entropy densities stay under control. Our pointwise estimates decay polynomially in the velocity variable. We also show that if the initial data satisfies a Gaussian upper bound, this bound is propagated for all positive times.

  18. Tomographic inversion of time-domain resistivity and chargeability data for the investigation of landfills using a priori information.

    Science.gov (United States)

    De Donno, Giorgio; Cardarelli, Ettore

    2017-01-01

    In this paper, we present a new code for the modelling and inversion of resistivity and chargeability data using a priori information to improve the accuracy of the reconstructed model for landfills. When a priori information is available for the study area, we can incorporate it by means of inequality constraints on the whole model or on a single layer, or by assigning weighting factors to enhance anomalies elongated in the horizontal or vertical directions. However, when we have to face a multilayered scenario with numerous resistive to conductive transitions (the case of controlled landfills), the effective thickness of the layers can be biased. The presented code includes a model-tuning scheme, which is applied after the inversion of field data, where the inversion of the synthetic data is performed based on an initial guess, and the absolute difference between the field and synthetic inverted models is minimized. The reliability of the proposed approach has been supported by two real-world examples; we were able to identify an unauthorized landfill and to reconstruct the geometrical and physical layout of an old waste dump. The combined analysis of the resistivity and (normalised) chargeability models helps us to remove ambiguity due to the presence of the waste mass. Nevertheless, the presence of certain layers can remain hidden without using a priori information, as demonstrated by a comparison of the constrained inversion with a standard inversion. The robustness of the above-cited method (using a priori information in combination with model tuning) has been validated with the cross-section from the construction plans, where the reconstructed model is in agreement with the original design. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Defining an absolute reference frame for 'clumped' isotope studies of CO 2

    Science.gov (United States)

    Dennis, Kate J.; Affek, Hagit P.; Passey, Benjamin H.; Schrag, Daniel P.; Eiler, John M.

    2011-11-01

    We present a revised approach for standardizing and reporting analyses of multiply substituted isotopologues of CO 2 (i.e., 'clumped' isotopic species, especially the mass-47 isotopologues). Our approach standardizes such data to an absolute reference frame based on theoretical predictions of the abundances of multiply-substituted isotopologues in gaseous CO 2 at thermodynamic equilibrium. This reference frame is preferred over an inter-laboratory calibration of carbonates because it enables all laboratories measuring mass 47 CO 2 to use a common scale that is tied directly to theoretical predictions of clumping in CO 2, regardless of the laboratory's primary research field (carbonate thermometry or CO 2 biogeochemistry); it explicitly accounts for mass spectrometric artifacts rather than convolving (and potentially confusing) them with chemical fractionations associated with sample preparation; and it is based on a thermodynamic equilibrium that can be experimentally established in any suitably equipped laboratory using commonly available materials. By analyzing CO 2 gases that have been subjected to established laboratory procedures known to promote isotopic equilibrium (i.e., heated gases and water-equilibrated CO 2), and by reference to thermodynamic predictions of equilibrium isotopic distributions, it is possible to construct an empirical transfer function that is applicable to data with unknown clumped isotope signatures. This transfer function empirically accounts for the fragmentation and recombination reactions that occur in electron impact ionization sources and other mass spectrometric artifacts. We describe the protocol necessary to construct such a reference frame, the method for converting gases with unknown clumped isotope compositions to this reference frame, and suggest a protocol for ensuring that all reported isotopic compositions (e.g., Δ 47 values; Eiler and Schauble, 2004; Eiler, 2007) can be compared among different laboratories and
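
    Operationally, the empirical transfer function described above reduces to a linear regression between measured and theoretically predicted Delta47 values of the heated and water-equilibrated reference gases, which is then applied to samples of unknown composition. Below is a minimal Python sketch; the measured values are placeholders and the theoretical equilibrium values are only nominal.

        import numpy as np

        # heated gases (1000 C) and 25 C water-equilibrated CO2: raw measured Delta47
        # (per mil, vs. the working gas) and nominal theoretical equilibrium values
        d47_measured = np.array([-0.78, -0.80, -0.77, 0.05, 0.07, 0.06])     # placeholders
        d47_theory = np.array([0.027, 0.027, 0.027, 0.925, 0.925, 0.925])    # approximate

        # empirical transfer function: theory = slope * measured + intercept
        slope, intercept = np.polyfit(d47_measured, d47_theory, 1)

        def to_absolute_frame(d47_raw):
            """Project a raw Delta47 measurement onto the absolute reference frame."""
            return slope * d47_raw + intercept

        print(round(float(to_absolute_frame(-0.35)), 3))   # an unknown sample, now on the absolute scale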

  20. A priori motion models for four-dimensional reconstruction in gated cardiac SPECT

    International Nuclear Information System (INIS)

    Lalush, D.S.; Tsui, B.M.W.; Cui, Lin

    1996-01-01

    We investigate the benefit of incorporating a priori assumptions about cardiac motion in a fully four-dimensional (4D) reconstruction algorithm for gated cardiac SPECT. Previous work has shown that non-motion-specific 4D Gibbs priors enforcing smoothing in time and space can control noise while preserving resolution. In this paper, we evaluate methods for incorporating known heart motion in the Gibbs prior model. The new model is derived by assigning motion vectors to each 4D voxel, defining the movement of that volume of activity into the neighboring time frames. Weights for the Gibbs cliques are computed based on these "most likely" motion vectors. To evaluate, we employ the mathematical cardiac-torso (MCAT) phantom with a new dynamic heart model that simulates the beating and twisting motion of the heart. Sixteen realistically-simulated gated datasets were generated, with noise simulated to emulate a real Tl-201 gated SPECT study. Reconstructions were performed using several different reconstruction algorithms, all modeling nonuniform attenuation and three-dimensional detector response. These include ML-EM with 4D filtering, 4D MAP-EM without prior motion assumption, and 4D MAP-EM with prior motion assumptions. The prior motion assumptions included both the correct motion model and incorrect models. Results show that reconstructions using the 4D prior model can smooth noise and preserve time-domain resolution more effectively than 4D linear filters. We conclude that modeling of motion in 4D reconstruction algorithms can be a powerful tool for smoothing noise and preserving temporal resolution in gated cardiac studies

  1. Brezzi-Pitkaranta stabilization and a priori error analysis for the Stokes Control

    Directory of Open Access Journals (Sweden)

    Aytekin Cibik

    2016-12-01

    Full Text Available In this study, we consider a Brezzi-Pitkaranta stabilization scheme for the optimal control problem governed by stationary Stokes equation, using a P1-P1 interpolation for velocity and pressure. We express the stabilization as extra terms added to the discrete variational form of the problem.  We first prove the stability of the finite element discretization of the problem. Then, we derive a priori error bounds for each variable and present a numerical example to show the effectiveness of the stabilization clearly.

  2. The effect of a priori probability and complexity on decision making in a supervisory control task

    NARCIS (Netherlands)

    Kerstholt, J.H.; Passenier, P.O.; Houttuin, K.; Schuffel, H.

    1996-01-01

    In the present study we investigated how monitoring and fault management in a ship control task are affected by complexity and a priori probability of disturbances. Partici-pants were required to supervise four independent shipping subsystems and to adjust the subsystems whenever deviations

  3. Sampling informative/complex a priori probability distributions using Gibbs sampling assisted by sequential simulation

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    2010-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....

  4. Using A Priori Information to Improve Atmospheric Duct Estimation

    Science.gov (United States)

    Zhao, X.

    2017-12-01

    Knowledge of the refractivity condition in the marine atmospheric boundary layer (MABL) is crucial for the prediction of radar and communication systems performance at frequencies above 1 GHz on low-altitude paths. Since early this century, the 'refractivity from clutter' (RFC) technique has been proved to be an effective way to estimate the MABL refractivity structure. The refractivity model is very important for RFC techniques. If prior knowledge of the local refractivity information is available (e.g., from numerical weather prediction models, atmospheric soundings, etc.), a more accurate parameterized refractivity model can be constructed by statistical methods, e.g. principal component analysis, which in turn can be used to improve the quality of the local refractivity retrievals. This work extends the adjoint parabolic equation approach to range-varying atmospheric duct structure inversions, in which a linear empirical reduced-dimension refractivity model constructed from the a priori refractive information is used.
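
    The reduced-dimension refractivity model constructed from a priori profiles is essentially a principal-component expansion: an ensemble of modified-refractivity profiles is decomposed and only the leading components are retained as inversion parameters. Below is a minimal Python sketch with a synthetic ensemble; the toy duct parameterization and the choice of three components are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(6)
        z = np.linspace(0.0, 300.0, 100)                    # height above the sea surface (m)

        def duct_profile(base, thickness, deficit, slope=0.118):
            """Toy modified-refractivity profile: standard gradient plus a trapping layer."""
            trap = np.clip((z - base) / max(thickness, 1.0), 0.0, 1.0)
            return slope * z - deficit * trap

        # a priori ensemble, e.g. from soundings or a numerical weather prediction model
        ensemble = np.array([duct_profile(rng.uniform(20, 80),
                                          rng.uniform(10, 60),
                                          rng.uniform(5, 40)) for _ in range(500)])

        mean = ensemble.mean(axis=0)
        U, s, Vt = np.linalg.svd(ensemble - mean, full_matrices=False)
        n_keep = 3                                          # reduced inversion dimension
        basis = Vt[:n_keep]                                 # leading principal components

        # any candidate profile in the inversion is parameterized as mean + coeffs @ basis
        coeffs = rng.standard_normal(n_keep) * s[:n_keep] / np.sqrt(len(ensemble))
        candidate = mean + coeffs @ basis
        explained = float((s[:n_keep] ** 2).sum() / (s ** 2).sum())
        print("variance explained by", n_keep, "components:", round(explained, 3))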

  5. Considerations about expected a posteriori estimation in adaptive testing: adaptive a priori, adaptive correction for bias, and adaptive integration interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    2009-01-01

    In a computerized adaptive test, we would like to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Unfortunately, decreasing the number of items is accompanied by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. The authors suggest that it is possible to reduce the bias, and even the standard error of the estimate, by applying to each provisional estimate one or a combination of the following strategies: adaptive correction for bias proposed by Bock and Mislevy (1982), adaptive a priori estimate, and adaptive integration interval.
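
    The expected a posteriori estimator and the adaptive a priori strategy discussed above can be sketched with a two-parameter logistic response model: the provisional EAP is obtained by quadrature over a proficiency grid, and the prior mean is recentred on the previous provisional estimate after each item. Below is a minimal Python sketch; the item parameters, grid and recentring rule are illustrative, not the exact procedures proposed by the authors.

        import numpy as np

        theta_grid = np.linspace(-4.0, 4.0, 161)

        def p_correct(theta, a, b):
            """Two-parameter logistic item response function."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        def eap(responses, items, prior_mean=0.0, prior_sd=1.0):
            """Expected a posteriori proficiency estimate by quadrature over theta_grid."""
            prior = np.exp(-0.5 * ((theta_grid - prior_mean) / prior_sd) ** 2)
            like = np.ones_like(theta_grid)
            for u, (a, b) in zip(responses, items):
                p = p_correct(theta_grid, a, b)
                like *= p if u == 1 else (1.0 - p)
            post = prior * like
            post /= post.sum()
            return float(np.sum(theta_grid * post))

        items = [(1.2, -0.5), (0.9, 0.0), (1.5, 0.8), (1.1, 1.2)]   # (a, b) per item
        responses = [1, 1, 0, 1]

        # fixed a priori vs. adaptive a priori (the prior mean follows the estimate)
        est_fixed = eap(responses, items)
        est_adaptive, mu = 0.0, 0.0
        for k in range(1, len(items) + 1):
            est_adaptive = eap(responses[:k], items[:k], prior_mean=mu)
            mu = est_adaptive                               # recentre before the next item
        print(round(est_fixed, 3), round(est_adaptive, 3))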

  6. Priori mask guided image reconstruction (p-MGIR) for ultra-low dose cone-beam computed tomography

    Science.gov (United States)

    Park, Justin C.; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Kahler, Darren L.; Liu, Chihray; Lu, Bo

    2015-11-01

    Recently, the compressed sensing (CS) based iterative reconstruction method has received attention because of its ability to reconstruct cone beam computed tomography (CBCT) images with good quality using sparsely sampled or noisy projections, thus enabling dose reduction. However, some challenges remain. In particular, there is always a tradeoff between image resolution and noise/streak artifact reduction based on the amount of regularization weighting that is applied uniformly across the CBCT volume. The purpose of this study is to develop a novel low-dose CBCT reconstruction algorithm framework called priori mask guided image reconstruction (p-MGIR) that allows reconstruction of high-quality low-dose CBCT images while preserving the image resolution. In p-MGIR, the unknown CBCT volume was mathematically modeled as a combination of two regions: (1) where anatomical structures are complex, and (2) where intensities are relatively uniform. The priori mask, which is the key concept of the p-MGIR algorithm, was defined as the matrix that distinguishes between the two separate CBCT regions where the resolution needs to be preserved and where streak or noise needs to be suppressed. We then alternately updated each part of image by solving two sub-minimization problems iteratively, where one minimization was focused on preserving the edge information of the first part while the other concentrated on the removal of noise/artifacts from the latter part. To evaluate the performance of the p-MGIR algorithm, a numerical head-and-neck phantom, a Catphan 600 physical phantom, and a clinical head-and-neck cancer case were used for analysis. The results were compared with the standard Feldkamp-Davis-Kress as well as conventional CS-based algorithms. Examination of the p-MGIR algorithm showed that high-quality low-dose CBCT images can be reconstructed without compromising the image resolution. For both phantom and the patient cases, the p-MGIR is able to achieve a clinically

  7. Revealing plant cryptotypes: defining meaningful phenotypes among infinite traits.

    Science.gov (United States)

    Chitwood, Daniel H; Topp, Christopher N

    2015-04-01

    The plant phenotype is infinite. Plants vary morphologically and molecularly over developmental time, in response to the environment, and genetically. Exhaustive phenotyping remains not only out of reach, but is also the limiting factor to interpreting the wealth of genetic information currently available. Although phenotyping methods are always improving, an impasse remains: even if we could measure the entirety of phenotype, how would we interpret it? We propose the concept of cryptotype to describe latent, multivariate phenotypes that maximize the separation of a priori classes. Whether the infinite points comprising a leaf outline or shape descriptors defining root architecture, statistical methods to discern the quantitative essence of an organism will be required as we approach measuring the totality of phenotype. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Estimating clinical chemistry reference values based on an existing data set of unselected animals.

    Science.gov (United States)

    Dimauro, Corrado; Bonelli, Piero; Nicolussi, Paola; Rassu, Salvatore P G; Cappio-Borlino, Aldo; Pulina, Giuseppe

    2008-11-01

    In an attempt to standardise the determination of biological reference values, the International Federation of Clinical Chemistry (IFCC) has published a series of recommendations on developing reference intervals. The IFCC recommends the use of an a priori sampling of at least 120 healthy individuals. However, such a high number of samples and laboratory analysis is expensive, time-consuming and not always feasible, especially in veterinary medicine. In this paper, an alternative (a posteriori) method is described and is used to determine reference intervals for biochemical parameters of farm animals using an existing laboratory data set. The method used was based on the detection and removal of outliers to obtain a large sample of animals likely to be healthy from the existing data set. This allowed the estimation of reliable reference intervals for biochemical parameters in Sarda dairy sheep. This method may also be useful for the determination of reference intervals for different species, ages and gender.
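
    The a posteriori strategy outlined above, cleaning an existing laboratory data set of outliers and then taking the central 95% interval, can be sketched compactly. The Python sketch below uses iterative Tukey fences for outlier removal and non-parametric percentile limits; the specific outlier-detection rule and the synthetic data are illustrative, and the paper's procedure may differ in detail.

        import numpy as np

        def reference_interval(values, k=1.5, max_iter=10):
            """Iteratively drop Tukey-fence outliers, then return the non-parametric
            2.5th-97.5th percentile reference interval and the number of records kept."""
            x = np.asarray(values, dtype=float)
            for _ in range(max_iter):
                q1, q3 = np.percentile(x, [25, 75])
                lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
                kept = x[(x >= lo) & (x <= hi)]
                if kept.size == x.size:
                    break
                x = kept
            return np.percentile(x, [2.5, 97.5]), x.size

        rng = np.random.default_rng(7)
        healthy = rng.normal(70.0, 8.0, size=1500)      # e.g. serum total protein, g/L
        sick = rng.normal(95.0, 10.0, size=80)          # unselected records include sick animals
        interval, n_used = reference_interval(np.concatenate([healthy, sick]))
        print(n_used, np.round(interval, 1))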

  9. A priori estimates of global solutions of superlinear parabolic systems

    Directory of Open Access Journals (Sweden)

    Julius Pacuta

    2016-04-01

    Full Text Available We consider the parabolic system $u_{t}-\Delta u = u^{r}v^{p}$, $v_{t}-\Delta v = u^{q}v^{s}$ in $\Omega\times(0,\infty)$, complemented by the homogeneous Dirichlet boundary conditions and the initial conditions $(u,v)(\cdot,0) = (u_{0},v_{0})$ in $\Omega$, where $\Omega$ is a smooth bounded domain in $\mathbb{R}^{N}$ and $u_{0},v_{0}\in L^{\infty}(\Omega)$ are nonnegative functions. We find conditions on $p,q,r,s$ guaranteeing a priori estimates of nonnegative classical global solutions. More precisely, every such solution is bounded by a constant depending on a suitable norm of the initial data. Our proofs are based on bootstrap in weighted Lebesgue spaces, universal estimates of auxiliary functions and estimates of the Dirichlet heat kernel.

  10. Knowledge Management and Reference Services

    Science.gov (United States)

    Gandhi, Smiti

    2004-01-01

    Many corporations are embracing knowledge management (KM) to capture the intellectual capital of their employees. This article focuses on KM applications for reference work in libraries. It defines key concepts of KM, establishes a need for KM for reference services, and reviews various KM initiatives for reference services.

  11. Parameter transferability within homogeneous regions and comparisons with predictions from a priori parameters in the eastern United States

    Science.gov (United States)

    Chouaib, Wafa; Alila, Younes; Caldwell, Peter V.

    2018-05-01

    The need for predictions of flow time-series persists at ungauged catchments, motivating the research goals of our study. By means of the Sacramento model, this paper explores the use of parameter transfer within homogeneous regions of similar climate and flow characteristics and makes comparisons with predictions from a priori parameters. We assessed the performance using the Nash-Sutcliffe (NS), bias, mean monthly hydrograph and flow duration curve (FDC). The study was conducted on a large dataset of 73 catchments within the eastern US. Two approaches to parameter transferability were developed and evaluated: (i) the within homogeneous region parameter transfer using one donor catchment specific to each region, (ii) the parameter transfer disregarding the geographical limits of homogeneous regions, where one donor catchment was common to all regions. Comparisons between both parameter transfers made it possible to assess the gain in performance from the parameter regionalization and its respective constraints and limitations. The parameter transfer within homogeneous regions outperformed the a priori parameters and led to a decrease in bias and increase in efficiency reaching a median NS of 0.77 and a NS of 0.85 at individual catchments. The use of FDC revealed the effect of bias on the inaccuracy of prediction from parameter transfer. In one specific region, of mountainous and forested catchments, the prediction accuracy of the parameter transfer was less satisfactory and equivalent to a priori parameters. In this region, the parameter transfer from the outsider catchment provided the best performance: less biased, with smaller uncertainty in the medium flow percentiles (40%-60%). The large disparity of energy conditions explained the lack of performance from parameter transfer in this region. Besides, the subsurface stormflow is predominant and there is a likelihood of lateral preferential flow, which according to its specific properties further explained the reduced

  12. O Caráter a priori das Estruturas Necessárias ao Conhecimento, Construídas segundo a Epistemologia Genética [The A Priori Character of the Structures Necessary for Knowledge, Constructed according to Genetic Epistemology]

    OpenAIRE

    Marçal, Vicente Eduardo Ribeiro; Tassinari, Ricardo Pereira [UNESP

    2014-01-01

    In this paper we discuss the question of the a priori character of the necessary structures of knowledge according to Genetic Epistemology, focusing on the notion of space in particular. We establish some relations between Jean Piaget’s Genetic Epistemology and Immanuel Kant’s Critical Philosophy, discuss the notion of the a priori according to Kant in relation to the notion of space, and discuss the construction of the notion of space by the epistemic subject according to Genetic Epistemolog...

  13. Communication of scientific uncertainty

    DEFF Research Database (Denmark)

    Brown, Kerry A; Wit, Liesbeth de; Timotijevic, Lada

    2015-01-01

    of folate and vitamin D Dietary Reference Values was explored in three a priori defined areas: (i) value request; (ii) evidence evaluation; and (iii) final values. Design: Qualitative case studies (semi-structured interviews and desk research). A common protocol was used for data collection, interview...

  14. Characteristic Investigation of Unfolded Neutron Spectra with Different Priori Information and Gamma Radiation Interference

    International Nuclear Information System (INIS)

    Kim, Bong Hwan

    2006-01-01

    Neutron field spectrometry using multiple spheres such as Bonner Spheres (BS) has long been almost essential in radiation protection dosimetry at the workplace, in spite of poor energy resolution, because it does not demand fine energy resolution but rather easy operation and measurement performance over the wide energy range of interest. KAERI has developed and used an extended BS system based on a LiI(Eu) scintillator as its representative neutron spectrometry system for workplace monitoring as well as for the quantification of neutron calibration fields such as those recommended by ISO 8529. The major issues in using BS are how close the unfolded spectrum is to the real one and how to minimize the interference of gamma radiation in neutron/gamma mixed fields in the case of an active instrument such as a BS with a LiI(Eu) scintillator. The former is related to the choice of a priori information when unfolding the measured data, and the latter depends on how gamma events are discriminated in intense gamma radiation fields. The influence of a priori information in unfolding and the effect of counting loss due to pile-up of signals for the KAERI BS system were investigated by analyzing the spectral measurement results of Scattered Neutron Calibration Fields (SNCF).

  15. Contribution to restoration of degraded images by a space-variant system: use of an a priori model of the image

    International Nuclear Information System (INIS)

    Barakat, Valerie

    1998-01-01

    Imaging systems often present shift-variant point spread functions, which are usually approximated by shift-invariant ones in order to simplify the restoration problem. The aim of this thesis is to show that, if this shift-variant degradation is taken into account, it may strongly increase the quality of the restoration. The imaging system is a pinhole, used to acquire images of high energy beams. Three restoration methods have been studied and compared: the Tikhonov-Miller regularization, the Markov-fields and the Maximum-Entropy methods. These methods are based on the incorporation of a priori knowledge into the restoration process to achieve stability of the solution. An improved restoration method is proposed: this approach is based on the Tikhonov-Miller regularization, combined with an a priori model of the solution. The idea of such a model is to express local characteristics to be reconstructed. The concept of parametric models described by a set of parameters (shape of the object, amplitude values,...) is used. A parametric optimization is used to find the optimal estimate of the parameters, close to the correct a priori information on the expected solution. Several criteria have been proposed to measure the restoration quality. (author) [fr]

  16. A review of a priori regression models for warfarin maintenance dose prediction.

    Directory of Open Access Journals (Sweden)

    Ben Francis

    Full Text Available A number of a priori warfarin dosing algorithms, derived using linear regression methods, have been proposed. Although these dosing algorithms may have been validated using patients derived from the same centre, rarely have they been validated using a patient cohort recruited from another centre. In order to undertake external validation, two cohorts were utilised. One cohort formed by patients from a prospective trial and the second formed by patients in the control arm of the EU-PACT trial. Of these, 641 patients were identified as having attained stable dosing and formed the dataset used for validation. Predicted maintenance doses from six criterion fulfilling regression models were then compared to individual patient stable warfarin dose. Predictive ability was assessed with reference to several statistics including the R-square and mean absolute error. The six regression models explained different amounts of variability in the stable maintenance warfarin dose requirements of the patients in the two validation cohorts; adjusted R-squared values ranged from 24.2% to 68.6%. An overview of the summary statistics demonstrated that no one dosing algorithm could be considered optimal. The larger validation cohort from the prospective trial produced more consistent statistics across the six dosing algorithms. The study found that all the regression models performed worse in the validation cohort when compared to the derivation cohort. Further, there was little difference between regression models that contained pharmacogenetic coefficients and algorithms containing just non-pharmacogenetic coefficients. The inconsistency of results between the validation cohorts suggests that unaccounted population specific factors cause variability in dosing algorithm performance. Better methods for dosing that take into account inter- and intra-individual variability, at the initiation and maintenance phases of warfarin treatment, are needed.

  17. A review of a priori regression models for warfarin maintenance dose prediction.

    Science.gov (United States)

    Francis, Ben; Lane, Steven; Pirmohamed, Munir; Jorgensen, Andrea

    2014-01-01

    A number of a priori warfarin dosing algorithms, derived using linear regression methods, have been proposed. Although these dosing algorithms may have been validated using patients derived from the same centre, rarely have they been validated using a patient cohort recruited from another centre. In order to undertake external validation, two cohorts were utilised. One cohort formed by patients from a prospective trial and the second formed by patients in the control arm of the EU-PACT trial. Of these, 641 patients were identified as having attained stable dosing and formed the dataset used for validation. Predicted maintenance doses from six criterion fulfilling regression models were then compared to individual patient stable warfarin dose. Predictive ability was assessed with reference to several statistics including the R-square and mean absolute error. The six regression models explained different amounts of variability in the stable maintenance warfarin dose requirements of the patients in the two validation cohorts; adjusted R-squared values ranged from 24.2% to 68.6%. An overview of the summary statistics demonstrated that no one dosing algorithm could be considered optimal. The larger validation cohort from the prospective trial produced more consistent statistics across the six dosing algorithms. The study found that all the regression models performed worse in the validation cohort when compared to the derivation cohort. Further, there was little difference between regression models that contained pharmacogenetic coefficients and algorithms containing just non-pharmacogenetic coefficients. The inconsistency of results between the validation cohorts suggests that unaccounted population specific factors cause variability in dosing algorithm performance. Better methods for dosing that take into account inter- and intra-individual variability, at the initiation and maintenance phases of warfarin treatment, are needed.
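
    The external-validation statistics reported above are straightforward to reproduce for any candidate dosing algorithm once predicted and observed stable doses are available. The sketch below uses invented dose vectors purely for illustration; it is not derived from the EU-PACT or prospective-trial data.

```python
# Minimal sketch of the validation statistics used above (illustrative data only).
import numpy as np

def validation_stats(observed_dose, predicted_dose):
    observed = np.asarray(observed_dose, dtype=float)
    predicted = np.asarray(predicted_dose, dtype=float)
    resid = observed - predicted
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot          # proportion of variability explained
    mae = np.mean(np.abs(resid))               # mean absolute error, mg/week
    return r_squared, mae

# Hypothetical stable maintenance doses (mg/week) and one algorithm's predictions
obs = np.array([21.0, 35.0, 28.0, 42.0, 17.5, 49.0, 31.5])
pred = np.array([25.0, 31.0, 30.0, 38.0, 21.0, 44.0, 29.0])
r2, mae = validation_stats(obs, pred)
print(f"R-squared = {r2:.2f}, MAE = {mae:.1f} mg/week")
```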

  18. Use of a priori information in incomplete data x-ray CT imaging

    International Nuclear Information System (INIS)

    Eberhard, J.W.; Hedengren, K.H.

    1988-01-01

    A new technique for utilizing a priori information is presented which uses CAD electronic part models to make effective use of all the information available in the blueprint of a selected industrial part. Significant improvements in x-ray image quality are demonstrated using the technique in the image enhancement of the model of an exhaust nozzle actuation ring for the F110 aircraft engine. Three approaches were evaluated: a projection data approach, an iterative reconstruction approach, and an image processing and analysis approach. Results for these approaches are included. X-ray CT images of the simulated part, reconstructed with several choices of available angular range, are shown.

  19. Globally Stable Adaptive Backstepping Neural Network Control for Uncertain Strict-Feedback Systems With Tracking Accuracy Known a Priori.

    Science.gov (United States)

    Chen, Weisheng; Ge, Shuzhi Sam; Wu, Jian; Gong, Maoguo

    2015-09-01

    This paper addresses the problem of globally stable direct adaptive backstepping neural network (NN) tracking control design for a class of uncertain strict-feedback systems under the assumption that the accuracy of the ultimate tracking error is given a priori. In contrast to the classical adaptive backstepping NN control schemes, this paper analyzes the convergence of the tracking error using Barbalat's Lemma via some nonnegative functions rather than the positive-definite Lyapunov functions. Thus, the accuracy of the ultimate tracking error can be determined and adjusted accurately a priori, and the closed-loop system is guaranteed to be globally uniformly ultimately bounded. The main technical novelty is to construct three new nth-order continuously differentiable functions, which are used to design the control law, the virtual control variables, and the adaptive laws. Finally, two simulation examples are given to illustrate the effectiveness and advantages of the proposed control method.

  20. Reference in human and non-human primate communication: What does it take to refer?

    Science.gov (United States)

    Sievers, Christine; Gruber, Thibaud

    2016-07-01

    The concept of functional reference has been used to isolate potentially referential vocal signals in animal communication. However, its relatedness to the phenomenon of reference in human language has recently been brought into question. While some researchers have suggested abandoning the concept of functional reference altogether, others advocate a revision of its definition to include contextual cues that play a role in signal production and perception. Empirical and theoretical work on functional reference has also put much emphasis on how the receiver understands the referential signal. However, reference, as defined in the linguistic literature, is an action of the producer, and therefore, any definition describing reference in non-human animals must also focus on the producer. To successfully determine whether a signal is used to refer, we suggest an approach from the field of pragmatics, taking a closer look at specific situations of signal production, specifically at the factors that influence the production of a signal by an individual. We define the concept of signaller's reference to identify intentional acts of reference produced by a signaller independently of the communicative modality, and illustrate it with a case study of the hoo vocalizations produced by wild chimpanzees during travel. This novel framework introduces an intentional approach to referentiality. It may therefore permit a closer comparison of human and non-human animal referential behaviour and underlying cognitive processes, allowing us to identify what may have emerged solely in the human lineage.

  1. Determining the depth of certain gravity sources without a priori specification of their structural index

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian

    2015-11-01

    We have developed a new method for the interpretation of gravity tensor data based on the generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data. We take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori. We develop the new method by generalizing the Tilt-depth method for depth estimation for different types of source bodies. The new technique uses only the three vertical tensor components of the full gravity tensor data observed or calculated at different height planes to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes different upward continuation height data, which can effectively reduce the influence of noise. Theoretical simulations of the gravity source model with and without noise illustrate the ability of the method to provide source depth information. Additionally, the simulations demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method using the gravity data acquired over the Humble Salt Dome in the USA as an example. The results show a good correspondence to the previous drilling and seismic interpretation results.

  2. A priori study of subgrid-scale features in turbulent Rayleigh-Bénard convection

    Science.gov (United States)

    Dabbagh, F.; Trias, F. X.; Gorobets, A.; Oliva, A.

    2017-10-01

    At the crossroad between flow topology analysis and turbulence modeling, a priori studies are a reliable tool to understand the underlying physics of the subgrid-scale (SGS) motions in turbulent flows. In this paper, properties of the SGS features in the framework of a large-eddy simulation are studied for a turbulent Rayleigh-Bénard convection (RBC). To do so, data from direct numerical simulation (DNS) of a turbulent air-filled RBC in a rectangular cavity of aspect ratio unity and π spanwise open-ended distance are used at two Rayleigh numbers Ra ∈ {10^8, 10^10} [Dabbagh et al., "On the evolution of flow topology in turbulent Rayleigh-Bénard convection," Phys. Fluids 28, 115105 (2016)]. First, DNS at Ra = 10^8 is used to assess the performance of eddy-viscosity models such as QR, Wall-Adapting Local Eddy-viscosity (WALE), and the recent S3PQR-models proposed by Trias et al. ["Building proper invariants for eddy-viscosity subgrid-scale models," Phys. Fluids 27, 065103 (2015)]. The outcomes imply that the eddy-viscosity modeling smoothes the coarse-grained viscous straining and retrieves fairly well the effect of the kinetic unfiltered scales in order to reproduce the coherent large scales. However, these models fail to approach the exact evolution of the SGS heat flux and are incapable of reproducing well the further dominant rotational enstrophy pertaining to the buoyant production. Afterwards, the key ingredients of eddy-viscosity, νt, and eddy-diffusivity, κt, are calculated a priori and revealed positive prevalent values to maintain a turbulent wind essentially driven by the mean buoyant force at the sidewalls. The topological analysis suggests that the effective turbulent diffusion paradigm and the hypothesis of a constant turbulent Prandtl number are only applicable in the large-scale strain-dominated areas in the bulk. It is shown that the bulk-dominated rotational structures of vortex-stretching (and its synchronous viscous dissipative structures) hold

  3. Technostress and the Reference Librarian.

    Science.gov (United States)

    Kupersmith, John

    1992-01-01

    Defines "technostress" as the stress experienced by reference librarians who must constantly deal with the demands of new information technology and the changes they produce in the work place. Discussion includes suggested ways in which both organizations and individuals can work to reduce stress. (27 references) (LAE)

  4. Criteria to define a more relevant reference sample of titanium dioxide in the context of food: a multiscale approach.

    Science.gov (United States)

    Dudefoi, William; Terrisse, Hélène; Richard-Plouet, Mireille; Gautron, Eric; Popa, Florin; Humbert, Bernard; Ropers, Marie-Hélène

    2017-05-01

    Titanium dioxide (TiO2) is a transition metal oxide widely used as a white pigment in various applications, including food. Due to the classification of TiO2 nanoparticles by the International Agency for Research on Cancer as potentially harmful for humans by inhalation, the presence of nanoparticles in food products needed to be confirmed by a set of independent studies. Seven samples of food-grade TiO2 (E171) were extensively characterised for their size distribution, crystallinity and surface properties by the currently recommended methods. All investigated E171 samples contained a fraction of nanoparticles, although below the threshold defining the labelling of a nanomaterial. On the basis of these results and a statistical analysis, E171 food-grade TiO2 totally differs from the reference material P25, confirming the few published data on this kind of particle. Therefore, the reference material P25 does not appear to be the most suitable model to study the fate of food-grade TiO2 in the gastrointestinal tract. The criteria currently proposed to obtain a representative food-grade sample of TiO2 are the following: (1) crystalline-phase anatase, (2) a powder with an isoelectric point very close to 4.1, (3) a fraction of nanoparticles comprised between 15% and 45%, and (4) a low specific surface area around 10 m² g⁻¹.

  5. Local digital control of power electronic converters in a dc microgrid based on a-priori derivation of switching surfaces

    Science.gov (United States)

    Banerjee, Bibaswan

    In power electronic based microgrids, the computational requirements needed to implement an optimized online control strategy can be prohibitive. The work presented in this dissertation proposes a generalized method of derivation of geometric manifolds in a dc microgrid that is based on the a-priori computation of the optimal reactions and trajectories for classes of events in a dc microgrid. The proposed states are the stored energies in all the energy storage elements of the dc microgrid and power flowing into them. It is anticipated that calculating a large enough set of dissimilar transient scenarios will also span many scenarios not specifically used to develop the surface. These geometric manifolds will then be used as reference surfaces in any type of controller, such as a sliding mode hysteretic controller. The presence of switched power converters in microgrids involves different control actions for different system events. The control of the switch states of the converters is essential for steady state and transient operations. A digital memory look-up based controller that uses a hysteretic sliding mode control strategy is an effective technique to generate the proper switch states for the converters. An example dc microgrid with three dc-dc boost converters and resistive loads is considered for this work. The geometric manifolds are successfully generated for transient events, such as step changes in the loads and the sources. The surfaces corresponding to a specific case of step change in the loads are then used as reference surfaces in an EEPROM for experimentally validating the control strategy. The required switch states corresponding to this specific transient scenario are programmed in the EEPROM as a memory table. This controls the switching of the dc-dc boost converters and drives the system states to the reference manifold. In this work, it is shown that this strategy effectively controls the system for a transient condition such as step changes

  6. A new birthweight reference in Guangzhou, southern China, and its comparison with the global reference.

    Science.gov (United States)

    He, Jian-Rong; Xia, Hui-Min; Liu, Yu; Xia, Xiao-Yan; Mo, Wei-Jian; Wang, Ping; Cheng, Kar Keung; Leung, Gabriel M; Feng, Qiong; Schooling, C Mary; Qiu, Xiu

    2014-12-01

    To formulate a new birthweight reference for different gestational ages in Guangzhou, southern China, and compare it with the currently used reference in China and the global reference. All singleton live births of more than 26 weeks' gestational age recorded in the Guangzhou Perinatal Health Care and Delivery Surveillance System for the years 2009, 2010 and 2011 (n=510 837) were retrospectively included in the study. In addition, the study sample was supplemented by all singleton live births (n=3538) at gestational ages 26-33 weeks from 2007 and 2008. We used Gaussian mixture models and robust regression to exclude outliers of birth weight and then applied Generalized Additive Models for Location, Scale, and Shape (GAMLSS) to generate smoothed percentile curves separately for gender and parity. Of infants defined as small for gestational age (SGA) in the new reference, 15.3-47.7% (depending on gestational age) were considered appropriate for gestational age (AGA) by the currently used reference of China. Of the infants defined as SGA by the new reference, 9.2% with gestational ages 34-36 weeks and 14.3% with 37-41 weeks were considered AGA by the global reference. At the 50th centile line, the new reference curve was similar to that of the global reference for gestational ages 26-33 weeks and above the global reference for 34-40 weeks. The new birthweight reference based on birthweight data for neonates in Guangzhou, China, differs from the reference currently used in China and the global reference, and appears to be more relevant to the local population. Published by the BMJ Publishing Group Limited.

  7. ‘Sampling the reference set’ revisited

    NARCIS (Netherlands)

    Berkum, van E.E.M.; Linssen, H.N.; Overdijk, D.A.

    1998-01-01

    The confidence level of an inference table is defined as a weighted truth probability of the inference when sampling the reference set. The reference set is recognized by conditioning on the values of maximal partially ancillary statistics. In the sampling experiment values of incidental parameters

  8. A priori tests of combustion models based on a CH{sub 4}/H{sub 2} Triple Flame

    Energy Technology Data Exchange (ETDEWEB)

    Dombard, J.; Naud, B.; Jimenez Sanchez, C.

    2008-07-01

    This document reproduces the final project of Jerome Dombard, presented on June 25, 2008, in fulfilment of the MIMSE Master's degree (Master Ingenierie Mathematique, Statistique et Economique) of Bordeaux University (Universite Bordeaux 1). We make an a priori study of FPI/FGM-type turbulent combustion models using a 2D DNS of a triple flame. A reduced chemical scheme of 16 species and 12 reactions is used (ARM1, proposed by J.-Y. Chen at Berkeley University). The fuel (CH4/H2 mixture) and oxidizer (air) correspond to the inlet composition of the Sydney bluff-body stabilised flame experiments (flames HM1-3). First, we compute 1D laminar premixed flames. The purpose of those calculations is twofold: (1) to check the differences between different computer programs and different treatments of molecular diffusion, and (2) to calibrate the 2D-DNS of the laminar triple flame (mainly to decide on the grid resolution). Then, the solution of the 2D laminar triple flame is used to test a priori FPI/FGM tables. Finally, preliminary considerations on sub-grid scale modelling in the context of Large Eddy Simulation are made. (Author) 14 refs.

  9. A priori which-way information in quantum interference with unstable particles

    International Nuclear Information System (INIS)

    Krause, D.E.; Fischbach, E.; Rohrbach, Z.J.

    2014-01-01

    If an unstable particle used in a two-path interference experiment decays before reaching a detector, which-way information becomes available that reduces the detected interference fringe visibility V. Here we argue that even when an unstable particle does not decay while in the interferometer, a priori which-way information is still available in the form of path predictability P which depends on the particle's decay rate Γ. We further demonstrate that in a matter-wave Mach–Zehnder interferometer using an excited atom with an appropriately tuned cavity, P is related to V through the duality relation P² + V² = 1. - Highlights: • Even undecayed unstable particles exhibit novel interference effects. • Interference is studied in a Mach–Zehnder interferometer with a cavity. • More which-way information is available when using unstable particles. • A relation between which-way information and interference is satisfied.

  10. Reference Structures: Stagnation, Progress, and Future Challenges.

    Science.gov (United States)

    Greenberg, Jane

    1997-01-01

    Assesses the current state of reference structures in online public access catalogs (OPACs) in a framework defined by stagnation, progress, and future challenges. Outlines six areas for reference structure development. Twenty figures provide illustrations. (AEF)

  11. Targeted liquid chromatography tandem mass spectrometry to quantitate wheat gluten using well-defined reference proteins

    Science.gov (United States)

    Schalk, Kathrin; Koehler, Peter

    2018-01-01

    Celiac disease (CD) is an inflammatory disorder of the upper small intestine caused by the ingestion of storage proteins (prolamins and glutelins) from wheat, barley, rye, and, in rare cases, oats. CD patients need to follow a gluten-free diet by consuming gluten-free products with gluten contents of less than 20 mg/kg. Currently, the recommended method for the quantitative determination of gluten is an enzyme-linked immunosorbent assay (ELISA) based on the R5 monoclonal antibody. Because the R5 ELISA mostly detects the prolamin fraction of gluten, a new independent method is required to detect prolamins as well as glutelins. This paper presents the development of a method to quantitate 16 wheat marker peptides derived from all wheat gluten protein types by liquid chromatography tandem mass spectrometry (LC-MS/MS) in the multiple reaction monitoring mode. The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference wheat protein type resulted in peptide-specific yields. This enabled the conversion of peptide into protein type concentrations. Gluten contents were expressed as sum of all determined protein type concentrations. This new method was applied to quantitate gluten in wheat starches and compared to R5 ELISA and gel-permeation high-performance liquid chromatography with fluorescence detection (GP-HPLC-FLD), which resulted in a strong correlation between LC-MS/MS and the other two methods. PMID:29425234

  12. Targeted liquid chromatography tandem mass spectrometry to quantitate wheat gluten using well-defined reference proteins.

    Directory of Open Access Journals (Sweden)

    Kathrin Schalk

    Full Text Available Celiac disease (CD) is an inflammatory disorder of the upper small intestine caused by the ingestion of storage proteins (prolamins and glutelins) from wheat, barley, rye, and, in rare cases, oats. CD patients need to follow a gluten-free diet by consuming gluten-free products with gluten contents of less than 20 mg/kg. Currently, the recommended method for the quantitative determination of gluten is an enzyme-linked immunosorbent assay (ELISA) based on the R5 monoclonal antibody. Because the R5 ELISA mostly detects the prolamin fraction of gluten, a new independent method is required to detect prolamins as well as glutelins. This paper presents the development of a method to quantitate 16 wheat marker peptides derived from all wheat gluten protein types by liquid chromatography tandem mass spectrometry (LC-MS/MS) in the multiple reaction monitoring mode. The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference wheat protein type resulted in peptide-specific yields. This enabled the conversion of peptide into protein type concentrations. Gluten contents were expressed as sum of all determined protein type concentrations. This new method was applied to quantitate gluten in wheat starches and compared to R5 ELISA and gel-permeation high-performance liquid chromatography with fluorescence detection (GP-HPLC-FLD), which resulted in a strong correlation between LC-MS/MS and the other two methods.
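
    The conversion step described in the two records above (each marker-peptide concentration divided by its peptide-specific yield, the results summed over protein types) reduces to simple arithmetic. The sketch below uses entirely hypothetical peptide names, yields and concentrations; it only illustrates the bookkeeping, not the published calibration.

```python
# Hypothetical illustration of converting marker-peptide concentrations into
# protein-type concentrations via peptide-specific yields, then summing to gluten.
# Peptide names, yields and concentrations are invented for the example.
peptide_conc_mg_per_kg = {"pep_gliadin_1": 0.80, "pep_LMW_GS_1": 0.45, "pep_HMW_GS_1": 0.30}
peptide_yield = {"pep_gliadin_1": 0.05,   # mg peptide released per mg of reference protein type
                 "pep_LMW_GS_1": 0.03,
                 "pep_HMW_GS_1": 0.02}
peptide_to_type = {"pep_gliadin_1": "alpha-gliadins",
                   "pep_LMW_GS_1": "LMW glutenin subunits",
                   "pep_HMW_GS_1": "HMW glutenin subunits"}

protein_type_conc = {}
for pep, conc in peptide_conc_mg_per_kg.items():
    ptype = peptide_to_type[pep]
    protein_type_conc[ptype] = conc / peptide_yield[pep]   # mg protein type per kg sample

gluten_mg_per_kg = sum(protein_type_conc.values())
print(protein_type_conc)
print(f"gluten content ≈ {gluten_mg_per_kg:.0f} mg/kg (gluten-free limit: 20 mg/kg)")
```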

  13. Rapid multi-wavelength optical assessment of circulating blood volume without a priori data

    Science.gov (United States)

    Loginova, Ekaterina V.; Zhidkova, Tatyana V.; Proskurnin, Mikhail A.; Zharov, Vladimir P.

    2016-03-01

    The measurement of circulating blood volume (CBV) is crucial in various medical conditions including surgery, iatrogenic problems, rapid fluid administration, transfusion of red blood cells, or trauma with extensive blood loss including battlefield injuries and other emergencies. Currently, available commercial techniques are invasive and time-consuming for trauma situations. Recently, we have proposed high-speed multi-wavelength photoacoustic/photothermal (PA/PT) flow cytometry for in vivo CBV assessment with multiple dyes as PA contrast agents (labels). As the first step, we have characterized the capability of this technique to monitor the clearance of three dyes (indocyanine green, methylene blue, and trypan blue) in an animal model. However, there is strong demand for improvements in PA/PT flow cytometry. As additional verification of our proof-of-concept of this technique, we performed optical photometric CBV measurements in vitro. Three label dyes—methylene blue, crystal violet and, partially, brilliant green—were selected for simultaneous photometric determination of the components of their two-dye mixtures in the circulating blood in vitro without any extra data (like hemoglobin absorption) known a priori. The tests of single dyes and their mixtures in a flow system simulating a blood transfusion system showed a negligible difference between the sensitivities of the determination of these dyes under batch and flow conditions. For individual dyes, limits of detection of 3×10⁻⁶ M‒3×10⁻⁶ M in blood were achieved, which provided their continuous determination at a level of 10⁻⁵ M for the CBV assessment without a priori data on the matrix. CBV assessments with errors no higher than 4% were obtained, and the possibility of applying the developed procedure to optical photometry (flow cytometry) with laser sources was shown.
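
    Determining two dyes simultaneously from multi-wavelength absorbances is classical multicomponent photometry: with the effective absorptivity of each dye known at each wavelength, the absorbance vector satisfies A = E·c (Beer-Lambert law), which is solved in a least-squares sense. The absorptivity and concentration values below are placeholders, not the study's calibration data.

```python
# Hedged sketch of two-dye multicomponent photometry (placeholder absorptivities).
import numpy as np

# Rows: measurement wavelengths; columns: dyes (methylene blue, crystal violet).
# Entries are molar absorptivity times path length (L/mol), invented for illustration.
E = np.array([[7.0e4, 0.5e4],    # ~660 nm, methylene blue dominates
              [1.0e4, 8.5e4],    # ~590 nm, crystal violet dominates
              [3.0e4, 3.5e4]])   # third wavelength adds redundancy

c_true = np.array([1.2e-5, 0.8e-5])                               # mol/L
A = E @ c_true + 1e-3 * np.random.default_rng(2).normal(size=3)   # measured absorbances

c_est, *_ = np.linalg.lstsq(E, A, rcond=None)   # least-squares concentrations
print(c_est)                                    # recovers roughly [1.2e-5, 0.8e-5]
```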

  14. A Priori User Acceptance and the Perceived Driving Pleasure in Semi-autonomous and Autonomous Vehicles

    DEFF Research Database (Denmark)

    Bjørner, Thomas

    The aim of this minor pilot study is, from a sociological user perspective, to explore a priori user acceptance and the perceived driving pleasure in semi-autonomous and autonomous vehicles. The methods used were 13 in-depth interviews while having participants watch video examples within four different scenarios. After each scenario, two different numerical rating scales were used. There was a tendency toward positive attitudes regarding semi-autonomous driving systems, especially the use of a parking assistant and while driving in city traffic congestion. However, there were also major...

  15. Radiotherapy for brain metastases: defining palliative response

    International Nuclear Information System (INIS)

    Bezjak, Andrea; Adam, Janice; Panzarella, Tony; Levin, Wilfred; Barton, Rachael; Kirkbride, Peter; McLean, Michael; Mason, Warren; Wong, Chong Shun; Laperriere, Normand

    2001-01-01

    Background and purpose: Most patients with brain metastases are treated with palliative whole brain radiotherapy (WBRT). There is no established definition of palliative response. The aim of this study was to develop and test clinically useful criteria for response following palliative WBRT. Materials and methods: A prospective study was conducted of patients with symptomatic brain metastases treated with WBRT (20 Gy/5 fractions) and standardised steroid tapering. Assessments included observer rating of neurological symptoms, patient-completed symptom checklist and performance status (PS). Response criteria were operationally defined based on a combination of neurological symptoms, PS and steroid dose. Results: Seventy-five patients were accrued. At 1 month, presenting neurological symptoms were improved in 14 patients, stable in 17, and worse in 21; 23 patients were not assessed, mainly due to death or frailty. Using response criteria defined a priori, 15% (95% CI 7-23%) of patients were classified as having a response to RT, 25% no response, and 29% progression; 27% were deceased at or soon after 1 month. A revised set of criteria was tested, with less emphasis on complete tapering of steroids: they increased the proportion of patients responding to 39% (95% CI 27-50%) but did not change the large proportion who did not benefit (44%). Conclusions: Clinical response to RT of patients with brain metastases is multifactorial, comprising symptoms, PS and other factors. Assessment of the degree of palliation depends on the exact definition used. More research is needed in this important area, to help validate criteria for assessing palliation after WBRT.

  16. A program for the a priori evaluation of detection limits in instrumental neutron activation analysis using a SLOWPOKE II reactor

    International Nuclear Information System (INIS)

    Galinier, J.L.; Zikovsky, L.

    1982-01-01

    A program that permits the a priori calculation of detection limits in monoelemental matrices, adapted to instrumental neutron activation analysis using a SLOWPOKE II reactor, is described. A simplified model of the gamma spectra is proposed. Products of (n,p) and (n,α) reactions induced by the fast components of the neutron flux that accompanies the thermal flux at the level of internal irradiation sites in the reactor have been included in the list of interfering radionuclides. The program calculates in a systematic way the detection limits of 66 elements in an equal number of matrices using 153 intermediary radionuclides. Experimental checks carried out with silicon (for short lifetimes) and aluminum and magnesium (for intermediate lifetimes) show satisfactory agreement with the calculations. These results show in particular the importance of the contribution of the (n,p) and (n,α) reactions in the a priori evaluation of detection limits with a SLOWPOKE type reactor [fr]

  17. A priori analysis: an application to the estimate of the uncertainty in course grades

    Science.gov (United States)

    Lippi, G. L.

    2014-07-01

    A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual, but striking, application is presented when teaching the section on the data treatment of a laboratory course to illustrate the characteristics of the APA and its potential for widespread use, beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper.

  18. A priori analysis: an application to the estimate of the uncertainty in course grades

    International Nuclear Information System (INIS)

    Lippi, G L

    2014-01-01

    A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual, but striking, application is presented when teaching the section on the data treatment of a laboratory course to illustrate the characteristics of the APA and its potential for widespread use, beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper. (paper)

  19. Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference

    Science.gov (United States)

    Shen, Cheng; Guo, Cheng; Tan, Jiubin; Liu, Shutian; Liu, Zhengjun

    2018-06-01

    Multi-image iterative phase retrieval methods have been successfully applied in many research fields due to their simple but efficient implementation. However, there is a mismatch between the measurement of the first long imaging distance and the sequential interval. In this paper, an amplitude-phase retrieval algorithm with reference is put forward without additional measurements or a priori knowledge. It removes the need to measure the first imaging distance. With a designed update formula, it significantly raises the convergence speed and the reconstruction fidelity, especially in phase retrieval. Its superiority over the original amplitude-phase retrieval (APR) method is validated by numerical analysis and experiments. Furthermore, it provides a conceptual design of a compact holographic image sensor, which can achieve numerical refocusing easily.

  20. Association of a culturally defined syndrome (nervios) with chest pain and DSM-IV affective disorders in Hispanic patients referred for cardiac stress testing.

    Science.gov (United States)

    Pavlik, Valory N; Hyman, David J; Wendt, Juliet A; Orengo, Claudia

    2004-01-01

    Hispanics have a high prevalence of cardiovascular risk factors, most notably type 2 diabetes. However, in a large public hospital in Houston, Texas, Hispanic patients referred for cardiac stress testing were significantly more likely to have normal test results than were Whites or non-Hispanic Blacks. We undertook an exploratory study to determine if nervios, a culturally based syndrome that shares similarities with both panic disorder and anginal symptoms, is sufficiently prevalent among Hispanics referred for cardiac testing to be considered as a possible explanation for the high probability of a normal test result. Hispanic patients were recruited consecutively when they presented for a cardiac stress test. A bilingual interviewer administered a brief medical history, the Rose Angina Questionnaire (RAQ), a questionnaire to assess a history of nervios and associated symptoms, and the PRIME-MD, a validated brief questionnaire to diagnose DSM-IV defined affective disorders. The average age of the 114 participants (38 men and 76 women) was 57 years, and the average educational attainment was 7 years. Overall, 50% of participants reported a history of chronic nervios, and 14% reported an acute subtype known as ataque de nervios. Only 2% of patients had DSM-IV defined panic disorder, and 59% of patients had a positive RAQ score (ie, Rose questionnaire angina). The acute subtype, ataque de nervios, but not chronic nervios, was related to an increased probability of having Rose questionnaire angina (P=.006). Adjusted for covariates, a positive history of chronic nervios, but not Rose questionnaire angina, was significantly associated with a normal cardiac test result (OR=2.97, P=.04). Nervios is common among Hispanics with symptoms of cardiac disease. Additional research is needed to understand how nervios symptoms differ from chest pain in Hispanics and the role of nervios in referral for cardiac workup by primary care providers and emergency room personnel.

  1. Defining the Reference Condition for Wadeable Streams in the Sand Hills Subdivision of the Southeastern Plains Ecoregion, USA

    Science.gov (United States)

    Kosnicki, Ely; Sefick, Stephen A.; Paller, Michael H.; Jarrell, Miller S.; Prusha, Blair A.; Sterrett, Sean C.; Tuberville, Tracey D.; Feminella, Jack W.

    2014-09-01

    The Sand Hills subdivision of the Southeastern Plains ecoregion has been impacted by historical land uses over the past two centuries and, with the additive effects of contemporary land use, determining reference condition for streams in this region is a challenge. We identified reference condition based on the combined use of 3 independent selection methods. Method 1 involved use of a multivariate disturbance gradient derived from several stressors, method 2 was based on variation in channel morphology, and method 3 was based on passing 6 of 7 environmental criteria. Sites selected as reference from all 3 methods were considered primary reference, whereas those selected by 2 or 1 methods were considered secondary or tertiary reference, respectively. Sites not selected by any of the methods were considered non-reference. In addition, best professional judgment (BPJ) was used to exclude some sites from any reference class, and comparisons were made to examine the utility of BPJ. Non-metric multidimensional scaling indicated that use of BPJ may help designate non-reference sites when unidentified stressors are present. The macroinvertebrate community measures Ephemeroptera, Plecoptera, Trichoptera richness and North Carolina Biotic Index showed no differences between primary and secondary reference sites when BPJ was ignored. However, there was no significant difference among primary, secondary, and tertiary reference sites when BPJ was used. We underscore the importance of classifying reference conditions, especially in regions that have endured significant anthropogenic activity. We suggest that the use of secondary reference sites may enable construction of models that target a broader set of management interests.
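
    The tiered designation described above (primary = flagged by all three selection methods, secondary = two, tertiary = one, non-reference = none, with possible exclusion by best professional judgment) amounts to a simple counting rule; the sketch below is only an illustration of that rule, not the authors' software.

```python
# Compact sketch of the tiered reference-site designation described above.
def reference_class(passed_method_1, passed_method_2, passed_method_3, excluded_by_bpj=False):
    """Return the reference tier implied by the three independent selection methods."""
    if excluded_by_bpj:
        return "non-reference (excluded by best professional judgment)"
    n_methods = sum([passed_method_1, passed_method_2, passed_method_3])
    return {3: "primary reference", 2: "secondary reference",
            1: "tertiary reference", 0: "non-reference"}[n_methods]

print(reference_class(True, True, True))    # primary reference
print(reference_class(True, False, True))   # secondary reference
```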

  2. Musical Probabilities, Abductive Reasoning, and Brain Mechanisms: Extended Perspective of "A Priori" Listening to Music within the Creative Cognition Approach

    Science.gov (United States)

    Schmidt, Sebastian; Troge, Thomas A.; Lorrain, Denis

    2013-01-01

    A theory of listening to music is proposed. It suggests that, for listeners, the process of prediction is the starting point to experiencing music. This implies that perception of music starts through both a predisposed and an experience-based extrapolation into the future (this is labeled "a priori" listening). Indications for this…

  3. Factor analysis with a priori knowledge - application in dynamic cardiac SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Sitek, A.; Di Bella, E.V.R.; Gullberg, G.T. [Medical Imaging Research Laboratory, Department of Radiology, University of Utah, CAMT, 729 Arapeen Drive, Salt Lake City, UT 84108-1218 (United States)

    2000-09-01

    Two factor analysis of dynamic structures (FADS) methods for the extraction of time-activity curves (TACs) from cardiac dynamic SPECT data sequences were investigated. One method was based on a least squares (LS) approach which was subject to positivity constraints. The other method was the well known apex-seeking (AS) method. A post-processing step utilizing a priori information was employed to correct for the non-uniqueness of the FADS solution. These methods were used to extract 99mTc-teboroxime TACs from computer simulations and from experimental canine and patient studies. In computer simulations, the LS and AS methods, which are completely different algorithms, yielded very similar and accurate results after application of the correction for non-uniqueness. FADS-obtained blood curves correlated well with curves derived from region of interest (ROI) measurements in the experimental studies. The results indicate that the factor analysis techniques can be used for semi-automatic estimation of activity curves derived from cardiac dynamic SPECT images, and that they can be used for separation of physiologically different regions in dynamic cardiac SPECT studies. (author)
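
    The positivity-constrained least-squares idea can be illustrated with SciPy's non-negative least-squares solver: with a set of factor curves (TACs) held fixed, each voxel's non-negative factor coefficients follow from an NNLS fit. This is a generic sketch of one such step under assumed synthetic curves, not the authors' full FADS algorithm or their non-uniqueness correction.

```python
# Generic sketch of a positivity-constrained least-squares step used in FADS-like
# factor analysis; not the authors' exact algorithm or post-processing.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_time, n_factors, n_voxels = 30, 3, 100

# Assumed "true" time-activity curves (factors) and non-negative mixing coefficients
t = np.linspace(0.0, 1.0, n_time)
F_true = np.stack([np.exp(-3 * t),
                   1 - np.exp(-5 * t),
                   np.exp(-(t - 0.4) ** 2 / 0.02)], axis=1)
C_true = rng.uniform(0, 1, size=(n_factors, n_voxels))
data = F_true @ C_true + 0.01 * rng.normal(size=(n_time, n_voxels))

# With the factors fixed, solve a non-negative least-squares problem per voxel
C_est = np.column_stack([nnls(F_true, data[:, v])[0] for v in range(n_voxels)])
print(np.abs(C_est - C_true).mean())   # small residual coefficient error
```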

  4. Identification of parametric models with a priori knowledge of process properties

    Directory of Open Access Journals (Sweden)

    Janiszowski Krzysztof B.

    2016-12-01

    Full Text Available An approach to estimation of a parametric discrete-time model of a process in the case of some a priori knowledge of the investigated process properties is presented. The knowledge of plant properties is introduced in the form of linear bounds, which can be determined for the coefficient vector of the parametric model studied. The approach yields special biased estimation of model coefficients that preserves demanded properties. A formula for estimation of the model coefficients is derived and combined with a recursive scheme determined for minimization of the sum of absolute model errors. The estimation problem of a model with known static gains of inputs is discussed and proper formulas are derived. This approach can overcome the non-identifiability problem which has been observed during estimation based on measurements recorded in industrial closed-loop control systems. The application of the proposed approach to estimation of a model for an industrial plant (a water injector into the steam flow in a power plant is presented and discussed.

  5. A Priori Analysis of a Compressible Flamelet Model using RANS Data for a Dual-Mode Scramjet Combustor

    Science.gov (United States)

    Quinlan, Jesse R.; Drozda, Tomasz G.; McDaniel, James C.; Lacaze, Guilhem; Oefelein, Joseph

    2015-01-01

    In an effort to make large eddy simulation of hydrocarbon-fueled scramjet combustors more computationally accessible using realistic chemical reaction mechanisms, a compressible flamelet/progress variable (FPV) model was proposed that extends current FPV model formulations to high-speed, compressible flows. Development of this model relied on observations garnered from an a priori analysis of the Reynolds-Averaged Navier-Stokes (RANS) data obtained for the Hypersonic International Flight Research and Experimentation (HIFiRE) dual-mode scramjet combustor. The RANS data were obtained using a reduced chemical mechanism for the combustion of a JP-7 surrogate and were validated using available experimental data. These RANS data were then post-processed to obtain, in an a priori fashion, the scalar fields corresponding to an FPV-based modeling approach. In the current work, in addition to the proposed compressible flamelet model, a standard incompressible FPV model was also considered. Several candidate progress variables were investigated for their ability to recover static temperature and major and minor product species. The effects of pressure and temperature on the tabulated progress variable source term were characterized, and model coupling terms embedded in the Reynolds-averaged Navier-Stokes equations were studied. Finally, results for the novel compressible flamelet/progress variable model were presented to demonstrate the improvement attained by modeling the effects of pressure and flamelet boundary conditions on the combustion.

  6. The Retinome – Defining a reference transcriptome of the adult mammalian retina/retinal pigment epithelium

    Directory of Open Access Journals (Sweden)

    Goetz Thomas

    2004-07-01

    Full Text Available Background: The mammalian retina is a valuable model system to study neuronal biology in health and disease. To obtain insight into intrinsic processes of the retina, great efforts are directed towards the identification and characterization of transcripts with functional relevance to this tissue. Results: With the goal to assemble a first genome-wide reference transcriptome of the adult mammalian retina, referred to as the retinome, we have extracted 13,037 non-redundant annotated genes from nearly 500,000 published datasets on redundant retina/retinal pigment epithelium (RPE) transcripts. The data were generated from 27 independent studies employing a wide range of molecular and biocomputational approaches. Comparison to known retina-/RPE-specific pathways and established retinal gene networks suggest that the reference retinome may represent up to 90% of the retinal transcripts. We show that the distribution of retinal genes along the chromosomes is not random but exhibits a higher order organization closely following the previously observed clustering of genes with increased expression. Conclusion: The genome-wide retinome map offers a rational basis for selecting suggestive candidate genes for hereditary as well as complex retinal diseases facilitating elaborate studies into normal and pathological pathways. To make this unique resource freely available we have built a database providing a query interface to the reference retinome.

  7. Predicting thermal history a-priori for magnetic nanoparticle hyperthermia of internal carcinoma

    Science.gov (United States)

    Dhar, Purbarun; Sirisha Maganti, Lakshmi

    2017-08-01

    This article proposes a simplistic and realistic method where a direct analytical expression can be derived for the temperature field within a tumour during magnetic nanoparticle hyperthermia. The approximated analytical expression for thermal history within the tumour is derived based on the lumped capacitance approach and considers all therapy protocols and parameters. The present method is simplistic and provides an easy framework for estimating hyperthermia protocol parameters promptly. The model has been validated with respect to several experimental reports on animal models such as mice/rabbit/hamster and human clinical trials. It has been observed that the model is able to accurately estimate the thermal history within the carcinoma during the hyperthermia therapy. The present approach may find implications in a-priori estimation of the thermal history in internal tumours for optimizing magnetic hyperthermia treatment protocols with respect to the ablation time, tumour size, magnetic drug concentration, field strength, field frequency, nanoparticle material and size, tumour location, and so on.
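
    The lumped-capacitance idea mentioned above treats the tumour as a single thermal node: stored heat changes at the rate of magnetic power deposition minus losses to the surrounding tissue. The sketch below integrates that balance with assumed parameter values (tissue properties, tumour size, deposited power, loss coefficient); it only illustrates the modelling approach and is not the paper's derived expression.

```python
# Lumped-capacitance sketch of tumour heating (all parameter values are assumptions).
import numpy as np

rho, c_p = 1050.0, 3600.0          # tissue density (kg/m^3) and specific heat (J/kg/K)
r = 0.01                           # tumour radius, m (~1 cm)
V = 4.0 / 3.0 * np.pi * r ** 3     # tumour volume, m^3
A = 4.0 * np.pi * r ** 2           # tumour surface area, m^2
h = 50.0                           # effective loss coefficient to tissue/perfusion, W/m^2/K
P = 0.5                            # magnetic power deposited in the tumour, W
T_body, dt, t_end = 37.0, 0.5, 1800.0

T = T_body
for _ in range(int(t_end / dt)):   # explicit Euler integration of the heat balance
    dTdt = (P - h * A * (T - T_body)) / (rho * c_p * V)
    T += dt * dTdt

T_steady = T_body + P / (h * A)    # analytical steady state of the lumped model
print(f"temperature after 30 min ≈ {T:.1f} °C (steady state {T_steady:.1f} °C)")
```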

  8. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    Science.gov (United States)

    Lo, Y C; Armbruster, David A

    2012-04-01

    Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they are based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP evaluator software program, which is based on the CLSI/IFCC C28-A guideline, and defines the reference interval as the 95% central range. Method specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined. Gender specific or combined gender intervals were adapted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adapt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique based on a method comparison study.
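
    The 95% central range referred to above is, in its simplest nonparametric form, just the band between the 2.5th and 97.5th percentiles of the reference sample. A minimal computation on simulated analyte values (not the Hong Kong donor data) looks like this:

```python
# Minimal nonparametric 95% central reference interval (simulated analyte values).
import numpy as np

rng = np.random.default_rng(4)
analyte_u_per_l = rng.lognormal(mean=3.0, sigma=0.35, size=300)   # e.g. an ALT-like enzyme, U/L

lower, upper = np.percentile(analyte_u_per_l, [2.5, 97.5])
print(f"reference interval: {lower:.0f}-{upper:.0f} U/L (n = {analyte_u_per_l.size})")
```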

  9. Investigating industrial investigation: examining the impact of a priori knowledge and tunnel vision education.

    Science.gov (United States)

    Maclean, Carla L; Brimacombe, C A Elizabeth; Lindsay, D Stephen

    2013-12-01

    The current study addressed tunnel vision in industrial incident investigation by experimentally testing how a priori information and a human bias (generated via the fundamental attribution error or correspondence bias) affected participants' investigative behavior as well as the effectiveness of a debiasing intervention. Undergraduates and professional investigators engaged in a simulated industrial investigation exercise. We found that participants' judgments were biased by knowledge about the safety history of either a worker or piece of equipment and that a human bias was evident in participants' decision making. However, bias was successfully reduced with "tunnel vision education." Professional investigators demonstrated a greater sophistication in their investigative decision making compared to undergraduates. The similarities and differences between these two populations are discussed. (c) 2013 APA, all rights reserved

  10. Almost half of the Danish general practitioners have negative a priori attitudes towards a mandatory accreditation programme

    DEFF Research Database (Denmark)

    Waldorff, Frans Boch; Nicolaisdottir, Dagny Ros; Kousgaard, Marius Brostrøm

    2016-01-01

    INTRODUCTION: The objective of this study was to analyse Danish general practitioners' (GPs) a priori attitudes and expectations towards a nationwide mandatory accreditation programme. METHODS: This study is based on a nationwide electronic survey comprising all Danish GPs (n = 3,403). RESULTS...... accreditation. FUNDING: The three Research Units for General Practice in Odense, Aarhus and Copenhagen initiated and funded this study. TRIAL REGISTRATION: The survey was recommended by the Danish Multipractice Committee (MPU 02-2015) and evaluated by the Danish Data Agency (2015-41-3684)....

  11. Harmonising Reference Intervals for Three Calculated Parameters used in Clinical Chemistry.

    Science.gov (United States)

    Hughes, David; Koerbin, Gus; Potter, Julia M; Glasgow, Nicholas; West, Nic; Abhayaratna, Walter P; Cavanaugh, Juleen; Armbruster, David; Hickman, Peter E

    2016-08-01

    For more than a decade there has been a global effort to harmonise all phases of the testing process, with particular emphasis on the most frequently utilised measurands. In addition, it is recognised that calculated parameters derived from these measurands should also be a target for harmonisation. Using data from the Aussie Normals study we report reference intervals for three calculated parameters: serum osmolality, serum anion gap and albumin-adjusted serum calcium. The Aussie Normals study was an a priori study that analysed samples from 1856 healthy volunteers. The nine analytes used for the calculations in this study were measured on Abbott Architect analysers. The data demonstrated normal (Gaussian) distributions for the albumin-adjusted serum calcium, the anion gap (using potassium in the calculation) and the calculated serum osmolality (using both the Bhagat et al. and Smithline and Gardner formulae). To assess the suitability of these reference intervals for use as harmonised reference intervals, we reviewed data from the Royal College of Pathologists of Australasia/Australasian Association of Clinical Biochemists (RCPA/AACB) bias survey. We conclude that the reference intervals for the calculated serum osmolality (using the Smithline and Gardner formula) may be suitable for use as a common reference interval. Although a common reference interval for albumin-adjusted serum calcium may be possible, further investigations (including a greater range of albumin concentrations) are needed. This is due to the bias between the Bromocresol Green (BCG) and Bromocresol Purple (BCP) methods at lower serum albumin concentrations. Problems with the measurement of total CO2 in the bias survey meant that we could not use the data for assessing the suitability of a common reference interval for the anion gap. Further study is required.
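    For orientation, the three calculated parameters named above are commonly computed from the measured analytes along the following lines. These are widely quoted textbook forms (anion gap including potassium, a Payne-type albumin correction, and a Smithline/Gardner-style osmolality); the exact formulas and coefficients used in the study are not reproduced here.

    ```python
    # Common published forms of the three calculated parameters, for illustration.
    # Inputs in mmol/L except albumin in g/L.

    def anion_gap(na, k, cl, hco3):
        """Serum anion gap including potassium, mmol/L."""
        return (na + k) - (cl + hco3)

    def adjusted_calcium(ca, albumin, ref_albumin=40.0, factor=0.02):
        """Albumin-adjusted calcium (Payne-type correction), mmol/L."""
        return ca + factor * (ref_albumin - albumin)

    def calc_osmolality(na, glucose, urea):
        """Calculated osmolality, mmol/kg (2*Na + glucose + urea form)."""
        return 2.0 * na + glucose + urea

    print(anion_gap(140, 4.0, 104, 26))      # ~14 mmol/L
    print(adjusted_calcium(2.20, 32.0))      # ~2.36 mmol/L
    print(calc_osmolality(140, 5.0, 5.0))    # ~290 mmol/kg
    ```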

  12. Transformations between inertial and linearly accelerated frames of reference

    International Nuclear Information System (INIS)

    Ashworth, D.G.

    1983-01-01

    Transformation equations between inertial and linearly accelerated frames of reference are derived and these transformation equations are shown to be compatible, where applicable, with those of special relativity. The physical nature of an accelerated frame of reference is unambiguously defined by means of an equation which relates the velocity of all points within the accelerated frame of reference to measurements made in an inertial frame of reference. (author)

  13. Reference Architecture for Multi-Layer Software Defined Optical Data Center Networks

    Directory of Open Access Journals (Sweden)

    Casimer DeCusatis

    2015-09-01

    As cloud computing data centers grow larger and networking devices proliferate, many complex issues arise in the network management architecture. We propose a framework for multi-layer, multi-vendor optical network management using open standards-based software defined networking (SDN). Experimental results are demonstrated in a test bed consisting of three data centers interconnected by a 125 km metropolitan area network, running OpenStack with KVM and VMware components. Use cases include inter-data center connectivity via a packet-optical metropolitan area network, intra-data center connectivity using an optical mesh network, and SDN coordination of networking equipment within and between multiple data centers. We create and demonstrate original software to implement virtual network slicing and affinity policy-as-a-service offerings. Enhancements to synchronous storage backup, cloud exchanges, and Fibre Channel over Ethernet topologies are also discussed.

  14. River routing at the continental scale: use of globally-available data and an a priori method of parameter estimation

    Directory of Open Access Journals (Sweden)

    P. Naden

    1999-01-01

    Two applications of a river routing model based on the observed river network and a linearised solution to the convective-diffusion equation are presented. One is an off-line application to part of the Amazon basin (catchment area 2.15 M km2), using river network data from the Digital Chart of the World and GCM-generated runoff at a grid resolution of 2.5 degrees latitude and 3.75 degrees longitude. The other application is to the Arkansas (409,000 km2) and Red River (125,500 km2) basins as an integrated component of a macro-scale hydrological model, driven by observed meteorology and operating on a 17 km grid. This second application makes use of the US EPA reach data to construct the river network. In both cases, a method of computing parameter values a priori has been applied and shows some success, although some interpretation is required to derive 'correct' parameter values and further work is needed to develop guidelines for use of the method. The applications, however, demonstrate the possibilities for applying the routing model at the continental scale, with globally-available data and a priori parameter estimation, and its value for validating GCM output against observed flows.
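    For reference, the linearised convective-diffusion (advection-diffusion) equation underlying routing models of this type can be written in its generic textbook form as below; this is context only, not a reproduction of the authors' exact formulation.

    ```latex
    \frac{\partial Q}{\partial t} \;+\; c\,\frac{\partial Q}{\partial x}
      \;=\; D\,\frac{\partial^{2} Q}{\partial x^{2}},
    ```

    where Q(x,t) is the routed discharge, c the wave celerity and D the diffusion (attenuation) coefficient; c and D are the two parameters that an a priori estimation method of this kind must supply.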

  15. The combined effects of self-referent information processing and ruminative responses on adolescent depression.

    Science.gov (United States)

    Black, Stephanie Winkeljohn; Pössel, Patrick

    2013-08-01

    Adolescents who develop depression have worse interpersonal and affective experiences and are more likely to develop substance problems and/or suicidal ideation compared to adolescents who do not develop depression. This study examined the combined effects of negative self-referent information processing and rumination (i.e., brooding and reflection) on adolescent depressive symptoms. It was hypothesized that the interaction of negative self-referent information processing and brooding would significantly predict depressive symptoms, while the interaction of negative self-referent information processing and reflection would not predict depressive symptoms. Adolescents (n = 92; 13-15 years; 34.7% female) participated in a 6-month longitudinal study. Self-report instruments measured depressive symptoms and rumination; a cognitive task measured information processing. Data were analyzed with path modelling in Amos 19.0. The interaction of negative information processing and brooding significantly predicted an increase in depressive symptoms 6 months later. The interaction of negative information processing and reflection did not significantly predict depression; however, the model did not meet the a priori standards required to accept the null hypothesis. Results suggest clinicians working with adolescents at risk for depression should consider focusing on the reduction of brooding and negative information processing to reduce long-term depressive symptoms.

  16. Defining care products to finance health care in the Netherlands.

    Science.gov (United States)

    Westerdijk, Machiel; Zuurbier, Joost; Ludwig, Martijn; Prins, Sarah

    2012-04-01

    A case-mix project started in the Netherlands with the primary goal of defining a complete set of health care products for hospitals. The definition of the product structure was completed 4 years later. The results are currently being used for billing purposes. This paper focuses on the methodology and techniques that were developed and applied in order to define the case-mix product structure. The central research question was how to develop a manageable product structure, i.e., a limited set of hospital products, with acceptable cost homogeneity. For this purpose, a data warehouse with approximately 1.5 million patient records from 27 hospitals was built up over a period of 3 years. The data associated with each patient consist of a large number of a priori independent parameters describing the resource utilization in different stages of the treatment process, e.g., activities in the operating theatre, the lab and the radiology department. Because of the complexity of the database, it was necessary to apply advanced data analysis techniques. The full analysis process that starts from the database and ends up with a product definition consists of four basic analysis steps. Each of these steps has revealed interesting insights. This paper describes each step in some detail and presents the major results of each step. The result consists of 687 product groups for 24 medical specialties used for billing purposes.

  17. Multi-edge X-ray absorption spectroscopy study of road dust samples from a traffic area of Venice using stoichiometric and environmental references

    Science.gov (United States)

    Valotto, Gabrio; Cattaruzza, Elti; Bardelli, Fabrizio

    2017-02-01

    The appropriate selection of representative pure compounds to be used as references is a crucial step for successful analysis of X-ray absorption near edge spectroscopy (XANES) data, and it is often not a trivial task. This is particularly true when complex environmental matrices are investigated, since their elemental speciation is a priori unknown. In this paper, an investigation of the speciation of Cu, Zn, and Sb based on the use of conventional (stoichiometric compounds) and non-conventional (environmental samples or relevant certified materials) references is presented. This method can be useful when the effectiveness of XANES analysis is limited by the difficulty in obtaining a set of references sufficiently representative of the investigated samples. Road dust samples collected along the bridge connecting Venice to the mainland were used to show the potential and the limits of this approach.

  18. Practical Considerations about Expected A Posteriori Estimation in Adaptive Testing: Adaptive A Priori, Adaptive Correction for Bias, and Adaptive Integration Interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…

  19. Defining Leadership as Process Reference Model: Translating Organizational Goals into Practice Using a Structured Leadership Approach

    OpenAIRE

    Tuffley , David

    2010-01-01

    Effective leadership in organisations is important to the achievement of organizational objectives. Yet leadership is widely seen as a quality that individuals innately possess, and which cannot be learned. This paper makes two assertions: (a) that leadership is a skill that not only can be learned, but which can be formalized into a Process Reference Model that is intelligible from an Enterprise Architecture perspective, and (b) that Process Reference Models in the st...

  20. Geography of resonances and Arnold diffusion in a priori unstable Hamiltonian systems

    International Nuclear Information System (INIS)

    Delshams, Amadeu; Huguet, Gemma

    2009-01-01

    In this paper we consider the case of a general C^{r+2} perturbation, for r large enough, of an a priori unstable Hamiltonian system of 2 + 1/2 degrees of freedom, and we provide explicit conditions on it, which turn out to be C^2 generic and are verifiable in concrete examples, which guarantee the existence of Arnold diffusion. This is a generalization of the result in Delshams et al (2006 Mem. Am. Math. Soc.) where the case of a perturbation with a finite number of harmonics in the angular variables was considered. The method of proof is based on a careful analysis of the geography of resonances created by a generic perturbation and it contains a deep quantitative description of the invariant objects generated by the resonances therein. The scattering map is used as an essential tool to construct transition chains of objects of different topology. The combination of quantitative expressions for both the geography of resonances and the scattering map provides, in a natural way, explicit computable conditions for instability

  1. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    International Nuclear Information System (INIS)

    Benkirane, A.; Auger, G.; Chbihi, A.; Bloyet, D.; Plagnol, E.

    1994-01-01

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from underlying physics (and adapted to image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performances, even in very noisy cases. Satisfactory classification results are obtained in cases where more ''classical'' automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append

  2. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Benkirane, A; Auger, G; Chbihi, A [Grand Accelerateur National d'Ions Lourds (GANIL), 14 - Caen (France); Bloyet, D [Caen Univ., 14 (France); Plagnol, E [Paris-11 Univ., 91 - Orsay (France). Inst. de Physique Nucleaire

    1994-12-31

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from underlying physics (and adapted to image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performances, even in very noisy cases. Satisfactory classification results are obtained in cases where more 'classical' automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append.

  3. Reference Japanese man

    International Nuclear Information System (INIS)

    Tanaka, Giichiro

    1985-01-01

    To make dose assessment realistic and accurate, it is necessary to provide a 'Reference Japanese Man' based on anatomical, physiological and biochemical data of Japanese people, instead of the Reference Man presented in ICRP Publications 23 and 30. This review describes the present status of research aimed at establishing the Reference Japanese Man. The Reference Japanese Man is defined as a male or female adult who lives in Japan with a Japanese lifestyle and food customs. His stature, body weight and other data were set as the mean values for the male or female population of Japan. As for food customs, Japanese people consume significantly smaller amounts of meat and milk products than Western people, and larger amounts of cereals and marine products such as fish and seaweed. Organ weight is a principal factor in internal dose assessment; mean values for living Japanese adults have been investigated, and the values usable for dose assessment of organs and tissues are shown. When employing these values of the Reference Japanese Man, age should be taken into account. Metabolic parameters should also be considered; iodine metabolism in Japanese people is quite different from that of Western people. The above-mentioned data are now tentatively being employed in modifying the tables of the MIRD method and others. (Takagi, S.)

  4. Reference Values for Plasma Electrolytes and Urea in Nigerian ...

    African Journals Online (AJOL)

    Reference values for plasma electrolytes and urea have been defined for Nigerian children and adolescents residing in Abeokuta and its environs, a location in southern Nigeria, by estimating plasma sodium, potassium bicarbonate and urea concentrations in a reference population. The study group comprised three ...

  5. 45 CFR 506.10 - “Vietnam conflict” defined.

    Science.gov (United States)

    2010-10-01

    ... § 506.10 “Vietnam conflict” defined. Vietnam conflict refers to the period beginning February 28, 1961... “Vietnam conflict” for purposes of payment of interest on missing military service members' deposits in the... ending date for the Vietnam conflict for purposes of determining eligibility for compensation under 50 U...

  6. Student Teacher Letters of Reference: A Critical Analysis

    Science.gov (United States)

    Mason, Richard W.; Schroeder, Mark P.

    2012-01-01

    Letters of reference are commonly used in acquiring a job in education. Despite serious issues of validity and reliability in writing and evaluating letters, there is a dearth of research that systematically examines the evaluation process and defines the constructs that define high quality letters. The current study used NVivo to examine 160…

  7. A Fiducial Reference Site for Satellite Altimetry in Crete, Greece

    Science.gov (United States)

    Mertikas, Stelios; Donlon, Craig; Mavrocordatos, Constantin; Bojkov, Bojan; Femenias, Pierre; Parrinello, Tommaso; Picot, Nicolas; Desjonqueres, Jean-Damien; Andersen, Ole Baltazar

    2016-08-01

    With the advent of diverse satellite altimeters and varying measuring techniques, it has become accepted in the scientific community that an absolute reference Cal/Val site should be regularly maintained to define, monitor and control the responses of any altimetric system. This work sets the ground for the establishment of a Fiducial Reference Site for ESA satellite altimetry in Gavdos and West Crete, Greece. The site will consistently and reliably determine (a) absolute altimeter biases and their drifts and (b) relative biases among diverse missions, and will also (c) continuously and independently connect different missions to a common and reliable reference and to SI-traceable measurements. Results from this fiducial reference site will be based on historic Cal/Val site measurement records and will be the yardstick for building up capacity for monitoring climate change. This will be achieved by referencing satellite altimeter measurements against known, controlled and absolute reference signals using different techniques, processes and instrumentation.

  8. GNSS Precise Kinematic Positioning for Multiple Kinematic Stations Based on A Priori Distance Constraints

    Science.gov (United States)

    He, Kaifei; Xu, Tianhe; Förste, Christoph; Petrovic, Svetozar; Barthelmes, Franz; Jiang, Nan; Flechtner, Frank

    2016-01-01

    When applying the Global Navigation Satellite System (GNSS) for precise kinematic positioning in airborne and shipborne gravimetry, multiple GNSS receiving equipment is often fixed mounted on the kinematic platform carrying the gravimetry instrumentation. Thus, the distances among these GNSS antennas are known and invariant. This information can be used to improve the accuracy and reliability of the state estimates. For this purpose, the known distances between the antennas are applied as a priori constraints within the state parameters adjustment. These constraints are introduced in such a way that their accuracy is taken into account. To test this approach, GNSS data of a Baltic Sea shipborne gravimetric campaign have been used. The results of our study show that an application of distance constraints improves the accuracy of the GNSS kinematic positioning, for example, by about 4 mm for the radial component. PMID:27043580
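    A minimal sketch of how a known inter-antenna distance can enter a least-squares position adjustment as a weighted pseudo-observation is given below; the geometry, weights and numbers are illustrative assumptions, not the authors' actual processing scheme.

    ```python
    # Two antennas with approximate coordinates x1, x2 (metres). The known,
    # fixed baseline length d0 is added as an extra linearised observation
    #   u . dx1 - u . dx2 = d0 - |x1 - x2|,   u = (x1 - x2)/|x1 - x2|,
    # with a weight reflecting how accurately d0 is known.
    import numpy as np

    x1 = np.array([0.00, 0.00, 0.00])        # approximate antenna positions
    x2 = np.array([4.98, 0.00, 0.02])
    d0, sigma_d = 5.000, 0.001               # known distance (m) and its std. dev.

    diff = x1 - x2
    dist = np.linalg.norm(diff)
    u = diff / dist                          # unit vector along the baseline

    # Design row for parameters [dx1, dy1, dz1, dx2, dy2, dz2]
    A_con = np.hstack([u, -u]).reshape(1, 6)
    w_con = np.array([d0 - dist])            # misclosure of the constraint
    P_con = np.array([[1.0 / sigma_d**2]])   # weight of the pseudo-observation

    # Contribution of the constraint to the normal equations N x = b, to be
    # accumulated together with the ordinary GNSS observation equations.
    N = A_con.T @ P_con @ A_con
    b = A_con.T @ P_con @ w_con
    print(N.round(2))
    print(b.round(4))
    ```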

  9. A reference model for space data system interconnection services

    Science.gov (United States)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  10. Optimization of Decision-Making for Spatial Sampling in the North China Plain, Based on Remote-Sensing a Priori Knowledge

    Science.gov (United States)

    Feng, J.; Bai, L.; Liu, S.; Su, X.; Hu, H.

    2012-07-01

    In this paper, MODIS remote sensing data, characterized by low cost, high timeliness and moderate/low spatial resolution, were first used over the North China Plain (NCP) study region to carry out mixed-pixel spectral decomposition and extract a useful regionalized indicator parameter (RIP), namely the fraction/percentage of winter wheat planting area in each pixel, which served as the regionalized indicator variable (RIV) for spatial sampling. The RIV values were then analyzed spatially to obtain the spatial structure characteristics (spatial correlation and variation) of the NCP, which were further processed into scale-fitting, valid a priori knowledge for spatial sampling. Subsequently, based on the idea of rationally integrating probability-based and model-based sampling techniques and effectively utilizing the obtained a priori knowledge, spatial sampling models and design schemes, together with their optimization and optimal selection, were developed, providing a scientific basis for improving and optimizing existing spatial sampling schemes for large-scale cropland remote sensing monitoring. Additionally, through an adaptive analysis and decision strategy, optimal local spatial prediction and a gridded system of extrapolation results were used to implement an adaptive reporting pattern of spatial sampling in accordance with the reporting units, in order to satisfy the actual needs of sampling surveys.

  11. Defining reference sequences for Nocardia species by similarity and clustering analyses of 16S rRNA gene sequence data.

    Directory of Open Access Journals (Sweden)

    Manal Helal

    BACKGROUND: The intra- and inter-species genetic diversity of bacteria and the absence of 'reference', or most representative, sequences of individual species present a significant challenge for sequence-based identification. The aims of this study were to determine the utility, and compare the performance, of several clustering and classification algorithms to identify the species of 364 sequences of the 16S rRNA gene with a defined species in GenBank, and 110 sequences of the 16S rRNA gene with no defined species, all within the genus Nocardia. METHODS: A total of 364 16S rRNA gene sequences of Nocardia species were studied. In addition, 110 16S rRNA gene sequences assigned only to the Nocardia genus level at the time of submission to GenBank were used for machine learning classification experiments. Different clustering algorithms were compared with a novel algorithm, the linear mapping (LM) of the distance matrix. Principal Components Analysis was used for dimensionality reduction and visualization. RESULTS: The LM algorithm achieved the highest performance and classified the set of 364 16S rRNA sequences into 80 clusters, the majority of which (83.52%) corresponded with the original species. The most representative 16S rRNA sequences for individual Nocardia species were identified as 'centroids' of their respective clusters, from which the distances to all other sequences were minimized; the 110 16S rRNA gene sequences with identifications recorded only at the genus level were classified using machine learning methods. Simple kNN machine learning demonstrated the highest performance and classified Nocardia species sequences with an accuracy of 92.7% and a mean frequency of 0.578. CONCLUSION: The identification of centroids of 16S rRNA gene sequence clusters using novel distance matrix clustering enables the identification of the most representative sequences for each individual species of Nocardia and allows the quantitation of inter- and intra
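    The 'centroid' notion used above (the cluster member whose summed distance to all other members is smallest) can be sketched directly from a pairwise distance matrix; the distances and cluster labels below are toy values, not study data.

    ```python
    # Pick a centroid sequence for each cluster from a pairwise distance matrix.
    import numpy as np

    D = np.array([[0.0, 0.1, 0.2, 0.8],
                  [0.1, 0.0, 0.15, 0.9],
                  [0.2, 0.15, 0.0, 0.85],
                  [0.8, 0.9, 0.85, 0.0]])   # pairwise sequence distances
    labels = np.array([0, 0, 0, 1])          # cluster assignment of each sequence

    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        sub = D[np.ix_(idx, idx)]
        centroid = idx[np.argmin(sub.sum(axis=1))]
        print(f"cluster {c}: centroid sequence index {centroid}")
    ```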

  12. Quantum bit commitment with misaligned reference frames

    International Nuclear Information System (INIS)

    Harrow, Aram; Oliveira, Roberto; Terhal, Barbara M.

    2006-01-01

    Suppose that Alice and Bob define their coordinate axes differently, and the change of reference frame between them is given by a probability distribution μ over SO(3). We show that this uncertainty of reference frame is of no use for bit commitment when μ is uniformly distributed over a (sub)group of SO(3), but other choices of μ can give rise to a partially or even arbitrarily secure bit commitment

  13. OWL references in ORM conceptual modelling

    Science.gov (United States)

    Matula, Jiri; Belunek, Roman; Hunka, Frantisek

    2017-07-01

    Object Role Modelling (ORM) is a fact-based approach to conceptual modelling. The aim of the paper is to emphasize its close connection to OWL documents and the possibility of their mutual cooperation. The definition of entities or domain values is an indispensable part of the conceptual schema design procedure defined by the ORM methodology. Many of these entities are already defined in OWL documents. Therefore, it is not necessary to declare such entities again; instead, references to OWL documents can be utilized when modelling information systems.

  14. Defining a Bobath clinical framework - A modified e-Delphi study.

    Science.gov (United States)

    Vaughan-Graham, Julie; Cott, Cheryl

    2016-11-01

    To gain consensus within the expert International Bobath Instructors Training Association (IBITA) on a Bobath clinical framework on which future efficacy studies can be based. A three-round modified e-Delphi approach was used with 204 full members of the IBITA. Twenty-one initial statements were generated from the literature. Consensus was defined a priori as at least 80% of the respondents with a level of agreement on a Likert scale of 4 or 5. The Delphi questionnaire for each round was available online for two weeks. Summary reports and subsequent questionnaires were posted within four weeks. Ninety-four IBITA members responded, forming the Delphi panel, of which 68 and 66 responded to Rounds Two and Three, respectively. The 21 initial statements were revised to 17 statements and five new statements in Round Two in which eight statements were accepted and two statements were eliminated. Round Three presented 12 revised statements, all reaching consensus. The Delphi was successful in gaining consensus on a Bobath clinical framework in a geographically diverse expert association, identifying the unique components of Bobath clinical practice. Discussion throughout all three Rounds revolved primarily around the terminology of atypical and compensatory motor behavior and balance.

  15. Referent 3D tumor model at cellular level in radionuclide therapy

    International Nuclear Information System (INIS)

    Spaic, R.; Ilic, R.D.; Petrovic, B.J.

    2002-01-01

    Aim: Conventional internal dosimetry has many limitations because of tumor dose nonuniformity. The best approach for calculating absorbed dose at the cellular level for different tumors in radionuclide therapy is the Monte Carlo method. The purpose of this study is to introduce a referent 3D tumor model at the cellular level for Monte Carlo simulation studies in radionuclide therapy. Material and Methods: The referent 3D tumor model at the cellular level was defined for the moment when the tumor becomes detectable and therapy can start. In accordance with the tumor growth rate at that moment, it was modelled as a sphere of radius 10,000 μm. Within that tumor, cells or clusters of cells are randomly distributed spheres. The distribution of cells/clusters of cells can be calculated from histology data, but here it was assumed to be normal with mean and standard deviation of 100±50 μm. The second parameter selected to define the referent tumor is the volume density of cells (30%). The referent tumor contains no necroses. The stroma is defined as the space between the spheres, with the same material concentration as in the spheres. Results: The referent tumor defined in this way contains about 2.2x10^5 randomly distributed cells or clusters of cells. Using this referent 3D tumor model, and for a given concentration of radionuclides (1:100) and beta-emitter energy (1000 keV) homogeneously distributed in labeled cells, the absorbed dose for all cells was calculated. Simulations were performed using the FOTELP Monte Carlo code, modified for this purpose. Results for the absorbed dose in cells are given as numerical values (1D distribution) and as images (2D or 3D distributions). Conclusion: The geometrical module for Monte Carlo simulation studies can be standardized by introducing a referent 3D tumor model at the cellular level. This referent 3D tumor model gives the most realistic representation of different tumors at the moment of their detectability. Referent 3D tumor model at

  16. Second reference calculation for the WIPP

    International Nuclear Information System (INIS)

    Branstetter, L.J.

    1985-03-01

    Results of the second reference calculation for the Waste Isolation Pilot Plant (WIPP) project using the dynamic relaxation finite element code SANCHO are presented. This reference calculation is intended to predict the response of a typical panel of excavated rooms designed for storage of nonheat-producing nuclear waste. Results are presented that include relevant deformations, relative clay seam displacements, and stress and strain profiles. This calculation is a particular solution obtained by a computer code, which has proven analytic capabilities when compared with other structural finite element codes. It is hoped that the results presented here will be useful in providing scoping values for defining experiments and for developing instrumentation. It is also hoped that the calculation will be useful as part of an exercise in developing a methodology for performing important design calculations by more than one analyst using more than one computer code, and for defining internal Quality Assurance (QA) procedures for such calculations. 27 refs., 15 figs

  17. ASTM reference radiologic digital image standards

    International Nuclear Information System (INIS)

    Wysnewski, R.; Wysnewski, D.

    1996-01-01

    ASTM Reference Radiographs have been essential in defining industry's material defect grade levels for many years. ASTM Reference Radiographs are used extensively; even the American Society for Metals Nondestructive Inspection and Quality Control Metals Handbook, Volume 11, eighth edition, refers to ASTM Standard Reference Radiographs. The recently published E 1648 Standard Reference Radiographs for Examination of Aluminum Fusion Welds is a prime example of the on-going need for these references. To date, 14 Standard Reference Radiographs have been published to characterize material defects. Standard Reference Radiographs do not adequately address film-less radiologic methods. There are differences in media to contend with. On a computer CRT, defect indications appear differently when compared to indications viewed in a radiograph on a view box. Industry that uses non-film radiologic methods of inspection can be burdened with additional time and money developing internal standard reference radiologic images. These references may be deemed necessary for grading levels of product defects. Because there are no ASTM Standard Reference Radiologic data files addressing this need in the industry, the authors of this paper suggested implementing a method for their creation under ASTM supervision. ASTM can assure continuity to those users making the transition from analog radiographic images to digital image data by swiftly addressing the requirements for reference digital image standards. The current status and possible future activities regarding a method to create digital data files are presented in this paper summary

  18. On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori

    International Nuclear Information System (INIS)

    Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer

    2006-01-01

    Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in bias in predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for the Nevada and Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance, as summarized below. The posterior quantities incorporate both parametric and conceptual model uncertainties
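    The model-averaging bookkeeping implied here can be written in the generic MLBMA form below (with KIC-based weights); this is the standard textbook expression, not a transcription of the study's own equations, and h stands for the predicted quantity (e.g., head).

    ```latex
    p(M_k \mid D) \;=\;
      \frac{\exp\!\big(-\tfrac12\,\Delta\mathrm{KIC}_k\big)\,p(M_k)}
           {\sum_j \exp\!\big(-\tfrac12\,\Delta\mathrm{KIC}_j\big)\,p(M_j)},
    \qquad
    E[h \mid D] \;=\; \sum_k p(M_k \mid D)\,E[h \mid M_k, D],
    ```
    ```latex
    \mathrm{Var}[h \mid D] \;=\; \sum_k p(M_k \mid D)\Big(\mathrm{Var}[h \mid M_k, D]
      + \big(E[h \mid M_k, D] - E[h \mid D]\big)^{2}\Big).
    ```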

  19. Instrumentation reference book

    CERN Document Server

    Boyes, Walt

    2002-01-01

    Instrumentation is not a clearly defined subject, having a 'fuzzy' boundary with a number of other disciplines. Often categorized as either 'techniques' or 'applications' this book addresses the various applications that may be needed with reference to the practical techniques that are available for the instrumentation or measurement of a specific physical quantity or quality. This makes it of direct interest to anyone working in the process, control and instrumentation fields where these measurements are essential.* Comprehensive and authoritative collection of technical information* Writte

  20. Effect of reference loads on fracture mechanics analysis of surface cracked pipe based on reference stress method

    International Nuclear Information System (INIS)

    Shim, Do Jun; Son, Beom Goo; Kim, Young Jin; Kim, Yun Jae

    2004-01-01

    To investigate the relevance of the definition of the reference stress for estimating J and C * in surface crack problems, this paper compares FE J and C * results for surface cracked pipes with those estimated according to the reference stress approach using various definitions of the reference stress. Pipes with a part-circumferential inner surface crack and a finite internal axial crack are considered, subject to internal pressure and global bending. The crack depth and aspect ratio are systematically varied. The reference stress is defined in four different ways using (I) the local limit load, (II) the global limit load, (III) the global limit load determined from the FE limit analysis, and (IV) the optimised reference load. It is found that the reference stress based on the local limit load gives overall excessively conservative estimates of J and C * . Use of the global limit load clearly reduces the conservatism, compared to that of the local limit load, although it can sometimes provide non-conservative estimates of J and C * . The use of the FE global limit load gives overall non-conservative estimates of J and C * . The reference stress based on the optimised reference load gives overall accurate estimates of J and C * , compared to other definitions of the reference stress. Based on the present findings, general guidance on the choice of the reference stress for surface crack problems is given
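    As generic background only (not the specific estimation scheme examined in the paper), the reference stress is commonly defined from a chosen limit load P_L, and the widely used Ainsworth-type estimate of the creep fracture parameter then follows from it:

    ```latex
    \sigma_{\mathrm{ref}} \;=\; \frac{P}{P_L}\,\sigma_y,
    \qquad
    C^{*} \;\approx\; \sigma_{\mathrm{ref}}\,\dot{\varepsilon}_{\mathrm{ref}}
      \left(\frac{K}{\sigma_{\mathrm{ref}}}\right)^{2},
    ```

    where P is the applied load, sigma_y the yield stress, the reference strain rate is the creep strain rate at sigma_ref, and K is the elastic stress intensity factor. Changing the limit load definition (local versus global) changes sigma_ref and hence the estimated J and C*.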

  1. Space-time reference with an optical link

    International Nuclear Information System (INIS)

    Berceau, P; Hollberg, L; Taylor, M; Kahn, J

    2016-01-01

    We describe a concept for realizing a high performance space-time reference using a stable atomic clock in a precisely defined orbit and synchronizing the orbiting clock to high-accuracy atomic clocks on the ground. The synchronization would be accomplished using a two-way lasercom link between ground and space. The basic approach is to take advantage of the highest-performance cold-atom atomic clocks at national standards laboratories on the ground and to transfer that performance to an orbiting clock that has good stability and that serves as a ‘frequency-flywheel’ over time-scales of a few hours. The two-way lasercom link would also provide precise range information and thus precise orbit determination. With a well-defined orbit and a synchronized clock, the satellite could serve as a high-accuracy space-time reference, providing precise time worldwide, a valuable reference frame for geodesy, and independent high-accuracy measurements of GNSS clocks. Under reasonable assumptions, a practical system would be able to deliver picosecond timing worldwide and millimeter orbit determination, and could serve as an enabling subsystem for other proposed space-gravity missions, which are briefly reviewed. (paper)

  2. SCGICAR: Spatial concatenation based group ICA with reference for fMRI data analysis.

    Science.gov (United States)

    Shi, Yuhu; Zeng, Weiming; Wang, Nizhuan

    2017-09-01

    With the rapid development of big data, multi-subject functional magnetic resonance imaging (fMRI) data analysis is becoming more and more important. As a kind of blind source separation technique, group independent component analysis (GICA) has been widely applied for multi-subject fMRI data analysis. However, spatially concatenated GICA is rarely used compared with temporally concatenated GICA due to its disadvantages. In this paper, in order to overcome these issues and considering that the ability of GICA for fMRI data analysis can be improved by adding a priori information, we propose a novel spatial concatenation based GICA with reference (SCGICAR) method that takes advantage of a priori information extracted from the group subjects; a multi-objective optimization strategy is then used to implement this method. Finally, principal component analysis and anti-reconstruction post-processing are used to obtain the group spatial component and the individual temporal components in the group, respectively. The experimental results show that the proposed SCGICAR method has a better performance on both single-subject and multi-subject fMRI data analysis compared with classical methods. It can not only detect more accurate spatial and temporal components for each subject of the group, but also obtain a better group component in both the temporal and spatial domains. These results demonstrate that the proposed SCGICAR method has its own advantages in comparison with classical methods, and it can better reflect the commonness of subjects in the group. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. REDEFINING ENSO EPISODES BASED ON CHANGED CLIMATE REFERENCES

    Institute of Scientific and Technical Information of China (English)

    LI Xiao-yan; ZHAI Pan-mao; REN Fu-min

    2005-01-01

    By studying changes in ENSO indices relative to the change of climate reference period from 1961-1990 to 1971-2000, the study generated new standards to define ENSO episodes and their intensities. According to the new climate references and new index standards, ENSO episodes and their intensities for the period 1951-2003 were then classified. Finally, an analysis has been performed comparing the new characteristics with the old ones for ENSO period, peak values and intensities.

  4. Definably compact groups definable in real closed fields. I

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We study definably compact definably connected groups definable in a sufficiently saturated real closed field $R$. We introduce the notion of group-generic point for $\bigvee$-definable groups and show the existence of group-generic points for definably compact groups definable in a sufficiently saturated o-minimal expansion of a real closed field. We use this notion along with some properties of generic sets to prove that for every definably compact definably connected group $G$ definable in...

  5. Description of a reference mixed oxide fuel fabrication plant (MOFFP)

    International Nuclear Information System (INIS)

    1978-01-01

    In order to evaluate the environmental impact of Mixed Oxide Fuel Fabrication Plants, work has been initiated to describe the general design and operating conditions of a reference Mixed Oxide Fuel Fabrication Plant (MOFFP) for the 1990 time frame. The various reference data and basic assumptions for the reference MOFFP plant have been defined after discussion with experts. The data reported in this document are made available only to allow an evaluation of the environmental impact of a reference MOFFP plant. These data should therefore not be used as recommendations, standards, regulatory guides or requirements

  6. On possible a-priori "imprinting" of General Relativity itself on the performed Lense-Thirring tests with LAGEOS satellites

    Science.gov (United States)

    Iorio, Lorenzo

    2010-02-01

    The impact of possible a-priori "imprinting" effects of general relativity itself on recent attempts to measure the general relativistic Lense-Thirring effect with the LAGEOS satellites orbiting the Earth and the terrestrial geopotential models from the dedicated mission GRACE is investigated. It is analytically shown that general relativity, not explicitly solved for in the GRACE-based models, may "imprint" their even zonal harmonic coefficients of low degrees J_l at a non-negligible level, given the present-day accuracy in recovering them. This translates into a bias of the LAGEOS-based relativistic tests as large as the Lense-Thirring effect itself. Further analyses should include general relativity itself in the GRACE data processing by explicitly solving for it.

  7. In quest of a systematic framework for unifying and defining nanoscience

    International Nuclear Information System (INIS)

    Tomalia, Donald A.

    2009-01-01

    This article proposes a systematic framework for unifying and defining nanoscience based on historic first principles and step logic that led to a 'central paradigm' (i.e., unifying framework) for traditional elemental/small-molecule chemistry. As such, a Nanomaterials classification roadmap is proposed, which divides all nanomatter into Category I: discrete, well-defined and Category II: statistical, undefined nanoparticles. We consider only Category I, well-defined nanoparticles which are >90% monodisperse as a function of Critical Nanoscale Design Parameters (CNDPs) defined according to: (a) size, (b) shape, (c) surface chemistry, (d) flexibility, and (e) elemental composition. Classified as either hard (H) (i.e., inorganic-based) or soft (S) (i.e., organic-based) categories, these nanoparticles were found to manifest pervasive atom mimicry features that included: (1) a dominance of zero-dimensional (0D) core-shell nanoarchitectures, (2) the ability to self-assemble or chemically bond as discrete, quantized nanounits, and (3) exhibited well-defined nanoscale valencies and stoichiometries reminiscent of atom-based elements. These discrete nanoparticle categories are referred to as hard or soft particle nanoelements. Many examples describing chemical bonding/assembly of these nanoelements have been reported in the literature. We refer to these hard:hard (H-n:H-n), soft:soft (S-n:S-n), or hard:soft (H-n:S-n) nanoelement combinations as nanocompounds. Due to their quantized features, many nanoelement and nanocompound categories are reported to exhibit well-defined nanoperiodic property patterns. These periodic property patterns are dependent on their quantized nanofeatures (CNDPs) and dramatically influence intrinsic physicochemical properties (i.e., melting points, reactivity/self-assembly, sterics, and nanoencapsulation), as well as important functional/performance properties (i.e., magnetic, photonic, electronic, and toxicologic properties). We propose this

  8. Defining Toll Fee of Wheeling Renewable with Reference to a Gas Pipeline in Indonesia

    Science.gov (United States)

    Hakim, Amrullah

    2017-07-01

    Indonesia has a huge number of renewable energy sources (RE); however, their utilization is currently very low. The main challenge of power production is its alignment with consumption levels; supply should equal demand at all times. There is a strong initiative from corporations with high energy demand, compared to other sectors, to apply a renewable portfolio standard for their energy input, e.g. 15% of their energy consumption requirement must come from a renewable energy source. To support this initiative, the utilization of power wheeling will help large factories on industrial estates to source firm and steady renewables from remote sites. Wheeling renewables via PLN's transmission line has been regulated under the Ministry Decree of 2015; however, the tariff or toll fee has not yet been defined. The potential project to apply wheeling renewables will obtain power supply from a geothermal power plant, with power demand from the scattered factories under one company. This is the concept driving the application of power wheeling in the effort to push the growth of renewable energy in Indonesia. Given that the capacity of PLN's transmission lines is normally large and less congested compared to distribution lines, wheeling renewables can accommodate the scattered factory locations, which then results in a cheaper toll fee. Defining the best toll fee is the main topic of this paper, with comparison to the toll fee of gas pipeline infrastructure in Indonesia, so that it can be applied massively to achieve COP21's commitment.

  9. Reference values for spirometry in preschool children.

    Science.gov (United States)

    Burity, Edjane F; Pereira, Carlos A C; Rizzo, José A; Brito, Murilo C A; Sarinho, Emanuel S C

    2013-01-01

    Reference values for lung function tests differ in samples from different countries, including values for preschoolers. The main objective of this study was to derive reference values in this population. A prospective study was conducted through a questionnaire applied to 425 preschool children aged 3 to 6 years, from schools and day-care centers in a metropolitan city in Brazil. Children were selected by simple random sampling from the aforementioned schools. Peak expiratory flow (PEF), forced vital capacity (FVC), forced expiratory volumes (FEV1, FEV0.50), forced expiratory flow (FEF25-75) and FEV1/FVC, FEV0.5/FVC and FEF25-75/FVC ratios were evaluated. Of the 425 children enrolled, 321 (75.6%) underwent the tests. Of these, 135 (42.0%) showed acceptable results with full expiratory curves and thus were included in the regression analysis to define the reference values. Height and gender significantly influenced FVC values through linear and logarithmic regression analysis. In males, R^2 increased with the logarithmic model for FVC and FEV1, but the linear model was retained for its simplicity. The lower limits were calculated from the fifth percentile of the residuals. Full expiratory curves are more difficult to obtain in preschoolers. In addition to height, gender also influences the measures of FVC and FEV1. Reference values were defined for spirometry in preschool children in this population, which are applicable to similar populations. Copyright © 2013 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
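    A minimal sketch of the "regression plus fifth-percentile-of-residuals" construction of a lower limit of normal (LLN) is shown below; the heights, FVC values and coefficients are simulated for illustration only, not the study's reference equations.

    ```python
    # Fit a linear reference equation FVC ~ height and take the 5th percentile
    # of the residuals as a constant offset defining the lower limit of normal.
    import numpy as np

    rng = np.random.default_rng(1)
    height = rng.uniform(95, 125, 200)                       # cm, preschool range
    fvc = 0.02 * height - 1.0 + rng.normal(0, 0.12, 200)     # litres, toy model

    slope, intercept = np.polyfit(height, fvc, 1)            # reference equation
    residuals = fvc - (slope * height + intercept)
    lln_offset = np.percentile(residuals, 5)                 # 5th percentile residual

    def predicted_fvc(h):
        return slope * h + intercept

    def lln(h):
        return predicted_fvc(h) + lln_offset

    print(f"FVC predicted at 110 cm: {predicted_fvc(110):.2f} L, LLN: {lln(110):.2f} L")
    ```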

  10. Different reference BMDs affect the prevalence of osteoporosis.

    Science.gov (United States)

    Jung, Ki Jin; Chung, Chin Youb; Park, Moon Seok; Kwon, Soon-Sun; Moon, Sang Young; Lee, In Hyeok; Kim, Ka Hyun; Lee, Kyoung Min

    2016-05-01

    The T score represents the degree of deviation from the peak bone mineral density (BMD) (reference standard) in a population. Little has been investigated concerning the age at which BMD reaches its peak value and how the reference standard BMD should be defined in terms of age ranges. BMDs of 9,800 participants were analyzed from the Korean National Health and Nutrition Examination Survey database. Five reference standards were defined: (1) the reference standard of Japanese young adults provided by the dual-energy X-ray absorptiometry machine manufacturer, (2) peak BMD of the Korean population evaluated by statistical analysis (second-order polynomial regression models), (3) BMD of subjects aged 20-29 years, (4) BMD of subjects aged 20-39 years, and (5) BMD of subjects aged 30-39 years. T scores from the five reference standards were calculated, and the prevalence of osteoporosis was evaluated and compared for males and females separately. The peak BMD in the polynomial regression model was achieved at 26 years in males and 36 years in females in the total hip, at 20 years in males and 27 years in females in the femoral neck, and at 20 years in males and 30 years in females in the lumbar spine. The prevalence of osteoporosis over the age of 50 years showed significant variation of up to twofold depending on the reference standard adopted. The age at which peak BMD was achieved varied according to gender and body site. A consistent definition of peak BMD needs to be established in terms of age ranges because this could affect the prevalence of osteoporosis and healthcare policies.
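    For context, the T score referred to above is conventionally computed against the chosen young-adult reference standard as:

    ```latex
    T \;=\; \frac{\mathrm{BMD}_{\mathrm{measured}} \;-\; \overline{\mathrm{BMD}}_{\mathrm{ref}}}
                 {\mathrm{SD}_{\mathrm{ref}}},
    ```

    so changing the age range that defines the reference (and hence its mean and SD) shifts every T score and, with it, the proportion of subjects falling below the osteoporosis threshold of T <= -2.5.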

  11. Mass of organs and composition of the body of Japanese Reference Man

    International Nuclear Information System (INIS)

    Tanaka, Gi-ichiro

    1990-01-01

    Reference Man as defined and described in ICRP Publication 23 is in the process of major revision, with an emphasis on age, sex, and the characteristics of non-European populations. Japanese Reference Man (or Woman) is defined as a normal, healthy subject, 20 to 30 years of age, who lives in Japan with a Japanese lifestyle and food customs. He (or she) is Mongoloid in race, 170 (or 160) cm in height, and 60 (or 51) kg in weight. Physical properties such as the masses and specific gravities of 114 organs, tissues and components of the Japanese Reference Male are given. Body composition (body fat, LBM, skeleton, soft lean body mass (SLBM), body water, blood, muscle, ash, protein and specific gravity) is also given, as well as body surface area. These data are primarily based on data obtained for normal Japanese and, where data were unavailable, they were derived from ICRP Reference Man data using the new concept of SLBM. Red bone marrow was estimated to be 1,000 g, as compared to 1,500 g in Reference Man. Body fat was obtained by using Nagamine's equations, which showed a recent slight tendency towards obesity. In conclusion, the present data for Japanese Reference Man can be used in designing appropriate phantoms, mathematical and real, for Japanese subjects. Japanese Reference Man will also provide a basis for an Asian Reference Man, which, in principle, should be consistent with ICRP concepts of Reference Man. (author)

  12. International Geomagnetic Reference Field

    DEFF Research Database (Denmark)

    Finlay, Chris; Maus, S.; Beggan, C. D.

    2010-01-01

    The eleventh generation of the International Geomagnetic Reference Field (IGRF) was adopted in December 2009 by the International Association of Geomagnetism and Aeronomy Working Group V-MOD. It updates the previous IGRF generation with a definitive main field model for epoch 2005.0, a main field model for epoch 2010.0, and a linear predictive secular variation model for 2010.0-2015.0. In this note the equations defining the IGRF model are provided along with the spherical harmonic coefficients for the eleventh generation. Maps of the magnetic declination, inclination and total intensity...
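    For reference, the IGRF main field is the negative gradient of a scalar potential expanded in spherical harmonics; the standard published form of that expansion is quoted below for context (with a = 6371.2 km the geomagnetic reference radius).

    ```latex
    V(r,\theta,\phi,t) \;=\; a \sum_{n=1}^{N} \sum_{m=0}^{n}
      \left(\frac{a}{r}\right)^{n+1}
      \left[\, g_n^m(t)\cos m\phi + h_n^m(t)\sin m\phi \,\right]
      P_n^m(\cos\theta),
    \qquad \mathbf{B} = -\nabla V,
    ```

    where g_n^m and h_n^m are the Schmidt semi-normalized Gauss coefficients tabulated by the IGRF and P_n^m are the associated Legendre functions.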

  13. Definably compact groups definable in real closed fields.II

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We continue the analysis of definably compact groups definable in a real closed field $\mathcal{R}$. In [3], we proved that for every definably compact definably connected semialgebraic group $G$ over $\mathcal{R}$ there are a connected $R$-algebraic group $H$ and a definable injective map $\phi$ from a generic definable neighborhood of the identity of $G$ into the group $H\left(R\right)$ of $R$-points of $H$ such that $\phi$ acts as a group homomorphism inside its domain. The above result and o...

  14. On the Implications of A Priori Constraints in Transdimensional Bayesian Inversion for Continental Lithospheric Layering

    Science.gov (United States)

    Roy, C.; Romanowicz, B. A.

    2017-12-01

    Monte Carlo methods are powerful approaches for solving nonlinear problems and are becoming very popular in Earth sciences. One reason is that, at first glance, no constraints or explicit regularization of model parameters are required. At second glance, one realizes that regularization is done through a prior. The choice of this prior, however, is subjective, and with it, unintended or undesired extra information can be injected into the problem. The principal criticism of Bayesian methods is that the prior can be "tuned" in order to get the expected solution. Consequently, detractors of the Bayesian method could easily argue that the solution is influenced by the form of the prior distribution, whose choice is subjective. Hence, models obtained with Monte Carlo methods are still highly debated. Here we investigate the influence of a priori constraints (i.e., fixed crustal discontinuities) on the posterior probability distributions of estimated parameters, that is, vertically polarized shear velocity VSV and radial anisotropy ξ, in a transdimensional Bayesian inversion for continental lithospheric structure. We build on the work of Calò et al. (2016), who jointly inverted converted phases (P to S) without deconvolution and surface wave dispersion data to obtain 1-D radial anisotropic shear wave velocity profiles in the North American craton. We aim to verify whether the strong lithospheric layering found in the stable part of the craton is robust with respect to artifacts that might be caused by the methodology used. We test the hypothesis that the observed midlithospheric discontinuities result from (1) fixed crustal discontinuities in the reference model and (2) a fixed Vp/Vs ratio. The synthetic tests on two Earth models show that a fixed Vp/Vs ratio does not introduce artificial layering, even if the assumed value is slightly wrong. This is an important finding for real data inversion where the true value is not always available or accurate

  15. Cure of tuberculosis despite serum concentrations of antituberculosis drugs below published reference ranges.

    Science.gov (United States)

    Meloni, Monica; Corti, Natascia; Müller, Daniel; Henning, Lars; Gutteck, Ursula; von Braun, Amrei; Weber, Rainer; Fehr, Jan

    2015-01-01

    Therapeutic target serum concentrations of first-line antituberculosis drugs have not been well defined in clinical studies in tuberculosis (TB) patients. We retrospectively investigated the estimated maximum serum concentrations (eC max) of antituberculosis drugs and the clinical outcome of TB patients with therapeutic drug monitoring performed between 2010 and 2012 at our institution, with follow-up until March 2014. The eC max was defined as the highest serum concentration during a sampling period (2, 4 and 6 hours after drug ingestion). We compared the results with published eC max values, and categorised them as either "within reference range", "low eC max", or "very low eC max". Low and very low eC max levels were defined by drug-specific cut-offs below the published reference ranges (e.g., for isoniazid); measurements below these cut-offs were classified as "low" or "very low". The eC max was below the relevant reference range in 80% of isoniazid, 95% of rifampicin, 30% of pyrazinamide, and 30% of ethambutol measurements. All but one patient were cured of tuberculosis. Although many antituberculosis drug serum concentrations were below the widely used reference ranges, 16 of 17 patients were cured of tuberculosis. These results challenge the use of the published reference ranges for therapeutic drug monitoring.
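
    As an illustration only of the categorisation step described in this record, a minimal sketch follows; the drug-specific cut-offs are not reproduced here, so the threshold values are placeholders supplied by the caller, not the study's values.

```python
def categorise_ecmax(ecmax, low_limit, very_low_limit):
    """Classify an estimated maximum serum concentration (eC max) against
    a published reference range. `low_limit` is the lower bound of the
    reference range and `very_low_limit` is the drug-specific cut-off below
    which a value counts as 'very low' (both supplied by the caller)."""
    if ecmax >= low_limit:
        return "within reference range"
    if ecmax >= very_low_limit:
        return "low eC max"
    return "very low eC max"

# Hypothetical usage with placeholder limits (not the study's values):
print(categorise_ecmax(1.5, low_limit=3.0, very_low_limit=2.0))  # -> "very low eC max"
```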

  16. Assessment of brain reference genes for RT-qPCR studies in neurodegenerative diseases

    DEFF Research Database (Denmark)

    Rydbirk, Rasmus; Folke, Jonas; Winge, Kristian

    2016-01-01

    Evaluation of gene expression levels by reverse transcription quantitative real-time PCR (RT-qPCR) has for many years been the favourite approach for discovering disease-associated alterations. Normalization of results to stably expressed reference genes (RGs) is pivotal to obtain reliable results......, and Progressive Supranuclear Palsy patients. Using RefFinder, a web-based tool for evaluating RG stability, we identified the most stable RGs to be UBE2D2, CYC1, and RPL13 which we recommend for future RT-qPCR studies on human brain tissue from these patients. None of the investigated genes were affected...... by experimental variables such as RIN, PMI, or age. Findings were further validated by expression analyses of a target gene GSK3B, known to be affected by AD and PD. We obtained high variations in GSK3B levels when contrasting the results using different sets of common RG underlining the importance of a priori...

  17. Choosing representative body sizes for reference adults and children

    International Nuclear Information System (INIS)

    Cristy, M.

    1992-01-01

    In 1975 the International Commission on Radiological Protection published a report on Reference Man (ICRP Publication 23), and a task group of the ICRP is now revising that report. Currently 'Reference Man [adult male] is defined as being between 20-30 years of age, weighing 70 kg, is 170 cm in height, is a Caucasian and is a Western European or North American in habitat and custom' (ICRP 23, p. 4). A reference adult female (58 kg, 160 cm) was also defined and data on the fetus and children were given, but with less detail and fewer specific reference values because the focus of the ICRP at that time was on young male radiation workers. The 70-kg Reference Man (earlier called Standard Man) has been used in radiation protection for 40 years, including the dosimetric schema for nuclear medicine, and this 70-kg reference has been used since at least the 1920's in physiological models. As is well known, humans in most parts of the world have increased in size (height and weight) since this standard was first adopted. Taking modern European populations as a reference and expanding the age range to 20-50 years, the author now suggests 176 cm height and 73-75 kg weight for adult males and 163 cm and about 60 kg for adult females would be more appropriate. The change in height is particularly important because many anatomical and physiological parameters - e.g., lean body mass, skeletal weight, total body water, blood volume, respiratory volumes - are correlated more closely with height than with weight. The difference in lean body mass between Asian and Caucasian persons, for example, is largely or wholly due to the difference in body height. Many equations for mean body water and other whole-body measures use body height as the only or the most important parameter, and so it is important that reference body height be chosen well

  18. Evaluating the effectiveness of a priori information on process measures in a virtual reality inspection task

    Directory of Open Access Journals (Sweden)

    Shannon Raye Bowling

    2010-06-01

    Full Text Available Due to the complexity of the aircraft maintenance industry, much emphasis has been placed on improving aircraft inspection performance. One proven technique for improving inspection performance is training. Several strategies have been implemented for training, one of which is giving feedforward information. The use of a priori (feedforward) information is known to positively affect inspection performance (Ernst and Yovits, 1972; Long and Rourke, 1989; McKernan, 1989; Gramopadhye et al., 1997). This information can consist of knowledge about defect characteristics (types, severity/criticality, and location) and the probability of occurrence. Although several studies have demonstrated the usefulness of feedforward as a training strategy, certain research issues still need to be addressed. This study evaluates the effects of feedforward information on process measures in a simulated 3-dimensional environment (an aircraft cargo bay) presented in virtual reality.

  19. Defining functional distances over Gene Ontology

    Directory of Open Access Journals (Sweden)

    del Pozo Angela

    2008-01-01

    Full Text Available Background: A fundamental problem when trying to define the functional relationships between proteins is the difficulty in quantifying functional similarities, even when well-structured ontologies exist regarding the activity of proteins (i.e. the 'gene ontology', GO). However, functional metrics can overcome the problems in comparing and evaluating functional assignments and predictions. As a reference of proximity, previous approaches to compare GO terms considered linkage in terms of the ontology weighted by a probability distribution that balances the non-uniform 'richness' of different parts of the Directed Acyclic Graph. Here, we have followed a different approach to quantify functional similarities between GO terms. Results: We propose a new method to derive 'functional distances' between GO terms that is based on the simultaneous occurrence of terms in the same set of InterPro entries, instead of relying on the structure of the GO. The coincidence of GO terms reveals natural biological links between the GO functions and defines a distance model Df which fulfils the properties of a Metric Space. The distances obtained in this way can be represented as a hierarchical 'Functional Tree'. Conclusion: The method proposed provides a new definition of distance that enables the similarity between GO terms to be quantified. Additionally, the 'Functional Tree' defines groups with biological meaning, enhancing its utility for protein function comparison and prediction. Finally, this approach could be used for function-based protein searches in databases, and for analysing the gene clusters produced by DNA array experiments.
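
    The record does not give the exact form of the distance Df, but the idea of a co-occurrence-based metric can be illustrated with the Jaccard distance between the sets of InterPro entries in which each GO term occurs; this is an assumption used purely for illustration (Jaccard distance is a true metric, though not necessarily the paper's Df), and the accession numbers below are made up.

```python
def jaccard_distance(entries_a, entries_b):
    """Distance between two GO terms, each represented by the set of
    InterPro entries in which the term occurs. 0 = identical usage,
    1 = never co-occurring. Jaccard distance satisfies the metric axioms."""
    a, b = set(entries_a), set(entries_b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Toy example with made-up InterPro accessions:
go_kinase   = {"IPR000719", "IPR011009", "IPR017441"}
go_atp_bind = {"IPR000719", "IPR011009", "IPR027417"}
print(jaccard_distance(go_kinase, go_atp_bind))  # 0.5 -> functionally related terms
```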

  20. Population-Based Pediatric Reference Intervals in General Clinical Chemistry: A Swedish Survey.

    Science.gov (United States)

    Ridefelt, Peter

    2015-01-01

    Very few high quality studies on pediatric reference intervals for general clinical chemistry and hematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The Swedish survey included 701 healthy children. Reference intervals for general clinical chemistry and hematology were defined.

  1. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    NARCIS (Netherlands)

    Varikuti, D.P.; Hoffstaedter, F.; Genon, S.; Schwender, H.; Reid, A.T.; Eickhoff, S.B.

    2017-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional

  2. The Reference Scenarios for the Swiss Emergency Planning

    International Nuclear Information System (INIS)

    Hanspeter Isaak; Navert, Stephan B.; Ralph Schulz

    2006-01-01

    For the purpose of emergency planning and preparedness, realistic reference scenarios and corresponding accident source terms have been defined on the basis of common plant features. Three types of representative reference scenarios encompass the accident sequences expected to be the most probable. Accident source terms are assumed to be identical for all Swiss nuclear power plants, although the plants differ in reactor type and power. Plant-specific probabilistic safety analyses were used to justify the reference scenarios and the postulated accident source terms. From the full spectrum of release categories available, those categories were selected which would be covered by the releases and time frames assumed in the reference scenarios. For each nuclear power plant, the cumulative frequency of accident sequences not covered by the reference scenarios was determined. It was found that the cumulative frequency for such accident sequences does not exceed about 1 × 10⁻⁶ per year. The Swiss Federal Nuclear Safety Inspectorate concludes that the postulated accident source terms for the reference scenarios are consistent with the current international approach in emergency planning, where one should concentrate on the most probable accident sequences. (N.C.)

  3. The preparation and characterisation of reference fission foils

    International Nuclear Information System (INIS)

    Audenhove, J. van; Bievre, P. de; Pauwels, J.; Peetermans, F.; Gallet, M.; Verbruggen, A.

    1979-01-01

    Homogeneous and accurately defined uranium and plutonium reference fissionable deposits have been prepared by vacuum deposition of fluorides. The preparation of the fluorides as well as their vacuum deposition on planetary rotating multisubstrate holders are described. The characterisation of the deposits is obtained by relative α-counting and calibration using isotope dilution mass spectrometry. The mass per square centimeter of the deposits is corrected for the border effects and the homogeneity is determined by relative α-counting of small spots. The deposits show excellent adherence and resistance to different mediums. This makes their use as permanently available reference fission foils possible. (orig.)

  4. Comparative study and implementation of image reconstruction methods for positron emission tomography: the value of taking a priori information into account

    Energy Technology Data Exchange (ETDEWEB)

    Bouchet, F.

    1996-09-25

    Positron emission tomography aims to explore an organ by injecting a radiotracer and building two-dimensional representations with image-processing techniques. The method most used in routine practice is filtered back-projection, which yields smoothed images. This work is a comparative study of newer reconstruction techniques. Contour-preserving methods are studied here; the idea is to use NMR imaging as a priori information. Two image reconstruction techniques are examined in particular: resolution by pseudo-inverse and the Bayesian method. (N.C.)

  5. Defining School Readiness in Maryland: A Multi-Dimensional Perspective. Publication #2012-44

    Science.gov (United States)

    Forry, Nicole; Wessel, Julia

    2012-01-01

    Increased emphasis has been placed on children's ability to enter kindergarten ready to learn, a concept referred to as "school readiness." School readiness has been defined by the Maryland State Department of Education as "the stage of human development that enables a child to engage in, and benefit from, primary learning…

  6. Reference Ellipsoid and Geoid in Chronometric Geodesy

    Science.gov (United States)

    Kopeikin, Sergei M.

    2016-02-01

    Chronometric geodesy applies general relativity to study the problem of the shape of celestial bodies including the earth, and their gravitational field. The present paper discusses the relativistic problem of construction of a background geometric manifold that is used for describing a reference ellipsoid, geoid, the normal gravity field of the earth and for calculating geoid's undulation (height). We choose the perfect fluid with an ellipsoidal mass distribution uniformly rotating around a fixed axis as a source of matter generating the geometry of the background manifold through the Einstein equations. We formulate the post-Newtonian hydrodynamic equations of the rotating fluid to find out the set of algebraic equations defining the equipotential surface of the gravity field. In order to solve these equations we explicitly perform all integrals characterizing the interior gravitational potentials in terms of elementary functions depending on the parameters defining the shape of the body and the mass distribution. We employ the coordinate freedom of the equations to choose these parameters to make the shape of the rotating fluid configuration to be an ellipsoid of rotation. We derive expressions of the post-Newtonian mass and angular momentum of the rotating fluid as functions of the rotational velocity and the parameters of the ellipsoid including its bare density, eccentricity and semi-major axes. We formulate the post-Newtonian Pizzetti and Clairaut theorems that are used in geodesy to connect the parameters of the reference ellipsoid to the polar and equatorial values of force of gravity. We expand the post-Newtonian geodetic equations characterizing the reference ellipsoid into the Taylor series with respect to the eccentricity of the ellipsoid, and discuss the small-eccentricity approximation. Finally, we introduce the concept of relativistic geoid and its undulation with respect to the reference ellipsoid, and discuss how to calculate it in chronometric
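
    For orientation, the classical (Newtonian, first-order) form of Clairaut's theorem that this record generalizes relativistically relates the geometric flattening of the reference ellipsoid to the polar and equatorial values of gravity; the post-Newtonian corrections derived in the paper are not reproduced here.

```latex
% Classical first-order Clairaut theorem (Newtonian limit).
% f   = (a - c)/a          geometric flattening (a, c: equatorial, polar semi-axes)
% m   = \omega^2 a / g_e   ratio of centrifugal to gravitational acceleration at the equator
% g_p, g_e                 polar and equatorial gravity
\frac{g_p - g_e}{g_e} \;\approx\; \frac{5}{2}\,m \;-\; f
```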

  7. How the Government Defines "Rural" Has Implications for Education Policies and Practices. Issues & Answers. REL 2007-010

    Science.gov (United States)

    Arnold, Michael L.; Biscoe, Belinda; Farmer, Thomas W.; Robertson, Dylan L.; Shapley, Kathy L.

    2007-01-01

    Clearly defining what rural means has tangible implications for public policies and practices in education, from establishing resource needs to achieving the goals of No Child Left Behind in rural areas. The word "rural" has many meanings. It has been defined in reference to population density, geographic features, and level of economic…

  8. First research coordination meeting on reference database for neutron activation analysis. Summary report

    International Nuclear Information System (INIS)

    Firestone, R.B.; Trkov, A.

    2005-10-01

    Potential problems associated with nuclear data for neutron activation analysis were identified, the scope of the work to be undertaken was defined together with its priorities, and tasks were assigned to participants. Data testing and measurements refer to gamma spectrum peak evaluations, detector efficiency calibration, neutron spectrum characteristics and reference materials analysis. (author)

  9. Impact of modellers' decisions on hydrological a priori predictions

    Science.gov (United States)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the needed records for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their prediction in three steps based on adding information prior to each following step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) to simulate discharge for an ungauged catchment and (ii) using models that were developed for catchments, which are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, nor groundwater response and had therefore to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modeller's assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of

  10. REFERENCE ON THERMOPHYSICAL PROPERTIES: DENSITY AND VISCOSITY OF WATER FOR ATMOSPHERIC PRESSURE

    Directory of Open Access Journals (Sweden)

    Elin Yusibani

    2016-09-01

    Full Text Available A reference on thermophysical properties, density and viscosity, of water at atmospheric pressure has been developed in MS Excel (as macros). Patterson's density equations and Kestin's viscosity equations have been chosen as the basic equations and programmed in VBA as user-defined functions. The results have been compared with REFPROP as a well-known standard reference.
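
    As a sketch of how such a user-defined property function can be implemented (here in Python rather than VBA, and using a widely quoted Vogel-type correlation for the dynamic viscosity of liquid water rather than Kestin's equations, so the coefficients are an assumption of this illustration and not taken from the paper):

```python
import math  # not strictly needed here, but typical for property correlations

def water_viscosity_pa_s(temp_c):
    """Approximate dynamic viscosity of liquid water at atmospheric pressure, in Pa*s.
    Vogel-type correlation mu = A * 10**(B / (T - C)) with T in kelvin;
    gives ~0.89e-3 Pa*s at 25 degC and is valid roughly over 0-100 degC."""
    t_k = temp_c + 273.15
    return 2.414e-5 * 10.0 ** (247.8 / (t_k - 140.0))

if __name__ == "__main__":
    for t in (5, 25, 60, 90):
        print(f"{t:3d} degC : {water_viscosity_pa_s(t) * 1e3:.3f} mPa*s")
```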

  11. Defining health-related quality of life for young wheelchair users: A qualitative health economics study.

    Directory of Open Access Journals (Sweden)

    Nathan Bray

    Full Text Available Wheelchairs for children with impaired mobility provide health, developmental and psychosocial benefits; however, there is limited understanding of how mobility aids affect the health-related quality of life of children with impaired mobility. Preference-based health-related quality of life outcome measures are used to calculate quality-adjusted life years; an important concept in health economics. The aim of this research was to understand how young wheelchair users and their parents define health-related quality of life in relation to mobility impairment and wheelchair use. The sampling frame was children with impaired mobility (≤18 years) who use a wheelchair, and their parents. Data were collected through semi-structured face-to-face interviews conducted in participants' homes. Qualitative framework analysis was used to analyse the interview transcripts. An a priori thematic coding framework was developed. Emerging codes were grouped into categories, and refined into analytical themes. The data were used to build an understanding of how children with impaired mobility define health-related quality of life in relation to mobility impairment, and to assess the applicability of two standard measures of health-related quality of life. Eleven children with impaired mobility and 24 parents were interviewed across 27 interviews. Participants defined mobility-related quality of life through three distinct but interrelated concepts: (1) participation and positive experiences; (2) self-worth and feeling fulfilled; (3) health and functioning. A good degree of consensus was found between child and parent responses, although there was some evidence to suggest a shift in perception of mobility-related quality of life with child age. Young wheelchair users define health-related quality of life in a distinct way as a result of their mobility impairment and adaptation use. Generic, preference-based measures of health-related quality of life lack sensitivity in this

  12. Semi-automatic segmentation of myocardium at risk in T2-weighted cardiovascular magnetic resonance

    Directory of Open Access Journals (Sweden)

    Sjögren Jane

    2012-01-01

    Full Text Available Background: T2-weighted cardiovascular magnetic resonance (CMR) has been shown to be a promising technique for determination of ischemic myocardium, referred to as myocardium at risk (MaR), after an acute coronary event. Quantification of MaR in T2-weighted CMR has been proposed to be performed by manual delineation or the threshold methods of two standard deviations from remote (2SD), full width half maximum intensity (FWHM) or Otsu. However, manual delineation is subjective and threshold methods have inherent limitations related to threshold definition and lack of a priori information about cardiac anatomy and physiology. Therefore, the aim of this study was to develop an automatic segmentation algorithm for quantification of MaR using anatomical a priori information. Methods: Forty-seven patients with first-time acute ST-elevation myocardial infarction underwent T2-weighted CMR within 1 week after admission. Endocardial and epicardial borders of the left ventricle, as well as the hyper-enhanced MaR regions, were manually delineated by experienced observers and used as reference method. A new automatic segmentation algorithm, called Segment MaR, defines the MaR region as the continuous region most probable of being MaR, by estimating the intensities of normal myocardium and MaR with an expectation maximization algorithm and restricting the MaR region by an a priori model of the maximal extent for the user-defined culprit artery. The segmentation by Segment MaR was compared against inter-observer variability of manual delineation and the threshold methods of 2SD, FWHM and Otsu. Results: MaR was 32.9 ± 10.9% of left ventricular mass (LVM) when assessed by the reference observer and 31.0 ± 8.8% of LVM assessed by Segment MaR. The bias and correlation were -1.9 ± 6.4% of LVM, R = 0.81. Conclusions: There is good agreement between automatic Segment MaR and manually assessed MaR in T2-weighted CMR. Thus, the proposed algorithm seems to be a promising, objective method for standardized MaR quantification in T2-weighted CMR.

  13. Semi-automatic segmentation of myocardium at risk in T2-weighted cardiovascular magnetic resonance.

    Science.gov (United States)

    Sjögren, Jane; Ubachs, Joey F A; Engblom, Henrik; Carlsson, Marcus; Arheden, Håkan; Heiberg, Einar

    2012-01-31

    T2-weighted cardiovascular magnetic resonance (CMR) has been shown to be a promising technique for determination of ischemic myocardium, referred to as myocardium at risk (MaR), after an acute coronary event. Quantification of MaR in T2-weighted CMR has been proposed to be performed by manual delineation or the threshold methods of two standard deviations from remote (2SD), full width half maximum intensity (FWHM) or Otsu. However, manual delineation is subjective and threshold methods have inherent limitations related to threshold definition and lack of a priori information about cardiac anatomy and physiology. Therefore, the aim of this study was to develop an automatic segmentation algorithm for quantification of MaR using anatomical a priori information. Forty-seven patients with first-time acute ST-elevation myocardial infarction underwent T2-weighted CMR within 1 week after admission. Endocardial and epicardial borders of the left ventricle, as well as the hyper-enhanced MaR regions, were manually delineated by experienced observers and used as reference method. A new automatic segmentation algorithm, called Segment MaR, defines the MaR region as the continuous region most probable of being MaR, by estimating the intensities of normal myocardium and MaR with an expectation maximization algorithm and restricting the MaR region by an a priori model of the maximal extent for the user-defined culprit artery. The segmentation by Segment MaR was compared against inter-observer variability of manual delineation and the threshold methods of 2SD, FWHM and Otsu. MaR was 32.9 ± 10.9% of left ventricular mass (LVM) when assessed by the reference observer and 31.0 ± 8.8% of LVM assessed by Segment MaR. The bias and correlation were -1.9 ± 6.4% of LVM, R = 0.81, and -2.3 ± 4.9% of LVM, R = 0.91, respectively. There is good agreement between automatic Segment MaR and manually assessed MaR in T2-weighted CMR. Thus, the proposed algorithm seems to be a promising, objective method for standardized MaR quantification in T2-weighted CMR.
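
    A minimal sketch of the intensity-estimation step named in this record, i.e. a two-component (normal myocardium vs. MaR) Gaussian mixture fitted by expectation maximization to myocardial pixel intensities. The anatomical restriction by the culprit-artery model and all CMR-specific handling in Segment MaR are omitted, so this illustrates the general technique on synthetic data, not the published algorithm.

```python
import numpy as np

def em_two_class(intensities, n_iter=50):
    """Fit a 2-component 1-D Gaussian mixture (normal vs. hyper-enhanced)
    with EM and return the per-pixel probability of the brighter class."""
    x = np.asarray(intensities, dtype=float)
    # crude initialisation from the intensity distribution
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities of each class for each pixel
        pdf = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        resp = pi * pdf
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update mixture weights, means and standard deviations
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    bright = int(np.argmax(mu))        # index of the hyper-enhanced (MaR-like) class
    return resp[:, bright]

# Synthetic demo: 70% "normal" pixels around 100 a.u., 30% "MaR" pixels around 160 a.u.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(100, 10, 700), rng.normal(160, 12, 300)])
p_mar = em_two_class(pixels)
print(f"estimated MaR fraction: {np.mean(p_mar > 0.5):.2f}")  # ~0.30
```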

  14. Evaluation of a compliance device in a subgroup of adult patients receiving specific immunotherapy with grass allergen tablets (GRAZAX) in a randomized, open-label, controlled study: an a priori subgroup analysis.

    NARCIS (Netherlands)

    Jansen, A.P.H.; Andersen, K.F.; Bruning, H.

    2009-01-01

    OBJECTIVES: This a priori subgroup analysis was conducted to assess patients' experience with a compliance device for the administration of sublingual specific immunotherapy for grass pollen-induced rhinoconjunctivitis. METHODS: The present paper reports the results of a subgroup analysis of a

  15. Integrated geophysical survey in defining subsidence features on a golf course

    Science.gov (United States)

    Xia, J.; Miller, R.D.

    2007-01-01

    Subsidence was observed at several places on the Salina Municipal Golf Course in areas known to be built over a landfill in Salina, Kansas. High-resolution magnetic survey (≈5400 m²), multi-channel electrical resistivity profiling (three 154 m lines) and microgravity profiling (23 gravity-station values) were performed on a subsidence site (Green 16) to aid in determining boundaries and density deficiency of the landfill in the vicinity of the subsidence. Horizontal boundaries of the landfill were confidently defined by both magnetic anomalies and the pseudo-vertical gradient of total field magnetic anomalies. Furthermore, the pseudo-vertical gradient of magnetic anomalies presented a unique anomaly at Green 16, which provided a criterion for predicting other spots with subsidence potential using the same gradient property. Results of multi-channel electrical resistivity profiling (ERP) suggested the bottom limit of the landfill at Green 16 was around 21 m below the ground surface based on the vertical gradient of electric resistivity and a priori information on the depth of the landfill. ERP results also outlined several possible landfill bodies based on their low resistivity values. Microgravity results suggested a -0.14 g cm⁻³ density deficiency at Green 16 that could equate to future surface subsidence of as much as 1.5 m due to gradual compaction. © 2007 Nanjing Institute of Geophysical Prospecting.

  16. Current Practices of Measuring and Reference Range Reporting of Free and Total Testosterone in the United States.

    Science.gov (United States)

    Le, Margaret; Flores, David; May, Danica; Gourley, Eric; Nangia, Ajay K

    2016-05-01

    The evaluation and management of male hypogonadism should be based on symptoms and on serum testosterone levels. Diagnostically this relies on accurate testing and reference values. Our objective was to define the distribution of reference values and assays for free and total testosterone by clinical laboratories in the United States. Upper and lower reference values, assay methodology and source of published reference ranges were obtained from laboratories across the country. A standardized survey was reviewed with laboratory staff via telephone. Descriptive statistics were used to tabulate results. We surveyed a total of 120 laboratories in 47 states. Total testosterone was measured in-house at 73% of laboratories. At the remaining laboratories, testing was sent to larger centralized reference facilities. The mean ± SD lower reference value of total testosterone was 231 ± 46 ng/dl (range 160 to 300) and the mean upper limit was 850 ± 141 ng/dl (range 726 to 1,130). Only 9% of the laboratories performing in-house total testosterone testing created a reference range unique to their region. Others validated the instrument-recommended reference values in a small number of internal test samples. For free testosterone, 82% of laboratories sent testing to larger centralized reference laboratories where equilibrium dialysis and/or liquid chromatography with mass spectrometry was done. The remaining laboratories used published algorithms to calculate serum free testosterone. Reference ranges for testosterone assays vary significantly among laboratories. The ranges are predominantly defined by limited population studies of men with unknown medical and reproductive histories. These poorly defined and variable reference values, especially the lower limit, affect how clinicians determine treatment. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  17. Statistical parameters as a means to a priori assess the accuracy of solar forecasting models

    International Nuclear Information System (INIS)

    Voyant, Cyril; Soubdhan, Ted; Lauret, Philippe; David, Mathieu; Muselli, Marc

    2015-01-01

    In this paper we propose to determine and test a set of 20 statistical parameters in order to estimate the short-term predictability of global horizontal irradiation time series, and thereby to provide a new prospective tool indicating the expected error regardless of the forecasting method used. The mean absolute log return, a tool commonly used in econometrics but never before in global radiation prediction, proves to be a very good estimator. Examples of the use of this tool are presented, showing the value of this statistical parameter in concrete cases of prediction and optimization. This study offers guidance to engineers and researchers on the installation and management of solar plants and could help in minimizing the energy crisis by increasing the renewable share of the energy mix. - Highlights: • Use of a statistical parameter never before applied to global radiation forecasting. • An a priori index allowing a clear-sky model to be optimized easily and quickly. • A new methodology to quantify the prediction error regardless of the predictor used. • The prediction error depends more on the location and the type of time series than on the machine learning method used.
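
    The record highlights the mean absolute log return as its key predictability indicator; a minimal sketch of that statistic applied to an irradiation-like series follows (the series below is synthetic, and the exact preprocessing used by the authors is not reproduced).

```python
import numpy as np

def mean_absolute_log_return(series, eps=1e-9):
    """Mean absolute log return of a positive time series x_t:
    mean(|ln(x_t / x_{t-1})|). Larger values indicate a more volatile,
    hence less predictable, series."""
    x = np.asarray(series, dtype=float) + eps
    log_returns = np.diff(np.log(x))
    return np.mean(np.abs(log_returns))

# Synthetic example: a smooth series versus a noisy one
t = np.linspace(0, 10, 200)
smooth = 500 + 100 * np.sin(t)
noisy = smooth * np.exp(np.random.default_rng(1).normal(0, 0.2, t.size))
print(mean_absolute_log_return(smooth), mean_absolute_log_return(noisy))
```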

  18. Biological reference points for fish stocks in a multispecies context

    DEFF Research Database (Denmark)

    Collie, J.S.; Gislason, Henrik

    2001-01-01

    Biological reference points (BRPs) are widely used to define safe levels of harvesting for marine fish populations. Most BRPs are either minimum acceptable biomass levels or maximum fishing mortality rates. The values of BRPs are determined from historical abundance data and the life...

  19. Deep brain stimulation for Parkinson's disease: defining the optimal location within the subthalamic nucleus.

    Science.gov (United States)

    Bot, Maarten; Schuurman, P Richard; Odekerken, Vincent J J; Verhagen, Rens; Contarino, Fiorella Maria; De Bie, Rob M A; van den Munckhof, Pepijn

    2018-05-01

    Individual motor improvement after deep brain stimulation (DBS) of the subthalamic nucleus (STN) for Parkinson's disease (PD) varies considerably. Stereotactic targeting of the dorsolateral sensorimotor part of the STN is considered paramount for maximising effectiveness, but studies employing the midcommissural point (MCP) as anatomical reference failed to show correlation between DBS location and motor improvement. The medial border of the STN as reference may provide better insight in the relationship between DBS location and clinical outcome. Motor improvement after 12 months of 65 STN DBS electrodes was categorised into non-responding, responding and optimally responding body-sides. Stereotactic coordinates of optimal electrode contacts relative to both medial STN border and MCP served to define theoretic DBS 'hotspots'. Using the medial STN border as reference, significant negative correlation (Pearson's correlation -0.52, P<0.01) was found between the Euclidean distance from the centre of stimulation to this DBS hotspot and motor improvement. This hotspot was located at 2.8 mm lateral, 1.7 mm anterior and 2.5 mm superior relative to the medial STN border. Using MCP as reference, no correlation was found. The medial STN border proved superior compared with MCP as anatomical reference for correlation of DBS location and motor improvement, and enabled defining an optimal DBS location within the nucleus. We therefore propose the medial STN border as a better individual reference point than the currently used MCP on preoperative stereotactic imaging, in order to obtain optimal and thus less variable motor improvement for individual patients with PD following STN DBS. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
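
    A minimal numerical sketch of the analysis described in this record: the Euclidean distance from each stimulation centre to the reported hotspot (2.8 mm lateral, 1.7 mm anterior, 2.5 mm superior to the medial STN border), correlated against motor improvement. The electrode coordinates and improvement values below are invented for illustration only.

```python
import numpy as np

HOTSPOT = np.array([2.8, 1.7, 2.5])  # lateral, anterior, superior to medial STN border (mm)

def distance_to_hotspot(contacts_mm):
    """Euclidean distance (mm) from each stimulation centre to the hotspot,
    with coordinates expressed relative to the medial STN border."""
    return np.linalg.norm(np.asarray(contacts_mm, dtype=float) - HOTSPOT, axis=1)

# Hypothetical electrodes (mm relative to the medial STN border) and motor improvements (%)
contacts = np.array([[2.9, 1.5, 2.4], [1.0, 0.5, 4.0], [3.5, 2.0, 2.0], [0.2, -1.0, 5.0]])
improvement = np.array([65.0, 30.0, 58.0, 18.0])

d = distance_to_hotspot(contacts)
r = np.corrcoef(d, improvement)[0, 1]  # expected to be negative, as in the study
print(np.round(d, 2), round(r, 2))
```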

  20. Endo-reference of a formalized science of nature (Endo-referência de uma ciência formalizada da natureza)

    Directory of Open Access Journals (Sweden)

    Michel Paty

    1992-04-01

    Full Text Available The paper develops the epistemological implications of the idea that, when a theory has reached a stage of abstract formalization (as today's physical theories, quantum mechanics for instance, have), the meaning of its concepts and propositions, which aim at the representation and intelligibility of the external physical world (physical reality in any of its acceptations), is nevertheless to be taken from within the theory itself. "It is the theory itself which dictates the physical significance of its propositions", in the words of Einstein and Heisenberg. The paper then analyses what is meant by a 'formalized theory' for a science with empirical content such as physics, and concludes that it cannot be reduced to a mere 'interpreted formalism' in the sense accepted by the logical empiricism of the Vienna Circle. Dynamic aspects of theories are also discussed. Finally, on the basis of the above considerations, the concept of 'endo-reference' is introduced and its connections with the Kantian concept of the synthetic a priori are discussed.

  1. Defining adaptation in a generic multi layer model: CAM: The GRAPPLE Conceptual Adaptation Model

    NARCIS (Netherlands)

    Hendrix, M.; De Bra, P.M.E.; Pechenizkiy, M.; Smits, D.; Cristea, A.I.; Dillenbourg, P.; Specht, M.

    2008-01-01

    Authoring of Adaptive Hypermedia is a difficult and time consuming task. Reference models like LAOS and AHAM separate adaptation and content in different layers. Systems like AHA!, offer graphical tools based on these models to allow authors to define adaptation without knowing any adaptation

  2. The benefits of defining "snacks".

    Science.gov (United States)

    Hess, Julie M; Slavin, Joanne L

    2018-04-18

    Whether eating a "snack" is considered a beneficial or detrimental behavior is largely based on how "snack" is defined. The term "snack food" tends to connote energy-dense, nutrient-poor foods high in nutrients to limit (sugar, sodium, and/or saturated fat) like cakes, cookies, chips and other salty snacks, and sugar-sweetened beverages. Eating a "snack food" is often conflated with eating a "snack," however, leading to an overall perception of snacks as a dietary negative. Yet the term "snack" can also refer simply to an eating occasion outside of breakfast, lunch, or dinner. With this definition, the evidence to support health benefits or detriments to eating a "snack" remains unclear, in part because relatively few well-designed studies that specifically focus on the impact of eating frequency on health have been conducted. Despite these inconsistencies and research gaps, in much of the nutrition literature, "snacking" is still referred to as detrimental to health. As discussed in this review, however, there are multiple factors that influence the health impacts of snacking, including the definition of "snack" itself, the motivation to snack, body mass index of snack eaters, and the food selected as a snack. Without a definition of "snack" and a body of research using methodologically rigorous protocols, determining the health impact of eating a "snack" will continue to elude the nutrition research community and prevent the development of evidence-based policies about snacking that support public health. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Compilation of anatomical, physiological and metabolic characteristics of Reference Asian Man in Pakistan

    International Nuclear Information System (INIS)

    Manzoor A. Atta; Perveen Akhter; Malik, G.M.

    1998-01-01

    A research programme was initiated in collaboration with IAEA/RCA to establish local sex-specific data and later to contribute to defining a reference Asian man/woman in the age ranges of 5, 10, 15, 20-29, 30-39, 40-49 and 20-50 years, in order to strengthen the radiation protection infrastructure of the country. Physical data on height, weight, chest and head circumference, and food consumption data, of reference Pakistani men/women were collected from various socioeconomic strata residing in different ecological areas of Pakistan. The present study revealed that the daily nutritional status and all the physical parameters are significantly lower than those of the ICRP reference man of Caucasian origin, except for male standing height. Since the anatomical organs are roughly proportional to body size, approximations can be made for internal dosimetry purposes using the same ratios as defined by those countries that experimentally established their values. (author)

  4. A fiducial reference site for satellite altimetry in Crete, Greece

    DEFF Research Database (Denmark)

    Mertikas, Stelios; Donlon, Craig; Mavrokordatos, Constantin

    With the advent of diverse satellite altimeters and varying measurement techniques, it has become accepted in the scientific community that an absolute reference Cal/Val site should be regularly maintained to define, monitor and control the responses of any altimetric system. This work sets the ground for the...

  5. A Modernized National Spatial Reference System in 2022: Focus on the Caribbean Terrestrial Reference Frame

    Science.gov (United States)

    Roman, D. R.

    2017-12-01

    In 2022, the National Geodetic Survey will replace all three NAD 83 reference frames with four new terrestrial reference frames. Each frame will be named after a tectonic plate (North American, Pacific, Caribbean and Mariana) and each will be related to the IGS frame through three Euler Pole parameters (EPPs). This talk will focus on practical application in the Caribbean region. A working group is being re-established for development of the North American region and will likely also result in analysis of the Pacific region as well. Both of these regions are adequately covered with existing CORS sites to model the EPPs. The Mariana region currently lacks sufficient coverage, but a separate project is underway to collect additional information to help in defining EPPs for that region at a later date. The Caribbean region has existing robust coverage through UNAVCO's COCONet and other data sets, but these require further analysis. This discussion will focus on practical examination of Caribbean sites to establish candidates for determining the Caribbean frame EPPs, as well as an examination of any residual velocities that might inform a model of motion within that frame (Intra-Frame Velocity Model). NGS has a vested interest in defining such a model to meet obligations to U.S. citizens in Puerto Rico and the U.S. Virgin Islands. Beyond this, NGS aims to collaborate with other countries in the region through efforts with SIRGAS and UN-GGIM-Americas for a more acceptable regional model to serve everyone's needs.
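
    As background on how three Euler pole parameters translate into horizontal plate velocities at a site, the standard rigid-plate relation v = Ω × r is sketched below; the pole latitude, longitude and rotation rate used here are placeholders, not the frame parameters NGS will adopt.

```python
import numpy as np

EARTH_RADIUS_M = 6.371e6

def plate_velocity_m_per_yr(site_lat, site_lon, pole_lat, pole_lon, rate_deg_per_myr):
    """Velocity of a point on a rigid plate, v = omega x r, in metres per year,
    expressed in geocentric Cartesian (x, y, z) components."""
    def unit_vector(lat, lon):
        lat, lon = np.radians([lat, lon])
        return np.array([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)])
    omega = unit_vector(pole_lat, pole_lon) * np.radians(rate_deg_per_myr) / 1e6  # rad/yr
    r = unit_vector(site_lat, site_lon) * EARTH_RADIUS_M                          # m
    return np.cross(omega, r)

# Placeholder Caribbean-like pole (NOT official EPP values) and a site in Puerto Rico:
v = plate_velocity_m_per_yr(site_lat=18.4, site_lon=-66.1,
                            pole_lat=39.0, pole_lon=-104.0, rate_deg_per_myr=0.25)
print(np.round(v * 1000, 1), "mm/yr (geocentric XYZ)")
```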

  6. Towards a reference model for portfolio management for product development

    DEFF Research Database (Denmark)

    Larsson, Flemming

    2006-01-01

    The aim of this paper is to explore the concept of portfolio management for product development at company level. Departing from a combination of explorative interviews with industry professionals and a literature review, a proposal for a reference model for portfolio management is developed....... The model consists of a set of defined and interrelated concepts which form a coherent and consistent reference model that explicates the totality of the portfolio management concept at company level in terms of structure, processes and elements. The model simultaneously pinpoints, positions and integrates...... several central dimensions of portfolio management....

  7. An a priori analysis of how solar energy production will affect the balance of payment account in one developing Latin American country

    Energy Technology Data Exchange (ETDEWEB)

    Stavy, Michael [Chicago, Illinois (United States)]

    2000-07-01

    This paper studies a model developing Latin American country (Hypotheria) with a weak currency (the hypo is the monetary unit), a trade deficit (including being a net importer of fossil fuels) and a sensitive balance of payments (BoP) situation. An a priori analysis is made of the effect of domestic solar energy production on Hypotheria's balance of payments; the positive effect is the BoP value of domestic solar energy. Many forms of solar energy are not cost-competitive with fossil fuels. Because solar energy production does not emit greenhouse gases, the greenhouse value and the BoP value of solar energy should be used to reduce the cost of solar energy projects in Hypotheria and to make solar energy cost-competitive with fossil fuels.

  8. Hafnium isotope ratios of nine GSJ reference samples

    International Nuclear Information System (INIS)

    Hanyu, Takeshi; Nakai, Shun'ichi; Tatsuta, Riichiro

    2005-01-01

    ¹⁷⁶Hf/¹⁷⁷Hf ratios of nine geochemical reference rocks from the Geological Survey of Japan, together with BIR-1 and BCR-2, were determined using multi-collector inductively coupled plasma mass spectrometry. Our data for BIR-1, BCR-2 and JB-1 are in agreement with those previously reported, demonstrating the appropriateness of the chemical procedure and isotopic measurement employed in this study. The reference rocks have a wide range of ¹⁷⁶Hf/¹⁷⁷Hf covering the field defined by various volcanic rocks, such as mid-ocean ridge basalts, ocean island basalts, and subduction-related volcanic rocks. They are therefore suitable as rock standards for Hf isotope measurement of geological samples. (author)

  9. Reference ellipsoid and geoid in chronometric geodesy

    Directory of Open Access Journals (Sweden)

    Sergei M Kopeikin

    2016-02-01

    Full Text Available Chronometric geodesy applies general relativity to study the problem of the shape of celestial bodies including the earth, and their gravitational field. The present paper discusses the relativistic problem of construction of a background geometric manifold that is used for describing a reference ellipsoid, geoid, the normal gravity field of the earth and for calculating geoid's undulation (height). We choose the perfect fluid with an ellipsoidal mass distribution uniformly rotating around a fixed axis as a source of matter generating the geometry of the background manifold through the Einstein equations. We formulate the post-Newtonian hydrodynamic equations of the rotating fluid to find out the set of algebraic equations defining the equipotential surface of the gravity field. In order to solve these equations we explicitly perform all integrals characterizing the interior gravitational potentials in terms of elementary functions depending on the parameters defining the shape of the body and the mass distribution. We employ the coordinate freedom of the equations to choose these parameters to make the shape of the rotating fluid configuration to be an ellipsoid of rotation. We derive expressions of the post-Newtonian mass and angular momentum of the rotating fluid as functions of the rotational velocity and the parameters of the ellipsoid including its bare density, eccentricity and semi-major axes. We formulate the post-Newtonian Pizzetti and Clairaut theorems that are used in geodesy to connect the parameters of the reference ellipsoid to the polar and equatorial values of force of gravity. We expand the post-Newtonian geodetic equations characterizing the reference ellipsoid into the Taylor series with respect to the eccentricity of the ellipsoid, and discuss the small-eccentricity approximation. Finally, we introduce the concept of relativistic geoid and its undulation with respect to the reference ellipsoid, and discuss how to calculate it

  10. Reference Ellipsoid and Geoid in Chronometric Geodesy

    Energy Technology Data Exchange (ETDEWEB)

    Kopeikin, Sergei M., E-mail: kopeikins@missouri.edu [Department of Physics and Astronomy, University of Missouri, Columbia, MO (United States); Department of Physical Geodesy and Remote Sensing, Siberian State University of Geosystems and Technologies, Novosibirsk (Russian Federation)]

    2016-02-25

    Chronometric geodesy applies general relativity to study the problem of the shape of celestial bodies including the earth, and their gravitational field. The present paper discusses the relativistic problem of construction of a background geometric manifold that is used for describing a reference ellipsoid, geoid, the normal gravity field of the earth and for calculating geoid's undulation (height). We choose the perfect fluid with an ellipsoidal mass distribution uniformly rotating around a fixed axis as a source of matter generating the geometry of the background manifold through the Einstein equations. We formulate the post-Newtonian hydrodynamic equations of the rotating fluid to find out the set of algebraic equations defining the equipotential surface of the gravity field. In order to solve these equations we explicitly perform all integrals characterizing the interior gravitational potentials in terms of elementary functions depending on the parameters defining the shape of the body and the mass distribution. We employ the coordinate freedom of the equations to choose these parameters to make the shape of the rotating fluid configuration to be an ellipsoid of rotation. We derive expressions of the post-Newtonian mass and angular momentum of the rotating fluid as functions of the rotational velocity and the parameters of the ellipsoid including its bare density, eccentricity and semi-major axes. We formulate the post-Newtonian Pizzetti and Clairaut theorems that are used in geodesy to connect the parameters of the reference ellipsoid to the polar and equatorial values of force of gravity. We expand the post-Newtonian geodetic equations characterizing the reference ellipsoid into the Taylor series with respect to the eccentricity of the ellipsoid, and discuss the small-eccentricity approximation. Finally, we introduce the concept of relativistic geoid and its undulation with respect to the reference ellipsoid, and discuss how to calculate it in

  11. References and arrow notation instead of join operation in query languages

    Directory of Open Access Journals (Sweden)

    Alexandr Savinov

    2012-10-01

    Full Text Available We study properties of the join operation in query languages and describe some of its major drawbacks. We provide strong arguments against using joins as a main construct for retrieving related data elements in general purpose query languages and argue for using references instead. Since conventional references are quite restrictive when applied to data modeling and query languages, we propose to use generalized references as they are defined in the concept-oriented model (COM. These references are used by two new operations, called projection and de-projection, which are denoted by right and left arrows and therefore this access method is referred to as arrow notation. We demonstrate advantages of the arrow notation in comparison to joins and argue that it makes queries simpler, more natural, easier to understand, and the whole query writing process more productive and less error-prone.
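
    A small sketch of the contrast drawn in this record, using plain Python dictionaries: a conventional join versus navigating references with projection (child → attribute of the referenced parent) and de-projection (parent → referencing children). The arrow syntax itself belongs to the concept-oriented model and is only mimicked here by function names.

```python
orders = [
    {"id": 1, "customer": "c1", "total": 120},
    {"id": 2, "customer": "c2", "total": 75},
    {"id": 3, "customer": "c1", "total": 30},
]
customers = {"c1": {"name": "Ada", "country": "UK"},
             "c2": {"name": "Bob", "country": "DE"}}

# Join-style access: pair up records explicitly on a common attribute.
joined = [{**o, **customers[o["customer"]]} for o in orders]

# Projection (right arrow): follow each order's reference to an attribute of its customer.
def project(records, ref_field, table, attr):
    return [table[r[ref_field]][attr] for r in records]

# De-projection (left arrow): from a customer key, collect the orders that reference it.
def deproject(table_key, records, ref_field):
    return [r for r in records if r[ref_field] == table_key]

print(project(orders, "customer", customers, "country"))  # ['UK', 'DE', 'UK']
print(deproject("c1", orders, "customer"))                 # orders 1 and 3
```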

  12. Financial Reporting from the Reference Theories’ Perspective

    OpenAIRE

    Victor Munteanu; Lavinia Copcinschi; Anda Laceanu; Carmen Luschi

    2014-01-01

    International accounting standards are underpinned by a normative approach to accounting, in the sense that they are based on a conceptual accounting framework assimilated to a theoretical framework. The conceptual framework's development is based on a priori theory, initiated by Chambers in his article published in 1955, where he defends the need for a theory of practical accounting and a detachment from descriptive theories of an inductive approach. Developing the accounting standards on suc...

  13. Defining suitable reference genes for RT-qPCR analysis on human sertoli cells after 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) exposure.

    Science.gov (United States)

    Ribeiro, Mariana Antunes; dos Reis, Mariana Bisarro; de Moraes, Leonardo Nazário; Briton-Jones, Christine; Rainho, Cláudia Aparecida; Scarano, Wellerson Rodrigo

    2014-11-01

    Quantitative real-time RT-PCR (qPCR) has proven to be a valuable molecular technique to quantify gene expression. There are few studies in the literature that describe suitable reference genes to normalize gene expression data. Studies of transcriptionally disruptive toxins, like tetrachlorodibenzo-p-dioxin (TCDD), require careful consideration of reference genes. The present study was designed to validate potential reference genes in human Sertoli cells after exposure to TCDD. 32 candidate reference genes were analyzed to determine their applicability. The geNorm and NormFinder software tools were used to obtain an estimation of the expression stability of the 32 genes and to identify the most suitable genes for qPCR data normalization.
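
    A simplified sketch of the geNorm-style stability measure underlying tools such as those mentioned in this record: for each candidate gene, the average standard deviation of its pairwise log2 expression ratios with every other candidate across samples. This is the published geNorm idea in outline only; the actual geNorm and NormFinder implementations include further steps not shown, and the expression values below are toy data.

```python
import numpy as np

def genorm_stability(expr):
    """expr: 2-D array, rows = candidate reference genes, columns = samples,
    values = relative expression quantities. Returns one M value per gene;
    a lower M indicates more stable expression."""
    log_expr = np.log2(np.asarray(expr, dtype=float))
    n_genes = log_expr.shape[0]
    m_values = np.empty(n_genes)
    for j in range(n_genes):
        # pairwise variation: SD over samples of the log-ratio with every other gene
        sds = [np.std(log_expr[j] - log_expr[k], ddof=1)
               for k in range(n_genes) if k != j]
        m_values[j] = np.mean(sds)
    return m_values

# Toy data: 3 candidates x 5 samples; the third gene varies most and should get the highest M.
expr = np.array([[1.0, 1.1, 0.9, 1.0, 1.05],
                 [2.0, 2.1, 1.9, 2.0, 2.20],
                 [1.0, 3.0, 0.5, 2.5, 0.80]])
print(np.round(genorm_stability(expr), 2))
```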

  14. The mere exposure effect depends on an odour’s initial pleasantness

    OpenAIRE

    Delplanque, Sylvain; Coppin, Géraldine; Bloesch, Laurène; Cayeux, Isabelle; Sander, David

    2015-01-01

    The mere exposure phenomenon refers to improvement of one’s attitude toward an a priori neutral stimulus after its repeated exposure. The extent to which such a phenomenon influences evaluation of a priori emotional stimuli remains under-investigated. Here we investigated this question by presenting participants with different odours varying in a priori pleasantness during different sessions spaced over time. Participants were requested to report each odour’s pleasantness, intensity, and fami...

  15. The mere exposure effect depends on an odor's initial pleasantness

    OpenAIRE

    Delplanque, Sylvain; Coppin, Géraldine; Bloesch, Laurène; Cayeux, Isabelle; Sander, David

    2015-01-01

    The mere exposure phenomenon refers to improvement of one’s attitude toward an a priori neutral stimulus after its repeated exposure. The extent to which such a phenomenon influences evaluation of a priori emotional stimuli remains under-investigated. Here we investigated this question by presenting participants with different odors varying in a priori pleasantness during different sessions spaced over time. Participants were requested to report each odor’s pleasantness, intensity, and famili...

  16. Reconstruction of Consistent 3d CAD Models from Point Cloud Data Using a Priori CAD Models

    Science.gov (United States)

    Bey, A.; Chaine, R.; Marc, R.; Thibault, G.; Akkouche, S.

    2011-09-01

    We address the reconstruction of 3D CAD models from point cloud data acquired in industrial environments, using a pre-existing 3D model as an initial estimate of the scene to be processed. Indeed, this prior knowledge can be used to drive the reconstruction so as to generate an accurate 3D model matching the point cloud. We more particularly focus our work on the cylindrical parts of the 3D models. We propose to state the problem in a probabilistic framework: we have to search for the 3D model which maximizes some probability taking several constraints into account, such as the relevancy with respect to the point cloud and the a priori 3D model, and the consistency of the reconstructed model. The resulting optimization problem can then be handled using a stochastic exploration of the solution space, based on the random insertion of elements in the configuration under construction, coupled with a greedy management of the conflicts which efficiently improves the configuration at each step. We show that this approach provides reliable reconstructed 3D models by presenting some results on industrial data sets.
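
    A toy sketch of the general strategy this record describes: score candidate elements from an a priori model against the point cloud, visit them in random order, and keep only the insertions that improve the global score. Everything here is simplified to 2-D circles and a hand-rolled score, so it illustrates the idea rather than the authors' probabilistic formulation for cylindrical CAD parts.

```python
import numpy as np

def fit_score(points, circles, sigma=0.05, penalty=5.0):
    """Higher is better: rewards points lying near some accepted circle,
    penalises model complexity (number of circles)."""
    if not circles:
        return 0.0
    centers = np.array([c[:2] for c in circles])
    radii = np.array([c[2] for c in circles])
    d = np.abs(np.linalg.norm(points[:, None, :] - centers[None], axis=2) - radii)
    reward = np.exp(-(d.min(axis=1) ** 2) / (2 * sigma ** 2)).sum()
    return reward - penalty * len(circles)

def greedy_reconstruct(points, prior_circles, rng):
    """Randomly visit candidate elements from the a priori model and
    keep each one only if it improves the global score."""
    accepted = []
    for idx in rng.permutation(len(prior_circles)):
        trial = accepted + [prior_circles[idx]]
        if fit_score(points, trial) > fit_score(points, accepted):
            accepted = trial
    return accepted

rng = np.random.default_rng(3)
# Point cloud sampled around one true circle (centre (0, 0), radius 1) plus noise
theta = rng.uniform(0, 2 * np.pi, 200)
cloud = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.02, (200, 2))
# The a priori model proposes two circles; only the first matches the data
prior = [(0.0, 0.0, 1.0), (2.0, 2.0, 0.5)]
print(greedy_reconstruct(cloud, prior, rng))  # keeps only the matching circle
```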

  17. Selective constraints in experimentally defined primate regulatory regions.

    Directory of Open Access Journals (Sweden)

    Daniel J Gaffney

    2008-08-01

    Full Text Available Changes in gene regulation may be important in evolution. However, the evolutionary properties of regulatory mutations are currently poorly understood. This is partly the result of an incomplete annotation of functional regulatory DNA in many species. For example, transcription factor binding sites (TFBSs), a major component of eukaryotic regulatory architecture, are typically short, degenerate, and therefore difficult to differentiate from randomly occurring, nonfunctional sequences. Furthermore, although sites such as TFBSs can be computationally predicted using evolutionary conservation as a criterion, estimates of the true level of selective constraint (defined as the fraction of strongly deleterious mutations occurring at a locus) in regulatory regions will, by definition, be upwardly biased in datasets that are a priori evolutionarily conserved. Here we investigate the fitness effects of regulatory mutations using two complementary datasets of human TFBSs that are likely to be relatively free of ascertainment bias with respect to evolutionary conservation but, importantly, are supported by experimental data. The first is a collection of more than 2,100 human TFBSs drawn from the literature in the TRANSFAC database, and the second is derived from several recent high-throughput chromatin immunoprecipitation coupled with genomic microarray (ChIP-chip) analyses. We also define a set of putative cis-regulatory modules (pCRMs) by spatially clustering multiple TFBSs that regulate the same gene. We find that a relatively high proportion (approximately 37%) of mutations at TFBSs are strongly deleterious, similar to that at a 2-fold degenerate protein-coding site. However, constraint is significantly reduced in human and chimpanzee pCRMs and ChIP-chip sequences, relative to macaques. We estimate that the fraction of regulatory mutations that have been driven to fixation by positive selection in humans is not significantly different from zero. We also find

  18. A reference aerosol for a radon reference chamber

    Science.gov (United States)

    Paul, Annette; Keyser, Uwe

    1996-02-01

    The measurement of radon and radon progenies and the calibration of their detection systems require the production and measurement of aerosols well-defined in size and concentration. In the German radon reference chamber, carnauba wax is used to produce standard aerosols because of its unique chemical and physical properties. The aerosol size spectra are measured on-line by an aerosol measurement system in the range of 10 nm to 1 μm aerodynamic diameter. The experimental set-ups for the study of adsorption of radioactive ions on aerosols as a function of their size and concentration will be described, the results presented and further adaptations for an aerosol jet introduced (for example, for the measurement of short-lived neutron-rich isotopes). Data on the dependence of aerosol radius, ion concentration and element selectivity are collected by using a 252Cf-sf source. The fission products of this source range widely in elements, isotopes and charges. Adsorption and the transport of radioactive ions on aerosols have therefore been studied for various ions for the first time, simultaneously with the aerosol size on-line spectrometry.

  19. A reference aerosol for a radon reference chamber

    Energy Technology Data Exchange (ETDEWEB)

    Paul, A. [Physikalisch-Technische Bundesanstalt, Braunschweig (Germany); Keyser, U. [Physikalisch-Technische Bundesanstalt, Braunschweig (Germany)

    1996-01-11

    The measurement of radon and radon progenies and the calibration of their detection systems require the production and measurement of aerosols well-defined in size and concentration. In the German radon reference chamber, carnauba wax is used to produce standard aerosols because of its unique chemical and physical properties. The aerosol size spectra are measured on-line by an aerosol measurement system in the range of 10 nm to 1 μm aerodynamic diameter. The experimental set-ups for the study of adsorption of radioactive ions on aerosols as a function of their size and concentration are described, the results presented and further adaptations for an aerosol jet introduced (for example, for the measurement of short-lived neutron-rich isotopes). Data on the dependence of aerosol radius, ion concentration and element selectivity are collected by using a 252Cf-sf source. The fission products of this source range widely in elements, isotopes and charges. Adsorption and the transport of radioactive ions on aerosols have therefore been studied for various ions for the first time, simultaneously with the aerosol size on-line spectrometry. (orig.).

  20. Descriptions of reference LWR facilities for analysis of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, K.J.; Kabele, T.J.

    1979-09-01

    To contribute to the Department of Energy's identification of needs for improved environmental controls in nuclear fuel cycles, a study was made of a light water reactor system. A reference LWR fuel cycle was defined, and each step in this cycle was characterized by facility description and mainline and effluent treatment process performance. The reference fuel cycle uses fresh uranium in light water reactors. Final treatment and ultimate disposition of waste from the fuel cycle steps were not included, and the waste is assumed to be disposed of by approved but currently undefined means. The characterization of the reference fuel cycle system is intended as basic information for further evaluation of alternative effluent control systems.

  1. Descriptions of reference LWR facilities for analysis of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Schneider, K.J.; Kabele, T.J.

    1979-09-01

    To contribute to the Department of Energy's identification of needs for improved environmental controls in nuclear fuel cycles, a study was made of a light water reactor system. A reference LWR fuel cycle was defined, and each step in this cycle was characterized by facility description and mainline and effluent treatment process performance. The reference fuel cycle uses fresh uranium in light water reactors. Final treatment and ultimate disposition of waste from the fuel cycle steps were not included, and the waste is assumed to be disposed of by approved but currently undefined means. The characterization of the reference fuel cycle system is intended as basic information for further evaluation of alternative effluent control systems

  2. Authenticity and Learning: Implications for Reference Librarianship and Information Literacy Instruction

    Science.gov (United States)

    Klipfel, Kevin Michael

    2015-01-01

    This article articulates and defends a student-centered approach to reference and instructional librarianship defined by authentic engagement with students' interests. The history of the construct of authenticity in philosophy, humanistic and existential psychology, and contemporary educational psychology is reviewed. Connections are…

  3. The origin of anomalous transport in porous media - is it possible to make a priori predictions?

    Science.gov (United States)

    Bijeljic, Branko; Blunt, Martin

    2013-04-01

    at approximately the average flow speed; in the carbonate with the widest velocity distribution the stagnant concentration peak is persistent, while the emergence of a smaller secondary mobile peak is observed, leading to a highly anomalous behavior. This defines the generically different nature of non-Fickian transport in the three media and quantifies the effect of pore structure on transport. Moreover, the propagators obtained by the model are in very good agreement with the propagators measured on beadpack, Bentheimer sandstone and Portland carbonate cores in nuclear magnetic resonance experiments. These findings demonstrate that it is possible to make a priori predictions of anomalous transport in porous media. The importance of these findings for transport in complex carbonate rock micro-CT images is discussed, classifying them in terms of the degree of anomalous transport that can have an impact at the field scale. Extensions to reactive transport will be discussed.

  4. Using greenhouse gas fluxes to define soil functional types

    Energy Technology Data Exchange (ETDEWEB)

    Petrakis, Sandra; Barba, Josep; Bond-Lamberty, Ben; Vargas, Rodrigo

    2017-12-04

    Soils provide key ecosystem services and directly control ecosystem functions; thus, there is a need to define the reference state of soil functionality. Most common functional classifications of ecosystems are vegetation-centered and neglect soil characteristics and processes. We propose Soil Functional Types (SFTs) as a conceptual approach to represent and describe the functionality of soils based on characteristics of their greenhouse gas (GHG) flux dynamics. We used automated measurements of CO2, CH4 and N2O in a forested area to define SFTs following a simple statistical framework. This study supports the hypothesis that SFTs provide additional insights on the spatial variability of soil functionality beyond information represented by commonly measured soil parameters (e.g., soil moisture, soil temperature, litter biomass). We discuss the implications of this framework at the plot-scale and the potential of this approach at larger scales. This approach is a first step to provide a framework to define SFTs, but a community effort is necessary to harmonize any global classification for soil functionality. A global application of the proposed SFT framework will only be possible if there is a community-wide effort to share data and create a global database of GHG emissions from soils.
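
    The statistical framework behind the SFT grouping is not spelled out in the abstract; one plausible way to derive such types is to cluster measurement locations by their standardised CO2, CH4 and N2O flux signatures, as in this hypothetical sketch (the flux values and the choice of three clusters are invented, not the authors' analysis).

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # One row per measurement location: mean CO2, CH4 and N2O flux (made-up numbers).
    fluxes = np.array([
        [2.1, 0.03, 0.004],
        [2.4, 0.02, 0.005],
        [0.9, 0.45, 0.001],
        [1.0, 0.40, 0.002],
        [3.5, 0.01, 0.020],
        [3.3, 0.02, 0.018],
    ])

    X = StandardScaler().fit_transform(fluxes)  # put the three gases on a common scale
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print(labels)  # cluster index = candidate Soil Functional Type for each location
    ```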

  5. European gene mapping project (EUROGEM) : Breakpoint panels for human chromosomes based on the CEPH reference families

    NARCIS (Netherlands)

    Attwood, J; Bryant, SP; Bains, R; Povey, R; Povey, S; Rebello, M; Kapsetaki, M; Moschonas, NK; Grzeschik, KH; Otto, M; Dixon, M; Sudworth, HE; Kooy, RF; Wright, A; Teague, P; Terrenato, L; Vergnaud, G; Monfouilloux, S; Weissenbach, J; Alibert, O; Dib, C; Faure, S; Bakker, E; Pearson, NM; Vossen, RHAM; Gal, A; MuellerMyhsok, B; Cann, HM; Spurr, NK

    Meiotic breakpoint panels for human chromosomes 2, 3, 4, 5, 6, 7, 8, 9, 10, 13, 14, 15, 17, 18, 20 and X were constructed from genotypes from the CEPH reference families. Each recombinant chromosome included has a breakpoint well-supported with reference to defined quantitative criteria. The panels

  6. Twenty First Century Cyberbullying Defined: An Analysis of Intent, Repetition and Emotional Response

    Science.gov (United States)

    Walker, Carol Marie

    2012-01-01

    The purpose of this study was to analyze the extent and impact that cyberbullying has on the undergraduate college student and provide a current definition for the event. A priori power analysis guided this research to provide an 80 percent probability of detecting a real effect with medium effect size. Adequate research power was essential to…

  7. Deficient motion-defined and texture-defined figure-ground segregation in amblyopic children.

    Science.gov (United States)

    Wang, Jane; Ho, Cindy S; Giaschi, Deborah E

    2007-01-01

    Motion-defined form deficits in the fellow eye and the amblyopic eye of children with amblyopia implicate possible direction-selective motion processing or static figure-ground segregation deficits. Deficient motion-defined form perception in the fellow eye of amblyopic children may not be fully accounted for by a general motion processing deficit. This study investigates the contribution of figure-ground segregation deficits to the motion-defined form perception deficits in amblyopia. Performances of 6 amblyopic children (5 anisometropic, 1 anisostrabismic) and 32 control children with normal vision were assessed on motion-defined form, texture-defined form, and global motion tasks. Performance on motion-defined and texture-defined form tasks was significantly worse in amblyopic children than in control children. Performance on global motion tasks was not significantly different between the 2 groups. Faulty figure-ground segregation mechanisms are likely responsible for the observed motion-defined form perception deficits in amblyopia.

  8. First attempt to assess the viability of bluefin tuna spawning events in offshore cages located in an a priori favourable larval habitat

    Directory of Open Access Journals (Sweden)

    Patricia Reglero

    2013-10-01

    Full Text Available Most of the Atlantic bluefin tuna caught by the purse-seine fleet in the Mediterranean Sea are transferred alive into transport cages and towed to coastal facilities where they are fattened. This major fishery targets aggregations of reproductive bluefin tuna that continue spawning within the transport cages. Our study is the first attempt to assess the viability of the spawning events within transport cages placed offshore in a priori favourable locations for larval survival. The study was conducted in June 2010 in the Balearic Sea, a main spawning area for bluefin tuna in the Mediterranean. The locations of the two transport cages, one with wild and one with captive tuna, coincided with the position of the chlorophyll front, using satellite imagery as a proxy for the salinity front between resident surface waters and those of recent Atlantic origin. The results showed that bluefin tuna eggs were spawned almost every day within the two cages but few or no larvae were found. The expected larval densities estimated after applying mortality curves to the daily egg densities observed in the cages were higher than the sampled larval densities. The trajectories of the eggs after hatching, estimated from a particle tracking model based on observed geostrophic currents and a drifter deployed adjacent to the cage, suggest that larvae were likely to be caught close to the cages within the sampling dates. Eggs spawned by captive tuna in transport cages may hatch into larvae, though the larvae may experience higher mortality rates than expected in natural populations. The causes of the larval mortality are further discussed in the text. Such studies should be repeated in other spawning areas in the Mediterranean if spawning in cages located offshore, in areas favourable a priori for larval survival, is to be considered as a management measure to minimize the impact of purse-seine fishing on tuna.

  9. Teleology and Defining Sex.

    Science.gov (United States)

    Gamble, Nathan K; Pruski, Michal

    2018-07-01

    Disorders of sexual differentiation lead to what is often referred to as an intersex state. This state has medical, as well as some legal, recognition. Nevertheless, the question remains whether intersex persons occupy a state in between maleness and femaleness or whether they are truly men or women. To answer this question, another important conundrum needs to be first solved: what defines sex? The answer seems rather simple to most people, yet when morphology does not coincide with haplotypes, and genetics might not correlate with physiology the issue becomes more complex. This paper tackles both issues by establishing where the essence of sex is located and by superimposing that framework onto the issue of the intersex. This is achieved through giving due consideration to the biology of sexual development, as well as through the use of a teleological framework of the meaning of sex. Using a range of examples, the paper establishes that sex cannot be pinpointed to one biological variable but is rather determined by how the totality of one's biology is oriented towards biological reproduction. A brief consideration is also given to the way this situation could be comprehended from a Christian understanding of sex and suffering.

  10. "Dermatitis" defined.

    Science.gov (United States)

    Smith, Suzanne M; Nedorost, Susan T

    2010-01-01

    The term "dermatitis" can be defined narrowly or broadly, clinically or histologically. A common and costly condition, dermatitis is underresourced compared to other chronic skin conditions. The lack of a collectively understood definition of dermatitis and its subcategories could be the primary barrier. To investigate how dermatologists define the term "dermatitis" and determine if a consensus on the definition of this term and other related terms exists. A seven-question survey of dermatologists nationwide was conducted. Of respondents (n  =  122), half consider dermatitis to be any inflammation of the skin. Nearly half (47.5%) use the term interchangeably with "eczema." Virtually all (> 96%) endorse the subcategory "atopic" under the terms "dermatitis" and "eczema," but the subcategories "contact," "drug hypersensitivity," and "occupational" are more highly endorsed under the term "dermatitis" than under the term "eczema." Over half (55.7%) personally consider "dermatitis" to have a broad meaning, and even more (62.3%) believe that dermatologists as a whole define the term broadly. There is a lack of consensus among experts in defining dermatitis, eczema, and their related subcategories.

  11. Towards Automatic Testing of Reference Point Based Interactive Methods

    OpenAIRE

    Ojalehto, Vesa; Podkopaev, Dmitry; Miettinen, Kaisa

    2016-01-01

    In order to understand strengths and weaknesses of optimization algorithms, it is important to have access to different types of test problems, well defined performance indicators and analysis tools. Such tools are widely available for testing evolutionary multiobjective optimization algorithms. To our knowledge, there do not exist tools for analyzing the performance of interactive multiobjective optimization methods based on the reference point approach to communicating ...

  12. Physical characteristics of the Japanese in relation to reference man

    International Nuclear Information System (INIS)

    Tanaka, G.-i.; Kawamura, H.; Nomura, E.

    1980-01-01

    Quantitative description of physical and other characteristics of the human body provides basic data for the estimation of radiation risk and the establishment of dose equivalent in line with the International Commission on Radiological Protection recommendations. ICRP Reference Man is based essentially on data reported on populations in rather limited areas in the world, although Committee 2 of ICRP concedes that ''it is neither feasible nor necessary to specify Reference Man as representative of a well-defined population group''. ICRP Reference Man is not necessarily based on the most recent available data. In order to be more realistic and quantitative in dose equivalent estimation in Japan, it is necessary to consider populations which are largely different from those of European and North American countries in physical dimensions and other aspects. Therefore, standard or reference values of mass and size of organs, body and organ content and metabolic parameters of some elements have been studied. These are compared with the values of ICRP Reference Man authorized up to now, with the intention of establishing ''Reference Japanese Man'' and contributing to the improvement of models of man used in radiation protection. Dose equivalent commitment and annual limit of intake have been calculated using the obtained data for the general population in Japan, for some nuclides. (H.K.)

  13. Spatial Updating Strategy Affects the Reference Frame in Path Integration.

    Science.gov (United States)

    He, Qiliang; McNamara, Timothy P

    2018-06-01

    This study investigated how spatial updating strategies affected the selection of reference frames in path integration. Participants walked an outbound path consisting of three successive waypoints in a featureless environment and then pointed to the first waypoint. We manipulated the alignment of participants' final heading at the end of the outbound path with their initial heading to examine the adopted reference frame. We assumed that the initial heading defined the principal reference direction in an allocentric reference frame. In Experiment 1, participants were instructed to use a configural updating strategy and to monitor the shape of the outbound path while they walked it. Pointing performance was best when the final heading was aligned with the initial heading, indicating the use of an allocentric reference frame. In Experiment 2, participants were instructed to use a continuous updating strategy and to keep track of the location of the first waypoint while walking the outbound path. Pointing performance was equivalent regardless of the alignment between the final and the initial headings, indicating the use of an egocentric reference frame. These results confirmed that people could employ different spatial updating strategies in path integration (Wiener, Berthoz, & Wolbers Experimental Brain Research 208(1) 61-71, 2011), and suggested that these strategies could affect the selection of the reference frame for path integration.

  14. Roaming Reference: Reinvigorating Reference through Point of Need Service

    Directory of Open Access Journals (Sweden)

    Kealin M. McCabe

    2011-11-01

    Full Text Available Roaming reference service was pursued as a way to address declining reference statistics. The service was staffed by librarians armed with iPads over a period of six months during the 2010-2011 academic year. Transactional statistics were collected in relation to query type (Research, Facilitative or Technology), location and approach (librarian to patron, patron to librarian or via chat widget). Overall, roaming reference resulted in an additional 228 reference questions, 67% (n=153) of which were research related. Two iterations of the service were implemented, roaming reference as a standalone service (Fall 2010) and roaming reference integrated with traditional reference desk duties (Winter 2011). The results demonstrate that although the Weller Library’s reference transactions are declining annually, they are not disappearing. For a roaming reference service to succeed, it must be a standalone service provided in addition to traditional reference services. The integration of the two reference models (roaming reference and reference desk) resulted in a 56% decline in the total number of roaming reference questions from the previous term. The simple act of roaming has the potential to reinvigorate reference services as a whole, forcing librarians outside their comfort zones, allowing them to reach patrons at their point of need.

  15. Defining the cortical visual systems: "what", "where", and "how"

    Science.gov (United States)

    Creem, S. H.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    2001-01-01

    The visual system historically has been defined as consisting of at least two broad subsystems subserving object and spatial vision. These visual processing streams have been organized both structurally as two distinct pathways in the brain, and functionally for the types of tasks that they mediate. The classic definition by Ungerleider and Mishkin labeled a ventral "what" stream to process object information and a dorsal "where" stream to process spatial information. More recently, Goodale and Milner redefined the two visual systems with a focus on the different ways in which visual information is transformed for different goals. They relabeled the dorsal stream as a "how" system for transforming visual information using an egocentric frame of reference in preparation for direct action. This paper reviews recent research from psychophysics, neurophysiology, neuropsychology and neuroimaging to define the roles of the ventral and dorsal visual processing streams. We discuss a possible solution that allows for both "where" and "how" systems that are functionally and structurally organized within the posterior parietal lobe.

  16. Alignment of in-vessel components by metrology defined adaptive machining

    International Nuclear Information System (INIS)

    Wilson, David; Bernard, Nathanaël; Mariani, Antony

    2015-01-01

    Highlights: • Advanced metrology techniques developed for large volume high density in-vessel surveys. • Virtual alignment process employed to optimize the alignment of 440 blanket modules. • Auto-geometry construct, from survey data, using CAD proximity detection and orientation logic. • HMI developed to relocate blanket modules if customization limits on interfaces are exceeded. • Data export format derived for Catia parametric models, defining customization requirements. - Abstract: The assembly of ITER will involve the precise and accurate alignment of a large number of components and assemblies in areas where access will often be severely constrained and where process efficiency will be critical. One such area is the inside of the vacuum vessel where several thousand components shall be custom machined to provide the alignment references for in-vessel systems. The paper gives an overview of the process that will be employed; to survey the interfaces for approximately 3500 components then define and execute the customization process.

  17. Alignment of in-vessel components by metrology defined adaptive machining

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, David [ITER Organization, Route de Vinon sur Verdon, CS90 046, St Paul-lez-Durance (France); Bernard, Nathanaël [G2Métric, Launaguet 31140 (France); Mariani, Antony [Spatial Alignment Ltd., Witney (United Kingdom)

    2015-10-15

    Highlights: • Advanced metrology techniques developed for large volume high density in-vessel surveys. • Virtual alignment process employed to optimize the alignment of 440 blanket modules. • Auto-geometry construct, from survey data, using CAD proximity detection and orientation logic. • HMI developed to relocate blanket modules if customization limits on interfaces are exceeded. • Data export format derived for Catia parametric models, defining customization requirements. - Abstract: The assembly of ITER will involve the precise and accurate alignment of a large number of components and assemblies in areas where access will often be severely constrained and where process efficiency will be critical. One such area is the inside of the vacuum vessel where several thousand components shall be custom machined to provide the alignment references for in-vessel systems. The paper gives an overview of the process that will be employed; to survey the interfaces for approximately 3500 components then define and execute the customization process.

  18. A cognitive mobile BTS solution with software-defined radioelectric sensing.

    Science.gov (United States)

    Muñoz, Jorge; Alonso, Javier Vales; García, Francisco Quiñoy; Costas, Sergio; Pillado, Marcos; Castaño, Francisco Javier González; Sánchez, Manuel García; Valcarce, Roberto López; Bravo, Cristina López

    2013-02-05

    Private communications inside large vehicles such as ships may be effectively provided using standard cellular systems. In this paper we propose a new solution based on software-defined radio with electromagnetic sensing support. Software-defined radio allows low-cost developments and, potentially, added-value services not available in commercial cellular networks. The platform of reference, OpenBTS, only supports single-channel cells. Our proposal, however, has the ability of changing BTS channel frequency without disrupting ongoing communications. This ability should be mandatory in vehicular environments, where neighbouring cell configurations may change rapidly, so a moving cell must be reconfigured in real-time to avoid interferences. Full details about frequency occupancy sensing and the channel reselection procedure are provided in this paper. Moreover, a procedure for fast terminal detection is proposed. This may be decisive in emergency situations, e.g., if someone falls overboard. Different tests confirm the feasibility of our proposal and its compatibility with commercial GSM terminals.

  19. The use of a priori information in ICA-based techniques for real-time fMRI: an evaluation of static/dynamic and spatial/temporal characteristics

    Directory of Open Access Journals (Sweden)

    Nicola eSoldati

    2013-03-01

    Full Text Available Real-time brain functional MRI (rt-fMRI) allows in-vivo non-invasive monitoring of neural networks. The use of multivariate data-driven analysis methods such as independent component analysis (ICA) offers an attractive trade-off between data interpretability and information extraction, and can be used during both task-based and rest experiments. The purpose of this study was to assess the effectiveness of different ICA-based procedures to monitor in real-time a target IC defined from a functional localizer which also used ICA. Four novel methods were implemented to monitor ongoing brain activity in a sliding window approach. The methods differed in the ways in which a priori information, derived from ICA algorithms, was used to monitor a target independent component (IC). We implemented four different algorithms, all based on ICA. One Back-projection method used ICA to derive static spatial information from the functional localizer, off line, which was then back-projected dynamically during the real-time acquisition. The other three methods used real-time ICA algorithms that dynamically exploited temporal, spatial, or spatial-temporal priors during the real-time acquisition. The methods were evaluated by simulating an rt-fMRI experiment that used real fMRI data. The performance of each method was characterized by the spatial and/or temporal correlation with the target IC component monitored, computation time and intrinsic stochastic variability of the algorithms. In this study the Back-projection method, which could monitor more than one IC of interest, outperformed the other methods. These results are consistent with a functional task that gives stable target ICs over time. The dynamic adaptation possibilities offered by the other ICA methods proposed may offer better performance than the Back-projection in conditions where the functional activation shows higher spatial and/or temporal variability.
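
    Of the four methods, the Back-projection variant is the simplest to picture: a spatial map of the target IC is estimated once from the localizer ICA and then projected onto each new sliding window of data to obtain its ongoing time course. The sketch below is an assumed least-squares formulation of that projection step, not the authors' actual pipeline.

    ```python
    import numpy as np

    def backproject(window, target_map):
        """Time course of a target IC within a sliding window of rt-fMRI data.

        window     : array (n_timepoints, n_voxels), the most recent volumes
        target_map : array (n_voxels,), spatial map of the target IC from the localizer

        Returns the least-squares amplitude of the target map at each time point."""
        return window @ target_map / (target_map @ target_map)

    # Toy usage: a known spatial pattern modulated by a sine wave, plus noise.
    rng = np.random.default_rng(0)
    spatial = rng.normal(size=500)
    true_tc = np.sin(np.linspace(0.0, np.pi, 10))
    data = np.outer(true_tc, spatial) + 0.1 * rng.normal(size=(10, 500))
    print(np.round(backproject(data, spatial), 2))  # closely tracks true_tc
    ```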

  20. A priori estimation of accuracy and of the number of wells to be employed in limiting dilution assays

    Directory of Open Access Journals (Sweden)

    J.G. Chaui-Berlinck

    2000-08-01

    Full Text Available The use of limiting dilution assay (LDA for assessing the frequency of responders in a cell population is a method extensively used by immunologists. A series of studies addressing the statistical method of choice in an LDA have been published. However, none of these studies has addressed the point of how many wells should be employed in a given assay. The objective of this study was to demonstrate how a researcher can predict the number of wells that should be employed in order to obtain results with a given accuracy, and, therefore, to help in choosing a better experimental design to fulfill one's expectations. We present the rationale underlying the expected relative error computation based on simple binomial distributions. A series of simulated in machina experiments were performed to test the validity of the a priori computation of expected errors, thus confirming the predictions. The step-by-step procedure of the relative error estimation is given. We also discuss the constraints under which an LDA must be performed.
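
    A minimal sketch of the kind of a priori computation described, assuming the usual single-hit Poisson model for limiting dilution and a simple delta-method propagation of the binomial sampling error of the negative-well fraction (the exact formulation in the paper may differ):

    ```python
    import math

    def expected_relative_error(freq, cells_per_well, n_wells):
        """A priori relative error of the LDA frequency estimate.

        freq           : assumed true frequency of responding cells (e.g. 1/5000)
        cells_per_well : cells plated per well at the dilution of interest
        n_wells        : number of replicate wells at that dilution"""
        p_neg = math.exp(-freq * cells_per_well)          # single-hit Poisson model
        se_p = math.sqrt(p_neg * (1 - p_neg) / n_wells)   # binomial error of the negative fraction
        se_f = se_p / (cells_per_well * p_neg)            # delta method, f = -ln(p_neg)/cells_per_well
        return se_f / freq

    def wells_needed(freq, cells_per_well, target_rel_error):
        """Smallest number of wells giving at most the target relative error."""
        w = 1
        while expected_relative_error(freq, cells_per_well, w) > target_rel_error:
            w += 1
        return w

    print(expected_relative_error(1 / 5000, 5000, 24))  # about 0.27 with 24 wells
    print(wells_needed(1 / 5000, 5000, 0.15))           # wells required for a 15% relative error
    ```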

  1. Defining initiating events for purposes of probabilistic safety assessment

    International Nuclear Information System (INIS)

    1993-09-01

    This document is primarily directed towards technical staff involved in the performance or review of plant specific Probabilistic Safety Assessment (PSA). It highlights different approaches and provides typical examples useful for defining the Initiating Events (IE). The document also includes the generic initiating event database, containing about 300 records taken from about 30 plant specific PSAs. In addition to its usefulness during the actual performance of a PSA, the generic IE database is of the utmost importance for peer reviews of PSAs, such as the IAEA's International Peer Review Service (IPERS) where reference to studies on similar NPPs is needed. 60 refs, figs and tabs

  2. The role of internal reference prices in consumers' willingness to pay judgments: Thaler's Beer Pricing Task revisited.

    Science.gov (United States)

    Ranyard, R; Charlton, J P; Williamson, J

    2001-02-01

    Alternative reference prices, either displayed in the environment (external) or recalled from memory (internal) are known to influence consumer judgments and decisions. In one line of previous research, internal reference prices have been defined in terms of general price expectations. However, Thaler (Marketing Science 4 (1985) 199; Journal of Behavioral Decision Making 12 (1999) 183) defined them as fair prices expected from specific types of seller. Using a Beer Pricing Task, he found that seller context had a substantial effect on willingness to pay, and concluded that this was due to specific internal reference prices evoked by specific contexts. In a think aloud study using the same task (N = 48), we found only a marginal effect of seller context. In a second study using the Beer Pricing Task and seven analogous ones (N = 144), general internal reference prices were estimated by asking people what they normally paid for various commodities. Both general internal reference prices and seller context influenced willingness to pay, although the effect of the latter was again rather small. We conclude that general internal reference prices have a greater impact in these scenarios than specific ones, because of the lower cognitive load involved in their storage and retrieval.

  3. Chemistry of reference waters of the crystalline basement of Northern Switzerland for safety assessment studies

    International Nuclear Information System (INIS)

    Pearson, F.J. Jr.; Scholtis, A.

    1993-08-01

    The chemistry of groundwater in formations being considered as host rocks for nuclear waste repositories must be known to assess the performance of those repositories, and as media for laboratory experiments. Two potential repository siting areas in the crystalline basement of northern Switzerland are being assessed. This report gives the chemistry of water in both areas for reference use in this assessment. The western area is in the region defined by the Kaisten, Leuggern, Boettstein, and Zurzach boreholes. The western reference water is based on samples from the Leuggern, Boettstein, and Zurzach boreholes. Kaisten water is of higher salinity (1.3 g/l). The concentration ranges of the reference water include Kaisten values, however. High quality samples and analyses, particularly from long term sampling at Zurzach and Leuggern, define the concentration ranges of many trace elements. The definition of this water assumes saturation with respect to calcite, baryte, fluorites, chalcedony, and kaolinite. The reference pe is based on the assumption that dissolved iron concentrations are controlled by the solubility of the mineral goethite, and is consistent with other redox indicators such as the measured Pt-electrode potential and the ratio of dissolved As(V) to As(III). The eastern area is characterized by the Siblingen boreholes. The eastern reference water is a Na-HCO 3 -SO 4 -(Cl) type with a total dissolved solids content of about 0.5 g/l. Only three samples taken during borehole drilling are available to define this water, so it can be specified in less detail and with less precision than the western water. Its definition assumes saturation with respect to calcite, baryte, and fluorites. The samples permit only a broad definition of its oxidation potential and content of redox-sensitive metals such as Fe, As, Mn, and U. Trace element data for the most part are lacking. (author) figs., tabs., 28 refs

  4. D2.3 - ENCOURAGE platform reference architecture

    DEFF Research Database (Denmark)

    Ferreira, Luis Lino; Pinho, Luis Miguel; Albano, Michele

    2012-01-01

    documents produced in work package WP2, the framework for the detailed specification activities to be developed in the technical work packages of the project (WP3-WP6). In order to provide the required background for the ENCOURAGE platform reference, the document describes the most relevant standards...... and functionalities of the modules of the architecture logical blocks. Furthermore, the document defines the main interface standards to be used for interoperability. These functionalities and interfaces will then be specified in detail in work packages WP3-WP6. Finally, the document provides the mapping...

  5. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
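
    As an illustration of the type of calculation involved (set up here with the normal CDF rather than the inverse NORMINV form used in the paper, and without reproducing the 4.4% criterion derived from the IFCC minimum of 120 reference individuals), one can compute the fraction of a Gaussian reference population pushed outside the original central 95% limits by a given combination of bias and imprecision:

    ```python
    from scipy.stats import norm

    def fraction_outside(bias, analytical_sd, z=1.96):
        """Fraction of a Gaussian reference population falling outside the original
        central 95% reference limits when results carry analytical bias and added
        analytical imprecision (both expressed as multiples of the reference SD)."""
        total_sd = (1.0 + analytical_sd ** 2) ** 0.5   # combined biological + analytical spread
        below = norm.cdf((-z - bias) / total_sd)       # probability below the lower limit
        above = 1.0 - norm.cdf((z - bias) / total_sd)  # probability above the upper limit
        return below + above

    print(fraction_outside(0.0, 0.0))    # ~0.05 with no analytical error
    print(fraction_outside(0.25, 0.5))   # grows as bias and imprecision increase
    ```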

  6. On the suitability of ISO 16717-1 reference spectra for rating airborne sound insulation.

    Science.gov (United States)

    Mašović, Draško B; Pavlović, Dragana S Šumarac; Mijić, Miomir M

    2013-11-01

    A standard proposal for rating airborne sound insulation in buildings [ISO 16717-1 (2012)] defines the reference noise spectra. Since their shapes influence the calculated values of single-number descriptors, reference spectra should approximate well typical noise spectra in buildings. There is, however, very little data in the existing literature on a typical noise spectrum in dwellings. A spectral analysis of common noise sources in dwellings is presented in this paper, as a result of an extensive monitoring of various noisy household activities. Apart from music with strong bass content, the proposed "living" reference spectrum overestimates noise levels at low frequencies.

  7. Standard Specification for Physical Characteristics of Nonconcentrator Terrestrial Photovoltaic Reference Cells

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This specification describes the physical requirements for primary and secondary terrestrial nonconcentrator photovoltaic reference cells. A reference cell is defined as a device that meets the requirements of this specification and is calibrated in accordance with Test Method E1125 or Test Method E1362. 1.2 Reference cells are used in the determination of the electrical performance of photovoltaic devices, as stated in Test Methods E948 and E1036. 1.3 Two reference cell physical specifications are described: 1.3.1 Small-Cell Package Design—A small, durable package with a low thermal mass, wide optical field-of-view, and standardized dimensions intended for photovoltaic devices up to 20 by 20 mm, and 1.3.2 Module-Package Design—A package intended to simulate the optical and thermal properties of a photovoltaic module design, but electric connections are made to only one photovoltaic cell in order to eliminate problems with calibrating series and parallel connections of cells. Physical dimensions ...

  8. A formalism for scattering of complex composite structures. II. Distributed reference points

    DEFF Research Database (Denmark)

    Svaneborg, Carsten; Pedersen, Jan Skov

    2012-01-01

    Recently we developed a formalism for the scattering from linear and acyclic branched structures built of mutually non-interacting sub-units [C. Svaneborg and J. S. Pedersen, J. Chem. Phys. 136, 104105 (2012)]. We assumed that each sub-unit has reference points associated with it. These are well defined positions where sub-units can be linked together. In the present paper, we generalize the formalism to the case where each reference point can represent a distribution of potential link positions. We also present a generalized diagrammatic representation of the formalism. Scattering expressions required

  9. Opportunities and challenges in conducting systematic reviews to support development of nutrient reference values: vitamin A as an example

    Science.gov (United States)

    Nutrient reference values have significant public health and policy implications. Given the importance of defining reliable nutrient reference values, there is a need for an explicit, objective, and transparent process to set these values. The Tufts Medical Center Evidence-based Practice Center asse...

  10. Analysis of reference X radiations energies adjusted for the same half-value layer

    International Nuclear Information System (INIS)

    Figueiredo, Marcus Tadeu Tanuri de; Baptista Neto, Annibal Theotonio; Silva, Teogenes Augusto da; Oliveira, Paulo Marcio Campos de

    2011-01-01

    The International Organization for Standardization (ISO) defined the reference radiations for calibration and testing in x and gamma fields. ISO 4037-1 establishes that if the first and the second half-value layers (HVLs) agree within 5% for two x-ray beams, then these two beams shall be considered the same. In this study, reference radiations with the same HVLs that were obtained through adjustments of the total filtration or the tube voltage were compared in terms of spectra and beam parameters. (author)
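
    The quoted ISO 4037-1 criterion reduces to a simple check; a small sketch, assuming the 5% agreement is taken as a relative difference applied to both the first and the second HVL, is:

    ```python
    def same_reference_radiation(hvl1_a, hvl2_a, hvl1_b, hvl2_b, tol=0.05):
        """Two x-ray beams count as the same reference radiation if their first and
        second half-value layers each agree within the tolerance (5% by default)."""
        def within(a, b):
            return abs(a - b) / max(a, b) <= tol
        return within(hvl1_a, hvl1_b) and within(hvl2_a, hvl2_b)

    # Example: first and second HVLs (mm Cu) of two beams obtained with different
    # filtration/voltage combinations (illustrative numbers).
    print(same_reference_radiation(0.24, 0.29, 0.25, 0.30))  # True: both HVLs agree within 5%
    ```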

  11. Reference Ranges in [(99m)Tc]Mercaptoacetyltriglycerine Renography

    DEFF Research Database (Denmark)

    Rewers, Kate I; Hvidsten, Svend; Gerke, Oke

    2015-01-01

    PURPOSE: The purpose of the study was to define reference ranges for quantitative parameters in [(99m)Tc]mercaptoacetyltriglycerine ([(99m)Tc]MAG3) renography to assist interpretation in a semi-automated (Xeleris, GE) compared to a manual (Picker, Odyssey) software package. PROCEDURES: Forty-eight subjects approved for renal donation were evaluated with [(99m)Tc]MAG3 renography using both the Xeleris and the Picker software. RESULTS: Reference ranges for the two software packages were comparable regarding the relative function of the two kidneys (the split function, SF) and the residual activities (RA). The time to peak whole-kidney activities (T max whole-kidney) was more dependent on the type of software. Using Bland-Altman limits, we found good and acceptable agreement between the two methods. CONCLUSIONS: We found good correlation between renography results using the Xeleris and Picker software...
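
    The agreement statement is based on Bland-Altman limits; a minimal sketch of that computation on paired values from the two packages (the numbers below are invented) is:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Mean difference (bias) and 95% limits of agreement for paired measurements."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, bias - half_width, bias + half_width

    # Split function (%) of the same kidney from the two software packages.
    xeleris = [49.8, 51.2, 47.9, 50.5, 52.1]
    picker = [50.1, 50.8, 48.4, 50.0, 51.7]
    print(bland_altman(xeleris, picker))  # (bias, lower limit, upper limit)
    ```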

  12. International Geomagnetic Reference Field: the 12th generation

    OpenAIRE

    Thébault, Erwan; Finlay, Christopher; Beggan, Ciarán; Alken, Patrick; Aubert, Julien; Barrois, Olivier; Bertrand, François; Bondar, Tatiana; Boness, Axel; Brocco, Laura; Canet, Elisabeth; Chambodut, Aude; Chulliat, Arnaud; Coïsson, Pierdavide; Civet, François

    2015-01-01

    International audience; The 12th generation of the International Geomagnetic Reference Field (IGRF) was adopted in December 2014 by the Working Group V-MOD appointed by the International Association of Geomagnetism and Aeronomy (IAGA). It updates the previous IGRF generation with a definitive main field model for epoch 2010.0, a main field model for epoch 2015.0, and a linear annual predictive secular variation model for 2015.0-2020.0. Here, we present the equations defining the IGRF model, p...

  13. Frames of Reference and Some of its Applications

    OpenAIRE

    Bel, Ll.

    1998-01-01

    We define a Frame of reference as a concept with two ingredients: a meta-rigid motion, which is a generalization of a Born motion, and a chorodesic synchronization, which is an adapted foliation. At the end of the line we uncover a low-level 3-dimensional geometry with constant curvature and a corresponding coordinated proper-time scale. We discuss all these aspects both from the geometrical point of view and from the point of view of some of the physical applications derived from them.

  14. Reference Dose Rates for Fluoroscopy Guided Interventions

    International Nuclear Information System (INIS)

    Geleijns, J.; Broerse, J.J.; Hummel, W.A.; Schalij, M.J.; Schultze Kool, L.J.; Teeuwisse, W.; Zoetelief, J.

    1998-01-01

    The wide diversity of fluoroscopy guided interventions which have become available in recent years has improved patient care. They are being performed in increasing numbers, particularly at departments of cardiology and radiology. Some procedures are very complex and require extended fluoroscopy times, i.e. longer than 30 min, and radiation exposure of patient and medical staff is in some cases rather high. The occurrence of radiation-induced skin injuries on patients has shown that radiation protection for fluoroscopy guided interventions should not only be focused on stochastic effects, i.e. tumour induction and hereditary risks, but also on potential deterministic effects. Reference dose levels are introduced by the Council of the European Communities as an instrument to achieve optimisation of radiation protection in radiology. Reference levels in conventional diagnostic radiology are usually expressed as entrance skin dose or dose-area product. It is not possible to define a standard procedure for complex interventions due to the large inter-patient variations with regard to the complexity of specific interventional procedures. Consequently, it is not realistic to establish a reference skin dose or dose-area product for complex fluoroscopy guided interventions. As an alternative, reference values for fluoroscopy guided interventions can be expressed as the entrance dose rates on a homogeneous phantom and on the image intensifier. A protocol has been developed and applied during a nationwide survey of fluoroscopic dose rate during catheter ablations. From this survey, reference entrance dose rates of 30 mGy·min⁻¹ on a polymethylmethacrylate (PMMA) phantom with a thickness of 21 cm and of 0.8 μGy·s⁻¹ on the image intensifier have been derived. (author)

  15. Guidelines for defining and documenting data on costs of possible environmental protection measures

    Energy Technology Data Exchange (ETDEWEB)

    Marlowe, I.; King, K.; Boyd, R.; Bouscaren, R.; Pacyna, J. [AEA Technology Environment, Harwell (United Kingdom)

    1999-07-01

    The Guidelines are intended to promote good practice in the documentation and use of data on the costs of possible environmental protection measures in the context of international data comparisons. The minimum information needed to describe the cost of an environmental protection measure is: details of the pollution source; details of the environmental protection measure and its performance characteristics; how costs are defined; the year to which data apply; indications of data uncertainty; how pollutants are defined; and reference to data sources. Guidelines are given for these seven items. These are followed by descriptions of various methods of data processing - dealing with information; calculating annual costs; discount/interest rates; and additional issues relating to the implementation of cost data. 16 refs., 5 tabs., 6 apps.

  16. No-reference image quality assessment for horizontal-path imaging scenarios

    Science.gov (United States)

    Rios, Carlos; Gladysz, Szymon

    2013-05-01

    There exist several image-enhancement algorithms and tasks associated with imaging through turbulence that depend on defining the quality of an image. Examples include: "lucky imaging", choosing the width of the inverse filter for image reconstruction, or stopping iterative deconvolution. We collected a number of image quality metrics found in the literature. Particularly interesting are the blind, "no-reference" metrics. We discuss ways of evaluating the usefulness of these metrics, even when a fully objective comparison is impossible because of the lack of a reference image. Metrics are tested on simulated and real data. Field data comes from experiments performed by the NATO SET 165 research group over a 7 km distance in Dayton, Ohio.
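
    As one concrete example of a blind metric of the kind collected, an image can be scored by its mean gradient magnitude, a common no-reference sharpness proxy used for tasks such as lucky-imaging frame selection (this particular metric is illustrative and not necessarily among those evaluated by the authors):

    ```python
    import numpy as np

    def sharpness(img):
        """No-reference sharpness score: mean gradient magnitude of a 2-D image.
        Higher values indicate a crisper image."""
        gy, gx = np.gradient(img.astype(float))
        return float(np.hypot(gx, gy).mean())

    # Toy usage: a random texture scores higher than a locally averaged (blurred) copy.
    rng = np.random.default_rng(0)
    sharp = rng.random((64, 64))
    blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1) + np.roll(sharp, (1, 1), (0, 1))) / 4
    print(sharpness(sharp) > sharpness(blurred))  # True
    ```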

  17. Intra- and extra-articular planes of reference for use in total hip arthroplasty: a preliminary study.

    Science.gov (United States)

    Hausselle, Jerome; Moreau, Pierre Etienne; Wessely, Loic; de Thomasson, Emmanuel; Assi, Ayman; Parratte, Sebastien; Essig, Jerome; Skalli, Wafa

    2012-08-01

    Acetabular component malalignment in total hip arthroplasty can lead to potential complications such as dislocation, component impingement and excessive wear. Computer-assisted orthopaedic surgery systems generally use the anterior pelvic plane (APP). Our aim was to investigate the reliability of anatomical landmarks accessible during surgery and to define new potential planes of reference. Three types of palpations were performed: virtual, on dry bones and on two cadaveric specimens. Four landmarks were selected, the reproducibility of their positioning ranging from 0.9 to 2.3 mm. We then defined five planes and tested them during palpations on two cadaveric specimens. Two planes produced a mean orientation error of 5.0° [standard deviation (SD 3.3°)] and 5.6° (SD 2.7°). Even if further studies are needed to test the reliability of such planes on a larger scale in vivo during surgery, these results demonstrated the feasibility of defining a new plane of reference as an alternative to the APP.

  18. Reference ranges for blood concentrations of eosinophils and monocytes during the neonatal period defined from over 63 000 records in a multihospital health-care system.

    Science.gov (United States)

    Christensen, R D; Jensen, J; Maheshwari, A; Henry, E

    2010-08-01

    Blood concentrations of eosinophils and monocytes are part of the complete blood count. Reference ranges for these concentrations during the neonatal period, established by very large sample sizes and modern methods, are needed for identifying abnormally low or high values. We constructed reference ranges for eosinophils per μl and monocytes per μl among neonates of 22 to 42 weeks of gestation, on the day of birth, and also during 28 days after birth. Data were obtained from archived electronic records over an eight and one-half-year period in a multihospital health-care system. In keeping with the reference range concept, values were excluded from neonates with a diagnosis of infection or necrotizing enterocolitis (NEC). Eosinophils and monocytes per μl of blood were electronically retrieved from 96 162 records, of which 63 371 that lacked a diagnosis of infection or NEC were included in this reference range report. The mean value for eosinophils per μl on the day of birth increased linearly between 22 and 42 weeks of gestation, as did the 5 and 95% values. The reference range at 40 weeks was 140 to 1300 μl⁻¹ (mean 550 μl⁻¹). Similarly, the mean value for monocytes increased linearly over this interval, with a reference range at 40 weeks of 300 to 3300 μl⁻¹ (mean 1400 μl⁻¹). Over the first 4 weeks after birth, no appreciable change was observed in the 5% limit and mean eosinophil count, with a slight increase in the 95% limit in week 4. A slight increase in monocyte count was observed during the first 2 weeks after birth. The results of this analysis describe reference ranges for blood concentrations of eosinophils and monocytes during the neonatal period. Additional study is needed for determining the relevance of values falling outside the reference range.
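
    A table of this kind is essentially a percentile computation per gestational-age stratum after excluding infected/NEC cases. A minimal sketch with hypothetical column names follows (the records below are invented, not data from the study):

    ```python
    import pandas as pd

    def reference_ranges(df, value_col, group_col="gestational_week"):
        """5th percentile, mean and 95th percentile per stratum, excluding neonates
        with a diagnosis of infection or NEC, as in the abstract."""
        clean = df[~df["infection"] & ~df["nec"]]
        return clean.groupby(group_col)[value_col].agg(
            p5=lambda s: s.quantile(0.05),
            mean="mean",
            p95=lambda s: s.quantile(0.95),
        )

    records = pd.DataFrame({
        "gestational_week": [40, 40, 40, 40, 34, 34, 34],
        "eosinophils_per_ul": [150, 520, 900, 1250, 90, 300, 700],
        "infection": [False, False, False, True, False, False, False],
        "nec": [False] * 7,
    })
    print(reference_ranges(records, "eosinophils_per_ul"))
    ```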

  19. A Method for A Priori Implementation Effort Estimation for Hardware Design

    DEFF Research Database (Denmark)

    Abildgren, Rasmus; Diguet, Jean-Philippe; Gogniat, Guy

    2008-01-01

    This paper presents a metric-based approach for estimating the hardware implementation effort (in terms of time) for an application in relation to the number of independent paths of its algorithms. We define a metric which exploits the relation between the number of independent paths in an algori...... facilitating designers' and managers' needs for estimating the time-to-market schedule.
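
    The metric relates effort to the number of independent paths through an algorithm's control-flow graph; for a single connected graph this count is the classical cyclomatic number E - N + 2. A hedged sketch with a made-up linear calibration (the paper's fitted coefficients are not reproduced here):

    ```python
    def independent_paths(edges):
        """Cyclomatic number of a connected control-flow graph: E - N + 2."""
        nodes = {n for edge in edges for n in edge}
        return len(edges) - len(nodes) + 2

    def effort_hours(edges, hours_per_path=12.0, overhead=40.0):
        """Toy linear effort model; the two constants are illustrative placeholders."""
        return overhead + hours_per_path * independent_paths(edges)

    # Control-flow graph of a small kernel with one branch and one loop back-edge.
    cfg = [("entry", "a"), ("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"), ("d", "a"), ("d", "exit")]
    print(independent_paths(cfg), effort_hours(cfg))  # 3 independent paths
    ```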

  20. Cultural sensitivity in public health: defined and demystified.

    Science.gov (United States)

    Resnicow, K; Baranowski, T; Ahluwalia, J S; Braithwaite, R L

    1999-01-01

    There is consensus that health promotion programs should be culturally sensitive (CS). Yet, despite the ubiquitous nature of CS within public health research and practice, there has been surprisingly little attention given to defining CS or delineating a framework for developing culturally sensitive programs and practitioners. This paper describes a model for understanding CS from a public health perspective; describes a process for applying this model in the development of health promotion and disease prevention interventions; and highlights research priorities. Cultural sensitivity is defined by two dimensions: surface and deep structures. Surface structure involves matching intervention materials and messages to observable, "superficial" characteristics of a target population. This may involve using people, places, language, music, food, locations, and clothing familiar to, and preferred by, the target audience. Surface structure refers to how well interventions fit within a specific culture. Deep structure involves incorporating the cultural, social, historical, environmental and psychological forces that influence the target health behavior in the proposed target population. Whereas surface structure generally increases the "receptivity" or "acceptance" of messages, deep structure conveys salience. Techniques, borrowed from social marketing and health communication theory, for developing culturally sensitive interventions are described. Research is needed to determine the effectiveness of culturally sensitive programs.

  1. A quasiparticle-based multi-reference coupled-cluster method.

    Science.gov (United States)

    Rolik, Zoltán; Kállay, Mihály

    2014-10-07

    The purpose of this paper is to introduce a quasiparticle-based multi-reference coupled-cluster (MRCC) approach. The quasiparticles are introduced via a unitary transformation which allows us to represent a complete active space reference function and other elements of an orthonormal multi-reference (MR) basis in a determinant-like form. The quasiparticle creation and annihilation operators satisfy the fermion anti-commutation relations. On the basis of these quasiparticles, a generalization of the normal-ordered operator products for the MR case can be introduced as an alternative to the approach of Mukherjee and Kutzelnigg [Recent Prog. Many-Body Theor. 4, 127 (1995); Mukherjee and Kutzelnigg, J. Chem. Phys. 107, 432 (1997)]. Based on the new normal ordering any quasiparticle-based theory can be formulated using the well-known diagram techniques. Beyond the general quasiparticle framework we also present a possible realization of the unitary transformation. The suggested transformation has an exponential form where the parameters, holding exclusively active indices, are defined in a form similar to the wave operator of the unitary coupled-cluster approach. The definition of our quasiparticle-based MRCC approach strictly follows the form of the single-reference coupled-cluster method and retains several of its beneficial properties. Test results for small systems are presented using a pilot implementation of the new approach and compared to those obtained by other MR methods.

  2. Stability of the Associations between Early Life Risk Indicators and Adolescent Overweight over the Evolving Obesity Epidemic

    DEFF Research Database (Denmark)

    Graversen, Lise; Sørensen, Thorkild I A; Petersen, Liselotte

    2014-01-01

    BACKGROUND: Pre- and perinatal factors and preschool body size may help identify children developing overweight, but these factors might have changed during the development of the obesity epidemic. OBJECTIVE: We aimed to assess the associations between early life risk indicators and overweight at the age of 9 and 15 years at different stages of the obesity epidemic. METHODS: We used two population-based Northern Finland Birth Cohorts including 4111 children born in 1966 (NFBC1966) and 5414 children born in 1985-1986 (NFBC1986). In both cohorts, we used the same a priori defined prenatal factors, maternal body mass index (BMI), birth weight, infant weight (age 5 months and 1 year), and preschool BMI (age 2-5 years). We used internal references in early childhood to define percentiles of body size (90) and generalized linear models to study the association with overweight

  3. Comparing uranyl sorption complexes on soil and reference clays

    International Nuclear Information System (INIS)

    Chisholm-Brause, C.J.; Berg, J.M.; Conradson, S.D.; Morris, D.E.; McKinley, J.P.; Zachara, J.M.

    1993-01-01

    Clay minerals and other components in natural soils may play a key role in limiting the mobility of uranium in the environment through the formation of sorption complexes. Reference clays are frequently used as models to study sorption processes because they have well-known chemical and physical properties, but they may differ chemically and morphologically from clays derived from natural soils. Therefore, inferences based on reference clay data have been questioned. The authors have used luminescence and x-ray absorption spectroscopies to characterize the sorption complexes of aqueous uranyl (UO 2 2+ ) species on two soil smectites from the Kenoma and Ringold formations, and compared these results to those obtained on reference smectite clays. The pH dependence of uptake suggests that the ratio of sorption on amphoteric edge sites is greater for the soil smectites than for reference clays such as Wyoming montmorillonite (SWy-1). The luminescence spectra for uranyl sorbed to the soil clays are very similar to those for uranyl sorbed principally to the edge sites of SWy-1. This observation supports the solution data suggesting that adsorption to amphoteric sites is a more important mechanism for soil clays. However, the spectral data indicate that the sorption complexes on natural and reference clays are quite similar. Furthermore, as with the reference clays, the authors have found that the chemistry of the solution plays a greater role in defining the sorption complex than does the clay matrix. Thus, if differences in surface properties are adequately taken into account, the reference clays may serve as useful analogs for soil clays in investigations of metal-ion sorption

  4. AN ASSESSMENT OF CITIZEN CONTRIBUTED GROUND REFERENCE DATA FOR LAND COVER MAP ACCURACY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    G. M. Foody

    2015-08-01

    It is now widely accepted that an accuracy assessment should be part of a thematic mapping programme. Authoritative good or best practices for accuracy assessment have been defined but are often impractical to implement. Key reasons for this situation are linked to the ground reference data used in the accuracy assessment. Typically, it is a challenge to acquire a large sample of high-quality reference cases in accordance with the sampling designs specified as conforming to good practice, and the data collected are normally imperfect to some degree, limiting their value to an accuracy assessment which implicitly assumes the use of a gold-standard reference. Citizen sensors have great potential to aid aspects of accuracy assessment. In particular, they may be able to act as a source of ground reference data that may, for example, reduce sample size problems, but concerns with data quality remain. The relative strengths and limitations of citizen-contributed data for accuracy assessment are reviewed in the context of the authoritative good practices defined for studies of land cover by remote sensing. The article highlights some of the ways that citizen-contributed data have been used in accuracy assessment as well as some of the problems that require further attention, and indicates some of the potential ways forward in the future.
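
    As a rough illustration of the underlying bookkeeping (not code from the cited article), the core of such an accuracy assessment is a confusion matrix of map labels against reference labels. The sketch below assumes the citizen-contributed reference labels are simply available as a list; all labels and class names are hypothetical placeholders.

      # Illustrative sketch: overall accuracy from a confusion matrix of map labels
      # vs. (citizen-contributed) reference labels. Label lists are hypothetical.
      from collections import Counter

      def confusion_matrix(map_labels, ref_labels, classes):
          counts = Counter(zip(map_labels, ref_labels))
          return [[counts[(m, r)] for r in classes] for m in classes]

      map_labels = ["forest", "water", "urban", "forest", "water"]   # hypothetical map data
      ref_labels = ["forest", "water", "forest", "forest", "water"]  # hypothetical reference data
      classes = ["forest", "water", "urban"]

      cm = confusion_matrix(map_labels, ref_labels, classes)
      total = sum(sum(row) for row in cm)
      overall_accuracy = sum(cm[i][i] for i in range(len(classes))) / total
      print(overall_accuracy)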

  5. A method to obtain reference images for evaluation of ultrasonic tissue characterization techniques

    DEFF Research Database (Denmark)

    Jensen, M.S.; Wilhjelm, Jens E.; Sahl, B.

    2002-01-01

    … of the macroscopic photograph, due to the histological preparation process. The histological information was "mapped back" into the format of the ultrasound images in the following way: on the macroscopic images, outlines were drawn manually which defined the border of the tissue. These outlines were superimposed … of the various tissue types. Specifically, the macroscopic image revealed the borders between the different tissues, while the histological image identified the four tissue types. A set of 12 reference images based on modified macroscopic outlines was created. The overlap between the ultrasound images … and the macroscopic images, which are the geometrical basis for the final reference images, was between 77% and 93%. A set of 12 reference images spaced 2.5 mm, identifying the spatial location of four different tissue types in porcine muscle, has been created. With the reference images, it is possible to quantitatively …

  6. Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Lux, James P.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (VHSIC Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant with NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform-specific hardware such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc., and the interconnections among

  7. Defining operational taxonomic units using DNA barcode data.

    Science.gov (United States)

    Blaxter, Mark; Mann, Jenna; Chapman, Tom; Thomas, Fran; Whitton, Claire; Floyd, Robin; Abebe, Eyualem

    2005-10-29

    The scale of diversity of life on this planet is a significant challenge for any scientific programme hoping to produce a complete catalogue, whatever means is used. For DNA barcoding studies, this difficulty is compounded by the realization that any chosen barcode sequence is not the gene 'for' speciation and that taxa have evolutionary histories. How are we to disentangle the confounding effects of reticulate population genetic processes? Using the DNA barcode data from meiofaunal surveys, here we discuss the benefits of treating the taxa defined by barcodes without reference to their correspondence to 'species', and suggest that using this non-idealist approach facilitates access to taxon groups that are not accessible to other methods of enumeration and classification. Major issues remain, in particular the methodologies for taxon discrimination in DNA barcode data.
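
    A minimal sketch of the idea behind barcode-defined taxa (our illustration, not the authors' code): sequences are grouped into molecular operational taxonomic units by clustering pairwise distances under a fixed cutoff, without asserting that the clusters correspond to species. The distance measure (a simple p-distance) and the cutoff value below are placeholders.

      # Illustrative sketch: group aligned DNA barcode sequences into OTUs by
      # single-linkage clustering of pairwise distances under a cutoff.
      def p_distance(a, b):
          assert len(a) == len(b)
          return sum(x != y for x, y in zip(a, b)) / len(a)

      def motu_clusters(seqs, cutoff=0.03):
          parent = list(range(len(seqs)))
          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]
                  i = parent[i]
              return i
          for i in range(len(seqs)):
              for j in range(i + 1, len(seqs)):
                  if p_distance(seqs[i], seqs[j]) <= cutoff:
                      parent[find(i)] = find(j)
          clusters = {}
          for i in range(len(seqs)):
              clusters.setdefault(find(i), []).append(i)
          return list(clusters.values())

      print(motu_clusters(["ACGTACGT", "ACGTACGA", "TTGTACCA"], cutoff=0.2))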

  8. The Common European Framework of Reference for Languages: A challenge for applied linguistics

    NARCIS (Netherlands)

    Hulstijn, J.H.

    2014-01-01

    The Common European Framework of Reference for Languages (CEFR, Council of Europe, 2001) currently functions as an instrument for educational policy and practice. The view of language proficiency on which it is based and the six proficiency levels it defines lack empirical support from language-use

  9. Can Single-Reference Coupled Cluster Theory Describe Static Correlation?

    Science.gov (United States)

    Bulik, Ireneusz W; Henderson, Thomas M; Scuseria, Gustavo E

    2015-07-14

    While restricted single-reference coupled cluster theory truncated to singles and doubles (CCSD) provides very accurate results for weakly correlated systems, it usually fails in the presence of static or strong correlation. This failure is generally attributed to the qualitative breakdown of the reference, and can accordingly be corrected by using a multideterminant reference, including higher-body cluster operators in the ansatz, or allowing symmetry breaking in the reference. None of these solutions are ideal; multireference coupled cluster is not black box, including higher-body cluster operators is computationally demanding, and allowing symmetry breaking leads to the loss of good quantum numbers. It has long been recognized that quasidegeneracies can instead be treated by modifying the coupled cluster ansatz. The recently introduced pair coupled cluster doubles (pCCD) approach is one such example which avoids catastrophic failures and accurately models strong correlations in a symmetry-adapted framework. Here, we generalize pCCD to a singlet-paired coupled cluster model (CCD0) intermediate between coupled cluster doubles and pCCD, yielding a method that possesses the invariances of the former and much of the stability of the latter. Moreover, CCD0 retains the full structure of coupled cluster theory, including a fermionic wave function, antisymmetric cluster amplitudes, and well-defined response equations and density matrices.
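
    For orientation only, the standard coupled cluster doubles ansatz is reproduced below in textbook form (not an equation quoted from this paper); pCCD and CCD0 can then be read as restrictions on which doubles amplitudes are retained (pair amplitudes only, or the singlet-paired channel, respectively).

      |\Psi_{\mathrm{CCD}}\rangle = e^{\hat{T}_2}\,|\Phi_0\rangle,
      \qquad
      \hat{T}_2 = \tfrac{1}{4}\sum_{ij,ab} t_{ij}^{ab}\,
      \hat{a}_a^{\dagger}\hat{a}_b^{\dagger}\hat{a}_j\hat{a}_i .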

  10. Numerical aspects of drift kinetic turbulence: Ill-posedness, regularization and a priori estimates of sub-grid-scale terms

    KAUST Repository

    Samtaney, Ravi

    2012-01-01

    We present a numerical method based on an Eulerian approach to solve the Vlasov-Poisson system for 4D drift kinetic turbulence. Our numerical approach uses a conservative formulation with high-order (fourth and higher) evaluation of the numerical fluxes coupled with a fourth-order accurate Poisson solver. The fluxes are computed using a low-dissipation high-order upwind differencing method or a tuned high-resolution finite difference method with no numerical dissipation. Numerical results are presented for the case of imposed ion temperature and density gradients. Different forms of controlled regularization to achieve a well-posed system are used to obtain convergent resolved simulations. The regularization of the equations is achieved by means of a simple collisional model, by inclusion of an ad-hoc hyperviscosity or artificial viscosity term or by implicit dissipation in upwind schemes. Comparisons between the various methods and regularizations are presented. We apply a filtering formalism to the Vlasov equation and derive sub-grid-scale (SGS) terms analogous to the Reynolds stress terms in hydrodynamic turbulence. We present a priori quantifications of these SGS terms in resolved simulations of drift-kinetic turbulence by applying a sharp filter. © 2012 IOP Publishing Ltd.

  11. Numerical aspects of drift kinetic turbulence: ill-posedness, regularization and a priori estimates of sub-grid-scale terms

    International Nuclear Information System (INIS)

    Samtaney, Ravi

    2012-01-01

    We present a numerical method based on an Eulerian approach to solve the Vlasov-Poisson system for 4D drift kinetic turbulence. Our numerical approach uses a conservative formulation with high-order (fourth and higher) evaluation of the numerical fluxes coupled with a fourth-order accurate Poisson solver. The fluxes are computed using a low-dissipation high-order upwind differencing method or a tuned high-resolution finite difference method with no numerical dissipation. Numerical results are presented for the case of imposed ion temperature and density gradients. Different forms of controlled regularization to achieve a well-posed system are used to obtain convergent resolved simulations. The regularization of the equations is achieved by means of a simple collisional model, by inclusion of an ad-hoc hyperviscosity or artificial viscosity term or by implicit dissipation in upwind schemes. Comparisons between the various methods and regularizations are presented. We apply a filtering formalism to the Vlasov equation and derive sub-grid-scale (SGS) terms analogous to the Reynolds stress terms in hydrodynamic turbulence. We present a priori quantifications of these SGS terms in resolved simulations of drift-kinetic turbulence by applying a sharp filter.
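
    As a generic illustration of the a priori approach described above (not the authors' code), one applies a sharp low-pass filter to resolved fields and measures the difference between the filtered nonlinear term and the nonlinear term built from filtered quantities. The 1D fields below are synthetic placeholders.

      # Illustrative a priori test: apply a sharp spectral filter to resolved fields and
      # quantify the sub-grid-scale (SGS) contribution to a quadratic term,
      # tau = filter(u*f) - filter(u)*filter(f). Fields here are synthetic 1D signals.
      import numpy as np

      def sharp_filter(field, k_cut):
          fhat = np.fft.rfft(field)
          fhat[k_cut:] = 0.0          # remove all modes at or above the cutoff wavenumber
          return np.fft.irfft(fhat, n=field.size)

      n = 256
      x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
      u = np.sin(3 * x) + 0.2 * np.sin(40 * x)   # synthetic resolved field
      f = np.cos(5 * x) + 0.1 * np.cos(50 * x)   # synthetic advected quantity

      tau = sharp_filter(u * f, 16) - sharp_filter(u, 16) * sharp_filter(f, 16)
      print("rms SGS term:", np.sqrt(np.mean(tau ** 2)))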

  12. A comparative study of amplitude calibrations for the East Asia VLBI Network: A priori and template spectrum methods

    Science.gov (United States)

    Cho, Ilje; Jung, Taehyun; Zhao, Guang-Yao; Akiyama, Kazunori; Sawada-Satoh, Satoko; Kino, Motoki; Byun, Do-Young; Sohn, Bong Won; Shibata, Katsunori M.; Hirota, Tomoya; Niinuma, Kotaro; Yonekura, Yoshinori; Fujisawa, Kenta; Oyama, Tomoaki

    2017-12-01

    We present the results of a comparative study of amplitude calibrations for the East Asia VLBI Network (EAVN) at 22 and 43 GHz using two different methods, an "a priori" method and a "template spectrum" method, with particular attention to lower-declination sources. Using observational data sets of early EAVN observations, we investigated the elevation dependence of the gain values at seven stations of the KaVA (KVN and VERA Array) and three additional telescopes in Japan (Takahagi 32 m, Yamaguchi 32 m, and Nobeyama 45 m). By comparing the independently obtained gain values based on these two methods, we found that the gain values from each method were consistent within 10% at elevations higher than 10°. We also found that the total flux densities of two images produced from the different amplitude calibrations were in agreement within 10% at both 22 and 43 GHz. By using the template spectrum method, furthermore, additional radio telescopes can participate in KaVA (i.e., EAVN), giving a notable sensitivity increase. Therefore, our results will help constrain the conditions required to measure VLBI amplitudes reliably with EAVN, and we discuss the potential of a possible expansion to the telescopes comprising EAVN.

  13. Validity and Reliability of a Glucometer Against Industry Reference Standards.

    Science.gov (United States)

    Salacinski, Amanda J; Alford, Micah; Drevets, Kathryn; Hart, Sarah; Hunt, Brian E

    2014-01-01

    As an appealing alternative to reference glucose analyzers, portable glucometers are recommended for self-monitoring at home, in the field, and in research settings. The purpose was to characterize the accuracy, precision, and bias of glucometers in biomedical research. Fifteen young (20-36 years; mean = 24.5), moderately to highly active men (n = 10) and women (n = 5), defined by exercising 2 to 3 times a week for the past 6 months, were given an oral glucose tolerance test (OGTT) after an overnight fast. Participants ingested 50, 75, or 150 grams of glucose over a 5-minute period. The glucometer was compared to a reference instrument. The glucometer had 39% of values within 15% of measurements made using the reference instrument, which ranged from 45.05 to 169.37 mg/dl. There was both a proportional bias (-0.45 to -0.39) and a small fixed bias (5.06 and 0.90 mg/dl). Results of the present study suggest that the glucometer provided poor validity and reliability compared with the reference laboratory analyzer. Portable glucometers should be used for patient management, but not for diagnosis, treatment, or research purposes. © 2014 Diabetes Technology Society.
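
    The kind of agreement statistics reported here can be reproduced along the following lines; this is a sketch with made-up numbers, not the study data, and the variable names are our own.

      # Illustrative sketch (made-up values): percentage of glucometer readings within
      # 15% of the reference, plus fixed and proportional bias from a least-squares fit
      # of (glucometer - reference) against the reference value.
      import numpy as np

      reference = np.array([60.0, 85.0, 110.0, 140.0, 170.0])   # hypothetical mg/dl
      glucometer = np.array([55.0, 80.0, 100.0, 150.0, 158.0])  # hypothetical mg/dl

      within_15pct = np.mean(np.abs(glucometer - reference) <= 0.15 * reference) * 100
      slope, intercept = np.polyfit(reference, glucometer - reference, 1)

      print(f"{within_15pct:.0f}% of readings within 15% of reference")
      print(f"proportional bias (slope): {slope:.2f}, fixed bias (intercept): {intercept:.1f} mg/dl")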

  14. [The role of quality and marketing in the reference service]

    Directory of Open Access Journals (Sweden)

    Marília Cossich Ramos

    2017-04-01

    Presents the importance of quality in the service provided by the information professional. Defines the main characteristics and functions of the reference service. Identifies the reference librarian as the main agent of an effective reference service. Points out the main currents of marketing in information units. Describes the main products and services offered by an information unit. Highlights the contribution of the reference service and of quality service to library marketing. Concludes by emphasizing the importance of quality for marketing in information units.

  15. Procedure and reference standard to determine the structural resolution in coordinate metrology

    Science.gov (United States)

    Illemann, Jens; Bartscher, Markus; Jusko, Otto; Härtig, Frank; Neuschaefer-Rube, Ulrich; Wendt, Klaus

    2014-06-01

    A new procedure and reference standards for specifying the structural resolution in coordinate metrology traceable to the SI unit the metre are proposed. With the definition of the structural resolution, a significant gap will be closed to complete ‘acceptance and verification tests’ of the coordinate measuring systems (CMSs) which are specified in the ISO 10360 series dealing with tactile sensors, optical sensors, and x-ray computed tomography measurement systems (CTs). The proposed new procedure uses reference standards with circular rounded edges. The idea is to measure the radius of curvature on a calibrated round edge structure. From the deviation between the measured and the calibrated radius, an analogue Gaussian broadening of the measurement system is determined. This value is a well-defined and easy-to-apply measure to define the structural resolution for dimensional measurements. It is applicable to CMSs which are based on different sensing principles, e.g. tactile, optical and CT systems. On the other hand, it has a physical meaning similar to the classical optical point-spread function. It makes it possible to predict which smallest details the CMS is capable of measuring reliably for an arbitrary object shape. The theoretical background of the new procedure is given, an appropriate reference standard is described and comparative, quantitative measurement data of CMSs featuring different sensors are shown.

  16. REFERENCE CASES FOR USE IN THE CEMENTITIOUS PARTNERSHIP PROJECT

    Energy Technology Data Exchange (ETDEWEB)

    Langton, C.; Kosson, D.; Garrabrants, A.

    2010-08-31

    The Cementitious Barriers Partnership Project (CBP) is a multi-disciplinary, multi-institution cross cutting collaborative effort supported by the US Department of Energy (DOE) to develop a reasonable and credible set of tools to improve understanding and prediction of the structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The period of performance is >100 years for operating facilities and > 1000 years for waste management. The CBP has defined a set of reference cases to provide the following functions: (i) a common set of system configurations to illustrate the methods and tools developed by the CBP, (ii) a common basis for evaluating methodology for uncertainty characterization, (iii) a common set of cases to develop a complete set of parameters and changes in parameters as a function of time and changing conditions, (iv) a basis for experiments and model validation, and (v) a basis for improving conceptual models and reducing model uncertainties. These reference cases include the following two reference disposal units and a reference storage unit: (i) a cementitious low activity waste form in a reinforced concrete disposal vault, (ii) a concrete vault containing a steel high-level waste tank filled with grout (closed high-level waste tank), and (iii) a spent nuclear fuel basin during operation. Each case provides a different set of desired performance characteristics and interfaces between materials and with the environment. Examples of concretes, grout fills and a cementitious waste form are identified for the relevant reference case configurations.

  17. REFERENCE CASES FOR USE IN THE CEMENTITIOUS BARRIERS PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Langton, C

    2009-01-06

    The Cementitious Barriers Project (CBP) is a multidisciplinary cross cutting project initiated by the US Department of Energy (DOE) to develop a reasonable and credible set of tools to improve understanding and prediction of the structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The period of performance is >100 years for operating facilities and > 1000 years for waste management. The CBP has defined a set of reference cases to provide the following functions: (1) a common set of system configurations to illustrate the methods and tools developed by the CBP, (2) a common basis for evaluating methodology for uncertainty characterization, (3) a common set of cases to develop a complete set of parameters and changes in parameters as a function of time and changing conditions, (4) a basis for experiments and model validation, and (5) a basis for improving conceptual models and reducing model uncertainties. These reference cases include the following two reference disposal units and a reference storage unit: (1) a cementitious low activity waste form in a reinforced concrete disposal vault, (2) a concrete vault containing a steel high-level waste tank filled with grout (closed high-level waste tank), and (3) a spent nuclear fuel basin during operation. Each case provides a different set of desired performance characteristics and interfaces between materials and with the environment. Examples of concretes, grout fills and a cementitious waste form are identified for the relevant reference case configurations.

  18. REFERENCE CASES FOR USE IN THE CEMENTITIOUS PARTNERSHIP PROJECT

    International Nuclear Information System (INIS)

    Langton, C.; Kosson, D.; Garrabrants, A.

    2010-01-01

    The Cementitious Barriers Partnership Project (CBP) is a multi-disciplinary, multi-institution cross cutting collaborative effort supported by the US Department of Energy (DOE) to develop a reasonable and credible set of tools to improve understanding and prediction of the structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The period of performance is >100 years for operating facilities and > 1000 years for waste management. The CBP has defined a set of reference cases to provide the following functions: (i) a common set of system configurations to illustrate the methods and tools developed by the CBP, (ii) a common basis for evaluating methodology for uncertainty characterization, (iii) a common set of cases to develop a complete set of parameters and changes in parameters as a function of time and changing conditions, (iv) a basis for experiments and model validation, and (v) a basis for improving conceptual models and reducing model uncertainties. These reference cases include the following two reference disposal units and a reference storage unit: (i) a cementitious low activity waste form in a reinforced concrete disposal vault, (ii) a concrete vault containing a steel high-level waste tank filled with grout (closed high-level waste tank), and (iii) a spent nuclear fuel basin during operation. Each case provides a different set of desired performance characteristics and interfaces between materials and with the environment. Examples of concretes, grout fills and a cementitious waste form are identified for the relevant reference case configurations.

  19. Assessment of brain reference genes for RT-qPCR studies in neurodegenerative diseases.

    Science.gov (United States)

    Rydbirk, Rasmus; Folke, Jonas; Winge, Kristian; Aznar, Susana; Pakkenberg, Bente; Brudek, Tomasz

    2016-11-17

    Evaluation of gene expression levels by reverse transcription quantitative real-time PCR (RT-qPCR) has for many years been the favourite approach for discovering disease-associated alterations. Normalization of results to stably expressed reference genes (RGs) is pivotal to obtain reliable results. This is especially important in relation to neurodegenerative diseases where disease-related structural changes may affect the most commonly used RGs. We analysed 15 candidate RGs in 98 brain samples from two brain regions from Alzheimer's disease (AD), Parkinson's disease (PD), Multiple System Atrophy, and Progressive Supranuclear Palsy patients. Using RefFinder, a web-based tool for evaluating RG stability, we identified the most stable RGs to be UBE2D2, CYC1, and RPL13, which we recommend for future RT-qPCR studies on human brain tissue from these patients. None of the investigated genes were affected by experimental variables such as RIN, PMI, or age. Findings were further validated by expression analyses of a target gene, GSK3B, known to be affected by AD and PD. We obtained high variations in GSK3B levels when contrasting the results using different sets of common RGs, underlining the importance of a priori validation of RGs for RT-qPCR studies.

  20. Reference values for 27 clinical chemistry tests in 70-year-old males and females.

    Science.gov (United States)

    Carlsson, Lena; Lind, Lars; Larsson, Anders

    2010-01-01

    Reference values are usually defined based on blood samples from healthy men or nonpregnant women in the age range of 20-50 years. These values are not optimal for elderly patients, as many biological markers change over time and adequate reference values are important for correct clinical decisions. Our aim was to validate NORIP (Nordic Reference Interval Project) reference values in a 70-year-old population. We studied 27 frequently used laboratory tests. The 2.5th and 97.5th percentiles for these markers were calculated according to the recommendations of the International Federation of Clinical Chemistry on the statistical treatment of reference values. Reference values are reported for plasma alanine aminotransferase, albumin, alkaline phosphatase, pancreas amylase, apolipoprotein A1, apolipoprotein B, aspartate aminotransferase, bilirubin, calcium, chloride, cholesterol, creatinine, creatine kinase, C-reactive protein, glucose, gamma-glutamyltransferase, HDL-cholesterol, iron, lactate dehydrogenase, LDL-cholesterol, magnesium, phosphate, potassium, sodium, transferrin, triglycerides, urate and urea. Reference values calculated from the whole population and a subpopulation without cardiovascular disease showed strong concordance. Several of the reference interval limits were outside the 90% CI of a Scandinavian population (NORIP). © 2009 S. Karger AG, Basel.
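
    The nonparametric reference limits referred to above are simply the 2.5th and 97.5th percentiles of the reference sample. A minimal sketch follows, using placeholder data rather than the study's measurements.

      # Minimal sketch: nonparametric 2.5th/97.5th percentile reference interval from a
      # sample of healthy-reference results (placeholder values, not the study data).
      import numpy as np

      values = np.random.default_rng(0).normal(loc=5.0, scale=0.6, size=400)  # e.g. mmol/l
      lower, upper = np.percentile(values, [2.5, 97.5])
      print(f"reference interval: {lower:.2f} - {upper:.2f}")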

  1. Defining recovery in chronic fatigue syndrome: a critical review.

    Science.gov (United States)

    Adamowicz, Jenna L; Caikauskaite, Indre; Friedberg, Fred

    2014-11-01

    In chronic fatigue syndrome (CFS), the lack of consensus on how recovery should be defined or interpreted has generated controversy and confusion. The purpose of this paper was to systematically review, compare, and evaluate the definitions of recovery reported in the CFS literature and to make recommendations about the scope of recovery assessments. A search was done using the MEDLINE, PubMed, PsycINFO, CINAHL, and Cochrane databases for peer review papers that contained the search terms "chronic fatigue syndrome" and "recovery," "reversal," "remission," and/or "treatment response." From the 22 extracted studies, recovery was operationally defined by reference to one or more of these domains: (1) pre-morbid functioning; (2) both fatigue and function; (3) fatigue (or related symptoms) alone; (4) function alone; and/or (5) brief global assessment. Almost all of the studies measuring recovery in CFS did so differently. The brief global assessment was the most common outcome measure used to define recovery. Estimates of recovery ranged from 0 to 66 % in intervention studies and 2.6 to 62 % in naturalistic studies. Given that the term "recovery" was often based on limited assessments and less than full restoration of health, other more precise and accurate labels (e.g., clinically significant improvement) may be more appropriate and informative. In keeping with common understandings of the term recovery, we recommend a consistent definition that captures a broad-based return to health with assessments of both fatigue and function as well as the patient's perceptions of his/her recovery status.

  2. [Security of hospital infusion practices: From an a priori risk analysis to an improvement action plan].

    Science.gov (United States)

    Pignard, J; Cosserant, S; Traore, O; Souweine, B; Sautou, V

    2016-03-01

    Infusion in care units, and all the more in intensive care units, is a complex process which can be the source of many risks for the patient. As part of an institutional approach for improving the quality and safety of patient healthcare, a risk mapping of infusion practices was performed. The analysis was focused on intravenous infusion situations in adults; the a priori risk assessment methodology was applied and a multidisciplinary work group established. Forty-three risks were identified for the infusion process (prescription, preparation and administration). The assessment of the risks and of the existing means of control showed that 48% of them would have a highly critical impact on patient safety. Recommendations were developed for the 20 risks considered to be most critical, to limit their occurrence and severity, and improve their control level. An institutional action plan was developed and validated by the Drug and Sterile Medical Devices Commission. This mapping allowed the realization of an exhaustive inventory of potential risks associated with infusion. At the end of this work, multidisciplinary groups were set up to work on different themes and regular quarterly meetings were established to follow the progress of the various projects. Risk mapping will also be performed in the pediatric and oncology units, where the risk associated with the handling of toxic products is omnipresent. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  3. Tomographic image via background subtraction using an x-ray projection image and a priori computed tomography

    International Nuclear Information System (INIS)

    Zhang Jin; Yi Byongyong; Lasio, Giovanni; Suntharalingam, Mohan; Yu, Cedric

    2009-01-01

    Kilovoltage x-ray projection images (kV images for brevity) are increasingly available in image guided radiotherapy (IGRT) for patient positioning. These images are two-dimensional (2D) projections of a three-dimensional (3D) object along the x-ray beam direction. Projecting a 3D object onto a plane may lead to ambiguities in the identification of anatomical structures and to poor contrast in kV images. Therefore, the use of kV images in IGRT is mainly limited to bony landmark alignments. This work proposes a novel subtraction technique that isolates a slice of interest (SOI) from a kV image with the assistance of a priori information from a previous CT scan. The method separates structural information within a preselected SOI by suppressing contributions to the unprocessed projection from out-of-SOI-plane structures. Up to a five-fold increase in the contrast-to-noise ratios (CNRs) was observed in selected regions of the isolated SOI, when compared to the original unprocessed kV image. The tomographic image via background subtraction (TIBS) technique aims to provide a quick snapshot of the slice of interest with greatly enhanced image contrast over conventional kV x-ray projections for fast and accurate image guidance of radiation therapy. With further refinements, TIBS could, in principle, provide real-time tumor localization using gantry-mounted x-ray imaging systems without the need for implanted markers.
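
    Conceptually, and only as a schematic of the idea described above (synthetic arrays, not the authors' implementation), the prior CT volume is forward-projected along the beam direction with the slice of interest excluded, and this background projection is subtracted from the measured kV projection to leave the SOI contribution.

      # Schematic sketch of background subtraction with a prior CT volume (synthetic data):
      # forward-project the CT along one axis while excluding the slice of interest (SOI),
      # then subtract that background from the measured projection to isolate the SOI.
      import numpy as np

      rng = np.random.default_rng(1)
      ct = rng.random((64, 128, 128))        # prior CT volume, axis 0 along the beam
      soi = 32                               # index of the slice of interest

      measured_projection = ct.sum(axis=0)   # stand-in for the acquired kV projection
      background = ct.sum(axis=0) - ct[soi]  # projection of everything outside the SOI
      soi_image = measured_projection - background

      print(np.allclose(soi_image, ct[soi]))  # True for this idealized, registration-free case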

  4. Séparation des ondes P et S à l'aide de la matrice spectrale avec informations à priori The Separation of P and S Waves Using the Spectral Matrix with a Priori Information

    Directory of Open Access Journals (Sweden)

    Mari J. L.

    2006-11-01

    Classically, the filtering technique based on the spectral matrix proposed by Mermoz allows an automatic separation of waves, in the sense of seismic arrivals, only in certain particular cases, namely when the waves to be separated are naturally aligned with the eigenvectors of the spectral matrix. In the other cases, we show that introducing a priori information on the apparent velocity of some waves, together with a limitation of their time duration, makes it possible to estimate their wave vectors. The use of these vectors and a least-squares projection technique leads to an optimal extraction of these waves without degrading the other waves. The proposed filtering technique was applied to offset vertical seismic profile (VSP) data. The VSP was recorded in a well between depths of 1050 m and 1755 m; the source was offset 654 m from the wellhead. The tool used was a three-component borehole geophone. The well crosses a complex geological structure. The processing brought out seismic reflections of compressional and shear waves, associated with steeply dipping markers (10 to 25°). After estimating the velocity fields and dips with the aid of charts, depth migration of the picked time horizons yielded a faulted structural model. Detailed structural analysis can be achieved by using the 3-component vertical seismic profiling method, which gives structural information up to several hundred meters from the wellhead. The use of an offset VSP on the Auzance structure has made it possible to obtain a structural model composed of faulted dipping reflectors. This is due to the robust nature of the wave separation method, which is based on the spectral matrix and uses a priori information. This method preserves the true amplitude and the local apparent

  5. Application of diagnostic reference levels in medical practice

    Energy Technology Data Exchange (ETDEWEB)

    Bourguignon, Michel [Faculty of Medicine of Paris, Deputy Director General, Nuclear Safety Authority (ASN), Paris (France)

    2006-07-01

    Diagnostic reference levels (DRLs) are defined in the Council Directive 97/43 EURATOM as 'dose levels in medical radiodiagnostic practices or, in the case of radiopharmaceuticals, levels of activity, for typical examinations for groups of standard-sized patients or standard phantoms for broadly defined types of equipment. These levels are expected not to be exceeded for standard procedures when good and normal practice regarding diagnostic and technical performance is applied'. Thus DRLs apply only to diagnostic procedures and do not apply to radiotherapy. Radiation protection of patients is based on the application of two major radiation protection principles, justification and optimization. The justification principle must be respected first, because the best way to protect the patient is not to carry out a useless test. Radiation protection of the patient is a continuous process, and local dose indicator values in the good range should not prevent the radiologist or nuclear medicine physician from continuing to optimize their practice. (N.C.)

  6. Application of diagnostic reference levels in medical practice

    International Nuclear Information System (INIS)

    Bourguignon, Michel

    2006-01-01

    Diagnostic reference levels (DRLs) are defined in the Council Directive 97/43 EURATOM as 'dose levels in medical radiodiagnostic practices or, in the case of radiopharmaceuticals, levels of activity, for typical examinations for groups of standard-sized patients or standard phantoms for broadly defined types of equipment. These levels are expected not to be exceeded for standard procedures when good and normal practice regarding diagnostic and technical performance is applied'. Thus DRLs apply only to diagnostic procedures and do not apply to radiotherapy. Radiation protection of patients is based on the application of two major radiation protection principles, justification and optimization. The justification principle must be respected first, because the best way to protect the patient is not to carry out a useless test. Radiation protection of the patient is a continuous process, and local dose indicator values in the good range should not prevent the radiologist or nuclear medicine physician from continuing to optimize their practice. (N.C.)

  7. A Reference Model for Distribution Grid Control in the 21st Century

    Energy Technology Data Exchange (ETDEWEB)

    Taft, Jeffrey D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); De Martini, Paul [California Inst. of Technology (CalTech), Pasadena, CA (United States); Kristov, Lorenzo [California Independent System Operator, Folsom, CA (United States)

    2015-07-01

    Intensive changes in the structure of the grid due to the penetration of new technologies, coupled with changing societal needs, are outpacing the capabilities of traditional grid control systems. The gap is widening at an accelerating rate, with the biggest impacts occurring at the distribution level due to the widespread adoption of diverse distribution-connected energy resources (DER). This paper outlines the emerging distribution grid control environment, defines the new distribution control problem, and provides a distribution control reference model. The reference model offers a schematic representation of the problem domain to inform development of system architecture and control solutions for the high-DER electric system.

  8. Comparing dietary patterns derived by two methods and their associations with obesity in Polish girls aged 13-21 years: the cross-sectional GEBaHealth study.

    Science.gov (United States)

    Wadolowska, Lidia; Kowalkowska, Joanna; Czarnocinska, Jolanta; Jezewska-Zychowicz, Marzena; Babicz-Zielinska, Ewa

    2017-05-01

    To compare dietary patterns (DPs) derived by two methods and their assessment as a factor of obesity in girls aged 13-21 years. Data from a cross-sectional study conducted among a representative sample of Polish females (n = 1,107) aged 13-21 years were used. Subjects were randomly selected. Dietary information was collected using three short validated food frequency questionnaires (FFQs) regarding fibre intake, fat intake and overall food intake variety. DPs were identified by two methods: an a priori approach (a priori DPs) and cluster analysis (data-driven DPs). The association between obesity and DPs and three single dietary characteristics was examined using multiple logistic regression analysis. Four data-driven DPs were obtained: 'Low-fat-Low-fibre-Low-varied' (21.2%), 'Low-fibre' (29.1%), 'Low-fat' (25.0%) and 'High-fat-Varied' (24.7%). Three a priori DPs were pre-defined: 'Non-healthy' (16.6%), 'Neither-pro-healthy-nor-non-healthy' (79.1%) and 'Pro-healthy' (4.3%). Girls with 'Low-fibre' DP were less likely to have central obesity (adjusted odds ratio (OR) = 0.36; 95% confidence interval (CI): 0.17, 0.75) than girls with 'Low-fat-Low-fibre-Low-varied' DP (reference group, OR = 1.00). No significant associations were found between a priori DPs and overweight including obesity or central obesity. The majority of girls with 'Non-healthy' DP were also classified as 'Low-fibre' DP in the total sample, in girls with overweight including obesity and in girls with central obesity (81.7%, 80.6% and 87.3%, respectively), while most girls with 'Pro-healthy' DP were classified as 'Low-fat' DP (67.8%, 87.6% and 52.1%, respectively). We found that the a priori approach as well as cluster analysis can be used to derive opposite health-oriented DPs in Polish females. Both methods have provided disappointing outcomes in explaining the association between obesity and DPs. The cluster analysis, in comparison with the a priori approach, was more useful for finding any

  9. Reference standard for serum bile acids in pregnancy.

    LENUS (Irish Health Repository)

    2012-01-31

    Please cite this paper as: Egan N, Bartels A, Khashan A, Broadhurst D, Joyce C, O'Mullane J, O'Donoghue K. Reference standard for serum bile acids in pregnancy. BJOG 2012;00:000-000. DOI: 10.1111/j.1471-0528.2011.03245.x. Objective Obstetric cholestasis (OC) is a liver disorder characterised by pruritus and elevated serum bile acids (SBA) that affects one in 200 pregnant women. It is associated with adverse perinatal outcomes such as premature delivery and stillbirth. Mild OC is defined as SBA levels of 10-39 μmol/l, and severe OC is defined by levels >40 μmol/l. SBA levels in normal pregnancy have not been investigated. We aimed to establish reference values for SBA in healthy pregnant women across different trimesters of pregnancy. Design Cross-sectional analysis of SBA levels. Setting A large tertiary referral university teaching maternity hospital. Population Healthy pregnant women with a singleton pregnancy and a body mass index (BMI) < 40, excluding women with significant alcohol intake, history of liver disease, prior cholecystectomy and OC. Methods Cross-sectional analysis of SBA levels at 12, 20, 28 and 36 weeks of gestation, and on days 1-3 postpartum. Main outcome measures SBA levels in μmol/l. Results A total of 219 women attending for antenatal care were recruited, and SBA levels were assayed at 12, 20, 28 and 36 weeks of gestation, and up to 72 hours postpartum (n = 44-49 cases at each stage). The majority were white European women, with a median age of 30 years (range 17-46 years) and median BMI of 25 (range 18-38). Values of SBA ranged from 0.3 to 9.8 μmol/l in 216 women, with only three measurements outside this range. There were no significant changes throughout pregnancy. Conclusions SBA values in uncomplicated pregnancies are consistent, regardless of gestation, and are not elevated in pregnancy. The current reference values for the diagnosis of OC appear to be appropriate.

  10. Can play be defined?

    DEFF Research Database (Denmark)

    Eichberg, Henning

    2015-01-01

    Can play be defined? There is reason to raise critical questions about the established academic demand that a phenomenon – also in humanist studies – should first of all be defined, i.e. de-lineated and by neat lines limited to a "little box" that can be handled. The following chapter develops … Human beings can very well understand play – or whatever phenomenon in human life – without defining it.

  11. Classical field theory in the space of reference frames. [Space-time manifold, action principle

    Energy Technology Data Exchange (ETDEWEB)

    Toller, M [Dipartimento di Matematica e Fisica, Libera Universita, Trento (Italy)

    1978-03-11

    The formalism of classical field theory is generalized by replacing the space-time manifold M by the ten-dimensional manifold S of all the local reference frames. The geometry of the manifold S is determined by ten vector fields corresponding to ten operationally defined infinitesimal transformations of the reference frames. The action principle is written in terms of a differential 4-form in the space S (the Lagrangian form). Densities and currents are represented by differential 3-forms in S. The field equations and the connection between symmetries and conservation laws (Noether's theorem) are derived from the action principle. Einstein's theory of gravitation and Maxwell's theory of electromagnetism are reformulated in this language. The general formalism can also be used to formulate theories in which charge, energy and momentum cannot be localized in space-time and even theories in which a space-time manifold cannot be defined exactly in any useful way.

  12. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
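
    The paper's IFTA notation is not reproduced here; the snippet below is only a hypothetical illustration of the if-trigger-then-action idea, with invented rule fields and handler behaviour.

      # Hypothetical illustration of an if-trigger-then-action (IFTA) style rule: when a
      # file matching a pattern is created on a storage system, run an action on it.
      # Rule fields and actions are invented for this sketch.
      import fnmatch

      rules = [
          {"trigger": {"event": "file_created", "pattern": "*.h5"},
           "action": lambda path: print(f"extract metadata and index {path}")},
          {"trigger": {"event": "file_created", "pattern": "*.csv"},
           "action": lambda path: print(f"copy {path} to shared archive")},
      ]

      def on_event(event, path):
          for rule in rules:
              t = rule["trigger"]
              if t["event"] == event and fnmatch.fnmatch(path, t["pattern"]):
                  rule["action"](path)

      on_event("file_created", "/data/run42/detector.h5")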

  13. Different design approaches to structural fire safety

    DEFF Research Database (Denmark)

    Giuliani, Luisa; Budny, I.

    2013-01-01

    …a-priori evaluate which design is the safest or the most economical one: a punctual analysis of the different aspects and a comparison of the resulting designs is therefore of interest and is presented in this paper with reference to the case study considered. The third approach refers instead to a performance…-based fire design of the structure (PBFD), where safety goals are explicitly defined and a deeper knowledge of the structural response to fire effects can be achieved, for example with the aid of finite element analyses (FEA). On the other hand, designers can't follow established procedures when undertaking … such advanced investigations, which are generally quite complex ones, due to the presence of material degradation and large displacements induced by fire, as well as the possible triggering of local mechanisms in the system. An example of advanced investigations for fire design is given in the paper

  14. Defining Documentary Film

    DEFF Research Database (Denmark)

    Juel, Henrik

    2006-01-01

    A discussion of various attempts at defining documentary film regarding form, content, truth, style, genre or reception - and a proposal of a positive list of essential, but non-exclusive characteristics of documentary film.

  15. Growth references

    NARCIS (Netherlands)

    Buuren, S. van

    2007-01-01

    A growth reference describes the variation of an anthropometric measurement within a group of individuals. A reference is a tool for grouping and analyzing data and provides a common basis for comparing populations. A well-known type of reference is the age-conditional growth diagram. The

  16. Relevance of plastic limit loads to reference stress approach for surface cracked cylinder problems

    International Nuclear Information System (INIS)

    Kim, Yun-Jae; Shim, Do-Jun

    2005-01-01

    To investigate the relevance of the definition of the reference stress to estimate J and C* for surface crack problems, this paper compares finite element (FE) J and C* results for surface cracked pipes with those estimated according to the reference stress approach using various definitions of the reference stress. Pipes with part circumferential inner surface cracks and finite internal axial cracks are considered, subject to internal pressure and global bending. The crack depth and aspect ratio are systematically varied. The reference stress is defined in four different ways using (i) a local limit load (ii), a global limit load, (iii) a global limit load determined from the FE limit analysis, and (iv) the optimised reference load. It is found that the reference stress based on a local limit load gives overall excessively conservative estimates of J and C*. Use of a global limit load clearly reduces the conservatism, compared to that of a local limit load, although it can sometimes provide non-conservative estimates of J and C*. The use of the FE global limit load gives overall non-conservative estimates of J and C*. The reference stress based on the optimised reference load gives overall accurate estimates of J and C*, compared to other definitions of the reference stress. Based on the present findings, general guidance on the choice of the reference stress for surface crack problems is given
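
    For readers unfamiliar with the approach, the commonly quoted definitions are sketched below. These are standard forms from the reference stress literature rather than equations taken from this paper; P_L denotes the chosen limit load, σ_y the yield stress, and ε_ref the strain at σ_ref on the uniaxial stress-strain curve.

      \sigma_{\mathrm{ref}} = \frac{P}{P_L}\,\sigma_y, \qquad
      J \approx J_e\left(\frac{E\,\varepsilon_{\mathrm{ref}}}{\sigma_{\mathrm{ref}}}
        + \frac{\sigma_{\mathrm{ref}}^{3}}{2\,E\,\varepsilon_{\mathrm{ref}}\,\sigma_y^{2}}\right), \qquad
      C^{*} \approx \sigma_{\mathrm{ref}}\,\dot{\varepsilon}_{\mathrm{ref}}
        \left(\frac{K}{\sigma_{\mathrm{ref}}}\right)^{2}.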

  17. Randomized clinical trials in orthodontics are rarely registered a priori and often published late or not at all.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Antonoglou, Georgios N; Sándor, George K; Eliades, Theodore

    2017-01-01

    A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. The aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. Of the 266 orthodontic trials registered up to January 2017, 80 had been completed and were included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) had been published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of the completed trials remained unpublished even 5 years after their completion. Publication rates of registered randomized trials in orthodontics remain low, even 5 years after the completion date.
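
    The Kaplan-Meier estimate mentioned above can be sketched with a simple product-limit calculation; the durations below are made up for illustration and are not the study data (unpublished trials are treated as censored at the end of follow-up).

      # Illustrative Kaplan-Meier (product-limit) sketch with made-up data: time from
      # trial completion to publication in months; events = 1 means published,
      # events = 0 means still unpublished (censored) at the end of follow-up.
      import numpy as np

      durations = np.array([4, 9, 12, 20, 26, 31, 40, 60, 60, 60], dtype=float)  # months
      events    = np.array([1, 1, 1,  1,  1,  1,  1,  0,  0,  0])                # 1 = published

      order = np.argsort(durations)
      durations, events = durations[order], events[order]

      at_risk = len(durations)
      survival = 1.0                       # probability of remaining unpublished
      for t, d in zip(durations, events):
          if d == 1:
              survival *= (at_risk - 1) / at_risk
          at_risk -= 1
          print(f"t = {t:5.1f} months  S(t) = {survival:.2f}")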

  18. Isotachophoresis as a candidate reference method in analytical chemistry. Determination of sodium in serum

    NARCIS (Netherlands)

    Lemmens, A.A.G.; Reijenga, J.C.; Everaerts, F.M.; Janssen, R.T.P.; Hulsman, J.A.R.J.; Meijers, C.A.M.

    1985-01-01

    Isotachophoresis seems a likely candidate for a reference method, as it is more accurate and precise than common routine methods. The response is highly linear and depends on a well defined transport number of the leading ion, stability of the driving current, mobility of the separand, which is well

  19. Cerebrospinal fluid glucose and lactate: age-specific reference values and implications for clinical practice.

    NARCIS (Netherlands)

    Leen, W.G.; Willemsen, M.A.A.P.; Wevers, R.A.; Verbeek, M.M.

    2012-01-01

    Cerebrospinal fluid (CSF) analysis is an important tool in the diagnostic work-up of many neurological disorders, but reference ranges for CSF glucose, CSF/plasma glucose ratio and CSF lactate based on studies with large numbers of CSF samples are not available. Our aim was to define age-specific

  20. Impact of nonlinearity on changing the a priori of trace gas profile estimates from the Tropospheric Emission Spectrometer (TES

    Directory of Open Access Journals (Sweden)

    S. S. Kulawik

    2008-06-01

    Non-linear maximum a posteriori (MAP) estimates of atmospheric profiles from the Tropospheric Emission Spectrometer (TES) contain a priori information that may vary geographically, which is a confounding factor in the analysis and physical interpretation of an ensemble of profiles. One mitigation strategy is to transform profile estimates to a common prior using a linear operation, thereby facilitating the interpretation of profile variability. However, this operation depends on the assumption of not worse than moderate non-linearity near the solution of the non-linear estimate. The robustness of this assumption is tested by comparing atmospheric retrievals from the Tropospheric Emission Spectrometer processed with a uniform prior with those processed with a variable prior and converted to a uniform prior following the non-linear retrieval. Linearly converting the prior following a non-linear retrieval is shown to have a minor effect on the results, relative to the expected total error, when compared with a non-linear retrieval using a uniform prior, with less than 10% of the change in the prior ending up as unbiased fluctuations in the profile estimate results.
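
    The linear operation alluded to above is usually written in the following standard retrieval-theory form (quoted as the generic expression, not verbatim from the paper), where A is the averaging kernel matrix, x̂ the retrieved profile, x_a the original prior and x_c the common prior:

      \hat{x}_c = \hat{x} + (A - I)\,(x_a - x_c).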

  1. Enhancing the performance of model-based elastography by incorporating additional a priori information in the modulus image reconstruction process

    International Nuclear Information System (INIS)

    Doyley, Marvin M; Srinivasan, Seshadri; Dimidenko, Eugene; Soni, Nirmal; Ophir, Jonathan

    2006-01-01

    Model-based elastography is fraught with problems owing to the ill-posed nature of the inverse elasticity problem. To overcome this limitation, we have recently developed a novel inversion scheme that incorporates a priori information concerning the mechanical properties of the underlying tissue structures, and the variance incurred during displacement estimation in the modulus image reconstruction process. The information was procured by employing standard strain imaging methodology, and introduced in the reconstruction process through the generalized Tikhonov approach. In this paper, we report the results of experiments conducted on gelatin phantoms to evaluate the performance of modulus elastograms computed with the generalized Tikhonov (GTK) estimation criterion relative to those computed by employing the un-weighted least-squares estimation criterion, the weighted least-squares estimation criterion and the standard Tikhonov method (i.e., the generalized Tikhonov method with no modulus prior). The results indicate that modulus elastograms computed with the generalized Tikhonov approach had superior elastographic contrast discrimination and contrast recovery. In addition, image reconstruction was more resilient to structural decorrelation noise when additional constraints were imposed on the reconstruction process through the GTK method
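
    In generic form, and as our paraphrase of the standard generalized Tikhonov functional rather than the exact objective used in the paper, the modulus distribution μ is recovered by minimizing a variance-weighted data misfit plus a penalty on deviations from the prior modulus estimate:

      \hat{\mu} = \arg\min_{\mu}\;
      \left\| W^{1/2}\,\bigl(u_{\mathrm{meas}} - F(\mu)\bigr) \right\|_2^{2}
      + \lambda\,\left\| L\,(\mu - \mu_{\mathrm{prior}}) \right\|_2^{2},

    where F is the forward elasticity operator, u_meas the measured displacements, W a weighting built from the displacement-estimation variance, L a regularization operator, and μ_prior the modulus prior obtained from strain imaging.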

  2. Defining Quantum Control Flow

    OpenAIRE

    Ying, Mingsheng; Yu, Nengkun; Feng, Yuan

    2012-01-01

    A remarkable difference between quantum and classical programs is that the control flow of the former can be either classical or quantum. One of the key issues in the theory of quantum programming languages is defining and understanding quantum control flow. A functional language with quantum control flow was defined by Altenkirch and Grattage [Proc. LICS'05, pp. 249-258]. This paper extends their work, and we introduce a general quantum control structure by defining three new quantu...

  3. Face-infringement space: the frame of reference of the ventral intraparietal area.

    Science.gov (United States)

    McCollum, Gin; Klam, François; Graf, Werner

    2012-07-01

    Experimental studies have shown that responses of ventral intraparietal area (VIP) neurons specialize in head movements and the environment near the head. VIP neurons respond to visual, auditory, and tactile stimuli, smooth pursuit eye movements, and passive and active movements of the head. This study demonstrates mathematical structure on a higher organizational level created within VIP by the integration of a complete set of variables covering face-infringement. Rather than positing dynamics in an a priori defined coordinate system such as those of physical space, we assemble neuronal receptive fields to find out what space of variables VIP neurons together cover. Section 1 presents a view of neurons as multidimensional mathematical objects. Each VIP neuron occupies or is responsive to a region in a sensorimotor phase space, thus unifying variables relevant to the disparate sensory modalities and movements. Convergence on one neuron joins variables functionally, as space and time are joined in relativistic physics to form a unified spacetime. The space of position and motion together forms a neuronal phase space, bridging neurophysiology and the physics of face-infringement. After a brief review of the experimental literature, the neuronal phase space natural to VIP is sequentially characterized, based on experimental data. Responses of neurons indicate variables that may serve as axes of neural reference frames, and neuronal responses have been so used in this study. The space of sensory and movement variables covered by VIP receptive fields joins visual and auditory space to body-bound sensory modalities: somatosensation and the inertial senses. This joining of allocentric and egocentric modalities is in keeping with the known relationship of the parietal lobe to the sense of self in space and to hemineglect, in both humans and monkeys. Following this inductive step, variables are formalized in terms of the mathematics of graph theory to deduce which

  4. French approach on the definition of reference defects to be considered for fracture mechanics analyses at design state

    Energy Technology Data Exchange (ETDEWEB)

    Grandemange, J M; Pellissier-Tanon, A [Societe Franco-Americaine de Constructions Atomiques (FRAMATOME), 92 - Paris-La-Defense (France)

    1988-12-31

    This document describes the French approach for verifying the fracture resistance of PWR primary components. Three reference defects have been defined, namely the envelope defect, the exceptional defect and the conventional defect. It appears that a precise estimation of the available margins may be obtained by analyzing a set of reference defects representative of the flaws likely to exist in the components. (TEC). 5 refs.

  5. Cell-Type-Specific Gene Programs of the Normal Human Nephron Define Kidney Cancer Subtypes.

    Science.gov (United States)

    Lindgren, David; Eriksson, Pontus; Krawczyk, Krzysztof; Nilsson, Helén; Hansson, Jennifer; Veerla, Srinivas; Sjölund, Jonas; Höglund, Mattias; Johansson, Martin E; Axelson, Håkan

    2017-08-08

    Comprehensive transcriptome studies of cancers often rely on corresponding normal tissue samples to serve as a transcriptional reference. In this study, we performed in-depth analyses of normal kidney tissue transcriptomes from the TCGA and demonstrate that the histological variability in cellularity, inherent in the kidney architecture, leads to considerable transcriptional differences between samples. This should be considered when comparing expression profiles of normal and cancerous kidney tissues. We exploited these differences to define renal-cell-specific gene signatures and used these as a framework to analyze renal cell carcinoma (RCC) ontogeny. Chromophobe RCCs express FOXI1-driven genes that define collecting duct intercalated cells, whereas HNF-regulated genes, specific for proximal tubule cells, are an integral part of clear cell and papillary RCC transcriptomes. These networks may be used as a framework for understanding the interplay between genomic changes in RCC subtypes and the lineage-defining regulatory machinery of their non-neoplastic counterparts. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  6. Injury risk at the work processes in fishing: a case-referent study

    DEFF Research Database (Denmark)

    Jensen, Olaf C

    2006-01-01

    Epidemiological studies on occupational injuries describe the incidence ratios related to the main strata in the industries, while the injury incidence ratios for the specific work processes within the work places have not yet been studied. The aim was to estimate the injury rate-ratios for the main work processes in commercial fishing. A case-referent design with samples of person-time was used. The reported injuries to the National Maritime Authorities for a 5-year period for four types of commercial fishing defined the cases. The odds for the referents were calculated from samples of person-times for the specific working processes. Odds Ratios (OR) and 95% confidence intervals (95% CI) were calculated for the specific working processes. A total of 560 cases were included and the samples of referent working periods were 63110 min in total. The largest part of the injuries (n = 318...

  7. Recent references

    International Nuclear Information System (INIS)

    Ramavataram, S.

    1991-01-01

    In support of a continuing program of systematic evaluation of nuclear structure data, the National Nuclear Data Center maintains a complete computer file of references to the nuclear physics literature. Each reference is tagged by a keyword string, which indicates the kinds of data contained in the article. This master file of Nuclear Structure References (NSR) contains complete keyword indexes to literature published since 1969, with partial indexing of older references. Any reader who finds errors in the keyword descriptions is urged to report them to the National Nuclear Data Center so that the master NSR file can be corrected. In 1966, the first collection of Recent References was published as a separate issue of Nuclear Data Sheets. Every four months since 1970, a similar indexed bibliography to new nuclear experiments has been prepared from additions to the NSR file and published. Beginning in 1978, Recent References was cumulated annually, with the third issue completely superseding the two issues previously published during a given year. Due to publication policy changes, cumulation of Recent Reference was discontinued in 1986. The volume and issue number of all the cumulative issues published to date are given. NNDC will continue to respond to individual requests for special bibliographies on nuclear physics topics, in addition to those easily obtained from Recent References. If the required information is available from the keyword string, a reference list can be prepared automatically from the computer files. This service can be provided on request, in exchange for the timely communication of new nuclear physics results (e.g., preprints). A current copy of the NSR file may also be obtained in a standard format on magnetic tape from NNDC. Requests for special searches of the NSR file may also be directed to the National Nuclear Data Center

  8. Reference-based pricing: an evidence-based solution for lab services shopping.

    Science.gov (United States)

    Melton, L Doug; Bradley, Kent; Fu, Patricia Lin; Armata, Raegan; Parr, James B

    2014-01-01

    To determine the effect of reference-based pricing (RBP) on the percentage of lab services utilized by members that were at or below the reference price. Retrospective, quasi-experimental, matched, case-control pilot evaluation of an RBP benefit for lab services. The study group included employees of a multinational grocery chain covered by a national health insurance carrier and subject to RBP for lab services; it had access to an online lab shopping tool and was informed about the RBP benefit through employer communications. The reference group was covered by the same insurance carrier but not subject to RBP. The primary end point was lab compliance, defined as the percentage of lab claims with total charges at or below the reference price. Difference-in-difference regression estimation evaluated changes in lab compliance between the 2 groups. Higher compliance per lab claim was evident for the study group compared with the reference group (69% vs 57%; P < ...). The online shopping tool was used by 7% of the matched-adjusted study group prior to obtaining lab services. Lab compliance was 76% for study group members using the online tool compared with 68% among nonusers who were subject to RBP (P<.01). RBP can promote cost-conscious selection of lab services. Access to facilities that offer services below the reference price and education about RBP improve compliance. Evaluation of the effect of RBP on higher-cost medical services, including radiology, outpatient specialty, and elective inpatient procedures, is needed.
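
    The evaluation above hinges on difference-in-difference regression. The sketch below only illustrates that estimator on simulated data: the column names ("treated", "post", "compliant"), the sample size and the +12-point effect are assumptions, not the study's variables or results.

    ```python
    # Minimal difference-in-differences sketch (linear probability model).
    # All data are simulated; nothing here reproduces the published analysis.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "treated": rng.integers(0, 2, n),   # 1 = member subject to reference-based pricing
        "post": rng.integers(0, 2, n),      # 1 = claim after the RBP benefit started
    })
    # Hypothetical compliance probability with a +12 percentage-point interaction effect.
    p = 0.55 + 0.02 * df["post"] + 0.02 * df["treated"] + 0.12 * df["treated"] * df["post"]
    df["compliant"] = rng.binomial(1, p)

    # The coefficient on the interaction term is the difference-in-differences estimate.
    model = smf.ols("compliant ~ treated * post", data=df).fit()
    print(model.params["treated:post"])
    ```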

  9. Defining Terrorism at the Special Tribunal for Lebanon

    Directory of Open Access Journals (Sweden)

    Prakash Puchooa

    2011-11-01

    Full Text Available On 16 February 2011, the Appeals Chamber of the Special Tribunal for Lebanon (STL) issued an interlocutory decision regarding the legal definition of terrorism. This decision was in response to a Pre-Trial Chamber (PTC) list of questions requesting, 'inter alia', an elaboration of the elements of this crime. In exploring this matter, the Appeals Chamber defined the subjective ('mens rea') and objective ('actus reus') elements of terrorism by referring to domestic Lebanese law and international law. It thereby set out the applicable law for the court. The consequence of this decision, however, is not limited to the law of the STL but may be seen as having far-reaching consequences for the conception of terrorism under both international law and International Criminal Law (ICL). Given the significance of the Appeals Chamber judgment, this paper will scrutinise three areas of concern regarding its propriety:

  10. Reference values for methacholine reactivity (SAPALDIA study

    Directory of Open Access Journals (Sweden)

    Perruchoud André

    2005-11-01

    Full Text Available Abstract Background The distribution of airway responsiveness in a general population of non-smokers without respiratory symptoms has not been established, limiting its use in clinical and epidemiological practice. We derived reference equations depending on individual characteristics (i.e., sex, age, baseline lung function) for relevant percentiles of the methacholine two-point dose-response slope. Methods In a reference sample of 1567 adults of the SAPALDIA cross-sectional survey (1991), defined by excluding subjects with respiratory conditions, responsiveness during methacholine challenge was quantified by calculating the two-point dose-response slope (O'Connor). Weighted L1-regression was used to estimate reference equations for the 95th, 90th, 75th and 50th percentiles of the two-point slope. Results Reference equations for the 95th, 90th, 75th and 50th percentiles of the two-point slope were estimated using a model of the form a + b·Age + c·FEV1 + d·(FEV1)², where FEV1 corresponds to the pre-test (or baseline) level of FEV1. For the central half of the FEV1 distribution, we used a quadratic model to describe the dependence of methacholine slope on baseline FEV1. For the first and last quartiles of FEV1, a linear relation with FEV1 was assumed (i.e., d was set to 0). Sex was not a predictor term in this model. A negative linear association with slope was found for age. We provide an Excel file allowing calculation of the percentile of methacholine slope of a subject after introducing age, pre-test FEV1, and the results of the methacholine challenge of the subject. Conclusion The present study provides equations for four relevant percentiles of methacholine two-point slope depending on age and baseline FEV1 as basic predictors in an adult reference population of non-obstructive and non-atopic persons. These equations may help clinicians and epidemiologists to better characterize individual or population airway responsiveness.
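
    The reference equations take the form a + b·Age + c·FEV1 + d·(FEV1)² described above. A small helper of that shape is sketched below; the coefficient values are placeholders, not the published SAPALDIA estimates, which would come from the paper's Excel file.

    ```python
    # Illustrative evaluation of a percentile equation of the reported form.
    # Coefficients a, b, c, d below are made up for demonstration only.
    def methacholine_slope_percentile(age_years, fev1_litres,
                                      a=10.0, b=-0.05, c=-2.0, d=0.1):
        """Return the modelled boundary value of the two-point dose-response slope."""
        return a + b * age_years + c * fev1_litres + d * fev1_litres ** 2

    # A subject whose measured slope exceeds the modelled 95th-percentile boundary
    # would be flagged as hyperresponsive relative to the reference population.
    print(methacholine_slope_percentile(age_years=45, fev1_litres=3.2))
    ```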

  11. The scientific apprehension of the student in Portuguese education (1880-1900): epistemological references

    Directory of Open Access Journals (Sweden)

    Maria Cristina Soares de Gouveia

    2012-02-01

    Full Text Available The paper analyses the emergence, during the XIX century, of a scientific knowledge about the student, defined according to his generational identity. We focus on a national context (Portugal) and a specific historical period (the last decades of the century). The primary sources were pedagogical magazines, which were used as a strategic vehicle for the diffusion of scientific knowledge, for teacher education, and for the consolidation of pedagogical experts' power. A scientificist perspective, referring to Spencer's positivist model of science, characterized the magazines. Within this perspective, it is possible to identify the epistemological references of the scientific discourses about individual development, which changed during the period. Comte's historical perspective on individual development, central in the first issues, gradually lost its importance. Hygienism, drawing on the concept of race to understand and quantify the physiological characteristics of students, became the most important body of knowledge about individual development, racially defined. The research thus demonstrates the tension between race and history in the production of knowledge about individual, cultural and social phenomena during the XIX century, especially in investigations of individual development.

  12. Lost in space: Onboard star identification using CCD star tracker data without an a priori attitude

    Science.gov (United States)

    Ketchum, Eleanor A.; Tolson, Robert H.

    1993-01-01

    There are many algorithms in use today which determine spacecraft attitude by identifying stars in the field of view of a star tracker. Some methods, which date from the early 1960's, compare the angular separation between observed stars with a small catalog. In the last 10 years, several methods have been developed which speed up the process and reduce the amount of memory needed, a key element to onboard attitude determination. However, each of these methods requires some a priori knowledge of the spacecraft attitude. Although the Sun and magnetic field generally provide the necessary coarse attitude information, there are occasions when a spacecraft could get lost when it is not prudent to wait for sunlight. Also, the possibility of efficient attitude determination using only the highly accurate CCD star tracker could lead to fully autonomous spacecraft attitude determination. The need for redundant coarse sensors could thus be eliminated at a substantial cost reduction. Some groups have extended their algorithms to implement a computation-intensive full sky scan. Some require large databases. Both storage and speed are concerns for autonomous onboard systems. Neural network technology is even being explored by some as a possible solution, but because of the limited number of patterns that can be stored and large overhead, nothing concrete has resulted from these efforts. This paper presents an algorithm which, by discretizing the sky and filtering by the visual magnitude of the brightest observed star, speeds up the lost in space star identification process while reducing the amount of necessary onboard computer storage compared to existing techniques.
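
    The two pruning ideas in the abstract, discretizing the sky into coarse cells and filtering the catalog by the visual magnitude of the brightest observed star, can be sketched as below. The tiny catalog, cell size and magnitude margin are invented for illustration and are not the paper's algorithm parameters.

    ```python
    # Sketch of sky discretization plus magnitude filtering for star identification.
    # Smaller visual magnitude = brighter star. Catalog and thresholds are toy values.
    catalog = [  # (right ascension deg, declination deg, visual magnitude)
        (10.1, -5.2, 2.1), (10.4, -5.0, 4.8), (200.3, 45.6, 1.3), (200.9, 44.9, 5.9),
    ]

    def cell_index(ra_deg, dec_deg, cell_deg=10.0):
        """Map a direction to a coarse sky cell."""
        return (int(ra_deg // cell_deg), int((dec_deg + 90.0) // cell_deg))

    def candidate_stars(brightest_observed_mag, cell, cell_deg=10.0):
        """Catalog stars in one cell no fainter than the brightest observed star."""
        return [s for s in catalog
                if cell_index(s[0], s[1], cell_deg) == cell
                and s[2] <= brightest_observed_mag + 0.5]   # small magnitude margin

    print(candidate_stars(brightest_observed_mag=3.0, cell=cell_index(10.0, -5.0)))
    ```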

  13. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus

    Energy Technology Data Exchange (ETDEWEB)

    Bielecki, J.; Scholz, M.; Drozdowicz, K. [Institute of Nuclear Physics, Polish Academy of Sciences, PL-31342 Krakow (Poland); Giacomelli, L. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Istituto di Fisica del Plasma “P. Caldirola,” Milano (Italy); Kiptily, V.; Kempenaars, M. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Conroy, S. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Department of Physics and Astronomy, Uppsala University (Sweden); Craciunescu, T. [IAP, National Institute for Laser Plasma and Radiation Physics, Bucharest (Romania); Collaboration: EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-09-15

    A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (only two projection angles and 19 lines of sight) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. This work aims to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of the optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for this ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that this method shows good performance and reliability and it can be routinely used for plasma neutron emissivity reconstruction on JET.
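
    A minimal version of Tikhonov regularization with an a priori profile is sketched below: minimize ||Ax - b||² + λ||x - x_prior||², where x_prior stands for the density-shaped emissivity prior. The geometry matrix, noise level and λ are synthetic stand-ins, not the JET/KN3 setup or the paper's parameter-selection procedure.

    ```python
    # Tikhonov-regularized reconstruction toward an a priori profile (toy problem).
    import numpy as np

    rng = np.random.default_rng(1)
    n_pix, n_los = 100, 19                     # pixels and lines of sight (19 as at KN3)
    A = rng.random((n_los, n_pix))             # toy projection geometry
    x_true = np.exp(-np.linspace(-2, 2, n_pix) ** 2)
    b = A @ x_true + 0.01 * rng.standard_normal(n_los)
    x_prior = np.exp(-np.linspace(-2, 2, n_pix) ** 2 / 1.2)   # assumed profile shape

    lam = 1.0                                   # regularization parameter (assumed)
    lhs = A.T @ A + lam * np.eye(n_pix)
    rhs = A.T @ b + lam * x_prior
    x_rec = np.linalg.solve(lhs, rhs)           # regularized least-squares solution
    print(float(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)))
    ```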

  14. User satisfaction with referrals at a collaborative virtual reference service

    Directory of Open Access Journals (Sweden)

    Nahyun Kwon

    2006-01-01

    Full Text Available Introduction. This study investigated unmonitored referrals in a nationwide, collaborative chat reference service. Specifically, it examined the extent to which questions are referred, the types of questions that are more likely to be referred than others, and the level of user satisfaction with the referrals in the collaborative chat reference service. Method. The data analysed for this study were 420 chat reference transaction transcripts along with corresponding online survey questionnaires submitted by the service users. Both sets of data were collected from an electronic archive of a southeastern state public library system that has participated in 24/7 Reference of the Metropolitan Cooperative Library System (MCLS). Results. Referrals in the collaborative chat reference service comprised approximately 30% of the total transactions. Circulation-related questions were the most often referred among all question types, possibly because of the inability of 'outside' librarians to access patron accounts. Most importantly, user satisfaction with referrals was found to be significantly lower than that of completed answers. Conclusion. The findings of this study addressed the importance of distinguishing two types of referrals: the expert research referrals conducive to collaborative virtual reference services; and the re-directional local referrals that increase unnecessary question traffic, thereby being detrimental to effective use of collaborative reference. Continuing efforts to conceptualize referrals in multiple dimensions are anticipated to fully grasp complex phenomena underlying referrals.

  15. Gender differences on the Five to Fifteen questionnaire in a non-referred sample with inattention and hyperactivity-impulsivity and a clinic-referred sample with hyperkinetic disorder

    DEFF Research Database (Denmark)

    Lambek, Rikke; Trillingsgaard, Anegen; Kadesjö, Björn

    2010-01-01

    The aim of the present study was to examine gender differences in children with inattention, hyperactivity, and impulsivity on the Five to Fifteen (FTF) parent questionnaire. First, non-referred girls (n = 43) and boys (n = 51) with problems of attention and hyperactivity-impulsivity, and then clinic-referred girls and boys with hyperkinetic disorder (HKD), were compared on the questionnaire. Secondly, it was examined whether the application of gender mixed norms versus gender specific norms would result in varying proportions of clinic-referred children with HKD being identified as impaired on the subdomains of the FTF questionnaire. Based on results it was concluded that the use of a gender mixed normative sample may lead to overestimation of impairment in boys with HKD, but the type of sample applied to define impairment on the FTF should depend on the purpose for applying the questionnaire.

  16. [Reference citation].

    Science.gov (United States)

    Brkić, Silvija

    2013-01-01

    Scientific and professional papers represent the information basis for scientific research and professional work. References important for the paper should be cited within the text, and listed at the end of the paper. This paper deals with different styles of reference citation. Special emphasis was placed on the Vancouver Style for reference citation in biomedical journals established by the International Committee of Medical Journal Editors. It includes original samples for citing various types of articles, both printed and electronic, as well as recommendations related to reference citation in accordance with the methodology and ethics of scientific research and guidelines for preparing manuscripts for publication.

  17. Preserving the Integrity of Citations and References by All Stakeholders of Science Communication.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A; Gerasimov, Alexey N; Kostyukova, Elena I; Kitas, George D

    2015-11-01

    Citations to scholarly items are building bricks for multidisciplinary science communication. Citation analyses are currently influencing individual career advancement and ranking of academic and research institutions worldwide. This article overviews the involvement of scientific authors, reviewers, editors, publishers, indexers, and learned associations in the citing and referencing to preserve the integrity of science communication. Authors are responsible for thorough bibliographic searches to select relevant references for their articles, comprehend main points, and cite them in an ethical way. Reviewers and editors may perform additional searches and recommend missing essential references. Publishers, in turn, are in a position to instruct their authors over the citations and references, provide tools for validation of references, and open access to bibliographies. Publicly available reference lists bear important information about the novelty and relatedness of the scholarly items with the published literature. Few editorial associations have dealt with the issue of citations and properly managed references. As a prime example, the International Committee of Medical Journal Editors (ICMJE) issued in December 2014 an updated set of recommendations on the need for citing primary literature and avoiding unethical references, which are applicable to the global scientific community. With the exponential growth of literature and related references, it is critically important to define functions of all stakeholders of science communication in curbing the issue of irrational and unethical citations and thereby improve the quality and indexability of scholarly journals.

  18. Preserving the Integrity of Citations and References by All Stakeholders of Science Communication

    Science.gov (United States)

    Yessirkepov, Marlen; Voronov, Alexander A.; Gerasimov, Alexey N.; Kostyukova, Elena I.; Kitas, George D.

    2015-01-01

    Citations to scholarly items are building bricks for multidisciplinary science communication. Citation analyses are currently influencing individual career advancement and ranking of academic and research institutions worldwide. This article overviews the involvement of scientific authors, reviewers, editors, publishers, indexers, and learned associations in the citing and referencing to preserve the integrity of science communication. Authors are responsible for thorough bibliographic searches to select relevant references for their articles, comprehend main points, and cite them in an ethical way. Reviewers and editors may perform additional searches and recommend missing essential references. Publishers, in turn, are in a position to instruct their authors over the citations and references, provide tools for validation of references, and open access to bibliographies. Publicly available reference lists bear important information about the novelty and relatedness of the scholarly items with the published literature. Few editorial associations have dealt with the issue of citations and properly managed references. As a prime example, the International Committee of Medical Journal Editors (ICMJE) issued in December 2014 an updated set of recommendations on the need for citing primary literature and avoiding unethical references, which are applicable to the global scientific community. With the exponential growth of literature and related references, it is critically important to define functions of all stakeholders of science communication in curbing the issue of irrational and unethical citations and thereby improve the quality and indexability of scholarly journals. PMID:26538996

  19. Serum TSH reference interval in healthy Finnish adults using the Abbott Architect 2000i Analyzer.

    Science.gov (United States)

    Schalin-Jäntti, Camilla; Tanner, Pirjo; Välimäki, Matti J; Hämäläinen, Esa

    2011-07-01

    Current serum TSH reference intervals have been criticized as they were established from unselected background populations. A special concern is that the upper limit, which defines subclinical hypothyroidism, is too high. The objective was to redefine the TSH reference interval in the adult Finnish population. The current reference interval for the widely used Abbott Architect method in Finland is 0.4-4.0 mU/L. Serum TSH and free T4 concentrations were derived from 606 healthy, non-pregnant, 18-91-year-old Finns from the Nordic Reference Interval Project (NORIP) and the possible effects of age, sex and thyroid peroxidase antibody (TPOAb) status were evaluated. After excluding TPOAb-positive subjects and outliers, a reference population of 511 subjects was obtained. In the reference population, no statistically significant gender- or age-specific differences in mean TSH (1.55 ± 3.30 mU/L) or TSH reference intervals were observed. The new reference interval was 0.5-3.6 mU/L (2.5th-97.5th percentiles). The current upper TSH reference limit is 10% too high. A TSH > 3.6 mU/L, confirmed with a repeat TSH sampling, may indicate subclinical hypothyroidism. Differences in ethnicity, regional iodine-intake and analytical methods underline the need for redefining the TSH reference interval in central laboratories in different countries.

  20. Order of 24 October 2011 related to diagnosis reference levels in radiology and nuclear medicine; Arrete du 24 octobre 2011 relatif aux niveaux de reference diagnostiques en radiologie et en medecine nucleaire

    Energy Technology Data Exchange (ETDEWEB)

    Grall, J.Y. [Ministere du travail, de l' emploi et de la sante, Direction generale de la sante, 14, avenue Duquesne, 75350 PARIS 07 SP (France)

    2012-01-14

    This order defines diagnostic reference levels for the most common or most irradiating examinations using ionizing radiation in radiology and nuclear medicine. It also specifies the role of the person authorized to use nuclear medicine equipment and the role of the IRSN in collecting and analysing the data. Doses are specified in the appendix for different types of examinations

  1. The Gaia inertial reference frame and the tilting of the Milky Way disk

    International Nuclear Information System (INIS)

    Perryman, Michael; Spergel, David N.; Lindegren, Lennart

    2014-01-01

    While the precise relationship between the Milky Way disk and the symmetry planes of the dark matter halo remains somewhat uncertain, a time-varying disk orientation with respect to an inertial reference frame seems probable. Hierarchical structure formation models predict that the dark matter halo is triaxial and tumbles with a characteristic rate of ∼2 rad H₀⁻¹ (∼30 μas yr⁻¹). These models also predict a time-dependent accretion of gas, such that the angular momentum vector of the disk should be misaligned with that of the halo. These effects, as well as tidal effects of the LMC, will result in the rotation of the angular momentum vector of the disk population with respect to the quasar reference frame. We assess the accuracy with which the positions and proper motions from Gaia can be referred to a kinematically non-rotating system, and show that the spin vector of the transformation from any rigid self-consistent catalog frame to the quasi-inertial system defined by quasars should be defined to better than 1 μas yr⁻¹. Determination of this inertial frame by Gaia will reveal any signature of the disk orientation varying with time, improve models of the potential and dynamics of the Milky Way, test theories of gravity, and provide new insights into the orbital evolution of the Sagittarius dwarf galaxy and the Magellanic Clouds.

  2. Pediatric reference intervals for general clinical chemistry components - merging of studies from Denmark and Sweden.

    Science.gov (United States)

    Ridefelt, Peter; Hilsted, Linda; Juul, Anders; Hellberg, Dan; Rustad, Pål

    2018-05-28

    Reference intervals are crucial tools aiding clinicians when making medical decisions. However, for children such values often are lacking or incomplete. The present study combines data from separate pediatric reference interval studies of Denmark and Sweden in order to increase sample size and to include also pre-school children who were lacking in the Danish study. Results from two separate studies including 1988 healthy children and adolescents aged 6 months to 18 years were merged and recalculated. Eighteen general clinical chemistry components were measured on Abbott and Roche platforms. To facilitate commutability, the NFKK Reference Serum X was used. Age- and gender-specific pediatric reference intervals were defined by calculating 2.5 and 97.5 percentiles. The data generated are primarily applicable to a Nordic population, but could be used by any laboratory if validated for the local patient population.
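
    The nonparametric 2.5th/97.5th percentile calculation per age and sex partition can be sketched as below; the analyte name, age bands and values are simulated stand-ins, not the Danish/Swedish data.

    ```python
    # Percentile-based reference intervals per age group and sex (simulated data).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    n = 1988
    df = pd.DataFrame({
        "sex": rng.choice(["F", "M"], n),
        "age_group": rng.choice(["0.5-2 y", "2-6 y", "6-12 y", "12-18 y"], n),
        "analyte": rng.normal(4.0, 1.0, n),      # placeholder analyte values
    })

    intervals = (df.groupby(["age_group", "sex"])["analyte"]
                   .quantile([0.025, 0.975])
                   .unstack())
    print(intervals)   # lower and upper reference limits per partition
    ```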

  3. Languages for Software-Defined Networks

    Science.gov (United States)

    2013-02-01

    ... (switches, firewalls, and middleboxes) with closed and proprietary configuration interfaces. Software-Defined Networks (SDN) are poised to change this. ... Recent years, however, have seen growing interest in software-defined networks (SDNs), in which a logically-centralized controller manages the packet-processing...

  4. Identification of dietary patterns associated with obesity in a nationally representative survey of Canadian adults: application of a priori, hybrid, and simplified dietary pattern techniques.

    Science.gov (United States)

    Jessri, Mahsa; Wolfinger, Russell D; Lou, Wendy Y; L'Abbé, Mary R

    2017-03-01

    Background: Analyzing the effects of dietary patterns is an important approach for examining the complex role of nutrition in the etiology of obesity and chronic diseases. Objectives: The objectives of this study were to characterize the dietary patterns of Canadians with the use of a priori, hybrid, and simplified dietary pattern techniques, and to compare the associations of these patterns with obesity risk in individuals with and without chronic diseases (unhealthy and healthy obesity). Design: Dietary recalls from 11,748 participants (≥18 y of age) in the cross-sectional, nationally representative Canadian Community Health Survey 2.2 were used. A priori dietary pattern was characterized with the use of the previously validated 2015 Dietary Guidelines for Americans Adherence Index (DGAI). Weighted partial least squares (hybrid method) was used to derive an energy-dense (ED), high-fat (HF), low-fiber density (LFD) dietary pattern with the use of 38 food groups. The associations of derived dietary patterns with disease outcomes were then tested with the use of multinomial logistic regression. Results: An ED, HF, and LFD dietary pattern had high positive loadings for fast foods, carbonated drinks, and refined grains, and high negative loadings for whole fruits and vegetables (≥|0.17|). Food groups with a high loading were summed to form a simplified dietary pattern score. Moving from the first (healthiest) to the fourth (least healthy) quartiles of the ED, HF, and LFD pattern and the simplified dietary pattern scores was associated with increasingly elevated ORs for unhealthy obesity, with individuals in quartile 4 having an OR of 2.57 (95% CI: 1.75, 3.76) and 2.73 (95% CI: 1.88, 3.98), respectively (P-trend < ...) ... obesity (P-trend < ...). Associations of the dietary patterns with healthy obesity and unhealthy nonobesity were weaker, albeit significant. Conclusions: Consuming an ED, HF, and LFD dietary pattern and lack of adherence to the recommendations of the 2015 DGAI were associated with
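
    As a rough illustration of the hybrid workflow described above (here plain, unweighted PLS rather than the weighted variant the authors used, with simulated data throughout): derive a pattern score from food groups, cut it into quartiles, and relate the quartiles to a categorical outcome with multinomial logistic regression.

    ```python
    # Hybrid dietary-pattern sketch: PLS score -> quartiles -> multinomial logit.
    import numpy as np
    import pandas as pd
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n, n_foods = 1000, 38
    X = rng.random((n, n_foods))            # 38 food-group intakes (simulated)
    Y = rng.random((n, 3))                  # energy density, % fat, fibre density (simulated)

    pls = PLSRegression(n_components=1).fit(X, Y)
    score = pls.transform(X).ravel()        # dietary pattern score
    quartile = pd.qcut(score, 4, labels=False) + 1   # 1 = healthiest ... 4 = least healthy

    outcome = rng.integers(0, 3, n)         # e.g. 0=healthy, 1=healthy obese, 2=unhealthy obese
    clf = LogisticRegression(max_iter=1000).fit(quartile.reshape(-1, 1), outcome)
    print(np.exp(clf.coef_))                # per-quartile odds-ratio-like factors
    ```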

  5. Utilizing a Rapid Prototyping Approach in the Building of a Hypermedia-Based Reference Station.

    Science.gov (United States)

    Sell, Dan

    This paper discusses the building of a hypermedia-based reference station at the Wright Laboratory Technical Library, Wright-Patterson Air Force Base, Ohio. Following this, the paper focuses on an electronic user survey from which data is collected and analysis is made. The survey data is used in a rapid prototyping approach, which is defined as…

  6. An algorithm for finding biologically significant features in microarray data based on a priori manifold learning.

    Directory of Open Access Journals (Sweden)

    Zena M Hira

    Full Text Available Microarray databases are a large source of genetic data, which, upon proper analysis, could enhance our understanding of biology and medicine. Many microarray experiments have been designed to investigate the genetic mechanisms of cancer, and analytical approaches have been applied in order to classify different types of cancer or distinguish between cancerous and non-cancerous tissue. However, microarrays are high-dimensional datasets with high levels of noise and this causes problems when using machine learning methods. A popular approach to this problem is to search for a set of features that will simplify the structure and to some degree remove the noise from the data. The most widely used approach to feature extraction is principal component analysis (PCA), which assumes a multivariate Gaussian model of the data. More recently, non-linear methods have been investigated. Among these, manifold learning algorithms, for example Isomap, aim to project the data from a higher-dimensional space onto a lower-dimensional one. We have proposed a priori manifold learning for finding a manifold in which a representative set of microarray data is fused with relevant data taken from the KEGG pathway database. Once the manifold has been constructed the raw microarray data is projected onto it and clustering and classification can take place. In contrast to earlier fusion based methods, the prior knowledge from the KEGG databases is not used in, and does not bias, the classification process; it merely acts as an aid to find the best space in which to search the data. In our experiments we have found that using our new manifold method gives better classification results than using either PCA or conventional Isomap.
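
    The general pipeline, learning a low-dimensional manifold and then classifying in that space, can be sketched as below using plain Isomap; the KEGG-based fusion that defines the authors' a priori manifold is omitted, and the expression matrix and labels are simulated.

    ```python
    # Manifold embedding followed by classification (conventional Isomap baseline).
    import numpy as np
    from sklearn.manifold import Isomap
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    X = rng.standard_normal((120, 2000))          # 120 samples x 2000 probes (simulated)
    y = rng.integers(0, 2, 120)                   # e.g. tumour vs normal labels

    embedding = Isomap(n_neighbors=10, n_components=5).fit_transform(X)
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), embedding, y, cv=5)
    print(scores.mean())
    ```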

  7. Influence of population selection on the 99th percentile reference value for cardiac troponin assays.

    Science.gov (United States)

    Collinson, Paul O; Heung, Yen Ming; Gaze, David; Boa, Frances; Senior, Roxy; Christenson, Robert; Apple, Fred S

    2012-01-01

    We sought to determine the effect of patient selection on the 99th reference percentile of 2 sensitive and 1 high-sensitivity (hs) cardiac troponin assays in a well-defined reference population. Individuals >45 years old were randomly selected from 7 representative local community practices. Detailed information regarding the participants was collected via questionnaires. The healthy reference population was defined as individuals who had no history of vascular disease, hypertension, or heavy alcohol intake; were not receiving cardiac medication; and had blood pressure below a defined limit, an estimated glomerular filtration rate >60 mL·min⁻¹·(1.73 m²)⁻¹, and normal cardiac function according to results of echocardiography. Samples were stored at -70 °C until analysis for cardiac troponin I (cTnI), cardiac troponin T (cTnT) and N-terminal pro-B-type natriuretic peptide. Application of progressively more stringent population selection strategies to the initial baseline population of 545 participants, until the only individuals who remained were completely healthy according to the study criteria, reduced the number of outliers seen and led to a progressive decrease in the 99th-percentile value obtained for the Roche hs-cTnT assay and the sensitive Beckman cTnI assay, but not for the sensitive Siemens Ultra cTnI assay. Furthermore, a sex difference was found in the baseline population for the hs-cTnT (P=0.0018) and Beckman cTnI assays (P < ...). The population selection strategy significantly influenced the 99th percentile reference values determined for troponin assays and the observed sex differences in troponin concentrations.
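
    The core computation, a nonparametric 99th percentile recomputed as the reference population is made progressively more stringent, is sketched below on simulated troponin values and exclusion flags (none of the numbers reflect the study data).

    ```python
    # 99th-percentile upper reference limit under nested selection criteria (simulated).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)
    n = 545
    df = pd.DataFrame({
        "ctn_ng_l": rng.lognormal(mean=1.5, sigma=0.6, size=n),   # placeholder troponin values
        "no_vascular_disease": rng.random(n) > 0.2,
        "normal_echo": rng.random(n) > 0.3,
    })

    subsets = {
        "all participants": df,
        "no vascular disease": df[df.no_vascular_disease],
        "plus normal echo": df[df.no_vascular_disease & df.normal_echo],
    }
    for name, sub in subsets.items():
        print(name, round(np.percentile(sub.ctn_ng_l, 99), 1))
    ```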

  8. Incidence of hospital referred head injuries in Norway: a population based survey from the Stavanger region

    DEFF Research Database (Denmark)

    Heskestad, Ben; Baardsen, Roald; Helseth, Eirik

    2009-01-01

    ... it with previous Norwegian studies. METHODS: All head injured patients referred to Stavanger University Hospital during a one-year period (2003) were registered in a partly prospective and partly retrospective study. The catchment area for the hospital is strictly defined to a local population of 283...

  9. The role of certified reference materials in material control and accounting

    International Nuclear Information System (INIS)

    Turel, S.P.

    1979-01-01

    One way of providing an adequate material control and accounting system for the nuclear fuel cycle is to calculate material unaccounted for (MUF) after a physical inventory and to compare the limit of error of the MUF value (LEMUF) against prescribed criteria. To achieve a meaningful LEMUF, a programme for the continuing determination of systematic and random errors is necessary. Within this programme it is necessary to achieve traceability of all Special Nuclear Material (SNM) control and accounting measurements to an International/National Measurement System by means of Certified Reference Materials. SNM measurements for control and accounting are made internationally on a great variety of materials using many diverse measurement procedures by a large number of facilities. To achieve valid overall accountability over this great variety of measurements there must be some means of relating all these measurements and their uncertainties to each other. This is best achieved by an International/National Measurement System (IMS/NMS). To this end, all individual measurement systems must be compatible to the IMS/NMS and all measurement results must be traceable to appropriate international/national Primary Certified Reference Materials. To obtain this necessary compatibility for any given SNM measurement system, secondary certified reference materials or working reference materials are needed for every class of SNM and each type of measurement system. Ways to achieve ''traceability'' and the various types of certified reference material are defined and discussed in this paper. (author)

  10. Reference values for anxiety questionnaires: the Leiden Routine Outcome Monitoring Study.

    Science.gov (United States)

    Schulte-van Maaren, Yvonne W M; Giltay, Erik J; van Hemert, Albert M; Zitman, Frans G; de Waal, Margot W M; Carlier, Ingrid V E

    2013-09-25

    The monitoring of patients with an anxiety disorder can benefit from Routine Outcome Monitoring (ROM). As anxiety disorders differ in phenomenology, several anxiety questionnaires are included in ROM: Brief Scale for Anxiety (BSA), PADUA Inventory Revised (PI-R), Panic Appraisal Inventory (PAI), Penn State Worry Questionnaire (PSWQ), Worry Domains Questionnaire (WDQ), Social Interaction, Anxiety Scale (SIAS), Social Phobia Scale (SPS), and the Impact of Event Scale-Revised (IES-R). We aimed to generate reference values for both 'healthy' and 'clinically anxious' populations for these anxiety questionnaires. We included 1295 subjects from the general population (ROM reference-group) and 5066 psychiatric outpatients diagnosed with a specific anxiety disorder (ROM patient-group). The MINI was used as diagnostic device in both the ROM reference group and the ROM patient group. To define limits for one-sided reference intervals (95th percentile; P95) the outermost 5% of observations were used. Receiver Operating Characteristics (ROC) analyses were used to yield alternative cut-off values for the anxiety questionnaires. For the ROM reference-group the mean age was 40.3 years (SD=12.6), and for the ROM patient-group it was 36.5 years (SD=11.9). Females constituted 62.8% of the reference-group and 64.4% of the patient-group. P95 ROM reference group cut-off values for reference versus clinically anxious populations were 11 for the BSA, 43 for the PI-R, 37 for the PAI Anticipated Panic, 47 for the PAI Perceived Consequences, 65 for the PAI Perceived Self-efficacy, 66 for the PSWQ, 74 for the WDQ, 32 for the SIAS, 19 for the SPS, and 36 for IES-R. ROC analyses yielded slightly lower reference values. The discriminative power of all eight anxiety questionnaires was very high. Substantial non-response and limited generalizability. For eight anxiety questionnaires a comprehensive set of reference values was provided. Reference values were generally higher in women than in men
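
    The two cut-off strategies described above, the one-sided 95th percentile of the reference group and an ROC-derived threshold, can be sketched as follows; the ROC cut-off here uses the Youden index, which is an assumption on our part, and all questionnaire scores are simulated.

    ```python
    # P95 reference cut-off versus an ROC-based cut-off (simulated scores).
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(6)
    ref_scores = rng.normal(20, 8, 1295).clip(0)       # reference (non-patient) group
    pat_scores = rng.normal(45, 12, 5066).clip(0)      # patients with an anxiety disorder

    p95_cutoff = np.percentile(ref_scores, 95)         # one-sided reference limit

    scores = np.concatenate([ref_scores, pat_scores])
    labels = np.concatenate([np.zeros_like(ref_scores), np.ones_like(pat_scores)])
    fpr, tpr, thresholds = roc_curve(labels, scores)
    youden_cutoff = thresholds[np.argmax(tpr - fpr)]   # threshold maximizing sensitivity+specificity-1

    print(round(p95_cutoff, 1), round(youden_cutoff, 1))
    ```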

  11. An approach to an acute emotional stress reference scale.

    Science.gov (United States)

    Garzon-Rey, J M; Arza, A; de-la-Camara, C; Lobo, A; Armario, A; Aguilo, J

    2017-06-16

    Clinical diagnosis aims to identify the degree to which the patient's psycho-physical state is affected, as a guide to therapeutic intervention. For stress, the lack of a measurement tool based on a common reference makes it difficult to assess this degree quantitatively. The aim was to define, and perform a primary assessment of, a standard reference for measuring acute emotional stress from the markers identified as indicators of its degree. Psychometric tests and biochemical variables are, in general, the stress measurements most accepted by the scientific community. Each of them probably responds to different and complementary processes involved in the reaction to a stress stimulus. The reference that is proposed is a weighted mean of these indicators, with relative weights assigned in accordance with a principal components analysis. An experimental study was conducted on 40 healthy young people subjected to the psychosocial stress stimulus of the Trier Social Stress Test in order to perform a primary assessment and consistency check of the proposed reference. The proposed scale clearly differentiates between the induced relaxation and stress states. Accepting the subjectivity of the definition and the lack of a subsequent validation with new experimental data, the proposed standard differentiates between a relaxation state and an emotional stress state triggered by a moderate stress stimulus, such as the Trier Social Stress Test. The scale is robust: variations in the percentage composition slightly affect the score, but they do not affect the valid differentiation between states.
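
    The proposed composite, a weighted mean of standardized markers with weights taken from a principal components analysis, can be sketched as below; the number and identity of the markers are placeholders, not the study's marker set.

    ```python
    # PCA-weighted composite score over standardized stress markers (simulated data).
    import numpy as np

    rng = np.random.default_rng(7)
    # Columns could be, e.g., a psychometric score, cortisol, copeptin, heart rate (assumed).
    markers = rng.normal(size=(40, 4))

    z = (markers - markers.mean(axis=0)) / markers.std(axis=0)   # z-score each marker
    # First right singular vector of the centred data = first principal-component loading vector.
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    weights = np.abs(vt[0])
    weights /= weights.sum()

    stress_score = z @ weights        # one weighted-mean score per subject
    print(stress_score[:5].round(2))
    ```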

  12. Defining Plagiarism: A Literature Review

    Directory of Open Access Journals (Sweden)

    Akbar Akbar

    2018-02-01

    Full Text Available Plagiarism has repeatedly occurred in Indonesia, making such academic misbehavior a “central issue” in Indonesian higher education. One of the difficulties in addressing plagiarism in higher education is the confusion over how to define it; Indonesian academics appear to hold different perceptions when defining plagiarism. This article aims to explore the issue by helping to define plagiarism and thereby address this confusion. The article applies a literature review, first identifying databases for literature searching and finding relevant articles; the collected articles were then synthesized before presenting the findings. The study explored the definition of plagiarism in the context of higher education and found that plagiarism is defined in relation to criminal acts: the many discursive features used position plagiaristic acts as illegal deeds. The study also found that cultural backgrounds and exposure to plagiarism were influential in defining plagiarism.

  13. Future Scenarios for Software-Defined Metro and Access Networks and Software-Defined Photonics

    Directory of Open Access Journals (Sweden)

    Tommaso Muciaccia

    2017-01-01

    Full Text Available In recent years, architectures, devices, and components in telecommunication networks have been challenged by evolutionary and revolutionary factors which are drastically changing the traffic features. Most of these changes imply the need for major re-configurability and programmability not only in data-centers and core networks, but also in the metro-access segment. In a wide variety of contexts, this necessity has been addressed by the proposed introduction of the innovative paradigm of software-defined networks (SDNs). Several solutions inspired by the SDN model have recently been proposed also for metro and access networks, where the adoption of a new generation of software-defined reconfigurable integrated photonic devices is highly desirable. In this paper, we review the possible future application scenarios for software-defined metro and access networks and software-defined photonics (SDP), on the basis of analytics, statistics, and surveys. This work describes the reasons underpinning the presented radical change of paradigm and summarizes the most significant solutions proposed in the literature, with a specific emphasis on physical-layer reconfigurable networks and a focus on both architectures and devices.

  14. Enterprise Reference Library

    Science.gov (United States)

    Bickham, Grandin; Saile, Lynn; Havelka, Jacque; Fitts, Mary

    2011-01-01

    Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to those individuals residing onsite or through a librarian s search. Many individuals have rich collections of references, but no mechanisms to share reference libraries across researchers, projects, or directorates exist. Likewise, information regarding which references are provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information The Integrated Medical Model (IMM) team has utilized a centralized reference management tool during the development, test, and operational phases of this project. The Enterprise Reference Library project expands the capabilities developed for IMM to address the above issues and enhance collaboration across JSC. Method: After significant market analysis for a multi-user reference management tool, no available commercial tool was found to meet this need, so a software program was built around a commercial tool, Reference Manager 12 by The Thomson Corporation. A use case approach guided the requirements development phase. The premise of the design is that individuals use their own reference management software and export to SharePoint when their library is incorporated into the Enterprise Reference Library. This results in a searchable user-specific library application. An accompanying share folder will warehouse the electronic full-text articles, which allows the global user community to access full -text articles. Discussion: An enterprise reference library solution can provide a multidisciplinary collection of full text articles. This approach improves efficiency in obtaining and storing reference material while greatly reducing labor, purchasing and

  15. Reference Function Based Spatiotemporal Fuzzy Logic Control Design Using Support Vector Regression Learning

    Directory of Open Access Journals (Sweden)

    Xian-Xia Zhang

    2013-01-01

    Full Text Available This paper presents a reference function based 3D FLC design methodology using support vector regression (SVR) learning. The concept of reference function is introduced to 3D FLC for the generation of 3D membership functions (MF), which enhance the capability of the 3D FLC to cope with more kinds of MFs. The nonlinear mathematical expression of the reference function based 3D FLC is derived, and spatial fuzzy basis functions are defined. By relating spatial fuzzy basis functions of a 3D FLC to kernel functions of an SVR, an equivalence relationship between a 3D FLC and an SVR is established. Therefore, a 3D FLC can be constructed using the learned results of an SVR. Furthermore, the universal approximation capability of the proposed 3D fuzzy system is proven in terms of the finite covering theorem. Finally, the proposed method is applied to a catalytic packed-bed reactor and simulation results have verified its effectiveness.
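
    A rough sketch of the SVR side of this construction: fit an RBF-kernel SVR and read off its support vectors, dual coefficients and bias, which play the role of membership-function centres, rule weights and offset in the equivalent fuzzy system. The training data are synthetic and the full 3D/spatial formulation of the paper is not reproduced here.

    ```python
    # SVR whose learned parameters would seed the equivalent fuzzy system (toy data).
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(8)
    X = rng.uniform(-1, 1, size=(200, 2))              # e.g. (spatial point, input value)
    y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(200)

    svr = SVR(kernel="rbf", C=10.0, gamma=2.0).fit(X, y)

    centres = svr.support_vectors_        # candidate membership-function centres
    weights = svr.dual_coef_.ravel()      # consequent weights of the fuzzy rules
    bias = svr.intercept_[0]
    print(len(centres), round(bias, 3))
    ```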

  16. EFSA Panel on Dietetic Products, Nutrition, and Allergies (NDA); Scientific Opinion on Dietary reference values for water

    DEFF Research Database (Denmark)

    Tetens, Inge

    This Opinion of the EFSA Panel on Dietetic Products, Nutrition, and Allergies (NDA) deals with the setting of dietary reference values for water for specific age groups. Adequate Intakes (AI) have been defined, derived from a combination of observed intakes in population groups with desirable...

  17. To establish trimester-specific reference ranges for glycated haemoglobin (HbA1c) in pregnancy

    LENUS (Irish Health Repository)

    O'Connor, CM

    2011-09-01

    Background and aims: Diabetes in pregnancy imposes additional risks on both mother and infant. These poor outcomes are considered to be primarily related to glycaemic control, which is monitored longitudinally through pregnancy by means of HbA1c. The correlation of HbA1c levels with clinical outcomes emphasises the need to measure HbA1c accurately and precisely, and, for data interpretation, to compare results to appropriately defined reference intervals. From July 1st 2010, the HbA1c assay in Irish laboratories became fully metrologically traceable to the IFCC standard, permitting HbA1c to be reported in IFCC units (mmol/mol) and derived DCCT/NGSP units (%) using the IFCC-DCCT/NGSP master equation (DCCT = Diabetes Control and Complications Trial, NGSP = National Glycohemoglobin Standardisation Program). The aim of this project is to establish trimester-specific reference ranges in pregnancy for IFCC standardised HbA1c in non-diabetic Caucasian women. This will allow us to define the goal for HbA1c during pregnancy complicated by diabetes.
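
    For reference, the commonly published form of the IFCC-DCCT/NGSP master equation mentioned above is NGSP (%) = 0.09148 × IFCC (mmol/mol) + 2.152. The conversion helpers below use that form as an assumption and should be checked against an authoritative source before any clinical use.

    ```python
    # HbA1c unit conversion using the commonly cited master-equation coefficients.
    def ifcc_to_ngsp(ifcc_mmol_mol: float) -> float:
        """IFCC (mmol/mol) to DCCT/NGSP (%)."""
        return 0.09148 * ifcc_mmol_mol + 2.152

    def ngsp_to_ifcc(ngsp_percent: float) -> float:
        """DCCT/NGSP (%) to IFCC (mmol/mol)."""
        return (ngsp_percent - 2.152) / 0.09148

    print(round(ifcc_to_ngsp(42.0), 1))   # 42 mmol/mol is approximately 6.0 %
    ```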

  18. 2002 reference document; Document de reference 2002

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    This 2002 reference document of the group Areva, provides information on the society. Organized in seven chapters, it presents the persons responsible for the reference document and for auditing the financial statements, information pertaining to the transaction, general information on the company and share capital, information on company operation, changes and future prospects, assets, financial position, financial performance, information on company management and executive board and supervisory board, recent developments and future prospects. (A.L.B.)

  19. Biomass Scenario Model Documentation: Data and References

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Y.; Newes, E.; Bush, B.; Peterson, S.; Stright, D.

    2013-05-01

    The Biomass Scenario Model (BSM) is a system dynamics model that represents the entire biomass-to-biofuels supply chain, from feedstock to fuel use. The BSM is a complex model that has been used for extensive analyses; the model and its results can be better understood if input data used for initialization and calibration are well-characterized. It has been carefully validated and calibrated against the available data, with data gaps filled in using expert opinion and internally consistent assumed values. Most of the main data sources that feed into the model are recognized as baseline values by the industry. This report documents data sources and references in Version 2 of the BSM (BSM2), which only contains the ethanol pathway, although subsequent versions of the BSM contain multiple conversion pathways. The BSM2 contains over 12,000 total input values, with 506 distinct variables. Many of the variables are opportunities for the user to define scenarios, while others are simply used to initialize a stock, such as the initial number of biorefineries. However, around 35% of the distinct variables are defined by external sources, such as models or reports. The focus of this report is to provide insight into which sources are most influential in each area of the supply chain.

  20. Metaphorical Singular Reference. The Role of Enriched Composition in Reference Resolution

    Directory of Open Access Journals (Sweden)

    Anne Bezuidenhout

    2008-08-01

    Full Text Available It is widely accepted that, in the course of interpreting a metaphorical utterance, both literal and metaphorical interpretations of the utterance are available to the interpreter, although there may be disagreement about the order in which these interpretations are accessed. I call this the dual availability assumption. I argue that it does not apply in cases of metaphorical singular reference. These are cases in which proper names, complex demonstratives or definite descriptions are used metaphorically; e.g., ‘That festering sore must go’, referring to a derelict house. We are forced to give up dual availability in these cases because a process of predicate transfer happens in the restriction clauses of such metaphorically used definite phrases (DPs), so that a denotation-less definite concept is never constructed. A process of enriched composition yields only a metaphorical referent/denotation. I compare cases of metaphorical reference both to cases of metonymic reference and to uses of epithets of the ‘That N of an N’ form. Reflection on the former is helpful in getting clear about the kind of property transfer involved in referential metaphors. Such transfer happens directly at the level of properties and is not mediated via a correspondence between objects, as is the case with metonymic reference. Reflection on epithets such as ‘that festering sore of a house’ is helpful since these are a sort of intermediate case between cases of literal and metaphorical reference. They provide support for my claim that in cases of metaphorical reference there is only a single referent (the metaphorical one). Moreover, constraints on the use of these epithets suggest that referential metaphors are similarly constrained. In particular, I argue that referential metaphors can only be used when the implicit category restriction (e.g., house in the case of the example ‘That festering sore must go’) is highly salient, and that the evaluative

  1. The Philosophy behind a (Danish) Voice-controlled Interface to Internet Browsing for motor-handicapped

    DEFF Research Database (Denmark)

    Brøndsted, Tom

    2005-01-01

    The public-funded project "Indtal" ("Speak-it") has succeeded in developing a Danish voice-controlled utility for internet browsing targeting motor-handicapped users having difficulties using a standard keyboard and/or a standard mouse. The system is subject to a number of a priori defined design criteria ... with the unimodal oral mode, etc. These criteria have led to a primarily message-driven system interacting with an existing browser on the end users' systems.

  2. About the geometry of the Earth geodetic reference surfaces

    Science.gov (United States)

    Husár, Ladislav; Švaral, Peter; Janák, Juraj

    2017-10-01

    The paper focuses on the comparison of the metrics of the three most common reference surfaces of the Earth used in geodesy (excluding the plane, which is also used as a reference surface in geodesy when dealing with small areas): a sphere, an ellipsoid of revolution and a triaxial ellipsoid. The latter two surfaces are treated in more detail. First, the mathematical form of the metric tensors using three types of coordinates is derived and the lengths of meridian and parallel arcs between the two types of ellipsoids are compared. Three kinds of parallels, according to the type of latitude, can be defined on a triaxial ellipsoid. We show that two types of parallels are spatial curves and one is represented by ellipses. The differences in curvature of both kinds of ellipsoid are analysed using the normal curvature radii. The advantage of the chosen triaxial ellipsoid is documented by its better fit with respect to the high-degree geoid model EIGEN6c4 computed up to degree and order 2160.
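
    For orientation, the standard textbook metric (first fundamental form) of an ellipsoid of revolution in geodetic coordinates, which comparisons of meridian and parallel arc lengths build on, can be written as follows; this is general background, not an expression reproduced from the paper.

    ```latex
    % Metric of an ellipsoid of revolution in geodetic latitude \varphi and longitude \lambda.
    \[
      \mathrm{d}s^{2} = M(\varphi)^{2}\,\mathrm{d}\varphi^{2}
                      + N(\varphi)^{2}\cos^{2}\varphi\,\mathrm{d}\lambda^{2},
      \qquad
      M(\varphi) = \frac{a\,(1-e^{2})}{\bigl(1-e^{2}\sin^{2}\varphi\bigr)^{3/2}},
      \qquad
      N(\varphi) = \frac{a}{\bigl(1-e^{2}\sin^{2}\varphi\bigr)^{1/2}},
    \]
    % where $a$ is the semimajor axis and $e$ the first eccentricity; the sphere is the
    % special case $e = 0$, and the triaxial ellipsoid adds a longitude dependence.
    ```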

  3. Reference nano-dimensional metrology by scanning transmission electron microscopy

    International Nuclear Information System (INIS)

    Dai, Gaoliang; Fluegge, Jens; Bosse, Harald; Heidelmann, Markus; Kübel, Christian; Prang, Robby

    2013-01-01

    Traceable and accurate reference dimensional metrology of nano-structures by scanning transmission electron microscopy (STEM) is introduced in the paper. Two methods, one based on the crystal lattice constant and the other based on the pitch of a feature pair, were applied to calibrate the TEM magnification. The threshold value, which was defined as the half-intensity of the boundary materials, is suggested to extract the boundary position of features from the TEM image. Experimental investigations have demonstrated the high potential of the proposed methods. For instance, the standard deviation from ten repeated measurements of a line structure with a nominal 100 nm critical dimension (CD) reaches 1σ = 0.023 nm, about 0.02%. Using intentionally introduced defocus and larger sample alignment errors, the investigation shows that these influences may reach 0.20 and 1.3 nm, respectively, indicating the importance of high-quality TEM measurements. Finally, a strategy for disseminating the destructive TEM results is introduced. Using this strategy, the CD of a reference material has been accurately determined. Its agreement over five independent TEM measurements is below 1.2 nm. (paper)
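
    The half-intensity criterion mentioned above can be illustrated on a one-dimensional line profile taken perpendicular to a feature edge. The sketch below is only an assumed, minimal implementation of that idea (the function name, plateau windows and synthetic sigmoid edge are invented for illustration), not the calibration code used in the study.

        import numpy as np

        def half_intensity_edge(profile, i_lo, i_hi):
            """Sub-pixel boundary position in a 1-D intensity profile crossing one edge.

            profile : intensities sampled along a line perpendicular to the edge
            i_lo, i_hi : slices sampling the two bulk materials on either side

            The boundary is placed where the profile crosses the mean of the two
            plateau intensities (the half-intensity threshold), using linear
            interpolation between the two samples bracketing the crossing.
            """
            p = np.asarray(profile, dtype=float)
            lo, hi = p[i_lo].mean(), p[i_hi].mean()
            half = 0.5 * (lo + hi)
            s = np.sign(p - half)
            k = np.nonzero(np.diff(s) != 0)[0][0]          # first crossing of the threshold
            return k + (half - p[k]) / (p[k + 1] - p[k])   # linear sub-pixel interpolation

        # synthetic edge between intensities 10 and 30, true position at x = 0.40
        x = np.linspace(0.0, 1.0, 101)
        prof = 10 + 20 / (1 + np.exp(-(x - 0.4) / 0.02))
        pos = half_intensity_edge(prof, slice(0, 20), slice(80, 101))
        print(pos * (x[1] - x[0]))                         # ~0.40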

  4. Post-Newtonian reference ellipsoid for relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Han, Wenbiao; Mazurova, Elena

    2016-02-01

    We apply general relativity to construct the post-Newtonian background manifold that serves as a reference spacetime in relativistic geodesy for conducting a relativistic calculation of the geoid's undulation and the deflection of the plumb line from the vertical. We chose an axisymmetric ellipsoidal body made up of a perfect homogeneous fluid uniformly rotating around a fixed axis, as a source generating the reference geometry of the background manifold through Einstein's equations. We then reformulate and extend hydrodynamic calculations of rotating fluids done by a number of previous researchers for astrophysical applications to the realm of relativistic geodesy to set up algebraic equations defining the shape of the post-Newtonian reference ellipsoid. To complete this task, we explicitly perform all integrals characterizing gravitational field potentials inside the fluid body and represent them in terms of the elementary functions depending on the eccentricity of the ellipsoid. We fully explore the coordinate (gauge) freedom of the equations describing the post-Newtonian ellipsoid and demonstrate that the fractional deviation of the post-Newtonian level surface from the Maclaurin ellipsoid can be made much smaller than the previously anticipated estimate based on the astrophysical application of the coordinate gauge advocated by Bardeen and Chandrasekhar. We also derive the gauge-invariant relations of the post-Newtonian mass and the constant angular velocity of the rotating fluid with the parameters characterizing the shape of the post-Newtonian ellipsoid including its eccentricity, a semiminor axis, and a semimajor axis. We formulate the post-Newtonian theorems of Pizzetti and Clairaut that are used in geodesy to connect the geometric parameters of the reference ellipsoid to the physically measurable force of gravity at the pole and equator of the ellipsoid. Finally, we expand the post-Newtonian geodetic equations describing the post-Newtonian ellipsoid to

  5. A Methodology to Define Flood Resilience

    Science.gov (United States)

    Tourbier, J.

    2012-04-01

    Flood resilience has become an internationally used term with an ever-increasing number of entries on the Internet. The SMARTeST Project is looking at approaches to flood resilience through case studies at cities in various countries, including Washington D.C. in the United States. In light of U.S. experiences a methodology is being proposed by the author that is intended to meet ecologic, spatial, structural, social, disaster relief and flood risk aspects. It concludes that: "Flood resilience combines (1) spatial, (2) structural, (3) social, and (4) risk management levels of flood preparedness." Flood resilience should incorporate all four levels, but not necessarily with equal emphasis. Stakeholders can assign priorities within different flood resilience levels and the considerations they contain, dividing 100% emphasis into four levels. This evaluation would be applied to planned and completed projects, considering existing conditions, goals and concepts. We have long known that the "road to market" for the implementation of flood resilience is linked to capacity building of stakeholders. It is a multidisciplinary enterprise, involving the integration of all the above aspects into the decision-making process. Traditional flood management has largely been influenced by what in the UK has been called "Silo Thinking", involving constituent organizations that are responsible for different elements, and are interested only in their defined part of the system. This barrier to innovation also has been called the "entrapment effect". Flood resilience is being defined as (1) SPATIAL FLOOD RESILIENCE implying the management of land by floodplain zoning, urban greening and management to reduce storm runoff through depression storage and by practicing Sustainable Urban Drainage (SUD's), Best Management Practices (BMP's), or Low Impact Development (LID). Ecologic processes and cultural elements are included. (2) STRUCTURAL FLOOD RESILIENCE referring to permanent flood defense

  6. Geologic utility of improved orbital measurement capabilities in reference to non-renewable resources

    Science.gov (United States)

    Stewart, H.; Marsh, S.

    1982-01-01

    Spectral and spatial characteristics necessary for future orbital remote sensing systems are defined. The conclusions are based on the past decade of experience in exploring for non-renewable resources with reference to data from ground, aircraft, and orbital systems. Two principal areas of investigation are used in the discussion: a structural interpretation in a basin area for hydrocarbon exploration, and a discrimination of altered areas in the Cuprite district in Nevada.

  7. Location memory biases reveal the challenges of coordinating visual and kinesthetic reference frames

    Science.gov (United States)

    Simmering, Vanessa R.; Peterson, Clayton; Darling, Warren; Spencer, John P.

    2008-01-01

    Five experiments explored the influence of visual and kinesthetic/proprioceptive reference frames on location memory. Experiments 1 and 2 compared visual and kinesthetic reference frames in a memory task using visually-specified locations and a visually-guided response. When the environment was visible, results replicated previous findings of biases away from the midline symmetry axis of the task space, with stability for targets aligned with this axis. When the environment was not visible, results showed some evidence of bias away from a kinesthetically-specified midline (trunk anterior–posterior [a–p] axis), but there was little evidence of stability when targets were aligned with body midline. This lack of stability may reflect the challenges of coordinating visual and kinesthetic information in the absence of an environmental reference frame. Thus, Experiments 3–5 examined kinesthetic guidance of hand movement to kinesthetically-defined targets. Performance in these experiments was generally accurate with no evidence of consistent biases away from the trunk a–p axis. We discuss these results in the context of the challenges of coordinating reference frames within versus between multiple sensori-motor systems. PMID:17703284

  8. District-level hospital trauma care audit filters: Delphi technique for defining context-appropriate indicators for quality improvement initiative evaluation in developing countries.

    Science.gov (United States)

    Stewart, Barclay T; Gyedu, Adam; Quansah, Robert; Addo, Wilfred Larbi; Afoko, Akis; Agbenorku, Pius; Amponsah-Manu, Forster; Ankomah, James; Appiah-Denkyira, Ebenezer; Baffoe, Peter; Debrah, Sam; Donkor, Peter; Dorvlo, Theodor; Japiong, Kennedy; Kushner, Adam L; Morna, Martin; Ofosu, Anthony; Oppong-Nketia, Victor; Tabiri, Stephen; Mock, Charles

    2016-01-01

    Prospective clinical audit of trauma care improves outcomes for the injured in high-income countries (HICs). However, equivalent, context-appropriate audit filters for use in low- and middle-income country (LMIC) district-level hospitals have not been well established. We aimed to develop context-appropriate trauma care audit filters for district-level hospitals in Ghana, as well as in other LMICs more broadly. Consensus on trauma care audit filters was built among twenty panellists using a Delphi technique with four anonymous, iterative surveys designed to elicit: (i) trauma care processes to be measured; (ii) important features of audit filters for the district-level hospital setting; and (iii) potentially useful filters. Filters were ranked on a scale from 0 to 10 (10 being very useful). Consensus was measured with average percent majority opinion (APMO) cut-off rate. Target consensus was defined a priori as: a median rank of ≥9 for each filter and an APMO cut-off rate of ≥0.8. Panellists agreed on trauma care processes to target (e.g. triage, phases of trauma assessment, early referral if needed) and specific features of filters for district-level hospital use (e.g. simplicity, unassuming of resource capacity). APMO cut-off rate increased successively: Round 1--0.58; Round 2--0.66; Round 3--0.76; and Round 4--0.82. After Round 4, target consensus on 22 trauma care and referral-specific filters was reached. Example filters include: triage--vital signs are recorded within 15 min of arrival (must include breathing assessment, heart rate, blood pressure, oxygen saturation if available); circulation--a large bore IV was placed within 15 min of patient arrival; referral--if referral is activated, the referring clinician and receiving facility communicate by phone or radio prior to transfer. This study proposes trauma care audit filters appropriate for LMIC district-level hospitals. Given the successes of similar filters in HICs and obstetric care filters in LMICs
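
    A consensus round of this kind can be screened programmatically. The sketch below assumes a simple operationalisation of APMO (the average, over filters, of the fraction of panellists siding with the majority opinion when ratings are split at the target rank); the paper's exact computation, the function names and the toy ratings are all assumptions made for illustration.

        import numpy as np

        def apmo(ratings, high=9):
            """Average percent majority opinion across candidate filters.

            ratings : 2-D array, rows = filters, columns = panellists, values 0-10.
            Panellists are split into 'high' (>= high) and 'low' opinions per filter;
            the majority fraction is recorded for each filter and then averaged.
            """
            r = np.asarray(ratings)
            high_frac = (r >= high).mean(axis=1)
            return float(np.maximum(high_frac, 1.0 - high_frac).mean())

        def consensus_filters(ratings, high=9, apmo_cutoff=0.8):
            """Indices of filters meeting the a priori target: round-level APMO
            >= apmo_cutoff and a median rank >= high for the individual filter."""
            r = np.asarray(ratings)
            if apmo(r, high) < apmo_cutoff:
                return []                                  # round has not reached consensus
            return list(np.nonzero(np.median(r, axis=1) >= high)[0])

        # toy round: 5 candidate filters rated by 20 panellists
        ratings = np.random.default_rng(0).integers(7, 11, size=(5, 20))
        print(round(apmo(ratings), 2), consensus_filters(ratings))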

  9. Uranium reference materials

    International Nuclear Information System (INIS)

    Donivan, S.; Chessmore, R.

    1987-07-01

    The Technical Measurements Center has prepared uranium mill tailings reference materials for use by remedial action contractors and cognizant federal and state agencies. Four materials were prepared with varying concentrations of radionuclides, using three tailings materials and a river-bottom soil diluent. All materials were ground, dried, and blended thoroughly to ensure homogeneity. The analyses on which the recommended values for nuclides in the reference materials are based were performed, using independent methods, by the UNC Geotech (UNC) Chemistry Laboratory, Grand Junction, Colorado, and by C.W. Sill (Sill), Idaho National Engineering Laboratory, Idaho Falls, Idaho. Several statistical tests were performed on the analytical data to characterize the reference materials. Results of these tests reveal that the four reference materials are homogeneous and that no large systematic bias exists between the analytical methods used by Sill and those used by TMC. The average values for radionuclides of the two data sets, representing an unbiased estimate, were used as the recommended values for concentrations of nuclides in the reference materials. The recommended concentrations of radionuclides in the four reference materials are provided. Use of these reference materials will aid in providing uniform standardization among measurements made by remedial action contractors. 11 refs., 9 tabs

  10. Prevalence of thinness in children and adolescents in the Seychelles: comparison of two international growth references.

    Science.gov (United States)

    Bovet, Pascal; Kizirian, Nathalie; Madeleine, George; Blössner, Monika; Chiolero, Arnaud

    2011-06-09

    Thinness in children and adolescents is largely under studied, a contrast with abundant literature on under-nutrition in infants and on overweight in children and adolescents. The aim of this study is to compare the prevalence of thinness using two recently developed growth references, among children and adolescents living in the Seychelles, an economically rapidly developing country in the African region. Weight and height were measured every year in all children of 4 grades (age range: 5 to 16 years) of all schools in the Seychelles as part of a routine school-based surveillance program. In this study we used data collected in 16,672 boys and 16,668 girls examined from 1998 to 2004. Thinness was estimated according to two growth references: i) an international survey (IS), defining three grades of thinness corresponding to a BMI of 18.5, 17.0 and 16.0 kg/m2 at age 18 and ii) the WHO reference, defined here as three categories of thinness (-1, -2 and -3 SD of BMI for age) with the second and third named "thinness" and "severe thinness", respectively. The prevalence of thinness was 21.4%, 6.4% and 2.0% based on the three IS cut-offs and 27.7%, 6.7% and 1.2% based on the WHO cut-offs. The prevalence of thinness categories tended to decrease according to age for both sexes for the IS reference and among girls for the WHO reference. The prevalence of the first category of thinness was larger with the WHO cut-offs than with the IS cut-offs while the prevalence of thinness of "grade 2" and thinness of "grade 3" (IS cut-offs) was similar to the prevalence of "thinness" and "severe thinness" (WHO cut-offs), respectively.
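
    Once age- and sex-specific cut-offs have been looked up in the chosen growth reference, the prevalence estimates above reduce to counting children whose BMI falls below each cut-off within a stratum. The sketch below is only illustrative: the function name, the single-stratum simplification and the numeric cut-offs are invented, and a real analysis would take the cut-offs from the published IS or WHO tables.

        import numpy as np

        def thinness_prevalence(bmi, cutoffs):
            """Percentage of children below each thinness cut-off in one age/sex stratum.

            bmi     : array of BMI values (kg/m2) for children in the stratum
            cutoffs : dict mapping a grade label to the stratum-specific BMI cut-off
            """
            bmi = np.asarray(bmi, dtype=float)
            return {grade: round(100.0 * float(np.mean(bmi < c)), 1)
                    for grade, c in cutoffs.items()}

        # illustrative stratum with made-up cut-offs (not the published values)
        bmi_sample = np.random.default_rng(1).normal(18.5, 2.5, size=500)
        is_like_cutoffs = {"grade 1": 16.9, "grade 2": 15.8, "grade 3": 14.9}
        print(thinness_prevalence(bmi_sample, is_like_cutoffs))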

  11. Prevalence of thinness in children and adolescents in the Seychelles: comparison of two international growth references

    Directory of Open Access Journals (Sweden)

    Madeleine George

    2011-06-01

    Abstract Background Thinness in children and adolescents is largely under studied, a contrast with abundant literature on under-nutrition in infants and on overweight in children and adolescents. The aim of this study is to compare the prevalence of thinness using two recently developed growth references, among children and adolescents living in the Seychelles, an economically rapidly developing country in the African region. Methods Weight and height were measured every year in all children of 4 grades (age range: 5 to 16 years) of all schools in the Seychelles as part of a routine school-based surveillance program. In this study we used data collected in 16,672 boys and 16,668 girls examined from 1998 to 2004. Thinness was estimated according to two growth references: i) an international survey (IS), defining three grades of thinness corresponding to a BMI of 18.5, 17.0 and 16.0 kg/m2 at age 18 and ii) the WHO reference, defined here as three categories of thinness (-1, -2 and -3 SD of BMI for age) with the second and third named "thinness" and "severe thinness", respectively. Results The prevalence of thinness was 21.4%, 6.4% and 2.0% based on the three IS cut-offs and 27.7%, 6.7% and 1.2% based on the WHO cut-offs. The prevalence of thinness categories tended to decrease according to age for both sexes for the IS reference and among girls for the WHO reference. Conclusion The prevalence of the first category of thinness was larger with the WHO cut-offs than with the IS cut-offs while the prevalence of thinness of "grade 2" and thinness of "grade 3" (IS cut-offs) was similar to the prevalence of "thinness" and "severe thinness" (WHO cut-offs), respectively.

  12. The human plasma-metabolome: Reference values in 800 French healthy volunteers; impact of cholesterol, gender and age.

    Science.gov (United States)

    Trabado, Séverine; Al-Salameh, Abdallah; Croixmarie, Vincent; Masson, Perrine; Corruble, Emmanuelle; Fève, Bruno; Colle, Romain; Ripoll, Laurent; Walther, Bernard; Boursier-Neyret, Claire; Werner, Erwan; Becquemont, Laurent; Chanson, Philippe

    2017-01-01

    Metabolomic approaches are increasingly used to identify new disease biomarkers, yet normal values of many plasma metabolites remain poorly defined. The aim of this study was to define the "normal" metabolome in healthy volunteers. We included 800 French volunteers aged between 18 and 86, equally distributed according to sex, free of any medication and considered healthy on the basis of their medical history, clinical examination and standard laboratory tests. We quantified 185 plasma metabolites, including amino acids, biogenic amines, acylcarnitines, phosphatidylcholines, sphingomyelins and hexose, using tandem mass spectrometry with the Biocrates AbsoluteIDQ p180 kit. Principal components analysis was applied to identify the main factors responsible for metabolome variability and orthogonal projection to latent structures analysis was employed to confirm the observed patterns and identify pattern-related metabolites. We established a plasma metabolite reference dataset for 144/185 metabolites. Total blood cholesterol, gender and age were identified as the principal factors explaining metabolome variability. High total blood cholesterol levels were associated with higher plasma sphingomyelins and phosphatidylcholines concentrations. Compared to women, men had higher concentrations of creatinine, branched-chain amino acids and lysophosphatidylcholines, and lower concentrations of sphingomyelins and phosphatidylcholines. Elderly healthy subjects had higher sphingomyelins and phosphatidylcholines plasma levels than young subjects. We established reference human metabolome values in a large and well-defined population of French healthy volunteers. This study provides an essential baseline for defining the "normal" metabolome and its main sources of variation.

  13. Mining of hospital laboratory information systems: a model study defining age- and gender-specific reference intervals and trajectories for plasma creatinine in a pediatric population.

    Science.gov (United States)

    Søeby, Karen; Jensen, Peter Bjødstrup; Werge, Thomas; Sørensen, Steen

    2015-09-01

    The knowledge of physiological fluctuation and variation of even commonly used biochemical quantities in extreme age groups and during development is sparse. This challenges the clinical interpretation and utility of laboratory tests in these age groups. To explore the utility of hospital laboratory data as a source of information, we analyzed enzymatic plasma creatinine as a model analyte in two large pediatric hospital samples. Plasma creatinine measurements from 9700 children aged 0-18 years were obtained from hospital laboratory databases and partitioned into high-resolution gender- and age-groups. Normal probability plots were used to deduce parameters of the normal distributions from healthy creatinine values in the mixed hospital datasets. Furthermore, temporal trajectories were generated from repeated measurements to examine developmental patterns in periods of changing creatinine levels. Creatinine shows great age dependence from birth throughout childhood. We computed and replicated 95% reference intervals in narrow gender and age bins and showed them to be comparable to those determined in healthy population studies. We identified pronounced transitions in creatinine levels at different time points after birth and around the early teens, which challenges the establishment and usefulness of reference intervals in those age groups. The study documents that hospital laboratory data may inform on the developmental aspects of creatinine, on periods with pronounced heterogeneity and valid reference intervals. Furthermore, part of the heterogeneity in creatinine distribution is likely due to differences in biological and chronological age of children and should be considered when using age-specific reference intervals.
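
    The probability-plot approach described here amounts to fitting a straight line to the central part of a normal Q-Q plot, on the assumption that the middle of the mixed hospital distribution is dominated by healthy results. The sketch below is a generic illustration of that idea (the function name, central quantile range, 1.96 multiplier and toy data are assumptions), not the pipeline used in the study.

        import numpy as np
        from scipy import stats

        def ri_from_probability_plot(values, central=(0.25, 0.75)):
            """Estimate a 95% reference interval from a mixed hospital dataset.

            A normal probability (Q-Q) plot is built from all values; a straight line
            is fitted only to the central quantile range, and its slope and intercept
            are taken as the SD and mean of the underlying 'healthy' distribution.
            """
            v = np.sort(np.asarray(values, dtype=float))
            n = v.size
            p = (np.arange(1, n + 1) - 0.5) / n           # plotting positions
            z = stats.norm.ppf(p)                         # theoretical normal quantiles
            keep = (p >= central[0]) & (p <= central[1])
            sd, mean = np.polyfit(z[keep], v[keep], 1)    # v ~ mean + sd * z
            return mean - 1.96 * sd, mean + 1.96 * sd

        # toy stratum: mostly healthy values plus a small pathological tail (made up)
        rng = np.random.default_rng(2)
        values = np.concatenate([rng.normal(45, 8, 950), rng.normal(120, 40, 50)])
        print(ri_from_probability_plot(values))           # roughly (29, 61)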

  14. Invited commentary: the incremental value of customization in defining abnormal fetal growth status.

    Science.gov (United States)

    Zhang, Jun; Sun, Kun

    2013-10-15

    Reference tools based on birth weight percentiles at a given gestational week have long been used to define fetuses or infants that are small or large for their gestational ages. However, important deficiencies of the birth weight reference are being increasingly recognized. Overwhelming evidence indicates that an ultrasonography-based fetal weight reference should be used to classify fetal and newborn sizes during pregnancy and at birth, respectively. Questions have been raised as to whether further adjustments for race/ethnicity, parity, sex, and maternal height and weight are helpful to improve the accuracy of the classification. In this issue of the Journal, Carberry et al. (Am J Epidemiol. 2013;178(8):1301-1308) show that adjustment for race/ethnicity is useful, but that additional fine tuning for other factors (i.e., full customization) in the classification may not further improve the ability to predict infant morbidity, mortality, and other fetal growth indicators. Thus, the theoretical advantage of full customization may have limited incremental value for pediatric outcomes, particularly in term births. Literature on the prediction of short-term maternal outcomes and very long-term outcomes (adult diseases) is too scarce to draw any conclusions. Given that each additional variable being incorporated in the classification scheme increases complexity and costs in practice, the clinical utility of full customization in obstetric practice requires further testing.

  15. Reference Assessment

    Science.gov (United States)

    Bivens-Tatum, Wayne

    2006-01-01

    This article presents interesting articles that explore several different areas of reference assessment, including practical case studies and theoretical articles that address a range of issues such as librarian behavior, patron satisfaction, virtual reference, or evaluation design. They include: (1) "Evaluating the Quality of a Chat Service"…

  16. The Egyptian geomagnetic reference field to the Epoch, 2010.0

    Science.gov (United States)

    Deebes, H. A.; Abd Elaal, E. M.; Arafa, T.; Lethy, A.; El Emam, A.; Ghamry, E.; Odah, H.

    2017-06-01

    The present work is a compilation of two tasks within the frame of the project "Geomagnetic Survey & Detailed Geomagnetic Measurements within the Egyptian Territory", funded by the Science and Technology Development Fund agency (STDF). The National Research Institute of Astronomy and Geophysics (NRIAG) has conducted a new extensive land geomagnetic survey that covers the whole Egyptian territory. The field measurements have been made at 3212 points along all the asphalted roads, defined tracks, and ill-defined tracks in Egypt, with a total length of 11,586 km. In the present work, the measurements cover for the first time new areas such as: the south-eastern borders of Egypt including Halayeb and Shlatin, the Qattara depression in the western desert, and the new roads between the Farafra and Baharia oases. In addition, a marine geomagnetic survey has been applied for the first time in Lake Nasser. The Misallat and Abu-Simble geomagnetic observatories have been used to reduce the field data to the epoch 2010. During the field measurements, whenever possible, the old stations occupied by previous observers were re-occupied to determine the secular variations at these points. The geomagnetic anomaly maps, the normal geomagnetic field maps with their corresponding secular variation maps, and the normal geomagnetic field equations of the geomagnetic elements (EGRF) with their corresponding secular variation equations are outlined. The anomalous sites, as discovered from the anomaly maps, are only mentioned. In addition, a correlation between the International Geomagnetic Reference Field (IGRF) 2010.0 and the Egyptian Geomagnetic Reference Field (EGRF) 2010 is indicated.

  17. Constructing Episodes of Inpatient Care: How to Define Hospital Transfer in Hospital Administrative Health Data?

    Science.gov (United States)

    Peng, Mingkai; Li, Bing; Southern, Danielle A; Eastwood, Cathy A; Quan, Hude

    2017-01-01

    Hospital administrative health data create separate records for each hospital stay of patients. Treating a hospital transfer as a readmission could lead to biased results in health service research. This is a cross-sectional study. We used the hospital discharge abstract database in 2013 from Alberta, Canada. Transfer cases were defined by transfer institution code and were used as the reference standard. Four time gaps between 2 hospitalizations (6, 9, 12, and 24 h) and 2 day gaps between hospitalizations [same day (up to 24 h), ≤1 d (up to 48 h)] were used to identify transfer cases. We compared the sensitivity and positive predictive value (PPV) of 6 definitions across different categories of sex, age, and location of residence. Readmission rates within 30 days were compared after episodes of care were defined at the different time gaps. Among the 6 definitions, sensitivity ranged from 93.3% to 98.7% and PPV ranged from 86.4% to 96%. The time gap of 9 hours had the optimal balance of sensitivity and PPV. The time gaps of same day (up to 24 h) and 9 hours had comparable 30-day readmission rates as the transfer indicator after defining episode of care. We recommend the use of a time gap of 9 hours between 2 hospitalizations to define hospital transfer in inpatient databases. When admission or discharge time is not available in the database, a time gap of same day (up to 24 h) can be used to define hospital transfer.
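
    The recommended rule is easy to apply to a discharge abstract table once admission and discharge timestamps are available. The sketch below is a generic illustration in pandas; the column names and the toy records are assumptions, not the Alberta discharge abstract database schema.

        import pandas as pd

        def flag_transfers(stays, gap_hours=9):
            """Flag a hospitalization as a transfer when it starts within
            `gap_hours` of the same patient's previous discharge."""
            s = stays.sort_values(["patient_id", "admit_time"]).copy()
            prev_discharge = s.groupby("patient_id")["discharge_time"].shift()
            gap = s["admit_time"] - prev_discharge
            s["is_transfer"] = gap.notna() & (gap <= pd.Timedelta(hours=gap_hours))
            return s

        # minimal example: patient 1 is transferred 4 h after discharge, patient 2 is readmitted days later
        stays = pd.DataFrame({
            "patient_id": [1, 1, 2, 2],
            "admit_time": pd.to_datetime(
                ["2013-01-01 08:00", "2013-01-05 14:00", "2013-02-01 09:00", "2013-02-10 09:00"]),
            "discharge_time": pd.to_datetime(
                ["2013-01-05 10:00", "2013-01-09 12:00", "2013-02-03 18:00", "2013-02-12 10:00"]),
        })
        print(flag_transfers(stays)[["patient_id", "admit_time", "is_transfer"]])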

  18. Realization of a tilted reference wave for electron holography by means of a condenser biprism

    Energy Technology Data Exchange (ETDEWEB)

    Röder, Falk, E-mail: Falk.Roeder@tu-dresden.de [Triebenberg Labor, Institut für Strukturphysik, Technische Universität Dresden, D-01062 Dresden (Germany); CEMES-CNRS and Université de Toulouse, 29 rue Jeanne Marvig, F-31055 Toulouse (France); Houdellier, Florent; Denneulin, Thibaud; Snoeck, Etienne; Hÿtch, Martin [CEMES-CNRS and Université de Toulouse, 29 rue Jeanne Marvig, F-31055 Toulouse (France)

    2016-02-15

    As proposed recently, a tilted reference wave in off-axis electron holography is expected to be useful for aberration measurement and correction. Furthermore, in dark-field electron holography, it is considered to replace the reference wave, which is conventionally diffracted in an unstrained object area, by a well-defined object-independent reference wave. Here, we first realize a tilted reference wave by employing a biprism placed in the condenser system above three condenser lenses producing a relative tilt magnitude up to 20/nm at the object plane (300 kV). Paraxial ray-tracing predicts condenser settings for a parallel illumination at the object plane, where only one half of the round illumination disc is tilted relative to the optical axis without displacement. Holographic measurements verify the kink-like phase modulation of the incident beam and return the interference fringe contrast as a function of the relative tilt between both parts of the illumination. Contrast transfer theory including condenser aberrations and biprism instabilities was applied to explain the fringe contrast measurement. A first dark-field hologram with a tilted – object-free – reference wave was acquired and reconstructed. A new application for bright/dark-field imaging is presented.

  19. Can natural variability trigger effects on fish and fish habitat as defined in environment Canada's metal mining environmental effects monitoring program?

    Science.gov (United States)

    Mackey, Robin; Rees, Cassandra; Wells, Kelly; Pham, Samantha; England, Kent

    2013-01-01

    The Metal Mining Effluent Regulations (MMER) took effect in 2002 and require most metal mining operations in Canada to complete environmental effects monitoring (EEM) programs. An "effect" under the MMER EEM program is considered any positive or negative statistically significant difference in fish population, fish usability, or benthic invertebrate community EEM-defined endpoints. Two consecutive studies with the same statistically significant differences trigger more intensive monitoring, including the characterization of extent and magnitude and investigation of cause. Standard EEM study designs do not require multiple reference areas or preexposure sampling, thus results and conclusions about mine effects are highly contingent on the selection of a near perfect reference area and are at risk of falsely labeling natural variation as mine related "effects." A case study was completed to characterize the natural variability in EEM-defined endpoints during preexposure or baseline conditions. This involved completing a typical EEM study in future reference and exposure lakes surrounding a proposed uranium (U) mine in northern Saskatchewan, Canada. Moon Lake was sampled as the future exposure area as it is currently proposed to receive effluent from the U mine. Two reference areas were used: Slush Lake for both the fish population and benthic invertebrate community surveys and Lake C as a second reference area for the benthic invertebrate community survey. Moon Lake, Slush Lake, and Lake C are located in the same drainage basin in close proximity to one another. All 3 lakes contained similar water quality, fish communities, aquatic habitat, and a sediment composition largely comprised of fine-textured particles. The fish population survey consisted of a nonlethal northern pike (Esox lucius) and a lethal yellow perch (Perca flavescens) survey. A comparison of the 5 benthic invertebrate community effect endpoints, 4 nonlethal northern pike population effect endpoints

  20. Valores de referência para carboxiemoglobina Reference values for carboxyhemoglobin

    Directory of Open Access Journals (Sweden)

    Maria Elisa P. B. de Siqueira

    1997-12-01

    INTRODUCTION: The reference values of biological indicators are used as parameters for interpreting results obtained in individuals occupationally exposed to chemical agents. The Brazilian Group for the Establishment of Reference Values has dedicated itself to these determinations with the aim of establishing reference values for different bioindicators in several regions of the country. Reference values for carboxyhemoglobin (COHb) were determined in the south of Minas Gerais. MATERIAL AND METHOD: COHb was analysed by a spectrophotometric method optimised in the toxicological analysis laboratory. In all samples, some biochemical and haematological parameters were also analysed to attest to the health status of the population, which consisted of 200 volunteers who were non-smokers and not occupationally exposed to carbon monoxide. Each individual answered a questionnaire to collect data relevant to the interpretation of the results. The reference values were expressed as the mean ± standard deviation, the 95% confidence interval and the upper reference value. The statistical distribution of the results was determined to allow comparison with groups of workers, preferably rather than individual assessment. RESULTS AND CONCLUSIONS: The mean ± standard deviation for carboxyhemoglobin was 1.0% ± 0.75; the 95% confidence interval was between 0.9 and 1.1%, and the upper reference value was 2.5%. Using Student's t-test (p ...

  1. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is worthwhile to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
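
    The task-graph model sketched in this abstract can be mimicked in a few lines: generate an acyclic graph with random node weights and random edges, then split a topological order into pipeline stages of roughly equal total weight. Everything below (function names, edge probability, the greedy stage-balancing heuristic) is an assumed toy illustration, not the mapping approach evaluated in the paper.

        import random

        def random_dag(n, p=0.3, seed=0):
            """Toy task graph: nodes 0..n-1 with random processing weights; an edge
            i -> j is only allowed for i < j, which guarantees acyclicity."""
            rng = random.Random(seed)
            edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]
            weights = {i: rng.uniform(1.0, 10.0) for i in range(n)}
            return edges, weights

        def pipeline_stages(weights, n_stages):
            """Greedy split of the (already topological) node order 0..n-1 into
            contiguous pipeline stages with roughly balanced total weight."""
            target = sum(weights.values()) / n_stages
            stages, current, acc = [], [], 0.0
            for node in sorted(weights):
                current.append(node)
                acc += weights[node]
                if acc >= target and len(stages) < n_stages - 1:
                    stages.append(current)
                    current, acc = [], 0.0
            stages.append(current)
            return stages

        edges, weights = random_dag(10)
        print(pipeline_stages(weights, 3))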

  2. Dosimetric analysis at ICRU reference points in HDR-brachytherapy of cervical carcinoma.

    Science.gov (United States)

    Eich, H T; Haverkamp, U; Micke, O; Prott, F J; Müller, R P

    2000-01-01

    In vivo dosimetry in bladder and rectum as well as determining doses at suggested reference points following ICRU report 38 contribute to quality assurance in HDR-brachytherapy of cervical carcinoma, especially to minimize side effects. In order to gain information regarding the radiation exposure at ICRU reference points in rectum, bladder, ureter and regional lymph nodes, those were calculated (digitalisation) by means of orthogonal radiographs of 11 applications in patients with cervical carcinoma who received primary radiotherapy. In addition, the doses at the ICRU rectum reference point were compared to the results of in vivo measurements in the rectum. The in vivo measurements were a factor of 1.5 below the doses determined for the ICRU rectum reference point (4.05 +/- 0.68 Gy versus 6.11 +/- 1.63 Gy). Reasons for this were: calibration errors, non-orthogonal radiographs, movement of applicator and probe in the time span between X-ray and application, and missing contact between probe and anterior rectal wall. The standard deviation of calculations at ICRU reference points was on average +/- 30%. Possible reasons for the relatively large standard deviation were difficulties in defining the points, identifying them on radiographs and the different locations of the applicators. Although 3D CT-, US- or MR-based treatment planning using dose-volume histogram analysis is increasingly established, this simple procedure of marking and digitising the ICRU reference points lengthened treatment planning by only 5 to 10 minutes. The advantages of in vivo dosimetry are easy practicability and the possibility to determine rectum doses during radiation. The advantages of computer-aided planning at ICRU reference points are that calculations are available before radiation and that they can still be taken into account for treatment planning. Both methods should be applied in HDR-brachytherapy of cervical carcinoma.

  3. Application-Defined Decentralized Access Control

    Science.gov (United States)

    Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett

    2014-01-01

    DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493

  4. THE SECULAR INFLUENCE OF THE RATE OF REFERENCE ON THE ECONOMIC POLICIES

    Directory of Open Access Journals (Sweden)

    Emilian M.DOBRESCU

    2005-12-01

    Full Text Available Having as starting point Radu Stoenescu’s book with respect to discount, and the difficulty in defining discount, we pursue to clarify this category and other associated ones, such as reference rate, discount rate, re-discount and its rate, in a historical and actual approach. The purpose is to demonstrate that the monetary policy about discount influences the economic policies and that is why it must be altered and guided in accordance with the changes of current economic-financial activity.

  5. THE SECULAR INFLUENCE OF THE RATE OF REFERENCE ON THE ECONOMIC POLICIES

    OpenAIRE

    Emilian M.DOBRESCU

    2005-01-01

    Taking as a starting point Radu Stoenescu's book on discount, and the difficulty of defining discount, we seek to clarify this category and other associated ones, such as the reference rate, the discount rate, re-discount and its rate, in a historical as well as a present-day approach. The purpose is to demonstrate that the monetary policy on discount influences economic policies and that it must therefore be adjusted and guided in accordance with the changes in current economic and financial activity.

  6. Define Project

    DEFF Research Database (Denmark)

    Munk-Madsen, Andreas

    2005-01-01

    "Project" is a key concept in IS management. The word is frequently used in textbooks and standards. Yet we seldom find a precise definition of the concept. This paper discusses how to define the concept of a project. The proposed definition covers both heavily formalized projects and informally...... organized, agile projects. Based on the proposed definition popular existing definitions are discussed....

  7. Three-dimensional model of reference thermal/mechanical and hydrological stratigraphy at Yucca Mountain, southern Nevada

    International Nuclear Information System (INIS)

    Ortiz, T.S.; Williams, R.L.; Nimick, F.B.; Whittet, B.C.; South, D.L.

    1985-10-01

    The Nevada Nuclear Waste Storage Investigations (NNWSI) project is currently examining the feasibility of constructing a nuclear waste repository in the tuffs beneath Yucca Mountain. A three-dimensional model of the thermal/mechanical and hydrological reference stratigraphy at Yucca Mountain has been developed for use in performance assessment and repository design studies involving material properties data. The reference stratigraphy defines units with distinct thermal, physical, mechanical, and hydrological properties. The model is a collection of surface representations, each surface representing the base of a particular unit. The reliability of the model was evaluated by comparing the generated surfaces, existing geologic maps and cross sections, drill hole data, and geologic interpolation. Interpolation of surfaces between drill holes by the model closely matches the existing information. The top of a zone containing prevalent zeolite is defined and superimposed on the reference stratigraphy. Interpretation of the geometric relations between the zeolitic and thermal/mechanical and hydrological surfaces indicates that the zeolitic zone was established before the major portion of local fault displacement took place; however, faulting and zeolitization may have been partly concurrent. The thickness of the proposed repository host rock, the devitrified, relatively lithophysal-poor, moderately to densely welded portion of the Topopah Spring Member of the Paintbrush Tuff, was evaluated and varies from 400 to 800 ft in the repository area. The distance from the repository to groundwater level was estimated to vary from 700 to 1400 ft. 13 figs., 1 tab

  8. Complex reference values for endocrine and special chemistry biomarkers across pediatric, adult, and geriatric ages: establishment of robust pediatric and adult reference intervals on the basis of the Canadian Health Measures Survey.

    Science.gov (United States)

    Adeli, Khosrow; Higgins, Victoria; Nieuwesteeg, Michelle; Raizman, Joshua E; Chen, Yunqi; Wong, Suzy L; Blais, David

    2015-08-01

    Defining laboratory biomarker reference values in a healthy population and understanding the fluctuations in biomarker concentrations throughout life and between sexes are critical to clinical interpretation of laboratory test results in different disease states. The Canadian Health Measures Survey (CHMS) has collected blood samples and health information from the Canadian household population. In collaboration with the Canadian Laboratory Initiative on Pediatric Reference Intervals (CALIPER), the data have been analyzed to determine reference value distributions and reference intervals for several endocrine and special chemistry biomarkers in pediatric, adult, and geriatric age groups. CHMS collected data and blood samples from thousands of community participants aged 3 to 79 years. We used serum samples to measure 13 immunoassay-based special chemistry and endocrine markers. We assessed reference value distributions and, after excluding outliers, calculated age- and sex-specific reference intervals, along with corresponding 90% CIs, according to CLSI C28-A3 guidelines. We observed fluctuations in biomarker reference values across the pediatric, adult, and geriatric age range, with stratification required on the basis of age for all analytes. Additional sex partitions were required for apolipoprotein AI, homocysteine, ferritin, and high sensitivity C-reactive protein. The unique collaboration between CALIPER and CHMS has enabled, for the first time, a detailed examination of the changes in various immunochemical markers that occur in healthy individuals of different ages. The robust age- and sex-specific reference intervals established in this study provide insight into the complex biological changes that take place throughout development and aging and will contribute to improved clinical test interpretation. © 2015 American Association for Clinical Chemistry.
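
    For a single age/sex partition, the nonparametric procedure recommended by CLSI C28-A3 essentially comes down to taking the 2.5th and 97.5th percentiles after outlier exclusion and attaching 90% confidence intervals to each limit. The sketch below is a loose, assumed illustration of that recipe (Tukey fences for outliers and a bootstrap for the confidence intervals), not the CALIPER/CHMS analysis code.

        import numpy as np

        def reference_interval(values, n_boot=2000, seed=0):
            """Nonparametric 95% reference interval (2.5th-97.5th percentiles) with
            bootstrap 90% confidence intervals for each limit, after a simple
            Tukey-fence outlier exclusion."""
            v = np.sort(np.asarray(values, dtype=float))
            q1, q3 = np.percentile(v, [25, 75])
            fence = 1.5 * (q3 - q1)
            v = v[(v >= q1 - fence) & (v <= q3 + fence)]   # outlier exclusion

            lower, upper = np.percentile(v, [2.5, 97.5])
            rng = np.random.default_rng(seed)
            boots = np.percentile(
                rng.choice(v, size=(n_boot, v.size), replace=True), [2.5, 97.5], axis=1)
            lower_ci = np.percentile(boots[0], [5, 95])    # 90% CI of the lower limit
            upper_ci = np.percentile(boots[1], [5, 95])    # 90% CI of the upper limit
            return (lower, lower_ci), (upper, upper_ci)

        # illustrative, made-up data for one age/sex partition of a skewed analyte
        values = np.random.default_rng(3).lognormal(3.5, 0.5, 400)
        print(reference_interval(values))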

  9. Something new every day: defining innovation and innovativeness in drug therapy.

    Science.gov (United States)

    Aronson, Jeffrey K

    2008-01-01

    The word "innovation" comes from the Latin noun innovatio, derived from the verb innovare, to introduce [something] new. It can refer either to the act of introducing something new or to the thing itself that is introduced. In terms of commerce, it is defined in the Oxford English Dictionary as "the action of introducing a new product into the market; a product newly brought on to the market," a definition that illustrates both aspects of the word's meaning. "Innovativeness" is the property of being an innovation. Here I identify several different types of innovativeness in drug therapy, including structural, pharmacological or pharmacodynamic, pharmaceutical, and pharmacokinetic innovativeness, and I stress the over-riding importance of clinical innovativeness, which should result in a better benefit to harm balance at an affordable cost.

  10. Reference design and operations for deep borehole disposal of high-level radioactive waste

    International Nuclear Information System (INIS)

    Herrick, Courtney Grant; Brady, Patrick Vane; Pye, Steven; Arnold, Bill Walter; Finger, John Travis; Bauer, Stephen J.

    2011-01-01

    A reference design and operational procedures for the disposal of high-level radioactive waste in deep boreholes have been developed and documented. The design and operations are feasible with currently available technology and meet existing safety and anticipated regulatory requirements. Objectives of the reference design include providing a baseline for more detailed technical analyses of system performance and serving as a basis for comparing design alternatives. Numerous factors suggest that deep borehole disposal of high-level radioactive waste is inherently safe. Several lines of evidence indicate that groundwater at depths of several kilometers in continental crystalline basement rocks has long residence times and low velocity. High salinity fluids have limited potential for vertical flow because of density stratification and prevent colloidal transport of radionuclides. Geochemically reducing conditions in the deep subsurface limit the solubility and enhance the retardation of key radionuclides. A non-technical advantage that the deep borehole concept may offer over a repository concept is that of facilitating incremental construction and loading at multiple perhaps regional locations. The disposal borehole would be drilled to a depth of 5,000 m using a telescoping design and would be logged and tested prior to waste emplacement. Waste canisters would be constructed of carbon steel, sealed by welds, and connected into canister strings with high-strength connections. Waste canister strings of about 200 m length would be emplaced in the lower 2,000 m of the fully cased borehole and be separated by bridge and cement plugs. Sealing of the upper part of the borehole would be done with a series of compacted bentonite seals, cement plugs, cement seals, cement plus crushed rock backfill, and bridge plugs. Elements of the reference design meet technical requirements defined in the study. Testing and operational safety assurance requirements are also defined. Overall

  11. Reference design and operations for deep borehole disposal of high-level radioactive waste.

    Energy Technology Data Exchange (ETDEWEB)

    Herrick, Courtney Grant; Brady, Patrick Vane; Pye, Steven; Arnold, Bill Walter; Finger, John Travis; Bauer, Stephen J.

    2011-10-01

    A reference design and operational procedures for the disposal of high-level radioactive waste in deep boreholes have been developed and documented. The design and operations are feasible with currently available technology and meet existing safety and anticipated regulatory requirements. Objectives of the reference design include providing a baseline for more detailed technical analyses of system performance and serving as a basis for comparing design alternatives. Numerous factors suggest that deep borehole disposal of high-level radioactive waste is inherently safe. Several lines of evidence indicate that groundwater at depths of several kilometers in continental crystalline basement rocks has long residence times and low velocity. High salinity fluids have limited potential for vertical flow because of density stratification and prevent colloidal transport of radionuclides. Geochemically reducing conditions in the deep subsurface limit the solubility and enhance the retardation of key radionuclides. A non-technical advantage that the deep borehole concept may offer over a repository concept is that of facilitating incremental construction and loading at multiple perhaps regional locations. The disposal borehole would be drilled to a depth of 5,000 m using a telescoping design and would be logged and tested prior to waste emplacement. Waste canisters would be constructed of carbon steel, sealed by welds, and connected into canister strings with high-strength connections. Waste canister strings of about 200 m length would be emplaced in the lower 2,000 m of the fully cased borehole and be separated by bridge and cement plugs. Sealing of the upper part of the borehole would be done with a series of compacted bentonite seals, cement plugs, cement seals, cement plus crushed rock backfill, and bridge plugs. Elements of the reference design meet technical requirements defined in the study. Testing and operational safety assurance requirements are also defined. Overall

  12. Identification of reference genes in human myelomonocytic cells for gene expression studies in altered gravity.

    Science.gov (United States)

    Thiel, Cora S; Hauschild, Swantje; Tauber, Svantje; Paulsen, Katrin; Raig, Christiane; Raem, Arnold; Biskup, Josefine; Gutewort, Annett; Hürlimann, Eva; Unverdorben, Felix; Buttron, Isabell; Lauber, Beatrice; Philpot, Claudia; Lier, Hartwin; Engelmann, Frank; Layer, Liliana E; Ullrich, Oliver

    2015-01-01

    Gene expression studies are indispensable for investigation and elucidation of molecular mechanisms. For the process of normalization, reference genes ("housekeeping genes") are essential to verify gene expression analysis. Thus, it is assumed that these reference genes demonstrate similar expression levels over all experimental conditions. However, common recommendations about reference genes were established during 1 g conditions and therefore their applicability in studies with altered gravity has not been demonstrated yet. The microarray technology is frequently used to generate expression profiles under defined conditions and to determine the relative difference in expression levels between two or more different states. In our study, we searched for potential reference genes with stable expression during different gravitational conditions (microgravity, normogravity, and hypergravity) which are additionally not altered in different hardware systems. We were able to identify eight genes (ALB, B4GALT6, GAPDH, HMBS, YWHAZ, ABCA5, ABCA9, and ABCC1) which demonstrated no altered gene expression levels in all tested conditions and therefore represent good candidates for the standardization of gene expression studies in altered gravity.
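
    A first-pass stability screen for candidate reference genes can simply rank genes by their coefficient of variation across all conditions (gravity level x hardware system). The sketch below is only that generic screen with invented data and gene labels; it is not the selection procedure applied in the study, which was based on microarray expression profiles across the actual experimental conditions.

        import numpy as np

        def stable_reference_genes(expr, genes, cv_cutoff=0.05):
            """Rank candidate reference genes by coefficient of variation across all
            experimental conditions (rows = conditions, columns = genes), keeping
            those below the chosen stability cut-off."""
            e = np.asarray(expr, dtype=float)
            cv = e.std(axis=0, ddof=1) / e.mean(axis=0)
            order = np.argsort(cv)
            return [(genes[i], round(float(cv[i]), 3)) for i in order if cv[i] <= cv_cutoff]

        # toy data: 6 conditions (3 gravity levels x 2 hardware systems), 4 genes
        genes = ["GAPDH", "HMBS", "YWHAZ", "IL6"]
        rng = np.random.default_rng(4)
        expr = rng.normal([10.0, 8.0, 9.0, 6.0], [0.1, 0.1, 0.1, 1.5], size=(6, 4))
        print(stable_reference_genes(expr, genes, cv_cutoff=0.1))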

  13. Defining asthma in genetic studies

    NARCIS (Netherlands)

    Koppelman, GH; Postma, DS; Meijer, G.

    1999-01-01

    Genetic studies have been hampered by the lack of a gold standard to diagnose asthma. The complex nature of asthma makes it more difficult to identify asthma genes. Therefore, approaches to define phenotypes, which have been successful in other genetically complex diseases, may be applied to define

  14. IAEA biological reference materials

    International Nuclear Information System (INIS)

    Parr, R.M.; Schelenz, R.; Ballestra, S.

    1988-01-01

    The Analytical Quality Control Services programme of the IAEA encompasses a wide variety of intercomparisons and reference materials. This paper reviews only those aspects of the subject having to do with biological reference materials. The 1988 programme foresees 13 new intercomparison exercises, one for major, minor and trace elements, five for radionuclides, and seven for stable isotopes. Twenty-two natural matrix biological reference materials are available: twelve for major, minor and trace elements, six for radionuclides, and four for chlorinated hydrocarbons. Seven new intercomparisons and reference materials are in preparation or under active consideration. Guidelines on the correct use of reference materials are being prepared for publication in 1989 in consultation with other major international producers and users of biological reference materials. The IAEA database on available reference materials is being updated and expanded in scope, and a new publication is planned for 1989. (orig.)

  15. Standard Reference Tables -

    Data.gov (United States)

    Department of Transportation — The Standard Reference Tables (SRT) provide consistent reference data for the various applications that support Flight Standards Service (AFS) business processes and...

  16. Lawrence Livermore National Laboratory Working Reference Material Production Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Amy; Thronas, Denise; Marshall, Robert

    1998-11-04

    This Lawrence Livermore National Laboratory (LLNL) Working Reference Material Production Plan was written for LLNL by the Los Alamos National Laboratory to address key elements of producing seven Pu-diatomaceous earth NDA Working Reference Materials (WRMS). These WRMS contain low burnup Pu ranging in mass from 0.1 grams to 68 grams. The composite Pu mass of the seven WRMS was designed to approximate the maximum TRU allowable loading of 200 grams Pu. This document serves two purposes: first, it defines all the operations required to meet the LLNL Statement of Work quality objectives, and second, it provides a record of the production and certification of the WRMS. Guidance provided in ASTM Standard Guide C1128-89 was used to ensure that this Plan addressed all the required elements for producing and certifying Working Reference Materials. The Production Plan was written to provide a general description of the processes, steps, files, quality control, and certification measures that were taken to produce the WRMS. The Plan identifies the files where detailed procedures, data, quality control, and certification documentation and forms are retained. The Production Plan is organized into three parts: a) an initial section describing the preparation and characterization of the PuO2 and diatomaceous earth materials, b) middle sections describing the loading, encapsulation, and measurement on the encapsulated WRMS, and c) final sections describing the calculations of the Pu, Am, and alpha activity for the WRMS and the uncertainties associated with these quantities.

  17. Reference intervals for mean platelet volume and immature platelet fraction determined on a sysmex XE5000 hematology analyzer

    DEFF Research Database (Denmark)

    Jørgensen, Mikala Klok; Bathum, L.

    2016-01-01

    Background New parameters describing the platelet population of the blood are mean platelet volume (MPV), which is a crude estimate of thrombocyte reactivity, and immature platelet fraction (IPF), which reflects megakaryopoietic activity. This study aimed to define reference intervals for MPV and IPF and to investigate whether separate reference intervals according to smoking status, age or sex are necessary. Methods Blood samples were obtained from subjects participating in The Danish General Suburban Population Study. MPV and IPF measurements were performed by the use of the Sysmex XE-5000

  18. Trimester-specific reference intervals for haemoglobin A(1c) (HbA(1c)) in pregnancy.

    LENUS (Irish Health Repository)

    O'Connor, Catherine

    2011-11-26

    Abstract Background: Diabetes in pregnancy imposes additional risks to both mother and infant. These increased risks are considered to be primarily related to glycaemic control which is monitored by means of glycated haemoglobin (HbA(1c)). The correlation of HbA(1c) with clinical outcomes emphasises the need to measure HbA(1c) accurately, precisely and for correct interpretation, comparison to appropriately defined reference intervals. Since July 2010, the HbA(1c) assay in Irish laboratories is fully metrologically traceable to the IFCC standard. The objective was to establish trimester-specific reference intervals in pregnancy for IFCC standardised HbA(1c) in non-diabetic Caucasian women. Methods: The authors recruited 311 non-diabetic Caucasian pregnant (n=246) and non-pregnant women (n=65). A selective screening based on risk factors for gestational diabetes was employed. All subjects had a random plasma glucose <7.7 mmol/L and normal haemoglobin level. Pregnancy trimester was defined as trimester 1 (T1, n=40) up to 12 weeks +6 days, trimester 2 (T2, n=106) 13-27 weeks +6 days, trimester 3 (T3, n=100) >28 weeks to term. Results: The normal HbA(1c) reference interval for Caucasian non-pregnant women was 29-37 mmol/mol (Diabetes Control and Complications Trial; DCCT: 4.8%-5.5%), T1: 24-36 mmol/mol (DCCT: 4.3%-5.4%), T2: 25-35 mmol/mol (DCCT: 4.4%-5.4%) and T3: 28-39 mmol/mol (DCCT: 4.7%-5.7%). HbA(1c) was significantly decreased in trimesters 1 and 2 compared to non-pregnant women. Conclusions: HbA(1c) trimester-specific reference intervals are required to better inform the management of pregnancies complicated by diabetes.
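
    The paired IFCC (mmol/mol) and DCCT (%) values quoted above follow from the published IFCC-NGSP master equation, approximately NGSP% = 0.0915 x IFCC + 2.15. The sketch below just encodes that conversion (the function names are arbitrary) and reproduces the non-pregnant limits as a check.

        def ifcc_to_dcct(hba1c_mmol_mol):
            """HbA1c: IFCC units (mmol/mol) to DCCT/NGSP units (%)."""
            return 0.0915 * hba1c_mmol_mol + 2.15

        def dcct_to_ifcc(hba1c_percent):
            """HbA1c: DCCT/NGSP units (%) to IFCC units (mmol/mol)."""
            return (hba1c_percent - 2.15) / 0.0915

        # reproduce the non-pregnant reference limits quoted above
        print(round(ifcc_to_dcct(29), 1), round(ifcc_to_dcct(37), 1))   # 4.8 5.5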

  19. Parametric Method to Define Area of Allowable Configurations while Changing Position of Restricted Zones

    Science.gov (United States)

    Pritykin, F. N.; Nefedov, D. I.; Rogoza, Yu A.; Zinchenko, Yu V.

    2018-03-01

    The article presents findings related to the development of a module for automatic detection of collisions between a manipulator and restricted zones during virtual motion modeling. It proposes a parametric method for specifying the area of allowable joint configurations. The authors study the cases in which restricted zones are specified using a horizontal plane or front-projecting planes. The joint coordinate space is specified by rectangular axes along which the angles defining the displacements in the turning (revolute) pairs are plotted. The authors present modeling results that enabled the development of a parametric method for specifying a set of cross-sections defining the shape and position of the allowable configurations for different positions of a restricted zone. All joint points that define allowable configurations belong to the indicated sections. The area of allowable configurations is specified analytically by means of several kinematic surfaces that bound it. A geometric analysis is developed based on the use of the area of allowable configurations, which characterizes the position of the manipulator and the reported restricted zones. The paper presents numerical calculations related to virtual simulation of the manipulator path performed by the mobile robot Varan when using the developed algorithm and restricted zones. The obtained analytical dependencies allow the area of allowable configurations to be defined, which serves as a knowledge base for intelligent control of the manipulator path in a predefined environment. The use of the obtained region to synthesize a joint trajectory makes it possible to correct the manipulator path and to foresee and eliminate deadlocks when synthesizing motions along the velocity vector.
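
    The idea of mapping a restricted zone into joint-coordinate space can be illustrated with a planar two-link arm and a single horizontal plane as the restricted zone: sweep the joint grid, test whether any point on the links enters the zone, and keep the collision-free cells. The code below is a deliberately simplified stand-in (link lengths, plane position and sampling density are all invented), not the parametric construction used for the Varan manipulator.

        import numpy as np

        def link_points(q1, q2, l1=1.0, l2=0.8, samples=20):
            """Sample points along both links of a planar 2-R arm in configuration (q1, q2)."""
            elbow = np.array([l1 * np.cos(q1), l1 * np.sin(q1)])
            tip = elbow + np.array([l2 * np.cos(q1 + q2), l2 * np.sin(q1 + q2)])
            t = np.linspace(0.0, 1.0, samples)[:, None]
            return np.vstack([t * elbow, elbow + t * (tip - elbow)])

        def allowable_region(y_floor=-0.3, n=91):
            """Boolean map over the joint-coordinate grid: True where no point of the
            arm enters the restricted half-space y < y_floor (a horizontal plane)."""
            q = np.linspace(-np.pi, np.pi, n)
            ok = np.empty((n, n), dtype=bool)
            for i, q1 in enumerate(q):
                for j, q2 in enumerate(q):
                    ok[i, j] = link_points(q1, q2)[:, 1].min() >= y_floor
            return q, ok

        q, ok = allowable_region()
        print(f"{ok.mean():.0%} of the sampled joint space is collision-free")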

  20. Measuring the quality of a quantum reference frame: The relative entropy of frameness

    International Nuclear Information System (INIS)

    Gour, Gilad; Marvian, Iman; Spekkens, Robert W.

    2009-01-01

    In the absence of a reference frame for transformations associated with group G, any quantum state that is noninvariant under the action of G may serve as a token of the missing reference frame. We here present a measure of the quality of such a token: the relative entropy of frameness. This is defined as the relative entropy distance between the state of interest and the nearest G-invariant state. Unlike the relative entropy of entanglement, this quantity is straightforward to calculate, and we find it to be precisely equal to the G-asymmetry, a measure of frameness introduced by Vaccaro et al. It is shown to provide an upper bound on the mutual information between the group element encoded into the token and the group element that may be extracted from it by measurement. In this sense, it quantifies the extent to which the token successfully simulates a full reference frame. We also show that despite a suggestive analogy from entanglement theory, the regularized relative entropy of frameness is zero and therefore does not quantify the rate of interconversion between the token and some standard form of quantum reference frame. Finally, we show how these investigations yield an approach to bounding the relative entropy of entanglement.
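
    Written out, with Inv(G) denoting the set of G-invariant states and the calligraphic G the group-averaging ("twirling") operation, the measure described above reads as follows; the right-hand equality is the identification with the G-asymmetry mentioned in the abstract.

```latex
% Relative entropy of frameness and its identification with the G-asymmetry;
% S(rho || sigma) = Tr[rho (log rho - log sigma)] is the quantum relative entropy.
R_G(\rho) \;=\; \min_{\sigma \in \mathrm{Inv}(G)} S(\rho \,\|\, \sigma)
          \;=\; S\!\bigl(\mathcal{G}[\rho]\bigr) - S(\rho),
\qquad
\mathcal{G}[\rho] \;=\; \int_G U(g)\,\rho\,U^{\dagger}(g)\,\mathrm{d}g .
```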

  1. Defining depth of anesthesia.

    Science.gov (United States)

    Shafer, S L; Stanski, D R

    2008-01-01

    In this chapter, drawn largely from the synthesis of material that we first presented in the sixth edition of Miller's Anesthesia, Chap 31 (Stanski and Shafer 2005; used by permission of the publisher), we have defined anesthetic depth as the probability of non-response to stimulation, calibrated against the strength of the stimulus, the difficulty of suppressing the response, and the drug-induced probability of non-responsiveness at defined effect site concentrations. This definition requires measurement of multiple different stimuli and responses at well-defined drug concentrations. There is no one stimulus and response measurement that will capture depth of anesthesia in a clinically or scientifically meaningful manner. The "clinical art" of anesthesia requires calibration of these observations of stimuli and responses (verbal responses, movement, tachycardia) against the dose and concentration of anesthetic drugs used to reduce the probability of response, constantly adjusting the administered dose to achieve the desired anesthetic depth. In our definition of "depth of anesthesia" we define the need for two components to create the anesthetic state: hypnosis created with drugs such as propofol or the inhalational anesthetics and analgesia created with the opioids or nitrous oxide. We demonstrate the scientific evidence that profound degrees of hypnosis in the absence of analgesia will not prevent the hemodynamic responses to profoundly noxious stimuli. Also, profound degrees of analgesia do not guarantee unconsciousness. However, the combination of hypnosis and analgesia suppresses hemodynamic response to noxious stimuli and guarantees unconsciousness.

  2. Comparison of the reference mark azimuth determination methods

    Directory of Open Access Journals (Sweden)

    Danijel Šugar

    2013-03-01

    Full Text Available Knowledge of the azimuth of the reference mark is of crucial importance in the determination of the declination, which is defined as the ellipsoidal (geodetic) azimuth of the geomagnetic meridian. The accuracy of the azimuth determination has a direct impact on the accuracy of the declination. The orientation of the Declination-Inclination Magnetometer is usually carried out by sighting the reference mark in two telescope faces in order to improve the reliability of the observations and eliminate some instrumental errors. In this paper, different coordinate as well as azimuth determination methods using GNSS (Global Navigation Satellite System) observation techniques within the VPPS (High-Precision Positioning Service) and GPPS (Geodetic-Precision Positioning Service) services of the CROPOS (CROatian POsitioning System) system were explained. Azimuth determination by observation of Polaris was presented and subsequently compared with observation of the Sun using the hour-angle and zenith-distance methods. The procedure for calculating the geodetic azimuth from the astronomic azimuth was explained. The azimuth results obtained by the different methods were compared and recommendations on the minimal distance between the repeat station and the azimuth mark were given. The results shown in this paper were based on the observations taken at the POKU_SV repeat station.
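
    For context, the reduction from an observed astronomic azimuth A to the geodetic azimuth is commonly carried out with the simplified Laplace equation shown below (valid for near-horizontal lines of sight). This is the standard textbook relation, given here only as a sketch of the kind of reduction described, not necessarily the exact formula applied in the paper.

```latex
% Simplified Laplace equation for near-horizontal sight lines:
%   alpha  geodetic azimuth,  A  astronomic azimuth,
%   (Lambda - lambda)  astronomic minus geodetic longitude,
%   phi  geodetic latitude,  eta  prime-vertical component of the deflection of the vertical.
\alpha \;=\; A - (\Lambda - \lambda)\sin\varphi \;=\; A - \eta\tan\varphi
```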

  3. The resource theory of quantum reference frames: manipulations and monotones

    International Nuclear Information System (INIS)

    Gour, Gilad; Spekkens, Robert W

    2008-01-01

    Every restriction on quantum operations defines a resource theory, determining how quantum states that cannot be prepared under the restriction may be manipulated and used to circumvent the restriction. A superselection rule (SSR) is a restriction that arises through the lack of a classical reference frame and the states that circumvent it (the resource) are quantum reference frames. We consider the resource theories that arise from three types of SSRs, associated respectively with lacking: (i) a phase reference, (ii) a frame for chirality, and (iii) a frame for spatial orientation. Focusing on pure unipartite quantum states (and in some cases restricting our attention even further to subsets of these), we explore single-copy and asymptotic manipulations. In particular, we identify the necessary and sufficient conditions for a deterministic transformation between two resource states to be possible and, when these conditions are not met, the maximum probability with which the transformation can be achieved. We also determine when a particular transformation can be achieved reversibly in the limit of arbitrarily many copies and find the maximum rate of conversion. A comparison of the three resource theories demonstrates that the extent to which resources can be interconverted decreases as the strength of the restriction increases. Along the way, we introduce several measures of frameness and prove that these are monotonically non-increasing under various classes of operations that are permitted by the SSR

  4. Calibrating AIS images using the surface as a reference

    Science.gov (United States)

    Smith, M. O.; Roberts, D. A.; Shipman, H. M.; Adams, J. B.; Willis, S. C.; Gillespie, A. R.

    1987-01-01

    A method of evaluating the initial assumptions and uncertainties of the physical connection between Airborne Imaging Spectrometer (AIS) image data and laboratory/field spectrometer data was tested. The Tucson AIS-2 image was connected to lab reference spectra by aligning the image spectral endmembers through a system gain and offset for each band. Images were calibrated to reflectance so as to transform the image into a measure that is independent of the solar radiant flux. This transformation also makes the image spectra directly comparable to data from lab and field spectrometers. A method was tested for calibrating AIS images using the surface as a reference, with the surface heterogeneity defined by lab/field spectral measurements. It was found that the Tucson AIS-2 image is consistent with each of the initial hypotheses: (1) the AIS-2 instrument calibration is nearly linear; (2) the spectral variance is caused by sub-pixel mixtures of spectrally distinct materials and shade; and (3) sub-pixel mixtures can be treated as linear mixtures of pure endmembers. It was also found that the image can be characterized by relatively few endmembers using the AIS-2 spectra.
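
    Hypothesis (3) above treats each pixel spectrum as a linear mixture of endmember spectra. The sketch below illustrates that idea with a constrained least-squares unmixing of a synthetic pixel; the endmember spectra, band count and noise level are assumptions, not AIS-2 data.

```python
# Illustrative linear spectral unmixing (not the authors' calibration code): model each
# pixel spectrum as a linear mixture of endmember spectra and solve for the mixing
# fractions by non-negative least squares with a soft sum-to-one constraint.
import numpy as np
from scipy.optimize import nnls

def unmix(pixel, endmembers):
    """pixel: (n_bands,); endmembers: (n_bands, n_endmembers). Returns fractions."""
    A = np.vstack([endmembers, np.ones(endmembers.shape[1])])  # sum-to-one row
    b = np.append(pixel, 1.0)
    fractions, _ = nnls(A, b)
    return fractions

rng = np.random.default_rng(42)
n_bands = 128                                                # assumed band count
endmembers = np.abs(rng.normal(0.3, 0.1, (n_bands, 3)))      # e.g. soil, vegetation, shade
true_fractions = np.array([0.5, 0.3, 0.2])
pixel = endmembers @ true_fractions + rng.normal(0, 0.005, n_bands)
print(unmix(pixel, endmembers))                              # close to [0.5, 0.3, 0.2]
```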

  5. Drinking Levels Defined

    Science.gov (United States)

    Definition of Drinking at Low Risk for Developing Alcohol Use Disorder (AUD): For women, low-risk drinking is defined ...

  6. Reference dosimetry for helical tomotherapy: Practical implementation and a multicenter validation

    International Nuclear Information System (INIS)

    De Ost, B.; Schaeken, B.; Vynckier, S.; Sterpin, E.; Van den Weyngaert, D.

    2011-01-01

    Purpose: The aim of this study was to implement a protocol for reference dosimetry in tomotherapy and to validate the beam output measurements with an independent dosimetry system. Methods: Beam output was measured at the reference depth of 10 cm in water for the following three cases: (1) a 5 × 10 cm² static machine-specific reference (MSR) field, (2) a rotational 5 × 10 cm² field without modulation and no tabletop in the beam, and (3) a plan class specific reference (PCSR) field defined as a rotational homogeneous dose delivery to a cylindrically shaped target volume: a plan with modulation and table-top movement. The formalism for reference dosimetry of small and nonstandard fields [Med. Phys. 35: 5179-5186, 2008] and QA recommendations [Med. Phys. 37: 4817-4853, 2010] were adopted in the dose measurement protocol. All ionization chamber measurements were verified independently using alanine/EPR dosimetry. As a pilot study, the beam output was measured on tomotherapy Hi-Art systems at three other centers and directly compared to the centers' specifications and to alanine dosimetry. Results: For the four centers, the mean static output at a depth of 10 cm in water and SAD = 85 cm, measured with an A1SL chamber following the TG-148 report, was 6.238 Gy/min ± 0.058 (1 SD); the rotational output was 6.255 Gy/min ± 0.069 (1 SD). The dose stated by each center was found to be in good agreement with the measurements of the visiting team: D_center/D_visit = 1.000 ± 0.003 (1 SD). The A1SL chamber measurements were all in good agreement with alanine/EPR dosimetry. Going from the static reference field to the rotational/non-modulated field, the dose rate remains constant within 0.2%, except for one center where a deviation of 1.3% was detected. Conclusions: Following the TG-148 report, beam output measurements in water at the reference depth using a local protocol, as developed at different centers, were verified. The measurements were found to be in good agreement with alanine/EPR dosimetry. The

  7. Chatbots in the online reference service: a tool for the management of the PRT 13ª Região library.

    OpenAIRE

    Henn, Gustavo

    2006-01-01

    This work proposes guidelines for constructing a chatbot for the digital reference service of the Procuradoria Regional do Trabalho da 13ª Região Library. For this, an exploratory bibliographic study was conducted. The activities consisted of: (1) exploring the themes of chatbots and digital reference services and (2) defining the types of consultations and reference questions that the chatbot will handle, besides proposing its interface and its personality. Concludes that the use of the chatbots in the digita...

  8. The study on the intake and distribution of elements for 'Reference Japanese'

    International Nuclear Information System (INIS)

    1978-01-01

    From the standpoint of radiological protection, the quantitative description of the physical characteristics and customs of man is the basis for calculating annual limits of intake, estimating dose equivalents and building MIRD phantoms for radioactive substances (ICRP 77). Committee 2 of ICRP has published a compilation of data on the anatomical, chemical and physiological characteristics of man, as Standard Man and Reference Man. The models are, however, mostly based on data published for Europeans and Americans. Reference Man, as characterized by the Task Group itself, is defined as a Caucasian and is Western European or North American in habitat and custom. Reference Man is therefore not directly applicable to other populations, for instance the Japanese, because differences are known to exist between Asians and Europeans or Americans with respect to race, customs and the pattern of food consumption. In view of this problem, it has become necessary to seek standard or reference values for the Japanese, i.e., the mass and dimensions of the body and organs, and the daily intake, distribution and metabolism of elements in Japanese subjects, especially on the basis of more recently obtained data. The average mass of organs of normal Japanese subjects was studied and the results were presented for Reference Japanese Man. Chemical methods for determining stable elements in autopsy tissues were developed to establish a highly reliable analytical system. The thyroidal uptake and biological half-life of ingested ¹³¹I in relation to dietary stable element intake were determined for two normal Japanese adult male subjects. (author)

  9. Developing social standards for wilderness encounters in Mount Rainier National Park: Manager-defined versus visitor-defined standards

    Science.gov (United States)

    Kristopher J. Lah

    2000-01-01

    This research compared the differences found between manager-defined and visitor-defined social standards for wilderness encounters in Mount Rainier National Park. Social standards in recreation areas of public land are defined by what is acceptable to the public, in addition to the area’s management. Social standards for the encounter indicator in Mount Rainier’s...

  10. Children and adults exposed to electromagnetic fields at the ICNIRP reference levels: Theoretical assessment of the induced peak temperature increase

    NARCIS (Netherlands)

    J. Bakker (Jan); M.M. Paulides (Maarten); E. Neufeld; A. Christ (A.); N. Kuster (Niels); G.C. van Rhoon (Gerard)

    2011-01-01

    To avoid potentially adverse health effects of electromagnetic fields (EMF), the International Commission on Non-Ionizing Radiation Protection (ICNIRP) has defined EMF reference levels. Restrictions on induced whole-body-averaged specific absorption rate (SARwb) are provided to keep the

  11. The physical spacetime as a chronostat defining time. (Prolegomena to a future chronodynamics)

    International Nuclear Information System (INIS)

    Krolikowski, W.

    1993-01-01

    The familiar analogy, appearing in quantum theory, between the time evolution of an isolated system and the thermal equilibrium of a system with a thermostat is taken at face value. This leads us to the phenomenological conjecture that, in reality, the so-called isolated system may remain in a "temporal equilibrium" with the physical spacetime, which then plays the role of a "chronostat" defining a time equal at all space points (in a Minkowski frame of reference). Such a conjecture suggests virtual deviations from this equilibrium and so seems to imply an extension of the first law of thermodynamics as well as of the state equation in quantum theory. (author). 5 refs

  12. Observações aos prolegômenos da teoria kantiana dos juízos jurídicos a priori em Rechtslehre

    Directory of Open Access Journals (Sweden)

    Fábio César Scherer

    2010-12-01

    Full Text Available In this article the Kantian Rechtslehre is interpreted as a critical juridical doctrine, understandable within the critical project – started in the Kritik der reinen Vernunft and adapted to the practical field in the Kritik der praktischen Vernunft. In particular, the aim is to highlight, besides the apriority, the systematic character and the search for completeness of the juridical principles, the use of the theory of the solubility of the problems of reason in general in the prolegomena of the Rechtslehre. The study of this preliminary part is justified because it presents the supreme division of the system according to principles, from which a division of the doctrine of law is derived; this determines the object (Gegenstand) and, therefore, the field of this particular science, as well as the discussion of the research procedure. This a priori frame of the doctrine of law is the basis of the subsequent Kantian theory of private and public law. In a broader picture, this article can be understood as a rejection of the idea that the Kantian Rechtslehre does not follow the requirements of critical philosophy – an idea created by Hermann Cohen (Ethik des reinen Willens, 1904) and detailed by Christian Ritter (Der Rechtsgedanke Kants nach den frühen Quellen, 1971).

  13. Reference levels in PTCA as a function of procedure complexity

    International Nuclear Information System (INIS)

    Peterzol, A.; Quai, E.; Padovani, R.; Bernardi, G.; Kotre, C. J.; Dowling, A.

    2005-01-01

    The multicentre assessment of a procedure complexity index (CI) for the introduction of reference levels (RLs) in percutaneous transluminal coronary angioplasty (PTCA) is presented here. PTCAs were investigated based on the methodology proposed by Bernardi et al. Multiple linear stepwise regression analysis, including clinical, anatomical and technical factors, was performed to obtain predictors of fluoroscopy time. Based on these regression coefficients, a scoring system was defined and the CI obtained. The CI was used to classify dose values into three groups: low, medium and high complexity procedures, since there was good correlation with fluoroscopy time (r = 0.41). RLs of … Gy cm² for dose-area product and of 12, 20 and 27 min for fluoroscopy time were proposed for the three CI groups. (authors)

  14. HIV-induced immunodeficiency and mortality from AIDS-defining and non-AIDS-defining malignancies

    DEFF Research Database (Denmark)

    Monforte, Antonella d'Arminio; Abrams, Donald; Pradier, Christian

    2008-01-01

    OBJECTIVE: To evaluate deaths from AIDS-defining malignancies (ADM) and non-AIDS-defining malignancies (nADM) in the D:A:D Study and to investigate the relationship between these deaths and immunodeficiency. DESIGN: Observational cohort study. METHODS: Patients (23 437) were followed prospectively......-fold higher latest CD4 cell count was associated with a halving of the risk of ADM mortality. Other predictors of an increased risk of ADM mortality were homosexual risk group, older age, a previous (non-malignancy) AIDS diagnosis and earlier calendar years. Predictors of an increased risk of nADM mortality...

  15. 75 FR 61553 - National Transit Database: Amendments to the Urbanized Area Annual Reporting Manual and to the...

    Science.gov (United States)

    2010-10-05

    ... public and that any vans that are restricted a priori to particular employers and which do not... or level-platform boarding, and separate branding of the service. High-frequency service is defined...

  16. Reference amounts utilised in front of package nutrition labelling; impact on product healthfulness evaluations.

    Science.gov (United States)

    Raats, M M; Hieke, S; Jola, C; Hodgkins, C; Kennedy, J; Wills, J

    2015-05-01

    The research question addressed in this paper is how different reference amounts utilised in front of package nutrition labelling influence evaluation of product healthfulness. A total of 13,117 participants from six European countries (Germany, UK, Spain, France, Poland and Sweden) were recruited via online panels. A mixed between/within-subject factorial design was employed with food (biscuits, sandwiches, yogurts), healthfulness and presence of Guideline Daily Amounts as within-subjects factors and reference amount ('per 100 g', 'typical portion', 'half portion') and country as between-subjects factors. Overall, people correctly ranked foods according to their objective healthfulness as defined by risk nutrients alone, and could distinguish between more and less healthful variants of foods. General healthfulness associations with the three product categories do not appear to have had a strong influence on product ratings. This study shows that where the reference amount of 'per 100 g' is very different from the 'typical' portion size, as was the case for biscuits, products with a 'per 100 g' label are rated significantly less healthful than the 'typical' or 'half typical' portions. The results indicate that across the three food categories, consumers do factor the reference amount, that is, the quantity of food for which the nutritional information is being presented, into their judgements of healthfulness. Therefore, appropriate reference amounts are also of importance for the effective presentation of nutritional information.

  17. The mechanisms of direct democracy in Mexico: the plebiscite and the referendum in the country's states

    Directory of Open Access Journals (Sweden)

    León David Zayas Ornelas

    2007-01-01

    Full Text Available In Mexico, political change at the local level has been little studied, even though it reveals new facets in the design of democratic institutions in the country. The loss of hegemonic majorities, the formation of plural spaces of representation in the local Congresses and the change of party in the executive branches have increased the number of veto points in decision-making. In this context, Mechanisms of Direct Democracy (MDD) appear as institutions that can reduce the costs of negotiation between the branches of government by opening the game to a third, citizen veto actor. Since 1998, instruments such as the referendum and the plebiscite have appeared in sixteen states. This article proposes a model for analysing MDDs as veto points within the framework of local democratization; it studies their institutional design and develops a priori propositions to explain some scenarios of their possible future use.

  18. Changing quantum reference frames

    OpenAIRE

    Palmer, Matthew C.; Girelli, Florian; Bartlett, Stephen D.

    2013-01-01

    We consider the process of changing reference frames in the case where the reference frames are quantum systems. We find that, as part of this process, decoherence is necessarily induced on any quantum system described relative to these frames. We explore this process with examples involving reference frames for phase and orientation. Quantifying the effect of changing quantum reference frames serves as a first step in developing a relativity principle for theories in which all objects includ...

  19. ON DEFINING S-SPACES

    Directory of Open Access Journals (Sweden)

    Francesco Strati

    2013-05-01

    Full Text Available The present work is intended as an introduction to the Superposition Theory of David Carfì. In particular, I shall depict the meaning of his brand-new theory, on the one hand in an informal fashion and on the other hand by giving a formal account of the algebraic structure of the theory: the S-linear algebra. This kind of structure underpins the notion of S-spaces (or Carfì-spaces) by defining both their properties and their nature. Thus I shall define the S-triple as the fundamental principle upon which the S-linear algebra is built up.

  20. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    Science.gov (United States)

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a prototype of a smart grid's data storage and management system by means of SDUs is introduced, which demonstrates the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  1. Identification and Evaluation of Reliable Reference Genes in the Medicinal Fungus Shiraia bambusicola.

    Science.gov (United States)

    Song, Liang; Li, Tong; Fan, Li; Shen, Xiao-Ye; Hou, Cheng-Lin

    2016-04-01

    The stability of reference genes plays a vital role in real-time quantitative reverse transcription polymerase chain reaction (qRT-PCR) analysis, which is generally regarded as a convenient and sensitive tool for the analysis of gene expression. A well-known medicinal fungus, Shiraia bambusicola, has great potential in the pharmaceutical, agricultural and food industries, but its suitable reference genes have not yet been determined. In the present study, 11 candidate reference genes in S. bambusicola were first evaluated and validated comprehensively. To identify the suitable reference genes for qRT-PCR analysis, three software-based algorithms, geNorm, NormFinder and Best Keeper, were applied to rank the tested genes. RNA samples were collected from seven fermentation stages using different media (potato dextrose or Czapek medium) and under different light conditions (12-h light/12-h dark and all-dark). The three most appropriate reference genes, ubi, tfc and ags, were able to normalize the qRT-PCR results under the culturing conditions of 12-h light/12-h dark, whereas the other three genes, vac, gke and acyl, performed better in the culturing conditions of all-dark growth. Therefore, under different light conditions, at least two reference genes (ubi and vac) could be employed to assure the reliability of qRT-PCR results. For both the natural culture medium (the most appropriate genes of this group: ubi, tfc and ags) and the chemically defined synthetic medium (the most stable genes of this group: tfc, vac and ef), the tfc gene remained the best gene used for normalizing the gene expression found with qRT-PCR. It is anticipated that these results would improve the selection of suitable reference genes for qRT-PCR assays and lay the foundation for an accurate analysis of gene expression in S. bambusicola.
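
    For readers unfamiliar with the ranking tools named above, the sketch below reproduces the flavour of the geNorm stability measure M (the mean standard deviation of pairwise log2 expression ratios across samples; lower M means a more stable candidate). The expression matrix is synthetic and only the gene names from the abstract are reused; the study itself combined geNorm with NormFinder and BestKeeper.

```python
# Sketch of a geNorm-style stability measure M for candidate reference genes:
# for each gene, M is the mean standard deviation of the log2 expression ratios
# against every other candidate across all samples. Data below are synthetic.
import numpy as np

def genorm_m(expression):
    """expression: (n_samples, n_genes) array of relative expression values."""
    log_expr = np.log2(expression)
    n_genes = log_expr.shape[1]
    m_values = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m_values[j] = np.mean(sds)
    return m_values

rng = np.random.default_rng(7)
genes = ["ubi", "tfc", "ags", "vac", "gke", "acyl"]
expr = rng.lognormal(mean=1.0, sigma=0.2, size=(24, len(genes)))
for gene, m in sorted(zip(genes, genorm_m(expr)), key=lambda x: x[1]):
    print(f"{gene}: M = {m:.3f}")   # lower M = more stable candidate
```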

  2. Formal structures, the concepts of covariance, invariance, equivalent reference frames, and the principle Relativity

    Science.gov (United States)

    Rodrigues, W. A.; Scanavini, M. E. F.; de Alcantara, L. P.

    1990-02-01

    In this paper a given spacetime theory T is characterized as the theory of a certain species of structure in the sense of Bourbaki [1]. It is then possible to clarify in a rigorous way the concepts of passive and active covariance of T under the action of the manifold mapping group G_M. For each T, we also define an invariance group G_I^T and, in general, G_I^T ≠ G_M. This group is defined once we realize that, for each τ ∈ Mod T, each explicit geometrical object defining the structure can be classified as absolute or dynamical [2]. All spacetime theories also possess implicit geometrical objects that do not appear explicitly in the structure. These implicit objects are neither absolute nor dynamical. Among them are the reference frame fields, i.e., "timelike" vector fields X ∈ TU, U ⊆ M, where M is a manifold which is part of ST, a substructure for each τ ∈ Mod T called spacetime. We give a physically motivated definition of equivalent reference frames and introduce the concept of the equivalence group of a class of reference frames of kind X according to T, G_X^T. We say that T admits a weak principle of relativity (WPR) only if G_X^T ≠ identity for some X. If G_X^T = G_I^T for some X, we say that T admits a strong principle of relativity (PR). The results of this paper generalize and clarify several results obtained by Anderson [2], Scheibe [3], Hiskes [4], Recami and Rodrigues [5], Friedman [6], Fock [7], and Scanavini [8]. Among the novelties here is the realization that the definitions of G_I^T and G_X^T can be given only when certain boundary conditions for the equations of motion of T can be physically realized in the domain U ⊆ M where a given reference frame is defined. The existence of physically realizable boundary conditions for each τ ∈ Mod T (in ∂U), in contrast with the mathematically possible boundary conditions, is then seen to be essential for the validity of a principle of relativity for T.

  3. Defining Legal Moralism

    DEFF Research Database (Denmark)

    Thaysen, Jens Damgaard

    2015-01-01

    This paper discusses how legal moralism should be defined. It is argued that legal moralism should be defined as the position that "For any X, it is always a pro tanto reason for justifiably imposing legal regulation on X that X is morally wrong (where "morally wrong" is not conceptually equivalent to "harmful")". Furthermore, a distinction between six types of legal moralism is made. The six types are grouped according to whether they are concerned with the enforcement of positive or critical morality, and whether they are concerned with criminalising, legally restricting, or refraining from legally protecting morally wrong behaviour. This is interesting because not all types of legal moralism are equally vulnerable to the different critiques of legal moralism that have been put forth. Indeed, I show that some interesting types of legal moralism have not been criticised at all.

  4. Reference radiation fields - Simulated workplace neutron fields - Part 2: Calibration fundamentals related to the basic quantities

    International Nuclear Information System (INIS)

    2008-01-01

    ISO 8529-1, ISO 8529-2 and ISO 8529-3 deal with the production, characterization and use of neutron fields for the calibration of personal dosimeters and area survey meters. These International Standards describe reference radiations with neutron energy spectra that are well defined and well suited for use in the calibration laboratory. However, the neutron spectra commonly encountered in routine radiation protection situations are, in many cases, quite different from those produced by the sources specified in the International Standards. Since personal neutron dosimeters, and to a lesser extent survey meters, are generally quite energy-dependent in their dose equivalent response, it might not be possible to achieve an appropriate calibration for a device that is used in a workplace where the neutron energy spectrum and angular distribution differ significantly from those of the reference radiation used for calibration. ISO 8529-1 describes four radionuclide-based neutron reference radiations in detail. This part of ISO 12789 includes the specification of neutron reference radiations that were developed to closely resemble radiation that is encountered in practice

  5. Characterizing piezoscanner hysteresis and creep using optical levers and a reference nanopositioning stage

    International Nuclear Information System (INIS)

    Xie, H.; Regnier, S.; Rakotondrabe, M.

    2009-01-01

    A method using atomic force microscope (AFM) optical levers and a reference nanopositioning stage has been developed to characterize piezoscanner hysteresis and creep. The piezoscanner is fixed on a closed-loop nanopositioning stage, both of which have the same arrangement on each axis of the three spatial directions inside the AFM-based nanomanipulation system. To achieve the characterization, the optical lever is used as a displacement sensor to measure the relative movement between the nanopositioning stage and the piezoscanner by laterally tracking a well-defined slope with the AFM cantilever in tapping mode. This setup can be used to relate the piezoscanner's voltage input to a reference displacement from the nanopositioning stage. The hysteresis and creep were accurately calibrated by the presented method, which uses the current setup of the AFM-based nanomanipulation system without any modification or additional devices.
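
    The creep behaviour measured with such a setup is often summarised with the logarithmic creep model x(t) = x0·(1 + γ·log10(t/t0)). The fitting sketch below applies that commonly quoted model to synthetic displacement data; the model choice, reference time t0 and parameter values are assumptions rather than the paper's actual analysis.

```python
# Hedged sketch: fit the commonly used logarithmic creep model
#   x(t) = x0 * (1 + gamma * log10(t / t0))
# to displacement-vs-time data after a voltage step. Data here are synthetic.
import numpy as np
from scipy.optimize import curve_fit

T0 = 0.1  # reference time after the voltage step (s), assumed

def creep_model(t, x0, gamma):
    return x0 * (1.0 + gamma * np.log10(t / T0))

t = np.linspace(0.1, 100.0, 200)                      # seconds after the step
x_meas = creep_model(t, 5.0, 0.03) + np.random.default_rng(1).normal(0, 0.01, t.size)
(x0_fit, gamma_fit), _ = curve_fit(creep_model, t, x_meas, p0=[1.0, 0.01])
print(f"x0 = {x0_fit:.3f} um, creep factor gamma = {gamma_fit:.4f}")
```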

  6. STL pocket reference

    CERN Document Server

    Lischner, Ray

    2003-01-01

    The STL Pocket Reference describes the functions, classes, and templates in that part of the C++ standard library often referred to as the Standard Template Library (STL). The STL encompasses containers, iterators, algorithms, and function objects, which collectively represent one of the most important and widely used subsets of standard library functionality. The C++ standard library, even the subset known as the STL, is vast. It's next to impossible to work with the STL without some sort of reference at your side to remind you of template parameters, function invocations, return types--ind

  7. [Personalizing the reference level: gold standard to evaluate the quality of service perceived].

    Science.gov (United States)

    Rodrigo-Rincón, I; Reyes-Pérez, M; Martínez-Lozano, M E

    2014-01-01

    To determine the cutoff point at which in-house Nuclear Medicine Department (MND) customers consider the quality of service to be good (personalized cutoff). We conducted a survey of the professionals who had requested at least 5 tests from the Nuclear Medicine Department. A total of 71 doctors responded (response rate: 30%). A question was added to the questionnaire for the user to establish the cutoff point above which they would consider the quality of service to be good. The quality non-conformities, areas for improvement and strong points of the six questions measuring the quality of service (Likert scale 0 to 10) were compared using two different thresholds: the personalized cutoff and the one proposed a priori by the service itself. Test statistics: binomial test and Student's t-test for paired data. A cutoff value of 7 was proposed by the service as a reference, while 68.1% of respondents suggested a cutoff above 7 points (mean 7.9 points). The 6 elements of perceived quality were considered strong points with the cutoff proposed by the MND, whereas 3 were detected with the personalized threshold. Thirteen percent of the answers were non-conformities with the service cutoff versus 19.2% with the personalized one, the difference being statistically significant (difference 6.44%; 95% CI 0.83-12.06). The final picture of the perceived quality for an in-house customer differs when using the cutoff established by the Department versus the personalized cutoff given by the respondent. Copyright © 2013 Elsevier España, S.L. and SEMNIM. All rights reserved.

  8. Urban Green Network Design: Defining green network from an urban planning perspective

    Directory of Open Access Journals (Sweden)

    Andrea Tulisi

    2017-08-01

    Full Text Available From the theoretical context of the Smart City, various studies have emerged that adopt an analytical approach to the description of urban phenomena based on the principles of "network design"; this line of research uses network systems theory to define the principles that regulate the relationships among the various elements of urban sub-systems in order to optimize their functionality. On the same theoretical basis, urban greenspaces have also been studied as networks, by means of models capable of measuring the performance of the system in its entirety, laying the basis of a new multi-disciplinary research field called the green network. This paper presents the results of research aimed at clarifying the meaning of green network from an urban planning perspective through a lexical analysis applied to a textual corpus of more than 300 abstracts of research papers that have dealt with this topic over the last twenty years. The results show that the concept of green network still appears fuzzy and unclear, due to the different meanings given to the term "green" and to an incorrect use of the term "network", often referring to a generic set of natural areas present in a city, without any reference to network system theory or to the basic rules linking these elements together. For this reason, the paper proposes a single definition of green network from an urban planning perspective that takes into account the contribution of other research areas to effective green infrastructure planning. This is the concept of "urban green network design", defined as "an urban planning practice, supported by decision support tools able to model green infrastructure as a network, composed of natural and semi-natural areas, whose connections are modelled according to specific variables, in order to deliver an equal distribution of public services for enhancing the quality of life as well as a wide range of ecosystem services".

  9. The iTREN-2030 reference scenario until 2030. Deliverable D4

    Energy Technology Data Exchange (ETDEWEB)

    Fiorello, Davide; De Stasio, Claudia; Koehler, Jonathan; Kraft, Markus; Netwon, Sean; Purwanto, Joko; Schade, Burkhard; Schade, Wolfgang; Szimba, Eckhard

    2009-07-01

    The basic objective of iTREN-2030 is to extend the forecasting and assessment capabilities of the TRANS-TOOLS transport model to the new policy issues arising from the technology, environment and energy fields. This is achieved by coupling the TRANS-TOOLS model with three other models, ASTRA, POLES and TREMOVE, which cover these new policy issues. The TRANS-TOOLS transport network model has been developed to constitute the reference tool for supporting transport policy in the EU and is currently being developed further in several European projects. The scenario set-up to be developed in iTREN-2030 has been modified so that the project develops a reference scenario and an integrated scenario. For the reference scenario, the three other modelling tools are harmonised with TRANS-TOOLS and made consistent with each other. This results in a coherent scenario for Europe until 2030 for technology, transport, energy, environment and economic development. The integrated scenario will consider the changing framework conditions until 2030, in particular the policy pressure coming from climate policy and the increasing scarcity of fossil fuels, as well as the impact of the financial and economic crisis. Within the iTREN-2030 project, the overall objective of Work Package 4 (WP4), which produces this deliverable, is to develop the reference scenario for the quantitative projections using the four modelling tools involved in the project. The main aims of WP4 are to (a) define a consistent framework for using the different tools in an integrated way; (b) calibrate the models with exchanged input to a coherent joint reference; (c) implement external input from WP3 and run the models for projections; and (d) produce output procedures and templates to facilitate assessment in WP5.

  10. Automatic Enhancement of the Reference Set for Multi-Criteria Sorting in The Frame of Theseus Method

    Directory of Open Access Journals (Sweden)

    Fernandez Eduardo

    2014-05-01

    Full Text Available Some recent works have established the importance of handling abundant reference information in multi-criteria sorting problems. More valid information allows a better characterization of the agent’s assignment policy, which can lead to an improved decision support. However, sometimes information for enhancing the reference set may be not available, or may be too expensive. This paper explores an automatic mode of enhancing the reference set in the framework of the THESEUS multi-criteria sorting method. Some performance measures are defined in order to test results of the enhancement. Several theoretical arguments and practical experiments are provided here, supporting a basic advantage of the automatic enhancement: a reduction of the vagueness measure that improves the THESEUS accuracy, without additional efforts from the decision agent. The experiments suggest that the errors coming from inadequate automatic assignments can be kept at a manageable level.

  11. User Preferences in Reference Services: Virtual Reference and Academic Libraries

    Science.gov (United States)

    Cummings, Joel; Cummings, Lara; Frederiksen, Linda

    2007-01-01

    This study examines the use of chat in an academic library's user population and where virtual reference services might fit within the spectrum of public services offered by academic libraries. Using questionnaires, this research demonstrates that many within the academic community are open to the idea of chat-based reference or using chat for…

  12. Reference values for serum ferritin and percentage of transferrin saturation in Korean children and adolescents.

    Science.gov (United States)

    Oh, Hea Lin; Lee, Jun Ah; Kim, Dong Ho; Lim, Jung Sub

    2018-03-01

    Ferritin reference values vary by age, gender, and ethnicity. We aimed to determine reference values of serum ferritin (SF) and the percentage of transferrin saturation (TSAT) for Korean children and adolescents. We analyzed data from 2,487 participants (1,311 males and 1,176 females) aged 10-20 years from the Korea National Health and Nutrition Examination Survey (2010-2012). We calculated age- and gender-stratified means and percentile values for SF and TSAT. We first plotted mean SF and TSAT by gender and according to age. In males, mean SF tended to be relatively constant among participants aged 10 to 14 years, with an upward trend thereafter. Mean SF trended downward among female participants until the age of 15 years and remained constant thereafter. Thus, significant gender differences in ferritin exist from the age of 14 years. High levels of SF were associated with obesity, and lower SF levels were associated with anemia and menarche status. We established reference values of SF and TSAT according to age and gender. The reference values for SF calculated in this study can be used to test the association between SF values and other defined diseases in Korean children and adolescents.
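
    A minimal sketch of how age- and gender-stratified percentile values of this kind can be tabulated is given below; the DataFrame layout, age bands and simulated values are assumptions and do not reproduce the KNHANES data.

```python
# Illustrative computation of age- and gender-stratified percentile values for
# serum ferritin (SF). All data are simulated stand-ins for survey measurements.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2024)
n = 2487
df = pd.DataFrame({
    "sex": rng.choice(["male", "female"], n),
    "age": rng.integers(10, 21, n),
    "sf_ng_ml": rng.lognormal(mean=3.3, sigma=0.5, size=n),
})
df["age_band"] = pd.cut(df["age"], bins=[9, 12, 14, 17, 20],
                        labels=["10-12", "13-14", "15-17", "18-20"])

percentiles = (df.groupby(["sex", "age_band"], observed=True)["sf_ng_ml"]
                 .describe(percentiles=[0.05, 0.5, 0.95])
                 .loc[:, ["mean", "5%", "50%", "95%"]])
print(percentiles.round(1))
```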

  13. Sensor distributions for structural monitoring

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Bernal, Dionisio

    2017-01-01

    Deciding on the spatial distribution of output sensors for vibration-based structural health monitoring (SHM) is a task that has been, and still is, studied extensively. Yet, when referring to the conventional damage characterization hierarchy, composed of detection, localization, and quantification, it is primarily the first component that has been addressed with regard to optimal sensor placement. In this particular context, a common approach is to distribute sensors, of which the amount is determined a priori, such that some scalar function of the probability of detection for a pre-defined set of damage patterns is maximized. Obviously, the optimal sensor distribution, in terms of damage detection, is algorithm-dependent, but studies have shown how correlation generally exists between the different strategies. However, it still remains a question how this "optimality" correlates...

  14. Official ERS technical standards: Global Lung Function Initiative reference values for the carbon monoxide transfer factor for Caucasians.

    Science.gov (United States)

    Stanojevic, Sanja; Graham, Brian L; Cooper, Brendan G; Thompson, Bruce R; Carter, Kim W; Francis, Richard W; Hall, Graham L

    2017-09-01

    There are numerous reference equations available for the single-breath transfer factor of the lung for carbon monoxide (TLCO); however, it is not always clear which reference set should be used in clinical practice. The aim of the study was to develop the Global Lung Function Initiative (GLI) all-age reference values for TLCO. Data from 19 centres in 14 countries were collected to define TLCO reference values. Similar to the GLI spirometry project, reference values were derived using the LMS (lambda, mu, sigma) method and the GAMLSS (generalised additive models for location, scale and shape) programme in R. 12 660 TLCO measurements from asymptomatic, lifetime nonsmokers were submitted; 85% of the submitted data were from Caucasians. All data were uncorrected for haemoglobin concentration. Following adjustments for elevation above sea level, gas concentration and assumptions used for calculating the anatomic dead space volume, there was a high degree of overlap between the datasets. Reference values for Caucasians aged 5-85 years were derived for TLCO, the transfer coefficient of the lung for carbon monoxide and alveolar volume. This is the largest collection of normative TLCO data, and the first global reference values available for TLCO. Copyright ©ERS 2017.
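
    As a reminder of the general LMS form underlying such reference equations (the GLI-specific coefficient functions are not reproduced here), a measured value y at age t is converted to a z-score from the age-varying skewness L(t), median M(t) and coefficient of variation S(t):

```latex
% General LMS z-score and percentile curves (Cole's LMS method);
% L(t), M(t), S(t) are the age-dependent skewness, median and coefficient of variation.
z \;=\; \frac{\bigl(y/M(t)\bigr)^{L(t)} - 1}{L(t)\,S(t)} \quad (L \neq 0),
\qquad
z \;=\; \frac{\ln\bigl(y/M(t)\bigr)}{S(t)} \quad (L = 0),
\qquad
C_{100\alpha}(t) \;=\; M(t)\,\bigl(1 + L(t)\,S(t)\,z_{\alpha}\bigr)^{1/L(t)} .
```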

  15. Python essential reference

    CERN Document Server

    Beazley, David M

    2009-01-01

    Python Essential Reference is the definitive reference guide to the Python programming language — the one authoritative handbook that reliably untangles and explains both the core Python language and the most essential parts of the Python library. Designed for the professional programmer, the book is concise, to the point, and highly accessible. It also includes detailed information on the Python library and many advanced subjects that is not available in either the official Python documentation or any other single reference source. Thoroughly updated to reflect the significant new programming language features and library modules that have been introduced in Python 2.6 and Python 3, the fourth edition of Python Essential Reference is the definitive guide for programmers who need to modernize existing Python code or who are planning an eventual migration to Python 3. Programmers starting a new Python project will find detailed coverage of contemporary Python programming idioms.

  16. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    Science.gov (United States)

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy.

  17. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample … in the organizational attributes of specific interest group types. As expected, our comparison of coding schemes reveals a closer link between group attributes and group type in narrower classification schemes based on group organizational characteristics than those based on a behavioral definition of lobbying.

  18. Specifications for trueness and precision of a reference measurement system for serum/plasma 25-hydroxyvitamin D analysis.

    Science.gov (United States)

    Stöckl, Dietmar; Sluss, Patrick M; Thienpont, Linda M

    2009-10-01

    The divergence in analytical quality of serum/plasma 25-hydroxyvitamin D analysis calls for defining specifications for a reference measurement system. Fundamentally, in a reference measurement system, there should be a relationship between the analytical specifications for higher-order (reference) and lower-order (routine) measurements. Therefore, when setting specifications, we started with limits for routine imprecision (CV(rou)) and bias (B(rou)) using 4 models: (1) misclassifications in diagnosis, (2) biological variation data (reference interval (RI) and monitoring), (3) expert recommendations, and (4) state-of-the-art performance. Then, we used the derived goals to tailor those for reference measurements and certified reference materials (CRMs) for calibration by setting the limits for CV(ref) at 0.5 CV(rou), B(ref) at 0.33 B(rou), and max. uncertainty (U(max)) at 0.33 B(ref). The established specifications ranged between CV(rou) … (model 3) and CV(rou) … (model 2, monitoring). Model 2 (monitoring) gave the most stringent goals; model 3, the most liberal ones. Accounting for state-of-the-art performance and certification capabilities, we used model 2 (RI) to recommend achievable goals: for routine testing, CV(rou) …; for reference measurements, CV(ref) …
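
    Read numerically, the tailoring rules quoted above work as in the following purely illustrative example; the routine limits chosen here are assumed round numbers, not the goals actually derived in the paper.

```latex
% Illustrative tailoring of reference-level goals from assumed routine goals.
\text{Assume } CV_{rou} = 10\%,\; B_{rou} = 6\%:
\qquad
CV_{ref} = 0.5 \times 10\% = 5\%,
\quad
B_{ref} = 0.33 \times 6\% \approx 2\%,
\quad
U_{max} = 0.33 \times 2\% \approx 0.7\% .
```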

  19. CSS Pocket Reference

    CERN Document Server

    Meyer, Eric

    2011-01-01

    When you're working with CSS and need a quick answer, CSS Pocket Reference delivers. This handy, concise book provides all of the essential information you need to implement CSS on the fly. Ideal for intermediate to advanced web designers and developers, the 4th edition is revised and updated for CSS3, the latest version of the Cascading Style Sheet specification. Along with a complete alphabetical reference to CSS3 selectors and properties, you'll also find a short introduction to the key concepts of CSS. Based on Cascading Style Sheets: The Definitive Guide, this reference is an easy-to-us

  20. An evaluation of selected herbal reference texts and comparison to published reports of adverse herbal events.

    Science.gov (United States)

    Haller, Christine A; Anderson, Ilene B; Kim, Susan Y; Blanc, Paul D

    2002-01-01

    There has been a recent proliferation of medical reference texts intended to guide practitioners whose patients use herbal therapies. We systematically assessed six herbal reference texts to evaluate the information they contain on herbal toxicity. We selected six major herbal references published from 1996 to 2000 to evaluate the adequacy of their toxicological information in light of published adverse events. To identify herbs most relevant to toxicology, we reviewed herbal-related calls to our regional California Poison Control System, San Francisco division (CPCS-SF) in 1998 and identified the 12 herbs (defined as botanical dietary supplements) most frequently involved in these CPCS-SF referrals. We searched Medline (1966 to 2000) to identify published reports of adverse effects potentially related to these same 12 herbs. We scored each herbal reference text on the basis of information inclusiveness for the target 12 herbs, with a maximal overall score of 3. The herbs, identified on the basis of CPCS-SF call frequency were: St John's wort, ma huang, echinacea, guarana, ginkgo, ginseng, valerian, tea tree oil, goldenseal, arnica, yohimbe and kava kava. The overall herbal reference scores ranged from 2.2 to 0.4 (median 1.1). The Natural Medicines Comprehensive Database received the highest overall score and was the most complete and useful reference source. All of the references, however, lacked sufficient information on management of herbal medicine overdose, and several had incorrect overdose management guidelines that could negatively impact patient care. Current herbal reference texts do not contain sufficient information for the assessment and management of adverse health effects of botanical therapies.

  1. A Frequency Matching Method: Solving Inverse Problems by Use of Geologically Realistic Prior Information

    DEFF Research Database (Denmark)

    Lange, Katrine; Frydendall, Jan; Cordua, Knud Skou

    2012-01-01

    The frequency matching method defines a closed form expression for a complex prior that quantifies the higher order statistics of a proposed solution model to an inverse problem. While existing solution methods to inverse problems are capable of sampling the solution space while taking into account arbitrarily complex a priori information defined by sample algorithms, it is not possible to directly compute the maximum a posteriori model, as the prior probability of a solution model cannot be expressed. We demonstrate how the frequency matching method enables us to compute the maximum a posteriori solution model to an inverse problem by using a priori information based on multiple point statistics learned from training images. We demonstrate the applicability of the suggested method on a synthetic tomographic crosshole inverse problem.
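
    The sketch below illustrates the core ingredient of frequency matching: counting small multiple-point patterns in a training image and in a candidate model and comparing the two frequency distributions with a chi-square-like dissimilarity. The pattern size, the dissimilarity expression and the binary images are illustrative assumptions; the paper's exact formulation and its use inside the maximum a posteriori optimisation may differ.

```python
# Hedged sketch: compare the frequency distribution of 2x2 multiple-point patterns in a
# candidate model with that of a training image (synthetic binary images used here).
import numpy as np
from collections import Counter

def pattern_histogram(image, size=2):
    """Count all size x size patterns in a 2D categorical image."""
    counts = Counter()
    rows, cols = image.shape
    for i in range(rows - size + 1):
        for j in range(cols - size + 1):
            counts[tuple(image[i:i + size, j:j + size].ravel())] += 1
    return counts

def dissimilarity(hist_model, hist_training):
    """Chi-square-like distance between normalised pattern frequencies."""
    keys = set(hist_model) | set(hist_training)
    n_m = sum(hist_model.values())
    n_t = sum(hist_training.values())
    chi2 = 0.0
    for k in keys:
        f_m = hist_model.get(k, 0) / n_m
        f_t = hist_training.get(k, 0) / n_t
        chi2 += (f_m - f_t) ** 2 / (f_m + f_t + 1e-12)
    return chi2

rng = np.random.default_rng(0)
training = (rng.random((60, 60)) < 0.3).astype(int)   # binary training image (stand-in)
candidate = (rng.random((60, 60)) < 0.5).astype(int)  # candidate solution model
print(dissimilarity(pattern_histogram(candidate), pattern_histogram(training)))
```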

  2. On defining semantics of extended attribute grammars

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    1980-01-01

    Knuth has introduced attribute grammars (AGs) as a tool to define the semantics of context-free languages. The use of AGs in connection with programming language definitions has mostly been to define the context-sensitive syntax of the language and to define a translation in code for a hypothetic

  3. Endogenizing Prospect Theory's Reference Point

    OpenAIRE

    Ulrich Schmidt; Horst Zank

    2010-01-01

    In previous models of (cumulative) prospect theory reference-dependence of preferences is imposed beforehand and the location of the reference point is exogenously determined. This note provides a foundation of prospect theory, where reference-dependence is derived from preference conditions and a unique reference point arises endogenously.

  4. The citation with reference and the citation as a reference

    Directory of Open Access Journals (Sweden)

    Rafael Antonio Cunha Perrone

    2011-12-01

    Full Text Available This essay aims to establish parallels between the academic citation and the citation used as a reference for the production of architectural works or projects. The article discusses the use of paradigmatic or significant figures, transposed or amalgamated into works of architecture, based on an understanding of the citation as argumentative and qualitative material. It investigates some of the asymmetries and congruences in its use in written language, as direct transcription or as an interpretive source, reworked and incorporated as an argument in another text. It argues that, in teaching, studying and making architecture, one must know how to cite with reference in order to be able to cite as a reference.

  5. Marketing Reference Services.

    Science.gov (United States)

    Norman, O. Gene

    1995-01-01

    Relates the marketing concept to library reference services. Highlights include a review of the literature and an overview of marketing, including research, the marketing mix, strategic plan, marketing plan, and marketing audit. Marketing principles are applied to reference services through the marketing mix elements of product, price, place, and…

  6. Reference cells and ploidy in the comet assay

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2015-02-01

    Full Text Available In the comet assay, single cells are analyzed with respect to their level of DNA damage. Discrimination of the individual cell or cell type based on DNA content, with concomitant scoring of the DNA damage, is useful since this may allow analysis of mixtures of cells. Different cells can then be characterized based on their ploidy, cell cycle stage, or genome size. We here describe two applications of such a cell type-specific comet assay: (i) testicular cell suspensions, analyzed on the basis of their ploidy during spermatogenesis; and (ii) reference cells in the form of fish erythrocytes which can be included as internal standards to correct for inter-assay variations. With standard fluorochromes used in the comet assay, the total staining signal from each cell – whether damaged or undamaged – was found to be associated with the cell’s DNA content. Analysis of the fluorescence intensity of single cells is straightforward since these data are available in scoring systems based on image analysis. The analysis of testicular cell suspensions provides information on cell type specific composition, susceptibility to genotoxicants, and DNA repair. Internal reference cells, either untreated or carrying defined numbers of lesions induced by ionizing radiation, are useful for investigation of experimental factors that can cause variation in comet assay results, and for routine inclusion in experiments to facilitate standardization of methods and comparison of comet assay data obtained in different experiments or in different laboratories. They can also be used - in combination with a reference curve - to quantify the DNA lesions induced by a certain treatment. Fish cells of a range of genome sizes, both greater and smaller than human, are suitable for this purpose and they are inexpensive.
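
    A minimal sketch of the scoring step described here (Python; the intensity cut-offs and numbers are purely illustrative and not part of the study): comets are grouped into DNA-content classes by total fluorescence, and the damage statistic is then summarised per class.

```python
import numpy as np

def gate_by_dna_content(total_intensity, tail_dna_percent, boundaries):
    """Group comets into DNA-content classes (e.g. different ploidy classes or an
    internal reference-cell class) by total fluorescence, then summarise the
    % tail DNA per class. `boundaries` are illustrative intensity cut-offs."""
    bins = np.digitize(total_intensity, boundaries)
    summary = {}
    for b in np.unique(bins):
        sel = bins == b
        summary[int(b)] = {
            "n_cells": int(sel.sum()),
            "median_tail_dna": float(np.median(tail_dna_percent[sel])),
        }
    return summary

# Hypothetical scoring output: intensities in arbitrary units, damage in % tail DNA
intensity = np.array([0.9, 1.1, 2.0, 2.1, 3.9, 4.2])
tail_dna  = np.array([5.0, 7.0, 12.0, 10.0, 25.0, 30.0])
print(gate_by_dna_content(intensity, tail_dna, boundaries=[1.5, 3.0]))
```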

  7. Defining Modules, Modularity and Modularization

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth; Pedersen, Per Erik Elgård

    The paper describes the evolution of the concept of modularity in a historical perspective. The main reasons for modularity are: create variety, utilize similarities, and reduce complexity. The paper defines the terms: module, modularity, and modularization.

  8. An Adaptive Critic Approach to Reference Model Adaptation

    Science.gov (United States)

    Krishnakumar, K.; Limes, G.; Gundy-Burlet, K.; Bryant, D.

    2003-01-01

    Neural networks have been successfully used for implementing control architectures for different applications. In this work, we examine a neural network augmented adaptive critic as a Level 2 intelligent controller for a C- 17 aircraft. This intelligent control architecture utilizes an adaptive critic to tune the parameters of a reference model, which is then used to define the angular rate command for a Level 1 intelligent controller. The present architecture is implemented on a high-fidelity non-linear model of a C-17 aircraft. The goal of this research is to improve the performance of the C-17 under degraded conditions such as control failures and battle damage. Pilot ratings using a motion based simulation facility are included in this paper. The benefits of using an adaptive critic are documented using time response comparisons for severe damage situations.

  9. LINQ Pocket Reference

    CERN Document Server

    Albahari, Joseph

    2008-01-01

    Ready to take advantage of LINQ with C# 3.0? This guide has the detail you need to grasp Microsoft's new querying technology, and concise explanations to help you learn it quickly. And once you begin to apply LINQ, the book serves as an on-the-job reference when you need immediate reminders. All the examples in the LINQ Pocket Reference are preloaded into LINQPad, the highly praised utility that lets you work with LINQ interactively. Created by the authors and free to download, LINQPad will not only help you learn LINQ, it will have you thinking in LINQ. This reference explains: LINQ's ke

  10. Aluminium-gold reference material for the k0-standardisation of neutron activation analysis

    International Nuclear Information System (INIS)

    Ingelbrecht, C.; Peetermans, F.; Corte, F. de; Wispelaere, A. de; Vandecasteele, C.; Courtijn, E.; Hondt, P. d'

    1991-01-01

    Gold is an excellent comparator material for the k0-standardisation of neutron activation analysis because of its convenient and well defined nuclear properties. The most suitable form for a reference material is a dilute aluminium-gold alloy, for which the self-shielding effect for neutrons is small. Castings of composition Al-0.1 wt.% Au were prepared by crucible-less levitation melting, which gives close control of ingot composition with minimal contamination of the melt. The alloy composition was checked using induction-coupled plasma source emission spectrometry. The homogeneity of the alloy was measured by neutron activation analysis and a relative standard deviation of the gold content of 0.30% was found (10 mg samples). Metallography revealed a homogeneous distribution of AuAl2 particles. The alloy was certified as Reference Material CBNM-530, with certified gold mass fraction 0.100±0.002 wt.%. (orig.)

  11. A reference stress approach for the characterisation of the creep failure of dissimilar welds under isothermal conditions

    International Nuclear Information System (INIS)

    Nicholson, R.D.; Williams, J.A.

    1988-11-01

    In high temperature power plant, welds between austenitic and ferritic steels are required to operate under plant conditions for up to 250,000 h. The experience and failure modes for such joints are briefly surveyed in this report. A semi-empirical reference stress approach is used to define the failure life of joints under isothermal conditions. The reference stress is based on a previously published form for multiaxial creep fracture of homogeneous materials but modified to include an additional factor to reflect the complex strains present close to the interface in a dissimilar weld. This reference stress can be modified to give approximate bounds characterised by the equivalent stress or the axial stress on the weld. The reference stress, when applied to the 2¼Cr1Mo:Type 316 welded component data base, gives conservative results for the test data available although conservatism is low for the 9Cr1Mo:Alloy 600 combination. The existing data base for welded components is limited. More data are needed covering a wider range of stress ratios and incorporating bending loads. (author)

  12. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
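
    The flavour of an algorithmically expressed verification test can be sketched as follows (an illustrative check, not code from the TORCH suite): any co-designed implementation's answer is accepted if it meets a residual tolerance, independently of the language, algorithm or hardware used to produce it.

```python
import numpy as np

def verify_linear_solve(A, b, x, tol=1e-8):
    """Accept any implementation's answer `x` to A x = b if the relative
    residual is below a tolerance -- the answer is checked, not the method."""
    residual = np.linalg.norm(A @ x - b) / (
        np.linalg.norm(A) * np.linalg.norm(x) + np.linalg.norm(b))
    return residual < tol

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
b = rng.standard_normal(50)
x = np.linalg.solve(A, b)          # stand-in for the implementation under test
assert verify_linear_solve(A, b, x)
```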

  13. Integrating Nursing Diagnostic Concepts into the Medical Entities Dictionary Using the ISO Reference Terminology Model for Nursing Diagnosis

    OpenAIRE

    Hwang, Jee-In; Cimino, James J.; Bakken, Suzanne

    2003-01-01

    Objective: The purposes of the study were (1) to evaluate the usefulness of the International Standards Organization (ISO) Reference Terminology Model for Nursing Diagnoses as a terminology model for defining nursing diagnostic concepts in the Medical Entities Dictionary (MED) and (2) to create the additional hierarchical structures required for integration of nursing diagnostic concepts into the MED.

  14. Reference class forecasting

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent

    optimism and misinformation. RCF builds on theories for which Daniel Kahneman won the Nobel Prize in economics in 2002. RCF estimates the budget for a given project on the basis of the actual outcomes of the budgets in a reference class of projects. RCF is carried out in three steps: 1. Identification of a relevant reference...

  15. A priori-defined diet quality indices, biomarkers and risk for type 2 diabetes in five ethnic groups: the Multiethnic Cohort.

    Science.gov (United States)

    Jacobs, Simone; Boushey, Carol J; Franke, Adrian A; Shvetsov, Yurii B; Monroe, Kristine R; Haiman, Christopher A; Kolonel, Laurence N; Le Marchand, Loic; Maskarinec, Gertraud

    2017-08-01

    Dietary indices have been related to risk for type 2 diabetes (T2D) predominantly in white populations. The present study evaluated this association in the ethnically diverse Multiethnic Cohort and examined four diet quality indices in relation to T2D risk, homoeostatic model assessment-estimated insulin resistance (HOMA-IR) and biomarkers of dyslipidaemia, inflammation and adipokines. The T2D analysis included 166 550 white, African American, Native Hawaiian, Japanese American and Latino participants (9200 incident T2D cases). Dietary intake was assessed at baseline using a quantitative FFQ and T2D status was based on three self-reports and confirmed by administrative data. Biomarkers were assessed about 10 years later in a biomarker subcohort (n 10 060). Sex- and ethnicity-specific hazard ratios were calculated for the Healthy Eating Index-2010 (HEI-2010), the alternative HEI-2010 (AHEI-2010), the alternate Mediterranean diet score (aMED) and the Dietary Approaches to Stop Hypertension (DASH). Multivariable-adjusted means of biomarkers were compared across dietary index tertiles in the biomarker subcohort. The AHEI-2010, aMED (in men only) and DASH scores were related to a 10-20 % lower T2D risk, with the strongest associations in whites and the direction of the relationships mostly consistent across ethnic groups. Higher scores on the four indices were related to lower HOMA-IR, TAG and C-reactive protein concentrations, not related to leptin, and the DASH score was directly associated with adiponectin. The AHEI-2010 and DASH were directly related to HDL-cholesterol in women. Potential underlying biological mechanisms linking diet quality and T2D risk are an improved lipid profile and reduced systemic inflammation and, with regards to DASH alone, an improved adiponectin profile.

  16. Android quick APIs reference

    CERN Document Server

    Cinar, Onur

    2015-01-01

    The Android Quick APIs Reference is a condensed code and APIs reference for the new Google Android 5.0 SDK. It presents the essential Android APIs in a well-organized format that can be used as a handy reference. You won't find any technical jargon, bloated samples, drawn out history lessons, or witty stories in this book. What you will find is a software development kit and APIs reference that is concise, to the point and highly accessible. The book is packed with useful information and is a must-have for any mobile or Android app developer or programmer. In the Android Quick APIs Refe

  17. Sensor employing internal reference electrode

    DEFF Research Database (Denmark)

    2013-01-01

    The present invention concerns a novel internal reference electrode as well as a novel sensing electrode for an improved internal reference oxygen sensor and the sensor employing same.

  18. The Academy's Duty to Define Patriotism

    Science.gov (United States)

    Gitlin, Todd

    2002-01-01

    The author discusses how universities might serve the public interest by stirring up not fewer but more and deeper debates on the failures of intelligence that afflicted American institutions before 11 September 2001--and he does not refer simply to the feebleness of the FBI and other investigation bureaucracies. He refers to the parochialism, the…

  19. With Reference to Reference Genes: A Systematic Review of Endogenous Controls in Gene Expression Studies.

    Science.gov (United States)

    Chapman, Joanne R; Waldenström, Jonas

    2015-01-01

    The choice of reference genes that are stably expressed amongst treatment groups is a crucial step in real-time quantitative PCR gene expression studies. Recent guidelines have specified that a minimum of two validated reference genes should be used for normalisation. However, a quantitative review of the literature showed that the average number of reference genes used across all studies was 1.2. Thus, the vast majority of studies continue to use a single gene, with β-actin (ACTB) and/or glyceraldehyde 3-phosphate dehydrogenase (GAPDH) being commonly selected in studies of vertebrate gene expression. Few studies (15%) tested a panel of potential reference genes for stability of expression before using them to normalise data. Amongst studies specifically testing reference gene stability, few found ACTB or GAPDH to be optimal, whereby these genes were significantly less likely to be chosen when larger panels of potential reference genes were screened. Fewer reference genes were tested for stability in non-model organisms, presumably owing to a dearth of available primers in less well characterised species. Furthermore, the experimental conditions under which real-time quantitative PCR analyses were conducted had a large influence on the choice of reference genes, whereby different studies of rat brain tissue showed different reference genes to be the most stable. These results highlight the importance of validating the choice of normalising reference genes before conducting gene expression studies.
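
    A minimal sketch of normalisation against more than one validated reference gene (Python; the Cq values and the assumed common amplification efficiency are hypothetical): subtracting the mean reference Cq is equivalent to dividing by the geometric mean of the reference-gene quantities.

```python
import numpy as np

def relative_expression(cq_target, cq_reference_genes, efficiency=2.0):
    """Normalise a target gene against two or more reference genes by subtracting
    the mean reference Cq per sample (equivalent to dividing by the geometric mean
    of the reference quantities when amplification efficiencies are equal)."""
    cq_ref = np.mean(cq_reference_genes, axis=0)   # mean Cq of the reference genes
    delta_cq = cq_target - cq_ref
    return efficiency ** (-delta_cq)               # relative quantity per sample

# Hypothetical Cq values for three samples
cq_target = np.array([24.1, 23.5, 25.0])
cq_refs   = np.array([[18.0, 18.2, 18.1],          # a first validated reference gene
                      [21.0, 21.1, 20.9]])         # a second validated reference gene
print(relative_expression(cq_target, cq_refs))
```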

  20. DEFINED CONTRIBUTION PLANS, DEFINED BENEFIT PLANS, AND THE ACCUMULATION OF RETIREMENT WEALTH

    Science.gov (United States)

    Poterba, James; Rauh, Joshua; Venti, Steven; Wise, David

    2010-01-01

    The private pension structure in the United States, once dominated by defined benefit (DB) plans, is currently divided between defined contribution (DC) and DB plans. Wealth accumulation in DC plans depends on the participant's contribution behavior and on financial market returns, while accumulation in DB plans is sensitive to a participant's labor market experience and to plan parameters. This paper simulates the distribution of retirement wealth under representative DB and DC plans. It uses data from the Health and Retirement Study (HRS) to explore how asset returns, earnings histories, and retirement plan characteristics contribute to the variation in retirement wealth outcomes. We simulate DC plan accumulation by randomly assigning individuals a share of wages that they and their employer contribute to the plan. We consider several possible asset allocation strategies, with asset returns drawn from the historical return distribution. Our DB plan simulations draw earnings histories from the HRS, and randomly assign each individual a pension plan drawn from a sample of large private and public defined benefit plans. The simulations yield distributions of both DC and DB wealth at retirement. Average retirement wealth accruals under current DC plans exceed average accruals under private sector DB plans, although DC plans are also more likely to generate very low retirement wealth outcomes. The comparison of current DC plans with more generous public sector DB plans is less definitive, because public sector DB plans are more generous on average than their private sector counterparts. PMID:21057597
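
    The simulation logic described above can be sketched roughly as follows (Python; the contribution-share range, wage paths and return sample are placeholders, not the HRS data or the paper's parameters):

```python
import numpy as np

def simulate_dc_wealth(wage_paths, historical_returns, n_sims=10_000, seed=1):
    """Monte Carlo sketch of defined-contribution accumulation: each simulated
    participant gets an earnings history, a randomly assigned total
    (employee + employer) contribution share, and yearly returns resampled
    from a historical return series."""
    rng = np.random.default_rng(seed)
    wealth = np.zeros(n_sims)
    for s in range(n_sims):
        wages = wage_paths[rng.integers(len(wage_paths))]   # draw an earnings history
        share = rng.uniform(0.03, 0.15)                     # assumed contribution range
        balance = 0.0
        for w in wages:
            r = rng.choice(historical_returns)              # resample a yearly return
            balance = balance * (1.0 + r) + share * w
        wealth[s] = balance
    return wealth

# Hypothetical inputs: two toy 40-year wage paths and a small return sample
wage_paths = [np.linspace(30_000, 80_000, 40), np.linspace(25_000, 60_000, 40)]
returns = np.array([0.10, 0.07, -0.05, 0.15, 0.02])
print(np.percentile(simulate_dc_wealth(wage_paths, returns, n_sims=2_000), [10, 50, 90]))
```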

  1. Implementation of a reference standard and proficiency testing programme by the World Wide Antimalarial Resistance Network (WWARN)

    Directory of Open Access Journals (Sweden)

    Barnes Karen I

    2010-12-01

    Full Text Available Abstract Background The Worldwide Antimalarial Resistance Network (WWARN) is a global collaboration to support the objective that anyone affected by malaria receives effective and safe drug treatment. The Pharmacology module aims to inform optimal anti-malarial drug selection. There is an urgent need to define the drug exposure-effect relationship for most anti-malarial drugs. Few anti-malarials have had their therapeutic blood concentration levels defined. One of the main challenges in assessing safety and efficacy data in relation to drug concentrations is the comparability of data generated from different laboratories. To explain differences in anti-malarial pharmacokinetics in studies with different measurement laboratories it is necessary to confirm the accuracy of the assay methods. This requires the establishment of an external quality assurance process to assure results that can be compared. This paper describes this process. Methods The pharmacology module of WWARN has established a quality assurance/quality control (QA/QC) programme consisting of two separate components: 1. A proficiency testing programme where blank human plasma spiked with certified reference material (CRM) in different concentrations is sent out to participating bioanalytical laboratories. 2. A certified reference standard programme where accurately weighed amounts of certified anti-malarial reference standards, metabolites, and internal standards are sent to participating bioanalytical and in vitro laboratories. Conclusion The proficiency testing programme is designed as a cooperative effort to help participating laboratories assess their ability to carry out drug analysis, resolve any potential problem areas and to improve their results - and, in so doing, to improve the quality of anti-malarial pharmacokinetic data published and shared with WWARN. By utilizing the same source of standards for all laboratories, it is possible to minimize bias arising from poor

  2. Setting reference targets

    International Nuclear Information System (INIS)

    Ruland, R.E.

    1997-04-01

    Reference Targets are used to represent virtual quantities like the magnetic axis of a magnet or the definition of a coordinate system. To explain the function of reference targets in the sequence of the alignment process, this paper will first briefly discuss the geometry of the trajectory design space and of the surveying space, then continue with an overview of a typical alignment process. This is followed by a discussion on magnet fiducialization. While the magnetic measurement methods to determine the magnetic centerline are only listed (they will be discussed in detail in a subsequent talk), emphasis is given to the optical/mechanical methods and to the task of transferring the centerline position to reference targets

  3. 22 CFR 92.36 - Authentication defined.

    Science.gov (United States)

    2010-04-01

    22 CFR Foreign Relations, Notarial Acts, § 92.36 Authentication defined. An authentication is a certification of the genuineness of... recognized in another jurisdiction. Documents which may require authentication include legal instruments...

  4. Ramifications of defining high-level waste

    International Nuclear Information System (INIS)

    Wood, D.E.; Campbell, M.H.; Shupe, M.W.

    1987-01-01

    The Nuclear Regulatory Commission (NRC) is considering rule making to provide a concentration-based definition of high-level waste (HLW) under authority derived from the Nuclear Waste Policy Act (NWPA) of 1982 and the Low Level Waste Policy Amendments Act of 1985. The Department of Energy (DOE), which has the responsibility to dispose of certain kinds of commercial waste, is supporting development of a risk-based classification system by the Oak Ridge National Laboratory to assist in developing and implementing the NRC rule. The system is two dimensional, with the axes based on the phrases highly radioactive and requires permanent isolation in the definition of HLW in the NWPA. Defining HLW will reduce the ambiguity in the present source-based definition by providing concentration limits to establish which materials are to be called HLW. The system allows the possibility of greater-confinement disposal for some wastes which do not require the degree of isolation provided by a repository. The definition of HLW will provide a firm basis for waste processing options which involve partitioning of waste into a high-activity stream for repository disposal, and a low-activity stream for disposal elsewhere. Several possible classification systems have been derived and the characteristics of each are discussed. The Defense High Level Waste Technology Lead Office at DOE - Richland Operations Office, supported by Rockwell Hanford Operations, has coordinated reviews of the ORNL work by a technical peer review group and other DOE offices. The reviews produced several recommendations and identified several issues to be addressed in the NRC rule making. 10 references, 3 figures

  5. Defining Game Mechanics

    DEFF Research Database (Denmark)

    Sicart (Vila), Miguel Angel

    2008-01-01

    This article defines game mechanics in relation to rules and challenges. Game mechanics are methods invoked by agents for interacting with the game world. I apply this definition to a comparative analysis of the games Rez, Every Extend Extra and Shadow of the Colossus that will show the relevance of a formal definition of game mechanics. Publication date: Dec 2008

  6. From plastic to gold: a unified classification scheme for reference standards in medical image processing

    Science.gov (United States)

    Lehmann, Thomas M.

    2002-05-01

    Reliable evaluation of medical image processing is of major importance for routine applications. Nonetheless, evaluation is often omitted or methodically defective when novel approaches or algorithms are introduced. Adopted from medical diagnosis, we define the following criteria to classify reference standards: 1. Reliance, if the generation or capturing of test images for evaluation follows an exactly determined and reproducible protocol. 2. Equivalence, if the image material or relationships considered within an algorithmic reference standard equal real-life data with respect to structure, noise, or other parameters of importance. 3. Independence, if any reference standard relies on a different procedure than that to be evaluated, or on other images or image modalities than those used routinely. This criterion bans the simultaneous use of one image for both the training and the test phase. 4. Relevance, if the algorithm to be evaluated is self-reproducible. If random parameters or optimization strategies are applied, reliability of the algorithm must be shown before the reference standard is applied for evaluation. 5. Significance, if the number of reference standard images that are used for evaluation is sufficiently large to enable statistically founded analysis. We demand that a true gold standard must satisfy Criteria 1 to 3. Any standard only satisfying two criteria, i.e., Criterion 1 and Criterion 2 or Criterion 1 and Criterion 3, is referred to as a silver standard. Other standards are termed plastic. Before exhaustive evaluation based on gold or silver standards is performed, its relevance must be shown (Criterion 4) and sufficient tests must be carried out to allow statistically founded analysis (Criterion 5). In this paper, examples are given for each class of reference standards.
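
    The classification rule stated above reads directly as a small decision function; the sketch below follows exactly those criteria (relevance and significance are treated as preconditions for using the standard, not as part of the class itself):

```python
def classify_reference_standard(reliance, equivalence, independence):
    """Classify a reference standard from the three defining criteria:
    gold   -- reliance, equivalence and independence all satisfied;
    silver -- reliance plus exactly one of equivalence or independence;
    plastic otherwise."""
    if reliance and equivalence and independence:
        return "gold"
    if reliance and (equivalence or independence):
        return "silver"
    return "plastic"

assert classify_reference_standard(True, True, True) == "gold"
assert classify_reference_standard(True, False, True) == "silver"
assert classify_reference_standard(False, True, True) == "plastic"
```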

  7. Indico CONFERENCE: Define the Call for Abstracts

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial, you will learn how to define and open a call for abstracts. When defining a call for abstracts, you will be able to define settings related to the type of questions asked during a review of an abstract, select the users who will review the abstracts, decide when to open the call for abstracts, and more.

  8. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Science.gov (United States)

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction. PMID:29495599

  9. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Directory of Open Access Journals (Sweden)

    Agustín Zaballos

    2018-02-01

    Full Text Available Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  10. Defining and Selecting Independent Directors

    Directory of Open Access Journals (Sweden)

    Eric Pichet

    2017-10-01

    Full Text Available Drawing from the Enlightened Shareholder Theory that the author first developed in 2011, this theoretical paper with practical and normative ambitions achieves a better definition of independent director, while improving the understanding of the roles he fulfils on boards of directors. The first part defines constructs like firms, Governance system and Corporate governance, offering a clear distinction between the latter two concepts before explaining the four main missions of a board. The second part defines the ideal independent director by outlining the objective qualities that are necessary and adding those subjective aspects that have turned this into a veritable profession. The third part defines the ideal process for selecting independent directors, based on nominating committees that should themselves be independent. It also includes ways of assessing directors who are currently in function, as well as modalities for renewing their mandates. The paper’s conclusion presents the Paradox of the Independent Director.

  11. Effect of reference conditions on flow rate, modifier fraction and retention in supercritical fluid chromatography.

    Science.gov (United States)

    De Pauw, Ruben; Shoykhet Choikhet, Konstantin; Desmet, Gert; Broeckhoven, Ken

    2016-08-12

    When using compressible mobile phases such as fluidic CO2, the density, the volumetric flow rates and volumetric fractions are pressure dependent. The pressure and temperature definition of these volumetric parameters (referred to as the reference conditions) may differ between systems, manufacturers and operating conditions. A supercritical fluid chromatography system was modified to operate in two modes with different definitions of the eluent delivery parameters, referred to as fixed and variable mode. For the variable mode, the volumetric parameters are defined with reference to the pump operating pressure and actual pump head temperature. These conditions may vary when, e.g., changing the column length, permeability, flow rate, etc. and are thus variable reference conditions. For the fixed mode, the reference conditions were set at 150 bar and 30°C, resulting in a definition of the mass flow rate and mass fraction of modifier that is independent of the operating conditions. For the variable mode, the mass flow rate of carbon dioxide increases with system pump operating pressure, decreasing the fraction of modifier. Comparing the void times and retention factors shows that the deviation between the two modes is almost independent of modifier percentage, but depends on the operating pressure. Recalculating the set volumetric fraction of modifier to the mass fraction results in the same retention behaviour for both modes. This shows that retention in SFC can be best modelled using the mass fraction of modifier. The fixed mode also simplifies method scaling as it only requires matching the average column pressure. Copyright © 2016 Elsevier B.V. All rights reserved.
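
    The recalculation mentioned in the last step can be sketched as follows (Python; the densities are illustrative inputs that would in practice come from an equation of state evaluated at the chosen reference pressure and temperature):

```python
def modifier_mass_fraction(vol_fraction_modifier, rho_modifier, rho_co2_ref):
    """Convert a volumetric modifier fraction, defined at some reference
    pressure/temperature, into a mass fraction using the densities at those
    same reference conditions."""
    m_mod = vol_fraction_modifier * rho_modifier
    m_co2 = (1.0 - vol_fraction_modifier) * rho_co2_ref
    return m_mod / (m_mod + m_co2)

# Illustrative numbers only: 10 vol% methanol, with the CO2 density taken at two
# different reference pressures. The resulting mass fractions differ, which is
# why retention is better modelled on the mass fraction of modifier.
print(modifier_mass_fraction(0.10, 790.0, 700.0))   # CO2 density referenced at a lower pressure
print(modifier_mass_fraction(0.10, 790.0, 830.0))   # CO2 density referenced at a higher pressure
```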

  12. Determination of cadmium, lead and zinc in a candidate reference materials using isotope dilution mass spectrometry

    International Nuclear Information System (INIS)

    Munoz, Luis; Gras, Nuri; Quejido, Alberto; Fernandez, Marta

    2001-01-01

    The growing demands placed on analytical laboratories to ensure the reliability of their results, due to the introduction of quality systems and to the increasing use of metrology in chemical measurements, have led most laboratories to validate their methodologies and to control them statistically. One of the techniques used most often for these purposes is based on the use of reference materials. The proper use of these materials means that laboratory results may be traced to the International System of Units, analytical methodologies can be validated, instruments calibrated and chemical measurements harmonized. One of the biggest challenges in developing reference materials is that of certifying their properties, a process that has been defined as assigning a concentration value that is as close as possible to the true value together with its uncertainty. Organizations that produce reference materials use several options for their certification process, and among these is the use of a primary method. Among the primary methods recognized by the International Bureau of Weights and Measures is the Isotope Dilution Mass Spectrometry technique. The Chilean Nuclear Energy Commission, through its Reference Materials Program, has prepared a reference material of clam tissue, which has been chemically defined by different analytical methodologies applied in different national and international laboratories. This work describes the methodology developed with the CIEMAT for determining the elements lead, cadmium and zinc in the clam tissue reference material using the primary technique of Isotope Dilution Mass Spectrometry. The calculation is described for obtaining the spike amounts to be added to the sample and the procedure is explained for carrying out the isotopic exchange. The isotopic ratios 204Pb/205Pb, 111Cd/114Cd and 66Zn/67Zn were determined in an atomic emission spectrometer with a plasma source with the following characteristics: plasma
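
    The spike calculation rests on the standard two-isotope isotope-dilution balance; a minimal sketch follows (Python; the abundances, amounts and measured ratio are hypothetical and are not the certified values of the clam-tissue material):

```python
def idms_amount(n_spike, abund_sample, abund_spike, ratio_blend):
    """Standard two-isotope isotope-dilution balance.
    abund_*     = (abundance of reference isotope a, abundance of spike isotope b)
    ratio_blend = measured amount ratio n(a)/n(b) in the sample + spike blend.
    Returns the amount (mol) of the element present in the sample."""
    a_x, b_x = abund_sample
    a_z, b_z = abund_spike
    return n_spike * (a_z - ratio_blend * b_z) / (ratio_blend * b_x - a_x)

# Hypothetical numbers: a natural-like sample, an isotopically enriched spike,
# and a measured blend ratio lying between the sample and spike ratios.
n_x = idms_amount(n_spike=1.0e-7,
                  abund_sample=(0.524, 0.236),
                  abund_spike=(0.02, 0.95),
                  ratio_blend=1.2)
print(f"analyte amount in sample: {n_x:.3e} mol")
```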

  13. Defining Cyberbullying.

    Science.gov (United States)

    Englander, Elizabeth; Donnerstein, Edward; Kowalski, Robin; Lin, Carolyn A; Parti, Katalin

    2017-11-01

    Is cyberbullying essentially the same as bullying, or is it a qualitatively different activity? The lack of a consensual, nuanced definition has limited the field's ability to examine these issues. Evidence suggests that being a perpetrator of one is related to being a perpetrator of the other; furthermore, strong relationships can also be noted between being a victim of either type of attack. It also seems that both types of social cruelty have a psychological impact, although the effects of being cyberbullied may be worse than those of being bullied in a traditional sense (evidence here is by no means definitive). A complicating factor is that the 3 characteristics that define bullying (intent, repetition, and power imbalance) do not always translate well into digital behaviors. Qualities specific to digital environments often render cyberbullying and bullying different in circumstances, motivations, and outcomes. To make significant progress in addressing cyberbullying, certain key research questions need to be addressed. These are as follows: How can we define, distinguish between, and understand the nature of cyberbullying and other forms of digital conflict and cruelty, including online harassment and sexual harassment? Once we have a functional taxonomy of the different types of digital cruelty, what are the short- and long-term effects of exposure to or participation in these social behaviors? What are the idiosyncratic characteristics of digital communication that users can be taught? Finally, how can we apply this information to develop and evaluate effective prevention programs? Copyright © 2017 by the American Academy of Pediatrics.

  14. References for HNF-SD-WM-TRD-007, ''System specification for the double-shell tank system: HNF-PROs, CFRs, DOE Orders, WACs''

    International Nuclear Information System (INIS)

    Shaw, C.P.

    1998-01-01

    HNF-SD-WM-TRD-007, System Specification for the Double-Shell Tank System (hereafter referred to as the DST Specification), defines the requirements of the double-shell tank system at the Hanford Site for Phase 1 privatization. Many of the sections in this document reference other documents for design guidance and requirements. Referenced documents include Project Hanford Management Contract (PHMC) procedures (HNF-PROs), Code of Federal Regulations parts (CFRs), DOE Orders, and Washington Administrative Codes (WACs). This document provides the rationale for the selection and inclusion of HNF-PROs, CFRs, DOE Orders and WACs.

  15. Defining Hardwood Veneer Log Quality Attributes

    Science.gov (United States)

    Jan Wiedenbeck; Michael Wiemann; Delton Alderman; John Baumgras; William Luppold

    2004-01-01

    This publication provides a broad spectrum of information on the hardwood veneer industry in North America. Veneer manufacturers and their customers impose guidelines in specifying wood quality attributes that are very discriminating but poorly defined (e.g., exceptional color, texture, and/or figure characteristics). To better understand and begin to define the most...

  16. CMS Statistics Reference Booklet

    Data.gov (United States)

    U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...

  17. Defining Overweight and Obesity

    Science.gov (United States)

    ... weight for a given height is described as overweight or obese. Body Mass Index, or BMI, is ...

  18. Certified reference materials and reference methods for nuclear safeguards and security.

    Science.gov (United States)

    Jakopič, R; Sturm, M; Kraiem, M; Richter, S; Aregbe, Y

    2013-11-01

    Confidence in the comparability and reliability of measurement results in nuclear material and environmental sample analysis is established via certified reference materials (CRMs), reference measurements, and inter-laboratory comparisons (ILCs). Increased needs for quality control tools in proliferation resistance, environmental sample analysis, development of measurement capabilities over the years and progress in modern analytical techniques are the main reasons for the development of new reference materials and reference methods for nuclear safeguards and security. The Institute for Reference Materials and Measurements (IRMM) prepares and certifies large quantities of the so-called "large-sized dried" (LSD) spikes for accurate measurement of the uranium and plutonium content in dissolved nuclear fuel solutions by isotope dilution mass spectrometry (IDMS) and also develops particle reference materials applied for the detection of nuclear signatures in environmental samples. IRMM is currently replacing some of its exhausted stocks of CRMs with new ones whose specifications are up-to-date and tailored for the demands of modern analytical techniques. Some of the existing materials will be re-measured to improve the uncertainties associated with their certified values, and to enable laboratories to reduce their combined measurement uncertainty. Safeguards involve the quantitative verification by independent measurements so that no nuclear material is diverted from its intended peaceful use. Safeguards authorities pay particular attention to plutonium and the uranium isotope (235)U, indicating the so-called 'enrichment', in nuclear material and in environmental samples. In addition to the verification of the major ratios, n((235)U)/n((238)U) and n((240)Pu)/n((239)Pu), the minor ratios of the less abundant uranium and plutonium isotopes contain valuable information about the origin and the 'history' of material used for commercial or possibly clandestine purposes, and

  19. Design of a stateless low-latency router architecture for green software-defined networking

    Science.gov (United States)

    Saldaña Cercós, Silvia; Ramos, Ramon M.; Ewald Eller, Ana C.; Martinello, Magnos; Ribeiro, Moisés. R. N.; Manolova Fagertun, Anna; Tafur Monroy, Idelfonso

    2015-01-01

    Expanding software-defined networking (SDN) to transport networks requires new strategies to deal with the large number of flows that future core networks will have to face. New south-bound protocols within SDN have been proposed to benefit from having the control plane detached from the data plane, offering a cost- and energy-efficient forwarding engine. This paper presents an overview of a new approach named KeyFlow to simultaneously reduce latency, jitter, and power consumption in core network nodes. Results on an emulation platform indicate that round trip time (RTT) can be reduced by more than 50% compared to the reference protocol OpenFlow, especially when flow tables are densely populated. Jitter reduction has been demonstrated experimentally on a NetFPGA-based platform, and a 57.3% power consumption reduction has been achieved.
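
    One way to realise such table-free core forwarding (assumed here purely for illustration, since the abstract does not spell out KeyFlow's exact scheme) is to encode the whole path in a single label via the Chinese Remainder Theorem, so that each core node recovers its output port with one modulo operation instead of a flow-table lookup:

```python
from math import prod

def crt_route_label(node_keys, out_ports):
    """Build a single route label such that label % key_i == port_i for every
    core node on the path. Keys must be pairwise coprime and every port must
    be smaller than its node's key."""
    M = prod(node_keys)
    label = 0
    for k, p in zip(node_keys, out_ports):
        m = M // k
        label += p * m * pow(m, -1, k)    # modular inverse of m modulo k
    return label % M

keys  = [5, 7, 11]        # hypothetical per-switch keys along the path
ports = [2, 4, 6]         # desired output port at each switch
label = crt_route_label(keys, ports)
assert [label % k for k in keys] == ports
```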

  1. Project FreeGIS.net: the Reference Implementation defined

    Directory of Open Access Journals (Sweden)

    Sergio Farrugia

    2012-04-01

    Full Text Available “In time, we will see a new equilibrium emerge within which all the different forms of software will find their place: traditional commercial software, in the style of Microsoft or SAP, together with the business-web model of software for rent, in the style of Salesforce.com, and free software produced either by funded communities or by inspired individuals.” T. L. Friedman (Il Mondo è Piatto [The World Is Flat], Mondadori 2006, p. 116). Project FreeGIS.net: the Reference Implementation defined. FreeGIS.net aims to support the management, analysis and publication of geographic information through the use of open data formats, free software and open standards. FreeGIS.net is an INTERREG project funded by the supra-regional cooperation programme "Italy-Switzerland 2007-2013" of the ERDF (European Regional Development Fund) of the European Union.

  2. Viewpoint of defining the groundwater chemistry for the performance assessment on geological disposal of high level radioactive waste

    International Nuclear Information System (INIS)

    Sasamoto, Hiroshi; Yui, Mikazu

    2000-01-01

    This report presents the viewpoint of defining the groundwater chemistry for performance assessment of the second progress report for research and development on geological disposal. Based on the results of statistical analysis (binary scatter plots) of the measured data in addition to the consideration of the first progress report, we defined the five hypothetically modeled groundwaters considering the general geological conditions and importance for performance assessment. In order to evaluate the priority of groundwater chemistries, we have analyzed the above five hypothetical groundwaters by considering the results of multivariate statistical analyses, data reliability, evidence for geochemical controls on groundwater chemistry and exclusion criteria for potential repository sites in Japan. As a result, the fresh reducing high pH (FRHP) type groundwater has been selected for the Reference Case analysis, and the saline reducing high pH (SRHP) type groundwater has been selected for the Alternative Geological Environmental Case analysis, respectively. (author)

  3. A self-defining hierarchical data system

    Science.gov (United States)

    Bailey, J.

    1992-01-01

    The Self-Defining Data System (SDS) is a system which allows the creation of self-defining hierarchical data structures in a form which allows the data to be moved between different machine architectures. Because the structures are self-defining they can be used for communication between independent modules in a distributed system. Unlike disk-based hierarchical data systems such as Starlink's HDS, SDS works entirely in memory and is very fast. Data structures are created and manipulated as internal dynamic structures in memory managed by SDS itself. A structure may then be exported into a caller supplied memory buffer in a defined external format. This structure can be written as a file or sent as a message to another machine. It remains static in structure until it is reimported into SDS. SDS is written in portable C and has been run on a number of different machine architectures. Structures are portable between machines with SDS looking after conversion of byte order, floating point format, and alignment. A Fortran callable version is also available for some machines.
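
    The concept can be sketched as a structure serialised behind a small self-describing header (a conceptual Python sketch only; this is neither the actual SDS C API nor its external format):

```python
import json
import struct

def export_structure(node):
    """Serialise a nested dict of scalars/lists into a self-describing buffer:
    a fixed big-endian header (magic, version, payload length) followed by a
    type-tagged payload that any receiver can decode without prior knowledge
    of the structure's layout."""
    payload = json.dumps(node).encode("utf-8")
    header = struct.pack(">4sII", b"SDF0", 1, len(payload))
    return header + payload

def import_structure(buffer):
    """Reimport a buffer produced by export_structure()."""
    magic, version, length = struct.unpack(">4sII", buffer[:12])
    assert magic == b"SDF0" and version == 1
    return json.loads(buffer[12:12 + length].decode("utf-8"))

tree = {"exposure": {"detector": "ccd1", "counts": [12, 9, 30]}, "scale": 1.5}
assert import_structure(export_structure(tree)) == tree
```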

  4. The generation and management of references with the Online Mechanism for References - MORE

    Directory of Open Access Journals (Sweden)

    Proxério Manoel Felisberto

    2015-04-01

    Full Text Available Scientific production and the development of academic papers follow their own formalities. This paper focuses on those formalities that concern crediting the authors of the works used in the theoretical basis, through quotations and references. The goal is to help library users generate and manage references using a web tool developed for this purpose. There are many applications, on desktop and web platforms, that could be used for this task. However, some of them require the payment of an expensive license to be fully functional. Others offer free versions, but these are very limited and often do not generate references as specified by ABNT. Still others do not store the generated references for later use. In order to fill this gap, the Online Mechanism for References (MORE) was developed and made available to the general public on the web. Even so, fast technological advances combined with a high number of users demanded an update to the application, carried out recently. It is important to state that all the work was developed exclusively with proven, free-to-use technologies. Initially, the main tools available for generating and managing references were identified, together with the free technologies that could be used to build interactive web applications. This paper briefly describes the reengineering process that MORE underwent, its new structure, the new requirements met and its expanded portfolio of features. Finally, the results achieved after the reengineering are compared with indicators of its previous version.

  5. Tattoos defined.

    Science.gov (United States)

    Goldstein, Norman

    2007-01-01

    Tattoo definitions from general, foreign language, medical dictionaries and textbooks are reviewed. In addition to the common usage "to mark the skin with pigments," the word tattoo, used as a noun, first meant a signal on a drum or bugle to call military men to quarters. This chapter includes a variety of colorful, cultural and historical references. The increasing popularity of tattoos stimulated the American Academy of Dermatology to produce the 2004 brochure Tattoos, Body Piercing and Other Skin Adornments, which is reproduced here. When asked by patients about getting tattooed, it is wise to caution that even with the variety of modern techniques for removal available, some scarring may result.

  6. How the reference values for serum parathyroid hormone concentration are (or should be) established?

    Science.gov (United States)

    Souberbielle, J-C; Brazier, F; Piketty, M-L; Cormier, C; Minisola, S; Cavalier, E

    2017-03-01

    Well-validated reference values are necessary for a correct interpretation of a serum PTH concentration. Establishing PTH reference values requires recruiting a large reference population. Exclusion criteria for this population can be defined as any situation possibly inducing an increase or a decrease in PTH concentration. As recommended in the recent guidelines on the diagnosis and management of asymptomatic primary hyperparathyroidism, PTH reference values should be established in vitamin D-replete subjects with a normal renal function, with possible stratification according to various factors such as age, gender, menopausal status, body mass index, and race. A consensus about analytical/pre-analytical aspects of PTH measurement is also needed, with special emphasis on the nature of the sample (plasma or serum), the time and the fasting/non-fasting status of the blood sample. Our opinion is that the blood sample for PTH measurement should be obtained in the morning after an overnight fast. Furthermore, despite the longer stability of the PTH molecule in EDTA plasma, we prefer serum as it allows calcium, a prerequisite for a correct interpretation of a PTH concentration, to be measured on the same sample. Once a consensus is reached, we believe an important international multicentre effort should be undertaken to recruit a very extensive reference population of apparently healthy vitamin D-replete subjects with a normal renal function in order to establish the PTH normative data. Due to the huge inter-method variability in PTH measurement, a sufficient quantity of blood sample should be obtained to allow measurement with as many PTH kits as possible.

  7. [Errors in Peruvian medical journals references].

    Science.gov (United States)

    Huamaní, Charles; Pacheco-Romero, José

    2009-01-01

    References are fundamental in our studies; an adequate selection is as important as an adequate description. To determine the number of errors in a sample of references found in Peruvian medical journals. We reviewed 515 references from scientific papers, selected by systematic randomized sampling, and corroborated the reference information against the original document or its citation in PubMed, LILACS or SciELO-Peru. We found errors in 47.6% (245) of the references, identifying 372 types of errors; the most frequent were errors in presentation style (120), authorship (100) and title (100), mainly due to spelling mistakes (91). The percentage of errors in references was high, and the errors were varied and multiple. We suggest systematic revision of references in the editorial process, as well as extending the discussion on this theme. Keywords: references, periodicals, research, bibliometrics.

  8. Indoor air: Reference bibliography

    International Nuclear Information System (INIS)

    Campbell, D.; Staves, D.; McDonald, S.

    1989-07-01

    The U. S. Environmental Protection Agency initially established the indoor air Reference Bibliography in 1987 as an appendix to the Indoor Air Quality Implementation Plan. The document was submitted to Congress as required under Title IV--Radon Gas and Indoor Air Quality Research of the Superfund Amendments and Reauthorization Act of 1986. The Reference Bibliography is an extensive bibliography of reference materials on indoor air pollution. The Bibliography contains over 4500 citations and continues to increase as new articles appear

  9. Discussion on Implementation of ICRP Recommendations Concerning Reference Levels and Optimisation

    International Nuclear Information System (INIS)

    2013-02-01

    International Commission on Radiological Protection (ICRP) Publication 103, 'The 2007 Recommendations of the International Commission on Radiological Protection', issued in 2007, defines emergency exposure situations as unexpected situations that may require the implementation of urgent protective actions and perhaps longer term protective actions. The ICRP continues to recommend optimisation and the use of reference levels to ensure an adequate degree of protection in regard to exposure to ionising radiation in emergency exposure situations. Reference levels represent the level of dose or risk above which it is judged to be inappropriate to plan to allow exposures to occur and for which protective actions should therefore be planned and optimised. National authorities are responsible for establishing reference levels. The Expert Group on the Implementation of New International Recommendations for Emergency Exposure Situations (EGIRES) performed a survey to analyse the established processes for optimisation of the protection strategy for emergency exposure situations and for practical implementation of the reference level concept in several member states of the Nuclear Energy Agency (NEA). The EGIRES collected information on several national optimisation strategy definitions, on optimisation of protection for different protective actions, and also on optimisation of urgent protective actions. In addition, national criteria for setting reference levels, their use, and relevant processes, including specific triggers and dosimetric quantities in setting reference levels, are focus points that the EGIRES also evaluated. The analysis of national responses to this 2011 survey shows many differences in the interpretation and application of the established processes and suggests that most countries are still in the early stages of implementing these processes. Since 2011, national authorities have continued their study of the ICRP recommendations to incorporate them into

  10. Application of air traffic control competence reference models as a means of air navigation services provider’s charge optimization

    Directory of Open Access Journals (Sweden)

    В.П. Харченко

    2010-01-01

    Full Text Available The article describes the application of Air Traffic Control (ATC) competence reference models as a means of optimising an air navigation service provider's (ANSP) charges, and treats this issue as an optimisation task. The data presented by the authors on the significant growth of air traffic, especially in the airspace of Ukraine, together with the fact that the reliability of the technical component of the Air Traffic Management (ATM) system increases while the psychophysiological abilities of controllers remain practically unchanged, make it important to ensure that the ANSP is staffed with the best-trained ATC controllers for each working position of an ATC unit. The 'mechanism' for creating ATC controller competence reference models is defined step by step. An example is given of forming a quantitative individual competence model for a candidate for a working position, used as an overall competence criterion that compresses all parameters of the candidate's working activity recorded at the assessment stage. The proposed approach compares the individual parameters of a graduating student's output model of professional characteristics, obtained after the examination of his work as a tower controller (aerodrome control service of an ATM unit), approach controller or area control controller, with the predetermined specialist competence reference model for the specific working position in the ATM system. In this way, the correspondence of the graduating student's output competence level to the defined ATC controller reference model for the specific working position of the ATM unit is established.
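
    The comparison step can be sketched as follows (Python; the parameter names, weights and thresholds are invented for illustration and are not taken from the article):

```python
def meets_reference_model(individual, reference_model):
    """Compare an individual's measured competence parameters with the reference
    model for a given ATC working position: every parameter must reach its
    threshold, and the weighted aggregate must reach the overall criterion."""
    for name, spec in reference_model["parameters"].items():
        if individual.get(name, 0.0) < spec["threshold"]:
            return False
    aggregate = sum(spec["weight"] * individual[name]
                    for name, spec in reference_model["parameters"].items())
    return aggregate >= reference_model["overall_criterion"]

# Hypothetical reference model for a tower-controller working position
tower_model = {
    "overall_criterion": 0.80,
    "parameters": {
        "phraseology":        {"weight": 0.4, "threshold": 0.75},
        "conflict_detection": {"weight": 0.6, "threshold": 0.70},
    },
}
candidate = {"phraseology": 0.85, "conflict_detection": 0.80}
print(meets_reference_model(candidate, tower_model))   # True
```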

  11. Characteristics of 64 sarcoma patients referred to a sarcoma center after unplanned excision.

    Science.gov (United States)

    Dyrop, Heidi Buvarp; Safwat, Akmal; Vedsted, Peter; Maretty-Kongstad, Katja; Hansen, Bjarne Hauge; Jørgensen, Peter Holmberg; Baad-Hansen, Thomas; Keller, Johnny

    2016-02-01

    Unplanned excision of sarcoma before referral to specialist centers can affect prognosis and surgical outcome. The diagnostic pathway of these patients is uncertain and needs to be reviewed. We aimed to describe patient and tumor characteristics, initial symptoms, initial and final diagnosis, and explore reasons for unplanned excision in this patient group. From a previous study on 258 sarcoma patients, we identified 64 patients referred after surgery. Medical records were reviewed. The majority were soft tissue sarcomas, most often with thoracic location. Leiomyosarcoma was the most frequent final diagnosis, lipoma, and fibroma/dermatofibroma the most frequent initial diagnoses. Fifty percent were superficial small tumors, and 60.9% had not received diagnostic imaging before surgery. Fifty percent were referred from public surgical departments, and 1/3 from private specialists. Twenty-three patients had initial presence of alarm symptoms registered before surgery, the remaining 2/3 fell outside referral criteria or alarm symptoms were not discovered. Patients referred after unplanned excision often have small superficial tumors and the majority fall outside of defined referral criteria. Referral criteria are not a guarantee for detection of all sarcomas and surgeons should always be aware of the possibility of malignancy when removing a tumor. © 2016 Wiley Periodicals, Inc.

  12. How do people define moderation?

    Science.gov (United States)

    vanDellen, Michelle R; Isherwood, Jennifer C; Delose, Julie E

    2016-06-01

    Eating in moderation is considered to be sound and practical advice for weight maintenance or prevention of weight gain. However, the concept of moderation is ambiguous, and the effect of moderation messages on consumption has yet to be empirically examined. The present manuscript examines how people define moderate consumption. We expected that people would define moderate consumption in ways that justified their current or desired consumption rather than view moderation as an objective standard. In Studies 1 and 2, moderate consumption was perceived to involve greater quantities of an unhealthy food (chocolate chip cookies, gummy candies) than perceptions of how much one should consume. In Study 3, participants generally perceived themselves to eat in moderation and defined moderate consumption as greater than their personal consumption. Furthermore, definitions of moderate consumption were related to personal consumption behaviors. Results suggest that the endorsement of moderation messages allows for a wide range of interpretations of moderate consumption. Thus, we conclude that moderation messages are unlikely to be effective messages for helping people maintain or lose weight. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Kerlinger's Criterial Referents Theory Revisited.

    Science.gov (United States)

    Zak, Itai; Birenbaum, Menucha

    1980-01-01

    Kerlinger's criterial referents theory of attitudes was tested cross-culturally by administering an education attitude referents summated-rating scale to 713 individuals in Israel. The response pattern to criterial and noncriterial referents was examined. Results indicated empirical cross-cultural validity of theory, but questioned measuring…

  14. Handbook of reference electrodes

    CERN Document Server

    Inzelt, György; Scholz, Fritz

    2013-01-01

    Reference Electrodes are a crucial part of any electrochemical system, yet an up-to-date and comprehensive handbook is long overdue. Here, an experienced team of electrochemists provides an in-depth source of information and data for the proper choice and construction of reference electrodes. This includes all kinds of applications such as aqueous and non-aqueous solutions, ionic liquids, glass melts, solid electrolyte systems, and membrane electrodes. Advanced technologies such as miniaturized, conducting-polymer-based, screen-printed or disposable reference electrodes are also covered. Essen

  15. Poster: A Software-Defined Multi-Camera Network

    OpenAIRE

    Chen, Po-Yen; Chen, Chien; Selvaraj, Parthiban; Claesen, Luc

    2016-01-01

    The widespread popularity of OpenFlow has led to a significant increase in the number of applications developed in Software-Defined Networking (SDN). In this work, we propose the architecture of a Software-Defined Multi-Camera Network consisting of small, flexible, economic, and programmable cameras which combine the functions of the processor, switch, and camera. A Software-Defined Multi-Camera Network can effectively reduce the overall network bandwidth and reduce a large amount of the Capex a...

  16. Children and adults exposed to electromagnetic fields at the ICNIRP reference levels: theoretical assessment of the induced peak temperature increase.

    Science.gov (United States)

    Bakker, J F; Paulides, M M; Neufeld, E; Christ, A; Kuster, N; van Rhoon, G C

    2011-08-07

    To avoid potentially adverse health effects of electromagnetic fields (EMF), the International Commission on Non-Ionizing Radiation Protection (ICNIRP) has defined EMF reference levels. Restrictions on induced whole-body-averaged specific absorption rate (SAR(wb)) are provided to keep the whole-body temperature increase (T(body, incr)) under 1 °C during 30 min. Additional restrictions on the peak 10 g spatial-averaged SAR (SAR(10g)) are provided to prevent excessive localized tissue heating. The objective of this study is to assess the localized peak temperature increase (T(incr, max)) in children upon exposure at the reference levels. Finite-difference time-domain modeling was used to calculate T(incr, max) in six children and two adults exposed to orthogonal plane-wave configurations. We performed a sensitivity study and Monte Carlo analysis to assess the uncertainty of the results. Considering the uncertainties in the model parameters, we found that a peak temperature increase as high as 1 °C can occur for worst-case scenarios at the ICNIRP reference levels. Since the guidelines are deduced from temperature increase, we used T(incr, max), rather than localized peak SAR, as the metric for preventing excessive localized tissue heating. However, we note that the exposure time should also be considered in future guidelines. Hence, we advise defining limits on T(incr, max) for specified durations of exposure.
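
    As a rough illustration of the kind of Monte Carlo uncertainty analysis mentioned above, the sketch below propagates assumed parameter uncertainties through a deliberately simplified linear SAR-to-temperature model; the distributions, coefficient values and the 1 °C criterion used here are illustrative assumptions and do not reproduce the study's FDTD-based dosimetry:

```python
# Toy Monte Carlo estimate of the spread in peak temperature increase.
# The linear model T_incr = k * SAR_10g and all parameter ranges are
# hypothetical; the study itself used FDTD simulations of child and adult models.
import random

N = 10_000
exceed = 0
samples = []
for _ in range(N):
    sar_10g = random.uniform(8.0, 12.0)   # W/kg, assumed spread around a nominal value
    k = random.gauss(0.08, 0.015)         # °C per (W/kg), assumed thermal coefficient
    t_incr = max(k, 0.0) * sar_10g        # °C, simple linear response
    samples.append(t_incr)
    if t_incr > 1.0:                      # compare against a 1 °C criterion
        exceed += 1

samples.sort()
print(f"median ΔT ≈ {samples[N // 2]:.2f} °C, "
      f"95th percentile ≈ {samples[int(0.95 * N)]:.2f} °C, "
      f"P(ΔT > 1 °C) ≈ {exceed / N:.2%}")
```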

  17. Nanoscale reference materials for environmental, health and safety measurements: needs, gaps and opportunities.

    Science.gov (United States)

    Stefaniak, Aleksandr B; Hackley, Vincent A; Roebben, Gert; Ehara, Kensei; Hankin, Steve; Postek, Michael T; Lynch, Iseult; Fu, Wei-En; Linsinger, Thomas P J; Thünemann, Andreas F

    2013-12-01

    The authors critically reviewed published lists of nano-objects and their physico-chemical properties deemed important for risk assessment and discussed metrological challenges associated with the development of nanoscale reference materials (RMs). Five lists were identified that contained 25 (classes of) nano-objects; only four (gold, silicon dioxide, silver, titanium dioxide) appeared on all lists. Twenty-three properties were identified for characterisation; only (specific) surface area appeared on all lists. The key themes that emerged from this review were: 1) various groups have prioritised nano-objects for development as "candidate RMs" with limited consensus; 2) a lack of harmonised terminology hinders accurate description of many nano-object properties; 3) many properties identified for characterisation are ill-defined or qualitative and hence are not metrologically traceable; 4) standardised protocols are critically needed for characterisation of nano-objects as delivered in relevant media and as administered to toxicological models; 5) the measurement processes being used to characterise a nano-object must be understood because instruments may measure a given sample in a different way; 6) appropriate RMs should be used both for accurate instrument calibration and for more general testing purposes (e.g., protocol validation); 7) there is a need to clarify whether, where RMs are not available, "(representative) test materials" that lack reference or certified values may be useful for toxicology testing; and 8) there is a need for consensus building within the nanotechnology and environmental, health and safety communities to prioritise RM needs and better define the required properties and (physical or chemical) forms of the candidate materials.

  18. Defining chaos.

    Science.gov (United States)

    Hunt, Brian R; Ott, Edward

    2015-09-01

    In this paper, we propose, discuss, and illustrate a computationally feasible definition of chaos which can be applied very generally to situations that are commonly encountered, including attractors, repellers, and non-periodically forced systems. This definition is based on an entropy-like quantity, which we call "expansion entropy," and we define chaos as occurring when this quantity is positive. We relate and compare expansion entropy to the well-known concept of topological entropy to which it is equivalent under appropriate conditions. We also present example illustrations, discuss computational implementations, and point out issues arising from attempts at giving definitions of chaos that are not entropy-based.
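
    For orientation, a sketch of the construction behind the definition; the exact conditions on the restraining region S, the reference measure μ and the derivative notation should be checked against the original paper:

    $$E_{t',t}(f,S) \;=\; \frac{1}{\mu(S)} \int_{S_{t',t}} G\!\big(Df_{t',t}(x)\big)\, d\mu(x), \qquad H_0(f,S) \;=\; \limsup_{t' \to \infty} \frac{\ln E_{t',t}(f,S)}{t' - t},$$

    where G(A) is the product of the singular values of A that are larger than one (taken as 1 if there are none) and S_{t',t} is the set of initial conditions whose trajectories remain in S over the whole time interval; chaos is then declared when H_0 > 0.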

  19. Towards a classification of Tanzanian rivers: a bioassessment and ...

    African Journals Online (AJOL)

    River classification is important for reporting ecological status and for the general ecological management of river systems by partitioning natural variability. A priori river classification by abiotic variables and validation of classifications obtained using aquatic macroinvertebrates from reference sites for selected Tanzanian ...

  20. Biomedical Engineering Desk Reference

    CERN Document Server

    Ratner, Buddy D; Schoen, Frederick J; Lemons, Jack E; Dyro, Joseph; Martinsen, Orjan G; Kyle, Richard; Preim, Bernhard; Bartz, Dirk; Grimnes, Sverre; Vallero, Daniel; Semmlow, John; Murray, W Bosseau; Perez, Reinaldo; Bankman, Isaac; Dunn, Stanley; Ikada, Yoshito; Moghe, Prabhas V; Constantinides, Alkis

    2009-01-01

    A one-stop Desk Reference, for Biomedical Engineers involved in the ever expanding and very fast moving area; this is a book that will not gather dust on the shelf. It brings together the essential professional reference content from leading international contributors in the biomedical engineering field. Material covers a broad range of topics including: Biomechanics and Biomaterials; Tissue Engineering; and Biosignal Processing* A hard-working desk reference providing all the essential material needed by biomedical and clinical engineers on a day-to-day basis * Fundamentals, key techniques,

  1. Bilayer graphene quantum dot defined by topgates

    Energy Technology Data Exchange (ETDEWEB)

    Müller, André; Kaestner, Bernd; Hohls, Frank; Weimann, Thomas; Pierz, Klaus; Schumacher, Hans W., E-mail: hans.w.schumacher@ptb.de [Physikalisch-Technische Bundesanstalt, Bundesallee 100, 38116 Braunschweig (Germany)

    2014-06-21

    We investigate the application of nanoscale topgates on exfoliated bilayer graphene to define quantum dot devices. At temperatures below 500 mK, the conductance underneath the grounded gates is suppressed, which we attribute to nearest neighbour hopping and strain-induced piezoelectric fields. The gate-layout can thus be used to define resistive regions by tuning into the corresponding temperature range. We use this method to define a quantum dot structure in bilayer graphene showing Coulomb blockade oscillations consistent with the gate layout.

  2. Universal Reference RNA as a standard for microarray experiments

    Directory of Open Access Journals (Sweden)

    Fero Michael

    2004-03-01

    Full Text Available Abstract Background Obtaining reliable and reproducible two-color microarray gene expression data is critically important for understanding the biological significance of perturbations made on a cellular system. Microarray design, RNA preparation and labeling, hybridization conditions and data acquisition and analysis are variables difficult to simultaneously control. A useful tool for monitoring and controlling intra- and inter-experimental variation is Universal Reference RNA (URR), developed with the goal of providing hybridization signal at each microarray probe location (spot). Measuring signal at each spot as the ratio of experimental RNA to reference RNA targets, rather than relying on absolute signal intensity, decreases variability by normalizing signal output in any two-color hybridization experiment. Results Human, mouse and rat URR (UHRR, UMRR and URRR, respectively) were prepared from pools of RNA derived from individual cell lines representing different tissues. A variety of microarrays were used to determine the percentage of spots hybridizing with URR and producing signal above a user-defined threshold (microarray coverage). Microarray coverage was consistently greater than 80% for all arrays tested. We confirmed that individual cell lines contribute their own unique set of genes to URR, arguing for a pool of RNA from several cell lines as a better configuration for URR as opposed to a single cell line source for URR. Microarray coverage comparing two separately prepared batches each of UHRR, UMRR and URRR was highly correlated (Pearson's correlation coefficients of 0.97). Conclusion Results of this study demonstrate that large quantities of pooled RNA from individual cell lines are reproducibly prepared and possess diverse gene representation. This type of reference provides a standard for reducing variation in microarray experiments and allows more reliable comparison of gene expression data within and between experiments and
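
    The two quantities at the core of this record, the per-spot ratio of experimental to reference signal and the "microarray coverage" above a user-defined threshold, can be sketched as follows; the intensities, array size and threshold are made-up illustrative values:

```python
# Illustrative two-color microarray calculations: per-spot log2 ratios against
# a Universal Reference RNA channel and the fraction of spots whose reference
# signal clears a user-defined threshold ("coverage"). Data are made up.
import math

experimental = [1500.0, 320.0, 45.0, 980.0, 12000.0]   # Cy5 intensities (sample)
reference    = [1100.0, 400.0, 60.0, 150.0,  9000.0]   # Cy3 intensities (URR)
THRESHOLD = 100.0                                       # hypothetical signal cut-off

# Ratio-based measurement: log2(sample / reference) per spot.
log_ratios = [math.log2(e / r) for e, r in zip(experimental, reference)]

# Coverage: percentage of spots where the reference channel gives usable signal.
coverage = sum(r > THRESHOLD for r in reference) / len(reference) * 100

print([round(lr, 2) for lr in log_ratios])     # [0.45, -0.32, -0.42, 2.71, 0.42]
print(f"reference coverage: {coverage:.0f}%")  # 80%
```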

  3. Survey of reference materials. V. 2: Environmentally related reference materials for trace elements, nuclides and microcontaminants

    International Nuclear Information System (INIS)

    1996-05-01

    The present report contains over 250 reference materials with trace element and organic contaminant information on fuel, geological and mineral, anthropogenic disposal, soil and miscellaneous reference materials. Not included in the current report is information on most biological and environmental reference materials with trace element, stable isotope, radioisotope and organic contaminant information. 8 refs, tabs

  4. SAPHIRE 8 Volume 2 - Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood; W. J. Galyean; J. A. Schroeder; M. B. Sattison

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). Herein, information is provided on the principles used in the construction and operation of Version 8.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply for various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, Workspace algorithms, cut set "recovery," end state manipulation, and use of "compound events."
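
    SAPHIRE's own formulas are documented in the report itself; as a generic illustration of one standard way to combine minimal cut sets into a top-event probability (the min-cut upper bound), the sketch below uses made-up basic events and probabilities and is not SAPHIRE code:

```python
# Generic illustration of quantifying a fault-tree top event from minimal cut
# sets. Each cut set is a list of independent basic events; the min-cut upper
# bound 1 - prod(1 - P(cut set)) approximates the top-event probability.
# Basic-event names and probabilities are hypothetical.

basic_event_prob = {"pump_fails": 1e-3, "valve_stuck": 5e-4, "power_loss": 2e-4}

minimal_cut_sets = [
    ["pump_fails", "valve_stuck"],   # both must fail
    ["power_loss"],                  # single-event cut set
]

def cut_set_prob(cut_set):
    """Probability of a cut set assuming independent basic events."""
    p = 1.0
    for event in cut_set:
        p *= basic_event_prob[event]
    return p

def top_event_prob(cut_sets):
    """Min-cut upper bound: 1 - product over cut sets of (1 - P_i)."""
    q = 1.0
    for cs in cut_sets:
        q *= 1.0 - cut_set_prob(cs)
    return 1.0 - q

print(f"{top_event_prob(minimal_cut_sets):.3e}")  # 2.005e-04
```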

  5. Quantum cosmology with effects of a preferred reference frame

    International Nuclear Information System (INIS)

    Ghaffarnejad, Hossein

    2010-01-01

    Recently, we presented a gravity model generalizing the Brans-Dicke theory which is suitable for studying the dynamics of metric signature transition without using an imaginary time parameter. Adding a suitable scalar potential described in terms of the Brans-Dicke scalar field Φ̃, this alternative theory is used to study the Wheeler-DeWitt approach of quantum cosmology. We assumed that the universe is described by a flat Robertson-Walker metric with Lorentzian signature. In that case, the Wheeler-DeWitt wavefunctional is obtained as two-dimensional quantum harmonic oscillator convergent polynomials for both positive and negative values of the Brans-Dicke parameter. Here we choose a preferred reference frame with a time coordinate γ, which is related to the time t of a cosmological free-falling observer by dt = Φ̃(γ)dγ.

  6. Study of the structure and development of the set of reference materials of composition and structure of heat resisting nickel and intermetallic alloys

    Directory of Open Access Journals (Sweden)

    E. B. Chabina

    2016-01-01

    Full Text Available Relevance of the research: The strengthening γ′-phase in single-crystal heat-resisting nickel and intermetallic alloys used for making the blades of modern gas turbine engines (GTD) occurs in two size ranges (several microns and nanodimensional). For an in-depth study of the structural and phase condition of such alloys, not only a qualitative description of the created structure is necessary, but also a quantitative analysis of the geometrical characteristics of the alloy components. Purpose of the work: Development of reference material sets of composition and structure of heat-resisting nickel and intermetallic alloys. Research methods: To address the measurement problem of controlling the structural and geometrical characteristics of single-crystal heat-resisting and intermetallic alloys by analytical microscopy and X-ray diffraction analysis, the research was carried out using certified measurement techniques on facilities entered in the Register of Measurement Means of the Russian Federation. The research was carried out on microsections, foils and plates cut in the plane {100}. Results: It is established that the key parameters defining the properties of these alloys are the particle size of the strengthening γ′-phase, the thickness of the γ-phase layers between the particles, and the lattice parameters of the phases. Metrological requirements for reference materials of composition and structure of heat-resisting nickel and intermetallic alloys are formulated. The necessary and sufficient reference material set providing the possibility to determine the composition and structure parameters of single-crystal heat-resisting nickel and intermetallic alloys is defined. The developed RM sets are certified as in-plant reference materials. Conclusion: The reference materials can be used for calibration of spectral equipment when conducting element analysis of the specified class of alloys; for calibration of means of measuring alloy structure parameters; for measurement of the lattice parameters of alloy phases; for structure reference pictures

  7. 32 CFR 634.2 - References.

    Science.gov (United States)

    2010-07-01

    32 National Defense 4 (2010-07-01). Section 634.2: References. Department of Defense (Continued); Department of the Army (Continued); Law Enforcement and Criminal Investigations; Motor Vehicle Traffic Supervision; Introduction; § 634.2 References. Required and related publications along with prescribed and reference...

  8. Intelligence Defined and Undefined: A Relativistic Appraisal

    Science.gov (United States)

    Wechsler, David

    1975-01-01

    Major reasons for the continuing divergency of opinion as regards the nature and meaning of intelligence are examined. An appraisal of intelligence as a relative concept is proposed which advocates the necessity of specifying the reference systems to which a statement about intelligence refers. (EH)

  9. An overview of digital reference services

    OpenAIRE

    Hemnani, Anita

    2009-01-01

    Digital reference service is an emerging trend within traditional reference service. Easily accessible digital reference service has become one of the hallmarks of library and information services. The paper highlights how this new visage of traditional reference service is developing as a natural solution to keep pace with a comprehensive technological environment. It discusses the basic concepts and elements of digital reference service and gives in detail the modes, advantages, limitations, and...

  10. Interchangeability of biosimilar and biological reference product: updated regulatory positions and pre- and post-marketing evidence.

    Science.gov (United States)

    Trifirò, Gianluca; Marcianò, Ilaria; Ingrasciotta, Ylenia

    2018-03-01

    Since 2006, biosimilars have been available in several countries worldwide, thus allowing for potential savings in pharmaceutical expenditure. However, there have been numerous debates about the interchangeability of biosimilars and reference products, based on concerns that switching between biological products may cause immunogenicity, leading to lack of effect and toxicity. Areas covered: The authors provide the reader with an overview of the different positions of regulatory authorities on the interchangeability and automatic substitution of biosimilars and reference products. Presently, the FDA allows automatic substitution without prescriber intervention if the biosimilar is interchangeable with the reference product, while the European Medicines Agency delegates the decision to each individual EU member state. Expert opinion: Different approaches to defining interchangeability and automatic substitution call for harmonization to increase the confidence of healthcare professionals and patients about the clinical impact of switching. Networks of electronic healthcare records and administrative databases, potentially linkable to clinical charts and registries, may rapidly assess the frequency and benefit-risk profile of different switching patterns in routine care at different levels, thus integrating and strengthening pre-marketing evidence.

  11. A Data-Driven Analysis of the Rules Defining Bilateral Leg Movements during Sleep.

    Science.gov (United States)

    Ferri, Raffaele; Manconi, Mauro; Rundo, Francesco; Zucconi, Marco; Aricò, Debora; Bruni, Oliviero; Ferini-Strambi, Luigi; Fulda, Stephany

    2016-02-01

    The aim of this study was to describe and analyze the association between bilateral leg movements (LMs) during sleep in subjects with restless legs syndrome (RLS), in order to eventually support or challenge the current scoring rules defining bilateral LMs. Polysomnographic recordings of 100 untreated patients with RLS (57 women and 43 males, mean age 57 y) were included. In each recording, we selected as reference all LMs that occurred during sleep and that were separated from another ipsilateral LM by at least 10 sec of EMG inactivity. For each reference LM and an evaluation interval from 5 sec before the onset to 5 sec after the offset of the reference LM, we evaluated (1) the presence or absence of contralateral leg movement activity and (2) the distribution of the onset-to-onset and (3) the offset-to-onset differences between bilateral LMs. We selected a mean of 368 (± 222 standard deviation [SD]) reference LMs per subject. For 42% (± 22%) of the reference LMs no contralateral leg movement activity was observed within the evaluation interval. In 55% (± 22%) exactly one and in 3% (± 2%) more than one contralateral LM was observed. A further evaluation of events where exactly one contralateral LM was observed showed that in most (1) the two LMs were overlapping (93% ± 9% SD) and (2) were classified as bilateral according to the World Association of Sleep Medicine and the International Restless Legs Syndrome Study Group (WASM/ IRLSSG) (96% ± 6% SD) and (3) the American Academy of Sleep Medicine scoring rules (99% ± 2% SD). Although there was a systematic and statistically significant difference in standard LM indices during sleep based on the two different definitions of bilateral LMs, the size of the difference was not clinically meaningful (maximum individual, absolute difference in LM indices ± 2.5). In addition, we found that the duration of LMs within bilateral LM pairs was longer compared to monolateral LMs and that the duration of the single LMs in
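
    To make the evaluation interval concrete, here is a small sketch that flags contralateral leg-movement activity within the window running from 5 s before the onset to 5 s after the offset of a reference LM; the data structures and timing values are illustrative, and this is not an implementation of the WASM/IRLSSG or AASM scoring rules:

```python
# Illustrative check for contralateral leg-movement activity around a reference
# LM, using the paper's evaluation window: 5 s before onset to 5 s after offset.
# Times are in seconds; all example values are made up.

def contralateral_lms_in_window(ref_lm, contralateral_lms, pre=5.0, post=5.0):
    """Return contralateral LMs whose activity overlaps the evaluation window."""
    win_start = ref_lm["onset"] - pre
    win_end = ref_lm["offset"] + post
    return [lm for lm in contralateral_lms
            if lm["offset"] >= win_start and lm["onset"] <= win_end]

ref = {"onset": 120.0, "offset": 123.5}          # reference LM (right leg)
left_leg = [{"onset": 118.0, "offset": 121.0},   # overlaps the reference LM
            {"onset": 140.0, "offset": 142.0}]   # outside the window

hits = contralateral_lms_in_window(ref, left_leg)
print(len(hits))   # 1 -> exactly one contralateral LM in the evaluation interval
```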

  12. Reference life cycle assessment scenarios for manure management in the Baltic Sea Regions - An assessment covering six animal production, five BSR countries, and four manure types

    DEFF Research Database (Denmark)

    Hamelin, Lorie; Baky, A; Cano-Bernal, J

    One major pre-condition for assessing a manure management technique in a whole-system or LCA approach is to define a reference system against which this technique can be assessed. This report thus presents and details the establishment of such reference systems, comprising eight different manure types (fattening pig slurry, dairy cow slurry, hens manure, bulls deep litter, fattening pig solid manure, dairy cow solid manure, horse manure & broilers manure) and five Baltic Sea Regions (Denmark, Sweden, Finland, Estonia, Poland), for a total of 15 reference systems. It presents, for each of these reference systems, details such as where the manure is applied, the specific legislation governing manure management practices, etc. Further, it presents a reference manure composition for each of these reference systems, including key parameters such as dry matter, nitrogen (inorganic and total), phosphorus, carbon and volatile solids content.

  13. The NIST natural-matrix radionuclide standard reference material program for ocean studies

    International Nuclear Information System (INIS)

    Inn, K.G.W.; Zhichao Lin; Zhongyu Wu; MacMahon, C.; Filliben, J.J.; Krey, P.; Feiner, M.; Harvey, J.

    2001-01-01

    In 1997, the Low-level Working Group of the International Committee on Radionuclide Metrology met in Boston, MA (USA) to define the characteristics of a new set of environmental radioactivity reference materials. These reference materials were to provide the radiochemist with the same analytical challenges faced when assaying environmental samples. It was decided that radionuclide-bearing natural materials should be collected from sites where there had been sufficient time for natural processes to redistribute the various chemically different species of the radionuclides. Over the succeeding years, the National Institute of Standards and Technology (NIST), in cooperation with other highly experienced laboratories, certified and issued a number of these as low-level radioactivity Standard Reference Materials (SRMs) for fission and activation product and actinide concentrations. The experience of certifying these SRMs has given NIST the opportunity to compare radioanalytical methods and learn of their limitations. NIST convened an international workshop in 1994 to define the natural-matrix radionuclide SRM needs for ocean studies. The highest priorities proposed at the workshop were for sediment, shellfish, seaweed, fish flesh and water matrix SRMs certified for mBq per sample concentrations of 90Sr, 137Cs and 239Pu + 240Pu. The most recent low-level environmental radionuclide SRM issued by NIST, Ocean Sediment (SRM 4357), has certified and uncertified values for the following 22 radionuclides: 40K, 90Sr, 129I, 137Cs, 155Eu, 210Pb, 210Po, 212Pb, 214Bi, 226Ra, 228Ra, 228Th, 230Th, 232Th, 234U, 235U, 237Np, 238U, 238Pu, 239Pu + 240Pu, and 241Am. The uncertainties for a number of the certified radionuclides are non-symmetrical and relatively large because of the non-normal distribution of reported values. NIST is continuing its efforts to provide the ocean studies community with additional natural matrix radionuclide SRMs. The freeze

  14. Doing the work of reference practical tips for excelling as a reference librarian

    CERN Document Server

    Katz, Linda S

    2013-01-01

    Become more versatile, competent, and resourceful with these practical suggestions!Becoming a first-class reference librarian demands proficiency in a wide range of skills. Doing the Work of Reference offers sound advice for the full spectrum of your responsibilities. Though many aspects of a reference librarian's work are changing with astonishing speed, the classic principles in this volume will never go out of date. This comprehensive volume begins with hints for orienting yourself to a new job and concludes with ideas for serving the profession. On the way, Doing the Wo

  15. 2002 reference document

    International Nuclear Information System (INIS)

    2002-01-01

    This 2002 reference document of the Areva group provides information on the company. Organized in seven chapters, it presents the persons responsible for the reference document and for auditing the financial statements, information pertaining to the transaction, general information on the company and its share capital, information on company operation, changes and future prospects, assets, financial position, financial performance, information on company management and the executive and supervisory boards, and recent developments and future prospects. (A.L.B.)

  16. Fundamentals of Managing Reference Collections

    Science.gov (United States)

    Singer, Carol A.

    2012-01-01

    Whether a library's reference collection is large or small, it needs constant attention. Singer's book offers information and insight on best practices for reference collection management, no matter the size, and shows why managing without a plan is a recipe for clutter and confusion. In this very practical guide, reference librarians will learn:…

  17. Validation of a case definition to define chronic dialysis using outpatient administrative data.

    Science.gov (United States)

    Clement, Fiona M; James, Matthew T; Chin, Rick; Klarenbach, Scott W; Manns, Braden J; Quinn, Robert R; Ravani, Pietro; Tonelli, Marcello; Hemmelgarn, Brenda R

    2011-03-01

    Administrative health care databases offer an efficient and accessible, though as-yet unvalidated, approach to studying outcomes of patients with chronic kidney disease and end-stage renal disease (ESRD). The objective of this study is to determine the validity of outpatient physician billing derived algorithms for defining chronic dialysis compared to a reference standard ESRD registry. A cohort of incident dialysis patients (Jan. 1-Dec. 31, 2008) and prevalent chronic dialysis patients (Jan 1, 2008) was selected from a geographically inclusive ESRD registry and administrative database. Four administrative data definitions were considered: at least 1 outpatient claim, at least 2 outpatient claims, at least 2 outpatient claims at least 90 days apart, and continuous outpatient claims at least 90 days apart with no gap in claims greater than 21 days. Measures of agreement of the four administrative data definitions were compared to a reference standard (ESRD registry). Basic patient characteristics are compared between all 5 patient groups. 1,118,097 individuals formed the overall population and 2,227 chronic dialysis patients were included in the ESRD registry. The three definitions requiring at least 2 outpatient claims resulted in kappa statistics between 0.60-0.80, indicating "substantial" agreement. "At least 1 outpatient claim" resulted in "excellent" agreement with a kappa statistic of 0.81. Of the four definitions, the simplest (at least 1 outpatient claim) performed comparably to the other definitions. A limitation of this work is that the billing codes used were developed in Canada; however, other countries use similar billing practices and thus the codes could easily be mapped to other systems. Our reference standard ESRD registry may not capture all dialysis patients, resulting in some misclassification; the registry is linked to ongoing care, so this is likely to be minimal. The definition utilized will vary with the research objective.
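
    The kappa statistics reported above summarise agreement between each administrative definition and the registry reference standard. As a generic illustration, the sketch below computes Cohen's kappa from a 2 × 2 classification with made-up counts; only the overall population and registry sizes echo the study, while the true/false positive splits are hypothetical:

```python
# Cohen's kappa for agreement between an administrative-data case definition
# and an ESRD registry reference standard. Counts are hypothetical.

def cohens_kappa(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n
    # Expected agreement by chance, from the marginal totals.
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)
    p_no = ((fn + tn) / n) * ((fp + tn) / n)
    p_expected = p_yes + p_no
    return (p_observed - p_expected) / (1 - p_expected)

# Example: the definition flags 2,300 people; the registry holds 2,227.
kappa = cohens_kappa(tp=2000, fp=300, fn=227, tn=1_115_570)
print(f"kappa = {kappa:.2f}")   # kappa = 0.88 with these made-up counts
```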

  18. Validation of a case definition to define chronic dialysis using outpatient administrative data

    Directory of Open Access Journals (Sweden)

    Klarenbach Scott W

    2011-03-01

    Full Text Available Abstract Background Administrative health care databases offer an efficient and accessible, though as-yet unvalidated, approach to studying outcomes of patients with chronic kidney disease and end-stage renal disease (ESRD). The objective of this study is to determine the validity of outpatient physician billing derived algorithms for defining chronic dialysis compared to a reference standard ESRD registry. Methods A cohort of incident dialysis patients (Jan. 1 - Dec. 31, 2008) and prevalent chronic dialysis patients (Jan 1, 2008) was selected from a geographically inclusive ESRD registry and administrative database. Four administrative data definitions were considered: at least 1 outpatient claim, at least 2 outpatient claims, at least 2 outpatient claims at least 90 days apart, and continuous outpatient claims at least 90 days apart with no gap in claims greater than 21 days. Measures of agreement of the four administrative data definitions were compared to a reference standard (ESRD registry). Basic patient characteristics are compared between all 5 patient groups. Results 1,118,097 individuals formed the overall population and 2,227 chronic dialysis patients were included in the ESRD registry. The three definitions requiring at least 2 outpatient claims resulted in kappa statistics between 0.60-0.80, indicating "substantial" agreement. "At least 1 outpatient claim" resulted in "excellent" agreement with a kappa statistic of 0.81. Conclusions Of the four definitions, the simplest (at least 1 outpatient claim) performed comparably to the other definitions. A limitation of this work is that the billing codes used were developed in Canada; however, other countries use similar billing practices and thus the codes could easily be mapped to other systems. Our reference standard ESRD registry may not capture all dialysis patients, resulting in some misclassification. The registry is linked to on-going care, so this is likely to be minimal. The definition

  19. Defining local food

    DEFF Research Database (Denmark)

    Eriksen, Safania Normann

    2013-01-01

    Despite evolving local food research, there is no consistent definition of "local food." Various understandings are utilized, which have resulted in a diverse landscape of meaning. The main purpose of this paper is to examine how researchers within the local food systems literature define local food, and how these definitions can be used as a starting point to identify a new taxonomy of local food based on three domains of proximity.

  20. Testing the causal theory of reference.

    Science.gov (United States)

    Domaneschi, Filippo; Vignolo, Massimiliano; Di Paola, Simona

    2017-04-01

    Theories of reference are a crucial research topic in analytic philosophy. Since the publication of Kripke's Naming and Necessity, most philosophers have endorsed the causal/historical theory of reference. The goal of this paper is twofold: (i) to discuss a method for testing experimentally the causal theory of reference for proper names by investigating linguistic usage and (ii) to present the results from two experiments conducted with that method. Data collected in our experiments confirm the causal theory of reference for people proper names and for geographical proper names. A secondary but interesting result is that the semantic domain affects reference assignment: while with people proper names speakers tend to assign the semantic reference, with geographical proper names they are prompted to assign the speaker's reference. Copyright © 2016 Elsevier B.V. All rights reserved.