WorldWideScience

Sample records for priori defined reference

  1. 'Aussie normals': an a priori study to develop clinical chemistry reference intervals in a healthy Australian population.

    Science.gov (United States)

    Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E

    2015-02-01

Development of reference intervals is difficult, time-consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire and exclusion was based on conditions such as pregnancy, diabetes, renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECT ci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices and those deemed unsuitable after clinical evaluation were removed from the database. Reference intervals were partitioned based on the method of Harris and Boyd into three scenarios: combined sexes, males and females separately, and groups by age and sex. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturers' analytical methods using method transference.
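The two statistical steps the abstract names, deriving a nonparametric 95% reference interval and partitioning subgroups by the method of Harris and Boyd, can be sketched roughly as follows. This is an illustrative simplification, not the study's actual code; the z* threshold shown is one common formulation of the Harris-Boyd rule and may differ from the authors' implementation.

```python
def percentile(values, p):
    """Linear-interpolation percentile, no external dependencies."""
    xs = sorted(values)
    k = (len(xs) - 1) * p
    f = int(k)
    c = min(f + 1, len(xs) - 1)
    return xs[f] + (xs[c] - xs[f]) * (k - f)

def reference_interval(values):
    """Nonparametric 95% reference interval: central 95% of the reference sample."""
    return percentile(values, 0.025), percentile(values, 0.975)

def needs_partition(mean1, sd1, n1, mean2, sd2, n2):
    """Harris-Boyd criterion (simplified): report separate reference intervals
    for two subgroups (e.g. males vs. females) when the z statistic between
    subgroup means exceeds z* = 3 * sqrt(n_avg / 120).
    The exact threshold formulation is an assumption for illustration."""
    z = abs(mean1 - mean2) / (sd1 ** 2 / n1 + sd2 ** 2 / n2) ** 0.5
    z_star = 3 * (((n1 + n2) / 2) / 120) ** 0.5
    return z > z_star
```

For example, two subgroups of 100 with means 10 and 11 (SD 2) would be partitioned, while means 10 and 10.1 would not.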

  2. Paleolimnological assessment of nutrient enrichment on diatom assemblages in a priori defined nitrogen- and phosphorus-limited lakes downwind of the Athabasca Oil Sands, Canada

    Directory of Open Access Journals (Sweden)

    Kathleen R. Laird

    2017-04-01

Full Text Available As the industrial footprint of the Athabasca Oil Sands Region (AOSR) continues to expand, concern about the potential impacts of pollutants on the surrounding terrestrial and aquatic ecosystems needs to be assessed. An emerging issue is whether recent increases in lake production downwind of the development can be linked to AOSR activities, and/or whether changing climatic conditions are influencing lake nutrient status. To decipher the importance of pollutants, particularly atmospheric deposition of reactive nitrogen (Nr), and the effects of climate change as potential sources of increasing lake production, lakes from both within and outside of the nitrogen deposition zone were analyzed for historical changes in diatom assemblages. Lake sediment cores were collected from a priori defined nitrogen (N)- and phosphorus (P)-limited lakes within and outside the N plume associated with the AOSR. Diatom assemblages were quantified at sub-decadal resolution since ca. 1890 to compare conditions prior to oil sands expansion and regional climate warming with the more recent conditions in each group of lakes (Reference and Impacted, N- and P-limited lakes). Analyses of changes in assemblage similarity and species turnover indicate that changes in diatom assemblages were minimal both within and across all lake groups. Small changes in percent composition of planktonic taxa, particularly small centric taxa (Discostella and Cyclotella species) and pennate taxa, such as Asterionella formosa and Fragilaria crotonensis, occurred in some of the lakes. While these changes were consistent with potential climate effects on algal growth, water column stability, and other factors, the timing and direction of biotic changes were variable among sites, suggesting that any apparent response to climate was lake dependent. The absence of a consistent pattern of diatom changes associated with receipt of reactive nitrogen or intrinsic nutrient-limitation status of the lake …

  3. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps.

    Science.gov (United States)

    Varikuti, Deepthi P; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T; Eickhoff, Simon B

    2017-04-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that gray matter masking improved the reliability of connectivity estimates, whereas denoising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources.

  4. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    Science.gov (United States)

    Varikuti, Deepthi P.; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T.; Eickhoff, Simon B.

    2016-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that grey matter masking improved the reliability of connectivity estimates, whereas de-noising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources. PMID:27550015
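Test-retest reliability of the kind discussed in the two records above is conventionally quantified with an intraclass correlation coefficient. The sketch below implements ICC(2,1) for an n-subjects-by-k-sessions matrix; the choice of ICC form is an assumption for illustration, not necessarily the metric the authors used.

```python
def icc_2_1(data):
    """ICC(2,1), two-way random effects, absolute agreement.
    data: list of n subjects, each a list of k session measurements."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # Decompose total sum of squares into subject, session, and error parts
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_tot = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_tot - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

Perfectly reproduced connectivity estimates across two sessions yield an ICC of 1; session-to-session noise pulls the value toward 0.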

  5. Art Priori = Art Priori / Kristel Jakobson

    Index Scriptorium Estoniae

    Jakobson, Kristel, 1983-

    2015-01-01

Restaurant Art Priori in Tallinn's Old Town at Olevimägi 7. Interior design by Kristel Jakobson (Haka Disain). Annual award of the Estonian Association of Interior Architects 2014/2015 for the best restaurant. A brief profile of Kristel Jakobson.

  6. Choice of Reference Serum Creatinine in Defining AKI

    Science.gov (United States)

    Siew, Edward D.; Matheny, Michael E.

    2015-01-01

Background/Aims The study of acute kidney injury (AKI) has expanded with the increasing availability of electronic health records and the use of standardized definitions. Understanding the impact of AKI between settings is limited by heterogeneity in the selection of reference creatinine to anchor the definition of AKI. In this mini-review, we discuss different approaches used to select reference creatinine and their relative merits and limitations. Methods We reviewed the literature to obtain representative examples of published baseline creatinine definitions when pre-hospital data were not available, as well as literature evaluating estimation of baseline renal function, using PubMed and reference back-tracing within known works. Results 1) Prehospital creatinine values are useful in determining reference creatinine, and in high-risk populations, the mean outpatient serum creatinine value 7-365 days before hospitalization closely approximates nephrology adjudication, 2) in patients without pre-hospital data, the eGFR 75 approach does not reliably estimate true AKI incidence in most at-risk populations, 3) using the lowest inpatient serum creatinine may be reasonable, especially in those with preserved kidney function, but may generously estimate AKI incidence and severity and miss community-acquired AKI that does not fully resolve, 4) using more specific definitions of AKI (e.g. KDIGO stages 2 and 3) may help to reduce the effects of misclassification when using surrogate values, and 5) leveraging available clinical data may help refine the estimate of reference creatinine. Conclusions Choosing reference creatinine for AKI calculation is important for AKI classification and study interpretation. We recommend obtaining data on pre-hospital kidney function, wherever possible. In studies where surrogate estimates are used, transparency in how they are applied and discussion that informs the reader of potential biases should be provided. Further work to refine the …

  7. Choice of Reference Serum Creatinine in Defining Acute Kidney Injury.

    Science.gov (United States)

    Siew, Edward D; Matheny, Michael E

    2015-01-01

The study of acute kidney injury (AKI) has expanded with the increasing availability of electronic health records and the use of standardized definitions. Understanding the impact of AKI between settings is limited by heterogeneity in the selection of reference creatinine to anchor the definition of AKI. In this mini-review, we discuss different approaches used to select reference creatinine and their relative merits and limitations. We reviewed the literature to obtain representative examples of published baseline creatinine definitions when pre-hospital data were not available, as well as literature evaluating the estimation of baseline renal function, using PubMed and reference back-tracing within known works. (1) Pre-hospital creatinine values are useful in determining reference creatinine, and in high-risk populations, the mean outpatient serum creatinine value 7-365 days before hospitalization closely approximates nephrology adjudication, (2) in patients without pre-hospital data, the eGFR 75 approach does not reliably estimate true AKI incidence in most at-risk populations, (3) using the lowest inpatient serum creatinine may be reasonable, especially in those with preserved kidney function, but may generously estimate AKI incidence and severity and miss community-acquired AKI that does not fully resolve, (4) using more specific definitions of AKI (e.g., KDIGO stages 2 and 3) may help to reduce the effects of misclassification when using surrogate values and (5) leveraging available clinical data may help refine the estimate of reference creatinine. Choosing reference creatinine for AKI calculation is important for AKI classification and study interpretation. We recommend obtaining data on pre-hospital kidney function, wherever possible. In studies where surrogate estimates are used, transparency in how they are applied and discussion that informs the reader of potential biases should be provided. Further work to refine the estimation of reference creatinine …
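The baseline-creatinine strategies enumerated in these two records can be illustrated with a small sketch. The MDRD back-calculation behind the "eGFR 75" approach is a standard formulation, but the KDIGO staging below is deliberately abbreviated (the 0.3 mg/dL rise criterion normally applies within 48 h, and stage 3 has additional absolute-value and dialysis criteria), so treat this as a simplified illustration rather than a clinical implementation.

```python
def mean_outpatient_baseline(outpatient_scr):
    """Approach 1: mean of outpatient serum creatinine values, assumed
    pre-filtered to the 7-365 day window before hospitalization."""
    return sum(outpatient_scr) / len(outpatient_scr)

def egfr75_baseline(age, female, black):
    """Approach 2 ('eGFR 75'): back-solve the 4-variable MDRD equation
    eGFR = 186 * SCr^-1.154 * age^-0.203 * (0.742 if female) * (1.212 if black)
    for the SCr (mg/dL) that would give eGFR = 75 mL/min/1.73 m^2."""
    factor = 186 * age ** -0.203
    if female:
        factor *= 0.742
    if black:
        factor *= 1.212
    return (75 / factor) ** (-1 / 1.154)

def kdigo_stage(scr, baseline):
    """Simplified KDIGO staging by creatinine ratio over baseline."""
    ratio = scr / baseline
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or scr - baseline >= 0.3:
        return 1
    return 0
```

For a 60-year-old non-black male the back-calculated baseline comes out near 1.07 mg/dL, which shows how the surrogate can misclassify patients whose true baseline differs.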

  8. A priori imaginace

    Directory of Open Access Journals (Sweden)

    Mikel Dufrenne

    2016-06-01

Full Text Available In this article, Dufrenne argues that imagination need not be only the subjective capacity to invent the unreal (dreams, fantasies), but that it is actually capable of revealing images that bring human beings closer to the hidden plenitude of Nature. Among these a priori images, such as heaven, water, blood, and earth, Dufrenne emphasizes the elementary, power, depth, and purity, which he believes are the most fundamental of them. He considers a potential classification of these images on the principle of ontological quality.

  9. A soil sampling reference site: The challenge in defining reference material for sampling

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto; Perk, Marcel van der

    2008-01-01

In the frame of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fraction. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site, characterised in terms of trace elements, can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  10. A soil sampling reference site: The challenge in defining reference material for sampling

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, Rome 100-00128 (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, Rome 100-00128 (Italy); Fajgelj, Ales [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, Vienna A-1400 (Austria); Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto [Jozef Stefan Institute, Jamova 39, Ljubljana 1000 (Slovenia); Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, TC Utrecht 3508 (Netherlands)

    2008-11-15

In the frame of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fraction. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site, characterised in terms of trace elements, can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  11. A soil sampling reference site: the challenge in defining reference material for sampling.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto; van der Perk, Marcel

    2008-11-01

In the frame of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fraction. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site, characterised in terms of trace elements, can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  12. Total Evidence, Uncertainty and A Priori Beliefs

    NARCIS (Netherlands)

    Bewersdorf, Benjamin; Felline, Laura; Ledda, Antonio; Paoli, Francesco; Rossanese, Emanuele

    2016-01-01

    Defining the rational belief state of an agent in terms of her initial or a priori belief state as well as her total evidence can help to address a number of important philosophical problems. In this paper, I discuss how this strategy can be applied to cases in which evidence is uncertain. I argue

  13. Koncepce a priori Mikela Dufrenna

    Directory of Open Access Journals (Sweden)

    Felix Borecký

    2016-06-01

Full Text Available The principal aim of this essay is to present Mikel Dufrenne's conception of the a priori as an effort to overcome the sociological and historical relativism that dominates in aesthetics as in all other human sciences. Dufrenne endeavours to show that the understanding of sense cannot be derived only from an empirical framework. It does not suffice to consider only what one has experienced or the habits, norms and values of the society in which one has grown up and by means of which one perceives reality (common sense); human beings also have a priori constants that are of crucial significance for cognition. The second section of the article presents an interpretation of how Dufrenne delimits his conception of the a priori against epistemology, and, mainly, in the third section, against Kant. The article continues, in the fourth and the sixth sections respectively, with an endeavour to overcome subjectivism and intellectualism, and focuses, in the fifth section, on the a priori as a historical concept that is inseparably connected with the imaginary (discussed in the seventh section). All these points are completely heterogeneous with the traditional delimitations of the a priori according to the rules of logic. Dufrenne's notion of the a priori constitutes an original contribution to solving the problem of cognitio sensitiva, which deals with the question of the extent to which it is possible to recognize universal truths on a sensuous individual being. Dufrenne is convinced that the aesthetic aspects of experience, such as the sensuous, the imaginary, and the corporeal, are the truest guide to recognition of the deep truths about human being in the world. Here is manifested the a priori basis that is common both to human beings and the world and guaranteed by Nature. The supreme examples of aesthetic phenomena are works of art and the supreme type of experience is the aesthetic experience.

  14. Defining reference conditions for acidified waters using a modern analogue approach

    International Nuclear Information System (INIS)

    Simpson, Gavin L.; Shilland, Ewan M.; Winterbottom, Julie M.; Keay, Janey

    2005-01-01

    Analogue matching is a palaeolimnological technique that aims to find matches for fossil sediment samples from a set of modern surface sediment samples. Modern analogues were identified that closely matched the pre-disturbance conditions of eight of the UK Acid Waters Monitoring Network (AWMN) lakes using diatom- and cladoceran-based analogue matching. These analogue sites were assessed in terms of hydrochemistry, aquatic macrophytes and macro-invertebrates as to their suitability for defining wider hydrochemical and biological reference conditions for acidified sites within the AWMN. The analogues identified for individual AWMN sites show a close degree of similarity in terms of their hydrochemical characteristics, aquatic macrophytes and, to a lesser extent, macro-invertebrate fauna. The reference conditions of acidified AWMN sites are inferred to be less acidic than today and to support a wider range of acid-sensitive aquatic macrophyte and macro-invertebrate taxa than that recorded in the AWMN lakes over the period of monitoring since 1988. - The use of a palaeolimnological technique to identify modern ecological reference analogues for acidified lakes is demonstrated
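Analogue matching, as described above, ranks modern surface-sediment samples by their dissimilarity to a fossil (pre-disturbance) sample. The squared chord distance used in the sketch below is a common choice for diatom assemblage comparisons, but the paper's exact metric is not stated in the abstract, so it is an assumption for illustration.

```python
def squared_chord_distance(p, q):
    """Dissimilarity between two assemblages expressed as proportional
    abundances (each summing to 1); square-root transform down-weights
    dominant taxa relative to Euclidean distance."""
    return sum((a ** 0.5 - b ** 0.5) ** 2 for a, b in zip(p, q))

def closest_analogues(fossil, modern_sites, k=3):
    """Rank modern surface-sediment samples by dissimilarity to a fossil
    sample and return the k best analogues as (site, assemblage) pairs."""
    scored = sorted(modern_sites.items(),
                    key=lambda kv: squared_chord_distance(fossil, kv[1]))
    return scored[:k]
```

The hydrochemistry and biology of the top-ranked modern sites then serve as the inferred reference conditions for the acidified lake.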

  15. Development and validation of an MRI reference criterion for defining a positive SIJ MRI in spondyloarthritis

    DEFF Research Database (Denmark)

    Weber, Ulrich; Zubler, Veronika; Pedersen, Susanne J

    2012-01-01

OBJECTIVE: To validate an MRI reference criterion for a positive SIJ MRI based on the level of confidence in classification of spondyloarthritis (SpA) by expert MRI readers. METHODS: Four readers assessed SIJ MRI in two inception cohorts (A/B) of 157 consecutive back pain patients ≤50 years, and in 20 healthy controls. Patients were classified according to clinical examination and pelvic radiography as having non-radiographic axial SpA (n=51), ankylosing spondylitis (n=34), or non-specific back pain (n=72). Readers recorded their level of confidence in the classification of SpA on a 0-10 scale … Using two inception cohorts and comparing clinical and MRI-based classification supports the case for including both erosion and BME to define a positive SIJ MRI for the classification of axial SpA. © 2012 by the American College of Rheumatology.

  16. Analytical performance, reference values and decision limits. A need to differentiate between reference intervals and decision limits and to define analytical quality specifications

    DEFF Research Database (Denmark)

    Petersen, Per Hyltoft; Jensen, Esther A; Brandslund, Ivan

    2012-01-01

… of the values of analytical components measured on reference samples from reference individuals. Decision limits are based on guidelines from national and international expert groups defining specific concentrations of certain components as limits for decision about diagnosis or well-defined specific actions. Analytical quality specifications for reference intervals have been defined for bias since the 1990s, but in the recommendations specified in the clinical guidelines analytical quality specifications are only scarcely defined. The demands for negligible biases are, however, even more essential for decision limits, as the choice is no longer left to the clinician but emerges directly from the concentration. Even a small bias will change the number of diseased individuals, so the demands for negligible biases are obvious. A view over the analytical quality as published gives a variable picture of bias …

  17. Targeted liquid chromatography tandem mass spectrometry to quantitate wheat gluten using well-defined reference proteins

    Science.gov (United States)

    Schalk, Kathrin; Koehler, Peter

    2018-01-01

    Celiac disease (CD) is an inflammatory disorder of the upper small intestine caused by the ingestion of storage proteins (prolamins and glutelins) from wheat, barley, rye, and, in rare cases, oats. CD patients need to follow a gluten-free diet by consuming gluten-free products with gluten contents of less than 20 mg/kg. Currently, the recommended method for the quantitative determination of gluten is an enzyme-linked immunosorbent assay (ELISA) based on the R5 monoclonal antibody. Because the R5 ELISA mostly detects the prolamin fraction of gluten, a new independent method is required to detect prolamins as well as glutelins. This paper presents the development of a method to quantitate 16 wheat marker peptides derived from all wheat gluten protein types by liquid chromatography tandem mass spectrometry (LC-MS/MS) in the multiple reaction monitoring mode. The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference wheat protein type resulted in peptide-specific yields. This enabled the conversion of peptide into protein type concentrations. Gluten contents were expressed as sum of all determined protein type concentrations. This new method was applied to quantitate gluten in wheat starches and compared to R5 ELISA and gel-permeation high-performance liquid chromatography with fluorescence detection (GP-HPLC-FLD), which resulted in a strong correlation between LC-MS/MS and the other two methods. PMID:29425234
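The peptide-to-protein conversion step described above can be sketched as follows. The function names and the unit convention (peptide yield expressed per mg of reference protein type) are illustrative assumptions, not the paper's code; the key idea is that each marker peptide's measured concentration is scaled by its peptide-specific yield and the resulting protein-type concentrations are summed.

```python
def protein_type_conc(peptide_conc, peptide_yield):
    """Convert a measured marker-peptide concentration into its parent
    protein-type concentration using the peptide-specific yield determined
    from a chymotryptic digest of the reference protein type."""
    return peptide_conc / peptide_yield

def gluten_content(peptide_results):
    """Total gluten as the sum of protein-type concentrations.
    peptide_results: {peptide: (measured concentration, peptide-specific yield)}"""
    return sum(protein_type_conc(conc, y) for conc, y in peptide_results.values())
```

With hypothetical values, two peptides measured at 2.0 and 3.0 units with yields 0.5 and 1.5 contribute 4.0 and 2.0 protein-type units, for a total of 6.0.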

  18. Targeted liquid chromatography tandem mass spectrometry to quantitate wheat gluten using well-defined reference proteins.

    Directory of Open Access Journals (Sweden)

    Kathrin Schalk

Full Text Available Celiac disease (CD) is an inflammatory disorder of the upper small intestine caused by the ingestion of storage proteins (prolamins and glutelins) from wheat, barley, rye, and, in rare cases, oats. CD patients need to follow a gluten-free diet by consuming gluten-free products with gluten contents of less than 20 mg/kg. Currently, the recommended method for the quantitative determination of gluten is an enzyme-linked immunosorbent assay (ELISA) based on the R5 monoclonal antibody. Because the R5 ELISA mostly detects the prolamin fraction of gluten, a new independent method is required to detect prolamins as well as glutelins. This paper presents the development of a method to quantitate 16 wheat marker peptides derived from all wheat gluten protein types by liquid chromatography tandem mass spectrometry (LC-MS/MS) in the multiple reaction monitoring mode. The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference wheat protein type resulted in peptide-specific yields. This enabled the conversion of peptide into protein type concentrations. Gluten contents were expressed as sum of all determined protein type concentrations. This new method was applied to quantitate gluten in wheat starches and compared to R5 ELISA and gel-permeation high-performance liquid chromatography with fluorescence detection (GP-HPLC-FLD), which resulted in a strong correlation between LC-MS/MS and the other two methods.

  19. Defining Leadership as Process Reference Model: Translating Organizational Goals into Practice Using a Structured Leadership Approach

    OpenAIRE

    Tuffley , David

    2010-01-01

Effective leadership in organisations is important to the achievement of organizational objectives. Yet leadership is widely seen as a quality that individuals innately possess, and which cannot be learned. This paper makes two assertions: (a) that leadership is a skill that not only can be learned, but which can be formalized into a Process Reference Model that is intelligible from an Enterprise Architecture perspective, and (b) that Process Reference Models in the st...

  20. Defining an absolute reference frame for 'clumped' isotope studies of CO 2

    Science.gov (United States)

    Dennis, Kate J.; Affek, Hagit P.; Passey, Benjamin H.; Schrag, Daniel P.; Eiler, John M.

    2011-11-01

We present a revised approach for standardizing and reporting analyses of multiply substituted isotopologues of CO2 (i.e., 'clumped' isotopic species, especially the mass-47 isotopologues). Our approach standardizes such data to an absolute reference frame based on theoretical predictions of the abundances of multiply-substituted isotopologues in gaseous CO2 at thermodynamic equilibrium. This reference frame is preferred over an inter-laboratory calibration of carbonates because it enables all laboratories measuring mass-47 CO2 to use a common scale that is tied directly to theoretical predictions of clumping in CO2, regardless of the laboratory's primary research field (carbonate thermometry or CO2 biogeochemistry); it explicitly accounts for mass spectrometric artifacts rather than convolving (and potentially confusing) them with chemical fractionations associated with sample preparation; and it is based on a thermodynamic equilibrium that can be experimentally established in any suitably equipped laboratory using commonly available materials. By analyzing CO2 gases that have been subjected to established laboratory procedures known to promote isotopic equilibrium (i.e., heated gases and water-equilibrated CO2), and by reference to thermodynamic predictions of equilibrium isotopic distributions, it is possible to construct an empirical transfer function that is applicable to data with unknown clumped isotope signatures. This transfer function empirically accounts for the fragmentation and recombination reactions that occur in electron impact ionization sources and other mass spectrometric artifacts. We describe the protocol necessary to construct such a reference frame, the method for converting gases with unknown clumped isotope compositions to this reference frame, and suggest a protocol for ensuring that all reported isotopic compositions (e.g., Δ47 values; Eiler and Schauble, 2004; Eiler, 2007) can be compared among different laboratories and …
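The empirical transfer function described above is, at its core, a least-squares line mapping measured Δ47 values of equilibrated standards (heated gases and water-equilibrated CO2) onto their theoretically predicted equilibrium values. The sketch below reduces the protocol to that single linear fit, omitting the per-session and instrument-drift corrections a real implementation would include.

```python
def fit_transfer(measured, expected):
    """Ordinary least-squares line through (measured, expected) pairs for
    standards whose equilibrium clumped-isotope composition is known from
    theory. Returns (slope, intercept)."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(expected) / n
    sxx = sum((x - mx) ** 2 for x in measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(measured, expected))
    slope = sxy / sxx
    return slope, my - slope * mx

def to_absolute(delta47_measured, slope, intercept):
    """Project an unknown sample's measured value onto the absolute frame."""
    return slope * delta47_measured + intercept
```

Once the line is fitted from the laboratory's own standards, any unknown sample measured in the same session can be reported on the common absolute scale.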

  1. The Retinome – Defining a reference transcriptome of the adult mammalian retina/retinal pigment epithelium

    Directory of Open Access Journals (Sweden)

    Goetz Thomas

    2004-07-01

Full Text Available Abstract Background The mammalian retina is a valuable model system to study neuronal biology in health and disease. To obtain insight into intrinsic processes of the retina, great efforts are directed towards the identification and characterization of transcripts with functional relevance to this tissue. Results With the goal to assemble a first genome-wide reference transcriptome of the adult mammalian retina, referred to as the retinome, we have extracted 13,037 non-redundant annotated genes from nearly 500,000 published datasets on redundant retina/retinal pigment epithelium (RPE) transcripts. The data were generated from 27 independent studies employing a wide range of molecular and biocomputational approaches. Comparison to known retina-/RPE-specific pathways and established retinal gene networks suggests that the reference retinome may represent up to 90% of the retinal transcripts. We show that the distribution of retinal genes along the chromosomes is not random but exhibits a higher-order organization closely following the previously observed clustering of genes with increased expression. Conclusion The genome-wide retinome map offers a rational basis for selecting suggestive candidate genes for hereditary as well as complex retinal diseases, facilitating elaborate studies into normal and pathological pathways. To make this unique resource freely available we have built a database providing a query interface to the reference retinome.

  2. The Mediterranean Diet: its definition and evaluation of a priori dietary indexes in primary cardiovascular prevention.

    Science.gov (United States)

    D'Alessandro, Annunziata; De Pergola, Giovanni

    2018-01-18

We have analysed the definition of the Mediterranean Diet in 28 studies included in six meta-analyses evaluating the relation between the Mediterranean Diet and primary prevention of cardiovascular disease. Some typical foods of this dietary pattern, such as whole cereals, olive oil and red wine, were taken into account only in a few a priori indexes, and the dietary pattern defined as Mediterranean showed many differences among the studies and compared to the traditional Mediterranean Diet of the early 1960s. Altogether, the analysed studies show a protective effect of the Mediterranean Diet against cardiovascular disease but present different effects against specific conditions such as cerebrovascular disease and coronary heart disease. These different effects might depend on the definition of the Mediterranean Diet and the indexes of adherence used. To compare the effects of the Mediterranean Diet against cardiovascular disease, coronary heart disease and stroke, a univocal model of the Mediterranean Diet should be established as a reference, and it might be represented by the Modern Mediterranean Diet Pyramid. The a priori index to evaluate adherence to the Mediterranean Diet might be the Mediterranean-Style Dietary Pattern Score, which has some advantages in comparison to the other a priori indexes.
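An a priori adherence index of the kind discussed above assigns points component by component against population reference values. The sketch below follows the general form of the classical median-based Mediterranean Diet Score rather than the Mediterranean-Style Dietary Pattern Score named in the abstract (which uses graded, serving-based scoring); the food groups and cutoffs are illustrative assumptions.

```python
# Hypothetical component lists; real a priori indexes define these precisely.
BENEFICIAL = {"vegetables", "fruit", "legumes", "cereals", "fish", "olive_oil"}
DETRIMENTAL = {"meat", "dairy"}

def med_diet_score(intake, medians):
    """Median-based a priori adherence score: one point per beneficial
    component at or above the population median, one point per detrimental
    component below it. Higher scores indicate closer adherence."""
    score = 0
    for food, amount in intake.items():
        median = medians[food]
        if food in BENEFICIAL and amount >= median:
            score += 1
        elif food in DETRIMENTAL and amount < median:
            score += 1
    return score
```

The point of such indexes is comparability: two cohorts scored against their own population medians can be pooled or contrasted in meta-analysis, which is exactly where heterogeneous definitions cause trouble.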

  3. Reference Architecture for Multi-Layer Software Defined Optical Data Center Networks

    Directory of Open Access Journals (Sweden)

    Casimer DeCusatis

    2015-09-01

    Full Text Available As cloud computing data centers grow larger and networking devices proliferate, many complex issues arise in the network management architecture. We propose a framework for multi-layer, multi-vendor optical network management using open standards-based software defined networking (SDN). Experimental results are demonstrated in a test bed consisting of three data centers interconnected by a 125 km metropolitan area network, running OpenStack with KVM and VMware components. Use cases include inter-data center connectivity via a packet-optical metropolitan area network, intra-data center connectivity using an optical mesh network, and SDN coordination of networking equipment within and between multiple data centers. We create and demonstrate original software to implement virtual network slicing and affinity policy-as-a-service offerings. Enhancements to synchronous storage backup, cloud exchanges, and Fibre Channel over Ethernet topologies are also discussed.

  4. Defining Toll Fee of Wheeling Renewable with Reference to a Gas Pipeline in Indonesia

    Science.gov (United States)

    Hakim, Amrullah

    2017-07-01

    Indonesia has a huge number of renewable energy (RE) sources; however, their utilization is currently very low. The main challenge of power production is its alignment with consumption levels: supply should equal demand at all times. There is a strong initiative from corporations with high energy demand, compared to other sectors, to apply a renewable portfolio standard to their energy input, e.g. requiring that 15% of their energy consumption come from a renewable energy source. To support this initiative, power wheeling will help large factories on industrial estates to source firm and steady renewable power from remote sites. Wheeling renewables via PLN’s transmission lines has been regulated under a Ministry Decree since 2015; however, the tariff, or toll fee, has not yet been defined. The potential project applying wheeling would obtain power from a geothermal power plant, with demand coming from scattered factories under one company. This is the concept driving the application of power wheeling in the effort to push the growth of renewable energy in Indonesia. Given that PLN’s transmission lines normally have large capacity and are less congested than distribution lines, wheeling renewables can accommodate scattered factory locations, which results in a cheaper toll fee. Defining the best toll fee is the main topic of this paper, with a comparison to the toll fees of gas pipeline infrastructure in Indonesia, so that wheeling can be applied massively to achieve the COP21 commitment.

  5. Methodology to define biological reference values in the environmental and occupational fields: the contribution of the Italian Society for Reference Values (SIVR).

    Science.gov (United States)

    Aprea, Maria Cristina; Scapellato, Maria Luisa; Valsania, Maria Carmen; Perico, Andrea; Perbellini, Luigi; Ricossa, Maria Cristina; Pradella, Marco; Negri, Sara; Iavicoli, Ivo; Lovreglio, Piero; Salamon, Fabiola; Bettinelli, Maurizio; Apostoli, Pietro

    2017-04-21

    Biological reference values (RVs) explore the relationships between humans and their environment and habits. RVs are fundamental in the environmental field for assessing illnesses possibly associated with environmental pollution, and also in the occupational field, especially in the absence of established biological or environmental limits. The Italian Society for Reference Values (SIVR) undertook to test criteria and procedures for the definition of RVs to be used in the environmental and occupational fields. The paper describes the SIVR methodology for defining RVs of xenobiotics and their metabolites. Aspects regarding the choice of the population sample, the quality of analytical data, statistical analysis and control of variability factors are considered. The simultaneous interlaboratory circuits involved can be expected to progressively improve the quality of the analytical data. Examples of RVs produced by SIVR are presented. In particular, levels of chromium, mercury, ethylenethiourea, 3,5,6-trichloro-2-pyridinol, 2,5-hexanedione, 1-hydroxypyrene and t,t-muconic acid measured in urine and expressed in micrograms per gram of creatinine (μg/g creat) or micrograms per litre (μg/L) are reported. With the proposed procedure, SIVR intends to make its activities known to the scientific community in order to increase the number of laboratories involved in the definition of RVs for the Italian population. More research is needed to obtain further RVs in different biological matrices, such as hair, nails and exhaled breath. It is also necessary to update and improve the present reference values and broaden the portfolio of chemicals for which RVs are available. In the near future, SIVR intends to expand its scientific activity by using a multivariate approach for xenobiotics that may have a common origin, and to define RVs separately for children, who may be more exposed than adults and more vulnerable.

  6. Time-discrete higher order ALE formulations: a priori error analysis

    KAUST Repository

    Bonito, Andrea; Kyza, Irene; Nochetto, Ricardo H.

    2013-01-01

    We derive optimal a priori error estimates for discontinuous Galerkin (dG) time discrete schemes of any order applied to an advection-diffusion model defined on moving domains and written in the Arbitrary Lagrangian Eulerian (ALE) framework. Our

  7. Time-Dependent Selection of an Optimal Set of Sources to Define a Stable Celestial Reference Frame

    Science.gov (United States)

    Le Bail, Karine; Gordon, David

    2010-01-01

    Temporal statistical position stability is required for VLBI sources to define a stable Celestial Reference Frame (CRF) and has been studied in many recent papers. This study analyzes the sources from the latest realization of the International Celestial Reference Frame (ICRF2) with the Allan variance, in addition to taking into account the apparent linear motions of the sources. Focusing on the 295 defining sources shows how they are a good compromise among different criteria, such as statistical stability and sky distribution, as well as having a sufficient number of sources, despite the fact that the most stable sources of the entire ICRF2 are mostly in the Northern Hemisphere. Nevertheless, the selection of a stable set is not unique: studying different solutions (GSF005a and AUG24 from GSFC and OPA from the Paris Observatory) over different time periods (1989.5 to 2009.5 and 1999.5 to 2009.5) leads to selections that can differ in up to 20% of the sources. Improvements in observing, recording, and the network are some of the causes of the better stability of the CRF over the last decade compared with the last twenty years. But this may also be explained by the assumption of stationarity, which is not necessarily right for some sources.
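The stability measure mentioned above can be illustrated with a minimal sketch of the non-overlapping Allan variance applied to a synthetic position time series (the function, data, and averaging scheme here are illustrative assumptions, not the authors' pipeline or ICRF2 data): for an averaging window of m samples, average the series in non-overlapping bins and take half the mean squared difference of consecutive bin means.

```python
import numpy as np

def allan_variance(y, m):
    """Non-overlapping Allan variance of series y at averaging length m samples."""
    n = len(y) // m
    bins = y[:n * m].reshape(n, m).mean(axis=1)   # bin means
    return 0.5 * np.mean(np.diff(bins) ** 2)      # half mean squared increment

# Synthetic white-noise "position" series: for white noise the Allan
# variance falls off roughly as 1/m, the signature of a stable source.
rng = np.random.default_rng(1)
white = rng.normal(size=10_000)
a1 = allan_variance(white, 1)
a100 = allan_variance(white, 100)
```

A source whose Allan variance does not decrease with averaging time (e.g. flicker or random-walk behaviour) would be rated less stable under this kind of criterion.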

  8. A priori a matematika u Berkeleyho [The A Priori and Mathematics in Berkeley]

    Czech Academy of Sciences Publication Activity Database

    Tomeček, Marek

    2014-01-01

    Roč. 62, č. 3 (2014), s. 369-385 ISSN 0015-1831 R&D Projects: GA ČR(CZ) GAP401/11/0371 Institutional support: RVO:67985955 Keywords : a priori * mathematics * empiricism * George Berkeley Subject RIV: AA - Philosophy ; Religion

  9. Solution of underdetermined systems of equations with gridded a priori constraints.

    Science.gov (United States)

    Stiros, Stathis C; Saltogianni, Vasso

    2014-01-01

    The TOPINV (Topological Inversion) algorithm, also known as TGS (Topological Grid Search), initially developed for the inversion of highly non-linear redundant systems of equations, can solve a wide range of underdetermined systems of non-linear equations. This approach generalizes a previous conclusion that the algorithm can be used for the solution of certain integer-ambiguity problems in geodesy. The overall approach is based on additional (a priori) information about the unknown variables. In the past, such information was used either to linearize equations around approximate solutions, or to expand systems of observation equations solved on the basis of generalized inverses. In the proposed algorithm, the a priori information is used in a third way: as topological constraints on the n unknown variables, leading to a grid in R^n containing an approximation of the real solution. The TOPINV algorithm does not focus on point solutions, but exploits the structural and topological constraints in each system of underdetermined equations in order to identify an optimal closed region of R^n containing the real solution. The centre of gravity of the grid points defining this region corresponds to a global, minimum-norm solution. The rationale and validity of the overall approach are demonstrated through examples and case studies, including fault modelling, in comparison with SVD solutions and true (reference) values, in an accuracy-oriented approach.
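The grid-search idea described in the abstract can be sketched in a few lines (a toy illustration under assumed ingredients: the misfit tolerance, the two-unknown model, and the function name are hypothetical, not the authors' implementation): the a priori bounds define a grid, grid points satisfying all observation equations within a tolerance form a closed region, and the centre of gravity of that region is taken as the solution.

```python
import numpy as np

def grid_search_centroid(equations, bounds, n=200, tol=0.5):
    """equations: functions f(x, y) expected to be ~0 at the solution.
    bounds: a priori [(xmin, xmax), (ymin, ymax)] box for the unknowns."""
    xs = np.linspace(bounds[0][0], bounds[0][1], n)
    ys = np.linspace(bounds[1][0], bounds[1][1], n)
    X, Y = np.meshgrid(xs, ys)
    # Keep grid points where every equation's residual is below the tolerance.
    keep = np.ones_like(X, dtype=bool)
    for f in equations:
        keep &= np.abs(f(X, Y)) < tol
    if not keep.any():
        return None
    # Centre of gravity of the accepted region approximates the solution.
    return X[keep].mean(), Y[keep].mean()

# Underdetermined toy system: one equation, two unknowns, plus a priori bounds.
sol = grid_search_centroid([lambda x, y: x + y - 3.0], bounds=[(0, 4), (0, 4)])
```

For this symmetric toy case the centroid lands near (1.5, 1.5), the minimum-norm point on the line x + y = 3 within the box.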

  10. Criteria to define a more relevant reference sample of titanium dioxide in the context of food: a multiscale approach.

    Science.gov (United States)

    Dudefoi, William; Terrisse, Hélène; Richard-Plouet, Mireille; Gautron, Eric; Popa, Florin; Humbert, Bernard; Ropers, Marie-Hélène

    2017-05-01

    Titanium dioxide (TiO2) is a transition metal oxide widely used as a white pigment in various applications, including food. Due to the classification of TiO2 nanoparticles by the International Agency for Research on Cancer as potentially harmful for humans by inhalation, the presence of nanoparticles in food products needed to be confirmed by a set of independent studies. Seven samples of food-grade TiO2 (E171) were extensively characterised for their size distribution, crystallinity and surface properties by the currently recommended methods. All investigated E171 samples contained a fraction of nanoparticles, however below the threshold defining the labelling of nanomaterials. On the basis of these results and a statistical analysis, E171 food-grade TiO2 totally differs from the reference material P25, confirming the few published data on this kind of particle. Therefore, the reference material P25 does not appear to be the most suitable model to study the fate of food-grade TiO2 in the gastrointestinal tract. The criteria currently proposed to obtain a representative food-grade sample of TiO2 are the following: (1) crystalline phase anatase, (2) a powder with an isoelectric point very close to 4.1, (3) a fraction of nanoparticles between 15% and 45%, and (4) a low specific surface area around 10 m² g⁻¹.
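The four criteria listed above can be encoded as a simple representativeness check (a sketch only: the field names, the ±0.3 tolerance on the isoelectric point, and the 15 m² g⁻¹ cutoff are illustrative assumptions, since the abstract says only "very close to 4.1" and "around 10 m² g⁻¹"):

```python
def is_representative_e171(sample):
    """True when a candidate TiO2 sample meets all four criteria from the abstract."""
    return (
        sample["phase"] == "anatase"                      # (1) crystalline phase
        and abs(sample["isoelectric_point"] - 4.1) < 0.3  # (2) IEP very close to 4.1
        and 0.15 <= sample["nano_fraction"] <= 0.45       # (3) 15-45% nanoparticles
        and sample["ssa_m2_per_g"] < 15                   # (4) low SSA, around 10 m2/g
    )

ok = is_representative_e171({
    "phase": "anatase",
    "isoelectric_point": 4.1,
    "nano_fraction": 0.30,
    "ssa_m2_per_g": 10,
})
```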

  11. Quantitative estimation of diphtheria and tetanus toxoids. 4. Toxoids as international reference materials defining Lf-units for diphtheria and tetanus toxoids.

    Science.gov (United States)

    Lyng, J

    1990-01-01

    The Lf-unit, which is used in the control of diphtheria and tetanus toxoid production and in some countries also to follow immunization of horses for production of antitoxins, has hitherto been defined by means of antitoxin preparations. A diphtheria toxoid and a tetanus toxoid preparation, both freeze-dried, were examined in an international collaborative study for their suitability to serve as reference reagents in flocculation tests and for defining the Lf-unit. It was shown that flocculation tests using the reference toxoids are very reproducible and reliable, and the WHO Expert Committee on Biological Standardization established the toxoid called DIFT as the International Reference Reagent of Diphtheria Toxoid for Flocculation Test, with a defined content of 900 Lf-units of diphtheria toxoid per ampoule, and the toxoid called TEFT as the International Reference Reagent of Tetanus Toxoid for Flocculation Test, with a defined content of 1000 Lf-units of tetanus toxoid per ampoule.

  12. Defining reference sequences for Nocardia species by similarity and clustering analyses of 16S rRNA gene sequence data.

    Directory of Open Access Journals (Sweden)

    Manal Helal

    Full Text Available BACKGROUND: The intra- and inter-species genetic diversity of bacteria and the absence of 'reference', or the most representative, sequences of individual species present a significant challenge for sequence-based identification. The aims of this study were to determine the utility, and compare the performance of several clustering and classification algorithms to identify the species of 364 sequences of 16S rRNA gene with a defined species in GenBank, and 110 sequences of 16S rRNA gene with no defined species, all within the genus Nocardia. METHODS: A total of 364 16S rRNA gene sequences of Nocardia species were studied. In addition, 110 16S rRNA gene sequences assigned only to the Nocardia genus level at the time of submission to GenBank were used for machine learning classification experiments. Different clustering algorithms were compared with a novel algorithm, the linear mapping (LM) of the distance matrix. Principal Components Analysis was used for the dimensionality reduction and visualization. RESULTS: The LM algorithm achieved the highest performance and classified the set of 364 16S rRNA sequences into 80 clusters, the majority of which (83.52%) corresponded with the original species. The most representative 16S rRNA sequences for individual Nocardia species have been identified as 'centroids' in respective clusters from which the distances to all other sequences were minimized; 110 16S rRNA gene sequences with identifications recorded only at the genus level were classified using machine learning methods. Simple kNN machine learning demonstrated the highest performance and classified Nocardia species sequences with an accuracy of 92.7% and a mean frequency of 0.578. CONCLUSION: The identification of centroids of 16S rRNA gene sequence clusters using novel distance matrix clustering enables the identification of the most representative sequences for each individual species of Nocardia and allows the quantitation of inter- and intra
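The 'centroid' notion used in the abstract can be sketched on a toy distance matrix (an illustration under assumed inputs: the distance values and function name are hypothetical, not the study's sequence data): within a cluster, the most representative member is the one whose summed distance to all other cluster members is minimal.

```python
import numpy as np

def cluster_centroid(dist, members):
    """dist: square pairwise-distance matrix; members: indices of one cluster.
    Returns the member index minimizing total distance to the other members."""
    sub = dist[np.ix_(members, members)]            # within-cluster distances
    return members[int(np.argmin(sub.sum(axis=1)))] # row with smallest distance sum

# Toy 4-sequence cluster: sequence 1 is, on average, closest to the others.
d = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.0, 1.0, 2.0],
              [2.0, 1.0, 0.0, 2.5],
              [3.0, 2.0, 2.5, 0.0]])
rep = cluster_centroid(d, [0, 1, 2, 3])
```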

  13. Defining the Reference Condition for Wadeable Streams in the Sand Hills Subdivision of the Southeastern Plains Ecoregion, USA

    Science.gov (United States)

    Kosnicki, Ely; Sefick, Stephen A.; Paller, Michael H.; Jarrell, Miller S.; Prusha, Blair A.; Sterrett, Sean C.; Tuberville, Tracey D.; Feminella, Jack W.

    2014-09-01

    The Sand Hills subdivision of the Southeastern Plains ecoregion has been impacted by historical land uses over the past two centuries and, with the additive effects of contemporary land use, determining reference condition for streams in this region is a challenge. We identified reference condition based on the combined use of 3 independent selection methods. Method 1 involved use of a multivariate disturbance gradient derived from several stressors, method 2 was based on variation in channel morphology, and method 3 was based on passing 6 of 7 environmental criteria. Sites selected as reference from all 3 methods were considered primary reference, whereas those selected by 2 or 1 methods were considered secondary or tertiary reference, respectively. Sites not selected by any of the methods were considered non-reference. In addition, best professional judgment (BPJ) was used to exclude some sites from any reference class, and comparisons were made to examine the utility of BPJ. Non-metric multidimensional scaling indicated that use of BPJ may help designate non-reference sites when unidentified stressors are present. The macroinvertebrate community measures Ephemeroptera, Plecoptera, Trichoptera richness and North Carolina Biotic Index showed no differences between primary and secondary reference sites when BPJ was ignored. However, there was no significant difference among primary, secondary, and tertiary reference sites when BPJ was used. We underscore the importance of classifying reference conditions, especially in regions that have endured significant anthropogenic activity. We suggest that the use of secondary reference sites may enable construction of models that target a broader set of management interests.

  14. Conventional Principles in Science: On the foundations and development of the relativized a priori

    Science.gov (United States)

    Ivanova, Milena; Farr, Matt

    2015-11-01

    The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the

  15. Relations between water physico-chemistry and benthic algal communities in a northern Canadian watershed: defining reference conditions using multiple descriptors of community structure.

    Science.gov (United States)

    Thomas, Kathryn E; Hall, Roland I; Scrimgeour, Garry J

    2015-09-01

    Defining reference conditions is central to identifying environmental effects of anthropogenic activities. Using a watershed approach, we quantified reference conditions for benthic algal communities and their relations to physico-chemical conditions in rivers in the South Nahanni River watershed, NWT, Canada, in 2008 and 2009. We also compared the ability of three descriptors that vary in terms of analytical costs to define algal community structure based on relative abundances of (i) all algal taxa, (ii) only diatom taxa, and (iii) photosynthetic pigments. Ordination analyses showed that variance in algal community structure was strongly related to gradients in environmental variables describing water physico-chemistry, stream habitats, and sub-watershed structure. Water physico-chemistry and local watershed-scale descriptors differed significantly between algal communities from sites in the Selwyn Mountain ecoregion compared to sites in the Nahanni-Hyland ecoregions. Distinct differences in algal community types between ecoregions were apparent irrespective of whether algal community structure was defined using all algal taxa, diatom taxa, or photosynthetic pigments. Two algal community types were highly predictable using environmental variables, a core consideration in the development of Reference Condition Approach (RCA) models. These results suggest that assessments of environmental impacts could be completed using RCA models for each ecoregion. We suggest that use of algal pigments, a high through-put analysis, is a promising alternative compared to more labor-intensive and costly taxonomic approaches for defining algal community structure.

  16. A Priori Regularity of Parabolic Partial Differential Equations

    KAUST Repository

    Berkemeier, Francisco

    2018-01-01

    In this thesis, we consider parabolic partial differential equations such as the heat equation, the Fokker-Planck equation, and the porous media equation. Our aim is to develop methods that provide a priori estimates for solutions with singular

  17. A priori knowledge and the Kochen-Specker theorem

    International Nuclear Information System (INIS)

    Brunet, Olivier

    2007-01-01

    We introduce and formalize a notion of 'a priori knowledge' about a quantum system, and establish some properties of this form of knowledge. Finally, we show that the Kochen-Specker theorem follows directly from this study

  18. Optimal phase estimation with arbitrary a priori knowledge

    International Nuclear Information System (INIS)

    Demkowicz-Dobrzanski, Rafal

    2011-01-01

    The optimal phase-estimation strategy is derived when partial a priori knowledge of the estimated phase is available. The solution is found with the help of the most famous result from entanglement theory: the positive partial transpose criterion. The structure of the optimal measurements, estimators, and optimal probe states is analyzed. This Rapid Communication provides a unified framework bridging the gap in the literature on the subject, which until now dealt almost exclusively with two extreme cases: almost perfect knowledge (the local approach based on Fisher information) and no a priori knowledge (the global approach based on covariant measurements). Special attention is paid to a natural a priori probability distribution arising from a diffusion process.

  19. Kant, Reichenbach, and the Fate of A Priori Principles

    OpenAIRE

    de Boer, Karin

    2011-01-01

    This article contends that the relation of early logical empiricism to Kant was more complex than is often assumed. It argues that Reichenbach’s early work on Kant and Einstein, entitled The Theory of Relativity and A Priori Knowledge (1920), aimed to transform rather than to oppose Kant’s Critique of Pure Reason. On the one hand, I argue that Reichenbach’s conception of coordinating principles, derived from Kant’s conception of synthetic a priori principles, offers a valuable way of accounti...

  20. Association of a culturally defined syndrome (nervios) with chest pain and DSM-IV affective disorders in Hispanic patients referred for cardiac stress testing.

    Science.gov (United States)

    Pavlik, Valory N; Hyman, David J; Wendt, Juliet A; Orengo, Claudia

    2004-01-01

    Hispanics have a high prevalence of cardiovascular risk factors, most notably type 2 diabetes. However, in a large public hospital in Houston, Texas, Hispanic patients referred for cardiac stress testing were significantly more likely to have normal test results than were Whites or non-Hispanic Blacks. We undertook an exploratory study to determine if nervios, a culturally based syndrome that shares similarities with both panic disorder and anginal symptoms, is sufficiently prevalent among Hispanics referred for cardiac testing to be considered as a possible explanation for the high probability of a normal test result. Hispanic patients were recruited consecutively when they presented for a cardiac stress test. A bilingual interviewer administered a brief medical history, the Rose Angina Questionnaire (RAQ), a questionnaire to assess a history of nervios and associated symptoms, and the PRIME-MD, a validated brief questionnaire to diagnose DSM-IV defined affective disorders. The average age of the 114 participants (38 men and 76 women) was 57 years, and the average educational attainment was 7 years. Overall, 50% of participants reported a history of chronic nervios, and 14% reported an acute subtype known as ataque de nervios. Only 2% of patients had DSM-IV defined panic disorder, and 59% of patients had a positive RAQ score (ie, Rose questionnaire angina). The acute subtype, ataque de nervios, but not chronic nervios, was related to an increased probability of having Rose questionnaire angina (P=.006). Adjusted for covariates, a positive history of chronic nervios, but not Rose questionnaire angina, was significantly associated with a normal cardiac test result (OR=2.97, P=.04). Nervios is common among Hispanics with symptoms of cardiac disease. Additional research is needed to understand how nervios symptoms differ from chest pain in Hispanics and the role of nervios in referral for cardiac workup by primary care providers and emergency room personnel.

  1. A Priori Knowledge and Heuristic Reasoning in Architectural Design.

    Science.gov (United States)

    Rowe, Peter G.

    1982-01-01

    It is proposed that the various classes of a priori knowledge incorporated in heuristic reasoning processes exert a strong influence over architectural design activity. Some design problems require exercise of some provisional set of rules, inference, or plausible strategy which requires heuristic reasoning. A case study illustrates this concept.…

  2. Incorporating a priori knowledge into initialized weights for neural classifier

    NARCIS (Netherlands)

    Chen, Zhe; Feng, T.J.; Feng, Tian-Jin; Houkes, Z.

    2000-01-01

    Artificial neural networks (ANNs), especially multilayer perceptrons (MLPs), have been widely used in pattern recognition and classification. Nevertheless, how to incorporate a priori knowledge into the design of ANNs is still an open problem. The paper tries to give some insight into this topic

  3. LandScape: a simple method to aggregate p--Values and other stochastic variables without a priori grouping

    DEFF Research Database (Denmark)

    Wiuf, Carsten; Pallesen, Jonatan; Foldager, Leslie

    2016-01-01

    ...and the results might depend on the chosen criteria. Methods that summarize, or aggregate, test statistics or p-values, without relying on a priori criteria, are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method...

  4. Diffuse optical tomography with physiological and spatial a priori constraints

    International Nuclear Information System (INIS)

    Intes, Xavier; Maloux, Clemence; Guven, Murat; Yazici, Birzen; Chance, Britton

    2004-01-01

    Diffuse optical tomography is a typical inverse problem plagued by ill-conditioning. To overcome this drawback, regularization or constraining techniques are incorporated in the inverse formulation. In this work, we investigate the enhancement in recovering functional parameters by using physiological and spatial a priori constraints. More accurate recovery of the two main functional parameters, blood volume and relative saturation, is demonstrated through simulations using our method compared with existing techniques. (note)

  5. Profile reconstruction from neutron reflectivity data and a priori knowledge

    International Nuclear Information System (INIS)

    Leeb, H.

    2008-01-01

    The problem of incomplete and noisy information in profile reconstruction from neutron reflectometry data is considered. In particular, methods of Bayesian statistics in combination with modelling or inverse scattering techniques are considered in order to properly include the required a priori knowledge and obtain quantitatively reliable estimates of the reconstructed profiles. Applying Bayes' theorem, the results of different experiments on the same sample can be consistently included in the profile reconstruction

  6. Why Even Mind? -- On The A Priori Value Of “Life”

    Directory of Open Access Journals (Sweden)

    Amien Kacou

    2008-10-01

    Full Text Available This article presents an analysis of the matter of the "meaning" of life in terms of whether it should even be lived in the first place. It begins with an attempt at defining the question as an inquiry on the a priori value of attention in general, and develops into an axiological reflection distantly inspired from Martin Heidegger's notion of "care." The main objective of the article is (1) to "answer" the question (or to proceed as if the question could be answered objectively) by "playing along" with its naïve logic, that is, by finding a basis for comparing the good that can be found a priori in life (mainly, pleasure) with the good that can be found a priori in death (mainly, the absence of pain); and then (2) to suggest why we have no good reason to feel dissatisfied with where this leaves us (i.e., possibly facing a certain specter of ethical foundationalism: the question of the "value of value"). Its basic conclusion is, assuming we are committed to assigning value to life in general, that we should be able to say that life is good irrespective of any explanation for its existence.

  7. Elimination of hidden a priori information from remotely sensed profile data

    Directory of Open Access Journals (Sweden)

    T. von Clarmann

    2007-01-01

    Full Text Available Profiles of atmospheric state variables retrieved from remote measurements often contain a priori information, which causes complications in the statistical use of the data and in comparisons with other measured or modeled data. For such applications it is often desirable to remove the a priori information from the data product. If the retrieval involves an ill-posed inversion problem, formal removal of the a priori information requires resampling of the data on a coarser grid, which in some sense, however, is a prior constraint in itself. The fact that the trace of the averaging kernel matrix of a retrieval is equivalent to the number of degrees of freedom of the retrieval is used to define an appropriate information-centered representation of the data where each data point represents one degree of freedom. Since regridding implies further degradation of the data and thus causes additional loss of information, a re-regularization scheme has been developed which allows resampling without additional loss of information. For a typical ClONO2 profile retrieved from spectra as measured by the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), the constrained retrieval has 9.7 degrees of freedom. After application of the proposed transformation to a coarser information-centered altitude grid, there are exactly 9 degrees of freedom left, and the averaging kernel on the coarse grid is unity. Pure resampling on the information-centered grid without re-regularization would reduce the degrees of freedom to 7.1 (6.7) for a staircase (triangular) representation scheme.
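The degrees-of-freedom bookkeeping underlying the abstract can be sketched with a toy averaging kernel (the matrix below is invented for illustration, not MIPAS data): the trace of the averaging kernel matrix A gives the number of degrees of freedom of the retrieval, which motivates an information-centered grid with roughly one grid point per degree of freedom.

```python
import numpy as np

def degrees_of_freedom(A):
    """Degrees of freedom of a retrieval: the trace of its averaging kernel."""
    return float(np.trace(A))

# Toy 5-level retrieval whose kernel mixes adjacent levels: each diagonal
# element contributes its fraction of an independent piece of information.
A = np.array([[0.8, 0.2, 0.0, 0.0, 0.0],
              [0.2, 0.6, 0.2, 0.0, 0.0],
              [0.0, 0.2, 0.6, 0.2, 0.0],
              [0.0, 0.0, 0.2, 0.6, 0.2],
              [0.0, 0.0, 0.0, 0.2, 0.8]])
dof = degrees_of_freedom(A)
# Resampling to a coarser grid with round(dof) points would, in the scheme
# described above, make the averaging kernel on that grid close to unity.
```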

  8. Defining suitable reference genes for RT-qPCR analysis on human sertoli cells after 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) exposure.

    Science.gov (United States)

    Ribeiro, Mariana Antunes; dos Reis, Mariana Bisarro; de Moraes, Leonardo Nazário; Briton-Jones, Christine; Rainho, Cláudia Aparecida; Scarano, Wellerson Rodrigo

    2014-11-01

Quantitative real-time RT-PCR (qPCR) has proven to be a valuable molecular technique to quantify gene expression. There are few studies in the literature that describe suitable reference genes to normalize gene expression data. Studies of transcriptionally disruptive toxins, like tetrachlorodibenzo-p-dioxin (TCDD), require careful consideration of reference genes. The present study was designed to validate potential reference genes in human Sertoli cells after exposure to TCDD. A panel of 32 candidate reference genes was analyzed to determine their applicability. The geNorm and NormFinder software packages were used to estimate the expression stability of the 32 genes and to identify the most suitable genes for qPCR data normalization.
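To illustrate the kind of stability ranking geNorm performs, the published M-value idea can be sketched as follows. This re-implements only the core formula, not the geNorm software itself, and the expression data are synthetic:

```python
import numpy as np

# geNorm stability measure M for gene j: the mean, over all other genes
# k, of the standard deviation across samples of the log2 expression
# ratio a_j / a_k. Lower M = more stable candidate reference gene.
def genorm_m(expr):
    # expr: samples x genes matrix of linear-scale expression values
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - log_expr   # log2(a_j / a_k) per sample
        sds = ratios.std(axis=0, ddof=1)
        m[j] = np.delete(sds, j).mean()        # exclude the k == j pair
    return m

rng = np.random.default_rng(0)
expr = rng.lognormal(mean=5, sigma=0.1, size=(12, 4))  # 12 samples, 4 genes
m_values = genorm_m(expr)
```

Ranking candidates by ascending M and iteratively discarding the least stable gene is how geNorm arrives at its recommended normalization set.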

  9. Reference ranges for blood concentrations of eosinophils and monocytes during the neonatal period defined from over 63 000 records in a multihospital health-care system.

    Science.gov (United States)

    Christensen, R D; Jensen, J; Maheshwari, A; Henry, E

    2010-08-01

Blood concentrations of eosinophils and monocytes are part of the complete blood count. Reference ranges for these concentrations during the neonatal period, established by very large sample sizes and modern methods, are needed for identifying abnormally low or high values. We constructed reference ranges for eosinophils per μl and monocytes per μl among neonates of 22 to 42 weeks of gestation, on the day of birth, and also during the 28 days after birth. Data were obtained from archived electronic records over an eight-and-one-half-year period in a multihospital health-care system. In keeping with the reference range concept, values were excluded from neonates with a diagnosis of infection or necrotizing enterocolitis (NEC). Eosinophil and monocyte counts per μl of blood were electronically retrieved from 96 162 records, of which 63 371 that lacked a diagnosis of infection or NEC were included in this reference range report. The mean value for eosinophils per μl on the day of birth increased linearly between 22 and 42 weeks of gestation, as did the 5th and 95th percentile values. The reference range at 40 weeks was 140 to 1300 μl⁻¹ (mean 550 μl⁻¹). Similarly, the mean value for monocytes increased linearly over this interval, with a reference range at 40 weeks of 300 to 3300 μl⁻¹ (mean 1400 μl⁻¹). Over the first 4 weeks after birth, no appreciable change was observed in the 5% limit or the mean eosinophil count, with a slight increase in the 95% limit in week 4. A slight increase in monocyte count was observed during the first 2 weeks after birth. The results of this analysis describe reference ranges for blood concentrations of eosinophils and monocytes during the neonatal period. Additional study is needed to determine the relevance of values falling outside the reference range.
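The reference-range concept used in this record, after exclusions, take the central 90% of values (5th to 95th percentile) per partition, can be sketched in a few lines. The distribution parameters below are illustrative, not the study's data:

```python
import numpy as np

# Nonparametric reference range: the central 90% of the healthy
# population's values, i.e. the 5th and 95th percentiles.
def reference_range(values):
    return np.percentile(values, 5), np.percentile(values, 95)

rng = np.random.default_rng(1)
# Hypothetical eosinophil counts per microliter for one gestational-age
# partition; a lognormal shape is a common assumption for cell counts.
counts = rng.lognormal(mean=6.3, sigma=0.6, size=5000)
lo, hi = reference_range(counts)
# A measured value below lo or above hi would be flagged as falling
# outside the reference range for this partition.
```

With very large samples, as in the 63 371-record dataset above, such percentile estimates become stable enough to partition by narrow gestational-age and postnatal-day bins.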

  10. A priori estimates of global solutions of superlinear parabolic systems

    Directory of Open Access Journals (Sweden)

    Julius Pacuta

    2016-04-01

Full Text Available We consider the parabolic system $u_{t}-\Delta u = u^{r}v^{p}$, $v_{t}-\Delta v = u^{q}v^{s}$ in $\Omega\times(0,\infty)$, complemented by the homogeneous Dirichlet boundary conditions and the initial conditions $(u,v)(\cdot,0) = (u_{0},v_{0})$ in $\Omega$, where $\Omega$ is a smooth bounded domain in $\mathbb{R}^{N}$ and $u_{0},v_{0}\in L^{\infty}(\Omega)$ are nonnegative functions. We find conditions on $p,q,r,s$ guaranteeing a priori estimates of nonnegative classical global solutions. More precisely, every such solution is bounded by a constant depending on a suitable norm of the initial data. Our proofs are based on bootstrap in weighted Lebesgue spaces, universal estimates of auxiliary functions and estimates of the Dirichlet heat kernel.

  11. A Priori Regularity of Parabolic Partial Differential Equations

    KAUST Repository

    Berkemeier, Francisco

    2018-05-13

    In this thesis, we consider parabolic partial differential equations such as the heat equation, the Fokker-Planck equation, and the porous media equation. Our aim is to develop methods that provide a priori estimates for solutions with singular initial data. These estimates are obtained by understanding the time decay of norms of solutions. First, we derive regularity results for the heat equation by estimating the decay of Lebesgue norms. Then, we apply similar methods to the Fokker-Planck equation with suitable assumptions on the advection and diffusion. Finally, we conclude by extending our techniques to the porous media equation. The sharpness of our results is confirmed by examining known solutions of these equations. The main contribution of this thesis is the use of functional inequalities to express decay of norms as differential inequalities. These are then combined with ODE methods to deduce estimates for the norms of solutions and their derivatives.

  12. Using A Priori Information to Improve Atmospheric Duct Estimation

    Science.gov (United States)

    Zhao, X.

    2017-12-01

Knowledge of refractivity conditions in the marine atmospheric boundary layer (MABL) is crucial for the prediction of radar and communication systems performance at frequencies above 1 GHz on low-altitude paths. Since early this century, the `refractivity from clutter (RFC)' technique has proved to be an effective way to estimate the MABL refractivity structure. The refractivity model is very important for RFC techniques. If prior knowledge of the local refractivity is available (e.g., from numerical weather prediction models, atmospheric soundings, etc.), a more accurate parameterized refractivity model can be constructed by statistical methods, e.g., principal component analysis, which in turn can be used to improve the quality of the local refractivity retrievals. This work extends the adjoint parabolic equation approach to range-varying atmospheric duct structure inversions, in which a linear empirical reduced-dimension refractivity model constructed from the a priori refractive information is used.
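The reduced-dimension model construction described here can be sketched with a principal-component decomposition of a priori profiles. This is a minimal illustration under assumed data; profile values, grid size, and the number of retained modes are not from the study:

```python
import numpy as np

# Build an empirical reduced-dimension refractivity model: mean profile
# plus the leading principal components of the a priori profile set.
def build_reduced_model(profiles, n_modes):
    # profiles: samples x heights matrix of a priori refractivity
    mean = profiles.mean(axis=0)
    anomalies = profiles - mean
    # Right singular vectors of the anomaly matrix are the empirical
    # orthogonal basis functions (principal components).
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    return mean, vt[:n_modes]

rng = np.random.default_rng(2)
profiles = rng.normal(330, 5, size=(100, 50))   # toy M-unit profiles
mean, basis = build_reduced_model(profiles, n_modes=3)
# A candidate profile is then mean + coeffs @ basis, so the duct
# inversion searches over 3 coefficients instead of 50 height levels.
```

Shrinking the search space this way is what makes range-varying inversions tractable while keeping the retrieval consistent with the local climatology.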

  13. A priori and a posteriori approaches in human reliability

    International Nuclear Information System (INIS)

    Griffon-Fouco, M.; Gagnolet, P.

    1981-09-01

The French Atomic Energy Commission (CEA) and the French electricity supplier (EDF) have conducted joint studies on human factors in nuclear safety. This paper deals with these studies, which combine two approaches: an a posteriori approach, to determine the rate of human errors and their causes (an analysis of incident data banks and an analysis of human errors on a simulator are presented); and an a priori approach, to identify the potential factors of human errors (an analysis of control room design and an analysis of the writing of procedures are presented). The possibility of combining these two approaches to prevent and quantify human errors is discussed.

  14. Mining of hospital laboratory information systems: a model study defining age- and gender-specific reference intervals and trajectories for plasma creatinine in a pediatric population.

    Science.gov (United States)

    Søeby, Karen; Jensen, Peter Bjødstrup; Werge, Thomas; Sørensen, Steen

    2015-09-01

    The knowledge of physiological fluctuation and variation of even commonly used biochemical quantities in extreme age groups and during development is sparse. This challenges the clinical interpretation and utility of laboratory tests in these age groups. To explore the utility of hospital laboratory data as a source of information, we analyzed enzymatic plasma creatinine as a model analyte in two large pediatric hospital samples. Plasma creatinine measurements from 9700 children aged 0-18 years were obtained from hospital laboratory databases and partitioned into high-resolution gender- and age-groups. Normal probability plots were used to deduce parameters of the normal distributions from healthy creatinine values in the mixed hospital datasets. Furthermore, temporal trajectories were generated from repeated measurements to examine developmental patterns in periods of changing creatinine levels. Creatinine shows great age dependence from birth throughout childhood. We computed and replicated 95% reference intervals in narrow gender and age bins and showed them to be comparable to those determined in healthy population studies. We identified pronounced transitions in creatinine levels at different time points after birth and around the early teens, which challenges the establishment and usefulness of reference intervals in those age groups. The study documents that hospital laboratory data may inform on the developmental aspects of creatinine, on periods with pronounced heterogeneity and valid reference intervals. Furthermore, part of the heterogeneity in creatinine distribution is likely due to differences in biological and chronological age of children and should be considered when using age-specific reference intervals.
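The central idea behind deducing healthy-population parameters from mixed hospital data, as the normal probability plots above do, is that the central quantiles are least contaminated by pathological values in the tails. A hedged sketch under synthetic data (all numbers illustrative, not the study's creatinine values):

```python
import numpy as np

# Estimate the healthy subpopulation's normal parameters from the
# central quantiles of a mixed sample: the median for the mean, and
# the interquartile range for the SD (IQR / 1.349 for a normal).
def central_normal_fit(values):
    q25, q50, q75 = np.percentile(values, [25, 50, 75])
    return q50, (q75 - q25) / 1.349

rng = np.random.default_rng(3)
healthy = rng.normal(40, 6, size=9000)     # toy creatinine, umol/l
diseased = rng.normal(120, 30, size=300)   # contaminating high tail
mu, sigma = central_normal_fit(np.concatenate([healthy, diseased]))
# mu and sigma approximate the healthy subpopulation despite mixing;
# a 95% reference interval would then be mu +/- 1.96 * sigma.
```

Applied per narrow gender and age bin, this kind of estimate is how hospital laboratory data can yield reference intervals comparable to those from healthy population studies.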

  15. Time-discrete higher order ALE formulations: a priori error analysis

    KAUST Repository

    Bonito, Andrea

    2013-03-16

We derive optimal a priori error estimates for discontinuous Galerkin (dG) time discrete schemes of any order applied to an advection-diffusion model defined on moving domains and written in the Arbitrary Lagrangian Eulerian (ALE) framework. Our estimates hold without any restrictions on the time steps for dG with exact integration or Reynolds' quadrature. They involve a mild restriction on the time steps for the practical Runge-Kutta-Radau methods of any order. The key ingredients are the stability results shown earlier in Bonito et al. (Time-discrete higher order ALE formulations: stability, 2013) along with a novel ALE projection. Numerical experiments illustrate and complement our theoretical results. © 2013 Springer-Verlag Berlin Heidelberg.

  16. Methods for improving limited field-of-view radiotherapy reconstructions using imperfect a priori images

    International Nuclear Information System (INIS)

    Ruchala, Kenneth J.; Olivera, Gustavo H.; Kapatoes, Jeffrey M.; Reckwerdt, Paul J.; Mackie, Thomas R.

    2002-01-01

    There are many benefits to having an online CT imaging system for radiotherapy, as it helps identify changes in the patient's position and anatomy between the time of planning and treatment. However, many current online CT systems suffer from a limited field-of-view (LFOV) in that collected data do not encompass the patient's complete cross section. Reconstruction of these data sets can quantitatively distort the image values and introduce artifacts. This work explores the use of planning CT data as a priori information for improving these reconstructions. Methods are presented to incorporate this data by aligning the LFOV with the planning images and then merging the data sets in sinogram space. One alignment option is explicit fusion, producing fusion-aligned reprojection (FAR) images. For cases where explicit fusion is not viable, FAR can be implemented using the implicit fusion of normal setup error, referred to as normal-error-aligned reprojection (NEAR). These methods are evaluated for multiday patient images showing both internal and skin-surface anatomical variation. The iterative use of NEAR and FAR is also investigated, as are applications of NEAR and FAR to dose calculations and the compensation of LFOV online MVCT images with kVCT planning images. Results indicate that NEAR and FAR can utilize planning CT data as imperfect a priori information to reduce artifacts and quantitatively improve images. These benefits can also increase the accuracy of dose calculations and be used for augmenting CT images (e.g., MVCT) acquired at different energies than the planning CT
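The sinogram-space merging at the heart of FAR/NEAR can be sketched simply: detector bins inside the limited field of view keep the measured data, while bins outside are filled from a reprojection of the aligned planning CT. All array names, sizes, and values below are illustrative, not the authors' implementation:

```python
import numpy as np

# Merge a truncated (LFOV) sinogram with a prior-image reprojection:
# keep measured bins where available, fall back to the prior elsewhere.
def merge_sinograms(measured, prior_reprojection, lfov_mask):
    return np.where(lfov_mask, measured, prior_reprojection)

n_views, n_bins = 180, 256
measured = np.zeros((n_views, n_bins))          # toy measured sinogram
prior = np.ones((n_views, n_bins))              # toy prior reprojection
mask = np.zeros((n_views, n_bins), dtype=bool)
mask[:, 64:192] = True                          # central bins measured
merged = merge_sinograms(measured, prior, mask)
```

The merged sinogram is then reconstructed as usual; truncation artifacts shrink because the exterior of the field of view is no longer treated as missing data.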

  17. Developing an A Priori Database for Passive Microwave Snow Water Retrievals Over Ocean

    Science.gov (United States)

    Yin, Mengtao; Liu, Guosheng

    2017-12-01

A physically optimized a priori database is developed for Global Precipitation Measurement Microwave Imager (GMI) snow water retrievals over ocean. The initial snow water content profiles are derived from CloudSat Cloud Profiling Radar (CPR) measurements. A radiative transfer model, in which the single-scattering properties of nonspherical snowflakes are based on discrete dipole approximation results, is employed to simulate brightness temperatures and their gradients. Snow water content profiles are then optimized through a one-dimensional variational (1D-Var) method. The standard deviations of the difference between observed and simulated brightness temperatures are of a similar magnitude to the observation errors defined for the observation error covariance matrix after the 1D-Var optimization, indicating that this variational method is successful. This optimized database is applied in a Bayesian snow water retrieval algorithm. The retrieval results indicate that the 1D-Var approach has a positive impact on the GMI-retrieved snow water content profiles by improving the physical consistency between snow water content profiles and observed brightness temperatures. The global distribution of snow water contents retrieved from the a priori database is compared with CloudSat CPR estimates. Results show that the two estimates have a similar pattern of global distribution, and the difference of their global means is small. In addition, we investigate the impact of using physical parameters to subset the database on snow water retrievals. It is shown that using total precipitable water to subset the database with 1D-Var optimization is beneficial for snow water retrievals.
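The 1D-Var step described here balances closeness to the prior profile against fit to the observed brightness temperatures. For a linear observation operator the minimizer has a closed form, which the sketch below uses; in the real algorithm the radiative transfer model is nonlinear and the operator H below is only a stand-in, with all sizes and values assumed:

```python
import numpy as np

# Linear 1D-Var: minimize (x-xa)' B^-1 (x-xa) + (y-Hx)' R^-1 (y-Hx).
# xa: prior profile, B: prior covariance, y: observations,
# H: (linearized) observation operator, R: observation error covariance.
def one_d_var(xa, B, y, H, R):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    return xa + K @ (y - H @ xa)

n_levels, n_channels = 10, 4
xa = np.full(n_levels, 0.1)                 # toy snow water content
B = 0.01 * np.eye(n_levels)
H = np.ones((n_channels, n_levels)) / n_levels
R = 0.25 * np.eye(n_channels)
y = H @ xa + 0.05                           # simulated Tb departures
x_opt = one_d_var(xa, B, y, H, R)
# x_opt moves the profile toward consistency with the observations
# while staying close to the prior, as in the abstract's optimization.
```

After optimization, the residual standard deviations should be comparable to the diagonal of R, which is the consistency check the abstract reports.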

  18. Impact of modellers' decisions on hydrological a priori predictions

    Science.gov (United States)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the records needed for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their predictions in three steps, with information added prior to each following step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) to simulate discharge for an ungauged catchment and (ii) to use models that were developed for catchments which are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. 
For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of

  19. Sensitivity of the OMI ozone profile retrieval (OMO3PR) to a priori assumptions

    NARCIS (Netherlands)

    Mielonen, T.; De Haan, J.F.; Veefkind, J.P.

    2014-01-01

    We have assessed the sensitivity of the operational OMI ozone profile retrieval (OMO3PR) algorithm to a number of a priori assumptions. We studied the effect of stray light correction, surface albedo assumptions and a priori ozone profiles on the retrieved ozone profile. Then, we studied how to

  20. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    NARCIS (Netherlands)

    Varikuti, D.P.; Hoffstaedter, F.; Genon, S.; Schwender, H.; Reid, A.T.; Eickhoff, S.B.

    2017-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional

  1. Intention to use a fully automated car: attitudes and a priori acceptability

    OpenAIRE

    PAYRE, William; CESTAC, Julien; DELHOMME, Patricia

    2014-01-01

While previous research has studied the acceptability of partially or highly automated driving, few studies have focused on fully automated driving (FAD), including the ability to master longitudinal control, lateral control and maneuvers. The present study analyzes a priori acceptability, attitudes, personality traits and intention to use a fully automated vehicle. 421 French drivers (153 males, M = 40.2 years, age range 19-73) answered an online questionnaire. 68.1% of the sample a priori accepted FAD. P...

  2. Define Project

    DEFF Research Database (Denmark)

    Munk-Madsen, Andreas

    2005-01-01

"Project" is a key concept in IS management. The word is frequently used in textbooks and standards. Yet we seldom find a precise definition of the concept. This paper discusses how to define the concept of a project. The proposed definition covers both heavily formalized projects and informally organized, agile projects. Based on the proposed definition, popular existing definitions are discussed.

  3. "Dermatitis" defined.

    Science.gov (United States)

    Smith, Suzanne M; Nedorost, Susan T

    2010-01-01

The term "dermatitis" can be defined narrowly or broadly, clinically or histologically. A common and costly condition, dermatitis is underresourced compared to other chronic skin conditions. The lack of a collectively understood definition of dermatitis and its subcategories could be the primary barrier. The aim was to investigate how dermatologists define the term "dermatitis" and to determine whether a consensus on the definition of this term and other related terms exists. A seven-question survey of dermatologists nationwide was conducted. Of respondents (n = 122), half consider dermatitis to be any inflammation of the skin. Nearly half (47.5%) use the term interchangeably with "eczema." Virtually all (> 96%) endorse the subcategory "atopic" under the terms "dermatitis" and "eczema," but the subcategories "contact," "drug hypersensitivity," and "occupational" are more highly endorsed under the term "dermatitis" than under the term "eczema." Over half (55.7%) personally consider "dermatitis" to have a broad meaning, and even more (62.3%) believe that dermatologists as a whole define the term broadly. There is a lack of consensus among experts in defining dermatitis, eczema, and their related subcategories.

  4. Practical interior tomography with radial Hilbert filtering and a priori knowledge in a small round area.

    Science.gov (United States)

    Tang, Shaojie; Yang, Yi; Tang, Xiangyang

    2012-01-01

Interior tomography problems can be solved using the so-called differentiated backprojection-projection onto convex sets (DBP-POCS) method, which requires a priori knowledge within a small area interior to the region of interest (ROI) to be imaged. In theory, the small area wherein the a priori knowledge is required can be of any shape, but most of the existing implementations carry out the Hilbert filtering either horizontally or vertically, leading to a vertical or horizontal strip that may cross a large area of the object. In this work, we implement a practical DBP-POCS method with radial Hilbert filtering so that the small area with the a priori knowledge can be roughly round (e.g., a sinus or the ventricles, among other anatomic cavities in the human or animal body). We also conduct an experimental evaluation to verify the performance of this practical implementation. We specifically re-derive the reconstruction formula in the DBP-POCS fashion with radial Hilbert filtering to assure that only a small round area with the a priori knowledge is needed (henceforth the radial DBP-POCS method). The performance of the radial DBP-POCS method with a priori knowledge in a small round area is evaluated with projection data of the standard and modified Shepp-Logan phantoms simulated by computer, followed by a verification using real projection data acquired by a computed tomography (CT) scanner. The preliminary performance study shows that, if a priori knowledge in a small round area is available, the radial DBP-POCS method can solve the interior tomography problem in a practical way with high accuracy. In comparison to the implementations of the DBP-POCS method demanding the a priori knowledge in a horizontal or vertical strip, the radial DBP-POCS method requires the a priori knowledge within a small round area only. 
Such a relaxed requirement on the availability of a priori knowledge can be readily met in practice, because a variety of small

  5. A review of a priori regression models for warfarin maintenance dose prediction.

    Directory of Open Access Journals (Sweden)

    Ben Francis

    Full Text Available A number of a priori warfarin dosing algorithms, derived using linear regression methods, have been proposed. Although these dosing algorithms may have been validated using patients derived from the same centre, rarely have they been validated using a patient cohort recruited from another centre. In order to undertake external validation, two cohorts were utilised. One cohort formed by patients from a prospective trial and the second formed by patients in the control arm of the EU-PACT trial. Of these, 641 patients were identified as having attained stable dosing and formed the dataset used for validation. Predicted maintenance doses from six criterion fulfilling regression models were then compared to individual patient stable warfarin dose. Predictive ability was assessed with reference to several statistics including the R-square and mean absolute error. The six regression models explained different amounts of variability in the stable maintenance warfarin dose requirements of the patients in the two validation cohorts; adjusted R-squared values ranged from 24.2% to 68.6%. An overview of the summary statistics demonstrated that no one dosing algorithm could be considered optimal. The larger validation cohort from the prospective trial produced more consistent statistics across the six dosing algorithms. The study found that all the regression models performed worse in the validation cohort when compared to the derivation cohort. Further, there was little difference between regression models that contained pharmacogenetic coefficients and algorithms containing just non-pharmacogenetic coefficients. The inconsistency of results between the validation cohorts suggests that unaccounted population specific factors cause variability in dosing algorithm performance. Better methods for dosing that take into account inter- and intra-individual variability, at the initiation and maintenance phases of warfarin treatment, are needed.

  6. A review of a priori regression models for warfarin maintenance dose prediction.

    Science.gov (United States)

    Francis, Ben; Lane, Steven; Pirmohamed, Munir; Jorgensen, Andrea

    2014-01-01

    A number of a priori warfarin dosing algorithms, derived using linear regression methods, have been proposed. Although these dosing algorithms may have been validated using patients derived from the same centre, rarely have they been validated using a patient cohort recruited from another centre. In order to undertake external validation, two cohorts were utilised. One cohort formed by patients from a prospective trial and the second formed by patients in the control arm of the EU-PACT trial. Of these, 641 patients were identified as having attained stable dosing and formed the dataset used for validation. Predicted maintenance doses from six criterion fulfilling regression models were then compared to individual patient stable warfarin dose. Predictive ability was assessed with reference to several statistics including the R-square and mean absolute error. The six regression models explained different amounts of variability in the stable maintenance warfarin dose requirements of the patients in the two validation cohorts; adjusted R-squared values ranged from 24.2% to 68.6%. An overview of the summary statistics demonstrated that no one dosing algorithm could be considered optimal. The larger validation cohort from the prospective trial produced more consistent statistics across the six dosing algorithms. The study found that all the regression models performed worse in the validation cohort when compared to the derivation cohort. Further, there was little difference between regression models that contained pharmacogenetic coefficients and algorithms containing just non-pharmacogenetic coefficients. The inconsistency of results between the validation cohorts suggests that unaccounted population specific factors cause variability in dosing algorithm performance. Better methods for dosing that take into account inter- and intra-individual variability, at the initiation and maintenance phases of warfarin treatment, are needed.
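The validation statistics quoted in this record, adjusted R-squared and mean absolute error of predicted versus observed stable doses, can be sketched as follows. The cohort data here are synthetic and the predictor is a stand-in for a dosing algorithm:

```python
import numpy as np

# Adjusted R-squared penalizes R-squared for the number of model
# parameters, allowing fairer comparison across dosing algorithms.
def adjusted_r2(observed, predicted, n_params):
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    n = len(observed)
    return 1 - (1 - r2) * (n - 1) / (n - n_params - 1)

def mean_absolute_error(observed, predicted):
    return np.abs(observed - predicted).mean()

rng = np.random.default_rng(4)
observed = rng.normal(35, 12, size=641)        # toy doses, mg/week
predicted = observed * 0.6 + rng.normal(14, 8, size=641)
r2_adj = adjusted_r2(observed, predicted, n_params=5)
mae = mean_absolute_error(observed, predicted)
```

Comparing these statistics between derivation and external validation cohorts is what exposes the performance drop the study describes.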

  7. A priori motion models for four-dimensional reconstruction in gated cardiac SPECT

    International Nuclear Information System (INIS)

    Lalush, D.S.; Tsui, B.M.W.; Cui, Lin

    1996-01-01

We investigate the benefit of incorporating a priori assumptions about cardiac motion in a fully four-dimensional (4D) reconstruction algorithm for gated cardiac SPECT. Previous work has shown that non-motion-specific 4D Gibbs priors enforcing smoothing in time and space can control noise while preserving resolution. In this paper, we evaluate methods for incorporating known heart motion in the Gibbs prior model. The new model is derived by assigning motion vectors to each 4D voxel, defining the movement of that volume of activity into the neighboring time frames. Weights for the Gibbs cliques are computed based on these "most likely" motion vectors. To evaluate, we employ the mathematical cardiac-torso (MCAT) phantom with a new dynamic heart model that simulates the beating and twisting motion of the heart. Sixteen realistically-simulated gated datasets were generated, with noise simulated to emulate a real Tl-201 gated SPECT study. Reconstructions were performed using several different reconstruction algorithms, all modeling nonuniform attenuation and three-dimensional detector response. These include ML-EM with 4D filtering, 4D MAP-EM without prior motion assumption, and 4D MAP-EM with prior motion assumptions. The prior motion assumptions included both the correct motion model and incorrect models. Results show that reconstructions using the 4D prior model can smooth noise and preserve time-domain resolution more effectively than 4D linear filters. We conclude that modeling of motion in 4D reconstruction algorithms can be a powerful tool for smoothing noise and preserving temporal resolution in gated cardiac studies

  8. Verdad y apertura de mundo. El problema de los juicios sintéticos a priori tras el giro lingüístico

    Directory of Open Access Journals (Sweden)

    Cristina LAFONT

    2009-11-01

Full Text Available SUMMARY: This article analyzes the impact of the linguistic turn on the transformation of the Kantian conception of synthetic a priori judgments. It focuses on two contemporary conceptions of them, namely Heidegger's hermeneutic a priori and Putnam's contextual a priori, and brings out both their similar features and their important differences: whereas Heidegger's conception retains Kant's transcendental idealism through the hermeneutic assumption that meaning determines reference, Putnam's conception breaks completely with the Kantian theory of synthetic a priori judgments, according to which they cannot be revised by experience alone, while also rejecting the Kantian assumption that synthetic a priori and a posteriori judgments are two permanent classes of judgments. In contrast to Heidegger, Putnam is therefore able to show that synthetic a priori judgments are revisable under special conditions. ABSTRACT: This paper analyzes the impact of the linguistic turn in the transformation of the Kantian conception of the synthetic a priori. It focuses on two contemporary conceptions of the synthetic a priori, namely, Heidegger's hermeneutic a priori and Putnam's contextual a priori, and shows their strikingly similar features as well as their important differences: whereas Heidegger's conception retains Kant's transcendental idealism via the hermeneutic assumption that meaning determines reference, Putnam's conception breaks entirely with Kant's conception of synthetic a priori judgments, namely, that they cannot be revised by experience alone, while rejecting the Kantian assumption that synthetic a priori and a posteriori are permanent statuses of judgments. In contradistinction to Heidegger, Putnam is thus able to show that synthetic a priori judgments are revisable under special conditions.

  9. Tattoos defined.

    Science.gov (United States)

    Goldstein, Norman

    2007-01-01

Tattoo definitions from general, foreign language, and medical dictionaries and textbooks are reviewed. In addition to the common usage "to mark the skin with pigments," the word tattoo, used as a noun, first meant a signal on a drum or bugle to call military men to quarters. This chapter includes a variety of colorful cultural and historical references. The increasing popularity of tattoos stimulated the American Academy of Dermatology to produce the 2004 brochure Tattoos, Body Piercing and Other Skin Adornments, which is reproduced here. When asked by patients about getting tattooed, it is wise to caution that even with the variety of modern techniques for removal available, some scarring may result.

  10. Defining chaos.

    Science.gov (United States)

    Hunt, Brian R; Ott, Edward

    2015-09-01

    In this paper, we propose, discuss, and illustrate a computationally feasible definition of chaos which can be applied very generally to situations that are commonly encountered, including attractors, repellers, and non-periodically forced systems. This definition is based on an entropy-like quantity, which we call "expansion entropy," and we define chaos as occurring when this quantity is positive. We relate and compare expansion entropy to the well-known concept of topological entropy to which it is equivalent under appropriate conditions. We also present example illustrations, discuss computational implementations, and point out issues arising from attempts at giving definitions of chaos that are not entropy-based.

  11. Defining Cyberbullying.

    Science.gov (United States)

    Englander, Elizabeth; Donnerstein, Edward; Kowalski, Robin; Lin, Carolyn A; Parti, Katalin

    2017-11-01

    Is cyberbullying essentially the same as bullying, or is it a qualitatively different activity? The lack of a consensual, nuanced definition has limited the field's ability to examine these issues. Evidence suggests that being a perpetrator of one is related to being a perpetrator of the other; furthermore, strong relationships can also be noted between being a victim of either type of attack. It also seems that both types of social cruelty have a psychological impact, although the effects of being cyberbullied may be worse than those of being bullied in a traditional sense (evidence here is by no means definitive). A complicating factor is that the 3 characteristics that define bullying (intent, repetition, and power imbalance) do not always translate well into digital behaviors. Qualities specific to digital environments often render cyberbullying and bullying different in circumstances, motivations, and outcomes. To make significant progress in addressing cyberbullying, certain key research questions need to be addressed. These are as follows: How can we define, distinguish between, and understand the nature of cyberbullying and other forms of digital conflict and cruelty, including online harassment and sexual harassment? Once we have a functional taxonomy of the different types of digital cruelty, what are the short- and long-term effects of exposure to or participation in these social behaviors? What are the idiosyncratic characteristics of digital communication that users can be taught? Finally, how can we apply this information to develop and evaluate effective prevention programs? Copyright © 2017 by the American Academy of Pediatrics.

  12. Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes

    Directory of Open Access Journals (Sweden)

    Annunziata D'Alessandro

    2015-09-01

    Full Text Available The aim of this paper is to analyze the a priori dietary indexes used in the studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but studies show quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. A priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and, therefore, it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should consider classifying whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently.

  13. Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes

    Science.gov (United States)

    D’Alessandro, Annunziata; De Pergola, Giovanni

    2015-01-01

    The aim of this paper is to analyze the a priori dietary indexes used in the studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but studies show quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. A priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and, therefore, it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should consider classifying whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently. PMID:26389950

  14. A Priori and a Posteriori Dietary Patterns during Pregnancy and Gestational Weight Gain: The Generation R Study

    Directory of Open Access Journals (Sweden)

    Myrte J. Tielemans

    2015-11-01

    Full Text Available Abnormal gestational weight gain (GWG) is associated with adverse pregnancy outcomes. We examined whether dietary patterns are associated with GWG. Participants included 3374 pregnant women from a population-based cohort in the Netherlands. Dietary intake during pregnancy was assessed with food-frequency questionnaires. Three a posteriori-derived dietary patterns were identified using principal component analysis: a “Vegetable, oil and fish”, a “Nuts, high-fiber cereals and soy”, and a “Margarine, sugar and snacks” pattern. The a priori-defined dietary pattern was based on national dietary recommendations. Weight was repeatedly measured around 13, 20 and 30 weeks of pregnancy; pre-pregnancy and maximum weight were self-reported. Normal weight women with high adherence to the “Vegetable, oil and fish” pattern had higher early-pregnancy GWG than those with low adherence (43 g/week (95% CI 16; 69) for the highest vs. lowest quartile (Q4 vs. Q1)). Adherence to the “Margarine, sugar and snacks” pattern was associated with a higher prevalence of excessive GWG (OR 1.45 (95% CI 1.06; 1.99), Q4 vs. Q1). Normal weight women with higher scores on the “Nuts, high-fiber cereals and soy” pattern had more moderate GWG than women with lower scores (−0.01 (95% CI −0.02; −0.00) per SD). The a priori-defined pattern was not associated with GWG. To conclude, specific dietary patterns may play a role in early pregnancy but are not consistently associated with GWG.

  15. Learning to improve medical decision making from imbalanced data without a priori cost.

    Science.gov (United States)

    Wan, Xiang; Liu, Jiming; Cheung, William K; Tong, Tiejun

    2014-12-05

    In a medical data set, data are commonly composed of a minority (positive or abnormal) group and a majority (negative or normal) group, and the cost of misclassifying a minority sample as a majority sample is high. This is the so-called imbalanced classification problem. Traditional classification functions can be seriously affected by the skewed class distribution in the data. To deal with this problem, people often use an a priori cost to adjust the learning process in pursuit of an optimal classification function. However, this a priori cost is often unknown and hard to estimate in medical decision making. In this paper, we propose a new learning method, named RankCost, to classify imbalanced medical data without using an a priori cost. Instead of focusing on improving the class-prediction accuracy, RankCost aims to maximize the difference between the minority class and the majority class by using a scoring function, which translates the imbalanced classification problem into a partial ranking problem. The scoring function is learned via a non-parametric boosting algorithm. We compare RankCost to several representative approaches on four medical data sets varying in size, imbalance ratio, and dimension. The experimental results demonstrate that unlike the currently available methods, which often perform unevenly with different a priori costs, RankCost shows comparable performance in a consistent manner. It is a challenging task to learn an effective classification model from imbalanced data in medical data analysis. Traditional approaches often use an a priori cost to adjust the learning of the classification function. This work presents a novel approach, namely RankCost, for learning from imbalanced medical data sets without using an a priori cost. The experimental results indicate that RankCost performs very well in imbalanced data classification and can be a useful method in real-world applications of medical decision making.
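RankCost itself learns its scoring function with a non-parametric boosting algorithm; the translation of imbalanced classification into partial ranking can nonetheless be illustrated with the pairwise ranking statistic (the Mann-Whitney/AUC form), which is insensitive to class imbalance. The scores below are hypothetical, not taken from the paper:

```python
def pairwise_ranking_score(scores_minority, scores_majority):
    """Fraction of (minority, majority) pairs that a scoring function
    orders correctly (minority scored higher); ties count half.
    This is the Mann-Whitney / AUC statistic."""
    correct = 0.0
    for s_min in scores_minority:
        for s_maj in scores_majority:
            if s_min > s_maj:
                correct += 1.0
            elif s_min == s_maj:
                correct += 0.5
    return correct / (len(scores_minority) * len(scores_majority))

# Hypothetical outputs of some learned scoring function:
minority = [0.9, 0.7, 0.65]        # abnormal (rare) samples
majority = [0.8, 0.4, 0.3, 0.2]    # normal samples
print(pairwise_ranking_score(minority, majority))  # 10/12 pairs correct: 0.8333...
```

A score of 1.0 means every minority sample outranks every majority sample; 0.5 is chance level. Unlike accuracy, this target does not degenerate when one class dominates the data.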

  16. A Frequency Matching Method for Generation of a Priori Sample Models from Training Images

    DEFF Research Database (Denmark)

    Lange, Katrine; Cordua, Knud Skou; Frydendall, Jan

    2011-01-01

    This paper presents a Frequency Matching Method (FMM) for generation of a priori sample models based on training images and illustrates its use by an example. In geostatistics, training images are used to represent a priori knowledge or expectations of models, and the FMM can be used to generate new images that share the same multi-point statistics as a given training image. The FMM proceeds by iteratively updating voxel values of an image until the frequency of patterns in the image matches the frequency of patterns in the training image, making the resulting image statistically indistinguishable from the training image.
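The iterative updating described above can be sketched in one dimension. This is a minimal greedy toy, assuming a binary image, length-3 patterns, and an accept/revert rule; the paper's actual FMM operates on multi-point statistics of 2-D/3-D training images:

```python
import random
from collections import Counter

def pattern_hist(img, w=3):
    """Histogram of length-w binary patterns in a 1-D image."""
    return Counter(tuple(img[i:i + w]) for i in range(len(img) - w + 1))

def hist_distance(h1, h2):
    """L1 distance between two pattern histograms (missing keys count 0)."""
    keys = set(h1) | set(h2)
    return sum(abs(h1[k] - h2[k]) for k in keys)

def frequency_match(training, n=40, iters=2000, seed=1):
    """Greedy frequency matching: flip random pixels, keep a flip only
    if it does not increase the distance to the training histogram."""
    rng = random.Random(seed)
    target = pattern_hist(training)
    img = [rng.randint(0, 1) for _ in range(n)]
    dist = hist_distance(pattern_hist(img), target)
    for _ in range(iters):
        i = rng.randrange(n)
        img[i] ^= 1
        new_dist = hist_distance(pattern_hist(img), target)
        if new_dist <= dist:
            dist = new_dist
        else:
            img[i] ^= 1   # revert the flip
    return img, dist

training = [0, 0, 1, 1] * 10          # striped training image
sample, final_dist = frequency_match(training)
```

The greedy rule only ever lowers (or keeps) the histogram distance, so the sample's pattern frequencies drift toward those of the training image; a production implementation would use a stochastic acceptance rule to escape local minima.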

  17. Full 3-D stratigraphic inversion with a priori information: a powerful way to optimize data integration

    Energy Technology Data Exchange (ETDEWEB)

    Grizon, L.; Leger, M.; Dequirez, P.Y.; Dumont, F.; Richard, V.

    1998-12-31

    Integration of seismic and geological data is crucial to ensure that a reservoir study is accurate and reliable. To reach this goal, a post-stack stratigraphic inversion with a priori information is used. The global cost function combines two types of constraints: one relevant to seismic amplitudes, the other to an a priori impedance model. This paper presents this flexible and interpretative inversion to determine acoustic impedances constrained by seismic data, log data and geologic information. 5 refs., 8 figs.

  18. LMFBR safety criteria: cost-benefit considerations under the constraint of an a priori risk criterion

    International Nuclear Information System (INIS)

    Hartung, J.

    1979-01-01

    The role of cost-benefit considerations and a priori risk criteria as determinants of Core Disruptive Accident (CDA)-related safety criteria for large LMFBR's is explored with the aid of quantitative risk and probabilistic analysis methods. A methodology is described which allows a large number of design and siting alternatives to be traded off against each other with the goal of minimizing energy generation costs subject to the constraint of both an a priori risk criterion and a cost-benefit criterion. Application of this methodology to a specific LMFBR design project is described and the results are discussed. 5 refs

  19. Plasma ascorbic acid, a priori diet quality score, and incident hypertension

    NARCIS (Netherlands)

    Buijsse, Brian; Jacobs, D.R.; Steffen, L.M.; Kromhout, Daan; Gross, M.D.

    2015-01-01

    Vitamin C may reduce risk of hypertension, either in itself or by marking a healthy diet pattern. We assessed whether plasma ascorbic acid and the a priori diet quality score relate to incident hypertension and whether they explain each other's predictive abilities. Data were from 2884 black and

  20. Realism, functions, and the a priori: Ernst Cassirer's philosophy of science.

    Science.gov (United States)

    Heis, Jeremy

    2014-12-01

    This paper presents the main ideas of Cassirer's general philosophy of science, focusing on the two aspects of his thought that--in addition to being the most central ideas in his philosophy of science--have received the most attention from contemporary philosophers of science: his theory of the a priori aspects of physical theory, and his relation to scientific realism.

  1. New a priori estimates for mean-field games with congestion

    KAUST Repository

    Evangelista, David; Gomes, Diogo A.

    2016-01-01

    We present recent developments in crowd dynamics models (e.g. pedestrian flow problems). Our formulation is given by a mean-field game (MFG) with congestion. We start by reviewing earlier models and results. Next, we develop our model. We establish new a priori estimates that give partial regularity of the solutions. Finally, we discuss numerical results.

  2. The effect of a priori probability and complexity on decision making in a supervisory control task

    NARCIS (Netherlands)

    Kerstholt, J.H.; Passenier, P.O.; Houttuin, K.; Schuffel, H.

    1996-01-01

    In the present study we investigated how monitoring and fault management in a ship control task are affected by complexity and a priori probability of disturbances. Participants were required to supervise four independent shipping subsystems and to adjust the subsystems whenever deviations

  3. A concept for automated nanoscale atomic force microscope (AFM) measurements using a priori knowledge

    International Nuclear Information System (INIS)

    Recknagel, C; Rothe, H

    2009-01-01

    The nanometer coordinate measuring machine (NCMM) is developed for comparatively fast large area scans with high resolution. The system combines a metrological atomic force microscope (AFM) with a precise positioning system. The sample is moved under the probe system via the positioning system, achieving a scan range of 25 × 25 × 5 mm³ with a resolution of 0.1 nm. A concept for AFM measurements using a priori knowledge is implemented. The a priori knowledge is generated through measurements with a white light interferometer and the use of CAD data. Dimensional markup language is used as a transfer and target format for a priori knowledge and measurement data. Using the a priori knowledge and template matching algorithms combined with the optical microscope of the NCMM, the region of interest can automatically be identified. In the next step the automatic measurement of the part coordinate system and the measurement elements with the AFM sensor of the NCMM is done. The automatic measurement involves intelligent measurement strategies, which are adapted to specific geometries of the measurement feature to reduce measurement time and drift effects.

  4. New a priori estimates for mean-field games with congestion

    KAUST Repository

    Evangelista, David

    2016-01-06

    We present recent developments in crowd dynamics models (e.g. pedestrian flow problems). Our formulation is given by a mean-field game (MFG) with congestion. We start by reviewing earlier models and results. Next, we develop our model. We establish new a priori estimates that give partial regularity of the solutions. Finally, we discuss numerical results.

  5. On the Implications of A Priori Constraints in Transdimensional Bayesian Inversion for Continental Lithospheric Layering

    Science.gov (United States)

    Roy, C.; Romanowicz, B. A.

    2017-12-01

    Monte Carlo methods are powerful approaches to solve nonlinear problems and are becoming very popular in Earth sciences. One reason is that, at first glance, no constraints or explicit regularization of model parameters are required. At second glance, one might realize that regularization is done through a prior. The choice of this prior, however, is subjective, and with it, unintended or undesired extra information can be injected into the problem. The principal criticism of Bayesian methods is that the prior can be "tuned" in order to get the expected solution. Consequently, detractors of the Bayesian method could easily argue that the solution is influenced by the form of the prior distribution, whose choice is subjective. Hence, models obtained with Monte Carlo methods are still highly debated. Here we investigate the influence of a priori constraints (i.e., fixed crustal discontinuities) on the posterior probability distributions of estimated parameters, that is, vertically polarized shear velocity VSV and radial anisotropy ξ, in a transdimensional Bayesian inversion for continental lithospheric structure. We follow up on the work of Calò et al. (2016), who jointly inverted converted phases (P to S) without deconvolution and surface wave dispersion data to obtain 1-D radially anisotropic shear wave velocity profiles in the North American craton. We aim at verifying whether the strong lithospheric layering found in the stable part of the craton is robust with respect to artifacts that might be caused by the methodology used. We test the hypothesis that the observed midlithospheric discontinuities result from (1) fixed crustal discontinuities in the reference model and (2) a fixed Vp/Vs ratio. The synthetic tests on two Earth models show that a fixed Vp/Vs ratio does not introduce artificial layering, even if the assumed value is slightly wrong. This is an important finding for real data inversion, where the true value is not always available or accurate.

  6. The origin of anomalous transport in porous media - is it possible to make a priori predictions?

    Science.gov (United States)

    Bijeljic, Branko; Blunt, Martin

    2013-04-01

    at approximately the average flow speed; in the carbonate with the widest velocity distribution the stagnant concentration peak is persistent, while the emergence of a smaller secondary mobile peak is observed, leading to a highly anomalous behavior. This defines different generic nature of non-Fickian transport in the three media and quantifies the effect of pore structure on transport. Moreover, the propagators obtained by the model are in a very good agreement with the propagators measured on beadpack, Bentheimer sandstone and Portland carbonate cores in nuclear magnetic resonance experiments. These findings demonstrate that it is possible to make a priori predictions of anomalous transport in porous media. The importance of these findings for transport in complex carbonate rock micro-CT images is discussed, classifying them in terms of degree of anomalous transport that can have an impact at the field scale. Extensions to reactive transport will be discussed.

  7. VBE reference framework

    NARCIS (Netherlands)

    Afsarmanesh, H.; Camarinha-Matos, L.M.; Ermilova, E.; Camarinha-Matos, L.M.; Afsarmanesh, H.; Ollus, M.

    2008-01-01

    Defining a comprehensive and generic "reference framework" for Virtual organizations Breeding Environments (VBEs), addressing all their features and characteristics, is challenging. While the definition and modeling of VBEs has become more formalized during the last five years, "reference models"

  8. Razonamiento a priori y argumento ontológico en Antonio Rosmini

    Directory of Open Access Journals (Sweden)

    Juan F. Franck

    2013-11-01

    Full Text Available Rosmini’s criticism of the ontological argument finds its place between those of Aquinas and of Kant. With the former he shares the denial of the evidence of God’s essence quoad nos, and with the latter, his acknowledgment of the decisive character of the nucleus of the ontological argument for all other proofs of God’s existence. Such nucleus consists for Rosmini in the possibility of developing an a priori reasoning, different from the ontological one, be it in its Anselmian, Cartesian or Leibnizian form, which would justify the validity of the other proofs and the fascination exercised by the ontological argument. Key words: A Priori Reasoning, Ontological Argument, Antonio Rosmini.

  9. Global a priori estimates for the inhomogeneous Landau equation with moderately soft potentials

    Science.gov (United States)

    Cameron, Stephen; Silvestre, Luis; Snelson, Stanley

    2018-05-01

    We establish a priori upper bounds for solutions to the spatially inhomogeneous Landau equation in the case of moderately soft potentials, with arbitrary initial data, under the assumption that mass, energy and entropy densities stay under control. Our pointwise estimates decay polynomially in the velocity variable. We also show that if the initial data satisfies a Gaussian upper bound, this bound is propagated for all positive times.

  10. Sampling informative/complex a priori probability distributions using Gibbs sampling assisted by sequential simulation

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    2010-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....
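As a minimal stand-in for the MCMC machinery discussed above (this is a plain random-walk Metropolis sampler on a hypothetical 1-D unnormalized density, not the sequential-simulation-assisted Gibbs sampler the authors develop), the following sketch shows how a Markov chain samples a target known only up to a constant:

```python
import math
import random

def metropolis(log_target, x0=0.0, step=1.0, n=50000, burn=1000, seed=42):
    """Random-walk Metropolis: sample from an unnormalized density given
    only its log, accepting each proposed move with prob min(1, ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    out = []
    for i in range(n + burn):
        xp = x + rng.gauss(0.0, step)      # symmetric Gaussian proposal
        lpp = log_target(xp)
        if rng.random() < math.exp(min(0.0, lpp - lp)):
            x, lp = xp, lpp                # accept; otherwise keep x
        if i >= burn:
            out.append(x)
    return out

# Hypothetical "posterior": unnormalized Gaussian, mean 2.0, sd 0.5
samples = metropolis(lambda x: -0.5 * ((x - 2.0) / 0.5) ** 2)
mean = sum(samples) / len(samples)
```

The chain's empirical mean and spread converge to those of the target. In the geostatistical setting of the paper, `log_target` would encode the (far more complex) a priori model plus data likelihood, which is where sequential simulation comes in.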

  11. A priori data-driven multi-clustered reservoir generation algorithm for echo state network.

    Directory of Open Access Journals (Sweden)

    Xiumin Li

    Full Text Available Echo state networks (ESNs) with multi-clustered reservoir topology perform better in reservoir computing and robustness than those with random reservoir topology. However, these ESNs have a complex reservoir topology, which leads to difficulties in reservoir generation. This study focuses on the reservoir generation problem when an ESN is used in environments with sufficient a priori data available. Accordingly, an a priori data-driven multi-cluster reservoir generation algorithm is proposed. The a priori data in the proposed algorithm are used to evaluate reservoirs by calculating the precision and standard deviation of ESNs. The reservoirs are produced using the clustering method; only a reservoir with a better evaluation performance takes the place of a previous one. The final reservoir is obtained when its evaluation score reaches the preset requirement. The prediction experiment results obtained using the Mackey-Glass chaotic time series show that the proposed reservoir generation algorithm provides ESNs with extra prediction precision and increases the structure complexity of the network. Further experiments also reveal the appropriate values of the number of clusters and time window size to obtain optimal performance. The information entropy of the reservoir reaches the maximum when the ESN gains the greatest precision.

  12. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.

  13. Approximate deconvolution model for the simulation of turbulent gas-solid flows: An a priori analysis

    Science.gov (United States)

    Schneiderbauer, Simon; Saeedipour, Mahdi

    2018-02-01

    Highly resolved two-fluid model (TFM) simulations of gas-solid flows in vertical periodic channels have been performed to study closures for the filtered drag force and the Reynolds-stress-like contribution stemming from the convective terms. An approximate deconvolution model (ADM) for the large-eddy simulation of turbulent gas-solid suspensions is detailed and subsequently used to reconstruct those unresolved contributions in an a priori manner. With such an approach, an approximation of the unfiltered solution is obtained by repeated filtering allowing the determination of the unclosed terms of the filtered equations directly. A priori filtering shows that predictions of the ADM model yield fairly good agreement with the fine grid TFM simulations for various filter sizes and different particle sizes. In particular, strong positive correlation (ρ > 0.98) is observed at intermediate filter sizes for all sub-grid terms. Additionally, our study reveals that the ADM results moderately depend on the choice of the filters, such as box and Gaussian filter, as well as the deconvolution order. The a priori test finally reveals that ADM is superior compared to isotropic functional closures proposed recently [S. Schneiderbauer, "A spatially-averaged two-fluid model for dense large-scale gas-solid flows," AIChE J. 63, 3544-3562 (2017)].
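The reconstruction-by-repeated-filtering idea can be sketched in one dimension. Assuming a periodic three-point box filter G (a stand-in for the filters used in the paper), the truncated van Cittert series u* = sum_{i=0..N} (I - G)^i applied to the filtered field approximates the unfiltered field, with the error shrinking as the deconvolution order N grows:

```python
import math

def box_filter(u):
    """Periodic 3-point moving average: the filter G."""
    n = len(u)
    return [(u[(i - 1) % n] + u[i] + u[(i + 1) % n]) / 3.0 for i in range(n)]

def adm_deconvolve(u_bar, order):
    """Approximate deconvolution: u* = sum_{i=0..order} (I - G)^i u_bar,
    the truncated van Cittert series built by repeated filtering."""
    term = list(u_bar)
    u_star = list(u_bar)
    for _ in range(order):
        term = [t - g for t, g in zip(term, box_filter(term))]  # (I - G) term
        u_star = [s + t for s, t in zip(u_star, term)]
    return u_star

n = 64
u = [math.sin(2 * math.pi * i / n) + 0.5 * math.sin(6 * math.pi * i / n)
     for i in range(n)]
u_bar = box_filter(u)                      # the "resolved" (filtered) field
errs = [max(abs(a - b) for a, b in zip(u, adm_deconvolve(u_bar, N)))
        for N in (0, 2, 5)]                # reconstruction error vs. order
```

Each Fourier mode is damped by the filter but not annihilated, so the series converges mode by mode; in an a priori test the reconstructed u* would then be substituted into the unclosed terms (e.g. the filtered drag force) and compared against the fine-grid reference.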

  14. Local digital control of power electronic converters in a dc microgrid based on a-priori derivation of switching surfaces

    Science.gov (United States)

    Banerjee, Bibaswan

    In power electronic based microgrids, the computational requirements needed to implement an optimized online control strategy can be prohibitive. The work presented in this dissertation proposes a generalized method of derivation of geometric manifolds in a dc microgrid that is based on the a-priori computation of the optimal reactions and trajectories for classes of events in a dc microgrid. The proposed states are the stored energies in all the energy storage elements of the dc microgrid and the power flowing into them. It is anticipated that calculating a large enough set of dissimilar transient scenarios will also span many scenarios not specifically used to develop the surface. These geometric manifolds will then be used as reference surfaces in any type of controller, such as a sliding mode hysteretic controller. The presence of switched power converters in microgrids involves different control actions for different system events. The control of the switch states of the converters is essential for steady state and transient operations. A digital memory look-up based controller that uses a hysteretic sliding mode control strategy is an effective technique to generate the proper switch states for the converters. An example dc microgrid with three dc-dc boost converters and resistive loads is considered for this work. The geometric manifolds are successfully generated for transient events, such as step changes in the loads and the sources. The surfaces corresponding to a specific case of step change in the loads are then used as reference surfaces in an EEPROM for experimentally validating the control strategy. The required switch states corresponding to this specific transient scenario are programmed in the EEPROM as a memory table. This controls the switching of the dc-dc boost converters and drives the system states to the reference manifold. 
In this work, it is shown that this strategy effectively controls the system for a transient condition such as step changes

  15. Evaluating A Priori Ozone Profile Information Used in TEMPO Tropospheric Ozone Retrievals

    Science.gov (United States)

    Johnson, Matthew S.; Sullivan, John T.; Liu, Xiong; Newchurch, Mike; Kuang, Shi; McGee, Thomas J.; Langford, Andrew O'Neil; Senff, Christoph J.; Leblanc, Thierry; Berkoff, Timothy

    2016-01-01

    Ozone (O3) is a greenhouse gas and toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is primarily conducted using in situ measurement networks. This is partially due to high-quality information related to air quality being limited from space-borne platforms due to coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address these limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data of total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm uses a priori O3 profile information from a climatological data-base developed from long-term ozone-sonde measurements (tropopause-based (TB) O3 climatology). It has been shown that satellite O3 retrievals are sensitive to a priori O3 profiles and covariance matrices. During this work we investigate the climatological data to be used in TEMPO algorithms (TB O3) and simulated data from the NASA GMAO Goddard Earth Observing System (GEOS-5) Forward Processing (FP) near-real-time (NRT) model products. These two data products will be evaluated with ground-based lidar data from the Tropospheric Ozone Lidar Network (TOLNet) at various locations of the US. This study evaluates the TB climatology, GEOS-5 climatology, and 3-hourly GEOS-5 data compared to lower tropospheric observations to demonstrate the accuracy of a priori information to potentially be used in TEMPO O3 algorithms. Here we present our initial analysis and the theoretical impact on TEMPO retrievals in the lower troposphere.

  16. Evaluating A Priori Ozone Profile Information Used in TEMPO Tropospheric Ozone Retrievals

    Science.gov (United States)

    Johnson, M. S.; Sullivan, J. T.; Liu, X.; Newchurch, M.; Kuang, S.; McGee, T. J.; Langford, A. O.; Senff, C. J.; Leblanc, T.; Berkoff, T.; Gronoff, G.; Chen, G.; Strawbridge, K. B.

    2016-12-01

    Ozone (O3) is a greenhouse gas and toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is primarily conducted using in situ measurement networks. This is partially due to high-quality information related to air quality being limited from space-borne platforms due to coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address these limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data of total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm uses a priori O3 profile information from a climatological data-base developed from long-term ozone-sonde measurements (tropopause-based (TB) O3 climatology). It has been shown that satellite O3 retrievals are sensitive to a priori O3 profiles and covariance matrices. During this work we investigate the climatological data to be used in TEMPO algorithms (TB O3) and simulated data from the NASA GMAO Goddard Earth Observing System (GEOS-5) Forward Processing (FP) near-real-time (NRT) model products. These two data products will be evaluated with ground-based lidar data from the Tropospheric Ozone Lidar Network (TOLNet) at various locations of the US. This study evaluates the TB climatology, GEOS-5 climatology, and 3-hourly GEOS-5 data compared to lower tropospheric observations to demonstrate the accuracy of a priori information to potentially be used in TEMPO O3 algorithms. Here we present our initial analysis and the theoretical impact on TEMPO retrievals in the lower troposphere.

  17. Use of a priori information in incomplete data x-ray CT imaging

    International Nuclear Information System (INIS)

    Eberhard, J.W.; Hedengren, K.H.

    1988-01-01

    A new technique for utilizing a priori information is presented which uses CAD electronic part models to make use of effectively all the information which is available in the blueprint of a selected industrial part. Significant improvements in x-ray image quality are demonstrated using the technique in the image enhancement of the model of an exhaust nozzle actuation ring for the F110 aircraft. Three approaches were evaluated: a projection data approach, an iterative reconstruction approach, and an image processing and analysis approach. Results for these approaches are included. X-ray CT images of the simulated part image reconstructed with several choices of available angular range are shown

  18. A Priori User Acceptance and the Perceived Driving Pleasure in Semi-autonomous and Autonomous Vehicles

    DEFF Research Database (Denmark)

    Bjørner, Thomas

    The aim of this minor pilot study is, from a sociological user perspective, to explore a priori user acceptance and the perceived driving pleasure in semi-autonomous and autonomous vehicles. The methods used were 13 in-depth interviews while having participants watch video examples within four different scenarios. After each scenario, two different numerical rating scales were used. There was a tendency toward positive attitudes regarding semi-autonomous driving systems, especially the use of a parking assistant and while driving in city traffic congestion. However, there were also major......

  19. Brezzi-Pitkaranta stabilization and a priori error analysis for the Stokes Control

    Directory of Open Access Journals (Sweden)

    Aytekin Cibik

    2016-12-01

    Full Text Available In this study, we consider a Brezzi-Pitkaranta stabilization scheme for the optimal control problem governed by stationary Stokes equation, using a P1-P1 interpolation for velocity and pressure. We express the stabilization as extra terms added to the discrete variational form of the problem.  We first prove the stability of the finite element discretization of the problem. Then, we derive a priori error bounds for each variable and present a numerical example to show the effectiveness of the stabilization clearly.

  20. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    International Nuclear Information System (INIS)

    Benkirane, A.; Auger, G.; Chbihi, A.; Bloyet, D.; Plagnol, E.

    1994-01-01

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from underlying physics (and adapted to image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performances, even in very noisy cases. Satisfactory classification results are obtained in cases where more ''classical'' automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append

  1. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Benkirane, A; Auger, G; Chbihi, A [Grand Accelerateur National d'Ions Lourds (GANIL), 14 - Caen (France); Bloyet, D [Caen Univ., 14 (France); Plagnol, E [Paris-11 Univ., 91 - Orsay (France). Inst. de Physique Nucleaire

    1994-12-31

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from underlying physics (and adapted to image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performances, even in very noisy cases. Satisfactory classification results are obtained in cases where more ''classical'' automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append.

  2. Importance of A Priori Vertical Ozone Profiles for TEMPO Air Quality Retrievals

    Science.gov (United States)

    Johnson, M. S.; Sullivan, J. T.; Liu, X.; Zoogman, P.; Newchurch, M.; Kuang, S.; McGee, T. J.; Leblanc, T.

    2017-12-01

    Ozone (O3) is a toxic pollutant which plays a major role in air quality. Typically, monitoring of surface air quality and O3 mixing ratios is conducted using in situ measurement networks. This is partially due to high-quality information related to air quality being limited from space-borne platforms due to coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address the limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data of total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm is proposed to use a priori O3 profile information from a climatological database developed from long-term ozonesonde measurements (tropopause-based (TB-Clim) O3 climatology). This study evaluates the TB-Clim dataset and model simulated O3 profiles, which could potentially serve as a priori O3 profile information in TEMPO retrievals, from near-real-time data assimilation model products (NASA GMAO's operational GEOS-5 FP model and reanalysis data from MERRA2) and a full chemical transport model (CTM), GEOS-Chem. In this study, vertical profile products are evaluated with surface (0-2 km) and tropospheric (0-10 km) TOLNet observations and the theoretical impact of individual a priori profile sources on the accuracy of TEMPO O3 retrievals in the troposphere and at the surface are presented. Results indicate that while the TB-Clim climatological dataset can replicate seasonally-averaged tropospheric O3 profiles, model-simulated profiles from a full CTM resulted in more accurate tropospheric and surface-level O3 retrievals from

  3. Priori mask guided image reconstruction (p-MGIR) for ultra-low dose cone-beam computed tomography

    Science.gov (United States)

    Park, Justin C.; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Kahler, Darren L.; Liu, Chihray; Lu, Bo

    2015-11-01

    Recently, the compressed sensing (CS) based iterative reconstruction method has received attention because of its ability to reconstruct cone beam computed tomography (CBCT) images with good quality using sparsely sampled or noisy projections, thus enabling dose reduction. However, some challenges remain. In particular, there is always a tradeoff between image resolution and noise/streak artifact reduction based on the amount of regularization weighting that is applied uniformly across the CBCT volume. The purpose of this study is to develop a novel low-dose CBCT reconstruction algorithm framework called priori mask guided image reconstruction (p-MGIR) that allows reconstruction of high-quality low-dose CBCT images while preserving the image resolution. In p-MGIR, the unknown CBCT volume was mathematically modeled as a combination of two regions: (1) where anatomical structures are complex, and (2) where intensities are relatively uniform. The priori mask, which is the key concept of the p-MGIR algorithm, was defined as the matrix that distinguishes between the two separate CBCT regions where the resolution needs to be preserved and where streak or noise needs to be suppressed. We then alternately updated each part of image by solving two sub-minimization problems iteratively, where one minimization was focused on preserving the edge information of the first part while the other concentrated on the removal of noise/artifacts from the latter part. To evaluate the performance of the p-MGIR algorithm, a numerical head-and-neck phantom, a Catphan 600 physical phantom, and a clinical head-and-neck cancer case were used for analysis. The results were compared with the standard Feldkamp-Davis-Kress as well as conventional CS-based algorithms. Examination of the p-MGIR algorithm showed that high-quality low-dose CBCT images can be reconstructed without compromising the image resolution. For both phantom and the patient cases, the p-MGIR is able to achieve a clinically
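A 1-D cartoon can make the priori-mask idea concrete: apply strong smoothing where the mask marks a uniform region, and none across a known edge. The sketch below is an illustrative quadratic-regularization toy solved by gradient descent, not the paper's CBCT solver or its actual cost function; all weights are hypothetical.

```python
def masked_smooth(f, lam, step=0.02, iters=5000):
    """Spatially varying quadratic smoothing, a 1-D cartoon of p-MGIR:
    minimize sum_i (u_i - f_i)^2 + sum_i lam_i * (u_{i+1} - u_i)^2
    by gradient descent. The per-position weights lam_i play the role of
    the priori mask: strong smoothing in uniform regions, none across
    the known edge. Illustrative sketch only."""
    u = list(f)
    n = len(f)
    for _ in range(iters):
        g = [2.0 * (u[i] - f[i]) for i in range(n)]   # data-fidelity gradient
        for i in range(n - 1):                        # smoothness gradient
            d = 2.0 * lam[i] * (u[i + 1] - u[i])
            g[i] -= d
            g[i + 1] += d
        u = [u[i] - step * g[i] for i in range(n)]
    return u

# Noisy step signal; the mask disables smoothing across the edge at 9|10
f = [0.0, 0.8] * 5 + [10.0, 10.8] * 5
lam = [5.0] * 19
lam[9] = 0.0          # edge location assumed known a priori
u = masked_smooth(f, lam)
```

The flat regions are denoised toward their means while the step between indices 9 and 10 is preserved, which is the tradeoff the mask is meant to resolve.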

  4. A set-theoretic model reference adaptive control architecture for disturbance rejection and uncertainty suppression with strict performance guarantees

    Science.gov (United States)

    Arabi, Ehsan; Gruenwald, Benjamin C.; Yucelen, Tansel; Nguyen, Nhan T.

    2018-05-01

    Research in adaptive control algorithms for safety-critical applications is primarily motivated by the fact that these algorithms have the capability to suppress the effects of adverse conditions resulting from exogenous disturbances, imperfect dynamical system modelling, degraded modes of operation, and changes in system dynamics. Although government and industry agree on the potential of these algorithms in providing safety and reducing vehicle development costs, a major issue is the inability to achieve a priori, user-defined performance guarantees with adaptive control algorithms. In this paper, a new model reference adaptive control architecture for uncertain dynamical systems is presented to address disturbance rejection and uncertainty suppression. The proposed framework is predicated on a set-theoretic adaptive controller construction using generalised restricted potential functions. The key feature of this framework allows the system error bound between the state of an uncertain dynamical system and the state of a reference model, which captures a desired closed-loop system performance, to be kept below an a priori, user-defined worst-case performance bound, and hence, it has the capability to enforce strict performance guarantees. Examples are provided to demonstrate the efficacy of the proposed set-theoretic model reference adaptive control architecture.

  5. Determining the depth of certain gravity sources without a priori specification of their structural index

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian

    2015-11-01

    We have developed a new method for the interpretation of gravity tensor data based on a generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data; we take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori, generalizing the Tilt-depth approach to depth estimation for different types of source bodies. The new technique uses only the three vertical tensor components of the full gravity tensor data, observed or calculated at different height planes, to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes data from different upward continuation heights, which can effectively reduce the influence of noise. Theoretical simulations of the gravity source model with and without noise illustrate the ability of the method to provide source depth information. Additionally, the simulations demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method using the gravity data acquired over the Humble Salt Dome in the USA as an example. The results show a good correspondence to previous drilling and seismic interpretation results.

  6. Characteristic Investigation of Unfolded Neutron Spectra with Different Priori Information and Gamma Radiation Interference

    International Nuclear Information System (INIS)

    Kim, Bong Hwan

    2006-01-01

    Neutron field spectrometry using multiple spheres, such as Bonner Spheres (BS), has long been essential in radiation protection dosimetry at the workplace: despite its poor energy resolution, it offers easy operation and measurement capability over the wide energy range of interest. KAERI has developed and used an extended BS system based on a LiI(Eu) scintillator as its representative neutron spectrometry system for workplace monitoring as well as for the quantification of neutron calibration fields such as those recommended by ISO 8529. The major issues in using BS are how close the unfolded spectrum is to the real one, and how to minimize the interference of gamma radiation in mixed neutron/gamma fields when using an active instrument such as a BS with a LiI(Eu) scintillator. The former is related to the choice of a priori information when unfolding the measured data, and the latter depends on how to discriminate gamma events in intense gamma radiation fields. The influence of a priori information in unfolding and the effect of counting losses due to signal pile-up for the KAERI BS system were investigated by analyzing spectral measurements in Scattered Neutron Calibration Fields (SNCF)
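The dependence of unfolding on a priori information can be illustrated with a SAND-II-style multiplicative update, which starts from a guess spectrum and rescales each energy bin until the folded spectrum reproduces the sphere count rates. This is a generic sketch of that family of algorithms, with a made-up two-sphere, two-bin response matrix; it is not the KAERI system's actual unfolding code.

```python
import math

def unfold(counts, R, phi0, iters=1000):
    """SAND-II-style multiplicative unfolding sketch: starting from an
    a priori spectrum phi0, iteratively rescale each energy bin so the
    folded spectrum R*phi matches the measured counts. Illustrative only."""
    phi = list(phi0)
    for _ in range(iters):
        pred = [sum(R[i][j] * phi[j] for j in range(len(phi)))
                for i in range(len(counts))]
        nxt = []
        for j in range(len(phi)):
            num = den = 0.0
            for i in range(len(counts)):
                w = R[i][j] * phi[j] / pred[i]      # sensitivity weight
                num += w * math.log(counts[i] / pred[i])
                den += w
            nxt.append(phi[j] * math.exp(num / den) if den > 0 else phi[j])
        phi = nxt
    return phi

# Hypothetical two-sphere, two-bin toy problem with consistent data
R = [[1.0, 0.2],
     [0.3, 1.0]]
phi = unfold(counts=[4.4, 3.2], R=R, phi0=[1.0, 1.0])
print(phi)   # iterates toward the consistent spectrum [4.0, 2.0]
```

With a consistent, well-posed toy system the iteration converges to the exact spectrum; in realistic under-determined problems the result retains the shape of the a priori guess, which is exactly the sensitivity the record discusses.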

  7. Image reconstruction in electrostatic tomography using a priori knowledge from ECT

    International Nuclear Information System (INIS)

    Zhou Bin; Zhang Jianyong; Xu Chuanlong; Wang Shimin

    2011-01-01

    Research highlights: → A dual-mode sensor technique based on ECT and EST is proposed. → The interference of the charged particles to ECT can be eliminated. → A priori knowledge from ECT improves the inversion accuracy. - Abstract: In gas-solid two-phase flow, the charge distribution is a very important process parameter which is useful to the study of electrostatic adhesion. Electrostatic tomography (EST) is a relatively new non-intrusive technique which can be used to acquire charge distribution. However, due to limited measurements, the quality of image reconstruction is poor. In this paper, a dual-mode sensor technique based on electrical capacitance tomography (ECT) and EST is proposed. The theoretical analysis and the numerical simulation results reveal that the permittivity distribution obtained from ECT can provide a priori knowledge for the inversion calculation of EST, so that the accuracy of spatial sensitivity calculation in EST can be improved. This proposed technique is expected to be promising for industrial applications and will also be beneficial to research on the fluid dynamics of gas-solid two-phase flow.

  8. Comparison of a priori versus provisional heparin therapy on radial artery occlusion after transradial coronary angiography and patent hemostasis (from the PHARAOH Study).

    Science.gov (United States)

    Pancholy, Samir B; Bertrand, Olivier F; Patel, Tejas

    2012-07-15

    Systemic anticoagulation decreases the risk of radial artery occlusion (RAO) after transradial catheterization and standard occlusive hemostasis. We compared the efficacy and safety of provisional heparin use only when the technique of patent hemostasis was not achievable to standard a priori heparin administration after radial sheath introduction. Patients referred for coronary angiography were randomized in 2 groups. In the a priori group, 200 patients received intravenous heparin (50 IU/kg) immediately after sheath insertion. In the provisional group, 200 patients did not receive heparin during the procedure. After sheath removal, hemostasis was obtained using a TR band (Terumo Corporation, Tokyo, Japan) with a plethysmography-guided patent hemostasis technique. In the provisional group, no heparin was given if radial artery patency could be obtained and maintained. If radial patency was not achieved, a bolus of heparin (50 IU/kg) was given. Radial artery patency was evaluated at 24 hours (early RAO) and 30 days after the procedure (late RAO) by plethysmography. Patent hemostasis was obtained in 67% in the a priori group and 74% in the provisional group (p = 0.10). Incidence of RAO remained similar in the 2 groups at the early (7.5% vs 7.0%, p = 0.84) and late (4.5% vs 5.0%, p = 0.83) evaluations. Women, patients with diabetes, patients having not received heparin, and patients without radial artery patency during hemostasis had more RAO. By multivariate analysis, patent radial artery during hemostasis (odds ratio [OR] 0.03, 95% confidence interval [CI] 0.004 to 0.28, p = 0.002) and diabetes (OR 11, 95% CI 3 to 38, p patent hemostasis is maintained. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Improving multiple-point-based a priori models for inverse problems by combining Sequential Simulation with the Frequency Matching Method

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Lange, Katrine

    In order to move beyond simplified covariance based a priori models, which are typically used for inverse problems, more complex multiple-point-based a priori models have to be considered. By means of marginal probability distributions 'learned' from a training image, sequential simulation has proven to be an efficient way of obtaining multiple realizations that honor the same multiple-point statistics as the training image. The frequency matching method provides an alternative way of formulating multiple-point-based a priori models. In this strategy the pattern frequency distributions (i.e. marginals) of the training image and a subsurface model are matched in order to obtain a solution with the same multiple-point statistics as the training image. Sequential Gibbs sampling is a simulation strategy that provides an efficient way of applying sequential simulation based algorithms as a priori......

  10. A priori which-way information in quantum interference with unstable particles

    International Nuclear Information System (INIS)

    Krause, D.E.; Fischbach, E.; Rohrbach, Z.J.

    2014-01-01

    If an unstable particle used in a two-path interference experiment decays before reaching a detector, which-way information becomes available that reduces the detected interference fringe visibility V. Here we argue that even when an unstable particle does not decay while in the interferometer, a priori which-way information is still available in the form of path predictability P which depends on the particle's decay rate Γ. We further demonstrate that in a matter-wave Mach–Zehnder interferometer using an excited atom with an appropriately tuned cavity, P is related to V through the duality relation P^2 + V^2 = 1. - Highlights: • Even undecayed unstable particles exhibit novel interference effects. • Interference is studied in a Mach–Zehnder interferometer with a cavity. • More which-way information is available when using unstable particles. • A relation between which-way information and interference is satisfied
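The duality relation can be checked numerically in a toy two-path model where unequal decay along the two paths supplies a priori which-way information. This sketch assumes a simple exponential-survival weighting of the paths, not the paper's atom-cavity setup; the path times are arbitrary illustrative values.

```python
import math

def predictability_visibility(gamma, t1, t2):
    """Toy two-path interference model for an unstable particle with
    decay rate gamma that spends time t1 on path 1 and t2 on path 2.
    Survival probabilities act as path weights; unequal decay produces
    a priori which-way information (path predictability P)."""
    p1 = math.exp(-gamma * t1)            # survival probability, path 1
    p2 = math.exp(-gamma * t2)            # survival probability, path 2
    norm = p1 + p2
    P = abs(p1 - p2) / norm               # a priori path predictability
    V = 2.0 * math.sqrt(p1 * p2) / norm   # interference fringe visibility
    return P, V

P, V = predictability_visibility(gamma=0.5, t1=1.0, t2=3.0)
print(P, V)   # for a pure state the duality P^2 + V^2 = 1 holds exactly
```

A stable particle (gamma = 0) gives P = 0 and V = 1, recovering full-visibility interference with no which-way information.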

  11. GNSS Precise Kinematic Positioning for Multiple Kinematic Stations Based on A Priori Distance Constraints

    Science.gov (United States)

    He, Kaifei; Xu, Tianhe; Förste, Christoph; Petrovic, Svetozar; Barthelmes, Franz; Jiang, Nan; Flechtner, Frank

    2016-01-01

    When applying the Global Navigation Satellite System (GNSS) for precise kinematic positioning in airborne and shipborne gravimetry, multiple GNSS receivers are often rigidly mounted on the kinematic platform carrying the gravimetry instrumentation. Thus, the distances among these GNSS antennas are known and invariant. This information can be used to improve the accuracy and reliability of the state estimates. For this purpose, the known distances between the antennas are applied as a priori constraints within the state parameters adjustment. These constraints are introduced in such a way that their accuracy is taken into account. To test this approach, GNSS data of a Baltic Sea shipborne gravimetric campaign have been used. The results of our study show that an application of distance constraints improves the accuracy of the GNSS kinematic positioning, for example, by about 4 mm for the radial component. PMID:27043580
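The effect of a known-distance constraint can be seen already in a scalar toy problem: two antenna coordinates measured independently, adjusted so their separation equals the known baseline. This is a hypothetical 1-D analogue with a hard constraint; the paper's adjustment is multi-dimensional and weights each constraint by its accuracy rather than enforcing it exactly.

```python
def constrain_baseline(m1, m2, d):
    """Closed-form least squares for two 1-D antenna coordinates measured
    independently as m1 and m2, subject to the hard a priori constraint
    x2 - x1 = d. Minimizing (x1-m1)^2 + (x2-m2)^2 with a Lagrange
    multiplier gives the symmetric split below (illustrative toy)."""
    x1 = (m1 + m2 - d) / 2.0   # Lagrange-multiplier solution
    return x1, x1 + d

# Measured positions disagree with the known 5.0 m baseline; the
# constrained estimate distributes the misfit equally between antennas
x1, x2 = constrain_baseline(0.1, 5.3, 5.0)
print(x1, x2)   # constrained estimates; x2 - x1 equals d exactly
```

Both measurements contribute to both estimates, which is how the constraint transfers information between antennas and improves each coordinate.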

  12. A priori analysis: an application to the estimate of the uncertainty in course grades

    Science.gov (United States)

    Lippi, G. L.

    2014-07-01

    A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual, but striking, application is presented when teaching the section on the data treatment of a laboratory course to illustrate the characteristics of the APA and its potential for widespread use, beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper.
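The paper's exact procedure is not reproduced in the abstract, but the underlying idea of propagating per-assessment uncertainties into a final grade can be sketched with the standard propagation-of-uncertainty formula. The weights, scores, and sigmas below are invented for illustration.

```python
import math

def grade_uncertainty(scores, sigmas, weights):
    """Weighted final grade G = sum(w_i * s_i) and its standard
    uncertainty sigma_G = sqrt(sum((w_i * sigma_i)^2)), assuming
    independent per-assessment uncertainties. A generic propagation
    sketch, not necessarily the APA procedure of the paper."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    grade = sum(w * s for w, s in zip(weights, scores))
    sigma = math.sqrt(sum((w * sg) ** 2 for w, sg in zip(weights, sigmas)))
    return grade, sigma

# Three assessments with hypothetical scores and uncertainties
g, s = grade_uncertainty(scores=[78, 85, 90], sigmas=[5, 4, 3],
                         weights=[0.3, 0.3, 0.4])
print(g, s)   # grade 84.9 with an uncertainty of roughly 2.3 points
```

An uncertainty of a couple of points makes explicit how reliable a letter-grade boundary decision really is, which is the kind of question APA is meant to answer.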

  13. Predicting thermal history a-priori for magnetic nanoparticle hyperthermia of internal carcinoma

    Science.gov (United States)

    Dhar, Purbarun; Sirisha Maganti, Lakshmi

    2017-08-01

    This article proposes a simple and realistic method in which a direct analytical expression can be derived for the temperature field within a tumour during magnetic nanoparticle hyperthermia. The approximate analytical expression for the thermal history within the tumour is derived using the lumped capacitance approach and considers all therapy protocols and parameters. The method provides an easy framework for estimating hyperthermia protocol parameters promptly. The model has been validated against several experimental reports on animal models such as mice/rabbit/hamster and human clinical trials. It has been observed that the model is able to accurately estimate the thermal history within the carcinoma during the hyperthermia therapy. The present approach may find application in the a priori estimation of the thermal history in internal tumours for optimizing magnetic hyperthermia treatment protocols with respect to the ablation time, tumour size, magnetic drug concentration, field strength, field frequency, nanoparticle material and size, tumour location, and so on.
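A generic lumped-capacitance heating model gives the flavour of such a closed-form thermal history: a single energy balance between nanoparticle heat generation and loss to perfused tissue. All parameter values below are hypothetical placeholders, not taken from the cited study.

```python
import math

def tumour_temperature(t, T_body=37.0, Q=0.05, hA=0.01, rho_V_c=3.5):
    """Lumped-capacitance sketch of magnetic hyperthermia heating:
    rho*V*c * dT/dt = Q - hA*(T - T_body), solved in closed form.
    Q       : heat generated by the nanoparticles [W] (hypothetical)
    hA      : effective loss coefficient to perfused tissue [W/K]
    rho_V_c : tumour heat capacity rho*V*c [J/K]"""
    tau = rho_V_c / hA              # thermal time constant [s]
    T_ss = T_body + Q / hA          # steady-state temperature [deg C]
    return T_ss - (T_ss - T_body) * math.exp(-t / tau)

# With these placeholder values the temperature rises from 37 C toward a
# steady state of 42 C, a typical mild-hyperthermia target
print(tumour_temperature(0.0), tumour_temperature(3500.0))
```

Because the solution is analytic, protocol questions (how long to reach a target temperature, what Q a given field strength must deliver) reduce to inverting a single exponential, which is the promptness the record highlights.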

  14. A priori analysis: an application to the estimate of the uncertainty in course grades

    International Nuclear Information System (INIS)

    Lippi, G L

    2014-01-01

    A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual, but striking, application is presented when teaching the section on the data treatment of a laboratory course to illustrate the characteristics of the APA and its potential for widespread use, beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper. (paper)

  15. Evaluating the effectiveness of a priori information on process measures in a virtual reality inspection task

    Directory of Open Access Journals (Sweden)

    Shannon Raye Bowling

    2010-06-01

    Full Text Available Due to the nature of the complexity of the aircraft maintenance industry, much emphasis has been placed on improving aircraft inspection performance. One proven technique for improving inspection performance is the use of training. Several strategies have been implemented for training, one of which is giving feedforward information. The use of a priori (feedforward information is known to positively affect inspection performance (Ernst and Yovits, 1972; Long and Rourke, 1989; McKernan, 1989; Gramopadhye et al., 1997.  This information can consist of knowledge about defect characteristics (types, severity/criticality, and location and the probability of occurrence. Although several studies have been conducted that demonstrate the usefulness of feedforward as a training strategy, there are certain research issues that need to be addressed. This study evaluates the effects of feedforward information on process measures in a simulated 3-dimensional environment (aircraft cargo bay by the use of virtual reality.

  16. Investigating industrial investigation: examining the impact of a priori knowledge and tunnel vision education.

    Science.gov (United States)

    Maclean, Carla L; Brimacombe, C A Elizabeth; Lindsay, D Stephen

    2013-12-01

    The current study addressed tunnel vision in industrial incident investigation by experimentally testing how a priori information and a human bias (generated via the fundamental attribution error or correspondence bias) affected participants' investigative behavior as well as the effectiveness of a debiasing intervention. Undergraduates and professional investigators engaged in a simulated industrial investigation exercise. We found that participants' judgments were biased by knowledge about the safety history of either a worker or piece of equipment and that a human bias was evident in participants' decision making. However, bias was successfully reduced with "tunnel vision education." Professional investigators demonstrated a greater sophistication in their investigative decision making compared to undergraduates. The similarities and differences between these two populations are discussed. (c) 2013 APA, all rights reserved

  17. Rapid multi-wavelength optical assessment of circulating blood volume without a priori data

    Science.gov (United States)

    Loginova, Ekaterina V.; Zhidkova, Tatyana V.; Proskurnin, Mikhail A.; Zharov, Vladimir P.

    2016-03-01

    The measurement of circulating blood volume (CBV) is crucial in various medical conditions including surgery, iatrogenic problems, rapid fluid administration, transfusion of red blood cells, or trauma with extensive blood loss including battlefield injuries and other emergencies. Currently, available commercial techniques are invasive and time-consuming for trauma situations. Recently, we have proposed high-speed multi-wavelength photoacoustic/photothermal (PA/PT) flow cytometry for in vivo CBV assessment with multiple dyes as PA contrast agents (labels). As the first step, we have characterized the capability of this technique to monitor the clearance of three dyes (indocyanine green, methylene blue, and trypan blue) in an animal model. However, there are strong demands on improvements in PA/PT flow cytometry. As additional verification of our proof-of-concept of this technique, we performed optical photometric CBV measurements in vitro. Three label dyes—methylene blue, crystal violet and, partially, brilliant green—were selected for simultaneous photometric determination of the components of their two-dye mixtures in the circulating blood in vitro without any extra data (like hemoglobin absorption) known a priori. The tests of single dyes and their mixtures in a flow system simulating a blood transfusion system showed a negligible difference between the sensitivities of the determination of these dyes under batch and flow conditions. For individual dyes, limits of detection of about 3×10^-6 M in blood were achieved, which provided their continuous determination at a level of 10^-5 M for the CBV assessment without a priori data on the matrix. CBV assessment with errors no higher than 4% was obtained, and the possibility of applying the developed procedure to optical photometry (flow cytometry) with laser sources was shown.
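Simultaneous photometric determination of a two-dye mixture reduces, in the simplest Beer-Lambert picture, to solving a 2x2 linear system from absorbances at two wavelengths. The molar absorptivities and concentrations below are synthetic illustrative values, not the paper's measured spectra.

```python
def two_dye_concentrations(A1, A2, eps, path=1.0):
    """Solve Beer-Lambert for a two-dye mixture from absorbances at two
    wavelengths: A(lam) = path * (eps1(lam)*c1 + eps2(lam)*c2).
    eps = ((e11, e12), (e21, e22)), row = wavelength, column = dye.
    Solved by Cramer's rule; a textbook sketch, not the paper's code."""
    (e11, e12), (e21, e22) = eps
    det = path * path * (e11 * e22 - e12 * e21)
    if abs(det) < 1e-30:
        raise ValueError("chosen wavelengths give no independent information")
    c1 = path * (A1 * e22 - A2 * e12) / det
    c2 = path * (A2 * e11 - A1 * e21) / det
    return c1, c2

# Synthetic round trip with hypothetical absorptivities [L/(mol*cm)]
eps = ((7.0e4, 1.0e4),   # wavelength 1: dye 1 dominates
       (2.0e4, 8.0e4))   # wavelength 2: dye 2 dominates
c_true = (5e-6, 8e-6)
A1 = eps[0][0] * c_true[0] + eps[0][1] * c_true[1]
A2 = eps[1][0] * c_true[0] + eps[1][1] * c_true[1]
print(two_dye_concentrations(A1, A2, eps))  # recovers c_true
```

Choosing wavelengths where each dye dominates keeps the determinant large, which is what makes the determination robust without a priori knowledge of the blood matrix.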

  18. Introduction of a priori information in the elastic linearized inversion of seismic data before stacking; Introduction d'informations a priori dans l'inversion linearisee elastique de donnees sismiques de surface avant sommation

    Energy Technology Data Exchange (ETDEWEB)

    Tonellot, Th.L.

    2000-03-24

    In this thesis, we propose a method which takes into account a priori information (geological, well-log and stratigraphic knowledge) in linearized pre-stack seismic data inversion. The approach is based on a formalism in which the a priori information is incorporated in an a priori model of elastic parameters - density, P and S impedances - and a model covariance operator which describes the uncertainties in the model. The first part of the thesis is dedicated to the study of this covariance operator and to the norm associated with its inverse. We have generalized the exponential covariance operator in order to describe the uncertainties in the a priori model elastic parameters and their correlations at each location. We give the analytical expression of the covariance operator inverse in 1-D, 2-D, and 3-D, and we discretize the associated norm with a finite element method. The second part is dedicated to synthetic and real examples. In a preliminary step, we have developed a pre-stack data well calibration method which allows the estimation of the source signal. The impact of different a priori information is then demonstrated on synthetic and real data. (author)
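The simplest member of the covariance family discussed above is the 1-D exponential covariance, which can be sketched directly. The thesis works with the operator's inverse and its 2-D/3-D generalizations; this is only the basic building block, with illustrative parameter values.

```python
import math

def exp_covariance(xs, sigma, corr_len):
    """1-D exponential covariance matrix C_ij = sigma^2 * exp(-|x_i - x_j| / l):
    variance sigma^2 on the diagonal, correlation decaying with distance
    over the correlation length l. Basic sketch of the operator family."""
    return [[sigma ** 2 * math.exp(-abs(xi - xj) / corr_len)
             for xj in xs] for xi in xs]

C = exp_covariance([0.0, 1.0, 2.0], sigma=2.0, corr_len=1.0)
print(C[0][0], C[0][1])   # variance on the diagonal, e-folding decay off it
```

The correlation length encodes how far a priori knowledge at one depth constrains the elastic parameters at neighbouring depths, which is why it is a central tuning knob of the a priori model.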

  19. A priori study of subgrid-scale features in turbulent Rayleigh-Bénard convection

    Science.gov (United States)

    Dabbagh, F.; Trias, F. X.; Gorobets, A.; Oliva, A.

    2017-10-01

    At the crossroad between flow topology analysis and turbulence modeling, a priori studies are a reliable tool to understand the underlying physics of the subgrid-scale (SGS) motions in turbulent flows. In this paper, properties of the SGS features in the framework of a large-eddy simulation are studied for a turbulent Rayleigh-Bénard convection (RBC). To do so, data from direct numerical simulation (DNS) of a turbulent air-filled RBC in a rectangular cavity of aspect ratio unity and π spanwise open-ended distance are used at two Rayleigh numbers Ra ∈ {10^8, 10^10} [Dabbagh et al., "On the evolution of flow topology in turbulent Rayleigh-Bénard convection," Phys. Fluids 28, 115105 (2016)]. First, DNS at Ra = 10^8 is used to assess the performance of eddy-viscosity models such as QR, Wall-Adapting Local Eddy-viscosity (WALE), and the recent S3PQR-models proposed by Trias et al. ["Building proper invariants for eddy-viscosity subgrid-scale models," Phys. Fluids 27, 065103 (2015)]. The outcomes imply that the eddy-viscosity modeling smoothes the coarse-grained viscous straining and retrieves fairly well the effect of the kinetic unfiltered scales in order to reproduce the coherent large scales. However, these models fail to approach the exact evolution of the SGS heat flux and are incapable to reproduce well the further dominant rotational enstrophy pertaining to the buoyant production. Afterwards, the key ingredients of eddy-viscosity, νt, and eddy-diffusivity, κt, are calculated a priori and revealed positive prevalent values to maintain a turbulent wind essentially driven by the mean buoyant force at the sidewalls. The topological analysis suggests that the effective turbulent diffusion paradigm and the hypothesis of a constant turbulent Prandtl number are only applicable in the large-scale strain-dominated areas in the bulk. It is shown that the bulk-dominated rotational structures of vortex-stretching (and its synchronous viscous dissipative structures) hold

  20. Content analyses of a priori qualitative phantom limb pain descriptions and emerging categories in mid-southerners with limb loss.

    Science.gov (United States)

    Evans, Cecile B

    2014-01-01

    The purposes of this descriptive study were (a) to identify the relative frequencies of a priori categories of phantom limb pain (PLP) quality descriptors reported by Mid-Southerners with limb loss, (b) to analyze their descriptions for emerging categories of PLP, and (c) to identify the relative frequencies of the emerging categories. This cross-sectional descriptive verbal survey assessed PLP descriptors. A content analysis determined the relative frequencies of a priori PLP descriptors as well as the emerging categories that were identified. The most common a priori PLP quality descriptors reported by 52 amputees with PLP were intermittent, tingling/needles/numb, sharp, cramping, burning, and stabbing. The most common emerging categories reported were pain compared to illness/injury, electrical cyclical, and manipulated/positional. The detailed descriptions of PLP provide insight into the vivid experiences of PLP. Rehabilitation nurses can use this information in PLP assessment, patient teaching, and counseling. © 2013 Association of Rehabilitation Nurses.

  1. Monte Carlo full-waveform inversion of crosshole GPR data using multiple-point geostatistical a priori information

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2012-01-01

    We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear… inverse problems, such as full-waveform inversion. Sequential Gibbs sampling is a method that allows efficient sampling of a priori probability densities described by geostatistical algorithms based on either two-point (e.g., Gaussian) or multiple-point statistics. We outline the theoretical framework… Based on a posteriori realizations, complicated statistical questions can be answered, such as the probability of connectivity across a layer. (3) Complex a priori information can be included through geostatistical algorithms. These benefits, however, require more computing resources than traditional…
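The sampling strategy the abstract describes can be illustrated with a minimal random-walk Metropolis loop. Everything below is a hypothetical stand-in: a toy sine forward model replaces the full-waveform physics, and a simple Gaussian prior replaces the geostatistical a priori density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "full-waveform" forward model and noisy observed data (stand-ins)
def forward(m):
    return np.sin(m * np.linspace(0, 1, 20))

d_obs = forward(1.5) + rng.normal(0, 0.05, 20)

def log_likelihood(m):
    r = forward(m) - d_obs
    return -0.5 * np.sum(r**2) / 0.05**2

def log_prior(m):
    # Gaussian a priori density, mean 1.0, std 1.0 (hypothetical)
    return -0.5 * (m - 1.0)**2

# Random-walk Metropolis sampling of the a posteriori density
m, chain = 1.0, []
for _ in range(5000):
    m_prop = m + rng.normal(0, 0.1)
    log_a = (log_likelihood(m_prop) + log_prior(m_prop)
             - log_likelihood(m) - log_prior(m))
    if np.log(rng.uniform()) < log_a:
        m = m_prop
    chain.append(m)

# Statistics over a posteriori realizations (after burn-in)
posterior_mean = np.mean(chain[1000:])
```

In the paper's setting the proposal step would instead be a sequential Gibbs resimulation of part of the model under the geostatistical prior, but the accept/reject logic is the same.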

  2. Considerations about expected a posteriori estimation in adaptive testing: adaptive a priori, adaptive correction for bias, and adaptive integration interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    2009-01-01

    In a computerized adaptive test, we would like to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Unfortunately, decreasing the number of items is accompanied by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. The authors suggest that it is possible to reduce the bias, and even the standard error of the estimate, by applying to each provisional estimation one or a combination of the following strategies: adaptive correction for bias proposed by Bock and Mislevy (1982), adaptive a priori estimate, and adaptive integration interval.
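The expected a posteriori (EAP) estimator under discussion can be sketched as quadrature over a normal a priori distribution. The 2PL items and responses below are invented for illustration; the adaptive correction strategies themselves are not implemented here.

```python
import numpy as np

# 2PL item response model: P(correct | theta)
def p_correct(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical administered items (discrimination a, difficulty b) and responses
items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.7), (1.0, 1.2)]
responses = [1, 1, 0, 1]

# EAP estimate: posterior mean over a quadrature grid with an N(0, 1) prior
theta = np.linspace(-4, 4, 81)
prior = np.exp(-0.5 * theta**2)
like = np.ones_like(theta)
for (a, b), u in zip(items, responses):
    p = p_correct(theta, a, b)
    like *= p if u else (1 - p)

post = prior * like
eap = np.sum(theta * post) / np.sum(post)
```

Shrinking the estimate toward the prior mean is exactly where the bias discussed above comes from when the true proficiency is far from the a priori estimate; the adaptive strategies adjust the prior, the bias correction, or the integration interval between items.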

  3. Statistical parameters as a means to a priori assess the accuracy of solar forecasting models

    International Nuclear Information System (INIS)

    Voyant, Cyril; Soubdhan, Ted; Lauret, Philippe; David, Mathieu; Muselli, Marc

    2015-01-01

    In this paper we propose to determine and test a set of 20 statistical parameters in order to estimate the short-term predictability of global horizontal irradiation time series, and thereby to propose a new prospective tool indicating the expected error regardless of the forecasting method used. The mean absolute log return, a tool commonly used in econometrics but never before in global radiation prediction, proves to be a very good estimator. Some examples of the use of this tool are presented, showing the interest of this statistical parameter in concrete cases of prediction or optimization. This study offers guidance to engineers and researchers on the installation or management of solar plants and could help improve the renewable share of the energy mix. - Highlights: • Use of a statistical parameter never before used for global radiation forecasting. • An a priori index allowing easy and quick optimization of a clear-sky model. • A new methodology quantifying the prediction error regardless of the predictor used. • The prediction error depends more on the location and time-series type than on the machine learning method used.
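The headline parameter is simple to compute. As a sketch, the mean absolute log return of a short, entirely hypothetical daily irradiation series:

```python
import numpy as np

# Hypothetical daily global horizontal irradiation series (Wh/m^2)
ghi = np.array([5200.0, 4800.0, 5100.0, 3900.0, 4500.0, 5000.0, 4950.0])

# Mean absolute log return: mean(|ln(x_t / x_{t-1})|), a volatility-style
# predictability index borrowed from econometrics
log_returns = np.log(ghi[1:] / ghi[:-1])
malr = np.mean(np.abs(log_returns))
```

A larger value indicates a more volatile, less predictable series, i.e. a larger expected forecast error whatever predictor is used.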

  4. [Security of hospital infusion practices: From an a priori risk analysis to an improvement action plan].

    Science.gov (United States)

    Pignard, J; Cosserant, S; Traore, O; Souweine, B; Sautou, V

    2016-03-01

    Infusion in care units, and all the more in intensive care units, is a complex process which can be the source of many risks for the patient. As part of an institutional approach to improving the quality and safety of patient care, a risk mapping of infusion practices was performed. The analysis focused on intravenous infusion situations in adults; the a priori risk assessment methodology was applied and a multidisciplinary working group established. Forty-three risks were identified in the infusion process (prescription, preparation and administration). The assessment of these risks and of the existing means of control showed that 48% of them would have a highly critical impact on patient safety. Recommendations were developed for the 20 risks considered most critical, to limit their occurrence and severity and to improve their level of control. An institutional action plan was developed and validated by the Drug and Sterile Medical Devices Commission. This mapping allowed an exhaustive inventory of the potential risks associated with infusion. Following this work, multidisciplinary groups were set up to work on different themes, and regular quarterly meetings were established to follow the progress of the various projects. Risk mapping will also be performed in paediatric and oncology units, where the risks associated with the handling of toxic products are omnipresent. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
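The a priori risk-assessment step described above (rating each risk's criticality and keeping the most critical ones for the action plan) can be sketched as FMEA-style scoring. The risk names and the 1-4 occurrence/severity scales below are hypothetical, not taken from the study:

```python
# Minimal a priori risk-criticality scoring (FMEA-style).
# Each risk gets (occurrence, severity) ratings on a hypothetical 1-4 scale;
# criticality = occurrence * severity.
risks = {
    "wrong infusion rate":  (3, 4),
    "drug incompatibility": (2, 4),
    "mislabeled syringe":   (2, 3),
    "air in line":          (1, 4),
}

crit = {name: o * s for name, (o, s) in risks.items()}

# Keep the highly critical risks (score >= 8) for the action plan,
# most critical first
high = sorted((name for name, c in crit.items() if c >= 8),
              key=lambda n: -crit[n])
```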

  5. Factor analysis with a priori knowledge - application in dynamic cardiac SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Sitek, A.; Di Bella, E.V.R.; Gullberg, G.T. [Medical Imaging Research Laboratory, Department of Radiology, University of Utah, CAMT, 729 Arapeen Drive, Salt Lake City, UT 84108-1218 (United States)

    2000-09-01

    Two factor analysis of dynamic structures (FADS) methods for the extraction of time-activity curves (TACs) from cardiac dynamic SPECT data sequences were investigated. One method was based on a least squares (LS) approach which was subject to positivity constraints. The other method was the well known apex-seeking (AS) method. A post-processing step utilizing a priori information was employed to correct for the non-uniqueness of the FADS solution. These methods were used to extract 99mTc-teboroxime TACs from computer simulations and from experimental canine and patient studies. In computer simulations, the LS and AS methods, which are completely different algorithms, yielded very similar and accurate results after application of the correction for non-uniqueness. FADS-obtained blood curves correlated well with curves derived from region of interest (ROI) measurements in the experimental studies. The results indicate that the factor analysis techniques can be used for semi-automatic estimation of activity curves derived from cardiac dynamic SPECT images, and that they can be used for separation of physiologically different regions in dynamic cardiac SPECT studies. (author)
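The positivity-constrained factorization at the heart of FADS can be sketched with generic multiplicative nonnegative-matrix-factorization updates on synthetic time-activity curves. This is a stand-in, not the authors' LS or apex-seeking algorithm, and the a priori non-uniqueness correction step is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic dynamic study: two ground-truth time-activity curves (TACs)
t = np.linspace(0, 1, 30)
F_true = np.vstack([np.exp(-3 * t),        # blood-pool-like washout
                    1 - np.exp(-5 * t)])   # tissue-like uptake
C_true = rng.uniform(0.2, 1.0, (50, 2))    # nonnegative pixel weights
D = C_true @ F_true                        # noise-free dynamic data (pixels x time)

# Positivity-constrained factorization D ~ C @ F via multiplicative updates
C = rng.uniform(0.1, 1.0, (50, 2))
F = rng.uniform(0.1, 1.0, (2, 30))
for _ in range(500):
    F *= (C.T @ D) / (C.T @ C @ F + 1e-12)
    C *= (D @ F.T) / (C @ F @ F.T + 1e-12)

err = np.linalg.norm(D - C @ F) / np.linalg.norm(D)
```

As the abstract notes, such a factorization is non-unique (factors can be mixed while keeping C @ F fixed), which is why a post-processing step using a priori information is needed to recover physiologically meaningful curves.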

  6. Identification of parametric models with a priori knowledge of process properties

    Directory of Open Access Journals (Sweden)

    Janiszowski Krzysztof B.

    2016-12-01

    Full Text Available An approach to estimation of a parametric discrete-time model of a process in the case of some a priori knowledge of the investigated process properties is presented. The knowledge of plant properties is introduced in the form of linear bounds, which can be determined for the coefficient vector of the parametric model studied. The approach yields a special biased estimation of model coefficients that preserves the demanded properties. A formula for estimation of the model coefficients is derived and combined with a recursive scheme determined for minimization of the sum of absolute model errors. The estimation problem for a model with known static gains of inputs is discussed and the proper formulas are derived. This approach can overcome the non-identifiability problem which has been observed during estimation based on measurements recorded in industrial closed-loop control systems. The application of the proposed approach to estimation of a model for an industrial plant (a water injector into the steam flow in a power plant) is presented and discussed.
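The known-static-gain case can be sketched for a first-order plant: with the gain K known a priori, the equality constraint b = K(1 − a) eliminates one parameter, so the fit reduces to a single unknown. The plant, noise level, and signals below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# True first-order discrete-time plant: y[k] = a*y[k-1] + b*u[k-1]
a_true, b_true = 0.8, 0.4
K = b_true / (1 - a_true)          # static gain, assumed known a priori (K = 2)

u = rng.normal(0, 1, 200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.normal()

# Unconstrained LS would fit (a, b) freely; enforcing b = K*(1 - a)
# collapses the model to one parameter:
#   y[k] - K*u[k-1] = a * (y[k-1] - K*u[k-1])
lhs = y[1:] - K * u[:-1]
rhs = y[:-1] - K * u[:-1]
a_hat = np.dot(rhs, lhs) / np.dot(rhs, rhs)
b_hat = K * (1 - a_hat)
```

The constrained estimate is biased in general but, by construction, always reproduces the demanded static gain, which is the point of the approach.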

  7. On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori

    International Nuclear Information System (INIS)

    Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer

    2006-01-01

    Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in bias in predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for Nevada and the Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probability. The updating is conducted through maximum likelihood inverse modeling by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties
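The posterior model probabilities used as averaging weights can be sketched from information-criterion values with the usual exp(−ΔKIC/2) weighting. The KIC values, equal priors, and head predictions below are invented for illustration:

```python
import numpy as np

# Hypothetical KIC values for five alternative recharge models
# (lower KIC = better support from the calibration data)
kic = np.array([412.0, 415.5, 409.8, 420.3, 411.2])
prior = np.full(5, 0.2)      # equal prior probabilities from expert elicitation

# Posterior model probability ~ prior * exp(-0.5 * (KIC - KIC_min))
delta = kic - kic.min()
w = prior * np.exp(-0.5 * delta)
posterior = w / w.sum()

# Model-averaged head prediction: weight each model's prediction
heads = np.array([101.2, 100.8, 101.6, 99.9, 101.0])   # hypothetical
h_mean = np.sum(posterior * heads)
h_var = np.sum(posterior * (heads - h_mean)**2)         # between-model part
```

The full MLBMA variance also adds each model's own (within-model) predictive variance; only the between-model term is shown here.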

  8. Reconstruction of Consistent 3d CAD Models from Point Cloud Data Using a Priori CAD Models

    Science.gov (United States)

    Bey, A.; Chaine, R.; Marc, R.; Thibault, G.; Akkouche, S.

    2011-09-01

    We address the reconstruction of 3D CAD models from point cloud data acquired in industrial environments, using a pre-existing 3D model as an initial estimate of the scene to be processed. Indeed, this prior knowledge can be used to drive the reconstruction so as to generate an accurate 3D model matching the point cloud. In particular, we focus on the cylindrical parts of the 3D models. We propose to state the problem in a probabilistic framework: we search for the 3D model which maximizes some probability taking several constraints into account, such as the relevancy with respect to the point cloud and the a priori 3D model, and the consistency of the reconstructed model. The resulting optimization problem can then be handled using a stochastic exploration of the solution space, based on the random insertion of elements in the configuration under construction, coupled with a greedy management of conflicts which efficiently improves the configuration at each step. We show that this approach provides reliable reconstructed 3D models by presenting results on industrial data sets.

  9. Geography of resonances and Arnold diffusion in a priori unstable Hamiltonian systems

    International Nuclear Information System (INIS)

    Delshams, Amadeu; Huguet, Gemma

    2009-01-01

    In this paper we consider the case of a general C^{r+2} perturbation, for r large enough, of an a priori unstable Hamiltonian system of 2 + 1/2 degrees of freedom, and we provide explicit conditions on it, which turn out to be C^2-generic and verifiable in concrete examples, that guarantee the existence of Arnold diffusion. This is a generalization of the result in Delshams et al (2006 Mem. Am. Math. Soc.), where the case of a perturbation with a finite number of harmonics in the angular variables was considered. The method of proof is based on a careful analysis of the geography of resonances created by a generic perturbation and contains a deep quantitative description of the invariant objects generated by the resonances therein. The scattering map is used as an essential tool to construct transition chains of objects of different topology. The combination of quantitative expressions for both the geography of resonances and the scattering map provides, in a natural way, explicit computable conditions for instability

  10. Psychoanalysis and philosophy: the problem of the a priori of research in Heidegger and Winnicott

    Directory of Open Access Journals (Sweden)

    Julieta Bareiro

    2011-12-01

    Full Text Available This paper presents the relationship between psychoanalytic knowledge and philosophy through a dialogue between Winnicott and Heidegger. Two argumentative reconstructions are carried out. First, the way in which Heidegger determines the a priori character of his research is set out. Second, Winnicott's work is surveyed in order to determine how he conceives the link between psychoanalysis and philosophy and, crucially, to establish whether there is room in his thinking for an a priori that would allow a link with philosophy to be drawn.

  11. A priori-defined diet quality indices, biomarkers and risk for type 2 diabetes in five ethnic groups: the Multiethnic Cohort.

    Science.gov (United States)

    Jacobs, Simone; Boushey, Carol J; Franke, Adrian A; Shvetsov, Yurii B; Monroe, Kristine R; Haiman, Christopher A; Kolonel, Laurence N; Le Marchand, Loic; Maskarinec, Gertraud

    2017-08-01

    Dietary indices have been related to risk for type 2 diabetes (T2D) predominantly in white populations. The present study evaluated this association in the ethnically diverse Multiethnic Cohort and examined four diet quality indices in relation to T2D risk, homoeostatic model assessment-estimated insulin resistance (HOMA-IR) and biomarkers of dyslipidaemia, inflammation and adipokines. The T2D analysis included 166 550 white, African American, Native Hawaiian, Japanese American and Latino participants (9200 incident T2D cases). Dietary intake was assessed at baseline using a quantitative FFQ and T2D status was based on three self-reports and confirmed by administrative data. Biomarkers were assessed about 10 years later in a biomarker subcohort (n = 10 060). Sex- and ethnicity-specific hazard ratios were calculated for the Healthy Eating Index-2010 (HEI-2010), the alternative HEI-2010 (AHEI-2010), the alternate Mediterranean diet score (aMED) and the Dietary Approaches to Stop Hypertension (DASH). Multivariable-adjusted means of biomarkers were compared across dietary index tertiles in the biomarker subcohort. The AHEI-2010, aMED (in men only) and DASH scores were related to a 10-20 % lower T2D risk, with the strongest associations in whites and the direction of the relationships mostly consistent across ethnic groups. Higher scores on the four indices were related to lower HOMA-IR, TAG and C-reactive protein concentrations, not related to leptin, and the DASH score was directly associated with adiponectin. The AHEI-2010 and DASH were directly related to HDL-cholesterol in women. Potential underlying biological mechanisms linking diet quality and T2D risk are an improved lipid profile and reduced systemic inflammation and, with regards to DASH alone, an improved adiponectin profile.

  12. Teleology and Defining Sex.

    Science.gov (United States)

    Gamble, Nathan K; Pruski, Michal

    2018-07-01

    Disorders of sexual differentiation lead to what is often referred to as an intersex state. This state has medical, as well as some legal, recognition. Nevertheless, the question remains whether intersex persons occupy a state in between maleness and femaleness or whether they are truly men or women. To answer this question, another important conundrum needs to be first solved: what defines sex? The answer seems rather simple to most people, yet when morphology does not coincide with haplotypes, and genetics might not correlate with physiology the issue becomes more complex. This paper tackles both issues by establishing where the essence of sex is located and by superimposing that framework onto the issue of the intersex. This is achieved through giving due consideration to the biology of sexual development, as well as through the use of a teleological framework of the meaning of sex. Using a range of examples, the paper establishes that sex cannot be pinpointed to one biological variable but is rather determined by how the totality of one's biology is oriented towards biological reproduction. A brief consideration is also given to the way this situation could be comprehended from a Christian understanding of sex and suffering.

  13. Growth references

    NARCIS (Netherlands)

    Buuren, S. van

    2007-01-01

    A growth reference describes the variation of an anthropometric measurement within a group of individuals. A reference is a tool for grouping and analyzing data and provides a common basis for comparing populations. A well-known type of reference is the age-conditional growth diagram. The
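A minimal example of using a growth reference for analysis: converting a measurement to an age-conditional z-score. The reference mean and SD below are hypothetical values for height-for-age at a single age point, not taken from any published reference:

```python
# Age-conditional z-score against a growth reference.
# Hypothetical reference values for height-for-age at age 3 years:
ref_mean_cm, ref_sd_cm = 96.1, 3.8

child_height_cm = 101.0
z = (child_height_cm - ref_mean_cm) / ref_sd_cm   # how unusual is this child?
```

Real references (e.g. WHO growth standards) model the age-dependence of the distribution, typically with the LMS method, rather than a single mean/SD pair.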

  14. Musical Probabilities, Abductive Reasoning, and Brain Mechanisms: Extended Perspective of "A Priori" Listening to Music within the Creative Cognition Approach

    Science.gov (United States)

    Schmidt, Sebastian; Troge, Thomas A.; Lorrain, Denis

    2013-01-01

    A theory of listening to music is proposed. It suggests that, for listeners, the process of prediction is the starting point to experiencing music. This implies that perception of music starts through both a predisposed and an experience-based extrapolation into the future (this is labeled "a priori" listening). Indications for this…

  15. Lost in space: Onboard star identification using CCD star tracker data without an a priori attitude

    Science.gov (United States)

    Ketchum, Eleanor A.; Tolson, Robert H.

    1993-01-01

    There are many algorithms in use today which determine spacecraft attitude by identifying stars in the field of view of a star tracker. Some methods, which date from the early 1960s, compare the angular separation between observed stars with a small catalog. In the last 10 years, several methods have been developed which speed up the process and reduce the amount of memory needed, a key element of onboard attitude determination. However, each of these methods requires some a priori knowledge of the spacecraft attitude. Although the Sun and magnetic field generally provide the necessary coarse attitude information, there are occasions when a spacecraft could get lost when it is not prudent to wait for sunlight. Also, the possibility of efficient attitude determination using only the highly accurate CCD star tracker could lead to fully autonomous spacecraft attitude determination. The need for redundant coarse sensors could thus be eliminated at substantial cost reduction. Some groups have extended their algorithms to implement a computation-intense full-sky scan. Some require large databases. Both storage and speed are concerns for autonomous onboard systems. Neural network technology is even being explored by some as a possible solution, but because of the limited number of patterns that can be stored and large overhead, nothing concrete has resulted from these efforts. This paper presents an algorithm which, by discretizing the sky and filtering by the visual magnitude of the brightest observed star, speeds up the lost-in-space star identification process while reducing the amount of necessary onboard computer storage compared to existing techniques.
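The classic angular-separation matching that the newer methods accelerate can be sketched as a brute-force catalog search with a magnitude pre-filter. The four-star mini-catalog and the 3.5-magnitude cutoff are invented; the paper's sky-discretization speedup is not reproduced here:

```python
import math

# Hypothetical mini-catalog: name -> (right ascension, declination, magnitude),
# angles in degrees
catalog = {"A": (10.0, 20.0, 2.1), "B": (14.0, 22.0, 3.0),
           "C": (40.0, -5.0, 1.5), "D": (42.0, -3.0, 4.2)}

def ang_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    c = (math.sin(d1) * math.sin(d2)
         + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

# Observed pair separation as measured by the tracker (pretend we saw A and B)
observed_sep = ang_sep(10.0, 20.0, 14.0, 22.0)

# Magnitude filtering (mag < 3.5) prunes candidates before the geometry test
names = [n for n, (_, _, m) in catalog.items() if m < 3.5]

# Brute-force search for catalog pairs matching the observed separation
matches = []
for i, n1 in enumerate(names):
    for n2 in names[i + 1:]:
        s = ang_sep(*catalog[n1][:2], *catalog[n2][:2])
        if abs(s - observed_sep) < 0.01:
            matches.append((n1, n2))
```

With a real catalog of thousands of stars, the pair search is what the discretized-sky index and brightness filtering are designed to avoid.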

  16. An algorithm for finding biologically significant features in microarray data based on a priori manifold learning.

    Directory of Open Access Journals (Sweden)

    Zena M Hira

    Full Text Available Microarray databases are a large source of genetic data, which, upon proper analysis, could enhance our understanding of biology and medicine. Many microarray experiments have been designed to investigate the genetic mechanisms of cancer, and analytical approaches have been applied in order to classify different types of cancer or distinguish between cancerous and non-cancerous tissue. However, microarrays are high-dimensional datasets with high levels of noise, and this causes problems when using machine learning methods. A popular approach to this problem is to search for a set of features that will simplify the structure and to some degree remove the noise from the data. The most widely used approach to feature extraction is principal component analysis (PCA), which assumes a multivariate Gaussian model of the data. More recently, non-linear methods have been investigated. Among these, manifold learning algorithms, for example Isomap, aim to project the data from a higher-dimensional space onto a lower-dimensional one. We have proposed a priori manifold learning for finding a manifold in which a representative set of microarray data is fused with relevant data taken from the KEGG pathway database. Once the manifold has been constructed, the raw microarray data is projected onto it, and clustering and classification can take place. In contrast to earlier fusion-based methods, the prior knowledge from the KEGG database is not used in, and does not bias, the classification process; it merely acts as an aid to find the best space in which to search the data. In our experiments we have found that using our new manifold method gives better classification results than using either PCA or conventional Isomap.
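The PCA baseline mentioned above can be sketched with a plain SVD on a toy two-class "expression matrix". The data are synthetic; this illustrates the baseline, not the authors' a priori manifold method:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy expression matrix: 40 samples x 100 genes, two classes separated
# along a single latent direction plus i.i.d. noise
latent = np.repeat([0.0, 3.0], 20)
X = np.outer(latent, rng.normal(size=100)) + rng.normal(0, 1.0, (40, 100))

# PCA by SVD of the centered matrix: project onto the top two components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # 2-D embedding of the samples

# The first principal component should separate the two classes
pc1 = scores[:, 0]
separation = abs(pc1[:20].mean() - pc1[20:].mean()) / pc1.std()
```

In the manifold-learning alternatives, the linear projection Vt is replaced by a nonlinear embedding (Isomap's geodesic-distance MDS), and in the a priori variant the embedding space is additionally shaped by the KEGG pathway data.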

  17. [Reference citation].

    Science.gov (United States)

    Brkić, Silvija

    2013-01-01

    Scientific and professional papers represent the information basis for scientific research and professional work. References important for the paper should be cited within the text, and listed at the end of the paper. This paper deals with different styles of reference citation. Special emphasis was placed on the Vancouver Style for reference citation in biomedical journals established by the International Committee of Medical Journal Editors. It includes original samples for citing various types of articles, both printed and electronic, as well as recommendations related to reference citation in accordance with the methodology and ethics of scientific research and guidelines for preparing manuscripts for publication.

  18. Knowledge Management and Reference Services

    Science.gov (United States)

    Gandhi, Smiti

    2004-01-01

    Many corporations are embracing knowledge management (KM) to capture the intellectual capital of their employees. This article focuses on KM applications for reference work in libraries. It defines key concepts of KM, establishes a need for KM for reference services, and reviews various KM initiatives for reference services.

  19. Reference Assessment

    Science.gov (United States)

    Bivens-Tatum, Wayne

    2006-01-01

    This article presents interesting articles that explore several different areas of reference assessment, including practical case studies and theoretical articles that address a range of issues such as librarian behavior, patron satisfaction, virtual reference, or evaluation design. They include: (1) "Evaluating the Quality of a Chat Service"…

  20. Globally Stable Adaptive Backstepping Neural Network Control for Uncertain Strict-Feedback Systems With Tracking Accuracy Known a Priori.

    Science.gov (United States)

    Chen, Weisheng; Ge, Shuzhi Sam; Wu, Jian; Gong, Maoguo

    2015-09-01

    This paper addresses the problem of globally stable direct adaptive backstepping neural network (NN) tracking control design for a class of uncertain strict-feedback systems under the assumption that the accuracy of the ultimate tracking error is given a priori. In contrast to the classical adaptive backstepping NN control schemes, this paper analyzes the convergence of the tracking error using Barbalat's Lemma via some nonnegative functions rather than the positive-definite Lyapunov functions. Thus, the accuracy of the ultimate tracking error can be determined and adjusted accurately a priori, and the closed-loop system is guaranteed to be globally uniformly ultimately bounded. The main technical novelty is to construct three new nth-order continuously differentiable functions, which are used to design the control law, the virtual control variables, and the adaptive laws. Finally, two simulation examples are given to illustrate the effectiveness and advantages of the proposed control method.

  1. A program for the a priori evaluation of detection limits in instrumental neutron activation analysis using a SLOWPOKE II reactor

    International Nuclear Information System (INIS)

    Galinier, J.L.; Zikovsky, L.

    1982-01-01

    A program that permits the a priori calculation of detection limits in monoelemental matrices, adapted to instrumental neutron activation analysis using a SLOWPOKE II reactor, is described. A simplified model of the gamma spectra is proposed. Products of (n,p) and (n,α) reactions induced by the fast components of the neutron flux that accompanies the thermal flux at the level of internal irradiation sites in the reactor have been included in the list of interfering radionuclides. The program calculates in a systematic way the detection limits of 66 elements in an equal number of matrices using 153 intermediary radionuclides. Experimental checks carried out with silicon (for short lifetimes) and aluminum and magnesium (for intermediate lifetimes) show satisfactory agreement with the calculations. These results show in particular the importance of the contribution of the (n,p) and (n,α) reactions in the a priori evaluation of detection limits with a SLOWPOKE type reactor [fr]
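A priori detection limits of this kind are commonly computed with Currie's formulas from the expected background counts. The sketch below uses the standard k = 1.645 (95% confidence) factors for paired observations and omits the conversion from counts to element concentration via irradiation and counting sensitivities:

```python
import math

def currie_limits(background_counts, k=1.645):
    """A priori (Currie) critical level L_C and detection limit L_D, in counts,
    for a well-known background with paired blank subtraction."""
    l_c = k * math.sqrt(2 * background_counts)   # decision threshold
    l_d = k**2 + 2 * l_c                          # a priori detection limit
    return l_c, l_d

# Example: 100 expected background counts under the photopeak
l_c, l_d = currie_limits(100.0)
```

A program like the one described would evaluate such limits per element, folding in the interfering (n,p) and (n,α) activities as additional background.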

  2. Tomographic inversion of time-domain resistivity and chargeability data for the investigation of landfills using a priori information.

    Science.gov (United States)

    De Donno, Giorgio; Cardarelli, Ettore

    2017-01-01

    In this paper, we present a new code for the modelling and inversion of resistivity and chargeability data using a priori information to improve the accuracy of the reconstructed model for landfills. When a priori information is available in the study area, we can incorporate it by means of inequality constraints on the whole model or on a single layer, or by assigning weighting factors to enhance anomalies elongated in the horizontal or vertical directions. However, when we have to face a multilayered scenario with numerous resistive-to-conductive transitions (the case of controlled landfills), the effective thickness of the layers can be biased. The presented code includes a model-tuning scheme, which is applied after the inversion of field data, where the inversion of the synthetic data is performed based on an initial guess, and the absolute difference between the field and synthetic inverted models is minimized. The reliability of the proposed approach has been supported in two real-world examples; we were able to identify an unauthorized landfill and to reconstruct the geometrical and physical layout of an old waste dump. The combined analysis of the resistivity and (normalised) chargeability models helps us remove ambiguity due to the presence of the waste mass. Nevertheless, the presence of certain layers can remain hidden without using a priori information, as demonstrated by a comparison of the constrained inversion with a standard inversion. The robustness of the above-cited method (using a priori information in combination with model tuning) has been validated with the cross-section from the construction plans, where the reconstructed model is in agreement with the original design. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. A Method for A Priori Implementation Effort Estimation for Hardware Design

    DEFF Research Database (Denmark)

    Abildgren, Rasmus; Diguet, Jean-Philippe; Gogniat, Guy

    2008-01-01

    This paper presents a metric-based approach for estimating the hardware implementation effort (in terms of time) for an application in relation to the number of independent paths of its algorithms. We define a metric which exploits the relation between the number of independent paths in an algori… facilitating designers' and managers' needs for estimating the time-to-market schedule.
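Counting independent paths is closely related to McCabe's cyclomatic complexity (number of decision points + 1). A rough Python analogue of such a path-count metric, not the paper's hardware-oriented definition, might look like:

```python
import ast

def independent_paths(source):
    """Crude path-count metric for a Python snippet:
    cyclomatic complexity ~ number of decision nodes + 1."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, (ast.If, ast.For, ast.While, ast.BoolOp))
                    for node in ast.walk(tree))
    return decisions + 1

code = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(3):
        if i == x:
            return "small"
    return "other"
"""

paths = independent_paths(code)   # 3 decision nodes (if, for, if) + 1
```

An effort-estimation model of the kind described would then regress implementation time against such a path count across past designs.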

  4. Recent references

    International Nuclear Information System (INIS)

    Ramavataram, S.

    1991-01-01

    In support of a continuing program of systematic evaluation of nuclear structure data, the National Nuclear Data Center maintains a complete computer file of references to the nuclear physics literature. Each reference is tagged by a keyword string, which indicates the kinds of data contained in the article. This master file of Nuclear Structure References (NSR) contains complete keyword indexes to literature published since 1969, with partial indexing of older references. Any reader who finds errors in the keyword descriptions is urged to report them to the National Nuclear Data Center so that the master NSR file can be corrected. In 1966, the first collection of Recent References was published as a separate issue of Nuclear Data Sheets. Every four months since 1970, a similar indexed bibliography to new nuclear experiments has been prepared from additions to the NSR file and published. Beginning in 1978, Recent References was cumulated annually, with the third issue completely superseding the two issues previously published during a given year. Due to publication policy changes, cumulation of Recent References was discontinued in 1986. The volume and issue number of all the cumulative issues published to date are given. NNDC will continue to respond to individual requests for special bibliographies on nuclear physics topics, in addition to those easily obtained from Recent References. If the required information is available from the keyword string, a reference list can be prepared automatically from the computer files. This service can be provided on request, in exchange for the timely communication of new nuclear physics results (e.g., preprints). A current copy of the NSR file may also be obtained in a standard format on magnetic tape from NNDC. Requests for special searches of the NSR file may also be directed to the National Nuclear Data Center

  5. Defining Quantum Control Flow

    OpenAIRE

    Ying, Mingsheng; Yu, Nengkun; Feng, Yuan

    2012-01-01

A remarkable difference between quantum and classical programs is that the control flow of the former can be either classical or quantum. One of the key issues in the theory of quantum programming languages is defining and understanding quantum control flow. A functional language with quantum control flow was defined by Altenkirch and Grattage [Proc. LICS'05, pp. 249-258]. This paper extends their work, and we introduce a general quantum control structure by defining three new quantu...

  6. Can play be defined?

    DEFF Research Database (Denmark)

    Eichberg, Henning

    2015-01-01

Can play be defined? There is reason to raise critical questions about the established academic demand that a phenomenon – also in humanist studies – should first of all be defined, i.e. de-lineated and limited by neat lines to a "little box" that can be handled. The following chapter develops....... Human beings can very well understand play – or whatever phenomenon in human life – without defining it....

  7. Defining Overweight and Obesity

    Science.gov (United States)

... weight for a given height is described as overweight or obese. Body Mass Index, or BMI, is ...
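BMI, the screening measure this record refers to, is weight in kilograms divided by the square of height in metres. A minimal sketch using the standard adult cut-points (the helper functions are illustrative, not taken from the CDC page):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index = weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

def category(bmi_value: float) -> str:
    """Standard adult BMI categories (CDC/WHO cut-points)."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25.0:
        return "healthy weight"
    if bmi_value < 30.0:
        return "overweight"
    return "obese"
```

For example, a 70 kg adult 1.75 m tall has a BMI of about 22.9, in the healthy-weight range.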

  8. Drinking Levels Defined

    Science.gov (United States)

... Definition of Drinking at Low Risk for Developing Alcohol Use Disorder (AUD): For women, low-risk drinking is defined ...

  9. Defining Documentary Film

    DEFF Research Database (Denmark)

    Juel, Henrik

    2006-01-01

A discussion of various attempts at defining documentary film regarding form, content, truth, style, genre or reception - and a proposal of a positive list of essential, but non-exclusive characteristics of documentary film...

  10. Almost half of the Danish general practitioners have negative a priori attitudes towards a mandatory accreditation programme

    DEFF Research Database (Denmark)

    Waldorff, Frans Boch; Nicolaisdottir, Dagny Ros; Kousgaard, Marius Brostrøm

    2016-01-01

    INTRODUCTION: The objective of this study was to analyse Danish general practitioners' (GPs) a priori attitudes and expectations towards a nationwide mandatory accreditation programme. METHODS: This study is based on a nationwide electronic survey comprising all Danish GPs (n = 3,403). RESULTS...... accreditation. FUNDING: The three Research Units for General Practice in Odense, Aarhus and Copenhagen initiated and funded this study. TRIAL REGISTRATION: The survey was recommended by the Danish Multipractice Committee (MPU 02-2015) and evaluated by the Danish Data Agency (2015-41-3684)....

  11. Parameter transferability within homogeneous regions and comparisons with predictions from a priori parameters in the eastern United States

    Science.gov (United States)

    Chouaib, Wafa; Alila, Younes; Caldwell, Peter V.

    2018-05-01

The need for predictions of flow time-series persists at ungauged catchments, motivating the research goals of our study. By means of the Sacramento model, this paper explores the use of parameter transfer within homogeneous regions of similar climate and flow characteristics and makes comparisons with predictions from a priori parameters. We assessed the performance using the Nash-Sutcliffe efficiency (NS), bias, mean monthly hydrograph and flow duration curve (FDC). The study was conducted on a large dataset of 73 catchments within the eastern US. Two approaches to parameter transferability were developed and evaluated: (i) parameter transfer within homogeneous regions, using one donor catchment specific to each region, and (ii) parameter transfer disregarding the geographical limits of homogeneous regions, where one donor catchment was common to all regions. Comparing the two parameter transfers made it possible to assess the gain in performance from parameter regionalization and its respective constraints and limitations. The parameter transfer within homogeneous regions outperformed the a priori parameters and led to a decrease in bias and an increase in efficiency, reaching a median NS of 0.77 and a NS of 0.85 at individual catchments. The use of the FDC revealed the effect of bias on the inaccuracy of predictions from parameter transfer. In one specific region, of mountainous and forested catchments, the prediction accuracy of the parameter transfer was less satisfactory and equivalent to the a priori parameters. In this region, the parameter transfer from the outsider catchment provided the best performance; less biased, with smaller uncertainty in the medium flow percentiles (40%-60%). The large disparity of energy conditions explained the lack of performance from parameter transfer in this region. Besides, the subsurface stormflow is predominant and there is a likelihood of lateral preferential flow, which according to its specific properties further explained the reduced
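The Nash-Sutcliffe efficiency and bias used to score the transferred parameters are standard goodness-of-fit measures. A minimal sketch of both (function names are illustrative, not from the paper):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    NS = 1 is a perfect match; NS = 0 means the model is no better than
    predicting the mean observed flow."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

def percent_bias(observed, simulated):
    """Percent bias; positive values indicate overall overestimation
    (sign conventions vary across the literature)."""
    return 100.0 * sum(s - o for o, s in zip(observed, simulated)) / sum(observed)
```

Against this yardstick, the reported median NS of 0.77 for within-region transfer indicates substantial predictive skill at the ungauged sites.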

  12. Definably compact groups definable in real closed fields. I

    OpenAIRE

    Barriga, Eliana

    2017-01-01

We study definably compact definably connected groups definable in a sufficiently saturated real closed field $R$. We introduce the notion of group-generic point for $\bigvee$-definable groups and show the existence of group-generic points for definably compact groups definable in a sufficiently saturated o-minimal expansion of a real closed field. We use this notion along with some properties of generic sets to prove that for every definably compact definably connected group $G$ definable in...

  13. Technostress and the Reference Librarian.

    Science.gov (United States)

    Kupersmith, John

    1992-01-01

    Defines "technostress" as the stress experienced by reference librarians who must constantly deal with the demands of new information technology and the changes they produce in the work place. Discussion includes suggested ways in which both organizations and individuals can work to reduce stress. (27 references) (LAE)

  14. A priori tests of combustion models based on a CH{sub 4}/H{sub 2} Triple Flame

    Energy Technology Data Exchange (ETDEWEB)

    Dombard, J.; Naud, B.; Jimenez Sanchez, C.

    2008-07-01

This document reproduces the final project of Jerome Dombard, presented on June 25, 2008, to obtain the Master's degree MIMSE (Master Ingenierie Mathematique, Statistique et Economique) of Bordeaux University (Universite Bordeaux 1). We make an a priori study of FPI/FGM-type turbulent combustion models using a 2D DNS of a triple flame. A reduced chemical scheme of 16 species and 12 reactions is used (ARM1, proposed by J.-Y. Chen at Berkeley University). The fuel (CH4/H2 mixture) and oxidizer (air) correspond to the inlet composition of the Sydney bluff-body stabilised flame experiments (flames HM1-3). First, we compute 1D laminar premixed flames. The purpose of those calculations is twofold: 1. check the differences between different computer programs and different treatments of molecular diffusion, and 2. calibrate the 2D DNS of the laminar triple flame (mainly deciding on the grid resolution). Then, the solution of the 2D laminar triple flame is used to test a priori FPI/FGM tables. Finally, preliminary considerations on sub-grid scale modelling in the context of Large Eddy Simulation are made. (Author) 14 refs.

  15. A Priori Analysis of a Compressible Flamelet Model using RANS Data for a Dual-Mode Scramjet Combustor

    Science.gov (United States)

    Quinlan, Jesse R.; Drozda, Tomasz G.; McDaniel, James C.; Lacaze, Guilhem; Oefelein, Joseph

    2015-01-01

In an effort to make large eddy simulation of hydrocarbon-fueled scramjet combustors more computationally accessible using realistic chemical reaction mechanisms, a compressible flamelet/progress variable (FPV) model was proposed that extends current FPV model formulations to high-speed, compressible flows. Development of this model relied on observations garnered from an a priori analysis of the Reynolds-Averaged Navier-Stokes (RANS) data obtained for the Hypersonic International Flight Research and Experimentation (HIFiRE) dual-mode scramjet combustor. The RANS data were obtained using a reduced chemical mechanism for the combustion of a JP-7 surrogate and were validated using available experimental data. These RANS data were then post-processed to obtain, in an a priori fashion, the scalar fields corresponding to an FPV-based modeling approach. In the current work, in addition to the proposed compressible flamelet model, a standard incompressible FPV model was also considered. Several candidate progress variables were investigated for their ability to recover static temperature and major and minor product species. The effects of pressure and temperature on the tabulated progress variable source term were characterized, and model coupling terms embedded in the Reynolds-averaged Navier-Stokes equations were studied. Finally, results for the novel compressible flamelet/progress variable model were presented to demonstrate the improvement attained by modeling the effects of pressure and flamelet boundary conditions on the combustion.

  16. River routing at the continental scale: use of globally-available data and an a priori method of parameter estimation

    Directory of Open Access Journals (Sweden)

    P. Naden

    1999-01-01

Two applications of a river routing model based on the observed river network and a linearised solution to the convective-diffusion equation are presented. One is an off-line application to part of the Amazon basin (catchment area 2.15 M km2) using river network data from the Digital Chart of the World and GCM-generated runoff at a grid resolution of 2.5 degrees latitude and 3.75 degrees longitude. The other application is to the Arkansas (409,000 km2) and Red River (125,500 km2) basins as an integrated component of a macro-scale hydrological model, driven by observed meteorology and operating on a 17 km grid. This second application makes use of the US EPA reach data to construct the river network. In both cases, a method of computing parameter values a priori has been applied and shows some success, although some interpretation is required to derive 'correct' parameter values and further work is needed to develop guidelines for use of the method. The applications, however, do demonstrate the possibilities for applying the routing model at the continental scale, with globally-available data and a priori parameter estimation, and its value for validating GCM output against observed flows.
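The linearised convective-diffusion routing used here has a classical analytic impulse response (the Hayami kernel). The sketch below assumes that standard form with two parameters, wave celerity c and diffusivity D; the paper's exact parameterisation may differ:

```python
import math

def hayami_kernel(t: float, x: float, c: float, D: float) -> float:
    """Impulse response of the linearised convective-diffusion routing
    equation at distance x downstream, for wave celerity c (m/s) and
    diffusivity D (m^2/s).  Integrates to 1 over t in (0, inf)."""
    if t <= 0.0:
        return 0.0
    return x / (2.0 * math.sqrt(math.pi * D * t ** 3)) * math.exp(
        -(x - c * t) ** 2 / (4.0 * D * t))
```

Routed flow at the reach outlet is then the convolution of this kernel with the upstream runoff series; estimating c and D from channel properties is what "a priori parameter estimation" amounts to in this context.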

  17. Defining Game Mechanics

    DEFF Research Database (Denmark)

    Sicart (Vila), Miguel Angel

    2008-01-01

This article defines game mechanics in relation to rules and challenges. Game mechanics are methods invoked by agents for interacting with the game world. I apply this definition to a comparative analysis of the games Rez, Every Extend Extra and Shadow of the Colossus that will show the relevance...... of a formal definition of game mechanics. Publication date: Dec 2008...

  18. Modal Logics and Definability

    OpenAIRE

    Kuusisto, Antti

    2013-01-01

    In recent years, research into the mathematical foundations of modal logic has become increasingly popular. One of the main reasons for this is the fact that modal logic seems to adapt well to the requirements of a wide range of different fields of application. This paper is a summary of some of the author’s contributions to the understanding of modal definability theory.

  19. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
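The if-trigger-then-action (IFTA) rule notation described above can be illustrated with a toy rule engine. This is a hypothetical sketch of the idea, not the authors' implementation, and all names in it are invented for illustration:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    """An if-trigger-then-action (IFTA) rule bound to a storage event."""
    trigger: str                       # e.g. "data_created", "data_modified"
    condition: Callable[[dict], bool]  # predicate over event metadata
    action: Callable[[dict], None]     # e.g. analyze, transform, copy, publish

class RuleEngine:
    """Holds rules deployed to a storage system and fires them on events."""
    def __init__(self) -> None:
        self.rules: Dict[str, List[Rule]] = {}

    def deploy(self, rule: Rule) -> None:
        self.rules.setdefault(rule.trigger, []).append(rule)

    def on_event(self, trigger: str, event: dict) -> None:
        for rule in self.rules.get(trigger, []):
            if rule.condition(event):
                rule.action(event)

# Example: index every newly created HDF5 file (paths are hypothetical).
indexed: List[str] = []
engine = RuleEngine()
engine.deploy(Rule(
    trigger="data_created",
    condition=lambda e: e["path"].endswith(".h5"),
    action=lambda e: indexed.append(e["path"]),
))
engine.on_event("data_created", {"path": "/lab/run42.h5"})
```

In the real system the actions would be data ingest, characterization, indexing, or sharing tasks attached to local or remote storage, but the trigger/condition/action decomposition is the core of the notation.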

  20. Defining Abnormally Low Tenders

    DEFF Research Database (Denmark)

    Ølykke, Grith Skovgaard; Nyström, Johan

    2017-01-01

    The concept of an abnormally low tender is not defined in EU public procurement law. This article takes an interdisciplinary law and economics approach to examine a dataset consisting of Swedish and Danish judgments and verdicts concerning the concept of an abnormally low tender. The purpose...

  1. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  2. Defining depth of anesthesia.

    Science.gov (United States)

    Shafer, S L; Stanski, D R

    2008-01-01

    In this chapter, drawn largely from the synthesis of material that we first presented in the sixth edition of Miller's Anesthesia, Chap 31 (Stanski and Shafer 2005; used by permission of the publisher), we have defined anesthetic depth as the probability of non-response to stimulation, calibrated against the strength of the stimulus, the difficulty of suppressing the response, and the drug-induced probability of non-responsiveness at defined effect site concentrations. This definition requires measurement of multiple different stimuli and responses at well-defined drug concentrations. There is no one stimulus and response measurement that will capture depth of anesthesia in a clinically or scientifically meaningful manner. The "clinical art" of anesthesia requires calibration of these observations of stimuli and responses (verbal responses, movement, tachycardia) against the dose and concentration of anesthetic drugs used to reduce the probability of response, constantly adjusting the administered dose to achieve the desired anesthetic depth. In our definition of "depth of anesthesia" we define the need for two components to create the anesthetic state: hypnosis created with drugs such as propofol or the inhalational anesthetics and analgesia created with the opioids or nitrous oxide. We demonstrate the scientific evidence that profound degrees of hypnosis in the absence of analgesia will not prevent the hemodynamic responses to profoundly noxious stimuli. Also, profound degrees of analgesia do not guarantee unconsciousness. However, the combination of hypnosis and analgesia suppresses hemodynamic response to noxious stimuli and guarantees unconsciousness.
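The probability-of-non-response framework described in this chapter is conventionally quantified with a sigmoid (Hill) concentration-effect relation. The sketch below is a generic illustration of that pharmacodynamic model, not code from the chapter, and the parameter names are generic:

```python
def p_no_response(ce: float, c50: float, gamma: float) -> float:
    """Sigmoid (Hill) model of anesthetic depth: probability of non-response
    to a given stimulus at effect-site concentration `ce`.  `c50` is the
    concentration giving a 50% probability of non-response for that stimulus,
    and `gamma` sets the steepness of the concentration-response curve."""
    return ce ** gamma / (c50 ** gamma + ce ** gamma)
```

Stronger stimuli and harder-to-suppress responses correspond to larger c50 values, which is how the calibration against stimulus strength described above enters the model.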

  3. Defining and classifying syncope

    NARCIS (Netherlands)

    Thijs, Roland D.; Wieling, Wouter; Kaufmann, Horacio; van Dijk, Gert

    2004-01-01

    There is no widely adopted definition or classification of syncope and related disorders. This lack of uniformity harms patient care, research, and medical education. In this article, syncope is defined as a form of transient loss of consciousness (TLOC) due to cerebral hypoperfusion. Differences

  4. O Caráter a priori das Estruturas Necessárias ao Conhecimento, Construídas segundo a Epistemologia Genética

    OpenAIRE

Marçal, Vicente Eduardo Ribeiro; Tassinari, Ricardo Pereira [UNESP]

    2014-01-01

    In this paper we discuss the question of the a priori character of the necessary structures of knowledge according to Genetic Epistemology, focusing on the notion of space in particular. We establish some relations between Jean Piaget’s Genetic Epistemology and Immanuel Kant’s Critical Philosophy, discuss the notion of the a priori according to Kant in relation to the notion of space, and discuss the construction of the notion of space by the epistemic subject according to Genetic Epistemolog...

  5. Reference Structures: Stagnation, Progress, and Future Challenges.

    Science.gov (United States)

    Greenberg, Jane

    1997-01-01

    Assesses the current state of reference structures in online public access catalogs (OPACs) in a framework defined by stagnation, progress, and future challenges. Outlines six areas for reference structure development. Twenty figures provide illustrations. (AEF)

  6. Defining Legal Moralism

    DEFF Research Database (Denmark)

    Thaysen, Jens Damgaard

    2015-01-01

    This paper discusses how legal moralism should be defined. It is argued that legal moralism should be defined as the position that “For any X, it is always a pro tanto reason for justifiably imposing legal regulation on X that X is morally wrong (where “morally wrong” is not conceptually equivalent...... to “harmful”)”. Furthermore, a distinction between six types of legal moralism is made. The six types are grouped according to whether they are concerned with the enforcement of positive or critical morality, and whether they are concerned with criminalising, legally restricting, or refraining from legally...... protecting morally wrong behaviour. This is interesting because not all types of legal moralism are equally vulnerable to the different critiques of legal moralism that have been put forth. Indeed, I show that some interesting types of legal moralism have not been criticised at all....

  7. Defining local food

    DEFF Research Database (Denmark)

    Eriksen, Safania Normann

    2013-01-01

    Despite evolving local food research, there is no consistent definition of “local food.” Various understandings are utilized, which have resulted in a diverse landscape of meaning. The main purpose of this paper is to examine how researchers within the local food systems literature define local...... food, and how these definitions can be used as a starting point to identify a new taxonomy of local food based on three domains of proximity....

  8. Reference Japanese man

    International Nuclear Information System (INIS)

    Tanaka, Giichiro

    1985-01-01

To establish a realistic and accurate dose assessment method, it is necessary to provide a "Reference Japanese Man" based on anatomical, physiological and biochemical data of Japanese people, instead of the Reference Man presented in ICRP Publications 23 and 30. This review describes the present status of research aimed at establishing the Reference Japanese Man. The Reference Japanese Man is defined as a male or female adult who lives in Japan with a Japanese life-style and food custom. His stature and body weight, and the other data, were determined as mean values for the male or female people of Japan. As for food custom, Japanese people take a significantly smaller amount of meat and milk products than Western people, and a larger amount of cereals and marine products such as fish or seaweeds. Organ weight is a principal factor for internal dose assessment; mean values for living Japanese adults have been investigated, and the values employable for dose assessment of organs and tissues are shown. When employing these values of the Reference Japanese Man, age should be taken into account. Metabolic parameters should also be considered; iodine metabolism in Japanese people is quite different from that of Western people. The above-mentioned data are now tentatively employed in modification of the table of the MIRD method and others. (Takagi, S.)

  9. The benefits of defining "snacks".

    Science.gov (United States)

    Hess, Julie M; Slavin, Joanne L

    2018-04-18

    Whether eating a "snack" is considered a beneficial or detrimental behavior is largely based on how "snack" is defined. The term "snack food" tends to connote energy-dense, nutrient-poor foods high in nutrients to limit (sugar, sodium, and/or saturated fat) like cakes, cookies, chips and other salty snacks, and sugar-sweetened beverages. Eating a "snack food" is often conflated with eating a "snack," however, leading to an overall perception of snacks as a dietary negative. Yet the term "snack" can also refer simply to an eating occasion outside of breakfast, lunch, or dinner. With this definition, the evidence to support health benefits or detriments to eating a "snack" remains unclear, in part because relatively few well-designed studies that specifically focus on the impact of eating frequency on health have been conducted. Despite these inconsistencies and research gaps, in much of the nutrition literature, "snacking" is still referred to as detrimental to health. As discussed in this review, however, there are multiple factors that influence the health impacts of snacking, including the definition of "snack" itself, the motivation to snack, body mass index of snack eaters, and the food selected as a snack. Without a definition of "snack" and a body of research using methodologically rigorous protocols, determining the health impact of eating a "snack" will continue to elude the nutrition research community and prevent the development of evidence-based policies about snacking that support public health. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. On possible a-priori "imprinting" of General Relativity itself on the performed Lense-Thirring tests with LAGEOS satellites

    Science.gov (United States)

    Iorio, Lorenzo

    2010-02-01

    The impact of possible a-priori "imprinting" effects of general relativity itself on recent attempts to measure the general relativistic Lense-Thirring effect with the LAGEOS satellites orbiting the Earth and the terrestrial geopotential models from the dedicated mission GRACE is investigated. It is analytically shown that general relativity, not explicitly solved for in the GRACE-based models, may "imprint" their even zonal harmonic coefficients of low degrees J_l at a non-negligible level, given the present-day accuracy in recovering them. This translates into a bias of the LAGEOS-based relativistic tests as large as the Lense-Thirring effect itself. Further analyses should include general relativity itself in the GRACE data processing by explicitly solving for it.
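For reference, the secular Lense-Thirring precession of a satellite's node that these tests target is, in the standard weak-field form,

```latex
\dot{\Omega}_{\mathrm{LT}} = \frac{2\,G S}{c^{2} a^{3} \left(1 - e^{2}\right)^{3/2}},
```

where $S$ is the Earth's angular momentum, $a$ the satellite's semimajor axis and $e$ its eccentricity. For the LAGEOS satellites this amounts to a few tens of milliarcseconds per year, which is why an "imprint" on the low-degree even zonals $J_l$ at present-day accuracy can bias the test by an amount comparable to the effect itself.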

  11. Radiotherapy for brain metastases: defining palliative response

    International Nuclear Information System (INIS)

    Bezjak, Andrea; Adam, Janice; Panzarella, Tony; Levin, Wilfred; Barton, Rachael; Kirkbride, Peter; McLean, Michael; Mason, Warren; Wong, Chong Shun; Laperriere, Normand

    2001-01-01

Background and purpose: Most patients with brain metastases are treated with palliative whole brain radiotherapy (WBRT). There is no established definition of palliative response. The aim of this study was to develop and test clinically useful criteria for response following palliative WBRT. Materials and methods: A prospective study was conducted of patients with symptomatic brain metastases treated with WBRT (20 Gy/5 fractions) and standardised steroid tapering. Assessments included observer rating of neurological symptoms, patient-completed symptom checklist and performance status (PS). Response criteria were operationally defined based on a combination of neurological symptoms, PS and steroid dose. Results: Seventy-five patients were accrued. At 1 month, presenting neurological symptoms were improved in 14 patients, stable in 17, and worse in 21; 23 patients were not assessed, mainly due to death or frailty. Using response criteria defined a priori, 15% (95% CI 7-23%) of patients were classified as having a response to RT, 25% no response, and 29% progression; 27% were deceased at or soon after 1 month. A revised set of criteria was tested, with less emphasis on complete tapering of steroids: they increased the proportion of patients responding to 39% (95% CI 27-50%) but did not change the large proportion who did not benefit (44%). Conclusions: Clinical response to RT of patients with brain metastases is multifactorial, comprising symptoms, PS and other factors. Assessment of the degree of palliation depends on the exact definition used. More research is needed in this important area to help validate criteria for assessing palliation after WBRT

  12. Defined contribution health benefits.

    Science.gov (United States)

    Fronstin, P

    2001-03-01

    This Issue Brief discusses the emerging issue of "defined contribution" (DC) health benefits. The term "defined contribution" is used to describe a wide variety of approaches to the provision of health benefits, all of which have in common a shift in the responsibility for payment and selection of health care services from employers to employees. DC health benefits often are mentioned in the context of enabling employers to control their outlay for health benefits by avoiding increases in health care costs. DC health benefits may also shift responsibility for choosing a health plan and the associated risks of choosing a plan from employers to employees. There are three primary reasons why some employers currently are considering some sort of DC approach. First, they are once again looking for ways to keep their health care cost increases in line with overall inflation. Second, some employers are concerned that the public "backlash" against managed care will result in new legislation, regulations, and litigation that will further increase their health care costs if they do not distance themselves from health care decisions. Third, employers have modified not only most employee benefit plans, but labor market practices in general, by giving workers more choice, control, and flexibility. DC-type health benefits have existed as cafeteria plans since the 1980s. A cafeteria plan gives each employee the opportunity to determine the allocation of his or her total compensation (within employer-defined limits) among various employee benefits (primarily retirement or health). Most types of DC health benefits currently being discussed could be provided within the existing employment-based health insurance system, with or without the use of cafeteria plans. They could also allow employees to purchase health insurance directly from insurers, or they could drive new technologies and new forms of risk pooling through which health care services are provided and financed. 

  13. Towards Improving Satellite Tropospheric NO2 Retrieval Products: Impacts of the spatial resolution and lighting NOx production from the a priori chemical transport model

    Science.gov (United States)

    Smeltzer, C. D.; Wang, Y.; Zhao, C.; Boersma, F.

    2009-12-01

Polar orbiting satellite retrievals of tropospheric nitrogen dioxide (NO2) columns are important to a variety of scientific applications. These NO2 retrievals rely on a priori profiles from chemical transport models and radiative transfer models to derive the vertical columns (VCs) from slant column measurements. In this work, we compare the retrieval results using a priori profiles from a global model (TM4) and a higher resolution regional model (REAM) at the OMI overpass hour of 1330 local time, implementing the Dutch OMI NO2 (DOMINO) retrieval. We also compare the retrieval results using a priori profiles from REAM model simulations with and without lightning NOx (NO + NO2) production. A priori model resolution and lightning NOx production are both found to have a large impact on satellite retrievals, because shifting the NO2 vertical distribution interpreted by the radiative transfer model alters the satellite sensitivity to a particular observation. The retrieved tropospheric NO2 VCs may increase by 25-100% in urban regions and be reduced by 50% in rural regions if the a priori profiles from REAM simulations are used during the retrievals instead of the profiles from TM4 simulations. The a priori profiles with lightning NOx may result in a 25-50% reduction of the retrieved tropospheric NO2 VCs compared to the a priori profiles without lightning. As a first priority, a priori vertical NO2 profiles from a high-resolution chemical transport model, which can better simulate urban-rural NO2 gradients in the boundary layer and make use of observation-based parameterizations of lightning NOx production, should be implemented to obtain more accurate NO2 retrievals over the United States, where NOx source regions are spatially separated and lightning NOx production is significant.
Then, as a consequence of the a priori NO2 profile variability resulting from lightning and model resolution dynamics, geostationary satellite daylight observations would further promote the next
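The slant-to-vertical conversion this record describes works through an air mass factor (AMF) computed from the a priori profile. A minimal sketch of the standard profile-weighted formulation (function and variable names are illustrative, not from the DOMINO code):

```python
def air_mass_factor(scattering_weights, apriori_profile):
    """Profile-weighted AMF: the sensitivity (scattering weight) of each
    layer, averaged with the a priori partial columns as weights.  Shifting
    a priori NO2 toward the boundary layer, where sensitivity is low,
    lowers the AMF."""
    num = sum(w * x for w, x in zip(scattering_weights, apriori_profile))
    return num / sum(apriori_profile)

def vertical_column(slant_column, scattering_weights, apriori_profile):
    """VC = SC / AMF: a lower AMF yields a larger retrieved vertical column,
    which is why urban (boundary-layer-heavy) a priori profiles raise VCs."""
    return slant_column / air_mass_factor(scattering_weights, apriori_profile)
```

This is why the choice of a priori model matters: a high-resolution profile concentrating NO2 near the surface reduces the AMF and increases the retrieved urban VCs, as reported above.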

  14. On Defining Mass

    Science.gov (United States)

    Hecht, Eugene

    2011-01-01

    Though central to any pedagogical development of physics, the concept of mass is still not well understood. Properly defining mass has proven to be far more daunting than contemporary textbooks would have us believe. And yet today the origin of mass is one of the most aggressively pursued areas of research in all of physics. Much of the excitement surrounding the Large Hadron Collider at CERN is associated with discovering the mechanism responsible for the masses of the elementary particles. This paper will first briefly examine the leading definitions, pointing out their shortcomings. Then, utilizing relativity theory, it will propose—for consideration by the community of physicists—a conceptual definition of mass predicated on the more fundamental concept of energy, more fundamental in that everything that has mass has energy, yet not everything that has energy has mass.
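A definition of mass grounded in energy, as the abstract proposes, is conventionally expressed through the invariant of the energy-momentum relation:

```latex
E^{2} = (pc)^{2} + (mc^{2})^{2}
\qquad\Longrightarrow\qquad
m = \frac{\sqrt{E^{2} - (pc)^{2}}}{c^{2}}
```

For a body at rest ($p = 0$) this reduces to $m = E_{0}/c^{2}$, while light carries energy without mass ($m = 0$, $E = pc$), consistent with the closing observation that everything with mass has energy but not everything with energy has mass. (This is the standard relation, not necessarily the specific definition the paper advances.)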

  15. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  16. Defining cyber warfare

    Directory of Open Access Journals (Sweden)

    Dragan D. Mladenović

    2012-04-01

    Full Text Available Cyber conflicts represent a new kind of warfare that is technologically developing very rapidly. Such development results in more frequent and more intensive cyber attacks undertaken by states against adversary targets, with a wide range of diverse operations, from information operations to physical destruction of targets. Nevertheless, cyber warfare is waged through the application of the same means, techniques and methods as those used in cybercrime, terrorism and intelligence activities. Moreover, it has a very specific nature that enables states to covertly initiate attacks against their adversaries. The starting point in defining doctrines, procedures and standards in the area of cyber warfare is determining its true nature. In this paper, a contribution to this effort was made through the analysis of the existing state doctrines and international practice in the area of cyber warfare towards the determination of its nationally acceptable definition.

  17. Defining the mobilome.

    Science.gov (United States)

    Siefert, Janet L

    2009-01-01

    This chapter defines the agents that provide for the movement of genetic material which fuels the adaptive potential of life on our planet. The chapter has been structured to be broadly comprehensive, arbitrarily categorizing the mobilome into four classes: (1) transposons, (2) plasmids, (3) bacteriophage, and (4) self-splicing molecular parasites. Our increasing understanding of the mobilome is as dynamic as the mobilome itself. With continuing discovery, it is clear that nature has not confined these genomic agents of change to neat categories, but rather the classification categories overlap and intertwine. Massive sequencing efforts and their published analyses are continuing to refine our understanding of the extent of the mobilome. This chapter provides a framework to describe our current understanding of the mobilome and a foundation on which appreciation of its impact on genome evolution can be understood.

  18. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    Network Service Providers (NSP) often choose to overprovision their networks instead of deploying proper Quality of Services (QoS) mechanisms that allow for traffic differentiation and predictable quality. This tendency of overprovisioning is not sustainable for the simple reason that network...... resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data...... generic perspective (e.g. service provisioning speed, resources availability). As a result, new mechanisms for providing QoS are proposed, solutions for SDN-specific QoS challenges are designed and tested, and new network management concepts are prototyped, all aiming to improve QoS for network services...

  19. Instrumentation reference book

    CERN Document Server

    Boyes, Walt

    2002-01-01

    Instrumentation is not a clearly defined subject, having a 'fuzzy' boundary with a number of other disciplines. Often categorized as either 'techniques' or 'applications' this book addresses the various applications that may be needed with reference to the practical techniques that are available for the instrumentation or measurement of a specific physical quantity or quality. This makes it of direct interest to anyone working in the process, control and instrumentation fields where these measurements are essential.* Comprehensive and authoritative collection of technical information* Writte

  20. International Geomagnetic Reference Field

    DEFF Research Database (Denmark)

    Finlay, Chris; Maus, S.; Beggan, C. D.

    2010-01-01

    The eleventh generation of the International Geomagnetic Reference Field (IGRF) was adopted in December 2009 by the International Association of Geomagnetism and Aeronomy Working Group V‐MOD. It updates the previous IGRF generation with a definitive main field model for epoch 2005.0, a main field...... model for epoch 2010.0, and a linear predictive secular variation model for 2010.0–2015.0. In this note the equations defining the IGRF model are provided along with the spherical harmonic coefficients for the eleventh generation. Maps of the magnetic declination, inclination and total intensity...

  1. Defining the Anthropocene

    Science.gov (United States)

    Lewis, Simon; Maslin, Mark

    2016-04-01

    Time is divided by geologists according to marked shifts in Earth's state. Recent global environmental changes suggest that Earth may have entered a new human-dominated geological epoch, the Anthropocene. Should the Anthropocene - the idea that human activity is a force acting upon the Earth system in ways that mean that Earth will be altered for millions of years - be defined as a geological time-unit at the level of an Epoch? Here we appraise the data to assess such claims, first in terms of changes to the Earth system, with particular focus on very long-lived impacts, as Epochs typically last millions of years. Can Earth really be said to be in transition from one state to another? Secondly, we consider the formal criteria used to define geological time-units and move forward through time examining whether currently available evidence passes typical geological time-unit evidence thresholds. We suggest that two time periods likely fit the criteria: (1) the aftermath of the interlinking of the Old and New Worlds, which moved species across continents and ocean basins worldwide, a geologically unprecedented and permanent change, which is also the globally synchronous coolest part of the Little Ice Age (in Earth system terms), and the beginning of global trade and a new socio-economic "world system" (in historical terms), marked as a golden spike by a temporary drop in atmospheric CO2, centred on 1610 CE; and (2) the aftermath of the Second World War, when many global environmental changes accelerated and novel long-lived materials were increasingly manufactured, known as the Great Acceleration (in Earth system terms) and the beginning of the Cold War (in historical terms), marked as a golden spike by the peak in radionuclide fallout in 1964. We finish by noting that the Anthropocene debate is politically loaded; thus, transparency in the presentation of evidence is essential if a formal definition of the Anthropocene is to avoid becoming a debate about bias. The

  2. Defining an emerging disease.

    Science.gov (United States)

    Moutou, F; Pastoret, P-P

    2015-04-01

    Defining an emerging disease is not straightforward, as there are several different types of disease emergence. For example, there can be a 'real' emergence of a brand new disease, such as the emergence of bovine spongiform encephalopathy in the 1980s, or a geographic emergence in an area not previously affected, such as the emergence of bluetongue in northern Europe in 2006. In addition, disease can emerge in species formerly not considered affected, e.g. the emergence of bovine tuberculosis in wildlife species since 2000 in France. There can also be an unexpected increase of disease incidence in a known area and a known species, or there may simply be an increase in our knowledge or awareness of a particular disease. What all these emerging diseases have in common is that human activity frequently has a role to play in their emergence. For example, bovine spongiform encephalopathy very probably emerged as a result of changes in the manufacturing of meat-and-bone meal, bluetongue was able to spread to cooler climes as a result of uncontrolled trade in animals, and a relaxation of screening and surveillance for bovine tuberculosis enabled the disease to re-emerge in areas that had been able to drastically reduce the number of cases. Globalisation and population growth will continue to affect the epidemiology of diseases in years to come and ecosystems will continue to evolve. Furthermore, new technologies such as metagenomics and high-throughput sequencing are identifying new microorganisms all the time. Change is the one constant, and diseases will continue to emerge, and we must consider the causes and different types of emergence as we deal with these diseases in the future.

  3. Randomized clinical trials in orthodontics are rarely registered a priori and often published late or not at all.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Antonoglou, Georgios N; Sándor, George K; Eliades, Theodore

    2017-01-01

    A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. The aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. Of the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and were included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) had been published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that fewer than 10% of the trials were published beyond 5 years after their completion. Finally, 22 (28%) of the completed trials remained unpublished even 5 years after their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date.
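    The Kaplan-Meier survival estimates mentioned above can be sketched in a few lines of plain Python. This is a minimal illustration of the product-limit estimator applied to time-to-publication data, not the authors' analysis code; the function name and example data are ours.

```python
def kaplan_meier(times, published):
    """Product-limit (Kaplan-Meier) estimate of the probability that a
    trial remains unpublished past each event time.

    times[i]     -- months from completion to publication, or to last
                    follow-up if still unpublished
    published[i] -- True if published at times[i], False if censored
                    (still unpublished at last follow-up)

    Returns a list of (time, survival) points.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    survival, curve = 1.0, []
    idx = 0
    while idx < len(order):
        t = times[order[idx]]
        tied = [i for i in order[idx:] if times[i] == t]
        events = sum(published[i] for i in tied)  # publications at time t
        if events:
            survival *= 1 - events / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(tied)  # publications and censorings both leave the risk set
        idx += len(tied)
    return curve
```

    For example, three trials with publications at 5 and 10 months and one trial still unpublished at 15 months give survival estimates of 2/3 and then 1/3.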

  4. Numerical aspects of drift kinetic turbulence: Ill-posedness, regularization and a priori estimates of sub-grid-scale terms

    KAUST Repository

    Samtaney, Ravi

    2012-01-01

    We present a numerical method based on an Eulerian approach to solve the Vlasov-Poisson system for 4D drift kinetic turbulence. Our numerical approach uses a conservative formulation with high-order (fourth and higher) evaluation of the numerical fluxes coupled with a fourth-order accurate Poisson solver. The fluxes are computed using a low-dissipation high-order upwind differencing method or a tuned high-resolution finite difference method with no numerical dissipation. Numerical results are presented for the case of imposed ion temperature and density gradients. Different forms of controlled regularization to achieve a well-posed system are used to obtain convergent resolved simulations. The regularization of the equations is achieved by means of a simple collisional model, by inclusion of an ad-hoc hyperviscosity or artificial viscosity term or by implicit dissipation in upwind schemes. Comparisons between the various methods and regularizations are presented. We apply a filtering formalism to the Vlasov equation and derive sub-grid-scale (SGS) terms analogous to the Reynolds stress terms in hydrodynamic turbulence. We present a priori quantifications of these SGS terms in resolved simulations of drift-kinetic turbulence by applying a sharp filter. © 2012 IOP Publishing Ltd.
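    The a priori procedure described in the final step, filtering a resolved field and measuring the resulting sub-grid-scale term directly, can be illustrated in one dimension with a sharp spectral filter. This numpy sketch is our own minimal construction; the paper's actual computation acts on the 4D drift-kinetic distribution function, and the names `sharp_filter` and `sgs_stress` are ours.

```python
import numpy as np

def sharp_filter(u, k_cut):
    """Sharp spectral low-pass filter: zero all Fourier modes with |k| > k_cut."""
    u_hat = np.fft.fft(u)
    k = np.fft.fftfreq(u.size, d=1.0 / u.size)  # integer wavenumbers
    u_hat[np.abs(k) > k_cut] = 0.0
    return np.real(np.fft.ifft(u_hat))

def sgs_stress(u, k_cut):
    """A priori SGS term, analogous to a Reynolds stress:
    tau = filter(u*u) - filter(u) * filter(u)."""
    return sharp_filter(u * u, k_cut) - sharp_filter(u, k_cut) ** 2
```

    For a single resolved mode u = cos(2*pi*x) with k_cut = 1, the product u*u generates mode 2, which the filter removes, leaving tau = 1/2 - cos^2(2*pi*x): the sub-grid term is exactly the part of the nonlinear product that the coarse representation cannot carry.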

  5. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus

    Energy Technology Data Exchange (ETDEWEB)

    Bielecki, J.; Scholz, M.; Drozdowicz, K. [Institute of Nuclear Physics, Polish Academy of Sciences, PL-31342 Krakow (Poland); Giacomelli, L. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Istituto di Fisica del Plasma “P. Caldirola,” Milano (Italy); Kiptily, V.; Kempenaars, M. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Conroy, S. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Department of Physics and Astronomy, Uppsala University (Sweden); Craciunescu, T. [IAP, National Institute for Laser Plasma and Radiation Physics, Bucharest (Romania); Collaboration: EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-09-15

    A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, only 19 lines of sight) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. This work aims to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high-resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that this method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction at JET.
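    Stripped of the JET-specific prior on the emissivity shape, a Phillips-Tikhonov step amounts to a damped least-squares solve with a second-difference smoothing operator. The following numpy sketch is a generic illustration under that assumption; the function name and the identity "geometry matrix" used in the example are ours, not the paper's implementation.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Phillips-Tikhonov regularized least squares.

    Minimizes ||A x - b||^2 + lam^2 ||L x||^2, where L is the
    second-difference operator penalizing curvature (the Phillips
    choice), via the normal equations.
    """
    n = A.shape[1]
    L = np.diff(np.eye(n), n=2, axis=0)  # (n-2) x n second differences
    lhs = A.T @ A + lam**2 * (L.T @ L)
    return np.linalg.solve(lhs, A.T @ b)
```

    Because the penalty only sees curvature, a profile that is exactly linear is left untouched regardless of the regularization strength; noisy, wiggly profiles are smoothed toward it as lam grows.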

  6. Numerical aspects of drift kinetic turbulence: ill-posedness, regularization and a priori estimates of sub-grid-scale terms

    International Nuclear Information System (INIS)

    Samtaney, Ravi

    2012-01-01

    We present a numerical method based on an Eulerian approach to solve the Vlasov-Poisson system for 4D drift kinetic turbulence. Our numerical approach uses a conservative formulation with high-order (fourth and higher) evaluation of the numerical fluxes coupled with a fourth-order accurate Poisson solver. The fluxes are computed using a low-dissipation high-order upwind differencing method or a tuned high-resolution finite difference method with no numerical dissipation. Numerical results are presented for the case of imposed ion temperature and density gradients. Different forms of controlled regularization to achieve a well-posed system are used to obtain convergent resolved simulations. The regularization of the equations is achieved by means of a simple collisional model, by inclusion of an ad-hoc hyperviscosity or artificial viscosity term or by implicit dissipation in upwind schemes. Comparisons between the various methods and regularizations are presented. We apply a filtering formalism to the Vlasov equation and derive sub-grid-scale (SGS) terms analogous to the Reynolds stress terms in hydrodynamic turbulence. We present a priori quantifications of these SGS terms in resolved simulations of drift-kinetic turbulence by applying a sharp filter.

  7. A priori estimation of accuracy and of the number of wells to be employed in limiting dilution assays

    Directory of Open Access Journals (Sweden)

    J.G. Chaui-Berlinck

    2000-08-01

    Full Text Available The use of limiting dilution assay (LDA for assessing the frequency of responders in a cell population is a method extensively used by immunologists. A series of studies addressing the statistical method of choice in an LDA have been published. However, none of these studies has addressed the point of how many wells should be employed in a given assay. The objective of this study was to demonstrate how a researcher can predict the number of wells that should be employed in order to obtain results with a given accuracy, and, therefore, to help in choosing a better experimental design to fulfill one's expectations. We present the rationale underlying the expected relative error computation based on simple binomial distributions. A series of simulated in machina experiments were performed to test the validity of the a priori computation of expected errors, thus confirming the predictions. The step-by-step procedure of the relative error estimation is given. We also discuss the constraints under which an LDA must be performed.
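    The binomial rationale can be turned into a small a priori calculator. This sketch assumes single-hit Poisson statistics and a delta-method error propagation; the function names and the exact approximation are ours, not necessarily the paper's formulation.

```python
import math

def expected_relative_error(n_wells, mean_per_well):
    """Approximate relative error of the LDA estimate of the mean number
    of responders per well, m_hat = -ln(k/n).

    Under single-hit Poisson statistics each well is negative with
    probability p = exp(-m), so the count k of negative wells is
    binomial(n, p); the delta method then gives the standard error
    of ln(k/n) and hence of m_hat.
    """
    p = math.exp(-mean_per_well)                # P(a well is negative)
    se_ln = math.sqrt((1 - p) / (n_wells * p))  # SE of ln(k/n)
    return se_ln / mean_per_well

def wells_needed(target_rel_error, mean_per_well):
    """Smallest number of wells giving at most the target relative error."""
    n = 1
    while expected_relative_error(n, mean_per_well) > target_rel_error:
        n += 1
    return n
```

    For example, at an average of one responder per well, a 10% relative error requires 172 wells under these assumptions, which is the kind of a priori sizing decision the abstract describes.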

  8. Tomographic image via background subtraction using an x-ray projection image and a priori computed tomography

    International Nuclear Information System (INIS)

    Zhang Jin; Yi Byongyong; Lasio, Giovanni; Suntharalingam, Mohan; Yu, Cedric

    2009-01-01

    Kilovoltage x-ray projection images (kV images for brevity) are increasingly available in image guided radiotherapy (IGRT) for patient positioning. These images are two-dimensional (2D) projections of a three-dimensional (3D) object along the x-ray beam direction. Projecting a 3D object onto a plane may lead to ambiguities in the identification of anatomical structures and to poor contrast in kV images. Therefore, the use of kV images in IGRT is mainly limited to bony landmark alignments. This work proposes a novel subtraction technique that isolates a slice of interest (SOI) from a kV image with the assistance of a priori information from a previous CT scan. The method separates structural information within a preselected SOI by suppressing contributions to the unprocessed projection from out-of-SOI-plane structures. Up to a five-fold increase in the contrast-to-noise ratios (CNRs) was observed in selected regions of the isolated SOI, when compared to the original unprocessed kV image. The tomographic image via background subtraction (TIBS) technique aims to provide a quick snapshot of the slice of interest with greatly enhanced image contrast over conventional kV x-ray projections for fast and accurate image guidance of radiation therapy. With further refinements, TIBS could, in principle, provide real-time tumor localization using gantry-mounted x-ray imaging systems without the need for implanted markers.

  9. A comparative study of amplitude calibrations for the East Asia VLBI Network: A priori and template spectrum methods

    Science.gov (United States)

    Cho, Ilje; Jung, Taehyun; Zhao, Guang-Yao; Akiyama, Kazunori; Sawada-Satoh, Satoko; Kino, Motoki; Byun, Do-Young; Sohn, Bong Won; Shibata, Katsunori M.; Hirota, Tomoya; Niinuma, Kotaro; Yonekura, Yoshinori; Fujisawa, Kenta; Oyama, Tomoaki

    2017-12-01

    We present the results of a comparative study of amplitude calibrations for the East Asia VLBI Network (EAVN) at 22 and 43 GHz using two different methods, an "a priori" method and a "template spectrum" method, with particular attention to lower-declination sources. Using data sets from early EAVN observations, we investigated the elevation dependence of the gain values at seven stations of the KaVA (KVN and VERA Array) and three additional telescopes in Japan (Takahagi 32 m, Yamaguchi 32 m, and Nobeyama 45 m). By comparing the gain values obtained independently by these two methods, we found that they were consistent within 10% at elevations higher than 10°. We also found that the total flux densities of two images produced from the different amplitude calibrations agreed within 10% at both 22 and 43 GHz. Furthermore, by using the template spectrum method, additional radio telescopes can participate in KaVA (i.e., EAVN), giving a notable sensitivity increase. Our results therefore constrain the detailed conditions for measuring VLBI amplitudes reliably with EAVN, and we discuss the potential of a possible expansion to the telescopes comprising EAVN.

  10. Impact of nonlinearity on changing the a priori of trace gas profile estimates from the Tropospheric Emission Spectrometer (TES)

    Directory of Open Access Journals (Sweden)

    S. S. Kulawik

    2008-06-01

    Full Text Available Non-linear maximum a posteriori (MAP) estimates of atmospheric profiles from the Tropospheric Emission Spectrometer (TES) contain a priori information that may vary geographically, which is a confounding factor in the analysis and physical interpretation of an ensemble of profiles. One mitigation strategy is to transform the profile estimates to a common prior using a linear operation, thereby facilitating the interpretation of profile variability. However, this operation depends on the assumption of no worse than moderate non-linearity near the solution of the non-linear estimate. The robustness of this assumption is tested by comparing atmospheric retrievals from the Tropospheric Emission Spectrometer processed with a uniform prior against those processed with a variable prior and then converted to a uniform prior following the non-linear retrieval. Linearly converting the prior after a non-linear retrieval is shown to have a minor effect on the results compared to a non-linear retrieval using a uniform prior, relative to the expected total error: less than 10% of the change in the prior ends up as unbiased fluctuations in the profile estimate results.
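    The linear operation referred to above is, in standard optimal-estimation notation, the averaging-kernel prior substitution x_new = x_hat + (A - I)(xa_old - xa_new). The numpy sketch below shows that one matrix expression; the function name is ours, and the formula is the textbook linear form whose adequacy the abstract puts to the test.

```python
import numpy as np

def swap_prior(x_hat, A, xa_old, xa_new):
    """Linearly map a retrieved profile to a new a priori.

    x_hat  : retrieved profile obtained with prior xa_old
    A      : averaging kernel matrix of the retrieval
    Valid only under the moderate-non-linearity assumption discussed
    in the abstract.
    """
    I = np.eye(A.shape[0])
    return x_hat + (A - I) @ (xa_old - xa_new)
```

    Two limiting cases make the behavior clear: with a perfect averaging kernel (A = I) the prior is irrelevant and the profile is unchanged, while with no measurement sensitivity (A = 0) the estimate simply follows the shift in the prior.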

  11. Enhancing the performance of model-based elastography by incorporating additional a priori information in the modulus image reconstruction process

    International Nuclear Information System (INIS)

    Doyley, Marvin M; Srinivasan, Seshadri; Dimidenko, Eugene; Soni, Nirmal; Ophir, Jonathan

    2006-01-01

    Model-based elastography is fraught with problems owing to the ill-posed nature of the inverse elasticity problem. To overcome this limitation, we have recently developed a novel inversion scheme that incorporates a priori information concerning the mechanical properties of the underlying tissue structures, and the variance incurred during displacement estimation in the modulus image reconstruction process. The information was procured by employing standard strain imaging methodology, and introduced in the reconstruction process through the generalized Tikhonov approach. In this paper, we report the results of experiments conducted on gelatin phantoms to evaluate the performance of modulus elastograms computed with the generalized Tikhonov (GTK) estimation criterion relative to those computed by employing the un-weighted least-squares estimation criterion, the weighted least-squares estimation criterion and the standard Tikhonov method (i.e., the generalized Tikhonov method with no modulus prior). The results indicate that modulus elastograms computed with the generalized Tikhonov approach had superior elastographic contrast discrimination and contrast recovery. In addition, image reconstruction was more resilient to structural decorrelation noise when additional constraints were imposed on the reconstruction process through the GTK method

  12. Definably compact groups definable in real closed fields.II

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We continue the analysis of definably compact groups definable in a real closed field $\mathcal{R}$. In [3], we proved that for every definably compact definably connected semialgebraic group $G$ over $\mathcal{R}$ there are a connected $R$-algebraic group $H$, a definable injective map $\phi$ from a generic definable neighborhood of the identity of $G$ into the group $H\left(R\right)$ of $R$-points of $H$ such that $\phi$ acts as a group homomorphism inside its domain. The above result and o...

  13. Practical Considerations about Expected A Posteriori Estimation in Adaptive Testing: Adaptive A Priori, Adaptive Correction for Bias, and Adaptive Integration Interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…

  14. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    Science.gov (United States)

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy.

  15. Revealing plant cryptotypes: defining meaningful phenotypes among infinite traits.

    Science.gov (United States)

    Chitwood, Daniel H; Topp, Christopher N

    2015-04-01

    The plant phenotype is infinite. Plants vary morphologically and molecularly over developmental time, in response to the environment, and genetically. Exhaustive phenotyping remains not only out of reach, but is also the limiting factor in interpreting the wealth of genetic information currently available. Although phenotyping methods are always improving, an impasse remains: even if we could measure the entirety of phenotype, how would we interpret it? We propose the concept of cryptotype to describe latent, multivariate phenotypes that maximize the separation of a priori classes. Whether for the infinite points comprising a leaf outline or the shape descriptors defining root architecture, statistical methods to discern the quantitative essence of an organism will be required as we approach measuring the totality of phenotype. Copyright © 2015 Elsevier Ltd. All rights reserved.
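    A concrete minimal instance of a latent, multivariate phenotype that "maximizes the separation of a priori classes" is the Fisher linear discriminant. The numpy sketch below (our construction, not the authors' pipeline) extracts that single best-separating direction from two trait matrices.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher linear discriminant direction for two a priori classes.

    X1, X2 : (samples x traits) matrices for each class.
    Returns the unit vector w maximizing between-class separation
    relative to within-class scatter: w proportional to
    Sw^{-1} (mu1 - mu2), with Sw the pooled within-class scatter.
    """
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = (np.cov(X1, rowvar=False) * (len(X1) - 1)
          + np.cov(X2, rowvar=False) * (len(X2) - 1))
    w = np.linalg.solve(Sw, mu1 - mu2)
    return w / np.linalg.norm(w)
```

    Projecting every individual onto w collapses an arbitrarily high-dimensional trait space to a single latent score, which is the spirit of the "cryptotype" the abstract proposes.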

  16. Séparation des ondes P et S à l'aide de la matrice spectrale avec informations à priori The Separation of P and S Waves Using the Spectral Matrix with a Priori Information

    Directory of Open Access Journals (Sweden)

    Mari J. L.

    2006-11-01

    Full Text Available Classically, the filtering technique using the spectral matrix proposed by Mermoz allows automatic wave separation, in the sense of seismic arrivals, only in particular cases, namely when the waves to be separated are naturally aligned with the eigenvectors of the spectral matrix. In the other cases, we show that introducing a priori information on the apparent velocity of some waves, together with a limit on their time duration, makes it possible to estimate their wave vectors. Using these vectors with a least-squares projection technique leads to an optimal extraction of these waves without degrading the others. The proposed filtering technique was applied to offset vertical seismic profile (VSP) data. The VSP was recorded in a well between depths of 1050 m and 1755 m; the source was offset 654 m from the wellhead. The tool used was a three-component borehole geophone. The well crosses a complex geological structure. The processing revealed seismic reflections of compressional and shear waves, associated with steeply dipping markers (10 to 25°). After estimating the velocity fields and dips with charts, depth migration of the picked time horizons yielded a faulted structural model. Detailed structural analysis can be achieved by using the 3-component vertical seismic profiling method, which gives structural information at several hundred meters from the wellhead. The use of an offset VSP on the Auzance structure has led to a structural model composed of faulted dipping reflectors. This is due to the robust nature of the wave separation method, which is based on the spectral matrix and uses a priori information. This method preserves the true amplitude and the local apparent

  17. Predicting Hyporheic Exchange of Water and Solutes in Streams on the Basis of a Priori Estimates of Stream Physical Characteristics

    Science.gov (United States)

    Stone, S. H.; Harvey, J.; Packman, A.; Worman, A.

    2005-12-01

    source of uncertainty in predicting hyporheic exchange resulted from the lack of detailed information on streambed hydraulic conductivity. We are currently conducting additional fieldwork to improve characterization of hydraulic conductivity and evaluate temporal changes in local stream morphology, and will relate these new measurements to the results of multiple prior solute injection experiments. These methods can potentially be used to provide both a priori, order-of-magnitude prediction of hyporheic exchange and much higher-quality estimates of long-term average behavior when used in conjunction with direct observations of solute transport. In the future we intend to test the generality of our method by applying the technique in other streams with varying geomorphology and flow conditions.

  18. Risk pathways for gonorrhea acquisition in sex workers: can we distinguish confounding from an exposure effect using a priori hypotheses?

    NARCIS (Netherlands)

    Gomez, Gabriela B.; Ward, Helen; Garnett, Geoffrey P.

    2014-01-01

    The population distribution of sexually transmitted infections (STIs) varies broadly across settings. Although there have been many studies aiming to define subgroups at risk of infection that should be a target for prevention interventions by identifying risk factors, questions remain about how

  19. Contribution to restoration of degraded images by a space-variant system: use of an a priori model of the image

    International Nuclear Information System (INIS)

    Barakat, Valerie

    1998-01-01

    Imaging systems often present shift-variant point spread functions, which are usually approximated by shift-invariant ones in order to simplify the restoration problem. The aim of this thesis is to show that taking this shift-variant degradation into account may strongly increase the quality of restoration. The imaging system is a pinhole used to acquire images of high-energy beams. Three restoration methods have been studied and compared: the Tikhonov-Miller regularization, the Markov-fields and the Maximum-Entropy methods. These methods are based on the incorporation of a priori knowledge into the restoration process to achieve stability of the solution. An improved restoration method is proposed: this approach is based on the Tikhonov-Miller regularization combined with an a priori model of the solution. The idea of such a model is to express the local characteristics to be reconstructed. The concept of parametric models described by a set of parameters (shape of the object, amplitude values, ...) is used. A parametric optimization is used to find the optimal estimate of the parameters, consistent with the a priori information on the expected solution. Several criteria have been proposed to measure the restoration quality. (author) [fr

  20. Improvement of Tidal Analysis Results by a Priori Rain Fall Modelling at the Vienna and Membach stations

    Science.gov (United States)

    Meurers, B.; van Camp, M.; Petermans, T.

    2005-12-01

    We investigate how far tidal analysis results can be improved when a rainfall admittance model is applied to the superconducting gravity (SG) data. For that purpose, both the Vienna and Membach data have been analysed with and without a priori rainfall correction. In Membach the residual drop for most events (80%) can be explained by the rain water load, while in Vienna only 50% of all events fit the model in detail. In the other cases the Newtonian effect of vertical air mass redistribution (vertical density variation without air pressure change), predominantly connected with high vertical convection activity, e.g. thunderstorms, plays an essential role: short-term atmospheric signals show up as steep gravity residual decreases of a few nm s⁻² within 10-60 min, well correlated with outdoor air temperature in most cases. However, even in those cases the water load model is able to explain the dominant part of the residual drop, especially during heavy rainfall. In Vienna more than 110 events have been detected over 10 years; 84% of them are associated with heavy rain starting at, or up to 10 min later than, the residual drop, while the rest (16%) show no or only little rainfall. The magnitude of the gravity drop depends on the total amount of rainfall accumulated during the meteorological event. Step-like signals deteriorate frequency spectrum estimates; this even holds for tidal analysis. As the drops are of physical origin, they should not be eliminated blindly but corrected using water load modeling constrained by high-temporal-resolution (1 min) rain data. 3D modeling of the water mass load due to a rain event is based on the following assumptions: (1) Rain water intrudes into the uppermost soil layer (close to the topography surface) and remains there at least until the rain has stopped. This is justified for a period of some hours after the rainfall, as evapotranspiration is not yet effective.
(2) No run-off except from sealed areas or building roofs, where water can
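    The magnitudes in the record can be sanity-checked with the simplest possible model: treating the rain stored in the uppermost soil layer as an infinite Bouguer slab of water. This is a back-of-the-envelope sketch, not the 3D load modeling the authors perform; for a 10 mm event it lands in the "few nm s⁻²" range the record quotes:

    ```python
    import math

    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    RHO_WATER = 1000.0     # density of water, kg m^-3

    def rain_load_drop_nms2(rain_mm):
        """Gravity effect (nm s^-2) of rain water stored in the uppermost
        soil layer, modelled as an infinite Bouguer slab (2*pi*G*rho*h).
        For a surface layer above the gravimeter sensor this appears as a
        residual *drop* of roughly this magnitude."""
        thickness_m = rain_mm / 1000.0
        return 2.0 * math.pi * G * RHO_WATER * thickness_m * 1e9

    drop = rain_load_drop_nms2(10.0)   # a 10 mm rain event
    ```

    A 10 mm event yields about 4.2 nm s⁻², consistent with the observed drops of a few nm s⁻².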

  1. Identification of ¹⁹²Ir seeds in localization images using a novel statistical pattern recognition approach and a priori information

    Energy Technology Data Exchange (ETDEWEB)

    Bird, William F; Chaney, Edward L; Coggins, James M

    1995-07-01

    this study are unique in their ability to extract multiscale geometric information that discriminates one object from another object in gray-scale images. The labeling from MGSPR analysis can be refined using a post processing technique that incorporates a priori information including seed shape, intrastrand seed spacing, and the number of seeds per strand. The current implementation of post processing is user-guided but can be automated. The user interaction consists of pointing and clicking on a known seed in a strand. The algorithm then sequentially locates the remaining seeds belonging to that strand by fitting a predefined seed template to the seed pixels labeled by MGSPR. During the seed location process, the search is guided by the orientation of the template fit at each seed, the known seed spacing, and possible candidates for seeds further along the strand. The latter strategy permits correct labeling of poorly imaged seeds. Crossing and intertwining strands create ambiguous branching points that are resolved by minimizing an energy function that is correlated to the amount of strand bending, or flexion. Results: Challenging localization images from actual ¹⁹²Ir implants were used in our study. The images contain up to ≈20 strands, many of which intersect, and up to ≈200 seeds, many of which partially or completely overlap with other seeds. Most seeds (≈95-100%) were labeled successfully. Non-seed pixels with seed-like geometric properties were sometimes labeled incorrectly. The post processing technique successfully resolved ambiguous overlapping seeds, extrapolated poorly visualized seeds that were not labeled, and eliminated incorrectly labeled non-seed pixels. User-guided post processing also labeled seeds that were not visualized at all, e.g., due to obscuration by contrast media or a radiopaque applicator.
Conclusion: Our MGSPR approach and complementary post processing technique can be used to successfully label ¹⁹²Ir
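    The strand-following idea described in this record (template orientation, known seed spacing, minimal flexion at branch points) can be sketched as an energy minimization over candidate next seeds. The weights and candidate geometry below are hypothetical, and the published method resolves branching by minimizing the flexion energy over the strand rather than purely greedily:

    ```python
    import numpy as np

    def next_seed(prev, cur, candidates, spacing, w_bend=1.0, w_space=1.0):
        """Greedy strand following: pick the candidate seed minimizing a
        flexion-plus-spacing energy, i.e. strands are assumed to bend as
        little as possible and to keep the known intrastrand spacing."""
        d0 = np.asarray(cur, float) - np.asarray(prev, float)
        d0 /= np.linalg.norm(d0)
        best, best_e = None, np.inf
        for c in candidates:
            d1 = np.asarray(c, float) - np.asarray(cur, float)
            r = np.linalg.norm(d1)
            bend = 1.0 - float(np.dot(d0, d1 / r))   # 0 when perfectly straight
            energy = w_bend * bend + w_space * abs(r - spacing) / spacing
            if energy < best_e:
                best, best_e = c, energy
        return best

    # Three candidate continuations of a vertical strand with seed spacing 1.0:
    # straight-ahead at the right spacing wins over a sharp bend or a short step.
    chosen = next_seed((0.0, 0.0), (0.0, 1.0),
                       [(0.0, 2.0), (1.0, 1.0), (0.0, 1.5)], spacing=1.0)
    ```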

  2. Identification of 192Ir seeds in localization images using a novel statistical pattern recognition approach and a priori information

    International Nuclear Information System (INIS)

    Bird, William F.; Chaney, Edward L.; Coggins, James M.

    1995-01-01

    unique in their ability to extract multiscale geometric information that discriminates one object from another object in gray-scale images. The labeling from MGSPR analysis can be refined using a post processing technique that incorporates a priori information including seed shape, intrastrand seed spacing, and the number of seeds per strand. The current implementation of post processing is user-guided but can be automated. The user interaction consists of pointing and clicking on a known seed in a strand. The algorithm then sequentially locates the remaining seeds belonging to that strand by fitting a predefined seed template to the seed pixels labeled by MGSPR. During the seed location process, the search is guided by the orientation of the template fit at each seed, the known seed spacing, and possible candidates for seeds further along the strand. The latter strategy permits correct labeling of poorly imaged seeds. Crossing and intertwining strands create ambiguous branching points that are resolved by minimizing an energy function that is correlated to the amount of strand bending, or flexion. Results: Challenging localization images from actual ¹⁹²Ir implants were used in our study. The images contain up to ∼20 strands, many of which intersect, and up to ∼200 seeds, many of which partially or completely overlap with other seeds. Most seeds (∼95-100%) were labeled successfully. Non-seed pixels with seed-like geometric properties were sometimes labeled incorrectly. The post processing technique successfully resolved ambiguous overlapping seeds, extrapolated poorly visualized seeds that were not labeled, and eliminated incorrectly labeled non-seed pixels. User-guided post processing also labeled seeds that were not visualized at all, e.g., due to obscuration by contrast media or a radiopaque applicator. Conclusion: Our MGSPR approach and complementary post processing technique can be used to successfully label ¹⁹²Ir seeds in digitized localization images

  3. Defining asthma in genetic studies

    NARCIS (Netherlands)

    Koppelman, GH; Postma, DS; Meijer, G.

    1999-01-01

    Genetic studies have been hampered by the lack of a gold standard to diagnose asthma. The complex nature of asthma makes it more difficult to identify asthma genes. Therefore, approaches to define phenotypes, which have been successful in other genetically complex diseases, may be applied to define

  4. ‘Sampling the reference set’ revisited

    NARCIS (Netherlands)

    Berkum, van E.E.M.; Linssen, H.N.; Overdijk, D.A.

    1998-01-01

    The confidence level of an inference table is defined as a weighted truth probability of the inference when sampling the reference set. The reference set is recognized by conditioning on the values of maximal partially ancillary statistics. In the sampling experiment values of incidental parameters

  5. An Empirical Study of the Weigl-Goldstein-Scheerer Color-Form Test According to a Developmental Frame of Reference.

    Science.gov (United States)

    Strauss, Helen; Lewin, Isaac

    1982-01-01

    Analyzed the Weigl-Goldstein-Scheerer Color-Form Test using a sample of Danish children. Distinguished three dimensions: configuration of sorting, verbalization of the sorting principle, and flexibility in switching sorting principle. The three dimensions proved to constitute the a priori defined gradients. Results indicated a…

  6. Standard Reference Tables -

    Data.gov (United States)

    Department of Transportation — The Standard Reference Tables (SRT) provide consistent reference data for the various applications that support Flight Standards Service (AFS) business processes and...

  7. Theoretical approaches to elections defining

    OpenAIRE

    Natalya V. Lebedeva

    2011-01-01

    Theoretical approaches to elections defining develop the nature, essence and content of elections, help to determine their place and a role as one of the major national law institutions in democratic system.

  8. Theoretical approaches to elections defining

    Directory of Open Access Journals (Sweden)

    Natalya V. Lebedeva

    2011-01-01

    Full Text Available Theoretical approaches to elections defining develop the nature, essence and content of elections, help to determine their place and a role as one of the major national law institutions in democratic system.

  9. Defining Modules, Modularity and Modularization

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth; Pedersen, Per Erik Elgård

    The paper describes the evolution of the concept of modularity in a historical perspective. The main reasons for modularity are: create variety, utilize similarities, and reduce complexity. The paper defines the terms: Module, modularity, and modularization.

  10. Defining care products to finance health care in the Netherlands.

    Science.gov (United States)

    Westerdijk, Machiel; Zuurbier, Joost; Ludwig, Martijn; Prins, Sarah

    2012-04-01

    A case-mix project started in the Netherlands with the primary goal to define a complete set of health care products for hospitals. The definition of the product structure was completed 4 years later. The results are currently being used for billing purposes. This paper focuses on the methodology and techniques that were developed and applied in order to define the case-mix product structure. The central research question was how to develop a manageable product structure, i.e., a limited set of hospital products, with acceptable cost homogeneity. For this purpose, a data warehouse with approximately 1.5 million patient records from 27 hospitals was built up over a period of 3 years. The data associated with each patient consist of a large number of a priori independent parameters describing the resource utilization in different stages of the treatment process, e.g., activities in the operating theatre, the lab and the radiology department. Because of the complexity of the database, it was necessary to apply advanced data analysis techniques. The full analysis process, which starts from the database and ends with a product definition, consists of four basic analysis steps. Each of these steps has revealed interesting insights. This paper describes each step in some detail and presents the major results of each step. The result consists of 687 product groups for 24 medical specialties used for billing purposes.
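    The "acceptable cost homogeneity" criterion above calls for a within-group dispersion statistic. As an illustrative assumption (the abstract does not state the paper's exact homogeneity measure), a coefficient-of-variation check per product group might look like:

    ```python
    import numpy as np

    def cost_homogeneity(costs, labels):
        """Within-group coefficient of variation (std/mean) per product
        group: a low CV for every group indicates cost homogeneity."""
        return {g: float(costs[labels == g].std() / costs[labels == g].mean())
                for g in np.unique(labels)}

    # Two hypothetical product groups with tightly clustered costs.
    costs = np.array([100.0, 110.0, 90.0, 500.0, 520.0, 480.0])
    labels = np.array([0, 0, 0, 1, 1, 1])
    cvs = cost_homogeneity(costs, labels)
    ```

    A grouping would be accepted when every group's CV falls under a chosen threshold; otherwise the product structure is split further.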

  11. Response Surface Approximation for Fatigue Life Prediction and Its Application to Multi-Criteria Optimization With a Priori Preference Information

    International Nuclear Information System (INIS)

    Baek, Seok Heum; Joo, Won Sik; Cho, Seok Swoo

    2009-01-01

    In this paper, a versatile multi-criteria optimization concept for fatigue life prediction is introduced. Multi-criteria decision making in engineering design refers to obtaining a preferred optimal solution in the context of conflicting design objectives. Compromise decision support problems are used to model engineering decisions involving multiple trade-offs. These methods typically rely on a summation of weighted attributes to accomplish trade-offs among competing objectives. This paper gives an interpretation of the decision parameters as governing both the relative importance of the attributes and the degree of compensation between them. The approach utilizes a response surface model, the compromise decision support problem, which is a multi-objective formulation based on goal programming. Examples illustrate the concepts and demonstrate their applicability
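    The "summation of weighted attributes" trade-off described above is the Archimedean (weighted-sum) form of the compromise decision support problem. A minimal sketch, with invented design attributes, goals, and weights:

    ```python
    def compromise_rank(designs, goals, weights):
        """Rank candidate designs by a weighted-sum goal-programming score:
        the smaller the weighted normalized deviation from the goals, the
        better the compromise."""
        def score(d):
            return sum(w * abs(d[k] - goals[k]) / abs(goals[k])
                       for k, w in weights.items())
        return sorted(designs, key=score)

    designs = [
        {"name": "A", "fatigue_life": 1.0e5, "mass": 12.0},
        {"name": "B", "fatigue_life": 0.9e5, "mass": 10.0},
    ]
    goals = {"fatigue_life": 1.0e5, "mass": 10.0}      # target attribute values
    weights = {"fatigue_life": 0.7, "mass": 0.3}       # a priori preference
    best = compromise_rank(designs, goals, weights)[0]
    ```

    With fatigue life weighted more heavily, design A's full-life/overweight compromise beats design B's lighter but shorter-lived one; the weights encode both relative importance and the degree of compensation between attributes.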

  12. Defining Plagiarism: A Literature Review

    Directory of Open Access Journals (Sweden)

    Akbar Akbar

    2018-02-01

    Full Text Available Plagiarism has repeatedly occurred in Indonesia, resulting in such academic misbehavior becoming a “central issue” in Indonesian higher education. One of the issues in addressing plagiarism in higher education is confusion over how to define it: Indonesian academics appear to hold different perceptions when defining plagiarism. This article aims to explore the issue by helping define plagiarism, to address this confusion among Indonesian academics. The article applies a literature review, first identifying databases for literature searching and then finding relevant articles. After the required articles were collected, they were synthesized before the findings were presented. This study has explored the definition of plagiarism in the context of higher education. It found that plagiarism is defined in relation to criminal acts: the large number of discursive features used positions plagiaristic acts as illegal deeds. The study also found that cultural backgrounds and exposure to plagiarism were influential in defining plagiarism.

  13. Defining functional distances over Gene Ontology

    Directory of Open Access Journals (Sweden)

    del Pozo Angela

    2008-01-01

    Full Text Available Abstract Background A fundamental problem when trying to define the functional relationships between proteins is the difficulty of quantifying functional similarities, even when well-structured ontologies exist regarding the activity of proteins (i.e., the Gene Ontology, GO). However, functional metrics can overcome the problems in comparing and evaluating functional assignments and predictions. As a reference of proximity, previous approaches to comparing GO terms considered linkage in terms of the ontology, weighted by a probability distribution that balances the non-uniform 'richness' of different parts of the Directed Acyclic Graph. Here, we have followed a different approach to quantify functional similarities between GO terms. Results We propose a new method to derive 'functional distances' between GO terms that is based on the simultaneous occurrence of terms in the same set of InterPro entries, instead of relying on the structure of the GO. The coincidence of GO terms reveals natural biological links between the GO functions and defines a distance model Df which fulfils the properties of a metric space. The distances obtained in this way can be represented as a hierarchical 'Functional Tree'. Conclusion The proposed method provides a new definition of distance that enables the similarity between GO terms to be quantified. Additionally, the 'Functional Tree' defines groups with biological meaning, enhancing its utility for protein function comparison and prediction. Finally, this approach could be used for function-based protein searches in databases, and for analysing the gene clusters produced by DNA array experiments.
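    The distance Df above is built from the co-occurrence of GO terms in the same InterPro entries. The abstract does not give its formula, so as one concrete co-occurrence distance that does satisfy the metric-space axioms the record mentions, here is a Jaccard distance over annotated entry sets (the IPR identifiers are made up):

    ```python
    def cooccurrence_distance(entries_a, entries_b):
        """Jaccard distance between the sets of InterPro entries annotated
        with two GO terms: 0 when the terms always co-occur, 1 when they
        never do.  The Jaccard distance is a true metric."""
        a, b = set(entries_a), set(entries_b)
        if not a and not b:
            return 0.0
        return 1.0 - len(a & b) / len(a | b)

    # Two GO terms sharing 2 of 4 distinct entries -> distance 0.5.
    d = cooccurrence_distance({"IPR0001", "IPR0002", "IPR0003"},
                              {"IPR0002", "IPR0003", "IPR0004"})
    ```

    Feeding a full pairwise distance matrix of this kind into hierarchical clustering would yield a tree analogous to the paper's 'Functional Tree'.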

  14. 2002 reference document; Document de reference 2002

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    This 2002 reference document of the Areva group provides information on the company. Organized in seven chapters, it presents: the persons responsible for the reference document and for auditing the financial statements; information pertaining to the transaction; general information on the company and its share capital; information on company operations, changes and future prospects; assets, financial position and financial performance; information on company management, the executive board and the supervisory board; and recent developments and future prospects. (A.L.B.)

  15. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is worthwhile to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from the system simulations presented here, the paper concludes with the utility of pipelining as a general design guideline for modular software-defined radio.

  16. Defining and Selecting Independent Directors

    Directory of Open Access Journals (Sweden)

    Eric Pichet

    2017-10-01

    Full Text Available Drawing from the Enlightened Shareholder Theory that the author first developed in 2011, this theoretical paper with practical and normative ambitions achieves a better definition of independent director, while improving the understanding of the roles he fulfils on boards of directors. The first part defines constructs like firms, Governance system and Corporate governance, offering a clear distinction between the latter two concepts before explaining the four main missions of a board. The second part defines the ideal independent director by outlining the objective qualities that are necessary and adding those subjective aspects that have turned this into a veritable profession. The third part defines the ideal process for selecting independent directors, based on nominating committees that should themselves be independent. It also includes ways of assessing directors who are currently in function, as well as modalities for renewing their mandates. The paper’s conclusion presents the Paradox of the Independent Director.

  17. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature, and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample … in the organizational attributes of specific interest group types. As expected, our comparison of coding schemes reveals a closer link between group attributes and group type in narrower classification schemes based on group organizational characteristics than in those based on a behavioral definition of lobbying.

  18. ON DEFINING S-SPACES

    Directory of Open Access Journals (Sweden)

    Francesco Strati

    2013-05-01

    Full Text Available The present work is intended as an introduction to the Superposition Theory of David Carfì. In particular, I shall depict the meaning of this brand new theory, on the one hand in an informal fashion, and on the other by giving a formal account of the algebraic structure of the theory: the S-linear algebra. This kind of structure underpins the notion of S-spaces (or Carfì spaces) by defining both their properties and their nature. Thus I shall define the S-triple as the fundamental principle upon which the S-linear algebra is built up.

  19. ASTM reference radiologic digital image standards

    International Nuclear Information System (INIS)

    Wysnewski, R.; Wysnewski, D.

    1996-01-01

    ASTM Reference Radiographs have been essential in defining industry's material defect grade levels for many years, and they are used extensively: even the American Society for Metals Nondestructive Inspection and Quality Control Metals Handbook, Volume 11, eighth edition, refers to ASTM Standard Reference Radiographs. The recently published E 1648 Standard Reference Radiographs for Examination of Aluminum Fusion Welds is a prime example of the ongoing need for these references. To date, 14 Standard Reference Radiographs have been published to characterize material defects. Standard Reference Radiographs do not, however, adequately address film-less radiologic methods: there are differences in media to contend with. On a computer CRT, defect indications appear differently than indications viewed in a radiograph on a view box. Industry that uses non-film radiologic methods of inspection can be burdened with the additional time and money of developing internal standard reference radiologic images, which may be deemed necessary for grading levels of product defects. Because there are no ASTM standard reference radiologic data files addressing this need in industry, the authors of this paper suggested implementing a method for their creation under ASTM supervision. ASTM can assure continuity to users making the transition from analog radiographic images to digital image data by swiftly addressing the requirements for reference digital image standards. The current status and possible future activities regarding a method to create such digital data files are presented in this paper summary

  20. Defining and Differentiating the Makerspace

    Science.gov (United States)

    Dousay, Tonia A.

    2017-01-01

    Many resources now punctuate the maker movement landscape. However, some schools and communities still struggle to understand this burgeoning movement. How do we define these spaces and differentiate them from previous labs and shops? Through a multidimensional framework, stakeholders should consider how the structure, access, staffing, and tools…

  1. Indico CONFERENCE: Define the Programme

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial you are going to learn how to define the programme of a conference in Indico. The program of your conference is divided in different “tracks”. Tracks represent the subject matter of the conference, such as “Online Computing”, “Offline Computing”, and so on.

  2. CMS Statistics Reference Booklet

    Data.gov (United States)

    U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...

  3. Optimization of Decision-Making for Spatial Sampling in the North China Plain, Based on Remote-Sensing a Priori Knowledge

    Science.gov (United States)

    Feng, J.; Bai, L.; Liu, S.; Su, X.; Hu, H.

    2012-07-01

    In this paper, MODIS remote sensing data, featured by low cost, high timeliness and moderate-to-low spatial resolution, for the North China Plain (NCP) study region were first used to carry out mixed-pixel spectral decomposition, in order to extract a useful regionalized indicator parameter (RIP) from the initially selected indicators, namely the fraction (percentage) of winter wheat planting area in each pixel, serving as the regionalized indicator variable (RIV) for spatial sampling. Then, the RIV values were spatially analyzed and the spatial structure characteristics (i.e., spatial correlation and variation) of the NCP were obtained, which were further processed to derive scale-fitting, valid a priori knowledge for spatial sampling. Subsequently, based on the idea of rationally integrating probability-based and model-based sampling techniques and effectively utilizing the obtained a priori knowledge, spatial sampling models and design schemes were developed, together with procedures for their optimization and optimal selection, providing a scientific basis for improving and optimizing the existing spatial sampling schemes for large-scale cropland remote sensing monitoring. Additionally, an adaptive analysis and decision strategy allowed the optimal local spatial prediction and the gridded system of extrapolation results to implement an adaptive report pattern of spatial sampling, in accordance with the report-covering units, so as to satisfy the actual needs of sampling surveys.
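    The mixed-pixel spectral decomposition step, which yields the winter-wheat fraction per MODIS pixel, is commonly a linear unmixing. A minimal sketch under a fully linear mixture assumption, with hypothetical three-band endmember spectra (the paper's actual endmembers and bands are not given in this abstract):

    ```python
    import numpy as np

    def unmix_fractions(pixel, endmembers):
        """Mixed-pixel decomposition by linear unmixing: solve the
        least-squares mixture model pixel = endmembers @ fractions,
        clip negatives, and renormalize so the fractions sum to one."""
        f, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
        f = np.clip(f, 0.0, None)
        s = f.sum()
        return f / s if s > 0 else f

    # Hypothetical endmember spectra: column 0 = winter wheat, column 1 = bare soil.
    E = np.array([[0.10, 0.30],
                  [0.50, 0.30],
                  [0.80, 0.30]])
    pixel = 0.6 * E[:, 0] + 0.4 * E[:, 1]   # a pixel that is 60 % wheat
    fractions = unmix_fractions(pixel, E)
    ```

    The recovered wheat fraction per pixel is exactly the regionalized indicator variable that the sampling design then analyzes spatially.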

  4. Changing quantum reference frames

    OpenAIRE

    Palmer, Matthew C.; Girelli, Florian; Bartlett, Stephen D.

    2013-01-01

    We consider the process of changing reference frames in the case where the reference frames are quantum systems. We find that, as part of this process, decoherence is necessarily induced on any quantum system described relative to these frames. We explore this process with examples involving reference frames for phase and orientation. Quantifying the effect of changing quantum reference frames serves as a first step in developing a relativity principle for theories in which all objects includ...

  5. Defining Weapons of Mass Destruction

    Science.gov (United States)

    2012-01-01

    Cyprus, Liberia, Malta, Marshall Islands, Mongolia, Panama, and St. Vincent and the Grenadines, according to a State Department summary available...1972 Biological and Toxin Weapons Convention, and the 1993 Chemical Weapons Convention. As such, NBC weapons represent a group of weapons that the...Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction contains two references to WMD

  6. AIDS defining disease: Disseminated cryptococcosis

    Directory of Open Access Journals (Sweden)

    Roshan Anupama

    2006-01-01

    Full Text Available Disseminated cryptococcosis is one of the acquired immune deficiency syndrome-defining criteria and the most common cause of life-threatening meningitis. Disseminated lesions in the skin manifest as papules or nodules that mimic molluscum contagiosum (MC). We report here a human immunodeficiency virus-positive patient who presented with MC-like lesions. Disseminated cryptococcosis was confirmed by India ink preparation and histopathology. The condition of the patient improved with amphotericin B.

  7. IAEA biological reference materials

    International Nuclear Information System (INIS)

    Parr, R.M.; Schelenz, R.; Ballestra, S.

    1988-01-01

    The Analytical Quality Control Services programme of the IAEA encompasses a wide variety of intercomparisons and reference materials. This paper reviews only those aspects of the subject having to do with biological reference materials. The 1988 programme foresees 13 new intercomparison exercises, one for major, minor and trace elements, five for radionuclides, and seven for stable isotopes. Twenty-two natural matrix biological reference materials are available: twelve for major, minor and trace elements, six for radionuclides, and four for chlorinated hydrocarbons. Seven new intercomparisons and reference materials are in preparation or under active consideration. Guidelines on the correct use of reference materials are being prepared for publication in 1989 in consultation with other major international producers and users of biological reference materials. The IAEA database on available reference materials is being updated and expanded in scope, and a new publication is planned for 1989. (orig.)

  8. How do people define moderation?

    Science.gov (United States)

    vanDellen, Michelle R; Isherwood, Jennifer C; Delose, Julie E

    2016-06-01

    Eating in moderation is considered to be sound and practical advice for weight maintenance or prevention of weight gain. However, the concept of moderation is ambiguous, and the effect of moderation messages on consumption has yet to be empirically examined. The present manuscript examines how people define moderate consumption. We expected that people would define moderate consumption in ways that justified their current or desired consumption rather than view moderation as an objective standard. In Studies 1 and 2, moderate consumption was perceived to involve greater quantities of an unhealthy food (chocolate chip cookies, gummy candies) than perceptions of how much one should consume. In Study 3, participants generally perceived themselves to eat in moderation and defined moderate consumption as greater than their personal consumption. Furthermore, definitions of moderate consumption were related to personal consumption behaviors. Results suggest that the endorsement of moderation messages allows for a wide range of interpretations of moderate consumption. Thus, we conclude that moderation messages are unlikely to be effective messages for helping people maintain or lose weight. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Quantum bit commitment with misaligned reference frames

    International Nuclear Information System (INIS)

    Harrow, Aram; Oliveira, Roberto; Terhal, Barbara M.

    2006-01-01

    Suppose that Alice and Bob define their coordinate axes differently, and the change of reference frame between them is given by a probability distribution μ over SO(3). We show that this uncertainty of reference frame is of no use for bit commitment when μ is uniformly distributed over a (sub)group of SO(3), but other choices of μ can give rise to a partially or even arbitrarily secure bit commitment

  10. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered a large attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also to manage....... This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides a fertile ground to envision more efficient, robust, and secure networking designs, that may also...

  11. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Hansen, Jonas; Roetter, Daniel Enrique Lucani; Krigslund, Jeppe

    2015-01-01

    Software defined networking has garnered large attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant...... for decades, while applications and physical links have evolved. This article advocates for the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the- art store and forward Internet paradigm...

  12. (Re)Defining Salesperson Motivation

    DEFF Research Database (Denmark)

    Khusainova, Rushana; de Jong, Ad; Lee, Nick

    2018-01-01

    The construct of motivation is one of the central themes in selling and sales management research. Yet, to date no review article exists that surveys the construct (from both an extrinsic and an intrinsic motivation context), critically evaluates its current status, examines various key challenges apparent from the extant research, and suggests new research opportunities based on a thorough review of past work. The authors explore how motivation is defined, major theories underpinning motivation, how motivation has historically been measured, and key methodologies used over time. In addition, attention is given to principal drivers and outcomes of salesperson motivation. A summarizing appendix of key articles in salesperson motivation is provided.

  13. Defining Usability of PN Services

    DEFF Research Database (Denmark)

    Nicolajsen, Hanne Westh; Ahola, Titta; Fleury, Alexandre

    In this deliverable usability and user experience are defined in relation to MAGNET Beyond technologies, and it is described how the main MAGNET Beyond concepts can be evaluated through the involvement of users. The concepts include the new "Activity based communication approach" for interacting...... with the MAGNET Beyond system, as well as the core concepts: Personal Network, Personal Network-Federation, Service Discovery, User Profile Management, Personal Network Management, Privacy and Security and Context Awareness. The overall plans for the final usability evaluation are documented based on the present...

  14. 45 CFR 506.10 - “Vietnam conflict” defined.

    Science.gov (United States)

    2010-10-01

    ... § 506.10 “Vietnam conflict” defined. Vietnam conflict refers to the period beginning February 28, 1961... “Vietnam conflict” for purposes of payment of interest on missing military service members' deposits in the... ending date for the Vietnam conflict for purposes of determining eligibility for compensation under 50 U...

  15. Indoor air: Reference bibliography

    International Nuclear Information System (INIS)

    Campbell, D.; Staves, D.; McDonald, S.

    1989-07-01

    The U.S. Environmental Protection Agency initially established the Indoor Air Reference Bibliography in 1987 as an appendix to the Indoor Air Quality Implementation Plan. The document was submitted to Congress as required under Title IV--Radon Gas and Indoor Air Quality Research of the Superfund Amendments and Reauthorization Act of 1986. The Reference Bibliography is an extensive bibliography of reference materials on indoor air pollution. The Bibliography contains over 4500 citations and continues to increase as new articles appear

  16. Android quick APIs reference

    CERN Document Server

    Cinar, Onur

    2015-01-01

    The Android Quick APIs Reference is a condensed code and APIs reference for the new Google Android 5.0 SDK. It presents the essential Android APIs in a well-organized format that can be used as a handy reference. You won't find any technical jargon, bloated samples, drawn out history lessons, or witty stories in this book. What you will find is a software development kit and APIs reference that is concise, to the point and highly accessible. The book is packed with useful information and is a must-have for any mobile or Android app developer or programmer. In the Android Quick APIs Refe

  17. Expressiveness and definability in circumscription

    Directory of Open Access Journals (Sweden)

    Francicleber Martins Ferreira

    2011-06-01

    Full Text Available We investigate expressiveness and definability issues with respect to minimal models, particularly in the scope of Circumscription. First, we give a proof of the failure of the Löwenheim-Skolem Theorem for Circumscription. Then we show that, if the class of P; Z-minimal models of a first-order sentence is Δ-elementary, then it is elementary. That is, whenever the circumscription of a first-order sentence is equivalent to a first-order theory, then it is equivalent to a finitely axiomatizable one. This means that classes of models of circumscribed theories are either elementary or not Δ-elementary. Finally, using the previous result, we prove that, whenever a relation Pi is defined in the class of P; Z-minimal models of a first-order sentence Φ and whenever such class of P; Z-minimal models is Δ-elementary, then there is an explicit definition ψ for Pi such that the class of P; Z-minimal models of Φ is the class of models of Φ ∧ ψ. In other words, the circumscription of P in Φ with Z varied can be replaced by Φ plus this explicit definition ψ for Pi.

  18. Defining Quality in Undergraduate Education

    Directory of Open Access Journals (Sweden)

    Alison W. Bowers

    2018-01-01

    Full Text Available Objectives: This research brief explores the literature addressing quality in undergraduate education to identify what previous research has said about quality and to offer future directions for research on quality in undergraduate education. Method: We conducted a scoping review to provide a broad overview of existing research. Using targeted search terms in academic databases, we identified and reviewed relevant academic literature to develop emergent themes and implications for future research. Results: The exploratory review of the literature revealed a range of thoughtful discussions and empirical studies attempting to define quality in undergraduate education. Many publications highlighted the importance of including different stakeholder perspectives and presented some of the varying perceptions of quality among different stakeholders. Conclusions: While a number of researchers have explored and written about how to define quality in undergraduate education, there is not a general consensus regarding a definition of quality in undergraduate education. Past research offers a range of insights, models, and data to inform future research. Implication for Theory and/or Practice: We provide four recommendations for future research to contribute to a high quality undergraduate educational experience. We suggest more comprehensive systematic reviews of the literature as a next step.

  19. An a priori analysis of how solar energy production will affect the balance of payment account in one developing Latin American country

    Energy Technology Data Exchange (ETDEWEB)

    Stavy, Michael [Chicago, Illinois (United States)

    2000-07-01

    This paper studies a model developing Latin American country (Hypotheria) with a weak currency (the hypo is the monetary unit), a trade deficit (including being a net importer of fossil fuels) and a sensitive balance of payments situation. An a priori analysis is made of the effect of domestic solar energy production on Hypotheria's balance of payments; the positive effect is the BoP Value of domestic solar energy. Many forms of solar energy are not cost competitive with fossil fuels. Because solar energy production does not emit greenhouse gases, the Greenhouse Value and the BoP Value of solar energy should be used to reduce the cost of solar energy projects in Hypotheria and to make solar energy cost competitive with fossil fuels. [Spanish] Este articulo estudia un modelo de un pais Latinoamericano en desarrollo (Hypoteria) con moneda debil (el hypo es la unidad monetaria), un deficit comercial (incluyendo el ser un importador neto de combustibles fosiles) y un balance precario en la situacion de pagos. Existe un analisis a priori sobre el efecto de la produccion domestica de energia solar en un efecto positivo de Hypoteria que es el Valor de la Balanza de Pagos (BoP) de la energia solar domestica. Muchas formas de energia solar no son competitivas en costo con los combustibles fosiles debido a que la produccion de energia solar no emite un Valor de invernadero, y el Valor de la Balanza de Pagos, debe ser utilizado para reducir los costos de los proyectos de energia solar en Hypoteria y asi hacer el costo de la energia solar competitiva con los combustibles fosiles.

  20. First attempt to assess the viability of bluefin tuna spawning events in offshore cages located in an a priori favourable larval habitat

    Directory of Open Access Journals (Sweden)

    Patricia Reglero

    2013-10-01

    Full Text Available Most of the Atlantic bluefin tuna caught by the purse-seine fleet in the Mediterranean Sea are transferred alive into transport cages and towed to coastal facilities where they are fattened. This major fishery is targeting aggregations of reproductive bluefin tuna that continue spawning within the transport cages. Our study is the first attempt to assess the viability of the spawning events within transport cages placed offshore in a priori favourable locations for larval survival. The study was conducted in June 2010 in the Balearic Sea, a main spawning area for bluefin tuna in the Mediterranean. The locations of two transport cages, one with wild and one with captive tuna, coincided with the position of the chlorophyll front, using satellite imagery as a proxy for the salinity front between resident surface waters and those of recent Atlantic origin. The results showed that bluefin tuna eggs were spawned almost every day within the two cages but few or no larvae were found. The expected larval densities estimated after applying mortality curves to the daily egg densities observed in the cages were higher than the sampled larval densities. The trajectories of the eggs after hatching, estimated from a particle tracking model based on observed geostrophic currents and a drifter deployed adjacent to the cage, suggest that larvae were likely to be caught close to the cages within the sampling dates. Eggs spawned by captive tuna in transport cages may hatch into larvae, though the larvae may experience higher mortality rates than expected in natural populations. The causes of the larval mortality are further discussed in the text. Such studies should be repeated in other spawning areas in the Mediterranean if spawning in cages located offshore, in areas a priori favourable for larval survival, is to be considered as a management measure to minimize the impact of purse-seine fishing on tuna.

  1. The use of a priori information in ICA-based techniques for real-time fMRI: an evaluation of static/dynamic and spatial/temporal characteristics

    Directory of Open Access Journals (Sweden)

    Nicola eSoldati

    2013-03-01

    Full Text Available Real-time brain functional MRI (rt-fMRI allows in-vivo non-invasive monitoring of neural networks. The use of multivariate data-driven analysis methods such as independent component analysis (ICA offers an attractive trade-off between data interpretability and information extraction, and can be used during both task-based and rest experiments. The purpose of this study was to assess the effectiveness of different ICA-based procedures to monitor in real-time a target IC defined from a functional localizer which also used ICA. Four novel methods were implemented to monitor ongoing brain activity in a sliding window approach. The methods differed in the ways in which a priori information, derived from ICA algorithms, was used to monitor a target independent component (IC. We implemented four different algorithms, all based on ICA. One Back-projection method used ICA to derive static spatial information from the functional localizer, off line, which was then back-projected dynamically during the real-time acquisition. The other three methods used real-time ICA algorithms that dynamically exploited temporal, spatial, or spatial-temporal priors during the real-time acquisition. The methods were evaluated by simulating a rt-fMRI experiment that used real fMRI data. The performance of each method was characterized by the spatial and/or temporal correlation with the target IC component monitored, computation time and intrinsic stochastic variability of the algorithms. In this study the Back-projection method, which could monitor more than one IC of interest, outperformed the other methods. These results are consistent with a functional task that gives stable target ICs over time. The dynamic adaptation possibilities offered by the other ICA methods proposed may offer better performance than the Back-projection in conditions where the functional activation shows higher spatial and/or temporal variability.
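The static Back-projection idea can be sketched as follows. All data here are synthetic stand-ins, not the study's pipeline: a spatial map obtained offline (in the study, from ICA of the localizer) is regressed onto each sliding window of new data to recover the target IC's ongoing time course.

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox = 500
spatial_map = rng.standard_normal(n_vox)            # target IC map (offline)
true_tc = np.sin(np.linspace(0, 6, 40))             # hidden time course
window = np.outer(true_tc, spatial_map)             # time x voxels window
window += 0.1 * rng.standard_normal(window.shape)   # acquisition noise
# Least-squares back-projection of the fixed map onto the window:
est_tc = window @ spatial_map / (spatial_map @ spatial_map)
```

Because the map is fixed, each new window costs only one matrix-vector product, which is what makes the approach attractive for real-time use.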

  2. Terminology, the importance of defining.

    Science.gov (United States)

    van Mil, J W Foppe; Henman, Martin

    2016-06-01

    Multiple terms and definitions exist to describe specific aspects of pharmacy practice and service provision. This commentary explores the reasons for different interpretations of words and concepts in pharmaceutical care and pharmacy practice research. Reasons for this variation can be found in language, culture, profession and may also depend on developments over time. A list of words is provided where the authors think that currently multiple interpretations are possible. To make sure that the reader understands the essence, it seems imperative that authors include a definition of the topics that they actually study in their papers, and that they clearly cite existing definitions or refer to collections of definitions such as existing glossaries. It is important that presenters, authors and reviewers of pharmacy practice papers pay more attention to this aspect of describing studies.

  3. Marketing Reference Services.

    Science.gov (United States)

    Norman, O. Gene

    1995-01-01

    Relates the marketing concept to library reference services. Highlights include a review of the literature and an overview of marketing, including research, the marketing mix, strategic plan, marketing plan, and marketing audit. Marketing principles are applied to reference services through the marketing mix elements of product, price, place, and…

  4. Reference class forecasting

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent

    optimism and misinformation. RCF builds on theories for which Daniel Kahneman won the Nobel Prize in economics in 2002. RCF estimates the budget for a given project on the basis of the actual budget outcomes in a reference class of projects. RCF is carried out in three steps: 1. Identification of a relevant reference...

  5. Defining the "normal" postejaculate urinalysis.

    Science.gov (United States)

    Mehta, Akanksha; Jarow, Jonathan P; Maples, Pat; Sigman, Mark

    2012-01-01

    Although sperm have been shown to be present in the postejaculate urinalysis (PEU) of both fertile and infertile men, the number of sperm present in the PEU of the general population has never been well defined. The objective of this study was to describe the semen and PEU findings in both the general and infertile population, in order to develop a better appreciation for "normal." Infertile men (n = 77) and control subjects (n = 71) were prospectively recruited. Exclusion criteria included azoospermia and medications known to affect ejaculation. All men underwent a history, physical examination, semen analysis, and PEU. The urine was split into 2 containers: PEU1, the initial voided urine, and PEU2, the remaining voided urine. Parametric statistical methods were applied for data analysis to compare sperm concentrations in each sample of semen and urine between the 2 groups of men. Controls had higher average semen volume (3.3 ± 1.6 vs 2.0 ± 1.4 mL) and higher sperm concentrations (112 million vs 56.2 million, P = .011) compared with infertile men. The presence of sperm in urine was common in both groups, but more prevalent among infertile men (98.7% vs 88.7%, P = .012), in whom it comprised a greater proportion of the total sperm count (46% vs 24%, P = .022). The majority of sperm present in PEU were seen in PEU1 of both controls (69%) and infertile men (88%). An association was noted between severe oligospermia and low sperm counts in PEU. Although infertile men had more sperm in the urine compared with controls, there is a large degree of overlap between the 2 populations, making it difficult to identify a specific threshold to define a positive test. Interpretation of a PEU should be directed by whether the number of sperm in the urine could affect subsequent management.

  6. Miniature EVA Software Defined Radio

    Science.gov (United States)

    Pozhidaev, Aleksey

    2012-01-01

    As NASA embarks upon developing the Next-Generation Extra Vehicular Activity (EVA) Radio for deep space exploration, the demands on EVA battery life will substantially increase. The number of modes and frequency bands required will continue to grow in order to enable efficient and complex multi-mode operations including communications, navigation, and tracking applications. Whether conducting astronaut excursions, communicating to soldiers, or first responders responding to emergency hazards, NASA has developed an innovative, affordable, miniaturized, power-efficient software defined radio that offers unprecedented flexibility. This lightweight, programmable, S-band, multi-service, frequency-agile EVA software defined radio (SDR) supports data, telemetry, voice, and both standard and high-definition video. Features include a modular design and an easily scalable architecture, and the EVA SDR allows for both stationary and mobile battery-powered handheld operations. Currently, the radio is equipped with an S-band RF section. However, its scalable architecture can accommodate multiple RF sections simultaneously to cover multiple frequency bands. The EVA SDR also supports multiple network protocols. It currently implements a Hybrid Mesh Network based on the 802.11s open standard protocol. The radio targets RF channel data rates up to 20 Mbps and can be equipped with a real-time operating system (RTOS) that can be switched off for power-aware applications. The EVA SDR's modular design permits implementation of the same hardware at all Network Nodes concept. This approach assures the portability of the same software into any radio in the system. It also brings several benefits to the entire system including reducing system maintenance, system complexity, and development cost.

  7. Estimating clinical chemistry reference values based on an existing data set of unselected animals.

    Science.gov (United States)

    Dimauro, Corrado; Bonelli, Piero; Nicolussi, Paola; Rassu, Salvatore P G; Cappio-Borlino, Aldo; Pulina, Giuseppe

    2008-11-01

    In an attempt to standardise the determination of biological reference values, the International Federation of Clinical Chemistry (IFCC) has published a series of recommendations on developing reference intervals. The IFCC recommends the use of an a priori sampling of at least 120 healthy individuals. However, such a high number of samples and laboratory analysis is expensive, time-consuming and not always feasible, especially in veterinary medicine. In this paper, an alternative (a posteriori) method is described and is used to determine reference intervals for biochemical parameters of farm animals using an existing laboratory data set. The method used was based on the detection and removal of outliers to obtain a large sample of animals likely to be healthy from the existing data set. This allowed the estimation of reliable reference intervals for biochemical parameters in Sarda dairy sheep. This method may also be useful for the determination of reference intervals for different species, ages and gender.
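The a posteriori idea, screening an unselected laboratory data set for outliers and treating the central mass as presumably healthy, can be sketched like this. Tukey fences stand in for the paper's outlier-detection rule, which is not detailed here, and all numbers are synthetic:

```python
import statistics

def reference_interval(values, k=1.5):
    # A posteriori sketch: drop Tukey-fence outliers from an unselected data
    # set, then take the central 95% of the remaining results as the
    # reference interval (2.5th to 97.5th percentile).
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    kept = [v for v in values if q1 - k * iqr <= v <= q3 + k * iqr]
    cuts = statistics.quantiles(kept, n=40)  # cut points every 2.5%
    return cuts[0], cuts[-1]                 # 2.5th and 97.5th percentiles

# Synthetic example: 100 plausible results plus two gross outliers.
lo, hi = reference_interval([i / 10 for i in range(100)] + [50.0, 60.0])
```

The appeal of the approach is exactly what the abstract notes: no a priori recruitment of 120 healthy individuals is needed, only data the laboratory already holds.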

  8. Uranium reference materials

    International Nuclear Information System (INIS)

    Donivan, S.; Chessmore, R.

    1987-07-01

    The Technical Measurements Center has prepared uranium mill tailings reference materials for use by remedial action contractors and cognizant federal and state agencies. Four materials were prepared with varying concentrations of radionuclides, using three tailings materials and a river-bottom soil diluent. All materials were ground, dried, and blended thoroughly to ensure homogeneity. The analyses on which the recommended values for nuclides in the reference materials are based were performed, using independent methods, by the UNC Geotech (UNC) Chemistry Laboratory, Grand Junction, Colorado, and by C.W. Sill (Sill), Idaho National Engineering Laboratory, Idaho Falls, Idaho. Several statistical tests were performed on the analytical data to characterize the reference materials. Results of these tests reveal that the four reference materials are homogeneous and that no large systematic bias exists between the analytical methods used by Sill and those used by TMC. The average values for radionuclides of the two data sets, representing an unbiased estimate, were used as the recommended values for concentrations of nuclides in the reference materials. The recommended concentrations of radionuclides in the four reference materials are provided. Use of these reference materials will aid in providing uniform standardization among measurements made by remedial action contractors. 11 refs., 9 tabs

  9. Comparative study and implementation of image reconstruction methods for positron emission tomography: interest of taking a priori information into account; Etude comparative et implementation de methodes de reconstruction d`images pour la tomographie par emission de positons: interet de la prise en compte d`informations a priori

    Energy Technology Data Exchange (ETDEWEB)

    Bouchet, F.

    1996-09-25

    Positron emission tomography aims to explore an organ by injecting a radiotracer and producing a two-dimensional representation using reconstruction techniques. The method most used in routine practice is filtered back-projection, which gives smoothed images. This work presents a comparative study of newer techniques. Contour-preserving methods are studied here; the idea is to use NMR imaging as a priori information. Two image reconstruction techniques are examined in particular: resolution by pseudo-inverse and the Bayesian method. (N.C.).
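Of the two reconstruction approaches named in the record, the pseudo-inverse one reduces to solving the linear projection model y = A x. A toy sketch with a made-up projection matrix (not a real PET system matrix):

```python
import numpy as np

# 12 projection rays over a 3x3 image, flattened to a 9-vector.
rng = np.random.default_rng(0)
A = rng.random((12, 9))              # hypothetical projection matrix
x_true = np.arange(9, dtype=float)   # "image" as a vector
y = A @ x_true                       # noiseless projections
x_hat = np.linalg.pinv(A) @ y        # Moore-Penrose pseudo-inverse solution
```

With noiseless data and a full-column-rank A the pseudo-inverse recovers the image exactly; the Bayesian method the record mentions becomes attractive precisely when noise and priors (such as the NMR contours) must be taken into account.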

  10. Reference in human and non-human primate communication: What does it take to refer?

    Science.gov (United States)

    Sievers, Christine; Gruber, Thibaud

    2016-07-01

    The concept of functional reference has been used to isolate potentially referential vocal signals in animal communication. However, its relatedness to the phenomenon of reference in human language has recently been brought into question. While some researchers have suggested abandoning the concept of functional reference altogether, others advocate a revision of its definition to include contextual cues that play a role in signal production and perception. Empirical and theoretical work on functional reference has also put much emphasis on how the receiver understands the referential signal. However, reference, as defined in the linguistic literature, is an action of the producer, and therefore, any definition describing reference in non-human animals must also focus on the producer. To successfully determine whether a signal is used to refer, we suggest an approach from the field of pragmatics, taking a closer look at specific situations of signal production, specifically at the factors that influence the production of a signal by an individual. We define the concept of signaller's reference to identify intentional acts of reference produced by a signaller independently of the communicative modality, and illustrate it with a case study of the hoo vocalizations produced by wild chimpanzees during travel. This novel framework introduces an intentional approach to referentiality. It may therefore permit a closer comparison of human and non-human animal referential behaviour and underlying cognitive processes, allowing us to identify what may have emerged solely in the human lineage.

  11. STL pocket reference

    CERN Document Server

    Lischner, Ray

    2003-01-01

    The STL Pocket Reference describes the functions, classes, and templates in that part of the C++ standard library often referred to as the Standard Template Library (STL). The STL encompasses containers, iterators, algorithms, and function objects, which collectively represent one of the most important and widely used subsets of standard library functionality. The C++ standard library, even the subset known as the STL, is vast. It's next to impossible to work with the STL without some sort of reference at your side to remind you of template parameters, function invocations, return types--ind

  12. Handbook of reference electrodes

    CERN Document Server

    Inzelt, György; Scholz, Fritz

    2013-01-01

    Reference Electrodes are a crucial part of any electrochemical system, yet an up-to-date and comprehensive handbook is long overdue. Here, an experienced team of electrochemists provides an in-depth source of information and data for the proper choice and construction of reference electrodes. This includes all kinds of applications such as aqueous and non-aqueous solutions, ionic liquids, glass melts, solid electrolyte systems, and membrane electrodes. Advanced technologies such as miniaturized, conducting-polymer-based, screen-printed or disposable reference electrodes are also covered. Essen

  13. Regular Expression Pocket Reference

    CERN Document Server

    Stubblebine, Tony

    2007-01-01

    This handy little book offers programmers a complete overview of the syntax and semantics of regular expressions that are at the heart of every text-processing application. Ideal as a quick reference, Regular Expression Pocket Reference covers the regular expression APIs for Perl 5.8, Ruby (including some upcoming 1.9 features), Java, PHP, .NET and C#, Python, vi, JavaScript, and the PCRE regular expression libraries. This concise and easy-to-use reference puts a very powerful tool for manipulating text and data right at your fingertips. Composed of a mixture of symbols and text, regular exp

  14. Neptunium: a bibliographic reference

    International Nuclear Information System (INIS)

    Mosley, R.E.

    1979-06-01

    A comprehensive bibliography of the literature on the element neptunium published prior to January 1976 is presented. A short abstract is given for each listed reference, with a few exceptions. The references are divided into sections categorized as General, Man-Made Sources (Reactors), Man-Made Sources (Fuel Reprocessing), Chemistry (Solubility), Chemistry (Compounds), Chemistry (Isotopes), Analyses (Instrumental), Analyses (Chemical), Chemical (Animal), Biological (Effects), Biological (Animal-Metabolism-Retention), Biological (Air Movement), Biological (Human Inhalation), Measurement, and Dosimetry. The bibliography contains author and keyword indexes and was compiled to serve as a quick reference source for neptunium-related work. 184 citations

  15. CSS Pocket Reference

    CERN Document Server

    Meyer, Eric

    2011-01-01

    When you're working with CSS and need a quick answer, CSS Pocket Reference delivers. This handy, concise book provides all of the essential information you need to implement CSS on the fly. Ideal for intermediate to advanced web designers and developers, the 4th edition is revised and updated for CSS3, the latest version of the Cascading Style Sheet specification. Along with a complete alphabetical reference to CSS3 selectors and properties, you'll also find a short introduction to the key concepts of CSS. Based on Cascading Style Sheets: The Definitive Guide, this reference is an easy-to-us

  16. Biomedical Engineering Desk Reference

    CERN Document Server

    Ratner, Buddy D; Schoen, Frederick J; Lemons, Jack E; Dyro, Joseph; Martinsen, Orjan G; Kyle, Richard; Preim, Bernhard; Bartz, Dirk; Grimnes, Sverre; Vallero, Daniel; Semmlow, John; Murray, W Bosseau; Perez, Reinaldo; Bankman, Isaac; Dunn, Stanley; Ikada, Yoshito; Moghe, Prabhas V; Constantinides, Alkis

    2009-01-01

    A one-stop Desk Reference, for Biomedical Engineers involved in the ever expanding and very fast moving area; this is a book that will not gather dust on the shelf. It brings together the essential professional reference content from leading international contributors in the biomedical engineering field. Material covers a broad range of topics including: Biomechanics and Biomaterials; Tissue Engineering; and Biosignal Processing* A hard-working desk reference providing all the essential material needed by biomedical and clinical engineers on a day-to-day basis * Fundamentals, key techniques,

  17. LINQ Pocket Reference

    CERN Document Server

    Albahari, Joseph

    2008-01-01

    Ready to take advantage of LINQ with C# 3.0? This guide has the detail you need to grasp Microsoft's new querying technology, and concise explanations to help you learn it quickly. And once you begin to apply LINQ, the book serves as an on-the-job reference when you need immediate reminders. All the examples in the LINQ Pocket Reference are preloaded into LINQPad, the highly praised utility that lets you work with LINQ interactively. Created by the authors and free to download, LINQPad will not only help you learn LINQ, it will have you thinking in LINQ. This reference explains: LINQ's ke

  18. R quick syntax reference

    CERN Document Server

    Tollefson, Margot

    2014-01-01

    The R Quick Syntax Reference is a handy reference book detailing the intricacies of the R language. Not only is R a free, open-source tool, R is powerful, flexible, and has state of the art statistical techniques available. With the many details which must be correct when using any language, however, the R Quick Syntax Reference makes using R easier.Starting with the basic structure of R, the book takes you on a journey through the terminology used in R and the syntax required to make R work. You will find looking up the correct form for an expression quick and easy. With a copy of the R Quick

  19. A new birthweight reference in Guangzhou, southern China, and its comparison with the global reference.

    Science.gov (United States)

    He, Jian-Rong; Xia, Hui-Min; Liu, Yu; Xia, Xiao-Yan; Mo, Wei-Jian; Wang, Ping; Cheng, Kar Keung; Leung, Gabriel M; Feng, Qiong; Schooling, C Mary; Qiu, Xiu

    2014-12-01

    To formulate a new birthweight reference for different gestational ages in Guangzhou, southern China, and compare it with the currently used reference in China and the global reference. All singleton live births of more than 26 weeks' gestational age recorded in the Guangzhou Perinatal Health Care and Delivery Surveillance System for the years 2009, 2010 and 2011 (n=510 837) were retrospectively included in the study. In addition, the study sample was supplemented by all singleton live births (n=3538) at gestational ages 26-33 weeks from 2007 and 2008. We used Gaussian mixture models and robust regression to exclude outliers of birth weight and then applied Generalized Additive Models for Location, Scale, and Shape (GAMLSS) to generate smoothed percentile curves separately for gender and parity. Of infants defined as small for gestational age (SGA) in the new reference, 15.3-47.7% (depending on gestational age) were considered appropriate for gestational age (AGA) by the currently used reference of China. Of the infants defined as SGA by the new reference, 9.2% with gestational ages 34-36 weeks and 14.3% with 37-41 weeks were considered AGA by the global reference. At the 50th centile line, the new reference curve was similar to that of the global reference for gestational ages 26-33 weeks and above the global reference for 34-40 weeks. The new birthweight reference based on birthweight data for neonates in Guangzhou, China, differs from the reference currently used in China and the global reference, and appears to be more relevant to the local population. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
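The SGA comparisons in the record boil down to checking a weight against a percentile cut-off for the newborn's gestational-age/sex/parity stratum. A minimal sketch with synthetic weights (the paper's GAMLSS-smoothed curves are far more sophisticated than this raw empirical percentile):

```python
import statistics

def is_sga(weight_g, stratum_weights):
    # SGA = birthweight below the 10th percentile of the reference
    # distribution for the newborn's stratum (illustrative rule only).
    p10 = statistics.quantiles(stratum_weights, n=10)[0]
    return weight_g < p10

# Hypothetical 39-week stratum (synthetic numbers, not Guangzhou data).
stratum = [2500 + 15 * i for i in range(100)]
```

Swapping the reference distribution, as the paper does between the new, the Chinese, and the global references, changes p10 and therefore reclassifies borderline infants between SGA and AGA.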

  20. Transformations between inertial and linearly accelerated frames of reference

    International Nuclear Information System (INIS)

    Ashworth, D.G.

    1983-01-01

    Transformation equations between inertial and linearly accelerated frames of reference are derived and these transformation equations are shown to be compatible, where applicable, with those of special relativity. The physical nature of an accelerated frame of reference is unambiguously defined by means of an equation which relates the velocity of all points within the accelerated frame of reference to measurements made in an inertial frame of reference. (author)

  1. Defining safety goals. 2. Basic Consideration on Defining Safety Goals

    International Nuclear Information System (INIS)

    Hakata, T.

    2001-01-01

cancer and severe hereditary effects are 10 × 10^-2/Sv and 1.3 × 10^-2/Sv, respectively. The basic safety goals can be expressed by the complementary cumulative distribution function (CCDF) of dose versus frequencies of events: Pc(C > Cp) = (Cp/Co)^-α. The aversion factor α is here expressed by the following arbitrary equation, which gives a polynomial curve of order m on a logarithmic plane: α = a + b(log(Cp/Co))^m, where: Pc = CCDF frequency for Cp (/yr), Cp = dose (mSv), Co = Cp for Pc = 1, and a, b, m = constants. Figure 1 shows a typical tolerable risk profile (risk limit curve), drawn so that all the points obtained in the previous discussions lie above the curve (Co = 1, a = 1, b = 0.0772, and m = 2). Safety criteria by ANS (Ref. 2) and SHE (Ref. 3) are shown in Fig. 1 for comparison. An aversion of a factor of 2 results between 1 mSv and 1 Sv. No ALARA is included, which must be considered in defining specific safety goals. The frequency of a single class of events must be lower than the CCDF profile, and a curve lower by a factor of 10 is drawn in Fig. 1. The doses referenced in the current Japanese safety guidelines and site criteria are shown in Fig. 1. The referenced doses seem reasonable, considering the conservatism in the analysis of design-basis accidents. Specific safety goals for each sort of facility can be defined based on the basic safety goals, reflecting the characteristics of the facilities and considering ALARA. Indexes in engineering terms, such as CMF and LERF, are preferable for nuclear power plants, although interpretation from dose to the engineering terms is needed. Other indexes may be used (such as the frequency of criticality accidents) for facilities other than power plants. The applicability of safety goals will thus be improved. Figure 2 shows the relative risk factors (1, 1%, and 0.1%) versus the severity of radiation effects. This might indicate the adequacy of the risk factors. 
The absolute risk limits, which
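The risk-limit curve described in this record can be sketched directly from the quoted constants. Assumptions: Pc(C > Cp) = (Cp/Co)^-α with α = a + b(log(Cp/Co))^m, Co = 1 mSv, a = 1, b = 0.0772, m = 2, and "log" taken as base 10, consistent with the logarithmic plane the abstract describes.

```python
import math

# Sketch of the tolerable-risk (CCDF) profile from the abstract:
#   Pc(C > Cp) = (Cp/Co)**(-alpha),  alpha = a + b*(log10(Cp/Co))**m
# Base-10 log is an assumption; constants are the quoted Co=1 mSv,
# a=1, b=0.0772, m=2.

def alpha(cp_msv, co=1.0, a=1.0, b=0.0772, m=2):
    return a + b * math.log10(cp_msv / co) ** m

def ccdf_frequency(cp_msv, co=1.0):
    return (cp_msv / co) ** (-alpha(cp_msv, co))

print(ccdf_frequency(1.0))           # 1.0 per year at Co = 1 mSv, by construction
print(f"{ccdf_frequency(1000.0):.1e}")  # tolerable frequency at 1 Sv, ~8.2e-06 /yr
```

The single-event-class curve the abstract mentions would simply be this profile divided by 10.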

  2. Optimal primitive reference frames

    International Nuclear Information System (INIS)

    Jennings, David

    2011-01-01

We consider the smallest possible directional reference frames allowed and determine the best one can ever do in preserving quantum information in various scenarios. We find that for the preservation of a single spin state, two orthogonal spins are optimal primitive reference frames; and in a product state, they do approximately 22% as well as an infinite-sized classical frame. By adding a small amount of entanglement to the reference frame, this can be raised to 2(2/3)^5 ≈ 26%. Under the different criterion of entanglement preservation, a very similar optimal reference frame is found; however, this time it is for spins aligned at an optimal angle of 87 deg. In this case 24% of the negativity is preserved. The classical limit is considered numerically, and indicates, under the criterion of entanglement preservation, that 90 deg. is selected out nonmonotonically, with a peak optimal angle of 96.5 deg. for L=3 spins.
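As a quick arithmetic check on the entangled-frame figure quoted in this record, 2(2/3)^5 evaluates to about 0.263, i.e. the stated 26%:

```python
# The fraction quoted for the slightly entangled primitive reference frame.
frac = 2 * (2 / 3) ** 5
print(round(frac, 3))  # 0.263, i.e. about 26%
```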

  3. Reference Climatological Stations

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Reference Climatological Stations (RCS) network represents the first effort by NOAA to create and maintain a nationwide network of stations located only in areas...

  4. Toxicity Reference Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Toxicity Reference Database (ToxRefDB) contains approximately 30 years and $2 billion worth of animal studies. ToxRefDB allows scientists and the interested...

  5. Python essential reference

    CERN Document Server

    Beazley, David M

    2009-01-01

Python Essential Reference is the definitive reference guide to the Python programming language — the one authoritative handbook that reliably untangles and explains both the core Python language and the most essential parts of the Python library. Designed for the professional programmer, the book is concise, to the point, and highly accessible. It also includes detailed information on the Python library and many advanced subjects that are not available in either the official Python documentation or any other single reference source. Thoroughly updated to reflect the significant new programming language features and library modules that have been introduced in Python 2.6 and Python 3, the fourth edition of Python Essential Reference is the definitive guide for programmers who need to modernize existing Python code or who are planning an eventual migration to Python 3. Programmers starting a new Python project will find detailed coverage of contemporary Python programming idioms.

  6. Collaborative networks: Reference modeling

    NARCIS (Netherlands)

    Camarinha-Matos, L.M.; Afsarmanesh, H.

    2008-01-01

    Collaborative Networks: Reference Modeling works to establish a theoretical foundation for Collaborative Networks. Particular emphasis is put on modeling multiple facets of collaborative networks and establishing a comprehensive modeling framework that captures and structures diverse perspectives of

  7. Ozone Standard Reference Photometer

    Data.gov (United States)

    Federal Laboratory Consortium — The Standard Reference Photometer (SRP) Program began in the early 1980s as collaboration between NIST and the U.S. Environmental Protection Agency (EPA) to design,...

  8. Enterprise Reference Library

    Science.gov (United States)

    Bickham, Grandin; Saile, Lynn; Havelka, Jacque; Fitts, Mary

    2011-01-01

Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to those individuals residing onsite or through a librarian's search. Many individuals have rich collections of references, but no mechanisms exist to share reference libraries across researchers, projects, or directorates. Likewise, information regarding which references are provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information. The Integrated Medical Model (IMM) team has utilized a centralized reference management tool during the development, test, and operational phases of this project. The Enterprise Reference Library project expands the capabilities developed for IMM to address the above issues and enhance collaboration across JSC. Method: After significant market analysis for a multi-user reference management tool, no available commercial tool was found to meet this need, so a software program was built around a commercial tool, Reference Manager 12 by The Thomson Corporation. A use case approach guided the requirements development phase. The premise of the design is that individuals use their own reference management software and export to SharePoint when their library is incorporated into the Enterprise Reference Library. This results in a searchable user-specific library application. An accompanying share folder will warehouse the electronic full-text articles, which allows the global user community to access full-text articles. Discussion: An enterprise reference library solution can provide a multidisciplinary collection of full-text articles. 
This approach improves efficiency in obtaining and storing reference material while greatly reducing labor, purchasing and

  9. Comparability of reference values

    International Nuclear Information System (INIS)

    Rossbach, M.; Stoeppler, M.

    1993-01-01

    Harmonization of certified values in Reference Materials (RMs) can be carried out by applying nuclear analytical techniques to RMs of various matrix types and concentration levels. Although RMs generally should not be used as primary standards the cross evaluation of concentrations in RMs leads to better compatibility of reference values and thus to a greater agreement between analytical results from different laboratories using these RMs for instrument calibration and quality assurance. (orig.)

  10. Electronics engineer's reference book

    CERN Document Server

    Turner, L W

    1976-01-01

    Electronics Engineer's Reference Book, 4th Edition is a reference book for electronic engineers that reviews the knowledge and techniques in electronics engineering and covers topics ranging from basics to materials and components, devices, circuits, measurements, and applications. This edition is comprised of 27 chapters; the first of which presents general information on electronics engineering, including terminology, mathematical equations, mathematical signs and symbols, and Greek alphabet and symbols. Attention then turns to the history of electronics; electromagnetic and nuclear radiatio

  11. 2002 reference document

    International Nuclear Information System (INIS)

    2002-01-01

    This 2002 reference document of the group Areva, provides information on the society. Organized in seven chapters, it presents the persons responsible for the reference document and for auditing the financial statements, information pertaining to the transaction, general information on the company and share capital, information on company operation, changes and future prospects, assets, financial position, financial performance, information on company management and executive board and supervisory board, recent developments and future prospects. (A.L.B.)

  12. Defining Tobacco Regulatory Science Competencies.

    Science.gov (United States)

    Wipfli, Heather L; Berman, Micah; Hanson, Kacey; Kelder, Steven; Solis, Amy; Villanti, Andrea C; Ribeiro, Carla M P; Meissner, Helen I; Anderson, Roger

    2017-02-01

    In 2013, the National Institutes of Health and the Food and Drug Administration funded a network of 14 Tobacco Centers of Regulatory Science (TCORS) with a mission that included research and training. A cross-TCORS Panel was established to define tobacco regulatory science (TRS) competencies to help harmonize and guide their emerging educational programs. The purpose of this paper is to describe the Panel's work to develop core TRS domains and competencies. The Panel developed the list of domains and competencies using a semistructured Delphi method divided into four phases occurring between November 2013 and August 2015. The final proposed list included a total of 51 competencies across six core domains and 28 competencies across five specialized domains. There is a need for continued discussion to establish the utility of the proposed set of competencies for emerging TRS curricula and to identify the best strategies for incorporating these competencies into TRS training programs. Given the field's broad multidisciplinary nature, further experience is needed to refine the core domains that should be covered in TRS training programs versus knowledge obtained in more specialized programs. Regulatory science to inform the regulation of tobacco products is an emerging field. The paper provides an initial list of core and specialized domains and competencies to be used in developing curricula for new and emerging training programs aimed at preparing a new cohort of scientists to conduct critical TRS research. © The Author 2016. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Identification of dietary patterns associated with obesity in a nationally representative survey of Canadian adults: application of a priori, hybrid, and simplified dietary pattern techniques.

    Science.gov (United States)

    Jessri, Mahsa; Wolfinger, Russell D; Lou, Wendy Y; L'Abbé, Mary R

    2017-03-01

Background: Analyzing the effects of dietary patterns is an important approach for examining the complex role of nutrition in the etiology of obesity and chronic diseases. Objectives: The objectives of this study were to characterize the dietary patterns of Canadians with the use of a priori, hybrid, and simplified dietary pattern techniques, and to compare the associations of these patterns with obesity risk in individuals with and without chronic diseases (unhealthy and healthy obesity). Design: Dietary recalls from 11,748 participants (≥18 y of age) in the cross-sectional, nationally representative Canadian Community Health Survey 2.2 were used. The a priori dietary pattern was characterized with the use of the previously validated 2015 Dietary Guidelines for Americans Adherence Index (DGAI). Weighted partial least squares (hybrid method) was used to derive an energy-dense (ED), high-fat (HF), low-fiber-density (LFD) dietary pattern with the use of 38 food groups. The associations of derived dietary patterns with disease outcomes were then tested with the use of multinomial logistic regression. Results: The ED, HF, and LFD dietary pattern had high positive loadings for fast foods, carbonated drinks, and refined grains, and high negative loadings for whole fruits and vegetables (≥|0.17|). Food groups with a high loading were summed to form a simplified dietary pattern score. Moving from the first (healthiest) to the fourth (least healthy) quartile of the ED, HF, and LFD pattern and the simplified dietary pattern scores was associated with increasingly elevated ORs for unhealthy obesity, with individuals in quartile 4 having ORs of 2.57 (95% CI: 1.75, 3.76) and 2.73 (95% CI: 1.88, 3.98), respectively (P-trend significant). The associations of the dietary patterns with healthy obesity and unhealthy nonobesity were weaker, albeit significant. Conclusions: Consuming an ED, HF, and LFD dietary pattern and lack of adherence to the recommendations of the 2015 DGAI were associated with

  14. Pasta production: complexity in defining processing conditions for reference trials and quality assessment models

    Science.gov (United States)

    Pasta is a simple food made from water and durum wheat (Triticum turgidum subsp. durum) semolina. As pasta increases in popularity, studies have endeavored to analyze the attributes that contribute to high quality pasta. Despite being a simple food, the laboratory scale analysis of pasta quality is ...

  15. Toward modular biological models: defining analog modules based on referent physiological mechanisms.

    Science.gov (United States)

    Petersen, Brenden K; Ropella, Glen E P; Hunt, C Anthony

    2014-08-16

Currently, most biomedical models exist in isolation. It is often difficult to reuse or integrate models or their components, in part because they are not modular. Modular components allow the modeler to think more deeply about the role of the model and to more completely address a modeling project's requirements. In particular, modularity facilitates component reuse and model integration for models with different use cases, including the ability to exchange modules during or between simulations. The heterogeneous nature of biology and vast range of wet-lab experimental platforms call for modular models designed to satisfy a variety of use cases. We argue that software analogs of biological mechanisms are reasonable candidates for modularization. Biomimetic software mechanisms comprised of physiomimetic mechanism modules offer benefits that are unique or especially important to multi-scale, biomedical modeling and simulation. We present a general, scientific method of modularizing mechanisms into reusable software components that we call physiomimetic mechanism modules (PMMs). PMMs utilize parametric containers that partition and expose state information into physiologically meaningful groupings. To demonstrate, we modularize four pharmacodynamic response mechanisms adapted from an in silico liver (ISL). We verified the modularization process by showing that drug clearance results from in silico experiments are identical before and after modularization. The modularized ISL achieves validation targets drawn from propranolol outflow profile data. In addition, an in silico hepatocyte culture (ISHC) is created. The ISHC uses the same PMMs and required no refactoring. The ISHC achieves validation targets drawn from propranolol intrinsic clearance data exhibiting considerable between-lab variability. The data used as validation targets for PMMs originate from both in vitro and in vivo experiments exhibiting large fold differences in time scale. 
This report demonstrates the feasibility of PMMs and their usefulness across multiple model use cases. The pharmacodynamic response module developed here is robust to changes in model context and flexible in its ability to achieve validation targets in the face of considerable experimental uncertainty. Adopting the modularization methods presented here is expected to facilitate model reuse and integration, thereby accelerating the pace of biomedical research.

  16. Observações aos prolegômenos da teoria kantiana dos juízos jurídicos a priori em Rechtslehre

    Directory of Open Access Journals (Sweden)

    Fábio César Scherer

    2010-12-01

Full Text Available In this article the Kantian Rechtslehre is interpreted as a critical juridical doctrine, understandable within the critical project begun in the Kritik der reinen Vernunft and adapted to the practical field in the Kritik der praktischen Vernunft. In particular, the aim is to highlight, besides the apriority, the systematic character and the search for completeness of the juridical principles, the use of the theory of the solubility of the problems of reason in general in the prolegomena of the Rechtslehre. The study of this preliminary part is justified by its presentation of the supreme division of the system according to principles, from which a division of the doctrine of law is derived; this determines the object (Gegenstand), and therefore the field, of this particular science, as well as the discussion of the research procedure. Such an a priori framing of the doctrine of law is the basis of the subsequent Kantian theory of private and public law. In a broader picture, this article can be understood as a rejection of the idea that the Kantian Rechtslehre does not meet the requirements of the critical philosophy, a view created by Hermann Cohen (Ethik des reinen Willens, 1904) and elaborated by Christian Ritter (Der Rechtsgedanke Kants nach den frühen Quellen, 1971).

  17. Reference costs of electricity

    International Nuclear Information System (INIS)

    Terraz, N.

    1997-01-01

    The calculation of electric power production reference costs is used in France, even in the present case of over-capacity, for comparing the relative interest of the various means of power generation (nuclear plants, coal plants, hydroelectricity, gas combined cycles, etc.) and as an aid for future investment decisions. Reference costs show a sharp decrease between 1993 and 1997 due to advancements in nuclear plant operating ability and fossil fuel price decrease. Actuarial rates, plant service life, fuel costs and exchange rates are important parameters. The various costs from the research stage to the waste processing stages are discussed and the reference costs of the various power generation systems are presented and compared together with their competitiveness; the future of wind energy and cogeneration and the prospective of the renewal of nuclear plants at the 2010 horizon are also addressed

  18. Setting reference targets

    International Nuclear Information System (INIS)

    Ruland, R.E.

    1997-04-01

    Reference Targets are used to represent virtual quantities like the magnetic axis of a magnet or the definition of a coordinate system. To explain the function of reference targets in the sequence of the alignment process, this paper will first briefly discuss the geometry of the trajectory design space and of the surveying space, then continue with an overview of a typical alignment process. This is followed by a discussion on magnet fiducialization. While the magnetic measurement methods to determine the magnetic centerline are only listed (they will be discussed in detail in a subsequent talk), emphasis is given to the optical/mechanical methods and to the task of transferring the centerline position to reference targets

  19. Electrical engineer's reference book

    CERN Document Server

    Jones, G R

    2013-01-01

    A long established reference book: radical revision for the fifteenth edition includes complete rearrangement to take in chapters on new topics and regroup the subjects covered for easy access to information.The Electrical Engineer's Reference Book, first published in 1945, maintains its original aims: to reflect the state of the art in electrical science and technology and cater for the needs of practising engineers. Most chapters have been revised and many augmented so as to deal properly with both fundamental developments and new technology and applications that have come to the fore since

  20. Python pocket reference

    CERN Document Server

    Lutz, Mark

    2010-01-01

    This is the book to reach for when you're coding on the fly and need an answer now. It's an easy-to-use reference to the core language, with descriptions of commonly used modules and toolkits, and a guide to recent changes, new features, and upgraded built-ins -- all updated to cover Python 3.X as well as version 2.6. You'll also quickly find exactly what you need with the handy index. Written by Mark Lutz -- widely recognized as the world's leading Python trainer -- Python Pocket Reference, Fourth Edition, is the perfect companion to O'Reilly's classic Python tutorials, also written by Mark

  1. The Reference Return Ratio

    DEFF Research Database (Denmark)

    Nicolaisen, Jeppe; Faber Frandsen, Tove

    2008-01-01

The paper introduces a new journal impact measure called the Reference Return Ratio (3R). Unlike the traditional Journal Impact Factor (JIF), which is based on calculations of publications and citations, the new measure is based on calculations of bibliographic investments (references) and returns (citations). A comparative study of the two measures shows a strong relationship between the 3R and the JIF. Yet, the 3R appears to correct for citation habits, citation dynamics, and composition of document types - problems that typically are raised against the JIF. In addition, contrary to traditional...

  2. Perl Pocket Reference

    CERN Document Server

    Vromans, Johan

    2011-01-01

    If you have a Perl programming question, you'll find the answer quickly in this handy, easy-to-use quick reference. The Perl Pocket Reference condenses and organizes stacks of documentation down to the most essential facts, so you can find what you need in a heartbeat. Updated for Perl 5.14, the 5th edition provides a summary of Perl syntax rules and a complete list of operators, built-in functions, and other features. It's the perfect companion to O'Reilly's authoritative and in-depth Perl programming books, including Learning Perl, Programming Perl, and the Perl Cookbook..

  3. HTML & XHTML Pocket Reference

    CERN Document Server

    Robbins, Jennifer

    2010-01-01

    After years of using spacer GIFs, layers of nested tables, and other improvised solutions for building your web sites, getting used to the more stringent standards-compliant design can be intimidating. HTML and XHTML Pocket Reference is the perfect little book when you need answers immediately. Jennifer Niederst-Robbins, author Web Design in a Nutshell, has revised and updated the fourth edition of this pocket guide by taking the top 20% of vital reference information from her Nutshell book, augmenting it judiciously, cross-referencing everything, and organizing it according to the most com

  4. CSS Pocket Reference

    CERN Document Server

    Meyer, Eric A

    2007-01-01

    They say that good things come in small packages, and it's certainly true for this edition of CSS Pocket Reference. Completely revised and updated to reflect the latest Cascading Style Sheet specifications in CSS 2.1, this indispensable little book covers the most essential information that web designers and developers need to implement CSS effectively across all browsers. Inside, you'll find: A short introduction to the key concepts of CSS A complete alphabetical reference to all CSS 2.1 selectors and properties A chart displaying detailed information about CSS support for every style ele

  5. JDBC Pocket Reference

    CERN Document Server

    Bales, Donald

    2003-01-01

    JDBC--the Java Database Connectivity specification--is a complex set of application programming interfaces (APIs) that developers need to understand if they want their Java applications to work with databases. JDBC is so complex that even the most experienced developers need to refresh their memories from time to time on specific methods and details. But, practically speaking, who wants to stop and thumb through a weighty tutorial volume each time a question arises? The answer is the JDBC Pocket Reference, a data-packed quick reference that is both a time-saver and a lifesaver. The JDBC P

  6. Reference values for electrooculography

    International Nuclear Information System (INIS)

    Barrientos Castanno, Alberto; Herrera Mora, Maritza; Garcia Baez, Obel

    2012-01-01

Obtain electrooculographic reference values based on the patterns set by the Standardization Committee of the International Society for Clinical Electrophysiology of Vision (ISCEV). The lowest amplitude values of the potential ranged between 388 and 882 μV in the dark phase. The light peak was obtained between 9 and 10 minutes, and during this phase the potential reached an amplitude ranging between 808 and 1963 μV. This amplitude variability may be related to the fact that the test was conducted without pupillary mydriasis. The reference value obtained for the Arden index was 1.55 to 2.87.
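For context, the Arden index reported in this record is, per the ISCEV EOG standard, the ratio of the light-peak amplitude to the dark-trough amplitude. A minimal sketch, with hypothetical amplitudes (not values from the study):

```python
# Arden index = light-peak amplitude / dark-trough amplitude (ISCEV EOG).
def arden_index(light_peak_uv: float, dark_trough_uv: float) -> float:
    if dark_trough_uv <= 0:
        raise ValueError("dark-trough amplitude must be positive")
    return light_peak_uv / dark_trough_uv

# Hypothetical example amplitudes in microvolts:
print(round(arden_index(1100.0, 500.0), 2))  # 2.2
```

Values toward the lower end of the reported 1.55-2.87 reference range would correspondingly flag a reduced light rise.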

  7. NASCAP programmer's reference manual

    Science.gov (United States)

    Mandell, M. J.; Stannard, P. R.; Katz, I.

    1993-05-01

    The NASA Charging Analyzer Program (NASCAP) is a computer program designed to model the electrostatic charging of complicated three-dimensional objects, both in a test tank and at geosynchronous altitudes. This document is a programmer's reference manual and user's guide. It is designed as a reference to experienced users of the code, as well as an introduction to its use for beginners. All of the many capabilities of NASCAP are covered in detail, together with examples of their use. These include the definition of objects, plasma environments, potential calculations, particle emission and detection simulations, and charging analysis.

  8. Book Catalogs; Selected References.

    Science.gov (United States)

    Brandhorst, Wesley T.

    The 116 citations on book catalogs are divided into the following two main sections: (1) Selected References, in alphabetic sequence by personal or institutional author and (2) Anonymous Entries, in alphabetic sequence by title. One hundred and seven of the citations cover the years 1960 through March 1969. There are five scattered citations in…

  9. ROOT Reference Documentation

    CERN Document Server

    Fuakye, Eric Gyabeng

    2017-01-01

    A ROOT Reference Documentation has been implemented to generate all the lists of libraries needed for each ROOT class. Doxygen has no option to generate or add the lists of libraries for each ROOT class. Therefore shell scripting and a basic C++ program was employed to import the lists of libraries needed by each ROOT class.

  10. Hospitality Services Reference Book.

    Science.gov (United States)

    Texas Tech Univ., Lubbock. Home Economics Curriculum Center.

    This reference book provides information needed by employees in hospitality services occupations. It includes 29 chapters that cover the following topics: the hospitality services industry; professional ethics; organization and management structures; safety practices and emergency procedures; technology; property maintenance and repair; purchasing…

  11. Quantum frames of reference

    International Nuclear Information System (INIS)

    Kaufherr, T.

    1981-01-01

    The idea that only relative variables have physical meaning came to be known as Mach's principle. Carrying over this idea to quantum theory, has led to the consideration of finite mass, macroscopic reference frames, relative to which all physical quantities are measured. During the process of measurement, a finite mass observer receives a kickback, and this reaction of the measuring device is not negligible in quantum theory because of the quantization of the action. Hence, the observer himself has to be included in the system that is being considered. Using this as the starting point, a number of thought experiments involving finite mass observers is discussed which have quantum uncertainties in their time or in their position. These thought experiments serve to elucidate in a qualitative way some of the difficulties involved, as well as pointing out a direction to take in seeking solutions to them. When the discussion is extended to include more than one observer, the question of the covariance of the theory immediately arises. Because none of the frames of reference should be preferred, the theory should be covariant. This demand expresses an equivalence principle which here is extended to include reference frames which are in quantum uncertainties relative to each other. Formulating the problem in terms of canonical variables, the ensueing free Hamiltonian contains vector and scalar potentials which represent the kick that the reference frame receives during measurement. These are essentially gravitational type potentials, resulting, as it were, from the extension of the equivalence principle into the quantum domain

  12. Pollen reference collection digitization

    NARCIS (Netherlands)

    Ercan, F.E.Z.; Donders, T.H.; Bijl, P.K.; Wagner, F.

    2016-01-01

    The extensive Utrecht University pollen reference collection holds thousands of pollen samples of many species and genera from all over the world and has been a basis for the widely-used North West European Pollen Flora. These samples are fixed on glass slides for microscopy use, but the aging

  13. Virtual Reference Services.

    Science.gov (United States)

    Brewer, Sally

    2003-01-01

    As the need to access information increases, school librarians must create virtual libraries. Linked to reliable reference resources, the virtual library extends the physical collection and library hours and lets students learn to use Web-based resources in a protected learning environment. The growing number of virtual schools increases the need…

  14. Reference-Dependent Sympathy

    Science.gov (United States)

    Small, Deborah A.

    2010-01-01

    Natural disasters and other traumatic events often draw a greater charitable response than do ongoing misfortunes, even those that may cause even more widespread misery, such as famine or malaria. Why is the response disproportionate to need? The notion of reference dependence critical to Prospect Theory (Kahneman & Tversky, 1979) maintains that…

  15. Genetics Home Reference

    Science.gov (United States)

    ... Page Search Home Health Conditions Genes Chromosomes & mtDNA Resources Help Me Understand Genetics Share: Email Facebook Twitter Genetics Home Reference provides consumer-friendly information about the effects of genetic variation on human health. Health Conditions More than 1,200 health ...

  16. Python library reference

    NARCIS (Netherlands)

    G. van Rossum (Guido)

    1995-01-01

    textabstractPython is an extensible, interpreted, object-oriented programming language. It supports a wide range of applications, from simple text processing scripts to interactive WWW browsers. While the Python Reference Manual describes the exact syntax and semantics of the language, it does not

  17. Evaluation of a compliance device in a subgroup of adult patients receiving specific immunotherapy with grass allergen tablets (GRAZAX) in a randomized, open-label, controlled study: an a priori subgroup analysis.

    NARCIS (Netherlands)

    Jansen, A.P.H.; Andersen, K.F.; Bruning, H.

    2009-01-01

    OBJECTIVES: This a priori subgroup analysis was conducted to assess patients' experience with a compliance device for the administration of sublingual specific immunotherapy for grass pollen-induced rhinoconjunctivitis. METHODS: The present paper reports the results of a subgroup analysis of a

  18. AMP language reference manual

    International Nuclear Information System (INIS)

    Drouffe, J.M.

    1982-06-01

    The use of a program for symbolic calculations named AMP (Algebraic Manipulation Program) is described. Its main features are: a high-speed core for common algebraic calculations; conversational capability; differentiation, substitutions, matrix calculus, expansions, non-commutative algebras...; the possibility to define new symbols and associated rules; and the possibility to create and use external libraries. It is written for IBM/370-like computers

  19. OWL references in ORM conceptual modelling

    Science.gov (United States)

    Matula, Jiri; Belunek, Roman; Hunka, Frantisek

    2017-07-01

    Object Role Modelling (ORM) is a fact-based methodology for conceptual modelling. The aim of the paper is to emphasize the close connection between ORM and OWL documents and the possibility of their mutual cooperation. The definition of entities or domain values is an indispensable part of the conceptual schema design procedure defined by the ORM methodology. Many of these entities are already defined in OWL documents. It is therefore not necessary to declare such entities again; instead, references to OWL documents can be utilized during the modelling of information systems.

  20. Comparison between an a priori grouping of potato germplasm (Solanum tuberosum subspecies andigena) and a non-hierarchical grouping

    Directory of Open Access Journals (Sweden)

    Bernal Ángela María

    2006-12-01


    To establish patterns of similarity among accessions and facilitate the identification of potential crosses, the Colombian Central Collection of potato, subspecies andigena, was grouped a priori by tuber skin and flesh colour. Over time, and with the evolution of multivariate data techniques, the need grew to include other tuber variables and to base the classification on statistical methods. In this work the Collection was grouped non-hierarchically by means of a partition analysis using the k-means algorithm, after the morphological characterization of 435 accessions by tuber traits alone. Comparison of the two groupings showed no correspondence between them, as well as the need to improve objectivity in the characterization of genetic resources. The characterizations were carried out at the Tibaitatá Research Centre of the Colombian Corporation for Agricultural Research (Corpoica) in Mosquera (Cundinamarca), where the Colombian Central Collection of potato is conserved.
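    The non-hierarchical grouping described above rests on the k-means partition algorithm. A minimal sketch of that procedure on synthetic stand-in data (the trait matrix, the choice of k = 5, and all numbers are illustrative assumptions, not the study's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for tuber morphology scores: 435 accessions x 4 traits.
    X = rng.normal(size=(435, 4))

    def kmeans(X, k, n_iter=100):
        """Plain Lloyd's algorithm: assign each point to its nearest centroid,
        then recompute centroids, until the centroids stop moving."""
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            # Euclidean distance of every point to every centroid.
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # Keep the old centroid if a cluster happens to empty out.
            new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centroids[j] for j in range(k)])
            if np.allclose(new, centroids):
                break
            centroids = new
        return labels, centroids

    labels, centroids = kmeans(X, k=5)
    print(np.bincount(labels, minlength=5))  # sizes of the 5 clusters (sum to 435)
    ```

    In practice a library implementation with multiple random restarts would be preferred, since k-means converges only to a local optimum.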

  1. A retrospective cohort study on the risk of stroke in relation to a priori health knowledge level among people with type 2 diabetes mellitus in Taiwan.

    Science.gov (United States)

    Lai, Yun-Ju; Hu, Hsiao-Yun; Lee, Ya-Ling; Ku, Po-Wen; Yen, Yung-Feng; Chu, Dachen

    2017-05-22

    Intervention of diabetes care education with regular laboratory check-ups in outpatient visits showed long-term benefits in reducing the risk of macrovascular complications among people with type 2 diabetes. However, research on the level of a priori health knowledge in the prevention of diabetic complications in community settings has been scarce. We therefore aimed to investigate the association between health knowledge and stroke incidence in patients with type 2 diabetes in Taiwan. A nationally representative sample of the general Taiwanese population was selected using a multistage systematic sampling process from the Taiwan National Health Interview Survey (NHIS) in 2005. Subjects were interviewed with a standardized face-to-face questionnaire, obtaining information on demographics, socioeconomic status, family medical history, obesity, health behaviors, and a 15-item health knowledge assessment. The NHIS dataset was linked to Taiwan National Health Insurance claims data to retrieve the diagnosis of type 2 diabetes in NHIS participants at baseline and to identify incident stroke during follow-up from 2005 to 2013. Univariate and multivariate Cox regressions were used to estimate the effect of baseline health knowledge level on the risk of incident stroke in this group of people with type 2 diabetes. A total of 597 diabetic patients, with a mean age of 51.28 years and nearly half of them male, were analyzed. During the 9-year follow-up period, 65 new stroke cases were identified among them. Kaplan-Meier curves comparing the three groups of low/moderate/high knowledge levels revealed a statistical significance (p-value of log-rank test Taiwan. Development and delivery of health education on stroke prevention to people with type 2 diabetes are warranted.
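    The survival analysis described above combines Kaplan-Meier curves compared by a log-rank test with Cox regression. A small self-contained sketch of the Kaplan-Meier estimator on invented follow-up data (the two groups, times, and event indicators are hypothetical, not the study's 597 patients):

    ```python
    import numpy as np

    def kaplan_meier(time, event):
        """Kaplan-Meier survival estimate S(t) at each distinct event time.
        `event` is 1 for an observed stroke and 0 for a censored subject."""
        time, event = np.asarray(time, float), np.asarray(event, int)
        event_times = np.sort(np.unique(time[event == 1]))
        surv, s = [], 1.0
        for t in event_times:
            at_risk = np.sum(time >= t)               # still under follow-up at t
            d = np.sum((time == t) & (event == 1))    # events occurring at t
            s *= 1.0 - d / at_risk
            surv.append(s)
        return event_times, np.array(surv)

    # Hypothetical follow-up years and stroke indicators for two knowledge groups.
    t_low,  s_low  = kaplan_meier([1, 2, 3, 4, 5, 6], [1, 1, 0, 1, 1, 0])
    t_high, s_high = kaplan_meier([2, 4, 5, 7, 8, 9], [0, 1, 0, 0, 1, 0])
    print(s_low)   # survival after each event time in the low-knowledge group
    print(s_high)
    ```

    A study like this one would then compare such curves with a log-rank test and adjust for covariates with a Cox proportional hazards model.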

  2. A Methodology to Define Flood Resilience

    Science.gov (United States)

    Tourbier, J.

    2012-04-01

    Flood resilience has become an internationally used term with an ever-increasing number of entries on the Internet. The SMARTeST Project is looking at approaches to flood resilience through case studies at cities in various countries, including Washington D.C. in the United States. In light of U.S. experiences a methodology is being proposed by the author that is intended to meet ecologic, spatial, structural, social, disaster relief and flood risk aspects. It concludes that: "Flood resilience combines (1) spatial, (2) structural, (3) social, and (4) risk management levels of flood preparedness." Flood resilience should incorporate all four levels, but not necessarily with equal emphasis. Stakeholders can assign priorities within different flood resilience levels and the considerations they contain, dividing 100% emphasis into four levels. This evaluation would be applied to planned and completed projects, considering existing conditions, goals and concepts. We have long known that the "road to market" for the implementation of flood resilience is linked to capacity building of stakeholders. It is a multidisciplinary enterprise, involving the integration of all the above aspects into the decision-making process. Traditional flood management has largely been influenced by what in the UK has been called "Silo Thinking", involving constituent organizations that are responsible for different elements, and are interested only in their defined part of the system. This barrier to innovation also has been called the "entrapment effect". Flood resilience is being defined as (1) SPATIAL FLOOD RESILIENCE implying the management of land by floodplain zoning, urban greening and management to reduce storm runoff through depression storage and by practicing Sustainable Urban Drainage (SUDs), Best Management Practices (BMPs), or Low Impact Development (LID). Ecologic processes and cultural elements are included. 
(2) STRUCTURAL FLOOD RESILIENCE referring to permanent flood defense

  3. Ramifications of defining high-level waste

    International Nuclear Information System (INIS)

    Wood, D.E.; Campbell, M.H.; Shupe, M.W.

    1987-01-01

    The Nuclear Regulatory Commission (NRC) is considering rule making to provide a concentration-based definition of high-level waste (HLW) under authority derived from the Nuclear Waste Policy Act (NWPA) of 1982 and the Low Level Waste Policy Amendments Act of 1985. The Department of Energy (DOE), which has the responsibility to dispose of certain kinds of commercial waste, is supporting development of a risk-based classification system by the Oak Ridge National Laboratory to assist in developing and implementing the NRC rule. The system is two dimensional, with the axes based on the phrases highly radioactive and requires permanent isolation in the definition of HLW in the NWPA. Defining HLW will reduce the ambiguity in the present source-based definition by providing concentration limits to establish which materials are to be called HLW. The system allows the possibility of greater-confinement disposal for some wastes which do not require the degree of isolation provided by a repository. The definition of HLW will provide a firm basis for waste processing options which involve partitioning of waste into a high-activity stream for repository disposal, and a low-activity stream for disposal elsewhere. Several possible classification systems have been derived and the characteristics of each are discussed. The Defense High Level Waste Technology Lead Office at DOE - Richland Operations Office, supported by Rockwell Hanford Operations, has coordinated reviews of the ORNL work by a technical peer review group and other DOE offices. The reviews produced several recommendations and identified several issues to be addressed in the NRC rule making. 10 references, 3 figures

  4. Selective constraints in experimentally defined primate regulatory regions.

    Directory of Open Access Journals (Sweden)

    Daniel J Gaffney

    2008-08-01

    Changes in gene regulation may be important in evolution. However, the evolutionary properties of regulatory mutations are currently poorly understood. This is partly the result of an incomplete annotation of functional regulatory DNA in many species. For example, transcription factor binding sites (TFBSs), a major component of eukaryotic regulatory architecture, are typically short, degenerate, and therefore difficult to differentiate from randomly occurring, nonfunctional sequences. Furthermore, although sites such as TFBSs can be computationally predicted using evolutionary conservation as a criterion, estimates of the true level of selective constraint (defined as the fraction of strongly deleterious mutations occurring at a locus) in regulatory regions will, by definition, be upwardly biased in datasets that are a priori evolutionarily conserved. Here we investigate the fitness effects of regulatory mutations using two complementary datasets of human TFBSs that are likely to be relatively free of ascertainment bias with respect to evolutionary conservation but, importantly, are supported by experimental data. The first is a collection of more than 2,100 human TFBSs drawn from the literature in the TRANSFAC database, and the second is derived from several recent high-throughput chromatin immunoprecipitation coupled with genomic microarray (ChIP-chip) analyses. We also define a set of putative cis-regulatory modules (pCRMs) by spatially clustering multiple TFBSs that regulate the same gene. We find that a relatively high proportion (approximately 37%) of mutations at TFBSs are strongly deleterious, similar to that at a 2-fold degenerate protein-coding site. However, constraint is significantly reduced in human and chimpanzee pCRMs and ChIP-chip sequences, relative to macaques. We estimate that the fraction of regulatory mutations that have been driven to fixation by positive selection in humans is not significantly different from zero. 
We also find
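    Selective constraint in analyses of this kind is commonly estimated as the proportional reduction in substitutions at the test sites relative to putatively neutral reference sites. A toy illustration of that ratio (all counts below are invented; only the resulting ~37% echoes the abstract):

    ```python
    # Toy estimate of selective constraint as the fractional reduction in
    # substitutions at test sites relative to neutral reference sites.
    # All counts are invented for illustration.

    tfbs_subs, tfbs_sites = 63, 1000         # substitutions observed in TFBSs
    neutral_subs, neutral_sites = 100, 1000  # substitutions at neutral sites

    rate_tfbs = tfbs_subs / tfbs_sites
    rate_neutral = neutral_subs / neutral_sites

    # Fraction of mutations inferred to be strongly deleterious (removed by selection).
    constraint = 1.0 - rate_tfbs / rate_neutral
    print(f"estimated constraint: {constraint:.2f}")
    ```

    The abstract's point about ascertainment bias follows directly: if the test sites were selected for being conserved, `rate_tfbs` is deflated by construction and the constraint estimate is inflated.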

  5. ``Frames of Reference'' revisited

    Science.gov (United States)

    Steyn-Ross, Alistair; Ivey, Donald G.

    1992-12-01

    The PSSC teaching film, ``Frames of Reference,'' was made in 1960, and was one of the first audio-visual attempts at showing how your physical ``point of view,'' or frame of reference, necessarily alters both your perceptions and your observations of motion. The gentle humor and original demonstrations made a lasting impact on many audiences, and with its recent re-release as part of the AAPT Cinema Classics videodisc it is timely that we should review both the message and the methods of the film. An annotated script and photographs from the film are presented, followed by extension material on rotating frames which teachers may find appropriate for use in their classrooms: constructions, demonstrations, an example, and theory.

  6. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 includes the identification of all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones

  7. OSH technical reference manual

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    In an evaluation of the Department of Energy (DOE) Occupational Safety and Health programs for government-owned contractor-operated (GOCO) activities, the Department of Labor's Occupational Safety and Health Administration (OSHA) recommended a technical information exchange program. The intent was to share written safety and health programs, plans, training manuals, and materials within the entire DOE community. The OSH Technical Reference (OTR) helps support the secretary's response to the OSHA finding by providing a one-stop resource and referral for technical information that relates to safe operations and practice. It also serves as a technical information exchange tool to reference DOE-wide materials pertinent to specific safety topics and, with some modification, as a training aid. The OTR bridges the gap between general safety documents and very specific requirements documents. It is tailored to the DOE community and incorporates DOE field experience.

  8. Indico CONFERENCE: Define the Call for Abstracts

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial, you will learn how to define and open a call for abstracts. When defining a call for abstracts, you will be able to define settings related to the type of questions asked during a review of an abstract, select the users who will review the abstracts, decide when to open the call for abstracts, and more.

  9. On defining semantics of extended attribute grammars

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    1980-01-01

    Knuth has introduced attribute grammars (AGs) as a tool to define the semanitcs of context-free languages. The use of AGs in connection with programming language definitions has mostly been to define the context-sensitive syntax of the language and to define a translation in code for a hypothetic...

  10. Languages for Software-Defined Networks

    Science.gov (United States)

    2013-02-01

    ...switches, firewalls, and middleboxes) with closed and proprietary configuration interfaces. Software-Defined Networks (SDN) are poised to change this. ...however, have seen growing interest in software-defined networks (SDNs), in which a logically-centralized controller manages the packet-processing...

  11. Electrical engineer's reference book

    CERN Document Server

    Laughton, M A

    1985-01-01

    Electrical Engineer's Reference Book, Fourteenth Edition focuses on electrical engineering. The book first discusses units, mathematics, and physical quantities, including the international unit system, physical properties, and electricity. The text also looks at network and control systems analysis. The book examines materials used in electrical engineering. Topics include conducting materials, superconductors, silicon, insulating materials, electrical steels, and soft irons and relay steels. The text underscores electrical metrology and instrumentation, steam-generating plants, turbines

  12. Reference Sources in Chemistry

    OpenAIRE

    Sthapit, Dilip Man

    1995-01-01

    Information plays an important role in the development of every field. Therefore a brief knowledge regarding information sources is necessary to function in any field. There are many information sources about scientific and technical subjects. In this context there are many reference sources in Chemistry too. Chemistry is one important part of the science which deals with the study of the composition of substances and the chemical changes that they undergo. The purpose of this report is...

  13. Radioactive certified reference materials

    International Nuclear Information System (INIS)

    Watanabe, Kazuo

    2010-01-01

    Outline of radioactive certified reference materials (CRM) for the analysis of nuclear materials and radioactive nuclides were described. The nuclear fuel CRMs are supplied by the three institutes: NBL in the US, CETAMA in France and IRMM in Belgium. For the RI CRMs, the Japan Radioisotope Association is engaged in activities concerning supply. The natural-matrix CRMs for the analysis of trace levels of radio-nuclides are prepared and supplied by NIST in the US and the IAEA. (author)

  14. Reference Japanese man

    International Nuclear Information System (INIS)

    Tanaka, G.-I.; Kawamura, H.; Nakahara, Y.

    1979-01-01

    The weight of organs from autopsy cases of normal Japanese adults, children, and infants is presented for the purpose of approaching a Reference Japanese Man. The skeletal content and the daily intake of alkaline earth elements are given. A lower rate of transfer (K2) to the thyroid gland of ingested radioiodine, as well as a remarkably shorter biological half-life than the data adopted by ICRP, is also proved as a result of this study. (author)

  15. Second reference calculation for the WIPP

    International Nuclear Information System (INIS)

    Branstetter, L.J.

    1985-03-01

    Results of the second reference calculation for the Waste Isolation Pilot Plant (WIPP) project using the dynamic relaxation finite element code SANCHO are presented. This reference calculation is intended to predict the response of a typical panel of excavated rooms designed for storage of nonheat-producing nuclear waste. Results are presented that include relevant deformations, relative clay seam displacements, and stress and strain profiles. This calculation is a particular solution obtained by a computer code, which has proven analytic capabilities when compared with other structural finite element codes. It is hoped that the results presented here will be useful in providing scoping values for defining experiments and for developing instrumentation. It is also hoped that the calculation will be useful as part of an exercise in developing a methodology for performing important design calculations by more than one analyst using more than one computer code, and for defining internal Quality Assurance (QA) procedures for such calculations. 27 refs., 15 figs

  16. Reference handbook: Level detectors

    International Nuclear Information System (INIS)

    1990-01-01

    The purpose of this handbook is to provide Rocky Flats personnel with the information necessary to understand level measurement and detection. Upon completion of this handbook you should be able to do the following: List three reasons for measuring level. Describe the basic operating principles of the sight glass. Demonstrate proper techniques for reading a sight glass. Describe the basic operating principles of a float level detector. Describe the basic operating principles of a bubbler level indicating system. Explain the differences between a wet and dry reference leg indicating system, and describe how each functions. This handbook is designed for use by experienced Rocky Flats operators to reinforce and improve their current knowledge level, and by entry-level operators to ensure that they possess a minimum level of fundamental knowledge. Level Detectors is applicable to many job classifications and can be used as a reference for classroom work or for self-study. Although this reference handbook is by no means all-encompassing, you will gain enough information about this subject area to assist you in contributing to the safe operation of Rocky Flats Plant

  17. Electroacoustical reference data

    CERN Document Server

    Eargle, John M

    2002-01-01

    The need for a general collection of electroacoustical reference and design data in graphical form has been felt by acousticians and engineers for some time. This type of data can otherwise only be found in a collection of handbooks. Therefore, it is the author's intention that this book serve as a single source for many electroacoustical reference and system design requirements. In form, the volume closely resembles Frank Massa's Acoustic Design Charts, a handy book dating from 1942 that has long been out of print. The basic format of Massa's book has been followed here: For each entry, graphical data are presented on the right page, while text, examples, and references appear on the left page. In this manner, the user can solve a given problem without thumbing from one page to the next. All graphs and charts have been scaled for ease in data entry and reading. The book is divided into the following sections: A. General Acoustical Relationships. This section covers the behavior of sound transmission in...

  18. Reference Values for Plasma Electrolytes and Urea in Nigerian ...

    African Journals Online (AJOL)

    Reference values for plasma electrolytes and urea have been defined for Nigerian children and adolescents residing in Abeokuta and its environs, a location in southern Nigeria, by estimating plasma sodium, potassium, bicarbonate and urea concentrations in a reference population. The study group comprised three ...
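    Reference values of this kind are conventionally taken as the central 95% of a healthy reference population, bounded by the 2.5th and 97.5th percentiles. A minimal nonparametric sketch on simulated data (the analyte, distribution parameters, and sample size are illustrative assumptions, not the study's measurements):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Simulated plasma sodium values (mmol/L) for a healthy reference group;
    # the mean, spread, and sample size here are all invented.
    sodium = rng.normal(loc=140.0, scale=2.5, size=500)

    # Nonparametric reference interval: the central 95% of the reference
    # population, bounded by the 2.5th and 97.5th percentiles.
    lower, upper = np.percentile(sodium, [2.5, 97.5])
    print(f"reference interval: {lower:.1f}-{upper:.1f} mmol/L")
    ```

    Clinical guidelines typically recommend at least 120 reference subjects per partition (for example, per age or sex group) before quoting such nonparametric limits.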

  19. Student Teacher Letters of Reference: A Critical Analysis

    Science.gov (United States)

    Mason, Richard W.; Schroeder, Mark P.

    2012-01-01

    Letters of reference are commonly used in acquiring a job in education. Despite serious issues of validity and reliability in writing and evaluating letters, there is a dearth of research that systematically examines the evaluation process and identifies the constructs that define high-quality letters. The current study used NVivo to examine 160…

  20. Reference values for spirometry in preschool children.

    Science.gov (United States)

    Burity, Edjane F; Pereira, Carlos A C; Rizzo, José A; Brito, Murilo C A; Sarinho, Emanuel S C

    2013-01-01

    Reference values for lung function tests differ in samples from different countries, including values for preschoolers. The main objective of this study was to derive reference values in this population. A prospective study was conducted through a questionnaire applied to 425 preschool children aged 3 to 6 years, from schools and day-care centers in a metropolitan city in Brazil. Children were selected by simple random sampling from the aforementioned schools. Peak expiratory flow (PEF), forced vital capacity (FVC), forced expiratory volumes (FEV1, FEV0.50), forced expiratory flow (FEF25-75) and FEV1/FVC, FEV0.5/FVC and FEF25-75/FVC ratios were evaluated. Of the 425 children enrolled, 321 (75.6%) underwent the tests. Of these, 135 (42.0%) showed acceptable results with full expiratory curves and thus were included in the regression analysis to define the reference values. Height and gender significantly influenced FVC values through linear and logarithmic regression analysis. In males, R² increased with the logarithmic model for FVC and FEV1, but the linear model was retained for its simplicity. The lower limits were calculated from the fifth percentile of the residuals. Full expiratory curves are more difficult to obtain in preschoolers. In addition to height, gender also influences the measures of FVC and FEV1. Reference values were defined for spirometry in preschool children in this population, which are applicable to similar populations. Copyright © 2013 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
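    The abstract's approach (regress FVC on height, then set the lower limit of normal from the 5th percentile of the residuals) can be sketched as follows, on invented data; the heights, coefficients, and noise level are illustrative assumptions, not the study's measurements:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical preschool data: height (cm) and measured FVC (litres);
    # the true coefficients and noise level are invented for illustration.
    height = rng.uniform(95.0, 120.0, size=135)
    fvc = 0.025 * height - 1.5 + rng.normal(0.0, 0.12, size=135)

    # Least-squares linear regression of FVC on height.
    slope, intercept = np.polyfit(height, fvc, 1)
    residuals = fvc - (slope * height + intercept)

    # Lower limit of normal (LLN): the regression prediction shifted down by
    # the 5th percentile of the residuals, as in the abstract's method.
    p5 = np.percentile(residuals, 5)

    def lln(h):
        """Lower limit of normal FVC for a child of height h (cm)."""
        return slope * h + intercept + p5

    print(f"predicted FVC at 110 cm: {slope * 110 + intercept:.2f} L, "
          f"LLN: {lln(110.0):.2f} L")
    ```

    In practice a separate fit per sex would be made, since the abstract reports that gender as well as height influences FVC and FEV1.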

  1. A reference aerosol for a radon reference chamber

    Science.gov (United States)

    Paul, Annette; Keyser, Uwe

    1996-02-01

    The measurement of radon and radon progenies and the calibration of their detection systems require the production and measurement of aerosols well-defined in size and concentration. In the German radon reference chamber, because of its unique chemical and physical properties, carnauba wax is used to produce standard aerosols. The aerosol size spectra are measured on-line by an aerosol measurement system in the range of 10 nm to 1 μm aerodynamic diameter. The experimental set-ups for the study of adsorption of radioactive ions on aerosols as function of their size and concentration will be described, the results presented and further adaptations for an aerosol jet introduced (for example, for the measurement of short-lived neutron-rich isotopes). Data on the dependence of aerosol radius, ion concentration and element selectivity is collected by using a 252Cf-sf source. The fission products of this source range widely in elements, isotopes and charges. Adsorption and the transport of radioactive ions on aerosols have therefore been studied for various ions for the first time, simultaneously with the aerosol size on-line spectrometry.

  2. A reference aerosol for a radon reference chamber

    Energy Technology Data Exchange (ETDEWEB)

    Paul, A. [Physikalisch-Technische Bundesanstalt, Braunschweig (Germany); Keyser, U. [Physikalisch-Technische Bundesanstalt, Braunschweig (Germany)

    1996-01-11

    The measurement of radon and radon progenies and the calibration of their detection systems require the production and measurement of aerosols well-defined in size and concentration. In the German radon reference chamber, because of its unique chemical and physical properties, carnauba wax is used to produce standard aerosols. The aerosol size spectra are measured on-line by an aerosol measurement system in the range of 10 nm to 1 μm aerodynamic diameter. The experimental set-ups for the study of adsorption of radioactive ions on aerosols as function of their size and concentration are described, the results presented and further adaptations for an aerosol jet introduced (for example, for the measurement of short-lived neutron-rich isotopes). Data on the dependence of aerosol radius, ion concentration and element selectivity is collected by using a ²⁵²Cf-sf source. The fission products of this source range widely in elements, isotopes and charges. Adsorption and the transport of radioactive ions on aerosols have therefore been studied for various ions for the first time, simultaneously with the aerosol size on-line spectrometry. (orig.)

  3. XSLT 10 Pocket Reference

    CERN Document Server

    Lenz, Evan

    2008-01-01

    XSLT is an essential tool for converting XML into other kinds of documents: HTML, PDF file, and many others. It's a critical technology for XML-based platforms such as Microsoft .NET, Sun Microsystems' Sun One, as well as for most web browsers and authoring tools. As useful as XSLT is, however, most people have a difficult time getting used to its peculiar characteristics. The ability to use advanced techniques depends on a clear and exact understanding of how XSLT templates work and interact. The XSLT 1.0 Pocket Reference from O'Reilly wants to make sure you achieve that level of understan

  4. Electronics engineer's reference book

    CERN Document Server

    Mazda, F F

    1989-01-01

    Electronics Engineer's Reference Book, Sixth Edition is a five-part book that begins with a synopsis of mathematical and electrical techniques used in the analysis of electronic systems. Part II covers physical phenomena, such as electricity, light, and radiation, often met with in electronic systems. Part III contains chapters on basic electronic components and materials, the building blocks of any electronic design. Part IV highlights electronic circuit design and instrumentation. The last part shows the application areas of electronics such as radar and computers.

  5. Mechanical engineer's reference book

    CERN Document Server

    Parrish, A

    1973-01-01

    Mechanical Engineer's Reference Book: 11th Edition presents a comprehensive examination of the use of Systéme International d' Unités (SI) metrication. It discusses the effectiveness of such a system when used in the field of engineering. It addresses the basic concepts involved in thermodynamics and heat transfer. Some of the topics covered in the book are the metallurgy of iron and steel; screw threads and fasteners; hole basis and shaft basis fits; an introduction to geometrical tolerancing; mechanical working of steel; high strength alloy steels; advantages of making components as castings

  6. VBScript pocket reference

    CERN Document Server

    Lomax, Paul; Petrusha, Ron

    2008-01-01

    Microsoft's Visual Basic Scripting Edition (VBScript), a subset of Visual Basic for Applications, is a powerful language for Internet application development, where it can serve as a scripting language for server-side, client-side, and system scripting. Whether you're developing code for Active Server Pages, client-side scripts for Internet Explorer, code for Outlook forms, or scripts for Windows Script Host, VBScript Pocket Reference will be your constant companion. Don't let the pocket-friendly format fool you. Based on the bestsellingVBScript in a Nutshell, this small book details every V

  7. Xcode 5 developer reference

    CERN Document Server

    Wentk, Richard

    2014-01-01

    Design, code, and build amazing apps with Xcode 5 Thanks to Apple's awesome Xcode development environment, you can create the next big app for Macs, iPhones, iPads, or iPod touches. Xcode 5 contains gigabytes of great stuff to help you develop for both OS X and iOS devices - things like sample code, utilities, companion applications, documentation, and more. And with Xcode 5 Developer Reference, you now have the ultimate step-by-step guide to it all. Immerse yourself in the heady and lucrative world of Apple app development, see how to tame the latest features and functions, and find loads of

  8. NUnit Pocket Reference

    CERN Document Server

    Hamilton, Bill

    2009-01-01

    The open source NUnit framework is an excellent way to test .NET code as it is written, saving hundreds of QA hours and headaches. Unfortunately, some of those hours saved can be wasted trying to master this popular but under-documented framework. Proof that good things come in small packages, the NUnit Pocket Reference is everything you need to get NUnit up and working for you. It's the only book you'll need on this popular and practical new open source framework.

  9. Coal Data: A reference

    International Nuclear Information System (INIS)

    1991-01-01

    The purpose of Coal Data: A Reference is to provide basic information on the mining and use of coal, an important source of energy in the United States. The report is written for a general audience. The goal is to cover basic material and strike a reasonable compromise between overly generalized statements and detailed analyses. The section "Coal Terminology and Related Information" provides additional information about terms mentioned in the text and introduces new terms. Topics covered are US coal deposits, resources and reserves, mining, production, employment and productivity, health and safety, preparation, transportation, supply and stocks, use, coal, the environment, and more. (VC)

  10. Roaming Reference: Reinvigorating Reference through Point of Need Service

    Directory of Open Access Journals (Sweden)

    Kealin M. McCabe

    2011-11-01

Roaming reference service was pursued as a way to address declining reference statistics. The service was staffed by librarians armed with iPads over a period of six months during the 2010-2011 academic year. Transactional statistics were collected in relation to query type (Research, Facilitative or Technology), location, and approach (librarian to patron, patron to librarian, or via chat widget). Overall, roaming reference resulted in an additional 228 reference questions, 67% (n=153) of which were research related. Two iterations of the service were implemented: roaming reference as a standalone service (Fall 2010) and roaming reference integrated with traditional reference desk duties (Winter 2011). The results demonstrate that although the Weller Library's reference transactions are declining annually, they are not disappearing. For a roaming reference service to succeed, it must be a standalone service provided in addition to traditional reference services. The integration of the two reference models (roaming reference and reference desk) resulted in a 56% decline in the total number of roaming reference questions from the previous term. The simple act of roaming has the potential to reinvigorate reference services as a whole, forcing librarians outside their comfort zones and allowing them to reach patrons at their point of need.

  11. AREVA - 2013 Reference document

    International Nuclear Information System (INIS)

    2014-01-01

    This Reference Document contains information on the AREVA group's objectives, prospects and development strategies, as well as estimates of the markets, market shares and competitive position of the AREVA group. Content: 1 - Person responsible for the Reference Document; 2 - Statutory auditors; 3 - Selected financial information; 4 - Description of major risks confronting the company; 5 - Information about the issuer; 6 - Business overview; 7 - Organizational structure; 8 - Property, plant and equipment; 9 - Situation and activities of the company and its subsidiaries; 10 - Capital resources; 11 - Research and development programs, patents and licenses; 12 - Trend information; 13 - Profit forecasts or estimates; 14 - Management and supervisory bodies; 15 - Compensation and benefits; 16 - Functioning of the management and supervisory bodies; 17 - Human resources information; 18 - Principal shareholders; 19 - Transactions with related parties; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information; 22 - Major contracts; 23 - Third party information, statements by experts and declarations of interest; 24 - Documents on display; 25 - Information on holdings; Appendix 1: report of the supervisory board chairman on the preparation and organization of the board's activities and internal control procedures; Appendix 2: statutory auditors' reports; Appendix 3: environmental report; Appendix 4: non-financial reporting methodology and independent third-party report on social, environmental and societal data; Appendix 5: ordinary and extraordinary general shareholders' meeting; Appendix 6: values charter; Appendix 7: table of concordance of the management report; glossaries

  12. Balinese Frame of Reference

    Directory of Open Access Journals (Sweden)

    I Nyoman Aryawibawa

    2016-04-01

Abstract: Balinese Frame of Reference. Wassmann and Dasen (1998) did a study on the acquisition of Balinese frames of reference. They pointed out that, in addition to the dominant use of the absolute system, the use of the relative system was also observed. This article aims at verifying Wassmann and Dasen's study. Employing monolingual Balinese speakers and using linguistic and non-linguistic tasks, Aryawibawa (2010, 2012, 2015) showed that Balinese subjects used the absolute system dominantly in responding to the two tasks, e.g. The man is north/south/east/west of the car. Unlike Wassmann and Dasen's results, no relative system was used by the subjects in solving the tasks. Instead of the relative system, an intrinsic system was also observed in this study, though it was infrequent. The article concludes that the absolute system was dominantly employed by Balinese speakers in describing spatial relations in Balinese. The use of the system seems to affect their cognitive functions.

  13. Antares Reference Telescope System

    International Nuclear Information System (INIS)

    Viswanathan, V.K.; Kaprelian, E.; Swann, T.; Parker, J.; Wolfe, P.; Woodfin, G.; Knight, D.

    1983-01-01

Antares is a 24-beam, 40-TW carbon-dioxide laser-fusion system currently nearing completion at the Los Alamos National Laboratory. The 24 beams will be focused onto a tiny target (typically 300 to 1000 μm in diameter) located approximately at the center of a 7.3-m-diameter by 9.3-m-long vacuum (10^-6 torr) chamber. The design goal is to position the targets to within 10 μm of a selected nominal position, which may be anywhere within a fixed spherical region 1 cm in diameter. The Antares Reference Telescope System is intended to help achieve this goal for alignment and viewing of the various targets used in the laser system. The Antares Reference Telescope System consists of two similar electro-optical systems positioned in a near orthogonal manner in the target chamber area of the laser. Each of these consists of four subsystems: (1) a fixed 9X optical imaging subsystem which produces an image of the target at the vidicon; (2) a reticle projection subsystem which superimposes an image of the reticle pattern at the vidicon; (3) an adjustable front-lighting subsystem which illuminates the target; and (4) an adjustable back-lighting subsystem which also can be used to illuminate the target. The various optical, mechanical, and vidicon design considerations and trade-offs are discussed. The final system chosen (which is being built) and its current status are described in detail.

  14. Sensor employing internal reference electrode

    DEFF Research Database (Denmark)

    2013-01-01

The present invention concerns a novel internal reference electrode as well as a novel sensing electrode for an improved internal reference oxygen sensor and the sensor employing same.

  15. Endogenizing Prospect Theory's Reference Point

    OpenAIRE

    Ulrich Schmidt; Horst Zank

    2010-01-01

    In previous models of (cumulative) prospect theory reference-dependence of preferences is imposed beforehand and the location of the reference point is exogenously determined. This note provides a foundation of prospect theory, where reference-dependence is derived from preference conditions and a unique reference point arises endogenously.

  16. Farmer knowledge and a priori risk analysis: pre-release evaluation of genetically modified Roundup Ready wheat across the Canadian prairies.

    Science.gov (United States)

    Mauro, Ian J; McLachlan, Stéphane M; Van Acker, Rene C

    2009-09-01

    The controversy over the world's first genetically modified (GM) wheat, Roundup Ready wheat (RRW), challenged the efficacy of 'science-based' risk assessment, largely because it excluded the public, particularly farmers, from meaningful input. Risk analysis, in contrast, is broader in orientation as it incorporates scientific data as well as socioeconomic, ethical, and legal concerns, and considers expert and lay input in decision-making. Local knowledge (LK) of farmers is experience-based and represents a rich and reliable source of information regarding the impacts associated with agricultural technology, thereby complementing the scientific data normally used in risk assessment. The overall goal of this study was to explore the role of farmer LK in the a priori risk analysis of RRW. In 2004, data were collected from farmers using mail surveys sent across the three prairie provinces (i.e., Manitoba, Saskatchewan, and Alberta) in western Canada. A stratified random sampling approach was used whereby four separate sampling districts were identified in regions where wheat was grown for each province. Rural post offices were randomly selected in each sampling district using Canada Post databases such that no one post office exceeded 80 farms and that each sampling district comprised 225-235 test farms (n = 11,040). In total, 1,814 people responded, representing an adjusted response rate for farmers of 33%. A subsequent telephone survey showed there was no non-response bias. The primary benefits associated with RRW were associated with weed control, whereas risks emphasized the importance of market harm, corporate control, agronomic problems, and the likelihood of contamination. Overall, risks were ranked much higher than benefits, and the great majority of farmers were highly critical of RRW commercialization. In total, 83.2% of respondents disagreed that RRW should have unconfined release into the environment. Risk was associated with distrust in government and

  17. AREVA 2009 reference document

    International Nuclear Information System (INIS)

    2009-01-01

This Reference Document contains information on the AREVA group's objectives, prospects and development strategies. It contains information on the markets, market shares and competitive position of the AREVA group. This information provides an adequate picture of the size of these markets and of the AREVA group's competitive position. Content: 1 - Person responsible for the Reference Document and Attestation by the person responsible for the Reference Document; 2 - Statutory and Deputy Auditors; 3 - Selected financial information; 4 - Risks: Risk management and coverage, Legal risk, Industrial and environmental risk, Operating risk, Risk related to major projects, Liquidity and market risk, Other risk; 5 - Information about the issuer: History and development, Investments; 6 - Business overview: Markets for nuclear power and renewable energies, AREVA customers and suppliers, Overview and strategy of the group, Business divisions, Discontinued operations: AREVA Transmission and Distribution; 7 - Organizational structure; 8 - Property, plant and equipment: Principal sites of the AREVA group, Environmental issues that may affect the issuer's; 9 - Analysis of and comments on the group's financial position and performance: Overview, Financial position, Cash flow, Statement of financial position, Events subsequent to year-end closing for 2009; 10 - Capital Resources; 11 - Research and development programs, patents and licenses; 12 - Trend information: Current situation, Financial objectives; 13 - Profit forecasts or estimates; 14 - Administrative, management and supervisory bodies and senior management; 15 - Compensation and benefits; 16 - Functioning of corporate bodies; 17 - Employees; 18 - Principal shareholders; 19 - Transactions with related parties: French state, CEA, EDF group; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information: Share capital, Certificate of incorporation and by-laws; 22 - Major contracts

  18. Areva - 2016 Reference document

    International Nuclear Information System (INIS)

    2017-01-01

    Areva supplies high added-value products and services to support the operation of the global nuclear fleet. The company is present throughout the entire nuclear cycle, from uranium mining to used fuel recycling, including nuclear reactor design and operating services. Areva is recognized by utilities around the world for its expertise, its skills in cutting-edge technologies and its dedication to the highest level of safety. Areva's 36,000 employees are helping build tomorrow's energy model: supplying ever safer, cleaner and more economical energy to the greatest number of people. This Reference Document contains information on Areva's objectives, prospects and development strategies. It contains estimates of the markets, market shares and competitive position of Areva

  19. A reference tristimulus colorimeter

    Science.gov (United States)

    Eppeldauer, George P.

    2002-06-01

A reference tristimulus colorimeter has been developed at NIST with a transmission-type silicon trap detector (1) and four temperature-controlled filter packages to realize the Commission Internationale de l'Eclairage (CIE) x(λ), y(λ) and z(λ) color matching functions (2). Instead of lamp standards, high-accuracy detector standards are used for the colorimeter calibration. A detector-based calibration procedure is suggested for tristimulus colorimeters where the absolute spectral responsivity of the tristimulus channels is determined. Then, color (spectral) corrections and peak (amplitude) normalizations are applied to minimize uncertainties caused by the imperfect realizations of the CIE functions. As a result of the corrections, the chromaticity coordinates of stable light sources with different spectral power distributions can be measured with uncertainties less than 0.0005 (k=1).
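The chromaticity coordinates mentioned in the abstract are obtained by normalizing the CIE tristimulus values X, Y, Z. A minimal sketch (not from the paper; the D65 white point values below are standard approximations):

```python
def chromaticity(X, Y, Z):
    """Chromaticity coordinates (x, y) from CIE tristimulus values.

    x = X / (X + Y + Z), y = Y / (X + Y + Z); z = 1 - x - y is implied.
    """
    total = X + Y + Z
    return X / total, Y / total

# Approximate tristimulus values of the CIE D65 white point:
x, y = chromaticity(95.047, 100.0, 108.883)
print(round(x, 4), round(y, 4))  # near the familiar (0.3127, 0.3290)
```

An uncertainty of 0.0005 in (x, y), as quoted above, is thus a very tight constraint on the normalized ratios of the three channel responsivities.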

  20. Tank characterization reference guide

    International Nuclear Information System (INIS)

    De Lorenzo, D.S.; DiCenso, A.T.; Hiller, D.B.; Johnson, K.W.; Rutherford, J.H.; Smith, D.J.; Simpson, B.C.

    1994-09-01

    Characterization of the Hanford Site high-level waste storage tanks supports safety issue resolution; operations and maintenance requirements; and retrieval, pretreatment, vitrification, and disposal technology development. Technical, historical, and programmatic information about the waste tanks is often scattered among many sources, if it is documented at all. This Tank Characterization Reference Guide, therefore, serves as a common location for much of the generic tank information that is otherwise contained in many documents. The report is intended to be an introduction to the issues and history surrounding the generation, storage, and management of the liquid process wastes, and a presentation of the sampling, analysis, and modeling activities that support the current waste characterization. This report should provide a basis upon which those unfamiliar with the Hanford Site tank farms can start their research

  1. Areva, reference document 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This reference document contains information on the AREVA group's objectives, prospects and development strategies, particularly in Chapters 4 and 7. It contains information on the markets, market shares and competitive position of the AREVA group. Content: - 1 Person responsible for the reference document and persons responsible for auditing the financial statements; - 2 Information pertaining to the transaction (Not applicable); - 3 General information on the company and its share capital: Information on AREVA, on share capital and voting rights, Investment certificate trading, Dividends, Organization chart of AREVA group companies, Equity interests, Shareholders' agreements; - 4 Information on company operations, new developments and future prospects: Overview and strategy of the AREVA group, The Nuclear Power and Transmission and Distribution markets, The energy businesses of the AREVA group, Front End division, Reactors and Services division, Back End division, Transmission and Distribution division, Major contracts, The principal sites of the AREVA group, AREVA's customers and suppliers, Sustainable Development and Continuous Improvement, Capital spending programs, Research and development programs, intellectual property and trademarks, Risk and insurance; - 5 Assets - Financial position - Financial performance: Analysis of and comments on the group's financial position and performance, 2006 Human Resources Report, Environmental Report, Consolidated financial statements, Notes to the consolidated financial statements, AREVA SA financial statements, Notes to the corporate financial statements; 6 - Corporate Governance: Composition and functioning of corporate bodies, Executive compensation, Profit-sharing plans, AREVA Values Charter, Annual Combined General Meeting of Shareholders of May 3, 2007; 7 - Recent developments and future prospects: Events subsequent to year-end closing for 2006, Outlook; 8 - Glossary; 9 - Table of concordance

  2. Areva reference document 2007

    International Nuclear Information System (INIS)

    2008-01-01

This reference document contains information on the AREVA group's objectives, prospects and development strategies, particularly in Chapters 4 and 7. It also contains information on the markets, market shares and competitive position of the AREVA group. Content: 1 - Person responsible for the reference document and persons responsible for auditing the financial statements; 2 - Information pertaining to the transaction (not applicable); 3 - General information on the company and its share capital: Information on Areva, Information on share capital and voting rights, Investment certificate trading, Dividends, Organization chart of AREVA group companies, Equity interests, Shareholders' agreements; 4 - Information on company operations, new developments and future prospects: Overview and strategy of the AREVA group, The Nuclear Power and Transmission and Distribution markets, The energy businesses of the AREVA group, Front End division, Reactors and Services division, Back End division, Transmission and Distribution division, Major contracts, Principal sites of the AREVA group, AREVA's customers and suppliers, Sustainable Development and Continuous Improvement, Capital spending programs, Research and Development programs, Intellectual Property and Trademarks, Risk and insurance; 5 - Assets - Financial position - Financial performance: Analysis of and comments on the group's financial position and performance, Human Resources report, Environmental report, Consolidated financial statements 2007, Notes to the consolidated financial statements, Annual financial statements 2007, Notes to the corporate financial statements; 6 - Corporate governance: Composition and functioning of corporate bodies, Executive compensation, Profit-sharing plans, AREVA Values Charter, Annual Ordinary General Meeting of Shareholders of April 17, 2008; 7 - Recent developments and future prospects: Events subsequent to year-end closing for 2007, Outlook; Glossary; Table of concordance

  3. The Academy's Duty to Define Patriotism

    Science.gov (United States)

    Gitlin, Todd

    2002-01-01

    The author discusses how universities might serve the public interest by stirring up not fewer but more and deeper debates on the failures of intelligence that afflicted American institutions before 11 September 2001--and he does not refer simply to the feebleness of the FBI and other investigation bureaucracies. He refers to the parochialism, the…

  4. Intelligence Defined and Undefined: A Relativistic Appraisal

    Science.gov (United States)

    Wechsler, David

    1975-01-01

Major reasons for the continuing divergence of opinion as regards the nature and meaning of intelligence are examined. An appraisal of intelligence as a relative concept is proposed, which advocates the necessity of specifying the reference systems to which a statement about intelligence refers. (EH)

  5. Reference ionization chamber

    International Nuclear Information System (INIS)

    Golnik, N.; Zielczynski, M.

    1999-01-01

The paper presents the design of an ionization chamber intended for the determination of the absolute value of the absorbed dose in tissue-equivalent material. Special attention was paid to ensuring that the volume of the active gas cavity was constant and well known. A specific property of the chamber design is that the voltage insulators are 'invisible' from any point of the active volume. This configuration ensures very good time stability of the electrical field and defines the active volume. The active volume of the chamber was determined with an accuracy of 0.3%, which resulted in an accuracy of 0.8% in the determination of the absorbed dose in the layer of material adjacent to the gas cavity. The chamber was applied for calibration purposes at the radiotherapy facility of the Joint Institute for Nuclear Research in Dubna (Russia) and in the calibration laboratory of the Institute of Atomic Energy in Swierk. (author)

  6. A Fiducial Reference Site for Satellite Altimetry in Crete, Greece

    Science.gov (United States)

    Mertikas, Stelios; Donlon, Craig; Mavrocordatos, Constantin; Bojkov, Bojan; Femenias, Pierre; Parrinello, Tommaso; Picot, Nicolas; Desjonqueres, Jean-Damien; Andersen, Ole Baltazar

    2016-08-01

With the advent of diverse satellite altimeters and varying measuring techniques, it has become accepted in the scientific community that an absolute reference Cal/Val site should be regularly maintained to define, monitor, and control the responses of any altimetric system. This work sets the ground for the establishment of a Fiducial Reference Site for ESA satellite altimetry in Gavdos and West Crete, Greece. It will consistently and reliably determine (a) absolute altimeter biases and their drifts; (b) relative biases among diverse missions; and (c) continuously and independently connect different missions on a common and reliable reference, and also to SI-traceable measurements. Results from this fiducial reference site will be based on historic Cal/Val site measurement records and will be the yardstick for building up capacity for monitoring climate change. This will be achieved by defining and assessing any satellite altimeter measurements against known, controlled and absolute reference signals with different techniques, processes and instrumentation.

  7. 22 CFR 92.36 - Authentication defined.

    Science.gov (United States)

    2010-04-01

    Notarial Acts § 92.36 Authentication defined. An authentication is a certification of the genuineness of... recognized in another jurisdiction. Documents which may require authentication include legal instruments...

  8. A definability theorem for first order logic

    NARCIS (Netherlands)

    Butz, C.; Moerdijk, I.

    1997-01-01

In this paper we will present a definability theorem for first order logic. This theorem is very easy to state, and its proof only uses elementary tools. To explain the theorem, let us first observe that if M is a model of a theory T in a language L, then clearly any definable subset S ⊆ M (i.e., a subset S

  9. Dilution Confusion: Conventions for Defining a Dilution

    Science.gov (United States)

    Fishel, Laurence A.

    2010-01-01

    Two conventions for preparing dilutions are used in clinical laboratories. The first convention defines an "a:b" dilution as "a" volumes of solution A plus "b" volumes of solution B. The second convention defines an "a:b" dilution as "a" volumes of solution A diluted into a final volume of "b". Use of the incorrect dilution convention could affect…
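The arithmetic gap between the two conventions can be made concrete with a short sketch (hypothetical helper names, not from the article):

```python
def dilution_additive(a, b, stock_conc=1.0):
    """Convention 1: an 'a:b' dilution is a volumes of solution A
    plus b volumes of diluent B, so the final volume is a + b."""
    return stock_conc * a / (a + b)

def dilution_final_volume(a, b, stock_conc=1.0):
    """Convention 2: an 'a:b' dilution is a volumes of solution A
    brought to a final volume of b."""
    return stock_conc * a / b

# A "1:2" dilution of a 1.0 mol/L stock gives different answers:
print(dilution_additive(1, 2))      # 1 part in 3 total -> ~0.333 mol/L
print(dilution_final_volume(1, 2))  # 1 part in 2 total -> 0.5 mol/L
```

For a "1:2" dilution the two readings differ by a factor of 1.5, which is the kind of discrepancy the article warns could affect a reported laboratory result.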

  10. Defining Hardwood Veneer Log Quality Attributes

    Science.gov (United States)

    Jan Wiedenbeck; Michael Wiemann; Delton Alderman; John Baumgras; William Luppold

    2004-01-01

    This publication provides a broad spectrum of information on the hardwood veneer industry in North America. Veneer manufacturers and their customers impose guidelines in specifying wood quality attributes that are very discriminating but poorly defined (e.g., exceptional color, texture, and/or figure characteristics). To better understand and begin to define the most...

  11. Defining initiating events for purposes of probabilistic safety assessment

    International Nuclear Information System (INIS)

    1993-09-01

    This document is primarily directed towards technical staff involved in the performance or review of plant specific Probabilistic Safety Assessment (PSA). It highlights different approaches and provides typical examples useful for defining the Initiating Events (IE). The document also includes the generic initiating event database, containing about 300 records taken from about 30 plant specific PSAs. In addition to its usefulness during the actual performance of a PSA, the generic IE database is of the utmost importance for peer reviews of PSAs, such as the IAEA's International Peer Review Service (IPERS) where reference to studies on similar NPPs is needed. 60 refs, figs and tabs

  12. Description of a reference mixed oxide fuel fabrication plant (MOFFP)

    International Nuclear Information System (INIS)

    1978-01-01

In order to evaluate the environmental impact of Mixed Oxide Fuel Fabrication Plants, work has been initiated to describe the general design and operating conditions of a reference Mixed Oxide Fuel Fabrication Plant (MOFFP) for the 1990 time frame. The various reference data and basic assumptions for the reference MOFFP plant were defined after discussion with experts. The data reported in this document are made available only to allow an evaluation of the environmental impact of a reference MOFFP plant. These data must therefore not be used as recommendations, standards, regulatory guides or requirements.

  13. AREVA - 2012 Reference document

    International Nuclear Information System (INIS)

    2013-03-01

After a presentation of the person responsible for this Reference Document, of statutory auditors, and of a summary of financial information, this report address the different risk factors: risk management and coverage, legal risk, industrial and environmental risk, operational risk, risk related to major projects, liquidity and market risk, and other risks (related to political and economic conditions, to Group's structure, and to human resources). The next parts propose information about the issuer, a business overview (markets for nuclear power and renewable energies, customers and suppliers, group's strategy, operations), a brief presentation of the organizational structure, a presentation of properties, plants and equipment (principal sites, environmental issues which may affect these items), analysis and comments on the group's financial position and performance, a presentation of capital resources, a presentation of research and development activities (programs, patents and licenses), a brief description of financial objectives and profit forecasts or estimates, a presentation of administration, management and supervision bodies, a description of the operation of corporate bodies, an overview of personnel, of principal shareholders, and of transactions with related parties, a more detailed presentation of financial information concerning assets, financial positions and financial performance. Additional information regarding share capital is given, as well as an indication of major contracts, third party information, available documents, and information on holdings

  14. Reference thorium fuel cycle

    International Nuclear Information System (INIS)

    Driggers, F.E.

    1978-08-01

In the reference fuel cycle for the TFCT program, fissile U will be denatured by mixing with 238U; the plants will be located in secure areas, with Pu being recycled within these secure areas; Th will be recycled with recovered U and Pu; the head end will handle a variety of core and blanket fuel assembly designs for LWRs and HWRs; the fuel may be a homogeneous mixture either of U and Th oxide pellets or sol-gel microspheres; the cladding will be Zircaloy; and MgO may be added to the fuel to improve Th dissolution. Th is being considered as the fertile component of fuel in order to increase proliferation resistance. Spent U recovered from Th-based fuels must be re-enriched before recycle to prevent very rapid buildup of 238U. Stainless steel will be considered as a backup to Zircaloy cladding in case Zr is incompatible with commercial aqueous dissolution. Storage of recovered irradiated Th will be considered as a backup to its use in the recycle of recovered Pu and U. Estimates are made of the time for introducing the Th fuel cycle into the LWR power industry. Since U fuel exposures in LWRs are likely to increase from 30,000 to 50,000 MWD/MT, the Th reprocessing plant should also be designed for Th fuel with 50,000 MWD/MT exposure.

  15. Sensor Characteristics Reference Guide

    Energy Technology Data Exchange (ETDEWEB)

    Cree, Johnathan V.; Dansu, A.; Fuhr, P.; Lanzisera, Steven M.; McIntyre, T.; Muehleisen, Ralph T.; Starke, M.; Banerjee, Pranab; Kuruganti, T.; Castello, C.

    2013-04-01

    The Buildings Technologies Office (BTO), within the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), is initiating a new program in Sensor and Controls. The vision of this program is: • Buildings operating automatically and continuously at peak energy efficiency over their lifetimes and interoperating effectively with the electric power grid. • Buildings that are self-configuring, self-commissioning, self-learning, self-diagnosing, self-healing, and self-transacting to enable continuous peak performance. • Lower overall building operating costs and higher asset valuation. The overarching goal is to capture 30% energy savings by enhanced management of energy consuming assets and systems through development of cost-effective sensors and controls. One step in achieving this vision is the publication of this Sensor Characteristics Reference Guide. The purpose of the guide is to inform building owners and operators of the current status, capabilities, and limitations of sensor technologies. It is hoped that this guide will aid in the design and procurement process and result in successful implementation of building sensor and control systems. DOE will also use this guide to identify research priorities, develop future specifications for potential market adoption, and provide market clarity through unbiased information

  16. Uranium tailings reference materials

    International Nuclear Information System (INIS)

    Smith, C.W.; Steger, H.F.; Bowman, W.S.

    1984-01-01

    Samples of uranium tailings from Bancroft and Elliot Lake, Ontario, and from Beaverlodge and Rabbit Lake, Saskatchewan, have been prepared as compositional reference materials at the request of the National Uranium Tailings Research Program. The four samples, UTS-1 to UTS-4, were ground to minus 104 μm, each mixed in one lot and bottled in 200-g units for UTS-1 to UTS-3 and in 100-g units for UTS-4. The materials were tested for homogeneity with respect to uranium by neutron activation analysis and to iron by an acid-decomposition atomic absorption procedure. In a free choice analytical program, 18 laboratories contributed results for one or more of total iron, titanium, aluminum, calcium, barium, uranium, thorium, total sulphur, and sulphate for all four samples, and for nickel and arsenic in UTS-4 only. Based on a statistical analysis of the data, recommended values were assigned to all elements/constituents, except for sulphate in UTS-3 and nickel in UTS-4. The radioactivity of thorium-230, radium-226, lead-210, and polonium-210 in UTS-1 to UTS-4 and of thorium-232, radium-228, and thorium-228 in UTS-1 and UTS-2 was determined in a radioanalytical program composed of eight laboratories. Recommended values for the radioactivities and associated parameters were calculated by a statistical treatment of the results

  17. ENRAF gauge reference level calculations

    Energy Technology Data Exchange (ETDEWEB)

    Huber, J.H., Fluor Daniel Hanford

    1997-02-06

    This document describes the method for calculating reference levels for Enraf Series 854 Level Detectors as installed in the tank farms. The reference level calculation for each installed level gauge is contained herein.

  18. Global Reference Tables Services Architecture

    Data.gov (United States)

    Social Security Administration — This database stores the reference and transactional data used to provide a data-driven service access method to certain Global Reference Table (GRT) service tables.

  19. Genetics Home Reference: PURA syndrome

    Science.gov (United States)

  20. Bilayer graphene quantum dot defined by topgates

    Energy Technology Data Exchange (ETDEWEB)

    Müller, André; Kaestner, Bernd; Hohls, Frank; Weimann, Thomas; Pierz, Klaus; Schumacher, Hans W., E-mail: hans.w.schumacher@ptb.de [Physikalisch-Technische Bundesanstalt, Bundesallee 100, 38116 Braunschweig (Germany)

    2014-06-21

    We investigate the application of nanoscale topgates on exfoliated bilayer graphene to define quantum dot devices. At temperatures below 500 mK, the conductance underneath the grounded gates is suppressed, which we attribute to nearest neighbour hopping and strain-induced piezoelectric fields. The gate-layout can thus be used to define resistive regions by tuning into the corresponding temperature range. We use this method to define a quantum dot structure in bilayer graphene showing Coulomb blockade oscillations consistent with the gate layout.

  1. Areva - 2014 Reference document

    International Nuclear Information System (INIS)

    2015-01-01

    Areva supplies high added-value products and services to support the operation of the global nuclear fleet. The company is present throughout the entire nuclear cycle, from uranium mining to used fuel recycling, including nuclear reactor design and operating services. Areva is recognized by utilities around the world for its expertise, its skills in cutting-edge technologies and its dedication to the highest level of safety. Areva's 44,000 employees are helping build tomorrow's energy model: supplying ever safer, cleaner and more economical energy to the greatest number of people. This Reference Document contains information on Areva's objectives, prospects and development strategies. It contains estimates of the markets, market shares and competitive position of Areva. Contents: 1 - Person responsible; 2 - Statutory auditors; 3 - Selected financial information; 4 - Risk factors; 5 - Information about the issuer; 6 - Business overview; 7 - Organizational structure; 8 - Property, plant and equipment; 9 - Analysis of and comments on the group's financial position and performance; 10 - Capital resources; 11 - Research and development programs, patents and licenses; 12 - Trend information; 13 - Profit forecasts; 14 - Administrative, management and supervisory bodies and senior management; 15 - Compensation and benefits; 16 - Functioning of administrative, management and supervisory bodies and senior management; 17 - Employees; 18 - Principal shareholders; 19 - Transactions with related parties; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information; 22 - Major contracts; 23 - Third party information, statements by experts and declarations of interest; 24 - Documents on display; 25 - Information on holdings; appendix: Report of the Chairman of the Board of Directors on governance, internal control procedures and risk management, Statutory Auditors' report, Corporate social

  2. Kerlinger's Criterial Referents Theory Revisited.

    Science.gov (United States)

    Zak, Itai; Birenbaum, Menucha

    1980-01-01

    Kerlinger's criterial referents theory of attitudes was tested cross-culturally by administering an education attitude referents summated-rating scale to 713 individuals in Israel. The response pattern to criterial and noncriterial referents was examined. Results indicated empirical cross-cultural validity of the theory, but questioned measuring…

  3. Fundamentals of Managing Reference Collections

    Science.gov (United States)

    Singer, Carol A.

    2012-01-01

    Whether a library's reference collection is large or small, it needs constant attention. Singer's book offers information and insight on best practices for reference collection management, no matter the size, and shows why managing without a plan is a recipe for clutter and confusion. In this very practical guide, reference librarians will learn:…

  4. Application-Defined Decentralized Access Control

    Science.gov (United States)

    Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett

    2014-01-01

    DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493
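    The record describes DCAC's hierarchically-named attributes but not their concrete syntax. A hypothetical sketch of the core idea, where holding an ancestor attribute implies authority over its descendants, might look as follows; the dotted naming scheme and the `may_access` helper are illustrative, not DCAC's actual API.

```python
def dominates(held: str, required: str) -> bool:
    """A held attribute grants a required one when it is the same
    attribute or an ancestor of it in the name hierarchy."""
    return required == held or required.startswith(held + ".")

def may_access(process_attrs, object_acl) -> bool:
    """Allow access when any attribute held by the process dominates
    any attribute listed on the object's ACL."""
    return any(dominates(h, r) for h in process_attrs for r in object_acl)

# A process holding .u.alice can open objects labeled .u.alice.photos,
# so a normal user can mint sub-principals without administrator help;
# a process holding only .u.bob cannot.
allowed = may_access([".u.alice"], [".u.alice.photos"])
denied = may_access([".u.bob"], [".u.alice.photos"])
```

This prefix rule is what lets normal users define their own groups and sub-principals while the kernel still enforces the policy.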

  5. Software Defined Multiband EVA Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to propose a reliable, lightweight, programmable, multi-band, multi-mode, miniaturized frequency-agile EVA software defined radio...

  6. Reconfigurable, Cognitive Software Defined Radio, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — IAI is actively developing Software Defined Radio platforms that can adaptively switch between different modes of operation by modifying both transmit waveforms and...

  7. Software Defined Multiband EVA Radio, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of Phase 2 is to build a reliable, lightweight, programmable, multi-mode, miniaturized EVA Software Defined Radio (SDR) that supports data telemetry,...

  8. Reconfigurable, Cognitive Software Defined Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc, (IAI) is currently developing a software defined radio (SDR) platform that can adaptively switch between different modes of operation for...

  9. Face-infringement space: the frame of reference of the ventral intraparietal area.

    Science.gov (United States)

    McCollum, Gin; Klam, François; Graf, Werner

    2012-07-01

    Experimental studies have shown that responses of ventral intraparietal area (VIP) neurons specialize in head movements and the environment near the head. VIP neurons respond to visual, auditory, and tactile stimuli, smooth pursuit eye movements, and passive and active movements of the head. This study demonstrates mathematical structure on a higher organizational level created within VIP by the integration of a complete set of variables covering face-infringement. Rather than positing dynamics in an a priori defined coordinate system such as those of physical space, we assemble neuronal receptive fields to find out what space of variables VIP neurons together cover. Section 1 presents a view of neurons as multidimensional mathematical objects. Each VIP neuron occupies or is responsive to a region in a sensorimotor phase space, thus unifying variables relevant to the disparate sensory modalities and movements. Convergence on one neuron joins variables functionally, as space and time are joined in relativistic physics to form a unified spacetime. The space of position and motion together forms a neuronal phase space, bridging neurophysiology and the physics of face-infringement. After a brief review of the experimental literature, the neuronal phase space natural to VIP is sequentially characterized, based on experimental data. Responses of neurons indicate variables that may serve as axes of neural reference frames, and neuronal responses have been so used in this study. The space of sensory and movement variables covered by VIP receptive fields joins visual and auditory space to body-bound sensory modalities: somatosensation and the inertial senses. This joining of allocentric and egocentric modalities is in keeping with the known relationship of the parietal lobe to the sense of self in space and to hemineglect, in both humans and monkeys. Following this inductive step, variables are formalized in terms of the mathematics of graph theory to deduce which

  10. Optimum Criteria for Developing Defined Structures

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Basic aspects concerning distributed applications are presented: definition, particularities and importance. For distributed applications, linear, arborescent, and graph structures are defined, with different versions and aggregation methods. Distributed applications have associated structures whose characteristics influence the costs of the stages in the development cycle and the exploitation costs transferred to each user. The complexity of the defined structures is analyzed. The minimum and maximum criteria for optimizing distributed application structures are enumerated.

  11. Deficient motion-defined and texture-defined figure-ground segregation in amblyopic children.

    Science.gov (United States)

    Wang, Jane; Ho, Cindy S; Giaschi, Deborah E

    2007-01-01

    Motion-defined form deficits in the fellow eye and the amblyopic eye of children with amblyopia implicate possible direction-selective motion processing or static figure-ground segregation deficits. Deficient motion-defined form perception in the fellow eye of amblyopic children may not be fully accounted for by a general motion processing deficit. This study investigates the contribution of figure-ground segregation deficits to the motion-defined form perception deficits in amblyopia. Performances of 6 amblyopic children (5 anisometropic, 1 anisostrabismic) and 32 control children with normal vision were assessed on motion-defined form, texture-defined form, and global motion tasks. Performance on motion-defined and texture-defined form tasks was significantly worse in amblyopic children than in control children. Performance on global motion tasks was not significantly different between the 2 groups. Faulty figure-ground segregation mechanisms are likely responsible for the observed motion-defined form perception deficits in amblyopia.

  12. Reference Ellipsoid and Geoid in Chronometric Geodesy

    Science.gov (United States)

    Kopeikin, Sergei M.

    2016-02-01

    Chronometric geodesy applies general relativity to study the problem of the shape of celestial bodies including the earth, and their gravitational field. The present paper discusses the relativistic problem of construction of a background geometric manifold that is used for describing a reference ellipsoid, geoid, the normal gravity field of the earth and for calculating geoid's undulation (height). We choose the perfect fluid with an ellipsoidal mass distribution uniformly rotating around a fixed axis as a source of matter generating the geometry of the background manifold through the Einstein equations. We formulate the post-Newtonian hydrodynamic equations of the rotating fluid to find out the set of algebraic equations defining the equipotential surface of the gravity field. In order to solve these equations we explicitly perform all integrals characterizing the interior gravitational potentials in terms of elementary functions depending on the parameters defining the shape of the body and the mass distribution. We employ the coordinate freedom of the equations to choose these parameters to make the shape of the rotating fluid configuration to be an ellipsoid of rotation. We derive expressions of the post-Newtonian mass and angular momentum of the rotating fluid as functions of the rotational velocity and the parameters of the ellipsoid including its bare density, eccentricity and semi-major axes. We formulate the post-Newtonian Pizzetti and Clairaut theorems that are used in geodesy to connect the parameters of the reference ellipsoid to the polar and equatorial values of force of gravity. We expand the post-Newtonian geodetic equations characterizing the reference ellipsoid into the Taylor series with respect to the eccentricity of the ellipsoid, and discuss the small-eccentricity approximation. Finally, we introduce the concept of relativistic geoid and its undulation with respect to the reference ellipsoid, and discuss how to calculate it in chronometric
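    For orientation, the classical (Newtonian) Clairaut theorem that the paper generalises relates the gravity flattening of the rotating ellipsoid to its geometric flattening; to first order in the small parameters it reads

```latex
\frac{g_p - g_e}{g_e} \;=\; \frac{5}{2}\,m - f,
\qquad
m \equiv \frac{\omega^2 a}{g_e},
\qquad
f \equiv \frac{a - c}{a},
```

    where g_p and g_e are the polar and equatorial values of gravity, ω is the rotation rate, and a and c are the semi-major and semi-minor axes. This standard first-order result is stated here only for context; the paper's post-Newtonian theorems add small relativistic corrections to relations of this kind.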

  13. Reference ellipsoid and geoid in chronometric geodesy

    Directory of Open Access Journals (Sweden)

    Sergei M Kopeikin

    2016-02-01

    Chronometric geodesy applies general relativity to study the problem of the shape of celestial bodies including the earth, and their gravitational field. The present paper discusses the relativistic problem of construction of a background geometric manifold that is used for describing a reference ellipsoid, geoid, the normal gravity field of the earth and for calculating geoid's undulation (height). We choose the perfect fluid with an ellipsoidal mass distribution uniformly rotating around a fixed axis as a source of matter generating the geometry of the background manifold through the Einstein equations. We formulate the post-Newtonian hydrodynamic equations of the rotating fluid to find out the set of algebraic equations defining the equipotential surface of the gravity field. In order to solve these equations we explicitly perform all integrals characterizing the interior gravitational potentials in terms of elementary functions depending on the parameters defining the shape of the body and the mass distribution. We employ the coordinate freedom of the equations to choose these parameters to make the shape of the rotating fluid configuration to be an ellipsoid of rotation. We derive expressions of the post-Newtonian mass and angular momentum of the rotating fluid as functions of the rotational velocity and the parameters of the ellipsoid including its bare density, eccentricity and semi-major axes. We formulate the post-Newtonian Pizzetti and Clairaut theorems that are used in geodesy to connect the parameters of the reference ellipsoid to the polar and equatorial values of force of gravity. We expand the post-Newtonian geodetic equations characterizing the reference ellipsoid into the Taylor series with respect to the eccentricity of the ellipsoid, and discuss the small-eccentricity approximation. Finally, we introduce the concept of relativistic geoid and its undulation with respect to the reference ellipsoid, and discuss how to calculate it in chronometric geodesy.

  14. Reference Ellipsoid and Geoid in Chronometric Geodesy

    Energy Technology Data Exchange (ETDEWEB)

    Kopeikin, Sergei M., E-mail: kopeikins@missouri.edu [Department of Physics and Astronomy, University of Missouri, Columbia, MO (United States); Department of Physical Geodesy and Remote Sensing, Siberian State University of Geosystems and Technologies, Novosibirsk (Russian Federation)

    2016-02-25

    Chronometric geodesy applies general relativity to study the problem of the shape of celestial bodies including the earth, and their gravitational field. The present paper discusses the relativistic problem of construction of a background geometric manifold that is used for describing a reference ellipsoid, geoid, the normal gravity field of the earth and for calculating geoid's undulation (height). We choose the perfect fluid with an ellipsoidal mass distribution uniformly rotating around a fixed axis as a source of matter generating the geometry of the background manifold through the Einstein equations. We formulate the post-Newtonian hydrodynamic equations of the rotating fluid to find out the set of algebraic equations defining the equipotential surface of the gravity field. In order to solve these equations we explicitly perform all integrals characterizing the interior gravitational potentials in terms of elementary functions depending on the parameters defining the shape of the body and the mass distribution. We employ the coordinate freedom of the equations to choose these parameters to make the shape of the rotating fluid configuration to be an ellipsoid of rotation. We derive expressions of the post-Newtonian mass and angular momentum of the rotating fluid as functions of the rotational velocity and the parameters of the ellipsoid including its bare density, eccentricity and semi-major axes. We formulate the post-Newtonian Pizzetti and Clairaut theorems that are used in geodesy to connect the parameters of the reference ellipsoid to the polar and equatorial values of force of gravity. We expand the post-Newtonian geodetic equations characterizing the reference ellipsoid into the Taylor series with respect to the eccentricity of the ellipsoid, and discuss the small-eccentricity approximation. Finally, we introduce the concept of relativistic geoid and its undulation with respect to the reference ellipsoid, and discuss how to calculate it in

  15. Defining health-related quality of life for young wheelchair users: A qualitative health economics study.

    Directory of Open Access Journals (Sweden)

    Nathan Bray

    Wheelchairs for children with impaired mobility provide health, developmental and psychosocial benefits; however, there is limited understanding of how mobility aids affect the health-related quality of life of children with impaired mobility. Preference-based health-related quality of life outcome measures are used to calculate quality-adjusted life years, an important concept in health economics. The aim of this research was to understand how young wheelchair users and their parents define health-related quality of life in relation to mobility impairment and wheelchair use. The sampling frame was children with impaired mobility (≤18 years) who use a wheelchair, and their parents. Data were collected through semi-structured face-to-face interviews conducted in participants' homes. Qualitative framework analysis was used to analyse the interview transcripts. An a priori thematic coding framework was developed. Emerging codes were grouped into categories and refined into analytical themes. The data were used to build an understanding of how children with impaired mobility define health-related quality of life in relation to mobility impairment, and to assess the applicability of two standard measures of health-related quality of life. Eleven children with impaired mobility and 24 parents were interviewed across 27 interviews. Participants defined mobility-related quality of life through three distinct but interrelated concepts: (1) participation and positive experiences; (2) self-worth and feeling fulfilled; (3) health and functioning. A good degree of consensus was found between child and parent responses, although there was some evidence to suggest a shift in perception of mobility-related quality of life with child age. Young wheelchair users define health-related quality of life in a distinct way as a result of their mobility impairment and adaptation use. Generic, preference-based measures of health-related quality of life lack sensitivity in this

  16. Using greenhouse gas fluxes to define soil functional types

    Energy Technology Data Exchange (ETDEWEB)

    Petrakis, Sandra; Barba, Josep; Bond-Lamberty, Ben; Vargas, Rodrigo

    2017-12-04

    Soils provide key ecosystem services and directly control ecosystem functions; thus, there is a need to define the reference state of soil functionality. Most common functional classifications of ecosystems are vegetation-centered and neglect soil characteristics and processes. We propose Soil Functional Types (SFTs) as a conceptual approach to represent and describe the functionality of soils based on characteristics of their greenhouse gas (GHG) flux dynamics. We used automated measurements of CO2, CH4 and N2O in a forested area to define SFTs following a simple statistical framework. This study supports the hypothesis that SFTs provide additional insights on the spatial variability of soil functionality beyond information represented by commonly measured soil parameters (e.g., soil moisture, soil temperature, litter biomass). We discuss the implications of this framework at the plot-scale and the potential of this approach at larger scales. This approach is a first step to provide a framework to define SFTs, but a community effort is necessary to harmonize any global classification for soil functionality. A global application of the proposed SFT framework will only be possible if there is a community-wide effort to share data and create a global database of GHG emissions from soils.
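    The "simple statistical framework" behind the SFTs is not detailed in this record. One plausible reading, grouping measurement locations by their characteristic GHG flux values with an unsupervised clustering step, can be sketched as follows; the two-cluster k-means, the feature choice, and the synthetic flux values are assumptions for illustration only.

```python
import math

def kmeans(points, k=2, iters=20):
    """Tiny k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its members."""
    centroids = points[:k]  # deterministic seeding, fine for a sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[j].append(p)
        centroids = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Synthetic (CO2, CH4) mean fluxes per chamber location: two regimes,
# dry-soil CH4 uptake (negative) vs. wet-soil CH4 emission (positive).
points = [(2.1, -0.5), (2.3, -0.4), (1.9, -0.6),
          (4.0, 1.2), (4.2, 1.0), (3.8, 1.3)]
centroids, clusters = kmeans(points)
```

Each resulting cluster would correspond to one candidate Soil Functional Type, to be compared afterwards against soil moisture, temperature and litter data.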

  17. REDEFINING ENSO EPISODES BASED ON CHANGED CLIMATE REFERENCES

    Institute of Scientific and Technical Information of China (English)

    LI Xiao-yan; ZHAI Pan-mao; REN Fu-min

    2005-01-01

    Through studying changes in ENSO indices relative to the change of climate reference period from 1961–1990 to 1971–2000, the study generated new standards to define ENSO episodes and their intensities. Then, according to the new climate references and new index standards, ENSO episodes and their intensities for the period 1951–2003 have been classified. Finally, an analysis has been performed comparing the new characteristics with the old ones for ENSO periods, peak values and intensities.
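    The new index standards themselves are not reproduced in this record, but the underlying operation can be sketched: recompute index anomalies against the newer climatology, then flag an episode when the anomaly stays beyond a threshold for several consecutive months. The ±0.5 °C threshold, the six-month run length, and the toy series below are illustrative assumptions, not the paper's standards.

```python
def episodes(anom, thresh=0.5, run=6):
    """Scan a monthly anomaly series (degC, relative to the chosen
    climate reference period) and return (kind, start, end) index
    spans where the anomaly stays beyond +/-thresh for >= run months."""
    out, i, n = [], 0, len(anom)
    while i < n:
        if abs(anom[i]) >= thresh:
            sign = 1 if anom[i] > 0 else -1
            j = i
            while j < n and sign * anom[j] >= thresh:
                j += 1
            if j - i >= run:
                out.append(("warm" if sign > 0 else "cold", i, j - 1))
            i = j
        else:
            i += 1
    return out

# Anomalies would first be formed as sst[i] - clim[i % 12] against the
# 1971-2000 climatology; this toy series is already in anomaly form.
anom = [0.1, 0.6, 0.7, 0.8, 0.9, 1.0, 0.6, 0.6, 0.2,
        -0.6, -0.7, -0.8, -0.9, -1.0, -0.6, 0.0]
found = episodes(anom)
```

Changing the climatology shifts every anomaly, so a borderline month can cross the threshold and an episode can appear, disappear, or change intensity, which is exactly why the episode list must be rebuilt under the new reference.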

  18. User Preferences in Reference Services: Virtual Reference and Academic Libraries

    Science.gov (United States)

    Cummings, Joel; Cummings, Lara; Frederiksen, Linda

    2007-01-01

    This study examines the use of chat in an academic library's user population and where virtual reference services might fit within the spectrum of public services offered by academic libraries. Using questionnaires, this research demonstrates that many within the academic community are open to the idea of chat-based reference or using chat for…

  19. 16 CFR 500.2 - Terms defined.

    Science.gov (United States)

    2010-01-01

    ... individuals, or use by individuals for purposes of personal care or in the performance of services ordinarily... or outer wrappings used by retailers to ship or deliver any commodity to retail customers if such... practical importance. (k) The term customary inch/pound refers to units belonging to the system of units...

  20. Reference Dose Rates for Fluoroscopy Guided Interventions

    International Nuclear Information System (INIS)

    Geleijns, J.; Broerse, J.J.; Hummel, W.A.; Schalij, M.J.; Schultze Kool, L.J.; Teeuwisse, W.; Zoetelief, J.

    1998-01-01

    The wide diversity of fluoroscopy guided interventions which have become available in recent years has improved patient care. They are being performed in increasing numbers, particularly at departments of cardiology and radiology. Some procedures are very complex and require extended fluoroscopy times, i.e. longer than 30 min, and radiation exposure of patient and medical staff is in some cases rather high. The occurrence of radiation-induced skin injuries on patients has shown that radiation protection for fluoroscopy guided interventions should not only be focused on stochastic effects, i.e. tumour induction and hereditary risks, but also on potential deterministic effects. Reference dose levels are introduced by the Council of the European Communities as an instrument to achieve optimisation of radiation protection in radiology. Reference levels in conventional diagnostic radiology are usually expressed as entrance skin dose or dose-area product. It is not possible to define a standard procedure for complex interventions due to the large inter-patient variations with regard to the complexity of specific interventional procedures. Consequently, it is not realistic to establish a reference skin dose or dose-area product for complex fluoroscopy guided interventions. As an alternative, reference values for fluoroscopy guided interventions can be expressed as the entrance dose rates on a homogeneous phantom and on the image intensifier. A protocol has been developed and applied during a nationwide survey of fluoroscopic dose rate during catheter ablations. From this survey reference entrance dose rates of respectively 30 mGy·min⁻¹ on a polymethylmethacrylate (PMMA) phantom with a thickness of 21 cm, and of 0.8 μGy·s⁻¹ on the image intensifier have been derived. (author)

  1. Java for dummies quick reference

    CERN Document Server

    Lowe, Doug

    2012-01-01

    A reference that answers your questions as you move through your coding The demand for Android programming and web apps continues to grow at an unprecedented pace and Java is the preferred language for both. Java For Dummies Quick Reference keeps you moving through your coding while you solve a problem, look up a command or syntax, or search for a programming tip. Whether you're a Java newbie or a seasoned user, this fast reference offers you quick access to solutions without requiring that you wade through pages of tutorial material. Leverages the true reference format that is organized with

  2. Hanford defined waste model limitations and improvements

    International Nuclear Information System (INIS)

    HARMSEN, R.W.

    1999-01-01

    Recommendation 93-5 Implementation Plan, Milestone 5,6.3.1.i requires issuance of this report, which addresses "updates to the tank contents model". This report summarizes the review of the limitations of the Hanford Defined Waste model, Revision 4, and provides conclusions and recommendations for potential updates to the model

  3. Parallel Education and Defining the Fourth Sector.

    Science.gov (United States)

    Chessell, Diana

    1996-01-01

    Parallel to the primary, secondary, postsecondary, and adult/community education sectors is education not associated with formal programs--learning in arts and cultural sites. The emergence of cultural and educational tourism is an opportunity for adult/community education to define itself by extending lifelong learning opportunities into parallel…

  4. Bruxism defined and graded: an international consensus

    NARCIS (Netherlands)

    Lobbezoo, F.; Ahlberg, J.; Glaros, A.G.; Kato, T.; Koyano, K.; Lavigne, G.J.; de Leeuw, R.; Manfredini, D.; Svensson, P.; Winocur, E.

    2013-01-01

    To date, there is no consensus about the definition and diagnostic grading of bruxism. A written consensus discussion was held among an international group of bruxism experts as to formulate a definition of bruxism and to suggest a grading system for its operationalisation. The expert group defined

  5. 7 CFR 28.950 - Terms defined.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing..., TESTING, AND STANDARDS Cotton Fiber and Processing Tests Definitions § 28.950 Terms defined. As used... Agricultural Marketing Service of the U.S. Department of Agriculture. (c) Administrator. The Administrator of...

  6. 47 CFR 54.401 - Lifeline defined.

    Science.gov (United States)

    2010-10-01

    ... SERVICE Universal Service Support for Low-Income Consumers § 54.401 Lifeline defined. (a) As used in this subpart, Lifeline means a retail local service offering: (1) That is available only to qualifying low-income consumers; (2) For which qualifying low-income consumers pay reduced charges as a result of...

  7. How Should Energy Be Defined throughout Schooling?

    Science.gov (United States)

    Bächtold, Manuel

    2018-01-01

    The question of how to teach energy has been renewed by recent studies focusing on the learning and teaching progressions for this concept. In this context, one question has been, for the most part, overlooked: how should energy be defined throughout schooling? This paper addresses this question in three steps. We first identify and discuss two…

  8. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data Analytics and Software Defined Networking (SDN) are helping to drive the management of the extraordinary increase in data usage and computer processing power provided by Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other in delivering more efficient services.

  9. Delta Semantics Defined By Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kyng, Morten; Madsen, Ole Lehrmann

    and the possibility of using predicates to specify state changes. In this paper a formal semantics for Delta is defined and analysed using Petri nets. Petri nets was chosen because the ideas behind Petri nets and Delta concide on several points. A number of proposals for changes in Delta, which resulted from...

  10. Towards a Southern African English Defining Vocabulary

    African Journals Online (AJOL)

    user

    of parameters, such as avoiding synonyms and antonyms, to determine which words are necessary to write definitions in a concise and simple way. It has been found that existing defining vocabularies lack certain words that would make definitions more accessible to southern African learners, and therefore there is a need ...

  11. Spaces defined by the Paley function

    Energy Technology Data Exchange (ETDEWEB)

    Astashkin, S V [Samara State University, Samara (Russian Federation); Semenov, E M [Voronezh State University, Faculty of Mathematics, Voronezh (Russian Federation)

    2013-07-31

    The paper is concerned with Haar and Rademacher series in symmetric spaces, and also with the properties of spaces defined by the Paley function. In particular, the symmetric hull of the space of functions with uniformly bounded Paley function is found. Bibliography: 27 titles.

  12. Pointwise extensions of GSOS-defined operations

    NARCIS (Netherlands)

    Hansen, H.H.; Klin, B.

    2011-01-01

    Final coalgebras capture system behaviours such as streams, infinite trees and processes. Algebraic operations on a final coalgebra can be defined by distributive laws (of a syntax functor S over a behaviour functor F). Such distributive laws correspond to abstract specification formats. One such

  13. Pointwise Extensions of GSOS-Defined Operations

    NARCIS (Netherlands)

    H.H. Hansen (Helle); B. Klin

    2011-01-01

    Final coalgebras capture system behaviours such as streams, infinite trees and processes. Algebraic operations on a final coalgebra can be defined by distributive laws (of a syntax functor S over a behaviour functor F). Such distributive laws correspond to abstract specification

  14. Defining Virtual Reality: Dimensions Determining Telepresence.

    Science.gov (United States)

    Steuer, Jonathan

    1992-01-01

    Defines virtual reality as a particular type of experience (in terms of "presence" and "telepresence") rather than as a collection of hardware. Maintains that media technologies can be classified and studied in terms of vividness and interactivity, two attributes on which virtual reality ranks very high. (SR)

  15. A self-defining hierarchical data system

    Science.gov (United States)

    Bailey, J.

    1992-01-01

    The Self-Defining Data System (SDS) is a system which allows the creation of self-defining hierarchical data structures in a form which allows the data to be moved between different machine architectures. Because the structures are self-defining they can be used for communication between independent modules in a distributed system. Unlike disk-based hierarchical data systems such as Starlink's HDS, SDS works entirely in memory and is very fast. Data structures are created and manipulated as internal dynamic structures in memory managed by SDS itself. A structure may then be exported into a caller supplied memory buffer in a defined external format. This structure can be written as a file or sent as a message to another machine. It remains static in structure until it is reimported into SDS. SDS is written in portable C and has been run on a number of different machine architectures. Structures are portable between machines with SDS looking after conversion of byte order, floating point format, and alignment. A Fortran callable version is also available for some machines.
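The export/import cycle this abstract describes can be illustrated with a minimal self-describing encoding. This is a hypothetical format, not the actual SDS byte layout or API: each node carries a type tag (and, for structures, a child count), so a receiver can rebuild the hierarchy with no external schema, and all integers are packed big-endian so the buffer is portable across machine architectures.

```python
import struct

# Type tags for the illustrative external format (not the real SDS codes).
T_STRUCT, T_INT, T_DOUBLE, T_STR = 0, 1, 2, 3

def export(node):
    """Serialise a nested dict of ints/floats/strings to portable big-endian bytes."""
    if isinstance(node, dict):
        out = struct.pack(">BI", T_STRUCT, len(node))
        for name, child in node.items():
            enc = name.encode()
            out += struct.pack(">H", len(enc)) + enc + export(child)
        return out
    if isinstance(node, int):
        return struct.pack(">Bq", T_INT, node)
    if isinstance(node, float):
        return struct.pack(">Bd", T_DOUBLE, node)
    enc = node.encode()
    return struct.pack(">BI", T_STR, len(enc)) + enc

def reimport(buf, pos=0):
    """Rebuild the structure from the buffer; returns (value, next_position)."""
    tag = buf[pos]; pos += 1
    if tag == T_STRUCT:
        (n,) = struct.unpack_from(">I", buf, pos); pos += 4
        node = {}
        for _ in range(n):
            (ln,) = struct.unpack_from(">H", buf, pos); pos += 2
            name = buf[pos:pos + ln].decode(); pos += ln
            node[name], pos = reimport(buf, pos)
        return node, pos
    if tag == T_INT:
        (v,) = struct.unpack_from(">q", buf, pos); return v, pos + 8
    if tag == T_DOUBLE:
        (v,) = struct.unpack_from(">d", buf, pos); return v, pos + 8
    (ln,) = struct.unpack_from(">I", buf, pos); pos += 4
    return buf[pos:pos + ln].decode(), pos + ln

# Round trip: the exported buffer could be written to disk or sent as a message.
obs = {"header": {"telescope": "AAT", "exposure": 120.5}, "counts": 4096}
value, _ = reimport(export(obs))
assert value == obs
```

The fixed byte order plays the role SDS assigns to its conversion layer; a real implementation would also tag floating-point format and alignment, as the abstract notes.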

  16. Effect of reference loads on fracture mechanics analysis of surface cracked pipe based on reference stress method

    International Nuclear Information System (INIS)

    Shim, Do Jun; Son, Beom Goo; Kim, Young Jin; Kim, Yun Jae

    2004-01-01

    To investigate relevance of the definition of the reference stress to estimate J and C * for surface crack problems, this paper compares FE J and C * results for surface cracked pipes with those estimated according to the reference stress approach using various definitions of the reference stress. Pipes with part circumferential inner surface crack and finite internal axial crack are considered, subject to internal pressure and global bending. The crack depth and aspect ratio are systematically varied. The reference stress is defined in four different ways using (I) the local limit load, (II) the global limit load, (III) the global limit load determined from the FE limit analysis, and (IV) the optimised reference load. It is found that the reference stress based on the local limit load gives overall excessively conservative estimates of J and C * . Use of the global limit load clearly reduces the conservatism, compared to that of the local limit load, although it can provide sometimes non-conservative estimates of J and C * . The use of the FE global limit load gives overall non-conservative estimates of J and C * . The reference stress based on the optimised reference load gives overall accurate estimates of J and C * , compared to other definitions of the reference stress. Based on the present finding, general guidance on the choice of the reference stress for surface crack problems is given

  17. Moving Reference to the Web.

    Science.gov (United States)

    McGlamery, Susan; Coffman, Steve

    2000-01-01

    Explores the possibility of using Web contact center software to offer reference assistance to remote users. Discusses a project by the Metropolitan Cooperative Library System/Santiago Library System consortium to test contact center software and to develop a virtual reference network. (Author/LRW)

  18. Reference vectors in economic choice

    Directory of Open Access Journals (Sweden)

    Teycir Abdelghani GOUCHA

    2013-07-01

Full Text Available In this paper the introduction of the notion of a reference vector paves the way for a combination of classical and social approaches in the framework of referential preferences given by matrix groups. It is shown that individual demand issuing from rational decision does not depend on that reference.

  19. "In Your Face" Reference Service.

    Science.gov (United States)

    Lipow, Anne Grodzins

    1999-01-01

    Discusses changes in library reference service that have occurred with growing Internet use. Topics include the human factor that is still needed; the nature of reference questions; the goal of user self-sufficiency; the invisible nature of much of librarians' work; and providing real-time, interactive point-of-need service to remote users. (LRW)

  20. Reference Services: A Handmaid's Tale.

    Science.gov (United States)

    Beck, Clare

    1991-01-01

    Discussion of problems in library reference services focuses on the influence of gender roles. A historical overview of gender roles in the development of American librarianship is presented that highlights stereotyped views of and attitudes toward women, which the author suggests still have influences on librarianship today. (17 references) (LRW)

  1. Defining Tiger Parenting in Chinese Americans.

    Science.gov (United States)

    Kim, Su Yeong

    2013-09-01

    "Tiger" parenting, as described by Amy Chua [2011], has instigated scholarly discourse on this phenomenon and its possible effects on families. Our eight-year longitudinal study, published in the Asian American Journal of Psychology [Kim, Wang, Orozco-Lapray, Shen, & Murtuza, 2013b], demonstrates that tiger parenting is not a common parenting profile in a sample of 444 Chinese American families. Tiger parenting also does not relate to superior academic performance in children. In fact, the best developmental outcomes were found among children of supportive parents. We examine the complexities around defining tiger parenting by reviewing classical literature on parenting styles and scholarship on Asian American parenting, along with Amy Chua's own description of her parenting method, to develop, define, and categorize variability in parenting in a sample of Chinese American families. We also provide evidence that supportive parenting is important for the optimal development of Chinese American adolescents.

  2. Defining enthesitis in spondyloarthritis by ultrasound

    DEFF Research Database (Denmark)

    Terslev, Lene; Naredo, E; Iagnocco, A

    2014-01-01

    Objective: To standardize ultrasound (US) in enthesitis. Methods: An Initial Delphi exercise was undertaken to define US detected enthesitis and its core components. These definitions were subsequently tested on static images taken from Spondyloarthritis (SpA) patients in order to evaluate...... elementary component. On static images the intra-observer reliability showed a high degree of variability for the detection of elementary lesions with kappa coefficients ranging from 0.14 - 1. The inter-observer kappa value was variable with the lowest kappa for enthesophytes (0.24) and the best for Doppler...... activity at the enthesis (0.63). Conclusion: This is the first consensus based definition of US enthesitis and its elementary components and the first step performed to ensure a higher degree of homogeneity and comparability of results between studies and in daily clinical work. Defining Enthesitis...

  3. Control of System with Defined Risk Level

    Directory of Open Access Journals (Sweden)

    Pavol Tomasov

    2002-01-01

Full Text Available The following paper presents the basic requirements for the control of a system with a defined risk level. The paper introduces the theoretical apparatus developed over several years of research in the Department of Information and Safety Systems in this area, involving the modification or creation of new parts of information theory, system theory and control theory. These parts are necessary for analysis and synthesis tasks in systems where the dominant attribute of control is a defined risk level. The basic problem is the creation of a protection mechanism against threats from inside the controlled system and from its environs. Each risk-reduction mechanism requires some redundancy, which should be put into the control algorithm in an exactly determined way.

  4. FINANCIAL ACCOUNTING QUALITY AND ITS DEFINING CHARACTERISTICS

    Directory of Open Access Journals (Sweden)

    Andra M. ACHIM

    2014-11-01

Full Text Available The importance of high-quality financial statements is highlighted by the main standard-setting institutions active in the field of accounting and reporting, which have issued conceptual frameworks that state and describe the qualitative characteristics of accounting information. In this qualitative study, the research methodology consists of reviewing the literature on the definition of accounting quality and striving to understand how it can be explained. The main objective of the study is to identify the characteristics information should possess in order to be of high quality; these characteristics also contribute to the way financial accounting quality is defined. The main conclusions of this research are that financial accounting quality cannot be uniquely defined and that financial information is of good quality when it exhibits the characteristics incorporated in the conceptual frameworks issued by both the International Accounting Standards Board and the Financial Accounting Standards Board.

  5. Exploring self-defining memories in schizophrenia.

    Science.gov (United States)

    Raffard, Stéphane; D'Argembeau, Arnaud; Lardi, Claudia; Bayard, Sophie; Boulenger, Jean-Philippe; Van Der Linden, Martial

    2009-01-01

Previous studies have shown that patients with schizophrenia are impaired in recalling specific events from their personal past. However, the relationship between autobiographical memory impairments and disturbance of the sense of identity in schizophrenia has not been investigated in detail. In this study the authors investigated schizophrenic patients' ability to recall self-defining memories; that is, memories that play an important role in building and maintaining the self-concept. Results showed that patients recalled as many specific self-defining memories as healthy participants. However, patients with schizophrenia exhibited an abnormal reminiscence bump and reported different types of thematic content (i.e., they recalled fewer memories about past achievements and more memories regarding hospitalisation and the stigmatisation of illness). Furthermore, the findings suggest that impairments in extracting meaning from personal memories could represent a core disturbance of autobiographical memory in patients with schizophrenia.

  6. Defining Terrorism at the Special Tribunal for Lebanon

    Directory of Open Access Journals (Sweden)

    Prakash Puchooa

    2011-11-01

Full Text Available On 16 February 2011, the Appeals Chamber of the Special Tribunal for Lebanon (STL) issued an interlocutory decision regarding the legal definition of terrorism. This decision was in response to a Pre-Trial Chamber (PTC) list of questions requesting, 'inter alia', an elaboration of the elements of this crime. In exploring this matter, the Appeals Chamber defined the subjective ('mens rea') and objective ('actus reus') elements of terrorism by referring to domestic Lebanese law and international law. It thereby set out the applicable law for the court. The consequence of this decision, however, is not limited to the law of the STL but may be seen as having far-reaching consequences for the conception of terrorism under both international law and International Criminal Law (ICL). Given the significance of the Appeals Chamber judgment, this paper will scrutinise three areas of concern regarding its propriety:

  7. Defining operational taxonomic units using DNA barcode data.

    Science.gov (United States)

    Blaxter, Mark; Mann, Jenna; Chapman, Tom; Thomas, Fran; Whitton, Claire; Floyd, Robin; Abebe, Eyualem

    2005-10-29

    The scale of diversity of life on this planet is a significant challenge for any scientific programme hoping to produce a complete catalogue, whatever means is used. For DNA barcoding studies, this difficulty is compounded by the realization that any chosen barcode sequence is not the gene 'for' speciation and that taxa have evolutionary histories. How are we to disentangle the confounding effects of reticulate population genetic processes? Using the DNA barcode data from meiofaunal surveys, here we discuss the benefits of treating the taxa defined by barcodes without reference to their correspondence to 'species', and suggest that using this non-idealist approach facilitates access to taxon groups that are not accessible to other methods of enumeration and classification. Major issues remain, in particular the methodologies for taxon discrimination in DNA barcode data.
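Treating barcode-defined taxa without reference to named species usually means binning sequences into molecular operational taxonomic units at a fixed divergence cutoff. The sketch below is a generic greedy clustering, not the authors' actual pipeline; the sequences, cutoff and representative-based linkage are all illustrative assumptions.

```python
def divergence(a, b):
    """Proportion of mismatched positions between two aligned, equal-length sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def cluster_otus(seqs, cutoff=0.03):
    """Greedy binning: each sequence joins the first existing OTU whose
    representative lies within `cutoff` divergence, else it seeds a new OTU."""
    otus = []  # list of (representative sequence, member list)
    for s in seqs:
        for rep, members in otus:
            if divergence(rep, s) <= cutoff:
                members.append(s)
                break
        else:
            otus.append((s, [s]))
    return otus

# Toy aligned reads; a permissive cutoff of 20% yields two OTUs.
reads = ["ACGTACGT", "ACGTACGA", "TTTTCCCC", "TTTTCCCG"]
bins = cluster_otus(reads, cutoff=0.2)
assert len(bins) == 2
```

The result depends on input order and cutoff, which is precisely why such taxon units are reported as operational rather than equated with species.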

  8. Improving network management with Software Defined Networking

    International Nuclear Information System (INIS)

    Dzhunev, Pavel

    2013-01-01

Software-defined networking (SDN) has been developed as an alternative to closed networks in data-processing centres by providing a means to separate the control plane from the data plane in switches and routers. SDN introduces new possibilities for network management and configuration methods. In this article, we identify problems with the current state-of-the-art network configuration and management mechanisms and introduce mechanisms to improve various aspects of network management.

  9. Stateless multicast switching in software defined networks

    OpenAIRE

    Reed, Martin J.; Al-Naday, Mays; Thomos, Nikolaos; Trossen, Dirk; Petropoulos, George; Spirou, Spiros

    2016-01-01

    Multicast data delivery can significantly reduce traffic in operators' networks, but has been limited in deployment due to concerns such as the scalability of state management. This paper shows how multicast can be implemented in contemporary software defined networking (SDN) switches, with less state than existing unicast switching strategies, by utilising a Bloom Filter (BF) based switching technique. Furthermore, the proposed mechanism uses only proactive rule insertion, and thus, is not l...
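The stateless idea in this abstract can be sketched as in-packet Bloom-filter forwarding: the source ORs the bit masks of every link on the delivery tree into one header filter, and each switch forwards on any local link whose bits are all set, keeping no per-group state. The filter width, hash scheme and link names below are illustrative assumptions, not the paper's parameters.

```python
import hashlib

M, K = 64, 3  # filter width in bits and number of hash functions (illustrative)

def link_bits(link_id):
    """Bit positions a link identifier sets in the in-packet Bloom filter."""
    return {int(hashlib.sha256(f"{link_id}:{i}".encode()).hexdigest(), 16) % M
            for i in range(K)}

def encode_tree(link_ids):
    """Source-side: OR all link masks of the delivery tree into one filter."""
    bf = 0
    for lid in link_ids:
        for b in link_bits(lid):
            bf |= 1 << b
    return bf

def forward(bf, local_links):
    """Switch-side: forward on every local link whose bits are all set.
    No multicast group table is consulted -- the packet carries the state."""
    return [lid for lid in local_links
            if all(bf >> b & 1 for b in link_bits(lid))]

tree = ["s1->s2", "s1->s3", "s3->h5"]
bf = encode_tree(tree)
# Every tree link at a switch matches; non-tree links may rarely false-positive.
assert set(forward(bf, ["s1->s2", "s1->s3", "s1->h9"])) >= {"s1->s2", "s1->s3"}
```

The characteristic trade-off is visible in the last line: membership tests can yield false positives, so a link off the tree may occasionally forward spuriously, which is the price paid for constant-size headers and zero switch state.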

  10. Defining and Distinguishing Traditional and Religious Terrorism

    OpenAIRE

    Gregg, Heather S.

    2014-01-01

    The article of record may be found at: http://dx.doi.org/10.1080/23296151.2016.1239978 thus offering few if any policy options for counterterrorism measures. This assumption about religious terrorism stems from two challenges in the literature: disproportionate attention to apocalyptic terrorism, and a lack of distinction between religious terrorism and its secular counterpart. This article, therefore, aims to do four things: define and differentiate religiously motivated terrorism from tr...

  11. Defining Trust Using Expected Utility Theory

    OpenAIRE

    Arai, Kazuhiro

    2009-01-01

Trust has been discussed in many social sciences, including economics, psychology, and sociology. However, there is no widely accepted definition of trust. In particular, there is no definition that can be used for economic analysis. This paper regards trust as expectation and defines it using expected utility theory together with concepts such as betrayal premium. In doing so, it rejects the widely accepted black-and-white view that (un)trustworthy people are always (un)trustworthy. This pape...

  12. On Undefined and Meaningless in Lambda Definability

    OpenAIRE

    de Vries, Fer-Jan

    2016-01-01

We distinguish between undefined terms as used in lambda definability of partial recursive functions and meaningless terms as used in infinite lambda calculus for the infinitary term models that generalise the Böhm model. While there are uncountably many known sets of meaningless terms, there are four known sets of undefined terms. Two of these four are sets of meaningless terms. In this paper we first present a set of sufficient conditions for a set of lambda terms to se...

  13. Financial Reporting from the Reference Theories’ Perspective

    OpenAIRE

    Victor Munteanu; Lavinia Copcinschi; Anda Laceanu; Carmen Luschi

    2014-01-01

International accounting standards are underpinned by a normative approach to accounting, in the sense that they are based on a conceptual accounting framework assimilated to a theoretical framework. The conceptual framework's development is based on a priori theory, initiated by Chambers in his article published in 1955, where he defends the need for a theory of practical accounting and a detachment from the descriptive theories of an inductive approach. Developing the accounting standards on suc...

  14. How Should Energy Be Defined Throughout Schooling?

    Science.gov (United States)

    Bächtold, Manuel

    2017-02-01

    The question of how to teach energy has been renewed by recent studies focusing on the learning and teaching progressions for this concept. In this context, one question has been, for the most part, overlooked: how should energy be defined throughout schooling? This paper addresses this question in three steps. We first identify and discuss two main approaches in physics concerning the definition of energy, one claiming there is no satisfactory definition and taking conservation as a fundamental property, and the other based on Rankine's definition of energy as the capacity of a system to produce changes. We then present a study concerning how energy is actually defined throughout schooling in the case of France by analyzing national programs, physics textbooks, and the answers of teachers to a questionnaire. This study brings to light a consistency problem in the way energy is defined across school years: in primary school, an adapted version of Rankine's definition is introduced and conservation is ignored; in high school, conservation is introduced and Rankine's definition is ignored. Finally, we address this consistency problem by discussing possible teaching progressions. We argue in favor of the use of Rankine's definition throughout schooling: at primary school, it is a possible substitute to students' erroneous conceptions; at secondary school, it might help students become aware of the unifying role of energy and thereby overcome the compartmentalization problem.

  15. A fiducial reference site for satellite altimetry in Crete, Greece

    DEFF Research Database (Denmark)

    Mertikas, Stelios; Donlon, Craig; Mavrokordatos, Constantin

With the advent of diverse satellite altimeters and variant measuring techniques, it has become accepted in the scientific community that an absolute reference Cal/Val site should be regularly maintained to define, monitor and control the responses of any altimetric system. This work sets the ground for the...

  16. Biological reference points for fish stocks in a multispecies context

    DEFF Research Database (Denmark)

    Collie, J.S.; Gislason, Henrik

    2001-01-01

    Biological reference points (BRPs) are widely used to define safe levels of harvesting for marine fish populations. Most BRPs are either minimum acceptable biomass levels or maximum fishing mortality rates. The values of BRPs are determined from historical abundance data and the life...

  17. District-level hospital trauma care audit filters: Delphi technique for defining context-appropriate indicators for quality improvement initiative evaluation in developing countries.

    Science.gov (United States)

    Stewart, Barclay T; Gyedu, Adam; Quansah, Robert; Addo, Wilfred Larbi; Afoko, Akis; Agbenorku, Pius; Amponsah-Manu, Forster; Ankomah, James; Appiah-Denkyira, Ebenezer; Baffoe, Peter; Debrah, Sam; Donkor, Peter; Dorvlo, Theodor; Japiong, Kennedy; Kushner, Adam L; Morna, Martin; Ofosu, Anthony; Oppong-Nketia, Victor; Tabiri, Stephen; Mock, Charles

    2016-01-01

Prospective clinical audit of trauma care improves outcomes for the injured in high-income countries (HICs). However, equivalent, context-appropriate audit filters for use in low- and middle-income country (LMIC) district-level hospitals have not been well established. We aimed to develop context-appropriate trauma care audit filters for district-level hospitals in Ghana, as well as for other LMICs more broadly. Consensus on trauma care audit filters was built among twenty panellists using a Delphi technique with four anonymous, iterative surveys designed to elicit: (i) trauma care processes to be measured; (ii) important features of audit filters for the district-level hospital setting; and (iii) potentially useful filters. Filters were ranked on a scale from 0 to 10 (10 being very useful). Consensus was measured with the average percent majority opinion (APMO) cut-off rate. Target consensus was defined a priori as: a median rank of ≥9 for each filter and an APMO cut-off rate of ≥0.8. Panellists agreed on trauma care processes to target (e.g. triage, phases of trauma assessment, early referral if needed) and specific features of filters for district-level hospital use (e.g. simplicity, unassuming of resource capacity). The APMO cut-off rate increased successively: Round 1--0.58; Round 2--0.66; Round 3--0.76; and Round 4--0.82. After Round 4, target consensus on 22 trauma care and referral-specific filters was reached. Example filters include: triage--vital signs are recorded within 15 min of arrival (must include breathing assessment, heart rate, blood pressure, oxygen saturation if available); circulation--a large bore IV was placed within 15 min of patient arrival; referral--if referral is activated, the referring clinician and receiving facility communicate by phone or radio prior to transfer. This study proposes trauma care audit filters appropriate for LMIC district-level hospitals. Given the successes of similar filters in HICs and obstetric care filters in LMICs
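The study's a priori consensus rule (median rank ≥9 per filter, with a round-level APMO cut-off rate ≥0.8) can be sketched in a few lines. The APMO formula below follows one common Delphi formulation (share of panellists siding with the majority view, averaged over all opinions expressed) and is an assumption, as are the example filters and ranks:

```python
from statistics import median

def apmo(rank_matrix, agree=9):
    """Average percent majority opinion: per filter, count panellists siding
    with the majority view (agree if rank >= `agree`, else disagree), then
    divide by all opinions expressed. Assumed formulation, not the paper's."""
    majority_votes = total = 0
    for ranks in rank_matrix:
        agrees = sum(r >= agree for r in ranks)
        majority_votes += max(agrees, len(ranks) - agrees)
        total += len(ranks)
    return majority_votes / total

def consensus_filters(filters, rank_matrix, agree=9, cutoff=0.8):
    """Keep a filter if its median rank is >= `agree`, provided the round's
    APMO cut-off rate reached `cutoff` (the study's a priori target)."""
    if apmo(rank_matrix, agree) < cutoff:
        return []  # round as a whole has not reached target consensus
    return [f for f, ranks in zip(filters, rank_matrix)
            if median(ranks) >= agree]

# Hypothetical ranks from five panellists for two candidate filters.
filters = ["triage vitals within 15 min", "large-bore IV within 15 min"]
ranks = [[9, 10, 9, 9, 10], [10, 9, 9, 10, 9]]
assert consensus_filters(filters, ranks) == filters
```

With ranks like the Round 1 spread the APMO rate falls below 0.8 and no filter is retained, which mirrors why four survey rounds were needed before the 22 filters reached target consensus.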

  18. References

    OpenAIRE

    2017-01-01

    Achinstein, P. 1983. The Nature of Explanation (Oxford: Oxford University Press). Adler, J.E. 1997. ‘Lying, Deceiving, or Falsely Implicating’, The Journal of Philosophy, 94: 435–52. Adler, J.E. 2002. Belief’s Own Ethics (Cambridge, MA: MIT Press). Alicke, M.D., J. Buckingham, E. Zell, & T. Davis. 2008. ‘Culpable Control and Counterfactual Reasoning in the Psychology of Blame’, Personality and Social Psychology Bulletin, 34: 1371–78, http://journals.sagepub.com/doi/abs/10.1177/014616720832159...

  19. References

    OpenAIRE

    2018-01-01

    Aldridge, H., Kenway, P. and Born, T. (2015), ‘What Happened to Poverty Under the Coalition’. New Policy Institute. Armour, R. (2014) ‘Charity Shops Buck Trend’, http://thirdforcenews.org.uk,24 March 2014. Beatty, C. and Fothergill, S. (2016), The Uneven Impact of Welfare Reform. The Financial Losses to Places and People, Centre for Regional and Social Economic Research and Sheffield Hallam University, https://www4.shu.ac.uk/research/cresr/sites/shu.ac.uk/files/welfare-reform-2016_1.pdf Belfi...

  20. References

    OpenAIRE

    2012-01-01

    Appiah, Kwame Anthony (2005) The Ethics of Identity, Princeton University Press, Princeton, NJ. ______ (2006). Cosmopolitanism: Ethics in a World of Strangers, Norton, New York. Clarke, Charles (2006) ‛Global Citizens and Quality International Education: Enlarging the Role of the Commonwealth’. Speech delivered to the Royal Commonwealth Society, 15 November, 2006, London. Estlund, Cynthia (2003) Working Together: How Workplace Bonds Strengthen a Diverse Democracy, Oxford University Press, New...

  1. References:

    African Journals Online (AJOL)

    brain drain”'. Globalization and Health 2006, 2:12 doi: 10.1186/1744-8603-2-12. 3. Zijlstra, E., Broadhead, R. 2007. The College of Medicine in the. Republic of Malawi: towards sustainable staff development, Human. Resources for Health 2007, ...

  2. Japanese reference man 1988, 3

    International Nuclear Information System (INIS)

    Tanaka, Gi-ichiro

    1988-01-01

Quantitative description of the physical properties and other characteristics of the human body provides basic data for estimating dose equivalent and calculating the Annual Limit on Intake of radionuclides. The exact masses of organs and tissues were measured from about 1000 autopsy cases of normal Japanese adults, and physical properties were obtained from recent Japanese Government publications. The Japanese (Asian) Reference Man was completed by establishing the Normal Japanese, harmonizing with the Caucasian Reference Man and coinciding with the concept of the ICRP Reference Man Task Group members. (author)

  3. Integrated geophysical survey in defining subsidence features on a golf course

    Science.gov (United States)

    Xia, J.; Miller, R.D.

    2007-01-01

Subsidence was observed at several places on the Salina Municipal Golf Course in areas known to be built over a landfill in Salina, Kansas. High-resolution magnetic survey (~5400 m2), multi-channel electrical resistivity profiling (three 154 m lines) and microgravity profiling (23 gravity-station values) were performed on a subsidence site (Green 16) to aid in determining boundaries and density deficiency of the landfill in the vicinity of the subsidence. Horizontal boundaries of the landfill were confidently defined by both magnetic anomalies and the pseudo-vertical gradient of total field magnetic anomalies. Furthermore, the pseudo-vertical gradient of magnetic anomalies presented a unique anomaly at Green 16, which provided a criterion for predicting other spots with subsidence potential using the same gradient property. Results of multi-channel electrical resistivity profiling (ERP) suggested the bottom limit of the landfill at Green 16 was around 21 m below the ground surface based on the vertical gradient of electric resistivity and a priori information on the depth of the landfill. ERP results also outlined several possible landfill bodies based on their low resistivity values. Microgravity results suggested a -0.14 g cm-3 density deficiency at Green 16 that could equate to future surface subsidence of as much as 1.5 m due to gradual compaction. © 2007 Nanjing Institute of Geophysical Prospecting.

  4. Defining a Bobath clinical framework - A modified e-Delphi study.

    Science.gov (United States)

    Vaughan-Graham, Julie; Cott, Cheryl

    2016-11-01

    To gain consensus within the expert International Bobath Instructors Training Association (IBITA) on a Bobath clinical framework on which future efficacy studies can be based. A three-round modified e-Delphi approach was used with 204 full members of the IBITA. Twenty-one initial statements were generated from the literature. Consensus was defined a priori as at least 80% of the respondents with a level of agreement on a Likert scale of 4 or 5. The Delphi questionnaire for each round was available online for two weeks. Summary reports and subsequent questionnaires were posted within four weeks. Ninety-four IBITA members responded, forming the Delphi panel, of which 68 and 66 responded to Rounds Two and Three, respectively. The 21 initial statements were revised to 17 statements and five new statements in Round Two in which eight statements were accepted and two statements were eliminated. Round Three presented 12 revised statements, all reaching consensus. The Delphi was successful in gaining consensus on a Bobath clinical framework in a geographically diverse expert association, identifying the unique components of Bobath clinical practice. Discussion throughout all three Rounds revolved primarily around the terminology of atypical and compensatory motor behavior and balance.

  5. Cultural sensitivity in public health: defined and demystified.

    Science.gov (United States)

    Resnicow, K; Baranowski, T; Ahluwalia, J S; Braithwaite, R L

    1999-01-01

    There is consensus that health promotion programs should be culturally sensitive (CS). Yet, despite the ubiquitous nature of CS within public health research and practice, there has been surprisingly little attention given to defining CS or delineating a framework for developing culturally sensitive programs and practitioners. This paper describes a model for understanding CS from a public health perspective; describes a process for applying this model in the development of health promotion and disease prevention interventions; and highlights research priorities. Cultural sensitivity is defined by two dimensions: surface and deep structures. Surface structure involves matching intervention materials and messages to observable, "superficial" characteristics of a target population. This may involve using people, places, language, music, food, locations, and clothing familiar to, and preferred by, the target audience. Surface structure refers to how well interventions fit within a specific culture. Deep structure involves incorporating the cultural, social, historical, environmental and psychological forces that influence the target health behavior in the proposed target population. Whereas surface structure generally increases the "receptivity" or "acceptance" of messages, deep structure conveys salience. Techniques, borrowed from social marketing and health communication theory, for developing culturally sensitive interventions are described. Research is needed to determine the effectiveness of culturally sensitive programs.

  6. Defining the cortical visual systems: "what", "where", and "how"

    Science.gov (United States)

    Creem, S. H.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    2001-01-01

    The visual system historically has been defined as consisting of at least two broad subsystems subserving object and spatial vision. These visual processing streams have been organized both structurally as two distinct pathways in the brain, and functionally for the types of tasks that they mediate. The classic definition by Ungerleider and Mishkin labeled a ventral "what" stream to process object information and a dorsal "where" stream to process spatial information. More recently, Goodale and Milner redefined the two visual systems with a focus on the different ways in which visual information is transformed for different goals. They relabeled the dorsal stream as a "how" system for transforming visual information using an egocentric frame of reference in preparation for direct action. This paper reviews recent research from psychophysics, neurophysiology, neuropsychology and neuroimaging to define the roles of the ventral and dorsal visual processing streams. We discuss a possible solution that allows for both "where" and "how" systems that are functionally and structurally organized within the posterior parietal lobe.

  7. A reference model for space data system interconnection services

    Science.gov (United States)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSIRM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  8. Nuclear measurements and reference materials

    International Nuclear Information System (INIS)

    1988-01-01

This report summarizes the progress of the JRC programs on nuclear data, nuclear metrology, nuclear reference materials and non-nuclear reference materials. Budget restrictions and personnel difficulties were encountered during 1987. The fission properties of 235U as a function of neutron energy and of the resonances can be successfully described on the basis of a three-exit-channel fission model. Double differential neutron emission cross-section measurements were completed for 7Li and were started for the tritium production cross-section of 9Be. Reference materials of uranium minerals and ores were prepared, as were special nuclear targets. A batch of 250 g of PuO2 was characterized in view of certification as a reference material for the elemental assay of plutonium

  9. Dietary Reference Values for choline

    DEFF Research Database (Denmark)

    Sjödin, Anders Mikael

    2016-01-01

    Following a request from the European Commission, the EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA) derives Dietary Reference Values (DRVs) for choline. In this Opinion, the Panel considers dietary choline including choline compounds (e.g. glycerophosphocholine, phosphocholine...

  10. Defining recovery in chronic fatigue syndrome: a critical review.

    Science.gov (United States)

    Adamowicz, Jenna L; Caikauskaite, Indre; Friedberg, Fred

    2014-11-01

    In chronic fatigue syndrome (CFS), the lack of consensus on how recovery should be defined or interpreted has generated controversy and confusion. The purpose of this paper was to systematically review, compare, and evaluate the definitions of recovery reported in the CFS literature and to make recommendations about the scope of recovery assessments. A search was done using the MEDLINE, PubMed, PsycINFO, CINAHL, and Cochrane databases for peer-reviewed papers that contained the search terms "chronic fatigue syndrome" and "recovery," "reversal," "remission," and/or "treatment response." From the 22 extracted studies, recovery was operationally defined by reference to one or more of these domains: (1) pre-morbid functioning; (2) both fatigue and function; (3) fatigue (or related symptoms) alone; (4) function alone; and/or (5) brief global assessment. Almost all of the studies measuring recovery in CFS did so differently. The brief global assessment was the most common outcome measure used to define recovery. Estimates of recovery ranged from 0 to 66% in intervention studies and 2.6 to 62% in naturalistic studies. Given that the term "recovery" was often based on limited assessments and less than full restoration of health, other more precise and accurate labels (e.g., clinically significant improvement) may be more appropriate and informative. In keeping with common understandings of the term recovery, we recommend a consistent definition that captures a broad-based return to health with assessments of both fatigue and function as well as the patient's perceptions of his/her recovery status.

  11. Defining Multiple Chronic Conditions for Quality Measurement.

    Science.gov (United States)

    Drye, Elizabeth E; Altaf, Faseeha K; Lipska, Kasia J; Spatz, Erica S; Montague, Julia A; Bao, Haikun; Parzynski, Craig S; Ross, Joseph S; Bernheim, Susannah M; Krumholz, Harlan M; Lin, Zhenqiu

    2018-02-01

    Patients with multiple chronic conditions (MCCs) are a critical but undefined group for quality measurement. We present a generally applicable systematic approach to defining an MCC cohort of Medicare fee-for-service beneficiaries that we developed for a national quality measure, risk-standardized rates of unplanned admissions for Accountable Care Organizations. To define the MCC cohort we: (1) identified potential chronic conditions; (2) set criteria for cohort conditions based on MCC framework and measure concept; (3) applied the criteria informed by empirical analysis, experts, and the public; (4) described "broader" and "narrower" cohorts; and (5) selected final cohort with stakeholder input. Subjects were patients with chronic conditions. Participants included 21.8 million Medicare fee-for-service beneficiaries in 2012 aged 65 years and above with ≥1 of 27 Medicare Chronic Condition Warehouse condition(s). In total, 10 chronic conditions were identified based on our criteria; 8 of these 10 were associated with notably increased admission risk when co-occurring. A broader cohort (2+ of the 8 conditions) included 4.9 million beneficiaries (23% of total cohort) with an admission rate of 70 per 100 person-years. It captured 53% of total admissions. The narrower cohort (3+ conditions) had 2.2 million beneficiaries (10%) with 100 admissions per 100 person-years and captured 32% of admissions. Most stakeholders viewed the broader cohort as best aligned with the measure concept. By systematically narrowing chronic conditions to those most relevant to the outcome and incorporating stakeholder input, we defined an MCC admission measure cohort supported by stakeholders. This approach can be used as a model for other MCC outcome measures.
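    The cohort-narrowing step described above (2+ of the 8 cohort conditions for the "broader" cohort, 3+ for the "narrower" one) can be sketched in a few lines. The beneficiary records and condition labels below are invented for illustration; they are not the Medicare Chronic Condition Warehouse definitions.

```python
# Sketch of the broader/narrower cohort thresholds; records are hypothetical.

BROADER_MIN = 2   # "broader" cohort: 2+ of the 8 cohort conditions
NARROWER_MIN = 3  # "narrower" cohort: 3+ conditions

def select_cohort(beneficiaries, min_conditions):
    """Keep beneficiaries whose chronic-condition count meets the threshold."""
    return [b for b in beneficiaries if len(b["conditions"]) >= min_conditions]

beneficiaries = [
    {"id": 1, "conditions": {"CHF", "CKD"}},
    {"id": 2, "conditions": {"CHF", "CKD", "COPD"}},
    {"id": 3, "conditions": {"CHF"}},
]

broader = select_cohort(beneficiaries, BROADER_MIN)    # ids 1 and 2
narrower = select_cohort(beneficiaries, NARROWER_MIN)  # id 2 only
```

    In the measure itself, the choice of threshold trades cohort size against admission burden: the broader cohort captured 53% of admissions from 23% of beneficiaries, the narrower 32% from 10%.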

  12. How do pediatric anesthesiologists define intraoperative hypotension?

    Science.gov (United States)

    Nafiu, Olubukola O; Voepel-Lewis, Terri; Morris, Michelle; Chimbira, Wilson T; Malviya, Shobha; Reynolds, Paul I; Tremper, Kevin K

    2009-11-01

    Although blood pressure (BP) monitoring is a recommended standard of care by the ASA, and pediatric anesthesiologists routinely monitor the BP of their patients and, when appropriate, treat deviations from 'normal', there is no robust definition of hypotension in any of the pediatric anesthesia texts or journals. Consequently, what constitutes hypotension in pediatric anesthesia is currently unknown. We designed a questionnaire-based survey of pediatric anesthesiologists to determine the BP ranges and thresholds used to define intraoperative hypotension (IOH). Members of the Society of Pediatric Anesthesia (SPA) and the Association of Paediatric Anaesthetists (APA) of Great Britain and Ireland were contacted through e-mail to participate in this survey. We asked a few demographic questions and five questions about specific definitions of hypotension for different age groups of patients undergoing inguinal herniorrhaphy, a common pediatric surgical procedure. The overall response rate was 56% (483/860), of which 76% were SPA members. The majority of respondents (72%) work in academic institutions, while 8.9% work in institutions with an annual pediatric surgical caseload of fewer than 1000. About 76% of respondents indicated that a 20-30% reduction from baseline systolic blood pressure (SBP) indicates significant hypotension in children under anesthesia. Most respondents indicated that they use mean arterial pressure (86.7%) or SBP (72%) to define IOH. The mean SBP values for hypotension quoted by SPA members were about 5-7% lower across all pediatric age groups than the values quoted by APA members (P = 0.001 for all age groups). There is great variability in the BP parameters and thresholds used for defining and treating IOH among pediatric anesthesiologists. The majority of respondents considered a 20-30% reduction from baseline SBP as indicative of significant hypotension. Lack of a consensus definition for a common clinical condition like IOH could have
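    The survey's modal definition (an SBP drop of 20-30% from baseline) can be expressed as a simple predicate. The 25% default below is an illustrative midpoint of that range, not a clinical guideline value.

```python
# Minimal predicate for the survey's most common IOH definition: systolic BP
# at least 20-30% below the patient's baseline. The 25% default is an
# illustrative assumption, not a recommended threshold.

def is_hypotensive(baseline_sbp, current_sbp, drop_fraction=0.25):
    """True if systolic BP has fallen by at least drop_fraction from baseline."""
    return current_sbp <= baseline_sbp * (1.0 - drop_fraction)

# For a child with a baseline SBP of 100 mmHg:
print(is_hypotensive(100, 74))   # 26% drop -> True
print(is_hypotensive(100, 85))   # 15% drop -> False
```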

  13. Tcl/Tk Pocket Reference

    CERN Document Server

    Raines, Paul

    1998-01-01

    The Tcl/Tk combination is increasingly popular because it lets you produce sophisticated graphical interfaces with a few easy commands, develop and change scripts quickly, and conveniently tie together existing utilities or programming libraries. The Tcl/Tk Pocket Reference, a handy reference guide to the basic Tcl language elements, Tcl and Tk commands, and Tk widgets, is a companion volume to Tcl/Tk in a Nutshell.

  14. Defining recovery in adult bulimia nervosa.

    Science.gov (United States)

    Yu, Jessica; Agras, W Stewart; Bryson, Susan

    2013-01-01

    To examine how different definitions of recovery lead to varying rates of recovery, maintenance of recovery, and relapse in bulimia nervosa (BN), end-of-treatment (EOT) and follow-up data were obtained from 96 adults with BN. Combining behavioral, physical, and psychological criteria led to recovery rates between 15.5% and 34.4% at EOT, though relapse was approximately 50%. Combining these criteria and requiring abstinence from binge eating and purging when defining recovery may lead to lower recovery rates than those found in previous studies; however, a strength of this definition is that individuals who meet these criteria have no remaining disordered behaviors or symptoms.

  15. Defining Marriage: Classification, Interpretation, and Definitional Disputes

    Directory of Open Access Journals (Sweden)

    Fabrizio Macagno

    2016-09-01

    Full Text Available The classification of a state of affairs under a legal category can be considered as a kind of condensed decision that can be made explicit, analyzed, and assessed using argumentation schemes. In this paper, the controversial conflict of opinions concerning the nature of "marriage" in Obergefell v. Hodges is analyzed, pointing out the dialectical strategies used for addressing the interpretive doubts. The dispute about the same-sex couples' right to marry hides a much deeper disagreement not only about what marriage is, but more importantly about the dialectical rules for defining it.

  16. Software defined networks a comprehensive approach

    CERN Document Server

    Goransson, Paul

    2014-01-01

    Software Defined Networks discusses the historical networking environment that gave rise to SDN, as well as the latest advances in SDN technology. The book gives you the state-of-the-art knowledge needed for successful deployment of an SDN, including: how to explain to the non-technical business decision makers in your organization the potential benefits, as well as the risks, in shifting parts of a network to the SDN model; how to make intelligent decisions about when to integrate SDN technologies in a network; and how to decide if your organization should be developing its own SDN applications or

  17. Software Defined Radio: Basic Principles and Applications

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2014-12-01

    Full Text Available The author reviews SDR (Software Defined Radio) technology, including hardware schemes and application fields. A low-performance device is presented and several tests are executed with it using free software. With the acquired experience, SDR employment opportunities are identified for low-cost solutions that can solve significant problems. In addition, a list of the most important frameworks related to the technology developed in recent years is offered, recommending the use of three of them.

  18. Defining the Strategy of Nuclear Activity

    International Nuclear Information System (INIS)

    Racana, R.

    2006-01-01

    This article presents nuclear activity as defined within the field of the nuclear industry, which is examined from its capacity to generate electric power to its applications in industry and medicine, as well as its use as a source for weapons of mass destruction. These fields of analysis introduce problems that the nuclear industry must know how to confront, employing action strategies aimed at keeping nuclear activity in mind when making use of the benefits that its peaceful use contributes to human life. (Author)

  19. Animal bioavailability of defined xenobiotic lignin metabolites

    International Nuclear Information System (INIS)

    Sandermann, H. Jr.; Arjmand, M.; Gennity, I.; Winkler, R.; Struble, C.B.; Aschbacher, P.W.

    1990-01-01

    Lignin has been recognized as a major component of bound pesticide residues in plants and is thought to be undigestible in animals. Two defined ring-U-¹⁴C-labeled chloroaniline/lignin metabolites have now been fed to rats, where a release of ∼66% of the bound xenobiotic occurred in the form of simple chloroaniline derivatives. The observed high degree of bioavailability indicates that bound pesticidal residues may possess ecotoxicological significance. In parallel studies, the white-rot fungus Phanerochaete chrysosporium was more efficient, and a soil system was much less efficient, in the degradation of the [ring-U-¹⁴C]chloroaniline/lignin metabolites.

  20. DEFINING THE CHEMICAL SPACE OF PUBLIC GENOMIC ...

    Science.gov (United States)

    The current project aims to chemically index the genomics content of public genomic databases to make these data accessible in relation to other publicly available, chemically-indexed toxicological information. By defining the chemical space of public genomic data, it is possible to identify classes of chemicals on which to develop methodologies for the integration of chemogenomic data into predictive toxicology. The chemical space of public genomic data will be presented as well as the methodologies and tools developed to identify this chemical space.

  1. Healthcare Engineering Defined: A White Paper.

    Science.gov (United States)

    Chyu, Ming-Chien; Austin, Tony; Calisir, Fethi; Chanjaplammootil, Samuel; Davis, Mark J; Favela, Jesus; Gan, Heng; Gefen, Amit; Haddas, Ram; Hahn-Goldberg, Shoshana; Hornero, Roberto; Huang, Yu-Li; Jensen, Øystein; Jiang, Zhongwei; Katsanis, J S; Lee, Jeong-A; Lewis, Gladius; Lovell, Nigel H; Luebbers, Heinz-Theo; Morales, George G; Matis, Timothy; Matthews, Judith T; Mazur, Lukasz; Ng, Eddie Yin-Kwee; Oommen, K J; Ormand, Kevin; Rohde, Tarald; Sánchez-Morillo, Daniel; Sanz-Calcedo, Justo García; Sawan, Mohamad; Shen, Chwan-Li; Shieh, Jiann-Shing; Su, Chao-Ton; Sun, Lilly; Sun, Mingui; Sun, Yi; Tewolde, Senay N; Williams, Eric A; Yan, Chongjun; Zhang, Jiajie; Zhang, Yuan-Ting

    2015-01-01

    Engineering has been playing an important role in serving and advancing healthcare. The term "Healthcare Engineering" has been used by professional societies, universities, scientific authors, and the healthcare industry for decades. However, the definition of "Healthcare Engineering" remains ambiguous. The purpose of this position paper is to present a definition of Healthcare Engineering as an academic discipline, an area of research, a field of specialty, and a profession. Healthcare Engineering is defined in terms of what it is, who performs it, where it is performed, and how it is performed, including its purpose, scope, topics, synergy, education/training, contributions, and prospects.

  2. Software defined networking applications in distributed datacenters

    CERN Document Server

    Qi, Heng

    2016-01-01

    This SpringerBrief provides essential insights on SDN application design and deployment in distributed datacenters. In this book, three key problems are discussed: SDN application design, SDN deployment, and SDN management. This book demonstrates how to design the SDN-based request allocation application in distributed datacenters. It also presents solutions for SDN controller placement to deploy SDN in distributed datacenters. Finally, an SDN management system is proposed to guarantee the performance of datacenter networks which are covered and controlled by many heterogeneous controllers. Researchers and practitioners alike will find this book a valuable resource for further study on Software Defined Networking.

  3. Defining Starch Binding by Glucan Phosphatases

    DEFF Research Database (Denmark)

    Auger, Kyle; Raththagala, Madushi; Wilkens, Casper

    2015-01-01

    Starch is a vital energy molecule in plants that has a wide variety of uses in industry, such as feedstock for biomaterial processing and biofuel production. Plants employ a three enzyme cyclic process utilizing kinases, amylases, and phosphatases to degrade starch in a diurnal manner. Starch...... is comprised of the branched glucan amylopectin and the more linear glucan amylose. Our lab has determined the first structures of these glucan phosphatases and we have defined their enzymatic action. Despite this progress, we lacked a means to quickly and efficiently quantify starch binding to glucan...

  4. Healthcare Engineering Defined: A White Paper

    Directory of Open Access Journals (Sweden)

    Ming-Chien Chyu

    2015-01-01

    Full Text Available Engineering has been playing an important role in serving and advancing healthcare. The term “Healthcare Engineering” has been used by professional societies, universities, scientific authors, and the healthcare industry for decades. However, the definition of “Healthcare Engineering” remains ambiguous. The purpose of this position paper is to present a definition of Healthcare Engineering as an academic discipline, an area of research, a field of specialty, and a profession. Healthcare Engineering is defined in terms of what it is, who performs it, where it is performed, and how it is performed, including its purpose, scope, topics, synergy, education/training, contributions, and prospects.

  5. Software-defined reconfigurable microwave photonics processor.

    Science.gov (United States)

    Pérez, Daniel; Gasulla, Ivana; Capmany, José

    2015-06-01

    We propose, for the first time to our knowledge, a software-defined reconfigurable microwave photonics signal processor architecture that can be integrated on a chip and is capable of performing all the main functionalities by suitable programming of its control signals. The basic configuration is presented and a thorough end-to-end design model derived that accounts for the performance of the overall processor taking into consideration the impact and interdependencies of both its photonic and RF parts. We demonstrate the model versatility by applying it to several relevant application examples.

  6. Fingerprinting Software Defined Networks and Controllers

    Science.gov (United States)

    2015-03-01

    [Abstract garbled in extraction. Recoverable fragments include an acronym list (rps: requests per second; RTT: Round-Trip Time; SDN: Software Defined Networking; SOM: Self-Organizing Map; STP: Spanning Tree Protocol; TRW-CB: Threshold Random...) and passages on frames being "punted" from the forwarding lookup process to the route processor [9], on Google's B4 SDN deployment, and on the SDN market being expected to grow beyond $35 billion by April 2018 [31].]

  7. How the Government Defines "Rural" Has Implications for Education Policies and Practices. Issues & Answers. REL 2007-010

    Science.gov (United States)

    Arnold, Michael L.; Biscoe, Belinda; Farmer, Thomas W.; Robertson, Dylan L.; Shapley, Kathy L.

    2007-01-01

    Clearly defining what rural means has tangible implications for public policies and practices in education, from establishing resource needs to achieving the goals of No Child Left Behind in rural areas. The word "rural" has many meanings. It has been defined in reference to population density, geographic features, and level of economic…

  8. Defining School Readiness in Maryland: A Multi-Dimensional Perspective. Publication #2012-44

    Science.gov (United States)

    Forry, Nicole; Wessel, Julia

    2012-01-01

    Increased emphasis has been placed on children's ability to enter kindergarten ready to learn, a concept referred to as "school readiness." School readiness has been defined by the Maryland State Department of Education as "the stage of human development that enables a child to engage in, and benefit from, primary learning…

  9. Defining adaptation in a generic multi layer model: CAM: The GRAPPLE Conceptual Adaptation Model

    NARCIS (Netherlands)

    Hendrix, M.; De Bra, P.M.E.; Pechenizkiy, M.; Smits, D.; Cristea, A.I.; Dillenbourg, P.; Specht, M.

    2008-01-01

    Authoring of Adaptive Hypermedia is a difficult and time consuming task. Reference models like LAOS and AHAM separate adaptation and content in different layers. Systems like AHA!, offer graphical tools based on these models to allow authors to define adaptation without knowing any adaptation

  10. Korean Reference HLW Disposal System

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Heui Joo; Lee, J. Y.; Kim, S. S. (and others)

    2008-03-15

    This report outlines the results related to the development of the Korean Reference Disposal System for high-level radioactive wastes. The research was supported for around 10 years through a long-term research plan by MOST. The reference disposal method was selected in the first stage of the research, during which the technical guidelines for the geological disposal of HLW were also determined. At the second stage of the research, the conceptual design of the reference disposal system was developed. For this purpose, the characteristics of the reference spent fuels from PWR and CANDU reactors were specified, and the material and specifications of the canisters were determined in terms of structural analysis and manufacturing capability in Korea. Also, the mechanical and chemical characteristics of the domestic Ca-bentonite were analyzed in order to supply the basic design parameters of the buffer. Based on these parameters, the thermal and mechanical analysis of the near-field was carried out, and the thermal-hydraulic-mechanical behavior of the disposal system was analyzed. The reference disposal system was proposed through the second stage of the research. At the third and final stage of the research, the Korean Reference Disposal System, including the engineered barrier, surface facilities, and underground facilities, was proposed through a performance analysis of the disposal system.

  11. Defining clogging potential for permeable concrete.

    Science.gov (United States)

    Kia, Alalea; Wong, Hong S; Cheeseman, Christopher R

    2018-08-15

    Permeable concrete is used to reduce urban flooding as it allows water to flow through normally impermeable infrastructure. It is prone to clogging by particulate matter, and predicting the long-term performance of permeable concrete is challenging as there is currently no reliable means of characterising clogging potential. This paper reports on the performance of a range of laboratory-prepared and commercial permeable concretes, close-packed glass spheres and aggregate particles of varying size, exposed to different clogging methods to understand this phenomenon. New methods were developed to study clogging and define clogging potential. The tests involved applying flowing water containing sand and/or clay in cycles, and measuring the change in permeability. Substantial permeability reductions were observed in all samples, particularly when exposed to sand and clay simultaneously. Three methods were used to define clogging potential, based on measuring the initial permeability decay, the half-life cycle and the number of cycles to full clogging. We show for the first time strong linear correlations between these parameters for a wide range of samples, indicating their use for service-life prediction. Copyright © 2018 Elsevier Ltd. All rights reserved.
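    One way to see how the three clogging-potential parameters relate is to model permeability as decaying exponentially with clogging cycles, k(n) = k0·exp(−λn). The decay constant and the 1% "fully clogged" residual below are assumptions made for this sketch, not values from the paper.

```python
import math

# Illustrative exponential-decay model linking the three clogging-potential
# parameters. All numbers are assumptions for the sketch, not measurements.

def decay_constant(k0, k1):
    """Initial permeability decay rate, from the drop over the first cycle."""
    return math.log(k0 / k1)

def half_life_cycles(lam):
    """Number of cycles after which permeability has halved."""
    return math.log(2.0) / lam

def cycles_to_full_clogging(lam, residual=0.01):
    """Cycles until permeability falls to a 'fully clogged' residual fraction."""
    return math.log(1.0 / residual) / lam

lam = decay_constant(k0=10.0, k1=7.0)  # ~0.357 per cycle
half = half_life_cycles(lam)           # ~1.9 cycles
full = cycles_to_full_clogging(lam)    # ~12.9 cycles to reach 1% of k0
```

    Under this model the three parameters are all proportional to 1/λ, which is one way to rationalise the strong linear correlations the paper reports between them.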

  12. Defining an Open Source Strategy for NASA

    Science.gov (United States)

    Mattmann, C. A.; Crichton, D. J.; Lindsay, F.; Berrick, S. W.; Marshall, J. J.; Downs, R. R.

    2011-12-01

    Over the course of the past year, we have worked to help frame a strategy for NASA and open source software. This includes defining information processes to understand open source licensing, attribution, commerciality, redistribution, communities, architectures, and interactions within the agency. Specifically, we held a training session at the NASA Earth Science Data Systems Working Group meeting on open source software as it relates to the NASA Earth Science data systems enterprise, including EOSDIS, the Distributed Active Archive Centers (DAACs), ACCESS proposals, and the MEASURES communities, and efforts to understand how open source software can be both consumed and produced within that ecosystem. In addition, we presented at the 1st NASA Open Source Summit (OSS) and helped to define an agency-level strategy: a set of recommendations and paths forward for how to identify healthy open source communities, how to deal with issues such as contributions originating from other agencies, and how to search out talent with the right skills to develop software for NASA in the modern age. This talk will review our current recommendations for open source at NASA, will cover the set of thirteen recommendations output from the NASA Open Source Summit, and will discuss some of their implications for the agency.

  13. How Do You Define an Internship?

    Science.gov (United States)

    Wilson, C. E.; Keane, C.

    2017-12-01

    According to the American Geosciences Institute's Geoscience Student Exit Survey, internship participation rates over the past four years have been low, particularly among bachelor's and doctoral graduates. In 2016, 65% of bachelor's graduates, 44% of master's graduates, and 57% of doctoral graduates did not participate in an internship while working on their degree. When asked if they submitted applications for internship opportunities, 42% of bachelor's graduates, 23% of master's graduates, and 46% of doctoral graduates claimed to not submit any applications. These statistics have raised concern at AGI because internships provide experiences that help develop critical professional skills and industry connections that can lead to jobs after graduation. However, when internships are discussed among various representatives in geoscience industries, there are disagreements in how an internship experience is defined. For example, opinions differ on whether REUs or other research experiences count as an internship. Clear definitions of internship opportunities may help academic faculty and advisors direct students towards these opportunities and help develop a collection of resources for finding future internships. This presentation will present some of the recent statistics on internship participation among geoscience graduates and present a series of questions to ascertain defining features of internships among AGU attendees and where help is needed to increase participation in internships among current geoscience students.

  14. Defining Medical Capabilities for Exploration Missions

    Science.gov (United States)

    Hailey, M.; Antonsen, E.; Blue, R.; Reyes, D.; Mulcahy, R.; Kerstman, E.; Bayuse, T.

    2018-01-01

    Exploration-class missions to the moon, Mars and beyond will require a significant change in medical capability from today's low earth orbit centric paradigm. Significant increases in autonomy will be required due to differences in duration, distance and orbital mechanics. Aerospace medicine and systems engineering teams are working together within ExMC to meet these challenges. Identifying exploration medical system needs requires accounting for planned and unplanned medical care as defined in the concept of operations. In 2017, the ExMC Clinicians group identified medical capabilities to feed into the Systems Engineering process, including: determining what and how to address planned and preventive medical care; defining an Accepted Medical Condition List (AMCL) of conditions that may occur and a subset of those that can be treated effectively within the exploration environment; and listing the medical capabilities needed to treat those conditions in the AMCL. This presentation will discuss the team's approach to addressing these issues, as well as how the outputs of the clinical process impact the systems engineering effort.

  15. Distributed controller clustering in software defined networks.

    Directory of Open Access Journals (Sweden)

    Ahmed Abdelaziz

    Full Text Available Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.

  16. Distributed controller clustering in software defined networks.

    Science.gov (United States)

    Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.

  17. The citation with reference and the citation as a reference

    Directory of Open Access Journals (Sweden)

    Rafael Antonio Cunha Perrone

    2011-12-01

    Full Text Available This essay aims to draw parallels between the academic citation and the citation used as a reference in the production of works or projects of architecture. Starting from an understanding of the citation as an argumentative and qualitative device, the article discusses the use of paradigmatic or significant figures that are transposed into, or amalgamated with, works of architecture. It investigates some of the asymmetries and congruences between the citation's use in written language, as a direct transcription, and its use as an interpretive source that is reworked and incorporated as an argument in another text. It concludes that in teaching, studying and making architecture, one must know how to cite with a reference in order to be able to cite as a reference.

  18. Sequence Factorization with Multiple References.

    Directory of Open Access Journals (Sweden)

    Sebastian Wandelt

    Full Text Available The success of high-throughput sequencing has led to an increasing number of projects which sequence large populations of a species. Storage and analysis of sequence data is a key challenge in these projects because of the sheer size of the datasets. Compression is one simple technology to deal with this challenge. Referential factorization and compression schemes, which store only the differences between an input sequence and a reference sequence, have gained considerable interest in this field. Highly similar sequences, e.g., human genomes, can be compressed with a compression ratio of 1,000:1 and more, up to two orders of magnitude better than with standard compression techniques. Recently, it was shown that compression against multiple references from the same species can boost the compression ratio up to 4,000:1. However, a detailed analysis of using multiple references is lacking, e.g., for main memory consumption and optimality. In this paper, we describe one key technique for referential compression against multiple references: the factorization of sequences. Based on the notion of an optimal factorization, we propose optimization heuristics and identify parameter settings which greatly influence (1) the size of the factorization, (2) the time for factorization, and (3) the required amount of main memory. We evaluate a total of 30 setups with a varying number of references on data from three different species. Our results show a wide range of factorization sizes (optimal to an overhead of up to 300%), factorization speeds (0.01 MB/s to more than 600 MB/s), and main memory usage (a few dozen MB to dozens of GB). Based on our evaluation, we identify the best configurations for common use cases. Our evaluation shows that multi-reference factorization is much better than single-reference factorization.
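    A toy version of the factorization idea (against a single reference, for brevity) can be written as a greedy scan that expresses the target as (position, length) matches into the reference plus literals for unmatched characters. Real referential compressors use suffix or FM indexes rather than this quadratic search; the sketch only illustrates the representation.

```python
# Toy referential factorization: target = reference matches + literals.

def factorize(reference, target, min_match=3):
    """Greedily factorize target into (ref_pos, length) matches and literals."""
    factors, i = [], 0
    while i < len(target):
        best_pos, best_len = -1, 0
        # Find the longest reference substring matching target[i:].
        for j in range(len(reference)):
            L = 0
            while (j + L < len(reference) and i + L < len(target)
                   and reference[j + L] == target[i + L]):
                L += 1
            if L > best_len:
                best_pos, best_len = j, L
        if best_len >= min_match:
            factors.append((best_pos, best_len))  # match factor
            i += best_len
        else:
            factors.append(target[i])             # literal factor
            i += 1
    return factors

def reconstruct(reference, factors):
    """Invert the factorization to recover the original target."""
    out = []
    for f in factors:
        if isinstance(f, tuple):
            pos, L = f
            out.append(reference[pos:pos + L])
        else:
            out.append(f)
    return "".join(out)

ref = "ACGTACGTGG"
tgt = "ACGTTCGTGG"
fac = factorize(ref, tgt)   # [(0, 4), 'T', (5, 5)]
assert reconstruct(ref, fac) == tgt
```

    Storing the factors instead of the raw sequence is what yields the large compression ratios for highly similar genomes: long stretches collapse into single (position, length) pairs, with only the differences kept as literals.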

  19. Defining the expressed breast cancer kinome

    Institute of Scientific and Technical Information of China (English)

    Alicia A Midland; Shawn M Gomez; Gary L Johnson; Martin C Whittle; James S Duncan; Amy N Abell; Kazuhiro Nakamura; Jon S Zawistowski; Lisa A Carey; H Shelton Earp III; Lee M Graves

    2012-01-01

    Protein kinases are arguably the most tractable candidates for development of new therapies to treat cancer. Deep sequencing of breast cancer cell lines indicates that each expresses 375 or so kinases, representing nearly 75% of the kinome. A rich network both downstream and upstream from key oncogenic kinases includes both tyrosine and serine/threonine kinases, giving plasticity and resiliency to the cancer cell kinome. Protein kinases have proven to be highly tractable candidates for development of new cancer therapies, with over 130 kinase-specific inhibitors currently in Phase 1-3 clinical trials [1]. Approximately 518 protein kinases are encoded by the human genome, collectively referred to as the kinome.

  20. Assessment of brain reference genes for RT-qPCR studies in neurodegenerative diseases

    DEFF Research Database (Denmark)

    Rydbirk, Rasmus; Folke, Jonas; Winge, Kristian

    2016-01-01

    Evaluation of gene expression levels by reverse transcription quantitative real-time PCR (RT-qPCR) has for many years been the favourite approach for discovering disease-associated alterations. Normalization of results to stably expressed reference genes (RGs) is pivotal to obtain reliable results......, and Progressive Supranuclear Palsy patients. Using RefFinder, a web-based tool for evaluating RG stability, we identified the most stable RGs to be UBE2D2, CYC1, and RPL13 which we recommend for future RT-qPCR studies on human brain tissue from these patients. None of the investigated genes were affected...... by experimental variables such as RIN, PMI, or age. Findings were further validated by expression analyses of a target gene GSK3B, known to be affected by AD and PD. We obtained high variations in GSK3B levels when contrasting the results using different sets of common RG underlining the importance of a priori...

  1. Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference

    Science.gov (United States)

    Shen, Cheng; Guo, Cheng; Tan, Jiubin; Liu, Shutian; Liu, Zhengjun

    2018-06-01

    Multi-image iterative phase retrieval methods have been successfully applied in many research fields due to their simple but efficient implementation. However, there is a mismatch between the measurement of the first long imaging distance and the sequential interval. In this paper, an amplitude-phase retrieval algorithm with reference is put forward without additional measurements or a priori knowledge. It eliminates the need to measure the first imaging distance. With a designed update formula, it significantly raises the convergence speed and the reconstruction fidelity, especially in phase retrieval. Its superiority over the original amplitude-phase retrieval (APR) method is validated by numerical analysis and experiments. Furthermore, it provides a conceptual design of a compact holographic image sensor, which can achieve numerical refocusing easily.
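    The alternating-projection idea underlying iterative amplitude-phase retrieval can be illustrated with a classic Gerchberg-Saxton two-plane sketch. This is a generic stand-in, not the paper's APR-with-reference algorithm or its update formula; the synthetic test field and all parameter values are invented for illustration.

    ```python
    # Gerchberg-Saxton-style two-plane phase retrieval (illustrative sketch):
    # amplitudes are known in the object plane and the Fourier plane, and the
    # phase is recovered by alternating projections between the two constraints.
    import numpy as np

    def gerchberg_saxton(obj_amp, fourier_amp, iterations=200, seed=0):
        rng = np.random.default_rng(seed)
        phase = rng.uniform(0, 2 * np.pi, obj_amp.shape)    # random initial phase
        field = obj_amp * np.exp(1j * phase)
        for _ in range(iterations):
            F = np.fft.fft2(field)
            F = fourier_amp * np.exp(1j * np.angle(F))      # impose Fourier amplitude
            field = np.fft.ifft2(F)
            field = obj_amp * np.exp(1j * np.angle(field))  # impose object amplitude
        return np.angle(field)

    # Synthetic check: build a known complex field, keep only its two measured
    # amplitudes, and verify the recovered field reproduces the Fourier amplitude.
    x = np.linspace(-1, 1, 32)
    X, Y = np.meshgrid(x, x)
    true_field = np.exp(-(X**2 + Y**2)) * np.exp(1j * 0.5 * X)
    obj_amp = np.abs(true_field)
    fourier_amp = np.abs(np.fft.fft2(true_field))

    rec_phase = gerchberg_saxton(obj_amp, fourier_amp)
    rec_fourier = np.abs(np.fft.fft2(obj_amp * np.exp(1j * rec_phase)))
    err = np.linalg.norm(rec_fourier - fourier_amp) / np.linalg.norm(fourier_amp)
    ```

    The paper's contribution, as abstracted above, is a reference wave and update formula that accelerate convergence of exactly this kind of iteration while removing the first-distance measurement.
    
    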

  2. Reference Librarian in Digital Environment:

    Directory of Open Access Journals (Sweden)

    Faramarz Sohili

    2008-07-01

    Full Text Available The information explosion of the latter half of the twentieth century gave rise to online databases and various information media that gradually impacted the very physical environment of the library and transformed the librarians' role. Reference librarians are no exception. The present study aims to investigate the need, or lack of need, for reference librarians within digital domains, based on the views expressed by LIS authorities in Iran. It further attempts to identify the qualities required of such librarians should the need for them be confirmed. The research, descriptive in nature, was based on analyzing responses to a checklist devised by the authors; the LIS specialist sample was composed of 57 people who completed the checklist. Findings show a significant relationship between employing ICT and the need for reference librarians. LIS experts in Iran believe that the introduction of ICT, especially the Internet and the WWW, not only did not decrease the need for such librarians, but has caused the reference librarian to attain a more important status than before. Findings further demonstrate that while Iran is not a signatory to the international copyright conventions, Iranian reference librarians are fully committed to observing authors' copyright and intellectual property rights and frown on using software crackers.

  3. Reconfigurable, Cognitive Software-Defined Radio

    Science.gov (United States)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.

  4. Nurse leader resilience: career defining moments.

    Science.gov (United States)

    Cline, Susan

    2015-01-01

    Resilience is an essential component of effective nursing leadership. It is defined as the ability to survive and thrive in the face of adversity. Resilience can be developed and internalized as a measure to improve retention and reduce burnout. Nurse leaders at all levels should develop these competencies to survive and thrive in an increasingly complex health care environment. Building positive relationships, maintaining positivity, developing emotional insight, creating work-life balance, and reflecting on successes and challenges are effective strategies for resilience building. Nurse leaders have a professional obligation to develop resilience in themselves, the teams they supervise, and the organization as a whole. Additional benefits include reduced turnover, reduced cost, and improved quality outcomes through organizational mindfulness.

  5. Defining and testing a granular continuum element

    Energy Technology Data Exchange (ETDEWEB)

    Rycroft, Chris H.; Kamrin, Ken; Bazant, Martin Z.

    2007-12-03

    Continuum mechanics relies on the fundamental notion of a mesoscopic volume "element" in which properties averaged over discrete particles obey deterministic relationships. Recent work on granular materials suggests a continuum law may be inapplicable, revealing inhomogeneities at the particle level, such as force chains and slow cage breaking. Here, we analyze large-scale Discrete-Element Method (DEM) simulations of different granular flows and show that a "granular element" can indeed be defined at the scale of dynamical correlations, roughly three to five particle diameters. Its rheology is rather subtle, combining liquid-like dependence on deformation rate and solid-like dependence on strain. Our results confirm some aspects of classical plasticity theory (e.g., coaxiality of stress and deformation rate), while contradicting others (i.e., incipient yield), and can guide the development of more realistic continuum models.

  6. Defining and Distinguishing Secular and Religious Terrorism

    Directory of Open Access Journals (Sweden)

    Heather S. Gregg

    2014-04-01

    Full Text Available Religious terrorism is typically characterised as acts of unrestrained, irrational and indiscriminate violence, thus offering few if any policy options for counterterrorism measures. This assumption about religious terrorism stems from two challenges in the literature: disproportionate attention to apocalyptic terrorism, and a lack of distinction between religious terrorism and its secular counterpart. This article, therefore, aims to do four things: define and differentiate religiously motivated terrorism from traditional terrorism; investigate three goals of religious terrorism (fomenting the apocalypse, creating a religious government, and establishing a religiously pure state); consider the role of leadership and target selection of religious terrorists; and, finally, suggest a range of counterterrorism strategies based on these observations.

  7. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...

  8. Using archetypes for defining CDA templates.

    Science.gov (United States)

    Moner, David; Moreno, Alberto; Maldonado, José A; Robles, Montserrat; Parra, Carlos

    2012-01-01

    While HL7 CDA is a widely adopted standard for the documentation of clinical information, the archetype approach proposed by CEN/ISO 13606 and openEHR is gaining recognition as a means of describing domain models and medical knowledge. This paper describes our efforts in combining both standards. Using archetypes as an alternative for defining CDA templates permits new possibilities, all based on the formal nature of archetypes and their ability to merge into the same artifact both medical knowledge and the technical requirements for semantic interoperability of electronic health records. We describe the process followed for the normalization of existing legacy data in a hospital environment: importation of the HL7 CDA model into an archetype editor, definition of CDA archetypes, and application of those archetypes to obtain normalized CDA data instances.

  9. Defining the critical hurdles in cancer immunotherapy

    DEFF Research Database (Denmark)

    Fox, Bernard A; Schendel, Dolores J; Butterfield, Lisa H

    2011-01-01

    of cancer immunotherapy. With consensus on these hurdles, international working groups could be developed to make recommendations vetted by the participating organizations. These recommendations could then be considered by regulatory bodies, governmental and private funding agencies, pharmaceutical...... immunotherapy organizations representing Europe, Japan, China and North America to discuss collaborations to improve development and delivery of cancer immunotherapy. One of the concepts raised by SITC and defined as critical by all parties was the need to identify hurdles that impede effective translation...... companies and academic institutions to facilitate changes necessary to accelerate clinical translation of novel immune-based cancer therapies. The critical hurdles identified by representatives of the collaborating organizations, now organized as the World Immunotherapy Council, are presented and discussed...

  10. Just caring: defining a basic benefit package.

    Science.gov (United States)

    Fleck, Leonard M

    2011-12-01

    What should be the content of a package of health care services that we would want to guarantee to all Americans? This question cannot be answered adequately apart from also addressing the issue of fair health care rationing. Consequently, as I argue in this essay, appeal to the language of "basic," "essential," "adequate," "minimally decent," or "medically necessary" for purposes of answering our question is unhelpful. All these notions are too vague to be useful. Cost matters. Effectiveness matters. The clinical circumstances of a patient matter. But what we must ultimately determine is what we mutually agree are the just claims to needed health care of each American in a relatively complex range of clinical circumstances. Answering this question will require a public moral conversation, a fair process of rational democratic deliberation aimed at defining both just claims to needed health care and just limits.

  11. Bruxism defined and graded: an international consensus.

    Science.gov (United States)

    Lobbezoo, F; Ahlberg, J; Glaros, A G; Kato, T; Koyano, K; Lavigne, G J; de Leeuw, R; Manfredini, D; Svensson, P; Winocur, E

    2013-01-01

    To date, there is no consensus about the definition and diagnostic grading of bruxism. A written consensus discussion was held among an international group of bruxism experts as to formulate a definition of bruxism and to suggest a grading system for its operationalisation. The expert group defined bruxism as a repetitive jaw-muscle activity characterised by clenching or grinding of the teeth and/or by bracing or thrusting of the mandible. Bruxism has two distinct circadian manifestations: it can occur during sleep (indicated as sleep bruxism) or during wakefulness (indicated as awake bruxism). For the operationalisation of this definition, the expert group proposes a diagnostic grading system of 'possible', 'probable' and 'definite' sleep or awake bruxism. The proposed definition and grading system are suggested for clinical and research purposes in all relevant dental and medical domains. © 2012 Blackwell Publishing Ltd.

  12. Defining and Supporting Narrative-driven Recommendation

    DEFF Research Database (Denmark)

    Bogers, Toine; Koolen, Marijn

    2017-01-01

    Research into recommendation algorithms has made great strides in recent years. However, these algorithms are typically applied in relatively straightforward scenarios: given information about a user's past preferences, what will they like in the future? Recommendation is often more complex......: evaluating recommended items never takes place in a vacuum, and it is often a single step in the user's more complex background task. In this paper, we define a specific type of recommendation scenario called narrative-driven recommendation, where the recommendation process is driven by both a log...... of the user's past transactions as well as a narrative description of their current interest(s). Through an analysis of a set of real-world recommendation narratives from the LibraryThing forums, we demonstrate the uniqueness and richness of this scenario and highlight common patterns and properties...

  13. Defining Service and Education in Pediatrics.

    Science.gov (United States)

    Boyer, Debra; Gagne, Josh; Kesselheim, Jennifer C

    2017-11-01

    Program directors (PDs) and trainees are often queried regarding the balance of service and education during pediatric residency training. We aimed to use qualitative methods to learn how pediatric residents and PDs define service and education and to identify activities that exemplify these concepts. Focus groups of pediatric residents and PDs were performed and the data qualitatively analyzed. Thematic analysis revealed 4 themes from focus group data: (1) misalignment of the perceived definition of service; (2) agreement about the definition of education; (3) overlapping perceptions of the value of service to training; and (4) additional suggestions for improved integration of education and service. Pediatric residents hold positive definitions of service and believe that service adds value to their education. Importantly, the discovery of heterogeneous definitions of service between pediatric residents and PDs warrants further investigation and may have ramifications for Accreditation Council for Graduate Medical Education and those responsible for residency curricula.

  14. Quantum computing. Defining and detecting quantum speedup.

    Science.gov (United States)

    Rønnow, Troels F; Wang, Zhihui; Job, Joshua; Boixo, Sergio; Isakov, Sergei V; Wecker, David; Martinis, John M; Lidar, Daniel A; Troyer, Matthias

    2014-07-25

    The development of small-scale quantum devices raises the question of how to fairly assess and detect quantum speedup. Here, we show how to define and measure quantum speedup and how to avoid pitfalls that might mask or fake such a speedup. We illustrate our discussion with data from tests run on a D-Wave Two device with up to 503 qubits. By using random spin glass instances as a benchmark, we found no evidence of quantum speedup when the entire data set is considered and obtained inconclusive results when comparing subsets of instances on an instance-by-instance basis. Our results do not rule out the possibility of speedup for other classes of problems and illustrate the subtle nature of the quantum speedup question. Copyright © 2014, American Association for the Advancement of Science.
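    The distinction the abstract draws between a constant-factor advantage and a genuine quantum speedup comes down to whether the ratio of classical to quantum time-to-solution grows with problem size. A minimal sketch of that check, using hypothetical runtimes rather than any measured D-Wave data:

    ```python
    # Speedup as a function of problem size N: a genuine (asymptotic) speedup
    # requires the ratio classical/quantum to grow with N, not merely exceed 1.
    # All runtimes below are hypothetical, for illustration only.

    def speedup(t_classical, t_quantum):
        """Elementwise runtime ratio at matched problem sizes."""
        return [tc / tq for tc, tq in zip(t_classical, t_quantum)]

    sizes = [128, 256, 512]
    t_c = [1.0, 4.0, 16.0]    # hypothetical classical times-to-solution
    t_q = [0.8, 3.5, 15.0]    # hypothetical quantum times-to-solution

    s = speedup(t_c, t_q)
    # Here every ratio exceeds 1 (a constant-factor win), but the ratio
    # shrinks with N, so this would NOT count as quantum speedup.
    growing = all(s[i + 1] > s[i] for i in range(len(s) - 1))
    ```

    This is exactly the subtlety the authors highlight: a device can be faster on every tested instance while still showing no scaling advantage.
    
    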

  15. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances....... This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper......, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges...

  16. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  17. Environmentally acceptable thread compounds: Requirements defined

    International Nuclear Information System (INIS)

    Stringfellow, W.D.; Hendriks, R.V.; Jacobs, N.L.

    1993-01-01

    New environmental regulations on thread compounds are now being enforced in several areas with strong maritime tradition and a sensitive environment. These areas include Indonesia, Alaska and portions of Norway. The industry generally recognizes the environmental concerns but, with wider enforcement of regulations imminent, has not been able to define clearly the requirements for environmental compliance. This paper, written in collaboration with The Netherlands State Supervision of Mines, is based on the National Policy on Thread Compounds of The Netherlands. This national policy is representative of policies being followed by other North Sea governments. Similar policies might well be adopted by other governments worldwide. These policies will affect the operator, drilling contractor, and supplier. This paper provides a specific and detailed definition of thread compound requirements by addressing four relevant categories. The categories of interest are regulatory approval, environmental, health, and performance

  18. Defining nodes in complex brain networks

    Directory of Open Access Journals (Sweden)

    Matthew Lawrence Stanley

    2013-11-01

    Full Text Available Network science holds great promise for expanding our understanding of the human brain in health, disease, development, and aging. Network analyses are quickly becoming the method of choice for analyzing functional MRI data. However, many technical issues have yet to be confronted in order to optimize results. One particular issue that remains controversial in functional brain network analyses is the definition of a network node. In functional brain networks a node represents some predefined collection of brain tissue, and an edge measures the functional connectivity between pairs of nodes. The characteristics of a node, chosen by the researcher, vary considerably in the literature. This manuscript reviews the current state of the art based on published manuscripts and highlights the strengths and weaknesses of three main methods for defining nodes. Voxel-wise networks are constructed by assigning a node to each equally sized brain area (voxel). The fMRI time-series recorded from each voxel is then used to create the functional network. Anatomical methods utilize atlases to define the nodes based on brain structure. The fMRI time-series from all voxels within the anatomical area are averaged and subsequently used to generate the network. Functional activation methods rely on data from traditional fMRI activation studies, often from databases, to identify network nodes. Such methods identify the peaks or centers of mass from activation maps to determine the location of the nodes. Small (~10-20 millimeter diameter) spheres located at the coordinates of the activation foci are then applied to the data being used in the network analysis. The fMRI time-series from all voxels in the sphere are then averaged, and the resultant time series is used to generate the network. We attempt to clarify the discussion and move the study of complex brain networks forward. While the correct method to be used remains an open, possibly unsolvable question that
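    The averaging step shared by the anatomical and sphere-based node definitions above can be sketched as follows: average the voxel time-series assigned to each node, correlate the node time-series pairwise, and threshold to obtain the network. The synthetic data, node labels, and threshold below are hypothetical.

    ```python
    # Build a functional network from node-averaged voxel time-series
    # (illustrative sketch with synthetic data; no real fMRI conventions assumed).
    import numpy as np

    def build_network(voxel_ts, node_labels, threshold=0.5):
        """voxel_ts: (n_voxels, n_timepoints); node_labels: node id per voxel."""
        nodes = np.unique(node_labels)
        # node time-series = mean over all voxels assigned to that node
        node_ts = np.vstack([voxel_ts[node_labels == n].mean(axis=0) for n in nodes])
        corr = np.corrcoef(node_ts)                  # functional connectivity
        adj = (np.abs(corr) >= threshold).astype(int)
        np.fill_diagonal(adj, 0)                     # no self-edges
        return corr, adj

    rng = np.random.default_rng(1)
    t = np.arange(100)
    signal = np.sin(0.2 * t)
    # three voxels driven by a common signal, one by independent noise
    voxel_ts = np.vstack([
        signal + 0.1 * rng.standard_normal(100),   # node 0
        signal + 0.1 * rng.standard_normal(100),   # node 0
        signal + 0.1 * rng.standard_normal(100),   # node 1
        rng.standard_normal(100),                  # node 2
    ])
    labels = np.array([0, 0, 1, 2])
    corr, adj = build_network(voxel_ts, labels)
    ```

    The choice of `node_labels` is precisely where the three reviewed methods differ; the downstream averaging and correlation are the same.
    
    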

  19. Exposing the Myths, Defining the Future

    International Nuclear Information System (INIS)

    Slavov, S.

    2013-01-01

    With this official statement, the WEC calls for policymakers and industry leaders to ''get real'': as a global energy body, the World Energy Council exposes the myths by informing the energy debate and defines a path to a more sustainable energy future. The World Energy Council urged stakeholders to take urgent and incisive action to develop and transform the global energy system. Failure to do so could put aspirations on the triple challenge of the WEC Energy Trilemma, defined by affordability, accessibility and environmental sustainability, at serious risk. Through its multi-year in-depth global studies and issue-mapping, the WEC has found that the challenges the energy sector faces today are far more acute than previously envisaged. The WEC's analysis has exposed a number of myths which influence our understanding of important aspects of the global energy landscape. If not challenged, these misconceptions will lead us down a path of complacency and missed opportunities. Much has been, and still is being, done to secure the energy future, but the WEC's studies reveal that current pathways fall short of delivering on global aspirations of energy access, energy security and environmental improvements. If we are to derive the full economic and social benefits from energy resources, then we must take incisive and urgent action to modify our approach to energy solutions. The usual business approaches are not effective; business as usual is no longer a solution. The focus has moved from large universal solutions to an appreciation of regional and national contexts and sharply differentiated consumer expectations. (author)

  20. Defining Future Directions for Endometriosis Research

    Science.gov (United States)

    D’Hooghe, Thomas M.; Fazleabas, Asgerally; Giudice, Linda C.; Montgomery, Grant W.; Petraglia, Felice; Taylor, Robert N.

    2013-01-01

    Endometriosis, defined as estrogen-dependent lesions containing endometrial glands and stroma outside the uterus, is a chronic and often painful gynecological condition that affects 6% to 10% of reproductive age women. Endometriosis has estimated annual costs of US $12 419 per woman (approximately €9579), comprising one-third of the direct health care costs with two-thirds attributed to loss of productivity. Decreased quality of life is the most important predictor of direct health care and total costs. It has been estimated that there is a mean delay of 6.7 years between onset of symptoms and a surgical diagnosis of endometriosis, and each affected woman loses on average 10.8 hours of work weekly, mainly owing to reduced effectiveness while working. To encourage and facilitate research into this debilitating disease, a consensus workshop to define future directions for endometriosis research was held as part of the 11th World Congress on Endometriosis in September 2011 in Montpellier, France. The objective of this workshop was to review and update the endometriosis research priorities consensus statement developed following the 10th World Congress on Endometriosis in 2008.1 A total of 56 recommendations for research have been developed, grouped under 6 subheadings: (1) diagnosis, (2) classification and prognosis, (3) clinical trials, treatment, and outcomes, (4) epidemiology, (5) pathophysiology, and (6) research policy. By producing this consensus international research priorities statement, it is the hope of the workshop participants that researchers will be encouraged to develop new interdisciplinary research proposals that will attract increased funding support for work on endometriosis. PMID:23427182

  1. Reference values for methacholine reactivity (SAPALDIA study

    Directory of Open Access Journals (Sweden)

    Perruchoud André

    2005-11-01

    Full Text Available Abstract Background The distribution of airway responsiveness in a general population of non-smokers without respiratory symptoms has not been established, limiting its use in clinical and epidemiological practice. We derived reference equations depending on individual characteristics (i.e., sex, age, baseline lung function) for relevant percentiles of the methacholine two-point dose-response slope. Methods In a reference sample of 1567 adults of the SAPALDIA cross-sectional survey (1991), defined by excluding subjects with respiratory conditions, responsiveness during methacholine challenge was quantified by calculating the two-point dose-response slope (O'Connor). Weighted L1-regression was used to estimate reference equations for the 95th, 90th, 75th and 50th percentiles of the two-point slope. Results Reference equations for the 95th, 90th, 75th and 50th percentiles of the two-point slope were estimated using a model of the form a + b·Age + c·FEV1 + d·(FEV1)², where FEV1 corresponds to the pre-test (or baseline) level of FEV1. For the central half of the FEV1 distribution, we used a quadratic model to describe the dependence of methacholine slope on baseline FEV1. For the first and last quartiles of FEV1, a linear relation with FEV1 was assumed (i.e., d was set to 0). Sex was not a predictor term in this model. A negative linear association with slope was found for age. We provide an Excel file allowing calculation of the percentile of methacholine slope for a subject after entering the subject's age, pre-test FEV1, and methacholine challenge results. Conclusion The present study provides equations for four relevant percentiles of the methacholine two-point slope depending on age and baseline FEV1 as basic predictors in an adult reference population of non-obstructive and non-atopic persons. These equations may help clinicians and epidemiologists to better characterize individual or population airway responsiveness.
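    Evaluating a percentile reference equation of the stated form for a given subject is simple arithmetic. The sketch below uses made-up placeholder coefficients, not the published SAPALDIA estimates (which are in the paper's Excel file), purely to show the calculation.

    ```python
    # Percentile reference equation of the form
    #   slope_p = a + b*Age + c*FEV1 + d*FEV1**2
    # The coefficients below are hypothetical placeholders, NOT the
    # published SAPALDIA values; substitute the paper's estimates.

    def reference_slope(age_years, fev1_litres, coeffs):
        """Predicted percentile of the two-point dose-response slope."""
        a, b, c, d = coeffs
        return a + b * age_years + c * fev1_litres + d * fev1_litres ** 2

    # hypothetical coefficients for, say, the 95th percentile
    p95 = (12.0, -0.05, -2.0, 0.15)

    def exceeds_percentile(measured_slope, age_years, fev1_litres, coeffs):
        """True if the measured two-point slope exceeds the reference percentile."""
        return measured_slope > reference_slope(age_years, fev1_litres, coeffs)
    ```

    For the outer FEV1 quartiles the paper sets d to 0, which the same function handles by passing a coefficient tuple with d = 0.
    
    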

  2. Virtual Reference, Real Money: Modeling Costs in Virtual Reference Services

    Science.gov (United States)

    Eakin, Lori; Pomerantz, Jeffrey

    2009-01-01

    Libraries nationwide are in yet another phase of belt tightening. Without an understanding of the economic factors that influence library operations, however, controlling costs and performing cost-benefit analyses on services is difficult. This paper describes a project to develop a cost model for collaborative virtual reference services. This…

  3. JavaScript programmer's reference

    CERN Document Server

    Valentine, Thomas

    2013-01-01

    JavaScript Programmer's Reference is an invaluable resource that won't stray far from your desktop (or your tablet!). It contains detailed information on every JavaScript object and command, and combines that reference with practical examples showcasing how you can use those commands in the real world. Whether you're just checking the syntax of a method or you're starting out on the road to JavaScript mastery, the JavaScript Programmer's Reference will be an essential aid.  With a detailed and informative tutorial section giving you the ins and outs of programming with JavaScript and the DOM f

  4. Haemostatic reference intervals in pregnancy

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Jørgensen, Maja; Klajnbard, Anna

    2010-01-01

    largely unchanged during pregnancy, delivery, and postpartum and were within non-pregnant reference intervals. However, levels of fibrinogen, D-dimer, and coagulation factors VII, VIII, and IX increased markedly. Protein S activity decreased substantially, while free protein S decreased slightly and total......Haemostatic reference intervals are generally based on samples from non-pregnant women. Thus, they may not be relevant to pregnant women, a problem that may hinder accurate diagnosis and treatment of haemostatic disorders during pregnancy. In this study, we establish gestational age......-20, 21-28, 29-34, 35-42, at active labor, and on postpartum days 1 and 2. Reference intervals for each gestational period using only the uncomplicated pregnancies were calculated in all 391 women for activated partial thromboplastin time (aPTT), fibrinogen, fibrin D-dimer, antithrombin, free protein S...

  5. Considerations for reference pump curves

    International Nuclear Information System (INIS)

    Stockton, N.B.

    1992-01-01

    This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturers' pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code-required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point.
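    The polynomial least-squares reference-curve technique mentioned above can be sketched as follows: fit a quadratic to baseline (flow, differential-pressure) test data, then compare a later test point against the fitted curve. The baseline data, polynomial degree, and acceptance tolerance below are hypothetical, not Code values.

    ```python
    # Quadratic least-squares reference pump curve (illustrative sketch).
    # Data points and the 10% tolerance are hypothetical examples.
    import numpy as np

    # baseline pump test data: flow rate (gpm) vs differential pressure (psid)
    flow = np.array([100.0, 200.0, 300.0, 400.0, 500.0])
    dp = np.array([95.0, 90.0, 82.0, 70.0, 55.0])

    coeffs = np.polyfit(flow, dp, deg=2)        # fitted quadratic reference curve

    def reference_dp(q):
        """Expected differential pressure at flow rate q from the curve."""
        return np.polyval(coeffs, q)

    def within_acceptance(q, measured_dp, tol=0.10):
        """Accept if measured dP is within a fractional tolerance of the curve."""
        expected = reference_dp(q)
        return abs(measured_dp - expected) / expected <= tol
    ```

    Fitting only over the pump's limited operating range, as the paper notes, keeps the low-order polynomial an adequate model; extrapolating outside that range is where such fits degrade.
    
    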

  6. Biomass Scenario Model Documentation: Data and References

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Y.; Newes, E.; Bush, B.; Peterson, S.; Stright, D.

    2013-05-01

    The Biomass Scenario Model (BSM) is a system dynamics model that represents the entire biomass-to-biofuels supply chain, from feedstock to fuel use. The BSM is a complex model that has been used for extensive analyses; the model and its results can be better understood if input data used for initialization and calibration are well-characterized. It has been carefully validated and calibrated against the available data, with data gaps filled in using expert opinion and internally consistent assumed values. Most of the main data sources that feed into the model are recognized as baseline values by the industry. This report documents data sources and references in Version 2 of the BSM (BSM2), which only contains the ethanol pathway, although subsequent versions of the BSM contain multiple conversion pathways. The BSM2 contains over 12,000 total input values, with 506 distinct variables. Many of the variables are opportunities for the user to define scenarios, while others are simply used to initialize a stock, such as the initial number of biorefineries. However, around 35% of the distinct variables are defined by external sources, such as models or reports. The focus of this report is to provide insight into which sources are most influential in each area of the supply chain.

  7. Choosing representative body sizes for reference adults and children

    International Nuclear Information System (INIS)

    Cristy, M.

    1992-01-01

In 1975 the International Commission on Radiological Protection published a report on Reference Man (ICRP Publication 23), and a task group of the ICRP is now revising that report. Currently 'Reference Man [adult male] is defined as being between 20-30 years of age, weighing 70 kg, is 170 cm in height, is a Caucasian and is a Western European or North American in habitat and custom' (ICRP 23, p. 4). A reference adult female (58 kg, 160 cm) was also defined and data on the fetus and children were given, but with less detail and fewer specific reference values because the focus of the ICRP at that time was on young male radiation workers. The 70-kg Reference Man (earlier called Standard Man) has been used in radiation protection for 40 years, including the dosimetric schema for nuclear medicine, and this 70-kg reference has been used since at least the 1920s in physiological models. As is well known, humans in most parts of the world have increased in size (height and weight) since this standard was first adopted. Taking modern European populations as a reference and expanding the age range to 20-50 years, the author now suggests that 176 cm height and 73-75 kg weight for adult males, and 163 cm and about 60 kg for adult females, would be more appropriate. The change in height is particularly important because many anatomical and physiological parameters - e.g., lean body mass, skeletal weight, total body water, blood volume, respiratory volumes - are correlated more closely with height than with weight. The difference in lean body mass between Asian and Caucasian persons, for example, is largely or wholly due to the difference in body height. Many equations for mean body water and other whole-body measures use body height as the only or the most important parameter, and so it is important that reference body height be chosen well.
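As an illustration of the abstract's point that whole-body measures lean heavily on height, the Watson equations for total body water (a widely quoted empirical formula, reproduced here in its commonly published form; it is not taken from ICRP 23) combine height with weight and, for males, age:

```python
def total_body_water_watson(sex, age_years, height_cm, weight_kg):
    """Watson equations for total body water (litres), as commonly quoted.
    Note the relatively large coefficient on height, consistent with the
    abstract's argument that height drives many whole-body measures."""
    if sex == "male":
        return 2.447 - 0.09156 * age_years + 0.1074 * height_cm + 0.3362 * weight_kg
    if sex == "female":
        return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg
    raise ValueError("sex must be 'male' or 'female'")

# Applied to the abstract's proposed updated reference adults
# (176 cm / ~74 kg male, 163 cm / ~60 kg female; age 35 assumed for illustration):
print(round(total_body_water_watson("male", 35, 176, 74), 1))
print(round(total_body_water_watson("female", 35, 163, 60), 1))
```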

  8. rf reference line for PEP

    International Nuclear Information System (INIS)

    Schwarz, H.D.; Weaver, J.N.

    1979-03-01

An rf phase reference line in 6 segments around the 2200-meter-circumference PEP storage ring is described. Each segment of the reference line is phase stabilized by its own independent feedback system, which uses an amplitude-modulated reflection from the end of each line. The modulation is kept small and decoupled from the next segment to avoid crosstalk and significant modulation of the rf drive signal. An error evaluation of the system is made. The technical implementation and prototype performance are described. Prototype tests indicate that the phase error around the ring can be held below 1 degree with this relatively simple system.

  9. Dietary reference values for thiamin

    DEFF Research Database (Denmark)

    Sjödin, Anders Mikael

    2016-01-01

    Following a request from the European Commission, the EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA) derived dietary reference values (DRVs) for thiamin (vitamin B1). The Panel considers that data from depletion–repletion studies in adults on the amount of dietary thiamin intake...... were measured. Results from other depletion–repletion studies are in agreement with this value. The Panel agrees on the coefficient of variation of 20% used by the SCF to cover uncertainties related to distribution of thiamin requirements in the general population, and endorses the population reference...

  10. Sizewell 'B' PWR reference design

    International Nuclear Information System (INIS)

    1982-04-01

The reference design for a PWR power station to be constructed at Sizewell 'B' is presented in 3 volumes containing 14 chapters and in a volume of drawings. The report describes the proposed design and provides the basis upon which the safety case and the Pre-Construction Safety Report have been prepared. The station is based on a 3425 MWt Westinghouse PWR providing steam to two turbine generators, each of 600 MW. The layout and many of the systems are based on the SNUPPS design for Callaway, which has been chosen as the US reference plant for the project. (U.K.)

  11. Electrical engineering a pocket reference

    CERN Document Server

    Schmidt-Walter, Heinz

    2007-01-01

    This essential reference offers you a well-organized resource for accessing the basic electrical engineering knowledge you need for your work. Whether you're an experienced engineer who appreciates an occasional refresher in key areas, or a student preparing to enter the field, Electrical Engineering: A Pocket Reference provides quick and easy access to fundamental principles and their applications. You also find an extensive collection of time-saving equations that help simplify your daily projects.Supported with more than 500 diagrams and figures, 60 tables, and an extensive index, this uniq

  12. Perl/Tk Pocket Reference

    CERN Document Server

    Lidie, Stephen

    1998-01-01

    The Perl/Tk Pocket Reference is a companion volume to Learning Perl/Tk, an O'Reilly Animal Guide. Learning Perl/Tk is a tutorial for Perl/Tk, the extension to Perl for creating graphical user interfaces. With Tk, Perl programs can be window-based rather than command-line based, with buttons, entry fields, listboxes, menus, scrollbars, balloons, tables, dialogs, and more. And Perl/Tk programs run on UNIX and Windows-based computers. This small book is a handy reference guide geared toward the advanced Perl/Tk programmer. Novice Perl/Tk programmers will find that its compact size gives th

  13. rf reference line for PEP

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, H.D.; Weaver, J.N.

    1979-03-01

An rf phase reference line in 6 segments around the 2200-meter-circumference PEP storage ring is described. Each segment of the reference line is phase stabilized by its own independent feedback system, which uses an amplitude-modulated reflection from the end of each line. The modulation is kept small and decoupled from the next segment to avoid crosstalk and significant modulation of the rf drive signal. An error evaluation of the system is made. The technical implementation and prototype performance are described. Prototype tests indicate that the phase error around the ring can be held below 1 degree with this relatively simple system.

  14. Reference data about petroleum fiscality

    International Nuclear Information System (INIS)

    2006-01-01

    This paper explains the different taxes existing in France for the petroleum products (domestic tax on petroleum products, added-value tax), the share of taxes in the retail price, the differences with other European countries, the French Government fiscal receipts and budget. Some information forms are attached to this document and concern: the formation of fuel prices (upstream, refining and transport-distribution margins), the evolution of annual average transport-distribution margins, some reference data about world petroleum markets (supply and demand, prices, market data), and some reference data about the role of oil companies on the petroleum market. (J.S.)

  15. Reference Performance Level Descriptors: Outcome of a National Working Session on Defining an "English Proficient" Performance Standard

    Science.gov (United States)

    Cook, H. Gary; MacDonald, Rita

    2014-01-01

    This document is the second in a series of working papers that elaborate on a framework of four key stages in moving toward a common definition of English learner (EL), as described in the Council of Chief State School Officers (CCSSO) publication, "Toward a 'Common Definition of English Learner': Guidance for States and State Assessment…

  16. Harmonising Reference Intervals for Three Calculated Parameters used in Clinical Chemistry.

    Science.gov (United States)

    Hughes, David; Koerbin, Gus; Potter, Julia M; Glasgow, Nicholas; West, Nic; Abhayaratna, Walter P; Cavanaugh, Juleen; Armbruster, David; Hickman, Peter E

    2016-08-01

For more than a decade there has been a global effort to harmonise all phases of the testing process, with particular emphasis on the most frequently utilised measurands. In addition, it is recognised that calculated parameters derived from these measurands should also be a target for harmonisation. Using data from the Aussie Normals study we report reference intervals for three calculated parameters: serum osmolality, serum anion gap and albumin-adjusted serum calcium. The Aussie Normals study was an a priori study that analysed samples from 1856 healthy volunteers. The nine analytes used for the calculations in this study were measured on Abbott Architect analysers. The data demonstrated normal (Gaussian) distributions for the albumin-adjusted serum calcium, the anion gap (using potassium in the calculation) and the calculated serum osmolality (using both the Bhagat et al. and the Smithline and Gardner formulae). To assess the suitability of these reference intervals for use as harmonised reference intervals, we reviewed data from the Royal College of Pathologists of Australasia/Australasian Association of Clinical Biochemists (RCPA/AACB) bias survey. We conclude that the reference intervals for the calculated serum osmolality (using the Smithline and Gardner formula) may be suitable for use as a common reference interval. Although a common reference interval for albumin-adjusted serum calcium may be possible, further investigations (including a greater range of albumin concentrations) are needed because of the bias between the Bromocresol Green (BCG) and Bromocresol Purple (BCP) methods at lower serum albumin concentrations. Problems with the measurement of total CO2 in the bias survey meant that we could not use the data for assessing the suitability of a common reference interval for the anion gap. Further study is required.
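The three calculated parameters can be sketched directly. The formulae below are commonly used textbook forms (anion gap including potassium; the 0.02 mmol/L per g/L albumin correction toward a 40 g/L reference; and a frequently quoted SI rendering of the Smithline and Gardner osmolality formula), shown for illustration rather than as the exact expressions used in the study.

```python
def anion_gap(na, k, cl, hco3):
    """Anion gap including potassium, all concentrations in mmol/L."""
    return (na + k) - (cl + hco3)

def adjusted_calcium(ca, albumin):
    """Albumin-adjusted calcium: ca in mmol/L, albumin in g/L.
    Common correction of 0.02 mmol/L per g/L toward a 40 g/L reference albumin."""
    return ca + 0.02 * (40.0 - albumin)

def calc_osmolality(na, glucose, urea):
    """Calculated osmolality (mmol/kg), a commonly quoted SI form of the
    Smithline and Gardner formula: 2*Na + glucose + urea (all in mmol/L)."""
    return 2.0 * na + glucose + urea

# Illustrative values for a healthy adult
print(anion_gap(140, 4.0, 104, 24))            # → 16.0
print(round(adjusted_calcium(2.20, 32.0), 2))  # → 2.36
print(calc_osmolality(140, 5.0, 5.0))          # → 290.0
```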

  17. A Modernized National Spatial Reference System in 2022: Focus on the Caribbean Terrestrial Reference Frame

    Science.gov (United States)

    Roman, D. R.

    2017-12-01

In 2022, the National Geodetic Survey will replace all three NAD 83 reference frames with four new terrestrial reference frames. Each frame will be named after a tectonic plate (North American, Pacific, Caribbean and Mariana) and each will be related to the IGS frame through three Euler pole parameters (EPPs). This talk will focus on practical application in the Caribbean region. A working group is being re-established for development of the North American region and will likely also result in analysis of the Pacific region. Both of these regions are adequately covered with existing CORS sites to model the EPPs. The Mariana region currently lacks sufficient coverage, but a separate project is underway to collect additional information to help in defining EPPs for that region at a later date. The Caribbean region has robust existing coverage through UNAVCO's COCONet and other data sets, but these require further analysis. This discussion will focus on a practical examination of Caribbean sites to establish candidates for determining the Caribbean frame EPPs, as well as an examination of the residual velocities within that frame that might inform an Intra-Frame Velocity Model. NGS has a vested interest in defining such a model to meet obligations to U.S. citizens in Puerto Rico and the U.S. Virgin Islands. Beyond this, NGS aims to collaborate with other countries in the region through efforts with SIRGAS and UN-GGIM-Americas for a more acceptable regional model to serve everyone's needs.
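The role of the three Euler pole parameters can be made concrete: a pole latitude, longitude and rotation rate define an angular-velocity vector, and the rigid-plate velocity at any site follows from v = ω × r. The pole values below are invented for illustration and are not the NSRS 2022 EPPs.

```python
import numpy as np

EARTH_RADIUS_MM = 6.371e9  # mean Earth radius in millimetres

def unit_vector(lat_deg, lon_deg):
    """Geocentric unit vector for a latitude/longitude (spherical Earth)."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def plate_velocity_mm_per_yr(pole_lat, pole_lon, rate_deg_per_myr,
                             site_lat, site_lon):
    """Velocity (ECEF components, mm/yr) predicted at a site by a rigid-plate
    rotation about an Euler pole: v = omega x r."""
    omega = (np.radians(rate_deg_per_myr) / 1e6) * unit_vector(pole_lat, pole_lon)
    r = EARTH_RADIUS_MM * unit_vector(site_lat, site_lon)
    return np.cross(omega, r)

# Hypothetical pole (lat, lon, rate in deg/Myr) evaluated at a site near Puerto Rico
v = plate_velocity_mm_per_yr(39.0, -104.0, 0.25, 18.4, -66.1)
print(np.linalg.norm(v))  # predicted plate speed in mm/yr
```

Fitting the three EPPs to observed CORS velocities, then inspecting what velocity remains at each site after removing the rigid rotation, is exactly the residual that an intra-frame velocity model would describe.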

  18. Spatial Updating Strategy Affects the Reference Frame in Path Integration.

    Science.gov (United States)

    He, Qiliang; McNamara, Timothy P

    2018-06-01

    This study investigated how spatial updating strategies affected the selection of reference frames in path integration. Participants walked an outbound path consisting of three successive waypoints in a featureless environment and then pointed to the first waypoint. We manipulated the alignment of participants' final heading at the end of the outbound path with their initial heading to examine the adopted reference frame. We assumed that the initial heading defined the principal reference direction in an allocentric reference frame. In Experiment 1, participants were instructed to use a configural updating strategy and to monitor the shape of the outbound path while they walked it. Pointing performance was best when the final heading was aligned with the initial heading, indicating the use of an allocentric reference frame. In Experiment 2, participants were instructed to use a continuous updating strategy and to keep track of the location of the first waypoint while walking the outbound path. Pointing performance was equivalent regardless of the alignment between the final and the initial headings, indicating the use of an egocentric reference frame. These results confirmed that people could employ different spatial updating strategies in path integration (Wiener, Berthoz, & Wolbers Experimental Brain Research 208(1) 61-71, 2011), and suggested that these strategies could affect the selection of the reference frame for path integration.

  19. Selected Reference Books of 1992.

    Science.gov (United States)

    McIlvaine, Eileen

    1993-01-01

    Presents an annotated bibliography of 40 recent scholarly and general works of interest to reference workers in university libraries. Topics areas covered include philosophy, religion, language, literature, architecture, economics, law, area studies, Russia and the Soviet Union, women's studies, and Christopher Columbus. New editions and…

  20. Development of beta reference radiations

    International Nuclear Information System (INIS)

    Wan Zhaoyong; Cai Shanyu; Li Yanbo; Yin Wei; Feng Jiamin; Sun Yuhua; Li Yongqiang

    1997-09-01

A system of beta reference radiation has been developed, composed of a 740 MBq 147Pm beta source, 74 MBq and 740 MBq 90Sr+90Y beta sources, compensation filters, a source handling tool, a source jig, spacing bars, a shutter, a control unit and a beta dose meter calibration stand. For the 740 MBq 147Pm and 74 MBq 90Sr+90Y beta reference radiations with compensation filters and the 740 MBq 90Sr+90Y beta reference radiation without a compensation filter, at distances of 20 cm, 30 cm and 30 cm respectively: the maximum residual energy is 0.14 MeV, 1.98 MeV and 2.18 MeV respectively; the absorbed dose to tissue D(0.07) is 1.547 mGy/h (1996-05-20), 5.037 mGy/h (1996-05-10) and 93.57 mGy/h (1996-05-15) respectively; and the total uncertainty is 3.0%, 1.7% and 1.7% respectively. For the first and second beta reference radiations, the dose-rate variability over an area 18 cm in diameter in the plane perpendicular to the beta-ray beam axis is within ±6% and ±3% respectively. (3 refs., 2 tabs., 8 figs.)