WorldWideScience

Sample records for methods background information

  1. Italian: Area Background Information.

    Science.gov (United States)

    Defense Language Inst., Washington, DC.

    This booklet has been assembled in order to provide students of Italian with a compact source of cultural information on their target area. Chapters include discussion of: (1) introduction to Italian; (2) origins of the Italian population; (3) geography; (4) history including the Roman Era, the Middle Ages, the Renaissance, the "Risorgimento," and…

  2. Monitor Clean and Efficient. Background information. Methods and references as applied in the Monitor in April 2009

    International Nuclear Information System (INIS)

    Gerdes, J.; De Ligt, T.

    2010-01-01

    This report contains background information about the Monitor Clean and Efficient that was published in April 2009. The goal and approach of the Monitor are clarified, as well as the methods and data that are used. The structure of this report resembles the structure of the Monitor. Sources and dates of availability are mentioned along with the data, as are the parties collecting and processing the information. The results that were found using this methodology have been published in the Monitor Clean and Efficient.

  3. Background information for the Leaching environmental Assessment Framework (LEAF) test methods

    Science.gov (United States)

    The U.S. Environmental Protection Agency Office of Resource Conservation and Recovery has initiated the review and validation process for four leaching tests under consideration for inclusion into SW-846: Method 1313 "Liquid-Solid Partitioning as a Function of Extract pH for Co...

  4. Foreign Energy Company Competitiveness: Background information

    Energy Technology Data Exchange (ETDEWEB)

    Weimar, M.R.; Freund, K.A.; Roop, J.M.

    1994-10-01

    This report provides background information to the report Energy Company Competitiveness: Little to Do With Subsidies (DOE 1994). The main body of this publication consists of data uncovered during the course of research on this DOE report. This data pertains to major government energy policies in each country studied. This report also provides a summary of the DOE report. In October 1993, the Office of Energy Intelligence, US Department of Energy (formerly the Office of Foreign Intelligence), requested that Pacific Northwest Laboratory prepare a report addressing policies and actions used by foreign governments to enhance the competitiveness of their energy firms. Pacific Northwest Laboratory prepared the report Energy Company Competitiveness Little to Do With Subsidies (DOE 1994), which provided the analysis requested by DOE. An appendix was also prepared, which provided extensive background documentation to the analysis. Because of the length of the appendix, Pacific Northwest Laboratory decided to publish this information separately, as contained in this report.

  5. Introduction to the background field method

    International Nuclear Information System (INIS)

    Abbott, L.F.; Brandeis Univ., Waltham, MA

    1982-01-01

    The background field approach to calculations in gauge field theories is presented. Conventional functional techniques are reviewed and the background field method is introduced. Feynman rules and renormalization are discussed and, as an example, the Yang-Mills β function is computed. (author)

  6. Renormalization using the background-field method

    International Nuclear Information System (INIS)

    Ichinose, S.; Omote, M.

    1982-01-01

    Renormalization using the background-field method is examined in detail. The subtraction mechanism of subdivergences is described with reference to multi-loop diagrams, and one- and two-loop counter-term formulae are explicitly given. The original one-loop counter-term formula of 't Hooft is thereby improved. The present method of renormalization is far easier to manage than the usual one, owing to the fact that only gauge-invariant quantities need to be considered when one works in an appropriate gauge. Gravity and Yang-Mills theories are studied as examples. (orig.)

  7. Background elimination methods for multidimensional coincidence γ-ray spectra

    International Nuclear Information System (INIS)

    Morhac, M.

    1997-01-01

    In the paper new methods to separate useful information from background in one, two, three and multidimensional spectra (histograms) measured in large multidetector γ-ray arrays are derived. The sensitive nonlinear peak clipping algorithm is the basis of the methods for estimation of the background in multidimensional spectra. The derived procedures are simple and therefore have a very low cost in terms of computing time. (orig.)

  8. Background information on the SSC project

    International Nuclear Information System (INIS)

    Warren, J.

    1991-10-01

    This report discusses the following information about the Superconducting Super Collider: Goals and milestones; civil construction; ring components; cryogenics; vacuum and cooling water systems; electrical power; instrumentation and control systems; and installation planning

  9. Lewis Information Network (LINK): Background and overview

    Science.gov (United States)

    Schulte, Roger R.

    1987-01-01

    The NASA Lewis Research Center supports many research facilities with many isolated buildings, including wind tunnels, test cells, and research laboratories. These facilities are all located on a 350 acre campus adjacent to the Cleveland Hopkins Airport. The function of NASA-Lewis is to do basic and applied research in all areas of aeronautics, fluid mechanics, materials and structures, space propulsion, and energy systems. These functions require a great variety of remote high speed, high volume data communications for computing and interactive graphic capabilities. In addition, new requirements for local distribution of intercenter video teleconferencing and data communications via satellite have developed. To address these and future communications requirements for the next 15 yrs, a project team was organized to design and implement a new high speed communication system that would handle both data and video information in a common lab-wide Local Area Network. The project team selected cable television broadband coaxial cable technology as the communications medium and first installation of in-ground cable began in the summer of 1980. The Lewis Information Network (LINK) became operational in August 1982 and has become the backbone of all data communications and video.

  10. The spinorial method of classifying supersymmetric backgrounds

    NARCIS (Netherlands)

    Gran, U.; Gutowski, J.; Papadopoulos, G.; Roest, D.

    2006-01-01

    We review how the classification of all supersymmetric backgrounds of IIB supergravity can be reduced to the evaluation of the Killing spinor equations and their integrability conditions, which contain the field equations, on five types of spinors. This is an extension of the work [hep-th/0503046

  11. Study on the background information for the geological disposal concept

    International Nuclear Information System (INIS)

    Matsui, Kazuaki; Murano, Tohru; Hirusawa, Shigenobu; Komoto, Harumi

    2000-03-01

    Japan Nuclear Cycle Development Institute (JNC) published its first R and D report in 1992, in which the fruits of the R and D work were compiled. Since then, JNC has been preparing the second R and D progress report, due before 2000, in which the background information on the geological disposal of high level radioactive waste (HLW) is to be presented as well as the technical basis. Recognizing the importance of social consensus on geological disposal, understanding and acceptance by society are essential to the development and realization of the geological disposal of HLW. In this fiscal year, the studies were divided into 2 phases, considering the time schedule of the second R and D progress report. 1. Phase 1: Analysis of the background information on the geological disposal concept. Based on recent information and the research work of the last 2 years, the final version of the study was prepared to contribute to the background information for the second R and D progress report. (This was published in Nov. 1999 as the intermediate report: JNC TJ 1420 2000-006). 2. Phase 2: The following 2 specific items were selected as candidate issues needing further study, considering the present circumstances around the R and D of geological disposal. (1) Educational materials and strategies related to nuclear energy and nuclear waste: specific strategies and approaches in nuclear energy and nuclear waste educational outreach and curriculum activities by the nuclear industry, government and other entities in 6 countries were surveyed and summarized. (2) Alternatives to geological disposal of HLW: past national/international consideration and current status. Alternatives to the geological disposal of HLW have been discussed in the past, and the major waste-producing countries have almost all chosen deep geological disposal as the preferred method. Here the past history and recent discussions of the alternatives to geological disposal were studied. (author)

  12. The Institute of American Indian Arts Background Information (Task One of the Transition Evaluation). Background Paper.

    Science.gov (United States)

    Tippeconnic, John W., Jr.

    The paper, prepared as Task One of the Institute of American Indian Arts Transition Evaluation, provides pertinent background information about the Institute of American Indian Arts in Santa Fe, New Mexico. A brief history of the Institute is given, with information about its philosophy and purpose; objectives; organization and administration; the…

  13. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate calculations performed to understand accident scenarios in water cooled nuclear reactors. The needs come from the imperfection of computational tools on the one side and from the interest in using such tools to get a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes. Namely, the propagation of code input errors and the propagation of the calculation output errors constitute the key words identifying the methods of current interest for industrial applications. With the developed methods, uncertainty bands (both upper and lower) can be derived for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain a code with the capability of Internal Assessment of Uncertainty, whose features are discussed in more detail.

  14. The Qatar Biobank: background and methods.

    Science.gov (United States)

    Al Kuwari, Hanan; Al Thani, Asma; Al Marri, Ajayeb; Al Kaabi, Abdulla; Abderrahim, Hadi; Afifi, Nahla; Qafoud, Fatima; Chan, Queenie; Tzoulaki, Ioanna; Downey, Paul; Ward, Heather; Murphy, Neil; Riboli, Elio; Elliott, Paul

    2015-12-03

    The Qatar Biobank aims to collect extensive lifestyle, clinical, and biological information from up to 60,000 Qatari nationals and long-term residents (individuals living in the country for ≥15 years), men and women aged ≥18 years (approximately one-fifth of all Qatari citizens), to follow up these same individuals over the long term to record any subsequent disease, and hence to study the causes and progression of disease, and disease burden, in the Qatari population. Between 11 December 2012 and 20 February 2014, 1209 participants were recruited into the pilot study of the Qatar Biobank. At recruitment, extensive phenotype information was collected from each participant, including information/measurements of socio-demographic factors, prevalent health conditions, diet, lifestyle, anthropometry, body composition, bone health, cognitive function, grip strength, retinal imaging, total body dual energy X-ray absorptiometry, and measurements of cardiovascular and respiratory function. Blood, urine, and saliva were collected and stored for future research use. A panel of 66 clinical biomarkers was routinely measured on fresh blood samples in all participants. Rates of recruitment are to be progressively increased in the coming period and the recruitment base widened to achieve a cohort of consented individuals broadly representative of the eligible Qatari population. In addition, it is planned to add further measures in sub-samples of the cohort, including Magnetic Resonance Imaging (MRI) of the brain, heart and abdomen. The mean time for collection of the extensive phenotypic information and biological samples from each participant at the baseline recruitment visit was 179 min. The 1209 pilot study participants (506 men and 703 women) were aged between 28 and 80 years (median 39 years); 899 (74.4%) were Qatari nationals and 310 (25.6%) were long-term residents. Approximately two-thirds of pilot participants were educated to graduate level or above. The

  15. 42 CFR 82.0 - Background information on this part.

    Science.gov (United States)

    2010-10-01

    ... 82.0 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY... EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000 Introduction § 82.0 Background information on this part. The Energy Employees Occupational Illness Compensation Program Act (EEOICPA), 42 U.S.C...

  16. The Background-Field Method and Noninvariant Renormalization

    International Nuclear Information System (INIS)

    Avdeev, L.V.; Kazakov, D.I.; Kalmykov, M.Yu.

    1994-01-01

    We investigate the consistency of the background-field formalism when applying various regularizations and renormalization schemes. By an example of a two-dimensional σ model it is demonstrated that the background-field method gives incorrect results when the regularization (and/or renormalization) is noninvariant. In particular, it is found that the cut-off regularization and the differential renormalization belong to this class and are incompatible with the background-field method in theories with nonlinear symmetries. 17 refs

  17. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods of processing symbolic information and information contained in training samples (ranking of objectives by promise, classification algorithms for tones and noise). The book will be of interest to specialists in the automation of processing textual information, programming, and pattern recognition.

  18. The information content of cosmic microwave background anisotropies

    Science.gov (United States)

    Scott, Douglas; Contreras, Dagoberto; Narimani, Ali; Ma, Yin-Zhe

    2016-06-01

    The cosmic microwave background (CMB) contains perturbations that are close to Gaussian and isotropic. This means that its information content, in the sense of the ability to constrain cosmological models, is closely related to the number of modes probed in CMB power spectra. Rather than making forecasts for specific experimental setups, here we take a more pedagogical approach and ask how much information we can extract from the CMB if we are only limited by sample variance. We show that, compared with temperature measurements, the addition of E-mode polarization doubles the number of modes available out to a fixed maximum multipole, provided that all of the TT, TE, and EE power spectra are measured. However, the situation in terms of constraints on particular parameters is more complicated, as we explain and illustrate graphically. We also discuss the enhancements in information that can come from adding B-mode polarization and gravitational lensing. We show how well one could ever determine the basic cosmological parameters from CMB data compared with what has been achieved with Planck, which has already probed a substantial fraction of the TT information. Lastly, we look at constraints on neutrino mass as a specific example of how lensing information improves future prospects beyond the current 6-parameter model.

  19. The information content of cosmic microwave background anisotropies

    International Nuclear Information System (INIS)

    Scott, Douglas; Contreras, Dagoberto; Narimani, Ali; Ma, Yin-Zhe

    2016-01-01

    The cosmic microwave background (CMB) contains perturbations that are close to Gaussian and isotropic. This means that its information content, in the sense of the ability to constrain cosmological models, is closely related to the number of modes probed in CMB power spectra. Rather than making forecasts for specific experimental setups, here we take a more pedagogical approach and ask how much information we can extract from the CMB if we are only limited by sample variance. We show that, compared with temperature measurements, the addition of E-mode polarization doubles the number of modes available out to a fixed maximum multipole, provided that all of the TT, TE, and EE power spectra are measured. However, the situation in terms of constraints on particular parameters is more complicated, as we explain and illustrate graphically. We also discuss the enhancements in information that can come from adding B-mode polarization and gravitational lensing. We show how well one could ever determine the basic cosmological parameters from CMB data compared with what has been achieved with Planck, which has already probed a substantial fraction of the TT information. Lastly, we look at constraints on neutrino mass as a specific example of how lensing information improves future prospects beyond the current 6-parameter model.
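
    As a rough back-of-the-envelope illustration of the mode-counting argument above (our own sketch, not a calculation from the paper): the number of independent spherical-harmonic modes up to a maximum multipole is the sum of 2ℓ+1 over ℓ, and measuring E-mode polarization as well as temperature roughly doubles that count when TT, TE, and EE are all used.

    ```python
    def n_modes(l_max, l_min=2):
        """Number of independent spherical-harmonic modes with l_min <= l <= l_max."""
        return sum(2 * l + 1 for l in range(l_min, l_max + 1))

    l_max = 2500                     # illustrative choice of maximum multipole
    t_only = n_modes(l_max)          # temperature (TT) alone
    t_plus_e = 2 * t_only            # adding E-modes, with TT, TE and EE measured
    print(f"T only: {t_only:,} modes;  T+E: {t_plus_e:,} modes")
    ```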

  20. Study on the background information for the geological disposal concept

    International Nuclear Information System (INIS)

    Matsui, Kazuaki; Murano, Tohru; Hirusawa, Shigenobu; Komoto, Harumi

    1999-11-01

    Japan Nuclear Cycle Development Institute (JNC) published the first R and D progress report in 1992, in which the fruits of the R and D works were compiled. Since then the next step of R and D has been developing progressively in Japan. JNC now plans to issue the second R and D progress report before 2000, in which information on the geological disposal of high level radioactive waste (HLW) will be presented to show the technical reliability and the technical basis contributing to site selection and the development of safety standards. Recognizing the importance of the social consensus on geological disposal in the international discussions of the 1990's, understanding and consensus by society are essential to the development and realization of the geological disposal of HLW. To gain social understanding and consensus, it is quite important to present broadly based background information on the geological disposal of HLW, together with the technical basis and also the international discussion of the issues. In this report, the following studies have been done to help prepare the background information for the 2nd R and D progress report, based on recent information and the research and assessment work of the last 2 years. These are: (1) As the general part of the discussion, the characteristics of HLW disposal and several issues to be considered for establishing the measures for the disposal of HLW were identified and analyzed from both practical and logical points of view. Those issues were the concept and image of the long term safety measures, the concept and criteria of geological disposal, and safety assessment and performance assessment. (2) As the specific part of the discussion, questions and concerns frequently raised by non-specialists were taken up and 10 topics in relation to geological disposal were identified based on the discussion. Scientific and technical facts, consensus by the specialists on the issues, and international

  1. Backgrounder

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    Center for Mountain Ecosystem Studies, Kunming Institute of Botany of the Chinese Academy of Sciences, China: $1,526,000 to inform effective water governance in the Asian highlands of China, Nepal, and Pakistan. • Ashoka Trust for Research in Ecology and the Environment (ATREE), India: $1,499,300 for research on ...

  2. A method of background noise cancellation for SQUID applications

    International Nuclear Information System (INIS)

    He, D F; Yoshizawa, M

    2003-01-01

    When superconducting quantum interference devices (SQUIDs) operate in low-cost shielding or unshielded environments, the environmental background noise should be reduced to increase the signal-to-noise ratio. In this paper we present a background noise cancellation method based on a spectral subtraction algorithm. We first measure the background noise and estimate the noise spectrum using the fast Fourier transform (FFT); we then subtract the spectrum of the background noise from that of the observed noisy signal, and the signal can be reconstructed by an inverse FFT of the subtracted spectrum. With this method the background noise, especially stationary interference, can be suppressed well and the signal-to-noise ratio can be increased. Using a high-Tc radio-frequency SQUID gradiometer and magnetometer, we have measured the magnetic field produced by a watch, which was placed 35 cm under the SQUID. After noise cancellation, the signal-to-noise ratio could be greatly increased. We also used this method to eliminate the vibration noise of a cryocooler SQUID
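
    A minimal sketch of the spectral-subtraction step described above (our own simplified implementation; the windowing, spectrum averaging, and SQUID-specific details of the paper are omitted):

    ```python
    import numpy as np

    def spectral_subtraction(noisy, background):
        """Suppress stationary background noise by subtracting its magnitude
        spectrum from that of the observed signal, then reconstructing the
        time-domain signal by inverse FFT using the observed phase."""
        noisy_spec = np.fft.rfft(noisy)
        bg_mag = np.abs(np.fft.rfft(background, n=len(noisy)))
        # Subtract the background magnitude spectrum, clipping negative values
        cleaned_mag = np.clip(np.abs(noisy_spec) - bg_mag, 0.0, None)
        # Keep the phase of the observed signal and invert the FFT
        cleaned_spec = cleaned_mag * np.exp(1j * np.angle(noisy_spec))
        return np.fft.irfft(cleaned_spec, n=len(noisy))
    ```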

  3. Background risk information to assist in risk management decision making

    International Nuclear Information System (INIS)

    Hammonds, J.S.; Hoffman, F.O.; White, R.K.; Miller, D.B.

    1992-10-01

    The evaluation of the need for remedial activities at hazardous waste sites requires quantification of risks of adverse health effects to humans and the ecosystem resulting from the presence of chemical and radioactive substances at these sites. The health risks from exposure to these substances are in addition to risks encountered because of the virtually unavoidable exposure to naturally occurring chemicals and radioactive materials that are present in air, water, soil, building materials, and food products. To provide a frame of reference for interpreting risks quantified for hazardous waste sites, it is useful to identify the relative magnitude of risks of both a voluntary and involuntary nature that are ubiquitous throughout east Tennessee. In addition to discussing risks from the ubiquitous presence of background carcinogens in the east Tennessee environment, this report also presents risks resulting from common, everyday activities. Such information should not be used to discount or trivialize risks from hazardous waste contamination, but rather, to create a sensitivity to general risk issues, thus providing a context for better interpretation of risk information

  4. Latent variable method for automatic adaptation to background states in motor imagery BCI

    Science.gov (United States)

    Dagaev, Nikolay; Volkova, Ksenia; Ossadtchi, Alexei

    2018-02-01

    Objective. Brain-computer interface (BCI) systems are known to be vulnerable to variabilities in background states of a user. Usually, no detailed information on these states is available even during the training stage. Thus there is a need for a method which is capable of taking background states into account in an unsupervised way. Approach. We propose a latent variable method that is based on a probabilistic model with a discrete latent variable. In order to estimate the model’s parameters, we suggest using the expectation maximization algorithm. The proposed method is aimed at assessing characteristics of background states without any corresponding data labeling. In the context of the asynchronous motor imagery paradigm, we applied this method to real data from twelve able-bodied subjects with open/closed eyes serving as background states. Main results. We found that the latent variable method improved classification of target states compared to the baseline method (in seven of twelve subjects). In addition, we found that our method was also capable of background state recognition (in six of twelve subjects). Significance. Without any supervised information on background states, the latent variable method provides a way to improve classification in BCI by taking background states into account at the training stage and then by making decisions on target states weighted by posterior probabilities of background states at the prediction stage.

  5. Limitations of the time slide method of background estimation

    International Nuclear Information System (INIS)

    Was, Michal; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Robinet, Florent; Vavoulidis, Miltiadis

    2010-01-01

    Time shifting the output of gravitational wave detectors operating in coincidence is a convenient way of estimating the background in a search for short-duration signals. In this paper, we show how non-stationary data affect the background estimation precision. We present a method of measuring the fluctuations of the data and computing its effects on a coincident search. In particular, we show that for fluctuations of moderate amplitude, time slides larger than the fluctuation time scales can be used. We also recall how the false alarm variance saturates with the number of time shifts.

  6. Limitations of the time slide method of background estimation

    Energy Technology Data Exchange (ETDEWEB)

    Was, Michal; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Robinet, Florent; Vavoulidis, Miltiadis, E-mail: mwas@lal.in2p3.f [LAL, Universite Paris-Sud, CNRS/IN2P3, Orsay (France)

    2010-10-07

    Time shifting the output of gravitational wave detectors operating in coincidence is a convenient way of estimating the background in a search for short-duration signals. In this paper, we show how non-stationary data affect the background estimation precision. We present a method of measuring the fluctuations of the data and computing its effects on a coincident search. In particular, we show that for fluctuations of moderate amplitude, time slides larger than the fluctuation time scales can be used. We also recall how the false alarm variance saturates with the number of time shifts.
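
    As a rough sketch of the basic time-slide procedure (our own minimal illustration, not the authors' analysis code): the event list of one detector is circularly shifted by multiples of a lag longer than the coincidence window, and the coincidence counts over the shifted realizations estimate the accidental background.

    ```python
    import numpy as np

    def coincidences(t1, t2, window):
        """Count events in t1 that have a partner in t2 within +/- window (seconds)."""
        t2 = np.sort(t2)
        idx = np.searchsorted(t2, t1)
        left = np.abs(t2[np.clip(idx - 1, 0, len(t2) - 1)] - t1) <= window
        right = np.abs(t2[np.clip(idx, 0, len(t2) - 1)] - t1) <= window
        return int(np.count_nonzero(left | right))

    def time_slide_background(t1, t2, window, lag, n_slides, duration):
        """Coincidence counts for n_slides circular time shifts of detector 2;
        their distribution estimates the accidental-coincidence background."""
        return np.array([coincidences(t1, (t2 + k * lag) % duration, window)
                         for k in range(1, n_slides + 1)])
    ```

    With strongly non-stationary data the counts obtained at different lags are no longer equivalent realizations, which is the limitation quantified in the paper, and the variance of the false-alarm estimate saturates as the number of slides grows.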

  7. A novel method to remove the background from x-ray diffraction signal

    DEFF Research Database (Denmark)

    Zheng, Yi; Speller, Robert; Griffiths, Jennifer

    2018-01-01

    The first step that is required to extract the correct information from a two-dimensional (2D) diffraction signature is to remove the background accurately. However, direct background subtraction inevitably overcorrects the signal as it does not take into account the attenuation by the sample. Ot...... proposes a novel method that combines peak fitting and experimental results to estimate the background for 2D XRD signals....

  8. A monitored retrievable storage facility: Technical background information

    International Nuclear Information System (INIS)

    1991-07-01

    The US government is seeking a site for a monitored retrievable storage facility (MRS). Employing proven technologies used in this country and abroad, the MRS will be an integral part of the federal system for safe and permanent disposal of the nation's high-level radioactive wastes. The MRS will accept shipments of spent fuel from commercial nuclear power plants, temporarily store the spent fuel above ground, and stage shipments of it to a geologic repository for permanent disposal. The law authorizing the MRS provides an opportunity for a state or an Indian tribe to volunteer to host the MRS. The law establishes the Office of the Nuclear Waste Negotiator, who is to seek a state or an Indian tribe willing to host an MRS at a technically-qualified site on reasonable terms, and is to negotiate a proposed agreement specifying the terms and conditions under which the MRS would be developed and operated at that site. This agreement can ensure that the MRS is acceptable to -- and benefits -- the host community. The proposed agreement must be submitted to Congress and enacted into law to become effective. This technical background information presents an overview of various aspects of a monitored retrievable storage facility, including the process by which it will be developed

  9. Method and equipment for γ background compensation in neutron spectrometry

    International Nuclear Information System (INIS)

    Holman, M.; Marik, P.

    1986-01-01

    The compensation of background gamma radiation in neutron spectrometry is based on the measurement of the total energy spectrum of all protons and electrons, and of the energy spectrum of those protons and neutrons which are in coincidence with the discriminating signal derived from the integral of the counting rate distribution by pulse shape. The benefits of the method consist in the possibility of using standard single-parameter apparatus, in considerably smaller demands on the memory capacity and the possibility of a substantially finer division of the spectrum and more accurate compensation of the background than has been the case with methods used so far. A practical application is shown in a block diagram. (J.B.)

  10. Numerical method for IR background and clutter simulation

    Science.gov (United States)

    Quaranta, Carlo; Daniele, Gina; Balzarotti, Giorgio

    1997-06-01

    The paper describes a fast and accurate algorithm for IR background noise and clutter generation for application in scene simulations. The process is based on the hypothesis that the background can be modeled as a statistical process in which the signal amplitude follows a Gaussian distribution and zones of the same scene obey a correlation function of exponential form. The algorithm provides an accurate mathematical approximation of the model and also an excellent fidelity with reality, as appears from a comparison with images from IR sensors. The proposed method shows advantages with respect to methods based on the filtering of white noise in the time or frequency domain, as it requires a limited number of computations and, furthermore, it is more accurate than quasi-random processes. The background generation starts from a reticule of a few points and, by means of growing rules, the process is extended to the whole scene of the required dimension and resolution. The statistical properties of the model are properly maintained in the simulation process. The paper gives specific attention to the mathematical aspects of the algorithm and provides a number of simulations and comparisons with real scenes.

  11. Research methods in information

    CERN Document Server

    Pickard, Alison Jane

    2013-01-01

    The long-awaited 2nd edition of this best-selling research methods handbook is fully updated and includes brand new coverage of online research methods and techniques, mixed methodology and qualitative analysis. There is an entire chapter contributed by Professor Julie McLeod, Sue Childs and Elizabeth Lomas focusing on research data management, applying evidence from the recent JISC funded 'DATUM' project. The first to focus entirely on the needs of the information and communications community, it guides the would-be researcher through the variety of possibilities open to them under the heading "research" and provides students with the confidence to embark on their dissertations. The focus here is on the 'doing' and although the philosophy and theory of research is explored to provide context, this is essentially a practical exploration of the whole research process with each chapter fully supported by examples and exercises tried and tested over a whole teaching career. The book will take readers through eac...

  12. Information visualization courses for students with a computer science background.

    Science.gov (United States)

    Kerren, Andreas

    2013-01-01

    Linnaeus University offers two master's courses in information visualization for computer science students with programming experience. This article briefly describes the syllabi, exercises, and practices developed for these courses.

  13. Resonances and background: A decomposition of scattering information

    International Nuclear Information System (INIS)

    Engdahl, E.; Braendas, E.; Rittby, M.; Elander, N.

    1988-01-01

    An analytic representation of the full Green's function including bound states, resonances, and remaining contributions has been obtained for a class of dilatation analytic potentials, including the superimposed Coulomb potential. It is demonstrated how to obtain the locations and residues of the poles of the Green's function as well as the associated generalized spectral density. For a model potential which has a barrier and decreases exponentially at infinity we have found a certain deflation property of the generalized spectral density. A qualitative explanation of this phenomenon is suggested. This constitutes the motivation for an approximation that explicitly shows a decomposition of the (real) continuum, corresponding to scattering data, into resonances and background contributions. The present representation is also shown to incorporate the appropriate pole-background interferences. Numerical residue strings are computed and analyzed. Results for the Coulomb potential plus the above-mentioned model potential are reported and compared with the previous non-Coulomb case. A similar deflation effect is seen to occur, as well as basically the same pole- and residue-string behavior. The relevance of the present analysis in relation to recently planned experiments with electron-cooled beams of highly charged ions is briefly discussed

  14. Thermalization of mutual information in hyperscaling violating backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Tanhayi, M. Reza [Department of Physics, Faculty of Basic Science,Islamic Azad University Central Tehran Branch (IAUCTB),P.O. Box 14676-86831, Tehran (Iran, Islamic Republic of); School of Physics, Institute for Research in Fundamental Sciences (IPM),P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)

    2016-03-31

    We study certain features of scaling behaviors of the mutual information during a process of thermalization, more precisely we extend the time scaling behavior of mutual information which has been discussed in http://dx.doi.org/10.1007/JHEP09(2015)165 to time-dependent hyperscaling violating geometries. We use the holographic description of entanglement entropy for two disjoint systems consisting of two parallel strips whose widths are much larger than the separation between them. We show that during the thermalization process, the dynamical exponent plays a crucial role in reading the general time scaling behavior of mutual information (e.g., at the pre-local-equilibration regime). It is shown that the scaling violating parameter can be employed to define an effective dimension.

  15. Unexploded ordnance issues at Aberdeen Proving Ground: Background information

    Energy Technology Data Exchange (ETDEWEB)

    Rosenblatt, D.H.

    1996-11-01

    This document summarizes currently available information about the presence and significance of unexploded ordnance (UXO) in the two main areas of Aberdeen Proving Ground: Aberdeen Area and Edgewood Area. Known UXO in the land ranges of the Aberdeen Area consists entirely of conventional munitions. The Edgewood Area contains, in addition to conventional munitions, a significant quantity of chemical-munition UXO, which is reflected in the presence of chemical agent decomposition products in Edgewood Area ground-water samples. It may be concluded from current information that the UXO at Aberdeen Proving Ground has not adversely affected the environment through release of toxic substances to the public domain, especially not by water pathways, and is not likely to do so in the near future. Nevertheless, modest but periodic monitoring of groundwater and nearby surface waters would be a prudent policy.

  16. Methods of information geometry

    CERN Document Server

    Amari, Shun-Ichi

    2000-01-01

    Information geometry provides the mathematical sciences with a new framework of analysis. It has emerged from the investigation of the natural differential geometric structure on manifolds of probability distributions, which consists of a Riemannian metric defined by the Fisher information and a one-parameter family of affine connections called the α-connections. The duality between the α-connection and the (-α)-connection together with the metric play an essential role in this geometry. This kind of duality, having emerged from manifolds of probability distributions, is ubiquitous, appearing in a variety of problems which might have no explicit relation to probability theory. Through the duality, it is possible to analyze various fundamental problems in a unified perspective. The first half of this book is devoted to a comprehensive introduction to the mathematical foundation of information geometry, including preliminaries from differential geometry, the geometry of manifolds or probability d...

  17. The Pedestrian Detection Method Using an Extension Background Subtraction about the Driving Safety Support Systems

    Science.gov (United States)

    Muranaka, Noriaki; Date, Kei; Tokumaru, Masataka; Imanishi, Shigeru

    In recent years, traffic accidents have occurred frequently as traffic density has exploded. We therefore believe that a safe and comfortable transportation system which protects pedestrians, the most vulnerable road users, is necessary. First, we detect and recognize the pedestrian (the crossing person) by image processing. Next, we inform drivers turning right or left that a pedestrian is present, using sound, images, and so on. By prompting drivers to drive safely in this way, accidents involving pedestrians can be reduced. In this paper, we use a background subtraction method to detect moving objects. In background subtraction, the method used to update the background is important; in the conventional approach, the threshold values for the subtraction processing and for the background update were identical. That is, the mixing rate of the input image and the background image in the background update was a fixed value, and fine tuning in response to environmental changes such as the weather was difficult. Therefore, we propose a background-image update method in which estimation errors are difficult to amplify, as sketched below. We experiment and evaluate a comparison over five cases: sunshine, cloud, evening, rain, and changing sunlight, excluding night. This technique can set separately the threshold values for the subtraction processing and for the background update to suit the environmental conditions such as the weather. Therefore, the mixing rate of the input image and the background image in the background update can be tuned freely. Because setting parameters suited to the environmental conditions is important to minimize the error rate, we examine how the parameters should be set.
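
    A minimal sketch of the kind of background update described above, with the subtraction threshold and the mixing rates kept as separate parameters (the parameter names and default values are our own assumptions, not the paper's):

    ```python
    import numpy as np

    def update_background(frame, background, alpha_bg=0.05, alpha_fg=0.005, thresh=25.0):
        """One step of background subtraction with separate parameters for the
        subtraction test and for the background update."""
        frame = frame.astype(np.float32)
        foreground = np.abs(frame - background) > thresh   # moving (pedestrian) pixels
        # Blend background pixels quickly and foreground pixels only slowly, so
        # detection errors are not amplified into the background model.
        alpha = np.where(foreground, alpha_fg, alpha_bg)
        background = (1.0 - alpha) * background + alpha * frame
        return foreground, background
    ```

    The model would be initialized with the first frame (background = first_frame.astype(np.float32)), and the threshold and mixing rates can then be tuned separately for each weather condition, which is the flexibility the paper argues for.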

  18. Clustering document fragments using background color and texture information

    Science.gov (United States)

    Chanda, Sukalpa; Franke, Katrin; Pal, Umapada

    2012-01-01

    Forensic analysis of questioned documents sometimes can be extensively data intensive. A forensic expert might need to analyze a heap of document fragments, and in such cases, to ensure reliability, he/she should focus only on relevant evidence hidden in those document fragments. Relevant document retrieval needs finding of similar document fragments. One way of obtaining such similar documents is to use the document fragments' physical characteristics like color, texture, etc. In this article we propose an automatic scheme to retrieve similar document fragments based on the visual appearance of the document paper and texture. Multispectral color characteristics using biologically inspired color differentiation techniques are implemented here. This is done by projecting document color characteristics to Lab color space. Gabor filter-based texture analysis is used to identify document texture. It is expected that document fragments from the same source will have similar color and texture. For clustering similar document fragments of our test dataset we use a Self Organizing Map (SOM) of dimension 5×5, where the document color and texture information are used as features. We obtained an encouraging accuracy of 97.17% from 1063 test images.
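
    As a rough illustration of the clustering step (a self-contained sketch with our own parameter choices, not the authors' pipeline), a small self-organizing map can be trained on per-fragment feature vectors such as Lab colour statistics concatenated with Gabor texture energies:

    ```python
    import numpy as np

    def train_som(features, grid=(5, 5), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
        """Minimal 5x5 self-organizing map for clustering fragment feature vectors."""
        rng = np.random.default_rng(seed)
        h, w = grid
        weights = rng.random((h, w, features.shape[1]))
        yy, xx = np.mgrid[0:h, 0:w]          # node coordinates for the neighbourhood
        for t in range(iters):
            x = features[rng.integers(len(features))]
            # Best-matching unit
            dists = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(dists), (h, w))
            # Decaying learning rate and neighbourhood radius
            frac = t / iters
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 0.5
            nb = np.exp(-((yy - by) ** 2 + (xx - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * nb[..., None] * (x - weights)
        return weights

    def assign_cluster(weights, x):
        """Map a feature vector to its best-matching SOM node (cluster label)."""
        dists = np.linalg.norm(weights - x, axis=2)
        return np.unravel_index(np.argmin(dists), dists.shape)
    ```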

  19. Estimating the SM background for supersymmetry searches: challenges and methods

    CERN Document Server

    Besjes, G J; The ATLAS collaboration

    2013-01-01

    Supersymmetry features a broad range of possible signatures at the LHC. If R-parity is conserved the production of squarks and gluinos is accompanied by events with hard jets, possibly leptons or photons and missing transverse momentum. Some Standard Model processes also mimic such events, which, due to their large cross sections, represent backgrounds that can fake or hide supersymmetry. While the normalisation of these backgrounds can be obtained from data in dedicated control regions, Monte Carlo simulation is often used to extrapolate the measured event yields from control to signal regions. Next-to-leading order and multi-parton generators are employed to predict these extrapolations for the dominant processes contributing to the SM background: W/Z boson and top pair production in association with (many) jets. The proper estimate of the associated theoretical uncertainties and testing these with data represent challenges. Other important backgrounds are diboson and top pair plus boson events with additio...

  20. Information System Quality Assessment Methods

    OpenAIRE

    Korn, Alexandra

    2014-01-01

    This thesis explores the challenging topic of information system quality assessment, mainly process assessment. In this work the term Information System Quality is defined, and different approaches to defining quality for different domains of information systems are outlined. The main methods of process assessment are reviewed and their relationships are described. Process assessment methods are divided into two categories: ISO standards and best practices. The main objective of this w...

  1. Method and apparatus for reducing solvent luminescence background emissions

    Energy Technology Data Exchange (ETDEWEB)

    Affleck, Rhett L. (Los Alamos, NM); Ambrose, W. Patrick (Los Alamos, NM); Demas, James N. (Charlottesville, VA); Goodwin, Peter M. (Jemez Springs, NM); Johnson, Mitchell E. (Pittsburgh, PA); Keller, Richard A. (Los Alamos, NM); Petty, Jeffrey T. (Los Alamos, NM); Schecker, Jay A. (Santa Fe, NM); Wu, Ming (Los Alamos, NM)

    1998-01-01

    The detectability of luminescent molecules in solution is enhanced by reducing the background luminescence due to impurity species also present in the solution. A light source that illuminates the solution acts to photolyze the impurities so that the impurities do not luminesce in the fluorescence band of the molecule of interest. Molecules of interest may be carried through the photolysis region in the solution or may be introduced into the solution after the photolysis region.

  2. A new method for background rejection with surface sensitive bolometers

    International Nuclear Information System (INIS)

    Nones, C.; Foggetta, L.; Giuliani, A.; Pedretti, M.; Salvioni, C.; Sangiorgio, S.

    2006-01-01

    We report the performance of three prototype TeO2 macrobolometers, able to identify events due to energy deposited at the detector surface. This capability is obtained by thermally coupling thin active layers to the main absorber of the bolometer, and is proved by irradiating the detectors with alpha particles. This technique can be very useful in view of background study and reduction for the CUORE experiment, a next generation Double Beta Decay search based on TeO2 macrobolometers and to be installed in the Laboratori Nazionali del Gran Sasso

  3. A novel background field removal method for MRI using projection onto dipole fields (PDF).

    Science.gov (United States)

    Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi

    2011-11-01

    For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
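
    Purely as a numerical sketch of the projection idea (our own simplified discretization, not the authors' implementation), the background field inside the ROI can be fit by dipole sources restricted to voxels outside the ROI and then subtracted:

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, lsmr

    def dipole_kernel(shape):
        """Unit dipole field kernel in k-space (main field along z)."""
        kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
        k2 = kx**2 + ky**2 + kz**2
        with np.errstate(divide="ignore", invalid="ignore"):
            d = 1.0 / 3.0 - kz**2 / k2
        d[k2 == 0] = 0.0
        return d

    def pdf_background_removal(field, roi, maxiter=30):
        """Fit dipoles outside the ROI to the measured field inside the ROI,
        then subtract their field to leave the local field."""
        shape, outside = field.shape, ~roi
        d = dipole_kernel(shape)

        def field_of(chi):                     # field generated by a source map
            return np.real(np.fft.ifftn(d * np.fft.fftn(chi)))

        def matvec(x):                         # sources outside -> field inside ROI
            chi = np.zeros(shape)
            chi[outside] = np.ravel(x)
            return field_of(chi)[roi]

        def rmatvec(y):                        # adjoint (kernel is real and symmetric)
            r = np.zeros(shape)
            r[roi] = np.ravel(y)
            return field_of(r)[outside]

        A = LinearOperator((int(roi.sum()), int(outside.sum())),
                           matvec=matvec, rmatvec=rmatvec)
        chi = np.zeros(shape)
        chi[outside] = lsmr(A, field[roi], maxiter=maxiter)[0]
        return np.where(roi, field - field_of(chi), 0.0)
    ```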

  4. Binary recursive partitioning: background, methods, and application to psychology.

    Science.gov (United States)

    Merkle, Edgar C; Shaffer, Victoria A

    2011-02-01

    Binary recursive partitioning (BRP) is a computationally intensive statistical method that can be used in situations where linear models are often used. Instead of imposing many assumptions to arrive at a tractable statistical model, BRP simply seeks to accurately predict a response variable based on values of predictor variables. The method outputs a decision tree depicting the predictor variables that were related to the response variable, along with the nature of the variables' relationships. No significance tests are involved, and the tree's 'goodness' is judged based on its predictive accuracy. In this paper, we describe BRP methods in a detailed manner and illustrate their use in psychological research. We also provide R code for carrying out the methods.
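
    The paper provides R code for the methods; purely as an illustrative sketch in another language (our own example dataset and parameters, not the authors' code), the same kind of binary recursive partitioning can be fit and judged by predictive accuracy with scikit-learn:

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Fit a small binary recursive partitioning (decision tree) model and judge
    # it, as BRP does, by held-out predictive accuracy rather than p-values.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
    print(f"held-out accuracy: {tree.score(X_test, y_test):.3f}")
    print(export_text(tree, max_depth=2))      # the fitted partitioning rules
    ```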

  5. Spectral feature characterization methods for blood stain detection in crime scene backgrounds

    Science.gov (United States)

    Yang, Jie; Mathew, Jobin J.; Dube, Roger R.; Messinger, David W.

    2016-05-01

    Blood stains are one of the most important types of evidence for forensic investigation. They contain valuable DNA information, and the pattern of the stains can suggest specifics about the nature of the violence that transpired at the scene. Blood spectral signatures containing unique reflectance or absorption features are important both for forensic on-site investigation and laboratory testing. They can be used for target detection and identification applied to crime scene hyperspectral imagery, and also be utilized to analyze the spectral variation of blood on various backgrounds. Non-blood stains often mislead the detection and can generate false alarms at a real crime scene, especially for dark and red backgrounds. This paper measured the reflectance of liquid blood and 9 kinds of non-blood samples in the range of 350 nm - 2500 nm in various crime scene backgrounds, such as pure samples contained in a petri dish with various thicknesses, mixed samples with different colors and materials of fabrics, and mixed samples with wood, all of which are examined to provide sub-visual evidence for detecting and recognizing blood from non-blood samples in a realistic crime scene. The spectral differences between blood and non-blood samples are examined and spectral features such as "peaks" and "depths" of reflectance are selected. Two blood stain detection methods are proposed in this paper, as sketched below. The first method uses an index defined as the ratio of "depth" minus "peak" over "depth" plus "peak" within a wavelength range of the reflectance spectrum. The second method uses the relative band depth of selected wavelength ranges of the reflectance spectrum. Results show that the index method is able to discriminate blood from non-blood samples in most tested crime scene backgrounds, but is not able to detect it on black felt, whereas the relative band depth method is able to discriminate blood from non-blood samples on all of the tested background material types and colors.
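
    A minimal sketch of the two detection scores as we read them from the abstract (the specific wavelength ranges used in the paper are not reproduced here and would have to be supplied; all parameter names are our own):

    ```python
    import numpy as np

    def normalized_index(wavelengths, reflectance, peak_band, depth_band):
        """First method: ("depth" - "peak") / ("depth" + "peak"), with "peak" and
        "depth" read from the reflectance in two chosen wavelength ranges (nm)."""
        peak = reflectance[(wavelengths >= peak_band[0]) & (wavelengths <= peak_band[1])].max()
        depth = reflectance[(wavelengths >= depth_band[0]) & (wavelengths <= depth_band[1])].min()
        return (depth - peak) / (depth + peak)

    def relative_band_depth(wavelengths, reflectance, band):
        """Second method: depth of an absorption feature relative to a straight-line
        continuum drawn between the reflectance values at the band shoulders."""
        lo, hi = band
        in_band = (wavelengths >= lo) & (wavelengths <= hi)
        r_lo = reflectance[np.argmin(np.abs(wavelengths - lo))]
        r_hi = reflectance[np.argmin(np.abs(wavelengths - hi))]
        continuum = np.interp(wavelengths[in_band], [lo, hi], [r_lo, r_hi])
        return float(np.max(1.0 - reflectance[in_band] / continuum))
    ```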

  6. Methods of Organizational Information Security

    Science.gov (United States)

    Martins, José; Dos Santos, Henrique

    The principal objective of this article is to present a literature review of the methods used for information security at the level of organizations. Some of the principal problems are identified and a first group of relevant dimensions is presented for efficient management of information security. The study is based on the literature review made, using some of the more relevant certified articles on this theme, international reports, and the principal norms for the management of information security. From the readings that were done, we identified some of the methods oriented to risk management, norms of certification and good practice of information security. Some of the norms are oriented to the certification of the product or system and others to the processes of the business. There are also studies proposing frameworks that suggest the integration of different approaches on the foundation of norms focused on technologies, on processes, and taking into consideration the organizational and human environment of the organizations. In our perspective, the biggest contribution to the security of information is the development of a method of information security for an organization in a conflicting environment. This should provide security of information against the possible dimensions of attack that threats could exploit through the vulnerabilities of the organizational assets. This method should support the new concepts of "Network centric warfare", "Information superiority" and "Information warfare" especially developed in this last decade, where information is seen simultaneously as a weapon and as a target.

  7. Application of Canonical Effective Methods to Background-Independent Theories

    Science.gov (United States)

    Buyukcam, Umut

    Effective formalisms play an important role in analyzing phenomena above some given length scale when complete theories are not accessible. In diverse exotic but physically important cases, the usual path-integral techniques used in a standard Quantum Field Theory approach seldom serve as adequate tools. This thesis presents a new effective method for quantum systems, called the Canonical Effective Method, which has particularly wide applicability in background-independent theories, as in the case of gravitational phenomena. The central purpose of this work is to employ these techniques to obtain semi-classical dynamics from canonical quantum gravity theories. An application to non-associative quantum mechanics is developed and testable results are obtained. Types of non-associative algebras relevant for magnetic-monopole systems are discussed. Possible modifications of the hypersurface deformation algebra and the emergence of effective space-times are presented.

  8. Discriminating background from anthropogenic lead by isotopic methods

    International Nuclear Information System (INIS)

    Nelson, B.K.; O'Brien, H.E.

    1995-01-01

    The goal of this pilot project was to evaluate the practicality of using natural variations in the isotopic composition of lead to test for the presence of anthropogenic lead in soil, surface water and ground water. Complex chemical reactions in the environment may cause measured lead concentrations to be ambiguous indicators of an anthropogenic lead component. The lead isotope tracer technique has the potential to identify both the presence and proportion of anthropogenic lead in the environment. We tested the lead isotope technique at Eielson Air Force Base, Alaska, on sources of suspected fuel contamination. Although the results are specific to this base, the general technique of using lead isotopes to trace the movement of anthropogenic lead is applicable to other CERCLA sites. The study had four objectives: (1) characterize the natural lead isotope composition of bedrock, stream sediment and soils; (2) characterize the isotopic composition of the contaminant lead derived from fuel; (3) evaluate the sensitivity of the isotopic method to distinguishing between anthropogenic and natural lead in soil and water samples and (4) evaluate the analytical feasibility and accuracy of the method at the Isotope Geochemistry Laboratory at the University of Washington

  9. Diagnosis of condensation-induced waterhammer: Methods and background

    International Nuclear Information System (INIS)

    Izenson, M.G.; Rothe, P.H.; Wallis, G.B.

    1988-10-01

    This guidebook provides reference material and diagnostic procedures concerning condensation-induced waterhammer in nuclear power plants. Condensation-induced waterhammer is the most damaging form of waterhammer and its diagnosis is complicated by the complex nature of the underlying phenomena. In Volume 1, the guidebook groups condensation-induced waterhammers into five event classes which have similar phenomena and levels of damage. Diagnostic guidelines focus on locating the event center where condensation and slug acceleration take place. Diagnosis is described in three stages: an initial assessment, detailed evaluation and final confirmation. Graphical scoping analyses are provided to evaluate whether an event from one of the event classes could have occurred at the event center. Examples are provided for each type of waterhammer. Special instructions are provided for walking down damaged piping and evaluating damage due to waterhammer. To illustrate the diagnostic methods and document past experience, six case studies have been compiled in Volume 2. These case studies, based on actual condensation-induced waterhammer events at nuclear plants, present detailed data and work through the event diagnosis using the tools introduced in the first volume. 65 figs., 8 tabs

  10. 77 FR 21992 - Proposed Renewal of Information Collection: Applicant Background Survey

    Science.gov (United States)

    2012-04-12

    ... customers. By including employees of all backgrounds, all DOI employees gain a measure of knowledge... barriers in our recruitment and selection processes, DOI must track the demographic groups that apply for... need and use of the information: This information is required to obtain the source of recruitment...

  11. Improvement of Accuracy for Background Noise Estimation Method Based on TPE-AE

    Science.gov (United States)

    Itai, Akitoshi; Yasukawa, Hiroshi

    This paper proposes a method of background noise estimation based on the tensor product expansion with a median and a Monte Carlo simulation. We have shown that a tensor product expansion with an absolute error method is effective for estimating background noise; however, the background noise might not be estimated properly by the conventional method. In this paper, it is shown that the estimation accuracy can be improved by using the proposed methods.

  12. [Effects of exposure frequency and background information on preferences for photographs of cars in different locations].

    Science.gov (United States)

    Matsuda, Ken; Kusumi, Takashi; Hosomi, Naohiro; Osa, Atsushi; Miike, Hidetoshi

    2014-08-01

    This study examined the influence of familiarity and novelty on the mere exposure effect while manipulating the presentation of background information. We selected presentation stimuli that integrated cars and backgrounds based on the results of pilot studies. During the exposure phase, we displayed the stimuli successively for 3 seconds, manipulating the background information (same or different backgrounds with each presentation) and exposure frequency (3, 6, and 9 times). In the judgment phase, 18 participants judged the cars in terms of preference, familiarity, and novelty on a 7-point scale. As the number of stimulus presentations increased, preference for the cars increased in the different-background condition and decreased in the same-background condition. This increased preference may be due to the increase in familiarity caused by the higher exposure frequency and to the novelty introduced by changing the background on each exposure. The rise in preference judgments was not seen when cars and backgrounds were presented independently. Therefore, the addition of novel features to each exposure session facilitated the mere exposure effect.

  13. Impact of information technology on the role of medical libraries in information management: normative background

    Directory of Open Access Journals (Sweden)

    Anamarija Rožić-Hristovski

    1998-01-01

    Full Text Available Exponential growth of biomedical knowledge and the development of information technology are changing the infrastructure of health care systems, education and research. Medical library roles have therefore shifted from managing containers of information toward influencing biomedical information resource content and education. These new tasks are formalised in modern American standards for medical libraries, which stress the information management role in an evolving environment. In Slovenia, medical libraries are also aware of the development imperative of information activities for advances in medicine. On the one hand they are faced with a lack of specific guidelines for proactive action, and on the other with inadequate assessment in legal documents and insufficient funding.

  14. Background Information on Crimes against Children Study. Information Memorandum 86-20.

    Science.gov (United States)

    Haas, Shaun

    This document was prepared to assist the Wisconsin Legislative Council's Special Committee on Crimes Against Children in its study of current laws relating to crimes against children. It provides the background of the origin of the study and describes the characteristics of the Criminal Code, upon which much of the committee review will center.…

  15. Research on Statistical Flow of the Complex Background Based on Image Method

    Directory of Open Access Journals (Sweden)

    Yang Huanhai

    2014-06-01

    Full Text Available As urbanization in our country continues to accelerate, the pressure on urban road traffic systems keeps increasing. Intelligent transportation systems based on computer vision technology are therefore becoming more and more significant, and using image processing technology for vehicle detection has become a hot research topic. Only vehicles that are accurately segmented from the background can be recognized and tracked. Applying video vehicle detection and image processing technology to identify the number, types and moving characteristics of the vehicles in a scene can thus provide a real-time basis for intelligent traffic control. This paper first introduces the concept of intelligent transportation systems and the importance of image processing technology in vehicle recognition and traffic statistics, gives an overview of video vehicle detection methods, compares video detection with other detection technologies, and points out the advantages of video detection. Finally, we design a real-time and reliable background subtraction method and a vehicle-area recognition method based on an information fusion algorithm, implemented with the MATLAB/GUI development tool on the Windows operating system. The algorithm is applied to frame-by-frame traffic flow images, and the experimental results show that it produces very good vehicle flow statistics.
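
    The following Python/NumPy sketch illustrates only the generic background-subtraction step the abstract refers to, a running-average background model with a difference threshold; it is not the authors' MATLAB/GUI implementation or their information fusion algorithm.

```python
# Minimal running-average background-subtraction sketch (Python/NumPy).
# Maintains an exponentially weighted background model and flags pixels that
# differ from it by more than a threshold as moving-vehicle candidates.
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Exponential running average of grayscale frames (float arrays in [0, 1])."""
    return (1.0 - alpha) * background + alpha * frame

def foreground_mask(background, frame, threshold=0.1):
    """Pixels that differ from the background model by more than the threshold."""
    return np.abs(frame - background) > threshold

# Toy usage with synthetic frames: a bright "vehicle" block moving over a static scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.4, size=(120, 160))
background = scene.copy()
for step in range(30):
    frame = scene + 0.01 * rng.standard_normal(scene.shape)
    frame[50:70, 5 * step:5 * step + 20] += 0.5          # moving object
    mask = foreground_mask(background, frame)
    background = update_background(background, frame)
print("foreground pixels in last frame:", int(mask.sum()))
```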

  16. E-mail Writing: Providing Background Information in the Core of Computer Assisted Instruction

    Directory of Open Access Journals (Sweden)

    Behzad NAZARI

    2015-01-01

    Full Text Available The present study strongly supported the effective role of background information, provided via e-mail by the teacher before students wrote their own e-mails, in improving learners' writing ability. A total of 50 advanced male EFL students aged between 25 and 40 were selected at different branches of the Iran Language Institute in Tehran. Based on the Oxford English Language Placement Test (OELPT), the students' proficiency levels were nearly the same. Participants were randomly assigned to two groups, experimental and control, each consisting of 25 students. After the administration of the proficiency test, both groups were assigned to write topic 1 as the pre-test. Next, the teacher involved the learners in the new instruction (treatment). While writing topics 2, 3, 4, 5, 6, and 7, the experimental group's background knowledge was activated through e-mail before writing and e-mailing the topics, while the control group received no background knowledge activation through e-mail. After the treatment was given to the experimental group, the students in both groups were required to write another composition on the last topic, topic 8. Again, in this phase, neither group received any background information. The results indicated that background information provided via e-mail by the teacher before the students wrote their e-mails significantly improved learners' writing ability.

  17. Identification and summary characterization of materials potentially requiring vitrification: Background information

    International Nuclear Information System (INIS)

    Croff, A.G.

    1996-01-01

    This document contains background information for the Workshop in general and for the presentation entitled 'Identification and Summary Characterization of Materials Potentially Requiring Vitrification' that was given during the first morning of the workshop. Summary characteristics of 9 categories of US materials having some potential to be vitrified are given. This is followed by a 1-2 page elaboration for each of these 9 categories. References to more detailed information are included

  18. The Cryogenic Dark Matter Search and Background Rejection with Event Position Information

    International Nuclear Information System (INIS)

    Wang, Gen-sheng

    2005-01-01

    Evidence from observational cosmology and astrophysics indicates that about one third of the universe is matter, but that known baryonic matter contributes only about 4% of the universe. A large fraction of the universe is cold, non-baryonic matter, which plays an important role in the formation and evolution of the structure of the universe. The leading candidate for the non-baryonic dark matter is Weakly Interacting Massive Particles (WIMPs), which occur naturally in supersymmetric theories of particle physics. The Cryogenic Dark Matter Search (CDMS) experiment is searching for evidence of a WIMP interacting with an atomic nucleus in crystals of Ge and Si by simultaneously measuring the phonon energy and ionization energy of the interaction in the CDMS detectors. The WIMP interaction energy ranges from a few keV to tens of keV, with a rate of less than 0.1 events/kg/day. To reach the goal of WIMP detection, the CDMS experiment has been conducted in the Soudan mine with an active muon veto and multistage passive background shields. The CDMS detectors have a low energy threshold and background rejection capabilities based on ionization yield. However, betas from contamination and other radioactive sources produce surface interactions, which have low ionization yield, comparable to that of bulk nuclear interactions. The low-ionization surface electron recoils must be removed in the WIMP search data analysis. An emphasis of this thesis is on developing the method of surface-interaction rejection using location information of the interactions, phonon energy distributions and phonon timing parameters. The result of the CDMS Soudan run 118 92.3 live-day WIMP search data analysis is presented, and represents the most sensitive search yet performed

  19. The Cryogenic Dark Matter Search and Background Rejection with Event Position Information

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Gensheng [Case Western Reserve Univ., Cleveland, OH (United States). Dept. of Physics

    2005-01-01

    Evidence from observational cosmology and astrophysics indicates that about one third of the universe is matter, but that known baryonic matter contributes only about 4% of the universe. A large fraction of the universe is cold, non-baryonic matter, which plays an important role in the formation and evolution of the structure of the universe. The leading candidate for the non-baryonic dark matter is Weakly Interacting Massive Particles (WIMPs), which occur naturally in supersymmetric theories of particle physics. The Cryogenic Dark Matter Search (CDMS) experiment is searching for evidence of a WIMP interacting with an atomic nucleus in crystals of Ge and Si by simultaneously measuring the phonon energy and ionization energy of the interaction in the CDMS detectors. The WIMP interaction energy ranges from a few keV to tens of keV, with a rate of less than 0.1 events/kg/day. To reach the goal of WIMP detection, the CDMS experiment has been conducted in the Soudan mine with an active muon veto and multistage passive background shields. The CDMS detectors have a low energy threshold and background rejection capabilities based on ionization yield. However, betas from contamination and other radioactive sources produce surface interactions, which have low ionization yield, comparable to that of bulk nuclear interactions. The low-ionization surface electron recoils must be removed in the WIMP search data analysis. An emphasis of this thesis is on developing the method of surface-interaction rejection using location information of the interactions, phonon energy distributions and phonon timing parameters. The result of the CDMS Soudan run 118 92.3 live-day WIMP search data analysis is presented, and represents the most sensitive search yet performed.

  20. Judicial decision making: order of evidence presentation and availability of background information

    NARCIS (Netherlands)

    Kerstholt, J.H.; Jackson, J.L.

    1998-01-01

    An experiment was conducted to investigate both the effect of the order of presentation of defence and prosecution evidence and the prior availability of background information on assessment of guilt. Subjects were required to judge the defendant's probability of guilt either after each witness

  1. Technical background information for the environmental and safety report, Volume 4: White Oak Lake and Dam

    International Nuclear Information System (INIS)

    Oakes, T.W.; Kelly, B.A.; Ohnesorge, W.F.; Eldridge, J.S.; Bird, J.C.; Shank, K.E.; Tsakeres, F.S.

    1982-03-01

    This report has been prepared to provide background information on White Oak Lake for the Oak Ridge National Laboratory Environmental and Safety Report. The paper presents the history of White Oak Dam and Lake and describes the hydrological conditions of the White Oak Creek watershed. Past and present sediment and water data are included; pathway analyses are described in detail

  2. Technical background information for the environmental and safety report, Volume 4: White Oak Lake and Dam

    Energy Technology Data Exchange (ETDEWEB)

    Oakes, T.W.; Kelly, B.A.; Ohnesorge, W.F.; Eldridge, J.S.; Bird, J.C.; Shank, K.E.; Tsakeres, F.S.

    1982-03-01

    This report has been prepared to provide background information on White Oak Lake for the Oak Ridge National Laboratory Environmental and Safety Report. The paper presents the history of White Oak Dam and Lake and describes the hydrological conditions of the White Oak Creek watershed. Past and present sediment and water data are included; pathway analyses are described in detail.

  3. E-Mail Writing: Providing Background Information in the Core of Computer Assisted Instruction

    Science.gov (United States)

    Nazari, Behzad; Ninknejad, Sahar

    2015-01-01

    The present study strongly supported the effective role of background information, provided via email by the teacher before students wrote their own e-mails, in improving learners' writing ability. A total of 50 advanced male EFL students aged between 25 and 40 were selected at different branches of the Iran Language Institute in Tehran. Through the placement test of…

  4. The influence of immigrant background on the choice of sedation method in paediatric dentistry.

    Science.gov (United States)

    Dahlander, Andreas; Jansson, Leif; Carlstedt, Kerstin; Grindefjord, Margaret

    2015-01-01

    The effects of immigration on the demographics of the Swedish population have changed the situation for many dental care providers, placing increased demand on cultural competence. The aim of this investigation was to study the choice of sedation method among children with immigrant background, referred to paediatric dentistry specialists, because of behaviour management problems or dental fear in combination with treatment needs. The material consisted of dental records from children referred to two clinics for paediatric dentistry: 117 records from children with an immigrant background and 106 from children with a non-immigrant background. Information about choice of sedation method (conventional treatment, conscious sedation with midazolam, nitrous oxide, or general anaesthesia) and dental status was collected from the records. The number of missed appointments (defaults) was also registered. Binary logistic regression analyses were used to calculate the influence of potential predictors on choice of sedation method. The mean age of the patients in the immigrant group was 4.9 yrs, making them significantly younger than the patients in the non-immigrant group (mean 5.7 yrs). In the immigrant group, 26% of the patients defaulted from treatments, while the corresponding frequency was significantly lower for the reference group (7%). The numbers of primary teeth with caries and permanent teeth with caries were positively and significantly correlated with the choice of treatment under general anaesthesia. Conscious sedation was used significantly more often in younger children and in the non-immigrant group, while nitrous oxide was preferred in the older children. In conclusion, conscious sedation was more frequently used in the non-immigrant group. The choice of sedation was influenced by caries frequency and the age of the child.

  5. Information geometric methods for complexity

    Science.gov (United States)

    Felice, Domenico; Cafaro, Carlo; Mancini, Stefano

    2018-03-01

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.

  6. Presenting and processing information in background noise: A combined speaker-listener perspective.

    Science.gov (United States)

    Bockstael, Annelies; Samyn, Laurie; Corthals, Paul; Botteldooren, Dick

    2018-01-01

    Transferring information orally in background noise is challenging, for both speaker and listener. Successful transfer depends on complex interaction between characteristics related to listener, speaker, task, background noise, and context. To fully assess the underlying real-life mechanisms, experimental design has to mimic this complex reality. In the current study, the effects of different types of background noise have been studied in an ecologically valid test design. Documentary-style information had to be presented by the speaker and simultaneously acquired by the listener in four conditions: quiet, unintelligible multitalker babble, fluctuating city street noise, and little varying highway noise. For both speaker and listener, the primary task was to focus on the content that had to be transferred. In addition, for the speakers, the occurrence of hesitation phenomena was assessed. The listener had to perform an additional secondary task to address listening effort. For the listener the condition with the most eventful background noise, i.e., fluctuating city street noise, appeared to be the most difficult with markedly longer duration of the secondary task. In the same fluctuating background noise, speech appeared to be less disfluent, suggesting a higher level of concentration from the speaker's side.

  7. An analysis of the Bonn agreement. Background information for evaluating business implications

    International Nuclear Information System (INIS)

    Torvanger, Asbjoern

    2001-01-01

    This report has been commissioned by the World Business Council for Sustainable Development and written in August 2001. The aim of the report is to present and analyze the newest developments in the climate negotiations, particularly from part two of the sixth Conference of the Parties to the Climate Convention in Bonn in July 2001, and to provide background information to evaluate what the ''Bonn agreement'' means for business. The report is organized as a collection of slides with supporting text explaining the background and contents of each slide. (author)

  8. Segmentation of Moving Object Using Background Subtraction Method in Complex Environments

    Directory of Open Access Journals (Sweden)

    S. Kumar

    2016-06-01

    Full Text Available Background subtraction is an extensively used approach to localize moving objects in a video sequence. However, detecting an object under spatiotemporal background behavior such as rippling water, a moving curtain, illumination change or low resolution is not a straightforward task. To deal with the above-mentioned problem, we present a background maintenance scheme based on updating the background pixels by estimating the current spatial variance along the temporal line. The work focuses on making the method immune to variations of local motion in the background. Finally, the most suitable label assignment to the motion field is estimated and optimized by using iterated conditional modes (ICM) under a Markovian framework. Performance evaluation and comparisons with other well-known background subtraction methods show that the proposed method is unaffected by the problems of aperture distortion, ghost images, and high-frequency noise.
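
    A heavily simplified analogue of the described scheme is sketched below in Python/NumPy: a per-pixel running mean and variance are maintained and large deviations are labelled foreground. The actual method estimates spatial variance along the temporal line and refines the labels with ICM under a Markovian framework, neither of which is reproduced here.

```python
# Simplified per-pixel background maintenance sketch (Python/NumPy): keep a
# running mean and variance per pixel, flag large deviations as foreground,
# and update the model only where the pixel still looks like background.
import numpy as np

class PixelBackground:
    def __init__(self, first_frame, alpha=0.02, k=2.5):
        self.mean = first_frame.astype(float)
        self.var = np.full(first_frame.shape, 1e-2)
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        deviation = frame - self.mean
        mask = deviation ** 2 > (self.k ** 2) * self.var      # foreground label
        upd = ~mask                                            # background pixels only
        self.mean[upd] += self.alpha * deviation[upd]
        self.var[upd] = (1 - self.alpha) * self.var[upd] + self.alpha * deviation[upd] ** 2
        return mask
```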

  9. Information Retrieval Methods in Libraries and Information Centers ...

    African Journals Online (AJOL)

    The volumes of information created, generated and stored are so immense that, without adequate knowledge of information retrieval methods, the retrieval process for an information user would be cumbersome and frustrating. Studies have further revealed that information retrieval methods are essential in information centers ...

  10. Methods for evaluating information sources

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2012-01-01

    The article briefly presents and discusses 12 different approaches to the evaluation of information sources (for example a Wikipedia entry or a journal article): (1) the checklist approach; (2) classical peer review; (3) modified peer review; (4) evaluation based on examining the coverage...... of controversial views; (5) evidence-based evaluation; (6) comparative studies; (7) author credentials; (8) publisher reputation; (9) journal impact factor; (10) sponsoring: tracing the influence of economic, political, and ideological interests; (11) book reviews and book reviewing; and (12) broader criteria....... Reading a text is often not a simple process. All the methods discussed here are steps on the way on learning how to read, understand, and criticize texts. According to hermeneutics it involves the subjectivity of the reader, and that subjectivity is influenced, more or less, by different theoretical...

  11. METHODS OF POLYMODAL INFORMATION TRANSMISSION

    Directory of Open Access Journals (Sweden)

    O. O. Basov

    2015-03-01

    Full Text Available This paper presents research results on the application of existing information transmission methods in polymodal infocommunication systems. An analysis of existing switching approaches and multiplexing schemes has revealed that modern telecommunication facilities are capable of delivering polymodal information to the customer's terminal with the required quality. Under these conditions, data transmission networks with static (synchronous) time multiplexing require substantial capacity resources, but modality synchronization is easier to achieve within that kind of infrastructure. Data networks with statistical time multiplexing demand more sophisticated algorithms to guarantee the quality of data block delivery; moreover, because of the stochastic delays of data blocks, modality synchronization during offline processing is more difficult to provide. Nowadays there are objective preconditions for a networking implementation that is invariant to the transmission technology applied. This capability is created by the wide application of optical technologies in the transport infrastructure of polymodal infocommunication systems. If the operation of the customer terminal and the network is matched, it becomes possible to organize channels that adaptively select the most effective networking technology according to the current traffic volume and the modality types in the messages.

  12. Method and apparatus for determining accuracy of radiation measurements made in the presence of background radiation

    International Nuclear Information System (INIS)

    Horrocks, D.L.

    1977-01-01

    A radioactivity measuring instrument, and a method related to its use, for determining the radioactivity of a sample measured in the presence of significant background radiation, and for determining an error value relating to a specific probability of accuracy of the result are presented. Error values relating to the measurement of background radiation alone, and to the measurement of sample radiation and background radiation together, are combined to produce a true error value relating to the sample radiation alone
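
    The patent abstract does not give the formulas, but the combination it describes is, in spirit, the standard quadrature combination of counting uncertainties. The sketch below is a hedged illustration using Poisson counting statistics, not the patented apparatus or circuitry.

```python
# Hedged illustration of combining counting errors: the uncertainties of the
# (sample + background) measurement and of the background-only measurement are
# combined in quadrature to give the uncertainty of the net sample count rate.
import math

def net_rate_with_error(gross_counts, t_gross, bkg_counts, t_bkg, k=1.96):
    """Net count rate and its k-sigma error (k=1.96 ~ 95% confidence level)."""
    r_gross, r_bkg = gross_counts / t_gross, bkg_counts / t_bkg
    sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
    return r_gross - r_bkg, k * sigma

rate, err = net_rate_with_error(gross_counts=5400, t_gross=600, bkg_counts=1800, t_bkg=600)
print(f"net rate = {rate:.3f} +/- {err:.3f} counts/s (95% level)")
```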

  13. Background information and technical basis for assessment of environmental implications of magnetic fusion energy

    International Nuclear Information System (INIS)

    Cannon, J.B.

    1983-08-01

    This report contains background information for assessing the potential environmental implications of fusion-based central electric power stations. It was developed as part of an environmental review of the Magnetic Fusion Energy Program. Transition of the program from demonstration of purely scientific feasibility (breakeven conditions) to exploration of engineering feasibility suggests that formal program environmental review under the National Environmental Policy Act is timely. This report is the principal reference upon which an environmental impact statement on magnetic fusion will be based

  14. 108 Information Retrieval Methods in Libraries and Information ...

    African Journals Online (AJOL)

    User

    without adequate knowledge of information retrieval methods, the retrieval process for an ... discusses the concept of Information retrieval, the various information ..... Other advantages of automatic indexing are the maintenance of consistency.

  15. Comparison of selection methods to deduce natural background levels for groundwater units

    NARCIS (Netherlands)

    Griffioen, J.; Passier, H.F.; Klein, J.

    2008-01-01

    Establishment of natural background levels (NBL) for groundwater is commonly performed to serve as a reference when assessing the contamination status of groundwater units. We compare various selection methods to establish NBLs using groundwater quality data for four hydrogeologically different areas

  16. Measuring method to impulse neutron scattering background in complicated ambient condition

    International Nuclear Information System (INIS)

    Tang Zhangkui; Peng Taiping; Tang Zhengyuan; Liu Hangang; Hu Mengchun; Fan Juan

    2004-01-01

    This paper introduces a measuring method and a calculation formula for the impulse neutron scattering background in complicated ambient conditions. The experiment was carried out in the laboratory, and the factors affecting the measurement conclusions were analysed. (authors)

  17. Study on the background information for the R and D of geological disposal

    International Nuclear Information System (INIS)

    Matsui, Kazuaki; Hirusawa, Shigenobu; Komoto, Harumi

    2001-02-01

    It is quite important for the Japan Nuclear Cycle Development Institute (JNC) to analyze the R and D items remaining after the 'H12 report' and also to provide the results of its R and D activities to the general public effectively. Recognizing the importance of social consensus on geological disposal, relevant background information was to be collected. In this fiscal year, the following two main topics were selected and studied. 1. Research and analysis on options for the geological disposal concept. The major nuclear power-generating countries have almost all chosen deep geological disposal as the preferred method for HLW disposal. Since the 1990s, to make geological disposal more flexible, alternative concepts for the disposal of HLW have been discussed to promote social acceptance. In this context, recent discussions of options and international evaluations of the following topics were studied and summarized: (1) reversibility of waste disposal/retrievability of waste/waste monitoring, (2) the long-term storage concept and its effectiveness, (3) the present position and role of international disposal. 2. Research and analysis of educational materials collected from foreign countries. Although geological disposal is scheduled to start only some time in the future, it is quite important to study procedures to attract the younger generation and give them a proper perception of nuclear energy and waste problems. As supporting analysis for the strategic implementation of public relations activities for JNC's geological disposal R and D, particular attention was focused on the educational materials obtained in the previous year's survey. Representative educational materials were selected and the following items were studied and summarized: (1) the basic approach, positioning and characteristics of the educational materials, (2) detailed analysis of the representatively selected educational materials, (3) comparison of the analyzed characteristics and study of how they can be fed back into Japanese materials. (author)

  18. Background information document to support NESHAPS rulemaking on nuclear power reactors. Draft report

    International Nuclear Information System (INIS)

    Colli, A.; Conklin, C.; Hoffmeyer, D.

    1991-08-01

    The purpose of this Background Information Document (BID) is to present information relevant to the Administrator of the Environmental Protection Agency's (EPA) reconsideration of the need for a NESHAP to control radionuclides emitted to the air from commercial nuclear power reactors. The BID presents information on the relevant portions of the regulatory framework that NRC has implemented for nuclear power plant licensees, under the authority of the Atomic Energy Act, as amended, to protect the public's health and safety. To provide context, it summarizes the rulemaking history for Subpart I. It then describes NRC's regulatory program for routine atmospheric emissions of radionuclides and evaluates the doses caused by actual airborne emissions from nuclear power plants, including releases resulting from anticipated operational occurrences

  19. Evaluation of Shifted Excitation Raman Difference Spectroscopy and Comparison to Computational Background Correction Methods Applied to Biochemical Raman Spectra.

    Science.gov (United States)

    Cordero, Eliana; Korinth, Florian; Stiebing, Clara; Krafft, Christoph; Schie, Iwan W; Popp, Jürgen

    2017-07-27

    Raman spectroscopy provides label-free biochemical information from tissue samples without complicated sample preparation. The clinical capability of Raman spectroscopy has been demonstrated in a wide range of in vitro and in vivo applications. However, a challenge for in vivo applications is the simultaneous excitation of auto-fluorescence in the majority of tissues of interest, such as liver, bladder, brain, and others. Raman bands are then superimposed on a fluorescence background, which can be several orders of magnitude larger than the Raman signal. To eliminate the disturbing fluorescence background, several approaches are available. Among instrumentational methods shifted excitation Raman difference spectroscopy (SERDS) has been widely applied and studied. Similarly, computational techniques, for instance extended multiplicative scatter correction (EMSC), have also been employed to remove undesired background contributions. Here, we present a theoretical and experimental evaluation and comparison of fluorescence background removal approaches for Raman spectra based on SERDS and EMSC.
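
    As a hedged toy illustration of the SERDS principle discussed in the abstract (not the authors' processing chain), the Python/NumPy sketch below shows how the broad fluorescence background cancels in the difference of two spectra recorded at slightly shifted excitation, leaving derivative-like Raman features.

```python
# Toy SERDS illustration (Python/NumPy): two spectra recorded with slightly
# shifted excitation share (almost) the same broad fluorescence background, so
# their difference retains only the Raman features, which shift with excitation.
import numpy as np

x = np.linspace(0, 2000, 2001)                       # Raman shift axis, cm^-1

def raman_peak(center, width=8.0, height=1.0):
    return height * np.exp(-0.5 * ((x - center) / width) ** 2)

background = 50.0 * np.exp(-((x - 800.0) / 900.0) ** 2)    # broad fluorescence
shift = 10.0                                               # excitation shift, cm^-1
spec_1 = background + raman_peak(1003) + raman_peak(1450, height=0.6)
spec_2 = background + raman_peak(1003 + shift) + raman_peak(1450 + shift, height=0.6)

serds = spec_1 - spec_2    # background cancels, derivative-like Raman doublets remain
print("residual background in the peak-free region:", float(np.abs(serds[:200]).max()))
```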

  20. Natural background radioactivity of the earth's surface -- essential information for environmental impact studies

    International Nuclear Information System (INIS)

    Tauchid, M.; Grasty, R.L.

    2002-01-01

    An environmental impact study is basically a study of change. This change is compared to the pre-existing conditions, which are usually perceived to be the original or 'pristine' stage. Unfortunately, reliable information on this so-called pristine stage is far from adequate. One of the essential parts of this information is a good knowledge of the earth's chemical make-up, or its geochemistry. Presently available data on the geochemistry of the earth's surface, including those related to radioactive elements, are incomplete and inconsistent. The main reason why a number of regulations are judged to be too strict and disproportionate to the risks that might be caused by some human activities is the lack of reliable information on the natural global geochemical background on which environmental regulations should be based. The main objective of this paper is to present a view on the need for complete baseline information on the earth's surface environment and in particular its geochemical character. Only with the availability of complete information, including reliable baseline information on natural radioactivity, can an appropriate study of the potential effect of the various naturally occurring elements on human health be carried out. Presented here are a number of examples where the natural radioactivity of an entire country has been mapped or is being mapped. Also described are the ways these undertakings were accomplished. There is a general misconception that elevated radioactivity can be found only around uranium mines, nuclear power reactors and similar nuclear installations. As can be seen from some of these maps, the natural background radioactivity of the earth's surface closely reflects the underlying geological formations and their alteration products. In reality, for properly regulated and managed facilities, the levels of radioactivity associated with many of these facilities are generally quite low relative to those associated with

  1. Information technology equipment cooling method

    Science.gov (United States)

    Schultz, Mark D.

    2015-10-20

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

  2. A method of reducing background fluctuation in tunable diode laser absorption spectroscopy

    Science.gov (United States)

    Yang, Rendi; Dong, Xiaozhou; Bi, Yunfeng; Lv, Tieliang

    2018-03-01

    Optical interference fringes are the main factor leading to background fluctuation in gas concentration detection based on tunable diode laser absorption spectroscopy. The interference fringes are generated by multiple reflections or scattering at optical surfaces in the optical path and make the background signal present an approximately sinusoidal oscillation. To reduce the fluctuation of the background, a method that combines dual tone modulation (DTM) with a vibration reflector (VR) is proposed in this paper. The combination of DTM and VR causes the unwanted periodic interference fringes to be averaged out, and the effectiveness of the method in reducing background fluctuation has been verified by simulation and real experiments in this paper. In the detection system based on the proposed method, the standard deviation (STD) of the background signal is decreased to 0.0924 parts per million (ppm), which is a factor of 16 lower than that of wavelength modulation spectroscopy. The STD value of 0.0924 ppm corresponds to an absorption of 4.328 × 10^-6 Hz^(-1/2) (with an effective optical path length of 4 m and an integration time of 0.1 s). Moreover, the proposed method presents more stable performance in reducing background fluctuation in long-duration experiments.

  3. Background information to the installers guide for small scale mains connected PV

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    This report contains background information used by BRE, EA Technology, Halcrows and Sundog when compiling guidance for the UK's New and Renewable Energy Programme on the installation of small-scale photovoltaics (PV) in buildings. The report considers: relevant standards; general safety issues; fire and safety issues, including the fire resistance of PV modules; PV module ratings such as maximum voltage and maximum current; DC cabling; the DC disconnect; the DC junction box; fault analysis; general and AC side earthing; DC earthing; lightning and surge suppression; inverters; AC modules; AC systems; getting connection; mounting options; and installation issues.

  4. Background Information for the Nevada National Security Site Integrated Sampling Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene; Marutzky, Sam

    2014-12-01

    This document describes the process followed to develop the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan). It provides the Plan’s purpose and objectives, and briefly describes the Underground Test Area (UGTA) Activity, including the conceptual model and regulatory requirements as they pertain to groundwater sampling. Background information on other NNSS groundwater monitoring programs—the Routine Radiological Environmental Monitoring Plan (RREMP) and Community Environmental Monitoring Program (CEMP)—and their integration with the Plan are presented. Descriptions of the evaluations, comments, and responses of two Sampling Plan topical committees are also included.

  5. Background information on a multimedia nitrogen emission reduction strategy; Hintergrundpapier zu einer multimedialen Stickstoffemissionsminderungsstrategie

    Energy Technology Data Exchange (ETDEWEB)

    Geupel; Jering; Frey (and others)

    2009-04-15

    The background information report on a multimedia nitrogen reduction strategy covers the following chapters: 1. Introduction: the nitrogen cascade and the anthropogenic influence, environmental impact of increased nitrogen emissions and effects on human health. 2. Sources and balancing of anthropogenic nitrogen emissions in Germany. 3. Environmental quality targets, activity goals of environmental measures and instruments of an integrated nitrogen reduction strategy. 4. Conclusions and perspectives. The attachments include emission sources, nitrogen release and nitrogen transport in Germany, and a catalogue of measures and instruments assessed according to the criteria of efficiency and cost-effectiveness.

  6. Envelope method for background elimination from X-ray fluorescence spectra

    International Nuclear Information System (INIS)

    Monakhov, V.V.; Naumenko, P.A.; Chashinskaya, O.A.

    2006-01-01

    The influence of the background caused by bremsstrahlung on the accuracy of the envelope method in x-ray fluorescence spectrum processing is studied. This is carried out on model spectra with different forms of bremsstrahlung noise as well as with background noise present in the spectra. Interpolation by parabolic splines is used to estimate the error of the envelope method in eliminating the continuous background. It is found that the error of the proposed method amounts to tenths of a percent. It is shown that the envelope method is an effective technique for eliminating the continuous bremsstrahlung from first-order x-ray fluorescence spectra [ru
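
    The abstract does not spell out the envelope algorithm itself; the Python/SciPy sketch below is a loose illustration in the same spirit, estimating a smooth background through local minima of the spectrum with a parabolic (quadratic) spline and subtracting it. Function names and parameter values are illustrative choices, not the paper's.

```python
# Rough envelope-style background estimate (Python/SciPy): interpolate through
# local minima of the spectrum with a quadratic spline and subtract the result
# as the continuous bremsstrahlung background. Assumes the spectrum is long
# enough to contain several local minima.
import numpy as np
from scipy.signal import argrelmin
from scipy.interpolate import InterpolatedUnivariateSpline

def subtract_envelope(channels, counts, order=20, spline_degree=2):
    """Estimate a smooth lower envelope through local minima and subtract it."""
    minima = argrelmin(counts, order=order)[0]
    # include the end points so the spline spans the whole spectrum
    knots = np.unique(np.concatenate(([0], minima, [len(counts) - 1])))
    envelope = InterpolatedUnivariateSpline(channels[knots], counts[knots],
                                            k=spline_degree)(channels)
    net = counts - envelope
    return np.clip(net, 0, None), envelope
```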

  7. A non-iterative method for fitting decay curves with background

    International Nuclear Information System (INIS)

    Mukoyama, T.

    1982-01-01

    A non-iterative method for fitting a decay curve with background is presented. The sum of an exponential function and a constant term is linearized by the use of the difference equation and parameters are determined by the standard linear least-squares fitting. The validity of the present method has been tested against pseudo-experimental data. (orig.)
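
    A worked sketch of the linearization idea, under the assumed model y(t) = A*exp(-lam*t) + B with equally spaced samples: since y(t+dt) = a*y(t) + b with a = exp(-lam*dt) and b = B*(1-a), a single linear least-squares fit of y[i+1] against y[i] recovers lam and B without iteration. The code below tests this against pseudo-experimental data, mirroring the validation approach mentioned in the abstract; it is an illustration, not the paper's exact procedure.

```python
# Non-iterative fit of y(t) = A*exp(-lam*t) + B via the difference-equation
# linearization y[i+1] = a*y[i] + b (Python/NumPy).
import numpy as np

def fit_decay_with_background(t, y):
    dt = t[1] - t[0]
    a, b = np.polyfit(y[:-1], y[1:], 1)        # slope and intercept of y[i+1] vs y[i]
    lam = -np.log(a) / dt
    B = b / (1.0 - a)
    A = np.mean((y - B) * np.exp(lam * t))     # amplitude from the corrected curve
    return A, lam, B

# Pseudo-experimental test data
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 101)
y = 1000.0 * np.exp(-0.5 * t) + 50.0 + rng.normal(0.0, 2.0, t.size)
print(fit_decay_with_background(t, y))         # expected roughly (1000, 0.5, 50)
```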

  8. Chemical Source Localization Fusing Concentration Information in the Presence of Chemical Background Noise.

    Science.gov (United States)

    Pomareda, Víctor; Magrans, Rudys; Jiménez-Soto, Juan M; Martínez, Dani; Tresánchez, Marcel; Burgués, Javier; Palacín, Jordi; Marco, Santiago

    2017-04-20

    We present the estimation of a likelihood map for the location of the source of a chemical plume dispersed under atmospheric turbulence under uniform wind conditions. The main contribution of this work is to extend previous proposals based on Bayesian inference with binary detections to the use of concentration information, while at the same time being robust against the presence of background chemical noise. For that, the algorithm builds a background model with robust statistical measurements to assess the posterior probability that a given chemical concentration reading comes from the background or from a source emitting at a distance with a specific release rate. In addition, our algorithm allows multiple mobile gas sensors to be used. Ten realistic simulations and ten real-data experiments are used for evaluation purposes. For the simulations, we have supposed that the sensors are mounted on cars whose main tasks do not include navigating toward the source. To collect the real dataset, a special arena with induced wind was built, and an autonomous vehicle equipped with several sensors, including a photo ionization detector (PID) for sensing chemical concentration, was used. Simulation results show that our algorithm provides a better estimation of the source location, even for a low background level that benefits the performance of the binary version. The improvement is clear for the synthetic data, while for real data the estimation is only slightly better, probably because our exploration arena is not able to provide uniform wind conditions. Finally, an estimate of the computational cost of the algorithmic proposal is presented.
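
    The sketch below is a heavily simplified, hypothetical rendering of the Bayesian idea in the abstract: a grid of candidate source locations is updated with the log-likelihood of each concentration reading under an assumed toy dispersion model plus Gaussian background noise. The real algorithm uses a turbulent plume model under wind and robust background statistics, neither of which is reproduced here; all numerical values are made up.

```python
# Toy Bayesian source-likelihood map (Python/NumPy). Assumes an isotropic
# concentration model c = q / (1 + d^2) plus Gaussian background noise, purely
# to show how concentration readings update a posterior over source positions.
import numpy as np

grid = np.stack(np.meshgrid(np.linspace(0, 10, 101), np.linspace(0, 10, 101)), -1)
log_post = np.zeros(grid.shape[:2])                    # flat prior over the grid
q, bkg_mean, bkg_std = 5.0, 0.2, 0.05                  # assumed release rate and background stats

def expected_concentration(source_xy, sensor_xy):
    d2 = np.sum((source_xy - sensor_xy) ** 2, axis=-1)
    return q / (1.0 + d2)

def update(log_post, sensor_xy, reading):
    mu = bkg_mean + expected_concentration(grid, np.asarray(sensor_xy))
    log_post = log_post - 0.5 * ((reading - mu) / bkg_std) ** 2   # Gaussian log-likelihood
    return log_post - log_post.max()                              # renormalize for stability

for pos, z in [((2.0, 3.0), 0.85), ((6.0, 3.0), 0.28), ((4.0, 7.0), 0.33)]:
    log_post = update(log_post, pos, z)
best = np.unravel_index(np.argmax(log_post), log_post.shape)
print("most likely source cell (row, col):", best)
```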

  9. Subspace-based optimization method for inverse scattering problems with an inhomogeneous background medium

    International Nuclear Information System (INIS)

    Chen, Xudong

    2010-01-01

    This paper proposes a version of the subspace-based optimization method to solve the inverse scattering problem with an inhomogeneous background medium where the known inhomogeneities are bounded in a finite domain. Although the background Green's function at each discrete point in the computational domain is not directly available in an inhomogeneous background scenario, the paper uses the finite element method to simultaneously obtain the Green's function at all discrete points. The essence of the subspace-based optimization method is that part of the contrast source is determined from the spectrum analysis without using any optimization, whereas the orthogonally complementary part is determined by solving a lower dimension optimization problem. This feature significantly speeds up the convergence of the algorithm and at the same time makes it robust against noise. Numerical simulations illustrate the efficacy of the proposed algorithm. The algorithm presented in this paper finds wide applications in nondestructive evaluation, such as through-wall imaging

  10. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    Science.gov (United States)

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

    Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reduction of the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other more traditional methods to address this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, the ID method generated results at each concentration of the ELISA standard curve that were more similar to those obtained with standard lot 1 than the HP method did, as analyzed by the Python-based ELISA_QC program. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.

  11. GROUPWARE - MODERN INFORMATION MANAGERIAL METHOD

    Directory of Open Access Journals (Sweden)

    Rozalia NISTOR

    2006-01-01

    Full Text Available The notion of groupware covers the information technologies that facilitate teamwork and that are intended for communication, collaboration and coordination within the organization. Having software routines for teamwork as its base, groupware technology has many applications in the management process of the organization. The notion of groupware refers to a special class of web packages connected to a network of personal computers: email, chat, video over IP, newsgroups, etc. Studies in the literature consider groupware a class of software programs that facilitate coordination, communication and cooperation among the members of a group. As the marketing mix in marketing is known as the "4P", in the area of groupware its characteristics are known as the "3C": communication within the group; coordination among the members of the group; collaboration among the members of the group. Among groupware software, the tools with relevance for managerial activity are: electronic mail, Internet meetings, time management, project management, and the management of dissimulated information. Groupware technologies can be divided into many categories based on two elements: time and space. The users of groupware work together at the same time – real-time groupware – or at different periods of time – offline groupware.

  12. Pricing Power of Agricultural Products under the Background of Small Peasant Management and Information Asymmetry

    Institute of Scientific and Technical Information of China (English)

    Dexuan LI

    2016-01-01

    Against the background of small peasant management and information asymmetry, this paper introduces the middleman profit-sharing model and discusses the factors influencing the pricing power of agricultural products and its ownership. It obtains the following results: (i) the farmer's transaction scale has a positive effect on his pricing power for agricultural products, while the competitor's transaction scale has a negative effect on it, as does the cost of information search; (ii) under the small peasant management system, the farmer is in a relatively weak position in the distribution of pricing power for agricultural products, due to factors such as small transaction scale, information asymmetry and the farmer's weak negotiating ability; (iii) through a cooperative game, the farmer and buyers can share the cooperative surplus at an agreed ratio; (iv) the introduction of self-organizing specialized farmers' cooperatives is favorable for solving the problem of the pricing power of agricultural products, and possible problems such as the "collective action dilemma" and "fake cooperatives" in the course of cooperative development can be solved by the internal and external division of labor and the specialization of cooperatives.

  13. Opportunities for renewable biomass in the Dutch province of Zeeland. Background information

    International Nuclear Information System (INIS)

    De Buck, A.; Croezen, H.

    2009-04-01

    The Dutch province of Zeeland is organizing three bio-debates to map economically attractive and renewable biomass opportunities. Participants included industrial businesses, ZLTO, ZMF, Zeeland Seaports, Impuls Zeeland, Hogeschool Zeeland and the University of Ghent. CE Delft is organizing the debates and provides the expertise in this field. In the first debate (Goes, 22 January 2009) the main lines for deployment of biomass in Zeeland were established. One of the conclusions was that there are opportunities for existing industry to implement new technology for large-scale use of (imported) biomass. As for agriculture, there may be opportunities for high-quality chemicals from agricultural crops. Agriculture and industry have opportunities in the short term for better and more high-quality utilization of existing residual flows of biomass. The second and third debate should address concrete opportunities for the industry and agriculture in Zeeland. This report is background information to support the debates. [nl

  14. Comparison of two interpolative background subtraction methods using phantom and clinical data

    International Nuclear Information System (INIS)

    Houston, A.S.; Sampson, W.F.D.

    1989-01-01

    Two interpolative background subtraction methods used in scintigraphy are tested using both phantom and clinical data. Cauchy integral subtraction was found to be relatively free of artefacts but required more computing time than bilinear interpolation. Both methods may be used with reasonable confidence for the quantification of relative measurements such as left ventricular ejection fraction and myocardial perfusion index but should be avoided if at all possible in the quantification of absolute measurements such as glomerular filtration rate. (author)
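
    As a rough illustration of the bilinear variant (the simpler of the two methods compared), the Python/NumPy sketch below estimates the background over a rectangular region of interest by bilinear interpolation from the four corner counts and subtracts it; the Cauchy integral method, which interpolates from the full boundary, is not reproduced.

```python
# Minimal bilinear interpolative background subtraction over a rectangular
# region of interest in a count image (Python/NumPy). Corner counts define the
# interpolated background plane, which is subtracted from the region.
import numpy as np

def bilinear_background(image, top, bottom, left, right):
    """Background estimate over image[top:bottom+1, left:right+1] from corner counts."""
    c00, c01 = image[top, left], image[top, right]
    c10, c11 = image[bottom, left], image[bottom, right]
    v = np.linspace(0.0, 1.0, bottom - top + 1)[:, None]    # vertical weight
    u = np.linspace(0.0, 1.0, right - left + 1)[None, :]    # horizontal weight
    return (1 - v) * ((1 - u) * c00 + u * c01) + v * ((1 - u) * c10 + u * c11)

def subtract_roi_background(image, top, bottom, left, right):
    roi = image[top:bottom + 1, left:right + 1].astype(float)
    net = roi - bilinear_background(image, top, bottom, left, right)
    return np.clip(net, 0, None)
```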

  15. Food irradiation: physical-chemical, technological and economical background and competing methods of food preservation

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1994-01-01

    Physical, chemical and technical as well as economical background of food preservation by irradiation have been performed. The radiation sources and the elements of radiation chemistry connected with their use in food irradiation process have been shown. The problems of dosimetry and endurance of dose uniformity for processed products have been also discussed. The other methods of food preservation and their weakness and advantages have been also presented and compared with food irradiation method

  16. Gravel Image Segmentation in Noisy Background Based on Partial Entropy Method

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Because of the wide variation in gray levels and particle dimensions, the presence of many small gravel objects in the background, and corruption of the image by noise, it is difficult to segment gravel objects. In this paper, we develop a partial entropy method and succeed in segmenting gravel objects. We give the entropy principles and the corresponding calculation methods. Moreover, we use the minimum entropy error to select a segmentation threshold automatically. We also introduce a filtering method based on mathematical morphology. Segmentation experiments performed with different window dimensions on a group of gravel images demonstrate that this method has a high segmentation rate and low noise sensitivity.
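
    The paper's partial-entropy and minimum-entropy-error criteria are not reproduced here; as a stand-in that shows the general idea of choosing a segmentation threshold from an entropy measure, the sketch below implements the classic Kapur maximum-entropy threshold on a grayscale histogram.

```python
# Entropy-based threshold selection sketch (Python/NumPy): Kapur's criterion
# picks the threshold that maximizes the summed entropies of the background and
# object gray-level distributions.
import numpy as np

def entropy_threshold(gray_image, bins=256):
    hist, _ = np.histogram(gray_image, bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()
    best_t, best_score = 0, -np.inf
    for t in range(1, bins - 1):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h0 = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))
        h1 = -np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h0 + h1 > best_score:
            best_t, best_score = t, h0 + h1
    return best_t   # pixels >= best_t can be labelled as gravel candidates
```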

  17. A study on the method for cancelling the background noise of the impact signal

    International Nuclear Information System (INIS)

    Kim, J. S.; Ham, C. S.; Park, J. H.

    1998-01-01

    In this paper, we compare the noise canceller (a time-domain analysis method) with spectral subtraction (a frequency-domain analysis method) for cancelling the background noise that the Loose Part Monitoring System's accelerometers pick up together with the impact signal when an impact occurs. During nuclear power plant monitoring, alarms are triggered by peak signals in the background noise and by amplitude increases caused by component operation, such as control rod movement or abrupt pump operation; these operations produce the background noise in the LPMS, which therefore enters the system together with the impact signal. If the noise amplitude is very large compared with that of the impact signal, the impact position and mass cannot be estimated. We analyzed the two methods for cancelling background noise: first, we evaluated the signal-to-noise ratio obtained with the noise canceller; second, we evaluated the signal-to-noise ratio obtained with spectral subtraction. The evaluation showed the noise canceller to be superior to spectral subtraction in terms of signal-to-noise ratio
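
    A minimal sketch of the frequency-domain approach (spectral subtraction) is given below: the noise magnitude spectrum is estimated from a background-only segment and subtracted, frame by frame, from the spectrum of the noisy signal. It is a generic textbook version, not the LPMS implementation, and the adaptive noise canceller it is compared against is not shown.

```python
# Generic spectral-subtraction sketch (Python/NumPy). Assumes the background-only
# reference segment is at least one frame long; frames are processed without
# overlap for simplicity.
import numpy as np

def spectral_subtraction(noisy, noise_only, frame_len=256, over_subtract=1.0):
    noise_mag = np.abs(np.fft.rfft(noise_only[:frame_len]))      # background estimate
    out = np.zeros_like(noisy, dtype=float)
    for start in range(0, len(noisy) - frame_len + 1, frame_len):
        frame = noisy[start:start + frame_len]
        spec = np.fft.rfft(frame)
        mag = np.maximum(np.abs(spec) - over_subtract * noise_mag, 0.0)
        out[start:start + frame_len] = np.fft.irfft(mag * np.exp(1j * np.angle(spec)),
                                                    n=frame_len)
    return out
```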

  18. Removal of stored particle background via the electric dipole method in the KATRIN main spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hilk, Daniel [Institut fuer Experimentelle Kernphysik, KIT, Karlsruhe (Germany); Collaboration: KATRIN-Collaboration

    2016-07-01

    The goal of the KArlsruhe TRItium Neutrino (KATRIN) experiment is to determine the effective mass of the electron anti neutrino by measuring the electron energy spectrum of tritium beta decay near the endpoint. The goal is to reach a sensitivity on the neutrino mass of 200 meV for which a low background level of 10{sup -2} counts per second is mandatory. Electrons from single radioactive decays of radon and tritium in the KATRIN main spectrometer with energies in the keV range can be magnetically stored for hours. While cooling down via ionization of residual gas molecules, they produce hundreds of secondary electrons, which can reach the detector and contribute to the background signals. In order to suppress this background component, several methods are investigated to remove stored electrons, such as the application of an electric dipole field and the application of magnetic pulses. This talk introduces the mechanism of background production due to stored electrons and their removal by the electric dipole method in the main spectrometer. In context of the spectrometer- and detector-commissioning phase in summer 2015, measurement results of the application of the electric dipole method are presented.

  19. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e.

  20. A review of Green's function methods in computational fluid mechanics: Background, recent developments and future directions

    International Nuclear Information System (INIS)

    Dorning, J.

    1981-01-01

    The research and development over the past eight years on local Green's function methods for the high-accuracy, high-efficiency numerical solution of nuclear engineering problems is reviewed. The basic concepts and key ideas are presented by starting with an expository review of the original fully two-dimensional local Green's function methods developed for neutron diffusion and heat conduction, and continuing through the progressively more complicated and more efficient nodal Green's function methods for neutron diffusion, heat conduction and neutron transport to establish the background for the recent development of Green's function methods in computational fluid mechanics. Some of the impressive numerical results obtained via these classes of methods for nuclear engineering problems are briefly summarized. Finally, speculations are proffered on future directions in which the development of these types of methods in fluid mechanics and other areas might lead. (orig.) [de

  1. Background field method for nonlinear σ-model in stochastic quantization

    International Nuclear Information System (INIS)

    Nakazawa, Naohito; Ennyu, Daiji

    1988-01-01

    We formulate the background field method for the nonlinear σ-model in stochastic quantization. We demonstrate a one-loop calculation for a two-dimensional non-linear σ-model on a general riemannian manifold based on our formulation. The formulation is consistent with the known results in ordinary quantization. As a simple application, we also analyse the multiplicative renormalization of the O(N) nonlinear σ-model. (orig.)

  2. Method and apparatus for information carrier authentication

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a method of enabling authentication of an information carrier, the information carrier comprising a writeable part and a physical token arranged to supply a response upon receiving a challenge, the method comprising the following steps; applying a first challenge to

  3. Universal field matching in craniospinal irradiation by a background-dose gradient-optimized method.

    Science.gov (United States)

    Traneus, Erik; Bizzocchi, Nicola; Fellin, Francesco; Rombi, Barbara; Farace, Paolo

    2018-01-01

    The gradient-optimized methods are overcoming the traditional feathering methods to plan field junctions in craniospinal irradiation. In this note, a new gradient-optimized technique, based on the use of a background dose, is described. Treatment planning was performed by RayStation (RaySearch Laboratories, Stockholm, Sweden) on the CT scans of a pediatric patient. Both proton (by pencil beam scanning) and photon (by volumetric modulated arc therapy) treatments were planned with three isocenters. An 'in silico' ideal background dose was created first to cover the upper-spinal target and to produce a perfect dose gradient along the upper and lower junction regions. Using it as background, the cranial and the lower-spinal beams were planned by inverse optimization to obtain dose coverage of their relevant targets and of the junction volumes. Finally, the upper-spinal beam was inversely planned after removal of the background dose and with the previously optimized beams switched on. In both proton and photon plans, the optimized cranial and lower-spinal beams produced a perfect linear gradient in the junction regions, complementary to that produced by the optimized upper-spinal beam. The final dose distributions showed a homogeneous coverage of the targets. Our simple technique allowed us to obtain high-quality gradients in the junction region. Such a technique works universally for photons as well as protons and could be applicable to TPSs that allow a background dose to be managed. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  4. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  5. Thin-shell bubbles and information loss problem in anti de Sitter background

    Energy Technology Data Exchange (ETDEWEB)

    Sasaki, Misao [Yukawa Institute for Theoretical Physics,Kyoto University, Kyoto 606-8502 (Japan); Tomsk State Pedagogical University,634050 Tomsk (Russian Federation); Yeom, Dong-han [Yukawa Institute for Theoretical Physics,Kyoto University, Kyoto 606-8502 (Japan); Leung Center for Cosmology and Particle Astrophysics, National Taiwan University,Taipei 10617, Taiwan (China)

    2014-12-24

    We study the motion of thin-shell bubbles and their tunneling in anti de Sitter (AdS) background. We are interested in the case when the outside of a shell is a Schwarzschild-AdS space (false vacuum) and the inside of it is an AdS space with a lower vacuum energy (true vacuum). If a collapsing true vacuum bubble is created, classically it will form a Schwarzschild-AdS black hole. However, this collapsing bubble can tunnel to a bouncing bubble that moves out to spatial infinity. Then, although the classical causal structure of a collapsing true vacuum bubble has the singularity and the event horizon, quantum mechanically the wavefunction has support for a history without any singularity or event horizon, which is mediated by the non-perturbative quantum tunneling effect. This may be regarded as an explicit example that shows the unitarity of an asymptotic observer in AdS, while a classical observer who only follows the most probable history effectively loses information due to the formation of an event horizon.

  6. Thin-shell bubbles and information loss problem in anti de Sitter background

    International Nuclear Information System (INIS)

    Sasaki, Misao; Yeom, Dong-han

    2014-01-01

    We study the motion of thin-shell bubbles and their tunneling in anti de Sitter (AdS) background. We are interested in the case when the outside of a shell is a Schwarzschild-AdS space (false vacuum) and the inside of it is an AdS space with a lower vacuum energy (true vacuum). If a collapsing true vacuum bubble is created, classically it will form a Schwarzschild-AdS black hole. However, this collapsing bubble can tunnel to a bouncing bubble that moves out to spatial infinity. Then, although the classical causal structure of a collapsing true vacuum bubble has the singularity and the event horizon, quantum mechanically the wavefunction has support for a history without any singularity or event horizon, which is mediated by the non-perturbative quantum tunneling effect. This may be regarded as an explicit example that shows the unitarity of an asymptotic observer in AdS, while a classical observer who only follows the most probable history effectively loses information due to the formation of an event horizon.

  7. Increasing Power by Sharing Information from Genetic Background and Treatment in Clustering of Gene Expression Time Series

    Directory of Open Access Journals (Sweden)

    Sura Zaki Alrashid

    2018-02-01

    Full Text Available Clustering of gene expression time series gives insight into which genes may be co-regulated, allowing us to discern the activity of pathways in a given microarray experiment. Of particular interest is how a given group of genes varies with different conditions or genetic background. This paper develops a new clustering method that allows each cluster to be parameterised according to whether the behaviour of the genes across conditions is correlated or anti-correlated. By specifying correlation between such genes, more information is gained within the cluster about how the genes interrelate. Amyotrophic lateral sclerosis (ALS) is an irreversible neurodegenerative disorder that kills the motor neurons and results in death within 2 to 3 years from symptom onset. The speed of progression for different patients is heterogeneous, with significant variability. The SOD1G93A transgenic mice from different backgrounds (129Sv and C57) showed consistent phenotypic differences for disease progression. A hierarchy of Gaussian processes is used to model condition-specific and gene-specific temporal covariances. This study demonstrates the identification of significant gene expression profiles and clusters of associated or co-regulated gene expressions from four groups of data (SOD1G93A and Ntg from 129Sv and C57 backgrounds). Our study shows the effectiveness of sharing information between replicates and different model conditions when modelling gene expression time series. Further gene enrichment score analysis and ontology pathway analysis of some specified clusters for a particular group may lead toward identifying features underlying the differential speed of disease progression.

  8. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  9. Methods for communicating technical information as public information

    International Nuclear Information System (INIS)

    Zara, S.A.

    1987-01-01

    Many challenges face the nuclear industry, especially in the waste management area. One of the biggest challenges is effective communication with the general public. Technical complexity, combined with the public's lack of knowledge and negative emotional response, complicates clear communication of radioactive waste management issues. The purpose of this session is to present and discuss methods for overcoming these obstacles and effectively transmitting technical information as public information. The methods presented encompass audio, visual, and print approaches to message transmission. To support these methods, the author also discusses techniques, based on current research, for improving the communication process.

  10. 77 FR 31017 - Office of Facilities Management and Program Services; Information Collection; Background...

    Science.gov (United States)

    2012-05-24

    ... 3090-0287, Background Investigations for Child Care Workers. Instructions: Please submit comments only... request for review and approval for background check investigations of child care workers, form GSA 176C... Child Care Workers AGENCY: Office of Facilities Management and Program Services, Public Building Service...

  11. Method for gathering and summarizing internet information

    Energy Technology Data Exchange (ETDEWEB)

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.
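
    The patent claim above walks through a concrete pipeline (collect per-source documents according to maps, convert to XML, search by term, relate the hits to each other). The following minimal sketch illustrates that flow under stated assumptions: the source "maps" are modelled as simple callables, and the word-overlap similarity used to relate documents is an invented stand-in, not the patented tree-building metric.

```python
# Illustrative sketch of the gather/convert/search/relate pipeline described
# in the claim. Source maps, the XML layout and the similarity measure are
# assumptions made for this example, not the patented implementation.
import xml.etree.ElementTree as ET
from collections import Counter

def to_xml(doc_id: str, text: str) -> ET.Element:
    root = ET.Element("document", id=doc_id)
    ET.SubElement(root, "body").text = text
    return root

def collect(sources: dict) -> list:
    # each "map" is represented here as a callable yielding (id, text) pairs
    return [to_xml(doc_id, text)
            for fetch in sources.values()
            for doc_id, text in fetch()]

def search(docs: list, term: str) -> list:
    return [d for d in docs if term.lower() in d.find("body").text.lower()]

def similarity(a: ET.Element, b: ET.Element) -> float:
    wa, wb = (Counter(x.find("body").text.lower().split()) for x in (a, b))
    shared = sum((wa & wb).values())
    return shared / max(1, min(sum(wa.values()), sum(wb.values())))

# usage with two toy "sources"
sources = {
    "feed_a": lambda: [("a1", "energy policy background information"),
                       ("a2", "background field method in gauge theory")],
    "feed_b": lambda: [("b1", "gathering and summarizing internet information")],
}
hits = search(collect(sources), "background")
for i, d in enumerate(hits):
    for e in hits[i + 1:]:
        print(d.get("id"), e.get("id"), round(similarity(d, e), 2))
```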

  12. A method for subtraction of the extrarenal 'background' in dynamic 131I-hippurate renoscintigraphy

    International Nuclear Information System (INIS)

    Mlodkowska, E.; Liniecki, J.; Surma, M.

    1979-01-01

    Using a Toshiba GC-401 gamma camera with an MDS Trinary computer, a new method was developed for subtracting the extrarenal (extracanalicular) 'background' from the count rate recorded over the kidneys after intravenous administration of 131I-hippurate. Mean subtraction factors of the 'blood' activity curve were calculated from a study of 27 patients who were given 51Cr-HSA for purposes of conventional renography with 'background' subtraction. The values of the mean subtraction factors F̄(R,L) for the right and left kidney, by which the blood count rate should be multiplied, amounted to 0.86 ± 0.12 and 0.79 ± 0.13, respectively. A comparison of the coefficients of variation of the pure renal signal when mean vs. individually determined subtraction factors were used, and the verification of the method in unilaterally nephrectomized patients, demonstrated that determination of the factors F̄(R,L) for each patient individually is not required and that sufficient precision can be obtained by using the method and factors reported in this study. (orig.)
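
    The arithmetic behind the subtraction is simple: the extrarenal background over each kidney is approximated by the blood-pool count rate scaled by the mean factor reported above. The sketch below uses the published factors (0.86 right, 0.79 left) but entirely made-up count rates, purely to show the calculation.

```python
# Toy illustration of the subtraction described in the abstract.
# F_RIGHT and F_LEFT are the reported mean factors; the count rates are invented.
F_RIGHT, F_LEFT = 0.86, 0.79

def net_renal_rate(gross_rate_cps: float, blood_rate_cps: float, factor: float) -> float:
    """Gross kidney count rate minus the scaled blood-pool ('background') rate."""
    return gross_rate_cps - factor * blood_rate_cps

print(net_renal_rate(1500.0, 900.0, F_RIGHT))   # right kidney: 726.0
print(net_renal_rate(1400.0, 900.0, F_LEFT))    # left kidney: 689.0
```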

  13. Simple analytical methods for computing the gravity-wave contribution to the cosmic background radiation anisotropy

    International Nuclear Information System (INIS)

    Wang, Y.

    1996-01-01

    We present two simple analytical methods for computing the gravity-wave contribution to the cosmic background radiation (CBR) anisotropy in inflationary models; one method uses a time-dependent transfer function, the other uses an approximate gravity-wave mode function which is a simple combination of the lowest-order spherical Bessel functions. We compare the CBR anisotropy tensor multipole spectrum computed using our methods with the previous result of the highly accurate numerical method, the "Boltzmann" method. Our time-dependent transfer function is more accurate than the time-independent transfer function found by Turner, White, and Lidsey; however, we find that the transfer function method is only good for l ≲ 120. Using our approximate gravity-wave mode function, we obtain much better accuracy; the tensor multipole spectrum we find differs by less than 2% for l ≲ 50, less than 10% for l ≲ 120, and less than 20% for l ≤ 300 from the "Boltzmann" result. Our approximate graviton mode function should be quite useful in studying tensor perturbations from inflationary models. copyright 1996 The American Physical Society
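
    For orientation, the standard tensor mode function in a matter-dominated background is itself built from the lowest-order spherical Bessel function; it is shown below as a generic illustration of the kind of approximate mode function the abstract refers to, not necessarily the authors' exact expression.

```latex
% Generic matter-era gravity-wave mode function (illustrative form): each
% Fourier mode h_k is frozen outside the horizon and oscillates and decays
% after horizon entry as
\[
  h_k(\eta) \;=\; h_k^{\mathrm{prim}}\,\frac{3\,j_1(k\eta)}{k\eta},
  \qquad
  j_1(x) \;=\; \frac{\sin x}{x^{2}} - \frac{\cos x}{x},
\]
% where \eta is conformal time, k the comoving wavenumber and
% h_k^{\mathrm{prim}} the primordial amplitude set during inflation.
```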

  14. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed

  15. Calculation of one-loop anomalous dimensions by means of the background field method

    International Nuclear Information System (INIS)

    Morozov, A.Yu.

    1983-01-01

    The knowledge of propagators in background fields makes the calculation of anomalous dimensions (AD) straightforward and brief. The paper illustrates this statement by calculation of the AD of many spin-zero and spin-one QCD operators up to the eighth dimension included. The method presented does not simplify calculations in the case of four-quark operators, and these are therefore not discussed. Together with the calculational difficulties arising for operators with derivatives, this limits the capabilities of the whole approach and leads to incompleteness of some mixing matrices found in the article.

  16. Report: Management Alert - EPA Has Not Initiated Required Background Investigations for Information Systems Contractor Personnel

    Science.gov (United States)

    Report #17-P-0409, September 27, 2017. Not vetting contractor personnel before granting them network access exposes the EPA to risks. Contractor personnel with potentially questionable backgrounds who access sensitive agency data could cause harm.

  17. Studying Heavy Ion Collisions Using Methods From Cosmic Microwave Background (CMB) Analysis

    Directory of Open Access Journals (Sweden)

    Gaardhøje J. J.

    2014-04-01

    Full Text Available We present and discuss a framework for studying the morphology of high-multiplicity events from relativistic heavy ion collisions using methods commonly employed in the analysis of the photons from the Cosmic Microwave Background (CMB). The analysis is based on the decomposition of the distribution of the number density of (charged) particles, expressed in polar and azimuthal coordinates, into a sum of spherical harmonic functions. We present an application of the method, exploiting relevant symmetries, to the study of azimuthal correlations arising from collective flow among charged particles produced in relativistic heavy ion collisions. We discuss perspectives for event-by-event analyses, which with increasing collision energy will eventually open entirely new dimensions in the study of ultrarelativistic heavy ion reactions.
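
    The decomposition alluded to is the CMB-style multipole expansion; the generic form is given below as a pointer (normalisation conventions are an assumption of this illustration).

```latex
% CMB-style multipole decomposition of the charged-particle number density
% over the sphere (conventions illustrative):
\[
  \frac{dN}{d\Omega}(\theta,\varphi)
    \;=\; \sum_{\ell=0}^{\infty} \sum_{m=-\ell}^{\ell}
          a_{\ell m}\, Y_{\ell m}(\theta,\varphi),
  \qquad
  a_{\ell m} \;=\; \int \! d\Omega\; Y_{\ell m}^{*}(\theta,\varphi)\,
                   \frac{dN}{d\Omega}(\theta,\varphi),
\]
% with the angular power spectrum summarising each multipole:
\[
  C_{\ell} \;=\; \frac{1}{2\ell+1}\sum_{m=-\ell}^{\ell}\bigl|a_{\ell m}\bigr|^{2}.
\]
```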

  18. Methods of determining information needs for control

    Energy Technology Data Exchange (ETDEWEB)

    Borkowski, Z.

    1980-01-01

    Work has begun at the Main Data Center for the mining industry (Poland) on evaluating and improving the methods of determining the information requirements necessary for control. Existing methods are briefly surveyed and their shortcomings are shown. The complexity of the characteristics involved in this problem is pointed out.

  19. Exploring methods in information literacy research

    CERN Document Server

    Lipu, Suzanne; Lloyd, Annemaree

    2007-01-01

    This book provides an overview of approaches to assist researchers and practitioners to explore ways of undertaking research in the information literacy field. The first chapter provides an introductory overview of research by Dr Kirsty Williamson (author of Research Methods for Students, Academics and Professionals: Information Management and Systems) and this sets the scene for the rest of the chapters, where each author explores the key aspects of a specific method and explains how it may be applied in practice. The methods covered include those representing qualitative, quantitative and mixed approaches.

  20. Method of and System for Information Retrieval

    DEFF Research Database (Denmark)

    2015-01-01

    This invention relates to a system for and a method (100) of searching a collection of digital information (150) comprising a number of digital documents (110), the method comprising receiving or obtaining (102) a search query, the query comprising a number of search terms, searching (103) an index (300) using the search terms, thereby providing information (301) about which digital documents (110) of the collection of digital information (150) contain a given search term, together with one or more search-related metrics (302; 303; 304; 305; 306), and ranking (105) at least a part of the search result. In this way, a method of and a system for information retrieval or searching is readily provided that enhances the searching quality (i.e. the number of relevant documents retrieved and such documents being ranked high) when (also) using queries containing many search terms.
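
    To make the index-then-rank flow concrete, here is a minimal sketch: an inverted index maps each term to the documents containing it, and a simple score (how many of the query terms a document matches) ranks the result. The data structure and scoring rule are assumptions made for illustration, not the patented metrics (302-306).

```python
# Minimal inverted-index search and ranking sketch (illustrative only).
from collections import defaultdict

def build_index(docs: dict) -> dict:
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in set(text.lower().split()):
            index[term].add(doc_id)
    return index

def search(index: dict, query: str) -> list:
    scores = defaultdict(int)
    for term in query.lower().split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1          # one point per matched query term
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

docs = {
    "d1": "information retrieval with many search terms",
    "d2": "background information on search engines",
    "d3": "unrelated text",
}
print(search(build_index(docs), "information retrieval search"))
# [('d1', 3), ('d2', 2)]
```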

  1. A genomic background based method for association analysis in related individuals.

    Directory of Open Access Journals (Sweden)

    Najaf Amin

    Full Text Available BACKGROUND: The feasibility of genotyping hundreds of thousands of single nucleotide polymorphisms (SNPs) in thousands of study subjects has triggered the need for fast, powerful, and reliable methods for genome-wide association analysis. Here we consider a situation when study participants are genetically related (e.g. due to systematic sampling of families or because a study was performed in a genetically isolated population). Of the available methods that account for relatedness, the Measured Genotype (MG) approach is considered the 'gold standard'. However, MG is not efficient with respect to time taken for the analysis of genome-wide data. In this context we proposed a fast two-step method called Genome-wide Association using Mixed Model and Regression (GRAMMAR) for the analysis of pedigree-based quantitative traits. This method certainly overcomes the drawback of the time limitation of the measured genotype (MG) approach, but pays in power. One of the major drawbacks of both MG and GRAMMAR is that they crucially depend on the availability of complete and correct pedigree data, which is rarely available. METHODOLOGY: In this study we first explore the type 1 error and relative power of the MG, GRAMMAR, and Genomic Control (GC) approaches for genetic association analysis. Secondly, we propose an extension to GRAMMAR, i.e. GRAMMAR-GC. Finally, we propose application of GRAMMAR-GC using the kinship matrix estimated through genomic marker data, instead of (possibly missing and/or incorrect) genealogy. CONCLUSION: Through simulations we show that the MG approach maintains high power across a range of heritabilities and possible pedigree structures, and always outperforms other contemporary methods. We also show that the power of our proposed GRAMMAR-GC approaches that of the 'gold standard' MG for all models and pedigrees studied. We show that this method is both feasible and powerful and has correct type 1 error in the context of genome-wide association analysis.
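
    A rough sketch of the two-step GRAMMAR idea plus genomic control is given below. It assumes the polygenic heritability h2 and the kinship matrix K are already known (the full method estimates the polygenic model by maximum likelihood); it is an illustration of the scheme, not the authors' code.

```python
# Sketch of GRAMMAR-GC: (1) remove the polygenic (family) effect from the
# phenotype, (2) test each SNP by simple regression on the residuals,
# (3) correct the test statistics by genomic control. Simplifying assumption:
# h2 and the kinship matrix K are given rather than estimated.
import numpy as np
from scipy import stats

def grammar_gc(y, genotypes, K, h2):
    n = len(y)
    V = h2 * K + (1.0 - h2) * np.eye(n)              # phenotypic covariance
    Vinv = np.linalg.inv(V)
    ones = np.ones(n)
    mu = (ones @ Vinv @ y) / (ones @ Vinv @ ones)    # GLS estimate of the mean
    g_blup = h2 * K @ Vinv @ (y - mu)                # polygenic effect (BLUP)
    resid = y - mu - g_blup                          # environmental residuals

    chi2 = np.empty(genotypes.shape[1])
    for j in range(genotypes.shape[1]):              # step 2: simple regression
        slope, _, r, _, _ = stats.linregress(genotypes[:, j], resid)
        chi2[j] = r**2 * (n - 2) / (1.0 - r**2)      # Wald-type chi-square, 1 df
    lam = np.median(chi2) / 0.4549                   # step 3: genomic control
    return chi2 / max(lam, 1.0)
```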

  2. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  3. Surfing for health: user evaluation of a health information website. Part one: Background and literature review.

    Science.gov (United States)

    Williams, Peter; Nicholas, David; Huntington, Paul; McLean, Fiona

    2002-06-01

    The Government in Britain is set on using the Internet to expand the provision of health information to the general public. Concerns over the quality of the health information have preoccupied commentators and organizations rather than the way users interact with health information systems. This report examines the issues surrounding the provision of electronic health information and describes an evaluation of a commercial health website, Surgerydoor (http://www.surgerydoor.co.uk/); the report comprises two parts. Part one outlines the literature on electronic health information evaluation. It discusses quality issues, but also redresses the imbalance by exploring other evaluative perspectives. Part two describes an evaluation of a health information Internet site in terms of its usability and appeal, undertaken as part of a Department of Health funded study on the impact of such systems.

  4. Research Investigation of Information Access Methods

    Science.gov (United States)

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived satisfaction ratings are used to determine the user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  5. Agile Methods from the Viewpoint of Information

    Directory of Open Access Journals (Sweden)

    Eder Junior Alves

    2017-10-01

    Full Text Available Introduction: Since Paul M. G. Otlet highlighted the term documentation in 1934, proposing how to collect and organize the world's knowledge, much scientific research has directed its attention to the study of Information Science. Methods and techniques have emerged that view the world from the perspective of information, and agile methods follow this trend. Objective: The purpose is to analyze the relevance of information flow to organizations adopting agile methods, understanding how the innovation process is influenced by this practice. Methodology: This is a bibliometric study based on a Systematic Literature Review (SLR). The integration of the SLR technique with the Summarize tool is a new methodological proposal. Results: Scrum appears with the highest number of publications in SPELL. In comparison, results from Google Scholar pointed out the importance of practices and team behaviors. In the Science Direct repository, critical success factors in project management and software development are highlighted. Conclusions: It was evident that agile methods are being used as process innovations. The benefits and advantages are evident in the internal and external flow of information. Due to its prevalence in the literature, Scrum deserves attention by firms.

  6. Background estimation in short-wave region during determination of total sample composition by x-ray fluorescence method

    International Nuclear Information System (INIS)

    Simakov, V.A.; Kordyukov, S.V.; Petrov, E.N.

    1988-01-01

    A method of background estimation in the short-wave spectral region during determination of total sample composition by the X-ray fluorescence method is described. Thirteen different rock types with considerable variations of base composition and Zr, Nb, Th, U contents below 7·10^-3 % are investigated. The suggested method of background accounting provides a smaller statistical error of the background estimation than a direct isolated measurement, and the reliability of its determination in the short-wave region is independent of the sample base. The possibilities of the suggested method for artificial mixtures whose main-component content conforms to technological concentrates of niobium, zirconium and tantalum are estimated.

  7. Traceability information carriers. The technology backgrounds and consumers' perceptions of the technological solutions

    DEFF Research Database (Denmark)

    Chrysochou, Polymeros; Chryssochoidis, George; Kehagia, Olga

    2009-01-01

    The implementation of traceability in the food supply chain has reinforced adoption of technologies with the ability to track forward and trace back product-related information. Based on the premise that these technologies can be used as a means to provide product-related information to consumers, this paper explores the perceived benefits and drawbacks of such technologies and advises the agri-food business on issues to consider prior to implementing such technologies in their production lines. For the purposes of the study, a focus group study was conducted across 12 European countries, while a set of four different technologies used as a means to provide traceability information to consumers was the focal point of the discussions in each focus group. Results show that the amount of and confidence in the information provided, perceived levels of convenience, impact on product quality and safety, impact on consumers' health and the environment, and potential consequences on ethical and privacy liberties constitute important factors influencing consumers' perceptions of such technologies.

  8. Background field method in gauge theories and on linear sigma models

    International Nuclear Information System (INIS)

    van de Ven, A.E.M.

    1986-01-01

    This dissertation constitutes a study of the ultraviolet behavior of gauge theories and two-dimensional nonlinear sigma-models by means of the background field method. After a general introduction in chapter 1, chapter 2 presents algorithms which generate the divergent terms in the effective action at one-loop for arbitrary quantum field theories in flat spacetime of dimension d ≤ 11. It is demonstrated that global N = 1 supersymmetric Yang-Mills theory in six dimensions is one-loop UV-finite. Chapter 3 presents an algorithm which produces the divergent terms in the effective action at two-loops for renormalizable quantum field theories in a curved four-dimensional background spacetime. Chapter 4 presents a study of the two-loop UV-behavior of two-dimensional bosonic and supersymmetric nonlinear sigma-models which include a Wess-Zumino-Witten term. It is found that, to this order, supersymmetric models on quasi-Ricci flat spaces are UV-finite and the β-functions for the bosonic model depend only on torsionful curvatures. Chapter 5 summarizes a superspace calculation of the four-loop β-function for two-dimensional N = 1 and N = 2 supersymmetric nonlinear sigma-models. It is found that besides the one-loop contribution, which vanishes on Ricci-flat spaces, the β-function receives four-loop contributions which do not vanish in the Ricci-flat case. Implications for superstrings are discussed. Chapters 6 and 7 treat the details of these calculations

  9. Danish extreme wind atlas: Background and methods for a WAsP engineering option

    Energy Technology Data Exchange (ETDEWEB)

    Rathmann, O; Kristensen, L; Mann, J [Risoe National Lab., Wind Energy and Atmospheric Physics Dept., Roskilde (Denmark); Hansen, S O [Svend Ole Hansen ApS, Copenhagen (Denmark)

    1999-03-01

    Extreme wind statistics are necessary design information when establishing wind farms and erecting bridges, buildings and other structures in the open air. Normal mean wind statistics, in terms of directional and speed distribution, may be estimated by wind atlas methods and are used to estimate e.g. the annual energy output of wind turbines. It is the purpose of the present work to extend the wind atlas method to also include the local extreme wind statistics, so that an extreme value such as the 50-year wind can be estimated at locations of interest. Together with turbulence estimates, such information is important regarding the necessary strength of wind turbines or structures to withstand high wind loads. In the `WAsP Engineering` computer program a flow model, which includes a model for the dynamic roughness of water surfaces, is used to realise such an extended wind atlas method. On the basis of an extended wind atlas, also containing extreme wind statistics, the program can estimate extreme winds in addition to mean winds and turbulence intensities at specified positions and heights. (au) EFP-97. 15 refs.

  10. Measuring background by the DIN-1M spectrometer using the oscillating absorbing screen method

    International Nuclear Information System (INIS)

    Glazkov, Yu.Yu.; Liforov, V.G.; Novikov, A.G.; Parfenov, V.A.; Semenov, V.A.

    1982-01-01

    A technique for measuring the background with a double-pulse slow neutron spectrometer is described. To measure the background, an oscillating absorbing screen (OAS) periodically overlapping the primary neutron beam at the input of a mechanical interrupter was used. During the overlap, the monochromatic neutrons that produce the effect are removed from the beam, while the general background conditions remain practically unchanged. The screen oscillation thus makes it possible to measure effect and background neutrons essentially simultaneously. The optimal period of oscillation amounts to approximately 3 min. Analysis of neutron spectra scattered by different materials and of the corresponding background curves measured by means of the OAS technique shows that the share of monochromatic neutrons passing through the screen constitutes less than 1% of the elastic peak and that the relative decrease of the total background level does not exceed 1.5-2%.

  11. Algebraic renormalization of Yang-Mills theory with background field method

    International Nuclear Information System (INIS)

    Grassi, P.A.

    1996-01-01

    In this paper the renormalizability of Yang-Mills theory in the background gauge fixing is studied. By means of Ward identities of background gauge invariance and Slavnov-Taylor identities, in a regularization-independent way, the stability of the model under radiative corrections is proved and its renormalizability is verified. In particular, it is shown that the splitting between background and quantum field is stable under radiative corrections and this splitting does not introduce any new anomalies. (orig.)

  12. Financial Information Source, Knowledge, and Practices of College Students from Diverse Backgrounds

    Science.gov (United States)

    Mimura, Yoko; Koonce, Joan; Plunkett, Scott W.; Pleskus, Lindsey

    2015-01-01

    Using cross-sectional data, we examined the financial information sources, financial knowledge, and financial practices of young adults, many of whom are first generation college students, ethnic minorities, and immigrants or children of immigrants. Participants (n = 1,249) were undergraduate students at a large regional comprehensive university.…

  13. Traceability information carriers. The technology backgrounds and consumers' perceptions of the technological solutions.

    Science.gov (United States)

    Chrysochou, Polymeros; Chryssochoidis, George; Kehagia, Olga

    2009-12-01

    The implementation of traceability in the food supply chain has reinforced adoption of technologies with the ability to track forward and trace back product-related information. Based on the premise that these technologies can be used as a means to provide product-related information to consumers, this paper explores the perceived benefits and drawbacks of such technologies. The aim is to identify factors that influence consumers' perceptions of such technologies, and furthermore to advise the agri-food business on issues that they should consider prior to the implementation of such technologies in their production lines. For the purposes of the study, a focus group study was conducted across 12 European countries, while a set of four different technologies used as a means to provide traceability information to consumers was the focal point of the discussions in each focus group. Results show that the amount of and confidence in the information provided, perceived levels of convenience, impact on product quality and safety, impact on consumers' health and the environment, and potential consequences on ethical and privacy liberties constitute important factors influencing consumers' perceptions of technologies that provide traceability.

  14. Developing written information for cancer survivors from culturally and linguistically diverse backgrounds: Lessons learnt

    Directory of Open Access Journals (Sweden)

    Georgina Wiley

    2018-01-01

    Full Text Available Australia is a multicultural nation with a large migrant population. Migrants with cancer report inferior quality of life and the need for more information in their own language. This paper describes lessons learnt from developing culturally appropriate written information resources with and for Arabic, Italian, and Vietnamese cancer survivors and carers. The information needs of survivors from these language groups as well as guidelines for the development of written resources for culturally diverse populations were identified through literature review. Community consultation was undertaken with focus groups. The content was developed and tested with health professionals who spoke the appropriate language and focus group participants, ensuring relevance and appropriateness. Resource design and dissemination were informed through community consultation. A number of key tasks for developing resources were identified as follows: (1) community engagement and consultation; (2) culturally sensitive data collection; (3) focus group facilitators (recruitment and training); (4) content development; (5) translation and review process; (6) design; and (7) sustainability. This project reinforced literature review findings on the importance of cultural sensitivity in the development of resources. Engaging with community groups and incorporating culturally appropriate recruitment strategies optimises recruitment to focus groups and facilitates content development. Stakeholders and lay persons from the intended ethnic-minority communities should be involved in the development and formative evaluation of resources to ensure appropriateness and relevance, and in the dissemination strategy to optimize penetration. We believe the lessons we have learnt will be relevant to any group intending to develop health information for culturally and linguistically diverse groups.

  15. Applied Ecosystem Analysis - Background EDT - The Ecosystem Diagnosis and Treatment Method

    International Nuclear Information System (INIS)

    Mobrand, L.E.; Lichatowich, J.A.; Howard, D.A.; Vogel, T.S.

    1996-05-01

    This volume consists of eight separate reports. We present them as background to the Ecosystem Diagnosis and Treatment (EDT) methodology. They are a selection from publications, white papers, and presentations prepared over the past two years. Some of the papers are previously published, others are currently being prepared for publication. In the early to mid-1980s the concern for failure of both natural and hatchery production of Columbia river salmon populations was widespread. The concept of supplementation was proposed as an alternative solution that would integrate artificial propagation with natural production. In response to the growing expectations placed upon the supplementation tool, a project called Regional Assessment of Supplementation Project (RASP) was initiated in 1990. The charge of RASP was to define supplementation and to develop guidelines for when, where and how it would be the appropriate solution to salmon enhancement in the Columbia basin. The RASP developed a definition of supplementation and a set of guidelines for planning salmon enhancement efforts which required consideration of all factors affecting salmon populations, including environmental, genetic, and ecological variables. The results of RASP led to a conclusion that salmon issues needed to be addressed in a manner that was consistent with an ecosystem approach. If the limitations and potentials of supplementation or any other management tool were to be fully understood it would have to be within the context of a broadly integrated approach - thus the Ecosystem Diagnosis and Treatment (EDT) method was born.

  16. [The German program for disease management guidelines. Background, methods, and development process].

    Science.gov (United States)

    Ollenschläger, Günter; Kopp, Ina; Lelgemann, Monika; Sänger, Sylvia; Heymans, Lothar; Thole, Henning; Trapp, Henrike; Lorenz, Wilfried; Selbmann, Hans-Konrad; Encke, Albrecht

    2006-10-15

    The Program for National Disease Management Guidelines (German DM-CPG Program) was established in 2002 by the German Medical Association (umbrella organization of the German Chambers of Physicians) and joined by the Association of the Scientific Medical Societies (AWMF; umbrella organization of more than 150 professional societies) and by the National Association of Statutory Health Insurance Physicians (NASHIP) in 2003. The program provides a conceptual basis for disease management, focusing on high-priority health-care topics and aiming at the implementation of best practice recommendations for prevention, acute care, rehabilitation and chronic care. It is organized by the German Agency for Quality in Medicine, a founding member of the Guidelines International Network (G-I-N). The main objective of the German DM-CPG Program is to establish consensus of the medical professions on evidence-based key recommendations covering all sectors of health-care provision and facilitating the coordination of care for the individual patient through time and across interfaces. Within the last year, DM-CPGs have been published for asthma, chronic obstructive pulmonary disease, type 2 diabetes, and coronary heart disease. In addition, experts from national patient self-help groups have been developing patient guidance based upon the recommendations for health-care providers. The article describes background, methods, and tools of the DM-CPG Program, and is the first of a publication series dealing with innovative recommendations and aspects of the program.

  17. Applied Ecosystem Analysis - Background : EDT the Ecosystem Diagnosis and Treatment Method.

    Energy Technology Data Exchange (ETDEWEB)

    Mobrand, Lars E.

    1996-05-01

    This volume consists of eight separate reports. We present them as background to the Ecosystem Diagnosis and Treatment (EDT) methodology. They are a selection from publications, white papers, and presentations prepared over the past two years. Some of the papers are previously published, others are currently being prepared for publication. In the early to mid-1980s the concern for failure of both natural and hatchery production of Columbia river salmon populations was widespread. The concept of supplementation was proposed as an alternative solution that would integrate artificial propagation with natural production. In response to the growing expectations placed upon the supplementation tool, a project called Regional Assessment of Supplementation Project (RASP) was initiated in 1990. The charge of RASP was to define supplementation and to develop guidelines for when, where and how it would be the appropriate solution to salmon enhancement in the Columbia basin. The RASP developed a definition of supplementation and a set of guidelines for planning salmon enhancement efforts which required consideration of all factors affecting salmon populations, including environmental, genetic, and ecological variables. The results of RASP led to a conclusion that salmon issues needed to be addressed in a manner that was consistent with an ecosystem approach. If the limitations and potentials of supplementation or any other management tool were to be fully understood it would have to be within the context of a broadly integrated approach - thus the Ecosystem Diagnosis and Treatment (EDT) method was born.

  18. Comparison of presbyopic additions determined by the fused cross-cylinder method using alternative target background colours.

    Science.gov (United States)

    Wee, Sung-Hyun; Yu, Dong-Sik; Moon, Byeong-Yeon; Cho, Hyun Gug

    2010-11-01

    To compare and contrast standard and alternative versions of refractor head (phoropter)-based charts used to determine reading addition. Forty-one presbyopic subjects aged between 42 and 60 years were tested. Tentative additions were determined using a red-green background letter chart, and 4 cross-grid charts (with white, red, green, or red-green backgrounds) which were used with the fused cross cylinder (FCC) method. The final addition for a 40 cm working distance was determined for each subject by subjectively adjusting the tentative additions. There were significant differences in the tentative additions obtained using the 5 methods (repeated measures ANOVA, p FCC method. There were no significant differences between the tentative and final additions for the green background in the FCC method (p > 0.05). The intervals of the 95% limits of agreement were under ±0.50 D, and the narrowest interval (±0.26 D) was for the red-green background. The 3 FCC methods with a white, green, or red-green background provided a tentative addition close to the final addition. Compared with the other methods, the FCC method with the red-green background had a narrow range of error. Further, since this method combines the functions of both the fused cross-cylinder test and the duochrome test, it can be a useful technique for determining presbyopic additions. © 2010 The Authors. Ophthalmic and Physiological Optics © 2010 The College of Optometrists.

  19. Practical Methods for Information Security Risk Management

    Directory of Open Access Journals (Sweden)

    Cristian AMANCEI

    2011-01-01

    Full Text Available The purpose of this paper is to present some directions for performing risk management for information security. The article covers practical methods: a questionnaire that assesses internal control, and an evaluation based on existing controls as part of vulnerability assessment. The methods presented contain all the key elements that concur in risk management, through the elements proposed for the evaluation questionnaire, the list of threats, resource classification and evaluation, the correlation between risks and controls, and residual risk computation.
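
    A tiny illustration of the kind of residual-risk computation the abstract mentions is given below; the 1-5 scales and the multiplicative model are generic assumptions for the example, not the paper's own formulas.

```python
# Generic residual-risk sketch: score a threat by likelihood and impact,
# then scale down by the effectiveness of the existing control.
def residual_risk(likelihood: int, impact: int, control_effectiveness: float) -> float:
    inherent = likelihood * impact                    # e.g. both on a 1-5 scale
    return inherent * (1.0 - control_effectiveness)   # effectiveness in [0, 1]

print(residual_risk(likelihood=4, impact=5, control_effectiveness=0.7))   # 6.0
```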

  20. Selective Exposure to and Acquisition of Information from Educational Television Programs as a Function of Appeal and Tempo of Background Music.

    Science.gov (United States)

    Wakshlag, Jacob J.; And Others

    1982-01-01

    The effect of educational television background music on selective exposure and information acquisition was studied. Background music of slow tempo, regardless of its appeal, had negligible effects on attention and information acquisition. Rhythmic, fast-tempo background music, especially when appealing, significantly reduced visual attention to…

  1. Background information for the development of a low-level waste performance assessment methodology

    International Nuclear Information System (INIS)

    Shipers, L.R.

    1989-12-01

    This document identifies and describes the potential postclosure pathways of radionuclide release, migration, and exposure from low-level radioactive waste disposal facilities. Each pathway identified is composed of a combination of migration pathways (air, surface water, ground water, food chain) and exposure pathways (direct gamma, inhalation, ingestion, surface contact). The pathway identification is based on a review and evaluation of existing information, and not all pathways presented in the document would necessarily be of importance at a given low-level waste disposal site. This document presents pathways associated with undisturbed (ground water, gas generation), naturally disturbed (erosion, bathtubbing, earth creep, frost heave, plant and animal intruder), and inadvertent intruder (construction, agriculture) scenarios of a low-level waste disposal facility. 20 refs., 1 fig

  2. Formation factor logging in-situ by electrical methods. Background and methodology

    International Nuclear Information System (INIS)

    Loefgren, Martin; Neretnieks, Ivars

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing the matrix diffusion are the formation factor, the surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter depending only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor could also be measured by electrical methods that are faster and less expensive. Previously this has only been done quantitatively in the laboratory on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity could be obtained by a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This could be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has enough vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the

  3. Formation factor logging in-situ by electrical methods. Background and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Loefgren, Martin; Neretnieks, Ivars [Royal Inst. of Tech., Stockholm (Sweden). Dept. of Chemical Engineering and Technology

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing the matrix diffusion are the formation factor, the surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter depending only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor could also be measured by electrical methods that are faster and less expensive. Previously this has only been done quantitatively in the laboratory on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity could be obtained by a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This could be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has enough vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the
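
    Since the abstract states that, under saline groundwater conditions, only the rock and pore water resistivities are needed, the estimate itself is a single ratio. The sketch below assumes the convention in which the diffusion-related formation factor is the ratio of pore-water resistivity to rock resistivity (a small dimensionless number); the numerical values are invented for illustration.

```python
# Assumed convention: formation factor = pore-water resistivity / rock resistivity.
def formation_factor(rho_pore_water_ohm_m: float, rho_rock_ohm_m: float) -> float:
    return rho_pore_water_ohm_m / rho_rock_ohm_m

# Illustrative values only (saline pore water ~0.3 ohm m, intact rock ~6000 ohm m):
print(formation_factor(0.3, 6000.0))   # 5e-05, a plausible order for intact crystalline rock
```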

  4. Produced water discharges to the Gulf of Mexico: Background information for ecological risk assessments

    International Nuclear Information System (INIS)

    Meinhold, A.F.; Holtzman, S.; DePhillips, M.P.

    1996-06-01

    This report reviews ecological risk assessment concepts and methods; describes important biological resources in the Gulf of Mexico of potential concern for produced water impacts; and summarizes data available to estimate exposure and effects of produced water discharges. The emphasis is on data relating to produced water discharges in the central and western Gulf of Mexico, especially in Louisiana. Much of the summarized data and cited literature are relevant to assessments of impacts in other regions. Data describing effects on marine and estuarine fishes, mollusks, crustaceans and benthic invertebrates are emphasized. This review is part of a series of studies of the health and ecological risks from discharges of produced water to the Gulf of Mexico. These assessments will provide input to regulators in the development of guidelines and permits, and to industry in the use of appropriate discharge practices

  5. Produced water discharges to the Gulf of Mexico: Background information for ecological risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Meinhold, A.F.; Holtzman, S.; DePhillips, M.P.

    1996-06-01

    This report reviews ecological risk assessment concepts and methods; describes important biological resources in the Gulf of Mexico of potential concern for produced water impacts; and summarizes data available to estimate exposure and effects of produced water discharges. The emphasis is on data relating to produced water discharges in the central and western Gulf of Mexico, especially in Louisiana. Much of the summarized data and cited literature are relevant to assessments of impacts in other regions. Data describing effects on marine and estuarine fishes, mollusks, crustaceans and benthic invertebrates are emphasized. This review is part of a series of studies of the health and ecological risks from discharges of produced water to the Gulf of Mexico. These assessments will provide input to regulators in the development of guidelines and permits, and to industry in the use of appropriate discharge practices.

  6. A novel method to remove GPR background noise based on the similarity of non-neighboring regions

    Science.gov (United States)

    Montiel-Zafra, V.; Canadas-Quesada, F. J.; Vera-Candeas, P.; Ruiz-Reyes, N.; Rey, J.; Martinez, J.

    2017-09-01

    Ground penetrating radar (GPR) is a non-destructive technique that has been widely used in many areas of research, such as landmine detection or subsurface anomalies, where it is required to locate targets embedded within a background medium. One of the major challenges in GPR data research remains the improvement of the image quality of stone materials by means of detection of true anisotropies, since most errors are caused by incorrect interpretation by the users. However, this is complicated by the interference of the horizontal background noise, e.g., the air-ground interface, which reduces the high-resolution quality of radargrams. Thus, weak or deep anisotropies are often masked by this type of noise. In order to remove the background noise obtained by GPR, this work proposes a novel background removal method assuming that the horizontal noise shows repetitive two-dimensional regions along the movement of the GPR antenna. Specifically, the proposed method, based on the non-local similarity of regions over distance, computes similarities between different regions at the same depth in order to identify the most repetitive regions, using a criterion to avoid closer regions. Evaluations are performed using a set of synthetic and real GPR data. Experimental results show that the proposed method obtains promising results compared to the classic background removal techniques and the most recently published background removal methods.
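
    A hedged sketch of the idea described above: for each trace of the radargram, the background is estimated from the most similar traces that are not close neighbours along the profile, and then subtracted. Window sizes, the distance criterion and the similarity measure below are illustrative choices, not the exact algorithm of the paper.

```python
# Non-local background removal sketch for a radargram
# (rows = depth samples, columns = traces along the antenna path).
import numpy as np

def remove_background(radargram: np.ndarray, min_gap: int = 20, k: int = 10) -> np.ndarray:
    n_traces = radargram.shape[1]
    clean = np.empty_like(radargram, dtype=float)
    for i in range(n_traces):
        # candidate traces far enough away along the profile (non-neighbours)
        candidates = [j for j in range(n_traces) if abs(j - i) >= min_gap]
        if not candidates:
            clean[:, i] = radargram[:, i]
            continue
        diffs = np.array([np.linalg.norm(radargram[:, i] - radargram[:, j])
                          for j in candidates])
        nearest = np.argsort(diffs)[:k]              # most similar non-neighbours
        background = radargram[:, [candidates[j] for j in nearest]].mean(axis=1)
        clean[:, i] = radargram[:, i] - background   # subtract the repetitive part
    return clean

# usage on synthetic data: a flat horizontal band plus one localised target
data = np.zeros((128, 100))
data[30:34, :] = 1.0                 # horizontal background reflection
data[60:64, 48:52] = 2.0             # localised target that should survive
cleaned = remove_background(data)
print(np.abs(cleaned[30:34]).max())         # ~0: horizontal band removed
print(np.abs(cleaned[60:64, 48:52]).max())  # ~2: target preserved
```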

  7. Increasing Power by Sharing Information from Genetic Background and Treatment in Clustering of Gene Expression Time Series

    OpenAIRE

    Sura Zaki Alrashid; Muhammad Arifur Rahman; Nabeel H Al-Aaraji; Neil D Lawrence; Paul R Heath

    2018-01-01

    Clustering of gene expression time series gives insight into which genes may be co-regulated, allowing us to discern the activity of pathways in a given microarray experiment. Of particular interest is how a given group of genes varies with different conditions or genetic background. This paper develops a new clustering method that allows each cluster to be parameterised according to whether the behaviour of the genes across conditions is correlated or anti-correlated. By specifying correlati...

  8. The multinational birth cohort of EuroPrevall: background, aims and methods

    NARCIS (Netherlands)

    Keil, T.; McBride, D.; Grimshaw, K.; Niggemann, B.; Xepapadaki, P.; Zannikos, K.; Sigurdardottir, S. T.; Clausen, M.; Reche, M.; Pascual, C.; Stanczyk, A. P.; Kowalski, M. L.; Dubakiene, R.; Drasutiene, G.; Roberts, G.; Schoemaker, A.-F. A.; Sprikkelman, A. B.; Fiocchi, A.; Martelli, A.; Dufour, S.; Hourihane, J.; Kulig, M.; Wjst, M.; Yazdanbakhsh, M.; Szépfalusi, Z.; van Ree, R.; Willich, S. N.; Wahn, U.; Mills, E. N. C.; Beyer, K.

    2010-01-01

    Background/aim: The true prevalence and risk factors of food allergies in children are not known because estimates were based predominantly on subjective assessments and skin or serum tests of allergic sensitization to food. The diagnostic gold standard, a double-blind placebo-controlled food

  9. Polonium-210 assay using a background-rejecting extractive liquid-scintillation method

    International Nuclear Information System (INIS)

    Case, C.N.; McDowell, W.J.

    1981-01-01

    This paper describes a procedure which combines solvent extraction with alpha liquid scintillation spectrometry. Pulse shape discrimination electronics are used to reject beta and gamma pulses and to lower the background count to acceptable levels. Concentration of 210Po and separation from interfering elements are accomplished using an H3PO4-HCl solution with TOPO combined with a scintillator in toluene.

  10. Albania; Background Information

    OpenAIRE

    International Monetary Fund

    1995-01-01

    This paper describes the evolution of the financial system in Albania. The paper highlights that a two-tier banking system was created following passage of a new Central Bank Law and Commercial Banking Law in April 1992. The State Bank of Albania became the Bank of Albania and retained only the functions of a central bank. Its commercial operations were hived off to become the National Bank of Albania in July 1992, which was subsequently merged with the Albanian Commercial Bank to form the Nati...

  11. Operator-independent method for background subtraction in adrenal-uptake measurements: concise communication

    International Nuclear Information System (INIS)

    Koral, K.F.; Sarkar, S.D.

    1977-01-01

    A new computer program for adrenal-uptake measurements is presented in which the algorithm identifies the adrenal and background regions automatically after being given a starting point in the image. Adrenal uptakes and results of reproducibility tests are given for patients injected with [131I]6β-iodomethyl-19-norcholesterol. The data to date indicate no overlap in the percent-of-dose uptakes for normal patients and patients with Cushing's disease and Cushing's syndrome.

  12. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  13. Investigation and development of the suppression methods of the {sup 42}K background in LArGe

    Energy Technology Data Exchange (ETDEWEB)

    Lubashevskiy, Alexey [Max-Planck-Institut fuer Kernphysik, Saupfercheckweg 1, D-69117 Heidelberg (Germany); Collaboration: GERDA-Collaboration

    2013-07-01

    GERDA is an ultra-low background experiment aimed at the search for neutrinoless double beta decay. The search is performed using HPGe detectors operated in liquid argon (LAr). One of the most dangerous backgrounds in GERDA is that from {sup 42}K, a daughter isotope of cosmogenically produced {sup 42}Ar. {sup 42}K ions are drawn toward the detector by the detector's electric field. Estimation of the background contribution and development of suppression methods were performed in the low background test facility LArGe. For this purpose, encapsulated HPGe and bare BEGe detectors were operated in 1 m{sup 3} of LAr in the LArGe setup. The setup is equipped with a scintillation veto, so particles which deposit part of their energy in the LAr can be detected by 9 PMTs. In order to better understand the background and to increase statistics, the LAr of LArGe was spiked with specially produced {sup 42}Ar. These investigations allowed us to estimate the background contribution from {sup 42}K and to demonstrate the possibility of suppressing it in future measurements in GERDA Phase II.
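
    To make the veto idea above concrete, here is a minimal sketch of an anti-coincidence cut: germanium events accompanied by LAr scintillation light seen by the PMTs are rejected. The photoelectron threshold and the data layout are illustrative assumptions, not parameters of the LArGe analysis.

```python
import numpy as np

def apply_lar_veto(ge_energy_keV, veto_photoelectrons, pe_threshold=5.0):
    """Anti-coincidence cut: keep only Ge events with no significant
    coincident LAr scintillation signal (threshold is a placeholder)."""
    energy = np.asarray(ge_energy_keV, dtype=float)
    light = np.asarray(veto_photoelectrons, dtype=float)
    keep = light < pe_threshold          # events with little or no veto light
    survival_fraction = keep.mean() if energy.size else float("nan")
    return energy[keep], survival_fraction
```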

  14. A method for measuring power signal background and source strength in a fission reactor

    International Nuclear Information System (INIS)

    Baers, B.; Kall, L.; Visuri, P.

    1977-01-01

    Theory and experimental verification of a novel method for measuring power signal bias and source strength in a fission reactor are reported. A minicomputer was applied in the measurements. The method is an extension of the inverse kinetics method presented by Mogilner et al. (Auth.)

  15. Spectral-ratio radon background correction method in airborne γ-ray spectrometry based on Compton scattering deduction

    International Nuclear Information System (INIS)

    Gu Yi; Xiong Shengqing; Zhou Jianxin; Fan Zhengguo; Ge Liangquan

    2014-01-01

    γ-rays released by radon daughters have a severe impact on airborne γ-ray spectrometry. The spectral-ratio method is one of the best mathematical methods for radon background deduction in airborne γ-ray spectrometry. In this paper, an advanced spectral-ratio method is proposed which removes the Compton scattering contribution by means of the fast Fourier transform rather than by stripping ratios. The relationship between survey height and the correction coefficient of the advanced spectral-ratio radon background correction method was studied, the corresponding mathematical model was established, and a ground saturation model calibration technique for the correction coefficient was proposed. Compared with the traditional approach, the advanced spectral-ratio method has improved applicability and correction efficiency and a lower application cost. Furthermore, it avoids the loss of physical meaning and the possible errors caused by matrix computation and by mathematical fitting based on spectrum shape, which are used in the traditional correction coefficient. (authors)

  16. Systems and methods for enhancing optical information

    Science.gov (United States)

    DeVore, Peter Thomas Setsuda; Chou, Jason T.

    2018-01-02

    An Optical Information Transfer Enhancer System includes a first system for producing an information bearing first optical wave that is impressed with a first information having a first information strength, wherein the first optical wave has a first shape. A second system produces a second optical wave. An information strength enhancer module receives the first and the second optical waves and impresses the first optical wave upon the second optical wave via cross-phase modulation (XPM) to produce an information-strength-enhanced second optical wave having a second information strength that is greater than the first information strength of the first optical wave. Following a center-wavelength changer with an Optical Information Transfer Enhancer System improves its performance.

  17. SWCD: a sliding window and self-regulated learning-based background updating method for change detection in videos

    Science.gov (United States)

    Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih

    2018-03-01

    Change detection with a background subtraction process remains an unresolved issue and attracts research interest due to challenges encountered in static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. In order to achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of the update parameters is introduced for updating background frames, which we call sliding window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. It is argued that the obtained method is a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail in scenarios they were not designed for.
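
    A minimal sketch of the sliding-window idea follows, assuming the background is modelled as the per-pixel median of the most recent frames; the self-regulated, adaptive control of the window length and threshold described in the record above is not reproduced, and the parameter values are illustrative only.

```python
import numpy as np
from collections import deque

def sliding_window_change_detection(frames, window=25, thresh=30.0):
    """Yield a binary change mask per frame: a pixel is flagged when it
    differs from the median of the last `window` frames by more than
    `thresh` (sketch only; not the published SWCD parameter control)."""
    history = deque(maxlen=window)
    for frame in frames:
        f = np.asarray(frame, dtype=np.float32)
        if history:
            background = np.median(np.stack(history), axis=0)
            mask = (np.abs(f - background) > thresh).astype(np.uint8)
        else:
            mask = np.zeros(f.shape, dtype=np.uint8)  # no model yet
        history.append(f)
        yield mask
```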

  18. Statistical methods for determination of background levels for naturally occurring radionuclides in soil at a RCRA facility

    International Nuclear Information System (INIS)

    Guha, S.; Taylor, J.H.

    1996-01-01

    It is critical that summary statistics on background data, or background levels, be computed based on standardized and defensible statistical methods, because background levels are frequently used in subsequent analyses and comparisons performed by separate analysts over time. The final background for naturally occurring radionuclide concentrations in soil at a RCRA facility, and the associated statistical methods used to estimate these concentrations, are presented. The primary objective is to describe, via a case study, the statistical methods used to estimate 95% upper tolerance limits (UTLs) on radionuclide background soil data sets. A 95% UTL on background samples can be used as a screening level concentration in the absence of definitive soil cleanup criteria for naturally occurring radionuclides. The statistical methods are based exclusively on EPA guidance. This paper includes an introduction, a discussion of the analytical results for the radionuclides, and a detailed description of the statistical analyses leading to the determination of the 95% UTLs. Soil concentrations reported are based on validated data. Data sets are categorized as surficial soil (samples collected at depths from zero to one-half foot) and deep soil (samples collected from 3 to 5 feet). These data sets were tested for statistical outliers, and underlying distributions were determined using the chi-squared goodness-of-fit test. UTLs for the data sets were then computed based on the percentage of non-detects and the appropriate best-fit distribution (lognormal, normal, or non-parametric). For data sets containing greater than approximately 50% non-detects, non-parametric UTLs were computed.
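
    For orientation, the sketch below computes a 95% UTL with 95% coverage for a normal or lognormal fit using the standard one-sided tolerance factor, with the sample maximum as a crude nonparametric stand-in. It omits the outlier screening, goodness-of-fit selection and non-detect handling used in the case study, and the example data are synthetic.

```python
import numpy as np
from scipy import stats

def utl_95_95(concentrations, distribution="lognormal"):
    """95% upper tolerance limit with 95% coverage (illustrative sketch)."""
    x = np.asarray(concentrations, dtype=float)
    n = x.size
    if distribution == "nonparametric":
        return float(np.max(x))  # crude nonparametric stand-in
    y = np.log(x) if distribution == "lognormal" else x
    # one-sided normal tolerance factor from the noncentral t distribution
    delta = stats.norm.ppf(0.95) * np.sqrt(n)
    k = stats.nct.ppf(0.95, n - 1, delta) / np.sqrt(n)
    utl = y.mean() + k * y.std(ddof=1)
    return float(np.exp(utl)) if distribution == "lognormal" else float(utl)

# example with synthetic surficial-soil style data (arbitrary units)
sample = np.random.default_rng(0).lognormal(mean=0.1, sigma=0.4, size=30)
print(round(utl_95_95(sample, "lognormal"), 2))
```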

  19. Quenching methods for background reduction in luminescence-based probe-target binding assays

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Hong [Los Alamos, NM]; Goodwin, Peter M [Los Alamos, NM]; Keller, Richard A [Los Alamos, NM]; Nolan, Rhiannon L [Santa Fe, NM]

    2007-04-10

    Background luminescence is reduced from a solution containing unbound luminescent probes, each having a first molecule that attaches to a target molecule and having an attached luminescent moiety, and luminescent probe/target adducts. Quenching capture reagent molecules are formed that are capable of forming an adduct with the unbound luminescent probes and having an attached quencher material effective to quench luminescence of the luminescent moiety. The quencher material of the capture reagent molecules is added to a solution of the luminescent probe/target adducts and binds in proximity to the luminescent moiety of the unbound luminescent probes to quench luminescence from the luminescent moiety when the luminescent moiety is exposed to exciting illumination. The quencher capture reagent does not bind to probe molecules that are bound to target molecules, and the probe/target adduct emission is not quenched.

  20. Internet security information system implement method

    International Nuclear Information System (INIS)

    Liu Baoxu; Mei Jie; Xu Rongsheng; An Dehai; Yu Mingjian; Chen Xiangyang; Zheng Peng

    1999-01-01

    On the basis of an analysis of the key elements that affect the Internet Security Information System, the author takes the UNIX operating system as an example and describes the important stages that must be considered when implementing the Internet Security Information System. An implementation model of the Internet Security Information System is given.

  1. Teaching molecular genetics: Chapter 1--Background principles and methods of molecular biology.

    NARCIS (Netherlands)

    Knoers, N.V.A.M.; Monnens, L.A.H.

    2006-01-01

    In this first chapter of the series "Teaching molecular genetics," an introduction to molecular genetics is presented. We describe the structure of DNA and genes and explain in detail the central dogma of molecular biology, that is, the flow of genetic information from DNA via RNA to polypeptide

  2. Analysis of an automated background correction method for cardiovascular MR phase contrast imaging in children and young adults

    Energy Technology Data Exchange (ETDEWEB)

    Rigsby, Cynthia K.; Hilpipre, Nicholas; Boylan, Emma E.; Popescu, Andrada R.; Deng, Jie [Ann and Robert H. Lurie Children's Hospital of Chicago, Department of Medical Imaging, Chicago, IL (United States); McNeal, Gary R. [Siemens Medical Solutions USA Inc., Customer Solutions Group, Cardiovascular MR R and D, Chicago, IL (United States); Zhang, Gang [Ann and Robert H. Lurie Children's Hospital of Chicago Research Center, Biostatistics Research Core, Chicago, IL (United States); Choi, Grace [Ann and Robert H. Lurie Children's Hospital of Chicago, Department of Pediatrics, Chicago, IL (United States); Greiser, Andreas [Siemens AG Healthcare Sector, Erlangen (Germany)

    2014-03-15

    Phase contrast magnetic resonance imaging (MRI) is a powerful tool for evaluating vessel blood flow. Inherent errors in acquisition, such as phase offset, eddy currents and gradient field effects, can cause significant inaccuracies in flow parameters. These errors can be rectified with the use of background correction software. To evaluate the performance of an automated phase contrast MRI background phase correction method in children and young adults undergoing cardiac MR imaging. We conducted a retrospective review of patients undergoing routine clinical cardiac MRI including phase contrast MRI for flow quantification in the aorta (Ao) and main pulmonary artery (MPA). When phase contrast MRI of the right and left pulmonary arteries was also performed, these data were included. We excluded patients with known shunts and metallic implants causing visible MRI artifact and those with more than mild to moderate aortic or pulmonary stenosis. Phase contrast MRI of the Ao, mid MPA, proximal right pulmonary artery (RPA) and left pulmonary artery (LPA) using 2-D gradient echo Fast Low Angle SHot (FLASH) imaging was acquired during normal respiration with retrospective cardiac gating. Standard phase image reconstruction and the automatic spatially dependent background-phase-corrected reconstruction were performed on each phase contrast MRI dataset. Non-background-corrected and background-phase-corrected net flow, forward flow, regurgitant volume, regurgitant fraction, and vessel cardiac output were recorded for each vessel. We compared standard non-background-corrected and background-phase-corrected mean flow values for the Ao and MPA. The ratio of pulmonary to systemic blood flow (Qp:Qs) was calculated for the standard non-background and background-phase-corrected data and these values were compared to each other and for proximity to 1. In a subset of patients who also underwent phase contrast MRI of the MPA, RPA, and LPA a comparison was made between standard non-background

  3. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    Science.gov (United States)

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

    To strengthen scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to ensure the standardization of quality assurance and quality control methods for background CO2 sampling and analysis. Based on the greenhouse gas sampling and observation experience of the CMA, and using portable sampling observation and the WS-CRDS analysis technique as an example, this paper systematically introduces the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass bottle quality assurance measures and the systematic quality control method during sample analysis, the correction method applied during data processing, as well as the data grading quality markers and the data fitting and interpolation method. Finally, using this method, the CO2 sampling and observation data at the atmospheric background stations in 3 typical regions were processed and the concentration variation characteristics were analyzed, indicating that the method can capture the influences of regional and local environmental factors on the observation results and reflect the characteristics of natural and human activities in an objective and accurate way.

  4. Teaching molecular genetics: Chapter 1--Background principles and methods of molecular biology.

    Science.gov (United States)

    Knoers, Nine V A M; Monnens, Leo A H

    2006-02-01

    In this first chapter of the series "Teaching molecular genetics," an introduction to molecular genetics is presented. We describe the structure of DNA and genes and explain in detail the central dogma of molecular biology, that is, the flow of genetic information from DNA via RNA to polypeptide (protein). In addition, several basic and frequently used general molecular tools, such as restriction enzymes, Southern blotting, DNA amplification and sequencing are discussed, in order to lay the foundations for the forthcoming chapters.

  5. New method of 85Kr reduction in a noble gas based low-background detector

    Science.gov (United States)

    Akimov, D. Yu.; Bolozdynya, A. I.; Burenkov, A. A.; Hall, C.; Kovalenko, A. G.; Kuzminov, V. V.; Simakov, G. E.

    2017-04-01

    Krypton-85 is an anthropogenic beta-decaying isotope which produces low energy backgrounds in dark matter and neutrino experiments, especially those based upon liquid xenon. Several technologies have been developed to reduce the Kr concentration in such experiments. We propose to augment those separation technologies by first adding to the xenon an 85Kr-free sample of krypton in an amount much larger than the natural krypton that is already present. After the purification system reduces the total Kr concentration to the same level, the final 85Kr concentration will be reduced even further by the dilution factor. A test cell for measurement of the activity of various Kr samples has been assembled, and the activity of 25-year-old krypton has been measured. The measured activity agrees well with the expected activity accounting for the 85Kr abundance of the earth's atmosphere in 1990 and the half-life of the isotope. Additional tests with a Kr sample produced in the year 1944 (before the atomic era) have been done in order to demonstrate the sensitivity of the test cell.
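
    The dilution arithmetic behind this proposal is simple; the snippet below works through one hypothetical set of numbers (the concentrations are placeholders, not values from the paper).

```python
# Hypothetical numbers purely to illustrate the dilution argument.
nat_kr = 10.0      # natural Kr initially present in the xenon (arbitrary units)
clean_kr = 1000.0  # 85Kr-free Kr deliberately added before purification

dilution = (nat_kr + clean_kr) / nat_kr  # the 85Kr/Kr ratio drops by this factor
# After the purification system brings total Kr back down to the original level,
# the remaining 85Kr activity is lower by the same factor:
print(f"dilution factor = {dilution:.0f}")                             # 101
print(f"residual 85Kr activity = {100 / dilution:.1f}% of original")   # ~1.0%
```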

  6. New method of 85Kr reduction in a noble gas based low-background detector

    International Nuclear Information System (INIS)

    Akimov, D.Yu.; Burenkov, A.A.; Kovalenko, A.G.; Simakov, G.E.; Bolozdynya, A.I.; Hall, C.; Kuzminov, V.V.

    2017-01-01

    Krypton-85 is an anthropogenic beta-decaying isotope which produces low energy backgrounds in dark matter and neutrino experiments, especially those based upon liquid xenon. Several technologies have been developed to reduce the Kr concentration in such experiments. We propose to augment those separation technologies by first adding to the xenon an 85 Kr-free sample of krypton in an amount much larger than the natural krypton that is already present. After the purification system reduces the total Kr concentration to the same level, the final 85 Kr concentration will be reduced even further by the dilution factor. A test cell for measurement of the activity of various Kr samples has been assembled, and the activity of 25-year-old krypton has been measured. The measured activity agrees well with the expected activity accounting for the 85 Kr abundance of the earth's atmosphere in 1990 and the half-life of the isotope. Additional tests with a Kr sample produced in the year 1944 (before the atomic era) have been done in order to demonstrate the sensitivity of the test cell.

  7. Research method of nuclear patent information

    International Nuclear Information System (INIS)

    Mo Dan; Gao An'na; Sun Chenglin; Wang Lei; You Xinfeng

    2010-01-01

    When faced with a huge amount of nuclear patent information, the keys to effective research include: (1) choosing a convenient way to search, for quick access to patents related to nuclear technology; (2) overcoming the language barrier and analyzing the technical content of the patent information; (3) organizing the retrieved patent documents by publication date and analyzing the status and trends of nuclear technology development; (4) researching the patented technology of the main applicants; (5) always paying attention to the legal status of the patent information, making free use of invalid patents while avoiding patent infringement. In summary, patent information is an important source for obtaining the latest technical information, and patent information research is a comprehensive way to understand and master advanced nuclear technology. (authors)

  8. Orientations in adolescent use of information and communication technology: a digital divide by sociodemographic background, educational career, and health.

    Science.gov (United States)

    Koivusilta, Leena K; Lintonen, Tomi P; Rimpelä, Arja H

    2007-01-01

    The role of information and communication technology (ICT) in adolescents' lives was studied, with emphasis on whether there exists a digital divide based on sociodemographic background, educational career, and health. The assumption was that some groups of adolescents use ICT more so that their information utilization skills improve (computer use), while others use it primarily for entertainment (digital gaming, contacting friends by mobile phone). Data were collected by mailed survey from a nationally representative sample of 12- to 18-year-olds (n=7,292; response 70%) in 2001 and analysed using ANOVA. Computer use was most frequent among adolescents whose fathers had higher education or socioeconomic status, who came from nuclear families, and who continued studies after compulsory education. Digital gaming was associated with poor school achievement and attending vocational rather than upper secondary school. Mobile phone use was frequent among adolescents whose fathers had lower education or socioeconomic status, who came from non-nuclear families, and whose educational prospects were poor. Intensive use of each ICT form, especially of mobile phones, was associated with health problems. High social position, nuclear family, and a successful educational career signified good health in general, independently of the diverse usage of ICT. There exists a digital divide among adolescents: orientation to computer use is more common in educated well-off families while digital gaming and mobile phone use accumulate at the opposite end of the spectrum. Poorest health was reported by mobile phone users. High social background and success at school signify better health, independently of the ways of using ICT.

  9. The Army Method Revisited: The Historical and Theoretical Backgrounds of the Military Intensive Language Programs.

    Science.gov (United States)

    Bayuk, Milla; Bayuk, Barry S.

    A program currently in use by the military that gives instruction in the so-called "sensitive" languages is based on the "Army Method" which was initiated in military language programs during World War II. Attention to the sensitive language program initiated a review of the programs, especially those conducted by the military intelligence schools…

  10. The background cross section method for calculating the epithermal neutron spectra

    International Nuclear Information System (INIS)

    Martinez, A.S.

    1983-01-01

    We have developed a new methodology for the calculation of multigroup constants for thermal and fast reactors. The method for obtaining the constants is extremely fast and simple, and it avoids repeated computation of the detailed neutron spectrum for different cell configurations (composition, geometry and temperature). (author) [pt

  11. Effects of projection and background correction method upon calculation of right ventricular ejection fraction using first-pass radionuclide angiography

    International Nuclear Information System (INIS)

    Caplin, J.L.; Flatman, W.D.; Dymond, D.S.

    1985-01-01

    There is no consensus as to the best projection or correction method for first-pass radionuclide studies of the right ventricle. We assessed the effects of two commonly used projections, 30 degrees right anterior oblique and anterior-posterior, on the calculation of right ventricular ejection fraction. In addition, two background correction methods were assessed: planar background correction to account for scatter, and right atrial correction to account for right atrio-ventricular overlap. Two first-pass radionuclide angiograms were performed in 19 subjects, one in each projection, using gold-195m (half-life 30.5 seconds), and each study was analysed using the two methods of correction. Right ventricular ejection fraction was highest using the right anterior oblique projection with right atrial correction, 35.6 +/- 12.5% (mean +/- SD), and lowest using the anterior-posterior projection with planar background correction, 26.2 +/- 11% (p less than 0.001). The study design allowed assessment of the effects of correction method and projection independently. Correction method appeared to have relatively little effect on right ventricular ejection fraction: using right atrial correction the correlation coefficient (r) between projections was 0.92, and for planar background correction r = 0.76, both p less than 0.001. However, right ventricular ejection fraction was far more dependent upon projection. When the anterior-posterior projection was used, calculated right ventricular ejection fraction was much more dependent on correction method (r = 0.65, p = not significant) than with the right anterior oblique projection (r = 0.85, p less than 0.001).

  12. Research on the algorithm of infrared target detection based on the frame difference and background subtraction method

    Science.gov (United States)

    Liu, Yun; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Hui, Mei; Liu, Xiaohua; Wu, Yijian

    2015-09-01

    As an important branch of infrared imaging technology, infrared target tracking and detection has great scientific value and a wide range of applications in both military and civilian areas. For infrared images, which are characterized by low SNR and serious disturbance from background noise, an effective target detection algorithm based on OpenCV is proposed in this paper, exploiting the frame-to-frame correlation of the moving target and the lack of correlation of the noise in sequential images. Firstly, since temporal differencing and background subtraction are highly complementary, we use a combined detection method of frame difference and background subtraction based on adaptive background updating. Results indicate that it is simple and can stably extract the foreground moving target from the video sequence. Because the background updating mechanism continuously updates each pixel, the infrared moving target can be detected more accurately. This paves the way for real-time infrared target detection and tracking once the OpenCV algorithms are ported to a DSP platform. Afterwards, we use an optimal thresholding algorithm to segment the image, transforming the gray images into binary images to provide a better basis for detection in the image sequences. Finally, using the relevance of moving objects between different frames and mathematical morphology processing, we can eliminate noise, reduce region area, and smooth region boundaries. Experimental results prove that our algorithm achieves rapid detection of small infrared targets.
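
    As a rough illustration of how frame differencing, background subtraction with adaptive updating, optimal (Otsu) thresholding and morphological clean-up can be chained with OpenCV, consider the sketch below. It is not the authors' implementation; the update rate, threshold choice and kernel size are assumptions.

```python
import cv2
import numpy as np

def detect_moving_targets(video_path, alpha=0.05):
    """Yield a binary foreground mask per frame by combining frame
    differencing and background subtraction with an adaptively updated
    background (illustrative parameter values)."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    background = prev.copy()
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        frame_diff = cv2.absdiff(gray, prev)        # temporal differencing
        back_diff = cv2.absdiff(gray, background)   # background subtraction
        combined = cv2.max(frame_diff, back_diff).astype(np.uint8)
        # Otsu's optimal threshold turns the gray difference into a binary mask
        _, mask = cv2.threshold(combined, 0, 255,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        # morphological opening removes isolated noise pixels
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        # adaptive update: only pixels currently labelled as background
        cv2.accumulateWeighted(gray, background, alpha,
                               mask=cv2.bitwise_not(mask))
        prev = gray
        yield mask
    cap.release()
```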

  13. Endpoints and cutpoints in head and neck oncology trials: methodical background, challenges, current practice and perspectives.

    Science.gov (United States)

    Hezel, Marcus; von Usslar, Kathrin; Kurzweg, Thiemo; Lörincz, Balazs B; Knecht, Rainald

    2016-04-01

    This article reviews the methodical and statistical basics of designing a trial, with a special focus on the process of defining and choosing endpoints and cutpoints as the foundations of clinical research, and ultimately that of evidence-based medicine. There has been a significant progress in the treatment of head and neck cancer in the past few decades. Currently available treatment options can have a variety of different goals, depending e.g. on tumor stage, among other factors. The outcome of a specific treatment in clinical trials is measured using endpoints. Besides classical endpoints, such as overall survival or organ preservation, other endpoints like quality of life are becoming increasingly important in designing and conducting a trial. The present work is based on electronic research and focuses on the solid methodical and statistical basics of a clinical trial, on the structure of study designs and on the presentation of various endpoints.

  14. Information systems research methods, epistemology, and applications

    National Research Council Canada - National Science Library

    Cater-Steel, Aileen; Al-Hakim, Latif

    2009-01-01

    ..., University of Dublin, Trinity College, Ireland. Chapter IV: A Critical Theory Approach to Information Technology Transfer to the Developing World and a Critique of Maintained Assumptions in the Lite...

  15. Pleural manometry-historical background, rationale for use and methods of measurement.

    Science.gov (United States)

    Zielinska-Krawczyk, Monika; Krenke, Rafal; Grabczak, Elzbieta M; Light, Richard W

    2018-03-01

    Subatmospheric pleural pressure (Ppl), which is approximately -3 to -5 cmH2O at functional residual capacity (FRC), makes the pleura a unique organ in the human body. The negative Ppl is critical for maintaining the lungs in a properly inflated state and for proper blood circulation within the thorax. Significant and sudden pleural pressure changes associated with major pleural pathologies, as well as with therapeutic interventions, may be associated with life-threatening complications. The pleural pressure may show two different values depending on the measurement method applied; these are called pleural liquid pressure and pleural surface pressure. It should also be realized that there are significant differences in pleural pressure distribution in pneumothorax and pleural effusion. In pneumothorax, the pressure is the same throughout the pleural space, while in pleural effusion there is a vertical gradient of approximately 1 cmH2O/cm in the pleural pressure, associated with the hydrostatic pressure of the fluid column. Currently, two main methods of pleural pressure measurement are used: simple water manometers and electronic systems. The water manometers are conceptually simple, cheap and user-friendly, but they only allow the estimation of mean values of pleural pressure. The electronic systems for pleural pressure measurement are based on pressure transducers. Their major advantages include precise measurement of instantaneous pleural pressure and the ability to display and store a large amount of data. The paper presents principles and details of pleural pressure measurement as well as the rationale for its use. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. European network for promoting the physical health of residents in psychiatric and social care facilities (HELPS): background, aims and methods

    Directory of Open Access Journals (Sweden)

    Marginean Roxana

    2009-08-01

    Background: People with mental disorders have a higher prevalence of physical illnesses and reduced life expectancy as compared with the general population. However, there is a lack of knowledge across Europe concerning interventions that aim at reducing somatic morbidity and excess mortality by promoting behaviour-based and/or environment-based interventions. Methods and design: HELPS is an interdisciplinary European network that aims at (i) gathering relevant knowledge on physical illness in people with mental illness, (ii) identifying health promotion initiatives in European countries that meet country-specific needs, and (iii) identifying best practice across Europe. Criteria for best practice will include evidence on the efficacy of physical health interventions and of their effectiveness in routine care, cost implications and feasibility for adaptation and implementation of interventions across different settings in Europe. HELPS will develop and implement a "physical health promotion toolkit". The toolkit will provide information to empower residents and staff to identify the most relevant risk factors in their specific context and to select the most appropriate action out of a range of defined health promoting interventions. The key methods are (a) stakeholder analysis, (b) international literature reviews, (c) Delphi rounds with experts from participating centres, and (d) focus groups with staff and residents of mental health care facilities. Meanwhile, a multi-disciplinary network consisting of 15 European countries has been established and has taken up the work. As one main result of the project they expect that a widespread use of the HELPS toolkit could have a significant positive effect on the physical health status of residents of mental health and social care facilities, as well as to hold resonance for community dwelling people with mental health problems. Discussion: A general strategy on health promotion for people with mental

  17. European network for promoting the physical health of residents in psychiatric and social care facilities (HELPS): background, aims and methods

    Science.gov (United States)

    Weiser, Prisca; Becker, Thomas; Losert, Carolin; Alptekin, Köksal; Berti, Loretta; Burti, Lorenzo; Burton, Alexandra; Dernovsek, Mojca; Dragomirecka, Eva; Freidl, Marion; Friedrich, Fabian; Genova, Aneta; Germanavicius, Arunas; Halis, Ulaş; Henderson, John; Hjorth, Peter; Lai, Taavi; Larsen, Jens Ivar; Lech, Katarzyna; Lucas, Ramona; Marginean, Roxana; McDaid, David; Mladenova, Maya; Munk-Jørgensen, Povl; Paziuc, Alexandru; Paziuc, Petronela; Priebe, Stefan; Prot-Klinger, Katarzyna; Wancata, Johannes; Kilian, Reinhold

    2009-01-01

    Background People with mental disorders have a higher prevalence of physical illnesses and reduced life expectancy as compared with the general population. However, there is a lack of knowledge across Europe concerning interventions that aim at reducing somatic morbidity and excess mortality by promoting behaviour-based and/or environment-based interventions. Methods and design HELPS is an interdisciplinary European network that aims at (i) gathering relevant knowledge on physical illness in people with mental illness, (ii) identifying health promotion initiatives in European countries that meet country-specific needs, and (iii) at identifying best practice across Europe. Criteria for best practice will include evidence on the efficacy of physical health interventions and of their effectiveness in routine care, cost implications and feasibility for adaptation and implementation of interventions across different settings in Europe. HELPS will develop and implement a "physical health promotion toolkit". The toolkit will provide information to empower residents and staff to identify the most relevant risk factors in their specific context and to select the most appropriate action out of a range of defined health promoting interventions. The key methods are (a) stakeholder analysis, (b) international literature reviews, (c) Delphi rounds with experts from participating centres, and (d) focus groups with staff and residents of mental health care facilities. Meanwhile a multi-disciplinary network consisting of 15 European countries has been established and took up the work. As one main result of the project they expect that a widespread use of the HELPS toolkit could have a significant positive effect on the physical health status of residents of mental health and social care facilities, as well as to hold resonance for community dwelling people with mental health problems. Discussion A general strategy on health promotion for people with mental disorders must take into

  18. Governance Methods Used in Externalizing Information Technology

    Science.gov (United States)

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  19. Human factors estimation methods using physiological informations

    International Nuclear Information System (INIS)

    Takano, Ken-ichi; Yoshino, Kenji; Nakasa, Hiroyasu

    1984-01-01

    To enhance operational safety in nuclear power plants, it is necessary to reduce abnormal events due to human errors. In particular, it is essential to understand human behaviour under the work environment of plant maintenance workers, inspectors, and operators. From this standpoint, this paper presents the results of a literature survey on the present status of human factors engineering technology applicable to nuclear power plants, and also discusses the following items: (1) Application fields where ergonomic evaluation is needed for worker safety. (2) Basic methodology for investigating human performance. (3) Features of physiological information analysis among the various types of ergonomic techniques. (4) Necessary conditions for the application of in-situ physiological measurement in the nuclear power plant. (5) Availability of physiological information analysis. (6) Effectiveness of the human factors engineering methodology, especially physiological information analysis, in the case of application to the nuclear power plant. The above discussion demonstrates the high applicability of physiological information analysis to nuclear power plants, in order to improve work performance. (author)

  20. Agricultural practice and water quality in the Netherlands in the 1992-2002 period. Background information for the third EU Nitrate Directive Member States report

    NARCIS (Netherlands)

    Fraters B; Hotsma PH; Langenberg VT; Leeuwen TC van; Mol APA; Olsthoorn CSM; Schotten CGJ; Willems WJ; EC-LNV; RIKZ; LEI; RIZA; CBS; LDL

    2004-01-01

    This overview provides the background information for the Netherlands Member State report, 'Nitrate Directive, status and trends of aquatic environment and agricultural practice' to be submitted to the European Commission mid-2004. It documents current agricultural practice, and groundwater and

  1. Principles and methods of quantum information technologies

    CERN Document Server

    Semba, Kouichi

    2016-01-01

    This book presents the research and development-related results of the “FIRST” Quantum Information Processing Project, which was conducted from 2010 to 2014 with the support of the Council for Science, Technology and Innovation of the Cabinet Office of the Government of Japan. The project supported 33 research groups and explored five areas: quantum communication, quantum metrology and sensing, coherent computing, quantum simulation, and quantum computing. The book is divided into seven main sections. Parts I through V, which consist of twenty chapters, focus on the system and architectural aspects of quantum information technologies, while Parts VI and VII, which consist of eight chapters, discuss the superconducting quantum circuit, semiconductor spin and molecular spin technologies.   Readers will be introduced to new quantum computing schemes such as quantum annealing machines and coherent Ising machines, which have now arisen as alternatives to standard quantum computers and are designed to successf...

  2. Effects of ivermectin application on the diversity and function of dung and soil fauna: Regulatory and scientific background information

    DEFF Research Database (Denmark)

    Adler, Nicole; Blanckenhorn, Wolf U; Bachmann, Jean

    2016-01-01

    for veterinary medicinal products in the European Union includes a requirement for higher-tier tests when adverse effects on dung organisms are observed in single-species toxicity tests. However, no guidance documents for the performance of higher-tier tests are available. Hence, an international research...... on communities of dung-breeding insects and soil fauna under field conditions, the test method meets the requirements of a higher-tier test as mandated by the European Union. The present study provides contextual information on authorization requirements for veterinary medicinal products and on the structure...... project was undertaken to develop and validate a proposed test method under varying field conditions of climate, soil, and endemic coprophilous fauna at Lethbridge (Canada), Montpellier (France), Zurich (Switzerland), and Wageningen (The Netherlands). The specific objectives were to determine if fecal...

  3. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  4. GafChromic EBT film dosimetry with flatbed CCD scanner: a novel background correction method and full dose uncertainty analysis.

    Science.gov (United States)

    Saur, Sigrun; Frengen, Jomar

    2008-07-01

    Film dosimetry using radiochromic EBT film in combination with a flatbed charge coupled device scanner is a useful method both for two-dimensional verification of intensity-modulated radiation treatment plans and for general quality assurance of treatment planning systems and linear accelerators. Unfortunately, the response over the scanner area is nonuniform, and when not corrected for, this results in a systematic error in the measured dose which is both dose and position dependent. In this study a novel method for background correction is presented. The method is based on the subtraction of a correction matrix, a matrix that is based on scans of films that are irradiated to nine dose levels in the range 0.08-2.93 Gy. Because the response of the film is dependent on the film's orientation with respect to the scanner, correction matrices for both landscape oriented and portrait oriented scans were made. In addition to the background correction method, a full dose uncertainty analysis of the film dosimetry procedure was performed. This analysis takes into account the fit uncertainty of the calibration curve, the variation in response for different film sheets, the nonuniformity after background correction, and the noise in the scanned films. The film analysis was performed for film pieces of size 16 x 16 cm, all with the same lot number, and all irradiations were done perpendicular onto the films. The results show that the 2-sigma dose uncertainty at 2 Gy is about 5% and 3.5% for landscape and portrait scans, respectively. The uncertainty gradually increases as the dose decreases, but at 1 Gy the 2-sigma dose uncertainty is still as good as 6% and 4% for landscape and portrait scans, respectively. The study shows that film dosimetry using GafChromic EBT film, an Epson Expression 1680 Professional scanner and a dedicated background correction technique gives precise and accurate results. For the purpose of dosimetric verification, the calculated dose distribution
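
    A minimal sketch of the correction idea follows, assuming the correction matrix is built as the average deviation from the film mean over several uniformly irradiated scans in a fixed orientation and subtracted from measurement scans before dose conversion; the published procedure may operate on optical density and treat dose dependence differently.

```python
import numpy as np

def build_correction_matrix(uniform_scans):
    """uniform_scans: 2-D scans of films irradiated to uniform dose levels
    (e.g., nine levels spanning 0.08-2.93 Gy), all acquired in the same
    scanner orientation (landscape or portrait)."""
    deviations = [scan.astype(float) - scan.mean() for scan in uniform_scans]
    # the mean deviation from each film's own mean approximates the scanner's
    # spatially nonuniform response for that orientation
    return np.mean(deviations, axis=0)

def correct_scan(measurement_scan, correction_matrix):
    """Subtract the orientation-matched correction matrix before the pixel
    values are converted to dose with the calibration curve."""
    return measurement_scan.astype(float) - correction_matrix
```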

  5. GafChromic EBT film dosimetry with flatbed CCD scanner: A novel background correction method and full dose uncertainty analysis

    International Nuclear Information System (INIS)

    Saur, Sigrun; Frengen, Jomar

    2008-01-01

    Film dosimetry using radiochromic EBT film in combination with a flatbed charge coupled device scanner is a useful method both for two-dimensional verification of intensity-modulated radiation treatment plans and for general quality assurance of treatment planning systems and linear accelerators. Unfortunately, the response over the scanner area is nonuniform, and when not corrected for, this results in a systematic error in the measured dose which is both dose and position dependent. In this study a novel method for background correction is presented. The method is based on the subtraction of a correction matrix, a matrix that is based on scans of films that are irradiated to nine dose levels in the range 0.08-2.93 Gy. Because the response of the film is dependent on the film's orientation with respect to the scanner, correction matrices for both landscape oriented and portrait oriented scans were made. In addition to the background correction method, a full dose uncertainty analysis of the film dosimetry procedure was performed. This analysis takes into account the fit uncertainty of the calibration curve, the variation in response for different film sheets, the nonuniformity after background correction, and the noise in the scanned films. The film analysis was performed for film pieces of size 16x16 cm, all with the same lot number, and all irradiations were done perpendicular onto the films. The results show that the 2-sigma dose uncertainty at 2 Gy is about 5% and 3.5% for landscape and portrait scans, respectively. The uncertainty gradually increases as the dose decreases, but at 1 Gy the 2-sigma dose uncertainty is still as good as 6% and 4% for landscape and portrait scans, respectively. The study shows that film dosimetry using GafChromic EBT film, an Epson Expression 1680 Professional scanner and a dedicated background correction technique gives precise and accurate results. For the purpose of dosimetric verification, the calculated dose distribution can

  6. Versatile Formal Methods Applied to Quantum Information.

    Energy Technology Data Exchange (ETDEWEB)

    Witzel, Wayne [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Rudinger, Kenneth Michael [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Sarovar, Mohan [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    Using a novel formal methods approach, we have generated computer-verified proofs of major theorems pertinent to the quantum phase estimation algorithm. This was accomplished using our Prove-It software package in Python. While many formal methods tools are available, their practical utility is limited. Translating a problem of interest into these systems and working through the steps of a proof is an art form that requires much expertise. One must surrender to the preferences and restrictions of the tool regarding how mathematical notions are expressed and what deductions are allowed. Automation is a major driver that forces restrictions. Our focus, on the other hand, is to produce a tool that allows users the ability to confirm proofs that are essentially known already. This goal is valuable in itself. We demonstrate the viability of our approach that allows the user great flexibility in expressing statements and composing derivations. There were no major obstacles in following a textbook proof of the quantum phase estimation algorithm. There were tedious details of algebraic manipulations that we needed to implement (and a few that we did not have time to enter into our system) and some basic components that we needed to rethink, but there were no serious roadblocks. In the process, we made a number of convenient additions to our Prove-It package that will make certain algebraic manipulations easier to perform in the future. In fact, our intent is for our system to build upon itself in this manner.

  7. A statistical, task-based evaluation method for three-dimensional x-ray breast imaging systems using variable-background phantoms

    International Nuclear Information System (INIS)

    Park, Subok; Jennings, Robert; Liu Haimo; Badano, Aldo; Myers, Kyle

    2010-01-01

    Purpose: For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, either academia or industry. However, there is still much room for understanding how to best optimize and evaluate the devices over a large space of many different system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information from the 3D imaging systems. Therefore, it is critical to develop a statistically sound evaluation method to investigate the usefulness of inclusion of depth and background-variability information into the assessment and optimization of the 3D systems. Methods: In this paper, we present a mathematical framework for a statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular, making use of the ideal linear observer called the Hotelling observer. We also present a physical phantom that consists of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method in comparing laboratory mammography and three-angle DBT systems for signal detection tasks using the phantom's projection data. We compare the variable phantom case to that of a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods are different from each other for both mammography and DBT systems. Conclusions: Our results indicate that measuring the system's detection performance with consideration of background variability may lead to differences in system performance
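
    The Hotelling observer named above has a standard linear form; the sketch below estimates it from sample images with variable backgrounds, with a small ridge term added for invertibility. In practice a channelized observer is usually preferred when the image dimension is large relative to the number of samples; this fragment is illustrative, not the authors' implementation.

```python
import numpy as np

def hotelling_snr(signal_present, signal_absent):
    """signal_present, signal_absent: iterables of 2-D images (e.g., phantom
    projections with and without the inserted signal)."""
    s = np.stack([np.ravel(im) for im in signal_present]).astype(float)
    b = np.stack([np.ravel(im) for im in signal_absent]).astype(float)
    dmean = s.mean(axis=0) - b.mean(axis=0)
    cov = 0.5 * (np.cov(s, rowvar=False) + np.cov(b, rowvar=False))
    cov += 1e-6 * np.trace(cov) / cov.shape[0] * np.eye(cov.shape[0])  # ridge
    template = np.linalg.solve(cov, dmean)  # w = K^{-1} (mean_s - mean_b)
    snr_sq = float(dmean @ template)        # SNR^2 = dmean' K^{-1} dmean
    return np.sqrt(snr_sq)
```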

  8. Effects of ivermectin application on the diversity and function of dung and soil fauna: Regulatory and scientific background information.

    Science.gov (United States)

    Adler, Nicole; Bachmann, Jean; Blanckenhorn, Wolf U; Floate, Kevin D; Jensen, John; Römbke, Jörg

    2016-08-01

    The application of veterinary medical products to livestock can impact soil organisms in manure-amended fields or adversely affect organisms that colonize dung pats of treated animals and potentially retard the degradation of dung on pastures. For this reason, the authorization process for veterinary medicinal products in the European Union includes a requirement for higher-tier tests when adverse effects on dung organisms are observed in single-species toxicity tests. However, no guidance documents for the performance of higher-tier tests are available. Hence, an international research project was undertaken to develop and validate a proposed test method under varying field conditions of climate, soil, and endemic coprophilous fauna at Lethbridge (Canada), Montpellier (France), Zurich (Switzerland), and Wageningen (The Netherlands). The specific objectives were to determine if fecal residues of an anthelmintic with known insecticidal activity (ivermectin) showed similar effects across sites on 1) insects breeding in dung of treated animals, 2) coprophilous organisms in the soil beneath the dung, and 3) rates of dung degradation. By evaluating the effects of parasiticides on communities of dung-breeding insects and soil fauna under field conditions, the test method meets the requirements of a higher-tier test as mandated by the European Union. The present study provides contextual information on authorization requirements for veterinary medicinal products and on the structure and function of dung and soil organism communities. It also provides a summary of the main findings. Subsequent studies on this issue provide detailed information on different aspects of this overall project. Environ Toxicol Chem 2016;35:1914-1923. © 2015 SETAC.

  9. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    Science.gov (United States)

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  10. A comparison of methods used to calculate normal background concentrations of potentially toxic elements for urban soil

    Energy Technology Data Exchange (ETDEWEB)

    Rothwell, Katherine A., E-mail: k.rothwell@ncl.ac.uk; Cooke, Martin P., E-mail: martin.cooke@ncl.ac.uk

    2015-11-01

    To meet the requirements of regulation and to provide realistic remedial targets there is a need for the background concentration of potentially toxic elements (PTEs) in soils to be considered when assessing contaminated land. In England, normal background concentrations (NBCs) have been published for several priority contaminants for a number of spatial domains however updated regulatory guidance places the responsibility on Local Authorities to set NBCs for their jurisdiction. Due to the unique geochemical nature of urban areas, Local Authorities need to define NBC values specific to their area, which the national data is unable to provide. This study aims to calculate NBC levels for Gateshead, an urban Metropolitan Borough in the North East of England, using freely available data. The ‘median + 2MAD’, boxplot upper whisker and English NBC (according to the method adopted by the British Geological Survey) methods were compared for test PTEs lead, arsenic and cadmium. Due to the lack of systematically collected data for Gateshead in the national soil chemistry database, the use of site investigation (SI) data collected during the planning process was investigated. 12,087 SI soil chemistry data points were incorporated into a database and 27 comparison samples were taken from undisturbed locations across Gateshead. The SI data gave high resolution coverage of the area and Mann–Whitney tests confirmed statistical similarity for the undisturbed comparison samples and the SI data. SI data was successfully used to calculate NBCs for Gateshead and the median + 2MAD method was selected as most appropriate by the Local Authority according to the precautionary principle as it consistently provided the most conservative NBC values. The use of this data set provides a freely available, high resolution source of data that can be used for a range of environmental applications. - Highlights: • The use of site investigation data is proposed for land contamination studies
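
    The two simpler estimators compared above ('median + 2MAD' and the boxplot upper whisker) are straightforward to compute; a sketch follows. Whether the cited study scaled the MAD to be consistent with the standard deviation (the 1.4826 factor) is an assumption made here for illustration.

```python
import numpy as np

def nbc_median_plus_2mad(concentrations):
    """'median + 2MAD' normal background concentration."""
    x = np.asarray(concentrations, dtype=float)
    mad = 1.4826 * np.median(np.abs(x - np.median(x)))  # scaled MAD (assumed)
    return float(np.median(x) + 2.0 * mad)

def nbc_upper_whisker(concentrations):
    """Boxplot upper-whisker normal background concentration: Q3 + 1.5*IQR."""
    x = np.asarray(concentrations, dtype=float)
    q1, q3 = np.percentile(x, [25, 75])
    return float(q3 + 1.5 * (q3 - q1))
```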

  11. The geometric background-field method, renormalization and the Wess-Zumino term in non-linear sigma-models

    International Nuclear Information System (INIS)

    Mukhi, S.

    1986-01-01

    A simple recursive algorithm is presented which generates the reparametrization-invariant background-field expansion for non-linear sigma-models on manifolds with an arbitrary riemannian metric. The method is also applicable to Wess-Zumino terms and to counterterms. As an example, the general-metric model is expanded to sixth order and compared with previous results. For locally symmetric spaces, we actually obtain a general formula for the nth order term. The method is shown to facilitate the study of models with Wess-Zumino terms. It is demonstrated that, for chiral models, the Wess-Zumino term is unrenormalized to all orders in perturbation theory even when the model is not conformally invariant. (orig.)
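
    For orientation only, the block below writes out the standard second-order term of the covariant background-field expansion of the bosonic sigma-model that such recursive algorithms reproduce at lowest order; the normalisations and sign conventions are generic textbook choices and may differ from those of the paper, and the Wess-Zumino and higher-order terms are omitted.

```latex
% Bosonic non-linear sigma-model S[\phi] = \tfrac{1}{2}\int d^2x\,
% g_{ij}(\phi)\,\partial_\mu\phi^i\,\partial^\mu\phi^j, expanded about a
% background \phi_0 with quantum fluctuation \xi^i in Riemann normal
% coordinates.  Second-order (one-loop) term, generic conventions:
S_2[\phi_0,\xi] = \frac{1}{2}\int d^2x\,\Big(
      g_{ij}(\phi_0)\, D_\mu\xi^i\, D^\mu\xi^j
    + R_{ikjl}(\phi_0)\,\partial_\mu\phi_0^i\,\partial^\mu\phi_0^j\,\xi^k\xi^l
  \Big),
\qquad
D_\mu\xi^i = \partial_\mu\xi^i
           + \Gamma^i_{\,jk}(\phi_0)\,\partial_\mu\phi_0^j\,\xi^k .
```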

  12. Experiences with nutrition-related information during antenatal care of pregnant women of different ethnic backgrounds residing in the area of Oslo, Norway.

    Science.gov (United States)

    Garnweidner, Lisa M; Sverre Pettersen, Kjell; Mosdøl, Annhild

    2013-12-01

    To explore experiences with nutrition-related information during routine antenatal care among women of different ethnic backgrounds. Individual interviews with seventeen participants were conducted twice during pregnancy. Data collection and analysis were inspired by an interpretative phenomenological approach. Participants were purposively recruited at eight Mother and Child Health Centres in the area of Oslo, Norway, where they received antenatal care. Participants had either immigrant backgrounds from African and Asian countries (n=12) or were ethnic Norwegian (n=5). Participants were pregnant with their first child and had a pre-pregnancy Body Mass Index above 25 kg/m(2). Participants experienced that they were provided with little nutrition-related information in antenatal care. The information was perceived as presented in very general terms and focused on food safety. Weight management and the long-term prevention of diet-related chronic diseases had hardly been discussed. Participants with immigrant backgrounds appeared to be confused about information given by the midwife which was incongruent with their original food culture. The participants were actively seeking nutrition-related information and had to navigate between various sources of information. The midwife is considered a trustworthy source of nutrition-related information. Therefore, antenatal care may have considerable potential to promote a healthy diet to pregnant women. Findings suggest that nutrition communication in antenatal care should be more tailored towards women's dietary habits and cultural background, nutritional knowledge as well as level of nutrition literacy. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Application of geo-information science methods in ecotourism exploitation

    Science.gov (United States)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    Application of geo-information science methods in ecotourism development was discussed in the article. Since the 1990s, geo-information science methods, which take the 3S (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resources reconnaissance, data management, environment monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data. The application of 3S methods is helpful to sustainable development in tourism. Various assignments are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, dealing with mass data, and also tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make the development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  14. 48 CFR 2905.101 - Methods of disseminating information.

    Science.gov (United States)

    2010-10-01

    ... information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 2905.101 Methods of disseminating... dissemination of information concerning procurement actions. The Division of Acquisition Management Services...

  15. Axiomatic Evaluation Method and Content Structure for Information Appliances

    Science.gov (United States)

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information needs to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  16. Charge and magnetic moment of the neutrino in the background field method and in the linear RξL gauge

    International Nuclear Information System (INIS)

    Cabral-Rosetti, L.G.; Bernabeu, J.; Vidal, J.

    2000-01-01

    We present a computation of the charge and the magnetic moment of the neutrino in the recently developed electro-weak background field method and in the linear RξL gauge. First, we deduce a formal Ward-Takahashi identity which implies the immediate cancellation of the neutrino electric charge. This Ward-Takahashi identity is as simple as that for QED. The computation of the (proper and improper) one loop vertex diagrams contributing to the neutrino electric charge is also presented in an arbitrary gauge, checking in this way the Ward-Takahashi identity previously obtained. Finally, the calculation of the magnetic moment of the neutrino, in the minimal extension of the standard model with massive Dirac neutrinos, is presented, showing its gauge parameter and gauge structure independence explicitly. (orig.)

  17. Autogenic-Feedback Training (AFT) as a preventive method for space motion sickness: Background and experimental design

    Science.gov (United States)

    Cowings, Patricia S.; Toscano, William B.

    1993-01-01

    Finding an effective treatment for the motion sickness-like symptoms that occur in space has become a high priority for NASA. The background research is reviewed and the experimental design of a formal life sciences shuttle flight experiment designed to prevent space motion sickness in shuttle crew members is presented. This experiment utilizes a behavioral medicine approach to solving this problem. This method, Autogenic-Feedback Training (AFT), involves training subjects to voluntarily control several of their own physiological responses to environmental stressors. AFT has been used reliably to increase tolerance to motion sickness during ground-based tests in over 200 men and women under a variety of conditions that induce motion sickness, and preliminary evidence from space suggests that AFT may be an effective treatment for space motion sickness as well. Proposed changes to this experiment for future manifests are included.

  18. Fuzzy Search Method for Hi Education Information Security

    Directory of Open Access Journals (Sweden)

    Grigory Grigorevich Novikov

    2016-03-01

    Full Text Available The main purpose of the research is to show how a fuzzy search method can be used for the information security of higher education and for similar purposes. Many sensitive information leaks occur through the legal publication of non-classified documents, which is why intelligence services are so fond of the «mosaic» information collection method. This article is about how to prevent it.

  19. Geometrical Fuzzy Search Method for the Business Information Security Systems

    Directory of Open Access Journals (Sweden)

    Grigory Grigorievich Novikov

    2014-12-01

    Full Text Available The main purpose of the article is to show how a new fuzzy search method can be used for the information security of business and for other purposes. Many sensitive information leaks occur through the legal publication of non-classified documents, which is why intelligence services are so fond of the “mosaic” information collection method. This article is about how to prevent it.

  20. NATO mission in Kosovo: historical backgrounds and informations of working as radiologist in the German field hospital

    International Nuclear Information System (INIS)

    Voelk, M.; Danz, B.

    2005-01-01

    The first part of this article describes how the NATO mission in Kosovo came into existence and focuses on the historical background and ethnical problems. The second part deals with the working conditions of a radiologist in the German field hospital in Prizren and focuses on the personnel and technical equipment in the radiological department. (orig.) [de

  1. Vector analysis as a fast and easy method to compare gene expression responses between different experimental backgrounds

    NARCIS (Netherlands)

    Breitling, R.; Armengaud, P.; Amtmann, A.

    2005-01-01

    Background Gene expression studies increasingly compare expression responses between different experimental backgrounds (genetic, physiological, or phylogenetic). By focusing on dynamic responses rather than a direct comparison of static expression levels, this type of study allows a finer

  2. The effects of problem content and scientific background on information search and the assessment and valuation of correlations.

    Science.gov (United States)

    Soffer, Shira; Kareev, Yaakov

    2011-01-01

    The effects of problem contents and one's scientific background on the detection of correlations and the assessment of their strength were studied using a task that required active data search, assessment of the strength of a correlation, and monetary valuation of the correlation's predictive utility. Participants (N = 72) who were trained either in the natural sciences or in the social sciences and humanities explored data sets differing in contents and actual strength of correlation. Data search was consistent across all variables: Participants drew relatively small samples whose relative sizes would favor the detection of a correlation, if one existed. In contrast, the assessment of the correlation strength and the valuation of its predictive utility were strongly related not only to its objective strength, but also to the correspondence between problem contents and one's scientific background: When the two matched, correlations were judged to be stronger and more valuable than when they did not.

  3. Studying collaborative information seeking: Experiences with three methods

    DEFF Research Database (Denmark)

    Hyldegård, Jette Seiden; Hertzum, Morten; Hansen, Preben

    2015-01-01

    , however, benefit from a discussion of methodological issues. This chapter describes the application of three methods for collecting and analyzing data in three CIS studies. The three methods are Multidimensional Exploration, used in a CIS study of students’ information behavior during a group assignment…; Task-structured Observation, used in a CIS study of patent engineers; and Condensed Observation, used in a CIS study of information-systems development. The three methods are presented in the context of the studies for which they were devised, and the experiences gained using the methods are discussed… The chapter shows that different methods can be used for collecting and analyzing data about CIS incidents. Two of the methods focused on tasks and events in work settings, while the third was applied in an educational setting. Commonalities and differences among the methods are discussed to inform decisions…

  4. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    A seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method to construct function models and the IDEF1x method to make information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  5. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and verify their applicability to early design stages. Several methods were evaluated on their support to synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  6. Method of Improving Personal Name Search in Academic Information Service

    Directory of Open Access Journals (Sweden)

    Heejun Han

    2012-12-01

    Full Text Available All academic information on the web or elsewhere has its creator, that is, a subject who has created the information. The subject can be an individual, a group, or an institution, and can be a nation depending on the nature of the relevant information. Most information is composed of a title, an author, and contents. An essay which is under the academic information category has metadata including a title, an author, keyword, abstract, data about publication, place of publication, ISSN, and the like. A patent has metadata including the title, an applicant, an inventor, an attorney, IPC, number of application, and claims of the invention. Most web-based academic information services enable users to search the information by processing the meta-information. An important element is to search information by using the author field, which corresponds to a personal name. This study suggests a method of efficient indexing and the use of an adjacent-operation result-ranking algorithm to which phrase-search-based boosting elements are applied, thus improving the accuracy of personal-name search results. It also describes a method for providing co-authors and related researchers as additional results when searching personal names. This method can be effectively applied to provide accurate and additional search results in academic information services.
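
    The record above does not spell out the ranking algorithm, so the toy sketch below only illustrates the general idea of boosting adjacent (phrase) matches in an author field over mere token overlap; the field name, weights and scoring function are hypothetical, not the service's actual implementation.

```python
def score_author_match(query, record, exact_boost=3.0, token_weight=1.0):
    """Toy relevance score for a personal-name query against a record's
    author field: token overlap plus a boost when the query occurs as an
    adjacent phrase. Weights are illustrative only."""
    q = query.lower().strip()
    author = record.get("author", "").lower()
    score = token_weight * sum(tok in author.split() for tok in q.split())
    if q and q in author:              # adjacent (phrase) match gets a boost
        score += exact_boost
    return score

records = [
    {"title": "Metadata quality in digital libraries", "author": "Heejun Han"},
    {"title": "Patent claim analysis", "author": "Han Seo Park"},
]
ranked = sorted(records, key=lambda r: score_author_match("heejun han", r),
                reverse=True)
print([r["author"] for r in ranked])   # the exact-name record ranks first
```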

  7. Self-informant Agreement for Personality and Evaluative Person Descriptors: Comparing Methods for Creating Informant Measures.

    Science.gov (United States)

    Simms, Leonard J; Zelazny, Kerry; Yam, Wern How; Gros, Daniel F

    2010-05-01

    Little attention typically is paid to the way self-report measures are translated for use in self-informant agreement studies. We studied two possible methods for creating informant measures: (a) the traditional method in which self-report items were translated from the first- to the third-person and (b) an alternative meta-perceptual method in which informants were directed to rate their perception of the targets' self-perception. We hypothesized that the latter method would yield stronger self-informant agreement for evaluative personality dimensions measured by indirect item markers. We studied these methods in a sample of 303 undergraduate friendship dyads. Results revealed mean-level differences between methods, similar self-informant agreement across methods, stronger agreement for Big Five dimensions than for evaluative dimensions, and incremental validity for meta-perceptual informant rating methods. Limited power reduced the interpretability of several sparse acquaintanceship effects. We conclude that traditional informant methods are appropriate for most personality traits, but meta-perceptual methods may be more appropriate when personality questionnaire items reflect indirect indicators of the trait being measured, which is particularly likely for evaluative traits.

  8. You don't have to believe everything you read: background knowledge permits fast and efficient validation of information.

    Science.gov (United States)

    Richter, Tobias; Schroeder, Sascha; Wöhrmann, Britta

    2009-03-01

    In social cognition, knowledge-based validation of information is usually regarded as relying on strategic and resource-demanding processes. Research on language comprehension, in contrast, suggests that validation processes are involved in the construction of a referential representation of the communicated information. This view implies that individuals can use their knowledge to validate incoming information in a routine and efficient manner. Consistent with this idea, Experiments 1 and 2 demonstrated that individuals are able to reject false assertions efficiently when they have validity-relevant beliefs. Validation processes were carried out routinely even when individuals were put under additional cognitive load during comprehension. Experiment 3 demonstrated that the rejection of false information occurs automatically and interferes with affirmative responses in a nonsemantic task (epistemic Stroop effect). Experiment 4 also revealed complementary interference effects of true information with negative responses in a nonsemantic task. These results suggest the existence of fast and efficient validation processes that protect mental representations from being contaminated by false and inaccurate information.

  9. Classifying and Designing the Educational Methods with Information Communications Technoligies

    Directory of Open Access Journals (Sweden)

    I. N. Semenova

    2013-01-01

    Full Text Available The article describes the conceptual apparatus for implementing Information Communications Technologies (ICT) in education. The authors suggest classification variants of the related teaching methods according to the following component combinations: types of students' work with information, goals of ICT incorporation into the training process, individualization degrees, contingent involvement, activity levels and pedagogical field targets, ideology of informational didactics, etc. Each classification can solve the educational tasks in the context of the partial paradigm of modern didactics; any kind of method implies a particular combination of activities in the educational environment. The whole spectrum of classifications provides the informational functional basis for the adequate selection of necessary teaching methods in accordance with the specified goals and planned results. The potential variants of ICT implementation methods are given for different teaching models.

  10. Method s for Measuring Productivity in Libraries and Information Centres

    OpenAIRE

    Mohammad Alaaei

    2009-01-01

      Within Information centers, productivity is the result of optimal and effective use of information resources, service quality improvement, increased user satisfaction, pleasantness of working environment, increased motivation and enthusiasm of staff to work better. All contribute to the growth and development of information centers. Thus these centers would need to be familiar with methods employed in productivity measurement. Productivity is one of the criteria for evaluating system perfor...

  11. Classification Method in Integrated Information Network Using Vector Image Comparison

    Directory of Open Access Journals (Sweden)

    Zhou Yuan

    2014-05-01

    Full Text Available A Wireless Integrated Information Network (WMN) consists of integrated information nodes that can gather data, such as images and voice, from their surroundings. Transmitting this information requires large resources, which decreases the service time of the network. In this paper we present a Classification Approach based on Vector Image Comparison (VIC) for WMN that improves the service time of the network. Methods for sub-region selection and conversion are also proposed.

  12. Deriving harmonised forest information in Europe using remote sensing methods

    DEFF Research Database (Denmark)

    Seebach, Lucia Maria

    the need for harmonised forest information can be satisfied using remote sensing methods. In conclusion, the study showed that it is possible to derive harmonised forest information of high spatial detail in Europe with remote sensing. The study also highlighted the imperative provision of accuracy...

  13. 48 CFR 1205.101 - Methods of disseminating information.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Methods of disseminating information. 1205.101 Section 1205.101 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION... disseminating information. (b) The DOT Office of Small and Disadvantaged Business Utilization (S-40), 400 7th...

  14. Methods and background characteristics of the TOHNN study: a population-based study of oral health conditions in northern Norway

    Science.gov (United States)

    Holde, Gro Eirin; Oscarson, Nils; Tillberg, Anders; Marstrander, Peter; Jönsson, Birgitta

    2016-01-01

    Objectives The aim of the Tromstannen – Oral Health in Northern Norway (TOHNN) study was to investigate oral health and dental-related diseases in an adult population. This article provides an overview of the background of the study and a description of the sample characteristics and methods employed in data collection. Study design Cross-sectional population-based study including a questionnaire and clinical dental examination. Methods A randomly selected sample of 2,909 individuals (20–79 years old) drawn from the population register was invited to participate in the study. The data were collected between October 2013 and November 2014 in Troms County in northern Norway. The questionnaire focused on oral health-related behaviours and attitudes, oral health-related quality of life, sense of coherence, dental anxiety and symptoms from the temporomandibular joint. The dental examinations, including radiographs, were conducted by 11 dental teams in 5 dental offices. The examination comprised registration of dental caries, full mouth periodontal status, temporomandibular disorders, mucosal lesions and height and weight. The participants were grouped by age (20–34, 35–49, 50–64 and 65–79) and ethnicity (Norwegian, Sámi, other European and other world). Results From the original sample of 2,909 individuals, 1,986 (68.3%) people participated, of whom 1,019 (51.3%) were women. The highest attendance rate was among women 20–34 years old (80.3%) and the lowest in the oldest age group of women (55.4%). There was no difference in response rate between rural and urban areas. There was a positive correlation between population size and household gross income. The sample is considered representative of the population in Troms County. Due to the high participation rate, generalization both nationally and to the circumpolar area ought to be possible. PMID:26900910

  15. METHODOLOGICAL BACKGROUND OF EXPERT ESTIMATION OF INITIAL DATA COMPLETENESS AND QUALITY ACCORDING TO THE CERTIFIED INFORMATION SECURITY SYSTEM

    Directory of Open Access Journals (Sweden)

    V. K. Fisenko

    2015-01-01

    Full Text Available The problem of certifying information security systems is analyzed and the tasks of initial data analysis are set out. The objectives, indices and decision-making criteria, as well as the challenges to be addressed, are formulated. It is shown that, in order to improve quality and reduce the time and cost of preparing for certification, it is reasonable to use a software system to automate the analysis of the initial data presented by the owner of the information system.

  16. Radiological assessment of residences in the Oak Ridge area. Volume 1. Background information for ORNL environmental impact statement

    International Nuclear Information System (INIS)

    Tsakeres, F.S.; Shank, K.E.; Chaudhry, M.Y.; Ahmad, S.; DiZillo-Benoit, P.M.; Oakes, T.W.

    1980-10-01

    Measurements of exposure rates using thermoluminescent dosimeters placed within residences in the Oak Ridge/Knoxville area are presented. The objective of this investigation was to determine the radiation component acquired by Oak Ridge National Laboratory employee personnel dosimeter-security badges during residential badge storage and to develop a model to predict the radiation exposure rate in Oak Ridge/Knoxville-area homes. The exposure rates varied according to building material used and geographic location. Exposure rates were higher in the fall and lower in the spring; stone residences had a higher average dose equivalent rate than residences made of wood. An average yearly exposure rate was determined to be 78 millirems per year for the Oak Ridge-area homes. This value can be compared to the natural background radiation dose equivalent rate in the United States of 80 to 200 millirems per year

  17. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection.

    Science.gov (United States)

    I H, Monrad Aas

    2014-11-01

    Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview - unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants - as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these influences is fundamental. Further development of

  18. On the Adaptation of an Agile Information Systems Development Method

    NARCIS (Netherlands)

    Aydin, M.N.; Harmsen, F.; van Slooten, C.; Stegwee, R.A.

    2005-01-01

    Little specific research has been conducted to date on the adaptation of agile information systems development (ISD) methods. This article presents the work practice in dealing with the adaptation of such a method in the ISD department of one of the leading financial institutes in Europe. Two forms

  19. Adaptation of an Agile Information System Development Method

    NARCIS (Netherlands)

    Aydin, M.N.; Harmsen, A.F.; van Hillegersberg, Jos; Stegwee, R.A.; Siau, K.

    2007-01-01

    Little specific research has been conducted to date on the adaptation of agile information systems development (ISD) methods. This chapter presents the work practice in dealing with the adaptation of such a method in the ISD department of one of the leading financial institutes in Europe. The

  20. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  1. BRST with background field method of the (4,0) supersymmetric σ-model in two dimensions

    International Nuclear Information System (INIS)

    Lhallabi, T.

    1988-08-01

    A manifestly covariant background field formalism for (4,0) supersymmetric non-linear σ-model in two dimensions is presented. The BRST argument is used in order to obtain Faddeev-Popov ghost terms. (author). 13 refs

  2. From Cleanup to Stewardship. A companion report to Accelerating Cleanup: Paths to Closure and background information to support the scoping process required for the 1998 PEIS Settlement Study

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1999-10-01

    Long-term stewardship is expected to be needed at more than 100 DOE sites after DOE's Environmental Management program completes disposal, stabilization, and restoration operations to address waste and contamination resulting from nuclear research and nuclear weapons production conducted over the past 50 years. From Cleanup to stewardship provides background information on the Department of Energy (DOE) long-term stewardship obligations and activities. This document begins to examine the transition from cleanup to long-term stewardship, and it fulfills the Secretary's commitment to the President in the 1999 Performance Agreement to provide a companion report to the Department's Accelerating Cleanup: Paths to Closure report. It also provides background information to support the scoping process required for a study on long-term stewardship required by a 1998 Settlement Agreement.

  3. Background Material

    DEFF Research Database (Denmark)

    Zandersen, Marianne; Hyytiäinen, Kari; Saraiva, Sofia

    This document serves as a background material to the BONUS Pilot Scenario Workshop, which aims to develop harmonised regional storylines of socio-ecological futures in the Baltic Sea region in a collaborative effort together with other BONUS projects and stakeholders.

  4. Justification of computational methods to ensure information management systems

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available Summary. Due to the diversity and complexity of the organizational management tasks of a large enterprise, the construction of an information management system requires the establishment of interconnected complexes of means that implement, in the most efficient way, the collection, transfer, accumulation and processing of the information needed by decision makers of different ranks in the governance process. The main trends in the construction of integrated logistics management information systems can be considered to be: the creation of integrated data-processing systems through centralized storage and processing of data arrays; the organization of computer systems that realize time-sharing; the aggregate-block principle of the integrated logistics; and the use of a wide range of peripheral devices with unified information and hardware communication. Main attention is paid to the systematic study of the complex of technical support, in particular the definition of quality criteria for the operation of the technical complex, the development of methods for analyzing the information base of management information systems and defining the requirements for technical means, as well as methods for the structural synthesis of the major subsystems of the integrated logistics. Thus, the aim is to study the integrated logistics management information system on the basis of a systematic approach and to develop a number of methods of analysis and synthesis of the complex logistics that are suitable for use in the practice of engineering systems design. The objective function of the complex logistics management information system is to gather, transmit and process specified amounts of information within regulated time intervals and with the required degree of accuracy, while minimizing the reduced costs of establishing and operating the technical complex. Achieving this objective function of the complex logistics requires a certain organization of the interaction of information

  5. Hazardous air pollutant emissions from process units in the synthetic organic chemical manufacturing industry: Background information for proposed standards. Volume 1B. Control technologies. Draft report

    International Nuclear Information System (INIS)

    1992-11-01

    A draft rule for the regulation of emissions of organic hazardous air pollutants (HAP's) from chemical processes of the synthetic organic chemical manufacturing industry (SOCMI) is being proposed under the authority of Sections 112, 114, 116, and 301 of the Clean Air Act, as amended in 1990. This volume of the Background Information Document presents discussions of control technologies used in the industry and the costs of those technologies

  6. Hazardous air pollutant emissions from process units in the synthetic organic chemical manufacturing industry: Background information for proposed standards. Volume 1A. National impacts assessment. Draft report

    International Nuclear Information System (INIS)

    1992-11-01

    A draft rule for the regulation of emissions of organic hazardous air pollutants (HAP's) from chemical processes of the synthetic organic chemical manufacturing industry (SOCMI) is being proposed under the authority of Sections 112, 114, 116, and 301 of the Clean Air Act, as amended in 1990. This volume of the Background Information Document presents the results of the national impacts assessment for the proposed rule

  7. Usability Evaluation Methods for Special Interest Internet Information Services

    Directory of Open Access Journals (Sweden)

    Eva-Maria Schön

    2014-06-01

    Full Text Available The internet provides a wide range of scientific information for different areas of research, used by the related scientific communities. Often the design or architecture of these web pages does not correspond to the mental model of their users. As a result the wanted information is difficult to find. Methods established by Usability Engineering and User Experience can help to increase the appeal of scientific internet information services by analyzing the users’ requirements. This paper describes a procedure to analyze and optimize scientific internet information services that can be accomplished with relatively low effort. It consists of a combination of methods that already have been successfully applied to practice: Personas, usability inspections, Online Questionnaire, Kano model and Web Analytics.

  8. Interface methods for using intranet portal organizational memory information system.

    Science.gov (United States)

    Ji, Yong Gu; Salvendy, Gavriel

    2004-12-01

    In this paper, an intranet portal is considered as an information infrastructure (organizational memory information system, OMIS) supporting organizational learning. The properties and the hierarchical structure of information and knowledge in an intranet portal OMIS was identified as a problem for navigation tools of an intranet portal interface. The problem relates to navigation and retrieval functions of intranet portal OMIS and is expected to adversely affect user performance, satisfaction, and usefulness. To solve the problem, a conceptual model for navigation tools of an intranet portal interface was proposed and an experiment using a crossover design was conducted with 10 participants. In the experiment, a separate access method (tabbed tree tool) was compared to an unified access method (single tree tool). The results indicate that each information/knowledge repository for which a user has a different structural knowledge should be handled separately with a separate access to increase user satisfaction and the usefulness of the OMIS and to improve user performance in navigation.

  9. A Method for Estimating Urban Background Concentrations in Support of Hybrid Air Pollution Modeling for Environmental Health Studies

    Directory of Open Access Journals (Sweden)

    Saravanan Arunachalam

    2014-10-01

    Full Text Available Exposure studies rely on detailed characterization of air quality, either from sparsely located routine ambient monitors or from central monitoring sites that may lack spatial representativeness. Alternatively, some studies use models of various complexities to characterize local-scale air quality, but often with poor representation of background concentrations. A hybrid approach that addresses this drawback combines a regional-scale model to provide background concentrations and a local-scale model to assess impacts of local sources. However, this approach may double-count sources in the study regions. To address these limitations, we carefully define the background concentration as the concentration that would be measured if local sources were not present, and to estimate these background concentrations we developed a novel technique that combines space-time ordinary kriging (STOK) of observations with outputs from a detailed chemistry-transport model with local sources zeroed out. We applied this technique to support an exposure study in Detroit, Michigan, for several pollutants (including NOx and PM2.5), and evaluated the estimated hybrid concentrations (calculated by combining the background estimates that address this issue of double counting with local-scale dispersion model estimates) using observations. Our results demonstrate the strength of this approach specifically by eliminating the problem of double-counting reported in previous hybrid modeling approaches, leading to improved estimates of background concentrations, and further highlight the relative importance of NOx vs. PM2.5 in their relative contributions to total concentrations. While a key limitation of this approach is the requirement for another detailed model simulation to avoid double-counting, STOK improves the overall characterization of background concentrations at very fine spatial scales.
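
    The hybrid combination itself is additive, which the minimal sketch below illustrates at a few receptor points; the numbers are invented, and in the study the background term would come from space-time kriging of observations against the zeroed-out regional model while the local term would come from a dispersion model of the local sources only.

```python
import numpy as np

# Illustrative receptor-level estimates (ug/m3).  In the study, (a) is the
# background from kriged observations plus the regional model run with local
# sources zeroed out, and (b) is the local-scale dispersion model increment.
background_no_local = np.array([8.2, 7.9, 9.1, 8.5])   # (a) background
local_increment     = np.array([3.4, 0.8, 5.6, 2.1])   # (b) local sources

# Hybrid concentration: a simple sum, with no double counting because the
# local sources were removed from the regional (background) simulation.
hybrid = background_no_local + local_increment
print(hybrid)
```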

  10. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing up various disciplines frequently produces something that is profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, which leads to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite some fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  11. Application of nonparametric regression methods to study the relationship between NO2 concentrations and local wind direction and speed at background sites.

    Science.gov (United States)

    Donnelly, Aoife; Misstear, Bruce; Broderick, Brian

    2011-02-15

    Background concentrations of nitrogen dioxide (NO(2)) are not constant but vary temporally and spatially. The current paper presents a powerful tool for the quantification of the effects of wind direction and wind speed on background NO(2) concentrations, particularly in cases where monitoring data are limited. In contrast to previous studies which applied similar methods to sites directly affected by local pollution sources, the current study focuses on background sites with the aim of improving methods for predicting background concentrations adopted in air quality modelling studies. The relationship between measured NO(2) concentration in air at three such sites in Ireland and locally measured wind direction has been quantified using nonparametric regression methods. The major aim was to analyse a method for quantifying the effects of local wind direction on background levels of NO(2) in Ireland. The method was expanded to include wind speed as an added predictor variable. A Gaussian kernel function is used in the analysis and circular statistics employed for the wind direction variable. Wind direction and wind speed were both found to have a statistically significant effect on background levels of NO(2) at all three sites. Frequently environmental impact assessments are based on short term baseline monitoring producing a limited dataset. The presented non-parametric regression methods, in contrast to the frequently used methods such as binning of the data, allow concentrations for missing data pairs to be estimated and distinction between spurious and true peaks in concentrations to be made. The methods were found to provide a realistic estimation of long term concentration variation with wind direction and speed, even for cases where the data set is limited. Accurate identification of the actual variation at each location and causative factors could be made, thus supporting the improved definition of background concentrations for use in air quality modelling
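
    A minimal sketch of the kind of estimator described above, assuming a Nadaraya-Watson form with a product Gaussian kernel in which wind direction enters through the smallest angular difference; the bandwidths, the toy data and the kernel details are assumptions and need not match the published analysis.

```python
import numpy as np

def circular_diff_deg(a, b):
    """Smallest absolute angular difference (degrees) between directions."""
    d = np.abs(a - b) % 360.0
    return np.minimum(d, 360.0 - d)

def nw_no2(theta0, speed0, theta, speed, no2, h_dir=30.0, h_spd=2.0):
    """Nadaraya-Watson estimate of mean NO2 at wind direction theta0 (deg)
    and speed speed0 (m/s), using a product Gaussian kernel with a
    circular treatment of direction."""
    w = np.exp(-0.5 * (circular_diff_deg(theta, theta0) / h_dir) ** 2) \
      * np.exp(-0.5 * ((speed - speed0) / h_spd) ** 2)
    return float(np.sum(w * no2) / np.sum(w))

# Toy monitoring data: direction (deg), speed (m/s), NO2 (ug/m3) with a
# simulated elevated sector around 45 degrees.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 360.0, 500)
speed = rng.gamma(2.0, 2.0, 500)
no2 = (12.0
       + 6.0 * np.exp(-0.5 * (circular_diff_deg(theta, 45.0) / 40.0) ** 2)
       + rng.normal(0.0, 2.0, 500))

print(nw_no2(45.0, 3.0, theta, speed, no2))    # elevated, near the simulated sector
print(nw_no2(225.0, 3.0, theta, speed, no2))   # close to the 12 ug/m3 background
```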

  12. MAIA - Method for Architecture of Information Applied: methodological construct of information processing in complex contexts

    Directory of Open Access Journals (Sweden)

    Ismael de Moura Costa

    2017-04-01

    Full Text Available Introduction: This paper presents the evolution of the MAIA Method for Architecture of Information Applied, its structure, the results obtained and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the inherent configurations of those spaces. Methodology: The argument is elaborated from theoretical research of an analytical character, using distinction as a way to express concepts. Phenomenology is used as the philosophical position, which considers the correlation between Subject↔Object. The research also considers the notion of interpretation as an integrating element for the definition of concepts. With these postulates, the steps to transform information spaces are formulated. Results: The article shows how the method is structured to process information in its contexts, starting from a succession of evolutive cycles, divided into moments, which in turn evolve into transformation acts. Conclusions: Besides showing how the method is structured, the article presents its possible applications not only as a scientific method but also as a configuration tool for information spaces and as a generator of ontologies. Last but not least, it presents a brief summary of the analysis made by researchers who have already evaluated the method considering the three aspects mentioned.

  13. An Improved Information Hiding Method Based on Sparse Representation

    Directory of Open Access Journals (Sweden)

    Minghai Yao

    2015-01-01

    Full Text Available A novel biometric authentication information hiding method based on the sparse representation is proposed for enhancing the security of biometric information transmitted in the network. In order to make good use of abundant information of the cover image, the sparse representation method is adopted to exploit the correlation between the cover and biometric images. Thus, the biometric image is divided into two parts. The first part is the reconstructed image, and the other part is the residual image. The biometric authentication image cannot be restored by any one part. The residual image and sparse representation coefficients are embedded into the cover image. Then, for the sake of causing much less attention of attackers, the visual attention mechanism is employed to select embedding location and embedding sequence of secret information. Finally, the reversible watermarking algorithm based on histogram is utilized for embedding the secret information. For verifying the validity of the algorithm, the PolyU multispectral palmprint and the CASIA iris databases are used as biometric information. The experimental results show that the proposed method exhibits good security, invisibility, and high capacity.

  14. System and method for acquisition management of subject position information

    Science.gov (United States)

    Carrender, Curt

    2005-12-13

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  15. System and method for acquisition management of subject position information

    Energy Technology Data Exchange (ETDEWEB)

    Carrender, Curt [Morgan Hill, CA

    2007-01-23

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  16. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to the analysis of financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets and report similarity results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the stock markets differs across time periods and that the similarity of the two markets becomes larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas in the resulting phylogenetic trees. The results show that satisfactory information can be extracted from financial markets by this method. The information categorization method can be used not only for physiologic time series but also for financial time series.
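
    The abstract does not define the distance measure, so the sketch below substitutes a common correlation-based distance between return series and average-linkage clustering as a stand-in for the tree-like grouping of markets; the index names and data are synthetic, and the actual information-categorization distance differs.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

# Synthetic daily log-returns for a few hypothetical indices, two per region
rng = np.random.default_rng(1)
common_us = rng.normal(0, 0.01, 1000)
common_cn = rng.normal(0, 0.01, 1000)
series = {
    "US_A": common_us + rng.normal(0, 0.005, 1000),
    "US_B": common_us + rng.normal(0, 0.005, 1000),
    "CN_A": common_cn + rng.normal(0, 0.005, 1000),
    "CN_B": common_cn + rng.normal(0, 0.005, 1000),
}
names = list(series)
data = np.array([series[n] for n in names])

# Correlation-based distance d = sqrt(2 * (1 - rho)), a common choice for
# grouping markets; the paper's own distance definition is not reproduced.
corr = np.corrcoef(data)
dist = np.sqrt(2.0 * (1.0 - corr))
np.fill_diagonal(dist, 0.0)

tree = linkage(squareform(dist, checks=False), method="average")
print(dendrogram(tree, labels=names, no_plot=True)["ivl"])  # leaf order groups regions
```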

  17. Assessment of technical strengths and information flow of energy conservation research in Japan. Volume 2. Background document

    Energy Technology Data Exchange (ETDEWEB)

    Hane, G.J.; Lewis, P.M.; Hutchinson, R.A.; Rubinger, B.; Willis, A.

    1985-06-01

    Purpose of this study is to explore the status of R and D in Japan and the ability of US researchers to keep abreast of Japanese technical advances. US researchers familiar with R and D activities in Japan were interviewed in ten fields that are relevant to the more efficient use of energy: amorphous metals, biotechnology, ceramics, combustion, electrochemical energy storage, heat engines, heat transfer, high-temperature sensors, thermal and chemical energy storage, and tribology. The researchers were questioned about their perceptions of the strengths of R and D in Japan, comparative aspects of US work, and the quality of available information sources describing R and D in Japan. Of the ten related fields, the researchers expressed a strong perception that significant R and D is under way in amorphous metals, biotechnology, and ceramics, and that the US competitive position in these technologies will be significantly challenged. Researchers also identified alternative emphases in Japanese R and D programs in these areas that provide Japan with stronger technical capabilities. For example, in biotechnology, researchers noted the significant Japanese emphasis on industrial-scale bioprocess engineering, which contrasts with a more meager effort in the US. In tribology, researchers also noted the strength of the chemical tribology research in Japan and commented on the effective mix of chemical and mechanical tribology research. This approach contrasts with the emphasis on mechanical tribology in the US.

  18. Discussion of a method for providing general risk information by linking with the nuclear information

    International Nuclear Information System (INIS)

    Shobu, Nobuhiro; Yokomizo, Shirou; Umezawa, Sayaka

    2004-06-01

    'Risk information navigator (http://www.ricotti.jp/risknavi/)', an internet tool for arousing public interest and fostering people's risk literacy, has been developed as the contents for the official website of Techno Community Square 'RICOTTI' (http://www.ricotti.jp) at TOKAI village. In this report we classified the risk information into the fields 'Health/Daily Life', 'Society/Crime/Disaster' and 'Technology/Environment/Energy' for the internet tool contents. According to these categories we discussed a method for providing various risk information on general fields by linking with the information on the nuclear field. The web contents are attached to this report with the CD-R media. (author)

  19. Background radiation

    International Nuclear Information System (INIS)

    Arnott, D.

    1985-01-01

    The effects of background radiation, whether natural or caused by man's activities, are discussed. The known biological effects of radiation in causing cancers or genetic mutations are explained. The statement that there is a threshold below which there is no risk is examined critically. (U.K.)

  20. Effect of background dielectric on TE-polarized photonic bandgap of metallodielectric photonic crystals using Dirichlet-to-Neumann map method.

    Science.gov (United States)

    Sedghi, Aliasghar; Rezaei, Behrooz

    2016-11-20

    Using the Dirichlet-to-Neumann map method, we have calculated the photonic band structure of two-dimensional metallodielectric photonic crystals having the square and triangular lattices of circular metal rods in a dielectric background. We have selected the transverse electric mode of electromagnetic waves, and the resulting band structures showed the existence of photonic bandgap in these structures. We theoretically study the effect of background dielectric on the photonic bandgap.

  1. Multimodal cues provide redundant information for bumblebees when the stimulus is visually salient, but facilitate red target detection in a naturalistic background

    Science.gov (United States)

    Corcobado, Guadalupe; Trillo, Alejandro

    2017-01-01

    Our understanding of how floral visitors integrate visual and olfactory cues when seeking food, and how background complexity affects flower detection is limited. Here, we aimed to understand the use of visual and olfactory information for bumblebees (Bombus terrestris terrestris L.) when seeking flowers in a visually complex background. To explore this issue, we first evaluated the effect of flower colour (red and blue), size (8, 16 and 32 mm), scent (presence or absence) and the amount of training on the foraging strategy of bumblebees (accuracy, search time and flight behaviour), considering the visual complexity of our background, to later explore whether experienced bumblebees, previously trained in the presence of scent, can recall and make use of odour information when foraging in the presence of novel visual stimuli carrying a familiar scent. Of all the variables analysed, flower colour had the strongest effect on the foraging strategy. Bumblebees searching for blue flowers were more accurate, flew faster, followed more direct paths between flowers and needed less time to find them, than bumblebees searching for red flowers. In turn, training and the presence of odour helped bees to find inconspicuous (red) flowers. When bees foraged on red flowers, search time increased with flower size; but search time was independent of flower size when bees foraged on blue flowers. Previous experience with floral scent enhances the capacity of detection of a novel colour carrying a familiar scent, probably by elemental association influencing attention. PMID:28898287

  2. Research on a Method of Geographical Information Service Load Balancing

    Science.gov (United States)

    Li, Heyuan; Li, Yongxing; Xue, Zhiyong; Feng, Tao

    2018-05-01

    With the development of geographical information service technologies, how to achieve intelligent scheduling and highly concurrent access to geographical information service resources through load balancing is a focal point of current study. This paper presents a dynamic load-balancing algorithm. In the algorithm, each type of geographical information service is matched with a corresponding server group; the RED algorithm is then combined with a double-threshold method to judge the load state of each server node; finally, the service is scheduled on a weighted probabilistic basis over a given period. An experimental system built on a server cluster demonstrates the effectiveness of the method presented in this paper.
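
    A toy sketch of the double-threshold, weighted-probabilistic dispatch idea described above; the thresholds, weights and the exponential smoothing standing in for RED-style load averaging are all assumptions, not the parameters of the paper.

```python
import random

LOW, HIGH = 0.4, 0.8             # assumed double thresholds on smoothed load

class Node:
    def __init__(self, name, capacity):
        self.name, self.capacity = name, capacity
        self.avg_load = 0.0      # RED-style exponentially weighted average

    def update(self, instant_load, w=0.2):
        self.avg_load = (1.0 - w) * self.avg_load + w * instant_load

    def weight(self):
        """Dispatch weight: full capacity below LOW, tapered between the
        thresholds, zero (excluded) above HIGH."""
        if self.avg_load >= HIGH:
            return 0.0
        if self.avg_load <= LOW:
            return float(self.capacity)
        return self.capacity * (HIGH - self.avg_load) / (HIGH - LOW)

def dispatch(nodes):
    """Pick a node with probability proportional to its weight; if every
    node is overloaded, fall back to the least loaded one."""
    weights = [n.weight() for n in nodes]
    if sum(weights) == 0:
        return min(nodes, key=lambda n: n.avg_load)
    return random.choices(nodes, weights=weights, k=1)[0]

# Hypothetical server group serving one type of map service
group = [Node("wms-1", 4), Node("wms-2", 2), Node("wms-3", 2)]
for node, load in zip(group, [0.3, 0.6, 0.9]):
    node.update(load)
    node.update(load)
print(dispatch(group).name)
```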

  3. Control method for biped locomotion robots based on ZMP information

    International Nuclear Information System (INIS)

    Kume, Etsuo

    1994-01-01

    The Human Acts Simulation Program (HASP) started as a ten-year program of the Computing and Information Systems Center (CISC) at the Japan Atomic Energy Research Institute (JAERI) in 1987. A mechanical design study of biped locomotion robots for patrol and inspection in nuclear facilities is being performed as an item of the research scope. One of the goals of our research is to design a biped locomotion robot for practical use in nuclear facilities. So far, we have studied several dynamic walking patterns. In conventional control methods for biped locomotion robots, program control is used based on preset walking patterns, so it does not have robustness such as a dynamic change of walking pattern. Therefore, a real-time control method based on dynamic information about the robot states is necessary for high walking performance. In this study a new control method based on Zero Moment Point (ZMP) information is proposed as one such real-time control method. The proposed method is discussed and validated through numerical simulation. (author)
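
    As background for the record above, the sketch below shows how a ZMP coordinate is commonly obtained from the centre-of-mass state in the standard cart-table approximation and checked against the support polygon. This is a generic textbook formulation, not the controller proposed in the paper, and the support-polygon limits are assumptions.

```python
G = 9.81  # gravitational acceleration [m/s^2]

def zmp_x(x_com, x_com_acc, z_com, z_com_acc=0.0):
    """Zero Moment Point (x-coordinate) of a single point-mass model.
    Standard cart-table approximation; the paper's own controller is not
    public, so this only illustrates how ZMP follows from the CoM state."""
    return x_com - (z_com / (G + z_com_acc)) * x_com_acc

def zmp_inside_support(zmp, heel=-0.05, toe=0.15):
    """Stability check: the ZMP must stay within the (assumed) support polygon."""
    return heel <= zmp <= toe

print(zmp_inside_support(zmp_x(x_com=0.02, x_com_acc=0.5, z_com=0.8)))
```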

  4. A Model-Driven Development Method for Management Information Systems

    Science.gov (United States)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without formal methods. With such informal methods, the MIS is developed throughout its lifecycle without any models, which causes many problems such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled with automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can respond flexibly to changes in business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. The experiment showed that the method reduces development effort by more than 30%.

  5. Research on image complexity evaluation method based on color information

    Science.gov (United States)

    Wang, Hao; Duan, Jin; Han, Xue-hui; Xiao, Bo

    2017-11-01

    In order to evaluate the complexity of a color image more effectively and to find the connection between image complexity and image information, this paper presents a method for computing image complexity based on color information. The theoretical analysis first divides complexity at the subjective level into three classes (low, medium and high complexity), then performs image feature extraction, and finally establishes a function relating the complexity value to the color characteristic model. The experimental results show that this evaluation method can objectively reconstruct image complexity from the extracted features, and the values obtained agree well with the complexity perceived by human observers, so the color-based complexity measure has reference value.
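
    The paper's actual complexity function is not reproduced in the record. The following Python sketch illustrates one simple way a colour-based complexity value can be computed (Shannon entropy of a quantized RGB histogram); the binning and the use of entropy are assumptions for illustration only.

```python
import numpy as np

def color_complexity(image, bins_per_channel=8):
    """Toy complexity score: Shannon entropy of a quantized RGB histogram.
    Higher entropy (more evenly spread colors) -> higher complexity."""
    pixels = image.reshape(-1, 3).astype(np.int64) // (256 // bins_per_channel)
    codes = (pixels[:, 0] * bins_per_channel + pixels[:, 1]) * bins_per_channel + pixels[:, 2]
    counts = np.bincount(codes, minlength=bins_per_channel ** 3)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
flat = np.full((64, 64, 3), 128, dtype=np.uint8)            # low complexity
noisy = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)   # high complexity
print(color_complexity(flat), color_complexity(noisy))
```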

  6. Information loss method to measure node similarity in networks

    Science.gov (United States)

    Li, Yongli; Luo, Peng; Wu, Chong

    2014-09-01

    Similarity measurement for network nodes has received increasing attention in the field of statistical physics. In this paper, we propose an entropy-based information loss method to measure node similarity. The model is built on the idea that treating two nodes as identical causes less information loss the more similar they are. The proposed method has relatively low algorithmic complexity, making it less time-consuming and more efficient for large-scale real-world networks. To demonstrate its applicability and accuracy, the new approach was compared with selected existing approaches on two artificial examples and on synthetic networks. Furthermore, the proposed method is successfully applied to predict network evolution and to predict unknown node attributes in two application examples.
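
    The record does not give the paper's exact loss function. As an illustration of the general idea (less information loss when two similar nodes are merged), the sketch below uses a Jensen-Shannon style entropy loss over the nodes' neighbour distributions; this particular loss is an assumption, not the authors' formula.

```python
import numpy as np

def neighbor_distribution(adj, i):
    """Normalized row of the adjacency matrix = where node i 'sends' its links."""
    row = adj[i].astype(float)
    return row / row.sum() if row.sum() else row

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def information_loss(adj, i, j):
    """Entropy increase incurred by treating nodes i and j as one node
    (Jensen-Shannon style; the paper's exact loss definition may differ)."""
    pi, pj = neighbor_distribution(adj, i), neighbor_distribution(adj, j)
    m = 0.5 * (pi + pj)
    return entropy(m) - 0.5 * (entropy(pi) + entropy(pj))

def similarity(adj, i, j):
    return 1.0 - information_loss(adj, i, j)  # smaller loss -> larger similarity

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
print(similarity(adj, 0, 1), similarity(adj, 0, 3))
```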

  7. Confirming candidate genes for longevity in Drosophila melanogaster using two different genetic backgrounds and selection methods

    DEFF Research Database (Denmark)

    Wit, Janneke; Frydenberg, Jane; Sarup, Pernille Merete

    2013-01-01

    Elucidating genes that affect life span or that can be used as biomarkers for ageing has received attention in diverse studies in recent years. Using model organisms and various approaches several genes have been linked to the longevity phenotype. For Drosophila melanogaster those studies have usually focussed on one sex and on flies originating from one genetic background, and results from different studies often do not overlap. Using D. melanogaster selected for increased longevity we aimed to find robust longevity related genes by examining gene expression in both sexes of flies originating from different genetic backgrounds. Further, we compared expression changes across three ages, when flies were young, middle aged or old, to examine how candidate gene expression changes with the onset of ageing. We selected 10 genes based on their expression differences in prior microarray studies...

  8. Direction Dependent Background Fitting for the Fermi GBM Data

    OpenAIRE

    Szécsi, Dorottya; Bagoly, Zsolt; Kóbori, József; Horváth, István; Balázs, Lajos G.

    2013-01-01

    We present a method for determining the background of Fermi GBM GRBs using the satellite positional information and a physical model. Since the polynomial fitting method typically used for GRBs is generally only indicative of the background over relatively short timescales, this method is particularly useful in the cases of long GRBs or those which have Autonomous Repoint Request (ARR) and a background with much variability on short timescales. We give a Direction Dependent Background Fitting...

  9. Hybrid methods to represent incomplete and uncertain information

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, C. [NASA Goddard Space Flight Center, Greenbelt, MD (United States)

    1996-12-31

    Decision making is cast in the semiotic context of perception, decision, and action loops. Towards the goal of properly grounding hybrid representations of information and uncertainty from this semiotic perspective, we consider the roles of and relations among the mathematical components of General Information Theory (GIT), particularly among fuzzy sets, possibility theory, probability theory, and random sets. We do so by using a clear distinction between the syntactic, mathematical formalism and the semantic domains of application of each of these fields, placing the emphasis on available measurement and action methods appropriate for each formalism, to which and from which the decision-making process flows.

  10. Background information for the SER Energy Agreement for Sustainable Growth calculations. Sectors Industry, Agriculture and Horticulture; Achtergronddocument bij doorrekening Energieakkoord. Sectoren industrie en land- en tuinbouw

    Energy Technology Data Exchange (ETDEWEB)

    Wetzels, W. [ECN Beleidsstudies, Petten (Netherlands)

    2013-09-01

    On September 4, 2013, representatives of employers' associations, trade union federations, environmental organizations, the Dutch government and civil society signed an Energy Agreement for Sustainable Growth. ECN and PBL were asked to evaluate this agreement. This report gives background information on the evaluation of the measures aimed at improving energy efficiency in industry and agriculture. [Translated from Dutch] On September 4, 2013, the 'Energy Agreement for Sustainable Growth' was signed. ECN and PBL were asked to assess and evaluate the agreement. This report serves as a background document for the assessment of the measures aimed at energy savings in industry, agriculture and horticulture.

  11. The Alaskan mineral resource assessment program; background information to accompany folio of geologic and mineral resource maps of the Ambler River Quadrangle, Alaska

    Science.gov (United States)

    Mayfield, Charles F.; Tailleur, I.L.; Albert, N.R.; Ellersieck, Inyo; Grybeck, Donald; Hackett, S.W.

    1983-01-01

    The Ambler River quadrangle, consisting of 14,290 km2 (5,520 mi2) in northwest Alaska, was investigated by an interdisciplinary research team for the purpose of assessing the mineral resource potential of the quadrangle. This report provides background information for a folio of maps on the geology, reconnaissance geochemistry, aeromagnetics, Landsat imagery, and mineral resource evaluation of the quadrangle. A summary of the geologic history, radiometric dates, and fossil localities and a comprehensive bibliography are also included. The quadrangle contains jade reserves, now being mined, and potentially significant resources of copper, zinc, lead, and silver.

  12. Botanical Dietary Supplements: Background Information

    Science.gov (United States)

    ... use a Latin name made up of the genus and species of the plant. Under this system ... form of a botanical preparation also play important roles in its safety. Teas, tinctures, and extracts have ...

  13. Zambia Country Background Report

    DEFF Research Database (Denmark)

    Hampwaye, Godfrey; Jeppesen, Søren; Kragelund, Peter

    This paper provides background data and general information for the Zambia studies focusing on the local food processing sub-sector and the local suppliers to the mines, as part of the SAFIC project (Successful African Firms and Institutional Change).

  14. Method of accounting and suppressing the instability of dosimetric information

    International Nuclear Information System (INIS)

    Fejtek, Ya.

    1977-01-01

    To account for the instability of dosimetric information, differential and integral correcting factors are proposed. The differential factor converts the signals of dosimeters irradiated during short but different periods of time into equivalent signals related to a fixed period of time, thereby excluding the effect of signal instability for short exposures. The integral factor generalizes the differential one to prolonged exposures, and a statistical integral factor is derived. An example of processing experimental data with the developed analytical method is presented. The method has been introduced in the state personal dosimetry service in Czechoslovakia. [ru

  15. Computational Methods for Physical Model Information Management: Opening the Aperture

    International Nuclear Information System (INIS)

    Moser, F.; Kirgoeze, R.; Gagne, D.; Calle, D.; Murray, J.; Crowley, J.

    2015-01-01

    The volume, velocity and diversity of data available to analysts are growing exponentially, increasing the demands on analysts to stay abreast of developments in their areas of investigation. In parallel to the growth in data, technologies have been developed to efficiently process, store, and effectively extract information suitable for the development of a knowledge base capable of supporting inferential (decision logic) reasoning over semantic spaces. These technologies and methodologies, in effect, allow for automated discovery and mapping of information to specific steps in the Physical Model (Safeguard's standard reference of the Nuclear Fuel Cycle). This paper will describe and demonstrate an integrated service under development at the IAEA that utilizes machine learning techniques, computational natural language models, Bayesian methods and semantic/ontological reasoning capabilities to process large volumes of (streaming) information and associate relevant, discovered information to the appropriate process step in the Physical Model. The paper will detail how this capability will consume open source and controlled information sources and be integrated with other capabilities within the analysis environment, and provide the basis for a semantic knowledge base suitable for hosting future mission focused applications. (author)

  16. A human-machine interface evaluation method: A difficulty evaluation method in information searching (DEMIS)

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2009-01-01

    A human-machine interface (HMI) evaluation method, named the 'difficulty evaluation method in information searching' (DEMIS), is proposed and demonstrated with an experimental study. The DEMIS is based on a human performance model and two measures of attentional-resource effectiveness in monitoring and detection tasks in nuclear power plants (NPPs). Operator competence and HMI design are modeled as the most significant factors affecting human performance. One of the two effectiveness measures is the fixation-to-importance ratio (FIR), which represents the attentional resource (eye fixations) spent on an information source compared to the importance of that information source. The other measure is selective attention effectiveness (SAE), which incorporates the FIRs for all information sources. The underlying principle of the measures is that an information source should be selectively attended to according to its informational importance. In this study, poor performance in information searching tasks is modeled to be coupled with difficulties caused by poor mental models of operators and/or poor HMI design. Human performance in information searching tasks is evaluated by analyzing the FIR and the SAE. Operator mental models are evaluated by a questionnaire-based method. Difficulties caused by a poor HMI design are then evaluated in a focused interview based on the FIR evaluation, and root causes leading to poor performance are identified in a systematic way.
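
    For concreteness, a small Python sketch of how the two measures described above could be computed is given below. The FIR follows the verbal definition in the record (share of fixations divided by normalized importance); the aggregation used for the SAE and the example information sources are assumptions, since the paper's exact formula is not reproduced here.

```python
def fir(fixations, importance):
    """Fixation-to-importance ratio per information source: share of eye
    fixations spent on a source divided by its normalized informational
    importance. Ideal selective attention gives an FIR close to 1."""
    total_fix = sum(fixations.values())
    total_imp = sum(importance.values())
    return {s: (fixations[s] / total_fix) / (importance[s] / total_imp)
            for s in fixations}

def sae(fixations, importance):
    """One possible aggregate (assumed, not the paper's exact formula):
    importance-weighted closeness of all FIRs to the ideal value 1."""
    ratios = fir(fixations, importance)
    total_imp = sum(importance.values())
    return sum((importance[s] / total_imp) * min(r, 1.0 / r)
               for s, r in ratios.items())

# Hypothetical information sources, fixation counts and importance weights.
fixations = {"pressurizer level": 40, "SG pressure": 35, "containment temp": 25}
importance = {"pressurizer level": 0.5, "SG pressure": 0.3, "containment temp": 0.2}
print(fir(fixations, importance))
print(sae(fixations, importance))  # 1.0 would mean perfectly matched attention
```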

  17. Informed consent in colonoscopy: A comparative analysis of 2 methods.

    Science.gov (United States)

    Sanguinetti, J M; Lotero Polesel, J C; Iriarte, S M; Ledesma, C; Canseco Fuentes, S E; Caro, L E

    2015-01-01

    The manner in which informed consent is obtained varies. The aim of this study is to evaluate the level of knowledge about colonoscopy and comparing 2 methods of obtaining informed consent. A comparative, cross-sectional, observational study was conducted on patients that underwent colonoscopy in a public hospital (Group A) and in a private hospital (Group B). Group A received information verbally from a physician, as well as in the form of printed material, and Group B only received printed material. A telephone survey was carried out one or 2 weeks later. The study included a total of 176 subjects (group A [n=55] and group B [n=121]). As regards education level, 69.88% (n=123) of the patients had completed university education, 23.29% (n= 41) secondary level, 5.68% (n=10) primary level, and the remaining subjects (n=2) had not completed any level of education. All (100%) of the subjects knew the characteristics of the procedure, and 99.43% were aware of its benefits. A total of 97.7% received information about complications, 93.7% named some of them, and 25% (n=44) remembered major complications. All the subjects received, read, and signed the informed consent statement before the study. There were no differences between the groups with respect to knowledge of the characteristics and benefits of the procedure, or the receipt and reading of the consent form. Group B responded better in relation to complications (P=.0027) and group A had a better recollection of the major complications (P<.0001). Group A had a higher number of affirmative answers (P<.0001). The combination of verbal and written information provides the patient with a more comprehensive level of knowledge about the procedure. Copyright © 2014 Asociación Mexicana de Gastroenterología. Published by Masson Doyma México S.A. All rights reserved.

  18. Methods for Measuring Productivity in Libraries and Information Centres

    Directory of Open Access Journals (Sweden)

    Mohammad Alaaei

    2009-04-01

    Full Text Available Within information centers, productivity is the result of optimal and effective use of information resources, improved service quality, increased user satisfaction, a pleasant working environment, and greater motivation and enthusiasm of staff, all of which contribute to the growth and development of information centers. These centers therefore need to be familiar with the methods employed in productivity measurement. Productivity is one of the criteria for evaluating system performance. In past decades, particular emphasis has been placed on the measurement and improvement of human resources, creativity, innovation and expert analysis. Identifying problems and issues and finding new means for more useful and better resource management is at the heart of productivity. Simply put, productivity is the relationship between system outputs and the inputs used to produce them. The causal relationships among the variables and factors affecting productivity are complex, and in information centers, given the large number of elements involved, increasing efficiency and productivity is essential.

  19. A method for characterization of coherent backgrounds in real time and its application in gravitational wave data analysis

    International Nuclear Information System (INIS)

    Daw, E J; Hewitson, M R

    2008-01-01

    Many experiments, and in particular gravitational wave detectors, produce continuous streams of data whose frequency representations contain discrete, relatively narrowband coherent features at high amplitude. We discuss the application of digital Fourier transforms (DFTs) to characterization of these features, hereafter frequently referred to as lines. Application of DFTs to continuously produced time-domain data is achieved through an algorithm, hereafter referred to as EFC, for efficient time-domain determination of the Fourier coefficients of a data set. We first define EFC and discuss parameters relating to the algorithm that determine its properties and action on the data. In gravitational wave interferometers, these lines are commonly due to parasitic sources of coherent background interference coupling into the instrument. Using GEO 600 data, we next demonstrate that time-domain subtraction of lines can proceed without detrimental effects either on features at frequencies separated from that of the subtracted line, or on features at the frequency of the line but having different stationarity properties
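
    The EFC algorithm itself is not reproduced in the record. The simplified Python sketch below shows the underlying idea of characterizing and removing a coherent line in the time domain: estimate the complex Fourier coefficient at the line frequency over a data block, reconstruct the line and subtract it, leaving broadband content untouched. Sampling rate, line frequency and block length are illustrative assumptions.

```python
import numpy as np

def line_coefficient(x, fs, f_line):
    """Single-frequency Fourier coefficient of the data block (a DFT at one
    bin). This is only a simplified stand-in for the EFC algorithm above."""
    n = np.arange(len(x))
    basis = np.exp(-2j * np.pi * f_line * n / fs)
    return 2.0 * np.mean(x * basis)   # complex amplitude of the coherent line

def subtract_line(x, fs, f_line):
    """Reconstruct the coherent line from its coefficient and remove it in the
    time domain, leaving content at other frequencies untouched."""
    c = line_coefficient(x, fs, f_line)
    n = np.arange(len(x))
    line = np.real(c * np.exp(2j * np.pi * f_line * n / fs))
    return x - line

fs = 1024.0
t = np.arange(4096) / fs
data = 0.1 * np.random.randn(t.size) + 5.0 * np.sin(2 * np.pi * 60.0 * t)
clean = subtract_line(data, fs, 60.0)
print(np.std(data), np.std(clean))   # the 60 Hz line dominates before removal
```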

  20. Identification of hand motion using background subtraction method and extraction of image binary with backpropagation neural network on skeleton model

    Science.gov (United States)

    Fauziah; Wibowo, E. P.; Madenda, S.; Hustinawati

    2018-03-01

    Capturing and recording human motion is mostly done for sports, health, animation films, criminal investigation, and robotic applications. This study combines background subtraction with a backpropagation neural network in order to identify and match hand movements. The acquisition process used an 8 MP camera in MP4 format, with a duration of 48 seconds at 30 frames per second; extraction of the video produced 1444 frames used in the hand motion identification process. The image processing phases are segmentation, feature extraction, and identification. Segmentation uses background subtraction, and the extracted features are used to distinguish one object from another. Feature extraction is performed with motion-based morphology analysis using the 7 invariant moments, producing four motion classes: no object, hand down, hand to side, and hands up. The identification process recognizes the hand movement using seven inputs. Testing and training with a variety of parameters showed that the architecture with one hundred hidden neurons provides the highest accuracy. This architecture propagates the input values through the implemented system into the user interface. Identification of the type of human movement achieved a highest accuracy of 98.5447%. The training process is carried out to obtain the best results.
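
    A minimal Python/OpenCV sketch of the processing chain described above (background subtraction, Hu-moment features of the binary silhouette, and a backpropagation network with 100 hidden neurons) is shown below. The subtractor parameters, training data and class labels are placeholders; the study's own data and network settings are not reproduced.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

# Background subtractor: pixels that differ from the learned background
# become foreground (the moving hand). Parameter values are assumptions.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

def hu_features(frame):
    """Segment the frame by background subtraction and describe the resulting
    binary silhouette with the 7 Hu invariant moments (log-scaled)."""
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)                      # suppress speckle noise
    hu = cv2.HuMoments(cv2.moments(mask, binaryImage=True)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # usual log scaling

# Hypothetical training data: one 7-dim feature vector per extracted frame,
# labelled 0..3 for "no object", "hand down", "hand to side", "hands up".
X_train = np.random.rand(200, 7)          # placeholder features
y_train = np.random.randint(0, 4, 200)    # placeholder labels

clf = MLPClassifier(hidden_layer_sizes=(100,), max_iter=500)  # 100 hidden units
clf.fit(X_train, y_train)

frame = np.zeros((240, 320, 3), dtype=np.uint8)  # stand-in for a video frame
print(clf.predict([hu_features(frame)]))
```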

  1. a Task-Oriented Disaster Information Correlation Method

    Science.gov (United States)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historic data, case data, simulated data, and disaster products. However, efficient data management and service have become increasingly difficult for current systems due to the variety of tasks and the heterogeneity of the data. For emergency task-oriented applications, data searches rely primarily on analysts' experience and simple metadata indices, whose high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service, with the objectives of 1) putting forward a disaster task ontology and a data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of the uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.

  2. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  3. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  4. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  5. Method for separation of a weak background signal in the presence of a strong one in Fourier spectroscopy

    International Nuclear Information System (INIS)

    Lib, Yu.N.; Zhukov, M.S.

    1985-01-01

    A method for solving the large-signal problem in NMR Fourier spectroscopy is described. Digital filtration of the large signal is carried out: from the decay of the induced signal accumulated before the moment of memory content overflow, a model interferogram corresponding only to the large signals to be removed is subtracted (the model interferogram is the result of processing an initial interferogram). Formulae and relations characterizing the accumulation-subtraction process and the minimal gain compared with the common technique with scaling are given. Experimental results confirming the efficiency of the method are presented.

  6. Method for reducing x-ray background signals from insertion device x-ray beam position monitors

    Directory of Open Access Journals (Sweden)

    Glenn Decker

    1999-11-01

    Full Text Available A method is described that provides a solution to the long-standing problem of stray radiation-induced signals on photoemission-based x-ray beam position monitors (BPMs) located on insertion device x-ray beam lines. The method involves the introduction of a chicane into the accelerator lattice that directs unwanted x radiation away from the photosensitive x-ray BPM blades. This technique has been implemented at the Advanced Photon Source, and experimental confirmation of the technique is provided.

  7. Designed Green Toolbox as built environment educating method-analytical comparison between two groups of students with different cultural background

    NARCIS (Netherlands)

    El Fiky, U.; Hamdy, I.; Fikry, M.

    2006-01-01

    This paper is concerned with evaluating and testing the application process of green architecture design strategies using a tool-box as a built environment educating method and a pre-design reminder. Understanding the suggested green design strategies, testing the tool-box effectiveness,

  8. Risk-Informed SSCs Categorization: Elicitation Method of Expert's Opinion

    International Nuclear Information System (INIS)

    Hwang, Mee Jeong; Yang, Joon Eon; Kim, Kil Yoo

    2005-01-01

    Regulation has been performed in a deterministic way since nuclear power plants began operating. However, some SSCs identified as safety-significant by the deterministic approach turned out to be of low or no safety significance, while some SSCs identified as non-safety-significant turned out to be highly safety-significant according to PSA results. Considering these risk insights, Regulatory Guide 1.174 and 10CFR50.69 were drawn up, and SSCs can now be re-categorized according to their safety significance. Study of, and interest in, risk-informed SSC re-categorization and treatment has therefore continued. The objective of this regulatory initiative is to adjust the scope of equipment subject to special regulatory treatment so as to better focus licensee and regulatory attention and resources on equipment that has safety significance. Most current regulations define the plant equipment necessary to meet the deterministic regulatory basis as 'safety-related'; this equipment is subject to special treatment regulations. Other plant equipment is categorized as 'non-safety-related' and is not subject to a select number of special treatment requirements, or to a subset of those requirements. However, risk information is not a magic tool for making decisions but a supporting tool for categorizing SSCs, because only small parts of a plant are modeled in the PSA model. Thus, engineering and deterministic judgments are also used for risk-informed SSC categorization, and expert opinion elicitation is very important for it. Therefore, a rational method to elicit experts' opinions is needed, and in this study we developed a systematic method of expert elicitation for categorizing nuclear power plant SSCs. The current state of SSC categorization in the USA and existing methods for expert elicitation were surveyed, and a more systematic way of eliciting and combining expert opinions was developed. To validate the developed method

  9. Methods and Systems for Advanced Spaceport Information Management

    Science.gov (United States)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  10. Community-based obesity prevention in Australia: Background, methods and recruitment outcomes for the evaluation of the effectiveness of OPAL (Obesity Prevention and Lifestyle)

    Directory of Open Access Journals (Sweden)

    Eva Leslie

    2015-09-01

    Full Text Available Background: The Obesity Prevention and Lifestyle (OPAL) intervention program targets families and communities to improve children’s eating and physical activity patterns. We outline the quantitative evaluation design and recruitment results for baseline data collection. Methods: A longitudinal quasi-experimental design, with baseline data collection and five-year follow-up. Participants targeted are children, parents, and school principals/directors from primary, secondary/R-12 schools, pre-schools, childcare and out-of-school-hours-care (OSHC) centers in 20 selected communities across South Australia (SA), and one in the Northern Territory (NT). A total of 277 (262 SA, 15 NT) schools participated; 4860 9-11 year olds and 1164 12-16 year olds completed a questionnaire. Anthropometric measures were taken from 5531 students; 6552 parents, 276 pre/school/childcare directors, 139 OSHC directors and 237 principals completed questionnaires. Data include measurements of child participants’ weight/height/waist circumference; paper-based/online surveys of informants in early childhood, primary/secondary school and community settings; and secondary growth check data for 4-5 year old children. Serial cross-sectional analyses will compare intervention to matched comparison communities. Results: Overall school response rate was 50%. Student response rates were 20-22% and 11-13% (questionnaires and measurements, respectively); 14-21% of parents, 49-55% of directors, and 26-44% of principals completed and returned questionnaires. Changes to child weight status; eating practices; sleep, physical activity/sedentary behaviors; physical environments; community capacity; and economic evaluation (Quality Adjusted Life Year gain) will examine program effectiveness. Conclusions: As the most significant program of its kind in Australia, OPAL will contribute to obesity prevention efforts on an international scale.

  11. Detection of a periodic structure hidden in random background: the role of signal amplitude in the matched filter detection method

    International Nuclear Information System (INIS)

    Vani, V C; Chatterjee, S

    2010-01-01

    The matched filter method for detecting a periodic structure on a surface hidden behind randomness is known to detect up to (r_0/Λ) ≥ 0.11, where r_0 is the coherence length of light on scattering from the rough part and Λ is the wavelength of the periodic part of the surface; this limit is much lower than what is allowed by conventional detection methods. The primary goal of this technique is the detection and characterization of the periodic structure hidden behind randomness without the use of any complicated experimental or computational procedures. This paper examines this detection procedure for various values of the amplitude a of the periodic part, beginning from a=0 up to small finite values of a. We thus address the importance of the following quantities: (a/λ), which scales the amplitude of the periodic part with the wavelength of light, and (r_0/Λ), in determining the detectability of the intensity peaks.
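
    As a rough illustration of the matched-filter idea discussed above, the Python sketch below correlates a noisy 1-D profile containing a weak periodic component with quadrature sinusoidal templates over candidate periods and picks the period with the largest response. This is a simplified stand-in for the authors' optical intensity analysis; the amplitudes, noise level and period grid are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# A weak periodic component of period LAMBDA buried in a much stronger
# random background (unit-variance noise).
N, LAMBDA, A = 4096, 50.0, 0.1
x = np.arange(N)
profile = A * np.sin(2 * np.pi * x / LAMBDA) + rng.normal(0.0, 1.0, N)

def matched_filter_score(signal, period):
    """Correlate the data with quadrature sinusoidal templates of the given
    period; the quadrature sum makes the score phase-independent."""
    c = np.cos(2 * np.pi * x / period)
    s = np.sin(2 * np.pi * x / period)
    return np.hypot(signal @ c, signal @ s) / len(signal)

periods = np.linspace(20, 100, 161)
scores = [matched_filter_score(profile, p) for p in periods]
print("estimated period:", periods[int(np.argmax(scores))])
```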

  12. Background information for the SER Energy Agreement for Sustainable Growth calculations. Sector Built Environment; Achtergronddocument bij doorrekening SER Energieakkoord. Sector Gebouwde omgeving

    Energy Technology Data Exchange (ETDEWEB)

    Menkveld, M.; Tigchelaar, C. [ECN Beleidsstudies, Petten (Netherlands)

    2013-09-01

    This publication is part of the support given by ECN and PBL in the development of a national energy agreement between March and September 2013, as initiated by the SER (Social and Economic Council of the Netherlands). The report gives background information on the evaluation of measures in the agreement aimed at the built environment. It is an annex to the general evaluation by PBL/ECN. [Translated from Dutch] This report was written as part of the support provided by ECN and PBL during the establishment of the energy agreement in the period from March to September 2013. It serves as background to the assessment of the measures aimed at energy savings in the built environment.

  13. Performance Comparison of Several Pre-Processing Methods in a Hand Gesture Recognition System based on Nearest Neighbor for Different Background Conditions

    Directory of Open Access Journals (Sweden)

    Iwan Setyawan

    2012-12-01

    Full Text Available This paper presents a performance analysis and comparison of several pre-processing methods used in a hand gesture recognition system. The pre-processing methods are based on the combinations of several image processing operations, namely edge detection, low pass filtering, histogram equalization, thresholding and desaturation. The hand gesture recognition system is designed to classify an input image into one of six possible classes. The input images are taken with various background conditions. Our experiments showed that the best result is achieved when the pre-processing method consists of only a desaturation operation, achieving a classification accuracy of up to 83.15%.
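
    A toy Python sketch of the best-performing configuration reported above (desaturation only, followed by nearest-neighbour classification) is given below. The desaturation here is a simple channel average, and the gallery, labels and class names are invented placeholders; the study's actual training images and six gesture classes are not reproduced.

```python
import numpy as np

def desaturate(img_rgb):
    """Desaturation step: reduce the RGB image to a single intensity channel
    (channel average; the study may use different luminance weights)."""
    return img_rgb.mean(axis=2)

def classify_1nn(query, gallery, labels):
    """Nearest-neighbour gesture classification: the query image receives the
    label of the most similar (smallest Euclidean distance) gallery image."""
    q = desaturate(query).ravel()
    dists = [np.linalg.norm(q - desaturate(g).ravel()) for g in gallery]
    return labels[int(np.argmin(dists))]

rng = np.random.default_rng(0)
gallery = [rng.integers(0, 256, (32, 32, 3)).astype(float) for _ in range(6)]
labels = ["one", "two", "three", "four", "five", "fist"]  # hypothetical classes
query = gallery[2] + rng.normal(0, 5, (32, 32, 3))         # noisy copy of class 3
print(classify_1nn(query, gallery, labels))                # expected: "three"
```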

  14. Looking for Cosmic Neutrino Background

    Directory of Open Access Journals (Sweden)

    Chiaki eYanagisawa

    2014-06-01

    Full Text Available Since the discovery of neutrino oscillation in atmospheric neutrinos by the Super-Kamiokande experiment in 1998, the study of neutrinos has been one of the exciting fields in high-energy physics. All the mixing angles have been measured. Quests for (1) measurements of the remaining parameters (the lightest neutrino mass, the CP-violating phase(s), and the sign of the mass splitting between the mass eigenstates m3 and m1), and (2) better measurements to determine whether the mixing angle θ23 is less than π/4, are in progress in a well-controlled manner. Determining the nature of neutrinos, whether they are Dirac or Majorana particles, is also in progress with continuous improvement. On the other hand, although ideas for detecting the cosmic neutrino background have been discussed since the 1960s, there has not been a serious concerted effort to achieve this goal. One of the reasons is that it is extremely difficult to detect such low-energy neutrinos from the Big Bang. While there has been a tremendous accumulation of information on the Cosmic Microwave Background since its discovery in 1965, there is no direct evidence for the Cosmic Neutrino Background. The importance of detecting the Cosmic Neutrino Background is that, although detailed studies of Big Bang Nucleosynthesis and the Cosmic Microwave Background give information on the early Universe at ~a few minutes old and ~300,000 years old, respectively, observation of the Cosmic Neutrino Background allows us to study the early Universe at ~1 sec old. This article reviews progress made in the past 50 years on detection methods for the Cosmic Neutrino Background.

  15. A review of Web information seeking research: considerations of method and foci of interest

    Directory of Open Access Journals (Sweden)

    Konstantina Martzoukou

    2005-01-01

    Full Text Available Introduction. This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background. Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of gaining direct knowledge of behaviour. User-centred research emphasises the importance of holistic approaches, which incorporate physical, cognitive, and affective elements. Problems. Comprehensive studies are limited; many approaches are problematic and a consistent methodological framework has not been developed. Research has often failed to ensure appropriate samples that ensure both quantitative validity and qualitative consistency. Typically, observation has been based on simulated rather than real information needs and most studies show little attempt to examine holistically different characteristics of users in the same research schema. Research also deals with various aspects of cognitive style and ability with variant definitions of expertise and different layers of user experience. Finally the effect of social and cultural elements has not been extensively investigated. Conclusion. The existing limitations in method and the plethora of different approaches allow little progress and fewer comparisons across studies. There is urgent need for establishing a theoretical framework on which future studies can be based so that information seeking behaviour can be more holistically understood, and results can be generalised.

  16. Seasonal changes in background levels of deuterium and oxygen-18 prove water drinking by harp seals, which affects the use of the doubly labelled water method.

    Science.gov (United States)

    Nordøy, Erling S; Lager, Anne R; Schots, Pauke C

    2017-12-01

    The aim of this study was to monitor seasonal changes in stable isotopes of pool freshwater and harp seal (Phoca groenlandica) body water, and to study whether these potential seasonal changes might bias results obtained using the doubly labelled water (DLW) method when measuring energy expenditure in animals with access to freshwater. Seasonal changes in the background levels of deuterium and oxygen-18 in the body water of four captive harp seals and in the freshwater pool in which they were kept were measured over a time period of 1 year. The seals were offered daily amounts of capelin and kept under a seasonal photoperiod of 69°N. Large seasonal variations of deuterium and oxygen-18 in the pool water were measured, and the isotope abundance in the body water showed similar seasonal changes to the pool water. This shows that the seals were continuously equilibrating with the surrounding water as a result of significant daily water drinking. Variations in background levels of deuterium and oxygen-18 in freshwater sources may be due to seasonal changes in physical processes such as precipitation and evaporation that cause fractionation of isotopes. Rapid and abrupt changes in the background levels of deuterium and oxygen-18 may complicate calculation of energy expenditure by use of the DLW method. It is therefore strongly recommended that analysis of seasonal changes in background levels of isotopes is performed before the DLW method is applied on (free-ranging) animals, and to use a control group in order to correct for changes in background levels. © 2017. Published by The Company of Biologists Ltd.

  17. Methods for using argon-39 to age-date groundwater using ultra-low-background proportional counting

    Energy Technology Data Exchange (ETDEWEB)

    Mace, Emily; Aalseth, Craig; Brandenberger, Jill; Day, Anthony; Hoppe, Eric; Humble, Paul; Keillor, Martin; Kulongoski, Justin; Overman, Cory; Panisko, Mark; Seifert, Allen; White, Signe; Wilcox Freeburg, Eric; Williams, Richard

    2017-08-01

    Argon-39 can be used as a tracer for age-dating glaciers, oceans, and more recently, groundwater. With a half-life of 269 years, 39Ar fills an intermediate age range gap (50-1,000 years) not currently covered by other common groundwater tracers. Therefore, adding this tracer to the data suite for groundwater studies provides an important tool for improving our understanding of groundwater systems. We present the methods employed for arriving at an age-date for a given sample of argon degassed from groundwater.
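
    The record does not spell out the age equation. For orientation, the sketch below applies simple closed-system radioactive decay with the 269-year half-life quoted above to convert a measured 39Ar activity (as a fraction of the modern atmospheric value) into an apparent groundwater age; corrections for underground production, mixing and degassing efficiency that a real study would need are ignored.

```python
import math

T_HALF_AR39 = 269.0  # years, as quoted above

def ar39_age(relative_activity):
    """Apparent age from the measured 39Ar activity expressed as a fraction
    of the modern atmospheric (recharge) value; simple closed-system decay."""
    if not 0.0 < relative_activity <= 1.0:
        raise ValueError("relative activity must be in (0, 1]")
    return -(T_HALF_AR39 / math.log(2)) * math.log(relative_activity)

for frac in (0.9, 0.5, 0.1):
    print(f"{frac:>4.0%} modern -> {ar39_age(frac):6.0f} years")
```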

  18. Theoretical background and implementation of the finite element method for multi-dimensional Fokker-Planck equation analysis

    Czech Academy of Sciences Publication Activity Database

    Král, Radomil; Náprstek, Jiří

    2017-01-01

    Roč. 113, November (2017), s. 54-75 ISSN 0965-9978 R&D Projects: GA ČR(CZ) GP14-34467P; GA ČR(CZ) GA15-01035S Institutional support: RVO:68378297 Keywords: Fokker-Planck equation * finite element method * simplex element * multi-dimensional problem * non-symmetric operator Subject RIV: JM - Building Engineering OBOR OECD: Mechanical engineering Impact factor: 3.000, year: 2016 https://www.sciencedirect.com/science/article/pii/S0965997817301904

  19. Vacuum expectation value of the stress tensor in an arbitrary curved background: The covariant point-separation method

    International Nuclear Information System (INIS)

    Christensen, S.M.

    1976-01-01

    A method known as covariant geodesic point separation is developed to calculate the vacuum expectation value of the stress tensor for a massive scalar field in an arbitrary gravitational field. The vacuum expectation value will diverge because the stress-tensor operator is constructed from products of field operators evaluated at the same space-time point. To remedy this problem, one of the field operators is taken to a nearby point. The resultant vacuum expectation value is finite and may be expressed in terms of the Hadamard elementary function. This function is calculated using a curved-space generalization of Schwinger's proper-time method for calculating the Feynman Green's function. The expression for the Hadamard function is written in terms of the biscalar of geodetic interval which gives a measure of the square of the geodesic distance between the separated points. Next, using a covariant expansion in terms of the tangent to the geodesic, the stress tensor may be expanded in powers of the length of the geodesic. Covariant expressions for each divergent term and for certain terms in the finite portion of the vacuum expectation value of the stress tensor are found. The properties, uses, and limitations of the results are discussed
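
    For orientation only, the standard ingredients of the point-separation construction summarized above can be written schematically as follows; these are generic textbook forms, not the specific expressions derived in the paper.

```latex
% Biscalar of geodetic interval: half the square of the geodesic distance
% between the point x and the separated point x'.
\sigma(x,x') \;=\; \tfrac{1}{2}\, s(x,x')^{2}

% Hadamard elementary function of the scalar field \phi:
G^{(1)}(x,x') \;=\; \langle 0 |\, \{\phi(x),\,\phi(x')\}\, | 0 \rangle

% Point-separated vacuum stress tensor: a second-order differential operator
% D_{\mu\nu'} built from the classical stress tensor acts on G^{(1)}, the
% coincidence limit x' -> x is taken, and the divergent terms are subtracted.
\langle T_{\mu\nu}(x) \rangle \;=\;
  \lim_{x' \to x} \tfrac{1}{2}\, \mathcal{D}_{\mu\nu'}\, G^{(1)}(x,x')
  \;-\; \text{(divergent terms)}
```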

  20. Determination of several trace elements in silicate rocks by an XRF method with background and matrix corrections

    International Nuclear Information System (INIS)

    Pascual, J.

    1987-01-01

    An X-ray fluorescence method for determining trace elements in silicate rock samples was studied. The procedure focused on the application of the pertinent matrix corrections. Either the Compton peak or the reciprocal of the mass absorption coefficient of the sample was used as internal standard for this purpose. X-ray tubes with W or Cr anodes were employed, and the W Lβ and Cr Kα Compton intensities scattered by the sample were measured. The mass absorption coefficients at both sides of the absorption edge for Fe (1.658 and 1.936 Å) were calculated. The elements Zr, Y, Rb, Zn, Ni, Cr and V were determined in 15 international reference rocks covering wide ranges of concentration. Relative mean errors were in many cases less than 10%. (author)
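
    The Compton-peak internal standardisation mentioned above can be summarised in a one-line calculation, sketched below; the count rates and the calibration factor are purely illustrative numbers, not values from the paper.

```python
def concentration(analyte_counts, compton_counts, k_factor):
    """Compton-peak internal standardisation: the analyte line intensity is
    divided by the incoherently scattered tube-line intensity, which tracks
    the sample's mass absorption, and scaled by a calibration factor obtained
    from reference materials. All values below are illustrative only."""
    return k_factor * analyte_counts / compton_counts

# Hypothetical counts for a Zr K-alpha line and the W L-beta Compton peak,
# with a calibration factor assumed to come from international reference rocks.
print(concentration(analyte_counts=15200, compton_counts=48000, k_factor=580.0),
      "ppm Zr (illustrative)")
```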

  1. A Method Validation for Determination of Gross Alpha and Gross Beta in Water Sample Using Low Background Gross Alpha/ Beta Counting System

    International Nuclear Information System (INIS)

    Zal Uyun Wan Mahmood; Norfaizal Mohamed; Nita Salina Abu Bakar

    2016-01-01

    Method validation (MV) for the measurement of gross alpha and gross beta activity in water (drinking, mineral and environmental) samples using a Low Background Gross Alpha/Beta Counting System was performed to characterize precision and accuracy and to ensure reliable results. The main objective was to ensure that both the instrument and the method always perform well and yield accurate, reliable results. In general, almost all results for the estimated RSD, z-score and U-score were reliable, recorded as ≤30 %, less than 2 and less than 1.5, respectively. The Minimum Detectable Activity (MDA) was estimated based on a counting time of 100 minutes and the present background counting values for gross alpha (0.01-0.35 cpm) and gross beta (0.50-2.18 cpm). The estimated Detection Limit (DL) was 0.1 Bq/L for gross alpha and 0.2 Bq/L for gross beta, and the expanded uncertainty was relatively small: 9.77 % for gross alpha and 10.57 % for gross beta. In line with this, the background counting for gross alpha and gross beta ranged over 0.01-0.35 cpm and 0.50-2.18 cpm, respectively, while the sample volume was set at a minimum of 500 mL and a maximum of 2000 mL. This shows that the accuracy and precision obtained with the developed method/technique are satisfactory and the method is recommended for use. It can therefore be concluded that the MV raised no doubts about the capability of the developed method. The test results showed that the method is suitable for all types of water samples containing several radionuclides and elements, as well as any impurities that interfere with the measurement of gross alpha and gross beta. (author)
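
    The record quotes detection limits but not the formula behind them. The sketch below shows how a minimum detectable activity is commonly estimated with the Currie expression from a background count rate, counting time, counting efficiency and sample volume; the efficiency and volume used here are assumptions, so the numbers only illustrate the order of magnitude.

```python
import math

def currie_mda(background_cpm, count_time_min, efficiency, volume_l):
    """Minimum Detectable Activity (Bq/L) from the Currie formula
    MDA = (2.71 + 4.65*sqrt(B)) / (eff * t * V), with B the background counts
    accumulated over the counting time. Illustrative only; the report's own
    MDA expression and calibration factors are not reproduced here."""
    b_counts = background_cpm * count_time_min
    counts_limit = 2.71 + 4.65 * math.sqrt(b_counts)
    t_seconds = count_time_min * 60.0
    return counts_limit / (efficiency * t_seconds * volume_l)

# Assumed detector efficiency of 30% and a 0.5 L sample, counted for 100 minutes.
print("alpha MDA:", currie_mda(0.35, 100, 0.30, 0.5), "Bq/L")
print("beta  MDA:", currie_mda(2.18, 100, 0.30, 0.5), "Bq/L")
```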

  2. System and Method for RFID-Enabled Information Collection

    Science.gov (United States)

    Fink, Patrick W. (Inventor); Lin, Gregory Y. (Inventor); Kennedy, Timothy F. (Inventor); Ngo, Phong H. (Inventor); Byerly, Diane (Inventor)

    2016-01-01

    Methods, apparatuses and systems for radio frequency identification (RFID)-enabled information collection are disclosed, including an enclosure, a collector coupled to the enclosure, an interrogator, a processor, and one or more RFID field sensors, each having an individual identification, disposed within the enclosure. In operation, the interrogator transmits an incident signal to the collector, causing the collector to generate an electromagnetic field within the enclosure. The electromagnetic field is affected by one or more influences. RFID sensors respond to the electromagnetic field by transmitting reflected signals containing the individual identifications of the responding RFID sensors to the interrogator. The interrogator receives the reflected signals, measures one or more returned signal strength indications ("RSSI") of the reflected signals and sends the RSSI measurements and identification of the responding RFID sensors to the processor to determine one or more facts about the influences. Other embodiments are also described.

  3. Liquid argon as active shielding and coolant for bare germanium detectors. A novel background suppression method for the GERDA 0νββ experiment

    International Nuclear Information System (INIS)

    Peiffer, J.P.

    2007-01-01

    Two of the most important open questions in particle physics are whether neutrinos are their own anti-particles (Majorana particles), as required by most extensions of the Standard Model, and the absolute values of the neutrino masses. The neutrinoless double beta (0νββ) decay, which can be investigated using 76Ge (a double beta isotope), is the most sensitive probe for these properties. There is a claim of evidence for the 0νββ decay in the Heidelberg-Moscow (HdM) 76Ge experiment by a part of the HdM collaboration. The new 76Ge experiment Gerda aims to check this claim within one year with 15 kg.y of statistics in Phase I at a background level of ≤10^-2 events/(kg.keV.y) and to go to higher sensitivity with 100 kg.y of statistics in Phase II at a background level of ≤10^-3 events/(kg.keV.y). In Gerda, bare germanium semiconductor detectors (enriched in 76Ge) will be operated in liquid argon (LAr). LAr serves as cryogenic coolant and as high-purity shielding against external background. To reach the background level for Phase II, new methods are required to suppress the cosmogenic background of the diodes. The background from cosmogenically produced 60Co is expected to be about 2.5×10^-3 events/(kg.keV.y). LAr scintillates in the UV (λ=128 nm), and a novel concept is to use this scintillation light as an anti-coincidence signal for background suppression. In this work the efficiency of such a LAr scintillation veto was investigated for the first time. In a setup with 19 kg active LAr mass, a suppression by a factor of 3 has been achieved for 60Co and a factor of 17 for 232Th around Q_ββ = 2039 keV. This suppression will further increase for a one ton active volume (factor O(100) for 232Th and 60Co). LAr scintillation can also be used as a powerful tool for background diagnostics. For this purpose a new, very stable and robust wavelength shifter/reflector combination for the light detection has been developed, leading to a photo electron (pe) yield of as much as

  4. Liquid argon as active shielding and coolant for bare germanium detectors. A novel background suppression method for the GERDA 0νββ experiment

    Energy Technology Data Exchange (ETDEWEB)

    Peiffer, J.P.

    2007-07-25

    Two of the most important open questions in particle physics are whether neutrinos are their own anti-particles (Majorana particles), as required by most extensions of the Standard Model, and the absolute values of the neutrino masses. The neutrinoless double beta (0νββ) decay, which can be investigated using 76Ge (a double beta isotope), is the most sensitive probe for these properties. There is a claim of evidence for the 0νββ decay in the Heidelberg-Moscow (HdM) 76Ge experiment by a part of the HdM collaboration. The new 76Ge experiment Gerda aims to check this claim within one year with 15 kg.y of statistics in Phase I at a background level of ≤10^-2 events/(kg.keV.y) and to go to higher sensitivity with 100 kg.y of statistics in Phase II at a background level of ≤10^-3 events/(kg.keV.y). In Gerda, bare germanium semiconductor detectors (enriched in 76Ge) will be operated in liquid argon (LAr). LAr serves as cryogenic coolant and as high-purity shielding against external background. To reach the background level for Phase II, new methods are required to suppress the cosmogenic background of the diodes. The background from cosmogenically produced 60Co is expected to be about 2.5×10^-3 events/(kg.keV.y). LAr scintillates in the UV (λ=128 nm), and a novel concept is to use this scintillation light as an anti-coincidence signal for background suppression. In this work the efficiency of such a LAr scintillation veto was investigated for the first time. In a setup with 19 kg active LAr mass, a suppression by a factor of 3 has been achieved for 60Co and a factor of 17 for 232Th around Q_ββ = 2039 keV. This suppression will further increase for a one ton active volume (factor O(100) for 232Th and 60Co). LAr scintillation can also be used as a powerful tool for background diagnostics. For this purpose a new, very stable and robust wavelength

  5. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and the optimal results barely achievable in the manual calibration; thus an automated approach is a must. We will discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of a physical "information restoration" rather than perceived image quality, it helps to reduce the set of the filter parameters to a smaller subset that is easier for a human operator to tune and achieve a better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).

  6. A new template matching method based on contour information

    Science.gov (United States)

    Cai, Huiying; Zhu, Feng; Wu, Qingxiao; Li, Sicong

    2014-11-01

    Template matching is a significant approach in machine vision due to its effectiveness and robustness. However, most template matching methods are so time-consuming that they cannot be used in many real-time applications. Closed-contour matching is a popular kind of template matching. This paper presents a new closed-contour template matching method which is suitable for two-dimensional objects. A coarse-to-fine searching strategy is used to improve the matching efficiency, and a partial computation elimination scheme is proposed to further speed up the searching process. The method consists of offline model construction and online matching. In the process of model construction, a set of triples and a distance image are obtained from the template image. A certain number of triples, each composed of three points, are created from the contour information extracted from the template image. The rule for selecting the three points is that they divide the template contour into three equal parts. The distance image is obtained by a distance transform; each point on the distance image represents the nearest distance between the current point and the points on the template contour. During matching, triples of the searching image are created with the same rule as the triples of the model. Using the similarity between triangles, which is invariant to rotation, translation and scaling, the triples corresponding to the triples of the model are found. We can then obtain the initial RST (rotation, translation and scaling) parameters mapping the searching contour to the template contour. In order to speed up the searching process, the points on the searching contour are sampled to reduce the number of triples. To verify the RST parameters, the searching contour is projected into the distance image, and the mean distance can be computed rapidly by simple operations of addition and multiplication. In the fine searching process
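
    The verification step described above (projecting the search contour into the template's distance image and averaging the distances) can be sketched with OpenCV as follows; the toy circle template is illustrative, and the triple-based coarse search that produces the candidate RST parameters is omitted.

```python
import cv2
import numpy as np

def build_distance_image(template_mask):
    """Distance image of the template: each pixel holds the distance to the
    nearest point on the template contour."""
    # [-2] keeps this working with both the OpenCV 3 and OpenCV 4 return order.
    contours = cv2.findContours(template_mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_NONE)[-2]
    edge = np.zeros_like(template_mask)
    cv2.drawContours(edge, contours, -1, 255, 1)
    dist_img = cv2.distanceTransform(255 - edge, cv2.DIST_L2, 3)
    return dist_img, contours[0]

def mean_contour_distance(dist_img, points):
    """Verification score for a candidate RST hypothesis: mean distance-image
    value over the (already transformed) search-contour points; a value near
    zero means the contours coincide."""
    h, w = dist_img.shape
    pts = np.clip(points, [0, 0], [w - 1, h - 1]).astype(int)
    return float(dist_img[pts[:, 1], pts[:, 0]].mean())

# Toy example: a filled circle as template, its own contour as "search" contour.
template = np.zeros((200, 200), np.uint8)
cv2.circle(template, (100, 100), 60, 255, -1)
dist_img, contour = build_distance_image(template)
print(mean_contour_distance(dist_img, contour.reshape(-1, 2)))  # ~0 for a match
```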

  7. A method for investigating relative timing information on phylogenetic trees.

    Science.gov (United States)

    Ford, Daniel; Matsen, Frederick A; Stadler, Tanja

    2009-04-01

    In this paper, we present a new way to describe the timing of branching events in phylogenetic trees. Our description is in terms of the relative timing of diversification events between sister clades; as such it is complementary to existing methods using lineages-through-time plots which consider diversification in aggregate. The method can be applied to look for evidence of diversification happening in lineage-specific "bursts", or the opposite, where diversification between 2 clades happens in an unusually regular fashion. In order to be able to distinguish interesting events from stochasticity, we discuss 2 classes of neutral models on trees with relative timing information and develop a statistical framework for testing these models. These model classes include both the coalescent with ancestral population size variation and global rate speciation-extinction models. We end the paper with 2 example applications: first, we show that the evolution of the hepatitis C virus deviates from the coalescent with arbitrary population size. Second, we analyze a large tree of ants, demonstrating that a period of elevated diversification rates does not appear to have occurred in a bursting manner.

  8. About application during lectures on protection of the information and information security of the method of "the round table"

    Directory of Open Access Journals (Sweden)

    Simon Zh. Simavoryan

    2011-05-01

    Full Text Available The article analyses lecturing, one of the passive methods of knowledge transfer. Experience in teaching a course on information protection and information security shows that students absorb the material better if an active method of knowledge transfer, the "round table" method, is applied during the lecture.

  9. Using information to deliver safer care: a mixed-methods study exploring general practitioners’ information needs in North West London primary care

    Directory of Open Access Journals (Sweden)

    Nikolaos Mastellos

    2014-12-01

    Full Text Available Background The National Health Service in England has given increasing priority to improving inter-professional communication, enabling better management of patients with chronic conditions and reducing medical errors through effective use of information. Despite considerable efforts to reduce patient harm through better information usage, medical errors continue to occur, posing a serious threat to patient safety. Objectives This study explores the range, quality and sophistication of existing information systems in primary care with the aim to capture what information practitioners need to provide a safe service and identify barriers to its effective use in care pathways. Method Data were collected through semi-structured interviews with general practitioners from surgeries in North West London and a survey evaluating their experience with information systems in care pathways. Results Important information is still missing, specifically discharge summaries detailing medication changes and changes in the diagnosis and management of patients, blood results ordered by hospital specialists and findings from clinical investigations. Participants identified numerous barriers, including the communication gap between primary and secondary care, the variable quality and consistency of clinical correspondence and the inadequate technological integration. Conclusion Despite attempts to improve integration and information flow in care pathways, existing systems provide practitioners with only partial access to information, hindering their ability to take informed decisions. This study offers a framework for understanding what tools should be in place to enable effective use of information in primary care.

  10. Performance Comparison of Several Pre-Processing Methods in a Hand Gesture Recognition System based on Nearest Neighbor for Different Background Conditions

    Directory of Open Access Journals (Sweden)

    Regina Lionnie

    2013-09-01

    Full Text Available This paper presents a performance analysis and comparison of several pre-processing methods used in a hand gesture recognition system. The pre-processing methods are based on combinations of several image processing operations, namely edge detection, low pass filtering, histogram equalization, thresholding and desaturation. The hand gesture recognition system is designed to classify an input image into one of six possible classes. The input images are taken with various background conditions. Our experiments showed that the best result is achieved when the pre-processing method consists of only a desaturation operation, achieving a classification accuracy of up to 83.15%.
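
    A rough sketch of the best-performing pipeline reported above (desaturation followed by a nearest-neighbour rule); the grey-conversion weights, image shapes and toy data are assumptions, not the authors' dataset.

        import numpy as np

        def desaturate(rgb):
            # Luminance-style grey conversion: the "desaturation" pre-processing step.
            return rgb @ np.array([0.299, 0.587, 0.114])

        def predict(train_imgs, train_labels, query_rgb):
            # 1-nearest-neighbour over flattened grey images (Euclidean distance).
            q = desaturate(query_rgb).ravel()
            feats = np.stack([desaturate(im).ravel() for im in train_imgs])
            return train_labels[int(np.argmin(np.linalg.norm(feats - q, axis=1)))]

        # Toy usage: six random "gesture classes", one training image per class.
        rng = np.random.default_rng(0)
        train = [rng.random((32, 32, 3)) for _ in range(6)]
        labels = np.arange(6)
        print(predict(train, labels, train[3] + 0.01 * rng.random((32, 32, 3))))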

  11. Information systems project management: methods, tools, and techniques

    OpenAIRE

    Mcmanus, John; Wood-Harper, Trevor

    2004-01-01

    Information Systems Project Management offers a clear and logical exposition of how to plan, organise and monitor projects effectively in order to deliver quality information systems within time, to budget and quality. This new book by John McManus and Trevor Wood-Harper is suitable for upper level undergraduates and postgraduates studying project management and Information Systems. Practising managers will also find it to be a valuable tool in their work. Managing information systems pro...

  12. Agent-based method for distributed clustering of textual information

    Science.gov (United States)

    Potok, Thomas E [Oak Ridge, TN; Reed, Joel W [Knoxville, TN; Elmore, Mark T [Oak Ridge, TN; Treadwell, Jim N [Louisville, TN

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches for stored documents according to a search query having at least one term and identifying the documents found in the search, and displays the documents in a clustering display (80) of similarity so as to indicate similarity of the documents to each other.
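
    A single-process, much-simplified sketch of the routing idea (the patented system distributes this work across multiplexing, master cluster and cluster agents): a new document vector is compared with each cluster's centroid and is either handed to the most similar cluster or starts a new one. The similarity threshold and the vector source are assumptions.

        import numpy as np

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def route(doc_vec, clusters, threshold=0.3):
            # clusters: list of dicts holding a running centroid and the member vectors.
            if clusters:
                sims = [cosine(doc_vec, c["centroid"]) for c in clusters]
                best = int(np.argmax(sims))
                if sims[best] >= threshold:
                    clusters[best]["members"].append(doc_vec)
                    clusters[best]["centroid"] = np.mean(clusters[best]["members"], axis=0)
                    return best
            clusters.append({"centroid": doc_vec, "members": [doc_vec]})
            return len(clusters) - 1

        # Toy usage with three bag-of-words style vectors.
        clusters = []
        for v in [np.array([1.0, 0, 0]), np.array([0.9, 0.1, 0]), np.array([0, 0, 1.0])]:
            print(route(v, clusters))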

  13. Effect of sumatriptan on cerebral blood flow during migraine headache. Measurement by sequential SPECT used 99mTc-ECD background subtraction method

    International Nuclear Information System (INIS)

    Ueda, Takashi; Torihara, Yoshito; Tsuneyoshi, Noritaka; Ikeda, Yoshitomo

    2001-01-01

    The present study was designed to examine the effect of sumatriptan on regional cerebral blood flow (CBF) during migraine headache. Nine cases were examined by the 99mTc-ECD background subtraction method for absolute measurement of regional CBF before and after sumatriptan injection. rCBF, except in the occipital and perioccipital lobes, was increased by 10-20% during migraine headache, and significant decreases were observed after sumatriptan injection. Two of the nine cases had transiently increased systemic blood pressure and cardiac pulse rate; however, the migraine headache improved in all cases after injection of sumatriptan. (author)

  14. Radiative Improvement of the Lattice Nonrelativistic QCD Action Using the Background Field Method and Application to the Hyperfine Splitting of Quarkonium States

    International Nuclear Information System (INIS)

    Hammant, T. C.; Horgan, R. R.; Monahan, C. J.; Hart, A. G.; Hippel, G. M. von

    2011-01-01

    We present the first application of the background field method to nonrelativistic QCD (NRQCD) on the lattice in order to determine the one-loop radiative corrections to the coefficients of the NRQCD action in a manifestly gauge-covariant manner. The coefficients of the σ·B term in the NRQCD action and the four-fermion spin-spin interaction are computed at the one-loop level; the resulting shift of the hyperfine splitting of bottomonium is found to bring the lattice predictions in line with experiment.

  15. Effect of sumatriptan on cerebral blood flow during migraine headache. Measurement by sequential SPECT used {sup 99m}Tc-ECD background subtraction method

    Energy Technology Data Exchange (ETDEWEB)

    Ueda, Takashi; Torihara, Yoshito; Tsuneyoshi, Noritaka; Ikeda, Yoshitomo [Miyazaki Social Insurance Hospital (Japan)

    2001-07-01

    The present study was designed to examine the effect of sumatriptan on regional cerebral blood flow (CBF) during migraine headache. Nine cases were examined by the 99mTc-ECD background subtraction method for absolute measurement of regional CBF before and after sumatriptan injection. rCBF, except in the occipital and perioccipital lobes, was increased by 10-20% during migraine headache, and significant decreases were observed after sumatriptan injection. Two of the nine cases had transiently increased systemic blood pressure and cardiac pulse rate; however, the migraine headache improved in all cases after injection of sumatriptan. (author)

  16. Identification of source velocities on 3D structures in non-anechoic environments: Theoretical background and experimental validation of the inverse patch transfer functions method

    Science.gov (United States)

    Aucejo, M.; Totaro, N.; Guyader, J.-L.

    2010-08-01

    In noise control, identification of the source velocity field remains a major problem open to investigation. Consequently, methods such as nearfield acoustical holography (NAH), principal source projection, the inverse frequency response function and hybrid NAH have been developed. However, these methods require free field conditions that are often difficult to achieve in practice. This article presents an alternative method known as inverse patch transfer functions, designed to identify source velocities and developed in the framework of the European SILENCE project. This method is based on the definition of a virtual cavity, the double measurement of the pressure and particle velocity fields on the aperture surfaces of this volume (divided into elementary areas called patches), and the inversion of impedance matrices numerically computed from a modal basis obtained by FEM. Theoretically, the method is applicable to sources with complex 3D geometries, and measurements can be carried out in a non-anechoic environment even in the presence of other stationary sources outside the virtual cavity. In the present paper, the theoretical background of the iPTF method is described and the results (numerical and experimental) for a source with simple geometry (two baffled pistons driven in antiphase) are presented and discussed.
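
    A schematic of the inversion at the core of such a method: if Z is a (numerically computed) impedance matrix relating patch-averaged source velocities to the pressures measured on the virtual-cavity aperture, the velocities follow from a regularised least-squares inverse. The matrix, data and regularisation weight below are placeholders, not values from the paper.

        import numpy as np

        def identify_velocities(Z, p, alpha=1e-3):
            # Tikhonov-regularised solution of Z v = p (complex-valued, one frequency line).
            ZhZ = Z.conj().T @ Z
            return np.linalg.solve(ZhZ + alpha * np.eye(Z.shape[1]), Z.conj().T @ p)

        # Toy usage: a random well-conditioned "impedance matrix" and a synthetic velocity field.
        rng = np.random.default_rng(0)
        Z = rng.normal(size=(40, 20)) + 1j * rng.normal(size=(40, 20))
        v_true = rng.normal(size=20)
        p = Z @ v_true
        print(np.allclose(identify_velocities(Z, p, alpha=1e-8), v_true, atol=1e-4))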

  17. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  18. Using the Work System Method with Freshman Information Systems Students

    Science.gov (United States)

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  19. A new method based on low background instrumental neutron activation analysis for major, trace and ultra-trace element determination in atmospheric mineral dust from polar ice cores

    Energy Technology Data Exchange (ETDEWEB)

    Baccolo, Giovanni, E-mail: giovanni.baccolo@mib.infn.it [Graduate School in Polar Sciences, University of Siena, Via Laterina 8, 53100, Siena (Italy); Department of Environmental Sciences, University of Milano-Bicocca, P.zza della Scienza 1, 20126, Milano (Italy); INFN, Section of Milano-Bicocca, P.zza della Scienza 3, 20126, Milano (Italy); Clemenza, Massimiliano [INFN, Section of Milano-Bicocca, P.zza della Scienza 3, 20126, Milano (Italy); Department of Physics, University of Milano-Bicocca, P.zza della Scienza 3, 20126, Milano (Italy); Delmonte, Barbara [Department of Environmental Sciences, University of Milano-Bicocca, P.zza della Scienza 1, 20126, Milano (Italy); Maffezzoli, Niccolò [Centre for Ice and Climate, Niels Bohr Institute, Juliane Maries Vej, 30, 2100, Copenhagen (Denmark); Nastasi, Massimiliano; Previtali, Ezio [INFN, Section of Milano-Bicocca, P.zza della Scienza 3, 20126, Milano (Italy); Department of Physics, University of Milano-Bicocca, P.zza della Scienza 3, 20126, Milano (Italy); Prata, Michele; Salvini, Andrea [LENA, University of Pavia, Pavia (Italy); Maggi, Valter [Department of Environmental Sciences, University of Milano-Bicocca, P.zza della Scienza 1, 20126, Milano (Italy); INFN, Section of Milano-Bicocca, P.zza della Scienza 3, 20126, Milano (Italy)

    2016-05-30

    Dust found in polar ice core samples is present at extremely low concentrations, and the availability of such samples is usually strictly limited. For these reasons the chemical and physical analysis of polar ice cores is an analytical challenge. In this work a new method based on low background instrumental neutron activation analysis (LB-INAA) for the multi-elemental characterization of the insoluble fraction of dust from polar ice cores is presented. Thanks to careful selection of the most appropriate materials and procedures it was possible to reach unprecedented analytical performance, suitable for ice core analyses. The method was applied to Antarctic ice core samples. Five samples of atmospheric dust (μg size) from ice sections of the Antarctic Talos Dome ice core were prepared and analyzed. A set of 37 elements was quantified, spanning from all the major elements (Na, Mg, Al, Si, K, Ca, Ti, Mn and Fe) to trace ones, including 10 (La, Ce, Nd, Sm, Eu, Tb, Ho, Tm, Yb and Lu) of the 14 naturally occurring lanthanides. The detection limits are in the range of 10⁻¹³–10⁻⁶ g, improving on previous results by 1–3 orders of magnitude depending on the element; uncertainties lie between 4% and 60%. - Highlights: • A new method based on neutron activation for the multi-elemental characterization of atmospheric dust entrapped in polar ice cores is proposed. • 37 elements were quantified in μg size dust samples with detection limits ranging from 10⁻¹³ to 10⁻⁶ g. • A low background approach and a clean analytical protocol improved INAA performance to unprecedented levels for multi-elemental analyses.

  20. Improve Biomedical Information Retrieval using Modified Learning to Rank Methods.

    Science.gov (United States)

    Xu, Bo; Lin, Hongfei; Lin, Yuan; Ma, Yunlong; Yang, Liang; Wang, Jian; Yang, Zhihao

    2016-06-14

    In recent years, the number of biomedical articles has increased exponentially, making it difficult for biologists to capture all the needed information manually. Information retrieval technologies, as the core of search engines, can deal with the problem automatically, providing users with the needed information. However, it is a great challenge to apply these technologies directly to biomedical retrieval because of the abundance of domain-specific terminologies. To enhance biomedical retrieval, we propose a novel framework based on learning to rank. Learning to rank refers to a family of state-of-the-art information retrieval techniques that have been proven effective in many information retrieval tasks. In the proposed framework, we attempt to tackle the problem of the abundance of terminologies by constructing ranking models which focus not only on retrieving the most relevant documents, but also on diversifying the search results to increase the completeness of the resulting list for a given query. In the model training, we propose two novel document labeling strategies, and combine several traditional retrieval models as learning features. Besides, we also investigate the usefulness of different learning to rank approaches in our framework. Experimental results on TREC Genomics datasets demonstrate the effectiveness of our framework for biomedical information retrieval.
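
    A toy pairwise learning-to-rank sketch in the spirit of the framework described above (not the authors' models or features): documents for a query become feature-difference pairs and a linear classifier learns to order them. The features, relevance labels and scoring rule are synthetic placeholders.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def make_pairs(X, y):
            # For every (more relevant, less relevant) document pair, emit the feature
            # difference labelled 1 and the reversed difference labelled 0.
            Xp, yp = [], []
            for i in range(len(y)):
                for j in range(len(y)):
                    if y[i] > y[j]:
                        Xp.append(X[i] - X[j]); yp.append(1)
                        Xp.append(X[j] - X[i]); yp.append(0)
            return np.array(Xp), np.array(yp)

        rng = np.random.default_rng(1)
        X = rng.normal(size=(20, 5))                               # scores from several base retrieval models
        y = (X[:, 0] + 0.1 * rng.normal(size=20) > 0).astype(int)  # fake relevance judgements
        ranker = LogisticRegression().fit(*make_pairs(X, y))
        order = np.argsort(-(X @ ranker.coef_.ravel()))            # higher score = ranked earlier
        print(order[:5])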

  1. METHODS FOR ASSESSING SECURITY THREATS CONFIDENTIAL INFORMATION FOR THE INFORMATION AND TELECOMMUNICATIONS SYSTEMS

    Directory of Open Access Journals (Sweden)

    E. V. Belokurova

    2015-01-01

    Full Text Available The article discusses different approaches to assessing the security of confidential information in information and telecommunication systems of various purposes in the presence of internal and external threats to its integrity and availability. Ensuring the security of confidential information in information and telecommunication systems against external and internal threats is currently of particular relevance. This problem is confirmed by the analysis of available statistical information on the impact of threats on the security of information circulating in information and telecommunication systems. Leaks of confidential information, intellectual property and know-how result in significant material and moral damage to the owner of the restricted information. The paper presents the structure of the indicators and criteria and shows that analytical criteria are the most promising. However, their use to assess the level of security of confidential information is difficult due to the lack of appropriate mathematical models. The complexity of the problem is that existing traditional mathematical models are not always appropriate for the stated objectives. Therefore, it is necessary to develop mathematical models designed to assess the security of confidential information and the impact of threats on the information and telecommunication system.

  2. Traffic Information Unit, Traffic Information System, Vehicle Management System, Vehicle, and Method of Controlling a Vehicle

    NARCIS (Netherlands)

    Papp, Z.; Doodeman, G.J.N.; Nelisse, M.W.; Sijs, J.; Theeuwes, J.A.C.; Driessen, B.J.F.

    2010-01-01

    A traffic information unit (MD1, MD2, MD3) according to the invention comprises a facility (MI) for tracking vehicle state information of individual vehicles present at a traffic infrastructure and a facility (T) for transmitting said vehicle state information to a vehicle (70B, 70E). A traffic

  3. The natural radiation background

    International Nuclear Information System (INIS)

    Duggleby, J.C.

    1982-01-01

    The components of the natural background radiation and their variations are described. Cosmic radiation is a major contributor to the external dose to the human body whilst naturally-occurring radionuclides of primordial and cosmogenic origin contribute to both the external and internal doses, with the primordial radionuclides being the major contributor in both cases. Man has continually modified the radiation dose to which he has been subjected. The two traditional methods of measuring background radiation, ionisation chamber measurements and scintillation counting, are looked at and the prospect of using thermoluminescent dosimetry is considered

  4. A New Method to Measure the Post-reionization Ionizing Background from the Joint Distribution of Lyα and Lyβ Forest Transmission

    Science.gov (United States)

    Davies, Frederick B.; Hennawi, Joseph F.; Eilers, Anna-Christina; Lukić, Zarija

    2018-03-01

    The amplitude of the ionizing background that pervades the intergalactic medium (IGM) at the end of the epoch of reionization provides a valuable constraint on the emissivity of the sources that reionized the universe. While measurements of the ionizing background at lower redshifts rely on a simulation-calibrated mapping between the photoionization rate and the mean transmission of the Lyα forest, at z ≳ 6 the IGM becomes increasingly opaque and transmission arises solely in narrow spikes separated by saturated Gunn–Peterson troughs. In this regime, the traditional approach of measuring the average transmission over large ∼50 Mpc/h regions is less sensitive and suboptimal. In addition, the five times smaller oscillator strength of the Lyβ transition implies that the Lyβ forest is considerably more transparent at z ≳ 6, even in the presence of contamination by foreground z ∼ 5 Lyα forest absorption. In this work we present a novel statistical approach to analyze the joint distribution of transmission spikes in the cospatial z ∼ 6 Lyα and Lyβ forests. Our method relies on approximate Bayesian computation (ABC), which circumvents the necessity of computing the intractable likelihood function describing the highly correlated Lyα and Lyβ transmission. We apply ABC to mock data generated from a large-volume hydrodynamical simulation combined with a state-of-the-art model of ionizing background fluctuations in the post-reionization IGM and show that it is sensitive to higher IGM neutral hydrogen fractions than previous techniques. As a proof of concept, we apply this methodology to a real spectrum of a z = 6.54 quasar and measure the ionizing background from 5.4 ≤ z ≤ 6.4 along this sightline with ∼0.2 dex statistical uncertainties. Some of the data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the
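
    Bare-bones rejection ABC, shown only to illustrate the inference strategy (the actual analysis uses a hydrodynamical simulation and joint Lyα/Lyβ transmission-spike statistics): the forward model, summary statistic, tolerance and "observed" value below are invented placeholders.

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_summary(gamma):
            # Placeholder forward model: a transmission summary statistic for a given
            # ionizing-background amplitude, plus stochastic scatter from the "simulation".
            return np.exp(-1.0 / gamma) + 0.01 * rng.normal()

        observed = 0.37                                # pretend measured summary statistic
        prior = rng.uniform(0.1, 5.0, 20000)           # prior draws of the background amplitude
        accepted = [g for g in prior if abs(simulate_summary(g) - observed) < 0.01]
        print(len(accepted), np.percentile(accepted, [16, 50, 84]))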

  5. Validation of a new background discrimination method for the TACTIC TeV γ-ray telescope with Markarian 421 data

    International Nuclear Information System (INIS)

    Sharma, Mradul; Nayak, J.; Koul, M.K.; Bose, S.; Mitra, Abhas; Dhar, V.K.; Tickoo, A.K.; Koul, R.

    2015-01-01

    This paper describes the validation of a new background discrimination method based on the Random Forest technique by re-analysing the Markarian 421 (Mrk 421) observations performed by the TACTIC (TeV Atmospheric Cherenkov Telescope with Imaging Camera) γ-ray telescope. The Random Forest technique is a flexible multivariate method which combines Bagging and Random Split Selection to construct a large collection of decision trees and then combines them into a common classifier. Markarian 421 in a high state was observed by TACTIC during December 07, 2005–April 30, 2006 for 202 h. Previous analysis of these data led to a detection of flaring activity from the source at energies >1 TeV. Within this data set, a spell of 97 h revealed strong detection of a γ-ray signal with a daily flux of >1 Crab unit on several days. Here we re-analyze this spell as well as the data from the entire observation period with the Random Forest method. Application of this method led to an improvement in the signal detection strength by ∼26%, along with a ∼18% increase in detected γ rays compared to the conventional Dynamic Supercuts method. The resultant differential spectrum is represented by a power law with an exponential cutoff, Γ = −2.51±0.10 and E₀ = 4.71±2.20 TeV. Such a spectrum is consistent with previously reported results and justifies the use of the Random Forest method for analyzing data from atmospheric Cherenkov telescopes
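
    A generic sketch of Random Forest based gamma/hadron separation in the spirit described above; the shower-image ("Hillas-style") parameters, their distributions and the probability cut are synthetic assumptions, not TACTIC data or the authors' configuration.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 2000
        # Fake image parameters per event: length, width, alpha (deg), log(size), distance.
        gammas = rng.normal([0.25, 0.12, 5.0, 3.0, 0.5], [0.05, 0.03, 3.0, 0.4, 0.10], (n, 5))
        hadrons = rng.normal([0.40, 0.25, 45.0, 2.8, 0.6], [0.10, 0.08, 25.0, 0.5, 0.15], (n, 5))
        X = np.vstack([gammas, hadrons])
        y = np.r_[np.ones(n), np.zeros(n)]

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        # Keep events with a high gamma probability; in practice the cut would be tuned on
        # simulations to balance signal retention against background rejection.
        gamma_prob = clf.predict_proba(X)[:, 1]
        print("fraction passing the cut:", (gamma_prob > 0.8).mean())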

  6. Validation of a new background discrimination method for the TACTIC TeV γ-ray telescope with Markarian 421 data

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Mradul, E-mail: mradul@barc.gov.in [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai (India); Nayak, J. [The Bayesian and Interdisciplinary Research Unit, Indian Statistical Institute, Kolkata (India); Koul, M.K. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai (India); Bose, S. [The Bayesian and Interdisciplinary Research Unit, Indian Statistical Institute, Kolkata (India); Mitra, Abhas; Dhar, V.K.; Tickoo, A.K.; Koul, R. [Astrophysical Sciences Division, Bhabha Atomic Research Centre, Mumbai (India)

    2015-01-11

    This paper describes the validation of a new background discrimination method based on Random Forest technique by re-analysing the Markarian 421 (Mrk 421) observations performed by the TACTIC (TeV Atmospheric Cherenkov Telescope with Imaging Camera) γ-ray telescope. The Random Forest technique is a flexible multivariate method which combines Bagging and Random Split Selection to construct a large collection of decision trees and then combines them to construct a common classifier. Markarian 421 in a high state was observed by TACTIC during December 07, 2005–April 30, 2006 for 202 h. Previous analysis of this data led to a detection of flaring activity from the source at Energy >1TeV. Within this data set, a spell of 97 h revealed strong detection of a γ-ray signal with daily flux of >1 Crab unit on several days. Here we re-analyze this spell as well as the data from the entire observation period with the Random Forest method. Application of this method led to an improvement in the signal detection strength by ∼26% along with a ∼18% increase in detected γ rays compared to the conventional Dynamic Supercuts method. The resultant differential spectrum obtained is represented by a power law with an exponential cut off Γ=−2.51±0.10 and E{sub 0}=4.71±2.20TeV. Such a spectrum is consistent with previously reported results and justifies the use of Random Forest method for analyzing data from atmospheric Cherenkov telescopes.

  7. Hybridization of the probability perturbation method with gradient information

    DEFF Research Database (Denmark)

    Johansen, Kent; Caers, J.; Suzuki, S.

    2007-01-01

    Geostatistically based history-matching methods make it possible to devise history-matching strategies that will honor geologic knowledge about the reservoir. However, the performance of these methods is known to be impeded by slow convergence rates resulting from the stochastic nature of the alg...

  8. Evaluation of the impact of reducing national emissions of SO2 and metals in Poland on background pollution using a bioindication method.

    Science.gov (United States)

    Dmuchowski, Wojciech; Gozdowski, Dariusz; Baczewska-Dąbrowska, Aneta H; Dąbrowski, Piotr; Gworek, Barbara; Suwara, Irena

    2018-01-01

    Changes in environmental pollution by S, Cd, Cu, Pb and Zn in 2006-2014 were evaluated using a bioindication method. This method was based on measurements of pollutants in Scots pine (Pinus sylvestris L.) needles. The measurements were performed in the Chojnowskie Forests, a region recognized as a background area for central Poland. The changes in the contents of sulfur (S) and metals in needles were not comparable with the changes in the global emissions of the pollutants in Poland. On average, the pollution level in the study area decreased by 9.9% for S, 61.4% for Pb, 22.5% for Cd, 11.7% for Zn and 10.4% for Cu. During the same period, global emissions in Poland decreased by 38.1% for S, 8.0% for Pb, 63.2% for Cd, 11.7% for Zn and 14.0% for Cu. Therefore, the differences in the changes in emissions and the needle contents of each element should be examined separately which was not a goal of this study. However, the discrepancy between these results did not prevent the use of bioindication methods. Evaluation of pollutant contents in plants reflected their incorporation in biological processes rather than air or soil pollution levels.

  9. Information entropy method and the description of echo hologram formation in gaseous media

    Science.gov (United States)

    Garnaeva, G. I.; Nefediev, L. A.; Akhmedshina, E. N.

    2018-02-01

    The effect of collisions that change the velocity of gas particles on the value of information entropy is related to the spectral structure of the echo hologram's response, whose temporal form is considered here. It is shown that collisions with a change in gas particle velocity increase the 'parasitic' information, against the background of which the information contained in the temporal shape of the object laser pulse is lost.

  10. Information Geometry, Inference Methods and Chaotic Energy Levels Statistics

    OpenAIRE

    Cafaro, Carlo

    2008-01-01

    In this Letter, we propose a novel information-geometric characterization of chaotic (integrable) energy level statistics of a quantum antiferromagnetic Ising spin chain in a tilted (transverse) external magnetic field. Finally, we conjecture our results might find some potential physical applications in quantum energy level statistics.

  11. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcoming this gap, this paper describes the implementation of two BIM based tools, the first, to support the activities at an

  12. Statistical methods of combining information: Applications to sensor data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.
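
    Purely as a concrete illustration of combining information from several sources (one of the simplest textbook rules, not necessarily one highlighted in the paper): independent sensor readings of the same quantity fused by inverse-variance weighting. The readings and variances are made up.

        import numpy as np

        readings = np.array([10.2, 9.8, 10.5])    # three sensors observing the same quantity
        variances = np.array([0.4, 0.1, 0.9])     # their (assumed known) error variances

        weights = 1.0 / variances
        fused = np.sum(weights * readings) / np.sum(weights)
        fused_var = 1.0 / np.sum(weights)
        print(f"fused estimate {fused:.3f} with variance {fused_var:.3f}")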

  13. A multi-method approach to evaluate health information systems.

    Science.gov (United States)

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is a challenge to weigh the contribution of various factors and differentiate the critical ones. A conceptual framework will be helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate. This may in turn lead to inadequate interpretation of the phenomena under study. Based on comprehensive literature research and the author's own practice of evaluating health information systems, the author proposes a multi-method approach that incorporates both quantitative and qualitative measurement and is centered around the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of HIS and its impact, and provide comprehensive and accurate explanations of the causal relationships among the different factors. This approach will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  14. Sharing information: Mixed-methods investigation of brief experiential interprofessional

    Science.gov (United States)

    Cocksedge, Simon; Barr, Nicky; Deakin, Corinne

    In UK health policy, ‘sharing good information’ is pivotal to improving care quality, safety, and effectiveness. Nevertheless, educators often neglect this vital communication skill. The consequences of brief communication education interventions for healthcare workers are not yet established. This study investigated a three-hour interprofessional experiential workshop (group work, theoretical input, rehearsal) training healthcare staff in sharing information using a clear structure (PARSLEY). Staff in one UK hospital participated. Questionnaires were completed before, immediately after, and eight weeks after training, with semistructured interviews seven weeks after training. Participants (n=76) were from assorted healthcare occupations (26% non-clinical). Knowledge significantly increased immediately after training. Self-efficacy, outcome expectancy, and motivation to use the structure taught were significantly increased immediately following training and at eight weeks. Respondents at eight weeks (n=35) reported their practice in sharing information had changed within seven days of training. Seven weeks after training, most interviewees (n=13) reported confidently using the PARSLEY structure regularly in varied settings. All had re-evaluated their communication practice. Brief training altered self-reported communication behaviour of healthcare staff, with sustained changes in everyday work. As sharing information is central to communication curricula, health policy, and shared decision-making, the effectiveness of brief teaching interventions has economic and educational implications.

  15. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  16. Imaging systems and methods for obtaining and using biometric information

    Science.gov (United States)

    McMakin, Douglas L [Richland, WA; Kennedy, Mike O [Richland, WA

    2010-11-30

    Disclosed herein are exemplary embodiments of imaging systems and methods of using such systems. In one exemplary embodiment, one or more direct images of the body of a clothed subject are received, and a motion signature is determined from the one or more images. In this embodiment, the one or more images show movement of the body of the subject over time, and the motion signature is associated with the movement of the subject's body. In certain implementations, the subject can be identified based at least in part on the motion signature. Imaging systems for performing any of the disclosed methods are also disclosed herein. Furthermore, the disclosed imaging, rendering, and analysis methods can be implemented, at least in part, as one or more computer-readable media comprising computer-executable instructions for causing a computer to perform the respective methods.

  17. A Synchronisation Method For Informed Spread-Spectrum Audiowatermarking

    OpenAIRE

    Pierre-Yves Fulchiron; Barry O'Donovan; Guenole Silvestre; Neil Hurley

    2003-01-01

    Under perfect synchronisation conditions, watermarking schemes employing asymmetric spread-spectrum techniques are suitable for copy-protection of audio signals. This paper proposes to combine the use of a robust psychoacoustic projection for the extraction of a watermark feature vector along with non-linear detection functions optimised with side-information. The new proposed scheme benefits from an increased level of security through the use of asymmetric detectors. We apply this scheme to ...

  18. Fast Reduction Method in Dominance-Based Information Systems

    Science.gov (United States)

    Li, Yan; Zhou, Qinghua; Wen, Yongchuan

    2018-01-01

    In real world applications, there are often data with continuous values or preference-ordered values. Rough sets based on dominance relations can effectively deal with these kinds of data. Attribute reduction can be done in the framework of the dominance-relation based approach to better extract decision rules. However, the computational cost of the dominance classes greatly affects the efficiency of attribute reduction and rule extraction. This paper presents an efficient method of computing dominance classes, and further compares it with the traditional method as the numbers of attributes and samples increase. Experiments on UCI data sets show that the proposed algorithm obviously improves the efficiency of the traditional method, especially for large-scale data.
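
    An illustrative, naive computation of the dominating sets that dominance-based rough set approaches work with; the paper's contribution is a faster way to obtain them, which is not reproduced here. The decision table below is a made-up example with preference-ordered condition attributes.

        import numpy as np

        def dominating_sets(table):
            # For each object, collect the objects that are at least as good on every attribute.
            return [np.where(np.all(table >= row, axis=1))[0] for row in table]

        table = np.array([[3, 2, 1],
                          [2, 2, 1],
                          [3, 3, 2],
                          [1, 1, 1]])
        for i, dom in enumerate(dominating_sets(table)):
            print(f"objects dominating object {i}: {dom.tolist()}")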

  19. Stabilization of the Lattice Boltzmann Method Using Information Theory

    OpenAIRE

    Wilson, Tyler L; Pugh, Mary; Dawson, Francis

    2018-01-01

    A novel Lattice Boltzmann method is derived using the Principle of Minimum Cross Entropy (MinxEnt) via the minimization of Kullback-Leibler Divergence (KLD). By carrying out the actual single step Newton-Raphson minimization (MinxEnt-LBM) a more accurate and stable Lattice Boltzmann Method can be implemented. To demonstrate this, 1D shock tube and 2D lid-driven cavity flow simulations are carried out and compared to Single Relaxation Time LBM, Two Relaxation Time LBM, Multiple Relaxation Time...

  20. A Method to Measure the Amount of Battlefield Situation Information

    Science.gov (United States)

    2014-06-01

  1. A Synchronisation Method For Informed Spread-Spectrum Audiowatermarking

    Directory of Open Access Journals (Sweden)

    Pierre-Yves Fulchiron

    2003-12-01

    Full Text Available Under perfect synchronisation conditions, watermarking schemes employing asymmetric spread-spectrum techniques are suitable for copy-protection of audio signals. This paper proposes to combine the use of a robust psychoacoustic projection for the extraction of a watermark feature vector along with non-linear detection functions optimised with side-information. The new proposed scheme benefits from an increased level of security through the use of asymmetric detectors. We apply this scheme to real audio signals and experimental results show an increased robustness to desynchronisation attacks such as random cropping.

  2. Where is information quality lost at clinical level? A mixed-method study on information systems and data quality in three urban Kenyan ANC clinics

    Directory of Open Access Journals (Sweden)

    Daniel Hahn

    2013-08-01

    Full Text Available Background: Well-working health information systems are considered vital with the quality of health data ranked of highest importance for decision making at patient care and policy levels. In particular, health facilities play an important role, since they are not only the entry point for the national health information system but also use health data (and primarily) for patient care. Design: A multiple case study was carried out between March and August 2012 at the antenatal care (ANC) clinics of two private and one public Kenyan hospital to describe clinical information systems and assess the quality of information. The following methods were developed and employed in an iterative process: workplace walkthroughs, structured and in-depth interviews with staff members, and a quantitative assessment of data quality (completeness and accurate transmission of clinical information and reports in ANC). Views of staff and management on the quality of employed information systems, data quality, and influencing factors were captured qualitatively. Results: Staff rated the quality of information higher in the private hospitals employing computers than in the public hospital which relies on paper forms. Several potential threats to data quality were reported. Limitations in data quality were common at all study sites including wrong test results, missing registers, and inconsistencies in reports. Feedback was seldom on content or quality of reports and usage of data beyond individual patient care was low. Conclusions: We argue that the limited data quality has to be seen in the broader perspective of the information systems in which it is produced and used. The combination of different methods has proven to be useful for this. To improve the effectiveness and capabilities of these systems, combined measures are needed which include technical and organizational aspects (e.g. regular feedback to health workers) and individual skills and motivation.

  3. Where is information quality lost at clinical level? A mixed-method study on information systems and data quality in three urban Kenyan ANC clinics

    Science.gov (United States)

    Hahn, Daniel; Wanjala, Pepela; Marx, Michael

    2013-01-01

    Background Well-working health information systems are considered vital with the quality of health data ranked of highest importance for decision making at patient care and policy levels. In particular, health facilities play an important role, since they are not only the entry point for the national health information system but also use health data (and primarily) for patient care. Design A multiple case study was carried out between March and August 2012 at the antenatal care (ANC) clinics of two private and one public Kenyan hospital to describe clinical information systems and assess the quality of information. The following methods were developed and employed in an iterative process: workplace walkthroughs, structured and in-depth interviews with staff members, and a quantitative assessment of data quality (completeness and accurate transmission of clinical information and reports in ANC). Views of staff and management on the quality of employed information systems, data quality, and influencing factors were captured qualitatively. Results Staff rated the quality of information higher in the private hospitals employing computers than in the public hospital which relies on paper forms. Several potential threats to data quality were reported. Limitations in data quality were common at all study sites including wrong test results, missing registers, and inconsistencies in reports. Feedback was seldom on content or quality of reports and usage of data beyond individual patient care was low. Conclusions We argue that the limited data quality has to be seen in the broader perspective of the information systems in which it is produced and used. The combination of different methods has proven to be useful for this. To improve the effectiveness and capabilities of these systems, combined measures are needed which include technical and organizational aspects (e.g. regular feedback to health workers) and individual skills and motivation. PMID:23993022

  4. A model independent safeguard against background mismodeling for statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Priel, Nadav; Landsman, Hagar; Manfredini, Alessandro; Budnik, Ranny [Department of Particle Physics and Astrophysics, Weizmann Institute of Science, Herzl St. 234, Rehovot (Israel); Rauch, Ludwig, E-mail: nadav.priel@weizmann.ac.il, E-mail: rauch@mpi-hd.mpg.de, E-mail: hagar.landsman@weizmann.ac.il, E-mail: alessandro.manfredini@weizmann.ac.il, E-mail: ran.budnik@weizmann.ac.il [Teilchen- und Astroteilchenphysik, Max-Planck-Institut für Kernphysik, Saupfercheckweg 1, 69117 Heidelberg (Germany)

    2017-05-01

    We propose a safeguard procedure for statistical inference that provides universal protection against mismodeling of the background. The method quantifies and incorporates the signal-like residuals of the background model into the likelihood function, using information available in a calibration dataset. This prevents possible false discovery claims that may arise through unknown mismodeling, and corrects the bias in limit setting created by overestimated or underestimated background. We demonstrate how the method removes the bias created by an incomplete background model using three realistic case studies.

  5. The Methods of Information Security Based on Blurring of System

    Directory of Open Access Journals (Sweden)

    Mikhail Andreevich Styugin

    2016-03-01

    Full Text Available The paper presents a model of a system under investigation with a known input, output and set of discrete internal states. Theoretical objects such as a system absolutely protected from investigation and an absolutely indiscernible data transfer channel are defined, and a generalization of Shannon's principle of secrecy is made. The method of system blurring is defined. The theoretical cryptographic strength of an absolutely indiscernible data transfer channel is proved, and its practical unbreakability even with an unreliable pseudo-random number generator is shown. The paper presents a system with channel blurring named Pseudo IDTC and shows the asymptotic complexity of breaking this system compared with AES and GOST.

  6. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    Science.gov (United States)

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  7. Method for thermoelectric cooler utilization using manufacturer's technical information

    Science.gov (United States)

    Ajiwiguna, Tri Ayodha; Nugroho, Rio; Ismardi, Abrar

    2018-03-01

    Thermoelectric cooler (TEC) modules have been widely used for many applications. In this study, a procedure for using a TEC module for a specific requirement is developed based on the manufacturer's technical data. As a case study, a cooling system using a TEC module is designed and tested to maintain 6.6 liters of water at 24 °C while the surrounding temperature is 26 °C. First, the cooling load is estimated empirically by observing the temperature change when cold water is inside the container. Second, the working temperatures on the hot side and cold side of the TEC are determined. Third, the parameters of Seebeck coefficient, thermal resistance and electrical resistance are predicted using information from the manufacturer. Fourth, the operating current is determined by assuming that the voltage across the TEC is 12 V. Fifth, the cooling capacity of the TEC module is calculated using the energy balance equation of the TEC. Sixth, the cooling load and cooling capacity are compared to determine the number of TEC modules needed. The result of these calculations showed that one TEC module is enough for the cooling system, since the cooling load is 17.5 W while the cooling capacity is 18.87 W. Experimentally, the set-point temperature was achieved using one TEC module, as predicted by the calculation steps.
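
    A worked sketch of the sizing logic in the calculation steps described above, using the standard cold-side energy balance of a thermoelectric module. All parameter values below (Seebeck coefficient, resistances, temperatures) are illustrative assumptions, not the module characterised in the study.

        S = 0.053        # effective module Seebeck coefficient, V/K (assumed)
        R = 2.1          # module electrical resistance, ohm (assumed)
        theta = 1.7      # module thermal resistance, K/W (assumed)
        Th, Tc = 305.0, 295.0   # hot- and cold-side temperatures, K
        V = 12.0                # assumed operating voltage, V

        I = (V - S * (Th - Tc)) / R                            # step 4: operating current
        Qc = S * I * Tc - 0.5 * I**2 * R - (Th - Tc) / theta   # step 5: cooling capacity (cold-side balance)
        cooling_load = 17.5                                    # W, the empirical estimate from step 1
        modules = max(1, -(-cooling_load // Qc))               # step 6: round up to whole modules
        print(f"I = {I:.2f} A, Qc = {Qc:.2f} W, modules needed = {int(modules)}")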

  8. Pavement Management Systems Application with Geographic Information System Method

    Directory of Open Access Journals (Sweden)

    Nihat MOROVA

    2016-04-01

    Full Text Available In this study, performance models were developed, and software in the Visual Basic programming language was written for the developed model. Using the software, both the present condition of the pavement can be examined and future performance based on expected traffic values can be predicted, so the software can be used at both network and project level. Cost and benefit values taken from the literature were used in determining the cost-benefit ratio. Using the genetic algorithm approach, a computer program in the Visual Basic programming language was written. Using the model developed, a five-year maintenance and rehabilitation program can be planned for a given database considering budget constraints. The developed models were merged by writing Geographic Information System (GIS) software in order to show the effectiveness of the models and adapt them into a GIS. For this purpose, a GIS case study was carried out. The control of the overall system can be applied in addition to the application of the model at network level. The developed software allows data to be transferred to the database, and supports analyses and different scenario applications for displaying GIS results.

  9. Determining the Effectiveness of Various Delivery Methods in an Information Technology/Information Systems Curriculum

    Science.gov (United States)

    Davis, Gary Alan; Kovacs, Paul J.; Scarpino, John; Turchek, John C.

    2010-01-01

    The emergence of increasingly sophisticated communication technologies and the media-rich extensions of the World Wide Web have prompted universities to use alternatives to the traditional classroom teaching and learning methods. This demand for alternative delivery methods has led to the development of a wide range of eLearning techniques.…

  10. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Science.gov (United States)

    2013-05-01

    ... Citations on Methods for Cumulative Risk Assessment AGENCY: Office of the Science Advisor, Environmental... influence exposures, dose-response or risk/hazard posed by environmental contaminant exposures, and methods... who wish to receive further information about submitting information on methods for cumulative risk...

  11. Possibility of application of low-frequency piezothromboelastografy method for the evaluation of haemostatic potential of blood in coronary bypass surgery on the background of long aspirinotherapy

    Directory of Open Access Journals (Sweden)

    Elena V. Fanaskova

    2017-01-01

    Full Text Available Aim. To assess the hemostatic system on the background of aspirin therapy using the VerifyNow Aspirin test, as well as by examining platelet aggregation and low-frequency piezothromboelastography, in patients with coronary artery disease during coronary artery bypass grafting. Materials and methods. 100 people with coronary artery disease who had been taking aspirin (75–100 mg daily) for more than 1 year were examined. The study was performed in the perioperative period of coronary bypass surgery without aspirin withdrawal. Evaluation of hemostasis was performed with the VerifyNow Aspirin test (USA), platelet aggregation (Helena Laboratories AggRAM™, Britain) and low-frequency piezothromboelastography (ARP-01M “Mednord”, Russia). Results. Patients included in the study were sensitive to aspirin under long-term administration of the drug: the VerifyNow test indicator was ARU 496.9 ± 21.3%, and platelet aggregation to ADP and adrenaline was reduced by 46% and 52%, respectively. In the early postoperative period platelet aggregation to ADP decreased by 73.2%, to collagen by 75.9%, and to adrenaline by 82.64% in comparison with the control group. Perioperative hemorrhagic complications in the study group were not observed. The reduction of platelet aggregation under aspirin therapy was accompanied by an increase in the thrombin activity of the blood, which could be assessed using the low-frequency piezothromboelastography (LPTEG) method. In the early postoperative period, the LPTEG results, thrombin potential, and anticoagulant and fibrinolytic activity of blood were partially normalized without reaching the level of the control group. Conclusion. For the evaluation of hemostasis under aspirin therapy it is advisable to apply low-frequency piezothromboelastography, which, in contrast to the VerifyNow test and traditional platelet aggregation, allows one to reveal the degree of impairment in thrombin blood activity and to conduct an integrative assessment on all

  12. Ensuring the integrity of information resources based methods dvooznakovoho structural data encoding

    Directory of Open Access Journals (Sweden)

    О.К. Юдін

    2009-01-01

    Full Text Available  Developed methods of estimation of noise stability and correction of structural code constructions to distortion in comunication of data in informatively communication systems and networks taking into account providing of integrity of informative resource.

  13. Conserved quantities in background independent theories

    Energy Technology Data Exchange (ETDEWEB)

    Markopoulou, Fotini [Perimeter Institute for Theoretical Physics, 35 King Street North, Waterloo, Ontario N2J 2W9 (Canada); Department of Physics, University of Waterloo, Waterloo, Ontario N2L 3G1 (Canada)

    2007-05-15

    We discuss the difficulties that background independent theories based on quantum geometry encounter in deriving general relativity as the low energy limit. We follow a geometrogenesis scenario of a phase transition from a pre-geometric theory to a geometric phase which suggests that a first step towards the low energy limit is searching for the effective collective excitations that will characterize it. Using the correspondence between the pre-geometric background independent theory and a quantum information processor, we are able to use the method of noiseless subsystems to extract such coherent collective excitations. We illustrate this in the case of locally evolving graphs.

  14. Power generation from renewable energy sources. Climate-friendly and economically efficient. Background information; Stromerzeugung aus erneuerbaren Energien. Klimafreundlich und oekonomisch sinnvoll. Hintergrund

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-02-15

    As the publication shows, the public discussion in Germany is increasingly focusing on the cost of the promotion of renewable energy sources. Critical comments state that the EEG (Renewables Act) accounts for most of the recent electricity rate increases and also does not contribute to climate protection. This background paper of the Federal Environmental Office stresses the role of the EEG in climate protection and its effects on price trends in electricity supply. The resulting financial burden for German citizens and industry is investigated, and it is discussed whether public funding of renewable energy sources is indeed beneficial for the German economy as a whole.

  15. Short Term Gain, Long Term Pain:Informal Job Search Methods and Post-Displacement Outcomes

    OpenAIRE

    Green, Colin

    2012-01-01

    This paper examines the role of informal job search methods on the labour market outcomes of displaced workers. Informal job search methods could alleviate short-term labour market difficulties of displaced workers by providing information on job opportunities, allowing them to signal their productivity and may mitigate wage losses through better post-displacement job matching. However if displacement results from reductions in demand for specific sectors/skills, the use of informal job searc...

  16. Physics background in luminosity measurement at ILC and measurement of the proton b-content at H1 using multivariate method

    Energy Technology Data Exchange (ETDEWEB)

    Pandurovic, Mila

    2011-12-15

    The ILC physics program sets the minimal precision of the luminosity measurement at the order of 10{sup -3}. This may be accomplished by the construction of a finely granulated electromagnetic calorimeter, which will measure the rate of the Bhabha scattering process at small angles on the one hand, and by the experimental control of various systematic effects on the other. The first part of this thesis is dedicated to the study of the four-fermion processes e{sup +}e{sup -}{yields}e{sup +}e{sup -}f anti f as a physics background in the luminosity measurement. This SM process is one of the major systematic effects in the luminosity measurement at the ILC, due to its high cross-section and the fact that electron spectators emitted at low polar angles can be misidentified as signal. It has been demonstrated that the event selection can be performed in a way that the overall relative systematic uncertainty does not exceed 2.3·10{sup -3}. The selection efficiency for the Bhabha signal is maintained so as to limit the statistical uncertainty of the measurement to 1.2·10{sup -4}. In addition, the background suppression potential is discussed for various selection setups. The second part of the thesis is dedicated to the physics of heavy quarks at the H1 experiment at the HERA accelerator at DESY, Hamburg, Germany. The HERA experiments H1 and ZEUS gave important experimental insight into the proton structure over a wide phase space of photon virtuality and Bjorken scaling variable. In this thesis the b-content of the proton is measured, which can be further used for F{sub 2}{sup b} and the corresponding cross-section measurements. With a sample of 54.4 pb{sup -1} of HERA II data the proton b-content is measured using e{sup -}p neutral current events of deep inelastic scattering in the kinematic region of Q{sup 2}>6 GeV{sup 2} and Bjorken scaling variable 0.0002 … The method is based on an inclusive approach exploiting the different lifetime signatures as well as the mass

  17. An information hiding method based on LSB and tent chaotic map

    Science.gov (United States)

    Song, Jianhua; Ding, Qun

    2011-06-01

    In order to protect information security more effectively, a novel information hiding method based on LSB steganography and the Tent chaotic map was proposed: first, the secret message is encrypted with the Tent chaotic map, and then LSB steganography embeds the encrypted message in the cover image. Compared to traditional image information hiding methods, the simulation results indicate that the method is greatly improved in imperceptibility and security, and achieves good results.
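
    The record above only sketches the two-stage scheme in prose; the following minimal Python illustration shows the idea under stated assumptions (the tent-map parameter, the initial value and the byte-wise XOR keystream are illustrative choices, not the authors' implementation):

        # Sketch of the two-stage idea: (1) encrypt the secret message with a keystream
        # generated by the tent chaotic map, (2) embed the encrypted bits into the least
        # significant bits of the cover-image pixels. Parameters are assumptions.
        def tent_keystream(x0, mu, n):
            """Generate n pseudo-random bytes by iterating the tent map x -> mu*min(x, 1-x)."""
            x, out = x0, []
            for _ in range(n):
                x = mu * x if x < 0.5 else mu * (1.0 - x)
                out.append(int(x * 256) % 256)
            return bytes(out)

        def encrypt(message: bytes, x0=0.37, mu=1.99) -> bytes:
            keystream = tent_keystream(x0, mu, len(message))
            return bytes(m ^ k for m, k in zip(message, keystream))

        def embed_lsb(pixels, payload: bytes):
            """Write the payload bits into the LSBs of a flat list of 8-bit pixel values."""
            bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
            stego = list(pixels)
            for i, bit in enumerate(bits):
                stego[i] = (stego[i] & 0xFE) | bit
            return stego

        # Toy usage: hide an encrypted 6-byte message in a 256-pixel "image".
        stego_pixels = embed_lsb(list(range(256)), encrypt(b"secret"))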

  18. Recovery of native genetic background in admixed populations using haplotypes, phenotypes, and pedigree information--using Cika cattle as a case breed.

    Directory of Open Access Journals (Sweden)

    Mojca Simčič

    Full Text Available The aim of this study was to obtain unbiased estimates of the diversity parameters, the population history, and the degree of admixture in Cika cattle, which represents the local admixed breeds at risk of extinction undergoing challenging conservation programs. Genetic analyses were performed on genome-wide Single Nucleotide Polymorphism (SNP) data from the Illumina BovineSNP50 array for 76 Cika animals and 531 animals from 14 reference populations. To obtain unbiased estimates we used short haplotypes spanning four markers instead of single SNPs to avoid the ascertainment bias of the BovineSNP50 array. Genome-wide haplotypes combined with partial pedigree and type trait classification show the potential to improve identification of purebred animals with a low degree of admixture. Phylogenetic analyses demonstrated the unique genetic identity of Cika animals. The genetic distance matrix, presented as a rooted Neighbour-Net, suggested a long and broad phylogenetic connection between Cika and Pinzgauer. Unsupervised clustering performed by the admixture analysis and a two-dimensional presentation of the genetic distances between individuals also suggest that Cika is a distinct breed despite being similar in appearance to Pinzgauer. Animals identified as the most purebred could be used as a nucleus for a recovery of the native genetic background in the current admixed population. The results show that local well-adapted strains, which have never been intensively managed and differentiated into specific breeds, exhibit large haplotype diversity. They suggest that a conservation and recovery approach that does not rely exclusively on the search for the original native genetic background, but rather on the identification and removal of common introgressed haplotypes, would be more powerful. Successful implementation of such an approach should be based on combining phenotype, pedigree, and genome-wide haplotype data of the breed of interest and a spectrum of reference breeds which

  19. A Multi-Classification Method of Improved SVM-based Information Fusion for Traffic Parameters Forecasting

    Directory of Open Access Journals (Sweden)

    Hongzhuan Zhao

    2016-04-01

    Full Text Available With the enrichment of perception methods, the modern transportation system has many physical objects whose states are influenced by many information factors, so that it is a typical Cyber-Physical System (CPS). Thus, traffic information is generally multi-sourced, heterogeneous and hierarchical. Existing research results show that accurate classification of the multi-sourced traffic information during information fusion can achieve better parameter forecasting performance. To solve the problem of accurately classifying traffic information, a multi-classification method using an improved SVM in information fusion for traffic parameter forecasting is proposed; it analyses the characteristics of the multi-sourced traffic information and uses a redefined binary tree to overcome the shortcomings of the original Support Vector Machine (SVM) classification in information fusion. An experiment was conducted to examine the performance of the proposed scheme, and the results reveal that the method can produce more accurate and practical outcomes.
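
    As an illustration of the general pattern only (not the paper's redefined binary tree or its data), the sketch below builds a small binary tree of scikit-learn SVM classifiers: a root SVM splits the classes into two groups and leaf SVMs separate the classes inside each group; the grouping and the synthetic traffic features are assumptions:

        # Binary-tree multi-classification with SVMs (illustrative only; the class
        # grouping and synthetic data are assumptions, not the paper's setup).
        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 4))            # e.g. flow, speed, occupancy, weather index
        y = rng.integers(0, 4, size=300)         # four traffic-information classes 0..3

        # Root node separates classes {0, 1} from {2, 3}; leaf nodes separate within each group.
        root = SVC(kernel="rbf").fit(X, np.isin(y, [2, 3]).astype(int))
        left = SVC(kernel="rbf").fit(X[np.isin(y, [0, 1])], y[np.isin(y, [0, 1])])
        right = SVC(kernel="rbf").fit(X[np.isin(y, [2, 3])], y[np.isin(y, [2, 3])])

        def predict(sample):
            sample = sample.reshape(1, -1)
            branch = right if root.predict(sample)[0] == 1 else left
            return branch.predict(sample)[0]

        print(predict(X[0]))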

  20. METHODS FOR EVALUATION OF COMPANIES’ INFORMATION SYSTEMS AND TECHNOLOGIES EFFICIENCY AND CONTROL IN TEACHING COURSE "INFORMATION TECHNOLOGY GOVERNANCE"

    Directory of Open Access Journals (Sweden)

    Samchynska Yaroslava

    2014-10-01

    Full Text Available The use of information systems and technologies in economic activity is meant to reflect companies' corporate principles, aims and traditions and to help realize the planned strategies, thanks to which management efficiency and the value of the enterprise increase. Teaching of the educational discipline «IT Governance» is directed at the study of these interrelations by 5th-year students in the specialties «Computer Science» and «Software Engineering» at the educational levels of Specialist (post-Bachelor degree) and Master degree. Auditing services are among the current methods for evaluating and controlling the efficiency of information systems and technologies that are studied in the course «IT Governance». The article deals with the methodological basis of providing auditing services for evaluating the efficiency and control of information systems (technologies), for the purpose of satisfying the growing informational needs of companies and activating their information resources. The main task of auditing services for controlling the efficiency of information systems (information and communication technologies) is to evaluate independently and objectively whether the information technologies provide the necessary services. The basic criteria, data support, subject and object of the audit necessary for drawing up an audit report and declaring assurance are established. A program and a detailed list of auditing procedures for evaluating the efficiency of information systems and technologies are presented.

  1. A method for investigation of the D(4He, γ)6Li reaction in the Ultralow energy region under a high background

    Science.gov (United States)

    Bystritsky, V. M.; Dudkin, G. N.; Krylov, A. R.; Gazi, S.; Huran, J.; Nechaev, B. A.; Padalko, V. N.; Sadovsky, A. B.; Tuleushev, Yu. Zh.; Filipowicz, M.; Philippov, A. V.

    2016-07-01

    The cosmological lithium problem, that is, a noticeable discrepancy between the predicted and observed abundances of lithium, is in conflict with the Standard Big Bang Nucleosynthesis model. For example, the abundance of 7Li is 2-4 times smaller than predicted by the Standard Big Bang Nucleosynthesis. As to the abundance of 6Li, recent more accurate optical investigations have yielded only the upper limit on the 6Li/7Li ratio, which makes the problem of 6Li abundance and accordingly of disagreement with the Standard Big Bang Nucleosynthesis predictions less acute. However, experimental study of the D(4He, γ)6Li reaction cross section is still of current importance because there is a theoretical approach predicting its anomalously large value in the region of energies below the Standard Big Bang Nucleosynthesis energy. The work is dedicated to the measurement of the cross section for the D(4He, γ)6Li reaction proceeding in zirconium deuteride at the incident 4He+ion energy of 36 keV. The experiment is performed at a pulsed Hall plasma accelerator with an energy spread of 20% FWHM. A method for direct measurement of the background from the reaction chain D(4He, 4He)D→D(D, n)3He→(n, γ) and/or (n, n‧γ) ending with activation of the surrounding material by neutrons is proposed and implemented in the work. An upper limit on the D(4He, γ)6Li reaction cross section σ≤7·10-36 cm2 at the 90% confidence level is obtained.

  2. A method for investigation of the D("4He, γ)"6Li reaction in the Ultralow energy region under a high background

    International Nuclear Information System (INIS)

    Bystritsky, V.M.; Dudkin, G.N.; Krylov, A.R.; Gazi, S.; Huran, J.; Nechaev, B.A.; Padalko, V.N.; Sadovsky, A.B.; Tuleushev, Yu.Zh.; Filipowicz, M.; Philippov, A.V.

    2016-01-01

    The cosmological lithium problem, that is, a noticeable discrepancy between the predicted and observed abundances of lithium, is in conflict with the Standard Big Bang Nucleosynthesis model. For example, the abundance of "7Li is 2–4 times smaller than predicted by the Standard Big Bang Nucleosynthesis. As to the abundance of "6Li, recent more accurate optical investigations have yielded only the upper limit on the "6Li/"7Li ratio, which makes the problem of "6Li abundance and accordingly of disagreement with the Standard Big Bang Nucleosynthesis predictions less acute. However, experimental study of the D("4He, γ)"6Li reaction cross section is still of current importance because there is a theoretical approach predicting its anomalously large value in the region of energies below the Standard Big Bang Nucleosynthesis energy. The work is dedicated to the measurement of the cross section for the D("4He, γ)"6Li reaction proceeding in zirconium deuteride at the incident "4He"+ion energy of 36 keV. The experiment is performed at a pulsed Hall plasma accelerator with an energy spread of 20% FWHM. A method for direct measurement of the background from the reaction chain D("4He, "4He)D→D(D, n)"3He→(n, γ) and/or (n, n′γ) ending with activation of the surrounding material by neutrons is proposed and implemented in the work. An upper limit on the D("4He, γ)"6Li reaction cross section σ≤7·10"−"3"6 cm"2 at the 90% confidence level is obtained.

  3. A method for investigation of the D({sup 4}He, γ){sup 6}Li reaction in the Ultralow energy region under a high background

    Energy Technology Data Exchange (ETDEWEB)

    Bystritsky, V.M. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Dudkin, G.N. [National Research Tomsk Polytechnic University, Tomsk (Russian Federation); Krylov, A.R. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Gazi, S.; Huran, J. [Institute of Electrical Engineering, Slovak Academy of Sciences, Bratislava (Slovakia); Nechaev, B.A.; Padalko, V.N. [National Research Tomsk Polytechnic University, Tomsk (Russian Federation); Sadovsky, A.B. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Tuleushev, Yu.Zh. [Institute of Nuclear Physics, Ministry of Power Engineering, Almaty (Kazakhstan); Filipowicz, M. [Faculty of Energy and Fuels, University of Science and Technologies, Krakow (Poland); Philippov, A.V. [Joint Institute for Nuclear Research, Dubna (Russian Federation)

    2016-07-21

    The cosmological lithium problem, that is, a noticeable discrepancy between the predicted and observed abundances of lithium, is in conflict with the Standard Big Bang Nucleosynthesis model. For example, the abundance of {sup 7}Li is 2–4 times smaller than predicted by the Standard Big Bang Nucleosynthesis. As to the abundance of {sup 6}Li, recent more accurate optical investigations have yielded only the upper limit on the {sup 6}Li/{sup 7}Li ratio, which makes the problem of {sup 6}Li abundance and accordingly of disagreement with the Standard Big Bang Nucleosynthesis predictions less acute. However, experimental study of the D({sup 4}He, γ){sup 6}Li reaction cross section is still of current importance because there is a theoretical approach predicting its anomalously large value in the region of energies below the Standard Big Bang Nucleosynthesis energy. The work is dedicated to the measurement of the cross section for the D({sup 4}He, γ){sup 6}Li reaction proceeding in zirconium deuteride at the incident {sup 4}He{sup +}ion energy of 36 keV. The experiment is performed at a pulsed Hall plasma accelerator with an energy spread of 20% FWHM. A method for direct measurement of the background from the reaction chain D({sup 4}He, {sup 4}He)D→D(D, n){sup 3}He→(n, γ) and/or (n, n′γ) ending with activation of the surrounding material by neutrons is proposed and implemented in the work. An upper limit on the D({sup 4}He, γ){sup 6}Li reaction cross section σ≤7·10{sup −36} cm{sup 2} at the 90% confidence level is obtained.

  4. An evaluation of multimedia and online support groups (OSG) contents and application of information by infertile patients: Mixed method study

    Science.gov (United States)

    Wiweko, Budi; Narasati, Shabrina; Agung, Prince Gusti; Zesario, Aulia; Wibawa, Yohanes Satrya; Maidarti, Mila; Harzif, Achmad Kemal; Pratama, Gita; Sumapradja, Kanadi; Muharam, Raden; Hestiantoro, Andon

    2018-02-01

    Background: The presence of Online Support Groups (OSG) is expected to empower patients with infertility, thus allowing patients to be the focus of healthcare services. This study evaluates multimedia content, OSG, and the utilization of information for decision-making by patients using infertility services. It is a mixed-method study conducted from January to June 2016 at the Yasmin IVF Clinic, Dr. Cipto Mangunkusumo General Hospital, and the SMART IVF Clinic, Jakarta. The subjects are patients with infertility who sought treatment at the clinics. Data were collected through a structured interview in the form of a questionnaire. Informed consent was obtained from all individual participants included in the study, and all procedures performed in the study were in accordance with institutional ethical standards. The results from 72 respondents showed that the quantitative analysis did not reveal any association between multimedia and OSG information sources and patient knowledge regarding infertility management. However, the qualitative analysis highlighted three issues: the information regarding infertility services in the available multimedia and the OSG; the use of the available information by patients when deciding to use infertility services; and the respondents' limited awareness of searching for infertility information on the clinic website. This is because most of the patients at the clinic are unaware of the existence of the clinic website that provides the infertility information. Therefore, the clinic website needs to be promoted so that its usage will increase in the future.

  5. The criteria for selecting a method for unfolding neutron spectra based on the information entropy theory

    International Nuclear Information System (INIS)

    Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin

    2014-01-01

    To further expand the application of artificial neural networks in the field of neutron spectrometry, criteria for choosing between an artificial neural network and the maximum entropy method for unfolding neutron spectra were presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold the neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was obtained. Useful information from the information entropy guided the selection of unfolding methods. Due to the importance of the information entropy, a method for predicting the information entropy from the Bonner spheres' counts was established. The criteria based on information entropy theory can be used to choose between the artificial neural network and maximum entropy unfolding methods, and the application of artificial neural networks to unfolding neutron spectra was thereby expanded. - Highlights: • Two neutron spectra unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For a spectrum with low entropy, the ANN was generally better than the MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts
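
    A minimal sketch of the quantity at the heart of the criterion, the information entropy of a binned spectrum, is given below; the entropy threshold used for the choice is a placeholder rather than a value from the record (the record only states that low-entropy spectra favoured the ANN):

        # Shannon information entropy of a binned neutron spectrum and a toy
        # selection rule (the threshold is a placeholder, not taken from the paper).
        import numpy as np

        def spectrum_entropy(spectrum):
            p = np.asarray(spectrum, dtype=float)
            p = p / p.sum()                      # normalise bin contents to probabilities
            p = p[p > 0]
            return float(-(p * np.log(p)).sum())

        def choose_unfolding_method(spectrum, threshold=2.5):
            # Low-entropy (strongly peaked) spectra -> ANN; otherwise maximum entropy method.
            return "ANN" if spectrum_entropy(spectrum) < threshold else "MEM"

        print(choose_unfolding_method([0.1, 5.0, 20.0, 3.0, 0.2]))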

  6. Description of background data in the SKB database GEOTAB

    International Nuclear Information System (INIS)

    Eriksson, E.; Sehlstedt, S.

    1989-02-01

    During the research and development program performed by SKB for the final disposal of spent nuclear fuel, a large quantity of geoscientific data was collected. Most of this data was stored in a database called GEOTAB. This report describes data within the background data group. This data provides information on the location of areas studied, borehole positions and also some drilling information. The background data group (subject), called BGR, is divided into several subgroups (methods): BGAREA area background data; BGDRILL drilling information; BGDRILLP drill penetration data; BGHOLE borehole information; BGTABLES number of rows in a table, and BGTOLR data table tolerance. A method consists of one or several data tables. In each chapter a method and its data tables are described. (orig./HP)

  7. Hospital discharge: What are the problems, information needs and objectives of community pharmacists? A mixed method approach

    Directory of Open Access Journals (Sweden)

    Brühwiler LD

    2017-09-01

    Full Text Available Background: After hospital discharge, community pharmacists are often the first health care professionals the discharged patient encounters. They reconcile and dispense prescribed medicines and provide pharmaceutical care. Compared to the roles of general practitioners, the pharmacists' needs to perform these tasks are not well known. Objective: This study aims to (a) identify community pharmacists' current problems and roles at hospital discharge, (b) assess their information needs, specifically the availability and usefulness of information, and (c) gain insight into pharmacists' objectives and ideas for discharge optimisation. Methods: A focus group was conducted with a sample of six community pharmacists from different Swiss regions. Based on these qualitative results, a nationwide online questionnaire was sent to 1348 Swiss pharmacies. Results: The focus group participants were concerned about their extensive workload with discharge prescriptions and about gaps in therapy. They emphasised the importance of more extensive information transfer. This applied especially to medication changes, unclear prescriptions, and information about a patient's care. Participants identified treatment continuity as a main objective when it comes to discharge optimisation. There were 194 questionnaires returned (response rate 14.4%). The majority of respondents reported fulfilling their role as defined by the Joint FIP/WHO Guideline on Good Pharmacy Practice (rather) badly. They reported many unavailable but useful information items, like therapy changes, allergies, specifications for "off-label" medication use or contact information. Information should be delivered in a structured way, but no clear preference for one particular transfer method was found. Pharmacists requested this information in order to improve treatment continuity and patient safety, and to be able to provide better pharmaceutical care services. Conclusion: Surveyed Swiss community

  8. Analysis method for the search for neutrinoless double beta decay in the NEMO3 experiment: study of the background and first results

    International Nuclear Information System (INIS)

    Etienvre, A.I.

    2003-04-01

    The NEMO3 detector, installed in the Frejus Underground Laboratory, is dedicated to the study of neutrinoless double beta decay: the observation of this process would establish the massive and Majorana nature of the neutrino. The experiment consists of very thin central source foils (the total mass is equal to 10 kg), a tracking detector made of drift cells operating in Geiger mode, a calorimeter made of plastic scintillators coupled to photomultipliers, a coil producing a 30 gauss magnetic field, and two shields dedicated to the reduction of the γ-ray and neutron fluxes. In the first part, I describe the implications of several mechanisms, related to trilinear R-parity violation, for double beta decay. The second part is dedicated to a detailed study of the tracking detector of the experiment: after a description of the different working tests, I present the determination of the characteristics of the track reconstruction (transverse and longitudinal resolution per Geiger cell, precision of vertex determination, and charge recognition). The last part corresponds to the analysis of the data taken by the experiment. On the one hand, an upper limit on the Tl 208 activity of the sources has been determined: it is lower than 68 mBq/kg at the 90% confidence level. On the other hand, I have developed and tested on these data a method to analyse the neutrinoless double beta decay signal; this method is based on a maximum likelihood using all the available information. Using this method, I could determine a first and very preliminary upper limit on the effective mass of the neutrino. (author)

  9. Background subtraction system for pulsed neutron logging of earth boreholes

    International Nuclear Information System (INIS)

    Hertzog, R.C.

    1983-01-01

    The invention provides a method for determining the characteristics of earth formations surrounding a well borehole comprising the steps of: repetitively irradiating the earth formations surrounding the well bore with relatively short duration pulses of high energy neutrons; detecting during each pulse of high energy neutrons, gamma radiation due to the inelastic scattering of neutrons by materials comprising the earth formations surrounding the borehole and providing information representative thereof; detecting immediately following each such pulse of high energy neutrons, background gamma radiation due to thermal neutron capture and providing information representative thereof; and correcting the inelastic gamma representative information to compensate for said background representative information
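
    The correction step described above reduces to simple gate arithmetic; the toy sketch below illustrates it under stated assumptions (the gate lengths, count values and the gate-length scaling are made up for illustration, not taken from the patent):

        # Toy sketch: gamma counts recorded during the neutron burst (inelastic + capture
        # background) are corrected with counts recorded in a gate immediately after the
        # burst (capture background only). The scaling by gate length is an assumption.
        def net_inelastic(burst_counts, after_counts, burst_gate_us, after_gate_us):
            scale = burst_gate_us / after_gate_us   # normalise background to the burst gate length
            return burst_counts - scale * after_counts

        print(net_inelastic(burst_counts=1200, after_counts=400, burst_gate_us=20, after_gate_us=40))
        # -> 1000.0 net inelastic counts in this made-up example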

  10. Background suppression of infrared small target image based on inter-frame registration

    Science.gov (United States)

    Ye, Xiubo; Xue, Bindang

    2018-04-01

    We propose a multi-frame background suppression method for remote infrared small target detection. Inter-frame information is necessary when heavy background clutter makes it difficult to distinguish real targets from false alarms. A registration procedure based on point matching in image patches is used to compensate for the local deformation of the background. The target can then be separated by background subtraction. Experiments show that our method serves as an effective preliminary step for target detection.
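
    A simplified OpenCV sketch of such a pipeline is shown below; it replaces the paper's patch-wise point matching with ORB keypoint matching and a single global homography, so it is an approximation of the idea rather than the authors' method:

        # Register the previous frame to the current one, then subtract to suppress
        # the background; the residual highlights candidate small targets.
        import cv2
        import numpy as np

        def suppress_background(prev_frame, curr_frame):
            orb = cv2.ORB_create()
            kp1, des1 = orb.detectAndCompute(prev_frame, None)
            kp2, des2 = orb.detectAndCompute(curr_frame, None)
            matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
            src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC)
            h, w = curr_frame.shape[:2]
            aligned_prev = cv2.warpPerspective(prev_frame, H, (w, h))
            return cv2.absdiff(curr_frame, aligned_prev)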

  11. Methods of Certification tests PLC-Networks in Compliance Safety Information

    Directory of Open Access Journals (Sweden)

    A. A. Balaev

    2011-12-01

    Full Text Available The aim of this research was to describe a methodology for auditing PLC networks for compliance with information security requirements. The technique is based on the provisions of the guidance documents and standard test methods of FSTEC of Russia for objects of informatization with respect to information security.

  12. Research on the method of measuring space information network capacity in communication service

    Directory of Open Access Journals (Sweden)

    Zhu Shichao

    2017-02-01

    Full Text Available Because of the large-scale characteristics of the space information network in terms of space and time, and its increasing complexity, existing methods of measuring information transmission capacity have been unable to measure the existing and future space information network effectively. In this study, we first established a complex model of the space information network, and measured the capacity of the whole space information network by analysing the data access capability to the network and the data transmission capability within the network. Finally, we verified the rationality of the proposed measuring method by using STK and Matlab simulation software for collaborative simulation.

  13. A Dynamic and Adaptive Selection Radar Tracking Method Based on Information Entropy

    Directory of Open Access Journals (Sweden)

    Ge Jianjun

    2017-12-01

    Full Text Available Nowadays, the battlefield environment has become much more complex and variable. This paper presents a quantitative method and lower bound for the amount of target information acquired from multiple radar observations to adaptively and dynamically organize the detection of battlefield resources based on the principle of information entropy. Furthermore, for minimizing the given information entropy’s lower bound for target measurement at every moment, a method to dynamically and adaptively select radars with a high amount of information for target tracking is proposed. The simulation results indicate that the proposed method has higher tracking accuracy than that of tracking without adaptive radar selection based on entropy.
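
    The selection idea can be illustrated with a small sketch: score each radar by the differential entropy of a Gaussian with that radar's measurement-error covariance and pick the radar whose measurement carries the most information (lowest entropy). The covariances below are invented, and the Gaussian-entropy score is a stand-in for the paper's lower bound rather than its actual formulation:

        # Entropy-based radar selection (illustrative; covariances are made up).
        import numpy as np

        def gaussian_entropy(cov):
            d = cov.shape[0]
            return 0.5 * np.log(((2 * np.pi * np.e) ** d) * np.linalg.det(cov))

        radar_covariances = {
            "radar_A": np.diag([100.0, 100.0]),   # coarse range/azimuth errors
            "radar_B": np.diag([25.0, 64.0]),
            "radar_C": np.diag([36.0, 36.0]),
        }

        selected = min(radar_covariances, key=lambda name: gaussian_entropy(radar_covariances[name]))
        print(selected)   # the radar with the smallest measurement entropy ("radar_C" here)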

  14. Method of sharing mobile unit state information between base station routers

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Polakos, Paul Anthony; Rajkumar, Ajay; Sundaram, Ganapathy S.

    2007-01-01

    The present invention provides a method of operating a first base station router. The method may include transmitting state information associated with at least one inactive mobile unit to at least one second base station router. The state information is usable to initiate an active session with the

  15. Method of sharing mobile unit state information between base station routers

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Polakos, Paul Anthony; Rajkumar, Ajay; Sundaram, Ganapathy S.

    2010-01-01

    The present invention provides a method of operating a first base station router. The method may include transmitting state information associated with at least one inactive mobile unit to at least one second base station router. The state information is usable to initiate an active session with the

  16. Developing corpus-based translation methods between informal and formal mathematics : project description

    NARCIS (Netherlands)

    Kaliszyk, C.; Urban, J.; Vyskocil, J.; Geuvers, J.H.; Watt, S.M.; Davenport, J.H.; Sexton, A.P.; Sojka, P.; Urban, J.

    2014-01-01

    The goal of this project is to (i) accumulate annotated informal/formal mathematical corpora suitable for training semi-automated translation between informal and formal mathematics by statistical machine-translation methods, (ii) to develop such methods oriented at the formalization task, and in

  17. Emission and costs up to and including 2030 for the current environmental policy. Background information for the National Environmental Outlook 5

    International Nuclear Information System (INIS)

    Van Wee, G.P.; Kuijpers-Linde, M.A.J.; Van Gerwen, O.J.

    2001-03-01

    Every four years the Dutch National Institute of Public Health and the Environment (RIVM) publishes an Environmental Outlook in preparation for the National Environmental Policy Plan (NEPP). The fifth National Environmental Outlook (NEOS) describes developments in the quality of the environment in the Netherlands for 2000-2030 against a background of developments on the European and global scales. The two macro-economic scenarios of the Netherlands Bureau for Economic and Policy Analysis (CPB) used are the European Coordination (EC) scenario and the Global Competition scenario (GC). Consequences for public health, nature and the human physical environment are also indicated. 'Fixed policy' scenarios are used in the Environmental Outlook for the Netherlands. In 'fixed policy' scenarios it is assumed that all policy measures agreed on by the year 2000 will be implemented, but no new measures taken. In this way the Outlook offers baseline scenarios that can be compared with targets and objectives to facilitate the development of new policy. The Fifth National Environmental Outlook was realised with the assistance of many other Dutch research institutes. This background document to NEOS presents estimated levels of energy use, emissions and costs of environmental measures for the 1995-2020 period. The main conclusions are: The environmental problems most difficult to tackle are climate change and noise nuisance. These problems are highly related to energy use and transportation; The policy as presented in the 'Uitvoeringsnota Klimaatbeleid', a document describing the Dutch Kyoto-related climate policy, results in a reduction of greenhouse gases of 15 Mton CO2 equivalents (GS scenario) with respect to the pre-Kyoto policy in 2010. To meet the Kyoto agreements a further reduction of approximately 45 Mton CO2 equivalents is needed. If policies in the 'Uitvoeringsnota Klimaatbeleid' are further instrumentalised and made concrete, an extra reduction of 10 Mton is possible

  18. Perceptions of Athletic Trainers as a Source of Nutritional Information among Collegiate Athletes: A Mixed-methods Approach

    Directory of Open Access Journals (Sweden)

    Rebecca A. Schlaff

    2016-05-01

    Full Text Available Background: Athletes obtain nutrition information from a number of sources, with some being more accurate than others. Little is known about athletes' perceptions of utilizing Certified Athletic Trainers (ATs) as a primary source of information. Objective: We sought to (1) examine the primary sources of nutrition information among a group of United States collegiate athletes and (2) understand athletes' perceptions regarding utilization of their ATs as primary sources of nutrition information. Methods: Participants (Division II university athletes) completed an online questionnaire (n=155; n=58 males, n=97 females) assessing demographic information and ranking primary sources of nutrition information, and participated in focus groups (n=26; n=18 women, n=8 men) to better understand barriers/perceptions regarding using their ATs for nutrition information. Mean±SD rankings were calculated for all sources. Mann-Whitney U analyses were used to identify differences in the rank order of nutrition sources between genders and years of collegiate experience. Semi-structured focus groups were transcribed and coded, and themes were identified regarding barriers to utilizing ATs for nutrition-related information. Results: Parents (3.54±2.38) and the internet (3.69±2.29) had the highest mean ranks. ATs were least often ranked as the number one nutrition source (7.5%) among all sources provided. Barriers to utilizing ATs for nutritional information included discomfort, nutrition information not being within the scope of practice, lack of knowledge, the athletic trainer not caring, and lack of time. Conclusions: Participants reported utilizing ATs less than previous research indicates. Continuing education may be needed to improve the efficacy of ATs in addressing nutritional issues and being seen as a credible and accessible source. Keywords: Diet, Athlete perceptions, Barriers

  19. Actors’ Competencies or Methods? A Case Study of Successful Information Systems Development

    DEFF Research Database (Denmark)

    Omland, Hans Olav; Nielsen, Peter Axel

    2009-01-01

    … between actors’ competencies and their deployment of methods, arguing that this relationship is described over-simplistically and needs a better explanation. Through a case study of a successful information systems development project we identify some central situations where a variety of competencies and methods are exercised. Emphasising the intertwining of competencies and methods, we discuss the character of the intertwining process, how different actors relate to different methods, and how methods may be part of the problem rather than part of the solution to challenges in information systems development. The paper suggests elements for a new model for explaining actors’ competencies and their use of methods.

  20. The VicGeneration study - a birth cohort to examine the environmental, behavioural and biological predictors of early childhood caries: background, aims and methods

    Directory of Open Access Journals (Sweden)

    Dashper Stuart

    2010-02-01

    Full Text Available Abstract Background Dental caries (decay) during childhood is largely preventable; however, it remains a significant and costly public health concern, identified as the most prevalent chronic disease of childhood. Caries in children aged less than five years (early childhood caries) is a rapid and progressive disease that can be painful and debilitating, and significantly increases the likelihood of poor child growth, development and social outcomes. Early childhood caries may also result in a substantial social burden on families and significant costs to the public health system. A disproportionate burden of disease is also experienced by disadvantaged populations. Methods/Design This study involves the establishment of a birth cohort in disadvantaged communities in Victoria, Australia. Children will be followed for at least 18 months and the data gathered will explore longitudinal relationships and generate new evidence on the natural history of early childhood caries, the prevalence of the disease and the relative contributions of risk and protective biological, environmental and behavioural factors. Specifically, the study aims to: 1. Describe the natural history of early childhood caries (at ages 1, 6, 12 and 18 months), tracking pathways from early bacterial colonisation, through non-cavitated enamel white spot lesions, to cavitated lesions extending into dentine. 2. Enumerate oral bacterial species in the saliva of infants and their primary care giver. 3. Identify the strength of concurrent associations between early childhood caries and putative risk and protective factors, including biological (e.g. microbiota, saliva), environmental (fluoride exposure) and socio-behavioural factors (proximal factors such as feeding practices and oral hygiene; and distal factors such as parental health behaviours, physical health, coping and broader socio-economic conditions). 4. Quantify the longitudinal relationships between these factors and the development and

  1. Backgrounded but not peripheral

    DEFF Research Database (Denmark)

    Hovmark, Henrik

    2013-01-01

    In this paper I pay a closer look at the use of the CENTRE-PERIPHERY schema in context. I address two specific issues: first, I show how the CENTRE-PERIPHERY schema, encoded in the DDAs, enters into discourses that conceptualize and characterize a local community as both CENTRE and PERIPHERY, i.e. the schema enters into apparently contradictory constructions of the informants’ local home-base and, possibly, of their identity (cf. Hovmark, 2010). Second, I discuss the status and role of the specific linguistic category in question, i.e. the directional adverbs. On the one hand we claim that the DDAs … ; furthermore, the DDAs are backgrounded in discourse. Is it reasonable to claim, rather boldly, that “the informants express their identity in the use of the directional adverb ud ‘out’ etc.”? In the course of this article, however, I suggest that the DDAs in question do contribute to the socio…

  2. The Conterminous United States Mineral Appraisal Program; background information to accompany folio of geologic, geochemical, geophysical, and mineral resources maps of the Tonopah 1 by 2 degree Quadrangle, Nevada

    Science.gov (United States)

    John, David A.; Nash, J.T.; Plouff, Donald; Whitebread, D.H.

    1991-01-01

    The Tonopah 1° by 2° quadrangle in south-central Nevada was studied by an interdisciplinary research team to appraise its mineral resources. The appraisal is based on geological, geochemical, and geophysical field and laboratory investigations, the results of which are published as a folio of maps, figures, and tables, with accompanying discussions. This circular provides background information on the investigations and integrates the information presented in the folio. The selected bibliography lists references to the geology, geochemistry, geophysics, and mineral deposits of the Tonopah 1° by 2° quadrangle.

  3. Nuclear atlas. After Chernobyl: Nuclear energy between fear and hope. Figures - facts - background information. Der Atomatlas. Nach Tschernobyl: Kernenergie zwischen Angst und Hoffnung. Zahlen - Fakten - Hintergruende

    Energy Technology Data Exchange (ETDEWEB)

    Heinrich, M; Schmidt, A

    1986-01-01

    Chernobyl has become the catchword that reveals the profound difference of opinions among experts and the population on the issue of nuclear energy. In the many discussions in the public media and elsewhere after the Chernobyl accident, technical terms, scientific terminology and analytic data have been used that induced a feeling of helplessness in the general public, simply because people are not familiar with this terminology and subject. What does 'radioactivity' really mean? What damage to health is done by ionizing radiation? Who finds his way through the muddle of measuring units for radiation, such as rem, Becquerel, or Curie? How does a nuclear power plant work? What kind of reactors are operating in Europe? How safe are they? Where are they? What is a reprocessing plant? These are only a few of the many questions the book answers. The information is intended for the general reader, and is supplemented by numerous illustrations, maps, tables and graphs.

  4. State of the art/science: Visual methods and information behavior research

    DEFF Research Database (Denmark)

    Hartel, Jenna; Sonnenwald, Diane H.; Lundh, Anna

    2012-01-01

    This panel reports on methodological innovation now underway as information behavior scholars begin to experiment with visual methods. The session launches with a succinct introduction to visual methods by Jenna Hartel and then showcases three exemplar visual research designs. First, Dianne Sonne...... will have gained: knowledge of the state of the art/science of visual methods in information behavior research; an appreciation for the richness the approach brings to the specialty; and a platform to take new visual research designs forward....

  5. METHODS OF MANAGING TRAFFIC DISTRIBUTION IN INFORMATION AND COMMUNICATION NETWORKS OF CRITICAL INFRASTRUCTURE SYSTEMS

    OpenAIRE

    Kosenko, Viktor; Persiyanova, Elena; Belotskyy, Oleksiy; Malyeyeva, Olga

    2017-01-01

    The subject matter of the article is information and communication networks (ICN) of critical infrastructure systems (CIS). The goal of the work is to create methods for managing the data flows and resources of the ICN of CIS to improve the efficiency of information processing. The following tasks were solved in the article: the data flow model of multi-level ICN structure was developed, the method of adaptive distribution of data flows was developed, the method of network resource assignment...

  6. LA-ICP-MS analysis of plastics as a method to support polymer assay in the assessment of materials for low-background detectors

    International Nuclear Information System (INIS)

    Grate, J.W.; Bliss, Mary; Farmer III, O.T.; Thomas, M.L.P.; Liezers, Martin

    2016-01-01

    Ultra low-background radiation measurements are essential to several large-scale physics investigations. Assay of solid polymer materials for extremely low levels of radioactive elements, such as uranium, presents challenges. This paper describes an initial investigation into the use of laser ablation with inductively coupled plasma mass spectrometry for screening a solid plastic, polyethylene, for gross uranium levels. (author)

  7. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    Science.gov (United States)

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. The aim was to investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  8. Novel Methods for Measuring Depth of Anesthesia by Quantifying Dominant Information Flow in Multichannel EEGs

    Directory of Open Access Journals (Sweden)

    Kab-Mun Cha

    2017-01-01

    Full Text Available In this paper, we propose novel methods for measuring depth of anesthesia (DOA) by quantifying dominant information flow in multichannel EEGs. Conventional methods mainly use a few EEG channels independently, and most multichannel EEG based studies are limited to specific regions of the brain. Therefore the function of the cerebral cortex over wide brain regions is hardly reflected in DOA measurement. Here, DOA is measured by quantifying the dominant information flow obtained from principle bipartition. Three bipartitioning methods are used to detect the dominant information flow in the entire set of EEG channels, and the dominant information flow is quantified by calculating information entropy. A high correlation between the proposed measures and the plasma concentration of propofol is confirmed from the experimental results of clinical data in 39 subjects. To illustrate the performance of the proposed methods more easily we present the results for multichannel EEG on a two-dimensional (2D) brain map.

  9. Method for Measuring the Information Content of Terrain from Digital Elevation Models

    Directory of Open Access Journals (Sweden)

    Lujin Hu

    2015-10-01

    Full Text Available As digital terrain models are indispensable for visualizing and modeling geographic processes, terrain information content is useful for terrain generalization and representation. For terrain generalization, if the terrain information is considered, the generalized terrain may be of higher fidelity. In other words, the richer the terrain information at the terrain surface, the smaller the degree of terrain simplification. Terrain information content is also important for evaluating the quality of rendered terrain, e.g., the rendered web terrain tile service in Google Maps (Google Inc., Mountain View, CA, USA). However, a unified definition and measures for terrain information content have not been established. Therefore, in this paper, a definition and measures for terrain information content from Digital Elevation Model (DEM, i.e., a digital model or 3D representation of a terrain’s surface) data are proposed, based on the theory of map information content, remote sensing image information content and other geospatial information content. Information entropy was taken as the measuring method for the terrain information content. Two experiments were carried out to verify the measurement methods of the terrain information content. One is the analysis of terrain information content across different geomorphic types, and the results showed that the more complex the geomorphic type, the richer the terrain information content. The other is the analysis of terrain information content at different resolutions, and the results showed that the finer the resolution, the richer the terrain information. Both experiments verified the reliability of the measurements of terrain information content proposed in this paper.
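
    One simple way to realise an entropy-based measure of this kind is to apply information entropy to a DEM's elevation histogram; the sketch below does exactly that, with the number of bins as an arbitrary choice and synthetic terrain as input, so it should be read as an illustration of the principle rather than the paper's exact formulation:

        # Entropy of a DEM's elevation histogram as a crude terrain-information measure.
        import numpy as np

        def terrain_information(dem, bins=64):
            counts, _ = np.histogram(np.asarray(dem).ravel(), bins=bins)
            p = counts[counts > 0] / counts.sum()
            return float(-(p * np.log2(p)).sum())        # information entropy in bits

        x, y = np.meshgrid(np.linspace(0, 4 * np.pi, 200), np.linspace(0, 4 * np.pi, 200))
        rugged = 500 + 100 * np.sin(x) * np.cos(y)       # synthetic rugged terrain
        plain = np.full((200, 200), 500.0)               # synthetic flat plain
        print(terrain_information(rugged), terrain_information(plain))  # rugged scores higher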

  10. Effectiveness of Visual Methods in Information Procedures for Stem Cell Recipients and Donors

    Directory of Open Access Journals (Sweden)

    Çağla Sarıtürk

    2017-12-01

    Full Text Available Objective: Obtaining informed consent from hematopoietic stem cell recipients and donors is a critical step in the transplantation process. Anxiety may affect their understanding of the provided information. However, use of audiovisual methods may facilitate understanding. In this prospective randomized study, we investigated the effectiveness of using an audiovisual method of providing information to patients and donors in combination with the standard model. Materials and Methods: A 10-min informational animation was prepared for this purpose. In total, 82 participants were randomly assigned to two groups: group 1 received the additional audiovisual information and group 2 received standard information. A 20-item questionnaire was administered to participants at the end of the informational session. Results: A reliability test and factor analysis showed that the questionnaire was reliable and valid. For all participants, the mean overall satisfaction score was 184.8±19.8 (maximum possible score of 200). However, for satisfaction with information about written informed consent, group 1 scored significantly higher than group 2 (p=0.039). Satisfaction level was not affected by age, education level, or differences between the physicians conducting the informative session. Conclusion: This study shows that using audiovisual tools may contribute to a better understanding of the informed consent procedure and potential risks of stem cell transplantation.

  11. Comparison of information-theoretic to statistical methods for gene-gene interactions in the presence of genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Sucheston Lara

    2010-09-01

    Full Text Available Abstract Background Multifactorial diseases such as cancer and cardiovascular diseases are caused by the complex interplay between genes and environment. The detection of these interactions remains challenging due to computational limitations. Information-theoretic approaches use computationally efficient directed search strategies and thus provide a feasible solution to this problem. However, the power of information-theoretic methods for interaction analysis has not been systematically evaluated. In this work, we compare the power and Type I error of an information-theoretic approach to existing interaction analysis methods. Methods The k-way interaction information (KWII) metric for identifying variable combinations involved in gene-gene interactions (GGI) was assessed using several simulated data sets under models of genetic heterogeneity driven by susceptibility-increasing loci with varying allele frequency, penetrance values and heritability. The power and proportion of false positives of the KWII were compared to multifactor dimensionality reduction (MDR), the restricted partitioning method (RPM) and logistic regression. Results The power of the KWII was considerably greater than that of MDR on all six simulation models examined. For a given disease prevalence at high values of heritability, the power of both RPM and KWII was greater than 95%. For models with low heritability and/or genetic heterogeneity, the power of the KWII was consistently greater than that of RPM; the improvements in power for the KWII over RPM ranged from 4.7% to 14.2% for α = 0.001 in the three models at the lowest heritability values examined. The KWII performed similarly to logistic regression. Conclusions Information-theoretic models are flexible and have excellent power to detect GGI under a variety of conditions that characterize complex diseases.
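
    For readers unfamiliar with the KWII, the sketch below shows a plug-in estimate for two SNPs and a phenotype, computed from empirical joint entropies; the alternating-sign expansion used here is one common convention (sign conventions for interaction information differ between papers), and the simulated genotype data are purely illustrative:

        # Plug-in estimate of the k-way interaction information (KWII) for two SNPs
        # and a phenotype; entropies are estimated from empirical frequencies.
        import numpy as np
        from itertools import combinations

        def entropy(*cols):
            joint = np.stack(cols, axis=1)
            _, counts = np.unique(joint, axis=0, return_counts=True)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        def kwii(*cols):
            k = len(cols)
            total = 0.0
            for r in range(1, k + 1):
                for subset in combinations(range(k), r):
                    total += (-1) ** (k - r) * entropy(*[cols[i] for i in subset])
            return -total

        rng = np.random.default_rng(1)
        snp1 = rng.integers(0, 3, 500)                       # genotypes coded 0/1/2
        snp2 = rng.integers(0, 3, 500)
        phenotype = ((snp1 == 2) & (snp2 == 2)).astype(int)  # purely interactive effect
        print(kwii(snp1, snp2, phenotype))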

  12. Spectral characterization of natural backgrounds

    Science.gov (United States)

    Winkelmann, Max

    2017-10-01

    As the distribution and use of hyperspectral sensors is constantly increasing, the exploitation of spectral features is a threat for camouflaged objects. To improve camouflage materials at first the spectral behavior of backgrounds has to be known to adjust and optimize the spectral reflectance of camouflage materials. In an international effort, the NATO CSO working group SCI-295 "Development of Methods for Measurements and Evaluation of Natural Background EO Signatures" is developing a method how this characterization of backgrounds has to be done. It is obvious that the spectral characterization of a background will be quite an effort. To compare and exchange data internationally the measurements will have to be done in a similar way. To test and further improve this method an international field trial has been performed in Storkow, Germany. In the following we present first impressions and lessons learned from this field campaign and describe the data that has been measured.

  13. Durability 2007. Injection grout investigations. Background description

    International Nuclear Information System (INIS)

    Orantie, K.; Kuosa, H.

    2008-12-01

    The aim of this project was to evaluate the durability risks of injection grouts. The investigations were done with respect to the application conditions, materials and service life requirements at the ONKALO underground research facility. The study encompassed injection grout mixtures made of ultrafine cement with and without silica fume. Some of the mixtures had a low pH and thus a high silica fume content. The project includes a background description of the durability literature, a laboratory testing programme, a detailed analysis of results and recommendations for selecting ideal grout mixtures. The background description was made for the experimental study of low-pH and reference rock injection grouts as regards pore and microstructure, strength, shrinkage/swelling and thus versatile durability properties. A summary of test methods is presented, as well as examples, i.e. literature information or former test results, of the expected range of results from the tests. Background information about how the test results correlate with other material properties and mix designs is also presented. In addition, the report provides basic information on the pore structure of cement-based materials, and the correlation between the pore structure of cement-based materials and permeability is shortly discussed. The test methods included in the background description are compressive strength, measurement of bulk drying, autogenous and chemical shrinkage and swelling, hydraulic conductivity / permeability, capillary water uptake test, mercury intrusion porosimetry (MIP) and thin section analysis. Three main mixtures with water-binder ratios of 0.8, 1.0 and 1.4 and silica fume contents of 0, 15 and 40% were studied in the laboratory. In addition, two extra mixtures were studied to provide additional information about the effect of varying water-dry-material ratio and silica fume content on durability. The evaluation of water tightness based on water permeability coefficient and micro cracking was

  14. Method of Choosing the Information Technology System Supporting Management of the Military Aircraft Operation

    Directory of Open Access Journals (Sweden)

    Barszcz Piotr

    2014-12-01

    Full Text Available The paper presents a method of choosing the information technology system, the task of which is to support the management process of the military aircraft operation. The proposed method is based on surveys conducted among direct users of IT systems used in aviation of the Polish Armed Forces. The analysis of results of the surveys was conducted using statistical methods. The paper was completed with practical conclusions related to further usefulness of the individual information technology systems. In the future, they can be extremely useful in the process of selecting the best solutions and integration of the information technology systems

  15. Comparison of two heuristic evaluation methods for evaluating the usability of health information systems.

    Science.gov (United States)

    Khajouei, Reza; Hajesmaeel Gohari, Sadrieh; Mirzaee, Moghaddameh

    2018-04-01

    In addition to following the usual Heuristic Evaluation (HE) method, the usability of health information systems can also be evaluated using a checklist. The objective of this study is to compare the performance of these two methods in identifying usability problems of health information systems. Eight evaluators independently evaluated different parts of a Medical Records Information System using two methods of HE (usual and with a checklist). The two methods were compared in terms of the number of problems identified, problem type, and the severity of identified problems. In all, 192 usability problems were identified by two methods in the Medical Records Information System. This was significantly higher than the number of usability problems identified by the checklist and usual method (148 and 92, respectively) (p information systems. The results demonstrated that the checklist method had significantly better performance in terms of the number of identified usability problems; however, the performance of the usual method for identifying problems of higher severity was significantly better. Although the checklist method can be more efficient for less experienced evaluators, wherever usability is critical, the checklist should be used with caution in usability evaluations. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Knowledge and information needs of young people with epilepsy and their parents: Mixed-method systematic review

    Directory of Open Access Journals (Sweden)

    Noyes Jane

    2010-12-01

    Full Text Available Abstract Background Young people with neurological impairments such as epilepsy are known to receive less adequate services compared to young people with other long-term conditions. The time (age 13-19 years) around transition to adult services is particularly important in facilitating young people's self-care and ongoing management. There are epilepsy-specific, biological and psycho-social factors that act as barriers and enablers to information exchange and the nurturing of self-care practices. The review objectives were to identify what is known to be effective in delivering information to young people aged 13-19 years with epilepsy and their parents, to describe their experiences of information exchange in healthcare contexts, and to identify factors influencing positive and negative healthcare communication. Methods The Evidence for Policy and Practice Information Coordinating Centre systematic mixed-method approach was adapted to locate, appraise, extract and synthesise evidence. We used Ley's cognitive hypothetical model of communication and subsequently developed a theoretical framework explaining information exchange in healthcare contexts. Results Young people and parents believed that healthcare professionals were only interested in medical management. Young people felt that discussions about their epilepsy primarily occurred between professionals and parents. Epilepsy information that young people obtained from parents or from their own efforts increased the risk of epilepsy misconceptions. Accurate epilepsy knowledge aided psychosocial adjustment. There is some evidence that interventions, when delivered in a structured, psycho-educational, age-appropriate way, increased young people's epilepsy knowledge, with a positive trend toward improving quality of life. We used mainly qualitative and mixed-method evidence to develop a theoretical framework explaining information exchange in clinical encounters. Conclusions There is a paucity of evidence

  17. Background sources at PEP

    International Nuclear Information System (INIS)

    Lynch, H.; Schwitters, R.F.; Toner, W.T.

    1988-01-01

    Important sources of background for PEP experiments are studied. Background particles originate from high-energy electrons and positrons which have been lost from stable orbits, γ-rays emitted by the primary beams through bremsstrahlung in the residual gas, and synchrotron radiation x-rays. The effect of these processes on the beam lifetime is calculated, and estimates of background rates at the interaction region are given. Recommendations for the PEP design, aimed at minimizing background, are presented. 7 figs., 4 tabs

  18. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    Science.gov (United States)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

    The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method reduces the dimensionality of the informational indicators of a situation by converting them to relative units, calculates generalized information indicators on that basis, ranks them by characteristic levels, and computes a criterion of system efficiency in real time. On this basis, an information evaluation system has been designed that allows information about an object to be analyzed, processed and assessed. Such an object can be a complex technical, economic or social system. The method, and the system based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.
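
    As an illustration of the kind of computation involved, the following sketch (our own, with hypothetical indicator names, weights and thresholds not taken from the article) normalizes raw indicators against reference values, combines them into a single generalized indicator, and ranks the result by characteristic level.

        import numpy as np

        def generalized_indicator(raw, reference, weights, levels=(0.33, 0.66)):
            """Illustrative aggregation: convert raw indicators to relative units,
            combine them into one generalized indicator, and rank it by level."""
            raw = np.asarray(raw, dtype=float)
            reference = np.asarray(reference, dtype=float)
            weights = np.asarray(weights, dtype=float)

            relative = raw / reference                  # dimensionless relative units
            weights = weights / weights.sum()           # normalized weights
            score = float(np.dot(weights, relative))    # generalized indicator

            if score < levels[0]:
                level = "low"
            elif score < levels[1]:
                level = "medium"
            else:
                level = "high"
            return score, level

        # Example with three hypothetical monitoring indicators
        print(generalized_indicator(raw=[80, 0.4, 12], reference=[100, 1.0, 20],
                                    weights=[0.5, 0.3, 0.2]))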

  19. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably more rapid computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy decision rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  20. Cosmic Microwave Background Timeline

    Science.gov (United States)

    Timeline of cosmic microwave background milestones (fragmentary extract): 1934, Richard Tolman shows that blackbody radiation in an expanding universe remains blackbody; a cosmic microwave background with a blackbody temperature of about 5 K is predicted; 1955, Tigran Shmaonov observes an excess microwave background signal; later measurements of anisotropy in the cosmic microwave background strongly support the big bang model.

  1. On Cryptographic Information Security in Cloud Infrastructures: PKI and IBE Methods

    Directory of Open Access Journals (Sweden)

    Konstantin Grigorevich Kogos

    2014-05-01

    Full Text Available The application of cryptographic security methods to information security in cloud infrastructures is analyzed. The cryptographic problems arising in cloud infrastructures are identified, the relevant protocols are investigated, and the underlying mathematical problems are examined.

  2. INFORMATIONAL-METHODICAL SUPPORT OF THE COURSE «MATHEMATICAL LOGIC AND THEORY OF ALGORITHMS»

    Directory of Open Access Journals (Sweden)

    Y. I. Sinko

    2010-06-01

    Full Text Available This article examines the basic principles of a technique for training future teachers of mathematics in the foundations of mathematical logic and the theory of algorithms at Kherson State University with the use of information technologies. A general description is given of how the methodical system for learning mathematical logic functions when the information technologies are provided by «MatLog», an integrated specialized software environment for educational purposes.

  3. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    Science.gov (United States)

    Land, Cecil E.

    1990-01-01

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field.

  4. Application of mathematical methods of analysis in selection of competing information technologies

    Science.gov (United States)

    Semenov, V. L.; Kadyshev, E. N.; Zakharova, A. N.; Patianova, A. O.; Dulina, G. S.

    2018-05-01

    The article discusses the use of qualimetry methods, together with the apparatus of mathematical analysis, in the formation of an integral index that allows one to select the best option among competing information technologies. The authors propose the use of affine space in the evaluation and selection of competing information technologies.

  5. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  6. Empirical studies on informal patient payments for health care services: a systematic and critical review of research methods and instruments

    Directory of Open Access Journals (Sweden)

    Pavlova Milena

    2010-09-01

    Full Text Available Abstract Background Empirical evidence demonstrates that informal patient payments are an important feature of many health care systems. However, the study of these payments is a challenging task because of their potentially illegal and sensitive nature. The aim of this paper is to provide a systematic review and analysis of key methodological difficulties in measuring informal patient payments. Methods The systematic review was based on the following eligibility criteria: English language publications that reported on empirical studies measuring informal patient payments. There were no limitations with regard to the year of publication. The content of the publications was analysed qualitatively and the results were organised in the form of tables. Data sources were Econlit, Econpapers, Medline, PubMed, ScienceDirect, SocINDEX. Results Informal payments for health care services are most often investigated in studies involving patients or the general public, but providers and officials are also sample units in some studies. The majority of the studies apply a single mode of data collection that involves either face-to-face interviews or group discussions. One of the main methodological difficulties reported in the publications concerns the inability of some respondents to distinguish between official and unofficial payments. Another complication is associated with the refusal of some respondents to answer questions on informal patient payments. We do not exclude the possibility that we have missed studies reported in non-English-language journals as well as very recent studies that are not yet published. Conclusions Given the recent evidence from research on survey methods, a self-administered questionnaire during a face-to-face interview could be a suitable mode of collecting sensitive data, such as data on informal patient payments.

  7. Evaluation of two methods for using MR information in PET reconstruction

    International Nuclear Information System (INIS)

    Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.

    2013-01-01

    Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods to introduce this information were evaluated and the Bowsher prior was considered the best. Its main advantage is that it does not require image segmentation. Another method that has been widely used for incorporating MR information is the use of boundaries obtained by segmentation. This method has also shown improvements in image quality. In this paper, two methods for incorporating MR information in PET reconstruction are compared. After a Bayes parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with the Bowsher prior than with boundaries. CV values are 10% lower with the Bowsher prior than with boundaries. Both methods performed better, in terms of MSE and CV, than using no prior, that is, maximum likelihood expectation maximization (MLEM) or MAP without anatomic information. In conclusion, incorporating MR information using the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms again proved effective in noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image, however, still has to be assessed
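
    To illustrate how anatomical MR information can enter a MAP penalty, the sketch below builds Bowsher-style neighbourhood weights for a 2D image: each pixel is smoothed only towards the B of its eight neighbours whose MR intensities are most similar. This is a simplified reading of the Bowsher prior (pixel grid, quadratic penalty), not the reconstruction code evaluated in the paper.

        import numpy as np

        OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

        def bowsher_weights(mr, B=3):
            """For each pixel, mark the B 8-neighbours with the most similar MR
            intensity; the MAP penalty then only smooths over those neighbours."""
            H, W = mr.shape
            weights = np.zeros((H, W, len(OFFSETS)))
            for i in range(H):
                for j in range(W):
                    diffs = []
                    for n, (di, dj) in enumerate(OFFSETS):
                        ii, jj = i + di, j + dj
                        if 0 <= ii < H and 0 <= jj < W:
                            diffs.append((abs(mr[ii, jj] - mr[i, j]), n))
                    for _, n in sorted(diffs)[:B]:      # keep the B most similar neighbours
                        weights[i, j, n] = 1.0
            return weights

        def bowsher_penalty(x, weights):
            """Quadratic penalty: sum over pixels j and selected neighbours k of (x_j - x_k)^2."""
            total = 0.0
            for n, (di, dj) in enumerate(OFFSETS):
                shifted = np.roll(np.roll(x, -di, axis=0), -dj, axis=1)
                total += np.sum(weights[:, :, n] * (x - shifted) ** 2)
            return total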

  8. Using psychological theory to inform methods to optimize the implementation of a hand hygiene intervention

    Directory of Open Access Journals (Sweden)

    Boscart Veronique M

    2012-08-01

    Full Text Available Abstract Background Careful hand hygiene (HH) is the single most important factor in preventing the transmission of infections to patients, but compliance is difficult to achieve and maintain. A lack of understanding of the processes involved in changing staff behaviour may contribute to the failure to achieve success. The purpose of this study was to identify nurses’ and administrators’ perceived barriers and facilitators to current HH practices and the implementation of a new electronic monitoring technology for HH. Methods Ten key informant interviews (three administrators and seven nurses) were conducted to explore barriers and facilitators related to HH and the impact of the new technology on outcomes. The semi-structured interviews were based on the Theoretical Domains Framework by Michie et al. and conducted prior to intervention implementation. Data were explored using an inductive qualitative analysis approach. Data between administrators and nurses were compared. Results In 9 of the 12 domains, nurses and administrators differed in their responses. Administrators believed that nurses have insufficient knowledge and skills to perform HH, whereas the nurses were confident they had the required knowledge and skills. Nurses focused on immediate consequences, whereas administrators highlighted long-term outcomes of the system. Nurses concentrated foremost on their personal safety and their families’ safety as a source of motivation to perform HH, whereas administrators identified professional commitment, incentives, and goal setting. Administrators stated that the staff do not have the decision processes in place to judge whether HH is necessary or not. They also highlighted the positive aspects of teams as a social influence, whereas nurses were not interested in group conformity or being compared to others. Nurses described the importance of individual feedback and self-monitoring in order to increase their performance, whereas

  9. Methane and nitrous oxide emissions from animal manure management, 1990 - 2003 - Background document on the calculation method for the Dutch National Inventory Report

    NARCIS (Netherlands)

    Hoek KW van der; Schijndel MW van; MNP; LVM

    2006-01-01

    Since 2005 the Netherlands has used a new country-specific method to calculate the methane and nitrous oxide emissions from animal manure management. Compared to the default methods provided by the Intergovernmental Panel on Climate Change, this method has led to a more realistic estimate of the

  10. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    Directory of Open Access Journals (Sweden)

    Yusof Maryati Mohd

    2012-12-01

    Full Text Available Abstract Background There are numerous applications for Health Information Systems (HIS that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS, known as IntelliVue Clinical Information Portfolio (ICIP, and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.

  11. FEATURE SELECTION METHODS BASED ON MUTUAL INFORMATION FOR CLASSIFYING HETEROGENEOUS FEATURES

    Directory of Open Access Journals (Sweden)

    Ratri Enggar Pawening

    2016-06-01

    Full Text Available Datasets with heterogeneous features can lead to inappropriate feature selection results because it is difficult to evaluate heterogeneous features concurrently. Feature transformation (FT) is another way to handle heterogeneous feature subset selection. The transformation of non-numerical into numerical features, however, may introduce redundancy with respect to the original numerical features. In this paper, we propose a method to select a feature subset based on mutual information (MI) for classifying heterogeneous features. We use an unsupervised feature transformation (UFT) method and the joint mutual information maximisation (JMIM) method. The UFT method is used to transform non-numerical features into numerical features. The JMIM method is used to select the feature subset with a consideration of the class label. The transformed and the original features are combined, the feature subset is then determined using the JMIM method, and the selected features are classified using the support vector machine (SVM) algorithm. The classification accuracy is measured for each number of selected features and compared between the UFT-JMIM method and the Dummy-JMIM method. The average classification accuracy over all experiments in this study is about 84.47% for the UFT-JMIM method and about 84.24% for the Dummy-JMIM method. This result shows that the UFT-JMIM method can minimize information loss between transformed and original features and select a feature subset that avoids redundant and irrelevant features.
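
    A rough Python sketch of the selection-plus-classification pipeline described above, using scikit-learn. Note that mutual_info_classif ranks features by individual mutual information rather than by the joint JMIM criterion, and the Wine data used here is already numerical, so the UFT step is only indicated in the comments; this is an illustration, not the authors' implementation.

        import numpy as np
        from sklearn.datasets import load_wine
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        # Wine data is fully numerical; a real UFT step would first map any
        # non-numerical columns to numbers before the MI ranking below.
        X, y = load_wine(return_X_y=True)

        mi = mutual_info_classif(X, y, random_state=0)   # MI between each feature and the class
        top = np.argsort(mi)[::-1][:8]                   # keep the 8 most informative features

        acc = cross_val_score(SVC(kernel="rbf", gamma="scale"), X[:, top], y, cv=5).mean()
        print(f"5-fold SVM accuracy on selected features: {acc:.3f}")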

  12. Background information on the estimation of short-term effects of the Energy Agreement for Sustainable Growth on renewable energy; Toelichting inschatting korte-termijneffecten Energieakkoord op hernieuwbare energie

    Energy Technology Data Exchange (ETDEWEB)

    Hekkenberg, M.; Londo, H.M.; Lensink, S.M. [ECN Beleidsstudies, Petten (Netherlands)

    2013-09-01

    On September 4, 2013, representatives of employers' associations, trade union federations, environmental organizations, the Dutch government and civil society signed an Energy Agreement for Sustainable Growth. ECN and PBL have been asked to evaluate this agreement. This report gives background information on the evaluation of the measures aimed at improving energy efficiency in industry and agriculture [Dutch abstract, translated] On September 4, 2013, the 'Energy Agreement for Sustainable Growth' was signed. ECN and PBL were asked to assess the agreement and calculate its effects. This report serves as a background document for the assessment of the measures aimed at energy saving in industry, agriculture and horticulture.

  13. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    Directory of Open Access Journals (Sweden)

    Ross S Williamson

    2015-04-01

    Full Text Available Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID, uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
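
    For orientation, the two quantities whose equivalence is at issue can be written as follows (standard textbook forms; the notation here is ours, not necessarily the paper's):

        % Empirical single-spike information, estimated from the stimuli x_t
        % preceding each of the n_sp spikes:
        \hat{I}_{\mathrm{ss}}
          = \frac{1}{n_{\mathrm{sp}}} \sum_{t:\;\mathrm{spike}}
            \log_2 \frac{\hat{P}(\mathbf{x}_t \mid \mathrm{spike})}{\hat{P}(\mathbf{x}_t)}

        % Log-likelihood of an LNP model with filter matrix K, nonlinearity f,
        % bin size \Delta and spike counts y_t (terms independent of K and f
        % are absorbed into the constant):
        \log \mathcal{L}(K, f)
          = \sum_t \Big[ y_t \log\!\big(f(K\mathbf{x}_t)\,\Delta\big)
                         - f(K\mathbf{x}_t)\,\Delta \Big] + \mathrm{const.}

    The paper's result, as summarized above, is that for Poisson spiking maximizing the first expression over stimulus dimensions is equivalent to maximum-likelihood estimation of the LNP filters via the second.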

  14. Background Information on the Soviet Union

    Science.gov (United States)

    1978-01-01

    [OCR-damaged excerpt: the legible fragments refer to Orlov, the Manege exhibition of December 1962, and comparisons between contemporary conditions and those under Stalin; the remainder of the passage is not recoverable.]

  15. Method and apparatus for bistable optical information storage for erasable optical disks

    Science.gov (United States)

    Land, Cecil E.; McKinney, Ira D.

    1990-01-01

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field-dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk.

  16. A Method of Rotating Machinery Fault Diagnosis Based on the Close Degree of Information Entropy

    Institute of Scientific and Technical Information of China (English)

    GENG Jun-bao; HUANG Shu-hong; JIN Jia-shan; CHEN Fei; LIU Wei

    2006-01-01

    This paper presents a method of rotating machinery fault diagnosis based on the close degree of information entropy. From the viewpoint of information entropy, we introduce four information entropy features of rotating machinery, which describe the vibration condition of the machinery. The four features are named, respectively, singular spectrum entropy, power spectrum entropy, wavelet space state feature entropy and wavelet power spectrum entropy. The value ranges of the four information entropy features of rotating machinery in some typical fault conditions are obtained by experiments, and these can serve as standard features for fault diagnosis. Following the principle that more similar patterns lie at a shorter distance, a decision-making method based on the close degree of information entropy is put forward for the recognition of fault patterns. We demonstrate the effectiveness of this approach in an instance involving the fault pattern recognition of some rotating machinery.
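
    A minimal sketch of two of these entropy features and of the close-degree decision rule, under our own simplifying assumptions (periodogram-based power spectrum entropy, an embedding-matrix singular spectrum entropy, and Euclidean distance to experimentally obtained standard feature vectors); the wavelet-based entropies and the paper's exact definitions are not reproduced.

        import numpy as np

        def power_spectrum_entropy(signal):
            """Shannon entropy of the normalized power spectrum of a vibration signal."""
            psd = np.abs(np.fft.rfft(signal)) ** 2
            p = psd / psd.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)))

        def singular_spectrum_entropy(signal, window=20):
            """Shannon entropy of the normalized singular values of an embedding matrix."""
            n = len(signal) - window + 1
            traj = np.stack([signal[i:i + window] for i in range(n)])
            s = np.linalg.svd(traj, compute_uv=False)
            p = s / s.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)))

        def diagnose(features, fault_patterns):
            """Pick the condition whose standard feature vector is closest
            (smallest Euclidean distance, i.e. highest close degree)."""
            return min(fault_patterns,
                       key=lambda k: np.linalg.norm(np.asarray(features) - np.asarray(fault_patterns[k])))

        # Hypothetical standard feature values obtained from experiments
        patterns = {"unbalance": [3.1, 2.2], "misalignment": [2.4, 2.9], "normal": [3.8, 3.4]}
        x = np.sin(np.linspace(0, 40 * np.pi, 2048)) + 0.1 * np.random.randn(2048)
        print(diagnose([power_spectrum_entropy(x), singular_spectrum_entropy(x)], patterns))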

  17. Video coding and decoding devices and methods preserving PPG relevant information

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a video encoding device (10, 10', 10") and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  18. Video coding and decoding devices and methods preserving ppg relevant information

    NARCIS (Netherlands)

    2013-01-01

    The present invention relates to a video encoding device (10, 10', 10'') and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  19. Visual Methods and Quality in Information Behaviour Research: The Cases of Photovoice and Mental Mapping

    Science.gov (United States)

    Cox, Andrew; Benson, Melanie

    2017-01-01

    Introduction: The purpose of the paper is to explore the ways in which visual methods can increase the quality of qualitative information behaviour research. Methods: The paper examines Tracy's framework of eight criteria for research quality: worthy topic, rich rigour, sincerity, credibility, resonance, significant contribution, ethical issues…

  20. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    Science.gov (United States)

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  1. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  2. Specific Methods of Information Security for Nuclear Materials Control and Accounting Automate Systems

    Directory of Open Access Journals (Sweden)

    Konstantin Vyacheslavovich Ivanov

    2013-02-01

    Full Text Available The paper is devoted to specific information-security methods for automated nuclear material control and accounting systems that do not require OS and DBMS certification and that allow application programs to be modified for specific clients without modifying the protection mechanisms. The ACCORD-2005 system demonstrates an implementation of this method.

  3. Research Methods and Techniques in Spanish Library and Information Science Journals (2012-2014)

    Science.gov (United States)

    Ferran-Ferrer, Núria; Guallar, Javier; Abadal, Ernest; Server, Adan

    2017-01-01

    Introduction. This study examines the research methods and techniques used in Spanish journals of library and information science, the topics addressed by papers in these journals and their authorship affiliation. Method. The researchers selected 580 papers published in the top seven Spanish LIS journals indexed in Web of Science and Scopus and…

  4. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in an integration project usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.
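
    As a toy illustration of the transformation step, the sketch below reads the tasks and sequence flows of a BPMN 2.0 model (standard OMG namespace) and emits a simple JSON routing configuration; the actual mapping rules and configuration format of the proposed tool are not reproduced, and the file name is hypothetical.

        import json
        import xml.etree.ElementTree as ET

        NS = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"   # standard BPMN 2.0 model namespace

        def bpmn_to_config(path):
            """Turn the tasks and sequence flows of a BPMN process model into a
            simple JSON routing configuration (illustrative only)."""
            root = ET.parse(path).getroot()
            tasks = {el.get("id"): el.get("name")
                     for el in root.iter()
                     if el.tag in (NS + "task", NS + "serviceTask", NS + "userTask")}
            routes = [{"from": f.get("sourceRef"), "to": f.get("targetRef")}
                      for f in root.iter(NS + "sequenceFlow")]
            return json.dumps({"steps": tasks, "routes": routes}, indent=2)

        # Example (hypothetical file name):
        # print(bpmn_to_config("radiology_integration.bpmn"))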

  5. THE PRINCIPLES AND METHODS OF INFORMATION AND EDUCATIONAL SPACE SEMANTIC STRUCTURING BASED ON ONTOLOGIC APPROACH REALIZATION

    Directory of Open Access Journals (Sweden)

    Yurij F. Telnov

    2014-01-01

    Full Text Available This article presents principles for the semantic structuring of the information and educational space of knowledge objects and scientific and educational services, using methods of ontological engineering. The novelty of the proposed approach is the interfacing of a content ontology with an ontology of scientific and educational services, which allows an effective composition of services and knowledge objects according to models of professional competences and the requirements of trainees. As a result of applying these methods of semantic structuring of the information and educational space, the integrated use of diverse, distributed scientific and educational content by educational institutions for carrying out scientific research, methodical development and training is ensured.

  6. A study of symbol segmentation method for handwritten mathematical formula recognition using mathematical structure information

    OpenAIRE

    Toyozumi, Kenichi; Yamada, Naoya; Kitasaka, Takayuki; Mori, Kensaku; Suenaga, Yasuhito; Mase, Kenji; Takahashi, Tomoichi

    2004-01-01

    Symbol segmentation is very important in handwritten mathematical formula recognition, since it is the very first portion of the recognition process. This paper proposes a new symbol segmentation method using mathematical structure information. The base technique of symbol segmentation employed in the existing methods is dynamic programming, which optimizes the overall results of individual symbol recognition. The new method we propose here...

  7. Designing Health Websites Based on Users’ Web-Based Information-Seeking Behaviors: A Mixed-Method Observational Study

    Science.gov (United States)

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon

    2016-01-01

    Background Laypeople increasingly use the Internet as a source of health information, but finding and discovering the right information remains problematic. These issues are partially due to the mismatch between the design of consumer health websites and the needs of health information seekers, particularly the lack of support for “exploring” health information. Objective The aim of this research was to create a design for consumer health websites by supporting different health information–seeking behaviors. We created a website called Better Health Explorer with the new design. Through the evaluation of this new design, we derive design implications for future implementations. Methods Better Health Explorer was designed using a user-centered approach. The design was implemented and assessed through a laboratory-based observational study. Participants tried to use Better Health Explorer and another live health website. Both websites contained the same content. A mixed-method approach was adopted to analyze multiple types of data collected in the experiment, including screen recordings, activity logs, Web browsing histories, and audiotaped interviews. Results Overall, 31 participants took part in the observational study. Our new design showed a positive result for improving the experience of health information seeking, by providing a wide range of information and an engaging environment. The results showed better knowledge acquisition, a higher number of page reads, and more query reformulations in both focused and exploratory search tasks. In addition, participants spent more time to discover health information with our design in exploratory search tasks, indicating higher engagement with the website. Finally, we identify 4 design considerations for designing consumer health websites and health information–seeking apps: (1) providing a dynamic information scope; (2) supporting serendipity; (3) considering trust implications; and (4) enhancing interactivity

  8. Depression and suicidal behavior in adolescents: a multi-informant and multi-methods approach to diagnostic classification.

    Directory of Open Access Journals (Sweden)

    Andrew James Lewis

    2014-07-01

    Full Text Available Background: Informant discrepancies have been reported between parent and adolescent measures of depressive disorders and suicidality. We aimed to examine the concordance between adolescent and parent ratings of depressive disorder using both clinical interview and questionnaire measures and assess multi-informant and multi-method approaches to classification. Method: Within the context of assessment of eligibility for a randomized clinical trial, 50 parent–adolescent pairs (mean age of adolescents = 15.0 years) were interviewed separately with a structured diagnostic interview for depression, the KID-SCID. Adolescent self-report and parent-report versions of the Strengths and Difficulties Questionnaire, the Short Mood and Feelings Questionnaire and the Depressive Experiences Questionnaire were also administered. We examined the diagnostic concordance rates of the parent vs. adolescent structured interview methods and the prediction of adolescent diagnosis via questionnaire methods. Results: Parent proxy reporting of adolescent depression and suicidal thoughts and behavior is not strongly concordant with adolescent report. Adolescent self-reported symptoms on depression scales provide a more accurate report of diagnosable adolescent depression than parent proxy reports of adolescent depressive symptoms. Adolescent self-report measures can be combined to improve the accuracy of classification. Parents tend to over-report their adolescent’s depressive symptoms while under-reporting their suicidal thoughts and behavior. Conclusion: Parent proxy report is clearly less reliable than the adolescent’s own report of their symptoms and subjective experiences, and could be considered inaccurate for research purposes. While parent report would still be sought clinically where an adolescent refuses to provide information, our findings suggest that parent reporting of adolescent suicidality should be interpreted with caution.

  9. The International College of Neuropsychopharmacology (CINP) Treatment Guidelines for Bipolar Disorder in Adults (CINP-BD-2017), Part 1: Background and Methods of the Development of Guidelines.

    Science.gov (United States)

    Fountoulakis, Konstantinos N; Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried

    2017-02-01

    This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. © The Author 2016. Published by Oxford University Press on behalf of CINP.

  10. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    Science.gov (United States)

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  11. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    Full Text Available OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected, relative to its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable a maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR evaluating citrus fruit intake and the risk of pancreatic cancer, in which the SES had been calculated using the HLM, was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES, and narrower confidence intervals, than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
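
    For reference, the fixed-effect pooling used to obtain an SES from per-study effect sizes can be sketched as inverse-variance weighting of log relative risks (generic formulas, with made-up study values rather than the citrus fruit data):

        import numpy as np

        def fixed_effect_pool(effects, ci_lower, ci_upper):
            """Inverse-variance fixed-effect pooling of log relative risks, with the
            standard errors recovered from 95% confidence intervals."""
            log_es = np.log(effects)
            se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
            w = 1.0 / se ** 2
            pooled = np.sum(w * log_es) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            return np.exp(pooled), (np.exp(pooled - 1.96 * pooled_se),
                                    np.exp(pooled + 1.96 * pooled_se))

        # Hypothetical per-study relative risks and 95% CIs (not the paper's data)
        ses, ci = fixed_effect_pool([0.85, 0.78, 1.02], [0.70, 0.60, 0.80], [1.03, 1.01, 1.30])
        print(ses, ci)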

  12. Development and content validation of the information assessment method for patients and consumers.

    Science.gov (United States)

    Pluye, Pierre; Granikov, Vera; Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan Lm; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-02-18

    Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded and all inconsistencies in using a

  13. Optimal background matching camouflage.

    Science.gov (United States)

    Michalis, Constantine; Scott-Samuel, Nicholas E; Gibson, David P; Cuthill, Innes C

    2017-07-12

    Background matching is the most familiar and widespread camouflage strategy: avoiding detection by having a similar colour and pattern to the background. Optimizing background matching is straightforward in a homogeneous environment, or when the habitat has very distinct sub-types and there is divergent selection leading to polymorphism. However, most backgrounds have continuous variation in colour and texture, so what is the best solution? Not all samples of the background are likely to be equally inconspicuous, and laboratory experiments on birds and humans support this view. Theory suggests that the most probable background sample (in the statistical sense), at the size of the prey, would, on average, be the most cryptic. We present an analysis, based on realistic assumptions about low-level vision, that estimates the distribution of background colours and visual textures, and predicts the best camouflage. We present data from a field experiment that tests and supports our predictions, using artificial moth-like targets under bird predation. Additionally, we present analogous data for humans, under tightly controlled viewing conditions, searching for targets on a computer screen. These data show that, in the absence of predator learning, the best single camouflage pattern for heterogeneous backgrounds is the most probable sample. © 2017 The Authors.
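
    The central prediction, that the statistically most probable background sample at the scale of the prey is on average the hardest to detect, can be illustrated with a toy calculation; here patches are summarized by just their mean and standard deviation and their density is estimated with a Gaussian kernel, which is far cruder than the low-level vision model used in the paper.

        import numpy as np
        from scipy.stats import gaussian_kde

        def most_probable_patch(background, patch=16):
            """Return the top-left corner of the patch whose simple colour/texture
            features (mean, std) are most probable under the background's own
            feature distribution."""
            H, W = background.shape
            corners, feats = [], []
            for i in range(0, H - patch, patch // 2):
                for j in range(0, W - patch, patch // 2):
                    p = background[i:i + patch, j:j + patch]
                    corners.append((i, j))
                    feats.append([p.mean(), p.std()])
            feats = np.asarray(feats).T                # shape (2, n) expected by gaussian_kde
            density = gaussian_kde(feats)(feats)       # estimated density of each sample
            return corners[int(np.argmax(density))]

        rng = np.random.default_rng(0)
        bg = rng.normal(0.5, 0.15, size=(128, 128))    # synthetic heterogeneous background
        print(most_probable_patch(bg))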

  14. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    Science.gov (United States)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) under a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria that are widely used in active user enumeration, the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator and has better performance at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. Subsequently, we propose an SNR compliance method based on subspace analysis and a training genetic algorithm in order to obtain the performance of both criteria. Moreover, our method uses only a single antenna, in contrast to previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.
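
    The generic eigenvalue form of the AIC and MDL enumerators (the classical Wax–Kailath criteria on which such methods build) can be sketched as follows; the single-antenna subspace construction and the genetic-algorithm refinement of the paper are not reproduced.

        import numpy as np

        def enumerate_sources(X, criterion="MDL"):
            """Estimate the number of active sources from snapshots X (p x N)
            using the Wax-Kailath AIC/MDL eigenvalue criteria (generic form)."""
            p, N = X.shape
            R = X @ X.conj().T / N                          # sample covariance
            eig = np.sort(np.linalg.eigvalsh(R))[::-1]      # eigenvalues, descending
            scores = []
            for k in range(p):                              # hypothesised number of sources
                tail = eig[k:]
                # log of (geometric mean / arithmetic mean) of the noise eigenvalues
                log_ratio = np.mean(np.log(tail)) - np.log(np.mean(tail))
                if criterion == "AIC":
                    score = -2 * N * (p - k) * log_ratio + 2 * k * (2 * p - k)
                else:  # MDL
                    score = -N * (p - k) * log_ratio + 0.5 * k * (2 * p - k) * np.log(N)
                scores.append(score)
            return int(np.argmin(scores))

        # Demo: 6 observation channels, 2 strong sources, weak white noise;
        # this should typically print 2.
        rng = np.random.default_rng(0)
        S = rng.standard_normal((2, 400))
        A = rng.standard_normal((6, 2))
        X = A @ S + 0.1 * rng.standard_normal((6, 400))
        print(enumerate_sources(X))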

  15. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Aiming at reliability evaluation for condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models, which depend on probability and statistics analysis of large samples of data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). The relative energy in each frequency band of the decomposed signal, i.e. its percentage of the whole signal energy, is taken as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is then derived from the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembled quality reliability for a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.

  16. Fault Diagnosis Method Based on Information Entropy and Relative Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoming Xu

    2017-01-01

    Full Text Available In traditional principal component analysis (PCA), because the influence of the dimensions of the different variables in the system is neglected, the selected principal components (PCs) often fail to be representative. While relative transformation PCA is able to solve the above problem, it is not easy to calculate the weight for each characteristic variable. To address this, this paper proposes a fault diagnosis method based on information entropy and relative principal component analysis. Firstly, the algorithm calculates the information entropy for each characteristic variable in the original dataset based on the information gain algorithm. Secondly, it standardizes every variable’s dimension in the dataset. Then, according to the information entropy, it allocates a weight to each standardized characteristic variable. Finally, it utilizes the relative-principal-components model established for fault diagnosis. Furthermore, simulation experiments based on the Tennessee Eastman process and Wine datasets demonstrate the feasibility and effectiveness of the new method.
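
    One plausible reading of the entropy-weighting step is sketched below, with our own simplifications (histogram-based entropy per variable, entropy-proportional weights applied to standardized data before eigen-decomposition); the paper's exact information-gain calculation and relative-transformation details are not reproduced.

        import numpy as np

        def entropy_weights(X, bins=10):
            """Weight each variable by its normalized Shannon entropy, computed
            from a histogram of its values."""
            w = []
            for j in range(X.shape[1]):
                counts, _ = np.histogram(X[:, j], bins=bins)
                p = counts / counts.sum()
                p = p[p > 0]
                w.append(-np.sum(p * np.log(p)))
            w = np.asarray(w)
            return w / w.sum()

        def entropy_weighted_pca(X, n_components=2, bins=10):
            """Standardize, apply entropy-based weights, and project onto the
            leading principal components of the weighted data."""
            Z = (X - X.mean(axis=0)) / X.std(axis=0)
            W = Z * entropy_weights(X, bins)
            vals, vecs = np.linalg.eigh(np.cov(W, rowvar=False))
            order = np.argsort(vals)[::-1][:n_components]
            return W @ vecs[:, order]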

  17. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information of overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method of efficiently and simply classifying degrees of importance of components in terms of safety and reliability while paying attention to root-cause components appearing in the information was developed. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information of overseas nuclear power plants was developed. (author)

  18. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliability of information in the consolidated financial statements of Russian companies. Recommendations are made for their adaptation and application in commercial organizations. Banish method indicators are implemented in one of the world’s largest vertically integrated steel and mining companies. Audit firms are proposed to use methods of assessing the reliability of information in the practical application of ISA.

  19. Methods of Information Subjects and Objects Interaction Rules Formalization in the Electronic Trading Platform System

    Directory of Open Access Journals (Sweden)

    Emma Emanuilova Yandybaeva

    2015-03-01

    Full Text Available Methods for formalizing the rules of interaction between information subjects and objects in an electronic trading platform system have been developed. They are based on a mathematical model of mandatory role-based access control. As a result of the work, a set of user roles has been defined and a role hierarchy constructed. Restrictions have been imposed on the role hierarchy to ensure the safety of the information system.
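
    A toy sketch of the kind of formalization involved, with hypothetical roles, permissions and clearance levels (none taken from the paper): permissions are inherited down the role hierarchy, and a mandatory rule additionally requires the subject's clearance to dominate the object's sensitivity level.

        # Hypothetical role hierarchy for an electronic trading platform:
        # each role inherits the permissions of the roles listed for it.
        HIERARCHY = {"admin": ["broker"], "broker": ["trader"], "trader": []}
        PERMISSIONS = {"admin": {"manage_users"}, "broker": {"approve_order"}, "trader": {"place_order"}}
        CLEARANCE = {"admin": 3, "broker": 2, "trader": 1}

        def effective_permissions(role):
            perms = set(PERMISSIONS.get(role, set()))
            for inherited in HIERARCHY.get(role, []):
                perms |= effective_permissions(inherited)
            return perms

        def access_allowed(role, action, object_level):
            """Role-based check plus a mandatory rule: the role's clearance must
            dominate the object's sensitivity level."""
            return action in effective_permissions(role) and CLEARANCE[role] >= object_level

        print(access_allowed("broker", "place_order", object_level=2))    # True
        print(access_allowed("trader", "approve_order", object_level=1))  # False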

  20. Direct and indirect nitrous oxide emissions from agricultural soils, 1990 - 2003. Background document on the calculation method for the Dutch National Inventory Report

    NARCIS (Netherlands)

    Hoek KW van der; Schijndel MW van; Kuikman PJ; MNP; Alterra; LVM

    2007-01-01

    Since 2005 the Dutch method to calculate the nitrous oxide emissions from agricultural soils has fully complied with the Intergovernmental Panel on Climate Change (IPCC) Good Practice Guidelines. In order to meet the commitments of the Convention on Climate Change and the Kyoto Protocol, nitrous

  1. Study on adsorption of activated carbon fiber to background-level xenon in air by the method of 133Xe tracer

    International Nuclear Information System (INIS)

    Zhang Haitao; Wang Yalong; Zhang Lixing; Wang Xuhui; Zhang Xiaolin

    2001-01-01

    The adsorption behavior of different activated carbon fibers toward ultra-trace xenon in air is studied using 133Xe as a tracer. The efficiency equations of the adsorption columns are determined. A comparison of the adsorptive capacity of activated carbon fibers and activated carbon indicates that activated carbon fibers perform better than activated carbon at low temperature

  2. Inter-lab testing of Hyalella azteca water and sediment methods: 1 background and overview of the 42-d survival, growth and reproduction test

    Science.gov (United States)

    Over the past four years, USEPA-Duluth, USGS-Columbia, the Illinois Natural History Survey, and Environment Canada have been conducting studies to refine the USEPA and ASTM International methods for conducting 10- to 42-d water or sediment toxicity exposures with the amphipod Hya...

  3. Combining qualitative and quantitative operational research methods to inform quality improvement in pathways that span multiple settings

    Science.gov (United States)

    Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin

    2017-01-01

    Background Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Methods Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. Results A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further
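
    As an illustration of the quantitative strand, a classification tree can be grown on audit-style variables to separate patients into risk groups (a generic scikit-learn sketch with made-up variables, not the national audit analysis itself).

        import numpy as np
        import pandas as pd
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(1)
        n = 500
        audit = pd.DataFrame({                       # made-up audit-style variables
            "weight_kg": rng.normal(3.2, 0.6, n),
            "premature": rng.integers(0, 2, n),
            "complex_lesion": rng.integers(0, 2, n),
        })
        # Synthetic outcome used only so the tree has something to learn
        risk = ((audit["weight_kg"] < 2.8) | (audit["complex_lesion"] == 1)).astype(int)

        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=25).fit(audit, risk)
        print(export_text(tree, feature_names=list(audit.columns)))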

  4. Leak testing plan for the Oak Ridge National Laboratory liquid low-level waste systems (active tanks): Revision 2. Volume 1: Regulatory background and plan approach; Volume 2: Methods, protocols, and schedules; Volume 3: Evaluation of the ORNL/LT-823DP differential pressure leak detection method; Appendix to Revision 2: DOE/EPA/TDEC correspondence

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, D.G.; Wise, R.F.; Starr, J.W.; Maresca, J.W. Jr. [Vista Research, Inc., Mountain View, CA (United States)

    1994-11-01

    This document, the Leak Testing Plan for the Oak Ridge National Laboratory Liquid Low-Level Waste System (Active Tanks), comprises three volumes. The first two volumes address the component-based leak testing plan for the liquid low-level waste system at Oak Ridge, while the third volume describes the performance evaluation of the leak detection method that will be used to test this system. Volume 1 describes the portion of the liquid low-level waste system that will be tested; it provides the regulatory background, especially in terms of the requirements stipulated in the Federal Facilities Agreement, upon which the leak testing plan is based. Volume 1 also describes the foundation of the plan, portions of which were abstracted from existing federal documents that regulate the petroleum and hazardous chemicals industries. Finally, Volume 1 gives an overview of the plan, describing the methods that will be used to test the four classes of components in the liquid low-level waste system. Volume 2 takes the general information on component classes and leak detection methods presented in Volume 1 and shows how it applies particularly to each of the individual components. A complete test plan for each of the components is presented, with emphasis placed on the methods designated for testing tanks. The protocol for testing tank systems is described, and general leak testing schedules are presented. Volume 3 describes the results of a performance evaluation completed for the leak testing method that will be used to test the small tanks at the facility (those less than 3,000 gal in capacity). Some of the details described in Volumes 1 and 2 are expected to change as additional information is obtained, as the viability of candidate release detection methods is proven in the Oak Ridge environment, and as the testing program evolves.

  5. Leak testing plan for the Oak Ridge National Laboratory liquid low-level waste systems (active tanks): Revision 2. Volume 1: Regulatory background and plan approach; Volume 2: Methods, protocols, and schedules; Volume 3: Evaluation of the ORNL/LT-823DP differential pressure leak detection method; Appendix to Revision 2: DOE/EPA/TDEC correspondence

    International Nuclear Information System (INIS)

    Douglas, D.G.; Wise, R.F.; Starr, J.W.; Maresca, J.W. Jr.

    1994-11-01

    This document, the Leak Testing Plan for the Oak Ridge National Laboratory Liquid Low-Level Waste System (Active Tanks), comprises three volumes. The first two volumes address the component-based leak testing plan for the liquid low-level waste system at Oak Ridge, while the third volume describes the performance evaluation of the leak detection method that will be used to test this system. Volume 1 describes the portion of the liquid low-level waste system that will be tested; it provides the regulatory background, especially in terms of the requirements stipulated in the Federal Facilities Agreement, upon which the leak testing plan is based. Volume 1 also describes the foundation of the plan, portions of which were abstracted from existing federal documents that regulate the petroleum and hazardous chemicals industries. Finally, Volume 1 gives an overview of the plan, describing the methods that will be used to test the four classes of components in the liquid low-level waste system. Volume 2 takes the general information on component classes and leak detection methods presented in Volume 1 and shows how it applies particularly to each of the individual components. A complete test plan for each of the components is presented, with emphasis placed on the methods designated for testing tanks. The protocol for testing tank systems is described, and general leak testing schedules are presented. Volume 3 describes the results of a performance evaluation completed for the leak testing method that will be used to test the small tanks at the facility (those less than 3,000 gal in capacity). Some of the details described in Volumes 1 and 2 are expected to change as additional information is obtained, as the viability of candidate release detection methods is proven in the Oak Ridge environment, and as the testing program evolves

  6. Preference Construction Processes for Renewable Energies: Assessing the Influence of Sustainability Information and Decision Support Methods

    Directory of Open Access Journals (Sweden)

    Kiyotada Hayashi

    2016-11-01

Full Text Available Sustainability information and decision support can be two important driving forces for making sustainable transitions in society. However, not enough knowledge is available on the effectiveness of these two factors. Here, we conducted an experimental study to support the hypotheses that acquisition of sustainability information and use of decision support methods consistently construct preferences for renewable power generation technologies that use solar power, wind power, small-scale hydroelectric power, geothermal power, wood biomass, or biogas as energy sources. The sustainability information was prepared using a renewable energy-focused input-output model of Japan and contained life cycle greenhouse gas emissions, electricity generation costs, and job creation. We measured rank-ordered preferences in the following four steps in experimental workshops conducted for municipal officials: provision of (1) energy-source names; (2) sustainability information; (3) additional explanation of public value; and (4) knowledge and techniques about multi-attribute value functions. The degree of changes in preference orders was evaluated using Spearman’s rank correlation coefficient. The consistency of rank-ordered preferences among participants was determined by using the maximum eigenvalue for the coefficient matrix. The results show: (1) the individual preferences evolved drastically in response to the sustainability information and the decision support method; and (2) the rank-ordered preferences were more consistent during the preference construction processes. These results indicate that provision of sustainability information, coupled with decision support methods, is effective for decision making regarding renewable energies.

  7. Mutual trust method for forwarding information in wireless sensor networks using random secret pre-distribution

    Directory of Open Access Journals (Sweden)

    Chih-Hsueh Lin

    2016-04-01

Full Text Available In wireless sensor networks, sensing information must be transmitted from sensor nodes to the base station by multiple hopping. Every sensor node is a sender and a relay node that forwards the sensing information that is sent by other nodes. Under an attack, the sensing information may be intercepted, modified, interrupted, or fabricated during transmission. Accordingly, the development of mutual trust to enable a secure path to be established for forwarding information is an important issue. Random key pre-distribution has been proposed to establish mutual trust among sensor nodes. This article modifies the random key pre-distribution to a random secret pre-distribution and incorporates identity-based cryptography to establish an effective method for establishing mutual trust in a wireless sensor network. In the proposed method, the base station assigns an identity and embeds n secrets into the private secret keys for every sensor node. Based on the identity and private secret keys, the mutual trust method is utilized to explore the types of trust among neighboring sensor nodes. The novel method can resist malicious attacks and satisfy the requirements of a wireless sensor network, which are resistance to compromising attacks, masquerading attacks, forgery attacks, and replay attacks, authentication of forwarding messages, and security of sensing information.
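
    A minimal sketch of the shared-secret trust check that underlies random pre-distribution schemes of this kind (illustrative Python, not the authors' identity-based protocol; the pool size, ring size, and node count are arbitrary assumptions):

        import random

        POOL_SIZE = 1000   # size of the global secret pool (assumed value)
        RING_SIZE = 50     # secrets pre-loaded into each node (assumed value)

        def predistribute(node_ids, pool_size=POOL_SIZE, ring_size=RING_SIZE):
            """The base station assigns each node a random subset of the secret pool."""
            pool = list(range(pool_size))
            return {nid: set(random.sample(pool, ring_size)) for nid in node_ids}

        def mutual_trust(ring_a, ring_b):
            """Two neighbouring nodes trust each other if they share at least one secret."""
            return len(ring_a & ring_b) > 0

        rings = predistribute(range(100))
        print(mutual_trust(rings[0], rings[1]))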

  8. Automatic segmentation of MRI head images by 3-D region growing method which utilizes edge information

    International Nuclear Information System (INIS)

    Jiang, Hao; Suzuki, Hidetomo; Toriwaki, Jun-ichiro

    1991-01-01

This paper presents a 3-D segmentation method that automatically extracts soft tissue from multi-sliced MRI head images. MRI produces a sequence of two-dimensional (2-D) images which contains three-dimensional (3-D) information of organs. To utilize such information we need effective algorithms to treat 3-D digital images and to extract organs and tissues of interest. We developed a method to extract the brain from MRI images which uses a region growing procedure and integrates information on the uniformity of gray levels and on the presence of edge segments in the local area around the pixel of interest. First we generate a kernel region which is a part of brain tissue by simple thresholding. Then we grow the region by means of a region growing algorithm under the control of 3-D edge existence to obtain the region of the brain. Our method is rather simple because it uses basic 3-D image processing techniques like spatial differences. It is robust to variations in gray level inside a tissue since it also refers to the edge information in the process of region growing. Therefore, the method is flexible enough to be applicable to the segmentation of other images including soft tissues which have complicated shapes and fluctuations in gray level. (author)
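
    A schematic sketch of edge-constrained region growing in the spirit of the method described above (a simplified 3-D illustration, not the authors' implementation; the gray-level tolerance and edge threshold are arbitrary assumptions):

        import numpy as np
        from scipy import ndimage

        def grow_region(volume, seed_mask, gray_tol=30.0, edge_thresh=50.0, max_iter=200):
            """Grow a kernel region, accepting neighbouring voxels that are similar in
            gray level and do not lie on a strong 3-D edge."""
            # 3-D gradient magnitude as a simple spatial-difference edge measure
            grad = np.sqrt(sum(ndimage.sobel(volume.astype(float), axis=a) ** 2 for a in range(3)))
            region = seed_mask.copy()
            for _ in range(max_iter):
                mean_val = volume[region].mean()
                candidates = ndimage.binary_dilation(region) & ~region
                accept = candidates & (np.abs(volume - mean_val) < gray_tol) & (grad < edge_thresh)
                if not accept.any():
                    break
                region |= accept
            return region

        volume = np.random.randint(0, 255, (32, 32, 32))   # stand-in for an MRI volume
        seed = np.zeros(volume.shape, dtype=bool)
        seed[16, 16, 16] = True                            # kernel region from thresholding
        brain_mask = grow_region(volume, seed)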

  9. A new method to detect geometrical information by the tunneling microscope

    DEFF Research Database (Denmark)

    Tasaki, S.; Levitan, J.; Mygind, Jesper

    1997-01-01

A new method for the detection of the geometrical information by the scanning tunneling microscope is proposed. In addition to the bias voltage, a small ac modulation is applied. The nonlinear dependence of the transmission coefficient on the applied voltage is used to generate harmonics. The ratio of the harmonics to the dc current is found to give the width between the sample and the probe, i.e., the geometrical information. This method may be useful to measure materials, where the local-spatial-density of states may change notably from place to place. ©1997 American Institute of Physics.

  10. Efficiency evaluation of China's investment in Africa under the background of constructing "Silk Road Economic Belt": Application of DEA model and Malmquist index method

    OpenAIRE

    Ze, Tian; Yumei, Fan; Chao, Liu

    2018-01-01

In recent years, with the implementation and advancement of China's strategic plan "One Belt and One Road", the quality and efficiency of China's foreign direct investment have increasingly become a focus of attention. This paper utilizes the DEA model and the Malmquist index method on data for China's investment in 20 African countries and conducts an empirical study on the dynamic evaluation of the efficiency of China's direct investment in Africa. The result shows that the general efficienc...

  11. An information preserving method for producing full coverage CoRoT light curves

    Directory of Open Access Journals (Sweden)

    Pascual-Granado J.

    2015-01-01

Full Text Available Invalid flux measurements, caused mainly by the South Atlantic Anomaly crossing of the CoRoT satellite, introduce aliases in the periodogram and wrong amplitudes. It has been demonstrated that replacing such invalid data with a linear interpolation is not harmless. On the other hand, using power spectrum estimators for unevenly sampled time series is not only less computationally efficient but also leads to difficulties in the interpretation of the results. Therefore, even when the gaps are rather small and the duty cycle is high enough, the use of gap-filling methods is a gain in frequency analysis. However, the method must preserve the information contained in the time series. In this work we give a short description of an information preserving method (MIARMA) and show some results when applying it to CoRoT seismo light curves. The method is implemented as the second step of a pipeline for CoRoT data analysis.

  12. A Frequency Matching Method: Solving Inverse Problems by Use of Geologically Realistic Prior Information

    DEFF Research Database (Denmark)

    Lange, Katrine; Frydendall, Jan; Cordua, Knud Skou

    2012-01-01

The frequency matching method defines a closed form expression for a complex prior that quantifies the higher order statistics of a proposed solution model to an inverse problem. While existing solution methods to inverse problems are capable of sampling the solution space while taking into account arbitrarily complex a priori information defined by sample algorithms, it is not possible to directly compute the maximum a posteriori model, as the prior probability of a solution model cannot be expressed. We demonstrate how the frequency matching method enables us to compute the maximum a posteriori solution model to an inverse problem by using a priori information based on multiple point statistics learned from training images. We demonstrate the applicability of the suggested method on a synthetic tomographic crosshole inverse problem.

  13. Near surface illumination method to detect particle size information by optical calibration free remission measurements

    Science.gov (United States)

    Stocker, Sabrina; Foschum, Florian; Kienle, Alwin

    2017-07-01

A calibration-free method to detect particle size information is presented. A possible application for such measurements is the investigation of raw milk, since there not only the fat and protein content but also the fat droplet size varies. The newly developed method is sensitive to the scattering phase function, which makes it applicable to many other applications, too. By simulating the light propagation with Monte Carlo simulations, a calibration-free device can be developed from this principle.
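
    A toy Monte Carlo sketch of diffuse remission from a scattering medium, illustrating the kind of light-propagation simulation referred to above (isotropic scattering, a semi-infinite medium, and the optical coefficients are simplifying assumptions, not the authors' model of milk):

        import numpy as np

        def remission_fraction(n_photons=20000, mu_s=10.0, mu_a=0.1, rng=None):
            """Fraction of photons re-emitted through the surface of a semi-infinite
            medium, for scattering/absorption coefficients mu_s, mu_a (1/mm)."""
            rng = rng or np.random.default_rng(0)
            mu_t = mu_s + mu_a
            remitted = 0
            for _ in range(n_photons):
                z, uz = 0.0, 1.0                           # photon starts at the surface, heading inwards
                while True:
                    z += uz * rng.exponential(1.0 / mu_t)  # free path to the next interaction
                    if z < 0.0:                            # photon left the medium: remission event
                        remitted += 1
                        break
                    if rng.random() < mu_a / mu_t:         # absorbed
                        break
                    uz = rng.uniform(-1.0, 1.0)            # isotropic scattering (cosine of polar angle)
                # photon weighting, anisotropy, and refractive index mismatch omitted for brevity
            return remitted / n_photons

        print(remission_fraction())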

  14. A method for automated processing of measurement information during mechanical drilling

    Energy Technology Data Exchange (ETDEWEB)

    Samonenko, V.I.; Belinkov, V.G.; Romanova, L.A.

    1984-01-01

An algorithm is presented for a method developed for the automated processing of measurement information during mechanical drilling. Its use under the operating conditions of an automated drilling control system (ASU) will make it possible to precisely identify changes in the lithology, in the physical, mechanical, and abrasive properties, and in the stratum (pore) pressure of the rock being drilled during mechanical drilling, which, along with other methods for testing the drilling process, will increase the reliability of the decisions made.

  15. Application of information-retrieval methods to the classification of physical data

    Science.gov (United States)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  16. Interactive QR code beautification with full background image embedding

    Science.gov (United States)

    Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo

    2017-06-01

QR (Quick Response) code is a kind of two-dimensional barcode that was first developed in the automotive industry. Nowadays, QR codes are widely used in commercial applications like product promotion, mobile payment, product information management, etc. Traditional QR codes conforming to the international standard are reliable and fast to decode, but lack the aesthetic appearance needed to convey visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR codes. Given the information to be encoded and an image to be used as the full QR code background, our method accepts interactive user strokes as hints to remove undesired QR code modules, relying on the QR code error correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and thus achieves a more pleasing result for users, while keeping high machine readability.
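
    The module-removal idea relies on the spare capacity of the QR error correction mechanism; a minimal sketch of generating a high-error-correction code that leaves such headroom (using the third-party qrcode package as an assumed dependency; this is not the authors' beautification tool):

        import qrcode

        # Level H error correction tolerates roughly 30% damaged modules,
        # which is the headroom that module removal and decoration consume.
        qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H, box_size=10, border=4)
        qr.add_data("https://example.com/product-info")
        qr.make(fit=True)

        modules = qr.get_matrix()          # boolean module matrix available for editing
        img = qr.make_image(fill_color="black", back_color="white")
        img.save("aesthetic_base.png")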

  17. Online drug databases: a new method to assess and compare inclusion of clinically relevant information.

    Science.gov (United States)

    Silva, Cristina; Fresco, Paula; Monteiro, Joaquim; Rama, Ana Cristina Ribeiro

    2013-08-01

Evidence-Based Practice requires health care decisions to be based on the best available evidence. The model "Information Mastery" proposes that clinicians should use sources of information whose relevance and validity have previously been evaluated, provided at the point of care. Drug databases (DB) allow easy and fast access to information and have the benefit of more frequent content updates. Relevant information, in the context of drug therapy, is that which supports safe and effective use of medicines. Accordingly, the European Guideline on the Summary of Product Characteristics (EG-SmPC) was used as a standard to evaluate the inclusion of relevant information contents in DB. The objective was to develop and test a method to evaluate the relevancy of DB contents, by assessing the inclusion of information items deemed relevant for effective and safe drug use. The method comprised: hierarchical organisation and selection of the principles defined in the EG-SmPC; definition of criteria to assess inclusion of selected information items; creation of a categorisation and quantification system that allows score calculation; calculation of relative differences (RD) of scores for comparison with an "ideal" database, defined as the one that achieves the best quantification possible for each of the information items; and a pilot test on a sample of 9 drug databases, using 10 drugs frequently associated in the literature with morbidity-mortality and also widely consumed in Portugal. Main outcome measure: calculation of individual and global scores for clinically relevant information items of drug monographs in databases, using the categorisation and quantification system created. A--Method development: selection of sections, subsections, relevant information items and corresponding requisites; system to categorise and quantify their inclusion; score and RD calculation procedure. B--Pilot test: calculated scores for the 9 databases; globally, all databases evaluated significantly differed from the "ideal" database; some DB performed

  18. Cosmic microwave background radiation

    International Nuclear Information System (INIS)

    Wilson, R.W.

    1979-01-01

    The 20-ft horn-reflector antenna at Bell Laboratories is discussed in detail with emphasis on the 7.35 cm radiometer. The circumstances leading to the detection of the cosmic microwave background radiation are explored

  19. Direct and indirect nitrous oxide emissions from agricultural soils, 1990 - 2003. Background document on the calculation method for the Dutch National Inventory Report

    International Nuclear Information System (INIS)

    Van der Hoek, K.W.; Van Schijndel, M.W.; Kuikman, P.J.

    2007-01-01

    Since 2005 the Dutch method to calculate the nitrous oxide emissions from agricultural soils has fully complied with the Intergovernmental Panel on Climate Change (IPCC) Good Practice Guidelines. In order to meet the commitments of the Convention on Climate Change and the Kyoto Protocol, nitrous oxide emissions have to be reported annually in the Dutch National Inventory Report (NIR). Countries are encouraged to use country-specific data rather than the default values provided by the IPCC. This report describes the calculation schemes and data sources used for nitrous oxide emissions from agricultural soils in the Netherlands. The nitrous oxide emissions, which contribute to the greenhouse effect, occur due to nitrification and denitrification processes. They include direct emissions from agricultural soils due to the application of animal manure and fertilizer nitrogen and the manure production in the meadow. Also included are indirect emissions resulting from the subsequent leaching of nitrate to ground water and surface waters, and from deposition of ammonia that had volatilized as a result of agricultural activities. Before 2005 indirect emissions in the Netherlands were calculated using a method that did not compare well with IPCC definitions and categories. The elaborate explanation here should facilitate reviewing by experts. Finally, the report also presents an overview of the nitrous oxide emissions from agricultural soils and the underlying data used in the 1990 - 2003 period

  20. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    Science.gov (United States)

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This makes it difficult to identify sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways. Doing so will help us to identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable, network traversal algorithm and we assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network, and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for the simultaneous network minimization and information flow analysis across various "omics" levels.
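
    A simplified sketch of the forward/backward reachability idea behind this kind of network minimization (plain breadth-first traversal on a toy regulatory graph; an illustration only, not the authors' flooding algorithm):

        from collections import deque

        def reachable(graph, starts):
            """Breadth-first set of nodes reachable from the given start nodes."""
            seen, queue = set(starts), deque(starts)
            while queue:
                for nxt in graph.get(queue.popleft(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        def minimal_subnetwork(graph, sources, sinks):
            """Keep only nodes lying on some source-to-sink regulatory path."""
            reverse = {}
            for u, vs in graph.items():
                for v in vs:
                    reverse.setdefault(v, []).append(u)
            keep = reachable(graph, sources) & reachable(reverse, sinks)
            return {u: [v for v in graph.get(u, []) if v in keep] for u in keep}

        toy = {"in": ["a"], "a": ["b", "c"], "b": ["out"], "c": ["d"], "d": []}
        print(minimal_subnetwork(toy, {"in"}, {"out"}))   # drops the dead-end branch c-d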

  1. Information content in B→VV decays and the angular moments method

    International Nuclear Information System (INIS)

    Dighe, A.; Sen, S.

    1998-10-01

The time-dependent angular distributions of decays of neutral B mesons into two vector mesons contain information about the lifetimes, mass differences, strong and weak phases, form factors, and CP violating quantities. A statistical analysis of the information content is performed by giving the "information" a quantitative meaning. It is shown that for some parameters of interest, the information content in time and angular measurements combined may be orders of magnitude more than the information from time measurements alone and hence the angular measurements are highly recommended. The method of angular moments is compared with the (maximum) likelihood method to find that it works almost as well in the region of interest for the one-angle distribution. For the complete three-angle distribution, an estimate of possible statistical errors expected on the observables of interest is obtained. It indicates that the three-angle distribution, unraveled by the method of angular moments, would be able to nail down many quantities of interest and will help in pointing unambiguously to new physics. (author)

  2. Information operator approach and iterative regularization methods for atmospheric remote sensing

    International Nuclear Information System (INIS)

    Doicu, A.; Hilgers, S.; Bargen, A. von; Rozanov, A.; Eichmann, K.-U.; Savigny, C. von; Burrows, J.P.

    2007-01-01

In this study, we present the main features of the information operator approach for solving linear inverse problems arising in atmospheric remote sensing. This method is superior to the stochastic version of the Tikhonov regularization (or the optimal estimation method) due to its capability to filter out the noise-dominated components of the solution generated by an inappropriate choice of the regularization parameter. We extend this approach to iterative methods for nonlinear ill-posed problems and derive the truncated versions of the Gauss-Newton and Levenberg-Marquardt methods. Although the paper mostly focuses on discussing the mathematical details of the inverse method, retrieval results have been provided, which exemplify the performances of the methods. These results correspond to the NO2 retrieval from SCIAMACHY limb scatter measurements and have been obtained by using the retrieval processors developed at the German Aerospace Center Oberpfaffenhofen and the Institute of Environmental Physics of the University of Bremen.
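
    A small numerical sketch of the component-filtering idea, truncating noise-dominated singular components of a linear forward model (a generic illustration with synthetic data, not the SCIAMACHY retrieval processors mentioned above):

        import numpy as np

        def truncated_solution(K, y, rel_cutoff=1e-2):
            """Least-squares solution of K x = y keeping only singular components
            whose singular value exceeds rel_cutoff times the largest one."""
            U, s, Vt = np.linalg.svd(K, full_matrices=False)
            keep = s > rel_cutoff * s[0]
            coeff = (U.T @ y)[keep] / s[keep]          # noise-dominated components dropped
            return Vt[keep].T @ coeff

        rng = np.random.default_rng(1)
        K = rng.normal(size=(60, 20)) @ np.diag(np.logspace(0, -6, 20))   # ill-conditioned kernel
        x_true = rng.normal(size=20)
        y = K @ x_true + 1e-3 * rng.normal(size=60)
        print(np.linalg.norm(truncated_solution(K, y) - x_true))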

  3. An innovative method for extracting isotopic information from low-resolution gamma spectra

    International Nuclear Information System (INIS)

    Miko, D.; Estep, R.J.; Rawool-Sullivan, M.W.

    1998-01-01

A method is described for the extraction of isotopic information from attenuated gamma ray spectra using the gross-count material basis set (GC-MBS) model. This method solves for the isotopic composition of an unknown mixture of isotopes attenuated through an absorber of unknown material. For binary isotopic combinations the problem is nonlinear in only one variable and is easily solved using standard line optimization techniques. Results are presented for NaI spectrum analyses of various binary combinations of enriched uranium, depleted uranium, low burnup Pu, 137Cs, and 133Ba attenuated through a suite of absorbers ranging in Z from polyethylene through lead. The GC-MBS method results are compared to those computed using ordinary response function fitting and with a simple net peak area method. The GC-MBS method was found to be significantly more accurate than the other methods over the range of absorbers and isotopic blends studied.

  4. A Review of Methods for Analysis of the Expected Value of Information.

    Science.gov (United States)

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
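
    A compact sketch of the regression-based EVPPI estimator discussed in such reviews, here with a simple polynomial smoother standing in for a full nonparametric regression (the two-option decision model and parameter distributions are made-up assumptions):

        import numpy as np

        rng = np.random.default_rng(2)
        n = 10000
        theta = rng.normal(0.5, 0.2, n)                      # parameter of interest (assumed prior)
        other = rng.normal(0.0, 1.0, n)                      # remaining uncertain parameters
        nb = np.column_stack([np.zeros(n),                   # net benefit, option 0
                              2000 * theta - 1000 + 300 * other])   # net benefit, option 1 (toy model)

        # EVPPI = E[max over options of the regression of net benefit on theta]
        #         minus the max over options of the mean net benefit.
        fitted = np.column_stack([np.polyval(np.polyfit(theta, nb[:, d], 3), theta)
                                  for d in range(nb.shape[1])])
        evppi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
        print(round(evppi, 1))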

  5. Double Separation Method for Translation of the Infrared Information into a Visible Area

    Directory of Open Access Journals (Sweden)

    Ivana Žiljak

    2009-06-01

Full Text Available Information visualization refers to the wavelength area ranging from 400 to 700 nm. Areas in lower wavelengths ranging from 100 to 400 nm are translated into the visual area with the goal to protect information visible only by applying instruments adapted for the ultraviolet area. Our recent research work refers to the infrared wavelength areas above the visible spectrum up to 1000 nm. The scientific contribution of this paper is in setting the double separation method for printing with CMYK printing inks with the goal to detect graphic information in the infrared area only. An algorithm has been created for making visual basics in the overall visible spectrum containing material that responds in the infrared section. This allows planning of areas in all coloring types for one and the same document that contains a secure piece of information. The system is based on double transition transformation of the visible RGB information recognition into CMYK in the same document. Secure information is recognized with the help of instruments in the set wavelength range. Most of the experiments have been carried out by analyzing the same set of RGB records. Each sample in the set was a test unit coming from another source containing different IR components. Thus an innovative method of color mixing has been set where colors appear respectively in daylight and separately according to IR light programming. New IR cryptography is proposed as shown in the experimental work.

  6. Unified method to integrate and blend several, potentially related, sources of information for genetic evaluation.

    Science.gov (United States)

    Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas

    2014-09-30

    A condition to predict unbiased estimated breeding values by best linear unbiased prediction is to use simultaneously all available data. However, this condition is not often fully met. For example, in dairy cattle, internal (i.e. local) populations lead to evaluations based only on internal records while widely used foreign sires have been selected using internally unavailable external records. In such cases, internal genetic evaluations may be less accurate and biased. Because external records are unavailable, methods were developed to combine external information that summarizes these records, i.e. external estimated breeding values and associated reliabilities, with internal records to improve accuracy of internal genetic evaluations. Two issues of these methods concern double-counting of contributions due to relationships and due to records. These issues could be worse if external information came from several evaluations, at least partially based on the same records, and combined into a single internal evaluation. Based on a Bayesian approach, the aim of this research was to develop a unified method to integrate and blend simultaneously several sources of information into an internal genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. This research resulted in equations that integrate and blend simultaneously several sources of information and avoid double-counting of contributions due to relationships and due to records. The performance of the developed equations was evaluated using simulated and real datasets. The results showed that the developed equations integrated and blended several sources of information well into a genetic evaluation. The developed equations also avoided double-counting of contributions due to relationships and due to records. Furthermore, because all available external sources of information were correctly propagated, relatives of external animals benefited from the integrated

  7. Background radioactivity in environmental materials

    International Nuclear Information System (INIS)

    Maul, P.R.; O'Hara, J.P.

    1989-01-01

    This paper presents the results of a literature search to identify information on concentrations of 'background' radioactivity in foodstuffs and other commonly available environmental materials. The review has concentrated on naturally occurring radioactivity in foods and on UK data, although results from other countries have also been considered where appropriate. The data are compared with established definitions of a 'radioactive' substance and radionuclides which do not appear to be adequately covered in the literature are noted. (author)

  8. Background paper on aquaculture research

    OpenAIRE

    Wenblad, Axel; Jokumsen, Alfred; Eskelinen, Unto; Torrissen, Ole

    2013-01-01

    The Board of MISTRA established in 2012 a Working Group (WG) on Aquaculture to provide the Board with background information for its upcoming decision on whether the foundation should invest in aquaculture research. The WG included Senior Advisor Axel Wenblad, Sweden (Chairman), Professor Ole Torrissen, Norway, Senior Advisory Scientist Unto Eskelinen, Finland and Senior Advisory Scientist Alfred Jokumsen, Denmark. The WG performed an investigation of the Swedish aquaculture sector including ...

  9. Method and Apparatus Providing Deception and/or Altered Operation in an Information System Operating System

    Science.gov (United States)

    Cohen, Fred; Rogers, Deanna T.; Neagoe, Vicentiu

    2008-10-14

    A method and/or system and/or apparatus providing deception and/or execution alteration in an information system. In specific embodiments, deceptions and/or protections are provided by intercepting and/or modifying operation of one or more system calls of an operating system.

  10. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  11. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  12. A novel Bayesian learning method for information aggregation in modular neural networks

    DEFF Research Database (Denmark)

    Wang, Pan; Xu, Lida; Zhou, Shang-Ming

    2010-01-01

Modular neural network is a popular neural network model which has many successful applications. In this paper, a sequential Bayesian learning (SBL) method is proposed for modular neural networks aiming at efficiently aggregating the outputs of members of the ensemble. The experimental results on eight benchmark problems have demonstrated that the proposed method can perform information aggregation efficiently in data modeling.

  13. A Simple Method to Determine if a Music Information Retrieval System is a "Horse"

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2014-01-01

We propose and demonstrate a simple method to determine if a music information retrieval (MIR) system is using factors irrelevant to the task for which it is designed. This is of critical importance to certain use cases, but cannot be accomplished using standard approaches to evaluation in MIR.

  14. Method of projects on informatics lesson - as means of pupils’ informative competency development.

    Directory of Open Access Journals (Sweden)

    О. Staryh

    2008-06-01

Full Text Available Information competence is formed most effectively by pupils when three conditions are fulfilled jointly: problem-based education, the use of multimedia technologies, and the project method. Non-traditional lessons conducted at the Kherson Academical Lyceum help to arouse children's desire for self-education and the realization of their abilities.

  15. Reframing Research on Methods Courses to Inform Mathematics Teacher Educators' Practice

    Science.gov (United States)

    Kastberg, Signe E.; Tyminski, Andrew M.; Sanchez, Wendy B.

    2017-01-01

    Calls have been made for the creation of a shared knowledge base in mathematics teacher education with the power to inform the design of scholarly inquiry and mathematics teacher educators' (MTEs) scholarly practices. Focusing on mathematics methods courses, we summarize and contribute to literature documenting activities MTEs use in mathematics…

  16. New Fault Recognition Method for Rotary Machinery Based on Information Entropy and a Probabilistic Neural Network.

    Science.gov (United States)

    Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu

    2018-01-24

Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. In order to cope with the complexity of the vibration signal of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from vibration signals, namely, singular spectrum entropy, power spectrum entropy, and approximate entropy. Then the feature fusion model is constructed to classify and diagnose the fault signals. The proposed approach can combine comprehensive information from different aspects and is more sensitive to the fault features. The experimental results on simulated fault signals verified the better performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher compared with the methods using the three kinds of information entropy separately. The new approach is shown to be an effective fault recognition method for rotating machinery.
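
    A brief sketch of one of the entropy features named above, power spectrum entropy, computed for a vibration signal (illustrative only; the simulated fault signal is an assumption and the probabilistic neural network stage is omitted):

        import numpy as np

        def power_spectrum_entropy(signal):
            """Shannon entropy of the normalised power spectrum of a 1-D signal."""
            spectrum = np.abs(np.fft.rfft(signal)) ** 2
            p = spectrum / spectrum.sum()
            p = p[p > 0]                          # avoid log(0)
            return -np.sum(p * np.log(p))

        t = np.linspace(0, 1, 2048, endpoint=False)
        healthy = np.sin(2 * np.pi * 50 * t)
        faulty = (healthy + 0.5 * np.sin(2 * np.pi * 173 * t)
                  + 0.2 * np.random.default_rng(3).normal(size=t.size))
        print(power_spectrum_entropy(healthy), power_spectrum_entropy(faulty))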

  17. The duty to provide information in the case of neuroradiological examination methods

    International Nuclear Information System (INIS)

    Deutsch, E.

    1987-01-01

    The author gives a survey on the judicial decisions concerning the obligation to give information in the case of neuroradiological examination methods. The scope and content of the medical explanation depends inter alia on the urgency and the necessity of the medical diagnosis and on the understanding of the patient. (WG) [de

  18. Aligning Professional Skills and Active Learning Methods: An Application for Information and Communications Technology Engineering

    Science.gov (United States)

    Llorens, Ariadna; Berbegal-Mirabent, Jasmina; Llinàs-Audet, Xavier

    2017-01-01

    Engineering education is facing new challenges to effectively provide the appropriate skills to future engineering professionals according to market demands. This study proposes a model based on active learning methods, which is expected to facilitate the acquisition of the professional skills most highly valued in the information and…

  19. The Standardization Method of Address Information for POIs from Internet Based on Positional Relation

    Directory of Open Access Journals (Sweden)

    WANG Yong

    2016-05-01

Full Text Available As points of interest (POI) on the internet widely exhibit incomplete addresses and inconsistent literal expressions, a fast standardization processing method for network POI address information based on spatial constraints is proposed. Based on the model of the extensible address expression, the address information of a POI is first segmented and extracted. Address elements are then updated by matching against the address tree layer by layer. Next, by defining four types of positional relations, corresponding sets are selected from the standard POI library as candidates for the enrichment and amendment of non-standard addresses. Finally, fast standardized processing of POI address information is achieved by backtracking address elements with minimum granularity. Experiments in this paper show that the standardization processing of an address can be realized by this method with higher accuracy, in order to build the address database.
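
    A toy sketch of layer-by-layer matching of segmented address elements against an address tree (the tree contents and place names are invented for illustration; candidate selection via positional relations and backtracking are not shown):

        # Hypothetical address tree: province -> city -> district (illustrative data only)
        ADDRESS_TREE = {
            "Guangdong": {"Shenzhen": {"Nanshan": {}, "Futian": {}},
                          "Guangzhou": {"Tianhe": {}, "Yuexiu": {}}},
        }

        def standardize(elements):
            """Match extracted address elements against the tree, layer by layer,
            keeping the deepest consistent prefix as the standardized address."""
            matched, level = [], ADDRESS_TREE
            for element in elements:
                if element not in level:
                    break        # unmatched element: handled by positional relations / backtracking
                matched.append(element)
                level = level[element]
            return matched

        print(standardize(["Guangdong", "Shenzhen", "Nanshan"]))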

  20. An Efficient Method to Search Real-Time Bulk Data for an Information Processing System

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Kim, Jong Myung; Suh, Yong Suk; Keum, Jong Yong; Park, Heui Youn

    2005-01-01

The Man Machine Interface System (MMIS) of the System-integrated Modular Advanced ReacTor (SMART) is designed with fully digitalized features. The Information Processing System (IPS) of the MMIS acquires and processes plant data from other systems. In addition, the IPS provides plant operation information to operators in the control room. The IPS is required to process bulk data in real time. It is therefore necessary to consider a special processing method with regard to flexibility and performance, because several thousand items of plant information converge on the IPS. Among other things, the processing time for searching the bulk data is much greater than the other processing times. Thus, this paper explores an efficient method for the search and examines its feasibility.
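
    One common way to cut search time for such bulk data is to index records by identifier so that lookups avoid a linear scan; a generic illustration only, not necessarily the search method examined in the paper (tag names and record layout are invented):

        import time

        records = [{"tag": f"PT-{i:05d}", "value": i * 0.1} for i in range(100000)]

        index = {rec["tag"]: rec for rec in records}     # built once when data is acquired

        t0 = time.perf_counter()
        linear = next(r for r in records if r["tag"] == "PT-09999")   # linear scan
        t1 = time.perf_counter()
        hashed = index["PT-09999"]                                    # indexed lookup
        t2 = time.perf_counter()
        print(f"linear scan: {t1 - t0:.6f}s, indexed lookup: {t2 - t1:.6f}s")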

  1. Mixed multiscale finite element methods using approximate global information based on partial upscaling

    KAUST Repository

    Jiang, Lijian

    2009-10-02

    The use of limited global information in multiscale simulations is needed when there is no scale separation. Previous approaches entail fine-scale simulations in the computation of the global information. The computation of the global information is expensive. In this paper, we propose the use of approximate global information based on partial upscaling. A requirement for partial homogenization is to capture long-range (non-local) effects present in the fine-scale solution, while homogenizing some of the smallest scales. The local information at these smallest scales is captured in the computation of basis functions. Thus, the proposed approach allows us to avoid the computations at the scales that can be homogenized. This results in coarser problems for the computation of global fields. We analyze the convergence of the proposed method. Mathematical formalism is introduced, which allows estimating the errors due to small scales that are homogenized. The proposed method is applied to simulate two-phase flows in heterogeneous porous media. Numerical results are presented for various permeability fields, including those generated using two-point correlation functions and channelized permeability fields from the SPE Comparative Project (Christie and Blunt, SPE Reserv Evalu Eng 4:308-317, 2001). We consider simple cases where one can identify the scales that can be homogenized. For more general cases, we suggest the use of upscaling on the coarse grid with the size smaller than the target coarse grid where multiscale basis functions are constructed. This intermediate coarse grid renders a partially upscaled solution that contains essential non-local information. Numerical examples demonstrate that the use of approximate global information provides better accuracy than purely local multiscale methods. © 2009 Springer Science+Business Media B.V.

  2. An automated background estimation procedure for gamma ray spectra

    International Nuclear Information System (INIS)

    Tervo, R.J.; Kennett, T.J.; Prestwich, W.V.

    1983-01-01

An objective and simple method has been developed to estimate the background continuum in Ge gamma ray spectra. Requiring no special procedures, the method is readily automated. Based upon the inherent statistical properties of the experimental data itself, nodes, which reflect background samples, are located and used to produce an estimate of the continuum. A simple procedure to interpolate between nodes is reported and a range of rather typical experimental data is presented. All information necessary to implement this technique is given, including the relevant properties of various factors involved in its development. (orig.)
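
    A simplified sketch of node-based continuum estimation: select channels that are statistically consistent with a smooth continuum and interpolate between them (the local statistic and thresholds here are simplifying assumptions, not the authors' exact criterion):

        import numpy as np

        def estimate_continuum(counts, window=5, k_sigma=2.0):
            """Estimate a smooth background under peaks in a counting spectrum."""
            channels = np.arange(counts.size)
            local_med = np.array([np.median(counts[max(0, i - window): i + window + 1])
                                  for i in channels])
            # Nodes: channels whose content agrees with the local level
            # within Poisson fluctuations (sigma ~ sqrt(N)).
            nodes = np.abs(counts - local_med) < k_sigma * np.sqrt(np.maximum(local_med, 1.0))
            return np.interp(channels, channels[nodes], counts[nodes])

        rng = np.random.default_rng(4)
        x = np.arange(512)
        background = 200.0 * np.exp(-x / 300.0)
        peak = 500.0 * np.exp(-0.5 * ((x - 250) / 3.0) ** 2)
        spectrum = rng.poisson(background + peak)
        continuum = estimate_continuum(spectrum)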

  3. Ultrasound-guided continuous interscalene block: the influence of local anesthetic background delivery method on postoperative analgesia after shoulder surgery: a randomized trial.

    Science.gov (United States)

    Hamdani, Mehdi; Chassot, Olivier; Fournier, Roxane

    2014-01-01

    Automated bolus delivery has recently been shown to reduce local anesthetic consumption and improve analgesia, compared with continuous infusion, in continuous sciatic and epidural block. However, there are few data on the influence of local anesthetic delivery method on local anesthetic consumption following interscalene blockade. This randomized, double-blind trial was designed to determine whether hourly automated perineural boluses (4 mL) of local anesthesia delivered with patient-controlled pro re nata (PRN, on demand) boluses would result in a reduction in total local anesthesia consumption during continuous interscalene blockade after shoulder surgery compared with continuous perineural infusion (4 mL/h) plus patient-controlled PRN boluses. One hundred one patients undergoing major shoulder surgery under general anesthesia with ultrasound-guided continuous interscalene block were randomly assigned to receive 0.2% ropivacaine via interscalene end-hole catheter either by continuous infusion 4 mL/h (n = 50) or as automated bolus 4 mL/h (n = 51). Both delivery methods were combined with 5 mL PRN boluses of 0.2% ropivacaine with a lockout time of 30 minutes. Postoperative number of PRN boluses, 24- and 48-hour local anesthetic consumption, pain scores, rescue analgesia (morphine), and adverse events were recorded. There were no significant differences in either the number of PRN ropivacaine boluses or total 48 hour local anesthetic consumption between the groups (18.5 [11-25.2] PRN boluses in the continuous infusion group vs 17 [8.5-29] PRN boluses in the automated bolus group). Postoperative pain was similar in both groups; on day 2, the median average pain score was 4 (2-6) in the continuous infusion group versus 3 (2-5) in the automated bolus group (P = 0.54). Nor were any statistically significant intergroup differences observed with respect to morphine rescue, incidence of adverse events, or patient satisfaction. In continuous interscalene blockade under

  4. Information Literacy Instruction Assessment and Improvement through Evidence Based Practice: A Mixed Method Study

    Directory of Open Access Journals (Sweden)

    Diana K. Wakimoto

    2010-03-01

Full Text Available Objective — This study explored first-year students’ learning and satisfaction in a required information literacy course. The study asked how students understand connections between themselves and information literacy in terms of power, society, and personal relevance to assess if students’ understanding of information literacy increased after taking the course. Student satisfaction with the course also was measured. Methods — The study used pre- and post-tests and focus group session transcripts which were coded and analyzed to determine student learning and satisfaction during the regular 2008-2009 academic year at California State University, East Bay. Results — Many students entered the course without any concept of information literacy; however, after taking the course they found information literacy to be personally relevant and were able to articulate connections among information, power, and society. The majority of students were satisfied with the course. The results from analyzing the pre- and post-tests were supported by the findings from the focus group sessions. Conclusion — The results of this study are supported by other studies that show the importance of personal relevancy to student learning. In order to fully assess information literacy instruction and student learning, librarians should consider incorporating ways of assessing student learning beyond testing content knowledge and levels of competency.

  5. Soil scientific survey of 220/38 kV cable circuits of the power station 'Eemscentrale' in the Dutch province Groningen; Theoretical backgrounds, research method and results

    International Nuclear Information System (INIS)

    Langevoord, J.; Van Loon, L.J.M.

    1995-01-01

Recently, five underground cable circuits were completed at the site of the power station mentioned in the title, owned by EPON (an energy utility for the north-eastern part of the Netherlands), consisting of two 220 kV and two 380 kV connections with a total length of 24 km. Soil scientific in situ investigations and laboratory tests were carried out in advance to collect data, on the basis of which thermal resistivity and critical thermal conditions could be calculated. The calculated results demonstrated that no dehydrated zones occurred around the cable under design-criteria conditions. Optimal cable bed conditions could be achieved using some of the sand excavated from the trench. In this article, attention is paid to theoretical aspects of heat transfer of cables for underground electricity transport, the research method of the soil scientific survey, and the results of the survey for the design of the cable connection, to be made by NKF (cable manufacturer), and for the final execution of the cable design. In a second article, to be published in a forthcoming issue of this magazine, attention will be paid to soil scientific boundary conditions and soil scientific supervision during the realization. 9 figs., 6 tabs., 9 refs

  6. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

Full Text Available After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction is proposed. This method takes full advantage of the self-adaptive filtering characteristic and waveform correction feature of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. The research merges the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction. The values of the four aforementioned indexes are combined into a feature vector. Then, the relevant characteristic components in the vibration signal are accurately extracted by a Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plagues traditional methods, is effectively removed. The large cumulative error of the traditional time-domain integral is overcome. Moreover, the large low-frequency error of the traditional frequency-domain integral is avoided. Compared with traditional integral methods, this method is outstanding at removing noise while retaining useful feature information and shows higher accuracy and superiority.
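
    A minimal sketch of the four-element feature vector and the Euclidean-distance comparison described above (the ensemble empirical mode decomposition step is omitted; the features are computed on raw synthetic segments, which is an assumption made for illustration):

        import numpy as np

        def feature_vector(x):
            """Kurtosis, mean square error about the mean, energy, and largest singular value.
            Assumes the segment length is divisible by 32 for the matrix reshaping."""
            x = np.asarray(x, dtype=float)
            kurt = np.mean((x - x.mean()) ** 4) / (x.var() ** 2 + 1e-12)
            mse = np.mean((x - x.mean()) ** 2)
            energy = np.sum(x ** 2)
            sv = np.linalg.svd(x.reshape(32, -1), compute_uv=False)[0]   # largest singular value
            return np.array([kurt, mse, energy, sv])

        def closest_component(components, reference):
            """Pick the component whose feature vector is nearest (Euclidean) to the reference."""
            ref = feature_vector(reference)
            dists = [np.linalg.norm(feature_vector(c) - ref) for c in components]
            return int(np.argmin(dists))

        t = np.linspace(0, 1, 1024, endpoint=False)
        raw = np.sin(2 * np.pi * 25 * t)
        components = [raw + 0.05 * t,                                        # trend-contaminated
                      0.3 * np.random.default_rng(5).normal(size=t.size),   # noise
                      raw]                                                   # undistorted
        print(closest_component(components, raw))      # expected to select the undistorted component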

  7. Predicting protein complexes using a supervised learning method combined with local structural information.

    Science.gov (United States)

    Dong, Yadong; Sun, Yongqi; Qin, Chao

    2018-01-01

    The existing protein complex detection methods can be broadly divided into two categories: unsupervised and supervised learning methods. Most of the unsupervised learning methods assume that protein complexes are in dense regions of protein-protein interaction (PPI) networks even though many true complexes are not dense subgraphs. Supervised learning methods utilize the informative properties of known complexes; they often extract features from existing complexes and then use the features to train a classification model. The trained model is used to guide the search process for new complexes. However, insufficient extracted features, noise in the PPI data and the incompleteness of complex data make the classification model imprecise. Consequently, the classification model is not sufficient for guiding the detection of complexes. Therefore, we propose a new robust score function that combines the classification model with local structural information. Based on the score function, we provide a search method that works both forwards and backwards. The results from experiments on six benchmark PPI datasets and three protein complex datasets show that our approach can achieve better performance compared with the state-of-the-art supervised, semi-supervised and unsupervised methods for protein complex detection, occasionally significantly outperforming such methods.

  8. Research on the method of information system risk state estimation based on clustering particle filter

    Science.gov (United States)

    Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua

    2017-05-01

With the purpose of reinforcing the correlation analysis of risk assessment threat factors, a dynamic assessment method for safety risks based on particle filtering is proposed, which takes threat analysis as its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining them with state estimation theory. In order to improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced into the particle filter. By clustering all particles, the centroid of each cluster is taken as its representative in subsequent computation, so as to reduce the amount of calculation. Empirical results indicate that the method can reasonably capture the mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management and control strategy.
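
    A condensed sketch of the particle-reduction idea: propagate and weight particles, then replace them with k-means centroids weighted by cluster mass (the state dynamics and observation model are generic assumptions, not the risk model of the paper):

        import numpy as np

        def kmeans_1d(x, k, iters=20, rng=None):
            """Plain k-means for 1-D samples; returns centroids and labels."""
            rng = rng or np.random.default_rng(0)
            centroids = rng.choice(x, size=k, replace=False)
            for _ in range(iters):
                labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
                for j in range(k):
                    if np.any(labels == j):
                        centroids[j] = x[labels == j].mean()
            return centroids, labels

        def cluster_particle_step(particles, weights, observation, k=10, rng=None):
            """One predict/update cycle followed by k-means reduction of the particle set."""
            rng = rng or np.random.default_rng(6)
            particles = particles + rng.normal(0.0, 0.1, size=particles.shape)          # predict (random walk)
            weights = weights * np.exp(-0.5 * ((observation - particles) / 0.2) ** 2)   # update
            weights /= weights.sum()
            centroids, labels = kmeans_1d(particles, k, rng=rng)
            cluster_w = np.array([weights[labels == j].sum() for j in range(k)])
            return centroids, cluster_w / cluster_w.sum()

        p = np.random.default_rng(7).normal(0.0, 1.0, 500)
        w = np.full(500, 1.0 / 500)
        states, wts = cluster_particle_step(p, w, observation=0.8)
        print(float(np.sum(states * wts)))   # risk-state estimate from cluster representatives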

  9. Research on the method of information system risk state estimation based on clustering particle filter

    Directory of Open Access Journals (Sweden)

    Cui Jia

    2017-05-01

Full Text Available With the purpose of reinforcing the correlation analysis of risk assessment threat factors, a dynamic assessment method for safety risks based on particle filtering is proposed, which takes threat analysis as its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining them with state estimation theory. In order to improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced into the particle filter. By clustering all particles, the centroid of each cluster is taken as its representative in subsequent computation, so as to reduce the amount of calculation. Empirical results indicate that the method can reasonably capture the mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management and control strategy.

  10. Method and simulation program informed decisions in the early stages of building design

    DEFF Research Database (Denmark)

    Petersen, Steffen; Svendsen, Svend

    2010-01-01

The early stages of building design include a number of decisions which have a strong influence on the performance of the building throughout the rest of the process. It is therefore important that designers are aware of the consequences of these design decisions. This paper presents a method for making informed decisions in the early stages of building design to fulfil performance requirements with regard to energy consumption and indoor environment. The method is operationalised in a program that utilises a simple simulation program to make performance predictions of user-defined parameter variations. The program then presents the output in a way that enables designers to make informed decisions. The method and the program reduce the need for design iterations, reducing time consumption and construction costs, to obtain the intended energy performance and indoor environment.

  11. Adaptation of the European Commission-recommended user testing method to patient medication information leaflets in Japan

    Directory of Open Access Journals (Sweden)

    Yamamoto M

    2017-06-01

Full Text Available Michiko Yamamoto,1 Hirohisa Doi,1 Ken Yamamoto,2 Kazuhiro Watanabe,2 Tsugumichi Sato,3 Machi Suka,4 Takeo Nakayama,5 Hiroki Sugimori6 1Department of Drug Informatics, Center for Education & Research on Clinical Pharmacy, Showa Pharmaceutical University, Tokyo, Japan; 2Department of Pharmacy Practice, Center for Education & Research on Clinical Pharmacy, Showa Pharmaceutical University, Tokyo, Japan; 3Faculty of Pharmaceutical Sciences, Tokyo University of Science, Chiba, Japan; 4Department of Public Health and Environmental Medicine, The Jikei University School of Medicine, Tokyo, Japan; 5Department of Health Informatics, Kyoto University School of Public, Kyoto, Japan; 6Department of Preventive Medicine, Graduate School of Sports and Health Sciences, Daito Bunka University, Saitama, Japan Background: The safe use of drugs relies on providing accurate drug information to patients. In Japan, patient leaflets called Drug Guide for Patients are officially available; however, their utility has never been verified. This is the first attempt to improve Drug Guide for Patients via user testing in Japan. Purpose: To test and improve communication of drug information to minimize risk for patients via user testing of the current and revised versions of Drug Guide for Patients, and to demonstrate that this method is effective for improving Drug Guide for Patients in Japan. Method: We prepared current and revised versions of the Drug Guide for Patients and performed user testing via semi-structured interviews with consumers to compare these versions for two guides for Mercazole and Strattera. We evenly divided 54 participants into two groups with similar distributions of sex, age, and literacy level to test the differing versions of the Mercazole guide. Another group of 30 participants were divided evenly to test the versions of the Strattera guide. After completing user testing, the participants evaluated both guides in terms of amount of information

  12. Automatic Optimizer Generation Method Based on Location and Context Information to Improve Mobile Services

    Directory of Open Access Journals (Sweden)

    Yunsik Son

    2017-01-01

Full Text Available Several location-based services (LBSs) have been recently developed for smartphones. Among these are proactive LBSs, which provide services to smartphone users by periodically collecting background logs. However, because they consume considerable battery power, they are not widely used for various LBS-based services. Battery consumption, in particular, is a significant issue on account of the characteristics of mobile systems. This problem involves a greater service restriction when performing complex operations. Therefore, to successfully enable various services based on location, this problem must be solved. In this paper, we introduce a technique to automatically generate a customized service optimizer for each application, service type, and platform using location and situation information. By using the proposed technique, energy and computing resources can be more efficiently employed for each service. Thus, users should receive more effective LBSs on mobile devices, such as smartphones.

  13. Informational Urbanism

    Directory of Open Access Journals (Sweden)

    Wolfgang G. Stock

    2015-10-01

Full Text Available Contemporary and future cities are often labeled as "smart cities," "ubiquitous cities," "knowledge cities" and "creative cities." Informational urbanism includes all aspects of information and knowledge with regard to urban regions. "Informational city" is an umbrella term uniting the divergent trends of information-related city research. Informational urbanism is an interdisciplinary endeavor incorporating on the one side computer science and information science and on the other side urbanism, architecture, (city) economics, and (city) sociology. In our research project on informational cities, we visited more than 40 metropolises and smaller towns all over the world. In this paper, we sketch the theoretical background on a journey from Max Weber to the Internet of Things, introduce our research methods, and describe main results on characteristics of informational cities as prototypical cities of the emerging knowledge society.

  14. Value of information methods to design a clinical trial in a small population to optimise a health economic utility function

    Directory of Open Access Journals (Sweden)

    Michael Pearce

    2018-02-01

    Full Text Available Abstract Background Most confirmatory randomised controlled clinical trials (RCTs) are designed with specified power, usually 80% or 90%, for a hypothesis test conducted at a given significance level, usually 2.5% for a one-sided test. Approval of the experimental treatment by regulatory agencies is then based on the result of such a significance test with other information to balance the risk of adverse events against the benefit of the treatment to future patients. In the setting of a rare disease, recruiting sufficient patients to achieve conventional error rates for clinically reasonable effect sizes may be infeasible, suggesting that the decision-making process should reflect the size of the target population. Methods We considered the use of a decision-theoretic value of information (VOI) method to obtain the optimal sample size and significance level for confirmatory RCTs in a range of settings. We assume the decision maker represents society. For simplicity we assume the primary endpoint to be normally distributed with unknown mean following some normal prior distribution representing information on the anticipated effectiveness of the therapy available before the trial. The method is illustrated by an application in an RCT in haemophilia A. We explicitly specify the utility in terms of improvement in primary outcome and compare this with the costs of treating patients, both financial and in terms of potential harm, during the trial and in the future. Results The optimal sample size for the clinical trial decreases as the size of the population decreases. For non-zero cost of treating future patients, either monetary or in terms of potential harmful effects, stronger evidence is required for approval as the population size increases, though this is not the case if the costs of treating future patients are ignored. Conclusions Decision-theoretic VOI methods offer a flexible approach with both type I error rate and power (or equivalently
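
    The decision-theoretic calculation described above can be illustrated with a small Monte Carlo sketch. The code below is not the authors' implementation: it assumes a hypothetical two-arm trial with a normally distributed endpoint, a normal prior on the treatment effect, an illustrative benefit per unit of effect for a future population of assumed size, and a fixed per-patient trial cost, then searches a grid of sample sizes for the one maximizing expected net societal gain.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# --- Illustrative (assumed) inputs, not taken from the paper ---
sigma = 1.0           # known SD of the primary endpoint
mu0, tau0 = 0.2, 0.3  # normal prior on the treatment effect delta
alpha = 0.025         # one-sided significance level
n_future = 2_000      # size of the future patient population (rare disease)
benefit = 100.0       # societal value per unit of effect per future patient
cost_per_patient = 50.0
n_sims = 200_000

def expected_net_gain(n_per_arm: int) -> float:
    """Monte Carlo estimate of expected net gain for a given sample size."""
    se = np.sqrt(2 * sigma**2 / n_per_arm)    # SE of the effect estimate
    crit = stats.norm.ppf(1 - alpha) * se     # approval threshold on the estimate
    delta = rng.normal(mu0, tau0, n_sims)     # true effects drawn from the prior
    delta_hat = rng.normal(delta, se)         # trial outcome given the true effect
    approved = delta_hat > crit
    # Benefit accrues to future patients only when the treatment is approved.
    gain = n_future * benefit * np.mean(delta * approved)
    return gain - cost_per_patient * 2 * n_per_arm

grid = range(5, 301, 5)
best_n = max(grid, key=expected_net_gain)
print(f"optimal sample size per arm (under these assumptions): {best_n}")
```

    Shrinking n_future in this sketch reduces the optimal sample size, which mirrors the qualitative conclusion reported in the abstract.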

  15. Applying Multiple Methods to Comprehensively Evaluate a Patient Portal’s Effectiveness to Convey Information to Patients

    Science.gov (United States)

    Krist, Alex H; Aycock, Rebecca A; Kreps, Gary L

    2016-01-01

    Background Patient portals have yet to achieve their full potential for enhancing health communication and improving health outcomes. Although the Patient Protection and Affordable Care Act in the United States mandates the utilization of patient portals, and usage continues to rise, their impact has not been as profound as anticipated. Objective The objective of our case study was to evaluate how well portals convey information to patients. To demonstrate how multiple methodologies could be used to evaluate and improve the design of patient-centered portals, we conducted an in-depth evaluation of an exemplar patient-centered portal designed to promote preventive care to consumers. Methods We used 31 critical incident patient interviews, 2 clinician focus groups, and a thematic content analysis to understand patients’ and clinicians’ perspectives, as well as theoretical understandings of the portal’s use. Results We gathered over 140 critical incidents, 71.8% (102/142) negative and 28.2% (40/142) positive. Positive incident categories were (1) instant medical information access, (2) clear health information, and (3) patient vigilance. Negative incident categories were (1) standardized content, (2) desire for direct communication, (3) website functionality, and (4) difficulty interpreting laboratory data. Thematic analysis of the portal’s immediacy resulted in high scores in the attributes enhances understanding (18/23, 78%), personalization (18/24, 75%), and motivates behavior (17/24, 71%), but low levels of interactivity (7/24, 29%) and engagement (2/24, 8%). Two overarching themes emerged to guide portal refinements: (1) communication can be improved with directness and interactivity and (2) perceived personalization must be greater to engage patients. Conclusions Results suggest that simple modifications, such as increased interactivity and personalized messages, can make portals customized, robust, easily accessible, and trusted information sources

  16. Hybrid Multicriteria Group Decision Making Method for Information System Project Selection Based on Intuitionistic Fuzzy Theory

    Directory of Open Access Journals (Sweden)

    Jian Guo

    2013-01-01

    Full Text Available Information system (IS) project selection is of critical importance to every organization in a dynamic competitive environment. The aim of this paper is to develop a hybrid multicriteria group decision making approach based on intuitionistic fuzzy theory for IS project selection. The decision makers’ assessment information can be expressed in the form of real numbers, interval-valued numbers, linguistic variables, and intuitionistic fuzzy numbers (IFNs). All these pieces of evaluation information can be transformed into the form of IFNs. The intuitionistic fuzzy weighted averaging (IFWA) operator is utilized to aggregate individual opinions of decision makers into a group opinion. Intuitionistic fuzzy entropy is used to obtain the entropy weights of the criteria. The TOPSIS method combined with intuitionistic fuzzy sets is proposed to select the appropriate IS project in a group decision making environment. Finally, a numerical example of information system project selection is given to illustrate the application of the hybrid multi-criteria group decision making (MCGDM) method based on intuitionistic fuzzy theory and the TOPSIS method.
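
    As a rough illustration of the building blocks named in the abstract, the sketch below implements the IFWA operator and a TOPSIS-style closeness ranking for intuitionistic fuzzy numbers. The criteria weights are fixed by hand rather than derived from intuitionistic fuzzy entropy, and the ratings are invented, so this is only a minimal sketch of the general technique, not the paper's full procedure.

```python
import numpy as np

# An intuitionistic fuzzy number (IFN) is a pair (mu, nu) with mu + nu <= 1.

def ifwa(ifns, weights):
    """Intuitionistic fuzzy weighted averaging of a list of IFNs."""
    mu = 1.0 - np.prod([(1 - m) ** w for (m, _), w in zip(ifns, weights)])
    nu = np.prod([n ** w for (_, n), w in zip(ifns, weights)])
    return mu, nu

def distance(a, b):
    """Normalized Euclidean distance between two IFNs (including hesitancy)."""
    (m1, n1), (m2, n2) = a, b
    p1, p2 = 1 - m1 - n1, 1 - m2 - n2
    return np.sqrt(((m1 - m2) ** 2 + (n1 - n2) ** 2 + (p1 - p2) ** 2) / 2)

# Assumed toy data: 3 IS projects x 2 criteria, each rated by 2 decision makers.
ratings = {
    "project A": [[(0.7, 0.2), (0.6, 0.3)], [(0.5, 0.4), (0.6, 0.2)]],
    "project B": [[(0.4, 0.5), (0.5, 0.4)], [(0.8, 0.1), (0.7, 0.2)]],
    "project C": [[(0.6, 0.3), (0.6, 0.3)], [(0.6, 0.3), (0.5, 0.4)]],
}
dm_weights = [0.5, 0.5]    # weights of the two decision makers
crit_weights = [0.6, 0.4]  # assumed criteria weights (entropy step skipped)
pos_ideal, neg_ideal = (1.0, 0.0), (0.0, 1.0)

scores = {}
for name, per_criterion in ratings.items():
    # Aggregate the decision makers' ratings per criterion with IFWA.
    agg = [ifwa(dm_ratings, dm_weights) for dm_ratings in per_criterion]
    d_pos = sum(w * distance(x, pos_ideal) for x, w in zip(agg, crit_weights))
    d_neg = sum(w * distance(x, neg_ideal) for x, w in zip(agg, crit_weights))
    scores[name] = d_neg / (d_pos + d_neg)  # TOPSIS relative closeness

for name, closeness in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: closeness {closeness:.3f}")
```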

  17. Improvements in recall and food choices using a graphical method to deliver information of select nutrients.

    Science.gov (United States)

    Pratt, Nathan S; Ellison, Brenna D; Benjamin, Aaron S; Nakamura, Manabu T

    2016-01-01

    Consumers have difficulty using nutrition information. We hypothesized that graphically delivering information of select nutrients relative to a target would allow individuals to process information in time-constrained settings more effectively than numerical information. Objectives of the study were to determine the efficacy of the graphical method in (1) improving memory of nutrient information and (2) improving consumer purchasing behavior in a restaurant. Values of fiber and protein per calorie were 2-dimensionally plotted alongside a target box. First, a randomized cued recall experiment was conducted (n=63). Recall accuracy of nutrition information improved by up to 43% when shown graphically instead of numerically. Second, the impact of graphical nutrition signposting on diner choices was tested in a cafeteria. Saturated fat and sodium information was also presented using color coding. Nutrient content of meals (n=362) was compared between 3 signposting phases: graphical, nutrition facts panels (NFP), or no nutrition label. Graphical signposting improved nutrient content of purchases in the intended direction, whereas NFP had no effect compared with the baseline. Calories ordered from total meals, entrées, and sides were significantly less during graphical signposting than no-label and NFP periods. For total meal and entrées, protein per calorie purchased was significantly higher and saturated fat significantly lower during graphical signposting than the other phases. Graphical signposting remained a predictor of calories and protein per calorie purchased in regression modeling. These findings demonstrate that graphically presenting nutrition information makes that information more available for decision making and influences behavior change in a realistic setting. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. The Effect of Health Information Technology on Health Care Provider Communication: A Mixed-Method Protocol.

    Science.gov (United States)

    Manojlovich, Milisa; Adler-Milstein, Julia; Harrod, Molly; Sales, Anne; Hofer, Timothy P; Saint, Sanjay; Krein, Sarah L

    2015-06-11

    Communication failures between physicians and nurses are one of the most common causes of adverse events for hospitalized patients, as well as a major root cause of all sentinel events. Communication technology (ie, the electronic medical record, computerized provider order entry, email, and pagers), which is a component of health information technology (HIT), may help reduce some communication failures but increase others because of an inadequate understanding of how communication technology is used. Increasing use of health information and communication technologies is likely to affect communication between nurses and physicians. The purpose of this study is to describe, in detail, how health information and communication technologies facilitate or hinder communication between nurses and physicians with the ultimate goal of identifying how we can optimize the use of these technologies to support effective communication. Effective communication is the process of developing shared understanding between communicators by establishing, testing, and maintaining relationships. Our theoretical model, based in communication and sociology theories, describes how health information and communication technologies affect communication through communication practices (ie, use of rich media; the location and availability of computers) and work relationships (ie, hierarchies and team stability). Therefore we seek to (1) identify the range of health information and communication technologies used in a national sample of medical-surgical acute care units, (2) describe communication practices and work relationships that may be influenced by health information and communication technologies in these same settings, and (3) explore how differences in health information and communication technologies, communication practices, and work relationships between physicians and nurses influence communication. This 4-year study uses a sequential mixed-methods design, beginning with a

  19. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design

    Science.gov (United States)

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-01-01

    Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469

  20. Effects of background radiation

    International Nuclear Information System (INIS)

    Knox, E.G.; Stewart, A.M.; Gilman, E.A.; Kneale, G.W.

    1987-01-01

    The primary objective of this investigation is to measure the relationship between exposure to different levels of background gamma radiation in different parts of the country, and different Relative Risks for leukaemias and cancers in children. The investigation is linked to an earlier analysis of the effects of prenatal medical x-rays upon leukaemia and cancer risk; the prior hypothesis on which the background-study was based, is derived from the earlier results. In a third analysis, the authors attempted to measure varying potency of medical x-rays delivered at different stages of gestation and the results supply a link between the other two estimates. (author)

  1. The cosmic microwave background

    International Nuclear Information System (INIS)

    Silk, J.

    1991-01-01

    Recent limits on spectral distortions and angular anisotropies in the cosmic microwave background are reviewed. The various backgrounds are described, and the theoretical implications are assessed. Constraints on inflationary cosmology dominated by cold dark matter (CDM) and on open cosmological models dominated by baryonic dark matter (BDM), with, respectively, primordial random phase scale-invariant curvature fluctuations or non-gaussian isocurvature fluctuations are described. More exotic theories are addressed, and I conclude with the 'bottom line': what theorists expect experimentalists to be measuring within the next two to three years without having to abandon their most cherished theories. (orig.)

  2. The Cosmic Background Explorer

    Science.gov (United States)

    Gulkis, Samuel; Lubin, Philip M.; Meyer, Stephan S.; Silverberg, Robert F.

    1990-01-01

    The Cosmic Background Explorer (CBE), NASA's cosmological satellite which will observe a radiative relic of the big bang, is discussed. The major questions connected to the big bang theory which may be clarified using the CBE are reviewed. The satellite instruments and experiments are described, including the Differential Microwave Radiometer, which measures the difference between microwave radiation emitted from two points on the sky, the Far-Infrared Absolute Spectrophotometer, which compares the spectrum of radiation from the sky at wavelengths from 100 microns to one cm with that from an internal blackbody, and the Diffuse Infrared Background Experiment, which searches for the radiation from the earliest generation of stars.

  3. Models, methods and software tools to evaluate the quality of informational and educational resources

    International Nuclear Information System (INIS)

    Gavrilov, S.I.

    2011-01-01

    The paper studies modern methods and tools for evaluating the quality of data systems, which makes it possible to determine the specificity of informational and educational resources (IER). The author has developed a model of IER quality management at all stages of the life cycle and an integrated multi-level hierarchical system of IER quality assessment, taking into account both information properties and targeted resource assignment. The author presents a mathematical and algorithmic justification for solving the problem of IER quality management, and offers a data system to assess IER quality.

  4. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and the integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than the number of features, and using a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.

  5. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourcing vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of each Voronoi cell and the length of each triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) within the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.
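
    For readers who want to experiment with the general idea, the sketch below builds a Delaunay triangulation over GPS points and recovers an outline by discarding triangles with overly long edges, then keeping edges that belong to exactly one remaining triangle. This alpha-shape-style simplification stands in for the paper's combined Voronoi-cell-area and edge-length descriptors and its seed-polygon region growing; the edge-length threshold is an assumed tuning parameter.

```python
from collections import Counter

import numpy as np
from scipy.spatial import Delaunay

def boundary_edges(points: np.ndarray, max_edge: float):
    """Approximate outline of a point cloud (e.g., GPS tracking points).

    Keeps Delaunay triangles whose longest edge is below `max_edge` and
    returns the edges that belong to exactly one kept triangle.
    """
    tri = Delaunay(points)
    edge_count = Counter()
    for simplex in tri.simplices:
        edges = [(simplex[i], simplex[j]) for i, j in ((0, 1), (1, 2), (0, 2))]
        lengths = [np.linalg.norm(points[a] - points[b]) for a, b in edges]
        if max(lengths) > max_edge:  # long, sliver triangle: treat as outside the road
            continue
        for a, b in edges:
            edge_count[tuple(sorted((a, b)))] += 1
    return [edge for edge, count in edge_count.items() if count == 1]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic "road": points scattered in a 100 m x 8 m strip.
    xy = np.column_stack([rng.uniform(0, 100, 3000), rng.uniform(0, 8, 3000)])
    outline = boundary_edges(xy, max_edge=3.0)
    print(f"{len(outline)} boundary edges found")
```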

  6. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

    Full Text Available Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourcing vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of each Voronoi cell and the length of each triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) within the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.

  7. A new method of CCD dark current correction via extracting the dark Information from scientific images

    Science.gov (United States)

    Ma, Bin; Shang, Zhaohui; Hu, Yi; Liu, Qiang; Wang, Lifan; Wei, Peng

    2014-07-01

    We have developed a new method to correct dark current at relatively high temperatures for Charge-Coupled Device (CCD) images when dark frames cannot be obtained on the telescope. For images taken with the Antarctic Survey Telescopes (AST3) in 2012, due to the low cooling efficiency, the median CCD temperature was -46°C, resulting in a high dark current level of about 3e-/pix/sec, even comparable to the sky brightness (10e-/pix/sec). If not corrected, the nonuniformity of the dark current could even outweigh the photon noise of the sky background. However, dark frames could not be obtained during the observing season because the camera was operated in frame-transfer mode without a shutter, and the telescope was unattended in winter. Here we present an alternative, but simple and effective method to derive the dark current frame from the scientific images. Then we can scale this dark frame to the temperature at which the scientific images were taken, and apply the dark frame corrections to the scientific images. We have applied this method to the AST3 data, and demonstrated that it can reduce the noise to a level roughly as low as the photon noise of the sky brightness, solving the high noise problem and improving the photometric precision. This method will also be helpful for other projects that suffer from similar issues.
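
    The general idea of recovering a dark template from science frames and rescaling it with temperature can be sketched as follows. This is not the AST3 pipeline: it assumes dithered exposures (so stars fall on different pixels), estimates the static dark pattern as a per-pixel median of sky-subtracted frames, and rescales it with an assumed dark-current doubling temperature before subtraction.

```python
import numpy as np

DOUBLING_TEMP = 6.0  # assumed: dark current roughly doubles every ~6 deg C

def dark_template(frames, exptimes):
    """Per-pixel dark-rate template (counts/pix/s) from dithered science frames."""
    rates = []
    for img, t in zip(frames, exptimes):
        sky = np.median(img)           # crude global sky + pedestal level
        rates.append((img - sky) / t)  # leftover static pattern per second
    # Median across dithered frames suppresses stars, which move between frames.
    return np.median(np.stack(rates), axis=0)

def subtract_dark(img, exptime, temp_c, template, template_temp_c):
    """Scale the template to the frame's temperature/exposure and subtract it."""
    scale = 2.0 ** ((temp_c - template_temp_c) / DOUBLING_TEMP)
    return img - template * scale * exptime

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    true_dark = rng.gamma(2.0, 1.5, size=(64, 64))  # synthetic dark-rate pattern
    frames = [true_dark * 30 + rng.normal(300, 5, (64, 64)) for _ in range(20)]
    template = dark_template(frames, [30] * 20)
    cleaned = subtract_dark(frames[0], 30, temp_c=-46, template=template,
                            template_temp_c=-46)
    print(f"residual rms after correction: {cleaned.std():.1f} ADU")
```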

  8. Provision of assistive technology services method (ATSM) according to evidence-based information and knowledge management.

    Science.gov (United States)

    Elsaesser, Linda-Jeanne; Bauer, Stephen M

    2011-01-01

    PURPOSE. This article develops a standardised method for assistive technology service (ATS) provision and a logical basis for research to improve health care quality. The method is 'interoperable' across disabilities, disciplines, assistive technology devices and ATSs. BACKGROUND. Absence of a standardised and interoperable method for ATS provision results in ineffective communication between providers, manufacturers, researchers, policy-makers and individuals with disabilities (IWD), a fragmented service delivery system, inefficient resource allocation and sub-optimal outcomes. OBJECTIVES. Synthesise a standardised, interoperable AT service method (ATSM) fully consistent with key guidelines, systems, models and Federal legislation. Express the ATSM using common and unambiguous language. RESULTS. Guidelines, systems, models and Federal legislation relevant to ATS provision are reviewed. These include the RESNA Guidelines for Knowledge and Skills for Provision of Assistive Technology Products and Services (RESNA Guidelines), IMPACT2 model, international classification of functioning, disability and health (ICF) and AT device classification (ATDC). Federal legislation includes the Assistive Technology Act of 2004, Americans with Disabilities Act of 2008 and Social Security Act. Based on these findings, the ATSM is synthesised and translated into common and accessible language. CONCLUSION. ATSM usage will improve communication between stakeholders, service delivery coherence, resource allocation and intervention outcomes.

  9. Thermal background noise limitations

    Science.gov (United States)

    Gulkis, S.

    1982-01-01

    Modern detection systems are increasingly limited in sensitivity by the background thermal photons which enter the receiving system. Expressions for the fluctuations of detected thermal radiation are derived. Incoherent and heterodyne detection processes are considered. References to the subject of photon detection statistics are given.
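
    For orientation, the standard textbook expressions for thermal (blackbody) photon fluctuations, which any such derivation builds on, are the Bose-Einstein mean occupation per mode and its variance; they are reproduced here for context and are not necessarily the exact forms derived in the paper.

```latex
\bar{n} = \frac{1}{e^{h\nu/kT} - 1}, \qquad
\langle (\Delta n)^2 \rangle = \bar{n}\,(1 + \bar{n})
```

    The linear term in the variance corresponds to shot (photon) noise and the quadratic term to wave (bunching) noise, which dominates when the photon energy is small compared with kT.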

  10. Berkeley Low Background Facility

    International Nuclear Information System (INIS)

    Thomas, K. J.; Norman, E. B.; Smith, A. R.; Poon, A. W. P.; Chan, Y. D.; Lesko, K. T.

    2015-01-01

    The Berkeley Low Background Facility (BLBF) at Lawrence Berkeley National Laboratory (LBNL) in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities; locally within a carefully-constructed, low background laboratory on the surface at LBNL and at the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products; active screening via neutron activation analysis for U,Th, and K as well as a variety of stable isotopes; and neutron flux/beam characterization measurements through the use of monitors. A general overview of the facilities, services, and sensitivities will be presented. Recent activities and upgrades will also be described including an overview of the recently installed counting system at SURF (recently relocated from Oroville, CA in 2014), the installation of a second underground counting station at SURF in 2015, and future plans. The BLBF is open to any users for counting services or collaboration on a wide variety of experiments and projects

  11. Curriculum and instructional methods for drug information, literature evaluation, and biostatistics: survey of US pharmacy schools.

    Science.gov (United States)

    Phillips, Jennifer A; Gabay, Michael P; Ficzere, Cathy; Ward, Kristina E

    2012-06-01

    The drug information curriculum in US colleges of pharmacy continues to evolve. The American College of Clinical Pharmacy (ACCP) Drug Information Practice and Research Network (DI PRN) published an opinion paper with specific recommendations regarding drug information education in 2009. Adoption of these recommendations has not been evaluated. To assess which recommendations made in the ACCP DI PRN opinion paper are included in US pharmacy school curricula and characterize faculty qualifications, educational methods, and recent changes in drug information education. An electronic survey was designed using the ACCP DI PRN opinion paper and the Accreditation Council for Pharmacy Education standards and guidelines for accreditation of PharmD programs in the US. Survey questions addressed curricular content within the following categories: drug information, literature evaluation, and biostatistics. A letter including the online survey link was sent via email to the dean of each US college/school of pharmacy (N = 128). Recipients were instructed to forward the email to the individual at their institution who was the most knowledgeable about the content and methodology used for didactic drug information education. Sixty-four responses were included in the final analysis. Of the 19 ACCP DI PRN minimum core concepts, 9 (47%) were included in curricula of all responding institutions; 14 of 19 (74%) were included in curricula for all but 1 institution. In contrast, 5 of 16 concepts (31%) were not formally taught by a number of institutions. Many respondents noted an increased focus on evidence-based medicine, medication safety, and informatics. Although a survey of drug information curricula documented substantial inclusion of the essential concepts presented in the ACCP DI PRN opinion paper, room for improvement remains in drug information curricula in US colleges of pharmacy.

  12. The Effects of Background Music on the Middle School Students' Recognition of the Expository Text Information

    Institute of Scientific and Technical Information of China (English)

    刘明; 张裕鼎; 张立春

    2012-01-01

    The participants were 221 first-year students from five classes of one middle school, with similar levels in the Chinese course; we examined how different types and sound pressure levels of background music affect the recognition of expository text information. The results of the experiments were as follows: (1) The main effect of music type was significant, the main effect of sound pressure level was not significant, and their interaction was significant: under the high sound pressure condition, different music types produced significant differences in recognition scores, whereas under the low sound pressure condition no significant difference appeared. (2) Under the high sound pressure condition, different types of background music affected the recognition of expository text information to different degrees: compared with the no-music condition, classical music significantly facilitated performance; pop music with Chinese lyrics and pop music with Japanese lyrics both produced significant interference; and pop music without lyrics had no significant effect on recognition. In addition, pop music with Chinese lyrics produced the greatest interference, although the difference from pop music with Japanese lyrics was not significant.

  13. Aligning professional skills and active learning methods: an application for information and communications technology engineering

    Science.gov (United States)

    Llorens, Ariadna; Berbegal-Mirabent, Jasmina; Llinàs-Audet, Xavier

    2017-07-01

    Engineering education is facing new challenges to effectively provide the appropriate skills to future engineering professionals according to market demands. This study proposes a model based on active learning methods, which is expected to facilitate the acquisition of the professional skills most highly valued in the information and communications technology (ICT) market. The theoretical foundations of the study are based on the specific literature on active learning methodologies. The Delphi method is used to establish the fit between learning methods and generic skills required by the ICT sector. An innovative proposition is therefore presented that groups the required skills in relation to the teaching method that best develops them. The qualitative research suggests that a combination of project-based learning and the learning contract is sufficient to ensure a satisfactory skills level for this profile of engineers.

  14. Methods of Hematoxylin and Eosin Image Information Acquisition and Optimization in Confocal Microscopy.

    Science.gov (United States)

    Yoon, Woong Bae; Kim, Hyunjin; Kim, Kwang Gi; Choi, Yongdoo; Chang, Hee Jin; Sohn, Dae Kyung

    2016-07-01

    We produced hematoxylin and eosin (H&E) staining-like color images by using confocal laser scanning microscopy (CLSM), which can obtain the same or more information in comparison to conventional tissue staining. We improved images by using several image converting techniques, including morphological methods, color space conversion methods, and segmentation methods. An image obtained after image processing showed coloring very similar to that in images produced by H&E staining, and it is advantageous to conduct analysis through fluorescent dye imaging and microscopy rather than analysis based on single microscopic imaging. The colors used in CLSM are different from those seen in H&E staining, which is the method most widely used for pathologic diagnosis and is familiar to pathologists. Computer technology can facilitate the conversion of images by CLSM to be very similar to H&E staining images. We believe that the technique used in this study has great potential for application in clinical tissue analysis.
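
    A common way to render two fluorescence channels in H&E-like colors is a Beer-Lambert-style virtual staining, sketched below. The absorption (optical-density) vectors and gain constants here are assumed for illustration (they follow values commonly used in stain-separation literature), and this is not necessarily the conversion pipeline used in the study.

```python
import numpy as np

# Assumed RGB optical-density vectors for hematoxylin-like and eosin-like dyes.
OD_HEMATOXYLIN = np.array([0.65, 0.70, 0.29])
OD_EOSIN = np.array([0.07, 0.99, 0.11])

def virtual_he(nuclear: np.ndarray, cytoplasm: np.ndarray,
               k_nuc: float = 1.5, k_cyto: float = 1.0) -> np.ndarray:
    """Map two CLSM fluorescence channels to an H&E-like RGB image.

    nuclear, cytoplasm: 2-D float arrays; k_nuc, k_cyto: assumed dye gains.
    Returns an RGB image in [0, 1] via a Beer-Lambert transmission model.
    """
    nuc = nuclear / (nuclear.max() or 1.0)
    cyt = cytoplasm / (cytoplasm.max() or 1.0)
    optical_density = (k_nuc * nuc[..., None] * OD_HEMATOXYLIN
                       + k_cyto * cyt[..., None] * OD_EOSIN)
    return np.exp(-optical_density)  # white background where there is no signal

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    nuclei = rng.random((128, 128)) ** 4   # sparse bright nucleus-like signal
    cyto = rng.random((128, 128)) * 0.5
    rgb = virtual_he(nuclei, cyto)
    print(rgb.shape, float(rgb.min()), float(rgb.max()))
```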

  15. Two Ranking Methods of Single Valued Triangular Neutrosophic Numbers to Rank and Evaluate Information Systems Quality

    Directory of Open Access Journals (Sweden)

    Samah Ibrahim Abdel Aal

    2018-03-01

    Full Text Available The concept of neutrosophic sets provides a generalization of fuzzy sets and intuitionistic fuzzy sets that makes it the best fit for representing indeterminacy and uncertainty. Single Valued Triangular Neutrosophic Numbers (SVTrN-numbers) are a special case of neutrosophic sets that can handle very difficult problems involving ill-known quantities. This work introduces a framework with two types of ranking methods. The results indicate that each ranking method has its own advantages. In this perspective, the weighted value and ambiguity based method gives more attention to uncertainty in ranking and evaluating ISQ, and it takes into account cut sets of SVTrN numbers that reflect information on the truth-membership degree, falsity-membership degree, and indeterminacy-membership degree. The value index and ambiguity index method can reflect the decision maker's subjective attitude toward the SVTrN-numbers.
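
    To make the two kinds of indices concrete, here is a toy sketch. It uses the classical value (a + 4b + c)/6 and ambiguity (c − a)/6 of the underlying triangular number and folds in the neutrosophic degrees with a simple (2 + T − I − F)/3 score; these particular combining formulas are assumptions chosen for illustration and are not the exact definitions in the paper.

```python
from dataclasses import dataclass

@dataclass
class SVTrN:
    """Single-valued triangular neutrosophic number <(a, b, c); T, I, F>."""
    a: float
    b: float
    c: float
    T: float  # truth-membership degree
    I: float  # indeterminacy-membership degree
    F: float  # falsity-membership degree

    def value(self) -> float:
        return (self.a + 4 * self.b + self.c) / 6.0

    def ambiguity(self) -> float:
        return (self.c - self.a) / 6.0

    def score(self) -> float:
        # Assumed neutrosophic score in [0, 1]; higher means more credible.
        return (2.0 + self.T - self.I - self.F) / 3.0

def rank_key(x: SVTrN, lam: float = 0.5) -> float:
    """Assumed ranking index: credibility-scaled value minus weighted ambiguity."""
    return x.score() * (x.value() - lam * x.ambiguity())

ratings = {
    "ISQ of system A": SVTrN(0.5, 0.7, 0.9, T=0.8, I=0.1, F=0.1),
    "ISQ of system B": SVTrN(0.6, 0.7, 0.8, T=0.6, I=0.3, F=0.2),
}
for name, r in sorted(ratings.items(), key=lambda kv: -rank_key(kv[1])):
    print(f"{name}: index {rank_key(r):.3f}")
```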

  16. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    Energy Technology Data Exchange (ETDEWEB)

    Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it [Sapienza Università di Roma, Dipartimento di Ingegneria Civile, Edile e Ambientale (Italy); Alfonso, L. [Hydroinformatics Chair Group, UNESCO-IHE, Delft (Netherlands); Di Baldassarre, G. [Department of Earth Sciences, Program for Air, Water and Landscape Sciences, Uppsala University (Sweden)

    2016-06-08

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.

  17. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    International Nuclear Information System (INIS)

    Ridolfi, E.; Napolitano, F.; Alfonso, L.; Di Baldassarre, G.

    2016-01-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.
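
    A simplified, self-contained version of entropy-based site selection (greedy maximum information, minimum redundancy) is sketched below on synthetic water-level series; the actual methodology of the two records above, including the treatment of bridge sections, is more involved than this.

```python
import numpy as np

def entropy(x, bins=10):
    """Shannon entropy (bits) of a series after histogram discretization."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_info(x, y, bins=10):
    """Mutual information (bits) between two series via a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

def select_sections(levels, n_select):
    """Greedy pick: high marginal entropy, low redundancy with chosen sites.

    levels: array (n_time, n_sections) of simulated water levels.
    """
    remaining = list(range(levels.shape[1]))
    chosen = [max(remaining, key=lambda j: entropy(levels[:, j]))]
    remaining.remove(chosen[0])
    while len(chosen) < n_select and remaining:
        def gain(j):
            redundancy = max(mutual_info(levels[:, j], levels[:, k]) for k in chosen)
            return entropy(levels[:, j]) - redundancy
        nxt = max(remaining, key=gain)
        chosen.append(nxt)
        remaining.remove(nxt)
    return chosen

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    base = rng.normal(size=(500, 1))
    # 20 candidate cross sections: correlated with a common signal plus noise.
    levels = base + 0.5 * rng.normal(size=(500, 20))
    print("selected cross sections:", select_sections(levels, n_select=5))
```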

  18. Two methods for isolating the lung area of a CT scan for density information

    International Nuclear Information System (INIS)

    Hedlund, L.W.; Anderson, R.F.; Goulding, P.L.; Beck, J.W.; Effmann, E.L.; Putman, C.E.

    1982-01-01

    Extracting density information from irregularly shaped tissue areas of CT scans requires automated methods when many scans are involved. We describe two computer methods that automatically isolate the lung area of a CT scan. Each starts from a single, operator specified point in the lung. The first method follows the steep density gradient boundary between lung and adjacent tissues; this tracking method is useful for estimating the overall density and total area of lung in a scan because all pixels within the lung area are available for statistical sampling. The second method finds all contiguous pixels of lung that are within the CT number range of air to water and are not a part of strong density gradient edges; this method is useful for estimating density and area of the lung parenchyma. Structures within the lung area that are surrounded by strong density gradient edges, such as large blood vessels, airways and nodules, are excluded from the lung sample while lung areas with diffuse borders, such as an area of mild or moderate edema, are retained. Both methods were tested on scans from an animal model of pulmonary edema and were found to be effective in isolating normal and diseased lungs. These methods are also suitable for isolating other organ areas of CT scans that are bounded by density gradient edges
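
    The second method (contiguous in-range pixels, excluding strong density-gradient edges, grown from an operator seed) can be approximated with standard array tools as below; the HU range and gradient threshold are assumed values, and this is a schematic reconstruction rather than the authors' code.

```python
import numpy as np
from scipy import ndimage as ndi

def lung_parenchyma_mask(hu: np.ndarray, seed: tuple,
                         hu_range=(-1000, 0), grad_thresh=150.0) -> np.ndarray:
    """Contiguous lung-parenchyma region containing `seed` in a CT slice (HU).

    Keeps pixels between air (-1000 HU) and water (0 HU) that are not part of
    strong density-gradient edges, then returns the connected component that
    contains the operator-specified seed point.
    """
    grad = np.hypot(ndi.sobel(hu, axis=0), ndi.sobel(hu, axis=1))
    candidate = (hu >= hu_range[0]) & (hu <= hu_range[1]) & (grad < grad_thresh)
    labels, _ = ndi.label(candidate)
    region = labels[seed]
    if region == 0:
        raise ValueError("seed point is not inside a candidate lung region")
    return labels == region

if __name__ == "__main__":
    # Toy slice: soft-tissue background with a lung-like low-density pocket.
    hu = np.full((256, 256), 40.0)
    hu[60:200, 60:200] = -850.0 + np.random.default_rng(5).normal(0, 10, (140, 140))
    mask = lung_parenchyma_mask(hu, seed=(128, 128))
    print(f"lung area: {mask.sum()} pixels, mean HU: {hu[mask].mean():.0f}")
```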

  19. Investigating the feasibility of using partial least squares as a method of extracting salient information for the evaluation of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, George Z.; Myers, Kyle J.; Park, Subok

    2013-03-01

    Digital breast tomosynthesis (DBT) has shown promise for improving the detection of breast cancer, but it has not yet been fully optimized due to a large space of system parameters to explore. A task-based statistical approach [1] is a rigorous method for evaluating and optimizing this promising imaging technique with the use of optimal observers such as the Hotelling observer (HO). However, the high data dimensionality found in DBT has been the bottleneck for the use of a task-based approach in DBT evaluation. To reduce data dimensionality while extracting salient information for performing a given task, efficient channels have to be used for the HO. In the past few years, 2D Laguerre-Gauss (LG) channels, which are a complete basis for stationary backgrounds and rotationally symmetric signals, have been utilized for DBT evaluation [2, 3]. But since background and signal statistics from DBT data are neither stationary nor rotationally symmetric, LG channels may not be efficient in providing reliable performance trends as a function of system parameters. Recently, partial least squares (PLS) has been shown to generate efficient channels for the Hotelling observer in detection tasks involving random backgrounds and signals [4]. In this study, we investigate the use of PLS as a method for extracting salient information from DBT in order to better evaluate such systems.
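
    As a sketch of the general PLS-channel idea (not the study's DBT data or settings), the code below derives channels from labeled training images with scikit-learn's PLSRegression, projects images onto them, and applies a channelized Hotelling observer; the image size, component count, and Gaussian signal/background are assumed toy choices.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(6)
n_train, n_test, side = 400, 200, 32
n_channels = 5

def make_images(n, signal_present):
    """Toy images: noisy background plus an optional central Gaussian signal."""
    yy, xx = np.mgrid[:side, :side]
    signal = 0.6 * np.exp(-((xx - side / 2) ** 2 + (yy - side / 2) ** 2) / 18.0)
    imgs = rng.normal(0, 1, (n, side, side))
    imgs += rng.normal(0, 1, (n, 1, 1))  # crude per-image low-frequency component
    if signal_present:
        imgs += signal
    return imgs.reshape(n, -1)

# Training data: half signal-absent, half signal-present.
X_train = np.vstack([make_images(n_train, False), make_images(n_train, True)])
y_train = np.r_[np.zeros(n_train), np.ones(n_train)]

# PLS extracts channels (weight vectors) that capture task-relevant variation.
pls = PLSRegression(n_components=n_channels).fit(X_train, y_train)
channels = pls.x_weights_  # shape: (n_pixels, n_channels)

def cho_statistic(X, channels, v_absent_mean, v_present_mean, cov):
    """Channelized Hotelling observer test statistic for images X."""
    v = X @ channels
    w = np.linalg.solve(cov, v_present_mean - v_absent_mean)  # Hotelling template
    return v @ w

v_absent = X_train[y_train == 0] @ channels
v_present = X_train[y_train == 1] @ channels
cov = 0.5 * (np.cov(v_absent, rowvar=False) + np.cov(v_present, rowvar=False))
t_absent = cho_statistic(make_images(n_test, False), channels,
                         v_absent.mean(0), v_present.mean(0), cov)
t_present = cho_statistic(make_images(n_test, True), channels,
                          v_absent.mean(0), v_present.mean(0), cov)
auc = np.mean(t_present[:, None] > t_absent[None, :])
print(f"toy CHO AUC with {n_channels} PLS channels: {auc:.2f}")
```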

  20. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

    Driver's workload tends to be increased during driving under complicated traffic environments like a lane change. In such cases, rear collision warning is effective for reduction of cognitive workload. On the other hand, it is pointed out that false alarm or missing alarm caused by sensor errors leads to decrease of driver' s trust in the warning system and it can result in low efficiency of the system. Suppose that reliability information of the sensor is provided in real-time. In this paper, we propose a new warning method to increase driver' s trust in the system even with low sensor reliability utilizing the sensor reliability information. The effectiveness of the warning methods is shown by driving simulator experiments.