WorldWideScience

Sample records for methods background information

  1. Background elimination methods for multidimensional coincidence γ-ray spectra

    International Nuclear Information System (INIS)

    Morhac, M.

    1997-01-01

In this paper, new methods are derived to separate useful information from background in one-, two-, three- and multidimensional spectra (histograms) measured in large multidetector γ-ray arrays. The sensitive nonlinear peak clipping algorithm is the basis of the methods for estimating the background in multidimensional spectra. The derived procedures are simple and therefore have a very low cost in terms of computing time. (orig.)
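The record above names the sensitive nonlinear peak clipping algorithm as the basis of the background estimate. As a rough illustration of that family of algorithms (not the paper's exact multidimensional procedure), a minimal 1-D peak-clipping sketch might look like this; the iteration count and clipping rule are illustrative assumptions.

```python
import numpy as np

def clip_background(spectrum, n_iterations=30):
    """Minimal 1-D sketch of iterative nonlinear peak clipping.

    At iteration p, each channel is replaced by the smaller of its current
    value and the mean of its neighbours p channels away, so peaks are
    progressively clipped while the smooth background survives.  Parameters
    and variant details are illustrative, not the exact procedure of the paper.
    """
    bg = spectrum.astype(float).copy()
    n = bg.size
    for p in range(1, n_iterations + 1):
        clipped = bg.copy()
        for i in range(p, n - p):
            clipped[i] = min(bg[i], 0.5 * (bg[i - p] + bg[i + p]))
        bg = clipped
    return bg

# Usage: net_counts = counts - clip_background(counts)
```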

  2. Monitor Clean and Efficient. Background information. Methods and references as applied in the Monitor in April 2009

    International Nuclear Information System (INIS)

    Gerdes, J.; De Ligt, T.

    2010-01-01

This report contains background information about the Monitor Clean and Efficient that was published in April 2009. The goal and approach of the Monitor are clarified, as well as the methods and data that are used. The structure of this report resembles the structure of the Monitor. Sources and dates of availability are mentioned along with the data, as are the parties collecting and processing the information. The results that were found using this methodology have been published in the Monitor Clean and Efficient.

  3. Latent variable method for automatic adaptation to background states in motor imagery BCI

    Science.gov (United States)

    Dagaev, Nikolay; Volkova, Ksenia; Ossadtchi, Alexei

    2018-02-01

Objective. Brain-computer interface (BCI) systems are known to be vulnerable to variability in the background states of a user. Usually, no detailed information on these states is available even during the training stage. Thus there is a need for a method which is capable of taking background states into account in an unsupervised way. Approach. We propose a latent variable method that is based on a probabilistic model with a discrete latent variable. In order to estimate the model's parameters, we suggest using the expectation-maximization algorithm. The proposed method is aimed at assessing characteristics of background states without any corresponding data labeling. In the context of an asynchronous motor imagery paradigm, we applied this method to real data from twelve able-bodied subjects, with open/closed eyes serving as background states. Main results. We found that the latent variable method improved classification of target states compared to the baseline method (in seven of twelve subjects). In addition, we found that our method was also capable of background state recognition (in six of twelve subjects). Significance. Without any supervised information on background states, the latent variable method provides a way to improve classification in BCI by taking background states into account at the training stage and then by making decisions on target states weighted by the posterior probabilities of background states at the prediction stage.
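The abstract describes an unsupervised probabilistic model with a discrete latent variable fitted by expectation maximization. As a generic illustration of that idea (not the authors' exact model), a two-state Gaussian-mixture EM on feature vectors could be sketched as follows; the feature construction and all parameter choices are assumptions.

```python
import numpy as np

def em_two_state(X, n_iter=50, seed=0):
    """Generic EM for a 2-component Gaussian mixture (diagonal covariance).

    X: (n_samples, n_features) feature vectors.  Returns mixture weights,
    per-state means/variances and the posterior probability of each latent
    background state for every sample.  This is only a generic illustration
    of the unsupervised latent-variable idea, not the authors' exact model.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, 2, replace=False)]
    variances = np.tile(X.var(axis=0), (2, 1)) + 1e-6
    weights = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: log-likelihood of each sample under each state
        log_p = np.zeros((n, 2))
        for k in range(2):
            log_p[:, k] = (np.log(weights[k])
                           - 0.5 * np.sum(np.log(2 * np.pi * variances[k]))
                           - 0.5 * np.sum((X - means[k]) ** 2 / variances[k], axis=1))
        log_p -= log_p.max(axis=1, keepdims=True)
        post = np.exp(log_p)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update weights, means and variances
        for k in range(2):
            r = post[:, k]
            weights[k] = r.mean()
            means[k] = (r[:, None] * X).sum(axis=0) / r.sum()
            variances[k] = (r[:, None] * (X - means[k]) ** 2).sum(axis=0) / r.sum() + 1e-6
    return weights, means, variances, post
```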

  4. A novel method to remove the background from x-ray diffraction signal

    DEFF Research Database (Denmark)

    Zheng, Yi; Speller, Robert; Griffiths, Jennifer

    2018-01-01

The first step required to extract the correct information from a two-dimensional (2D) diffraction signature is to remove the background accurately. However, direct background subtraction inevitably overcorrects the signal, as it does not take into account the attenuation by the sample. (...) This work proposes a novel method that combines peak fitting and experimental results to estimate the background for 2D XRD signals.

  5. Foreign Energy Company Competitiveness: Background information

    Energy Technology Data Exchange (ETDEWEB)

    Weimar, M.R.; Freund, K.A.; Roop, J.M.

    1994-10-01

    This report provides background information to the report Energy Company Competitiveness: Little to Do With Subsidies (DOE 1994). The main body of this publication consists of data uncovered during the course of research on this DOE report. This data pertains to major government energy policies in each country studied. This report also provides a summary of the DOE report. In October 1993, the Office of Energy Intelligence, US Department of Energy (formerly the Office of Foreign Intelligence), requested that Pacific Northwest Laboratory prepare a report addressing policies and actions used by foreign governments to enhance the competitiveness of their energy firms. Pacific Northwest Laboratory prepared the report Energy Company Competitiveness Little to Do With Subsidies (DOE 1994), which provided the analysis requested by DOE. An appendix was also prepared, which provided extensive background documentation to the analysis. Because of the length of the appendix, Pacific Northwest Laboratory decided to publish this information separately, as contained in this report.

  6. Introduction to the background field method

    International Nuclear Information System (INIS)

    Abbott, L.F.; Brandeis Univ., Waltham, MA

    1982-01-01

    The background field approach to calculations in gauge field theories is presented. Conventional functional techniques are reviewed and the background field method is introduced. Feynman rules and renormalization are discussed and, as an example, the Yang-Mills β function is computed. (author)
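For reference, the background-field split and the quoted Yang-Mills one-loop result can be written in their standard textbook form (conventions differ between references):

```latex
% Standard background-field split and the pure Yang-Mills one-loop result
% (generic textbook forms; conventions vary between references).
A_\mu^a = B_\mu^a + Q_\mu^a , \qquad
\beta(g) = -\frac{g^3}{16\pi^2}\,\frac{11}{3}\,C_A + \mathcal{O}(g^5),
\qquad C_A = N \ \text{for } SU(N).
```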

  7. The Background-Field Method and Noninvariant Renormalization

    International Nuclear Information System (INIS)

    Avdeev, L.V.; Kazakov, D.I.; Kalmykov, M.Yu.

    1994-01-01

We investigate the consistency of the background-field formalism when applying various regularizations and renormalization schemes. Using the example of a two-dimensional σ model, it is demonstrated that the background-field method gives incorrect results when the regularization (and/or renormalization) is noninvariant. In particular, it is found that the cut-off regularization and the differential renormalization belong to this class and are incompatible with the background-field method in theories with nonlinear symmetries. 17 refs

  8. Study on the background information for the geological disposal concept

    International Nuclear Information System (INIS)

    Matsui, Kazuaki; Murano, Tohru; Hirusawa, Shigenobu; Komoto, Harumi

    2000-03-01

Japan Nuclear Cycle Development Institute (JNC) published its first R and D report in 1992, in which the results of the R and D work were compiled. Since then, JNC has been preparing the second R and D progress report, due before 2000, in which background information on the geological disposal of high level radioactive waste (HLW) is to be presented along with the technical basis. Recognizing the importance of social consensus to geological disposal, understanding and acceptance by society are essential to the development and realization of the geological disposal of HLW. In this fiscal year, the studies were divided into 2 phases, considering the time schedule of the second R and D progress report. 1. Phase 1: Analysis of the background information on the geological disposal concept. Based on recent information and the research work of the last 2 years, the final version of the study was prepared to contribute to the background information for the second R and D progress report. (This was published in Nov. 1999 as the intermediate report: JNC TJ 1420 2000-006). 2. Phase 2: The following 2 specific items were selected as candidate issues needing study, considering the present circumstances of R and D on geological disposal. (1) Educational materials and strategies related to nuclear energy and nuclear waste: specific strategies and approaches in nuclear energy and nuclear waste educational outreach and curriculum activities by the nuclear industry, government and other entities in 6 countries were surveyed and summarized. (2) Alternatives to geological disposal of HLW: past national/international considerations and current status. Alternatives for the disposal of HLW have been discussed in the past, and the major waste-producing countries have almost all chosen deep geological disposal as the preferred method. Here, past histories and recent discussions on alternatives to geological disposal were studied. (author)

  9. The Institute of American Indian Arts Background Information (Task One of the Transition Evaluation). Background Paper.

    Science.gov (United States)

    Tippeconnic, John W., Jr.

    The paper, prepared as Task One of the Institute of American Indian Arts Transition Evaluation, provides pertinent background information about the Institute of American Indian Arts in Santa Fe, New Mexico. A brief history of the Institute is given, with information about its philosophy and purpose; objectives; organization and administration; the…

  10. A method of background noise cancellation for SQUID applications

    International Nuclear Information System (INIS)

    He, D F; Yoshizawa, M

    2003-01-01

When superconducting quantum interference devices (SQUIDs) operate in low-cost shielding or unshielded environments, the environmental background noise should be reduced to increase the signal-to-noise ratio. In this paper we present a background noise cancellation method based on a spectral subtraction algorithm. We first measure the background noise and estimate the noise spectrum using the fast Fourier transform (FFT); then we subtract the spectrum of the background noise from that of the observed noisy signal, and the signal can be reconstructed by inverse FFT of the subtracted spectrum. With this method, the background noise, especially stationary interference, can be suppressed well and the signal-to-noise ratio can be increased. Using a high-Tc radio-frequency SQUID gradiometer and magnetometer, we have measured the magnetic field produced by a watch, which was placed 35 cm under the SQUID. After noise cancellation, the signal-to-noise ratio could be greatly increased. We also used this method to eliminate the vibration noise of a cryocooler SQUID.
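The spectral subtraction step described above can be sketched in a few lines; this is a generic magnitude-subtraction illustration under the assumption that a background-only reference recording is available, not the exact processing chain used with the SQUID data.

```python
import numpy as np

def spectral_subtract(noisy, noise_ref, floor=0.0):
    """Minimal sketch of magnitude spectral subtraction.

    noisy:      measured signal (background + signal of interest)
    noise_ref:  a background-only recording of the same length
    The background magnitude spectrum is subtracted from the noisy spectrum,
    negative magnitudes are clipped to `floor`, and the noisy phase is kept
    for the inverse FFT.  Parameter names are illustrative; the paper's
    exact processing may differ.
    """
    N = np.fft.rfft(noise_ref)
    Y = np.fft.rfft(noisy)
    mag = np.maximum(np.abs(Y) - np.abs(N), floor)
    cleaned = np.fft.irfft(mag * np.exp(1j * np.angle(Y)), n=len(noisy))
    return cleaned
```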

  11. Spectral feature characterization methods for blood stain detection in crime scene backgrounds

    Science.gov (United States)

    Yang, Jie; Mathew, Jobin J.; Dube, Roger R.; Messinger, David W.

    2016-05-01

Blood stains are one of the most important types of evidence for forensic investigation. They contain valuable DNA information, and the pattern of the stains can suggest specifics about the nature of the violence that transpired at the scene. Blood spectral signatures containing unique reflectance or absorption features are important both for forensic on-site investigation and laboratory testing. They can be used for target detection and identification applied to crime scene hyperspectral imagery, and also be utilized to analyze the spectral variation of blood on various backgrounds. Non-blood stains often mislead the detection and can generate false alarms at a real crime scene, especially for dark and red backgrounds. This paper measured the reflectance of liquid blood and 9 kinds of non-blood samples in the range of 350 nm - 2500 nm in various crime scene backgrounds, such as pure samples contained in petri dishes with various thicknesses, mixed samples with fabrics of different colors and materials, and mixed samples with wood, all of which are examined to provide sub-visual evidence for detecting and recognizing blood from non-blood samples in a realistic crime scene. The spectral differences between blood and non-blood samples are examined, and spectral features such as "peaks" and "depths" of reflectance are selected. Two blood stain detection methods are proposed in this paper. The first method uses an index defined as the ratio of ("depth" minus "peak") to ("depth" plus "peak") within a wavelength range of the reflectance spectrum. The second method uses the relative band depth of selected wavelength ranges of the reflectance spectrum. Results show that the index method is able to discriminate blood from non-blood samples in most tested crime scene backgrounds, but is not able to detect it on black felt, whereas the relative band depth method is able to discriminate blood from non-blood samples on all of the tested background material types and colors.
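The first index described above has the form of a normalized difference. A small sketch of such an index computed from a reflectance spectrum is given below; the wavelength positions of the "peak" and "depth" features are placeholders, since the record does not specify them.

```python
import numpy as np

def normalized_band_index(wavelengths, reflectance, peak_nm, depth_nm, half_width=5.0):
    """Sketch of a (depth - peak) / (depth + peak) style index.

    `peak_nm` and `depth_nm` mark where a local reflectance maximum and an
    absorption minimum are expected; the specific wavelengths used by the
    paper are not given here, so these arguments are placeholders.
    """
    def band_mean(center):
        mask = np.abs(wavelengths - center) <= half_width
        return reflectance[mask].mean()

    peak = band_mean(peak_nm)
    depth = band_mean(depth_nm)
    return (depth - peak) / (depth + peak)
```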

  12. The Pedestrian Detection Method Using an Extension Background Subtraction about the Driving Safety Support Systems

    Science.gov (United States)

    Muranaka, Noriaki; Date, Kei; Tokumaru, Masataka; Imanishi, Shigeru

In recent years, traffic accidents have occurred frequently as traffic density has exploded. We therefore believe that a safe and comfortable transportation system that protects pedestrians, who are the most vulnerable road users, is necessary. First, we detect and recognize the pedestrian (the crossing person) by image processing. Next, we inform all drivers turning right or left that a pedestrian is present, using sound, images, and so on. By prompting drivers to drive safely in this way, accidents involving pedestrians can be reduced. In this paper, we use a background subtraction method to detect moving objects. In background subtraction, the method of updating the background is important; in the conventional approach, the threshold values for the subtraction processing and for the background update were identical. That is, the mixing rate of the input image and the background image in the background update was a fixed value, and fine tuning in response to environmental changes such as the weather was difficult. Therefore, we propose an update method for the background image in which estimation errors are difficult to amplify, as sketched below. We experiment with and examine the comparison for five cases: sunshine, cloudy, evening, rain, and sunlight change, excluding night. This technique can separately set the threshold values of the subtraction processing and the background update processing to suit environmental conditions such as the weather. Therefore, fine tuning of the mixing rate of the input image and the background image in the background update becomes freely possible. Because setting parameters suited to the environmental conditions is important to minimize the error rate, we also examine the setting of the parameters.
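A minimal sketch of the kind of background update discussed above, with separate thresholds for subtraction and for background updating and an adjustable mixing rate, is shown below; all threshold and mixing-rate values are illustrative, not the paper's tuned settings.

```python
import numpy as np

def update_background(frame, background, t_detect=25.0, t_update=15.0, alpha=0.05):
    """Sketch of background subtraction with separate detection/update thresholds.

    A pixel is flagged as foreground when it differs from the background by
    more than `t_detect`; the background is blended toward the current frame
    (mixing rate `alpha`) only where the difference is below `t_update`, so
    estimation errors are not amplified.  All values here are illustrative.
    """
    frame = frame.astype(float)
    diff = np.abs(frame - background)
    foreground = diff > t_detect
    stable = diff < t_update
    background = np.where(stable, (1.0 - alpha) * background + alpha * frame, background)
    return foreground, background
```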

  13. 42 CFR 82.0 - Background information on this part.

    Science.gov (United States)

    2010-10-01

    ... 82.0 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY... EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000 Introduction § 82.0 Background information on this part. The Energy Employees Occupational Illness Compensation Program Act (EEOICPA), 42 U.S.C...

  14. Envelope method for background elimination from X-ray fluorescence spectra

    International Nuclear Information System (INIS)

    Monakhov, V.V.; Naumenko, P.A.; Chashinskaya, O.A.

    2006-01-01

The influence of the background noise caused by Bremsstrahlung on the accuracy of the envelope method for x-ray fluorescence spectrum processing is studied. This is carried out using model spectra with different forms of Bremsstrahlung noise as well as with background noise present in the spectra. Interpolation by parabolic splines is used to estimate the error of the envelope method for the elimination of the continuous background noise. It is found that the error of the proposed method amounts to a few tenths of a percent. It is shown that the envelope method is an effective technique for eliminating the continuous Bremsstrahlung background from x-ray fluorescence spectra of the first order

  15. Renormalization using the background-field method

    International Nuclear Information System (INIS)

    Ichinose, S.; Omote, M.

    1982-01-01

Renormalization using the background-field method is examined in detail. The subtraction mechanism of subdivergences is described with reference to multi-loop diagrams, and one- and two-loop counter-term formulae are explicitly given. The original one-loop counter-term formula of 't Hooft is thereby improved. The present method of renormalization is far easier to manage than the usual one, owing to the fact that only gauge-invariant quantities need to be considered when working in an appropriate gauge. Gravity and Yang-Mills theories are studied as examples. (orig.)

  16. 77 FR 21992 - Proposed Renewal of Information Collection: Applicant Background Survey

    Science.gov (United States)

    2012-04-12

    ... customers. By including employees of all backgrounds, all DOI employees gain a measure of knowledge... barriers in our recruitment and selection processes, DOI must track the demographic groups that apply for... need and use of the information: This information is required to obtain the source of recruitment...

  17. A novel background field removal method for MRI using projection onto dipole fields (PDF).

    Science.gov (United States)

    Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi

    2011-11-01

    For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
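A rough sketch of the projection idea, fitting dipole sources outside the region of interest by least squares and subtracting their field inside it, is given below. It omits noise weighting and other refinements of the published implementation and should be read only as an illustration of the decomposition.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

def pdf_background_removal(field, roi, iters=30):
    """Rough sketch of projection-onto-dipole-fields (PDF) background removal.

    field: measured field map (3-D array); roi: boolean mask of the region of
    interest.  Background susceptibility sources are restricted to voxels
    outside the ROI; their strengths are fit by least squares so that the
    induced dipole field matches the measured field inside the ROI, and the
    fitted background field is subtracted there.  This is a simplified
    illustration (no noise weighting, no padding), not the published code.
    """
    shape = field.shape
    kx, ky, kz = np.meshgrid(np.fft.fftfreq(shape[0]),
                             np.fft.fftfreq(shape[1]),
                             np.fft.fftfreq(shape[2]), indexing='ij')
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide='ignore', invalid='ignore'):
        D = 1.0 / 3.0 - kz**2 / k2          # unit dipole kernel in k-space
    D[0, 0, 0] = 0.0

    outside = ~roi
    n_src, n_obs = int(outside.sum()), int(roi.sum())

    def dipole_field(chi_out):
        chi = np.zeros(shape)
        chi[outside] = chi_out
        return np.real(np.fft.ifftn(D * np.fft.fftn(chi)))

    def matvec(x):                          # sources outside -> field inside ROI
        return dipole_field(x)[roi]

    def rmatvec(y):                         # adjoint: field inside ROI -> sources outside
        f = np.zeros(shape)
        f[roi] = y
        return np.real(np.fft.ifftn(D * np.fft.fftn(f)))[outside]

    A = LinearOperator((n_obs, n_src), matvec=matvec, rmatvec=rmatvec)
    chi_out = lsqr(A, field[roi], iter_lim=iters)[0]
    local = field - dipole_field(chi_out)   # remove the fitted background field
    local[outside] = 0.0
    return local
```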

  18. Improvement of Accuracy for Background Noise Estimation Method Based on TPE-AE

    Science.gov (United States)

    Itai, Akitoshi; Yasukawa, Hiroshi

This paper proposes a method of background noise estimation based on tensor product expansion with a median and a Monte Carlo simulation. We have shown that the tensor product expansion with absolute error method is effective for estimating background noise; however, the background noise might not be estimated properly by the conventional method. In this paper, it is shown that the estimation accuracy can be improved by using the proposed methods.

  19. Presenting and processing information in background noise: A combined speaker-listener perspective.

    Science.gov (United States)

    Bockstael, Annelies; Samyn, Laurie; Corthals, Paul; Botteldooren, Dick

    2018-01-01

    Transferring information orally in background noise is challenging, for both speaker and listener. Successful transfer depends on complex interaction between characteristics related to listener, speaker, task, background noise, and context. To fully assess the underlying real-life mechanisms, experimental design has to mimic this complex reality. In the current study, the effects of different types of background noise have been studied in an ecologically valid test design. Documentary-style information had to be presented by the speaker and simultaneously acquired by the listener in four conditions: quiet, unintelligible multitalker babble, fluctuating city street noise, and little varying highway noise. For both speaker and listener, the primary task was to focus on the content that had to be transferred. In addition, for the speakers, the occurrence of hesitation phenomena was assessed. The listener had to perform an additional secondary task to address listening effort. For the listener the condition with the most eventful background noise, i.e., fluctuating city street noise, appeared to be the most difficult with markedly longer duration of the secondary task. In the same fluctuating background noise, speech appeared to be less disfluent, suggesting a higher level of concentration from the speaker's side.

  20. Description of background data in the SKB database GEOTAB

    International Nuclear Information System (INIS)

    Eriksson, E.; Sehlstedt, S.

    1989-02-01

    During the research and development program performed by SKB for the final disposal of spent nuclear fuel, a large quantity of geoscientific data was collected. Most of this data was stored in a database called GEOTAB. This report describes data within the background data group. This data provides information on the location of areas studied, borehole positions and also some drilling information. The background data group (subject), called BGR, is divided into several subgroups (methods): BGAREA area background data; BGDRILL drilling information; BGDRILLP drill penetration data; BGHOLE borehole information; BGTABLES number of rows in a table, and BGTOLR data table tolerance. A method consists of one or several data tables. In each chapter a method and its data tables are described. (orig./HP)

  1. The information content of cosmic microwave background anisotropies

    Science.gov (United States)

    Scott, Douglas; Contreras, Dagoberto; Narimani, Ali; Ma, Yin-Zhe

    2016-06-01

    The cosmic microwave background (CMB) contains perturbations that are close to Gaussian and isotropic. This means that its information content, in the sense of the ability to constrain cosmological models, is closely related to the number of modes probed in CMB power spectra. Rather than making forecasts for specific experimental setups, here we take a more pedagogical approach and ask how much information we can extract from the CMB if we are only limited by sample variance. We show that, compared with temperature measurements, the addition of E-mode polarization doubles the number of modes available out to a fixed maximum multipole, provided that all of the TT, TE, and EE power spectra are measured. However, the situation in terms of constraints on particular parameters is more complicated, as we explain and illustrate graphically. We also discuss the enhancements in information that can come from adding B-mode polarization and gravitational lensing. We show how well one could ever determine the basic cosmological parameters from CMB data compared with what has been achieved with Planck, which has already probed a substantial fraction of the TT information. Lastly, we look at constraints on neutrino mass as a specific example of how lensing information improves future prospects beyond the current 6-parameter model.
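The mode-counting argument behind this kind of analysis rests on two standard relations for a single full-sky power spectrum:

```latex
% Mode counting behind the "information content" argument (standard results
% for a single full-sky, sample-variance-limited power spectrum):
N_{\rm modes} \;=\; \sum_{\ell=2}^{\ell_{\max}} (2\ell+1) \;\simeq\; \ell_{\max}^{2},
\qquad
\frac{\Delta C_\ell}{C_\ell}\bigg|_{\rm sample\ variance} \;=\; \sqrt{\frac{2}{2\ell+1}} .
```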

  2. The information content of cosmic microwave background anisotropies

    International Nuclear Information System (INIS)

    Scott, Douglas; Contreras, Dagoberto; Narimani, Ali; Ma, Yin-Zhe

    2016-01-01

The cosmic microwave background (CMB) contains perturbations that are close to Gaussian and isotropic. This means that its information content, in the sense of the ability to constrain cosmological models, is closely related to the number of modes probed in CMB power spectra. Rather than making forecasts for specific experimental setups, here we take a more pedagogical approach and ask how much information we can extract from the CMB if we are only limited by sample variance. We show that, compared with temperature measurements, the addition of E-mode polarization doubles the number of modes available out to a fixed maximum multipole, provided that all of the TT, TE, and EE power spectra are measured. However, the situation in terms of constraints on particular parameters is more complicated, as we explain and illustrate graphically. We also discuss the enhancements in information that can come from adding B-mode polarization and gravitational lensing. We show how well one could ever determine the basic cosmological parameters from CMB data compared with what has been achieved with Planck, which has already probed a substantial fraction of the TT information. Lastly, we look at constraints on neutrino mass as a specific example of how lensing information improves future prospects beyond the current 6-parameter model.

  3. Numerical method for IR background and clutter simulation

    Science.gov (United States)

    Quaranta, Carlo; Daniele, Gina; Balzarotti, Giorgio

    1997-06-01

The paper describes a fast and accurate algorithm for IR background noise and clutter generation for use in scene simulations. The process is based on the hypothesis that the background can be modeled as a statistical process in which the signal amplitude obeys a Gaussian distribution and zones of the same scene satisfy a correlation function of exponential form. The algorithm provides an accurate mathematical approximation of the model and also excellent fidelity with reality, as appears from a comparison with images from IR sensors. The proposed method shows advantages with respect to methods based on the filtering of white noise in the time or frequency domain, as it requires a limited number of computations and, furthermore, it is more accurate than quasi-random processes. The background generation starts from a reticule of a few points, and by means of growing rules the process is extended to the whole scene at the required dimension and resolution. The statistical properties of the model are properly maintained in the simulation process. The paper gives specific attention to the mathematical aspects of the algorithm and provides a number of simulations and comparisons with real scenes.
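The record describes generating the field by growing rules from a small reticule of points. An alternative way to obtain the same target statistics (Gaussian amplitudes with an exponential spatial correlation) is FFT-based spectral synthesis, sketched below purely as an illustration of the statistics, not of the authors' growing-rule algorithm.

```python
import numpy as np

def gaussian_background(shape=(256, 256), corr_length=10.0, sigma=1.0, seed=0):
    """Sketch: Gaussian random background with exponential spatial correlation.

    Built by spectral synthesis (filtering white noise with the square root of
    the power spectrum of an exponential correlation function).  This is an
    alternative illustration of the target statistics, not the paper's
    point-reticule growing procedure.
    """
    rng = np.random.default_rng(seed)
    ny, nx = shape
    y = np.minimum(np.arange(ny), ny - np.arange(ny))[:, None]
    x = np.minimum(np.arange(nx), nx - np.arange(nx))[None, :]
    r = np.hypot(x, y)
    corr = np.exp(-r / corr_length)                # target autocorrelation
    power = np.abs(np.fft.fft2(corr))              # its (non-negative) power spectrum
    white = np.fft.fft2(rng.standard_normal(shape))
    field = np.real(np.fft.ifft2(white * np.sqrt(power)))
    field *= sigma / field.std()                   # normalise amplitude
    return field
```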

  4. Segmentation of Moving Object Using Background Subtraction Method in Complex Environments

    Directory of Open Access Journals (Sweden)

    S. Kumar

    2016-06-01

Background subtraction is an extensively used approach to localize the moving object in a video sequence. However, detecting an object under spatiotemporal background behavior such as rippling water, a moving curtain, illumination change, or low resolution is not a straightforward task. To deal with the above-mentioned problem, we present a background maintenance scheme based on updating of background pixels by estimating the current spatial variance along the temporal line. The work focuses on making the method immune to local motion variation in the background. Finally, the most suitable label assignment to the motion field is estimated and optimized by using iterated conditional modes (ICM) under a Markovian framework. Performance evaluation and comparisons with other well-known background subtraction methods show that the proposed method is unaffected by the problems of aperture distortion, ghost images, and high frequency noise.
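The final ICM labelling step mentioned above can be illustrated generically: relabel each pixel to minimize a data term plus a smoothness penalty over its neighbours. The sketch below uses a simple Ising-style prior and illustrative parameter values; the paper's actual energy terms differ.

```python
import numpy as np

def icm_smooth_labels(diff, labels, beta=1.5, tau=25.0, n_sweeps=5):
    """Generic ICM relabelling of a binary foreground mask.

    diff:   absolute frame-background difference per pixel
    labels: initial 0/1 mask (e.g. from thresholding diff at tau)
    Each pixel takes the label minimising a data term (how far diff lies on
    the wrong side of the threshold) plus beta times the number of
    disagreeing 4-neighbours.  Parameter values are illustrative only.
    """
    h, w = labels.shape
    for _ in range(n_sweeps):
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                neigh = [labels[i - 1, j], labels[i + 1, j],
                         labels[i, j - 1], labels[i, j + 1]]
                costs = []
                for lab in (0, 1):
                    data = (diff[i, j] - tau) if lab == 0 else (tau - diff[i, j])
                    smooth = beta * sum(1 for n in neigh if n != lab)
                    costs.append(max(data, 0.0) + smooth)
                labels[i, j] = int(np.argmin(costs))
    return labels
```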

  5. E-mail Writing: Providing Background Information in the Core of Computer Assisted Instruction

    Directory of Open Access Journals (Sweden)

    Behzad NAZARI

    2015-01-01

The present study strongly supported the effective role of the teacher providing background information via e-mail before students write their own e-mails in improving learners' writing ability. A total of 50 advanced male EFL students aged between 25 and 40 at different branches of the Iran Language Institute in Tehran participated. Through the Oxford English Language Placement Test (OELPT), the students' proficiency levels were found to be nearly the same. Participants were randomly assigned into two groups, experimental and control, each consisting of 25 students. After the administration of the proficiency test, all groups were assigned to write topic 1 as the pre-test. Next, the teacher involved the learners in the new instruction (treatment). During writing topics 2, 3, 4, 5, 6, and 7, the experimental group's background knowledge was activated through e-mail before writing and e-mailing the topics, while the control group received no background knowledge activation through e-mail. After the treatment was given to the experimental group, the students in both groups were required to write another composition on the last topic, topic 8. Again, in this phase, none of the groups received any background information. The results indicated that the teacher providing background information via e-mail before the students wrote their e-mails significantly improved learners' writing ability.

  6. Research on Statistical Flow of the Complex Background Based on Image Method

    Directory of Open Access Journals (Sweden)

    Yang Huanhai

    2014-06-01

As urbanization in our country continues to accelerate, the pressure on urban road traffic systems keeps increasing. Therefore, the importance of intelligent transportation systems based on computer vision technology is becoming more and more significant, and using image processing technology for vehicle detection has become a hot research topic. Only vehicles that are accurately segmented from the background can be recognized and tracked. Therefore, applying video vehicle detection and image processing technology to identify the number, types, and motion characteristics of the many cars within the same view can provide a real-time basis for intelligent traffic control. This paper first introduces the concept of intelligent transportation systems and the importance of image processing technology in vehicle recognition and counting, gives an overview of video vehicle detection methods, and, by comparing video detection with other detection technologies, points out the advantages of video detection. Finally, we design a real-time and reliable background subtraction method and a vehicle-area recognition method based on an information fusion algorithm, implemented with the MATLAB/GUI development tools on the Windows operating system platform. In this paper, the algorithm is applied to frames of traffic flow imagery. The experimental results show that the algorithm's vehicle flow counting works very well.

  7. Identification and summary characterization of materials potentially requiring vitrification: Background information

    International Nuclear Information System (INIS)

    Croff, A.G.

    1996-01-01

This document contains background information for the Workshop in general and the presentation entitled 'Identification and Summary Characterization of Materials Potentially Requiring Vitrification' that was given during the first morning of the workshop. Summary characteristics of 9 categories of US materials having some potential to be vitrified are given. This is followed by 1-2 page elaborations for each of these 9 categories. References to more detailed information are included

  8. A method of reducing background fluctuation in tunable diode laser absorption spectroscopy

    Science.gov (United States)

    Yang, Rendi; Dong, Xiaozhou; Bi, Yunfeng; Lv, Tieliang

    2018-03-01

Optical interference fringes are the main factor leading to background fluctuation in gas concentration detection based on tunable diode laser absorption spectroscopy. The interference fringes are generated by multiple reflections or scatterings from optical surfaces in the optical path and make the background signal exhibit an approximately sinusoidal oscillation. To reduce the fluctuation of the background, a method that combines dual tone modulation (DTM) with a vibration reflector (VR) is proposed in this paper. The combination of DTM and VR allows the unwanted periodic interference fringes to be averaged out, and the effectiveness of the method in reducing background fluctuation has been verified by simulation and real experiments in this paper. In the detection system based on the proposed method, the standard deviation (STD) of the background signal is decreased to 0.0924 parts per million (ppm), a reduction by a factor of 16 compared with that of wavelength modulation spectroscopy. The STD value of 0.0924 ppm corresponds to an absorption of 4.328 × 10^-6 Hz^-1/2 (with an effective optical path length of 4 m and an integration time of 0.1 s). Moreover, the proposed method presents stable performance in reducing background fluctuation in long-duration experiments.

  9. Direction Dependent Background Fitting for the Fermi GBM Data

    OpenAIRE

    Szécsi, Dorottya; Bagoly, Zsolt; Kóbori, József; Horváth, István; Balázs, Lajos G.

    2013-01-01

    We present a method for determining the background of Fermi GBM GRBs using the satellite positional information and a physical model. Since the polynomial fitting method typically used for GRBs is generally only indicative of the background over relatively short timescales, this method is particularly useful in the cases of long GRBs or those which have Autonomous Repoint Request (ARR) and a background with much variability on short timescales. We give a Direction Dependent Background Fitting...

  10. Limitations of the time slide method of background estimation

    International Nuclear Information System (INIS)

    Was, Michal; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Robinet, Florent; Vavoulidis, Miltiadis

    2010-01-01

    Time shifting the output of gravitational wave detectors operating in coincidence is a convenient way of estimating the background in a search for short-duration signals. In this paper, we show how non-stationary data affect the background estimation precision. We present a method of measuring the fluctuations of the data and computing its effects on a coincident search. In particular, we show that for fluctuations of moderate amplitude, time slides larger than the fluctuation time scales can be used. We also recall how the false alarm variance saturates with the number of time shifts.
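A minimal sketch of the time-slide procedure itself, counting accidental coincidences between two detectors' trigger lists at shifted offsets, is given below; the window, stride, and slide count are illustrative.

```python
import numpy as np

def timeslide_background(times_a, times_b, window=0.01, slide_step=1.0,
                         n_slides=100, t_span=None):
    """Minimal sketch of time-slide background estimation.

    times_a, times_b: trigger times (s) from two detectors.  For each
    non-zero slide, detector B's times are shifted circularly over t_span
    and coincidences within `window` are counted; the distribution of these
    counts estimates the accidental-coincidence background.  As the paper
    notes, the slide step should exceed the fluctuation time scales of the
    data.  All parameter values here are illustrative.
    """
    times_a = np.asarray(times_a, dtype=float)
    times_b = np.asarray(times_b, dtype=float)
    if t_span is None:
        t_span = max(times_a.max(), times_b.max())
    counts = []
    for k in range(1, n_slides + 1):
        shifted = np.sort((times_b + k * slide_step) % t_span)
        idx = np.searchsorted(shifted, times_a)
        left = np.abs(shifted[np.clip(idx - 1, 0, len(shifted) - 1)] - times_a)
        right = np.abs(shifted[np.clip(idx, 0, len(shifted) - 1)] - times_a)
        counts.append(int(np.sum(np.minimum(left, right) <= window)))
    return np.mean(counts), np.std(counts)
```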

  11. Limitations of the time slide method of background estimation

    Energy Technology Data Exchange (ETDEWEB)

    Was, Michal; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Robinet, Florent; Vavoulidis, Miltiadis, E-mail: mwas@lal.in2p3.f [LAL, Universite Paris-Sud, CNRS/IN2P3, Orsay (France)

    2010-10-07

    Time shifting the output of gravitational wave detectors operating in coincidence is a convenient way of estimating the background in a search for short-duration signals. In this paper, we show how non-stationary data affect the background estimation precision. We present a method of measuring the fluctuations of the data and computing its effects on a coincident search. In particular, we show that for fluctuations of moderate amplitude, time slides larger than the fluctuation time scales can be used. We also recall how the false alarm variance saturates with the number of time shifts.

  12. [Effects of exposure frequency and background information on preferences for photographs of cars in different locations].

    Science.gov (United States)

    Matsuda, Ken; Kusumi, Takashi; Hosomi, Naohiro; Osa, Atsushi; Miike, Hidetoshi

    2014-08-01

    This study examined the influence of familiarity and novelty on the mere exposure effect while manipulating the presentation of background information. We selected presentation stimuli that integrated cars and backgrounds based on the results of pilot studies. During the exposure phase, we displayed the stimuli successively for 3 seconds, manipulating the background information (same or different backgrounds with each presentation) and exposure frequency (3, 6, and 9 times). In the judgment phase, 18 participants judged the cars in terms of preference, familiarity, and novelty on a 7-point scale. As the number of stimulus presentations increased, the preference for the cars increased during the different background condition and decreased during the same background condition. This increased preference may be due to the increase in familiarity caused by the higher exposure frequency and novelty resulting from the background changes per exposure session. The rise in preference judgments was not seen when cars and backgrounds were presented independently. Therefore, the addition of novel features to each exposure session facilitated the mere exposure effect.

  13. Method and equipment for γ background compensation in neutron spectrometry

    International Nuclear Information System (INIS)

    Holman, M.; Marik, P.

    1986-01-01

The compensation of background gamma radiation in neutron spectrometry is based on the measurement of the total energy spectrum of all protons and electrons, and of the energy spectrum of those protons and electrons which are in coincidence with the discriminating signal derived from the integral of the counting rate distribution by pulse shape. The benefits of the method consist in the possibility of using standard single-parameter apparatus, in considerably smaller demands on the memory capacity, and in the possibility of a substantially finer division of the spectrum and more accurate compensation of the background than has been the case with methods used so far. A practical application is shown in a block diagram. (J.B.)

  14. An analysis of the Bonn agreement. Background information for evaluating business implications

    International Nuclear Information System (INIS)

    Torvanger, Asbjoern

    2001-01-01

    This report has been commissioned by the World Business Council for Sustainable Development and written in August 2001. The aim of the report is to present and analyze the newest developments in the climate negotiations, particularly from part two of the sixth Conference of the Parties to the Climate Convention in Bonn in July 2001, and to provide background information to evaluate what the ''Bonn agreement'' means for business. The report is organized as a collection of slides with supporting text explaining the background and contents of each slide. (author)

  15. The influence of immigrant background on the choice of sedation method in paediatric dentistry.

    Science.gov (United States)

    Dahlander, Andreas; Jansson, Leif; Carlstedt, Kerstin; Grindefjord, Margaret

    2015-01-01

    The effects of immigration on the demographics of the Swedish population have changed the situation for many dental care providers, placing increased demand on cultural competence. The aim of this investigation was to study the choice of sedation method among children with immigrant background, referred to paediatric dentistry specialists, because of behaviour management problems or dental fear in combination with treatment needs. The material consisted of dental records from children referred to two clinics for paediatric dentistry: 117 records from children with an immigrant background and 106 from children with a non-immigrant background. Information about choice of sedation method (conventional treatment, conscious sedation with midazolam, nitrous oxide, or general anaesthesia) and dental status was collected from the records. The number of missed appointments (defaults) was also registered. Binary logistic regression analyses were used to calculate the influence of potential predictors on choice of sedation method. The mean age of the patients in the immigrant group was 4.9 yrs, making them significantly younger than the patients in the non-immigrant group (mean 5.7 yrs). In the immigrant group, 26% of the patients defaulted from treatments, while the corresponding frequency was significantly lower for the reference group (7%). The numbers of primary teeth with caries and permanent teeth with caries were positively and significantly correlated with the choice of treatment under general anaesthesia. Conscious sedation was used significantly more often in younger children and in the non-immigrant group, while nitrous oxide was preferred in the older children. In conclusion, conscious sedation was more frequently used in the non-immigrant group. The choice of sedation was influenced by caries frequency and the age of the child.

  16. A statistical, task-based evaluation method for three-dimensional x-ray breast imaging systems using variable-background phantoms

    International Nuclear Information System (INIS)

    Park, Subok; Jennings, Robert; Liu Haimo; Badano, Aldo; Myers, Kyle

    2010-01-01

    Purpose: For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, either academia or industry. However, there is still much room for understanding how to best optimize and evaluate the devices over a large space of many different system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information from the 3D imaging systems. Therefore, it is critical to develop a statistically sound evaluation method to investigate the usefulness of inclusion of depth and background-variability information into the assessment and optimization of the 3D systems. Methods: In this paper, we present a mathematical framework for a statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular, making use of the ideal linear observer called the Hotelling observer. We also present a physical phantom that consists of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method in comparing laboratory mammography and three-angle DBT systems for signal detection tasks using the phantom's projection data. We compare the variable phantom case to that of a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods are different from each other for both mammography and DBT systems. Conclusions: Our results indicate that measuring the system's detection performance with consideration of background variability may lead to differences in system performance
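The Hotelling observer used in this evaluation is the ideal linear discriminant; its standard definition (with the data covariance including background variability) is:

```latex
% Hotelling (ideal linear) observer, standard definitions:
t(\mathbf{g}) = \mathbf{w}^{T}\mathbf{g}, \qquad
\mathbf{w} = \mathbf{K}_{\mathbf{g}}^{-1}\,(\bar{\mathbf{g}}_{2}-\bar{\mathbf{g}}_{1}), \qquad
\mathrm{SNR}_{\mathrm{Hot}}^{2} =
(\bar{\mathbf{g}}_{2}-\bar{\mathbf{g}}_{1})^{T}\,\mathbf{K}_{\mathbf{g}}^{-1}\,
(\bar{\mathbf{g}}_{2}-\bar{\mathbf{g}}_{1}),
% where K_g is the covariance of the image data (including background
% variability) and g-bar_1, g-bar_2 are the class means.
```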

  17. Judicial decision making: order of evidence presentation and availability of background information

    NARCIS (Netherlands)

    Kerstholt, J.H.; Jackson, J.L.

    1998-01-01

    An experiment was conducted to investigate both the effect of the order of presentation of defence and prosecution evidence and the prior availability of background information on assessment of guilt. Subjects were required to judge the defendant's probability of guilt either after each witness

  18. A non-iterative method for fitting decay curves with background

    International Nuclear Information System (INIS)

    Mukoyama, T.

    1982-01-01

    A non-iterative method for fitting a decay curve with background is presented. The sum of an exponential function and a constant term is linearized by the use of the difference equation and parameters are determined by the standard linear least-squares fitting. The validity of the present method has been tested against pseudo-experimental data. (orig.)
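One standard way to linearize y(t) = A exp(-λt) + B on equally spaced samples uses the recurrence y_{i+1} = r y_i + B(1 - r) with r = exp(-λΔt), so that two ordinary linear least-squares fits recover all three parameters. Whether this matches the paper's exact difference-equation formulation is an assumption; the sketch below only illustrates the idea.

```python
import numpy as np

def fit_decay_with_background(y, dt):
    """Non-iterative fit of y(t) = A*exp(-lam*t) + B on equally spaced samples.

    Uses the recurrence y[i+1] = r*y[i] + B*(1 - r) with r = exp(-lam*dt):
    a first linear least-squares fit of y[i+1] against y[i] gives r and B,
    and a second gives the amplitude A.  This is one standard linearization;
    the paper's exact difference-equation formulation may differ.
    """
    y = np.asarray(y, dtype=float)
    # Step 1: regress y[i+1] on y[i] to get slope r and intercept c = B*(1 - r)
    r, c = np.polyfit(y[:-1], y[1:], 1)
    lam = -np.log(r) / dt
    B = c / (1.0 - r)
    # Step 2: with lam and B fixed, A follows from a linear fit through the origin
    t = dt * np.arange(y.size)
    basis = np.exp(-lam * t)
    A = np.dot(basis, y - B) / np.dot(basis, basis)
    return A, lam, B
```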

  19. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    Science.gov (United States)

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

    Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot], that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reduction of the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other more traditional methods to address this problem are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, the ID method generated more similar results at each concentration of the ELISA standard curve to that using the standard lot 1 than the HP method, as analyzed by the Python-based ELISA_QC program. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.

  20. Study on the background information for the geological disposal concept

    International Nuclear Information System (INIS)

    Matsui, Kazuaki; Murano, Tohru; Hirusawa, Shigenobu; Komoto, Harumi

    1999-11-01

Japan Nuclear Cycle Development Institute (JNC) published the first R and D progress report in 1992, in which the results of the R and D works were compiled. Since then, the next step of R and D has been developing progressively in Japan. JNC now plans to publish the second R and D progress report before 2000, in which information on the geological disposal of high level radioactive waste (HLW) will be presented to show the technical reliability and technical basis, contributing to site selection and the development of safety standards. As recognized in international discussions in the 1990's on the importance of social consensus to geological disposal, understanding and consensus by society are essential to the development and realization of the geological disposal of HLW. For getting social understanding and consensus, it is quite important to present broadly based background information on the geological disposal of HLW, together with the technical basis and also the international discussion of the issues. In this report, the following studies have been done to help prepare the background information for the 2nd R and D progress report, based on recent information and on the research and assessment work of the last 2 years. These are: (1) As the general part, the characteristics of HLW disposal and several issues to be considered for establishing the measures for the disposal of HLW were identified and analyzed from both practical and logical points of view. Those issues were the concept and image of long term safety measures, the concept and criteria of geological disposal, and safety assessment and performance assessment. (2) As the specific part, questions and concerns frequently raised by non-specialists were taken up and 10 topics in relation to geological disposal were identified based on the discussion. Scientific and technical facts, consensus by the specialists on the issues, and international

  1. Information Retrieval Methods in Libraries and Information Centers ...

    African Journals Online (AJOL)

The volumes of information created, generated and stored are so immense that, without adequate knowledge of information retrieval methods, the retrieval process would be cumbersome and frustrating for an information user. Studies have further revealed that information retrieval methods are essential in information centers ...

  2. Evaluation of Shifted Excitation Raman Difference Spectroscopy and Comparison to Computational Background Correction Methods Applied to Biochemical Raman Spectra.

    Science.gov (United States)

    Cordero, Eliana; Korinth, Florian; Stiebing, Clara; Krafft, Christoph; Schie, Iwan W; Popp, Jürgen

    2017-07-27

Raman spectroscopy provides label-free biochemical information from tissue samples without complicated sample preparation. The clinical capability of Raman spectroscopy has been demonstrated in a wide range of in vitro and in vivo applications. However, a challenge for in vivo applications is the simultaneous excitation of auto-fluorescence in the majority of tissues of interest, such as liver, bladder, brain, and others. Raman bands are then superimposed on a fluorescence background, which can be several orders of magnitude larger than the Raman signal. To eliminate the disturbing fluorescence background, several approaches are available. Among instrumental methods, shifted excitation Raman difference spectroscopy (SERDS) has been widely applied and studied. Similarly, computational techniques, for instance extended multiplicative scatter correction (EMSC), have also been employed to remove undesired background contributions. Here, we present a theoretical and experimental evaluation and comparison of fluorescence background removal approaches for Raman spectra based on SERDS and EMSC.

  3. A model independent safeguard against background mismodeling for statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Priel, Nadav; Landsman, Hagar; Manfredini, Alessandro; Budnik, Ranny [Department of Particle Physics and Astrophysics, Weizmann Institute of Science, Herzl St. 234, Rehovot (Israel); Rauch, Ludwig, E-mail: nadav.priel@weizmann.ac.il, E-mail: rauch@mpi-hd.mpg.de, E-mail: hagar.landsman@weizmann.ac.il, E-mail: alessandro.manfredini@weizmann.ac.il, E-mail: ran.budnik@weizmann.ac.il [Teilchen- und Astroteilchenphysik, Max-Planck-Institut für Kernphysik, Saupfercheckweg 1, 69117 Heidelberg (Germany)

    2017-05-01

    We propose a safeguard procedure for statistical inference that provides universal protection against mismodeling of the background. The method quantifies and incorporates the signal-like residuals of the background model into the likelihood function, using information available in a calibration dataset. This prevents possible false discovery claims that may arise through unknown mismodeling, and corrects the bias in limit setting created by overestimated or underestimated background. We demonstrate how the method removes the bias created by an incomplete background model using three realistic case studies.

  4. Description of background data in the SKB database GEOTAB. Version 2

    International Nuclear Information System (INIS)

    Eriksson, E.; Sehlstedt, S.

    1991-03-01

During the research and development program performed by SKB for the final disposal of spent nuclear fuel, a large quantity of geoscientific data was collected. Most of this data was stored in a database called Geotab. The data is organized into eight groups as follows: Background information; Geological data; Borehole geophysical measurements; Ground surface geophysical measurements; Hydrogeological and meteorological data; Hydrochemical data; Petrophysical measurements and Tracer tests. Except for the borehole geophysical data, ground surface geophysical data and petrophysical data, which are described in the same report, the data in each group is described in a separate SKB report. The present report describes data within the Background data group. This data provides information on the location of areas studied, borehole positions and also some drilling information. Data is normally collected on forms or as notes and is then stored in the database. The background data group, called BACKGROUND, is divided into several subgroups: BGAREA area background data; BGDRILL drilling information; BGDRILLP drill penetration data; BGHOLE borehole information; BGTABLES number of rows in a table and BGTOLR data table tolerance. A method consists of one or several data tables. In each chapter a method and its data tables are described. (authors)

  5. Detecting impact signal in mechanical fault diagnosis under chaotic and Gaussian background noise

    Science.gov (United States)

    Hu, Jinfeng; Duan, Jie; Chen, Zhuo; Li, Huiyong; Xie, Julan; Chen, Hanwen

    2018-01-01

In actual fault diagnosis, useful information is often submerged in heavy noise, and the feature information is difficult to extract. Traditional methods such as stochastic resonance (SR), which use noise to enhance weak signals instead of suppressing noise, fail against a chaotic background. Neural-network methods, which use a reference sequence to estimate and reconstruct the background noise, fail in white Gaussian noise. To solve these problems, a novel weak signal detection method aimed at the problem of detecting an impact signal buried under heavy chaotic and Gaussian background noise is proposed. First, the proposed method obtains a virtual reference sequence by constructing the Hankel data matrix. Then an M-order optimal FIR filter is designed, which minimizes the output power of the background noise while passing the weak periodic signal undistorted. Finally, detection and reconstruction of the weak periodic signal are achieved from the output SBNR (signal to background noise ratio). The simulations show that, compared with the stochastic resonance (SR) method, the proposed method can detect the weak periodic signal in a chaotic noise background while the stochastic resonance (SR) method cannot. Compared with the neural network method, (a) the proposed method does not need a reference sequence while the neural network method needs one; (b) the proposed method can detect the weak periodic signal in a white Gaussian noise background while the neural network method fails, and in a chaotic noise background the proposed method can detect the weak periodic signal at a lower SBNR (about 8-17 dB lower) than the neural network method; (c) the proposed method can reconstruct the weak periodic signal precisely.
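The filter described above minimizes output noise power while passing the periodic signal undistorted, which is the classic minimum-power distortionless-response construction w = R^{-1}s / (s^T R^{-1} s). The sketch below estimates R from Hankel-style snapshots of the data; treating this generic construction as equivalent to the authors' design is an assumption.

```python
import numpy as np

def min_power_fir(x, template, order=32, ridge=1e-3):
    """Sketch of an order-`order` FIR filter that minimises output power
    while passing a known signal template undistorted (w = R^-1 s / (s^T R^-1 s)).

    x:        observed sequence (signal buried in background noise)
    template: expected signal shape, at least `order` samples long
    The covariance R is estimated from overlapping length-`order` snapshots
    (a Hankel-style data matrix).  This is a generic minimum-power
    distortionless-response construction offered as an illustration of the
    constraint described in the abstract, not the authors' exact design.
    """
    x = np.asarray(x, dtype=float)
    snapshots = np.lib.stride_tricks.sliding_window_view(x, order)   # Hankel rows
    R = snapshots.T @ snapshots / snapshots.shape[0]
    R += ridge * np.trace(R) / order * np.eye(order)                 # regularise
    s = np.asarray(template, dtype=float)[:order]
    Rinv_s = np.linalg.solve(R, s)
    w = Rinv_s / (s @ Rinv_s)
    filtered = np.convolve(x, w[::-1], mode='valid')                 # apply the FIR filter
    return w, filtered
```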

  6. Increasing Power by Sharing Information from Genetic Background and Treatment in Clustering of Gene Expression Time Series

    Directory of Open Access Journals (Sweden)

    Sura Zaki Alrashid

    2018-02-01

Clustering of gene expression time series gives insight into which genes may be co-regulated, allowing us to discern the activity of pathways in a given microarray experiment. Of particular interest is how a given group of genes varies under different conditions or genetic backgrounds. This paper develops a new clustering method that allows each cluster to be parameterised according to whether the behaviour of the genes across conditions is correlated or anti-correlated. By specifying correlations between such genes, more information is gained within the cluster about how the genes interrelate. Amyotrophic lateral sclerosis (ALS) is an irreversible neurodegenerative disorder that kills the motor neurons and results in death within 2 to 3 years from symptom onset. The speed of progression for different patients is heterogeneous, with significant variability. SOD1G93A transgenic mice from different backgrounds (129Sv and C57) showed consistent phenotypic differences in disease progression. A hierarchy of Gaussian processes is used to model condition-specific and gene-specific temporal co-variances. This study demonstrates the identification of significant gene expression profiles and clusters of associated or co-regulated gene expression from four groups of data (SOD1G93A and Ntg from 129Sv and C57 backgrounds). Our study shows the effectiveness of sharing information between replicates and different model conditions when modelling gene expression time series. Further gene enrichment score analysis and ontology pathway analysis of specified clusters for a particular group may lead toward identifying features underlying the differential speed of disease progression.

  7. Selective Exposure to and Acquisition of Information from Educational Television Programs as a Function of Appeal and Tempo of Background Music.

    Science.gov (United States)

    Wakshlag, Jacob J.; And Others

    1982-01-01

    The effect of educational television background music on selective exposure and information acquisition was studied. Background music of slow tempo, regardless of its appeal, had negligible effects on attention and information acquisition. Rhythmic, fast-tempo background music, especially when appealing, significantly reduced visual attention to…

  8. Interactive QR code beautification with full background image embedding

    Science.gov (United States)

    Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo

    2017-06-01

    QR (Quick Response) code is a kind of two-dimensional barcode that was first developed in the automotive industry. Nowadays, QR codes are widely used in commercial applications such as product promotion, mobile payment and product information management. Traditional QR codes that follow the international standard are reliable and fast to decode, but they lack the aesthetic appearance needed to present visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR codes. Given the information to be encoded and an image to be used as the full QR code background, our method accepts interactive user strokes as hints to remove undesired parts of the QR code modules, relying on the QR code error correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and thus achieves a more pleasing result while keeping high machine readability.

  9. Looking for Cosmic Neutrino Background

    Directory of Open Access Journals (Sweden)

    Chiaki eYanagisawa

    2014-06-01

    Full Text Available Since the discovery of neutrino oscillation in atmospheric neutrinos by the Super-Kamiokande experiment in 1998, the study of neutrinos has been one of the most exciting fields in high-energy physics. All the mixing angles have been measured. Quests for (1) measurements of the remaining parameters - the lightest neutrino mass, the CP violating phase(s), and the sign of the mass splitting between the mass eigenstates m3 and m1 - and (2) better measurements to determine whether the mixing angle theta23 is less than pi/4, are in progress in a well-controlled manner. Determining the nature of neutrinos, whether they are Dirac or Majorana particles, is also in progress with continuous improvement. On the other hand, although ideas for detecting the cosmic neutrino background have been discussed since the 1960s, there has not been a serious concerted effort to achieve this goal. One of the reasons is that it is extremely difficult to detect such low energy neutrinos from the Big Bang. While there has been a tremendous accumulation of information on the Cosmic Microwave Background since its discovery in 1965, there is no direct evidence for the Cosmic Neutrino Background. The importance of detecting the Cosmic Neutrino Background is that, while detailed studies of Big Bang Nucleosynthesis and the Cosmic Microwave Background give information on the early Universe at ~a few minutes old and ~300,000 years old, respectively, observation of the Cosmic Neutrino Background allows us to study the early Universe at ~1 sec old. This article reviews progress made in the past 50 years on detection methods for the Cosmic Neutrino Background.

  10. Background suppression of infrared small target image based on inter-frame registration

    Science.gov (United States)

    Ye, Xiubo; Xue, Bindang

    2018-04-01

    We propose a multi-frame background suppression method for remote infrared small target detection. Inter-frame information is necessary when heavy background clutter makes it difficult to distinguish real targets from false alarms. A registration procedure based on point matching in image patches is used to compensate for the local deformation of the background. The target can then be separated by background subtraction. Experiments show that our method serves as an effective preliminary step for target detection.

  11. Comparison of presbyopic additions determined by the fused cross-cylinder method using alternative target background colours.

    Science.gov (United States)

    Wee, Sung-Hyun; Yu, Dong-Sik; Moon, Byeong-Yeon; Cho, Hyun Gug

    2010-11-01

    To compare and contrast standard and alternative versions of refractor head (phoropter)-based charts used to determine reading addition. Forty-one presbyopic subjects aged between 42 and 60 years were tested. Tentative additions were determined using a red-green background letter chart and 4 cross-grid charts (with white, red, green, or red-green backgrounds) which were used with the fused cross-cylinder (FCC) method. The final addition for a 40 cm working distance was determined for each subject by subjectively adjusting the tentative additions. There were significant differences in the tentative additions obtained using the 5 methods (repeated measures ANOVA, p FCC method. There were no significant differences between the tentative and final additions for the green background in the FCC method (p > 0.05). The intervals of the 95% limits of agreement were under ±0.50 D, and the narrowest interval (±0.26 D) was for the red-green background. The 3 FCC methods with a white, green, or red-green background provided a tentative addition close to the final addition. Compared with the other methods, the FCC method with the red-green background had a narrow range of error. Further, since this method combines the functions of both the fused cross-cylinder test and the duochrome test, it can be a useful technique for determining presbyopic additions. © 2010 The Authors. Ophthalmic and Physiological Optics © 2010 The College of Optometrists.

  12. Subspace-based optimization method for inverse scattering problems with an inhomogeneous background medium

    International Nuclear Information System (INIS)

    Chen, Xudong

    2010-01-01

    This paper proposes a version of the subspace-based optimization method to solve the inverse scattering problem with an inhomogeneous background medium where the known inhomogeneities are bounded in a finite domain. Although the background Green's function at each discrete point in the computational domain is not directly available in an inhomogeneous background scenario, the paper uses the finite element method to simultaneously obtain the Green's function at all discrete points. The essence of the subspace-based optimization method is that part of the contrast source is determined from the spectrum analysis without using any optimization, whereas the orthogonally complementary part is determined by solving a lower dimension optimization problem. This feature significantly speeds up the convergence of the algorithm and at the same time makes it robust against noise. Numerical simulations illustrate the efficacy of the proposed algorithm. The algorithm presented in this paper finds wide applications in nondestructive evaluation, such as through-wall imaging

  13. Measuring method to impulse neutron scattering background in complicated ambient condition

    International Nuclear Information System (INIS)

    Tang Zhangkui; Peng Taiping; Tang Zhengyuan; Liu Hangang; Hu Mengchun; Fan Juan

    2004-01-01

    This paper introduces a measuring method and a calculation formula for the impulse neutron scattering background in complicated ambient conditions. The experiment was carried out in the laboratory, and the factors affecting the measurement conclusions are analysed. (authors)

  14. Natural background radioactivity of the earth's surface -- essential information for environmental impact studies

    International Nuclear Information System (INIS)

    Tauchid, M.; Grasty, R.L.

    2002-01-01

    An environmental impact study is basically a study of change. This change is compared to the pre-existing conditions, which are usually perceived to be the original or 'pristine' state. Unfortunately, reliable information on this so-called pristine state is far from adequate. One of the essential parts of this information is a good knowledge of the earth's chemical make-up, or its geochemistry. Presently available data on the geochemistry of the earth's surface, including those related to radioactive elements, are incomplete and inconsistent. The main reason why a number of regulations are judged to be too strict and disproportionate to the risks that might be caused by some human activities is the lack of reliable information on the natural global geochemical background on which environmental regulations should be based. The main objective of this paper is to present a view on the need for complete baseline information on the earth's surface environment and in particular its geochemical character. It is only through the availability of complete information, including reliable baseline information on natural radioactivity, that an appropriate study of the potential effect of the various naturally occurring elements on human health can be carried out. Presented here are a number of examples where the natural radioactivity of an entire country has been mapped, or mapping is in progress, together with descriptions of how these undertakings were accomplished. There is a general misconception that elevated radioactivity can be found only around uranium mines, nuclear power reactors and similar nuclear installations. As can be seen from some of these maps, the natural background radioactivity of the earth's surface closely reflects the underlying geological formations and their alteration products. In reality, at properly regulated and managed facilities, the levels of radioactivity associated with many of these facilities are generally quite low relative to those associated with

  15. A study on the method for cancelling the background noise of the impact signal

    International Nuclear Information System (INIS)

    Kim, J. S.; Ham, C. S.; Park, J. H.

    1998-01-01

    In this paper, we compare a noise canceller (a time-domain analysis method) with spectral subtraction (a frequency-domain analysis method) for cancelling background noise when the Loose Part Monitoring System's (LPMS) accelerometers record the noise signal combined with the impact signal. During operation of a nuclear power plant, alarm triggering can occur due to a peak signal in the background noise or an amplitude increase caused by component operation such as control rod movement or abrupt pump operation. Such operation produces background noise in the LPMS, which is recorded together with the impact signal. If the noise amplitude is very large compared with that of the impact signal, the impact position and mass cannot be estimated. We analysed the two methods for cancelling background noise: first, we evaluated the signal-to-noise ratio using the noise canceller; second, we evaluated the signal-to-noise ratio using spectral subtraction. The evaluation showed that the noise canceller is superior to spectral subtraction in terms of signal-to-noise ratio.
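
    As context for the frequency-domain approach compared above, the following is a minimal, generic spectral-subtraction sketch in Python/NumPy. It is not the LPMS implementation from the paper; the frame length, hop size and the use of a separate noise-only reference segment are assumptions made for illustration.

```python
import numpy as np

def spectral_subtraction(signal, noise_ref, frame=256, hop=128):
    """Subtract an average noise magnitude spectrum (estimated from a noise-only
    reference) from each frame of the signal, keeping the noisy phase."""
    win = np.hanning(frame)
    noise_mag = np.mean([np.abs(np.fft.rfft(noise_ref[i:i + frame] * win))
                         for i in range(0, len(noise_ref) - frame, hop)], axis=0)
    out = np.zeros(len(signal))
    for i in range(0, len(signal) - frame, hop):
        spec = np.fft.rfft(signal[i:i + frame] * win)
        mag = np.maximum(np.abs(spec) - noise_mag, 0.0)       # half-wave rectification
        out[i:i + frame] += np.fft.irfft(mag * np.exp(1j * np.angle(spec)), frame)
    return out
```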

  16. Background information for the Leaching environmental Assessment Framework (LEAF) test methods

    Science.gov (United States)

    The U.S. Environmental Protection Agency Office of Resource Conservation and Recovery has initiated the review and validation process for four leaching tests under consideration for inclusion into SW-846: Method 1313 "Liquid-Solid Partitioning as a Function of Extract pH for Co...

  17. The Cryogenic Dark Matter Search and Background Rejection with Event Position Information

    International Nuclear Information System (INIS)

    Wang, Gen-sheng

    2005-01-01

    Evidence from observational cosmology and astrophysics indicates that about one third of the universe is matter, but that the known baryonic matter contributes only about 4% of the universe. A large fraction of the universe is cold, non-baryonic matter, which has an important role in the formation of structure in the universe and its evolution. The leading candidate for the non-baryonic dark matter is Weakly Interacting Massive Particles (WIMPs), which occur naturally in the supersymmetry theory of particle physics. The Cryogenic Dark Matter Search (CDMS) experiment is searching for evidence of a WIMP interaction off an atomic nucleus in crystals of Ge and Si by measuring simultaneously the phonon energy and ionization energy of the interaction in the CDMS detectors. The WIMP interaction energy is from a few keV to tens of keV, with a rate of less than 0.1 events/kg/day. To reach the goal of WIMP detection, the CDMS experiment has been conducted in the Soudan mine with an active muon veto and multistage passive background shields. The CDMS detectors have a low energy threshold and background rejection capabilities based on ionization yield. However, betas from contamination and other radioactive sources produce surface interactions, which have low ionization yield, comparable to that of bulk nuclear interactions. The low-ionization surface electron recoils must be removed in the WIMP search data analysis. An emphasis of this thesis is on developing the method of surface-interaction rejection using location information of the interactions, phonon energy distributions and phonon timing parameters. The result of the CDMS Soudan run 118 92.3 live-day WIMP search data analysis is presented, and represents the most sensitive search yet performed.

  18. The Cryogenic Dark Matter Search and Background Rejection with Event Position Information

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Gensheng [Case Western Reserve Univ., Cleveland, OH (United States). Dept. of Physics

    2005-01-01

    Evidence from observational cosmology and astrophysics indicates that about one third of the universe is matter, but that the known baryonic matter contributes only about 4% of the universe. A large fraction of the universe is cold, non-baryonic matter, which has an important role in the formation of structure in the universe and its evolution. The leading candidate for the non-baryonic dark matter is Weakly Interacting Massive Particles (WIMPs), which occur naturally in the supersymmetry theory of particle physics. The Cryogenic Dark Matter Search (CDMS) experiment is searching for evidence of a WIMP interaction off an atomic nucleus in crystals of Ge and Si by measuring simultaneously the phonon energy and ionization energy of the interaction in the CDMS detectors. The WIMP interaction energy is from a few keV to tens of keV, with a rate of less than 0.1 events/kg/day. To reach the goal of WIMP detection, the CDMS experiment has been conducted in the Soudan mine with an active muon veto and multistage passive background shields. The CDMS detectors have a low energy threshold and background rejection capabilities based on ionization yield. However, betas from contamination and other radioactive sources produce surface interactions, which have low ionization yield, comparable to that of bulk nuclear interactions. The low-ionization surface electron recoils must be removed in the WIMP search data analysis. An emphasis of this thesis is on developing the method of surface-interaction rejection using location information of the interactions, phonon energy distributions and phonon timing parameters. The result of the CDMS Soudan run 118 92.3 live-day WIMP search data analysis is presented, and represents the most sensitive search yet performed.

  19. E-Mail Writing: Providing Background Information in the Core of Computer Assisted Instruction

    Science.gov (United States)

    Nazari, Behzad; Ninknejad, Sahar

    2015-01-01

    The present study strongly supports the effective role of the teacher providing background information via e-mail in developing the students' e-mail writing ability. A total of 50 advanced male EFL students aged between 25 and 40 at different branches of the Iran Language Institute in Tehran took part. Through the placement test of…

  20. Spectral-ratio radon background correction method in airborne γ-ray spectrometry based on compton scattering deduction

    International Nuclear Information System (INIS)

    Gu Yi; Xiong Shengqing; Zhou Jianxin; Fan Zhengguo; Ge Liangquan

    2014-01-01

    γ-rays released by radon daughters have a severe impact on airborne γ-ray spectrometry. The spectral-ratio method is one of the best mathematical methods for radon background deduction in airborne γ-ray spectrometry. In this paper, an advanced spectral-ratio method is proposed which deducts the Compton scattering rays by the fast Fourier transform rather than by stripping ratios. The relationship between survey height and the correction coefficient of the advanced spectral-ratio radon background correction method was studied, the advanced spectral-ratio radon background correction mathematical model was established, and a ground saturation model calibration technique for the correction coefficient was proposed. With the advanced spectral-ratio method, applicability and correction efficiency are improved and the application cost is reduced. Furthermore, it prevents the loss of physical meaning and avoids the possible errors caused by matrix computation and mathematical fitting based on spectrum shape, which are applied in the traditional correction coefficient. (authors)

  1. Background risk information to assist in risk management decision making

    International Nuclear Information System (INIS)

    Hammonds, J.S.; Hoffman, F.O.; White, R.K.; Miller, D.B.

    1992-10-01

    The evaluation of the need for remedial activities at hazardous waste sites requires quantification of the risks of adverse health effects to humans and the ecosystem resulting from the presence of chemical and radioactive substances at these sites. The health risks from exposure to these substances are in addition to the risks encountered because of the virtually unavoidable exposure to naturally occurring chemicals and radioactive materials that are present in air, water, soil, building materials, and food products. To provide a frame of reference for interpreting risks quantified for hazardous waste sites, it is useful to identify the relative magnitude of risks of both a voluntary and involuntary nature that are ubiquitous throughout east Tennessee. In addition to discussing risks from the ubiquitous presence of background carcinogens in the east Tennessee environment, this report also presents risks resulting from common, everyday activities. Such information should not be used to discount or trivialize risks from hazardous waste contamination, but rather to create a sensitivity to general risk issues, thus providing a context for better interpretation of risk information.

  2. Background information document to support NESHAPS rulemaking on nuclear power reactors. Draft report

    International Nuclear Information System (INIS)

    Colli, A.; Conklin, C.; Hoffmeyer, D.

    1991-08-01

    The purpose of this Background Information Document (BID) is to present information relevant to the Administrator of the Environmental Protection Agency's (EPA) reconsideration of the need for a NESHAP to control radionuclides emitted to the air from commercial nuclear power reactors. The BID presents information on the relevant portions of the regulatory framework that NRC has implemented for nuclear power plant licensees, under the authority of the Atomic Energy Act, as amended, to protect the public's health and safety. To provide context, it summarizes the rulemaking history for Subpart I. It then describes NRC's regulatory program for routine atmospheric emissions of radionuclides and evaluates the doses caused by actual airborne emissions from nuclear power plants, including releases resulting from anticipated operational occurrences

  3. Background subtraction system for pulsed neutron logging of earth boreholes

    International Nuclear Information System (INIS)

    Hertzog, R.C.

    1983-01-01

    The invention provides a method for determining the characteristics of earth formations surrounding a well borehole comprising the steps of: repetitively irradiating the earth formations surrounding the well bore with relatively short duration pulses of high energy neutrons; detecting during each pulse of high energy neutrons, gamma radiation due to the inelastic scattering of neutrons by materials comprising the earth formations surrounding the borehole and providing information representative thereof; detecting immediately following each such pulse of high energy neutrons, background gamma radiation due to thermal neutron capture and providing information representative thereof; and correcting the inelastic gamma representative information to compensate for said background representative information
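
    As a rough arithmetic illustration of the correction step described in this record, one simple reading is that the capture background measured in the gate immediately after each burst is scaled to the burst gate width and subtracted from the in-burst counts. The gate-width scaling below is an assumption for illustration, not a detail taken from the patent.

```python
def net_inelastic_counts(burst_counts, burst_gate_us, late_counts, late_gate_us):
    """Subtract the capture-gamma background (measured just after the neutron burst,
    scaled to the burst gate width) from the counts recorded during the burst."""
    background_rate = late_counts / late_gate_us            # capture background per microsecond
    return burst_counts - background_rate * burst_gate_us   # background-corrected inelastic counts
```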

  4. Background field method for nonlinear σ-model in stochastic quantization

    International Nuclear Information System (INIS)

    Nakazawa, Naohito; Ennyu, Daiji

    1988-01-01

    We formulate the background field method for the nonlinear σ-model in stochastic quantization. We demonstrate a one-loop calculation for a two-dimensional nonlinear σ-model on a general Riemannian manifold based on our formulation. The formulation is consistent with the known results in ordinary quantization. As a simple application, we also analyse the multiplicative renormalization of the O(N) nonlinear σ-model. (orig.)

  5. Gravel Image Segmentation in Noisy Background Based on Partial Entropy Method

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Because of the wide variation in gray levels and particle dimensions, the presence of many small gravel objects in the background, and corruption of the image by noise, it is difficult to segment gravel objects. In this paper, we develop a partial entropy method and succeed in realizing gravel object segmentation. We give the entropy principles and their calculation methods. Moreover, we use the minimum entropy error to automatically select a threshold to segment the image, and we introduce a filtering method based on mathematical morphology. The segmentation experiments are performed using different window dimensions for a group of gravel images and demonstrate that this method has a high segmentation rate and low noise sensitivity.
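
    For orientation, the sketch below shows one standard entropy-based threshold search in Python/NumPy (Kapur's maximum-entropy criterion). It is a generic stand-in to illustrate selecting a threshold from histogram entropies; the paper's partial-entropy and minimum-entropy-error criteria are different measures and are not reproduced here.

```python
import numpy as np

def entropy_threshold(gray):
    """Pick the gray level that maximizes the summed entropies of the background and
    object parts of the histogram (Kapur's criterion)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue
        q0 = p[:t][p[:t] > 0] / p0          # normalized background histogram
        q1 = p[t:][p[t:] > 0] / p1          # normalized object histogram
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```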

  6. Conserved quantities in background independent theories

    Energy Technology Data Exchange (ETDEWEB)

    Markopoulou, Fotini [Perimeter Institute for Theoretical Physics, 35 King Street North, Waterloo, Ontario N2J 2W9 (Canada); Department of Physics, University of Waterloo, Waterloo, Ontario N2L 3G1 (Canada)

    2007-05-15

    We discuss the difficulties that background independent theories based on quantum geometry encounter in deriving general relativity as the low energy limit. We follow a geometrogenesis scenario of a phase transition from a pre-geometric theory to a geometric phase which suggests that a first step towards the low energy limit is searching for the effective collective excitations that will characterize it. Using the correspondence between the pre-geometric background independent theory and a quantum information processor, we are able to use the method of noiseless subsystems to extract such coherent collective excitations. We illustrate this in the case of locally evolving graphs.

  7. 108 Information Retrieval Methods in Libraries and Information ...

    African Journals Online (AJOL)

    User

    without adequate knowledge of information retrieval methods, the retrieval process for an ... discusses the concept of Information retrieval, the various information ..... Other advantages of automatic indexing are the maintenance of consistency.

  8. A novel method to remove GPR background noise based on the similarity of non-neighboring regions

    Science.gov (United States)

    Montiel-Zafra, V.; Canadas-Quesada, F. J.; Vera-Candeas, P.; Ruiz-Reyes, N.; Rey, J.; Martinez, J.

    2017-09-01

    Ground penetrating radar (GPR) is a non-destructive technique that has been widely used in many areas of research, such as landmine detection or subsurface anomalies, where it is required to locate targets embedded within a background medium. One of the major challenges in GPR research remains the improvement of the image quality of stone materials through the detection of true anisotropies, since most errors are caused by incorrect interpretation by users. This is complicated, however, by the interference of horizontal background noise, e.g., the air-ground interface, which reduces the high-resolution quality of radargrams. Thus, weak or deep anisotropies are often masked by this type of noise. In order to remove the background noise obtained by GPR, this work proposes a novel background removal method assuming that the horizontal noise shows repetitive two-dimensional regions along the movement of the GPR antenna. Specifically, the proposed method, based on the non-local similarity of regions over distance, computes similarities between different regions at the same depth in order to identify the most repetitive regions, using a criterion that avoids closer regions. Evaluations are performed using a set of synthetic and real GPR data. Experimental results show that the proposed method obtains promising results compared to the classic background removal techniques and the most recently published background removal methods.
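
    A simplified, trace-wise reading of the non-local idea above is sketched below in Python/NumPy: for each trace, the horizontal background is estimated as the mean of its most similar traces that lie at least a minimum number of traces away, and is then subtracted. The minimum gap and number of neighbours are illustrative assumptions; the paper works with two-dimensional regions rather than whole traces.

```python
import numpy as np

def nonlocal_background_removal(B, min_gap=32, k=5):
    """B: 2-D radargram (samples x traces). For each trace, subtract the mean of its
    k most similar non-neighboring traces (at least min_gap traces away)."""
    n_tr = B.shape[1]
    out = np.empty(B.shape, dtype=float)
    for j in range(n_tr):
        candidates = [i for i in range(n_tr) if abs(i - j) >= min_gap]
        dists = [np.linalg.norm(B[:, i] - B[:, j]) for i in candidates]
        nearest = [candidates[i] for i in np.argsort(dists)[:k]]
        out[:, j] = B[:, j] - B[:, nearest].mean(axis=1)
    return out
```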

  9. Background estimation in short-wave region during determination of total sample composition by x-ray fluorescence method

    International Nuclear Information System (INIS)

    Simakov, V.A.; Kordyukov, S.V.; Petrov, E.N.

    1988-01-01

    A method for background estimation in the short-wave spectral region during determination of total sample composition by the X-ray fluorescence method is described. Thirteen types of different rocks with considerable variations in base composition and with Zr, Nb, Th and U contents below 7×10⁻³ % were investigated. The suggested method of background accounting provides a smaller statistical error of the background estimate than a direct isolated measurement, and its determination in the short-wave region is reliable independently of the sample base. The possibilities of the suggested method are estimated for artificial mixtures conforming, in the content of the main component, to technological concentrates of niobium, zirconium and tantalum.

  10. An automated background estimation procedure for gamma ray spectra

    International Nuclear Information System (INIS)

    Tervo, R.J.; Kennett, T.J.; Prestwich, W.V.

    1983-01-01

    An objective and simple method has been developed to estimate the background continuum in Ge gamma-ray spectra. Requiring no special procedures, the method is readily automated. Based upon the inherent statistical properties of the experimental data itself, nodes that reflect background samples are located and used to produce an estimate of the continuum. A simple procedure to interpolate between nodes is reported, and a range of rather typical experimental data is presented. All information necessary to implement this technique is given, including the relevant properties of various factors involved in its development. (orig.)
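
    The node-and-interpolate idea can be illustrated with a small Python/NumPy sketch: background nodes are taken as local minima of the spectrum and the continuum is linearly interpolated between them. The windowed local-minimum rule and the linear interpolation are simplifications; the paper locates nodes from the statistical properties of the data and reports its own interpolation procedure.

```python
import numpy as np

def continuum_estimate(counts, window=10):
    """Estimate the background continuum of a spectrum by locating local minima
    (used as background nodes) and interpolating linearly between them."""
    counts = np.asarray(counts, dtype=float)
    nodes = [0]
    for i in range(window, len(counts) - window, window):
        seg = counts[i - window:i + window + 1]
        nodes.append(i - window + int(np.argmin(seg)))
    nodes.append(len(counts) - 1)
    nodes = np.unique(nodes)
    return np.interp(np.arange(len(counts)), nodes, counts[nodes])
```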

  11. Comparison of selection methods to deduce natural background levels for groundwater units

    NARCIS (Netherlands)

    Griffioen, J.; Passier, H.F.; Klein, J.

    2008-01-01

    Establishment of natural background levels (NBLs) for groundwater is commonly performed to serve as a reference when assessing the contamination status of groundwater units. We compare various selection methods to establish NBLs using groundwater quality data for four hydrogeologically different areas

  12. Italian: Area Background Information.

    Science.gov (United States)

    Defense Language Inst., Washington, DC.

    This booklet has been assembled in order to provide students of Italian with a compact source of cultural information on their target area. Chapters include discussion of: (1) introduction to Italian; (2) origins of the Italian population; (3) geography; (4) history including the Roman Era, the Middle Ages, the Renaissance, the "Risorgimento," and…

  13. Durability 2007. Injection grout investigations. Background description

    International Nuclear Information System (INIS)

    Orantie, K.; Kuosa, H.

    2008-12-01

    The aim of this project was to evaluate the durability risks of injection grouts. The investigations were done with respect to the application conditions, materials and service life requirements at the ONKALO underground research facility. The study encompassed injection grout mixtures made of ultrafine cement with and without silica fume. Some of the mixtures had a low pH and thus a high silica fume content. The project includes a background description of the durability literature, a laboratory testing programme, a detailed analysis of results and recommendations for selecting ideal grout mixtures. The background description was made for the experimental study of low-pH and reference rock injection grouts as regards pore and microstructure, strength, shrinkage/swelling and thus versatile durability properties. A summary of test methods is presented, as well as examples, i.e. literature information or former test results, of the expected range of results from the tests. Background information is also given about how the test results correlate with other material properties and mix designs. In addition, the report provides basic information on the pore structure of cement-based materials, and the correlation between the pore structure of cement-based materials and permeability is briefly discussed. The test methods included in the background description are compressive strength, measurement of bulk drying, autogenous and chemical shrinkage and swelling, hydraulic conductivity/permeability, the capillary water uptake test, mercury intrusion porosimetry (MIP) and thin section analysis. Three main mixtures with water-binder ratios of 0.8, 1.0 and 1.4 and silica fume contents of 0, 15 and 40% were studied in the laboratory. In addition, two extra mixtures were studied to provide additional information about the effect of varying water-dry-material ratio and silica fume content on durability. The evaluation of water tightness based on the water permeability coefficient and micro cracking was

  14. Universal field matching in craniospinal irradiation by a background-dose gradient-optimized method.

    Science.gov (United States)

    Traneus, Erik; Bizzocchi, Nicola; Fellin, Francesco; Rombi, Barbara; Farace, Paolo

    2018-01-01

    The gradient-optimized methods are superseding the traditional feathering methods for planning field junctions in craniospinal irradiation. In this note, a new gradient-optimized technique, based on the use of a background dose, is described. Treatment planning was performed with RayStation (RaySearch Laboratories, Stockholm, Sweden) on the CT scans of a pediatric patient. Both proton (pencil beam scanning) and photon (volumetric modulated arc therapy) treatments were planned with three isocenters. An 'in silico' ideal background dose was created first to cover the upper-spinal target and to produce a perfect dose gradient along the upper and lower junction regions. Using it as background, the cranial and lower-spinal beams were planned by inverse optimization to obtain dose coverage of their relevant targets and of the junction volumes. Finally, the upper-spinal beam was inversely planned after removal of the background dose and with the previously optimized beams switched on. In both proton and photon plans, the optimized cranial and lower-spinal beams produced a perfect linear gradient in the junction regions, complementary to that produced by the optimized upper-spinal beam. The final dose distributions showed homogeneous coverage of the targets. Our simple technique allowed us to obtain high-quality gradients in the junction region. The technique works universally for photons as well as protons and could be applied in any TPS that allows a background dose to be managed. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  15. Removal of stored particle background via the electric dipole method in the KATRIN main spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hilk, Daniel [Institut fuer Experimentelle Kernphysik, KIT, Karlsruhe (Germany); Collaboration: KATRIN-Collaboration

    2016-07-01

    The goal of the KArlsruhe TRItium Neutrino (KATRIN) experiment is to determine the effective mass of the electron anti-neutrino by measuring the electron energy spectrum of tritium beta decay near the endpoint. The aim is to reach a sensitivity on the neutrino mass of 200 meV, for which a low background level of 10⁻² counts per second is mandatory. Electrons from single radioactive decays of radon and tritium in the KATRIN main spectrometer with energies in the keV range can be magnetically stored for hours. While cooling down via ionization of residual gas molecules, they produce hundreds of secondary electrons, which can reach the detector and contribute to the background signal. In order to suppress this background component, several methods of removing stored electrons are investigated, such as the application of an electric dipole field and the application of magnetic pulses. This talk introduces the mechanism of background production due to stored electrons and their removal by the electric dipole method in the main spectrometer. In the context of the spectrometer and detector commissioning phase in summer 2015, measurement results from the application of the electric dipole method are presented.

  16. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations based on methods for processing symbolic information and information contained in training samples (ranking of objectives by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automation of processing textual information, programming, and pattern recognition.

  17. Methods of Organizational Information Security

    Science.gov (United States)

    Martins, José; Dos Santos, Henrique

    The principal objective of this article is to present a literature review of the methods used for information security at the organizational level. Some of the principal problems are identified, and a first group of relevant dimensions is presented for efficient management of information security. The study is based on a literature review using some of the most relevant certified articles on this theme, international reports and the principal norms for the management of information security. From these readings, we identified methods oriented towards risk management, certification norms and good practices of information security. Some norms are oriented towards certification of the product or system, and others towards the business processes. There are also studies proposing frameworks that suggest the integration of different approaches founded on norms focused on technologies and processes, taking into consideration the organizational and human environment of the organizations. In our view, the biggest contribution to the security of information is the development of an information security method for an organization in a conflicting environment. This should provide security of information against the possible dimensions of attack that threats could exploit through the vulnerabilities of the organizational assets. This method should support the new concepts of "Network centric warfare", "Information superiority" and "Information warfare" especially developed in this last decade, where information is seen simultaneously as a weapon and as a target.

  18. A monitored retrievable storage facility: Technical background information

    International Nuclear Information System (INIS)

    1991-07-01

    The US government is seeking a site for a monitored retrievable storage facility (MRS). Employing proven technologies used in this country and abroad, the MRS will be an integral part of the federal system for safe and permanent disposal of the nation's high-level radioactive wastes. The MRS will accept shipments of spent fuel from commercial nuclear power plants, temporarily store the spent fuel above ground, and stage shipments of it to a geologic repository for permanent disposal. The law authorizing the MRS provides an opportunity for a state or an Indian tribe to volunteer to host the MRS. The law establishes the Office of the Nuclear Waste Negotiator, who is to seek a state or an Indian tribe willing to host an MRS at a technically-qualified site on reasonable terms, and is to negotiate a proposed agreement specifying the terms and conditions under which the MRS would be developed and operated at that site. This agreement can ensure that the MRS is acceptable to -- and benefits -- the host community. The proposed agreement must be submitted to Congress and enacted into law to become effective. This technical background information presents an overview of various aspects of a monitored retrievable storage facility, including the process by which it will be developed

  19. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    Science.gov (United States)

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

    To strengthen the scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to ensure the standardization of quality assurance and quality control methods for background CO2 sampling and analysis. Based on the greenhouse gas sampling and observation experience of the CMA, and using portable sampling observation and the WS-CRDS analysis technique as an example, the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass bottle quality assurance measures, the systematic quality control method during sample analysis, the correction method during data processing, as well as the data grading quality markers and the data fitting and interpolation method, are systematically introduced. Finally, using this method, the CO2 sampling and observation data at the atmospheric background stations in 3 typical regions were processed and the concentration variation characteristics were analyzed, indicating that the method can well capture the influences of regional and local environmental factors on the observation results, and reflect the characteristics of natural and human activities in an objective and accurate way.

  20. Background information to the installers guide for small scale mains connected PV

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    This report contains background information used by BRE, EA Technology, Halcrows and Sundog when compiling guidance for the UK's New and Renewable Energy Programme on the installation of small-scale photovoltaics (PV) in buildings. The report considers: relevant standards; general safety issues; fire and safety issues, including the fire resistance of PV modules; PV module ratings such as maximum voltage and maximum current; DC cabling; the DC disconnect; the DC junction box; fault analysis; general and AC side earthing; DC earthing; lightning and surge suppression; inverters; AC modules; AC systems; getting a connection; mounting options; and installation issues.

  1. Comparison of two interpolative background subtraction methods using phantom and clinical data

    International Nuclear Information System (INIS)

    Houston, A.S.; Sampson, W.F.D.

    1989-01-01

    Two interpolative background subtraction methods used in scintigraphy are tested using both phantom and clinical data. Cauchy integral subtraction was found to be relatively free of artefacts but required more computing time than bilinear interpolation. Both methods may be used with reasonable confidence for the quantification of relative measurements such as left ventricular ejection fraction and myocardial perfusion index but should be avoided if at all possible in the quantification of absolute measurements such as glomerular filtration rate. (author)
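
    As an illustration of the interpolative idea, the sketch below gives a rough bilinear-style background estimate for a rectangular region of interest in Python/NumPy, interpolating from the counts on the region's edges. This is a schematic stand-in only; the clinical implementations compared in the abstract define their interpolation from the ROI boundary in their own, more careful ways.

```python
import numpy as np

def bilinear_background(roi):
    """Estimate the background inside a rectangular ROI by blending linear
    interpolations between its top/bottom and left/right edge counts."""
    h, w = roi.shape
    y = np.linspace(0.0, 1.0, h)[:, None]
    x = np.linspace(0.0, 1.0, w)[None, :]
    vert = (1 - y) * roi[0, :][None, :] + y * roi[-1, :][None, :]
    horiz = (1 - x) * roi[:, 0][:, None] + x * roi[:, -1][:, None]
    return 0.5 * (vert + horiz)
```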

  2. Information System Quality Assessment Methods

    OpenAIRE

    Korn, Alexandra

    2014-01-01

    This thesis explores the challenging topic of information system quality assessment, mainly process assessment. In this work the term Information System Quality is defined, and different approaches to defining quality for different domains of information systems are outlined. The main methods of process assessment are reviewed and their relationships are described. Process assessment methods are divided into two categories: ISO standards and best practices. The main objective of this w...

  3. Methods for communicating technical information as public information

    International Nuclear Information System (INIS)

    Zara, S.A.

    1987-01-01

    Many challenges face the nuclear industry, especially in the waste management area. One of the biggest challenges is effective communication with the general public. Technical complexity, combined with the public's lack of knowledge and negative emotional response, complicate clear communication of radioactive waste management issues. The purpose of this session is to present and discuss methods for overcoming these obstacles and effectively transmitting technical information as public information. The methods presented encompass audio, visual, and print approaches to message transmission. To support these methods, the author also discusses techniques, based on current research, for improving the communication process

  4. Chemical Source Localization Fusing Concentration Information in the Presence of Chemical Background Noise.

    Science.gov (United States)

    Pomareda, Víctor; Magrans, Rudys; Jiménez-Soto, Juan M; Martínez, Dani; Tresánchez, Marcel; Burgués, Javier; Palacín, Jordi; Marco, Santiago

    2017-04-20

    We present the estimation of a likelihood map for the location of the source of a chemical plume dispersed under atmospheric turbulence and uniform wind conditions. The main contribution of this work is to extend previous proposals based on Bayesian inference with binary detections to the use of concentration information, while at the same time being robust against the presence of background chemical noise. For that, the algorithm builds a background model with robust statistics to assess the posterior probability that a given chemical concentration reading comes from the background or from a source emitting at a distance with a specific release rate. In addition, our algorithm allows multiple mobile gas sensors to be used. Ten realistic simulations and ten real data experiments are used for evaluation purposes. For the simulations, we have supposed that the sensors are mounted on cars which do not have navigating toward the source among their main tasks. To collect the real dataset, a special arena with induced wind was built, and an autonomous vehicle equipped with several sensors, including a photo ionization detector (PID) for sensing chemical concentration, was used. Simulation results show that our algorithm provides a better estimate of the source location even for low background levels that favor the performance of the binary version. The improvement is clear for the synthetic data, while for the real data the estimate is only slightly better, probably because our exploration arena is not able to provide uniform wind conditions. Finally, an estimate of the computational cost of the algorithmic proposal is presented.
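
    The background-model step can be sketched very compactly: robust statistics (median and MAD) of past readings define a background distribution, and each new reading gets a score for how plausibly it is background alone. The Gaussian tail used below is an illustrative assumption; the paper's full posterior also involves the plume dispersion and release-rate model, which is not reproduced here.

```python
import numpy as np
from math import erfc, sqrt

def prob_background(history, reading):
    """Score (0..1) for how plausibly a new concentration reading is explained by the
    chemical background alone, using robust median/MAD statistics of past readings."""
    med = np.median(history)
    sigma = 1.4826 * np.median(np.abs(np.asarray(history) - med)) + 1e-12  # robust std
    z = (reading - med) / sigma
    return 0.5 * erfc(z / sqrt(2.0))   # one-sided normal tail probability
```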

  5. A probabilistic cell model in background corrected image sequences for single cell analysis

    Directory of Open Access Journals (Sweden)

    Fieguth Paul

    2010-10-01

    Full Text Available Abstract Background: Methods of manual cell localization and outlining are so onerous that automated tracking methods would seem mandatory for handling huge image sequences; nevertheless manual tracking is, astonishingly, still widely practiced in areas such as cell biology which are outside the influence of most image processing research. The goal of our research is to address this gap by developing automated methods of cell tracking, localization, and segmentation. Since even an optimal frame-to-frame association method cannot compensate and recover from poor detection, it is clear that the quality of cell tracking depends on the quality of cell detection within each frame. Methods: Cell detection performs poorly where the background is not uniform and includes temporal illumination variations, spatial non-uniformities, and stationary objects such as well boundaries (which confine the cells under study). To improve cell detection, the signal to noise ratio of the input image can be increased via accurate background estimation. In this paper we investigate background estimation for the purpose of cell detection. We propose a cell model and a method for background estimation, driven by the proposed cell model, such that well structure can be identified, and explicitly rejected, when estimating the background. Results: The resulting background-removed images have fewer artifacts and allow cells to be localized and detected more reliably. The experimental results generated by applying the proposed method to different Hematopoietic Stem Cell (HSC) image sequences are quite promising. Conclusion: The understanding of cell behavior relies on precise information about the temporal dynamics and spatial distribution of cells. Such information may play a key role in disease research and regenerative medicine, so automated methods for observation and measurement of cells from microscopic images are in high demand. The proposed method in this paper is capable

  6. Statistical methods for determination of background levels for naturally occurring radionuclides in soil at a RCRA facility

    International Nuclear Information System (INIS)

    Guha, S.; Taylor, J.H.

    1996-01-01

    It is critical that summary statistics on background data, or background levels, be computed based on standardized and defensible statistical methods, because background levels are frequently used in subsequent analyses and comparisons performed by separate analysts over time. The final background for naturally occurring radionuclide concentrations in soil at a RCRA facility, and the associated statistical methods used to estimate these concentrations, are presented. The primary objective is to describe, via a case study, the statistical methods used to estimate 95% upper tolerance limits (UTLs) on radionuclide background soil data sets. A 95% UTL on background samples can be used as a screening level concentration in the absence of definitive soil cleanup criteria for naturally occurring radionuclides. The statistical methods are based exclusively on EPA guidance. This paper includes an introduction, a discussion of the analytical results for the radionuclides, and a detailed description of the statistical analyses leading to the determination of 95% UTLs. Soil concentrations reported are based on validated data. Data sets are categorized as surficial soil (samples collected at depths from zero to one-half foot) and deep soil (samples collected from 3 to 5 feet). These data sets were tested for statistical outliers, and underlying distributions were determined using the chi-squared goodness-of-fit test. UTLs for the data sets were then computed based on the percentage of non-detects and the appropriate best-fit distribution (lognormal, normal, or non-parametric). For data sets containing greater than approximately 50% non-detects, non-parametric UTLs were computed.
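
    For a data set that passes a normality test, a 95% UTL with 95% coverage is commonly computed as mean + k·s, with k taken from the non-central t distribution. The sketch below (Python with SciPy) shows that textbook calculation; it is a generic illustration, not the report's exact procedure, which also covers lognormal fits, outlier tests and non-detect handling.

```python
import numpy as np
from scipy import stats

def utl_95_95(data):
    """95% upper tolerance limit with 95% coverage under a normal assumption:
    UTL = mean + k*s, with the one-sided tolerance factor k from the non-central t.
    For a lognormal fit, apply this to log(data) and exponentiate the result."""
    x = np.asarray(data, dtype=float)
    n, mean, s = len(x), x.mean(), x.std(ddof=1)
    zp = stats.norm.ppf(0.95)                                   # 95th population percentile
    k = stats.nct.ppf(0.95, df=n - 1, nc=zp * np.sqrt(n)) / np.sqrt(n)
    return mean + k * s
```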

  7. Zambia Country Background Report

    DEFF Research Database (Denmark)

    Hampwaye, Godfrey; Jeppesen, Søren; Kragelund, Peter

    This paper provides background data and general information for the Zambia studies focusing on the local food processing sub-sector and on the local suppliers to the mines, as part of the SAFIC project (Successful African Firms and Institutional Change).

  8. Experiences with nutrition-related information during antenatal care of pregnant women of different ethnic backgrounds residing in the area of Oslo, Norway.

    Science.gov (United States)

    Garnweidner, Lisa M; Sverre Pettersen, Kjell; Mosdøl, Annhild

    2013-12-01

    to explore experiences with nutrition-related information during routine antenatal care among women of different ethnic backgrounds. Individual interviews with seventeen participants were conducted twice during pregnancy. Data collection and analysis were inspired by an interpretative phenomenological approach. Participants were purposively recruited at eight Mother and Child Health Centres in the area of Oslo, Norway, where they received antenatal care. Participants had either immigrant backgrounds from African and Asian countries (n=12) or were ethnic Norwegian (n=5). Participants were pregnant with their first child and had a pre-pregnancy Body Mass Index above 25 kg/m(2). Participants experienced that they were provided with little nutrition-related information in antenatal care. The information was perceived as presented in very general terms and focused on food safety. Weight management and the long-term prevention of diet-related chronic diseases had hardly been discussed. Participants with immigrant backgrounds appeared to be confused about information given by the midwife which was incongruent with their original food culture. The participants were actively seeking nutrition-related information and had to navigate between various sources of information. The midwife is considered a trustworthy source of nutrition-related information. Therefore, antenatal care may have considerable potential to promote a healthy diet to pregnant women. Findings suggest that nutrition communication in antenatal care should be more tailored towards women's dietary habits and cultural background, nutritional knowledge as well as level of nutrition literacy. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Method and apparatus for determining accuracy of radiation measurements made in the presence of background radiation

    International Nuclear Information System (INIS)

    Horrocks, D.L.

    1977-01-01

    A radioactivity measuring instrument, and a method related to its use, are presented for determining the radioactivity of a sample measured in the presence of significant background radiation, and for determining an error value relating to a specific probability of accuracy of the result. Error values relating to the measurement of background radiation alone, and to the measurement of sample radiation and background radiation together, are combined to produce a true error value relating to the sample radiation alone.
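
    In the usual Poisson-counting treatment, the combination of the two error values is an addition in quadrature of the background-only and sample-plus-background rate uncertainties. The sketch below shows that textbook calculation in Python; the confidence factor k is an illustrative assumption and the patented apparatus may implement the combination differently.

```python
from math import sqrt

def net_rate_and_error(gross_counts, t_gross, bkg_counts, t_bkg, k=1.645):
    """Net count rate (sample alone) and its error band, assuming Poisson statistics
    for independent gross (sample+background) and background-only measurements."""
    net_rate = gross_counts / t_gross - bkg_counts / t_bkg
    sigma = sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)  # add in quadrature
    return net_rate, k * sigma   # e.g. k = 1.645 for a one-sided ~95% confidence level
```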

  10. Adaptive removal of background and white space from document images using seam categorization

    Science.gov (United States)

    Fillion, Claude; Fan, Zhigang; Monga, Vishal

    2011-03-01

    Document images are obtained regularly by rasterization of document content and as scans of printed documents. Resizing via background and white space removal is often desired for better consumption of these images, whether on displays or in print. While white space and background are easy to identify in images, existing methods such as naïve removal and content aware resizing (seam carving) each have limitations that can lead to undesirable artifacts, such as uneven spacing between lines of text or poor arrangement of content. An adaptive method based on image content is hence needed. In this paper we propose an adaptive method to intelligently remove white space and background content from document images. Document images are different from pictorial images in structure. They typically contain objects (text letters, pictures and graphics) separated by uniform background, which include both white paper space and other uniform color background. Pixels in uniform background regions are excellent candidates for deletion if resizing is required, as they introduce less change in document content and style, compared with deletion of object pixels. We propose a background deletion method that exploits both local and global context. The method aims to retain the document structural information and image quality.
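
    The naïve white-space removal that the adaptive method improves upon can be written in a couple of lines: rows (or columns) whose pixels are nearly uniform are flagged as background and deleted. The tolerance below is an illustrative assumption; the paper's contribution is precisely to go beyond this kind of blind deletion.

```python
import numpy as np

def removable_rows(gray, tol=2):
    """Flag near-uniform rows (white space or flat background) in a grayscale
    document image; deleting them shrinks the image without touching content."""
    gray = np.asarray(gray)
    return (gray.max(axis=1) - gray.min(axis=1)) <= tol

# Usage sketch: shrunk = img[~removable_rows(img)]
```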

  11. Background Information on Crimes against Children Study. Information Memorandum 86-20.

    Science.gov (United States)

    Haas, Shaun

    This document was prepared to assist the Wisconsin Legislative Council's Special Committee on Crimes Against Children in its study of current laws relating to crimes against children. It provides the background of the origin of the study and describes the characteristics of the Criminal Code, upon which much of the committee review will center.…

  12. Background Information for the Nevada National Security Site Integrated Sampling Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene; Marutzky, Sam

    2014-12-01

    This document describes the process followed to develop the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan). It provides the Plan’s purpose and objectives, and briefly describes the Underground Test Area (UGTA) Activity, including the conceptual model and regulatory requirements as they pertain to groundwater sampling. Background information on other NNSS groundwater monitoring programs—the Routine Radiological Environmental Monitoring Plan (RREMP) and Community Environmental Monitoring Program (CEMP)—and their integration with the Plan are presented. Descriptions of the evaluations, comments, and responses of two Sampling Plan topical committees are also included.

  13. A nonlinear inversion for the velocity background and perturbation models

    KAUST Repository

    Wu, Zedong

    2015-08-19

    Reflected waveform inversion (RWI) provides a method to reduce the nonlinearity of standard full waveform inversion (FWI) by inverting for the singly scattered wavefield obtained using an image. However, current RWI methods usually neglect diving waves, which are an important source of information for extracting the long-wavelength components of the velocity model. Thus, we propose a new optimization problem by breaking the velocity model into the background and the perturbation directly in the wave equation. In this case, the perturbed model is no longer the single-scattering model, but includes all scattering. We optimize both components simultaneously, and thus the objective function is nonlinear with respect to both the background and the perturbation. The newly introduced variable w can naturally absorb the non-smooth update of the background. Application to the Marmousi model with frequencies that start at 5 Hz shows that this method can converge to the accurate velocity starting from a linearly increasing initial velocity. Application to the SEG2014 data set demonstrates the versatility of the approach.

  14. Background information and technical basis for assessment of environmental implications of magnetic fusion energy

    International Nuclear Information System (INIS)

    Cannon, J.B.

    1983-08-01

    This report contains background information for assessing the potential environmental implications of fusion-based central electric power stations. It was developed as part of an environmental review of the Magnetic Fusion Energy Program. Transition of the program from demonstration of purely scientific feasibility (breakeven conditions) to exploration of engineering feasibility suggests that formal program environmental review under the National Environmental Policy Act is timely. This report is the principal reference upon which an environmental impact statement on magnetic fusion will be based

  15. Background velocity inversion by phase along reflection wave paths

    KAUST Repository

    Yu, Han; Guo, Bowen; Schuster, Gerard T.

    2014-01-01

    A background velocity model containing the correct low-wavenumber information is desired both for the quality of the migration image and for the success of waveform inversion. We propose to invert for the low-wavenumber part of the velocity model by minimizing the phase difference between predicted and observed reflections. The velocity update is exclusively along the reflection wavepaths and, unlike conventional FWI, not along the reflection ellipses. This allows for reconstructing the smoothly varying parts of the background velocity model. Tests with synthetic data show both the benefits and limitations of this method.

  16. Background velocity inversion by phase along reflection wave paths

    KAUST Repository

    Yu, Han

    2014-08-05

    A background velocity model containing the correct low-wavenumber information is desired for both the quality of the migration image and the success of waveform inversion. We propose to invert for the low-wavenumber part of the velocity model by minimizing the phase difference between predicted and observed reflections. The velocity update is exclusively along the reflection wavepaths and, unlike conventional FWI, not along the reflection ellipses. This allows for reconstructing the smoothly varying parts of the background velocity model. Tests with synthetic data show both the benefits and limitations of this method.

  17. A statistical background noise correction sensitive to the steadiness of background noise.

    Science.gov (United States)

    Oppenheimer, Charles H

    2016-10-01

    A statistical background noise correction is developed for removing background noise contributions from measured source levels, producing a background-noise-corrected source level. Like the standard background noise corrections of ISO 3741, ISO 3744, ISO 3745, and ISO 11201, the statistical background correction increases as the background level approaches the measured source level, decreasing the background-noise-corrected source level. Unlike the standard corrections, the statistical background correction increases with the steadiness of the background and is excluded from use when background fluctuation could be responsible for measured differences between the source and background noise levels. The statistical background noise correction has several advantages over the standard correction: (1) enveloping the true source with known confidence, (2) assuring physical source descriptions when measuring sources in fluctuating backgrounds, (3) reducing background-corrected source descriptions by 1 to 8 dB for sources in steady backgrounds, and (4) providing a means to replace standardized background correction caps that incentivize against high-precision-grade methods.
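    For context, the classical (non-statistical) correction applied by the standards cited above is an energy subtraction of the background from the combined measurement. The sketch below implements only that standard formula, not the statistical, steadiness-sensitive correction proposed in the abstract, and the numeric example values are made up:

```python
import math

def energy_subtraction_correction(l_source_plus_background_db, l_background_db):
    """Classical background correction: subtract the background energy from the
    combined source+background level, both given in dB."""
    if l_source_plus_background_db <= l_background_db:
        raise ValueError("Combined level must exceed the background level.")
    return 10.0 * math.log10(
        10.0 ** (l_source_plus_background_db / 10.0)
        - 10.0 ** (l_background_db / 10.0)
    )

# Example: 60.0 dB measured with the source on, 54.0 dB of steady background
print(round(energy_subtraction_correction(60.0, 54.0), 1))  # ~58.7 dB
```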

  18. A method for subtraction of the extrarenal 'background' in dynamic 131I-hippurate renoscintigraphy

    International Nuclear Information System (INIS)

    Mlodkowska, E.; Liniecki, J.; Surma, M.

    1979-01-01

    Using a Toshiba GC-401 gamma camera with an MDS Trinary computer, a new method was developed for subtracting the extrarenal (extracanalicular) 'background' from the count rate recorded over the kidneys after intravenous administration of ¹³¹I-hippurate. Mean subtraction factors of the 'blood' activity curve were calculated from a study of 27 patients who were given ⁵¹Cr-HSA for purposes of conventional renography with 'background' subtraction. The values of the mean subtraction factors F̄(R,L) for the right and left kidney, by which the blood count rate should be multiplied, amounted to 0.86 ± 0.12 and 0.79 ± 0.13, respectively. A comparison of the coefficients of variation of the pure renal signal when mean vs. individually determined subtraction factors were used, and the verification of the method in unilaterally nephrectomized patients, demonstrated that determination of the factors F̄(R,L) for each patient individually is not required and that sufficient precision can be obtained by using the method and factors reported in this study. (orig.)
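    A hedged sketch of the subtraction scheme described above: the background-corrected renogram is obtained by subtracting a scaled blood-pool curve from the kidney region-of-interest curve. The mean factors come from the abstract; the function name and count values are illustrative assumptions:

```python
import numpy as np

F_RIGHT, F_LEFT = 0.86, 0.79   # mean subtraction factors reported above

def pure_renal_curve(kidney_counts, blood_counts, factor):
    """Kidney ROI curve minus the scaled extrarenal ('blood') background curve."""
    kidney = np.asarray(kidney_counts, dtype=float)
    blood = np.asarray(blood_counts, dtype=float)
    return kidney - factor * blood

# Made-up count-rate curves for one kidney and the blood region of interest
right_kidney = [120, 340, 510, 470, 390]
blood_pool = [100, 150, 160, 140, 120]
print(np.round(pure_renal_curve(right_kidney, blood_pool, F_RIGHT), 1))
```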

  19. Technical background information for the environmental and safety report, Volume 4: White Oak Lake and Dam

    International Nuclear Information System (INIS)

    Oakes, T.W.; Kelly, B.A.; Ohnesorge, W.F.; Eldridge, J.S.; Bird, J.C.; Shank, K.E.; Tsakeres, F.S.

    1982-03-01

    This report has been prepared to provide background information on White Oak Lake for the Oak Ridge National Laboratory Environmental and Safety Report. The paper presents the history of White Oak Dam and Lake and describes the hydrological conditions of the White Oak Creek watershed. Past and present sediment and water data are included; pathway analyses are described in detail

  20. Technical background information for the environmental and safety report, Volume 4: White Oak Lake and Dam

    Energy Technology Data Exchange (ETDEWEB)

    Oakes, T.W.; Kelly, B.A.; Ohnesorge, W.F.; Eldridge, J.S.; Bird, J.C.; Shank, K.E.; Tsakeres, F.S.

    1982-03-01

    This report has been prepared to provide background information on White Oak Lake for the Oak Ridge National Laboratory Environmental and Safety Report. The paper presents the history of White Oak Dam and Lake and describes the hydrological conditions of the White Oak Creek watershed. Past and present sediment and water data are included; pathway analyses are described in detail.

  1. Background information on a multimedia nitrogen emission reduction strategy; Hintergrundpapier zu einer multimedialen Stickstoffemissionsminderungsstrategie

    Energy Technology Data Exchange (ETDEWEB)

    Geupel; Jering; Frey (and others)

    2009-04-15

    The background information report on a multimedia nitrogen reduction strategy covers the following chapters: 1. Introduction: the nitrogen cascade and the anthropogenic influence, the environmental impact of increased nitrogen emissions, and effects on human health. 2. Sources and balancing of anthropogenic nitrogen emissions in Germany. 3. Environmental quality targets, activity goals of environmental measures, and instruments of an integrated nitrogen reduction strategy. 4. Conclusions and perspectives. The attachments include emission sources, nitrogen release and nitrogen transport in Germany, and a catalogue of measures and instruments assessed according to the criteria of efficiency and cost-effectiveness.

  2. The spinorial method of classifying supersymmetric backgrounds

    NARCIS (Netherlands)

    Gran, U.; Gutowski, J.; Papadopoulos, G.; Roest, D.

    2006-01-01

    We review how the classification of all supersymmetric backgrounds of IIB supergravity can be reduced to the evaluation of the Killing spinor equations and their integrability conditions, which contain the field equations, on five types of spinors. This is an extension of the work [hep-th/0503046].

  3. THE COSMIC INFRARED BACKGROUND EXPERIMENT (CIBER): A SOUNDING ROCKET PAYLOAD TO STUDY THE NEAR INFRARED EXTRAGALACTIC BACKGROUND LIGHT

    Energy Technology Data Exchange (ETDEWEB)

    Zemcov, M.; Bock, J.; Hristov, V.; Levenson, L. R.; Mason, P. [Department of Physics, Mathematics, and Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States); Arai, T.; Matsumoto, T.; Matsuura, S.; Tsumura, K.; Wada, T. [Department of Space Astronomy and Astrophysics, Institute of Space and Astronautical Science (ISAS), Japan Aerospace Exploration Agency (JAXA), Sagamihara, Kanagawa 252-5210 (Japan); Battle, J. [Jet Propulsion Laboratory (JPL), National Aeronautics and Space Administration (NASA), Pasadena, CA 91109 (United States); Cooray, A. [Center for Cosmology, University of California, Irvine, Irvine, CA 92697 (United States); Keating, B.; Renbarger, T. [Department of Physics, University of California, San Diego, San Diego, CA 92093 (United States); Kim, M. G. [Department of Physics and Astronomy, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, D. H.; Nam, U. W. [Korea Astronomy and Space Science Institute (KASI), Daejeon 305-348 (Korea, Republic of); Sullivan, I. [Department of Physics, The University of Washington, Seattle, WA 98195 (United States); Suzuki, K., E-mail: zemcov@caltech.edu [Instrument Development Group of Technical Center, Nagoya University, Nagoya, Aichi 464-8602 (Japan)

    2013-08-15

    The Cosmic Infrared Background Experiment (CIBER) is a suite of four instruments designed to study the near infrared (IR) background light from above the Earth's atmosphere. The instrument package comprises two imaging telescopes designed to characterize spatial anisotropy in the extragalactic IR background caused by cosmological structure during the epoch of reionization, a low resolution spectrometer to measure the absolute spectrum of the extragalactic IR background, and a narrow band spectrometer optimized to measure the absolute brightness of the zodiacal light foreground. In this paper we describe the design and characterization of the CIBER payload. The detailed mechanical, cryogenic, and electrical design of the system is presented, including all system components common to the four instruments. We present the methods and equipment used to characterize the instruments before and after flight, and give a detailed description of CIBER's flight profile and configurations. CIBER is designed to be recoverable and has flown four times, with modifications to the payload having been informed by analysis of the first flight data. All four instruments performed to specifications during the subsequent flights, and the scientific data from these flights are currently being analyzed.

  4. 16 CFR 1404.2 - Background.

    Science.gov (United States)

    2010-01-01

    16 CFR Commercial Practices — Consumer Product Safety Commission, Consumer Product Safety Act Regulations, Cellulose Insulation, § 1404.2 Background. Based on available fire incident information, engineering analysis of the probable...

  5. Food irradiation: physical-chemical, technological and economical background and competing methods of food preservation

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1994-01-01

    The physical, chemical, technical, and economic background of food preservation by irradiation is reviewed. The radiation sources and the elements of radiation chemistry connected with their use in the food irradiation process are described. The problems of dosimetry and of ensuring dose uniformity in processed products are also discussed. Other methods of food preservation, with their weaknesses and advantages, are presented and compared with the food irradiation method.

  6. Investigation and development of the suppression methods of the {sup 42}K background in LArGe

    Energy Technology Data Exchange (ETDEWEB)

    Lubashevskiy, Alexey [Max-Planck-Institut fuer Kernphysik, Saupfercheckweg 1, D-69117 Heidelberg (Germany); Collaboration: GERDA-Collaboration

    2013-07-01

    GERDA is an ultra-low background experiment aimed at the search for neutrinoless double beta decay. The search is performed using HPGe detectors operated in liquid argon (LAr). One of the most dangerous backgrounds in GERDA is that from ⁴²K, a daughter isotope of cosmogenically produced ⁴²Ar. ⁴²K ions are drawn toward the detector by the detector's electric field. Estimation of the background contribution and development of suppression methods were performed in the low-background test facility LArGe. For this purpose, encapsulated HPGe and bare BEGe detectors were operated in 1 m³ of LAr in the LArGe setup. It is equipped with a scintillation veto, so particles that deposit part of their energy in the LAr can be detected by 9 PMTs. In order to better understand the background and to increase statistics, the LAr in LArGe was spiked with specially produced ⁴²Ar. These investigations allowed us to estimate the background contribution from ⁴²K and to demonstrate the possibility of suppressing it in future measurements in GERDA Phase II.
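    A hedged, schematic illustration of an LAr scintillation anti-coincidence veto of the kind described above: germanium events accompanied by detected argon scintillation light are rejected. The class names, field names, and the photoelectron threshold are illustrative assumptions, not GERDA/LArGe analysis parameters:

```python
from dataclasses import dataclass

LAR_VETO_THRESHOLD_PE = 1.0   # assumed photoelectron threshold for the PMT sum

@dataclass
class Event:
    ge_energy_kev: float       # energy deposited in the germanium detector
    lar_photoelectrons: float  # summed PMT signal from the liquid argon

def survives_veto(event: Event) -> bool:
    """Keep only events with no significant coincident LAr scintillation."""
    return event.lar_photoelectrons < LAR_VETO_THRESHOLD_PE

events = [Event(1525.0, 8.2), Event(2039.0, 0.3), Event(1460.8, 0.1)]
accepted = [e for e in events if survives_veto(e)]
print(f"{len(accepted)} of {len(events)} events pass the LAr veto")
```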

  7. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  8. A review of Web information seeking research: considerations of method and foci of interest

    Directory of Open Access Journals (Sweden)

    Konstantina Martzoukou

    2005-01-01

    Introduction. This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background. Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of gaining direct knowledge of behaviour. User-centred research emphasises the importance of holistic approaches, which incorporate physical, cognitive, and affective elements. Problems. Comprehensive studies are limited; many approaches are problematic and a consistent methodological framework has not been developed. Research has often failed to ensure appropriate samples that ensure both quantitative validity and qualitative consistency. Typically, observation has been based on simulated rather than real information needs, and most studies show little attempt to examine holistically different characteristics of users in the same research schema. Research also deals with various aspects of cognitive style and ability with variant definitions of expertise and different layers of user experience. Finally, the effect of social and cultural elements has not been extensively investigated. Conclusion. The existing limitations in method and the plethora of different approaches allow little progress and few comparisons across studies. There is an urgent need to establish a theoretical framework on which future studies can be based, so that information seeking behaviour can be more holistically understood and results can be generalised.

  9. SWCD: a sliding window and self-regulated learning-based background updating method for change detection in videos

    Science.gov (United States)

    Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih

    2018-03-01

    Change detection with a background subtraction process remains an unresolved issue and attracts research interest due to challenges encountered in static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. In order to achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of update parameters is introduced for updating background frames, which we call sliding-window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. It is argued that the obtained method is a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail outside the scenarios they were designed for.
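    A minimal, hedged sketch of a sliding-window background model for pixelwise change detection. It uses a plain median over the window and a fixed threshold; the SWCD method described above additionally self-regulates its update parameters, which is not reproduced here, and the window size and threshold below are arbitrary assumptions:

```python
from collections import deque
import numpy as np

class SlidingWindowBackground:
    """Generic sliding-window background model with a fixed decision threshold."""

    def __init__(self, window_size=30, threshold=25.0):
        self.window = deque(maxlen=window_size)
        self.threshold = threshold

    def apply(self, frame):
        """Return a boolean foreground mask for the frame and update the window."""
        frame = frame.astype(np.float32)
        if self.window:
            background = np.median(np.stack(list(self.window)), axis=0)
            mask = np.abs(frame - background) > self.threshold
        else:
            mask = np.zeros(frame.shape, dtype=bool)  # no model yet: all background
        self.window.append(frame)
        return mask

detector = SlidingWindowBackground()
frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
print(detector.apply(frame).shape)
```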

  10. Algebraic renormalization of Yang-Mills theory with background field method

    International Nuclear Information System (INIS)

    Grassi, P.A.

    1996-01-01

    In this paper the renormalizability of Yang-Mills theory in the background gauge fixing is studied. By means of Ward identities of background gauge invariance and Slavnov-Taylor identities, in a regularization-independent way, the stability of the model under radiative corrections is proved and its renormalizability is verified. In particular, it is shown that the splitting between background and quantum field is stable under radiative corrections and this splitting does not introduce any new anomalies. (orig.)

  11. Exploring methods in information literacy research

    CERN Document Server

    Lipu, Suzanne; Lloyd, Annemaree

    2007-01-01

    This book provides an overview of approaches to assist researchers and practitioners to explore ways of undertaking research in the information literacy field. The first chapter provides an introductory overview of research by Dr Kirsty Williamson (author of Research Methods for Students, Academics and Professionals: Information Management and Systems) and this sets the scene for the rest of the chapters, where each author explores the key aspects of a specific method and explains how it may be applied in practice. The methods covered include those representing qualitative, quantitative and mixed approaches.

  12. Method of and System for Information Retrieval

    DEFF Research Database (Denmark)

    2015-01-01

    This invention relates to a system for and a method (100) of searching a collection of digital information (150) comprising a number of digital documents (110). The method comprises receiving or obtaining (102) a search query comprising a number of search terms, searching (103) an index (300) using the search terms, thereby providing information (301) about which digital documents (110) of the collection of digital information (150) contain a given search term together with one or more search-related metrics (302; 303; 304; 305; 306), and ranking (105) at least a part of the search result. In this way, a method of and a system for information retrieval or searching is provided that enhances search quality (i.e. the number of relevant documents retrieved, with such documents being ranked high) when (also) using queries containing many search terms.
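    A small, hedged sketch of the generic pattern the record describes: index the collection, look up each search term, and rank documents by a simple search-related metric. The metric used here (number of matching query terms) is an illustrative assumption; the patented ranking metrics are not specified in the abstract and are not reproduced:

```python
from collections import defaultdict

def build_index(documents):
    """Map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Rank documents by how many distinct query terms they contain."""
    scores = defaultdict(int)
    for term in set(query.lower().split()):
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

docs = {1: "background removal in coincidence spectra",
        2: "information retrieval and ranking of documents"}
print(search(build_index(docs), "information ranking spectra"))  # -> [2, 1]
```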

  13. Studying Heavy Ion Collisions Using Methods From Cosmic Microwave Background (CMB) Analysis

    Directory of Open Access Journals (Sweden)

    Gaardhøje J. J.

    2014-04-01

    We present and discuss a framework for studying the morphology of high-multiplicity events from relativistic heavy ion collisions using methods commonly employed in the analysis of photons from the Cosmic Microwave Background (CMB). The analysis is based on the decomposition of the distribution of the number density of (charged) particles, expressed in polar and azimuthal coordinates, into a sum of spherical harmonic functions. We present an application of the method, exploiting relevant symmetries, to the study of azimuthal correlations arising from collective flow among charged particles produced in relativistic heavy ion collisions. We discuss perspectives for event-by-event analyses, which with increasing collision energy will eventually open entirely new dimensions in the study of ultrarelativistic heavy ion reactions.
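    A hedged sketch of the azimuthal part of such a decomposition: for a single event, anisotropic-flow coefficients v_n can be estimated from the particle azimuthal angles. The full framework above also decomposes the polar dependence into spherical harmonics, which is omitted here; the input angles are randomly generated illustration data and the estimator carries the usual autocorrelation bias of event-plane methods:

```python
import numpy as np

def flow_coefficients(phi, n_max=4):
    """Estimate v_n from particle azimuthal angles phi (radians) in one event."""
    phi = np.asarray(phi)
    coeffs = {}
    for n in range(1, n_max + 1):
        qx, qy = np.mean(np.cos(n * phi)), np.mean(np.sin(n * phi))
        psi_n = np.arctan2(qy, qx) / n                  # order-n event-plane angle
        coeffs[n] = np.mean(np.cos(n * (phi - psi_n)))  # v_n relative to that plane
    return coeffs

phi = np.random.uniform(0.0, 2.0 * np.pi, 2000)   # isotropic toy event
print({n: round(v, 3) for n, v in flow_coefficients(phi).items()})
```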

  14. Background information on the SSC project

    International Nuclear Information System (INIS)

    Warren, J.

    1991-10-01

    This report discusses the following information about the Superconducting Super Collider: Goals and milestones; civil construction; ring components; cryogenics; vacuum and cooling water systems; electrical power; instrumentation and control systems; and installation planning

  15. Self-informant Agreement for Personality and Evaluative Person Descriptors: Comparing Methods for Creating Informant Measures.

    Science.gov (United States)

    Simms, Leonard J; Zelazny, Kerry; Yam, Wern How; Gros, Daniel F

    2010-05-01

    Little attention typically is paid to the way self-report measures are translated for use in self-informant agreement studies. We studied two possible methods for creating informant measures: (a) the traditional method in which self-report items were translated from the first- to the third-person and (b) an alternative meta-perceptual method in which informants were directed to rate their perception of the targets' self-perception. We hypothesized that the latter method would yield stronger self-informant agreement for evaluative personality dimensions measured by indirect item markers. We studied these methods in a sample of 303 undergraduate friendship dyads. Results revealed mean-level differences between methods, similar self-informant agreement across methods, stronger agreement for Big Five dimensions than for evaluative dimensions, and incremental validity for meta-perceptual informant rating methods. Limited power reduced the interpretability of several sparse acquaintanceship effects. We conclude that traditional informant methods are appropriate for most personality traits, but meta-perceptual methods may be more appropriate when personality questionnaire items reflect indirect indicators of the trait being measured, which is particularly likely for evaluative traits.

  16. Resonances and background: A decomposition of scattering information

    International Nuclear Information System (INIS)

    Engdahl, E.; Braendas, E.; Rittby, M.; Elander, N.

    1988-01-01

    An analytic representation of the full Green's function including bound states, resonances, and remaining contributions has been obtained for a class of dilatation analytic potentials, including the superimposed Coulomb potential. It is demonstrated how to obtain the locations and residues of the poles of the Green's function as well as the associated generalized spectral density. For a model potential which has a barrier and decreases exponentially at infinity we have found a certain deflation property of the generalized spectral density. A qualitative explanation of this phenomenon is suggested. This constitutes the motivation for an approximation that explicitly shows a decomposition of the (real) continuum, corresponding to scattering data, into resonances and background contributions. The present representation is also shown to incorporate the appropriate pole-background interferences. Numerical residue strings are computed and analyzed. Results for the Coulomb potential plus the above-mentioned model potential are reported and compared with the previous non-Coulomb case. A similar deflation effect is seen to occur, as well as basically the same pole- and residue-string behavior. The relevance of the present analysis in relation to recently planned experiments with electron-cooled beams of highly charged ions is briefly discussed
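    A schematic, hedged statement of the kind of decomposition the abstract refers to: the Green's function (and hence the spectral density) is split into bound-state poles, resonance poles, and a remaining background contribution. The symbols below are generic placeholders and the contour C is left unspecified:

```latex
% Bound-state poles at E_b, resonance poles at complex \varepsilon_r reached via
% dilatation analyticity, and a background term from the deformed continuum.
G(z) \;\simeq\; \sum_{b}\frac{|\psi_b\rangle\langle\tilde\psi_b|}{z-E_b}
\;+\; \sum_{r}\frac{|\psi_r\rangle\langle\tilde\psi_r|}{z-\varepsilon_r}
\;+\; \int_{C}\frac{|\psi(k)\rangle\langle\tilde\psi(k)|}{z-E(k)}\,dk
```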

  17. Method and apparatus for information carrier authentication

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a method of enabling authentication of an information carrier, the information carrier comprising a writeable part and a physical token arranged to supply a response upon receiving a challenge, the method comprising the following steps: applying a first challenge to

  18. Pricing Power of Agricultural Products under the Background of Small Peasant Management and Information Asymmetry

    Institute of Scientific and Technical Information of China (English)

    Dexuan LI

    2016-01-01

    Against the background of small peasant management and information asymmetry, this paper introduces the middle-profit-sharing model and discusses the influencing factors and ownership of pricing power for agricultural products. It obtains the following results: (i) the transaction scale has a positive effect on the farmer's pricing power for agricultural products, while the competitor's transaction scale has a negative effect on it, as does the cost of information search; (ii) under the small peasant management system, the farmer is in a relatively weak position in the distribution of pricing power for agricultural products, due to factors such as small transaction scale, information asymmetry, and the farmer's weak negotiating ability; (iii) through a cooperative game, the farmer and buyers can share the cooperative surplus at an agreed ratio; (iv) the introduction of self-organizing specialized farmers' cooperatives is favorable for solving the problem of pricing power for agricultural products, and possible problems in the cooperative development process, such as the "collective action dilemma" and "fake cooperatives", can be solved by internal and external division of labor and the specialization of cooperatives.

  19. Estimating the SM background for supersymmetry searches: challenges and methods

    CERN Document Server

    Besjes, G J; The ATLAS collaboration

    2013-01-01

    Supersymmetry features a broad range of possible signatures at the LHC. If R-parity is conserved the production of squarks and gluinos is accompanied by events with hard jets, possibly leptons or photons and missing transverse momentum. Some Standard Model processes also mimic such events, which, due to their large cross sections, represent backgrounds that can fake or hide supersymmetry. While the normalisation of these backgrounds can be obtained from data in dedicated control regions, Monte Carlo simulation is often used to extrapolate the measured event yields from control to signal regions. Next-to-leading order and multi-parton generators are employed to predict these extrapolations for the dominant processes contributing to the SM background: W/Z boson and top pair production in association with (many) jets. The proper estimate of the associated theoretical uncertainties and testing these with data represent challenges. Other important backgrounds are diboson and top pair plus boson events with additio...

  20. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection.

    Science.gov (United States)

    I H, Monrad Aas

    2014-11-01

    Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview - unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants - as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these influences is fundamental. Further development of

  1. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  2. Effects of projection and background correction method upon calculation of right ventricular ejection fraction using first-pass radionuclide angiography

    International Nuclear Information System (INIS)

    Caplin, J.L.; Flatman, W.D.; Dymond, D.S.

    1985-01-01

    There is no consensus as to the best projection or correction method for first-pass radionuclide studies of the right ventricle. We assessed the effects of two commonly used projections, 30-degree right anterior oblique and anterior-posterior, on the calculation of right ventricular ejection fraction. In addition, two background correction methods were assessed: planar background correction to account for scatter, and right atrial correction to account for right atrio-ventricular overlap. Two first-pass radionuclide angiograms were performed in 19 subjects, one in each projection, using gold-195m (half-life 30.5 seconds), and each study was analysed using the two methods of correction. Right ventricular ejection fraction was highest using the right anterior oblique projection with right atrial correction, 35.6 ± 12.5% (mean ± SD), and lowest when using the anterior-posterior projection with planar background correction, 26.2 ± 11% (p less than 0.001). The study design allowed assessment of the effects of correction method and projection independently. Correction method appeared to have relatively little effect on right ventricular ejection fraction: using right atrial correction the correlation coefficient (r) between projections was 0.92, and for planar background correction r = 0.76, both p less than 0.001. However, right ventricular ejection fraction was far more dependent upon projection. When the anterior-posterior projection was used, calculated right ventricular ejection fraction was much more dependent on correction method (r = 0.65, p = not significant) than when using the right anterior oblique projection (r = 0.85, p less than 0.001).
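    A brief, hedged illustration of where a background correction enters the ejection-fraction calculation in first-pass studies: end-diastolic and end-systolic counts are corrected before the fraction is formed. The function and the count values are made-up illustrations, not data from the study above:

```python
def right_ventricular_ef(ed_counts, es_counts, background_counts):
    """Ejection fraction from background-corrected end-diastolic/systolic counts."""
    ed_net = ed_counts - background_counts
    es_net = es_counts - background_counts
    if ed_net <= 0:
        raise ValueError("Background exceeds end-diastolic counts.")
    return (ed_net - es_net) / ed_net

ef = right_ventricular_ef(ed_counts=4200, es_counts=3100, background_counts=1500)
print(f"RVEF = {100 * ef:.1f}%")   # ~40.7% with these illustrative numbers
```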

  3. Opportunities for renewable biomass in the Dutch province of Zeeland. Background information

    International Nuclear Information System (INIS)

    De Buck, A.; Croezen, H.

    2009-04-01

    The Dutch province of Zeeland is organizing three bio-debates to map economically attractive and renewable biomass opportunities. Participants included industrial businesses, ZLTO, ZMF, Zeeland Seaports, Impuls Zeeland, Hogeschool Zeeland and the University of Ghent. CE Delft is organizing the debates and provides the expertise in this field. In the first debate (Goes, 22 January 2009) the main lines for the deployment of biomass in Zeeland were established. One of the conclusions was that there are opportunities for existing industry to implement new technology for large-scale use of (imported) biomass. As for agriculture, there may be opportunities for high-quality chemicals from agricultural crops. Agriculture and industry have short-term opportunities for better and more high-quality utilization of existing residual flows of biomass. The second and third debates should address concrete opportunities for industry and agriculture in Zeeland. This report provides background information to support the debates.

  4. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  5. Spectral characterization of natural backgrounds

    Science.gov (United States)

    Winkelmann, Max

    2017-10-01

    As the distribution and use of hyperspectral sensors is constantly increasing, the exploitation of spectral features is a threat to camouflaged objects. To improve camouflage materials, the spectral behavior of backgrounds first has to be known in order to adjust and optimize the spectral reflectance of camouflage materials. In an international effort, the NATO CSO working group SCI-295 "Development of Methods for Measurements and Evaluation of Natural Background EO Signatures" is developing a method for how this characterization of backgrounds should be done. It is obvious that the spectral characterization of a background will be quite an effort. To compare and exchange data internationally, the measurements will have to be done in a similar way. To test and further improve this method, an international field trial has been performed in Storkow, Germany. In the following we present first impressions and lessons learned from this field campaign and describe the data that have been measured.

  6. Using information to deliver safer care: a mixed-methods study exploring general practitioners’ information needs in North West London primary care

    Directory of Open Access Journals (Sweden)

    Nikolaos Mastellos

    2014-12-01

    Background: The National Health Service in England has given increasing priority to improving inter-professional communication, enabling better management of patients with chronic conditions and reducing medical errors through effective use of information. Despite considerable efforts to reduce patient harm through better information usage, medical errors continue to occur, posing a serious threat to patient safety. Objectives: This study explores the range, quality and sophistication of existing information systems in primary care, with the aim of capturing what information practitioners need to provide a safe service and identifying barriers to its effective use in care pathways. Method: Data were collected through semi-structured interviews with general practitioners from surgeries in North West London and a survey evaluating their experience with information systems in care pathways. Results: Important information is still missing, specifically discharge summaries detailing medication changes and changes in the diagnosis and management of patients, blood results ordered by hospital specialists, and findings from clinical investigations. Participants identified numerous barriers, including the communication gap between primary and secondary care, the variable quality and consistency of clinical correspondence, and inadequate technological integration. Conclusion: Despite attempts to improve integration and information flow in care pathways, existing systems provide practitioners with only partial access to information, hindering their ability to make informed decisions. This study offers a framework for understanding what tools should be in place to enable effective use of information in primary care.

  7. Application of nonparametric regression methods to study the relationship between NO2 concentrations and local wind direction and speed at background sites.

    Science.gov (United States)

    Donnelly, Aoife; Misstear, Bruce; Broderick, Brian

    2011-02-15

    Background concentrations of nitrogen dioxide (NO₂) are not constant but vary temporally and spatially. The current paper presents a powerful tool for quantifying the effects of wind direction and wind speed on background NO₂ concentrations, particularly in cases where monitoring data are limited. In contrast to previous studies, which applied similar methods to sites directly affected by local pollution sources, the current study focuses on background sites with the aim of improving the methods for predicting background concentrations adopted in air quality modelling studies. The relationship between the measured NO₂ concentration in air at three such sites in Ireland and locally measured wind direction has been quantified using nonparametric regression methods. The major aim was to analyse a method for quantifying the effects of local wind direction on background levels of NO₂ in Ireland. The method was expanded to include wind speed as an added predictor variable. A Gaussian kernel function is used in the analysis and circular statistics are employed for the wind direction variable. Wind direction and wind speed were both found to have a statistically significant effect on background levels of NO₂ at all three sites. Frequently, environmental impact assessments are based on short-term baseline monitoring producing a limited dataset. The presented nonparametric regression methods, in contrast to frequently used methods such as binning of the data, allow concentrations for missing data pairs to be estimated and a distinction between spurious and true peaks in concentrations to be made. The methods were found to provide a realistic estimation of long-term concentration variation with wind direction and speed, even for cases where the data set is limited. Accurate identification of the actual variation at each location and the causative factors could be made, thus supporting an improved definition of background concentrations for use in air quality modelling.
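    A small, hedged sketch of a kernel (Nadaraya-Watson) regression with a Gaussian kernel applied to a circular predictor, in the spirit of the approach described above. The bandwidth, the use of degrees, and the data values are illustrative assumptions, not the study's settings:

```python
import numpy as np

def angular_difference_deg(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    d = np.abs(a - b) % 360.0
    return np.minimum(d, 360.0 - d)

def kernel_regression_circular(theta0, theta_obs, y_obs, bandwidth=20.0):
    """Nadaraya-Watson estimate of y at wind direction theta0 (degrees)."""
    d = angular_difference_deg(theta0, np.asarray(theta_obs, dtype=float))
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # Gaussian kernel weights
    return np.sum(w * np.asarray(y_obs, dtype=float)) / np.sum(w)

wind_dir = [10, 45, 90, 180, 270, 350]             # degrees, made-up observations
no2 = [22.0, 18.5, 12.0, 9.5, 14.0, 21.0]          # background NO2 in ug/m3
print(round(kernel_regression_circular(0.0, wind_dir, no2), 1))
```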

  8. Application of geo-information science methods in ecotourism exploitation

    Science.gov (United States)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    The application of geo-information science methods in ecotourism development is discussed in this article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resource reconnaissance, data management, environmental monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data. The application of 3S methods is helpful for sustainable development in tourism. Various tasks are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, handling of large volumes of data, tourism information inquiry, employee management, and product quality management. The utilization of geo-information methods in ecotourism can make development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  9. Studying collaborative information seeking: Experiences with three methods

    DEFF Research Database (Denmark)

    Hyldegård, Jette Seiden; Hertzum, Morten; Hansen, Preben

    2015-01-01

    , however, benefit from a discussion of methodological issues. This chapter describes the application of three methods for collecting and analyzing data in three CIS studies. The three methods are Multidimensional Exploration, used in a CIS study of students' information behavior during a group assignment; Task-structured Observation, used in a CIS study of patent engineers; and Condensed Observation, used in a CIS study of information-systems development. The three methods are presented in the context of the studies for which they were devised, and the experiences gained using the methods are discussed. The chapter shows that different methods can be used for collecting and analyzing data about CIS incidents. Two of the methods focused on tasks and events in work settings, while the third was applied in an educational setting. Commonalities and differences among the methods are discussed to inform decisions ...

  10. Non-perturbative background field calculations

    International Nuclear Information System (INIS)

    Stephens, C.R.; Department of Physics, University of Utah, Salt Lake City, Utah 84112)

    1988-01-01

    New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation: perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation. copyright 1988 Academic Press, Inc
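    For orientation, the proper-time representation that this kind of calculation exploits can be stated schematically as below. This is the generic Schwinger proper-time identity, written with generic symbols (Delta for the small-fluctuation operator), valid up to regularization-dependent constants, and not a result specific to the paper:

```latex
% One-loop effective action in a background field via the heat kernel of the
% small-fluctuation operator \Delta; s is the proper-time parameter.
\Gamma^{(1)} \;=\; \tfrac{1}{2}\,\mathrm{Tr}\,\ln\Delta
\;=\; -\,\tfrac{1}{2}\int_{0}^{\infty}\frac{ds}{s}\,\mathrm{Tr}\,e^{-s\Delta}
```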

  11. Non-perturbative background field calculations

    Science.gov (United States)

    Stephens, C. R.

    1988-01-01

    New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation—perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation.

  12. Agile Methods from the Viewpoint of Information

    Directory of Open Access Journals (Sweden)

    Eder Junior Alves

    2017-10-01

    Introduction: Since Paul M. G. Otlet highlighted the term documentation in 1934, proposing how to collect and organize the world's knowledge, much scientific research has been directed to the study of Information Science. Methods and techniques have emerged that take a world view from the perspective of information. Agile methods follow this trend. Objective: The purpose is to analyze the relevance of information flow to organizations adopting agile methods, understanding how the innovation process is influenced by this practice. Methodology: This is a bibliometric study based on the fundamentals of the Systematic Literature Review (SLR). The integration of the SLR technique with the Summarize tool is a new methodological proposal. Results: Scrum appears with the highest number of publications in SPELL. In comparison, results from Google Scholar pointed to the importance of team practices and behaviors. In the Science Direct repository, critical success factors in project management and software development are highlighted. Conclusions: It was evident that agile methods are being used as process innovations. The benefits and advantages are evident in the internal and external flow of information. Given its prevalence in the literature, Scrum deserves attention from firms.

  13. Calculation of one-loop anomalous dimensions by means of the background field method

    International Nuclear Information System (INIS)

    Morozov, A.Yu.

    1983-01-01

    The knowledge of propagators in background fields makes the calculation of anomalous dimensions (AD) straightforward and brief. The paper illustrates this statement by calculating the AD of many spin-zero and spin-one QCD operators up to and including dimension eight. The method presented does not simplify calculations in the case of four-quark operators; therefore these are not discussed. Together with calculational difficulties arising for operators with derivatives, this limits the capacities of the whole approach and leads to the incompleteness of some mixing matrices found in the article.

  14. Fuzzy Search Method for Hi Education Information Security

    Directory of Open Access Journals (Sweden)

    Grigory Grigorevich Novikov

    2016-03-01

    The main aim of the research is how to use the fuzzy search method for the information security of higher education or similar purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents. That is why many intelligence services favor the «mosaic» information collection method. This article is about how to prevent it.
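    A tiny, hedged illustration of fuzzy (approximate) term matching of the general kind referred to above, here using Python's standard difflib; the keyword list, threshold, and sample text are invented for illustration and do not come from the article:

```python
from difflib import SequenceMatcher

def fuzzy_hits(text, keywords, threshold=0.8):
    """Flag words that approximately match any sensitive keyword."""
    hits = []
    for word in text.lower().split():
        for key in keywords:
            if SequenceMatcher(None, word, key).ratio() >= threshold:
                hits.append((word, key))
    return hits

sample = "the reaktor maintenance schedule was published openly"
print(fuzzy_hits(sample, ["reactor"]))   # [('reaktor', 'reactor')]
```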

  15. Background field method in gauge theories and on linear sigma models

    International Nuclear Information System (INIS)

    van de Ven, A.E.M.

    1986-01-01

    This dissertation constitutes a study of the ultraviolet behavior of gauge theories and two-dimensional nonlinear sigma-models by means of the background field method. After a general introduction in chapter 1, chapter 2 presents algorithms which generate the divergent terms in the effective action at one loop for arbitrary quantum field theories in flat spacetime of dimension d ≤ 11. It is demonstrated that global N = 1 supersymmetric Yang-Mills theory in six dimensions is one-loop UV-finite. Chapter 3 presents an algorithm which produces the divergent terms in the effective action at two loops for renormalizable quantum field theories in a curved four-dimensional background spacetime. Chapter 4 presents a study of the two-loop UV behavior of two-dimensional bosonic and supersymmetric non-linear sigma-models which include a Wess-Zumino-Witten term. It is found that, to this order, supersymmetric models on quasi-Ricci-flat spaces are UV-finite and the β-functions for the bosonic model depend only on torsionful curvatures. Chapter 5 summarizes a superspace calculation of the four-loop β-function for two-dimensional N = 1 and N = 2 supersymmetric non-linear sigma-models. It is found that, besides the one-loop contribution which vanishes on Ricci-flat spaces, the β-function receives four-loop contributions which do not vanish in the Ricci-flat case. Implications for superstrings are discussed. Chapters 6 and 7 treat the details of these calculations.

  16. 47 CFR 215.1 - Background.

    Science.gov (United States)

    2010-10-01

    § 215.1 Background. (a) The nuclear electromagnetic pulse (EMP) is part of the complex environment produced by nuclear explosions. It consists of transient...

  17. Thin-shell bubbles and information loss problem in anti de Sitter background

    Energy Technology Data Exchange (ETDEWEB)

    Sasaki, Misao [Yukawa Institute for Theoretical Physics,Kyoto University, Kyoto 606-8502 (Japan); Tomsk State Pedagogical University,634050 Tomsk (Russian Federation); Yeom, Dong-han [Yukawa Institute for Theoretical Physics,Kyoto University, Kyoto 606-8502 (Japan); Leung Center for Cosmology and Particle Astrophysics, National Taiwan University,Taipei 10617, Taiwan (China)

    2014-12-24

    We study the motion of thin-shell bubbles and their tunneling in an anti de Sitter (AdS) background. We are interested in the case when the outside of a shell is a Schwarzschild-AdS space (false vacuum) and the inside of it is an AdS space with a lower vacuum energy (true vacuum). If a collapsing true vacuum bubble is created, classically it will form a Schwarzschild-AdS black hole. However, this collapsing bubble can tunnel to a bouncing bubble that moves out to spatial infinity. Then, although the classical causal structure of a collapsing true vacuum bubble has the singularity and the event horizon, quantum mechanically the wavefunction has support for a history without any singularity or event horizon, mediated by the non-perturbative quantum tunneling effect. This may be regarded as an explicit example showing the unitarity of an asymptotic observer in AdS, while a classical observer who only follows the most probable history effectively loses information due to the formation of an event horizon.

  18. Thin-shell bubbles and information loss problem in anti de Sitter background

    International Nuclear Information System (INIS)

    Sasaki, Misao; Yeom, Dong-han

    2014-01-01

    We study the motion of thin-shell bubbles and their tunneling in an anti de Sitter (AdS) background. We are interested in the case when the outside of a shell is a Schwarzschild-AdS space (false vacuum) and the inside of it is an AdS space with a lower vacuum energy (true vacuum). If a collapsing true vacuum bubble is created, classically it will form a Schwarzschild-AdS black hole. However, this collapsing bubble can tunnel to a bouncing bubble that moves out to spatial infinity. Then, although the classical causal structure of a collapsing true vacuum bubble has the singularity and the event horizon, quantum mechanically the wavefunction has support for a history without any singularity or event horizon, mediated by the non-perturbative quantum tunneling effect. This may be regarded as an explicit example showing the unitarity of an asymptotic observer in AdS, while a classical observer who only follows the most probable history effectively loses information due to the formation of an event horizon.

  19. 102: PROMOTING INFORMATION LITERACY BY PROMOTING HEALTH LITERACY IN THE INFORMATION SOCIETY

    OpenAIRE

    Dastani, Meisam; Sattari, Masoume

    2017-01-01

    Background and aims: In the information society, the production, distribution and use of information are freely and widely available for all issues of life. The correct and appropriate use of reliable information is especially important in health care. The present study introduces the concepts and benefits of health literacy and information literacy and the role of information literacy in improving health literacy. Methods: This study is a review based on the concepts of the information society, ...

  20. Study on the background information for the R and D of geological disposal

    International Nuclear Information System (INIS)

    Matsui, Kazuaki; Hirusawa, Shigenobu; Komoto, Harumi

    2001-02-01

    It is quite important for the Japan Nuclear Cycle Development Institute (JNC) to analyze the R and D items following the 'H12 report' and also to provide the results of its R and D activities to the general public effectively. Recognizing the importance of social consensus on geological disposal, relevant background information was collected. In this fiscal year, the following two main topics were selected and studied. 1. Research and analysis of the options for the geological disposal concept. The major nuclear power-generating countries have almost all chosen deep geological disposal as the preferred method for HLW disposal. Since the 1990s, to make geological disposal flexible, alternative concepts for the disposal of HLW have been discussed to promote social acceptance. In this context, recent discussions of options and international evaluations of the following topics were studied and summarized: (1) Reversibility of waste disposal/Retrievability of waste/Waste monitoring, (2) Long-term storage concept and its effectiveness, (3) Present position and role of international disposal. 2. Research and analysis of educational materials collected from foreign countries. Although geological disposal is scheduled to start only in the future, it is quite important to study how to attract the younger generation and give them a proper perception of nuclear energy and waste problems. As supporting analysis for strategically implementing the public relations activities for JNC's geological disposal R and D, particular attention was focused on the educational materials obtained in last year's survey. Representative educational materials were selected and the following items were studied and summarized: (1) Basic approach, positioning and characteristics of the educational materials, (2) Detailed analysis of the representatively selected educational materials, (3) Comparison of the analyzed characteristics and study of its feedback to Japanese materials. (author)

  1. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  2. Thermalization of mutual information in hyperscaling violating backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Tanhayi, M. Reza [Department of Physics, Faculty of Basic Science,Islamic Azad University Central Tehran Branch (IAUCTB),P.O. Box 14676-86831, Tehran (Iran, Islamic Republic of); School of Physics, Institute for Research in Fundamental Sciences (IPM),P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)

    2016-03-31

    We study certain features of the scaling behavior of mutual information during a process of thermalization; more precisely, we extend the time-scaling behavior of mutual information discussed in http://dx.doi.org/10.1007/JHEP09(2015)165 to time-dependent hyperscaling-violating geometries. We use the holographic description of entanglement entropy for two disjoint systems consisting of two parallel strips whose widths are much larger than the separation between them. We show that during the thermalization process, the dynamical exponent plays a crucial role in determining the general time-scaling behavior of mutual information (e.g., in the pre-local-equilibration regime). It is shown that the scaling-violating parameter can be employed to define an effective dimension.

  3. 48 CFR 2905.101 - Methods of disseminating information.

    Science.gov (United States)

    2010-10-01

    ... information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 2905.101 Methods of disseminating... dissemination of information concerning procurement actions. The Division of Acquisition Management Services...

  4. A genomic background based method for association analysis in related individuals.

    Directory of Open Access Journals (Sweden)

    Najaf Amin

    Full Text Available BACKGROUND: The feasibility of genotyping hundreds of thousands of single nucleotide polymorphisms (SNPs) in thousands of study subjects has triggered the need for fast, powerful, and reliable methods for genome-wide association analysis. Here we consider a situation when study participants are genetically related (e.g. due to systematic sampling of families or because a study was performed in a genetically isolated population). Of the available methods that account for relatedness, the Measured Genotype (MG) approach is considered the 'gold standard'. However, MG is not efficient with respect to the time taken for the analysis of genome-wide data. In this context we proposed a fast two-step method called Genome-wide Association using Mixed Model and Regression (GRAMMAR) for the analysis of pedigree-based quantitative traits. This method overcomes the time limitation of the measured genotype (MG) approach, but pays a price in power. A major drawback of both MG and GRAMMAR is that they crucially depend on the availability of complete and correct pedigree data, which are rarely available. METHODOLOGY: In this study we first explore the type 1 error and relative power of the MG, GRAMMAR, and Genomic Control (GC) approaches for genetic association analysis. Secondly, we propose an extension to GRAMMAR, GRAMMAR-GC. Finally, we propose applying GRAMMAR-GC using the kinship matrix estimated from genomic marker data, instead of (possibly missing and/or incorrect) genealogy. CONCLUSION: Through simulations we show that the MG approach maintains high power across a range of heritabilities and possible pedigree structures, and always outperforms other contemporary methods. We also show that the power of our proposed GRAMMAR-GC approaches that of the 'gold standard' MG for all models and pedigrees studied. We show that this method is both feasible and powerful and has correct type 1 error in the context of genome-wide association analysis.
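
    To make the two-step idea concrete, the sketch below regresses mixed-model residuals on each SNP and applies a genomic-control correction, in the spirit of GRAMMAR-GC as described above. The variance-component fit is deliberately simplified (heritability is assumed known), the marker-based kinship is a crude stand-in, and all data and parameter values are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, n_snps, h2 = 200, 500, 0.4               # heritability assumed known here

# Hypothetical inputs: phenotype y, genotypes G (0/1/2) and a kinship matrix
# K estimated from the genomic markers themselves (no pedigree required).
G = rng.integers(0, 3, size=(n, n_snps)).astype(float)
K = np.corrcoef(G) / 2.0 + 0.5 * np.eye(n)  # crude marker-based kinship stand-in
y = rng.normal(size=n)

# Step 1: polygenic model y = mu + g + e with cov(g) = 2*h2*K, cov(e) = (1-h2)*I;
# the environmental residuals are what GRAMMAR tests against each SNP.
V = 2.0 * h2 * K + (1.0 - h2) * np.eye(n)
Vinv = np.linalg.inv(V)
ones = np.ones(n)
mu = (ones @ Vinv @ y) / (ones @ Vinv @ ones)       # GLS estimate of the mean
g_blup = 2.0 * h2 * K @ Vinv @ (y - mu)             # BLUP of the polygenic effect
resid = y - mu - g_blup                             # environmental residuals

# Step 2: fast per-SNP regression of the residuals, then genomic-control (GC)
# correction of the resulting chi-square statistics.
chi2_stat = np.empty(n_snps)
for j in range(n_snps):
    r = np.corrcoef(resid, G[:, j])[0, 1]
    chi2_stat[j] = n * r ** 2                       # score-test-like statistic
lam = np.median(chi2_stat) / stats.chi2.ppf(0.5, df=1)  # genomic inflation factor
p_gc = stats.chi2.sf(chi2_stat / lam, df=1)         # GC-corrected p-values
```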

  5. The Qatar Biobank: background and methods.

    Science.gov (United States)

    Al Kuwari, Hanan; Al Thani, Asma; Al Marri, Ajayeb; Al Kaabi, Abdulla; Abderrahim, Hadi; Afifi, Nahla; Qafoud, Fatima; Chan, Queenie; Tzoulaki, Ioanna; Downey, Paul; Ward, Heather; Murphy, Neil; Riboli, Elio; Elliott, Paul

    2015-12-03

    The Qatar Biobank aims to collect extensive lifestyle, clinical, and biological information from up to 60,000 Qatari nationals and long-term residents (individuals living in the country for ≥15 years), men and women aged ≥18 years (approximately one-fifth of all Qatari citizens), to follow up these same individuals over the long term to record any subsequent disease, and hence to study the causes and progression of disease, and disease burden, in the Qatari population. Between 11 December 2012 and 20 February 2014, 1209 participants were recruited into the pilot study of the Qatar Biobank. At recruitment, extensive phenotype information was collected from each participant, including information/measurements of socio-demographic factors, prevalent health conditions, diet, lifestyle, anthropometry, body composition, bone health, cognitive function, grip strength, retinal imaging, total body dual energy X-ray absorptiometry, and measurements of cardiovascular and respiratory function. Blood, urine, and saliva were collected and stored for future research use. A panel of 66 clinical biomarkers was routinely measured on fresh blood samples in all participants. Rates of recruitment are to be progressively increased in the coming period and the recruitment base widened to achieve a cohort of consented individuals broadly representative of the eligible Qatari population. In addition, it is planned to add additional measures in sub-samples of the cohort, including Magnetic Resonance Imaging (MRI) of the brain, heart and abdomen. The mean time for collection of the extensive phenotypic information and biological samples from each participant at the baseline recruitment visit was 179 min. The 1209 pilot study participants (506 men and 703 women) were aged between 28-80 years (median 39 years); 899 (74.4%) were Qatari nationals and 310 (25.6%) were long-term residents. Approximately two-thirds of pilot participants were educated to graduate level or above. The

  6. MAIA - Method for Architecture of Information Applied: methodological construct of information processing in complex contexts

    Directory of Open Access Journals (Sweden)

    Ismael de Moura Costa

    2017-04-01

    Full Text Available Introduction: This paper presents the evolution of MAIA, the Method for Architecture of Information Applied, its structure, the results obtained, and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the configurations inherent in those spaces. Methodology: The argument is developed from theoretical research of an analytical character, using distinction as a way to express concepts. Phenomenology is used as a philosophical position, which considers the correlation between Subject↔Object. The research also considers the notion of interpretation as an integrating element for the definition of concepts. With these postulates, the steps to transform the information spaces are formulated. Results: The article shows how the method is structured to process information in its contexts, starting from a succession of evolutionary cycles, divided into moments, which in turn evolve into transformation acts. Conclusions: Besides presenting the method's structure, the article presents its possible applications as a scientific method, as a configuration tool in information spaces, and as a generator of ontologies. Finally, it presents a brief summary of the analysis made by researchers who have already evaluated the method considering the three aspects mentioned.

  7. Effect of background dielectric on TE-polarized photonic bandgap of metallodielectric photonic crystals using Dirichlet-to-Neumann map method.

    Science.gov (United States)

    Sedghi, Aliasghar; Rezaei, Behrooz

    2016-11-20

    Using the Dirichlet-to-Neumann map method, we have calculated the photonic band structure of two-dimensional metallodielectric photonic crystals having square and triangular lattices of circular metal rods in a dielectric background. We have selected the transverse electric mode of electromagnetic waves, and the resulting band structures show the existence of a photonic bandgap in these structures. We theoretically study the effect of the background dielectric on the photonic bandgap.

  8. Comparison of information-theoretic to statistical methods for gene-gene interactions in the presence of genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Sucheston Lara

    2010-09-01

    Full Text Available Abstract Background Multifactorial diseases such as cancer and cardiovascular diseases are caused by the complex interplay between genes and environment. The detection of these interactions remains challenging due to computational limitations. Information-theoretic approaches use computationally efficient directed search strategies and thus provide a feasible solution to this problem. However, the power of information-theoretic methods for interaction analysis has not been systematically evaluated. In this work, we compare the power and Type I error of an information-theoretic approach to existing interaction analysis methods. Methods The k-way interaction information (KWII) metric for identifying variable combinations involved in gene-gene interactions (GGI) was assessed using several simulated data sets under models of genetic heterogeneity driven by susceptibility-increasing loci with varying allele frequency, penetrance values and heritability. The power and proportion of false positives of the KWII were compared to multifactor dimensionality reduction (MDR), the restricted partitioning method (RPM) and logistic regression. Results The power of the KWII was considerably greater than MDR on all six simulation models examined. For a given disease prevalence at high values of heritability, the power of both RPM and KWII was greater than 95%. For models with low heritability and/or genetic heterogeneity, the power of the KWII was consistently greater than RPM; the improvements in power for the KWII over RPM ranged from 4.7% to 14.2% for α = 0.001 in the three models at the lowest heritability values examined. KWII performed similarly to logistic regression. Conclusions Information-theoretic models are flexible and have excellent power to detect GGI under a variety of conditions that characterize complex diseases.
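
    As a concrete illustration of the KWII metric itself, the sketch below computes the k-way interaction information of discrete variables from a sample via the usual inclusion-exclusion sum over joint entropies. The sign convention and the toy XOR phenotype are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np
from itertools import combinations

def entropy(columns):
    """Joint Shannon entropy (in bits) of the given discrete 1-D arrays."""
    if not columns:
        return 0.0
    joint = np.stack(columns, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def kwii(variables):
    """K-way interaction information over a list of discrete 1-D arrays:
    KWII(S) = -sum over subsets T of S of (-1)^(|S|-|T|) * H(T)."""
    k = len(variables)
    total = 0.0
    for r in range(k + 1):
        for subset in combinations(range(k), r):
            total += (-1) ** (k - r) * entropy([variables[i] for i in subset])
    return -total

# Toy example: two SNP-like variables and a phenotype given by their XOR,
# a purely interactive effect that yields a strongly positive 3-way KWII.
rng = np.random.default_rng(0)
snp1 = rng.integers(0, 2, 5000)
snp2 = rng.integers(0, 2, 5000)
pheno = snp1 ^ snp2
print(kwii([snp1, snp2, pheno]))   # close to 1 bit
print(kwii([snp1, snp2]))          # pairwise mutual information, close to 0
```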

  9. A review of Green's function methods in computational fluid mechanics: Background, recent developments and future directions

    International Nuclear Information System (INIS)

    Dorning, J.

    1981-01-01

    The research and development over the past eight years on local Green's function methods for the high-accuracy, high-efficiency numerical solution of nuclear engineering problems is reviewed. The basic concepts and key ideas are presented by starting with an expository review of the original fully two-dimensional local Green's function methods developed for neutron diffusion and heat conduction, and continuing through the progressively more complicated and more efficient nodal Green's function methods for neutron diffusion, heat conduction and neutron transport to establish the background for the recent development of Green's function methods in computational fluid mechanics. Some of the impressive numerical results obtained via these classes of methods for nuclear engineering problems are briefly summarized. Finally, speculations are proffered on future directions in which the development of these types of methods in fluid mechanics and other areas might lead. (orig.)

  10. Where is information quality lost at clinical level? A mixed-method study on information systems and data quality in three urban Kenyan ANC clinics

    Directory of Open Access Journals (Sweden)

    Daniel Hahn

    2013-08-01

    Full Text Available Background: Well-working health information systems are considered vital, with the quality of health data ranked of highest importance for decision making at patient care and policy levels. In particular, health facilities play an important role, since they are not only the entry point for the national health information system but also use health data (and primarily) for patient care. Design: A multiple case study was carried out between March and August 2012 at the antenatal care (ANC) clinics of two private and one public Kenyan hospital to describe clinical information systems and assess the quality of information. The following methods were developed and employed in an iterative process: workplace walkthroughs, structured and in-depth interviews with staff members, and a quantitative assessment of data quality (completeness and accurate transmission of clinical information and reports in ANC). Views of staff and management on the quality of employed information systems, data quality, and influencing factors were captured qualitatively. Results: Staff rated the quality of information higher in the private hospitals employing computers than in the public hospital, which relies on paper forms. Several potential threats to data quality were reported. Limitations in data quality were common at all study sites, including wrong test results, missing registers, and inconsistencies in reports. Feedback was seldom given on the content or quality of reports, and usage of data beyond individual patient care was low. Conclusions: We argue that the limited data quality has to be seen in the broader perspective of the information systems in which it is produced and used. The combination of different methods has proven to be useful for this. To improve the effectiveness and capabilities of these systems, combined measures are needed which include technical and organizational aspects (e.g. regular feedback to health workers) and individual skills and motivation.

  11. Where is information quality lost at clinical level? A mixed-method study on information systems and data quality in three urban Kenyan ANC clinics

    Science.gov (United States)

    Hahn, Daniel; Wanjala, Pepela; Marx, Michael

    2013-01-01

    Background Well-working health information systems are considered vital with the quality of health data ranked of highest importance for decision making at patient care and policy levels. In particular, health facilities play an important role, since they are not only the entry point for the national health information system but also use health data (and primarily) for patient care. Design A multiple case study was carried out between March and August 2012 at the antenatal care (ANC) clinics of two private and one public Kenyan hospital to describe clinical information systems and assess the quality of information. The following methods were developed and employed in an iterative process: workplace walkthroughs, structured and in-depth interviews with staff members, and a quantitative assessment of data quality (completeness and accurate transmission of clinical information and reports in ANC). Views of staff and management on the quality of employed information systems, data quality, and influencing factors were captured qualitatively. Results Staff rated the quality of information higher in the private hospitals employing computers than in the public hospital which relies on paper forms. Several potential threats to data quality were reported. Limitations in data quality were common at all study sites including wrong test results, missing registers, and inconsistencies in reports. Feedback was seldom given on the content or quality of reports, and usage of data beyond individual patient care was low. Conclusions We argue that the limited data quality has to be seen in the broader perspective of the information systems in which it is produced and used. The combination of different methods has proven to be useful for this. To improve the effectiveness and capabilities of these systems, combined measures are needed which include technical and organizational aspects (e.g. regular feedback to health workers) and individual skills and motivation. PMID:23993022

  12. Information entropy method and the description of echo hologram formation in gaseous media

    Science.gov (United States)

    Garnaeva, G. I.; Nefediev, L. A.; Akhmedshina, E. N.

    2018-02-01

    The effect of collisions that change the velocity of gas particles on the value of information entropy is related to the spectral structure of the echo hologram's response, whose temporal form is considered. It is shown that collisions with a change in gas particle velocity increase the 'parasitic' information, against the background of which the information contained in the temporal shape of the object laser pulse is lost.

  13. Method for gathering and summarizing internet information

    Energy Technology Data Exchange (ETDEWEB)

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  14. Bayesian Analysis of the Cosmic Microwave Background

    Science.gov (United States)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP 1- and 3-year temperature and polarization data). Development is continuing for Planck - the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters.
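
    Gibbs sampling itself is generic: each unknown is drawn in turn from its conditional distribution given the current values of all the others. The toy sketch below applies the idea to a bivariate normal; it illustrates only the principle, not the CMB power-spectrum sampler described above.

```python
import numpy as np

# Toy Gibbs sampler for a zero-mean bivariate normal with correlation rho:
# x | y ~ N(rho*y, 1 - rho^2)  and  y | x ~ N(rho*x, 1 - rho^2).
rng = np.random.default_rng(42)
rho, n_iter = 0.8, 20000
x, y = 0.0, 0.0
samples = np.empty((n_iter, 2))
for i in range(n_iter):
    x = rng.normal(rho * y, np.sqrt(1 - rho ** 2))   # draw x given y
    y = rng.normal(rho * x, np.sqrt(1 - rho ** 2))   # draw y given x
    samples[i] = (x, y)

# After discarding burn-in, the sample correlation should be close to rho.
print(np.corrcoef(samples[2000:].T)[0, 1])
```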

  15. Bremsstrahlung background and quantitation of thin biological samples

    International Nuclear Information System (INIS)

    Khan, K.M.

    1991-02-01

    An inherent feature of all electron-excited X-ray spectra is the presence of a background upon which the characteristic peaks of interest are superimposed. To extract the x-ray intensity of the characteristic lines of interest, it is necessary to subtract this background, which may be due to both specimen-generated bremsstrahlung and extraneous sources, from the measured x-ray spectrum. Some conventional methods of background subtraction are briefly reviewed. It has been found that these conventional methods do not give sufficiently accurate results in biological analysis, where the peaks of interest are not well separated and most of the peaks lie in a region where the background is appreciably curved. An alternative approach to background subtraction for such samples is investigated, which involves modelling the background using current knowledge of x-ray physics and energy dispersive detectors. This method is particularly suitable for biological samples as well as other samples having an organic matrix. 6 figs. (author)
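
    One simple physics-based continuum model of the kind alluded to above is a Kramers-law shape fitted to peak-free regions of the spectrum and then subtracted. The sketch below does this with a single scale parameter, ignores detector efficiency, and uses entirely synthetic data; it is an illustration of the modelling idea, not the author's model.

```python
import numpy as np
from scipy.optimize import curve_fit

def kramers(E, a, E0=20.0):
    """Kramers-law continuum shape: intensity proportional to (E0 - E) / E."""
    return a * np.clip(E0 - E, 0.0, None) / E

# Hypothetical spectrum: energy axis (keV) and measured counts with one peak.
E = np.linspace(1.0, 19.5, 400)
rng = np.random.default_rng(3)
model = kramers(E, 500.0) + 800.0 * np.exp(-((E - 6.4) / 0.1) ** 2)  # Fe Ka-like peak
counts = rng.poisson(model).astype(float)

# Fit the continuum only in analyst-chosen peak-free windows, then subtract.
bg_windows = (E < 5.5) | ((E > 7.5) & (E < 19.0))
popt, _ = curve_fit(lambda e, a: kramers(e, a),
                    E[bg_windows], counts[bg_windows], p0=[400.0])
net = counts - kramers(E, *popt)      # background-subtracted spectrum
```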

  16. Investigating the feasibility of using partial least squares as a method of extracting salient information for the evaluation of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, George Z.; Myers, Kyle J.; Park, Subok

    2013-03-01

    Digital breast tomosynthesis (DBT) has shown promise for improving the detection of breast cancer, but it has not yet been fully optimized due to a large space of system parameters to explore. A task-based statistical approach [1] is a rigorous method for evaluating and optimizing this promising imaging technique with the use of optimal observers such as the Hotelling observer (HO). However, the high data dimensionality found in DBT has been the bottleneck for the use of a task-based approach in DBT evaluation. To reduce data dimensionality while extracting salient information for performing a given task, efficient channels have to be used for the HO. In the past few years, 2D Laguerre-Gauss (LG) channels, which are a complete basis for stationary backgrounds and rotationally symmetric signals, have been utilized for DBT evaluation [2, 3]. But since background and signal statistics from DBT data are neither stationary nor rotationally symmetric, LG channels may not be efficient in providing reliable performance trends as a function of system parameters. Recently, partial least squares (PLS) has been shown to generate efficient channels for the Hotelling observer in detection tasks involving random backgrounds and signals [4]. In this study, we investigate the use of PLS as a method for extracting salient information from DBT in order to better evaluate such systems.
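
    For readers unfamiliar with channelized observers, the sketch below derives PLS channels from labelled image data and uses them in a channelized Hotelling observer. The random arrays merely stand in for DBT regions of interest, and all shapes and parameter values are assumptions rather than the study's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical training data: flattened ROI images and class labels
# (0 = background only, 1 = signal present). Shapes are illustrative.
rng = np.random.default_rng(0)
n_train, n_pix = 400, 64 * 64
X = rng.normal(size=(n_train, n_pix))            # stand-in for DBT ROIs
y = rng.integers(0, 2, size=n_train).astype(float)

# Derive channels with PLS: the X-weights serve as channel vectors.
n_channels = 10
pls = PLSRegression(n_components=n_channels)
pls.fit(X, y)
channels = pls.x_weights_                        # shape (n_pix, n_channels)

# Channelize the data and build a channelized Hotelling observer.
v = X @ channels                                 # channel outputs
v0, v1 = v[y == 0], v[y == 1]
S = 0.5 * (np.cov(v0, rowvar=False) + np.cov(v1, rowvar=False))
w = np.linalg.solve(S, v1.mean(axis=0) - v0.mean(axis=0))   # CHO template
t = v @ w                                        # observer test statistics
```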

  17. Securing recruitment and obtaining informed consent in minority ethnic groups in the UK

    Directory of Open Access Journals (Sweden)

    Roy Tapash

    2008-03-01

    Full Text Available Abstract Background Previous health research has often explicitly excluded individuals from minority ethnic backgrounds due to perceived cultural and communication difficulties, including studies where there might be language/literacy problems in obtaining informed consent. This study addressed these difficulties by developing audio-recorded methods of obtaining informed consent and recording data. This report outlines (1) our experiences with securing recruitment to a qualitative study investigating alternative methods of data collection, and (2) the development of a standardised process for obtaining informed consent from individuals from minority ethnic backgrounds whose main language does not have an agreed written form. Methods Two researchers from South Asian backgrounds recruited adults with Type 2 diabetes whose main language was spoken and not written, to attend a series of focus groups. A screening tool was used at recruitment in order to assess literacy skills in potential participants. Informed consent was obtained using audio-recordings of the patient information and recording patients' verbal consent. Participants' perceptions of this method of obtaining consent were recorded. Results Recruitment rates were improved by using telephone compared to face-to-face methods. The screening tool was found to be acceptable by all potential participants. Audio-recorded methods of obtaining informed consent were easy to implement and accepted by all participants. Attrition rates differed according to ethnic group. Snowballing techniques only partly improved participation rates. Conclusion Audio-recorded methods of obtaining informed consent are an acceptable alternative to written consent in study populations where literacy skills are variable. Further exploration of issues relating to attrition is required, and a range of methods may be necessary in order to maximise response and participation rates.

  18. 77 FR 31017 - Office of Facilities Management and Program Services; Information Collection; Background...

    Science.gov (United States)

    2012-05-24

    ... 3090-0287, Background Investigations for Child Care Workers. Instructions: Please submit comments only... request for review and approval for background check investigations of child care workers, form GSA 176C... Child Care Workers AGENCY: Office of Facilities Management and Program Services, Public Building Service...

  19. Methods of determining information needs for control

    Energy Technology Data Exchange (ETDEWEB)

    Borkowski, Z.

    1980-01-01

    Work has begun in the Main Data Center in the field of mining (Poland) on evaluating and improving methods of determining the information requirements necessary for control. Existing methods are briefly surveyed and their shortcomings are shown. The complexity of characterizing this problem is pointed out.

  20. George Smoot, Blackbody, and Anisotropy of the Cosmic Microwave Background

    Science.gov (United States)

    Resources with additional information, including videos, on George Smoot and the anisotropy of the cosmic microwave background radiation. Smoot previously won the Ernest Orlando Lawrence Award. Information about Smoot, blackbody radiation, and the anisotropy of the Cosmic Microwave Background (CMB) is available in full.

  1. Multi-Dimensional Analysis of Dynamic Human Information Interaction

    Science.gov (United States)

    Park, Minsoo

    2013-01-01

    Introduction: This study aims to understand the interactions of perception, effort, emotion, time and performance during the performance of multiple information tasks using Web information technologies. Method: Twenty volunteers from a university participated in this study. Questionnaires were used to obtain general background information and…

  2. Simple analytical methods for computing the gravity-wave contribution to the cosmic background radiation anisotropy

    International Nuclear Information System (INIS)

    Wang, Y.

    1996-01-01

    We present two simple analytical methods for computing the gravity-wave contribution to the cosmic background radiation (CBR) anisotropy in inflationary models; one method uses a time-dependent transfer function, the other uses an approximate gravity-wave mode function which is a simple combination of the lowest order spherical Bessel functions. We compare the CBR anisotropy tensor multipole spectrum computed using our methods with the previous result of the highly accurate numerical method, the "Boltzmann" method. Our time-dependent transfer function is more accurate than the time-independent transfer function found by Turner, White, and Lindsey; however, we find that the transfer function method is only good for l ≲ 120. Using our approximate gravity-wave mode function, we obtain much better accuracy; the tensor multipole spectrum we find differs by less than 2% for l ≲ 50, less than 10% for l ≲ 120, and less than 20% for l ≤ 300 from the "Boltzmann" result. Our approximate graviton mode function should be quite useful in studying tensor perturbations from inflationary models. copyright 1996 The American Physical Society

  3. Lewis Information Network (LINK): Background and overview

    Science.gov (United States)

    Schulte, Roger R.

    1987-01-01

    The NASA Lewis Research Center supports many research facilities with many isolated buildings, including wind tunnels, test cells, and research laboratories. These facilities are all located on a 350 acre campus adjacent to the Cleveland Hopkins Airport. The function of NASA-Lewis is to do basic and applied research in all areas of aeronautics, fluid mechanics, materials and structures, space propulsion, and energy systems. These functions require a great variety of remote high speed, high volume data communications for computing and interactive graphic capabilities. In addition, new requirements for local distribution of intercenter video teleconferencing and data communications via satellite have developed. To address these and future communications requirements for the next 15 yrs, a project team was organized to design and implement a new high speed communication system that would handle both data and video information in a common lab-wide Local Area Network. The project team selected cable television broadband coaxial cable technology as the communications medium and first installation of in-ground cable began in the summer of 1980. The Lewis Information Network (LINK) became operational in August 1982 and has become the backbone of all data communications and video.

  4. A human-machine interface evaluation method: A difficulty evaluation method in information searching (DEMIS)

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2009-01-01

    A human-machine interface (HMI) evaluation method, named the 'difficulty evaluation method in information searching' (DEMIS), is proposed and demonstrated with an experimental study. The DEMIS is based on a human performance model and two measures of attentional-resource effectiveness in monitoring and detection tasks in nuclear power plants (NPPs). Operator competence and HMI design are modeled as the most significant factors in human performance. One of the two effectiveness measures is the fixation-to-importance ratio (FIR), which represents the attentional resource (eye fixations) spent on an information source compared to the importance of that information source. The other measure is selective attention effectiveness (SAE), which incorporates the FIRs for all information sources. The underlying principle of the measures is that an information source should be selectively attended to according to its informational importance. In this study, poor performance in information searching tasks is modeled as being coupled with difficulties caused by poor operator mental models and/or poor HMI design. Human performance in information searching tasks is evaluated by analyzing the FIR and the SAE. Operator mental models are evaluated by a questionnaire-based method. Difficulties caused by a poor HMI design are then evaluated by a focused interview based on the FIR evaluation, and root causes leading to poor performance are identified in a systematic way.
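
    As a rough numerical illustration of the FIR idea, the sketch below computes a fixation-to-importance ratio per information source and one plausible aggregate over all sources. The abstract does not give the exact SAE formula, so the importance-weighted aggregation used here is an assumption, as are all of the numbers.

```python
import numpy as np

# Hypothetical eye-tracking data: fixation counts on four information
# sources and their (normalized) informational importance for the task.
fixations = np.array([120, 40, 25, 15], dtype=float)
importance = np.array([0.50, 0.25, 0.15, 0.10])

# Fixation-to-importance ratio: share of attention relative to importance.
fix_share = fixations / fixations.sum()
fir = fix_share / importance        # FIR near 1 means attention matches importance

# One plausible aggregate (assumed form, not the paper's definition):
# importance-weighted closeness of each FIR to the ideal value of 1,
# so larger values indicate better selective attention effectiveness.
sae = float(np.sum(importance / (1.0 + np.abs(fir - 1.0))))
print(np.round(fir, 2), round(sae, 3))
```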

  5. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to the analysis of financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report results for the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show the difference in similarity between the stock markets in different time periods, and that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three regions, showing that the method can distinguish the markets of different regions from the resulting phylogenetic trees. The results show that we can obtain useful information from financial markets with this method. The information categorization method can be used not only for physiologic time series, but also for financial time series.
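
    The abstract does not spell out the information-categorization distance itself, so the sketch below substitutes a simple correlation-based distance between return series and builds the "phylogenetic tree" by hierarchical clustering. It illustrates the overall pipeline (pairwise distances between sequences, then a tree of markets), not the paper's exact metric; the index names and data are placeholders.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

# Hypothetical daily log-returns for five stock indices driven by three
# regional factors (rows = days, columns = indices).
rng = np.random.default_rng(7)
names = ["US_A", "US_B", "CN_A", "CN_B", "EU_A"]   # placeholder index names
base = rng.normal(size=(1000, 3))                  # three regional factors
us = [base[:, 0] + 0.3 * rng.normal(size=1000) for _ in range(2)]
cn = [base[:, 1] + 0.3 * rng.normal(size=1000) for _ in range(2)]
eu = [base[:, 2] + 0.3 * rng.normal(size=1000)]
returns = np.column_stack(us + cn + eu)

# Pairwise distance between sequences; here d = sqrt(2 * (1 - correlation)).
corr = np.corrcoef(returns.T)
dist = np.sqrt(2.0 * (1.0 - corr))
np.fill_diagonal(dist, 0.0)

# Hierarchical clustering produces the tree ("phylogenetic tree") of markets,
# which groups the indices by region in this synthetic example.
tree = linkage(squareform(dist, checks=False), method="average")
dendrogram(tree, labels=names, no_plot=True)
```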

  6. Danish extreme wind atlas: Background and methods for a WAsP engineering option

    Energy Technology Data Exchange (ETDEWEB)

    Rathmann, O; Kristensen, L; Mann, J [Risoe National Lab., Wind Energy and Atmospheric Physics Dept., Roskilde (Denmark); Hansen, S O [Svend Ole Hansen ApS, Copenhagen (Denmark)

    1999-03-01

    Extreme wind statistics are necessary design information when establishing wind farms and erecting bridges, buildings and other structures in the open air. Normal mean wind statistics, in terms of directional and speed distributions, may be estimated by wind atlas methods and are used to estimate e.g. the annual energy output of wind turbines. The purpose of the present work is to extend the wind atlas method to also include local extreme wind statistics, so that an extreme value such as the 50-year wind can be estimated at locations of interest. Together with turbulence estimates, such information is important for determining the strength wind turbines or structures need in order to withstand high wind loads. In the `WAsP Engineering` computer program, a flow model which includes a model for the dynamic roughness of water surfaces is used to realise such an extended wind atlas method. Based on an extended wind atlas that also contains extreme wind statistics, this allows the program to estimate extreme winds in addition to mean winds and turbulence intensities at specified positions and heights. (au) EFP-97. 15 refs.
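
    A common way to estimate a T-year wind from extreme wind statistics is to fit a Gumbel (Type I extreme value) distribution to annual maxima and read off the quantile with annual exceedance probability 1/T. The sketch below does exactly that on invented data; it is a generic illustration, not the WAsP Engineering implementation.

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum 10-min mean wind speeds (m/s) at one site.
annual_maxima = np.array([21.3, 24.1, 19.8, 26.5, 22.7, 23.9, 25.2,
                          20.4, 27.8, 22.1, 24.6, 21.9, 23.3, 26.0])

# Fit a Gumbel (Type I extreme value) distribution to the annual maxima.
loc, scale = stats.gumbel_r.fit(annual_maxima)

# The T-year wind is the quantile with annual exceedance probability 1/T.
T = 50.0
v_T = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
print(f"Estimated {T:.0f}-year wind: {v_T:.1f} m/s")
```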

  7. Research on the algorithm of infrared target detection based on the frame difference and background subtraction method

    Science.gov (United States)

    Liu, Yun; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Hui, Mei; Liu, Xiaohua; Wu, Yijian

    2015-09-01

    As an important branch of infrared imaging technology, infrared target tracking and detection has great scientific value and a wide range of applications in both military and civilian areas. For infrared images, which are characterized by low SNR and severe background noise, an effective target detection algorithm based on OpenCV is proposed in this paper, exploiting the frame-to-frame correlation of the moving target and the lack of correlation of the noise in sequential images. Firstly, since temporal differencing and background subtraction are highly complementary, we use a combined detection method of frame differencing and background subtraction based on adaptive background updating. Results indicate that it is simple and can stably extract the foreground moving target from the video sequence. Because the background updating mechanism continuously updates each pixel, the infrared moving target can be detected more accurately. This paves the way for eventually realizing real-time infrared target detection and tracking, once the OpenCV algorithms are ported to a DSP platform. Afterwards, we use an optimal thresholding algorithm to segment the image, converting the grayscale images to binary images in order to provide better conditions for detection in the image sequences. Finally, using the correlation of moving objects between frames and mathematical morphology processing, we can eliminate noise, reduce spurious regions, and smooth region boundaries. Experimental results prove that our algorithm achieves rapid detection of small infrared targets.
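
    A minimal OpenCV sketch of the combined pipeline described above (frame differencing, background subtraction with a running-average background, Otsu thresholding and morphological clean-up) is given below. The video filename, learning rate and kernel size are placeholder assumptions, not the authors' settings.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("video.avi")              # placeholder input file
ret, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
background = prev_gray.astype(np.float32)        # adaptive background model
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Frame difference: exploits the target's frame-to-frame correlation.
    frame_diff = cv2.absdiff(gray, prev_gray)

    # Background subtraction against the slowly updated background model.
    bg_diff = cv2.absdiff(gray, cv2.convertScaleAbs(background))

    # Combine the two cues and segment with Otsu (optimal) thresholding.
    combined = cv2.bitwise_and(frame_diff, bg_diff)
    _, mask = cv2.threshold(combined, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Morphological opening removes isolated noise and smooths boundaries.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Update the adaptive background (learning rate 0.05 is an assumed value).
    cv2.accumulateWeighted(gray, background, 0.05)
    prev_gray = gray

cap.release()
```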

  8. Determination of the input parameters for inelastic background analysis combined with HAXPES using a reference sample

    DEFF Research Database (Denmark)

    Zborowski, C.; Renault, O; Torres, A

    2018-01-01

    The recent progress in HAXPES combined with Inelastic Background Analysis makes this method a powerful, non-destructive solution to get quantitative information on deeply buried layers and interfaces at depths up to 70 nm. However, we recently highlighted the need for carefully choosing the scat

  9. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed

  10. A comparison of methods used to calculate normal background concentrations of potentially toxic elements for urban soil

    Energy Technology Data Exchange (ETDEWEB)

    Rothwell, Katherine A., E-mail: k.rothwell@ncl.ac.uk; Cooke, Martin P., E-mail: martin.cooke@ncl.ac.uk

    2015-11-01

    To meet the requirements of regulation and to provide realistic remedial targets there is a need for the background concentration of potentially toxic elements (PTEs) in soils to be considered when assessing contaminated land. In England, normal background concentrations (NBCs) have been published for several priority contaminants for a number of spatial domains however updated regulatory guidance places the responsibility on Local Authorities to set NBCs for their jurisdiction. Due to the unique geochemical nature of urban areas, Local Authorities need to define NBC values specific to their area, which the national data is unable to provide. This study aims to calculate NBC levels for Gateshead, an urban Metropolitan Borough in the North East of England, using freely available data. The ‘median + 2MAD’, boxplot upper whisker and English NBC (according to the method adopted by the British Geological Survey) methods were compared for test PTEs lead, arsenic and cadmium. Due to the lack of systematically collected data for Gateshead in the national soil chemistry database, the use of site investigation (SI) data collected during the planning process was investigated. 12,087 SI soil chemistry data points were incorporated into a database and 27 comparison samples were taken from undisturbed locations across Gateshead. The SI data gave high resolution coverage of the area and Mann–Whitney tests confirmed statistical similarity for the undisturbed comparison samples and the SI data. SI data was successfully used to calculate NBCs for Gateshead and the median + 2MAD method was selected as most appropriate by the Local Authority according to the precautionary principle as it consistently provided the most conservative NBC values. The use of this data set provides a freely available, high resolution source of data that can be used for a range of environmental applications. - Highlights: • The use of site investigation data is proposed for land contamination studies
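
    To make the compared estimators concrete, the sketch below computes the 'median + 2MAD' and boxplot upper-whisker values for a hypothetical vector of soil lead concentrations. The data, the log-transform and the MAD scaling factor are illustrative assumptions rather than details of the study or of the BGS method.

```python
import numpy as np

# Hypothetical soil lead concentrations (mg/kg) for one spatial domain.
conc = np.array([45., 60., 72., 85., 90., 110., 130., 150., 160.,
                 180., 210., 250., 320., 400., 520., 800.])
x = np.log10(conc)            # skewed geochemical data are often log-transformed

# Method 1: median + 2 * MAD (MAD scaled to be consistent with the SD).
med = np.median(x)
mad = 1.4826 * np.median(np.abs(x - med))
nbc_mad = 10 ** (med + 2.0 * mad)

# Method 2: boxplot upper whisker (Q3 + 1.5 * IQR).
q1, q3 = np.percentile(x, [25, 75])
nbc_whisker = 10 ** (q3 + 1.5 * (q3 - q1))

print(f"median + 2MAD NBC:  {nbc_mad:.0f} mg/kg")
print(f"upper-whisker NBC: {nbc_whisker:.0f} mg/kg")
```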

  11. Knowledge and information needs of young people with epilepsy and their parents: Mixed-method systematic review

    Directory of Open Access Journals (Sweden)

    Noyes Jane

    2010-12-01

    Full Text Available Abstract Background Young people with neurological impairments such as epilepsy are known to receive less adequate services compared to young people with other long-term conditions. The time (age 13-19 years) around transition to adult services is particularly important in facilitating young people's self-care and ongoing management. There are epilepsy-specific, biological and psycho-social factors that act as barriers and enablers to information exchange and the nurturing of self-care practices. Review objectives were to identify what is known to be effective in delivering information to young people aged 13-19 years with epilepsy and their parents, to describe their experiences of information exchange in healthcare contexts, and to identify factors influencing positive and negative healthcare communication. Methods The Evidence for Policy and Practice Information Coordinating Centre systematic mixed-method approach was adapted to locate, appraise, extract and synthesise evidence. We used Ley's cognitive hypothetical model of communication and subsequently developed a theoretical framework explaining information exchange in healthcare contexts. Results Young people and parents believed that healthcare professionals were only interested in medical management. Young people felt that discussions about their epilepsy primarily occurred between professionals and parents. Epilepsy information that young people obtained from parents or from their own efforts increased the risk of epilepsy misconceptions. Accurate epilepsy knowledge aided psychosocial adjustment. There is some evidence that interventions, when delivered in a structured, psycho-educational, age-appropriate way, increased young people's epilepsy knowledge, with a positive trend towards improving quality of life. We used mainly qualitative and mixed-method evidence to develop a theoretical framework explaining information exchange in clinical encounters. Conclusions There is a paucity of evidence

  12. Research methods in information

    CERN Document Server

    Pickard, Alison Jane

    2013-01-01

    The long-awaited 2nd edition of this best-selling research methods handbook is fully updated and includes brand new coverage of online research methods and techniques, mixed methodology and qualitative analysis. There is an entire chapter contributed by Professor Julie McLeod, Sue Childs and Elizabeth Lomas focusing on research data management, applying evidence from the recent JISC funded 'DATUM' project. The first to focus entirely on the needs of the information and communications community, it guides the would-be researcher through the variety of possibilities open to them under the heading "research" and provides students with the confidence to embark on their dissertations. The focus here is on the 'doing' and although the philosophy and theory of research is explored to provide context, this is essentially a practical exploration of the whole research process with each chapter fully supported by examples and exercises tried and tested over a whole teaching career. The book will take readers through eac...

  13. An Improved Information Hiding Method Based on Sparse Representation

    Directory of Open Access Journals (Sweden)

    Minghai Yao

    2015-01-01

    Full Text Available A novel biometric authentication information hiding method based on sparse representation is proposed for enhancing the security of biometric information transmitted in the network. In order to make good use of the abundant information in the cover image, the sparse representation method is adopted to exploit the correlation between the cover and biometric images. Thus, the biometric image is divided into two parts. The first part is the reconstructed image, and the other part is the residual image. The biometric authentication image cannot be restored from either part alone. The residual image and the sparse representation coefficients are embedded into the cover image. Then, to attract less attention from attackers, a visual attention mechanism is employed to select the embedding location and embedding sequence of the secret information. Finally, a reversible watermarking algorithm based on histograms is utilized to embed the secret information. To verify the validity of the algorithm, the PolyU multispectral palmprint and the CASIA iris databases are used as biometric information. The experimental results show that the proposed method exhibits good security, invisibility, and high capacity.

  14. Informational Urbanism

    Directory of Open Access Journals (Sweden)

    Wolfgang G. Stock

    2015-10-01

    Full Text Available Contemporary and future cities are often labeled as "smart cities," "ubiquitous cities," "knowledge cities" and "creative cities." Informational urbanism includes all aspects of information and knowledge with regard to urban regions. "Informational city" is an umbrella term uniting the divergent trends of information-related city research. Informational urbanism is an interdisciplinary endeavor incorporating on the one side computer science and information science and on the other side urbanism, architecture, (city) economics, and (city) sociology. In our research project on informational cities, we visited more than 40 metropolises and smaller towns all over the world. In this paper, we sketch the theoretical background on a journey from Max Weber to the Internet of Things, introduce our research methods, and describe the main results on characteristics of informational cities as prototypical cities of the emerging knowledge society.

  15. The natural radiation background

    International Nuclear Information System (INIS)

    Duggleby, J.C.

    1982-01-01

    The components of the natural background radiation and their variations are described. Cosmic radiation is a major contributor to the external dose to the human body whilst naturally-occurring radionuclides of primordial and cosmogenic origin contribute to both the external and internal doses, with the primordial radionuclides being the major contributor in both cases. Man has continually modified the radiation dose to which he has been subjected. The two traditional methods of measuring background radiation, ionisation chamber measurements and scintillation counting, are looked at and the prospect of using thermoluminescent dosimetry is considered

  16. Background enhancement in breast MR: Correlation with breast density in mammography and background echotexture in ultrasound

    International Nuclear Information System (INIS)

    Ko, Eun Sook; Lee, Byung Hee; Choi, Hye Young; Kim, Rock Bum; Noh, Woo-Chul

    2011-01-01

    Objective: This study aimed to determine whether background enhancement on MR was related to mammographic breast density or ultrasonographic background echotexture in premenopausal and postmenopausal women. Materials and methods: We studied 142 patients (79 premenopausal, 63 postmenopausal) who underwent mammography, ultrasonography, and breast MR. We reviewed the mammography for overall breast density of the contralateral normal breast according to the four-point scale of the BI-RADS classification. Ultrasound findings were classified as homogeneous or heterogeneous background echotexture according to the BI-RADS lexicon. We rated background enhancement on a contralateral breast MR into four categories based on subtraction images: absent, mild, moderate, and marked. All imaging findings were interpreted independently by two readers without knowledge of menstrual status, imaging findings of other modalities. Results: There were significant differences between the premenopausal and postmenopausal group in distribution of mammographic breast density, ultrasonographic background echotexture, and degree of background enhancement. Regarding the relationship between mammographic density and background enhancement, there was no significant correlation. There was significant relationship between ultrasonographic background echotexture and background enhancement in both premenopausal and postmenopausal groups. Conclusion: There is a significant correlation between ultrasonographic background echotexture and background enhancement in MR regardless of menopausal status. Interpreting breast MR, or scheduling for breast MR of women showing heterogeneous background echotexture needs more caution.

  17. The effect of background music in auditory health persuasion

    NARCIS (Netherlands)

    Elbert, Sarah; Dijkstra, Arie

    2013-01-01

    In auditory health persuasion, threatening information regarding health is communicated by voice only. One relevant context of auditory persuasion is the addition of background music. There are different mechanisms through which background music might influence persuasion, for example through mood

  18. 32 CFR 701.40 - Background.

    Science.gov (United States)

    2010-07-01

    ... OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC FOIA Fees § 701.40 Background. (a) The DON follows the uniform fee schedule... Freedom of Information Act Fee Schedule and Guidelines. (b) Fees reflect direct costs for search; review...

  19. Analysis of an automated background correction method for cardiovascular MR phase contrast imaging in children and young adults

    Energy Technology Data Exchange (ETDEWEB)

    Rigsby, Cynthia K.; Hilpipre, Nicholas; Boylan, Emma E.; Popescu, Andrada R.; Deng, Jie [Ann and Robert H. Lurie Children's Hospital of Chicago, Department of Medical Imaging, Chicago, IL (United States); McNeal, Gary R. [Siemens Medical Solutions USA Inc., Customer Solutions Group, Cardiovascular MR R and D, Chicago, IL (United States); Zhang, Gang [Ann and Robert H. Lurie Children's Hospital of Chicago Research Center, Biostatistics Research Core, Chicago, IL (United States); Choi, Grace [Ann and Robert H. Lurie Children's Hospital of Chicago, Department of Pediatrics, Chicago, IL (United States); Greiser, Andreas [Siemens AG Healthcare Sector, Erlangen (Germany)

    2014-03-15

    Phase contrast magnetic resonance imaging (MRI) is a powerful tool for evaluating vessel blood flow. Inherent errors in acquisition, such as phase offset, eddy currents and gradient field effects, can cause significant inaccuracies in flow parameters. These errors can be rectified with the use of background correction software. To evaluate the performance of an automated phase contrast MRI background phase correction method in children and young adults undergoing cardiac MR imaging. We conducted a retrospective review of patients undergoing routine clinical cardiac MRI including phase contrast MRI for flow quantification in the aorta (Ao) and main pulmonary artery (MPA). When phase contrast MRI of the right and left pulmonary arteries was also performed, these data were included. We excluded patients with known shunts and metallic implants causing visible MRI artifact and those with more than mild to moderate aortic or pulmonary stenosis. Phase contrast MRI of the Ao, mid MPA, proximal right pulmonary artery (RPA) and left pulmonary artery (LPA) using 2-D gradient echo Fast Low Angle SHot (FLASH) imaging was acquired during normal respiration with retrospective cardiac gating. Standard phase image reconstruction and the automatic spatially dependent background-phase-corrected reconstruction were performed on each phase contrast MRI dataset. Non-background-corrected and background-phase-corrected net flow, forward flow, regurgitant volume, regurgitant fraction, and vessel cardiac output were recorded for each vessel. We compared standard non-background-corrected and background-phase-corrected mean flow values for the Ao and MPA. The ratio of pulmonary to systemic blood flow (Qp:Qs) was calculated for the standard non-background and background-phase-corrected data and these values were compared to each other and for proximity to 1. In a subset of patients who also underwent phase contrast MRI of the MPA, RPA, and LPA a comparison was made between standard non-background
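
    One common background-phase correction strategy, broadly in the spirit of the software evaluated above, is to fit a low-order 2-D polynomial to the phase (velocity) of static tissue and subtract it from the whole map. The sketch below is a generic illustration of that idea with synthetic data and an assumed static-tissue mask; it is not the vendor's automated algorithm.

```python
import numpy as np

def correct_background_phase(velocity, static_mask, order=1):
    """Fit a 2-D polynomial to velocity values in static tissue and subtract it."""
    ny, nx = velocity.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    # Polynomial design matrix with all terms x^i * y^j up to the given order.
    terms = [(i, j) for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack([(xx ** i * yy ** j).ravel() for i, j in terms])
    b = velocity.ravel()
    m = static_mask.ravel()
    coef, *_ = np.linalg.lstsq(A[m], b[m], rcond=None)   # fit on static voxels only
    background = (A @ coef).reshape(ny, nx)
    return velocity - background

# Usage with synthetic data: a noisy velocity map with a linear phase offset.
vel = np.random.normal(0, 1, (64, 64)) + np.linspace(-2, 2, 64)[None, :]
mask = np.abs(vel) < 2.5            # crude stand-in for a static-tissue mask
vel_corr = correct_background_phase(vel, mask, order=1)
```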

  20. Research Investigation of Information Access Methods

    Science.gov (United States)

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived that satisfaction ratings are used to determine the user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  1. Measuring background by the DIN-1M spectrometer using the oscillating absorbing screen method

    International Nuclear Information System (INIS)

    Glazkov, Yu.Yu.; Liforov, V.G.; Novikov, A.G.; Parfenov, V.A.; Semenov, V.A.

    1982-01-01

    A technique for measuring background with a double-pulse slow-neutron spectrometer is described. To measure the background, an oscillating absorbing screen (OAS) periodically blocking the primary neutron beam at the input of the mechanical chopper was used. While the beam is blocked, the monochromatic neutrons that produce the effect are removed from the beam, while the general background conditions remain practically unchanged. Oscillation of the screen makes it possible to measure the effect and background neutrons essentially simultaneously. The optimal period of oscillation is approximately 3 min. Analysis of neutron spectra scattered by different materials, together with the corresponding background curves measured with the OAS technique, shows that the share of monochromatic neutrons passing through the screen constitutes less than 1% of the elastic peak and that the relative decrease of the total background level does not exceed 1.5-2%.

  2. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate calculations performed to understand accident scenarios in water cooled nuclear reactors. The need arises from the imperfection of computational tools on the one hand and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating the uncertainties associated with the predictions of complex system codes: the propagation of code input error and the propagation of the calculation output error constitute the key concepts for identifying the methods of current interest for industrial applications. With the developed methods, uncertainty bands (both upper and lower) can be derived for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain a code with the capability of Internal Assessment of Uncertainty, whose features are discussed in more detail.

  3. 48 CFR 1205.101 - Methods of disseminating information.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Methods of disseminating information. 1205.101 Section 1205.101 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION... disseminating information. (b) The DOT Office of Small and Disadvantaged Business Utilization (S-40), 400 7th...

  4. Classifying and Designing the Educational Methods with Information Communications Technoligies

    Directory of Open Access Journals (Sweden)

    I. N. Semenova

    2013-01-01

    Full Text Available The article describes the conceptual apparatus for implementing Information and Communications Technologies (ICT) in education. The authors suggest classification variants for the related teaching methods according to the following component combinations: types of students' work with information, goals of ICT incorporation into the training process, degrees of individualization, contingent involvement, activity levels and pedagogical field targets, ideology of informational didactics, etc. Each classification can address educational tasks in the context of a partial paradigm of modern didactics; each kind of method implies a particular combination of activities in the educational environment. The whole spectrum of classifications provides the informational and functional basis for the adequate selection of the necessary teaching methods in accordance with the specified goals and planned results. Potential variants of ICT implementation methods are given for different teaching models.

  5. Depression and suicidal behavior in adolescents: a multi-informant and multi-methods approach to diagnostic classification.

    Directory of Open Access Journals (Sweden)

    Andrew James Lewis

    2014-07-01

    Full Text Available Background: Informant discrepancies have been reported between parent and adolescent measures of depressive disorders and suicidality. We aimed to examine the concordance between adolescent and parent ratings of depressive disorder using both clinical interview and questionnaire measures and to assess multi-informant and multi-method approaches to classification. Method: Within the context of assessment of eligibility for a randomized clinical trial, 50 parent–adolescent pairs (mean age of adolescents = 15.0 years) were interviewed separately with a structured diagnostic interview for depression, the KID-SCID. Adolescent self-report and parent-report versions of the Strengths and Difficulties Questionnaire, the Short Mood and Feelings Questionnaire and the Depressive Experiences Questionnaire were also administered. We examined the diagnostic concordance rates of the parent vs. adolescent structured interview methods and the prediction of adolescent diagnosis via questionnaire methods. Results: Parent proxy reporting of adolescent depression and suicidal thoughts and behavior is not strongly concordant with adolescent report. Adolescent self-reported symptoms on depression scales provide a more accurate report of diagnosable adolescent depression than parent proxy reports of adolescent depressive symptoms. Adolescent self-report measures can be combined to improve the accuracy of classification. Parents tend to over-report their adolescent's depressive symptoms while under-reporting their suicidal thoughts and behavior. Conclusion: Parent proxy report is clearly less reliable than the adolescent's own report of their symptoms and subjective experiences, and could be considered inaccurate for research purposes. While parent report would still be sought clinically where an adolescent refuses to provide information, our findings suggest that parent reporting of adolescent suicidality should be interpreted with caution.

  6. Slavnov-Taylor constraints for nontrivial backgrounds

    International Nuclear Information System (INIS)

    Binosi, D.; Quadri, A.

    2011-01-01

    We devise an algebraic procedure for the evaluation of Green's functions in SU(N) Yang-Mills theory in the presence of a nontrivial background field. In the ghost-free sector the dependence of the vertex functional on the background is shown to be uniquely determined by the Slavnov-Taylor identities in terms of a certain 1-PI correlator of the covariant derivatives of the ghost and the antighost fields. At nonvanishing background this amplitude is shown to encode the quantum deformations to the tree-level background-quantum splitting. The approach only relies on the functional identities of the model (Slavnov-Taylor identities, b-equation, antighost equation) and thus it is valid beyond perturbation theory, and, in particular, in a lattice implementation of the background field method. As an example of the formalism we analyze the ghost two-point function and the Kugo-Ojima function in an instanton background in SU(2) Yang-Mills theory, quantized in the background Landau gauge.

  7. Increasing Power by Sharing Information from Genetic Background and Treatment in Clustering of Gene Expression Time Series

    OpenAIRE

    Sura Zaki Alrashid; Muhammad Arifur Rahman; Nabeel H Al-Aaraji; Neil D Lawrence; Paul R Heath

    2018-01-01

    Clustering of gene expression time series gives insight into which genes may be co-regulated, allowing us to discern the activity of pathways in a given microarray experiment. Of particular interest is how a given group of genes varies with different conditions or genetic background. This paper develops a new clustering method that allows each cluster to be parameterised according to whether the behaviour of the genes across conditions is correlated or anti-correlated. By specifying correlati...

  8. Hospital discharge: What are the problems, information needs and objectives of community pharmacists? A mixed method approach

    Directory of Open Access Journals (Sweden)

    Brühwiler LD

    2017-09-01

    Full Text Available Background: After hospital discharge, community pharmacists are often the first health care professionals the discharged patient encounters. They reconcile and dispense prescribed medicines and provide pharmaceutical care. Compared to the roles of general practitioners, the pharmacists' needs to perform these tasks are not well known. Objective: This study aims to (a) identify community pharmacists' current problems and roles at hospital discharge, (b) assess their information needs, specifically the availability and usefulness of information, and (c) gain insight into pharmacists' objectives and ideas for discharge optimisation. Methods: A focus group was conducted with a sample of six community pharmacists from different Swiss regions. Based on these qualitative results, a nationwide online questionnaire was sent to 1348 Swiss pharmacies. Results: The focus group participants were concerned about their extensive workload with discharge prescriptions and about gaps in therapy. They emphasised the importance of more extensive information transfer. This applied especially to medication changes, unclear prescriptions, and information about a patient's care. Participants identified treatment continuity as a main objective when it comes to discharge optimisation. There were 194 questionnaires returned (response rate 14.4%). The majority of respondents reported fulfilling their role, as defined by the Joint FIP/WHO Guideline on Good Pharmacy Practice, rather badly. They reported many unavailable but useful information items, like therapy changes, allergies, specifications for “off-label” medication use or contact information. Information should be delivered in a structured way, but no clear preference for one particular transfer method was found. Pharmacists requested this information in order to improve treatment continuity and patient safety, and to be able to provide better pharmaceutical care services. Conclusion: Surveyed Swiss community

  9. Dim point target detection against bright background

    Science.gov (United States)

    Zhang, Yao; Zhang, Qiheng; Xu, Zhiyong; Xu, Junping

    2010-05-01

    For target detection within a large-field cluttered background from a long distance, several difficulties, involving low contrast between target and background, small pixel occupancy, illumination non-uniformity caused by lens vignetting, and system noise, make it a challenging problem. The existing approaches to dim target detection can be roughly divided into two categories: detection before tracking (DBT) and tracking before detection (TBD). The DBT-based scheme has been widely used in practical applications due to its simplicity, but it often requires working in situations with a higher signal-to-noise ratio (SNR). In contrast, the TBD-based methods can provide impressive detection results even in cases of very low SNR; unfortunately, the large memory requirement and high computational load prevent these methods from being used in real-time tasks. In this paper, we propose a new method for dim target detection. We address this problem by combining the advantages of the DBT-based scheme in computational efficiency and of the TBD-based scheme in detection capability. Our method first predicts the local background, and then employs energy accumulation and median filtering to remove background clutter. The dim target is finally located by double window filtering together with an improved high-order correlation which speeds up the convergence. The proposed method is implemented on a hardware platform and performs well in outdoor experiments.
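
    A minimal sketch of the background-prediction and clutter-suppression idea in Python: a plain median-filter background estimate followed by a mean-plus-k-sigma threshold. The window size, threshold factor and choice of filters are illustrative assumptions, not the values or the exact pipeline used in the paper.

      import numpy as np
      from scipy.ndimage import median_filter

      def suppress_background(frame, bg_window=15, k=3.0):
          # Predict the local background with a large median filter,
          # subtract it, and keep pixels that stand out from the residual.
          background = median_filter(frame.astype(float), size=bg_window)
          residual = frame - background            # clutter-reduced image
          threshold = residual.mean() + k * residual.std()
          candidates = residual > threshold        # candidate dim-target pixels
          return residual, candidates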

  10. Geometrical Fuzzy Search Method for the Business Information Security Systems

    Directory of Open Access Journals (Sweden)

    Grigory Grigorievich Novikov

    2014-12-01

    Full Text Available The main purpose of the article is to show how one of the new fuzzy search methods can be used for the information security of a business, or for other purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents; that is why many intelligence services favour the “mosaic” method of information collection. This article is about how to prevent it.

  11. A Model-Driven Development Method for Management Information Systems

    Science.gov (United States)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without using formal methods. With informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly respond to changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. The experiment has shown that the development effort is reduced by more than 30%.

  12. Measuring consumers' information acquisition and decision behavior with the computer-based information-display-matrix

    DEFF Research Database (Denmark)

    Aschemann-Witzel, Jessica; Hamm, Ulrich

    2011-01-01

    development of the method: starting points are choice of location, increased relevance of choice, individual adjustment of task structure, simplified navigation and realistic layout. Used in multi-measurement approaches, the IDM can provide detailed background information about consumer information behaviour...... prior to decisions reached in interviews or choice experiments. The contribution introduces the method and its development, use and (dis-)advantages. Results of a survey illustrate the options for analysis and indicate that consumer behaviour in the IDM, compared to face-to-face interviews, is less...

  13. Method and apparatus for reducing solvent luminescence background emissions

    Energy Technology Data Exchange (ETDEWEB)

    Affleck, Rhett L. (Los Alamos, NM); Ambrose, W. Patrick (Los Alamos, NM); Demas, James N. (Charlottesville, VA); Goodwin, Peter M. (Jemez Springs, NM); Johnson, Mitchell E. (Pittsburgh, PA); Keller, Richard A. (Los Alamos, NM); Petty, Jeffrey T. (Los Alamos, NM); Schecker, Jay A. (Santa Fe, NM); Wu, Ming (Los Alamos, NM)

    1998-01-01

    The detectability of luminescent molecules in solution is enhanced by reducing the background luminescence due to impurity species also present in the solution. A light source that illuminates the solution acts to photolyze the impurities so that the impurities do not luminesce in the fluorescence band of the molecule of interest. Molecules of interest may be carried through the photolysis region in the solution or may be introduced into the solution after the photolysis region.

  14. On background radiation gradients – the use of airborne surveys when searching for orphan sources using mobile gamma-ray spectrometry

    International Nuclear Information System (INIS)

    Kock, Peder; Rääf, Christopher; Samuelsson, Christer

    2014-01-01

    Systematic background radiation variations can lead to both false positives and failures to detect an orphan source when searching using car-borne mobile gamma-ray spectrometry. The stochastic variation at each point is well described by Poisson statistics, but when moving in a background radiation gradient the mean count rate will continually change, leading to inaccurate background estimations. Airborne gamma spectrometry (AGS) surveys conducted on the national level, usually in connection to mineral exploration, exist in many countries. These data hold information about the background radiation gradients which could be used at the ground level. This article describes a method that aims to incorporate the systematic as well as stochastic variations of the background radiation. We introduce a weighted moving average where the weights are calculated from existing AGS data, supplied by the Geological Survey of Sweden. To test the method we chose an area with strong background gradients, especially in the thorium component. Within the area we identified two roads which pass through the high-variability locations. The proposed method is compared with an unweighted moving average. The results show that the weighting reduces the excess false positives in the positive background gradients without introducing an excess of failures to detect a source during passage in negative gradients. -- Highlights: • We present a simple method to account for gradients in the natural background radiation. • Gradients in the natural radiation background can be modelled at the ground level using AGS data. • The number of false positives due to background gradients can be reduced by using airborne data
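
    The gradient-aware moving average can be pictured roughly as follows in Python. The arrays `counts` and `ags`, the window length and the exponential similarity weighting are all illustrative assumptions; the weighting actually published may differ.

      import numpy as np

      def weighted_moving_background(counts, ags, window=30, scale=0.2):
          # counts: measured count rates along the survey track
          # ags:    airborne-survey background interpolated to the same positions
          counts = np.asarray(counts, dtype=float)
          ags = np.asarray(ags, dtype=float)
          est = np.empty_like(counts)
          for i in range(len(counts)):
              lo = max(0, i - window)
              # down-weight past samples whose AGS background differs from the
              # current one, so a gradient does not bias the running estimate
              rel_diff = np.abs(ags[lo:i + 1] - ags[i]) / ags[i]
              w = np.exp(-rel_diff / scale)
              est[i] = np.sum(w * counts[lo:i + 1]) / np.sum(w)
          return est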

  15. Usability Evaluation Methods for Special Interest Internet Information Services

    Directory of Open Access Journals (Sweden)

    Eva-Maria Schön

    2014-06-01

    Full Text Available The internet provides a wide range of scientific information for different areas of research, used by the related scientific communities. Often the design or architecture of these web pages does not correspond to the mental model of their users. As a result the wanted information is difficult to find. Methods established by Usability Engineering and User Experience can help to increase the appeal of scientific internet information services by analyzing the users’ requirements. This paper describes a procedure to analyze and optimize scientific internet information services that can be accomplished with relatively low effort. It consists of a combination of methods that already have been successfully applied to practice: Personas, usability inspections, Online Questionnaire, Kano model and Web Analytics.

  16. Observing a Gravitational Wave Background With Lisa

    National Research Council Canada - National Science Library

    Tinto, M; Armstrong, J; Estabrook, F

    2000-01-01

    .... Comparison of the conventional Michelson interferometer observable with the fully-symmetric Sagnac data-type allows unambiguous discrimination between a gravitational wave background and instrumental noise. The method presented here can be used to detect a confusion-limited gravitational wave background.

  17. Factors affecting implementation of perinatal mental health screening in women of refugee background

    Directory of Open Access Journals (Sweden)

    Nishani Nithianandan

    2016-11-01

    Full Text Available Abstract Background For women of refugee background, the increased risk of mental illness associated with pregnancy is compounded by pre- and post-settlement stressors. In Australia, antenatal screening for depression and anxiety symptoms using the Edinburgh Postnatal Depression Scale is recommended for all women. Despite this, screening is not routinely implemented, and little is known about barriers and enablers to implementation for women of refugee background. Methods Semi-structured interviews were conducted with a range of health professionals (n = 28): midwives, obstetricians, perinatal mental health and refugee health experts, and interpreters; and with women of refugee background (n = 9). Themes generated from thematic analysis were examined in relation to the Theoretical Domains Framework and the Cultural Competence Conceptual Framework, followed by identification of effective behaviour change techniques to address the barriers and enablers identified by participants. These techniques formed the basis of recommendations to inform sustainable implementation of screening and referral. Results Almost all participants perceived perinatal mental health screening to be necessary and most recognised the importance of post-traumatic stress disorder (PTSD) screening. Barriers and enablers were identified and related to eight domains: knowledge, skills, professional roles, beliefs about capabilities and consequences, environmental context, social influences and behavioural regulation. Conclusions This research clarifies how mental health screening may be integrated into routine antenatal care for women of refugee background, in order to improve provision of recommended care. These theory-informed recommendations include an inter-disciplinary approach, coordinating care within and across services, addition of PTSD screening, and effective communication with women.

  18. Application of Research on the Metallogenic Background in the Assessment of Mineral Resources Potentiality

    Science.gov (United States)

    Jia, D.; Feng, Y.; Liu, J.; Yao, X.; Zhang, Z.; Ye, T.

    2017-12-01

    1. Working Background. Current status of geological prospecting: detecting boundaries and bottoms, making ore searches nearby; seeing the stars, not seeing the Moon; deep prospecting with undesirable results. The reasons for these problems are that the regional metallogenic background is unclear and the metallogenic background of the exploration regions is unknown. Accordingly, the Development and Research Center, CGS organized a geological setting research programme to investigate metallogenic geological features in detail and acquire mineralization information. 2. Technical Scheme. The core research content is the prediction elements of metallogenic structure. The work adopted unified technical requirements from top to bottom and a technical route from bottom to top; divided the elements of mineral forecast and the characteristics of geological structure into five elements for research and expression; and made full use of geophysical, geochemical and remote sensing inferences for the interpretation of macro information. After eight years this large project was completed. 3. Main Achievements. Innovation of the basic map compilation content of the geological background, reinforcing the geological structure database for potentiality evaluation. Preparation of geotectonic facies maps at different scales and for different professions, providing a brand-new geological background for potentiality assessment and raising Chinese geotectonic research to a new height. Preparation of 3,375 geological structure thematic base maps of the detecting working areas using 6 kinds of prediction methods, providing base working maps, rock assemblage, and structure of the protolith of the geologic body / mineralization / ore controlling for mineral prediction of 25 ores. Enrichment and development of the geotectonic facies analysis method, and establishment, for the first time, of a metallogenic background research approach for the assessment of national mineral resources potentiality. 4. Application Effect. Orientation: more and better results with less effort. Positioning: have a definite

  19. Improved radiological/nuclear source localization in variable NORM background: An MLEM approach with segmentation data

    Energy Technology Data Exchange (ETDEWEB)

    Penny, Robert D., E-mail: robert.d.penny@leidos.com [Leidos Inc., 10260 Campus Point Road, San Diego, CA (United States); Crowley, Tanya M.; Gardner, Barbara M.; Mandell, Myron J.; Guo, Yanlin; Haas, Eric B.; Knize, Duane J.; Kuharski, Robert A.; Ranta, Dale; Shyffer, Ryan [Leidos Inc., 10260 Campus Point Road, San Diego, CA (United States); Labov, Simon; Nelson, Karl; Seilhan, Brandon [Lawrence Livermore National Laboratory, Livermore, CA (United States); Valentine, John D. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2015-06-01

    A novel approach and algorithm have been developed to rapidly detect and localize both moving and static radiological/nuclear (R/N) sources from an airborne platform. Current aerial systems with radiological sensors are limited in their ability to compensate for variable naturally occurring radioactive material (NORM) background. The proposed approach suppresses the effects of NORM background by incorporating additional information to segment the survey area into regions over which the background is likely to be uniform. The method produces pixelated Source Activity Maps (SAMs) of both target and background radionuclide activity over the survey area. The task of producing the SAMs requires (1) the development of a forward model which describes the transformation of radionuclide activity to detector measurements and (2) the solution of the associated inverse problem. The inverse problem is ill-posed as there are typically fewer measurements than unknowns. In addition the measurements are subject to Poisson statistical noise. The Maximum-Likelihood Expectation-Maximization (MLEM) algorithm is used to solve the inverse problem as it is well suited for under-determined problems corrupted by Poisson noise. A priori terrain information is incorporated to segment the reconstruction space into regions within which we constrain NORM background activity to be uniform. Descriptions of the algorithm and examples of performance with and without segmentation on simulated data are presented.
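
    The core MLEM update used to solve such an under-determined Poisson inverse problem has a standard textbook form; the Python sketch below shows that iteration only, without the terrain-based segmentation constraint or the airborne response model, both of which would enter through the system matrix `A` (an assumed input here).

      import numpy as np

      def mlem(A, y, n_iter=100, eps=1e-12):
          # y ~ Poisson(A @ x): A maps per-pixel activity to detector counts
          m, n = A.shape
          x = np.ones(n)                     # non-negative initial activity map
          sensitivity = A.sum(axis=0)        # A^T 1
          for _ in range(n_iter):
              forward = A @ x + eps          # expected counts
              x = x / (sensitivity + eps) * (A.T @ (y / forward))
          return x

    A segmentation constraint of the kind described in the abstract could, for instance, be imposed by tying all background pixels inside one terrain segment to a single unknown, i.e. by summing the corresponding columns of A before iterating.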

  20. Justification of computational methods to ensure information management systems

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available Summary. Due to the diversity and complexity of organizational management tasks a large enterprise, the construction of an information management system requires the establishment of interconnected complexes of means, implementing the most efficient way collect, transfer, accumulation and processing of information necessary drivers handle different ranks in the governance process. The main trends of the construction of integrated logistics management information systems can be considered: the creation of integrated data processing systems by centralizing storage and processing of data arrays; organization of computer systems to realize the time-sharing; aggregate-block principle of the integrated logistics; Use a wide range of peripheral devices with the unification of information and hardware communication. Main attention is paid to the application of the system of research of complex technical support, in particular, the definition of quality criteria for the operation of technical complex, the development of information base analysis methods of management information systems and define the requirements for technical means, as well as methods of structural synthesis of the major subsystems of integrated logistics. Thus, the aim is to study on the basis of systematic approach of integrated logistics management information system and the development of a number of methods of analysis and synthesis of complex logistics that are suitable for use in the practice of engineering systems design. The objective function of the complex logistics management information systems is the task of gathering systems, transmission and processing of specified amounts of information in the regulated time intervals with the required degree of accuracy while minimizing the reduced costs for the establishment and operation of technical complex. Achieving the objective function of the complex logistics to carry out certain organization of interaction of information

  1. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    Seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for the research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies about complex system design. This paper discusses the IDEF0 method to construct function models and the IDEF1x method to make information models systemically, as well as how they are used in designing seismic information system in CTBT verification. (authors)

  2. The geometric background-field method, renormalization and the Wess-Zumino term in non-linear sigma-models

    International Nuclear Information System (INIS)

    Mukhi, S.

    1986-01-01

    A simple recursive algorithm is presented which generates the reparametrization-invariant background-field expansion for non-linear sigma-models on manifolds with an arbitrary riemannian metric. The method is also applicable to Wess-Zumino terms and to counterterms. As an example, the general-metric model is expanded to sixth order and compared with previous results. For locally symmetric spaces, we actually obtain a general formula for the nth order term. The method is shown to facilitate the study of models with Wess-Zumino terms. It is demonstrated that, for chiral models, the Wess-Zumino term is unrenormalized to all orders in perturbation theory even when the model is not conformally invariant. (orig.)
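
    For orientation only, the covariant background-field expansion that such algorithms reproduce starts from splitting the field into a background φ0 and a fluctuation ξ in Riemann normal coordinates; the second-order term then takes the familiar textbook form below (sign and normalization conventions vary between references, and this is not quoted from the paper):

      % second-order term of the covariant background-field expansion of
      % S = (1/2) \int d^dx \, g_{ij}(\phi)\, \partial_\mu\phi^i \partial^\mu\phi^j
      S^{(2)} = \frac{1}{2}\int d^dx \Big[\, g_{ij}(\phi_0)\, D_\mu\xi^i D^\mu\xi^j
                + R_{ikjl}(\phi_0)\, \partial_\mu\phi_0^k\, \partial^\mu\phi_0^l\, \xi^i \xi^j \Big],
      \qquad D_\mu\xi^i = \partial_\mu\xi^i + \Gamma^i_{jk}(\phi_0)\, \partial_\mu\phi_0^j\, \xi^k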

  3. Impact of information technology on the role of medical libraries in information managment: normative background

    Directory of Open Access Journals (Sweden)

    Anamarija Rožić-Hristovski

    1998-01-01

    Full Text Available Exponential growth of biomedical knowledge and the development of information technology are changing the infrastructure of health care systems, education and research. Medical libraries' roles have thus shifted from managing containers of information toward influencing biomedical information resource content and education. These new tasks are formalised in modern American standards for medical libraries, stressing the information management role in an evolving environment. In Slovenia, medical libraries are also aware of the development imperative of information activities for advances in medicine. On one side they are faced with a lack of specific guidelines for proactive action, and on the other with inadequate assessment in legal documents and insufficient funding.

  4. GafChromic EBT film dosimetry with flatbed CCD scanner: A novel background correction method and full dose uncertainty analysis

    International Nuclear Information System (INIS)

    Saur, Sigrun; Frengen, Jomar

    2008-01-01

    Film dosimetry using radiochromic EBT film in combination with a flatbed charge coupled device scanner is a useful method both for two-dimensional verification of intensity-modulated radiation treatment plans and for general quality assurance of treatment planning systems and linear accelerators. Unfortunately, the response over the scanner area is nonuniform, and when not corrected for, this results in a systematic error in the measured dose which is both dose and position dependent. In this study a novel method for background correction is presented. The method is based on the subtraction of a correction matrix, a matrix that is based on scans of films that are irradiated to nine dose levels in the range 0.08-2.93 Gy. Because the response of the film is dependent on the film's orientation with respect to the scanner, correction matrices for both landscape oriented and portrait oriented scans were made. In addition to the background correction method, a full dose uncertainty analysis of the film dosimetry procedure was performed. This analysis takes into account the fit uncertainty of the calibration curve, the variation in response for different film sheets, the nonuniformity after background correction, and the noise in the scanned films. The film analysis was performed for film pieces of size 16x16 cm, all with the same lot number, and all irradiations were done perpendicular onto the films. The results show that the 2-sigma dose uncertainty at 2 Gy is about 5% and 3.5% for landscape and portrait scans, respectively. The uncertainty gradually increases as the dose decreases, but at 1 Gy the 2-sigma dose uncertainty is still as good as 6% and 4% for landscape and portrait scans, respectively. The study shows that film dosimetry using GafChromic EBT film, an Epson Expression 1680 Professional scanner and a dedicated background correction technique gives precise and accurate results. For the purpose of dosimetric verification, the calculated dose distribution can

  5. GafChromic EBT film dosimetry with flatbed CCD scanner: a novel background correction method and full dose uncertainty analysis.

    Science.gov (United States)

    Saur, Sigrun; Frengen, Jomar

    2008-07-01

    Film dosimetry using radiochromic EBT film in combination with a flatbed charge coupled device scanner is a useful method both for two-dimensional verification of intensity-modulated radiation treatment plans and for general quality assurance of treatment planning systems and linear accelerators. Unfortunately, the response over the scanner area is nonuniform, and when not corrected for, this results in a systematic error in the measured dose which is both dose and position dependent. In this study a novel method for background correction is presented. The method is based on the subtraction of a correction matrix, a matrix that is based on scans of films that are irradiated to nine dose levels in the range 0.08-2.93 Gy. Because the response of the film is dependent on the film's orientation with respect to the scanner, correction matrices for both landscape oriented and portrait oriented scans were made. In addition to the background correction method, a full dose uncertainty analysis of the film dosimetry procedure was performed. This analysis takes into account the fit uncertainty of the calibration curve, the variation in response for different film sheets, the nonuniformity after background correction, and the noise in the scanned films. The film analysis was performed for film pieces of size 16 x 16 cm, all with the same lot number, and all irradiations were done perpendicular onto the films. The results show that the 2-sigma dose uncertainty at 2 Gy is about 5% and 3.5% for landscape and portrait scans, respectively. The uncertainty gradually increases as the dose decreases, but at 1 Gy the 2-sigma dose uncertainty is still as good as 6% and 4% for landscape and portrait scans, respectively. The study shows that film dosimetry using GafChromic EBT film, an Epson Expression 1680 Professional scanner and a dedicated background correction technique gives precise and accurate results. For the purpose of dosimetric verification, the calculated dose distribution
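
    As a rough illustration of how a correction matrix of the kind described above could be applied, the Python sketch below subtracts the non-uniformity matrix for the nearest calibration dose level from the raw scan. The nearest-level lookup, the array names and the calling convention are assumptions, not the authors' exact interpolation or scan-orientation handling.

      import numpy as np

      def correct_scan(raw_scan, approx_dose, cal_doses, correction_stack):
          # correction_stack: one scanner non-uniformity matrix per calibration
          # dose level in cal_doses (nine levels in the study); the matrix for
          # the nearest dose level is subtracted from the raw scan before the
          # signal is converted to dose via the calibration curve.
          cal_doses = np.asarray(cal_doses, dtype=float)
          level = int(np.argmin(np.abs(cal_doses - approx_dose)))
          return raw_scan.astype(float) - correction_stack[level]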

  6. The slope-background for the near-peak regimen of photoemission spectra

    Energy Technology Data Exchange (ETDEWEB)

    Herrera-Gomez, A., E-mail: aherrera@qro.cinvestav.mx [CINVESTAV-Unidad Queretaro, Queretaro 76230 (Mexico); Bravo-Sanchez, M. [CINVESTAV-Unidad Queretaro, Queretaro 76230 (Mexico); Aguirre-Tostado, F.S. [Centro de Investigación en Materiales Avanzados, Chihuahua, Chihuahua 31109 (Mexico); Vazquez-Lepe, M.O. [Departamento de Ingeniería de Proyectos, Universidad de Guadalajara, Jalisco 44430 (Mexico)

    2013-08-15

    Highlights: •We propose a method that accounts for the change in the background slope of XPS data. •The slope-background can be derived from Tougaard–Sigmund's transport theory. •The total background is composed by Shirley–Sherwood and Tougaard type backgrounds. •The slope-background employs one parameter that can be related to REELS spectra. •The slope, in conjunction with the Shirley–Sherwood background, provides better fits. -- Abstract: Photoemission data typically exhibits a change on the intensity of the background between the two sides of the peaks. This step is usually very well reproduced by the Shirley–Sherwood background. Yet, the change on the slope of the background in the near-peak regime, although usually present, is not always as obvious to the eye. However, the intensity of the background signal associated with the evolution of its slope can be appreciable. The slope-background is designed to empirically reproduce the change on the slope. Resembling the non-iterative Shirley method, the proposed functional form relates the slope of the background to the integrated signal at higher electron kinetic energies. This form can be predicted under Tougaard–Sigmund's electron transport theory in the near-peak regime. To reproduce both the step and slope changes on the background, it is necessary to employ the slope-background in conjunction with the Shirley–Sherwood background under the active-background method. As it is shown for a series of materials, the application of the slope-background provides excellent fits, is transparent to the operator, and is much more independent of the fitting range than other background methods. The total area assessed through the combination of the slope and the Shirley–Sherwood backgrounds is larger than when only the Shirley–Sherwood background is employed, and smaller than when the Tougaard background is employed.
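
    For context, the Shirley–Sherwood construction that the slope-background extends can be written as a short iteration; the Python sketch below assumes a kinetic-energy scale increasing along the array and uses the end points of the fitting window as anchors. It shows the classic background only, not the slope term itself.

      import numpy as np

      def shirley_background(ke, intensity, n_iter=20):
          # Background at each kinetic energy is proportional to the integrated
          # peak signal at higher kinetic energies, scaled so that it matches
          # the measured spectrum at both ends of the fitting window.
          ke = np.asarray(ke, dtype=float)
          y = np.asarray(intensity, dtype=float)
          i_lo, i_hi = y[0], y[-1]
          bg = np.full_like(y, i_hi)
          for _ in range(n_iter):
              signal = y - bg
              seg = 0.5 * (signal[1:] + signal[:-1]) * np.diff(ke)   # trapezoids
              area_above = np.concatenate([np.cumsum(seg[::-1])[::-1], [0.0]])
              bg = i_hi + (i_lo - i_hi) * area_above / area_above[0]
          return bg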

  7. Method of Improving Personal Name Search in Academic Information Service

    Directory of Open Access Journals (Sweden)

    Heejun Han

    2012-12-01

    Full Text Available All academic information on the web or elsewhere has its creator, that is, a subject who has created the information. The subject can be an individual, a group, or an institution, and can be a nation depending on the nature of the relevant information. Most information is composed of a title, an author, and contents. An essay which is under the academic information category has metadata including a title, an author, keyword, abstract, data about publication, place of publication, ISSN, and the like. A patent has metadata including the title, an applicant, an inventor, an attorney, IPC, number of application, and claims of the invention. Most web-based academic information services enable users to search the information by processing the meta-information. An important element is to search information by using the author field which corresponds to a personal name. This study suggests a method of efficient indexing and using the adjacent operation result ranking algorithm to which phrase search-based boosting elements are applied, and thus improving the accuracy of the search results of personal names. It also describes a method for providing the results of searching co-authors and related researchers in searching personal names. This method can be effectively applied to providing accurate and additional search results in the academic information services.

  8. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    Directory of Open Access Journals (Sweden)

    Yusof Maryati Mohd

    2012-12-01

    Full Text Available Abstract Background There are numerous applications for Health Information Systems (HIS that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS, known as IntelliVue Clinical Information Portfolio (ICIP, and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.

  9. Combining rules, background knowledge and change patterns to maintain semantic annotations.

    Science.gov (United States)

    Cardoso, Silvio Domingos; Chantal, Reynaud-Delaître; Da Silveira, Marcos; Pruski, Cédric

    2017-01-01

    Knowledge Organization Systems (KOS) play a key role in enriching biomedical information in order to make it machine-understandable and shareable. This is done by annotating medical documents, or more specifically, associating concept labels from KOS with pieces of digital information, e.g., images or texts. However, the dynamic nature of KOS may impact the annotations, thus creating a mismatch between the evolved concept and the associated information. To solve this problem, methods to maintain the quality of the annotations are required. In this paper, we define a framework based on rules, background knowledge and change patterns to drive the annotation adaptation process. We evaluate the proposed approach experimentally in realistic case studies and demonstrate the overall performance of our approach for different KOS, considering the precision, recall, F1-score and AUC value of the system.

  10. Designing Health Websites Based on Users’ Web-Based Information-Seeking Behaviors: A Mixed-Method Observational Study

    Science.gov (United States)

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon

    2016-01-01

    Background Laypeople increasingly use the Internet as a source of health information, but finding and discovering the right information remains problematic. These issues are partially due to the mismatch between the design of consumer health websites and the needs of health information seekers, particularly the lack of support for “exploring” health information. Objective The aim of this research was to create a design for consumer health websites by supporting different health information–seeking behaviors. We created a website called Better Health Explorer with the new design. Through the evaluation of this new design, we derive design implications for future implementations. Methods Better Health Explorer was designed using a user-centered approach. The design was implemented and assessed through a laboratory-based observational study. Participants tried to use Better Health Explorer and another live health website. Both websites contained the same content. A mixed-method approach was adopted to analyze multiple types of data collected in the experiment, including screen recordings, activity logs, Web browsing histories, and audiotaped interviews. Results Overall, 31 participants took part in the observational study. Our new design showed a positive result for improving the experience of health information seeking, by providing a wide range of information and an engaging environment. The results showed better knowledge acquisition, a higher number of page reads, and more query reformulations in both focused and exploratory search tasks. In addition, participants spent more time to discover health information with our design in exploratory search tasks, indicating higher engagement with the website. Finally, we identify 4 design considerations for designing consumer health websites and health information–seeking apps: (1) providing a dynamic information scope; (2) supporting serendipity; (3) considering trust implications; and (4) enhancing interactivity

  11. An information hiding method based on LSB and tent chaotic map

    Science.gov (United States)

    Song, Jianhua; Ding, Qun

    2011-06-01

    In order to protect information security more effectively, a novel information hiding method based on LSB and the Tent chaotic map was proposed: first the secret message is encrypted with the Tent chaotic map, and then LSB steganography embeds the encrypted message in the cover image. Compared to traditional image information hiding methods, the simulation results indicate that the proposed method is greatly improved in imperceptibility and security, and achieves good results.
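
    A compact Python sketch of the two-stage idea: a tent-map keystream XORed with the message bytes, followed by sequential embedding of the cipher bits into the least-significant bits of the cover image. The map parameters, the byte-level keystream and the embedding order are illustrative choices, not necessarily those of the paper.

      import numpy as np

      def tent_keystream(x0, mu, n):
          # tent map: x <- mu*x if x < 0.5 else mu*(1-x); (x0, mu) act as the key
          x, out = x0, []
          for _ in range(n):
              x = mu * x if x < 0.5 else mu * (1.0 - x)
              out.append(int(x * 256) & 0xFF)
          return np.array(out, dtype=np.uint8)

      def hide(cover, message, x0=0.37, mu=1.99):
          # cover: uint8 image array; message: text to hide
          data = np.frombuffer(message.encode(), dtype=np.uint8)
          cipher = data ^ tent_keystream(x0, mu, len(data))   # chaotic encryption
          bits = np.unpackbits(cipher)
          stego = cover.ravel().copy()
          stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits   # replace LSBs
          return stego.reshape(cover.shape)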

  12. Agricultural practice and water quality in the Netherlands in the 1992-2002 period. Background information for the third EU Nitrate Directive Member States report

    NARCIS (Netherlands)

    Fraters B; Hotsma PH; Langenberg VT; Leeuwen TC van; Mol APA; Olsthoorn CSM; Schotten CGJ; Willems WJ; EC-LNV; RIKZ; LEI; RIZA; CBS; LDL

    2004-01-01

    This overview provides the background information for the Netherlands Member State report, 'Nitrate Directive, status and trends of aquatic environment and agricultural practice' to be submitted to the European Commission mid-2004. It documents current agricultural practice, and groundwater and

  13. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    Science.gov (United States)

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relative static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are firstly classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with a slightly additional encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
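
    The two prediction modes can be caricatured in a few lines of Python. The running-average background model, the variance-based block classification and the threshold are stand-ins for the paper's background modelling and BMAP mode decision, not the published algorithm.

      import numpy as np

      def update_background(bg, frame, alpha=0.02):
          # simple running-average background model (stand-in for the modelling stage)
          return (1.0 - alpha) * bg + alpha * frame

      def block_residual(block, bg_block, ref_block, var_thresh=25.0):
          diff_to_bg = block.astype(float) - bg_block
          if diff_to_bg.var() < var_thresh:
              # background block: use the modelled background as reference (BRP-like)
              return diff_to_bg
          # hybrid block: predict in the background-difference domain (BDP-like),
          # i.e. subtract the background from both current and reference blocks
          return diff_to_bg - (ref_block.astype(float) - bg_block)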

  14. Background suppression in Gerda Phase II and its study in the LArGe low background set-up

    Energy Technology Data Exchange (ETDEWEB)

    Budjas, Dusan [Physik-Department E15, Technische Universitaet Muenchen (Germany); Collaboration: GERDA-Collaboration

    2013-07-01

    In Phase II of the Gerda experiment an additional ≈20 kg of BEGe-type germanium detectors, enriched in 76Ge, will be deployed in liquid argon (LAr) to further increase the sensitivity for the half-life of neutrinoless double beta (0νββ) decay of 76Ge to > 2×10^26 yr. To reduce the background by a factor of 10 to the required level of < 10^-3 cts/(keV kg yr), it is necessary to employ active background-suppression techniques, including an anti-Compton veto using scintillation light detection in the LAr and pulse shape discrimination exploiting the characteristic electric field distribution inside BEGe detectors. The latter technique can identify single-site events (typical for 0νββ) and efficiently reject multi-site events (mainly from γ-rays), as well as different types of background events from detector surfaces. The combined power of these techniques was studied for 42K and other background sources at the low background facility LArGe. Together with extensive simulations, the information from tracking the Phase II detector material exposure to cosmic rays, and the background contributions observed in Phase I, the expected background level in Phase II in the region of interest at 2039 keV, the Qββ energy of 76Ge, is estimated. The preliminary analysis shows that the contributions from all expected background components after all cuts are in line with the goal of Gerda Phase II.

  15. Background modeling for the GERDA experiment

    Science.gov (United States)

    Becerici-Schmidt, N.; Gerda Collaboration

    2013-08-01

    The neutrinoless double beta (0νββ) decay experiment GERDA at the LNGS of INFN has started physics data taking in November 2011. This paper presents an analysis aimed at understanding and modeling the observed background energy spectrum, which plays an essential role in searches for a rare signal like 0νββ decay. A very promising preliminary model has been obtained, with the systematic uncertainties still under study. Important information can be deduced from the model such as the expected background and its decomposition in the signal region. According to the model the main background contributions around Qββ come from 214Bi, 228Th, 42K, 60Co and α emitting isotopes in the 226Ra decay chain, with a fraction depending on the assumed source positions.

  16. Axiomatic Evaluation Method and Content Structure for Information Appliances

    Science.gov (United States)

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information needs to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  17. Multimodal cues provide redundant information for bumblebees when the stimulus is visually salient, but facilitate red target detection in a naturalistic background

    Science.gov (United States)

    Corcobado, Guadalupe; Trillo, Alejandro

    2017-01-01

    Our understanding of how floral visitors integrate visual and olfactory cues when seeking food, and how background complexity affects flower detection is limited. Here, we aimed to understand the use of visual and olfactory information for bumblebees (Bombus terrestris terrestris L.) when seeking flowers in a visually complex background. To explore this issue, we first evaluated the effect of flower colour (red and blue), size (8, 16 and 32 mm), scent (presence or absence) and the amount of training on the foraging strategy of bumblebees (accuracy, search time and flight behaviour), considering the visual complexity of our background, to later explore whether experienced bumblebees, previously trained in the presence of scent, can recall and make use of odour information when foraging in the presence of novel visual stimuli carrying a familiar scent. Of all the variables analysed, flower colour had the strongest effect on the foraging strategy. Bumblebees searching for blue flowers were more accurate, flew faster, followed more direct paths between flowers and needed less time to find them, than bumblebees searching for red flowers. In turn, training and the presence of odour helped bees to find inconspicuous (red) flowers. When bees foraged on red flowers, search time increased with flower size; but search time was independent of flower size when bees foraged on blue flowers. Previous experience with floral scent enhances the capacity of detection of a novel colour carrying a familiar scent, probably by elemental association influencing attention. PMID:28898287

  18. Effect of background music on auditory-verbal memory performance

    Directory of Open Access Journals (Sweden)

    Sona Matloubi

    2014-12-01

    Full Text Available Background and Aim: Music exists in all cultures; many scientists are seeking to understand how music affects cognitive development, such as comprehension, memory, and reading skills. More recently, a considerable number of neuroscience studies on music have been conducted. This study aimed to investigate the effects of null and positive background music, in comparison with silence, on auditory-verbal memory performance. Methods: Forty young adults (male and female) with normal hearing, aged between 18 and 26, participated in this comparative-analysis study. An auditory and speech evaluation was conducted in order to investigate the effects of background music on working memory. Subsequently, the Rey auditory-verbal learning test was performed for three conditions: silence, positive music, and null music. Results: The mean score of the Rey auditory-verbal learning test in the silence condition was higher than in the positive music condition (p=0.003) and the null music condition (p=0.01). The test results did not reveal any gender differences. Conclusion: It seems that the presence of competing music (positive and null music) and the orientation of auditory attention have negative effects on the performance of verbal working memory, possibly owing to the interference of music with verbal information processing in the brain.

  19. Limits of Astrophysics with Gravitational-Wave Backgrounds

    Directory of Open Access Journals (Sweden)

    Thomas Callister

    2016-08-01

    Full Text Available The recent Advanced LIGO detection of gravitational waves from the binary black hole GW150914 suggests there exists a large population of merging binary black holes in the Universe. Although most are too distant to be individually resolved by advanced detectors, the superposition of gravitational waves from many unresolvable binaries is expected to create an astrophysical stochastic background. Recent results from the LIGO and Virgo Collaborations show that this astrophysical background is within reach of Advanced LIGO. In principle, the binary black hole background encodes interesting astrophysical properties, such as the mass distribution and redshift distribution of distant binaries. However, we show that this information will be difficult to extract with the current configuration of advanced detectors (and using current data analysis tools. Additionally, the binary black hole background also constitutes a foreground that limits the ability of advanced detectors to observe other interesting stochastic background signals, for example, from cosmic strings or phase transitions in the early Universe. We quantify this effect.

  20. Adaptation of an Agile Information System Development Method

    NARCIS (Netherlands)

    Aydin, M.N.; Harmsen, A.F.; van Hillegersberg, Jos; Stegwee, R.A.; Siau, K.

    2007-01-01

    Little specific research has been conducted to date on the adaptation of agile information systems development (ISD) methods. This chapter presents the work practice in dealing with the adaptation of such a method in the ISD department of one of the leading financial institutes in Europe. The

  1. On the microwave background spectrum and noise

    International Nuclear Information System (INIS)

    De Bernardis, P.; Masi, S.

    1982-01-01

    We show that the combined measurement of the cosmic background radiation (CBR) intensity and noise can provide direct information on the temperature and the emissivity of the source responsible for the CBR. (orig.)

  2. Human Detection Based on the Generation of a Background Image by Using a Far-Infrared Light Camera

    Directory of Open Access Journals (Sweden)

    Eun Som Jeon

    2015-03-01

    Full Text Available The need for computer vision-based human detection has increased in fields such as security, intelligent surveillance and monitoring systems. However, performance enhancement of human detection based on visible light cameras is limited because of factors such as nonuniform illumination, shadows and low external light in the evening and at night. Consequently, human detection based on thermal (far-infrared light) cameras has been considered as an alternative. However, its performance is influenced by factors such as low image resolution, low contrast and the large noise of thermal images. It is also affected by the high temperature of backgrounds during the day. To solve these problems, we propose a new method for detecting human areas in thermal camera images. Compared to previous works, the proposed research is novel in the following four aspects. First, one background image is generated by median and average filtering, and additional filtering procedures based on maximum gray level, size filtering and region erasing are applied to remove the human areas from the background image. Second, candidate human regions in the input image are located by combining the pixel and edge difference images between the input and background images; the thresholds for the difference images are adaptively determined based on the brightness of the generated background image, and noise components are removed by component labeling, a morphological operation and size filtering. Third, detected areas that may contain more than two human regions are merged or separated based on the information in the horizontal and vertical histograms of the detected area. This procedure is adaptively operated based on the brightness of the generated background image. Fourth, a further procedure for the separation and removal of the candidate human regions is performed based on the size and the ratio of the height to width of the candidate regions, considering the camera viewing direction
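
    A toy Python version of the background-generation and adaptive-threshold steps: the filter combination, the brightness cut-off and the threshold constants are illustrative assumptions rather than the published procedure.

      import numpy as np

      def make_background(frames):
          # per-pixel median of a frame stack blended with the per-pixel mean,
          # as a rough stand-in for the median/average filtering step
          stack = np.stack([f.astype(float) for f in frames])
          return 0.5 * (np.median(stack, axis=0) + stack.mean(axis=0))

      def candidate_mask(frame, background, k_bright=30.0, k_dark=15.0):
          # pixel-difference mask with a threshold adapted to the brightness of
          # the generated background (warmer daytime backgrounds -> looser cut)
          diff = np.abs(frame.astype(float) - background)
          thresh = k_bright if background.mean() > 128 else k_dark
          return diff > thresh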

  3. Background Model for the Majorana Demonstrator

    Science.gov (United States)

    Cuesta, C.; Abgrall, N.; Aguayo, E.; Avignone, F. T.; Barabash, A. S.; Bertrand, F. E.; Boswell, M.; Brudanin, V.; Busch, M.; Byram, D.; Caldwell, A. S.; Chan, Y.-D.; Christofferson, C. D.; Combs, D. C.; Detwiler, J. A.; Doe, P. J.; Efremenko, Yu.; Egorov, V.; Ejiri, H.; Elliott, S. R.; Fast, J. E.; Finnerty, P.; Fraenkle, F. M.; Galindo-Uribarri, A.; Giovanetti, G. K.; Goett, J.; Green, M. P.; Gruszko, J.; Guiseppe, V. E.; Gusev, K.; Hallin, A. L.; Hazama, R.; Hegai, A.; Henning, R.; Hoppe, E. W.; Howard, S.; Howe, M. A.; Keeter, K. J.; Kidd, M. F.; Kochetov, O.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; Leviner, L. E.; Loach, J. C.; MacMullin, J.; MacMullin, S.; Martin, R. D.; Meijer, S.; Mertens, S.; Nomachi, M.; Orrell, J. L.; O'Shaughnessy, C.; Overman, N. R.; Phillips, D. G.; Poon, A. W. P.; Pushkin, K.; Radford, D. C.; Rager, J.; Rielage, K.; Robertson, R. G. H.; Romero-Romero, E.; Ronquest, M. C.; Schubert, A. G.; Shanks, B.; Shima, T.; Shirchenko, M.; Snavely, K. J.; Snyder, N.; Suriano, A. M.; Thompson, J.; Timkin, V.; Tornow, W.; Trimble, J. E.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Young, A. R.; Yu, C.-H.; Yumatov, V.

    The Majorana Collaboration is constructing a system containing 40 kg of HPGe detectors to demonstrate the feasibility and potential of a future tonne-scale experiment capable of probing the neutrino mass scale in the inverted-hierarchy region. To realize this, a major goal of the Majorana Demonstrator is to demonstrate a path forward to achieving a background rate at or below 1 cnt/(ROI-t-y) in the 4 keV region of interest around the Q-value at 2039 keV. This goal is pursued through a combination of a significant reduction of radioactive impurities in construction materials with analytical methods for background rejection, for example using powerful pulse shape analysis techniques profiting from the p-type point contact HPGe detectors technology. The effectiveness of these methods is assessed using simulations of the different background components whose purity levels are constrained from radioassay measurements.

  4. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  5. Machine Induced Background with BCM1F 2015

    CERN Document Server

    CMS Collaboration

    2015-01-01

    BCM1F provides information on the condition of the beam and ensures that the inner detector occupancy is sufficiently low for data-taking. Measurements of the machine induced background for beam 1 (BKGD1) and beam 2 (BKGD2) are displayed in the CMS and LHC control rooms. The background data correlate with the collimator settings as well as with the vacuum pressure.

  6. Background removal in X-ray fiber diffraction patterns

    International Nuclear Information System (INIS)

    Millane, R.P.; Arnott, S.

    1985-01-01

    Background can be a major source of error in measurement of diffracted intensities in fiber diffraction patterns. Errors can be large when poorly oriented less-crystalline specimens give diffraction patterns with little uncontaminated background. A method for estimating and removing a general global background in such cases is described and illustrated with an example. (orig.)

  7. 76 FR 55725 - Agency Information Collection Activities: Request for Comments for a New Information Collection

    Science.gov (United States)

    2011-09-08

    ... requirements or power calculations that justify the proposed sample size, the expected response rate, methods... Collection of Qualitative Feedback on Agency Service Delivery '' to OMB for approval under the Paperwork... Clearance for the Collection of Qualitative Feedback on Agency Service Delivery. Background: The information...

  8. Augmenting Ordinal Methods of Attribute Weight Approximation

    DEFF Research Database (Denmark)

    Danielson, Mats; Ekenberg, Love; He, Ying

    2014-01-01

    of the obstacles and methods for introducing so-called surrogate weights have proliferated in the form of ordinal ranking methods for criteria weights. Considering the decision quality, one main problem is that the input information allowed in ordinal methods is sometimes too restricted. At the same time, decision makers often possess more background information, for example, regarding the relative strengths of the criteria, and might want to use that. We propose combined methods for facilitating the elicitation process and show how this provides a way to use partial information from the strength of preference...
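
    For orientation, the sketch below computes two common families of surrogate weights derived from a pure ordinal ranking of criteria, the rank-sum (RS) and rank-order-centroid (ROC) weights. This is a generic textbook construction, not the combined method proposed by the authors, and the function names are illustrative.

        import numpy as np

        def rank_sum_weights(n):
            # RS weights: w_i proportional to (n - rank_i + 1), ranks 1..n (1 = most important)
            ranks = np.arange(1, n + 1)
            w = (n - ranks + 1).astype(float)
            return w / w.sum()

        def rank_order_centroid_weights(n):
            # ROC weights: w_i = (1/n) * sum_{k=i}^{n} 1/k
            return np.array([np.sum(1.0 / np.arange(i, n + 1)) / n for i in range(1, n + 1)])

        print(rank_sum_weights(4))             # [0.4, 0.3, 0.2, 0.1]
        print(rank_order_centroid_weights(4))  # approx. [0.521, 0.271, 0.146, 0.062]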

  9. Evaluation of two methods for using MR information in PET reconstruction

    International Nuclear Information System (INIS)

    Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.

    2013-01-01

    Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods to introduce this information have been evaluated and the Bowsher prior was considered the best. Its main advantage is that it does not require image segmentation. Another method that has been widely used for incorporating MR information is using boundaries obtained by segmentation. This method has also shown improvements in image quality. In this paper, two methods for incorporating MR information in PET reconstruction are compared. After a Bayes parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with Bowsher than with boundaries. CV values are 10% lower with Bowsher than with boundaries. Both methods performed better in terms of MSE and CV than using no prior, that is, maximum likelihood expectation maximization (MLEM) or MAP without anatomic information. Concluding, incorporating MR information using the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms again proved effective in noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image has, however, still to be evaluated
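
    A simplified illustration of the Bowsher idea described above is sketched below on a toy 1-D image: for every voxel, only the few neighbours with the most similar MR intensity are kept in a quadratic smoothing prior, so that smoothing does not cross anatomical edges. This is not the reconstruction code evaluated in the record; radius and neighbour count are arbitrary.

        import numpy as np

        def bowsher_neighbours(mr, radius=2, keep=2):
            """For each voxel of a 1-D MR image, return the indices of the `keep`
            neighbours (within +/- radius) with the most similar MR intensity."""
            n = len(mr)
            selected = []
            for i in range(n):
                nbrs = [j for j in range(max(0, i - radius), min(n, i + radius + 1)) if j != i]
                nbrs.sort(key=lambda j: abs(mr[j] - mr[i]))   # most similar first
                selected.append(nbrs[:keep])
            return selected

        def bowsher_penalty(pet, selected):
            # Quadratic penalty restricted to the selected (anatomically similar) neighbours
            return sum((pet[i] - pet[j]) ** 2 for i, nbrs in enumerate(selected) for j in nbrs)

        mr = np.array([1.0, 1.1, 1.0, 5.0, 5.1, 5.0])   # toy MR profile with an edge
        pet = np.array([10., 11., 10., 30., 31., 30.])  # toy PET estimate
        sel = bowsher_neighbours(mr)
        print(bowsher_penalty(pet, sel))  # cross-edge pairs are rarely selected, so the penalty stays small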

  10. On estimating probability of presence from use-availability or presence-background data.

    Science.gov (United States)

    Phillips, Steven J; Elith, Jane

    2013-06-01

    A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of "presence" data (locations where the species [or evidence of it] has been observed), together with "background" data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. We emphasize, however, that we are not arguing against

  11. Simultaneous and separate, low background counting of beta rays and gamma rays using the phoswich principle

    International Nuclear Information System (INIS)

    Mayhugh, M.R.; Utts, B.K.; Shoffner, B.M.

    1978-01-01

    A phoswich constructed using thin calcium fluoride optically coupled to a thicker sodium iodide crystal and operated with pulse shape analysis equipment can be used as an efficient low background counting assembly. Low background in the beta ray counting channel is achieved by judicious choice of pure materials in the assembly and by operating the analysis equipment so as to reject background events which occur simultaneously in the sodium iodide crystal. Careful survey of construction materials and methods has resulted in reducing beta ray counting background to 0.6 c/min for a 2-inch diameter assembly. The radioactivity of typical building materials will be discussed. A pulse shape analyzer has been constructed which provides separately adjusted time windows and separate output information for the beta ray and gamma ray channels. The dual channel capability combined with the low beta ray background reduces the sample counting time significantly for typical laboratory samples. (author)

  12. Background modeling for the GERDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Becerici-Schmidt, N. [Max-Planck-Institut für Physik, München (Germany); Collaboration: GERDA Collaboration

    2013-08-08

    The neutrinoless double beta (0νββ) decay experiment GERDA at the LNGS of INFN has started physics data taking in November 2011. This paper presents an analysis aimed at understanding and modeling the observed background energy spectrum, which plays an essential role in searches for a rare signal like 0νββ decay. A very promising preliminary model has been obtained, with the systematic uncertainties still under study. Important information can be deduced from the model such as the expected background and its decomposition in the signal region. According to the model the main background contributions around Q_ββ come from ²¹⁴Bi, ²²⁸Th, ⁴²K, ⁶⁰Co and α emitting isotopes in the ²²⁶Ra decay chain, with a fraction depending on the assumed source positions.

  13. General Aviation in Nebraska: Nebraska SATS Project Background Paper No. 1

    Science.gov (United States)

    Smith, Russell; Wachal, Jocelyn

    2000-01-01

    The Nebraska SATS project is a state-level component of NASA's Small Aircraft Transportation System (SATS). During the next several years the project will examine several different factors affecting SATS implementation in Nebraska. These include economic and taxation issues, public policy issues, airport planning processes, information dissemination strategies, and systemic change factors. This background paper profiles the general aviation system in Nebraska. It is written to provide information about the "context" within which SATS will be pursued. The primary focus is thus on describing and providing background information about the current situation. A secondary focus is on drawing general conclusions about the ability of the current system to incorporate the types of changes implied by SATS. First, some brief information on the U.S. aviation system is provided. The next two sections profile the current general aviation aircraft and pilot base. Nebraska's system of general aviation airports is then described. Within this section of the paper, information is provided on the different types of general aviation airports in Nebraska, airport activity levels and current infrastructure. The fourth major section of the background paper looks at Nebraska's local airport authorities. These special purpose local governments oversee the majority of the general aviation airports in the state. Among the items examined are total expenditures, capital expenditures and planning activities. Next, the paper provides background information on the Nebraska Department of Aeronautics (NDA) and recent Federal funding for general aviation in Nebraska. The final section presents summary conclusions.

  14. Artificial intelligence applications in information and communication technologies

    CERN Document Server

    Bouguila, Nizar

    2015-01-01

    This book presents various recent applications of Artificial Intelligence in Information and Communication Technologies such as Search and Optimization methods, Machine Learning, Data Representation and Ontologies, and Multi-agent Systems. The main aim of this book is to help Information and Communication Technologies (ICT) practitioners efficiently manage their platforms using AI tools and methods, and to provide them with sufficient Artificial Intelligence background to deal with real-life problems.

  15. INFORMATION CULTURE AND INFORMATION SAFETY OF SCHOOLCHILDREN

    Directory of Open Access Journals (Sweden)

    E. G. Belyakova

    2017-01-01

    Full Text Available Introduction. The article is devoted to the problem of interaction between schoolchildren and the informational risks transmitted on the Internet. Given the lack of external filters blocking harmful information streams, it is necessary to develop the information culture of schoolchildren, their ability to sensibly and critically interpret information on the Internet, and their choice of adequate behaviour models when surfing the Web. The aim of the present research is to analyze the state of informational safety of schoolchildren while using the Internet, and to clarify the role of external restrictions and of intrapersonal filtering of harmful Internet content depending on children's age. Methodology and research methods. The methodology of the research is based on modern approaches to the problem of personal socialization in the information society. The Internet Initiatives Development Fund (IIDF) questionnaire let the authors define the level of awareness of recipients on the problem under consideration. Results and scientific novelty. The theoretical analysis helped the authors predict the correlation of the basic methods for guaranteeing the personal safety of schoolchildren, taking into account the process of maturing as well as the decrease of external filters that may stop harmful content. The empirical part of the research revealed a decrease in external control of a child's time online as the child grows up, against the background of prevailing restrictive attitudes among teachers and parents. Therefore, the research proposes to improve the information culture of schoolchildren from the earliest ages, encouraging them to sensibly and correctly interpret information on the Internet. Practical significance. Practical recommendations to parents and teachers for improving the informational personal safety of schoolchildren are proposed. The relevancy

  16. The natural background approach to setting radiation standards

    International Nuclear Information System (INIS)

    Adler, H.I.; Federow, H.; Weinberg, A.M.

    1979-01-01

    The suggestion has often been made that an additional radiation exposure imposed on humanity as a result of some important activity such as electricity generation would be acceptable if the exposure was 'small' compared to the natural background. In order to make this concept quantitative and objective, we propose that 'small compared with the natural background' be interpreted as the standard deviation (weighted with the exposed population) of the natural background. We believe that this use of the variation in natural background radiation is less arbitrary and requires fewer unfounded assumptions than some current approaches to standard-setting. The standard deviation is an easily calculated statistic that is small compared with the mean value for natural exposures of populations. It is an objectively determined quantity and its significance is generally understood. Its determination does not omit any of the pertinent data. When this method is applied to the population of the USA, it implies that a dose of 20 mrem/year would be an acceptable standard. This is closely comparable to the 25 mrem/year suggested by the Environmental Protection Agency as the maximum allowable exposure to an individual in the general population as a result of the operation of the complete uranium fuel cycle. Other agents for which a natural background exists can be treated in the same way as radiation. In addition, a second method for determining permissible exposure levels for agents other than radiation is presented. This method makes use of the natural background radiation data as a primary standard. Some observations on benzo(a)pyrene, using this latter method, are presented. (author)
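
    A minimal sketch of the statistic proposed above, the population-weighted standard deviation of natural background exposure, is given below; the regional dose values and population counts used here are made up purely for illustration.

        import numpy as np

        def weighted_std(doses, populations):
            # Population-weighted mean and standard deviation of natural background dose
            w = np.asarray(populations, dtype=float)
            x = np.asarray(doses, dtype=float)
            mean = np.average(x, weights=w)
            var = np.average((x - mean) ** 2, weights=w)
            return mean, np.sqrt(var)

        # Hypothetical regional background doses (mrem/year) and populations
        doses = [80, 100, 130, 160]
        populations = [50e6, 90e6, 60e6, 20e6]
        mean, sd = weighted_std(doses, populations)
        print(f"mean = {mean:.1f} mrem/y, weighted std dev = {sd:.1f} mrem/y")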

  17. Orientations in adolescent use of information and communication technology: a digital divide by sociodemographic background, educational career, and health.

    Science.gov (United States)

    Koivusilta, Leena K; Lintonen, Tomi P; Rimpelä, Arja H

    2007-01-01

    The role of information and communication technology (ICT) in adolescents' lives was studied, with emphasis on whether there exists a digital divide based on sociodemographic background, educational career, and health. The assumption was that some groups of adolescents use ICT more so that their information utilization skills improve (computer use), while others use it primarily for entertainment (digital gaming, contacting friends by mobile phone). Data were collected by mailed survey from a nationally representative sample of 12- to 18-year-olds (n=7,292; response 70%) in 2001 and analysed using ANOVA. Computer use was most frequent among adolescents whose fathers had higher education or socioeconomic status, who came from nuclear families, and who continued studies after compulsory education. Digital gaming was associated with poor school achievement and attending vocational rather than upper secondary school. Mobile phone use was frequent among adolescents whose fathers had lower education or socioeconomic status, who came from non-nuclear families, and whose educational prospects were poor. Intensive use of each ICT form, especially of mobile phones, was associated with health problems. High social position, nuclear family, and a successful educational career signified good health in general, independently of the diverse usage of ICT. There exists a digital divide among adolescents: orientation to computer use is more common in educated well-off families while digital gaming and mobile phone use accumulate at the opposite end of the spectrum. Poorest health was reported by mobile phone users. High social background and success at school signify better health, independently of the ways of using ICT.

  18. Background dose subtraction in personnel dosimetry

    International Nuclear Information System (INIS)

    Picazo, T.; Llorca, N.; Alabau, J.

    1997-01-01

    In this paper it is proposed to take the mode of the frequency distribution of the low-dose dosemeters from each clinic that uses X-rays as the environmental background dose to be subtracted in personnel dosimetry when evaluating the doses due to practice. The problems and advantages of this indirect method of estimating the environmental background dose are discussed. The results for 60 towns are presented. (author)

  19. Background-cross-section-dependent subgroup parameters

    International Nuclear Information System (INIS)

    Yamamoto, Toshihisa

    2003-01-01

    A new set of subgroup parameters was derived that can reproduce the self-shielded cross section over a wide range of background cross sections. The subgroup parameters are expressed with a rational equation whose numerator and denominator are expansion series in the background cross section, so that the background cross section dependence is exactly taken into account in the parameters. The advantage of the new subgroup parameters is that they can reproduce the self-shielding effect not only on a group basis but also on a subgroup basis. An adaptive method is also proposed which uses a fitting procedure to evaluate the background-cross-section dependence of the parameters. One of the simple fitting formulas was able to reproduce the self-shielded subgroup cross section to within 1% of the precise evaluation. (author)
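
    The record does not give the exact fitting formula, but a generic way to fit a low-order rational function of the background cross section by linear least squares (multiplying through by the denominator) can be sketched as follows; the symbols, orders and the synthetic shielding curve are illustrative assumptions only.

        import numpy as np

        def fit_rational(x, y, num_order=2, den_order=2):
            """Fit y ~ (a0 + a1 x + ...) / (1 + b1 x + ...) by linearising:
               y = sum_i a_i x^i - y * sum_j b_j x^j  (linear in a_i, b_j)."""
            cols = [x ** i for i in range(num_order + 1)]
            cols += [-y * x ** j for j in range(1, den_order + 1)]
            A = np.column_stack(cols)
            coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
            a = coeffs[: num_order + 1]
            b = np.concatenate(([1.0], coeffs[num_order + 1:]))
            return a, b

        def eval_rational(a, b, x):
            return np.polyval(a[::-1], x) / np.polyval(b[::-1], x)

        # Toy self-shielded cross section versus background cross section sigma_0
        sigma0 = np.logspace(0, 4, 40)
        sigma_eff = 50.0 * sigma0 / (sigma0 + 200.0)    # synthetic shielding curve
        a, b = fit_rational(sigma0, sigma_eff)
        print(np.max(np.abs(eval_rational(a, b, sigma0) - sigma_eff)))  # residual of the fit (should be small)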

  20. Information geometric methods for complexity

    Science.gov (United States)

    Felice, Domenico; Cafaro, Carlo; Mancini, Stefano

    2018-03-01

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.

  1. Planck 2013 results. XVIII. The gravitational lensing-infrared background correlation

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.

    2013-01-01

    The multi-frequency capability of the Planck satellite provides information both on the integrated history of star formation (via the cosmic infrared background, or CIB) and on the distribution of dark matter (via the lensing effect on the cosmic microwave background, or CMB). The conjunction of ...

  2. Optimization approach of background value and initial item for improving prediction precision of GM(1,1) model

    Institute of Scientific and Technical Information of China (English)

    Yuhong Wang; Qin Liu; Jianrong Tang; Wenbin Cao; Xiaozhong Li

    2014-01-01

    A combination method of optimization of the background value and optimization of the initial item is proposed. The sequences of the unbiased exponential distribution are simulated and predicted through the optimization of the background value in the grey differential equations. The principle of new information priority in grey system theory and the rationality of the initial item in the original GM(1,1) model are fully expressed through the improvement of the initial item in the proposed time response function. A numerical example is employed to illustrate that the proposed method is able to simulate and predict sequences of raw data with the unbiased exponential distribution, and that it has better simulation performance and prediction precision than the original GM(1,1) model.
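
    For readers unfamiliar with the baseline being optimised, the sketch below implements the original GM(1,1) grey model with the conventional background value z1(k) = 0.5*(x1(k) + x1(k-1)) and the initial item x0(1); it is not the improved method proposed in the record, and the sample data are made up.

        import numpy as np

        def gm11_fit_predict(x0, n_ahead=3):
            """Original GM(1,1): fit on the raw sequence x0 and extrapolate n_ahead steps."""
            x0 = np.asarray(x0, dtype=float)
            x1 = np.cumsum(x0)                                 # accumulated generating operation
            z1 = 0.5 * (x1[1:] + x1[:-1])                      # conventional background values
            B = np.column_stack([-z1, np.ones_like(z1)])
            Y = x0[1:]
            (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)     # develop coefficient a, grey input b
            k = np.arange(len(x0) + n_ahead)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time response function
            x0_hat = np.diff(x1_hat, prepend=0.0)              # inverse accumulation
            x0_hat[0] = x0[0]                                  # initial item kept as x0(1)
            return x0_hat

        print(gm11_fit_predict([2.87, 3.28, 3.34, 3.62, 3.72], n_ahead=2))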

  3. Iterative estimation of the background in noisy spectroscopic data

    International Nuclear Information System (INIS)

    Zhu, M.H.; Liu, L.G.; Cheng, Y.S.; Dong, T.K.; You, Z.; Xu, A.A.

    2009-01-01

    In this paper, we present an iterative filtering method to estimate the background of noisy spectroscopic data. The proposed method avoids the calculation of the average full width at half maximum (FWHM) of the whole spectrum and the peak regions, and it can estimate the background efficiently, especially for spectroscopic data with the Compton continuum.
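
    The record does not reproduce its algorithm in detail, so the sketch below shows a generic iterative clipping filter of the kind commonly used to estimate a smooth continuum under peaks in spectroscopic data; the window width, iteration count and synthetic spectrum are illustrative choices, not the authors' settings.

        import numpy as np

        def iterative_background(spectrum, half_width=10, n_iter=30):
            """Iteratively clip each channel to the mean of its neighbours so that
            peaks are progressively removed and the smooth continuum remains."""
            bkg = np.asarray(spectrum, dtype=float).copy()
            for _ in range(n_iter):
                left = np.roll(bkg, half_width)
                right = np.roll(bkg, -half_width)
                clipped = np.minimum(bkg, 0.5 * (left + right))
                # keep edges untouched where the rolled values wrap around
                clipped[:half_width] = bkg[:half_width]
                clipped[-half_width:] = bkg[-half_width:]
                bkg = clipped
            return bkg

        # Toy spectrum: smooth continuum + Gaussian peak + Poisson noise
        x = np.arange(1024)
        continuum = 200 * np.exp(-x / 600.0)
        peak = 400 * np.exp(-0.5 * ((x - 500) / 8.0) ** 2)
        spectrum = np.random.poisson(continuum + peak)
        net = spectrum - iterative_background(spectrum)
        print(net[480:520].sum())  # approximate net peak area after background removal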

  4. Multivariate Identification of Background Contributions for the H → ττ

    CERN Document Server

    Andrejkovic, Janik Walter

    2016-01-01

    Within the H → ττ analysis it is very important to understand the background contamination in the signal region coming from events where a jet is misidentified as a hadronic tau (fake events). Currently, the fake rate method is used to estimate the number and distributions of fake events in the signal region. This method relies on the correct identification of different background types. The study presented in this report focuses on the use of boosted decision trees in order to identify different background types. It is shown how the addition of more input variables, leading to a multi-dimensional multi-classification task, improves the overall identification accuracy of the different background types.

  5. Information in medical treatment courses

    DEFF Research Database (Denmark)

    Møller, Marianne; Hollnagel, Erik; Andersen, Stig Ejdrup

    Background Unintended events and suboptimal treatment with medicines are major burdens for patients and health systems all over the world. Information processes have important roles for establishing safe and effective treatment courses. The platform for this PhD study is learning from situations that go well (Safety-II) while having a broad understanding of quality. Objectives The overall purpose is to investigate how information is used as a steering tool for quality in medical treatment courses. In this first part of the study, the role of information on medicine is analyzed in relation to the quality of medical treatment courses. Methods Systems theory, cybernetics (steering, timing and feedback) and a classic communication model are applied as theoretical frames. Two groups of patients and their information providers are studied using qualitative methods. The data analysis focuses...

  6. A System based on Adaptive Background Subtraction Approach for Moving Object Detection and Tracking in Videos

    Directory of Open Access Journals (Sweden)

    Bahadır KARASULU

    2013-04-01

    Full Text Available Video surveillance systems are based on the video and image processing research areas within computer science. Video processing covers various methods used to analyse the changes in the scene of a given video. Nowadays, video processing is one of the important areas of computer science. Two-dimensional videos are used in various segmentation, object detection and tracking processes, which exist in multimedia content-based indexing, information retrieval, visual and distributed cross-camera surveillance systems, people tracking, traffic tracking and similar applications. The background subtraction (BS) approach is a frequently used method for moving object detection and tracking. Similar methods for this problem exist in the literature. In this research study, a more efficient method is proposed as an addition to the existing ones. Based on a model produced using adaptive background subtraction (ABS), object detection and tracking software is implemented. The performance of the developed system is tested through experiments on related video datasets. The experimental results and a discussion are given in the study
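
    As a minimal illustration of the adaptive background subtraction idea (not the authors' system), the sketch below maintains a per-pixel running-average background model and flags pixels that deviate from it; the learning rate, threshold and synthetic frames are arbitrary choices.

        import numpy as np

        class RunningAverageBS:
            """Per-pixel running-average background model with a fixed learning rate."""

            def __init__(self, alpha=0.05, threshold=25.0):
                self.alpha = alpha          # adaptation (learning) rate
                self.threshold = threshold  # absolute-difference threshold
                self.background = None

            def apply(self, frame):
                frame = frame.astype(np.float32)
                if self.background is None:
                    self.background = frame.copy()
                diff = np.abs(frame - self.background)
                foreground = diff > self.threshold
                # adapt the background only where no foreground was detected
                self.background = np.where(
                    foreground, self.background,
                    (1 - self.alpha) * self.background + self.alpha * frame)
                return foreground

        # Usage on synthetic grayscale frames; real frames would come from a video reader
        bs = RunningAverageBS()
        for _ in range(50):
            frame = np.random.normal(100, 2, (240, 320))  # static scene with sensor noise
            mask = bs.apply(frame)
        print(mask.mean())  # fraction of pixels flagged as moving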

  7. Control method for biped locomotion robots based on ZMP information

    International Nuclear Information System (INIS)

    Kume, Etsuo

    1994-01-01

    The Human Acts Simulation Program (HASP) started as a ten year program of the Computing and Information Systems Center (CISC) at the Japan Atomic Energy Research Institute (JAERI) in 1987. A mechanical design study of biped locomotion robots for patrol and inspection in nuclear facilities is being performed as an item of the research scope. One of the goals of our research is to design a biped locomotion robot for practical use in nuclear facilities. So far, we have studied several dynamic walking patterns. In conventional control methods for biped locomotion robots, program control is used based on preset walking patterns, so it does not have robustness against, for example, a dynamic change of the walking pattern. Therefore, a real-time control method based on dynamic information of the robot states is necessary for high walking performance. In this study a new control method based on Zero Moment Point (ZMP) information is proposed as one such real-time control method. The proposed method is discussed and validated based on numerical simulation. (author)
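
    For context, the zero moment point of a set of point masses can be computed from their positions and accelerations as sketched below; this is the standard textbook formula (link rotational inertia neglected, flat ground at p_z = 0), not the controller described in the record, and the two-mass example values are hypothetical.

        import numpy as np

        def zmp_x(masses, x, z, ddx, ddz, g=9.81):
            """x-coordinate of the zero moment point for point masses on flat ground:
               x_zmp = sum m_i [ (ddz_i + g) x_i - z_i ddx_i ] / sum m_i (ddz_i + g)"""
            m = np.asarray(masses, dtype=float)
            num = np.sum(m * ((ddz + g) * x - z * ddx))
            den = np.sum(m * (ddz + g))
            return num / den

        # Two-mass toy example: torso and swing leg
        masses = np.array([40.0, 10.0])
        x = np.array([0.00, 0.10]);  z = np.array([0.90, 0.40])
        ddx = np.array([0.50, 2.00]); ddz = np.array([0.00, 0.00])
        print(zmp_x(masses, x, z, ddx, ddz))   # should stay inside the support polygon for stable walking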

  8. Partition function for a singular background

    International Nuclear Information System (INIS)

    McKenzie-Smith, J.J.; Naylor, W.

    2005-01-01

    We present a method for evaluating the partition function in a varying external field. Specifically, we look at the case of a non-interacting, charged, massive scalar field at finite temperature with an associated chemical potential in the background of a delta-function potential. Whilst we present a general method, valid at all temperatures, we only give the result for the leading order term in the high temperature limit. Although the derivative expansion breaks down for inhomogeneous backgrounds we are able to obtain the high temperature expansion, as well as an analytic expression for the zero point energy, by way of a different approximation scheme, which we call the local Born approximation (LBA)

  9. Partition function for a singular background

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie-Smith, J.J. [Financial Risk Management Ltd, 15 Adam Street, London WC2N 6AH (United Kingdom)]. E-mail: julian.mckenzie-smith@frmhedge.com; Naylor, W. [Yukawa Institute for Theoretical Physics, Kyoto University, Kyoto 606-8502 (Japan)]. E-mail: naylor@yukawa.kyoto-u.ac.jp

    2005-03-17

    We present a method for evaluating the partition function in a varying external field. Specifically, we look at the case of a non-interacting, charged, massive scalar field at finite temperature with an associated chemical potential in the background of a delta-function potential. Whilst we present a general method, valid at all temperatures, we only give the result for the leading order term in the high temperature limit. Although the derivative expansion breaks down for inhomogeneous backgrounds we are able to obtain the high temperature expansion, as well as an analytic expression for the zero point energy, by way of a different approximation scheme, which we call the local Born approximation (LBA)

  10. Adaptive cancellation of geomagnetic background noise for magnetic anomaly detection using coherence

    International Nuclear Information System (INIS)

    Liu, Dunge; Xu, Xin; Huang, Chao; Zhu, Wanhua; Liu, Xiaojun; Fang, Guangyou; Yu, Gang

    2015-01-01

    Magnetic anomaly detection (MAD) is an effective method for the detection of ferromagnetic targets against background magnetic fields. Currently, the performance of MAD systems is mainly limited by the background geomagnetic noise. Several techniques have been developed to detect target signatures, such as the synchronous reference subtraction (SRS) method. In this paper, we propose an adaptive coherent noise suppression (ACNS) method. The proposed method is capable of evaluating and detecting weak anomaly signals buried in background geomagnetic noise. Tests with real-world recorded magnetic signals show that the ACNS method can suppress the background geomagnetic noise by about 21 dB or more in high background geomagnetic field environments. Additionally, as a general form of the SRS method, the ACNS method offers appreciable advantages over the existing algorithms. Compared to the SRS method, the ACNS algorithm can eliminate false target signals and represents a noise suppression capability improvement of 6.4 dB. The positive outcomes in terms of intelligibility make this method a potential candidate for application in MAD systems. (paper)
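
    The ACNS algorithm itself is not reproduced in the record; as a generic illustration of reference-based cancellation of background geomagnetic noise, the sketch below runs a standard LMS adaptive filter that subtracts the part of the survey signal coherent with a distant reference magnetometer. All signal parameters are synthetic.

        import numpy as np

        def lms_cancel(primary, reference, taps=16, mu=1e-3):
            """Least-mean-squares adaptive canceller: the output is the primary signal
            minus its best running estimate formed from the reference channel."""
            w = np.zeros(taps)
            out = np.zeros_like(primary)
            for n in range(taps, len(primary)):
                x = reference[n - taps:n][::-1]        # most recent reference samples
                y = w @ x                              # estimated coherent background
                e = primary[n] - y                     # residual = anomaly + incoherent noise
                w += 2 * mu * e * x                    # LMS weight update
                out[n] = e
            return out

        # Synthetic test: shared geomagnetic background plus a weak local anomaly on the primary sensor
        t = np.arange(20000) / 1000.0
        background = np.sin(2 * np.pi * 0.3 * t) + 0.2 * np.random.randn(len(t))
        anomaly = 0.05 * np.exp(-0.5 * ((t - 10.0) / 0.5) ** 2)
        primary = background + anomaly
        reference = background + 0.01 * np.random.randn(len(t))
        residual = lms_cancel(primary, reference)
        print(residual[9500:10500].max())   # the anomaly should survive the cancellation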

  11. Practical Methods for Information Security Risk Management

    Directory of Open Access Journals (Sweden)

    Cristian AMANCEI

    2011-01-01

    Full Text Available The purpose of this paper is to present some directions to perform the risk man-agement for information security. The article follows to practical methods through question-naire that asses the internal control, and through evaluation based on existing controls as part of vulnerability assessment. The methods presented contains all the key elements that concurs in risk management, through the elements proposed for evaluation questionnaire, list of threats, resource classification and evaluation, correlation between risks and controls and residual risk computation.

  12. Infrared images target detection based on background modeling in the discrete cosine domain

    Science.gov (United States)

    Ye, Han; Pei, Jihong

    2018-02-01

    Background modeling is the critical technology for detecting moving targets in video surveillance. Most background modeling techniques are aimed at land monitoring and operate in the spatial domain. Establishing a background model becomes difficult when the scene is a complex, fluctuating sea surface. In this paper, the background stability and the separability between background and target are analyzed in depth in the discrete cosine transform (DCT) domain; on this basis, we propose a background modeling method. The proposed method models each frequency point with a single Gaussian to represent the background, and the target is extracted by suppressing the background coefficients. Experimental results show that our approach can establish an accurate background model for seawater, and the detection results outperform other background modeling methods in the spatial domain.
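
    A toy sketch of per-coefficient Gaussian background modelling in the DCT domain is given below: each 2-D DCT coefficient is described by a mean and standard deviation estimated from background frames, and coefficients of a new frame that stay within a few standard deviations are suppressed before the inverse transform. The thresholds and synthetic frames are illustrative assumptions, not the parameters of the paper.

        import numpy as np
        from scipy.fft import dctn, idctn

        def fit_background_model(frames):
            """Per-coefficient Gaussian background model from a stack of background frames."""
            coeffs = np.stack([dctn(f, norm='ortho') for f in frames])
            return coeffs.mean(axis=0), coeffs.std(axis=0) + 1e-6

        def detect(frame, mean, std, k=3.0):
            """Suppress DCT coefficients consistent with the background, keep the rest."""
            c = dctn(frame, norm='ortho')
            residual = np.where(np.abs(c - mean) > k * std, c - mean, 0.0)
            return idctn(residual, norm='ortho')      # image containing mostly target energy

        rng = np.random.default_rng(0)
        sea = [rng.normal(100, 5, (64, 64)) for _ in range(50)]   # fluctuating background frames
        mean, std = fit_background_model(sea)
        test = rng.normal(100, 5, (64, 64))
        test[28:34, 28:34] += 80.0                                  # small bright target
        test_bg = rng.normal(100, 5, (64, 64))                      # target-free frame for comparison
        print(np.abs(detect(test, mean, std)).sum(), np.abs(detect(test_bg, mean, std)).sum())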

  13. Gamma-Ray Background Variability in Mobile Detectors

    Science.gov (United States)

    Aucott, Timothy John

    Gamma-ray background radiation significantly reduces detection sensitivity when searching for radioactive sources in the field, such as in wide-area searches for homeland security applications. Mobile detector systems in particular must contend with a variable background that is not necessarily known or even measurable a priori. This work will present measurements of the spatial and temporal variability of the background, with the goal of merging gamma-ray detection, spectroscopy, and imaging with contextual information--a "nuclear street view" of the ubiquitous background radiation. The gamma-ray background originates from a variety of sources, both natural and anthropogenic. The dominant sources in the field are the primordial isotopes potassium-40, uranium-238, and thorium-232, as well as their decay daughters. In addition to the natural background, many artificially-created isotopes are used for industrial or medical purposes, and contamination from fission products can be found in many environments. Regardless of origin, these backgrounds will reduce detection sensitivity by adding both statistical as well as systematic uncertainty. In particular, large detector arrays will be limited by the systematic uncertainty in the background and will suffer from a high rate of false alarms. The goal of this work is to provide a comprehensive characterization of the gamma-ray background and its variability in order to improve detection sensitivity and evaluate the performance of mobile detectors in the field. Large quantities of data are measured in order to study their performance at very low false alarm rates. Two different approaches, spectroscopy and imaging, are compared in a controlled study in the presence of this measured background. Furthermore, there is additional information that can be gained by correlating the gamma-ray data with contextual data streams (such as cameras and global positioning systems) in order to reduce the variability in the background

  14. Probabilistic BPRRC: Robust Change Detection against Illumination Changes and Background Movements

    Science.gov (United States)

    Yokoi, Kentaro

    This paper presents Probabilistic Bi-polar Radial Reach Correlation (PrBPRRC), a change detection method that is robust against illumination changes and background movements. Most of the traditional change detection methods are robust against either illumination changes or background movements; BPRRC is one of the illumination-robust change detection methods. We introduce a probabilistic background texture model into BPRRC and add the robustness against background movements including foreground invasions such as moving cars, walking people, swaying trees, and falling snow. We show the superiority of PrBPRRC in the environment with illumination changes and background movements by using three public datasets and one private dataset: ATON Highway data, Karlsruhe traffic sequence data, PETS 2007 data, and Walking-in-a-room data.

  15. Methods of Certification tests PLC-Networks in Compliance Safety Information

    Directory of Open Access Journals (Sweden)

    A. A. Balaev

    2011-12-01

    Full Text Available The aim of this research was to describe a methodology for auditing PLC networks to meet information security requirements. The technique is based on the provisions of the guidance documents of FSTEC of Russia and on model methods for testing objects of informatization for information security.

  16. Background paper on aquaculture research

    OpenAIRE

    Wenblad, Axel; Jokumsen, Alfred; Eskelinen, Unto; Torrissen, Ole

    2013-01-01

    The Board of MISTRA established in 2012 a Working Group (WG) on Aquaculture to provide the Board with background information for its upcoming decision on whether the foundation should invest in aquaculture research. The WG included Senior Advisor Axel Wenblad, Sweden (Chairman), Professor Ole Torrissen, Norway, Senior Advisory Scientist Unto Eskelinen, Finland and Senior Advisory Scientist Alfred Jokumsen, Denmark. The WG performed an investigation of the Swedish aquaculture sector including ...

  17. a Task-Oriented Disaster Information Correlation Method

    Science.gov (United States)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historic data, case data, simulated data, and disaster products. However, current data management and service systems have become increasingly inefficient due to the task variety and heterogeneous data. For emergency task-oriented applications, data searches rely primarily on human experience and simple metadata indices, whose high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service with the objectives of 1) putting forward a disaster task ontology and a data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of the uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond the traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.

  18. Ethnic differences in informed decision-making about prenatal screening for Down's syndrome

    NARCIS (Netherlands)

    Fransen, Mirjam P.; Essink-Bot, Marie-Louise; Vogel, Ineke; Mackenbach, Johan P.; Steegers, Eric A. P.; Wildschut, Hajo I. J.

    2010-01-01

    BACKGROUND: The aim of this study was to assess ethnic variations in informed decision-making about prenatal screening for Down's syndrome and to examine the contribution of background and decision-making variables. METHODS: Pregnant women of Dutch, Turkish and Surinamese origin were recruited

  19. Moving object detection using background subtraction

    CERN Document Server

    Shaikh, Soharab Hossain; Chaki, Nabendu

    2014-01-01

    This Springer Brief presents a comprehensive survey of existing background subtraction methodologies. It presents a framework for quantitative performance evaluation of different approaches and summarizes the public databases available for research purposes. This well-known methodology has applications in moving object detection from video captured with a stationary camera, separating foreground and background objects, and object classification and recognition. The authors identify common challenges faced by researchers including gradual or sudden illumination change, dynamic background

  20. TeV Blazars and Cosmic Infrared Background Radiation

    OpenAIRE

    Aharonian, F. A.

    2001-01-01

    The recent developments in studies of TeV radiation from blazars are highlighted and the implications of these results for derivation of cosmologically important information about the cosmic infrared background radiation are discussed.

  1. Method of sharing mobile unit state information between base station routers

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Polakos, Paul Anthony; Rajkumar, Ajay; Sundaram, Ganapathy S.

    2007-01-01

    The present invention provides a method of operating a first base station router. The method may include transmitting state information associated with at least one inactive mobile unit to at least one second base station router. The state information is usable to initiate an active session with the

  2. Method of sharing mobile unit state information between base station routers

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Polakos, Paul Anthony; Rajkumar, Ajay; Sundaram, Ganapathy S.

    2010-01-01

    The present invention provides a method of operating a first base station router. The method may include transmitting state information associated with at least one inactive mobile unit to at least one second base station router. The state information is usable to initiate an active session with the

  3. Background Reduction around Prompt Gamma-ray Peaks from Korean White Ginseng

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Y. N.; Sun, G. M.; Moon, J. H.; Chung, Y. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Y. E. [Chung-buk National University, Chungju (Korea, Republic of)

    2007-10-15

    Prompt gamma-ray activation analysis (PGAA) is recognized as a very powerful and unique nuclear method in terms of its non-destructive nature, high precision, and short analysis time. This method is used for the analysis of trace elements in various types of sample matrix such as metallurgical, environmental, and biological samples. When a spectrum is evaluated, the background continuum is a major disturbing factor for a precise and accurate analysis. Furthermore, a prompt gamma spectrum is complicated and covers a wide energy range. To overcome this limitation, reduction of the background is important for PGAA analysis. Background-reduction methods divide into those using electronic equipment, such as a suppression mode, and those using principal component analysis (PCA), a multivariate statistical method. In PGAA analysis, Lee et al. compared background reduction methods such as PCA and the wavelet transform for prompt gamma-ray spectra. Lim et al. have applied the multivariate statistical method to the identification of low-statistics peaks from explosives. In this paper, effective reduction of the background in prompt gamma spectra using PCA is applied to the prompt gamma-ray peaks from Korean Baeksam (Korean white ginseng)

  4. The multinational birth cohort of EuroPrevall: background, aims and methods

    NARCIS (Netherlands)

    Keil, T.; McBride, D.; Grimshaw, K.; Niggemann, B.; Xepapadaki, P.; Zannikos, K.; Sigurdardottir, S. T.; Clausen, M.; Reche, M.; Pascual, C.; Stanczyk, A. P.; Kowalski, M. L.; Dubakiene, R.; Drasutiene, G.; Roberts, G.; Schoemaker, A.-F. A.; Sprikkelman, A. B.; Fiocchi, A.; Martelli, A.; Dufour, S.; Hourihane, J.; Kulig, M.; Wjst, M.; Yazdanbakhsh, M.; Szépfalusi, Z.; van Ree, R.; Willich, S. N.; Wahn, U.; Mills, E. N. C.; Beyer, K.

    2010-01-01

    Background/aim: The true prevalence and risk factors of food allergies in children are not known because estimates were based predominantly on subjective assessments and skin or serum tests of allergic sensitization to food. The diagnostic gold standard, a double-blind placebo-controlled food

  5. Advanced Background Subtraction Applied to Aeroacoustic Wind Tunnel Testing

    Science.gov (United States)

    Bahr, Christopher J.; Horne, William C.

    2015-01-01

    An advanced form of background subtraction is presented and applied to aeroacoustic wind tunnel data. A variant of this method has seen use in other fields such as climatology and medical imaging. The technique, based on an eigenvalue decomposition of the background noise cross-spectral matrix, is robust against situations where isolated background auto-spectral levels are measured to be higher than levels of combined source and background signals. It also provides an alternate estimate of the cross-spectrum, which previously might have poor definition for low signal-to-noise ratio measurements. Simulated results indicate similar performance to conventional background subtraction when the subtracted spectra are weaker than the true contaminating background levels. Superior performance is observed when the subtracted spectra are stronger than the true contaminating background levels. Experimental results show limited success in recovering signal behavior for data where conventional background subtraction fails. They also demonstrate the new subtraction technique's ability to maintain a proper coherence relationship in the modified cross-spectral matrix. Beam-forming and de-convolution results indicate the method can successfully separate sources. Results also show a reduced need for the use of diagonal removal in phased array processing, at least for the limited data sets considered.
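
    A schematic numpy version of eigenvalue-based background subtraction for a cross-spectral matrix (CSM) is sketched below: the background CSM is diagonalised, the measured CSM is rotated into that eigenbasis, the background eigenvalues are subtracted from its diagonal with a floor at zero, and the result is rotated back. This follows the general idea described above but is not the authors' exact formulation; the four-channel example is synthetic.

        import numpy as np

        def eig_background_subtract(csm_meas, csm_bkg):
            """Subtract the background in the eigenbasis of the background CSM,
            flooring the subtracted diagonal at zero to avoid over-subtraction."""
            evals, evecs = np.linalg.eigh(csm_bkg)          # Hermitian eigendecomposition
            rotated = evecs.conj().T @ csm_meas @ evecs     # measurement in the background basis
            diag = np.real(np.diag(rotated)) - evals
            np.fill_diagonal(rotated, np.maximum(diag, 0.0))
            return evecs @ rotated @ evecs.conj().T         # rotate back

        # Toy example: one coherent source plus a diagonal background on 4 microphones
        rng = np.random.default_rng(1)
        steer = np.exp(1j * rng.uniform(0, 2 * np.pi, 4))
        csm_src = 2.0 * np.outer(steer, steer.conj())
        csm_bkg = np.diag(rng.uniform(0.5, 1.5, 4)).astype(complex)
        csm_meas = csm_src + csm_bkg
        cleaned = eig_background_subtract(csm_meas, csm_bkg)
        print(np.allclose(cleaned, csm_src, atol=1e-9))     # background removed, source CSM recovered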

  6. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain

    Science.gov (United States)

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C. M.; Chen, Zhong

    2017-08-01

    Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest, e.g. brain, such as the air-tissue interface. In the vicinity of the air-tissue boundary, e.g. skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient and these regions often need to be excluded by brain mask erosion at the expense of losing information on the local field and thus susceptibility measures in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We utilize it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated from external sources can be effectively removed to get a more accurate estimation of the local field and thus of the QSM dipole inversion to map local tissue susceptibility sources. Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to the V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions

  7. Beyond scene gist: Objects guide search more than scene background.

    Science.gov (United States)

    Koehler, Kathryn; Eckstein, Miguel P

    2017-06-01

    Although the facilitation of visual search by contextual information is well established, there is little understanding of the independent contributions of different types of contextual cues in scenes. Here we manipulated 3 types of contextual information: object co-occurrence, multiple object configurations, and background category. We isolated the benefits of each contextual cue to target detectability, its impact on decision bias, confidence, and the guidance of eye movements. We find that object-based information guides eye movements and facilitates perceptual judgments more than scene background. The degree of guidance and facilitation of each contextual cue can be related to its inherent informativeness about the target spatial location as measured by human explicit judgments about likely target locations. Our results improve the understanding of the contributions of distinct contextual scene components to search and suggest that the brain's utilization of cues to guide eye movements is linked to the cue's informativeness about the target's location. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Background approximation in automatic qualitative X-ray-fluorescent analysis

    International Nuclear Information System (INIS)

    Jordanov, J.; Tsanov, T.; Stefanov, R.; Jordanov, N.; Paunov, M.

    1982-01-01

    An empirical method of finding the dependence of the background intensity I_bg on the wavelength is proposed, based on the approximation of the experimentally found values for the background in the course of an automatic qualitative X-ray fluorescence analysis with a pre-set curve. It is assumed that the dependence I(λ) is well approximated by a curve of the type I_bg = (λ - λ_0)^(f_1(λ)) exp[f_2(λ)], where f_1(λ) and f_2(λ) are linear functions with respect to the sought parameters. This assumption was checked on a 'pure' starch background, in which it is not known beforehand which points belong to the background. It was assumed that the dependence I(λ) can be found from all minima in the spectrum. Three types of minima have been distinguished: 1. the lowest point between two well-resolved X-ray lines; 2. a minimum obtained as a result of statistical fluctuations of the measured signal; 3. the lowest point between two overlapping lines. The minima strongly deviating from the background are removed from the obtained set. The remaining minima serve as a basis for the approximation of the dependence I(λ). The unknown parameters are determined by means of the least-squares method (LSM). The curve approximated by this method is closer to the real background than the background determined by the method described by Kigaki Denki, as the effect of all recorded minima is taken into account. As an example the PbTe spectrum recorded with a LiF 220 crystal is shown graphically. The curve describes the background of the spectrum well even in the regions in which there are no minima belonging to the background. (authors)
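
    Because f_1 and f_2 are linear in the sought parameters, taking the logarithm of the model above turns the fit into an ordinary linear least-squares problem. A sketch of that step is given below with made-up data and an assumed fixed λ_0; it illustrates the functional form only, not the authors' implementation.

        import numpy as np

        def fit_background(lmbda, intensity, lambda0):
            """Fit ln I = (a0 + a1*l)*ln(l - lambda0) + (b0 + b1*l) by linear least squares."""
            l = np.asarray(lmbda, dtype=float)
            ln_i = np.log(intensity)
            ln_dl = np.log(l - lambda0)
            A = np.column_stack([ln_dl, l * ln_dl, np.ones_like(l), l])
            coef, *_ = np.linalg.lstsq(A, ln_i, rcond=None)
            return coef                                  # [a0, a1, b0, b1]

        def background(coef, l, lambda0):
            a0, a1, b0, b1 = coef
            return (l - lambda0) ** (a0 + a1 * l) * np.exp(b0 + b1 * l)

        # Synthetic background minima (wavelength in arbitrary units)
        lambda0 = 0.2
        l = np.linspace(0.5, 3.0, 25)
        true = (l - lambda0) ** (1.4 - 0.1 * l) * np.exp(2.0 - 0.5 * l)
        coef = fit_background(l, true * np.random.uniform(0.97, 1.03, l.size), lambda0)
        print(coef)   # should be close to [1.4, -0.1, 2.0, -0.5]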

  9. A Method Validation for Determination of Gross Alpha and Gross Beta in Water Sample Using Low Background Gross Alpha/ Beta Counting System

    International Nuclear Information System (INIS)

    Zal Uyun Wan Mahmood; Norfaizal Mohamed; Nita Salina Abu Bakar

    2016-01-01

    Method validation (MV) for the measurement of gross alpha and gross beta activity in water (drinking, mineral and environmental) samples using the Low Background Gross Alpha/Beta Counting System was performed to characterize the precision, accuracy and reliability of the results. The main objective of this assignment is to ensure that both the instrument and the method always perform well and produce accurate and reliable results. Generally, almost all of the estimated RSD, z-score and U-score results were reliable, being recorded as ≤30 %, less than 2 and less than 1.5, respectively. The Minimum Detectable Activity (MDA) was estimated based on a counting time of 100 minutes and the present background counting rates for gross alpha (0.01 - 0.35 cpm) and gross beta (0.50 - 2.18 cpm). The estimated Detection Limit (DL) was 0.1 Bq/L for gross alpha and 0.2 Bq/L for gross beta, and the expanded uncertainty was relatively small: 9.77 % for gross alpha and 10.57 % for gross beta. In line with that, the background counting rate for gross alpha and gross beta ranged over 0.01 - 0.35 cpm and 0.50 - 2.18 cpm, respectively, while the sample volume was set at a minimum of 500 mL and a maximum of 2000 mL. This proves that the accuracy and precision generated by the developed method/technique are satisfactory, and the method is recommended for use. Therefore, it can be concluded that the MV raised no doubt about the ability of the developed method. The test results showed the method is suitable for all types of water samples containing several radionuclides and elements as well as any impurities that interfere with the measurement of gross alpha and gross beta. (author)
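
    The record reports the counting time and background rates but not the detection-limit formula; for orientation only, the widely used Currie expression for minimum detectable activity is sketched below with illustrative efficiency and sample-volume values, which may differ from those actually used in the validation.

        import math

        def currie_mda(background_cpm, count_time_min, efficiency, volume_l):
            """Currie minimum detectable activity (Bq/L) for a gross counting measurement:
               MDA = (2.71 + 4.65*sqrt(B)) / (60 * efficiency * t * V), B = background counts."""
            b_counts = background_cpm * count_time_min
            ld_counts = 2.71 + 4.65 * math.sqrt(b_counts)          # detection limit in counts
            return ld_counts / (60.0 * efficiency * count_time_min * volume_l)

        # Illustrative values: 100 min count, 0.2 cpm alpha background, 25 % efficiency, 1 L of sample
        print(currie_mda(background_cpm=0.2, count_time_min=100, efficiency=0.25, volume_l=1.0))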

  10. Information loss method to measure node similarity in networks

    Science.gov (United States)

    Li, Yongli; Luo, Peng; Wu, Chong

    2014-09-01

    Similarity measurement for the network node has been paid increasing attention in the field of statistical physics. In this paper, we propose an entropy-based information loss method to measure the node similarity. The whole model is established based on this idea that less information loss is caused by seeing two more similar nodes as the same. The proposed new method has relatively low algorithm complexity, making it less time-consuming and more efficient to deal with the large scale real-world network. In order to clarify its availability and accuracy, this new approach was compared with some other selected approaches on two artificial examples and synthetic networks. Furthermore, the proposed method is also successfully applied to predict the network evolution and predict the unknown nodes' attributions in the two application examples.

  11. Patient representatives' views on patient information in clinical cancer trials

    OpenAIRE

    Dellson, Pia; Nilbert, Mef; Carlsson, Christina

    2016-01-01

    Background Patient enrolment into clinical trials is based on oral information and informed consent, which includes an information sheet and a consent certificate. The written information should be complete, but at the same time risks being so complex that it may be questioned if a fully informed consent is possible to provide. We explored patient representatives' views and perceptions on the written trial information used in clinical cancer trials. Methods Written patient information leaflets...

  12. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    Science.gov (United States)

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. We aimed to investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  13. Unexploded ordnance issues at Aberdeen Proving Ground: Background information

    Energy Technology Data Exchange (ETDEWEB)

    Rosenblatt, D.H.

    1996-11-01

    This document summarizes currently available information about the presence and significance of unexploded ordnance (UXO) in the two main areas of Aberdeen Proving Ground: Aberdeen Area and Edgewood Area. Known UXO in the land ranges of the Aberdeen Area consists entirely of conventional munitions. The Edgewood Area contains, in addition to conventional munitions, a significant quantity of chemical-munition UXO, which is reflected in the presence of chemical agent decomposition products in Edgewood Area ground-water samples. It may be concluded from current information that the UXO at Aberdeen Proving Ground has not adversely affected the environment through release of toxic substances to the public domain, especially not by water pathways, and is not likely to do so in the near future. Nevertheless, modest but periodic monitoring of groundwater and nearby surface waters would be a prudent policy.

  14. A Dynamic and Adaptive Selection Radar Tracking Method Based on Information Entropy

    Directory of Open Access Journals (Sweden)

    Ge Jianjun

    2017-12-01

    Full Text Available Nowadays, the battlefield environment has become much more complex and variable. Based on the principle of information entropy, this paper presents a quantitative measure, and a lower bound, for the amount of target information acquired from multiple radar observations, in order to adaptively and dynamically organize battlefield detection resources. Furthermore, to minimize the given lower bound on the information entropy of the target measurement at every moment, a method is proposed for dynamically and adaptively selecting radars that provide a large amount of information for target tracking. The simulation results indicate that the proposed method has higher tracking accuracy than tracking without entropy-based adaptive radar selection.

  15. A population-based approach to background discrimination in particle physics

    International Nuclear Information System (INIS)

    Colecchia, Federico

    2012-01-01

    Background properties in experimental particle physics are typically estimated from control samples corresponding to large numbers of events. This can provide precise knowledge of average background distributions, but typically does not take into account statistical fluctuations in a data set of interest. A novel approach based on mixture model decomposition is presented, as a way to extract additional information about statistical fluctuations from a given data set with a view to improving on knowledge of background distributions obtained from control samples. Events are treated as heterogeneous populations comprising particles originating from different processes, and individual particles are mapped to a process of interest on a probabilistic basis. The proposed approach makes it possible to estimate features of the background distributions from the data, and to extract information about statistical fluctuations that would otherwise be lost using traditional supervised classifiers trained on high-statistics control samples. A feasibility study on Monte Carlo data is presented, together with a comparison with existing techniques. Finally, the prospects for the development of tools for intensive offline analysis of individual interesting events at the Large Hadron Collider are discussed.
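
    Note: the record does not specify the mixture model used. As a rough illustration of the underlying idea (fitting a mixture directly to the data of interest and mapping each particle to a component probabilistically), a minimal sketch with a two-component Gaussian mixture is given below; the features, sample sizes and component interpretation are all invented for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy "event": particles from two processes mixed together, one feature each
# (e.g. transverse momentum); the true process labels are unknown at analysis time.
signal = rng.normal(loc=3.0, scale=0.5, size=(200, 1))
background = rng.normal(loc=1.0, scale=0.8, size=(800, 1))
particles = np.vstack([signal, background])

# Fit a two-component mixture directly to the data of interest, so that the
# component shapes reflect this data set's own statistical fluctuations.
gmm = GaussianMixture(n_components=2, random_state=0).fit(particles)

# Map each particle to a component on a probabilistic basis.
posteriors = gmm.predict_proba(particles)          # shape (n_particles, 2)
signal_component = int(np.argmax(gmm.means_))      # component with the larger mean
p_signal = posteriors[:, signal_component]         # per-particle signal probability

print(f"estimated signal fraction: {p_signal.mean():.2f}")  # ~0.2 for this toy mix
```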

  16. Combining qualitative and quantitative operational research methods to inform quality improvement in pathways that span multiple settings

    Science.gov (United States)

    Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin

    2017-01-01

    Background Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Methods Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. Results A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further
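
    Note: the record names CART as the quantitative component but gives no dataset details. The minimal sketch below shows how a shallow classification tree can surface a small number of interpretable risk groups; the data frame, feature names and outcome definition are invented stand-ins for the national audit data, not taken from the study.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 500

# Synthetic stand-in for audit data: two illustrative risk factors and an
# adverse-outcome flag (the real covariates and outcome are defined by the audit).
data = pd.DataFrame({
    "weight_kg": rng.normal(3.2, 0.6, n).clip(1.5, 5.0),
    "preterm": rng.integers(0, 2, n),
})
risk = 0.05 + 0.15 * data["preterm"] + 0.10 * (data["weight_kg"] < 2.5)
data["adverse_outcome"] = rng.random(n) < risk

# A shallow tree keeps the risk groups few and interpretable for discussion
# with clinicians and patient representatives.
tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=30, random_state=0)
tree.fit(data[["weight_kg", "preterm"]], data["adverse_outcome"])

print(export_text(tree, feature_names=["weight_kg", "preterm"]))
```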

  17. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated for their support of synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  18. Tools for forming strategies for remediation of forests and park areas in northern Europe after radioactive contamination: background and techniques

    International Nuclear Information System (INIS)

    Hubbard, L.; Rantavaara, A.; Andersson, K.; Roed, J.

    2002-01-01

    This report compiles background information that can be used in planning appropriate countermeasures for forest and park areas in Denmark, Sweden, Finland and Norway, in case a nuclear accident results in large-scale contamination of forests. The information is formulated to inform the forestry sector and radiation protection experts about the practicality of both forest management techniques and mechanical cleanup methods, for use in their planning of specific strategies that can lead to an optimal use of contaminated forests. Decisions will depend on the site and the actual situation after radioactive deposition to forested areas, but the report provides background information from investigations performed before an accident occurs that will make the process more effective. The report also discusses the radiological consequences of producing energy from biomass contaminated by a major nuclear accident, both in the context of normal bio-fuel energy production and as a means of reducing potentially severe environmental problems in the forest by firing power plants with highly contaminated forest biomass. (au)

  19. Background of SAM atom-fraction profiles

    International Nuclear Information System (INIS)

    Ernst, Frank

    2017-01-01

    Atom-fraction profiles acquired by SAM (scanning Auger microprobe) have important applications, e.g. in the context of alloy surface engineering by infusion of carbon or nitrogen through the alloy surface. However, such profiles often exhibit an artifact in form of a background with a level that anti-correlates with the local atom fraction. This article presents a theory explaining this phenomenon as a consequence of the way in which random noise in the spectrum propagates into the discretized differentiated spectrum that is used for quantification. The resulting model of “energy channel statistics” leads to a useful semi-quantitative background reduction procedure, which is validated by applying it to simulated data. Subsequently, the procedure is applied to an example of experimental SAM data. The analysis leads to conclusions regarding optimum experimental acquisition conditions. The proposed method of background reduction is based on general principles and should be useful for a broad variety of applications. - Highlights: • Atom-fraction–depth profiles of carbon measured by scanning Auger microprobe • Strong background, varies with local carbon concentration. • Needs correction e.g. for quantitative comparison with simulations • Quantitative theory explains background. • Provides background removal strategy and practical advice for acquisition

  20. Background of SAM atom-fraction profiles

    Energy Technology Data Exchange (ETDEWEB)

    Ernst, Frank

    2017-03-15

    Atom-fraction profiles acquired by SAM (scanning Auger microprobe) have important applications, e.g. in the context of alloy surface engineering by infusion of carbon or nitrogen through the alloy surface. However, such profiles often exhibit an artifact in form of a background with a level that anti-correlates with the local atom fraction. This article presents a theory explaining this phenomenon as a consequence of the way in which random noise in the spectrum propagates into the discretized differentiated spectrum that is used for quantification. The resulting model of “energy channel statistics” leads to a useful semi-quantitative background reduction procedure, which is validated by applying it to simulated data. Subsequently, the procedure is applied to an example of experimental SAM data. The analysis leads to conclusions regarding optimum experimental acquisition conditions. The proposed method of background reduction is based on general principles and should be useful for a broad variety of applications. - Highlights: • Atom-fraction–depth profiles of carbon measured by scanning Auger microprobe • Strong background, varies with local carbon concentration. • Needs correction e.g. for quantitative comparison with simulations • Quantitative theory explains background. • Provides background removal strategy and practical advice for acquisition.

  1. Research on a Method of Geographical Information Service Load Balancing

    Science.gov (United States)

    Li, Heyuan; Li, Yongxing; Xue, Zhiyong; Feng, Tao

    2018-05-01

    With the development of geographical information service technologies, how to achieve intelligent scheduling and high-concurrency access to geographical information service resources through load balancing is a focus of current study. This paper presents a dynamic load-balancing algorithm. In the algorithm, each type of geographical information service is matched with a corresponding server group; the RED algorithm is then combined with a double-threshold method to judge the load state of each server node; finally, the service is scheduled using weighted probabilities over a given period. An experimental system built on a server cluster demonstrates the effectiveness of the method presented in this paper.
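
    Note: the record does not describe the scheduling in code. The sketch below illustrates only the double-threshold load classification and the weighted probabilistic choice; the RED queue-management part is omitted, and the node records, thresholds and weights are hypothetical values chosen for the example.

```python
import random

# Hypothetical node records plus the double thresholds used to classify a node
# as light, normal, or overloaded.
LOW, HIGH = 0.6, 0.85

nodes = [
    {"name": "gis-node-1", "load": 0.30, "capacity": 100},
    {"name": "gis-node-2", "load": 0.72, "capacity": 80},
    {"name": "gis-node-3", "load": 0.91, "capacity": 120},  # overloaded
]

def classify(node):
    if node["load"] >= HIGH:
        return "overloaded"
    return "light" if node["load"] < LOW else "normal"

def pick_node(nodes):
    """Prefer lightly loaded nodes, never pick overloaded ones, and choose among
    the candidates with probability proportional to their spare capacity."""
    light = [n for n in nodes if classify(n) == "light"]
    candidates = light or [n for n in nodes if classify(n) != "overloaded"]
    if not candidates:                      # everything is hot: fall back to least loaded
        return min(nodes, key=lambda n: n["load"])
    weights = [n["capacity"] * (1.0 - n["load"]) for n in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

# Dispatch a batch of requests; node 3 is never chosen while it stays above HIGH.
counts = {n["name"]: 0 for n in nodes}
for _ in range(1000):
    counts[pick_node(nodes)["name"]] += 1
print(counts)
```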

  2. FEATURE SELECTION METHODS BASED ON MUTUAL INFORMATION FOR CLASSIFYING HETEROGENEOUS FEATURES

    Directory of Open Access Journals (Sweden)

    Ratri Enggar Pawening

    2016-06-01

    Datasets with heterogeneous features can yield inappropriate feature selection results because it is difficult to evaluate heterogeneous features concurrently. Feature transformation (FT) is another way to handle feature subset selection for heterogeneous data, but transforming non-numerical features into numerical ones may introduce redundancy with respect to the original numerical features. In this paper, we propose a method to select feature subsets based on mutual information (MI) for classifying heterogeneous features. We use an unsupervised feature transformation (UFT) method and the joint mutual information maximisation (JMIM) method. UFT is used to transform non-numerical features into numerical features; JMIM is used to select the feature subset while taking the class label into account. The transformed and the original features are combined, the feature subset is then determined using JMIM, and the data are classified using a support vector machine (SVM). Classification accuracy is measured for each number of selected features and compared between the UFT-JMIM method and a Dummy-JMIM method. The average classification accuracy over all experiments in this study is about 84.47% for UFT-JMIM and about 84.24% for Dummy-JMIM. This result shows that UFT-JMIM can minimize information loss between transformed and original features and select feature subsets that avoid redundant and irrelevant features.
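
    Note: the pipeline below is only a simplified stand-in for the one in the record: ordinal encoding replaces the UFT step, and scikit-learn's single-feature mutual-information ranking replaces the joint JMIM criterion. The toy data and all names are invented for illustration.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import mutual_info_classif
from sklearn.preprocessing import OrdinalEncoder
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300

# Toy heterogeneous data: two numerical features, one informative categorical one.
df = pd.DataFrame({
    "num_informative": rng.normal(0, 1, n),
    "num_noise": rng.normal(0, 1, n),
    "colour": rng.choice(["red", "green", "blue"], size=n),
})
y = ((df["num_informative"] > 0.0) | (df["colour"] == "red")).astype(int)

# Stand-in for the feature-transformation step: map the non-numerical feature onto numbers.
df["colour_code"] = OrdinalEncoder().fit_transform(df[["colour"]])
X = df[["num_informative", "num_noise", "colour_code"]].to_numpy()

# Rank features by mutual information with the class label and keep the top two
# (a greedy single-feature ranking, simpler than the joint criterion in JMIM).
mi = mutual_info_classif(X, y, random_state=0)
top = np.argsort(mi)[::-1][:2]
print("selected feature indices:", top, "MI scores:", mi.round(3))

# Classify with an SVM on the selected subset, as in the study's evaluation.
print("CV accuracy:", cross_val_score(SVC(), X[:, top], y, cv=5).mean().round(3))
```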

  3. About application during lectures on protection of the information and information security of the method of "the round table"

    Directory of Open Access Journals (Sweden)

    Simon Zh. Simavoryan

    2011-05-01

    This article analyses one of the passive methods of knowledge transfer: the lecture. Experience of teaching a course on information protection and information security shows that students master the material better if an active method of knowledge transfer, the "round table", is applied during the lecture.

  4. Information content in B→VV decays and the angular moments method

    International Nuclear Information System (INIS)

    Dighe, A.; Sen, S.

    1998-10-01

    The time-dependent angular distributions of decays of neutral B mesons into two vector mesons contain information about the lifetimes, mass differences, strong and weak phases, form factors, and CP violating quantities. A statistical analysis of the information content is performed by giving the "information" a quantitative meaning. It is shown that for some parameters of interest, the information content in time and angular measurements combined may be orders of magnitude more than the information from time measurements alone and hence the angular measurements are highly recommended. The method of angular moments is compared with the (maximum) likelihood method to find that it works almost as well in the region of interest for the one-angle distribution. For the complete three-angle distribution, an estimate of possible statistical errors expected on the observables of interest is obtained. It indicates that the three-angle distribution, unraveled by the method of angular moments, would be able to nail down many quantities of interest and will help in pointing unambiguously to new physics. (author)

  5. Background radioactivity in environmental materials

    International Nuclear Information System (INIS)

    Maul, P.R.; O'Hara, J.P.

    1989-01-01

    This paper presents the results of a literature search to identify information on concentrations of 'background' radioactivity in foodstuffs and other commonly available environmental materials. The review has concentrated on naturally occurring radioactivity in foods and on UK data, although results from other countries have also been considered where appropriate. The data are compared with established definitions of a 'radioactive' substance and radionuclides which do not appear to be adequately covered in the literature are noted. (author)

  6. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

    Volume two contains the following appendices: description of soil sampling sites; sampling narrative; raw data soil background; background data analysis; sitewide background soil sampling plan; and use of soil background data for the detection of contamination at waste management units on the Hanford Site.

  7. Discussion of a method for providing general risk information by linking with the nuclear information

    International Nuclear Information System (INIS)

    Shobu, Nobuhiro; Yokomizo, Shirou; Umezawa, Sayaka

    2004-06-01

    'Risk information navigator' (http://www.ricotti.jp/risknavi/), an internet tool for arousing public interest and fostering people's risk literacy, has been developed as content for the official website of Techno Community Square 'RICOTTI' (http://www.ricotti.jp) at TOKAI village. In this report we classified the risk information for the tool into the fields 'Health/Daily Life', 'Society/Crime/Disaster' and 'Technology/Environment/Energy'. According to these categories, we discussed a method for providing risk information on these general fields by linking it with information on the nuclear field. The web contents are attached to this report on CD-R media. (author)

  8. State of the art/science: Visual methods and information behavior research

    DEFF Research Database (Denmark)

    Hartel, Jenna; Sonnenwald, Diane H.; Lundh, Anna

    2012-01-01

    This panel reports on methodological innovation now underway as information behavior scholars begin to experiment with visual methods. The session launches with a succinct introduction to visual methods by Jenna Hartel and then showcases three exemplar visual research designs. First, Dianne Sonne...... will have gained: knowledge of the state of the art/science of visual methods in information behavior research; an appreciation for the richness the approach brings to the specialty; and a platform to take new visual research designs forward....

  9. [Dissemination of medical information in Europe, the USA and Japan, 1850-1870: focusing on information concerning the hypodermic injection method].

    Science.gov (United States)

    Tsukisawa, Miyoko

    2011-12-01

    Modern medicine was introduced in Japan in the second half of the nineteenth century. In order to investigate this historical process, this paper focuses on the dissemination of information of a new medical technology developed in the mid-nineteenth century; it does so by making comparisons of the access to medical information between Europe, the USA and Japan. The hypodermic injection method was introduced in the clinical field in Europe and the USA as a newly developed therapeutic method during the 1850s and 1870s. This study analyzed information on the medical assessments of this method by clinicians of these periods. The crucial factor in accumulating this information was to develop a worldwide inter-medical communication circle with the aid of the medical journals. Information on the hypodermic injection method was introduced in Japan almost simultaneously with its introduction in Europe and the USA. However, because of the geographical distance and the language barrier, Japanese clinicians lacked access to this worldwide communication circle, and they accepted this new method without adequate medical technology assessments.

  10. Weighted Low-Rank Approximation of Matrices and Background Modeling

    KAUST Repository

    Dutta, Aritra

    2018-04-15

    We primarily study a special weighted low-rank approximation of matrices and then apply it to solve the background modeling problem. We propose two algorithms for this purpose: one operates in batch mode on the entire data, and the other operates in a batch-incremental mode, naturally captures more background variations, and is computationally more effective. Moreover, we propose a robust technique that learns the background frame indices from the data and does not require any training frames. We demonstrate through extensive experiments that by inserting a simple weight in the Frobenius norm, it can be made robust to outliers similarly to the $\ell_1$ norm. Our methods match or outperform several state-of-the-art online and batch background modeling methods in virtually all quantitative and qualitative measures.
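
    Note: the authors' batch and batch-incremental algorithms are not spelled out in the record. The sketch below is a generic EM-style weighted low-rank fit (fill down-weighted entries with the current estimate, then truncate the SVD), shown only to make the weighted-Frobenius idea concrete; the toy data and weight rule are invented.

```python
import numpy as np

def weighted_low_rank(X, W, rank, n_iter=100):
    """EM-style weighted low-rank fit: minimise ||W * (X - L)||_F^2 with rank(L) <= rank.

    X : (pixels, frames) data matrix, e.g. vectorised video frames as columns
    W : entrywise weights in [0, 1]; small weights de-emphasise likely foreground
    """
    L = np.zeros_like(X)
    for _ in range(n_iter):
        # Blend observed and current estimate, then project onto rank-`rank` matrices.
        Y = W * X + (1.0 - W) * L
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return L

# Toy usage: a static rank-1 background plus a few bright "foreground" entries.
rng = np.random.default_rng(0)
background = np.outer(rng.random(50), np.ones(30))
X = background.copy()
X[10:15, 5:10] += 3.0                                      # moving object
W = np.where(X - np.median(X, axis=1, keepdims=True) > 1.0, 0.05, 1.0)  # crude weights
L = weighted_low_rank(X, W, rank=1)
print("max background error:", np.abs(L - background).max())
```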

  11. Weighted Low-Rank Approximation of Matrices and Background Modeling

    KAUST Repository

    Dutta, Aritra; Li, Xin; Richtarik, Peter

    2018-01-01

    We primarily study a special weighted low-rank approximation of matrices and then apply it to solve the background modeling problem. We propose two algorithms for this purpose: one operates in batch mode on the entire data, and the other operates in a batch-incremental mode, naturally captures more background variations, and is computationally more effective. Moreover, we propose a robust technique that learns the background frame indices from the data and does not require any training frames. We demonstrate through extensive experiments that by inserting a simple weight in the Frobenius norm, it can be made robust to outliers similarly to the $\ell_1$ norm. Our methods match or outperform several state-of-the-art online and batch background modeling methods in virtually all quantitative and qualitative measures.

  12. Classification of supersymmetric backgrounds of string theory

    International Nuclear Information System (INIS)

    Gran, U.; Gutowski, J.; Papadopoulos, G.; Roest, D.

    2007-01-01

    We review the recent progress made towards the classification of supersymmetric solutions in ten and eleven dimensions with emphasis on those of IIB supergravity. In particular, the spinorial geometry method is outlined and adapted to nearly maximally supersymmetric backgrounds. We then demonstrate its effectiveness by classifying the maximally supersymmetric IIB G-backgrounds and by showing that N=31 IIB solutions do not exist. (Abstract Copyright [2007], Wiley Periodicals, Inc.)

  13. Measurements of the cosmic background radiation

    International Nuclear Information System (INIS)

    Weiss, R.

    1980-01-01

    Measurements of the attributes of the 2.7-K microwave background radiation (CBR) are reviewed, with emphasis on the analytic phase of CBR studies. Methods for the direct measurement of the CBR spectrum are discussed. Attention is given to receivers, antennas, absolute receiver calibration, atmospheric emission and absorption, the galactic background contribution, the analysis of LF measurements, and recent HF observations of the CBR spectrum. Measurements of the large-angular-scale intensity distribution of the CBR (the most convincing evidence that the radiation is of cosmological origin) are examined, along with limits on the linear polarization of the CBR. A description is given of the NASA-sponsored Cosmic Background Explorer (COBE) satellite mission. The results of the COBE mission will be a set of sky maps showing, in the wave number range from 1 to 10,000 kaysers, the galactic background radiation due to synchrotron emission from galactic cosmic rays, to diffuse thermal emission from H II regions, and to diffuse thermal emission from interstellar and interplanetary dust, as well as a residue consisting of the CBR and whatever other cosmological background might exist

  14. Internet uses for health information seeking : Internet uses and healthcare information

    OpenAIRE

    Renahy , Emilie; Chauvin , Pierre

    2006-01-01

    Background: With the widespread dissemination of the Internet throughout the world of health, it would be relevant to report on current knowledge about health information search on the Internet from the consumers' standpoint. Methods: We conducted a bibliographical research over the past five years and distinguished between international and French studies. Results: For a long time, the (mostly US) studies have been merely descriptive. The studies highlight that the factors associated with he...

  15. Albania; Background Information

    OpenAIRE

    International Monetary Fund

    1995-01-01

    This paper describes the evolution of the financial system in Albania. The paper highlights that a two-tier banking system was created following passage of a new Central Bank Law and Commercial Banking Law in April 1992. The State Bank of Albania became the Bank of Albania and retained only the functions of a central bank. Its commercial operations were hived off to become the National Bank of Albania in July 1992, which was subsequently merged with the Albanian Commercial Bank to form the Nati...

  16. Applied Ecosystem Analysis - Background: EDT, the Ecosystem Diagnosis and Treatment Method.

    Energy Technology Data Exchange (ETDEWEB)

    Mobrand, Lars E.

    1996-05-01

    This volume consists of eight separate reports. We present them as background to the Ecosystem Diagnosis and Treatment (EDT) methodology. They are a selection from publications, white papers, and presentations prepared over the past two years. Some of the papers are previously published, others are currently being prepared for publication. In the early to mid 1980s the concern for failure of both natural and hatchery production of Columbia river salmon populations was widespread. The concept of supplementation was proposed as an alternative solution that would integrate artificial propagation with natural production. In response to the growing expectations placed upon the supplementation tool, a project called Regional Assessment of Supplementation Project (RASP) was initiated in 1990. The charge of RASP was to define supplementation and to develop guidelines for when, where and how it would be the appropriate solution to salmon enhancement in the Columbia basin. The RASP developed a definition of supplementation and a set of guidelines for planning salmon enhancement efforts which required consideration of all factors affecting salmon populations, including environmental, genetic, and ecological variables. The results of RASP led to a conclusion that salmon issues needed to be addressed in a manner that was consistent with an ecosystem approach. If the limitations and potentials of supplementation or any other management tool were to be fully understood it would have to be within the context of a broadly integrated approach - thus the Ecosystem Diagnosis and Treatment (EDT) method was born.

  17. Applied Ecosystem Analysis - Background EDT - The Ecosystem Diagnosis and Treatment Method

    International Nuclear Information System (INIS)

    Mobrand, L.E.; Lichatowich, J.A.; Howard, D.A.; Vogel, T.S.

    1996-05-01

    This volume consists of eight separate reports. We present them as background to the Ecosystem Diagnosis and Treatment (EDT) methodology. They are a selection from publications, white papers, and presentations prepared over the past two years. Some of the papers are previously published, others are currently being prepared for publication. In the early to mid 1980's the concern for failure of both natural and hatchery production of Columbia river salmon populations was widespread. The concept of supplementation was proposed as an alternative solution that would integrate artificial propagation with natural production. In response to the growing expectations placed upon the supplementation tool, a project called Regional Assessment of Supplementation Project (RASP) was initiated in 1990. The charge of RASP was to define supplementation and to develop guidelines for when, where and how it would be the appropriate solution to salmon enhancement in the Columbia basin. The RASP developed a definition of supplementation and a set of guidelines for planning salmon enhancement efforts which required consideration of all factors affecting salmon populations, including environmental, genetic, and ecological variables. The results of RASP led to a conclusion that salmon issues needed to be addressed in a manner that was consistent with an ecosystem approach. If the limitations and potentials of supplementation or any other management tool were to be fully understood it would have to be within the context of a broadly integrated approach - thus the Ecosystem Diagnosis and Treatment (EDT) method was born

  18. Perceptions of Athletic Trainers as a Source of Nutritional Information among Collegiate Athletes: A Mixed-methods Approach

    Directory of Open Access Journals (Sweden)

    Rebecca A. Schlaff

    2016-05-01

    Background: Athletes obtain nutrition information from a number of sources, with some being more accurate than others. Little is known about athletes' perceptions of utilizing Certified Athletic Trainers (ATs) as a primary source of information. Objective: We sought to (1) examine the primary sources of nutrition information among a group of United States collegiate athletes and (2) understand athletes' perceptions regarding utilization of their ATs as primary sources of nutrition information. Methods: Participants (Division II university athletes) completed an online questionnaire (n=155; n=58 males, n=97 females) assessing demographic information and ranking primary sources of nutrition information, and participated in focus groups (n=26; n=18 women, n=8 men) to better understand barriers/perceptions around using their ATs for nutrition information. Mean±SD rankings were calculated for all sources. Mann-Whitney U analyses were used to identify differences in ranked nutrition sources between genders and years of collegiate experience. Semi-structured focus groups were transcribed and coded, and themes were identified regarding barriers to utilizing ATs for nutrition-related information. Results: Parents (3.54±2.38) and the internet (3.69±2.29) had the highest mean ranks. ATs were least often ranked as the number one nutrition source (7.5%) among all sources provided. Barriers to utilizing ATs for nutritional information included discomfort, nutrition information not being within the scope of practice, lack of knowledge, the athletic trainer not caring, and lack of time. Conclusions: Participants reported utilizing ATs less than previous research indicates. Continuing education may be needed to improve the efficacy of ATs in addressing nutritional issues and being seen as a credible and accessible source. Keywords: Diet, Athlete perceptions, Barriers

  19. Deriving harmonised forest information in Europe using remote sensing methods

    DEFF Research Database (Denmark)

    Seebach, Lucia Maria

    the need for harmonised forest information can be satisfied using remote sensing methods. In conclusion, the study showed that it is possible to derive harmonised forest information of high spatial detail in Europe with remote sensing. The study also highlighted the imperative provision of accuracy...

  20. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably rapid computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  1. Developing corpus-based translation methods between informal and formal mathematics : project description

    NARCIS (Netherlands)

    Kaliszyk, C.; Urban, J.; Vyskocil, J.; Geuvers, J.H.; Watt, S.M.; Davenport, J.H.; Sexton, A.P.; Sojka, P.; Urban, J.

    2014-01-01

    The goal of this project is to (i) accumulate annotated informal/formal mathematical corpora suitable for training semi-automated translation between informal and formal mathematics by statistical machine-translation methods, (ii) to develop such methods oriented at the formalization task, and in

  2. Egypt: Background and U.S. Relations

    Science.gov (United States)

    2009-05-12

    contributions from Germany, Japan, and Switzerland. For more information on the MFO, see http://www.mfo.org/Default.asp?bhcp=1. ...2008 Report, Egypt's pace of business reforms and deregulation between 2006 and 2007 ranked first worldwide. In recent years, the state has...reinvigorated its privatization program by divesting shares in the state-dominated banking and insurance sectors. Additionally, the government removed import

  3. Natural background radiation and oncologic disease incidence

    International Nuclear Information System (INIS)

    Burenin, P.I.

    1982-01-01

    Cause and effect relationships between oncologic disease incidence in human populations and environmental factors are examined using investigation materials of Soviet and foreign authors. Data concerning the US white population are adduced. The role and contribution of natural background radiation to oncologic disease prevalence have been determined with the help of system information analysis. The probable damage from oncologic disease is shown to decrease as the background radiation level diminishes. The linear nature of the dose-response relationship has been established. The necessity of including the life history of the studied population, along with environmental factors, in epidemiological studies under conditions of multiple carcinogenesis causes is emphasized

  4. Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review

    Science.gov (United States)

    Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael

    2014-01-01

    Background: A systematic literature review was conducted on mixed methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken and based on a predefined protocol. The references were identified using inclusion and…

  5. Fluorescence background subtraction technique for hybrid fluorescence molecular tomography/x-ray computed tomography imaging of a mouse model of early stage lung cancer.

    Science.gov (United States)

    Ale, Angelique; Ermolayev, Vladimir; Deliolanis, Nikolaos C; Ntziachristos, Vasilis

    2013-05-01

    The ability to visualize early stage lung cancer is important in the study of biomarkers and targeting agents that could lead to earlier diagnosis. The recent development of hybrid free-space 360-deg fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) imaging yields a superior optical imaging modality for three-dimensional small animal fluorescence imaging over stand-alone optical systems. Imaging accuracy was improved by using XCT information in the fluorescence reconstruction method. Despite this progress, the detection sensitivity of targeted fluorescence agents remains limited by nonspecific background accumulation of the fluorochrome employed, which complicates early detection of murine cancers. Therefore we examine whether x-ray CT information and bulk fluorescence detection can be combined to increase detection sensitivity. Correspondingly, we research the performance of a data-driven fluorescence background estimator employed for subtraction of background fluorescence from acquisition data. Using mice containing known fluorochromes ex vivo, we demonstrate the reduction of background signals from reconstructed images and sensitivity improvements. Finally, by applying the method to in vivo data from K-ras transgenic mice developing lung cancer, we find small tumors at an early stage compared with reconstructions performed using raw data. We conclude with the benefits of employing fluorescence subtraction in hybrid FMT-XCT for early detection studies.

  6. From Cleanup to Stewardship. A companion report to Accelerating Cleanup: Paths to Closure and background information to support the scoping process required for the 1998 PEIS Settlement Study

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1999-10-01

    Long-term stewardship is expected to be needed at more than 100 DOE sites after DOE's Environmental Management program completes disposal, stabilization, and restoration operations to address waste and contamination resulting from nuclear research and nuclear weapons production conducted over the past 50 years. From Cleanup to Stewardship provides background information on the Department of Energy (DOE) long-term stewardship obligations and activities. This document begins to examine the transition from cleanup to long-term stewardship, and it fulfills the Secretary's commitment to the President in the 1999 Performance Agreement to provide a companion report to the Department's Accelerating Cleanup: Paths to Closure report. It also provides background information to support the scoping process required for a study on long-term stewardship required by a 1998 Settlement Agreement.

  7. A new method for background rejection with surface sensitive bolometers

    International Nuclear Information System (INIS)

    Nones, C.; Foggetta, L.; Giuliani, A.; Pedretti, M.; Salvioni, C.; Sangiorgio, S.

    2006-01-01

    We report the performance of three prototype TeO2 macrobolometers, able to identify events due to energy deposited at the detector surface. This capability is obtained by thermally coupling thin active layers to the main absorber of the bolometer, and is proved by irradiating the detectors with alpha particles. This technique can be very useful in view of background study and reduction for the CUORE experiment, a next-generation Double Beta Decay search based on TeO2 macrobolometers and to be installed in the Laboratori Nazionali del Gran Sasso

  8. Interface methods for using intranet portal organizational memory information system.

    Science.gov (United States)

    Ji, Yong Gu; Salvendy, Gavriel

    2004-12-01

    In this paper, an intranet portal is considered as an information infrastructure (organizational memory information system, OMIS) supporting organizational learning. The properties and the hierarchical structure of information and knowledge in an intranet portal OMIS was identified as a problem for navigation tools of an intranet portal interface. The problem relates to navigation and retrieval functions of intranet portal OMIS and is expected to adversely affect user performance, satisfaction, and usefulness. To solve the problem, a conceptual model for navigation tools of an intranet portal interface was proposed and an experiment using a crossover design was conducted with 10 participants. In the experiment, a separate access method (tabbed tree tool) was compared to an unified access method (single tree tool). The results indicate that each information/knowledge repository for which a user has a different structural knowledge should be handled separately with a separate access to increase user satisfaction and the usefulness of the OMIS and to improve user performance in navigation.

  9. Methods for Measuring Productivity in Libraries and Information Centres

    OpenAIRE

    Mohammad Alaaei

    2009-01-01

    Within information centers, productivity results from the optimal and effective use of information resources, improved service quality, increased user satisfaction, a pleasant working environment, and greater staff motivation and enthusiasm to work well. All of these contribute to the growth and development of information centers, so such centers need to be familiar with the methods employed in productivity measurement. Productivity is one of the criteria for evaluating system perfor...

  10. Polonium-210 assay using a background-rejecting extractive liquid-scintillation method

    International Nuclear Information System (INIS)

    Case, C.N.; McDowell, W.J.

    1981-01-01

    This paper describes a procedure which combines solvent extraction with alpha liquid-scintillation spectrometry. Pulse-shape discrimination electronics are used to reject beta and gamma pulses and to lower the background count to acceptable levels. Concentration of 210Po and separation from interfering elements are accomplished using an H3PO4-HCl solution with TOPO combined with a scintillator in toluene

  11. Importance of background values in assessing the impact of heavy metals in river ecosystems: case study of Tisza River, Serbia.

    Science.gov (United States)

    Štrbac, Snežana; Kašanin Grubin, Milica; Vasić, Nebojša

    2017-11-30

    The main objective of this paper is to evaluate how a choice of different background values may affect assessing the anthropogenic heavy metal pollution in sediments from Tisza River (Serbia). The second objective of this paper is to underline significance of using geochemical background values when establishing quality criteria for sediment. Enrichment factor (EF), geoaccumulation index (Igeo), pollution load index (PLI), and potential ecological risk index (PERI) were calculated using different background values. Three geochemical (average metal concentrations in continental crust, average metal concentrations in shale, and average metal concentrations in non-contaminated core sediment samples) and two statistical methods (delineation method and principal component analyses) were used for calculating background values. It can be concluded that obtained information of pollution status can be more dependent on the use of background values than the index/factor chosen. The best option to assess the potential river sediment contamination is to compare obtained concentrations of analyzed elements with concentrations of mineralogically and texturally comparable, uncontaminated core sediment samples. Geochemical background values should be taken into account when establishing quality criteria for soils, sediments, and waters. Due to complexity of the local lithology, it is recommended that environmental monitoring and assessment include selection of an appropriate background values to gain understanding of the geochemistry and potential source of pollution in a given environment.
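
    Note: the indices named in the record follow standard textbook definitions, which the short sketch below reproduces to show how directly the chosen background values enter each result. The concentrations and background values here are illustrative numbers only, not data from the Tisza River study.

```python
import math

# Illustrative concentrations (mg/kg) for one sediment sample and one choice of
# background values; the point of the paper is that this choice drives the outcome.
sample     = {"Pb": 45.0, "Zn": 180.0, "Cu": 60.0, "Fe": 32000.0}
background = {"Pb": 20.0, "Zn": 95.0,  "Cu": 45.0, "Fe": 47200.0}
reference  = "Fe"   # normalising element used in the enrichment factor

def enrichment_factor(metal):
    """EF = (C_metal / C_ref)_sample / (C_metal / C_ref)_background."""
    return (sample[metal] / sample[reference]) / (background[metal] / background[reference])

def geoaccumulation_index(metal):
    """Igeo = log2(C_sample / (1.5 * C_background)); the 1.5 buffers natural variation."""
    return math.log2(sample[metal] / (1.5 * background[metal]))

def pollution_load_index(metals):
    """PLI = geometric mean of the contamination factors C_sample / C_background."""
    cf = [sample[m] / background[m] for m in metals]
    return math.prod(cf) ** (1.0 / len(cf))

for m in ("Pb", "Zn", "Cu"):
    print(f"{m}: EF={enrichment_factor(m):.2f}  Igeo={geoaccumulation_index(m):.2f}")
print(f"PLI = {pollution_load_index(['Pb', 'Zn', 'Cu']):.2f}")
```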

  12. The Effect of Cultural Background Knowledge on Learning English Language

    Directory of Open Access Journals (Sweden)

    Dr. Ibrahim

    2013-12-01

    This study aims to investigate the effect of cultural background knowledge on learning the English language. It also aims to investigate whether there are significant differences in subjects' performance in reading comprehension according to sex and general ability in English (GAE). The study aims at answering the following questions: 1. To what extent does cultural background knowledge affect subjects' performance in reading comprehension? 2. What is the difference in performance in reading comprehension between male and female subjects who have cultural background knowledge and those who do not? 3. What is the difference between subjects' performance on reading comprehension texts that are loaded with American culture and their general ability in English? The population of this study consisted of all first-year students majoring in English at Hebron University in the first semester of the academic year 2011/2012; they numbered 600. The sample of the study consisted of 60 subjects, males and females, divided into four groups, two experimental and two control. The researcher followed the experimental method. Means, standard deviations and Pearson product-moment correlations were calculated using the SPSS program. The study revealed the following results: 1. There are statistically significant differences in performance in reading comprehension between subjects who have cultural background knowledge and those who do not. 2. There are no statistically significant differences in performance in reading comprehension between male and female subjects who have cultural background knowledge and those who do not. 3. Subjects' GAE revealed that there are significant differences in performance in reading comprehension between subjects who have cultural background knowledge and those who do not. In the light of the results of the study, the researcher recommends the

  13. A Dynamic Enhancement With Background Reduction Algorithm: Overview and Application to Satellite-Based Dust Storm Detection

    Science.gov (United States)

    Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.

    2017-12-01

    This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method holds utility to both automated clients and human analysts alike. Examples of DEBRA performance from notable dust storms and comparisons against other detection methods and independent observations are presented.
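
    Note: the detection tests themselves are not reproduced here. The sketch below only illustrates the visualization step described above, in which a per-pixel confidence factor in [0, 1] scales a false-color composite; the arrays, colours and scene are invented for the example.

```python
import numpy as np

def debra_style_composite(dust_rgb, background_rgb, confidence):
    """Blend a dust-highlighting false-colour image with the plain background scene,
    weighting each pixel by the dust-detection confidence factor in [0, 1]."""
    c = np.clip(confidence, 0.0, 1.0)[..., np.newaxis]   # (H, W, 1) for broadcasting
    return c * dust_rgb + (1.0 - c) * background_rgb

# Toy 4x4 scene: confidence is high only in the lower-right corner.
h = w = 4
background_rgb = np.full((h, w, 3), 0.4)                 # grey "clear sky" backdrop
dust_rgb = np.zeros((h, w, 3))
dust_rgb[..., 0] = 1.0                                   # dust shown in red
confidence = np.zeros((h, w))
confidence[2:, 2:] = 0.9

composite = debra_style_composite(dust_rgb, background_rgb, confidence)
print(composite[3, 3], composite[0, 0])  # dusty pixel is red-tinted, clear pixel stays grey
```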

  14. Retrieval of reflections from seismic background-noise measurements

    NARCIS (Netherlands)

    Draganov, D.S.; Wapenaar, K.; Mulder, W.; Singer, J.; Verdel, A.

    2007-01-01

    The retrieval of the earth's reflection response from cross-correlations of seismic noise recordings can provide valuable information, which may otherwise not be available due to limited spatial distribution of seismic sources. We cross-correlated ten hours of seismic background-noise data acquired

  15. Density-Based Clustering with Geographical Background Constraints Using a Semantic Expression Model

    Directory of Open Access Journals (Sweden)

    Qingyun Du

    2016-05-01

    A semantics-based method for density-based clustering with constraints imposed by geographical background knowledge is proposed. In this paper, we apply an ontological approach to the DBSCAN (Density-Based Spatial Clustering of Applications with Noise) algorithm in the form of knowledge representation for constrained clustering. When used in the process of clustering geographic information, semantic reasoning based on a defined ontology and its relationships is primarily intended to overcome the lack of knowledge about the relevant geospatial data. Better constraints from geographical knowledge yield more reasonable clustering results. This article uses an ontology to describe the four types of semantic constraints for geographical backgrounds: “No Constraints”, “Constraints”, “Cannot-Link Constraints”, and “Must-Link Constraints”. This paper also reports the implementation of a prototype clustering program. Based on the proposed approach, DBSCAN can be applied with both obstacle and non-obstacle constraints as a semi-supervised clustering algorithm, and the clustering results are displayed on a digital map.
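
    Note: the ontology and semantic-reasoning layer of the record are out of scope here. The sketch below only illustrates one simple way an obstacle ("cannot-link") constraint can be pushed into density-based clustering, by inflating the distances between points on opposite sides of an obstacle before running DBSCAN on a precomputed distance matrix; the points and the obstacle are invented.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

# Two point groups that sit close together but are separated by a hypothetical
# obstacle (e.g. a river) along the line x = 0.
pts = np.vstack([rng.normal([-0.3, 0.0], 0.1, size=(30, 2)),
                 rng.normal([+0.3, 0.0], 0.1, size=(30, 2))])

dist = cdist(pts, pts)

# Cannot-link style constraint: pairs that straddle the obstacle get an
# effectively infinite distance, so density cannot "flow" across it.
crosses = np.sign(pts[:, 0])[:, None] != np.sign(pts[:, 0])[None, :]
dist[crosses] = 1e6

labels = DBSCAN(eps=0.3, min_samples=5, metric="precomputed").fit_predict(dist)
print("clusters found:", sorted(set(labels) - {-1}))   # two clusters despite proximity
```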

  16. Background factors related to and/or influencing occupation in mentally disordered offenders.

    Science.gov (United States)

    Lindstedt, Helena; Ivarsson, Ann-Britt; Söderlund, Anne

    2006-09-01

    Knowledge of the background and occupation-related factors of mentally disordered offenders is lacking. It is essential to understand these issues when planning discharge from forensic psychiatric hospital care to enable community dwelling. One aim was to investigate mentally disordered offenders' background factors and their confidence in, and valuation of, occupations. Another aim was to investigate how MDOs' background factors relate to and influence occupational performance and social participation. Data were collected with an explorative, correlative design, after informed consent, from 74 mentally disordered offenders (mean age 34.2) cared for in forensic psychiatric hospitals. Assessments were the Allen Cognitive Level Screen, Capability to Perform Daily Occupations, Interview Schedule of Social Interaction, Manchester Short Assessment of Quality of Life, Self-efficacy Scale and Importance scale. Eight background factors were assembled from the individual forensic psychiatric investigation. Most of the investigated background factors relate to, and half of them influence, occupational performance, particularly its cognitive aspect. The influences on occupation originate from adulthood, such as suffering from schizophrenia, psycho/social problems, and having committed violent crimes. These findings indicate that staff in forensic hospital care should initiate rehabilitation with knowledge of MDOs' complex daily occupations. To avoid information bias, information gathering preceding treatment planning should be performed in collaboration between caring staff and mentally disordered offenders.

  17. Racial background and possible relationships between physical ...

    African Journals Online (AJOL)

    The aim of this research was to investigate possible relationships between physical activity and physical fitness of girls between the ages of 13 and 15 years and the role of different racial backgrounds in this relationship. A cross-sectional research design was used to obtain information from 290 girls between the ages of 13 ...

  18. Classification Method in Integrated Information Network Using Vector Image Comparison

    Directory of Open Access Journals (Sweden)

    Zhou Yuan

    2014-05-01

    A Wireless Integrated Information Network (WMN) consists of nodes that gather integrated information, such as images and voice, from their surroundings. Transmitting this information requires large resources, which decreases the service time of the network. In this paper we present a Classification Approach based on Vector Image Comparison (VIC) for WMNs that improves the service time of the network. Methods for sub-region selection and conversion are also proposed.

  19. An EPIC Tale of the Quiescent Particle Background

    Science.gov (United States)

    Snowden, S.L.; Kuntz, K.D.

    2017-01-01

    Extended Source Analysis Software Use Based Empirical Investigation: (1) Builds quiescent particle background (QPB) spectra and images for observations of extended sources that fill (or mostly fill) the FOV i.e., annular background subtraction won't work. (2) Uses a combination of Filter Wheel Closed (FWC) and corner data to capture the spectral, spatial, and temporal variation of the quiescent particle background. New Work: (1) Improved understanding of the QPB (aided by adding a whole lot of data since 2008). (2) Significantly improved statistics (did I mention a LOT more data?). (3) Better characterization and identification of anomalous states. (4) Builds backgrounds for some anomalous state. (5) New efficient method for non-anomalous states.

  20. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain.

    Science.gov (United States)

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C M; Chen, Zhong

    2017-08-01

    Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest, e.g. the brain, such as the air-tissue interface. In the vicinity of air-tissue boundaries, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient, and these regions often need to be excluded by brain mask erosion at the expense of losing information about the local field, and thus susceptibility measures, in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region-adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We use it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center, in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated by external sources can be effectively removed to obtain a more accurate estimation of the local field and thus of the QSM dipole inversion to map local tissue susceptibility sources. Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to the V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions
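
    Note: the sketch below is not R-SHARP (there is no adaptive kernel, no level-set region detection and no deconvolution step). It is only a crude fixed-kernel illustration of the general idea shared by SHARP-type methods: approximate the smooth background field by local averaging inside the brain mask and subtract it to leave a local-field estimate. The phantom and parameter values are invented.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def crude_background_removal(total_field, mask, sigma=5.0):
    """Approximate a smooth background field by Gaussian smoothing inside the mask
    and subtract it, leaving a local-field estimate.  A fixed, isotropic kernel is
    used here, whereas R-SHARP adapts the kernel radius and weights per voxel."""
    m = mask.astype(float)
    # Normalised convolution so voxels outside the brain mask do not bias the smoothing.
    smooth = gaussian_filter(total_field * m, sigma) / np.maximum(gaussian_filter(m, sigma), 1e-6)
    return (total_field - smooth) * m

# Toy 3-D phantom: a slowly varying background plus a small localised blob.
z, y, x = np.mgrid[0:32, 0:32, 0:32]
background = 0.01 * x + 0.02 * y                      # smooth background field
local_true = np.exp(-((x - 16)**2 + (y - 16)**2 + (z - 16)**2) / 8.0)
mask = np.ones((32, 32, 32), dtype=bool)

local_est = crude_background_removal(background + local_true, mask, sigma=4.0)
print(f"peak of recovered local field: {local_est.max():.2f}")
```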

  1. Actors’ Competencies or Methods? A Case Study of Successful Information Systems Development

    DEFF Research Database (Denmark)

    Omland, Hans Olav; Nielsen, Peter Axel

    2009-01-01

    and methods are exercised. Emphasising the intertwining of competencies and methods, we discuss the character of the intertwining process, how different actors relate to different methods, and how methods may be part of the problem rather than part of the solution to challenges in information systems...... between actors’ competencies and their deployment of methods, arguing that this relationship is described over-simplistically and needs a better explanation. Through a case study of a successful information systems development project we identify some central situations where a variety of competencies...... development. The paper suggests elements for a new model for explaining actors’ competencies and their use of methods....

  2. Methods for evaluating information sources

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2012-01-01

    The article briefly presents and discusses 12 different approaches to the evaluation of information sources (for example a Wikipedia entry or a journal article): (1) the checklist approach; (2) classical peer review; (3) modified peer review; (4) evaluation based on examining the coverage...... of controversial views; (5) evidence-based evaluation; (6) comparative studies; (7) author credentials; (8) publisher reputation; (9) journal impact factor; (10) sponsoring: tracing the influence of economic, political, and ideological interests; (11) book reviews and book reviewing; and (12) broader criteria....... Reading a text is often not a simple process. All the methods discussed here are steps on the way on learning how to read, understand, and criticize texts. According to hermeneutics it involves the subjectivity of the reader, and that subjectivity is influenced, more or less, by different theoretical...

  3. METHODS OF POLYMODAL INFORMATION TRANSMISSION

    Directory of Open Access Journals (Sweden)

    O. O. Basov

    2015-03-01

    This article presents research results on applying existing information transmission methods in polymodal infocommunication systems. Analysis of existing switching methods and multiplexing schemes shows that modern telecommunication facilities are capable of delivering polymodal information to the correspondent's terminal with the required quality. Under these conditions, data transmission networks with static time multiplexing consume substantial capacity, but modality synchronization is easier to achieve within that kind of infrastructure. Data networks with statistical time multiplexing require more sophisticated supporting algorithms to guarantee the delivery quality of data blocks; moreover, because data-block delays are stochastic, modality synchronization during offline processing is more difficult to provide. There are now objective preconditions for a network implementation that is invariant to the transmission technology applied. This capability stems from the wide (person-to-person) application of optical technologies in the transport infrastructure of polymodal infocommunication systems. When the operating modes of the customer terminal and the network are matched, it becomes possible to organize channels that adaptively select the most effective networking technology according to the current traffic volume and the modality types in the messages.

  4. Informed consent, parental awareness, and reasons for participating in a randomised controlled study

    NARCIS (Netherlands)

    M. van Stuijvenberg (Margriet); M.H. Suur (Marja); S. de Vos (Sandra); G.C.H. Tjiang (Gilbert); E.W. Steyerberg (Ewout); G. Derksen-Lubsen (Gerarda); H.A. Moll (Henriëtte)

    1998-01-01

    BACKGROUND: The informed consent procedure plays a central role in randomised controlled trials but has only been explored in a few studies on children. AIM: To assess the quality of the informed consent process in a paediatric setting. METHODS: A

  5. Robust constraint on cosmic textures from the cosmic microwave background.

    Science.gov (United States)

    Feeney, Stephen M; Johnson, Matthew C; Mortlock, Daniel J; Peiris, Hiranya V

    2012-06-15

    Fluctuations in the cosmic microwave background (CMB) contain information which has been pivotal in establishing the current cosmological model. These data can also be used to test well-motivated additions to this model, such as cosmic textures. Textures are a type of topological defect that can be produced during a cosmological phase transition in the early Universe, and which leave characteristic hot and cold spots in the CMB. We apply Bayesian methods to carry out a rigorous test of the texture hypothesis, using full-sky data from the Wilkinson Microwave Anisotropy Probe. We conclude that current data do not warrant augmenting the standard cosmological model with textures. We rule out at 95% confidence models that predict more than 6 detectable cosmic textures on the full sky.

  6. Neutron- and muon-induced background in underground physics experiments

    International Nuclear Information System (INIS)

    Kudryavtsev, V.A.; Tomasello, V.; Pandola, L.

    2008-01-01

    Background induced by neutrons in deep underground laboratories is a critical issue for all experiments looking for rare events, such as dark matter interactions or neutrinoless ββ decay. Neutrons can be produced either by natural radioactivity, via spontaneous fission or (α, n) reactions, or by interactions initiated by high-energy cosmic rays. In all underground experiments, Monte Carlo simulations of neutron background play a crucial role for the evaluation of the total background rate and for the optimization of rejection strategies. The Monte Carlo methods that are commonly employed to evaluate neutron-induced background and to optimize the experimental setup, are reviewed and discussed. Focus is given to the issue of reliability of Monte Carlo background estimates. (orig.)

  7. Neutron- and muon-induced background in underground physics experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kudryavtsev, V.A.; Tomasello, V. [University of Sheffield, Department of Physics and Astronomy, Sheffield (United Kingdom); Pandola, L. [Laboratori Nazionali del Gran Sasso, INFN, Assergi (Italy)

    2008-05-15

    Background induced by neutrons in deep underground laboratories is a critical issue for all experiments looking for rare events, such as dark matter interactions or neutrinoless ββ decay. Neutrons can be produced either by natural radioactivity, via spontaneous fission or (α, n) reactions, or by interactions initiated by high-energy cosmic rays. In all underground experiments, Monte Carlo simulations of neutron background play a crucial role for the evaluation of the total background rate and for the optimization of rejection strategies. The Monte Carlo methods that are commonly employed to evaluate neutron-induced background and to optimize the experimental setup, are reviewed and discussed. Focus is given to the issue of reliability of Monte Carlo background estimates. (orig.)

  8. Comparison of two heuristic evaluation methods for evaluating the usability of health information systems.

    Science.gov (United States)

    Khajouei, Reza; Hajesmaeel Gohari, Sadrieh; Mirzaee, Moghaddameh

    2018-04-01

    In addition to following the usual Heuristic Evaluation (HE) method, the usability of health information systems can also be evaluated using a checklist. The objective of this study is to compare the performance of these two methods in identifying usability problems of health information systems. Eight evaluators independently evaluated different parts of a Medical Records Information System using two methods of HE (usual and with a checklist). The two methods were compared in terms of the number of problems identified, problem type, and the severity of identified problems. In all, 192 usability problems were identified by two methods in the Medical Records Information System. This was significantly higher than the number of usability problems identified by the checklist and usual method (148 and 92, respectively) (p information systems. The results demonstrated that the checklist method had significantly better performance in terms of the number of identified usability problems; however, the performance of the usual method for identifying problems of higher severity was significantly better. Although the checklist method can be more efficient for less experienced evaluators, wherever usability is critical, the checklist should be used with caution in usability evaluations. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Formation factor logging in-situ by electrical methods. Background and methodology

    International Nuclear Information System (INIS)

    Loefgren, Martin; Neretnieks, Ivars

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing the matrix diffusion are the formation factor, the surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter depending only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor could also be measured by electrical methods that are faster and less expensive. Previously this has only been done quantitatively in the laboratory on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity could be obtained by a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This could be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has sufficient vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the

  10. Formation factor logging in-situ by electrical methods. Background and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Loefgren, Martin; Neretnieks, Ivars [Royal Inst. of Tech., Stockholm (Sweden). Dept. of Chemical Engineering and Technology

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing the matrix diffusion are the formation factor, the surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter depending only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor could also be measured by electrical methods that are faster and less expensive. Previously this has only been done quantitatively in the laboratory on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity could be obtained by a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This could be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has sufficient vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the
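
    As a back-of-the-envelope illustration of the electrical approach described above: with saline groundwater, the geometric formation factor can be estimated from the ratio of pore-water resistivity to rock resistivity. The sketch below, including the example values, is an assumption for illustration only and not SKB's logging procedure.

```python
# Illustrative sketch: estimating the formation factor from resistivities
# measured in situ (hypothetical values; not SKB's exact procedure).
def formation_factor(rock_resistivity_ohm_m: float,
                     pore_water_resistivity_ohm_m: float) -> float:
    """Geometric formation factor from an electrical (resistivity) measurement."""
    return pore_water_resistivity_ohm_m / rock_resistivity_ohm_m

# Assumed example: 5000 ohm-m intact rock logged in 0.3 ohm-m saline pore water.
print(formation_factor(5000.0, 0.3))  # ~6e-5
```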

  11. Establishment of a Background Environmental Monitoring Station for the PNNL Campus

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Snyder, Sandra F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Barnett, J. Matthew [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bisping, Lynn E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rishel, Jeremy P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    The environmental surveillance of background levels of radionuclides and, in particular, the siting of a background environmental surveillance (monitoring) station are examined. Many published works identify and stress the need for background monitoring; however, little definitive and comprehensive information for siting a station exists. A definition of an ideal background monitoring location and the generic criteria recommended for use in establishing such a background monitoring location are proposed. There are seven primary (mandatory) criteria described with two additional, optional criteria. The criteria are applied to the Richland, Washington (WA), Pacific Northwest National Laboratory (PNNL) Campus, which currently uses background monitoring data from the nearby Hanford Site. Eleven potential background monitoring sites were identified, with one location in Benton City, WA found to meet all of the mandatory and optional criteria. It is expected that the new sampler will be installed and operating by the end of June, 2015.

  12. Informed consent for medical photography in Nigerian surgical ...

    African Journals Online (AJOL)

    Background: The aim of this study is to assess the current practice of informed consent for medical photography in the Nigerian surgical practice and how it compares to international best practices. Methods: Self-administered questionnaires were distributed to consenting surgeons attending two major surgical conferences.

  13. Detecting the Stochastic Gravitational-Wave Background

    Science.gov (United States)

    Colacino, Carlo Nicola

    2017-12-01

    The stochastic gravitational-wave background (SGWB) is by far the most difficult source of gravitational radiation to detect. At the same time, it is the most interesting and intriguing one. This book describes the initial detection of the SGWB and the underlying mathematics behind one of the most amazing discoveries of the 21st century. On the experimental side it would mean that interferometric gravitational wave detectors work even better than expected. On the observational side, such a detection could give us information about the very early Universe, information that could not be obtained otherwise. Even negative results and improved upper bounds could put constraints on many cosmological and particle physics models.

  14. New technique to determine beta half-lives in complex background conditions

    International Nuclear Information System (INIS)

    Kurtukian-Nieto, T.; Benlliure, J.; Casarejos, E.; Cortina-Gil, D.; Fernandez-Ordonez, M.; Pereira, J.; Schmidt, K.H.; Becker, F.; Henzlova, D.; Yordanov, O.; Audouin, L.; Blank, B.; Giovinazzo, J.; Jurado, B.; Rejmund, F.

    2008-01-01

    Very neutron-rich nuclei near the A = 195 r-process waiting point were produced as projectile fragments from a 208Pb primary beam at GSI, Darmstadt, by cold fragmentation. After in-flight separation, the fragments were implanted in an active catcher, and time correlations to the subsequent beta-decay were established. Due to the periodic operation cycles of the synchrotron providing the primary beam, the background shows a complex time structure, which prevents applying well-established analytical methods to extract the half-life information. A new mathematical analysis method has been developed, which is based on a Monte Carlo code, simulating the time sequence of implantation and beta detection according to the experimental conditions, leaving the beta lifetimes and the beta detection efficiency as free parameters. In addition, both the analysis of the experimental data and the simulation were performed in time-reversed sequence. The ratio of forward/backward time spectra contains the information of the 'true' fragment-beta correlations. Half-lives were obtained from two-dimensional fits of the measured and simulated ratios of time correlations in forward- and backward-time direction by the least-squares method, with the lifetime and the beta-detection efficiency as the two fitting parameters. Half-lives of 8 heavy neutron-rich nuclei approaching the r-process waiting point A = 195 have been determined. (authors)

  15. Matching of Remote Sensing Images with Complex Background Variations via Siamese Convolutional Neural Network

    Directory of Open Access Journals (Sweden)

    Haiqing He

    2018-02-01

    Full Text Available Feature-based matching methods have been widely used in remote sensing image matching given their capability to achieve excellent performance despite image geometric and radiometric distortions. However, most of the feature-based methods are unreliable for complex background variations, because the gradient or other image grayscale information used to construct the feature descriptor is sensitive to image background variations. Recently, deep learning-based methods have been proven suitable for high-level feature representation and comparison in image matching. Inspired by the progresses made in deep learning, a new technical framework for remote sensing image matching based on the Siamese convolutional neural network is presented in this paper. First, a Siamese-type network architecture is designed to simultaneously learn the features and the corresponding similarity metric from labeled training examples of matching and non-matching true-color patch pairs. In the proposed network, two streams of convolutional and pooling layers sharing identical weights are arranged without the manually designed features. The number of convolutional layers is determined based on the factors that affect image matching. The sigmoid function is employed to compute the matching and non-matching probabilities in the output layer. Second, a gridding sub-pixel Harris algorithm is used to obtain the accurate localization of candidate matches. Third, a Gaussian pyramid coupling quadtree is adopted to gradually narrow down the searching space of the candidate matches, and multiscale patches are compared synchronously. Subsequently, a similarity measure based on the output of the sigmoid is adopted to find the initial matches. Finally, the random sample consensus algorithm and the whole-to-local quadratic polynomial constraints are used to remove false matches. In the experiments, different types of satellite datasets, such as ZY3, GF1, IKONOS, and Google Earth images
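
    To make the Siamese idea concrete, the sketch below shows a minimal patch-matching network in PyTorch with two weight-sharing convolutional streams and a sigmoid output for the matching probability. The layer sizes and the absolute-difference fusion are assumptions for illustration, not the authors' exact architecture.

```python
# Minimal Siamese patch-matching sketch (illustrative, not the paper's network).
import torch
import torch.nn as nn

class SiameseMatcher(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared convolutional branch: both patches pass through the same weights.
        self.branch = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
        )
        # Similarity head: the sigmoid output is the matching probability.
        self.head = nn.Sequential(nn.LazyLinear(256), nn.ReLU(),
                                  nn.Linear(256, 1), nn.Sigmoid())

    def forward(self, patch_a, patch_b):
        fa, fb = self.branch(patch_a), self.branch(patch_b)
        return self.head(torch.abs(fa - fb))  # probability that the patches match

# Example: two batches of 64x64 true-color patches (random stand-in data).
model = SiameseMatcher()
p = model(torch.rand(4, 3, 64, 64), torch.rand(4, 3, 64, 64))
```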

  16. Sharing Family Life Information Through Video Calls and Other Information and Communication Technologies and the Association With Family Well-Being: Population-Based Survey

    OpenAIRE

    Shen, Chen; Wang, Man Ping; Chu, Joanna TW; Wan, Alice; Viswanath, Kasisomayajula; Chan, Sophia Siu Chee; Lam, Tai Hing

    2017-01-01

    Background The use of information and communication technologies (ICTs) for information sharing among family members is increasing dramatically. However, little is known about the associated factors and the influence on family well-being. Objective The authors investigated the pattern and social determinants of family life information sharing with family and the associations of different methods of sharing with perceived family health, happiness, and harmony (3Hs) in Hong Kong, where mobile p...

  17. Effectiveness of Visual Methods in Information Procedures for Stem Cell Recipients and Donors

    Directory of Open Access Journals (Sweden)

    Çağla Sarıtürk

    2017-12-01

    Full Text Available Objective: Obtaining informed consent from hematopoietic stem cell recipients and donors is a critical step in the transplantation process. Anxiety may affect their understanding of the provided information. However, use of audiovisual methods may facilitate understanding. In this prospective randomized study, we investigated the effectiveness of using an audiovisual method of providing information to patients and donors in combination with the standard model. Materials and Methods: A 10-min informational animation was prepared for this purpose. In total, 82 participants were randomly assigned to two groups: group 1 received the additional audiovisual information and group 2 received standard information. A 20-item questionnaire was administered to participants at the end of the informational session. Results: A reliability test and factor analysis showed that the questionnaire was reliable and valid. For all participants, the mean overall satisfaction score was 184.8±19.8 (maximum possible score of 200. However, for satisfaction with information about written informed consent, group 1 scored significantly higher than group 2 (p=0.039. Satisfaction level was not affected by age, education level, or differences between the physicians conducting the informative session. Conclusion: This study shows that using audiovisual tools may contribute to a better understanding of the informed consent procedure and potential risks of stem cell transplantation.

  18. Background Noise Reduction Using Adaptive Noise Cancellation Determined by the Cross-Correlation

    Science.gov (United States)

    Spalt, Taylor B.; Brooks, Thomas F.; Fuller, Christopher R.

    2012-01-01

    Background noise due to flow in wind tunnels contaminates desired data by decreasing the Signal-to-Noise Ratio. The use of Adaptive Noise Cancellation to remove background noise at measurement microphones is compromised when the reference sensor measures both background and desired noise. The technique proposed modifies the classical processing configuration based on the cross-correlation between the reference and primary microphone. Background noise attenuation is achieved using a cross-correlation sample width that encompasses only the background noise and a matched delay for the adaptive processing. A present limitation of the method is that a minimum time delay between the background noise and desired signal must exist in order for the correlated parts of the desired signal to be separated from the background noise in the crosscorrelation. A simulation yields primary signal recovery which can be predicted from the coherence of the background noise between the channels. Results are compared with two existing methods.
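
    A minimal sketch of the idea, assuming NumPy and a simple LMS filter: the cross-correlation between the reference and primary channels is used to estimate the delay of the background noise, the reference is shifted by that delay, and an adaptive filter then subtracts the correlated background from the primary channel. The sample-width restriction and other details of the proposed technique are not reproduced here.

```python
# Hedged sketch of cross-correlation-guided adaptive noise cancellation (LMS).
import numpy as np

def estimate_delay(reference, primary):
    """Lag (in samples) at which the background noise is most correlated."""
    xcorr = np.correlate(primary, reference, mode="full")
    return int(np.argmax(xcorr)) - (len(reference) - 1)

def lms_cancel(reference, primary, n_taps=64, mu=1e-3):
    """Subtract an adaptively filtered reference from the primary channel."""
    w = np.zeros(n_taps)
    cleaned = np.zeros_like(primary, dtype=float)
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]   # most recent reference samples
        y = w @ x                           # current estimate of the background noise
        e = primary[n] - y                  # error = estimate of the desired signal
        w += 2.0 * mu * e * x               # LMS weight update
        cleaned[n] = e
    return cleaned

# Usage (illustrative): align the reference by the estimated delay, then cancel.
# cleaned = lms_cancel(np.roll(reference, estimate_delay(reference, primary)), primary)
```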

  19. HemeBIND: a novel method for heme binding residue prediction by combining structural and sequence information

    Directory of Open Access Journals (Sweden)

    Hu Jianjun

    2011-05-01

    Full Text Available Abstract Background Accurate prediction of binding residues involved in the interactions between proteins and small ligands is one of the major challenges in structural bioinformatics. Heme is an essential and commonly used ligand that plays critical roles in electron transfer, catalysis, signal transduction and gene expression. Although much effort has been devoted to the development of various generic algorithms for ligand binding site prediction over the last decade, no algorithm has been specifically designed to complement experimental techniques for identification of heme binding residues. Consequently, an urgent need is to develop a computational method for recognizing these important residues. Results Here we introduced an efficient algorithm HemeBIND for predicting heme binding residues by integrating structural and sequence information. We systematically investigated the characteristics of binding interfaces based on a non-redundant dataset of heme-protein complexes. It was found that several sequence and structural attributes such as evolutionary conservation, solvent accessibility, depth and protrusion clearly illustrate the differences between heme binding and non-binding residues. These features can then be separately used or combined to build the structure-based classifiers using support vector machine (SVM. The results showed that the information contained in these features is largely complementary and their combination achieved the best performance. To further improve the performance, an attempt has been made to develop a post-processing procedure to reduce the number of false positives. In addition, we built a sequence-based classifier based on SVM and sequence profile as an alternative when only sequence information can be used. Finally, we employed a voting method to combine the outputs of structure-based and sequence-based classifiers, which demonstrated remarkably better performance than the individual classifier alone
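
    A minimal sketch of the combination strategy described above, using scikit-learn and randomly generated stand-in feature arrays (the real descriptors, SVM settings and post-processing are not reproduced): one SVM is trained on structural attributes, one on sequence-profile features, and their probabilistic outputs are averaged in a soft vote.

```python
# Illustrative soft-voting combination of structure- and sequence-based SVMs.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_struct = rng.normal(size=(200, 4))   # e.g. conservation, accessibility, depth, protrusion
X_seq = rng.normal(size=(200, 20))     # e.g. a sequence-profile window per residue
y = rng.integers(0, 2, size=200)       # 1 = heme-binding residue, 0 = non-binding

clf_struct = SVC(probability=True).fit(X_struct, y)
clf_seq = SVC(probability=True).fit(X_seq, y)

# Soft vote: average the per-class probabilities of the two classifiers.
p = (clf_struct.predict_proba(X_struct)[:, 1] + clf_seq.predict_proba(X_seq)[:, 1]) / 2
pred = (p >= 0.5).astype(int)
```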

  20. Background rejection in NEXT using deep neural networks

    CERN Document Server

    Renner, J.

    2017-01-01

    We investigate the potential of using deep learning techniques to reject background events in searches for neutrinoless double beta decay with high pressure xenon time projection chambers capable of detailed track reconstruction. The differences in the topological signatures of background and signal events can be learned by deep neural networks via training over many thousands of events. These networks can then be used to classify further events as signal or background, providing an additional background rejection factor at an acceptable loss of efficiency. The networks trained in this study performed better than previous methods developed based on the use of the same topological signatures by a factor of 1.2 to 1.6, and there is potential for further improvement.

  1. Background rejection in NEXT using deep neural networks

    International Nuclear Information System (INIS)

    Renner, J.; Farbin, A.; Vidal, J. Muñoz; Benlloch-Rodríguez, J. M.; Botas, A.

    2017-01-01

    Here, we investigate the potential of using deep learning techniques to reject background events in searches for neutrinoless double beta decay with high pressure xenon time projection chambers capable of detailed track reconstruction. The differences in the topological signatures of background and signal events can be learned by deep neural networks via training over many thousands of events. These networks can then be used to classify further events as signal or background, providing an additional background rejection factor at an acceptable loss of efficiency. The networks trained in this study performed better than previous methods developed based on the use of the same topological signatures by a factor of 1.2 to 1.6, and there is potential for further improvement.

  2. Information visualization courses for students with a computer science background.

    Science.gov (United States)

    Kerren, Andreas

    2013-01-01

    Linnaeus University offers two master's courses in information visualization for computer science students with programming experience. This article briefly describes the syllabi, exercises, and practices developed for these courses.

  3. The colorful brain: Visualization of EEG background patterns

    NARCIS (Netherlands)

    van Putten, Michel Johannes Antonius Maria

    2008-01-01

    This article presents a method to transform routine clinical EEG recordings to an alternative visual domain. The method is intended to support the classic visual interpretation of the EEG background pattern and to facilitate communication about relevant EEG characteristics. In addition, it provides

  4. Effects of placement point of background music on shopping website.

    Science.gov (United States)

    Lai, Chien-Jung; Chiang, Chia-Chi

    2012-01-01

    Consumer on-line behaviors are more important than ever due to the rapid growth of on-line shopping. The purposes of this study were to design placement methods for background music on a shopping website and to examine their effect on browsers' emotional and cognitive responses. Three placement points of background music during browsing, i.e., 2 min, 4 min, and 6 min from the start of browsing, were considered as entry points. Both browsing without music (no music) and browsing with constant music volume (full music) were treated as control groups. Participants' emotional state, approach-avoidance behavior intention, and actions to adjust the music volume were collected. Results showed that participants had a higher level of pleasure, arousal and approach behavior intention for the three placement points than for no music and full music. Most of the participants in the full-music condition (5/6) adjusted the background music, while only 16.7% (3/18) of participants in the other conditions turned off the background music. The results indicate that playing background music some time after the start of browsing benefits the on-line shopping atmosphere, and that it is inappropriate to place background music at the very start of browsing a shopping website. Marketers must therefore manipulate the placement of background music for a web store carefully.

  5. Hybrid methods to represent incomplete and uncertain information

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, C. [NASA Goddard Space Flight Center, Greenbelt, MD (United States)

    1996-12-31

    Decision making is cast in the semiotic context of perception, decision, and action loops. Towards the goal of properly grounding hybrid representations of information and uncertainty from this semiotic perspective, we consider the roles of and relations among the mathematical components of General Information Theory (GIT), particularly among fuzzy sets, possibility theory, probability theory, and random sets. We do so by using a clear distinction between the syntactic, mathematical formalism and the semantic domains of application of each of these fields, placing the emphasis on available measurement and action methods appropriate for each formalism, to which and from which the decision-making process flows.

  6. On the Adaptation of an Agile Information Systems Development Method

    NARCIS (Netherlands)

    Aydin, M.N.; Harmsen, F.; van Slooten, C.; Stegwee, R.A.

    2005-01-01

    Little specific research has been conducted to date on the adaptation of agile information systems development (ISD) methods. This article presents the work practice in dealing with the adaptation of such a method in the ISD department of one of the leading financial institutes in Europe. Two forms

  7. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

    The determination of soil background is one of the most important activities supporting environmental restoration and waste management on the Hanford Site. Background compositions serve as the basis for identifying soil contamination, and also as a baseline in risk assessment processes used to determine soil cleanup and treatment levels. These uses of soil background require an understanding of the extent to which analytes of concern occur naturally in the soils. This report documents the results of sampling and analysis activities designed to characterize the composition of soil background at the Hanford Site, and to evaluate the feasibility for use as Sitewide background. The compositions of naturally occurring soils in the vadose zone have been determined for nonradioactive inorganic and organic analytes and related physical properties. These results confirm that a Sitewide approach to the characterization of soil background is technically sound and is a viable alternative to the determination and use of numerous local or area backgrounds that yield inconsistent definitions of contamination. Sitewide soil background consists of several types of data and is appropriate for use in identifying contamination in all soils in the vadose zone on the Hanford Site. The natural concentrations of nearly every inorganic analyte extend to levels that exceed calculated health-based cleanup limits. The levels of most inorganic analytes, however, are well below these health-based limits. The highest measured background concentrations occur in three volumetrically minor soil types, the most important of which are topsoils adjacent to the Columbia River that are rich in organic carbon. No organic analyte levels above detection were found in any of the soil samples.

  8. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing up various disciplines frequently produces something that is profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, and has led to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  9. Fluorescence molecular tomography in the presence of background fluorescence

    International Nuclear Information System (INIS)

    Soubret, Antoine; Ntziachristos, Vasilis

    2006-01-01

    Fluorescence molecular tomography is an emerging imaging technique that resolves the bio-distribution of engineered fluorescent probes developed for in vivo reporting of specific cellular and sub-cellular targets. The method can detect fluorochromes in picomole amounts or less, imaged through entire animals, but the detection sensitivity and imaging performance drop in the presence of background, non-specific fluorescence. In this study, we carried out a theoretical and an experimental investigation on the effect of background fluorescence on the measured signal and on the tomographic reconstruction. We further examined the performance of three subtraction methods based on physical models of photon propagation, using experimental data on phantoms and small animals. We show that data pre-processing with subtraction schemes can improve image quality and quantification when non-specific background fluorescence is present.

  10. Performance Comparison of Several Pre-Processing Methods in a Hand Gesture Recognition System based on Nearest Neighbor for Different Background Conditions

    Directory of Open Access Journals (Sweden)

    Iwan Setyawan

    2012-12-01

    Full Text Available This paper presents a performance analysis and comparison of several pre-processing methods used in a hand gesture recognition system. The pre-processing methods are based on the combinations of several image processing operations, namely edge detection, low pass filtering, histogram equalization, thresholding and desaturation. The hand gesture recognition system is designed to classify an input image into one of six possible classes. The input images are taken with various background conditions. Our experiments showed that the best result is achieved when the pre-processing method consists of only a desaturation operation, achieving a classification accuracy of up to 83.15%.
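
    For illustration, the sketch below applies the kinds of operations listed above with OpenCV; the file name, parameters and the particular combination are assumptions, not the combinations evaluated in the paper. A nearest-neighbor classifier would then operate on the chosen pre-processed image (for example on its flattened pixels).

```python
# Illustrative pre-processing operations for a gesture image using OpenCV.
import cv2

img = cv2.imread("gesture.png")                      # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)         # desaturation
blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # low-pass filtering
equalized = cv2.equalizeHist(blurred)                # histogram equalization
_, binary = cv2.threshold(equalized, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # thresholding
edges = cv2.Canny(blurred, 50, 150)                  # edge detection
```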

  11. Research on image complexity evaluation method based on color information

    Science.gov (United States)

    Wang, Hao; Duan, Jin; Han, Xue-hui; Xiao, Bo

    2017-11-01

    In order to evaluate the complexity of a color image more effectively and to find the connection between image complexity and image information, this paper presents a method to compute image complexity based on color information. The theoretical analysis first divides complexity at the subjective level into three classes: low complexity, medium complexity and high complexity. Image features are then extracted, and finally a function is established between the complexity value and the color characteristic model. The experimental results show that this evaluation method can objectively reconstruct the complexity of the image from the extracted image features, and that the results agree well with the complexity perceived by human vision, so the color-based image complexity measure has a certain reference value.
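
    One simple color-based proxy in this spirit is the Shannon entropy of the hue histogram; the sketch below is an illustrative assumption, not the paper's exact complexity function.

```python
# Illustrative color-complexity proxy: Shannon entropy of the hue histogram.
import cv2
import numpy as np

img = cv2.imread("scene.png")                          # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv], [0], None, [180], [0, 180]).ravel()
p = hist / hist.sum()
p = p[p > 0]
complexity = -np.sum(p * np.log2(p))                   # higher entropy ~ richer color content
```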

  12. Research on the method of measuring space information network capacity in communication service

    Directory of Open Access Journals (Sweden)

    Zhu Shichao

    2017-02-01

    Full Text Available Because of the large-scale characteristics of the space information network in both space and time, and its increasing complexity, existing methods for measuring information transmission capacity have been unable to measure existing and future space information networks effectively. In this study, we first established a complex model of the space information network, and then measured the capacity of the whole network by analyzing the data access capability to the network and the data transmission capability within the network. Finally, we verified the rationality of the proposed measuring method through collaborative simulation using the STK and Matlab simulation software.

  13. ADAPTIVE BACKGROUND DENGAN METODE GAUSSIAN MIXTURE MODELS UNTUK REAL-TIME TRACKING

    Directory of Open Access Journals (Sweden)

    Silvia Rostianingsih

    2008-01-01

    Full Text Available Nowadays, motion tracking applications are widely used for many purposes, such as detecting traffic jams and counting how many people enter a supermarket or a mall. A method to separate the background from the tracked object is required for motion tracking. Building such an application is not hard if the tracking is performed against a static background, but it is more difficult if the tracked object is in a place with a non-static background, because the changing parts of the background can be mistaken for the tracking area. To handle this problem, an application can be made that separates the background in a way that adapts to the changes that occur. This application produces an adaptive background using Gaussian Mixture Models (GMM). The GMM method clusters the input pixel data using the pixel color values as its basis. After the clusters are formed, the dominant distributions are chosen as the background distributions. The application was implemented with Microsoft Visual C 6.0. The results of this research show that the GMM algorithm can build an adaptive background satisfactorily, as demonstrated by tests that succeeded under all given conditions. The application can be further developed so that the tracking process is integrated into the adaptive-background construction process.
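
    A minimal sketch of GMM-based adaptive background subtraction using OpenCV's MOG2 implementation, offered as a stand-in for the authors' own Visual C program (the file name and parameters are assumptions):

```python
# Illustrative GMM background subtraction with OpenCV's MOG2 implementation.
import cv2

cap = cv2.VideoCapture("traffic.avi")        # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Pixels explained by the dominant Gaussians are labeled background (0);
    # the remaining pixels form the foreground mask used as the tracking area.
    fg_mask = subtractor.apply(frame)
cap.release()
```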

  14. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    Science.gov (United States)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

    The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method is needed to reduce the dimensionality of the informational indicators of situations, bring them to relative units, calculate generalized information indicators on their basis, rank them by characteristic levels, and calculate the efficiency criterion of a system functioning in real time. On this basis, the design of an information evaluation system has been developed that allows information about an object to be analyzed, processed and assessed. Such an object can be a complex technical, economic or social system. The method, and the system based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.

  15. Assessment of Radiation Background Variation for Moving Detection Systems

    Energy Technology Data Exchange (ETDEWEB)

    Miller, James Christopher [Los Alamos National Laboratory; Rennie, John Alan [Los Alamos National Laboratory; Toevs, James Waldo [Los Alamos National Laboratory; Wallace, Darrin J. [Los Alamos National Laboratory; Abhold, Mark Edward [Los Alamos National Laboratory

    2015-07-13

    The introduction points out that radiation backgrounds fluctuate across very short distances: factors include geology, soil composition, altitude, building structures, topography, and other manmade structures; and asphalt and concrete can vary significantly over short distances. Brief descriptions are given of the detection system, experimental setup, and background variation measurements. It is concluded that positive and negative gradients can greatly reduce the detection sensitivity of an MDS: negative gradients create opportunities for false negatives (nondetection), and positive gradients create a potentially unacceptable FAR (above 1%); the location of use for mobile detection is important to understand; spectroscopic systems provide more information for screening out false alarms and may be preferred for mobile use; and mobile monitor testing at LANL accounts for expected variations in the background.

  16. Empirical studies on informal patient payments for health care services: a systematic and critical review of research methods and instruments

    Directory of Open Access Journals (Sweden)

    Pavlova Milena

    2010-09-01

    Full Text Available Abstract Background Empirical evidence demonstrates that informal patient payments are an important feature of many health care systems. However, the study of these payments is a challenging task because of their potentially illegal and sensitive nature. The aim of this paper is to provide a systematic review and analysis of key methodological difficulties in measuring informal patient payments. Methods The systematic review was based on the following eligibility criteria: English language publications that reported on empirical studies measuring informal patient payments. There were no limitations with regard to the year of publication. The content of the publications was analysed qualitatively and the results were organised in the form of tables. Data sources were Econlit, Econpapers, Medline, PubMed, ScienceDirect, SocINDEX. Results Informal payments for health care services are most often investigated in studies involving patients or the general public, but providers and officials are also sample units in some studies. The majority of the studies apply a single mode of data collection that involves either face-to-face interviews or group discussions. One of the main methodological difficulties reported in the publications concerns the inability of some respondents to distinguish between official and unofficial payments. Another complication is associated with the refusal of some respondents to answer questions on informal patient payments. We do not exclude the possibility that we have missed studies reported in non-English-language journals as well as very recent studies that are not yet published. Conclusions Given the recent evidence from research on survey methods, a self-administered questionnaire during a face-to-face interview could be a suitable mode of collecting sensitive data, such as data on informal patient payments.

  17. Seasonal changes in background levels of deuterium and oxygen-18 prove water drinking by harp seals, which affects the use of the doubly labelled water method.

    Science.gov (United States)

    Nordøy, Erling S; Lager, Anne R; Schots, Pauke C

    2017-12-01

    The aim of this study was to monitor seasonal changes in stable isotopes of pool freshwater and harp seal ( Phoca groenlandica ) body water, and to study whether these potential seasonal changes might bias results obtained using the doubly labelled water (DLW) method when measuring energy expenditure in animals with access to freshwater. Seasonal changes in the background levels of deuterium and oxygen-18 in the body water of four captive harp seals and in the freshwater pool in which they were kept were measured over a time period of 1 year. The seals were offered daily amounts of capelin and kept under a seasonal photoperiod of 69°N. Large seasonal variations of deuterium and oxygen-18 in the pool water were measured, and the isotope abundance in the body water showed similar seasonal changes to the pool water. This shows that the seals were continuously equilibrating with the surrounding water as a result of significant daily water drinking. Variations in background levels of deuterium and oxygen-18 in freshwater sources may be due to seasonal changes in physical processes such as precipitation and evaporation that cause fractionation of isotopes. Rapid and abrupt changes in the background levels of deuterium and oxygen-18 may complicate calculation of energy expenditure by use of the DLW method. It is therefore strongly recommended that analysis of seasonal changes in background levels of isotopes is performed before the DLW method is applied on (free-ranging) animals, and to use a control group in order to correct for changes in background levels. © 2017. Published by The Company of Biologists Ltd.

  18. A Multi-Classification Method of Improved SVM-based Information Fusion for Traffic Parameters Forecasting

    Directory of Open Access Journals (Sweden)

    Hongzhuan Zhao

    2016-04-01

    Full Text Available With the enrichment of perception methods, a modern transportation system contains many physical objects whose states are influenced by many information factors, making it a typical Cyber-Physical System (CPS). Thus, traffic information is generally multi-sourced, heterogeneous and hierarchical. Existing research shows that accurately classifying multi-sourced traffic information during information fusion can achieve better parameter-forecasting performance. To solve the problem of accurately classifying traffic information, this paper analyses the characteristics of multi-sourced traffic information and uses a redefined binary tree to overcome the shortcomings of the original Support Vector Machine (SVM) classification in information fusion, proposing a multi-classification method using an improved SVM in information fusion for traffic parameter forecasting. An experiment was conducted to examine the performance of the proposed scheme, and the results reveal that the method can produce more accurate and practical outcomes.

  19. Research on information security in big data era

    Science.gov (United States)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the big data era. This paper analyzes the challenges to and causes of data security brought by big data, discusses the development trend of network attacks against the background of big data, and puts forward the authors' own opinions on the development of security defense in technology, strategy and products.

  20. Research on Extraction of Ship Target in Complex Sea-sky Background

    International Nuclear Information System (INIS)

    Kang, W J; Ding, X M; Cui, J W; Ao, L

    2006-01-01

    Research on the extraction of ship targets against a complex sea-sky background is important for improving the capability of imaging-based sea navigation and nautical traffic control systems. Based on the imaging properties of the complex sea-sky background, a reliable ship target extraction method is proposed in this paper. The general approach is to obtain the sea-sky division line as a priori knowledge and then determine the potential target area from discontinuities along that line. Firstly, a local selective window filter is adopted to filter the image; secondly, an eight-direction Sobel edge detection operator and a gradient Hough transform are combined to extract the sea-sky division line in the image; then a multi-histogram matching technique is adopted to remove the sea and sky background, so that the ship target is extracted from the complex background. Experiments show that the method has the merits of robustness to noise, low computational complexity and stability.
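
    The sketch below illustrates one step of such a pipeline, estimating the sea-sky division line from Sobel gradients and a Hough transform with OpenCV; the median filter stands in for the local selective window filter, and all file names and parameters are assumptions rather than the paper's settings.

```python
# Illustrative sea-sky line estimation: Sobel gradients followed by a Hough transform.
import cv2
import numpy as np

img = cv2.imread("sea.png", cv2.IMREAD_GRAYSCALE)     # hypothetical input frame
smoothed = cv2.medianBlur(img, 5)                     # stand-in for the selective window filter
gx = cv2.Sobel(smoothed, cv2.CV_32F, 1, 0)
gy = cv2.Sobel(smoothed, cv2.CV_32F, 0, 1)
magnitude = cv2.magnitude(gx, gy)
edges = (magnitude > 0.5 * magnitude.max()).astype(np.uint8) * 255

# The strongest near-horizontal line is taken as the sea-sky division line.
lines = cv2.HoughLines(edges, 1, np.pi / 180, 200)
```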

  1. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Science.gov (United States)

    2013-05-01

    ... Citations on Methods for Cumulative Risk Assessment AGENCY: Office of the Science Advisor, Environmental... influence exposures, dose-response or risk/hazard posed by environmental contaminant exposures, and methods... who wish to receive further information about submitting information on methods for cumulative risk...

  2. THE EFFECTIVENESS OF COACHING USING SBAR (SITUATION, BACKGROUND, ASSESSMENT, RECOMMENDATION COMMUNICATION TOOL ON NURSING SHIFT HANDOVERS

    Directory of Open Access Journals (Sweden)

    Vitri Dyah Herawati

    2018-05-01

    Full Text Available Background: The SBAR (situation, background, assessment, recommendation) method assists nurses in communicating information during nursing shift handover. An inaccurate shift handover can have a serious impact on patients due to poor communication. Optimal resource development can be achieved through coaching, a guidance method in which a manager leads directed discussion and guidance activities that help staff learn to solve problems or do a better job, and which builds a nursing leadership culture in clinical services. Objective: To analyze the effectiveness of the coaching method using the SBAR communication tool on nursing shift handovers. Methods: This was a quasi-experimental study with a pretest-posttest control group design. Fifty-four nurses were selected using consecutive sampling, with 27 assigned to the experimental group and 27 to the control group. An observation checklist was developed by the researchers based on the theory of Lardner to evaluate the effectiveness of the implementation of coaching using SBAR in nursing shift handover. Independent t-tests, the Mann-Whitney test and the Wilcoxon test were used for data analyses. Results: There was an increase in the coaching ability of head nurses in the implementation of SBAR in nursing handover after 2 weeks and 4 weeks of coaching. There was also a significant improvement in the use of SBAR in nursing shift handover in the experimental group (p < 0.05) compared to the control group. Conclusion: Coaching using the SBAR (situation, background, assessment, recommendation) communication tool was effective on nursing shift handovers. There was a significant increase in the capability of head nurses and in nursing shift handover after the coaching intervention.

  3. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    Directory of Open Access Journals (Sweden)

    Ross S Williamson

    2015-04-01

    Full Text Available Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID, uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
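
    For readers unfamiliar with the quantity involved, the single-spike information maximized by MID is, in standard notation (stated here as background rather than quoted from the article), the divergence between the spike-triggered and raw stimulus distributions along the candidate dimensions; the article's point is that its empirical estimate equals, up to normalization, the log-likelihood of an LNP model with Poisson spiking.

```latex
I_{\mathrm{ss}} \;=\; \int p(\mathbf{x} \mid \mathrm{spike})
  \,\log_2 \frac{p(\mathbf{x} \mid \mathrm{spike})}{p(\mathbf{x})}\, d\mathbf{x}
```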

  4. Optical polarization: background and camouflage

    Science.gov (United States)

    Škerlind, Christina; Hallberg, Tomas; Eriksson, Johan; Kariis, Hans; Bergström, David

    2017-10-01

    Polarimetric imaging sensors in the electro-optical region, already available commercially and for military use in both the visual and infrared, show enhanced capabilities for advanced target detection and recognition. The capabilities arise from the ability to discriminate between man-made and natural background surfaces using the polarization information of light. In the development of materials for signature management in the visible and infrared wavelength regions, different criteria need to be met to fulfil the requirements for a good camouflage against modern sensors. In conventional camouflage design, the aim is to design the surface properties of an object to spectrally match or adapt them to a background, thereby minimizing the contrast given by a specific threat sensor. Examples are shown from measurements of some relevant materials and of how they affect the polarimetric signature in different ways. Dimensioning properties relevant to optical camouflage from a polarimetric perspective, such as the degree of polarization, the viewing or incident angle, and the amount of diffuse reflection, mainly in the infrared region, are discussed.
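
    For reference, the degree of polarization mentioned among the dimensioning properties is conventionally expressed through the Stokes parameters (standard definitions, not taken from the paper):

```latex
\mathrm{DoLP} \;=\; \frac{\sqrt{Q^{2} + U^{2}}}{I},
\qquad
\mathrm{DoP} \;=\; \frac{\sqrt{Q^{2} + U^{2} + V^{2}}}{I}
```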

  5. An evaluation of multimedia and online support groups (OSG) contents and application of information by infertile patients: Mixed method study

    Science.gov (United States)

    Wiweko, Budi; Narasati, Shabrina; Agung, Prince Gusti; Zesario, Aulia; Wibawa, Yohanes Satrya; Maidarti, Mila; Harzif, Achmad Kemal; Pratama, Gita; Sumapradja, Kanadi; Muharam, Raden; Hestiantoro, Andon

    2018-02-01

    Background: The presence of Online Support Groups (OSG) is expected to empower patients with infertility, thus allowing patients to be the focus of healthcare services. This study evaluates multimedia content, OSG, and the utilization of information for decision-making by patients using infertility services. It is a mixed-method study conducted from January to June 2016 at the Yasmin IVF Clinic, Dr. Cipto Mangunkusumo General Hospital, and the SMART IVF Clinic, Jakarta. The subjects are patients with infertility who sought treatment at the clinics. Data were collected through a structured interview in the form of a questionnaire. Informed consent was obtained from all individual participants included in the study, and all procedures performed were in accordance with institutional ethical standards. The results from 72 respondents showed that quantitative analysis did not reveal any association between multimedia and OSG information sources and patient knowledge regarding infertility management. However, qualitative analysis highlighted three issues: the information regarding infertility services in the available multimedia and the OSG; use of the available information by patients when deciding to use infertility services. Respondents' awareness of searching for infertility information on the clinic website is still limited, because most patients at the clinic were unaware of the existence of the clinic website providing this information. Therefore, the clinic website needs to be promoted so that its usage will increase in the future.

  6. The spinorial geometry of supersymmetric backgrounds

    International Nuclear Information System (INIS)

    Gillard, J; Gran, U; Papadopoulos, G

    2005-01-01

    We propose a new method to solve the Killing spinor equations of 11-dimensional supergravity based on a description of spinors in terms of forms and on the Spin(1, 10) gauge symmetry of the supercovariant derivative. We give the canonical form of Killing spinors for backgrounds preserving two supersymmetries, N = 2, provided that one of the spinors represents the orbit of Spin(1, 10) with stability subgroup SU(5). We directly solve the Killing spinor equations of N = 1 and some N = 2, N = 3 and N = 4 backgrounds. In the N = 2 case, we investigate backgrounds with SU(5) and SU(4) invariant Killing spinors and compute the associated spacetime forms. We find that N = 2 backgrounds with SU(5) invariant Killing spinors admit a timelike Killing vector and that the space transverse to the orbits of this vector field is a Hermitian manifold with an SU(5)-structure. Furthermore, N = 2 backgrounds with SU(4) invariant Killing spinors admit two Killing vectors, one timelike and one spacelike. The space transverse to the orbits of the former is an almost Hermitian manifold with an SU(4)-structure. The spacelike Killing vector field leaves the almost complex structure invariant. We explore the canonical form of Killing spinors for backgrounds preserving more than two supersymmetries, N > 2. We investigate a class of N = 3 and N = 4 backgrounds with SU(4) invariant spinors. We find that in both cases the space transverse to a timelike vector field is a Hermitian manifold equipped with an SU(4)-structure and admits two holomorphic Killing vector fields. We also present an application to M-theory Calabi-Yau compactifications with fluxes to one dimension

  7. The spinorial geometry of supersymmetric backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Gillard, J; Gran, U; Papadopoulos, G [Department of Mathematics, King' s College London, Strand, London WC2R 2LS (United Kingdom)

    2005-03-21

    We propose a new method to solve the Killing spinor equations of 11-dimensional supergravity based on a description of spinors in terms of forms and on the Spin(1, 10) gauge symmetry of the supercovariant derivative. We give the canonical form of Killing spinors for backgrounds preserving two supersymmetries, N = 2, provided that one of the spinors represents the orbit of Spin(1, 10) with stability subgroup SU(5). We directly solve the Killing spinor equations of N = 1 and some N = 2, N = 3 and N = 4 backgrounds. In the N = 2 case, we investigate backgrounds with SU(5) and SU(4) invariant Killing spinors and compute the associated spacetime forms. We find that N = 2 backgrounds with SU(5) invariant Killing spinors admit a timelike Killing vector and that the space transverse to the orbits of this vector field is a Hermitian manifold with an SU(5)-structure. Furthermore, N = 2 backgrounds with SU(4) invariant Killing spinors admit two Killing vectors, one timelike and one spacelike. The space transverse to the orbits of the former is an almost Hermitian manifold with an SU(4)-structure. The spacelike Killing vector field leaves the almost complex structure invariant. We explore the canonical form of Killing spinors for backgrounds preserving more than two supersymmetries, N > 2. We investigate a class of N = 3 and N = 4 backgrounds with SU(4) invariant spinors. We find that in both cases the space transverse to a timelike vector field is a Hermitian manifold equipped with an SU(4)-structure and admits two holomorphic Killing vector fields. We also present an application to M-theory Calabi-Yau compactifications with fluxes to one dimension.

  8. Spontaneous Radiation Background Calculation for LCLS

    CERN Document Server

    Reiche, Sven

    2004-01-01

    The intensity of undulator radiation, not amplified by the FEL interaction, can be larger than the maximum FEL signal in the case of an X-ray FEL. In the commissioning of a SASE FEL it is essential to extract an amplified signal early to diagnose possible misalignment of undulator modules or errors in the undulator field strength. We developed a numerical code to calculate the radiation pattern at any position behind a multi-segmented undulator with arbitrary spacing and field profiles. The output can be run through numerical spatial and frequency filters to model the radiation beam transport and diagnostics. In this presentation we estimate the expected background signal for the FEL diagnostic and at what point along the undulator the FEL signal can be separated from the background. We also discuss how much information on the undulator field and alignment can be obtained from the incoherent radiation signal itself.

  9. Method for Measuring the Information Content of Terrain from Digital Elevation Models

    Directory of Open Access Journals (Sweden)

    Lujin Hu

    2015-10-01

    Full Text Available As digital terrain models are indispensable for visualizing and modeling geographic processes, terrain information content is useful for terrain generalization and representation. For terrain generalization, if the terrain information is considered, the generalized terrain may be of higher fidelity. In other words, the richer the terrain information at the terrain surface, the smaller the degree of terrain simplification. Terrain information content is also important for evaluating the quality of the rendered terrain, e.g., the rendered web terrain tile service in Google Maps (Google Inc., Mountain View, CA, USA. However, a unified definition and measures for terrain information content have not been established. Therefore, in this paper, a definition and measures for terrain information content from Digital Elevation Model (DEM, i.e., a digital model or 3D representation of a terrain’s surface data are proposed and are based on the theory of map information content, remote sensing image information content and other geospatial information content. The information entropy was taken as the information measuring method for the terrain information content. Two experiments were carried out to verify the measurement methods of the terrain information content. One is the analysis of terrain information content in different geomorphic types, and the results showed that the more complex the geomorphic type, the richer the terrain information content. The other is the analysis of terrain information content with different resolutions, and the results showed that the finer the resolution, the richer the terrain information. Both experiments verified the reliability of the measurements of the terrain information content proposed in this paper.
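
    As a concrete illustration of the entropy measure described above, the sketch below scores a DEM by the Shannon entropy of its binned elevation distribution. It is a minimal stand-in, not the authors' exact formulation: the bin count and the use of raw elevations (rather than derived terrain factors) are illustrative assumptions.

        import numpy as np

        def terrain_information_entropy(dem, bins=64):
            """Approximate terrain information content as the Shannon entropy
            (in bits) of the DEM's binned elevation distribution."""
            values = np.asarray(dem, dtype=float).ravel()
            values = values[np.isfinite(values)]        # drop no-data cells
            counts, _ = np.histogram(values, bins=bins)
            p = counts[counts > 0] / counts.sum()       # empirical probabilities
            return float(-(p * np.log2(p)).sum())

        # A finer, more varied surface scores higher than a heavily quantised one.
        rng = np.random.default_rng(0)
        coarse = np.round(rng.random((100, 100)), 1)
        fine = rng.random((100, 100))
        print(terrain_information_entropy(coarse), terrain_information_entropy(fine))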

  10. Risk-Informed SSCs Categorization: Elicitation Method of Expert's Opinion

    International Nuclear Information System (INIS)

    Hwang, Mee Jeong; Yang, Joon Eon; Kim, Kil Yoo

    2005-01-01

    Regulation has been performed in a deterministic way since nuclear power plants began operating. However, according to the results of PSA, some SSCs identified as safety-significant by the deterministic approach turned out to be of low or no safety significance, and some SSCs identified as non-safety-significant turned out to be highly safety-significant. Considering these risk insights, Regulatory Guide 1.174 and 10CFR50.69 were drawn up, and SSCs can now be re-categorized according to their safety significance. Study of, and interest in, risk-informed SSC re-categorization and treatment has therefore continued. The objective of this regulatory initiative is to adjust the scope of equipment subject to special regulatory treatment so as to better focus licensee and regulatory attention and resources on equipment that has safety significance. Most current regulations define the plant equipment necessary to meet the deterministic regulatory basis as 'safety-related.' This equipment is subject to special treatment regulations. Other plant equipment is categorized as 'non-safety related' and is not subject to a select number of special treatment requirements, or only to a subset of those requirements. However, risk information is not a magic tool for making decisions but a supporting tool for categorizing SSCs, because only small parts of a plant are modeled in the PSA model. Thus, engineering and deterministic judgments are also used for risk-informed SSC categorization, and the elicitation of expert opinion is very important. A rational method for eliciting experts' opinions is therefore needed, and in this study we developed a systematic expert-elicitation method for categorizing nuclear power plant SSCs. The current state of SSC categorization in the USA and the existing methods for expert elicitation were surveyed, and a more systematic way of eliciting and combining expert opinions was developed. To validate the developed method

  11. FlexiTerm: a flexible term recognition method

    NARCIS (Netherlands)

    Spasic, I.; Greenwood, M.; Preece, A.; Francis, N.; Elwyn, G.

    2013-01-01

    BACKGROUND: The increasing amount of textual information in biomedicine requires effective term recognition methods to identify textual representations of domain-specific concepts as the first step toward automating its semantic interpretation. The dictionary look-up approaches may not always be

  12. Gestational weight gain information: seeking and sources among pregnant women

    OpenAIRE

    Willcox, Jane C.; Campbell, Karen J.; McCarthy, Elizabeth A.; Lappas, Martha; Ball, Kylie; Crawford, David; Shub, Alexis; Wilkinson, Shelley A.

    2015-01-01

    Background Promoting healthy gestational weight gain (GWG) is important for preventing obstetric and perinatal morbidity, along with obesity in both mother and child. Provision of GWG guidelines by health professionals predicts women meeting GWG guidelines. Research concerning women's GWG information sources is limited. This study assessed pregnant women's sources of GWG information and how, where and which women seek GWG information. Methods Consecutive women (n = 1032) received a mailed que...

  13. System and method for acquisition management of subject position information

    Science.gov (United States)

    Carrender, Curt

    2005-12-13

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  14. System and method for acquisition management of subject position information

    Energy Technology Data Exchange (ETDEWEB)

    Carrender, Curt [Morgan Hill, CA

    2007-01-23

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  15. A Feature Subset Selection Method Based On High-Dimensional Mutual Information

    Directory of Open Access Journals (Sweden)

    Chee Keong Kwoh

    2011-04-01

    Full Text Available Feature selection is an important step in building accurate classifiers and provides better understanding of the data sets. In this paper, we propose a feature subset selection method based on high-dimensional mutual information. We also propose to use the entropy of the class attribute as a criterion to determine the appropriate subset of features when building classifiers. We prove that if the mutual information between a feature set X and the class attribute Y equals the entropy of Y, then X is a Markov Blanket of Y. We show that in some cases, it is infeasible to approximate the high-dimensional mutual information with algebraic combinations of pairwise mutual information in any form. In addition, an exhaustive search of all combinations of features is a prerequisite for finding the optimal feature subsets for classifying these kinds of data sets. We show that our approach outperforms existing filter feature subset selection methods for most of the 24 selected benchmark data sets.
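
    The small sketch below illustrates, on toy discrete data, the criterion stated above: a feature subset X whose mutual information with the class attribute Y reaches H(Y) is a candidate Markov blanket of Y. The exhaustive search over subsets and the tiny data set are purely illustrative assumptions, not the paper's benchmark setup.

        import numpy as np
        from itertools import combinations

        def entropy(labels):
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        def mutual_information(X_subset, y):
            """I(X;Y) = H(Y) - H(Y|X), with X a (possibly multi-column) feature block."""
            joint = [tuple(row) for row in X_subset]
            h_y_given_x = 0.0
            for key in set(joint):
                idx = [i for i, k in enumerate(joint) if k == key]
                h_y_given_x += len(idx) / len(y) * entropy(y[idx])
            return entropy(y) - h_y_given_x

        # Exhaustive search: subsets with I(X;Y) == H(Y) are candidate Markov blankets of Y.
        X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 0], [1, 1, 0]])
        y = np.array([0, 0, 1, 1])
        for r in range(1, X.shape[1] + 1):
            for subset in combinations(range(X.shape[1]), r):
                if np.isclose(mutual_information(X[:, subset], y), entropy(y)):
                    print("candidate Markov blanket:", subset)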

  16. Reduced background autofluorescence for cell imaging using nanodiamonds and lanthanide chelates.

    Science.gov (United States)

    Cordina, Nicole M; Sayyadi, Nima; Parker, Lindsay M; Everest-Dass, Arun; Brown, Louise J; Packer, Nicolle H

    2018-03-14

    Bio-imaging is a key technique in tracking and monitoring important biological processes and fundamental biomolecular interactions, however the interference of background autofluorescence with targeted fluorophores is problematic for many bio-imaging applications. This study reports on two novel methods for reducing interference with cellular autofluorescence for bio-imaging. The first method uses fluorescent nanodiamonds (FNDs), containing nitrogen vacancy centers. FNDs emit at near-infrared wavelengths typically higher than most cellular autofluorescence and, when appropriately functionalized, can be used for background-free imaging of targeted biomolecules. The second method uses europium-chelating tags with long fluorescence lifetimes. These europium-chelating tags enhance background-free imaging due to the short fluorescent lifetimes of cellular autofluorescence. In this study, we used both methods to target E-selectin, a transmembrane glycoprotein that is activated by inflammation, to demonstrate background-free fluorescent staining in fixed endothelial cells. Our findings indicate that both FND and europium based staining can improve fluorescent bio-imaging capabilities by reducing competition with cellular autofluorescence. 30 nm nanodiamonds coated with the E-selectin antibody were found to enable the most sensitive detection of E-selectin in inflamed cells, with a 40-fold increase in intensity detected.

  17. [The German program for disease management guidelines. Background, methods, and development process].

    Science.gov (United States)

    Ollenschläger, Günter; Kopp, Ina; Lelgemann, Monika; Sänger, Sylvia; Heymans, Lothar; Thole, Henning; Trapp, Henrike; Lorenz, Wilfried; Selbmann, Hans-Konrad; Encke, Albrecht

    2006-10-15

    The Program for National Disease Management Guidelines (German DM-CPG Program) was established in 2002 by the German Medical Association (umbrella organization of the German Chambers of Physicians) and joined by the Association of the Scientific Medical Societies (AWMF; umbrella organization of more than 150 professional societies) and by the National Association of Statutory Health Insurance Physicians (NASHIP) in 2003. The program provides a conceptual basis for disease management, focusing on high-priority health-care topics and aiming at the implementation of best practice recommendations for prevention, acute care, rehabilitation and chronic care. It is organized by the German Agency for Quality in Medicine, a founding member of the Guidelines International Network (G-I-N). The main objective of the German DM-CPG Program is to establish consensus of the medical professions on evidence-based key recommendations covering all sectors of health-care provision and facilitating the coordination of care for the individual patient through time and across interfaces. Within the last year, DM-CPGs have been published for asthma, chronic obstructive pulmonary disease, type 2 diabetes, and coronary heart disease. In addition, experts from national patient self-help groups have been developing patient guidance based upon the recommendations for health-care providers. The article describes background, methods, and tools of the DM-CPG Program, and is the first of a publication series dealing with innovative recommendations and aspects of the program.

  18. Performance Comparison of Several Pre-Processing Methods in a Hand Gesture Recognition System based on Nearest Neighbor for Different Background Conditions

    Directory of Open Access Journals (Sweden)

    Regina Lionnie

    2013-09-01

    Full Text Available This paper presents a performance analysis and comparison of several pre-processing methods used in a hand gesture recognition system. The pre-processing methods are based on combinations of several image processing operations, namely edge detection, low pass filtering, histogram equalization, thresholding and desaturation. The hand gesture recognition system is designed to classify an input image into one of six possible classes. The input images are taken with various background conditions. Our experiments showed that the best result is achieved when the pre-processing method consists of only a desaturation operation, achieving a classification accuracy of up to 83.15%.
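
    A minimal sketch of the best-performing pipeline reported above (desaturation only, followed by a nearest-neighbour classifier) is given below. The image size, the channel-average form of desaturation, the Euclidean distance metric and the toy data are illustrative assumptions rather than details taken from the paper.

        import numpy as np

        def desaturate(rgb):
            """Desaturation pre-processing: reduce an RGB image to grey levels."""
            return rgb.mean(axis=-1)    # simple channel average; luminance weights are another option

        def nearest_neighbour_predict(train_imgs, train_labels, test_img):
            """1-NN on flattened, desaturated images using Euclidean distance."""
            feats = np.stack([desaturate(im).ravel() for im in train_imgs])
            query = desaturate(test_img).ravel()
            distances = np.linalg.norm(feats - query, axis=1)
            return train_labels[int(np.argmin(distances))]

        # Toy usage: random 32x32 RGB "gestures" from six classes (0..5).
        rng = np.random.default_rng(0)
        train = rng.random((60, 32, 32, 3))
        labels = np.repeat(np.arange(6), 10)
        print(nearest_neighbour_predict(train, labels, rng.random((32, 32, 3))))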

  19. Exploratory and spatial data analysis (EDA-SDA) for determining regional background levels and anomalies of potentially toxic elements in soils from Catorce-Matehuala, Mexico

    Science.gov (United States)

    Chiprés, J.A.; Castro-Larragoitia, J.; Monroy, M.G.

    2009-01-01

    The threshold between geochemical background and anomalies can be influenced by the methodology selected for its estimation. Environmental evaluations, particularly those conducted in mineralized areas, must consider this when trying to determine the natural geochemical status of a study area, quantifying human impacts, or establishing soil restoration values for contaminated sites. Some methods in environmental geochemistry incorporate the premise that anomalies (natural or anthropogenic) and background data are characterized by their own probabilistic distributions. One of these methods uses exploratory data analysis (EDA) on regional geochemical data sets coupled with a geographic information system (GIS) to spatially understand the processes that influence the geochemical landscape in a technique that can be called a spatial data analysis (SDA). This EDA-SDA methodology was used to establish the regional background range from the area of Catorce-Matehuala in north-central Mexico. Probability plots of the data, particularly for those areas affected by human activities, show that the regional geochemical background population is composed of smaller subpopulations associated with factors such as soil type and parent material. This paper demonstrates that the EDA-SDA method offers more certainty in defining thresholds between geochemical background and anomaly than a numeric technique, making it a useful tool for regional geochemical landscape analysis and environmental geochemistry studies.

  20. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  1. Compressive Online Robust Principal Component Analysis with Multiple Prior Information

    DEFF Research Database (Denmark)

    Van Luong, Huynh; Deligiannis, Nikos; Seiler, Jürgen

    -rank components. Unlike conventional batch RPCA, which processes all the data directly, our method considers a small set of measurements taken per data vector (frame). Moreover, our method incorporates multiple prior information signals, namely previously reconstructed frames, to improve the separation... and thereafter updates the prior information for the next frame. Using experiments on synthetic data, we evaluate the separation performance of the proposed algorithm. In addition, we apply the proposed algorithm to online video foreground and background separation from compressive measurements. The results show...

  2. Statistical removal of background signals from high-throughput 1H NMR line-broadening ligand-affinity screens

    International Nuclear Information System (INIS)

    Worley, Bradley; Sisco, Nicholas J.; Powers, Robert

    2015-01-01

    NMR ligand-affinity screens are vital to drug discovery, are routinely used to screen fragment-based libraries, and are used to verify chemical leads from high-throughput assays and virtual screens. NMR ligand-affinity screens are also a highly informative first step towards identifying functional epitopes of unknown proteins, as well as elucidating the biochemical functions of protein–ligand interaction at their binding interfaces. While simple one-dimensional 1H NMR experiments are capable of indicating binding through a change in ligand line shape, they are plagued by broad, ill-defined background signals from protein 1H resonances. We present an uncomplicated method for subtraction of protein background in high-throughput ligand-based affinity screens, and show that its performance is maximized when phase-scatter correction is applied prior to subtraction.
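
    As a hedged illustration of the general idea (not the authors' exact procedure), the sketch below scales a protein-only reference spectrum by least squares and subtracts it from a ligand-protein spectrum, leaving the sharp ligand resonances; the simple scaling stands in for the phase-scatter correction discussed above.

        import numpy as np

        def subtract_protein_background(mixture, protein_reference):
            """Remove a broad protein background from a 1D 1H spectrum by
            least-squares scaling of a protein-only reference, then subtraction."""
            mixture = np.asarray(mixture, dtype=float)
            ref = np.asarray(protein_reference, dtype=float)
            scale = np.dot(ref, mixture) / np.dot(ref, ref)   # least-squares scale factor
            return mixture - scale * ref

        # Toy usage: a broad Gaussian "protein" background plus one sharp "ligand" peak.
        x = np.linspace(-5, 5, 2001)
        protein = np.exp(-x**2 / 4.0)
        observed = 0.8 * protein + 0.1 * np.exp(-(x - 1.0)**2 / 0.001)
        residual = subtract_protein_background(observed, protein)
        print(round(float(residual.max()), 3))   # the sharp ligand peak survives the subtraction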

  3. Heat kernel expansion in the background field formalism

    CERN Document Server

    Barvinsky, Andrei

    2015-01-01

    Heat kernel expansion and background field formalism represent the combination of two calculational methods within the functional approach to quantum field theory. This approach implies construction of generating functionals for matrix elements and expectation values of physical observables. These are functionals of arbitrary external sources or the mean field of a generic configuration -- the background field. Exact calculation of quantum effects on a generic background is impossible. However, a special integral (proper time) representation for the Green's function of the wave operator -- the propagator of the theory -- and its expansion in the ultraviolet and infrared limits of respectively short and late proper time parameter allow one to construct approximations which are valid on generic background fields. Current progress of quantum field theory, its renormalization properties, model building in unification of fundamental physical interactions and QFT applications in high energy physics, gravitation and...

  4. Socio-economic background and prevalence of visual defects ...

    African Journals Online (AJOL)

    The thrust of this study is to examine the socio-economic background and prevalence of visual defects among students in public and private secondary schools in Calabar municipality in Cross River State. The main objective of the study is to screen for and present information on the prevalence of visual defects amongst the ...

  5. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    Science.gov (United States)

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Short Term Gain, Long Term Pain: Informal Job Search Methods and Post-Displacement Outcomes

    OpenAIRE

    Green, Colin

    2012-01-01

    This paper examines the role of informal job search methods in the labour market outcomes of displaced workers. Informal job search methods could alleviate the short-term labour market difficulties of displaced workers by providing information on job opportunities and allowing them to signal their productivity, and may mitigate wage losses through better post-displacement job matching. However, if displacement results from reductions in demand for specific sectors/skills, the use of informal job searc...

  7. Information systems for mental health in six low and middle income countries : Cross country situation analysis

    NARCIS (Netherlands)

    Upadhaya, Nawaraj; Jordans, Mark J D; Abdulmalik, Jibril; Ahuja, Shalini; Alem, Atalay; Hanlon, Charlotte; Kigozi, Fred; Kizza, Dorothy; Lund, Crick; Semrau, Maya; Shidhaye, Rahul; Thornicroft, Graham; Komproe, Ivan H.; Gureje, Oye

    2016-01-01

    Background: Research on information systems for mental health in low and middle income countries (LMICs) is scarce. As a result, there is a lack of reliable information on mental health service needs, treatment coverage and the quality of services provided. Methods: With the aim of informing the

  8. EPR dosimetry of radiation background in the Urals region

    Energy Technology Data Exchange (ETDEWEB)

    Shishkina, E.A.; Degteva, M.O.; Shved, V.A. [Urals Research Center for Radiation Medicine, 48-A Vorovsky, Chelyabinsk 454076 (Russian Federation); Fattibene, P.; Onori, S. [Istituto Superiore di Sanita and Istituto Nazionale di Fisica Nucleare (Italy); Wieser, A. [GSF, Forschungszentrum fuer Umwelt und Gesundheit, Ingolstaedter Landstr (Germany); Ivanov, D.V.; Bayankin, S.N. [Institute of Metal Physics, Russian Academy of Sciences (Russian Federation); Knyazev, V.A.; Vasilenko, E.I.; Gorelov, M. [ZAO, Closed Corporation ' Company GEOSPETSECOLOGIA' (Russian Federation)

    2006-07-01

    The method of Electron Paramagnetic Resonance (EPR) is extensively applied to individual retrospective dosimetry. The background dose is an unavoidable component of the cumulative absorbed dose in tooth enamel accumulated during the lifetime of the donor. Estimating an incidental radiation dose using tooth enamel requires extraction of the background dose. Moreover, the variation of background doses in the population is a limiting factor for reliable detection of additional irradiation, especially at low dose levels. Accurate knowledge of the natural background radiation dose is therefore a critical element of EPR studies of exposed populations. In the Urals region the method is applied to two large cohorts: the workers of Mayak (Ozersk citizens) and the Techa River riverside inhabitants (rural population). The current study aimed to investigate the Urals radiation background detected by EPR spectrometry. For this aim two groups of unexposed Urals residents were separated, viz: citizens of Ozersk and rural inhabitants of the Chelyabinsk region. Comparison of the two investigated territories demonstrated that, from the point of view of radiation background, the Urals population cannot be assumed to be uniform. A reliable difference between the urban and rural residents was found. The average background doses of Ozersk donors are on average 50 mGy higher than those detected for rural residents. The individual variability of background doses for Ozersk was higher than in the rural results. The difference in background dose levels between the two populations results in different limits of accidental dose detection and individualization. The doses for 'Mayak' workers (Ozersk citizens) can be classed as anthropogenic if the EPR measurements exceed 120 mGy for teeth younger than 40 years, and 240 mGy for teeth older than 70 years. The anthropogenic doses for Techa River residents (rural population) would be higher than 95 mGy for teeth younger than 50 years and 270 mGy for

  9. EPR dosimetry of radiation background in the Urals region

    International Nuclear Information System (INIS)

    Shishkina, E.A.; Degteva, M.O.; Shved, V.A.; Fattibene, P.; Onori, S.; Wieser, A.; Ivanov, D.V.; Bayankin, S.N.; Knyazev, V.A.; Vasilenko, E.I.; Gorelov, M.

    2006-01-01

    The method of Electron Paramagnetic Resonance (EPR) is extensively applied to individual retrospective dosimetry. The background dose is an unavoidable component of the cumulative absorbed dose in tooth enamel accumulated during the lifetime of the donor. Estimating an incidental radiation dose using tooth enamel requires extraction of the background dose. Moreover, the variation of background doses in the population is a limiting factor for reliable detection of additional irradiation, especially at low dose levels. Accurate knowledge of the natural background radiation dose is therefore a critical element of EPR studies of exposed populations. In the Urals region the method is applied to two large cohorts: the workers of Mayak (Ozersk citizens) and the Techa River riverside inhabitants (rural population). The current study aimed to investigate the Urals radiation background detected by EPR spectrometry. For this aim two groups of unexposed Urals residents were separated, viz: citizens of Ozersk and rural inhabitants of the Chelyabinsk region. Comparison of the two investigated territories demonstrated that, from the point of view of radiation background, the Urals population cannot be assumed to be uniform. A reliable difference between the urban and rural residents was found. The average background doses of Ozersk donors are on average 50 mGy higher than those detected for rural residents. The individual variability of background doses for Ozersk was higher than in the rural results. The difference in background dose levels between the two populations results in different limits of accidental dose detection and individualization. The doses for 'Mayak' workers (Ozersk citizens) can be classed as anthropogenic if the EPR measurements exceed 120 mGy for teeth younger than 40 years, and 240 mGy for teeth older than 70 years. The anthropogenic doses for Techa River residents (rural population) would be higher than 95 mGy for teeth younger than 50 years and 270 mGy for teeth older

  10. Binary recursive partitioning: background, methods, and application to psychology.

    Science.gov (United States)

    Merkle, Edgar C; Shaffer, Victoria A

    2011-02-01

    Binary recursive partitioning (BRP) is a computationally intensive statistical method that can be used in situations where linear models are often used. Instead of imposing many assumptions to arrive at a tractable statistical model, BRP simply seeks to accurately predict a response variable based on values of predictor variables. The method outputs a decision tree depicting the predictor variables that were related to the response variable, along with the nature of the variables' relationships. No significance tests are involved, and the tree's 'goodness' is judged based on its predictive accuracy. In this paper, we describe BRP methods in a detailed manner and illustrate their use in psychological research. We also provide R code for carrying out the methods.
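
    The abstract notes that R code accompanies the paper; the snippet below is a minimal Python analogue (an assumption of this summary, not the authors' code) using scikit-learn's CART-style decision tree, with held-out predictive accuracy rather than significance tests judging the tree, as described above.

        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Binary recursive partitioning: repeatedly split the predictor space to
        # predict the response, then judge the tree by its predictive accuracy.
        X, y = load_breast_cancer(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
        print(export_text(tree))                         # the fitted tree as readable split rules
        print("held-out accuracy:", tree.score(X_test, y_test))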

  11. Optimal Search for an Astrophysical Gravitational-Wave Background

    OpenAIRE

    Rory Smith; Eric Thrane

    2018-01-01

    Roughly every 2–10 min, a pair of stellar-mass black holes merge somewhere in the Universe. A small fraction of these mergers are detected as individually resolvable gravitational-wave events by advanced detectors such as LIGO and Virgo. The rest contribute to a stochastic background. We derive the statistically optimal search strategy (producing minimum credible intervals) for a background of unresolved binaries. Our method applies Bayesian parameter estimation to all available data. Using M...

  12. Detection of admittivity anomaly on high-contrast heterogeneous backgrounds using frequency difference EIT.

    Science.gov (United States)

    Jang, J; Seo, J K

    2015-06-01

    This paper describes a multiple background subtraction method in frequency difference electrical impedance tomography (fdEIT) to detect an admittivity anomaly from a high-contrast background conductivity distribution. The proposed method expands the use of the conventional weighted frequency difference EIT method, which has been used limitedly to detect admittivity anomalies in a roughly homogeneous background. The proposed method can be viewed as multiple weighted difference imaging in fdEIT. Although the spatial resolutions of the output images by fdEIT are very low due to the inherent ill-posedness, numerical simulations and phantom experiments of the proposed method demonstrate its feasibility to detect anomalies. It has potential application in stroke detection in a head model, which is highly heterogeneous due to the skull.
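
    For orientation, the sketch below implements the conventional weighted frequency-difference step that the method above generalizes: the low-frequency boundary-voltage vector is projected out of the high-frequency one so that a frequency-independent background component cancels. The vectors and the single-background projection formula are a generic illustration and an assumption of this summary, not the paper's multiple-background formulation.

        import numpy as np

        def weighted_frequency_difference(v_low, v_high):
            """Weighted difference of boundary-voltage vectors measured at two
            frequencies: subtracting the projection of v_high onto v_low cancels
            a common (frequency-independent) background component."""
            v_low = np.asarray(v_low, dtype=float)
            v_high = np.asarray(v_high, dtype=float)
            alpha = np.dot(v_low, v_high) / np.dot(v_low, v_low)
            return v_high - alpha * v_low

        # Toy usage: a shared background pattern plus a small frequency-dependent anomaly.
        background = np.sin(np.linspace(0.0, 2.0 * np.pi, 16))
        anomaly = np.zeros(16)
        anomaly[5] = 0.05
        print(weighted_frequency_difference(background, background + anomaly).round(3))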

  13. Using Correlated Photons to Suppress Background Noise

    Science.gov (United States)

    Jackson, Deborah; Hockney, George; Dowling, Jonathan

    2003-01-01

    A proposed method of suppressing the effect of background noise in an optical communication system would exploit the transmission and reception of correlated photons at the receiver. The method would not afford any advantage in a system in which performance is limited by shot noise. However, if the performance of the system is limited by background noise (e.g., sunlight in the case of a free-space optical communication system or incoherently scattered in-band photons in the case of a fiber-optic communication system), then the proposed method could offer an advantage: it would make it possible to achieve a signal-to-noise ratio (S/N) significantly greater than that of an otherwise equivalent background-noise-limited optical communication system based on the classical transmission and reception of uncorrelated photons. The figure schematically depicts a classical optical-communication system and a system according to the proposed method. In the classical system, a modulated laser beam is transmitted along an optical path to a receiver, the optics of which include a narrow-band-pass filter that suppresses some of the background noise. A photodetector in the receiver detects the laser-beam and background photons, most or all of which are uncorrelated. In the proposed system, correlated photons would be generated at the transmitter by making a modulated laser beam pass through a nonlinear parametric down-conversion crystal. The sum of frequencies of the correlated photons in each pair would equal the frequency of the incident photon from which they were generated. As in the classical system, the correlated photons would travel along an optical path to a receiver, where they would be band-pass filtered and detected. Unlike in the classical system, the photodetector in the receiver in this system would be one that intrinsically favors the detection of pairs of correlated photons over the detection of uncorrelated photons. Even though there would be no

  14. A new method to detect geometrical information by the tunneling microscope

    DEFF Research Database (Denmark)

    Tasaki, S.; Levitan, J.; Mygind, Jesper

    1997-01-01

    A new method for the detection of the geometrical information by the scanning tunneling microscope is proposed. In addition to the bias voltage, a small ac modulation is applied. The nonlinear dependence of the transmission coefficient on the applied voltage is used to generate harmonics. The ratio...... of the harmonics to the dc current is found to give the width between the sample and the probe, i.e., the geometrical information. This method may be useful to measure materials, where the local-spatial-density of states may change notably from place to place. ©1997 American Institute of Physics....

  15. On estimating the background of remote sensing gamma-ray spectroscopic data

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Meng-Hua, E-mail: mhzhu@must.edu.mo

    2016-10-01

    In this paper, we considered the inverse count accumulation process of gamma-ray spectrum and derived an iterative filtering method to estimate the background of noisy spectroscopic data for the remote sensing observations of planetary surface. Compared with the SNIP method, the proposed method avoids the calculation of the average FWHM of the whole spectrum or the peak regions, which is an important parameter for the SNIP method. The synthetic and experimental spectra are used to validate the derived method. The results show that the proposed method can estimate the background efficiently, especially for the spectroscopic data with Compton continuum. In addition, by combining the proposed method and the SNIP method, the average FWHM can be determined easily, which can be used to validate the characteristics of detector.
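
    For context, the sketch below implements the basic SNIP clipping baseline against which the proposed method is compared (without the LLS transform and smoothing refinements often added in practice); the window size m plays the role of the FWHM-dependent parameter that the proposed method avoids having to choose.

        import numpy as np

        def snip_background(spectrum, m=20):
            """Classic SNIP baseline estimate: iteratively clip each channel to the
            mean of its neighbours at increasing distances p = 1..m."""
            y = np.asarray(spectrum, dtype=float).copy()
            n = len(y)
            for p in range(1, m + 1):
                clipped = y.copy()
                for i in range(p, n - p):
                    clipped[i] = min(y[i], 0.5 * (y[i - p] + y[i + p]))
                y = clipped
            return y

        # Toy usage: a smooth continuum plus two peaks; the estimate follows the continuum.
        x = np.arange(512)
        continuum = 200.0 * np.exp(-x / 300.0)
        peaks = 150.0 * np.exp(-(x - 100.0) ** 2 / 8.0) + 80.0 * np.exp(-(x - 300.0) ** 2 / 18.0)
        estimate = snip_background(continuum + peaks, m=30)
        print(round(float(np.abs(estimate - continuum).mean()), 2))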

  16. Cavern background measurement with the ATLAS RPC system

    CERN Document Server

    Aielli, G; The ATLAS collaboration

    2012-01-01

    The measurement of the cavern background has been carried out systematically since the beginning of LHC operation, as soon as the luminosity produced a detectable signal, from L = 10^28 cm^-2 s^-1 in the early 2010 operation up to L = 3.5x10^33 cm^-2 s^-1 at the end of the 2011 proton-proton run, which is just 1/3 of the nominal LHC luminosity. The aim is to foresee early the running conditions of the detector at the nominal LHC luminosity and beyond, in view of the super-LHC upgrade. Background Monte Carlo calculations have been validated against data, and the background map analysis pointed out hotspots due to localized cracks in the radiation shielding. The RPCs have participated in this effort since the earliest stages, providing an accurate correlation between luminosity and background, a 3D background map in the barrel region and a direct measurement of the cavern activation. Moreover, due to the high sensitivity and very good signal-to-noise ratio of the proposed method, based on the gap current, the measurement was provided in...

  17. Cavern background measurement with the ATLAS RPC system

    CERN Document Server

    Aielli, G; The ATLAS collaboration

    2012-01-01

    The measurement of the cavern background has been carried out systematically since the beginning of LHC operation, as soon as the luminosity produced a detectable signal, from L = 10^28 cm^-2 s^-1 in the early 2010 operation up to L = 3.5x10^33 cm^-2 s^-1 at the end of the 2011 proton-proton run, which is just 1/3 of the nominal LHC luminosity. The aim is to foresee early the running conditions of the detector at the nominal LHC luminosity and beyond, in view of the super-LHC upgrade. Background Monte Carlo calculations have been validated against data, and the background map analysis pointed out hotspots due to localized cracks in the radiation shielding. The RPCs have participated in this effort since the earliest stages, providing an accurate correlation between luminosity and background, a 3D background map in the barrel region and a direct measurement of the cavern activation. Moreover, due to the high sensitivity and very good signal-to-noise ratio of the proposed method, based on the gap current, the measurement was provided in...

  18. Beta activity measurements in high, variable gamma backgrounds

    International Nuclear Information System (INIS)

    Stanga, D.; Sandu, E.; Craciun, L.

    1997-01-01

    In many cases beta activity measurements must be performed in high and variable gamma backgrounds. In such instances it is necessary to use well-shielded detectors, but this technique is limited to laboratory equipment and is frequently insufficient. In order to perform beta activity measurements in high and variable backgrounds in a simple manner, a software-aided counting technique has been developed and a counting system has been constructed. This technique combines different counting techniques with the traditional method of successive measurement of the sample and the background. The counting system is based on a programmable multi-scaler which is endowed with appropriate software and allows all operations to be performed via keyboard in an interactive fashion. Two large-area proportional detectors were selected in order to have the same background and the same gamma response within 5%. A program has been developed for counting data analysis and beta activity computation. The software-aided counting technique has been implemented for beta activity measurement in high and variable backgrounds. (authors)

  19. Aluminum as a source of background in low background experiments

    Energy Technology Data Exchange (ETDEWEB)

    Majorovits, B., E-mail: bela@mppmu.mpg.de [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany); Abt, I. [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany); Laubenstein, M. [Laboratori Nazionali del Gran Sasso, INFN, S.S.17/bis, km 18 plus 910, I-67100 Assergi (Italy); Volynets, O. [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2011-08-11

    Neutrinoless double beta decay would be a key to understanding the nature of neutrino masses. The next generation of High Purity Germanium experiments will have to be operated with a background rate of better than 10^-5 counts/(kg y keV) in the region of interest around the Q-value of the decay. Therefore, sources of background that have so far been irrelevant have to be considered. The metalization of the surface of germanium detectors is in general done with aluminum. The background from the decays of 22Na, 26Al, 226Ra and 228Th introduced by this metalization is discussed. It is shown that only a special selection of aluminum can keep these background contributions acceptable.

  20. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    Science.gov (United States)

    Land, Cecil E.

    1990-01-01

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field.

  1. Classification of supersymmetric backgrounds of string theory

    NARCIS (Netherlands)

    Gran, Ulf; Gutowski, Jan; Papadopoulos, George; Roest, Diederik

    2007-01-01

    We review the recent progress made towards the classification of supersymmetric solutions in ten and eleven dimensions with emphasis on those of IIB supergravity. In particular, the spinorial geometry method is outlined and adapted to nearly maximally supersymmetric backgrounds. We then demonstrate

  2. Electromagnetic wave collapse in a radiation background

    International Nuclear Information System (INIS)

    Marklund, Mattias; Brodin, Gert; Stenflo, Lennart

    2003-01-01

    The nonlinear interaction, due to quantum electrodynamical (QED) effects, between an electromagnetic pulse and a radiation background is investigated by combining the methods of radiation hydrodynamics with the QED theory for photon-photon scattering. For the case of a single coherent electromagnetic pulse, we obtain a Zakharov-like system, where the radiation pressure of the pulse acts as a driver of acoustic waves in the photon gas. For a sufficiently intense pulse and/or background energy density, there is focusing and subsequent collapse of the pulse. The relevance of our results for various astrophysical applications is discussed

  3. Polarimeter Arrays for Cosmic Microwave Background Measurements

    Science.gov (United States)

    Stevenson, Thomas; Cao, Nga; Chuss, David; Fixsen, Dale; Hsieh, Wen-Ting; Kogut, Alan; Limon, Michele; Moseley, S. Harvey; Phillips, Nicholas; Schneider, Gideon

    2006-01-01

    We discuss general system architectures and specific work towards precision measurements of Cosmic Microwave Background (CMB) polarization. The CMB and its polarization carry fundamental information on the origin, structure, and evolution of the universe. Detecting the imprint of primordial gravitational radiation on the faint polarization of the CMB will be difficult. The two primary challenges will be achieving both the required sensitivity and precise control over systematic errors. At anisotropy levels possibly as small as a few nanokelvin, the gravity-wave signal is faint compared to the fundamental sensitivity limit imposed by photon arrival statistics, and one must make simultaneous measurements with large numbers, hundreds to thousands, of independent background-limited direct detectors. Highly integrated focal plane architectures, and multiplexing of detector outputs, will be essential. Because the detectors, optics, and even the CMB itself are brighter than the faint gravity-wave signal by six to nine orders of magnitude, even a tiny leakage of polarized light reflected or diffracted from warm objects could overwhelm the primordial signal. Advanced methods of modulating only the polarized component of the incident radiation will play an essential role in measurements of CMB polarization. One promising general polarimeter concept that is under investigation by a number of institutions is to first use planar antennas to separate millimeter-wave radiation collected by a lens or horn into two polarization channels. Then the signals can be fed to a pair of direct detectors through a planar circuit consisting of superconducting niobium microstrip transmission lines, hybrid couplers, band-pass filters, and phase modulators to measure the Stokes parameters of the incoming radiation.

  4. The criteria for selecting a method for unfolding neutron spectra based on the information entropy theory

    International Nuclear Information System (INIS)

    Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin

    2014-01-01

    To further expand the application of an artificial neural network in the field of neutron spectrometry, criteria for choosing between an artificial neural network and the maximum entropy method for the purpose of unfolding neutron spectra were presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was acquired. Useful information from the information entropy guided the selection of unfolding methods. Due to the importance of the information entropy, a method for predicting the information entropy from the Bonner spheres' counts was established. The criteria based on information entropy theory can be used to choose between the artificial neural network and maximum entropy unfolding methods. The application of an artificial neural network to unfold neutron spectra was thereby expanded. - Highlights: • Two neutron spectra unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For the spectrum with low entropy, the ANN was generally better than MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts

  5. TIMSS 2011 User Guide for the International Database. Supplement 2: National Adaptations of International Background Questionnaires

    Science.gov (United States)

    Foy, Pierre, Ed.; Arora, Alka, Ed.; Stanco, Gabrielle M., Ed.

    2013-01-01

    This supplement describes national adaptations made to the international version of the TIMSS 2011 background questionnaires. This information provides users with a guide to evaluate the availability of internationally comparable data for use in secondary analyses involving the TIMSS 2011 background variables. Background questionnaire adaptations…

  6. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    Science.gov (United States)

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  7. Ensuring the integrity of information resources based on methods of two-symbol (binary) structural data encoding

    Directory of Open Access Journals (Sweden)

    О.К. Юдін

    2009-01-01

    Full Text Available Methods have been developed for estimating the noise stability of structural code constructions and correcting them against distortion during data communication in information and communication systems and networks, taking into account the need to ensure the integrity of the information resource.

  8. The dark soliton on a cnoidal wave background

    International Nuclear Information System (INIS)

    Shin, H J

    2005-01-01

    We find a solution of the dark soliton lying on a cnoidal wave background in a defocusing medium. We use the method of Darboux transformation, which is applied to the cnoidal wave solution of the defocusing nonlinear Schroedinger equation. Interesting characteristics of the dark soliton, i.e., the velocity and greyness, are calculated and compared with those of the dark soliton lying on a continuous wave background. We also calculate the shift of the crest of the cnoidal wave along the soliton
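
    For orientation, the constant-background case that the cnoidal-wave solution above is compared against can be quoted in a common normalization (the normalization itself is an assumption of this summary): the defocusing nonlinear Schroedinger equation and its dark soliton on a continuous-wave background read

        i\psi_t + \tfrac{1}{2}\psi_{xx} - |\psi|^2 \psi = 0,
        \psi(x,t) = \eta\left[\beta \tanh\big(\eta\beta\,(x - \eta\alpha t)\big) + i\alpha\right] e^{-i\eta^2 t},
        \alpha^2 + \beta^2 = 1,

    so that the soliton velocity is v = \eta\alpha and the greyness (the normalized minimum intensity) is \alpha^2, with \alpha = 0 giving a black soliton. These are the quantities whose counterparts on the cnoidal wave background are computed in the paper.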

  9. Autogenic-Feedback Training (AFT) as a preventive method for space motion sickness: Background and experimental design

    Science.gov (United States)

    Cowings, Patricia S.; Toscano, William B.

    1993-01-01

    Finding an effective treatment for the motion sickness-like symptoms that occur in space has become a high priority for NASA. The background research is reviewed and the experimental design of a formal life sciences shuttle flight experiment designed to prevent space motion sickness in shuttle crew members is presented. This experiment utilizes a behavioral medicine approach to solving this problem. This method, Autogenic-Feedback Training (AFT), involves training subjects to voluntarily control several of their own physiological responses to environmental stressors. AFT has been used reliably to increase tolerance to motion sickness during ground-based tests in over 200 men and women under a variety of conditions that induce motion sickness, and preliminary evidence from space suggests that AFT may be an effective treatment for space motion sickness as well. Proposed changes to this experiment for future manifests are included.

  10. Information technology equipment cooling method

    Science.gov (United States)

    Schultz, Mark D.

    2015-10-20

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

  11. How information systems should support the information needs of general dentists in clinical settings: suggestions from a qualitative study

    Directory of Open Access Journals (Sweden)

    Wali Teena

    2010-02-01

    Full Text Available Abstract Background A major challenge in designing useful clinical information systems in dentistry is to incorporate clinical evidence based on dentists' information needs and then integrate the system seamlessly into the complex clinical workflow. However, little is known about the actual information needs of dentists during treatment sessions. The purpose of this study is to identify general dentists' information needs and the information sources they use to meet those needs in clinical settings so as to inform the design of dental information systems. Methods A semi-structured interview was conducted with a convenience sample of 18 general dentists in the Pittsburgh area during clinical hours. One hundred and five patient cases were reported by these dentists. Interview transcripts were coded and analyzed using thematic analysis with a constant comparative method to identify categories and themes regarding information needs and information source use patterns. Results Two top-level categories of information needs were identified: foreground and background information needs. To meet these needs, dentists used four types of information sources: clinical information/tasks, administrative tasks, patient education and professional development. Major themes of dentists' unmet information needs include: (1) timely access to information on various subjects; (2) better visual representations of dental problems; (3) access to patient-specific evidence-based information; and (4) accurate, complete and consistent documentation of patient records. Resource use patterns include: (1) dentists' information needs matched information source use; (2) little use of electronic sources took place during treatment; (3) source use depended on the nature and complexity of the dental problems; and (4) dentists routinely practiced cross-referencing to verify patient information. Conclusions Dentists have various information needs at the point of care. Among them, the needs

  12. 102: PROMOTING INFORMATION LITERACY BY PROMOTING HEALTH LITERACY IN THE INFORMATION SOCIETY

    Science.gov (United States)

    Dastani, Meisam; Sattari, Masoume

    2017-01-01

    Background and aims In the information society, the production, distribution and use of information are freely and widely available for all issues of life. The correct and appropriate use of suitable and reliable information is especially important in health care. The present study introduces the concepts and benefits of health literacy and information literacy and the role of information literacy in improving health literacy. Methods This study is a review of the concepts of the information society, information literacy and information education, presenting the importance of promoting information literacy for health literacy in the information society. Results and Conclusion The information society, by providing a platform of information technology and computer systems, attempts to exchange and develop information between people in the community. Currently, a mass of electronic and web-based health information is available to people. Information, as a fundamental base of the information society, is a phenomenon that affects our decisions in relation to various issues, such as safety and health. An important point is to avoid the mass of invalid, incorrect and inappropriate information available on the internet. This requires information literacy skills such as identifying, accessing and evaluating information. In general, it can be said that promoting health literacy in communities requires learning different skills in the form of information literacy.

  13. The Danish National Birth Cohort--its background, structure and aim

    DEFF Research Database (Denmark)

    Olsen, J; Melbye, M; Olsen, S F

    2001-01-01

    BACKGROUND: It is well known that the time from conception to early childhood has importance for health conditions that reach into later stages of life. Recent research supports this view, and diseases such as cardiovascular morbidity, cancer, mental illnesses, asthma, and allergy may all have...... component causes that act early in life. Exposures in this period, which influence fetal growth, cell divisions, and organ functioning, may have long-lasting impact on health and disease susceptibility. METHODS: To investigate these issues the Danish National Birth Cohort (Better health for mother and child....... Exposure information is mainly collected by computer-assisted telephone interviews with the women twice during pregnancy and when their children are six and 18 months old. Participants are also asked to fill in a self-administered food frequency questionnaire in mid-pregnancy. Furthermore, a biological...

  14. Personal exposure to grass pollen: relating inhaled dose to background concentration

    DEFF Research Database (Denmark)

    Peel, Robert George; Hertel, Ole; Smith, Matt

    2013-01-01

    Background: Very few studies on human exposure to allergenic pollen have been conducted using direct methods, with background concentrations measured at city center monitoring stations typically taken as a proxy for exposure despite the inhomogeneous nature of atmospheric pollen concentrations. A...

  15. Listening to Students from Refugee Backgrounds: Lessons for Education Professionals

    Science.gov (United States)

    Mthethwa-Sommers, Shirley; Kisiara, Otieno

    2015-01-01

    This article is based on a study that examined how students from refugee backgrounds cope with victimization and bullying in three urban high schools in the United States. Qualitative methods of data collection and analysis were employed. Twelve high school students from refugee backgrounds participated in the study, which involved focus group…

  16. Background radiation map of Thailand

    International Nuclear Information System (INIS)

    Angsuwathana, P.; Chotikanatis, P.

    1997-01-01

    The radioelement concentration in the natural environment, as well as the radiation exposure to man in day-to-day life, is now a topic of great interest. Natural radiation is frequently referred to as a standard for comparison with additional sources of man-made radiation such as atomic weapon fallout, nuclear power generation, radioactive waste disposal, etc. The Department of Mineral Resources commenced a five-year project of nationwide airborne geophysical survey, awarded to Kenting Earth Sciences International Limited in 1984. The original purpose of the survey was to support mineral exploration and geological mapping. Subsequently, the data proved to be suitable for natural radiation information as well. In 1993 the Department of Mineral Resources, with the assistance of the IAEA, published a Background Radiation Map of Thailand at the scale of 1:1,000,000 from the existing airborne radiometric digital data. The production of the Background Radiation Map of Thailand is the result of a data compilation and correction procedure developed over the Canadian Shield. This end product will be used as a base map in environmental applications not only for Thailand but also for the Southeast Asia region. (author)

  17. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction, owing to the complexity of the healthcare environment. Currently, during healthcare information system integration, the people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that uses the business process model and notation (BPMN) to model integration requirements and automatically transforms them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.
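
    As a rough illustration of the idea described above (and not the authors' implementation), the sketch below parses a toy BPMN 2.0 file with Python's standard library and emits a simple routing configuration. The task types examined, the endpoint scheme and the output format are assumptions made purely for demonstration.

        # Sketch: derive an integration routing table from a BPMN process model.
        # Only send/receive tasks are mapped; the endpoint addresses are invented.
        import xml.etree.ElementTree as ET

        BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

        def bpmn_to_config(path):
            root = ET.parse(path).getroot()
            config = []
            for kind, direction in (("sendTask", "outbound"), ("receiveTask", "inbound")):
                for task in root.iter(f"{{{BPMN_NS}}}{kind}"):
                    config.append({
                        "step": task.get("name") or task.get("id"),
                        "direction": direction,
                        "endpoint": f"mllp://integration-engine/{task.get('id')}",  # hypothetical address
                    })
            return config

        # for entry in bpmn_to_config("radiology_order.bpmn"):
        #     print(entry)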

  18. Organizational and provider level factors in implementation of trauma-informed care after a city-wide training: an explanatory mixed methods assessment

    Directory of Open Access Journals (Sweden)

    April Joy Damian

    2017-11-01

    Background: While there is increasing support for training youth-serving providers in trauma-informed care (TIC) as a means of addressing high prevalence of U.S. childhood trauma, we know little about the effects of TIC training on organizational culture and providers’ professional quality of life. This mixed-methods study evaluated changes in organizational- and provider-level factors following participation in a citywide TIC training. Methods: Government workers and nonprofit professionals (N = 90) who participated in a nine-month citywide TIC training completed a survey before and after the training to assess organizational culture and professional quality of life. Survey data were analyzed using multiple regression analyses. A subset of participants (n = 16) was interviewed using a semi-structured format, and themes related to organizational and provider factors were identified using qualitative methods. Results: Analysis of survey data indicated significant improvements in participants’ organizational culture and professional satisfaction at training completion. Participants’ perceptions of their own burnout and secondary traumatic stress also increased. Four themes emerged from analysis of the interview data, including “Implementation of more flexible, less-punitive policies towards clients,” “Adoption of trauma-informed workplace design,” “Heightened awareness of own traumatic stress and need for self-care,” and “Greater sense of camaraderie and empathy for colleagues.” Conclusion: Use of a mixed-methods approach provided a nuanced understanding of the impact of TIC training and suggested potential benefits of the training on organizational and provider-level factors associated with implementation of trauma-informed policies and practices. Future trainings should explicitly address organizational factors such as safety climate and morale, managerial support, teamwork climate and collaboration, and ...

  19. Saudi Arabia: Background and U.S. Relations

    Science.gov (United States)

    2015-04-29

    23, 2013. 6 Background information on Saudi cabinet members is available at http://www.saudiembassy.net/about/ Biographies - of-Ministers.aspx. Saudi...territories it occupied in 1967, (2) agree to the establishment of a Palestinian state with a capital in East Jerusalem , and provide for the (3) “[a...compromise on Palestinian sovereignty in Jerusalem .” Elhanan Miller, “Arab ministers back Abbas in rejecting ‘Jewish’ Israel,” Times of Israel, January

  20. arXiv Probing non-Gaussian Stochastic Gravitational Wave Backgrounds with LISA

    CERN Document Server

    Bartolo, Nicola; Figueroa, Daniel G.; Garcia-Bellido, Juan; Peloso, Marco; Pieroni, Mauro; Ricciardone, Angelo; Sakellariadou, Mairi; Sorbo, Lorenzo; Tasinato, Gianmassimo

    The stochastic gravitational wave background (SGWB) contains a wealth of information on astrophysical and cosmological processes. A major challenge of upcoming years will be to extract the information contained in this background and to disentangle the contributions of different sources. In this paper we provide the formalism to extract, from the correlation of three signals in the Laser Interferometer Space Antenna (LISA), information about the tensor three-point function, which characterizes the non-Gaussian properties of the SGWB. Compared to the two-point function, the SGWB three-point function has a richer dependence on the gravitational wave momenta and chiralities, and a larger number of signal channels. It can be used therefore as a powerful discriminator between different models. We provide LISA's response functions to a general SGWB three-point function. As examples, we study in full detail the cases of an equilateral and squeezed SGWB bispectra, and provide the explicit form of the response functio...

  1. Video coding and decoding devices and methods preserving PPG relevant information

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a video encoding device (10, 10', 10") and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  2. Video coding and decoding devices and methods preserving ppg relevant information

    NARCIS (Netherlands)

    2013-01-01

    The present invention relates to a video encoding device (10, 10', 10'') and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  3. Talker and background noise specificity in spoken word recognition memory

    Directory of Open Access Journals (Sweden)

    Angela Cooper

    2017-11-01

    Full Text Available Prior research has demonstrated that listeners are sensitive to changes in the indexical (talker-specific characteristics of speech input, suggesting that these signal-intrinsic features are integrally encoded in memory for spoken words. Given that listeners frequently must contend with concurrent environmental noise, to what extent do they also encode signal-extrinsic details? Native English listeners’ explicit memory for spoken English monosyllabic and disyllabic words was assessed as a function of consistency versus variation in the talker’s voice (talker condition and background noise (noise condition using a delayed recognition memory paradigm. The speech and noise signals were spectrally-separated, such that changes in a simultaneously presented non-speech signal (background noise from exposure to test would not be accompanied by concomitant changes in the target speech signal. The results revealed that listeners can encode both signal-intrinsic talker and signal-extrinsic noise information into integrated cognitive representations, critically even when the two auditory streams are spectrally non-overlapping. However, the extent to which extra-linguistic episodic information is encoded alongside linguistic information appears to be modulated by syllabic characteristics, with specificity effects found only for monosyllabic items. These findings suggest that encoding and retrieval of episodic information during spoken word processing may be modulated by lexical characteristics.

  4. Preoperative information needs of children undergoing tonsillectomy.

    LENUS (Irish Health Repository)

    Buckley, Aoife

    2012-02-01

    AIMS AND OBJECTIVES: To identify the information needs of children undergoing tonsillectomy with reference to content of information, method of delivery, information providers and timing of information provision. BACKGROUND: Tonsillectomy can be anxiety provoking for children and preoperative preparation programmes are long recognised to reduce anxiety. However, few have been designed from the perspectives of children and to date little is known about how best to prepare children in terms of what to tell them, how to convey information to them, who can best provide information and what is the best timing for information provision. DESIGN: A qualitative descriptive study. METHOD: Data were collected from nine children (aged 6-9) using interviews supported by a write and draw technique. Data were coded and categorised into themes reflecting content, method, providers and timing of information. RESULTS: Children openly communicated their information needs especially on what to tell them to expect when facing a tonsillectomy. Their principal concerns were about operation procedures, experiencing \\'soreness\\' and discomfort postoperatively and parental presence. Mothers were viewed as best situated to provide them with information. Children were uncertain about what method of information and timing would be most helpful to them. CONCLUSION: Preoperative educational interventions need to take account of children\\'s information needs so that they are prepared for surgery in ways that are meaningful and relevant to them. Future research is needed in this area. RELEVANCE TO CLINICAL PRACTICE: Practical steps towards informing children about having a tonsillectomy include asking them what they need to know and addressing their queries accordingly. Child-centred information leaflets using a question and answer format could also be helpful to children.

  5. Assessment of background particulate matter concentrations in small cities and rural locations--Prince George, Canada.

    Science.gov (United States)

    Veira, Andreas; Jackson, Peter L; Ainslie, Bruce; Fudge, Dennis

    2013-07-01

    This study investigates the development and application of a simple method to calculate annual and seasonal PM2.5 and PM10 background concentrations in small cities and rural areas. The Low Pollution Sectors and Conditions (LPSC) method is based on existing measured long-term data sets and is designed for locations where particulate matter (PM) monitors are only influenced by local anthropogenic emission sources from particular wind sectors. The LPSC method combines the analysis of measured hourly meteorological data, PM concentrations, and geographical emission source distributions. PM background levels emerge from measured data for specific wind conditions, where air parcel trajectories measured at a monitoring station are assumed to have passed over geographic sectors with negligible local emissions. Seasonal and annual background levels were estimated for two monitoring stations in Prince George, Canada, and the method was also applied to four other small cities (Burns Lake, Houston, Quesnel, Smithers) in northern British Columbia. The analysis showed reasonable background concentrations for both monitoring stations in Prince George, whereas annual PM10 background concentrations at two of the other locations and PM2.5 background concentrations at one other location were implausibly high. For those locations where the LPSC method was successful, annual background levels ranged between 1.8 +/- 0.1 microg/m3 and 2.5 +/- 0.1 microg/m3 for PM2.5 and between 6.3 +/- 0.3 microg/m3 and 8.5 +/- 0.3 microg/m3 for PM10. Precipitation effects and patterns of seasonal variability in the estimated background concentrations were detectable for all locations where the method was successful. Overall the method was dependent on the configuration of local geography and sources with respect to the monitoring location, and may fail at some locations and under some conditions. Where applicable, the LPSC method can provide a fast and cost-efficient way to estimate background PM
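
    Under the assumptions sketched in the record (hourly wind and PM observations, plus a set of wind sectors known to contain negligible local sources), the core of a sector-based background estimate can be written in a few lines of pandas. The column names and the example sector are hypothetical.

        # Sketch: average PM concentrations over hours when the wind arrives from
        # directions assumed to be free of local emission sources.
        import pandas as pd

        def sector_background(df, clean_sectors, pollutant="PM25"):
            """df: hourly records with a datetime column 'time', 'wind_dir' in degrees,
            and a PM concentration column; clean_sectors: list of (lo, hi) degree ranges."""
            mask = pd.Series(False, index=df.index)
            for lo, hi in clean_sectors:
                mask |= df["wind_dir"].between(lo, hi)
            clean = df.loc[mask]
            annual = clean[pollutant].mean()
            seasonal = clean.groupby(clean["time"].dt.quarter)[pollutant].mean()
            return annual, seasonal

        # annual, seasonal = sector_background(hourly, clean_sectors=[(20, 70)])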

  6. Infrared image background modeling based on improved Susan filtering

    Science.gov (United States)

    Yuehua, Xia

    2018-02-01

    When the SUSAN filter is used to model the background of an infrared image, the Gaussian filter it employs lacks directional filtering ability. After filtering, the edge information of the image is not preserved well, so many edge singular points appear in the difference image, increasing the difficulty of target detection. To solve these problems, anisotropy is introduced in this paper, and an anisotropic Gaussian filter is used instead of the Gaussian filter in the SUSAN filter operator. Firstly, an anisotropic gradient operator is used to calculate the horizontal and vertical gradients at each point of the image, in order to determine the long-axis direction of the filter. Secondly, the local area of the point and the neighborhood smoothness are used to calculate the filter length and the short-axis variance. Then the first-order norm of the difference between the local gray levels around the point and their mean is calculated, to determine the threshold of the SUSAN filter. Finally, the constructed SUSAN filter is convolved with the image to obtain the background image, and at the same time the difference between the background image and the original image is obtained. The background modeling effect on infrared images is evaluated by Mean Squared Error (MSE), Structural Similarity (SSIM) and local Signal-to-Noise Ratio Gain (GSNR). Compared with the traditional filtering algorithm, the improved SUSAN filter achieves a better background modeling effect: it effectively preserves the edge information in the image, the dim small target is effectively enhanced in the difference image, and the false alarm rate is greatly reduced.
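
    A heavily simplified sketch of the central idea (an oriented Gaussian used as the smoothing kernel of the background model, with the difference image exposing small targets) is given below. It estimates a single dominant orientation for the whole image and omits the per-pixel SUSAN thresholding described in the record, so it illustrates only the anisotropic filtering step.

        # Oriented-Gaussian background modeling sketch: the gradient direction sets
        # the kernel's long axis; the smoothed image is the background estimate.
        import numpy as np
        from scipy import ndimage

        def oriented_gaussian_kernel(sigma_long, sigma_short, theta, size=15):
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)      # along the long axis
            yr = -x * np.sin(theta) + y * np.cos(theta)     # along the short axis
            k = np.exp(-(xr**2 / (2 * sigma_long**2) + yr**2 / (2 * sigma_short**2)))
            return k / k.sum()

        def background_and_difference(img, sigma_long=3.0, sigma_short=1.0):
            img = img.astype(float)
            gx = ndimage.sobel(img, axis=1)
            gy = ndimage.sobel(img, axis=0)
            theta = np.arctan2(gy.mean(), gx.mean())        # one global orientation (simplification)
            kernel = oriented_gaussian_kernel(sigma_long, sigma_short, theta)
            background = ndimage.convolve(img, kernel, mode="nearest")
            return background, img - background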

  7. Hazardous air pollutant emissions from process units in the synthetic organic chemical manufacturing industry: Background information for proposed standards. Volume 1B. Control technologies. Draft report

    International Nuclear Information System (INIS)

    1992-11-01

    A draft rule for the regulation of emissions of organic hazardous air pollutants (HAP's) from chemical processes of the synthetic organic chemical manufacturing industry (SOCMI) is being proposed under the authority of Sections 112, 114, 116, and 301 of the Clean Air Act, as amended in 1990. This volume of the Background Information Document presents discussions of control technologies used in the industry and the costs of those technologies.

  8. Big bang nucleosynthesis, cosmic microwave background anisotropies and dark energy

    International Nuclear Information System (INIS)

    Signore, Monique; Puy, Denis

    2002-01-01

    Over the last decade, cosmological observations have attained a level of precision which allows for very detailed comparison with theoretical predictions. We are beginning to learn the answers to some fundamental questions, using information contained in Cosmic Microwave Background Anisotropy (CMBA) data. In this talk, we briefly review some studies of the current and prospected constraints imposed by CMBA measurements on the neutrino physics and on the dark energy. As it was already announced by Scott, we present some possible new physics from the Cosmic Microwave Background (CMB)

  9. Discussion about risk-informed regulations on the nuclear safety

    International Nuclear Information System (INIS)

    Gu Yeyi

    2008-01-01

    The article introduces the background and status quo of nuclear safety regulations in China, and points out the inadequacies of the current regulations. The author describes risk-informed safety management, including its development, current status and achievements, in order to identify the trend towards improving nuclear safety regulations through risk-informed methods. Drawing on the U.S. program for establishing risk-informed nuclear safety regulations, the author outlines the principles and features of the new regulatory system, and provides suggestions for promoting risk-informed safety management and establishing risk-informed nuclear safety regulations. (author)

  10. Method of Choosing the Information Technology System Supporting Management of the Military Aircraft Operation

    Directory of Open Access Journals (Sweden)

    Barszcz Piotr

    2014-12-01

    Full Text Available The paper presents a method of choosing the information technology system, the task of which is to support the management process of the military aircraft operation. The proposed method is based on surveys conducted among direct users of IT systems used in aviation of the Polish Armed Forces. The analysis of results of the surveys was conducted using statistical methods. The paper was completed with practical conclusions related to further usefulness of the individual information technology systems. In the future, they can be extremely useful in the process of selecting the best solutions and integration of the information technology systems

  11. Setting up a station to monitor the background radiation

    International Nuclear Information System (INIS)

    Douglas, J.A.

    1980-12-01

    This paper gives a brief review of the sources of external background from nuclear radiations and presents methods of measurement that can be conveniently used to determine the components. The methods of calibration are considered in detail and some discussion of meteorological effects is included. (author)

  12. Activity – based costing in sport organizations:Theoretical background & future prospects

    Directory of Open Access Journals (Sweden)

    PANAGIOTIS E. DIMITROPOULOS

    2007-01-01

    Costing systems in recent years have shown significant development, and activity-based costing (ABC) specifically has been considered a major contribution to cost management, particularly in service businesses. The sport sector is composed to a great extent of service functions, yet considerably less has been reported on the use of activity-based costing to support cost management in sport organizations. Since the power of information is becoming continuously more crucial for effective business administration, and the traditional methods of cost measurement have proved insufficient on this issue, ABC was developed. The aim of this paper is twofold: first, to present the main theoretical background of ABC and its substantiated benefits, and second, to present some practical steps for the implementation of ABC in sport organizations.
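
    The allocation logic that the paper builds on can be shown with a toy example: the overhead collected in each activity cost pool is assigned to services in proportion to their consumption of that activity's cost driver. All figures and names below are invented for illustration.

        # Toy activity-based costing allocation for a sport organization.
        activity_pools = {"ticketing": 40_000.0, "facility_setup": 90_000.0}   # annual overhead per pool
        driver_totals  = {"ticketing": 20_000,   "facility_setup": 150}        # tickets sold, setups performed
        consumption = {                                                        # driver units used by each service
            "match_day":     {"ticketing": 18_000, "facility_setup": 60},
            "youth_academy": {"ticketing": 2_000,  "facility_setup": 90},
        }

        def abc_cost(service):
            return sum(activity_pools[a] / driver_totals[a] * units
                       for a, units in consumption[service].items())

        for s in consumption:
            print(s, round(abc_cost(s), 2))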

  13. Applying Multiple Methods to Comprehensively Evaluate a Patient Portal’s Effectiveness to Convey Information to Patients

    Science.gov (United States)

    Krist, Alex H; Aycock, Rebecca A; Kreps, Gary L

    2016-01-01

    Background Patient portals have yet to achieve their full potential for enhancing health communication and improving health outcomes. Although the Patient Protection and Affordable Care Act in the United States mandates the utilization of patient portals, and usage continues to rise, their impact has not been as profound as anticipated. Objective The objective of our case study was to evaluate how well portals convey information to patients. To demonstrate how multiple methodologies could be used to evaluate and improve the design of patient-centered portals, we conducted an in-depth evaluation of an exemplar patient-centered portal designed to promote preventive care to consumers. Methods We used 31 critical incident patient interviews, 2 clinician focus groups, and a thematic content analysis to understand patients’ and clinicians’ perspectives, as well as theoretical understandings of the portal’s use. Results We gathered over 140 critical incidents, 71.8% (102/142) negative and 28.2% (40/142) positive. Positive incident categories were (1) instant medical information access, (2) clear health information, and (3) patient vigilance. Negative incident categories were (1) standardized content, (2) desire for direct communication, (3) website functionality, and (4) difficulty interpreting laboratory data. Thematic analysis of the portal’s immediacy resulted in high scores in the attributes enhances understanding (18/23, 78%), personalization (18/24, 75%), and motivates behavior (17/24, 71%), but low levels of interactivity (7/24, 29%) and engagement (2/24, 8%). Two overarching themes emerged to guide portal refinements: (1) communication can be improved with directness and interactivity and (2) perceived personalization must be greater to engage patients. Conclusions Results suggest that simple modifications, such as increased interactivity and personalized messages, can make portals customized, robust, easily accessible, and trusted information sources

  14. Natural background approach to setting radiation standards

    International Nuclear Information System (INIS)

    Adler, H.I.; Federow, H.; Weinberg, A.M.

    1979-01-01

    The suggestion has often been made that an additional radiation exposure imposed on humanity as a result of some important activity such as electricity generation would be acceptable if the exposure was small compared to the natural background. In order to make this concept quantitative and objective, we propose that small compared with the natural background be interpreted as the standard deviation (weighted with the exposed population) of the natural background. This use of the variation in natural background radiation is less arbitrary and requires fewer unfounded assumptions than some current approaches to standard-setting. The standard deviation is an easily calculated statistic that is small compared with the mean value for natural exposures of populations. It is an objectively determined quantity and its significance is generally understood. Its determination does not omit any of the pertinent data. When this method is applied to the population of the United States, it suggests that a dose of 20 mrem/year would be an acceptable standard. This is comparable to the 25 mrem/year suggested as the maximum allowable exposure to an individual from the complete uranium fuel cycle
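
    The quantity proposed above is simply the population-weighted standard deviation of the natural background dose. A minimal numerical sketch, with invented regional doses and populations, is:

        # Population-weighted mean and standard deviation of natural background dose.
        import numpy as np

        dose_mrem_per_year = np.array([80.0, 95.0, 110.0, 140.0])   # hypothetical regional doses
        population         = np.array([50e6, 120e6, 40e6, 10e6])    # hypothetical populations

        mean = np.average(dose_mrem_per_year, weights=population)
        std = np.sqrt(np.average((dose_mrem_per_year - mean) ** 2, weights=population))
        print(f"weighted mean = {mean:.1f} mrem/y, weighted std = {std:.1f} mrem/y")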

  15. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction is proposed. The method takes full advantage of the self-adaptive filtering and waveform correction properties of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. It merges the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction, and the values of these four indexes are combined into a feature vector. The connotative characteristic components in the vibration signal are then accurately extracted by a Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plague traditional methods, is commendably suppressed. The large cumulative error of the traditional time-domain integral is effectively overcome, and the large low-frequency error of the traditional frequency-domain integral is successfully avoided. Compared with traditional integral methods, this method is outstanding at removing noise while retaining useful feature information, and shows higher accuracy.
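
    A loose sketch of the selection-and-integration idea follows: candidate components (for example, IMFs from an ensemble empirical mode decomposition obtained elsewhere) are scored with a small feature vector, the components closest to the raw signal's features are kept, and the reconstruction is integrated. The feature set, the trajectory-matrix window length and the lack of feature normalization are simplifications, not the authors' exact algorithm.

        # Score components by (kurtosis, MSE vs raw, energy, top singular value of a
        # trajectory matrix), keep the closest ones, reconstruct and integrate.
        import numpy as np
        from scipy.stats import kurtosis
        from scipy.integrate import cumulative_trapezoid

        def features(x, ref, window=64):
            traj = np.lib.stride_tricks.sliding_window_view(x, window)   # needs len(x) >= window
            top_sv = np.linalg.svd(traj, compute_uv=False)[0]
            return np.array([kurtosis(x), np.mean((x - ref) ** 2), np.sum(x ** 2), top_sv])

        def reconstruct_and_integrate(imfs, raw, fs, keep=3):
            ref_feat = features(raw, raw)
            dist = [np.linalg.norm(features(c, raw) - ref_feat) for c in imfs]
            selected = np.array(imfs)[np.argsort(dist)[:keep]].sum(axis=0)
            velocity = cumulative_trapezoid(selected, dx=1.0 / fs, initial=0.0)
            return velocity - velocity.mean()    # crude removal of the residual trend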

  16. Methods of information geometry

    CERN Document Server

    Amari, Shun-Ichi

    2000-01-01

    Information geometry provides the mathematical sciences with a new framework of analysis. It has emerged from the investigation of the natural differential geometric structure on manifolds of probability distributions, which consists of a Riemannian metric defined by the Fisher information and a one-parameter family of affine connections called the \\alpha-connections. The duality between the \\alpha-connection and the (-\\alpha)-connection together with the metric play an essential role in this geometry. This kind of duality, having emerged from manifolds of probability distributions, is ubiquitous, appearing in a variety of problems which might have no explicit relation to probability theory. Through the duality, it is possible to analyze various fundamental problems in a unified perspective. The first half of this book is devoted to a comprehensive introduction to the mathematical foundation of information geometry, including preliminaries from differential geometry, the geometry of manifolds or probability d...
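
    For reference, the two central objects mentioned above can be written down explicitly (standard definitions, with \ell(x;\theta) = \log p(x;\theta)):

        g_{ij}(\theta) = \mathbb{E}_\theta\!\left[\,\partial_i \ell(x;\theta)\,\partial_j \ell(x;\theta)\,\right]

        \Gamma^{(\alpha)}_{ij,k}(\theta) = \mathbb{E}_\theta\!\left[\left(\partial_i\partial_j \ell(x;\theta) + \frac{1-\alpha}{2}\,\partial_i \ell(x;\theta)\,\partial_j \ell(x;\theta)\right)\partial_k \ell(x;\theta)\right]

    The duality referred to in the description is the statement that the \alpha- and (-\alpha)-connections are mutually dual with respect to the Fisher metric g.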

  17. Reducing the information load in map animations as a tool for exploratory analysis

    OpenAIRE

    Multimäki, Salla

    2016-01-01

    This dissertation investigates the information load that animated maps cause to their viewers, and presents two novel visualisation methods to support the exploratory visual analysis of the animations. Information load consists of the information content of the map and its presentation. The number of objects and their attributes are the unavoidable content, but the visualisation of the objects, the background map, and display settings of an animation have an effect on the information load and...

  18. Operator-independent method for background subtraction in adrenal-uptake measurements: concise communication

    International Nuclear Information System (INIS)

    Koral, K.F.; Sarkar, S.D.

    1977-01-01

    A new computer program for adrenal-uptake measurements is presented in which the algorithm identifies the adrenal and background regions automatically after being given a starting point in the image. Adrenal uptakes and results of reproducibility tests are given for patients injected with [131I]6β-iodomethyl-19-norcholesterol. The data to date indicate no overlap in the percent-of-dose uptakes for normal patients and patients with Cushing's disease and Cushing's syndrome.

  19. Planck 2013 results. XVIII. Gravitational lensing-infrared background correlation

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Basak, S.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bethermin, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Serra, P.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    The multi-frequency capability of the Planck satellite provides information both on the integrated history of star formation (via the cosmic infrared background, or CIB) and on the distribution of dark matter (via the lensing effect on the cosmic microwave background, or CMB). The conjunction of these two unique probes allows us to measure directly the connection between dark and luminous matter in the high redshift (1 1. We measure directly the SFR density with around 2 sigma significance for three redshift bins between z=1 and 7, thus opening a new window into the study of the formation of stars at early times.

  20. The Alaskan mineral resource assessment program; background information to accompany folio of geologic and mineral resource maps of the Ambler River Quadrangle, Alaska

    Science.gov (United States)

    Mayfield, Charles F.; Tailleur, I.L.; Albert, N.R.; Ellersieck, Inyo; Grybeck, Donald; Hackett, S.W.

    1983-01-01

    The Ambler River quadrangle, consisting of 14,290 km2 (5,520 mi2) in northwest Alaska, was investigated by an interdisciplinary research team for the purpose of assessing the mineral resource potential of the quadrangle. This report provides background information for a folio of maps on the geology, reconnaissance geochemistry, aeromagnetics, Landsat imagery, and mineral resource evaluation of the quadrangle. A summary of the geologic history, radiometric dates, and fossil localities and a comprehensive bibliography are also included. The quadrangle contains jade reserves, now being mined, and potentially significant resources of copper, zinc, lead, and silver.

  1. A systematic analysis of the XMM-Newton background: IV. Origin of the unfocused and focused components

    Science.gov (United States)

    Gastaldello, F.; Ghizzardi, S.; Marelli, M.; Salvetti, D.; Molendi, S.; De Luca, A.; Moretti, A.; Rossetti, M.; Tiengo, A.

    2017-12-01

    We show the results obtained in the FP7 European program EXTraS and in the ESA R&D ATHENA activity AREMBES, aimed at a deeper understanding of the XMM-Newton background in order to better design the ATHENA mission. Thanks to an analysis of the full EPIC archive, coupled with the information obtained by the Radiation Monitor, we show the cosmic ray origin of the unfocused particle background and its anti-correlation with the solar activity. We also show the first results of the effort to obtain information about the particle component of the soft proton focused background.

  2. Report: Management Alert - EPA Has Not Initiated Required Background Investigations for Information Systems Contractor Personnel

    Science.gov (United States)

    Report #17-P-0409, September 27, 2017. Not vetting contractor personnel before granting them network access exposes the EPA to risks. Contractor personnel with potentially questionable backgrounds who access sensitive agency data could cause harm.

  3. Cosmic microwave background bispectrum from recombination.

    Science.gov (United States)

    Huang, Zhiqi; Vernizzi, Filippo

    2013-03-08

    We compute the cosmic microwave background temperature bispectrum generated by nonlinearities at recombination on all scales. We use CosmoLib2nd, a numerical Boltzmann code at second order, to compute cosmic microwave background bispectra on the full sky. We consistently include all effects except gravitational lensing, which can be added to our result using standard methods. The bispectrum is peaked on squeezed triangles and agrees with the analytic approximation in the squeezed limit at the few percent level for all the scales where this is applicable. On smaller scales, we recover previous results on perturbed recombination. For cosmic-variance-limited data to l_max = 2000, its signal-to-noise ratio is S/N = 0.47, corresponding to f_NL^eff = -2.79, and it will bias a local signal by f_NL^loc ≈ 0.82.

  4. Measuring Extinction in Local Group Galaxies Using Background Galaxies

    Science.gov (United States)

    Wyder, T. K.; Hodge, P. W.

    1999-05-01

    Knowledge of the distribution and quantity of dust in galaxies is important for understanding their structure and evolution. The goal of our research is to measure the total extinction through Local Group galaxies using measured properties of background galaxies. Our method relies on the SExtractor software as an objective and automated method of detecting background galaxies. In an initial test, we have explored two WFPC2 fields in the SMC and two in M31 obtained from the HST archives. The two pointings in the SMC are fields around the open clusters L31 and B83 while the two M31 fields target the globular clusters G1 and G170. Except for the G1 observations of M31, the fields chosen are very crowded (even when observed with HST) and we chose them as a particularly stringent test of the method. We performed several experiments using a series of completeness tests that involved superimposing comparison fields, adjusted to the equivalent exposure time, from the HST Medium-Deep and Groth-Westphal surveys. These tests showed that for crowded fields, such as the two in the core of the SMC and the one in the bulge of M31, this automated method of detecting galaxies can be completely dominated by the effects of crowding. For these fields, only a small fraction of the added galaxies was recovered. However, in the outlying G1 field in M31, almost all of the added galaxies were recovered. The numbers of actual background galaxies in this field are consistent with zero extinction. As a follow-up experiment, we used image processing techniques to suppress stellar objects while enhancing objects with non-stellar, more gradual luminosity profiles. This method yielded significant numbers of background galaxies in even the most crowded fields, which we are now analyzing to determine the total extinction and reddening caused by the foreground galaxy.

  5. Effect of background music on auditory-verbal memory performance

    OpenAIRE

    Sona Matloubi; Ali Mohammadzadeh; Zahra Jafari; Alireza Akbarzade Baghban

    2014-01-01

    Background and Aim: Music exists in all cultures; many scientists are seeking to understand how music affects cognitive development such as comprehension, memory, and reading skills. More recently, a considerable number of neuroscience studies on music have been developed. This study aimed to investigate the effects of null and positive background music, in comparison with silence, on auditory-verbal memory performance. Methods: Forty young adults (male and female) with normal hearing, aged betw...

  6. A Review of Methods for Analysis of the Expected Value of Information.

    Science.gov (United States)

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
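
    As the review notes, regression-based estimators are now a common way to compute the EVPPI from probabilistic sensitivity analysis output. A minimal sketch of that idea, regressing each option's net benefit on the parameter(s) of interest and averaging the fitted maxima, is shown below; the choice of a k-nearest-neighbours regressor is illustrative rather than prescriptive.

        # Regression-based EVPPI sketch:
        #   EVPPI = E_theta[ max_d E(NB_d | theta) ] - max_d E(NB_d),
        # with the inner conditional expectations replaced by fitted regression values.
        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor

        def evppi(theta, nb, n_neighbors=50):
            """theta: (n_samples, n_params) draws of the parameters of interest.
            nb: (n_samples, n_options) simulated net benefit per decision option."""
            fitted = np.column_stack([
                KNeighborsRegressor(n_neighbors=n_neighbors).fit(theta, nb[:, d]).predict(theta)
                for d in range(nb.shape[1])
            ])
            return fitted.max(axis=1).mean() - nb.mean(axis=0).max()

        # rng = np.random.default_rng(0)
        # theta = rng.normal(size=(5000, 1))
        # nb = np.column_stack([20000 * theta[:, 0] + rng.normal(0, 5000, 5000), np.zeros(5000)])
        # print(evppi(theta, nb))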

  7. Adaptation of the European Commission-recommended user testing method to patient medication information leaflets in Japan

    Directory of Open Access Journals (Sweden)

    Yamamoto M

    2017-06-01

    Full Text Available Michiko Yamamoto,1 Hirohisa Doi,1 Ken Yamamoto,2 Kazuhiro Watanabe,2 Tsugumichi Sato,3 Machi Suka,4 Takeo Nakayama,5 Hiroki Sugimori6 1Department of Drug Informatics, Center for Education & Research on Clinical Pharmacy, Showa Pharmaceutical University, Tokyo, Japan; 2Department of Pharmacy Practice, Center for Education & Research on Clinical Pharmacy, Showa Pharmaceutical University, Tokyo, Japan; 3Faculty of Pharmaceutical Sciences, Tokyo University of Science, Chiba, Japan; 4Department of Public Health and Environmental Medicine, The Jikei University School of Medicine, Tokyo, Japan; 5Department of Health Informatics, Kyoto University School of Public, Kyoto, Japan; 6Department of Preventive Medicine, Graduate School of Sports and Health Sciences, Daito Bunka University, Saitama, Japan Background: The safe use of drugs relies on providing accurate drug information to patients. In Japan, patient leaflets called Drug Guide for Patients are officially available; however, their utility has never been verified. This is the first attempt to improve Drug Guide for Patients via user testing in Japan.Purpose: To test and improve communication of drug information to minimize risk for patients via user testing of the current and revised versions of Drug Guide for Patients, and to demonstrate that this method is effective for improving Drug Guide for Patients in Japan.Method: We prepared current and revised versions of the Drug Guide for Patients and performed user testing via semi-structured interviews with consumers to compare these versions for two guides for Mercazole and Strattera. We evenly divided 54 participants into two groups with similar distributions of sex, age, and literacy level to test the differing versions of the Mercazole guide. Another group of 30 participants were divided evenly to test the versions of the Strattera guide. After completing user testing, the participants evaluated both guides in terms of amount of information

  8. Vector analysis as a fast and easy method to compare gene expression responses between different experimental backgrounds

    NARCIS (Netherlands)

    Breitling, R.; Armengaud, P.; Amtmann, A.

    2005-01-01

    Background Gene expression studies increasingly compare expression responses between different experimental backgrounds (genetic, physiological, or phylogenetic). By focusing on dynamic responses rather than a direct comparison of static expression levels, this type of study allows a finer

  9. JEM-X background models

    DEFF Research Database (Denmark)

    Huovelin, J.; Maisala, S.; Schultz, J.

    2003-01-01

    Background and determination of its components for the JEM-X X-ray telescope on INTEGRAL are discussed. A part of the first background observations by JEM-X are analysed and results are compared to predictions. The observations are based on extensive imaging of background near the Crab Nebula ... on revolution 41 of INTEGRAL. Total observing time used for the analysis was 216 502 s, with the average of 25 cps of background for each of the two JEM-X telescopes. JEM-X1 showed slightly higher average background intensity than JEM-X2. The detectors were stable during the long exposures, and weak orbital ... background was enhanced in the central area of a detector, and it decreased radially towards the edge, with a clear vignetting effect for both JEM-X units. The instrument background was weakest in the central area of a detector and showed a steep increase at the very edges of both JEM-X detectors ...

  10. A Similarity-Ranking Method on Semantic Computing for Providing Information-Services in Station-Concierge System

    Directory of Open Access Journals (Sweden)

    Motoki Yokoyama

    2017-07-01

    The spread of smartphones and wireless broadband networks is creating a new railway information environment. With the spread of such devices and information technology, various types of information can be obtained from databases connected to the Internet. One scenario for obtaining such a wide variety of information resources is during the user's journey. This paper proposes an information provision system, named the Station Concierge System, that matches the situation and intention of passengers. The purpose of this system is to estimate the needs of passengers, much as station staff or a hotel concierge would, and to provide information resources that dynamically satisfy the user's expectations. The most important module of the system is based on a new information ranking method for passenger intention prediction and service recommendation. This method has three main features: (1) projecting a user into a semantic vector space by using her current context, (2) predicting the intention of the user by selecting a semantic vector subspace, and (3) ranking the services in descending order of their relevance scores to the user's intention. By comparing the predictions of our method with those of two straightforward computation methods, the experimental studies show the effectiveness and efficiency of the proposed method. Using this system, users can obtain transit information and a service map that dynamically match their context.
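
    In its simplest form, the ranking step described above reduces to ordering candidate services by the cosine similarity between the user's context vector and each service vector in the shared semantic space. The vectors and service names below are placeholders.

        # Rank services by cosine similarity to the user's semantic context vector.
        import numpy as np

        def rank_services(user_vec, service_vecs):
            u = user_vec / np.linalg.norm(user_vec)
            scored = []
            for name, v in service_vecs.items():
                scored.append((name, float(u @ (v / np.linalg.norm(v)))))
            return sorted(scored, key=lambda item: item[1], reverse=True)

        # services = {"transit_info": np.array([...]), "station_map": np.array([...])}
        # print(rank_services(user_context_vector, services))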

  11. Fault Diagnosis Method Based on Information Entropy and Relative Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoming Xu

    2017-01-01

    In traditional principal component analysis (PCA), because the influence of the different variables' dimensions in the system is neglected, the selected principal components (PCs) often fail to be representative. While relative-transformation PCA is able to solve this problem, it is not easy to calculate the weight for each characteristic variable. To address this, this paper proposes a fault diagnosis method based on information entropy and relative principal component analysis. Firstly, the algorithm calculates the information entropy for each characteristic variable in the original dataset based on the information gain algorithm. Secondly, it standardizes every variable's dimension in the dataset. Then, according to the information entropy, it allocates a weight to each standardized characteristic variable. Finally, it uses the established relative-principal-components model for fault diagnosis. Simulation experiments based on the Tennessee Eastman process and Wine datasets demonstrate the feasibility and effectiveness of the new method.
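
    A compact sketch of the pipeline just described: estimate an entropy-based weight per variable, standardize the data to remove dimension effects, apply the weights, and extract principal components from the weighted data. The histogram-based entropy estimate and the weighting rule used here are simplifications of the information-gain scheme in the paper.

        # Entropy-weighted PCA sketch: variables judged more informative receive
        # larger weights before the principal components are extracted.
        import numpy as np

        def entropy(x, bins=10):
            counts, _ = np.histogram(x, bins=bins)
            p = counts[counts > 0] / counts.sum()
            return -np.sum(p * np.log2(p))

        def entropy_weighted_pca(X, n_components=2):
            Z = (X - X.mean(axis=0)) / X.std(axis=0)              # standardize each variable
            ent = np.array([entropy(Z[:, j]) for j in range(Z.shape[1])])
            w = ent.max() - ent + 1e-9                            # lower entropy -> larger weight (simplification)
            w = w / w.sum()
            Zw = Z * w
            _, _, vt = np.linalg.svd(Zw - Zw.mean(axis=0), full_matrices=False)
            return Zw @ vt[:n_components].T                       # scores on the leading components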

  12. Hazardous air pollutant emissions from process units in the synthetic organic chemical manufacturing industry: Background information for proposed standards. Volume 1A. National impacts assessment. Draft report

    International Nuclear Information System (INIS)

    1992-11-01

    A draft rule for the regulation of emissions of organic hazardous air pollutants (HAP's) from chemical processes of the synthetic organic chemical manufacturing industry (SOCMI) is being proposed under the authority of Sections 112, 114, 116, and 301 of the Clean Air Act, as amended in 1990. This volume of the Background Information Document presents the results of the national impacts assessment for the proposed rule.

  13. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    Science.gov (United States)

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  14. Optimization of the Regularization in Background and Foreground Modeling

    Directory of Open Access Journals (Sweden)

    Si-Qi Wang

    2014-01-01

    Background and foreground modeling is a typical task in computer vision. The current general “low-rank + sparse” model decomposes the frames of a video sequence into a low-rank background and a sparse foreground. But the sparsity assumption in such a model may not conform with reality, and the model cannot directly reflect the correlation between background and foreground either. Thus, we present a novel model that decomposes the arranged data matrix D into a low-rank background L and a moving foreground M. Here, we only need the a priori assumption that the background is low-rank, and we let the foreground be separated from the background as much as possible. Based on this division, we use a pair of dual norms, the nuclear norm and the spectral norm, to regularize the foreground and the background, respectively. Furthermore, we use a reweighted function instead of the plain norm so as to obtain a better and faster approximation model. A detailed linear-algebra explanation of our two models is presented in the paper. The experimental results show that our model achieves better background modeling, and even simplified versions of our algorithms perform better than the mainstream techniques IALM and GoDec.
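
    For context, the baseline “low-rank + sparse” decomposition that the record sets out to improve can be computed with the classic principal-component-pursuit iteration; the sketch below implements that standard ADMM baseline (IALM-style), not the dual-norm, reweighted model proposed by the authors.

        # Robust PCA baseline: D (one vectorised frame per column) = L (background) + S (foreground).
        import numpy as np

        def shrink(M, tau):
            return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

        def svd_threshold(M, tau):
            U, s, Vt = np.linalg.svd(M, full_matrices=False)
            return U @ np.diag(shrink(s, tau)) @ Vt

        def rpca(D, n_iter=200, tol=1e-7):
            lam = 1.0 / np.sqrt(max(D.shape))
            mu = 0.25 * D.size / (np.abs(D).sum() + 1e-12)
            L = np.zeros_like(D); S = np.zeros_like(D); Y = np.zeros_like(D)
            for _ in range(n_iter):
                L = svd_threshold(D - S + Y / mu, 1.0 / mu)
                S = shrink(D - L + Y / mu, lam / mu)
                Y = Y + mu * (D - L - S)
                if np.linalg.norm(D - L - S) <= tol * np.linalg.norm(D):
                    break
            return L, S

        # D = frames.reshape(n_frames, -1).T   # stack grayscale frames as columns
        # L, S = rpca(D)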

  15. Value of information methods to design a clinical trial in a small population to optimise a health economic utility function

    Directory of Open Access Journals (Sweden)

    Michael Pearce

    2018-02-01

    Full Text Available Abstract Background Most confirmatory randomised controlled clinical trials (RCTs are designed with specified power, usually 80% or 90%, for a hypothesis test conducted at a given significance level, usually 2.5% for a one-sided test. Approval of the experimental treatment by regulatory agencies is then based on the result of such a significance test with other information to balance the risk of adverse events against the benefit of the treatment to future patients. In the setting of a rare disease, recruiting sufficient patients to achieve conventional error rates for clinically reasonable effect sizes may be infeasible, suggesting that the decision-making process should reflect the size of the target population. Methods We considered the use of a decision-theoretic value of information (VOI method to obtain the optimal sample size and significance level for confirmatory RCTs in a range of settings. We assume the decision maker represents society. For simplicity we assume the primary endpoint to be normally distributed with unknown mean following some normal prior distribution representing information on the anticipated effectiveness of the therapy available before the trial. The method is illustrated by an application in an RCT in haemophilia A. We explicitly specify the utility in terms of improvement in primary outcome and compare this with the costs of treating patients, both financial and in terms of potential harm, during the trial and in the future. Results The optimal sample size for the clinical trial decreases as the size of the population decreases. For non-zero cost of treating future patients, either monetary or in terms of potential harmful effects, stronger evidence is required for approval as the population size increases, though this is not the case if the costs of treating future patients are ignored. Conclusions Decision-theoretic VOI methods offer a flexible approach with both type I error rate and power (or equivalently

  16. Background based Gaussian mixture model lesion segmentation in PET

    Energy Technology Data Exchange (ETDEWEB)

    Soffientini, Chiara Dolores, E-mail: chiaradolores.soffientini@polimi.it; Baselli, Giuseppe [DEIB, Department of Electronics, Information, and Bioengineering, Politecnico di Milano, Piazza Leonardo da Vinci 32, Milan 20133 (Italy); De Bernardi, Elisabetta [Department of Medicine and Surgery, Tecnomed Foundation, University of Milano—Bicocca, Monza 20900 (Italy); Zito, Felicia; Castellani, Massimo [Nuclear Medicine Department, Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policlinico, via Francesco Sforza 35, Milan 20122 (Italy)

    2016-05-15

    Purpose: Quantitative {sup 18}F-fluorodeoxyglucose positron emission tomography is limited by the uncertainty in lesion delineation due to poor SNR, low resolution, and partial volume effects, subsequently impacting oncological assessment, treatment planning, and follow-up. The present work develops and validates a segmentation algorithm based on statistical clustering. The introduction of constraints based on background features and contiguity priors is expected to improve robustness vs clinical image characteristics such as lesion dimension, noise, and contrast level. Methods: An eight-class Gaussian mixture model (GMM) clustering algorithm was modified by constraining the mean and variance parameters of four background classes according to the previous analysis of a lesion-free background volume of interest (background modeling). Hence, expectation maximization operated only on the four classes dedicated to lesion detection. To favor the segmentation of connected objects, a further variant was introduced by inserting priors relevant to the classification of neighbors. The algorithm was applied to simulated datasets and acquired phantom data. Feasibility and robustness toward initialization were assessed on a clinical dataset manually contoured by two expert clinicians. Comparisons were performed with respect to a standard eight-class GMM algorithm and to four different state-of-the-art methods in terms of volume error (VE), Dice index, classification error (CE), and Hausdorff distance (HD). Results: The proposed GMM segmentation with background modeling outperformed standard GMM and all the other tested methods. Medians of accuracy indexes were VE <3%, Dice >0.88, CE <0.25, and HD <1.2 in simulations; VE <23%, Dice >0.74, CE <0.43, and HD <1.77 in phantom data. Robustness toward image statistic changes (±15%) was shown by the low index changes: <26% for VE, <17% for Dice, and <15% for CE. Finally, robustness toward the user-dependent volume initialization was

  17. COMPARISON OF IMAGE ENHANCEMENT METHODS FOR CHROMOSOME KARYOTYPE IMAGE ENHANCEMENT

    Directory of Open Access Journals (Sweden)

    Dewa Made Sri Arsa

    2017-02-01

    The chromosome is a DNA structure that carries information about our life. This information can be obtained through karyotyping, a process that requires a clear image so that the chromosomes can be evaluated well. Chromosome images therefore have to be preprocessed by image enhancement. The process starts with background removal, in which the background color is cleaned from the image, followed by the enhancement step itself. This paper compares several methods for image enhancement: Histogram Equalization (HE), Contrast-Limited Adaptive Histogram Equalization (CLAHE), Histogram Equalization with 3D Block Matching (HE+BM3D), and a basic enhancement method, unsharp masking. We examine and discuss which method is best for enhancing chromosome images. To evaluate the methods, the original image was degraded by the addition of noise and blur. Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index (SSIM) are used to assess method performance, and the output of each enhancement method is compared with the result of Ikaros (MetaSystems™), a professional software package for karyotyping analysis. Based on the experimental results, the HE+BM3D method gives a stable result in both the noised and the blurred scenarios.
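
    A small sketch of the kind of comparison reported above, using widely available implementations of HE, CLAHE and unsharp masking (the BM3D stage is omitted because it requires a separate package), scored with PSNR and SSIM against the clean reference image. It assumes 8-bit grayscale inputs.

        # Compare basic enhancement methods on a degraded grayscale image.
        import cv2
        from skimage.metrics import peak_signal_noise_ratio, structural_similarity

        def unsharp_mask(img, sigma=2.0, amount=1.0):
            blurred = cv2.GaussianBlur(img, (0, 0), sigma)
            return cv2.addWeighted(img, 1 + amount, blurred, -amount, 0)

        def compare(reference, degraded):
            clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
            candidates = {
                "HE": cv2.equalizeHist(degraded),
                "CLAHE": clahe.apply(degraded),
                "unsharp": unsharp_mask(degraded),
            }
            for name, img in candidates.items():
                print(name,
                      "PSNR=%.2f" % peak_signal_noise_ratio(reference, img),
                      "SSIM=%.3f" % structural_similarity(reference, img))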

  18. Background and planning requirements for spent fuel shipments to DOE

    Energy Technology Data Exchange (ETDEWEB)

    Ravenscroft, Norman [Edlow International Company, 1666 Connecticut Avenue, NW, Suite 201, Washington, DC 20009 (United States)

    1996-10-01

    Information is provided on the planning required and the factors that must be included in the planning process for spent fuel shipments to DOE. A summary is also provided on the background concerning renewal of the DOE spent fuel acceptance policy in May 1996. (author)

  19. Novel Methods for Measuring Depth of Anesthesia by Quantifying Dominant Information Flow in Multichannel EEGs

    Directory of Open Access Journals (Sweden)

    Kab-Mun Cha

    2017-01-01

    In this paper, we propose novel methods for measuring depth of anesthesia (DOA) by quantifying dominant information flow in multichannel EEGs. Conventional methods mainly use a few EEG channels independently, and most multichannel EEG based studies are limited to specific regions of the brain, so the function of the cerebral cortex over wide brain regions is hardly reflected in DOA measurement. Here, DOA is measured by quantifying the dominant information flow obtained from principal bipartition. Three bipartitioning methods are used to detect the dominant information flow across all EEG channels, and the dominant information flow is quantified by calculating information entropy. High correlation between the proposed measures and the plasma concentration of propofol is confirmed by the experimental results on clinical data from 39 subjects. To illustrate the performance of the proposed methods more easily, we present the results for multichannel EEG on a two-dimensional (2D) brain map.

  20. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  1. Robust estimation of the noise variance from background MR data

    NARCIS (Netherlands)

    Sijbers, J.; Den Dekker, A.J.; Poot, D.; Bos, R.; Verhoye, M.; Van Camp, N.; Van der Linden, A.

    2006-01-01

    In the literature, many methods are available for estimation of the variance of the noise in magnetic resonance (MR) images. A commonly used method, based on the maximum of the background mode of the histogram, is revisited and a new, robust, and easy to use method is presented based on maximum
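
    The "maximum of the background mode" mentioned above has a simple interpretation: in a magnitude MR image the signal-free background is Rayleigh distributed, and the mode of a Rayleigh density equals the noise standard deviation, so the location of the background peak in the image histogram is itself an estimate of sigma. A rough sketch (not the refined estimator of the paper):

        # Estimate the noise standard deviation from the background peak of a
        # magnitude MR image histogram (Rayleigh background: mode = sigma).
        import numpy as np

        def sigma_from_background_mode(image, nbins=512):
            counts, edges = np.histogram(image.ravel(), bins=nbins)
            lower = counts[: nbins // 2]              # the background mode sits at low intensities
            k = int(np.argmax(lower))
            return 0.5 * (edges[k] + edges[k + 1])    # centre of the maximal bin

        # sigma_hat = sigma_from_background_mode(magnitude_volume)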

  2. Patient information about radiation therapy: a survey in Europe

    International Nuclear Information System (INIS)

    Hubert, Annie; Kantor, Guy; Dilhuydy, Jean-Marie; Toulouse, Claude; Germain, Colette; Le Polles, Gisele; Salamon, Roger; Scalliet, Pierre

    1997-01-01

    Background and purpose: We performed a survey to evaluate the present status and means of information given to patients treated by radiotherapy. A short questionnaire was sent, with the help of ESTRO, to 746 European heads of department with a request to send specific documents used for informing the patient. Within 2 months (March and April 1996) we received 290 answers (39%) and 97 centres sent documents. Materials and methods: Analysis of the questionnaire and the documents was performed quantitatively with usual statistical methods and qualitatively with a socio-anthropological method of content analysis. Results: Analysis of the questionnaire shows the major role of the radiation oncologist in giving information and writing documents. The 298 different samples sent from 97 centres represent a wide panel with a booklet of general information (59 booklets/57 centres), practical advice and specific explanations (177 documents/49 centres) and informed consent (36 documents/28 centres). The anthropological study was centred on the way information was given, evaluation of the patient's understanding and analysis of documents sent. Conclusion: This preliminary survey needs to be completed by a study, including the patient's point of view and needs, about the information given

  3. Informed consent: attitudes, knowledge and information concerning prenatal examination

    DEFF Research Database (Denmark)

    Dahl, Katja; Kesmodel, Ulrik; Hvidman, Lone

    2006-01-01

    Background: Providing women with information enabling an informed consent to prenatal examinations has been widely recommended. Objective: The primary purpose of this review is to summarise current knowledge of the pregnant woman's expectations and attitudes concerning prenatal examinations, as w...

  4. An improved algorithm of laser spot center detection in strong noise background

    Science.gov (United States)

    Zhang, Le; Wang, Qianqian; Cui, Xutai; Zhao, Yu; Peng, Zhong

    2018-01-01

    Laser spot center detection is required in many applications. Common algorithms for laser spot center detection, such as the centroid and Hough transform methods, have poor anti-interference ability and low detection accuracy under strong background noise. In this paper, median filtering was first used to remove noise while preserving the edge details of the image. Secondly, the laser facula image was binarized to extract the target from the background. Then morphological filtering was performed to eliminate noise points inside and outside the spot. Finally, the edge of the pretreated facula image was extracted and the laser spot center was obtained using the circle fitting method. Building on the circle fitting algorithm, the improved algorithm adds median filtering, morphological filtering and other processing steps. Theoretical analysis and experimental verification show that this method effectively filters background noise, which enhances the anti-interference ability of laser spot center detection and also improves the detection accuracy.
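    The pipeline described in this record (median filtering, binarization, morphological filtering, edge extraction, circle fitting) can be sketched with standard OpenCV primitives. The thresholds, kernel sizes and the use of a minimum enclosing circle below are illustrative assumptions rather than the authors' exact parameters:

```python
# Hedged sketch of the described pipeline; OpenCV calls and parameters are illustrative.
import cv2
import numpy as np

def spot_center(gray):
    """Estimate laser spot center from an 8-bit grayscale image; returns (cx, cy, radius)."""
    den = cv2.medianBlur(gray, 5)                                    # suppress impulse noise
    _, bw = cv2.threshold(den, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    bw = cv2.morphologyEx(bw, cv2.MORPH_OPEN, kernel)                # remove specks outside the spot
    bw = cv2.morphologyEx(bw, cv2.MORPH_CLOSE, kernel)               # fill holes inside the spot
    contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    edge = max(contours, key=cv2.contourArea)                        # spot boundary
    (cx, cy), radius = cv2.minEnclosingCircle(edge)                  # simple circle fit
    return cx, cy, radius
```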

  5. Background information for the SER Energy Agreement for Sustainable Growth calculations. Sector Built Environment; Achtergronddocument bij doorrekening SER Energieakkoord. Sector Gebouwde omgeving

    Energy Technology Data Exchange (ETDEWEB)

    Menkveld, M.; Tigchelaar, C. [ECN Beleidsstudies, Petten (Netherlands)

    2013-09-01

    This publication is part of the support given by ECN and PBL in the development of a national energy agreement between March and September 2013, as initiated by the SER (Social and Economic Council of the Netherlands). The report gives background information on the evaluation of the measures in the agreement aimed at the built environment. It is an annex to the general evaluation by PBL/ECN. This report was written as part of the support provided by ECN and PBL during the development of the energy agreement between March and September 2013, and serves as background to the evaluation of the measures aimed at energy saving in the built environment.

  6. A time averaged background compensator for Geiger-Mueller counters

    International Nuclear Information System (INIS)

    Bhattacharya, R.C.; Ghosh, P.K.

    1983-01-01

    The GM tube compensator described stores background counts to cancel an equal number of pulses from the measuring channel, providing time-averaged compensation. The method suits portable instruments. (orig.)
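    A simple numerical illustration of time-averaged background compensation, with made-up count numbers; the instrument's hardware pulse-cancellation scheme itself is not modelled:

```python
# Hedged sketch: background counts accumulated over a reference interval are used to
# cancel an equal expected number of pulses from the measuring channel.
def compensated_rate(measured_counts, live_time, background_counts, background_time):
    """Net count rate (counts/s) after subtracting the time-averaged background rate."""
    background_rate = background_counts / background_time
    return measured_counts / live_time - background_rate

# Illustrative numbers only.
print(compensated_rate(measured_counts=1450, live_time=60.0,
                       background_counts=900, background_time=600.0))
```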

  7. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    The article explores the possibility of using foreign innovative methods to assess the reliability of information in the consolidated financial statements of Russian companies. Recommendations are made for their adaptation and application in commercial organizations. Beneish method indicators are implemented in one of the world's largest vertically integrated steel and mining companies. Audit firms are encouraged to use these methods of assessing the reliability of information in the practical application of the ISA.

  8. Optimal background matching camouflage.

    Science.gov (United States)

    Michalis, Constantine; Scott-Samuel, Nicholas E; Gibson, David P; Cuthill, Innes C

    2017-07-12

    Background matching is the most familiar and widespread camouflage strategy: avoiding detection by having a similar colour and pattern to the background. Optimizing background matching is straightforward in a homogeneous environment, or when the habitat has very distinct sub-types and there is divergent selection leading to polymorphism. However, most backgrounds have continuous variation in colour and texture, so what is the best solution? Not all samples of the background are likely to be equally inconspicuous, and laboratory experiments on birds and humans support this view. Theory suggests that the most probable background sample (in the statistical sense), at the size of the prey, would, on average, be the most cryptic. We present an analysis, based on realistic assumptions about low-level vision, that estimates the distribution of background colours and visual textures, and predicts the best camouflage. We present data from a field experiment that tests and supports our predictions, using artificial moth-like targets under bird predation. Additionally, we present analogous data for humans, under tightly controlled viewing conditions, searching for targets on a computer screen. These data show that, in the absence of predator learning, the best single camouflage pattern for heterogeneous backgrounds is the most probable sample. © 2017 The Authors.
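    One way to operationalise "the most probable background sample" is to draw prey-sized patches from a background image, describe each patch with a small feature vector, and keep the patch whose features have the highest estimated density. The feature choice (mean and standard deviation) and the kernel density estimate below are illustrative assumptions, not the authors' visual model:

```python
# Hedged sketch: pick the prey-sized background patch whose simple feature vector is
# most probable under a kernel density estimate over sampled patches.
import numpy as np
from scipy.stats import gaussian_kde

def most_probable_patch(background, patch=16, n_samples=2000, seed=0):
    rng = np.random.default_rng(seed)
    h, w = background.shape
    ys = rng.integers(0, h - patch, n_samples)
    xs = rng.integers(0, w - patch, n_samples)
    feats = np.array([[background[y:y + patch, x:x + patch].mean(),
                       background[y:y + patch, x:x + patch].std()]
                      for y, x in zip(ys, xs)])
    density = gaussian_kde(feats.T)(feats.T)     # estimated probability of each sample
    best = int(np.argmax(density))
    return ys[best], xs[best]                    # top-left corner of the "best" patch

background = np.random.default_rng(1).random((256, 256))   # stand-in background texture
print(most_probable_patch(background))
```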

  9. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    Science.gov (United States)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new and blind algorithm for active user enumeration in asynchronous direct-sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria that are widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator that performs better at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. Subsequently, we propose an SNR-compliant method based on subspace analysis and a training genetic algorithm to obtain the performance of both. Moreover, our method uses only a single antenna, in contrast to previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and they confirm the efficiency of the method.
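    For reference, the standard eigenvalue-based AIC and MDL enumerators that this work builds on can be written compactly as below. This is the classical formulation, not the paper's SNR-compliant hybrid, and the toy eigenvalues are invented:

```python
# Hedged sketch of the classical eigenvalue-based AIC/MDL source enumerators (Wax &
# Kailath style); not the paper's subspace/genetic-algorithm hybrid.
import numpy as np

def aic_mdl(eigvals, n_snapshots):
    """Return (AIC estimate, MDL estimate) of the number of sources."""
    lam = np.sort(eigvals)[::-1]
    p = lam.size
    aic, mdl = [], []
    for k in range(p):
        tail = lam[k:]
        ratio = np.exp(np.mean(np.log(tail))) / np.mean(tail)   # geometric / arithmetic mean
        L = -n_snapshots * (p - k) * np.log(ratio)               # -log likelihood term
        aic.append(2 * L + 2 * k * (2 * p - k))
        mdl.append(L + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
    return int(np.argmin(aic)), int(np.argmin(mdl))

# Toy example: three "sources" raise the top eigenvalues above the noise floor.
eigvals = np.array([9.0, 7.5, 6.0, 1.1, 1.0, 0.95, 0.9, 1.05])
print(aic_mdl(eigvals, n_snapshots=500))
```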

  10. Application of mathematical methods of analysis in selection of competing information technologies

    Science.gov (United States)

    Semenov, V. L.; Kadyshev, E. N.; Zakharova, A. N.; Patianova, A. O.; Dulina, G. S.

    2018-05-01

    The article discusses the use of qualimetry methods, drawing on the apparatus of mathematical analysis, in the formation of an integral index that allows one to select the best option among competing information technologies. The authors propose the use of affine space in the evaluation and selection of competing information technologies.

  11. The Conterminous United States Mineral Appraisal Program; background information to accompany folio of geologic, geochemical, geophysical, and mineral resources maps of the Tonopah 1 by 2 degree Quadrangle, Nevada

    Science.gov (United States)

    John, David A.; Nash, J.T.; Plouff, Donald; Whitebread, D.H.

    1991-01-01

    The Tonopah 1° by 2° quadrangle in south-central Nevada was studied by an interdisciplinary research team to appraise its mineral resources. The appraisal is based on geological, geochemical, and geophysical field and laboratory investigations, the results of which are published as a folio of maps, figures, and tables, with accompanying discussions. This circular provides background information on the investigations and integrates the information presented in the folio. The selected bibliography lists references to the geology, geochemistry, geophysics, and mineral deposits of the Tonopah 1° by 2° quadrangle.

  12. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) by DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.

  13. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

    Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) by DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.
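    The geometric descriptors named in these two records (triangle-edge length from the Delaunay triangulation and cell area from the Voronoi diagram of the tracking points) can be computed with SciPy as sketched below; the boundary-detection model and seed-polygon region growing are not reproduced, and the points are random stand-ins for GPS tracks:

```python
# Hedged sketch of the geometric ingredients only: per-point longest Delaunay edge and
# Voronoi-cell area, two descriptors that hint at road boundaries in sparse trace data.
import numpy as np
from scipy.spatial import Delaunay, Voronoi

def boundary_descriptors(points):
    tri = Delaunay(points)
    vor = Voronoi(points)
    # Longest Delaunay edge incident to each point (long edges suggest boundary points).
    max_edge = np.zeros(len(points))
    for simplex in tri.simplices:
        for i in range(3):
            a, b = simplex[i], simplex[(i + 1) % 3]
            d = np.linalg.norm(points[a] - points[b])
            max_edge[a] = max(max_edge[a], d)
            max_edge[b] = max(max_edge[b], d)
    # Voronoi-cell area per point (open cells on the hull are marked infinite).
    areas = np.full(len(points), np.inf)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if region and -1 not in region:
            poly = vor.vertices[region]
            x, y = poly[:, 0], poly[:, 1]
            areas[i] = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    return max_edge, areas

pts = np.random.default_rng(0).random((200, 2))   # stand-in for GPS tracking points
edges, areas = boundary_descriptors(pts)
print(edges.mean(), np.isfinite(areas).sum())
```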

  14. First aid practices, beliefs, and sources of information among ...

    African Journals Online (AJOL)

    Background: While burns take seconds to occur, injuries incurred result in pain and undesirable long-term effects that might take a lifetime to overcome. The study was carried out to determine the measures of first aid delivered by caregivers after a burn injury and the sources of this information. Methods: A cross-sectional study ...

  15. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing

    OpenAIRE

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-01-01

    Background: International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives: This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods: We developed the informed consent document of the malaria treatment trial into a m...

  16. Application of information-retrieval methods to the classification of physical data

    Science.gov (United States)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  17. Background Material

    DEFF Research Database (Denmark)

    Zandersen, Marianne; Hyytiäinen, Kari; Saraiva, Sofia

    This document serves as background material to the BONUS Pilot Scenario Workshop, which aims to develop harmonised regional storylines of socio-ecological futures in the Baltic Sea region in a collaborative effort together with other BONUS projects and stakeholders.

  18. Thresholding of auditory cortical representation by background noise

    Science.gov (United States)

    Liang, Feixue; Bai, Lin; Tao, Huizhong W.; Zhang, Li I.; Xiao, Zhongju

    2014-01-01

    It is generally thought that background noise can mask auditory information. However, how the noise specifically transforms neuronal auditory processing in a level-dependent manner remains to be carefully determined. Here, with in vivo loose-patch cell-attached recordings in layer 4 of the rat primary auditory cortex (A1), we systematically examined how continuous wideband noise of different levels affected receptive field properties of individual neurons. We found that the background noise, when above a certain critical/effective level, resulted in an elevation of intensity threshold for tone-evoked responses. This increase of threshold was linearly dependent on the noise intensity above the critical level. As such, the tonal receptive field (TRF) of individual neurons was translated upward as an entirety toward high intensities along the intensity domain. This resulted in preserved preferred characteristic frequency (CF) and the overall shape of TRF, but reduced frequency responding range and an enhanced frequency selectivity for the same stimulus intensity. Such translational effects on intensity threshold were observed in both excitatory and fast-spiking inhibitory neurons, as well as in both monotonic and nonmonotonic (intensity-tuned) A1 neurons. Our results suggest that in a noise background, fundamental auditory representations are modulated through a background level-dependent linear shifting along intensity domain, which is equivalent to reducing stimulus intensity. PMID:25426029

  19. Thresholding of auditory cortical representation by background noise.

    Science.gov (United States)

    Liang, Feixue; Bai, Lin; Tao, Huizhong W; Zhang, Li I; Xiao, Zhongju

    2014-01-01

    It is generally thought that background noise can mask auditory information. However, how the noise specifically transforms neuronal auditory processing in a level-dependent manner remains to be carefully determined. Here, with in vivo loose-patch cell-attached recordings in layer 4 of the rat primary auditory cortex (A1), we systematically examined how continuous wideband noise of different levels affected receptive field properties of individual neurons. We found that the background noise, when above a certain critical/effective level, resulted in an elevation of intensity threshold for tone-evoked responses. This increase of threshold was linearly dependent on the noise intensity above the critical level. As such, the tonal receptive field (TRF) of individual neurons was translated upward as an entirety toward high intensities along the intensity domain. This resulted in preserved preferred characteristic frequency (CF) and the overall shape of TRF, but reduced frequency responding range and an enhanced frequency selectivity for the same stimulus intensity. Such translational effects on intensity threshold were observed in both excitatory and fast-spiking inhibitory neurons, as well as in both monotonic and nonmonotonic (intensity-tuned) A1 neurons. Our results suggest that in a noise background, fundamental auditory representations are modulated through a background level-dependent linear shifting along intensity domain, which is equivalent to reducing stimulus intensity.
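    The level-dependent threshold shift reported in these two records can be summarised by a simple piecewise-linear relation; the critical level and slope below are illustrative values, not fitted parameters:

```python
# Hedged sketch of the described effect: above a critical noise level, the tone-response
# threshold rises linearly with noise intensity, translating the tonal receptive field
# upward along the intensity axis. Parameter values are illustrative only.
def shifted_threshold(base_threshold_db, noise_db, critical_db=30.0, slope=1.0):
    """Effective intensity threshold (dB) in continuous wideband background noise."""
    return base_threshold_db + slope * max(0.0, noise_db - critical_db)

for noise in (20, 40, 60):
    print(noise, shifted_threshold(base_threshold_db=25.0, noise_db=noise))
```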

  20. Background paper on Technology Roadmaps (TRMs)

    Energy Technology Data Exchange (ETDEWEB)

    More, E.; Phaal, R. [Institute for Manufacturing IfM, Department of Engineering, University of Cambridge, Cambridge (United Kingdom); Londo, H.M.; Wurtenberger, L.; Cameron, L.R. [ECN Policy Studies, Amsterdam (Netherlands)

    2013-04-15

    This background paper reports on the use of technology roadmaps (TRMs) related to climate change mitigation and adaptation technologies. The study is motivated by the UNFCCC Conference of the Parties (CoP) request to the Technology Executive Committee (TEC) to catalyse the development and use of TRMs as facilitative tools for action on mitigation and adaptation. Having originated in industry, TRMs are now used extensively in policy settings too; however, their widespread use across sectors and by different stakeholders has resulted in a lack of understanding of their real value in helping to catalyse cooperation towards technological solutions to the problems presented by climate change. Consequently this background paper presents (1) an overview of different TRM methods, (2) an initial analysis of gaps and barriers in existing TRMs, and (3) a review of current TRM good practices.

  1. Cosmic Microwave Background Timeline

    Science.gov (United States)

    1934: Richard Tolman shows that blackbody radiation in an expanding universe ... will have a blackbody cosmic microwave background with temperature about 5 K. 1955: Tigran Shmaonov ... anisotropy in the cosmic microwave background; this strongly supports the big bang model with gravitational

  2. Methods for Measuring Productivity in Libraries and Information Centres

    Directory of Open Access Journals (Sweden)

    Mohammad Alaaei

    2009-04-01

    Within information centers, productivity is the result of the optimal and effective use of information resources, service quality improvement, increased user satisfaction, a pleasant working environment, and increased motivation and enthusiasm of staff to work better. All of these contribute to the growth and development of information centers, so these centers need to be familiar with the methods employed in productivity measurement. Productivity is one of the criteria for evaluating system performance. In past decades, particular emphasis has been placed on the measurement and improvement of human resources, creativity, innovation and expert analysis. Identifying problems and issues and finding new means of more useful and better resource management is the very essence of productivity. Simply put, productivity is the relationship between system output and the elements garnered to produce these outputs. The causality between variables and factors impacting productivity is very complex. In information centers, given the large volume of elements involved, it seems necessary to increase efficiency and productivity

  3. Precision Foreground Removal in Cosmic Microwave Background Polarization Maps

    Data.gov (United States)

    National Aeronautics and Space Administration — The most promising method for detecting primordial gravitational waves lies in the B-mode polarization of the cosmic microwave background, or CMB. A measurement of...

  4. Hanford Site background: Part 1, Soil background for nonradioactive analytes. Revision 1, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-04-01

    Volume two contains the following appendices: Description of soil sampling sites; sampling narrative; raw data soil background; background data analysis; sitewide background soil sampling plan; and use of soil background data for the detection of contamination at waste management units on the Hanford Site.

  5. Clinical guideline representation in a CDS: a human information processing method.

    Science.gov (United States)

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as a requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the clinicians' mental strategies employed in deciding on survivors' screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  6. Background sources at PEP

    International Nuclear Information System (INIS)

    Lynch, H.; Schwitters, R.F.; Toner, W.T.

    1988-01-01

    Important sources of background for PEP experiments are studied. Background particles originate from high-energy electrons and positrons which have been lost from stable orbits, γ-rays emitted by the primary beams through bremsstrahlung in the residual gas, and synchrotron radiation x-rays. The effects of these processes on the beam lifetime are calculated and estimates of background rates at the interaction region are given. Recommendations for the PEP design, aimed at minimizing background, are presented. 7 figs., 4 tabs

  7. Microarray background correction: maximum likelihood estimation for the normal-exponential convolution

    DEFF Research Database (Denmark)

    Silver, Jeremy D; Ritchie, Matthew E; Smyth, Gordon K

    2009-01-01

    The normal-exponential (normexp) convolution model for background correction assumes that observed intensities are the sum of two components that are normally and exponentially distributed, representing background noise and signal, respectively. Using a saddle-point approximation, Ritchie and others (2007) found normexp to be the best background correction method for 2-color microarray data. This article develops the normexp method further by improving the estimation ... is developed for exact maximum likelihood estimation (MLE) using high-quality optimization software and using the saddle-point estimates as starting values. "MLE" is shown to outperform heuristic estimators proposed by other authors, both in terms of estimation accuracy and in terms of performance on real data...
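    Given fitted parameters, the normexp-corrected intensity is the expected signal conditional on the observation. The sketch below implements that conditional-expectation formula (as used, for example, in limma-style background correction); the MLE fitting step that this record is about is not reproduced, and the parameter values are invented:

```python
# Hedged sketch of normexp signal correction: observed intensity = Gaussian background
# N(mu, sigma^2) + exponential signal with mean alpha. The corrected value is
# E[signal | observation]; parameter estimation (the subject of the record) is omitted.
import numpy as np
from scipy.stats import norm

def normexp_signal(x, mu, sigma, alpha):
    """Expected signal given observed intensity x under the normal-exponential model."""
    mu_sf = x - mu - sigma**2 / alpha
    return mu_sf + sigma * norm.pdf(mu_sf / sigma) / norm.cdf(mu_sf / sigma)

x = np.array([80.0, 120.0, 400.0])           # invented observed intensities
print(normexp_signal(x, mu=100.0, sigma=15.0, alpha=200.0))
```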

  8. Traceability information carriers. The technology backgrounds and consumers' perceptions of the technological solutions

    DEFF Research Database (Denmark)

    Chrysochou, Polymeros; Chryssochoidis, George; Kehagia, Olga

    2009-01-01

    The implementation of traceability in the food supply chain has reinforced the adoption of technologies with the ability to track forward and trace back product-related information. Based on the premise that these technologies can be used as a means to provide product-related information to consumers, the study explored consumers' perceptions of the technological solutions that food producers may adopt in their production lines. For the purposes of the study, a focus group study was conducted across 12 European countries, while a set of four different technologies used as a means to provide traceability information to consumers was the focal point of the discussions in each focus group. Results show that the amount of and confidence in the information provided, perceived levels of convenience, impact on product quality and safety, impact on consumers' health and the environment, and potential consequences on ethical and privacy liberties constitute important factors influencing consumers' perceptions of these technologies.

  9. Methods and Systems for Advanced Spaceport Information Management

    Science.gov (United States)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  10. METHODS OF MANAGING TRAFFIC DISTRIBUTION IN INFORMATION AND COMMUNICATION NETWORKS OF CRITICAL INFRASTRUCTURE SYSTEMS

    OpenAIRE

    Kosenko, Viktor; Persiyanova, Elena; Belotskyy, Oleksiy; Malyeyeva, Olga

    2017-01-01

    The subject matter of the article is information and communication networks (ICN) of critical infrastructure systems (CIS). The goal of the work is to create methods for managing the data flows and resources of the ICN of CIS to improve the efficiency of information processing. The following tasks were solved in the article: the data flow model of multi-level ICN structure was developed, the method of adaptive distribution of data flows was developed, the method of network resource assignment...

  11. Method and apparatus for bistable optical information storage for erasable optical disks

    Science.gov (United States)

    Land, Cecil E.; McKinney, Ira D.

    1990-01-01

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field-dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk.

  12. Reducing DRIFT backgrounds with a submicron aluminized-mylar cathode

    Science.gov (United States)

    Battat, J. B. R.; Daw, E.; Dorofeev, A.; Ezeribe, A. C.; Fox, J. R.; Gauvreau, J.-L.; Gold, M.; Harmon, L.; Harton, J.; Lafler, R.; Landers, J.; Lauer, R. J.; Lee, E. R.; Loomba, D.; Lumnah, A.; Matthews, J.; Miller, E. H.; Mouton, F.; Murphy, A. St. J.; Paling, S. M.; Phan, N.; Sadler, S. W.; Scarff, A.; Schuckman, F. G.; Snowden-Ifft, D.; Spooner, N. J. C.; Walker, D.

    2015-09-01

    Background events in the DRIFT-IId dark matter detector, mimicking potential WIMP signals, are predominantly caused by alpha decays on the central cathode in which the alpha particle is completely or partially absorbed by the cathode material. We installed a 0.9 μm thick aluminized-mylar cathode as a way to reduce the probability of producing these backgrounds. We study three generations of cathode (wire, thin-film, and radiologically clean thin-film) with a focus on the ratio of background events to alpha decays. Two independent methods of measuring the absolute alpha decay rate are used to ensure an accurate result, and agree to within 10%. Using alpha range spectroscopy, we measure the radiologically cleanest cathode version to have a contamination of 3.3±0.1 ppt 234U and 73±2 ppb 238U. This cathode reduces the probability of producing an RPR from an alpha decay by a factor of 70±20 compared to the original stainless steel wire cathode. First results are presented from a texturized version of the cathode, intended to be even more transparent to alpha particles. These efforts, along with other background reduction measures, have resulted in a drop in the observed background rate from 500/day to 1/day. With the recent implementation of full-volume fiducialization, these remaining background events are identified, allowing for background-free operation.

  13. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  14. Background of SIFs and Stress Indices for Moment Loadings of Piping Components

    International Nuclear Information System (INIS)

    Wais, E. A.; Rodabaugh, E. C.

    2005-01-01

    This report provides background information, references, and equations for twenty-four piping components (thirteen component SIFs and eleven component stress indices) that justify the values or expressions for the SIFs and indices

  15. Aquaculture Information Package

    Energy Technology Data Exchange (ETDEWEB)

    Boyd, T.; Rafferty, K. [editors

    1998-01-01

    This package of information is intended to provide background to developers of geothermal aquaculture projects. The material is divided into eight sections and includes information on market and price information for typical species, aquaculture water quality issues, typical species culture information, pond heat loss calculations, an aquaculture glossary, regional and university aquaculture offices and state aquaculture permit requirements.

  16. European network for promoting the physical health of residents in psychiatric and social care facilities (HELPS): background, aims and methods

    Science.gov (United States)

    Weiser, Prisca; Becker, Thomas; Losert, Carolin; Alptekin, Köksal; Berti, Loretta; Burti, Lorenzo; Burton, Alexandra; Dernovsek, Mojca; Dragomirecka, Eva; Freidl, Marion; Friedrich, Fabian; Genova, Aneta; Germanavicius, Arunas; Halis, Ulaş; Henderson, John; Hjorth, Peter; Lai, Taavi; Larsen, Jens Ivar; Lech, Katarzyna; Lucas, Ramona; Marginean, Roxana; McDaid, David; Mladenova, Maya; Munk-Jørgensen, Povl; Paziuc, Alexandru; Paziuc, Petronela; Priebe, Stefan; Prot-Klinger, Katarzyna; Wancata, Johannes; Kilian, Reinhold

    2009-01-01

    Background People with mental disorders have a higher prevalence of physical illnesses and reduced life expectancy as compared with the general population. However, there is a lack of knowledge across Europe concerning interventions that aim at reducing somatic morbidity and excess mortality by promoting behaviour-based and/or environment-based interventions. Methods and design HELPS is an interdisciplinary European network that aims at (i) gathering relevant knowledge on physical illness in people with mental illness, (ii) identifying health promotion initiatives in European countries that meet country-specific needs, and (iii) at identifying best practice across Europe. Criteria for best practice will include evidence on the efficacy of physical health interventions and of their effectiveness in routine care, cost implications and feasibility for adaptation and implementation of interventions across different settings in Europe. HELPS will develop and implement a "physical health promotion toolkit". The toolkit will provide information to empower residents and staff to identify the most relevant risk factors in their specific context and to select the most appropriate action out of a range of defined health promoting interventions. The key methods are (a) stakeholder analysis, (b) international literature reviews, (c) Delphi rounds with experts from participating centres, and (d) focus groups with staff and residents of mental health care facilities. Meanwhile a multi-disciplinary network consisting of 15 European countries has been established and took up the work. As one main result of the project they expect that a widespread use of the HELPS toolkit could have a significant positive effect on the physical health status of residents of mental health and social care facilities, as well as to hold resonance for community dwelling people with mental health problems. Discussion A general strategy on health promotion for people with mental disorders must take into

  17. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design

    Science.gov (United States)

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-01-01

    Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469

  18. Community Information Systems.

    Science.gov (United States)

    Freeman, Andrew

    Information is provided on technological and social trends as background for a workshop designed to heighten the consciousness of workers in community information systems. Initially, the basic terminology is considered in its implications for an integrated perspective of community information systems, with particular attention given to the meaning…

  19. A Method for Evaluating Information Security Governance (ISG) Components in Banking Environment

    Science.gov (United States)

    Ula, M.; Ula, M.; Fuadi, W.

    2017-02-01

    As modern banking increasingly relies on the internet and computer technologies to operate businesses and market interactions, threats and security breaches have increased sharply in recent years. Insider and outsider attacks have caused global businesses to lose trillions of dollars a year. Therefore, there is a need for a proper framework to govern information security in the banking system. The aim of this research is to propose and design an enhanced method to evaluate information security governance (ISG) implementation in a banking environment. This research examines and compares the elements of commonly used information security governance frameworks, standards and best practices, and considers the strengths and weaknesses of their approaches. The initial framework for governing information security in the banking system was constructed from a document review. The framework was categorized into three levels: the governance level, the managerial level, and the technical level. The study further conducts an online survey of banking security professionals to obtain their professional judgment about the most critical ISG components and the importance of each ISG component that should be implemented in a banking environment. Data from the survey were used to construct a mathematical model for ISG evaluation, with the component importance data used as weighting coefficients for the related components in the model. The research further develops a method for evaluating ISG implementation in banking based on the mathematical model. The proposed method was tested through a real bank case study in an Indonesian local bank. The case study shows that the proposed method has sufficient coverage of ISG in the banking environment and effectively evaluates ISG implementation.
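    A weighted-scoring evaluation of the kind described here can be sketched as a weighted average of per-component implementation scores, with survey-derived importance values as weights. The component names, weights and scores below are placeholders, not the study's actual model:

```python
# Hedged sketch of a survey-weighted ISG evaluation; all names and numbers are invented
# placeholders standing in for the study's components, importance weights and scores.
components = {
    # component: (importance weight from survey, implementation score 0..1)
    "governance_policy":  (0.40, 0.70),
    "risk_management":    (0.35, 0.55),
    "technical_controls": (0.25, 0.80),
}

def isg_score(components):
    """Weighted average of implementation scores, normalized by total weight."""
    total_weight = sum(w for w, _ in components.values())
    return sum(w * s for w, s in components.values()) / total_weight

print(f"overall ISG maturity: {isg_score(components):.2f}")
```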

  20. Computational Methods for Physical Model Information Management: Opening the Aperture

    International Nuclear Information System (INIS)

    Moser, F.; Kirgoeze, R.; Gagne, D.; Calle, D.; Murray, J.; Crowley, J.

    2015-01-01

    The volume, velocity and diversity of data available to analysts are growing exponentially, increasing the demands on analysts to stay abreast of developments in their areas of investigation. In parallel with the growth in data, technologies have been developed to efficiently process and store information, and to effectively extract information suitable for the development of a knowledge base capable of supporting inferential (decision logic) reasoning over semantic spaces. These technologies and methodologies, in effect, allow for automated discovery and mapping of information to specific steps in the Physical Model (Safeguards' standard reference for the nuclear fuel cycle). This paper will describe and demonstrate an integrated service under development at the IAEA that utilizes machine learning techniques, computational natural language models, Bayesian methods and semantic/ontological reasoning capabilities to process large volumes of (streaming) information and associate relevant, discovered information with the appropriate process step in the Physical Model. The paper will detail how this capability will consume open source and controlled information sources, be integrated with other capabilities within the analysis environment, and provide the basis for a semantic knowledge base suitable for hosting future mission-focused applications. (author)
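    As a rough illustration of mapping discovered text to a labelled process step, the sketch below trains a small bag-of-words classifier; the step labels and training snippets are invented placeholders, and the IAEA service described above combines several further techniques (semantic reasoning, Bayesian methods) that this toy example does not cover:

```python
# Hedged sketch: a minimal text classifier that assigns short documents to fuel-cycle
# process-step labels. Labels and training snippets are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = [
    "uranium ore mined and milled into yellowcake",
    "conversion of yellowcake to uranium hexafluoride",
    "centrifuge cascades enrich UF6 to reactor grade",
    "spent fuel stored in cooling ponds before reprocessing",
]
train_steps = ["mining_milling", "conversion", "enrichment", "spent_fuel"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(train_texts, train_steps)
print(model.predict(["more centrifuge cascades enrich uranium at the new plant"]))
```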