WorldWideScience

Sample records for source analysis study

  1. USING THE METHODS OF WAVELET ANALYSIS AND SINGULAR SPECTRUM ANALYSIS IN THE STUDY OF RADIO SOURCE BL LAC

    OpenAIRE

    Donskykh, G. I.; Ryabov, M. I.; Sukharev, A. I.; Aller, M.

    2014-01-01

    We investigated the monitoring data of the extragalactic source BL Lac. The monitoring was carried out with the University of Michigan 26-meter radio telescope. To study the flux density of the extragalactic source BL Lac at frequencies of 14.5, 8 and 4.8 GHz, wavelet analysis and singular spectrum analysis were used. Calculating the integral wavelet spectra revealed long-term components (~7-8 years) and short-term components (~1-4 years) in BL Lac. The study of VLBI radio maps (by the program Mojave) ...
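    The singular-spectrum-analysis step described above can be sketched in a few lines of NumPy. This is a minimal illustration on a synthetic flux-like series, not the BL Lac data; the window length, component count, and series are arbitrary assumptions for the example.

```python
import numpy as np

def ssa_components(series, window, n_components):
    """Basic Singular Spectrum Analysis: decompose a 1-D series into
    additive components ordered by singular value."""
    series = np.asarray(series, float)
    n = len(series)
    k = n - window + 1
    # Embedding step: build the trajectory (Hankel) matrix.
    traj = np.column_stack([series[i:i + window] for i in range(k)])
    # Decomposition step: SVD of the trajectory matrix.
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for idx in range(n_components):
        elem = s[idx] * np.outer(u[:, idx], vt[idx])
        # Reconstruction step: diagonal averaging (Hankelization)
        # of each rank-1 matrix back into a time series.
        flipped = elem[:, ::-1]
        comp = np.array([np.mean(np.diag(flipped, d))
                         for d in range(k - 1, k - n - 1, -1)])
        comps.append(comp)
    return comps

# Synthetic series: slow trend + ~50-sample cycle + noise (hypothetical).
t = np.arange(400)
rng = np.random.default_rng(0)
series = 0.01 * t + np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=400)
trend = ssa_components(series, window=100, n_components=1)[0]
```

    Summing all min(window, k) components reproduces the original series; long-period structure concentrates in the leading components, which is how long-term and short-term variations are separated.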

  2. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G R; Gosden, C [La Trobe Univ., Bundoora, VIC (Australia); Bird, R; Hotchkis, M [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J; Torrence, R; Fullaga, R [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1994-12-31

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A ratio combination of 9 elements was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.

  3. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    International Nuclear Information System (INIS)

    Summerhayes, G.R.; Gosden, C.; Bird, R.; Hotchkis, M.; Specht, J.; Torrence, R.; Fullaga, R.

    1993-01-01

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A ratio combination of 9 elements was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.

  4. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G.R.; Gosden, C. [La Trobe Univ., Bundoora, VIC (Australia); Bird, R.; Hotchkis, M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J.; Torrence, R.; Fullaga, R. [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1993-12-31

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A ratio combination of 9 elements was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.

  5. Performance analysis and experimental study of heat-source tower solution regeneration

    International Nuclear Information System (INIS)

    Liang, Caihua; Wen, Xiantai; Liu, Chengxing; Zhang, Xiaosong

    2014-01-01

    Highlights: • Theoretical analysis is performed on the characteristics of the heat-source tower. • Experimental study is performed on the variation rules of the solution regeneration rate. • The characteristics of solution regeneration vary widely with different demands. • Results are useful for optimizing the process of solution regeneration. - Abstract: By analyzing the similarities and differences between the solution regeneration of a heat-source tower and desiccant solution regeneration, this paper points out that solution regeneration of a heat-source tower is characterized by small demand, and that the regeneration rate is susceptible to the outdoor ambient environment. A theoretical analysis is performed on the characteristics of a heat-source tower solution in different outdoor environments and different regeneration modes, and an experimental study is performed on the variation rules of the solution regeneration rate of a cross-flow heat-source tower under different inlet and operating parameters. The experimental results show that: in the operating regeneration mode, as the air volume was increased from 123 m³ h⁻¹ to 550 m³ h⁻¹, the system heat transfer amount increased from 0.42 kW to 0.78 kW, and the regeneration rate increased from 0.03 g s⁻¹ to 0.19 g s⁻¹. Increasing the solution flow may increase the system heat transfer amount; however, the regeneration rate decreased to a certain extent. In the regeneration mode when the system is idle, as the air volume was increased from 136 m³ h⁻¹ to 541 m³ h⁻¹, the regeneration rate increased from 0.03 g s⁻¹ to 0.1 g s⁻¹. The regeneration rate remained almost unchanged at around 0.07 g s⁻¹ as the solution flow was increased. In the regeneration mode with auxiliary heat when the system is idle, increasing the air volume and the solution flow required more auxiliary heat, thereby improving the solution regeneration rate. As the auxiliary heat was increased from 0.33 k

  6. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: mario@ipen.br; mo.menezes@gmail.com

    2005-07-01

    Free/Open Source Software (FOSS) has been used in science long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed-source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed-source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used both for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of easy programming. (author)

  7. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    International Nuclear Information System (INIS)

    Menezes, Mario Olimpio de

    2005-01-01

    Free/Open Source Software (FOSS) has been used in science long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed-source software became widely available to scientists for data analysis on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high-quality alternative to commercial closed-source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used both for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of easy programming. (author)

  8. Neutron activation analysis: Modelling studies to improve the neutron flux of Americium-Beryllium source

    Energy Technology Data Exchange (ETDEWEB)

    Didi, Abdessamad; Dadouch, Ahmed; Tajmouati, Jaouad; Bekkouri, Hassane [Advanced Technology and Integration System, Dept. of Physics, Faculty of Science Dhar Mehraz, University Sidi Mohamed Ben Abdellah, Fez (Morocco); Jai, Otman [Laboratory of Radiation and Nuclear Systems, Dept. of Physics, Faculty of Sciences, Tetouan (Morocco)

    2017-06-15

    Americium–beryllium (Am-Be; n, γ) is a neutron emitting source used in various research fields such as chemistry, physics, geology, archaeology, medicine, and environmental monitoring, as well as in the forensic sciences. It is a mobile source of neutron activity (20 Ci), yielding a small thermal neutron flux that is water moderated. The aim of this study is to develop a model to increase the neutron thermal flux of a source such as Am-Be. This study achieved multiple advantageous results: primarily, it will help us perform neutron activation analysis. Next, it will give us the opportunity to produce radio-elements with short half-lives. Am-Be single and multisource (5 sources) experiments were performed within an irradiation facility with a paraffin moderator. The resulting models mainly increase the thermal neutron flux compared to the traditional method with water moderator.

  9. Chromatographic fingerprint similarity analysis for pollutant source identification

    International Nuclear Information System (INIS)

    Xie, Juan-Ping; Ni, Hong-Gang

    2015-01-01

    In the present study, a similarity analysis method was proposed to evaluate the source-sink relationships among environmental media for polybrominated diphenyl ethers (PBDEs), which were taken as the representative contaminants. Chromatographic fingerprint analysis has been widely used in the fields of natural products chemistry and forensic chemistry, but its application to environmental science has been limited. We established a library of various sources of media containing contaminants (e.g., plastics), recognizing that the establishment of a more comprehensive library allows for a better understanding of the sources of contamination. We then compared an environmental complex mixture (e.g., sediment, soil) with the profiles in the library. These comparisons could be used as the first step in source tracking. The cosine similarities between plastic and soil or sediment ranged from 0.53 to 0.68, suggesting that plastic in electronic waste is an important source of PBDEs in the environment, but it is not the only source. A similarity analysis between soil and sediment indicated that they have a source-sink relationship. Generally, the similarity analysis method can encompass more relevant information about complex mixtures in the environment than a profile-based approach that only focuses on target pollutants. There is an inherent advantage to creating a data matrix containing all peaks and their relative levels after matching the peaks based on retention times and peak areas. This data matrix can be used for source identification via a similarity analysis without quantitative or qualitative analysis of all chemicals in a sample. - Highlights: • Chromatographic fingerprint analysis can be used as the first step in source tracking. • The similarity analysis method can encompass more relevant information about pollution. • The fingerprints strongly depend on the chromatographic conditions. • A more effective and robust method for identifying similarities is required.
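    The cosine-similarity comparison between source and sink fingerprints can be sketched as below. The peak-area vectors are hypothetical stand-ins for matched chromatographic peaks, not data from the study.

```python
import numpy as np

def cosine_similarity(profile_a, profile_b):
    """Cosine of the angle between two peak-area profiles:
    1.0 means identical shape, 0.0 means no overlap."""
    a = np.asarray(profile_a, float)
    b = np.asarray(profile_b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical relative peak areas at matched retention times.
plastic = [0.40, 0.25, 0.15, 0.10, 0.10]
sediment = [0.30, 0.20, 0.20, 0.15, 0.15]
sim = cosine_similarity(plastic, sediment)
```

    Here `sim` is high because the toy profiles were chosen to be similar; in the study, real source-sink pairs fell between 0.53 and 0.68, indicating a shared but not exclusive source.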

  10. Comparative Analysis Study of Open Source GIS in Malaysia

    International Nuclear Information System (INIS)

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Halim, Mohd Khuizham Abd

    2014-01-01

    Open source software might appear to be a major prospective change, capable of delivering value in various industries and providing a competitive means for developing countries. The leading purpose of this research is to discover the degree of adoption of Open Source Software (OSS) for Geographic Information System (GIS) applications within Malaysia, where low adoption may derive from inadequate awareness of open source concepts or from technical deficiencies in the open source tools. The research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate awareness and acceptance levels based on comparative feedback regarding OSS and commercial GIS. The survey was conducted among three groups of respondents: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed during the analysis to supply measurable and descriptive signals for the final result. The second stage involved an interview session with a major organization that operates open source web GIS: the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The aim of this preliminary study was to understand the viewpoints of different groups of people on open source, and whether insufficient awareness of open source concepts and possibilities may be a significant root of the level of adoption of open source options.

  11. Neutron activation analysis: Modelling studies to improve the neutron flux of Americium–Beryllium source

    Directory of Open Access Journals (Sweden)

    Abdessamad Didi

    2017-06-01

    Americium–beryllium (Am-Be; n, γ) is a neutron emitting source used in various research fields such as chemistry, physics, geology, archaeology, medicine, and environmental monitoring, as well as in the forensic sciences. It is a mobile source of neutron activity (20 Ci), yielding a small thermal neutron flux that is water moderated. The aim of this study is to develop a model to increase the neutron thermal flux of a source such as Am-Be. This study achieved multiple advantageous results: primarily, it will help us perform neutron activation analysis. Next, it will give us the opportunity to produce radio-elements with short half-lives. Am-Be single and multisource (5 sources) experiments were performed within an irradiation facility with a paraffin moderator. The resulting models mainly increase the thermal neutron flux compared to the traditional method with a water moderator.

  12. SWOT analysis of the renewable energy sources in Romania - case study: solar energy

    Science.gov (United States)

    Lupu, A. G.; Dumencu, A.; Atanasiu, M. V.; Panaite, C. E.; Dumitrașcu, Gh; Popescu, A.

    2016-08-01

    The evolution of energy sector worldwide triggered intense preoccupation on both finding alternative renewable energy sources and environmental issues. Romania is considered to have technological potential and geographical location suitable to renewable energy usage for electricity generation. But this high potential is not fully exploited in the context of policies and regulations adopted globally, and more specific, European Union (EU) environmental and energy strategies and legislation related to renewable energy sources. This SWOT analysis of solar energy source presents the state of the art, potential and future prospects for development of renewable energy in Romania. The analysis concluded that the development of solar energy sector in Romania depends largely on: viability of legislative framework on renewable energy sources, increased subsidies for solar R&D, simplified methodology of green certificates, and educating the public, investors, developers and decision-makers.

  13. Statistical studies of powerful extragalactic radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, J T

    1981-01-01

    This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and it is followed by three on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density from the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful test available which is based on non-parametric statistics. It is therefore used to study the dependences of the properties of sources on their size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.
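    The Spearman partial rank correlation used in the penultimate chapter can be computed as below: correlate the ranks of two variables, then remove the part explained by the controlled variable. The synthetic data (a confounder z driving both x and y, loosely analogous to redshift driving two source properties) are an assumption for illustration, not the 3CR sample.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation (assumes no ties)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

def spearman_partial(x, y, z):
    """Partial rank correlation of x and y, controlling for z."""
    rxy, rxz, ryz = spearman(x, y), spearman(x, z), spearman(y, z)
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Toy example: z drives both x and y, so the raw rank correlation is
# large while the partial correlation (at "constant z") is near zero.
rng = np.random.default_rng(1)
z = rng.normal(size=200)
x = z + 0.3 * rng.normal(size=200)
y = z + 0.3 * rng.normal(size=200)
raw, partial = spearman(x, y), spearman_partial(x, y, z)
```

    This is how dependences of source properties on size "at constant redshift" can be tested without assuming a parametric relationship.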

  14. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    Science.gov (United States)

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  15. Joint source based analysis of multiple brain structures in studying major depressive disorder

    Science.gov (United States)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and those in the healthy control group. Moreover, in a cross-validation leave-one-out experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
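    A leave-one-out experiment with a Fisher Linear Discriminant of the kind described above can be sketched as follows. The two Gaussian feature clusters are invented stand-ins for the real subjects' feature vectors.

```python
import numpy as np

def fld_train(X, y):
    """Fisher Linear Discriminant for two classes labelled 0/1.
    Returns the projection direction and the decision midpoint."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix (sum of the two class scatter matrices).
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    # Small ridge term keeps the solve stable if Sw is near-singular.
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), m1 - m0)
    return w, (m0 + m1) / 2

def loo_accuracy(X, y):
    """Leave-one-out cross-validated accuracy of the FLD classifier."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # hold out subject i
        w, mid = fld_train(X[mask], y[mask])
        pred = int((X[i] - mid) @ w > 0)       # project onto w, threshold at midpoint
        hits += int(pred == y[i])
    return hits / len(y)

# Two hypothetical well-separated clusters, 20 "subjects" each, 3 features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 3)), rng.normal(3.0, 1.0, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
acc = loo_accuracy(X, y)
```

    Refitting the discriminant inside each fold, as here, is what keeps the reported accuracy honest; training once on all subjects and then scoring them would be optimistically biased.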

  16. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  17. Studies and modeling of cold neutron sources

    International Nuclear Information System (INIS)

    Campioni, G.

    2004-11-01

    With the purpose of updating knowledge in the field of cold neutron sources, the work of this thesis was organized along the following three axes. First, the gathering of the specific information forming the material of this work. This set of knowledge covers the following fields: cold neutrons, cross-sections for the different cold moderators, flux slowing down, different measurements of the cold flux and, finally, issues in the thermal analysis of the problem. Second, the study and development of suitable computation tools. After an analysis of the problem, several tools were planned, implemented and tested in the 3-dimensional radiation transport code Tripoli-4. In particular, a decoupling module, integrated in the official version of Tripoli-4, can perform Monte-Carlo parametric studies with CPU-time savings of up to a factor of 50. A coupling module, simulating neutron guides, has also been developed and implemented in the Monte-Carlo code McStas. Third, a complete study was carried out to validate the installed calculation chain. These studies focus on 3 cold sources currently in operation: SP1 of the Orphee reactor and 2 other sources (SFH and SFV) of the HFR at the Laue Langevin Institute. These studies give examples of problems and methods for the design of future cold sources.

  18. Hydrodynamic analysis of potential groundwater extraction capacity increase: case study of 'Nelt' groundwater source at Dobanovci

    Directory of Open Access Journals (Sweden)

    Bajić Dragoljub I.

    2017-01-01

    A comprehensive hydrodynamic analysis of the groundwater regime, undertaken to assess the potential for expanding the 'Nelt' groundwater source at Dobanovci or developing a new groundwater source for a future baby food factory, including quantification of the impact on the production wells of the nearby 'Pepsi' groundwater source, is presented in the paper. The existing Nelt source comprises three active production wells that tap a subartesian aquifer formed in sands and gravelly sands; however, the analysis considers only the two nearest wells. A long-term group pumping test was conducted of production wells N-1 and N-2 (Nelt source) and production wells B-1 and B-2 (Pepsi source), while the piezometric head in the vicinity of these wells was monitored at observation well P-1, which is located in the area considered for Nelt source expansion. Data were collected at the maximum pumping capacity of all the production wells. A hydrodynamic model of groundwater flow in the extended area of the Nelt source was generated for the purposes of the comprehensive hydrodynamic analysis. Hydrodynamic prognostic calculations addressed two solution alternatives for the capacity increase over a period of ten years. Licensed Visual MODFLOW Pro software, regarded as being at the very top in this field, was used for the calculations.

  19. LED intense headband light source for fingerprint analysis

    Science.gov (United States)

    Villa-Aleman, Eliel

    2005-03-08

    A portable, lightweight and high-intensity light source for detecting and analyzing fingerprints during field investigation. On-site field analysis requires long hours of mobile analysis. In one embodiment, the present invention comprises a plurality of light emitting diodes; a power source; and a personal attachment means; wherein the light emitting diodes are powered by the power source, and wherein the power source and the light emitting diodes are attached to the personal attachment means to produce a personal light source for on-site analysis of latent fingerprints. The present invention is available for other applications as well.

  20. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    Science.gov (United States)

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with a 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with a 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is
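    The draw-and-adjust loop described above can be sketched as a simple Monte Carlo. The starting numbers reproduce the reported conventional result (HR 2.6, 95% CI 0.7 to 9.4), but the bias-parameter distribution here is an invented placeholder, not the distribution assigned in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Conventional result from the abstract: HR = 2.6, 95% CI 0.7 to 9.4.
hr, lo, hi = 2.6, 0.7, 9.4
se_log = (np.log(hi) - np.log(lo)) / (2 * 1.96)  # SE of log-HR implied by the CI

adjusted = []
for _ in range(50_000):
    # Random error: resample the log hazard ratio.
    log_hr = rng.normal(np.log(hr), se_log)
    # Systematic error: a hypothetical multiplicative bias away from the null,
    # lognormal with median 1.5 (an illustrative assumption only).
    bias = rng.lognormal(mean=np.log(1.5), sigma=0.3)
    adjusted.append(np.exp(log_hr) / bias)

median, sim_lo, sim_hi = np.percentile(adjusted, [50, 2.5, 97.5])
```

    Dividing out a bias factor pulls the point estimate toward the null and widens the interval relative to the conventional CI, which is the qualitative pattern reported in the abstract.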

  1. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty

    Directory of Open Access Journals (Sweden)

    Lash Timothy L

    2007-11-01

    Background: The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. Methods: For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. Results: The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with a 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with a 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Conclusion: Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a

  2. Studies and modeling of cold neutron sources; Etude et modelisation des sources froides de neutron

    Energy Technology Data Exchange (ETDEWEB)

    Campioni, G

    2004-11-15

    With the purpose of updating knowledge in the field of cold neutron sources, the work of this thesis was organized along the following three axes. First, the gathering of the specific information forming the material of this work. This set of knowledge covers the following fields: cold neutrons, cross-sections for the different cold moderators, flux slowing down, different measurements of the cold flux and, finally, issues in the thermal analysis of the problem. Second, the study and development of suitable computation tools. After an analysis of the problem, several tools were planned, implemented and tested in the 3-dimensional radiation transport code Tripoli-4. In particular, a decoupling module, integrated in the official version of Tripoli-4, can perform Monte-Carlo parametric studies with CPU-time savings of up to a factor of 50. A coupling module, simulating neutron guides, has also been developed and implemented in the Monte-Carlo code McStas. Third, a complete study was carried out to validate the installed calculation chain. These studies focus on 3 cold sources currently in operation: SP1 of the Orphee reactor and 2 other sources (SFH and SFV) of the HFR at the Laue Langevin Institute. These studies give examples of problems and methods for the design of future cold sources.

  3. Application of the Frequency Map Analysis to the Study of the Beam Dynamics of Light Sources

    International Nuclear Information System (INIS)

    Nadolski, Laurent

    2001-01-01

    The topic of this thesis is the study of beam dynamics in storage rings, restricted to single-particle transverse dynamics. In the first part, tools (Frequency Map Analysis, Hamiltonian, integrator) are presented for studying and exploring the dynamics. Numerical simulations of four synchrotron radiation sources (the ALS, the ESRF, SOLEIL and Super-ACO) are performed. We construct a tracking code based on a new class of symplectic integrators (Laskar and Robutel, 2000). These integrators, with only positive steps, are more precise by an order of magnitude than the standard Forest and Ruth scheme. Comparisons with the BETA, DESPOT and MAD codes are carried out. Frequency Map Analysis (Laskar, 1990) is our main analysis tool. This is a numerical method for analysing a conservative dynamical system. Based on a refined Fourier technique, it enables us to compute frequency maps which are real footprints of the beam dynamics of an accelerator. We stress the high sensitivity of the dynamics to magnetic errors and sextupolar strengths. The second part of this work is dedicated to the analysis of experimental results from two light sources. Together with the ALS accelerator team (Berkeley), we succeeded in obtaining the first experimental frequency map of an accelerator. The agreement with the machine model is very impressive. At the Super-ACO ring, the study of the tune shift with amplitude enabled us to highlight a strong octupolar-like component related to the quadrupole fringe field. The consequences for the beam dynamics are important and give us a better understanding of the measured ring performance. All these results are based on turn-by-turn measurements. Many closely related phenomena are treated, such as response matrix analysis and beam decoherence. (author)

  4. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it a widely applicable and highly skill-demanding system. There is a lot of commercial GIS software that is well advertised and whose functionality is fairly well known, while open source software tends to be overlooked. In this diploma work, an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  5. Blind source separation dependent component analysis

    CERN Document Server

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers with a complete and self-contained body of knowledge about dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation, in which three promising blind separation techniques that can tackle mutually correlated sources are presented. The book then focuses on the non-negativity-based methods, the time-frequency-analysis-based methods, and the pre-coding-based methods.
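
    As a baseline for the dependent-source methods the book surveys, classical blind source separation under the independence assumption can be sketched in a few lines (a minimal symmetric FastICA on a synthetic two-source mixture; the dependent-source techniques in the book relax exactly the independence assumption used here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian sources mixed by an unknown matrix
n = 5000
S = np.vstack([np.sign(rng.standard_normal(n)) * rng.uniform(0.5, 1.5, n),
               rng.laplace(size=n)])
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Whiten the mixtures
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ Xc

# Symmetric FastICA with a tanh nonlinearity
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = G @ Z.T / n - np.diag((1.0 - G**2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                      # decorrelate the rows
Y = W @ Z                           # recovered sources (up to order/sign)
```

    With mutually correlated sources this procedure breaks down, which is the scenario the book's three technique families address.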

  6. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentrations and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used interchangeably, both for air quality planning purposes and for quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
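
    The paper's central point can be reproduced with a toy model (both functional forms below are hypothetical): under a linear concentration-emission relationship, brute-force impacts sum exactly to the total concentration, while a nonlinear coupling term breaks this equivalence:

```python
# Toy concentration models with two emission sources e1, e2
def linear(e1, e2):
    return 2.0 * e1 + 3.0 * e2

def nonlinear(e1, e2):
    # hypothetical chemistry-like coupling term
    return 2.0 * e1 + 3.0 * e2 + 1.5 * e1 * e2

def brute_force_impacts(model, e1, e2):
    """Impact of each source = concentration change when it is zeroed out."""
    base = model(e1, e2)
    return base - model(0.0, e2), base - model(e1, 0.0)

for model in (linear, nonlinear):
    base = model(1.0, 1.0)
    i1, i2 = brute_force_impacts(model, 1.0, 1.0)
    print(model.__name__, base, i1 + i2)
```

    In the linear case the two impacts (2.0 and 3.0) sum to the total concentration of 5.0; in the nonlinear case the impacts sum to 8.0 against a total of 6.5, so reading them as "contributions" would over-allocate mass.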

  7. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying the electron-capture in 163 Ho as a method for determining the mass of electron neutrino. The 163 Ho sources were produced with the 164 Dy(p,2n) reaction by means of a method of internal irradiation 2 ). We applied the PIXE method to determine the total number of 163 Ho atoms in the source. Proton beams of 3 MeV and a method of ''external standard'' were employed for nondestructive analysis of the 163 Ho source as well as an additional method of ''internal standard''. (author)

  8. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher-order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive, alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis, where all sources are assumed non-Gaussian. PMID:25811988
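
    The model-selection contrast the abstract draws can be illustrated generically: below, a cross-validated held-out likelihood selects how many covariance eigen-directions to model explicitly, with the remainder treated as an isotropic subspace (a probabilistic-PCA-style sketch on synthetic data, not the mixed ICA/PCA algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 5-D data with a 2-D "signal" subspace plus isotropic noise
n, p, k_true = 600, 5, 2
W = rng.standard_normal((k_true, p)) * 5.0
X = rng.standard_normal((n, k_true)) @ W + rng.standard_normal((n, p))

def gaussian_loglik(X, mu, cov):
    d = X - mu
    _, logdet = np.linalg.slogdet(cov)
    quad = np.einsum('ij,ji->i', d, np.linalg.solve(cov, d.T))
    return -0.5 * (X.shape[1] * np.log(2 * np.pi) + logdet + quad).sum()

def ppca_cov(X, k):
    """Covariance with k retained eigen-directions; the remaining
    directions share one pooled (isotropic) variance."""
    evals, evecs = np.linalg.eigh(np.cov(X.T))
    evals, evecs = evals[::-1], evecs[:, ::-1]
    sigma2 = evals[k:].mean()
    lam = np.concatenate([evals[:k], np.full(len(evals) - k, sigma2)])
    return (evecs * lam) @ evecs.T

# 2-fold cross-validated log-likelihood for each candidate k
half = n // 2
scores = []
for k in range(p):
    s = 0.0
    for tr, te in [(slice(0, half), slice(half, n)),
                   (slice(half, n), slice(0, half))]:
        mu = X[tr].mean(axis=0)
        s += gaussian_loglik(X[te], mu, ppca_cov(X[tr], k))
    scores.append(s)

best_k = int(np.argmax(scores))
print(best_k)
```

    The held-out likelihood penalizes both underfitting (too few explicit directions) and the overfitting that an in-sample criterion can miss, which is the behaviour the abstract reports for the AIC.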

  9. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  10. Overview of receptor-based source apportionment studies for speciated atmospheric mercury

    OpenAIRE

    Cheng, I.; Xu, X.; Zhang, L.

    2015-01-01

    Receptor-based source apportionment studies of speciated atmospheric mercury are not only concerned with source contributions but also with the influence of transport, transformation, and deposition processes on speciated atmospheric mercury concentrations at receptor locations. Previous studies applied multivariate receptor models including principal components analysis and positive matrix factorization, and back trajectory receptor models including potential source contri...

  11. Proposed Sources of Coaching Efficacy: A Meta-Analysis.

    Science.gov (United States)

    Myers, Nicholas D; Park, Sung Eun; Ahn, Soyeon; Lee, Seungmin; Sullivan, Philip J; Feltz, Deborah L

    2017-08-01

    Coaching efficacy refers to the extent to which a coach believes that he or she has the capacity to affect the learning and performance of his or her athletes. The purpose of the current study was to empirically synthesize findings across the extant literature to estimate relationships between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. A literature search yielded 20 studies and 278 effect size estimates that met the inclusion criteria. The overall relationship between the proposed sources of coaching efficacy and each dimension of coaching efficacy was positive and ranged from small to medium in size. Coach gender and level coached moderated the overall relationship between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. Results from this meta-analysis provided some evidence for both the utility of, and possible revisions to, the conceptual model of coaching efficacy.
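
    The pooling step underlying such a meta-analysis can be sketched with the standard DerSimonian-Laird random-effects estimator (the effect sizes and variances below are invented for illustration, not taken from the 20 included studies):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird)."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)          # fixed-effect pooled mean
    q = np.sum(w * (y - fixed) ** 2)           # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)    # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical per-study effect sizes and sampling variances
pooled, se, tau2 = dersimonian_laird(
    [0.10, 0.45, 0.30, 0.05, 0.50],
    [0.010, 0.010, 0.010, 0.010, 0.010])
print(round(pooled, 3), round(se, 3), round(tau2, 4))
```

    With equal sampling variances the pooled estimate reduces to the simple mean, while the positive tau-squared widens the standard error to reflect between-study heterogeneity.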

  12. Moessbauer spectroscopy and X-ray fluorescence analysis in studies to determine the sources of several prehispanic objects

    International Nuclear Information System (INIS)

    Arriola S, H.; Ramos R, P.; Castro V, P.; Jimenez R, A.; Flores D, F.; Garcia Moreno C, C.

    1980-01-01

    A study by the Moessbauer effect and X-ray fluorescence analysis of Mexican prehispanic ceramic specimens is presented. Several iron compounds in the ceramics are determined; the different iron compounds indicate different sources of the clays and different types of kilns used with them. These compounds are identified by the different oxidation states of iron, Fe 3+ and Fe 2+ . (author)

  13. Study of non-metallic inclusion sources in steel

    International Nuclear Information System (INIS)

    Khons, Ya.; Mrazek, L.

    1976-01-01

    A study of potential inclusion sources was carried out at the Tvinec steel plant using a unified labelling procedure for the different sources. A lanthanum oxide labelling method was used for refractories, with subsequent La determination in steel by neutron activation analysis. Samarium and cerium oxides and the 141 Ce radionuclide were used in conjunction with the testing. The following sources of exogenous inclusions were studied: 1) refractory material, comprising fireclay and corundum, for steel-teeming troughs in open-hearth furnaces; 2) fireclay bottom-pouring refractories; 3) steel-teeming ladle lining; 4) heat-insulating and exothermic compounds for steel ingots; 5) vacuum treatment plant lining; 6) open-hearth and electric arc furnace slag. The major source of oxide inclusions in steel was found to be the furnace slag, since it forms about 40 per cent of all oxide inclusions. The contributions of the remaining sources did not exceed 5 per cent each

  14. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised data mining methods to explore the facts regarding the crimes in an area of interest. The analysis is based on well-known clustering and association techniques. The results show...

  15. Comparative analysis of methods and sources of financing of the transport organizations activity

    Science.gov (United States)

    Gorshkov, Roman

    2017-10-01

    The article presents an analysis of the methods of financing transport organizations under conditions of limited investment resources. A comparative analysis of these methods is carried out, and a classification of investments and of the methods and sources of financial support for projects implemented to date is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and of financial support for the activities of a transport organization were analyzed from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing for transport organizations.

  16. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and an overview of the implementation of source term analysis in regulatory decisions

  17. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).
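
    The kind of per-source quantification such a methodology performs can be sketched as a simple surface-by-surface annual load budget (all areas, runoff coefficients and concentrations below are invented for illustration; they are not the study's values):

```python
# Hypothetical annual pollutant load per source surface:
#   load = area * runoff coefficient * rainfall * event mean concentration
rainfall_mm = 650.0

surfaces = {            # area_ha, runoff_coef, zn_conc_mg_per_L (all assumed)
    "roofs":       (40.0, 0.90, 0.80),
    "roads":       (30.0, 0.85, 0.35),
    "green_space": (158.0, 0.15, 0.02),
}

def annual_load_kg(area_ha, coef, conc):
    volume_L = area_ha * 1e4 * (rainfall_mm / 1000.0) * coef * 1000.0
    return volume_L * conc * 1e-6       # mg -> kg

loads = {k: annual_load_kg(*v) for k, v in surfaces.items()}
total = sum(loads.values())
for k, v in loads.items():
    print(f"{k}: {v:.0f} kg/yr ({100 * v / total:.0f}%)")
```

    Propagating uncertainty through such a budget (e.g. by sampling the coefficients from distributions) is the second half of the methodology the abstract describes.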

  18. A Pilot Study of EEG Source Analysis Based Repetitive Transcranial Magnetic Stimulation for the Treatment of Tinnitus.

    Directory of Open Access Journals (Sweden)

    Hui Wang

    Full Text Available Repetitive Transcranial Magnetic Stimulation (rTMS) is a novel therapeutic tool for inducing suppression of tinnitus. However, the optimal target sites are unknown. We aimed to determine whether low-frequency rTMS induced lasting suppression of tinnitus by decreasing neural activity in the cortex, navigated by high-density electroencephalogram (EEG) source analysis, and to assess the utility of EEG for targeting treatment. In this controlled three-armed trial, seven normal-hearing patients with tonal tinnitus received a 10-day course of 1-Hz rTMS to the cortex navigated by high-density EEG source analysis, to the left temporoparietal cortex region, and to the left temporoparietal region with sham stimulation. The Tinnitus Handicap Inventory (THI) and a visual analog scale (VAS) were used to assess tinnitus severity and loudness. Measurements were taken before, and immediately, 2 weeks, and 4 weeks after the end of the interventions. Low-frequency rTMS decreased tinnitus significantly after active, but not sham, treatment. Responders in the EEG source analysis based rTMS group, 71.4% (5/7) of patients, experienced a significant reduction in tinnitus loudness, as evidenced by VAS scores. The target site of the neuronal generators most consistently associated with a positive response was the frontal lobe in the right hemisphere, localized using high-density EEG equipment in the tinnitus patients. After left temporoparietal rTMS stimulation, 42.8% (3/7) of patients experienced a decrease in tinnitus loudness. Active EEG source analysis based rTMS resulted in significant suppression of tinnitus loudness, showing the superiority of neuronavigation-guided coil positioning in dealing with tinnitus. Non-auditory areas should be considered in the pathophysiology of tinnitus. This knowledge can in turn contribute to investigating the pathophysiology of tinnitus.

  19. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    One of the assumptions frequently made is that of a ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate (i.e., larger radiation doses for all downwind receptors, etc.).' This recommendation is defensible from the standpoint of conservatism, but a quantitative examination of the effect of this assumption on the results of consequence analysis is necessary. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the biggest sources of the differences between their results was the different effective source release height assumed by each study. This supports the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed and the influence on the total effective dose was quantitatively examined. A difference of more than 20% is maintained even at longer distances when the dose calculated assuming a ground-level release is compared with the results assuming other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the assumption of a ground-level release fundamentally precludes detailed analysis, including diffusion of the plume from the effective plume height to the ground, even though its influence is relatively lower at longer distances. When the influence of surface roughness is additionally considered, the situation could be more serious. The ground-level dose could be highly over-estimated at short downwind distances at NPP sites with low surface roughness, such as the Barakah site in
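
    Why a ground-level release is conservative, and why the conservatism shrinks with distance, can be illustrated with a toy Gaussian plume model (crude linear dispersion growth and hypothetical parameters; real consequence codes such as HotSpot use stability-class sigma curves):

```python
import math

def ground_conc(Q, u, x, H, a=0.08, b=0.06):
    """Ground-level centreline concentration of a Gaussian plume from a
    source of strength Q at effective height H, wind speed u, downwind
    distance x. Linear sigma growth is a deliberate simplification."""
    sy, sz = a * x, b * x
    return Q / (math.pi * u * sy * sz) * math.exp(-H**2 / (2 * sz**2))

Q, u = 1.0, 3.0      # unit source strength, 3 m/s wind (assumed)
for x in (500.0, 2000.0, 10000.0):
    ratio = ground_conc(Q, u, x, 0.0) / ground_conc(Q, u, x, 50.0)
    print(f"x = {x:>7.0f} m  ground/elevated concentration ratio = {ratio:.2f}")
```

    In this toy model the ground-release overestimate is largest near the source and decays as the vertical spread grows past the release height; the magnitude of the residual difference at long range depends on the dispersion parameterization, which is what the study quantifies properly.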

  20. FECAL SOURCE TRACKING BY ANTIBIOTIC RESISTANCE ANALYSIS ON A WATERSHED EXHIBITING LOW RESISTANCE

    Science.gov (United States)

    The ongoing development of microbial source tracking has made it possible to identify contamination sources with varying accuracy, depending on the method used. The purpose of this study was to test the efficiency of the antibiotic resistance analysis (ARA) method under low ...

  1. Sustainability in Open Source Software Commons: Lessons Learned from an Empirical Study of SourceForge Projects

    Directory of Open Access Journals (Sweden)

    Charles M. Schweik

    2013-01-01

    Full Text Available In this article, we summarize a five-year US National Science Foundation funded study designed to investigate the factors that lead some open source projects to ongoing collaborative success while many others become abandoned. Our primary interest was to conduct a study that was closely representative of the population of open source software projects in the world, rather than focus on the more-often studied, high-profile successful cases. After building a large database of projects (n=174,333) and implementing a major survey of open source developers (n=1403), we were able to conduct statistical analyses to investigate over forty theoretically-based testable hypotheses. Our data firmly support what we call the conventional theory of open source software, showing that projects start small, and, in successful cases, grow slightly larger in terms of team size. We describe the “virtuous circle” supporting conventional wisdom of open source collaboration that comes out of this analysis, and we discuss two other interesting findings related to developer motivations and how team members find each other. Each of these findings is related to the sustainability of these projects.

  2. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from the insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the most seismically active zone of Turkey. The second analysis is for Akkuyu
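
    The core integration in such a seismic risk analysis can be sketched in Cornell's classical form: a magnitude-recurrence law combined with an attenuation relation whose lognormal scatter plays the role of the random correction factors mentioned in the abstract (all coefficients below are illustrative, not calibrated to Turkey):

```python
import math

def exceedance_rate(a_target, r_km, nu0=0.2, b=1.0, m_min=5.0, m_max=7.5,
                    sigma_ln=0.5, dm=0.05):
    """Annual rate of exceeding PGA a_target (g) at distance r_km from a
    point source: truncated Gutenberg-Richter recurrence combined with a
    toy attenuation law and lognormal scatter."""
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    rate, m = 0.0, m_min + dm / 2.0
    while m < m_max:
        # truncated exponential magnitude density
        f_m = beta * math.exp(-beta * (m - m_min)) / norm
        # toy attenuation: median PGA in g (hypothetical coefficients)
        median = math.exp(-3.5 + 0.9 * m - 1.2 * math.log(r_km + 10.0))
        z = (math.log(a_target) - math.log(median)) / sigma_ln
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))
        rate += nu0 * f_m * p_exceed * dm
        m += dm
    return rate

# Annual rate of exceeding 0.2 g at 20 km from the source
print(f"{exceedance_rate(0.2, 20.0):.2e}")
```

    Summing such rates over point, line and area sources, and averaging over alternative parameter assumptions, yields the mean risk curve described in the abstract.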

  3. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Mohd, Shukri [Nondestructive Testing Group, Industrial Technology Division, Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Holford, Karen M.; Pullin, Rhys [Cardiff School of Engineering, Cardiff University, Queen's Buildings, The Parade, CARDIFF CF24 3AA (United Kingdom)

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with that of the DeltaT location method but requires no initial acoustic calibration of the structure.
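
    The TOA baseline against which WTML is compared can be sketched as follows (a hypothetical four-sensor layout on the unwrapped pipe surface and an assumed constant wave speed; in reality Lamb-wave velocities are dispersive and mode-dependent, which is exactly what motivates the WTML approach):

```python
import numpy as np

# Four AE sensors at the corners of the unwrapped pipe surface
# (1.5 m length x ~0.69 m circumference), positions in metres
sensors = np.array([[0.0, 0.0], [1.5, 0.0], [1.5, 0.69], [0.0, 0.69]])
c = 3000.0                        # assumed constant wave speed, m/s
true_src = np.array([0.9, 0.4])   # synthetic source position

# Observed arrival-time differences (only differences are measurable)
dt = np.linalg.norm(sensors - true_src, axis=1) / c
dt -= dt.min()

# Grid search for the point whose predicted differences best match
best, best_err = None, np.inf
for x in np.linspace(0.0, 1.5, 301):
    for y in np.linspace(0.0, 0.69, 139):
        d = np.linalg.norm(sensors - [x, y], axis=1) / c
        d -= d.min()
        err = np.sum((d - dt) ** 2)
        if err < best_err:
            best, best_err = (x, y), err
print(best)
```

    Any error in the assumed wave speed maps directly into a location error here, whereas DeltaT calibrates the structure empirically and WTML extracts modal arrival times, avoiding that calibration step.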

  4. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Shukri Mohd

    2013-01-01

    Full-text: Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with that of the DeltaT location method but requires no initial acoustic calibration of the structure. (author)

  5. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with that of the DeltaT location method but requires no initial acoustic calibration of the structure.

  6. Dosimetric analysis of radiation sources for use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams and orthovoltage X-rays, electron beams and radioactive sources ( 192 Ir, 198 Au and 90 Sr) arranged on a surface mould or in a metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by the radiation sources used in radiotherapy procedures for skin lesions. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo method. The computational results obtained with the MCNP4C code were in good agreement with the experimental measurements and were physically consistent, as expected. The comparison of the experimental measurements with the MCNP4C calculations has been used to validate the code results and to provide a reliable medical application for each clinical case. (author)

  7. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    Science.gov (United States)

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.

  8. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  9. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution

    International Nuclear Information System (INIS)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R.

    2015-01-01

    This work is intended to explain the challenges of the fingerprint-based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all sources determined by positive matrix factorization (PMF) are distinguishable, owing to the variability of source fingerprints; however, they constitute useful suggestions for the inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. The major PAH sources for Illinois River sediments are traffic (35%), coke ovens (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires results validation but is useful for source suggestion. • Bayesian CMB provides a practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprint-based PAH source apportionment method
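
    The deterministic core of a CMB analysis can be sketched as a non-negative least-squares fit of source fingerprints to a receptor profile (the fingerprint fractions, compounds and contribution values below are invented for illustration, not the study's data; the Bayesian CMB additionally propagates fingerprint and measurement uncertainty):

```python
import numpy as np
from scipy.optimize import nnls

# Rows: PAH compounds; columns: hypothetical source fingerprints
# (traffic, coke oven, coal combustion, wood combustion)
fingerprints = np.array([
    [0.40, 0.10, 0.20, 0.10],
    [0.20, 0.50, 0.10, 0.05],
    [0.30, 0.20, 0.50, 0.25],
    [0.10, 0.20, 0.20, 0.60],
])
true_contrib = np.array([3.5, 2.4, 1.8, 1.4])   # assumed contributions
observed = fingerprints @ true_contrib + 0.01 * np.array([1, -1, 0.5, -0.5])

# Non-negative least squares: observed = fingerprints @ contrib, contrib >= 0
contrib, resid = nnls(fingerprints, observed)
print(np.round(contrib, 2))
```

    The non-negativity constraint is what keeps the mass balance physically meaningful; the abstract's point is that the fit degrades when the fingerprint columns themselves are uncertain, which the Bayesian treatment addresses.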

  10. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Science.gov (United States)

    Muthuraman, Muthuraman; Hellriegel, Helge; Hoogenboom, Nienke; Anwar, Abdul Rauf; Mideksa, Kidist Gebremariam; Krause, Holger; Schnitzler, Alfons; Deuschl, Günther; Raethjen, Jan

    2014-01-01

    Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.
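
    The DICS-style scanning the study uses can be sketched in a reduced form: a unit-gain minimum-variance spatial filter, built from the sensor cross-spectral density at the frequency of interest, is applied at each candidate source location (random leadfield columns stand in for a real head model; this is a sketch of the beamforming principle, not the published pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)

n_sens, n_src, n_epochs = 32, 50, 200
L = rng.standard_normal((n_sens, n_src))   # stand-in leadfield matrix
active = [10, 37]                           # indices of truly active sources

# Simulated Fourier coefficients at one frequency, plus sensor noise
S = np.zeros((n_src, n_epochs), complex)
S[active] = (rng.standard_normal((2, n_epochs))
             + 1j * rng.standard_normal((2, n_epochs)))
X = L @ S + 0.5 * (rng.standard_normal((n_sens, n_epochs))
                   + 1j * rng.standard_normal((n_sens, n_epochs)))

C = X @ X.conj().T / n_epochs               # sensor cross-spectral density
Ci = np.linalg.inv(C + 1e-6 * np.trace(C).real / n_sens * np.eye(n_sens))

power = np.empty(n_src)
for i in range(n_src):
    l = L[:, i:i + 1]
    w = Ci @ l @ np.linalg.inv(l.conj().T @ Ci @ l)   # unit-gain filter
    power[i] = (w.conj().T @ C @ w).real.item()

top2 = set(np.argsort(power)[-2:])
print(sorted(top2))
```

    Once the power map has identified the coherent sources, their filter outputs give the reconstructed source signals on which directed-connectivity measures such as RPDC are computed.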

  11. A Study on Improvement of Algorithm for Source Term Evaluation

    International Nuclear Information System (INIS)

    Park, Jeong Ho; Park, Do Hyung; Lee, Jae Hee

    2010-03-01

The program developed by KAERI for the source term assessment of radwastes from the advanced nuclear fuel cycle consists of a spent fuel database analysis module, a spent fuel arising projection module, and an automatic characterization module for radwastes from pyroprocessing. To improve the algorithm adopted in the developed program, the following items were carried out: - development of an algorithm to decrease the analysis time for the spent fuel database - development of a setup routine for the analysis procedure - improvement of the interface for the spent fuel arising projection module - optimization of the data management algorithm needed for the massive calculations required to estimate source terms of radwastes from the advanced fuel cycle. The program developed through this study can perform source term estimation even when several spent fuel assemblies with different fuel designs, initial enrichments, irradiation histories, discharge burnups, and cooling times are processed at the same time in the pyroprocess. It is expected that this program will be very useful for the design of the unit processes of pyroprocessing and of the disposal system.

  12. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double-contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here.

  13. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infra-structure and

  14. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    Science.gov (United States)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). Copyright © 2015. Published by Elsevier Ltd.

  15. Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection

    Science.gov (United States)

    2015-12-01

…on some occasions, performance is terminated early; this can occur due to either mutual agreement or a breach of contract by one of the parties (Garrett… …on the contract management process, with special emphasis on the source selection methods of tradeoff and lowest price technically acceptable (LPTA…

  16. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Science.gov (United States)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in the tank area directly or indirectly lead to the occurrence of large safety accidents. According to the three types of hazard source theory and consequence-cause analysis of large safety accidents, the hazard sources of large safety accidents in the tank area are analyzed from four aspects: energy sources, direct causes of large safety accidents, management deficiencies, and environmental impacts. Based on the analysis of the three types of hazard sources and the environmental analysis, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four categories of risk factors and of the factors within each category are obtained. The result of the analytic hierarchy process shows that management causes are the most important, followed by environmental factors, and then the direct causes and energy sources. It should be noted that although the direct causes are of relatively low overall importance, the factors "failure of emergency measures" and "failure of prevention and control facilities" within the direct causes carry greater weight.
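
The weighting step described above can be sketched with Saaty's analytic hierarchy process: a pairwise comparison matrix is reduced to its principal eigenvector, and a consistency ratio guards against contradictory judgments. The comparison values below are illustrative, not taken from the study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector weights and consistency ratio (Saaty's AHP)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                           # normalized priority weights
    ci = (eigvals[k].real - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
    return w, ci / ri                         # weights, consistency ratio

# Hypothetical judgments for the four factor groups in the abstract:
# management > environment > direct cause > energy source.
A = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]
weights, cr = ahp_weights(A)   # management receives the largest weight
```

A consistency ratio below 0.1 is the conventional threshold for accepting the judgments.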

  17. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and of independent component analysis is not very clear, and their solution is not unique. To address these disadvantages, a new linear instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The assumption of this new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and the uncorrelatedness of source signals. Based on this theoretical link, the source signal separation and reconstruction problem is then transformed into a PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both waveform and amplitude information of uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only waveform information can be separated and reconstructed when the mixing matrix is column-orthogonal but not normalized; and uncorrelated source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal.
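
The column-orthogonal, normalized case can be checked numerically: with an orthonormal mixing matrix and uncorrelated zero-mean sources of distinct variances, the principal components of the observations recover the sources up to sign and order. This is a minimal sketch; the signals and the rotation angle are invented.

```python
import numpy as np

n = 1000
t = np.arange(n) / n
# Two zero-mean, exactly uncorrelated sources with distinct variances
# (sinusoids with an integer number of cycles are sample-orthogonal).
s1 = 1.0 * np.sin(2 * np.pi * 5 * t)
s2 = 3.0 * np.sin(2 * np.pi * 13 * t)
S = np.vstack([s1, s2])

theta = np.deg2rad(30)                       # column-orthonormal mixing
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = Q @ S                                    # observed mixed signals

C = X @ X.T / n                              # sample covariance
eigvals, eigvecs = np.linalg.eigh(C)         # ascending eigenvalue order
Y = eigvecs[:, ::-1].T @ X                   # components, largest variance first

# |correlation| with the true sources is ~1 (up to sign/order ambiguity);
# the largest-variance component matches the stronger source s2.
r1 = abs(np.corrcoef(Y[0], s2)[0, 1])
r2 = abs(np.corrcoef(Y[1], s1)[0, 1])
```

Breaking the orthonormality of `Q` degrades the recovery, consistent with the abstract's third case.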

  18. Analysis of the Structure Ratios of the Funding Sources

    Directory of Open Access Journals (Sweden)

    Maria Daniela Bondoc

    2014-06-01

Full Text Available The funding sources of the assets and liabilities in the balance sheet include the equity capital and the debts of the entity. The analysis of the structure rates of the funding sources allows for making assessments related to the funding policy, highlighting the financial autonomy and how resources are provided. Using the literature specializing in economic and financial analysis, this paper aims at presenting these rates, which focus, on the one hand, on reflecting the degree of financial dependence (the rate of financial stability, the rate of global financial autonomy, the rate of on-term financial autonomy) and, on the other hand, on the debt structure (the rate of short-term debts, the global indebtedness rate, the on-term indebtedness rate). Based on the financial statements of an entity in the Argeş County, I analysed these indicators, and I drew conclusions and made assessments related to the autonomy, indebtedness and financial stability of the studied entity.

  19. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Directory of Open Access Journals (Sweden)

    Muthuraman Muthuraman

Full Text Available Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.

  20. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. 
Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  1. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is performed using the results of a synthetic dataset for which "a priori" references are available. The consensus-modulated standard deviation punc is the best choice for model performance evaluation when a conservative approach is adopted.
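
The benchmarking logic (an ensemble reference value plus an uncertainty used to standardize each candidate result) can be sketched as below. The values are invented, and the sample standard deviation merely stands in for the consensus-modulated uncertainty punc.

```python
import numpy as np

passed = np.array([4.2, 4.6, 4.4, 4.5])  # results that passed the screening
ref = passed.mean()                      # ensemble reference value
u = passed.std(ddof=1)                   # stand-in for the uncertainty punc

def z_score(candidate, ref, u):
    """Standardized deviation of a candidate result from the reference."""
    return (candidate - ref) / u

z = z_score(4.3, ref, u)
acceptable = abs(z) <= 2.0               # a common |z| <= 2 acceptance rule
```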

  2. Comparative analysis of traditional and alternative energy sources

    Directory of Open Access Journals (Sweden)

    Adriana Csikósová

    2008-11-01

Full Text Available The presented thesis, entitled Comparative analysis of traditional and alternative energy sources, includes, on the basis of theoretical information sources, research in the firm, internal data, trends in company and market development, a description of the problem and its application. The theoretical part is dedicated to traditional and alternative energy resources, their reserves, trends in their use and development, and their balance in the world, the EU and Slovakia. The analytical part of the thesis reflects the profile of the company and an evaluation of the thermal pump market using the General Electric method. Since the company implements, among other products, thermal pumps based on geothermal energy and ambient energy (air), the mission of the comparative analysis is to compare traditional energy resources with the thermal pump from the ecological, utility and economic points of view. The results of the comparative analysis are summarized in a SWOT analysis. The thesis also includes a questionnaire proposal for improving effectiveness and analysing customer satisfaction, and the expected possibilities of support for alternative energy resources (benefits from the government and EU funds).

  3. Dosimetric analysis of radiation sources to use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams: orthovoltage X-rays, electron beams and radioactive sources (192Ir, 198Au, and 90Sr) arranged on a surface mold or in a metal applicator. This study aims to analyze the therapeutic radiation dose profiles produced by the radiation sources used in skin lesion radiotherapy procedures. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo method. The computational results were in good agreement with the experimental measurements. Experimental measurements and computational results obtained with the MCNP4C code have been used to validate the calculations and to provide a reliable medical application for each clinical case. (author)

  4. Spatiotemporal source analysis in scalp EEG vs. intracerebral EEG and SPECT: a case study in a 2-year-old child.

    Science.gov (United States)

    Aarabi, A; Grebe, R; Berquin, P; Bourel Ponchel, E; Jalin, C; Fohlen, M; Bulteau, C; Delalande, O; Gondry, C; Héberlé, C; Moullart, V; Wallois, F

    2012-06-01

This case study aims to demonstrate that spatiotemporal spike discrimination and source analysis are effective for monitoring the development of sources of epileptic activity in time and space. Therefore, they can provide clinically useful information, allowing a better understanding of the pathophysiology of individual seizures with time- and space-resolved characteristics of successive epileptic states, including interictal, preictal, postictal, and ictal states. High spatial resolution scalp EEGs (HR-EEG) were acquired from a 2-year-old girl with refractory central epilepsy and single-focus seizures, as confirmed by intracerebral EEG recordings and ictal single-photon emission computed tomography (SPECT). Evaluation of HR-EEG consists of the following three global steps: (1) creation of the initial head model, (2) automatic spike and seizure detection, and finally (3) source localization. During the source localization phase, epileptic states are determined to allow state-based spike detection and localization of the underlying sources for each spike. In a final cluster analysis, localization results are integrated to determine the possible sources of epileptic activity. The results were compared with the cerebral locations identified by intracerebral EEG recordings and SPECT. The results obtained with this approach were concordant with those of MRI, SPECT and the distribution of intracerebral potentials. Dipole cluster centres found for spikes in interictal, preictal, ictal and postictal states were situated an average of 6.3 mm from the intracerebral contacts with the highest voltage. Both the amplitude and the shape of spikes change between states. Dispersion of the dipoles was higher in the preictal state than in the postictal state. Two clusters of spikes were identified. The centres of these clusters changed position periodically during the various epileptic states. High-resolution surface EEG evaluated by an advanced algorithmic approach can be used to investigate the

  5. School adjustment of children in residential care: a multi-source analysis.

    Science.gov (United States)

    Martín, Eduardo; Muñoz de Bustillo, María del Carmen

    2009-11-01

School adjustment is one of the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of Perceptive Attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado" (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.

  6. Source-system windowing for speech analysis

    NARCIS (Netherlands)

    Yegnanarayana, B.; Satyanarayana Murthy, P.; Eggen, J.H.

    1993-01-01

    In this paper we propose a speech-analysis method to bring out characteristics of the vocal tract system in short segments which are much less than a pitch period. The method performs windowing in the source and system components of the speech signal and recombines them to obtain a signal reflecting

  7. Meta-analysis on Methane Mitigating Properties of Saponin-rich Sources in the Rumen: Influence of Addition Levels and Plant Sources

    Directory of Open Access Journals (Sweden)

    Anuraga Jayanegara

    2014-10-01

Full Text Available Saponins have been considered promising natural substances for mitigating methane emissions from ruminants. However, studies have reported that the addition of saponin-rich sources often arrives at contrasting results, i.e. either it decreases methane or it does not. The aim of the present study was to assess ruminal methane emissions through a meta-analytical approach integrating related studies from published papers describing various levels of different saponin-rich sources added to ruminant feed. A database was constructed from published literature reporting the addition of saponin-rich sources at various levels and the subsequent monitoring of ruminal methane emissions in vitro. Accordingly, the levels of saponin-rich source addition as well as the different saponin sources were specified in the database. Apart from methane, other related rumen fermentation parameters were also included in the database, i.e. organic matter digestibility, gas production, pH, ammonia concentration, short-chain fatty acid profiles and protozoal count. A total of 23 studies comprising 89 data points met the inclusion criteria. The data obtained were subsequently subjected to a statistical meta-analysis based on a mixed model methodology. Accordingly, different studies were treated as random effects whereas the levels of saponin-rich source addition or the different saponin sources were considered fixed effects. The model statistics used were the p-value and root mean square error. Results showed that the addition of increasing levels of a saponin-rich source decreased methane emission per unit of substrate incubated as well as per unit of total gas produced. Although some sources appeared more effective than others (e.g. tea > quillaja), statistically they did not differ from each other. It can be concluded that the methane mitigating properties of saponins in the rumen are level- and source-dependent.
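
The mixed-model idea (study as a random effect, addition level as a fixed effect) can be illustrated with within-study centering, which absorbs study-level intercepts before the dose slope is fitted. The numbers below are invented for illustration, not data from the meta-analysis.

```python
import numpy as np

dose  = np.array([0., 2., 4., 0., 2., 4., 0., 4.])          # saponin level
ch4   = np.array([30., 27., 24., 40., 38., 35., 25., 21.])  # methane response
study = np.array([0, 0, 0, 1, 1, 1, 2, 2])                  # study labels

def within_study_slope(x, y, g):
    """OLS slope after centering x and y within each study; this is the
    fixed-effect dose estimate once study intercepts are absorbed."""
    xc = x - np.array([x[g == i].mean() for i in g])
    yc = y - np.array([y[g == i].mean() for i in g])
    return (xc @ yc) / (xc @ xc)

slope = within_study_slope(dose, ch4, study)  # negative: CH4 falls with dose
```

Centering removes between-study differences in baseline methane, so only the within-study dose response drives the estimate.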

  8. The Source Inversion Validation (SIV) Initiative: A Collaborative Study on Uncertainty Quantification in Earthquake Source Inversions

    Science.gov (United States)

    Mai, P. M.; Schorlemmer, D.; Page, M.

    2012-04-01

Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i-iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained from solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state-of-the-art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and how they assess posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from a spontaneous dynamic crack-like strike-slip earthquake on a steeply dipping fault, embedded in a layered crustal velocity-density structure.

  9. Energy and exergy analysis of a double effect absorption refrigeration system based on different heat sources

    International Nuclear Information System (INIS)

    Kaynakli, Omer; Saka, Kenan; Kaynakli, Faruk

    2015-01-01

Highlights: • Energy and exergy analyses were performed on a double effect series flow absorption refrigeration system. • The refrigeration system runs on various heat sources such as hot water, hot air and steam. • A comparative analysis was carried out on these heat sources in terms of exergy destruction and heat source mass flow rate. • The effect of the heat sources on the exergy destruction of the high pressure generator was investigated. - Abstract: Absorption refrigeration systems are environmentally friendly since they can utilize industrial waste heat and/or solar energy. In terms of the heat source of such systems, researchers usually prefer a single type, such as hot water or steam. In this study, energy and exergy analyses are performed on a double effect series flow absorption refrigeration system with water/lithium bromide as the working fluid pair. The refrigeration system runs on various heat sources such as hot water, hot air and steam via the High Pressure Generator (HPG), because hot water/steam and hot air are the most commonly available heat sources for absorption applications; however, the first law of thermodynamics may not be sufficient to analyze the absorption refrigeration system and to show the differences between the heat source types. On the other hand, the operating temperatures of the overall system and its components have a major effect on their performance and functionality. In this regard, a parametric study is conducted here to investigate this effect on the heat capacity and exergy destruction of the HPG, the coefficient of performance (COP) of the system, and the mass flow rate of the heat sources. Also, a comparative analysis is carried out on several heat sources (e.g. hot water, hot air and steam) in terms of exergy destruction and heat source mass flow rate. From the analyses it is observed that the exergy destruction of the HPG increases at higher temperatures of the heat sources, condenser and absorber, and lower
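
A first-law sketch of two quantities compared in the abstract: the COP from the heat balance, and the heat-source mass flow rate needed at the HPG for a sensible heat source. All numbers are illustrative, not taken from the study.

```python
def cop_absorption(q_evap_kw, q_hpg_kw, w_pump_kw=0.0):
    """Coefficient of performance: cooling delivered per unit heat input."""
    return q_evap_kw / (q_hpg_kw + w_pump_kw)

def source_mass_flow(q_hpg_kw, cp_kj_per_kg_k, t_in_c, t_out_c):
    """Mass flow of a sensible heat source (hot water or air) cooling
    from t_in to t_out while supplying q_hpg to the generator."""
    return q_hpg_kw / (cp_kj_per_kg_k * (t_in_c - t_out_c))

cop = cop_absorption(350.0, 300.0)                       # double effect: COP > 1
m_water = source_mass_flow(300.0, 4.18, 150.0, 140.0)    # hot water, kg/s
m_air = source_mass_flow(300.0, 1.005, 150.0, 140.0)     # hot air, kg/s
```

The much larger air flow needed for the same generator duty illustrates why the choice of heat source matters for the design.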

  10. Analysis of Earthquake Source Spectra in Salton Trough

    Science.gov (United States)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
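
The stress-drop estimate from a corner frequency follows the Brune source model: the source radius is r = k·β/fc with k ≈ 0.37 for S waves, and Δσ = 7·M0/(16·r³). A minimal sketch, with the constants being the usual assumed values rather than numbers from this abstract:

```python
def mw_to_m0(mw):
    """Seismic moment in N*m from moment magnitude (Hanks & Kanamori)."""
    return 10.0 ** (1.5 * mw + 9.1)

def brune_stress_drop(m0, fc, beta=3500.0, k=0.37):
    """Brune stress drop [Pa] from moment M0 [N*m], corner frequency
    fc [Hz] and shear-wave speed beta [m/s]."""
    r = k * beta / fc                  # source radius [m]
    return 7.0 * m0 / (16.0 * r ** 3)

ds = brune_stress_drop(mw_to_m0(4.0), fc=2.0)  # on the order of a few MPa
```

Because Δσ scales with fc³, small biases in the measured corner frequency translate into large stress-drop variations, which is why the empirical Green's function correction matters.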

  11. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though … might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.

  12. Optimization of H.E.S.S. instrumental performances for the analysis of weak gamma-ray sources: Application to the study of HESS J1832-092

    International Nuclear Information System (INIS)

    Laffon, H.

    2012-01-01

H.E.S.S. (High Energy Stereoscopic System) is an array of very-high-energy gamma-ray telescopes located in Namibia. These telescopes exploit the atmospheric Cherenkov technique with stereoscopy, allowing the detection of gamma-rays between 100 GeV and a few tens of TeV. The location of the H.E.S.S. telescopes in the Southern hemisphere allows observation of the central parts of our galaxy, the Milky Way. Tens of new gamma-ray sources were thereby discovered thanks to the galactic plane survey strategy. After ten years of fruitful observations with many detections, it is now necessary to improve the detector performance in order to detect new sources, by increasing the sensitivity and improving the angular resolution. The aim of this thesis is the development of advanced analysis techniques allowing sharper analyses. An automatic tool to search for new sources and to improve the subtraction of the background noise is presented; it is optimized for the study of weak sources, which require a very rigorous analysis. A combined reconstruction method is built in order to improve the angular resolution without reducing the statistics, which is critical for weak sources. These advanced methods are applied to the analysis of a complex region of the galactic plane near the supernova remnant G22.7-0.2, leading to the detection of a new source, HESS J1832-092. Multi-wavelength counterparts are shown and several scenarios are considered to explain the origin of the gamma-ray signal from this astrophysical object. (author)

  13. Post-processing of Monte Carlo simulations for rapid BNCT source optimization studies

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-01-01

A great advantage of some neutron sources, such as accelerator-produced sources, is that they can be tuned to produce different spectra. Unfortunately, optimization studies are often time-consuming and difficult, as they require a lengthy Monte Carlo simulation for each source. When multiple characteristics, such as energy, angle, and spatial distribution of a neutron beam are allowed to vary, an overwhelming number of simulations may be required. Many optimization studies, therefore, suffer from a small number of data points, restrictive treatment conditions, or poor statistics. By scoring pertinent information from every particle tally in a Monte Carlo simulation, then applying appropriate source variable weight factors in a post-processing algorithm, a single simulation can be used to model any number of multiple sources. Through this method, the response to a new source can be modeled in minutes or seconds, rather than hours or days, allowing for the analysis of truly variable source conditions of much greater resolution than is normally possible when a new simulation must be run for each data point in a study. This method has been benchmarked and used to recreate optimization studies in a small fraction of the time spent in the original studies.
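The re-weighting idea described above can be illustrated with a minimal numpy sketch. This is not the authors' code; the binning scheme and names are illustrative. Responses are scored per source bin during a single simulation, and the response to any new source spectrum is then a weighted sum of those per-bin responses.

```python
import numpy as np

def reweight_response(binned_tallies, source_weights):
    """Combine per-source-bin tally scores into the response for a new
    source modelled as a weighted mixture of the simulated bins.

    binned_tallies: (n_bins, n_scores) scores from one Monte Carlo run
    source_weights: (n_bins,) relative intensity of each bin in the
                    new source spectrum (normalised internally)
    """
    w = np.asarray(source_weights, dtype=float)
    w = w / w.sum()
    return w @ np.asarray(binned_tallies, dtype=float)

# Example: 3 source energy bins, 2 scored quantities per bin
tallies = np.array([[2.0, 1.0],
                    [4.0, 3.0],
                    [6.0, 5.0]])
print(reweight_response(tallies, [1.0, 1.0, 2.0]))  # [4.5 3.5]
```

Because the weighted sum is a cheap matrix product, evaluating a new candidate source takes seconds, which is the speed-up the abstract describes.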

  14. Multicriteria analysis for sources of renewable energy using data from remote sensing

    Science.gov (United States)

    Matejicek, L.

    2015-04-01

Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis of sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and social acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High-resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study covers long-term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in the mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results represent map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class is linked to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities. The results also show a slight increase in the more ...
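The raster-algebra step described above, reclassifying input layers to a common scale and combining them into preference classes, can be sketched as follows. This is a hypothetical numpy illustration, not the study's GIS workflow; layer names and weights are invented.

```python
import numpy as np

def preference_classes(criteria, weights, n_classes=7):
    """Weighted overlay of criterion rasters already reclassified to a
    common 0..1 suitability scale; returns classes 1..n_classes."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    score = sum(wi * np.asarray(c, dtype=float) for wi, c in zip(w, criteria))
    # slice the composite 0..1 score into equal-width preference classes
    return np.clip((score * n_classes).astype(int) + 1, 1, n_classes)

# Two toy criterion rasters (e.g. solar potential and wind prediction)
solar = np.array([[1.0, 0.5],
                  [0.0, 0.9]])
wind  = np.array([[0.0, 0.5],
                  [0.0, 0.3]])
print(preference_classes([solar, wind], [0.5, 0.5]))
```

In a real GIS workflow the same overlay would run on full raster layers, with the CORINE mask applied first to restrict the analysis to mining and agricultural areas.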

  15. Dynamic response analysis of the LBL Advanced Light Source synchrotron radiation storage ring

    International Nuclear Information System (INIS)

    Leung, K.

    1993-05-01

This paper presents the dynamic response analysis of the photon source synchrotron radiation storage ring excited by ground motion measured at the Lawrence Berkeley Laboratory Advanced Light Source building site. The high spectral brilliance required of the photon beams of the Advanced Light Source storage ring demands that displacements of the quadrupole focusing magnets be held to the order of 1 micron in vertical motion. There are 19 magnets supported by a 430-inch steel box-beam girder. The girder and all magnets are supported by the kinematic mount system normally used in optical equipment. This kinematic mount, called a six-strut magnet support system, is now considered as an alternative system for supporting SSC magnets in the Super Collider. The six-strut support system is successfully operated for the Advanced Light Source (ALS) accelerator at the Lawrence Berkeley Laboratory. This paper presents the method of analysis and the results of the dynamic motion study at the center of the magnets under the most critical excitation source as recorded at the LBL site.

  16. Analysis of the tuning characteristics of microwave plasma source

    International Nuclear Information System (INIS)

    Miotk, Robert; Jasiński, Mariusz; Mizeraczyk, Jerzy

    2016-01-01

In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are useless. The presented plasma source is currently used in research on hydrogen production from liquids.

  17. Constrained Null Space Component Analysis for Semiblind Source Separation Problem.

    Science.gov (United States)

    Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn

    2018-02-01

The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how they are mixed. The constrained independent component analysis (ICA) approach has been studied to impose constraints on the well-known ICA framework. We introduced an alternative approach based on the null space component analysis (NCA) framework and referred to it as the c-NCA approach. We also presented the c-NCA algorithm, which uses signal-dependent semidefinite operators, a bilinear mapping, as signatures for operator design in the c-NCA approach. Theoretically, we showed that the source estimation of the c-NCA algorithm converges, with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed in the optimization community for solving the BSS problem. As examples, we demonstrated that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms, by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.

  18. Does recruitment source moderate treatment effectiveness? A subgroup analysis from the EVIDENT study, a randomised controlled trial of an internet intervention for depressive symptoms.

    Science.gov (United States)

    Klein, Jan Philipp; Gamon, Carla; Späth, Christina; Berger, Thomas; Meyer, Björn; Hohagen, Fritz; Hautzinger, Martin; Lutz, Wolfgang; Vettorazzi, Eik; Moritz, Steffen; Schröder, Johanna

    2017-07-13

This study aims to examine whether the effects of internet interventions for depression generalise to participants recruited in clinical settings. It is a subgroup analysis of the results of a randomised, controlled, single-blind trial conducted at five diagnostic centres in Germany. A total of 1013 people with mild to moderate depressive symptoms were recruited from clinical sources as well as from internet forums, statutory insurance companies and other sources. Participants received either care as usual alone (control) or a 12-week internet intervention (Deprexis) plus usual care (intervention). The primary outcome measure was self-rated depression severity (Patient Health Questionnaire-9) at 3 and 6 months. Further measures ranged from demographic and clinical parameters to a measure of attitudes towards internet interventions (Attitudes towards Psychological Online Interventions Questionnaire). Recruitment source was associated with only a few of the examined demographic and clinical characteristics. Compared with participants recruited from clinical sources, participants recruited through insurance companies were more likely to be employed. Clinically recruited participants were as severely affected as those from other recruitment sources but were more sceptical of internet interventions. The effectiveness of the intervention was not differentially associated with recruitment source (treatment by recruitment source interaction=0.28, p=0.84). Our results support the hypothesis that the intervention we studied is effective across different recruitment sources, including clinical settings. ClinicalTrials.gov NCT01636752. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
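A treatment-by-recruitment-source interaction of the kind reported above can be estimated, in its simplest form, as the coefficient of a product term in an ordinary least-squares model. This is a toy sketch with synthetic data, not the trial's actual analysis; variable names and coding are ours.

```python
import numpy as np

def interaction_coefficient(y, treat, source):
    """OLS fit of y = b0 + b1*treat + b2*source + b3*treat*source + e;
    returns b3, the treatment-by-recruitment-source interaction."""
    X = np.column_stack([np.ones_like(y), treat, source, treat * source])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[3]

# Synthetic data with a treatment effect but no interaction
rng = np.random.default_rng(2)
treat = rng.integers(0, 2, 200).astype(float)
source = rng.integers(0, 2, 200).astype(float)  # 0 = clinical, 1 = other
y = 1.0 + 2.0 * treat + 0.5 * source + rng.normal(0.0, 0.01, 200)
print(interaction_coefficient(y, treat, source))  # close to zero
```

A b3 near zero, as in this synthetic example, corresponds to the study's finding that effectiveness did not differ by recruitment source.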

  19. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Directory of Open Access Journals (Sweden)

    Jianjun Tang

    2014-01-01

Full Text Available Assembly precision optimization of complex products has a huge benefit in improving product quality. Due to the coupling of a variety of deviation sources, the target of assembly precision optimization is difficult to determine accurately. In order to optimize assembly precision accurately and rapidly, sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of assembly dimension variation to deviation source dimension variation. Second, according to the assembly constraint relations, assembly sequences and locating schemes, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations for each dimension are established. Then, assembly deviation source sensitivity is calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking the assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
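The first-order Taylor expansion step can be sketched numerically: the sensitivity of an assembly dimension to each deviation source is its partial derivative at the nominal dimensions. The loop function and values below are hypothetical, chosen only to illustrate the definition given in the abstract.

```python
import numpy as np

def deviation_sensitivities(assembly_dim, nominal, eps=1e-6):
    """Central-difference estimate of S_i = d(assembly dim)/d(x_i),
    the first-order sensitivity to each deviation source dimension."""
    x0 = np.asarray(nominal, dtype=float)
    sens = np.zeros_like(x0)
    for i in range(x0.size):
        dx = np.zeros_like(x0)
        dx[i] = eps
        sens[i] = (assembly_dim(x0 + dx) - assembly_dim(x0 - dx)) / (2 * eps)
    return sens

# Toy scalar loop equation: closure gap g = x1 + x2*cos(theta)
theta = np.pi / 6
gap = lambda x: x[0] + x[1] * np.cos(theta)
print(deviation_sensitivities(gap, [10.0, 5.0]))  # ~[1.0, 0.866]
```

In the paper's method the derivatives come from the vector-loop scalar equations rather than finite differences, but the sensitivity they define is the same quantity.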

  20. Open-Source RTOS Space Qualification: An RTEMS Case Study

    Science.gov (United States)

    Zemerick, Scott

    2017-01-01

NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably (1) the diverse nature of RTOSs utilized across NASA, (2) the lack of a single NASA space-qualification criterion, of verification and validation (V&V) analyses, and of test beds, and (3) differing RTOS heritages, specifically open-source RTOSs versus closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort for the open-source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can serve as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA agency-wide usage. NASA's involvement in space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  1. Source identification of underground fuel spills in a petroleum refinery using fingerprinting techniques and chemo-metric analysis. A Case Study

    International Nuclear Information System (INIS)

    Kanellopoulou, G.; Gidarakos, E.; Pasadakis, N.

    2005-01-01

Crude oil and its refined products are the contaminants most frequently found in the environment due to spills. The aim of this work was the identification of the spill source(s) in the subsurface of a petroleum refinery. Free-phase samples were analyzed with gas chromatography, and the analytical results were interpreted using the Principal Component Analysis (PCA) method. The chemical analysis of groundwater samples from the refinery subsurface was also employed to obtain a comprehensive picture of the spill distribution and origin. (authors)

  2. Validation of botanical origins and geographical sources of some Saudi honeys using ultraviolet spectroscopy and chemometric analysis.

    Science.gov (United States)

    Ansari, Mohammad Javed; Al-Ghamdi, Ahmad; Khan, Khalid Ali; Adgaba, Nuru; El-Ahmady, Sherweit H; Gad, Haidy A; Roshan, Abdulrahman; Meo, Sultan Ayoub; Kolyali, Sevgi

    2018-02-01

    This study aims at distinguishing honey based on botanical and geographical sources. Different floral honey samples were collected from diverse geographical locations of Saudi Arabia. UV spectroscopy in combination with chemometric analysis including Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), and Soft Independent Modeling of Class Analogy (SIMCA) were used to classify honey samples. HCA and PCA presented the initial clustering pattern to differentiate between botanical as well as geographical sources. The SIMCA model clearly separated the Ziziphus sp. and other monofloral honey samples based on different locations and botanical sources. The results successfully discriminated the honey samples of different botanical and geographical sources validating the segregation observed using few physicochemical parameters that are regularly used for discrimination.
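The PCA step used for the initial clustering can be sketched in a few lines of numpy. The spectra below are synthetic and purely illustrative of how samples from two botanical sources separate along the leading component; this is not the study's data or pipeline.

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Scores of mean-centred spectra on the leading principal axes
    (rows = samples, columns = wavelengths)."""
    X = np.asarray(spectra, dtype=float)
    Xc = X - X.mean(axis=0)
    # right singular vectors of the centred matrix are the PC axes
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Two synthetic "botanical sources" with distinct spectral shapes
rng = np.random.default_rng(0)
a = rng.normal([1.0, 0.2, 0.1, 0.0], 0.02, size=(5, 4))
b = rng.normal([0.1, 0.2, 0.9, 0.4], 0.02, size=(5, 4))
scores = pca_scores(np.vstack([a, b]))
print(scores[:, 0])  # the two groups separate along PC1
```

HCA and SIMCA would then operate on distances in this reduced space; the PCA projection alone is often enough to reveal the initial clustering pattern the abstract mentions.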

  3. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    Science.gov (United States)

    2016-06-15

Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies. Acquisition Research Program Sponsored Report Series, 15 June 2016. LCDR Jamal M. Osman, USN. Cited reference: Lamoureux, J., Murrow, M., & Walls, C. (2015). Relationship of source selection methods to contract outcomes: an analysis ...

  4. Operational analysis and comparative evaluation of embedded Z-Source inverters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Gao, F.; Loh, P.C.

    2008-01-01

This paper presents various embedded Z-source (EZ-source) inverters, broadly classified as shunt or parallel embedded Z-source inverters. Being different from the traditional Z-source inverter, EZ-source inverters are constructed by inserting dc sources into the X-shaped impedance network so that the dc input current flows smoothly during the whole switching period, unlike the traditional Z-source inverter. This feature is interesting when PV panels or fuel cells power the load, since the continuous input current flow reduces the control complexity of the dc source and the system design burden ... circuitry connected instead of the generic voltage source inverter (VSI) circuitry. Proceeding on to the topological variation, parallel embedded Z-source inverters are presented with a detailed analysis of the topological configuration and operational principles, showing that they are the superior ...

  5. Post-processing of Monte Carlo simulations for rapid BNCT source optimization studies

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-01-01

A great advantage of some neutron sources, such as accelerator-produced sources, is that they can be tuned to produce different spectra. Unfortunately, optimization studies are often time-consuming and difficult, as they require a lengthy Monte Carlo simulation for each source. When multiple characteristics, such as energy, angle, and spatial distribution of a neutron beam are allowed to vary, an overwhelming number of simulations may be required. Many optimization studies, therefore, suffer from a small number of data points, restrictive treatment conditions, or poor statistics. By scoring pertinent information from every particle tally in a Monte Carlo simulation, then applying appropriate source variable weight factors in a post-processing algorithm, a single simulation can be used to model any number of multiple sources. Through this method, the response to a new source can be modeled in minutes or seconds, rather than hours or days, allowing for the analysis of truly variable source conditions of much greater resolution than is normally possible when a new simulation must be run for each data point in a study. This method has been benchmarked and used to recreate optimization studies in a small fraction of the time spent in the original studies. (author)

  6. Radioisotope sources for X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Leonowich, J.; Pandian, S.; Preiss, I.L.

    1977-01-01

Problems involved in developing radioisotope sources and the characteristics of potentially useful radioisotopes for X-ray fluorescence analysis are presented. These include the following. The isotope must be evaluated for the physical and chemical forms available, purity, half-life, specific activity, toxicity, and cost. The radiation hazards of the source must be considered. The type and amount of radiation output of the source must be evaluated. The source construction must be planned. The source should also represent an advance over those currently available in order to justify its development. Some isotopes which are not yet in use but look very promising are indicated, and their data are tabulated. A more or less 'perfect' source within a given range of interest would exhibit the following characteristics: (1) decay by an isomeric transition with little or no internal conversion; (2) an intense gamma transition near the absorption edge of the element(s) of interest, with no high-energy gammas; (3) a sufficiently long half-life (of the order of years), for both economic and calibration reasons; (4) a sufficiently large cross-section for production in a reasonable amount of time. If there are competing reactions, the interfering isotopes should be reasonably short-lived or, failing that, should be separable from the isotope chemically with a minimum of difficulty. (T.G.)

  7. Analysis of the tuning characteristics of microwave plasma source

    Energy Technology Data Exchange (ETDEWEB)

    Miotk, Robert, E-mail: rmiotk@imp.gda.pl; Jasiński, Mariusz [Centre for Plasma and Laser Engineering, The Szewalski Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80-231 Gdańsk (Poland); Mizeraczyk, Jerzy [Department of Marine Electronics, Gdynia Maritime University, Morska 81-87, 81-225 Gdynia (Poland)

    2016-04-15

In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are useless. The presented plasma source is currently used in research on hydrogen production from liquids.

  8. Acoustic Source Analysis of Magnetoacoustic Tomography With Magnetic Induction for Conductivity Gradual-Varying Tissues.

    Science.gov (United States)

    Wang, Jiawei; Zhou, Yuqi; Sun, Xiaodong; Ma, Qingyu; Zhang, Dong

    2016-04-01

As a multiphysics imaging approach, magnetoacoustic tomography with magnetic induction (MAT-MI) works on the physical mechanism of magnetic excitation, acoustic vibration, and transmission. Based on a theoretical analysis of the source vibration, numerical studies are conducted to simulate the pathological changes of tissues for a single-layer cylindrical conductivity gradual-varying model and to estimate the strengths of the sources inside the model. The results suggest that the inner source is generated by the product of the conductivity and the curl of the induced electric intensity inside conductivity-homogeneous medium, while the boundary source is produced by the cross product of the gradient of conductivity and the induced electric intensity at the conductivity boundary. For a biological tissue with low conductivity, the strength of the boundary source is much higher than that of the inner source only when the size of the conductivity transition zone is small. In this case, the tissue can be treated as a conductivity abrupt-varying model, ignoring the influence of the inner source. Otherwise, the contributions of the inner and boundary sources should be evaluated together quantitatively. This study provides a basis for further study of precise image reconstruction in MAT-MI for pathological tissues.
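The two source terms described above, the inner term sigma*(curl E) and the boundary term (grad sigma) x E, can be sketched on a 2-D grid. This is an illustrative discretisation under our own simplifying assumptions (z-components only, unit spacing), not the paper's simulation code.

```python
import numpy as np

def matmi_sources(sigma, Ex, Ey, dx=1.0, dy=1.0):
    """z-components of the two MAT-MI acoustic source terms on a 2-D
    grid: inner = sigma*(curl E)_z, boundary = (grad sigma x E)_z."""
    dEy_dx = np.gradient(Ey, dx, axis=1)
    dEx_dy = np.gradient(Ex, dy, axis=0)
    dsig_dy, dsig_dx = np.gradient(sigma, dy, dx)
    inner = sigma * (dEy_dx - dEx_dy)
    boundary = dsig_dx * Ey - dsig_dy * Ex
    return inner, boundary

# Homogeneous conductivity with a linearly varying induced field:
# only the inner source survives; the boundary term vanishes
X, Y = np.meshgrid(np.arange(4.0), np.arange(4.0))
inner, boundary = matmi_sources(np.ones((4, 4)), np.zeros((4, 4)), X)
print(inner[0, 0], boundary[0, 0])  # 1.0 0.0
```

Adding a conductivity gradient to `sigma` makes the boundary term nonzero exactly where sigma varies, matching the abstract's description of abrupt- versus gradual-varying models.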

  9. Analysis of filtration properties of locally sourced base oil for the ...

    African Journals Online (AJOL)

This study examines the use of locally sourced oils, such as groundnut oil, melon oil, vegetable oil, soya oil and palm oil, as substitutes for diesel oil in formulating oil-based drilling fluids, with respect to filtration properties. The filtrate volumes of each of the oils were obtained for filtration control analysis. With increasing potash and ...

  10. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns ... However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practices. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows ... including arguments for community-wide open-source software development and "big data" compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  11. Open source EMR software: profiling, insights and hands-on analysis.

    Science.gov (United States)

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit the potential implementers of electronic health/medical records systems. The number of examined systems and the measures by which to compare them vary across studies, but still rewarding insights start to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field are still far behind the highly acknowledged open source products in other domains, e.g. operating systems market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

    This User's Manual is an attempt to provide for teaching and training purposes, a series of well thought out demonstrative experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  13. System optimization for continuous on-stream elemental analysis using low-output isotopic neutron sources

    International Nuclear Information System (INIS)

    Rizk, R.A.M.

    1989-01-01

In continuous on-stream neutron activation analysis, the material to be analyzed may be continuously recirculated in a closed-loop system between an activation source and a shielded detector. In this paper an analytical formulation of the detector response for such a system is presented. This formulation should be useful in optimizing the system design parameters for specific applications. A study has been made of all parameters that influence the detector response during on-stream analysis. Feasibility applications of the method to solutions of manganese and vanadium using a 5 μg 252Cf neutron source are demonstrated. (author)
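Although the paper's analytical formulation is not reproduced here, the basic buildup-and-decay balance of a recirculating loop can be sketched with a toy model. The single-compartment assumption and all parameter names are ours, not the author's: the sample is activated for a fixed time per cycle near the source and only decays while travelling round the loop to the detector.

```python
import math

def loop_activity(P, lam, t_irr, t_transit, n_cycles):
    """Induced activity after n_cycles of a recirculating sample:
    activation (dA/dt = P - lam*A) for t_irr per cycle near the source,
    then pure decay for t_transit on the way round the loop."""
    A = 0.0
    sat = P / lam  # saturation activity for continuous irradiation
    for _ in range(n_cycles):
        A = sat + (A - sat) * math.exp(-lam * t_irr)
        A *= math.exp(-lam * t_transit)
    return A

# Activity builds up towards a steady state below full saturation
print(loop_activity(1.0, 0.1, 1.0, 1.0, 1))
print(loop_activity(1.0, 0.1, 1.0, 1.0, 50))
```

Iterating to the steady state shows how loop geometry (t_irr versus t_transit) trades off against decay constant, which is the kind of design-parameter optimization the abstract describes.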

  14. Energy sources and nuclear energy. Comparative analysis and ethical reflections

    International Nuclear Information System (INIS)

    Hoenraet, C.

    1999-01-01

Under the authority of the episcopacy of Brugge in Belgium, an independent working group Ethics and Nuclear Energy was set up. The purpose of the working group was to collect all the necessary information on existing energy sources and to carry out a comparative analysis of their impact on mankind and the environment. Attention was also paid to economic and social aspects. The results of the study are subjected to an ethical reflection. The book is aimed at politicians, teachers, journalists and every interested layman who wants to gain insight into the consequences of the use of nuclear energy and other energy sources. Based on the information in this book, one should be able to objectively define one's position in future debates on this subject.

  15. Study of classification and disposed method for disused sealed radioactive source in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Hoon; Kim, Ju Youl; Lee, Seung Hee [FNC Technology Co., Ltd.,Yongin (Korea, Republic of)

    2016-09-15

In accordance with the classification system for radioactive waste in Korea, all disused sealed radioactive sources (DSRSs) fall under the category of EW, VLLW or LILW, and should be managed in compliance with the restrictions on the disposal method. In this study, management and disposal methods are derived in consideration of the half-life of the radionuclides contained in the source and the A/D value (i.e. the activity A of the source divided by the D value for the relevant radionuclide, which is used to provide an initial ranking of relative risk for sources), in addition to the domestic classification scheme and disposal method, based on characteristic analysis and a review of management practices in the IAEA and foreign countries. For all the DSRSs being stored (as of March 2015) in the centralized temporary disposal facility for radioisotope wastes, the applicability of the derived result is confirmed by performing characteristic analysis and case studies assessing the quantity and volume of DSRSs to be managed by each method. However, the methodology derived from this study is not applicable to the following sources: i) DSRSs without information on their radioactivity, and ii) DSRSs for which it is not possible to calculate the specific activity and/or the source-specific A/D value. Accordingly, it is essential to identify the inherent characteristics of each DSRS prior to implementation of this management and disposal method.

  16. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Shang, Yu

    2014-01-01

Gravitational waves (GWs) are one of the most important predictions of general relativity. Beyond the direct proof of the existence of GWs, there are already several ground-based detectors (such as LIGO, GEO, etc.) and a planned future space mission (LISA) which aim to detect GWs directly. A GW contains a large amount of information about its source; extracting this information can help us uncover the physical properties of the source and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We have found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we have found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is very effective in the data analysis of EMRI signals.

  17. Quantitative analysis of Internet television and video (WebTV: A study of formats, content, and source

    Directory of Open Access Journals (Sweden)

    José Borja ARJONA MARTÍN

    2014-07-01

Full Text Available Due to the significant increase in the last five years of audiovisual content distribution over the web, this paper is focused on a study aimed at the description and classification of a wide sample of audiovisual initiatives that are accessed by means of the World Wide Web. The purpose of this study is to promote debate concerning the different names of these incipient media, as well as their categorization and description, so that an organised universe of the WebTV phenomenon can be provided. An analysis of formats and content is carried out on the basis of quantitative techniques in order to propose a categorization typology. These formats and content are studied under three key variables: "Content", "Source" and "Domain.tv". "Content" helps define the programmatic lines of our study sample; "Source" refers to the origin of a particular item of study (native WebTV or WebTV representative of a conventional medium); and "Domain.tv" specifies the proportion of case studies hosted under the .tv domain. The results obtained in this study will offer researchers and professionals a comprehensive description of the models currently adopted in the field of video and television on the net.

  18. Identifying the source of fluvial terrace deposits using XRF scanning and Canonical Discriminant Analysis: A case study of the Chihshang terraces, eastern Taiwan

    Science.gov (United States)

    Chang, Queenie; Lee, Jian-Cheng; Hunag, Jyh-Jaan; Wei, Kuo-Yen; Chen, Yue-Gau; Byrne, Timothy B.

    2018-05-01

    The source of fluvial terrace deposits provides important information about catchment fluvial processes and landform evolution. In this study, we propose a novel approach that combines high-resolution Itrax-XRF scanning and Canonical Discriminant Analysis (CDA) to identify the source of fine-grained fluvial terrace deposits. We apply this approach to a group of terraces located on the hanging wall of the Chihshang Fault in eastern Taiwan, with two possible sources: the Coastal Range to the east and the Central Range to the west. Our results for standard samples from the two potential sources show distinct ranges of canonical variables, which provide better separation than individual chemical elements. We then tested the approach by applying it to several samples with known sediment sources and obtained positive results. Applying the same approach to the fine-grained sediments in the Chihshang terraces indicates that they are mostly composed of Coastal Range material but also contain some input from the Central Range. In the two lowest terraces, T1 and T2, the fine-grained deposits show a significant Central Range component. For terrace T4, the results show less Central Range input and a trend of decreasing Central Range influence up section. Coastal Range material becomes dominant in the two highest terraces, T7 and T10. Sediments in terrace T5 appear to have been altered by post-depositional chemical processes and are not included in the analysis. Our results show that the change of source material in the terrace deposits was relatively gradual, rather than the sharp change suggested by the composition of the gravels and conglomerates. We suggest that this change in sources is related to a change in the dominant fluvial processes, which are controlled by tectonic activity.
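
With only two source groups, Canonical Discriminant Analysis reduces to Fisher's linear discriminant: find the projection that best separates the groups relative to their within-group scatter, then score unknown samples against the midpoint of the projected group means. The sketch below illustrates that idea on synthetic element concentrations; the group means and variances are invented, not the paper's Itrax-XRF data.

```python
import numpy as np

def fisher_discriminant(X1, X2):
    """Two-group linear discriminant (the two-source case of CDA): the
    direction w maximizes between-group separation relative to the
    pooled within-group scatter."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Pooled within-group scatter matrix
    Sw = (np.cov(X1, rowvar=False) * (len(X1) - 1)
          + np.cov(X2, rowvar=False) * (len(X2) - 1))
    w = np.linalg.solve(Sw, m1 - m2)      # discriminant direction
    threshold = w @ (m1 + m2) / 2         # midpoint between projected means
    return w, threshold

def classify(x, w, threshold):
    """Assign a sample to group 1 if its canonical score exceeds the midpoint."""
    return 1 if x @ w > threshold else 2

# Synthetic "Coastal Range" vs "Central Range" element concentrations
rng = np.random.default_rng(0)
coastal = rng.normal([5.0, 1.0, 3.0], 0.4, size=(40, 3))
central = rng.normal([3.0, 2.5, 1.0], 0.4, size=(40, 3))
w, t = fisher_discriminant(coastal, central)
```

Because the canonical score weights all elements jointly, it separates the groups even when no single element does, which is the advantage the abstract notes over individual chemical elements.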

  19. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations; Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic...... pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were...... indicated relation primarily to German, Russian and American mixtures in Hammerfest; and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics....
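
Source assessments of this kind typically start from a principal component analysis of the sample-by-pollutant concentration matrix: scores place the sediment samples in component space, and loadings show which pollutants drive each component. A generic numpy sketch with invented sediment data, not the Hammerfest or Sisimiut measurements:

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the mean-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T          # sample coordinates on the PCs
    explained = (s ** 2) / (s ** 2).sum()      # variance share per component
    return scores, Vt[:n_components], explained[:n_components]

# Synthetic sediment samples: two correlated heavy metals (a shared source)
# plus one independent TBT-like variable (a separate point source)
rng = np.random.default_rng(2)
common = rng.normal(0, 1, size=(50, 1))
X = np.hstack([common * 2 + rng.normal(0, 0.1, (50, 1)),
               common * 3 + rng.normal(0, 0.1, (50, 1)),
               rng.normal(0, 1, (50, 1))])
scores, loadings, explained = pca(X)
```

Samples that cluster together in score space share a pollutant profile, which is how PCA suggests common sources and distributions like those described above.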

  20. Thermal-hydraulic studies of the Advanced Neutron Source cold source

    International Nuclear Information System (INIS)

    Williams, P.T.; Lucas, A.T.

    1995-08-01

    The Advanced Neutron Source (ANS), in its conceptual design phase at Oak Ridge National Laboratory, was to be a user-oriented neutron research facility producing the most intense steady-state flux of thermal and cold neutrons in the world. Among its many scientific applications, the production of cold neutrons was a significant research mission for the ANS. The cold neutrons come from two independent cold sources positioned near the reactor core. Contained by an aluminum alloy vessel, each cold source is a 410-mm-diam sphere of liquid deuterium that functions both as a neutron moderator and a cryogenic coolant. With nuclear heating of the containment vessel and internal baffling, steady-state operation requires close control of the liquid deuterium flow near the vessel's inner surface. Preliminary thermal-hydraulic analyses supporting the cold source design were performed with heat conduction simulations of the vessel walls and multidimensional computational fluid dynamics simulations of the liquid deuterium flow and heat transfer. This report presents the starting phase of a challenging program and describes the cold source conceptual design, the thermal-hydraulic feasibility studies of the containment vessel, and the future computational and experimental studies that were planned to verify the final design

  1. Neutronics of the IFMIF neutron source: development and analysis

    International Nuclear Information System (INIS)

    Wilson, P.P.H.

    1999-01-01

    The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d, xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE) [de

  2. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low-frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth-to-source determination. Source location is an important component of the study of volcano infrasound and of its application to volcano monitoring. Semblance is a forward grid-search technique and a common source location method in infrasound studies as well as in seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
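
Semblance-based source location is a forward grid search: for each candidate source, the station records are time-shifted by the predicted travel times and the coherence of the aligned stack is computed. The 1-D sketch below illustrates the statistic under simplifying assumptions (straight-ray travel times, a fixed sound speed, no topography, which is exactly the effect the study shows can bias the result).

```python
import math

def semblance(waveforms, delays):
    """Semblance of time-aligned records: energy of the stacked trace
    divided by the stacked energies; 1.0 for perfectly coherent signals."""
    n = len(waveforms)
    length = min(len(w) - d for w, d in zip(waveforms, delays))
    if length <= 0:
        return 0.0
    num = den = 0.0
    for t in range(length):
        s = sum(w[t + d] for w, d in zip(waveforms, delays))
        num += s * s
        den += n * sum(w[t + d] ** 2 for w, d in zip(waveforms, delays))
    return num / den if den > 0 else 0.0

def locate(waveforms, stations, grid, speed=340.0, dt=0.01):
    """Forward grid search: for each candidate source, predict per-station
    delays from straight-ray travel times and keep the highest semblance."""
    best, best_val = None, -1.0
    for x in grid:
        delays = [int(abs(x - s) / speed / dt) for s in stations]
        base = min(delays)
        delays = [d - base for d in delays]
        val = semblance(waveforms, delays)
        if val > best_val:
            best, best_val = x, val
    return best, best_val

# Synthetic 1-D test: a pulse emitted at x = 500 m, recorded by 3 stations
stations = [0.0, 800.0, 1600.0]
pulse = [math.exp(-((t - 40) / 5.0) ** 2) for t in range(200)]
waveforms = [[0.0] * int(abs(500.0 - s) / 340.0 / 0.01) + pulse for s in stations]
loc, coh = locate(waveforms, stations, grid=[100.0 * k for k in range(17)])
```

In field use the travel-time predictor must account for wind, temperature, and topographic diffraction; omitting these is what produces the systematic offsets reported at Sakurajima.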

  3. Analysis of potential combustion source impacts on acid deposition using an independently derived inventory. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    1983-12-01

    This project had three major objectives. The first objective was to develop a fossil fuel combustion source inventory (NO/sub x/, SO/sub x/, and hydrocarbon emissions) that would be relatively easy to use and update for analyzing the impact of combustion emissions on acid deposition in the eastern United States. The second objective of the project was to use the inventory data as a basis for selection of a number of areas that, by virtue of their importance in the acid rain issue, could be further studied to assess the impact of local and intraregional combustion sources. The third objective was to conduct an analysis of wet deposition monitoring data in the areas under study, along with pertinent physical characteristics, meteorological conditions, and emission patterns of these areas, to investigate probable relationships between local and intraregional combustion sources and the deposition of acidic material. The combustion source emissions inventory has been developed for the eastern United States. It characterizes all important area sources and point sources on a county-by-county basis. Its design provides flexibility and simplicity and makes it uniquely useful in overall analysis of emission patterns in the eastern United States. Three regions with basically different emission patterns have been identified and characterized. The statistical analysis of wet deposition monitoring data in conjunction with emission patterns, wind direction, and topography has produced consistent results for each study area and has demonstrated that the wet deposition in each area reflects the characteristics of the localized area around the monitoring sites (typically 50 to 150 miles). 8 references, 28 figures, 39 tables.

  4. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    Science.gov (United States)

    Yesylevskyy, Semen O

    2012-07-15

    An open-source Pteros library for molecular modeling and analysis of molecular dynamics trajectories for C++ programming language is introduced. Pteros provides a number of routine analysis operations ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies development of computationally intensive trajectory analysis algorithms. Pteros programming interface is very simple and intuitive while the source code is well documented and easily extendible. Pteros is available for free under open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.

  5. Time-correlated neutron analysis of a multiplying HEU source

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.C., E-mail: Eric.Miller@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Kalter, J.M.; Lavelle, C.M. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Watson, S.M.; Kinlaw, M.T.; Chichester, D.L. [Idaho National Laboratory, Idaho Falls, ID (United States); Noonan, W.A. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States)

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated {sup 3}He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, can detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal and an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.
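
The gap-based burst identification can be sketched in a few lines: detections separated by less than a chosen gap are grouped into one burst, mimicking the clustering of neutrons from a single fission chain. The gap threshold and minimum burst size below are illustrative assumptions, not the values used in the measurement campaign.

```python
def find_bursts(arrival_times, max_gap=50.0, min_size=2):
    """Gap-based burst finder: successive detections closer than max_gap
    (here in ns) are grouped into one burst; isolated counts are dropped."""
    bursts, current = [], []
    for t in sorted(arrival_times):
        if current and t - current[-1] > max_gap:   # gap exceeded: close burst
            if len(current) >= min_size:
                bursts.append(current)
            current = []
        current.append(t)
    if len(current) >= min_size:                    # flush the final burst
        bursts.append(current)
    return bursts

# Two tight bursts separated by a long quiet interval, plus a lone count
times = [10.0, 22.0, 31.0, 40.0, 5000.0, 5020.0, 5035.0, 9000.0]
bursts = find_bursts(times)
```

The within-burst arrival-time distributions studied in the paper would then be computed over the timestamps inside each returned burst.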

  6. Time-correlated neutron analysis of a multiplying HEU source

    Science.gov (United States)

    Miller, E. C.; Kalter, J. M.; Lavelle, C. M.; Watson, S. M.; Kinlaw, M. T.; Chichester, D. L.; Noonan, W. A.

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, can detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal and an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.

  7. Time-correlated neutron analysis of a multiplying HEU source

    International Nuclear Information System (INIS)

    Miller, E.C.; Kalter, J.M.; Lavelle, C.M.; Watson, S.M.; Kinlaw, M.T.; Chichester, D.L.; Noonan, W.A.

    2015-01-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, can detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal and an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations

  8. Multi-Criteria Analysis to Prioritize Energy Sources for Ambience in Poultry Production

    Directory of Open Access Journals (Sweden)

    DC Collatto

    Full Text Available ABSTRACT This paper intends to outline a model of multi-criteria analysis to pinpoint the most suitable energy source for heating aviaries in poultry broiler production from the point of view of the farmer and under environmental logic. Therefore, the identification of criteria was enabled through an exploratory study in three poultry broiler production units located in the mountain region of Rio Grande do Sul. In order to identify the energy source, the Analytic Hierarchy Process was applied. The criteria determined and validated in the research contemplated the cost of energy source, leadtime, investment in equipment, energy efficiency, quality of life and environmental impacts. The result of applying the method revealed firewood as the most appropriate energy for heating. The decision support model developed could be replicated in order to strengthen the criteria and energy alternatives presented, besides identifying new criteria and alternatives that were not considered in this study.

  9. Validation of the direct analysis in real time source for use in forensic drug screening.

    Science.gov (United States)

    Steiner, Robert R; Larson, Robyn L

    2009-05-01

    The Direct Analysis in Real Time (DART) ion source is a relatively new mass spectrometry technique that is seeing widespread use in chemical analyses world-wide. DART studies include such diverse topics as analysis of flavors and fragrances, melamine in contaminated dog food, differentiation of writing inks, characterization of solid counterfeit drugs, and as a detector for planar chromatography. Validation of this new technique for the rapid screening of forensic evidence for drugs of abuse, utilizing the DART source coupled to an accurate mass time-of-flight mass spectrometer, was conducted. The study consisted of the determination of the lower limit of detection for the method, determination of selectivity and a comparison of this technique to established analytical protocols. Examples of DART spectra are included. The results of this study have allowed the Virginia Department of Forensic Science to incorporate this new technique into their analysis scheme for the screening of solid dosage forms of drugs of abuse.

  10. A Study on Conjugate Heat Transfer Analysis of Reactor Vessel including Irradiated Structural Heat Source

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Kunwoo; Cho, Hyuksu; Im, Inyoung; Kim, Eunkee [KEPCO EnC, Daejeon (Korea, Republic of)

    2015-10-15

    Although material reliability programs (MRPs) are intended to provide evaluation and management methodologies for operating RVI, similar methodologies can be applied to the APR1400 fleet at the design stage for evaluating neutron irradiation effects. The purposes of this study are to predict the thermal behavior with and without the irradiated-structure heat source, and to evaluate the effective thermal conductivity (ETC) for isotropic and anisotropic conductivity of porous media for the APR1400 reactor vessel. CFD simulations are performed to evaluate the thermal behavior with and without the irradiated-structure heat source and the effective thermal conductivity for the APR1400 reactor vessel. With the irradiated-structure heat source included, the maximum fluid and core-shroud temperatures for the isotropic ETC are 325.8 °C and 341.5 °C, respectively. The total irradiated-structure heat source is about 5.41 MWth and has no significant effect on the fluid temperature.

  11. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82

  12. GEOSPATIAL ANALYSIS OF ATMOSPHERIC HAZE EFFECT BY SOURCE AND SINK LANDSCAPE

    Directory of Open Access Journals (Sweden)

    T. Yu

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of sources and sinks in urban areas and atmospheric haze pollution. First, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on an analysis of the correlation between the landscape indices and AOD. Next, to make the subsequent analysis more efficient, the selected indices are screened using the correlation coefficients between them. Finally, given the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression (GWR) model are used to analyze the atmospheric haze effect of the source and sink landscapes at the global and local levels. The results show that the source landscape of atmospheric haze pollution is buildings, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of the source and sink landscapes. Comparing the models, the fit of the SLM, SEM and GWR models is significantly better than that of the OLS model, and the SLM is superior to the SEM in this paper. Although the GWR model fits less well than the SLM, it expresses more clearly how the influencing factors affect atmospheric haze across different geographic areas. From these results the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could reduce aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents

  13. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    Science.gov (United States)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of sources and sinks in urban areas and atmospheric haze pollution. First, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on an analysis of the correlation between the landscape indices and AOD. Next, to make the subsequent analysis more efficient, the selected indices are screened using the correlation coefficients between them. Finally, given the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression (GWR) model are used to analyze the atmospheric haze effect of the source and sink landscapes at the global and local levels. The results show that the source landscape of atmospheric haze pollution is buildings, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of the source and sink landscapes. Comparing the models, the fit of the SLM, SEM and GWR models is significantly better than that of the OLS model, and the SLM is superior to the SEM in this paper. Although the GWR model fits less well than the SLM, it expresses more clearly how the influencing factors affect atmospheric haze across different geographic areas. From these results the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could reduce aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze

  14. A tsunami wave propagation analysis for the Ulchin Nuclear Power Plant considering the tsunami sources of western part of Japan

    International Nuclear Information System (INIS)

    Rhee, Hyun Me; Kim, Min Kyu; Sheen, Dong Hoon; Choi, In Kil

    2013-01-01

    The accident caused by the Great East Japan earthquake and tsunami in 2011 occurred at the Fukushima Nuclear Power Plant (NPP) site, making it clear that an NPP accident can be caused by a tsunami. A Probabilistic Tsunami Hazard Analysis (PTHA) for NPP sites should therefore be required in Korea. The PTHA methodology is built on the Probabilistic Seismic Hazard Analysis (PSHA) method and is performed using various tsunami sources and their weights. In this study, fault sources in the northwestern part of Japan, suggested by the Atomic Energy Society of Japan (AESJ), were used as the tsunami sources. Performing a PTHA requires the maximum and minimum wave elevations calculated from tsunami simulations. Thus, in this study, tsunami wave propagation analyses were performed as the basis for a future PTHA study

  15. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result over that if conventional source-sampling methods are used. However, this gain in reliability is substantially less than that observed in the model-problem results
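
The idea behind stratified source-sampling can be illustrated on a plain Monte Carlo mean estimate: drawing a fixed share of samples from each stratum guarantees that a rare but important stratum is never missed by chance, which is the failure mode behind anomalous results in loosely coupled arrays. This is a generic sketch of stratified sampling under invented strata, not the eigenvalue-specific scheme of the paper.

```python
import random

def stratified_mean(samplers, weights, n_total, seed=0):
    """Stratified estimate of a population mean: sample each stratum in
    proportion to its weight (at least once) and recombine the stratum
    means with the weights."""
    rng = random.Random(seed)
    estimate = 0.0
    for sampler, w in zip(samplers, weights):
        n = max(1, round(n_total * w))          # every stratum gets samples
        estimate += w * sum(sampler(rng) for _ in range(n)) / n
    return estimate

# Two "regions": a common low-yield one and a rare high-yield one.
# Unstratified sampling can easily under-represent the rare region.
samplers = [lambda r: r.gauss(1.0, 0.1),        # stratum 1, weight 0.99
            lambda r: r.gauss(100.0, 1.0)]      # stratum 2, weight 0.01
true_mean = 0.99 * 1.0 + 0.01 * 100.0           # = 1.99
est = stratified_mean(samplers, [0.99, 0.01], n_total=1000)
```

In the eigenvalue setting the strata would correspond to fissile regions of the configuration, with the fission source forced to populate each of them.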

  16. Radiation studies in the antiproton source

    International Nuclear Information System (INIS)

    Church, M.

    1990-01-01

    Experiment E760 has a lead glass (Pb-G) calorimeter situated in the antiproton source tunnel in the accumulator ring at location A50. This location is exposed to radiation from several sources during antiproton stacking operations. A series of radiation studies has been performed over the last two years to determine the sources of this radiation and as a result, some shielding has been installed in the antiproton source in order to protect the lead glass from radiation damage

  17. Human error analysis project (HEAP) - The fourth pilot study: verbal data for analysis of operator performance

    International Nuclear Information System (INIS)

    Braarud, Per Oeyvind; Droeyvoldsmo, Asgeir; Hollnagel, Erik

    1997-06-01

    This report is the second from Pilot Study No. 4 within the Human Error Analysis Project (HEAP). The overall objective of HEAP is to provide a better understanding and explicit modelling of how and why ''cognitive errors'' occur. This study investigated the contribution of different verbal data sources to the analysis of control room operators' performance. Operators' concurrent verbal reports, retrospective verbal reports, and process experts' comments were compared for their contribution to an operator performance measure. The study examined verbal protocols both for a single operator and for a team. The main findings were that all three verbal data sources could be used to study performance. There was a relatively high overlap between the data sources, but also a unique contribution from each. There was a common pattern in the types of operator activities the data sources gave information about. Overall, the operator's concurrent protocol contained slightly more information on the operator's activities than the other two verbal sources. The study also showed that concurrent verbal protocols are feasible and useful for the analysis of a team's activities during a scenario. (author)

  18. Distributed medical image analysis and diagnosis through crowd-sourced games: a malaria case study.

    Science.gov (United States)

    Mavandadi, Sam; Dimitrov, Stoyan; Feng, Steve; Yu, Frank; Sikora, Uzair; Yaglidere, Oguzhan; Padmanabhan, Swati; Nielsen, Karin; Ozcan, Aydogan

    2012-01-01

    In this work we investigate whether the innate visual recognition and learning capabilities of untrained humans can be used in conducting reliable microscopic analysis of biomedical samples toward diagnosis. For this purpose, we designed entertaining digital games that are interfaced with artificial learning and processing back-ends to demonstrate that in the case of binary medical diagnostics decisions (e.g., infected vs. uninfected), with the use of crowd-sourced games it is possible to approach the accuracy of medical experts in making such diagnoses. Specifically, using non-expert gamers we report diagnosis of malaria infected red blood cells with an accuracy that is within 1.25% of the diagnostics decisions made by a trained medical professional.
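The record above turns on a simple statistical idea: aggregating many imperfect non-expert calls can approach expert accuracy on a binary decision. A minimal sketch of that effect with simulated gamers, assuming an illustrative 85% per-call accuracy (the figure and crowd size are stand-ins, not the paper's parameters):

```python
import random

def crowd_diagnosis(true_label, n_gamers, p_correct, rng):
    """Majority vote over independent non-expert calls on one cell image."""
    votes = [true_label if rng.random() < p_correct else 1 - true_label
             for _ in range(n_gamers)]
    return int(sum(votes) * 2 > len(votes))

rng = random.Random(0)
cells = [rng.randint(0, 1) for _ in range(2000)]  # 1 = infected, 0 = uninfected
# One gamer at 85% per-call accuracy vs. a crowd of 15 such gamers.
solo = sum(crowd_diagnosis(c, 1, 0.85, rng) == c for c in cells) / len(cells)
crowd = sum(crowd_diagnosis(c, 15, 0.85, rng) == c for c in cells) / len(cells)
print(f"solo accuracy  ~ {solo:.3f}")
print(f"crowd accuracy ~ {crowd:.3f}")
```

Majority voting over 15 such calls drives the error rate well below that of any single gamer, which is the intuition behind combining many gamers' decisions before comparing against the expert diagnosis.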

  19. Distributed medical image analysis and diagnosis through crowd-sourced games: a malaria case study.

    Directory of Open Access Journals (Sweden)

    Sam Mavandadi

Full Text Available In this work we investigate whether the innate visual recognition and learning capabilities of untrained humans can be used in conducting reliable microscopic analysis of biomedical samples toward diagnosis. For this purpose, we designed entertaining digital games that are interfaced with artificial learning and processing back-ends to demonstrate that in the case of binary medical diagnostics decisions (e.g., infected vs. uninfected), with the use of crowd-sourced games it is possible to approach the accuracy of medical experts in making such diagnoses. Specifically, using non-expert gamers we report diagnosis of malaria infected red blood cells with an accuracy that is within 1.25% of the diagnostics decisions made by a trained medical professional.

  20. Your Personal Analysis Toolkit - An Open Source Solution

    Science.gov (United States)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  1. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  2. Study and manufacture of an analysis magnet

    International Nuclear Information System (INIS)

    Pronier, Jean

    1965-01-01

This document reports the study and design of an apparatus aimed at a precise qualitative and quantitative analysis of particle beams produced by a number of ion sources of different types, and at using mono-energetic ion beams for experiments related to accelerators. Two analysis methods are presented and discussed (electromagnetic and electrostatic analysis) and the reasons for choosing electromagnetic analysis are explained. The author then reports the study of the analyser: analysis of its theoretical capacity to separate particles whose M/Z ratios differ only slightly, imposed characteristics, optical design, calculation of the slit image position, second and third order aberrations, correction of second order aberrations, and so on. He reports calculations related to the analyser: vacuum chamber, field map in the air gap, surface of polar parts, flux calculation in the air gap, comparison between experimental and theoretical results, ampere-turn calculation, winding calculation. The induction measurement is described and the experiment is reported. Experimental results are reported in terms of analysis of gases ionized by a high frequency ion source [fr

  3. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries

  4. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  5. Analysis on the inbound tourist source market in Fujian Province

    Science.gov (United States)

    YU, Tong

    2017-06-01

The paper analyzes the development and structure of inbound tourism in Fujian Province using Excel and conducts a cluster analysis of the inbound tourism market with SPSS 23.0, based on inbound tourism data for Fujian Province from 2006 to 2015. The results show that inbound tourism in Fujian Province has developed rapidly and that its diversified range of source countries indicates a stable inbound tourism market. According to the cluster analysis, the inbound tourist source market of Fujian Province can be divided into four categories, with tourists from the United States, Japan, Malaysia, and Singapore being the key to inbound tourism in the province.
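For readers without SPSS, the grouping of source countries described above is a standard centroid-based clustering task. The sketch below runs a plain k-means (k=2 for brevity; the study used four categories) over hypothetical (arrival volume, growth) pairs; the figures are invented stand-ins, not the study's 2006-2015 data:

```python
# Hypothetical (volume in 10k visits/yr, % annual growth) pairs, for illustration only.
markets = {
    "United States": (95, 6.0), "Japan": (90, 2.5),
    "Malaysia": (80, 5.5), "Singapore": (75, 5.0),
    "Thailand": (30, 4.0), "Philippines": (28, 3.5),
    "Australia": (12, 2.0), "Canada": (10, 2.2),
}

def assign(points, cents):
    """Label each point with the index of its nearest centroid."""
    d2 = lambda p, c: sum((a - b) ** 2 for a, b in zip(p, c))
    return [min(range(len(cents)), key=lambda j: d2(p, cents[j])) for p in points]

def kmeans(points, k, iters=25):
    """Plain k-means with deterministic initialisation from the first k points."""
    cents = [points[i] for i in range(k)]
    for _ in range(iters):
        labels = assign(points, cents)
        for i in range(k):
            members = [p for p, l in zip(points, labels) if l == i]
            if members:
                cents[i] = tuple(sum(c) / len(members) for c in zip(*members))
    return assign(points, cents)

labels = dict(zip(markets, kmeans(list(markets.values()), k=2)))
print(labels)
```

With these stand-in figures the four named key markets fall into one cluster and the remaining countries into the other, mirroring the study's separation of key source countries from the rest.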

  6. Study and Analysis of an Intelligent Microgrid Energy Management Solution with Distributed Energy Sources

    Directory of Open Access Journals (Sweden)

    Swaminathan Ganesan

    2017-09-01

Full Text Available In this paper, a robust energy management solution which will facilitate the optimum and economic control of energy flows throughout a microgrid network is proposed. Renewable energy sources, whose penetration is increasing, are highly intermittent in nature; the proposed solution nevertheless demonstrates highly efficient energy management. This study enables precise management of power flows by forecasting renewable energy generation, estimating the availability of energy at storage batteries, and invoking the appropriate mode of operation, based on the load demand, to achieve efficient and economic operation. The predefined mode of operation is derived from an expert rule set and schedules the load and distributed energy sources along with the utility grid.
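The mode of operation "derived from an expert rule set" can be pictured as a small decision function over forecast generation, load demand, and battery state of charge. The mode names and thresholds below are assumptions for illustration, not the paper's actual rules:

```python
def select_mode(forecast_kw, load_kw, soc):
    """Pick an operating mode from an illustrative expert rule set.
    forecast_kw: predicted renewable generation; load_kw: demand;
    soc: battery state of charge in [0, 1]. Thresholds are hypothetical."""
    surplus = forecast_kw - load_kw
    if surplus >= 0 and soc < 0.9:
        return "charge-battery"      # store the renewable surplus
    if surplus >= 0:
        return "export-to-grid"      # battery nearly full, sell surplus
    if soc > 0.3:
        return "discharge-battery"   # cover the deficit from storage
    return "import-from-grid"        # storage depleted, buy from utility

print(select_mode(120, 80, 0.5))   # surplus with room in the battery
print(select_mode(120, 80, 0.95))  # surplus, battery nearly full
print(select_mode(40, 80, 0.6))    # deficit, battery available
print(select_mode(40, 80, 0.1))    # deficit, battery depleted
```

A real controller would layer forecast uncertainty and tariff schedules on top of such rules, but the dispatch logic reduces to this kind of priority ordering.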

  7. All-Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Ferguson, Matthew; Norman, Claude

    2010-01-01

All source information analysis enables proactive implementation of in-field verification activities, supports the State Evaluation process, and is essential to the IAEA's strengthened safeguards system. Information sources include State-declared nuclear material accounting and facility design information; voluntarily supplied information such as nuclear procurement data; commercial satellite imagery; open source information and information/results from design information verifications (DIVs), inspections and complementary accesses (CAs). The analysis of disparate information sources directly supports inspections, design information verifications and complementary access, and enables both more reliable cross-examination for consistency and completeness and in-depth investigation of possible safeguards compliance issues. Comparison of State-declared information against information on illicit nuclear procurement networks, possible trafficking in nuclear materials, and scientific and technical information on nuclear-related research and development programmes provides complementary measures for monitoring nuclear developments and increases Agency capabilities to detect possible undeclared nuclear activities. Likewise, expert analysis of commercial satellite imagery plays a critical role in monitoring un-safeguarded sites and facilities. In sum, the combination of these measures provides early identification of possible undeclared nuclear material or activities, thus enhancing deterrence, supporting a safeguards system that is fully information driven, and increasing confidence in safeguards conclusions. By increasing confidence that nuclear materials and technologies in States under safeguards are used solely for peaceful purposes, information-driven safeguards will strengthen the nuclear non-proliferation system. Key assets for Agency collection, processing, expert analysis, and integration of these information sources are the Information Collection and Analysis

  8. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  9. Apportionment of sources affecting water quality: Case study of Kandla Creek, Gulf of Katchchh

    Digital Repository Service at National Institute of Oceanography (India)

    Dalal, S.G.; Shirodkar, P.V.; Verlekar, X.N.; Jagtap, T.G.; Rao, G.S.

status of the environment. Several multivariate models are used for source apportionment studies, as they pinpoint the possible factors or sources that influence the water quality (Morales et al., 1999; Wunderlin et al., 2001; Petersen et al., 2001).

  10. Noise source analysis of nuclear ship Mutsu plant using multivariate autoregressive model

    International Nuclear Information System (INIS)

    Hayashi, K.; Shimazaki, J.; Shinohara, Y.

    1996-01-01

The present study is concerned with the noise sources in the N.S. Mutsu reactor plant. The noise experiments on the Mutsu plant were performed in order to investigate the plant dynamics and the effects of sea conditions and ship motion on the plant. The reactor noise signals as well as the ship motion signals were analyzed by a multivariate autoregressive (MAR) modeling method to clarify the noise sources in the reactor plant. It was confirmed from the analysis results that most of the plant variables were affected mainly by a horizontal component of the ship motion, namely the sway, through vibrations of the plant structures. Furthermore, the effect of ship motion on the reactor power was evaluated through the analysis of wave components extracted by a geometrical transform method. It was concluded that the amplitude of the reactor power oscillation was about 0.15% in normal sea conditions, which was small enough for safe operation of the reactor plant. (authors)
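MAR modelling of this kind boils down to regressing each plant signal on lagged values of itself and of candidate noise sources, then reading the fitted coefficients as contributions. A toy version with one plant variable and one ship-motion variable, on synthetic data (the coupling coefficients are invented, not Mutsu measurements):

```python
import random

rng = random.Random(1)
# Synthetic stand-in signals: sway drives reactor power with a one-step lag.
n = 3000
sway = [0.0] * n
power = [0.0] * n
for t in range(1, n):
    sway[t] = 0.8 * sway[t - 1] + rng.gauss(0, 1.0)
    power[t] = 0.5 * power[t - 1] + 0.3 * sway[t - 1] + rng.gauss(0, 0.2)

def ar_fit(y, x):
    """Least-squares fit of y[t] = a*y[t-1] + b*x[t-1], via 2x2 normal equations."""
    syy = sxx = sxy = sy1y = sx1y = 0.0
    for t in range(1, len(y)):
        y1, x1 = y[t - 1], x[t - 1]
        syy += y1 * y1; sxx += x1 * x1; sxy += y1 * x1
        sy1y += y1 * y[t]; sx1y += x1 * y[t]
    det = syy * sxx - sxy * sxy
    a = (sy1y * sxx - sx1y * sxy) / det
    b = (sx1y * syy - sy1y * sxy) / det
    return a, b

a, b = ar_fit(power, sway)
print(f"power[t] ~ {a:.2f}*power[t-1] + {b:.2f}*sway[t-1]")
```

The fit recovers the simulated lag coefficients; the full MAR analysis does the same across many variables and lags to attribute reactor-power fluctuations to the sway component.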

  11. Analysis of core-concrete interaction event with flooding for the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Kim, S.H.; Taleyarkhan, R.P.; Georgevich, V.; Navarro-Valenti, S.

    1993-01-01

    This paper discusses salient aspects of the methodology, assumptions, and modeling of various features related to estimation of source terms from an accident involving a molten core-concrete interaction event (with and without flooding) in the Advanced Neutron Source (ANS) reactor at the Oak Ridge National Laboratory. Various containment configurations are considered for this postulated severe accident. Several design features (such as rupture disks) are examined to study containment response during this severe accident. Also, thermal-hydraulic response of the containment and radionuclide transport and retention in the containment are studied. The results are described as transient variations of source terms, which are then used for studying off-site radiological consequences and health effects for the support of the Conceptual Safety Analysis Report for ANS. The results are also to be used to examine the effectiveness of subpile room flooding during this type of severe accident

  12. Source and LINAC3 studies

    CERN Document Server

    Bellodi, G

    2017-01-01

In the framework of the LHC Ion Injector Upgrade programme (LIU), several activities have been carried out in 2016 to improve the ion source and Linac3 performance, with the goal to increase the beam current routinely delivered to LEIR. The extraction region of the GTS-LHC ion source was upgraded with enlarged vacuum chamber apertures and the addition of an einzel lens, yielding higher transmission through the rest of the machine. Also, a series of experiments have been performed to study the effects of double frequency mixing on the afterglow performance of the source after installation of a Travelling Wave Tube Amplifier (TWTA) as secondary microwave source at variable frequency. Measurements have been carried out at a dedicated oven test stand for better understanding of the ion source performance. Finally, several MD sessions were dedicated to the study and characterization of the stripping foils, after evidence of degradation in time was discovered in the 2015 run.

  13. Study of the 137Cs Stabilizer Source

    Directory of Open Access Journals (Sweden)

GAO Yan; WANG Yan-ling; XU Zhi-jian; XU Liang; REN Chun-xia; TAN Xiao-ming; CUI Hong-qi

    2014-02-01

Full Text Available The attenuation laws of Cesium-137 γ-rays penetrating the ceramic core, stainless steel and tungsten steel were studied. The radioactivity of the 137Cs stabilizer source was determined from the surface dose rate of 137Cs stabilizer sources. In addition, the adsorption properties of the ceramic core were studied to improve the stability of the output rate, and a production line was established. The application results showed that the output rate of the ray source was accurate and of good consistency. At present, the source has been used in lithology logging, and domestic production has been achieved.

  14. Critical Analysis on Open Source LMSs Using FCA

    Science.gov (United States)

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  15. Obsidian sources characterized by neutron-activation analysis.

    Science.gov (United States)

    Gordus, A A; Wright, G A; Griffin, J B

    1968-07-26

    Concentrations of elements such as manganese, scandium, lanthanum, rubidium, samarium, barium, and zirconium in obsidian samples from different flows show ranges of 1000 percent or more, whereas the variation in element content in obsidian samples from a single flow appears to be less than 40 percent. Neutron-activation analysis of these elements, as well as of sodium and iron, provides a means of identifying the geologic source of an archeological artifact of obsidian.
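Sourcing by activation analysis exploits exactly the contrast described above: between-flow element ranges of 1000% or more against within-flow variation under 40%, so a nearest-centroid match in log-concentration space suffices to assign an artifact to a flow. The element values below are hypothetical, not the paper's measurements:

```python
import math

# Hypothetical ppm concentrations (Mn, Rb, La) for two obsidian flows; illustrative only.
sources = {
    "Flow A": [(450, 120, 28), (470, 115, 30), (440, 125, 27)],
    "Flow B": [(900, 60, 14), (950, 65, 15), (880, 58, 13)],
}
centroids = {name: [sum(s[i] for s in samples) / len(samples) for i in range(3)]
             for name, samples in sources.items()}

def match_source(artifact):
    """Assign an artifact to the flow with the nearest centroid in log space,
    where between-flow contrasts dominate the small within-flow spread."""
    def dist(c):
        return math.dist([math.log(v) for v in artifact],
                         [math.log(v) for v in c])
    return min(centroids, key=lambda n: dist(centroids[n]))

print(match_source((460, 118, 29)))  # within Flow A's spread
print(match_source((920, 62, 14)))   # within Flow B's spread
```

Real studies measure more elements and use multivariate statistics, but the wide between-flow separation is what makes even this simple matching reliable.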

  16. An Analysis of Air Pollution in Makkah - a View Point of Source Identification

    Directory of Open Access Journals (Sweden)

    Turki M. Habeebullah

    2013-07-01

Full Text Available Makkah is one of the busiest cities in Saudi Arabia and remains busy all year round, especially during the season of Hajj and the month of Ramadan, when millions of people visit the city. This emphasizes the importance of clean air and of understanding the sources of various air pollutants, which is vital for the management and advanced modeling of air pollution. This study intends to identify the major sources of air pollutants in Makkah, near the Holy Mosque (Al-Haram), using a graphical approach. Air pollutants considered in this study are nitrogen oxides (NOx), nitrogen dioxide (NO2), nitric oxide (NO), carbon monoxide (CO), sulphur dioxide (SO2), ozone (O3) and particulate matter with aerodynamic diameter of 10 µm or less (PM10). Polar plots, time variation plots and correlation analysis are used to analyse the data and identify the major sources of emissions. Most of the pollutants demonstrate high concentrations during the morning traffic peak hours, suggesting road traffic as the main source of emission. The main sources of pollutant emissions identified in Makkah were road traffic and re-suspended and windblown dust and sand particles. Further investigation on detailed source apportionment is required, which is part of the ongoing project.
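The correlation analysis used in studies like this rests on the fact that pollutants sharing a source co-vary over the diurnal cycle. The sketch below computes Pearson correlations over an idealized 24-hour traffic profile (synthetic numbers, not Makkah data); primary traffic pollutants correlate positively with each other and negatively with ozone, which fresh NO titrates:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Idealized diurnal traffic profile (24 hourly means) peaking at morning rush hour.
traffic = [max(0.0, math.sin((h - 2) * math.pi / 12)) + 0.2 for h in range(24)]
nox = [3.0 * t for t in traffic]         # traffic-emitted
co  = [1.5 * t + 0.1 for t in traffic]   # traffic-emitted
o3  = [2.0 - 0.5 * t for t in traffic]   # titrated by fresh NO

print(f"r(NOx, CO) = {pearson(nox, co):.2f}")
print(f"r(NOx, O3) = {pearson(nox, o3):.2f}")
```

With real monitoring data the correlations are weaker but follow the same sign pattern, which is how correlation analysis supports the traffic-source attribution.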

  17. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.
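PANDORA's keyword-based set analysis can be pictured as inverting protein-to-keyword annotations into keyword-to-protein sets and then intersecting them. A toy sketch with invented proteins and keywords (the real tool draws these from SwissProt, GO, InterPro, etc.):

```python
# Toy protein -> keyword annotations (hypothetical identifiers).
annotations = {
    "P1": {"kinase", "membrane"}, "P2": {"kinase", "membrane"},
    "P3": {"kinase", "nucleus"},  "P4": {"protease", "nucleus"},
    "P5": {"protease", "nucleus"},
}

def keyword_sets(annotations):
    """Invert the annotation map: group proteins by shared keyword,
    the unit of PANDORA-style set analysis."""
    sets = {}
    for prot, kws in annotations.items():
        for kw in kws:
            sets.setdefault(kw, set()).add(prot)
    return sets

sets = keyword_sets(annotations)
# Intersections reveal subsets sharing several biological properties at once.
print(sorted(sets["kinase"] & sets["membrane"]))   # ['P1', 'P2']
print(sorted(sets["protease"] & sets["nucleus"]))  # ['P4', 'P5']
```

The graphical diagrams PANDORA draws are essentially these sets and their intersections laid out as a hierarchy.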

  18. Analysis of 3-panel and 4-panel microscale ionization sources

    International Nuclear Information System (INIS)

    Natarajan, Srividya; Parker, Charles B.; Glass, Jeffrey T.; Piascik, Jeffrey R.; Gilchrist, Kristin H.; Stoner, Brian R.

    2010-01-01

Two designs of a microscale electron ionization (EI) source are analyzed herein: a 3-panel design and a 4-panel design. Devices were fabricated using microelectromechanical systems technology. Field emission from carbon nanotubes provided the electrons for the EI source. Ion currents were measured for helium, nitrogen, and xenon at pressures ranging from 10⁻⁴ to 0.1 Torr. A comparison of the performance of both designs is presented. The 4-panel microion source showed a 10x improvement in performance compared to the 3-panel device. An analysis of the various factors affecting the performance of the microion sources is also presented. SIMION, an electron and ion optics software package, was coupled with experimental measurements to analyze the ion current results. The electron current contributing to ionization and the ion collection efficiency are believed to be the primary factors responsible for the higher efficiency of the 4-panel microion source. Other improvements in device design that could lead to higher ion source efficiency in the future are also discussed. These microscale ion sources are expected to find application as stand-alone ion sources as well as in miniature mass spectrometers.

  19. Study of the Release Process of Open Source Software: Case Study

    OpenAIRE

    Eide, Tor Erik

    2007-01-01

    This report presents the results of a case study focusing on the release process of open source projects initiated with commercial motives. The purpose of the study is to gain an increased understanding of the release process, how a community can be attracted to the project, and how the interaction with the community evolves in commercial open source initiatives. Data has been gathered from four distinct sources to form the basis of this thesis. A thorough review of the open source literatu...

  20. OVAS: an open-source variant analysis suite with inheritance modelling.

    Science.gov (United States)

    Mozere, Monika; Tekman, Mehmet; Kari, Jameela; Bockenhauer, Detlef; Kleta, Robert; Stanescu, Horia

    2018-02-08

The advent of modern high-throughput genetics continually broadens the gap between the rising volume of sequencing data and the tools required to process them. The need to pinpoint a small subset of functionally important variants has now shifted towards identifying the critical differences between normal variants and disease-causing ones. The ever-increasing reliance on cloud-based services for sequence analysis, and the non-transparent methods they utilize, has prompted the need for more in-situ services that can provide a safer and more accessible environment to process patient data, especially in circumstances where continuous internet usage is limited. To address these issues, we herein propose our standalone Open-source Variant Analysis Sequencing (OVAS) pipeline, consisting of three key stages of processing that pertain to the separate modes of annotation, filtering, and interpretation. Core annotation performs variant mapping to gene isoforms at the exon/intron level, appends functional data pertaining to the type of variant mutation, and determines hetero-/homozygosity. An extensive inheritance-modelling module in conjunction with 11 other filtering components can be used in sequence, ranging from single quality control to multi-file penetrance model specifics such as X-linked recessive or mosaicism. Depending on the type of interpretation required, additional annotation is performed to identify organ specificity through gene expression and protein domains. In the course of this paper we analysed an autosomal recessive case study. OVAS made effective use of the filtering modules to recapitulate the results of the study by identifying the prescribed compound-heterozygous disease pattern from exome-capture sequence input samples.
OVAS is an offline open-source modular-driven analysis environment designed to annotate and extract useful variants from Variant Call Format (VCF) files, and process them under an inheritance context through a top-down filtering schema of

  1. Analysis of geological material and especially ores by means of a 252Cf source

    International Nuclear Information System (INIS)

    Barrandon, J.N.; Borderie, B.; Melky, S.; Halfon, J.; Marce, A.

    1976-01-01

Tests were made on the possibilities for analysis by 252 Cf activation in the earth sciences and mining research. The results obtained show that while 252 Cf activation can only resolve certain very specific geochemical research problems, it does allow the exact and rapid determination of numerous elements whose ores are of great economic importance, such as fluorine, titanium, vanadium, manganese, copper, antimony, barium, and tungsten. The utilization of activation analysis methods in the earth sciences is not a recent phenomenon. It has generally been limited to the analysis of traces in relatively small volumes by means of irradiation in nuclear reactors. Traditional neutron sources were little used and were not very applicable. The development of 252 Cf isotopic sources emitting more intense neutron fluxes makes it possible to consider carrying out more sensitive determinations without making use of a nuclear reactor. In addition, this technique can be adapted for in situ analysis in mines and mine borings. Our work, which is centered upon the possibilities of instrumental laboratory analyses of geological materials through 252 Cf activation, is oriented in two principal directions: the study of the experimental sensitivities of the various elements in different rocks with the usual compositions; and the study of the possibilities for routine ore analyses

  2. The adoption of total cost of ownership for sourcing decisions - a structural equations analysis

    NARCIS (Netherlands)

    Wouters, Marc; Anderson, James C.; Wynstra, Finn

    2005-01-01

    This study investigates the adoption of total cost of ownership (TCO) analysis to improve sourcing decisions. TCO can be seen as an application of activity based costing (ABC) that quantifies the costs that are involved in acquiring and using purchased goods or services. TCO supports purchasing

  3. On Road Study of Colorado Front Range Greenhouse Gases Distribution and Sources

    Science.gov (United States)

    Petron, G.; Hirsch, A.; Trainer, M. K.; Karion, A.; Kofler, J.; Sweeney, C.; Andrews, A.; Kolodzey, W.; Miller, B. R.; Miller, L.; Montzka, S. A.; Kitzis, D. R.; Patrick, L.; Frost, G. J.; Ryerson, T. B.; Robers, J. M.; Tans, P.

    2008-12-01

The Global Monitoring Division and Chemical Sciences Division of the NOAA Earth System Research Laboratory teamed up over the summer of 2008 to experiment with a new measurement strategy to characterize the distribution and sources of greenhouse gases in the Colorado Front Range. Combining expertise in greenhouse gas measurements and in local- to regional-scale air quality intensive campaigns, we have built the 'Hybrid Lab'. A continuous CO2 and CH4 cavity ring-down spectroscopic analyzer (Picarro, Inc.), a CO gas-filter correlation instrument (Thermo Environmental, Inc.), and a continuous UV absorption ozone monitor (2B Technologies, Inc., model 202SC) were installed securely onboard a 2006 Toyota Prius hybrid vehicle, with an inlet bringing in outside air from a few meters above the ground. To better characterize point and distributed sources, air samples were taken with a Portable Flask Package (PFP) for later multiple-species analysis in the lab. A GPS unit hooked up to the ozone analyzer and another installed on the PFP kept track of our location, allowing us to map measured concentrations on the driving route using Google Earth. The Hybrid Lab went out for several drives in the vicinity of the NOAA Boulder Atmospheric Observatory (BAO) tall tower located in Erie, CO, covering areas from Boulder, Denver, Longmont, Fort Collins and Greeley. Enhancements in CO2 and CO and destruction of ozone mainly reflect emissions from traffic. Methane enhancements, however, are clearly correlated with nearby point sources (landfill, feedlot, natural gas compressor ...) or with larger-scale air masses advected from NE Colorado, where oil and gas drilling operations are widespread. The multiple-species analysis (hydrocarbons, CFCs, HFCs) of the air samples collected along the way brings insightful information about the methane sources at play. 
We will present results of the analysis and interpretation of the Hybrid Lab Front Range Study and conclude with perspectives

  4. Soprano and source: A laryngographic analysis

    Science.gov (United States)

    Bateman, Laura Anne

    2005-04-01

Popular music in the 21st century uses a particular singing quality for the female voice that is quite different from the trained classical singing quality. Classical quality has been the subject of a vast body of research, whereas research that deals with non-classical qualities is limited. In order to learn more about these issues, the author chose to do research on singing qualities using a variety of standard voice quality tests. This paper looks at voice qualities found in various styles of singing: Classical, Belt, Legit, R&B, Jazz, Country, and Pop. The data were elicited from a professional soprano, and the voice qualities reflect industry standards. The data set for this paper is limited to samples using the vowel [i]. Laryngographic (LGG) data were generated simultaneously with the audio samples. This paper will focus on the results of the LGG analysis; however, an audio analysis was also performed using Spectrogram, LPC, and FFT. Data from the LGG are used to calculate the contact quotient, speed quotient, and ascending slope. The LGG waveform is also visually assessed. The LGG analysis gives insights into the source vibration for the different singing styles.
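Of the LGG measures named above, the contact quotient is the simplest: the fraction of each glottal cycle during which the vocal folds are in contact, estimated here as the share of the signal above a contact threshold. A sketch on a synthetic periodic waveform (a sine stands in for the real, asymmetric LGG signal, and the 0.3 threshold is an arbitrary illustrative choice):

```python
import math

# Synthetic EGG-like signal: 10 glottal cycles over 2000 samples (illustrative only).
n = 2000
egg = [math.sin(2 * math.pi * 10 * t / n) for t in range(n)]

def contact_quotient(signal, threshold):
    """Closed-phase fraction: share of samples above the contact threshold.
    Over whole cycles of a periodic signal this approximates
    closed time / cycle time."""
    return sum(1 for v in signal if v > threshold) / len(signal)

cq = contact_quotient(egg, 0.3)
print(f"contact quotient ~ {cq:.2f}")
```

Real LGG analysis segments the waveform cycle by cycle and also derives the speed quotient from the relative durations of the rising and falling contact phases, but the threshold-crossing idea is the same.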

  5. [Study of self-reported health of people living near point sources of environmental pollution: a review. Second part: analysis of results and perspectives].

    Science.gov (United States)

    Daniau, C; Dor, F; Eilstein, D; Lefranc, A; Empereur-Bissonnet, P; Dab, W

    2013-08-01

    Epidemiological studies have investigated the health impacts of local sources of environmental pollution using as an outcome variable self-reported health, reflecting the overall perception interviewed people have of their own health. This work aims at analyzing the advantages and the results of this approach. This second part presents the results of the studies. Based on a literature review (51 papers), this article presents an analysis of the contribution of self-reported health to epidemiological studies investigating local sources of environmental pollution. It discusses the associations between self-reported health and exposure variables, and other risk factors that can influence health reporting. Studies using self-reported health showed that local sources can be associated with a wide range of health outcomes, including an impact on mental health and well-being. The perception of pollution, especially sensory information such as odors, affects self-reported health. Attitudes referring to beliefs, worries and personal behaviors concerning the source of pollution have a striking influence on reported health. Attitudes can be used to estimate the reporting bias in a biomedical approach, and also constitute the main explanatory factors in biopsychosocial studies taking into account not only the biological, physical, and chemical factors but also the psychological and social factors at stake in a situation of environmental exposure. Studying self-reported health enables a multifactorial approach to health in a context of environmental exposure. This approach is most relevant when conducted within a multidisciplinary framework involving human and social sciences to better understand psychosocial factors. The relevance of this type of approach used as an epidemiological surveillance tool to monitor local situations should be assessed with regard to needs for public health management of these situations. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  6. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria (with R2 ...), giving insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
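The standardized regression coefficients (SRCs) named in this record can be sketched with a toy stand-in model; the three-input linear function below is an assumption for illustration, not the Benchmark Simulation Model no. 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a plant model: an effluent criterion as a
# function of three uncertain inputs (the real benchmark is far richer).
def model(x1, x2, x3):
    return 2.0 * x1 + 0.5 * x2 + 0.1 * x3

# Monte Carlo sample of the uncertain inputs.
n = 5000
X = rng.normal(size=(n, 3))
y = model(X[:, 0], X[:, 1], X[:, 2])

# Ordinary least squares, then standardize the coefficients:
#   SRC_i = b_i * std(X_i) / std(y)
A = np.column_stack([np.ones(n), X])
b = np.linalg.lstsq(A, y, rcond=None)[0][1:]
src = b * X.std(axis=0) / y.std()

print(np.round(src, 3))          # largest for x1, smallest for x3
print(round(float(np.sum(src**2)), 3))
```

For an additive model the squared SRCs sum to R-squared, which is how the method "decomposes the variance" of a performance criterion and ranks the inputs.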

  7. Analysis of primary teacher stress' sources

    Directory of Open Access Journals (Sweden)

    Katja Depolli Steiner

    2011-12-01

    Teachers are subject to many different work stressors. This study focused on differences in the intensity and frequency of potential stressors facing primary schoolteachers, with the goal of identifying the most important sources of teacher stress in primary school. The study included 242 primary schoolteachers from different parts of Slovenia. We used a Stress Inventory designed to identify the intensity and frequency of 49 situations that can act as teachers' work stressors. Findings showed that the major sources of stress facing teachers are factors related to work overload, factors stemming from pupils' behaviour and motivation, and factors related to the school system. Results also showed some small differences in the perception of stressors in different groups of teachers (by gender and by teaching level).

  8. Source localization of rhythmic ictal EEG activity: a study of diagnostic accuracy following STARD criteria.

    Science.gov (United States)

    Beniczky, Sándor; Lantz, Göran; Rosenzweig, Ivana; Åkeson, Per; Pedersen, Birthe; Pinborg, Lars H; Ziebell, Morten; Jespersen, Bo; Fuglsang-Frederiksen, Anders

    2013-10-01

    Although precise identification of the seizure-onset zone is an essential element of presurgical evaluation, source localization of ictal electroencephalography (EEG) signals has received little attention. The aim of our study was to estimate the accuracy of source localization of rhythmic ictal EEG activity using a distributed source model. Source localization of rhythmic ictal scalp EEG activity was performed in 42 consecutive cases fulfilling inclusion criteria. The study was designed according to recommendations for studies on diagnostic accuracy (STARD). The initial ictal EEG signals were selected using a standardized method, based on frequency analysis and voltage distribution of the ictal activity. A distributed source model, local autoregressive average (LAURA), was used for the source localization. Sensitivity, specificity, and measurement of agreement (kappa) were determined based on the reference standard: the consensus conclusion of the multidisciplinary epilepsy surgery team. Predictive values were calculated from the surgical outcome of the operated patients. To estimate the clinical value of the ictal source analysis, we compared the likelihood ratios of concordant and discordant results. Source localization was performed blinded to the clinical data, and before the surgical decision. Reference standard was available for 33 patients. The ictal source localization had a sensitivity of 70% and a specificity of 76%. The mean measurement of agreement (kappa) was 0.61, corresponding to substantial agreement (95% confidence interval (CI) 0.38-0.84). Twenty patients underwent resective surgery. The positive predictive value (PPV) for seizure freedom was 92% and the negative predictive value (NPV) was 43%. The likelihood ratio was nine times higher for the concordant results, as compared with the discordant ones.
Source localization of rhythmic ictal activity using a distributed source model (LAURA) for the ictal EEG signals selected with a standardized method

  9. Monte Carlo design study of a moderated {sup 252}Cf source for in vivo neutron activation analysis of aluminium

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, D.G.; Natto, S.S.A.; Evans, C.J. [Swansea In Vivo Analysis and Cancer Research Group, Department of Physics, University of Wales, Swansea (United Kingdom); Ryde, S.J.S. [Swansea In Vivo Analysis and Cancer Research Group, Department of Medical Physics and Clinical Engineering, Singleton Hospital, Swansea (United Kingdom)

    1997-04-01

    The Monte Carlo computer code MCNP has been used to design a moderated {sup 252}Cf neutron source for in vivo neutron activation analysis of aluminium (Al) in the bones of the hand. The clinical motivation is the need to monitor Al body burden in subjects with renal dysfunction, at risk of Al toxicity. The design involves the source positioned on the central axis at one end of a cylindrical deuterium oxide moderator. The moderator is surrounded by a graphite reflector, with the hand inserted at the end of the moderator opposing the source. For a 1 mg {sup 252}Cf source, a 15 cm long x 20 cm radius moderator and a 20 cm thick reflector, the estimated minimum detection limit is 0.5 mg Al for a 20 min irradiation, with an equivalent dose of 16.5 mSv to the hand. Increasing the moderator length and/or introducing a fast neutron filter (for example silicon) further reduces interference from fast-neutron-induced reactions on phosphorus in bone, at the expense of a decreased fluence of the thermal neutrons which activate Al. Increased source strengths may be necessary to compensate for this decreased thermal fluence, or to allow measurements to be made within an acceptable time limit for the comfort of the patient. (author)

  10. Open-Source Data and the Study of Homicide.

    Science.gov (United States)

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to be used as a valid and reliable data source for testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data, and for every variable measured, the open sources captured as much, or more, of the information presented in the official data. In addition, variables not available in official data, but potentially useful for testing theory, were identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.

  11. Statistical Analysis of the Microvariable AGN Source Mrk 501

    Directory of Open Access Journals (Sweden)

    Alberto C. Sadun

    2018-02-01

    We report on the optical observations and analysis of the high-energy peaked BL Lac object (HBL) Mrk 501, at redshift z = 0.033. We can confirm microvariable behavior over the course of minutes on several occasions per night. As an alternative to the commonly understood dynamical model of random variations in the intensity of the AGN, we develop a relativistic beaming model with a minimum of free parameters, which allows us to infer changes in the line-of-sight angles for the motion of the different relativistic components. We hope our methods can be used in future studies of beamed emission in other active microvariable sources, similar to the one we explored.

  12. Frequency spectrum analysis of 252Cf neutron source based on LabVIEW

    International Nuclear Information System (INIS)

    Mi Deling; Li Pengcheng

    2011-01-01

    The frequency spectrum analysis of a 252Cf neutron source is an extremely important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of neutron pulse series, this paper proposes a fast-correlation algorithm to improve the computational rate of the spectrum analysis system. Multi-core processor technology is employed, along with the multi-threaded programming techniques of LabVIEW, to construct a frequency spectrum analysis system for the 252Cf neutron source based on LabVIEW. It obtains not only the auto-correlation and cross-correlation results, but also the auto-power spectrum, cross-power spectrum and ratio of spectral density. The results show that analysis tools based on LabVIEW improve the operating efficiency of the fast auto-correlation and cross-correlation code by about 25% to 35%, and also verify the feasibility of using LabVIEW for spectrum analysis. (authors)
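The record does not give the fast-correlation algorithm itself, but the standard FFT route to the auto-correlation of a '0'/'1' pulse series (the Wiener-Khinchin theorem) can be sketched; the pulse train below is simulated:

```python
import numpy as np

rng = np.random.default_rng(1)

# Binary pulse series: 1 where a detector pulse occurred in a time bin.
pulses = (rng.random(4096) < 0.05).astype(float)

def autocorr_fft(x):
    """Biased autocorrelation via the Wiener-Khinchin theorem: inverse
    FFT of the power spectrum, zero-padded to avoid circular wrap-around."""
    n = len(x)
    X = np.fft.rfft(x, 2 * n)      # zero-pad to 2n for linear correlation
    power = X * np.conj(X)
    return np.fft.irfft(power)[:n] / n

r = autocorr_fft(pulses)
# For a 0/1 series x*x = x, so the lag-0 value equals the mean pulse rate.
print(round(r[0], 4), round(pulses.mean(), 4))
```

Cross-correlation between two detector channels follows the same pattern with `rfft` of one channel multiplied by the conjugate `rfft` of the other; the power spectra quoted in the record are the intermediate `power` arrays.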

  13. Receptor modeling studies for the characterization of PM10 pollution sources in Belgrade

    Directory of Open Access Journals (Sweden)

    Mijić Zoran

    2012-01-01

    The objective of this study is to determine the major sources and potential source regions of PM10 over Belgrade, Serbia. The PM10 samples were collected from July 2003 to December 2006 in a very urban area of Belgrade, and concentrations of Al, V, Cr, Mn, Fe, Ni, Cu, Zn, Cd and Pb were analyzed by atomic absorption spectrometry. The analysis of seasonal variations of PM10 mass and some element concentrations showed relatively higher concentrations in winter, which underlines the importance of local emission sources. The Unmix model was used for source apportionment purposes, and four main source profiles (fossil fuel combustion; traffic exhaust/regional transport from industrial centers; traffic-related particles/site-specific sources; and mineral/crustal matter) were identified. Among the resolved factors, fossil fuel combustion was the highest contributor (34%), followed by traffic/regional industry (26%). Conditional probability function (CPF) results identified possible directions of local sources. The potential source contribution function (PSCF) and concentration weighted trajectory (CWT) receptor models were used to identify the spatial source distribution and the contribution of regional-scale transported aerosols. [Project of the Ministry of Science of the Republic of Serbia, no. III43007 and no. III41011]
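The conditional probability function (CPF) used in this record has a simple form: per wind sector, the fraction of hours whose concentration exceeds a high percentile of all observations. A sketch with invented wind and PM10 data follows; the source direction, sector width and percentile are assumptions, not the Belgrade study's settings:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical hourly observations: wind direction (deg) and PM10 (ug/m3),
# with an invented source placed around 112.5 deg for illustration.
n = 2000
wd = rng.uniform(0.0, 360.0, n)
pm10 = rng.lognormal(3.0, 0.4, n)
near_source = np.abs(wd - 112.5) < 20.0
pm10[near_source] *= 2.0            # inflate concentrations from that sector

def cpf(wd, conc, sector=45.0, pct=75):
    """Conditional probability function: for each wind sector, the fraction
    of hours whose concentration exceeds the chosen percentile of all data."""
    thr = np.percentile(conc, pct)
    edges = np.arange(0.0, 360.0 + sector, sector)
    probs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_sector = (wd >= lo) & (wd < hi)
        m = np.count_nonzero(in_sector & (conc > thr))
        probs.append(m / max(np.count_nonzero(in_sector), 1))
    return np.array(probs)

probs = cpf(wd, pm10)
print(np.argmax(probs))   # sector index 2, i.e. 90-135 deg, flags the source
```

PSCF applies the same conditional-probability idea to grid cells crossed by back-trajectories rather than to local wind sectors.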

  14. Positron annihilation lifetime spectroscopy source correction determination: A simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Kanda, Gurmeet S.; Keeble, David J., E-mail: d.j.keeble@dundee.ac.uk

    2016-02-01

    Positron annihilation lifetime spectroscopy (PALS) can provide sensitive detection and identification of vacancy-related point defects in materials. These measurements are normally performed using a positron source supported, and enclosed by, a thin foil. Annihilation events from this source arrangement must be quantified and are normally subtracted from the spectrum before analysis of the material lifetime components proceeds. Here simulated PALS spectra reproducing source correction evaluation experiments have been systematically fitted and analysed using the packages PALSfit and MELT. Simulations were performed assuming a single lifetime material, and for a material with two lifetime components. Source correction terms representing a directly deposited source and various foil supported sources were added. It is shown that in principle these source terms can be extracted from suitably designed experiments, but that fitting a number of independent, nominally identical, spectra is recommended.
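A much-simplified version of the fitting problem in this record can be sketched: a two-term spectrum (a material lifetime plus a longer-lived source component) analysed by classic 'tail peeling'. The lifetimes, intensities and the neglect of the instrument resolution function are all simplifying assumptions, not the PALSfit/MELT procedure:

```python
import numpy as np

# Time axis in ns; the instrument resolution is ignored in this toy model.
t = np.linspace(0.0, 10.0, 1001)

# Simulated two-term spectrum: a material lifetime plus a longer-lived
# 'source' contribution from the foil-supported positron source.
i1, tau1 = 0.85, 0.20   # material component (intensity, lifetime in ns)
i2, tau2 = 0.15, 1.50   # source component
y = i1 / tau1 * np.exp(-t / tau1) + i2 / tau2 * np.exp(-t / tau2)

# Tail peeling: beyond ~4 ns the short component is negligible
# (exp(-20) ~ 1e-9), so a log-linear fit there recovers the source term.
tail = t > 4.0
slope, intercept = np.polyfit(t[tail], np.log(y[tail]), 1)
tau2_est = -1.0 / slope
i2_est = np.exp(intercept) * tau2_est

# Subtract the fitted source term and repeat on the early region.
resid = y - i2_est / tau2_est * np.exp(-t / tau2_est)
early = (t < 0.6) & (resid > 0)
slope1, _ = np.polyfit(t[early], np.log(resid[early]), 1)
tau1_est = -1.0 / slope1

print(round(tau2_est, 3), round(tau1_est, 3))  # recovers 1.5 and 0.2
```

Real PALS analysis fits all components simultaneously, convolved with the measured resolution function and with counting statistics included, which is exactly why the record's point about fitting several nominally identical spectra matters.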

  15. Capturing heterogeneity in gene expression studies by surrogate variable analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey T Leek

    2007-09-01

    It has unambiguously been shown that genetic, environmental, demographic, and technical factors may have substantial effects on gene expression levels. In addition to the measured variable(s) of interest, there will tend to be sources of signal due to factors that are unknown, unmeasured, or too complicated to capture through simple models. We show that failing to incorporate these sources of heterogeneity into an analysis can have widespread and detrimental effects on the study. Not only can this reduce power or induce unwanted dependence across genes, but it can also introduce sources of spurious signal to many genes. This phenomenon is true even for well-designed, randomized studies. We introduce "surrogate variable analysis" (SVA) to overcome the problems caused by heterogeneity in expression studies. SVA can be applied in conjunction with standard analysis techniques to accurately capture the relationship between expression and any modeled variables of interest. We apply SVA to disease class, time course, and genetics of gene expression studies. We show that SVA increases the biological accuracy and reproducibility of analyses in genome-wide expression studies.
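The core step of SVA, estimating hidden factors from the residuals of the primary model, can be sketched as follows. This is greatly simplified relative to Leek and Storey's iterative algorithm, and all sizes and effect magnitudes are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

genes, samples = 300, 40
group = np.repeat([0.0, 1.0], samples // 2)   # modeled variable of interest
hidden = rng.normal(size=samples)             # unmeasured batch-like factor

# Simulated expression: primary effect + hidden-factor effect + noise.
beta = rng.normal(0.0, 1.0, (genes, 1))
gamma = rng.normal(0.0, 2.0, (genes, 1))
E = beta * group + gamma * hidden + rng.normal(0.0, 0.5, (genes, samples))

# Step 1: remove the modeled effect gene-by-gene (project out the design).
X = np.column_stack([np.ones(samples), group])
H = X @ np.linalg.pinv(X)                     # hat matrix of the design
R = E @ (np.eye(samples) - H)                 # residual expression

# Step 2: the leading right-singular vector of the residual matrix is
# the estimated surrogate variable.
_, _, Vt = np.linalg.svd(R, full_matrices=False)
sv = Vt[0]

corr = abs(np.corrcoef(sv, hidden)[0, 1])
print(round(corr, 2))   # recovers the hidden factor (correlation near 1)
```

The estimated surrogate variable would then be included as a covariate alongside the modeled variable in each gene's regression, which is how SVA restores power and removes the spurious dependence the abstract describes.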

  16. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Science.gov (United States)

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has already been used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
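Rather than reproduce pyAudioAnalysis's own API (which has changed across versions), the kind of short-term feature extraction it performs can be sketched directly: frame the signal, then compute per-frame features such as energy and zero-crossing rate:

```python
import numpy as np

def short_term_features(signal, fs, win=0.050, step=0.025):
    """Frame a signal and compute per-frame energy and zero-crossing
    rate -- two of the classic short-term audio features."""
    w, s = int(win * fs), int(step * fs)
    feats = []
    for start in range(0, len(signal) - w + 1, s):
        frame = signal[start:start + w]
        energy = np.mean(frame ** 2)
        zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2
        feats.append((energy, zcr))
    return np.array(feats)   # shape: (num_frames, 2)

# 1 s of a 440 Hz tone at 16 kHz: steady energy, low zero-crossing rate.
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
F = short_term_features(tone, fs)
print(F.shape)   # (39, 2)
```

A classifier or segmenter then operates on statistics of these per-frame features (means, deviations), which is the pipeline the library's classification and segmentation modules build on.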

  17. Broadband Studies of Semsmic Sources at Regional and Teleseismic Distances Using Advanced Time Series Analysis Methods. Volume 1.

    Science.gov (United States)

    1991-03-21

    discussion of spectral factorability and motivations for broadband analysis, the report is subdivided into four main sections. In Section 1.0, we...estimates. The motivation for developing our multi-channel deconvolution method was to gain information about seismic sources, most notably, nuclear...with complex constraints for estimating the rupture history. Such methods (applied mostly to data sets that also include strong rmotion data), were

  18. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Vartanyants, I.A.; Singer, A. [HASYLAB at Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2009-07-15

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe the coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)

  19. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    International Nuclear Information System (INIS)

    Vartanyants, I.A.; Singer, A.

    2009-07-01

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe the coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)
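The coherent-mode decomposition described in these two records can be reproduced numerically: discretize the Gaussian Schell-model cross-spectral density and eigendecompose it. The parameter values below are illustrative, not the PETRA III or SASE1 settings:

```python
import numpy as np

# Gaussian Schell-model cross-spectral density on a 1-D grid:
#   W(x1, x2) = exp(-(x1^2 + x2^2) / (4 sigma^2)) * exp(-(x1 - x2)^2 / (2 xi^2))
sigma = 1.0   # rms source size (illustrative units)
xi = 1.0      # transverse coherence length, here comparable to the size
x = np.linspace(-5 * sigma, 5 * sigma, 400)
X1, X2 = np.meshgrid(x, x)
W = np.exp(-(X1**2 + X2**2) / (4 * sigma**2)) \
    * np.exp(-(X1 - X2)**2 / (2 * xi**2))

# Coherent-mode decomposition: eigenvectors of the Hermitian CSD matrix
# are the transverse modes, eigenvalues their relative weights.
vals = np.linalg.eigvalsh(W)[::-1]
weights = vals / vals.sum()

print(np.round(weights[:4], 3))  # the first few modes carry nearly all power
```

With the coherence length comparable to the source size, the mode weights fall off geometrically and a handful of modes dominate, mirroring the records' conclusion for the XFEL case.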

  20. Neural correlates of interference resolution in the multi-source interference task: a meta-analysis of functional neuroimaging studies.

    Science.gov (United States)

    Deng, Yuqin; Wang, Xiaochun; Wang, Yan; Zhou, Chenglin

    2018-04-10

    Interference resolution refers to cognitive control processes enabling one to focus on task-related information while filtering out unrelated information. However, the exact neural areas that underlie interference resolution in specific cognitive tasks remain equivocal. The multi-source interference task (MSIT) is a well-established experimental paradigm used to evaluate interference resolution. Studies combining the MSIT with functional magnetic resonance imaging (fMRI) have shown that the MSIT evokes the dorsal anterior cingulate cortex (dACC) and cingulate-frontal-parietal cognitive-attentional networks. However, these brain areas have not been evaluated quantitatively and these findings have not been replicated. In the current study, we report the first voxel-based meta-analysis of functional brain activation associated with the MSIT, so as to identify the localization of interference resolution in this specific cognitive task. Articles on MSIT-related fMRI published between 2003 and July 2017 were eligible. The electronic databases searched included PubMed, Web of Knowledge, and Google Scholar. Differential BOLD activation patterns between the incongruent and congruent conditions were meta-analyzed in anisotropic effect-size signed differential mapping software. Robustness meta-analysis indicated two significant activation clusters with reliable functional activity in comparisons between incongruent and congruent conditions. The first reliable activation cluster, which included the dACC, medial prefrontal cortex, and supplementary motor area, replicated the previous MSIT-related fMRI study results. Furthermore, we found another reliable activation cluster comprising areas of the right insula, right inferior frontal gyrus, and right lenticular nucleus-putamen, which were not typically discussed in previous MSIT-related fMRI studies.
The current meta-analysis study presents the reliable brain activation patterns
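At a single voxel, the effect-size pooling behind such a meta-analysis is a random-effects model. A DerSimonian-Laird sketch with invented per-study effect sizes and variances follows (the real SDM software does considerably more, e.g. anisotropic spatial smoothing and permutation-based thresholds):

```python
import numpy as np

# Hypothetical per-study effect sizes (Hedges g) and variances at one voxel.
g = np.array([0.1, 0.8, 0.3, 0.9, 0.2])
v = np.array([0.04, 0.06, 0.05, 0.03, 0.07])

# DerSimonian-Laird random-effects pooling.
w = 1.0 / v                                       # fixed-effect weights
q = np.sum(w * (g - np.sum(w * g) / w.sum())**2)  # Cochran's Q
df = len(g) - 1
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - df) / c)                     # between-study variance
w_re = 1.0 / (v + tau2)                           # random-effects weights
g_pooled = np.sum(w_re * g) / w_re.sum()
se = np.sqrt(1.0 / w_re.sum())
z = g_pooled / se

print(round(g_pooled, 3), round(z, 2))
```

Repeating this at every voxel, then thresholding the resulting z map, yields the activation clusters the abstract reports.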

  1. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    Science.gov (United States)

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
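FieldTrip itself is a MATLAB toolbox; to keep this document's examples in one language, the core idea of its nonparametric permutation statistics can be sketched in Python. The trial counts and the added condition effect below are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical single-channel amplitudes for two conditions, 20 trials
# each; a constant condition effect is added so a true difference exists.
a = rng.normal(0.0, 1.0, 20)
b = a + 1.5

def perm_test(x, y, n_perm=5000, rng=rng):
    """Two-sample permutation test: shuffle the pooled trials to build the
    null distribution of the difference of means (two-sided)."""
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        stat = perm[:len(x)].mean() - perm[len(x):].mean()
        if abs(stat) >= abs(observed):
            hits += 1
    return (hits + 1) / (n_perm + 1)   # p-value with the usual +1 correction

p = perm_test(a, b)
print(p < 0.01)   # True: the added shift lies far outside the null
```

FieldTrip extends this idea with cluster-based correction across channels and time points to handle the multiple-comparisons problem at the sensor and source level.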

  2. Thermal neutron source study

    International Nuclear Information System (INIS)

    Holden, T.M.

    1983-05-01

    The value of intense neutron beams for condensed matter research is discussed with emphasis on the complementary nature of steady state and pulsed neutron sources. A large body of information on neutron sources, both existing and planned, is then summarized under four major headings: fission reactors, electron accelerators with heavy metal targets, pulsed spallation sources and 'steady state' spallation sources. Although the cost of a spallation source is expected to exceed that of a fission reactor of the same flux by a factor of two, there are significant advantages for a spallation device such as the proposed Electronuclear Materials Test Facility (EMTF)

  3. Open source drug discovery in practice: a case study.

    Science.gov (United States)

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal, with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Technically TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project. However, both have enabled high quality research at low cost. The critical success factors appear to be clearly

  4. Assessment of metal pollution sources by SEM/EDS analysis of solid particles in snow: a case study of Žerjav, Slovenia.

    Science.gov (United States)

    Miler, Miloš; Gosar, Mateja

    2013-12-01

    Solid particles in snow deposits, sampled in mining and Pb-processing area of Žerjav, Slovenia, have been investigated using scanning electron microscopy/energy-dispersive X-ray spectroscopy (SEM/EDS). Identified particles were classified as geogenic-anthropogenic, anthropogenic, and secondary weathering products. Geogenic-anthropogenic particles were represented by scarce Zn- and Pb-bearing ore minerals, originating from mine waste deposit. The most important anthropogenic metal-bearing particles in snow were Pb-, Sb- and Sn-bearing oxides and sulphides. The morphology of these particles showed that they formed at temperatures above their melting points. They were most abundant in snow sampled closest to the Pb-processing plant and least abundant in snow taken farthest from the plant, thus indicating that Pb processing was their predominant source between the last snowfall and the time of sampling. SEM/EDS analysis showed that Sb and Sn contents in these anthropogenic phases were higher and more variable than in natural Pb-bearing ore minerals. The most important secondary weathering products were Pb- and Zn-containing Fe-oxy-hydroxides whose elemental composition and morphology indicated that they mostly resulted from oxidation of metal-bearing sulphides emitted from the Pb-processing plant. This study demonstrated the importance of single particle analysis using SEM/EDS for differentiation between various sources of metals in the environment.

  5. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive 137Cs source has been analyzed for the radioactive parent 137Cs and its stable decay daughter 137Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the 'age' determined. This paper reports an uncertainty analysis to identify areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results will be compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
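The 'age' determination in this record follows from simple decay arithmetic: for an initially Ba-free source, the daughter-to-parent atom ratio grows as exp(lam*t) - 1, so t = ln(1 + R)/lam. A sketch follows; the half-life is the standard tabulated value, and the short-lived 137mBa transient (half-life ~2.6 min) is neglected:

```python
import numpy as np

T_HALF = 30.08                          # 137Cs half-life, years
lam = np.log(2.0) / T_HALF              # decay constant, 1/years

def age_from_ratio(r_ba_cs):
    """Age since Cs purification from the measured 137Ba/137Cs atom ratio:
    for an initially Ba-free source, N_Ba/N_Cs = exp(lam*t) - 1."""
    return np.log(1.0 + r_ba_cs) / lam

# Round trip: a source sealed 25 years ago would show this ratio.
t_true = 25.0
ratio = np.exp(lam * t_true) - 1.0
print(round(age_from_ratio(ratio), 6))  # 25.0
```

Propagating the measured ratio uncertainty through this relation gives dt = (1/lam) * dR/(1 + R), which is why reducing the ICP-MS ratio uncertainty translates directly into a tighter 'age'.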

  6. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Barletta, M.; Zarimpas, N.; Zarucki, R.

    2010-10-01

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth in the volume and accessibility of information sources, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  7. Gaussian process based independent analysis for temporal source separation in fMRI

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole

    2017-01-01

    Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain, and opens up for analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning with restrictive assumptions in the regression set-up, restricts...... the exploratory nature of the analysis. Fully unsupervised independent component analysis (ICA) algorithms, on the other hand, can struggle to detect clear classifiable components on single-subject data. We attribute this shortcoming to inadequate modeling of the fMRI source signals by failing to incorporate its...

  8. Analysis of the TMI-2 source range detector response

    International Nuclear Information System (INIS)

    Carew, J.F.; Diamond, D.J.; Eridon, J.M.

    1980-01-01

    In the first few hours following the TMI-2 accident, large variations (factors of 10-100) in the source range (SR) detector response were observed. The purpose of this analysis was to quantify the various effects which could contribute to these large variations. The effects evaluated included the transmission of neutrons and photons from the core to the detector, and the reduction in the multiplication of the Am-Be startup sources, with the subsequent reduction in SR detector response, due to core voiding. A one-dimensional ANISN slab model of the TMI-2 core, core externals, pressure vessel and containment has been constructed for calculation of the SR detector response and is presented.

  9. Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.

    Science.gov (United States)

    Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi

    2018-03-01

    There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.

  10. Human Error Analysis Project (HEAP) - The Fourth Pilot Study: Scoring and Analysis of Raw Data Types

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Braarud; Per Oeyvind; Droeivoldsmo, Asgeir; Follesoe; Knut; Helgar, Stein; Kaarstad, Magnhild

    1996-01-01

    Pilot study No. 4 rounded off the series of pilot studies by looking at the important issue of the quality of the various data sources. The preceding experiments had clearly shown that it was necessary to use both concurrent and interrupted verbal protocols, and also that information about eye movements was very valuable. The effort and resources needed to analyse a combination of the different data sources are, however, significant, and it was therefore important to find out whether one or more of the data sources could replace another. To resolve this issue, pilot study No. 4 looked specifically at the quality of information provided by different data sources. The main hypotheses were that information about operators' diagnosis and decision making would be provided by verbal protocols, expert commentators, and auto-confrontation protocols, that the data sources would be valid, and that they would complement each other. The study used three main data sources: (1) concurrent verbal protocols, which were the operators' verbalisations during the experiment; (2) expert commentator reports, which were descriptions by process experts of the operators' performance; and (3) auto-confrontation, which were the operators' comments on their performance based on a replay of the performance recording minus the concurrent verbal protocol. Additional data sources were eye movement recordings, process data, alarms, etc. The three main data sources were treated as independent variables and applied according to an experimental design that facilitated the test of the main hypotheses. The pilot study produced altogether 59 verbal protocols, some of which were in Finnish. After translation into English, each protocol was analysed and scored according to a specific scheme. The scoring was designed to facilitate the evaluation of the experimental hypotheses. Due to the considerable work involved, the analysis process has only been partly completed, and no firm results

  11. Neutron activation analysis of essential elements in Multani mitti clay using miniature neutron source reactor

    International Nuclear Information System (INIS)

    Waheed, S.; Rahman, S.; Faiz, Y.; Siddique, N.

    2012-01-01

    Multani mitti clay was studied for 19 essential and other elements. Four different radio-assay schemes were adopted for instrumental neutron activation analysis (INAA) using a miniature neutron source reactor. The estimated weekly intakes of Cr and Fe are high for men, women, pregnant and lactating women and children, while the intake of Co is higher in adult categories and that of Mn by pregnant women. Comparison of MM clay with other types of clays shows that it is a good source of essential elements. - Highlights: ► Multani mitti clay has been studied for 19 essential elements for human adequacy and safety using INAA and AAS. ► Weekly intakes for different consumer categories have been calculated and compared with DRIs. ► Comparison of MM with other types of clays shows that MM clay is a good source of essential elements.

  12. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap persists in the identification of factors related to the success of open source security software. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  13. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  14. Qualitative case study data analysis: an example from practice.

    Science.gov (United States)

    Houghton, Catherine; Murphy, Kathy; Shaw, David; Casey, Dympna

    2015-05-01

    To illustrate an approach to data analysis in qualitative case study methodology. There is often little detail in case study research about how data were analysed. However, it is important that comprehensive analysis procedures are used because there are often large sets of data from multiple sources of evidence. Furthermore, the ability to describe in detail how the analysis was conducted ensures rigour in reporting qualitative research. The research example used is a multiple case study that explored the role of the clinical skills laboratory in preparing students for the real world of practice. Data analysis was conducted using a framework guided by the four stages of analysis outlined by Morse (1994): comprehending, synthesising, theorising and recontextualising. The specific strategies for analysis in these stages centred on the work of Miles and Huberman (1994), which has been successfully used in case study research. The data were managed using NVivo software. Literature examining qualitative data analysis was reviewed and strategies illustrated by the case study example provided. Discussion: Each stage of the analysis framework is described with illustration from the research example for the purpose of highlighting the benefits of a systematic approach to handling large data sets from multiple sources. By providing an example of how each stage of the analysis was conducted, it is hoped that researchers will be able to consider the benefits of such an approach to their own case study analysis. This paper illustrates specific strategies that can be employed when conducting data analysis in case study research and other qualitative research designs.

  15. Tracing diffuse anthropogenic Pb sources in rural soils by means of Pb isotope analysis

    NARCIS (Netherlands)

    Walraven, N.; Gaans, P.F.M. van; Veer, G. van der; Os, B.J.H. van; Klaver, G.T.; Vriend, S.P.; Middelburg, J.J.; Davies, G.R.

    2013-01-01

    Knowledge of the cause and source of Pb pollution is important to abate environmental Pb pollution by taking source-related actions. Lead isotope analysis is a potentially powerful tool to identify anthropogenic Pb and its sources in the environment. Spatial information on the variation of

  16. Spallation neutrons pulsed sources

    International Nuclear Information System (INIS)

    Carpenter, J.

    1996-01-01

    This article describes the range of scientific applications which can use these pulsed neutron sources: studies of superfluids, and measurements to test the reptation ('crawling') model of polymer diffusion; these sources are also useful for studying neutron decay and ultra-cold neutrons. In applications which are not accessible to neutron scattering, for example radiation damage, radionuclide production and activation analysis, spallation sources also find use, and their improvement will bring new possibilities. Among other contributions, one must note the availability of pulsed muon sources and neutrino sources. (N.C.). 3 figs

  17. Lattice Study for the Taiwan Photon Source

    CERN Document Server

    Kuo, Chin-Cheng; Chen Chien Te; Luo, Gwo-Huei; Tsai, Hung-Jen; Wang, Min-Huey

    2005-01-01

    The feasibility study for the new 3.0~3.3 GeV Taiwan synchrotron light source, dubbed the Taiwan Photon Source, was initiated in July 2004. The goal is to construct a high-performance light source with extremely bright X-rays, complementary to the existing 1.5 GeV light source in Taiwan. The ring circumference is 518.4 m and a 24-cell DBA lattice structure is chosen. The natural emittance with distributed dispersion is less than 2 nm-rad. A large booster ring of 499.2 m sharing the storage ring tunnel will be adopted.

  18. Identifying avian sources of faecal contamination using sterol analysis.

    Science.gov (United States)

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols which are present in faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation by targeted PCR specific markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
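
    The decision rule quoted above can be written down directly. The sketch below implements the two avian-indicative sterol ratios and thresholds from the abstract; the sterol concentrations and function names are invented for illustration only.

```python
# Avian sterol ratios from the abstract (thresholds: ratio 1 >= 0.4, ratio 2 >= 0.5).
# Input concentrations are invented example values, not measured data.

def avian_ratio_1(ethylcholestanol, ethylcoprostanol, ethylepicoprostanol):
    """24-ethylcholestanol / (24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol)."""
    return ethylcholestanol / (ethylcholestanol + ethylcoprostanol + ethylepicoprostanol)

def avian_ratio_2(cholestanol, coprostanol, epicoprostanol):
    """cholestanol / (cholestanol + coprostanol + epicoprostanol)."""
    return cholestanol / (cholestanol + coprostanol + epicoprostanol)

def avian_pollution_indicated(r1, r2):
    # Both ratios must meet their thresholds before avian pollution is indicated.
    return r1 >= 0.4 and r2 >= 0.5

r1 = avian_ratio_1(50.0, 40.0, 10.0)   # 50 / 100 = 0.5
r2 = avian_ratio_2(60.0, 30.0, 10.0)   # 60 / 100 = 0.6
print(avian_pollution_indicated(r1, r2))  # True for these invented values
```

    As the abstract notes, this rule applies only after human and herbivore mammal indicative ratios have been ruled out, and a positive result would still warrant PCR confirmation.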

  19. Open Source Parallel Image Analysis and Machine Learning Pipeline, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Continuum Analytics proposes a Python-based open-source data analysis machine learning pipeline toolkit for satellite data processing, weather and climate data...

  20. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  1. Obsidian sourcing by PIXE analysis at AURA2

    International Nuclear Information System (INIS)

    Neve, S.R.; Barker, P.H.; Holroyd, S.; Sheppard, P.J.

    1994-01-01

    The technique of Proton Induced X-ray Emission is a suitable method for the elemental analysis of obsidian samples and artefacts. By comparing the elemental composition of obsidian artefacts with those of known sources of obsidian and identifying similarities, the likely origin of the sample can be discovered and information about resource procurement gained. A PIXE facility has now been established at the Auckland University Research Accelerator Laboratory, AURA2. It offers a rapid, multi-element, non-destructive method of characterisation of obsidian samples ranging from small chips to large pieces. In an extensive survey of Mayor Island obsidian, a discrimination has been made between the different locations of obsidian deposits on the island. In addition, using the database developed at AURA2, artefacts from the site of Opita, Hauraki Plains, have been sourced. (Author). 18 refs., 8 figs., 7 tabs., 1 appendix

  2. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)

    2010-10-15

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth in the volume and accessibility of information sources, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  3. Time Series Analysis of Monte Carlo Fission Sources - I: Dominance Ratio Computation

    International Nuclear Information System (INIS)

    Ueki, Taro; Brown, Forrest B.; Parsons, D. Kent; Warsa, James S.

    2004-01-01

    In the nuclear engineering community, the error propagation of the Monte Carlo fission source distribution through cycles is known to be a linear Markov process when the number of histories per cycle is sufficiently large. In the statistics community, linear Markov processes with linear observation functions are known to have an autoregressive moving average (ARMA) representation of orders p and p - 1. Therefore, one can perform ARMA fitting of the binned Monte Carlo fission source in order to compute physical and statistical quantities relevant to nuclear criticality analysis. In this work, the ARMA fitting of a binned Monte Carlo fission source has been successfully developed as a method to compute the dominance ratio, i.e., the ratio of the second-largest to the largest eigenvalue. The method is free of binning mesh refinement and does not require the alteration of the basic source iteration cycle algorithm. Numerical results are presented for problems with one-group isotropic, two-group linearly anisotropic, and continuous-energy cross sections. Also, a strategy for the analysis of eigenmodes beyond the second-largest eigenvalue is demonstrated numerically.
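
    The key observation above is that cycle-to-cycle fluctuations of a binned fission source form a linear Markov process whose correlation decays at a rate set by the dominance ratio. The toy sketch below is not the paper's ARMA(p, p - 1) procedure; it only illustrates the underlying idea on a synthetic AR(1) series whose coefficient plays the role of the dominance ratio, recovered by least squares.

```python
import random

# Synthetic stand-in for a binned source fluctuation series:
# y_k = DR * y_{k-1} + noise, where DR mimics the dominance ratio.
random.seed(0)
true_dr = 0.85
y = [0.0]
for _ in range(20000):
    y.append(true_dr * y[-1] + random.gauss(0.0, 1.0))

# Least-squares AR(1) coefficient: sum(y_k * y_{k-1}) / sum(y_{k-1}^2).
num = sum(a * b for a, b in zip(y[1:], y[:-1]))
den = sum(b * b for b in y[:-1])
estimated_dr = num / den
print(round(estimated_dr, 2))  # close to 0.85
```

    The actual method fits an ARMA model to the binned source itself, which also handles the moving-average part introduced by the observation function; this sketch omits that entirely.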

  4. Analysis of source regions and meteorological factors for the variability of spring PM10 concentrations in Seoul, Korea

    Science.gov (United States)

    Lee, Jangho; Kim, Kwang-Yul

    2018-02-01

    CSEOF (cyclostationary empirical orthogonal function) analysis is applied to the springtime (March, April, May) daily PM10 concentrations measured at 23 Ministry of Environment stations in Seoul, Korea for the period 2003-2012. Six meteorological variables at 12 pressure levels are also acquired from the ERA-Interim reanalysis datasets. CSEOF analysis is conducted for each meteorological variable over East Asia. Regression analysis is conducted in CSEOF space between the PM10 concentrations and individual meteorological variables to identify the atmospheric conditions associated with each CSEOF mode. By adding the regressed loading vectors to the mean meteorological fields, the daily atmospheric conditions are obtained for the first five CSEOF modes. Then, the HYSPLIT model is run with the atmospheric conditions for each CSEOF mode in order to back-trace the air parcels and dust reaching Seoul. The K-means clustering algorithm is applied to identify major source regions for each CSEOF mode of the PM10 concentrations in Seoul. Three main source regions identified based on the mean fields are: (1) the northern Taklamakan Desert (NTD), (2) the Gobi Desert (GD), and (3) the East China industrial area (ECI). The main source regions for the mean meteorological fields are consistent with those of a previous study; 41% of the source locations are in GD, followed by ECI (37%) and NTD (21%). Back trajectory calculations based on CSEOF analysis of meteorological variables identify distinct source characteristics associated with each CSEOF mode and greatly facilitate the interpretation of the PM10 variability in Seoul in terms of transport route and meteorological conditions, including the source area.

  5. 42 CFR 456.244 - Data sources for studies.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Data sources for studies. 456.244 Section 456.244 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...: Medical Care Evaluation Studies § 456.244 Data sources for studies. Data that the committee uses to...

  6. 42 CFR 456.144 - Data sources for studies.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Data sources for studies. 456.144 Section 456.144 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Care Evaluation Studies § 456.144 Data sources for studies. Data that the committee uses to perform...

  7. Dynamic Stability Analysis of Autonomous Medium-Voltage Mixed-Source Microgrid

    DEFF Research Database (Denmark)

    Zhao, Zhuoli; Yang, Ping; Guerrero, Josep M.

    2015-01-01

    -space model of the autonomous MV mixed-source microgrid containing diesel generator set (DGS), grid-supporting battery energy storage system (BESS), squirrel cage induction generator (SCIG) wind turbine and network is developed. Sensitivity analysis is carried out to reveal the dynamic stability margin...

  8. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece).

    Science.gov (United States)

    Matiatos, Ioannis

    2016-01-15

    Nitrate (NO3) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ(15)N-NO3 and δ(18)O-NO3) from groundwater of Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO3 sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater leading to a more effective planning of environmental measures and remediation strategies in river basins and water bodies as defined by the European Water Frame Directive (Directive 2000/60/EC).
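
    SIAR itself is a Bayesian mixing model over multiple sources; as a much simpler hedged illustration of the mass-balance idea it builds on, the sketch below solves the two-source linear isotope mixing equation. All δ15N values are invented for illustration.

```python
def two_source_fraction(delta_mix, delta_a, delta_b):
    """Fraction f of source A in a mixture under simple linear two-source
    isotope mixing: delta_mix = f * delta_a + (1 - f) * delta_b."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Invented end-member values: a manure/waste-like source A (~ +15 permil
# delta15N) and a synthetic-fertilizer-like source B (~ 0 permil),
# with a groundwater sample measured at +9 permil.
f_a = two_source_fraction(9.0, 15.0, 0.0)
print(f_a)  # 0.6 -> 60% attributed to source A in this toy case
```

    Real mixing models such as SIAR add more sources than tracers, fractionation corrections, and full uncertainty propagation, which is why a Bayesian treatment is needed in practice.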

  9. Qualitative analysis of precipitation distribution in Poland with use of different data sources

    Directory of Open Access Journals (Sweden)

    J. Walawender

    2008-04-01

    Full Text Available Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analysis. GIS can also be applied in climatic research to manage, investigate and display all kinds of weather data.

    The main objective of this study is to demonstrate that GIS is a useful tool to examine and visualise precipitation distribution obtained from different data sources: ground measurements, satellite and radar data.

    Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store data from three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included: GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data. The results were visualised on maps.
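
    The reclassification and raster-algebra steps described above can be sketched outside ArcGIS. The toy example below uses plain Python grids with invented cell values; the 2*a + b encoding is one common way to derive the contingency classes behind such agreement tables.

```python
# Reclassify each precipitation grid to 1 (rain detected) / 0 (no rain),
# then combine the binary grids with raster algebra.

def reclassify(grid, threshold=0.1):
    """Binary reclassification: 1 where cell >= threshold, else 0."""
    return [[1 if cell >= threshold else 0 for cell in row] for row in grid]

gauge = [[0.0, 1.2], [0.3, 0.0]]   # ground measurements (mm), invented
radar = [[0.0, 0.8], [0.0, 0.2]]   # radar-derived estimate (mm), invented

g = reclassify(gauge)
r = reclassify(radar)

# Raster algebra: 2*gauge + radar encodes the four contingency classes:
# 0 = neither, 1 = radar only, 2 = gauge only, 3 = both agree on rain.
combined = [[2 * gc + rc for gc, rc in zip(grow, rrow)]
            for grow, rrow in zip(g, r)]
print(combined)  # [[0, 3], [2, 1]]
```

    Counting cells in each class of the combined grid yields exactly the kind of contingency table the study uses to compare ground, satellite and radar data.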

  10. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the first part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
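
    As a minimal sketch of the kind of data elements Part I extracts from web server logs, the following parses one invented access-log line in Common Log Format (host, timestamp, request, status, response size):

```python
import re

# Regex for Common Log Format; the log line below is invented.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+|-)'
)

line = ('192.0.2.10 - - [12/Mar/2007:10:15:32 -0330] '
        '"GET /catalogue/search?q=gis HTTP/1.1" 200 5120')

m = LOG_PATTERN.match(line)
print(m.group('host'), m.group('path'), m.group('status'))
```

    Aggregating fields like `path` and `status` over many such lines is essentially what the web log analysis tools surveyed in Part I automate.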

  11. Computerized study of several electrostatic, surface-ionization ion-source configurations

    Energy Technology Data Exchange (ETDEWEB)

    Balestrini, S.J.; Schuster, B.G.

    1984-08-01

    A computer-based method is presented whereby the optics of electrostatic, surface-ionization ion-source designs can be analyzed theoretically. The analysis solves for the luminosity and dispersion of a beam of charged particles at the final collimating slit and at locations preceding the slit. The performance of an ion source tested in 1960 and also some newer optical configurations are compared with theory.

  12. Source attribution of human salmonellosis using a meta-analysis of case-control studies of sporadic infections

    DEFF Research Database (Denmark)

    Coutinho Calado Domingues, Ana Rita; Pires, Sara Monteiro; Hisham Beshara Halasa, Tariq

    2012-01-01

    Salmonella is an important cause of human illness. Disease is frequently associated with foodborne transmission, but other routes of exposure are recognized. Identifying sources of disease is essential for prioritizing public health interventions. Numerous case-control studies of sporadic salmone...

  13. Human Campylobacteriosis in Luxembourg, 2010-2013: A Case-Control Study Combined with Multilocus Sequence Typing for Source Attribution and Risk Factor Analysis.

    Science.gov (United States)

    Mossong, Joël; Mughini-Gras, Lapo; Penny, Christian; Devaux, Anthony; Olinger, Christophe; Losch, Serge; Cauchie, Henry-Michel; van Pelt, Wilfrid; Ragimbeau, Catherine

    2016-02-10

    Campylobacteriosis has increased markedly in Luxembourg during recent years. We sought to determine which Campylobacter genotypes infect humans, where they may originate from, and how they may infect humans. Multilocus sequence typing was performed on 1153 Campylobacter jejuni and 136 C. coli human strains, which were attributed to three putative animal reservoirs (poultry, ruminants, pigs) and to environmental water using the asymmetric island model. A nationwide case-control study (2010-2013) for domestic campylobacteriosis was also conducted, including 367 C. jejuni and 48 C. coli cases, and 624 controls. Risk factors were investigated by Campylobacter species, and for strains attributed to different sources using a combined case-control and source attribution analysis. 282 sequence types (STs) were identified: ST-21, ST-48, ST-572, ST-50 and ST-257 were prevailing. Most cases were attributed to poultry (61.2%) and ruminants (33.3%). Consuming chicken outside the home was the dominant risk factor for both Campylobacter species. Newly identified risk factors included contact with garden soil for either species, and consuming beef specifically for C. coli. Poultry-associated campylobacteriosis was linked to poultry consumption in wintertime, and ruminant-associated campylobacteriosis to tap-water provider type. Besides confirming chicken as the primary source of campylobacteriosis, additional evidence was found for other reservoirs and transmission routes.

  14. Off-design performance analysis of Kalina cycle for low temperature geothermal source

    International Nuclear Information System (INIS)

    Li, Hang; Hu, Dongshuai; Wang, Mingkun; Dai, Yiping

    2016-01-01

    Highlights: • The off-design performance analysis of the Kalina cycle is conducted. • The off-design models are established. • A genetic algorithm is used in the design phase. • The sliding pressure control strategy is applied. - Abstract: Low temperature geothermal sources with promising prospects have attracted increasing attention. A Kalina cycle system using ammonia-water as its working fluid can exploit geothermal energy effectively. In this paper, a quantitative analysis of the off-design performance of the Kalina cycle for a low temperature geothermal source is conducted. The off-design models, including turbine, pump and heat exchangers, are established. A genetic algorithm is used to maximize the net power output and determine the thermodynamic parameters in the design phase. The sliding pressure control strategy, applied widely in existing Rankine cycle power plants, is adopted to respond to variations of geothermal source mass flow rate ratio (70-120%), geothermal source temperature (116-128 °C) and heat sink temperature (0-35 °C). Within the off-design research scope, guidance for pump rotational speed adjustment is given to provide some reference for off-design operation of geothermal power plants. The required adjustment rate of pump rotational speed is more sensitive to per unit geothermal source temperature than per unit heat sink temperature. The variation of the heat sink has a greater influence than that of the geothermal source on the ranges of net power output and thermal efficiency.
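
    The abstract does not give its off-design pump model, but the pump affinity laws are the standard relations behind rotational-speed adjustment in such studies. The sketch below (invented design values, not from the paper) shows how flow and head scale with the speed ratio under that assumption.

```python
# Pump affinity laws (sketch, assuming geometric similarity):
# flow scales linearly with rotational speed, head with its square.

def scaled_flow(q_design, n_ratio):
    """Flow at off-design speed: Q = Q_design * (N / N_design)."""
    return q_design * n_ratio

def scaled_head(h_design, n_ratio):
    """Head at off-design speed: H = H_design * (N / N_design)**2."""
    return h_design * n_ratio ** 2

# Running the pump at 90% of design speed (design values invented):
q = scaled_flow(100.0, 0.9)   # design flow 100 kg/s, about 90 kg/s
h = scaled_head(200.0, 0.9)   # design head 200 m, about 162 m
print(q, h)
```

    Under sliding pressure control, relations like these let the controller pick the pump speed that matches the new source flow rate and temperature conditions.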

  15. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate, with the aim of setting a community-driven gold standard for data handling, reporting and sharing. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  16. Analysis of the monitoring system for the spallation neutron source 'SINQ'

    International Nuclear Information System (INIS)

    Badreddin, E.

    1998-01-01

    Petri net models (PN) and fault-tree analysis (FTA) are employed for the reliability analysis of the spallation neutron source SINQ. The structure of the monitoring and shut-down system (SDS) is investigated using a Petri net model. The reliability data are processed using a fault-tree model of the dominant part. Finally, suggestions for the improvement of system availability are made. (author)

  17. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been used extensively in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems involving an intensive system control variable. (author)

  18. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions from scalp EEG. However, scalp EEG reveals only limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We propose a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to the Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to turn it into a supervised learning method. Main results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis of the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus may lead to new strategies for BCI-based neurorehabilitation.

  19. Phenotypic and genotypic analysis of bio-serotypes of Yersinia enterocolitica from various sources in Brazil.

    Science.gov (United States)

    Rusak, Leonardo Alves; dos Reis, Cristhiane Moura Falavina; Barbosa, André Victor; Santos, André Felipe Mercês; Paixão, Renata; Hofer, Ernesto; Vallim, Deyse Christina; Asensi, Marise Dutra

    2014-12-15

    Yersinia enterocolitica is a well-known foodborne pathogen widely distributed in nature with high public health relevance, especially in Europe. This study aimed to analyze the pathogenic potential of Y. enterocolitica strains isolated from human, animal, food, and environmental sources and from different regions of Brazil by detecting the virulence genes inv, ail, ystA, and virF through polymerase chain reaction (PCR), phenotypic tests, and antimicrobial susceptibility analysis. Pulsed-field gel electrophoresis (PFGE) was used for the assessment of phylogenetic diversity. All virulence genes were detected in 11/60 (18%) strains of serotype O:3, biotype 4 isolated from human and animal sources. Ten human strains (4/O:3) presented three chromosomal virulence genes, and nine strains of biotype 1A presented the inv gene. Six (10%) strains were resistant to sulfamethoxazole-trimethoprim, seven (12%) to tetracycline, and one (2%) to amikacin, all of which are used to treat yersiniosis. AMP-CEF-SXT was the predominant resistance profile. PFGE analysis revealed 36 unique pulsotypes, grouped into nine clusters (A to I) with similarity ≥ 85%, giving a discriminatory index of diversity of 0.957. Cluster A comprised all bio-serotype 4/O:3 strains isolated from animal and human sources. This study shows the existence of strains with the same genotypic profiles, bearing all virulence genes, from human and animal sources, circulating among several Brazilian states. This supports the hypothesis that swine is likely to serve as a main element in Y. enterocolitica transmission to humans in Brazil, and it could become a potential threat to public health as in Europe.
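
    The discriminatory index of 0.957 reported for the PFGE typing is conventionally Simpson's index of diversity in the Hunter-Gaston form. A minimal sketch, using an invented pulsotype distribution rather than the study's 36 pulsotypes:

    ```python
    from collections import Counter

    def discriminatory_index(type_assignments):
        """Simpson's index of diversity (Hunter-Gaston form): the
        probability that two strains drawn at random without replacement
        belong to different types. 1.0 means every strain is unique."""
        n = len(type_assignments)
        counts = Counter(type_assignments).values()
        return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

    # Hypothetical pulsotype assignments for 10 strains (not the study's data)
    strains = ["A1", "A1", "A2", "B1", "B1", "B1", "C1", "D1", "E1", "F1"]
    d = discriminatory_index(strains)   # 1 - 8/90, about 0.911
    ```

    The more evenly strains spread across many pulsotypes, the closer the index gets to 1, which is why 36 pulsotypes among 60 strains yields a value as high as 0.957.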

  20. Open Source Drug Discovery in Practice: A Case Study

    Science.gov (United States)

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Background Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. Methodology/Principal Findings A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal, with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Conclusions/Significance Technically TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project. However, both have enabled high quality

  1. Feasibility of fissile mass assay of spent nuclear fuel using 252Cf-source-driven frequency-analysis

    International Nuclear Information System (INIS)

    Mattingly, J.K.; Valentine, T.E.; Mihalczo, J.T.

    1996-01-01

    The feasibility of fissile mass assay of spent nuclear fuel was evaluated using MCNP-DSP, an analog Monte Carlo transport code, to simulate source-driven measurements. Models of an isolated Westinghouse 17x17 PWR fuel assembly in a 1500-ppm borated water storage pool were used. In the models, the fuel burnup profile was represented using seven axial burnup zones, each with isotopics estimated by the PDQ code. Four different fuel assemblies with average burnups from fresh to 32 GWd/MTU were modeled and analyzed. Analysis of the fuel assemblies was simulated by inducing fission in the fuel using a ²⁵²Cf source adjacent to the assembly and correlating source fissions with the response of a bank of ³He detectors adjacent to the assembly opposite the source. This analysis was performed at 7 different axial positions on each of the 4 assemblies, and the source-detector cross-spectrum signature was calculated for each of these 28 simulated measurements. The magnitude of the cross-spectrum signature follows a smooth upward trend with increasing fissile material (²³⁵U and ²³⁹Pu) content, and the signature is independent of the concentration of spontaneously fissioning isotopes (e.g., ²⁴⁴Cm) and (α,n) sources. Furthermore, the cross-spectrum signature is highly sensitive to changes in fissile material content. This feasibility study indicated that the signature would increase ∼100% in response to an increase of only 0.1 g/cm³ of fissile material

  2. Determination of sources and analysis of micro-pollutants in drinking water

    International Nuclear Information System (INIS)

    Md Pauzi Abdullah; Soh Shiau Chian

    2005-01-01

    The objectives of the study are to develop and validate selected analytical methods for the analysis of micro-organics and metals in water; to identify, monitor and assess the levels of micro-organics and metals in drinking water supplies; to evaluate the relevance of the guidelines set in the National Standard of Drinking Water Quality 2001; and to identify the sources of pollution and carry out risk assessments of exposure to drinking water. The presentation discussed the progress of the work, including the determination of VOCs (volatile organic compounds) in drinking water using SPME (solid-phase microextraction) techniques, the analysis of heavy metals in drinking water, the determination of Cr(VI) with ICPES (inductively coupled plasma emission spectrometry), and the detection, at trace concentrations in waters, of halogenated volatile organic compounds (HVOCs), which are heavily used by the agricultural sector

  3. Systems analysis and engineering of the X-1 Advanced Radiation Source

    International Nuclear Information System (INIS)

    Rochau, G.E.; Hands, J.A.; Raglin, P.S.; Ramirez, J.J.

    1998-01-01

    The X-1 Advanced Radiation Source, which will produce ∼ 16 MJ in x-rays, represents the next step in providing US Department of Energy's Stockpile Stewardship program with the high-energy, large volume, laboratory x-ray sources needed for the Radiation Effects Science and Simulation (RES), Inertial Confinement Fusion (ICF), and Weapon Physics (WP) Programs. Advances in fast pulsed power technology and in z-pinch hohlraums on Sandia National Laboratories' Z Accelerator in 1997 provide sufficient basis for pursuing the development of X-1. This paper will introduce the X-1 Advanced Radiation Source Facility Project, describe the systems analysis and engineering approach being used, and identify critical technology areas being researched

  4. Multiband Study of Radio Sources of the RCR Catalogue with Virtual Observatory Tools

    Directory of Open Access Journals (Sweden)

    Zhelenkova O. P.

    2012-09-01

    We present early results of our multiband study of the RATAN Cold Revised (RCR) catalogue obtained from seven cycles of the “Cold” survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the source SS 433. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, and the VLSS, TXS, NVSS, FIRST and GB6 radio surveys to accumulate information about the sources. For radio sources that have no detectable candidate in the optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS and 2MASS surveys and also used co-added frames in different bands. We reliably identified 76% of the radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, the interoperability services of ALADIN and TOPCAT, and other Virtual Observatory (VO) tools and resources, such as CASJobs, NED, VizieR, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.

  5. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  6. Source study of local coalfield events using the modal synthesis of shear and surface waves

    Energy Technology Data Exchange (ETDEWEB)

    MacBeth, C.D.; Redmayne, D.W.

    1989-10-01

    Results from the BGS LOWNET array for the Midlothian coalfield in Scotland have been studied. Vertical-component seismograms were analysed using a waveform-matching technique based on the modal summation method for constructing synthetic seismograms. The analysis is applied to the S- and surface-wave portions of the seismogram. The effects of different earth structures, source depths, source orientations, and event type (rockburst or triggered earthquake 2-3 km from the mine workings) can be evaluated.

  7. Identification of sources of lead exposure in French children by lead isotope analysis: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Lucas Jean-Paul

    2011-08-01

    Background The amount of lead in the environment has decreased significantly in recent years, and so has exposure. However, there is no known safe exposure level and, therefore, the exposure of children to lead, although low, remains a major public health issue. With lower levels of exposure, it is becoming more difficult to identify lead sources, and new approaches may be required for preventive action. This study assessed the usefulness of lead isotope ratios for identifying sources of lead using data from a nationwide sample of French children aged from six months to six years with blood lead levels ≥25 μg/L. Methods Blood samples were taken from 125 children, representing about 600,000 French children; environmental samples were taken from their homes and personal information was collected. Lead isotope ratios were determined using quadrupole ICP-MS (inductively coupled plasma mass spectrometry) and the isotopic signatures of potential sources of exposure were matched with those of blood in order to identify the most likely sources. Results In addition to the interpretation of lead concentrations, lead isotope ratios were potentially of use for 57% of children aged from six months to six years with blood lead levels ≥25 μg/L (7% of all children in France, about 332,000 children) with at least one potential source of lead and sufficiently well discriminated lead isotope ratios. Lead isotope ratios revealed a single suspected source of exposure for 32% of the subjects and were able to eliminate at least one unlikely source of exposure for 30% of the children. Conclusions In France, lead isotope ratios could provide valuable additional information in about a third of routine environmental investigations.
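
    The matching step described above (comparing the isotopic signature of blood with those of potential household sources) can be sketched as a nearest-signature search. All ratio values and the tolerance below are invented for illustration; the study's actual matching criteria are not reproduced here.

    ```python
    # Hypothetical isotope-ratio signatures (206Pb/207Pb, 208Pb/206Pb);
    # neither the values nor the tolerance are taken from the study.
    blood = (1.112, 2.085)
    candidates = {
        "house paint": (1.115, 2.083),
        "tap water":   (1.165, 2.020),
        "soil":        (1.198, 2.001),
    }

    def distance(a, b):
        """Euclidean distance between two isotope-ratio signatures."""
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # A source is "compatible" if it lies within an assumed combined
    # analytical uncertainty of the blood measurement.
    TOLERANCE = 0.01
    compatible = {name for name, r in candidates.items()
                  if distance(blood, r) <= TOLERANCE}
    most_likely = min(candidates, key=lambda name: distance(blood, candidates[name]))
    ```

    With these invented numbers only the paint signature falls inside the tolerance, so the other two sources would be ruled out, which mirrors how the method can both suggest and eliminate exposure sources.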

  8. Study on sources of colored glaze of Xiyue Temple in Shanxi province by INAA and multivariable statistical analysis

    International Nuclear Information System (INIS)

    Cheng Lin; Feng Songlin

    2005-01-01

    The major, minor and trace elements in the bodies of ancient colored glazes from the site of Xiyue Temple and the Lidipo kiln in Shanxi province, unearthed from strata of the Song, Yuan, Ming, Early Qing and Late Qing dynasties, were analyzed by instrumental neutron activation analysis (INAA). The results of multivariable statistical analyses show that the chemical compositions of the colored glaze bodies are stable from the Song to the Early Qing dynasty, but distinctly different from those of the Late Qing. The raw materials of the ancient colored glazes from the Song to the Early Qing probably came from the site of Xiyue Temple. The chemical compositions of three pieces of colored glaze from the Ming dynasty and those of the Late Qing are similar to those of the Lidipo kiln. From this, the authors conclude that the colored glazes of Xiyue Temple in the Late Qing dynasty were fired at the Lidipo kiln. (authors)

  9. Dust Storm over the Middle East: Retrieval Approach, Source Identification, and Trend Analysis

    Science.gov (United States)

    Moridnejad, A.; Karimi, N.; Ariya, P. A.

    2014-12-01

    The Middle East region has been considered responsible for approximately 25% of the Earth's global emissions of dust particles. By developing the Middle East Dust Index (MEDI) and applying it to 70 dust storms characterized on MODIS images that occurred during the period between 2001 and 2012, we herein present a new high-resolution mapping of the major atmospheric dust source points participating in this region. To assist environmental managers and decision makers in taking proper and prioritized measures, we then categorize the identified sources in terms of intensity, based on indices extracted for the Deep Blue algorithm, and also utilize a frequency-of-occurrence approach to find the sensitive sources. In the next step, by implementing spectral mixture analysis on Landsat TM images (1984 and 2012), a novel desertification map is presented. The aim is to understand how human perturbations and land-use change have influenced the dust storm points in the region. Preliminary results of this study indicate for the first time that ca. 39% of all detected source points are located in this newly anthropogenically desertified area. A large number of low-frequency sources are located within or close to the newly desertified areas. These severely desertified regions require immediate concern on a global scale. During the next six months, further research will be performed to confirm these preliminary results.

  10. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece)

    International Nuclear Information System (INIS)

    Matiatos, Ioannis

    2016-01-01

    Nitrate (NO₃) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ¹⁵N–NO₃ and δ¹⁸O–NO₃) from groundwater of Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO₃ sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater leading to a more effective planning of environmental measures and remediation strategies in river basins and water bodies as defined by the European Water Framework Directive (Directive 2000/60/EC). - Highlights: • More enriched N-isotope values were observed in the industrial/urban areas. • A Bayesian isotope mixing model was applied in a multiple land-use area. • A 3-component model explained the factors controlling nitrate content in groundwater. • Industrial/urban nitrogen source was

  11. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece)

    Energy Technology Data Exchange (ETDEWEB)

    Matiatos, Ioannis, E-mail: i.matiatos@iaea.org

    2016-01-15

    Nitrate (NO₃) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ¹⁵N–NO₃ and δ¹⁸O–NO₃) from groundwater of Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO₃ sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater leading to a more effective planning of environmental measures and remediation strategies in river basins and water bodies as defined by the European Water Framework Directive (Directive 2000/60/EC). - Highlights: • More enriched N-isotope values were observed in the industrial/urban areas. • A Bayesian isotope mixing model was applied in a multiple land-use area. • A 3-component model explained the factors controlling nitrate content in groundwater. • Industrial
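
    The idea behind an isotope mixing model such as SIAR can be illustrated with a much simpler deterministic mass balance: find the source fractions whose mixed isotopic signature best matches a groundwater sample. SIAR itself is Bayesian and propagates end-member uncertainty; the sketch below is only a least-squares grid search, and all end-member and sample values are invented.

    ```python
    # Simplified isotope mass balance: find source fractions whose mixed
    # (d15N, d18O) signature best matches a groundwater sample.
    # End-member and sample values are invented for illustration.
    sources = {
        "fertilizer":     (0.0, 22.0),
        "manure/sewage":  (12.0, 5.0),
        "soil organic N": (5.0, 3.0),
    }
    sample = (8.0, 6.0)

    def mixture(fractions):
        """Isotopic signature of a mix with the given source fractions."""
        sigs = list(sources.values())
        return tuple(sum(f * s[k] for f, s in zip(fractions, sigs))
                     for k in range(2))

    def misfit(fractions):
        return sum((m - x) ** 2 for m, x in zip(mixture(fractions), sample))

    # Exhaustive grid search over fractions summing to 1 (1% resolution);
    # SIAR would instead sample a posterior over these fractions.
    best, best_err = None, float("inf")
    for i in range(101):
        for j in range(101 - i):
            f = (i / 100, j / 100, (100 - i - j) / 100)
            if misfit(f) < best_err:
                best, best_err = f, misfit(f)
    ```

    For these invented numbers the best mix is dominated by the manure/sewage end member, the kind of result that, in the study, pointed to urban and industrial wastes near those zones.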

  12. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    Science.gov (United States)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach for determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity-model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of explosive scalar moment. Our synthetic data tests indicate that biases in scalar

  13. Organic tracer-based source analysis of PM2.5 organic and elemental carbon: A case study at Dongguan in the Pearl River Delta, China

    Science.gov (United States)

    Wang, Qiong Qiong; Huang, X. H. Hilda; Zhang, Ting; Zhang, Qingyan; Feng, Yongming; Yuan, Zibing; Wu, Dui; Lau, Alexis K. H.; Yu, Jian Zhen

    2015-10-01

    Organic carbon (OC) and elemental carbon (EC) are major constituents of PM2.5 and their source apportionment remains a challenging task due to the great diversity of their sources and lack of source-specific tracer data. In this work, sources of OC and EC are investigated using positive matrix factorization (PMF) analysis of PM2.5 chemical composition data, including major ions, OC, EC, elements, and organic molecular source markers, for a set of 156 filter samples collected over three years from 2010 to 2012 at Dongguan in the Pearl River Delta, China. The key organic tracers include levoglucosan, mannosan, hopanes, C27–C33 n-alkanes, and polycyclic aromatic hydrocarbons (PAHs). Using these species as input for the PMF model, nine factors were resolved. Among them, biomass burning and coal combustion were significant sources contributing 15-17% of OC and 24-30% and 34-35% of EC, respectively. Industrial emissions and ship emissions, identified through their characteristic metal signatures, contributed 16-24% and 7-8% of OC and 8-11% and 16-17% of EC, respectively. Vehicle exhaust was a less significant source, accounting for 3-4% of OC and 5-8% of EC. Secondary OC, taken to be the sum of OC present in secondary sulfate and nitrate formation source factors, made up 27-36% of OC. Plastic burning, identified through 1,3,5-triphenylbenzene as a tracer, was a less important source for OC (≤4%) and EC (5-10%), but a significant source for PAHs at this site. The utility of organic source tracers was demonstrated by comparing PMF runs with different combinations of organic tracers removed from the input species list. Levoglucosan and mannosan were important additions to distinguish biomass burning from coal combustion by reducing collinearity among source profiles. Inclusion of hopanes and 1,3,5-triphenylbenzene was found to be necessary in resolving the less significant sources vehicle exhaust and plastic burning. Inclusion of C27–C33 n-alkanes and PAHs can influence the
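
    PMF factorizes the species-by-sample concentration matrix into non-negative source contributions and source profiles. The sketch below shows the idea with plain (unweighted) Lee-Seung multiplicative updates on synthetic data; PMF proper additionally weights each data point by its measurement uncertainty, which this toy version omits.

    ```python
    import random

    random.seed(0)

    def matmul(A, B):
        cols = list(zip(*B))
        return [[sum(a * b for a, b in zip(row, col)) for col in cols]
                for row in A]

    def transpose(A):
        return [list(r) for r in zip(*A)]

    # Synthetic 6-sample x 4-species dataset built from two known source
    # profiles (stand-ins for, e.g., a biomass-burning and a coal factor).
    true_profiles = [[5.0, 1.0, 0.1, 2.0],
                     [0.2, 3.0, 4.0, 1.0]]
    true_contribs = [[random.uniform(0.5, 2.0) for _ in range(2)]
                     for _ in range(6)]
    X = matmul(true_contribs, true_profiles)

    # Factorize X ~ G @ F with G, F >= 0 (Lee-Seung multiplicative updates):
    # G holds per-sample source contributions, F holds source profiles.
    k, eps = 2, 1e-9
    G = [[random.uniform(0.1, 1.0) for _ in range(k)] for _ in X]
    F = [[random.uniform(0.1, 1.0) for _ in range(len(X[0]))] for _ in range(k)]
    for _ in range(500):
        GF, Gt = matmul(G, F), transpose(G)
        num, den = matmul(Gt, X), matmul(Gt, GF)
        F = [[F[i][j] * num[i][j] / (den[i][j] + eps) for j in range(len(X[0]))]
             for i in range(k)]
        GF, Ft = matmul(G, F), transpose(F)
        num, den = matmul(X, Ft), matmul(GF, Ft)
        G = [[G[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(len(X))]

    residual = sum((x - y) ** 2
                   for rx, ry in zip(X, matmul(G, F)) for x, y in zip(rx, ry))
    ```

    Because the updates are multiplicative, G and F stay non-negative throughout, which is what lets the recovered factors be read as physical source contributions and profiles.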

  14. Mechanisms Supporting Superior Source Memory for Familiar Items: A Multi-Voxel Pattern Analysis Study

    Science.gov (United States)

    Poppenk, Jordan; Norman, Kenneth A.

    2012-01-01

    Recent cognitive research has revealed better source memory performance for familiar relative to novel stimuli. Here we consider two possible explanations for this finding. The source memory advantage for familiar stimuli could arise because stimulus novelty induces attention to stimulus features at the expense of contextual processing, resulting…

  15. Bulk - Samples gamma-rays activation analysis (PGNAA) with Isotopic Neutron Sources

    International Nuclear Information System (INIS)

    HASSAN, A.M.

    2009-01-01

    An overview is given of research towards prompt gamma-ray neutron activation analysis (PGNAA) of bulk samples. Some aspects of bulk-sample PGNAA are discussed, where irradiation by isotopic neutron sources is used mostly for in-situ or on-line analysis. The research was carried out in a comparative and/or qualitative way, or by using prior knowledge about the sample material. Sometimes the assumption is needed that the mass fractions of all determined elements add up to 1. Sensitivity curves are also used for some elements in such complex samples to estimate percentage concentration values. The uses of ²⁵²Cf, ²⁴¹Am/Be and ²³⁹Pu/Be isotopic neutron sources for the elemental investigation of hematite, ilmenite, coal, petroleum, edible oils, phosphates and polluted lake water samples are described.
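
    The two quantitative steps mentioned in this record (converting peak count rates to element masses via sensitivity factors, then normalizing so the determined mass fractions add up to 1) can be sketched as follows. All sensitivity factors and count rates below are invented; in practice they would come from calibration curves for the particular source-detector geometry.

    ```python
    # Hypothetical sensitivity factors: net prompt-gamma counts per
    # second per gram of element in the sample (from calibration).
    sensitivity = {"Fe": 120.0, "Ti": 45.0, "Si": 10.0}
    peak_rates = {"Fe": 300.0, "Ti": 27.0, "Si": 8.0}  # net counts/s (invented)

    # Step 1: apparent mass of each determined element
    masses = {el: peak_rates[el] / sensitivity[el] for el in peak_rates}

    # Step 2: the assumption used in the abstract - the determined
    # elements account for the whole sample - lets us report
    # normalized mass fractions.
    total = sum(masses.values())
    fractions = {el: m / total for el, m in masses.items()}
    ```

    The normalization step is exactly where the "mass fractions add up to 1" assumption enters: any undetected element silently inflates the reported fractions of the detected ones.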

  16. IQM: an extensible and portable open source application for image and signal analysis in Java.

    Science.gov (United States)

    Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut

    2015-01-01

    Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out of the box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along with the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is supported by a Groovy script interface to the JVM. We demonstrate IQM's image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and aims to complement functionality rather than compete with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis.

  17. Radiocarbon Analysis to Calculate New End-Member Values for Biomass Burning Source Samples Specific to the Bay Area

    Science.gov (United States)

    Yoon, S.; Kirchstetter, T.; Fairley, D.; Sheesley, R. J.; Tang, X.

    2017-12-01

Elemental carbon (EC), also known as black carbon or soot, is an important particulate air pollutant that contributes to climate forcing through absorption of solar radiation and to adverse human health impacts through inhalation. Both fossil fuel combustion and biomass burning, via residential firewood burning, agricultural burning, wildfires, and controlled burns, are significant sources of EC. Our ability to successfully control ambient EC concentrations requires understanding the contributions of these different emission sources. Radiocarbon (14C) analysis has been increasingly used as an apportionment tool to distinguish between EC from fossil fuel and biomass combustion sources. However, there are uncertainties associated with this method, including: (1) uncertainty associated with the isolation of EC to be used for radiocarbon analysis (e.g., inclusion of organic carbon, blank contamination, recovery of EC); (2) uncertainty associated with the radiocarbon signature of the end member. The objective of this research project is to use laboratory experiments to evaluate some of these uncertainties, particularly for EC sources that significantly impact the San Francisco Bay Area. Source samples of EC only and of mixed EC and organic carbon (OC) were produced for this study to represent known emission sources and to approximate the mixing of EC and OC that would be present in the atmosphere. These samples include a combination of methane flame soot, various wood smoke samples (e.g., cedar, oak, sugar pine, pine of various ages), meat cooking, and smoldering cellulose smoke. EC fractions were isolated using a Sunset Laboratory thermal optical transmittance carbon analyzer. For 14C analysis, samples were sent to Woods Hole Oceanographic Institution for isotope analysis using accelerator mass spectrometry. End-member values and uncertainties for the EC isolation using this method will be reported.
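The end-member values feed a standard two-member isotopic mass balance; a minimal sketch, using illustrative Δ14C values rather than the study's measured end members:

```python
# Two-end-member isotopic mass balance for apportioning EC between
# biomass and fossil sources. The end-member values below are
# illustrative placeholders, not the study's results.

def biomass_fraction(d14c_sample, d14c_fossil=-1000.0, d14c_biomass=50.0):
    """Fraction of carbon from biomass burning, from Delta-14C (permil).

    Fossil carbon is 14C-free (Delta14C = -1000 permil); the biomass
    end member reflects contemporary atmospheric 14C levels.
    """
    return (d14c_sample - d14c_fossil) / (d14c_biomass - d14c_fossil)

# a sample measured at Delta14C = -475 permil under these end members:
f_bb = biomass_fraction(-475.0)
print(round(f_bb, 3))  # 0.5
```

With a contemporary-biomass end member near +50‰ and a 14C-free fossil end member at -1000‰, a measured Δ14C maps linearly onto the biomass fraction, which is why the end-member uncertainty studied here propagates directly into the apportionment.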

  18. Unique effects and moderators of effects of sources on self-efficacy: A model-based meta-analysis.

    Science.gov (United States)

    Byars-Winston, Angela; Diestelmann, Jacob; Savoy, Julia N; Hoyt, William T

    2017-11-01

Self-efficacy beliefs are strong predictors of academic pursuits, performance, and persistence, and in theory are developed and maintained by 4 classes of experiences Bandura (1986) referred to as sources: performance accomplishments (PA), vicarious learning (VL), social persuasion (SP), and affective arousal (AA). The effects of sources on self-efficacy vary by performance domain and individual difference factors. In this meta-analysis (k = 61 studies of academic self-efficacy; N = 8,965), we employed B. J. Becker's (2009) model-based approach to examine the cumulative effects of the sources as a set and the unique effects of each source, controlling for the others. Following Becker's recommendations, we used available data to create a correlation matrix for the 4 sources and self-efficacy, then used these meta-analytically derived correlations to test our path model. We further examined moderation of these associations by subject area (STEM vs. non-STEM), grade, sex, and ethnicity. PA showed by far the strongest unique association with self-efficacy beliefs. Subject area was a significant moderator, with sources collectively predicting self-efficacy more strongly in non-STEM (k = 14) than in STEM (k = 47) subjects (R2 = .37 and .22, respectively). Within studies of STEM subjects, grade level was a significant moderator of the coefficients in our path model, as were 2 continuous study characteristics (percent non-White and percent female). Practical implications of the findings and future research directions are discussed.
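The core computation in Becker's model-based approach, regressing self-efficacy on the four sources through a pooled correlation matrix, can be sketched as follows; all correlations here are invented for illustration and are not the paper's pooled values:

```python
# Sketch of the model-based step: standardized path coefficients for
# the four sources (PA, VL, SP, AA) predicting self-efficacy (SE),
# solved from a meta-analytically pooled correlation matrix.
import numpy as np

labels = ["PA", "VL", "SP", "AA"]

# correlations among the predictors (R_xx) ...
R_xx = np.array([
    [ 1.00,  0.45,  0.50, -0.35],
    [ 0.45,  1.00,  0.40, -0.20],
    [ 0.50,  0.40,  1.00, -0.25],
    [-0.35, -0.20, -0.25,  1.00],
])
# ... and of each predictor with self-efficacy (r_xy)
r_xy = np.array([0.56, 0.33, 0.39, -0.34])

beta = np.linalg.solve(R_xx, r_xy)  # unique effect of each source
r_squared = float(beta @ r_xy)      # variance in SE explained jointly

for name, b in zip(labels, beta):
    print(f"{name}: {b:+.3f}")
print(f"R^2 = {r_squared:.3f}")
```

With these invented correlations the PA coefficient dominates once the other sources are partialled out, mirroring the paper's qualitative finding that PA carries most of the unique predictive weight.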

  19. The training of Olympic wrestling coaches: study of the sources of knowledge and essential training contents

    Directory of Open Access Journals (Sweden)

    Paulo Martins

    2017-08-01

The aim of this study was to analyze the representations of wrestling coaches regarding the sources of knowledge and the training contents to be adopted during the training process of young wrestlers' coaches. The study was based on Grossman's (1990) model of professional knowledge for teaching and followed a qualitative, multiple case study methodology. Following a semi-structured script, six Olympic wrestling experts were interviewed in depth to identify the sources of knowledge the coaches used for their training and the didactic-methodological contents they considered essential to fulfil their role as coaches. The analysis revealed that the coaches' sources of professional knowledge were diverse, with academic training and professional experience as the main routes of access to professional knowledge. The coaches also pointed out that their first sources of knowledge were their experiences as competitive athletes. Finally, this study concludes that expert coaches must acquire a profound knowledge of the competition environment, seeking to optimize their influence on athletes, an influence that should extend not only to the youngster's sport practice as an athlete but also to the athlete's development as a person.

  20. Assessing heavy metal sources in sugarcane Brazilian soils: an approach using multivariate analysis.

    Science.gov (United States)

    da Silva, Fernando Bruno Vieira; do Nascimento, Clístenes Williams Araújo; Araújo, Paula Renata Muniz; da Silva, Luiz Henrique Vieira; da Silva, Roberto Felipe

    2016-08-01

Brazil is the world's largest sugarcane producer, and soils in the northeastern part of the country have been cultivated with the crop for over 450 years. However, so far there has been no study on the status of heavy metal accumulation in these long-cultivated soils. To fill the gap, we collected soil samples from 60 sugarcane fields and determined the contents of Cd, Cr, Cu, Ni, Pb, and Zn. We used multivariate analysis to distinguish between natural and anthropogenic sources of these metals in soils. Analytical determinations were performed by ICP-OES after microwave-assisted acid digestion. Mean concentrations of Cd, Cr, Cu, Ni, Pb, and Zn were 1.9, 18.8, 6.4, 4.9, 11.2, and 16.2 mg kg(-1), respectively. The first principal component was associated with lithogenic origin and comprised the metals Cr, Cu, Ni, and Zn. Cluster analysis confirmed that 68 % of the evaluated sites have soil heavy metal concentrations close to the natural background. The Cd concentration (the second principal component) was clearly associated with anthropogenic sources, with P fertilization being the most likely source of Cd to soils. On the other hand, the third component (Pb concentration) indicates a mixed origin for this metal (natural and anthropogenic); hence, Pb concentrations are probably related not only to the soil parent material but also to industrial emissions and urbanization in the vicinity of the agricultural areas.
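The PCA step of this workflow (correlation matrix, eigendecomposition, inspection of loadings) can be sketched on synthetic data; the factor structure below is fabricated to mimic the lithogenic/anthropogenic split, not the study's measurements:

```python
# Synthetic sketch: sixty "fields", five metals, two latent sources.
# Cr, Cu, Ni and Zn load on a lithogenic factor and Cd on an
# anthropogenic factor. PCA via eigendecomposition of the
# correlation matrix.
import numpy as np

rng = np.random.default_rng(0)
n = 60
lithogenic = rng.normal(size=n)     # parent-material factor
anthropogenic = rng.normal(size=n)  # fertilizer/industrial factor

X = np.column_stack([               # Cr, Cu, Ni, Zn, Cd
    1.0 * lithogenic + 0.2 * rng.normal(size=n),
    0.9 * lithogenic + 0.2 * rng.normal(size=n),
    0.9 * lithogenic + 0.2 * rng.normal(size=n),
    0.8 * lithogenic + 0.2 * rng.normal(size=n),
    1.0 * anthropogenic + 0.2 * rng.normal(size=n),
])

R = np.corrcoef(X, rowvar=False)          # correlation matrix
eigval, eigvec = np.linalg.eigh(R)
order = np.argsort(eigval)[::-1]          # sort components by variance
eigval, eigvec = eigval[order], eigvec[:, order]

explained = eigval / eigval.sum()
loadings = eigvec * np.sqrt(eigval)       # component loadings
print("variance explained (PC1, PC2):", np.round(explained[:2], 2))
print("PC1 loadings:", np.round(loadings[:, 0], 2))
```

In this toy setup the first component collects the four correlated "lithogenic" metals and the second isolates Cd, which is the pattern the study interprets as natural versus anthropogenic origin.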

  1. PM10 source apportionment study in Pleasant Valley, Nevada

    International Nuclear Information System (INIS)

    Egami, R.T.; Chow, J.C.; Watson, J.G.; DeLong, T.

    1990-01-01

A source apportionment study was conducted between March 18 and April 4, 1988, at Pleasant Valley, Nevada, to evaluate the air pollutant concentrations to which community residents were exposed and the source contributions to those pollutants. Daily PM10 samples were taken for chemical speciation of 40 trace elements, ions, and organic and elemental carbon. The objectives of this case study were: to determine the emissions source composition of the potential upwind source, a geothermal plant; to measure the ambient particulate concentration and its chemical characteristics in Pleasant Valley; and to estimate the contributions of different emissions sources to PM10. The study found that: particulate emissions from the geothermal cooling-tower plume consisted primarily of sulfate, ammonia, chloride, and trace elements; no significant quantities of toxic inorganic species were found in the ambient air; ambient PM10 concentrations in Pleasant Valley were within Federal standards; and source contributions to PM10 were approximately 60% geological material, 20% motor vehicle exhaust, and 10% cooling-tower plume.
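Apportionment estimates of this kind typically come from a chemical mass balance: measured ambient species concentrations are expressed as a linear mix of source profiles and solved for the source contributions. A minimal sketch with invented profiles, not the study's measured compositions:

```python
# Chemical mass balance sketch: ambient concentrations of marker
# species = source profiles x source contributions, solved by least
# squares. All numbers are illustrative.
import numpy as np

species = ["Si", "Fe", "OC", "EC", "SO4", "Cl"]
# columns: geological, motor vehicle, cooling-tower plume
# (mass fraction of each species in each source's PM10)
profiles = np.array([
    [0.30, 0.01, 0.00],   # Si
    [0.10, 0.02, 0.00],   # Fe
    [0.05, 0.40, 0.00],   # OC
    [0.01, 0.30, 0.00],   # EC
    [0.02, 0.02, 0.50],   # SO4
    [0.01, 0.00, 0.30],   # Cl
])

true_contrib = np.array([18.0, 6.0, 3.0])   # ug/m3 from each source
ambient = profiles @ true_contrib           # noise-free "measurements"

contrib, *_ = np.linalg.lstsq(profiles, ambient, rcond=None)
for name, c in zip(["geological", "vehicle", "cooling tower"], contrib):
    print(f"{name}: {c:.1f} ug/m3")
```

Real CMB fits weight each species by its measurement uncertainty and constrain contributions to be non-negative; the noise-free least-squares fit above recovers the constructed contributions exactly.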

  2. P-wave pulse analysis to retrieve source and propagation effects in the case of Vrancea earthquakes

    International Nuclear Information System (INIS)

    Popescu, E.; Popa, M.; Placinta, A.; Grecu, B.; Radulian, M.

    2004-01-01

Seismic source parameters and attenuation structure properties are obtained from first P-wave pulse analysis and empirical Green's function deconvolution. The P pulse characteristics reflect the combined effects of source and path properties. To recover the real source and structure parameters, it is crucial to apply a method able to distinguish between the different factors affecting the observed seismograms. For example, the empirical Green's function deconvolution method (Hartzell, 1978) allows the retrieval of the apparent source time function, or source spectrum, corrected for path, site and instrumental effects. The apparent source duration is given by the width of the deconvolved source pulse and is directly related to the source dimension. Once the source time function is established, we can then extract the parameters related to path effects. The difference between the pulse recorded at a given station and the source pulse obtained by deconvolution is a measure of the attenuation along the path from focus to station. On the other hand, the pulse width variations with azimuth depend critically on the fault plane orientation and source directivity. In favourable circumstances (high signal/noise ratio, high resolution and good station coverage), the method of analysis proposed in this paper allows constraining which of the two nodal planes of the fault plane solution is the rupture plane, even for small events. P-wave pulse analysis was applied to 25 Vrancea earthquakes recorded between 1999 and 2003 by the Romanian local network to determine source parameters and attenuation properties. Our results reveal high stress-drop seismic energy release with a relatively simple rupture process for the considered events and strong lateral variation of the attenuation of seismic waves across the Carpathians Arc. (authors)
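The deconvolution step can be sketched as a spectral division stabilized by a water level; the signals below are synthetic, and the waveform parameters are arbitrary choices, not Vrancea data:

```python
# Empirical Green's function deconvolution sketch: divide the large
# event's spectrum by a colocated small event's spectrum (which
# carries the same path/site response) to recover the apparent
# source time function. A water level guards against division by
# near-zero spectral values.
import numpy as np

n, dt = 512, 0.01
t = np.arange(n) * dt

# shared path/site impulse response and a Gaussian source pulse
path = np.exp(-t / 0.05) * np.sin(2 * np.pi * 20.0 * t)
stf = np.exp(-((t - 0.3) / 0.04) ** 2)

egf = path                              # small event: near-impulse source
big = np.convolve(stf, path)[:n] * dt   # large event: stf convolved with path

BIG, EGF = np.fft.rfft(big), np.fft.rfft(egf)
water = 0.01 * np.abs(EGF).max()        # water-level stabilization
denom = np.where(np.abs(EGF) < water, water, EGF)
rstf = np.fft.irfft(BIG / denom, n) / dt

peak_time = t[np.argmax(rstf)]
print(f"recovered source pulse peaks at {peak_time:.2f} s (true: 0.30 s)")
```

The width of the recovered pulse is the apparent source duration discussed in the record; comparing it station by station against the raw pulse widths is what separates source from path effects.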

  3. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-07-01

The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlations and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71°17'N, 156°47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO2), methane (CH4), carbon monoxide (CO), ozone (O3), the aerosol scattering coefficient (σsp), the aerosol number concentration (NCasl), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between the various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of the species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on the Barrow records? To answer these questions we apply the following main research tools. First, an isentropic trajectory model used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure used to divide the trajectories into groups based on source regions. Third, various statistical analysis tools, such as exploratory data analysis, two-component correlation analysis, trend analysis, and principal components and factor analysis, used to identify the relationships between the various chemical species and source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on atmospheric composition in the Arctic.

  4. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    International Nuclear Information System (INIS)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-01-01

The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlations and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71°17'N, 156°47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO2), methane (CH4), carbon monoxide (CO), ozone (O3), the aerosol scattering coefficient (σsp), the aerosol number concentration (NCasl), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between the various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of the species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on the Barrow records? To answer these questions we apply the following main research tools. First, an isentropic trajectory model used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure used to divide the trajectories into groups based on source regions. Third, various statistical analysis tools, such as exploratory data analysis, two-component correlation analysis, trend analysis, and principal components and factor analysis, used to identify the relationships between the various chemical species and source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on atmospheric composition in the Arctic.
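The clustering step (the second tool above) can be sketched with a plain k-means grouping of flattened trajectory coordinates; the trajectories and "source regions" below are synthetic, not derived from ECMWF fields:

```python
# Back-trajectories, represented as flattened (lat, lon) offsets from
# the receptor, grouped by k-means so that each cluster corresponds
# to a broad transport pathway / source region.
import numpy as np

rng = np.random.default_rng(1)

def synthetic_trajectories(endpoint, n, steps=10):
    """n noisy trajectories drifting from the receptor toward endpoint."""
    drift = np.linspace(0, 1, steps)[:, None] * np.asarray(endpoint)
    return np.stack([
        (drift + rng.normal(scale=0.3, size=(steps, 2))).ravel()
        for _ in range(n)
    ])

# two hypothetical pathways relative to the receptor (e.g. Barrow)
X = np.vstack([
    synthetic_trajectories(( 8.0, -3.0), 20),   # pathway A
    synthetic_trajectories((-5.0,  6.0), 20),   # pathway B
])

def kmeans(X, k, iters=50):
    # deterministic init: evenly spaced samples as starting centers
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        centers = np.stack([X[labels == j].mean(0) for j in range(k)])
    return labels

labels = kmeans(X, 2)
print("cluster sizes:", np.bincount(labels))
```

Once trajectories are labeled by cluster, the chemical record at the receptor can be stratified by transport pathway, which is the basis for the species-versus-source-region statistics the study describes.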

  5. Multiwavelength study of Chandra X-ray sources in the Antennae

    Science.gov (United States)

    Clark, D. M.; Eikenberry, S. S.; Brandl, B. R.; Wilson, J. C.; Carson, J. C.; Henderson, C. P.; Hayward, T. L.; Barry, D. J.; Ptak, A. F.; Colbert, E. J. M.

    2011-01-01

We use Wide-field InfraRed Camera (WIRC) infrared (IR) images of the Antennae (NGC 4038/4039) together with the extensive catalogue of 120 X-ray point sources to search for counterpart candidates. Using our proven frame-tie technique, we find 38 X-ray sources with IR counterparts, almost doubling the number of IR counterparts to X-ray sources that we first identified. In our photometric analysis, we consider the 35 IR counterparts that are confirmed star clusters. We show that the clusters with X-ray sources tend to be brighter, Ks ≈ 16 mag, with (J-Ks) = 1.1 mag. We then use archival Hubble Space Telescope (HST) images of the Antennae to search for optical counterparts to the X-ray point sources. We employ our previous IR-to-X-ray frame-tie as an intermediary to establish a precise optical-to-X-ray frame-tie with <0.6 arcsec rms positional uncertainty. Due to the high optical source density near the X-ray sources, we determine that we cannot reliably identify counterparts. Comparing the HST positions to the 35 identified IR star cluster counterparts, we find optical matches for 27 of these sources. Using Bruzual-Charlot spectral evolutionary models, we find that most clusters associated with an X-ray source are massive and young (~10^6 yr).

  6. High flux isotope reactor cold source preconceptual design study report

    International Nuclear Information System (INIS)

    Selby, D.L.; Bucholz, J.A.; Burnette, S.E.

    1995-12-01

In February 1995, the deputy director of Oak Ridge National Laboratory (ORNL) formed a group to examine the need for upgrades to the High Flux Isotope Reactor (HFIR) system in light of the cancellation of the Advanced Neutron Source Project. One of the major findings of this study was that there was an immediate need for the installation of a cold neutron source facility in the HFIR complex. The anticipated cold source will consist of a cryogenic LH2 moderator plug, a cryogenic pump system, a refrigerator that uses helium gas as a refrigerant, a heat exchanger to interface the refrigerant with the hydrogen loop, liquid hydrogen transfer lines, a gas handling system that includes vacuum lines, and an instrumentation and control system to provide constant system status monitoring and to maintain system stability. The scope of this project includes the development, design, safety analysis, procurement/fabrication, testing, and installation of all of the components necessary to produce a working cold source within an existing HFIR beam tube. This project will also include those activities necessary to transport the cold neutron beam to the front face of the present HFIR beam room. The cold source project has been divided into four phases: (1) preconceptual, (2) conceptual design and research and development (R&D), (3) detailed design and procurement, and (4) installation and operation. This report marks the conclusion of the preconceptual phase and establishes the concept feasibility. The information presented includes the project scope, the preliminary design requirements, the preliminary cost and schedule, the preliminary performance data, and an outline of the various plans for completing the project.

  7. Determination of volatile organic compounds pollution sources in malaysian drinking water using multivariate analysis.

    Science.gov (United States)

    Soh, Shiau-Chian; Abdullah, Md Pauzi

    2007-01-01

A field investigation was conducted at all water treatment plants throughout 11 states and the Federal Territory in Peninsular Malaysia. The sampling points in this study included treatment plant operations, service reservoir outlets and auxiliary outlet points along the water pipelines. Analysis was performed by solid phase micro-extraction with a 100 microm polydimethylsiloxane fibre, using gas chromatography with mass spectrometric detection to analyse 54 volatile organic compounds (VOCs) of different chemical families in drinking water. The concentrations of VOCs ranged from undetectable to 230.2 microg/l. Among all the VOC species, chloroform had the highest concentration and was detected in all drinking water samples. Average concentrations of total trihalomethanes (THMs) were similar across all states, in the range 28.4-33.0 microg/l. Apart from THMs, other abundant compounds detected were cis- and trans-1,2-dichloroethylene, trichloroethylene, 1,2-dibromoethane, benzene, toluene, ethylbenzene, chlorobenzene, 1,4-dichlorobenzene and 1,2-dichlorobenzene. Principal component analysis (PCA) with the aid of varimax rotation, and the parallel factor analysis (PARAFAC) method, were used to statistically verify the correlation between VOCs and the sources of pollution. The multivariate analysis indicated that the maintenance of auxiliary pipelines in the distribution systems is vital, as they can become significant point sources of pollution in Malaysian drinking water.

  8. Characterization of polar organic compounds and source analysis of fine organic aerosols in Hong Kong

    Science.gov (United States)

    Li, Yunchun

Organic aerosols, as an important fraction of airborne particulate mass, significantly affect the environment, climate, and human health. Compared with inorganic species, the characterization of individual organic compounds is much less complete and comprehensive, because they number in the thousands or more and are diverse in chemical structure. The source contributions of organic aerosols are far from being well understood because they can be emitted from a variety of sources as well as formed from photochemical reactions of numerous precursors. This thesis work aims to improve the characterization of polar organic compounds and the source apportionment analysis of fine organic carbon (OC) in Hong Kong, and consists of two parts: (1) An improved analytical method to determine monocarboxylic acids, dicarboxylic acids, ketocarboxylic acids, and dicarbonyls collected on filter substrates has been established. These oxygenated compounds were determined as their butyl ester or butyl acetal derivatives using gas chromatography-mass spectrometry. The new method improves on the original Kawamura method by eliminating the water extraction and evaporation steps. Aerosol materials were directly mixed with the BF3/BuOH derivatization agent and the extracting solvent hexane. This modification improves recoveries for both the more volatile and the less water-soluble compounds. The improved method was applied to study the abundances and sources of these oxygenated compounds in PM2.5 aerosol samples collected in Hong Kong under different synoptic conditions during 2003-2005. These compounds account for on average 5.2% of OC (range: 1.4%-13.6%) on a carbon basis. Oxalic acid was the most abundant species. Six C2 and C3 oxygenated compounds, namely oxalic, malonic, glyoxylic and pyruvic acids, glyoxal, and methylglyoxal, dominated this suite of oxygenated compounds. More efforts are therefore suggested to focus on these small compounds in understanding the role of oxygenated

  9. Analysis of the Potential of Low-Temperature Heat Pump Energy Sources

    Directory of Open Access Journals (Sweden)

    Pavel Neuberger

    2017-11-01

The paper deals with an analysis of the temperatures of the ground mass in the proximity of linear and slinky-type HGHEs (horizontal ground heat exchangers). It evaluates and compares the potentials of HGHEs and ambient air. The aim of the verification was to gain knowledge of the temperature course of the monitored low-temperature heat pump energy sources during heating periods and periods of stagnation, and to analyse this knowledge in terms of the potential to use those sources for heat pumps. The study was conducted in the years 2012-2015, covering three heating periods and three periods of HGHE stagnation. The results revealed that the linear HGHE had the highest temperature potential of the observed low-temperature heat pump energy sources. The average daily temperatures of the ground mass surrounding the linear HGHE were the highest, ranging from 7.08 °C to 9.20 °C during the heating periods, with the lowest temperature variation range of 12.62-15.14 K; the relative frequency of the average daily ground mass temperatures was highest, at 22.64%, in the temperature range containing the mode of all monitored temperatures, the recorded interval of [4.10, 6.00] °C. Ambient air had a lower temperature potential than the monitored HGHEs.

  10. Study on a groundwater source heat pump cooling system in solar greenhouse

    Energy Technology Data Exchange (ETDEWEB)

    Chai, Lilong; Ma, Chengwei [China Agricultural Univ., Beijing (China). Coll. of Water Conservancy and Civil Engineering. Dept. of Agricultural Structure and Bio-environmental Engineering], E-mail: macwbs@cau.edu.cn

    2008-07-01

This study aims at exploiting the potential of ground source heat pump (GSHP) technology for cooling agricultural greenhouses, and at advocating the use of renewable and clean energy in agriculture. GSHP systems offer the multiple functions of heating, cooling and dehumidifying, and GSHP is one of the fastest growing renewable-energy air conditioning technologies of recent years. The authors carried out an experiment on a ground source heat pump system cooling a greenhouse in the Beijing region during the summer of 2007, and analysed the energy efficiency of the system using the coefficient of performance (COP). According to the data collected during August 13-18, 2007, the coefficient of performance of the GSHP system (COPsys) averaged 3.15 during the test. (author)
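The reported COP is a simple ratio of thermal energy delivered to electrical energy consumed over the test period; a minimal sketch with illustrative numbers, not the experiment's measurements:

```python
# Coefficient of performance of a heat pump over some period:
# useful thermal (cooling) energy out per unit of electricity in.

def cop(thermal_energy_kwh, electrical_energy_kwh):
    """Coefficient of performance over a measurement period."""
    return thermal_energy_kwh / electrical_energy_kwh

# e.g. 63 kWh of cooling delivered for 20 kWh of electricity:
print(round(cop(63.0, 20.0), 2))  # 3.15
```

In practice both energies are integrated from logged flow rates, water temperature differences and electrical power over the test window, then divided as above.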

  11. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  12. RASCAL : a domain specific language for source code analysis and manipulationa

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  13. CHANDRA ACIS SURVEY OF X-RAY POINT SOURCES: THE SOURCE CATALOG

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Song; Liu, Jifeng; Qiu, Yanli; Bai, Yu; Yang, Huiqin; Guo, Jincheng; Zhang, Peng, E-mail: jfliu@bao.ac.cn, E-mail: songw@bao.ac.cn [Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China)

    2016-06-01

The Chandra archival data is a valuable resource for various studies on different X-ray astronomy topics. In this paper, we utilize this wealth of information and present a uniformly processed data set, which can be used to address a wide range of scientific questions. The data analysis procedures are applied to 10,029 Advanced CCD Imaging Spectrometer observations, which produces 363,530 source detections belonging to 217,828 distinct X-ray sources. This number is twice the size of the Chandra Source Catalog (Version 1.1). The catalogs in this paper provide abundant estimates of the detected X-ray source properties, including source positions, counts, colors, fluxes, luminosities, variability statistics, etc. Cross-correlation of these objects with galaxies shows that 17,828 sources are located within the D25 isophotes of 1110 galaxies, and 7504 sources are located between the D25 and 2D25 isophotes of 910 galaxies. Contamination analysis with the log N-log S relation indicates that 51.3% of objects within the 2D25 isophotes are truly relevant to galaxies, and the “net” source fraction increases to 58.9%, 67.3%, and 69.1% for sources with luminosities above 10^37, 10^38, and 10^39 erg s^-1, respectively. Among the possible scientific uses of this catalog, we discuss the possibility of studying intra-observation variability, inter-observation variability, and supersoft sources (SSSs). About 17,092 detected sources above 10 counts are classified as variable within an individual observation by the Kolmogorov-Smirnov (K-S) criterion (P_K-S < 0.01). There are 99,647 sources observed more than once and 11,843 sources observed 10 times or more, offering us a wealth of data with which to explore long-term variability. There are 1638 individual objects (~2350 detections) classified as SSSs. For this quite interesting subclass, detailed studies of the X-ray spectra and optical spectroscopic follow-up are needed.
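The intra-observation variability screen can be sketched as a one-sample K-S test of photon arrival times against a constant-rate model; the arrival times below are synthetic, and the 1.63/√n threshold is the standard large-sample approximation for P < 0.01:

```python
# K-S variability screen: compare photon arrival times within one
# observation to a uniform arrival rate; flag as variable if the
# K-S distance exceeds the (approximate) alpha = 0.01 critical value.
import numpy as np

rng = np.random.default_rng(42)
t_exp = 10_000.0                      # exposure time in seconds

steady = np.sort(rng.uniform(0.0, t_exp, 200))
flaring = np.sort(np.concatenate([    # half the counts in a late flare
    rng.uniform(0.0, t_exp, 100),
    rng.uniform(0.8 * t_exp, t_exp, 100),
]))

def ks_statistic(times, t_exp):
    """K-S distance between arrival times and a uniform arrival rate."""
    u = np.sort(times) / t_exp
    i = np.arange(1, len(u) + 1)
    return max(np.max(i / len(u) - u), np.max(u - (i - 1) / len(u)))

def is_variable(times, t_exp, c_alpha=1.63):
    return ks_statistic(times, t_exp) > c_alpha / np.sqrt(len(times))

print("steady flagged variable:", bool(is_variable(steady, t_exp)))
print("flaring flagged variable:", bool(is_variable(flaring, t_exp)))
```

A production pipeline would compute the exact p-value rather than the asymptotic threshold, but the logic is the same: a flare concentrates arrivals late in the exposure and drives the empirical arrival-time distribution far from the uniform line.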

  14. Factors influencing the spatial extent of mobile source air pollution impacts: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Levy Jonathan I

    2007-05-01

Background: There has been growing interest among exposure assessors, epidemiologists, and policymakers in the concept of "hot spots", or more broadly, the "spatial extent" of impacts from traffic-related air pollutants. This review attempts to quantitatively synthesize findings about the spatial extent under various circumstances. Methods: We include both the peer-reviewed literature and government reports, and focus on four significant air pollutants: carbon monoxide, benzene, nitrogen oxides, and particulate matter (including both ultrafine particle counts and fine particle mass). From the identified studies, we extracted information about significant factors that would be hypothesized to influence the spatial extent within the study, such as the study type (e.g., monitoring, air dispersion modeling, GIS-based epidemiological study), the focus on concentrations or health risks, the pollutant under study, the background concentration, the emission rate, and meteorological factors, as well as the study's implicit or explicit definition of spatial extent. We supplement this meta-analysis with results from some illustrative atmospheric dispersion modeling. Results: We found that pollutant characteristics and background concentrations best explained variability in previously published spatial extent estimates, with a modifying influence of local meteorology, once some extreme values based on health risk estimates were removed from the analysis. As hypothesized, inert pollutants with high background concentrations had the largest spatial extent (often demonstrating no significant gradient), and pollutants formed in near-source chemical reactions (e.g., nitrogen dioxide) had a larger spatial extent than pollutants depleted in near-source chemical reactions or removed through coagulation processes (e.g., nitrogen oxide and ultrafine particles). Our illustrative dispersion modeling demonstrated the complex interplay of spatial extent definitions, emission rates
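One operational definition of spatial extent, the downwind distance at which a near-ground concentration falls below some fraction of its near-source value, can be sketched with a textbook-style Gaussian plume formula; the power-law dispersion coefficients below are rough neutral-stability assumptions, not values from any study in the review:

```python
# Ground-level centerline concentration from a continuous point
# source (Gaussian plume). Concentration decays with downwind
# distance x as the plume spreads, which is what produces a finite
# "spatial extent" for near-road gradients.
import math

def centerline_concentration(q, u, x, h=2.0):
    """Ground-level centerline concentration (g/m3) at downwind x (m).

    q: emission rate (g/s); u: wind speed (m/s); h: release height (m).
    sigma_y, sigma_z follow rough neutral-stability power laws.
    """
    sigma_y = 0.08 * x ** 0.9
    sigma_z = 0.06 * x ** 0.9
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-h ** 2 / (2 * sigma_z ** 2)))

# concentration falls off steeply with distance from the source:
for x in (50, 100, 200, 400):
    print(x, centerline_concentration(q=1.0, u=3.0, x=x))
```

Against a high regional background, this steep decay is quickly swamped, which is consistent with the review's finding that background concentration strongly conditions the apparent spatial extent.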

  15. Nuisance Source Population Modeling for Radiation Detection System Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sokkappa, P; Lange, D; Nelson, K; Wheeler, R

    2009-10-05

    source frequencies, but leave the task of estimating these frequencies for future work. Modeling of nuisance source populations is only useful if it helps in understanding detector system performance in real operational environments. Examples of previous studies in which nuisance source models played a key role are briefly discussed. These include screening of in-bound urban traffic and monitoring of shipping containers in transit to U.S. ports.

  16. Real-time analysis, visualization, and steering of microtomography experiments at photon sources

    International Nuclear Information System (INIS)

    Laszeski, G. von; Insley, J.A.; Foster, I.; Bresnahan, J.; Kesselman, C.; Su, M.; Thiebaux, M.; Rivers, M.L.; Wang, S.; Tieman, B.; McNulty, I.

    2000-01-01

A new generation of specialized scientific instruments called synchrotron light sources allows the imaging of materials at very fine scales. However, in contrast to a traditional microscope, interactive use has not previously been possible because of the large amounts of data generated and the considerable computation required to translate these data into a useful image. The authors describe a new software architecture that uses high-speed networks and supercomputers to enable quasi-real-time, and hence interactive, analysis of synchrotron light source data. This architecture uses technologies provided by the Globus computational grid toolkit to allow dynamic creation of a reconstruction pipeline that transfers data from a synchrotron source beamline to a preprocessing station, then to a parallel reconstruction system, and finally to multiple visualization stations. Collaborative analysis tools allow multiple users to control the data visualization. As a result, local and remote scientists can see and discuss preliminary results just minutes after data collection starts. The implications for more efficient use of this scarce resource and for more effective science appear tremendous.

  17. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    Natalizio, A.; Kalyanam, K.M.

    1994-09-01

The work described here, part of the Canadian Fusion Fuels Technology Project (CFFTP) contribution to ITER task NID-5a, Initial Tritium Source Term, constitutes the safety analysis basis for establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a Loss of Coolant Accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure and an isotope separation system process boundary failure. 9 figs

  18. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  19. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
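A minimal sketch of the basic quantity these toolboxes compute is the plug-in (direct) estimate of mutual information between a discrete stimulus and a discretized response. The code below is an illustrative numpy implementation on toy data, not the toolboxes' API, and it omits the bias corrections real toolboxes apply.

```python
import numpy as np

# Plug-in estimator of mutual information I(S;R) in bits from paired
# discrete samples (no bias correction; illustrative only).
def mutual_information_bits(stim, resp):
    stim = np.asarray(stim)
    resp = np.asarray(resp)
    _, s_idx = np.unique(stim, return_inverse=True)
    _, r_idx = np.unique(resp, return_inverse=True)
    joint = np.zeros((s_idx.max() + 1, r_idx.max() + 1))
    np.add.at(joint, (s_idx, r_idx), 1.0)      # joint histogram
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)      # marginal p(s)
    pr = joint.sum(axis=0, keepdims=True)      # marginal p(r)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ps * pr)[nz])).sum())

# A response that copies a binary stimulus carries exactly 1 bit;
# an unrelated response carries (asymptotically) 0 bits.
stim = np.repeat([0, 1], 500)
i_copy = mutual_information_bits(stim, stim)
rng = np.random.default_rng(0)
i_noise = mutual_information_bits(stim, rng.integers(0, 2, 1000))
```

The residual positive value of `i_noise` is the well-known upward bias of the plug-in estimator on finite samples, which is precisely what the toolboxes' correction methods address.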

  20. Experimental study of high current negative ion sources D- / H-. Analysis based on the simulation of the negative ion transport in the plasma source

    International Nuclear Information System (INIS)

    Riz, D.

    1996-01-01

In the frame of the development of a neutral beam injection system able to work on the ITER tokamak (International Thermonuclear Experimental Reactor), two negative ion sources, Dragon and Kamaboko, have been installed on the MANTIS test bed in Cadarache and studied in order to extract 20 mA/cm² of D⁻. The two production modes of negative ions have been investigated: volume production; surface production after cesium injection in the discharge. Experiments have shown that cesium seeding is necessary in order to reach the requested performances for ITER. 20 mA/cm² have been extracted from the Kamaboko source for an arc power density of 2.5 kW/liter. Simultaneously, a code called NIETZSCHE has been developed to simulate the transport of negative ions in the source plasma, from their birth place to the extraction holes. The ion trajectory is calculated by numerically solving the 3D motion equation, while the atomic processes of destruction, of elastic collisions H⁻/H⁺ and of charge exchange H⁻/H⁰ are handled at each time step by a Monte Carlo procedure. The code allows one to obtain the extraction probability of a negative ion produced at a given location. The calculations performed with NIETZSCHE have explained several phenomena observed on negative ion sources, such as the isotopic effect H⁻/D⁻ and the influence of the polarisation of the plasma grid and of the magnetic filter on the negative ion current. The code has also shown that, in the type of sources contemplated for ITER, working with large arc power densities (> 1 kW/liter), only negative ions produced in volume at a distance lower than 2 cm from the plasma grid and those produced at the grid surface have a chance of being extracted. (author)

  1. GLOBAL SOURCING: A THEORETICAL STUDY ON TURKEY

    Directory of Open Access Journals (Sweden)

    Aytac GOKMEN

    2010-07-01

Global sourcing is to source from the global market for goods and services across national boundaries in order to take advantage of global efficiencies in the delivery of a product or service. Such efficiencies consist of low-cost skilled labor, low-cost raw materials and other economic factors such as tax breaks and deductions as well as low trade tariffs. When we assess the case with regard to Turkey, global sourcing is an effective device for some firms. The domestic firms in Turkey in various industries are inclined to source finished or intermediate goods globally from the world markets, finish the production process in Turkey and export. Eventually, on the one hand the export volume of Turkey increases, but on the other hand the import of a considerable volume of finished or intermediate goods brings about a negative trade balance and loss of jobs in Turkey. Therefore, the objective of this study is to assess the concept of global sourcing transactions in Turkey, resting on comprehensive publications.

  2. Extracting functional components of neural dynamics with Independent Component Analysis and inverse Current Source Density.

    Science.gov (United States)

    Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K

    2010-12-01

Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids one can efficiently reconstruct the current source density (CSD) using the inverse Current Source Density method (iCSD). It is possible to decompose the resultant spatiotemporal information about the current dynamics into functional components using Independent Component Analysis (ICA). We show on test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points that meaningful results are obtained with spatial ICA decomposition of reconstructed CSD. The components obtained through decomposition of CSD are better defined and allow easier physiological interpretation than the results of similar analysis of corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components, we use the technique to study the somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method brings up new, more detailed information on the time and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.
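The classical predecessor of the iCSD reconstruction described here is the second-spatial-difference CSD estimate, CSD(z) ≈ -σ d²φ/dz². The numpy sketch below applies it to a synthetic one-dimensional potential profile (the conductivity value and geometry are illustrative assumptions); the iCSD method of the entry instead inverts an explicit forward model, but the sink/source logic is the same.

```python
import numpy as np

# Classical CSD estimate on a 1-D electrode array via the second
# spatial difference: CSD(z) = -sigma * d^2(phi)/dz^2.
def csd_second_difference(phi, dz, sigma=0.3):
    """phi: potentials [V] at equally spaced depths; dz: spacing [m];
    sigma: extracellular conductivity [S/m] (illustrative value).
    Returns CSD at the interior contacts."""
    d2 = (phi[:-2] - 2.0 * phi[1:-1] + phi[2:]) / dz**2
    return -sigma * d2

# Synthetic LFP: a Gaussian potential well centred mid-array,
# as produced by a localized current sink.
z = np.linspace(0.0, 1e-3, 21)                 # 21 contacts over 1 mm
dz = z[1] - z[0]
phi = -1e-3 * np.exp(-((z - 0.5e-3) / 0.15e-3) ** 2)
csd = csd_second_difference(phi, dz)           # sink -> negative CSD at centre
```

The reconstructed profile shows a negative CSD (a sink) at the centre of the potential well flanked by positive return currents, which is the pattern ICA components are then extracted from.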

  3. Thermal hydraulic analysis of the encapsulated nuclear heat source

    Energy Technology Data Exchange (ETDEWEB)

    Sienicki, J.J.; Wade, D.C. [Argonne National Lab., IL (United States)

    2001-07-01

    An analysis has been carried out of the steady state thermal hydraulic performance of the Encapsulated Nuclear Heat Source (ENHS) 125 MWt, heavy liquid metal coolant (HLMC) reactor concept at nominal operating power and shutdown decay heat levels. The analysis includes the development and application of correlation-type analytical solutions based upon first principles modeling of the ENHS concept that encompass both pure as well as gas injection augmented natural circulation conditions, and primary-to-intermediate coolant heat transfer. The results indicate that natural circulation of the primary coolant is effective in removing heat from the core and transferring it to the intermediate coolant without the attainment of excessive coolant temperatures. (authors)

  4. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

Radioisotope-excited XRF systems, using annular sources, are widely used in view of their simplicity, wide availability, relatively low price for the complete system and good overall performance with respect to accuracy and detection limits. However, some problems arise when the use of fundamental parameter techniques for quantitative analysis is attempted. These problems are due to the fact that the systems operate with large solid angles for incoming and emerging radiation, and both the incident and take-off angles are not trivial. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF set-ups and prediction of the response of inhomogeneous samples. A study of the sensitivity of the results to sample characteristics and a comparison of the results with experimentally determined values for incident and take-off angles is also presented. A flexible and user-friendly computer program was developed in order to perform efficiently the lengthy calculations involved. (author). 14 refs. 5 figs
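The idea of a Monte Carlo "effective angle" can be sketched in a few lines: sample emission points over the annular source and average the incidence angle they subtend at the sample. The geometry below (radii, height) is a made-up illustration, not the paper's setup, and a real calculation would also weight by attenuation and detection efficiency.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical geometry (illustrative, not from the paper): annular
# source, inner radius 8 mm, outer radius 12 mm, 10 mm above a point
# sample on the axis. The "effective" incidence angle is the
# emission-weighted mean angle between incoming rays and the normal.
r_in, r_out, h = 8.0, 12.0, 10.0
n = 200_000

# Uniform sampling over the annulus area: pdf of r is proportional to r.
r = np.sqrt(rng.uniform(r_in**2, r_out**2, n))
theta = np.degrees(np.arctan2(r, h))           # incidence angle per ray
eff_angle = theta.mean()                       # MC effective angle

# Naive estimate using the mid-radius only, for comparison.
naive = np.degrees(np.arctan2(0.5 * (r_in + r_out), h))
```

Because the outer part of the annulus has more area, the MC average is pulled slightly above the mid-radius value, which is exactly the kind of correction the effective-angle approach captures.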

  5. The application of x-ray spectrometry to isotopic-source activation analysis of dysprosium and holmium

    International Nuclear Information System (INIS)

    Pillay, A.E.; Mboweni, R.C.M.

    1990-01-01

A novel aspect of activation analysis is described for the determination of dysprosium and holmium at low concentrations. The method involves the measurement of K x-rays from radionuclides produced by thermal neutron activation using a 1 mg ²⁵²Cf source. The basis for elemental selection depends largely on the demand for analysis and on the existence of favourable nuclear properties for the production of a practicable x-ray yield. A full appraisal of the analytical potential of the method is presented with particular emphasis on its application to geological matrices. The sensitivity was optimised by employing a detector that was particularly effective at photon energies below 150 keV. Analytical conditions are demonstrated for the elements of interest over a wide range of concentrations in small powdered samples. The investigation formed the basis of a feasibility study to establish if the application could be developed for the routine off-line determination of dysprosium and holmium using an isotopic-neutron source. (author)

  6. Comparative studies of energy sources in gynecologic laparoscopy.

    Science.gov (United States)

    Law, Kenneth S K; Lyons, Stephen D

    2013-01-01

    Energy sources incorporating "vessel sealing" capabilities are being increasingly used in gynecologic laparoscopic surgery although conventional monopolar and bipolar electrosurgery remain popular. The preference for one device over another is based on a combination of factors, including the surgeon's subjective experience, availability, and cost. Although comparative clinical studies and meta-analyses of laparoscopic energy sources have reported small but statistically significant differences in volumes of blood loss, the clinical significance of such small volumes is questionable. The overall usefulness of the various energy sources available will depend on a number of factors including vessel burst pressure and seal time, lateral thermal spread, and smoke production. Animal studies and laboratory-based trials are useful in providing a controlled environment to investigate such parameters. At present, there is insufficient evidence to support the use of one energy source over another. Copyright © 2013 AAGL. All rights reserved.

  7. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    Science.gov (United States)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
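The core PSHA computation can be sketched for a single point source: combine a truncated Gutenberg-Richter recurrence with a ground-motion model and sum the annual rates of exceeding each acceleration level. All numbers below (recurrence rate, distance, and the GMPE coefficients) are invented for illustration; they are not the models or logic-tree branches of this study.

```python
import numpy as np
from math import sqrt, erf

# Minimal single-source PSHA exceedance-rate calculation (all numbers
# illustrative). Truncated Gutenberg-Richter recurrence on [Mmin, Mmax].
a_rate, b, Mmin, Mmax = 0.05, 1.0, 4.0, 7.0   # a_rate: annual rate of M >= Mmin
R_km = 20.0                                    # source-to-site distance

mags = np.linspace(Mmin, Mmax, 61)
beta = b * np.log(10.0)
cdf = (1 - np.exp(-beta * (mags - Mmin))) / (1 - np.exp(-beta * (Mmax - Mmin)))
pdf = np.gradient(cdf, mags)
rate_m = a_rate * pdf * (mags[1] - mags[0])    # annual rate per magnitude bin

# Made-up GMPE: ln PGA(g) = -1.0 + 1.0*M - 1.5*ln(R + 10), sigma_ln = 0.6
ln_med = -1.0 + 1.0 * mags - 1.5 * np.log(R_km + 10.0)
sigma = 0.6

def exceed_rate(a_g):
    """Annual rate of PGA > a_g, summing over magnitude bins."""
    z = (np.log(a_g) - ln_med) / sigma
    p_exc = 0.5 * (1.0 - np.vectorize(erf)(z / sqrt(2.0)))  # 1 - Phi(z)
    return float((rate_m * p_exc).sum())

pga_levels = np.array([0.05, 0.1, 0.2, 0.4])   # g
rates = np.array([exceed_rate(a) for a in pga_levels])
```

A full study repeats this over all sources and logic-tree branches and inverts the hazard curve at a rate of 1/475 per year to obtain the 475-year-return-period map values.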

  8. Identification of sources of heavy metals in the Dutch atmosphere using air filter and lichen analysis

    International Nuclear Information System (INIS)

    de Bruin, M.; Wolterbeek, H.T.

    1984-01-01

    Aerosol samples collected in an industrialized region were analyzed by instrumental neutron activation analysis. Correlation with wind direction and factor analysis were applied to the concentration data to obtain information on the nature and position of the sources. Epiphytic lichens were sampled over the country and analyzed for heavy metals (As, Cd, Sc, Zn, Sb). The data were interpreted by geographically plotting element concentrations and enrichment factors, and by factor analysis. Some pitfalls are discussed which are associated with the use of aerosol and lichen data in studies of heavy metal air pollution. 14 references, 8 figures, 3 tables

  9. Active Control of Fan Noise: Feasibility Study. Volume 6; Theoretical Analysis for Coupling of Active Noise Control Actuator Ring Sources to an Annular Duct with Flow

    Science.gov (United States)

    Kraft, R. E.

    1996-01-01

    The objective of this effort is to develop an analytical model for the coupling of active noise control (ANC) piston-type actuators that are mounted flush to the inner and outer walls of an annular duct to the modes in the duct generated by the actuator motion. The analysis will be used to couple the ANC actuators to the modal analysis propagation computer program for the annular duct, to predict the effects of active suppression of fan-generated engine noise sources. This combined program will then be available to assist in the design or evaluation of ANC systems in fan engine annular exhaust ducts. An analysis has been developed to predict the modes generated in an annular duct due to the coupling of flush-mounted ring actuators on the inner and outer walls of the duct. The analysis has been combined with a previous analysis for the coupling of modes to a cylindrical duct in a FORTRAN computer program to perform the computations. The method includes the effects of uniform mean flow in the duct. The program can be used for design or evaluation purposes for active noise control hardware for turbofan engines. Predictions for some sample cases modeled after the geometry of the NASA Lewis ANC Fan indicate very efficient coupling in both the inlet and exhaust ducts for the m = 6 spinning mode at frequencies where only a single radial mode is cut-on. Radial mode content in higher order cut-off modes at the source plane and the required actuator displacement amplitude to achieve 110 dB SPL levels in the desired mode were predicted. Equivalent cases with and without flow were examined for the cylindrical and annular geometry, and little difference was found for a duct flow Mach number of 0.1. The actuator ring coupling program will be adapted as a subroutine to the cylindrical duct modal analysis and the exhaust duct modal analysis. This will allow the fan source to be defined in terms of characteristic modes at the fan source plane and predict the propagation to the
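The cut-on behaviour the abstract relies on (a spinning mode m = 6 propagating only above a threshold frequency, with uniform flow making little difference at Mach 0.1) can be illustrated for the simpler hollow cylindrical duct, where the radial eigenvalues are zeros of J'_m; the annular-duct eigenvalues combine Bessel J and Y functions and are not reproduced here. Duct radius and sound speed below are illustrative assumptions.

```python
import numpy as np
from scipy.special import jnp_zeros

# Cut-on frequency of spinning mode (m, n) in a hollow cylindrical duct.
# Uniform axial flow at Mach M lowers cut-on by the factor sqrt(1 - M^2).
c = 343.0          # speed of sound, m/s (illustrative)
a = 0.5            # duct radius, m (illustrative)

def cuton_freq(m, n, mach=0.0):
    alpha = jnp_zeros(m, n)[n - 1]             # n-th zero of J_m'
    return c * alpha / (2.0 * np.pi * a) * np.sqrt(1.0 - mach**2)

f_no_flow = cuton_freq(6, 1)                   # m = 6, first radial mode
f_flow = cuton_freq(6, 1, mach=0.1)
rel_shift = (f_no_flow - f_flow) / f_no_flow   # well under 1% at M = 0.1
```

The sub-percent shift at Mach 0.1 is consistent with the abstract's observation that cases with and without flow differed little.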

  10. Collection, Analysis, and Dissemination of Open Source News and Analysis for Safeguards Implementation and Evaluation

    International Nuclear Information System (INIS)

    Khaled, J.; Reed, J.; Ferguson, M.; Hepworth, C.; Serrat, J.; Priori, M.; Hammond, W.

    2015-01-01

Analysis of all safeguards-relevant information is an essential component of IAEA safeguards and the ongoing State evaluation underlying IAEA verification activities. In addition to State declared safeguards information and information generated from safeguards activities both in the field and at headquarters, the IAEA collects and analyzes information from a wide array of open sources relevant to States' nuclear related activities. A number of these open sources include information that could be loosely categorized as "news": international, regional, and local media; company and government press releases; public records of parliamentary proceedings; and NGO/academic commentaries and analyses. It is the task of the State Factors Analysis Section of the Department of Safeguards to collect, analyze and disseminate news of relevance to support ongoing State evaluation. This information supports State evaluation by providing the Department with a global overview of safeguards-relevant nuclear developments. Additionally, this type of information can support in-depth analyses of nuclear fuel cycle related activities, alerting State Evaluation Groups to potential inconsistencies in State declarations, and preparing inspectors for activities in the field. The State Factors Analysis Section uses a variety of tools, including subscription services, news aggregators, a roster of specialized sources, and a custom software application developed by an external partner to manage incoming data streams and assist with making sure that critical information is not overlooked. When analyzing data, it is necessary to determine the credibility of a given source and piece of information. Data must be considered for accuracy, bias, and relevance to the overall assessment. Analysts use a variety of methodological techniques to make these types of judgments, which are included when the information is presented to State Evaluation Groups. Dissemination of news to

  11. Recommendation of ruthenium source for sludge batch flowsheet studies

    Energy Technology Data Exchange (ETDEWEB)

    Woodham, W. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-09-13

    Included herein is a preliminary analysis of previously-generated data from sludge batches 7a, 7b, 8, and 9 sludge simulant and real-waste testing, performed to recommend a form of ruthenium for future sludge batch simulant testing under the nitric-formic flowsheet. Focus is given to reactions present in the Sludge Receipt and Adjustment Tank cycle, given that this cycle historically produces the most changes in chemical composition during Chemical Process Cell processing. Data is presented and analyzed for several runs performed under the nitric-formic flowsheet, with consideration given to effects on the production of hydrogen gas, nitrous oxide gas, consumption of formate, conversion of nitrite to nitrate, and the removal and recovery of mercury during processing. Additionally, a brief discussion is given to the effect of ruthenium source selection under the nitric-glycolic flowsheet. An analysis of data generated from scaled demonstration testing, sludge batch 9 qualification testing, and antifoam degradation testing under the nitric-glycolic flowsheet is presented. Experimental parameters of interest under the nitric-glycolic flowsheet include N2O production, glycolate destruction, conversion of glycolate to formate and oxalate, and the conversion of nitrite to nitrate. To date, the number of real-waste experiments that have been performed under the nitric-glycolic flowsheet is insufficient to provide a complete understanding of the effects of ruthenium source selection in simulant experiments with regard to fidelity to real-waste testing. Therefore, a determination of comparability between the two ruthenium sources as employed under the nitric-glycolic flowsheet is made based on available data in order to inform ruthenium source selection for future testing under the nitric-glycolic flowsheet.

  12. Study of neutron focusing at the Texas Cold Neutron Source. Final report

    International Nuclear Information System (INIS)

    Wehring, B.W.; Uenlue, K.

    1995-01-01

Funds were received for the first year of a three year DOE Nuclear Engineering Research Grant, "Study of Neutron Focusing at the Texas Cold Neutron Source" (FGO2-92ER75711). The purpose of this three year study was to develop a neutron focusing system to be used with the Texas Cold Neutron Source (TCNS) to produce an intense beam of neutrons. A prompt gamma activation analysis (PGAA) facility was also to be designed, set up, and tested under the three year project. During the first year of the DOE grant, a new procedure was developed and used to design a focusing converging guide consisting of truncated rectangular cone sections. Detailed calculations were performed using a 3-D Monte Carlo code which we wrote to trace neutrons through the curved guide of the TCNS into the proposed converging guide. Using realistic reflectivities for Ni-Ti supermirrors, we obtained gains of 3 to 5 for the neutron flux averaged over an area of 1 x 1 cm
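The kind of Monte Carlo ray tracing described can be illustrated with a toy two-dimensional model of a converging guide: march each neutron forward, reflect it off the tapered walls when the grazing angle is below the supermirror critical angle, and absorb it otherwise. The guide dimensions, critical angle and reflectivity below are invented for illustration and are not the TCNS design values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2-D Monte Carlo trace through a converging guide (all numbers
# illustrative): entrance half-width 15 mm, exit half-width 5 mm,
# length 1 m, critical angle ~1.2 deg, reflectivity 0.85 per bounce.
w1, w2, L = 0.015, 0.005, 1.0
slope = (w1 - w2) / L                  # inward wall inclination, radians
theta_c = np.radians(1.2)
refl = 0.85

def trace(x, th, dz=2e-3):
    """March one neutron down the guide; return its transmitted weight."""
    z, wgt = 0.0, 1.0
    while z < L:
        z += dz
        x += th * dz
        wall = w1 - slope * z          # half-width at this z
        if abs(x) > wall:
            top = x > 0.0
            grazing = abs(th + slope) if top else abs(th - slope)
            if grazing > theta_c:
                return 0.0             # beyond critical angle: absorbed
            wgt *= refl                # supermirror reflection
            th = (-th - 2.0 * slope) if top else (-th + 2.0 * slope)
            x = wall if top else -wall
    return wgt

n = 1000
xs = rng.uniform(-w1, w1, n)
ths = rng.uniform(-theta_c, theta_c, n)
transmitted = np.array([trace(x, th) for x, th in zip(xs, ths)])
frac = transmitted.mean()              # weighted transmission fraction
```

Each bounce off a converging wall increases the ray angle by twice the wall inclination, which is why taper angle and critical angle together limit the achievable gain; the real study evaluated this in 3-D with measured supermirror reflectivity curves.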

  13. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R² > 0.8, and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
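The gamma-fitting step mentioned in the validation is conventionally a gamma-variate fit to the first-pass bolus curve. The sketch below fits that model to synthetic data with scipy; it illustrates the technique only and is not the plugin's Java implementation, and all parameter values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Gamma-variate model routinely fitted to DSC-MRI first-pass bolus
# concentration-time curves: C(t) = A * (t - t0)^alpha * exp(-(t - t0)/beta).
def gamma_variate(t, A, t0, alpha, beta):
    dt = np.clip(t - t0, 0.0, None)
    return A * dt**alpha * np.exp(-dt / beta)

t = np.linspace(0.0, 60.0, 121)                  # seconds
true_params = (5.0, 8.0, 2.0, 4.0)               # A, t0, alpha, beta (invented)
rng = np.random.default_rng(3)
y = gamma_variate(t, *true_params) + rng.normal(0.0, 1.5, t.size)

popt, _ = curve_fit(
    gamma_variate, t, y, p0=(1.0, 5.0, 1.5, 3.0),
    bounds=([0.0, 0.0, 0.1, 0.1], [100.0, 30.0, 10.0, 30.0]),
)
fit_rmse = float(np.sqrt(np.mean(
    (gamma_variate(t, *popt) - gamma_variate(t, *true_params)) ** 2)))
```

Fitting suppresses recirculation and noise before perfusion parameters such as relative blood volume are integrated from the curve, which is why the study reports agreement separately with and without gamma-fitting.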

  14. Climate Change Studies over Bangalore using Multi-source Remote Sensing Data and GIS

    Science.gov (United States)

    B, S.; Gouda, K. C.; Laxmikantha, B. P.; Bhat, N.

    2014-12-01

Urbanization is a form of metropolitan growth that is a response to often bewildering sets of economic, social, and political forces and to the physical geography of an area. Some of the causes of the sprawl include population growth, the economy, and patterns of infrastructure initiatives such as the construction of roads and the provision of infrastructure using public money, which encourage development. The direct implication of such urban sprawl is the change in land use and land cover of the region. In this study the long-term climate data from multiple sources such as NCEP reanalysis, IMD observations and various satellite-derived products from MAIRS, IMD, ERSL and TRMM are considered and analyzed using the developed algorithms for a better understanding of the variability in the climate parameters over Bangalore. These products are further mathematically analyzed to arrive at the desired results by extracting land surface temperature (LST), potential evapotranspiration (PET), rainfall, humidity etc. Various satellite products derived from NASA (National Aeronautics Space Agency), Indian meteorological satellites and global satellites are helpful in the large-scale study of urban issues at global and regional scale. Climate change analysis is well studied by using either single-source data, such as temperature or rainfall from the IMD (Indian Meteorological Department), or combined data products, as in the case of the MAIRS (Monsoon Asia Integrated Regional Scale) program, to obtain rainfall at regional scale. Finally all the above parameters are normalized and analyzed with the help of various available open source software packages for pre- and post-processing to obtain the desired results. A sample analysis, i.e. the inter-annual variability of the annually averaged temperature over Bangalore, is presented in figure 1, which clearly shows the rising trend of the temperature (0.06 °C/year). Also the land use and land cover (LULC) analysis over Bangalore, Day light hours from
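A trend figure like the one described (about 0.06 °C/year) is typically obtained by a least-squares line through the annual means. The sketch below does this with numpy on a synthetic series constructed to mimic that slope; the years, baseline and noise level are assumptions, not the study's data.

```python
import numpy as np

# Linear warming trend from annual-mean temperatures via np.polyfit
# (synthetic series mimicking the reported ~0.06 degC/year rise).
rng = np.random.default_rng(7)
years = np.arange(1975, 2015)
temps = 23.0 + 0.06 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

slope, intercept = np.polyfit(years, temps, 1)   # degree-1 fit
warming_per_decade = 10.0 * slope                # degC per decade
```

With 40 annual points and modest interannual noise, the fitted slope recovers the imposed trend to well within its standard error.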

  15. Analysis of a carbon dioxide transcritical power cycle using a low temperature source

    International Nuclear Information System (INIS)

    Cayer, Emmanuel; Galanis, Nicolas; Desilets, Martin; Nesreddine, Hakim; Roy, Philippe

    2009-01-01

A detailed analysis of a carbon dioxide transcritical power cycle using an industrial low-grade stream of process gases as its heat source is presented. The methodology is divided into four steps: energy analysis, exergy analysis, finite size thermodynamics and calculation of the heat exchangers' surface. The results have been calculated for fixed temperature and mass flow rate of the heat source, fixed maximum and minimum temperatures in the cycle and a fixed sink temperature, by varying the high pressure of the cycle and its net power output. The main results show the existence of an optimum high pressure for each of the four steps; in the first two steps, the optimum pressure maximises the thermal or exergetic efficiency, while in the last two steps it minimises the product UA or the heat exchangers' surface. These high pressures are very similar for the energy and exergy analyses. The last two steps also have nearly identical optimizing high pressures that are significantly lower than the ones for the first two steps. In addition, the results show that the augmentation of the net power output produced from the limited energy source has no influence on the results of the energy analysis, decreases the exergetic efficiency and increases the heat exchangers' surface. Changing the net power output has no significant impact on the high pressures optimizing each of the four steps.

  16. Empirical Study on Factors Influencing Residents' Behavior of Separating Household Wastes at Source

    Institute of Scientific and Technical Information of China (English)

    Qu Ying; Zhu Qinghua; Murray Haight

    2007-01-01

Source separation is the basic premise for making effective use of household wastes. In eight cities of China, however, several pilot projects of source separation finally failed because of the poor participation rate of residents. In order to solve this problem, identifying those factors that influence residents' behavior of source separation becomes crucial. By means of a questionnaire survey, we conducted descriptive analysis and exploratory factor analysis. The results show that trouble-feeling, moral notion, environment protection, public education, environment value and knowledge deficiency are the main factors that play an important role for residents in deciding to separate their household wastes. Also, according to the contribution percentage of the six main factors to the total behavior of source separation, their relative influence is analyzed, which will provide suggestions on household waste management for policy makers and decision makers in China.

  17. Analysis of source spectra, attenuation, and site effects from central and eastern United States earthquakes

    International Nuclear Information System (INIS)

    Lindley, G.

    1998-02-01

This report describes the results from three studies of source spectra, attenuation, and site effects of central and eastern United States earthquakes. In the first study, source parameter estimates taken from 27 previous studies were combined to test the assumption that the earthquake stress drop is roughly a constant, independent of earthquake size. 200 estimates of stress drop and seismic moment from eastern North American earthquakes were combined. It was found that the estimated stress drop from the 27 studies increases approximately as the square root of the seismic moment, from about 3 bars at 10²⁰ dyne-cm to 690 bars at 10²⁵ dyne-cm. These results do not support the assumption of a constant stress drop when estimating ground motion parameters from eastern North American earthquakes. In the second study, broadband seismograms recorded by the United States National Seismograph Network and cooperating stations have been analysed to determine Q_Lg as a function of frequency in five regions: the northeastern US, southeastern US, central US, northern Basin and Range, and California and western Nevada. In the third study, using spectral analysis, estimates have been made for the anelastic attenuation of four regional phases, and estimates have been made for the source parameters of 27 earthquakes, including the M_b 5.6, 14 April 1995, West Texas earthquake.
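The square-root scaling claimed in the first study can be checked directly from the two quoted anchor values: fitting log10(stress drop) against log10(seismic moment) through them gives an exponent close to 0.5. A short numpy verification:

```python
import numpy as np

# The reported trend (about 3 bar at 1e20 dyne-cm and 690 bar at
# 1e25 dyne-cm) implies stress drop ~ M0^n; solve for the exponent n
# in log-log space from the two anchor points.
m0 = np.array([1e20, 1e25])        # seismic moment, dyne-cm
drop = np.array([3.0, 690.0])      # stress drop, bar

n = (np.log10(drop[1]) - np.log10(drop[0])) / \
    (np.log10(m0[1]) - np.log10(m0[0]))
# n comes out near 0.47, i.e. approximately square-root scaling.
```

An exponent near 0.5 rather than 0 is exactly why the report concludes the constant-stress-drop assumption is not supported for eastern North American earthquakes.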

  18. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that can discern and characterize the different sources generating the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has some deficiencies. Indeed, PCA does not perform well on the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Uncorrelatedness is usually not a strong enough condition, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is another popular technique adopted to approach this problem, and it can be used in all fields where PCA is applied. An ICA approach enables us to explain the time series with fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
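
    The PCA-versus-ICA contrast described above can be illustrated on a toy blind-source-separation problem. This is a generic sketch (plain FastICA on two synthetic waveforms, not the vbICA method of the abstract, which additionally models each source pdf with a Gaussian mixture): two independent signals are linearly mixed, and ICA recovers them up to sign, scale, and order.

```python
import numpy as np
from scipy import signal
from sklearn.decomposition import FastICA, PCA

# Synthetic blind-source-separation demo: two independent sources, linearly mixed
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                      # smooth periodic source
s2 = signal.sawtooth(3 * t)             # non-Gaussian sawtooth source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.4, 1.0]])  # mixing matrix
X = S @ A.T                             # observed "station" time series

ica = FastICA(n_components=2, random_state=0)
S_ica = ica.fit_transform(X)            # recovered independent components
S_pca = PCA(n_components=2).fit_transform(X)  # uncorrelated, but generally still mixtures

# ICA recovers each source up to sign/scale/order; check via correlations
def best_match(true, recovered):
    return max(abs(np.corrcoef(true, recovered[:, j])[0, 1])
               for j in range(recovered.shape[1]))
```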

  19. Application of Abaqus to analysis of the temperature field in elements heated by moving heat sources

    Directory of Open Access Journals (Sweden)

    W. Piekarska

    2010-10-01

    Full Text Available Numerical analysis of thermal phenomena occurring during laser beam heating is presented in this paper. Numerical models of surface and volumetric heat sources were presented, and the influence of different laser beam heat source power distributions on the temperature field was analyzed. The temperature field was obtained by a numerical solution of the transient heat transfer equation with activity of inner heat sources, using the finite element method. Temperature distribution analysis in the welded joint was performed in the ABAQUS/Standard solver. The DFLUX subroutine was used for implementation of the movable welding heat source model. Temperature-dependent thermophysical properties for steel were assumed in the computer simulations. The temperature distribution in laser-beam surface-heated and butt-welded plates was numerically estimated.
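
    The idea of a moving heat source acting on a conducting body can be sketched far more simply than the paper's FEM/DFLUX model: the explicit finite-difference toy below solves 1-D transient conduction with a Gaussian source translating along the plate. All material values and source parameters are hypothetical illustration numbers, not those of the study.

```python
import numpy as np

# Minimal 1-D sketch of transient conduction with a moving Gaussian heat source
# (an illustrative stand-in for a DFLUX-style volumetric source; values hypothetical).
L, nx = 0.1, 201                 # plate length [m], grid points
dx = L / (nx - 1)
alpha = 1e-5                     # thermal diffusivity [m^2/s]
dt = 0.2 * dx**2 / alpha         # explicit-scheme stability: dt <= dx^2/(2*alpha)
v, sigma, q = 0.01, 0.002, 1e3   # source speed [m/s], radius [m], intensity [K/s]

x = np.linspace(0, L, nx)
T = np.full(nx, 20.0)            # initial temperature [degC]; ends held at 20 degC
t = 0.0
while t < 5.0:                   # march the source across part of the plate
    src = q * np.exp(-((x - v * t) ** 2) / (2 * sigma**2))
    T[1:-1] += dt * (alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2 + src[1:-1])
    t += dt

peak = T.max()                   # hottest point trails the current source position
```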

  20. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    Pericou-Cayere, M.; Lemaire, P.; Pace, J.-M.; Baude, S.; Samson, N.

    2015-01-01

    While looking for information in scientific databases, we are overwhelmed by the amount of information that we encounter. In this big data collection, extracting information with added value could be strategic for nuclear verification. In our study, we have worked on 'best practices' in collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). Analysis of this structured data is made with bibliometric tools. Several steps are carried out: collecting data related to the paradigm, creating a database to store data generated by bibliographic research, and analyzing data with selected tools. With analysis of bibliographic data only, we are able to obtain: a panoramic view of countries that publish in the paradigm, co-publication networks, organizations that contribute to scientific publications, countries with which a country collaborates, areas of interest of a country, etc. So we are able to identify a target. In a second phase, we can focus on a target (countries, for example). Working with non-structured data (i.e., press releases, social networks, full-text analysis of publications) is in progress and needs other tools to be added to the process, as we will discuss in this paper. In information analysis, methodology and expert analysis are what matter; software is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness of the use of open source S&T information and of the management of that information over time. Examples are shown. (author)

  1. On the autarchic use of solely PIXE data in particulate matter source apportionment studies by receptor modeling

    Energy Technology Data Exchange (ETDEWEB)

    Lucarelli, F. [Department of Physics and Astronomy, University of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); National Institute of Nuclear Physics (INFN)-Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Nava, S., E-mail: nava@fi.infn.it [National Institute of Nuclear Physics (INFN)-Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Calzolai, G. [Department of Physics and Astronomy, University of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Chiari, M. [National Institute of Nuclear Physics (INFN)-Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Giannoni, M.; Traversi, R.; Udisti, R. [Department of Chemistry, University of Florence, Via della Lastruccia 3, 50019 Sesto Fiorentino (Italy)

    2015-11-15

    Particle Induced X-ray Emission (PIXE) analysis of aerosol samples allows simultaneous detection of several elements, including important tracers of many particulate matter sources. This capability, together with the possibility of analyzing a large number of samples in very short times, makes PIXE a very effective tool for source apportionment studies by receptor modeling. However, important aerosol components, like nitrates, OC and EC, cannot be assessed by PIXE: this limitation may strongly compromise the results of a source apportionment study based on PIXE data alone. In this work, an experimental dataset characterised by an extended chemical speciation (elements, EC–OC, ions) is used to test the effect of reducing the input species in the application of one of the most widely used receptor models, namely Positive Matrix Factorization (PMF). The main effect of using only PIXE data is that the secondary nitrate source is not identified and the contribution of biomass burning is overestimated, probably due to the similar seasonal patterns of these two sources.
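
    The factorization behind PMF can be sketched with plain non-negative matrix factorization: a samples-by-species matrix built from known non-negative source profiles is decomposed back into contributions and profiles. This is a simplified stand-in (real PMF also weights each data point by its measurement uncertainty), and the two profiles below are hypothetical.

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy PMF-style factorization: synthetic "samples x species" matrix built from
# two non-negative source profiles (species values are hypothetical).
rng = np.random.default_rng(1)
profiles = np.array([[5.0, 1.0, 0.1, 0.0],     # e.g. a crustal-like profile
                     [0.2, 0.1, 3.0, 4.0]])    # e.g. a combustion-like profile
contrib = rng.uniform(0.5, 2.0, size=(50, 2))  # daily source contributions
X = contrib @ profiles + rng.uniform(0, 0.05, size=(50, 4))

model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)                     # recovered factor contributions
F = model.components_                          # recovered factor profiles
recon_error = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

Dropping a species column from `X` (the analogue of analyzing PIXE-only data) and refactorizing is a quick way to see how the recovered profiles degrade.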

  2. Red pepper (Capsicum annuum) carotenoids as a source of natural food colors: analysis and stability-a review.

    Science.gov (United States)

    Arimboor, Ranjith; Natarajan, Ramesh Babu; Menon, K Ramakrishna; Chandrasekhar, Lekshmi P; Moorkoth, Vidya

    2015-03-01

    Carotenoids are increasingly drawing the attention of researchers as a major natural food color, due to their inherent nutritional characteristics and their implicated possible role in the prevention of, and protection against, degenerative diseases. In this report, we review the role of red pepper as a source of natural carotenoids. The composition of the carotenoids in red pepper and the application of different methodologies for their analysis are discussed. The stability of red pepper carotenoids during post-harvest processing and storage is also reviewed. This review highlights the potential of red pepper carotenoids as a source of natural food colors, and also discusses the need for a standardized approach to the analysis and reporting of carotenoid composition in plant products and to designing model systems for stability studies.

  3. Review on solving the inverse problem in EEG source analysis

    Directory of Open Access Journals (Sweden)

    Fabri Simon G

    2008-11-01

    Full Text Available Abstract In this primer, we review the inverse problem of EEG source localization. It is intended to help researchers new to the field gain insight into the state-of-the-art techniques used to find approximate solutions for the brain sources giving rise to a scalp potential recording. Furthermore, a review of the performance results of the different techniques is provided to compare these inverse solutions. The authors also include the results of a Monte-Carlo analysis which they performed to compare four non-parametric algorithms, and hence contribute to what is presently recorded in the literature. An extensive list of references to the work of other researchers is also provided. This paper starts off with a mathematical description of the inverse problem and proceeds to discuss the two main categories of methods developed to solve the EEG inverse problem: non-parametric and parametric methods. The main difference between the two is whether a fixed number of dipoles is assumed a priori. Various techniques falling within these categories are described, including minimum norm estimates and their generalizations, LORETA, sLORETA, VARETA, S-MAP, ST-MAP, Backus-Gilbert, LAURA, Shrinking LORETA FOCUSS (SLF), SSLOFO and ALF for non-parametric methods, and beamforming techniques, BESA, subspace techniques such as MUSIC and methods derived from it, FINES, simulated annealing and computational intelligence algorithms for parametric methods. From a review of the performance of these techniques as documented in the literature, one may conclude that in most cases the LORETA solution gives satisfactory results. In situations involving clusters of dipoles, higher-resolution algorithms such as MUSIC or FINES are preferred. Imposing reliable biophysical and psychological constraints, as done by LAURA, has given superior results.
The Monte-Carlo analysis performed, comparing WMN, LORETA, sLORETA and SLF
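
    The simplest of the non-parametric methods listed above, the minimum norm estimate, can be sketched in a few lines: among all source vectors s consistent with the sensor data x = Ls, it picks the one of smallest norm via Tikhonov-regularized least squares. The lead-field matrix and dipole positions below are random hypothetical stand-ins, not a real head model.

```python
import numpy as np

# Sketch of a minimum norm estimate (MNE) for the EEG inverse problem.
rng = np.random.default_rng(2)
n_sensors, n_sources = 32, 200
L = rng.normal(size=(n_sensors, n_sources))   # hypothetical lead-field matrix

s_true = np.zeros(n_sources)
s_true[[40, 120]] = [1.0, -0.8]               # two active dipoles
x = L @ s_true + 0.01 * rng.normal(size=n_sensors)

lam = 1e-2                                    # Tikhonov regularization weight
# Minimum-norm solution: s_hat = L^T (L L^T + lam*I)^{-1} x
s_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), x)

resid = np.linalg.norm(L @ s_hat - x) / np.linalg.norm(x)
```

The data are fit almost exactly, but the recovered activity is smeared over many sources — the depth/resolution limitation that motivates the weighted and standardized variants (WMN, sLORETA) discussed in the primer.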

  4. Evaluating sources and processing of nonpoint source nitrate in a small suburban watershed in China

    Science.gov (United States)

    Han, Li; Huang, Minsheng; Ma, Minghai; Wei, Jinbao; Hu, Wei; Chouhan, Seema

    2018-04-01

    Identifying nonpoint sources of nitrate has been a long-term challenge in mixed land-use watersheds. In the present study, we combined dual nitrate isotopes with runoff and stream water monitoring to elucidate the nonpoint nitrate sources across land uses, and to determine the relative importance of biogeochemical processes for nitrate export in a small suburban watershed, Longhongjian watershed, China. Our study suggested that NH4+ fertilizer, soil NH4+, litter fall and groundwater were the main nitrate sources in Longhongjian Stream. There were large changes in nitrate sources in response to season and land use. Runoff analysis showed that the tea plantation and forest areas contributed a dominant proportion of the TN export. Spatial analysis showed that the NO3- concentration was high in the tea plantation and forest areas, and that δ15N-NO3 and δ18O-NO3 were enriched in the step ponds. Temporal analysis showed high NO3- levels in spring, with nitrate isotopes enriched in summer. The study also showed that the step ponds played an important role in mitigating nitrate pollution. Nitrification and plant uptake were the significant biogeochemical processes contributing to nitrogen transformation, while denitrification hardly occurred in the stream.

  5. Inter-comparison of receptor models for PM source apportionment: Case study in an industrial area

    Science.gov (United States)

    Viana, M.; Pandolfi, M.; Minguillón, M. C.; Querol, X.; Alastuey, A.; Monfort, E.; Celades, I.

    2008-05-01

    Receptor modelling techniques are used to identify and quantify the contributions from emission sources to the levels and the major and trace components of ambient particulate matter (PM). A wide variety of receptor models are currently available, and consequently the comparability between models should be evaluated if source apportionment data are to be used as input for health effects studies or mitigation plans. Three of the most widespread receptor models (principal component analysis, PCA; positive matrix factorization, PMF; chemical mass balance, CMB) were applied to a single PM10 data set (n=328 samples, 2002-2005) obtained from an industrial area in NE Spain dedicated to ceramic production. Sensitivity and temporal trend analyses (using the Mann-Kendall test) were applied. Results evidenced the good overall performance of the three models (r2>0.83 and slope α>0.91 between modelled and measured PM10 mass), with good agreement regarding source identification and high correlations between input (CMB) and output (PCA, PMF) source profiles. Larger differences were obtained regarding the quantification of source contributions (up to a factor of 4 in some cases). The combined application of different types of receptor models would overcome the limitations of each model by constructing a more robust solution based on their strengths. The authors suggest the combined use of factor analysis techniques (PCA, PMF) to identify and interpret emission sources and to obtain a first quantification of their contributions to the PM mass, followed by the application of CMB. Further research is needed to ensure that source apportionment methods are robust enough for application to PM health effects assessments.
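
    The Mann-Kendall temporal trend test mentioned above is simple enough to write out directly: it counts concordant minus discordant pairs in a series (the S statistic) and converts that to a Z score. The sketch below is the basic no-ties form on a made-up rising series.

```python
import math

def mann_kendall(xs):
    """Plain Mann-Kendall trend test: returns the S statistic and (no-ties) Z score."""
    n = len(xs)
    # S = (# of increasing pairs) - (# of decreasing pairs), over all i < j
    s = sum((xs[j] > xs[i]) - (xs[j] < xs[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S, ignoring ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

rising = [1.0, 1.4, 1.3, 2.0, 2.2, 2.1, 2.9, 3.3, 3.1, 3.8]  # hypothetical series
s_stat, z_score = mann_kendall(rising)  # |z| > 1.96 flags a trend at the ~5% level
```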

  6. Sealed source and device removal and consolidation feasibility study

    International Nuclear Information System (INIS)

    Ward, J.E.; Carter, J.G.; Meyers, R.L.

    1993-02-01

    The purpose of this study is to assess the feasibility of removing Greater-Than-Class C (GTCC) sealed sources from their containment devices and consolidating them for transport to a storage or disposal facility. A sealed source is a sealed capsule containing a radioactive material that is placed in a device providing radioactive containment. It is used in the medical, industrial, research, and food-processing communities for calibrating, measuring, gauging, controlling processes, and testing. This feasibility study addresses the key operational, safety, regulatory, and financial requirements of the removal/consolidation process. This report discusses the process to receive, handle, repackage, and ship these sources to an interim or dedicated storage facility until a final disposal repository can be built and become operational (∼ c. 2010). The study identifies the operational and facility requirements to perform this work. Hanford, other DOE facilities, and private hot-cell facilities were evaluated to determine which could perform this work. The personnel needed, design and engineering, facility preparation, process waste disposal requirements, and regulatory compliance were evaluated to determine the cost of this work. Costs for items that will have to meet future changing regulatory requirements for transportation, transportation container design and engineering, and disposal were not included in this study. The costs associated with in-process consolidation of the sealed sources reported in this study were not adjusted for inflation and are based on 1992 dollars. This study shows that sealed source consolidation is possible with minimal personnel exposure, and would reduce the risk of radioactive releases to the environment. An initial pilot-scale operation could evaluate possible methods to reduce the cost and consolidate sources

  7. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  8. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    Science.gov (United States)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth's surface and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the generation of landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives, in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good
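
    Step (1) of the workflow, deriving land surface parameters from a DTM grid, can be sketched in a few lines. The study itself works in R with SAGA/GRASS/TauDEM; the Python/numpy stand-in below computes two of the named derivatives (slope and a simple roughness measure) on a synthetic tilted surface with a bump.

```python
import numpy as np
from scipy.ndimage import generic_filter

# Synthetic DTM: a uniform 0.2 gradient in x plus a Gaussian bump (hypothetical terrain)
cell = 1.0                                   # cell size [m]
y, x = np.mgrid[0:50, 0:50].astype(float)
dtm = 0.2 * x + 2.0 * np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 40.0)

# Slope from first derivatives of elevation
dzdy, dzdx = np.gradient(dtm, cell)
slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Simple surface roughness: local standard deviation of elevation in a 3x3 window
roughness = generic_filter(dtm, np.std, size=3)
```

Away from the bump the computed slope reduces to arctan(0.2) ≈ 11.3°, a quick sanity check on the derivative step.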

  9. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

    The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Through this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because sensitivity needs for many applications are not severe, they can be accomplished using an ∼6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers, and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (approximately 2 x 10^7 n/cm^2·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. Because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice when its sensitivity is adequate than methods that require digestion, and (b) it eliminates uncertainties that can be introduced by digestion processes

  10. Validation Study for an Atmospheric Dispersion Model, Using Effective Source Heights Determined from Wind Tunnel Experiments in Nuclear Safety Analysis

    Directory of Open Access Journals (Sweden)

    Masamichi Oura

    2018-03-01

    Full Text Available For more than fifty years, atmospheric dispersion predictions based on the joint use of a Gaussian plume model and wind tunnel experiments have been applied in both Japan and the U.K. for the evaluation of public radiation exposure in nuclear safety analysis. The effective source height used in the Gaussian model is determined from ground-level concentration data obtained in a wind tunnel experiment using a scaled terrain and site model. In the present paper, the concentrations calculated by this method are compared with data observed over complex terrain in the field under a range of meteorological conditions. Good agreement was confirmed for near-neutral and unstable stability conditions. However, it was found necessary to reduce the effective source height by 50% in order to achieve a conservative estimate of the field observations in a stable atmosphere.
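
    Why halving the effective source height is conservative follows directly from the Gaussian plume formula: a lower release height raises the peak ground-level concentration. The sketch below evaluates the standard ground-level centerline expression C = Q/(π·u·σy·σz)·exp(-H²/(2σz²)); the dispersion coefficients are simple hypothetical power laws, not a specific stability-class parameterization.

```python
import numpy as np

Q, u = 1.0, 5.0                     # emission rate [g/s], wind speed [m/s]

def ground_conc(x, H):
    """Ground-level centerline concentration of a Gaussian plume at height H."""
    sy = 0.08 * x ** 0.9            # hypothetical sigma_y(x) [m]
    sz = 0.06 * x ** 0.85           # hypothetical sigma_z(x) [m]
    return (Q / (np.pi * u * sy * sz)) * np.exp(-H**2 / (2 * sz**2))

x = np.linspace(200, 5000, 200)     # downwind distance [m]
c_full = ground_conc(x, 100.0)      # effective source height H = 100 m
c_half = ground_conc(x, 50.0)       # height reduced by 50% for conservatism

# The lower effective height raises the predicted maximum ground-level concentration
ratio = c_half.max() / c_full.max()
```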

  11. SOURCES FOR THE STUDY OF THE PHILOSOPHY OF ARCHBISHOP NIKANOR (BROVKOVICH

    Directory of Open Access Journals (Sweden)

    Artem Solov'ev

    2013-04-01

    Full Text Available This article is an attempt to analyze and classify the sources for the study of the philosophy of Archbishop Nikanor (Brovkovich, 1826-1890). Archbishop Nikanor is little known as a philosopher, in spite of the fact that he was the author of one of the first Russian philosophical systems. This is unfortunately due to the difficulties involved with accessing the works of Nikanor, as well as to the fact that the texts themselves are very obscure and complicated. Thus it falls to the author of this article to provide an analysis of the content of the archive of Nikanor, to establish the dates of writing of certain of his more important philosophical texts, to reveal the existence of unpublished texts, to explore the meaning of Nikanor's theological works and teachings, as well as to reconstruct his biographical material. The article is based on a scrupulous analysis of all available archival material, as well as on a painstaking selection of published books from hard-to-access sources. The author of the article sets out to evaluate the philosophical output of Nikanor, comparing it with that of his contemporaries as well as with the work of later philosophers. He underscores the problems connected with research on Nikanor, especially those stemming from the lack of accessibility of his work. Special attention is given to the most important source for students of his philosophy: Nikanor's personal archive conserved in Odessa. The author concludes that it is without a doubt necessary to close the gap in Russian philosophy by restoring Nikanor, one of its brightest figures, to his rightful place in its history

  12. A statistical study of faint radio sources at 81.5 MHz

    International Nuclear Information System (INIS)

    Duffett-Smith, P.J.; Purvis, A.; Hewish, A.

    1980-01-01

    The method of interplanetary scintillations (IPS) together with the technique of background deflection analysis (P(D)) have been used to determine the mean angular size and the sky density of scintillating radio sources in the range 2 to 3 Jy at 81.5 MHz. It is found that the radio power from a high proportion of the sources in this range comes from one or two components of angular diameter about 0.7 arcsec. (author)

  13. Preliminary radiation transport analysis for the proposed National Spallation Neutron Source (NSNS)

    International Nuclear Information System (INIS)

    Johnson, J.O.; Lillie, R.A.

    1997-01-01

    The use of neutrons in science and industry has increased continuously during the past 50 years, with applications now widespread in physics, chemistry, biology, engineering, and medicine. Within this history, the relative merits of pulsed accelerator spallation sources versus reactors have been weighed, with spallation sources emerging as the preferred option for future neutron sources. To address this future need, the Department of Energy (DOE) has initiated a pre-conceptual design study for the National Spallation Neutron Source (NSNS) and given preliminary approval for the proposed facility to be built at Oak Ridge National Laboratory (ORNL). The DOE directive is to design and build a short-pulse spallation source in the 1 MW power range with sufficient design flexibility that it can be upgraded and operated at a significantly higher power at a later stage. The pre-conceptual design of the NSNS initially consists of an accelerator system capable of delivering a 1 to 2 GeV proton beam with 1 MW of beam power in an approximately 0.5 microsecond pulse at a 60 Hz frequency onto a single target station. The NSNS will be upgraded in stages to a 5 MW facility with two target stations (a high-power station operating at 60 Hz and a low-power station operating at 10 Hz). Each target station will contain four moderators (combinations of cryogenic and ambient temperature) and 18 beam lines, for a total of 36 experiment stations. This paper summarizes the radiation transport analysis strategies for the proposed NSNS facility

  14. Fiji: an open-source platform for biological-image analysis.

    Science.gov (United States)

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  15. Organic aerosol source apportionment in London 2013 with ME-2: exploring the solution space with annual and seasonal analysis

    Directory of Open Access Journals (Sweden)

    E. Reyes-Villegas

    2016-12-01

    Full Text Available The multilinear engine (ME-2) factorization tool is being widely used following the recent development of the Source Finder (SoFi) interface at the Paul Scherrer Institute. However, the success of this tool, when using the a-value approach, largely depends on the inputs (i.e. target profiles) applied, as well as on the experience of the user. A strategy to explore the solution space is proposed, in which the solution that best describes the organic aerosol (OA) sources is determined according to the systematic application of predefined statistical tests. This includes trilinear regression, which proves to be a useful tool for comparing different ME-2 solutions. Aerosol Chemical Speciation Monitor (ACSM) measurements were carried out at the urban background site of North Kensington, London from March to December 2013, where for the first time the behaviour of OA sources and their possible environmental implications were studied using an ACSM. Five OA sources were identified: biomass burning OA (BBOA), hydrocarbon-like OA (HOA), cooking OA (COA), semivolatile oxygenated OA (SVOOA) and low-volatility oxygenated OA (LVOOA). ME-2 analysis of the seasonal data sets (spring, summer and autumn) showed a higher variability in the OA sources that was not detected in the combined March–December data set; this variability was explored with the triangle plots f44:f43 and f44:f60, in which a high variation of SVOOA relative to LVOOA was observed in the f44:f43 analysis. Hence, it was possible to conclude that, when performing source apportionment on long-term measurements, important information may be lost, and such analysis should also be done over shorter periods, such as seasons. Further analysis of the atmospheric implications of these OA sources was carried out, identifying evidence of the possible contribution of heavy-duty diesel vehicles to air pollution during weekdays, compared with those fuelled by petrol.

  16. 2007 California Aerosol Study: Evaluation of δ15N as a Tracer of NOx Sources and Chemistry

    Science.gov (United States)

    Katzman, T. L.

    2017-12-01

    Although stable isotopes of N are commonly used as a source tracer, how this tracer is applied is a point of contention. The "source" hypothesis argues that the δ15N value of NO3- reflects the δ15N value of NOx source inputs into the environment, and that any observed variation is solely the result of differences in source contributions. Conversely, the "chemistry" hypothesis argues that N isotopes are influenced by chemical reactions, atmospheric or biologic processing, and post-depositional effects. Previous studies often apply the source hypothesis, writing off the chemistry hypothesis as "minor," but others have noted the impact chemistry has on δ15N values. Given the known complications, this work seeks to assess the use of stable isotopes as tracers, specifically the assumption that the δ15N value traces source alone, without significant influence from chemical reactions. If the "source" hypothesis is correct, source emission data, known source δ15N values, and isotope mass balance should be able to approximate measured δ15N-NO3 values and determine the δ15N value associated with wildfire-derived NOx, which is currently unknown. Significant deviations from observed values would support the significance of equilibrium and kinetic isotope effects associated with chemical reactions and processing in the atmosphere. Aerosols collected during 2007, emission data, and isotopic analysis were utilized to determine the utility of δ15N as a tracer of NOx sources. San Diego, California is a coastal urban area influenced by sea salt aerosols, anthropogenic combustion emissions, and seasonal wildfires. Wildfires also have a significant influence on local atmospheric chemistry, and 2007 was notable as one of the worst fire seasons on record in the San Diego region. Isotopic analysis of collected NO3- has suggested that source δ15N values are likely not conserved as NOx is oxidized into NO3-. Given known source contributions and known δ15N values
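
    The isotope mass balance invoked by the "source" hypothesis is a simple weighted mean, and it can be inverted to estimate a source fraction from a measured mixture. All δ15N values and fractions below are hypothetical illustration numbers, not measurements from the study.

```python
# Two-source isotope mass balance under the "source" hypothesis: the mixture's
# δ15N is the emission-weighted mean of the source values (numbers hypothetical).
sources = {"vehicle NOx": (-5.0, 0.7),       # (δ15N in per mil, emission fraction)
           "biomass burning": (2.0, 0.3)}

d15n_mix = sum(delta * frac for delta, frac in sources.values())

# Inverting the balance: given a measured mixture value d_meas, solve for the
# fraction f of the first source from  d_meas = f*d1 + (1 - f)*d2
d1, d2, d_meas = -5.0, 2.0, -3.0
f = (d_meas - d2) / (d1 - d2)
```

Deviations between such predicted mixture values and observed δ15N-NO3 are exactly the signal the abstract uses to argue for the "chemistry" hypothesis.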

  17. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    Science.gov (United States)

    1989-06-01

    M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical...Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required...Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  18. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Science.gov (United States)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  19. Surface-Source Downhole Seismic Analysis in R

    Science.gov (United States)

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners that may find it useful. I originally applied an early version of these routines to seismic cone penetration test data (SCPT) to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface-selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. 
The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
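    A minimal sketch of the two ingredients described above — vertical travel times through a layered slowness model, and a first-order straight-ray correction of slant times from an offset surface source — might look as follows (in Python rather than R, with invented layer values; the actual inversion in Boore (2003) is more involved):

```python
import math

def vertical_time(depth, thicknesses, slownesses):
    """Vertical travel time (s) to `depth` (m) through a stack of layers
    given layer thicknesses (m) and slownesses (s/m), straight vertical ray."""
    t, top = 0.0, 0.0
    for h, s in zip(thicknesses, slownesses):
        bottom = top + h
        t += s * (min(depth, bottom) - top)
        if depth <= bottom:
            return t
        top = bottom
    raise ValueError("depth extends below the layered model")

def slant_to_vertical(t_slant, depth, offset):
    """Approximate a slant travel time (surface source at `offset` m from
    the borehole) as an equivalent vertical time, assuming a straight ray.
    This is a common first-order correction, not necessarily the one
    implemented in the R routines."""
    return t_slant * depth / math.hypot(depth, offset)

# Hypothetical two-layer model: 10 m at 200 m/s over 20 m at 400 m/s.
t15 = vertical_time(15.0, [10.0, 20.0], [1 / 200.0, 1 / 400.0])
```

Fitting layer slownesses then amounts to minimizing the misfit between such predicted times and the (optionally weighted) picked arrivals.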

  20. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu eLei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI) is a useful technique to localize the generators from a given scalp electric measurement and to investigate the temporal dynamics of large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). Each modality's specific contribution is analyzed from the perspective of source reconstruction. For spatial priors, such as EEG-correlated fMRI, temporally coherent networks and resting-state fMRI are systematically introduced in the ESI. Moreover, fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as potential priors, which can help to draw inferences about neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  1. Zoomed MRI Guided by Combined EEG/MEG Source Analysis: A Multimodal Approach for Optimizing Presurgical Epilepsy Work-up and its Application in a Multi-focal Epilepsy Patient Case Study.

    Science.gov (United States)

    Aydin, Ü; Rampp, S; Wollbrink, A; Kugel, H; Cho, J -H; Knösche, T R; Grova, C; Wellmer, J; Wolters, C H

    2017-07-01

    In recent years, the use of source analysis based on electroencephalography (EEG) and magnetoencephalography (MEG) has gained considerable attention in presurgical epilepsy diagnosis. However, in many cases the source analysis alone is not used to tailor surgery unless the findings are confirmed by lesions, such as, e.g., cortical malformations in MRI. For many patients, the histology of tissue resected from MRI negative epilepsy shows small lesions, which indicates the need for more sensitive MR sequences. In this paper, we describe a technique to maximize the synergy between combined EEG/MEG (EMEG) source analysis and high resolution MRI. The procedure has three main steps: (1) construction of a detailed and calibrated finite element head model that considers the variation of individual skull conductivities and white matter anisotropy, (2) EMEG source analysis performed on averaged interictal epileptic discharges (IED), (3) high resolution (0.5 mm) zoomed MR imaging, limited to small areas centered at the EMEG source locations. The proposed new diagnosis procedure was then applied in a particularly challenging case of an epilepsy patient: EMEG analysis at the peak of the IED coincided with a right frontal focal cortical dysplasia (FCD), which had been detected at standard 1 mm resolution MRI. Of higher interest, zoomed MR imaging (applying parallel transmission, 'ZOOMit') guided by EMEG at the spike onset revealed a second, fairly subtle, FCD in the left fronto-central region. The evaluation revealed that this second FCD, which had not been detectable with standard 1 mm resolution, was the trigger of the seizures.

  2. Study on analysis from sources of error for Airborne LIDAR

    Science.gov (United States)

    Ren, H. C.; Yan, Q.; Liu, Z. J.; Zuo, Z. Q.; Xu, Q. Q.; Li, F. F.; Song, C.

    2016-11-01

    With the advancement of aerial photogrammetry, obtaining geo-spatial information of high spatial and temporal resolution provides a new technical means for Airborne LIDAR measurement, with unique advantages and broad application prospects. Airborne LIDAR is increasingly becoming a new kind of earth observation technology: mounted on an aviation platform, it receives laser pulses to obtain high-precision, high-density three-dimensional point cloud coordinates and intensity information. In this paper, we briefly describe airborne laser radar systems and analyze in detail the main sources of error in Airborne LIDAR data, and corresponding methods are put forward to avoid or eliminate them. Taking into account practical engineering applications, recommendations are developed for these designs, which has crucial theoretical and practical significance in the field of Airborne LIDAR data processing.

  3. Jet flow analysis of liquid poison injection in a CANDU reactor using source term

    International Nuclear Information System (INIS)

    Chae, Kyung Myung; Choi, Hang Bok; Rhee, Bo Wook

    2001-01-01

    For the performance analysis of the Canadian deuterium uranium (CANDU) reactor shutdown system number 2 (SDS2), a computational fluid dynamics model of poison jet flow has been developed to estimate the flow field and poison concentration formed inside the CANDU reactor calandria. As the ratio of calandria shell radius to injection nozzle hole diameter is very large (1055), it is impractical to develop a full-size model encompassing the whole calandria shell. In order to reduce the model to a manageable size, a quarter of a one-pitch-length segment of the shell was modeled using the symmetric nature of the jet, and the injected jet was treated as a source term to avoid the modeling difficulty caused by the big difference of the hole sizes. For the analysis of an actual CANDU-6 SDS2 poison injection, the grid structure was determined based on the results of two-dimensional real- and source-jet simulations. The maximum injection velocity of the liquid poison is 27.8 m/s and the mass fraction of the poison is 8000 ppm (mg/kg). The simulation results have shown a well-established jet flow field. In general, the jet develops narrowly at first but stretches rapidly. Then, the flow recirculates a little in the r-x plane, while it recirculates largely in the r-θ plane. As time goes on, the adjacent jets contact each other and form a wavy front such that the whole jet develops in a plate form. This study has shown that the source term model can be effectively used for the analysis of the poison injection and that the simulation result for the CANDU reactor is consistent with the model currently being used for the safety analysis. In the future, it is strongly recommended to analyze the transient (from helium tank to injection nozzle hole) of the poison injection by applying the Bernoulli equation with real boundary conditions

  4. Emotion impairs extrinsic source memory--An ERP study.

    Science.gov (United States)

    Mao, Xinrui; You, Yuqi; Li, Wen; Guo, Chunyan

    2015-09-01

    Substantial advancements in understanding emotional modulation of item memory notwithstanding, controversies remain as to how emotion influences source memory. Using an emotional extrinsic source memory paradigm combined with remember/know judgments and two key event-related potentials (ERPs) - the FN400 (a frontal potential at 300-500 ms related to familiarity) and the LPC (a later parietal potential at 500-700 ms related to recollection) - our research investigated the impact of emotion on extrinsic source memory and the underlying processes. We varied a semantic prompt (either "people" or "scene") preceding a study item to manipulate the extrinsic source. Behavioral data indicated a significant effect of emotion on "remember" responses to extrinsic source details, suggesting impaired recollection-based source memory in emotional (both positive and negative) relative to neutral conditions. In parallel, differential FN400 and LPC amplitudes (correctly remembered - incorrectly remembered sources) revealed emotion-related interference, suggesting impaired familiarity and recollection memory of extrinsic sources associated with positive or negative items. These findings thus lend support to the notion of an emotion-induced memory trade-off: while enhancing memory of central items and intrinsic/integral source details, emotion nevertheless disrupts memory of peripheral contextual details, potentially impairing both familiarity and recollection. Importantly, that positive and negative items result in comparable memory impairment suggests that arousal (vs. affective valence) plays a critical role in modulating dynamic interactions among automatic and elaborate processes involved in memory. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Sustainability in Open Source Software Commons: Lessons Learned from an Empirical Study of SourceForge Projects

    OpenAIRE

    Charles M. Schweik

    2013-01-01

    In this article, we summarize a five-year US National Science Foundation funded study designed to investigate the factors that lead some open source projects to ongoing collaborative success while many others become abandoned. Our primary interest was to conduct a study that was closely representative of the population of open source software projects in the world, rather than focus on the more-often studied, high-profile successful cases. After building a large database of projects (n=174,33...

  6. Nutrient patterns and their food sources in an International Study Setting: report from the EPIC study.

    Directory of Open Access Journals (Sweden)

    Aurelie Moskal

    Full Text Available Compared to food patterns, nutrient patterns have been rarely used, particularly at the international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied on 23 nutrients derived from country-specific FFQ combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs available for a representative sample of the EPIC populations (N = 34,436) provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified explaining 67% of the total variance: Principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micro-nutrients and proteins; PC3 was characterized by polyunsaturated fatty acids and vitamin D; PC4 was characterized by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research

  7. Nutrient patterns and their food sources in an International Study Setting: report from the EPIC study.

    Science.gov (United States)

    Moskal, Aurelie; Pisa, Pedro T; Ferrari, Pietro; Byrnes, Graham; Freisling, Heinz; Boutron-Ruault, Marie-Christine; Cadeau, Claire; Nailler, Laura; Wendt, Andrea; Kühn, Tilman; Boeing, Heiner; Buijsse, Brian; Tjønneland, Anne; Halkjær, Jytte; Dahm, Christina C; Chiuve, Stephanie E; Quirós, Jose R; Buckland, Genevieve; Molina-Montes, Esther; Amiano, Pilar; Huerta Castaño, José M; Gurrea, Aurelio Barricarte; Khaw, Kay-Tee; Lentjes, Marleen A; Key, Timothy J; Romaguera, Dora; Vergnaud, Anne-Claire; Trichopoulou, Antonia; Bamia, Christina; Orfanos, Philippos; Palli, Domenico; Pala, Valeria; Tumino, Rosario; Sacerdote, Carlotta; de Magistris, Maria Santucci; Bueno-de-Mesquita, H Bas; Ocké, Marga C; Beulens, Joline W J; Ericson, Ulrika; Drake, Isabel; Nilsson, Lena M; Winkvist, Anna; Weiderpass, Elisabete; Hjartåker, Anette; Riboli, Elio; Slimani, Nadia

    2014-01-01

    Compared to food patterns, nutrient patterns have been rarely used, particularly at the international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied on 23 nutrients derived from country-specific FFQ combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs available for a representative sample of the EPIC populations (N = 34,436) provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified explaining 67% of the total variance: Principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micro-nutrients and proteins; PC3 was characterized by polyunsaturated fatty acids and vitamin D; PC4 was characterized by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research opportunities and

  8. Modeling and analysis of a transcritical rankine power cycle with a low grade heat source

    DEFF Research Database (Denmark)

    Nguyen, Chan; Veje, Christian

    efficiency, exergetic efficiency and specific net power output. A generic cycle configuration has been used for analysis of a geothermal energy heat source. This model has been validated against similar calculations using industrial waste heat as the energy source. Calculations are done with fixed...

  9. Analysis of the source term in the Chernobyl-4 accident

    International Nuclear Information System (INIS)

    Alonso, A.; Lopez Montero, J.V.; Pinedo Garrido, P.

    1990-01-01

    The report presents the analysis of the Chernobyl accident and of the phenomena with major influence on the source term, including the chemical effects of materials dumped over the reactor, carried out by the Chair of Nuclear Technology at Madrid University under a contract with the CEC. It also includes a comparison of the (Cs-137/Cs-134) ratio between measurements performed by Soviet authorities and by countries belonging to the Community and OECD area. Chapter II contains a summary of both isotope measurements (Cs-134 and Cs-137), and their ratios, in samples of air, water, soil and agricultural and animal products collected by the Soviets in their report presented in Vienna (1986). Chapter III reports on the inventories of cesium isotopes in the core, while Chapter IV analyses the transient, especially the fuel temperature reached, as a way to deduce the mechanisms which took place in the cesium escape. The cesium source term is analyzed in Chapter V. Normal conditions have been considered, as well as the transient and the post-accidental period, including the effects of deposited materials. The conclusion of this study is that the Chernobyl accident sequence is specific to the RBMK type of reactors, and that in the Western world, basic research on fuel behaviour for reactivity transients has already been carried out

  10. Sourcing of internal auditing : An empirical study

    NARCIS (Netherlands)

    Speklé, R.F.; Elten, van H.J.; Kruis, A.

    2007-01-01

    This paper studies the factors associated with organizations’ internal audit sourcing decisions, building from a previous study by Widener and Selto (henceforth W&S) [Widener, S.K., Selto, F.H., 1999. Management control systems and boundaries of the firm: why do firms outsource internal audit

  11. Polarisation analysis of elastic neutron scattering using a filter spectrometer on a pulsed source

    International Nuclear Information System (INIS)

    Mayers, J.; Williams, W.G.

    1981-05-01

    The experimental and theoretical aspects of the polarisation analysis technique in elastic neutron scattering are described. An outline design is presented for a filter polarisation analysis spectrometer on the Rutherford Laboratory Spallation Neutron Source and estimates made of its expected count rates and resolution. (author)

  12. Identification of Watershed-scale Critical Source Areas Using Bayesian Maximum Entropy Spatiotemporal Analysis

    Science.gov (United States)

    Roostaee, M.; Deng, Z.

    2017-12-01

    The states' environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairments. Spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, due to limitations in monetary resources and a large number of waterbodies, available monitoring stations are typically sparse, with intermittent periods of data collection. Hence, scarcity of water quality data is a major obstacle in addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as Dissolved Oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been listed in the 303(d) list for DO impairment since 2014 in Louisiana Water Quality Inventory Reports due to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in available measured soft and hard data. This model would be used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers of the watershed were then used to determine those practices/changes that led to low DO concentration in identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main sources contributing to low dissolved oxygen in the Turkey Creek Watershed.

  13. Fiber Based Mid Infrared Supercontinuum Source for Spectroscopic Analysis in Food Production

    DEFF Research Database (Denmark)

    Ramsay, Jacob; Dupont, Sune Vestergaard Lund; Keiding, Søren Rud

    Optimization of sustainable food production is a worldwide challenge that is undergoing continuous development as new technologies emerge. Applying solutions for food analysis with novel bright and broad mid-infrared (MIR) light sources has the potential to meet the increasing demands for food...

  14. Market Orientation and Sources of Knowledge to Innovate in SMEs: A Firm Level Study

    Directory of Open Access Journals (Sweden)

    Simone Regina Didonet

    2016-10-01

    Full Text Available This work examines the relationship between the three market orientation (MO) components, i.e. customer orientation, competitor orientation and inter-functional coordination, and the extent to which small and medium-sized enterprises (SMEs) use different sources of knowledge to innovate. Based on a sample of 181 Chilean SMEs, a confirmatory factor analysis (CFA) was performed to analyze the relationship among constructs. The results show that the extent to which SMEs use different sources of knowledge to innovate depends on the interactions between MO components. This study addresses a gap in the literature by linking and interrelating market orientation components to the innovation perspective in SMEs. Therefore, we provide insights into the role of each MO component in influencing the extent to which firms seek out and use different sources of knowledge to innovate, and attempt to explain some literature inconsistencies on the theme.

  15. Review of SFR In-Vessel Radiological Source Term Studies

    International Nuclear Information System (INIS)

    Suk, Soo Dong; Lee, Yong Bum

    2008-10-01

    An effort has been made in this study to search for and review the literature in the public domain on studies of the phenomena related to the release of radionuclides and aerosols to the reactor containment of sodium fast reactor (SFR) plants (i.e., the in-vessel source term), conducted in Japan and Europe, including France, Germany and the UK, over the last few decades. The review focuses on the experimental programs to investigate the phenomena related to determining the source terms, with a brief review of supporting analytical models and computer programs. In this report, the research programs conducted to investigate the CDA (core disruptive accident) bubble behavior in the sodium pool for determining the 'primary' or 'instantaneous' source term are first introduced. The studies performed to determine the 'delayed source term' are then described, including the various stages of phenomena and processes: fission product (FP) release from fuel, evaporation release from the surface of the pool, iodine mass transfer from fission gas bubbles, FP deposition, and aerosol release from core-concrete interaction. The research programs to investigate the release and transport of FPs and aerosols in the reactor containment (i.e., the in-containment source term) are not described in this report

  16. Failure analysis of radioisotopic heat source capsules tested under multi-axial conditions

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Stacy, E.; Burgan, C.E.

    In order to qualify small radioisotopic heat sources for a 25-yr design life, multi-axial mechanical tests were performed on the structural components of the heat source. The results of these tests indicated that failure predominantly occurred in the middle of the weld ramp-down zone. Examination of the failure zone by standard metallographic techniques failed to indicate the true cause of failure. A modified technique utilizing chemical etching, scanning electron microscopy, and energy dispersive x-ray analysis was employed and dramatically indicated the true cause of failure, impurity concentration in the ramp-down zone. As a result of the initial investigation, weld parameters for the heat sources were altered. Example welds made with a pulse arc technique did not have this impurity buildup in the ramp-down zone

  17. Identifying sources of emerging organic contaminants in a mixed use watershed using principal components analysis.

    Science.gov (United States)

    Karpuzcu, M Ekrem; Fairbairn, David; Arnold, William A; Barber, Brian L; Kaufenberg, Elizabeth; Koskinen, William C; Novak, Paige J; Rice, Pamela J; Swackhamer, Deborah L

    2014-01-01

    Principal components analysis (PCA) was used to identify sources of emerging organic contaminants in the Zumbro River watershed in Southeastern Minnesota. Two main principal components (PCs) were identified, which together explained more than 50% of the variance in the data. Principal Component 1 (PC1) was attributed to urban wastewater-derived sources, including municipal wastewater and residential septic tank effluents, while Principal Component 2 (PC2) was attributed to agricultural sources. The variances of the concentrations of cotinine, DEET and the prescription drugs carbamazepine, erythromycin and sulfamethoxazole were best explained by PC1, while the variances of the concentrations of the agricultural pesticides atrazine, metolachlor and acetochlor were best explained by PC2. Mixed use compounds carbaryl, iprodione and daidzein did not specifically group with either PC1 or PC2. Furthermore, despite the fact that caffeine and acetaminophen have been historically associated with human use, they could not be attributed to a single dominant land use category (e.g., urban/residential or agricultural). Contributions from septic systems did not clarify the source for these two compounds, suggesting that additional sources, such as runoff from biosolid-amended soils, may exist. Based on these results, PCA may be a useful way to broadly categorize the sources of new and previously uncharacterized emerging contaminants or may help to clarify transport pathways in a given area. Acetaminophen and caffeine were not ideal markers for urban/residential contamination sources in the study area and may need to be reconsidered as such in other areas as well.
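    The PCA-based source attribution described above can be illustrated on synthetic data: two latent "sources" (standing in for the urban-wastewater and agricultural factors) drive six analyte concentrations, and the leading two components recover most of the variance. All data here are simulated, not from the watershed study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic (illustrative) data: two latent factors driving six analytes,
# loosely mimicking an urban-wastewater factor and an agricultural factor.
urban = rng.normal(size=n)
agri = rng.normal(size=n)
noise = 0.3 * rng.normal(size=(n, 6))
X = np.column_stack([
    urban, urban, urban,  # e.g. carbamazepine, cotinine, DEET (urban)
    agri, agri, agri,     # e.g. atrazine, metolachlor, acetochlor (agricultural)
]) + noise

# Standardize each analyte, then PCA via SVD of the centered/scaled matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)  # fraction of variance per component
loadings = Vt                    # rows are principal components
```

Inspecting which analytes load heavily on which component is then the basis for labeling a PC as "urban wastewater-derived" or "agricultural", as in the abstract.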

  18. Design and analysis of nuclear battery driven by the external neutron source

    International Nuclear Information System (INIS)

    Wang, Sanbing; He, Chaohui

    2014-01-01

    Highlights: • A new type of space nuclear power called NBDEx is investigated. • NBDEx with 252 Cf has better performance than RTG with similar structure. • Its thermal power gets great improvement with increment of fuel enrichment. • The service life of NBDEx is about 2.96 year. • The launch abortion accident analysis fully demonstrates the advantage of NBDEx. - Abstract: Based on the theory of ADS (Accelerator Driven Subcritical reactor), a new type of nuclear battery was investigated, which was composed of a subcritical fission module and an isotope neutron source, called NBDEx (Nuclear Battery Driven by External neutron source). According to the structure of GPHS-RTG (General Purpose Heat Source Radioisotope Thermoelectric Generator), the fuel cell model and fuel assembly model of NBDEx were set up, and then their performances were analyzed with MCNP code. From these results, it was found that the power and power density of NBDEx were almost six times higher than the RTG’s. For fully demonstrating the advantage of NBDEx, the analysis of its impact factors was performed with MCNP code, and its lifetime was also calculated using the Origen code. These results verified that NBDEx was more suitable for the space missions than RTG

  19. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    Directory of Open Access Journals (Sweden)

    Dong-Sup Lee

    2015-01-01

    Full Text Available Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from the received signals alone. This is accomplished by finding statistical independence of signal mixtures, and has been successfully applied to myriad fields such as medical science, image processing, and numerous others. Nevertheless, there are inherent problems that have been reported when using this technique: instability and invalid ordering of separated signals, particularly when using a conventional ICA technique in vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on the conventional ICA has been proposed to mitigate these problems. The proposed method to extract more stable source signals having valid order includes an iterative and reordering process of the extracted mixing matrix to reconstruct finally converged source signals, referring to the magnitudes of correlation coefficients between the intermediately separated signals and the signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to real problems of complex structures, an experiment has been carried out with a scaled submarine mockup. The results show that the proposed method can resolve the inherent problems of a conventional ICA technique.
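    A compact sketch of the approach outlined above — ICA separation followed by reordering against signals measured near the sources — assuming a symmetric FastICA with a tanh nonlinearity (a standard choice, not necessarily the authors' exact algorithm):

```python
import numpy as np

def whiten(X):
    """Center and whiten the (n_signals, n_samples) mixture matrix."""
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / Xc.shape[1]
    d, E = np.linalg.eigh(cov)
    return E @ np.diag(d ** -0.5) @ E.T @ Xc

def fastica(X, n_iter=300, tol=1e-10):
    """Symmetric FastICA with tanh nonlinearity (minimal sketch)."""
    Z = whiten(X)
    n = Z.shape[0]
    rng = np.random.default_rng(1)
    W = rng.normal(size=(n, n))
    for _ in range(n_iter):
        WX = W @ Z
        g, gp = np.tanh(WX), 1 - np.tanh(WX) ** 2
        W_new = (g @ Z.T) / Z.shape[1] - np.diag(gp.mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W^T)^(-1/2) W
        d, E = np.linalg.eigh(W_new @ W_new.T)
        W_new = E @ np.diag(d ** -0.5) @ E.T @ W_new
        converged = np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1)) < tol
        W = W_new
        if converged:
            break
    return W @ Z

def reorder_by_reference(S_est, refs):
    """Reorder and sign-flip separated signals by correlation with
    reference signals measured on or near the sources, echoing the
    reordering idea in the abstract."""
    out = []
    for r in refs:
        c = [np.corrcoef(r, s)[0, 1] for s in S_est]
        i = int(np.argmax(np.abs(c)))
        out.append(np.sign(c[i]) * S_est[i])
    return np.vstack(out)
```

Because ICA recovers sources only up to permutation and sign, the final reordering step is what restores a physically meaningful ordering.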

  20. Air Pollution in Shanghai Studied by Nuclear Analysis Techniques

    International Nuclear Information System (INIS)

    Zhang, G.; Tan, M.; Chen, J.; Jin, C.; Lin, J.; Li, X.; Li, Y.

    2009-01-01

    In this paper PIXE, μ-PIXE, XAFS, Moessbauer effect and radioisotope labelling methods are briefly introduced. These methods were used to study the pollution of atmospheric particulate matter (PM) in Shanghai. The speciation of Cr, Mn, Cu, and Zn in PM10 and PM2.5, and the characteristics distinguishing vehicle-exhaust particles from those of other emission sources, were studied. Source apportionment of atmospheric lead was calculated with a combined method of lead isotope ratios and lead mass balance, along with μ-PIXE analysis of single particles and pattern recognition of the spectra. Fabricated ultrafine particles simulating aerosol particles were used to study translocation from the alveolus into the circulation across the air-blood barrier.

  1. Physical activity and social support in adolescents: analysis of different types and sources of social support.

    Science.gov (United States)

    Mendonça, Gerfeson; Júnior, José Cazuza de Farias

    2015-01-01

    Little is known about the influence of different types and sources of social support on physical activity in adolescents. The aim of this study was to analyse the association between physical activity and different types and sources of social support in adolescents. The sample consisted of 2,859 adolescents between 14-19 years of age in the city of João Pessoa, in Northeastern Brazil. Physical activity was measured with a questionnaire, and social support from parents and friends with a 10-item scale (five items for each group; types of support: encouragement, joint participation, watching, inviting, positive comments and transportation). Multivariable analysis showed that the types of support provided by parents associated with physical activity in adolescents were encouragement for females (P genders (males: P = 0.009; females: P physical activity varies according to its source, as well as the gender and age of the adolescents.

  2. Food Sources of Sodium Intake in an Adult Mexican Population: A Sub-Analysis of the SALMEX Study

    Science.gov (United States)

    Colin-Ramirez, Eloisa; Miranda-Alatriste, Paola Vanessa; Tovar-Villegas, Verónica Ivette; Arcand, JoAnne; Correa-Rotter, Ricardo

    2017-01-01

    Excessive dietary sodium intake increases blood pressure and cardiovascular risk. In Western diets, the majority of dietary sodium comes from packaged and prepared foods (≈75%); in Mexico, however, there are no available data on the main food sources of dietary sodium. The main objective of this study was to identify and characterize the major food sources of dietary sodium in a sample of the Salt and Mexico (SALMEX) cohort. Adult male and female participants of the SALMEX study who provided a complete and valid three-day food record during the baseline visit were included. Overall, 950 participants (mean age 38.6 ± 10.7 years) were analyzed to determine the total sodium contributed by the main food sources identified. In the overall population, mean daily sodium intake estimated from three-day food records and from 24-h urinary sodium excretion was 2647.2 ± 976.9 mg/day and 3497.2 ± 1393.0 mg/day, respectively. Processed meat was the main single contributor to daily sodium intake, representing 8% of total sodium intake per capita as measured by three-day food records. When savory bread (8%) and sweet bakery goods (8%) were considered together as bread products, they became the largest contributor, accounting for 16% of total sodium intake, followed by processed meat (8%), natural cheeses (5%), and tacos (5%). These results highlight the need for public health policies focused on reducing the sodium content of processed food in Mexico. PMID:28749449

  3. Source rock contributions to the Lower Cretaceous heavy oil accumulations in Alberta: a basin modeling study

    Science.gov (United States)

    Berbesi, Luiyin Alejandro; di Primio, Rolando; Anka, Zahie; Horsfield, Brian; Higley, Debra K.

    2012-01-01

    The origin of the immense oil sand deposits in Lower Cretaceous reservoirs of the Western Canada sedimentary basin is still a matter of debate, specifically with respect to the original in-place volumes and contributing source rocks. In this study, the contributions from the main source rocks were addressed using a three-dimensional petroleum system model calibrated to well data. A sensitivity analysis of source rock definition was performed in the case of the two main contributors, which are the Lower Jurassic Gordondale Member of the Fernie Group and the Upper Devonian–Lower Mississippian Exshaw Formation. This sensitivity analysis included variations of assigned total organic carbon and hydrogen index for both source intervals, and in the case of the Exshaw Formation, variations of thickness in areas beneath the Rocky Mountains were also considered. All of the modeled source rocks reached the early or main oil generation stages by 60 Ma, before the onset of the Laramide orogeny. Reconstructed oil accumulations were initially modest because of limited trapping efficiency. This was improved by defining lateral stratigraphic seals within the carrier system. An additional sealing effect by biodegraded oil may have hindered the migration of petroleum in the northern areas, but not to the east of Athabasca. In the latter case, the main trapping controls are dominantly stratigraphic and structural. Our model, based on available data, identifies the Gordondale source rock as the contributor of more than 54% of the oil in the Athabasca and Peace River accumulations, followed by minor amounts from Exshaw (15%) and other Devonian to Lower Jurassic source rocks. The proposed strong contribution of petroleum from the Exshaw Formation source rock to the Athabasca oil sands is only reproduced by assuming 25 m (82 ft) of mature Exshaw in the kitchen areas, with original total organic carbon of 9% or more.

  4. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples, as well as a software analysis method. AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  5. Vibration analysis of the photon shutter designed for the advanced photon source

    International Nuclear Information System (INIS)

    Wang, Z.; Shu, D.; Kuzay, T.M.

    1992-01-01

    The photon shutter is a critical component of the beamline front end for the 7 GeV Advanced Photon Source (APS) project, now under construction at Argonne National Laboratory (ANL). The shutter is designed to close in tens of milliseconds to absorb up to 10 kW heat load (with high heat flux). Our shutter design uses innovative enhanced heat transfer tubes to withstand the high heat load. Although the shutter is designed to be lightweight and compact, its very fast movement gives rise to concern regarding vibration and dynamic sensitivity. To guarantee long-term functionality and reliability of the shutter, its dynamic behavior should be fully studied. In this paper, the natural frequency and transient dynamic analyses for the shutter during operation are presented. Through analysis of the vibration characteristics, as well as stress and deformation, several design options were developed and compared, including the selection of materials for the shutter and structural details.
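
For a lumped-parameter idealization, the natural-frequency part of such an analysis reduces to a small eigenvalue problem. A sketch with a hypothetical two-mass model (the masses and stiffnesses below are invented, not APS shutter values):

```python
import numpy as np

# hypothetical 2-DOF lumped model: M x'' + K x = 0  ->  eigenvalues of M^-1 K
M = np.diag([2.0, 1.0])                  # lumped masses, kg (invented)
K = np.array([[3.0e4, -1.0e4],
              [-1.0e4,  1.0e4]])         # stiffness matrix, N/m (invented)

lam = np.linalg.eigvals(np.linalg.solve(M, K))   # squared angular frequencies
freqs_hz = np.sort(np.sqrt(lam.real)) / (2.0 * np.pi)
# freqs_hz holds the model's natural frequencies; transient excitation near
# either value is what a dynamic design review has to guard against
```

A finite-element model of the real shutter leads to the same eigenvalue problem, just with much larger M and K matrices.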

  6. CURRENT STUDY ON THE FUNDING SOURCES COVERAGE OF CURRENT ASSETS TO COMPANIES LISTED ON THE BUCHAREST STOCK EXCHANGE

    Directory of Open Access Journals (Sweden)

    Teodor HADA

    2014-06-01

    Full Text Available This paper examines the coverage of current assets with financing sources for 64 companies listed on the Bucharest Stock Exchange. The aim of the study is to show how to calculate indicators specific to current assets and to provide a general framework for analysing the financing sources of current assets. The introduction presents the objective, the research methodology and the novelties brought by this study. The study then surveys the various views of authors on the concept of "current assets", the financing sources of current assets, the calculation of net working capital, the setting of limits for normal working capital, and the determination of the speed of rotation. On this theoretical basis, a case study was performed for the companies covered. The conclusions summarize the final findings detailed in the previous paragraphs.
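
The core indicators the study calculates can be stated as textbook formulas. This is a generic sketch; the 360-day year and the sample figures are assumptions, not data from the 64 companies:

```python
def net_working_capital(current_assets, current_liabilities):
    """Net working capital: the part of current assets financed long term."""
    return current_assets - current_liabilities

def rotation_speed_days(avg_current_assets, turnover, period_days=360):
    """Speed of rotation in days: how long current assets take to turn
    over once through the period's sales."""
    return avg_current_assets * period_days / turnover

# illustrative figures in thousand RON (invented)
nwc = net_working_capital(current_assets=500.0, current_liabilities=300.0)
days = rotation_speed_days(avg_current_assets=100.0, turnover=1200.0)
```

A shorter rotation period means current assets cycle through sales faster, reducing the financing the company must secure for them.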

  7. Source inversion in the full-wave tomography; Full wave tomography ni okeru source inversion

    Energy Technology Data Exchange (ETDEWEB)

    Tsuchiya, T [DIA Consultants Co. Ltd., Tokyo (Japan)

    1997-10-22

    In order to account for the characteristics of the vibration source in full-wave tomography (FWT), a method has been studied to invert vibration source parameters together with the V(p)/V(s) distribution. The study expanded an analysis method based on the gradient method of Tarantola and the subspace method of Sambridge, and conducted numerical experiments. Experiment No. 1 inverted only the vibration source parameters, and experiment No. 2 performed simultaneous inversion of the V(p)/V(s) distribution and the vibration source parameters. The discussion revealed that an effective analytical procedure would be as follows: first, to predict maximum stress, the average vibration source parameters and the property parameters are inverted simultaneously; next, to estimate each vibration source parameter with high accuracy, the property parameters are fixed and each vibration source parameter is inverted individually; finally, the derived vibration source parameters are fixed and the property parameters are inverted again from the initial values. 5 figs., 2 tabs.

  8. Dental students' perceived sources of stress: a multi-country study.

    Science.gov (United States)

    Polychronopoulou, Argy; Divaris, Kimon

    2009-05-01

    The aim of this study was to identify dental students' self-reported sources of stress and to explore the role of specific curricular and institutional differences in the variation of perceived stressors among dental students in Greece, Ireland, Slovenia, Sweden, Spain, and Croatia. A thirty-item modified version of the Dental Environment Stress (DES) questionnaire was administered to all undergraduate students enrolled at six European dental schools selected to reflect geographical, curricular, and professional environment diversity: Athens, Greece; Dublin, Ireland; Ljubljana, Slovenia; Malmö, Sweden; Santiago de Compostela, Spain; and Zagreb, Croatia. Participation varied from 93 percent in Athens to 65 percent in Dublin. A total of 1,492 questionnaires were available for analysis. Univariate analysis and multivariate modelling were used for data analysis. Performance pressure, workload, and self-efficacy beliefs constituted the students' main concerns. In the univariate analysis, student responses differed by country: Swedish students provided the lowest scores in five out of six DES factors, Spanish students were the most concerned about "clinical training" and "performance pressure," whereas Greek students were the most concerned about "patient treatment." Multivariate modelling revealed that problem-based learning (PBL) was inversely associated with perceived stress for "self-efficacy beliefs" OR (95% CI): 0.66 (0.52, 0.84), "workload" OR (95% CI): 0.58 (0.41, 0.80); and "clinical training" OR (95% CI): 0.69 (0.50, 0.95) when compared to traditional curricula. Students' perceived stressors differed greatly among the six institutions and were associated with both individual (gender, study level) and educational/institutional (curriculum type, class size, educational costs) parameters.

  9. Sources of Safety Data and Statistical Strategies for Design and Analysis: Postmarket Surveillance.

    Science.gov (United States)

    Izem, Rima; Sanchez-Kam, Matilde; Ma, Haijun; Zink, Richard; Zhao, Yueqin

    2018-03-01

    Safety data are continuously evaluated throughout the life cycle of a medical product to accurately assess and characterize the risks associated with the product. The knowledge about a medical product's safety profile continually evolves as safety data accumulate. This paper discusses data sources and analysis considerations for safety signal detection after a medical product is approved for marketing. This manuscript is the second in a series of papers from the American Statistical Association Biopharmaceutical Section Safety Working Group. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from passive postmarketing surveillance systems compared to other sources. Signal detection has traditionally relied on spontaneous reporting databases that have been available worldwide for decades. However, current regulatory guidelines and ease of reporting have increased the size of these databases exponentially over the last few years. With such large databases, data-mining tools using disproportionality analysis and helpful graphics are often used to detect potential signals. Although the data sources have many limitations, analyses of these data have been successful at identifying safety signals postmarketing. Experience analyzing these dynamic data is useful in understanding the potential and limitations of analyses with new data sources such as social media, claims, or electronic medical records data.
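
As a concrete example of the disproportionality analysis mentioned above, the proportional reporting ratio (PRR) and its approximate 95% confidence interval can be computed from a 2x2 table of spontaneous reports. The counts below are invented for illustration:

```python
import math

def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 table of spontaneous reports:
        a: target drug & target event     b: target drug & other events
        c: other drugs & target event     d: other drugs & other events
    Returns (PRR, lower, upper) with a 95% CI computed on the log scale."""
    p1 = a / (a + b)          # event proportion among the drug's reports
    p2 = c / (c + d)          # event proportion among all other reports
    prr_val = p1 / p2
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lower = prr_val * math.exp(-1.96 * se)
    upper = prr_val * math.exp(1.96 * se)
    return prr_val, lower, upper

# hypothetical counts: 20 reports of the event with the drug, 100 elsewhere
val, lo, hi = prr(a=20, b=980, c=100, d=98900)
```

One common screening rule flags a potential signal when the PRR is at least 2 with a handful of cases, although thresholds vary across pharmacovigilance programmes.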

  10. Off-design performance analysis of organic Rankine cycle using real operation data from a heat source plant

    International Nuclear Information System (INIS)

    Kim, In Seop; Kim, Tong Seop; Lee, Jong Jun

    2017-01-01

    Highlights: • ORC systems driven by waste or residual heat from a combined cycle cogeneration plant were analyzed. • An off-design analysis model was developed and validated with commercial ORC data. • A procedure to predict the actual variation of ORC performance using the off-design model was set up. • The importance of using long-term operation data of the heat source plant was demonstrated. - Abstract: There has been increasing demand for cogeneration power plants, which provide high energy utilization. Research on upgrading power plant performance is also being actively pursued. The organic Rankine cycle (ORC) can operate with mid- and low-temperature heat sources and is suitable for enhancing the performance of existing power plants. In this study, an off-design analysis model was developed for an ORC driven by waste heat or residual heat from a combined cycle cogeneration plant. The applied heat sources are the exhaust gas from the heat recovery steam generator (Case 1) and waste heat from a heat storage unit (Case 2). Optimal design points of the ORC were selected based on the design heat source condition of each case. Then, the available ORC power output for each case was predicted using actual long-term plant operation data and a validated off-design analysis model. The ORC capacity of Case 2 was almost twice that of Case 1. The predicted average electricity generation of both cases was less than the design output. The results of this paper reveal the importance of both the prediction of electricity generation using actual plant operation data and the need for optimal ORC system sizing.

  11. Earthquake Source Spectral Study beyond the Omega-Square Model

    Science.gov (United States)

    Uchide, T.; Imanishi, K.

    2017-12-01

    Earthquake source spectra have been used to characterize earthquake source processes quantitatively and, at the same time, simply, so that we can analyze the source spectra of many earthquakes, especially small earthquakes, at once and compare them with each other. A standard model for source spectra is the omega-square model, which has a flat spectrum at low frequencies and a falloff inversely proportional to the square of frequency at high frequencies, the two regimes bordered by a corner frequency. The corner frequency has often been converted to the stress drop under the assumption of circular crack models. However, recent studies claimed the existence of another corner frequency [Denolle and Shearer, 2016; Uchide and Imanishi, 2016] thanks to the recent development of seismic networks. We have found that many earthquakes in areas other than the area studied by Uchide and Imanishi [2016] also have source spectra deviating from the omega-square model. Another part of the earthquake spectra we now focus on is the falloff rate at high frequencies, which will affect the seismic energy estimation [e.g., Hirano and Yagi, 2017]. In June 2016, we deployed seven velocity seismometers in the northern Ibaraki prefecture, where shallow crustal seismicity, mainly normal-faulting events, was activated by the 2011 Tohoku-oki earthquake. We have recorded seismograms at 1000 samples per second and at short distances from the sources, so that we can investigate the high-frequency components of the earthquake source spectra. Although we are still in the stage of discovery and confirmation of the deviation from the standard omega-square model, updating the earthquake source spectrum model will help us systematically extract more information on the earthquake source process.
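
The omega-square spectral shape, and a corner-frequency estimate by grid search over a synthetic spectrum, can be sketched as follows. This is a simplified illustration; real analyses must also correct for attenuation, site effects and noise:

```python
import numpy as np

def omega_square(f, omega0, fc):
    """Omega-square source model: flat below the corner frequency fc,
    falling off as f**-2 above it."""
    return omega0 / (1.0 + (f / fc) ** 2)

# synthetic displacement spectrum from a "true" source (omega0=1, fc=5 Hz)
f = np.logspace(-1, 2, 200)
obs = omega_square(f, omega0=1.0, fc=5.0)

# grid-search the corner frequency by log-spectral misfit
fcs = np.linspace(0.5, 20.0, 500)
misfit = [np.sum((np.log(obs) - np.log(omega_square(f, 1.0, fc))) ** 2)
          for fc in fcs]
best_fc = fcs[int(np.argmin(misfit))]
```

Testing an extra corner frequency, as the studies cited above do, amounts to fitting a model with two corners and asking whether the misfit improvement is significant.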

  12. Car indoor air pollution - analysis of potential sources

    Directory of Open Access Journals (Sweden)

    Müller Daniel

    2011-12-01

    Full Text Available Abstract The population of industrialized countries such as the United States or countries of the European Union spends more than one hour each day in vehicles. Numerous studies have so far addressed outdoor air pollution that arises from traffic. By contrast, only little is known about indoor air quality in vehicles and the influence of non-vehicle sources. The present article therefore aims to summarize recent studies that address, e.g., particulate matter exposure. Although a large amount of data is available for outdoor air pollution, research in the area of indoor air quality in vehicles is still limited; in particular, knowledge on non-vehicular sources is missing. In this respect, an understanding of the effects and interactions of, e.g., tobacco smoke under realistic automobile conditions should be achieved in the future.

  13. Analysis of rod drop and pulsed source measurements of reactivity in the Winfrith SGHWR

    International Nuclear Information System (INIS)

    Brittain, I.

    1970-05-01

    Reactivity measurements by the rod-drop and pulsed source methods in the Winfrith SGHWR are seriously affected by spatial harmonics. A method of calculation is described which enables the spatial harmonics to be calculated in non-uniform cores in two or three dimensions, and thus allows a much more rigorous analysis of the experimental results than the usual point model. The method is used to analyse all the rod-drop measurements made during commissioning of the Winfrith SGHWR, and to comment on the results of pulsed source measurements. The reactivity worths of banks of ten and twelve shut-down tubes deduced from rod-drop and pulsed source experiments are in satisfactory agreement with each other and also with AIMAZ calculated values. The ability to calculate higher spatial harmonics in nonuniform cores is thought to be new, and may have a wider application to reactor kinetics through the method of Modal Analysis. (author)

  14. Irradiation Pattern Analysis for Designing Light Sources-Based on Light Emitting Diodes

    International Nuclear Information System (INIS)

    Rojas, E.; Stolik, S.; La Rosa, J. de; Valor, A.

    2016-01-01

    Nowadays it is possible to design light sources with a specific irradiation pattern for many applications. As a result of their rapid development, Light Emitting Diodes present features such as high luminous efficiency, durability, reliability and flexibility, among others. In this paper, an analysis of the irradiation pattern of light emitting diodes is presented. The approximation of these irradiation patterns by both Lambertian and Gaussian functions for the design of light sources is proposed. Finally, the obtained results and the usefulness of reducing the irradiation pattern of light emitting diodes to these functions are discussed. (Author)
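
The Lambertian approximation mentioned above is commonly written I(theta) = I0 * cos(theta)**m, where the order m follows from the half-intensity view angle. A small sketch (a generic textbook relation, not the authors' fitting procedure):

```python
import math

def lambertian_order(theta_half_deg):
    """Order m of I(theta) = I0 * cos(theta)**m, from the view angle at
    which the intensity drops to half of the on-axis value."""
    return -math.log(2.0) / math.log(math.cos(math.radians(theta_half_deg)))

def intensity(theta_deg, m, i0=1.0):
    """Relative irradiance of the Lambertian model at angle theta."""
    return i0 * math.cos(math.radians(theta_deg)) ** m

m = lambertian_order(60.0)        # an ideal Lambertian emitter: m = 1
half = intensity(60.0, m)         # back to 0.5 by construction
```

Narrow-beam LEDs have larger m (smaller half-angle); a Gaussian profile in theta is an alternative fit when the datasheet pattern is not well described by a cosine power.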

  15. Economic analysis of the need for advanced power sources

    International Nuclear Information System (INIS)

    Hardie, R.W.; Omberg, R.P.

    1975-01-01

    The purpose of this paper is to determine the economic need for an advanced power source, be it fusion, solar, or some other concept. Calculations were also performed assuming abandonment of the LMFBR program, so breeder benefits are a by-product of this study. The model used was the ALPS linear programming system for forecasting optimum power growth patterns. Total power costs were calculated over a planning horizon from 1975 to 2041 and discounted at 7.5 percent. The benefit of a particular advanced power source is simply the reduction in total power cost resulting from its introduction. Since data concerning advanced power sources (APS) are speculative, parametric calculations varying introduction dates and capital costs about a hypothetical APS plant were performed. Calculations were also performed without the LMFBR to determine the effect of the breeder on the benefits of an advanced power source. Other data used in the study, such as the energy demand curve and uranium resource estimates, are given in the Appendix, together with a list of the 11 power plants used in this study. Calculations were performed for APS introduction dates of 2001 and 2011. Estimates of APS capital costs included cases where the costs were assumed to be $50/kW and $25/kW higher than the LMFBR's, as well as cases where APS and LMFBR capital costs are identical. It is noted that the APS capital costs used in this study are not estimates of potential advanced power system plant costs, but were chosen to compute potential dollar benefits of advanced power systems under extremely optimistic assumptions. As a further example, all APS fuel cycle costs were assumed to be zero.
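
The benefit definition used in the study, the reduction in discounted total power cost, is easy to state in code. The cost streams below are invented placeholders; only the 7.5 percent discount rate follows the study:

```python
def discounted_total(costs, rate=0.075):
    """Present value of an annual cost stream discounted at `rate`."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(costs))

# hypothetical annual system costs (arbitrary units), 30-year horizon:
# the advanced power source (APS) lowers costs only after year 10
without_aps = [100.0] * 30
with_aps = [100.0] * 10 + [90.0] * 20

# the benefit of the APS is the reduction in discounted total power cost
benefit = discounted_total(without_aps) - discounted_total(with_aps)
```

Discounting is what makes the introduction date matter: identical annual savings are worth far less in present value when they begin a decade later.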

  16. Turbulence in extended synchrotron radio sources. I. Polarization of turbulent sources. II. Power-spectral analysis

    International Nuclear Information System (INIS)

    Eilek, J.A.

    1989-01-01

    Recent theories of magnetohydrodynamic turbulence are used to construct microphysical turbulence models, with emphasis on models of anisotropic turbulence. These models have been applied to the determination of the emergent polarization from a resolved uniform source. It is found that depolarization alone is not a unique measure of the turbulence, and that the turbulence will also affect the total-intensity distributions. Fluctuations in the intensity image can thus be employed to measure turbulence strength. In the second part, it is demonstrated that a power-spectral analysis of the total and polarized intensity images can be used to obtain the power spectra of the synchrotron emission. 81 refs
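
The power-spectral analysis of a total or polarized intensity image amounts to a 2-D FFT followed by azimuthal averaging. A minimal numpy sketch (the synthetic single-frequency test image is an assumption used only to check the radial binning):

```python
import numpy as np

def radial_power_spectrum(image, nbins=20):
    """Azimuthally averaged power spectrum of a 2-D intensity image."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2)      # radius in frequency pixels
    edges = np.linspace(0.0, r.max(), nbins + 1)
    which = np.digitize(r.ravel(), edges) - 1
    flat = power.ravel()
    return np.array([flat[which == i].mean() if np.any(which == i) else 0.0
                     for i in range(nbins)])

# demo: a pure spatial frequency concentrates power at a single radius
n = 64
xx, _ = np.meshgrid(np.arange(n), np.arange(n))
img = np.sin(2 * np.pi * 8 * xx / n)            # 8 cycles across the image
spec = radial_power_spectrum(img)
```

Applied to a synchrotron intensity map, the slope of this radial spectrum is what constrains the turbulence power spectrum.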

  17. The comparison of four neutron sources for Prompt Gamma Neutron Activation Analysis (PGNAA) in vivo detections of boron.

    Science.gov (United States)

    Fantidis, J G; Nicolaou, G E; Potolias, C; Vordos, N; Bandekas, D V

    A Prompt Gamma Ray Neutron Activation Analysis (PGNAA) system incorporating an isotopic neutron source has been simulated using the MCNPX Monte Carlo code. In order to improve the signal-to-noise ratio, different collimators and a filter were placed between the neutron source and the object. The effect of the positioning of the neutron beam and the detector relative to the object has been studied. In this work the optimisation procedure is demonstrated for boron. Monte Carlo calculations were carried out to compare the performance of the proposed PGNAA system using four different neutron sources (241Am/Be, 252Cf, 241Am/B, and a DT neutron generator). Among the different systems, the 252Cf-based PGNAA system has the best performance.

  18. Observation of Point-Light-Walker Locomotion Induces Motor Resonance When Explicitly Represented; An EEG Source Analysis Study

    Directory of Open Access Journals (Sweden)

    Alberto Inuggi

    2018-03-01

    Full Text Available Understanding human motion, to infer the goal of others' actions, is thought to involve the observer's motor repertoire. One prominent class of actions, human locomotion, has been the object of several studies, all focused on manipulating the shape of degraded human figures such as point-light walker (PLW) stimuli, represented as walking on the spot. Nevertheless, since the main goal of the locomotor function is to displace the whole body from one position to another, such stimuli might not fully represent a goal-directed action and thus might not induce the same motor resonance mechanism expected when observing natural locomotion. To explore this hypothesis, we recorded the event-related potentials (ERP) evoked while decoding canonical/scrambled and translating/centered PLWs. We individuated a novel ERP component (N2c) over central electrodes, around 435 ms after stimulus onset, for translating compared to centered PLWs, only when the canonical shape was preserved. Consistently with our hypothesis, source analysis associated this component with the activation of trunk and lower-leg primary sensorimotor and supplementary motor areas. These results confirm the role of one's own motor repertoire in processing human action and suggest that ERPs can detect the associated motor resonance only when the human figure is explicitly involved in performing a meaningful action.

  19. Design-Oriented Analysis of Resonance Damping and Harmonic Compensation for LCL-Filtered Voltage Source Converters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Loh, Poh Chiang

    2014-01-01

    This paper addresses the interaction between harmonic resonant controllers and active damping of LCL resonance in voltage source converters. A virtual series R-C damper in parallel with the filter capacitor is proposed with the capacitor current feedback loop. The phase lag resulting from...... crossover frequency defined by the proportional gain of current controller. This is of particular interest for high-performance active harmonic filtering applications and low-pulse-ratio converters. Case studies in experiments validate the theoretical analysis....

  20. Fine particulates over South Asia: Review and meta-analysis of PM2.5 source apportionment through receptor model.

    Science.gov (United States)

    Singh, Nandita; Murari, Vishnu; Kumar, Manish; Barman, S C; Banerjee, Tirthankar

    2017-04-01

    Fine particulates (PM2.5) constitute the dominant proportion of airborne particulates and have often been associated with human health disorders, changes in regional climate, the hydrological cycle and, more recently, food security. Intrinsic properties of particulates are a direct function of their sources. This motivates a comprehensive review of PM2.5 sources over South Asia, which in turn may be valuable for developing strategies for emission control. Particulate source apportionment (SA) through receptor models is one of the existing tools to quantify the contribution of particulate sources. A review of 51 SA studies was performed, of which 48 (94%) appeared within the span 2007-2016. More than half of the SA studies (55%) were concentrated on a few typical urban stations (Delhi, Dhaka, Mumbai, Agra and Lahore). Due to the lack of local particulate source profiles and emission inventories, positive matrix factorization and principal component analysis (62% of studies) were the primary choices, followed by chemical mass balance (CMB, 18%). Metallic species were most regularly used as source tracers, while the use of organic molecular markers and gas-to-particle conversion was minimal. Among all the SA sites, vehicular emissions (mean ± sd: 37 ± 20%) emerged as the most dominant PM2.5 source, followed by industrial emissions (23 ± 16%), secondary aerosols (22 ± 12%) and natural sources (20 ± 15%). Vehicular emissions (39 ± 24%) were also identified as the dominant source for highly polluted sites (PM2.5 >100 μg m-3, n = 15), while site-specific influence of industrial, secondary-aerosol and natural sources, alone or in combination, was recognized. Source-specific trends varied considerably in terms of region and seasonality. Both natural and industrial sources were most influential over Pakistan and Afghanistan, while over the Indo-Gangetic plain, vehicular, natural and industrial emissions appeared dominant. Influence of vehicular emission was
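
Of the receptor models tallied above, chemical mass balance is the simplest to state: ambient species concentrations are modeled as a linear mix of known source profiles. A toy sketch with invented numbers (real CMB implementations also weight species by measurement uncertainty):

```python
import numpy as np

# rows: chemical species; columns: sources (fractional abundances, invented)
profiles = np.array([[0.30, 0.05],
                     [0.10, 0.40],
                     [0.05, 0.20]])
true_contrib = np.array([10.0, 5.0])     # source contributions, ug/m3
ambient = profiles @ true_contrib        # "measured" ambient concentrations

# least-squares solve for the source contributions
contrib, *_ = np.linalg.lstsq(profiles, ambient, rcond=None)
```

PMF differs in that the profiles themselves are unknown and factored out of many samples at once, which is why it dominates in regions lacking local source profiles.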

  1. A THEORETICAL ANALYSIS OF KEY POINTS WHEN CHOOSING OPEN SOURCE ERP SYSTEMS

    Directory of Open Access Journals (Sweden)

    Fernando Gustavo Dos Santos Gripe

    2011-08-01

    Full Text Available The present work offers a theoretical analysis of the main features of Open Source ERP systems, herein identified as technical success factors, in order to help establish parameters to be used in decision-making when choosing a system that fulfills the organization's needs. Initially, the life cycle of ERP systems is contextualized, highlighting the features of Open Source ERP systems. It was verified that, when carefully analyzed, these systems need further attention regarding project continuity, maturity, structure, transparency, updating frequency, and support, all of which are inherent to the reality of this type of software. Nevertheless, advantages were observed concerning flexibility, costs, and non-discontinuity. The main goal is to broaden the discussion about the adoption of Open Source ERP systems.

  2. Phase 2 safety analysis report: National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Stefan, P.

    1989-06-01

    The Phase II program was established in order to provide additional space for experiments, and also staging and equipment storage areas. It also provides additional office space and new types of advanced instrumentation for users. This document will deal with the new safety issues resulting from this extensive expansion program, and should be used as a supplement to BNL Report No. 51584 ''National Synchrotron Light Source Safety Analysis Report,'' July 1982 (hereafter referred to as the Phase I SAR). The initial NSLS facility is described in the Phase I SAR. It comprises two electron storage rings, an injection system common to both, experimental beam lines and equipment, and office and support areas, all of which are housed in a 74,000 sq. ft. building. The X-ray Ring provides for 28 primary beam ports and the VUV Ring, 16. Each port is capable of division into 2 or 3 separate beam lines. All ports receive their synchrotron light from conventional bending magnet sources, the magnets being part of the storage ring lattice. 4 refs

  3. Search for neutrino point sources with an all-sky autocorrelation analysis in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Turcati, Andrea; Bernhard, Anna; Coenders, Stefan [TU, Munich (Germany); Collaboration: IceCube-Collaboration

    2016-07-01

    The IceCube Neutrino Observatory is a cubic-kilometre-scale neutrino telescope located in the Antarctic ice. Its full-sky field of view gives unique opportunities to study neutrino emission from the Galactic and extragalactic sky. Recently, IceCube found the first signal of astrophysical neutrinos with energies up to the PeV scale, but the origin of these particles remains unresolved. Given the observed flux, the absence of bright point sources is consistent with the presence of numerous weak sources. This scenario can be tested using autocorrelation methods. We present the sensitivities and discovery potentials of a two-point angular correlation analysis performed on seven years of IceCube data, taken between 2008 and 2015. The test is applied to the northern and southern skies separately, using the neutrino energy information to improve the effectiveness of the method.
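The two-point autocorrelation idea used in the record above can be sketched numerically: for an isotropic sky, the number of event pairs per angular-separation bin follows the solid-angle fraction, and clustering from many weak sources would show up as a small-separation excess. A minimal illustration, not IceCube's actual likelihood machinery; the event count and binning are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_vectors(n, rng):
    """Draw n isotropic directions on the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def pair_separation_histogram(vecs, bins):
    """Histogram of all pairwise angular separations (radians)."""
    cosang = np.clip(vecs @ vecs.T, -1.0, 1.0)
    iu = np.triu_indices(len(vecs), k=1)   # each pair counted once
    ang = np.arccos(cosang[iu])
    counts, _ = np.histogram(ang, bins=bins)
    return counts

n = 200
vecs = random_unit_vectors(n, rng)
bins = np.linspace(0.0, np.pi, 19)
counts = pair_separation_histogram(vecs, bins)

# For an isotropic sky the expected pair fraction per bin [a, b] is the
# solid-angle fraction (cos a - cos b) / 2.
expected = n * (n - 1) / 2 * (np.cos(bins[:-1]) - np.cos(bins[1:])) / 2.0
```

An excess of `counts` over `expected` in the first bins would indicate clustering; the real analysis quantifies this with a test statistic and scrambled-data trials.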

  4. Requirements Analysis Study for Master Pump Shutdown System Project Development Specification

    International Nuclear Information System (INIS)

    BEVINS, R.R.

    2000-01-01

    This study is a requirements document that presents the analysis behind the functional description of the master pump shutdown system. It identifies the sources of the requirements and/or how they were derived. Each requirement is validated either by quoting the source or through an analysis process involving the required functionality, performance characteristics, operations input, or engineering judgment. The requirements in this study apply to the first phase of the W314 Project. This document was updated during the definitive design portion of the first phase of the W314 Project to capture additional software requirements, and it is planned to be updated during the second phase of the project to cover that phase's scope

  5. AtomicJ: An open source software for analysis of force curves

    Science.gov (United States)

    Hermanowicz, Paweł; Sarna, Michał; Burda, Kvetoslava; Gabryś, Halina

    2014-06-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how the sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh.
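As an illustration of the contact-mechanics fitting referred to above, the sketch below fits Young's modulus to a synthetic force-indentation curve with the Hertz model for a spherical tip. The radius, Poisson's ratio, and modulus values are arbitrary, and this is not AtomicJ's own code:

```python
import math

R = 5e-6        # indenter radius (m), illustrative
nu = 0.5        # Poisson's ratio (incompressible sample), illustrative
E_true = 20e3   # Young's modulus (Pa) used to generate the data

# Synthetic force-indentation curve from the Hertz model:
# F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta^(3/2)
delta = [i * 1e-8 for i in range(1, 51)]                    # indentation (m)
g = [(4.0 / 3.0) * math.sqrt(R) * d ** 1.5 for d in delta]  # geometry factor
F = [E_true / (1.0 - nu ** 2) * gi for gi in g]             # force (N)

# The model is linear in the reduced modulus E/(1 - nu^2), so the
# least-squares fit has a closed form.
E_reduced = sum(fi * gi for fi, gi in zip(F, g)) / sum(gi * gi for gi in g)
E_fit = E_reduced * (1.0 - nu ** 2)
```

Real curves additionally require contact-point detection and the model-deviation corrections the abstract mentions; here the fit recovers `E_true` exactly because the data are noiseless.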

  6. AtomicJ: An open source software for analysis of force curves

    International Nuclear Information System (INIS)

    Hermanowicz, Paweł; Gabryś, Halina; Sarna, Michał; Burda, Kvetoslava

    2014-01-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how the sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh

  7. Simulation study on ion extraction from ECR ion sources

    International Nuclear Information System (INIS)

    Fu, S.; Kitagawa, A.; Yamada, S.

    1993-07-01

    In order to study the beam optics of the NIRS-ECR ion source used in HIMAC, the EGUN code has been modified to model ion extraction from a plasma. Two versions of the modified code were developed using 1-D and 2-D sheath theories, respectively. The convergence problem of the strongly nonlinear self-consistent equations is investigated. Simulations of the NIRS-ECR ion source and the HYPER-ECR ion source (at INS, Univ. of Tokyo) are presented in this paper, exhibiting agreement with the experimental results. Some preliminary suggestions on upgrading the extraction systems of these sources are also proposed. (author)
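Extraction optics of the kind EGUN models are bounded by the space-charge limit; as a back-of-the-envelope companion, the Child-Langmuir law gives the maximum current density across a planar extraction gap. The voltage, gap, and ion species below are illustrative, not the NIRS-ECR parameters:

```python
import math

EPS0 = 8.8541878128e-12      # vacuum permittivity (F/m)
E_CHARGE = 1.602176634e-19   # elementary charge (C)
AMU = 1.66053906660e-27      # atomic mass unit (kg)

def child_langmuir_j(V, d, mass_kg, charge=E_CHARGE):
    """Space-charge-limited current density (A/m^2):
    J = (4 eps0 / 9) * sqrt(2 q / m) * V^(3/2) / d^2."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * charge / mass_kg) * V ** 1.5 / d ** 2

# Illustrative numbers: 30 kV across a 5 mm gap, singly charged carbon ions.
J = child_langmuir_j(30e3, 5e-3, 12.0 * AMU)
```

The J ∝ V^(3/2)/d² scaling is why extraction-system upgrades revolve around gap geometry and extraction voltage.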

  8. Simulation study on ion extraction from ECR ion sources

    Energy Technology Data Exchange (ETDEWEB)

    Fu, S.; Kitagawa, A.; Yamada, S.

    1993-07-01

    In order to study the beam optics of the NIRS-ECR ion source used in HIMAC, the EGUN code has been modified to model ion extraction from a plasma. Two versions of the modified code were developed using 1-D and 2-D sheath theories, respectively. The convergence problem of the strongly nonlinear self-consistent equations is investigated. Simulations of the NIRS-ECR ion source and the HYPER-ECR ion source (at INS, Univ. of Tokyo) are presented in this paper, exhibiting agreement with the experimental results. Some preliminary suggestions on upgrading the extraction systems of these sources are also proposed. (author).

  9. Further study on source parameters at Quirke Mine, Elliot Lake, Ontario

    International Nuclear Information System (INIS)

    Chen, S.

    1991-01-01

    A further analysis of source parameters for thirty-seven mining-induced seismic events at Quirke Mine, Elliot Lake, Ontario, has been carried out to study the self-similarity assumption in the scaling law of the seismic spectrum for mining-induced microearthquakes, and to understand the focal mechanism in the mine. Evidence comes from high P-wave energy, with Ep/Es ratios of 5% to 30%, and about 80% of the events with Es/Ep (ML). For the same total seismic energy, the apparent stress is limited between seismic moments of 80 GN·m and 800 GN·m. The observed stress drop is dependent on the seismic moment, which implies a breakdown in the scaling law for events induced by mining. An analysis of peak particle velocity and acceleration presents evidence for seismic attenuation over the fractured zone above the rockburst area in the mine
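The source parameters discussed here relate through standard point-source formulas; as a hedged numerical sketch (all values hypothetical, not the Quirke Mine data), apparent stress and Brune stress drop can be computed as:

```python
mu = 3.0e10   # shear modulus of hard rock (Pa), assumed representative value
E_s = 1.0e6   # radiated seismic energy (J), hypothetical event
M_0 = 1.0e12  # seismic moment (N·m), hypothetical event

# Apparent stress: sigma_a = mu * E_s / M_0
sigma_a = mu * E_s / M_0

# Brune stress drop for a circular source of radius r:
# delta_sigma = (7/16) * M_0 / r^3
r = 100.0     # source radius (m), hypothetical
delta_sigma = 7.0 / 16.0 * M_0 / r ** 3
```

Self-similarity would require these stress measures to be independent of M_0; a systematic dependence of stress drop on moment, as reported above, is the breakdown of that scaling.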

  10. Municipal solid waste source-separated collection in China: A comparative analysis

    International Nuclear Information System (INIS)

    Tai Jun; Zhang Weiqian; Che Yue; Feng Di

    2011-01-01

    A pilot program focusing on municipal solid waste (MSW) source-separated collection was launched in eight major cities throughout China in 2000. Detailed investigations were carried out and a comprehensive system was constructed to evaluate the effects of the eight-year implementation in those cities. This paper provides an overview of the different methods of collection, transportation, and treatment of MSW in the eight cities, as well as a comparative analysis of MSW source-separated collection in China. Information about the quantity and composition of MSW shows that its characteristics are similar across cities: low calorific value, high moisture content, and a high proportion of organic matter. Differences that exist among the eight cities in municipal solid waste management (MSWM) are presented in this paper. Only Beijing and Shanghai demonstrated relatively effective implementation of MSW source-separated collection, while the six remaining cities performed poorly. Considering the current status of MSWM, source-separated collection should be a key priority, and a wider range of cities should participate in this program instead of merely the eight pilot cities. It is evident that an integrated MSWM system is urgently needed. Kitchen waste and recyclables are encouraged to be separated at the source. The stakeholders involved play an important role in MSWM, so their responsibilities should be clearly identified. Improvements in legislation, coordination mechanisms, and public education are problematic issues that need to be addressed.

  11. Paleomagnetism.org : An online multi-platform open source environment for paleomagnetic data analysis

    NARCIS (Netherlands)

    Koymans, Mathijs R.; Langereis, C.G.; Pastor-Galán, D.; van Hinsbergen, D.J.J.

    2016-01-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported.

  12. Time-Reversal Study of the Hemet (CA) Tremor Source

    Science.gov (United States)

    Larmat, C. S.; Johnson, P. A.; Guyer, R. A.

    2010-12-01

    Since its first observation by Nadeau & Dolenc (2005) and Gomberg et al. (2008), tremor along the San Andreas fault system has been thought to be a probe into the frictional state of the deep part of the fault (e.g. Shelly et al., 2007). Tremor is associated with slow, otherwise deep, aseismic slip events that may be triggered by faint signals such as passing waves from remote earthquakes or solid Earth tides. Well-resolved tremor source location is key to constraining frictional models of the fault. However, tremor source location is challenging because of the high-frequency and highly scattered nature of the tremor signal, characterized by the lack of isolated phase arrivals. Time Reversal (TR) methods are emerging as a useful tool for location. The unique requirement is a good velocity model, so that the different time-reversed phases arrive coherently onto the source point. We present location results for a tremor source near the town of Hemet, CA, which was triggered by the 2002 M 7.9 Denali Fault earthquake (Gomberg et al., 2008) and by the 2009 M 6.9 Gulf of California earthquake. We performed TR in a volume model of 88 (N-S) x 70 (W-E) x 60 km (Z) using the full-wave 3D wave-propagation package SPECFEM3D (Komatitsch et al., 2002). The results for the 2009 episode indicate a deep source (at about 22 km) located about 4 km SW of the fault surface scarp. We perform STA/LTA and correlation analysis in order to have independent confirmation of the Hemet tremor source. We gratefully acknowledge the support of the U.S. Department of Energy through the LANL/LDRD Program for this work.
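The focusing principle behind time-reversal location can be caricatured in one dimension: for impulsive arrivals, back-propagating the reversed records is equivalent to finding the position where the implied origin times coincide. A toy grid search under an assumed uniform velocity (none of these numbers come from the Hemet study, and real TR back-propagates full waveforms through a 3D model):

```python
c = 3000.0                       # wave speed (m/s), assumed uniform
stations = [0.0, 10000.0, 25000.0, 40000.0]   # station positions (m)
x_src, t0 = 18000.0, 2.0         # hypothetical source position and origin time

# Synthetic first-arrival times at each station.
arrivals = [t0 + abs(x - x_src) / c for x in stations]

# Scan candidate positions: at the true source, the implied origin times
# t_i - |x_i - x|/c coincide, so their spread is minimal (the "focus").
best = None
for xc in range(0, 40001, 100):
    t_implied = [t - abs(x - xc) / c for t, x in zip(arrivals, stations)]
    mean_t = sum(t_implied) / len(t_implied)
    spread = sum((t - mean_t) ** 2 for t in t_implied)
    if best is None or spread < best[0]:
        best = (spread, xc, mean_t)

x_est, t0_est = best[1], best[2]
```

The grid search recovers both the source position and origin time; the waveform version replaces the arrival-time spread with the energy of the refocused wavefield.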

  13. Is drinking water from 'improved sources' really safe? A case study in the Logone valley (Chad-Cameroon).

    Science.gov (United States)

    Sorlini, S; Palazzini, D; Mbawala, A; Ngassoum, M B; Collivignarelli, M C

    2013-12-01

    Within a cooperation project coordinated by the Association for Rural Cooperation in Africa and Latin America (ACRA) Foundation, water supplies were sampled across the villages of the Logone valley (Chad-Cameroon), mostly from boreholes, open wells, rivers and lakes, as well as from some piped water. Microbiological analyses and sanitary inspections were carried out at each source. The microbiological quality was determined by analysis of indicators of faecal contamination, Escherichia coli, Enterococci and Salmonellae, using the membrane filtration method. Sanitary inspections were done using WHO query forms. The assessment confirmed that there are several parameters of health concern in the studied area; bacteria of faecal origin are the most significant. Furthermore, this study demonstrated that the Joint Monitoring Programme (JMP) classification and E. coli measurement are not sufficient to establish water safety. In fact, in the studied area, JMP-defined 'improved sources' may provide unsafe water depending on their structure, and sources without E. coli may contain Enterococci and Salmonellae. Sanitary inspections also revealed high health risks for some boreholes. In other cases, sources with low sanitary risk and no E. coli were contaminated by Enterococci and Salmonellae. Better management and protection of the sources, hygiene improvement and domestic water treatment before consumption are possible solutions to reduce health risks in the Logone valley.

  14. Analysis of jet-airfoil interaction noise sources by using a microphone array technique

    Science.gov (United States)

    Fleury, Vincent; Davy, Renaud

    2016-03-01

    The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open test section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected Lighthill's and Ffowcs-Williams and Hawkings' acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought; it is defined as the optimal solution to a minimal-error equation using the measured microphone cross-spectra as reference. This inverse problem is ill-posed, however, so a penalty term based on a localization operator is added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is clearly identified, too. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is also observed. Indications of acoustic reflections on the airfoil are also discerned.
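The penalized inverse problem described above has the same shape as Tikhonov-regularized least squares. A minimal sketch with a random stand-in forward operator (the real operator comes from the convected acoustic analogy, and the penalty here is a plain identity rather than the paper's localization operator):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in forward operator mapping 10 candidate source strengths to
# 40 microphone observations.
A = rng.normal(size=(40, 10))
x_true = np.zeros(10)
x_true[3] = 1.0                    # one dominant compact source
b = A @ x_true + 0.01 * rng.normal(size=40)   # noisy "measurements"

# Regularized least squares: minimize ||A x - b||^2 + lam * ||x||^2,
# solved via the normal equations (A^T A + lam I) x = A^T b.
lam = 1e-3
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
```

The penalty stabilizes the inversion against noise; with a localization operator in place of the identity, it additionally biases the solution toward spatially compact source distributions.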

  15. Nmrglue: an open source Python package for the analysis of multidimensional NMR data.

    Science.gov (United States)

    Helmus, Jonathan J; Jaroniec, Christopher P

    2013-04-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.
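Of the non-Fourier methods mentioned, covariance NMR has a particularly compact core: the covariance spectrum is the matrix square root of SᵀS for a data matrix S. A small self-contained sketch of that one step, using a random stand-in matrix rather than nmrglue's file readers:

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.normal(size=(64, 16))   # stand-in 2D data matrix (t1 x f2)

# Direct covariance NMR: C = (S^T S)^(1/2), via eigendecomposition of
# the symmetric positive semi-definite matrix S^T S.
M = S.T @ S
w, V = np.linalg.eigh(M)
w = np.clip(w, 0.0, None)       # guard against tiny negative eigenvalues
C = (V * np.sqrt(w)) @ V.T      # V diag(sqrt(w)) V^T
```

In practice nmrglue would supply `S` from a processed data set; the square root resolves the f2 resolution onto the indirect dimension without a full t1 Fourier transform.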

  16. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    Energy Technology Data Exchange (ETDEWEB)

    Helmus, Jonathan J., E-mail: jjhelmus@gmail.com [Argonne National Laboratory, Environmental Science Division (United States); Jaroniec, Christopher P., E-mail: jaroniec@chemistry.ohio-state.edu [Ohio State University, Department of Chemistry and Biochemistry (United States)

    2013-04-15

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  17. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    International Nuclear Information System (INIS)

    Helmus, Jonathan J.; Jaroniec, Christopher P.

    2013-01-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  18. Experimental analysis of a diffusion absorption refrigeration system using alternative energy sources

    International Nuclear Information System (INIS)

    Soezen, A.; Oezbas, E.

    2009-01-01

    The continuous-cycle absorption refrigeration device is widely used in domestic refrigerators and recreational vehicles. It is also used in year-round air conditioning of both homes and larger buildings. The unit consists of four main parts: the boiler, condenser, evaporator and absorber. When the unit operates on kerosene or gas, the heat is supplied by a burner fitted underneath the central tube. When operating on electricity, the heat is supplied by an element inserted in the pocket. No moving parts are employed. The operation of the refrigerating mechanism is based on Dalton's law. In this study, an experimental analysis was performed of a diffusion absorption refrigeration system (DARS) using alternative energy sources such as solar and liquid petroleum gas (LPG). Two basic DAR cycles were set up and investigated: i) In the first cycle (DARS-1), the condensate is sub-cooled prior to the evaporator entrance by the coupled evaporator/gas heat exchanger, similar to the unit manufactured by Electrolux, Sweden. ii) In the second cycle (DARS-2), the condensate is not sub-cooled prior to the evaporator entrance and the gas heat exchanger is separated from the evaporator. (author)

  19. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation of heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, cluster analysis, and correlation analysis). The results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering of parent materials and subsequent pedogenesis in the alluvial deposits). The distribution of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provide essential information on the possible sources of heavy metals, which will contribute to the monitoring and assessment of agricultural soils in regions worldwide.
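Principal component analysis of the kind used here reduces the metal-concentration matrix to a few factors whose loadings suggest common sources. A minimal numpy sketch on synthetic data with one "anthropogenic" and one "natural" latent factor (the loadings and sample counts are invented for illustration, not the Shanghai data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical concentrations of 6 metals at 50 sites, driven by two
# latent factors plus measurement noise.
anthro = rng.normal(size=(50, 1))
natural = rng.normal(size=(50, 1))
loadings = np.array([[1.0, 0.1], [0.9, 0.2], [1.1, 0.0], [1.0, 0.1],  # Cu, Ni, Pb, Cd
                     [0.1, 1.0], [0.0, 1.2]])                          # Zn, Cr
X = np.hstack([anthro, natural]) @ loadings.T + 0.1 * rng.normal(size=(50, 6))

# PCA: standardize columns, then eigendecompose the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = (Z.T @ Z) / len(Z)
evals, evecs = np.linalg.eigh(corr)
order = np.argsort(evals)[::-1]           # eigh returns ascending order
explained = evals[order] / evals.sum()    # explained-variance ratios
```

With two strong factors, the first two components capture most of the variance, and the signs of their loadings (`evecs[:, order[:2]]`) separate the anthropogenic group from the natural one.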

  20. Provenance study of ancient Chinese Yaozhou porcelain by neutron activation analysis

    Science.gov (United States)

    Li, G. X.; Gao, Z. Y.; Li, R. W.; Zhao, W. J.; Xie, J. Z.; Feng, S. L.; Zhuo, Z. X.; Fan, D. Y.; Zhang, Y.; Cai, Z. F.; Liu, H.

    2003-09-01

    This paper reports our study of the provenance of ancient Chinese Yaozhou porcelain. The content of 29 elements in the Yaozhou porcelain samples was measured by neutron activation analysis (NAA). The NAA data were further analysed using fuzzy cluster analysis to obtain the trend fuzzy cluster diagrams. These samples with different glaze colour, ranging over more than 700 years, were fired in different kilns. Our analysis indicates the relatively concentrated distribution of the sources of the raw material for the Yaozhou porcelain body samples. They can be classified into two independent periods, i.e. the Tang (AD 618-907) and the Five Dynasties (AD 907-960) period, and the Song (AD 960-1279) and Jin (AD 1115-1234) period. Our analysis also indicates that the sources of the raw material for the ancient Yaozhou porcelain glaze samples are quite scattered and those for the black glaze in the Tang Dynasty are very concentrated. The sources of the raw material for the celadon glaze and the white glaze in the Tang Dynasty are widely distributed and those for the celadon glaze in the Song Dynasty are close to those of the bluish white glaze in the Jin Dynasty, and they are very concentrated. The sources of the raw material for the porcelain glazes cover those of the porcelain bodies.
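Fuzzy cluster analysis assigns each sample a graded membership in every cluster rather than a hard label, which is what lets the trend diagrams above group kilns and periods. A minimal fuzzy c-means sketch on two synthetic "kilns" in a two-element composition space (a stand-in for the 29-element NAA fingerprints; not the authors' algorithm or data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Two well-separated synthetic composition clusters.
a = rng.normal([0.0, 0.0], 0.1, size=(20, 2))
b = rng.normal([5.0, 5.0], 0.1, size=(20, 2))
X = np.vstack([a, b])

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, eps=1e-12, seed=0):
    """Minimal fuzzy c-means: returns memberships U (n x c) and centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
        # Standard membership update: u_ik proportional to d_ik^(-2/(m-1)).
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

U, centers = fuzzy_cmeans(X)
```

For well-separated groups the memberships approach 0 or 1; samples of mixed or intermediate composition would show intermediate memberships, which is the extra information fuzzy clustering offers over hard clustering.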

  1. A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software

    Science.gov (United States)

    Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.

    2016-01-01

    This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software packages that claim to perform these tasks. The survey concluded that no single package exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle and biofluidic studies. As a result, a series of Python scripts provided the bridging mechanism to address the shortcomings of the available open-source tools. The VTK library provided the quickest and most effective means of segmenting regions of interest from the medical images; it allowed for the export of a 3D model by using the marching cubes algorithm to build a surface mesh. Developing the model domain from this extracted information required the surface mesh to be processed in the open-source software packages Blender and Gmsh. The Preview program of the FEBio suite proved sufficient for volume-filling the model with an unstructured mesh and preparing boundary specifications for finite-element analysis. To fully enable FEM modeling, an in-house Python script assigned material properties on an element-by-element basis by performing a weighted interpolation of the voxel intensities of the parent medical image, correlated to published relationships between image intensity and material properties such as ash density. A graphical user interface combined the Python scripts and other software into a user-friendly tool. This work provides a potential alternative to expensive commercial software and limited open-source freeware for the creation of 3D computational models.
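The element-wise material mapping described above typically chains two empirical calibrations: voxel intensity to ash density, then density to modulus via a power law. A sketch with placeholder coefficients (the calibration values and function name below are hypothetical; the record does not give the actual relationships used):

```python
# Hypothetical linear calibration from CT intensity (HU) to ash density,
# then a power-law density-to-modulus relation. Coefficients are
# placeholders, not values from the project.
A_RHO, B_RHO = 0.0, 0.0007      # rho_ash (g/cm^3) = A_RHO + B_RHO * HU
E0, GAMMA = 10500.0, 2.29       # E (MPa) = E0 * rho_ash ** GAMMA

def element_modulus(voxel_hu):
    """Average the intensities of the voxels overlapping an element
    (a stand-in for the weighted interpolation), then map the mean
    intensity to density and the density to Young's modulus."""
    hu = sum(voxel_hu) / len(voxel_hu)
    rho = A_RHO + B_RHO * hu
    return E0 * rho ** GAMMA

E_elem = element_modulus([900.0, 1000.0, 1100.0])
```

Assigning a modulus per element in this way produces the heterogeneous material field that distinguishes image-based bone models from uniform-property FEM models.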

  2. Analysis and simulation of a small-angle neutron scattering instrument on a 1 MW long pulse spallation source

    International Nuclear Information System (INIS)

    Olah, G.A.; Hjelm, R.P.; Lujan, M. Jr.

    1996-01-01

    We studied the design and performance of a small-angle neutron scattering (SANS) instrument for a proposed 1 MW, 60 Hz long-pulse spallation source at the Los Alamos Neutron Science Center (LANSCE). An analysis of the effects of source characteristics and chopper performance, combined with instrument simulations using the LANSCE Monte Carlo instrument simulation package, shows that the T0 chopper should be no more than 5 m from the source, with the frame overlap and frame definition choppers at 5.6 m and greater than 7 m, respectively. The study showed that an optimal pulse structure has an exponentially decaying tail with τ ∼ 750 μs. The Monte Carlo simulations were used to optimize the LPSS SANS instrument, showing that its optimal length is 18 m. The simulations show that an instrument with variable length is best suited to match the needs of a given measurement. The performance of the optimized LPSS instrument was found to be comparable with present world-standard instruments
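The exponential pulse tail with τ ∼ 750 μs is straightforward to reproduce in a Monte Carlo sampler via inverse-transform sampling, which is essentially how instrument-simulation packages draw neutron emission times. A minimal sketch (the sample count is arbitrary, and a real pulse model would also include the rising part of the pulse):

```python
import math
import random

random.seed(0)
TAU = 750e-6   # pulse-tail decay constant (s), from the study

# Inverse-transform sampling: if U ~ Uniform(0, 1), then
# t = -tau * ln(1 - U) is exponentially distributed with mean tau.
n = 100_000
samples = [-TAU * math.log(1.0 - random.random()) for _ in range(n)]
mean_t = sum(samples) / n
```

The sample mean converges to τ, so the simulated emission-time spread matches the pulse shape the chopper layout has to accommodate.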

  3. Exergy analysis of a two-stage ground source heat pump with a vertical bore for residential space conditioning under simulated occupancy

    International Nuclear Information System (INIS)

    Ally, Moonis R.; Munk, Jeffrey D.; Baxter, Van D.; Gehl, Anthony C.

    2015-01-01

    Highlights: • Exergy and energy analysis of a vertical-bore ground source heat pump over a 12-month period is presented. • The ground provided more than 75% of the heating energy. • Performance metrics are presented. • Sources of systemic inefficiency are identified and prioritized using exergy analysis. • Understanding performance metrics is vital for judicious use of renewable energy. - Abstract: This twelve-month field study analyzes the performance of a 7.56 kW (2.16-ton) water-to-air ground-source heat pump (WA-GSHP) used to satisfy domestic space-conditioning loads in a 253 m² house in a mixed-humid climate in the United States. The practical feasibility of using the ground as a source of renewable energy is clearly demonstrated: more than 75% of the energy needed for space heating was extracted from the ground. The average monthly electricity consumption for space conditioning was only 40 kW h at summer and winter thermostat set points of 24.4 °C and 21.7 °C, respectively. The WA-GSHP shared the same 94.5 m vertical-bore ground loop with a separate water-to-water ground-source heat pump (WW-GSHP) meeting domestic hot water needs in the same house. Sources of systemic irreversibility, the main cause of lost work, are identified using exergy and energy analysis. Quantifying the sources of exergy and energy losses is essential for further systemic improvements. The research findings suggest that WA-GSHPs are a practical and viable technology for reducing primary energy consumption and greenhouse gas emissions under the IECC 2012 Standard, as well as the European Union (EU) 2020 targets for renewable energy resources
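The exergy bookkeeping in such studies rests on the Carnot factor: heat Q delivered at temperature T carries exergy Q(1 − T0/T) relative to the dead state T0. A toy calculation with assumed numbers (only the 21.7 °C winter set point comes from the record; the outdoor temperature and COP are illustrative):

```python
# Exergy of heat delivered at T_room with dead state T0: Ex = Q * (1 - T0/T).
T0 = 273.15 + 5.0        # assumed outdoor (dead-state) temperature, K
T_room = 273.15 + 21.7   # winter thermostat set point from the study, K
Q = 3.6e6                # 1 kWh of delivered heat, J

ex_heat = Q * (1.0 - T0 / T_room)

# Second-law (exergy) efficiency of a heat pump with COP = Q/W:
COP = 4.0                # assumed seasonal COP
W = Q / COP              # electrical work input, J
eta_II = ex_heat / W
```

The small Carnot factor of low-temperature space heating is why even a modest COP yields a reasonable second-law efficiency, and why exergy analysis pinpoints where the remaining work potential is destroyed.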

  4. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing

    2017-11-03

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept with a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbation of the positions of the virtual sources and of the velocity. A linear system connecting the perturbation of the positions of those virtual sources and the velocity is derived and subsequently solved by the conjugate gradient method. In theory, the perturbation of the positions of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable for real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.
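The linear system relating virtual-source position perturbations to velocity perturbations is solved with the conjugate gradient method; the textbook CG iteration for a symmetric positive-definite system looks like this (the matrix below is a random SPD stand-in, not a WEMVA operator):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    """Textbook CG for a symmetric positive-definite system A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(5)
M = rng.normal(size=(30, 30))
A = M @ M.T + 30 * np.eye(30)   # SPD stand-in for the normal operator
b = rng.normal(size=30)
x = conjugate_gradient(A, b)
```

In the actual inversion, `A` is never formed explicitly; each `A @ p` is evaluated by a pair of extended-Born modeling and adjoint operations, which is exactly why a matrix-free solver such as CG is chosen.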

  5. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing; Alkhalifah, Tariq Ali

    2017-01-01

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept with a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbation of the positions of the virtual sources and of the velocity. A linear system connecting the perturbation of the positions of those virtual sources and the velocity is derived and subsequently solved by the conjugate gradient method. In theory, the perturbation of the positions of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable for real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.

  6. A novel syngas-fired hybrid heating source for solar-thermal applications: Energy and exergy analysis

    International Nuclear Information System (INIS)

    Pramanik, Santanu; Ravikrishna, R.V.

    2016-01-01

    Highlights: • Biomass-derived syngas as a hybrid energy source for solar thermal power plants. • A novel combustor concept using rich-catalytic and MILD combustion technologies. • Hybrid energy source for a solar-driven supercritical CO2-based Brayton cycle. • Comprehensive energetic and exergetic analysis of the combined system. - Abstract: A hybrid heating source using biomass-derived syngas is proposed to enable continuous operation of standalone solar thermal power generation plants. A novel, two-stage, low-temperature combustion system is proposed that has the potential to provide stable combustion of syngas with near-zero NOx emissions. The hybrid heating system consists of a downdraft gasifier, a two-stage combustion system, and other auxiliaries. When integrated with a solar cycle, the entire system can be referred to as the integrated gasification solar combined cycle (IGSCC). The supercritical CO2 Brayton cycle (SCO2) is selected for the solar cycle due to its high efficiency. The thermodynamic performance of the individual units and the combined system has been evaluated from both energy and exergy considerations. The effect of parameters such as gasification temperature, biomass moisture content, equivalence ratio, and pressure ratio is studied. The efficiency of the IGSCC exhibited a non-monotonic behavior. A maximum thermal efficiency of 36.5% was achieved at an overall equivalence ratio of 0.22 and a pressure ratio of 2.75 when the gasifier was operating at Tg = 1073 K with biomass containing 20% moisture. The efficiency increased to 40.8% when dry biomass was gasified at a temperature of 973 K. The exergy analysis revealed that the maximum exergy destruction occurred in the gasification system, followed by the combustion system, SCO2 cycle, and regenerator. The exergy analysis also showed that 8.72% of the total exergy is lost in the exhaust; however, this can be utilized for drying the biomass.
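The overall equivalence ratio of 0.22 quoted above can be unpacked with simple stoichiometry: φ = (fuel/air)actual / (fuel/air)stoich, so lean operation at φ = 0.22 supplies roughly 1/0.22 ≈ 4.5 times the stoichiometric air. A sketch with an illustrative producer-gas composition (the mole fractions are assumptions, not the paper's gas analysis):

```python
# Stoichiometric O2 demand per mole of fuel gas (illustrative composition).
fuel = {"H2": 0.18, "CO": 0.20, "CH4": 0.02, "CO2": 0.12, "N2": 0.48}
O2_PER_MOLE = {"H2": 0.5, "CO": 0.5, "CH4": 2.0, "CO2": 0.0, "N2": 0.0}

o2_stoich = sum(x * O2_PER_MOLE[s] for s, x in fuel.items())
air_stoich = o2_stoich / 0.21   # moles of air per mole of fuel gas

# Equivalence ratio phi = (fuel/air)_actual / (fuel/air)_stoich, so the
# air supplied per mole of fuel at the reported optimum phi = 0.22 is:
phi = 0.22
air_actual = air_stoich / phi
```

The large excess air explains the low flame temperatures of the MILD regime and, with it, the near-zero NOx claim.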

  7. Hydrodynamic analysis of the interaction of two operating groundwater sources, case study: Groundwater supply of Bečej

    Directory of Open Access Journals (Sweden)

    Polomčić Dušan M.

    2014-01-01

Full Text Available The existing groundwater source 'Vodokanal' for the public water supply of the city of Bečej in Serbia taps groundwater from three water-bearing horizons through 15 wells with a total capacity of 100 l/s. Near the public source, the industrial groundwater source 'Soja Protein', with a current capacity of 12 l/s, taps the same horizons. In the coming period its total capacity is planned to increase to 57 l/s, and the city source's capacity is also planned to grow by 50 l/s in the next few years. Together this means an increase in groundwater abstraction of an additional 84% from the same water-bearing horizons. Hydrodynamic modeling, based on the numerical finite-difference method, shows the impact of increasing the total capacity of the 'Soja Protein' source on groundwater levels at the 'Vodokanal' source, and the effects of the additional lowering of groundwater levels, in all three water-bearing horizons, on the 'Vodokanal' wells due to the operation of the industrial source. Seven variant solutions for the extension of the groundwater sources were developed and their effects simulated over a period of 10 years, with the aim of sustainable groundwater management.
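The study itself relies on a full numerical finite-difference model; as a minimal illustration of the underlying well-interference idea, a steady-state Thiem superposition sketch is given below. All aquifer parameters (transmissivity, radius of influence, well spacing) are invented for illustration and are not taken from the Bečej study.

```python
import math

def thiem_drawdown(Q, T, R, r):
    """Steady-state drawdown (m) at distance r (m) from a well pumping
    Q (m^3/s) in a confined aquifer of transmissivity T (m^2/s),
    with radius of influence R (m)."""
    return Q / (2 * math.pi * T) * math.log(R / r)

# Illustrative values only (not from the study):
T = 5e-3        # transmissivity, m^2/s
R = 2000.0      # radius of influence, m
r_city = 800.0  # assumed distance from industrial wells to a city well, m

# Industrial source before (12 l/s) and after (57 l/s) the planned expansion;
# by superposition, the extra drawdown at the city well is the difference.
s_before = thiem_drawdown(0.012, T, R, r_city)
s_after = thiem_drawdown(0.057, T, R, r_city)
print(f"extra drawdown at city well: {s_after - s_before:.2f} m")
```

A real assessment, as in the paper, needs a transient multi-layer model; this only shows why pumping at one source lowers levels at the other.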

  8. Study on hybrid heat source overlap welding of magnesium alloy AZ31B

    International Nuclear Information System (INIS)

    Liang, G.L.; Zhou, G.; Yuan, S.Q.

    2009-01-01

The magnesium alloy AZ31B was overlap welded by hybrid welding (laser-tungsten inert gas arc). According to the hybrid welding interaction principle, a new heat source model, the hybrid welding heat source model, was developed with finite element analysis. At the same time, using a high-temperature metallographic microscope, the macro-appearance and microstructure characteristics of the joint after hybrid overlap welding were studied. The results indicate that hybrid welding was superior to single tungsten inert gas welding or laser welding in improving the utilization efficiency of the arc and enhancing the absorptivity of the material to laser energy. Due to the energy characteristics of hybrid overlap welding, the macro-appearance of the joint was cup-shaped; the top weld showed the hybrid welding microstructure, while the lower weld showed the typical laser welding microstructure.

  9. Study on hybrid heat source overlap welding of magnesium alloy AZ31B

    Energy Technology Data Exchange (ETDEWEB)

    Liang, G.L. [Department of Electromechanical Engineering, Tangshan College, Tangshan 063000 (China)], E-mail: guoliliang@sohu.com; Zhou, G. [School of Material Science and Engineering, Harbin Institute of Technology, Harbin 150001 (China); Yuan, S.Q. [Department of Electromechanical Engineering, Tangshan College, Tangshan 063000 (China)

    2009-01-15

The magnesium alloy AZ31B was overlap welded by hybrid welding (laser-tungsten inert gas arc). According to the hybrid welding interaction principle, a new heat source model, the hybrid welding heat source model, was developed with finite element analysis. At the same time, using a high-temperature metallographic microscope, the macro-appearance and microstructure characteristics of the joint after hybrid overlap welding were studied. The results indicate that hybrid welding was superior to single tungsten inert gas welding or laser welding in improving the utilization efficiency of the arc and enhancing the absorptivity of the material to laser energy. Due to the energy characteristics of hybrid overlap welding, the macro-appearance of the joint was cup-shaped; the top weld showed the hybrid welding microstructure, while the lower weld showed the typical laser welding microstructure.

  10. Feasibility study on X-ray source with pinhole imaging method

    International Nuclear Information System (INIS)

    Qiu Rui; Li Junli

    2007-01-01

In order to verify the feasibility of studying an X-ray source with the pinhole imaging method, and to optimize the design of an X-ray pinhole imaging system, an X-ray pinhole imaging setup was built. The change of the image due to changes in the position and intensity of the X-ray source was estimated mathematically and validated experimentally. The results show that the changes in spot position and spot gray level are linearly related to the changes in the position and intensity of the X-ray source, so it is feasible to study an X-ray source with the pinhole imaging method in this application. The results provide references for the design of X-ray pinhole imaging systems. (authors)
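The linear relation reported above lends itself to a simple least-squares calibration. The sketch below uses hypothetical calibration data (not from the paper) to fit the spot-shift versus source-shift line and then invert a new spot measurement back to a source position.

```python
import numpy as np

# Hypothetical calibration data: known source displacements (mm) vs.
# measured image-spot displacements (mm). For an ideal pinhole camera the
# image shift is -m * source shift, where m = v/u is the magnification.
source_shift = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
spot_shift = np.array([0.0, -0.52, -1.01, -1.49, -2.02])  # noisy readings

m, c = np.polyfit(source_shift, spot_shift, 1)  # slope = -magnification
print(f"fitted magnification: {-m:.3f}, offset: {c:.3f} mm")

# Invert the linear model to infer the source position from a new spot
new_spot = -1.25
predicted_source = (new_spot - c) / m
print(f"inferred source shift: {predicted_source:.2f} mm")
```

The same linear inversion applies to intensity: once the gray-level-per-unit-intensity slope is calibrated, a measured gray level maps back to source intensity.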

  11. Sociodemographic characteristics and frequency of consuming home-cooked meals and meals from out-of-home sources: cross-sectional analysis of a population-based cohort study.

    Science.gov (United States)

    Mills, Susanna; Adams, Jean; Wrieden, Wendy; White, Martin; Brown, Heather

    2018-04-11

    To identify sociodemographic characteristics associated with frequency of consuming home-cooked meals and meals from out-of-home sources. Cross-sectional analysis of a population-based cohort study. Frequency of consuming home-cooked meals, ready meals, takeaways and meals out were derived from a participant questionnaire. Sociodemographic characteristics regarding sex, age, ethnicity, working overtime and socio-economic status (SES; measured by household income, educational attainment, occupational status and employment status) were self-reported. Sociodemographic differences in higher v. lower meal consumption frequency were explored using logistic regression, adjusted for other key sociodemographic variables. Cambridgeshire, UK. Fenland Study participants (n 11 326), aged 29-64 years at baseline. Eating home-cooked meals more frequently was associated with being female, older, of higher SES (measured by greater educational attainment and household income) and not working overtime. Being male was associated with a higher frequency of consumption for all out-of-home meal types. Consuming takeaways more frequently was associated with lower SES (measured by lower educational attainment and household income), whereas eating out more frequently was associated with higher SES (measured by greater educational attainment and household income) and working overtime. Sociodemographic characteristics associated with frequency of eating meals from different out-of-home sources varied according to meal source. Findings may be used to target public health policies and interventions for promoting healthier diets and dietary-related health towards people consuming home-cooked meals less frequently, such as men, those with lower educational attainment and household income, and overtime workers.
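The adjusted logistic-regression analysis described above can be sketched as follows. The data are synthetic (the Fenland questionnaire data are not reproduced here), and the simulated effect directions merely mirror those reported in the abstract; variable names are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Synthetic sociodemographic covariates (assumptions, not the Fenland data)
female = rng.integers(0, 2, n)
age = rng.uniform(29, 64, n)
degree = rng.integers(0, 2, n)     # educational-attainment proxy
overtime = rng.integers(0, 2, n)

# Simulate "frequently eats home-cooked meals" with the direction of
# association reported above (female, older, higher SES, no overtime).
logit = -2.0 + 0.6 * female + 0.03 * (age - 45) + 0.5 * degree - 0.4 * overtime
y = rng.random(n) < 1 / (1 + np.exp(-logit))

# Logistic regression adjusted for the other covariates: each coefficient
# is a log-odds ratio for higher vs. lower consumption frequency.
X = np.column_stack([female, age, degree, overtime])
model = LogisticRegression(max_iter=1000).fit(X, y)
print(dict(zip(["female", "age", "degree", "overtime"], model.coef_[0].round(2))))
```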

  12. Source apportionment studies on particulate matter in Beijing/China

    Science.gov (United States)

    Suppan, P.; Shen, R.; Shao, L.; Schrader, S.; Schäfer, K.; Norra, S.; Vogel, B.; Cen, K.; Wang, Y.

    2013-05-01

measured dust storm concentration variability at Beijing in the course of time. The results show the importance of intertwined investigations combining measurements and modeling, the analysis of local air pollution levels, and the impact and analysis of advective processes in the greater Beijing region. Comprehensive investigations of particulate matter are a prerequisite for knowledge of source strengths and source attribution to the overall air pollution level. Only this knowledge can help to formulate and introduce specific measures to reduce coarser as well as finer particulates.

  13. Cell_motility: a cross-platform, open source application for the study of cell motion paths

    Directory of Open Access Journals (Sweden)

    Gevaert Kris

    2006-06-01

Full Text Available Abstract Background Migration is an important aspect of cellular behaviour and is therefore widely studied in cell biology. Numerous components are known to participate in this process in a highly dynamic manner. In order to obtain a better insight in cell migration, mutants or drugs are used and their motility phenotype is then linked with the disturbing factors. One of the typical approaches to study motion paths of individual cells relies on fitting mean square displacements to a persistent random walk function. Since the numerous calculations involved often rely on diverse commercial software packages, the analysis can be expensive, labour-intensive and error-prone. Additionally, due to the nature of the algorithms employed, the calculations involved are not readily reproducible without access to the exact software package(s) used. Results We here present the cell_motility software, an open source Java application under the GNU-GPL license that provides a clear and concise analysis workbench for large amounts of cell motion data. Apart from performing the necessary calculations, the software also visualizes the original motion paths as well as the results of the calculations to help the user interpret the data. The application features an intuitive graphical user interface as well as full user and developer documentation and both source and binary files can be freely downloaded from the project website at http://genesis.UGent.be/cell_motility . Conclusion In providing a free, open source software solution for the automated processing of cell motion data, we aim to achieve two important goals: firstly, labs can greatly simplify their data analysis pipeline as switching between different computational software packages becomes obsolete (thus reducing the chances for human error during data manipulation and transfer), and secondly, to provide scientists in the field with a freely available common platform to perform their analyses, enabling more efficient
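The persistent-random-walk fit that cell_motility automates can be sketched with SciPy. The MSD model below is the standard Fürth formula; the speed and persistence-time values, units, and noise level are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def prw_msd(t, S, P):
    """Persistent random walk (Fuerth) mean square displacement:
    MSD(t) = 2 S^2 P [t - P (1 - exp(-t/P))], speed S, persistence time P."""
    return 2 * S**2 * P * (t - P * (1 - np.exp(-t / P)))

# Synthetic MSD curve (assumed values, for illustration only)
t = np.linspace(1, 120, 40)   # minutes
true_S, true_P = 1.5, 8.0     # um/min, min
rng = np.random.default_rng(1)
msd = prw_msd(t, true_S, true_P) * rng.normal(1.0, 0.03, t.size)

# Nonlinear least-squares fit recovers cell speed and persistence time
(S_fit, P_fit), _ = curve_fit(prw_msd, t, msd, p0=(1.0, 5.0))
print(f"speed ~ {S_fit:.2f} um/min, persistence ~ {P_fit:.2f} min")
```

In practice the MSD would first be computed from tracked cell coordinates; the fitted S and P are the quantities typically compared between mutant or drug-treated populations.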

  14. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while keeping most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to assign a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing independence on the components. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series imposing a smaller number of constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
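The PCA-versus-ICA distinction discussed above can be demonstrated on toy data. The sketch below uses scikit-learn's FastICA rather than the paper's vbICA: it mixes two independent, non-Gaussian signals (stand-ins for deformation sources) and shows that ICA recovers them up to sign and scale, which mere decorrelation does not guarantee.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

t = np.linspace(0, 10, 2000)

# Two statistically independent, non-Gaussian sources: a periodic signal
# and a sawtooth (stand-ins for seasonal and transient-like deformation).
s1 = np.sin(2 * np.pi * t)
s2 = ((t * 1.3) % 1.0) - 0.5
S = np.column_stack([s1, s2])

A = np.array([[1.0, 0.6], [0.4, 1.0]])  # mixing matrix ("station responses")
X = S @ A.T                             # observed mixed time series

S_ica = FastICA(n_components=2, random_state=0).fit_transform(X)
S_pca = PCA(n_components=2).fit_transform(X)  # uncorrelated, but still mixed

# Correlate each recovered component with the true sources (sign-agnostic)
corr = lambda a, b: abs(np.corrcoef(a, b)[0, 1])
best_s1 = max(corr(S_ica[:, 0], s1), corr(S_ica[:, 1], s1))
best_s2 = max(corr(S_ica[:, 0], s2), corr(S_ica[:, 1], s2))
print(f"ICA match to s1: {best_s1:.3f}, to s2: {best_s2:.3f}")
```

vbICA differs in that each source pdf is modeled as a Gaussian mixture within a variational Bayesian framework, but the separation goal illustrated here is the same.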

  15. Factor analysis of sources of information on organ donation and transplantation in journalism students.

    Science.gov (United States)

    Martínez-Alarcón, L; Ríos, A; Ramis, G; López-Navas, A; Febrero, B; Ramírez, P; Parrilla, P

    2013-01-01

Journalists and the information they disseminate are essential to promote health and organ donation and transplantation (ODT). The attitude of journalism students toward ODT could influence public opinion and help promote this treatment option. The aim of this study was to determine the media through which journalism students receive information on ODT and to analyze the association between the sources of information and psychosocial variables. We surveyed journalism students (n = 129) recruited in compulsory classes. A validated psychosocial questionnaire (self-administered, anonymous) about ODT was used. Student's t test and the χ² test were applied. The questionnaire completion rate was 98% (n = 126). The medium with the greatest reach among students was television (TV), followed by the press and magazines/books. In the factor analysis to determine the impact of the information by its source, the first factor was talks with friends and family; the second was shared by hoardings/publicity posters, health professionals, and college/school; and the third was TV and radio. In the factor analysis between information sources and psychosocial variables, the associations were between information about organ donation transmitted by friends and family and having spoken about ODT with them; by TV, radio, and hoardings and not having spoken about it in the family; and by TV/radio and the father's and mother's opinion about ODT. The medium with the greatest reach among students is TV, while the greatest impact came from conversations with friends, family, and health professionals. This could be useful for society, because people should be provided with clear and concise information. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Fusing Open Source Intelligence and Handheld Situational Awareness - Benghazi Case Study

    Science.gov (United States)

    2014-10-01

DM-0001694 Fusing Open Source Intelligence and Handheld Situational Awareness: Benghazi Case Study. Jeff Boleng, PhD; Marc Novakouski; Gene... a command and control element at the CIA compound that would have been monitoring OSINT and other sources of intelligence before the attack and...

  17. Studies on the method of producing radiographic 170Tm source

    International Nuclear Information System (INIS)

    Maeda, Sho

    1976-08-01

A method of producing a radiographic 170Tm source has been studied, including target preparation, neutron irradiation, handling of the irradiated target in the hot cell, and source capsules. On the basis of the results, practical 170Tm radiographic sources (29-49 Ci, with pellets 3 mm in diameter and 3 mm long) were produced in trial by neutron irradiation in the JMTR. (auth.)

  18. THE CHANDRA SOURCE CATALOG

    International Nuclear Information System (INIS)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Harbo, Peter N.; He Xiangqun; Karovska, Margarita; Kashyap, Vinay L.; Davis, John E.; Houck, John C.; Hall, Diane M.

    2010-01-01

The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  19. The Chandra Source Catalog

    Science.gov (United States)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2010-07-01

The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  20. Anomaly metrics to differentiate threat sources from benign sources in primary vehicle screening.

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, Israel Dov; Mengesha, Wondwosen

    2011-09-01

Discrimination of benign sources from threat sources at Ports of Entry (POE) is of great importance for efficient screening of cargo and vehicles using Radiation Portal Monitors (RPM). Currently, the ability to distinguish these radiological sources is seriously hampered by the energy resolution of the deployed RPMs. As naturally occurring radioactive materials (NORM) are ubiquitous in commerce, false alarms are problematic: they require additional resources in secondary inspection, in addition to their impact on commerce. To increase the sensitivity of such detection systems without increasing false alarm rates, alarm metrics need to incorporate the ability to distinguish benign and threat sources. Principal component analysis (PCA) and clustering techniques were implemented in the present study and investigated for their potential to lower false alarm rates and/or increase sensitivity to weaker threat sources without loss of specificity. Results of the investigation demonstrated improved sensitivity and specificity in discriminating benign sources from threat sources.
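A minimal sketch of the PCA-plus-clustering idea on toy spectra is shown below; the spectral shapes, peak positions, and counting statistics are illustrative assumptions, not RPM data or the report's actual metrics.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
channels = np.arange(64)

def spectrum(peak, width):
    """Toy gamma spectrum: Gaussian photopeak on a decaying continuum,
    with Poisson counting noise."""
    shape = np.exp(-channels / 40) + np.exp(-(channels - peak)**2 / (2 * width**2))
    return rng.poisson(200 * shape)

# Synthetic "NORM-like" (low-energy peak) vs "threat-like" (high-energy
# peak) spectra; peak channels are illustrative assumptions.
X = np.array([spectrum(15, 3) for _ in range(40)]
             + [spectrum(45, 3) for _ in range(40)], dtype=float)
X /= X.sum(axis=1, keepdims=True)   # normalize away gross count rate

# Project onto two principal components, then cluster in that space
scores = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
purity = max((labels[:40] == 0).mean(), (labels[:40] == 1).mean())
print(f"cluster purity for first class: {purity:.2f}")
```

The point of the normalization step is that an anomaly metric should respond to spectral shape, not to total count rate, which NORM cargo can dominate.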

  1. Sensitivity analysis of the relationship between disease occurrence and distance from a putative source of pollution

    Directory of Open Access Journals (Sweden)

    Emanuela Dreassi

    2008-05-01

Full Text Available The relation between disease risk and a point source of pollution is usually investigated using distance from the source as a proxy of exposure. The analysis may be based on case-control data or on aggregated data. The definition of the function relating risk of disease and distance is critical, both in a classical and in a Bayesian framework, because the likelihood is usually very flat, even with large amounts of data. In this paper we investigate how the specification of the function relating risk of disease to distance from the source, and of the prior distributions on the parameters of the function, affects the results when case-control data and Bayesian methods are used. We consider different popular parametric models for the risk-distance function in a Bayesian approach, comparing estimates with those derived by maximum likelihood. As an example, we have analyzed the relationship between a putative source of environmental pollution (an asbestos cement plant) and the occurrence of pleural malignant mesothelioma in the area of Casale Monferrato (Italy) in 1987-1993. Risk of pleural malignant mesothelioma turns out to be strongly related to distance from the asbestos cement plant. However, as the models appeared to be sensitive to modeling choices, we suggest that any analysis of disease risk around a putative source should be integrated with a careful sensitivity analysis and possibly with prior knowledge. The choice of prior distribution is extremely important and should be based on epidemiological considerations.

  2. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

The toxicity of heavy metals from industrialization poses critical concerns, and analysis of sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently in the whole region and in sub-regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source, and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of the case study show that, for children (a sensitive population, with school and residential areas as the major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emission, and agricultural activity. The new models and results of this research present effective spatial information and a useful model for quantifying the hazards of source categories and human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. The Human Face of Health News: A Multi-Method Analysis of Sourcing Practices in Health-Related News in Belgian Magazines.

    Science.gov (United States)

    De Dobbelaer, Rebeca; Van Leuven, Sarah; Raeymaeckers, Karin

    2018-05-01

Health journalists are central gatekeepers who select, frame, and communicate health news to a broad audience, but the selection and content of health news are also influenced by the sources journalists rely on (Hinnant, Len-Rios, & Oh, 2012). In this paper, we examine whether traditional elitist sourcing practices (e.g., research institutions, government) are still important in a digitalized news environment where bottom-up non-elite actors (e.g., patients, civil society organizations) can act as producers (Bruns, 2003). Our main goal, therefore, is to detect whether sourcing practices in health journalism can be linked with strategies of empowerment. We use a multi-method approach combining quantitative and qualitative research methods. First, two content analyses were developed to examine health-related news in Belgian magazines (popular weeklies, health magazines, general interest magazines, and women's magazines). The analyses highlight sourcing practices as visible in the texts and give an overview of the different stakeholders represented as sources. In the first wave, the content analysis included 1047 health-related news items in 19 different Belgian magazines (March-June 2013). In the second wave, a smaller sample of 202 health-related items in 10 magazines was studied for follow-up reasons (February 2015). Second, to contextualize the findings of the quantitative analysis, we interviewed 16 health journalists and editors-in-chief. The results illustrate that journalists consider patients and blogs relevant sources for health news; nonetheless, elitist sourcing practices still prevail at the cost of bottom-up communication. However, the in-depth interviews demonstrate that journalists increasingly consult patients and civil society actors to give health issues a more "human" face. Importantly, the study reveals that this strategy is applied differently by the various types of magazines.
While popular weeklies and women's magazines give a voice to

  4. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    International Nuclear Information System (INIS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-01-01

The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, triggering a tsunami. These events caused many casualties and material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of future earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. Stages of the research include a literature study, collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the values of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented in the form of a map with a return period of 2475 years, and by earthquake hazard curves for the cities of Medan and Banda Aceh. (paper)
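The 2475-year return period quoted above is the standard Poisson-model equivalent of a 2% probability of exceedance in 50 years, as this small check shows:

```python
import math

# Under a Poisson occurrence model, the probability of at least one
# exceedance of a ground-motion level in t years is
#   P = 1 - exp(-t / T_R),  where T_R is the return period.
# Inverting for T_R:
def return_period(p_exceed, t_years):
    return -t_years / math.log(1.0 - p_exceed)

T_R = return_period(0.02, 50.0)
print(f"2% in 50 years -> return period of {T_R:.0f} years")
```

The same relation gives the other common design level, 10% in 50 years, a return period of about 475 years.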

  5. Dominant seismic sources for the cities in South Sumatra

    Science.gov (United States)

    Sunardi, Bambang; Sakya, Andi Eka; Masturyono, Murjaya, Jaya; Rohadi, Supriyanto; Sulastri, Putra, Ade Surya

    2017-07-01

The subduction zone along the west of Sumatra and the Sumatran fault zone are active seismic sources. Seismotectonically, South Sumatra can be affected by earthquakes triggered by these seismic sources. This paper discusses the contribution of each seismic source to the earthquake hazard for the cities of Palembang, Prabumulih, Banyuasin, Ogan Ilir, Ogan Komering Ilir, South Oku, Musi Rawas and Empat Lawang. These hazards are presented in the form of seismic hazard curves. The study was conducted using Probabilistic Seismic Hazard Analysis (PSHA) at 2% probability of exceedance in 50 years. Seismic sources used in the analysis included the megathrust zones M2 of Sumatra and South Sumatra, background seismic sources, and shallow crustal seismic sources consisting of the Ketaun, Musi, Manna and Kumering faults. The results of the study showed that for cities relatively far from the seismic sources, the subduction/megathrust seismic source at depths ≤ 50 km contributed greatly to the seismic hazard, while the other areas were dominated by deep background seismic sources at depths of more than 100 km.

  6. Spectrum analysis of a voltage source converter due to semiconductor voltage drops

    DEFF Research Database (Denmark)

    Rasmussen, Tonny Wederberg; Eltouki, Mustafa

    2017-01-01

It is known that power electronic voltage source converters are non-ideal. This paper presents a state-of-the-art review of the effect of the semiconductor voltage drop on the output voltage spectrum, using a single-phase H-bridge two-level converter topology with naturally sampled pulse width modulation. The paper describes the analysis of the output voltage spectrum when the semiconductor voltage drop is added. The results of the analysis of the spectral contribution including and excluding the semiconductor voltage drop reveal good agreement between the theoretical results, simulations and laboratory
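The effect described above can be reproduced numerically: adding a constant semiconductor voltage drop to an ideal naturally sampled PWM waveform introduces low-order odd harmonics that the ideal spectrum lacks. All circuit values below (DC link, device drop, frequencies, and the crude in-phase current model) are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

fs = 10_000_000           # simulation sample rate, samples/s
f0, fc = 50, 5000         # fundamental and triangle-carrier frequency, Hz
t = np.arange(0, 0.2, 1 / fs)

m = 0.8 * np.sin(2 * np.pi * f0 * t)             # modulating wave
carrier = 2 * np.abs((fc * t) % 1.0 - 0.5) * 2 - 1   # triangle in [-1, 1]

Vdc, Vdrop = 400.0, 2.0   # DC link and per-device drop (assumed values)
v_ideal = Vdc * np.where(m > carrier, 1.0, -1.0)     # ideal two-level output
# Crude drop model: the two conducting devices subtract a drop opposing the
# load current; assume current in phase with the fundamental for illustration.
i_sign = np.sign(np.sin(2 * np.pi * f0 * t))
v_real = v_ideal - 2 * Vdrop * i_sign

def harmonic(v, k):
    """Magnitude of the k-th harmonic of f0, via FFT over whole periods."""
    V = np.fft.rfft(v) / len(v) * 2
    freqs = np.fft.rfftfreq(len(v), 1 / fs)
    return abs(V[np.argmin(abs(freqs - k * f0))])

for k in (1, 3, 5):
    print(f"h{k}: ideal {harmonic(v_ideal, k):.2f} V, "
          f"with drop {harmonic(v_real, k):.2f} V")
```

The fundamental is barely changed, but the drop term is effectively a square wave at the fundamental frequency, so 3rd, 5th, ... harmonics of amplitude roughly 4·(2·Vdrop)/(kπ) appear at baseband, which is the low-frequency distortion mechanism the paper analyzes.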

  7. Analysis of the environmental behavior of farmers for non-point source pollution control and management in a water source protection area in China.

    Science.gov (United States)

    Wang, Yandong; Yang, Jun; Liang, Jiping; Qiang, Yanfang; Fang, Shanqi; Gao, Minxue; Fan, Xiaoyu; Yang, Gaihe; Zhang, Baowen; Feng, Yongzhong

    2018-08-15

The environmental behavior of farmers plays an important role in exploring the causes of non-point source pollution and taking scientific control and management measures. Based on the theory of planned behavior (TPB), the present study investigated the environmental behavior of farmers in the Water Source Area of the Middle Route of the South-to-North Water Diversion Project in China. Results showed that TPB could explain farmers' environmental behavior (SMC=0.26) and intention (SMC=0.36) well. Furthermore, the farmers' attitude towards behavior (AB), subjective norm (SN), and perceived behavioral control (PBC) positively and significantly influenced their environmental intention; their environmental intention further impacted their behavior. SN proved to be the main factor indirectly influencing the farmers' environmental behavior, while PBC had no significant direct effect. Moreover, a moderated mediation analysis of environmental knowledge on the TPB constructs was conducted, with gender and age as control variables. It demonstrated that gender had a significant controlling effect on environmental behavior; that is, males engage in more environmentally friendly behaviors. However, age showed a significant negative controlling effect on pro-environmental intention and an opposite effect on pro-environmental behavior. In addition, environmental knowledge could negatively moderate the relationship between PBC and environmental intention: PBC had a greater impact on the environmental intention of farmers with poor environmental knowledge, compared to those with ample environmental knowledge. Altogether, the present study could provide a theoretical basis for non-point source pollution control and management. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. The impacts of source structure on geodetic parameters demonstrated by the radio source 3C371

    Science.gov (United States)

    Xu, Ming H.; Heinkelmann, Robert; Anderson, James M.; Mora-Diaz, Julian; Karbon, Maria; Schuh, Harald; Wang, Guang L.

    2017-07-01

Closure quantities measured by very-long-baseline interferometry (VLBI) observations are independent of instrumental and propagation instabilities and antenna gain factors, but are sensitive to source structure. A new method is proposed to calculate a structure index based on the median values of closure quantities rather than the brightness distribution of a source. The results are comparable to structure indices based on imaging observations at other epochs and demonstrate the flexibility of deriving structure indices from exactly the same observations as used for geodetic analysis, without imaging analysis. A three-component model for the structure of source 3C371 is developed by model-fitting closure phases. It provides a real case for tracing how the structure effect identified by closure phases in the same observations as the delay observables affects the geodetic analysis, and for investigating which geodetic parameters are corrupted, and to what extent, by the structure effect. Using the resulting structure correction based on the three-component model of source 3C371, two solutions, with and without correction of the structure effect, are made. With corrections, the overall rms of this source is reduced by 1 ps, and the impacts of the structure effect introduced by this single source are up to 1.4 mm on station positions and up to 4.4 microarcseconds on Earth orientation parameters. This study is a starting point for handling the source structure effect on geodetic VLBI from geodetic sessions themselves.
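The key property the abstract relies on, that station-based phase errors cancel around a closed triangle of baselines and leave only the source-structure term, can be sketched numerically (a toy illustration with made-up phases, not the authors' analysis):

```python
import numpy as np

rng = np.random.default_rng(5)

# Arbitrary station-based phase errors (instrumental + propagation) and a
# hypothetical source-structure phase on each of the three baselines.
station_err = rng.uniform(-np.pi, np.pi, 3)
structure = np.array([0.3, -0.1, 0.25])

# The phase on baseline (i, j) picks up err_i - err_j plus the structure term.
phi_12 = structure[0] + station_err[0] - station_err[1]
phi_23 = structure[1] + station_err[1] - station_err[2]
phi_31 = structure[2] + station_err[2] - station_err[0]

# Summing around the closed triangle cancels the station errors exactly,
# leaving only the structure contribution.
closure = phi_12 + phi_23 + phi_31
print(round(closure, 6), round(structure.sum(), 6))
```

This is why closure phases can flag structure in the very same observations used for the geodetic solution.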

  9. Regional Moment Tensor Source-Type Discrimination Analysis

    Science.gov (United States)

    2015-11-16

unique normalized eigenvalues (black ‘+’ signs) or unique source-types on (a) the fundamental Lune (Tape and Tape, 2012a,b), and (b) on the Hudson source-type plot (Hudson… Solutions color-coded by variance reduction (VR) presented on the Tape and Tape (2012a) and Tape and Tape (2012b) Lune. The white circle…

  10. Application of Open Source Technologies for Oceanographic Data Analysis

    Science.gov (United States)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project, which uses NEXUS as the core of an oceanographic anomaly detection service and web portal. We call it OceanXtremes.
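The chunked-tile idea behind NEXUS can be sketched in a few lines: split a (time, lat, lon) array into spatial tiles that could be stored and processed independently, then assemble an area-averaged time series from per-tile partial sums. This is a toy in-memory illustration; the array and tile size are hypothetical, and real NEXUS stores its tiles in a NoSQL backend:

```python
import numpy as np

# Hypothetical (time, lat, lon) array of gridded ocean values.
rng = np.random.default_rng(1)
data = rng.random((10, 8, 8))

def iter_tiles(arr, size):
    """Yield spatial tiles (chunks) that could be stored/processed independently."""
    for i in range(0, arr.shape[1], size):
        for j in range(0, arr.shape[2], size):
            yield arr[:, i:i + size, j:j + size]

# On-the-fly time series: combine per-tile partial sums into an area mean for
# each time step, without ever touching the full grid as one file.
sums = np.zeros(data.shape[0])
count = 0
for tile in iter_tiles(data, 4):
    sums += tile.sum(axis=(1, 2))
    count += tile.shape[1] * tile.shape[2]
series = sums / count
print(series.round(3))
```

Because each tile contributes an independent partial sum, the per-tile work parallelizes naturally, which is the point of the chunked design.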

  11. A Sensitivity Study for an Evaluation of Input Parameters Effect on a Preliminary Probabilistic Tsunami Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyun-Me; Kim, Min Kyu; Choi, In-Kil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Dong-Hoon [Chonnam National University, Gwangju (Korea, Republic of)

    2014-10-15

Tsunami hazard analysis has been based on seismic hazard analysis, which has been performed using both deterministic and probabilistic methods. To account for uncertainties, the probabilistic method has been regarded as the more attractive approach; the various parameters and their weights are handled with a logic tree. Because many parameters enter the hazard analysis, their uncertainties should be quantified by sensitivity analysis. To apply probabilistic tsunami hazard analysis, a preliminary study for the Ulchin NPP site had been performed, using information on the fault sources published by the Atomic Energy Society of Japan (AESJ). The tsunami propagation was simulated using TSUNAMI 1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), and the wave parameters were estimated from the simulation results. In this study, a sensitivity analysis for the fault sources selected in the previous studies has been performed. To analyze the effect of the parameters, a sensitivity analysis for the E3 fault source published by AESJ was performed. The effects of the recurrence interval, the potential maximum magnitude, and the beta value were determined from the sensitivity analysis results: the level of annual exceedance probability was affected by the recurrence interval, while wave heights were influenced by the potential maximum magnitude and the beta value. In the future, a sensitivity analysis for all fault sources in the western part of Japan published by AESJ will be performed.

  12. Decoupling and Sources of Structural Transformation of East Asian Economies: An International Input-Output Decomposition Analysis

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Ko

    2014-03-01

Full Text Available This study aims to answer two questions using input-output decomposition analysis: (1) Have emerging Asian economies decoupled? (2) What are the sources of structural changes in the gross outputs and value-added of emerging Asian economies related to the first question? The main findings of the study are as follows: First, since 1990, there has been a trend of increasing dependence on exports to extra-regional economies such as the G3 and the ROW, indicating no sign of "decoupling", but rather an increasing integration of emerging Asian countries into global trade. Second, there is a contrasting feature in the sources of structural changes between non-China emerging Asia and China. The dependence of non-China emerging Asia on intra-regional trade has increased in line with strengthening economic integration in East Asia, whereas China has disintegrated from the region. Therefore, it can be said that China has contributed to the absence of decoupling of emerging Asia as a whole.

  13. COMBINING SOURCES IN STABLE ISOTOPE MIXING MODELS: ALTERNATIVE METHODS

    Science.gov (United States)

    Stable isotope mixing models are often used to quantify source contributions to a mixture. Examples include pollution source identification; trophic web studies; analysis of water sources for soils, plants, or water bodies; and many others. A common problem is having too many s...

  14. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  15. Sources of Strategic Information in Farm Management in Poland. Study Results

    Directory of Open Access Journals (Sweden)

    Jacek Jaworski

    2017-03-01

Full Text Available Purposes: The main goal of the paper is to determine the significance of selected sources of strategic information used by Polish farmers in decision making. In addition, an attempt was made to determine which traits of the farmer and his farm impact the evaluation of those sources. Methodology: Data was gathered using the questionnaire method and analysed with standard tools of descriptive statistics. Findings: The farmers deemed personalised sources of strategic information the most important, especially agricultural advisers, input suppliers and buyers of agricultural products. From among institutional (non-personalised) sources, local government and the chamber of agriculture were significant. Business information agencies and survey companies are the least important sources for farmers. The characteristics of the surroundings of the farm – specifically, its geographic location and the size of the settlement where it is located – proved to have the widest impact on the evaluation of the sources included in the study. From among the organisational factors, only farm size has a significant impact. Research limitations/implications: The study was confined to a representative group of farmers in Poland. A closed list of sources of strategic information was used. Originality/value: The study results contribute to the knowledge on the functioning of Polish agriculture and may also be used in comparative studies characterising this sector's diversity within Europe. They can in turn contribute to properly focusing support for the policy of balanced agriculture development in the EU.

  16. Study of cold and hot sources in a research reactor. (Physics, specifications, operation, utilization)

    International Nuclear Information System (INIS)

    Safieh, J.

    1982-10-01

A brief description of the reactor, sources and experimental channels (ORPHEE being taken as an example) is first given. The first part deals with the hot neutron source, mainly made of a graphite block to be brought to a temperature of 1500 K by nuclear heating. The present study focused on the determination, with the code MERCURE IV, of the heat sources generated in the graphite block. From these results the spatial distribution of temperatures has been calculated with two different methods. Mechanical and thermal stresses have been calculated for the hot points. Then, the outlet neutron spectrum is determined by means of the code APOLLO. Finally, the operation of the device is presented and the risks and safety measures are given. The second part deals with cold neutron sources, comprising mainly a cold moderator (liquid hydrogen at 20.4 K). The helium coolant circuit liquefies the hydrogen by means of heat exchange in a condenser. Cold neutron yield calculations are developed by means of the code THERMOS in plane and cylindrical geometries. Heat sources generated by nuclear radiation are calculated. A detailed description of the device and its coolant circuit is given, and a risk analysis is finally presented. The third part deals with the role of cold, thermal and hot neutrons in the study of matter and its dynamics. The technical means needed to obtain a monochromatic beam for diffraction experiments are recalled, emphasizing the interest of these neutrons with regard to X radiation. Then, cold neutron guides are dealt with. Finally, the efficiency of two neutron guides is calculated. 78 refs [fr]

  17. Problems of accuracy and sources of error in trace analysis of elements

    International Nuclear Information System (INIS)

    Porat, Ze'ev.

    1995-07-01

The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be grouped into three main categories: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  18. Problems of accuracy and sources of error in trace analysis of elements

    Energy Technology Data Exchange (ETDEWEB)

Porat, Ze'ev

    1995-07-01

The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be grouped into three main categories: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  19. The Role of Mother in Informing Girls About Puberty: A Meta-Analysis Study

    Science.gov (United States)

    Sooki, Zahra; Shariati, Mohammad; Chaman, Reza; Khosravi, Ahmad; Effatpanah, Mohammad; Keramat, Afsaneh

    2016-01-01

Context Family, especially the mother, has the most important role in the education, transfer of information, and health behaviors of girls in order for them to have a healthy transition through the critical stage of puberty, but there are different views in this regard. Objectives Considering the various findings about the source of information about puberty, a meta-analysis study was conducted to investigate the extent of the mother's role in informing girls about puberty. Data Sources This meta-analysis study was based on English articles published from 2000 to February 2015 in the Scopus, PubMed, and ScienceDirect databases and on Persian articles in the SID, Magiran, and IranMedex databases, with determined keywords and their MeSH equivalents. Study Selection Quantitative cross-sectional articles were extracted by two independent researchers and finally 46 articles were selected based on inclusion criteria. The STROBE checklist was used for evaluation of the studies. Data Extraction The percentage of mothers as the current and preferred source of gaining information about the process of puberty, menarche, and menstruation from the perspective of adolescent girls was extracted from the articles. The results of the studies were analyzed using meta-analysis (random effects model) and the studies' heterogeneity was analyzed using the I² index. Variance between studies was analyzed using tau-squared (τ²) and Review Manager 5 software. Results The results showed that, from the perspective of teenage girls in Iran and other countries, in 56% of cases the mother was the current source of information about the process of puberty, menarche, and menstruation. The preferred source of information about the process of puberty, menarche, and menstruation was the mother in all studies at 60% (Iran 57%, other countries 66%). Conclusions According to the findings of this study, it is essential that health professionals and officials of the ministry of health train
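A random-effects pooling of study proportions of the kind described (a DerSimonian-Laird estimate of the between-study variance τ², with an I² heterogeneity index) can be sketched as follows; the per-study counts are invented for illustration and are not the meta-analysis data:

```python
import numpy as np

# Invented per-study counts: girls naming the mother as their information source.
events = np.array([56, 140, 25, 230, 60])
n = np.array([100, 200, 60, 350, 140])

p = events / n
var = p * (1 - p) / n          # within-study variance of each proportion
w = 1 / var                    # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of the between-study variance tau^2.
p_fixed = np.sum(w * p) / np.sum(w)
Q = np.sum(w * (p - p_fixed) ** 2)
df = len(p) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects pooling and the I^2 heterogeneity index.
w_re = 1 / (var + tau2)
p_pooled = np.sum(w_re * p) / np.sum(w_re)
I2 = max(0.0, (Q - df) / Q) * 100
print(round(p_pooled, 3), round(tau2, 4), round(I2, 1))
```

Adding τ² to each study's variance shrinks the weight differences between large and small studies, which is what distinguishes the random-effects pooled estimate from a fixed-effect one.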

  20. Source Apportionment and Influencing Factor Analysis of Residential Indoor PM2.5 in Beijing

    Science.gov (United States)

    Yang, Yibing; Liu, Liu; Xu, Chunyu; Li, Na; Liu, Zhe; Wang, Qin; Xu, Dongqun

    2018-01-01

In order to identify the sources of indoor PM2.5 and to check which factors influence the concentrations of indoor PM2.5 and its chemical elements, indoor concentrations of PM2.5 and its related elements in residential houses in Beijing were explored. Indoor and outdoor PM2.5 samples that were monitored continuously for one week were collected. Indoor and outdoor concentrations of PM2.5 and 15 elements (Al, As, Ca, Cd, Cu, Fe, K, Mg, Mn, Na, Pb, Se, Tl, V, Zn) were calculated and compared. The median indoor concentration of PM2.5 was 57.64 μg/m³. For elements in indoor PM2.5, Cd and As may be sensitive to indoor smoking; Zn, Ca and Al may be related to indoor sources other than smoking; Pb, V and Se may mainly come from outdoors. Five factors were extracted for indoor PM2.5 by factor analysis, explaining 76.8% of total variance; outdoor sources contributed more than indoor sources. Multiple linear regression analysis for indoor PM2.5, Cd and Pb was performed. Indoor PM2.5 was influenced by factors including outdoor PM2.5, smoking during sampling, outdoor temperature and time of air conditioner use. Indoor Cd was affected by factors including smoking during sampling, outdoor Cd and building age. Indoor Pb concentration was associated with factors including outdoor Pb, time of window opening per day, building age and relative humidity (RH). In conclusion, indoor PM2.5 mainly comes from outdoor sources, but the contributions of indoor sources also cannot be ignored. Factors associated with indoor and outdoor air exchange can influence the concentrations of indoor PM2.5 and its constituents. PMID:29621164

  1. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and gamma attenuation factors calculated using MCNP-5. Both relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
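The relative method described, normalizing the prompt-gamma count rate by the measured flux and a calculated attenuation factor before comparing against a standard of known concentration, reduces to simple arithmetic (all numbers below are hypothetical):

```python
def concentration(rate, flux, atten, rate_std, flux_std, atten_std, conc_std):
    """Relative method: flux- and attenuation-corrected rate vs. a standard."""
    norm = (rate / flux) / atten          # corrected count rate of the sample
    norm_std = (rate_std / flux_std) / atten_std  # same for the standard
    return conc_std * norm / norm_std

# A sample counting twice as fast as a 10 ppm standard, at equal flux and
# attenuation, is reported at 20 ppm.
print(concentration(200.0, 1e4, 0.9, 100.0, 1e4, 0.9, 10.0))  # → 20.0
```

The flux and attenuation corrections matter precisely when the sample and standard differ in geometry or matrix, which is the motivation for the MCNP-calculated factors.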

  2. Arguments and sources on Italian online forums on childhood vaccinations: Results of a content analysis.

    Science.gov (United States)

    Fadda, Marta; Allam, Ahmed; Schulz, Peter J

    2015-12-16

Despite being committed to the immunization agenda set by the WHO, Italy is currently experiencing decreasing vaccination rates and increasing incidence of vaccine-preventable diseases. Our aim is to analyze Italian online debates on pediatric immunizations through a content-analytic approach in order to quantitatively evaluate and summarize users' arguments and information sources. Threads were extracted from 3 Italian forums. Threads had to include the keyword Vaccin* in the title, focus on childhood vaccination, and include at least 10 posts. They had to have been started between 2008 and June 2014. High inter-coder reliability was achieved. Exploratory analysis using k-means clustering was performed to identify users' posting patterns for arguments about vaccines and sources. The analysis included 6544 posts mentioning 6223 arguments about pediatric vaccinations and citing 4067 sources. The analysis of argument posting patterns included users who published a sufficient number of posts; they generated 85% of all arguments on the forum. Three groups with dominant posting patterns were identified: (1) an anti-vaccination group (n=280) posted arguments against vaccinations, (2) a general pro-vaccination group (n=222) posted substantially diverse arguments supporting vaccination and (3) a safety-focused pro-vaccination group (n=158) mainly forwarded arguments that questioned the negative side effects of vaccination. The anti-vaccination group was shown to be more active than the others, citing multiple sources, personal experience and the media as its sources of information. Medical professionals were among the cited sources of all three groups, suggesting that vaccination-adverse professionals are gaining attention. Knowing what information is shared online on the topic of pediatric vaccinations could shed light on why immunization rates have been decreasing and what strategies would be best suited to address parental concerns. This suggests there is a high need for
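The exploratory k-means step described above groups users by their argument-posting profiles. A self-contained sketch on synthetic per-user argument counts (group sizes and frequencies are invented; the paper's coding scheme is far richer):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic per-user argument counts; columns = (anti-vaccination arguments,
# general pro arguments, safety-focused pro arguments).
anti = rng.poisson([8, 1, 1], size=(60, 3))
pro = rng.poisson([1, 7, 2], size=(60, 3))
safety = rng.poisson([1, 2, 7], size=(60, 3))
X = np.vstack([anti, pro, safety]).astype(float)

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign to nearest center, then recompute means."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, k=3)
print(centers.round(1))  # each row should sit near one group's mean profile
```

With well-separated profiles like these, the recovered centers approximate the three posting patterns; on real coded data, the cluster count and interpretation require validation.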

  3. Study of liquid hydrogen and liquid deuterium cold neutron sources

    International Nuclear Information System (INIS)

    Harig, H.D.

    1969-01-01

In view of the planned cold neutron source for a high-flux reactor (maximum thermal flux of about 10¹⁵ n/cm²·s), an experimental study of several cold sources of liquid hydrogen and liquid deuterium has been made in a low-power reactor (100 kW, about 10¹² n/cm²·s). We have investigated: cold neutron sources of liquid hydrogen shaped as annular layers of different thickness, using normal liquid hydrogen as well as hydrogen with a high para-hydrogen percentage; and cold neutron sources of liquid deuterium in cylinders of 18 and 38 cm diameter. In the latter case the sources could be placed at different positions relative to the reactor core within the heavy water reflector. This report gives a general description of the experimental device and deals in more detail with the design of the cryogenic systems. Then, the measured results are communicated, interpreted and finally compared with those of a theoretical study of the same cold moderators that were the subject of the experimental investigation. (authors) [fr]

  4. Experimental study of adsorption chiller driven by variable heat source

    Energy Technology Data Exchange (ETDEWEB)

    Wang, D.C.; Wang, Y.J.; Zhang, J.P.; Tian, X.L. [College of Electromechanical Engineering, Qingdao University, Qingdao 266071 (China); Wu, J.Y. [Institute of Refrigeration and Cryogenics, Shanghai Jiao Tong University, Shanghai 200030 (China)

    2008-05-15

    A silica gel-water adsorption chiller has been developed in recent years and has been applied in an air conditioning system driven by solar energy. The heat source used to drive the adsorption chiller is variable at any moment because the solar radiation intensity or the waste heat from engines varies frequently. An adsorption cooling system may be badly impacted by a variable heat source with temperature variations in a large range. In this work, a silica gel-water adsorption chiller driven by a variable heat source is experimentally studied. The influences of the variable heat source on the performance of the chiller are analyzed, especially for a continuous temperature increase process and a continuous temperature decrease process of the heat source. As an example, the dynamic characteristics of the heat source are also analyzed when solar energy is taken as the heat source of the adsorption chiller. According to the experimental results for the adsorption chiller and the characteristics of the heat source from solar energy, control strategies of the adsorption chiller driven by solar energy are proposed. (author)

  5. Experimental study of adsorption chiller driven by variable heat source

    International Nuclear Information System (INIS)

    Wang, D.C.; Wang, Y.J.; Zhang, J.P.; Tian, X.L.; Wu, J.Y.

    2008-01-01

    A silica gel-water adsorption chiller has been developed in recent years and has been applied in an air conditioning system driven by solar energy. The heat source used to drive the adsorption chiller is variable at any moment because the solar radiation intensity or the waste heat from engines varies frequently. An adsorption cooling system may be badly impacted by a variable heat source with temperature variations in a large range. In this work, a silica gel-water adsorption chiller driven by a variable heat source is experimentally studied. The influences of the variable heat source on the performance of the chiller are analyzed, especially for a continuous temperature increase process and a continuous temperature decrease process of the heat source. As an example, the dynamic characteristics of the heat source are also analyzed when solar energy is taken as the heat source of the adsorption chiller. According to the experimental results for the adsorption chiller and the characteristics of the heat source from solar energy, control strategies of the adsorption chiller driven by solar energy are proposed

  6. Study of extragalactic sources with H.E.S.S

    International Nuclear Information System (INIS)

    Giebels, Berrie

    2007-01-01

The field of Very High Energy (VHE) γ-ray emitting extragalactic sources has evolved considerably since the new generation of atmospheric Cerenkov telescopes (ACTs) of improved sensitivity, such as the H.E.S.S. array and the MAGIC ACT, started operating. This has led to a wealth of new clues about emission mechanisms at high energy through the discovery of new sources, more accurate spectra and temporal studies of previously known sources, and simultaneous multi-wavelength (MWL) campaigns, since broad-band variability is a key probe of the underlying physical mechanisms at play. The fact that some of these new sources are located at redshifts close to z ∼ 0.2 makes them powerful probes of the Extragalactic Background Light (EBL) through the attenuation of γ-rays above 100 GeV

  7. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    Science.gov (United States)

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.

  8. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    International Nuclear Information System (INIS)

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-01-01

We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.
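The core MCMC idea, sampling source parameters in proportion to their posterior probability and summarizing the sample, can be illustrated with a one-parameter toy problem (the forward model below is a made-up stand-in, far simpler than a real MEG lead field):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model: a source at position x produces field 1/(1+(s-x)^2)
# at each sensor position s.
sensors = np.linspace(-2.0, 2.0, 9)

def forward(x):
    return 1.0 / (1.0 + (sensors - x) ** 2)

true_x = 0.7
noise_sd = 0.05
data = forward(true_x) + rng.normal(0.0, noise_sd, sensors.size)

def log_post(x):
    if not -3.0 < x < 3.0:            # flat prior on (-3, 3)
        return -np.inf
    resid = data - forward(x)
    return -0.5 * np.sum(resid ** 2) / noise_sd ** 2

# Metropolis sampling: the chain visits source positions in proportion to
# their posterior probability; the sample characterizes the solution set.
x, lp, samples = 0.0, log_post(0.0), []
for _ in range(5000):
    prop = x + rng.normal(0.0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        x, lp = prop, lp_prop
    samples.append(x)
post = np.array(samples[1000:])       # discard burn-in
print(round(post.mean(), 2))
```

The posterior sample concentrates near the true source position, and its spread quantifies the uncertainty, which is the inference the abstract describes for the far higher-dimensional MEG model.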

  9. Dense plasma focus PACO as a hard X-ray emitter: a study on the radiation source

    OpenAIRE

    Supán, L.; Guichón, S.; Milanese, Maria Magdalena; Niedbalski, Jorge Julio; Moroso, Roberto Luis; Acuña, H.; Malamud, Florencia

    2016-01-01

The radiation in the X-ray range detected outside the vacuum chamber of the dense plasma focus (DPF) PACO is produced in the anode zone. The zone of emission is studied in a shot-to-shot analysis, using pure deuterium as the filling gas. We present a diagnostic method to determine the place and size of the hard X-ray source by image analysis of high-density radiography plates.

  10. Nuclear microprobe analysis and source apportionment of individual atmospheric aerosol particles

    International Nuclear Information System (INIS)

    Artaxo, P.; Rabello, M.L.C.; Watt, F.; Grime, G.; Swietlicki, E.

    1993-01-01

In atmospheric aerosol research, one key issue is to determine the sources of the airborne particles. Bulk PIXE analysis coupled with receptor modeling provides a useful but limited view of the aerosol sources influencing one particular site or sample. The scanning nuclear microprobe (SNM) technique is a microanalytical technique that gives unique information on individual aerosol particles. In the SNM analyses, a 1.0 μm, 2.4 MeV proton beam from the Oxford SNM was used. The trace elements with Z>11 were measured by the particle induced X-ray emission (PIXE) method with detection limits in the 1-10 ppm range. Carbon, nitrogen and oxygen were measured simultaneously using Rutherford backscattering spectrometry (RBS). Atmospheric aerosol particles were collected at the Brazilian Antarctic Station and at biomass burning sites in the Amazon basin tropical rain forest in Brazil. In the Antarctic samples, sea-salt aerosol particles clearly predominated, with NaCl and CaSO₄ as major compounds together with several trace elements such as Al, Si, P, K, Mn, Fe, Ni, Cu, Zn, Br, Sr, and Pb. Factor analysis of the elemental data showed the presence of four components: 1) soil dust particles; 2) NaCl particles; 3) CaSO₄ with Sr; and 4) Br and Mg. Strontium, observed at 20-100 ppm levels, was always present in the CaSO₄ particles. The hierarchical cluster procedure gave results similar to the ones obtained through factor analysis. For the tropical rain forest biomass burning aerosol emissions, biogenic particles with a high organic content dominate the particle population, while K, P, Ca, Mg, Zn, and Si are the dominant elements. Zinc at 10-200 ppm is present in biogenic particles rich in P and K. The quantitative aspects and excellent detection limits make SNM analysis of individual aerosol particles a very powerful analytical tool. (orig.)
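The factor-analysis step used above to separate source signatures can be sketched via principal-component extraction from the correlation matrix of synthetic element concentrations (the elements and loadings are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 300

# Two hidden "sources" drive invented element concentrations:
# soil dust loads Al, Si, Fe; sea salt loads Na, Cl.
soil = rng.lognormal(0.0, 1.0, n)
salt = rng.lognormal(0.0, 1.0, n)
X = np.column_stack([2.0 * soil, 1.5 * soil, soil,   # "Al", "Si", "Fe"
                     3.0 * salt, 2.8 * salt])        # "Na", "Cl"
X += rng.normal(0.0, 0.1, X.shape)

# Principal-component extraction from the correlation matrix, a common
# first step of exploratory factor analysis.
Z = (X - X.mean(0)) / X.std(0)
evals, evecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
explained = np.sort(evals)[::-1] / evals.sum()
print(round(float(explained[:2].sum()) * 100, 1), "% of variance in 2 factors")
```

With two underlying sources, two components capture almost all of the variance; on real particle data, the number of retained factors is what identifies the candidate sources.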

  11. Python Materials Genomics (pymatgen): A robust, open-source python library for materials analysis

    OpenAIRE

    Ong, Shyue Ping; Richards, William Davidson; Jain, Anubhav; Hautier, Geoffroy; Kocher, Michael; Cholia, Shreyas; Gunter, Dan; Chevrier, Vincent L.; Persson, Kristin A.; Ceder, Gerbrand

    2012-01-01

    We present the Python Materials Genomics (pymatgen) library, a robust, open-source Python library for materials analysis. A key enabler in high-throughput computational materials science efforts is a robust set of software tools to perform initial setup for the calculations (e.g., generation of structures and necessary input files) and post-calculation analysis to derive useful material properties from raw calculated data. The pymatgen library aims to meet these needs by (1) defining core Pyt...

  12. Noise Source Identification of a Ring-Plate Cycloid Reducer Based on Coherence Analysis

    OpenAIRE

    Yang, Bing; Liu, Yan

    2013-01-01

    A ring-plate-type cycloid speed reducer is an important class of reducer owing to its small volume, compact structure, smooth operation, high performance and high reliability. The vibration and noise tests of the reducer prototype are completed using the HEAD acoustics multichannel noise test and analysis system. The characteristics of the vibration and noise are obtained based on coherence analysis and the noise sources are identified. The conclusions provide the basis for further noise research and ...

  13. Effects of Funding Sources on Access to Quality Higher Education in Public Universities in Kenya: A Case Study

    Directory of Open Access Journals (Sweden)

    John Mutinda Mutiso

    2015-04-01

    Full Text Available In the last two decades, Kenya has witnessed exponential growth of student enrolment in its public universities and oscillatory government funding of these institutions, precipitating quality concerns by employers about the skills of graduates to meet industry needs. In education finance, the sources of funds and the size of the resources are key determinants of quality education. The objective of the study was to determine the relationship between various funding sources and access to quality education in Kenyan public universities using a case approach. The data collection instruments used were an interview guide, a focus group discussion guide, a student survey questionnaire and secondary document analysis. Data were collected from October to December 2014 in the case university from a sample population of 10 top university management staff, 36 heads of department (HoDs) and 400 undergraduate students. The study employed the education production function as its basic model. The validity of the data collection instruments was established through scrutiny by thesis supervisors, and the reliability test of the students' questionnaire returned a Cronbach's alpha of 0.88. F-test and analysis of variance (ANOVA) methods were used with the aid of the Statistical Package for the Social Sciences (SPSS) version 2.0. The conclusion of the study was that the sources of funds had a positive but non-significant overall effect on quality, while government capitation, tuition and other sources of funds were significantly important for access to quality education in the institution (P = 0.30, P = 0.018, P = 0.000). The study recommended the adoption of performance-based funding to enhance quality in higher education.

  14. Quantification of source impact to PM using three-dimensional weighted factor model analysis on multi-site data

    Science.gov (United States)

    Shi, Guoliang; Peng, Xing; Huangfu, Yanqi; Wang, Wei; Xu, Jiao; Tian, Yingze; Feng, Yinchang; Ivey, Cesunica E.; Russell, Armistead G.

    2017-07-01

    Source apportionment technologies are used to understand the impacts of important sources of particulate matter (PM) on air quality, and are widely used for both scientific studies and air quality management. Generally, receptor models apportion speciated PM data from a single sampling site. With the development of large-scale monitoring networks, PM speciation is now observed at multiple sites in an urban area. For these situations, the models should account for three factors, or dimensions, of the PM data: the chemical species concentrations, the sampling periods and the sampling sites, suggesting the potential power of a three-dimensional source apportionment approach. However, the ordinary three-dimensional Parallel Factor Analysis (PARAFAC) model does not always work well in real environmental situations for multi-site receptor datasets. In this work, a new three-way receptor model, called the "multi-site three-way factor analysis" (multi-site WFA3) model, is proposed to deal with multi-site receptor datasets. Synthetic datasets were developed and introduced into the new model to test its performance. The average absolute error (AAE, between estimated and true contributions) for the extracted sources was in all cases less than 50%. Additionally, three-dimensional ambient datasets from a Chinese mega-city, Chengdu, were analyzed using the new model to assess its application. Four factors are extracted by the multi-site WFA3 model: secondary sources have the highest contributions (64.73 and 56.24 μg/m3), followed by vehicular exhaust (30.13 and 33.60 μg/m3), crustal dust (26.12 and 29.99 μg/m3) and coal combustion (10.73 and 14.83 μg/m3). The model was also compared to PMF, with general agreement, though PMF suggested a lower crustal contribution.
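A minimal stand-in for the three-way apportionment idea: build a synthetic species × period × site tensor from two known source profiles, unfold it, and recover a rank-2 nonnegative factorization. The multiplicative-update NMF used here is a simple proxy for illustration, not the paper's multi-site WFA3 algorithm, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
n_species, n_periods, n_sites = 8, 40, 3

# Two hypothetical source profiles (rows) over chemical species (columns)
profiles = rng.uniform(0.1, 1.0, (2, n_species))
# Source contributions varying over sampling periods and sites
contrib = rng.uniform(0.0, 5.0, (n_periods, n_sites, 2))

# Synthetic receptor tensor: species x periods x sites
X = np.einsum('tsk,kj->jts', contrib, profiles)

# Unfold to (periods*sites) x species and factor with multiplicative-update NMF
V = X.transpose(1, 2, 0).reshape(-1, n_species)
k = 2
W = rng.uniform(0.1, 1.0, (V.shape[0], k))
H = rng.uniform(0.1, 1.0, (k, n_species))
for _ in range(2000):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(rel_err)  # small for exactly rank-2 synthetic data
```

This mirrors the synthetic-data validation in the abstract: with known true contributions, the recovered factors can be compared against them to compute an error measure such as the AAE.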

  15. Relevance analysis and short-term prediction of PM2.5 concentrations in Beijing based on multi-source data

    Science.gov (United States)

    Ni, X. Y.; Huang, H.; Du, W. P.

    2017-02-01

    The PM2.5 problem is proving to be a major public crisis of great public concern, requiring an urgent response. Information about, and prediction of, PM2.5 from the perspective of atmospheric dynamic theory is still limited due to the complexity of the formation and development of PM2.5. In this paper, we attempted the relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed and maximum wind speed, and other pollutant concentrations, including CO, NO2, SO2 and PM10) and social media data (microblog data) was proposed, based on the multivariate statistical analysis method. The study found that, among these factors, the average wind speed, the concentrations of CO, NO2 and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high mathematical correlation with PM2.5 concentrations. The correlation analysis was further studied with a machine learning model, the Back Propagation Neural Network (BPNN), which was found to perform better in correlation mining. Finally, an Autoregressive Integrated Moving Average (ARIMA) time series model was applied to explore short-term prediction of PM2.5. The predicted results were in good agreement with the observed data. This study is useful for helping realize real-time monitoring, analysis and pre-warning of PM2.5, and it also helps to broaden the application of big data and multi-source data mining methods.
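The correlation screening described in this record can be sketched with synthetic data; the linear relation and all coefficients below are assumptions for illustration, not Beijing measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 365  # one year of synthetic daily observations

wind = rng.uniform(0.5, 6.0, n)   # average wind speed (m/s)
co = rng.uniform(0.2, 3.0, n)     # CO concentration (mg/m3)
# Hypothetical relation: PM2.5 rises with CO, falls with wind speed
pm25 = 40 + 25 * co - 8 * wind + rng.normal(0, 5, n)

# Pearson correlation of each candidate factor with PM2.5
for name, x in [("wind", wind), ("CO", co)]:
    r = np.corrcoef(x, pm25)[0, 1]
    print(name, round(r, 2))
```

Factors whose correlation magnitude is high (here CO positive, wind negative) would then be retained as inputs for the BPNN and ARIMA stages the abstract describes.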

  16. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

    The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of the maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for their calculation. A literature survey was performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties, the authors recommend a change in the initial boundary condition for the solution of the diffusion equation for highly soluble nuclides. 13 refs.

  17. Automatisation of reading and interpreting photographically recorded spark source mass spectra for the quantitative analysis in solids

    International Nuclear Information System (INIS)

    Naudin, Guy.

    1976-01-01

    Quantitative analysis of solids by spark source mass spectrometry involves the study of photographic plates by means of a microdensitometer. After graphic treatment of the data from the plate, a scientific program is used to calculate the concentrations of isotopes. The automation of these three steps has been realised with a computer program, written in the laboratory for a small computer (Multi 8, Intertechnique) [fr]

  18. Market Analysis and Consumer Impacts Source Document. Part II. Review of Motor Vehicle Market and Consumer Expenditures on Motor Vehicle Transportation

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part II comprises studies and reviews of: motor vehicle sales trends; motor vehicle fleet life and fleet composition; car buying patterns of the busi...

  19. The Exponent of High-frequency Source Spectral Falloff and Contribution to Source Parameter Estimates

    Science.gov (United States)

    Kiuchi, R.; Mori, J. J.

    2015-12-01

    As a way to understand the characteristics of the earthquake source, studies of source parameters (such as radiated energy and stress drop) and their scaling are important. To estimate source parameters reliably, we must often use appropriate source spectrum models, and the omega-square model is the most frequently used. In this model, the spectrum is flat at lower frequencies and the falloff is proportional to the angular frequency squared. However, some studies (e.g. Allmann and Shearer, 2009; Yagi et al., 2012) reported that the exponent of the high-frequency falloff differs from -2. Therefore, in this study we estimate the source parameters using a spectral model in which the falloff exponent is not fixed. We analyze the mainshock and larger aftershocks of the 2008 Iwate-Miyagi Nairiku earthquake. First, we calculate the P-wave and SH-wave spectra using empirical Green's functions (EGF) to remove the path effects (such as attenuation) and site effects. For the EGF event, we select a smaller earthquake that is highly correlated with the target event. To obtain stable results, we calculate the spectral ratios using a multitaper spectrum analysis (Prieto et al., 2009) and then take a geometric mean over multiple stations. Finally, using the obtained spectral ratios, we perform a grid search to determine the high-frequency falloffs as well as the corner frequencies of both events. Our results indicate that the high-frequency falloff exponent is often less than 2.0. We do not observe any regional, focal mechanism, or depth dependence of the falloff exponent. In addition, our estimated corner frequencies and falloff exponents are consistent between the P-wave and SH-wave analyses. In our presentation, we show differences in estimated source parameters between a fixed omega-square model and a model allowing variable high-frequency falloff.
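The grid search over corner frequency and falloff exponent can be sketched with a generalized source-spectrum model S(f) = Ω0 / (1 + (f/fc)^n), where n = 2 recovers the omega-square model. The frequency band, grid spacing and synthetic "observed" spectrum below are illustrative choices, not the study's data:

```python
import numpy as np

# Generalized source-spectrum model: flat at low f, falloff exponent n above fc
def spectrum(f, omega0, fc, n):
    return omega0 / (1.0 + (f / fc) ** n)

f = np.logspace(-1, 1.5, 200)        # frequency band (Hz)
obs = spectrum(f, 1.0, 2.0, 1.6)     # synthetic "observed" ratio with n = 1.6

# Grid search over corner frequency and falloff exponent (log-domain misfit)
fcs = np.linspace(0.5, 5.0, 46)
ns = np.linspace(1.0, 3.0, 41)
best = min(((np.sum((np.log(obs) - np.log(spectrum(f, 1.0, fc, n))) ** 2), fc, n)
            for fc in fcs for n in ns))
print(best[1], best[2])  # best-fit fc and n (here 2.0 and 1.6)
```

In practice the misfit would be computed against EGF spectral ratios (which involve both events' corner frequencies), but the search structure is the same.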

  20. A multi-criteria methodology for energy planning and developing renewable energy sources at a regional level: A case study Thassos, Greece

    International Nuclear Information System (INIS)

    Mourmouris, J.C.; Potolias, C.

    2013-01-01

    Rational energy planning under the pressure of environmental and economic problems is imperative for humanity. An evaluation framework is proposed in order to support energy planning for promoting the use of renewable energy sources. A multi-criteria decision analysis is adopted, detailing the exploitation of renewable energy sources (including wind, solar, biomass, geothermal and small hydro) for power and heat generation. The aim of this paper is the analysis and development of a multilevel decision-making structure, utilizing multiple criteria for energy planning and the exploitation of Renewable Energy Sources at the regional level. The proposed evaluation framework focuses on the use of a multi-criteria approach as a tool for supporting energy planning in the area of concern, based on a pool of qualitative and quantitative evaluation criteria. The final aim of this study is to discover the optimal amount of each Renewable Energy Source that can be produced in the region and to contribute to an optimal energy mix. In this paper, a case study for the island of Thassos, Greece is analyzed. The results show that Renewable Energy Sources exploitation at a regional level can satisfy increasing power demands through environmentally friendly energy systems that combine wind power, biomass and PV systems. - Highlights: ► An evaluation framework is proposed in order to support energy planning. ► A multi-criteria decision analysis is adopted, detailing exploitation of RES for power and heat generation. ► The aim is to discover the optimal amount of each RES that can be produced in each region.
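A weighted-sum score is the simplest form of such a multi-criteria evaluation; the options, criteria and weights below are purely illustrative and are not the Thassos study's actual criteria or data:

```python
# Minimal weighted-sum multi-criteria scoring sketch (criteria normalized to [0, 1])
options = {
    "wind":    {"energy_yield": 0.9, "cost": 0.5, "env_impact": 0.7},
    "solar":   {"energy_yield": 0.7, "cost": 0.6, "env_impact": 0.9},
    "biomass": {"energy_yield": 0.6, "cost": 0.7, "env_impact": 0.5},
}
weights = {"energy_yield": 0.5, "cost": 0.3, "env_impact": 0.2}  # sum to 1

scores = {name: sum(weights[c] * v for c, v in crit.items())
          for name, crit in options.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # options ordered by aggregate score
```

Real multi-criteria methods add normalization of raw criterion values, stakeholder-derived weights, and often outranking rather than simple summation, but the aggregation step above is the common core.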

  1. Modeling and reliability analysis of three phase z-source AC-AC converter

    Directory of Open Access Journals (Sweden)

    Prasad Hanuman

    2017-12-01

    Full Text Available This paper presents the small-signal modeling, using the state-space averaging technique, and reliability analysis of a three-phase z-source ac-ac converter. By controlling the shoot-through duty ratio, it can operate in buck-boost mode and maintain the desired output voltage during voltage sag and surge conditions. It has a faster dynamic response and higher efficiency compared to the traditional voltage regulator. Small-signal analysis derives the different control transfer functions, which leads to the design of a suitable controller for a closed-loop system during supply voltage variation. The closed-loop system of the converter with a PID controller eliminates the transients in the output voltage and provides a steady-state regulated output. The proposed model was designed in RT-LAB and executed in a field-programmable gate array (FPGA)-based real-time digital simulator at a fixed time step of 10 μs and a constant switching frequency of 10 kHz. The simulator was developed using the very high speed integrated circuit hardware description language (VHDL), making it versatile and portable. Hardware-in-the-loop (HIL) simulation results are presented to corroborate the MATLAB simulation results during supply voltage variation of the three-phase z-source ac-ac converter. Reliability analysis has been applied to the converter to find the failure rates of its different components.
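The buck-boost capability rests on the shoot-through duty ratio D0: for the classical Z-source network the boost factor is B = 1/(1 - 2·D0), valid for D0 < 0.5 (Peng's standard result), so small increases in D0 yield large voltage boosts. The sketch below is a quick numeric check of that relation, not a model of the paper's specific converter:

```python
# Boost factor of a Z-source impedance network vs. shoot-through duty ratio D0
def boost_factor(d0: float) -> float:
    if not 0.0 <= d0 < 0.5:
        raise ValueError("shoot-through duty ratio must lie in [0, 0.5)")
    return 1.0 / (1.0 - 2.0 * d0)

for d0 in (0.0, 0.1, 0.25, 0.4):
    print(d0, round(boost_factor(d0), 2))  # 1.0, 1.25, 2.0, 5.0
```

The steep growth near D0 = 0.5 is why the controller regulates D0 to ride through sags while avoiding the singular region.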

  2. Quantification of severe accident source terms of a Westinghouse 3-loop plant

    International Nuclear Information System (INIS)

    Lee Min; Ko, Y.-C.

    2008-01-01

    Integrated severe accident analysis codes are used to quantify the source terms of the representative sequences identified in PSA studies. The characteristics of these source terms depend on the detailed design of the plant and on the accident scenario. A historical perspective on radioactive source terms is provided. The grouping of radionuclides in different source terms, or in source term quantification tools based on TID-14844, NUREG-1465, and WASH-1400, is compared. The radionuclide release phenomena and models adopted in the integrated severe accident analysis codes STCP and MAAP4 are described. In the present study, the severe accident source terms for risk quantification of the Maanshan Nuclear Power Plant of the Taiwan Power Company are quantified using the MAAP 4.0.4 code. A methodology is developed to quantify the source terms of each source term category (STC) identified in the Level II PSA analysis of the plant. The characteristics of the source terms obtained are compared with other source terms. The plant analyzed employs a Westinghouse-designed 3-loop pressurized water reactor (PWR) with a large dry containment

  3. Reliability and validity analysis of the open-source Chinese Foot and Ankle Outcome Score (FAOS).

    Science.gov (United States)

    Ling, Samuel K K; Chan, Vincent; Ho, Karen; Ling, Fona; Lui, T H

    2017-12-21

    Develop the first reliable and validated open-source outcome scoring system in the Chinese language for foot and ankle problems. Translation of the English FAOS into Chinese followed standard protocols. First, two forward translations were created separately; these were then combined into a preliminary version by an expert committee and subsequently back-translated into English. The process was repeated until the original and back translations were congruent. This version was then field-tested on actual patients, who provided feedback for modification. The final Chinese FAOS version was then tested for reliability and validity. Reliability analysis was performed on 20 subjects and validity analysis on 50 subjects. The tools used to validate the Chinese FAOS were the SF36 and the Pain Numeric Rating Scale (NRS). Internal consistency between the FAOS subgroups was measured using Cronbach's alpha. Spearman's correlation was calculated between each subgroup of the FAOS, the SF36 and the NRS. The Chinese FAOS passed both reliability and validity testing, meaning it is reliable, internally consistent and correlates positively with the SF36 and the NRS. The Chinese FAOS is a free, open-source scoring system that can be used to provide a relatively standardised outcome measure for foot and ankle studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
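Cronbach's alpha, the internal-consistency statistic used in this record, is straightforward to compute directly from a respondents × items score matrix; the synthetic 5-item, 100-respondent data below are illustrative, not the FAOS sample:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(0, 1, 100)  # a common construct per respondent
items = latent[:, None] + 0.5 * rng.normal(0, 1, (100, 5))  # 5 correlated items
print(round(cronbach_alpha(items), 2))
```

Because all five synthetic items share one latent construct, alpha comes out high, in the same region as the 0.88 reported for the questionnaire in the abstract.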

  4. An evaluation of the use of mobile source emissions trading: Locomotive case study

    International Nuclear Information System (INIS)

    West, W.R.; Brazell, M.M.

    1993-01-01

    There are many proposals for generating mobile source credits for use by stationary and other sources. This paper examines the benefits and practicality of including locomotive rail emissions in proposed emissions trading programs in California. In particular, it examines (1) whether trading of locomotive rail emissions will result in lower compliance costs for railroads than traditional "command and control" approaches, and (2) whether emissions trading programs provide large enough incentives to entice railroads to meet or exceed expected "command and control" emissions reduction targets. The paper also examines under what circumstances stationary sources would be willing to purchase mobile source credits from railroads in order to offset some of their required emissions reductions. Stated simply, this analysis examines whether the proposed trading programs offer enough benefits to both trading partners to warrant their use

  5. Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)

    Science.gov (United States)

    Li, L.; Wu, Y.

    2017-12-01

    Gravity and magnetic fields are potential fields, which leads to inherent non-uniqueness in their interpretation. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to identify homologous gravity and magnetic anomalies and to reduce this ambiguity. The traditional combined analysis uses linear regression of the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope and intercept. In this calculation, due to the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in this case, homologous gravity and magnetic anomalies appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed into the pseudomagnetic tensor matrix for magnetization along the geomagnetic field direction under the homologous condition. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and a linear regression analysis is carried out. The calculated correlation coefficient, slope and intercept indicate the degree of homology, the Poisson's ratio and the distribution of remanence, respectively. We test the approach using a synthetic model under complex magnetization; the results show that it can still identify a common source under strong remanence and establish the Poisson's ratio. Finally, the approach is applied to data from China. The results demonstrate that the approach is feasible.
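The NSS itself is computed from the eigenvalues of the symmetric, traceless gradient tensor; one published definition (due to Beiki and co-workers) is sqrt(-λ2² - λ1·λ3) with λ1 ≥ λ2 ≥ λ3, which this sketch assumes. The tensor below is an invented example, not field data:

```python
import numpy as np

def nss(tensor: np.ndarray) -> float:
    """Normalized source strength of a symmetric, traceless gradient tensor,
    assuming the definition sqrt(-l2**2 - l1*l3) with l1 >= l2 >= l3."""
    l3, l2, l1 = np.sort(np.linalg.eigvalsh(tensor))  # ascending order
    return float(np.sqrt(max(-l2**2 - l1 * l3, 0.0)))

# Illustrative symmetric traceless tensor with eigenvalues (2, -1, -1)
T = np.diag([2.0, -1.0, -1.0])
print(nss(T))  # sqrt(-(-1)**2 - 2*(-1)) = 1.0
```

Because the eigenvalues are invariant under rotation and largely unaffected by the magnetization direction, the NSS stays meaningful where the RTP transform breaks down under strong remanence.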

  6. Identification of 'Point A' as the prevalent source of error in cephalometric analysis of lateral radiographs.

    Science.gov (United States)

    Grogger, P; Sacher, C; Weber, S; Millesi, G; Seemann, R

    2018-04-10

    Deviations in measuring dentofacial components on a lateral X-ray represent a major hurdle in the subsequent treatment of dysgnathic patients. In a retrospective study, we investigated the most prevalent source of error in the following commonly used cephalometric measurements: the angles Sella-Nasion-Point A (SNA), Sella-Nasion-Point B (SNB) and Point A-Nasion-Point B (ANB); the Wits appraisal; the anteroposterior dysplasia indicator (APDI); and the overbite depth indicator (ODI). Preoperative lateral radiographic images of patients with dentofacial deformities were collected and the landmarks digitally traced by three independent raters. Cephalometric analysis was performed automatically based on 1116 tracings. Error analysis identified the x-coordinate of Point A as the prevalent source of error in all investigated measurements except SNB, in which it is not incorporated. In SNB, the y-coordinate of Nasion dominated the error variance. SNB showed the lowest inter-rater variation. In addition, our observations confirmed previous studies showing that landmark identification variance follows characteristic error envelopes, here in the largest number of tracings analysed to date. Variance orthogonal to defining planes was of relevance, while variance parallel to planes was not. Taking these findings into account, orthognathic surgeons as well as orthodontists should be able to perform cephalometry more accurately and achieve better therapeutic results. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  7. Sources to the landscape - detailed spatiotemporal analysis of 200 years Danish landscape dynamics using unexploited historical maps and aerial photos

    DEFF Research Database (Denmark)

    Svenningsen, Stig Roar; Christensen, Andreas Aagaard; Dupont, Henrik

    to declassification of military maps and aerial photos from the cold war, only relatively few sources have been made available to researchers due to a lack of digitization efforts and related services. And even though the digitizing of cartographic material has accelerated, the digitally available materials...... or to the commercial photo series from the last 20 years. This poster outlines a new research project focusing on the potential of unexploited cartographic sources for detailed analysis of the dynamics of the Danish landscape between 1800 and 2000. The project draws on cartographic sources available in Danish archives...... of material in landscape change studies, giving a high temporal and spatial resolution. The project also deals with the opportunities and constraints of comparing different cartographic sources with diverse purposes and times of production, e.g. different scales and qualities of aerial photos or the differences between...

  8. Evaluation of Personal and Built Environment Attributes to Physical Activity: A Multilevel Analysis on Multiple Population-Based Data Sources

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2012-01-01

    Full Text Available Background. Studies have documented that built environment factors potentially promote or impede leisure-time physical activity (LTPA). This study explored the relationship of multiple built environment factors and individual characteristics to LTPA. Methods. Multiple data sources were utilized, including individual-level data on health behaviors and health status from the Nevada Behavioral Risk Factor Surveillance System (BRFSS) and community-level data from different sources, including indicators for recreation facilities, safety, air quality, commute time, urbanization, population density, and land-mix level. Mixed-model logistic regression and geographic information system (GIS) spatial analysis were conducted. Results. Among 6,311 respondents, 24.4% reported no LTPA engagement during the past 30 days. No engagement in LTPA was significantly associated with (1) individual factors: older age, less education, lower income, obesity, and low life satisfaction; and (2) community factors: longer commute time, higher crime rate, urban residence, and higher population density, but not with density of or distance to recreation facilities, air quality, or land mix. Conclusions. Multiple data systems, including complex population surveys and spatial analysis, are valuable tools in studies of health and the built environment.

  9. Critical analysis of documentary sources for Historical Climatology of Northern Portugal (17th-19th centuries)

    Science.gov (United States)

    Amorim, Inês; Sousa Silva, Luís; Garcia, João Carlos

    2017-04-01

    Inês Amorim (CITCEM, Department of History, Political and International Studies, U. of Porto, Portugal), Luís Sousa Silva (CITCEM, PhD Fellowship - FCT) and João Carlos Garcia (CIUHCT, Geography Department, U. of Porto, Portugal). The first major national project on Historical Climatology in Portugal, called "KLIMHIST: Reconstruction and model simulations of past climate in Portugal using documentary and early instrumental sources (17th-19th centuries)", ended in September 2015, coordinated by Maria João Alcoforado. This project began in March 2012 and counted on an interdisciplinary team of researchers from four Portuguese institutions (Centre of Geographical Studies, University of Trás-os-Montes and Alto Douro, University of Porto, and University of Évora) and from different fields of knowledge (Geography, History, Biology, Climatology and Meteorology). The team networked and collaborated with other international research groups on Climate Change and Historical Climatology, resulting in several publications. The project aimed to reconstruct thermal and rainfall patterns in Portugal between the 17th and 19th centuries, as well as to identify the main hydrometeorological extremes that occurred over that period. The basic methodology consisted in combining information from different types of anthropogenic sources (descriptive and instrumental) and natural sources (tree rings and geothermal holes), so as to develop climate change models of the past. The data collected were stored in a digital database, which can be searched by source, date, location and type of event. This database, which will be made publicly available soon, contains about 3500 weather/climate-related records, which have begun to be studied, processed and published. Following this seminal project, other initiatives have taken place in Portugal in the area of Historical Climatology, namely a Ph

  10. Selection of important ecological source patches based on Green Infrastructure theory: A case study of Wuhan city

    Science.gov (United States)

    Ke, Yuanyuan; Yu, Yan; Tong, Yan

    2018-01-01

    Selecting urban ecological patches is of great significance for constructing urban green infrastructure networks and protecting urban biodiversity and the ecological environment. With the support of GIS technology, a criterion for selecting source patches was developed according to existing planning. Ecological source patches for terrestrial organisms and for aquatic and amphibious organisms were then selected in Wuhan city. To increase the connectivity of the ecological patches and achieve greater ecological protection benefits, the green infrastructure networks in Wuhan city were constructed with the minimum path analysis method. Finally, the characteristics of the ecological source patches were analyzed with landscape metrics, and the ecological protection importance of the source patches was evaluated comprehensively. The results showed that there are 23 important ecological source patches in Wuhan city, among which the Sushan Temple Forest Patch and the Lu Lake and Shangshe Lake Wetland Patches are the most important of all patch types for ecological protection. This study can provide a scientific basis for the preservation of urban ecological space, the delineation of nature conservation areas and the protection of biological diversity.
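The minimum path analysis mentioned in this record is essentially a least-cost path computation over a resistance surface connecting source patches; this sketch runs Dijkstra's algorithm on a tiny invented cost grid, not the Wuhan data:

```python
import heapq

# Illustrative resistance grid: low-cost cells form a corridor between two patches
grid = [
    [1, 1, 4, 4],
    [4, 1, 1, 4],
    [4, 4, 1, 1],
]
rows, cols = len(grid), len(grid[0])

def min_cost(start, goal):
    """Least cumulative movement cost from start cell to goal cell (Dijkstra)."""
    dist = {start: grid[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

print(min_cost((0, 0), (2, 3)))  # cheapest corridor cost between the two patches
```

GIS tools perform the same computation on raster resistance surfaces; the returned paths become the corridors of the green infrastructure network.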

  11. Performance analyses of Z-source and quasi Z-source inverter for photovoltaic applications

    Science.gov (United States)

    Himabind, S.; Priya, T. Hari; Manjeera, Ch.

    2018-04-01

    This paper presents a comparative analysis of the Z-source and quasi Z-source converters for renewable energy applications. Because renewable energy sources depend on external weather conditions, their output voltage and current change accordingly, which affects the performance of traditional voltage source inverters (VSI) and current source inverters (CSI) connected across them. To overcome the drawbacks of the VSI and CSI, the Z-source inverter (ZSI) and quasi Z-source inverter (QZSI) are used; they can perform multiple conversions (ac-to-dc, dc-to-ac, ac-to-ac, dc-to-dc) and both buck and boost operations by utilizing the shoot-through zero state. The QZSI is derived from the ZSI topology with a slight change in the impedance network, and it overcomes the drawbacks of the ZSI by drawing a constant current from the source. A comparative analysis is performed between the Z-source and quasi Z-source inverters; the simulation is carried out in the MATLAB/Simulink environment.

  12. YouTube as a source of COPD patient education: A social media content analysis

    Science.gov (United States)

    Stellefson, Michael; Chaney, Beth; Ochipa, Kathleen; Chaney, Don; Haider, Zeerak; Hanik, Bruce; Chavarria, Enmanuel; Bernhardt, Jay M.

    2014-01-01

    Objective Conduct a social media content analysis of COPD patient education videos on YouTube. Methods A systematic search protocol was used to locate 223 videos. Two independent coders evaluated each video to determine topics covered, media source(s) of posted videos, information quality as measured by HONcode guidelines for posting trustworthy health information on the Internet, and viewer exposure/engagement metrics. Results Over half of the videos (n=113, 50.7%) included information on medication management, with far fewer videos on smoking cessation (n=40, 17.9%). Most videos were posted by a health agency or organization (n=128, 57.4%), and the majority of videos were rated as high quality (n=154, 69.1%). HONcode adherence differed by media source (Fisher’s Exact Test=20.52, p=.01), with user-generated content (UGC) receiving the lowest quality scores. The overall level of user engagement, as measured by numbers of “likes,” “favorites,” “dislikes,” and user comments, was low (mdn range = 0–3, interquartile range (IQR) = 0–16) across all sources of media. Conclusion Study findings suggest that COPD education via YouTube has the potential to reach and inform patients; however, existing video content and quality vary significantly. Future interventions should help direct individuals with COPD to increase their engagement with high-quality patient education videos on YouTube that are posted by reputable health organizations and qualified medical professionals. Patients should be educated to avoid and/or critically view low-quality videos posted by individual YouTube users who are not health professionals. PMID:24659212

  13. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.

  14. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.

  15. A Study on the Information Analysis and Legal Affairs

    International Nuclear Information System (INIS)

    Chung, W. S.; Yang, M. H.; Yun, S. W.; Lee, D. S.; Kim, H. R.; Noh, B. C.

    2009-02-01

    This report presents the results and contents of a study on nuclear information analysis and legal affairs. Our team worked to secure KAERI's best legal interests in the processes of enacting nuclear laws and codes, international collaborative studies, and management. Moreover, as an international trend analysis, we studied the Japanese government's position on nuclear energy in the context of mitigating climate change and supplying sustainable energy. Improvements in Japan's use of radiation showed an increasing contribution of radiation technology to the public. Results of studies on the nuclear policy of Kazakhstan, forecasts of global trends in the nuclear field to 2030, and the new U.S. administration's nuclear energy policy are also presented. Lastly, we evaluated electricity generation sources that reduce carbon dioxide emissions from the standpoint of greenhouse gas emission statistics, and assessed the greenhouse gas reduction capability of Korea's low-carbon electricity generation sources.

  16. Evaluation of antibiotic resistance analysis and ribotyping for identification of faecal pollution sources in an urban watershed.

    Science.gov (United States)

    Moore, D F; Harwood, V J; Ferguson, D M; Lukasik, J; Hannah, P; Getrich, M; Brownell, M

    2005-01-01

    The accuracy of ribotyping and antibiotic resistance analysis (ARA) for prediction of sources of faecal bacterial pollution in an urban southern California watershed was determined using blinded proficiency samples. Antibiotic resistance patterns and HindIII ribotypes of Escherichia coli (n = 997), and antibiotic resistance patterns of Enterococcus spp. (n = 3657) were used to construct libraries from sewage samples and from faeces of seagulls, dogs, cats, horses and humans within the watershed. The three libraries were analysed to determine the accuracy of host source prediction. The internal accuracy of the libraries (average rate of correct classification, ARCC) with six source categories was 44% for E. coli ARA, 69% for E. coli ribotyping and 48% for Enterococcus ARA. Each library's predictive ability towards isolates that were not part of the library was determined using a blinded proficiency panel of 97 E. coli and 99 Enterococcus isolates. Twenty-eight per cent (by ARA) and 27% (by ribotyping) of the E. coli proficiency isolates were assigned to the correct source category. Sixteen per cent were assigned to the same source category by both methods, and 6% were assigned to the correct category. Addition of 2480 E. coli isolates to the ARA library did not improve the ARCC or proficiency accuracy. In contrast, 45% of Enterococcus proficiency isolates were correctly identified by ARA. None of the methods performed well enough on the proficiency panel to be judged ready for application to environmental samples. Most microbial source tracking (MST) studies published have demonstrated library accuracy solely by the internal ARCC measurement. Low rates of correct classification for E. coli proficiency isolates compared with the ARCCs of the libraries indicate that testing of bacteria from samples that are not represented in the library, such as blinded proficiency samples, is necessary to accurately measure predictive ability. The library-based MST methods used in
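    The average rate of correct classification (ARCC) used throughout this abstract is the mean, over host-source categories, of each category's correct-classification rate. A minimal sketch (category labels are illustrative):

```python
from collections import defaultdict

def arcc(true_sources, predicted_sources):
    """Average rate of correct classification: for each host-source
    category, the fraction of its isolates assigned back to it by the
    library, averaged over the categories."""
    totals, correct = defaultdict(int), defaultdict(int)
    for truth, pred in zip(true_sources, predicted_sources):
        totals[truth] += 1
        if truth == pred:
            correct[truth] += 1
    return sum(correct[c] / totals[c] for c in totals) / len(totals)

# Half the dog isolates and all the gull isolates classified correctly:
print(arcc(["dog", "dog", "gull", "gull"],
           ["dog", "cat", "gull", "gull"]))  # 0.75
```

    As the abstract stresses, this internal measure can be optimistic; evaluating the same function on blinded proficiency isolates that are not in the library gives the more honest accuracy figure.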

  17. International patent analysis of water source heat pump based on orbit database

    Science.gov (United States)

    Li, Na

    2018-02-01

    Using the Orbit database, this paper analysed international patents in the water source heat pump (WSHP) industry with patent analysis methods such as analysis of publication tendency, geographical distribution, technology leaders and top assignees. It is found that the beginning of the 21st century was a period of rapid growth in WSHP patent applications. Germany and the United States researched and developed WSHP early on, but Japan and China have now become important countries for patent applications. China has been developing faster in recent years, but its patents are concentrated in universities and urgently need to be transferred to industry. Through an objective analysis, this paper aims to provide appropriate decision references for the development of the domestic WSHP industry.

  18. Multi-source analysis reveals latitudinal and altitudinal shifts in range of Ixodes ricinus at its northern distribution limit

    Directory of Open Access Journals (Sweden)

    Kristoffersen Anja B

    2011-05-01

    Full Text Available Abstract Background There is increasing evidence for a latitudinal and altitudinal shift in the distribution range of Ixodes ricinus. The reported incidence of tick-borne disease in humans is on the rise in many European countries and has raised political concern and attracted media attention. It is disputed which factors are responsible for these trends, though many ascribe shifts in distribution range to climate change. Any possible climate effect would be most easily noticeable close to the tick's geographical distribution limits. In Norway, the northern limit of this species in Europe, no documentation of changes in range has been published. The objectives of this study were to describe the distribution of I. ricinus in Norway and to evaluate whether any range shifts have occurred relative to historical descriptions. Methods Multiple data sources, such as tick-sighting reports from veterinarians, hunters, and the general public, and surveillance of human and animal tick-borne diseases were compared to describe the present distribution of I. ricinus in Norway. Correlation between data sources and visual comparison of maps revealed spatial consistency. In order to identify the main spatial pattern of tick abundance, a principal component analysis (PCA) was used to obtain a weighted mean of four data sources. The weighted mean explained 67% of the variation of the data sources covering Norway's 430 municipalities and was used to depict the present distribution of I. ricinus. To evaluate whether any geographical range shift has occurred in recent decades, the present distribution was compared to historical data from 1943 and 1983. Results Tick-borne disease and/or observations of I. ricinus were reported in municipalities up to an altitude of 583 metres above sea level (MASL), and the tick is now present in coastal municipalities north to approximately 69°N. Conclusion I. ricinus is currently found further north and at higher altitudes than described in

  19. A CASE STUDY OF NONPOINT SOURCES BACTERIAL CONTRIBUTION TO RURAL SURFACE WATER

    Science.gov (United States)

    The presentation will address several bacterial issues affecting the Turkey Creek (TC) watershed, in north central Ok. Our results from seasonal stream Escherichia coli (E. coli) analysis, bacterial source tracking, and antibiotic resistance will be shared and discussed in relat...

  20. Monitoring Lead (Pb) Pollution and Identifying Pb Pollution Sources in Japan Using Stable Pb Isotope Analysis with Kidneys of Wild Rats.

    Science.gov (United States)

    Nakata, Hokuto; Nakayama, Shouta M M; Oroszlany, Balazs; Ikenaka, Yoshinori; Mizukawa, Hazuki; Tanaka, Kazuyuki; Harunari, Tsunehito; Tanikawa, Tsutomu; Darwish, Wageh Sobhy; Yohannes, Yared B; Saengtienchai, Aksorn; Ishizuka, Mayumi

    2017-01-10

    Although Japan has been considered to have little lead (Pb) pollution in modern times, the actual pollution situation is unclear. The present study aims to investigate the extent of Pb pollution and to identify the pollution sources in Japan using stable Pb isotope analysis of kidneys of wild rats. Wild brown (Rattus norvegicus, n = 43) and black (R. rattus, n = 98) rats were trapped at various sites in Japan. Mean Pb concentrations in the kidneys of rats from Okinawa (15.58 mg/kg, dry weight), Aichi (10.83), Niigata (10.62), Fukuoka (8.09), Ibaraki (5.06), Kyoto (4.58), Osaka (4.57), Kanagawa (3.42), and Tokyo (3.40) were above the threshold (2.50) for histological kidney changes. Similarly, comparison with previous reports suggests that structural and functional kidney damage, as well as neurotoxicity, have spread among rats in Japan. Additionally, the possibility of human exposure to high levels of Pb was inferred. With regard to stable Pb isotope analysis, distinctive values of stable Pb isotope ratios (Pb-IRs) were detected in some kidney samples with Pb levels above 5.0 mg/kg, indicating that composite factors are involved in Pb pollution. However, identification of a concrete pollution source was not accomplished, owing to the limited differences among previously reported values of Pb isotope composition in circulating Pb products; the current study thus establishes the limits of Pb isotope analysis for source identification. Further detailed research on monitoring Pb pollution in Japan and the demonstration of a novel method to identify Pb sources are needed.
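    Source attribution with stable Pb isotope ratios amounts to comparing a sample's isotopic signature against candidate source signatures; the study's point is that this fails when candidate signatures overlap. A sketch of the matching step (the ratio values below are hypothetical, not taken from the paper):

```python
def nearest_pb_source(sample_ir, source_signatures):
    """Assign a sample to the candidate Pb source whose isotope-ratio
    signature (e.g. 207Pb/206Pb, 208Pb/206Pb) is closest in Euclidean
    distance. Unreliable when candidate signatures nearly coincide."""
    def distance(p, q):
        return sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5
    return min(source_signatures,
               key=lambda name: distance(sample_ir, source_signatures[name]))

# Hypothetical signatures as (207Pb/206Pb, 208Pb/206Pb) pairs:
signatures = {"gasoline": (0.86, 2.10), "paint": (0.90, 2.18)}
print(nearest_pb_source((0.87, 2.11), signatures))  # gasoline
```

    When the candidate signatures differ by less than the measurement scatter, as the abstract reports for circulating Pb products, the nearest-signature assignment carries no information.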

  1. PIXE Analysis and source identification of airborne particulate matter collected in Downtown Havana City

    International Nuclear Information System (INIS)

    Perez, G.; Pinnera, I; Ramos, M; Guibert, R; Molina, E.; Martinez, M.; Fernandez, A.; Aldape, F.; Flores, M.

    2009-01-01

    A set of samples containing airborne particulate matter (in two particle size fractions, PM10 and PM2.5) collected over five months from November 2006 to April 2007 in an urban area of Havana City were analyzed by the Particle-Induced X-ray Emission (PIXE) technique, and the concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were determined consistently in both particle size fractions, with minimum detection limits in the range of ng/m3. A Gent air sampler was used to collect the PM10 and PM2.5 aerosol fractions simultaneously, and the PIXE elemental analyses were performed using a 2.5 MeV proton beam from the 2 MV Van de Graaff Tandetron Accelerator at the ININ PIXE Laboratory in Mexico. The analytical database provided by PIXE was statistically analyzed in order to determine the probable local pollution sources. The statistical techniques of Multivariate Factor Analysis in combination with Principal Component Analysis were applied to these data and allowed the identification of five main pollution sources of airborne particulate matter (PM10 and PM2.5) collected in this area. The main (local) identified sources were: soil dust, sea spray, industry, fossil fuel combustion from motor vehicles, and burning or incineration of diverse materials. A general discussion of these results is presented in this work. (Author)

  2. Study of groundwater arsenic pollution in Lanyang Plain using multivariate statistical analysis

    Science.gov (United States)

    chan, S.

    2013-12-01

    The study area, the Lanyang Plain in eastern Taiwan, has highly developed agriculture and aquaculture, which consume over 70% of the water supply. Groundwater is frequently considered as an alternative water source. However, the serious arsenic pollution of groundwater in the Lanyang Plain should be well studied to ensure the safety of groundwater usage. In this study, 39 groundwater samples were collected. The hydrochemical results demonstrate two major trends in the Piper diagram. The major trend, containing most of the groundwater samples, has water types between Ca+Mg-HCO3 and Na+K-HCO3, which can be explained by cation exchange reactions. The minor trend clearly corresponds to seawater intrusion, with the Na+K-Cl water type, as the localities of these samples are all in the coastal area. Multivariate statistical analysis of the hydrochemical data was conducted to further explore the mechanism of arsenic contamination. Two major factors can be extracted with factor analysis: the major factor includes Ca, Mg and Sr, while the minor factor includes Na, K and As. This reconfirms that cation exchange reactions mainly control the groundwater hydrochemistry in the study area. It is worth noting that arsenic is positively related to Na and K. The result of cluster analysis shows that groundwater samples with high arsenic concentrations group with those with high Na, K and HCO3. This supports the view that cation exchange enhances the release of arsenic, and it excludes the effect of seawater intrusion. In other words, the water-rock reaction time is key to obtaining higher arsenic content. In general, the major sources of arsenic in sediments include exchangeable, reducible and oxidizable phases, which are adsorbed ions, Fe-Mn oxides and organic matter/pyrite, respectively. However, the results of factor analysis do not show apparent correlation between arsenic and Fe/Mn. This may exclude Fe-Mn oxides as a major source of arsenic. The other sources
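    The water types read off the Piper diagram above can be reproduced from major-ion concentrations by picking the dominant cation group and dominant anion. A minimal sketch (concentrations in meq/L; the sample values are invented):

```python
def water_type(meq):
    """Classify a groundwater sample, Piper-diagram style, by its
    dominant cation group (Na+K vs Ca+Mg) and its dominant anion."""
    cations = "Na+K" if meq["Na"] + meq["K"] > meq["Ca"] + meq["Mg"] else "Ca+Mg"
    anion = max(("HCO3", "Cl", "SO4"), key=lambda a: meq[a])
    return f"{cations}-{anion}"

# A coastal sample dominated by Na and Cl, i.e. seawater intrusion:
print(water_type({"Na": 8, "K": 1, "Ca": 2, "Mg": 1,
                  "HCO3": 3, "Cl": 9, "SO4": 1}))  # Na+K-Cl
```

    The two trends in the abstract correspond to samples whose types fall between Ca+Mg-HCO3 and Na+K-HCO3 (cation exchange) and samples typed Na+K-Cl (seawater intrusion).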

  3. Knowledge Sources and Opinions of Prospective Social Studies Teachers about Possible Risk and Benefit Analysis: Nuclear Energy and Power Stations

    Science.gov (United States)

    Yazici, Hakki; Bulut, Ramazan; Yazici, Sibel

    2016-01-01

    In this study, it was aimed to determine the trust status of prospective social studies teachers regarding various knowledge sources related to nuclear energy and power stations regarded as a controversial socio-scientific issue and their perceptions on the possible risks and benefits of nuclear energy and power stations. Target population of the…

  4. Effect of sample moisture and bulk density on performance of the 241Am-Be source based prompt gamma rays neutron activation analysis setup. A Monte Carlo study

    International Nuclear Information System (INIS)

    Almisned, Ghada

    2010-01-01

    Monte Carlo simulations were carried out to study the dependence of gamma-ray yield on bulk density and moisture content for five different lengths of Portland cement samples in a thermal neutron capture based Prompt Gamma ray Neutron Activation Analysis (PGNAA) setup with a source-inside-moderator geometry, using a 241Am-Be neutron source. In this study, yields of the 1.94 and 6.42 MeV prompt gamma rays from calcium in the five Portland cement samples were calculated as a function of sample bulk density and moisture content. The study showed a strong dependence of the 1.94 and 6.42 MeV gamma-ray yield upon the sample bulk density but a weaker dependence upon sample moisture content. For an order of magnitude increase in the sample bulk density, an order of magnitude increase in the gamma-ray yield was observed, i.e., a one-to-one correspondence. For the dependence upon sample moisture content, an order of magnitude increase in the moisture content of the sample resulted in only a 16-17% increase in the yield of the 1.94 and 6.42 MeV gamma rays from calcium. (author)

  5. Comparison of three-phase three-level voltage source inverter with intermediate dc–dc boost converter and quasi-Z-source inverter

    DEFF Research Database (Denmark)

    Panfilov, Dmitry; Husev, Oleksandr; Blaabjerg, Frede

    2016-01-01

    This study compares a three-phase three-level voltage source inverter with an intermediate dc-dc boost converter and a quasi-Z-source inverter in terms of passive elements values and dimensions, semiconductor stresses, and overall efficiency. A comparative analysis was conducted with relative...

  6. Text mining and visualization case studies using open-source tools

    CERN Document Server

    Chisholm, Andrew

    2016-01-01

    Text Mining and Visualization: Case Studies Using Open-Source Tools provides an introduction to text mining using some of the most popular and powerful open-source tools: KNIME, RapidMiner, Weka, R, and Python. The contributors, all highly experienced with text mining and open-source software, explain how text data are gathered and processed from a wide variety of sources, including books, server access logs, websites, social media sites, and message boards. Each chapter presents a case study that you can follow as part of a step-by-step, reproducible example. You can also easily apply and extend the techniques to other problems. All the examples are available on a supplementary website. The book shows you how to exploit your text data, offering successful application examples and blueprints for you to tackle your text mining tasks and benefit from open and freely available tools. It gets you up to date on the latest and most powerful tools, the data mining process, and specific text mining activities.

  7. An analysis of solar assisted ground source heat pumps in cold climates

    International Nuclear Information System (INIS)

    Emmi, Giuseppe; Zarrella, Angelo; De Carli, Michele; Galgaro, Antonio

    2015-01-01

    Highlights: • The work focuses on solar assisted ground source heat pumps in cold climates. • Multi-year simulations of SAGSHP are carried out in six cold locations. • GSHP and SAGSHP are compared. • The effect of total borehole length on the heat pump energy efficiency is studied. • A dedicated control strategy is used to manage both solar and ground loops. - Abstract: Exploiting renewable energy sources for air-conditioning has been extensively investigated over recent years, and many countries have been working to promote the use of renewable energy to decrease energy consumption and CO2 emissions. Electrical heat pumps currently represent the most promising technology to reduce fossil fuel usage. Although ground source heat pumps use free heat sources and their energy performance is better than that of air source heat pumps, their development has been limited by their high initial investment cost. An alternative solution is one that uses solar thermal collectors coupled with a ground source heat pump in a so-called solar assisted ground source heat pump. A ground source heat pump system, used to heat environments located in a cold climate, was investigated in this study. The solar assisted ground source heat pump extracted heat from the ground by means of borehole heat exchangers and injected excess solar thermal energy into the ground. Building load profiles are usually heating dominated in cold climates, but when common ground source heat pump systems are used only for heating, their performance decreases due to an unbalanced ground load. Solar thermal collectors can help to ensure that systems installed in cold zones perform more efficiently. Computer simulations using a Transient System Simulation (TRNSYS) tool were carried out in six cold locations in order to investigate solar assisted ground source heat pumps. The effect of the borehole length on the energy efficiency of

  8. LIGHT SOURCE: A simulation study of Tsinghua Thomson scattering X-ray source

    Science.gov (United States)

    Tang, Chuan-Xiang; Li, Ren-Kai; Huang, Wen-Hui; Chen, Huai-Bi; Du, Ying-Chao; Du, Qiang; Du, Tai-Bin; He, Xiao-Zhong; Hua, Jian-Fei; Lin, Yu-Zhen; Qian, Hou-Jun; Shi, Jia-Ru; Xiang, Dao; Yan, Li-Xin; Yu, Pei-Cheng

    2009-06-01

    Thomson scattering X-ray sources are compact and affordable facilities that produce short-duration, high-brightness X-ray pulses, enabling new experimental capabilities in ultra-fast science studies as well as medical and industrial applications. Such a facility has been built at the Accelerator Laboratory of Tsinghua University, and an upgrade is in progress. In this paper, we present a proposed layout for the upgrade with design parameters obtained by simulation, aiming at high X-ray pulse flux and brightness, and also enabling advanced dynamics studies and applications of the electron beam. The design and construction status of the main subsystems are also presented.

  9. R-Dimensional ESPRIT-Type Algorithms for Strictly Second-Order Non-Circular Sources and Their Performance Analysis

    Science.gov (United States)

    Steinwandt, Jens; Roemer, Florian; Haardt, Martin; Galdo, Giovanni Del

    2014-09-01

    High-resolution parameter estimation algorithms designed to exploit the prior knowledge about incident signals from strictly second-order (SO) non-circular (NC) sources allow for a lower estimation error and can resolve twice as many sources. In this paper, we derive the R-D NC Standard ESPRIT and the R-D NC Unitary ESPRIT algorithms that provide a significantly better performance compared to their original versions for arbitrary source signals. They are applicable to shift-invariant R-D antenna arrays and do not require a centrosymmetric array structure. Moreover, we present a first-order asymptotic performance analysis of the proposed algorithms, which is based on the error in the signal subspace estimate arising from the noise perturbation. The derived expressions for the resulting parameter estimation error are explicit in the noise realizations and asymptotic in the effective signal-to-noise ratio (SNR), i.e., the results become exact for either high SNRs or a large sample size. We also provide mean squared error (MSE) expressions, where only the assumptions of a zero mean and finite SO moments of the noise are required, but no assumptions about its statistics are necessary. As a main result, we analytically prove that the asymptotic performance of both R-D NC ESPRIT-type algorithms is identical in the high effective SNR regime. Finally, a case study shows that no improvement from strictly non-circular sources can be achieved in the special case of a single source.
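    The shift invariance that ESPRIT-type algorithms exploit is easiest to see in the single-source special case the abstract's case study refers to: for one plane wave, each sensor's output is the previous sensor's multiplied by exp(j*mu). The sketch below estimates mu by a least-squares fit between the two shifted subarrays (the paper's R-D NC algorithms additionally use the signal subspace of many snapshots and the non-circularity structure):

```python
import cmath

def esprit_single_source(snapshots):
    """Single-source ESPRIT: for sensor outputs x[m] ~ s * exp(1j*mu*m),
    shift invariance gives x[m+1] = exp(1j*mu) * x[m], so the phase of the
    correlation between the two overlapping subarrays estimates mu."""
    overlap = sum(x.conjugate() * y
                  for x, y in zip(snapshots, snapshots[1:]))
    return cmath.phase(overlap)

# Noiseless 8-sensor array observing a wave with spatial frequency 0.7:
x = [cmath.exp(1j * 0.7 * m) for m in range(8)]
print(esprit_single_source(x))  # recovers 0.7 (to machine precision)
```

    This toy estimator is a sketch only; it illustrates the shift-invariance principle, not the subspace-based multi-source machinery analysed in the paper.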

  10. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR and MAAP, is an essential process of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as a principal tool for an overall uncertainty analysis in source term quantification, while the LHS is used in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance obtained from cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytical distributions, while in the third the distribution is unknown. The first case uses symmetric analytical distributions, and the second consists of two asymmetric distributions whose skewness is non-zero
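    Latin hypercube sampling, used above to feed the response surface and the SRC/SRRC calculations, stratifies each input's range so that every stratum is sampled exactly once. A minimal sketch on the unit hypercube (mapping to actual parameter distributions would be applied afterwards):

```python
import random

def latin_hypercube(n_samples, n_params, seed=0):
    """n_samples points in [0, 1)^n_params: each parameter axis is cut
    into n_samples equal strata, each stratum is sampled exactly once,
    and the strata are shuffled independently per axis."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)
        columns.append(column)
    return list(zip(*columns))  # one coordinate tuple per sample

points = latin_hypercube(10, 2)
print(points[0])
```

    Compared with plain Monte Carlo, this stratification covers each input's whole range with far fewer of the expensive TH code runs, which is exactly why LHS is paired with the RSM here.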

  11. Molecular Ionization-Desorption Analysis Source (MIDAS) for Mass Spectrometry: Thin-Layer Chromatography

    Science.gov (United States)

    Winter, Gregory T.; Wilhide, Joshua A.; LaCourse, William R.

    2016-02-01

    Molecular ionization-desorption analysis source (MIDAS), a desorption atmospheric pressure chemical ionization (DAPCI) type source for mass spectrometry, has been developed as a multi-functional platform for the direct sampling of surfaces. In this article, its utility for the analysis of thin-layer chromatography (TLC) plates is highlighted. Amino acids, which are difficult to visualize without staining reagents or charring, were detected and identified directly from a TLC plate. To demonstrate the full potential of MIDAS, all active ingredients of an analgesic tablet, separated on a TLC plate, were successfully detected using both positive and negative ion modes. The identity of each of the compounds was confirmed from their mass spectra and compared against standards. After separation, the chemical signal from reference marks (blue permanent marker) placed at the origin and solvent front was used to calculate retention factor (Rf) values from the resulting ion chromatogram. The quantitative capabilities of the device were demonstrated by scanning caffeine spots of increasing sample amount on a TLC plate. A linear curve based on peak area, R2 = 0.994, was generated for seven spots ranging from 50 to 1000 ng of caffeine per spot.
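    The Rf calculation from the marker signals is straightforward: the origin and solvent-front marks recovered from the ion chromatogram bracket each analyte peak. A sketch (positions may be in mm or in scan time, as long as all three share units):

```python
def retention_factor(origin, spot, solvent_front):
    """Rf = (spot - origin) / (solvent_front - origin), with the origin
    and solvent-front reference marks locating the ends of the run."""
    run = solvent_front - origin
    if run <= 0:
        raise ValueError("solvent front must lie beyond the origin")
    rf = (spot - origin) / run
    if not 0.0 <= rf <= 1.0:
        raise ValueError("spot must lie between origin and solvent front")
    return rf

# A spot halfway up a 5 cm run:
print(retention_factor(0.0, 2.5, 5.0))  # 0.5
```

    Using the marker signals as internal references is what lets the scan-time axis of the ion chromatogram stand in for physical distance on the plate.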

  12. Complementarity among climate related energy sources: Sensitivity study to climate characteristics across Europe

    Science.gov (United States)

    Francois, Baptiste; Hingray, Benoit; Creutin, Jean-Dominique; Raynaud, Damien; Borga, Marco; Vautard, Robert

    2015-04-01

    Climate related energy sources like solar power, wind power and hydro power are important contributors to the transition to a low-carbon economy. Past studies, mainly based on solar and wind power, showed that the power from such energy sources fluctuates in time and space following their driving climatic variables. However, when different energy sources are combined, their intermittency is smoothed, resulting in lower time variability of the produced power and a lower storage capacity required for balancing. In this study, we consider solar, wind and hydro energy sources in a 100% renewable Europe using a set of 12 regions following two climate transects, the first going from the Northern regions (Norway, Finland) to the Southern ones (Greece, Andalucía, Tunisia) and the second going from the oceanic climate (west of France, Galicia) to the continental one (Romania, Belorussia). For each of these regions, we combine wind and solar irradiance data from the Weather Research and Forecasting Model (Vautard et al., 2014), temperature data from the European Climate Assessment & Dataset (Haylock et al., 2008) and runoff from the Global Runoff Data Center (GRDC, 1999) to estimate solar power, wind power, run-of-the-river hydro power and the electricity demand over a period of 30 years. The use of this set of 12 regions across Europe allows integrating knowledge about the time and space variability of each energy source. We then assess the optimal share of each energy source, aiming to decrease the time variability of the regional energy balance at different time scales as well as the energy storage required for balancing within each region. We also evaluate how energy transport among regions contributes to smoothing out both the energy balance and the storage requirement. The strengths of this study are i) handling run-of-the-river hydro power in addition to wind and solar energy sources and ii) carrying out this analysis
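    The optimal-share assessment described above can be sketched as a search for the mix of sources that minimizes the variance of the regional energy balance. The brute-force grid search below is illustrative only (the study works with 30 years of data and several time scales; the series here are toy values):

```python
def balance_variance(shares, sources, demand):
    """Variance of the energy balance: weighted production minus demand."""
    balance = [sum(w * s[t] for w, s in zip(shares, sources)) - demand[t]
               for t in range(len(demand))]
    mean = sum(balance) / len(balance)
    return sum((b - mean) ** 2 for b in balance) / len(balance)

def best_mix(sources, demand, step=0.05):
    """Grid search over three shares summing to one (e.g. solar, wind,
    run-of-the-river hydro), keeping the lowest balance variance."""
    n = int(round(1 / step))
    best_v, best_shares = float("inf"), None
    for i in range(n + 1):
        for j in range(n - i + 1):
            shares = (i * step, j * step, 1 - (i + j) * step)
            v = balance_variance(shares, sources, demand)
            if v < best_v:
                best_v, best_shares = v, shares
    return best_shares

# Toy series: the first source tracks demand exactly, so it wins outright.
demand = [1.0, 2.0, 3.0]
sources = ([1.0, 2.0, 3.0], [3.0, 2.0, 1.0], [2.0, 2.0, 2.0])
print(best_mix(sources, demand))
```

    Minimizing the variance of the balance, rather than its mean, is the point: a smoother residual demand directly reduces the storage capacity needed for balancing.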

  13. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert

    2011-06-01

    Full Text Available

    This study presents the latest developments of an approach called 'flash sourcing', which provides information on the effects of an earthquake within minutes of its occurrence. Information is derived from an analysis of the traffic surges on the European-Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first to be informed of, and hence the first concerned by, an earthquake's occurrence. Flash sourcing maps the felt area and, at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information, and, beyond seismology, we consider what it can teach us about public responses to experiencing an earthquake. Future developments should improve the description of earthquake effects and could contribute to more efficient earthquake response by filling the information gap just after an earthquake occurs.
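
A minimal sketch of the surge-detection idea: flag minutes whose website hit count jumps far above the recent baseline. The traffic numbers, window length and z-score threshold are invented for illustration; the EMSC's actual detector is not described here:

```python
from statistics import mean, stdev

def detect_surge(hits, window=10, threshold=5.0):
    """Return indices of minutes whose hit count exceeds the running
    baseline by more than `threshold` standard deviations."""
    surges = []
    for i in range(window, len(hits)):
        base = hits[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and (hits[i] - mu) / sigma > threshold:
            surges.append(i)
    return surges

# Simulated per-minute hits: steady traffic, then a felt earthquake at minute 15.
traffic = [50, 48, 52, 51, 49, 50, 53, 47, 50, 52, 49, 51, 50, 48, 52, 400, 380]
print(detect_surge(traffic))  # → [15]
```

Minute 16 is not flagged because the spike at minute 15 has already inflated the baseline's spread; a production detector would handle sustained surges explicitly.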

  14. [The use of personal sources for the study of emigration from Galicia: present state and perspectives].

    Science.gov (United States)

    Vazquez Gonzalez, A

    1996-08-01

    "Spanish sources for the study of emigration are sparse and fragmentary.... Mortgage documents for the payment of ocean transportation enable us to appreciate the spreading action of shipping agents; official listings of draft dodgers reveal that in general the River Plate was a favorite destination, rather than Cuba or Brazil. People from Galicia emigrated from rural origins to urban destinations in America; the analysis of place of birth of emigrants residing in A Coruna at the time of emigration show that there was also, in some cases, a first stage of rural-urban migration within Galicia. The general picture of emigration from Galicia is built [up] through the combination of the existing sources in Spain." (EXCERPT)

  15. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks.

    Science.gov (United States)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-06-15

    In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples from the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of the 38 antibiotics were detected in the samples. Concentrations of antibiotics in the water phase correlated positively with chemical oxygen demand and nitrate. The source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed strong sorption onto sediments, reflected in their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin-H2O posed high risks to aquatic organisms.
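
The pseudo-partitioning coefficient mentioned above is simply the ratio of the sediment concentration to the water concentration. A minimal sketch with invented example values (not the study's measurements):

```python
def pseudo_kd(c_sediment_ng_g, c_water_ng_L):
    """Pseudo-partitioning coefficient in L/kg: sediment concentration
    (ng/g dry weight) over water concentration (ng/L), with a factor of
    1000 converting grams to kilograms."""
    return 1000.0 * c_sediment_ng_g / c_water_ng_L

# Illustrative (invented) numbers: a strongly sorbing compound versus one
# that stays mostly in the water phase.
print(pseudo_kd(120.0, 40.0))  # → 3000.0 L/kg
print(pseudo_kd(2.0, 40.0))    # → 50.0 L/kg
```

A higher coefficient means a larger fraction of the compound is bound to sediment, which is why fluoroquinolones' high values indicate strong sorption.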

  16. Source characterization of Purnima Neutron Generator (PNG)

    International Nuclear Information System (INIS)

    Bishnoi, Saroj; Patel, T.; Paul, Ram K.; Sarkar, P.S.; Adhikari, P.S.; Sinha, Amar

    2011-01-01

    The use of 14.1 MeV neutron generators for applications such as elemental analysis, Accelerator Driven System (ADS) studies and fast neutron radiography requires characterization of the neutron source, i.e. neutron yield (emission rate in n/sec), neutron dose, beam spot size and energy spectrum. In this paper, a series of experiments carried out to characterize this neutron source is described. The source has been quantified in terms of neutron emission rate, neutron dose at various source strengths and beam spot size at the target position.

  17. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    Full Text Available Abstract Background Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria, as well as to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format) and they typically accept only gene lists as input. Results TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies whether segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for sets of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. Biologically relevant chromosomal segments and gene
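
Classic quantile normalization, on which TRAM's 'scaled quantile' variant builds, can be sketched as follows. This is the generic method, not TRAM's exact implementation, and the expression matrix is invented:

```python
import numpy as np

def quantile_normalize(X):
    """Classic quantile normalization: force every sample (column) to
    share the same empirical distribution by replacing each value with
    the mean of the values at its rank across all samples."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # rank within each column
    mean_sorted = np.sort(X, axis=0).mean(axis=1)      # mean value at each rank
    return mean_sorted[ranks]

# Toy matrix: 4 genes (rows) x 3 samples (columns).
expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
print(quantile_normalize(expr))
```

After normalization every column contains the same set of values, so inter-sample differences in overall distribution are removed while within-sample rankings are preserved.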

  18. Economic analysis for the electricity production in isolated areas in Cuba using different renewable sources

    International Nuclear Information System (INIS)

    Morales Salas, Joel; Moreno Figueredo, Conrado; Briesemeister, Ludwig; Arzola, Jose

    2015-01-01

    Despite the effort and commitment of the Cuban government over more than 50 years, there are still houses without electricity in areas remote from the electricity network. These houses and communities have the promise and commitment of the local and national authorities to help improve their quality of life. Because these houses and communities are remote from the electricity network, the cost of extending the network is considerably high. For that reason, the use of renewable sources in these areas is an acceptable proposal. This article analyses different configurations depending on the number of houses. It proposes the use of the Hydrothermal Carbonization process in cases where it is not feasible to introduce other renewable sources; this technology is new in Cuba and is advantageous given the kinds of biomass that exist there. The chemistry of the Hydrothermal Carbonization process with Cuban biomass should be further researched. (full text)

  19. Monte Carlo Simulations Validation Study: Vascular Brachytherapy Beta Sources

    International Nuclear Information System (INIS)

    Orion, I.; Koren, K.

    2004-01-01

    During the last decade many versions of angioplasty irradiation treatment have been proposed. The purpose of this unique brachytherapy is to deliver a sufficient radiation dose to the vessel wall in order to prevent restenosis, a clinical sequela of balloon angioplasty. The most suitable sources for this vascular brachytherapy are β-emitters such as Re-188, P-32 and Sr-90/Y-90, with maximum energies of up to 2.1 MeV [1,2,3]. The radioactive catheter configurations offered for these treatments can be a simple wire [4], a fluid-filled balloon or a coated stent. Each source is positioned differently inside the blood vessel, and the ranges of the emitted electrons therefore vary. Many types of sources and configurations have been studied either experimentally or with the Monte Carlo calculation technique, with most Monte Carlo simulations carried out using EGS4 [5] or MCNP [6]. In this study we compared the beta-source absorbed dose versus radial distance for two treatment configurations using MCNP and EGS4 simulations. The comparison was aimed at uncovering the differences between the MCNP and EGS4 simulation code systems in intermediate-energy electron transport.

  20. Prospects for accelerator neutron sources for large volume minerals analysis

    International Nuclear Information System (INIS)

    Clayton, C.G.; Spackman, R.

    1988-01-01

    The electron Linac can be regarded as a practical source of thermal neutrons for activation analysis of large-volume mineral samples. With a suitable target and moderator, a neutron flux of about 10¹⁰ n/cm²/s over 2-3 kg of rock can be generated. The proton Linac offers the possibility of a high neutron yield (>10¹² n/s) of fast neutrons at selected energies. For the electron Linac, targets of W-U and W-Be are discussed. The advantages and limitations of the system are demonstrated for the analysis of gold in rocks and ores and for platinum in chromitite. These elements were selected as they are most likely to justify an accelerator installation at the present time. Errors due to self-shielding in gold particles for thermal neutrons are discussed. The proton Linac is considered for neutrons generated from a lithium target through the ⁷Li(p,n)⁷Be reaction. The analysis of gold by fast-neutron activation is considered. This approach avoids particle self-absorption and, by appropriate proton energy selection, avoids potentially dominating interfering reactions. The analysis of ²³⁵U in the presence of ²³⁸U and ²³²Th is also considered. (author)

  1. Source analysis of peroxyacetyl nitrate (PAN) in Guangzhou, China: a yearlong observation study

    Science.gov (United States)

    Wang, B. G.; Zhu, D.; Zou, Y.; Wang, H.; Zhou, L.; Ouyang, X.; Shao, H. F.; Deng, X. J.

    2015-06-01

    In recent years, photochemical smog has been a major cause of air pollution in the metropolitan area of Guangzhou, China, with a continuing increase in the concentrations of photochemical pollutants. The concentration of peroxyacetyl nitrate (PAN) has often been found to reach very high levels, posing a potential threat to public health. To better understand the changes in PAN concentration and its sources, a study was carried out from January to December 2012 at the Guangzhou Panyu Atmospheric Composition Station (GPACS) to measure the atmospheric concentrations of PAN as well as those of ozone (O3), nitrogen oxides (NOx) and non-methane hydrocarbons (NMHC). These data were analyzed to investigate the quantitative relationships between PAN and its precursors. In the study period, the hourly concentrations of PAN varied from below the instrument detection limit to 12.0 ppbv. The yearly mean concentration of PAN was 0.84 ppbv, with the daily mean concentration exceeding 5 ppbv on 32 of the observation days. Calculations indicate that among the measured NMHC species, alkenes accounted for 53% of the total NMHC contribution to PAN production, with aromatics and alkanes accounting for about 11% and 7%, respectively. During the period of our observation only a modest correlation was found between the concentrations of PAN and O3 for daytime hours, and observed PAN concentrations were relatively high even when the observed NMHCs/NOx ratio was low. This suggests that regional air-mass transport of pollutants had a major impact on PAN concentrations in the Guangzhou area.

  2. Source apportionment of PAH in Hamilton Harbour suspended sediments: comparison of two factor analysis methods.

    Science.gov (United States)

    Sofowote, Uwayemi M; McCarry, Brian E; Marvin, Christopher H

    2008-08-15

    A total of 26 suspended sediment samples collected over a 5-year period in Hamilton Harbour, Ontario, Canada and surrounding creeks were analyzed for a suite of polycyclic aromatic hydrocarbons and sulfur heterocycles. Hamilton Harbour sediments contain relatively high levels of polycyclic aromatic compounds and heavy metals due to emissions from industrial and mobile sources. Two receptor modeling methods using factor analyses were compared to determine the profiles and relative contributions of pollution sources to the harbor; these methods are principal component analyses (PCA) with multiple linear regression analysis (MLR) and positive matrix factorization (PMF). Both methods identified four factors and gave excellent correlation coefficients between predicted and measured levels of 25 aromatic compounds; both methods predicted similar contributions from coal tar/coal combustion sources to the harbor (19 and 26%, respectively). One PCA factor was identified as contributions from vehicular emissions (61%); PMF was able to differentiate vehicular emissions into two factors, one attributed to gasoline emissions sources (28%) and the other to diesel emissions sources (24%). Overall, PMF afforded better source identification than PCA with MLR. This work constitutes one of the few examples of the application of PMF to the source apportionment of sediments; the addition of sulfur heterocycles to the analyte list greatly aided in the source identification process.
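
The regression step of a PCA/MLR-style apportionment can be sketched with synthetic data: given known source profiles, the contribution of each source to a measured sample is estimated by least squares. The profiles and contributions below are invented, not the study's values:

```python
import numpy as np

# Two hypothetical source profiles (relative composition over 3 marker
# compounds), e.g. coal tar/coal combustion vs. vehicular emissions.
profiles = np.array([[0.6, 0.3, 0.1],
                     [0.2, 0.3, 0.5]])
true_contrib = np.array([2.0, 3.0])   # arbitrary source strengths
sample = true_contrib @ profiles      # synthetic "measured" concentrations

# MLR step: regress the sample onto the source profiles.
contrib, *_ = np.linalg.lstsq(profiles.T, sample, rcond=None)
shares = contrib / contrib.sum()
print("estimated shares:", shares.round(2))
```

PMF differs from this sketch in that it estimates both the profiles and the contributions from the data, under non-negativity constraints, which is what allowed it to split the vehicular factor into gasoline and diesel components.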

  3. Discussion of metallogenic substance source of Xiangshan uranium orefield

    International Nuclear Information System (INIS)

    Shao Fei; Tang Xiangsheng; Zou Maoqin; Hu Maomei; He Xiaomei; Chen Xiaoming; Xu Hengli

    2008-01-01

    Analysis of the uranium source is a key problem in the study of uranium deposit genesis. Based on the distribution characteristics of regional uranium abundance, on the temporal and spatial evolution of regional metallogenic substances through geological history, and on the indications given by the Pb isotopic compositions of ores and the REE geochemistry of both rocks and ores in the Xiangshan orefield, the Lower Cambrian strata are determined to be the regional uranium source bed, the Xiangshan volcanic basin is the accumulation area for regional metallogenic substances, and magma and post-magmatic hydrothermal solutions are the media for uranium transport. Magmatism accomplished the migration of uranium from 'source' to 'accumulation'. During magmatic evolution, uranium entered the gas phase, providing the material basis for uranium mineralization. Post-magmatic fluid-rock interaction also mobilized some uranium from the basement schist and rhyodacite into the metallogenic solution. (authors)

  4. Analysis of hard X-ray emission from selected very high energy γ-ray sources observed with INTEGRAL

    International Nuclear Information System (INIS)

    Hoffmann, Agnes Irene Dorothee

    2009-01-01

    A few years ago, the era of very high energy γ-ray astronomy started, when the latest generation of Imaging Atmospheric Cherenkov Telescopes (IACT) like H.E.S.S. began to operate and to resolve the sources of TeV emission. Identifications via multi-wavelength studies reveal that the detected sources are supernova remnants and active galactic nuclei, but also pulsar wind nebulae and a few binaries. One widely discussed open question is, how these sources are able to accelerate particles to such high energies. The understanding of the underlying particle distribution, the acceleration processes taking place, and the knowledge of the radiation processes which produce the observed emission, is, therefore, of crucial interest. Observations in the hard X-ray domain can be a key to get information on these particle distributions and processes. Important for this thesis are the TeV and the hard X-ray range. The two instruments, H.E.S.S. and INTEGRAL, whose data were used, are, therefore, described in detail. The main part of this thesis is focused on the X-ray binary system LS 5039/RX J1826.2-1450. It was observed in several energy ranges. The nature of the compact object is still not known, and it was proposed either to be a microquasar system or a non-accreting pulsar system. The observed TeV emission is modulated with the orbital cycle. Several explanations for this variability have been discussed in recent years. The observations with INTEGRAL presented in this thesis have provided new information to solve this question. Therefore, a search for a detection in the hard X-ray range and for its orbital dependence was worthwhile. Since LS 5039 is a faint source and the sky region where it is located is crowded, a very careful, non-standard handling of the INTEGRAL data was necessary, and a cross-checking with other analysis methods was essential to provide reliable results. We found that LS 5039 is emitting in the hard X-ray energy range. A flux rate and an upper flux

  5. Analysis of hard X-ray emission from selected very high energy {gamma}-ray sources observed with INTEGRAL

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, Agnes Irene Dorothee

    2009-11-13

    A few years ago, the era of very high energy {gamma}-ray astronomy started, when the latest generation of Imaging Atmospheric Cherenkov Telescopes (IACT) like H.E.S.S. began to operate and to resolve the sources of TeV emission. Identifications via multi-wavelength studies reveal that the detected sources are supernova remnants and active galactic nuclei, but also pulsar wind nebulae and a few binaries. One widely discussed open question is, how these sources are able to accelerate particles to such high energies. The understanding of the underlying particle distribution, the acceleration processes taking place, and the knowledge of the radiation processes which produce the observed emission, is, therefore, of crucial interest. Observations in the hard X-ray domain can be a key to get information on these particle distributions and processes. Important for this thesis are the TeV and the hard X-ray range. The two instruments, H.E.S.S. and INTEGRAL, whose data were used, are, therefore, described in detail. The main part of this thesis is focused on the X-ray binary system LS 5039/RX J1826.2-1450. It was observed in several energy ranges. The nature of the compact object is still not known, and it was proposed either to be a microquasar system or a non-accreting pulsar system. The observed TeV emission is modulated with the orbital cycle. Several explanations for this variability have been discussed in recent years. The observations with INTEGRAL presented in this thesis have provided new information to solve this question. Therefore, a search for a detection in the hard X-ray range and for its orbital dependence was worthwhile. Since LS 5039 is a faint source and the sky region where it is located is crowded, a very careful, non-standard handling of the INTEGRAL data was necessary, and a cross-checking with other analysis methods was essential to provide reliable results. We found that LS 5039 is emitting in the hard X-ray energy range. A flux rate and an upper

  6. PARTITION: A program for defining the source term/consequence analysis interface in the NUREG--1150 probabilistic risk assessments

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.; Johnson, J.D.

    1990-05-01

    The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This document has been designed for users of the PARTITION computer program developed by the authors at Sandia National Laboratories for defining the interface between the source term analysis (performed with the XXSOR programs) and the consequence analysis (performed with the MACCS program). This report provides a tutorial that details how the interactive partitioning is performed, along with detailed information on the partitioning process. The PARTITION program was written in ANSI standard FORTRAN 77 to make the code as machine-independent (i.e., portable) as possible. 9 refs., 4 figs

  7. Analysis of loss distribution of Conventional Boost, Z-source and Y-source Converters for wide power and voltage range

    DEFF Research Database (Denmark)

    Gadalla, Brwene Salah Abdelkarim; Schaltz, Erik; Siwakoti, Yam Prasad

    2017-01-01

    Boost converters are needed in many applications which require the output voltage to be higher than the input voltage. Recently, boost type converters have been applied for industrial applications, and hence it has become an interesting topic of research. Many researchers proposed different...... impedance source converters with their unique advantages as having a high voltage gain in a small range of duty cycle ratio. However, the thermal behaviour of the semiconductor devices and passive elements in the impedance source converter is an important issue from a reliability point of view and it has...... not been investigated yet. Therefore, this paper presents a comparison between the conventional boost, the Z-source, and the Y-source converters based on a thermal evaluation of the semiconductors. In addition, the three topologies are also compared with respect to their efficiency. In this study...
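
The idealized voltage gains behind this comparison can be illustrated as follows, assuming lossless operation in continuous conduction. The boost and Z-source expressions are the standard ideal formulas; the Y-source winding factor K = 3 is an assumed value for illustration (it depends on the actual turns ratios):

```python
# Ideal voltage gain vs. duty (or shoot-through) ratio D for the three
# converter families compared in the paper.
def boost_gain(d):
    return 1.0 / (1.0 - d)            # conventional boost

def z_source_gain(d):
    return 1.0 / (1.0 - 2.0 * d)      # Z-source, shoot-through ratio d

def y_source_gain(d, k=3):
    return 1.0 / (1.0 - k * d)        # Y-source, winding factor k (assumed)

for d in (0.1, 0.2, 0.3):
    print(d, round(boost_gain(d), 2), round(z_source_gain(d), 2),
          round(y_source_gain(d), 2))
```

At D = 0.3 the Y-source already reaches a gain of 10 while the conventional boost gives about 1.43, which is the "high voltage gain in a small range of duty cycle ratio" advantage the abstract refers to; the thermal comparison in the paper then asks what this costs in semiconductor stress.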

  8. Evaluation of Skin Surface as an Alternative Source of Reference DNA Samples: A Pilot Study.

    Science.gov (United States)

    Albujja, Mohammed H; Bin Dukhyil, Abdul Aziz; Chaudhary, Abdul Rauf; Kassab, Ahmed Ch; Refaat, Ahmed M; Babu, Saranya Ramesh; Okla, Mohammad K; Kumar, Sachil

    2018-01-01

    Identifying acceptable body areas for collecting DNA reference samples is part of the development of forensic DNA analysis. The aim of this study was to evaluate skin surface cells (SSC) as an alternative source of reference DNA samples. From each volunteer (n = 10), six samples from skin surface areas (forearm and fingertips) and two traditional samples (blood and buccal cells) were collected. Genomic DNA was extracted, quantified and then genotyped using standard techniques. The highest DNA concentration among SSC samples was obtained with the tape/forearm method of collection (2.1 ng/μL). Cotton swabs moistened with ethanol yielded higher quantities of DNA than swabs moistened with salicylic acid, and gave the highest percentage of full STR profiles (97%). This study supports the use of SSC as a noninvasive sampling technique and as an extremely useful source of DNA reference samples in cultures where the use of buccal swabs may be considered socially unacceptable.

  9. A Study of Porphyrins in Petroleum Source Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Huseby, Berit

    1997-12-31

    This thesis discusses several aspects of porphyrin geochemistry. Degradation experiments have been performed on the Messel oil shale (Eocene, Germany) to obtain information on porphyrins bound or incorporated into macromolecular structures. Thermal heating of the preextracted kerogen by hydrous pyrolysis was used to study the release of porphyrins and their temperature-dependent changes during simulated diagenesis and catagenesis. Selective chemical degradation experiments were performed on the preextracted sediment to get more detailed information about porphyrins that are specifically bound to the macromolecular structures via ester bonds. From the heating experiments, in a separate study, the porphyrin nitrogen content in the generated bitumens was compared to the bulk of organic nitrogen compounds in the fraction. The bulk nitrogen contents in the generated bitumens, the water phase and the residual organic matter were recorded to establish the distribution of nitrogen between the kerogen and product phases. Porphyrins as biomarkers were examined in naturally matured Kimmeridge clay source rocks (Upper Jurassic, Norway), and the use of porphyrins as general indicators of maturity was evaluated. Underlying maturity trends in the biomarker data were investigated by Partial Least Squares analysis. Porphyrins as indicators of depositional conditions were also addressed, where the abundances of nickel and vanadyl porphyrins were mapped together with other descriptors that are assumed to be indicative of redox depositional conditions. 252 refs., 28 figs., 4 tabs.

  10. A Study of Porphyrins in Petroleum Source Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Huseby, Berit

    1996-12-31

    This thesis discusses several aspects of porphyrin geochemistry. Degradation experiments have been performed on the Messel oil shale (Eocene, Germany) to obtain information on porphyrins bound or incorporated into macromolecular structures. Thermal heating of the preextracted kerogen by hydrous pyrolysis was used to study the release of porphyrins and their temperature-dependent changes during simulated diagenesis and catagenesis. Selective chemical degradation experiments were performed on the preextracted sediment to get more detailed information about porphyrins that are specifically bound to the macromolecular structures via ester bonds. From the heating experiments, in a separate study, the porphyrin nitrogen content in the generated bitumens was compared to the bulk of organic nitrogen compounds in the fraction. The bulk nitrogen contents in the generated bitumens, the water phase and the residual organic matter were recorded to establish the distribution of nitrogen between the kerogen and product phases. Porphyrins as biomarkers were examined in naturally matured Kimmeridge clay source rocks (Upper Jurassic, Norway), and the use of porphyrins as general indicators of maturity was evaluated. Underlying maturity trends in the biomarker data were investigated by Partial Least Squares analysis. Porphyrins as indicators of depositional conditions were also addressed, where the abundances of nickel and vanadyl porphyrins were mapped together with other descriptors that are assumed to be indicative of redox depositional conditions. 252 refs., 28 figs., 4 tabs.

  11. Adapting Controlled-source Coherence Analysis to Dense Array Data in Earthquake Seismology

    Science.gov (United States)

    Schwarz, B.; Sigloch, K.; Nissen-Meyer, T.

    2017-12-01

    Exploration seismology deals with highly coherent wave fields generated by repeatable controlled sources and recorded by dense receiver arrays, whose geometry is tailored to back-scattered energy normally neglected in earthquake seismology. Owing to these favorable conditions, stacking and coherence analysis are routinely employed to suppress incoherent noise and regularize the data, thereby strongly contributing to the success of subsequent processing steps, including migration for the imaging of back-scattering interfaces or waveform tomography for the inversion of velocity structure. Attempts have been made to utilize wave-field coherence on the length scales of passive-source seismology, e.g. for the imaging of transition-zone discontinuities or the core-mantle boundary using reflected precursors. Results are, however, often deteriorated by the sparse station coverage and the interference of faint back-scattered phases with transmitted phases. USArray sampled wave fields generated by earthquake sources at an unprecedented density, and similar array deployments are ongoing or planned in Alaska, the Alps and Canada. This makes the local coherence of earthquake data an increasingly valuable resource to exploit. Building on the experience of controlled-source surveys, we aim to extend the well-established concept of beam-forming to the richer toolbox that is nowadays used in seismic exploration. We suggest adapted strategies for local data coherence analysis, where summation is performed with operators that extract the local slope and curvature of wave fronts emerging at the receiver array. Besides estimating wave-front properties, we demonstrate that the inherent data summation can also be used to generate virtual station responses at intermediate locations where no actual deployment was performed. Owing to the fact that stacking acts as a directional filter, interfering coherent wave fields can be efficiently separated from each other by means of coherent subtraction. We
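
Delay-and-sum stacking, the core of the beam-forming described above, can be sketched on synthetic array data. The geometry, slowness grid and noise levels below are invented; the idea is simply that the stack power peaks when the trial slowness matches the wavefront's true slope across the array:

```python
import numpy as np

n_rec, n_t = 8, 200
dt, dx = 0.01, 100.0        # sample interval (s), receiver spacing (m)
true_slow = 2e-4            # true horizontal slowness (s/m)
rng = np.random.default_rng(1)

# Noisy traces with an impulsive arrival moving out linearly across the array.
traces = rng.normal(0, 0.2, (n_rec, n_t))
for r in range(n_rec):
    t0 = int(round((0.5 + true_slow * r * dx) / dt))
    traces[r, t0] += 1.0

def beam_power(slow):
    """Shift each trace by the trial moveout and sum; return peak stack power."""
    stack = np.zeros(n_t)
    for r in range(n_rec):
        shift = int(round(slow * r * dx / dt))
        stack += np.roll(traces[r], -shift)
    return np.max(stack ** 2)

# Grid search over trial slownesses.
slows = np.linspace(0, 5e-4, 26)
best = slows[np.argmax([beam_power(s) for s in slows])]
print(f"estimated slowness: {best:.1e} s/m")
```

The "local slope" operators mentioned in the abstract generalize this constant-slowness shift to spatially varying slopes and curvatures, but the alignment-and-summation principle is the same.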

  12. Study on the cathode of ion source for neutral beam injector

    International Nuclear Information System (INIS)

    Tanaka, Shigeru

    1983-08-01

    Durability of the cathode is an important problem in developing a high-power, long-pulse ion source for a neutral beam injector. The purpose of this study is to develop a long-life cathode and to investigate its applicability to the source. Directly heated filaments, which are commonly used as the cathode of injector sources, generally do not last very long. In the present work, an indirectly heated hollow cathode made of an impregnated porous tungsten tube is proposed as an alternative to the directly heated cathode. First, we fabricated a small hollow cathode to study the discharge characteristics in a bell-jar configuration and applied it to a duoPIGatron hydrogen ion source. The experiment showed that the gas flow rate required to sustain a stable arc discharge in the discharge chamber becomes higher than when a filament cathode is used. To solve this problem, a gas-reduction experiment was made using a newly fabricated larger hollow cathode and a magnetic multipole ion source. The influence of the orifice diameter and the effects of a button and of the magnetic field on the gas flow rate were experimentally studied, and a method for gas reduction was found. In addition, the effect of the magnetic field on the characteristics of the hollow cathode ion source was examined in detail and an optimum field configuration around the cathode was found. Finally, beam extraction from an intensively cooled hollow cathode ion source for up to 10 sec was successfully carried out. (author)

  13. Population studies of the unidentified EGRET sources

    Energy Technology Data Exchange (ETDEWEB)

    Siegal-Gaskins, J M [University of Chicago, Chicago, IL 60637 (United States); Pavlidou, V [University of Chicago, Chicago, IL 60637 (United States); Olinto, A V [University of Chicago, Chicago, IL 60637 (United States); Brown, C [University of Chicago, Chicago, IL 60637 (United States); Fields, B D [University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States)

    2007-03-15

    The third EGRET catalog contains a large number of unidentified sources. Current data allows the intriguing possibility that some of these objects may represent a new class of yet undiscovered gamma-ray sources. By assuming that galaxies similar to the Milky Way host comparable populations of objects, we constrain the allowed Galactic abundance and distribution of various classes of gamma-ray sources using the EGRET data set. Furthermore, regardless of the nature of the unidentified sources, faint unresolved objects of the same class contribute to the observed diffuse gamma-ray background. We investigate the potential contribution of these unresolved sources to the extragalactic gamma-ray background.

  14. Population studies of the unidentified EGRET sources

    International Nuclear Information System (INIS)

    Siegal-Gaskins, J M; Pavlidou, V; Olinto, A V; Brown, C; Fields, B D

    2007-01-01

    The third EGRET catalog contains a large number of unidentified sources. Current data allows the intriguing possibility that some of these objects may represent a new class of yet undiscovered gamma-ray sources. By assuming that galaxies similar to the Milky Way host comparable populations of objects, we constrain the allowed Galactic abundance and distribution of various classes of gamma-ray sources using the EGRET data set. Furthermore, regardless of the nature of the unidentified sources, faint unresolved objects of the same class contribute to the observed diffuse gamma-ray background. We investigate the potential contribution of these unresolved sources to the extragalactic gamma-ray background.

  15. Molecular line study of massive star-forming regions from the Red MSX Source survey

    Science.gov (United States)

    Yu, Naiping; Wang, Jun-Jie

    2014-05-01

    In this paper, we have selected a sample of massive star-forming regions from the Red MSX Source survey, in order to study star formation activity (mainly outflow and inflow signatures). We have focused on three molecular lines from the Millimeter Astronomy Legacy Team Survey at 90 GHz: HCO+(1-0), H13CO+(1-0) and SiO(2-1). According to previous observations, our sources can be divided into two groups: nine massive young stellar object candidates (radio-quiet) and 10 H II regions (with spherical or unresolved radio emission). Outflow activity was found in 11 sources, while only three show inflow signatures. The high outflow detection rate indicates that outflows are common in massive star-forming regions. The inflow detection rate was relatively low; we suggest that this is because of beam dilution of the telescope. All three inflow candidates have outflow(s). The outward radiation and thermal pressure from the central massive star(s) do not seem to be strong enough to halt accretion in G345.0034-00.2240. Our simple model of G318.9480-00.1969 shows that it has an infall velocity of about 1.8 km s⁻¹. The spectral energy distribution analysis confirms that our sources are massive and intermediate-mass star-forming regions.

  16. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this end the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources of import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) searching a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) selecting items of interest to specific verifications; and c) mapping these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a ''Nuclear Security Media Monitor'' (NSMM), a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. The first part of the paper recalls the trade data sources relevant to non-proliferation and then illustrates the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. The second part presents the main aspects of the NSMM, illustrating some of the uses made of it at JRC. (author)

  17. A compact hard X-ray source for medical imaging and biomolecular studies

    International Nuclear Information System (INIS)

    Cline, D.B.; Green, M.A.; Kolonko, J.

    1995-01-01

    There are a large number of synchrotron light sources in the world. However, these sources are designed for physics, chemistry, and engineering studies. To our knowledge, none have been optimized for either medical imaging or biomolecular studies. There are special needs for these applications. We present here a preliminary design of a very compact source, small enough for a hospital or a biomolecular laboratory, that is suitable for these applications. (orig.)

  18. Strategic planning as a competitive differential: A case study of the Sealed Sources Production Laboratory

    International Nuclear Information System (INIS)

    Vieira, Imário; Nascimento, Fernando C.; Calvo, Wilson A. Parejo

    2017-01-01

    Strategic planning has always been and continues to be one of the most important management tools for decision making. Amidst the uncertainties of the 21st century, public, private and third-sector organizations are constantly striving to improve their strategic plans by using more effective results-management tools such as the BSC (Balanced Scorecard). Nuclear research institutes and research centers around the world have been making increasing use of these types of tools in their strategic planning and management. The objective of this article is to recommend the use of the BSC as a strategic tool for decision making at the Sealed Sources Production Laboratory, located in the Radiation Technology Center at the Nuclear and Energy Research Institute (IPEN/CNEN-SP), in Sao Paulo, Brazil. The methodology used was a case study, which considered the object of study, the Sealed Sources Production Laboratory, from January 2014 to August 2016. The main results obtained with this study include an improved information flow and a proposal to change the periodicity of analysis of the results, among others. In view of the expected results, it was possible to conclude that this study may be of value to the Sealed Sources Production Laboratory for Industrial Radiography and Industrial Process Control, and also to other research centers, as it will provide an additional management support tool. (author)

  19. Strategic planning as a competitive differential: A case study of the Sealed Sources Production Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Imário; Nascimento, Fernando C.; Calvo, Wilson A. Parejo, E-mail: imariovieira@yahoo.com, E-mail: wapcalvo@ipen.br, E-mail: fcodelo@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Faculdade SENAI de Tecnologia Ambiental, Sao Bernardo do Campo, SP (Brazil)

    2017-11-01

    Strategic planning has always been and continues to be one of the most important management tools for decision making. Amidst the uncertainties of the 21st century, public, private and third-sector organizations are constantly striving to improve their strategic plans by using more effective results-management tools such as the BSC (Balanced Scorecard). Nuclear research institutes and research centers around the world have been making increasing use of these types of tools in their strategic planning and management. The objective of this article is to recommend the use of the BSC as a strategic tool for decision making at the Sealed Sources Production Laboratory, located in the Radiation Technology Center at the Nuclear and Energy Research Institute (IPEN/CNEN-SP), in Sao Paulo, Brazil. The methodology used was a case study, which considered the object of study, the Sealed Sources Production Laboratory, from January 2014 to August 2016. The main results obtained with this study include an improved information flow and a proposal to change the periodicity of analysis of the results, among others. In view of the expected results, it was possible to conclude that this study may be of value to the Sealed Sources Production Laboratory for Industrial Radiography and Industrial Process Control, and also to other research centers, as it will provide an additional management support tool. (author)

  20. Noise Source Identification of a Ring-Plate Cycloid Reducer Based on Coherence Analysis

    Directory of Open Access Journals (Sweden)

    Bing Yang

    2013-01-01

    A ring-plate-type cycloid speed reducer is one of the most important reducer types owing to its small volume, compactness, smooth operation, high performance and high reliability. The vibration and noise tests of the reducer prototype were completed using the HEAD acoustics multichannel noise test and analysis system. The characteristics of the vibration and noise are obtained based on coherence analysis, and the noise sources are identified. The conclusions provide a basis for further noise research and control of the ring-plate-type cycloid reducer.
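
    As a hedged illustration of the coherence analysis the abstract relies on, the sketch below estimates the segment-averaged magnitude-squared coherence between two synthetic channels that share a 50 Hz tone plus independent noise. It is a bare-bones Welch-style estimate (no windowing or overlap), not the HEAD acoustics toolchain used in the study; all signals and parameters are invented for illustration.

    ```python
    import numpy as np

    def coherence(x, y, nseg=32):
        """Segment-averaged magnitude-squared coherence of two equal-length signals."""
        n = len(x) // nseg
        X = np.fft.rfft(np.reshape(x[:n * nseg], (nseg, n)), axis=1)
        Y = np.fft.rfft(np.reshape(y[:n * nseg], (nseg, n)), axis=1)
        sxy = (X * np.conj(Y)).mean(axis=0)        # averaged cross-spectrum
        sxx = (np.abs(X) ** 2).mean(axis=0)        # averaged auto-spectra
        syy = (np.abs(Y) ** 2).mean(axis=0)
        return np.abs(sxy) ** 2 / (sxx * syy)

    # Two synthetic channels sharing a 50 Hz tone plus independent noise
    rng = np.random.default_rng(0)
    fs = 1024
    t = np.arange(32 * fs) / fs
    common = np.sin(2 * np.pi * 50 * t)
    x = common + 0.5 * rng.standard_normal(t.size)
    y = 0.8 * common + 0.5 * rng.standard_normal(t.size)
    coh = coherence(x, y)   # with 1 s segments, bin k corresponds to k Hz
    ```

    Near-unity coherence at a frequency flags a common source driving both channels; averaging over segments is essential, since single-segment coherence is identically one.
    
    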

  1. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org

  2. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for the energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in a Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1 h, 2 h, etc.) and the calculation time required is very short. The heating and cooling loads of the building at the aforementioned time step are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operating characteristic curves of the system's heat pumps, and the basic ground source heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The efficiency of the tool is verified through comparison with actual electricity consumption data collected from an existing large-scale ground-coupled heat pump installation over a three-year period. (author)
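
    The abstract does not reproduce the analytical ground heat exchange equations, but a common choice for vertical borehole models is Kelvin's infinite line source solution, dT = q/(4·pi·k) · E1(r²/(4·alpha·t)). The sketch below implements it in Python; the soil parameters and heat injection rate are illustrative assumptions, not values from the paper.

    ```python
    import math

    def exp_integral_E1(x, terms=60):
        """E1(x) via the convergent series -gamma - ln(x) + sum((-1)^(n+1) x^n / (n*n!))."""
        gamma = 0.5772156649015329   # Euler-Mascheroni constant
        s, term = 0.0, 1.0
        for n in range(1, terms + 1):
            term *= -x / n           # term = (-1)^n x^n / n!
            s -= term / n            # adds (-1)^(n+1) x^n / (n * n!)
        return -gamma - math.log(x) + s

    def borehole_temp_rise(q, k, alpha, r, t):
        """Infinite line source model: temperature rise (K) at radius r after time t.

        q: heat rate per borehole length (W/m), k: soil conductivity (W/m/K),
        alpha: soil thermal diffusivity (m^2/s), r: radius (m), t: time (s).
        """
        return q / (4 * math.pi * k) * exp_integral_E1(r * r / (4 * alpha * t))

    # Illustrative numbers: 50 W/m into typical soil, borehole wall, 30 days
    dT = borehole_temp_rise(q=50.0, k=2.0, alpha=1.0e-6, r=0.055, t=30 * 86400)  # ~15 K
    ```

    The series form of E1 converges quickly for the small arguments typical of borehole time scales; a production tool would superpose such responses over the building's hourly load history.
    
    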

  3. Analysis of insertion device magnet measurements for the Advanced Light Source

    International Nuclear Information System (INIS)

    Marks, S.; Humphries, D.; Kincaid, B.M.; Schlueter, R.; Wang, C.

    1993-07-01

    The Advanced Light Source (ALS), which is currently being commissioned at Lawrence Berkeley Laboratory, is a third generation light source designed to produce XUV radiation of unprecedented brightness. To meet the high brightness goal the storage ring has been designed for very small electron beam emittance and the undulators installed in the ALS are built to a high degree of precision. The allowable magnetic field errors are driven by electron beam and radiation requirements. Detailed magnetic measurements and adjustments are performed on each undulator to qualify it for installation in the ALS. The first two ALS undulators, IDA and IDB, have been installed. This paper describes the program of measurements, data analysis, and adjustments carried out for these two devices. Calculations of the radiation spectrum, based upon magnetic measurements, are included. Final field integral distributions are also shown. Good field integral uniformity has been achieved using a novel correction scheme, which is also described

  4. Langmuir probe studies on a RF ion source for NBI

    International Nuclear Information System (INIS)

    McNeely, P.; Heineman, B.; Kraus, W.; Riedl, R.; Speth, E.; Vollmer, O.

    2001-01-01

    IPP Garching has been developing an RF ion source for H⁻ production. In order to improve the data quality, a new scanning probe system with passive RF compensation has been installed on the Type VI ion source on the BATMAN test stand. Using this probe, measurements have been carried out to study changes in the plasma parameters (electron density, electron temperature, and plasma potential) due to variations in the source operating conditions. The data were collected at a source pressure of 0.5 Pa and with 60±5 kW of applied RF power. Presented are some of the results of these measurements, focusing on the effects of argon seeding, the addition of Cs to the source, and the newly added Faraday screen. The electron density behaves in a fashion that agrees with the theory of ambipolar diffusion. Typically, little change in the average electron energy is observed regardless of which effect is considered. The plasma potential shows the most significant changes with external source conditions, both in value in all cases and in shape when the Faraday screen was added.
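
    A standard way to obtain the electron temperature reported in such probe studies is to fit the exponential electron-retardation region of the I-V characteristic, where the slope of ln(I) versus V equals 1/Te (with Te in eV). The sketch below is a hedged illustration on synthetic data, not the BATMAN measurements; the characteristic and noise level are invented.

    ```python
    import math
    import random

    def electron_temperature(volts, currents):
        """Te in eV from the slope of ln(I) vs V in the electron-retardation region."""
        ys = [math.log(i) for i in currents]
        n = len(volts)
        mv, my = sum(volts) / n, sum(ys) / n
        slope = sum((v - mv) * (y - my) for v, y in zip(volts, ys)) \
                / sum((v - mv) ** 2 for v in volts)
        return 1.0 / slope

    # Synthetic characteristic: I = I0 * exp(V / Te) with Te = 4 eV and 1% noise
    random.seed(0)
    Te_true, I0 = 4.0, 1e-3
    V = [0.5 * k for k in range(-20, 0)]    # bias (V) below plasma potential
    I = [I0 * math.exp(v / Te_true) * (1 + 0.01 * random.gauss(0, 1)) for v in V]
    Te_est = electron_temperature(V, I)
    ```

    Real probe traces additionally require subtracting the ion saturation current and restricting the fit window below the plasma potential, which this sketch omits.
    
    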

  5. A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)

    Science.gov (United States)

    Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.

    2017-12-01

    Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters, and often jointly consider geodetic and seismic data. Bayesian inference is increasingly being used to estimate posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high and estimation codes are rarely made available along with the published results. Even when the codes are accessible, it is usually challenging to assemble them into a single optimization framework, as they are typically written in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimation, we undertook the effort of developing BEAT, a Python package that comprises all the above-mentioned features in one single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org), and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project.
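
    BEAT itself delegates sampling to pymc3; as a stripped-down, hedged illustration of the underlying idea of Bayesian source estimation (sampling a posterior over a source parameter given noisy observations), here is a minimal random-walk Metropolis sampler applied to a toy linear forward model. All names, the forward model and the numbers are invented for illustration and are not BEAT's API.

    ```python
    import math
    import random

    def metropolis(log_post, x0, step, n_samples, burn=500):
        """Minimal random-walk Metropolis sampler for a one-parameter posterior."""
        x, lp = x0, log_post(x0)
        samples = []
        for i in range(n_samples + burn):
            xp = x + random.gauss(0, step)       # propose
            lpp = log_post(xp)
            if math.log(random.random()) < lpp - lp:   # accept/reject
                x, lp = xp, lpp
            if i >= burn:
                samples.append(x)
        return samples

    # Toy forward model: observation = source_strength * kernel + noise
    random.seed(1)
    kernel = [0.5, 1.0, 1.5, 2.0, 2.5]
    true_strength, sigma = 3.0, 0.3
    data = [true_strength * k + random.gauss(0, sigma) for k in kernel]

    def log_post(m):   # flat prior, Gaussian likelihood
        return -sum((d - m * k) ** 2 for d, k in zip(data, kernel)) / (2 * sigma ** 2)

    post = metropolis(log_post, x0=1.0, step=0.3, n_samples=4000)
    mean_m = sum(post) / len(post)
    ```

    The chain's samples approximate the posterior, so the spread of `post` directly quantifies the parameter uncertainty that point-estimate optimizers discard.
    
    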

  6. Uses of radioactive isotopes and radiation sources in biological studies in U. A. R; Utilisation des radioisotopes et des sources de rayonnement dans les etudes biologiques en RAU

    Energy Technology Data Exchange (ETDEWEB)

    Hashish, S. E. [Radiobiology Department, U. A. R. Atomic Energy Establishment, Cairo, United Arab Republic (Egypt)

    1970-01-15

    An attempt is made to give examples rather than a review of the uses of radioactive isotopes and radiation sources in biological studies in the U.A.R. Studies along these lines started early in 1955 and are still progressing. The prospects for future developments and improvements are unlimited. The studies are classified according to the radio-technique adopted. The techniques so far used in the U.A.R. include all the techniques known elsewhere. Some detailed modifications and combinations of more than one technique have been introduced successfully. In both basic and applied biological studies, one or more of the following techniques have been applied: tracer techniques, isotopic dilution analysis, autoradiography, radiochromatography and electrophoresis, double or multiple bioassays, radioactivation analysis, neutron absorption analysis, and the use of different radiation sources for somatic and/or genetic effect studies. Mass spectrometry for stable isotope studies in the field of biology has recently been introduced. Studies undertaken in the applied fields of biology, e.g. in medicine (diagnosis and therapy) and agriculture (soil, plant and animal), have proved extremely valuable from the practical and developmental points of view. (author)

  7. Cold source economic study

    International Nuclear Information System (INIS)

    Fuster, Serge.

    1975-01-01

    This computer code is intended to establish the overall economic balance resulting from the use of a given cold source. The balance includes the investment needed to construct the various equipment items, as well as the production balances resulting from their operation. It can handle either an open-circuit condenser on sea or river water, or air-cooling systems used in closed circuit or as auxiliaries. The program can be used to optimize the characteristics of the various parts of the cold source. From the very complete and precise economic balances, the performance of the various equipment options can be evaluated for a given situation, and the options can be classified according to their possible uses, taking external constraints into account (limits on heat discharge into rivers or the sea, water temperature, air temperature). Technical choices with important economic consequences are thus clarified. [fr]

  8. Analysis of the Source System of Nantun Group in Huhehu Depression of Hailar Basin

    Science.gov (United States)

    Li, Yue; Li, Junhui; Wang, Qi; Lv, Bingyang; Zhang, Guannan

    2017-10-01

    The Huhehu Depression will be a new exploration frontier in the Hailar Basin, although at present its exploration level is low. Little work has been done on the source system of the Nantun Group, so a fine depiction of that source system would be significant for sedimentary system reconstruction, reservoir distribution and the prediction of favorable areas. This paper comprehensively uses several methods, such as palaeo-landform analysis, light and heavy mineral assemblages and seismic reflection characteristics, to study the source system of the Nantun Group from different perspectives and at different levels. The results show that the source system of the Huhehu Depression comes from the Xilinbeir bulge in the east and the Bayan Mountain uplift in the west, which flank the basin. The slope belt is the main source, and the southern bulge is the secondary source. The distribution of the source system controls the distribution of the sedimentary system and the regularity of sand body distribution.

  9. A californium-252 source for radiobiological studies at Hiroshima University

    International Nuclear Information System (INIS)

    Kato, Kazuo; Takeoka, Seiji; Kuroda, Tokue; Tsujimura, Tomotaka; Kawami, Masaharu; Hoshi, Masaharu; Sawada, Shozo

    1987-01-01

    A 1.93 Ci (3.6 mg) californium-252 source was installed in the radiation facility of the Research Institute for Nuclear Medicine and Biology, Hiroshima University. This source produces fission neutrons (8.7 × 10⁹ n/s at the time of its installation) whose spectrum is similar to that of the atomic-bomb neutrons, and it is useful for studying the biological effects of fission neutrons and for neutron dosimetry. An apparatus was designed to accommodate this source and to apply it to such studies. It permits efficient fission-neutron exposures while suppressing scattered neutrons and secondary gamma rays. The apparatus incorporates many safety systems, including one that interlocks with all doors and the elevator serving the exposure room, to prevent accidents involving users. (author)

  10. Thermal analysis studies of ammonium uranyl carbonate

    International Nuclear Information System (INIS)

    Cao Xinsheng; Ma Xuezhong; Wang Fapin; Liu Naixin; Ji Changhong

    1988-01-01

    Simultaneous thermogravimetry and differential thermal analysis of ammonium uranyl carbonate powder were performed on a thermobalance in the following atmospheres: air, Ar and Ar-8%H2. The thermogravimetry and differential thermal analysis curves of ammonium uranyl carbonate powders obtained from different sources are reported and discussed

  11. Vrancea seismic source analysis using a small-aperture array

    International Nuclear Information System (INIS)

    Popescu, E.; Popa, M.; Radulian, M.; Placinta, A.O.

    2005-01-01

    A small-aperture seismic array (BURAR) was installed in 1999 in the northern part of Romanian territory (Bucovina area). Since then, the array has been in operation under a joint cooperation programme between Romania and the USA. The array consists of 10 stations installed in boreholes (nine short-period instruments and one broadband instrument), sensitive enough to properly detect earthquakes generated in the Vrancea subcrustal domain (at about 250 km epicentral distance) with magnitude Mw below 3. Our main purpose is to investigate and calibrate the source parameters of the Vrancea intermediate-depth earthquakes using specific techniques provided by the BURAR array data. Forty earthquakes with magnitudes between 2.9 and 6.0 were selected, including the recent events of September 27, 2004 (45.70°N, 26.45°E, h = 166 km, Mw = 4.7), October 27, 2004 (45.84°N, 26.63°E, h = 105 km, Mw = 6.0) and May 14, 2005 (45.66°N, 26.52°E, h = 146 km, Mw = 5.1), which are the best recorded earthquakes ever on Romanian territory. Empirical Green's function deconvolution and spectral ratio methods are applied to pairs of collocated events with similar focal mechanisms. Stability tests are performed on the retrieved source time function using the array elements. Empirical scaling and calibration relationships are also determined. Our study shows the capability of the BURAR array to determine the source parameters of the Vrancea intermediate-depth earthquakes as a stand-alone facility and proves that the recordings of this array alone provide reliable and useful tools to efficiently constrain the source parameters and, consequently, source scaling properties. (authors)
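
    The empirical Green's function deconvolution mentioned in the abstract can be sketched as a water-level spectral division: the record of a large event is divided, in the frequency domain, by the record of a small collocated event (which approximates the path/site response) to recover the source time function. The waveforms and water level below are synthetic assumptions, not BURAR data.

    ```python
    import numpy as np

    def egf_deconvolve(big, small, water=1e-6):
        """Source time function by spectral division of a large event by its EGF."""
        n = len(big)
        B = np.fft.rfft(big, 2 * n)               # zero-pad to avoid wrap-around
        S = np.fft.rfft(small, 2 * n)
        P = np.abs(S) ** 2
        P = np.maximum(P, water * P.max())        # water-level stabilization
        return np.fft.irfft(B * np.conj(S) / P, 2 * n)[:n]

    # Synthetic example: a small event as Green's function, a boxcar source pulse
    green = np.exp(-np.arange(64) / 10.0)         # assumed EGF waveform
    stf_true = np.ones(8)                         # 8-sample boxcar source
    big = np.convolve(green, stf_true)            # "large" event record
    stf = egf_deconvolve(big, green)
    ```

    With real, noisy records the water level must be raised substantially, trading resolution of the source time function for stability of the division.
    
    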

  12. Image acquisition and analysis for beam diagnostics, applications of the Taiwan photon source

    International Nuclear Information System (INIS)

    Liao, C.Y.; Chen, J.; Cheng, Y.S.; Hsu, K.T.; Hu, K.H.; Kuo, C.H.; Wu, C.Y.

    2012-01-01

    Design and implementation of image acquisition and analysis is in progress for Taiwan Photon Source (TPS) diagnostic applications. The optical system contains a screen, lens, and lighting system. A CCD camera with a Gigabit Ethernet interface (GigE Vision) will be the standard image acquisition device. Image acquisition will be done on the EPICS IOC via PV channel access, and Matlab tools will be used to analyse image properties such as the beam profile (sigma), beam size, position, and tilt angle. The EPICS IOC integrated with Matlab as a data processing system can be used not only for image analysis but also in many other types of equipment data-processing applications. Progress of the project is summarized in this report. (authors)
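
    The beam properties named in the abstract (centroid position, sigma, tilt) can be estimated from image moments: first moments give the centroid, second central moments give the RMS sizes and the tilt of the principal axis. A hedged numpy sketch on a synthetic Gaussian spot, standing in for the Matlab analysis described above:

    ```python
    import numpy as np

    def beam_moments(img):
        """Centroid, RMS sizes and tilt angle of a beam spot from image moments."""
        y, x = np.indices(img.shape)              # pixel coordinate grids
        s = img.sum()
        cx, cy = (img * x).sum() / s, (img * y).sum() / s
        sxx = (img * (x - cx) ** 2).sum() / s     # second central moments
        syy = (img * (y - cy) ** 2).sum() / s
        sxy = (img * (x - cx) * (y - cy)).sum() / s
        tilt = 0.5 * np.arctan2(2 * sxy, sxx - syy)   # principal-axis angle, rad
        return cx, cy, np.sqrt(sxx), np.sqrt(syy), tilt

    # Synthetic elliptical Gaussian spot for illustration
    y, x = np.indices((200, 200))
    img = np.exp(-((x - 120) ** 2 / (2 * 15 ** 2) + (y - 90) ** 2 / (2 * 8 ** 2)))
    cx, cy, sx, sy, tilt = beam_moments(img)
    ```

    On real screen images, background subtraction and a region-of-interest cut are needed before the moments are meaningful; a Gaussian fit is a common refinement when the profile is clean.
    
    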

  13. Analysis of the image of pion-emitting sources in the source center-of-mass frame

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Yanyu; Feng, Qichun; Huo, Lei; Zhang, Jingbo; Liu, Jianli; Tang, Guixin [Harbin Institute of Technology, Department of Physics, Harbin, Heilongjiang (China); Zhang, Weining [Harbin Institute of Technology, Department of Physics, Harbin, Heilongjiang (China); Dalian University of Technology, School of Physics and Optoelectronic Technology, Dalian, Liaoning (China)

    2017-08-15

    In this paper, we explore a method to extract the image of the pion-emitting source function in the center-of-mass frame of the source (CMFS). We select identical pion pairs according to the difference of their energies and use these pairs to build the correlation function. The purpose is to reduce the effect of ΔEΔt, so that the corresponding imaging result can tend toward the real source function. We examine the effectiveness of this method by comparing its results with real source functions extracted directly from models. (orig.)

  14. Analysis of internal radiation and radiotoxicity source base on aerosol distribution in RMI

    International Nuclear Information System (INIS)

    Yuwono, I.

    2000-01-01

    Destructive testing of nuclear fuel elements during post-irradiation examination in the Radiometallurgy Installation (RMI) may cause air contamination in the working area in the form of radioactive aerosol. Inhalation of the radioactive aerosol turns it into an internal radiation source in the worker's body. The potential hazard of a radioactive particle in the body also depends on the particle size. Analysis of the internal radiation source and radiotoxicity showed that in normal operation only natural radioactive materials of high radiotoxicity are found, i.e. Pb-212 and Ac-228. Deposition is highest in the alveolar-interstitial region (AI), at 95%, and lower in the bronchial region (BB), at 1%, for particle sizes of 11.7 nm and 350 nm, respectively. (author)

  15. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
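
    The autocorrelation step of the proposed algorithm can be illustrated in a few lines: the extraction delay is taken as the lag of the dominant autocorrelation peak of a quasi-periodic trace. The sketch below is a hedged stand-in using a noisy sinusoid in place of a cardiac signal; it shows only the delay-identification idea, not the full ICA extraction.

    ```python
    import numpy as np

    def dominant_lag(x, min_lag=1):
        """Lag (in samples) of the highest autocorrelation peak beyond min_lag."""
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        spec = np.abs(np.fft.rfft(x, 2 * n)) ** 2    # zero-padded power spectrum
        ac = np.fft.irfft(spec)[:n]                  # linear autocorrelation, lags 0..n-1
        return min_lag + int(np.argmax(ac[min_lag:]))

    # Quasi-periodic trace: period 120 samples plus noise ("heartbeat" stand-in)
    rng = np.random.default_rng(2)
    t = np.arange(32768)
    sig = np.sin(2 * np.pi * t / 120) + 0.1 * rng.standard_normal(t.size)
    lag = dominant_lag(sig, min_lag=60)              # exclude the zero-lag lobe
    ```

    In the fMCG setting, this lag would be computed from a reference channel so that the delayed-correlation criterion locks onto the fetal rhythm rather than the maternal one.
    
    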

  16. Antioxidants: Characterization, natural sources, extraction and analysis

    OpenAIRE

    OROIAN, MIRCEA; Escriche Roberto, Mª Isabel

    2015-01-01

    [EN] Recently, many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However, none of them contains all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article takes a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health.

  17. Multicriteria decision making model for choosing between open source and non-open source software

    Directory of Open Access Journals (Sweden)

    Edmilson Alves de Moraes

    2008-09-01

    This article proposes the use of a multicriteria method for supporting a decision problem in which the intent is to choose between open source and non-open source software. The study shows how a decision-making method can be used to structure the problem and simplify the decision maker's job. The Analytic Hierarchy Process (AHP) method is described step by step and its benefits and flaws are discussed. Following the theoretical discussion, a multiple case study is presented in which two companies use the decision-making method. The analysis was supported by Expert Choice, a software package based on the AHP framework.
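
    The AHP computation described above reduces to extracting the principal eigenvector of a pairwise comparison matrix (the criterion weights) and checking Saaty's consistency ratio. A hedged sketch; the criteria and the judgments in the matrix are invented for illustration, not taken from the case study.

    ```python
    import numpy as np

    def ahp_priorities(A, tol=1e-10, max_iter=1000):
        """Priority weights and consistency ratio of a pairwise comparison matrix."""
        n = A.shape[0]
        w = np.ones(n) / n
        for _ in range(max_iter):                 # power iteration
            w_new = A @ w
            w_new /= w_new.sum()
            if np.abs(w_new - w).max() < tol:
                break
            w = w_new
        lam = (A @ w / w).mean()                  # principal eigenvalue estimate
        ci = (lam - n) / (n - 1)                  # consistency index
        ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
              6: 1.24, 7: 1.32}.get(n, 1.49)      # Saaty's random index
        cr = ci / ri if ri else 0.0               # CR < 0.1 is acceptable
        return w, cr

    # Hypothetical criteria for the software choice: cost, support, security
    A = np.array([[1.0, 3.0, 5.0],
                  [1 / 3, 1.0, 3.0],
                  [1 / 5, 1 / 3, 1.0]])
    w, cr = ahp_priorities(A)
    ```

    In a full AHP model, each software alternative is scored the same way against every criterion, and the criterion weights `w` combine those scores into a final ranking.
    
    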

  18. Comparative Transcriptome Analysis of Penicillium citrinum Cultured with Different Carbon Sources Identifies Genes Involved in Citrinin Biosynthesis

    Directory of Open Access Journals (Sweden)

    Taotao Li

    2017-02-01

    Full Text Available Citrinin is a toxic secondary metabolite of Penicillium citrinum and its contamination in many food items has been widely reported. However, research on the citrinin biosynthesis pathway and its regulation mechanism in P. citrinum is rarely reported. In this study, we investigated the effect of different carbon sources on citrinin production by P. citrinum and used transcriptome analysis to study the underlying molecular mechanism. Our results indicated that glucose, used as the sole carbon source, could significantly promote citrinin production by P. citrinum in Czapek's broth medium compared with sucrose. A total of 19,967 unigenes were annotated by BLAST against the Nr, Nt, Swiss-Prot and Kyoto Encyclopedia of Genes and Genomes (KEGG) databases. Transcriptome comparison between P. citrinum cultured with sucrose and glucose revealed 1085 differentially expressed unigenes. Among them, 610 were upregulated while 475 were downregulated under glucose as compared to sucrose. KEGG pathway and Gene Ontology (GO) analysis indicated that many metabolic processes (e.g., carbohydrate, secondary metabolite, fatty acid and amino acid metabolism) were affected, and potentially interesting genes encoding putative components of signal transduction, stress response and transcription factors were identified. These genes likely play important regulatory roles in citrinin biosynthesis, providing a better understanding of the molecular mechanism of citrinin biosynthesis in P. citrinum.
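
A minimal fold-change screen of the kind that underlies differential-expression counts might look like the sketch below. The expression values are simulated; a real analysis would add statistical testing (e.g., negative-binomial models) rather than a bare log2 fold-change cut-off.

```python
import numpy as np

rng = np.random.default_rng(1)
n_genes = 1000

# Simulated normalised expression per unigene under two carbon sources
sucrose = rng.lognormal(mean=5.0, sigma=1.0, size=n_genes)
glucose = sucrose * rng.lognormal(mean=0.0, sigma=0.8, size=n_genes)

log2fc = np.log2(glucose / sucrose)
up = int(np.sum(log2fc >= 1.0))     # at least 2-fold higher under glucose
down = int(np.sum(log2fc <= -1.0))  # at least 2-fold lower under glucose
```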

  19. BNL feasibility studies of spallation neutron sources

    International Nuclear Information System (INIS)

    Lee, Y.Y.; Ruggiero, A.G.; Van Steenbergen, A.; Weng, W.T.

    1995-01-01

    This paper summarizes the conceptual design studies of a 5 MW Pulsed Spallation Neutron Source (PSNS) conducted by an interdepartmental study group at Brookhaven National Laboratory. The study comprised two periods. First, a scenario based on the use of a 600 MeV Linac followed by two fast-cycling 3.6 GeV Synchrotrons was investigated. Then, in a subsequent period, the attention of the study was directed toward an Accumulator scenario with two options: (1) a 1.25 GeV normal-conducting Linac followed by two Accumulator Rings, and (2) a 2.4 GeV superconducting Linac followed by a single Accumulator Ring. The study did not make any reference to a specific site

  20. Source localization analysis using seismic noise data acquired in exploration geophysics

    Science.gov (United States)

    Roux, P.; Corciulo, M.; Campillo, M.; Dubuq, D.

    2011-12-01

    Passive monitoring using seismic noise data is attracting growing interest at the exploration scale. Recent studies demonstrated source localization capability using seismic noise cross-correlation at observation scales ranging from hundreds of kilometers to meters. In the context of exploration geophysics, classical localization methods using travel-time picking fail when no clear first arrivals can be detected. Likewise, methods based on the intensity decrease as a function of distance to the source also fail when the noise intensity decay is more complicated than the power law expected from geometrical spreading. We propose here an automatic procedure developed in ocean acoustics that iteratively locates the dominant and secondary noise sources. The Matched-Field Processing (MFP) technique is based on the spatial coherence of raw noise signals acquired on a dense array of receivers in order to produce high-resolution source localizations. Standard MFP algorithms locate the dominant noise source by matching the seismic noise Cross-Spectral Density Matrix (CSDM) with the equivalent CSDM calculated from a model and a surrogate source position that scans each position of a 3D grid below the array of seismic sensors. However, at the exploration scale, the background noise is mostly dominated by surface noise sources related to human activities (roads, industrial platforms, ...), whose localization is of no interest for the monitoring of the hydrocarbon reservoir. In other words, the dominant noise sources mask lower-amplitude noise sources associated with the extraction process (in the volume), whose location is therefore difficult to obtain with the standard MFP technique. Multi-Rate Adaptive Beamforming (MRABF) is a further improvement of the MFP technique that locates low-amplitude secondary noise sources using a projector matrix calculated from the eigenvalue decomposition of the CSDM. The MRABF approach aims at cancelling the contributions of
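
A minimal Bartlett matched-field processor is sketched below, assuming a rank-1 CSDM from a single source and plane-wave replicas parameterised by a scalar delay. A real MFP scan would use modelled Green's functions over a 3D grid of candidate positions rather than this toy delay grid.

```python
import numpy as np

def bartlett_mfp(csdm, replicas):
    """Bartlett matched-field processor: power = d^H K d for unit-norm replicas d.

    csdm     : (n, n) cross-spectral density matrix from the array
    replicas : (n_grid, n) modelled array responses, one per candidate point
    """
    power = np.empty(len(replicas))
    for i, d in enumerate(replicas):
        d = d / np.linalg.norm(d)
        power[i] = np.real(np.conj(d) @ csdm @ d)
    return power

n_sensors, f = 8, 0.1
true_delay = 3.0
k = np.arange(n_sensors)

d_true = np.exp(-2j * np.pi * f * true_delay * k)
csdm = np.outer(d_true, np.conj(d_true))            # rank-1 CSDM of one source
grid = np.linspace(0.0, 6.0, 61)                    # candidate delays to scan
replicas = np.exp(-2j * np.pi * f * grid[:, None] * k)
power = bartlett_mfp(csdm, replicas)
```

The ambiguity surface `power` peaks where the surrogate replica matches the data, i.e. at the true delay.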

  1. Analysis on the distribution characteristics and sources of soil heavy metals in suburban farmland in Xiangtan City

    Science.gov (United States)

    Zhang, Yong; Sun, Xinxin

    2018-01-01

    The rapid development of the economy inevitably has an impact on the farmland soil environment. Heavy metal contents are increasing day by day, and heavy metals can enter the human body through different pathways and endanger health. Based on agricultural land and crop types in accordance with the regional land use classification, and using the Single Factor Index and Comprehensive Pollution Index methods, the pollution status of heavy metals in farmland soil in the suburbs of Xiangtan City was studied and evaluated. At the same time, SPSS software was used to analyze four heavy metal elements (Cu, Zn, As and Pb) and their possible sources. The results showed that the farmland soils in Erhuan Road and Zhubu Port were polluted, whereas the farmland soil in Shuangma (an old industrial district) was not; among different crop lands, orchards and vegetable lands were not contaminated, but rape and rice lands were. Pearson correlation analysis showed that Cu, As and Pb might come from the same pollution source, while Zn might come from other sources. Wastewater from a chemical plant, crop types, automobile exhaust and other human factors may be important sources of soil pollution in agricultural fields.
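
The Single Factor Index and a comprehensive (Nemerow-style) pollution index can be computed as below. The concentrations and screening standards are invented for illustration; the study's actual standards and measured values may differ.

```python
import numpy as np

metals = ["Cu", "Zn", "As", "Pb"]
conc = np.array([38.0, 95.0, 14.0, 52.0])        # hypothetical contents, mg/kg
standard = np.array([50.0, 200.0, 30.0, 80.0])   # assumed screening values, mg/kg

# Single Factor Index: Pi = Ci / Si  (Pi > 1 indicates pollution by metal i)
P = conc / standard

# Nemerow comprehensive index combines the mean and the worst single factor
P_nemerow = np.sqrt((P.mean() ** 2 + P.max() ** 2) / 2)
```

With these illustrative numbers every single-factor index is below 1 and the comprehensive index stays in the unpolluted range.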

  2. Time-resolved X-ray studies using third generation synchrotron radiation sources

    International Nuclear Information System (INIS)

    Mills, D.M.

    1991-10-01

    The third generation, high-brilliance, hard x-ray, synchrotron radiation (SR) sources currently under construction (ESRF at Grenoble, France; APS at Argonne, Illinois; and SPring-8 at Harima, Japan) will usher in a new era of x-ray experimentation for both physical and biological sciences. One of the most exciting areas of experimentation will be the extension of x-ray scattering and diffraction techniques to the study of transient or time-evolving systems. The high repetition rate, short-pulse duration, high brilliance, and variable spectral bandwidth of these sources make them ideal for x-ray time-resolved studies. The temporal properties (bunch length, interpulse period, etc.) of these new sources will be summarized. Finally, the scientific potential and the technological challenges of time-resolved x-ray scattering from these new sources will be described. 13 refs., 4 figs

  3. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    International Nuclear Information System (INIS)

    Eriksson, E.; Andersen, H. R.; Ledin, A.

    2008-01-01

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens

  4. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, E., E-mail: eve@env.dtu.dk; Andersen, H. R.; Ledin, A. [Technical University of Denmark, Department of Environmental Engineering (Denmark)

    2008-12-15

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens.

  5. Advanced Neutron Source enrichment study

    International Nuclear Information System (INIS)

    Bari, R.A.; Ludewig, H.; Weeks, J.R.

    1996-01-01

    A study has been performed of the impact on performance of using low-enriched uranium (20% 235U) or medium-enriched uranium (35% 235U) as an alternative fuel for the Advanced Neutron Source, which was initially designed to use uranium enriched to 93% 235U. Higher fuel densities and larger volume cores were evaluated at the lower enrichments in terms of impact on neutron flux, safety, safeguards, technical feasibility, and cost. The feasibility of fabricating uranium silicide fuel at increasing material density was specifically addressed by a panel of international experts on research reactor fuels. The most viable alternative designs for the reactor at lower enrichments were identified and discussed. Several sensitivity analyses were performed to gain an understanding of the performance of the reactor at parametric values of power, fuel density, core volume, and enrichment that were interpolations between the boundary values imposed on the study or extrapolations from known technology

  6. Using open source accelerometer analysis to assess physical activity and sedentary behaviour in overweight and obese adults.

    Science.gov (United States)

    Innerd, Paul; Harrison, Rory; Coulson, Morc

    2018-04-23

    Physical activity and sedentary behaviour are difficult to assess in overweight and obese adults. However, the use of open-source, raw accelerometer data analysis could overcome this. This study compared raw accelerometer and questionnaire-assessed moderate-to-vigorous physical activity (MVPA), walking and sedentary behaviour in normal, overweight and obese adults, and determined the effect of using different methods to categorise overweight and obesity, namely body mass index (BMI), bioelectrical impedance analysis (BIA) and waist-to-hip ratio (WHR). One hundred twenty adults, aged 24-60 years, wore a raw, tri-axial accelerometer (Actigraph GT3X+) for 3 days and completed a physical activity questionnaire (IPAQ-S). We used open-source accelerometer analyses to estimate MVPA, walking and sedentary behaviour from a single raw accelerometer signal. Accelerometer and questionnaire-assessed measures were compared in normal, overweight and obese adults categorised using BMI, BIA and WHR. Relationships between accelerometer- and questionnaire-assessed MVPA (Rs = 0.30 to 0.48) and walking (Rs = 0.43 to 0.58) were stronger in normal and overweight groups, whilst relationships for sedentary behaviour were modest (Rs = 0.22 to 0.38) in normal, overweight and obese groups. The use of WHR resulted in stronger agreement between the questionnaire and accelerometer than did BMI or BIA. Finally, accelerometer data showed stronger associations with BMI, BIA and WHR (Rs = 0.40 to 0.77) than questionnaire data (Rs = 0.24 to 0.37). Open-source, raw accelerometer data analysis can be used to estimate MVPA, walking and sedentary behaviour from a single acceleration signal in normal, overweight and obese adults. Our data support the use of WHR to categorise overweight and obese adults. This evidence helps researchers obtain more accurate measures of physical activity and sedentary behaviour in overweight and obese populations.
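
One common open-source approach to deriving intensity categories from a single raw acceleration signal is an ENMO epoch summary with cut-points, sketched below. The thresholds are illustrative assumptions, not the ones used in this study.

```python
import numpy as np

def classify_enmo(ax, ay, az, fs, thresholds=(0.05, 0.10)):
    """Label 1-second epochs of raw tri-axial acceleration (in g) via ENMO.

    The cut-points (sedentary < 0.05 g <= light < 0.10 g <= MVPA) are
    hypothetical; published cut-points vary by device and population.
    """
    vm = np.sqrt(ax ** 2 + ay ** 2 + az ** 2)   # vector magnitude
    enmo = np.maximum(vm - 1.0, 0.0)            # Euclidean norm minus one g
    n_epochs = len(enmo) // fs
    epochs = enmo[: n_epochs * fs].reshape(n_epochs, fs).mean(axis=1)
    sed, light = thresholds
    return np.select([epochs < sed, epochs < light], ["sedentary", "light"], "MVPA")
```

A stationary wearer (vector magnitude of 1 g) yields ENMO near zero and is labelled sedentary; sustained higher magnitudes fall into the light or MVPA bands.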

  7. An Analysis of Source Tilting and Sub-cell Opacity Sampling for IMC

    Energy Technology Data Exchange (ETDEWEB)

    Wollaeger, Ryan T. [Los Alamos National Laboratory; Urbatsch, Todd J. [Los Alamos National Laboratory; Wollaber, Allan B. [Los Alamos National Laboratory; Densmore, Jeffery D. [Los Alamos National Laboratory

    2012-08-02

    Implicit Monte Carlo (IMC) is a stochastic method for solving the radiative transfer equations for multiphysics applications with the material in local thermodynamic equilibrium. The IMC method employs a fictitious scattering term that is computed from an implicit discretization of the material temperature equation. Unfortunately, the original histogram representation of the temperature and opacity with respect to the spatial domain leads to nonphysically fast propagation of radiation waves through optically thick material. In the past, heuristic source tilting schemes have been used to mitigate the numerical teleportation error of the radiation particles in IMC that causes this overly rapid radiation wave propagation. While they improve the material temperature profile over the course of the simulation, these tilting schemes alone do not generally reduce the teleportation error to suitable levels. Another means of potentially reducing teleportation error in IMC is implementing continuous sub-cell opacities based on sub-cell temperature profiles. We present here an analysis of source tilting and continuous sub-cell opacity sampling applied to various discretizations of the temperature equation. Through this analysis, we demonstrate that applying both heuristics does not necessarily yield more accurate results if the discretization of the material equation is inconsistent with the Monte Carlo sub-cell transport.

  8. Text-Based Argumentation with Multiple Sources: A Descriptive Study of Opportunity to Learn in Secondary English Language Arts, History, and Science

    Science.gov (United States)

    Litman, Cindy; Marple, Stacy; Greenleaf, Cynthia; Charney-Sirott, Irisa; Bolz, Michael J.; Richardson, Lisa K.; Hall, Allison H.; George, MariAnne; Goldman, Susan R.

    2017-01-01

    This study presents a descriptive analysis of 71 videotaped lessons taught by 34 highly regarded secondary English language arts, history, and science teachers, collected to inform an intervention focused on evidence-based argumentation from multiple text sources. Studying the practices of highly regarded teachers is valuable for identifying…

  9. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    Science.gov (United States)

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  10. Security analysis of an untrusted source for quantum key distribution: passive approach

    International Nuclear Information System (INIS)

    Zhao Yi; Qi Bing; Lo, H-K; Qian Li

    2010-01-01

    We present a passive approach to the security analysis of quantum key distribution (QKD) with an untrusted source. A complete proof of its unconditional security is also presented. This scheme has significant advantages in real-life implementations as it does not require fast optical switching or a quantum random number generator. The essential idea is to use a beam splitter to split each input pulse. We show that we can characterize the source using a cross-estimate technique without active routing of each pulse. We have derived analytical expressions for the passive estimation scheme. Moreover, using simulations, we have considered four real-life imperfections: additional loss introduced by the 'plug and play' structure, inefficiency of the intensity monitor, noise of the intensity monitor, and statistical fluctuation introduced by finite data size. Our simulation results show that the passive estimate of an untrusted source remains useful in practice, despite these four imperfections. Also, we have performed preliminary experiments, confirming the utility of our proposal in real-life applications. Our proposal makes it possible to implement the 'plug and play' QKD with the security guaranteed, while keeping the implementation practical.

  11. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the reasons for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the obtained results compare favorably with previous literature on hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.

  12. Use of the spectral analysis for estimating the intensity of a weak periodic source

    International Nuclear Information System (INIS)

    Marseguerra, M.

    1989-01-01

    This paper deals with the possibility of exploiting spectral methods for the analysis of counting experiments in which one has to estimate the intensity of a weak periodic source of particles buried in a high background. The general theoretical expressions obtained here for the auto- and cross-spectra are applied to three kinds of simulated experiments. In all cases it turns out that the source intensity can actually be estimated with a standard deviation comparable to that obtained in classical experiments in which the source can be moved out. Thus the spectral methods represent an interesting technique, nowadays easy to implement on low-cost computers, which could also be used in many research fields by suitably redesigning classical experiments. The convenience of using these methods in the field of nuclear safeguards is presently being investigated in our Institute. (orig.)
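
The idea of detecting a weak periodic source buried in a high background via spectra can be illustrated with a simple periodogram of simulated Poisson counts. The rates and the source frequency below are invented for the example and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt = 4096, 1.0
t = np.arange(n) * dt

f0 = 0.05  # frequency of the weak periodic source (cycles per sample interval)
# Poisson counts: a large constant background plus a small periodic modulation
rate = 100.0 + 2.0 * (1.0 + np.sin(2 * np.pi * f0 * t))
counts = rng.poisson(rate)

x = counts - counts.mean()                  # remove the DC background level
psd = np.abs(np.fft.rfft(x)) ** 2 / n       # periodogram
freqs = np.fft.rfftfreq(n, dt)
f_est = freqs[1:][np.argmax(psd[1:])]       # skip the zero-frequency bin
```

Even though the modulation is only a few counts on a background of ~100 per sample, the periodogram concentrates the periodic power into one bin, so the peak recovers the source frequency.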

  13. Creep analysis of fuel plates for the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Swinson, W.F.; Yahr, G.T.

    1994-11-01

    The reactor for the planned Advanced Neutron Source will use closely spaced arrays of fuel plates. The plates are thin and will have a core containing enriched uranium silicide fuel clad in aluminum. The heat load caused by the nuclear reactions within the fuel plates will be removed by flowing high-velocity heavy water through narrow channels between the plates. However, the plates will still be at elevated temperatures while in service, and the potential for excessive plate deformation because of creep must be considered. An analysis to include creep for deformation and stresses because of temperature over a given time span has been performed and is reported herein

  14. Monitoring and Analysis of Nonpoint Source Pollution - Case study on terraced paddy fields in an agricultural watershed

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Yeh, Chun-Lin

    2013-04-01

    The intensive use of chemical fertilizer has negatively impacted environments in recent decades, mainly through water pollution by nitrogen (N) and phosphate (P) originating from agricultural activities. As a main crop with the largest cultivation area, about 0.25 million ha per year in Taiwan, rice paddies account for a significant share of fertilizer consumption among agricultural crops. This study evaluated how the fertilization of paddy fields impacts return flow water quality in an agricultural watershed located in Hsinchu County, northern Taiwan. Water quality monitoring continued for two crop periods in 2012, covering different water bodies, including irrigation water, drainage water, and shallow groundwater. The results indicated that marked increases of ammonium-N, nitrate-N and TP concentrations in the surface drainage water were observed immediately following the three fertilizer applications (basal, tillering, and panicle), but concentrations dropped to relatively low levels 7-10 days after each application. Groundwater quality monitoring showed that the shallower the observation well, the more significant the variation of ammonium-N, nitrate-N and TP concentrations, which means that the contamination potential of nutrients in groundwater is related not only to the impermeable plow sole layer but also to the length of the percolation route in this area. The study also showed that the potential nutrient pollution load could be further reduced by careful drainage water control and rational fertilizer management, such as deep-water irrigation, reuse of return flow, the rational application of fertilizers, and the SRI (System of Rice Intensification) method. The results of this study can provide an evaluation basis to formulate effective measures for agricultural non-point source pollution control and the reuse of agricultural return flow. Keywords

  15. InterviewStreamliner, a minimalist, free, open source, relational approach to computer-assisted qualitative data analysis software

    NARCIS (Netherlands)

    H.D. Pruijt (Hans)

    2010-01-01

    textabstractInterviewStreamliner is a free, open source, minimalist alternative to complex computer-assisted qualitative data analysis packages. It builds on the flexibility of relational database management technology.

  16. Studies of electron cyclotron resonance ion source plasma physics

    International Nuclear Information System (INIS)

    Tarvainen, O.

    2005-01-01

    This thesis consists of an introduction to the plasma physics of electron cyclotron resonance ion sources (ECRIS) and a review of the results obtained by the author and co-workers including discussion of related work by others. The thesis begins with a theoretical discussion dealing with plasma physics relevant for the production of highly charged ions in ECR ion source plasmas. This is followed by an overview of different techniques, such as gas mixing and double frequency heating, that can be used to improve the performance of this type of ion source. The experimental part of the work consists of studies related to ECRIS plasma physics. The effect of the gas mixing technique on the production efficiency of different ion beams was studied with both gaseous and solid materials. It was observed that gas mixing improves the confinement of the heavier element while the confinement of the lighter element is reduced. When the effect of gas mixing on MIVOC-plasmas was studied with several mixing gases it was observed that applying this technique can reduce the inevitable carbon contamination by a significant factor. In order to understand the different plasma processes taking place in ECRIS plasmas, a series of plasma potential and emittance measurements was carried out. An instrument, which can be used to measure the plasma potential in a single measurement without disturbing the plasma, was developed for this work. Studying the plasma potential of ECR ion sources is important not only because it helps to understand different plasma processes, but also because the information can be used as an input parameter for beam transport simulations and ion source extraction design. The experiments performed have revealed clear dependencies of the plasma potential on certain source parameters such as the amount of carbon contamination accumulated on the walls of the plasma chamber during a MIVOC-run. It was also observed that gas mixing affects not only the production efficiency

  17. Transient thermal stress analysis of a near-edge elliptical defect in a semi-infinite plate subjected to a moving heat source

    International Nuclear Information System (INIS)

    Mingjong Wang; Weichung Wang

    1994-01-01

    In this paper, the maximum transient thermal stresses on the boundary of a near-edge elliptical defect in a semi-infinite thin plate were determined by the digital photoelastic technique, when the plate edge experiences a moving heat source. The relationships between the maximum transient thermal stresses and the size and inclination of the elliptical defect, the minimum distance from the elliptical defect to the plate edge as well as the speed of the moving heat source were also studied. Finally, by using a statistical analysis package, the variations of the maximum transient thermal stresses were then correlated with the time, the minimum distance between the edge and the elliptical defect, temperature difference, and speed of the moving heat source. (author)

  18. Nuclear power plant control room task analysis. Pilot study for pressurized water reactors

    International Nuclear Information System (INIS)

    Barks, D.B.; Kozinsky, E.J.; Eckel, S.

    1982-05-01

    The purposes of this nuclear plant task analysis pilot study were: to demonstrate the use of task analysis techniques on selected abnormal or emergency operation events in a nuclear power plant; to evaluate the use of simulator data obtained from an automated Performance Measurement System (PMS) to supplement and validate data obtained by traditional task analysis methods; and to demonstrate sample applications of task analysis data to address questions pertinent to nuclear power plant operational safety: control room layout, staffing and training requirements, operating procedures, interpersonal communications, and job performance aids. Five data sources were investigated to provide information for a task analysis. These sources were (1) written operating procedures (event-based); (2) interviews with subject matter experts (the control room operators); (3) videotapes of the control room operators (senior reactor operators and reactor operators) while responding to each event in a simulator; (4) walk-/talk-throughs conducted by control room operators for each event; and (5) simulator data from the PMS

  19. Study of localized photon source in space of measures

    International Nuclear Information System (INIS)

    Lisi, M.

    2010-01-01

    In this paper we study a three-dimensional photon transport problem in an interstellar cloud with a localized photon source inside. The problem is solved indirectly, by defining the adjoint of an operator acting on an appropriate space of continuous functions. By means of the theory of sun-adjoint semigroups of operators in a Banach space of regular Borel measures, we prove existence and uniqueness of the solution of the problem. A possible approach to identifying the location of the photon source is finally proposed.

  20. Source apportionment and air quality impact assessment studies in Beijing/China

    Science.gov (United States)

    Suppan, P.; Schrader, S.; Shen, R.; Ling, H.; Schäfer, K.; Norra, S.; Vogel, B.; Wang, Y.

    2012-04-01

    More than 15 million people in the greater area of Beijing still suffer from severe air pollution caused not only by sources within the city itself but also by external impacts such as severe dust storms and long-range advection from the southern and central parts of China. Within this context, particulate matter (PM) is the major air pollutant in the greater area of Beijing (Garland et al., 2009). PM serves not only as the lead substance for air quality levels, and therefore for adverse health effects, but also exerts a strong influence on the climate system by changing, e.g., the radiative balance. Investigations of emission reductions during the Olympic Summer Games in 2008 showed a strong reduction in coarser particles (PM10) but not in smaller particles (PM2.5). In order to discriminate the composition of the particulate matter levels and the different behavior of coarser and smaller particles, investigations of source attribution, particle characteristics and external impacts on the PM levels of the city of Beijing are performed by measurements and modeling: examples of long-term PM2.5 filter sampling in 2005, with the objectives of detailed chemical (source attribution, carbon fraction, organic speciation and inorganic composition) and isotopic analyses as well as toxicological assessment, in cooperation with several institutions (Karlsruhe Institute of Technology (IfGG/IMG), Helmholtz Zentrum München (HMGU), University Rostock (UR), Chinese University of Mining and Technology Beijing, CUMTB), will be discussed. Further experimental studies include the operation of remote sensing systems to determine continuously the MLH (by a ceilometer) and gaseous air pollutants near the ground (by DOAS systems) as well as at the 320 m measurement tower (adhesive plates at different heights for passive particle collection) in cooperation with the Institute of Atmospheric Physics (IAP) of the Chinese Academy of Sciences (CAS). The influence of the MLH on