WorldWideScience

Sample records for source analysis study

  1. Comparative Analysis Study of Open Source GIS in Malaysia

    International Nuclear Information System (INIS)

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Halim, Mohd Khuizham Abd

    2014-01-01

    Open source software might represent a major prospective change, capable of delivering value in various industries and serving as a competitive means in developing countries. The leading purpose of this research is to discover the degree of adoption of Open Source Software (OSS) in Geographic Information System (GIS) applications within Malaysia, where low adoption may derive from inadequate awareness of open source concepts or from technical deficiencies in open source tools. The research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate awareness and acceptance levels based on comparative feedback regarding OSS and commercial GIS; the survey was conducted among three groups of respondents: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed during the analysis to provide a measurable and descriptive basis for the final result. The second stage involved an interview session with a major organization that operates open source web GIS: the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The contribution of this preliminary study is an understanding of the viewpoints of different groups on open source GIS, and evidence that insufficient awareness of open source concepts and possibilities may be a significant root of the low level of open source adoption.

  2. Study on analysis from sources of error for Airborne LIDAR

    Science.gov (United States)

    Ren, H. C.; Yan, Q.; Liu, Z. J.; Zuo, Z. Q.; Xu, Q. Q.; Li, F. F.; Song, C.

    2016-11-01

    With the advancement of aerial photogrammetry, Airborne LIDAR has emerged as a measurement technique that obtains geo-spatial information of high spatial and temporal resolution, with unique advantages and broad application prospects. Airborne LIDAR is increasingly becoming a new kind of earth observation technology: mounted on an aviation platform, it emits and receives laser pulses to acquire high-precision, high-density three-dimensional point cloud data and intensity information. In this paper, we briefly describe airborne laser radar systems, analyze in detail the error sources in Airborne LIDAR data, and put forward corresponding methods to avoid or eliminate these errors. Taking into account practical engineering applications, recommendations are developed for these designs, which have crucial theoretical and practical significance in the field of Airborne LIDAR data processing.

  3. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  4. Neutron activation analysis: Modelling studies to improve the neutron flux of Americium-Beryllium source

    Energy Technology Data Exchange (ETDEWEB)

    Didi, Abdessamad; Dadouch, Ahmed; Tajmouati, Jaouad; Bekkouri, Hassane [Advanced Technology and Integration System, Dept. of Physics, Faculty of Science Dhar Mehraz, University Sidi Mohamed Ben Abdellah, Fez (Morocco); Jai, Otman [Laboratory of Radiation and Nuclear Systems, Dept. of Physics, Faculty of Sciences, Tetouan (Morocco)

    2017-06-15

    Americium–beryllium (Am-Be; n, γ) is a neutron emitting source used in various research fields such as chemistry, physics, geology, archaeology, medicine, and environmental monitoring, as well as in the forensic sciences. It is a mobile source of neutron activity (20 Ci), yielding a small thermal neutron flux that is water moderated. The aim of this study is to develop a model to increase the thermal neutron flux of a source such as Am-Be. This study has multiple advantageous results: primarily, it will help us perform neutron activation analysis; next, it will give us the opportunity to produce radio-elements with short half-lives. Am-Be single-source and multisource (5 sources) experiments were performed within an irradiation facility with a paraffin moderator. The resulting models increase the thermal neutron flux compared to the traditional method with a water moderator.

  5. USING THE METHODS OF WAVELET ANALYSIS AND SINGULAR SPECTRUM ANALYSIS IN THE STUDY OF RADIO SOURCE BL LAC

    OpenAIRE

    Donskykh, G. I.; Ryabov, M. I.; Sukharev, A. I.; Aller, M.

    2014-01-01

    We investigated monitoring data of the extragalactic source BL Lac. This monitoring was carried out with the University of Michigan 26-meter radio telescope. To study the flux density of BL Lac at frequencies of 14.5, 8 and 4.8 GHz, wavelet analysis and singular spectrum analysis were used. Calculating the integral wavelet spectra revealed long-term components (~7-8 years) and short-term components (~1-4 years) in BL Lac. Study of VLBI radio maps (by the Mojave program) ...

  6. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    International Nuclear Information System (INIS)

    Summerhayes, G.R.; Gosden, C.; Bird, R.; Hotchkis, M.; Specht, J.; Torrence, R.; Fullaga, R.

    1993-01-01

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. The samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A combination of nine element ratios was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in the major elements, the close agreement between minor and trace element concentrations in artefacts and in known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.
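The ratio-based sourcing described above can be sketched as follows: an artefact is assigned to the known source whose element-ratio signature it most closely matches. The element values, ratios, and source concentrations below are illustrative stand-ins, not data from the study (which used nine ratios measured by PIXE-PIGME).

```python
import numpy as np

# Hypothetical concentrations (ppm) for three elements per sample: Fe, Na, Rb.
sources = {
    "Talasea": np.array([52.0, 1400.0, 310.0]),
    "Mopir":   np.array([75.0,  900.0, 180.0]),
}
artefact = np.array([54.0, 1350.0, 300.0])

def ratio_signature(conc):
    # Ratios to a reference element damp overall concentration variation
    # between samples, which is the point of the ratio approach.
    fe, na, rb = conc
    return np.array([rb / fe, na / fe])

def assign_source(artefact, sources):
    sig = ratio_signature(artefact)
    dists = {name: np.linalg.norm(sig - ratio_signature(c))
             for name, c in sources.items()}
    return min(dists, key=dists.get)  # nearest source in ratio space

print(assign_source(artefact, sources))
```

A real analysis would use all nine ratios and a proper clustering or discriminant method rather than a nearest-signature rule.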

  7. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G R; Gosden, C [La Trobe Univ., Bundoora, VIC (Australia); Bird, R; Hotchkis, M [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J; Torrence, R; Fullaga, R [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1994-12-31

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. The samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A combination of nine element ratios was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in the major elements, the close agreement between minor and trace element concentrations in artefacts and in known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.

  8. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G.R.; Gosden, C. [La Trobe Univ., Bundoora, VIC (Australia); Bird, R.; Hotchkis, M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J.; Torrence, R.; Fullaga, R. [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1993-12-31

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. The samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A combination of nine element ratios was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in the major elements, the close agreement between minor and trace element concentrations in artefacts and in known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.

  9. Application of the Frequency Map Analysis to the Study of the Beam Dynamics of Light Sources

    International Nuclear Information System (INIS)

    Nadolski, Laurent

    2001-01-01

    The topic of this thesis is the study of beam dynamics in storage rings, restricted to single-particle transverse dynamics. In a first part, tools (Frequency Map Analysis, Hamiltonian, integrator) are presented for studying and exploring the dynamics. Numerical simulations of four synchrotron radiation sources (the ALS, the ESRF, SOLEIL and Super-ACO) are performed. We construct a tracking code based on a new class of symplectic integrators (Laskar and Robutel, 2000). These integrators with only positive steps are more precise by an order of magnitude than the standard Forest and Ruth scheme. Comparisons with the BETA, DESPOT and MAD codes are carried out. Frequency Map Analysis (Laskar, 1990) is our main analysis tool. This is a numerical method for analysing a conservative dynamical system: based on a refined Fourier technique, it enables us to compute frequency maps which are real footprints of the beam dynamics of an accelerator. We stress the high sensitivity of the dynamics to magnetic errors and sextupole strengths. The second part of this work is dedicated to the analysis of experimental results from two light sources. Together with the ALS accelerator team (Berkeley), we succeeded in obtaining the first experimental frequency map of an accelerator. The agreement with the machine model is very impressive. At the Super-ACO ring, the study of the tune shift with amplitude enabled us to highlight a strong octupole-like component related to the quadrupole fringe field. The consequences for the beam dynamics are important and give us a better understanding of the measured ring performance. All these results are based on turn-by-turn measurements. Many closely related phenomena are treated, such as response matrix analysis and beam decoherence. (author)
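As a minimal illustration of the frequency analysis underlying this approach, the sketch below estimates a betatron tune from synthetic turn-by-turn position data using a plain windowed FFT peak. The thesis relies on Laskar's refined Fourier technique (NAFF), which is far more precise than an FFT bin; the signal and tune value here are invented for the example.

```python
import numpy as np

# Synthetic turn-by-turn beam position with an assumed tune of 0.31
# plus a little noise, standing in for real BPM data.
turns = np.arange(1024)
true_tune = 0.31
rng = np.random.default_rng(1)
x = np.cos(2 * np.pi * true_tune * turns) + 0.01 * rng.normal(size=turns.size)

# Estimate the tune as the dominant frequency of the Hann-windowed FFT.
spectrum = np.abs(np.fft.rfft(x * np.hanning(turns.size)))
freqs = np.fft.rfftfreq(turns.size, d=1.0)
tune_estimate = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(round(tune_estimate, 3))
```

The FFT resolution here is 1/1024 of a turn frequency; NAFF-style interpolation of the spectral peak is what makes experimental frequency maps sharp enough to resolve resonance structure.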

  10. Performance analysis and experimental study of heat-source tower solution regeneration

    International Nuclear Information System (INIS)

    Liang, Caihua; Wen, Xiantai; Liu, Chengxing; Zhang, Xiaosong

    2014-01-01

    Highlights: • Theoretical analysis is performed on the characteristics of the heat-source tower. • Experimental study is performed on variation rules of the solution regeneration rate. • The characteristics of solution regeneration vary widely with different demands. • Results are useful for optimizing the process of solution regeneration. - Abstract: By analyzing similarities and differences between the solution regeneration of a heat-source tower and desiccant solution regeneration, this paper points out that solution regeneration of a heat-source tower is characterized by small demands and a regeneration rate that is susceptible to the outdoor ambient environment. A theoretical analysis is performed on the characteristics of a heat-source tower solution in different outdoor environments and different regeneration modes, and an experimental study is performed on the variation of the solution regeneration rate of a cross-flow heat-source tower under different inlet and operating parameters. The experimental results show that: in the operating regeneration mode, as the air volume was increased from 123 m³ h⁻¹ to 550 m³ h⁻¹, the system heat transfer amount increased from 0.42 kW to 0.78 kW, and the regeneration rate increased from 0.03 g s⁻¹ to 0.19 g s⁻¹. Increasing the solution flow may increase the system heat transfer amount; however, the regeneration rate decreased to a certain extent. In the regeneration mode when the system is idle, as the air volume was increased from 136 m³ h⁻¹ to 541 m³ h⁻¹, the regeneration rate increased from 0.03 g s⁻¹ to 0.1 g s⁻¹. The regeneration rate remained almost unchanged around 0.07 g s⁻¹ as the solution flow was increased. In the regeneration mode with auxiliary heat when the system is idle, increasing the air volume and the solution flow required more auxiliary heat, thereby improving the solution regeneration rate. As the auxiliary heat was increased from 0.33 k
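The regeneration rates quoted above follow from a simple mass balance: the moisture removed from the solution equals the dry-air mass flow times the rise in humidity ratio across the tower. A sketch with assumed humidity values chosen near the experimental range (the humidity ratios are illustrative, not measured values from the paper):

```python
# Mass balance for solution regeneration.
rho_air = 1.2                   # air density, kg m^-3 (assumed)
V_air = 550.0                   # air volume flow, m^3 h^-1
w_in, w_out = 0.0080, 0.0085    # humidity ratio in/out, kg water per kg air (assumed)

m_air = rho_air * V_air / 3600.0                      # air mass flow, kg s^-1
regeneration_rate = m_air * (w_out - w_in) * 1000.0   # moisture removed, g s^-1
print(round(regeneration_rate, 3))
```

With these assumed values the balance lands near 0.1 g s⁻¹, the same order as the idle-mode rates reported above, which is why the regeneration rate tracks air volume so directly.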

  11. Hydrodynamic analysis of potential groundwater extraction capacity increase: case study of 'Nelt' groundwater source at Dobanovci

    Directory of Open Access Journals (Sweden)

    Bajić Dragoljub I.

    2017-01-01

    A comprehensive hydrodynamic analysis of the groundwater regime, undertaken to assess the potential for expanding the 'Nelt' groundwater source at Dobanovci or developing a new groundwater source for a future baby food factory, including quantification of the impact on the production wells of the nearby 'Pepsi' groundwater source, is presented in the paper. The existing Nelt source comprises three active production wells that tap a subartesian aquifer formed in sands and gravelly sands; however, the analysis considers only the two nearest wells. A long-term group pumping test was conducted of production wells N-1 and N2 (Nelt source) and production wells B-1 and B-2 (Pepsi source), while the piezometric head in the vicinity of these wells was monitored at observation well P-1, which is located in the area considered for Nelt source expansion. Data were collected at the maximum pumping capacity of all the production wells. A hydrodynamic model of groundwater flow in the extended area of the Nelt source was generated for the purposes of the comprehensive hydrodynamic analysis. Hydrodynamic prognostic calculations addressed two solution alternatives for the capacity increase over a period of ten years. Licensed Visual MODFLOW Pro software, regarded as among the leading tools in this field, was used for the calculations.

  12. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    Science.gov (United States)

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations.
The latter approach is
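The simulation approach described in this abstract (assign probability distributions to bias parameters, draw, adjust, repeat, summarize the distribution of adjusted results) can be sketched as a Monte Carlo loop. The bias-parameter distribution and its values below are illustrative assumptions for the example, not the distributions used in the paper; only the conventional estimate and interval are taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Conventional result from the abstract: HR 2.6 (95% CI 0.7 to 9.4).
hr_conventional = 2.6
ci_low, ci_high = 0.7, 9.4

# Random-error standard error on the log scale, recovered from the CI.
se_random = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)

n_iter = 100_000
# Hypothetical bias parameter: log-normal prior for the multiplicative
# factor by which systematic error inflates the hazard ratio.
bias_factor = rng.lognormal(mean=np.log(1.7), sigma=0.4, size=n_iter)

# Adjust each draw for the bias, then re-add random error.
log_hr_adjusted = (np.log(hr_conventional) - np.log(bias_factor)
                   + rng.normal(0.0, se_random, size=n_iter))
hr_adjusted = np.exp(log_hr_adjusted)

median = np.median(hr_adjusted)
lo, hi = np.percentile(hr_adjusted, [2.5, 97.5])
print(f"median HR {median:.2f}, 95% simulation interval {lo:.1f} to {hi:.1f}")
```

With these assumed bias parameters the median adjusted hazard ratio lands near 1.5, pulled back toward the null relative to the conventional 2.6, which is the qualitative behavior the study reports.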

  13. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty

    Directory of Open Access Journals (Sweden)

    Lash Timothy L

    2007-11-01

    Background: The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. Methods: For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. Results: The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Conclusion: Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a

  14. SWOT analysis of the renewable energy sources in Romania - case study: solar energy

    Science.gov (United States)

    Lupu, A. G.; Dumencu, A.; Atanasiu, M. V.; Panaite, C. E.; Dumitrașcu, Gh; Popescu, A.

    2016-08-01

    The evolution of the energy sector worldwide has triggered intense preoccupation with both finding alternative renewable energy sources and environmental issues. Romania is considered to have the technological potential and a geographical location suitable for using renewable energy to generate electricity. But this high potential is not fully exploited in the context of policies and regulations adopted globally and, more specifically, European Union (EU) environmental and energy strategies and legislation related to renewable energy sources. This SWOT analysis of the solar energy source presents the state of the art, potential and future prospects for the development of renewable energy in Romania. The analysis concluded that the development of the solar energy sector in Romania depends largely on: the viability of the legislative framework on renewable energy sources, increased subsidies for solar R&D, a simplified methodology for green certificates, and educating the public, investors, developers and decision-makers.

  15. Joint source based analysis of multiple brain structures in studying major depressive disorder

    Science.gov (United States)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
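The classification step described above, a Fisher Linear Discriminant evaluated with leave-one-out cross-validation, can be sketched on synthetic features. The random feature vectors below stand in for the study's joint pose/shape/composition sources; the group sizes and mean shift are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-subject feature vectors.
n, d = 40, 5
X0 = rng.normal(0.0, 1.0, size=(n, d))   # healthy controls
X1 = rng.normal(0.8, 1.0, size=(n, d))   # MDD group (shifted mean)
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

def fld_fit(X, y):
    # Fisher linear discriminant: w = Sw^-1 (mu1 - mu0),
    # threshold at the projected midpoint of the class means.
    mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
    w = np.linalg.solve(Sw, mu1 - mu0)
    thresh = w @ (mu0 + mu1) / 2.0
    return w, thresh

# Leave-one-out cross-validation: refit without subject i, then classify i.
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    w, thresh = fld_fit(X[mask], y[mask])
    pred = int(X[i] @ w > thresh)
    correct += int(pred == y[i])

accuracy = correct / len(y)
print(f"LOO accuracy: {accuracy:.2f}")
```

Leave-one-out matters here because the discriminant is refit without the held-out subject each time, so the reported accuracy is not inflated by training on the test subject.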

  16. A Study on Conjugate Heat Transfer Analysis of Reactor Vessel including Irradiated Structural Heat Source

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Kunwoo; Cho, Hyuksu; Im, Inyoung; Kim, Eunkee [KEPCO EnC, Daejeon (Korea, Republic of)

    2015-10-15

    Although material reliability programs (MRPs) are intended to provide evaluation and management methodologies for the operating RVI, similar evaluation methodologies can be applied to the APR1400 fleet in the design stage for the evaluation of neutron irradiation effects. The purposes of this study are: to predict the thermal behavior with and without the irradiated structural heat source; and to evaluate the effective thermal conductivity (ETC) in relation to isotropic and anisotropic conductivity of porous media for the APR1400 reactor vessel. CFD simulations are performed to evaluate the thermal behavior with and without the irradiated structural heat source, and the effective thermal conductivity, for the APR1400 reactor vessel. With the irradiated structural heat source included, the maximum temperatures of the fluid and the core shroud for isotropic ETC are 325.8 °C and 341.5 °C, respectively. The total irradiated structural heat source amounts to about 5.41 MWth and has little effect on the fluid temperature.

  17. Study and Analysis of an Intelligent Microgrid Energy Management Solution with Distributed Energy Sources

    Directory of Open Access Journals (Sweden)

    Swaminathan Ganesan

    2017-09-01

    In this paper, a robust energy management solution that will facilitate optimum and economic control of energy flows throughout a microgrid network is proposed. Renewable energy sources, whose penetration is increasing, are highly intermittent in nature; the proposed solution nevertheless demonstrates highly efficient energy management. This study enables precise management of power flows by forecasting renewable energy generation, estimating the availability of energy in the storage batteries, and invoking the appropriate mode of operation, based on the load demand, to achieve efficient and economic operation. The mode of operation is derived from a predefined expert rule set and schedules the load and distributed energy sources along with the utility grid.

  18. Moessbauer spectroscopy and X-ray fluorescence analysis in studies to determine the sources of several prehispanic objects

    International Nuclear Information System (INIS)

    Arriola S, H.; Ramos R, P.; Castro V, P.; Jimenez R, A.; Flores D, F.; Garcia Moreno C, C.

    1980-01-01

    A study by Moessbauer effect and X-ray fluorescence analysis of Mexican prehispanic ceramic specimens is presented. Several iron compounds in the ceramics are determined; the different iron compounds indicate different sources of the clays and different types of kilns used with them. These compounds are identified by the different oxidation states of iron, Fe3+ and Fe2+. (author)

  19. Neutron activation analysis: Modelling studies to improve the neutron flux of Americium–Beryllium source

    Directory of Open Access Journals (Sweden)

    Abdessamad Didi

    2017-06-01

    Americium–beryllium (Am-Be; n, γ) is a neutron emitting source used in various research fields such as chemistry, physics, geology, archaeology, medicine, and environmental monitoring, as well as in the forensic sciences. It is a mobile source of neutron activity (20 Ci), yielding a small thermal neutron flux that is water moderated. The aim of this study is to develop a model to increase the thermal neutron flux of a source such as Am-Be. This study has multiple advantageous results: primarily, it will help us perform neutron activation analysis; next, it will give us the opportunity to produce radio-elements with short half-lives. Am-Be single-source and multisource (5 sources) experiments were performed within an irradiation facility with a paraffin moderator. The resulting models increase the thermal neutron flux compared to the traditional method with a water moderator.

  20. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    International Nuclear Information System (INIS)

    Menezes, Mario Olimpio de

    2005-01-01

    Free/Open Source Software (FOSS) has been used in science long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed source software became widely available to scientists for data analysis on this platform. In this paper, we study some high quality FOSS, available also for free, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to demonstrate the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high quality alternative to commercial closed source software which, besides being ready to use, also offers the possibility of great customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used both for data analysis and manipulation, allowing very complex tasks to be automated with a few lines of simple programming. (author)
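As a small example of the "few lines of simple programming" the abstract has in mind, the following Python snippet, using only the standard library, summarizes a column of detector counts. The count values are invented for illustration.

```python
import statistics

# Illustrative detector count data, as might be read from an experiment log.
counts = [120, 134, 128, 119, 141, 125]

mean = statistics.mean(counts)
stdev = statistics.stdev(counts)   # sample standard deviation
print(f"mean={mean:.1f}, stdev={stdev:.1f}")
```

The same few lines scale to reading thousands of values from a file and feeding them to Gnuplot or Octave, which is the kind of automation the paper highlights.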

  1. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: mario@ipen.br; mo.menezes@gmail.com

    2005-07-01

    Free/Open Source Software (FOSS) has been used in science long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed source software became widely available to scientists for data analysis on this platform. In this paper, we study some high quality FOSS, available also for free, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to demonstrate the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high quality alternative to commercial closed source software which, besides being ready to use, also offers the possibility of great customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used both for data analysis and manipulation, allowing very complex tasks to be automated with a few lines of simple programming. (author)

  2. Distributed medical image analysis and diagnosis through crowd-sourced games: a malaria case study.

    Science.gov (United States)

    Mavandadi, Sam; Dimitrov, Stoyan; Feng, Steve; Yu, Frank; Sikora, Uzair; Yaglidere, Oguzhan; Padmanabhan, Swati; Nielsen, Karin; Ozcan, Aydogan

    2012-01-01

    In this work we investigate whether the innate visual recognition and learning capabilities of untrained humans can be used to conduct reliable microscopic analysis of biomedical samples toward diagnosis. For this purpose, we designed entertaining digital games that are interfaced with artificial learning and processing back-ends to demonstrate that, in the case of binary medical diagnostic decisions (e.g., infected vs. uninfected), with the use of crowd-sourced games it is possible to approach the accuracy of medical experts in making such diagnoses. Specifically, using non-expert gamers we report diagnosis of malaria-infected red blood cells with an accuracy that is within 1.25% of the diagnostic decisions made by a trained medical professional.
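The simplest way to combine many non-expert decisions into one binary diagnosis is a majority vote over the labels assigned to each cell image. The sketch below is an illustrative reduction; the actual study uses a trained processing back-end rather than a bare vote, and the labels and vote counts here are invented.

```python
from collections import Counter

def crowd_diagnosis(votes):
    """Return the most common label from a list of per-gamer labels."""
    return Counter(votes).most_common(1)[0][0]

# Hypothetical votes from five gamers on one red blood cell image.
votes = ["infected", "uninfected", "infected", "infected", "uninfected"]
print(crowd_diagnosis(votes))
```

Even this naive rule improves with crowd size: independent errors tend to cancel, which is the statistical intuition behind approaching expert-level accuracy with non-experts.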

  3. Distributed medical image analysis and diagnosis through crowd-sourced games: a malaria case study.

    Directory of Open Access Journals (Sweden)

    Sam Mavandadi

    In this work we investigate whether the innate visual recognition and learning capabilities of untrained humans can be used to conduct reliable microscopic analysis of biomedical samples toward diagnosis. For this purpose, we designed entertaining digital games that are interfaced with artificial learning and processing back-ends to demonstrate that, in the case of binary medical diagnostic decisions (e.g., infected vs. uninfected), with the use of crowd-sourced games it is possible to approach the accuracy of medical experts in making such diagnoses. Specifically, using non-expert gamers we report diagnosis of malaria-infected red blood cells with an accuracy that is within 1.25% of the diagnostic decisions made by a trained medical professional.

  4. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  5. Mechanisms Supporting Superior Source Memory for Familiar Items: A Multi-Voxel Pattern Analysis Study

    Science.gov (United States)

    Poppenk, Jordan; Norman, Kenneth A.

    2012-01-01

    Recent cognitive research has revealed better source memory performance for familiar relative to novel stimuli. Here we consider two possible explanations for this finding. The source memory advantage for familiar stimuli could arise because stimulus novelty induces attention to stimulus features at the expense of contextual processing, resulting…

  6. Source analysis of peroxyacetyl nitrate (PAN) in Guangzhou, China: a yearlong observation study

    Science.gov (United States)

    Wang, B. G.; Zhu, D.; Zou, Y.; Wang, H.; Zhou, L.; Ouyang, X.; Shao, H. F.; Deng, X. J.

    2015-06-01

    In recent years, photochemical smog has been a major cause of air pollution in the metropolitan area of Guangzhou, China, with a continuing increase in the concentrations of photochemical pollutants. The concentration of peroxyacetyl nitrate (PAN) has often been found to reach very high levels, posing a potential threat to public health. To better understand the changes in PAN concentration and its sources, a study was carried out from January to December 2012 at the Guangzhou Panyu Atmospheric Composition Station (GPACS) to measure the atmospheric concentrations of PAN as well as those of ozone (O3), nitrogen oxides (NOx), and non-methane hydrocarbons (NMHC). These data were analyzed to investigate the quantitative relationships between PAN and its precursors. In the study period, the hourly concentrations of PAN varied from below the instrument detection limit to 12.0 ppbv. The yearly mean concentration of PAN was 0.84 ppbv, and the daily mean concentration exceeded 5 ppbv on 32 of the observation days. Calculations indicate that among the measured NMHC species, alkenes accounted for 53% of the total NMHC contribution to PAN production, with aromatics and alkanes accounting for about 11% and 7% of the total, respectively. During the observation period, only a modest correlation was found between the concentrations of PAN and O3 during daytime hours, and observed PAN concentrations were relatively high even though the observed NMHC/NOx ratio was low. This suggests that regional air-mass transport of pollutants had a major impact on PAN concentrations in the Guangzhou area.

  7. Thermal neutron source study

    International Nuclear Information System (INIS)

    Holden, T.M.

    1983-05-01

    The value of intense neutron beams for condensed matter research is discussed with emphasis on the complementary nature of steady state and pulsed neutron sources. A large body of information on neutron sources, both existing and planned, is then summarized under four major headings: fission reactors, electron accelerators with heavy metal targets, pulsed spallation sources and 'steady state' spallation sources. Although the cost of a spallation source is expected to exceed that of a fission reactor of the same flux by a factor of two, there are significant advantages for a spallation device such as the proposed Electronuclear Materials Test Facility (EMTF)

  8. Study of clay behaviour around a heat source by frequency spectrum analysis of seismic waves

    International Nuclear Information System (INIS)

    Sloovere, P. de.

    1993-01-01

    Wave propagation in soft rock is not completely described by purely linear elastic theory. Spectral analysis of the waves shows that several frequencies are selected by the ground. ME2i uses this method to check grouting, piles and similar structures. The Mol experiment (on radioactive waste disposal) aims to prove that small changes in heated clay can be detected by 'frequential seismics'. A cross-hole investigation system was installed and tests were performed over two years with a shear hammer named MARGOT, built to work inside horizontal boreholes. Before heating, the tests gave the same results every time: a main frequency at 330 hertz and a maximal frequency at 520 hertz. During heating, the lines at 330 and 520 hertz disappeared and frequencies in the range 100-300 hertz prevailed. After heating, the spectra recovered their original shape. These results show that the effect is clear around a heated zone. The next steps should be: interpretation with computer codes treating wave propagation in a viscoelastic body; and experiments at the opening of a new gallery, on large samples, and on granites and salt. 9 refs., 4 appendices

  9. Spatial GHG Inventory: Analysis of Uncertainty Sources. A Case Study for Ukraine

    International Nuclear Information System (INIS)

    Bun, R.; Gusti, M.; Kujii, L.; Tokar, O.; Tsybrivskyy, Y.; Bun, A.

    2007-01-01

    A geoinformation technology for creating spatially distributed greenhouse gas inventories based on a methodology provided by the Intergovernmental Panel on Climate Change and special software linking input data, inventory models, and a means for visualization are proposed. This technology opens up new possibilities for qualitative and quantitative spatially distributed presentations of inventory uncertainty at the regional level. Problems concerning uncertainty and verification of the distributed inventory are discussed. A Monte Carlo analysis of uncertainties in the energy sector at the regional level is performed, and a number of simulations concerning the effectiveness of uncertainty reduction in some regions are carried out. Uncertainties in activity data have a considerable influence on overall inventory uncertainty, for example, the inventory uncertainty in the energy sector declines from 3.2 to 2.0% when the uncertainty of energy-related statistical data on fuels combusted in the energy industries declines from 10 to 5%. Within the energy sector, the 'energy industries' subsector has the greatest impact on inventory uncertainty. The relative uncertainty in the energy sector inventory can be reduced from 2.19 to 1.47% if the uncertainty of specific statistical data on fuel consumption decreases from 10 to 5%. The 'energy industries' subsector has the greatest influence in the Donetsk oblast. Reducing the uncertainty of statistical data on electricity generation in just three regions - the Donetsk, Dnipropetrovsk, and Luhansk oblasts - from 7.5 to 4.0% results in a decline from 2.6 to 1.6% in the uncertainty in the national energy sector inventory
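
The kind of Monte Carlo sensitivity experiment described above can be sketched in a few lines: draw the uncertain activity data and emission factor from normal distributions, form their product, and report the relative standard deviation of the result. The 10% and 5% activity-data uncertainties mirror the scenario in the abstract, but the 3% emission-factor uncertainty and the simple two-factor setup are illustrative assumptions, not the paper's model:

```python
import random
import statistics

def inventory_uncertainty(act_rel_sd, ef_rel_sd=0.03, n=20000, seed=1):
    """Relative standard deviation (%) of emissions = activity * factor,
    with independent normal relative errors on each input."""
    rng = random.Random(seed)
    draws = [(1 + rng.gauss(0, act_rel_sd)) * (1 + rng.gauss(0, ef_rel_sd))
             for _ in range(n)]
    return 100 * statistics.stdev(draws) / statistics.mean(draws)

u_before = inventory_uncertainty(0.10)  # 10% uncertainty on activity data
u_after = inventory_uncertainty(0.05)   # statistics improved to 5%
```

Halving the activity-data uncertainty roughly halves the combined uncertainty here, the same direction of effect as the 3.2% to 2.0% decline reported for the energy sector.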

  10. Sequence analysis in multilevel models. A study on different sources of patient cues in medical consultations.

    Science.gov (United States)

    Del Piccolo, Lidia; Mazzi, Maria Angela; Dunn, Graham; Sandri, Marco; Zimmermann, Christa

    2007-12-01

    The aims of the study were to explore the importance of macro (patient, physician, consultation) and micro (doctor-patient speech sequences) variables in promoting patient cues (unsolicited new information or expressions of feelings), and to describe the methodological implications related to the study of speech sequences. Patient characteristics, a consultation index of partnership and doctor-patient speech sequences were recorded for 246 primary care consultations in six primary care surgeries in Verona, Italy. Homogeneity and stationarity conditions of speech sequences allowed the creation of a hierarchy of multilevel logit models including micro and macro level variables, with the presence/absence of cues as the dependent variable. We found that emotional distress of the patient increased cues and that cues appeared among other patient expressions and were preceded by physicians' facilitations and handling of emotion. Partnership, in terms of open-ended inquiry, active listening skills and handling of emotion by the physician and active participation by the patient throughout the consultation, reduced cue frequency.
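
The micro-level (sequence) part of such an analysis starts from simple conditional frequencies: how often a patient cue follows each type of utterance. A minimal sketch, with hypothetical speech categories (`facilitation`, `patient_cue`, etc.) standing in for the study's actual coding scheme; the full analysis would feed such sequences, nested within consultations, into multilevel logit models:

```python
from collections import Counter

def cue_follow_rates(sequence, cue_label="patient_cue"):
    """sequence: coded speech turns in consultation order.
    Returns, for each preceding turn type, the fraction of times
    the next turn is a patient cue."""
    totals, cues = Counter(), Counter()
    for prev, nxt in zip(sequence, sequence[1:]):
        totals[prev] += 1
        if nxt == cue_label:
            cues[prev] += 1
    return {cat: cues[cat] / totals[cat] for cat in totals}

turns = ["facilitation", "patient_cue", "closed_question", "info",
         "facilitation", "patient_cue", "facilitation", "info"]
rates = cue_follow_rates(turns)
# rates["facilitation"] == 2/3: two of three facilitations precede a cue
```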

  11. Cold source economic study

    International Nuclear Information System (INIS)

    Fuster, Serge.

    1975-01-01

    This computer code establishes the overall economic balance associated with a given cold source. The balance includes the investment needed to construct the various equipment and the production balances resulting from its operation. The code handles either an open-circuit condenser cooled by sea or river water, or closed-circuit air-cooling systems used alone or as auxiliaries. The program can be used to optimize the characteristics of the various parts of the cold source. For a given situation, the performance of the different equipment can be evaluated from detailed and precise economic balances, and the equipment can be ranked according to its possible uses, taking external constraints into account (limits on heat discharge into rivers or the sea, water temperature, air temperature). Technical choices with important economic consequences are thus clarified [fr

  12. Quantitative analysis of Internet television and video (WebTV: A study of formats, content, and source

    Directory of Open Access Journals (Sweden)

    José Borja ARJONA MARTÍN

    2014-07-01

    Full Text Available Due to the significant increase in audiovisual content distribution over the web in the last five years, this paper focuses on a study aimed at the description and classification of a wide sample of audiovisual initiatives accessed by means of the World Wide Web. The purpose of this study is to promote debate concerning the different names of these incipient media, as well as their categorization and description, so that an organised universe of the WebTV phenomenon can be provided. An analysis of formats and content is carried out on the basis of quantitative techniques in order to propose a categorization typology. These formats and content are studied under three key variables: "Content", "Source" and "Domain .tv". "Content" helps define the programmatic lines of the study sample; "Source" refers to the origin of a particular item of study (native WebTV or the WebTV arm of a conventional medium); and "Domain .tv" specifies the proportion of case studies hosted under the .tv domain. The results obtained in this study offer researchers and professionals a comprehensive description of the models currently adopted in the field of video and television on the net.

  13. Duopigatron ion source studies

    International Nuclear Information System (INIS)

    Bacon, F.M.; Bickes, R.W. Jr.; O'Hagan, J.B.

    1978-07-01

    Ion source performance characteristics consisting of total ion current, ion energy distribution, mass distribution, and ion current density distribution were measured for several models of a duopigatron. Variations on the duopigatron design involved plasma expansion cup material and dimensions, secondary cathode material, and interelectrode spacings. Of the designs tested, the one with a copper and molybdenum secondary cathode and a mild steel plasma expansion cup gave the best results. The ion current density distribution was peaked at the center of the plasma expansion cup and fell off to 80 percent of the peak value at the cup wall for a cup 15.2 mm deep. A total ion current of 180 mA consisting of 60 to 70 percent atomic ions was produced with an arc current of 20 A and a source pressure of 9.3 Pa. Shallower cups produced a larger beam current and a more sharply peaked ion current density distribution. Typical ion energy distributions were bell-shaped curves with a peak 10 to 20 V below anode potential and with ion energies extending 30 to 40 V on either side of the peak

  14. Experimental study of high current negative ion sources D- / H-. Analysis based on the simulation of the negative ion transport in the plasma source

    International Nuclear Information System (INIS)

    Riz, D.

    1996-01-01

    In the frame of the development of a neutral beam injection system able to work on the ITER tokamak (International Thermonuclear Experimental Reactor), two negative ion sources, Dragon and Kamaboko, have been installed on the MANTIS test bed in Cadarache and studied with the goal of extracting 20 mA/cm2 of D-. The two production modes of negative ions have been investigated: volume production, and surface production after cesium injection in the discharge. Experiments have shown that cesium seeding is necessary in order to reach the performance required for ITER. 20 mA/cm2 have been extracted from the Kamaboko source for an arc power density of 2.5 kW/liter. Simultaneously, a code called NIETZSCHE has been developed to simulate the transport of negative ions in the source plasma, from their birth place to the extraction holes. The ion trajectory is calculated by numerically solving the 3D motion equation, while the atomic processes of destruction, elastic collisions H-/H+ and charge exchange H-/H0 are handled at each time step by a Monte Carlo procedure. The code gives the extraction probability of a negative ion produced at a given location. The calculations performed with NIETZSCHE have explained several phenomena observed on negative ion sources, such as the isotopic effect H-/D- and the influence of the polarisation of the plasma grid and of the magnetic filter on the negative ion current. The code has also shown that, in the type of sources contemplated for ITER, working with large arc power densities (> 1 kW/liter), only negative ions produced in the volume at a distance of less than 2 cm from the plasma grid, and those produced at the grid surface, have a chance of being extracted. (author)

  15. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  16. A Pilot Study of EEG Source Analysis Based Repetitive Transcranial Magnetic Stimulation for the Treatment of Tinnitus.

    Directory of Open Access Journals (Sweden)

    Hui Wang

    Full Text Available Repetitive Transcranial Magnetic Stimulation (rTMS) is a novel therapeutic tool to induce suppression of tinnitus. However, the optimal target sites are unknown. We aimed to determine whether low-frequency rTMS, navigated by high-density electroencephalogram (EEG) source analysis, induced lasting suppression of tinnitus by decreasing neural activity in the cortex, and to assess the utility of EEG for targeting treatment. In this controlled three-armed trial, seven normal-hearing patients with tonal tinnitus received a 10-day course of 1-Hz rTMS under three conditions: to the cortical target identified by high-density EEG source analysis, to the left temporoparietal region, and sham stimulation at the left temporoparietal site. The Tinnitus Handicap Inventory (THI) and a visual analog scale (VAS) were used to assess tinnitus severity and loudness. Measurements were taken before, and immediately, 2 weeks, and 4 weeks after the end of the interventions. Low-frequency rTMS decreased tinnitus significantly after active, but not sham, treatment. Responders in the EEG source analysis-based rTMS group, 71.4% (5/7) of patients, experienced a significant reduction in tinnitus loudness, as evidenced by VAS scores. The target site of the neuronal generators most consistently associated with a positive response was the frontal lobe in the right hemisphere, localized using high-density EEG equipment. After left temporoparietal rTMS stimulation, 42.8% (3/7) of patients experienced a decrease in tinnitus loudness. Active EEG source analysis-based rTMS resulted in significant suppression of tinnitus loudness, showing the superiority of neuronavigation-guided coil positioning in dealing with tinnitus. Non-auditory areas should be considered in the pathophysiology of tinnitus. This knowledge can in turn contribute to investigations of the pathophysiology of tinnitus.

  17. Neural correlates of interference resolution in the multi-source interference task: a meta-analysis of functional neuroimaging studies.

    Science.gov (United States)

    Deng, Yuqin; Wang, Xiaochun; Wang, Yan; Zhou, Chenglin

    2018-04-10

    Interference resolution refers to cognitive control processes that enable one to focus on task-related information while filtering out unrelated information. However, the exact neural areas that underlie interference resolution in a specific cognitive task are still equivocal. The multi-source interference task (MSIT) is a well-established experimental paradigm used to evaluate interference resolution. Studies combining the MSIT with functional magnetic resonance imaging (fMRI) have shown that the MSIT evokes the dorsal anterior cingulate cortex (dACC) and cingulate-frontal-parietal cognitive-attentional networks. However, these brain areas have not been evaluated quantitatively and these findings have not been replicated. In the current study, we report the first voxel-based meta-analysis of functional brain activation associated with the MSIT, so as to identify the localization of interference resolution in this specific cognitive task. Articles on MSIT-related fMRI published between 2003 and July 2017 were eligible. The electronic databases searched included PubMed, Web of Knowledge, and Google Scholar. Differential BOLD activation patterns between the incongruent and congruent conditions were meta-analyzed in anisotropic effect-size signed differential mapping software. Robustness meta-analysis indicated that two significant activation clusters showed reliable functional activity in comparisons between incongruent and congruent conditions. The first reliable activation cluster, which included the dACC, medial prefrontal cortex, and supplementary motor area, replicated the results of previous MSIT-related fMRI studies. Furthermore, we found another reliable activation cluster comprising areas of the right insula, right inferior frontal gyrus, and right lenticular nucleus-putamen, which were not typically discussed in previous MSIT-related fMRI studies. The current meta-analysis study presents the reliable brain activation patterns…

  18. Monte Carlo design study of a moderated {sup 252}Cf source for in vivo neutron activation analysis of aluminium

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, D.G.; Natto, S.S.A.; Evans, C.J. [Swansea In Vivo Analysis and Cancer Research Group, Department of Physics, University of Wales, Swansea (United Kingdom); Ryde, S.J.S. [Swansea In Vivo Analysis and Cancer Research Group, Department of Medical Physics and Clinical Engineering, Singleton Hospital, Swansea (United Kingdom)

    1997-04-01

    The Monte Carlo computer code MCNP has been used to design a moderated {sup 252}Cf neutron source for in vivo neutron activation analysis of aluminium (Al) in the bones of the hand. The clinical motivation is the need to monitor Al body burden in subjects with renal dysfunction, at risk of Al toxicity. The design involves the source positioned on the central axis at one end of a cylindrical deuterium oxide moderator. The moderator is surrounded by a graphite reflector, with the hand inserted at the end of the moderator opposing the source. For a 1 mg {sup 252}Cf source, a 15 cm long x 20 cm radius moderator and a 20 cm thick reflector, the estimated minimum detection limit is .5 mg Al for a 20 min irradiation, with an equivalent dose of 16.5 mSv to the hand. Increasing the moderator length and/or introducing a fast neutron filter (for example silicon) further reduces interference from fast-neutron-induced reactions on phosphorus in bone, at the expense of a decreased fluence of the thermal neutrons which activate Al. Increased source strengths may be necessary to compensate for this decreased thermal fluence, or to allow measurements to be made within an acceptable time limit for the comfort of the patient. (author)

  19. Knowledge Sources and Opinions of Prospective Social Studies Teachers about Possible Risk and Benefit Analysis: Nuclear Energy and Power Stations

    Science.gov (United States)

    Yazici, Hakki; Bulut, Ramazan; Yazici, Sibel

    2016-01-01

    This study aimed to determine the degree to which prospective social studies teachers trust various knowledge sources related to nuclear energy and power stations, regarded as a controversial socio-scientific issue, and their perceptions of the possible risks and benefits of nuclear energy and power stations. Target population of the…

  20. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    International Nuclear Information System (INIS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-01-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, triggering a tsunami. This event caused many casualties and material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. Stages of the research include a literature study, collection and processing of seismic data, seismic source characterization, and analysis of earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the value of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented in the form of a map with a return period of 2475 years and earthquake hazard curves for the cities of Medan and Banda Aceh. (paper)
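
The probabilistic core of PSHA can be illustrated with a toy calculation: integrate a Gutenberg-Richter magnitude-frequency law against a ground-motion model and convert the annual exceedance rate into a probability over an exposure window. All coefficients below are illustrative placeholders, not values from the study, and aleatory ground-motion variability (the sigma of the attenuation relation) is omitted for brevity:

```python
import math

def annual_exceedance_rate(pga_g, dist_km, a=4.0, b=1.0,
                           m_min=5.0, m_max=9.0, dm=0.1):
    """Annual rate of ground motion exceeding pga_g (in g) at dist_km from a
    point source. Magnitude rates follow Gutenberg-Richter,
    log10 N(>=m) = a - b*m; the median ground motion comes from a toy
    attenuation relation ln PGA = -3.5 + 0.9*m - 1.2*ln(dist_km)."""
    rate = 0.0
    steps = int(round((m_max - m_min) / dm))
    for i in range(steps):
        m_lo = m_min + i * dm
        n_bin = 10**(a - b*m_lo) - 10**(a - b*(m_lo + dm))  # events/yr in bin
        median_pga = math.exp(-3.5 + 0.9*(m_lo + dm/2) - 1.2*math.log(dist_km))
        if median_pga >= pga_g:          # no aleatory sigma, for brevity
            rate += n_bin
    return rate

rate_low = annual_exceedance_rate(0.05, 100.0)   # weaker shaking: exceeded more often
rate_high = annual_exceedance_rate(0.20, 100.0)  # stronger shaking: rarer
p_2475yr = 1 - math.exp(-rate_low * 2475)        # Poisson exceedance probability
```

A real PSHA sums such integrals over all characterized sources and magnitude-distance combinations, and the 2475-year return period corresponds to the familiar 2% probability of exceedance in 50 years.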

  1. Observation of Point-Light-Walker Locomotion Induces Motor Resonance When Explicitly Represented; An EEG Source Analysis Study

    Directory of Open Access Journals (Sweden)

    Alberto Inuggi

    2018-03-01

    Full Text Available Understanding human motion, to infer the goal of others' actions, is thought to involve the observer's motor repertoire. One prominent class of actions, human locomotion, has been the object of several studies, all focused on manipulating the shape of degraded human figures such as point-light walker (PLW) stimuli, represented as walking on the spot. Nevertheless, since the main goal of the locomotor function is to displace the whole body from one position to another, these stimuli might not fully represent a goal-directed action and thus might not induce the same motor resonance mechanism expected when observing natural locomotion. To explore this hypothesis, we recorded the event-related potentials (ERPs) elicited by decoding canonical/scrambled and translating/centered PLWs. We identified a novel ERP component (N2c) over central electrodes, around 435 ms after stimulus onset, for translating compared to centered PLWs, only when the canonical shape was preserved. Consistently with our hypothesis, source analysis associated this component with the activation of the trunk and lower-leg primary sensory-motor and supplementary motor areas. These results confirm the role of the observer's own motor repertoire in processing human action and suggest that ERPs can detect the associated motor resonance only when the human figure is explicitly involved in performing a meaningful action.

  2. Broadband Studies of Semsmic Sources at Regional and Teleseismic Distances Using Advanced Time Series Analysis Methods. Volume 1.

    Science.gov (United States)

    1991-03-21

    discussion of spectral factorability and motivations for broadband analysis, the report is subdivided into four main sections. In Section 1.0, we...estimates. The motivation for developing our multi-channel deconvolution method was to gain information about seismic sources, most notably, nuclear...with complex constraints for estimating the rupture history. Such methods (applied mostly to data sets that also include strong motion data), were

  3. Source attribution of human salmonellosis using a meta-analysis of case-control studies of sporadic infections

    DEFF Research Database (Denmark)

    Coutinho Calado Domingues, Ana Rita; Pires, Sara Monteiro; Hisham Beshara Halasa, Tariq

    2012-01-01

    Salmonella is an important cause of human illness. Disease is frequently associated with foodborne transmission, but other routes of exposure are recognized. Identifying sources of disease is essential for prioritizing public health interventions. Numerous case-control studies of sporadic salmone...

  4. Identification of sources of lead exposure in French children by lead isotope analysis: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Lucas Jean-Paul

    2011-08-01

    Full Text Available Abstract. Background: The amount of lead in the environment has decreased significantly in recent years, and so has exposure. However, there is no known safe exposure level and, therefore, the exposure of children to lead, although low, remains a major public health issue. With lower levels of exposure, it is becoming more difficult to identify lead sources, and new approaches may be required for preventive action. This study assessed the usefulness of lead isotope ratios for identifying sources of lead, using data from a nationwide sample of French children aged from six months to six years with blood lead levels ≥25 μg/L. Methods: Blood samples were taken from 125 children, representing about 600,000 French children; environmental samples were taken from their homes and personal information was collected. Lead isotope ratios were determined using quadrupole ICP-MS (inductively coupled plasma mass spectrometry) and the isotopic signatures of potential sources of exposure were matched with those of blood in order to identify the most likely sources. Results: In addition to the interpretation of lead concentrations, lead isotope ratios were potentially of use for 57% of children aged from six months to six years with blood lead levels ≥25 μg/L (7% of all children in France, about 332,000 children) who had at least one potential source of lead and sufficiently well discriminated lead isotope ratios. Lead isotope ratios revealed a single suspected source of exposure for 32% of the subjects and were able to eliminate at least one unlikely source of exposure for 30% of the children. Conclusions: In France, lead isotope ratios could provide valuable additional information in about a third of routine environmental investigations.
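
The matching step described in the Methods can be illustrated as a simple screening: a household source is retained as plausible if its isotope ratio lies within measurement tolerance of the child's blood ratio. The ratios and tolerance below are hypothetical; the study's actual procedure uses measured uncertainties and more than one isotope ratio:

```python
def compatible_sources(blood_ratio, sources, tol=0.002):
    """Return candidate sources whose 206Pb/207Pb ratio lies within `tol`
    of the child's blood ratio (simple screening; a full analysis would
    propagate measurement uncertainty and compare several ratios)."""
    return [name for name, ratio in sources.items()
            if abs(ratio - blood_ratio) <= tol]

household = {"paint_flakes": 1.166, "tap_water": 1.185,
             "house_dust": 1.167, "soil": 1.203}
likely = compatible_sources(1.167, household)   # ['paint_flakes', 'house_dust']
excluded = [s for s in household if s not in likely]
```

This mirrors the two uses reported above: pointing to a single suspected source when only one candidate matches, and eliminating unlikely sources whose signatures are incompatible with the blood.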

  5. Study on sources of colored glaze of Xiyue Temple in Shanxi province by INAA and multivariable statistical analysis

    International Nuclear Information System (INIS)

    Cheng Lin; Feng Songlin

    2005-01-01

    The major, minor and trace elements in the bodies of ancient colored glazes from the site of Xiyue Temple and from the Lidipo kiln in Shanxi province, unearthed from strata of the Song, Yuan, Ming, Early Qing and Late Qing dynasties, were analyzed by instrumental neutron activation analysis (INAA). The results of multivariable statistical analyses show that the chemical compositions of the colored glaze bodies are stable from the Song to the Early Qing dynasty, but distinctly different from those of the Late Qing. Probably, the raw materials of the ancient colored glazes from the Song to the Early Qing came from the site of Xiyue Temple. The chemical compositions of three pieces of colored glaze from the Ming dynasty and those from the Late Qing are similar to those of the Lidipo kiln. From this, the authors conclude that the colored glazes of Xiyue Temple in the Late Qing dynasty were fired at the Lidipo kiln. (authors)

  6. Validation Study for an Atmospheric Dispersion Model, Using Effective Source Heights Determined from Wind Tunnel Experiments in Nuclear Safety Analysis

    Directory of Open Access Journals (Sweden)

    Masamichi Oura

    2018-03-01

    Full Text Available For more than fifty years, atmospheric dispersion predictions based on the joint use of a Gaussian plume model and wind tunnel experiments have been applied in both Japan and the U.K. for the evaluation of public radiation exposure in nuclear safety analysis. The effective source height used in the Gaussian model is determined from ground-level concentration data obtained by a wind tunnel experiment using a scaled terrain and site model. In the present paper, the concentrations calculated by this method are compared with data observed over complex terrain in the field, under a number of meteorological conditions. Good agreement was confirmed under near-neutral and unstable stability conditions. However, it was found necessary to reduce the effective source height by 50% in order to achieve a conservative estimate of the field observations in a stable atmosphere.
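
The Gaussian plume model underlying this method gives the ground-level concentration from an elevated release of effective height H, with total reflection at the ground. A minimal sketch (the linear sigma curves and all numerical inputs are illustrative placeholders, not the study's parameterization); note how halving the effective source height raises the predicted ground-level concentration, which is the direction of the conservative 50% adjustment reported for stable conditions:

```python
import math

def ground_level_conc(q_gs, u_ms, x_m, y_m, h_eff_m,
                      sy_coef=0.08, sz_coef=0.06):
    """Ground-level (z = 0) concentration (g/m^3) of a Gaussian plume with
    total ground reflection. q_gs: source strength (g/s), u_ms: wind speed
    (m/s), x_m/y_m: downwind and crosswind distances (m), h_eff_m: effective
    source height (m). sigma_y and sigma_z grow linearly with x here, as a
    crude stand-in for stability-class dispersion curves."""
    sy, sz = sy_coef * x_m, sz_coef * x_m
    return (q_gs / (math.pi * u_ms * sy * sz)
            * math.exp(-y_m**2 / (2 * sy**2))
            * math.exp(-h_eff_m**2 / (2 * sz**2)))

c_full = ground_level_conc(10.0, 3.0, 500.0, 0.0, 100.0)  # H = 100 m
c_half = ground_level_conc(10.0, 3.0, 500.0, 0.0, 50.0)   # H reduced by 50%
```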

  7. Statistical studies of powerful extragalactic radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, J T

    1981-01-01

    This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and is followed by three chapters on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density of the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful available test based on non-parametric statistics. It is therefore used to study the dependence of source properties on size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.
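
The Spearman partial rank correlation used in the penultimate chapter measures the rank correlation of two variables with a third held constant, via the standard partial-correlation formula applied to ranks. A self-contained sketch (variable names are generic; x and y could, for example, be two source properties and z the redshift):

```python
def rankdata(xs):
    """Average ranks, with ties sharing the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend over the run of tied values
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def spearman_partial(x, y, z):
    """Rank correlation of x and y with the effect of z held constant."""
    rx, ry, rz = rankdata(x), rankdata(y), rankdata(z)
    rxy, rxz, ryz = pearson(rx, ry), pearson(rx, rz), pearson(ry, rz)
    return (rxy - rxz * ryz) / ((1 - rxz**2) * (1 - ryz**2)) ** 0.5
```

Being rank-based, the statistic needs no distributional assumptions about the underlying quantities, which is why it suits heterogeneous radio-source catalogues.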

  8. Source and LINAC3 studies

    CERN Document Server

    Bellodi, G

    2017-01-01

    In the framework of the LHC Ion Injector Upgrade programme (LIU), several activities have been carried out in 2016 to improve the ion source and Linac3 performance, with the goal to increase the beam current routinely delivered to LEIR. The extraction region of the GTS-LHC ion source was upgraded with enlarged vacuum chamber apertures and the addition of an einzel lens, yielding higher transmission through the rest of the machine. Also, a series of experiments have been performed to study the effects of double frequency mixing on the afterglow performance of the source after installation of a Travelling Wave Tube Amplifier (TWTA) as secondary microwave source at variable frequency. Measurements have been carried out at a dedicated oven test stand for better understanding of the ion source performance. Finally, several MD sessions were dedicated to the study and characterization of the stripping foils, after evidence of degradation in time was discovered in the 2015 run.

  9. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and an overview of the implementation of source term analysis in regulatory decisions.

  10. Food Sources of Sodium Intake in an Adult Mexican Population: A Sub-Analysis of the SALMEX Study

    Science.gov (United States)

    Colin-Ramirez, Eloisa; Miranda-Alatriste, Paola Vanessa; Tovar-Villegas, Verónica Ivette; Arcand, JoAnne; Correa-Rotter, Ricardo

    2017-01-01

    Excessive dietary sodium intake increases blood pressure and cardiovascular risk. In Western diets, the majority of dietary sodium (≈75%) comes from packaged and prepared foods; in Mexico, however, no data are available on the main food sources of dietary sodium. The main objective of this study was to identify and characterize the major food sources of dietary sodium in a sample of the Salt and Mexico (SALMEX) cohort. Adult male and female participants of the SALMEX study who provided a complete and valid three-day food record during the baseline visit were included. Overall, 950 participants (mean age 38.6 ± 10.7 years) were analyzed to determine the total sodium contributed by the main food sources identified. In the overall population, mean daily sodium intake estimated by three-day food records and by 24-h urinary sodium excretion was 2647.2 ± 976.9 and 3497.2 ± 1393.0 mg/day, respectively. Processed meat was the main contributor to daily sodium intake, representing 8% of total sodium intake per capita as measured by three-day food records. When savory bread (8%) and sweet bakery goods (8%) were considered together as bread products, they were the major contributor to daily sodium intake, accounting for 16% of total sodium intake, followed by processed meat (8%), natural cheeses (5%), and tacos (5%). These results highlight the need for public health policies focused on reducing the sodium content of processed food in Mexico. PMID:28749449
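The per-food-group contributions reported above amount to a group-and-normalize over the food-record rows; a minimal sketch in pandas, with entirely made-up record rows (the food-group names follow the abstract, the sodium values do not):

```python
import pandas as pd

# Hypothetical three-day food-record rows: (participant, food group, sodium per item, mg)
rows = [
    (1, "processed meat", 450), (1, "savory bread", 300), (1, "natural cheese", 200),
    (2, "sweet bakery", 350), (2, "tacos", 280), (2, "processed meat", 400),
    (3, "savory bread", 320), (3, "sweet bakery", 250), (3, "tacos", 150),
]
df = pd.DataFrame(rows, columns=["participant", "food_group", "sodium_mg"])

# Total sodium contributed by each food group, as a percentage of all recorded sodium
by_group = df.groupby("food_group")["sodium_mg"].sum()
share = (100 * by_group / by_group.sum()).sort_values(ascending=False)
print(share.round(1))
```

With real food records the same split-apply-combine step, run per participant and averaged, yields the per-capita shares quoted in the abstract.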

  11. Group Independent Component Analysis (gICA) and Current Source Density (CSD) in the study of EEG in ADHD adults.

    Science.gov (United States)

    Ponomarev, Valery A; Mueller, Andreas; Candrian, Gian; Grin-Yatsenko, Vera A; Kropotov, Juri D

    2014-01-01

    To investigate the performance of spectral analysis of resting EEG, Current Source Density (CSD) and group independent components (gIC) in diagnosing ADHD in adults. Power spectra of resting EEG, CSD and gIC (19 channels, linked-ears reference, eyes open/closed) from 96 ADHD and 376 healthy adults were compared between eyes-open and eyes-closed conditions, and between groups of subjects. The pattern of differences in gIC and CSD spectral power between conditions was approximately similar, whereas for EEG it was more widely spatially distributed. The effect size (Cohen's d) of differences in gIC and CSD spectral power between groups of subjects was considerably greater than in the case of EEG. A significant reduction of gIC and CSD spectral power depending on condition was found in ADHD patients. Reduced power in a wide frequency range in the fronto-central areas was a common phenomenon regardless of whether the eyes were open or closed. Spectral power of local EEG activity isolated by gICA or CSD in the fronto-central areas may be a suitable marker for discriminating ADHD from healthy adults. Spectral analysis of gIC and CSD provides better sensitivity for discriminating ADHD from healthy adults. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
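Cohen's d, the effect-size measure compared above, is the group mean difference divided by the pooled standard deviation. The sketch below uses simulated spectral-power values; only the group sizes (96 ADHD, 376 healthy) follow the abstract, all power values are invented.

```python
import numpy as np

def cohens_d(a, b):
    """Effect size between two groups: mean difference over pooled SD."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                        / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled_sd

rng = np.random.default_rng(1)
healthy = rng.normal(10.0, 2.0, 376)  # e.g. fronto-central spectral power, arbitrary units
adhd    = rng.normal(8.5, 2.0, 96)    # reduced power in the patient group (assumed)
print(round(cohens_d(healthy, adhd), 2))
```

A larger d for gIC/CSD power than for raw EEG power is what the study reports as better group discrimination.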

  12. Does recruitment source moderate treatment effectiveness? A subgroup analysis from the EVIDENT study, a randomised controlled trial of an internet intervention for depressive symptoms.

    Science.gov (United States)

    Klein, Jan Philipp; Gamon, Carla; Späth, Christina; Berger, Thomas; Meyer, Björn; Hohagen, Fritz; Hautzinger, Martin; Lutz, Wolfgang; Vettorazzi, Eik; Moritz, Steffen; Schröder, Johanna

    2017-07-13

    This study aims to examine whether the effects of internet interventions for depression generalise to participants recruited in clinical settings. This study uses subgroup analysis of the results of a randomised, controlled, single-blind trial. The study takes place in five diagnostic centres in Germany. A total of 1013 people with mild to moderate depressive symptoms were recruited from clinical sources as well as internet forums, statutory insurance companies and other sources. This study uses either care-as-usual alone (control) or a 12-week internet intervention (Deprexis) plus usual care (intervention). The primary outcome measure was self-rated depression severity (Patient Health Questionnaire-9) at 3 months and 6 months. Further measures ranged from demographic and clinical parameters to a measure of attitudes towards internet interventions (Attitudes towards Psychological Online Interventions Questionnaire). The recruitment source was only associated with very few of the examined demographic and clinical characteristics. Compared with participants recruited from clinical sources, participants recruited through insurance companies were more likely to be employed. Clinically recruited participants were as severely affected as those from other recruitment sources but more sceptical of internet interventions. The effectiveness of the intervention was not differentially associated with recruitment source (treatment by recruitment source interaction=0.28, p=0.84). Our results support the hypothesis that the intervention we studied is effective across different recruitment sources including clinical settings. ClinicalTrials.gov NCT01636752. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
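The treatment-by-recruitment-source interaction reported above can be tested with a regression containing an interaction term. A minimal least-squares sketch on simulated data follows; all coefficients, and the built-in absence of an interaction, are assumptions of the toy example, not results from the trial.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
treat  = rng.integers(0, 2, n)   # 1 = internet intervention + usual care, 0 = usual care
source = rng.integers(0, 2, n)   # 1 = clinical recruitment, 0 = other sources

# Simulated PHQ-9 change: treatment helps equally in both recruitment groups,
# so the true interaction coefficient is zero
y = -2.0 * treat + 0.5 * source + rng.normal(0, 3, n)

# Design matrix: intercept, treatment, source, treatment x source interaction
X = np.column_stack([np.ones(n), treat, source, treat * source])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta.round(2))  # last entry is the interaction coefficient
```

A small, non-significant interaction coefficient is the pattern the trial reports: effectiveness does not differ by recruitment source.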

  13. Monitoring and Analysis of Nonpoint Source Pollution - Case study on terraced paddy fields in an agricultural watershed

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Yeh, Chun-Lin

    2013-04-01

    The intensive use of chemical fertilizer has negatively impacted the environment in recent decades, mainly through water pollution by nitrogen (N) and phosphorus (P) originating from agricultural activities. As the main crop, with the largest cultivated area (about 0.25 million ha per year) in Taiwan, rice paddies account for a significant share of fertilizer consumption among agricultural crops. This study evaluated the impact of paddy-field fertilization on return-flow water quality in an agricultural watershed located in Hsinchu County, northern Taiwan. Water quality was monitored over two crop periods in 2012 across different water bodies, including irrigation water, drainage water, and shallow groundwater. The results indicated that marked increases in ammonium-N, nitrate-N, and TP concentrations in the surface drainage water were observed immediately following each of the three fertilizer applications (basal, tillering, and panicle), but concentrations returned to relatively low levels within 7-10 days of each application. Groundwater quality monitoring showed that the shallower the observation well, the more significant the variation in ammonium-N, nitrate-N, and TP concentrations, which means that the nutrient contamination potential of groundwater is related not only to the impermeable plow sole layer but also to the length of the percolation route in this area. The study also showed that the potential nutrient pollution load could be further reduced by well-managed drainage control and rational fertilizer management, such as deep-water irrigation, reuse of return flow, rational fertilizer application, and the SRI (System of Rice Intensification) method. The results of this study can serve as an evaluation basis for formulating effective measures for agricultural non-point source pollution control and the reuse of agricultural return flow.

  14. Optimization of H.E.S.S. instrumental performances for the analysis of weak gamma-ray sources: Application to the study of HESS J1832-092

    International Nuclear Information System (INIS)

    Laffon, H.

    2012-01-01

    H.E.S.S. (High Energy Stereoscopic System) is an array of very-high-energy gamma-ray telescopes located in Namibia. These telescopes exploit the atmospheric Cherenkov technique with stereoscopy, allowing gamma-rays between 100 GeV and a few tens of TeV to be detected. The location of the H.E.S.S. telescopes in the Southern hemisphere makes it possible to observe the central parts of our galaxy, the Milky Way. Tens of new gamma-ray sources were thereby discovered thanks to the galactic plane survey strategy. After ten years of fruitful observations with many detections, it is now necessary to improve the detector performance in order to detect new sources by increasing the sensitivity and improving the angular resolution. The aim of this thesis is the development of advanced analysis techniques allowing sharper analyses. An automatic tool to search for new sources and to improve the subtraction of the background noise is presented. It is optimized for the study of weak sources, which require a very rigorous analysis. A combined reconstruction method is built to improve the angular resolution without reducing the statistics, which is critical for weak sources. These advanced methods are applied to the analysis of a complex region of the galactic plane near the supernova remnant G22.7-0.2, leading to the detection of a new source, HESS J1832-092. Multi-wavelength counterparts are shown and several scenarios are considered to explain the origin of the gamma-ray signal from this astrophysical object. (author)

  15. Hydrodynamic analysis of the interaction of two operating groundwater sources, case study: Groundwater supply of Bečej

    Directory of Open Access Journals (Sweden)

    Polomčić Dušan M.

    2014-01-01

    The existing groundwater source 'Vodokanal' for the public water supply of the city of Bečej in Serbia taps groundwater from three water-bearing horizons through 15 wells with a total capacity of 100 l/s. Near the public source lies the industrial groundwater source 'Soja Protein', with a current capacity of 12 l/s, which taps the same horizons. An increase of the total capacity of this industrial source to 57 l/s is planned for the coming period, as well as an increase of the city source's capacity by 50 l/s in the next few years. This amounts to an additional 84% of groundwater abstraction from the same water-bearing horizons. Hydrodynamic modeling based on the numerical finite-difference method is applied to show the impact of increasing the total capacity of the 'Soja Protein' source on groundwater levels at the 'Vodokanal' source, and the effects of the additional drawdown in all three water-bearing horizons on the 'Vodokanal' wells due to the operation of the industrial source. Seven variant solutions for the extension of the groundwater sources were developed, and their effects were simulated over a period of 10 years with the aim of sustainable groundwater management.
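The study uses a numerical finite-difference model; as a much cruder illustration of how one pumping source interferes with another, steady-state Thiem drawdowns can simply be superposed. All aquifer parameters below are invented; only the 100 l/s and 57 l/s capacities come from the abstract.

```python
import numpy as np

def thiem_drawdown(Q, r, T, R=1000.0):
    """Steady-state drawdown (m) at distance r (m) from a well pumping Q (m^3/s)
    in a confined aquifer of transmissivity T (m^2/s), radius of influence R (m)."""
    return Q / (2 * np.pi * T) * np.log(R / np.maximum(r, 0.1))

T = 5e-3                            # assumed transmissivity, m^2/s
d = 800.0                           # assumed distance between the two sources, m
Q_city, Q_industry = 0.100, 0.057   # 100 l/s existing and 57 l/s planned, in m^3/s

# Drawdown at the city source: its own pumping plus interference from industry
s_own = thiem_drawdown(Q_city, r=0.2, T=T)            # at the well radius (assumed 0.2 m)
s_interference = thiem_drawdown(Q_industry, r=d, T=T) # extra drawdown from 'Soja Protein'
print(round(s_own, 2), round(s_interference, 2))
```

Superposition of drawdowns is the linear-theory idealization of the interaction the finite-difference model resolves with realistic geometry and multiple horizons.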

  16. Identifying the source of fluvial terrace deposits using XRF scanning and Canonical Discriminant Analysis: A case study of the Chihshang terraces, eastern Taiwan

    Science.gov (United States)

    Chang, Queenie; Lee, Jian-Cheng; Hunag, Jyh-Jaan; Wei, Kuo-Yen; Chen, Yue-Gau; Byrne, Timothy B.

    2018-05-01

    The source of fluvial deposits in terraces provides important information about catchment fluvial processes and landform evolution. In this study, we propose a novel approach that combines high-resolution Itrax-XRF scanning and Canonical Discriminant Analysis (CDA) to identify the source of fine-grained fluvial terrace deposits. We apply this approach to a group of terraces located on the hanging wall of the Chihshang Fault in eastern Taiwan with two possible sources, the Coastal Range to the east and the Central Range to the west. Our results for standard samples from the two potential sources show distinct ranges of canonical variables, which provide better separation than individual chemical elements. We then tested the approach by applying it to several samples with known sediment sources and obtained positive results. Applying the same approach to the fine-grained sediments in the Chihshang terraces indicates that they are mostly composed of Coastal Range material but also contain some input from the Central Range. In the two lowest terraces, T1 and T2, the fine-grained deposits show a significant Central Range component. For terrace T4, the results show less Central Range input and a trend of decreasing Central Range influence up-section. Coastal Range material becomes dominant in the two highest terraces, T7 and T10. Sediments in terrace T5 appear to have been altered by post-depositional chemical processes and are not included in the analysis. Our results show that the change in source material in the terrace deposits was relatively gradual rather than sharp, as suggested by the composition of the gravels and conglomerates. We suggest that this change in sources is related to a change in the dominant fluvial processes controlled by tectonic activity.
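For two candidate sources, canonical discriminant analysis reduces to Fisher's linear discriminant: project each sample onto the axis that best separates the two source populations. A minimal numpy sketch with invented element intensities follows (the source names echo the abstract, nothing else does):

```python
import numpy as np

def fisher_direction(A, B):
    """Two-class canonical discriminant (Fisher) axis: w = Sw^-1 (mean_A - mean_B)."""
    mA, mB = A.mean(axis=0), B.mean(axis=0)
    # Pooled within-class scatter matrix
    Sw = np.cov(A, rowvar=False) * (len(A) - 1) + np.cov(B, rowvar=False) * (len(B) - 1)
    return np.linalg.solve(Sw, mA - mB)

rng = np.random.default_rng(3)
# Hypothetical XRF intensities (three elemental channels) for two source standards
coastal = rng.normal([5.0, 2.0, 1.0], 0.5, size=(40, 3))
central = rng.normal([3.0, 3.5, 1.2], 0.5, size=(40, 3))

w = fisher_direction(coastal, central)
score = lambda X: X @ w   # canonical variable for each sample
print(score(coastal).mean() > score(central).mean())
```

An unknown terrace sample is then scored on the same axis and compared with the canonical-variable ranges of the two standards, which is the separation step the abstract describes.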

  17. Blind source separation dependent component analysis

    CERN Document Server

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers with a complete and self-contained body of knowledge about dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation in which three promising blind separation techniques that can tackle mutually correlated sources are presented. The book then focuses on the non-negativity-based methods, the time-frequency-analysis-based methods, and the pre-coding-based methods, respectively.

  18. Quantitative EEG and Current Source Density Analysis of Combined Antiepileptic Drugs and Dopaminergic Agents in Genetic Epilepsy: Two Case Studies.

    Science.gov (United States)

    Emory, Hamlin; Wells, Christopher; Mizrahi, Neptune

    2015-07-01

    Two adolescent females with absence epilepsy were classified, one as having attention deficit disorder and the other as having bipolar disorder. Physical and cognitive exams identified hypotension, bradycardia, and cognitive dysfunction. Their initial electroencephalograms (EEGs) were considered slightly slow, but within normal limits. Quantitative EEG (QEEG) data included relative theta excess and low alpha mean frequencies. A combined treatment of antiepileptic drugs with a catecholamine agonist/reuptake inhibitor was used sequentially. Both patients' physical and cognitive functions improved and they have remained seizure free. The clinical outcomes were correlated with statistically significant changes in QEEG measures toward normal Z-scores in both anterior and posterior regions. In addition, low-resolution electromagnetic tomography (LORETA) Z-scored source correlation analyses of the initial and treated QEEG data showed normalized patterns, supporting a neuroanatomic resolution. This study presents preliminary evidence for a neurophysiologic approach to patients with absence epilepsy and comorbid disorders and may provide a method for further research. © EEG and Clinical Neuroscience Society (ECNS) 2014.

  19. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and highly skill-demanding. There is a lot of commercial GIS software that is well advertised and whose functionality is fairly well known, while open source software is often overlooked. This diploma work presents an analysis of the open source GIS software available on the Internet, in the scope of different projects interr...

  20. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    Science.gov (United States)

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  1. Studies and modeling of cold neutron sources

    International Nuclear Information System (INIS)

    Campioni, G.

    2004-11-01

    With the purpose of updating knowledge in the field of cold neutron sources, the work of this thesis was organized along the following three axes. First, the gathering of the specific information forming the material of this work. This body of knowledge covers the following fields: cold neutrons, cross-sections for the different cold moderators, flux slowing-down, different measurements of the cold flux and, finally, issues in the thermal analysis of the problem. Secondly, the study and development of suitable computation tools. After an analysis of the problem, several tools were planned, implemented and tested in the 3-dimensional radiation transport code Tripoli-4. In particular, an uncoupling module, integrated in the official version of Tripoli-4, can perform Monte-Carlo parametric studies with CPU-time savings reaching a factor of 50. A coupling module, simulating neutron guides, has also been developed and implemented in the Monte-Carlo code McStas. Thirdly, the achievement of a complete study for the validation of the installed calculation chain. These studies focus on 3 cold sources currently in operation: SP1 of the Orphee reactor and 2 other sources (SFH and SFV) of the HFR at the Laue Langevin Institute. These studies give examples of problems and methods for the design of future cold sources
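A Monte-Carlo parametric study of the kind mentioned above can be illustrated by a toy example unrelated to Tripoli-4 itself: estimating slab transmission for several thicknesses by sampling exponential free paths in a purely absorbing medium (the cross-section and thicknesses are invented; the analytical answer exp(-Σt·d) checks the estimate).

```python
import numpy as np

rng = np.random.default_rng(6)

def transmission_mc(sigma_t, thickness, n=200_000):
    """Fraction of neutrons crossing a purely absorbing slab, estimated by
    sampling exponential free paths; exact answer is exp(-sigma_t * thickness)."""
    paths = rng.exponential(1.0 / sigma_t, size=n)
    return np.mean(paths > thickness)

sigma_t = 0.5   # assumed macroscopic total cross-section, 1/cm
for d in (1.0, 2.0, 4.0):   # parametric scan over slab thickness, cm
    print(d, round(transmission_mc(sigma_t, d), 3), round(np.exp(-sigma_t * d), 3))
```

In a real parametric study, each parameter value would normally require a fresh full transport run; the uncoupling approach described in the thesis avoids exactly that recomputation.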

  2. Advanced Neutron Source enrichment study

    International Nuclear Information System (INIS)

    Bari, R.A.; Ludewig, H.; Weeks, J.R.

    1996-01-01

    A study has been performed of the impact on performance of using low-enriched uranium (20% 235 U) or medium-enriched uranium (35% 235 U) as an alternative fuel for the Advanced Neutron Source, which was initially designed to use uranium enriched to 93% 235 U. Higher fuel densities and larger volume cores were evaluated at the lower enrichments in terms of impact on neutron flux, safety, safeguards, technical feasibility, and cost. The feasibility of fabricating uranium silicide fuel at increasing material density was specifically addressed by a panel of international experts on research reactor fuels. The most viable alternative designs for the reactor at lower enrichments were identified and discussed. Several sensitivity analyses were performed to gain an understanding of the performance of the reactor at parametric values of power, fuel density, core volume, and enrichment that were interpolations between the boundary values imposed on the study or extrapolations from known technology

  3. Spatiotemporal source analysis in scalp EEG vs. intracerebral EEG and SPECT: a case study in a 2-year-old child.

    Science.gov (United States)

    Aarabi, A; Grebe, R; Berquin, P; Bourel Ponchel, E; Jalin, C; Fohlen, M; Bulteau, C; Delalande, O; Gondry, C; Héberlé, C; Moullart, V; Wallois, F

    2012-06-01

    This case study aims to demonstrate that spatiotemporal spike discrimination and source analysis are effective for monitoring the development of sources of epileptic activity in time and space. They can therefore provide clinically useful information allowing a better understanding of the pathophysiology of individual seizures, with time- and space-resolved characteristics of successive epileptic states, including interictal, preictal, postictal, and ictal states. High-spatial-resolution scalp EEGs (HR-EEG) were acquired from a 2-year-old girl with refractory central epilepsy and single-focus seizures, as confirmed by intracerebral EEG recordings and ictal single-photon emission computed tomography (SPECT). Evaluation of the HR-EEG consists of three global steps: (1) creation of the initial head model, (2) automatic spike and seizure detection, and finally (3) source localization. During the source localization phase, epileptic states are determined to allow state-based spike detection and localization of underlying sources for each spike. In a final cluster analysis, localization results are integrated to determine the possible sources of epileptic activity. The results were compared with the cerebral locations identified by intracerebral EEG recordings and SPECT. The results obtained with this approach were concordant with those of MRI, SPECT and the distribution of intracerebral potentials. Dipole cluster centres found for spikes in the interictal, preictal, ictal and postictal states were situated on average 6.3 mm from the intracerebral contacts with the highest voltage. Both the amplitude and shape of spikes changed between states. Dispersion of the dipoles was higher in the preictal state than in the postictal state. Two clusters of spikes were identified. The centres of these clusters changed position periodically during the various epileptic states. High-resolution surface EEG evaluated by an advanced algorithmic approach can be used to investigate the

  4. A study on characteristics and sources of winter time atmospheric aerosols in Kyoto and Seoul using PIXE and supplementary analysis

    International Nuclear Information System (INIS)

    Ma, C.-J.; Kasahara, M.; Tohno, S.; Yeo, H.-G.

    1999-01-01

    Atmospheric aerosols were collected in Kyoto and Seoul during the winter season using a two-stage filter sampler to separate the fine and coarse fractions. Elemental concentrations of the aerosols were analyzed by PIXE and EAS, and ion concentrations by IC. The analyzed data were used to identify the sources of the aerosol particles. (author)

  5. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised data mining methods to explore the facts regarding the crimes of an area of interest. The analysis is based on well-known clustering and association techniques. The results show...

  6. Quality Analysis of DNA from Cord Blood Buffy Coat: The Best Neonatal DNA Source for Epidemiological Studies?

    Science.gov (United States)

    Zhou, Guangdi; Li, Qin; Huang, Lisu; Wu, Yuhang; Wu, Meiqin; Wang, Weiye C

    2016-04-01

    Umbilical cord blood is an economical and easy to obtain source of high-quality neonatal genomic DNA. However, although large numbers of cord blood samples have been collected, information on the yield and quality of the DNA extracted from cord blood is scarce. Moreover, considerable doubt still exists on the utility of the buffy coat instead of whole blood as a DNA source. We compared the sample storage and DNA extraction costs for whole blood, buffy coat, and all-cell pellet. We evaluated three different DNA purification kits and selected the most suitable one to purify 1011 buffy coat samples. We determined the DNA yield and optical density (OD) ratios and analyzed 48 single-nucleotide polymorphisms using time-of-flight mass spectrometry (TOF MS). We also analyzed eight possible preanalytical variables that may correlate with DNA yield or quality. Buffy coat was the most economical and least labor-intensive source for sample storage and DNA extraction. The average yield of genomic DNA from 200 μL of buffy coat sample was 16.01 ± 8.00 μg, which is sufficient for analytic experiments. The mean A260/A280 ratio and the mean A260/A230 ratio were 1.89 ± 0.09 and 1.95 ± 0.66, respectively. More than 99.5% of DNA samples passed the TOF MS test. Only hemolysis showed a strong correlation with OD ratios of DNA, but not with yield. Our findings show that cord blood buffy coat yields high-quality DNA in sufficient quantities to meet the requirements of experiments. Buffy coat was also found to be the most economic, efficient, and stable source of genomic DNA.

  7. Review and meta-analysis of 82 studies on end-of-life management methods for source separated organics.

    Science.gov (United States)

    Morris, Jeffrey; Scott Matthews, H; Morawski, Clarissa

    2013-03-01

    This article reports on a literature review and meta-analysis of 82 studies, mostly life cycle assessments (LCAs), which quantified end-of-life (EOL) management options for organic waste. These studies were reviewed to determine the environmental preferability, or lack thereof, for a number of EOL management methods such as aerobic composting (AC), anaerobic digestion (AD), gasification, combustion, incineration with energy recovery (often denoted as waste-to-energy incineration), mechanical biological treatment, incineration without energy recovery (sometimes referenced by just the word "incineration"), and landfill disposal with and without energy recovery from generated methane. Given the vast differences in boundaries as well as uncertainty and variability in results, the LCAs among the 82 studies provided enough data and results to make conclusions regarding just four EOL management methods - aerobic composting, anaerobic digestion, mass burn waste-to-energy (WTE), and landfill gas-to-energy (LFGTE). For these four, the LCAs proved sufficient to determine that aerobic composting and anaerobic digestion are both environmentally preferable to either WTE or LFGTE in terms of climate change impacts. For climate change, LCA results were mixed for WTE versus LFGTE. Furthermore, there is a lack of empirically reliable estimates of the amount of organics input to AD that is converted to energy output versus remaining in the digestate. This digestate can be processed through aerobic composting into a compost product similar to the compost output from aerobic composting, assuming that the same type of organic materials are managed under AD as are managed via AC. The magnitude of any trade-off between generation of energy and production of compost in an AD system appears to be critical for ranking AC and AD for differing types of organics diversion streams. These results emphasize how little we generally know, and exemplify the fact that in the reviewed literature no

  8. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece)

    International Nuclear Information System (INIS)

    Matiatos, Ioannis

    2016-01-01

    Nitrate (NO₃⁻) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ¹⁵N–NO₃⁻ and δ¹⁸O–NO₃⁻) from groundwater of the Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO₃⁻ sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area, far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater, leading to more effective planning of environmental measures and remediation strategies in river basins and water bodies as defined by the European Water Framework Directive (Directive 2000/60/EC). - Highlights: • More enriched N-isotope values were observed in the industrial/urban areas. • A Bayesian isotope mixing model was applied in a multiple land-use area. • A 3-component model explained the factors controlling nitrate content in groundwater. • Industrial/urban nitrogen source was
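The Principal Component Analysis step can be sketched with numpy alone: standardize the hydrochemical matrix and take the singular value decomposition. The variables and the two latent "sources" below are invented for illustration, not taken from the Asopos data.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 120
# Two hypothetical latent drivers of groundwater chemistry
agri  = rng.normal(size=n)   # fertilizer/manure signal
urban = rng.normal(size=n)   # urban/industrial signal

# Hypothetical hydrochemical matrix: four measured variables mixing the two drivers
X = np.column_stack([
    2.0 * agri + 0.5 * urban,   # e.g. NO3
    0.3 * agri + 1.5 * urban,   # e.g. Cl
    1.8 * agri,                 # e.g. SO4
    1.4 * urban,                # e.g. K
]) + 0.1 * rng.normal(size=(n, 4))

# Standardize, then PCA via SVD; squared singular values give explained variance
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()
print(explained.round(2))   # two components dominate, matching the two latent sources
```

A small number of dominant components with interpretable loadings is exactly what lets a study like this attribute groundwater chemistry to a few contamination factors.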

  9. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double-contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structures to accommodate facility-specific source terms. Example applications are presented here

  10. [Study of self-reported health of people living near point sources of environmental pollution: a review. Second part: analysis of results and perspectives].

    Science.gov (United States)

    Daniau, C; Dor, F; Eilstein, D; Lefranc, A; Empereur-Bissonnet, P; Dab, W

    2013-08-01

    Epidemiological studies have investigated the health impacts of local sources of environmental pollution using as an outcome variable self-reported health, reflecting the overall perception interviewed people have of their own health. This work aims at analyzing the advantages and the results of this approach. This second part presents the results of the studies. Based on a literature review (51 papers), this article presents an analysis of the contribution of self-reported health to epidemiological studies investigating local sources of environmental pollution. It discusses the associations between self-reported health and exposure variables, and other risk factors that can influence health reporting. Studies using self-reported health showed that local sources can be associated with a wide range of health outcomes, including an impact on mental health and well-being. The perception of pollution, especially sensory information such as odors, affects self-reported health. Attitudes referring to beliefs, worries and personal behaviors concerning the source of pollution have a striking influence on reported health. Attitudes can be used to estimate the reporting bias in a biomedical approach, and also constitute the main explanatory factors in biopsychosocial studies taking into account not only the biological, physical, and chemical factors but also the psychological and social factors at stake in a situation of environmental exposure. Studying self-reported health enables a multifactorial approach to health in a context of environmental exposure. This approach is most relevant when conducted within a multidisciplinary framework involving human and social sciences to better understand psychosocial factors. The relevance of this type of approach used as an epidemiological surveillance tool to monitor local situations should be assessed with regard to needs for public health management of these situations. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  11. Optical studies of UHURU sources. XIII. A photometric analysis of X Persei (=3U 0352+30)

    International Nuclear Information System (INIS)

    Gottlieb, E.W.; Wright, E.L.; Liller, W.

    1975-01-01

    2677 magnitudes of the hot massive variable star X Persei, the prime optical candidate for the X-ray source 3U 0352+30, have been obtained from blue plates in the Harvard collection taken between 1894.9 and 1975.1. These data were searched for periods from 0.5 to 180 days, revealing no periodic component of brightness with a full sinusoidal amplitude ≥ 0.05 mag. Other period searches, from 550 to 618 days and from 10.75 to 11.75 hours, likewise gave negative results for full amplitudes ≥ 0.07 mag. The mean light curve is published and its properties are discussed.
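A brute-force period search of the kind described above can be sketched as a least-squares sinusoid fit at each trial period, reporting the full (peak-to-peak) amplitude of the best fit. The epochs, magnitudes, and 7-day test signal below are synthetic stand-ins for the Harvard plate data, not the actual observations:

```python
import numpy as np

def sinusoid_amplitude(t, mag, period):
    """Least-squares fit of a sinusoid at one trial period; returns full amplitude."""
    w = 2.0 * np.pi / period
    # Design matrix: mean level + sine + cosine terms
    A = np.column_stack([np.ones_like(t), np.sin(w * t), np.cos(w * t)])
    coef, *_ = np.linalg.lstsq(A, mag, rcond=None)
    # Full (peak-to-peak) amplitude of the fitted sinusoid
    return 2.0 * np.hypot(coef[1], coef[2])

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1000.0, 500))           # observation epochs (days)
mag = 6.6 + 0.02 * np.sin(2 * np.pi * t / 7.0) + rng.normal(0, 0.05, t.size)

periods = np.linspace(0.5, 180.0, 2000)              # trial periods (days)
amps = np.array([sinusoid_amplitude(t, mag, p) for p in periods])
print(f"max full amplitude over grid: {amps.max():.3f} mag")
```

An upper limit like the 0.05 mag quoted in the abstract would correspond to no trial period exceeding that fitted amplitude above the noise floor.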

  12. Chromatographic fingerprint similarity analysis for pollutant source identification

    International Nuclear Information System (INIS)

    Xie, Juan-Ping; Ni, Hong-Gang

    2015-01-01

    In the present study, a similarity analysis method was proposed to evaluate the source-sink relationships among environmental media for polybrominated diphenyl ethers (PBDEs), which were taken as the representative contaminants. Chromatographic fingerprint analysis has been widely used in the fields of natural products chemistry and forensic chemistry, but its application to environmental science has been limited. We established a library of various sources of media containing contaminants (e.g., plastics), recognizing that the establishment of a more comprehensive library allows for a better understanding of the sources of contamination. We then compared an environmental complex mixture (e.g., sediment, soil) with the profiles in the library. These comparisons could be used as the first step in source tracking. The cosine similarities between plastic and soil or sediment ranged from 0.53 to 0.68, suggesting that plastic in electronic waste is an important source of PBDEs in the environment, but it is not the only source. A similarity analysis between soil and sediment indicated that they have a source-sink relationship. Generally, the similarity analysis method can encompass more relevant information of complex mixtures in the environment than a profile-based approach that only focuses on target pollutants. There is an inherent advantage to creating a data matrix containing all peaks and their relative levels after matching the peaks based on retention times and peak areas. This data matrix can be used for source identification via a similarity analysis without quantitative or qualitative analysis of all chemicals in a sample. - Highlights: • Chromatographic fingerprint analysis can be used as the first step in source tracking. • Similarity analysis method can encompass more relevant information of pollution. • The fingerprints strongly depend on the chromatographic conditions. • A more effective and robust method for identifying similarities is required
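The cosine similarity used above is a simple vector comparison of matched chromatographic peaks. A minimal sketch, with hypothetical relative peak areas for ten matched retention times (the numbers are invented, not from the study):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two peak-area fingerprints."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical relative peak areas at ten matched retention times
plastic  = [0.9, 0.1, 0.5, 0.0, 0.3, 0.8, 0.2, 0.0, 0.4, 0.6]
sediment = [0.7, 0.2, 0.4, 0.1, 0.2, 0.9, 0.1, 0.1, 0.5, 0.5]
sim = cosine_similarity(plastic, sediment)
print(f"cosine similarity: {sim:.2f}")
```

A value near 1 indicates closely matching fingerprints; the 0.53-0.68 range reported above suggests partial but not exclusive contribution of the candidate source.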

  13. Source-system windowing for speech analysis

    NARCIS (Netherlands)

    Yegnanarayana, B.; Satyanarayana Murthy, P.; Eggen, J.H.

    1993-01-01

    In this paper we propose a speech-analysis method to bring out characteristics of the vocal tract system in short segments which are much less than a pitch period. The method performs windowing in the source and system components of the speech signal and recombines them to obtain a signal reflecting

  14. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

    This User's Manual is an attempt to provide for teaching and training purposes, a series of well thought out demonstrative experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  15. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece).

    Science.gov (United States)

    Matiatos, Ioannis

    2016-01-15

    Nitrate (NO3) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ(15)N-NO3 and δ(18)O-NO3) from groundwater of Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO3 sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater leading to a more effective planning of environmental measures and remediation strategies in river basins and water bodies as defined by the European Water Frame Directive (Directive 2000/60/EC).
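SIAR itself is a Bayesian mixing model (an R package) that handles multiple sources and uncertainty; as a much simpler illustration of the underlying mass balance, a deterministic two-endmember isotope mixing calculation might look like the sketch below. The δ15N endmember values are hypothetical, not from the study:

```python
def two_source_fractions(delta_mix, delta_a, delta_b):
    """Two-endmember isotope mass balance:
    delta_mix = f * delta_a + (1 - f) * delta_b, solved for f."""
    f = (delta_mix - delta_b) / (delta_a - delta_b)
    return f, 1.0 - f

# Hypothetical d15N-NO3 values (permil): manure/sewage vs. synthetic fertilizer
f_waste, f_fert = two_source_fractions(delta_mix=8.0, delta_a=12.0, delta_b=2.0)
print(f"waste fraction: {f_waste:.2f}, fertilizer fraction: {f_fert:.2f}")
```

With more than two sources (as in Asopos basin) the system is underdetermined, which is exactly why a Bayesian model such as SIAR is used to estimate proportional contributions with credible intervals.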

  16. On the feasibility of using emergy analysis as a source of benchmarking criteria through data envelopment analysis: A case study for wind energy

    International Nuclear Information System (INIS)

    Iribarren, Diego; Vázquez-Rowe, Ian; Rugani, Benedetto; Benetto, Enrico

    2014-01-01

    The definition of criteria for the benchmarking of similar entities is often a critical issue in analytical studies because of the multiplicity of criteria susceptible to be taken into account. This issue can be aggravated by the need to handle multiple data for multiple facilities. This article presents a methodological framework, named the Em + DEA method, which combines emergy analysis with Data Envelopment Analysis (DEA) for the ecocentric benchmarking of multiple resembling entities (i.e., multiple decision making units or DMUs). Provided that the life-cycle inventories of these DMUs are available, an emergy analysis is performed through the computation of seven different indicators, which refer to the use of fossil, metal, mineral, nuclear, renewable energy, water and land resources. These independent emergy values are then implemented as inputs for DEA computation, thus providing operational emergy-based efficiency scores and, for the inefficient DMUs, target emergy flows (i.e., feasible emergy benchmarks that would turn inefficient DMUs into efficient). The use of the Em + DEA method is exemplified through a case study of wind energy farms. The potential use of CED (cumulative energy demand) and CExD (cumulative exergy demand) indicators as alternative benchmarking criteria to emergy is discussed. The combined use of emergy analysis with DEA is proven to be a valid methodological approach to provide benchmarks oriented towards the optimisation of the life-cycle performance of a set of multiple similar facilities, not being limited to the operational traits of the assessed units. - Highlights: • Combined emergy and DEA method to benchmark multiple resembling entities. • Life-cycle inventory, emergy analysis and DEA as key steps of the Em + DEA method. • Valid ecocentric benchmarking approach proven through a case study of wind farms. • Comparison with life-cycle energy-based benchmarking criteria (CED/CExD + DEA). • Analysts and decision and policy

  17. Source identification of underground fuel spills in a petroleum refinery using fingerprinting techniques and chemo-metric analysis. A Case Study

    International Nuclear Information System (INIS)

    Kanellopoulou, G.; Gidarakos, E.; Pasadakis, N.

    2005-01-01

    Crude oil and its refining products are the most frequent contaminants, found in the environment due to spills. The aim of this work was the identification of spill source(s) in the subsurface of a petroleum refinery. Free phase samples were analyzed with gas chromatography and the analytical results were interpreted using Principal Component Analysis (PCA) method. The chemical analysis of groundwater samples from the refinery subsurface was also employed to obtain a comprehensive picture of the spill distribution and origin. (authors)

  18. Human Campylobacteriosis in Luxembourg, 2010-2013: A Case-Control Study Combined with Multilocus Sequence Typing for Source Attribution and Risk Factor Analysis.

    Science.gov (United States)

    Mossong, Joël; Mughini-Gras, Lapo; Penny, Christian; Devaux, Anthony; Olinger, Christophe; Losch, Serge; Cauchie, Henry-Michel; van Pelt, Wilfrid; Ragimbeau, Catherine

    2016-02-10

    Campylobacteriosis has increased markedly in Luxembourg during recent years. We sought to determine which Campylobacter genotypes infect humans, where they may originate from, and how they may infect humans. Multilocus sequence typing was performed on 1153 Campylobacter jejuni and 136 C. coli human strains to be attributed to three putative animal reservoirs (poultry, ruminants, pigs) and to environmental water using the asymmetric island model. A nationwide case-control study (2010-2013) for domestic campylobacteriosis was also conducted, including 367 C. jejuni and 48 C. coli cases, and 624 controls. Risk factors were investigated by Campylobacter species, and for strains attributed to different sources using a combined case-control and source attribution analysis. 282 sequence types (STs) were identified: ST-21, ST-48, ST-572, ST-50 and ST-257 were prevailing. Most cases were attributed to poultry (61.2%) and ruminants (33.3%). Consuming chicken outside the home was the dominant risk factor for both Campylobacter species. Newly identified risk factors included contact with garden soil for either species, and consuming beef specifically for C. coli. Poultry-associated campylobacteriosis was linked to poultry consumption in wintertime, and ruminant-associated campylobacteriosis to tap-water provider type. Besides confirming chicken as campylobacteriosis primary source, additional evidence was found for other reservoirs and transmission routes.

  19. Improved separability of dipole sources by tripolar versus conventional disk electrodes: a modeling study using independent component analysis.

    Science.gov (United States)

    Cao, H; Besio, W; Jones, S; Medvedev, A

    2009-01-01

    Tripolar electrodes have been shown to have less mutual information and higher spatial resolution than disc electrodes. In this work, a four-layer anisotropic concentric spherical head computer model was programmed, then four configurations of time-varying dipole signals were used to generate the scalp surface signals that would be obtained with tripolar and disc electrodes, and four important EEG artifacts were tested: eye blinking, cheek movements, jaw movements, and talking. Finally, a fast fixed-point algorithm was used for signal independent component analysis (ICA). The results show that signals from tripolar electrodes generated better ICA separation results than from disc electrodes for EEG signals with these four types of artifacts.
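The fast fixed-point ICA algorithm mentioned above can be sketched in NumPy: two synthetic non-Gaussian sources (standing in for EEG activity plus an artifact) are mixed, whitened, and unmixed by the tanh fixed-point iteration with symmetric decorrelation. The signals and mixing matrix are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
t = np.linspace(0, 8, n)
# Two non-Gaussian sources: a sawtooth-like signal and a square wave
s1 = np.mod(t, 1.0) - 0.5
s2 = np.sign(np.sin(7 * t))
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # hypothetical electrode mixing
X = A @ S

# Whiten the observations
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA fixed-point iteration, g(u) = tanh(u), symmetric decorrelation
W = rng.standard_normal((2, 2))
for _ in range(200):
    U = W @ Xw
    G, Gp = np.tanh(U), 1.0 - np.tanh(U) ** 2
    W = (G @ Xw.T) / n - np.diag(Gp.mean(axis=1)) @ W
    # Symmetric orthogonalization: W <- (W W^T)^(-1/2) W
    d2, E2 = np.linalg.eigh(W @ W.T)
    W = E2 @ np.diag(d2 ** -0.5) @ E2.T @ W
est = W @ Xw

# Each estimated component should correlate strongly with one true source
corr = np.abs(np.corrcoef(np.vstack([est, S]))[:2, 2:])
print(np.round(corr, 2))
```

In the study's setting, cleaner separation of artifact components from tripolar-electrode signals corresponds to a correlation matrix closer to a permutation of the identity.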

  20. Assessment of metal pollution sources by SEM/EDS analysis of solid particles in snow: a case study of Žerjav, Slovenia.

    Science.gov (United States)

    Miler, Miloš; Gosar, Mateja

    2013-12-01

    Solid particles in snow deposits, sampled in mining and Pb-processing area of Žerjav, Slovenia, have been investigated using scanning electron microscopy/energy-dispersive X-ray spectroscopy (SEM/EDS). Identified particles were classified as geogenic-anthropogenic, anthropogenic, and secondary weathering products. Geogenic-anthropogenic particles were represented by scarce Zn- and Pb-bearing ore minerals, originating from mine waste deposit. The most important anthropogenic metal-bearing particles in snow were Pb-, Sb- and Sn-bearing oxides and sulphides. The morphology of these particles showed that they formed at temperatures above their melting points. They were most abundant in snow sampled closest to the Pb-processing plant and least abundant in snow taken farthest from the plant, thus indicating that Pb processing was their predominant source between the last snowfall and the time of sampling. SEM/EDS analysis showed that Sb and Sn contents in these anthropogenic phases were higher and more variable than in natural Pb-bearing ore minerals. The most important secondary weathering products were Pb- and Zn-containing Fe-oxy-hydroxides whose elemental composition and morphology indicated that they mostly resulted from oxidation of metal-bearing sulphides emitted from the Pb-processing plant. This study demonstrated the importance of single particle analysis using SEM/EDS for differentiation between various sources of metals in the environment.

  1. Organic tracer-based source analysis of PM2.5 organic and elemental carbon: A case study at Dongguan in the Pearl River Delta, China

    Science.gov (United States)

    Wang, Qiong Qiong; Huang, X. H. Hilda; Zhang, Ting; Zhang, Qingyan; Feng, Yongming; Yuan, Zibing; Wu, Dui; Lau, Alexis K. H.; Yu, Jian Zhen

    2015-10-01

    Organic carbon (OC) and elemental carbon (EC) are major constituents of PM2.5 and their source apportionment remains a challenging task due to the great diversity of their sources and lack of source-specific tracer data. In this work, sources of OC and EC are investigated using positive matrix factorization (PMF) analysis of PM2.5 chemical composition data, including major ions, OC, EC, elements, and organic molecular source markers, for a set of 156 filter samples collected over three years from 2010 to 2012 at Dongguan in the Pearl River Delta, China. The key organic tracers include levoglucosan, mannosan, hopanes, C27-C33 n-alkanes, and polycyclic aromatic hydrocarbons (PAHs). Using these species as input for the PMF model, nine factors were resolved. Among them, biomass burning and coal combustion were significant sources contributing 15-17% of OC and 24-30% and 34-35% of EC, respectively. Industrial emissions and ship emissions, identified through their characteristic metal signatures, contributed 16-24% and 7-8% of OC and 8-11% and 16-17% of EC, respectively. Vehicle exhaust was a less significant source, accounting for 3-4% of OC and 5-8% of EC. Secondary OC, taken to be the sum of OC present in secondary sulfate and nitrate formation source factors, made up 27-36% of OC. Plastic burning, identified through 1,3,5-triphenylbenzene as a tracer, was a less important source for OC (≤4%) and EC (5-10%), but a significant source for PAHs at this site. The utility of organic source tracers was demonstrated by comparing PMF runs with different combinations of organic tracers removed from the input species list. Levoglucosan and mannosan were important additions to distinguish biomass burning from coal combustion by reducing collinearity among source profiles. Inclusion of hopanes and 1,3,5-triphenylbenzene was found to be necessary in resolving the less significant sources vehicle exhaust and plastic burning. Inclusion of C27-C33 n-alkanes and PAHs can influence the
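PMF constrains both factor profiles and factor contributions to be non-negative (in practice via the EPA PMF software, with per-measurement uncertainty weighting). A minimal unweighted sketch of the same idea, using Lee-Seung multiplicative updates on synthetic data matching the study's sample count (the species count and source profiles are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical data: 156 samples x 12 species, mixed from 3 source profiles
true_F = rng.random((3, 12))                  # source profiles (species signatures)
true_G = rng.random((156, 3))                 # per-sample source contributions
X = true_G @ true_F + 0.01 * rng.random((156, 12))   # add small noise

k = 3                                          # number of factors to resolve
G = rng.random((156, k)) + 0.1
F = rng.random((k, 12)) + 0.1
for _ in range(500):                           # Lee-Seung multiplicative updates
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)     # update profiles, stays >= 0
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)     # update contributions, stays >= 0

resid = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative reconstruction error: {resid:.3f}")
```

Real PMF additionally weights residuals by measurement uncertainty and explores factor-number choices; source-specific tracers such as levoglucosan help label the resolved factors.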

  2. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece)

    Energy Technology Data Exchange (ETDEWEB)

    Matiatos, Ioannis, E-mail: i.matiatos@iaea.org

    2016-01-15

    Nitrate (NO3) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ15N-NO3 and δ18O-NO3) from groundwater of Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO3 sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater leading to a more effective planning of environmental measures and remediation strategies in river basins and water bodies as defined by the European Water Frame Directive (Directive 2000/60/EC). - Highlights: • More enriched N-isotope values were observed in the industrial/urban areas. • A Bayesian isotope mixing model was applied in a multiple land-use area. • A 3-component model explained the factors controlling nitrate content in groundwater. • Industrial

  3. Effect of sample moisture and bulk density on performance of the 241Am-Be source based prompt gamma rays neutron activation analysis setup. A Monte Carlo study

    International Nuclear Information System (INIS)

    Almisned, Ghada

    2010-01-01

    Monte Carlo simulations were carried out to study the dependence of gamma-ray yield on bulk density and moisture content for five different lengths of Portland cement samples in a thermal-neutron-capture-based Prompt Gamma ray Neutron Activation Analysis (PGNAA) setup with source-inside-moderator geometry, using a 241Am-Be neutron source. In this study, yields of the 1.94 and 6.42 MeV prompt gamma rays from calcium in the five Portland cement samples were calculated as a function of sample bulk density and moisture content. The study showed a strong dependence of the 1.94 and 6.42 MeV gamma-ray yield on sample bulk density but a weaker dependence on sample moisture content. For an order of magnitude increase in sample bulk density, an order of magnitude increase in gamma-ray yield was observed, i.e., a one-to-one correspondence. For the dependence on moisture content, an order of magnitude increase in sample moisture resulted in about a 16-17% increase in the yield of the 1.94 and 6.42 MeV gamma rays from calcium. (author)

  4. Radiation studies in the antiproton source

    International Nuclear Information System (INIS)

    Church, M.

    1990-01-01

    Experiment E760 has a lead glass (Pb-G) calorimeter situated in the antiproton source tunnel in the accumulator ring at location A50. This location is exposed to radiation from several sources during antiproton stacking operations. A series of radiation studies has been performed over the last two years to determine the sources of this radiation, and as a result some shielding has been installed in the antiproton source to protect the lead glass from radiation damage.

  5. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Shang, Yu

    2014-01-01

    Gravitational waves (GWs) are one of the most important predictions of general relativity. Beyond the direct proof of the existence of GWs, there are already several ground-based detectors (such as LIGO and GEO) and a planned future space mission (LISA) which aim to detect GWs directly. GWs carry a large amount of information about their sources; extracting this information can reveal the physical properties of the source and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the mock LISA data challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is well suited to the analysis of EMRI signals.

  6. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  7. Analysis of primary teacher stress' sources

    Directory of Open Access Journals (Sweden)

    Katja Depolli Steiner

    2011-12-01

    Teachers are subject to many different work stressors. This study focused on differences in the intensity and frequency of potential stressors facing primary schoolteachers, with the goal of identifying the most important sources of teacher stress in primary school. The study included 242 primary schoolteachers from different parts of Slovenia. We used a Stress Inventory designed to identify the intensity and frequency of 49 situations that can act as work stressors for teachers. Findings showed that the major sources of stress facing teachers are factors related to work overload, factors stemming from pupils' behaviour and motivation, and factors related to the school system. Results also showed some small differences in the perception of stressors between different groups of teachers (by gender and by teaching level).

  8. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying electron capture in 163Ho as a method for determining the mass of the electron neutrino. The 163Ho sources were produced with the 164Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163Ho atoms in the source. Proton beams of 3 MeV and an "external standard" method were employed for nondestructive analysis of the 163Ho source, along with an additional "internal standard" method. (author)

  9. Study of the 137Cs Stabilizer Source

    Directory of Open Access Journals (Sweden)

    GAO Yan;WANG Yan-ling;XU Zhi-jian;XU Liang;REN Chun-xia;TAN Xiao-ming;CUI Hong-qi

    2014-02-01

    The attenuation laws of 137Cs γ-rays penetrating the ceramic core, stainless steel, and tungsten steel were studied. The radioactivity of the 137Cs stabilizer source was determined from the surface dose rate of 137Cs stabilizer sources. In addition, the adsorption properties of the ceramic core were studied to improve the stability of the output rate, and a production line was established. Application results showed that the output rate of the ray source was accurate and showed good consistency. At present, the source has been used in lithology logging, achieving domestic production.

  10. Active Control of Fan Noise: Feasibility Study. Volume 6; Theoretical Analysis for Coupling of Active Noise Control Actuator Ring Sources to an Annular Duct with Flow

    Science.gov (United States)

    Kraft, R. E.

    1996-01-01

    The objective of this effort is to develop an analytical model for the coupling of active noise control (ANC) piston-type actuators that are mounted flush to the inner and outer walls of an annular duct to the modes in the duct generated by the actuator motion. The analysis will be used to couple the ANC actuators to the modal analysis propagation computer program for the annular duct, to predict the effects of active suppression of fan-generated engine noise sources. This combined program will then be available to assist in the design or evaluation of ANC systems in fan engine annular exhaust ducts. An analysis has been developed to predict the modes generated in an annular duct due to the coupling of flush-mounted ring actuators on the inner and outer walls of the duct. The analysis has been combined with a previous analysis for the coupling of modes to a cylindrical duct in a FORTRAN computer program to perform the computations. The method includes the effects of uniform mean flow in the duct. The program can be used for design or evaluation purposes for active noise control hardware for turbofan engines. Predictions for some sample cases modeled after the geometry of the NASA Lewis ANC Fan indicate very efficient coupling in both the inlet and exhaust ducts for the m = 6 spinning mode at frequencies where only a single radial mode is cut-on. Radial mode content in higher order cut-off modes at the source plane and the required actuator displacement amplitude to achieve 110 dB SPL levels in the desired mode were predicted. Equivalent cases with and without flow were examined for the cylindrical and annular geometry, and little difference was found for a duct flow Mach number of 0.1. The actuator ring coupling program will be adapted as a subroutine to the cylindrical duct modal analysis and the exhaust duct modal analysis. This will allow the fan source to be defined in terms of characteristic modes at the fan source plane and predict the propagation to the

  11. A simulation-based analysis of variable flow pumping in ground source heat pump systems with different types of borehole heat exchangers: A case study

    International Nuclear Information System (INIS)

    Zarrella, Angelo; Emmi, Giuseppe; De Carli, Michele

    2017-01-01

    Highlights: • The work focuses on the variable flow in ground source heat pump systems. • The constant and variable speed circulation pumps in the ground loop are compared. • The constant temperature difference control across the heat pump is studied. • The variable flow affects the energy performance of the heat pump. • The constant temperature difference control offers an attractive energy saving. - Abstract: A simulation model of ground source heat pump systems has been used to investigate to what extent a variable flow of the heat-carrier fluid of the ground loop affects the energy efficiency of the entire system. The model contemporaneously considers the borehole heat exchangers, the heat pump, the building load, and the control strategies for the circulation pumps of the ground loop. A constant speed of the circulation pumps of the ground loop was compared with a variable flow controlled by means of a constant temperature difference across the heat pump on the ground side considering the load profile of an office building located in North Italy. The analysis was carried out for a single U-tube, double U-tube and coaxial pipe heat exchangers. The control strategies adopted to manage the flow rate of the heat-carrier fluid of the ground loop affect both the heat exchange rate of the borehole field and the heat pump’s long-term energy efficiency. The simulations show considerable differences in the system’s seasonal energy efficiency. The constant speed of the circulation pumps leads to the best results as far as the heat pump’s energy performance was concerned, but this advantage was lost because of the greater amount of electrical energy used by the circulation pumps; this, of course, affects the energy efficiency of the entire system. The optimal solution appears then to be a constant temperature difference in the heat-carrier fluid across the heat pump.

  12. Lattice Study for the Taiwan Photon Source

    CERN Document Server

    Kuo, Chin-Cheng; Chen Chien Te; Luo, Gwo-Huei; Tsai, Hung-Jen; Wang, Min-Huey

    2005-01-01

    The feasibility study for the new 3.0~3.3 GeV Taiwan synchrotron light source, dubbed the Taiwan Photon Source, was initiated in July 2004. The goal is to construct a high-performance light source with extremely bright X-rays, complementary to the existing 1.5 GeV light source in Taiwan. The ring circumference is 518.4 m and a 24-cell DBA lattice structure has been chosen. The natural emittance with distributed dispersion is less than 2 nm-rad. A large booster ring of 499.2 m sharing the storage ring tunnel will be adopted.

  13. Soprano and source: A laryngographic analysis

    Science.gov (United States)

    Bateman, Laura Anne

    2005-04-01

    Popular music in the 21st century uses a particular singing quality for female voice that is quite different from the trained classical singing quality. Classical quality has been the subject of a vast body of research, whereas research that deals with non-classical qualities is limited. In order to learn more about these issues, the author chose to do research on singing qualities using a variety of standard voice quality tests. This paper looks at voice qualities found in various different styles of singing: Classical, Belt, Legit, R&B, Jazz, Country, and Pop. The data was elicited from a professional soprano and the voice qualities reflect industry standards. The data set for this paper is limited to samples using the vowel [i]. Laryngographic (LGG) data was generated simultaneously with the audio samples. This paper will focus on the results of the LGG analysis; however, an audio analysis was also performed using Spectrogram, LPC, and FFT. Data from the LGG is used to calculate the contact quotient, speed quotient, and ascending slope. The LGG waveform is also visually assessed. The LGG analysis gives insights into the source vibration for the different singing styles.

  14. Retrospective and prospective analysis of water use and point source pollution from an economic perspective-a case study of Urumqi, China.

    Science.gov (United States)

    Wang, Bing; Liu, Lei; Huang, Guohe

    2017-11-01

    Using the Environmental Kuznets Curve (EKC) hypothesis, this study explored the dynamic trends of water use and point source pollution in Urumqi (2000-2014) from an economic perspective. Retrospective analysis indicated that total GDP and GDP per capita had increased around tenfold and fivefold, respectively, since 2000. Total, municipal, and industrial water use had average annual growth rates of 3.96, 7.01, and 3.69%, respectively, whereas agricultural water use and emissions of COD and NH3-N showed average annual decreases of 3.06, 12.40, and 4.74%. Regression models indicate that total water demand in Urumqi would maintain a monotonically increasing relationship with GDP and GDP per capita in the foreseeable future, although the relationships between specific water uses and economic growth showed diverse trends. The discharge of COD and NH3-N is expected to decline further as the economy grows. It can be concluded that Urumqi has largely passed the stage in which economic growth caused serious environmental deterioration, but its increasing water demand remains an urgent problem. The obtained results should be helpful for water resources management and pollution control in the future.
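
    The EKC-style regression relationship described above can be sketched as a quadratic fit of an environmental indicator against income; the data below are synthetic and illustrative, not the Urumqi series:

    ```python
    import numpy as np

    # Synthetic inverted-U data mimicking an Environmental Kuznets Curve:
    # the indicator rises with income, peaks, then falls.
    rng = np.random.default_rng(0)
    income = np.linspace(1, 10, 50)                      # GDP per capita (arbitrary units)
    emissions = -2.0 * (income - 6.0) ** 2 + 80 + rng.normal(0, 1, 50)

    # Quadratic EKC specification: E = b0 + b1*y + b2*y^2. A negative b2
    # indicates an inverted U, with a turning point at y* = -b1 / (2*b2).
    b2, b1, b0 = np.polyfit(income, emissions, 2)
    turning_point = -b1 / (2 * b2)
    ```

    An economy "past the stage of deterioration", as the abstract describes for COD and NH3-N, corresponds to income values beyond the fitted turning point.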

  15. Dosimetric analysis of radiation sources to use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy; the malignant lesions most commonly treated in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration orthovoltage X-ray beams, electron beams, and radioactive sources (192Ir, 198Au, and 90Sr) arranged on a surface mould or in a metal applicator. This study analyzes the therapeutic dose profiles produced by the radiation sources used in skin lesion radiotherapy procedures. Experimental dosimetric measurements of the radiation sources were compared with calculations from a computer code based on the Monte Carlo method, and the computational results showed good agreement with the experimental measurements. The measurements were used to validate the MCNP4C calculations and thus to provide a reliable basis for the medical application of each clinical case. (author)

  16. Analysis of coupled model uncertainties in source-to-dose modeling of human exposures to ambient air pollution: A PM2.5 case study

    Science.gov (United States)

    Özkaynak, Halûk; Frey, H. Christopher; Burke, Janet; Pinder, Robert W.

    Quantitative assessment of human exposures and health effects due to air pollution involves detailed characterization of the impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis, taking into account linkages and feedbacks. The current state of practice for such assessments is to exercise emission, meteorology, air quality, exposure, and dose models separately, and to link them together by using the output of one model as input to the subsequent downstream model. Quantification of variability and uncertainty has been an important topic in the exposure assessment community for a number of years. Variability refers to differences in the value of a quantity (e.g., exposure) over time, space, or among individuals. Uncertainty refers to lack of knowledge regarding the true value of a quantity. An emerging challenge is how to quantify variability and uncertainty in integrated assessments over the source-to-dose continuum by considering contributions from individual as well as linked components. For a case study of fine particulate matter (PM2.5) in North Carolina during July 2002, we characterize the variability and uncertainty associated with each of the linked concentration, exposure and dose models, and use a conceptual framework to quantify and evaluate the implications of coupled model uncertainties. We find that the resulting overall uncertainties due to the combined effects of variability and uncertainty are smaller (usually by a factor of 3-4) than the crudely multiplied model-specific overall uncertainty ratios. Future research will need to examine the impact of potential dependencies among the model components by conducting a truly coupled modeling analysis.
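
    The finding that coupled uncertainties combine to less than the product of per-model uncertainty ratios can be illustrated with a small Monte Carlo sketch; the lognormal error factors and their parameters below are illustrative assumptions, not the study's actual models:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    # Each linked model stage (concentration -> exposure -> dose) is given
    # a multiplicative lognormal error factor of the same spread.
    conc = rng.lognormal(mean=0.0, sigma=0.3, size=n)
    expo = rng.lognormal(mean=0.0, sigma=0.3, size=n)
    dose = rng.lognormal(mean=0.0, sigma=0.3, size=n)
    chain = conc * expo * dose          # coupled source-to-dose chain

    def ratio_95_5(x):
        """Uncertainty ratio: 95th percentile / 5th percentile."""
        p5, p95 = np.percentile(x, [5, 95])
        return p95 / p5

    # Crudely multiplying the per-model ratios overstates the coupled
    # uncertainty, because extreme values of the stages rarely coincide.
    naive = ratio_95_5(conc) * ratio_95_5(expo) * ratio_95_5(dose)
    coupled = ratio_95_5(chain)
    ```

    With these parameters the naive product exceeds the coupled ratio by roughly a factor of 3-4, the same order as the abstract reports; the exact factor of course depends on the assumed spreads.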

  17. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
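
    Cross-validated, likelihood-based model selection of the kind advocated above can be sketched with scikit-learn's probabilistic PCA score on Fisher's iris data. This is a simplified stand-in for the mixed ICA/PCA algorithm, choosing only the number of retained components rather than the number of Gaussian sources:

    ```python
    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score

    X = load_iris().data  # 150 samples, 4 features

    # PCA.score() returns the average Gaussian log-likelihood of held-out
    # data under the probabilistic PCA model, so cross_val_score performs
    # likelihood-based cross-validation directly (no labels needed).
    cv_scores = {
        k: cross_val_score(PCA(n_components=k), X, cv=5).mean()
        for k in range(1, 4)
    }
    best_k = max(cv_scores, key=cv_scores.get)
    ```

    The same pattern, held-out likelihood instead of an AIC penalty, is what makes cross-validation robust when the AIC's asymptotic assumptions fail, at the cost of fitting the model once per fold.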

  18. Experimental study of high current negative ion sources D{sup -} / H{sup -}. Analysis based on the simulation of the negative ion transport in the plasma source; Etude experimentale de sources a fort courant d`ions negatifs D{sup -} / H{sup -}. Analyse fondee sur la simulation du transport des ions dans le plasma de la source

    Energy Technology Data Exchange (ETDEWEB)

    Riz, D.

    1996-10-30

    In the frame of the development of a neutral beam injection system able to work on the ITER tokamak (International Thermonuclear Experimental Reactor), two negative ion sources, Dragon and Kamaboko, have been installed on the MANTIS test bed in Cadarache and studied with the aim of extracting 20 mA/cm{sup 2} of D{sup -}. The two production modes of negative ions have been investigated: volume production, and surface production after cesium injection into the discharge. Experiments have shown that cesium seeding is necessary in order to reach the performance required for ITER: 20 mA/cm{sup 2} have been extracted from the Kamaboko source at an arc power density of 2.5 kW/liter. In parallel, a code called NIETZSCHE has been developed to simulate the transport of negative ions in the source plasma, from their birthplace to the extraction holes. The ion trajectory is calculated by numerically solving the 3D equation of motion, while the atomic processes of destruction, elastic H{sup -}/H{sup +} collisions and H{sup -}/H{sup 0} charge exchange are handled at each time step by a Monte Carlo procedure. The code yields the extraction probability of a negative ion produced at a given location. Calculations performed with NIETZSCHE have explained several phenomena observed in negative ion sources, such as the H{sup -}/D{sup -} isotopic effect and the influence of the plasma grid polarisation and of the magnetic filter on the negative ion current. The code has also shown that, in the type of sources contemplated for ITER, working at high arc power densities (> 1 kW/liter), only negative ions produced in volume at a distance of less than 2 cm from the plasma grid, or produced at the grid surface, have a chance of being extracted. (author). 122 refs.

  19. Proposed Sources of Coaching Efficacy: A Meta-Analysis.

    Science.gov (United States)

    Myers, Nicholas D; Park, Sung Eun; Ahn, Soyeon; Lee, Seungmin; Sullivan, Philip J; Feltz, Deborah L

    2017-08-01

    Coaching efficacy refers to the extent to which a coach believes that he or she has the capacity to affect the learning and performance of his or her athletes. The purpose of the current study was to empirically synthesize findings across the extant literature to estimate relationships between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. A literature search yielded 20 studies and 278 effect size estimates that met the inclusion criteria. The overall relationship between the proposed sources of coaching efficacy and each dimension of coaching efficacy was positive and ranged from small to medium in size. Coach gender and level coached moderated the overall relationship between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. Results from this meta-analysis provided some evidence for both the utility of, and possible revisions to, the conceptual model of coaching efficacy.
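
    The pooling of effect sizes in a meta-analysis of this kind can be sketched with the standard DerSimonian-Laird random-effects estimator; the effect sizes and variances below are illustrative, not the 278 estimates synthesized in the study:

    ```python
    import numpy as np

    def random_effects_meta(effects, variances):
        """DerSimonian-Laird random-effects pooled effect, its SE, and tau^2."""
        y = np.asarray(effects, dtype=float)
        v = np.asarray(variances, dtype=float)
        w = 1.0 / v                                  # fixed-effect weights
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q statistic
        df = len(y) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                # between-study variance
        w_re = 1.0 / (v + tau2)                      # random-effects weights
        pooled = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, se, tau2

    # Illustrative per-study effect sizes and sampling variances
    effects = [0.30, 0.25, 0.45, 0.10, 0.35]
    variances = [0.02, 0.03, 0.025, 0.04, 0.02]
    pooled, se, tau2 = random_effects_meta(effects, variances)
    ```

    Moderator effects such as coach gender or level coached are then typically tested by comparing pooled estimates across subgroups or via meta-regression.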

  20. Sourcing of internal auditing : An empirical study

    NARCIS (Netherlands)

    Speklé, R.F.; Elten, van H.J.; Kruis, A.

    2007-01-01

    This paper studies the factors associated with organizations’ internal audit sourcing decisions, building from a previous study by Widener and Selto (henceforth W&S) [Widener, S.K., Selto, F.H., 1999. Management control systems and boundaries of the firm: why do firms outsource internal audit

  1. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Space-borne IPDA Lidar Measurements: Results from a Sensitivity Analysis Study

    Science.gov (United States)

    Ehret, G.; Kiemle, C.; Rapp, M.

    2017-12-01

    The practical implementation of the Paris Agreement (COP21) would vastly profit from an independent, reliable and global measurement system of greenhouse gas emissions, in particular of CO2, to complement and cross-check national efforts. Most fossil-fuel CO2 emissions emanate from large sources such as cities and power plants. These emissions increase the local CO2 abundance in the atmosphere by 1-10 parts per million (ppm), a signal significantly larger than the variability from natural sources and sinks over the local source domain. Despite these large signals, they are only sparsely sampled by the ground-based network, which calls for satellite measurements. However, none of the existing or forthcoming passive satellite instruments operating in the NIR spectral domain can measure CO2 emissions at night, in low-sunlight conditions, or at high latitudes in winter. The resulting sparse coverage of passive spectrometers is a serious limitation, particularly for the Northern Hemisphere, since these regions exhibit substantial emissions during the winter as well as at other times of the year. In contrast, CO2 measurements by an Integrated Path Differential Absorption (IPDA) lidar are largely immune to these limitations, and initial results from airborne applications look promising. In this study, we discuss the implications for a space-borne IPDA lidar system. A Gaussian plume model is used to simulate the CO2 distribution downstream of large power plants. The space-borne measurements are simulated by applying a simple forward model with Gaussian error distributions. Besides the sampling frequency, the sampling geometry (e.g. the measurement distance to the emitting source) and the measurement error itself strongly impact the flux inversion performance. We discuss the results using both Gaussian plume and mass budget approaches to quantify the emission rates.
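
    The Gaussian plume model mentioned above has a standard closed form. The sketch below is illustrative only: the linear dispersion coefficients are a crude stand-in for stability-class parameterizations, and the stack parameters are invented:

    ```python
    import numpy as np

    def plume_concentration(q, u, x, y, z, h, a=0.08, b=0.06):
        """Gaussian plume concentration (kg/m^3) with ground reflection.

        q: emission rate (kg/s), u: wind speed (m/s), h: effective stack
        height (m); x, y, z: downwind, crosswind, vertical position (m).
        sigma_y = a*x and sigma_z = b*x are simplistic linear dispersion
        parameters standing in for Pasquill-Gifford stability curves.
        """
        sy, sz = a * x, b * x
        return (q / (2 * np.pi * u * sy * sz)
                * np.exp(-y**2 / (2 * sy**2))
                * (np.exp(-(z - h)**2 / (2 * sz**2))
                   + np.exp(-(z + h)**2 / (2 * sz**2))))

    # Centerline ground-level concentration 2 km downwind of a 100 m stack
    # emitting 1 kg/s into a 5 m/s wind (all values illustrative).
    c = plume_concentration(q=1.0, u=5.0, x=2000.0, y=0.0, z=0.0, h=100.0)
    ```

    In a flux-inversion study, simulated column measurements of such a plume are perturbed with the instrument error model and the emission rate q is then re-estimated from the noisy samples.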

  2. Constrained Null Space Component Analysis for Semiblind Source Separation Problem.

    Science.gov (United States)

    Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn

    2018-02-01

    The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how they are mixed. The constrained independent component analysis (ICA) approach has been studied as a way to impose constraints on the well-known ICA framework. We introduced an alternative approach based on the null space component analysis (NCA) framework, referred to as the c-NCA approach. We also presented the c-NCA algorithm, which uses signal-dependent semidefinite operators, a bilinear mapping, as signatures for operator design in the c-NCA approach. Theoretically, we showed that the source estimation of the c-NCA algorithm converges, with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed by the optimization community for solving the BSS problem. As examples, we demonstrated that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms, by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.
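
    As a simple baseline for the BSS problem described above (standard ICA, not the c-NCA algorithm itself), scikit-learn's FastICA can unmix two synthetic non-Gaussian sources; the sources and mixing matrix below are invented for illustration:

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 8, 2000)

    # Two non-Gaussian sources: a square wave and a cubed sinusoid.
    s = np.c_[np.sign(np.sin(3 * t)), np.sin(5 * t) ** 3]
    mixing = np.array([[1.0, 0.5], [0.4, 1.0]])
    x = s @ mixing.T                       # observed mixtures

    ica = FastICA(n_components=2, random_state=0)
    s_hat = ica.fit_transform(x)           # recovered sources (up to sign/scale/order)

    # Each recovered component should correlate strongly with exactly one
    # true source; rows index true sources, columns recovered ones.
    corr = np.abs(np.corrcoef(s.T, s_hat.T)[:2, 2:])
    ```

    A semiblind method like c-NCA differs from this baseline precisely in exploiting prior knowledge (e.g. reference signals or sparsity) instead of relying on statistical independence alone.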

  3. Concentration, ozone formation potential and source analysis of volatile organic compounds (VOCs) in a thermal power station centralized area: A study in Shuozhou, China.

    Science.gov (United States)

    Yan, Yulong; Peng, Lin; Li, Rumei; Li, Yinghui; Li, Lijuan; Bai, Huiling

    2017-04-01

    Volatile organic compounds (VOCs) from two sampling sites (HB and XB) in a power station centralized area in Shuozhou city, China, were sampled with stainless steel canisters and measured by gas chromatography-mass selective detection/flame ionization detection (GC-MSD/FID) in the spring and autumn of 2014. The concentration of VOCs was higher in the autumn (HB, 96.87 μg/m 3 ; XB, 58.94 μg/m 3 ) than in the spring (HB, 41.49 μg/m 3 ; XB, 43.46 μg/m 3 ), as lower wind speed in the autumn could lead to pollutant accumulation, especially at HB, which is a new urban area surrounded by residential areas and a transportation hub. Alkanes were the dominant group at both HB and XB in both sampling periods, but the contribution of aromatic pollutants at HB in the autumn (11.16-19.55% higher) stood out. Compared to other cities, BTEX pollution in Shuozhou was among the lowest levels in the world. Because of the high levels of aromatic pollutants, the ozone formation potential increased significantly at HB in the autumn. Ratio analyses used to identify the age of the air masses and their sources showed that the atmospheric VOCs at XB were strongly influenced by remote coal-combustion sources, while those at HB were affected by remote coal-combustion sources in the spring and by local vehicle-emission sources in the autumn. Source analysis conducted using the Positive Matrix Factorization (PMF) model showed that coal combustion and vehicle emissions made the two largest contributions (29.98% and 21.25%, respectively) to atmospheric VOCs in Shuozhou. With further economic restructuring, the influence of vehicle emissions on air quality should become more significant, indicating that controlling vehicle emissions is key to reducing air pollution. Copyright © 2017 Elsevier Ltd. All rights reserved.
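
    The ozone formation potential referred to above is conventionally computed by weighting each species concentration with its maximum incremental reactivity (MIR). The MIR values below are approximate (after Carter) and the concentrations are invented for illustration, not the Shuozhou measurements:

    ```python
    # MIR factors in g O3 per g VOC (approximate), and illustrative
    # ambient concentrations in ug/m^3.
    mir = {"ethane": 0.28, "propane": 0.49, "toluene": 4.00, "m-xylene": 9.75}
    conc_ugm3 = {"ethane": 3.0, "propane": 2.5, "toluene": 1.8, "m-xylene": 0.9}

    # OFP per species = concentration x MIR (ug O3 / m^3)
    ofp = {sp: conc_ugm3[sp] * mir[sp] for sp in mir}
    total_ofp = sum(ofp.values())
    ```

    Note how the aromatics dominate OFP despite modest concentrations, the same mechanism by which high aromatic loads drove the autumn OFP increase at the HB site.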

  4. Sources of Currency Crisis: An Empirical Analysis

    OpenAIRE

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  5. Antioxidants: Characterization, natural sources, extraction and analysis

    OpenAIRE

    OROIAN, MIRCEA; Escriche Roberto, Mª Isabel

    2015-01-01

    Recently many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However none of them has all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, fo...

  6. LED intense headband light source for fingerprint analysis

    Science.gov (United States)

    Villa-Aleman, Eliel

    2005-03-08

    A portable, lightweight and high-intensity light source for detecting and analyzing fingerprints during field investigation. On-site field analysis requires long hours of mobile analysis. In one embodiment, the present invention comprises a plurality of light emitting diodes; a power source; and a personal attachment means; wherein the light emitting diodes are powered by the power source, and wherein the power source and the light emitting diodes are attached to the personal attachment means to produce a personal light source for on-site analysis of latent fingerprints. The present invention is available for other applications as well.

  7. The Source Inversion Validation (SIV) Initiative: A Collaborative Study on Uncertainty Quantification in Earthquake Source Inversions

    Science.gov (United States)

    Mai, P. M.; Schorlemmer, D.; Page, M.

    2012-04-01

    Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i-iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained from solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state-of-the-art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and how they assess posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from a spontaneous dynamic crack-like strike-slip earthquake on a steeply dipping fault, embedded in a layered crustal velocity-density structure.

  8. GLOBAL SOURCING: A THEORETICAL STUDY ON TURKEY

    Directory of Open Access Journals (Sweden)

    Aytac GOKMEN

    2010-07-01

    Global sourcing means sourcing goods and services across national boundaries in order to take advantage of global efficiencies in the delivery of a product or service. Such efficiencies include low-cost skilled labor, low-cost raw materials, and other economic factors such as tax breaks and deductions as well as low trade tariffs. When we assess the case of Turkey, global sourcing is an effective device for some firms. Domestic firms in various industries in Turkey are inclined to source finished or intermediate goods globally, complete the production process in Turkey, and export. Consequently, on the one hand the export volume of Turkey increases, but on the other hand the import of a considerable volume of finished or intermediate goods brings about a negative trade balance and a loss of jobs in Turkey. The objective of this study is therefore to assess the concept of global sourcing with respect to Turkey, drawing on comprehensive publications.

  9. Analysis of the Structure Ratios of the Funding Sources

    Directory of Open Access Journals (Sweden)

    Maria Daniela Bondoc

    2014-06-01

    The funding sources of the assets and liabilities in the balance sheet include the equity capital and the debts of the entity. The analysis of the structure ratios of the funding sources allows assessments of the funding policy, highlighting financial autonomy and how resources are provided. Drawing on the literature on economic and financial analysis, this paper presents the ratios that reflect, on the one hand, the degree of financial dependence (the rate of financial stability, the rate of global financial autonomy, the rate of long-term financial autonomy) and, on the other hand, the debt structure (the rate of short-term debts, the global indebtedness rate, the long-term indebtedness rate). Based on the financial statements of an entity in Argeş County, I analysed these indicators, drew conclusions and made assessments related to the autonomy, indebtedness and financial stability of the studied entity.

  10. Runoff characteristics and non-point source pollution analysis in the Taihu Lake Basin: a case study of the town of Xueyan, China.

    Science.gov (United States)

    Zhu, Q D; Sun, J H; Hua, G F; Wang, J H; Wang, H

    2015-10-01

    Non-point source pollution is a significant environmental issue in small watersheds in China. To study the effects of rainfall on pollutants transported by runoff, rainfall was monitored in Xueyan town in the Taihu Lake Basin (TLB) for over 12 consecutive months. The concentrations of different forms of nitrogen (N) and phosphorus (P), and chemical oxygen demand, were monitored in runoff and river water across different land use types. The results indicated that pollutant loads were highly variable. Most N losses due to runoff were found around industrial areas (printing factories), while residential areas exhibited the lowest nitrogen losses through runoff. Nitrate nitrogen (NO3-N) and ammonia nitrogen (NH4-N) were the dominant forms of soluble N around printing factories and hotels, respectively. The levels of N in river water were stable prior to the generation of runoff from a rainfall event, after which they were positively correlated to rainfall intensity. In addition, three sites with different areas were selected for a case study to analyze trends in pollutant levels during two rainfall events, using the AnnAGNPS model. The modeled results generally agreed with the observed data, which suggests that AnnAGNPS can be used successfully for modeling runoff nutrient loading in this region. The conclusions of this study provide important information on controlling non-point source pollution in TLB.

  11. Ion sources for solids isotopic analysis

    International Nuclear Information System (INIS)

    Tyrrell, A.C.

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material. (Auth.)

  12. Analysis of Contract Source Selection Strategy

    Science.gov (United States)

    2015-07-07

    The task of understanding the impact of a source selection strategy on resultant contract outcomes is a topic rich for further research.

  13. Ion sources for solids isotopic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tyrrell, A. C. [Ministry of Defence, Foulness (UK). Atomic Weapons Research Establishment

    1978-12-15

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material.

  14. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though....../Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from...... might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian....

  15. Studies and modeling of cold neutron sources; Etude et modelisation des sources froides de neutron

    Energy Technology Data Exchange (ETDEWEB)

    Campioni, G

    2004-11-15

    With the purpose of updating knowledge in the field of cold neutron sources, the work of this thesis followed three axes. First, the gathering of the specific information forming the material of this work. This set of knowledge covers the following fields: cold neutrons, cross-sections for the different cold moderators, flux slowing-down, measurements of the cold flux and, finally, issues in the thermal analysis of the problem. Secondly, the study and development of suitable computation tools. After an analysis of the problem, several tools were planned, implemented and tested in the 3-dimensional radiation transport code Tripoli-4. In particular, an uncoupling module, integrated in the official version of Tripoli-4, can perform parametric Monte Carlo studies with CPU-time savings of a factor of up to 50. A coupling module simulating neutron guides has also been developed and implemented in the Monte Carlo code McStas. Thirdly, a complete study for the validation of the installed calculation chain. These studies focus on three cold sources currently in operation: SP1 of the Orphee reactor, and two sources (SFH and SFV) of the HFR at the Institut Laue-Langevin. They provide examples of problems and methods for the design of future cold sources.

  16. Antioxidants: Characterization, natural sources, extraction and analysis.

    Science.gov (United States)

    Oroian, Mircea; Escriche, Isabel

    2015-08-01

    Recently many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However none of them has all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, for example preventing cancer and cardiovascular diseases, and lowering the incidence of different diseases. In this paper the main classes of antioxidants are presented: vitamins, carotenoids and polyphenols. Recently, many analytical methodologies involving diverse instrumental techniques have been developed for the extraction, separation, identification and quantification of these compounds. Antioxidants have been quantified by different researchers using one or more of these methods: in vivo, in vitro, electrochemical, chemiluminescent, electron spin resonance, chromatography, capillary electrophoresis, nuclear magnetic resonance, near infrared spectroscopy and mass spectrometry methods. Copyright © 2015. Published by Elsevier Ltd.

  17. A Study on Improvement of Algorithm for Source Term Evaluation

    International Nuclear Information System (INIS)

    Park, Jeong Ho; Park, Do Hyung; Lee, Jae Hee

    2010-03-01

    The program developed by KAERI for the source term assessment of radwastes from the advanced nuclear fuel cycle consists of a spent fuel database analysis module, a spent fuel arising projection module, and an automatic characterization module for radwastes from pyroprocessing. To improve the algorithms adopted in the program, the following items were carried out: development of an algorithm to decrease the analysis time for the spent fuel database; development of a setup routine for the analysis procedure; improvement of the interface of the spent fuel arising projection module; and optimization of the data management algorithm needed for the massive calculations required to estimate source terms of radwastes from the advanced fuel cycle. The program developed through this study can perform source term estimation even when several spent fuel assemblies with different fuel designs, initial enrichments, irradiation histories, discharge burnups, and cooling times are processed at the same time in the pyroprocess. It is expected that this program will be very useful for the design of the unit processes of pyroprocessing and of the disposal system.

  18. Supercontinuum light sources for food analysis

    DEFF Research Database (Denmark)

    Møller, Uffe Visbech; Petersen, Christian Rosenberg; Kubat, Irnis

    2014-01-01

    ... One track of Light & Food will target the mid-infrared spectral region. To date, the limitations of mid-infrared light sources, such as thermal emitters, low-power laser diodes, quantum cascade lasers and synchrotron radiation, have precluded mid-IR applications where the spatial coherence, broad bandwidth, high brightness and portability of a supercontinuum laser are all required. DTU Fotonik has now demonstrated the first optical-fiber-based broadband supercontinuum light source, which covers 1.4-13.3 μm and thereby most of the molecular fingerprint region.

  19. An Analysis of Programming Beginners' Source Programs

    Science.gov (United States)

    Matsuyama, Chieko; Nakashima, Toyoshiro; Ishii, Naohiro

    The production of animations was made the subject of a university programming course in order to help students understand the process of program creation and tackle programming with interest. In this paper, the formats and composition of the programs which students produced were investigated. As a result, many problems were found in the format and composition of the source code, related to matters such as how to use indentation and how to apply comments and functions.

  20. Time-correlated neutron analysis of a multiplying HEU source

    International Nuclear Information System (INIS)

    Miller, E.C.; Kalter, J.M.; Lavelle, C.M.; Watson, S.M.; Kinlaw, M.T.; Chichester, D.L.; Noonan, W.A.

    2015-01-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.
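    The gap-based burst identification described in this abstract can be sketched as follows (an illustrative reconstruction, not the authors' code; the timestamps and gap threshold are invented):

```python
def find_bursts(times, max_gap):
    """Group sorted detection timestamps into bursts: a new burst starts
    whenever the gap to the previous event exceeds max_gap."""
    bursts = []
    current = [times[0]]
    for t in times[1:]:
        if t - current[-1] <= max_gap:
            current.append(t)
        else:
            bursts.append(current)
            current = [t]
    bursts.append(current)
    return bursts

# Hypothetical timestamps in nanoseconds; gaps > 100 ns separate bursts
events = [0, 15, 40, 500, 520, 2000]
bursts = find_bursts(events, max_gap=100)   # → [[0, 15, 40], [500, 520], [2000]]
```

    With the detections grouped this way, per-burst statistics such as the distribution of arrival times within a burst can be computed directly.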

  1. Time-correlated neutron analysis of a multiplying HEU source

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.C., E-mail: Eric.Miller@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Kalter, J.M.; Lavelle, C.M. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Watson, S.M.; Kinlaw, M.T.; Chichester, D.L. [Idaho National Laboratory, Idaho Falls, ID (United States); Noonan, W.A. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States)

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated {sup 3}He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations.

  2. Time-correlated neutron analysis of a multiplying HEU source

    Science.gov (United States)

    Miller, E. C.; Kalter, J. M.; Lavelle, C. M.; Watson, S. M.; Kinlaw, M. T.; Chichester, D. L.; Noonan, W. A.

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations.

  3. Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection

    Science.gov (United States)

    2015-12-01

    …some occasions, performance is terminated early; this can occur due to either mutual agreement or a breach of contract by one of the parties (Garrett… Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection, December 2015, Capt Jacques Lamoureux, USAF… on the contract management process, with special emphasis on the source selection methods of tradeoff and lowest price technically acceptable (LPTA…

  4. Dosimetric analysis of radiation sources for use dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams: orthovoltage X-rays, electron beams, and radioactive sources (192Ir, 198Au, and 90Sr) arranged on a surface mold or in a metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by radiation sources used in skin lesion radiotherapy procedures. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo method. The computational results were in good agreement with the experimental measurements, and both were physically consistent, as expected. These experimental measurements compared with calculations using the MCNP-4C code have been used to validate the calculations obtained by the MCNP code and to provide a reliable medical application for each clinical case. (author)

  5. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future. However there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practises. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  6. Energy sources and nuclear energy. Comparative analysis and ethical reflections

    International Nuclear Information System (INIS)

    Hoenraet, C.

    1999-01-01

    Under the authority of the episcopacy of Brugge in Belgium an independent working group Ethics and Nuclear Energy was set up. The purpose of the working group was to collect all the necessary information on existing energy sources and to carry out a comparative analysis of their impact on mankind and the environment. Also attention was paid to economical and social aspects. The results of the study are subjected to an ethical reflection. The book is aimed at politicians, teachers, journalists and every interested layman who wants to gain insight into the consequences of the use of nuclear energy and other energy sources. Based on the information in this book one should be able to objectively define one's position in future debates on this subject

  7. BNL feasibility studies of spallation neutron sources

    International Nuclear Information System (INIS)

    Lee, Y.Y.; Ruggiero, A.G.; Van Steenbergen, A.; Weng, W.T.

    1995-01-01

    This paper is the summary of conceptual design studies of a 5 MW Pulsed Spallation Neutron Source (PSNS) conducted by an interdepartmental study group at Brookhaven National Laboratory. The study was carried out in two phases. First, a scenario based on the use of a 600 MeV Linac followed by two fast-cycling 3.6 GeV Synchrotrons was investigated. Then, in a subsequent phase, the attention of the study was directed toward an Accumulator scenario with two options: (1) a 1.25 GeV normal-conducting Linac followed by two Accumulator Rings, and (2) a 2.4 GeV superconducting Linac followed by a single Accumulator Ring. The study did not make any reference to a specific site.

  8. Analysis of Earthquake Source Spectra in Salton Trough

    Science.gov (United States)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
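    Estimating a stress drop from a measured corner frequency, as described in this abstract, is commonly done with the Brune source model; a minimal sketch follows (the shear-wave speed, moment, and corner frequency here are illustrative assumptions, not values from this study):

```python
import math

def brune_stress_drop(moment_nm, corner_freq_hz, beta_m_s=3500.0):
    """Brune-model stress drop (Pa) from seismic moment M0 (N·m) and
    corner frequency fc (Hz). Source radius r = 2.34 * beta / (2*pi*fc);
    stress drop = 7 * M0 / (16 * r**3). beta is the shear-wave speed."""
    r = 2.34 * beta_m_s / (2.0 * math.pi * corner_freq_hz)
    return 7.0 * moment_nm / (16.0 * r ** 3)

# Example: a moderate event with M0 = 3.5e13 N·m and fc = 5 Hz
dsigma = brune_stress_drop(3.5e13, 5.0)   # in Pa; divide by 1e6 for MPa
```

    Mapping such estimates over space and time is what allows the regional stress-drop variations discussed above to be tracked.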

  9. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

    Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and of independent component analysis is not very clear, and the solution is not unique. To address these disadvantages, a new linear instantaneous mixing model and a novel PCA-based method for separating and reconstructing source signals from observed signals are put forward. The assumption of the new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and the uncorrelatedness of the source signals. Based on this theoretical link, the problem of separating and reconstructing source signals is then transformed into a PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both waveform and amplitude information of an uncorrelated source signal can be separated and reconstructed by PCA when the linear mixing matrix is column orthogonal and normalized; only waveform information can be separated and reconstructed when the mixing matrix is column orthogonal but not normalized; and an uncorrelated source signal cannot be separated and reconstructed by PCA when the mixing matrix is not column orthogonal.
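    The favorable case in this abstract (a column-orthogonal, normalized mixing matrix) can be checked numerically. Below is a minimal sketch with invented data: two uncorrelated Gaussian sources of distinct variances are mixed by a rotation, and PCA recovers them up to sign:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10000

# Two zero-mean, uncorrelated sources with distinct variances
s1 = 2.0 * rng.standard_normal(n)            # variance 4
s2 = 1.0 * rng.standard_normal(n)            # variance 1
S = np.vstack([s1, s2])                      # shape (2, n)

# Column-orthonormal mixing matrix (a rotation)
theta = 0.6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = A @ S                                    # observed signals

# PCA via eigen-decomposition of the sample covariance
C = np.cov(X)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]            # sort by decreasing variance
W = eigvecs[:, order].T                      # separation matrix
S_hat = W @ X                                # principal components

# Recovery is exact up to sign: each component correlates with one source
corr1 = abs(np.corrcoef(S_hat[0], s1)[0, 1])
corr2 = abs(np.corrcoef(S_hat[1], s2)[0, 1])
```

    If the columns of the mixing matrix are orthogonal but not unit-norm, the recovered components keep the source waveforms but not their amplitudes, matching the second case stated in the abstract.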

  10. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories result as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  11. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  12. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  13. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under the different output service schemes.

  14. Risk analysis of alternative energy sources

    International Nuclear Information System (INIS)

    Kazmer, D.R.

    1982-01-01

    The author explores two points raised by Miller Spangler in a January 1981 issue: public perception of risks involving nuclear power plants relative to those of conventional plants and criteria for evaluating the way risk analyses are made. On the first point, he concludes that translating public attitudes into the experts' language of probability and risk could provide better information and understanding of both the attitudes and the risks. Viewing risk analysis methodologies as filters which help to test historical change, he suggests that the lack of information favors a lay jury approach for energy decisions. Spangler responds that Congress is an example of lay decision making, but that a lay jury, given public disinterest and polarization, would probably not improve social justice on the nuclear issue. 5 references, 4 figures

  15. Finite element analysis and frequency shift studies for the bridge coupler of the coupled cavity linear accelerator of the spallation neutron source.

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z. (Zukun)

    2001-01-01

    The Spallation Neutron Source (SNS) is an accelerator-based neutron scattering research facility. The linear accelerator (linac) is the principal accelerating structure and is divided into a room-temperature linac and a superconducting linac. The normal-conducting linac system, which consists of a Drift Tube Linac (DTL) and a Coupled Cavity Linac (CCL), is to be built by Los Alamos National Laboratory. The CCL structure is 55.36 meters long. It accelerates the H- beam from 86.8 MeV to 185.6 MeV at an operating frequency of 805 MHz. This side-coupled cavity structure has 8 cells per segment, 12 segments and 11 bridge couplers per module, and 4 modules in total. A 5-MW klystron powers each module. The number 3 and number 9 bridge couplers of each module are connected to the 5-MW RF power supply. The bridge coupler, with a length of 2.5 {beta}{gamma}, is a three-cell structure located between the segments that allows power to flow through the module. The center cell of each bridge coupler is excited during normal operation. To maintain a uniform electromagnetic field and stay on the resonant frequency, the RF-induced heat must be removed. Thus, thermal deformation and frequency shift studies are performed via numerical simulations in order to arrive at an appropriate cooling design and to predict the frequency shift under operation. The center cell of the bridge coupler also contains a large 4-inch slug tuner and a tuning post, which are used to provide bulk frequency adjustment and field intensity adjustment so as to produce the proper total field distribution in the module assembly.

  16. Population studies of the unidentified EGRET sources

    Energy Technology Data Exchange (ETDEWEB)

    Siegal-Gaskins, J M [University of Chicago, Chicago, IL 60637 (United States); Pavlidou, V [University of Chicago, Chicago, IL 60637 (United States); Olinto, A V [University of Chicago, Chicago, IL 60637 (United States); Brown, C [University of Chicago, Chicago, IL 60637 (United States); Fields, B D [University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States)

    2007-03-15

    The third EGRET catalog contains a large number of unidentified sources. Current data allows the intriguing possibility that some of these objects may represent a new class of yet undiscovered gamma-ray sources. By assuming that galaxies similar to the Milky Way host comparable populations of objects, we constrain the allowed Galactic abundance and distribution of various classes of gamma-ray sources using the EGRET data set. Furthermore, regardless of the nature of the unidentified sources, faint unresolved objects of the same class contribute to the observed diffuse gamma-ray background. We investigate the potential contribution of these unresolved sources to the extragalactic gamma-ray background.

  17. Population studies of the unidentified EGRET sources

    International Nuclear Information System (INIS)

    Siegal-Gaskins, J M; Pavlidou, V; Olinto, A V; Brown, C; Fields, B D

    2007-01-01

    The third EGRET catalog contains a large number of unidentified sources. Current data allows the intriguing possibility that some of these objects may represent a new class of yet undiscovered gamma-ray sources. By assuming that galaxies similar to the Milky Way host comparable populations of objects, we constrain the allowed Galactic abundance and distribution of various classes of gamma-ray sources using the EGRET data set. Furthermore, regardless of the nature of the unidentified sources, faint unresolved objects of the same class contribute to the observed diffuse gamma-ray background. We investigate the potential contribution of these unresolved sources to the extragalactic gamma-ray background

  18. Uncertainty sources in radiopharmaceuticals clinical studies

    International Nuclear Information System (INIS)

    Degenhardt, Aemilie Louize; Oliveira, Silvia Maria Velasques de

    2014-01-01

    The radiopharmaceuticals should be approved for consumption by evaluating their quality, safety and efficacy. Clinical studies are designed to verify the pharmacodynamics, pharmacological and clinical effects in humans and are required for assuring safety and efficacy. The Bayesian analysis has been used for clinical studies effectiveness evaluation. This work aims to identify uncertainties associated with the process of production of the radionuclide and radiopharmaceutical labelling as well as the radiopharmaceutical administration and scintigraphy images acquisition and processing. For the development of clinical studies in the country, the metrological chain shall assure the traceability of the surveys performed in all phases. (author)

  19. Multi-criteria analysis applied to the selection of drinking water sources in developing countries : A case study of Cali, Colombia

    NARCIS (Netherlands)

    Gutiérrez, Juan Pablo; Delgado, Luis Germán; van Halem, D.; Wessels, Peter; Rietveld, L.C.

    2016-01-01

    Guaranteeing a safe and continuous drinking water supply for the city of Cali, Colombia, has become a concern for the water company of Cali, the environmental authorities, universities, and entities involved in the water resource. The progressive deterioration of the city’s water sources has led

  20. Relative accuracy and availability of an Irish National Database of dispensed medication as a source of medication history information: observational study and retrospective record analysis.

    LENUS (Irish Health Repository)

    Grimes, T

    2013-01-27

    WHAT IS KNOWN AND OBJECTIVE: The medication reconciliation process begins by identifying which medicines a patient used before presentation to hospital. This is time-consuming, labour intensive and may involve interruption of clinicians. We sought to identify the availability and accuracy of data held in a national dispensing database, relative to other sources of medication history information. METHODS: For patients admitted to two acute hospitals in Ireland, a Gold Standard Pre-Admission Medication List (GSPAML) was identified and corroborated with the patient or carer. The GSPAML was compared for accuracy and availability to PAMLs from other sources, including the Health Service Executive Primary Care Reimbursement Scheme (HSE-PCRS) dispensing database. RESULTS: Some 1111 medications were assessed for 97 patients, who were of median age 74 years (range 18-92 years), had median four co-morbidities (range 1-9), used median 10 medications (range 3-25), and half (52%) were male. The HSE-PCRS PAML was the most accurate source compared to lists provided by the general practitioner, community pharmacist or cited in previous hospital documentation: the list agreed for 74% of the medications the patients actually used, representing complete agreement for all medications in 17% of patients. It was equally contemporaneous to other sources, but was less reliable for male than female patients, those using increasing numbers of medications and those using one or more items that were not reimbursable by the HSE. WHAT IS NEW AND CONCLUSION: The HSE-PCRS database is a relatively accurate, available and contemporaneous source of medication history information and could support acute hospital medication reconciliation.

  1. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82
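    The proximity-based extraction described here can be illustrated in miniature with a naive keyword co-occurrence pass over sentences (a toy sketch only; the project itself builds on Apache Tika, Solr, DeepDive and D3, whose APIs are not reproduced here):

```python
import re

def sentences_with_terms(text, terms):
    """Return sentences that mention every term (case-insensitive)."""
    sentences = re.split(r'(?<=[.!?])\s+', text)
    terms = [t.lower() for t in terms]
    return [s for s in sentences
            if all(t in s.lower() for t in terms)]

# Invented two-sentence corpus echoing the example in the abstract
doc = ("In this study, hyperspectral images with high spatial resolution "
       "(1 m) were analyzed to detect cutleaf teasel in two areas. "
       "Classification of cutleaf teasel reached a users accuracy of 82 to 84%.")
hits = sentences_with_terms(doc, ["cutleaf teasel", "accuracy"])
```

    Real search-analytics pipelines replace this with entity extraction and relevance ranking over thousands of parsed documents, but the core idea of surfacing co-occurring measurements and subjects is the same.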

  2. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    Science.gov (United States)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). Copyright © 2015. Published by Elsevier Ltd.
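    The CMB step solves for nonnegative source contributions that best reproduce a measured compound profile from the source fingerprints. A minimal sketch using nonnegative least squares follows (the fingerprint matrix and profile are invented for illustration; the paper's Bayesian CMB additionally models fingerprint variability and measurement error):

```python
import numpy as np
from scipy.optimize import nnls

# Rows: PAH compounds; columns: candidate sources (traffic, coke oven,
# coal combustion, wood combustion). Hypothetical mass-fraction fingerprints.
F = np.array([
    [0.40, 0.10, 0.20, 0.30],
    [0.30, 0.50, 0.30, 0.20],
    [0.20, 0.30, 0.40, 0.10],
    [0.10, 0.10, 0.10, 0.40],
])

# Synthetic "measured" profile built from known contributions (noise-free
# here, so the solver should recover the contributions exactly)
true_contrib = np.array([35.0, 24.0, 18.0, 14.0])
measured = F @ true_contrib

# Nonnegative least squares: measured ≈ F @ contrib, with contrib >= 0
contrib, residual = nnls(F, measured)
```

    With real data the residual is nonzero and the fingerprints themselves are uncertain, which is what motivates the Bayesian treatment described in the abstract.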

  3. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of the uncertainties that are involved in seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the most seismically active zone of Turkey. The second analysis is for Akkuyu.

  4. Study of non-metallic inclusion sources in steel

    International Nuclear Information System (INIS)

    Khons, Ya.; Mrazek, L.

    1976-01-01

    A study of potential inclusion sources was carried out at the Tvinec steel plant using a unified labelling procedure for the different sources. A lanthanum oxide labelling method was used for refractories, with subsequent La determination in steel by neutron activation analysis. Samarium and cerium oxides and the 141 Ce radionuclide were used in conjunction with the testing. The following sources of exogenous inclusions were studied: 1) refractory material comprising fireclay and corundum for steel-teeming troughs in open-hearth furnaces; 2) fireclay bottom-pouring refractories; 3) steel-teeming ladle lining; 4) heat-insulating and exothermic compounds for steel ingots; 5) vacuum treatment plant lining; 6) open-hearth and electric arc furnace slag. The major oxide inclusion source in steel was found to be the furnace slag, since it forms about 40 per cent of all oxide inclusions. The contributions of the remaining sources did not exceed 5 per cent each.

  5. Overview of receptor-based source apportionment studies for speciated atmospheric mercury

    OpenAIRE

    Cheng, I.; Xu, X.; Zhang, L.

    2015-01-01

    Receptor-based source apportionment studies of speciated atmospheric mercury are not only concerned with source contributions but also with the influence of transport, transformation, and deposition processes on speciated atmospheric mercury concentrations at receptor locations. Previous studies applied multivariate receptor models including principal components analysis and positive matrix factorization, and back trajectory receptor models including potential source contri...

  6. Statistical Analysis of the Microvariable AGN Source Mrk 501

    Directory of Open Access Journals (Sweden)

    Alberto C. Sadun

    2018-02-01

    Full Text Available We report on the optical observations and analysis of the high-energy peaked BL Lac object (HBL) Mrk 501, at redshift z = 0.033. We can confirm microvariable behavior over the course of minutes on several occasions per night. As an alternative to the commonly understood dynamical model of random variations in the intensity of the AGN, we develop a relativistic beaming model with a minimum of free parameters, which allows us to infer changes in the line-of-sight angles for the motion of the different relativistic components. We hope our methods can be used in future studies of beamed emission in other active microvariable sources similar to the one we explored.
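A beaming model of this kind rests on the relativistic Doppler factor: a small change in the line-of-sight angle of a component moving with bulk Lorentz factor Γ produces a large change in observed flux. A minimal sketch, where the particular Γ, angles, and spectral index are illustrative assumptions rather than values from the paper:

```python
import math

def doppler_factor(gamma, theta_deg):
    """Doppler factor delta = 1 / (Gamma * (1 - beta * cos(theta)))."""
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    theta = math.radians(theta_deg)
    return 1.0 / (gamma * (1.0 - beta * math.cos(theta)))

def flux_boost(gamma, theta_deg, alpha=1.0):
    """Observed/emitted flux ratio ~ delta**(3 + alpha), spectral index alpha."""
    return doppler_factor(gamma, theta_deg) ** (3.0 + alpha)

# a swing of a few degrees in viewing angle changes the flux several-fold
ratio = flux_boost(10.0, 2.0) / flux_boost(10.0, 5.0)
```

Inverting this relation is what lets observed intensity changes be read as changes in the viewing angle of a component.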

  7. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient, if poorly implemented, set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate, with the aim of setting a community-driven gold standard for data handling, reporting and sharing. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  8. Open-Source RTOS Space Qualification: An RTEMS Case Study

    Science.gov (United States)

    Zemerick, Scott

    2017-01-01

    NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably (1) the diverse nature of RTOSs utilized across NASA, (2) the lack of a single NASA space-qualification criterion, verification and validation (V&V) analysis, or test bed, and (3) differing RTOS heritages, specifically open-source RTOSs versus closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort for the open-source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can serve as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for agency-wide usage. NASA's involvement in space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  9. Neutral beam source commercialization study. Final report

    International Nuclear Information System (INIS)

    King, H.J.

    1980-06-01

    The basic tasks of this Phase II project were to: generate a set of design drawings suitable for quantity production of sources of this design; fabricate a functional neutral beam source incorporating as many of the proposed design changes as proved feasible; and document the procedures and findings developed during the contract. These tasks have been accomplished and represent a demonstrated milestone in the industrialization of this complete device

  10. Studies of the infrared source CRL 2688

    International Nuclear Information System (INIS)

    Ney, E.P.; Merrill, K.M.; Becklin, E.E.; Neugebauer, G.; Wynn-Williams, C.G.

    1975-01-01

    Infrared, optical, and radio observations are described of a newly discovered galactic infrared source. Most of the radiation comes from a 1.5-arcsecond-diameter infrared source at a temperature of about 150 K, but some visible emission in the form of a symmetrical, highly polarized reflection nebulosity is also seen. The object could represent either a very early or a very late stage in stellar evolution

  11. The Term of the “Tatar-Mongols/Mongol-Tatars”: The Ethnic or Political Concept? An Experience of the Source Study and Conceptual Analysis

    Directory of Open Access Journals (Sweden)

    D.M. Iskhakov

    2016-07-01

    Full Text Available In recent years, researchers have begun to pay greater attention to the ethnic aspects of the Great Mongolian State’s formation at the turn of the 12th–13th centuries. However, a key problem of this period remains controversial: the definition of the ethnicity of the Tatar and other kindred clans. This article analyzes the problem in order to achieve a clear understanding of the ethnic situation in Central Asia during the formation of the Eke Mongol Ulus. As a result of a consideration of historiographical approaches to the ethnic nomenclature that the Mongolian and Chinese sources used with respect to the Turkic and Mongolian groups settled in this area, the author is inclined to the view that the Tatars and some other clans known from the sources (Naimans, Merkits), whom Chinggis Khan faced in the process of forming the “people of the Mongols”, were Turkic. At the same time, the author establishes a historical connection between the pre-Mongol Tatars and the Kimak and Uyghur khaganates. In particular, he reveals their affiliation with the elite, “royal” strata of these Turkic states. In turn, this allows us to reveal the presence of a Tatar component among the eastern Kipchak-Kimaks (Yemeks) with close ties to the last dynasty of Khwarezm shahs. On the basis of a detailed and comprehensive review of the material, the author points to the need for a new understanding of the term “Mongol-Tatars”. This term was not imposed by Chinese officials; rather, it was a meaningful politonym marking the two-part Turkic (Tatar)–Mongol nature of the “people” who established the Great Mongol Empire. The author plans to consider this issue in detail in relation to the ulus of Jochi.

  12. Analysis of the source term in the Chernobyl-4 accident

    International Nuclear Information System (INIS)

    Alonso, A.; Lopez Montero, J.V.; Pinedo Garrido, P.

    1990-01-01

    The report presents an analysis of the Chernobyl accident and of the phenomena with major influence on the source term, including the chemical effects of materials dumped over the reactor, carried out by the Chair of Nuclear Technology at Madrid University under a contract with the CEC. It also includes a comparison of the (Cs-137/Cs-134) ratio between measurements performed by Soviet authorities and by countries belonging to the Community and the OECD area. Chapter II contains a summary of both isotope measurements (Cs-134 and Cs-137), and their ratios, in samples of air, water, soil and agricultural and animal products collected by the Soviets, as given in their report presented in Vienna (1986). Chapter III reports on the inventories of cesium isotopes in the core, while Chapter IV analyses the transient, especially the fuel temperature reached, as a way to deduce the mechanisms behind the cesium escape. The cesium source term is analyzed in Chapter V. Normal conditions have been considered, as well as the transient and the post-accidental period, including the effects of deposited materials. The conclusion of this study is that the Chernobyl accident sequence is specific to the RBMK type of reactor, and that in the Western world, basic research on fuel behaviour under reactivity transients has already been carried out

  13. Analysis of 3-panel and 4-panel microscale ionization sources

    International Nuclear Information System (INIS)

    Natarajan, Srividya; Parker, Charles B.; Glass, Jeffrey T.; Piascik, Jeffrey R.; Gilchrist, Kristin H.; Stoner, Brian R.

    2010-01-01

    Two designs of a microscale electron ionization (EI) source are analyzed herein: a 3-panel design and a 4-panel design. Devices were fabricated using microelectromechanical systems technology. Field emission from carbon nanotubes provided the electrons for the EI source. Ion currents were measured for helium, nitrogen, and xenon at pressures ranging from 10⁻⁴ to 0.1 Torr. A comparison of the performance of both designs is presented. The 4-panel microion source showed a 10x improvement in performance compared to the 3-panel device. An analysis of the various factors affecting the performance of the microion sources is also presented. SIMION, an electron and ion optics simulation package, was coupled with experimental measurements to analyze the ion current results. The electron current contributing to ionization and the ion collection efficiency are believed to be the primary factors responsible for the higher efficiency of the 4-panel microion source. Other improvements in device design that could lead to higher ion source efficiency in the future are also discussed. These microscale ion sources are expected to find application as stand-alone ion sources as well as in miniature mass spectrometers.

  14. Identifying avian sources of faecal contamination using sterol analysis.

    Science.gov (United States)

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols, which are present in the faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation using targeted, source-specific PCR markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
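The two avian ratios and the ≥0.4 / ≥0.5 thresholds quoted in the abstract translate directly into a small decision rule. A sketch, where the field names and example concentrations are hypothetical, and where (as the abstract states) the check applies only after human and herbivore ratios have been excluded:

```python
def avian_ratios(s):
    """Compute the two avian-indicative sterol ratios from a dict of
    sterol concentrations (key names are hypothetical)."""
    r1 = s["ethylcholestanol"] / (
        s["ethylcholestanol"] + s["ethylcoprostanol"] + s["ethylepicoprostanol"])
    r2 = s["cholestanol"] / (
        s["cholestanol"] + s["coprostanol"] + s["epicoprostanol"])
    return r1, r2

def is_avian(s):
    """Avian pollution indicated when ratio 1 >= 0.4 and ratio 2 >= 0.5."""
    r1, r2 = avian_ratios(s)
    return r1 >= 0.4 and r2 >= 0.5

# made-up concentrations for a water sample not matching human/herbivore profiles
sample = {"ethylcholestanol": 50, "ethylcoprostanol": 40, "ethylepicoprostanol": 10,
          "cholestanol": 60, "coprostanol": 30, "epicoprostanol": 10}
```

With these numbers, ratio 1 is 0.5 and ratio 2 is 0.6, so the sample would be flagged as avian and passed on to PCR confirmation.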

  15. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and in its application to volcano monitoring. Semblance is a forward grid-search technique and a common source location method in infrasound studies as well as seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid-flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
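Semblance, the grid-search location method evaluated above, measures the coherence of array traces after aligning them with the travel-time delays predicted for each candidate source position. A minimal sketch on synthetic traces; the array geometry, sample delays, and candidate grid are invented for illustration:

```python
import numpy as np

def semblance(traces, delays):
    """Semblance of N traces aligned by per-channel sample delays:
    energy of the stacked trace over N times the summed trace energies.
    Equals 1.0 for perfectly coherent, correctly aligned signals."""
    aligned = np.array([np.roll(tr, -d) for tr, d in zip(traces, delays)])
    return np.sum(aligned.sum(axis=0) ** 2) / (len(traces) * np.sum(aligned ** 2))

def locate(traces, candidate_delays):
    """Forward grid search: index of the candidate whose predicted
    delays maximize semblance."""
    return int(np.argmax([semblance(traces, d) for d in candidate_delays]))

# synthetic example: one signal recorded with 0-, 2- and 4-sample delays
sig = np.sin(np.linspace(0.0, 6.0 * np.pi, 200))
traces = np.array([np.roll(sig, d) for d in (0, 2, 4)])
best = locate(traces, [(0, 2, 4), (0, 0, 0)])   # candidate 0 is the true one
```

In a real application each candidate grid node's delays come from travel times through the atmosphere, which is where the topography, temperature, and wind effects discussed above enter.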

  16. Analysis of the tuning characteristics of microwave plasma source

    Energy Technology Data Exchange (ETDEWEB)

    Miotk, Robert, E-mail: rmiotk@imp.gda.pl; Jasiński, Mariusz [Centre for Plasma and Laser Engineering, The Szewalski Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80-231 Gdańsk (Poland); Mizeraczyk, Jerzy [Department of Marine Electronics, Gdynia Maritime University, Morska 81-87, 81-225 Gdynia (Poland)

    2016-04-15

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are useless. The presented plasma source is currently used in research on hydrogen production from liquids.

  17. Analysis of the tuning characteristics of microwave plasma source

    International Nuclear Information System (INIS)

    Miotk, Robert; Jasiński, Mariusz; Mizeraczyk, Jerzy

    2016-01-01

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are useless. The presented plasma source is currently used in research on hydrogen production from liquids.

  18. Analysis on the inbound tourist source market in Fujian Province

    Science.gov (United States)

    YU, Tong

    2017-06-01

    The paper analyzes the development and structure of inbound tourism in Fujian Province using Excel, and conducts a cluster analysis of the inbound tourism market using SPSS 23.0, based on inbound tourism data for Fujian Province from 2006 to 2015. The results show that inbound tourism in Fujian Province has developed rapidly, and that the diversified range of inbound tourist source countries indicates a stable inbound tourism market. The cluster analysis divides the inbound tourist source market of Fujian Province into four categories, with tourists from the United States, Japan, Malaysia, and Singapore being the key to inbound tourism in Fujian Province.
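A cluster analysis like the one run in SPSS can be sketched as a small k-means over arrival counts. The market values below are invented for illustration (the paper clusters into four groups; three are used here to keep the example tiny), and the initialization uses deterministic quantile seeding rather than SPSS's procedure:

```python
def kmeans_1d(values, k, iters=50):
    """Tiny 1-D k-means: group source markets by annual arrivals.
    Quantile initialization keeps the run deterministic."""
    svals = sorted(values)
    centers = [svals[i * (len(svals) - 1) // (k - 1)] for i in range(k)]
    groups = []
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: abs(v - centers[j]))
            groups[nearest].append(v)
        centers = [sum(g) / len(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# hypothetical annual arrivals (10k visitors) for eight source markets
arrivals = [95, 90, 40, 38, 35, 8, 7, 5]
centers, groups = kmeans_1d(arrivals, k=3)
```

The resulting groups separate the two large markets from the mid-sized and small ones, which is the qualitative structure the paper reports for its key source countries.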

  19. Nuisance Source Population Modeling for Radiation Detection System Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sokkappa, P; Lange, D; Nelson, K; Wheeler, R

    2009-10-05

    source frequencies, but leave the task of estimating these frequencies for future work. Modeling of nuisance source populations is only useful if it helps in understanding detector system performance in real operational environments. Examples of previous studies in which nuisance source models played a key role are briefly discussed. These include screening of in-bound urban traffic and monitoring of shipping containers in transit to U.S. ports.

  20. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution

    International Nuclear Information System (INIS)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R.

    2015-01-01

    This work is intended to explain the challenges of fingerprint-based source apportionment of polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires validation of results but is useful for source suggestion. • Bayesian CMB provides a practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprint-based PAH source apportionment method.
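A CMB analysis solves a non-negative mixing problem: the measured PAH profile is modeled as a linear combination of source fingerprints. The toy sketch below recovers contributions from synthetic fingerprints with plain multiplicative least-squares updates; the paper's Bayesian CMB additionally propagates measurement errors and fingerprint variability, which this version does not attempt.

```python
import numpy as np

def cmb_contributions(profiles, measured, n_iter=2000):
    """Estimate non-negative source contributions a such that
    measured ~= profiles @ a. Multiplicative updates keep a >= 0
    (a stand-in for a proper NNLS or Bayesian solver)."""
    F = np.asarray(profiles, float)     # (n_species, n_sources) fingerprints
    c = np.asarray(measured, float)     # measured species profile
    a = np.ones(F.shape[1])
    for _ in range(n_iter):
        a *= (F.T @ c) / (F.T @ (F @ a) + 1e-12)
    return a

# two hypothetical fingerprints (columns) and a sample mixed as 2:1
F = np.array([[0.7, 0.1],
              [0.2, 0.3],
              [0.1, 0.6]])
a_hat = cmb_contributions(F, F @ np.array([2.0, 1.0]))
```

Normalizing `a_hat` to percentages gives the apportionment figures of the kind quoted above (traffic 35%, coke oven 24%, and so on).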

  1. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Directory of Open Access Journals (Sweden)

    Jianjun Tang

    2014-01-01

    Full Text Available Assembly precision optimization of complex products offers a large benefit in improving product quality. Because of the coupling of a variety of deviation sources, the target of assembly precision optimization is difficult to determine accurately. In order to optimize assembly precision accurately and rapidly, a sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of assembly dimension variation to deviation source dimension variation. Second, according to the assembly constraint relations, assembly sequences and locating scheme, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations for each dimension are established. Then, assembly deviation source sensitivity is calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
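The sensitivity defined above (assembly dimension variation over deviation source variation) can be approximated numerically by differentiating the assembly response equation, which is what a first-order Taylor expansion linearizes analytically. A sketch with an invented linear gap equation standing in for the paper's scalar vector-loop equations:

```python
def sensitivities(assembly_fn, nominal, h=1e-6):
    """First-order sensitivity of an assembly dimension to each deviation
    source: S_i = d(assembly)/d(x_i), via central finite differences
    (a numerical stand-in for the analytic Taylor expansion)."""
    base = list(nominal)
    out = []
    for i in range(len(base)):
        hi = base[:]; hi[i] += h
        lo = base[:]; lo[i] -= h
        out.append((assembly_fn(hi) - assembly_fn(lo)) / (2 * h))
    return out

# hypothetical linearized gap equation for a two-part stack-up
gap = lambda x: 10.0 - x[0] - 0.5 * x[1]
S = sensitivities(gap, [3.0, 4.0])   # sensitivities to the two sources
```

Here the first source passes its variation through fully (sensitivity -1) and the second only half (-0.5), so optimization effort would go to the first.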

  2. FECAL SOURCE TRACKING BY ANTIBIOTIC RESISTANCE ANALYSIS ON A WATERSHED EXHIBITING LOW RESISTANCE

    Science.gov (United States)

    The ongoing development of microbial source tracking has made it possible to identify contamination sources with varying accuracy, depending on the method used. The purpose of this study was to test the efficiency of the antibiotic resistance analysis (ARA) method under low ...

  3. Open source drug discovery in practice: a case study.

    Science.gov (United States)

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Technically TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project. However, both have enabled high quality research at low cost. The critical success factors appear to be clearly

  4. Open Source Drug Discovery in Practice: A Case Study

    Science.gov (United States)

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Background Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. Methodology/Principal Findings A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Conclusions/Significance Technically TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project. However, both have enabled high quality

  5. Critical Analysis on Open Source LMSs Using FCA

    Science.gov (United States)

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  6. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  7. Comparative analysis of methods and sources of financing of the transport organizations activity

    Science.gov (United States)

    Gorshkov, Roman

    2017-10-01

    The article analyzes methods of financing transport organizations under conditions of limited investment resources. A comparative analysis of these methods is carried out, and a classification of investments and of the methods and sources of financial support for projects currently being implemented is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and of financial support for the activities of a transport organization were analyzed from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing for transport organizations.

  8. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    Science.gov (United States)

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.
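The core quantity behind biPISA is the cross-bispectrum between sensor signals, and in particular its antisymmetric combination, which cancels exactly for instantaneously mixed (volume-conducted) copies of a single source. A minimal epoch-averaged estimator; the channel layout, epoch counts, and frequency-bin indices are illustrative, and this sketch omits the decomposition step that biPISA performs on top of such estimates:

```python
import numpy as np

def cross_bispectrum(epochs, i, j, k, f1, f2):
    """Epoch-averaged cross-bispectrum estimate
    B_ijk(f1, f2) = <X_i(f1) X_j(f2) conj(X_k(f1 + f2))>,
    with f1, f2 given as FFT bin indices.
    epochs: (n_epochs, n_channels, n_samples) array."""
    X = np.fft.fft(epochs, axis=-1)
    return np.mean(X[:, i, f1] * X[:, j, f2] * np.conj(X[:, k, f1 + f2]))

def antisym_bispectrum(epochs, i, j, k, f1, f2):
    """Antisymmetric combination B_ijk - B_kji: vanishes for
    instantaneous mixtures of independent sources (volume conduction)."""
    return (cross_bispectrum(epochs, i, j, k, f1, f2)
            - cross_bispectrum(epochs, k, j, i, f1, f2))

# synthetic data where channel 2 is a pure copy of channel 0
rng = np.random.default_rng(0)
epochs = rng.standard_normal((20, 3, 64))
epochs[:, 2] = epochs[:, 0]
b_anti = antisym_bispectrum(epochs, 0, 1, 2, 3, 5)   # cancels exactly
```

Genuine cross-frequency coupling between distinct sources leaves this antisymmetric part nonzero, which is what makes it robust to mixing artifacts.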

  9. Human Campylobacteriosis in Luxembourg, 2010–2013: A Case-Control Study Combined with Multilocus Sequence Typing for Source Attribution and Risk Factor Analysis

    OpenAIRE

    Mossong, Joël; Mughini-Gras, Lapo; Penny, Christian; Devaux, Anthony; Olinger, Christophe; Losch, Serge; Cauchie, Henry-Michel; van Pelt, Wilfrid; Ragimbeau, Catherine

    2016-01-01

    Campylobacteriosis has increased markedly in Luxembourg during recent years. We sought to determine which Campylobacter genotypes infect humans, where they may originate from, and how they may infect humans. Multilocus sequence typing was performed on 1153 Campylobacter jejuni and 136 C. coli human strains to be attributed to three putative animal reservoirs (poultry, ruminants, pigs) and to environmental water using the asymmetric island model. A nationwide case-control study (2010–2013) for...

  10. Radioisotope sources for X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Leonowich, J.; Pandian, S.; Preiss, I.L.

    1977-01-01

    Problems involved in developing radioisotope sources and the characteristics of potentially useful radioisotopes for X-ray fluorescence analysis are presented. These include the following. The isotope must be evaluated for the physical and chemical forms available, purity, half-life, specific activity, toxicity, and cost. The radiation hazards of the source must be considered. The type and amount of radiation output of the source must be evaluated. The source construction must be planned. The source should also present an advance over those currently available in order to justify its development. Some of the isotopes which are not yet in use but look very promising are indicated, and their data are tabulated. A more or less ''perfect'' source within a given range of interest would exhibit the following characteristics: (1) decay by an isomeric transition with little or no internal conversion; (2) have an intense gamma transition near the absorption edge of the element(s) of interest, with no high-energy gammas; (3) have a sufficiently long half-life (of the order of years) for both economic and calibration reasons; (4) have a sufficiently large cross-section for production in a reasonable amount of time. If there are competing reactions, the interfering isotopes should either be reasonably short-lived or be amenable to chemical separation from the isotope of interest with a minimum of difficulty. (T.G.)

  11. Reactivity studies on the advanced neutron source

    International Nuclear Information System (INIS)

    Ryskamp, J.M.; Redmond, E.L. II; Fletcher, C.D.

    1990-01-01

    An Advanced Neutron Source (ANS) with a peak thermal neutron flux of about 8.5 × 10^19 m^-2 s^-1 is being designed for condensed matter physics, materials science, isotope production, and fundamental physics research. The ANS is a new reactor-based research facility being planned by Oak Ridge National Laboratory (ORNL) to meet the need for an intense steady-state source of neutrons. The design effort is currently in the conceptual phase. A reference reactor design has been selected in order to examine the safety, performance, and costs associated with this one design. The ANS Project has an established, documented safety philosophy, and safety-related design criteria are currently being established. The purpose of this paper is to present analyses of safety aspects of the reference reactor design that are related to core reactivity events. These analyses include control rod worth, shutdown rod worth, heavy water voiding, neutron beam tube flooding, light water ingress, and single fuel element criticality. Understanding these safety aspects will allow us to make design modifications that improve the reactor safety and achieve the safety related design criteria. 8 refs., 3 tabs

  12. High flux isotope reactor cold source preconceptual design study report

    International Nuclear Information System (INIS)

    Selby, D.L.; Bucholz, J.A.; Burnette, S.E.

    1995-12-01

    In February 1995, the deputy director of Oak Ridge National Laboratory (ORNL) formed a group to examine the need for upgrades to the High Flux Isotope Reactor (HFIR) system in light of the cancellation of the Advanced Neutron Source Project. One of the major findings of this study was that there was an immediate need for the installation of a cold neutron source facility in the HFIR complex. The anticipated cold source will consist of a cryogenic LH2 moderator plug, a cryogenic pump system, a refrigerator that uses helium gas as a refrigerant, a heat exchanger to interface the refrigerant with the hydrogen loop, liquid hydrogen transfer lines, a gas handling system that includes vacuum lines, and an instrumentation and control system to provide constant system status monitoring and to maintain system stability. The scope of this project includes the development, design, safety analysis, procurement/fabrication, testing, and installation of all of the components necessary to produce a working cold source within an existing HFIR beam tube. This project will also include those activities necessary to transport the cold neutron beam to the front face of the present HFIR beam room. The cold source project has been divided into four phases: (1) preconceptual, (2) conceptual design and research and development (R and D), (3) detailed design and procurement, and (4) installation and operation. This report marks the conclusion of the preconceptual phase and establishes the concept feasibility. The information presented includes the project scope, the preliminary design requirements, the preliminary cost and schedule, the preliminary performance data, and an outline of the various plans for completing the project.

  13. Sociodemographic characteristics and frequency of consuming home-cooked meals and meals from out-of-home sources: cross-sectional analysis of a population-based cohort study.

    Science.gov (United States)

    Mills, Susanna; Adams, Jean; Wrieden, Wendy; White, Martin; Brown, Heather

    2018-04-11

    To identify sociodemographic characteristics associated with frequency of consuming home-cooked meals and meals from out-of-home sources. Cross-sectional analysis of a population-based cohort study. Frequency of consuming home-cooked meals, ready meals, takeaways and meals out were derived from a participant questionnaire. Sociodemographic characteristics regarding sex, age, ethnicity, working overtime and socio-economic status (SES; measured by household income, educational attainment, occupational status and employment status) were self-reported. Sociodemographic differences in higher v. lower meal consumption frequency were explored using logistic regression, adjusted for other key sociodemographic variables. Cambridgeshire, UK. Fenland Study participants (n 11 326), aged 29-64 years at baseline. Eating home-cooked meals more frequently was associated with being female, older, of higher SES (measured by greater educational attainment and household income) and not working overtime. Being male was associated with a higher frequency of consumption for all out-of-home meal types. Consuming takeaways more frequently was associated with lower SES (measured by lower educational attainment and household income), whereas eating out more frequently was associated with higher SES (measured by greater educational attainment and household income) and working overtime. Sociodemographic characteristics associated with frequency of eating meals from different out-of-home sources varied according to meal source. Findings may be used to target public health policies and interventions for promoting healthier diets and dietary-related health towards people consuming home-cooked meals less frequently, such as men, those with lower educational attainment and household income, and overtime workers.

  14. Positron annihilation lifetime spectroscopy source correction determination: A simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Kanda, Gurmeet S.; Keeble, David J., E-mail: d.j.keeble@dundee.ac.uk

    2016-02-01

    Positron annihilation lifetime spectroscopy (PALS) can provide sensitive detection and identification of vacancy-related point defects in materials. These measurements are normally performed using a positron source supported, and enclosed by, a thin foil. Annihilation events from this source arrangement must be quantified and are normally subtracted from the spectrum before analysis of the material lifetime components proceeds. Here simulated PALS spectra reproducing source correction evaluation experiments have been systematically fitted and analysed using the packages PALSfit and MELT. Simulations were performed assuming a single lifetime material, and for a material with two lifetime components. Source correction terms representing a directly deposited source and various foil supported sources were added. It is shown that in principle these source terms can be extracted from suitably designed experiments, but that fitting a number of independent, nominally identical, spectra is recommended.
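    The fitting experiment described in this record can be imitated in miniature. The sketch below is an illustration only, not the PALSfit/MELT workflow: the lifetimes (in ns), the source-term intensity and the count level are all invented. It simulates a two-component lifetime spectrum with Poisson counting noise and recovers the components by nonlinear least squares.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)

    # Hypothetical components: a 0.180 ns material lifetime plus a 0.380 ns
    # "source" term from annihilations in the foil (intensities sum to 1).
    TAU1, TAU2, I2 = 0.180, 0.380, 0.15

    def model(t, tau1, tau2, i2, a):
        # Sum of two normalized exponential decay components, scaled by a.
        return a * ((1 - i2) / tau1 * np.exp(-t / tau1) + i2 / tau2 * np.exp(-t / tau2))

    t = np.linspace(0.0, 3.0, 300)                      # time axis in ns
    counts = rng.poisson(model(t, TAU1, TAU2, I2, 1e5)).astype(float)

    popt, _ = curve_fit(model, t, counts, p0=[0.2, 0.4, 0.2, 9e4])
    print(popt[:3])  # fitted tau1, tau2, i2 should land near 0.180, 0.380, 0.15
    ```

    With high statistics the fit is well conditioned; at realistic count levels, or with closer-spaced lifetimes, the same fit becomes unstable, which is why the abstract recommends fitting several nominally identical spectra.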

  15. Studies on the method of producing radiographic 170Tm source

    International Nuclear Information System (INIS)

    Maeda, Sho

    1976-08-01

    A method of producing a radiographic ^170Tm source has been studied, including target preparation, neutron irradiation, handling of the irradiated target in the hot cell, and source capsules. On the basis of the results, practical ^170Tm radiographic sources (29–49 Ci, with pellets 3 mm in diameter and 3 mm long) were produced in trial by neutron irradiation with the JMTR. (auth.)

  16. Comparative analysis of traditional and alternative energy sources

    Directory of Open Access Journals (Sweden)

    Adriana Csikósová

    2008-11-01

    Full Text Available The presented thesis, Comparative analysis of traditional and alternative energy resources, draws on theoretical information sources, research in the firm, internal data, and trends in company and market development to describe the problem and its application. The theoretical part is dedicated to traditional and alternative energy resources: their reserves, trends in their use and development, and the energy balance in the world, the EU and Slovakia. The analytical part profiles the company and evaluates the heat pump market using the General Electric method. Since the company implements, among other products, heat pumps based on geothermal energy and ambient energy (air), the mission of the comparative analysis is to compare traditional energy resources with the heat pump from the ecological, utility and economic points of view. The results of the comparative analysis are summarized in a SWOT analysis. The thesis also includes a questionnaire proposal for improving effectiveness and analysing customer satisfaction, and the expected possibilities of support for alternative energy resources (benefits from the government and EU funds).

  17. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.

  18. Your Personal Analysis Toolkit - An Open Source Solution

    Science.gov (United States)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  19. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    Science.gov (United States)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach to determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether the separation of events holds in other regions, where station coverage and knowledge of Earth structure are limited. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of explosive scalar moment. Our synthetic data tests indicate that biases in scalar

  20. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries

  1. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and decoupled direct method (DDM). A simple theoretical example to compare these approaches is used highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
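    The distinction the abstract draws between impacts (sensitivity analysis) and contributions (source apportionment) can be made concrete with a toy model. The sublinear concentration function below is invented purely for illustration: under nonlinear chemistry, brute-force zero-out impacts no longer sum to the modelled concentration, while mass-proportional contributions do by construction.

    ```python
    def conc(e1, e2):
        # Toy nonlinear "chemistry": concentration grows sublinearly with emissions.
        return (e1 + e2) ** 0.7

    e1, e2 = 4.0, 6.0
    total = conc(e1, e2)

    # Sensitivity-style "impact" of each source (brute-force 100% zero-out):
    impact1 = total - conc(0.0, e2)
    impact2 = total - conc(e1, 0.0)

    # Apportionment-style "contribution" (mass-proportional, tagged-species idea):
    contrib1 = total * e1 / (e1 + e2)
    contrib2 = total * e2 / (e1 + e2)

    print(f"total = {total:.3f}")
    print(f"impacts sum to {impact1 + impact2:.3f} (less than total: nonlinearity)")
    print(f"contributions sum to {contrib1 + contrib2:.3f} (equals total by construction)")
    ```

    Replacing the exponent 0.7 with 1 makes the two sums coincide, which is the linear case where the abstract says the two concepts are interchangeable.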

  2. Obsidian sources characterized by neutron-activation analysis.

    Science.gov (United States)

    Gordus, A A; Wright, G A; Griffin, J B

    1968-07-26

    Concentrations of elements such as manganese, scandium, lanthanum, rubidium, samarium, barium, and zirconium in obsidian samples from different flows show ranges of 1000 percent or more, whereas the variation in element content in obsidian samples from a single flow appears to be less than 40 percent. Neutron-activation analysis of these elements, as well as of sodium and iron, provides a means of identifying the geologic source of an archeological artifact of obsidian.
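    The attribution logic can be sketched in a few lines. The flows and concentration values below are invented, not measured NAA data: an artifact is assigned to the flow whose elemental profile it most resembles, here by nearest centroid in log-concentration space, since intra-flow scatter is roughly multiplicative.

    ```python
    import numpy as np

    # Hypothetical mean trace-element concentrations (ppm) for two obsidian flows.
    # Real work would use measured NAA data for Mn, Sc, La, Rb, Sm, Ba, Zr, Na, Fe.
    sources = {
        "flow_A": np.array([450.0, 2.1, 30.0, 120.0]),   # Mn, Sc, La, Rb
        "flow_B": np.array([900.0, 5.5, 62.0, 210.0]),
    }

    def attribute(artifact_ppm):
        """Assign an artifact to the source with the nearest centroid in
        log-concentration space."""
        x = np.log(np.asarray(artifact_ppm))
        dists = {name: np.linalg.norm(x - np.log(c)) for name, c in sources.items()}
        return min(dists, key=dists.get)

    print(attribute([430.0, 2.0, 28.0, 115.0]))  # → flow_A
    ```

    Because between-flow ranges span 1000 percent or more while within-flow variation stays under 40 percent, even this crude distance rule separates the groups cleanly; published studies use multivariate statistics on the same idea.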

  3. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    Science.gov (United States)

    1989-06-01

    M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical...Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required...Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  4. Neutronics of the IFMIF neutron source: development and analysis

    International Nuclear Information System (INIS)

    Wilson, P.P.H.

    1999-01-01

    The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d, xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE) [de

  5. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  6. Photoemission studies using laboratory and synchrotron sources

    International Nuclear Information System (INIS)

    Phase, D.M.

    2012-01-01

    Synchrotron radiation sources, providing intense, polarized and stable beams of ultraviolet, soft and hard X-ray photons, are having a great impact on physics, chemistry, biology, materials science and other areas of research. In particular, synchrotron radiation has revolutionized photoelectron spectroscopy by enhancing its capabilities for investigating the electronic properties of solids. The first Indian synchrotron storage ring, Indus-1, is in operation at RRCAT, Indore. The UGC-DAE CSR, with the help of university scientists, has designed and developed an angle-integrated photoelectron spectroscopy (PES) beamline on this 450 MeV storage ring. A storage ring of this kind is most suitable for investigations in the energy range from a few electron volts to around five hundred electron volts. In this lecture we describe the details of the PES beamline and its experimental station. To date, university users have carried out photoemission measurements on a variety of samples. Some of the spectra recorded by users will be presented in order to show the capability of this beamline. In the later part we report a review of our recent research work carried out on dilute magnetic thin films using this beamline. (author)

  7. Turbulence in extended synchrotron radio sources. I. Polarization of turbulent sources. II. Power-spectral analysis

    International Nuclear Information System (INIS)

    Eilek, J.A.

    1989-01-01

    Recent theories of magnetohydrodynamic turbulence are used to construct microphysical turbulence models, with emphasis on models of anisotropic turbulence. These models have been applied to the determination of the emergent polarization from a resolved uniform source. It is found that depolarization alone is not a unique measure of the turbulence, and that the turbulence will also affect the total-intensity distributions. Fluctuations in the intensity image can thus be employed to measure turbulence strength. In the second part, it is demonstrated that a power-spectral analysis of the total and polarized intensity images can be used to obtain the power spectra of the synchrotron emission. 81 refs

  8. PUREX source Aggregate Area management study report

    International Nuclear Information System (INIS)

    1993-03-01

    This report presents the results of an aggregate area management study (AAMS) for the PUREX Plant Aggregate Area in the 200 Areas of the US Department of Energy (DOE) Hanford Site in Washington State. This scoping-level study provides the basis for initiating Remedial Investigation/Feasibility Study (RI/FS) activities under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) or Resource Conservation and Recovery Act (RCRA) Facility Investigations (RFI) and Corrective Measures Studies (CMS) under RCRA. This report also integrates select RCRA treatment, storage, or disposal (TSD) closure activities with CERCLA and RCRA past-practice investigations.

  9. 42 CFR 456.244 - Data sources for studies.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Data sources for studies. 456.244 Section 456.244 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...: Medical Care Evaluation Studies § 456.244 Data sources for studies. Data that the committee uses to...

  10. 42 CFR 456.144 - Data sources for studies.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Data sources for studies. 456.144 Section 456.144 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Care Evaluation Studies § 456.144 Data sources for studies. Data that the committee uses to perform...

  11. Energy and exergy analysis of a double effect absorption refrigeration system based on different heat sources

    International Nuclear Information System (INIS)

    Kaynakli, Omer; Saka, Kenan; Kaynakli, Faruk

    2015-01-01

    Highlights: • Energy and exergy analyses were performed on a double effect series flow absorption refrigeration system. • The refrigeration system runs on various heat sources such as hot water, hot air and steam. • A comparative analysis was carried out on these heat sources in terms of exergy destruction and mass flow rate of heat source. • The effect of heat sources on the exergy destruction of the high pressure generator was investigated. - Abstract: Absorption refrigeration systems are environmentally friendly since they can utilize industrial waste heat and/or solar energy. In terms of the heat source, researchers usually prefer a single type, such as hot water or steam. In this study, energy and exergy analyses are performed on a double effect series flow absorption refrigeration system with water/lithium bromide as the working fluid pair. The refrigeration system runs on various heat sources, such as hot water, hot air and steam, supplied to the High Pressure Generator (HPG), because hot water/steam and hot air are the most commonly available heat sources for absorption applications; the first law of thermodynamics alone, however, may not be sufficient to analyze the absorption refrigeration system and to show the differences among heat source types. On the other hand, operating temperatures of the overall system and its components have a major effect on their performance and functionality. In this regard, a parametric study is conducted to investigate this effect on the heat capacity and exergy destruction of the HPG, the coefficient of performance (COP) of the system, and the mass flow rate of the heat sources. A comparative analysis is also carried out on several heat sources (e.g. hot water, hot air and steam) in terms of exergy destruction and mass flow rate of heat source. From the analyses it is observed that the exergy destruction of the HPG increases at higher temperatures of the heat sources, condenser and absorber, and lower

  12. Fecal bacteria source characterization and sensitivity analysis of SWAT 2005

    Science.gov (United States)

    The Soil and Water Assessment Tool (SWAT) version 2005 includes a microbial sub-model to simulate fecal bacteria transport at the watershed scale. The objectives of this study were to demonstrate methods to characterize fecal coliform bacteria (FCB) source loads and to assess the model sensitivity t...

  13. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu eLei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI) is a useful technique to localize the generators from a given scalp electric measurement and to investigate the temporal dynamics of the large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). Each modality's specific contribution is analyzed from the perspective of source reconstruction. For spatial priors, EEG-correlated fMRI, temporally coherent networks and resting-state fMRI are systematically introduced into the ESI. Moreover, fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as potential priors, which can help to draw inferences about neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  14. Semiworks source aggregate area management study report

    International Nuclear Information System (INIS)

    1993-05-01

    This report presents the results of an aggregate area management study (AAMS) for the Semi-Works Aggregate Area in the 200 Areas of the US Department of Energy (DOE) Hanford Site in Washington State. This scoping-level study provides the basis for initiating Remedial Investigation/Feasibility Study (RI/FS) activities under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) or Resource Conservation and Recovery Act (RCRA) Facility Investigations (RFI) and Corrective Measures Studies (CMS) under RCRA. This report also integrates select RCRA treatment, storage, or disposal (TSD) closure activities with CERCLA and RCRA past-practice investigations. This approach is described and justified in The Hanford Federal Facility Agreement and Consent Order Change Package. This strategy provides new concepts for: accelerating decision-making by maximizing the use of existing data consistent with data quality objectives (DQOs); and undertaking expedited response actions (ERAs) and/or interim remedial measures (IRMs), as appropriate, to either remove threats to human health and welfare and the environment, or to reduce risk by reducing toxicity, mobility, or volume of contaminants

  15. Analysis of the TMI-2 source range detector response

    International Nuclear Information System (INIS)

    Carew, J.F.; Diamond, D.J.; Eridon, J.M.

    1980-01-01

    In the first few hours following the TMI-2 accident, large variations (factors of 10-100) in the source range (SR) detector response were observed. The purpose of this analysis was to quantify the various effects which could contribute to these large variations. The effects evaluated included the transmission of neutrons and photons from the core to the detector, and the reduction in the multiplication of the Am-Be startup sources, and subsequent reduction in SR detector response, due to core voiding. A one-dimensional ANISN slab model of the TMI-2 core, core externals, pressure vessel and containment has been constructed for calculation of the SR detector response and is presented

  16. Obsidian sourcing by PIXE analysis at AURA2

    International Nuclear Information System (INIS)

    Neve, S.R.; Barker, P.H.; Holroyd, S.; Sheppard, P.J.

    1994-01-01

    The technique of Proton Induced X-ray Emission is a suitable method for the elemental analysis of obsidian samples and artefacts. By comparing the elemental composition of obsidian artefacts with those of known sources of obsidian and identifying similarities, the likely origin of the sample can be discovered and information about resource procurement gained. A PIXE facility has now been established at the Auckland University Research Accelerator Laboratory, AURA2. It offers a rapid, multi-element, non-destructive method of characterisation of obsidian samples ranging from small chips to large pieces. In an extensive survey of Mayor Island obsidian, a discrimination has been made between the different locations of obsidian deposits on the island. In addition, using the database developed at AURA2, artefacts from the site of Opita, Hauraki Plains, have been sourced. (Author). 18 refs., 8 figs., 7 tabs., 1 appendix

  17. Open-Source Data and the Study of Homicide.

    Science.gov (United States)

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to be used as a valid and reliable data source in testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data. Moreover, for every variable measured, the open sources captured as much, or more, of the information presented in the official data, and variables not available in official data, but potentially useful for testing theory, were identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.

  18. Seismicity and source spectra analysis in Salton Sea Geothermal Field

    Science.gov (United States)

    Cheng, Y.; Chen, X.

    2016-12-01

    The surge of "man-made" earthquakes in recent years has led to considerable concern about the associated hazards. Improved monitoring of small earthquakes would significantly help understand such phenomena and the underlying physical mechanisms. In the Salton Sea Geothermal Field in southern California, open access to a local borehole network provides a unique opportunity to better understand the seismicity characteristics, the related earthquake hazards, and the relationship with the geothermal system, tectonic faulting and other physical conditions. We obtain high-resolution earthquake locations in the Salton Sea Geothermal Field and analyze the characteristics of spatiotemporally isolated earthquake clusters, magnitude-frequency distributions and the spatial variation of stress drops. The analysis reveals spatially coherent distributions of different types of clustering, b-values, and stress drops. The mixture-type clusters (short-duration rapid bursts with high aftershock productivity) are predominantly located within the active geothermal field and correlate with high-b-value, low-stress-drop microearthquake clouds, while regular aftershock sequences and swarms are distributed throughout the study area. The differences between earthquakes inside and outside the geothermal operation field suggest a possible way to distinguish seismicity directly induced by energy operations from typical seismic-slip-driven sequences. The spatially coherent b-value distribution enables in-situ estimation of probabilities for M≥3 earthquakes, and shows that the high large-magnitude-event (LME) probability zones with high stress drop are likely associated with tectonic faulting. The high stress drops at shallow (1-3 km) depth indicate the existence of active faults, while low stress drops near injection wells likely correspond to the seismic response to fluid injection. We interpret the spatial variation of seismicity and source characteristics as the result of fluid
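The spatial b-value mapping mentioned above rests on the standard maximum-likelihood estimator of Aki (1965); a minimal sketch on a synthetic catalog with a true b of 1:

```python
import numpy as np

# Aki's maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc) for
# a continuous catalog complete above magnitude Mc.  For magnitudes
# binned to width dM, Utsu's correction replaces Mc with Mc - dM/2.

def b_value(mags, mc, dm=0.0):
    mags = np.asarray(mags, dtype=float)
    mags = mags[mags >= mc]
    return np.log10(np.e) / (mags.mean() - (mc - dm / 2.0))

rng = np.random.default_rng(0)
# Gutenberg-Richter magnitudes above Mc are exponentially distributed
# with rate b * ln(10); here the true b is 1.
catalog = 1.0 + rng.exponential(scale=1.0 / np.log(10), size=20000)
print(round(b_value(catalog, mc=1.0), 2))   # close to the true b = 1
```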

  19. Zoomed MRI Guided by Combined EEG/MEG Source Analysis: A Multimodal Approach for Optimizing Presurgical Epilepsy Work-up and its Application in a Multi-focal Epilepsy Patient Case Study.

    Science.gov (United States)

    Aydin, Ü; Rampp, S; Wollbrink, A; Kugel, H; Cho, J -H; Knösche, T R; Grova, C; Wellmer, J; Wolters, C H

    2017-07-01

    In recent years, the use of source analysis based on electroencephalography (EEG) and magnetoencephalography (MEG) has gained considerable attention in presurgical epilepsy diagnosis. However, in many cases source analysis alone is not used to tailor surgery unless the findings are confirmed by lesions in MRI, such as cortical malformations. For many patients, the histology of tissue resected from MRI-negative epilepsy shows small lesions, which indicates the need for more sensitive MR sequences. In this paper, we describe a technique to maximize the synergy between combined EEG/MEG (EMEG) source analysis and high-resolution MRI. The procedure has three main steps: (1) construction of a detailed and calibrated finite element head model that accounts for the variation of individual skull conductivities and white matter anisotropy; (2) EMEG source analysis performed on averaged interictal epileptic discharges (IED); (3) high-resolution (0.5 mm) zoomed MR imaging, limited to small areas centered at the EMEG source locations. The proposed diagnosis procedure was then applied in a particularly challenging epilepsy patient case: EMEG analysis at the peak of the IED coincided with a right frontal focal cortical dysplasia (FCD), which had been detected with standard 1 mm resolution MRI. Of particular interest, zoomed MR imaging (applying parallel transmission, 'ZOOMit') guided by EMEG at the spike onset revealed a second, fairly subtle FCD in the left fronto-central region. The evaluation revealed that this second FCD, which had not been detectable at standard 1 mm resolution, was the trigger of the seizures.

  20. Creep analysis of fuel plates for the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Swinson, W.F.; Yahr, G.T.

    1994-11-01

    The reactor for the planned Advanced Neutron Source will use closely spaced arrays of fuel plates. The plates are thin and will have a core containing enriched uranium silicide fuel clad in aluminum. The heat load caused by the nuclear reactions within the fuel plates will be removed by flowing high-velocity heavy water through narrow channels between the plates. However, the plates will still be at elevated temperatures while in service, and the potential for excessive plate deformation because of creep must be considered. An analysis of the creep deformation and stresses caused by temperature over a given time span has been performed and is reported herein

  1. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the resources of Federal and State health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  2. Review on solving the inverse problem in EEG source analysis

    Directory of Open Access Journals (Sweden)

    Fabri Simon G

    2008-11-01

    Full Text Available Abstract In this primer, we give a review of the inverse problem for EEG source localization. This is intended for researchers new to the field, to give insight into the state-of-the-art techniques used to find approximate solutions for the brain sources giving rise to a scalp potential recording. Furthermore, a review of the performance results of the different techniques is provided to compare these different inverse solutions. The authors also include the results of a Monte-Carlo analysis which they performed to compare four non-parametric algorithms and hence contribute to what is presently recorded in the literature. An extensive list of references to the work of other researchers is also provided. This paper starts off with a mathematical description of the inverse problem and proceeds to discuss the two main categories of methods which were developed to solve the EEG inverse problem, namely the non-parametric and parametric methods. The main difference between the two is whether a fixed number of dipoles is assumed a priori or not. Various techniques falling within these categories are described, including minimum norm estimates and their generalizations, LORETA, sLORETA, VARETA, S-MAP, ST-MAP, Backus-Gilbert, LAURA, Shrinking LORETA FOCUSS (SLF), SSLOFO and ALF for non-parametric methods, and beamforming techniques, BESA, subspace techniques such as MUSIC and methods derived from it, FINES, simulated annealing and computational intelligence algorithms for parametric methods. From a review of the performance of these techniques as documented in the literature, one could conclude that in most cases the LORETA solution gives satisfactory results. In situations involving clusters of dipoles, higher-resolution algorithms such as MUSIC or FINES are however preferred. Imposing reliable biophysical and psychological constraints, as done by LAURA, has given superior results. The Monte-Carlo analysis performed, comparing WMN, LORETA, sLORETA and SLF
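Of the non-parametric methods listed, the minimum norm estimate is the simplest to state. A minimal numerical sketch, with a random matrix standing in for a lead field computed from a real head model:

```python
import numpy as np

# Tikhonov-regularized minimum norm estimate (MNE):
#   j = L.T @ (L @ L.T + lam * I)^(-1) @ v
# where L is the lead field (sensors x sources) and v the scalp
# potentials.  The random L below is a stand-in for a head model.

rng = np.random.default_rng(1)
n_sensors, n_sources = 32, 500
L = rng.standard_normal((n_sensors, n_sources))

j_true = np.zeros(n_sources)
j_true[42] = 1.0                       # one active dipole
v = L @ j_true                         # noiseless scalp recording

lam = 1e-6                             # regularization parameter
j_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), v)

# MNE returns the smallest-norm current distribution that fits the
# data, so it explains v while spreading energy across many sources.
print(np.linalg.norm(L @ j_hat - v))   # near-zero data misfit
```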

  3. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
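The delayed-correlation idea can be sketched with the closely related AMUSE algorithm, a generic second-order method, not the authors' exact implementation; the signals, mixing matrix, and delay below are invented stand-ins for fetal/maternal traces:

```python
import numpy as np

# AMUSE-style extraction: whiten the mixtures, then eigendecompose
# the symmetrized time-delayed covariance.  Sources with distinct
# delayed autocorrelations (here, different periods) separate.

rng = np.random.default_rng(2)
n = 20000
t = np.arange(n)
fetal = np.sin(2 * np.pi * t / 23)           # faster "fetal" rhythm
maternal = np.sin(2 * np.pi * t / 71)        # slower "maternal" rhythm
A = np.array([[1.0, 0.8], [0.5, 1.0]])       # unknown mixing
X = A @ np.vstack([fetal, maternal]) + 0.01 * rng.standard_normal((2, n))

X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X       # whitened mixtures

tau = 11                                     # delay from autocorrelation analysis
C = Z[:, :-tau] @ Z[:, tau:].T / (n - tau)   # time-delayed covariance
_, V = np.linalg.eigh((C + C.T) / 2)
Y = V.T @ Z                                  # sources, up to order and sign

print(abs(np.corrcoef(Y[0], fetal)[0, 1]))   # ~1: fetal rhythm recovered
```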

  4. Thermal hydraulic analysis of the encapsulated nuclear heat source

    Energy Technology Data Exchange (ETDEWEB)

    Sienicki, J.J.; Wade, D.C. [Argonne National Lab., IL (United States)

    2001-07-01

    An analysis has been carried out of the steady state thermal hydraulic performance of the Encapsulated Nuclear Heat Source (ENHS) 125 MWt, heavy liquid metal coolant (HLMC) reactor concept at nominal operating power and shutdown decay heat levels. The analysis includes the development and application of correlation-type analytical solutions based upon first principles modeling of the ENHS concept that encompass both pure as well as gas injection augmented natural circulation conditions, and primary-to-intermediate coolant heat transfer. The results indicate that natural circulation of the primary coolant is effective in removing heat from the core and transferring it to the intermediate coolant without the attainment of excessive coolant temperatures. (authors)
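A first-principles balance of the kind underlying such correlation-type solutions equates the buoyancy head to the loop friction loss. Every number below is an invented placeholder for illustration, not ENHS design data:

```python
# Natural-circulation balance: buoyancy head rho*beta*g*dT*H equals
# the loop friction loss K*mdot**2 / (2*rho*A**2), with the core
# temperature rise dT = Q / (mdot*cp).  Eliminating dT gives
#   mdot = (2 * rho**2 * A**2 * beta * g * H * Q / (K * cp)) ** (1/3)

rho = 10500.0    # coolant density, kg/m^3 (heavy-liquid-metal-like)
beta = 1.2e-4    # volumetric expansion coefficient, 1/K
cp = 147.0       # specific heat, J/(kg K)
g = 9.81         # m/s^2
H = 10.0         # thermal-center elevation difference, m
A = 1.5          # loop flow area, m^2
K = 20.0         # lumped loop loss coefficient
Q = 125.0e6      # core power, W

mdot = (2.0 * rho**2 * A**2 * beta * g * H * Q / (K * cp)) ** (1.0 / 3.0)
dT = Q / (mdot * cp)
print(round(mdot), round(dT, 1))   # mass flow (kg/s) and core rise (K)
```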

  5. Car indoor air pollution - analysis of potential sources

    Directory of Open Access Journals (Sweden)

    Müller Daniel

    2011-12-01

    Full Text Available Abstract The population of industrialized countries such as the United States or countries of the European Union spends more than one hour each day in vehicles. In this respect, numerous studies have so far addressed outdoor air pollution that arises from traffic. By contrast, only little is known about indoor air quality in vehicles and the influence of non-vehicle sources. Therefore, the present article aims to summarize recent studies that address, e.g., particulate matter exposure. It can be stated that although a large amount of data is available for outdoor air pollution, research in the area of indoor air quality in vehicles is still limited. In particular, knowledge on non-vehicular sources is missing. In this respect, an understanding of the effects and interactions of, e.g., tobacco smoke under realistic automobile conditions should be achieved in the future.

  6. Development of in-vessel source term analysis code, tracer

    International Nuclear Information System (INIS)

    Miyagi, K.; Miyahara, S.

    1996-01-01

    Analyses of radionuclide transport in fuel failure accidents (generally referred to as source terms) are considered to be important, especially in severe accident evaluation. The TRACER code has been developed to realistically predict the time-dependent behavior of FPs and aerosols within the primary cooling system for a wide range of fuel failure events. This paper presents the model description, results of a validation study, the recent model advancement status of the code, and results of check-out calculations under reactor conditions. (author)

  7. Economic analysis of the need for advanced power sources

    International Nuclear Information System (INIS)

    Hardie, R.W.; Omberg, R.P.

    1975-01-01

    The purpose of this paper is to determine the economic need for an advanced power source, be it fusion, solar, or some other concept. However, calculations were also performed assuming abandonment of the LMFBR program, so breeder benefits are a by-product of this study. The model used was the ALPS linear programming system for forecasting optimum power growth patterns. Total power costs were calculated over a planning horizon from 1975 to 2041 and discounted at 7.5 percent. The benefit of a particular advanced power source is simply the reduction in total power cost resulting from its introduction. Since data concerning advanced power sources (APS) are speculative, parametric calculations varying introduction dates and capital costs about a hypothetical APS plant were performed. Calculations were also performed without the LMFBR to determine the effect of the breeder on the benefits of an advanced power source. Other data used in the study, such as the energy demand curve and uranium resource estimates, are given in the Appendix, and a list of the 11 power plants used in this study is given. Calculations were performed for APS introduction dates of 2001 and 2011. Estimates of APS capital costs included cases where it was assumed the costs were $50/kW and $25/kW higher than the LMFBR. In addition, cases where APS and LMFBR capital costs are identical were also considered. It is noted that the APS capital costs used in this study are not estimates of potential advanced power system plant costs, but were chosen to compute potential dollar benefits of advanced power systems under extremely optimistic assumptions. As a further example, all APS fuel cycle costs were assumed to be zero
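The core of such an optimum-growth calculation is cost minimization under demand and capacity constraints. For a single demand constraint with per-plant capacity bounds, the linear-program optimum reduces to merit-order dispatch, sketched here with invented costs, capacities, and demand (not ALPS inputs):

```python
# Merit-order toy of the optimum-mix problem: fill the cheapest
# plants first until demand is met.  All figures are hypothetical.

def dispatch(plants, demand):
    """plants: list of (name, cost $/MWh, capacity MWh)."""
    served, remaining = {}, demand
    for name, cost, cap in sorted(plants, key=lambda p: p[1]):
        served[name] = min(cap, remaining)
        remaining -= served[name]
    total = sum(served[name] * cost for name, cost, _ in plants)
    return served, total

plants = [("APS", 30.0, 100.0), ("LMFBR", 45.0, 200.0)]
mix, cost = dispatch(plants, 150.0)
print(mix, cost)   # cheapest plant runs at capacity; next covers the rest
```

The "benefit" of an advanced source in this framing is simply the difference in minimized total cost between runs with and without that plant available.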

  8. Error Analysis of CM Data Products: Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.
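One standard way to aggregate independent error sources into an overall product uncertainty is to combine relative standard uncertainties in quadrature (root-sum-square), as in the GUM for uncorrelated multiplicative terms. The stage names and percentages below are hypothetical, not FRMAC/CM figures:

```python
import math

# Root-sum-square aggregation of independent relative standard
# uncertainties from hypothetical stages of a data-product pipeline.

stages = {
    "field measurement": 0.10,       # 10% relative standard uncertainty
    "lab analysis": 0.05,
    "spatial interpolation": 0.15,
    "dose conversion": 0.08,
}

combined = math.sqrt(sum(u**2 for u in stages.values()))
print(f"combined relative uncertainty: {combined:.3f}")
```

Correlated stages would instead need covariance terms, which is exactly why the cross-skill-set collaboration described above matters.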

  9. Acoustic Source Analysis of Magnetoacoustic Tomography With Magnetic Induction for Conductivity Gradual-Varying Tissues.

    Science.gov (United States)

    Wang, Jiawei; Zhou, Yuqi; Sun, Xiaodong; Ma, Qingyu; Zhang, Dong

    2016-04-01

    As a multiphysics imaging approach, magnetoacoustic tomography with magnetic induction (MAT-MI) works on the physical mechanism of magnetic excitation, acoustic vibration, and transmission. Based on a theoretical analysis of the source vibration, numerical studies are conducted to simulate the pathological changes of tissues for a single-layer cylindrical conductivity gradual-varying model and to estimate the strengths of the sources inside the model. The results suggest that the inner source is generated by the product of the conductivity and the curl of the induced electric intensity inside conductivity-homogeneous medium, while the boundary source is produced by the cross product of the gradient of conductivity and the induced electric intensity at the conductivity boundary. For a biological tissue with low conductivity, the strength of the boundary source is much higher than that of the inner source only when the size of the conductivity transition zone is small. In this case, the tissue can be treated as a conductivity abrupt-varying model, ignoring the influence of the inner source. Otherwise, the contributions of inner and boundary sources should be evaluated together quantitatively. This study provides a basis for further study of precise image reconstruction of MAT-MI for pathological tissues.
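The dependence of the boundary source on the transition-zone size can be illustrated in one dimension: since that term scales with grad(sigma) x E, narrowing the conductivity transition concentrates and strengthens it. The profile shape and numbers below are illustrative only, not the paper's cylindrical model:

```python
import numpy as np

# Peak |grad(sigma) * E| along a radius for a smooth conductivity
# step of adjustable transition width centered at r = 1.

def peak_boundary_source(width, e_field=1.0, sigma_in=0.5, sigma_out=0.0):
    r = np.linspace(0.0, 2.0, 4001)
    # logistic step from sigma_in to sigma_out over ~`width`
    sigma = sigma_out + (sigma_in - sigma_out) / (1 + np.exp((r - 1.0) / (width / 6)))
    dsigma = np.gradient(sigma, r)
    return np.max(np.abs(dsigma * e_field))

wide, narrow = peak_boundary_source(0.5), peak_boundary_source(0.05)
print(narrow / wide)   # narrower transition -> much stronger boundary source
```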

  10. Vrancea seismic source analysis using a small-aperture array

    International Nuclear Information System (INIS)

    Popescu, E.; Popa, M.; Radulian, M.; Placinta, A.O.

    2005-01-01

    A small-aperture seismic array (BURAR) was installed in 1999 in the northern part of the Romanian territory (Bucovina area). Since then, the array has been in operation under a joint cooperation programme between Romania and the USA. The array consists of 10 stations installed in boreholes (nine short-period instruments and one broadband instrument), with sensitivity high enough to properly detect earthquakes generated in the Vrancea subcrustal domain (at about 250 km epicentral distance) with magnitude Mw below 3. Our main purpose is to investigate and calibrate the source parameters of the Vrancea intermediate-depth earthquakes using specific techniques provided by the BURAR array data. Forty earthquakes with magnitudes between 2.9 and 6.0 were selected, including the recent events of September 27, 2004 (45.70° N, 26.45° E, h = 166 km, Mw = 4.7), October 27, 2004 (45.84° N, 26.63° E, h = 105 km, Mw = 6.0) and May 14, 2005 (45.66° N, 26.52° E, h = 146 km, Mw = 5.1), which are the best ever recorded earthquakes on the Romanian territory. Empirical Green's function deconvolution and spectral ratio methods are applied to pairs of collocated events with similar focal mechanisms. Stability tests are performed for the retrieved source time function using the array elements. Empirical scaling and calibration relationships are also determined. Our study shows the capability of the BURAR array to determine the source parameters of the Vrancea intermediate-depth earthquakes as a stand-alone station, and proves that the recordings of this array alone provide reliable and useful tools to efficiently constrain the source parameters and, consequently, source scaling properties. (authors)
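Source-parameter scaling of this kind commonly passes through the Brune model: a spectral corner frequency fc gives a source radius r = k*beta/fc and a stress drop of (7/16)*M0/r^3. The inputs below are illustrative values, not results for the events listed above:

```python
# Brune-model point estimates of source radius and stress drop from
# seismic moment and corner frequency.  beta is the shear-wave speed
# and k = 0.37 the classic Brune constant; all inputs are examples.

def brune_stress_drop(m0, fc, beta=3500.0, k=0.37):
    """m0 in N*m, fc in Hz, beta in m/s -> (radius m, stress drop Pa)."""
    r = k * beta / fc
    return r, (7.0 / 16.0) * m0 / r**3

m0 = 10 ** (1.5 * 4.7 + 9.1)    # moment from Mw 4.7 (Kanamori relation)
radius, dsig = brune_stress_drop(m0, fc=2.0)
print(round(radius), round(dsig / 1e6, 1))   # radius (m), stress drop (MPa)
```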

  11. Application of Open Source Technologies for Oceanographic Data Analysis

    Science.gov (United States)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project, which uses NEXUS as the core of an oceanographic anomaly detection service and web portal. We call it OceanXtremes.
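The chunking idea described above, slicing a geo-referenced array into fixed-size tiles, each stored under a key a NoSQL store could look up by grid position, can be sketched minimally. The tile size and key scheme here are invented for illustration, not NEXUS's actual layout:

```python
import numpy as np

# Split a 2-D grid into tile-sized blocks keyed by tile indices, and
# reassemble it; a real store would map each key to a database row.

def to_tiles(grid, tile=4):
    tiles = {}
    for i in range(0, grid.shape[0], tile):
        for j in range(0, grid.shape[1], tile):
            tiles[(i // tile, j // tile)] = grid[i:i + tile, j:j + tile]
    return tiles

def from_tiles(tiles, shape, tile=4):
    grid = np.empty(shape)
    for (ti, tj), block in tiles.items():
        grid[ti * tile:ti * tile + block.shape[0],
             tj * tile:tj * tile + block.shape[1]] = block
    return grid

sst = np.random.default_rng(3).random((10, 12))  # fake sea-surface grid
tiles = to_tiles(sst)
print(len(tiles))   # 3 x 3 = 9 tiles
```

Subsetting or a time-series query then only needs to fetch the tiles whose keys intersect the requested region, which is what makes the approach horizontally scalable.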

  12. Prospects for accelerator neutron sources for large volume minerals analysis

    International Nuclear Information System (INIS)

    Clayton, C.G.; Spackman, R.

    1988-01-01

    The electron Linac can be regarded as a practical source of thermal neutrons for activation analysis of large-volume mineral samples. With a suitable target and moderator, a neutron flux of about 10^10 n/cm^2/s over 2-3 kg of rock can be generated. The proton Linac gives the possibility of a high neutron yield (> 10^12 n/s) of fast neutrons at selected energies. For the electron Linac, targets of W-U and W-Be are discussed. The advantages and limitations of the system are demonstrated for the analysis of gold in rocks and ores and for platinum in chromitite. These elements were selected as they are most likely to justify an accelerator installation at the present time. Errors due to self-shielding in gold particles for thermal neutrons are discussed. The proton Linac is considered for neutrons generated from a lithium target through the 7Li(p,n)7Be reaction. The analysis of gold by fast neutron activation is considered. This approach avoids particle self-absorption and, by appropriate proton energy selection, avoids potentially dominating interfering reactions. The analysis of 235U in the presence of 238U and 232Th is also considered. (author)
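The induced activity behind any such activation measurement follows the standard saturation-activation equation A = N·sigma·phi·(1 − exp(−lambda·t)). A sketch for gold at a flux of the order quoted above, with round-number nuclear data and an arbitrary sample mass and irradiation time:

```python
import math

# Saturation activation of 197Au(n,gamma)198Au at thermal energies.
# Cross section, half-life, and flux are round literature-style
# values; mass and irradiation time are arbitrary examples.

N_A = 6.022e23
mass = 1e-3                     # 1 mg of gold in the sample
sigma = 98.7e-24                # thermal (n,gamma) cross section, cm^2
phi = 1e10                      # neutron flux, n/cm^2/s
half_life = 2.695 * 86400.0     # 198Au half-life, s
lam = math.log(2) / half_life

N = mass / 196.97 * N_A         # number of gold atoms
t_irr = 3600.0                  # 1 h irradiation
A = N * sigma * phi * (1.0 - math.exp(-lam * t_irr))
print(f"induced activity: {A:.3e} Bq")
```

Because 198Au's half-life is days, a 1 h irradiation reaches only about 1% of saturation, which is why irradiation time trades directly against detection limits.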

  13. Sustainability in Open Source Software Commons: Lessons Learned from an Empirical Study of SourceForge Projects

    Directory of Open Access Journals (Sweden)

    Charles M. Schweik

    2013-01-01

    Full Text Available In this article, we summarize a five-year US National Science Foundation funded study designed to investigate the factors that lead some open source projects to ongoing collaborative success while many others become abandoned. Our primary interest was to conduct a study that was closely representative of the population of open source software projects in the world, rather than focus on the more-often studied, high-profile successful cases. After building a large database of projects (n = 174,333) and implementing a major survey of open source developers (n = 1,403), we were able to conduct statistical analyses to investigate over forty theoretically-based testable hypotheses. Our data firmly support what we call the conventional theory of open source software, showing that projects start small, and, in successful cases, grow slightly larger in terms of team size. We describe the “virtuous circle” supporting conventional wisdom of open source collaboration that comes out of this analysis, and we discuss two other interesting findings related to developer motivations and how team members find each other. Each of these findings is related to the sustainability of these projects.

  14. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Science.gov (United States)

    Muthuraman, Muthuraman; Hellriegel, Helge; Hoogenboom, Nienke; Anwar, Abdul Rauf; Mideksa, Kidist Gebremariam; Krause, Holger; Schnitzler, Alfons; Deuschl, Günther; Raethjen, Jan

    2014-01-01

    Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.

  15. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Directory of Open Access Journals (Sweden)

    Muthuraman Muthuraman

    Full Text Available Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.

  16. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

    The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Through this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because sensitivity needs for many applications are not severe, they can be met using an ∼6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yield determination by activation of carriers, and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (∼2 x 10^7 n/cm^2·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. However, because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than digestion-based methods when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes

  17. Emotion impairs extrinsic source memory--An ERP study.

    Science.gov (United States)

    Mao, Xinrui; You, Yuqi; Li, Wen; Guo, Chunyan

    2015-09-01

    Substantial advancements in understanding emotional modulation of item memory notwithstanding, controversies remain as to how emotion influences source memory. Using an emotional extrinsic source memory paradigm combined with remember/know judgments and two key event-related potentials (ERPs), the FN400 (a frontal potential at 300-500 ms related to familiarity) and the LPC (a later parietal potential at 500-700 ms related to recollection), our research investigated the impact of emotion on extrinsic source memory and the underlying processes. We varied a semantic prompt (either "people" or "scene") preceding a study item to manipulate the extrinsic source. Behavioral data indicated a significant effect of emotion on "remember" responses to extrinsic source details, suggesting impaired recollection-based source memory in emotional (both positive and negative) relative to neutral conditions. In parallel, differential FN400 and LPC amplitudes (correctly remembered minus incorrectly remembered sources) revealed emotion-related interference, suggesting impaired familiarity and recollection memory of extrinsic sources associated with positive or negative items. These findings thus lend support to the notion of an emotion-induced memory trade-off: while enhancing memory of central items and intrinsic/integral source details, emotion nevertheless disrupts memory of peripheral contextual details, potentially impairing both familiarity and recollection. Importantly, the finding that positive and negative items result in comparable memory impairment suggests that arousal (vs. affective valence) plays a critical role in modulating dynamic interactions among automatic and elaborate processes involved in memory. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Limit of detection in the presence of instrumental and non-instrumental errors: study of the possible sources of error and application to the analysis of 41 elements at trace levels by inductively coupled plasma-mass spectrometry technique

    International Nuclear Information System (INIS)

    Badocco, Denis; Lavagnini, Irma; Mondin, Andrea; Tapparo, Andrea; Pastore, Paolo

    2015-01-01

    In this paper, the detection limit was estimated for signals affected by two error contributions, namely instrumental errors and operational (non-instrumental) errors. The detection limit was obtained theoretically following the hypothesis-testing schema implemented with the calibration-curve methodology. The experimental calibration design was based on J standards measured I times, with non-instrumental errors affecting each standard systematically but randomly among the J levels. A two-component variance regression was performed to determine the calibration curve and to define the detection limit under these conditions. The detection limit values obtained from the calibration of 41 elements at trace levels by ICP-MS were larger than those obtained from a one-component variance regression. The role of reagent impurities in the instrumental errors was ascertained and taken into account. Environmental pollution was studied as a source of non-instrumental errors, and its role was evaluated by the Principal Component Analysis (PCA) technique applied to a series of nine calibrations performed over fourteen months. The influence of the seasonality of environmental pollution on the detection limit was evident for many elements usually present in urban airborne particulate. The results clearly indicated the need to use the two-component variance regression approach for the calibration of all elements usually present in the environment at significant concentration levels. - Highlights: • Limit of detection was obtained considering a two-component variance regression. • Calibration data may be affected by instrumental and operational-condition errors. • Calibration model was applied to determine 41 elements at trace level by ICP-MS. • Non-instrumental errors were evidenced by PCA analysis
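The abstract's central point, that ignoring the between-standard (operational) variance component understates the detection limit, can be illustrated schematically. All numbers below are invented; the 3.29 factor is the usual hypothesis-testing form with alpha = beta = 0.05, and for clarity the operational SD is treated as known rather than estimated from the fit:

```python
import random
import statistics

random.seed(1)
slope, intercept = 100.0, 5.0
levels = [0, 1, 2, 5, 10]        # J = 5 standards (concentration units)
n_rep = 4                        # I = 4 replicates per standard
s_instr, s_oper = 2.0, 6.0       # instrumental and operational SDs (signal units)

# Operational error shifts each standard systematically but randomly among levels
shifts = {c: random.gauss(0, s_oper) for c in levels}
data = [(c, intercept + slope * c + shifts[c] + random.gauss(0, s_instr))
        for c in levels for _ in range(n_rep)]

# Within-level scatter estimates the instrumental variance only
within = statistics.mean(
    statistics.variance([y for c2, y in data if c2 == c]) for c in levels)

# LOD ~ 3.29 * s0 / slope; the two-component form adds the operational variance
lod_instr_only = 3.29 * within ** 0.5 / slope
lod_two_comp = 3.29 * (within + s_oper ** 2) ** 0.5 / slope
```

Whatever the random draw, `lod_two_comp` exceeds `lod_instr_only`, mirroring the paper's finding that the two-component regression yields larger (more honest) detection limits.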

  19. Engagement as a source of positive consumer behaviour: a study ...

    African Journals Online (AJOL)

    Engagement as a source of positive consumer behaviour: a study amongst South African football fans. ... Remember me ... Further, the potential of fan engagement as a predictor of positive consumer behaviours (match attendance and ...

  20. System optimization for continuous on-stream elemental analysis using low-output isotopic neutron sources

    International Nuclear Information System (INIS)

    Rizk, R.A.M.

    1989-01-01

    In continuous on-stream neutron activation analysis, the material to be analyzed may be continuously recirculated in a closed loop system between an activation source and a shielded detector. In this paper an analytical formulation of the detector response for such a system is presented. This formulation should be useful in optimizing the system design parameters for specific applications. A study has been made of all parameters that influence the detector response during on-stream analysis. Feasibility applications of the method to solutions of manganese and vanadium using a 5-μg 252Cf neutron source are demonstrated. (author)

  1. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org

  2. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Shukri Mohd

    2013-01-01

    Full-text: Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed Wavelet Transform analysis and Modal Location (WTML), based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with that of the DeltaT location method but requires no initial acoustic calibration of the structure. (author)

  3. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with that of the DeltaT location method but requires no initial acoustic calibration of the structure

  4. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Mohd, Shukri [Nondestructive Testing Group, Industrial Technology Division, Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Holford, Karen M.; Pullin, Rhys [Cardiff School of Engineering, Cardiff University, Queen's Buildings, The Parade, CARDIFF CF24 3AA (United Kingdom)

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with that of the DeltaT location method but requires no initial acoustic calibration of the structure.

  5. Phase 2 safety analysis report: National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Stefan, P.

    1989-06-01

    The Phase II program was established in order to provide additional space for experiments, and also staging and equipment storage areas. It also provides additional office space and new types of advanced instrumentation for users. This document will deal with the new safety issues resulting from this extensive expansion program, and should be used as a supplement to BNL Report No. 51584, "National Synchrotron Light Source Safety Analysis Report," July 1982 (hereafter referred to as the Phase I SAR). The initial NSLS facility is described in the Phase I SAR. It comprises two electron storage rings, an injection system common to both, experimental beam lines and equipment, and office and support areas, all of which are housed in a 74,000 sq. ft. building. The X-ray Ring provides for 28 primary beam ports and the VUV Ring, 16. Each port is capable of division into 2 or 3 separate beam lines. All ports receive their synchrotron light from conventional bending magnet sources, the magnets being part of the storage ring lattice. 4 refs

  6. Thermal-hydraulic studies of the Advanced Neutron Source cold source

    International Nuclear Information System (INIS)

    Williams, P.T.; Lucas, A.T.

    1995-08-01

    The Advanced Neutron Source (ANS), in its conceptual design phase at Oak Ridge National Laboratory, was to be a user-oriented neutron research facility producing the most intense steady-state flux of thermal and cold neutrons in the world. Among its many scientific applications, the production of cold neutrons was a significant research mission for the ANS. The cold neutrons come from two independent cold sources positioned near the reactor core. Contained by an aluminum alloy vessel, each cold source is a 410-mm-diam sphere of liquid deuterium that functions both as a neutron moderator and a cryogenic coolant. With nuclear heating of the containment vessel and internal baffling, steady-state operation requires close control of the liquid deuterium flow near the vessel's inner surface. Preliminary thermal-hydraulic analyses supporting the cold source design were performed with heat conduction simulations of the vessel walls and multidimensional computational fluid dynamics simulations of the liquid deuterium flow and heat transfer. This report presents the starting phase of a challenging program and describes the cold source conceptual design, the thermal-hydraulic feasibility studies of the containment vessel, and the future computational and experimental studies that were planned to verify the final design

  7. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor, k_eff, has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, k_eff values between 0.92 and 0.5 were satisfactorily determined using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. Further development of the method will require experiments and the development of theoretical methods to predict the experimental observables

  8. Evaluating source separation of plastic waste using conjoint analysis.

    Science.gov (United States)

    Nakatani, Jun; Aramaki, Toshiya; Hanaki, Keisuke

    2008-11-01

    Using conjoint analysis, we estimated the willingness to pay (WTP) of households for source separation of plastic waste and the improvement of related environmental impacts, the residents' loss of life expectancy (LLE), the landfill capacity, and the CO2 emissions. Unreliable respondents were identified and removed from the sample based on their answers to follow-up questions. It was found that the utility associated with reducing LLE and with the landfill capacity were both well expressed by logarithmic functions, but that residents were indifferent to the level of CO2 emissions even though they approved of CO2 reduction. In addition, residents derived utility from the act of separating plastic waste, irrespective of its environmental impacts; that is, they were willing to practice the separation of plastic waste at home in anticipation of its "invisible effects", such as the improvement of citizens' attitudes toward solid waste issues.

  9. The adoption of total cost of ownership for sourcing decisions - a structural equations analysis

    NARCIS (Netherlands)

    Wouters, Marc; Anderson, James C.; Wynstra, Finn

    2005-01-01

    This study investigates the adoption of total cost of ownership (TCO) analysis to improve sourcing decisions. TCO can be seen as an application of activity based costing (ABC) that quantifies the costs that are involved in acquiring and using purchased goods or services. TCO supports purchasing

  10. Analysis of filtration properties of locally sourced base oil for the ...

    African Journals Online (AJOL)

    This study examines the use of locally sourced oil like, groundnut oil, melon oil, vegetable oil, soya oil and palm oil as substitute for diesel oil in formulating oil base drilling fluids relative to filtration properties. The filtrate volumes of each of the oils were obtained for filtration control analysis. With increasing potash and ...

  11. Experimental study of adsorption chiller driven by variable heat source

    Energy Technology Data Exchange (ETDEWEB)

    Wang, D.C.; Wang, Y.J.; Zhang, J.P.; Tian, X.L. [College of Electromechanical Engineering, Qingdao University, Qingdao 266071 (China); Wu, J.Y. [Institute of Refrigeration and Cryogenics, Shanghai Jiao Tong University, Shanghai 200030 (China)

    2008-05-15

    A silica gel-water adsorption chiller has been developed in recent years and has been applied in an air conditioning system driven by solar energy. The heat source used to drive the adsorption chiller is variable at any moment because the solar radiation intensity or the waste heat from engines varies frequently. An adsorption cooling system may be badly impacted by a variable heat source with temperature variations in a large range. In this work, a silica gel-water adsorption chiller driven by a variable heat source is experimentally studied. The influences of the variable heat source on the performance of the chiller are analyzed, especially for a continuous temperature increase process and a continuous temperature decrease process of the heat source. As an example, the dynamic characteristics of the heat source are also analyzed when solar energy is taken as the heat source of the adsorption chiller. According to the experimental results for the adsorption chiller and the characteristics of the heat source from solar energy, control strategies of the adsorption chiller driven by solar energy are proposed. (author)

  12. Experimental study of adsorption chiller driven by variable heat source

    International Nuclear Information System (INIS)

    Wang, D.C.; Wang, Y.J.; Zhang, J.P.; Tian, X.L.; Wu, J.Y.

    2008-01-01

    A silica gel-water adsorption chiller has been developed in recent years and has been applied in an air conditioning system driven by solar energy. The heat source used to drive the adsorption chiller is variable at any moment because the solar radiation intensity or the waste heat from engines varies frequently. An adsorption cooling system may be badly impacted by a variable heat source with temperature variations in a large range. In this work, a silica gel-water adsorption chiller driven by a variable heat source is experimentally studied. The influences of the variable heat source on the performance of the chiller are analyzed, especially for a continuous temperature increase process and a continuous temperature decrease process of the heat source. As an example, the dynamic characteristics of the heat source are also analyzed when solar energy is taken as the heat source of the adsorption chiller. According to the experimental results for the adsorption chiller and the characteristics of the heat source from solar energy, control strategies of the adsorption chiller driven by solar energy are proposed

  13. Finite element analysis of advanced neutron source fuel plates

    International Nuclear Information System (INIS)

    Luttrell, C.R.

    1995-08-01

    The proposed design for the Advanced Neutron Source reactor core consists of closely spaced involute fuel plates. Coolant flows between the plates at high velocities. It is vital that adjacent plates do not come in contact and that the coolant channels between the plates remain open. Several scenarios that could result in problems with the fuel plates are studied. Finite element analyses are performed on fuel plates under pressure from the coolant flowing between the plates at a high velocity, under pressure because of a partial flow blockage in one of the channels, and with different temperature profiles

  14. Surface-Source Downhole Seismic Analysis in R

    Science.gov (United States)

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface-selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram.
The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
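The core reduction in this kind of surface-source downhole analysis, converting slant travel times to equivalent vertical times under a straight-ray assumption and differencing them between receiver depths to get interval slownesses, can be sketched as follows. This is a synthetic two-layer illustration, not the actual R routines described in the report:

```python
import math

def to_vertical(t_measured, depth, offset):
    """Straight-ray correction of a slant travel time to an equivalent
    vertical travel time; offset is the horizontal source-to-borehole
    distance at the surface."""
    return t_measured * depth / math.hypot(depth, offset)

def interval_slowness(depths, t_vert):
    """Interval slowness (s/m) between successive receiver depths."""
    return [(t2 - t1) / (z2 - z1)
            for z1, z2, t1, t2 in zip(depths, depths[1:], t_vert, t_vert[1:])]

# Synthetic two-layer model: v = 200 m/s above 10 m depth, 400 m/s below.
offset = 2.0
depths = [5.0, 10.0, 15.0, 20.0]
true_tv = [5 / 200, 10 / 200, 10 / 200 + 5 / 400, 10 / 200 + 10 / 400]
# Forward-model the slant times a surface source would produce ...
t_meas = [tv * math.hypot(z, offset) / z for z, tv in zip(depths, true_tv)]
# ... then reduce them back to vertical and difference to recover slowness.
tv = [to_vertical(t, z, offset) for t, z in zip(t_meas, depths)]
s = interval_slowness(depths, tv)   # recovers [1/200, 1/400, 1/400]
```

The real method fits a layered model to (possibly noisy, weighted) picks rather than differencing exact times, but the slant-to-vertical reduction is the same.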

  15. Fire Hazard Analysis for the Cold Neutron Source System

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jung Won; Kim, Young Ki; Wu, Sang Ik; Park, Young Cheol; Kim, Bong Soo; Kang, Mee Jin; Oh, Sung Wook

    2006-04-15

    As the Cold Neutron Source (CNS) System has been designed for installation in HANARO, a fire hazard analysis of the CNS system is required under MOST notice No. 2003-20, the Technical Standard for Fire Hazard Analysis. The hydrogen system of the CNS is filled with highly flammable hydrogen, which serves as the moderator. Accordingly, the physical damage that a fire or explosion in the reactor hall could inflict on the reactor safety system should be evaluated so that appropriate safety precautions can be reflected in the design of the CNS system. For the fire hazard analysis, the accident scenarios were divided into three: a hydrogen leak while charging hydrogen into the system, a hydrogen leak during normal CNS operation, and an explosion of the hydrogen buffer tank caused by an external fire. The analysis results can be summarized as follows. First, there is no physical damage threatening the reactor safety system even if all the hydrogen gas escapes from the system and ignites as a jet fire. Second, since the CNS equipment island (CEI) is located far enough from the reactor, a buffer tank explosion causes no physical damage to the reactor in terms of overpressure, apart from flying debris, so a light two-hour fireproof panel is installed on one side of the hydrogen buffer tank. Third, there are few combustibles on the second floor of the CEI, so a fire cannot propagate to other areas in the reactor hall; nevertheless, a light two-hour fireproof panel will be built on the second floor against external or internal fires so that it serves as a fire protection area.

  16. Fire Hazard Analysis for the Cold Neutron Source System

    International Nuclear Information System (INIS)

    Choi, Jung Won; Kim, Young Ki; Wu, Sang Ik; Park, Young Cheol; Kim, Bong Soo; Kang, Mee Jin; Oh, Sung Wook

    2006-04-01

    As the Cold Neutron Source (CNS) System has been designed for installation in HANARO, a fire hazard analysis of the CNS system is required under MOST notice No. 2003-20, the Technical Standard for Fire Hazard Analysis. The hydrogen system of the CNS is filled with highly flammable hydrogen, which serves as the moderator. Accordingly, the physical damage that a fire or explosion in the reactor hall could inflict on the reactor safety system should be evaluated so that appropriate safety precautions can be reflected in the design of the CNS system. For the fire hazard analysis, the accident scenarios were divided into three: a hydrogen leak while charging hydrogen into the system, a hydrogen leak during normal CNS operation, and an explosion of the hydrogen buffer tank caused by an external fire. The analysis results can be summarized as follows. First, there is no physical damage threatening the reactor safety system even if all the hydrogen gas escapes from the system and ignites as a jet fire. Second, since the CNS equipment island (CEI) is located far enough from the reactor, a buffer tank explosion causes no physical damage to the reactor in terms of overpressure, apart from flying debris, so a light two-hour fireproof panel is installed on one side of the hydrogen buffer tank. Third, there are few combustibles on the second floor of the CEI, so a fire cannot propagate to other areas in the reactor hall; nevertheless, a light two-hour fireproof panel will be built on the second floor against external or internal fires so that it serves as a fire protection area

  17. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    Science.gov (United States)

    2016-06-15

    Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies. LCDR Jamal M. Osman, USN. Acquisition Research Program Sponsored Report Series, 15 June 2016. Cited references include: Lamoureux, J., Murrow, M., & Walls, C. (2015). Relationship of source selection methods to contract outcomes: an analysis ...

  18. Comparative studies of energy sources in gynecologic laparoscopy.

    Science.gov (United States)

    Law, Kenneth S K; Lyons, Stephen D

    2013-01-01

    Energy sources incorporating "vessel sealing" capabilities are being increasingly used in gynecologic laparoscopic surgery although conventional monopolar and bipolar electrosurgery remain popular. The preference for one device over another is based on a combination of factors, including the surgeon's subjective experience, availability, and cost. Although comparative clinical studies and meta-analyses of laparoscopic energy sources have reported small but statistically significant differences in volumes of blood loss, the clinical significance of such small volumes is questionable. The overall usefulness of the various energy sources available will depend on a number of factors including vessel burst pressure and seal time, lateral thermal spread, and smoke production. Animal studies and laboratory-based trials are useful in providing a controlled environment to investigate such parameters. At present, there is insufficient evidence to support the use of one energy source over another. Copyright © 2013 AAGL. All rights reserved.

  19. Validation of the direct analysis in real time source for use in forensic drug screening.

    Science.gov (United States)

    Steiner, Robert R; Larson, Robyn L

    2009-05-01

    The Direct Analysis in Real Time (DART) ion source is a relatively new mass spectrometry technique that is seeing widespread use in chemical analyses world-wide. DART studies include such diverse topics as analysis of flavors and fragrances, melamine in contaminated dog food, differentiation of writing inks, characterization of solid counterfeit drugs, and as a detector for planar chromatography. Validation of this new technique for the rapid screening of forensic evidence for drugs of abuse, utilizing the DART source coupled to an accurate mass time-of-flight mass spectrometer, was conducted. The study consisted of the determination of the lower limit of detection for the method, determination of selectivity and a comparison of this technique to established analytical protocols. Examples of DART spectra are included. The results of this study have allowed the Virginia Department of Forensic Science to incorporate this new technique into their analysis scheme for the screening of solid dosage forms of drugs of abuse.

  20. Barrow Black Carbon Source and Impact Study Final Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, Tate

    2014-07-01

    The goal of the Barrow Black Carbon Source and Impact campaign was to characterize the concentration and isotopic composition of carbonaceous atmospheric particulate matter (PM) at the Atmospheric Radiation Measurement (ARM) Climate Research Facility site in Barrow, Alaska. The carbonaceous component was characterized by measuring the organic and black carbon (OC and BC) components of the total PM. To facilitate complete characterization of the PM, filter-based collections were used, including a medium volume PM2.5 sampler and a high volume PM10 sampler. Thirty-eight fine PM fractions (PM2.5) and 49 coarse (PM10) PM fractions were collected at weekly and bi-monthly intervals. The PM2.5 sampler operated with minimal maintenance during the 12 month campaign. The PM10 sampler used for the Barrow Black Carbon Source and Impact (BBCSI) study used standard Tisch “hi-vol” motors that have a known lifetime of approximately 1 month under constant use; this necessitated monthly maintenance, and it is suggested that, for future deployment in the Arctic, the motors be upgraded to industrial blowers. The BBCSI sampling campaign successfully collected and archived 87 ambient atmospheric PM samples from Barrow, Alaska, from July 2012 to June 2013. Preliminary analysis of the OC and BC concentrations has been completed. This campaign confirmed known trends of high BC lasting from the winter through to spring haze periods and low BC concentrations in the summer. However, the annual OC concentrations had a very different seasonal pattern with the highest concentrations during the summer, lowest concentrations during the fall, and increased concentrations during the winter and spring (Figure 1).

  1. The analysis of security cost for different energy sources

    International Nuclear Information System (INIS)

    Jun, Eunju; Kim, Wonjoon; Chang, Soon Heung

    2009-01-01

    Global concerns for the security of energy have steadily been on the increase and are expected to become a major issue over the next few decades. Urgent policy response is thus essential. However, little attempt has been made at defining both energy security and energy metrics. In this study, we provide such metrics and apply them to four major energy sources in the Korean electricity market: coal, oil, liquefied natural gas, and nuclear. In our approach, we measure the cost of energy security in terms of supply disruption and price volatility, and we consider the degree of concentration in energy supply and demand using the Hirschman-Herfindahl index (HHI). Due to its balanced fuel supply and demand, relatively stable price, and high abundance, we find nuclear energy to be the most competitive energy source in terms of energy security in the Korean electricity market. LNG, on the other hand, was found to have the highest cost in terms of energy security due to its high concentration in supply and demand, and its high price volatility. In addition, in terms of cost, we find that economic security dominates supply security, and as such, it is the main factor in the total security cost. Within the confines of concern for global energy security, our study both broadens our understanding of energy security and enables a strategic approach in the portfolio management of energy consumption.
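The concentration measure used above, the Hirschman-Herfindahl index, is simply the sum of squared market shares. A minimal sketch with invented supplier shares (not the Korean market data from the study):

```python
def hhi(shares):
    """Hirschman-Herfindahl index: the sum of squared shares.
    Shares are normalized to fractions; HHI ranges from 1/n
    (n equal participants) up to 1 (a single monopoly supplier)."""
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

# Hypothetical supply portfolios: a diversified fuel vs. a concentrated one
diversified = hhi([0.25, 0.25, 0.25, 0.25])   # four equal suppliers -> 0.25
concentrated = hhi([0.70, 0.20, 0.10])        # one dominant supplier -> 0.54
```

A higher HHI signals greater concentration and hence, in the study's framework, a larger supply-security cost component.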

  2. Earthquake Source Spectral Study beyond the Omega-Square Model

    Science.gov (United States)

    Uchide, T.; Imanishi, K.

    2017-12-01

Earthquake source spectra have been used to characterize earthquake source processes quantitatively and, at the same time, simply, so that we can analyze the source spectra of many earthquakes, especially small ones, at once and compare them with each other. A standard model for the source spectra is the omega-square model, which has a flat spectrum at low frequencies and a falloff inversely proportional to the square of frequency at high frequencies, the two regimes bordered by a corner frequency. The corner frequency has often been converted to the stress drop under the assumption of circular crack models. However, recent studies claimed the existence of another corner frequency [Denolle and Shearer, 2016; Uchide and Imanishi, 2016], thanks to the recent development of seismic networks. We have found that many earthquakes in areas other than the area studied by Uchide and Imanishi [2016] also have source spectra deviating from the omega-square model. Another part of the earthquake spectra we now focus on is the falloff rate at high frequencies, which affects the seismic energy estimation [e.g., Hirano and Yagi, 2017]. In June 2016, we deployed seven velocity seismometers in the northern Ibaraki prefecture, where shallow crustal seismicity, mainly normal-faulting events, was activated by the 2011 Tohoku-oki earthquake. We have recorded seismograms at 1000 samples per second and at short distances from the sources, so that we can investigate the high-frequency components of the earthquake source spectra. Although we are still in the stage of discovery and confirmation of the deviation from the standard omega-square model, updating the earthquake source spectrum model will help us systematically extract more information on the earthquake source process.
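The omega-square model discussed above is commonly written as S(f) = Ω0 / (1 + (f/fc)^n) with falloff exponent n = 2. A minimal sketch of this spectral shape (the parameter values are illustrative only):

```python
import numpy as np

def source_spectrum(f, omega0, fc, n=2.0):
    """Omega-square-type source displacement spectrum: a flat level
    omega0 below the corner frequency fc, falling off as f**-n above
    it (n = 2 is the standard omega-square model)."""
    f = np.asarray(f, dtype=float)
    return omega0 / (1.0 + (f / fc) ** n)

f = np.logspace(-1, 2, 200)                 # 0.1 to 100 Hz
s = source_spectrum(f, omega0=1.0, fc=5.0)  # flat below ~5 Hz
# Well above fc the amplitude drops two decades per decade of
# frequency; deviations from this slope (or a second corner) are what
# high-sample-rate, short-distance recordings can resolve.
```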

  3. A Study of Porphyrins in Petroleum Source Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Huseby, Berit

    1997-12-31

This thesis discusses several aspects of porphyrin geochemistry. Degradation experiments were performed on the Messel oil shale (Eocene, Germany) to obtain information on porphyrins bound or incorporated into macromolecular structures. Thermal heating of the preextracted kerogen by hydrous pyrolysis was used to study the release of porphyrins and their temperature-dependent changes during simulated diagenesis and catagenesis. Selective chemical degradation experiments were performed on the preextracted sediment to obtain more detailed information about porphyrins that are specifically bound to the macromolecular structures via ester bonds. From the heating experiments, in a separate study, the porphyrin nitrogen content in the generated bitumens was compared to the bulk of organic nitrogen compounds in the fraction. The bulk nitrogen contents in the generated bitumens, the water phase, and the residual organic matter were recorded to establish the distribution of nitrogen between the kerogen and product phases. Porphyrins as biomarkers were examined in naturally matured Kimmeridge Clay source rocks (Upper Jurassic, Norway), and the use of porphyrins as general indicators of maturity was evaluated. Underlying maturity trends in the biomarker data were investigated by Partial Least Squares analysis. Porphyrins as indicators of depositional conditions were also addressed: the correlations between the abundances of nickel and vanadyl porphyrins were mapped together with other descriptors assumed to be indicative of redox depositional conditions. 252 refs., 28 figs., 4 tabs.

  4. A Study of Porphyrins in Petroleum Source Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Huseby, Berit

    1996-12-31

This thesis discusses several aspects of porphyrin geochemistry. Degradation experiments were performed on the Messel oil shale (Eocene, Germany) to obtain information on porphyrins bound or incorporated into macromolecular structures. Thermal heating of the preextracted kerogen by hydrous pyrolysis was used to study the release of porphyrins and their temperature-dependent changes during simulated diagenesis and catagenesis. Selective chemical degradation experiments were performed on the preextracted sediment to obtain more detailed information about porphyrins that are specifically bound to the macromolecular structures via ester bonds. From the heating experiments, in a separate study, the porphyrin nitrogen content in the generated bitumens was compared to the bulk of organic nitrogen compounds in the fraction. The bulk nitrogen contents in the generated bitumens, the water phase, and the residual organic matter were recorded to establish the distribution of nitrogen between the kerogen and product phases. Porphyrins as biomarkers were examined in naturally matured Kimmeridge Clay source rocks (Upper Jurassic, Norway), and the use of porphyrins as general indicators of maturity was evaluated. Underlying maturity trends in the biomarker data were investigated by Partial Least Squares analysis. Porphyrins as indicators of depositional conditions were also addressed: the correlations between the abundances of nickel and vanadyl porphyrins were mapped together with other descriptors assumed to be indicative of redox depositional conditions. 252 refs., 28 figs., 4 tabs.

  5. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single one. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data.
Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  6. PM10 source apportionment study in Pleasant Valley, Nevada

    International Nuclear Information System (INIS)

    Egami, R.T.; Chow, J.C.; Watson, J.G.; DeLong, T.

    1990-01-01

A source apportionment study was conducted between March 18 and April 4, 1988, at Pleasant Valley, Nevada, to evaluate the air pollutant concentrations to which community residents were exposed and the source contributions to those pollutants. Daily PM10 samples were taken for chemical speciation of 40 trace elements, ions, and organic and elemental carbon. The objectives of this case study were: to determine the emissions source composition of the potential upwind source, a geothermal plant; to measure the ambient particulate concentration and its chemical characteristics in Pleasant Valley; and to estimate the contributions of different emissions sources to PM10. The study found that: particulate emissions from the geothermal cooling-tower plume consisted primarily of sulfate, ammonia, chloride, and trace elements; no significant quantities of toxic inorganic species were found in the ambient air; ambient PM10 concentrations in Pleasant Valley were within Federal standards; and source contributions to PM10 were approximately 60% geological material, 20% motor vehicle exhaust, and 10% cooling-tower plume

  7. Uncertainty assessment of source attribution of PM(2.5) and its water-soluble organic carbon content using different biomass burning tracers in positive matrix factorization analysis--a case study in Beijing, China.

    Science.gov (United States)

    Tao, Jun; Zhang, Leiming; Zhang, Renjian; Wu, Yunfei; Zhang, Zhisheng; Zhang, Xiaoling; Tang, Yixi; Cao, Junji; Zhang, Yuanhang

    2016-02-01

Daily PM2.5 samples were collected at an urban site in Beijing during four one-month periods in 2009-2010, with each period in a different season. Samples were subject to chemical analysis for various chemical components, including major water-soluble ions, organic carbon (OC) and water-soluble organic carbon (WSOC), elemental carbon (EC), trace elements, the anhydrosugar levoglucosan (LG), and mannosan (MN). Three sets of source profiles of PM2.5 were first identified through positive matrix factorization (PMF) analysis using single or combined biomass tracers - non-sea-salt potassium (nss-K(+)), LG, and a combination of nss-K(+) and LG. The six major source factors of PM2.5 included secondary inorganic aerosol, industrial pollution, soil dust, biomass burning, traffic emission, and coal burning, which were estimated to contribute 31±37%, 39±28%, 14±14%, 7±7%, 5±6%, and 4±8%, respectively, to PM2.5 mass if using the nss-K(+) source profiles; 22±19%, 29±17%, 20±20%, 13±13%, 12±10%, and 4±6%, respectively, if using the LG source profiles; and 21±17%, 31±18%, 19±19%, 11±12%, 14±11%, and 4±6%, respectively, if using the combined nss-K(+) and LG source profiles. The uncertainties in the estimation of biomass burning contributions to WSOC due to the different choices of biomass burning tracers were around 3% annually and up to 24% seasonally in terms of absolute percentage contributions, or a factor of 1.7 annually and up to a factor of 3.3 seasonally in terms of the actual concentrations. The uncertainty from the major source (e.g., industrial pollution) was a factor of 1.9 annually and up to a factor of 2.5 seasonally in the estimated WSOC concentrations. Copyright © 2015 Elsevier B.V. All rights reserved.
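PMF, as used in the study above, factors the samples-by-species concentration matrix into non-negative source contributions and source profiles. A minimal sketch of the underlying non-negative factorization on synthetic data, using plain Lee-Seung multiplicative updates (true PMF additionally weights each residual by its measurement uncertainty, which this sketch omits):

```python
import numpy as np

def nmf(X, k, iters=2000, seed=0, eps=1e-9):
    """Factor X (samples x species, non-negative) as G @ F with
    G >= 0 ("source contributions") and F >= 0 ("source profiles"),
    using Lee-Seung multiplicative updates for squared error."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k)) + eps
    F = rng.random((k, m)) + eps
    for _ in range(iters):
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Synthetic example: 3 hidden "sources" mixed into 40 samples of
# 10 chemical species (random data, not the Beijing measurements).
rng = np.random.default_rng(1)
X = rng.random((40, 3)) @ rng.random((3, 10))
G, F = nmf(X, k=3)
rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)  # small
```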

  8. Apportionment of sources affecting water quality: Case study of Kandla Creek, Gulf of Katchchh

    Digital Repository Service at National Institute of Oceanography (India)

    Dalal, S.G.; Shirodkar, P.V.; Verlekar, X.N.; Jagtap, T.G.; Rao, G.S.

status of the environment. Several multivariate models are used for source apportionment studies, as they pinpoint the possible factors or sources that influence the water quality (Morales et al., 1999; Wunderlin et al., 2001; Petersen et al., 2001).

  9. Review on solving the forward problem in EEG source analysis

    Directory of Open Access Journals (Sweden)

    Vergult Anneleen

    2007-11-01

Background The aim of electroencephalogram (EEG) source localization is to find the brain areas responsible for EEG waves of interest. It consists of solving forward and inverse problems. The forward problem is solved by starting from a given electrical source and calculating the potentials at the electrodes. These evaluations are necessary to solve the inverse problem, which is defined as finding the brain sources responsible for the measured potentials at the EEG electrodes. Methods While other reviews give an extensive summary of both the forward and inverse problems, this review article focuses on different aspects of solving the forward problem, and it is intended for newcomers to this research field. Results It starts by focusing on the generators of the EEG: the post-synaptic potentials in the apical dendrites of pyramidal neurons. These cells generate an extracellular current which can be modeled by Poisson's differential equation with Neumann and Dirichlet boundary conditions. The compartments in which these currents flow can be anisotropic (e.g., the skull and white matter). In a three-shell spherical head model an analytical expression exists to solve the forward problem. During the last two decades researchers have tried to solve Poisson's equation in a realistically shaped head model obtained from 3D medical images, which requires numerical methods. The following methods are compared with each other: the boundary element method (BEM), the finite element method (FEM), and the finite difference method (FDM). In the last two methods anisotropic conducting compartments can conveniently be introduced. Then the focus is set on the use of reciprocity in EEG source localization. It is introduced to speed up the forward calculations, which are then performed for each electrode position rather than for each dipole position. Solving Poisson's equation utilizing FEM and FDM corresponds to solving a large sparse linear system.
Iterative
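The FDM route mentioned in the review discretizes Poisson's equation on a grid and yields a large sparse linear system. A toy 2-D sketch with a dipolar source term, zero-potential boundaries, and Jacobi iteration (homogeneous conductivity on a square grid, not a realistic head model):

```python
import numpy as np

# Solve -laplacian(V) = s on a square grid with V = 0 on the boundary:
# a crude stand-in for the forward problem, where a dipolar current
# source produces the potential field sampled by the electrodes.
N = 41
s = np.zeros((N, N))
s[20, 15] = +1.0   # current source of the dipole
s[20, 25] = -1.0   # current sink of the dipole

V = np.zeros((N, N))
for _ in range(5000):  # Jacobi sweeps of the 5-point stencil (h = 1)
    V[1:-1, 1:-1] = 0.25 * (V[:-2, 1:-1] + V[2:, 1:-1]
                            + V[1:-1, :-2] + V[1:-1, 2:]
                            + s[1:-1, 1:-1])
# V is positive around the source, negative around the sink, and
# decays toward the grounded boundary.
```

In practice such a sparse system is solved with preconditioned iterative methods rather than plain Jacobi sweeps, but the structure of the problem is the same.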

  10. Simulation study on ion extraction from ECR ion sources

    International Nuclear Information System (INIS)

    Fu, S.; Kitagawa, A.; Yamada, S.

    1993-07-01

In order to study the beam optics of the NIRS-ECR ion source used in HIMAC, the EGUN code has been modified to make it capable of modeling ion extraction from a plasma. Two versions of the modified code were worked out with two different methods, in which 1-D and 2-D sheath theories are used respectively. The convergence problem of the strongly nonlinear self-consistent equations is investigated. Simulations of the NIRS-ECR ion source and the HYPER-ECR ion source (at INS, Univ. of Tokyo) are presented in this paper, exhibiting agreement with the experimental results. Some preliminary suggestions on upgrading the extraction systems of these sources are also proposed. (author)

  11. Simulation study on ion extraction from ECR ion sources

    Energy Technology Data Exchange (ETDEWEB)

    Fu, S.; Kitagawa, A.; Yamada, S.

    1993-07-01

In order to study the beam optics of the NIRS-ECR ion source used in HIMAC, the EGUN code has been modified to make it capable of modeling ion extraction from a plasma. Two versions of the modified code were worked out with two different methods, in which 1-D and 2-D sheath theories are used respectively. The convergence problem of the strongly nonlinear self-consistent equations is investigated. Simulations of the NIRS-ECR ion source and the HYPER-ECR ion source (at INS, Univ. of Tokyo) are presented in this paper, exhibiting agreement with the experimental results. Some preliminary suggestions on upgrading the extraction systems of these sources are also proposed. (author).

  12. Study of extragalactic sources with H.E.S.S

    International Nuclear Information System (INIS)

    Giebels, Berrie

    2007-01-01

The field of Very High Energy (VHE) γ-ray emitting extragalactic sources has evolved considerably since the new generation of atmospheric Cerenkov telescopes (ACTs) of improved sensitivity, such as the H.E.S.S. array and the MAGIC ACT, started operating. This has led to a wealth of new clues about emission mechanisms at high energy, through the discovery of new sources, more accurate spectra and temporal studies of previously known sources, and simultaneous multi-wavelength (MWL) campaigns, since broad-band variability is a key probe of the underlying physical mechanisms at play. The fact that some of these new sources are located at redshifts close to z ∼ 0.2 makes them powerful probes of the Extragalactic Background Light (EBL) through the attenuation of γ-rays above 100 GeV.

  13. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Science.gov (United States)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This study examines which factors in a storage tank area directly or indirectly lead to the occurrence of large safety accidents. Based on the three kinds of hazard-source theory and a consequence-cause analysis of major safety accidents, this paper analyzes the hazard sources of large safety accidents in the tank area from four aspects: energy sources, direct accident causes, missing management, and environmental impact. From the analysis of the three kinds of hazard sources and of the environment, the main risk factors are derived and an AHP (analytic hierarchy process) evaluation model is established; after rigorous and scientific calculation, the weights of the four kinds of risk factors, and of the factors within each kind, are obtained. The result of the analytic hierarchy process shows that management reasons are the most important, followed by environmental factors, direct causes, and energy sources. It should be noted that although the direct causes carry relatively low overall importance, the direct causes of failure of emergency measures and failure of prevention and control facilities carry greater weight.
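The AHP step described above derives the factor weights from the principal eigenvector of a reciprocal pairwise comparison matrix. A minimal sketch (the comparison values below are illustrative, not those of the paper):

```python
import numpy as np

def ahp_weights(A):
    """Weights from the principal eigenvector of a reciprocal
    pairwise comparison matrix A, where A[i, j] is the judged
    importance of factor i relative to factor j."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)           # principal (Perron) eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum()                 # normalize to sum to 1

# Illustrative pairwise judgments for the four factor groups, in the
# order: management, environment, direct cause, energy source.
A = np.array([
    [1.0, 3.0, 5.0, 5.0],
    [1/3, 1.0, 3.0, 3.0],
    [1/5, 1/3, 1.0, 1.0],
    [1/5, 1/3, 1.0, 1.0],
])
w = ahp_weights(A)   # management receives the largest weight
```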

  14. Barrow Black Carbon Source and Impact Study Final Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, Tate [Baylor Univ., Waco, TX (United States)

    2014-07-01

    The goal of the Barrow Black Carbon Source and Impact (BBCSI) Study was to characterize the concentration and isotopic composition of carbonaceous atmospheric particulate matter (PM) at the Atmospheric Radiation Measurement site in Barrow, AK. The carbonaceous component was characterized via measurement of the organic and black carbon (OC and BC) components of the total PM. To facilitate complete characterization of the particulate matter, filter-based collections were used, including a medium volume PM2.5 sampler and a high volume PM10 sampler. Thirty-eight fine (PM2.5) and 49 coarse (PM10) particulate matter fractions were collected at weekly and bi-monthly intervals. The PM2.5 sampler operated with minimal maintenance during the 12 month campaign. The PM10 sampler used for the BBCSI used standard Tisch hi-vol motors which have a known lifetime of ~1 month under constant use; this necessitated monthly maintenance and it is suggested that the motors be upgraded to industrial blowers for future deployment in the Arctic. The BBCSI sampling campaign successfully collected and archived 87 ambient atmospheric particulate matter samples from Barrow, AK from July 2012 to June 2013. Preliminary analysis of the organic and black carbon concentrations has been completed. This campaign confirmed known trends of high BC lasting from the winter through to spring haze periods and low BC concentrations in the summer.

  15. Analysis of core-concrete interaction event with flooding for the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Kim, S.H.; Taleyarkhan, R.P.; Georgevich, V.; Navarro-Valenti, S.

    1993-01-01

    This paper discusses salient aspects of the methodology, assumptions, and modeling of various features related to estimation of source terms from an accident involving a molten core-concrete interaction event (with and without flooding) in the Advanced Neutron Source (ANS) reactor at the Oak Ridge National Laboratory. Various containment configurations are considered for this postulated severe accident. Several design features (such as rupture disks) are examined to study containment response during this severe accident. Also, thermal-hydraulic response of the containment and radionuclide transport and retention in the containment are studied. The results are described as transient variations of source terms, which are then used for studying off-site radiological consequences and health effects for the support of the Conceptual Safety Analysis Report for ANS. The results are also to be used to examine the effectiveness of subpile room flooding during this type of severe accident

  16. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, k_eff values between 0.92 and 0.5 were satisfactorily determined using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. Further development of the method will require experiments oriented toward particular applications, including dynamic experiments and the development of theoretical methods to predict the experimental observables

  17. Source analysis of spaceborne microwave radiometer interference over land

    Science.gov (United States)

    Guan, Li; Zhang, Sibo

    2016-03-01

Satellite microwave thermal emissions mixed with signals from active sensors are referred to as radio-frequency interference (RFI). Based on Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) observations from June 1 to 16, 2011, RFI over Europe was identified and analyzed using the modified principal component analysis algorithm in this paper. The X-band AMSR-E measurements in England and Italy are mostly affected by stable, persistent, active microwave transmitters on the surface, while the RFI source in other European countries is the interference of reflected geostationary TV satellite downlink signals with the measurements of spaceborne microwave radiometers. The locations and intensities of the RFI induced by the geostationary TV and communication satellites changed with time within the observed period. The observations of spaceborne microwave radiometers in the ascending portions of orbits are usually interfered with over European land, while no RFI was detected in descending passes. The RFI locations and intensities from the reflection of downlink radiation are highly dependent upon the relative geometry between the geostationary satellite and the measuring passive sensor. Only those fields of view of a spaceborne instrument whose scan azimuths are close to the azimuth relative to the geostationary satellite are likely to be affected by RFI.

  18. Sealed source and device removal and consolidation feasibility study

    International Nuclear Information System (INIS)

    Ward, J.E.; Carter, J.G.; Meyers, R.L.

    1993-02-01

The purpose of this study is to assess the feasibility of removing Greater-Than-Class C (GTCC) sealed sources from their containment devices and consolidating them for transport to a storage or disposal facility. A sealed source is a sealed capsule containing a radioactive material that is placed in a device providing radioactive containment. It is used in the medical, industrial, research, and food-processing communities for calibrating, measuring, gauging, controlling processes, and testing. This feasibility study addresses the key operational, safety, regulatory, and financial requirements of the removal/consolidation process. This report discusses the process to receive, handle, repackage, and ship these sources to an interim or dedicated storage facility until a final disposal repository can be built and become operational (ca. 2010). The study identifies the operational and facility requirements to perform this work. Hanford, other DOE facilities, and private hot-cell facilities were evaluated to determine which could perform this work. The personnel needed, design and engineering, facility preparation, process waste disposal requirements, and regulatory compliance were evaluated to determine the cost of performing this work. Cost requirements for items that will have to meet future changing regulatory requirements for transportation, transportation container design and engineering, and disposal were not included in this study. The costs associated with in-process consolidation of the sealed sources reported in this study were not adjusted for inflation and are based on 1992 dollars. This study shows that sealed source consolidation is possible with minimal personnel exposure, and would reduce the risk of radioactive releases to the environment. An initial pilot-scale operation could evaluate possible methods to reduce the cost and consolidate sources.

  19. Langmuir probe studies on a RF ion source for NBI

    International Nuclear Information System (INIS)

    McNeely, P.; Heineman, B.; Kraus, W.; Riedl, R.; Speth, E.; Vollmer, O.

    2001-01-01

IPP Garching has been developing an RF ion source for H- production. To improve data quality, a new scanning probe system with passive RF compensation has been installed on the Type VI ion source at the BATMAN test stand. Using this probe, measurements have been carried out to study changes in the plasma parameters (electron density, electron temperature, and plasma potential) due to variations in the source operating conditions. The data were collected at a source pressure of 0.5 Pa and with 60±5 kW of applied RF power. Presented are some of the results of these measurements, focusing on the effects of argon seeding, the addition of Cs to the source, and the newly added Faraday screen. The electron density behaves in a fashion that agrees with the theory of ambipolar diffusion. Typically, little change in the average electron energy is observed regardless of which effect is considered. The plasma potential shows the most significant changes with external source conditions, both in value for all cases and in shape when the Faraday screen was added

  20. Review of SFR In-Vessel Radiological Source Term Studies

    International Nuclear Information System (INIS)

    Suk, Soo Dong; Lee, Yong Bum

    2008-10-01

An effort has been made in this study to search for and review the literature in the public domain on the phenomena related to the release of radionuclides and aerosols to the reactor containment of sodium fast reactor (SFR) plants (i.e., the in-vessel source term), produced in Japan and in Europe, including France, Germany, and the UK, over the last few decades. The review focuses on the experimental programs investigating the phenomena that determine the source terms, with a brief review of supporting analytical models and computer programs. In this report, the research programs conducted to investigate core disruptive accident (CDA) bubble behavior in the sodium pool, which determines the 'primary' or 'instantaneous' source term, are first introduced. The studies performed to determine the 'delayed' source term are then described, covering the various stages of phenomena and processes: fission product (FP) release from fuel, evaporation release from the surface of the pool, iodine mass transfer from fission gas bubbles, FP deposition, and aerosol release from core-concrete interaction. The research programs investigating the release and transport of FPs and aerosols in the reactor containment (i.e., the in-containment source term) are not described in this report.

  1. Study of two different radioactive sources for prostate brachytherapy treatment

    International Nuclear Information System (INIS)

    Pereira Neves, Lucio; Perini, Ana Paula; Souza Santos, William de; Caldas, Linda V.E.; Belinato, Walmir

    2015-01-01

In this study we evaluated two radioactive sources for brachytherapy treatments. Our main goal was to quantify the absorbed doses in the organs and tissues of an adult male patient submitted to a brachytherapy treatment with two radioactive sources, 192Ir and 125I. The 192Ir radioactive source is a cylinder 0.09 cm in diameter and 0.415 cm long. The 125I radioactive source is also a cylinder, 0.08 cm in diameter and 0.45 cm long. To evaluate the absorbed dose distribution in the prostate and in the other organs and tissues of an adult man, a male virtual anthropomorphic phantom, MASH, coupled to the radiation transport code MCNPX 2.7.0, was employed. We simulated 75, 90 and 102 radioactive sources of 125I and one of 192Ir inside the prostate, as normally used in these treatments, and each treatment was simulated separately. As this phantom was developed in a supine position, the displacement of the internal organs of the chest, the compression of the lungs, and the reduction of the sagittal diameter were all taken into account. For 192Ir, the highest dose values were obtained for the prostate and surrounding organs, such as the colon, gonads and bladder. Considering the 125I sources, with photons of lower energies, the doses to organs far from the prostate were lower. All dose rate values are in agreement with those recommended for brachytherapy treatments. Besides that, the seeds evaluated in this work are useful as a new tool in prostate brachytherapy treatments, and the methodology employed here may be applied to other radiation sources or treatments. (authors)

  2. Study of two different radioactive sources for prostate brachytherapy treatment

    Energy Technology Data Exchange (ETDEWEB)

    Pereira Neves, Lucio; Perini, Ana Paula [Instituto de Fisica, Universidade Federal de Uberlandia, Caixa Postal 593, 38400-902, Uberlandia, MG (Brazil); Souza Santos, William de; Caldas, Linda V.E. [Instituto de Pesquisas Energeticas e Nucleares, Comissao Nacional de Energia Nuclear, IPENCNEN/SP, Av. Prof. Lineu Prestes, 2242, Cidade Universitaria, 05508-000 Sao Paulo, SP (Brazil); Belinato, Walmir [Departamento de Ensino, Instituto Federal de Educacao, Ciencia e Tecnologia da Bahia, Campus Vitoria da Conquista, Zabele, Av. Amazonas 3150, 45030-220 Vitoria da Conquista, BA (Brazil)

    2015-07-01

In this study we evaluated two radioactive sources for brachytherapy treatments. Our main goal was to quantify the absorbed doses in the organs and tissues of an adult male patient submitted to a brachytherapy treatment with two radioactive sources, 192Ir and 125I. The 192Ir radioactive source is a cylinder 0.09 cm in diameter and 0.415 cm long. The 125I radioactive source is also a cylinder, 0.08 cm in diameter and 0.45 cm long. To evaluate the absorbed dose distribution in the prostate and in the other organs and tissues of an adult man, a male virtual anthropomorphic phantom, MASH, coupled to the radiation transport code MCNPX 2.7.0, was employed. We simulated 75, 90 and 102 radioactive sources of 125I and one of 192Ir inside the prostate, as normally used in these treatments, and each treatment was simulated separately. As this phantom was developed in a supine position, the displacement of the internal organs of the chest, the compression of the lungs, and the reduction of the sagittal diameter were all taken into account. For 192Ir, the highest dose values were obtained for the prostate and surrounding organs, such as the colon, gonads and bladder. Considering the 125I sources, with photons of lower energies, the doses to organs far from the prostate were lower. All dose rate values are in agreement with those recommended for brachytherapy treatments. Besides that, the seeds evaluated in this work are useful as a new tool in prostate brachytherapy treatments, and the methodology employed here may be applied to other radiation sources or treatments. (authors)

  3. Study of an hybrid positron source using channeling for CLIC

    CERN Document Server

    Dadoun, O; Chehab, R; Poirier, F; Rinolfi, L; Strakhovenko, V; Variola, A; Vivoli, A

    2009-01-01

    The CLIC study considers the hybrid source using channeling as the baseline for positron production. The hybrid source uses a few-GeV electron beam impinging on a tungsten crystal radiator. With the crystal oriented on its axis, an intense, relatively low-energy photon beam results, due mainly to channeling radiation. These photons then impinge on an amorphous tungsten target, producing positrons by e+e− pair creation. In this note the positron yield and the peak energy deposition density in the amorphous target are optimized with respect to the distance between the crystal and the amorphous target, the primary electron energy and the amorphous target thickness.

  4. Analysis of potential combustion source impacts on acid deposition using an independently derived inventory. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    1983-12-01

    This project had three major objectives. The first objective was to develop a fossil fuel combustion source inventory (NO/sub x/, SO/sub x/, and hydrocarbon emissions) that would be relatively easy to use and update for analyzing the impact of combustion emissions on acid deposition in the eastern United States. The second objective of the project was to use the inventory data as a basis for selection of a number of areas that, by virtue of their importance in the acid rain issue, could be further studied to assess the impact of local and intraregional combustion sources. The third objective was to conduct an analysis of wet deposition monitoring data in the areas under study, along with pertinent physical characteristics, meteorological conditions, and emission patterns of these areas, to investigate probable relationships between local and intraregional combustion sources and the deposition of acidic material. The combustion source emissions inventory has been developed for the eastern United States. It characterizes all important area sources and point sources on a county-by-county basis. Its design provides flexibility and simplicity and makes it uniquely useful in overall analysis of emission patterns in the eastern United States. Three regions with basically different emission patterns have been identified and characterized. The statistical analysis of wet deposition monitoring data in conjunction with emission patterns, wind direction, and topography has produced consistent results for each study area and has demonstrated that the wet deposition in each area reflects the characteristics of the localized area around the monitoring sites (typically 50 to 150 miles). 8 references, 28 figures, 39 tables.

  5. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  6. Identification of sources of heavy metals in the Dutch atmosphere using air filter and lichen analysis

    International Nuclear Information System (INIS)

    de Bruin, M.; Wolterbeek, H.T.

    1984-01-01

    Aerosol samples collected in an industrialized region were analyzed by instrumental neutron activation analysis. Correlation with wind direction and factor analysis were applied to the concentration data to obtain information on the nature and position of the sources. Epiphytic lichens were sampled across the country and analyzed for heavy metals (As, Cd, Sc, Zn, Sb). The data were interpreted by geographically plotting element concentrations and enrichment factors, and by factor analysis. Some pitfalls associated with the use of aerosol and lichen data in studies of heavy-metal air pollution are discussed. 14 references, 8 figures, 3 tables
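
    The factor-analysis step described above can be illustrated with a small sketch. The element profiles, source names and sample counts below are invented for illustration, and plain PCA on standardized data stands in for the full factor-analysis procedure used in such studies:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: 200 aerosol samples x 5 elements (As, Cd, Sc, Zn, Sb).
# Two hypothetical "sources" with fixed element profiles drive the variance.
profile_smelter = np.array([5.0, 3.0, 0.1, 1.0, 2.0])   # enriched in As, Cd, Sb
profile_soil    = np.array([0.2, 0.1, 4.0, 2.0, 0.1])   # enriched in Sc, Zn
strength = rng.lognormal(size=(200, 2))                  # per-sample source strengths
X = strength @ np.vstack([profile_smelter, profile_soil])
X += rng.normal(scale=0.05, size=X.shape)                # measurement noise

# Factor analysis in its simplest form: PCA on the standardized data.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Two factors should capture nearly all variance, one per source.
print(f"variance explained by 2 factors: {explained[:2].sum():.3f}")
```

    With clean data the first two factors recover essentially all of the variance; in real monitoring data the factor loadings are then inspected to attribute each factor to an emission source.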

  7. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    Science.gov (United States)

    2010-12-01

    Testing and calibration laboratories that comply with ISO/IEC 17025 have demonstrated technical competence for the types of tests and calibrations SCALe undertakes [ISO/IEC 2005]. Successful conformance testing of a software system indicates that the SCALe analysis did not detect violations of a CERT secure coding standard; it does not guarantee that the software is secure, although conforming systems can be expected to be more secure than non-conforming systems. However, no study has yet been performed to prove this. Assessment is performed in accordance with ISO/IEC 17000: "a demonstration that specified requirements are fulfilled."

  8. Source localization of rhythmic ictal EEG activity: a study of diagnostic accuracy following STARD criteria.

    Science.gov (United States)

    Beniczky, Sándor; Lantz, Göran; Rosenzweig, Ivana; Åkeson, Per; Pedersen, Birthe; Pinborg, Lars H; Ziebell, Morten; Jespersen, Bo; Fuglsang-Frederiksen, Anders

    2013-10-01

    Although precise identification of the seizure-onset zone is an essential element of presurgical evaluation, source localization of ictal electroencephalography (EEG) signals has received little attention. The aim of our study was to estimate the accuracy of source localization of rhythmic ictal EEG activity using a distributed source model. Source localization of rhythmic ictal scalp EEG activity was performed in 42 consecutive cases fulfilling inclusion criteria. The study was designed according to recommendations for studies on diagnostic accuracy (STARD). The initial ictal EEG signals were selected using a standardized method, based on frequency analysis and voltage distribution of the ictal activity. A distributed source model-local autoregressive average (LAURA)-was used for the source localization. Sensitivity, specificity, and measurement of agreement (kappa) were determined based on the reference standard-the consensus conclusion of the multidisciplinary epilepsy surgery team. Predictive values were calculated from the surgical outcome of the operated patients. To estimate the clinical value of the ictal source analysis, we compared the likelihood ratios of concordant and discordant results. Source localization was performed blinded to the clinical data, and before the surgical decision. Reference standard was available for 33 patients. The ictal source localization had a sensitivity of 70% and a specificity of 76%. The mean measurement of agreement (kappa) was 0.61, corresponding to substantial agreement (95% confidence interval (CI) 0.38-0.84). Twenty patients underwent resective surgery. The positive predictive value (PPV) for seizure freedom was 92% and the negative predictive value (NPV) was 43%. The likelihood ratio was nine times higher for the concordant results than for the discordant ones. Source localization of rhythmic ictal activity, using a distributed source model (LAURA) applied to ictal EEG signals selected with a standardized method, thus provided accurate and clinically useful localization of the seizure-onset zone.

  9. A californium-252 source for radiobiological studies at Hiroshima University

    International Nuclear Information System (INIS)

    Kato, Kazuo; Takeoka, Seiji; Kuroda, Tokue; Tsujimura, Tomotaka; Kawami, Masaharu; Hoshi, Masaharu; Sawada, Shozo

    1987-01-01

    A 1.93 Ci (3.6 mg) californium-252 source was installed in the radiation facility of the Research Institute for Nuclear Medicine and Biology, Hiroshima University. This source produces fission neutrons (8.7 x 10^9 n/s at the time of its installation) whose spectrum is similar to that of the atomic bombs, making it useful for studying the biological effects of fission neutrons and for neutron dosimetry. An apparatus was designed to accommodate this source and to apply it to such studies. It permits effective fission-neutron exposures while suppressing scattered neutrons and secondary gamma rays. The apparatus incorporates many safety systems, including one that interlocks with all doors and the elevator serving the exposure room, to prevent accidents involving users. (author)

  10. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    One assumption frequently made is that of ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate, (i.e., larger radiation doses for all downwind receptors, etc).' This recommendation is defensible on grounds of conservatism, but the effect of this assumption on the results of a consequence analysis needs to be examined quantitatively. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the largest sources of difference between their results was the effective source release height assumed by each study. This supports the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed, and its influence on the total effective dose was quantitatively examined. A difference of more than 20% persists even at longer distances when the dose calculated assuming ground-level release is compared with the results for other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the ground-level release assumption fundamentally precludes detailed analysis of the diffusion of the plume from the effective plume height down to the ground, even though this influence is relatively smaller at longer distances. When the influence of surface roughness is additionally considered, the situation can be more serious: the ground-level dose can be highly over-estimated at short downwind distances at NPP sites with low surface roughness, such as the Barakah site in
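
    The conservatism of the ground-level assumption can be illustrated with a generic Gaussian plume centerline formula (not HotSpot itself). The dispersion-coefficient fits and the 50 m elevated release height below are illustrative assumptions:

```python
import numpy as np

def centerline_conc(Q, u, x, H):
    """Ground-level centerline concentration of a Gaussian plume.

    Q: release rate (Bq/s), u: wind speed (m/s), x: downwind distance (m),
    H: effective release height (m). Briggs-style rural fits for neutral
    stability are assumed for sigma_y, sigma_z (an approximation).
    """
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2 * sigma_z**2))

Q, u = 1.0e9, 3.0
for x in (500.0, 2000.0, 10000.0):
    ratio = centerline_conc(Q, u, x, H=0.0) / centerline_conc(Q, u, x, H=50.0)
    print(f"x = {x:7.0f} m  ground/elevated concentration ratio = {ratio:6.2f}")
```

    Under these assumptions the ground-level assumption over-predicts by an order of magnitude close to the source, and the over-prediction shrinks with distance as the plume mixes down to the ground, which is the qualitative behaviour the study quantifies.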

  11. Analysis of fuel management in the KIPT neutron source facility

    Energy Technology Data Exchange (ETDEWEB)

    Zhong Zhaopeng, E-mail: zzhong@anl.gov [Nuclear Engineering Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Gohar, Yousry; Talamo, Alberto [Nuclear Engineering Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States)

    2011-05-15

    Research highlights: > Fuel management of the KIPT ADS was analyzed. > The core arrangement was shuffled stage-wise. > New fuel assemblies were added into the core periodically. > A beryllium reflector could also be utilized to increase the fuel lifetime. - Abstract: Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an experimental neutron source facility consisting of an electron accelerator driven sub-critical assembly. The neutron source driving the sub-critical assembly is generated from the interaction of a 100 kW electron beam with a natural uranium target. The sub-critical assembly surrounding the target is fueled with low-enriched WWR-M2 type hexagonal fuel assemblies. The U-235 enrichment of the fuel material is <20%. The facility will be utilized for basic and applied research, producing medical isotopes, and training young specialists. With the 100 kW electron beam power, the total thermal power of the facility is {approx}360 kW, including a fission power of {approx}260 kW. The burnup of the fissile materials and the buildup of fission products continuously reduce the system reactivity during operation, decrease the neutron flux level, and consequently impact the facility performance. To preserve the neutron flux level during operation, fuel assemblies should be added and shuffled to compensate for the reactivity loss caused by burnup. A beryllium reflector could also be utilized to increase the fuel lifetime in the sub-critical core. This paper studies the fuel cycles and shuffling schemes of the fuel assemblies of the sub-critical assembly to preserve the system reactivity and the neutron flux level during operation.

  12. Analysis of fuel management in the KIPT neutron source facility

    International Nuclear Information System (INIS)

    Zhong Zhaopeng; Gohar, Yousry; Talamo, Alberto

    2011-01-01

    Research highlights: → Fuel management of the KIPT ADS was analyzed. → The core arrangement was shuffled stage-wise. → New fuel assemblies were added into the core periodically. → A beryllium reflector could also be utilized to increase the fuel lifetime. - Abstract: Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an experimental neutron source facility consisting of an electron accelerator driven sub-critical assembly. The neutron source driving the sub-critical assembly is generated from the interaction of a 100 kW electron beam with a natural uranium target. The sub-critical assembly surrounding the target is fueled with low-enriched WWR-M2 type hexagonal fuel assemblies. The U-235 enrichment of the fuel material is <20%. The facility will be utilized for basic and applied research, producing medical isotopes, and training young specialists. With the 100 kW electron beam power, the total thermal power of the facility is ∼360 kW, including a fission power of ∼260 kW. The burnup of the fissile materials and the buildup of fission products continuously reduce the system reactivity during operation, decrease the neutron flux level, and consequently impact the facility performance. To preserve the neutron flux level during operation, fuel assemblies should be added and shuffled to compensate for the reactivity loss caused by burnup. A beryllium reflector could also be utilized to increase the fuel lifetime in the sub-critical core. This paper studies the fuel cycles and shuffling schemes of the fuel assemblies of the sub-critical assembly to preserve the system reactivity and the neutron flux level during operation.

  13. Time-Reversal Study of the Hemet (CA) Tremor Source

    Science.gov (United States)

    Larmat, C. S.; Johnson, P. A.; Guyer, R. A.

    2010-12-01

    Since its first observation by Nadeau & Dolenc (2005) and Gomberg et al. (2008), tremor along the San Andreas fault system has been thought to be a probe into the frictional state of the deep part of the fault (e.g. Shelly et al., 2007). Tremor is associated with slow, otherwise deep, aseismic slip events that may be triggered by faint signals such as passing waves from remote earthquakes or solid Earth tides. Well-resolved tremor source location is key to constraining frictional models of the fault. However, tremor source location is challenging because of the high-frequency and highly scattered nature of the tremor signal, characterized by the lack of isolated phase arrivals. Time Reversal (TR) methods are emerging as a useful location tool. The unique requirement is a good velocity model, so that the different time-reversed phases arrive coherently at the source point. We present location results for a tremor source near the town of Hemet, CA, which was triggered by the 2002 M 7.9 Denali Fault earthquake (Gomberg et al., 2008) and by the 2009 M 6.9 Gulf of California earthquake. We performed TR in a volume model of 88 (N-S) x 70 (W-E) x 60 km (Z) using the full-wave 3D wave-propagation package SPECFEM3D (Komatitsch et al., 2002). The results for the 2009 episode indicate a deep source (at about 22 km), located about 4 km SW of the fault surface scarp. We also perform STA/LTA and correlation analyses to obtain independent confirmation of the Hemet tremor source. We gratefully acknowledge the support of the U. S. Department of Energy through the LANL/LDRD Program for this work.
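
    An STA/LTA-style detector (the standard short-term over long-term average energy ratio used for onset detection in seismology) can be sketched on synthetic data as follows; the window lengths, threshold and synthetic "tremor" burst are illustrative, not those of the study:

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Classic STA/LTA detector: ratio of the short-term to the long-term
    average of the squared signal. Returns an array matching the trace;
    samples before one full LTA window are NaN."""
    energy = trace.astype(float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = np.zeros_like(energy)
    lta = np.full_like(energy, np.nan)
    for i in range(nlta, len(energy)):
        sta[i] = (csum[i + 1] - csum[i + 1 - nsta]) / nsta
        lta[i] = (csum[i + 1] - csum[i + 1 - nlta]) / nlta
    with np.errstate(invalid="ignore"):
        return sta / lta

rng = np.random.default_rng(1)
fs = 100                                              # samples per second
trace = rng.normal(scale=1.0, size=60 * fs)           # background noise
trace[3000:3400] += rng.normal(scale=5.0, size=400)   # emergent "tremor" burst

ratio = sta_lta(trace, nsta=fs, nlta=10 * fs)
onset = int(np.argmax(ratio > 4.0))                   # first sample above threshold
print("detected onset near sample", onset)
```

    The detector flags the burst shortly after sample 3000; on real tremor, which lacks impulsive arrivals, such detections are typically combined with cross-correlation across stations, as the abstract indicates.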

  14. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infra-structure and

  15. Study of liquid hydrogen and liquid deuterium cold neutron sources

    International Nuclear Information System (INIS)

    Harig, H.D.

    1969-01-01

    In view of the planned cold neutron source for a high-flux reactor (maximal thermal flux of about 10^15 n/cm^2·s), an experimental study of several cold sources of liquid hydrogen and liquid deuterium has been made in a low-power reactor (100 kW, about 10^12 n/cm^2·s). We investigated: - cold neutron sources of liquid hydrogen shaped as annular layers of different thickness; normal liquid hydrogen was used as well as hydrogen with a high para-hydrogen percentage. - Cold neutron sources of liquid deuterium in cylinders of 18 and 38 cm diameter; in this case the sources could be placed in different positions relative to the reactor core within the heavy water reflector. This report gives a general description of the experimental device and deals in more detail with the design of the cryogenic systems. The measured results are then presented and interpreted, and finally compared with those of a theoretical study of the same cold moderators that were the subject of the experimental investigation. (authors) [fr

  16. LIGHT SOURCE: A simulation study of Tsinghua Thomson scattering X-ray source

    Science.gov (United States)

    Tang, Chuan-Xiang; Li, Ren-Kai; Huang, Wen-Hui; Chen, Huai-Bi; Du, Ying-Chao; Du, Qiang; Du, Tai-Bin; He, Xiao-Zhong; Hua, Jian-Fei; Lin, Yu-Zhen; Qian, Hou-Jun; Shi, Jia-Ru; Xiang, Dao; Yan, Li-Xin; Yu, Pei-Cheng

    2009-06-01

    Thomson scattering X-ray sources are compact and affordable facilities that produce short-duration, high-brightness X-ray pulses, enabling new experimental capabilities in ultra-fast science studies as well as medical and industrial applications. Such a facility has been built at the Accelerator Laboratory of Tsinghua University, and an upgrade is in progress. In this paper, we present a proposed layout of the upgrade with design parameters obtained by simulation, aiming at high X-ray pulse flux and brightness, and also enabling advanced beam-dynamics studies and applications of the electron beam. The design and construction status of the main subsystems are also presented.

  17. Multi-Criteria Analysis to Prioritize Energy Sources for Ambience in Poultry Production

    Directory of Open Access Journals (Sweden)

    DC Collatto

    This paper outlines a multi-criteria analysis model to pinpoint the most suitable energy source for heating aviaries in poultry broiler production, from the point of view of the farmer and under environmental logic. The criteria were identified through an exploratory study of three poultry broiler production units located in the mountain region of Rio Grande do Sul. To rank the energy sources, the Analytic Hierarchy Process was applied. The criteria determined and validated in the research were the cost of the energy source, lead time, investment in equipment, energy efficiency, quality of life and environmental impacts. Applying the method revealed firewood as the most appropriate energy source for heating. The decision-support model developed could be replicated in order to strengthen the criteria and energy alternatives presented, as well as to identify new criteria and alternatives not considered in this study.
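
    The Analytic Hierarchy Process step can be sketched as follows; the pairwise comparison matrix below is invented for illustration (a subset of four criteria) and is not the one elicited in the study:

```python
import numpy as np

# Illustrative pairwise comparison matrix over four criteria:
# cost, lead time, energy efficiency, environmental impact. Entry [i, j]
# states how strongly criterion i is preferred over criterion j on
# Saaty's 1-9 scale (reciprocals below the diagonal).
A = np.array([
    [1.0, 3.0, 5.0, 3.0],
    [1/3, 1.0, 3.0, 1.0],
    [1/5, 1/3, 1.0, 1/3],
    [1/3, 1.0, 3.0, 1.0],
])

# AHP priorities: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CR = ((lambda_max - n) / (n - 1)) / RI, RI = 0.90 for n = 4.
# CR < 0.1 is the conventional acceptance threshold for judgment consistency.
n = A.shape[0]
CR = ((eigvals.real[k] - n) / (n - 1)) / 0.90

print("priorities:", np.round(w, 3), " CR =", round(CR, 3))
```

    The same procedure is repeated for the alternatives (e.g. firewood, LPG, electricity) under each criterion, and the weighted scores are aggregated to pick the best energy source.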

  18. Neutron activation analysis of essential elements in Multani mitti clay using miniature neutron source reactor

    International Nuclear Information System (INIS)

    Waheed, S.; Rahman, S.; Faiz, Y.; Siddique, N.

    2012-01-01

    Multani mitti clay was studied for 19 essential and other elements. Four different radio-assay schemes were adopted for instrumental neutron activation analysis (INAA) using a miniature neutron source reactor. The estimated weekly intakes of Cr and Fe are high for men, women, pregnant and lactating women, and children, while the intake of Co is higher for the adult categories and that of Mn for pregnant women. Comparison of Multani mitti clay with other types of clay shows that it is a good source of essential elements. - Highlights: ► Multani mitti clay has been studied for 19 essential elements for human adequacy and safety using INAA and AAS. ► Weekly intakes for different consumer categories have been calculated and compared with DRIs. ► Comparison with other types of clay shows that Multani mitti clay is a good source of essential elements.

  19. Preliminary thermal analysis of grids for twin source extraction system

    International Nuclear Information System (INIS)

    Pandey, Ravi; Bandyopadhyay, Mainak; Chakraborty, Arun K.

    2017-01-01

    The TWIN (Two driver based Indigenously built Negative ion) source provides a bridge between ROBIN, the operational single-driver negative ion source test facility at IPR, and an ITER-type multi-driver ion source. The source is designed to be operated in CW mode with a 180 kW, 1 MHz, 5 s ON/600 s OFF duty cycle, and also in a 5 Hz modulation mode with a 3 s ON/20 s OFF duty cycle for 3 such cycles. The TWIN source comprises an ion source sub-assembly (consisting of the driver and plasma box) and an extraction system sub-assembly. The extraction system consists of the plasma grid (PG), extraction grid (EG) and ground grid (GG) sub-assemblies. The plasma grid, facing the plasma side of the ion source where the negative ion beams are produced, receives a moderate heat flux, whereas the extraction grid and ground grid receive the majority of the heat flux from the extracted negative ion and co-extracted electron beams. The entire co-extracted electron beam is dumped on the extraction grid by an electron-deflection magnetic field, which makes the thermal and hydraulic design of the extraction grid critical. All three grids are made of OFHC copper and are actively water cooled, keeping the peak temperature rise of the grid surface within the allowable limit with optimum uniformity. The grids are manufactured by a vacuum brazing process, in which joint strength becomes crucial at elevated temperature; the hydraulic design must therefore maintain the peak temperature at the brazing joints within an acceptable limit.

  20. Plagiarism and Source Deception Detection Based on Syntax Analysis

    Directory of Open Access Journals (Sweden)

    Eman Salih Al-Shamery

    2017-02-01

    In this research, the shingle algorithm with the Jaccard coefficient is employed as a new approach to detect source deception, in addition to detecting plagiarism. Source deception occurs when a particular text is taken from one source but attributed to another, while plagiarism occurs when a document reproduces part or all of the text of another work. Shingling is an efficient way to compare the sets of shingles in text files; these sets are used as features to measure the syntactic similarity of documents, together with the Jaccard coefficient, which measures the similarity between sample sets. In the proposed system, a text is checked for syntactic plagiarism and a percentage of similarity with other documents is reported. Research sources are also checked for deception by matching them against the available sources from the Turnitin report of the same research, using the shingle algorithm with the Jaccard coefficient. The motivation of this work is the discovery of literary theft in research, especially in student research, and of the deception that occurs in cited sources.
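
    The shingling-with-Jaccard comparison described above can be sketched as follows; this is an illustrative reimplementation using word 3-grams, not the authors' code:

```python
def shingles(text, k=3):
    """Set of k-word shingles (contiguous k-grams) from lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard coefficient |A ∩ B| / |A ∪ B| between two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc = "the quick brown fox jumps over the lazy dog near the river bank"
near_copy = "the quick brown fox jumps over the sleepy cat near the river bank"
unrelated = "stochastic gradient descent converges under mild assumptions"

sim_plag = jaccard(shingles(doc), shingles(near_copy))
sim_none = jaccard(shingles(doc), shingles(unrelated))
print(f"similarity to near-copy: {sim_plag:.2f}, to unrelated text: {sim_none:.2f}")
```

    A high Jaccard score between a submission and an external document flags plagiarism; a high score against a source other than the one cited flags source deception.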

  1. Source study of local coalfield events using the modal synthesis of shear and surface waves

    Energy Technology Data Exchange (ETDEWEB)

    MacBeth, C.D.; Redmayne, D.W.

    1989-10-01

    Results from the BGS LOWNET array for the Midlothian coalfield in Scotland have been studied. Vertical-component seismograms have been analysed using a waveform-matching technique based on the modal summation method for constructing synthetic seismograms. The analysis is applied to the S-wave and surface-wave portions of the seismogram. The effects of different earth structures, source depths and source orientations, and the type of event (rockburst or triggered earthquake 2-3 km from the mine workings), can be evaluated.

  2. A Comparative Study Of Source Location And Depth Estimates From ...

    African Journals Online (AJOL)

    ... the analytic signal amplitude (ASA) and the local wave number (LWN) of the total intensity magnetic field. In this study, a synthetic magnetic field due to four buried dipoles was analysed to show that estimates of source location and depth can be improved significantly by reducing the data to the pole prior to the application ...

  3. A study investigating sound sources and noise levels in neonatal ...

    African Journals Online (AJOL)

    Background. Exposure to noise in the neonatal intensive care unit (NICU) has the potential to affect neonatal auditory development, sleep patterns and physiological stability, thus impacting on developmental progress. Objectives. This study aimed to identify noise sources in three NICUs in Johannesburg, South Africa, and ...

  4. Dynamic response analysis of the LBL Advanced Light Source synchrotron radiation storage ring

    International Nuclear Information System (INIS)

    Leung, K.

    1993-05-01

    This paper presents the dynamic response analysis of the photon-source synchrotron radiation storage ring excited by ground motion measured at the Lawrence Berkeley Laboratory Advanced Light Source building site. The high spectral brilliance required of the photon beams of the Advanced Light Source storage ring restricts the vertical displacement of the quadrupole focusing magnets to the order of 1 micron. There are 19 magnets supported by a 430-inch steel box-beam girder. The girder and all magnets are supported by the kinematic mount system normally used in optical equipment. This kinematic mount, called a six-strut magnet support system, is now also considered as an alternative system for supporting SSC magnets in the Super Collider, and is operating successfully on the Advanced Light Source (ALS) accelerator at the Lawrence Berkeley Laboratory. This paper presents the method of analysis and the results of the dynamic motion study at the center of the magnets under the most critical excitation source recorded at the LBL site.

  5. Study of localized photon source in space of measures

    International Nuclear Information System (INIS)

    Lisi, M.

    2010-01-01

    In this paper we study a three-dimensional photon transport problem in an interstellar cloud with a localized photon source inside. The problem is solved indirectly, by defining the adjoint of an operator acting on an appropriate space of continuous functions. By means of the sun-adjoint semigroup theory of operators in a Banach space of regular Borel measures, we prove existence and uniqueness of the solution of the problem. A possible approach to identifying the location of the photon source is finally proposed.

  6. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria (with R2 …), giving insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
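
    The computation of standardized regression coefficients (SRCs) from a Monte Carlo sample can be sketched as follows; the toy model and input names below are invented for illustration and are not those of Benchmark Simulation Model no. 1:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 2000

# Hypothetical WWTP-style model: an output y driven by three uncertain inputs.
mu_max = rng.normal(4.0, 0.8, N)    # illustrative max growth rate
K_S    = rng.normal(10.0, 1.0, N)   # illustrative half-saturation constant
T      = rng.normal(15.0, 3.0, N)   # illustrative temperature
y = 2.0 * mu_max - 0.5 * K_S + 1.2 * T + rng.normal(0.0, 0.5, N)

# Standardized regression coefficients: regress y on the sampled inputs, then
# scale each coefficient by sigma_input / sigma_output. For near-linear models,
# SRC_i^2 approximates the share of output variance explained by input i.
X = np.column_stack([np.ones(N), mu_max, K_S, T])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
src = beta[1:] * np.array([mu_max.std(), K_S.std(), T.std()]) / y.std()

for name, s in zip(["mu_max", "K_S", "T"], src):
    print(f"SRC({name}) = {s:+.2f}")
print("variance decomposed (sum of SRC^2):", round(float(np.sum(src**2)), 2))
```

    Ranking the inputs by |SRC| is exactly the prioritization step the abstract describes: the input with the largest |SRC| is the one whose uncertainty most deserves reduction.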

  7. A Study on Water Pollution Source Localization in Sensor Networks

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2016-01-01

    Water pollution source localization is of great significance to water environment protection. In this paper, a study on water pollution source localization in sensor networks is presented. Firstly, source detection is discussed. Then, coarse localization methods and localization methods based on diffusion models are introduced and analyzed. In addition, a localization method based on concentration contours is proposed. Finally, the detection and localization methods are compared in experiments. The results show that detection using hypothesis testing is more stable. The performance of the coarse localization algorithm depends on the node density. Localization based on a diffusion model can yield precise results, but the results are not stable. The contour-based method outperforms the other two localization methods when the concentration contours are axisymmetric. Thus, for water pollution source localization, detection using hypothesis testing is preferable in the source detection step. If the concentration contours are axisymmetric, the contour-based localization method is the first option; if the nodes are dense and there is no explicit diffusion model, the coarse localization algorithm can be used; otherwise, localization based on a diffusion model is a good choice.
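
    Localization against a diffusion model can be sketched as a least-squares grid search; the isotropic Gaussian "diffusion model", sensor layout, source position and noise level below are all illustrative assumptions, not the models compared in the paper:

```python
import numpy as np

def gaussian_puff(x, y, x0, y0, q=1.0, sigma=12.0):
    """Toy 2-D diffusion model: concentration at (x, y) from a source of
    strength q at (x0, y0), with a fixed isotropic spread assumed."""
    r2 = (x - x0)**2 + (y - y0)**2
    return q * np.exp(-r2 / (2 * sigma**2))

rng = np.random.default_rng(3)

# Sensor network: 25 nodes on a grid; true source at (37, 61), strength 5.
sx, sy = np.meshgrid(np.linspace(0, 100, 5), np.linspace(0, 100, 5))
sx, sy = sx.ravel(), sy.ravel()
obs = gaussian_puff(sx, sy, 37.0, 61.0, 5.0) + rng.normal(0, 0.01, sx.size)

# Model-based localization by grid search: pick the candidate location whose
# best-fit source strength minimizes the sum of squared residuals.
best = (np.inf, None)
for cx in np.linspace(0, 100, 101):
    for cy in np.linspace(0, 100, 101):
        g = gaussian_puff(sx, sy, cx, cy)   # unit-strength model response
        q = obs @ g / (g @ g)               # least-squares source strength
        sse = np.sum((obs - q * g)**2)
        if sse < best[0]:
            best = (sse, (cx, cy))
print("estimated source location:", best[1])
```

    With a correct model and low noise the estimate lands on the true source; the paper's observation that model-based results are precise but unstable corresponds to what happens here when the assumed model or noise level is wrong.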

  8. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, to understand their commonalities and differences, and to provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with DOF and MEEF as quality metrics is examined.

  9. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    Directory of Open Access Journals (Sweden)

    Dong-Sup Lee

    2015-01-01

    Full Text Available Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding the statistical independence of signal mixtures, and it has been successfully applied to myriad fields such as medical science and image processing. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on the conventional ICA is proposed to mitigate these problems. The proposed method extracts more stable source signals in a valid order through an iterative process that reorders the extracted mixing matrix and reconstructs the finally converged source signals, referring to the magnitudes of the correlation coefficients between the intermediately separated signals and the signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses were carried out for a virtual response model and a 30 m class submarine model. Moreover, to investigate the applicability of the proposed method to real problems of complex structures, an experiment was carried out with a scaled submarine mockup. The results show that the proposed method can resolve the inherent problems of the conventional ICA technique.
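The blind source separation idea underlying this record can be illustrated with a minimal FastICA-style sketch in NumPy. This is not the authors' iterative algorithm, only the conventional ICA baseline it builds on; the signals, mixing matrix, and iteration counts below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 4000)
# Two independent, non-Gaussian sources (hypothetical stand-ins for
# vibration source signals on a structure).
s1 = np.sign(np.sin(2 * np.pi * 1.3 * t))      # square wave
s2 = 2.0 * ((0.7 * t) % 1.0) - 1.0             # sawtooth wave
S = np.vstack([s1, s2])
A = np.array([[0.8, 0.4],                      # unknown mixing matrix
              [0.3, 0.9]])
X = A @ S                                      # observed mixtures

# Center and whiten the mixtures
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
Z = (E @ np.diag(d ** -0.5) @ E.T) @ Xc

# One-unit FastICA with a tanh nonlinearity and deflation
W = np.zeros((2, 2))
for i in range(2):
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    for _ in range(300):
        wx = w @ Z
        g, gp = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        w_new = (Z * g).mean(axis=1) - gp.mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)     # deflate previously found components
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < 1e-12:
            w = w_new
            break
        w = w_new
    W[i] = w
Y = W @ Z                                      # recovered sources (up to sign/order)
```

The sign and ordering ambiguity visible here (each row of `Y` matches a source only up to sign and permutation) is exactly the instability the record's iterative reordering step addresses.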

  10. Study of the Release Process of Open Source Software: Case Study

    OpenAIRE

    Eide, Tor Erik

    2007-01-01

    This report presents the results of a case study focusing on the release process of open source projects initiated with commercial motives. The purpose of the study is to gain an increased understanding of the release process, how a community can be attracted to the project, and how the interaction with the community evolves in commercial open source initiatives. Data has been gathered from four distinct sources to form the basis of this thesis. A thorough review of the open source literatu...

  11. Theoretical analysis and experimental study of a solar-assisted ground-source heat pump system

    Institute of Scientific and Technical Information of China (English)

    杨鹏; 刘自强; 侯静

    2011-01-01

    As clean, renewable energy sources, geothermal and solar energy represent the future trend in developing and using new energy. This paper introduces a solar-assisted ground-source heat pump system that combines the advantages of geothermal and solar energy so that each compensates for the other's limitations. Through theoretical analysis and experimental verification, the solar-assisted ground-source heat pump system is shown to be feasible and scientifically sound.

  12. Recommendation of ruthenium source for sludge batch flowsheet studies

    Energy Technology Data Exchange (ETDEWEB)

    Woodham, W. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-09-13

    Included herein is a preliminary analysis of previously-generated data from sludge batches 7a, 7b, 8, and 9 sludge simulant and real-waste testing, performed to recommend a form of ruthenium for future sludge batch simulant testing under the nitric-formic flowsheet. Focus is given to reactions present in the Sludge Receipt and Adjustment Tank cycle, given that this cycle historically produces the most changes in chemical composition during Chemical Process Cell processing. Data is presented and analyzed for several runs performed under the nitric-formic flowsheet, with consideration given to effects on the production of hydrogen gas, nitrous oxide gas, consumption of formate, conversion of nitrite to nitrate, and the removal and recovery of mercury during processing. Additionally, a brief discussion is given to the effect of ruthenium source selection under the nitric-glycolic flowsheet. An analysis of data generated from scaled demonstration testing, sludge batch 9 qualification testing, and antifoam degradation testing under the nitric-glycolic flowsheet is presented. Experimental parameters of interest under the nitric-glycolic flowsheet include N2O production, glycolate destruction, conversion of glycolate to formate and oxalate, and the conversion of nitrite to nitrate. To date, the number of real-waste experiments that have been performed under the nitric-glycolic flowsheet is insufficient to provide a complete understanding of the effects of ruthenium source selection in simulant experiments with regard to fidelity to real-waste testing. Therefore, a determination of comparability between the two ruthenium sources as employed under the nitric-glycolic flowsheet is made based on available data in order to inform ruthenium source selection for future testing under the nitric-glycolic flowsheet.

  13. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  14. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-02-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  15. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-01-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  16. Regional Moment Tensor Source-Type Discrimination Analysis

    Science.gov (United States)

    2015-11-16

    Solutions are characterized by their unique normalized eigenvalues (black '+' signs), i.e. unique source-types, shown (a) on the fundamental Lune (Tape and Tape, 2012a,b) and (b) on the Hudson source-type plot, with solutions color-coded by variance reduction (VR).

  17. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    Science.gov (United States)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site-dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
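The logic-tree combination described above (three source zonings by three ground-motion models, each with a weight) can be sketched as a weighted sum of branch hazard curves, from which the 475-year return-period PGA is read off. The hazard-curve shapes, weights, and parameters below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical logic tree: 3 source zonings x 3 ground-motion models, each
# branch carrying a weight and a toy hazard curve lambda(PGA) = k * PGA^-b.
pga = np.linspace(0.02, 1.0, 200)                        # peak ground acceleration (g)
zonings = [(1.0e-4, 1.8), (1.5e-4, 2.0), (0.8e-4, 1.6)]  # (k, b) per zoning
gmpe_scale = [0.8, 1.0, 1.25]                            # ground-motion model factors
z_w = [0.3, 0.4, 0.3]                                    # zoning weights (sum to 1)
g_w = [0.25, 0.5, 0.25]                                  # GMPE weights (sum to 1)

# Weighted combination of all nine branches into a mean hazard curve
lam = np.zeros_like(pga)
for (k, b), wz in zip(zonings, z_w):
    for s, wg in zip(gmpe_scale, g_w):
        lam += wz * wg * k * (pga / s) ** -b

# PGA with a 475-year return period (annual exceedance rate 1/475,
# i.e. ~10% exceedance probability in 50 years)
target = 1.0 / 475.0
pga_475 = np.interp(target, lam[::-1], pga[::-1])
```

In a real PSHA the branch curves come from source geometries, recurrence models, and GMPEs rather than a closed-form power law, but the weighted-combination step is the same.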

  18. Analysis of glottal source parameters in Parkinsonian speech.

    Science.gov (United States)

    Hanratty, Jane; Deegan, Catherine; Walsh, Mary; Kirkpatrick, Barry

    2016-08-01

    Diagnosis and monitoring of Parkinson's disease presents a number of challenges, as there is no definitive biomarker despite the broad range of symptoms. Research is ongoing to produce objective measures that can either diagnose Parkinson's or act as a decision support tool. Recent research on speech-based measures has demonstrated promising results. This study aims to investigate the characteristics of the glottal source signal in Parkinsonian speech. An experiment is conducted in which a selection of glottal parameters are tested for their ability to discriminate between healthy and Parkinsonian speech. Results for each glottal parameter are presented for a database of 50 healthy speakers and a database of 16 speakers with Parkinsonian speech symptoms. Receiver operating characteristic (ROC) curves were employed to analyse the results, and the area under the ROC curve (AUC) was used to quantify the performance of each glottal parameter. The results indicate that glottal parameters can be used to discriminate between healthy and Parkinsonian speech, although results varied for each parameter tested. For the task of separating healthy and Parkinsonian speech, two of the seven glottal parameters tested produced AUC values of over 0.9.
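The per-parameter AUC evaluation used in this record can be computed without an explicit threshold sweep via the rank-sum (Mann-Whitney U) identity. The glottal-parameter values below are simulated, using only the group sizes reported in the abstract (50 healthy, 16 Parkinsonian):

```python
import numpy as np

def auc_from_scores(pos, neg):
    """AUC via the rank-sum (Mann-Whitney U) identity: the probability that
    a random positive scores higher than a random negative."""
    scores = np.concatenate([pos, neg])
    ranks = np.empty(len(scores))
    ranks[scores.argsort()] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = len(pos), len(neg)
    u = ranks[:n_pos].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

rng = np.random.default_rng(4)
# Hypothetical glottal parameter: Parkinsonian speakers shifted upward.
healthy = rng.normal(0.0, 1.0, 50)
parkinsonian = rng.normal(1.5, 1.0, 16)
auc = auc_from_scores(parkinsonian, healthy)
```

An AUC of 0.5 means no discrimination and 1.0 perfect separation, which is why the paper's parameters with AUC above 0.9 are notable.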

  19. Nutrient patterns and their food sources in an International Study Setting: report from the EPIC study.

    Science.gov (United States)

    Moskal, Aurelie; Pisa, Pedro T; Ferrari, Pietro; Byrnes, Graham; Freisling, Heinz; Boutron-Ruault, Marie-Christine; Cadeau, Claire; Nailler, Laura; Wendt, Andrea; Kühn, Tilman; Boeing, Heiner; Buijsse, Brian; Tjønneland, Anne; Halkjær, Jytte; Dahm, Christina C; Chiuve, Stephanie E; Quirós, Jose R; Buckland, Genevieve; Molina-Montes, Esther; Amiano, Pilar; Huerta Castaño, José M; Gurrea, Aurelio Barricarte; Khaw, Kay-Tee; Lentjes, Marleen A; Key, Timothy J; Romaguera, Dora; Vergnaud, Anne-Claire; Trichopoulou, Antonia; Bamia, Christina; Orfanos, Philippos; Palli, Domenico; Pala, Valeria; Tumino, Rosario; Sacerdote, Carlotta; de Magistris, Maria Santucci; Bueno-de-Mesquita, H Bas; Ocké, Marga C; Beulens, Joline W J; Ericson, Ulrika; Drake, Isabel; Nilsson, Lena M; Winkvist, Anna; Weiderpass, Elisabete; Hjartåker, Anette; Riboli, Elio; Slimani, Nadia

    2014-01-01

    Compared to food patterns, nutrient patterns have been rarely used particularly at international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied on 23 nutrients derived from country-specific FFQ combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs available for a representative sample of the EPIC populations (N = 34,436) provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified explaining 67% of the total variance: Principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micro-nutrients and proteins; PC3 was characterized by polyunsaturated fatty acids and vitamin D; PC4 was characterized by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research opportunities and

  20. Nutrient patterns and their food sources in an International Study Setting: report from the EPIC study.

    Directory of Open Access Journals (Sweden)

    Aurelie Moskal

    Full Text Available Compared to food patterns, nutrient patterns have been rarely used particularly at international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied on 23 nutrients derived from country-specific FFQ combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs available for a representative sample of the EPIC populations (N = 34,436) provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified explaining 67% of the total variance: Principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micro-nutrients and proteins; PC3 was characterized by polyunsaturated fatty acids and vitamin D; PC4 was characterized by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research
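The pattern-extraction step shared by the two EPIC records above, PCA on standardized nutrient intakes followed by quintiles of pattern scores, can be sketched with an SVD in NumPy. The toy intake matrix below (500 subjects, 8 nutrients, two latent patterns) is invented for illustration and is far smaller than the EPIC data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy intake matrix: 500 subjects x 8 nutrients driven by two latent patterns.
n, p = 500, 8
latent = rng.standard_normal((n, 2))               # two underlying dietary patterns
loadings = rng.standard_normal((2, p))
X = latent @ loadings + 0.3 * rng.standard_normal((n, p))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each nutrient
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)  # PCA via SVD
var_ratio = S ** 2 / (S ** 2).sum()                # variance explained per PC
scores = Xs @ Vt.T                                 # pattern scores per subject

# Quintiles of the first pattern score, as used in the study to describe
# mean nutrient and food intakes by pattern-score group.
q_edges = np.quantile(scores[:, 0], [0.2, 0.4, 0.6, 0.8])
quintile = np.digitize(scores[:, 0], q_edges)      # group labels 0..4
```

With two latent patterns driving the data, the first two components recover most of the variance, mirroring how a few nutrient patterns explained 67% of the variance in the EPIC analysis.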

  1. Physical activity and social support in adolescents: analysis of different types and sources of social support.

    Science.gov (United States)

    Mendonça, Gerfeson; Júnior, José Cazuza de Farias

    2015-01-01

    Little is known about the influence of different types and sources of social support on physical activity in adolescents. The aim of this study was to analyse the association between physical activity and different types and sources of social support in adolescents. The sample consisted of 2,859 adolescents between 14-19 years of age in the city of João Pessoa, in Northeastern Brazil. Physical activity was measured with a questionnaire, and social support from parents and friends with a 10-item scale, five items for each group (types of support: encouragement, joint participation, watching, inviting, positive comments and transportation). Multivariable analysis showed that the types of parental support associated with physical activity differed by gender, with encouragement associated for females and further associations observed for both genders (males: P = 0.009). The type of social support associated with physical activity varies according to its source, as well as the gender and age of the adolescents.

  2. Monte Carlo Simulations Validation Study: Vascular Brachytherapy Beta Sources

    International Nuclear Information System (INIS)

    Orion, I.; Koren, K.

    2004-01-01

    During the last decade many versions of angioplasty irradiation treatments have been proposed. The purpose of this unique brachytherapy is to administer a sufficient radiation dose to the vein walls in order to prevent restenosis, a clinical sequel to balloon angioplasty. The most suitable sources for this vascular brachytherapy are β-emitters such as Re-188, P-32, and Sr-90/Y-90, with a maximum energy range of up to 2.1 MeV [1,2,3]. The radioactive catheter configurations offered for these treatments can be a simple wire [4], a fluid-filled balloon or a coated stent. Each source is positioned differently inside the blood vessel, and the ranges of the emitted electrons therefore vary. Many types of sources and configurations have been studied either experimentally or with the Monte Carlo calculation technique, with most of the Monte Carlo simulations carried out using EGS4 [5] or MCNP [6]. In this study we compared the beta-source absorbed dose versus radial distance of two treatment configurations using MCNP and EGS4 simulations. This comparison was aimed at discovering the differences between the MCNP and EGS4 simulation code systems in intermediate-energy electron transport

  3. Receptor modeling studies for the characterization of PM10 pollution sources in Belgrade

    Directory of Open Access Journals (Sweden)

    Mijić Zoran

    2012-01-01

    Full Text Available The objective of this study is to determine the major sources and potential source regions of PM10 over Belgrade, Serbia. The PM10 samples were collected from July 2003 to December 2006 in a very urban area of Belgrade, and the concentrations of Al, V, Cr, Mn, Fe, Ni, Cu, Zn, Cd and Pb were analyzed by atomic absorption spectrometry. The analysis of seasonal variations of PM10 mass and some element concentrations revealed relatively higher concentrations in winter, which underlined the importance of local emission sources. The Unmix model was used for source apportionment purposes, and four main source profiles (fossil fuel combustion; traffic exhaust/regional transport from industrial centers; traffic-related particles/site-specific sources; and mineral/crustal matter) were identified. Among the resolved factors, fossil fuel combustion was the highest contributor (34%), followed by traffic/regional industry (26%). Conditional probability function (CPF) results identified possible directions of local sources. The potential source contribution function (PSCF) and concentration weighted trajectory (CWT) receptor models were used to identify the spatial source distribution and the contribution of regionally transported aerosols. [Projects of the Ministry of Science of the Republic of Serbia, No. III43007 and No. III41011]
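The conditional probability function used in this record to point at local source directions is simple to state: for each wind sector, the fraction of samples whose concentration exceeds a high-percentile threshold. A sketch with simulated data follows; the hypothetical source near 90°, the sector width, and the 75th-percentile threshold are illustrative choices, not the study's:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
wind_dir = rng.uniform(0.0, 360.0, n)              # wind direction per sample (deg)
# Toy PM10: lognormal background plus a plume from a hypothetical source near 90 deg.
conc = rng.lognormal(3.0, 0.4, n) + 40.0 * np.exp(-0.5 * ((wind_dir - 90.0) / 20.0) ** 2)

threshold = np.percentile(conc, 75)                # "high concentration" cut-off
sector_width = 30.0
sectors = np.arange(0.0, 360.0, sector_width)
# CPF per sector: P(conc >= threshold | wind from that sector)
cpf = np.array([
    (conc[(wind_dir >= lo) & (wind_dir < lo + sector_width)] >= threshold).mean()
    for lo in sectors
])
peak_sector = sectors[int(np.argmax(cpf))]         # direction implicating a local source
```

Sectors pointing toward the simulated source show a much higher fraction of high-concentration samples than the background sectors, which is how CPF implicates a local source direction.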

  4. Experimental study on source efficiencies for estimating surface contamination level

    International Nuclear Information System (INIS)

    Ichiji, Takeshi; Ogino, Haruyuki

    2008-01-01

    Source efficiency was measured experimentally for various materials, such as metals, nonmetals, flooring materials, sheet materials and other materials, contaminated by alpha and beta emitter radioactive nuclides. Five nuclides, 147 Pm, 60 Co, 137 Cs, 204 Tl and 90 Sr- 90 Y, were used as the beta emitters, and one nuclide 241 Am was used as the alpha emitter. The test samples were prepared by placing drops of the radioactive standardized solutions uniformly on the various materials using an automatic quantitative dispenser system from Musashi Engineering, Inc. After placing drops of the radioactive standardized solutions, the test materials were allowed to dry for more than 12 hours in a draft chamber with a hood. The radioactivity of each test material was about 30 Bq. Beta rays or alpha rays from the test materials were measured with a 2-pi gas flow proportional counter from Aloka Co., Ltd. The source efficiencies of the metals, nonmetals and sheet materials were higher than 0.5 in the case of contamination by the 137 Cs, 204 Tl and 90 Sr- 90 Y radioactive standardized solutions, higher than 0.4 in the case of contamination by the 60 Co radioactive standardized solution, and higher than 0.25 in the case of contamination by the alpha emitter the 241 Am radioactive standardized solution. These values were higher than those given in Japanese Industrial Standards (JIS) documents. In contrast, the source efficiencies of some permeable materials were lower than those given in JIS documents, because source efficiency varies depending on whether the materials or radioactive sources are wet or dry. This study provides basic data on source efficiency, which is useful for estimating the surface contamination level of materials. (author)

  5. Post-processing of Monte Carlo simulations for rapid BNCT source optimization studies

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-01-01

    A great advantage of some neutron sources, such as accelerator-produced sources, is that they can be tuned to produce different spectra. Unfortunately, optimization studies are often time-consuming and difficult, as they require a lengthy Monte Carlo simulation for each source. When multiple characteristics, such as energy, angle, and spatial distribution of a neutron beam are allowed to vary, an overwhelming number of simulations may be required. Many optimization studies, therefore, suffer from a small number of datapoints, restrictive treatment conditions, or poor statistics. By scoring pertinent information from every particle tally in a Monte Carlo simulation, then applying appropriate source variable weight factors in a post-processing algorithm, a single simulation can be used to model any number of multiple sources. Through this method, the response to a new source can be modeled in minutes or seconds, rather than hours or days, allowing for the analysis of truly variable source conditions of much greater resolution than is normally possible when a new simulation must be run for each datapoint in a study. This method has been benchmarked and used to recreate optimization studies in a small fraction of the time spent in the original studies

  6. Post-processing of Monte Carlo simulations for rapid BNCT source optimization studies

    International Nuclear Information System (INIS)

    Bleuel, D.L.; Chu, W.T.; Donahue, R.J.; Ludewigt, B.A.; Vujic, J.

    2000-01-01

    A great advantage of some neutron sources, such as accelerator-produced sources, is that they can be tuned to produce different spectra. Unfortunately, optimization studies are often time-consuming and difficult, as they require a lengthy Monte Carlo simulation for each source. When multiple characteristics, such as energy, angle, and spatial distribution of a neutron beam are allowed to vary, an overwhelming number of simulations may be required. Many optimization studies, therefore, suffer from a small number of data points, restrictive treatment conditions, or poor statistics. By scoring pertinent information from every particle tally in a Monte Carlo simulation, then applying appropriate source variable weight factors in a post-processing algorithm, a single simulation can be used to model any number of multiple sources. Through this method, the response to a new source can be modeled in minutes or seconds, rather than hours or days, allowing for the analysis of truly variable source conditions of much greater resolution than is normally possible when a new simulation must be run for each data point in a study. This method has been benchmarked and used to recreate optimization studies in a small fraction of the time spent in the original studies. (author)
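The post-processing idea in these two records, scoring per-particle source variables once and then reweighting tallies for any new source, is importance reweighting. A one-dimensional sketch follows (energy only; the uniform umbrella spectrum, the exponential tally response, and the box-shaped new spectra are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
# Sample particles once from a broad "umbrella" source spectrum.
n = 100_000
E = rng.uniform(0.0, 10.0, n)                  # sampled energies (MeV), pdf = 1/10
p_sample = 1.0 / 10.0
# Toy tally contribution per particle (e.g., dose at a point), energy-dependent.
tally = np.exp(-E / 4.0)

def retally(new_pdf):
    """Re-estimate the tally for a new source spectrum without re-running MC:
    weight each particle by the ratio of the new pdf to the sampled pdf."""
    w = new_pdf(E) / p_sample                  # importance weights
    return (w * tally).mean()

def box_pdf(lo, hi):
    """Uniform spectrum on [lo, hi) as a stand-in for a tuned source."""
    return lambda e: ((e >= lo) & (e < hi)) / (hi - lo)

# Two hypothetical tuned sources, evaluated from the same particle set.
d_low = retally(box_pdf(1.5, 2.5))             # narrow source around 2 MeV
d_high = retally(box_pdf(7.5, 8.5))            # narrow source around 8 MeV
```

Each call to `retally` is a cheap array operation over the stored tallies, which is why scanning many candidate source spectra takes seconds instead of one full Monte Carlo run per spectrum.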

  7. Meta-analysis on Methane Mitigating Properties of Saponin-rich Sources in the Rumen: Influence of Addition Levels and Plant Sources

    Directory of Open Access Journals (Sweden)

    Anuraga Jayanegara

    2014-10-01

    Full Text Available Saponins have been considered promising natural substances for mitigating methane emissions from ruminants. However, studies report that the addition of saponin-rich sources often arrives at contrasting results, i.e. either it decreases methane or it does not. The aim of the present study was to assess ruminal methane emissions through a meta-analytical approach, integrating related studies from published papers which described various levels of different saponin-rich sources added to ruminant feed. A database was constructed from published literature reporting the addition of saponin-rich sources at various levels and the subsequent monitoring of ruminal methane emissions in vitro. Accordingly, the levels of saponin-rich source addition as well as the different saponin sources were specified in the database. Apart from methane, other related rumen fermentation parameters were also included in the database, i.e. organic matter digestibility, gas production, pH, ammonia concentration, short-chain fatty acid profiles and protozoal count. A total of 23 studies comprising 89 data points met the inclusion criteria. The data obtained were subsequently subjected to a statistical meta-analysis based on mixed model methodology. Accordingly, different studies were treated as random effects whereas levels of saponin-rich source addition or different saponin sources were considered fixed effects. Model statistics used were the p-value and the root mean square error. Results showed that adding increasing levels of a saponin-rich source decreased methane emission per unit of substrate incubated as well as per unit of total gas produced. Although the sources ranked tea > quillaja in methane mitigation, statistically they did not differ from each other. It can be concluded that the methane mitigating properties of saponins in the rumen are level- and source-dependent.
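The paper fits a mixed model with studies as random effects; a simpler random-effects pooling in the same spirit is the DerSimonian-Laird estimator, sketched below. The effect sizes and within-study variances are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical per-study effects (e.g., change in CH4 per unit substrate)
# and their within-study variances.
y = np.array([-0.8, -0.5, -1.1, -0.2, -0.6])   # effect sizes (illustrative)
v = np.array([0.04, 0.09, 0.06, 0.10, 0.05])   # within-study variances

# Fixed-effect pooling and Cochran's Q heterogeneity statistic
w_fixed = 1.0 / v
y_fixed = (w_fixed * y).sum() / w_fixed.sum()
Q = (w_fixed * (y - y_fixed) ** 2).sum()
df = len(y) - 1

# DerSimonian-Laird between-study variance (truncated at zero)
c = w_fixed.sum() - (w_fixed ** 2).sum() / w_fixed.sum()
tau2 = max(0.0, (Q - df) / c)

# Random-effects pooled estimate and its standard error
w_re = 1.0 / (v + tau2)
y_re = (w_re * y).sum() / w_re.sum()
se_re = w_re.sum() ** -0.5
```

Treating studies as random effects (here via `tau2`) widens the weights toward equality and the confidence interval accordingly, which is why heterogeneous in vitro studies can be pooled at all.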

  8. An Analysis of Air Pollution in Makkah - a View Point of Source Identification

    Directory of Open Access Journals (Sweden)

    Turki M. Habeebullah

    2013-07-01

    Full Text Available Makkah is one of the busiest cities in Saudi Arabia and remains busy all year round, especially during the season of Hajj and the month of Ramadan, when millions of people visit the city. This emphasizes the importance of clean air and of understanding the sources of various air pollutants, which is vital for the management and advanced modeling of air pollution. This study intends to identify the major sources of air pollutants in Makkah, near the Holy Mosque (Al-Haram), using a graphical approach. The air pollutants considered in this study are nitrogen oxides (NOx), nitrogen dioxide (NO2), nitric oxide (NO), carbon monoxide (CO), sulphur dioxide (SO2), ozone (O3) and particulate matter with an aerodynamic diameter of 10 µm or less (PM10). Polar plots, time variation plots and correlation analysis are used to analyse the data and identify the major sources of emissions. Most of the pollutants demonstrate high concentrations during the morning traffic peak hours, suggesting road traffic as the main source of emission. The main sources of pollutant emissions identified in Makkah were road traffic and re-suspended and windblown dust and sand particles. Further investigation on detailed source apportionment is required, which is part of the ongoing project.

  9. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation (p_unc) gives the best choice for the model performance evaluation when a conservative approach is adopted.
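The source-identification step, matching a factor profile against library source profiles by similarity, can be sketched with a Pearson correlation. The species list, profiles, and source names below are hypothetical, and DeltaSA's actual similarity indicators may differ:

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two chemical profiles."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

# Hypothetical relative chemical profiles (mass fractions by species).
species = ["OC", "EC", "SO4", "NO3", "Fe", "Zn"]
factor = [0.30, 0.10, 0.25, 0.20, 0.10, 0.05]      # profile from a factor model
library = {                                         # candidate source profiles
    "traffic exhaust": [0.35, 0.15, 0.15, 0.20, 0.10, 0.05],
    "secondary sulfate": [0.10, 0.02, 0.60, 0.20, 0.05, 0.03],
    "crustal dust": [0.05, 0.02, 0.08, 0.05, 0.70, 0.10],
}
# Assign the factor to the most similar library source.
best = max(library, key=lambda k: pearson(factor, library[k]))
```

The factor is assigned to whichever library profile correlates best with it, which is the essence of automated factor-to-source assignment.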

  10. Source apportionment studies on particulate matter in Beijing/China

    Science.gov (United States)

    Suppan, P.; Shen, R.; Shao, L.; Schrader, S.; Schäfer, K.; Norra, S.; Vogel, B.; Cen, K.; Wang, Y.

    2013-05-01

    measured dust storm concentration variability at Beijing over time. The results show the importance of intertwined investigations of measurements and modeling, of the analysis of local air pollution levels, and of the impact and analysis of advective processes in the greater Beijing region. Comprehensive investigations of particulate matter are a prerequisite for knowing the source strengths and source attribution behind the overall air pollution level. Only this knowledge can help to formulate and introduce specific reduction measures for coarser as well as finer particulates.

  11. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  12. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Science.gov (United States)

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
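The short-term feature extraction that pyAudioAnalysis performs can be illustrated generically. The sketch below computes two of the classic frame-level features, energy and zero-crossing rate, in plain NumPy; it is not the library's API, whose function names and feature set differ:

```python
import numpy as np

def short_term_features(x, fs, win=0.050, step=0.025):
    """Frame-level energy and zero-crossing rate over a sliding window,
    two of the short-term features an audio-analysis library extracts."""
    w, s = int(win * fs), int(step * fs)
    feats = []
    for start in range(0, len(x) - w + 1, s):
        frame = x[start:start + w]
        energy = float(np.mean(frame ** 2))
        # Fraction of adjacent-sample pairs whose sign changes
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))
        feats.append((energy, zcr))
    return np.array(feats)

# A 1-second, 440 Hz test tone at 16 kHz as a stand-in for real audio.
fs = 16000
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
feats = short_term_features(tone, fs)              # shape: (n_frames, 2)
```

For a pure tone the frame energy is constant (amplitude squared over two) and the zero-crossing rate is about twice the tone frequency divided by the sample rate, which makes the sketch easy to sanity-check before feeding real recordings or a classifier.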

  13. A tsunami wave propagation analysis for the Ulchin Nuclear Power Plant considering the tsunami sources of western part of Japan

    International Nuclear Information System (INIS)

    Rhee, Hyun Me; Kim, Min Kyu; Sheen, Dong Hoon; Choi, In Kil

    2013-01-01

    The accident caused by the Great East Japan earthquake and tsunami in 2011 occurred at the Fukushima Nuclear Power Plant (NPP) site. It is clear that an NPP accident can be triggered by a tsunami; therefore, a Probabilistic Tsunami Hazard Analysis (PTHA) should be required for NPP sites in Korea. The PTHA methodology is based on the Probabilistic Seismic Hazard Analysis (PSHA) method and is performed using various tsunami sources and their weights. In this study, fault sources in the northwestern part of Japan, as suggested by the Atomic Energy Society of Japan (AESJ), were used as the tsunami sources. Performing a PTHA requires the maximum and minimum wave elevations calculated from tsunami simulations. Thus, in this study, tsunami wave propagation analyses were performed as a basis for a future PTHA

  14. Studies of electron cyclotron resonance ion source plasma physics

    International Nuclear Information System (INIS)

    Tarvainen, O.

    2005-01-01

    This thesis consists of an introduction to the plasma physics of electron cyclotron resonance ion sources (ECRIS) and a review of the results obtained by the author and co-workers including discussion of related work by others. The thesis begins with a theoretical discussion dealing with plasma physics relevant for the production of highly charged ions in ECR ion source plasmas. This is followed by an overview of different techniques, such as gas mixing and double frequency heating, that can be used to improve the performance of this type of ion source. The experimental part of the work consists of studies related to ECRIS plasma physics. The effect of the gas mixing technique on the production efficiency of different ion beams was studied with both gaseous and solid materials. It was observed that gas mixing improves the confinement of the heavier element while the confinement of the lighter element is reduced. When the effect of gas mixing on MIVOC-plasmas was studied with several mixing gases it was observed that applying this technique can reduce the inevitable carbon contamination by a significant factor. In order to understand the different plasma processes taking place in ECRIS plasmas, a series of plasma potential and emittance measurements was carried out. An instrument, which can be used to measure the plasma potential in a single measurement without disturbing the plasma, was developed for this work. Studying the plasma potential of ECR ion sources is important not only because it helps to understand different plasma processes, but also because the information can be used as an input parameter for beam transport simulations and ion source extraction design. The experiments performed have revealed clear dependencies of the plasma potential on certain source parameters such as the amount of carbon contamination accumulated on the walls of the plasma chamber during a MIVOC-run. 
It was also observed that gas mixing affects not only the production efficiency

  15. Renewable energy sources cost benefit analysis and prospects for Italy

    International Nuclear Information System (INIS)

    Ariemma, A.; Montanino, G.

    1992-01-01

    In light of Italy's over-dependency on imported oil, and due to this nation's commitment to the pursuit of the strict environmental protection policies of the European Communities, ENEL (the Italian National Electricity Board) has become actively involved in research efforts aimed at the commercialization of renewable energy sources - photovoltaic, wind, biomass, and mini-hydraulic. Through the use of energy production cost estimates based on current and near-future levels of technological advancement, this paper assesses prospects for the different sources. The advantages and disadvantages of each source in its use as a suitable complementary energy supply satisfying specific sets of constraints regarding siting, weather, capital and operating costs, maintenance, etc., are pointed out. In comparing the various alternatives, the paper also considers environmental benefits and commercialization feasibility in terms of time and outlay

  16. Sources for comparative studies of placentation I. Embryological collections

    DEFF Research Database (Denmark)

    Carter, Anthony Michael

    2008-01-01

    A rich source of material for comparative studies of the placenta is the collections made by pioneers in the field such as H.W. Mossman, A.A.W. Hubrecht and J.P. Hill. This overview gives a brief description of collections known to be available and information on how each can be accessed. Included are some of the major series of human and animal embryos, such as the Boyd and Carnegie collections, as these also house placental material.

  17. French studies for improvement of the data for radioactive sources

    International Nuclear Information System (INIS)

    Duchemin, B.; Nimal, B.; Nimal, J.C.; Blachot, J.; Chouha, M.

    1988-01-01

    The 1987 version of the CEA radioactivity data bank has just been distributed. This data bank is used to compute concentrations, activities, and β and γ spectra, which provide source terms for shielding purposes. To improve the data bank at short cooling times (t < 200 s), a methodology based on the statistical model is used to take account of the unknown upper levels. As an example of the results obtained, a brief summary of the studies made for the Chernobyl case is given

  18. Study of classification and disposed method for disused sealed radioactive source in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Hoon; Kim, Ju Youl; Lee, Seung Hee [FNC Technology Co., Ltd.,Yongin (Korea, Republic of)

    2016-09-15

    In accordance with the radioactive waste classification system in Korea, all disused sealed radioactive sources (DSRSs) fall under the category of EW, VLLW or LILW and should be managed in compliance with the restrictions on the disposal method. In this study, management and disposal methods are derived in consideration of the half-lives of the radionuclides contained in the source and the A/D value (i.e. the activity A of the source divided by the D value for the relevant radionuclide, used to provide an initial ranking of relative risk for sources), in addition to the domestic classification scheme and disposal method, based on characteristic analysis and a review of management practices in the IAEA and foreign countries. For all DSRSs in storage (as of March 2015) in the centralized temporary disposal facility for radioisotope wastes, the applicability of the derived approach is confirmed by performing characteristic analyses and case studies assessing the quantity and volume of DSRSs to be managed by each method. However, the methodology derived in this study is not applicable to the following sources: i) DSRSs without information on their radioactivity; ii) DSRSs for which the specific activity and/or the source-specific A/D value cannot be calculated. Accordingly, it is essential to identify the inherent characteristics of each DSRS prior to implementation of this management and disposal method.

  19. HFIR cold neutron source moderator vessel design analysis

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-04-01

    A cold neutron source capsule made of aluminum alloy is to be installed at the tip of one of the neutron beam tubes of the High Flux Isotope Reactor. Liquid hydrogen at a temperature of approximately 20 K and a pressure of 15 bar is designed to flow through the aluminum capsule, which serves to chill and moderate the incoming neutrons produced by the reactor core. The cold, low-energy neutrons thus produced will be used as a cold neutron source for diffraction experiments. The structural design calculation for the aluminum capsule is reported in this paper

  20. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...

  1. Operational analysis and comparative evaluation of embedded Z-Source inverters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Gao, F.; Loh, P.C.

    2008-01-01

    This paper presents various embedded Z-source (EZ-source) inverters, broadly classified as shunt or parallel embedded Z-source inverters. Unlike the traditional Z-source inverter, EZ-source inverters are constructed by inserting dc sources into the X-shaped impedance network so that the dc input current flows smoothly during the whole switching period. This feature is interesting when PV panels or fuel cells power the load, since the continuous input current flow reduces the control complexity of the dc source and the system design burden… circuitry connected instead of the generic voltage source inverter (VSI) circuitry. Proceeding further to the topological variation, parallel embedded Z-source inverters are presented with a detailed analysis of the topological configuration and operational principles, showing that they are the superior…
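    For context, the voltage gain of the traditional Z-source inverter is governed by the standard boost-factor relation B = 1/(1 - 2·D0), where D0 is the shoot-through duty ratio; a minimal sketch follows (the assumption that embedded variants retain this ideal gain is ours, made for illustration):

```python
def boost_factor(d0):
    """Ideal Z-source inverter boost factor B = 1 / (1 - 2*D0)
    for shoot-through duty ratio d0 in [0, 0.5)."""
    if not 0 <= d0 < 0.5:
        raise ValueError("shoot-through duty ratio must be in [0, 0.5)")
    return 1.0 / (1.0 - 2.0 * d0)

# B grows without bound as D0 approaches 0.5
for d0 in (0.0, 0.1, 0.25, 0.4):
    print(d0, boost_factor(d0))
```

    At D0 = 0.25 the dc-link voltage is doubled, which is the buck-boost behavior that distinguishes Z-source topologies from the generic VSI.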

  2. Dust Storm over the Middle East: Retrieval Approach, Source Identification, and Trend Analysis

    Science.gov (United States)

    Moridnejad, A.; Karimi, N.; Ariya, P. A.

    2014-12-01

    The Middle East region has been considered responsible for approximately 25% of the Earth's global emissions of dust particles. By developing the Middle East Dust Index (MEDI) and applying it to 70 dust storms characterized on MODIS images during the period between 2001 and 2012, we herein present a new high-resolution mapping of the major atmospheric dust source points in this region. To assist environmental managers and decision makers in taking proper and prioritized measures, we then categorize the identified sources in terms of intensity, based on indices extracted for the Deep Blue algorithm, and also utilize a frequency-of-occurrence approach to find the sensitive sources. In the next step, by implementing spectral mixture analysis on Landsat TM images (1984 and 2012), a novel desertification map is presented. The aim is to understand how human perturbations and land-use change have influenced the dust storm points in the region. Preliminary results of this study indicate for the first time that ca. 39% of all detected source points are located in this newly anthropogenically desertified area. A large number of low-frequency sources are located within or close to the newly desertified areas. These severely desertified regions require immediate concern at a global scale. During the next six months, further research will be performed to confirm these preliminary results.

  3. Reactor Core Design and Analysis for a Micronuclear Power Source

    Directory of Open Access Journals (Sweden)

    Hao Sun

    2018-03-01

    Full Text Available Underwater vehicles are designed to secure national maritime boundaries, which places demanding requirements on their power system design. Conventional power sources, such as batteries and Stirling engines, feature low power and short lifetimes. A micronuclear reactor power source, featuring higher power density and longer lifetime, would well meet the demands of an unmanned underwater vehicle power system. In this paper, a 2.4 MWt lithium heat pipe cooled reactor core is designed for a micronuclear power source that can be applied to underwater vehicles. The core features small volume, high power density, long lifetime, and a low noise level. Uranium nitride fuel with 70% enrichment and lithium heat pipes are adopted in the core. The reactivity is controlled by six control drums with B4C neutron absorber. The Monte Carlo code MCNP is used to calculate the power distribution, reactivity feedback characteristics, and core criticality safety. The code MCORE, coupling MCNP and ORIGEN, is used to analyze the burnup characteristics of the designed core. The results show that the core life is 14 years and that the core parameters satisfy the safety requirements. This work provides a reference for the design and application of micronuclear power sources.

  4. Source term analysis for a RCRA mixed waste disposal facility

    International Nuclear Information System (INIS)

    Jordan, D.L.; Blandford, T.N.; MacKinnon, R.J.

    1996-01-01

    A Monte Carlo transport scheme was used to estimate the source strength resulting from potential releases from a mixed waste disposal facility. Infiltration rates were estimated using the HELP code, and transport through the facility was modeled using the DUST code, linked to a Monte Carlo driver

  5. Stability analysis of direct current control in current source rectifier

    DEFF Research Database (Denmark)

    Lu, Dapeng; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    Current source rectifier with high switching frequency has a great potential for improving the power efficiency and power density in ac-dc power conversion. This paper analyzes the stability of direct current control based on the time delay effect. Small signal model including dynamic behaviors...

  6. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to composed texts. The method is aimed at serving scholars in building a more detailed understanding of how…

  7. Tracing diffuse anthropogenic Pb sources in rural soils by means of Pb isotope analysis

    NARCIS (Netherlands)

    Walraven, N.; Gaans, P.F.M. van; Veer, G. van der; Os, B.J.H. van; Klaver, G.T.; Vriend, S.P.; Middelburg, J.J.; Davies, G.R.

    2013-01-01

    Knowledge of the cause and source of Pb pollution is important to abate environmental Pb pollution by taking source-related actions. Lead isotope analysis is a potentially powerful tool to identify anthropogenic Pb and its sources in the environment. Spatial information on the variation of

  8. Identifying sources of emerging organic contaminants in a mixed use watershed using principal components analysis.

    Science.gov (United States)

    Karpuzcu, M Ekrem; Fairbairn, David; Arnold, William A; Barber, Brian L; Kaufenberg, Elizabeth; Koskinen, William C; Novak, Paige J; Rice, Pamela J; Swackhamer, Deborah L

    2014-01-01

    Principal components analysis (PCA) was used to identify sources of emerging organic contaminants in the Zumbro River watershed in Southeastern Minnesota. Two main principal components (PCs) were identified, which together explained more than 50% of the variance in the data. Principal Component 1 (PC1) was attributed to urban wastewater-derived sources, including municipal wastewater and residential septic tank effluents, while Principal Component 2 (PC2) was attributed to agricultural sources. The variances of the concentrations of cotinine, DEET and the prescription drugs carbamazepine, erythromycin and sulfamethoxazole were best explained by PC1, while the variances of the concentrations of the agricultural pesticides atrazine, metolachlor and acetochlor were best explained by PC2. Mixed use compounds carbaryl, iprodione and daidzein did not specifically group with either PC1 or PC2. Furthermore, despite the fact that caffeine and acetaminophen have been historically associated with human use, they could not be attributed to a single dominant land use category (e.g., urban/residential or agricultural). Contributions from septic systems did not clarify the source for these two compounds, suggesting that additional sources, such as runoff from biosolid-amended soils, may exist. Based on these results, PCA may be a useful way to broadly categorize the sources of new and previously uncharacterized emerging contaminants or may help to clarify transport pathways in a given area. Acetaminophen and caffeine were not ideal markers for urban/residential contamination sources in the study area and may need to be reconsidered as such in other areas as well.

  9. Polarization Study for NLC Positron Source Using EGS4

    Energy Technology Data Exchange (ETDEWEB)

    Liu, James C

    2000-09-20

    SLAC is exploring a polarized positron source to study new physics for the NLC project. The positron source envisioned in this paper consists of a polarized electron source, a 50-MeV electron accelerator, a thin target (≤ 0.2 radiation length) for positron production, and a capture system for high-energy, small-angular-divergence positrons. The EGS4 code was used to study the yield, energy spectra, emission-angle distribution, and mean polarization of the positrons emanating from W-Re and Ti targets hit by longitudinally polarized electron and photon beams. To account for polarization within the EGS4 code, a method devised by Flottmann was used, which takes into account polarization transfer in pair production, bremsstrahlung, and Compton interactions. A mean polarization of 0.85 for positrons with energies greater than 25 MeV was obtained. Most of the high-energy positrons were emitted within a forward angle of 20 degrees. The yield of positrons above 25 MeV per incident photon was 0.034, about 70 times higher than that obtained with an electron beam.

  10. Study of spear as a dedicated source of synchrotron radiation

    International Nuclear Information System (INIS)

    Cerino, J.; Golde, A.; Hastings, J.; Lindau, I.; Salsburg, B.; Winick, H.; Lee, M.; Morton, P.; Garren, A.

    1977-11-01

    A study was made of the potential of SPEAR as a dedicated source of synchrotron radiation, based on the expectation that SPEAR will become increasingly available for this purpose as PEP, the 18-GeV colliding-beam storage ring now under construction by LBL and SLAC, becomes operational. A synchrotron radiation research program has been underway since May 1974. Two beam ports capable of serving 9 simultaneous users are now operational. In single-beam multi-bunch operation, high currents are possible (225 mA has been achieved and approximately 300 mA or more is expected) and the electron beam emittance can be made smaller, resulting in higher source-point brightness. Descriptions are given of SPEAR capabilities and of plans to expand the research capability by adding beam runs and by inserting wiggler magnets in SPEAR straight sections

  11. Beam Collimation Studies for the ILC Positron Source

    Energy Technology Data Exchange (ETDEWEB)

    Drozhdin, A.; /Fermilab; Nosochkov, Y.; Zhou, F.; /SLAC

    2008-06-26

    Results of the collimation studies for the ILC positron source beam line are presented. The calculations of primary positron beam loss are done using the ELEGANT code. The secondary positron and electron beam loss, the synchrotron radiation along the beam line and the bremsstrahlung radiation in the collimators are simulated using the STRUCT code. The first part of the collimation system, located right after the positron source target (0.125 GeV), is used for protection of the RF Linac sections from heating and radiation. The second part of the system is used for final collimation before the beam injection into the Damping Ring at 5 GeV. The calculated power loss in the collimation region is within 100 W/m, with the loss in the collimators of 0.2-5 kW. The beam transfer efficiency from the target to the Damping Ring is 13.5%.

  12. The advanced neutron source facility: Safety philosophy and studies

    International Nuclear Information System (INIS)

    Greene, S.R.; Harrington, R.M.

    1988-01-01

    The Advanced Neutron Source (ANS) is currently the only new civilian nuclear reactor facility proposed for construction in the United States. Even though the thermal power of this research-oriented reactor is a relatively low 300 MW, the design will undoubtedly receive intense scrutiny before construction is allowed to proceed. Safety studies are already under way to ensure that the maximum degree of safety is incorporated into the design and that the design is acceptable to the Department of Energy (DOE) and can meet the Nuclear Regulatory Commission regulations. This document discusses these safety studies

  13. Sources of political violence, political and psychological analysis

    Directory of Open Access Journals (Sweden)

    O. B. Balatska

    2015-05-01

    We also consider the following approaches to determining the nature and sources of aggression and violence: instinctivism (K. Lorenz) and behaviorism (J. B. Watson, B. F. Skinner, et al.). Special attention is paid to frustration-aggression theories (J. Dollard, N. E. Miller, L. Berkowitz, et al.), according to which the causes of aggression and violence are hidden in a particular mental state: frustration. The particular importance of the theory of T. R. Gurr, in which the sources of aggression and political violence are defined through the concept of relative deprivation, is underlined. Another approach described in the article is the concept of aggression as a learned reaction (A. Bandura, G. Levin, B. Fleischmann, et al.). Supporters of this approach believe that aggressive behavior is formed in the process of social learning.

  14. The Spallation Neutron Source (SNS) conceptual design shielding analysis

    International Nuclear Information System (INIS)

    Johnson, J.O.; Odano, N.; Lillie, R.A.

    1998-03-01

    The shielding design is important for the construction of an intense high-energy accelerator facility like the proposed Spallation Neutron Source (SNS) because of its impact on the conventional facility design and maintenance operations, and because the radiation shielding represents a considerable part of the total facility costs. A calculational strategy utilizing coupled high-energy Monte Carlo calculations and multi-dimensional discrete ordinates calculations, along with semi-empirical calculations, was implemented to perform the conceptual design shielding assessment of the proposed SNS. Biological shields have been designed and assessed for the proton beam transport system and associated beam dumps, the target station, and the target service cell and general remote maintenance cell. Shielding requirements have been assessed with respect to weight, space, and dose-rate constraints for operating, shutdown, and accident conditions. A discussion of the proposed facility design, the conceptual design shielding requirements, the calculational strategy, source terms, preliminary results and conclusions, and recommendations for additional analyses is presented

  15. National Synchrotron Light Source safety-analysis report

    International Nuclear Information System (INIS)

    Batchelor, K.

    1982-07-01

    This document covers all of the safety issues relating to the design and operation of the storage rings and injection system of the National Synchrotron Light Source. The building systems for fire protection, access, and egress are described, together with air and other gaseous control or venting systems. Details of the shielding against prompt bremsstrahlung radiation and synchrotron radiation are described, and the administrative requirements to be satisfied for operation of a beam line at the facility are given

  16. Analysis of source spectra, attenuation, and site effects from central and eastern United States earthquakes

    International Nuclear Information System (INIS)

    Lindley, G.

    1998-02-01

    This report describes the results from three studies of source spectra, attenuation, and site effects of central and eastern United States earthquakes. In the first study source parameter estimates taken from 27 previous studies were combined to test the assumption that the earthquake stress drop is roughly a constant, independent of earthquake size. 200 estimates of stress drop and seismic moment from eastern North American earthquakes were combined. It was found that the estimated stress drop from the 27 studies increases approximately as the square root of the seismic moment, from about 3 bars at 10²⁰ dyne-cm to 690 bars at 10²⁵ dyne-cm. These results do not support the assumption of a constant stress drop when estimating ground motion parameters from eastern North American earthquakes. In the second study, broadband seismograms recorded by the United States National Seismograph Network and cooperating stations have been analysed to determine Q_Lg as a function of frequency in five regions: the northeastern US, southeastern US, central US, northern Basin and Range, and California and western Nevada. In the third study, using spectral analysis, estimates have been made for the anelastic attenuation of four regional phases, and estimates have been made for the source parameters of 27 earthquakes, including the m_b 5.6, 14 April 1995, West Texas earthquake
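    A quick computation from the two reported endpoints shows why the scaling is described as approximately square-root:

```python
import math

# Endpoints reported in the study: 3 bars at 1e20 dyne-cm,
# 690 bars at 1e25 dyne-cm
m0_lo, m0_hi = 1e20, 1e25     # seismic moment (dyne-cm)
sd_lo, sd_hi = 3.0, 690.0     # stress drop (bars)

# If stress drop scales as M0**n, the exponent is the log-log slope
n = math.log10(sd_hi / sd_lo) / math.log10(m0_hi / m0_lo)
print(round(n, 2))  # → 0.47, close to square-root (n = 0.5) scaling
```
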

  17. Analysis of polymer foil heaters as infrared radiation sources

    International Nuclear Information System (INIS)

    Witek, Krzysztof; Piotrowski, Tadeusz; Skwarek, Agata

    2012-01-01

    Infrared radiation as a heat source is used in many fields. In particular, the positive effect of far-infrared radiation on living organisms has been observed. This paper presents two technological solutions for infrared heater production using polymer-silver and polymer-carbon pastes screen-printed on foil substrates. The purpose of this work was the identification of polymer layers as IR radiation sources in a specific frequency range. The characterization of the heaters was determined mainly by measurement of the surface temperature distribution using a thermovision camera, and the spectral characteristics were determined using a special measuring system. Basic parameters obtained for both polymer-silver and polymer-carbon heaters were similar and were as follows: power rating of 10–12 W/dm², continuous working surface temperature of 80–90 °C, temperature coefficient of resistance (TCR) of about +900 ppm/K for the polymer-carbon heater and about +2000 ppm/K for the polymer-silver heater, maximum radiation intensity in the wavelength range of 6–14 μm with peak intensity at 8.5 μm, and a heating time of about 20 min. For comparison purposes, a commercial panel heater was tested. The results show that the characteristics of infrared polymer heaters are similar to those of the commercial heater, so they can be considered as alternative infrared radiation sources.
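    Under the usual linear TCR model (an assumption here; the nominal resistance R0 is a hypothetical value for illustration), the reported +2000 ppm/K figure translates into a modest resistance rise over the heater's working range:

```python
# Linear TCR model: R(T) = R0 * (1 + alpha * (T - T0))
alpha = 2000e-6          # +2000 ppm/K, as reported for the polymer-silver heater
R0, T0 = 100.0, 25.0     # ohms at 25 °C (hypothetical nominal resistance)
T = 85.0                 # midpoint of the 80-90 °C working-surface range

R = R0 * (1 + alpha * (T - T0))
print(round(R, 1))  # → 112.0 ohms: a ~12% rise over a 60 K excursion
```

    A positive TCR of this size also gives the element a mild self-limiting character: as the surface heats up, resistance rises and power draw falls slightly.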

  18. Analysis of Extended Z-source Inverter for Photovoltaic System

    Science.gov (United States)

    Prakash, G.; Subramani, C.; Dhineshkumar, K.; Rayavel, P.

    2018-04-01

    The Z-source inverter has gained prominence among many researchers as a single-stage buck-boost inverter topology. However, its boost capability can be limited, and it may therefore not be suitable for applications requiring a high boost ratio without cascading additional dc-dc boost converters, which would reduce efficiency and demand more sensing to control the extra stages. The Z-source inverter is a recent converter topology that exhibits both voltage-buck and voltage-boost capability. This paper proposes a new family of extended-boost quasi-Z-source inverters (ZSI) to fill the research gap left in the development of the ZSI. These new topologies can be operated with the same modulation strategies that were developed for the original ZSI. They also have the same number of active switches as the original ZSI, preserving its single-stage nature. The proposed topologies are analyzed in the steady state, and their performance is validated using simulation results obtained in MATLAB/Simulink. Furthermore, they are experimentally validated with results obtained from a prototype developed in the laboratory. The trend of rapidly increasing PV energy use is related to the increasing efficiency of solar cells as well as improvements in the manufacturing technology of solar panels.

  19. A radio and optical study of Molonglo radio sources

    Science.gov (United States)

    Ishwara-Chandra, C. H.; Saikia, D. J.; McCarthy, P. J.; van Breugel, W. J. M.

    2001-05-01

    We present multi-wavelength radio observations with the Very Large Array, and narrow- and broad-band optical observations with the 2.5-m telescope at the Las Campanas Observatory, of a well-defined sample of high-luminosity Fanaroff-Riley class II radio galaxies and quasars, selected from the Molonglo Reference Catalogue 1-Jy sample. These observations were carried out as part of a programme to investigate the effects of orientation and environment on some of the observed properties of these sources. We examine the dependence of the Liu-Pooley relationship, which shows that radio lobes with flatter radio spectra are less depolarized, on size, identification and redshift, and show that it is significantly stronger for smaller sources, with the strength of the relationship being similar for both radio galaxies and quasars. In addition to Doppler effects, there appear to be intrinsic differences between the lobes on opposite sides. We discuss the asymmetry in brightness and location of the hotspots, and present estimates of the ages and velocities from matched-resolution observations in the L and C bands. Narrow- and broad-band optical images of some of these sources were made to study their environments and correlate with the symmetry parameters. An extended emission-line region is seen in a quasar, and in four of the objects possible companion galaxies are seen close to the radio axis.

  20. Earthquake source studies and seismic imaging in Alaska

    Science.gov (United States)

    Tape, C.; Silwal, V.

    2015-12-01

    Alaska is one of the world's most seismically and tectonically active regions. Its enhanced seismicity, including slab seismicity down to 180 km, provides opportunities (1) to characterize pervasive crustal faulting and slab deformation through the estimation of moment tensors and (2) to image subsurface structures to help understand the tectonic evolution of Alaska. Most previous studies of earthquakes and seismic imaging in Alaska have emphasized earthquake locations and body-wave travel-time tomography. In the past decade, catalogs of seismic moment tensors have been established, while seismic surface waves, active-source data, and potential field data have been used to improve models of seismic structure. We have developed moment tensor catalogs in the regions of two of the largest sedimentary basins in Alaska: Cook Inlet forearc basin, west of Anchorage, and Nenana basin, west of Fairbanks. Our moment tensor solutions near Nenana basin suggest a transtensional tectonic setting, with the basin developing in a stepover of a left-lateral strike-slip fault system. We explore the effects of seismic wave propagation from point-source and finite-source earthquake models by performing three-dimensional wavefield simulations using seismic velocity models that include major sedimentary basins. We will use our catalog of moment tensors within an adjoint-based, iterative inversion to improve the three-dimensional tomographic model of Alaska.

  1. Critical analysis of documentary sources for Historical Climatology of Northern Portugal (17th-19th centuries)

    Science.gov (United States)

    Amorim, Inês; Sousa Silva, Luís; Garcia, João Carlos

    2017-04-01

    Critical analysis of documentary sources for Historical Climatology of Northern Portugal (17th-19th centuries) Inês Amorim CITCEM, Department of History, Political and International Studies, U. of Porto, Portugal. Luís Sousa Silva CITCEM, PhD Fellowship - FCT. João Carlos Garcia CIUHCT, Geography Department, U. of Porto, Portugal. The first major national project on Historical Climatology in Portugal, called "KLIMHIST: Reconstruction and model simulations of past climate in Portugal using documentary and early instrumental sources (17th-19th centuries)", ended in September 2015, coordinated by Maria João Alcoforado. The project began in March 2012 and brought together an interdisciplinary team of researchers from four Portuguese institutions (Centre of Geographical Studies, University of Trás-os-Montes and Alto Douro, University of Porto, and University of Évora) and from different fields of knowledge (Geography, History, Biology, Climatology and Meteorology). The team networked and collaborated with other international research groups on Climate Change and Historical Climatology, resulting in several publications. The project aimed to reconstruct thermal and rainfall patterns in Portugal between the 17th and 19th centuries, as well as to identify the main hydrometeorological extremes that occurred over that period. The basic methodology consisted in combining information from different types of anthropogenic sources (descriptive and instrumental) and natural sources (tree rings and geothermal holes), so as to develop climate change models of the past. The data collected were stored in a digital database, which can be searched by source, date, location and type of event. This database, which will be made publicly available soon, contains about 3500 weather/climate-related records, which have begun to be studied, processed and published. Following this seminal project, other initiatives have taken place in Portugal in the area of Historical Climatology, namely a Ph

  2. Analysis of geological material and especially ores by means of a 252Cf source

    International Nuclear Information System (INIS)

    Barrandon, J.N.; Borderie, B.; Melky, S.; Halfon, J.; Marce, A.

    1976-01-01

    Tests were made on the possibilities for analysis by 252Cf activation in the earth sciences and mining research. The results obtained show that while 252Cf activation can only resolve certain very specific geochemical research problems, it does allow the exact and rapid determination of numerous elements whose ores are of great economic importance, such as fluorine, titanium, vanadium, manganese, copper, antimony, barium, and tungsten. The utilization of activation analysis methods in the earth sciences is not a recent phenomenon. It has generally been limited to the analysis of traces in relatively small volumes by means of irradiation in nuclear reactors. Traditional neutron sources were little used and were not very applicable. The development of 252Cf isotopic sources emitting more intense neutron fluxes makes it possible to consider carrying out more sensitive determinations without making use of a nuclear reactor. In addition, this technique can be adapted for in situ analysis in mines and mine borings. Our work, which is centered upon the possibilities of instrumental laboratory analyses of geological materials through 252Cf activation, is oriented in two principal directions: the study of the experimental sensitivities of the various elements in different rocks with the usual compositions; and the study of the possibilities for routine ore analyses

  3. Linac Coherent Light Source (LCLS) Design Study Report

    Energy Technology Data Exchange (ETDEWEB)

    Cornacchia, Massimo

    1998-12-04

    The Stanford Linear Accelerator Center, in collaboration with Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and the University of California at Los Angeles, is proposing to build a Free-Electron-Laser (FEL) R&D facility operating in the wavelength range 1.5-15 Å. This FEL, called the "Linac Coherent Light Source" (LCLS), utilizes the SLAC linac and produces sub-picosecond pulses of short-wavelength x-rays with very high peak brightness and full transverse coherence. Starting in FY 1998, the first two-thirds of the SLAC linac will be used for injection into the B factory. This leaves the last one-third free for acceleration to 15 GeV. The LCLS takes advantage of this opportunity, opening the way for the next generation of synchrotron light sources with largely proven technology and cost-effective methods. This proposal is consistent with the recommendations of the Report of the Basic Energy Sciences Advisory Committee (Synchrotron Radiation Light Source Working Group, October 18-19, 1997). The report recognizes that "fourth-generation x-ray sources...will in all likelihood be based on the free electron laser concepts. If successful, this technology could yield improvements in brightness by many orders of magnitude." This Design Study, the authors believe, confirms the feasibility of constructing an x-ray FEL based on the SLAC linac. Although this design is based on a consistent and feasible set of parameters, some components require more research and development to guarantee the performance. Given appropriate funding, this R&D phase can be completed in 2 years.

  4. A multi-level differential item functioning analysis of trends in international mathematics and science study: Potential sources of gender and minority difference among U.S. eighth graders' science achievement

    Science.gov (United States)

    Qian, Xiaoyu

    Science is an area where a large achievement gap has been observed between White and minority students, and between male and female students. The minority gap in science has persisted, as indicated by the National Assessment of Educational Progress and the Trends in International Mathematics and Science Study (TIMSS). TIMSS also shows a gender gap favoring males emerging at the eighth grade. Both gaps grow wider still in the number of doctoral degrees and full professorships awarded (NSF, 2008). The current study investigated both minority and gender achievement gaps in science utilizing a multi-level differential item functioning (DIF) methodology (Kamata, 2001) within a fully Bayesian framework. All dichotomously coded items from the TIMSS 2007 science assessment at eighth grade were analyzed. Both gender DIF and minority DIF were studied. Multi-level models were employed to identify DIF items and sources of DIF at both the student and teacher levels. The study found that several student variables were potential sources of achievement gaps. It was also found that gender DIF favoring male students was more noticeable in the content areas of physics and earth science than in biology and chemistry. In terms of item type, the majority of these gender DIF items were multiple-choice rather than constructed-response items. Female students also performed less well on items requiring visual-spatial ability. Minority students performed significantly worse on physics and earth science items as well. A higher percentage of minority DIF items in earth science and biology were constructed-response rather than multiple-choice items, indicating that literacy may be a cause of minority DIF. Three-level model results suggested that some teacher variables may be the cause of DIF variations from teacher to teacher.
    It is essential for both middle school science teachers and science educators to find instructional methods that work more effectively to improve the science achievement of both female and minority students.

  5. School adjustment of children in residential care: a multi-source analysis.

    Science.gov (United States)

    Martín, Eduardo; Muñoz de Bustillo, María del Carmen

    2009-11-01

    School adjustment is one of the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of Perceptive Attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado" (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.

  6. Extracting functional components of neural dynamics with Independent Component Analysis and inverse Current Source Density.

    Science.gov (United States)

    Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K

    2010-12-01

    Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids one can reconstruct efficiently the current sources (CSD) using the inverse Current Source Density method (iCSD). It is possible to decompose the resultant spatiotemporal information about the current dynamics into functional components using Independent Component Analysis (ICA). We show on test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points that meaningful results are obtained with spatial ICA decomposition of reconstructed CSD. The components obtained through decomposition of CSD are better defined and allow easier physiological interpretation than the results of similar analysis of corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources but it does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components we use the technique to study the somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method brings up new, more detailed information on the time and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.
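
    The spatial ICA step described above can be sketched with a minimal deflationary FastICA in plain NumPy. The two synthetic "component" traces, the mixing matrix, and all parameters below are illustrative assumptions, not the authors' CSD data or code; the paper applies the same idea to CSD reconstructed on a 4 × 5 × 7 grid.

```python
import numpy as np

def whiten(X):
    """Center and whiten signals (rows = channels, columns = time samples)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    return (E @ np.diag(1.0 / np.sqrt(d)) @ E.T) @ X

def fastica(X, n_iter=300, seed=0):
    """Minimal deflationary FastICA with a tanh nonlinearity."""
    Z = whiten(X)
    n = Z.shape[0]
    rng = np.random.default_rng(seed)
    W = np.zeros((n, n))
    for i in range(n):
        w = rng.standard_normal(n)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(w @ Z)
            w_new = (Z * g).mean(axis=1) - (1.0 - g**2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)  # decorrelate from found components
            w_new /= np.linalg.norm(w_new)
            done = abs(abs(w_new @ w) - 1.0) < 1e-10
            w = w_new
            if done:
                break
        W[i] = w
    return W @ Z  # estimated independent components

# Two synthetic "functional components" mixed as if recorded on two channels
t = np.linspace(0, 8 * np.pi, 4000)
S = np.vstack([np.sin(t), np.sign(np.sin(3 * t))])
X = np.array([[1.0, 0.6], [0.5, 1.0]]) @ S  # instantaneous linear mixture
S_hat = fastica(X)
```

    Up to sign and ordering, each row of `S_hat` should correlate strongly with one of the original component traces.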

  7. Multicriteria analysis for sources of renewable energy using data from remote sensing

    Science.gov (United States)

    Matejicek, L.

    2015-04-01

    Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis for sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and socially acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study is focused on long term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results represent map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class is linked to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities. 
The results also show a slight increase in the more
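
    The reclassify-and-overlay step described above can be illustrated with plain NumPy raster algebra. The layer names, weights, grid values, and eligibility mask below are hypothetical stand-ins for the study's GIS inputs (solar maps, wind predictions, CORINE land cover), chosen only to show the mechanics of the common 1-7 preference scale.

```python
import numpy as np

rng = np.random.default_rng(1)
solar = rng.uniform(800, 1400, (5, 5))       # hypothetical insolation, kWh/m2/yr
wind = rng.uniform(2.0, 9.0, (5, 5))         # hypothetical mean wind speed, m/s
road_dist = rng.uniform(0.0, 10.0, (5, 5))   # hypothetical distance to road, km
eligible = rng.integers(0, 2, (5, 5))        # 1 = mining/agricultural cell (CORINE-style mask)

def reclassify(layer, invert=False):
    """Rescale any input layer to the common 1-7 preference scale (7 = most preferred)."""
    scaled = (layer - layer.min()) / (layer.max() - layer.min())
    cls = 1 + np.floor(scaled * 6.999).astype(int)
    return 8 - cls if invert else cls

# Weighted raster-algebra overlay on the common scale (weights are assumptions)
weights = {"solar": 0.4, "wind": 0.4, "road": 0.2}
suitability = (weights["solar"] * reclassify(solar)
               + weights["wind"] * reclassify(wind)
               + weights["road"] * reclassify(road_dist, invert=True))  # near roads preferred
suitability = np.where(eligible == 1, suitability, np.nan)  # restrict by land cover
```

    The masked `suitability` grid plays the role of the preference-class map layers produced in the study.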

  8. Linac design study for an intense neutron-source driver

    International Nuclear Information System (INIS)

    Lynch, M.T.; Browman, A.; DeHaven, R.; Jameson, R.; Jason, A.; Neuschaefer, G.; Tallerico, P.; Regan, A.

    1993-01-01

    The 1-MW spallation-neutron source under design study at Los Alamos is driven by a linac-compressor-ring scheme that utilizes a large portion of the existing Los Alamos Meson Physics Facility (LAMPF) linac, as well as the facility infrastructure. The project is referred to as the National Center for Neutron Research (NCNR). A second phase of the proposal will upgrade the driver power to 5 MW. A description of the 1-MW scheme is given in this paper. In addition, the upgrade path to the substantially increased beam power required for the 5-MW scenario is discussed

  9. Study of phenol extraction from coke-chemical sources

    Energy Technology Data Exchange (ETDEWEB)

    Catana, E.; Mateescu, I.; Giurcaneanu, V.; Bota, T.

    1990-09-01

    The paper presents an experimental study of the phase equilibrium in the coke-chemical tar phenols-solvent system (NaOH solution and phenolate solution) involved in the extraction of phenols from coke-chemical sources. The possibility of using the phenolate solution as an extraction agent is presented, making it possible to improve the specific consumption while simplifying the problems of corrosion and of waste water at the same time. The influence of the solvent-to-tar mass ratio on the selectivity of the process is discussed, this criterion being considered for establishing the conditions of the extraction. 2 figs., 7 tabs., 13 refs.

  10. Open source EMR software: profiling, insights and hands-on analysis.

    Science.gov (United States)

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit the potential implementers of electronic health/medical records systems. The number of examined systems and the measures by which to compare them vary across studies, but still rewarding insights start to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field are still far behind the highly acknowledged open source products in other domains, e.g. operating systems market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. Ground-source heat pump case studies and utility programs

    Energy Technology Data Exchange (ETDEWEB)

    Lienau, P.J.; Boyd, T.L.; Rogers, R.L.

    1995-04-01

    Ground-source heat pump systems are one of the promising new energy technologies that have shown a rapid increase in usage over the past ten years in the United States. These systems offer substantial benefits to consumers and utilities in energy (kWh) and demand (kW) savings. The purpose of this study was to determine what monitored data were available, mainly from electric utilities, on heat pump performance, energy savings and demand reduction for residential, school and commercial building applications. In order to verify the performance, information was collected for 253 case studies, mainly from utilities throughout the United States. The case studies were compiled into a database, organized into general information, system information, ground system information, system performance, and additional information. Information was also developed on the status of demand-side management programs for ground-source heat pumps at about 60 electric utilities and rural electric cooperatives, covering marketing, incentive programs, barriers to market penetration, number of units installed in the service area, and benefits.

  12. Overview of the Lombardy Region (I) Source Apportionment Study

    Science.gov (United States)

    Larsen, B. R.

    2009-04-01

    analysis of 700 filters including the bulk compounds OC, EC, nitrate, sulfate and ammonium together with a number of source marker compounds such as levoglucosan, K, Rb, PAH (wood combustion); linear alkanes (fuel/biogenic emissions); Fe, Cu, Sn, Sb, and Mo (brake wear); Ce, Rh, Pt, and Pd (vehicle exhaust catalysts); Ca, Al, Fe, Mg, K, Ti, Ce, and Sr (soil/dust re-suspension); Na, Cl (road salt); V and Ni (fuel oil); Zn (tire wear/tire combustion); and Fe, Mn, Cr (railroad steel abrasion). The 76 ± 33 ug/m3 average PM10 concentration over the whole region was apportioned into 'Secondary Aerosol - mostly inorganics' (30-40%), 'Transport - including re-suspension' (30-40%), and 'Residential Heating - mostly wood burning' (10-18%; 28% in Sondrio), showing that reduction of industrial emissions of inorganic gaseous PM precursors should not be left out of the region's PM abatement strategy. Minor specific sources were also revealed. A detailed presentation will be given of the obtained data and results for the nine sites in the Po Valley in comparison with the site in the Valtelline Valley (Sondrio).
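
    The apportionment of a measured sample into source contributions can be sketched as a least-squares chemical mass balance. The source profiles, species list, and contribution values below are invented illustrations, not the Lombardy measurements, and this simple fit stands in for whatever receptor model the study actually used.

```python
import numpy as np

# Columns = hypothetical source profiles (mass fraction of each marker species
# per unit source mass); rows = species. Numbers are illustrative only.
F = np.array([[0.40, 0.05, 0.10],   # OC    (wood burning, secondary, traffic)
              [0.10, 0.01, 0.05],   # EC
              [0.01, 0.60, 0.02],   # nitrate
              [0.05, 0.20, 0.70]])  # crustal/brake-wear metals
true_g = np.array([20.0, 30.0, 15.0])   # assumed source contributions, ug/m3
c = F @ true_g                          # synthetic ambient concentrations

# Chemical-mass-balance step: least-squares fit of contributions to the sample
g, *_ = np.linalg.lstsq(F, c, rcond=None)
shares = 100.0 * g / g.sum()            # percent apportionment, as quoted in the abstract
```

    With real data the system is overdetermined and noisy, so the recovered `g` is a fit rather than an exact solution, and non-negativity constraints are usually added.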

  13. Validation of botanical origins and geographical sources of some Saudi honeys using ultraviolet spectroscopy and chemometric analysis.

    Science.gov (United States)

    Ansari, Mohammad Javed; Al-Ghamdi, Ahmad; Khan, Khalid Ali; Adgaba, Nuru; El-Ahmady, Sherweit H; Gad, Haidy A; Roshan, Abdulrahman; Meo, Sultan Ayoub; Kolyali, Sevgi

    2018-02-01

    This study aims at distinguishing honey based on botanical and geographical sources. Different floral honey samples were collected from diverse geographical locations of Saudi Arabia. UV spectroscopy in combination with chemometric analysis, including Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), and Soft Independent Modeling of Class Analogy (SIMCA), was used to classify the honey samples. HCA and PCA presented the initial clustering pattern to differentiate between botanical as well as geographical sources. The SIMCA model clearly separated the Ziziphus sp. and other monofloral honey samples based on different locations and botanical sources. The results successfully discriminated the honey samples of different botanical and geographical sources, validating the segregation observed using the few physicochemical parameters that are regularly used for discrimination.
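
    The PCA step of such a chemometric workflow can be sketched with an SVD in NumPy. The simulated spectra below (two Gaussian absorbance profiles plus noise, standing in for two botanical groups) are assumptions for illustration, not the Saudi honey data.

```python
import numpy as np

# Simulated UV absorbance spectra (rows = honey samples, columns = wavelengths)
rng = np.random.default_rng(0)
wl = np.linspace(200, 400, 101)
group_a = [np.exp(-((wl - 270) / 25.0) ** 2) + 0.02 * rng.standard_normal(wl.size)
           for _ in range(10)]
group_b = [np.exp(-((wl - 300) / 25.0) ** 2) + 0.02 * rng.standard_normal(wl.size)
           for _ in range(10)]
X = np.vstack(group_a + group_b)

# PCA by SVD of the mean-centered spectral matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                    # sample coordinates in principal-component space
explained = s**2 / np.sum(s**2)   # fraction of variance per component
pc1 = scores[:, 0]                # the two simulated groups separate along PC1
```

    A scatter of `pc1` against the second column of `scores` is the usual scores plot in which botanical or geographical clusters become visible.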

  14. Collection, Analysis, and Dissemination of Open Source News and Analysis for Safeguards Implementation and Evaluation

    International Nuclear Information System (INIS)

    Khaled, J.; Reed, J.; Ferguson, M.; Hepworth, C.; Serrat, J.; Priori, M.; Hammond, W.

    2015-01-01

    Analysis of all safeguards-relevant information is an essential component of IAEA safeguards and the ongoing State evaluation underlying IAEA verification activities. In addition to State-declared safeguards information and information generated from safeguards activities both in the field and at headquarters, the IAEA collects and analyzes information from a wide array of open sources relevant to States' nuclear-related activities. A number of these open sources include information that could be loosely categorized as "news": international, regional, and local media; company and government press releases; public records of parliamentary proceedings; and NGO/academic commentaries and analyses. It is the task of the State Factors Analysis Section of the Department of Safeguards to collect, analyze and disseminate news of relevance to support ongoing State evaluation. This information supports State evaluation by providing the Department with a global overview of safeguards-relevant nuclear developments. Additionally, this type of information can support in-depth analyses of nuclear fuel cycle related activities, alerting State Evaluation Groups to potential inconsistencies in State declarations, and preparing inspectors for activities in the field. The State Factors Analysis Section uses a variety of tools, including subscription services, news aggregators, a roster of specialized sources, and a custom software application developed by an external partner to manage incoming data streams and ensure that critical information is not overlooked. When analyzing data, it is necessary to determine the credibility of a given source and piece of information. Data must be considered for accuracy, bias, and relevance to the overall assessment. Analysts use a variety of methodological techniques to make these types of judgments, which are included when the information is presented to State Evaluation Groups. Dissemination of news to

  15. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    Science.gov (United States)

    Yesylevskyy, Semen O

    2012-07-15

    Pteros, an open-source library for molecular modeling and analysis of molecular dynamics trajectories for the C++ programming language, is introduced. Pteros provides a number of routine analysis operations, ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies the development of computationally intensive trajectory analysis algorithms. The Pteros programming interface is very simple and intuitive, while the source code is well documented and easily extendible. Pteros is available for free under the open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.

  16. Dosimetric characterization of two radium sources for retrospective dosimetry studies

    Energy Technology Data Exchange (ETDEWEB)

    Candela-Juan, C., E-mail: ccanjuan@gmail.com [Radiation Oncology Department, La Fe University and Polytechnic Hospital, Valencia 46026, Spain and Department of Atomic, Molecular and Nuclear Physics, University of Valencia, Burjassot 46100 (Spain); Karlsson, M. [Division of Radiological Sciences, Department of Medical and Health Sciences, Linköping University, Linköping SE 581 85 (Sweden); Lundell, M. [Department of Medical Physics and Oncology, Karolinska University Hospital and Karolinska Institute, Stockholm SE 171 76 (Sweden); Ballester, F. [Department of Atomic, Molecular and Nuclear Physics, University of Valencia, Burjassot 46100 (Spain); Tedgren, Å. Carlsson [Division of Radiological Sciences, Department of Medical and Health Sciences, Linköping University, Linköping SE 581 85, Sweden and Swedish Radiation Safety Authority, Stockholm SE 171 16 (Sweden)

    2015-05-15

    Purpose: During the first part of the 20th century, 226Ra was the most used radionuclide for brachytherapy. Retrospective accurate dosimetry, coupled with patient follow-up, is important for advancing knowledge on long-term radiation effects. The purpose of this work was to dosimetrically characterize two 226Ra sources, commonly used in Sweden during the first half of the 20th century, for retrospective dose-effect studies. Methods: An 8 mg 226Ra tube and a 10 mg 226Ra needle, used at Radiumhemmet (Karolinska University Hospital, Stockholm, Sweden) from 1925 to the 1960s, were modeled in two independent Monte Carlo (MC) radiation transport codes: GEANT4 and MCNP5. Absorbed dose and collision kerma around the two sources were obtained, from which the TG-43 parameters were derived for the secular equilibrium state. Furthermore, results from this dosimetric formalism were compared with results from a MC simulation of a superficial mould constituted by five needles inside a glass casing, placed over a water phantom, trying to mimic a typical clinical setup. Calculated absorbed doses using the TG-43 formalism were also compared with previously reported measurements and calculations based on the Sievert integral. Finally, the dose rate at large distances from a 226Ra point-like source placed in the center of a 1 m radius water sphere was calculated with GEANT4. Results: TG-43 parameters [including g_L(r), F(r, θ), Λ, and s_K] have been uploaded in spreadsheets as additional material, and the fitting parameters of a mathematical curve that provides the dose rate between 10 and 60 cm from the source have been provided. Results from the TG-43 formalism are consistent within the treatment volume with those of a MC simulation of a typical clinical scenario. Comparisons with reported measurements made with thermoluminescent dosimeters show differences up to 13% along the transverse axis of the radium needle. It has been estimated that
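
    The TG-43 1D formalism the abstract refers to can be sketched in a few lines of NumPy. The dose-rate constant, source length, anisotropy factor, and the tabulated radial dose function below are assumed placeholder numbers, not the published 226Ra parameters from the paper's spreadsheets.

```python
import numpy as np

def G_L(r, L):
    """TG-43 line-source geometry function on the transverse axis (theta = 90 deg)."""
    return 2.0 * np.arctan(L / (2.0 * r)) / (L * r)

# Hypothetical radial dose function table g_L(r) -- illustrative values only
r_tab = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 10.0])     # cm
g_tab = np.array([1.02, 1.00, 0.96, 0.92, 0.83, 0.62])

def dose_rate(r, S_K=1.0, Lam=1.1, L=1.0, phi_an=0.95, r0=1.0):
    """TG-43 1D formalism: D(r) = S_K * Lambda * [G_L(r)/G_L(r0)] * g_L(r) * phi_an.
    Lambda, L, and the constant anisotropy factor are assumed numbers."""
    gL = np.interp(r, r_tab, g_tab)
    return S_K * Lam * (G_L(r, L) / G_L(r0, L)) * gL * phi_an

d1, d5 = dose_rate(1.0), dose_rate(5.0)
```

    At the reference distance r0 = 1 cm the geometry ratio is unity, so the dose rate reduces to S_K · Λ · g_L(1) · φ_an, and it falls off steeply with distance as the geometry function decays.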

  17. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-07-01

    The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlations and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71°17'N, 156°47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO2), methane (CH4), carbon monoxide (CO), ozone (O3), aerosol scattering coefficient (σ_sp), aerosol number concentration (NC_asl), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between the various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of the species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on Barrow's records? To answer these questions we apply the following main research tools. First, an isentropic trajectory model is used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure is used to divide the trajectories into groups based on source regions. Third, various statistical analysis tools such as exploratory data analysis, two-component correlation analysis, trend analysis, principal components and factor analysis are used to identify the relationship between the various chemical species and source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on
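
    The trajectory-clustering step can be sketched with a minimal k-means on flattened trajectory coordinates. The synthetic "back trajectories" and the deterministic farthest-point initialization below are assumptions for illustration; they are not the ECMWF-derived trajectories or the specific clustering procedure used in the study.

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Minimal k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers, dtype=float)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Synthetic back trajectories arriving at a receptor: each row is a flattened
# sequence of 10 hourly longitude offsets; two assumed source regions
rng = np.random.default_rng(2)
path = np.linspace(0.0, 5.0, 10)
north = path + 0.3 * rng.standard_normal((30, 10))
south = -path + 0.3 * rng.standard_normal((30, 10))
labels, centers = kmeans(np.vstack([north, south]), k=2)
```

    Once trajectories are grouped by cluster label, the chemical record can be composited by group to relate concentrations at the receptor to transport from each source region.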

  18. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    International Nuclear Information System (INIS)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-01-01

    The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlations and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71°17'N, 156°47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO2), methane (CH4), carbon monoxide (CO), ozone (O3), aerosol scattering coefficient (σ_sp), aerosol number concentration (NC_asl), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between the various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of the species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on Barrow's records? To answer these questions we apply the following main research tools. First, an isentropic trajectory model is used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure is used to divide the trajectories into groups based on source regions. Third, various statistical analysis tools such as exploratory data analysis, two-component correlation analysis, trend analysis, principal components and factor analysis are used to identify the relationship between the various chemical species and source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on atmospheric composition in the Arctic. We

  19. Identification of Watershed-scale Critical Source Areas Using Bayesian Maximum Entropy Spatiotemporal Analysis

    Science.gov (United States)

    Roostaee, M.; Deng, Z.

    2017-12-01

    State environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairments. The spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, due to limitations in monetary resources and the large number of waterbodies, available monitoring stations are typically sparse, with intermittent periods of data collection. Hence, scarcity of water quality data is a major obstacle in addressing sources of pollution through management strategies. In this study the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as Dissolved Oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been listed in the 303(d) list for DO impairment since 2014 in Louisiana Water Quality Inventory Reports due to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in available measured soft and hard data. This model is used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers of the watershed were then used to determine those practices/changes that led to low DO concentration in the identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main sources contributing to low dissolved oxygen in the Turkey Creek Watershed.
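
    The idea of estimating DO at unmonitored locations and times from sparse space-time observations can be sketched with a simple space-time inverse-distance weighting. This is explicitly a simplified stand-in, not BME: BME additionally folds in prior knowledge and the uncertainty of soft data. All station coordinates, times, and DO values below are hypothetical.

```python
import numpy as np

def st_idw(obs_xy, obs_t, obs_val, q_xy, q_t, p=2.0, tau=0.5):
    """Space-time inverse-distance estimate at one query point.
    tau (an assumed km-per-day factor) trades temporal against spatial separation."""
    d = np.sqrt(((obs_xy - q_xy) ** 2).sum(axis=1) + (tau * (obs_t - q_t)) ** 2)
    if np.any(d == 0.0):
        return float(obs_val[d.argmin()])       # exact hit on an observation
    w = 1.0 / d ** p
    return float((w * obs_val).sum() / w.sum())

# Hypothetical DO observations (mg/L) at sparse stations and sampling days
obs_xy = np.array([[0.0, 0.0], [4.0, 1.0], [2.0, 3.0], [5.0, 5.0]])
obs_t = np.array([0.0, 2.0, 5.0, 9.0])
obs_do = np.array([6.8, 4.1, 3.5, 7.2])
do_est = st_idw(obs_xy, obs_t, obs_do, q_xy=np.array([2.5, 2.0]), q_t=4.0)
```

    Because the estimate is a convex combination of the observations, it always lies within the observed DO range; grid cells whose estimates fall below the impairment threshold would be flagged as candidate CSAs.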

  20. Preliminary radiation transport analysis for the proposed National Spallation Neutron Source (NSNS)

    International Nuclear Information System (INIS)

    Johnson, J.O.; Lillie, R.A.

    1997-01-01

    The use of neutrons in science and industry has increased continuously during the past 50 years, with applications now widely used in physics, chemistry, biology, engineering, and medicine. Throughout this history, the relative merits of pulsed accelerator spallation sources versus reactors as the preferred neutron source for the future have been debated. To address this future need, the Department of Energy (DOE) has initiated a pre-conceptual design study for the National Spallation Neutron Source (NSNS) and given preliminary approval for the proposed facility to be built at Oak Ridge National Laboratory (ORNL). The DOE directive is to design and build a short-pulse spallation source in the 1 MW power range with sufficient design flexibility that it can be upgraded and operated at a significantly higher power at a later stage. The pre-conceptual design of the NSNS initially consists of an accelerator system capable of delivering a 1 to 2 GeV proton beam with 1 MW of beam power in an approximately 0.5 microsecond pulse at a 60 Hz frequency onto a single target station. The NSNS will be upgraded in stages to a 5 MW facility with two target stations (a high-power station operating at 60 Hz and a low-power station operating at 10 Hz). Each target station will contain four moderators (combinations of cryogenic and ambient temperature) and 18 beam lines for a total of 36 experiment stations. This paper summarizes the radiation transport analysis strategies for the proposed NSNS facility.

  1. Microjet burners for molecular-beam sources and combustion studies

    Science.gov (United States)

    Groeger, Wolfgang; Fenn, John B.

    1988-09-01

    A novel microjet burner is described in which combustion is stabilized by a hot wall. The scale is so small that the entire burner flow can be passed through a nozzle only 0.2 mm or less in diameter into an evacuated chamber to form a supersonic free jet, with expansion so rapid that all collisional processes in the jet gas are frozen in a microsecond or less. This burner can be used to provide high-temperature source gas for free jet expansion to produce intense beams of internally hot molecules. A more immediate use would seem to be in the analysis of combustion products, and perhaps intermediates, by various kinds of spectroscopies without some of the perturbation effects encountered in probe sampling of flames and other types of combustion devices. As an example of the latter application of this new tool, we present infrared emission spectra for jet gas obtained from the combustion of oxygen-hydrocarbon mixtures in both fuel-rich and fuel-lean operation. In addition, we show results obtained by mass spectrometric analysis of the combustion products.

  2. Determination of volatile organic compound pollution sources in Malaysian drinking water using multivariate analysis.

    Science.gov (United States)

    Soh, Shiau-Chian; Abdullah, Md Pauzi

    2007-01-01

    A field investigation was conducted at all water treatment plants throughout 11 states and the Federal Territory in Peninsular Malaysia. The sampling points in this study included treatment plant operations, service reservoir outlets and auxiliary outlet points along the water pipelines. Analysis was performed by the solid phase micro-extraction technique with a 100 microm polydimethylsiloxane fibre, using gas chromatography with mass spectrometric detection to analyse 54 volatile organic compounds (VOCs) of different chemical families in drinking water. The concentrations of VOCs ranged from undetectable to 230.2 microg/l. Among all of the VOC species, chloroform had the highest concentration and was detected in all drinking water samples. Average concentrations of total trihalomethanes (THMs) were almost identical among all states, in the range of 28.4-33.0 microg/l. Apart from THMs, other abundant compounds detected were cis- and trans-1,2-dichloroethylene, trichloroethylene, 1,2-dibromoethane, benzene, toluene, ethylbenzene, chlorobenzene, 1,4-dichlorobenzene and 1,2-dichlorobenzene. Principal component analysis (PCA) with the aid of varimax rotation, and the parallel factor analysis (PARAFAC) method, were used to statistically verify the correlation between VOCs and the sources of pollution. The multivariate analysis pointed out that the maintenance of auxiliary pipelines in the distribution systems is vital, as they can become a significant point source of pollution in Malaysian drinking water.
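
    The PCA step named here can be sketched in a few lines. The snippet below is a generic, numpy-only PCA of a samples-by-compounds concentration matrix via SVD of the standardized data; varimax rotation and PARAFAC are omitted, and the function is an illustrative assumption rather than the authors' code.

```python
import numpy as np

# Generic PCA sketch (illustrative, not the study's code): SVD of the
# standardized samples-by-compounds matrix gives component scores, loadings
# and the fraction of variance each component explains.

def pca(X, n_components):
    """X: (n_samples, n_variables). Returns (scores, loadings, evr)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    var = s**2 / (len(X) - 1)
    evr = var / var.sum()                      # explained-variance ratio
    scores = U[:, :n_components] * s[:n_components]
    return scores, Vt[:n_components], evr[:n_components]

# Toy data: three perfectly correlated "compounds", so a single component
# carries all the variance.
t = np.arange(1.0, 11.0)
X = np.column_stack([t, 2 * t, -t])
scores, loadings, evr = pca(X, 1)
```

    In a source-apportionment setting, compounds loading strongly on the same component are interpreted as sharing a pollution source.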

  3. Feasibility study of a 1-MW pulsed spallation source

    International Nuclear Information System (INIS)

    Cho, Y.; Chae, Y.C.; Crosbie, E.

    1995-01-01

    A feasibility study of a 1-MW pulsed spallation source based on a rapidly cycling proton synchrotron (RCS) has been completed. The facility consists of a 400-MeV H- linac, a 30-Hz RCS that accelerates the 400-MeV beam to 2 GeV, and two neutron-generating target stations. The design time-averaged current of the accelerator system is 0.5 mA, or 1.04 x 10^14 protons per pulse. The linac system consists of an H- ion source, a 2-MeV RFQ, a 70-MeV DTL and a 330-MeV CCL. Transverse phase space painting to achieve a Kapchinskij-Vladimirskij (K-V) distribution of the injected particles in the RCS is accomplished by charge exchange injection and programming of the closed orbit during injection. The synchrotron lattice uses FODO cells of ∼90 degrees phase advance. Dispersion-free straight sections are obtained by using a missing-magnet scheme. The synchrotron magnets are powered by a dual-frequency resonant circuit that excites the magnets at a 20-Hz rate and de-excites them at a 60-Hz rate, resulting in an effective rate of 30 Hz and reducing the required peak rf voltage by 1/3. A key feature of the design of this accelerator system is that beam losses are minimized from injection to extraction, reducing activation to levels consistent with hands-on maintenance. Details of the study are presented
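
    The quoted effective repetition rate can be checked with a line of arithmetic: each magnet cycle consists of half a 20 Hz period (ramp up) plus half a 60 Hz period (ramp down). A minimal sketch of this check, with hypothetical variable names:

```python
# Dual-frequency resonant excitation: the ramp up lasts half of a 20 Hz
# period and the ramp down half of a 60 Hz period, so one full cycle takes
# 1/(2*20) + 1/(2*60) seconds.
f_up, f_down = 20.0, 60.0
T_cycle = 1.0 / (2.0 * f_up) + 1.0 / (2.0 * f_down)
f_eff = 1.0 / T_cycle  # -> 30.0 Hz
```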

  4. Preliminary studies of Brazilian wood using different radioisotopic sources

    International Nuclear Information System (INIS)

    Carvalho, Gilberto; Silva, Leonardo Gondim de Andrade e

    2013-01-01

    Due to its availability and particular features, wood was one of the first materials used by mankind, with a wide variety of applications. It can be used as a raw material for paper and cellulose manufacturing; in industries such as the chemical, naval, furniture, sports goods, toy, and musical instrument industries; in building construction; and in the distribution of electric energy. Wood has been widely researched; wood researchers therefore know that several factors, such as temperature, latitude, longitude, altitude, sunlight, soil, and rainfall index, affect the growth of trees. This behavior explains why average physical-chemical properties are important when wood is studied. Most researchers consider density to be the most important wood property because of its direct relationship with the physical and mechanical properties of wood. There are three types of wood density: basic, apparent and green. Apparent density at 12% moisture content was used here. In this study, four different types of wood were used: 'freijo', 'jequetiba', 'muiracatiara' and 'ipe'. For wood density determination by the non-conventional method, Am-241, Ba-133 and Cs-137 radioisotopic sources, a NaI scintillation detector and a counter were used. The results demonstrated this technique to be quick and accurate. Based on the nuclear parameters obtained, such as half-value layers and linear absorption coefficients, the Cs-137 radioisotopic source proved to be the best option for inspection of the physical integrity of wooden electric poles and live trees in future work. (author)
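
    The half-value layers and linear absorption coefficients mentioned above follow from the narrow-beam attenuation law I = I0 exp(-μx). The sketch below computes both from a single transmission measurement; the count rates and thickness are made-up illustrative values, not data from this study.

```python
import math

# Narrow-beam attenuation sketch: I = I0 * exp(-mu * x). From one
# transmission measurement, the linear attenuation coefficient mu and the
# half-value layer (HVL) follow directly.

def linear_attenuation(I0, I, thickness_cm):
    """mu (1/cm) from incident and transmitted intensities."""
    return math.log(I0 / I) / thickness_cm

def half_value_layer(mu):
    """Thickness (cm) that halves the beam intensity: ln(2) / mu."""
    return math.log(2.0) / mu

# Hypothetical count rates through a 5 cm wood sample.
mu = linear_attenuation(I0=12000.0, I=9000.0, thickness_cm=5.0)
hvl = half_value_layer(mu)
```

    A lower-than-expected transmitted rate (larger apparent mu) through a pole of known species and thickness would indicate denser material or, conversely, internal decay if the rate is higher.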

  5. Search for neutrino point sources with an all-sky autocorrelation analysis in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Turcati, Andrea; Bernhard, Anna; Coenders, Stefan [TU, Munich (Germany); Collaboration: IceCube-Collaboration

    2016-07-01

    The IceCube Neutrino Observatory is a cubic-kilometre-scale neutrino telescope located in the Antarctic ice. Its full-sky field of view gives unique opportunities to study the neutrino emission from the Galactic and extragalactic sky. Recently, IceCube found the first signal of astrophysical neutrinos with energies up to the PeV scale, but the origin of these particles still remains unresolved. Given the observed flux, the absence of observations of bright point sources can be explained by the presence of numerous weak sources. This scenario can be tested using autocorrelation methods. We present here the sensitivities and discovery potentials of a two-point angular correlation analysis performed on seven years of IceCube data, taken between 2008 and 2015. The test is applied to the northern and southern skies separately, using the neutrino energy information to improve the effectiveness of the method.
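
    A two-point angular correlation analysis reduces, at its core, to histogramming the angular separations of all event pairs and comparing against an isotropic expectation. The snippet below is a minimal, hypothetical sketch of that core step; the likelihood machinery and energy weighting of the actual analysis are omitted, and the function names are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the core of a two-point autocorrelation test: compute
# the great-circle separation of every event pair and histogram the result.
# A clustering signal appears as an excess of small-angle pairs over the
# isotropic expectation.

def pair_separations(ra, dec):
    """All pairwise angular separations (radians); ra, dec in radians."""
    v = np.stack([np.cos(dec) * np.cos(ra),
                  np.cos(dec) * np.sin(ra),
                  np.sin(dec)], axis=1)
    cosd = np.clip(v @ v.T, -1.0, 1.0)
    iu = np.triu_indices(len(ra), k=1)   # each unordered pair once
    return np.arccos(cosd[iu])

def autocorrelation_counts(ra, dec, bins):
    """Histogram of pair separations over the given angle bins."""
    return np.histogram(pair_separations(ra, dec), bins=bins)[0]

# Three toy events: two coincident, one diametrically opposite.
ra = np.array([0.0, 0.0, np.pi])
dec = np.zeros(3)
counts = autocorrelation_counts(ra, dec, bins=[0.0, 0.1, 3.2])  # -> [1, 2]
```

    The significance of any small-angle excess would then be assessed against scrambled (isotropized) data sets.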

  6. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

    The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of the maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for their calculation. A literature survey was performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties the authors recommend a change in the initial boundary condition for the solution of the diffusion equation for highly soluble nuclides. 13 refs.

  7. Economic analysis for the electricity production in isolated areas in Cuba using different renewable sources

    International Nuclear Information System (INIS)

    Morales Salas, Joel; Moreno Figueredo, Conrado; Briesemeister, Ludwig; Arzola, Jose

    2015-01-01

    Despite the effort and commitment of the Cuban government over more than 50 years, there are still houses without electricity in areas remote from the electricity network. These houses and communities have the promise and commitment of the local and national authorities to help improve their quality of life. Because these houses and communities are remote from the electricity network, the cost of extending the network is considerably high. For that reason, the use of renewable sources in these areas is an acceptable proposal. This article presents an analysis to obtain different configurations depending on the number of houses. It proposes the use of the hydrothermal carbonization process in cases where introducing other renewable sources is not feasible; this technology is new in Cuba and advantageous considering the kinds of biomass that exist there. The chemistry of the hydrothermal carbonization process with Cuban biomass should be further researched. (full text)

  8. Development of a hydrogen analysis using a small neutron source

    International Nuclear Information System (INIS)

    Ishikawa, I.; Tachikawa, N.; Tominaga, H.

    1998-01-01

    Most industrial nuclear gauges are based on the use of radiation transmission through matter. This document presents new techniques to measure hydrogen using a small neutron source. A new technique has been developed for measuring the thickness of a thin layer of 30-200 μm thick plastic sandwiched between two sheets of 0.6-4.2 mm total thickness. Another technique allows monitoring of the residual moisture in wet refractory newly coated on the inner surface of a steel vessel from the outside, through a thick steel plate. To save on the use of coke and to strictly control furnace heating in the iron-making process, a new type of moisture gauge was developed using simultaneous measurement of the transmission rates of both fast neutrons and gamma rays from 252Cf

  9. Dissolution And Analysis Of Yellowcake Components For Fingerprinting UOC Sources

    International Nuclear Information System (INIS)

    Hexel, Cole R.; Bostick, Debra A.; Kennedy, Angel K.; Begovich, John M.; Carter, Joel A.

    2012-01-01

    There are a number of chemical and physical parameters that might be used to help elucidate the ore body from which uranium ore concentrate (UOC) was derived. It is the variation in the concentration and isotopic composition of these components that can provide information as to the identity of the ore body from which the UOC was mined and the type of subsequent processing that has been undertaken. Oak Ridge National Laboratory (ORNL), in collaboration with Lawrence Livermore and Los Alamos National Laboratories, is surveying ore characteristics of yellowcake samples of known geologic origin. The data sets are being incorporated into a national database to help in sourcing interdicted material, as well as to aid in safeguards and nonproliferation activities. Geologic age and attributes from chemical processing are site-specific. Isotopic abundances of lead, neodymium, and strontium provide insight into the geologic provenance of ore material. Variations in lead isotopes are due to the radioactive decay of uranium in the ore. Likewise, neodymium isotopic abundances are skewed by the radiogenic decay of samarium. Rubidium decay similarly alters the strontium isotopic composition in ores. This paper will discuss the chemical processing of yellowcake performed at ORNL. Variations in lead, neodymium, and strontium isotopic abundances are being analyzed in UOC from two geologic sources. Chemical separation and instrumental protocols will be summarized. The data will be correlated with chemical signatures (such as elemental composition and uranium, carbon, and nitrogen isotopic content) to demonstrate the utility of principal component and cluster analyses to aid in the determination of UOC provenance.

  10. Assessing heavy metal sources in sugarcane Brazilian soils: an approach using multivariate analysis.

    Science.gov (United States)

    da Silva, Fernando Bruno Vieira; do Nascimento, Clístenes Williams Araújo; Araújo, Paula Renata Muniz; da Silva, Luiz Henrique Vieira; da Silva, Roberto Felipe

    2016-08-01

    Brazil is the world's largest sugarcane producer, and soils in the northeastern part of the country have been cultivated with the crop for over 450 years. However, so far, there has been no study on the status of heavy metal accumulation in these long-cultivated soils. To fill the gap, we collected soil samples from 60 sugarcane fields in order to determine the contents of Cd, Cr, Cu, Ni, Pb, and Zn. We used multivariate analysis to distinguish between natural and anthropogenic sources of these metals in soils. Analytical determinations were performed by ICP-OES after microwave acid digestion. Mean concentrations of Cd, Cr, Cu, Ni, Pb, and Zn were 1.9, 18.8, 6.4, 4.9, 11.2, and 16.2 mg kg(-1), respectively. The first principal component was associated with lithogenic origin and comprised the metals Cr, Cu, Ni, and Zn. Cluster analysis confirmed that 68 % of the evaluated sites have soil heavy metal concentrations close to the natural background. The Cd concentration (the second principal component) was clearly associated with anthropogenic sources, with P fertilization being the most likely source of Cd to soils. On the other hand, the third component (Pb concentration) indicates a mixed origin for this metal (natural and anthropogenic); hence, Pb concentrations are probably related not only to the soil parent material but also to industrial emissions and urbanization in the vicinity of the agricultural areas.

  11. Phenotypic and genotypic analysis of bio-serotypes of Yersinia enterocolitica from various sources in Brazil.

    Science.gov (United States)

    Rusak, Leonardo Alves; dos Reis, Cristhiane Moura Falavina; Barbosa, André Victor; Santos, André Felipe Mercês; Paixão, Renata; Hofer, Ernesto; Vallim, Deyse Christina; Asensi, Marise Dutra

    2014-12-15

    Yersinia enterocolitica is a well-known foodborne pathogen widely distributed in nature, with high public health relevance, especially in Europe. This study aimed to analyze the pathogenic potential of Y. enterocolitica strains isolated from human, animal, food, and environmental sources and from different regions of Brazil by detecting the virulence genes inv, ail, ystA, and virF through polymerase chain reaction (PCR), phenotypic tests, and antimicrobial susceptibility analysis. Pulsed-field gel electrophoresis (PFGE) was used for the assessment of phylogenetic diversity. All virulence genes were detected in 11/60 (18%) strains of serotype O:3, biotype 4, isolated from human and animal sources. Ten human strains (4/O:3) presented three chromosomal virulence genes, and nine strains of biotype 1A presented the inv gene. Six (10%) strains were resistant to sulfamethoxazole-trimethoprim, seven (12%) to tetracycline, and one (2%) to amikacin, all of which are used to treat yersiniosis. AMP-CEF-SXT was the predominant resistance profile. PFGE analysis revealed 36 unique pulsotypes, grouped into nine clusters (A to I) with similarity ≥ 85%, yielding a discriminatory diversity index of 0.957. Cluster A comprised all bio-serotype 4/O:3 strains isolated from animal and human sources. This study shows the existence of strains with the same genotypic profiles, bearing all virulence genes, from human and animal sources, circulating among several Brazilian states. This supports the hypothesis that swine likely serve as a main element in Y. enterocolitica transmission to humans in Brazil, and that it could become a potential threat to public health, as in Europe.
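
    A discriminatory index of the kind quoted for the PFGE typing is commonly computed with the Hunter-Gaston formula; the sketch below is a generic implementation under that assumption, and the example counts are illustrative, not the strain data of this study.

```python
# Hunter-Gaston discriminatory index (assumed formula): the probability that
# two strains sampled at random belong to different types. type_counts holds
# the number of strains in each pulsotype.

def discriminatory_index(type_counts):
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

print(discriminatory_index([1, 1, 1, 1, 1]))  # every strain its own type -> 1.0
print(discriminatory_index([5]))              # a single shared type      -> 0.0
```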

  12. Source rock contributions to the Lower Cretaceous heavy oil accumulations in Alberta: a basin modeling study

    Science.gov (United States)

    Berbesi, Luiyin Alejandro; di Primio, Rolando; Anka, Zahie; Horsfield, Brian; Higley, Debra K.

    2012-01-01

    The origin of the immense oil sand deposits in Lower Cretaceous reservoirs of the Western Canada sedimentary basin is still a matter of debate, specifically with respect to the original in-place volumes and contributing source rocks. In this study, the contributions from the main source rocks were addressed using a three-dimensional petroleum system model calibrated to well data. A sensitivity analysis of source rock definition was performed in the case of the two main contributors, which are the Lower Jurassic Gordondale Member of the Fernie Group and the Upper Devonian–Lower Mississippian Exshaw Formation. This sensitivity analysis included variations of assigned total organic carbon and hydrogen index for both source intervals, and in the case of the Exshaw Formation, variations of thickness in areas beneath the Rocky Mountains were also considered. All of the modeled source rocks reached the early or main oil generation stages by 60 Ma, before the onset of the Laramide orogeny. Reconstructed oil accumulations were initially modest because of limited trapping efficiency. This was improved by defining lateral stratigraphic seals within the carrier system. An additional sealing effect by biodegraded oil may have hindered the migration of petroleum in the northern areas, but not to the east of Athabasca. In the latter case, the main trapping controls are dominantly stratigraphic and structural. Our model, based on available data, identifies the Gordondale source rock as the contributor of more than 54% of the oil in the Athabasca and Peace River accumulations, followed by minor amounts from Exshaw (15%) and other Devonian to Lower Jurassic source rocks. The proposed strong contribution of petroleum from the Exshaw Formation source rock to the Athabasca oil sands is only reproduced by assuming 25 m (82 ft) of mature Exshaw in the kitchen areas, with original total organic carbon of 9% or more.

  13. FEASIBILITY STUDY II OF A MUON BASED NEUTRINO SOURCE.

    Energy Technology Data Exchange (ETDEWEB)

    GALLARDO,J.C.; OZAKI,S.; PALMER,R.B.; ZISMAN,M.

    2001-06-30

    The concept of using a muon storage ring to provide a well characterized beam of muon and electron neutrinos (a Neutrino Factory) has been under study for a number of years now at various laboratories throughout the world. The physics program of a Neutrino Factory is focused on the relatively unexplored neutrino sector. In conjunction with a detector located a suitable distance from the neutrino source, the facility would make valuable contributions to the study of neutrino masses and lepton mixing. A Neutrino Factory is expected to improve the measurement accuracy of sin^2(2θ_23) and Δm^2_32 and provide measurements of sin^2(2θ_13) and the sign of Δm^2_32. It may also be able to measure CP violation in the lepton sector.

  14. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means of accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  15. Sources of International Courts' Legitimacy: A comparative study

    DEFF Research Database (Denmark)

    Godzimirska, Zuzanna; Creamer, Cosette

    Despite ample scholarship on the legitimacy of international legal institutions, existing studies on international courts (ICs) tend to adopt normative or deductive approaches to specify their legitimacy and assess its effects. Very few adopt empirical or inductive approaches and examine the reasons why an IC is considered more or less legitimate in the eyes of a court's constituents. This paper addresses this scholarly gap by identifying the sources of ICs' legitimacy within the expressed views of one category of constituents: a court's member states. Although we emphasize the importance

  16. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples, as well as a software analysis method. AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  17. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems involving an intensive system control variable. (author)

  18. Jet flow analysis of liquid poison injection in a CANDU reactor using source term

    International Nuclear Information System (INIS)

    Chae, Kyung Myung; Choi, Hang Bok; Rhee, Bo Wook

    2001-01-01

    For the performance analysis of the Canadian deuterium uranium (CANDU) reactor shutdown system number 2 (SDS2), a computational fluid dynamics model of the poison jet flow has been developed to estimate the flow field and poison concentration formed inside the CANDU reactor calandria. As the ratio of the calandria shell radius to the injection nozzle hole diameter is so large (1055), it is impractical to develop a full-size model encompassing the whole calandria shell. In order to reduce the model to a manageable size, a quarter of a one-pitch-length segment of the shell was modeled using the symmetric nature of the jet, and the injected jet was treated as a source term to avoid the modeling difficulty caused by the large difference in hole sizes. For the analysis of an actual CANDU-6 SDS2 poison injection, the grid structure was determined based on the results of two-dimensional real- and source-jet simulations. The maximum injection velocity of the liquid poison is 27.8 m/s and the mass fraction of the poison is 8000 ppm (mg/kg). The simulation results have shown a well-established jet flow field. In general, the jet develops narrowly at first but stretches rapidly. Then, the flow recirculates a little in the r-x plane, while it recirculates largely in the r-θ plane. As time goes on, the adjacent jets contact each other and form a wavy front such that the whole jet develops in a plate form. This study has shown that the source term model can be effectively used for the analysis of the poison injection, and that the simulation result for the CANDU reactor is consistent with the model currently being used for the safety analysis. In the future, it is strongly recommended to analyze the transient of the poison injection (from the helium tank to the injection nozzle hole) by applying the Bernoulli equation with real boundary conditions

  19. Feasibility study for the spallation neutron source (SNQ). Pt. 1

    International Nuclear Information System (INIS)

    Bauer, G.S.; Sebening, H.; Vetter, J.E.; Willax, H.

    1981-06-01

    A concept for a new neutron source for fundamental research has been developed and is described in this report. The spallation neutron source SNQ is characterized in its first stage by a time-average thermal neutron flux of 7 x 10^14 cm^-2 s^-1 and a peak flux of 1.3 x 10^16 cm^-2 s^-1 at a 100 Hz repetition rate. The scientific case is presented with particular emphasis on solid state and nuclear physics. In these research domains, unique conditions are provided for experimental use. The proposed machine consists in its basic stage of a 1.1 GeV, 5 mA time-average, 100 mA peak current proton linear accelerator, a rotating lead target, and H2O and D2O moderators. Additional beam channels are provided for experiments with protons at 350 MeV and at the final energy. Construction of the SNQ is considered feasible within eight years at a cost of 680 million DM. As future options, the use of uranium as a target material, an increase of the accelerator beam power by a factor of 2, and the addition of a pulse compressor and a second target station for pulsed neutron and neutrino research are described. As a back-up solution to the rotating target, a liquid metal target was studied. (orig.)

  20. FEL polarization control studies on Dalian coherent light source

    International Nuclear Information System (INIS)

    Zhang Tong; Deng Haixiao; Wang Dong; Zhao Zhentang; Zhang Weiqing; Wu Guorong; Dai Dongxu; Yang Xueming

    2013-01-01

    The polarization switch of a free-electron laser (FEL) is of great importance to the user scientific community. In this paper, we investigate the generation of controllable polarization FEL from two well-known approaches for Dalian coherent light source, i.e., crossed planar undulator and elliptical permanent undulator. In order to perform a fair comparative study, a one-dimensional time-dependent FEL code has been developed, in which the imperfection effects of an elliptical permanent undulator are taken into account. Comprehensive simulation results indicate that the residual beam energy chirp and the intrinsic FEL gain may contribute to the degradation of the polarization performance for the crossed planar undulator. The elliptical permanent undulator is not very sensitive to the undulator errors and beam imperfections. Meanwhile, with proper configurations of the main planar undulators and additional elliptical permanent undulator section, circular polarized FEL with pulse energy exceeding 100 μJ could be achieved at Dalian coherent light source. (authors)

  1. A New ECR Ion Source for Nuclear Astrophysics Studies

    Science.gov (United States)

    Cesaratto, John M.

    2008-10-01

    The Laboratory for Experimental Nuclear Astrophysics (LENA) is a low-energy facility designed to study nuclear reactions of astrophysical interest at energies which are important for nucleosynthesis. In general, these reactions have extremely small cross sections, requiring intense beams and efficient detection systems. Recently, a new, high-intensity electron-cyclotron-resonance (ECR) ion source has been constructed (based on a design by Wills et al. [1]), which represents a substantial improvement in the capabilities of LENA. Beam is extracted from an ECR plasma excited at 2.45 GHz and confined by an array of permanent magnets. It has produced H^+ beams in excess of 1 mA on target over the energy range 100 - 200 keV, which greatly increases our ability to measure small cross sections. Initial measurements will focus on the ^23Na(p,γ)^24Mg reaction, which is of interest in a variety of astrophysical scenarios. The present uncertainty in the rate of this reaction is the result of an unobserved resonance expected at Elab = 144 keV, which should be detectable using beams from the new ECR source. In collaboration with Arthur E. Champagne and Thomas B. Clegg, University of North Carolina, Chapel Hill and TUNL. [1] J. S. C. Wills et al., Rev. Sci. Instrum. 69, 65 (1999).

  2. Integrated source-risk model for radon: A definition study

    International Nuclear Information System (INIS)

    Laheij, G.M.H.; Aldenkamp, F.J.; Stoop, P.

    1993-10-01

    The purpose of a source-risk model is to support policy making on radon mitigation by comparing the effects of various policy options and to enable optimization of countermeasures applied to different parts of the source-risk chain. There are several advantages to developing and using a source-risk model: risk calculations are standardized; the effects of measures applied to different parts of the source-risk chain can be better compared because interactions are included; and sensitivity analyses can be used to determine the most important parameters within the total source-risk chain. After an inventory of the processes and sources to be included in the source-risk chain, the models presently available in the Netherlands were investigated. The models were screened for completeness, validation and operational status. The investigation made clear that, by choosing for each part of the source-risk chain the most convenient model, a source-risk chain model for radon may be realized. However, the calculation of dose from radon concentrations and the status of the validation of most models should be improved. At present, calculations with the proposed source-risk model will give estimates with a large uncertainty. For further development of the source-risk model, an interaction between the source-risk model and experimental research is recommended. Organisational forms of the source-risk model are discussed. A source-risk model in which only simple models are included is also recommended. The other models are operated and administrated by the model owners. The model owners execute their models for a combination of input parameters. The output of the models is stored in a database which will be used for calculations with the source-risk model. 5 figs., 15 tabs., 7 appendices, 14 refs

  3. Imaging spectroscopic analysis at the Advanced Light Source

    International Nuclear Information System (INIS)

    MacDowell, A. A.; Warwick, T.; Anders, S.; Lamble, G.M.; Martin, M.C.; McKinney, W.R.; Padmore, H.A.

    1999-01-01

One of the major advances at the high-brightness third-generation synchrotrons is the dramatic improvement in imaging capability. There is a large multi-disciplinary effort underway at the ALS to develop imaging X-ray, UV and infrared spectroscopic analysis on spatial scales from a few microns down to 10 nm. These developments make use of light that varies in energy from 6 meV to 15 keV. Imaging and spectroscopy are finding applications in surface science, bulk materials analysis, semiconductor structures, particulate contaminants, magnetic thin films, biology and environmental science. This article is an overview and status report from the developers of some of these techniques at the ALS. The following table lists all the currently available microscopes at the ALS. This article will describe some of the microscopes and some of the early applications

  4. The training of Olympic wrestling coaches: study of the sources of knowledge and essential training contents

    Directory of Open Access Journals (Sweden)

    Paulo Martins

    2017-08-01

The aim of this study was to analyze the representations of wrestling coaches regarding the sources of knowledge and the training contents to be adopted during the training process of young wrestlers' coaches. The study was based on Grossman's (1990) model of professional knowledge for teaching and followed a qualitative, multiple case study methodology. Following a semi-structured script, six Olympic wrestling experts were interviewed in depth to identify the sources of knowledge that the coaches used for their training and the didactic-methodological contents they considered essential to their role as coach. The analysis revealed that the coaches' sources of professional knowledge were diverse, with academic training and professional experience being the main routes of access to professional knowledge. The coaches also pointed out that their first sources of knowledge were their own experiences as competitive athletes. Finally, this study concludes that expert coaches must acquire a profound knowledge of the competition environment, seeking to optimize their influence on athletes, an influence that should extend not only to the youngster's sport practice as an athlete but also to the athlete as a person.

  5. Sustainability in Open Source Software Commons: Lessons Learned from an Empirical Study of SourceForge Projects

    OpenAIRE

    Charles M. Schweik

    2013-01-01

    In this article, we summarize a five-year US National Science Foundation funded study designed to investigate the factors that lead some open source projects to ongoing collaborative success while many others become abandoned. Our primary interest was to conduct a study that was closely representative of the population of open source software projects in the world, rather than focus on the more-often studied, high-profile successful cases. After building a large database of projects (n=174,33...

  6. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    Pericou-Cayere, M.; Lemaire, P.; Pace, J.-M.; Baude, S.; Samson, N.

    2015-01-01

While searching for information in scientific databases, we are overwhelmed by the amount of information encountered. Within this large collection, extracting information with added value could be strategic for nuclear verification. In our study, we have worked on "best practices" in collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). Analysis of this structured data is performed with bibliometric tools. Several steps are carried out: collecting data related to the paradigm, creating a database to store the data generated by bibliographic research, and analyzing the data with selected tools. From the analysis of bibliographic data alone, we are able to obtain: a panoramic view of the countries that publish in the paradigm; co-publication networks; the organizations that contribute to scientific publications; the countries with which a given country collaborates; a country's areas of interest; and so on. We are thus able to identify a target. In a second phase, we can focus on a target (countries, for example). Working with non-structured data (i.e., press releases, social networks, full-text analysis of publications) is in progress and requires other tools to be added to the process, as we discuss in this paper. In information analysis, methodology and expert analysis are important; software is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness in the use of open source S&T information and in the management of that information over time. Examples are shown. (author)
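The bibliometric workflow described above (collect structured records, store them, then derive country-level publication counts and co-publication networks) can be sketched with standard-library Python. The record structure and country codes below are hypothetical, not the authors' actual database schema:

```python
from collections import Counter
from itertools import combinations

# Hypothetical structured records: each publication lists the
# author-affiliation countries extracted from a bibliographic database.
records = [
    {"title": "Paper A", "countries": ["FR", "DE", "US"]},
    {"title": "Paper B", "countries": ["FR", "DE"]},
    {"title": "Paper C", "countries": ["FR"]},
    {"title": "Paper D", "countries": ["DE", "US"]},
]

# Panoramic view: how many publications each country contributes to.
pub_counts = Counter(c for r in records for c in set(r["countries"]))

# Co-publication network: edge weight = number of joint papers per country pair.
edges = Counter()
for r in records:
    for pair in combinations(sorted(set(r["countries"])), 2):
        edges[pair] += 1

print(pub_counts.most_common())  # country publication counts
print(edges.most_common())       # strongest collaboration links
```

From here, focusing on a target is a matter of filtering `records` by the countries or organizations these counters surface.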

  7. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    International Nuclear Information System (INIS)

    Vartanyants, I.A.; Singer, A.

    2009-07-01

A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)
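The mode decomposition mentioned above has a standard analytic form for a Gaussian Schell-model source: the mode weights fall off geometrically, which is why only a few modes contribute when the coherence length is comparable to the source size. A minimal sketch follows; the formula is the standard Starikov-Wolf result, and the numeric parameters are illustrative, not the PETRA III or SASE1 values:

```python
import math

def gsm_mode_weights(sigma, xi, n_modes=10):
    """Normalized coherent-mode weights of a Gaussian Schell-model source.

    sigma : rms source size, xi : transverse coherence length (same units).
    The weights form a geometric series beta_n = (1 - q) * q**n with
    q = b / (a + b + c), a = 1/(4 sigma^2), b = 1/(2 xi^2), c = sqrt(a^2 + 2ab).
    """
    a = 1.0 / (4.0 * sigma**2)
    b = 1.0 / (2.0 * xi**2)
    c = math.sqrt(a**2 + 2.0 * a * b)
    q = b / (a + b + c)
    return [(1.0 - q) * q**n for n in range(n_modes)]

# Illustrative case: coherence length equal to the source size.
w = gsm_mode_weights(sigma=30e-6, xi=30e-6, n_modes=8)
print([round(x, 3) for x in w])  # weights decay geometrically; a few modes dominate
```

When xi >> sigma the first weight approaches 1 (a nearly fully coherent source), while xi << sigma spreads the field over many modes.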

  8. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Vartanyants, I.A.; Singer, A. [HASYLAB at Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2009-07-15

A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)

  9. Automated absolute activation analysis with californium-252 sources

    International Nuclear Information System (INIS)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

A 100-mg ^252Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma-ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma-ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma-ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a six half-life-group model of delayed-neutron emission; the calculations include corrections for delayed-neutron interference from ^17O. Detection sensitivities for ^239Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level
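The delayed-neutron counting calculation described above amounts to integrating a sum of exponentials over half-life groups. A minimal sketch of the count integral under a six-group model follows; the half-lives are rough textbook-style values and the amplitudes are entirely hypothetical, not the calibrated facility constants:

```python
import math

# Six-group delayed-neutron model: counts in a window [t1, t2] after irradiation
# are the integral of sum_i A_i * exp(-lambda_i * t).
HALF_LIVES = [55.7, 22.7, 6.22, 2.30, 0.61, 0.23]     # seconds (textbook-style)
LAMBDAS = [math.log(2.0) / t for t in HALF_LIVES]
AMPLITUDES = [20.0, 110.0, 100.0, 210.0, 65.0, 15.0]  # counts/s at t = 0 (hypothetical)

def delayed_neutron_counts(t1, t2):
    """Integrated delayed-neutron counts between t1 and t2 seconds after irradiation."""
    return sum(A / lam * (math.exp(-lam * t1) - math.exp(-lam * t2))
               for A, lam in zip(AMPLITUDES, LAMBDAS))

# A longer cooling delay before counting suppresses the short-lived groups.
print(round(delayed_neutron_counts(2.0, 32.0), 1))
print(round(delayed_neutron_counts(10.0, 40.0), 1))
```

In a cyclic regime this integral, scaled by fission rate and detector efficiency, links the observed counts back to the fissile mass.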

  10. Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.

    Science.gov (United States)

    Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi

    2018-03-01

    There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.

  11. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing thousands of automatically generated clusters to be tested. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge of individual proteins.
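The keyword-based subset-and-intersection analysis described above can be illustrated with plain Python sets; the proteins and annotations below are toy data, not real SwissProt entries:

```python
# Keyword annotations per protein (hypothetical toy data).
proteins = {
    "P1": {"kinase", "membrane"},
    "P2": {"kinase", "nucleus"},
    "P3": {"kinase", "membrane"},
    "P4": {"protease", "membrane"},
}

# Invert the mapping: keyword -> set of proteins carrying that annotation.
by_keyword = {}
for prot, keywords in proteins.items():
    for kw in keywords:
        by_keyword.setdefault(kw, set()).add(prot)

# A shared-property subset is an intersection of annotation-defined sets.
kinase_and_membrane = by_keyword["kinase"] & by_keyword["membrane"]
print(sorted(by_keyword["kinase"]))   # all proteins annotated as kinases
print(sorted(kinase_and_membrane))    # kinases that are also membrane proteins
```

Scaling this inversion-and-intersection idea across several annotation sources at once is what yields the kind of integrated view the tool draws as a diagram.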

  12. Study and manufacture of an analysis magnet

    International Nuclear Information System (INIS)

    Pronier, Jean

    1965-01-01

This document reports the study and design of an apparatus for the precise qualitative and quantitative analysis of particle beams produced by a number of ion sources of different types, and for using mono-energetic ion beams in accelerator-related experiments. Two analysis methods (electromagnetic and electrostatic) are presented and discussed, and the reasons for choosing electromagnetic analysis are explained. The author then reports the study of the analyser: theoretical capacity to separate particles with similar M/Z ratios, imposed characteristics, optical design, calculation of the slit-image position, second- and third-order aberrations, correction of second-order aberrations, and so on. He reports calculations related to the analyser: vacuum chamber, field map in the air gap, surface of the polar pieces, flux calculation in the air gap, comparison between experimental and theoretical results, ampere-turn calculation, and winding calculation. The induction measurement is described and the experiment is reported. Experimental results are given for the analysis of gases ionized by a high-frequency ion source [fr

  13. Preliminary studies of Brazilian wood using different radioisotopic sources

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Gilberto; Silva, Leonardo Gondim de Andrade e, E-mail: gcarval@ipen.br, E-mail: ftgasilva@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

Due to its availability and particular features, wood was one of the first materials used by mankind, with a wide variety of applications. It can be used as a raw material for paper and cellulose manufacturing; in the chemical, naval, furniture, sporting goods, toy, and musical instrument industries; in building construction; and in the distribution of electric energy. Wood has been widely researched; wood researchers know that several aspects such as temperature, latitude, longitude, altitude, sunlight, soil, and rainfall index interfere with the growth of trees. This behavior explains why average physical-chemical properties are important when wood is studied. Most researchers consider density to be the most important wood property because of its direct relationship with the physical and mechanical properties of wood. There are three types of wood density: basic, apparent and green. The apparent density was used here at 12% moisture content. In this study, four different types of wood were used: 'freijo', 'jequetiba', 'muiracatiara' and 'ipe'. For wood density determination by a non-conventional method, Am-241, Ba-133 and Cs-137 radioisotopic sources, a NaI scintillation detector and a counter were used. The results demonstrated this technique to be quick and accurate. Based on the nuclear parameters obtained, such as half-value layers and linear attenuation coefficients, the Cs-137 radioisotopic source proved to be the best option for inspecting the physical integrity of wooden electric poles and live trees in future works. (author)
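The nuclear parameters mentioned above follow from the Beer-Lambert attenuation law: mu = ln(I0/I)/x and the half-value layer HVL = ln(2)/mu. A short sketch with hypothetical count rates (not the measured data from this study):

```python
import math

def linear_attenuation_coefficient(i0, i, thickness_cm):
    """Beer-Lambert: I = I0 * exp(-mu * x), so mu = ln(I0 / I) / x."""
    return math.log(i0 / i) / thickness_cm

def half_value_layer(mu):
    """Thickness that halves the transmitted intensity: HVL = ln(2) / mu."""
    return math.log(2.0) / mu

# Illustrative counts: a 5 cm wood sample transmitting 70% of a gamma beam.
mu = linear_attenuation_coefficient(i0=10000.0, i=7000.0, thickness_cm=5.0)
print(round(mu, 4), "1/cm")
print(round(half_value_layer(mu), 2), "cm")
```

With mu calibrated for a given source and species, the same relation inverted (x = ln(I0/I)/mu) gives density or thickness from a transmission measurement.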

  14. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time-consuming. Even when data and tools are intended for public re-use, they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  15. Who bears the environmental burden in China? An analysis of the distribution of industrial pollution sources

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Chunbo [School of Agricultural and Resource Economics, University of Western Australia, 35 Stirling Highway, Crawley, 6009, Western Australia (Australia)

    2010-07-15

A remaining challenge for environmental inequality researchers is to translate the principles developed in the U.S. to China, which is experiencing the staggering environmental impacts of its astounding economic growth and social change. This study builds on contemporary U.S. environmental justice literature and examines the issue of environmental inequality in China through an analysis of the geographical distribution of industrial pollution sources in Henan province. The study attempts to answer two central questions: (1) does environmental inequality exist in China, and if it does, (2) what socioeconomic lenses can be used to identify it? The study found that: (1) race and income, the two common lenses used in many U.S. studies, play different roles in the Chinese context; (2) rural residents, and especially rural migrants, are disproportionately exposed to industrial pollution. (author)

  16. Study of characteristic X-ray source and its applications

    International Nuclear Information System (INIS)

    Li Fuquan

    1994-11-01

The laws governing the characteristic X-rays emitted by a target element under irradiation by an isotope source in the low-energy range are discussed. Ways of improving the γ-to-X-ray conversion rate and methods to eliminate the influence of scattered rays are introduced. The influence on the X-ray output of variations in the isotope source, the targets, and the relative source-target position is also discussed, and the conditions for improving the signal-to-noise ratio are presented. The X-ray source based on these results can produce X-rays of different energies and can therefore be widely used in nuclear instruments and other fields as a low-energy source. The thickness gauge, as one of its applications, has succeeded in measuring the thickness of different materials over a large range, and it presents a new application field for characteristic X-ray sources. (11 figs., 10 tabs.)

  17. Modeling and analysis of a transcritical rankine power cycle with a low grade heat source

    DEFF Research Database (Denmark)

    Nguyen, Chan; Veje, Christian

    efficiency, exergetic efficiency and specific net power output. A generic cycle configuration has been used for analysis of a geothermal energy heat source. This model has been validated against similar calculations using industrial waste heat as the energy source. Calculations are done with fixed...

  18. Inter-comparison of receptor models for PM source apportionment: Case study in an industrial area

    Science.gov (United States)

    Viana, M.; Pandolfi, M.; Minguillón, M. C.; Querol, X.; Alastuey, A.; Monfort, E.; Celades, I.

    2008-05-01

Receptor modelling techniques are used to identify and quantify the contributions from emission sources to the levels and the major and trace components of ambient particulate matter (PM). A wide variety of receptor models are currently available, and consequently the comparability between models should be evaluated if source apportionment data are to be used as input in health effects studies or mitigation plans. Three of the most widespread receptor models (principal component analysis, PCA; positive matrix factorization, PMF; chemical mass balance, CMB) were applied to a single PM10 data set (n=328 samples, 2002-2005) obtained from an industrial area in NE Spain dedicated to ceramic production. Sensitivity and temporal trend analyses (using the Mann-Kendall test) were applied. Results evidenced the good overall performance of the three models (r^2 > 0.83 and slope α > 0.91 between modelled and measured PM10 mass), with good agreement regarding source identification and high correlations between input (CMB) and output (PCA, PMF) source profiles. Larger differences were obtained regarding the quantification of source contributions (up to a factor of 4 in some cases). The combined application of different types of receptor models would overcome the limitations of each model by constructing a more robust solution based on their strengths. The authors suggest the combined use of factor analysis techniques (PCA, PMF) to identify and interpret emission sources and to obtain a first quantification of their contributions to the PM mass, followed by the application of CMB. Further research is needed to ensure that source apportionment methods are robust enough for application to PM health effects assessments.
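The factor-analysis step (PCA) of the combined approach suggested above can be sketched on synthetic data: when measurements mix a small number of source profiles plus noise, the leading eigenvalues of the species correlation matrix recover the number of sources. The profiles and contributions below are invented for illustration, not the Spanish PM10 data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic PM10 composition data (hypothetical): two emission sources with
# fixed chemical profiles (rows) mixed with random daily contributions.
profiles = np.array([[0.8, 0.1, 0.1, 0.0],    # e.g. an industry-like profile
                     [0.1, 0.4, 0.1, 0.4]])   # e.g. a traffic-like profile
contributions = rng.exponential(scale=[5.0, 3.0], size=(300, 2))
X = contributions @ profiles + rng.normal(scale=0.05, size=(300, 4))

# PCA on the correlation matrix: eigenvalues rank the variance each factor
# explains; with two true sources, two components carry nearly everything.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
print(np.round(explained, 3))
```

PMF differs by constraining factors and contributions to be non-negative, and CMB instead fits known profiles to each sample; the eigenvalue screen above is only the first, exploratory step.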

  19. A Business Case Study of Open Source Software

    Science.gov (United States)

    2001-07-01

include cost or price, availability or multiple distribution sources, and popularity or brand/reputation. While both the commercial and government... availability, quality, security, management, scalability, brand/reputation, and service and support. [Figure: 1-5 ratings of price, reliability, performance, availability...] source community. Open source, and Linux in particular, is often regarded as the heroic underdog. Linux has been touted as a "Windows killer."44 Over

  20. Evaluation of an Information Source Illustrated by a Case Study

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2011-01-01

It is argued that to evaluate an information source (e.g., a Wikipedia article), it is necessary to relate the content of that source to an interpretation of the state of knowledge at the research front (which is typically developing dynamically). In the research literature, there is a controversy about the effect of screening programs for breast cancer. This controversy is used to compare the value of Wikipedia with Encyclopedia Britannica and two Danish encyclopedias as information sources. It is argued that this method of examining information sources is preferable to other methods which have

  1. Open Source Parallel Image Analysis and Machine Learning Pipeline, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Continuum Analytics proposes a Python-based open-source data analysis machine learning pipeline toolkit for satellite data processing, weather and climate data...

  2. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    Science.gov (United States)

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
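The multitaper time-frequency analysis mentioned above can be sketched outside MATLAB; this is a standalone Python illustration using SciPy's DPSS (Slepian) tapers, not FieldTrip's implementation:

```python
import numpy as np
from scipy.signal.windows import dpss

# Multitaper spectral estimate: average periodograms over orthogonal tapers
# to reduce variance. Test signal: a 10 Hz oscillation in noise.
fs = 200.0                                   # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)

nw, k = 3.0, 5                               # time-bandwidth product, number of tapers
tapers = dpss(t.size, nw, Kmax=k)            # shape (k, n_samples)
spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
psd = spectra.mean(axis=0)                   # averaging over tapers lowers variance
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
print(freqs[np.argmax(psd)])                 # peak near the 10 Hz component
```

Sliding this estimator along a short window over the data yields the time-frequency representation; the toolbox additionally handles trials, channels, and statistics.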

  3. Paleomagnetism.org : An online multi-platform open source environment for paleomagnetic data analysis

    NARCIS (Netherlands)

    Koymans, Mathijs R.; Langereis, C.G.; Pastor-Galán, D.; van Hinsbergen, D.J.J.

    2016-01-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The

  4. Identification of 'Point A' as the prevalent source of error in cephalometric analysis of lateral radiographs.

    Science.gov (United States)

    Grogger, P; Sacher, C; Weber, S; Millesi, G; Seemann, R

    2018-04-10

Deviations in measuring dentofacial components on a lateral X-ray represent a major hurdle in the subsequent treatment of dysgnathic patients. In a retrospective study, we investigated the most prevalent source of error in the following commonly used cephalometric measurements: the angles Sella-Nasion-Point A (SNA), Sella-Nasion-Point B (SNB) and Point A-Nasion-Point B (ANB); the Wits appraisal; the anteroposterior dysplasia indicator (APDI); and the overbite depth indicator (ODI). Preoperative lateral radiographic images of patients with dentofacial deformities were collected and the landmarks digitally traced by three independent raters. Cephalometric analysis was automatically performed based on 1116 tracings. Error analysis identified the x-coordinate of Point A as the prevalent source of error in all investigated measurements except SNB, in which it is not incorporated. In SNB, the y-coordinate of Nasion dominated the error variance. SNB showed the lowest inter-rater variation. In addition, our observations confirmed previous studies showing that landmark identification variance follows characteristic error envelopes, here in the largest number of tracings analysed to date. Variance orthogonal to defining planes was relevant, while variance parallel to planes was not. Taking these findings into account, orthognathic surgeons as well as orthodontists should be able to perform cephalometry more accurately and achieve better therapeutic results. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
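The inter-rater error analysis above amounts to comparing per-coordinate variances across raters. A sketch on synthetic tracings (no patient data) in which Point A is deliberately made noisier along x, mimicking the reported finding:

```python
import numpy as np

rng = np.random.default_rng(2)
n_images, n_raters = 50, 3

# Synthetic tracings: each rater's placement of a landmark scatters around
# the true position; here the x-scatter is made larger than the y-scatter.
true_point_a = np.array([60.0, 40.0])                      # arbitrary mm coordinates
tracings = true_point_a + rng.normal(scale=[2.0, 0.5],
                                     size=(n_images, n_raters, 2))

# Inter-rater variance per coordinate, averaged over images: the axis with
# the largest variance flags the prevalent error source.
var_per_axis = tracings.var(axis=1, ddof=1).mean(axis=0)
print(np.round(var_per_axis, 2))   # x-variance dominates in this synthetic setup
```

Repeating this per landmark and per axis, then propagating the variances into each angular measurement, reproduces the kind of ranking the study reports.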

  5. Light ion source studies with a magnetically insulated extraction diode

    International Nuclear Information System (INIS)

    Struckman, C.K.

    1992-01-01

Light ion sources are currently being studied to assess their ability to drive an inertial confinement fusion reactor. The author has produced a high-purity, 1 MV, 300 A/cm^2 lithium beam using a 200 cm^2 extraction-geometry, magnetically insulated ion diode. The lithium source was an AC glow-discharge-cleaned LiF/Al-film active anode. The active anode plasma was formed after 50 kA of current was shunted through the anode film for 20 ns. The stoichiometry of the resulting ion beam was 65% Li^+, 20% Al^2+, and 15% H^+. Without the glow-discharge cleaning, the ion beam was over 55% hydrogen and only 20% Li^+. At the time of the diode's design, extraction diodes were producing poor ion beams: their current efficiency was only 60-70%, and their extracted ion current was radially nonuniform. This diode was the first high-efficiency extraction diode, producing over 200 kA of ions with 80-90% ion current efficiency. In addition, by varying the tilt of the applied magnetic field, it was possible to show that the ion current density could be made independent of radius. Since the author was unable to make a Li^+ beam with a passive anode, he installed an active anode that used an external current to vaporize a thin metal film on the anode surface. Poor beam purity was the most serious problem with active anodes. In order to remove impurities, especially the hydrogen contamination, the author cleaned the anodes with a glow discharge. Al-film anodes were cleaned with a 110 mA, 33 W DC glow discharge, and the LiF/Al-film anodes were cleaned with an equivalent AC discharge. The results obtained and a model for the mechanism behind the cleaning process are thoroughly discussed

  6. 5 MW pulsed spallation neutron source, Preconceptual design study

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

This report describes a self-consistent base line design for a 5 MW Pulsed Spallation Neutron Source (PSNS). It is intended to establish feasibility of design and as a basis for further expanded and detailed studies. It may also serve as a basis for establishing project cost (30% accuracy) in order to intercompare competing designs for a PSNS not only on the basis of technical feasibility and technical merit but also on the basis of projected total cost. The accelerator design considered here is based on the objective of a pulsed neutron source obtained by means of a pulsed proton beam with average beam power of 5 MW, in ∼1 μs pulses, operating at a repetition rate of 60 Hz. Two target stations are incorporated in the basic facility: one for operation at 10 Hz for long-wavelength instruments, and one operating at 50 Hz for instruments utilizing thermal neutrons. The design approach for the proton accelerator is to use a low energy linear accelerator (at 0.6 GeV), operating at 60 Hz, in tandem with two fast cycling booster synchrotrons (at 3.6 GeV), operating at 30 Hz. It is assumed here that considerations of cost and overall system reliability may favor the present design approach over the alternative approach pursued elsewhere, whereby use is made of a high energy linear accelerator in conjunction with a dc accumulation ring. With the knowledge that this alternative design is under active development, it was deliberately decided to favor here the low energy linac-fast cycling booster approach. Clearly, the present design, as developed here, must be carried to the full conceptual design stage in order to facilitate a meaningful technology and cost comparison with alternative designs.
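A few beam parameters are implied, though not stated, by the figures above (5 MW average power, 3.6 GeV final energy, 60 Hz repetition rate); the arithmetic can be checked directly:

```python
# Back-of-envelope beam parameters for a 5 MW, 3.6 GeV, 60 Hz proton source.
power_w = 5.0e6
energy_ev = 3.6e9
rep_rate_hz = 60.0
e_charge = 1.602176634e-19                  # elementary charge, C

# For singly charged protons, average current I = P / (kinetic energy in eV).
avg_current_a = power_w / energy_ev
protons_per_pulse = avg_current_a / (e_charge * rep_rate_hz)

print(round(avg_current_a * 1e3, 2), "mA average current")
print(f"{protons_per_pulse:.2e} protons per pulse")
```

These are derived estimates only; the actual design budget depends on losses, chopping, and the 10 Hz / 50 Hz split between the two target stations.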

  7. 5 MW pulsed spallation neutron source, Preconceptual design study

    International Nuclear Information System (INIS)

    1994-06-01

    This report describes a self-consistent base line design for a 5 MW Pulsed Spallation Neutron Source (PSNS). It is intended to establish feasibility of design and as a basis for further expanded and detailed studies. It may also serve as a basis for establishing project cost (30% accuracy) in order to intercompare competing designs for a PSNS not only on the basis of technical feasibility and technical merit but also on the basis of projected total cost. The accelerator design considered here is based on the objective of a pulsed neutron source obtained by means of a pulsed proton beam with average beam power of 5 MW, in ∼ 1 μsec pulses, operating at a repetition rate of 60 Hz. Two target stations are incorporated in the basic facility: one for operation at 10 Hz for long-wavelength instruments, and one operating at 50 Hz for instruments utilizing thermal neutrons. The design approach for the proton accelerator is to use a low energy linear accelerator (at 0.6 GeV), operating at 60 Hz, in tandem with two fast cycling booster synchrotrons (at 3.6 GeV), operating at 30 Hz. It is assumed here that considerations of cost and overall system reliability may favor the present design approach over the alternative approach pursued elsewhere, whereby use is made of a high energy linear accelerator in conjunction with a dc accumulation ring. With the knowledge that this alternative design is under active development, it was deliberately decided to favor here the low energy linac-fast cycling booster approach. Clearly, the present design, as developed here, must be carried to the full conceptual design stage in order to facilitate a meaningful technology and cost comparison with alternative designs

  8. Vibration analysis of the photon shutter designed for the advanced photon source

    International Nuclear Information System (INIS)

    Wang, Z.; Shu, D.; Kuzay, T.M.

    1992-01-01

    The photon shutter is a critical component of the beamline front end for the 7 GeV Advanced Photon Source (APS) project, now under construction at Argonne National Laboratory (ANL). The shutter is designed to close in tens of milliseconds to absorb up to a 10 kW heat load (with high heat flux). Our shutter design uses innovative enhanced heat transfer tubes to withstand the high heat load. Although designed to be lightweight and compact, the very fast movement of the shutter gives rise to concern regarding vibration and dynamic sensitivity. To guarantee the long-term functionality and reliability of the shutter, its dynamic behavior should be fully studied. In this paper, the natural frequency and transient dynamic analyses for the shutter during operation are presented. Through analysis of the vibration characteristics, as well as stress and deformation, several design options were developed and compared, including the selection of materials for the shutter and structural details.

  9. Determination of sources and analysis of micro-pollutants in drinking water

    International Nuclear Information System (INIS)

    Md Pauzi Abdullah; Soh Shiau Chian

    2005-01-01

    The objectives of the study are to develop and validate selected analytical methods for the analysis of micro-organics and metals in water; to identify, monitor and assess the levels of micro-organics and metals in drinking water supplies; to evaluate the relevance of the guidelines set in the National Standard of Drinking Water Quality 2001; and to identify the sources of pollution and carry out risk assessment of exposure to drinking water. The presentation discussed the progress of the work, including the determination of volatile organic compounds (VOCs) in drinking water using solid-phase micro-extraction (SPME) techniques, the analysis of heavy metals in drinking water, the determination of Cr(VI) by inductively coupled plasma emission spectrometry (ICPES), and the detection of trace concentrations of halogenated volatile organic compounds (HVOCs), which are heavily used in the agricultural sector, in waters

  10. Studies on characteristics of water sources around Kaiga project area

    International Nuclear Information System (INIS)

    Prakash, T.R.; Krishna Bhat, D.; Thimme Gowda, B.; Sherigara, B.S.; Abdul Khadar, A.M.

    1995-01-01

    A systematic and detailed study of the characteristics of ground water, Kali river water and rain water samples around the Kaiga project area has been undertaken. The analysis of a large number of parameters revealed that the ground waters and Kali river water are of the calcium-bicarbonate type, as indicated by Romani's modified Hill Piper diagram. The ionic impurities in the ground waters and Kali river water are well within the Indian Drinking Water Specifications. The results obtained would serve as baseline data for future impact studies. (author). 6 refs., 1 tab

  11. Market Orientation and Sources of Knowledge to Innovate in SMEs: A Firm Level Study

    Directory of Open Access Journals (Sweden)

    Simone Regina Didonet

    2016-10-01

    Full Text Available This work examines the relationship between the three market orientation (MO) components, i.e. customer orientation, competitor orientation and inter-functional coordination, and the extent to which small and medium-sized enterprises (SMEs) use different sources of knowledge to innovate. Based on a sample of 181 Chilean SMEs, a confirmatory factor analysis (CFA) was performed to analyze the relationships among constructs. The results show that the extent to which SMEs use different sources of knowledge to innovate depends on the interactions between MO components. This study addresses a gap in the literature by linking and interrelating market orientation components with the innovation perspective in SMEs. We thereby provide insights into the role of each MO component in influencing the extent to which firms seek out and use different sources of knowledge to innovate, and attempt to explain some inconsistencies in the literature on the theme.

  12. Analysis of Computer Experiments with Multiple Noise Sources

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae

    2010-01-01

    In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled...

  13. Analysis of the Potential of Low-Temperature Heat Pump Energy Sources

    Directory of Open Access Journals (Sweden)

    Pavel Neuberger

    2017-11-01

    Full Text Available The paper deals with an analysis of the temperatures of ground masses in the proximity of linear and slinky-type horizontal ground heat exchangers (HGHEs). It evaluates and compares the potentials of HGHEs and ambient air. The aim of the verification was to gain knowledge of the temperature course of the monitored low-temperature heat pump energy sources during heating periods and periods of stagnation, and to analyse this knowledge in terms of the potential to use these sources for heat pumps. The study was conducted in the years 2012–2015 during three heating periods and three periods of HGHE stagnation. The results revealed that the linear HGHE had the highest temperature potential of the observed low-temperature heat pump energy sources: the average daily temperatures of the ground mass surrounding the linear HGHE were the highest, ranging from 7.08 °C to 9.20 °C during the heating periods, with the lowest temperature variation range of 12.62–15.14 K, and the relative frequency of the average daily ground-mass temperatures was the highest, at 22.64%, in the temperature range containing the mode of all monitored temperatures, a recorded interval of [4.10, 6.00] °C. Ambient air had a lower temperature potential than the monitored HGHEs.

  14. Qualitative analysis of precipitation distribution in Poland with use of different data sources

    Directory of Open Access Journals (Sweden)

    J. Walawender

    2008-04-01

    Full Text Available Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analyses. GIS can also be applied in climatic research to manage, investigate and display all kinds of weather data.

    The main objective of this study is to demonstrate that GIS is a useful tool to examine and visualise precipitation distribution obtained from different data sources: ground measurements, satellite and radar data.

    Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store the data from the three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data. The results were visualised on maps.
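
    The reclassification and raster-algebra step leading to the contingency tables can be sketched in a few lines of numpy; the grids, the 0.1 mm/h rain threshold and all values below are illustrative assumptions, not data from the study (which used ArcGIS 9.2):

```python
import numpy as np

# Hypothetical rain-rate grids (mm/h) on a common GRID layout:
# one gauge-interpolated ground field and one radar field.
ground = np.array([[0.0, 1.2, 3.4],
                   [0.0, 0.0, 5.1],
                   [2.2, 0.1, 0.0]])
radar = np.array([[0.0, 0.8, 4.0],
                  [0.3, 0.0, 4.9],
                  [1.9, 0.0, 0.0]])

# Reclassification: rain / no-rain using an assumed 0.1 mm/h threshold
g = ground >= 0.1
r = radar >= 0.1

# 2x2 contingency table: rows = ground (yes/no), cols = radar (yes/no)
table = np.array([[np.sum(g & r), np.sum(g & ~r)],
                  [np.sum(~g & r), np.sum(~g & ~r)]])
print(table)
```

    Each cell counts grid points where the two sources agree or disagree on rain occurrence, which is what the study's contingency tables summarise.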

  15. Comparative Genomic Analysis of Mannheimia haemolytica from Bovine Sources.

    Science.gov (United States)

    Klima, Cassidy L; Cook, Shaun R; Zaheer, Rahat; Laing, Chad; Gannon, Vick P; Xu, Yong; Rasmussen, Jay; Potter, Andrew; Hendrick, Steve; Alexander, Trevor W; McAllister, Tim A

    2016-01-01

    Bovine respiratory disease is a common health problem in beef production. The primary bacterial agent involved, Mannheimia haemolytica, is a target for antimicrobial therapy and at risk for associated antimicrobial resistance development. The role of M. haemolytica in pathogenesis is linked to serotype, with serotypes 1 (S1) and 6 (S6) isolated from pneumonic lesions and serotype 2 (S2) found in the upper respiratory tract of healthy animals. Here, we sequenced the genomes of 11 strains of M. haemolytica, representing all three serotypes and performed comparative genomics analysis to identify genetic features that may contribute to pathogenesis. Possible virulence associated genes were identified within 14 distinct prophages, including a periplasmic chaperone, a lipoprotein, peptidoglycan glycosyltransferase and a stress response protein. Prophage content ranged from 2-8 per genome, but was higher in S1 and S6 strains. A type I-C CRISPR-Cas system was identified in each strain with spacer diversity and organization conserved among serotypes. The majority of spacers occur in S1 and S6 strains and originate from phage, suggesting that serotypes 1 and 6 may be more resistant to phage predation. However, two spacers complementary to the host chromosome targeting a UDP-N-acetylglucosamine 2-epimerase and a glycosyl transferases group 1 gene are present in S1 and S6 strains only, indicating these serotypes may employ CRISPR-Cas to regulate gene expression to avoid host immune responses or enhance adhesion during infection. Integrative conjugative elements are present in nine of the eleven genomes. Three of these harbor extensive multi-drug resistance cassettes encoding resistance against the majority of drugs used to combat infection in beef cattle, including macrolides and tetracyclines used in human medicine. The findings here identify key features that are likely contributing to serotype-related pathogenesis and specific targets for vaccine design intended to reduce the

  16. Comparative Genomic Analysis of Mannheimia haemolytica from Bovine Sources.

    Directory of Open Access Journals (Sweden)

    Cassidy L Klima

    Full Text Available Bovine respiratory disease is a common health problem in beef production. The primary bacterial agent involved, Mannheimia haemolytica, is a target for antimicrobial therapy and at risk for associated antimicrobial resistance development. The role of M. haemolytica in pathogenesis is linked to serotype, with serotypes 1 (S1) and 6 (S6) isolated from pneumonic lesions and serotype 2 (S2) found in the upper respiratory tract of healthy animals. Here, we sequenced the genomes of 11 strains of M. haemolytica, representing all three serotypes and performed comparative genomics analysis to identify genetic features that may contribute to pathogenesis. Possible virulence associated genes were identified within 14 distinct prophages, including a periplasmic chaperone, a lipoprotein, peptidoglycan glycosyltransferase and a stress response protein. Prophage content ranged from 2-8 per genome, but was higher in S1 and S6 strains. A type I-C CRISPR-Cas system was identified in each strain with spacer diversity and organization conserved among serotypes. The majority of spacers occur in S1 and S6 strains and originate from phage, suggesting that serotypes 1 and 6 may be more resistant to phage predation. However, two spacers complementary to the host chromosome targeting a UDP-N-acetylglucosamine 2-epimerase and a glycosyl transferases group 1 gene are present in S1 and S6 strains only, indicating these serotypes may employ CRISPR-Cas to regulate gene expression to avoid host immune responses or enhance adhesion during infection. Integrative conjugative elements are present in nine of the eleven genomes. Three of these harbor extensive multi-drug resistance cassettes encoding resistance against the majority of drugs used to combat infection in beef cattle, including macrolides and tetracyclines used in human medicine. The findings here identify key features that are likely contributing to serotype-related pathogenesis and specific targets for vaccine design

  17. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package, and the resulting perfusion parameters were compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R² > 0.8, and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
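
    The two agreement checks reported (R² above 0.8 and differences falling inside the Bland-Altman 95% limits) can be reproduced with a short numpy sketch; the paired perfusion values below are invented for illustration and are not data from the paper:

```python
import numpy as np

# Hypothetical paired perfusion values for the same ROIs from the
# open-source plugin and from the reference clinical package.
plugin = np.array([1.10, 2.05, 2.90, 4.20, 5.10, 6.05])
clinical = np.array([1.00, 2.00, 3.00, 4.00, 5.00, 6.00])

# Agreement metric 1: coefficient of determination R^2
r = np.corrcoef(plugin, clinical)[0, 1]
r2 = r ** 2

# Agreement metric 2: Bland-Altman limits (mean diff +/- 1.96 SD)
diff = plugin - clinical
loa = (diff.mean() - 1.96 * diff.std(ddof=1),
       diff.mean() + 1.96 * diff.std(ddof=1))
within = np.logical_and(diff >= loa[0], diff <= loa[1]).mean()

print(round(r2, 3), [round(x, 3) for x in loa], within)
```

    `within` is the fraction of paired differences inside the limits of agreement; the validation criterion in the abstract corresponds to `r2 > 0.8` with the differences inside the limits.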

  18. An Analysis of Navy Nurse Corps Accession Sources

    Science.gov (United States)

    2014-03-01

    I. INTRODUCTION A. BACKGROUND Since 1998, the United States has experienced a nationwide nursing shortage (Juraschek, Zhang, Ranganathan, ... supply of RNs in the civilian workforce (Juraschek, Zhang, Ranganathan, & Lin, 2012). A study by Juraschek, Zhang, Ranganathan, & Lin (2012) ... Postgraduate School, Monterey, CA. Retrieved from www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA344570 Juraschek, S. P., Zhang, X., Ranganathan, V. K., & Lin, V

  19. Study theorizes use of geothermal sources for energy in refineries

    NARCIS (Netherlands)

    Golombok, M.; Beintema, K.

    2008-01-01

    Geothermal sources for direct heating can theoretically serve as an alternative source of high-temperature heat in processing plants. Cutting CO2 emissions from a refinery requires reducing the amount of fuel burned. Heat obtained from geothermal energy is more efficiently used for directly powering

  20. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.

  1. Radiotracer and Sealed Source Applications in Sediment Transport Studies

    International Nuclear Information System (INIS)

    2014-01-01

    The investigation of sediment transport in seas and rivers is crucial for civil engineering and littoral protection and management. Coastlines and seabeds are dynamic regions, with sediments undergoing periods of erosion, transport, sedimentation and consolidation. The main causes for erosion in beaches include storms and human actions such as the construction of seawalls, jetties and the dredging of stream mouths. Each of these human actions disrupts the natural flow of sand. Current policies and practices are accelerating the beach erosion process. However, there are viable options available to mitigate this damage and to provide for sustainable coastlines. Radioactive methods can help in investigating sediment dynamics, providing important parameters for better designing, maintaining and optimizing civil engineering structures. Radioisotopes as tracers and sealed sources have been useful and often irreplaceable tools for sediment transport studies. The training course material is based on lecture notes and practical works delivered by many experts in IAEA supported activities. Lectures and case studies were reviewed by a number of specialists in this field

  2. Sources of International Courts' Legitimacy: A comparative study

    DEFF Research Database (Denmark)

    Godzimirska, Zuzanna; Creamer, Cosette

    Despite ample scholarship on the legitimacy of international legal institutions, existing studies on international courts (ICs) tend to adopt normative or deductive approaches to specify their legitimacy and assess its effects. Very few adopt empirical or inductive approaches and examine ... of supply-side factors—the features, roles and practices of a court—in assessing its legitimacy, we argue that demand-side factors—namely the characteristics of the evaluating state—also largely determine the sources of an IC’s legitimacy. To support and illustrate this argument, we examine statements ... of members on the operation of three ICs with different institutional designs and roles: the International Court of Justice, the International Criminal Court, and the Appellate Body of the World Trade Organization. We employ supervised learning methods of text classification to identify statements ...

  3. Intrabeam scattering studies at the Swiss light source

    CERN Document Server

    Antoniou, F; Aiba, M; Boege, M; Milas, N; Streun, A; Demma, T

    2012-01-01

    The target parameters of modern ultra-low-emittance rings are entering a regime where intra-beam scattering (IBS) becomes important and, in the case of linear collider damping rings, even a limitation for the delivered emittances. The Swiss Light Source (SLS) storage ring, which has achieved a vertical geometrical emittance of around 1 pm at 2.4 GeV [1], can run at even lower energies and offers emittance monitoring diagnostics, making it an ideal testbed for IBS studies. Simulations using the classical IBS theories and tracking codes are undertaken in order to explore the possibilities and limitations of IBS measurements at the SLS. In this respect, a comparison between the theories and codes is first discussed. The dependence of the output emittances, taking into account the effect of IBS, on energy, bunch charge and zero-current vertical and longitudinal emittance is also studied, in order to define the regimes where the IBS effect can be significant. First mea...

  4. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while maintaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to assign a physical meaning to the possible different sources. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series with fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources.
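
    The PCA-versus-ICA point (whitening alone leaves an unresolved rotation; an independence criterion such as maximal non-Gaussianity fixes it) can be illustrated with a toy blind source separation in numpy. This is a generic kurtosis-based sketch on synthetic signals, not the vbICA method described in the abstract:

```python
import numpy as np

t = np.linspace(0, 8, 2000)
s1 = np.sign(np.sin(3 * t))      # square wave (sub-Gaussian source)
s2 = ((1.3 * t) % 1.0) - 0.5     # sawtooth (sub-Gaussian source)
S = np.vstack([s1, s2])
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S   # observed mixtures

# PCA step: whitening decorrelates the mixtures, but any rotation of
# whitened data stays uncorrelated, so PCA alone cannot solve BSS.
Xc = X - X.mean(axis=1, keepdims=True)
w, V = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
Z = np.diag(w ** -0.5) @ V.T @ Xc

# ICA step: pick the rotation whose outputs are maximally
# non-Gaussian (largest absolute excess kurtosis).
def kurt(y):
    return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

best = max(np.linspace(0, np.pi / 2, 181),
           key=lambda a: sum(abs(kurt(y)) for y in rot(a) @ Z))
Y = rot(best) @ Z   # estimated sources, up to order and sign

# Each recovered component should correlate strongly with one source.
match = [max(abs(np.corrcoef(y, s)[0, 1]) for s in (s1, s2)) for y in Y]
```

    The vbICA method of the abstract replaces the simple kurtosis criterion with a variational Bayesian fit of a mixture-of-Gaussians pdf per source, but the underlying decorrelate-then-impose-independence logic is the same.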

  5. Characterization of polar organic compounds and source analysis of fine organic aerosols in Hong Kong

    Science.gov (United States)

    Li, Yunchun

    Organic aerosols, as an important fraction of airborne particulate mass, significantly affect the environment, climate, and human health. Compared with inorganic species, the characterization of individual organic compounds is much less complete and comprehensive because they number in the thousands or more and are diverse in chemical structure. The source contributions of organic aerosols are far from being well understood because they can be emitted from a variety of sources as well as formed from photochemical reactions of numerous precursors. This thesis work aims to improve the characterization of polar organic compounds and the source apportionment analysis of fine organic carbon (OC) in Hong Kong, and consists of two parts: (1) An improved analytical method to determine monocarboxylic acids, dicarboxylic acids, ketocarboxylic acids, and dicarbonyls collected on filter substrates has been established. These oxygenated compounds were determined as their butyl ester or butyl acetal derivatives using gas chromatography-mass spectrometry. The new method improves on the original Kawamura method by eliminating the water extraction and evaporation steps: aerosol materials were directly mixed with the BF3/BuOH derivatization agent and the extracting solvent hexane. This modification improves recoveries for both the more volatile and the less water-soluble compounds. The improved method was applied to study the abundances and sources of these oxygenated compounds in PM2.5 aerosol samples collected in Hong Kong under different synoptic conditions during 2003-2005. These compounds account on average for 5.2% of OC (range: 1.4%-13.6%) on a carbon basis. Oxalic acid was the most abundant species. Six C2 and C3 oxygenated compounds, namely oxalic, malonic, glyoxylic, and pyruvic acids, glyoxal, and methylglyoxal, dominated this suite of oxygenated compounds. More efforts are therefore suggested to focus on these small compounds in understanding the role of oxygenated

  6. Study and characterization of a phosphorus ion source and development of an emittance meter suited to multi-beam ion sources

    International Nuclear Information System (INIS)

    Hoang Gia Tuong.

    1982-12-01

    The ionization process used is electron bombardment. Phosphorus was chosen for the source experiments because of its principal application, ion implantation; heavy-ion applications are also noted. Operating conditions allowing good results to be obtained are determined after a study of different parameters such as the electron current, the neutral-gas pressure and the extraction voltage: the ion current obtained is of the order of milliamperes. The source emittance, representing the quality of the ion beam, is measured by a method suited to multi-beam sources [fr]

  7. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations: Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distributions in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were ... indicated relation primarily to German, Russian and American mixtures in Hammerfest, and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics.
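
    A minimal sketch of the PCA step used in such chemometric source assessments: standardise the samples-by-pollutants matrix, decompose it, and inspect scores (samples) and loadings (pollutants). The concentration matrix below is invented for illustration, not data from the study:

```python
import numpy as np

# Hypothetical sediment samples x pollutant concentrations
# (columns standing in for, e.g., Cu, Pb, Zn, TBT).
X = np.array([[12.0, 30.0, 110.0, 0.02],
              [14.0, 33.0, 120.0, 0.03],
              [45.0, 90.0, 300.0, 0.90],
              [48.0, 95.0, 310.0, 1.10],
              [13.0, 31.0, 115.0, 0.80]])

# Standardize each variable, as is usual in chemometrics, so that
# concentration scale does not dominate the decomposition.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# PCA via SVD of the standardized data matrix
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * s                     # sample coordinates on the PCs
loadings = Vt.T                    # pollutant contributions to each PC
explained = s ** 2 / np.sum(s ** 2)

print(np.round(explained, 2))
```

    Samples that cluster in score space share a pollution pattern, and a pollutant loading heavily on its own component (as TBT does here relative to the co-varying metals) is the kind of signature the study used to flag a separate point source.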

  8. Economic Analysis of Nitrate Source Reductions in California Agriculture

    Science.gov (United States)

    Medellin-Azuara, J.; Howitt, R.; Rosenstock, T.; Harter, T.; Pettygrove, S. G.; Dzurella, K.; Lund, J. R.

    2011-12-01

    We present an analytical approach to assess the economic impact of improving nitrogen management practices in California agriculture. We employ positive mathematical programming to calibrate crop production to base input information. The production function representation is a nested constant elasticity of substitution (CES) function with two nests: one for applied water and one for applied nitrogen. The first nest accounts for the tradeoffs between irrigation efficiency and capital investments in irrigation technology. The second nest represents the tradeoffs between nitrogen application efficiency and the marginal costs of improving nitrogen efficiency. In the production function nests, low elasticities of substitution and water and nitrogen stress constraints keep agricultural crop yields constant despite changes in nitrogen management practices. We use the Tulare Basin and the Salinas Valley, in California's Central Valley and Central Coast respectively, as our case studies. Preliminary results show that initial reductions of 25% in nitrogen loads to groundwater may not impose large costs on agricultural crop production, as substitution of management inputs results in only small declines in net revenue from farming and total land use. Larger reductions in the nitrogen load to groundwater, of 50%, impose larger marginal costs for better nitrogen management inputs and reductions in the area of lower-valued crops grown in the study areas. Despite the shortage of data on the quantitative effects of improved nitrogen efficiency, our results demonstrate the potential of combining economic and agronomic data into a model that reflects differences in cost and substitutability among nitrogen application methods and that can be used to reduce the quantity of nitrogen leaching into groundwater.
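
    The nested CES structure described above can be sketched as follows; all share and curvature parameters are illustrative assumptions, not calibrated values from the study:

```python
def ces(inputs, shares, rho):
    """CES aggregate: (sum_i b_i * x_i**rho)**(1/rho); sigma = 1/(1-rho)."""
    return sum(b * x ** rho for b, x in zip(shares, inputs)) ** (1.0 / rho)

# Inner nests: irrigation (applied water vs. irrigation capital) and
# nitrogen (applied N vs. management effort improving N efficiency).
water_comp = ces([100.0, 20.0], [0.7, 0.3], rho=0.5)
nitro_comp = ces([80.0, 10.0], [0.8, 0.2], rho=0.4)

# Outer nest combines the two composites with a fixed land input.
yield_idx = ces([water_comp, nitro_comp, 50.0], [0.4, 0.3, 0.3], rho=0.2)
```

    Because rho < 1 keeps the substitution elasticity 1/(1 - rho) finite, a cut in applied nitrogen can be partly offset by more management input, but at rising marginal cost, which is the mechanism behind the contrast between the 25% and 50% load-reduction results.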

  9. Analysis of the Source System of Nantun Group in Huhehu Depression of Hailar Basin

    Science.gov (United States)

    Li, Yue; Li, Junhui; Wang, Qi; Lv, Bingyang; Zhang, Guannan

    2017-10-01

    The Huhehu Depression will be a new exploration frontier in the Hailar Basin, but at present its exploration level is low and little work has been done on the source system of the Nantun Group, so a fine depiction of the source system would be significant for reconstructing the sedimentary system, delineating the reservoir distribution and predicting favourable areas. This paper comprehensively uses methods such as palaeo-landform analysis, light and heavy mineral assemblages, and seismic reflection characteristics to make a detailed study of the source system of the Nantun Group from different perspectives and at different levels. The results show that the source system of the Huhehu Depression derives from the Xilinbeir bulge in the east and the Bayan Mountain uplift in the west, which surround the basin. The slope belt is the main source, and the southern bulge is the secondary source. The distribution of the source system determines the distribution of the sedimentary system and the regularity of the distribution of the sand bodies.

  10. Noise source analysis of nuclear ship Mutsu plant using multivariate autoregressive model

    International Nuclear Information System (INIS)

    Hayashi, K.; Shimazaki, J.; Shinohara, Y.

    1996-01-01

    The present study is concerned with the noise sources in the N.S. Mutsu reactor plant. Noise experiments on the Mutsu plant were performed in order to investigate the plant dynamics and the effect of sea condition and ship motion on the plant. The reactor noise signals as well as the ship motion signals were analyzed by a multivariate autoregressive (MAR) modeling method to clarify the noise sources in the reactor plant. It was confirmed from the analysis results that most of the plant variables were affected mainly by a horizontal component of the ship motion, that is, the sway, through vibrations of the plant structures. Furthermore, the effect of ship motion on the reactor power was evaluated through the analysis of wave components extracted by a geometrical transform method. It was concluded that the amplitude of the reactor power oscillation was about 0.15% in normal sea conditions, which was small enough for safe operation of the reactor plant. (authors)
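
    A multivariate autoregressive noise analysis of this kind can be sketched with an OLS-fitted VAR(1) on synthetic data in which a sway-like signal drives a power-like signal; the coupling coefficients are invented, and the MAR analysis of the actual Mutsu signals was of course more elaborate:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3000
sway = np.zeros(n)
power = np.zeros(n)
for k in range(1, n):
    # sway is an autonomous AR(1) noise source; power is driven by
    # its own past AND by past sway (the assumed coupling is 0.3).
    sway[k] = 0.8 * sway[k - 1] + rng.normal(0.0, 1.0)
    power[k] = 0.5 * power[k - 1] + 0.3 * sway[k - 1] + rng.normal(0.0, 0.2)

# Fit the VAR(1) model y_k = A @ y_{k-1} + e_k by least squares.
Y = np.vstack([sway[1:], power[1:]]).T
X = np.vstack([sway[:-1], power[:-1]]).T
A = np.linalg.lstsq(X, Y, rcond=None)[0].T

# A[1, 0] estimates the sway -> power coupling; A[0, 1] estimates the
# (absent) power -> sway coupling, identifying sway as the noise driver.
print(np.round(A, 2))
```

    The asymmetry of the off-diagonal coefficients is what lets an MAR analysis attribute the power fluctuations to ship motion rather than the other way round.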

  11. Reliability and validity analysis of the open-source Chinese Foot and Ankle Outcome Score (FAOS).

    Science.gov (United States)

    Ling, Samuel K K; Chan, Vincent; Ho, Karen; Ling, Fona; Lui, T H

    2017-12-21

    To develop the first reliable and validated open-source outcome scoring system in the Chinese language for foot and ankle problems, the English FAOS was translated into Chinese following standard protocols. First, two forward translations were created separately; these were then combined into a preliminary version by an expert committee and subsequently back-translated into English. The process was repeated until the original and back translations were congruent. This version was then field-tested on actual patients, who provided feedback for modification. The final Chinese FAOS version was then tested for reliability and validity. Reliability analysis was performed on 20 subjects, while validity analysis was performed on 50 subjects. The tools used to validate the Chinese FAOS were the SF36 and the Pain Numeric Rating Scale (NRS). Internal consistency between the FAOS subgroups was measured using Cronbach's alpha, and Spearman's correlation was calculated between each subgroup of the FAOS, the SF36 and the NRS. The Chinese FAOS passed both reliability and validity testing, meaning it is reliable, internally consistent and correlates positively with the SF36 and the NRS. The Chinese FAOS is a free, open-source scoring system that can be used to provide a relatively standardised outcome measure for foot and ankle studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
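
    The two statistics named (Cronbach's alpha for internal consistency, Spearman correlation for validity against other scales) can be computed with a short numpy sketch; the subscale scores and pain ratings below are fabricated for illustration and imply nothing about the study's actual values:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item or subscale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def spearman(x, y):
    """Spearman rank correlation (no tie handling, for illustration)."""
    rank = lambda v: np.argsort(np.argsort(v))
    return np.corrcoef(rank(x), rank(y))[0, 1]

# Fabricated FAOS-like subscale scores (rows: 6 patients, cols: 3
# subscales) and pain NRS ratings for the same patients.
faos = np.array([[80, 75, 78],
                 [60, 55, 58],
                 [90, 88, 85],
                 [40, 45, 42],
                 [70, 68, 72],
                 [55, 50, 52]])
nrs = np.array([2, 5, 1, 8, 4, 6])

alpha = cronbach_alpha(faos)          # internal consistency
rho = spearman(faos.sum(axis=1), nrs)  # validity vs. an external scale
```

    With these made-up numbers the subscales move together (high alpha) and total score is perfectly inversely rank-ordered with pain; in a real validation one would report the coefficients with confidence intervals across all subscale pairs.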

  12. Study of hot electrons in a ECR ion source

    International Nuclear Information System (INIS)

    Barue, C.

    1992-12-01

    The aims of this thesis are to develop diagnostics for the hot-electron population of the plasma and to study how the measured plasma parameters behave as the source operating parameters are varied. The experimental results obtained give new information on the hot electrons of an ECR ion source. The thesis is divided into four parts: the first presents the ECR source and the experimental configuration (ECRIS physics, the Minimafios GHz source, the diagnostics used); the second, the diagnostics (a computer code for cyclotron emission, and calibration); the third gives the experimental results in continuous regime (cyclotron emission diagnostics, bremsstrahlung); and the fourth, the experimental results in pulsed regime (cyclotron emission diagnostics, diamagnetism)

  13. Study on road surface source pollution controlled by permeable pavement

    Science.gov (United States)

    Zheng, Chaocheng

    2018-06-01

    The growth of impermeable pavement in urban construction not only increases surface runoff but also generates substantial non-point source pollution. When permeable pavement is used to control surface runoff, a large amount of particulate matter is retained as rainwater infiltrates, so that the pollution is controlled at its source. In this laboratory experiment, we determined the effectiveness of permeable pavement in removing heavy pollutants and discussed the factors that affect non-point source pollution from permeable pavement, so as to provide a theoretical basis for its application.

  14. A content analysis of depression-related discourses on Sina Weibo: attribution, efficacy, and information sources.

    Science.gov (United States)

    Pan, Jiabao; Liu, Bingjie; Kreps, Gary L

    2018-06-20

    Depression is a mood disorder that may lead to severe outcomes including mental breakdown, self-injury, and suicide. Potential causes of depression include genetic, sociocultural, and individual-level factors. However, public understandings of depression, guided by a complex interplay of media and other societal discourses, may not be congruent with scientific knowledge. Misunderstandings of depression can lead to under-treatment and stigmatization of depression. Against this backdrop, this study aims at a holistic understanding of the patterns and dynamics of discourses about depression from various information sources in China by examining related posts on social media. A content analysis was conducted of 902 posts about depression randomly selected from a three-year period (2014 to 2016) on the mainstream social media platform in China, Sina Weibo. Posts were analyzed with a focus on attributions of and solutions to depression, attitudes towards depression, and the efficacy indicated by the posts across various information sources. Results suggested that depression was most often attributed to individual-level factors. Individual-level attributions were often adopted by state-owned media, whereas health and academic experts and organizations most often mentioned biological causes of depression. Citizen journalists and unofficial social groups tended to make societal-level attributions. Overall, traditional media posts suggested the lowest efficacy in coping with depression and the most severe negative outcomes as compared with other sources. The dominance of individual-level attributions and solutions regarding depression on Chinese social media on one hand manifests the public's limited understanding of depression and, on the other, may further constrain adoption of scientific explanations about depression and exacerbate stigmatization of depressed individuals. Mass media's posts centered on description of severe

  15. YouTube as a source of COPD patient education: A social media content analysis

    Science.gov (United States)

    Stellefson, Michael; Chaney, Beth; Ochipa, Kathleen; Chaney, Don; Haider, Zeerak; Hanik, Bruce; Chavarria, Enmanuel; Bernhardt, Jay M.

    2014-01-01

    Objective: To conduct a social media content analysis of COPD patient education videos on YouTube. Methods: A systematic search protocol was used to locate 223 videos. Two independent coders evaluated each video to determine topics covered, media source(s) of posted videos, information quality as measured by HONcode guidelines for posting trustworthy health information on the Internet, and viewer exposure/engagement metrics. Results: Over half the videos (n=113, 50.7%) included information on medication management, with far fewer videos on smoking cessation (n=40, 17.9%). Most videos were posted by a health agency or organization (n=128, 57.4%), and the majority of videos were rated as high quality (n=154, 69.1%). HONcode adherence differed by media source (Fisher’s Exact Test=20.52, p=.01), with user-generated content (UGC) receiving the lowest quality scores. The overall level of user engagement as measured by number of “likes,” “favorites,” “dislikes,” and user comments was low (mdn range = 0–3, interquartile range (IQR) = 0–16) across all sources of media. Conclusion: Study findings suggest that COPD education via YouTube has the potential to reach and inform patients; however, existing video content and quality vary significantly. Future interventions should help direct individuals with COPD to increase their engagement with high-quality patient education videos on YouTube that are posted by reputable health organizations and qualified medical professionals. Patients should be educated to avoid and/or critically view low-quality videos posted by individual YouTube users who are not health professionals. PMID:24659212

  16. Studies of IRAS sources at high galactic latitudes

    International Nuclear Information System (INIS)

    Rowan-Robinson, M.; Helou, G.; Walker, D.

    1987-01-01

    A detailed study has been carried out of a complete sample of IRAS 25-, 60- and 100-μm sources identified with galaxies brighter than 14.5 mag at |b| > 60°. Redshifts are available for virtually all these galaxies. The 60- and 100-μm luminosities are well correlated with the corrected absolute magnitude M_B, the 25-μm luminosity less well so. There is a clear correlation of the ratio of far-infrared luminosity to optical luminosity, L_FIR/L_B, with the 100 μm/60 μm colour, in the sense that the more luminous infrared galaxies are warmer. This behaviour can be modelled as a mixture of a normal 'disc' component and a starburst component. There is no significant difference in the distribution of L_FIR/L_B versus 100 μm/60 μm colour for edge-on and face-on spirals, showing that the adopted internal extinction correction is a good approximation. (author)

  17. NATO Advanced Study Institute on Physics of New Laser Sources

    CERN Document Server

    Arecchi, F; Mooradian, Aram; Sona, Alberto

    1985-01-01

    This volume contains the lectures and seminars presented at the NATO Advanced Study Institute on "Physics of New Laser Sources", the twelfth course of the Europhysics School of Quantum Electronics, held under the supervision of the Quantum Electronics Division of the European Physical Society. The Institute was held at Centro "I Cappuccini" San Miniato, Tuscany, July 11-21, 1984. The Europhysics School of Quantum Electronics was started in 1970 with the aim of providing instruction for young researchers and advanced students already engaged in the area of quantum electronics or for those wishing to switch into this area after working previously in other areas. From the outset, the School has been under the direction of Prof. F. T. Arecchi, then at the University of Pavia, now at the University of Florence, and Dr. D. Roess of Heraeus, Hanau. In 1981, Prof. H. Walther, University of Munich and Max-Planck Institut fur Quantenoptik joined as co-director. Each year the Directors choose a subject of particular int...

  18. A 12 GHz RF Power Source for the CLIC Study

    Energy Technology Data Exchange (ETDEWEB)

    Schirm, Karl; /CERN; Curt, Stephane; /CERN; Dobert, Steffen; /CERN; McMonagle, Gerard; /CERN; Rossat, Ghislain; /CERN; Syratchev, Igor; /CERN; Timeo, Luca; /CERN; Haase, Andrew /SLAC; Jensen, Aaron; /SLAC; Jongewaard, Erik; /SLAC; Nantista, Christopher; /SLAC; Sprehn, Daryl; /SLAC; Vlieks, Arnold; /SLAC; Hamdi, Abdallah; /Saclay; Peauger, Franck; /Saclay; Kuzikov, Sergey; /Nizhnii Novgorod, IAP; Vikharev, Alexandr; /Nizhnii Novgorod, IAP

    2012-07-03

    The CLIC RF frequency was changed in 2008 from the initial 30 GHz to the European X-band frequency of 11.9942 GHz, permitting beam-independent power production using klystrons for CLIC accelerating structure testing. A design and fabrication contract for five klystrons at that frequency has been signed with SLAC by the parties involved. France (IRFU, CEA Saclay) is contributing a solid-state modulator purchased from industry and specific 12 GHz RF network components to the CLIC study. RF pulses of over 120 MW peak at 230 ns length will be obtained by using a novel SLED-I type pulse compression scheme designed and fabricated by IAP, Nizhny Novgorod, Russia. The X-band power test stand is being installed in the CLIC Test Facility CTF3 for independent structure and component testing in a bunker, while allowing, at a later stage, for powering RF components in the CTF3 beam lines. The design of the facility, results from commissioning of the RF power source and the expected performance of the Test Facility are reported.

  19. A 12 GHZ RF Power source for the CLIC study

    CERN Document Server

    Peauger, F; Curt, S; Doebert, S; McMonagle, G; Rossat, G; Schirm, KM; Syratchev, I; Timeo, L; Kuzikhov, S; Vikharev, AA; Haase, A; Sprehn, D; Jensen, A; Jongewaard, EN; Nantista, CD; Vlieks, A

    2010-01-01

    The CLIC RF frequency was changed in 2008 from the initial 30 GHz to the European X-band frequency of 11.9942 GHz, permitting beam-independent power production using klystrons for CLIC accelerating structure testing. A design and fabrication contract for five klystrons at that frequency has been signed with SLAC by the parties involved. France (IRFU, CEA Saclay) is contributing a solid-state modulator purchased from industry and specific 12 GHz RF network components to the CLIC study. RF pulses of over 120 MW peak at 230 ns length will be obtained by using a novel SLED-I type pulse compression scheme designed and fabricated by IAP, Nizhny Novgorod, Russia. The X-band power test stand is being installed in the CLIC Test Facility CTF3 for independent structure and component testing in a bunker, while allowing, at a later stage, for powering RF components in the CTF3 beam lines. The design of the facility, results from commissioning of the RF power source and the expected performance of the Test Facility are reported.

  20. Factor analysis of sources of information on organ donation and transplantation in journalism students.

    Science.gov (United States)

    Martínez-Alarcón, L; Ríos, A; Ramis, G; López-Navas, A; Febrero, B; Ramírez, P; Parrilla, P

    2013-01-01

    Journalists and the information they disseminate are essential to promote health and organ donation and transplantation (ODT). The attitude of journalism students toward ODT could influence public opinion and help promote this treatment option. The aim of this study was to determine the media through which journalism students receive information on ODT and to analyze the association between the sources of information and psychosocial variables. We surveyed journalism students (n = 129) recruited in compulsory classes. A validated psychosocial questionnaire (self-administered, anonymous) about ODT was used. Student's t test and the χ² test were applied. The questionnaire completion rate was 98% (n = 126). The medium with the greatest influence on students was television (TV), followed by the press and magazines/books. In the factor analysis to determine the impact of information by its source, the first factor was talks with friends and family; the second was shared by hoardings/publicity posters, health professionals, and college/school; and the third was TV and radio. In the factor analysis between information sources and psychosocial variables, associations were found between information about organ donation transmitted by friends and family and having spoken about ODT with them; between TV, radio, and hoardings and not having spoken about it in the family; and between TV/radio and the father's and mother's opinion about ODT. The medium with the greatest influence on students is TV, while the channel with the greatest impact on conveying information was conversation with friends, family, and health professionals. This could be useful for society, because people should be provided with clear and concise information. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    Science.gov (United States)

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.

  2. OVAS: an open-source variant analysis suite with inheritance modelling.

    Science.gov (United States)

    Mozere, Monika; Tekman, Mehmet; Kari, Jameela; Bockenhauer, Detlef; Kleta, Robert; Stanescu, Horia

    2018-02-08

    The advent of modern high-throughput genetics continually broadens the gap between the rising volume of sequencing data and the tools required to process them. The need to pinpoint a small subset of functionally important variants has now shifted towards identifying the critical differences between normal variants and disease-causing ones. The ever-increasing reliance on cloud-based services for sequence analysis and the non-transparent methods they utilize has prompted the need for more in-situ services that can provide a safer and more accessible environment to process patient data, especially in circumstances where continuous internet usage is limited. To address these issues, we herein propose our standalone Open-source Variant Analysis Sequencing (OVAS) pipeline, consisting of three key stages of processing that pertain to the separate modes of annotation, filtering, and interpretation. Core annotation performs variant mapping to gene isoforms at the exon/intron level, appends functional data pertaining to the type of variant mutation, and determines hetero-/homozygosity. An extensive inheritance-modelling module, in conjunction with 11 other filtering components, can be used in sequence, ranging from single-file quality control to multi-file penetrance-model specifics such as X-linked recessive or mosaicism. Depending on the type of interpretation required, additional annotation is performed to identify organ specificity through gene expression and protein domains. In the course of this paper we analysed an autosomal recessive case study. OVAS made effective use of the filtering modules to recapitulate the results of the study by identifying the prescribed compound-heterozygous disease pattern from exome-capture sequence input samples. 
OVAS is an offline open-source modular-driven analysis environment designed to annotate and extract useful variants from Variant Call Format (VCF) files, and process them under an inheritance context through a top-down filtering schema of

  3. Source separation of household waste: A case study in China

    International Nuclear Information System (INIS)

    Zhuang Ying; Wu Songwei; Wang Yunlong; Wu Weixiang; Chen Yingxu

    2008-01-01

    A pilot program concerning source separation of household waste was launched in Hangzhou, capital city of Zhejiang province, China. Detailed investigations of the composition and properties of household waste in the experimental communities revealed that high water content and a high percentage of food waste are the main limiting factors in the recovery of recyclables, especially paper, from household waste, and the main contributors to the high cost and low efficiency of waste disposal. On the basis of the investigation, a novel source separation method, under which household waste is classified as food waste, dry waste and harmful waste, was proposed and implemented in four selected communities. In addition, a corresponding household waste management system that involves all stakeholders, a recovery system and a mechanical dehydration system for food waste were established to promote source separation activity. Performance data and questionnaire survey results showed that the active support and investment of a real estate company and a community residential committee play important roles in enhancing public participation and awareness of the importance of waste source separation. In comparison with the conventional mixed collection and transportation system for household waste, the established source separation and management system is cost-effective. It could be extended to the entire city and serve as a reference for other cities in China

  4. Comparative study of wild edible mushrooms as sources of antioxidants.

    Science.gov (United States)

    Witkowska, Anna M; Zujko, Małgorzata E; Mirończuk-Chodakowska, Iwona

    2011-01-01

    The purpose of the study was to explore sixteen of the most popular edible species of wild-growing mushrooms as potential sources of antioxidants. Among the mushrooms tested, the highest total polyphenol contents, exceeding 100 mg/100 g fresh mass, were found in five mushrooms: Boletus chrysenteron, B. edulis, Leccinum scabrum, L. aurantiacum, and Macrolepiota procera. Antioxidant activity was measured with the FRAP, TEAC, DPPH scavenging ability and ferrous-ion chelating ability assays. Results of the study show that wild mushrooms vary in their antioxidant properties. The highest FRAP potentials, exceeding 1 mmol/100 g, were found in five species of Boletales: Boletus edulis, B. chrysenteron, Leccinum scabrum, L. aurantiacum, and Suillus grevillei. TEAC values were from 1.07 to 4.01 mmol/100 g fresh mass. High TEAC values (>2.3 mmol/100 g) were found in Leccinum scabrum, L. aurantiacum, Macrolepiota procera, Boletus chrysenteron, and B. edulis. The DPPH radical scavenging effectiveness of mushroom extracts, expressed as EC50 values, was in the range 2.91-13.86 mg/mL. Scavenging ability was highest for B. edulis and B. chrysenteron. The metal chelating ability of mushroom extracts, expressed as EC50 values of chelating ability on ferrous ions, ranged from 8.02 mg/mL in Cantharellus cibarius to 12.10 mg/mL in Suillus luteus. Among the mushrooms tested, Boletus chrysenteron and B. edulis were characterized by high scores of polyphenol contents and antioxidant activity in the FRAP, TEAC, and DPPH assays. These results place these culinary species of wild-growing mushrooms among products with considerable antioxidant potential.
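
The EC50 values quoted above are read off dose-response curves; a minimal sketch of that reading, using invented scavenging data rather than the study's measurements, is:

```python
import numpy as np

# Invented DPPH dose-response series: extract concentration vs. % scavenging
conc_mg_ml = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
scavenging_pct = np.array([12.0, 25.0, 46.0, 71.0, 90.0])

# EC50: the concentration at which scavenging crosses 50%, obtained by
# linear interpolation between the bracketing measured points
ec50 = np.interp(50.0, scavenging_pct, conc_mg_ml)
```

A lower EC50 means a smaller extract concentration achieves 50% radical scavenging, i.e. a stronger antioxidant.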

  5. Materials compatibility studies for the Spallation Neutron Source

    International Nuclear Information System (INIS)

    DiStefano, J.R.; Pawel, S.J.; Manneschmidt, E.T.

    1998-01-01

    The Spallation Neutron Source (SNS) is a high power facility for producing neutrons that utilizes flowing liquid mercury inside an austenitic stainless steel container as the target for a 1.0 GeV proton beam. Type 316 SS has been selected as the container material for the mercury and consequences of exposure of 316 SS to radiation, thermal shock, thermal stress, cavitation and hot, flowing mercury are all being addressed by R and D programs. In addition, corrosion studies also include evaluation of Inconel 718 because it has been successfully used in previous spallation neutron systems as a window material. Two types of compatibility issues relative to 316 SS/mercury and Inconel 718/mercury are being examined: (1) liquid metal embrittlement (LME) and (2) temperature gradient mass transfer. Studies have shown that mercury does not easily wet type 316 SS below 275 C. In the LME experiments, attempts were made to promote wetting of the steel by mercury either by adding gallium to the mercury or coating the specimen with a tin-silver solder that the mercury easily wets. The latter proved more reliable in establishing wetting, but there was no evidence of LME in any of the constant extension rate tensile tests either at 23 or 100 C. Inconel 718 also showed no change in room temperature properties when tested in mercury or mercury-gallium. However, there was evidence that the fracture was less ductile. Preliminary evaluation of mass transfer of either type 316 SS or Inconel 718 in mercury or mercury-gallium at 350 C (maximum temperature) did not reveal significant effects. Two 5,000 h thermal convection loop tests of type 316 SS are in progress, with specimens in both hot and cold test regions, at 300 and 240 C, respectively

  6. Dietary sources of sugars in adolescents' diet: the HELENA study.

    Science.gov (United States)

    Mesana, M I; Hilbig, A; Androutsos, O; Cuenca-García, M; Dallongeville, J; Huybrechts, I; De Henauw, S; Widhalm, K; Kafatos, A; Nova, E; Marcos, A; González-Gross, M; Molnar, D; Gottrand, F; Moreno, L A

    2018-03-01

    To report dietary sugars consumption and their different types and food sources, in European adolescents. Food consumption data of selected groups were obtained from 1630 adolescents (45.6% males, 12.5-17.5 years) from the HELENA study using two nonconsecutive 24-h recalls. Energy intake, total sugars and free sugars were assessed using the HELENA-DIAT software. Multiple regression analyses were performed adjusting for relevant confounders. Total sugars intake (137.5 g/day) represented 23.6% and free sugars (110.1 g/day), 19% of energy intake. Girls had significantly lower intakes of energy, carbohydrates, total sugars and free sugars. 94% of adolescents had a consumption of free sugars above 10% of total energy intake. The main food contributor to free sugars was 'carbonated, soft and isotonic drinks,' followed by 'non-chocolate confectionary' and 'sugar, honey, jam and syrup.' Older boys and girls had significantly higher intakes of free sugars from 'cakes, pies and biscuits.' Free sugars intake was negatively associated with low socioeconomic status for 'non-chocolate confectionary' and 'sugar, honey and jam' groups; with low maternal educational level for carbonated and 'soft drinks,' 'sugar, honey and jam,' 'cakes and pies' and 'breakfast cereals' groups; and with high paternal educational level for 'carbonated and soft drinks' and 'chocolates' group. The majority (94%) of studied adolescents consumed free sugars above 10% of daily energy intake. Our data indicate a broad variety in foods providing free sugars. Continued efforts are required at different levels to reduce the intake of free sugars, especially in families with a low educational level.
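
The reported figures are internally consistent: at roughly 4 kcal per gram of sugar, 110.1 g/day of free sugars supplying 19% of energy implies a mean intake near 2300 kcal/day:

```python
# Figures taken from the abstract: 110.1 g/day free sugars = 19% of energy
FREE_SUGARS_G_PER_DAY = 110.1
FREE_SUGARS_SHARE = 0.19
KCAL_PER_G_SUGAR = 4.0

# Implied mean daily energy intake consistent with the two reported numbers
implied_energy_kcal = FREE_SUGARS_G_PER_DAY * KCAL_PER_G_SUGAR / FREE_SUGARS_SHARE
```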

  7. On Road Study of Colorado Front Range Greenhouse Gases Distribution and Sources

    Science.gov (United States)

    Petron, G.; Hirsch, A.; Trainer, M. K.; Karion, A.; Kofler, J.; Sweeney, C.; Andrews, A.; Kolodzey, W.; Miller, B. R.; Miller, L.; Montzka, S. A.; Kitzis, D. R.; Patrick, L.; Frost, G. J.; Ryerson, T. B.; Robers, J. M.; Tans, P.

    2008-12-01

    The Global Monitoring Division and Chemical Sciences Division of the NOAA Earth System Research Laboratory teamed up over the summer of 2008 to experiment with a new measurement strategy for characterizing greenhouse gas distributions and sources in the Colorado Front Range. Combining expertise in greenhouse gas measurements and in intensive local-to-regional-scale air quality campaigns, we built the 'Hybrid Lab'. A continuous CO2 and CH4 cavity ring-down spectroscopic analyzer (Picarro, Inc.), a CO gas-filter correlation instrument (Thermo Environmental, Inc.) and a continuous UV absorption ozone monitor (2B Technologies, Inc., model 202SC) were installed securely onboard a 2006 Toyota Prius hybrid vehicle, with an inlet bringing in outside air from a few meters above the ground. To better characterize point and distributed sources, air samples were taken with a Portable Flask Package (PFP) for later multiple-species analysis in the lab. A GPS unit hooked up to the ozone analyzer and another installed on the PFP kept track of our location, allowing us to map measured concentrations along the driving route using Google Earth. The Hybrid Lab went out for several drives in the vicinity of the NOAA Boulder Atmospheric Observatory (BAO) tall tower in Erie, CO, covering Boulder, Denver, Longmont, Fort Collins and Greeley. Enhancements in CO2 and CO and destruction of ozone mainly reflect emissions from traffic. Methane enhancements, however, are clearly correlated with nearby point sources (landfill, feedlot, natural gas compressor ...) or with larger-scale air masses advected from NE Colorado, where oil and gas drilling operations are widespread. The multiple-species analysis (hydrocarbons, CFCs, HFCs) of the air samples collected along the way provides insightful information about the methane sources at play. 
We will present results of the analysis and interpretation of the Hybrid Lab Front Range Study and conclude with perspectives

  8. Polarisation analysis of elastic neutron scattering using a filter spectrometer on a pulsed source

    International Nuclear Information System (INIS)

    Mayers, J.; Williams, W.G.

    1981-05-01

    The experimental and theoretical aspects of the polarisation analysis technique in elastic neutron scattering are described. An outline design is presented for a filter polarisation analysis spectrometer on the Rutherford Laboratory Spallation Neutron Source and estimates made of its expected count rates and resolution. (author)

  9. Factors influencing the spatial extent of mobile source air pollution impacts: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Levy Jonathan I

    2007-05-01

    Background: There has been growing interest among exposure assessors, epidemiologists, and policymakers in the concept of "hot spots", or more broadly, the "spatial extent" of impacts from traffic-related air pollutants. This review attempts to quantitatively synthesize findings about the spatial extent under various circumstances. Methods: We include both the peer-reviewed literature and government reports, and focus on four significant air pollutants: carbon monoxide, benzene, nitrogen oxides, and particulate matter (including both ultrafine particle counts and fine particle mass). From the identified studies, we extracted information about significant factors that would be hypothesized to influence the spatial extent within the study, such as the study type (e.g., monitoring, air dispersion modeling, GIS-based epidemiological studies), focus on concentrations or health risks, pollutant under study, background concentration, emission rate, and meteorological factors, as well as the study's implicit or explicit definition of spatial extent. We supplement this meta-analysis with results from some illustrative atmospheric dispersion modeling. Results: We found that pollutant characteristics and background concentrations best explained variability in previously published spatial extent estimates, with a modifying influence of local meteorology, once some extreme values based on health risk estimates were removed from the analysis. As hypothesized, inert pollutants with high background concentrations had the largest spatial extent (often demonstrating no significant gradient), and pollutants formed in near-source chemical reactions (e.g., nitrogen dioxide) had a larger spatial extent than pollutants depleted in near-source chemical reactions or removed through coagulation processes (e.g., nitrogen oxide and ultrafine particles). Our illustrative dispersion model illustrated the complex interplay of spatial extent definitions, emission rates
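
A minimal illustration of how a dispersion model yields a "spatial extent": the textbook ground-level Gaussian-plume expression below, with rough neutral-stability spread coefficients (assumed for illustration, not taken from the review), gives the downwind distance at which the concentration increment falls below a chosen threshold:

```python
import math

def plume_conc(x_m, q_g_s=1.0, u_m_s=3.0):
    """Ground-level centreline concentration (g/m^3) of a ground-level point
    source: textbook Gaussian plume with crude class-D spread power laws."""
    sigma_y = 0.08 * x_m ** 0.90   # assumed horizontal spread law
    sigma_z = 0.06 * x_m ** 0.85   # assumed vertical spread law
    return q_g_s / (math.pi * u_m_s * sigma_y * sigma_z)

def spatial_extent(threshold_g_m3, xs=range(10, 5000, 10)):
    """First downwind distance (m) where the increment drops below threshold."""
    for x in xs:
        if plume_conc(x) < threshold_g_m3:
            return x
    return None
```

Lowering the threshold (e.g. a smaller increment over background) pushes the extent outward, mirroring the review's finding that background concentration strongly shapes the apparent spatial extent.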

  10. Fine particulates over South Asia: Review and meta-analysis of PM2.5 source apportionment through receptor model.

    Science.gov (United States)

    Singh, Nandita; Murari, Vishnu; Kumar, Manish; Barman, S C; Banerjee, Tirthankar

    2017-04-01

    Fine particulates (PM2.5) constitute the dominant proportion of airborne particulates and have often been associated with human health disorders, changes in regional climate, the hydrological cycle and, more recently, food security. The intrinsic properties of particulates are a direct function of their sources. This motivates a comprehensive review of PM2.5 sources over South Asia, which in turn may be valuable for developing emission control strategies. Particulate source apportionment (SA) through receptor models is an established tool to quantify the contributions of particulate sources. A review of 51 SA studies was performed, of which 48 (94%) appeared within the span 2007-2016. More than half of the SA studies (55%) were concentrated in a few typical urban stations (Delhi, Dhaka, Mumbai, Agra and Lahore). Due to the lack of local particulate source profiles and emission inventories, positive matrix factorization and principal component analysis (62% of studies) were the primary choices, followed by chemical mass balance (CMB, 18%). Metallic species were most regularly used as source tracers, while the use of organic molecular markers and gas-to-particle conversion was minimal. Across all SA sites, vehicular emissions (mean ± sd: 37 ± 20%) emerged as the most dominant PM2.5 source, followed by industrial emissions (23 ± 16%), secondary aerosols (22 ± 12%) and natural sources (20 ± 15%). Vehicular emissions (39 ± 24%) were also identified as the dominant source for highly polluted sites (PM2.5 > 100 μg/m³, n = 15), while site-specific influences of industrial, secondary aerosol and natural sources, singly or in combination, were recognized. Source-specific trends varied considerably by region and season. Both natural and industrial sources were most influential over Pakistan and Afghanistan, while over the Indo-Gangetic plain vehicular, natural and industrial emissions appeared dominant. Influence of vehicular emission was
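
The principal-component step used by many of the reviewed SA studies can be sketched on toy data; the two source profiles and tracer species below are invented for illustration and are not from any of the reviewed studies:

```python
import numpy as np

# Toy species-by-sample matrix built from two latent sources with
# hypothetical tracer profiles (e.g. traffic: EC/Cu; crustal: Al).
rng = np.random.default_rng(1)
n_samples = 200
traffic = rng.gamma(2.0, 1.0, n_samples)        # latent source strengths
crustal = rng.gamma(2.0, 1.0, n_samples)
profiles = np.array([[1.0, 0.8, 0.1],           # traffic profile
                     [0.1, 0.1, 1.0]])          # crustal profile
X = np.outer(traffic, profiles[0]) + np.outer(crustal, profiles[1])
X += rng.normal(scale=0.05, size=X.shape)       # measurement noise

# PCA via SVD of the mean-centred matrix: the leading components should
# absorb nearly all the variance generated by the two sources.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()
```

In practice the rotated factor loadings are matched against known tracer patterns to label each factor as vehicular, crustal, industrial, and so on.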

  11. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

    Radioisotope-excited XRF systems using annular sources are widely used in view of their simplicity, wide availability, relatively low price for the complete system, and good overall performance with respect to accuracy and detection limits. However, some problems arise when the use of fundamental parameter techniques for quantitative analysis is attempted. These problems are due to the fact that the systems operate with large solid angles for incoming and emerging radiation, so neither the incident nor the take-off angle is trivially defined. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF set-ups and in predicting the response of inhomogeneous samples. A study of the sensitivity of the results to sample characteristics and a comparison of the results with experimentally determined values for the incident and take-off angles are also presented. A flexible and user-friendly computer program was developed in order to perform the lengthy calculations involved efficiently. (author). 14 refs. 5 figs
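A Monte Carlo estimate of an effective angle of the kind described above can be sketched as follows: sample source and sample-surface points uniformly by area, weight each ray by its solid-angle intensity (proportional to 1/d²), and average the angle to the surface normal. The geometry values here are hypothetical, not those of the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Hypothetical geometry (cm): annular source with inner/outer radii,
# sample disk of radius r_s, source plane at height h above the sample.
r_in, r_out, r_s, h = 1.0, 2.0, 1.0, 1.5

# Uniform-by-area sampling of the annulus and of the sample disk.
r_src = np.sqrt(rng.uniform(r_in**2, r_out**2, N))
a_src = rng.uniform(0.0, 2*np.pi, N)
r_smp = np.sqrt(rng.uniform(0.0, r_s**2, N))
a_smp = rng.uniform(0.0, 2*np.pi, N)

dx = r_src*np.cos(a_src) - r_smp*np.cos(a_smp)
dy = r_src*np.sin(a_src) - r_smp*np.sin(a_smp)
d = np.sqrt(dx**2 + dy**2 + h**2)

# Angle from the sample normal, weighted by 1/d^2 (solid-angle intensity).
theta = np.degrees(np.arccos(h / d))
w = 1.0 / d**2
eff_incident = np.sum(w * theta) / np.sum(w)
print(round(eff_incident, 1))
```

The same weighted-average machinery extends to the take-off angle and to detector-efficiency integrals by changing the sampled geometry.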

  12. Pinon Pine Tree Study, Los Alamos National Laboratory: Source document

    International Nuclear Information System (INIS)

    Gonzales, G.J.; Fresquez, P.R.; Mullen, M.A.; Naranjo, L. Jr.

    2000-01-01

    One of the dominant tree species growing within and around Los Alamos National Laboratory (LANL), Los Alamos, NM, lands is the pinon pine (Pinus edulis) tree. Pinon pine is used for firewood, fence posts, and building materials and is a source of nuts for food--the seeds are consumed by a wide variety of animals and are also gathered by people in the area and eaten raw or roasted. This study investigated (1) the concentrations of 3 H, 137 Cs, 90 Sr, total U, 238 Pu, 239,240 Pu, and 241 Am in soils (0- to 12-in. [31 cm] depth underneath the tree), pinon pine shoots (PPS), and pinon pine nuts (PPN) collected from LANL lands and regional background (BG) locations, (2) concentrations of radionuclides in PPN collected from 1977 to the present, (3) the committed effective dose equivalent (CEDE) from the ingestion of nuts, and (4) soil-to-PPS-to-PPN concentration ratios (CRs). Most radionuclides, with the exception of 3 H in soils, were not significantly higher (p < 0.10) in soils, PPS, and PPN collected from LANL as compared to BG locations, and concentrations of most radionuclides in PPN from LANL have decreased over time. The maximum net CEDE (the CEDE plus two sigma minus BG) at the most conservative ingestion rate (10 lb [4.5 kg]) was 0.0018 mrem (0.018 microSv). Soil-to-nut CRs for most radionuclides were within the range of default values in the literature for common fruits and vegetables.
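The ingestion CEDE reported above is, at its core, a sum over nuclides of concentration × intake × dose coefficient. A minimal sketch of that calculation; the net concentrations are invented for illustration, and the dose coefficients are typical adult ingestion values, not necessarily those used in the study:

```python
# Committed effective dose equivalent (CEDE) from ingesting contaminated
# nuts: dose = sum over nuclides of (concentration x intake x dose factor).

ingestion_kg = 4.5  # the study's most conservative intake (10 lb)

# nuclide: (illustrative net concentration Bq/kg, ingestion dose
# coefficient Sv/Bq, typical adult values)
nuclides = {
    "Cs-137": (0.20, 1.3e-8),
    "Sr-90":  (0.10, 2.8e-8),
    "Am-241": (0.01, 2.0e-7),
}

cede_sv = sum(conc * ingestion_kg * dcf for conc, dcf in nuclides.values())
cede_usv = cede_sv * 1e6
print(f"{cede_usv:.4f} microsievert")   # -> 0.0333 microsievert
```

The study's quoted 0.018 microSv maximum is of this same order, which illustrates why the reported doses are negligible compared with natural background.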

  13. A study of γ-ray source for the transmutation

    International Nuclear Information System (INIS)

    Nomura, Masahiro; Takahashi, Hiroshi.

    1996-07-01

    PNC is developing a high-power CW electron linac for various applications, such as the transmutation of fission products, a free electron laser (FEL), and a positron source. In particular, transmutation by the electron linac has been studied for several years. These studies showed that a high-flux, high-energy γ-ray source (∼15 MeV) is required; one of the major problems is that a large amount of transmutation energy is needed, and a narrow γ-ray energy spectrum can reduce this energy. The γ-rays can be produced by synchrotron radiation, FEL, and laser Compton scattering. These methods are described briefly and compared. As a result, laser Compton scattering is found to be a good method for producing high-energy γ-rays. However, the cross section between electron and photon is small, and the scattered photon energy spectrum is not narrow enough to reduce the transmutation energy drastically. To enhance the interaction between electron and photon, a super cavity is proposed, and some experiments are in progress. To reduce the transmutation energy, the scattered electrons must be reused in a storage ring: if they are not reused for producing γ-rays, the efficiency is less than 1%, whereas in our system it can be increased to 20% by reusing the scattered electrons. This efficiency is still low. To increase the efficiency, the RF bucket must be enlarged, which is possible if the momentum compaction factor α can be reduced; the storage ring must therefore be designed to have a small value of α. The dependence of the efficiency on the electron energy was also investigated; in short, it is difficult to increase the efficiency drastically by changing the electron energy. This work was conducted as part of the collaboration between PNC and BNL. (author)
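The kinematics behind the laser Compton source above can be checked with the standard head-on backscatter formula for the maximum scattered photon energy. The electron and laser energies below are illustrative choices that land near the ∼15 MeV quoted in the abstract, not the paper's actual parameters:

```python
# Maximum backscattered photon energy in laser Compton scattering
# (head-on collision): E_max = 4*gamma^2*E_L / (1 + 4*gamma*E_L/(m_e c^2)).

ME_MEV = 0.511  # electron rest energy, MeV

def compton_edge_mev(e_electron_mev, e_laser_ev):
    """Compton edge (maximum scattered photon energy) in MeV."""
    gamma = e_electron_mev / ME_MEV
    e_l = e_laser_ev * 1e-6          # eV -> MeV
    x = 4.0 * gamma * e_l / ME_MEV   # recoil correction term
    return 4.0 * gamma**2 * e_l / (1.0 + x)

# Illustrative: ~0.9 GeV electrons against a 1.17 eV (Nd:YAG) laser photon.
print(round(compton_edge_mev(915.0, 1.17), 2))   # ~14.8 MeV
```

The quadratic dependence on the electron Lorentz factor is what makes a sub-GeV electron beam sufficient to reach the 15 MeV region from an eV-scale laser photon.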

  14. Organic aerosol source apportionment in London 2013 with ME-2: exploring the solution space with annual and seasonal analysis

    Directory of Open Access Journals (Sweden)

    E. Reyes-Villegas

    2016-12-01

    Full Text Available The multilinear engine (ME-2) factorization tool is being widely used following the recent development of the Source Finder (SoFi) interface at the Paul Scherrer Institute. However, the success of this tool, when using the a-value approach, largely depends on the inputs (i.e. target profiles) applied as well as the experience of the user. A strategy to explore the solution space is proposed, in which the solution that best describes the organic aerosol (OA) sources is determined according to the systematic application of predefined statistical tests. This includes trilinear regression, which proves to be a useful tool for comparing different ME-2 solutions. Aerosol Chemical Speciation Monitor (ACSM) measurements were carried out at the urban background site of North Kensington, London from March to December 2013, where for the first time the behaviour of OA sources and their possible environmental implications were studied using an ACSM. Five OA sources were identified: biomass burning OA (BBOA), hydrocarbon-like OA (HOA), cooking OA (COA), semivolatile oxygenated OA (SVOOA) and low-volatility oxygenated OA (LVOOA). ME-2 analysis of the seasonal data sets (spring, summer and autumn) showed a higher variability in the OA sources that was not detected in the combined March-December data set; this variability was explored with the triangle plots f44:f43 and f44:f60, in which a high variation of SVOOA relative to LVOOA was observed in the f44:f43 analysis. Hence, it was possible to conclude that, when performing source apportionment on long-term measurements, important information may be lost, and the analysis should also be done for shorter periods, such as seasons. Further analysis of the atmospheric implications of these OA sources was carried out, identifying evidence of a possible contribution of heavy-duty diesel vehicles to air pollution during weekdays, compared to vehicles fuelled by petrol.
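At its core, the ME-2/PMF analysis above factorizes the organic-aerosol data matrix into non-negative factor time series and factor profiles. A minimal sketch using plain Lee-Seung multiplicative updates on synthetic data; the real ME-2 solver additionally weights residuals by measurement uncertainties and constrains profiles toward targets via the a-value approach:

```python
import numpy as np

rng = np.random.default_rng(1)
n_times, n_mz, n_factors = 50, 20, 3

G_true = rng.random((n_times, n_factors))   # source time series
F_true = rng.random((n_factors, n_mz))      # source mass spectra
X = G_true @ F_true                          # synthetic OA data matrix

# Non-negative bilinear factorization X ~ G @ F via multiplicative updates.
G = rng.random((n_times, n_factors)) + 0.1
F = rng.random((n_factors, n_mz)) + 0.1
eps = 1e-12
err0 = np.linalg.norm(X - G @ F)
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + eps)
    G *= (X @ F.T) / (G @ F @ F.T + eps)
err = np.linalg.norm(X - G @ F)
print(f"error reduced by factor {err0 / err:.1e}")
```

The factorization is only unique up to rotations, which is exactly why the abstract emphasizes exploring the solution space with statistical tests rather than accepting a single run.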

  15. Sources to the landscape - detailed spatiotemporal analysis of 200 years Danish landscape dynamics using unexploited historical maps and aerial photos

    DEFF Research Database (Denmark)

    Svenningsen, Stig Roar; Christensen, Andreas Aagaard; Dupont, Henrik

    to declassification of military maps and aerial photos from the cold war, only relatively few sources have been made available to researchers due to a lack of digitization efforts and related services. And even though the digitizing of cartographic material has accelerated, the digitally available materials...... or to the commercial photo series from the last 20 years. This poster outlines a new research project focusing on the potential of unexploited cartographic sources for detailed analysis of the dynamics of the Danish landscape between 1800 and 2000. The project draws on cartographic sources available in Danish archives...... of material in landscape change studies, giving a high temporal and spatial resolution. The project also deals with the opportunities and constraints of comparing different cartographic sources with diverse purposes and times of production, e.g. different scales and quality of aerial photos or the difference between...

  16. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of a variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from a practical point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
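A one-sided local derivative index of the kind proposed above can be sketched in a few lines: each input is perturbed once about its nominal value, so ranking k parameters costs only k + 1 code runs. The quadratic toy model below is an illustrative stand-in for an expensive TH system code:

```python
# Local-derivative uncertainty importance: perturb each input once and
# rank by the normalized derivative |dy/dx| * sigma_x / |y0|.

def model(x):
    """Stand-in for an expensive thermal-hydraulics code run."""
    return 5.0 * x[0] + 0.1 * x[1] + 2.0 * x[2] ** 2

nominal = [1.0, 1.0, 1.0]
sigma = [0.1, 0.1, 0.1]   # input standard deviations

y0 = model(nominal)       # one nominal run
indices = []
for i in range(len(nominal)):
    pert = list(nominal)
    pert[i] += sigma[i]   # one extra run per parameter
    dy_dx = (model(pert) - y0) / sigma[i]
    indices.append(abs(dy_dx) * sigma[i] / abs(y0))

ranking = sorted(range(len(indices)), key=lambda i: -indices[i])
print(ranking)   # -> [0, 2, 1]: x0 dominates, x1 is negligible
```

With only k + 1 evaluations this is far cheaper than variance-based measures, at the cost of being local: it sees only the slope at the nominal point.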

  17. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of a variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from a practical point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  18. To study the municipal solid waste as an energy source

    International Nuclear Information System (INIS)

    Ahmed, Z.; Khan, M.M.

    2005-01-01

    Solid waste management is very complicated, especially when it must be environmentally friendly. Today, energy is more expensive than ever before, and humankind is struggling to acquire cheap ways of obtaining it. At the same time, it faces another problem: waste disposal and the resulting pollution of the environment, a by-product of industry and population, which becomes even more serious when the waste is hazardous to life. In this study, the idea of using garbage as an alternative fuel is examined, and an analysis of its ingredients is performed to compare it with the usual fuel, i.e. coal. As a side benefit, the municipal waste (garbage) disposal problem would be solved at the same time. (author)

  19. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in source term estimations by a large computer code, such as MELCOR or MAAP, is an essential part of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as the principal tool for the overall uncertainty analysis in source term quantification, while the LHS is used in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance obtained from cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytically, while in the third the distribution is unknown. The first case uses symmetric analytical distributions; the second consists of two asymmetric distributions with nonzero skewness.
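The Latin hypercube sampling step used alongside the RSM above can be sketched directly: stratify each input's range into n equal-probability bins and draw exactly one jittered sample per bin, with independently permuted bin orders across dimensions. A minimal implementation (not the code used in the study):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One sample per equal-probability stratum in every dimension."""
    u = rng.uniform(size=(n_samples, n_dims))   # jitter within each stratum
    strata = np.array([rng.permutation(n_samples)
                       for _ in range(n_dims)]).T
    return (strata + u) / n_samples             # values in (0, 1)

rng = np.random.default_rng(42)
pts = latin_hypercube(10, 3, rng)

# Each column hits every stratum [k/10, (k+1)/10) exactly once.
print(sorted(np.floor(pts[:, 0] * 10).astype(int).tolist()))
# -> [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The uniform (0, 1) samples would then be mapped through each input's inverse cdf, giving full marginal coverage with far fewer code runs than plain random sampling.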

  20. Multiband Study of Radio Sources of the RCR Catalogue with Virtual Observatory Tools

    Directory of Open Access Journals (Sweden)

    Zhelenkova O. P.

    2012-09-01

    Full Text Available We present early results of our multiband study of the RATAN Cold Revised (RCR) catalogue obtained from seven cycles of the “Cold” survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the SS 433 source. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, and the VLSS, TXS, NVSS, FIRST and GB6 radio surveys to accumulate information about the sources. For radio sources that have no detectable candidate in the optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS, and 2MASS surveys and also used co-added frames in different bands. We reliably identified 76% of the radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, the interoperability services of ALADIN and TOPCAT, and also other Virtual Observatory (VO) tools and resources, such as CASJobs, NED, Vizier, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.

  1. Source Attribution of Cyanides using Anionic Impurity Profiling, Stable Isotope Ratios, Trace Elemental Analysis and Chemometrics

    Energy Technology Data Exchange (ETDEWEB)

    Mirjankar, Nikhil S.; Fraga, Carlos G.; Carman, April J.; Moran, James J.

    2016-01-08

    Chemical attribution signatures (CAS) for chemical threat agents (CTAs) are being investigated to provide an evidentiary link between CTAs and specific sources to support criminal investigations and prosecutions. In a previous study, anionic impurity profiles developed using high performance ion chromatography (HPIC) were demonstrated as CAS for matching samples from eight potassium cyanide (KCN) stocks to their reported countries of origin. Herein, a larger number of solid KCN stocks (n = 13) and, for the first time, solid sodium cyanide (NaCN) stocks (n = 15) were examined to determine what additional sourcing information can be obtained through anion, carbon stable isotope, and elemental analyses of cyanide stocks by HPIC, isotope ratio mass spectrometry (IRMS), and inductively coupled plasma optical emission spectroscopy (ICP-OES), respectively. The HPIC anion data were evaluated using the variable selection methods of Fisher ratio (F-ratio), interval partial least squares (iPLS), and genetic algorithm-based partial least squares (GAPLS) and the classification methods of partial least squares discriminant analysis (PLSDA), K nearest neighbors (KNN), and support vector machines discriminant analysis (SVMDA). In summary, hierarchical cluster analysis (HCA) of anion impurity profiles from multiple cyanide stocks from six reported countries of origin resulted in the cyanide samples clustering into three groups: the Czech Republic, Germany, and the United States, independent of the associated alkali metal (K or Na). The three country groups were independently corroborated by HCA of cyanide elemental profiles and corresponded to countries with known solid cyanide factories. Both the anion and elemental CAS are believed to originate from the aqueous alkali hydroxides used in cyanide manufacture. Carbon stable isotope measurements resulted in two clusters: Germany and the United States (the single Czech stock grouped with the United States stocks). The carbon isotope CAS is believed to

  2. Application of Abaqus to analysis of the temperature field in elements heated by moving heat sources

    Directory of Open Access Journals (Sweden)

    W. Piekarska

    2010-10-01

    Full Text Available Numerical analysis of thermal phenomena occurring during laser beam heating is presented in this paper. Numerical models of surface and volumetric heat sources are presented, and the influence of different laser beam heat source power distributions on the temperature field is analyzed. The temperature field was obtained by a numerical solution of the transient heat transfer equation with the activity of inner heat sources, using the finite element method. Temperature distribution analysis in the welded joint was performed in the ABAQUS/Standard solver. The DFLUX subroutine was used for the implementation of the movable welding heat source model. Temperature-dependent thermophysical properties for steel were assumed in the computer simulations. The temperature distribution in laser-beam surface-heated and butt-welded plates was numerically estimated.
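A movable surface heat source of the kind implemented in a DFLUX subroutine can be sketched as a Gaussian beam flux whose centre translates at the travel speed. The parameters below are illustrative, not those used in the paper:

```python
import numpy as np

# Moving Gaussian surface heat source (laser-beam style flux).
Q = 2000.0     # absorbed beam power, W
r0 = 0.002     # beam radius, m
v = 0.01       # travel speed along x, m/s

def surface_flux(x, y, t):
    """Heat flux (W/m^2) at surface point (x, y) at time t."""
    rc2 = (x - v * t) ** 2 + y ** 2   # squared distance to beam centre
    return (2.0 * Q / (np.pi * r0 ** 2)) * np.exp(-2.0 * rc2 / r0 ** 2)

# The flux at a fixed point peaks when the beam centre passes over it.
t = np.linspace(0.0, 2.0, 2001)
flux = surface_flux(0.01, 0.0, t)
print(round(float(t[np.argmax(flux)]), 6))   # peak at t = x/v = 1.0 s
```

In a real Abaqus DFLUX routine the same expression would be evaluated per integration point, with the beam-centre position supplied from the current step time.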

  3. Feasibility study of broadband efficient ''water window'' source

    International Nuclear Information System (INIS)

    Higashiguchi, Takeshi; Yugami, Noboru; Otsuka, Takamitsu; Jiang Weihua; Endo, Akira; Li Bowen; Dunne, Padraig; O'Sullivan, Gerry

    2012-01-01

    We demonstrate a table-top broadband emission water window source based on laser-produced high-Z plasmas. Resonance emission from multiply charged ions merges to produce intense unresolved transition arrays (UTAs) in the 2-4 nm region, extending below the carbon K edge (4.37 nm). Arrays resulting from n=4-n=4 transitions are overlaid with n=4-n=5 emission and shift to shorter wavelength with increasing atomic number. An outline of a microscope design for single-shot live cell imaging is proposed based on a bismuth plasma UTA source, coupled to multilayer mirror optics.

  4. Feasibility of fissile mass assay of spent nuclear fuel using 252Cf-source-driven frequency-analysis

    International Nuclear Information System (INIS)

    Mattingly, J.K.; Valentine, T.E.; Mihalczo, J.T.

    1996-01-01

    The feasibility was evaluated using MCNP-DSP, an analog Monte Carlo transport code, to simulate source-driven measurements. Models of an isolated Westinghouse 17x17 PWR fuel assembly in a 1500-ppm borated water storage pool were used. In the models, the fuel burnup profile was represented using seven axial burnup zones, each with isotopics estimated by the PDQ code. Four different fuel assemblies with average burnups from fresh to 32 GWd/MTU were modeled and analyzed. Analysis of the fuel assemblies was simulated by inducing fission in the fuel using a 252 Cf source adjacent to the assembly and correlating source fissions with the response of a bank of 3 He detectors adjacent to the assembly opposite the source. This analysis was performed at 7 different axial positions on each of the 4 assemblies, and the source-detector cross-spectrum signature was calculated for each of these 28 simulated measurements. The magnitude of the cross-spectrum signature follows a smooth upward trend with increasing fissile material ( 235 U and 239 Pu) content, and the signature is independent of the concentration of spontaneously fissioning isotopes (e.g., 244 Cm) and (α,n) sources. Furthermore, the cross-spectrum signature is highly sensitive to changes in fissile material content. This feasibility study indicated that the signature would increase ∼100% in response to an increase of only 0.1 g/cm 3 of fissile material
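The source-detector cross-spectrum signature used above is, mechanically, the conjugate product of the two channels' Fourier transforms. The synthetic single-frequency signals below show only that mechanics; a real measurement uses broadband fission-chain noise and averages the cross-spectrum over many records:

```python
import numpy as np

# Cross-power spectrum between a driving (source) channel and a
# correlated detector channel; signals are synthetic placeholders.
fs, n = 1000.0, 4000            # sample rate (Hz), record length
t = np.arange(n) / fs
source = np.sin(2 * np.pi * 50.0 * t)                 # source channel
detector = 0.4 * np.sin(2 * np.pi * 50.0 * t - 0.3)   # delayed response

S = np.fft.rfft(source)
D = np.fft.rfft(detector)
cross = np.conj(S) * D          # cross-spectrum G_sd(f)
freqs = np.fft.rfftfreq(n, 1.0 / fs)

peak = freqs[np.argmax(np.abs(cross))]
print(peak)   # -> 50.0 (Hz): correlated content shows up here
```

Uncorrelated contributions (spontaneous fission, (α,n) neutrons) average toward zero in the cross-spectrum, which is why the signature tracks only source-induced fissions.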

  5. Entropy Generation Analysis of Natural Convection in Square Enclosures with Two Isoflux Heat Sources

    Directory of Open Access Journals (Sweden)

    S. Z. Nejad

    2017-04-01

    Full Text Available This study investigates entropy generation resulting from natural convective heat transfer in square enclosures with local heating of the bottom and symmetrical cooling of the sidewalls. The analysis aims to optimize the heat transfer of two semiconductor components in a square electronic package. In this simulation, the heaters are modeled as isoflux heat sources and the sidewalls of the enclosure are isothermal heat sinks. The top wall and the non-heated portions of the bottom wall are adiabatic. Flow and temperature fields are obtained by numerical simulation of the conservation equations of mass, momentum and energy in laminar, steady, two-dimensional flow. With constant heat energy into the cavity, the effect of Rayleigh number, heater length, heater strength ratios and heater position on the flow and temperature fields and on local entropy generation is evaluated. The results show that a minimum entropy generation rate is obtained under the same conditions in which a minimum peak heater temperature is obtained.
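The conduction part of the local entropy generation field evaluated in such studies has the form S‴ = k|∇T|²/T². A sketch of that post-processing step on a synthetic 2-D temperature field (viscous dissipation omitted, all values illustrative):

```python
import numpy as np

# Local thermal entropy generation rate S''' = k * |grad T|^2 / T^2,
# evaluated on a synthetic temperature field.
k = 0.026                       # W/(m K), air-like conductivity
nx = ny = 101
x = np.linspace(0.0, 1.0, nx)
y = np.linspace(0.0, 1.0, ny)
X, Y = np.meshgrid(x, y, indexing="ij")

# Hot spot on the bottom wall, mimicking a local isoflux heater.
T = 300.0 + 50.0 * np.exp(-((X - 0.5) ** 2 + Y ** 2) / 0.05)

dTdx, dTdy = np.gradient(T, x, y)
s_gen = k * (dTdx ** 2 + dTdy ** 2) / T ** 2   # W/(m^3 K)

print(float(s_gen.sum() * (x[1] - x[0]) * (y[1] - y[0])) > 0.0)
```

In the full analysis the viscous term μΦ/T is added and the field is integrated over the cavity; minimizing that integral is the optimization target the abstract refers to.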

  6. Quantitative Analysis of VIIRS DNB Nightlight Point Source for Light Power Estimation and Stability Monitoring

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2014-12-01

    Full Text Available The high sensitivity and advanced onboard calibration on the Visible Infrared Imaging Radiometer Suite (VIIRS Day/Night Band (DNB enables accurate measurements of low light radiances which leads to enhanced quantitative applications at night. The finer spatial resolution of DNB also allows users to examine social economic activities at urban scales. Given the growing interest in the use of the DNB data, there is a pressing need for better understanding of the calibration stability and absolute accuracy of the DNB at low radiances. The low light calibration accuracy was previously estimated at a moderate 15% using extended sources while the long-term stability has yet to be characterized. There are also several science related questions to be answered, for example, how the Earth’s atmosphere and surface variability contribute to the stability of the DNB measured radiances; how to separate them from instrument calibration stability; whether or not SI (International System of Units traceable active light sources can be designed and installed at selected sites to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB; furthermore, whether or not such active light sources can be used for detecting environmental changes, such as aerosols. This paper explores the quantitative analysis of nightlight point sources, such as those from fishing vessels, bridges, and cities, using fundamental radiometry and radiative transfer, which would be useful for a number of applications including search and rescue in severe weather events, as well as calibration/validation of the DNB. Time series of the bridge light data are used to assess the stability of the light measurements and the calibration of VIIRS DNB. It was found that the light radiant power computed from the VIIRS DNB data matched relatively well with independent assessments based on the in situ light installations, although estimates have to be
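The point-source radiometry the abstract describes can be reduced to a simple chain: pixel-integrated radiance times pixel footprint gives the radiant intensity toward the sensor, and assumptions about isotropy and atmospheric transmittance convert that into source power. All numbers below are illustrative, not calibrated DNB values:

```python
import math

# Rough radiant-power estimate for an unresolved night light seen in one
# VIIRS DNB pixel (illustrative values, not a calibrated retrieval).
L_pixel = 5.0e-9                  # observed radiance, W / (cm^2 sr)
pixel_area_cm2 = (742e2) ** 2     # ~742 m DNB footprint, in cm^2
tau = 0.8                         # assumed one-way atmospheric transmittance

intensity = L_pixel * pixel_area_cm2 / tau   # W/sr toward the sensor
power = 4.0 * math.pi * intensity            # isotropic-source power, W
print(round(power, 1))
```

Hundreds of watts is a plausible scale for bridge lighting, which is why comparing such estimates with in situ light installations is a workable stability check.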

  7. Tax evasion and income source: A comparative experimental study

    NARCIS (Netherlands)

    Gërxhani, K.; Schram, A.J.H.C.

    2006-01-01

    We compare tax evasive behavior in a country in transition from communism to that in a developed economy by running an experiment across distinct social groups in Albania and the Netherlands. Aside from the tax compliance decision, subjects choose a source of income, where one type enables

  8. Principal and experimental study of source of polarized electrons

    International Nuclear Information System (INIS)

    Shang Rencheng; Gao Junfang; Xiao Yuan; Pang Wenning; Deng Jingkang

    1999-01-01

    The production of polarized electrons, i.e. the source of polarized electrons, is briefly introduced. Future measurements of polarization and the applications of polarized electrons in atomic and molecular physics, condensed matter physics, biological physics, and nuclear and particle physics are discussed.

  9. Literature study of source term research for PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Sponton, L.L.; Nilsson, Lars

    2001-04-01

    A literature survey has been carried out in support of ongoing source term calculations with the MELCOR code of some severe accident scenarios for the Swedish Ringhals 2 pressurised water reactor (PWR). Research in the field of severe accidents in power reactors and the source term for the subsequent release of radioisotopes intensified after the Harrisburg accident and has produced a large number of reports and papers. This survey was therefore limited to research concerning PWR-type reactors, with emphasis on papers related to MELCOR code development. A background is given, relating to some historic documents, and then more recent research after 1990 is reviewed. Of special interest is the ongoing Phebus programme, which is creating new and important results of benefit to the development and validation of, among others, the MELCOR code. It is concluded that source term calculations involve the simulation of many interacting complex physical phenomena, which results in large uncertainties. The research has, however, over the years led to considerable improvements. Thus the uncertainty in source term predictions has been reduced by one to two orders of magnitude, from the simpler codes of the early 1980s to the more realistic codes of today, like MELCOR.

  10. SOURCE-CRITICAL STUDIES IN LUKE-ACTS: IMPLICATIONS ...

    African Journals Online (AJOL)

    HP

    its picture in the human mind. Social memory, especially in its wider concept of ... his sources, with special attention to their function and how they influenced Luke's narrative discourse. ..... But he also explicitly states his engagement in another dialogue ... some literary dependence without a clear dividing line. To have a ...

  11. Open Source Projects in Software Engineering Education: A Mapping Study

    Science.gov (United States)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  12. Literature study of source term research for PWRs

    International Nuclear Information System (INIS)

    Sponton, L.L.; Nilsson, Lars

    2001-04-01

    A literature survey has been carried out in support of ongoing source term calculations with the MELCOR code of some severe accident scenarios for the Swedish Ringhals 2 pressurised water reactor (PWR). Research in the field of severe accidents in power reactors and the source term for the subsequent release of radioisotopes intensified after the Harrisburg accident and has produced a large number of reports and papers. This survey was therefore limited to research concerning PWR-type reactors, with emphasis on papers related to MELCOR code development. A background is given, relating to some historic documents, and then more recent research after 1990 is reviewed. Of special interest is the ongoing Phebus programme, which is creating new and important results of benefit to the development and validation of, among others, the MELCOR code. It is concluded that source term calculations involve the simulation of many interacting complex physical phenomena, which results in large uncertainties. The research has, however, over the years led to considerable improvements. Thus the uncertainty in source term predictions has been reduced by one to two orders of magnitude, from the simpler codes of the early 1980s to the more realistic codes of today, like MELCOR.

  13. Study on a volume-production H- ion source

    International Nuclear Information System (INIS)

    Takama, S.

    1988-01-01

    H- ions formed by volume production are extracted from a multicusp ion source. By applying a large positive bias to the plasma electrode, the ratio I-/Ie becomes 1/20. An H- ion current of 0.4 mA is extracted from a 0.3 cm2 circular aperture at an arc current of 10 A. (author)

  14. Quasiballistic heat removal from small sources studied from first principles

    Science.gov (United States)

    Vermeersch, Bjorn; Mingo, Natalio

    2018-01-01

    Heat sources whose characteristic dimension R is comparable to phonon mean free paths display thermal resistances that exceed conventional diffusive predictions. This has direct implications for (opto)electronics thermal management and phonon spectroscopy. Theoretical analyses have so far limited themselves to particular experimental configurations. Here, we build upon the multidimensional Boltzmann transport equation (BTE) to derive universal expressions for the apparent conductivity suppression S(R) = κ_eff(R)/κ_bulk experienced by radially symmetric 2D and 3D sources. In striking analogy to cross-plane heat conduction in thin films, a distinct quasiballistic regime emerges between the ballistic (κ_eff ~ R) and diffusive (κ_eff ≈ κ_bulk) asymptotes that displays a logarithmic dependence κ_eff ~ ln(R) in single crystals and a fractional power dependence κ_eff ~ R^(2-α) in alloys (with α the Lévy superdiffusion exponent). Analytical solutions and Monte Carlo simulations for spherical and circular heat sources in Si, GaAs, Si0.99Ge0.01, and Si0.82Ge0.18, all carried out from first principles, confirm the predicted generic tendencies. Contrary to the thin-film case, common approximations like grey kinetic theory estimates κ_eff ≈ Σ_ω S_grey(ω) κ_ω and modified Fourier temperature curves perform relatively poorly. Up to threefold deviations from the BTE solutions for sub-100 nm sources underline the need for rigorous treatment of multidimensional nondiffusive transport.
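The regime hierarchy in this abstract can be illustrated numerically; the sketch below is a toy interpolation between the stated asymptotes, with placeholder crossover radii and a placeholder bulk conductivity, not the paper's first-principles BTE solution:

```python
import numpy as np

# Toy model of the apparent conductivity kappa_eff(R) for a radially
# symmetric heat source, illustrating the three regimes described above:
#   ballistic      (small R):  kappa_eff ~ R
#   quasiballistic (mid R):    kappa_eff ~ ln(R)   (single crystals)
#   diffusive      (large R):  kappa_eff -> kappa_bulk
# The crossover radii and prefactors are arbitrary illustrative choices,
# not values from the first-principles BTE solutions.

KAPPA_BULK = 150.0   # W/(m K), roughly bulk Si at room temperature
R_BALL = 50e-9       # onset of the quasiballistic regime (placeholder)
R_DIFF = 5e-6        # onset of the diffusive regime (placeholder)

def kappa_eff(R):
    if R <= R_BALL:
        # ballistic: linear in source size
        return KAPPA_BULK * (R / R_DIFF)
    elif R < R_DIFF:
        # quasiballistic: logarithmic interpolation between the asymptotes
        frac = np.log(R / R_BALL) / np.log(R_DIFF / R_BALL)
        k0 = KAPPA_BULK * (R_BALL / R_DIFF)
        return k0 + (KAPPA_BULK - k0) * frac
    else:
        # diffusive: conventional Fourier behaviour
        return KAPPA_BULK

for R in [10e-9, 100e-9, 1e-6, 10e-6]:
    print(f"R = {R:8.1e} m  ->  kappa_eff ~ {kappa_eff(R):7.2f} W/(m K)")
```

The piecewise form is continuous at both crossovers, so sweeping R reproduces the qualitative suppression curve the abstract describes.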

  15. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert

    2011-06-01

    This study presents the latest developments of an approach called ‘flash sourcing’, which provides information on the effects of an earthquake within minutes of its occurrence. Information is derived from an analysis of the traffic surges on the European–Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first to be informed of, and hence the first concerned by, an earthquake's occurrence. Flash sourcing maps the felt area and, at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information, and, beyond seismology, we consider what it can teach us about public responses when experiencing an earthquake. Future developments should improve the description of earthquake effects and potentially contribute to improving the efficiency of earthquake responses by filling the information gap after the occurrence of an earthquake.
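The surge-detection idea behind flash sourcing can be sketched as a simple baseline-plus-threshold test on per-minute hit counts; the window size, threshold factor, and traffic numbers below are hypothetical illustrations, not the EMSC's actual algorithm:

```python
import statistics

def detect_surge(hits_per_minute, window=30, k=4.0):
    """Flag the first minute whose hit count exceeds the baseline
    mean + k standard deviations, using the preceding `window`
    minutes as the baseline. Returns the index, or None."""
    for i in range(window, len(hits_per_minute)):
        baseline = hits_per_minute[i - window:i]
        mu = statistics.mean(baseline)
        sigma = statistics.pstdev(baseline)
        if hits_per_minute[i] > mu + k * max(sigma, 1.0):
            return i
    return None

# Synthetic traffic: steady background, then a sudden surge of
# eyewitness visits right after a felt earthquake.
traffic = [20, 22, 19, 21, 20] * 8 + [180, 240, 210]
print(detect_surge(traffic))  # -> 40, the first surge minute
```

In practice the same test would be run per geolocated region, so the set of surging regions approximates the felt area.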

  16. Analysis of flood inundation in ungauged basins based on multi-source remote sensing data.

    Science.gov (United States)

    Gao, Wei; Shen, Qiu; Zhou, Yuehua; Li, Xin

    2018-02-09

    Floods are among the most expensive natural hazards experienced in many places of the world and can result in heavy losses of life and economic damage. The objective of this study is to analyze flood inundation in ungauged basins by performing near-real-time detection of flood extent and depth based on multi-source remote sensing data. Via spatial distribution analysis of flood extent and depth in a time series, the inundation condition and the characteristics of the flood disaster can be reflected. The results show that multi-source remote sensing data can make up for the lack of hydrological data in ungauged basins, which is helpful for reconstructing the hydrological sequence; the combination of MODIS (moderate-resolution imaging spectroradiometer) surface reflectance products and the DFO (Dartmouth Flood Observatory) flood database can achieve macro-dynamic monitoring of flood inundation in ungauged basins, and the differencing of high-resolution optical and microwave images before and after floods can then be used to calculate flood extent and reflect spatial changes of inundation; the monitoring algorithm for flood depth, combining RS and GIS, is simple and can quickly calculate the depth from a known flood extent obtained from remote sensing images in ungauged basins. These results can provide effective help for the disaster relief work performed by government departments.
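The pre/post image differencing described above can be sketched with a water index; the NDWI threshold and the tiny reflectance arrays below are illustrative assumptions, not the study's actual processing chain:

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index: water pixels tend to NDWI > 0."""
    return (green - nir) / (green + nir + 1e-9)

def flood_extent(pre, post, threshold=0.0):
    """Label pixels that are water after the event but not before.
    `pre`/`post` are dicts of 'green' and 'nir' reflectance arrays."""
    water_pre = ndwi(pre["green"], pre["nir"]) > threshold
    water_post = ndwi(post["green"], post["nir"]) > threshold
    return water_post & ~water_pre

# Tiny synthetic scene: 1 permanent water pixel, 2 newly flooded pixels.
pre = {"green": np.array([0.30, 0.10, 0.10, 0.10]),
       "nir":   np.array([0.10, 0.30, 0.30, 0.30])}
post = {"green": np.array([0.30, 0.30, 0.30, 0.10]),
        "nir":   np.array([0.10, 0.10, 0.10, 0.30])}
mask = flood_extent(pre, post)
print(mask)        # -> [False  True  True False]
print(mask.sum())  # flooded-pixel count; area follows from pixel size
```

Permanent water is excluded by the `~water_pre` term, so the mask isolates the newly inundated area only.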

  17. Experimental analysis of a diffusion absorption refrigeration system using alternative energy sources

    International Nuclear Information System (INIS)

    Soezen, A.; Oezbas, E.

    2009-01-01

    The continuous-cycle absorption refrigeration device is widely used in domestic refrigerators and recreational vehicles. It is also used in year-round air conditioning of both homes and larger buildings. The unit consists of four main parts: the boiler, condenser, evaporator and absorber. When the unit operates on kerosene or gas, the heat is supplied by a burner; this element is fitted underneath the central tube. When operating on electricity, the heat is supplied by an element inserted in the pocket. No moving parts are employed. The operation of the refrigerating mechanism is based on Dalton's law. In this study, an experimental analysis was performed of a diffusion absorption refrigeration system (DARS) using alternative energy sources such as solar energy and liquefied petroleum gas (LPG). Two basic DAR cycles were set up and investigated: i) in the first cycle (DARS-1), the condensate is sub-cooled prior to the evaporator entrance by the coupled evaporator/gas heat exchanger, similar to that manufactured by Electrolux, Sweden; ii) in the second cycle (DARS-2), the condensate is not sub-cooled prior to the evaporator entrance and the gas heat exchanger is separated from the evaporator. (author)

  18. Evaluation of Collateral Source Characteristics With 3-Dimensional Analysis Using Micro-X-Ray Computed Tomography.

    Science.gov (United States)

    Arima, Yuichiro; Hokimoto, Seiji; Tabata, Noriaki; Nakagawa, Osamu; Oshima, Asahi; Matsumoto, Yosuke; Sato, Takahiro; Mukunoki, Toshifumi; Otani, Jun; Ishii, Masanobu; Uchikawa, Michie; Yamamoto, Eiichiro; Izumiya, Yasuhiro; Kaikita, Koichi; Ogawa, Hisao; Nishiyama, Koichi; Tsujita, Kenichi

    2018-03-23

    Collateral arteries provide an alternative blood supply and protect tissues from ischemic damage in patients with peripheral artery disease. However, the mechanism of collateral artery development is difficult to validate. Collateral arteries were visualized using micro-x-ray computed tomography. Developmental characteristics were assessed using confocal microscopy. We conducted a single-center, retrospective, observational study and assessed the dilatation of collateral arteries on ischemic sides. We quantified the vascular volume in both ischemic and nonischemic legs. A prominent increase in vascular volume was observed in the ischemic leg using a murine hind-limb ischemia model. We also performed qualitative assessment and confirmed that the inferior gluteal artery functioned as a major collateral source. Serial analysis of murine hind-limb vessel development revealed that the inferior gluteal artery was a remnant of the ischial artery, which emerged as a representative vessel on the dorsal side during hind-limb organogenesis. We retrospectively analyzed consecutive patients who were admitted for the diagnosis or treatment of peripheral artery disease. The diameter of the inferior gluteal artery on the ischemic side showed significant dilatation compared with that on the nonischemic side. Our findings indicate that an embryonic remnant artery can become a collateral source under ischemic conditions. Flow enhancement in the inferior gluteal artery might become a novel therapeutic approach for patients with peripheral artery disease. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  19. Source apportionment of PAH in Hamilton Harbour suspended sediments: comparison of two factor analysis methods.

    Science.gov (United States)

    Sofowote, Uwayemi M; McCarry, Brian E; Marvin, Christopher H

    2008-08-15

    A total of 26 suspended sediment samples collected over a 5-year period in Hamilton Harbour, Ontario, Canada and surrounding creeks were analyzed for a suite of polycyclic aromatic hydrocarbons and sulfur heterocycles. Hamilton Harbour sediments contain relatively high levels of polycyclic aromatic compounds and heavy metals due to emissions from industrial and mobile sources. Two receptor modeling methods using factor analyses were compared to determine the profiles and relative contributions of pollution sources to the harbor; these methods are principal component analyses (PCA) with multiple linear regression analysis (MLR) and positive matrix factorization (PMF). Both methods identified four factors and gave excellent correlation coefficients between predicted and measured levels of 25 aromatic compounds; both methods predicted similar contributions from coal tar/coal combustion sources to the harbor (19 and 26%, respectively). One PCA factor was identified as contributions from vehicular emissions (61%); PMF was able to differentiate vehicular emissions into two factors, one attributed to gasoline emissions sources (28%) and the other to diesel emissions sources (24%). Overall, PMF afforded better source identification than PCA with MLR. This work constitutes one of the few examples of the application of PMF to the source apportionment of sediments; the addition of sulfur heterocycles to the analyte list greatly aided in the source identification process.
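The PCA-with-MLR half of the comparison can be sketched on synthetic data: factor scores from a PCA are regressed against the total concentration, so each retained factor translates into a mass contribution. The source profiles and sample data below are invented for illustration, not the Hamilton Harbour measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "sediment" data: 26 samples, mixtures of two source profiles
# (stand-ins for e.g. coal tar and vehicular emissions); units arbitrary.
profile_a = np.array([5.0, 1.0, 0.5, 3.0])   # hypothetical PAH profile 1
profile_b = np.array([0.5, 4.0, 2.0, 1.0])   # hypothetical PAH profile 2
contrib = rng.uniform(0.5, 2.0, size=(26, 2))            # source strengths
X = contrib @ np.vstack([profile_a, profile_b])
X += rng.normal(scale=0.05, size=X.shape)                # measurement noise

# PCA via SVD on the standardized data matrix.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U[:, :2] * s[:2]                                # first two PCs

# MLR step: regress total PAH mass on the factor scores; the fitted
# coefficients translate each factor into a mass contribution.
total = X.sum(axis=1)
A = np.column_stack([np.ones(len(total)), scores])
coef, *_ = np.linalg.lstsq(A, total, rcond=None)
explained = A @ coef
print(f"variance of total mass explained: "
      f"{1 - np.var(total - explained) / np.var(total):.3f}")
```

PMF differs in that it factorizes the data matrix under non-negativity constraints, which is what lets it split physically distinct sources (here, gasoline vs diesel) that PCA merges into one component.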

  20. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    Background: Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria, and to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific or limited to a particular data format) and they typically accept only gene lists as input. Results: TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies whether segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps, showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. 
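TRAM's 'scaled quantile' method is described as a variant of quantile normalization; the textbook form of quantile normalization (not TRAM's exact implementation, and ignoring ties for brevity) can be sketched as:

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns (samples) of X so that every sample
    shares the same empirical distribution: each value is replaced by the
    mean of the values having the same within-sample rank."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank per column
    mean_by_rank = np.mean(np.sort(X, axis=0), axis=1)  # reference dist.
    return mean_by_rank[ranks]

# Genes x samples: three samples with shifted/scaled expression values.
X = np.array([[2.0,  4.0,  8.0],
              [1.0,  2.0,  4.0],
              [4.0,  8.0, 16.0],
              [3.0,  6.0, 12.0]])
Xn = quantile_normalize(X)
print(Xn)
```

After normalization every column has exactly the same set of values (the rank-wise means), which is what makes samples from different platforms comparable; TRAM's scaled variant additionally accounts for platforms measuring different numbers of genes.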
Biologically relevant chromosomal segments and gene

  1. Frequency spectrum analysis of 252Cf neutron source based on LabVIEW

    International Nuclear Information System (INIS)

    Mi Deling; Li Pengcheng

    2011-01-01

    The frequency spectrum analysis of a 252Cf neutron source is an extremely important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of neutron pulse series, this paper proposes a fast-correlation algorithm to improve the computational rate of the spectrum analysis system. Multi-core processor technology is employed, together with the multi-threaded programming techniques of LabVIEW, to construct a frequency spectrum analysis system for the 252Cf neutron source based on LabVIEW. It obtains not only the auto-correlation and cross-correlation results, but also the auto-power spectrum, cross-power spectrum and ratio of spectral densities. The results show that analysis tools based on LabVIEW improve the operating efficiency of the fast auto-correlation and cross-correlation code by about 25% to 35%, and also verify the feasibility of using LabVIEW for spectrum analysis. (authors)
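The gain from a fast-correlation algorithm comes from computing correlations in O(N log N) via the FFT instead of the O(N^2) direct sum; a minimal illustration on 0/1 pulse series (a generic sketch, not the paper's LabVIEW implementation) is:

```python
import numpy as np

def fft_crosscorr(x, y):
    """Circular cross-correlation via FFT: r[k] = sum_n x[n] * y[(n+k) % N].
    O(N log N) versus O(N^2) for the direct sum."""
    n = len(x)
    X = np.fft.rfft(x)
    Y = np.fft.rfft(y)
    return np.fft.irfft(np.conj(X) * Y, n)

rng = np.random.default_rng(1)
x = (rng.random(4096) < 0.05).astype(float)   # sparse 0/1 pulse train
y = np.roll(x, 7)                             # same train, delayed 7 bins

r = fft_crosscorr(x, y)
print(int(np.argmax(r)))  # recovered delay -> 7
```

The power spectra mentioned in the abstract follow from the same transforms: the auto-power spectrum is |X|^2 and the cross-power spectrum is conj(X)*Y, so the correlation and spectral outputs share one FFT pass.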

  2. Strategic planning as a competitive differential: A case study of the Sealed Sources Production Laboratory

    International Nuclear Information System (INIS)

    Vieira, Imário; Nascimento, Fernando C.; Calvo, Wilson A. Parejo

    2017-01-01

    Strategic planning has always been and continues to be one of the most important management tools for decision making. Amidst the uncertainties of the 21st century, public, private and third sector organizations are steadily struggling to improve their strategic plans by using more effective results management tools such as the BSC (Balanced Scorecard). Nuclear research institutes and research centers around the world have been making increasing use of these types of tools in their strategic planning and management. The objective of this article was to recommend the use of the BSC as a strategic tool for decision making for the Sealed Sources Production Laboratory located in the Radiation Technology Center, at the Nuclear and Energy Research Institute (IPEN/CNEN-SP), in Sao Paulo, Brazil. The methodology used in this academic article was a case study, which considered the object of the study, the Sealed Sources Production Laboratory, from January 2014 to August 2016. Among the main results obtained with this study are the improvement of the information flow and a proposal to change the periodicity of the analysis of results, among others. In view of the expected results, it was possible to conclude that this study may be of value to the Sealed Sources Production Laboratory for Industrial Radiography and Industrial Process Control and also to other research centers, as it provides an additional management support tool. (author)

  3. Strategic planning as a competitive differential: A case study of the Sealed Sources Production Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Imário; Nascimento, Fernando C.; Calvo, Wilson A. Parejo, E-mail: imariovieira@yahoo.com, E-mail: wapcalvo@ipen.br, E-mail: fcodelo@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Faculdade SENAI de Tecnologia Ambiental, Sao Bernardo do Campo, SP (Brazil)

    2017-11-01

    Strategic planning has always been and continues to be one of the most important management tools for decision making. Amidst the uncertainties of the 21st century, public, private and third sector organizations are steadily struggling to improve their strategic plans by using more effective results management tools such as the BSC (Balanced Scorecard). Nuclear research institutes and research centers around the world have been making increasing use of these types of tools in their strategic planning and management. The objective of this article was to recommend the use of the BSC as a strategic tool for decision making for the Sealed Sources Production Laboratory located in the Radiation Technology Center, at the Nuclear and Energy Research Institute (IPEN/CNEN-SP), in Sao Paulo, Brazil. The methodology used in this academic article was a case study, which considered the object of the study, the Sealed Sources Production Laboratory, from January 2014 to August 2016. Among the main results obtained with this study are the improvement of the information flow and a proposal to change the periodicity of the analysis of results, among others. In view of the expected results, it was possible to conclude that this study may be of value to the Sealed Sources Production Laboratory for Industrial Radiography and Industrial Process Control and also to other research centers, as it provides an additional management support tool. (author)

  4. Molecular line study of massive star-forming regions from the Red MSX Source survey

    Science.gov (United States)

    Yu, Naiping; Wang, Jun-Jie

    2014-05-01

    In this paper, we have selected a sample of massive star-forming regions from the Red MSX Source survey, in order to study star formation activities (mainly outflow and inflow signatures). We have focused on three molecular lines from the Millimeter Astronomy Legacy Team Survey at 90 GHz: HCO+(1-0), H13CO+(1-0) and SiO(2-1). According to previous observations, our sources can be divided into two groups: nine massive young stellar object candidates (radio-quiet) and 10 H II regions (which have spherical or unresolved radio emissions). Outflow activities have been found in 11 sources, while only three show inflow signatures. The high outflow detection rate means that outflows are common in massive star-forming regions. The inflow detection rate was relatively low; we suggest that this was because of the beam dilution of the telescope. All three inflow candidates have outflow(s). The outward radiation and thermal pressure from the central massive star(s) do not seem to be strong enough to halt accretion in G345.0034-00.2240. Our simple model of G318.9480-00.1969 shows that it has an infall velocity of about 1.8 km s-1. The spectral energy distribution analysis confirms that our sources are massive and intermediate-mass star-forming regions.

  5. Gaussian process based independent analysis for temporal source separation in fMRI

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole

    2017-01-01

    Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain, and opens up for analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning with restrictive assumptions in the regression set-up, restricts...... the exploratory nature of the analysis. Fully unsupervised independent component analysis (ICA) algorithms, on the other hand, can struggle to detect clear classifiable components on single-subject data. We attribute this shortcoming to inadequate modeling of the fMRI source signals by failing to incorporate its...

  6. Market Analysis and Consumer Impacts Source Document. Part II. Review of Motor Vehicle Market and Consumer Expenditures on Motor Vehicle Transportation

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part II consists of studies and review on: motor vehicle sales trends; motor vehicle fleet life and fleet composition; car buying patterns of the busi...

  7. Study on induced radioactivity of China Spallation Neutron Source

    International Nuclear Information System (INIS)

    Wu Qingbiao; Wang Qingbin; Wu Jingmin; Ma Zhongjian

    2011-01-01

    China Spallation Neutron Source (CSNS) is the first high-energy intense proton accelerator planned to be constructed in China during the State Eleventh Five-Year Plan period, and its induced radioactivity is very important for occupational disease hazard assessment and environmental impact assessment. Adopting the FLUKA code, the authors have constructed a cylinder-tunnel geometric model and a line-source sampling physical model, deduced proper formulas to calculate air activation, and analyzed various issues with regard to the activation of different tunnel parts. The results show that the environmental impact resulting from induced activation is negligible, whereas the residual radiation in the tunnels has a great influence on maintenance personnel, so strict measures should be adopted. (authors)

  8. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG to an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, that is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.
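The core idea, embedding class labels into an otherwise unsupervised factor model so the learned factors become discriminative, can be loosely illustrated with label-augmented PCA on synthetic data. This is a deliberately simplified stand-in, not the paper's supervised factor analysis algorithm, and all data below are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Label-augmented dimensionality reduction: append scaled one-hot class
# labels to the features before the decomposition, so the learned factors
# are steered toward class-discriminative directions.
n_per_class, n_feat, n_classes = 40, 10, 4
means = rng.normal(scale=2.0, size=(n_classes, n_feat))  # class centroids
X = np.vstack([rng.normal(means[c], 1.0, (n_per_class, n_feat))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

onehot = np.eye(n_classes)[y] * 3.0          # scaled label columns
Z = np.hstack([X - X.mean(0), onehot - onehot.mean(0)])

U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U[:, :n_classes - 1] * s[:n_classes - 1]   # low-dim features

# Nearest-centroid classification in the reduced space (train = test here,
# purely to show that the factors carry class information).
centroids = np.array([scores[y == c].mean(0) for c in range(n_classes)])
pred = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(-1), axis=1)
print(f"training accuracy in factor space: {(pred == y).mean():.2f}")
```

In the paper's setting the features would be band-power or slow-cortical-potential measures from source dipoles in BA4a/BA4p/BA6 rather than raw synthetic columns, and a proper factor analysis model replaces the SVD.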

  9. Study on hybrid heat source overlap welding of magnesium alloy AZ31B

    International Nuclear Information System (INIS)

    Liang, G.L.; Zhou, G.; Yuan, S.Q.

    2009-01-01

    The magnesium alloy AZ31B was overlap welded by hybrid welding (laser-tungsten inert gas arc). According to the hybrid welding interaction principle, a new heat source model, the hybrid welding heat source model, was developed with finite element analysis. At the same time, using a high-temperature metallographic microscope, the macro-appearance and microstructure characteristics of the joint after hybrid overlap welding were studied. The results indicate that hybrid welding was superior to single tungsten inert gas welding or laser welding in improving the utilization efficiency of the arc and enhancing the absorptivity of the material to laser energy. Due to the energy characteristics of hybrid overlap welding, the macro-appearance of the joint was cup-shaped; the top weld showed the hybrid welding microstructure, while the lower weld showed the typical laser welding microstructure

  10. Study on hybrid heat source overlap welding of magnesium alloy AZ31B

    Energy Technology Data Exchange (ETDEWEB)

    Liang, G.L. [Department of Electromechanical Engineering, Tangshan College, Tangshan 063000 (China)], E-mail: guoliliang@sohu.com; Zhou, G. [School of Material Science and Engineering, Harbin Institute of Technology, Harbin 150001 (China); Yuan, S.Q. [Department of Electromechanical Engineering, Tangshan College, Tangshan 063000 (China)

    2009-01-15

    The magnesium alloy AZ31B was overlap welded by hybrid welding (laser-tungsten inert gas arc). According to the hybrid welding interaction principle, a new heat source model, the hybrid welding heat source model, was developed with finite element analysis. At the same time, using a high-temperature metallographic microscope, the macro-appearance and microstructure characteristics of the joint after hybrid overlap welding were studied. The results indicate that hybrid welding was superior to single tungsten inert gas welding or laser welding in improving the utilization efficiency of the arc and enhancing the absorptivity of the material to laser energy. Due to the energy characteristics of hybrid overlap welding, the macro-appearance of the joint was cup-shaped; the top weld showed the hybrid welding microstructure, while the lower weld showed the typical laser welding microstructure.

  11. A Feasibility Study on the Inspection System Development of Underground Cavities Using Neutron Source

    International Nuclear Information System (INIS)

    Yim, Che Wook; Kim, Song Hyun; Kim, Do Hyun; Shin, Chang Ho

    2015-01-01

    The detection efficiency of the gravimetry method is significantly low; therefore, it requires a long surveying time. The magnetometry method detects cavities from the magnitude of the magnetic field; however, it is problematic in urban areas due to pipes and electrical installations. GPR is a method that uses high-frequency electromagnetic waves. This method is widely used for inspection; however, the detection accuracy for sinkholes can be low in specific soil types. In this study, to verify the feasibility of a neutron source-based inspection system for cavity detection, Monte Carlo simulations were performed using a neutron source. The analysis shows that detection of a cavity under the given conditions is possible when the diameter of the cavity is over 100 cm. However, the detection efficiency can be increased sufficiently if optimization strategies for the inspection are developed. It is also expected that the proposed inspection method can detect the expected locations of cavities

  12. Study on a groundwater source heat pump cooling system in solar greenhouse

    Energy Technology Data Exchange (ETDEWEB)

    Chai, Lilong; Ma, Chengwei [China Agricultural Univ., Beijing (China). Coll. of Water Conservancy and Civil Engineering. Dept. of Agricultural Structure and Bio-environmental Engineering], E-mail: macwbs@cau.edu.cn

    2008-07-01

    This study aims at exploiting the potential of ground source heat pump (GSHP) technology for cooling agricultural greenhouses, and at advocating the use of renewable and clean energy in agriculture. GSHP systems provide heating, cooling and dehumidification, and GSHP is one of the fastest-growing renewable-energy air-conditioning technologies of recent years. The authors carried out experiments on a ground source heat pump system cooling a greenhouse in the Beijing region during the summer of 2007, and analyzed the energy efficiency of the system using the coefficient of performance (COP). According to the data collected during Aug. 13-18, 2007, the coefficient of performance of the GSHP system (COP{sub sys}) reached 3.15 on average during the test. (author)
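The reported COP{sub sys} is simply the ratio of heat removed to electrical energy consumed; a minimal calculation, using hypothetical daily totals chosen to reproduce the reported 3.15 average (not the paper's measured data), looks like:

```python
def cop(cooling_output_kwh, electric_input_kwh):
    """Coefficient of performance: useful cooling delivered per unit of
    electrical energy consumed by the compressor, pumps and fans."""
    return cooling_output_kwh / electric_input_kwh

# Hypothetical daily totals for a greenhouse GSHP installation.
q_cool = 630.0   # kWh of heat extracted from the greenhouse air
w_elec = 200.0   # kWh of electricity consumed
print(f"COP_sys = {cop(q_cool, w_elec):.2f}")  # -> COP_sys = 3.15
```

A system-level COP above 3 means the greenhouse receives more than three units of cooling for each unit of electricity, which is the efficiency argument the study makes for GSHP over conventional air conditioning.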

  13. [The use of personal sources for the study of emigration from Galicia: present state and perspectives].

    Science.gov (United States)

    Vazquez Gonzalez, A

    1996-08-01

    "Spanish sources for the study of emigration are sparse and fragmentary.... Mortgage documents for the payment of ocean transportation enable us to appreciate the spreading action of shipping agents; official listings of draft dodgers reveal that in general the River Plate was a favorite destination, rather than Cuba or Brazil. People from Galicia emigrated from rural origins to urban destinations in America; the analysis of place of birth of emigrants residing in A Coruna at the time of emigration show that there was also, in some cases, a first stage of rural-urban migration within Galicia. The general picture of emigration from Galicia is built [up] through the combination of the existing sources in Spain." (EXCERPT)

  14. A Feasibility Study on the Inspection System Development of Underground Cavities Using Neutron Source

    Energy Technology Data Exchange (ETDEWEB)

    Yim, Che Wook; Kim, Song Hyun; Kim, Do Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The detection efficiency of the gravimetry method is significantly low; therefore, it requires a long surveying time. The magnetometry method detects cavities from the magnitude of the magnetic field; however, it is problematic in urban areas due to pipes and electrical installations. GPR is a method that uses high-frequency electromagnetic waves. This method is widely used for inspection; however, the detection accuracy for sinkholes can be low in specific soil types. In this study, to verify the feasibility of a neutron source-based inspection system for cavity detection, Monte Carlo simulations were performed using a neutron source. The analysis shows that detection of a cavity under the given conditions is possible when the diameter of the cavity is over 100 cm. However, the detection efficiency can be increased sufficiently if optimization strategies for the inspection are developed. It is also expected that the proposed inspection method can detect the expected locations of cavities.

  15. Do knowledge, knowledge sources and reasoning skills affect the accuracy of nursing diagnoses? a randomised study.

    Science.gov (United States)

    Paans, Wolter; Sermeus, Walter; Nieweg, Roos Mb; Krijnen, Wim P; van der Schans, Cees P

    2012-08-01

    This paper reports a study about the effect of knowledge sources, such as handbooks, an assessment format and a predefined record structure for diagnostic documentation, as well as the influence of knowledge, disposition toward critical thinking and reasoning skills, on the accuracy of nursing diagnoses.Knowledge sources can support nurses in deriving diagnoses. A nurse's disposition toward critical thinking and reasoning skills is also thought to influence the accuracy of his or her nursing diagnoses. A randomised factorial design was used in 2008-2009 to determine the effect of knowledge sources. We used the following instruments to assess the influence of ready knowledge, disposition, and reasoning skills on the accuracy of di