WorldWideScience

Sample records for multiple gravity sources

  1. The sources of atmospheric gravity waves

    International Nuclear Information System (INIS)

    Nagpal, O.P.

    1979-01-01

    The gravity wave theory has been very successful in the interpretation of various upper atmospheric phenomena. This article offers a review of the present state of knowledge about the various sources of atmospheric gravity waves, particularly those which give rise to different types of travelling ionospheric disturbance. Some specific case studies are discussed. (author)

  2. Neutron source multiplication method

    International Nuclear Information System (INIS)

    Clayton, E.D.

    1985-01-01

    Extensive use has been made of neutron source multiplication in thousands of measurements of critical masses and configurations, and in subcritical neutron-multiplication measurements in situ that provide data for criticality prevention and control in nuclear materials operations. There is continuing interest in developing reliable methods for monitoring the reactivity, or k_eff, of plant operations, but the required measurements are difficult to carry out and interpret for the far-subcritical configurations usually encountered. The relationship between neutron multiplication and reactivity is briefly discussed, and data are presented to illustrate problems associated with the absolute measurement of neutron multiplication and reactivity in subcritical systems. A number of inverse-multiplication curves have been selected from a variety of experiments, showing the variations observed in multiplication during the course of critical and subcritical experiments where different methods of reactivity addition and different neutron source and detector positions were used. Concern is raised regarding the meaning and interpretation of k_eff as measured in a far-subcritical system, because of the modal effects and spectrum differences that exist between the subcritical and critical systems. Because of this, calculating k_eff identically equal to unity for the critical assembly, although necessary, may not be sufficient to assure safety margins in calculations pertaining to far-subcritical systems. Further study is needed on the interpretation and meaning of k_eff in the far-subcritical system.
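
    The inverse-multiplication curves mentioned above underpin the standard approach-to-critical procedure: the detector count rate C grows roughly as C0/(1 - k), so 1/M = C0/C falls toward zero as criticality is approached, and extrapolating it to zero predicts the critical configuration. A minimal sketch follows; all masses and count rates are invented for illustration, not data from the article.

```python
import numpy as np

# Illustrative 1/M (inverse multiplication) extrapolation, as used in
# approach-to-critical experiments. All numbers below are invented for
# illustration; they are not data from the article.
masses = np.array([0.0, 2.0, 4.0, 6.0, 8.0])            # fissile mass added (kg)
counts = np.array([100.0, 125.0, 167.0, 250.0, 500.0])  # detector count rate (1/s)

# Inverse multiplication relative to the source-only count rate.
inv_m = counts[0] / counts

# Extrapolate the last few points linearly to 1/M = 0: the intercept with the
# mass axis estimates the critical mass.
slope, intercept = np.polyfit(masses[-3:], inv_m[-3:], 1)
critical_mass_estimate = -intercept / slope
```

    As the abstract cautions, such extrapolations depend on source and detector placement, so the estimate is a guide rather than a measurement of k_eff.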

  3. Laser Source for Atomic Gravity Wave Detector

    Data.gov (United States)

    National Aeronautics and Space Administration — The Atom Interferometry (AI) Technology for Gravity Wave Measurements demonstrates new matter wave Interferometric sensor technology for precise detection and...

  4. Plume rise from multiple sources

    International Nuclear Information System (INIS)

    Briggs, G.A.

    1975-01-01

    A simple enhancement factor for plume rise from multiple sources is proposed and tested against plume-rise observations. For bent-over buoyant plumes, this results in the recommendation that the multiple-source rise be calculated as [(N + S)/(1 + S)]^(1/3) times the single-source rise Δh1, where N is the number of sources, S = 6 [W/(N^(1/3) Δh1)]^(3/2), and W is the total width of the source configuration. For calm conditions a crude but simple method is suggested for predicting the height of plume merger and subsequent behavior, based on the geometry and velocity variations of a single buoyant plume. Finally, it is suggested that large clusters of buoyant sources might occasionally give rise to concentrated vortices, either within the source configuration or just downwind of it.
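
    The enhancement factor above is easy to evaluate directly. The sketch below implements the formula as stated in the abstract; the stack geometry and single-source rise are invented for illustration.

```python
# Briggs' multiple-source enhancement for bent-over buoyant plumes, as stated
# in the abstract. The example stack geometry below is hypothetical.
def enhancement_factor(n_sources, total_width, dh1):
    """Return the factor by which the single-source rise dh1 is enhanced.

    n_sources   -- number of sources, N
    total_width -- total width W of the source configuration (same units as dh1)
    dh1         -- single-source plume rise
    """
    s = 6.0 * (total_width / (n_sources ** (1.0 / 3.0) * dh1)) ** 1.5
    return ((n_sources + s) / (1.0 + s)) ** (1.0 / 3.0)

# Hypothetical example: 4 identical stacks spanning 50 m, single-source rise 200 m.
factor = enhancement_factor(4, 50.0, 200.0)
multiple_source_rise = factor * 200.0
```

    Note the limiting behavior: as the sources crowd together (W → 0) the factor tends to N^(1/3), and as they spread far apart (W large) it tends to 1, i.e. no enhancement.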

  5. Gravity- and non-gravity-mediated couplings in multiple-field inflation

    International Nuclear Information System (INIS)

    Bernardeau, Francis

    2010-01-01

    Mechanisms for the generation of primordial non-Gaussian metric fluctuations in the context of multiple-field inflation are reviewed. As long as kinetic terms remain canonical, it appears that the nonlinear couplings inducing non-Gaussianities can be split into two types. The extension of the one-field results to multiple degrees of freedom leads to gravity-mediated couplings that are ubiquitous but generally modest. Multiple-field inflation offers, however, the possibility of generating non-gravity-mediated couplings in isocurvature directions that can in turn induce large non-Gaussianities in the metric fluctuations. The robustness of the predictions of such models is then examined in view of a case study derived from a high-energy physics construction.

  6. pp waves of conformal gravity with self-interacting source

    International Nuclear Information System (INIS)

    Ayon-Beato, Eloy; Hassaine, Mokhtar

    2005-01-01

    Recently, Deser, Jackiw and Pi have shown that three-dimensional conformal gravity with a source given by a conformally coupled scalar field admits pp wave solutions. In this paper, we consider this model with a self-interacting potential preserving the conformal structure. A pp wave geometry is also supported by this system, and we show that this model is equivalent to topologically massive gravity with a cosmological constant whose value is given in terms of the potential strength.

  7. Imaging multipole gravity anomaly sources by 3D probability tomography

    International Nuclear Information System (INIS)

    Alaia, Raffaele; Patella, Domenico; Mauriello, Paolo

    2009-01-01

    We present a generalized theory of probability tomography applied to the gravity method, assuming that any Bouguer anomaly data set can be caused by a discrete number of monopoles, dipoles, quadrupoles and octopoles. These elementary sources are used to characterize, in as much detail as possible and without any a priori assumptions, the shape and position of the most probable minimum structure of the gravity sources compatible with the observed data set, by picking out the locations of their centres and of peculiar points of their boundaries related to faces, edges and vertices. A few synthetic examples using simple geometries are discussed in order to demonstrate the notably enhanced resolution power of the new approach, compared with a previous formulation that used only monopoles and dipoles. A field example related to a gravity survey carried out in the volcanic area of Mount Etna (Sicily, Italy) is presented, aimed at imaging the geometry of the minimum gravity structure down to a depth of 8 km b.s.l.

  8. Lattice Boltzmann Simulation of Multiple Bubbles Motion under Gravity

    Directory of Open Access Journals (Sweden)

    Deming Nie

    2015-01-01

    The motion of multiple bubbles under gravity in two dimensions is numerically studied through the lattice Boltzmann method for Eotvos numbers ranging from 1 to 12. Two kinds of initial arrangement are considered: vertical and horizontal. In both cases the effects of the Eotvos number on bubble coalescence and rising velocity are investigated. For the vertical arrangement, the coalescence pattern is similar across this range of Eotvos numbers: the first coalescence always takes place between the two uppermost bubbles, and the last always between the coalesced bubble and the bottommost bubble. For four bubbles in a horizontal arrangement, the outermost bubbles travel into the wake of the middle bubbles in all cases, which allows the bubbles to coalesce. The coalescence pattern is more complex for the case of eight bubbles, and depends strongly on the Eotvos number.

  9. Laser Source for Atomic Gravity Wave Detector Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Atom Interferometry (AI) Technology for Gravity Wave Measurements demonstrates new matter wave Interferometric sensor technology for precise detection and...

  10. Land Streamer Surveying Using Multiple Sources

    KAUST Repository

    Mahmoud, Sherif

    2014-12-11

    Various examples are provided for land streamer seismic surveying using multiple sources. In one example, among others, a method includes disposing a land streamer in-line with first and second shot sources. The first shot source is at a first source location adjacent to a proximal end of the land streamer and the second shot source is at a second source location separated by a fixed length corresponding to a length of the land streamer. Shot gathers can be obtained when the shot sources are fired. In another example, a system includes a land streamer including a plurality of receivers, a first shot source located adjacent to the proximal end of the land streamer, and a second shot source located in-line with the land streamer and the first shot source. The second shot source is separated from the first shot source by a fixed overall length corresponding to the land streamer.

  11. Gravity

    CERN Document Server

    Gamow, George

    2003-01-01

    A distinguished physicist and teacher, George Gamow also possessed a special gift for making the intricacies of science accessible to a wide audience. In Gravity, he takes an enlightening look at three of the towering figures of science who unlocked many of the mysteries behind the laws of physics: Galileo, the first to take a close look at the process of free and restricted fall; Newton, originator of the concept of gravity as a universal force; and Einstein, who proposed that gravity is no more than the curvature of the four-dimensional space-time continuum.Graced with the author's own draw

  12. Low scale gravity as the source of neutrino masses?

    Energy Technology Data Exchange (ETDEWEB)

    Berezinsky, Veniamin [INFN, Laboratori Nazionali del Gran Sasso, I-67010 Assergi, AQ (Italy); Narayan, Mohan [INFN, Laboratori Nazionali del Gran Sasso, I-67010 Assergi, AQ (Italy); Vissani, Francesco [INFN, Laboratori Nazionali del Gran Sasso, I-67010 Assergi, AQ (Italy)

    2005-04-01

    We address the question of whether low-scale gravity alone can generate the neutrino mass matrix needed to accommodate the observed phenomenology. In low-scale gravity the neutrino mass matrix in the flavor basis is characterized by one parameter (the gravity scale M_X) and by an exact or approximate flavor blindness (namely, all elements of the mass matrix are of comparable size). Neutrino masses and mixings are consistent with the observational data for certain values of the matrix elements, but only when the mass spectrum is inverted or degenerate. For the latter type of spectra, the parameter M_ee probed in double beta decay experiments and the mass parameter probed by cosmology are close to existing upper limits.

  13. Low scale gravity as the source of neutrino masses?

    International Nuclear Information System (INIS)

    Berezinsky, Veniamin; Narayan, Mohan; Vissani, Francesco

    2005-01-01

    We address the question of whether low-scale gravity alone can generate the neutrino mass matrix needed to accommodate the observed phenomenology. In low-scale gravity the neutrino mass matrix in the flavor basis is characterized by one parameter (the gravity scale M_X) and by an exact or approximate flavor blindness (namely, all elements of the mass matrix are of comparable size). Neutrino masses and mixings are consistent with the observational data for certain values of the matrix elements, but only when the mass spectrum is inverted or degenerate. For the latter type of spectra, the parameter M_ee probed in double beta decay experiments and the mass parameter probed by cosmology are close to existing upper limits.

  14. gravity

    Indian Academy of Sciences (India)

    We study the cosmological dynamics of R^p exp(λR) gravity theory in the metric formalism, using a dynamical systems approach. Considering higher-dimensional FRW geometries in the case of an imperfect fluid which has two different scale factors in the normal and extra dimensions, we find the exact solutions, and study its ...

  15. Preliminary trajectory design for a solar polar observatory using SEP and multiple gravity assists

    NARCIS (Netherlands)

    Corpaccioli, L.; Noomen, R.; De Smet, S.; Parker, J.S.; Herman, J.F.C.

    2015-01-01

    Satellite solar observatories have always been of central importance to heliophysics; while there have been numerous such missions, the solar poles have been extremely under-observed. This paper proposes to use low-thrust as well as multiple gravity assists to reach the enormous energies required

  16. An evaluation of gravity waves and gravity wave sources in the Southern Hemisphere in a 7 km global climate simulation.

    Science.gov (United States)

    Holt, L A; Alexander, M J; Coy, L; Liu, C; Molod, A; Putman, W; Pawson, S

    2017-07-01

    In this study, gravity waves (GWs) in the high-resolution GEOS-5 Nature Run are first evaluated with respect to satellite and other model results. Southern Hemisphere winter sources of non-orographic GWs in the model are then investigated by linking measures of tropospheric non-orographic gravity wave generation tied to precipitation and frontogenesis with absolute gravity wave momentum flux in the lower stratosphere. Finally, non-orographic GW momentum flux is compared to orographic gravity wave momentum flux and to previous estimates. The results show that the global patterns in GW amplitude, horizontal wavelength, and propagation direction are realistic compared to observations. However, as in other global models, the amplitudes are weaker and the horizontal wavelengths longer than observed. The global patterns in absolute GW momentum flux also agree well with previous model and observational estimates. The evaluation of model non-orographic GW sources in the Southern Hemisphere winter shows that strong intermittent precipitation (greater than 10 mm h^-1) is associated with GW momentum flux over the South Pacific, whereas frontogenesis and less intermittent, lower precipitation rates (less than 10 mm h^-1) are associated with GW momentum flux near 60°S. In the model, orographic GWs contribute almost exclusively to a peak in zonal mean momentum flux between 70 and 75°S, while non-orographic waves dominate at 60°S, and non-orographic GWs contribute a third of the peak in zonal mean momentum flux between 25 and 30°S.

  17. Thermionic detector with multiple layered ionization source

    International Nuclear Information System (INIS)

    Patterson, P. L.

    1985-01-01

    Method and apparatus for analyzing specific chemical substances in a gaseous environment comprises a thermionic source formed of multiple layers of ceramic material composition, an electrical current instrumentality for heating the thermionic source to operating temperatures in the range of 100 °C to 1000 °C, an instrumentality for exposing the surface of the thermionic source to contact with the specific chemical substances for the purpose of forming gas-phase ionization of the substances by a process of electrical charge emission from the surface, a collector electrode disposed adjacent to the thermionic source, an instrumentality for biasing the thermionic source at an electrical potential which causes the gas-phase ions to move toward the collector, and an instrumentality for measuring the ion current arriving at the collector. The thermionic source is constructed of a metallic heater element molded inside a sub-layer of hardened ceramic cement material impregnated with a metallic compound additive which is non-corrosive to the heater element during operation. The sub-layer is further covered by a surface layer formed of hardened ceramic cement material impregnated with an alkali metal compound in a manner that eliminates corrosive contact of the alkali compounds with the heater element. The sub-layer further protects the heater element from contact with gas environments which may be corrosive. The specific ionization of different chemical substances is varied over a wide range by changing the composition and temperature of the thermionic source, and by changing the composition of the gas environment.

  18. Binaural Processing of Multiple Sound Sources

    Science.gov (United States)

    2016-08-18

    AFRL-AFOSR-VA-TR-2016-0298, Binaural Processing of Multiple Sound Sources. William Yost, Arizona State University, 660 S Mill Ave Ste 312, Tempe, AZ 85281. Final performance report covering 15 Jul 2012 to 14 Jul 2016. Subject terms: binaural hearing, sound localization, interaural signal.

  19. Determining the depth of certain gravity sources without a priori specification of their structural index

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian

    2015-11-01

    We have developed a new method for the interpretation of gravity tensor data based on the generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data. We take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori, generalizing the Tilt-depth method for depth estimation for different types of source body. The new technique uses only the three vertical tensor components of the full gravity tensor, observed or calculated at different height planes, to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes data from different upward-continuation heights, which effectively reduces the influence of noise. Theoretical simulations of gravity source models with and without noise illustrate the ability of the method to provide source depth information, and demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method to the gravity data acquired over the Humble salt dome in the USA as an example. The results show a good correspondence with previous drilling and seismic interpretation results.
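
    To illustrate the kind of quantity the Tilt-depth family of methods works with, the following sketch evaluates the tilt angle of the vertical gravity component for a buried point mass (monopole) from analytic tensor components along a profile. The setup and names are our own assumptions; the paper's generalized method, which avoids specifying the structural index, is not reproduced here. For a monopole, the tilt angle changes sign at a horizontal distance of √2 times the source depth, the sort of geometric relation that depth estimates exploit.

```python
import numpy as np

# Tilt angle of g_z for a buried point mass, from analytic tensor components.
# Illustrative only: G*M, depth, and the profile are hypothetical values.
G_M = 1.0          # G times mass, arbitrary units
depth = 10.0       # source depth below the observation plane

x = np.linspace(-40.0, 40.0, 801)          # profile along y = 0
r = np.sqrt(x ** 2 + depth ** 2)

T_zz = G_M * (r ** 2 - 3.0 * depth ** 2) / r ** 5   # d(g_z)/dz
T_zx = -3.0 * G_M * depth * x / r ** 5              # d(g_z)/dx
tilt = np.arctan2(T_zz, np.abs(T_zx))               # tilt angle of g_z

# For a monopole the tilt crosses zero at horizontal distance sqrt(2) * depth.
zero_idx = np.argmin(np.abs(tilt[x > 0]))
zero_distance = x[x > 0][zero_idx]
```

    Directly above the source the horizontal gradient vanishes and the tilt saturates at ±90°, while the zero crossing sits near √2 × depth; real data would require the noise-handling and structural-index generalization the paper develops.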

  20. Seasonal variation and sources of atmospheric gravity waves in the Antarctic

    Directory of Open Access Journals (Sweden)

    Kaoru Sato

    2010-12-01

    Over the last ten years, our knowledge of gravity waves in the Antarctic has improved significantly through numerous studies using balloon and satellite observations and high-resolution model simulations. In this report, we introduce results from two studies performed as part of the NIPR project "Integrated analysis of the material circulation in the Antarctic atmosphere-cryosphere-ocean" (2004-2009), namely Yoshiki et al. (2004) and Sato and Yoshiki (2008). These two studies focused on the seasonal variation and sources of gravity waves in the Antarctic, because horizontal wavelengths and phase velocities depend largely on the wave sources. The former study used original high-resolution data from operational radiosonde observations at Syowa Station. In the lowermost stratosphere, gravity waves do not exhibit a characteristic seasonal variation; instead, the wave energy is intensified when lower-latitude air intrudes into the area near Syowa Station in the upper troposphere. This intrusion is associated with blocking events or developed synoptic-scale waves. In the lower and middle stratosphere, the gravity wave energy is maximized in spring and is particularly intensified when the axis of the polar night jet approaches Syowa Station. The latter study is based on intensive radiosonde observation campaigns performed in 2002 at Syowa Station as an activity of JARE-43. Gravity wave propagation was statistically examined using two-dimensional (vertical wavenumber versus frequency) spectra in each season. It was shown that gravity waves are radiated upward and downward from an unbalanced region of the polar night jet. This feature is consistent with the gravity-wave-resolving GCM simulation.

  1. Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)

    Science.gov (United States)

    Li, L.; Wu, Y.

    2017-12-01

    The gravity and magnetic fields are potential fields, so their inversion is inherently non-unique. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to identify homologous gravity and magnetic anomalies and reduce this ambiguity. The traditional combined analysis uses a linear regression of the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope and intercept. In this calculation, due to the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization, and homologous gravity and magnetic anomalies can then appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed into the pseudomagnetic tensor matrix for magnetization along the geomagnetic field direction under the homologous condition. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and a linear regression analysis is carried out. The calculated correlation coefficient, slope and intercept indicate the homology level, Poisson's ratio and the distribution of remanence, respectively. We test the approach on a synthetic model under complex magnetization; the results show that it can still distinguish a common source under strong remanence and establish Poisson's ratio. Finally, the approach is applied to field data from China, demonstrating its feasibility.

  2. Locating the Tohoku-Oki 2011 tsunami source using acoustic-gravity waves

    OpenAIRE

    Andriamiranto Raveloson; Rainer Kind; Xiaohui Yuan; L. Cerana

    2012-01-01

    The giant Tohoku-Oki earthquake of 11 March 2011 offshore Japan generated not only tsunami waves in the ocean but also infrasound (or acoustic-gravity) waves in the atmosphere. We identified ultra-long-period signals (>500 s) in the recordings of infrasound stations in northeast Asia, the northwest Pacific, and Alaska. Their source was found close to the earthquake epicenter. Therefore, we conclude that in general, infrasound observations after a large offshore earthquake are evidence th...

  3. Modeling Volcanic Eruption Parameters by Near-Source Internal Gravity Waves.

    Science.gov (United States)

    Ripepe, M; Barfucci, G; De Angelis, S; Delle Donne, D; Lacanna, G; Marchetti, E

    2016-11-10

    Volcanic explosions release large amounts of hot gas and ash into the atmosphere, forming plumes that rise several kilometers above eruptive vents and can pose serious risks to human health and aviation even at distances of several thousand kilometers from the volcanic source. However, even the most sophisticated atmospheric and eruptive plume dynamics models require input parameters, such as the duration of the ejection phase and the total mass erupted, to constrain the quantity of ash dispersed in the atmosphere and to evaluate the related hazard efficiently. The sudden ejection of this large quantity of ash can perturb the equilibrium of the whole atmosphere, triggering oscillations well below the frequencies of acoustic waves, down to the much longer periods typical of gravity waves. We show that atmospheric gravity oscillations induced by volcanic eruptions and recorded by pressure sensors can be modeled as a compact source representing the rate of erupted volcanic mass. We demonstrate the feasibility of using gravity waves to derive eruption source parameters, such as the duration of the injection and the total erupted mass, with direct application in constraining plume and ash dispersal models.

  4. 46 CFR 111.10-5 - Multiple energy sources.

    Science.gov (United States)

    2010-10-01

    46 CFR 111.10-5 (2010), Electric Systems, General Requirements, Power Supply: Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  5. Model parameter estimations from residual gravity anomalies due to simple-shaped sources using Differential Evolution Algorithm

    Science.gov (United States)

    Ekinci, Yunus Levent; Balkaya, Çağlayan; Göktürkler, Gökhan; Turan, Seçil

    2016-06-01

    An efficient approach is presented for estimating model parameters from residual gravity data based on differential evolution (DE), a stochastic vector-based metaheuristic algorithm. We show the applicability and effectiveness of this algorithm on both synthetic and field anomalies. To our knowledge, this is the first attempt to apply DE to parameter estimation for residual gravity anomalies due to isolated causative sources embedded in the subsurface. The model parameters considered here are the amplitude coefficient (A), the depth and horizontal origin of the causative source (z0 and x0, respectively) and the shape factors (q and η). Error-energy maps generated for some parameter pairs successfully reveal the nature of the parameter estimation problem under consideration. Noise-free and noisy synthetic single gravity anomalies were evaluated successfully via DE/best/1/bin, a widely used DE strategy. Additionally, some complicated gravity anomalies caused by multiple source bodies were considered, and the results demonstrate the efficiency of the algorithm. Using the strategy applied in the synthetic examples, field anomalies observed in several mineral explorations, a chromite deposit (Camaguey district, Cuba), a manganese deposit (Nagpur, India) and a base-metal sulphide deposit (Quebec, Canada), were then considered to estimate the model parameters of the ore bodies. The applications show that the obtained results, such as the depths and shapes of the ore bodies, are quite consistent with those published in the literature. Uncertainty in the solutions obtained from the DE algorithm was also investigated using a Metropolis-Hastings (M-H) sampling algorithm based on simulated annealing without a cooling schedule. Based on the resulting histogram reconstructions of both synthetic and field data examples, the algorithm provided reliable parameter estimates lying within the sampling limits of
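
    As a hedged sketch of this kind of inversion (not the authors' exact parameterization or code), the following uses SciPy's differential_evolution with the DE/best/1/bin strategy to recover amplitude, origin, depth and shape factor from a noise-free synthetic anomaly. The forward model g(x) = A / ((x - x0)^2 + z0^2)^q is a common simple-body form assumed here for illustration, with q = 1.5 corresponding to a sphere-like source.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Generic residual-anomaly forward model (an assumed simple-body form; the
# paper's exact parameterization may differ).
def forward(x, A, x0, z0, q):
    return A / ((x - x0) ** 2 + z0 ** 2) ** q

x = np.linspace(-50.0, 50.0, 101)
true_params = (1000.0, 5.0, 10.0, 1.5)   # A, x0, z0, q (sphere-like: q = 1.5)
observed = forward(x, *true_params)      # noise-free synthetic anomaly

def misfit(p):
    # Sum-of-squares error energy between observed and modeled anomalies.
    return np.sum((observed - forward(x, *p)) ** 2)

bounds = [(1.0, 1e4), (-20.0, 20.0), (1.0, 30.0), (0.5, 2.5)]
result = differential_evolution(misfit, bounds, strategy="best1bin", seed=0)
A_est, x0_est, z0_est, q_est = result.x
```

    With clean data the global minimizer sits at the true parameters; for noisy or multi-body anomalies, the paper's error-energy maps and M-H sampling address how well-constrained each parameter actually is.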

  6. Integrating multiple data sources for malware classification

    Science.gov (United States)

    Anderson, Blake Harrell; Storlie, Curtis B; Lane, Terran

    2015-04-28

    Disclosed herein are representative embodiments of tools and techniques for classifying programs. According to one exemplary technique, at least one graph representation of at least one dynamic data source of at least one program is generated. Also, at least one graph representation of at least one static data source of the at least one program is generated. Additionally, at least using the at least one graph representation of the at least one dynamic data source and the at least one graph representation of the at least one static data source, the at least one program is classified.

  7. Performance of light sources and radiation sensors under low gravity realized by parabolic airplane flights

    Science.gov (United States)

    Hirai, Hiroaki; Kitaya, Yoshiaki; Hirai, Takehiro

    A fundamental study was conducted to establish an experimental system for space farming. Since ensuring optimal light for plant cultivation in space is critically important, this study examined the performance of light sources and radiation sensors under microgravity conditions created during parabolic airplane flights. Three kinds of light source, a halogen bulb, a fluorescent tube, and blue and red LEDs, and ten commercially available models of radiation sensor were used in the experiment. The surface temperature of the light sources, the output signals from the radiation sensors, and the spectroscopic characteristics were measured at gravity levels of 0.01, 1.0 and 1.8 G for 20 seconds each during parabolic airplane flights. The performance of the halogen lamp was affected the most by the gravity level among the three light sources. Under microgravity conditions, which suppress heat convection, the temperature of the halogen lamp rose and the output of the radiation sensors increased. Spectral distributions of the halogen lamp indicated that the peak wavelength was highest at the 0.01 G level, which contributed to the increase in light intensity. In the case of the red and blue LEDs, which are promising light sources for space farming, the temperature of both LED chips rose, but the irradiance from the red LED increased while that from the blue LED decreased under microgravity conditions, due to their different thermal characteristics.

  8. Land Streamer Surveying Using Multiple Sources

    KAUST Repository

    Mahmoud, Sherif; Schuster, Gerard T.

    2014-01-01

    are fired. In another example, a system includes a land streamer including a plurality of receivers, a first shot source located adjacent to the proximal end of the land streamer, and a second shot source located in-line with the land streamer and the first

  9. Reconstruction of multiple line source attenuation maps

    International Nuclear Information System (INIS)

    Celler, A.; Sitek, A.; Harrop, R.

    1996-01-01

    A simple configuration was proposed for a transmission source for single photon emission computed tomography (SPECT), which utilizes a series of collimated line sources parallel to the axis of rotation of the camera. The detector is equipped with a standard parallel-hole collimator. We have demonstrated that this type of source configuration can generate sufficient data for the reconstruction of the attenuation map when using 8-10 line sources spaced by 3.5-4.5 cm for a 30 x 40 cm detector at a 65 cm distance from the sources. Transmission data for a nonuniform thorax phantom were simulated, then binned and reconstructed using filtered backprojection (FBP) and iterative methods. The optimum maps are obtained with data binned into 2-3 bins and FBP reconstruction. The activity in the source was investigated for uniform and exponential activity distributions, as well as the effect of gaps and overlaps of the neighboring fan beams. A prototype of the line source has been built and experimental verification of the technique has started.

  10. Study of Seulawah Agam’s Geothermal Source Using Gravity Method

    Directory of Open Access Journals (Sweden)

    Marwan Marwan

    2015-04-01

    A gravity survey was carried out in the Seulawah Agam area to delineate the geothermal source, specifically in the geothermal fields of Heutsz's crater and Cempaga's crater. Seulawah Agam is located in the Aceh Besar district. Geologically, the area is dominated by volcanic mudflow and Lam Teuba rocks ranging in age from the Tertiary to the Recent. The equipment used included a CG-5 Autograv gravimeter, a portable GPS (Global Positioning System) navigation unit with a map of the survey area, a computer and other technical supports such as handheld radios, an umbrella, a watch, pens and field notebooks. The research was conducted in two stages: first, establishing the base station, which serves as the reference point for all gravity measurements; second, measuring gravity at each point, with each measurement repeated three times following a looping pattern, at nine points in Heutsz's crater and seventeen points in Cempaga's crater. As this was a preliminary study, the data were processed simply in Microsoft Excel to plot the Bouguer anomaly and interpreted qualitatively. The results show that both areas exhibit both high and low Bouguer anomalies. At Heutsz's crater, high and low densities were found at points F125 FR and B6, respectively. The high-density point indicates mineralization formed by hydrothermal processes through fractured material, while the low-density point indicates a fault, likely related to the Seulimum Fault shown on the Aceh map. The same interpretation applies to the results obtained at Cempaga's crater. In addition, it can be concluded that fault zones are essentially important in a geothermal system, playing a vital role in fluid circulation. Employing the gravity method in this research effectively can be

  11. Research on neutron source multiplication method in nuclear critical safety

    International Nuclear Information System (INIS)

    Zhu Qingfu; Shi Yongqian; Hu Dingsheng

    2005-01-01

This paper concerns research on the neutron source multiplication method in nuclear criticality safety. Based on the neutron diffusion equation with an external neutron source, the effective subcritical multiplication factor k_s is deduced; k_s differs from the effective neutron multiplication factor k_eff in a subcritical system with an external neutron source. A verification experiment on a subcritical system indicates that the parameter measured with the neutron source multiplication method is k_s, and that k_s depends on the position of the external neutron source in the subcritical system and on the external source spectrum. The relation between k_s and k_eff, and their bearing on nuclear criticality safety, is discussed. (author)
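The core of the method can be sketched numerically: in a source-driven subcritical system the detector count rate is proportional to the source multiplication M = 1/(1 − k_s), so a ratio of count rates between a reference state of known k_s and an unknown state yields an estimate of the unknown k_s. A minimal illustration (all numbers hypothetical, and the equal-efficiency assumption is exactly what the MSM correction factors in the literature are designed to relax):

```python
# Hypothetical illustration of the neutron source multiplication idea:
# count rate C is proportional to S / (1 - k_s), so
# C_ref / C = (1 - k_s) / (1 - k_s_ref).

def multiplication(k_s: float) -> float:
    """Source multiplication M = 1/(1 - k_s) for a subcritical system."""
    if not 0.0 <= k_s < 1.0:
        raise ValueError("k_s must lie in [0, 1) for a subcritical system")
    return 1.0 / (1.0 - k_s)

def k_s_from_count_rates(k_s_ref: float, c_ref: float, c_meas: float) -> float:
    """Infer k_s of an unknown configuration from the count-rate ratio
    against a reference configuration of known k_s (same source and same
    detector efficiency assumed)."""
    return 1.0 - (1.0 - k_s_ref) * c_ref / c_meas

# Reference state: k_s = 0.95 (M = 20).  The unknown state counts twice
# as fast, so it is closer to critical.
k_ref = 0.95
c_ref = 1000.0          # counts/s, hypothetical
c_meas = 2000.0         # counts/s, hypothetical
print(round(multiplication(k_ref), 1))                      # 20.0
print(round(k_s_from_count_rates(k_ref, c_ref, c_meas), 3)) # 0.975
```

The same count-rate data interpreted against k_eff instead of k_s would be biased, which is the distinction the abstract draws.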

  12. Learning from Multiple Sources for Video Summarisation

    OpenAIRE

    Zhu, Xiatian; Loy, Chen Change; Gong, Shaogang

    2015-01-01

Many visual surveillance tasks, e.g. video summarisation, are conventionally accomplished by analysing imagery-based features. Relying solely on visual cues for public surveillance video understanding is unreliable, since visual observations obtained from public space CCTV video data are often not sufficiently trustworthy and events of interest can be subtle. On the other hand, non-visual data sources such as weather reports and traffic sensory signals are readily accessible but are not exp...

  13. Models of collapsing and expanding anisotropic gravitating source in f(R, T) theory of gravity

    Energy Technology Data Exchange (ETDEWEB)

    Abbas, G. [The Islamia University of Bahawalpur, Department of Mathematics, Bahawalpur (Pakistan); Ahmed, Riaz [The Islamia University of Bahawalpur, Department of Mathematics, Bahawalpur (Pakistan); University of the Central Punjab, Department of Mathematics, Lahore (Pakistan)

    2017-07-15

In this paper, we have formulated exact solutions for a non-static anisotropic gravitating source in f(R, T) gravity which may lead to expansion or collapse. Assuming no thermal conduction in the gravitating source, we have determined parametric solutions in f(R, T) gravity for a non-static spherical geometry filled with an anisotropic fluid. We have examined the ranges of the parameters for which the expansion scalar becomes negative or positive, leading to collapse or expansion, respectively. Further, using the definition of the mass function, the conditions for a trapped surface have been explored, and it has been shown that a single horizon exists in this case. The impact of the coupling parameter λ has been discussed in detail in both cases. For various values of λ, we have plotted the energy density, anisotropic pressure and anisotropy parameter in the cases of collapse and expansion. The physical significance of the graphs has been explained in detail. (orig.)

  14. Temporal sea-surface gravity changes observed near the source area prior to the 2011 Tohoku earthquake

    Science.gov (United States)

    Nakamura, T.; Tsuboi, S.

    2013-12-01

Recent seismological studies suggested subsurface activity preceding the 2011 Tohoku earthquake: migration of seismicity (Kato et al., 2012) and slow slip events (Ito et al., 2013) in and around the source area one month before the mainshock. In this study, we investigated sea-surface gravity changes observed by a shipboard gravimeter mounted on research vessels before the mainshock. The vessels incidentally passed through the source area along almost the same cruise track twice, four months and one month before the mainshock. Comparing the sea-surface gravity in the former track with that in the latter after Bouguer correction, we find gravity changes of approximately 7 mGal in coseismic slip areas near the trench axis during those three months. We find these gravity changes even at the crossings of the cruise tracks, where the seafloor topography does not differ between the tracks. We also find that in other areas the topographic differences show positive changes while the gravity changes are negative, a negative correlation inconsistent with the theoretical relationship between topographic difference and gravity change. This means that differences in seafloor topography between the two cruise tracks are not the main cause of the observed gravity changes there. Nor can the changes be explained by gravimeter drift or geostrophic currents. Although we have no clear evidence, we speculate that the cause may be density increases around the seismogenic zone or uplift of the seafloor. To explain changes of this size, we estimate density increases of 1.0 g/cm^3 in a disk with a radius of 40 km and a thickness of 200 m, or seafloor uplift of several tens of meters. Our results indicate that sea-surface gravity observation may be a valid approach to monitoring the approximate location of a possible great
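The quoted disk estimate can be sanity-checked with the infinite Bouguer slab approximation Δg = 2πGΔρt, a reasonable stand-in for a 40 km radius disk observed close to its axis. The numbers below are only this back-of-the-envelope check, not the authors' computation:

```python
import math

# Infinite Bouguer slab approximation: delta_g = 2 * pi * G * d_rho * t.
# A 40 km radius disk observed near its axis, from a few km above it, is
# close to this limit, so it is a fair check on the abstract's numbers.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
d_rho = 1000.0         # density increase of 1.0 g/cm^3, in kg/m^3
t = 200.0              # disk thickness, m

delta_g_si = 2.0 * math.pi * G * d_rho * t   # m/s^2
delta_g_mgal = delta_g_si * 1e5              # 1 mGal = 1e-5 m/s^2
print(round(delta_g_mgal, 1))                # ~8.4 mGal, the same order as
                                             # the observed ~7 mGal change
```

The slab value slightly overestimates a finite disk, so the agreement with the observed ~7 mGal is consistent with the authors' stated model.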

  15. Source Parameter Inversion for Recent Great Earthquakes from a Decade-long Observation of Global Gravity Fields

    Science.gov (United States)

    Han, Shin-Chan; Riva, Riccardo; Sauber, Jeanne; Okal, Emile

    2013-01-01

We quantify gravity changes after great earthquakes present within the 10 year long time series of monthly Gravity Recovery and Climate Experiment (GRACE) gravity fields. Using a spherical harmonic normal-mode formulation, the respective source parameters of moment tensor and double-couple were estimated. For the 2004 Sumatra-Andaman earthquake, the gravity data indicate a composite moment of 1.2×10^23 N m with a dip of 10°, in agreement with the estimate obtained at ultralong seismic periods. For the 2010 Maule earthquake, the GRACE solutions range from 2.0 to 2.7×10^22 N m for dips of 12°-24° and centroid depths within the lower crust. For the 2011 Tohoku-Oki earthquake, the estimated scalar moments range from 4.1 to 6.1×10^22 N m, with dips of 9°-19° and centroid depths within the lower crust. For the 2012 Indian Ocean strike-slip earthquakes, the gravity data delineate a composite moment of 1.9×10^22 N m regardless of the centroid depth, comparing favorably with the total moment of the main ruptures and aftershocks. The smallest event we successfully analyzed with GRACE was the 2007 Bengkulu earthquake with M_0 ≈ 5.0×10^21 N m. We found that the gravity data constrain the focal mechanism with the centroid only within the upper and lower crustal layers for thrust events. Deeper sources (i.e., in the upper mantle) could not reproduce the gravity observation, as the larger rigidity and bulk modulus at mantle depths inhibit the interior from changing its volume, thus reducing the negative gravity component. Focal mechanisms and seismic moments obtained in this study represent the behavior of the sources on temporal and spatial scales exceeding the seismic and geodetic spectrum.

  16. Volcano Monitoring using Multiple Remote Data Sources

    Science.gov (United States)

    Reath, K. A.; Pritchard, M. E.

    2016-12-01

Satellite-based remote sensing instruments can be used to determine quantitative values related to precursory activity that can act as a warning sign of an upcoming eruption. These warning signs are measured by examining anomalous activity in: (1) thermal flux, (2) gas/aerosol emission rates, (3) ground deformation, and (4) ground-based seismic readings. Patterns in each of these data sources are then analyzed to create classifications of different phases of precursory activity, which act as guidelines to monitor the progression toward an eruption. Current monitoring methods rely on high temporal resolution satellite imagery from instruments like the Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS) sensors, for variations in thermal and aerosol emissions, and the Ozone Monitoring Instrument (OMI) and Ozone Mapping Profiler Suite (OMPS) instruments, for variations in gas emissions, providing a valuable resource for near real-time monitoring of volcanic activity. However, the low spatial resolution of these data enables only events that produce a high thermal output or a large amount of gas/aerosol emissions to be detected. High spatial resolution instruments, like the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor, have a small enough pixel size (90 m) that subtle variations in both thermal flux and gas/aerosol emission rates in the pre-eruptive period can be detected. Including these data with the already established high temporal resolution data helps to identify and classify precursory activity patterns months before an eruption (Reath et al., 2016). By correlating these data with ground surface deformation data, determined from Interferometric Synthetic Aperture Radar (InSAR), and seismic data, collected by the Incorporated Research Institutions for Seismology (IRIS) data archive, subtle

  17. Dynamical analysis of cylindrically symmetric anisotropic sources in f(R, T) gravity

    Energy Technology Data Exchange (ETDEWEB)

    Zubair, M.; Azmat, Hina [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan); Noureen, Ifra [University of Management and Technology, Department of Mathematics, Lahore (Pakistan)

    2017-03-15

In this paper, we have analyzed the stability of a cylindrically symmetric collapsing object filled with a locally anisotropic fluid in f(R, T) theory, where R is the scalar curvature and T is the trace of the stress-energy tensor of matter. Modified field equations and dynamical equations are constructed in f(R, T) gravity. The evolution or collapse equation is derived from the dynamical equations by performing a linear perturbation on them. The instability range is explored in both the Newtonian and the post-Newtonian regimes with the help of an adiabatic index, which defines the impact of the physical parameters on the instability range. Some conditions are imposed on the physical quantities to secure the stability of the gravitating sources. (orig.)

  18. Brief communication "Seismic and acoustic-gravity signals from the source of the 2004 Indian Ocean Tsunami"

    Directory of Open Access Journals (Sweden)

    A. Raveloson

    2012-02-01

Full Text Available The great Sumatra-Andaman earthquake of 26 December 2004 caused seismic waves propagating through the solid Earth, tsunami waves propagating through the ocean, and infrasound or acoustic-gravity waves propagating through the atmosphere. Since the infrasound wave travels faster than its associated tsunami, it is very intriguing for warning purposes to study the possibility of infrasound generation directly at the earthquake source. Garces et al. (2005) and Le Pichon et al. (2005) emphasized that infrasound was generated by mountainous islands near the epicenter and by tsunami propagation along the continental shelf to the Bay of Bengal. Mikumo et al. (2008) concluded from the analysis of travel times and amplitudes of first-arriving acoustic-gravity waves with periods of about 400–700 s that these waves are caused by coseismic motion of the sea surface, mainly to the west of the Nicobar islands in the open seas. We reanalyzed the acoustic-gravity waves and corrected the first arrival times of Mikumo et al. (2008) by up to 20 min. We found the source of the first-arriving acoustic-gravity wave about 300 km to the north of the US Geological Survey earthquake epicenter. This confirms the result of Mikumo et al. (2008) that sea level changes at the earthquake source cause long-period acoustic-gravity waves, which indicate that a tsunami was generated. Therefore, a denser local network of infrasound stations may be helpful for tsunami warnings, not only for very large earthquakes.

  19. Formation and stabilization of multiple ball-like flames at Earth gravity

    KAUST Repository

    Zhou, Zhen

    2018-03-20

Near-limit low-Lewis-number premixed flame behavior is studied experimentally and numerically for flames of H2–CH4–air mixtures located in a 55 mm diameter tube, below a perforated plate, in a downward mixture flow. A combustion regime diagram is experimentally identified in terms of equivalence ratio and the ratio of H2 to CH4 (i.e., variation of fuel Lewis number). Planar flames, cell-like flames, distorted cap-like flames, and arrays of ball-like flames are progressively observed in the experiments as the equivalence ratio is decreased. The experimentally observed ball-like lean limit flames experience chaotic motion, which is accompanied by sporadic events of flame splitting and extinction, while the total number of simultaneously burning flamelets remains approximately the same. In separate experiments, the multiple ball-like lean limit flames are stabilized by creating a slightly non-uniform mixture flow field. The CH* chemiluminescence distributions of the lean limit flames are recorded, showing that the ball-like lean limit flame front becomes more uniform in intensity and its shape approaches a spherical one with increasing H2 content in the fuel. Numerical simulations are performed for single representative flames of the array of stabilized flamelets observed in the experiments. The simulated ball-like lean limit flame is further contrasted with the single ball-like flame that forms in a narrow tube (13.5 mm inner diameter) with an iso-thermal wall. The numerical results show that the ball-like lean limit flames present in the array are more affected by the buoyancy-induced recirculation zone than the flame in the narrow tube, revealing why the shape of the ball-like flame in the array deviates more from a spherical one. All in all, the wall confinement is not crucial for the formation of ball-like flames at terrestrial gravity.

  20. Short-period atmospheric gravity waves - A study of their statistical properties and source mechanisms

    Science.gov (United States)

    Gedzelman, S. D.

    1983-01-01

Gravity waves for the one-year period beginning 19 October 1976 around Palisades, New York, are investigated to determine their statistical properties and sources. The waves have typical periods of 10 min, pressure amplitudes of 3 Pa and velocities of 30 m/s. In general, the largest-amplitude waves occur during late fall and early winter, when the upper tropospheric winds directly overhead are fastest and the static stability of the lower troposphere is greatest. Mean wave amplitudes correlate highly with the product of the mean maximum wind speed and the mean low-level stratification directly aloft. A distinct diurnal variation of wave amplitudes, with the largest waves occurring in the pre-dawn hours, is also observed as a result of the increased static stability then. The majority of waves are generated by shear instability; however, a number of waves are generated by distant sources such as nuclear detonations or large thunderstorms. The waves with distant sources can be distinguished on the basis of their generally much higher coherency across the grid and velocities that depart markedly from the wind velocity at any point in the sounding.

  1. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    Science.gov (United States)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project and researchers funded by the National Space Biomedical Research Institute (NSBRI) by developing computational models of exercising with these new advanced exercise device concepts. To validate these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to use traditional motion capture systems because of the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed using open source computer vision code with commercial off-the-shelf (COTS) video camera hardware. While the system's accuracy is lower than laboratory setups, it provides a means to produce quantitative comparison motion capture kinematic data. Additionally, data such as the required exercise volume for small spaces such as the Orion capsule can be determined.
METHODS: OpenCV is an open source computer vision library that provides the

  2. Coordinating a Large, Amalgamated REU Program with Multiple Funding Sources

    Science.gov (United States)

    Fiorini, Eugene; Myers, Kellen; Naqvi, Yusra

    2017-01-01

    In this paper, we discuss the challenges of organizing a large REU program amalgamated from multiple funding sources, including diverse participants, mentors, and research projects. We detail the program's structure, activities, and recruitment, and we hope to demonstrate that the organization of this REU is not only beneficial to its…

  3. The gravity anomaly of Mount Amiata; different approaches for understanding anomaly source distribution

    Science.gov (United States)

    Girolami, C.; Barchi, M. R.; Heyde, I.; Pauselli, C.; Vetere, F.; Cannata, A.

    2017-11-01

In this work, the gravity anomaly signal beneath Mount Amiata and its surroundings has been analysed to reconstruct the subsurface setting. In particular, the work focuses on the investigation of the geological bodies responsible for the Bouguer gravity minimum observed in this area.

  4. Silkeborg gravity high revisited: Horizontal extension of the source and its uniqueness

    DEFF Research Database (Denmark)

    Strykowski, Gabriel

    2000-01-01

Silkeborg Gravity High is a dominant positive gravity anomaly in Denmark. It is associated with an igneous intrusion within the crust. A deep refraction seismic profile locates the top of the intrusion at depths between 11 km and 25 km. The present contribution should be read together with two o...

  5. Evaluation of Network Reliability for Computer Networks with Multiple Sources

    Directory of Open Access Journals (Sweden)

    Yi-Kuei Lin

    2012-01-01

Full Text Available Evaluating the reliability of a network with multiple sources to multiple sinks is a critical issue from the perspective of quality management. Due to the unrealistic definition of paths in the network models of previous literature, existing models are not appropriate for real-world computer networks such as the Taiwan Advanced Research and Education Network (TWAREN). This paper proposes a modified stochastic-flow network model to evaluate the network reliability of a practical computer network with multiple sources where data is transmitted through several light paths (LPs). Network reliability is defined as the probability of delivering a specified amount of data from the sources to the sink. It is taken as a performance index to measure the service level of TWAREN. This paper studies the network reliability of the international portion of TWAREN from two sources (Taipei and Hsinchu) to one sink (New York), which goes through a submarine and land-surface cable between Taiwan and the United States.
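The reliability definition in the abstract can be illustrated on a toy stochastic-flow network: give each link a few capacity states with probabilities, join the multiple sources to a super-source, and sum the probability of every state combination whose max flow meets the demand. The topology and probabilities below are invented for illustration, not the actual TWAREN network:

```python
# Toy two-terminal stochastic-flow reliability with multiple sources.
from itertools import product

def max_flow(n, cap, s, t):
    """Plain Ford-Fulkerson with BFS on an n-node capacity matrix."""
    cap = [row[:] for row in cap]
    flow = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        queue = [s]
        while queue and parent[t] == -1:
            u = queue.pop(0)
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[t] == -1:
            return flow
        v, push = t, float("inf")      # find bottleneck on the path
        while v != s:
            push = min(push, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:                   # augment along the path
            cap[parent[v]][v] -= push
            cap[v][parent[v]] += push
            v = parent[v]
        flow += push

# Nodes: 0 = super-source, 1 = Taipei, 2 = Hsinchu, 3 = New York (sink).
# The two long-haul links are stochastic; super-source edges are unlimited.
edges = [(1, 3), (2, 3)]
states = {(1, 3): [(0, 0.1), (10, 0.9)],   # (capacity, probability)
          (2, 3): [(0, 0.2), (10, 0.8)]}

def reliability(demand):
    total = 0.0
    for combo in product(*(states[e] for e in edges)):
        cap = [[0] * 4 for _ in range(4)]
        cap[0][1] = cap[0][2] = 10 ** 9    # super-source links
        prob = 1.0
        for (u, v), (c, p) in zip(edges, combo):
            cap[u][v] = c
            prob *= p
        if max_flow(4, cap, 0, 3) >= demand:
            total += prob
    return total

print(round(reliability(10), 3))   # either link suffices: 1 - 0.1*0.2 = 0.98
print(round(reliability(20), 3))   # both links needed:    0.9*0.8    = 0.72
```

Exhaustive state enumeration is exponential in the number of links; the paper's contribution is precisely a model that avoids this on realistic networks, so this sketch only fixes the definition being computed.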

  6. Multisensory softness perceived compliance from multiple sources of information

    CERN Document Server

    Luca, Massimiliano Di

    2014-01-01

    Offers a unique multidisciplinary overview of how humans interact with soft objects and how multiple sensory signals are used to perceive material properties, with an emphasis on object deformability. The authors describe a range of setups that have been employed to study and exploit sensory signals involved in interactions with compliant objects as well as techniques to simulate and modulate softness - including a psychophysical perspective of the field. Multisensory Softness focuses on the cognitive mechanisms underlying the use of multiple sources of information in softness perception. D

  7. Tracking of Multiple Moving Sources Using Recursive EM Algorithm

    Directory of Open Access Journals (Sweden)

    Böhme Johann F

    2005-01-01

Full Text Available We deal with recursive direction-of-arrival (DOA) estimation of multiple moving sources. Based on the recursive EM algorithm, we develop two recursive procedures to estimate the time-varying DOA parameter for narrowband signals. The first procedure requires no prior knowledge about the source movement. The second procedure assumes that the motion of the moving sources is described by a linear polynomial model; the proposed recursion updates the polynomial coefficients when new data arrive. The suggested approaches have two major advantages: simple implementation and easy extension to wideband signals. Numerical experiments show that both procedures provide excellent results in a slowly changing environment. When the DOA parameter changes fast or two source directions cross each other, the procedure designed for the linear polynomial model performs better than the general procedure. Compared to the beamforming technique based on the same parameterization, our approach is computationally favorable and has a wider range of applications.

  8. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  9. Research on amplification multiple of source neutron number for ADS

    International Nuclear Information System (INIS)

    Liu Guisheng; Zhao Zhixiang; Zhang Baocheng; Shen Qingbiao; Ding Dazhao

    1998-01-01

The NJOY-91.91 and MILER code systems were applied to process and generate 44-group cross sections in AMPX master library format from CENDL-2 and ENDF/B-6. It is important that an ADS (Accelerator-Driven System) assembly spectrum be used as the weighting spectrum when generating multi-group constants. Amplification multiples of the source neutron number for several fast assemblies were calculated.

  10. 75 FR 69591 - Medicaid Program; Withdrawal of Determination of Average Manufacturer Price, Multiple Source Drug...

    Science.gov (United States)

    2010-11-15

    ..., Multiple Source Drug Definition, and Upper Limits for Multiple Source Drugs AGENCY: Centers for Medicare... withdrawing the definition of ``multiple source drug'' as it was revised in the ``Medicaid Program; Multiple Source Drug Definition'' final rule published in the October 7, 2008 Federal Register. DATES: Effective...

  11. Multiple approaches to microbial source tracking in tropical northern Australia

    KAUST Repository

    Neave, Matthew

    2014-09-16

    Microbial source tracking is an area of research in which multiple approaches are used to identify the sources of elevated bacterial concentrations in recreational lakes and beaches. At our study location in Darwin, northern Australia, water quality in the harbor is generally good, however dry-season beach closures due to elevated Escherichia coli and enterococci counts are a cause for concern. The sources of these high bacteria counts are currently unknown. To address this, we sampled sewage outfalls, other potential inputs, such as urban rivers and drains, and surrounding beaches, and used genetic fingerprints from E. coli and enterococci communities, fecal markers and 454 pyrosequencing to track contamination sources. A sewage effluent outfall (Larrakeyah discharge) was a source of bacteria, including fecal bacteria that impacted nearby beaches. Two other treated effluent discharges did not appear to influence sites other than those directly adjacent. Several beaches contained fecal indicator bacteria that likely originated from urban rivers and creeks within the catchment. Generally, connectivity between the sites was observed within distinct geographical locations and it appeared that most of the bacterial contamination on Darwin beaches was confined to local sources.

  12. Field estimates of gravity terrain corrections and Y2K-compatible method to convert from gravity readings with multiple base stations to tide- and long-term drift-corrected observations

    Science.gov (United States)

    Plouff, Donald

    2000-01-01

Gravity observations are directly made or are obtained from other sources by the U.S. Geological Survey in order to prepare maps of the anomalous gravity field and consequently to interpret the subsurface distribution of rock densities and associated lithologic or geologic units. Observations are made in the field with gravity meters at new locations and at reoccupations of previously established gravity "stations." This report illustrates an interactively-prompted series of steps needed to convert gravity "readings" to values that are tied to established gravity datums and includes computer programs to implement those steps. Inasmuch as individual gravity readings have small variations, gravity-meter (instrument) drift may not be smoothly variable, and accommodations may be needed for ties to previously established stations, the reduction process is iterative. Decision-making by the program user is prompted by lists of best values and graphical displays. Notes about irregularities of topography, which affect the value of observed gravity but are not shown in sufficient detail on topographic maps, must be recorded in the field. This report illustrates ways to record field notes (distances, heights, and slope angles) and includes computer programs to convert field notes to gravity terrain corrections. This report includes approaches that may serve as models for other applications, for example: portrayal of system flow; style of quality control to document and validate computer applications; lack of dependence on proprietary software except source code compilation; method of file-searching with a dwindling list; interactive prompting; computer code to write directly in the PostScript (Adobe Systems Incorporated) printer language; and highlighting the four-digit year on the first line of time-dependent data sets for assured Y2K compatibility. Computer source codes provided are written in the Fortran scientific language.
In order for the programs to operate, they first
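In its most basic form, the base-station tie described above reduces to a linear interpolation: repeated readings at the base bracket the field readings, and the drift accumulated between the two base occupations is removed in proportion to elapsed time. The report's actual reduction is iterative, multi-base, and Fortran; the sketch below, with hypothetical meter readings, only shows the single-base linear case:

```python
# Minimal linear drift correction between two occupations of one base
# station (hypothetical readings in meter units; tides and multi-base
# ties, handled iteratively in the report, are ignored here).
from datetime import datetime

def drift_corrected(readings, base_first, base_last):
    """readings: list of (time, station, meter_reading).
    base_first / base_last: (time, meter_reading) at the same base."""
    t0, r0 = base_first
    t1, r1 = base_last
    rate = (r1 - r0) / (t1 - t0).total_seconds()   # drift per second
    out = []
    for t, name, r in readings:
        drift = rate * (t - t0).total_seconds()
        out.append((name, r - drift))
    return out

day = lambda h, m: datetime(2000, 1, 1, h, m)
base_first = (day(8, 0), 2451.320)
base_last = (day(12, 0), 2451.440)        # meter drifted +0.120 in 4 h
stations = [(day(9, 0), "A", 2453.100),
            (day(10, 0), "B", 2449.750)]
for name, value in drift_corrected(stations, base_first, base_last):
    print(name, round(value, 3))
# A 2453.07   (0.030 of drift removed after 1 h)
# B 2449.69   (0.060 of drift removed after 2 h)
```

When drift is not smoothly variable, as the report cautions, a single linear segment is insufficient, which is one reason the full procedure is interactive and iterative.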

  13. Localizing Brain Activity from Multiple Distinct Sources via EEG

    Directory of Open Access Journals (Sweden)

    George Dassios

    2014-01-01

Full Text Available An important question arising in the framework of electroencephalography (EEG) is the possibility of recognizing, by means of a recorded surface potential, the number of activated areas in the brain. In the present paper, employing a homogeneous spherical conductor as an approximation of the brain, we provide a criterion which determines whether the measured surface potential is evoked by a single or by multiple localized neuronal excitations. We show that the uniqueness of the inverse problem for a single dipole is closely connected with certain relations holding among the measured data. Further, we present the necessary and sufficient conditions for deciding whether the collected data originate from a single dipole or from numerous dipoles. In the case where the EEG data arise from multiple parallel dipoles, isolation of the sources is, in general, not possible.

  14. Writing in the workplace: Constructing documents using multiple digital sources

    Directory of Open Access Journals (Sweden)

    Mariëlle Leijten

    2014-02-01

Full Text Available In today’s workplaces professional communication often involves constructing documents from multiple digital sources—integrating one’s own texts/graphics with ideas based on others’ text/graphics. This article presents a case study of a professional communication designer as he constructs a proposal over several days. Drawing on keystroke and interview data, we map the professional’s overall process, plot the time course of his writing/design, illustrate how he searches for content and switches among optional digital sources, and show how he modifies and reuses others’ content. The case study reveals not only that the professional (1) searches extensively through multiple sources for content and ideas but that he also (2) constructs visual content (charts, graphs, photographs) as well as verbal content, and (3) manages his attention and motivation over this extended task. Since these three activities are not represented in current models of writing, we propose their addition not just to models of communication design, but also to models of writing in general.

  15. A signature of quantum gravity at the source of the seeds of cosmic structure?

    Energy Technology Data Exchange (ETDEWEB)

    Sudarsky, Daniel [Instituto de Ciencias Nucleares, Universidad National Autonoma de Mexico, A. Postal 70-543, Mexico D.F. 04510 (Mexico)

    2007-05-15

This article reviews recent work by a couple of colleagues and myself [Perez A, Sahlmann H and Sudarsky D 2006 Class. Quantum Grav. 23 2317-54] on the shortcomings of the standard explanations of the quantum origins of cosmic structure in the inflationary scenario, and a proposal to address them. The point is that, in the usual accounts, the inhomogeneity and anisotropy of our universe seem to emerge from an exactly homogeneous and isotropic initial state through processes that do not break those symmetries. We argued that some novel aspect of physics must be called upon to address the problem in a fully satisfactory way. The proposed approach is inspired by Penrose's ideas regarding a quantum-gravity-induced, real and dynamical collapse of the wave function.

  16. Interpretation of the TRADE In-Pile source multiplication experiments

    International Nuclear Information System (INIS)

    Mercatali, Luigi; Carta, Mario; Peluso, Vincenzo

    2006-01-01

Within the framework of the neutronic characterization of the TRIGA RC-1 reactor in support of the TRADE (TRiga Accelerator Driven Experiment) program, the interpretation of the subcriticality level measurements performed in static regime during the TRADE In-Pile experimental program is presented. Different levels of subcriticality have been measured using the MSA (Modified Source Approximated) method by inserting a standard fixed radioactive source into different core positions. Starting from a reference configuration, fuel elements were removed, control rods were moved outward as required for the coupling experiments envisioned with the proton accelerator, and fission chambers were inserted in order to measure subcritical count rates. A neutron-physics analysis based on the modified formulation of the source multiplication method (MSM) has been carried out, which requires, for each experimental configuration, the systematic solution of the homogeneous Boltzmann equation (in both forward and adjoint forms) and of the inhomogeneous equation. By means of this methodology, calculated correction factors to be applied to the MSA-measured reactivities were produced in order to account for the spatial and energetic effects that change the detector efficiencies and effective source with respect to the calibration configuration. The methodology has been tested against a large number of experimental states. The measurements have underlined the sensitivity of the MSA-measured reactivities to core geometry changes and control rod perturbations; the efficiency of the MSM factors in dramatically correcting for this sensitivity is underlined, making this technique a relevant methodology in view of the incoming US RACE program to be performed in TRIGA reactors.
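The MSA/MSM workflow in the abstract reduces to two lines of algebra: the MSA assigns a reactivity from the count-rate ratio against a calibrated reference configuration, and the MSM multiplies that raw value by a calculated correction factor for the configuration-dependent detector efficiency and effective source. A schematic illustration with made-up numbers (the real correction factors come from the forward/adjoint transport solutions described above):

```python
# Schematic MSA/MSM reactivity estimate; all numbers are hypothetical.

def rho_msa(rho_ref, c_ref, c_meas):
    """MSA: reactivity scales with the inverse count-rate ratio against a
    calibrated reference configuration (point-kinetics approximation)."""
    return rho_ref * c_ref / c_meas

def rho_msm(rho_msa_value, f_correction):
    """MSM: apply the calculated correction factor accounting for the
    change in detector efficiency and effective source between the
    reference and the measured configuration."""
    return rho_msa_value * f_correction

rho_ref = -2.0      # reference subcriticality in dollars, hypothetical
c_ref = 5000.0      # counts/s in the reference configuration
c_meas = 2500.0     # counts/s in the perturbed configuration
raw = rho_msa(rho_ref, c_ref, c_meas)         # lower count rate => deeper
print(raw)                                    # subcritical: -4.0 $
print(round(rho_msm(raw, 1.12), 2))           # MSM-corrected: -4.48 $
```

The abstract's point is that `f_correction` can differ substantially from 1 under core geometry changes and control rod perturbations, which is why the uncorrected MSA value alone is unreliable.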

  17. Acoustic-gravity waves generated by atmospheric and near-surface sources

    Science.gov (United States)

    Kunitsyn, Viacheslav E.; Kholodov, Alexander S.; Krysanov, Boris Yu.; Andreeva, Elena S.; Nesterov, Ivan A.; Vorontsov, Artem M.

    2013-04-01

    Numerical simulation of the acoustic-gravity waves (AGW) generated by long-period oscillations of the Earth's (oceanic) surface, earthquakes, explosions, thermal heating, seiches, and tsunami is carried out. Wavelike disturbances are quite frequent phenomena in the atmosphere and ionosphere. These events can be caused by impacts from space and the atmosphere, by oscillations of the Earth's surface, and by other near-surface events. These wavelike phenomena in the atmosphere and ionosphere appear as alternating areas of enhanced and depleted density (in the atmosphere) or electron concentration (in the ionosphere). In the paper, AGW with typical frequencies of a few hertz to millihertz are analyzed. AGW are often observed after atmospheric perturbations, during earthquakes, and some time (hours to a few days) in advance of earthquakes. The AGW generated by near-surface phenomena within this frequency range build up at mid-atmospheric and ionospheric altitudes, where they assume their typical spatial scales of the order of a few hundred kilometers. Oscillations of the ionospheric plasma within this frequency range generate electromagnetic waves with corresponding frequencies as well as travelling ionospheric disturbances (TIDs). Such structures can be successfully monitored using satellite radio tomography (RT) techniques. For the purposes of RT diagnostics, 150/400 MHz transmissions from low-orbiting navigational satellites flying in polar orbits at altitudes of about 1000 km, as well as 1.2-1.5 GHz signals from high-orbiting (orbital altitudes about 20000 km) navigation systems like GPS/GLONASS, are used. The results of experimental studies on the generation of wavelike disturbances by particle precipitation are presented

  18. Feature extraction from multiple data sources using genetic programming.

    Energy Technology Data Exchange (ETDEWEB)

    Szymanski, J. J. (John J.); Brumby, Steven P.; Pope, P. A. (Paul A.); Eads, D. R. (Damian R.); Galassi, M. C. (Mark C.); Harvey, N. R. (Neal R.); Perkins, S. J. (Simon J.); Porter, R. B. (Reid B.); Theiler, J. P. (James P.); Young, A. C. (Aaron Cody); Bloch, J. J. (Jeffrey J.); David, N. A. (Nancy A.); Esch-Mosher, D. M. (Diana M.)

    2002-01-01

    Feature extraction from imagery is an important and long-standing problem in remote sensing. In this paper, we report on work using genetic programming to perform feature extraction simultaneously from multispectral and digital elevation model (DEM) data. The tool used is the GENetic Imagery Exploitation (GENIE) software, which produces image-processing software that inherently combines spatial and spectral processing. GENIE is particularly useful in exploratory studies of imagery, such as one often undertakes when combining data from multiple sources. The user trains the software by painting the feature of interest with a simple graphical user interface. GENIE then uses genetic programming techniques to produce an image-processing pipeline. Here, we demonstrate evolution of image processing algorithms that extract a range of land-cover features including towns, grasslands, wild fire burn scars, and several types of forest. We use imagery from the DOE/NNSA Multispectral Thermal Imager (MTI) spacecraft, fused with USGS 1:24000 scale DEM data.

  19. Estimation of subcriticality by neutron source multiplication method

    International Nuclear Information System (INIS)

    Sakurai, Kiyoshi; Suzaki, Takenori; Arakawa, Takuya; Naito, Yoshitaka

    1995-03-01

    Subcritical cores were constructed in the core tank of the TCA by arraying 2.6% enriched UO2 fuel rods into n×n square lattices of 1.956 cm pitch. Vertical distributions of the neutron count rates for the fifteen subcritical cores (n=17, 16, 14, 11, 8) with different water levels were measured at 5 cm intervals with 235U micro-fission counters at in-core and out-of-core positions, with a 252Cf neutron source placed near the core center. The continuous-energy Monte Carlo code MCNP-4A was used for the calculation of neutron multiplication factors and neutron count rates. In this study, the important conclusions are as follows: (1) Differences between the neutron multiplication factors obtained from the exponential experiment and from MCNP-4A are below 1% in most cases. (2) Standard deviations of the neutron count rates calculated with MCNP-4A using 500000 histories are 5-8%. The calculated neutron count rates are consistent with the measured ones. (author)
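
    The source multiplication idea behind such measurements can be sketched in a few lines (illustrative values; the experiment above infers the multiplication factor with MCNP-4A rather than this textbook formula):

```python
# With a source S in a subcritical assembly, the equilibrium neutron
# population is S * (1 + k + k^2 + ...) = S / (1 - k).  The measured
# multiplication M = C / C0 (count rate with fuel present vs. source alone)
# therefore gives k ≈ 1 - 1/M; plotting 1/M against fuel loading is the
# classic approach-to-critical safety technique.
def k_from_multiplication(c, c0):
    m = c / c0
    return 1.0 - 1.0 / m

print(k_from_multiplication(2000.0, 100.0))  # M = 20 -> k = 0.95
```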

  20. Assessing the use of multiple sources in student essays.

    Science.gov (United States)

    Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly

    2012-09-01

    The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
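
    The LSA comparison described above can be illustrated with a toy term-by-sentence matrix. The data and the retained dimensionality are assumptions for illustration; the study itself trained on a large corpus:

```python
import numpy as np

# Rows = terms, columns = "sentences"; a truncated SVD gives a low-rank
# semantic space in which sentences are compared by cosine similarity.
X = np.array([
    [2.0, 0.0, 1.0],
    [1.0, 0.0, 1.0],
    [0.0, 3.0, 0.0],
    [0.0, 1.0, 0.0],
])
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                # retained LSA dimensions
docs = (np.diag(s[:k]) @ Vt[:k]).T   # sentence vectors in LSA space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Sentences 0 and 2 share terms, sentence 1 does not:
print(cosine(docs[0], docs[2]), cosine(docs[0], docs[1]))
```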

  1. Multiple time-reversed guide-sources in shallow water

    Science.gov (United States)

    Gaumond, Charles F.; Fromm, David M.; Lingevitch, Joseph F.; Gauss, Roger C.; Menis, Richard

    2003-10-01

    Detection in a monostatic, broadband, active sonar system in shallow water is degraded by propagation-induced spreading. The detection improvement from multiple spatially separated guide sources (GSs) is presented as a method to mitigate this degradation. The improvement of detection by using information in a set of one-way transmissions from a variety of positions is shown using sea data. The experimental area is south of the Hudson Canyon off the coast of New Jersey. The data were taken using five elements of a time-reversing VLA. The five elements were contiguous and at midwater depth. The target and guide source was an echo repeater positioned at various ranges and at middepth. The transmitted signals were 3.0- to 3.5-kHz LFMs. The data are analyzed to show the amount of information present in the collection, a baseline probability of detection (PD) not using the collection of GS signals, the improvement in PD from the use of various sets of GS signals. The dependence of the improvement as a function of range is also shown. [The authors acknowledge support from Dr. Jeffrey Simmen, ONR321OS, and the chief scientist Dr. Charles Holland. Work supported by ONR.

  2. Development of new experimental platform 'MARS'-Multiple Artificial-gravity Research System-to elucidate the impacts of micro/partial gravity on mice.

    Science.gov (United States)

    Shiba, Dai; Mizuno, Hiroyasu; Yumoto, Akane; Shimomura, Michihiko; Kobayashi, Hiroe; Morita, Hironobu; Shimbo, Miki; Hamada, Michito; Kudo, Takashi; Shinohara, Masahiro; Asahara, Hiroshi; Shirakawa, Masaki; Takahashi, Satoru

    2017-09-07

    This Japan Aerospace Exploration Agency project focused on elucidating the impacts of partial gravity (partial g) and microgravity (μg) on mice using newly developed mouse habitat cage units (HCU) that can be installed in the Centrifuge-equipped Biological Experiment Facility in the International Space Station. In the first mission, 12 C57BL/6 J male mice were housed under μg or artificial earth-gravity (1 g). Mouse activity was monitored daily via downlinked videos; μg mice floated inside the HCU, whereas artificial 1 g mice were on their feet on the floor. After 35 days of habitation, all mice were returned to the Earth and processed. Significant decreases were evident in femur bone density and the soleus/gastrocnemius muscle weights of μg mice, whereas artificial 1 g mice maintained the same bone density and muscle weight as mice in the ground control experiment, in which housing conditions in the flight experiment were replicated. These data indicate that these changes were particularly because of gravity. They also present the first evidence that the addition of gravity can prevent decreases in bone density and muscle mass, and that the new platform 'MARS' may provide novel insights on the molecular-mechanisms regulating biological processes controlled by partial g/μg.

  3. Testability evaluation using prior information of multiple sources

    Directory of Open Access Journals (Sweden)

    Wang Chao

    2014-08-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.

  4. Testability evaluation using prior information of multiple sources

    Institute of Scientific and Technical Information of China (English)

    Wang Chao; Qiu Jing; Liu Guanjun; Zhang Yong

    2014-01-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.
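
    The mixed-prior Bayesian update described in these two records can be sketched for a success/fail testability index such as FDR. The priors, credibilities, and data below are illustrative assumptions, and the sketch omits refinements such as the maximum-entropy construction of the priors and the consistency check:

```python
from math import lgamma, exp

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def posterior(priors, weights, s, n):
    """priors: list of Beta(a, b) parameters, one per prior source;
    weights: prior credibilities; s detected faults out of n injected
    (success-fail TDTD)."""
    comps, logm = [], []
    for a, b in priors:
        comps.append((a + s, b + n - s))          # conjugate Beta update
        # Beta-binomial marginal likelihood of the data under this component
        # (the binomial coefficient is common to all components and cancels):
        logm.append(log_beta(a + s, b + n - s) - log_beta(a, b))
    raw = [w * exp(lm) for w, lm in zip(weights, logm)]
    z = sum(raw)
    new_w = [r / z for r in raw]
    mean = sum(w * a / (a + b) for w, (a, b) in zip(new_w, comps))
    return comps, new_w, mean                      # point estimate = mean

comps, w, fdr_hat = posterior([(18.0, 2.0), (8.0, 2.0)], [0.6, 0.4], s=19, n=20)
```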

  5. A Time-Regularized, Multiple Gravity-Assist Low-Thrust, Bounded-Impulse Model for Trajectory Optimization

    Science.gov (United States)

    Ellison, Donald H.; Englander, Jacob A.; Conway, Bruce A.

    2017-01-01

    The multiple gravity-assist low-thrust (MGALT) trajectory model combines the medium-fidelity Sims-Flanagan bounded-impulse transcription with a patched-conics flyby model and is an important tool for preliminary trajectory design. While this model features fast state propagation via Kepler's equation and provides a pleasingly accurate estimate of the total mass budget for an eventual flight-suitable integrated trajectory, it does suffer from one major drawback, namely the temporal spacing of its control nodes. We introduce a variant of the MGALT transcription that utilizes the generalized anomaly from the universal formulation of Kepler's equation as a decision variable in addition to the trajectory phase propagation time. This results in two improvements over the traditional model. The first is that the maneuver locations are equally spaced in generalized anomaly about the orbit rather than in time. The second is that the Kepler propagator now has the generalized anomaly as its independent variable instead of time and thus becomes an iteration-free propagation method. The new algorithm is outlined, including the impact that this has on the computation of Jacobian entries for numerical optimization, and a motivating application problem is presented that illustrates the improvements of this model over the traditional MGALT transcription.
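
    The "iteration-free" property follows from the universal Kepler equation itself: given the generalized anomaly chi, time of flight is an explicit function, whereas given time one must root-solve for chi. A minimal sketch of that explicit evaluation (symbols and values are illustrative, not the paper's implementation; mu is the gravitational parameter, r0 and vr0 the initial radius and radial speed, alpha = 1/a):

```python
from math import sqrt, sin, cos, sinh, cosh

def stumpff_C(z):
    if z > 1e-8:
        return (1.0 - cos(sqrt(z))) / z
    if z < -1e-8:
        return (cosh(sqrt(-z)) - 1.0) / (-z)
    return 0.5 - z / 24.0            # series near z = 0

def stumpff_S(z):
    if z > 1e-8:
        s = sqrt(z)
        return (s - sin(s)) / s**3
    if z < -1e-8:
        s = sqrt(-z)
        return (sinh(s) - s) / s**3
    return 1.0 / 6.0 - z / 120.0     # series near z = 0

def tof_from_chi(chi, r0, vr0, alpha, mu):
    """Universal Kepler equation evaluated for t given chi -- no iteration."""
    z = alpha * chi * chi
    return (r0 * vr0 / sqrt(mu) * chi**2 * stumpff_C(z)
            + (1.0 - alpha * r0) * chi**3 * stumpff_S(z)
            + r0 * chi) / sqrt(mu)
```

    For a circular orbit (r0 = a, vr0 = 0) one full revolution corresponds to chi = 2*pi*sqrt(a), and the formula recovers the Keplerian period 2*pi*sqrt(a**3/mu).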

  6. Rankine models for time-dependent gravity spreading of terrestrial source flows over subplanar slopes

    NARCIS (Netherlands)

    Wijermars, R.; Dooley, T.P.; Jackson, M.P.A.; Hudec, M.R.

    2014-01-01

    Geological mass flows extruding from a point source include mud, lava, and salt issued from subsurface reservoirs and ice from surface feeders. The delivery of the material may occur via a salt stock, a volcanic pipe (for magma and mud flows), or a valley glacier (for ice). All these source flows

  7. Neutron generators with size scalability, ease of fabrication and multiple ion source functionalities

    Science.gov (United States)

    Elizondo-Decanini, Juan M

    2014-11-18

    A neutron generator is provided with a flat, rectilinear geometry and surface mounted metallizations. This construction provides scalability and ease of fabrication, and permits multiple ion source functionalities.

  8. Modeling tectonic heat flow and source rock maturity in the Rub' Al-Khali Basin (Saudi Arabia), with the help of GOCE satellite gravity data

    NARCIS (Netherlands)

    Abdul Fattah, R.; Meekes, S.; Bouman, J.; Ebbing, J.; Haagmans, R.

    2014-01-01

    A 3D basin modeling study was carried out to reconstruct the regional heat flow and source rock maturity in the Rub'al-Khali basin. Gravity gradient data from the GOCE satellite were used to model deep structures, such as the Moho interface. Tectonic heat flow was modeled using the GOCE-based Moho

  9. Relationship between exposure to multiple noise sources and noise annoyance

    NARCIS (Netherlands)

    Miedema, H.M.E.

    2004-01-01

    Relationships between exposure to noise [metric: day-night level (DNL) or day-evening-night level (DENL)] from a single source (aircraft, road traffic, or railways) and annoyance based on a large international dataset have been published earlier. Also for stationary sources relationships have been

  10. Modeling water demand when households have multiple sources of water

    Science.gov (United States)

    Coulibaly, Lassina; Jakus, Paul M.; Keith, John E.

    2014-07-01

    A significant portion of the world's population lives in areas where public water delivery systems are unreliable and/or deliver poor quality water. In response, people have developed important alternatives to publicly supplied water. To date, most water demand research has been based on single-equation models for a single source of water, with very few studies that have examined water demand from two sources of water (where all nonpublic system water sources have been aggregated into a single demand). This modeling approach leads to two outcomes. First, the demand models do not capture the full range of alternatives, so the true economic relationship among the alternatives is obscured. Second, and more seriously, economic theory predicts that demand for a good becomes more price-elastic as the number of close substitutes increases. If researchers artificially limit the number of alternatives studied to something less than the true number, the price elasticity estimate may be biased downward. This paper examines water demand in a region with near universal access to piped water, but where system reliability and quality is such that many alternative sources of water exist. In extending the demand analysis to four sources of water, we are able to (i) demonstrate why households choose the water sources they do, (ii) provide a richer description of the demand relationships among sources, and (iii) calculate own-price elasticity estimates that are more elastic than those generally found in the literature.
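
    The elasticity language above has a compact empirical counterpart: in a log-log demand regression ln q = b0 + b1 ln p, the slope b1 is the own-price elasticity. A toy single-equation sketch with synthetic data (all values are illustrative assumptions; the study itself estimates a four-source demand system):

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.uniform(1.0, 5.0, 200)                         # prices
q = 10.0 * p**-1.4 * np.exp(rng.normal(0.0, 0.05, 200))  # true elasticity -1.4

# OLS on the log-log form; b1 estimates the own-price elasticity.
X = np.column_stack([np.ones_like(p), np.log(p)])
b0, b1 = np.linalg.lstsq(X, np.log(q), rcond=None)[0]
print(round(b1, 2))  # close to -1.4
```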

  11. Multiple station beamline at an undulator x-ray source

    DEFF Research Database (Denmark)

    Als-Nielsen, J.; Freund, A.K.; Grübel, G.

    1994-01-01

    The undulator X-ray source is an ideal source for many applications: the beam is brilliant, highly collimated in all directions, quasi-monochromatic, pulsed and linearly polarized. Such a precious source can feed several independently operated instruments by utilizing a downstream series of X-ray transparent monochromator crystals. Diamond in particular is an attractive monochromator as it is rather X-ray transparent and can be fabricated to a high degree of crystal perfection. Moreover, it has a very high heat conductivity and a rather small thermal expansion so the beam X-ray heat load problem

  12. Multiple approaches to microbial source tracking in tropical northern Australia

    KAUST Repository

    Neave, Matthew; Luter, Heidi; Padovan, Anna; Townsend, Simon; Schobben, Xavier; Gibb, Karen

    2014-01-01

    , other potential inputs, such as urban rivers and drains, and surrounding beaches, and used genetic fingerprints from E. coli and enterococci communities, fecal markers and 454 pyrosequencing to track contamination sources. A sewage effluent outfall

  13. Metasurface Cloak Performance Near-by Multiple Line Sources and PEC Cylindrical Objects

    DEFF Research Database (Denmark)

    Arslanagic, Samel; Yatman, William H.; Pehrson, Signe

    2014-01-01

    The performance/robustness of metasurface cloaks to a complex field environment which may represent a realistic scenario of radiating sources is presently reported. Attention is devoted to the cloak operation near-by multiple line sources and multiple perfectly electrically conducting cylinders. ...

  14. Speculative Attacks with Multiple Sources of Public Information

    OpenAIRE

    Cornand, Camille; Heinemann, Frank

    2005-01-01

    We propose a speculative attack model in which agents receive multiple public signals. It is characterised by its focus on an informational structure, which sets free from the strict separation between public information and private information. Diverse pieces of public information can be taken into account differently by players and are likely to lead to different appreciations ex post. This process defines players’ private value. The main result is to show that equilibrium uniqueness depend...

  15. Synergies of multiple remote sensing data sources for REDD+ monitoring

    NARCIS (Netherlands)

    Sy, de V.; Herold, M.; Achard, F.; Asner, G.P.; Held, A.; Kellndorfer, J.; Verbesselt, J.

    2012-01-01

    Remote sensing technologies can provide objective, practical and cost-effective solutions for developing and maintaining REDD+ monitoring systems. This paper reviews the potential and status of available remote sensing data sources with a focus on different forest information products and synergies

  16. Using the Multiplicative Schwarz Alternating Algorithm (MSAA) for Solving the Large Linear System of Equations Related to Global Gravity Field Recovery up to Degree and Order 120

    Science.gov (United States)

    Safari, A.; Sharifi, M. A.; Amjadiparvar, B.

    2010-05-01

    The GRACE mission has substantiated the low-low satellite-to-satellite tracking (LL-SST) concept. The LL-SST configuration can be combined with the previously realized high-low SST concept of the CHAMP mission to provide much higher accuracy. The line-of-sight (LOS) acceleration difference between the GRACE satellite pair is the most widely used observable for mapping the global gravity field of the Earth in terms of spherical harmonic coefficients. In this paper, mathematical formulae for LOS acceleration difference observations have been derived and the corresponding linear system of equations has been set up for spherical harmonics up to degree and order 120. The total number of unknowns is 14641. Such a linear system can be solved with iterative or direct solvers. However, the runtime of direct methods, or of iterative solvers without a suitable preconditioner, increases tremendously, which is why a more sophisticated method is needed to solve linear systems with a large number of unknowns. The multiplicative variant of the Schwarz alternating algorithm is a domain decomposition method that splits the normal matrix of the system into several smaller overlapping submatrices. In each iteration step, it solves the linear systems with the matrices obtained from the splitting successively, which reduces both runtime and memory requirements drastically. In this paper we propose the Multiplicative Schwarz Alternating Algorithm (MSAA) for solving the large linear system of gravity field recovery. The proposed algorithm has been tested on the International Association of Geodesy (IAG)-simulated data of the GRACE mission. The achieved results indicate the validity and efficiency of the proposed algorithm in solving the linear system of equations in terms of both accuracy and runtime. Keywords: Gravity field recovery, Multiplicative Schwarz Alternating Algorithm, Low
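
    A minimal sketch of the multiplicative Schwarz idea on a small symmetric positive-definite system (toy sizes, matrix, and splitting are assumptions; the paper's normal matrix has 14641 unknowns):

```python
import numpy as np

def multiplicative_schwarz(A, b, blocks, sweeps=50):
    """blocks: overlapping index sets; each sweep solves every local
    subproblem in turn, always using the freshest residual (the
    'multiplicative' variant, as opposed to additive/Jacobi-style)."""
    x = np.zeros_like(b)
    for _ in range(sweeps):
        for idx in blocks:
            r = b - A @ x                                   # current residual
            x[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])
    return x

n = 40
A = 2.0 * np.eye(n) + 0.3 * np.eye(n, k=1) + 0.3 * np.eye(n, k=-1)  # SPD
b = np.ones(n)
blocks = [np.arange(0, 25), np.arange(15, 40)]   # overlapping subdomains
x = multiplicative_schwarz(A, b, blocks)
```

    The overlap between the two index sets is what propagates information across subdomains; with no overlap the sweep degenerates to block Gauss-Seidel.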

  17. Multiple-source multiple-harmonic active vibration control of variable section cylindrical structures: A numerical study

    Science.gov (United States)

    Liu, Jinxin; Chen, Xuefeng; Gao, Jiawei; Zhang, Xingwu

    2016-12-01

    Air vehicles, space vehicles and underwater vehicles, the cabins of which can be viewed as variable section cylindrical structures, have multiple rotational vibration sources (e.g., engines, propellers, compressors and motors), making the spectrum of noise multiple-harmonic. The suppression of such noise has been a focus of interest in the field of active vibration control (AVC). In this paper, a multiple-source multiple-harmonic (MSMH) active vibration suppression algorithm with feed-forward structure is proposed based on reference amplitude rectification and the conjugate gradient method (CGM). An AVC simulation scheme called finite element model in-loop simulation (FEMILS) is also proposed for rapid algorithm verification. Numerical studies of AVC are conducted on a variable section cylindrical structure based on the proposed MSMH algorithm and FEMILS scheme. It can be seen from the numerical studies that: (1) the proposed MSMH algorithm can individually suppress each component of the multiple-harmonic noise with a unified and improved convergence rate; (2) the FEMILS scheme is convenient and straightforward for multiple-source simulations with an acceptable loop time. Moreover, the simulation follows a procedure similar to real-life control and can be easily extended to a physical model platform.
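
    The per-harmonic suppression idea can be illustrated with a toy complex-gain adaptation. This is a generic gradient scheme on each harmonic's residual, not the paper's MSMH/CGM algorithm, and all values are illustrative assumptions:

```python
import numpy as np

# Each harmonic k is represented by a complex disturbance d[k] and a known
# complex secondary-path gain H[k]; the control weight w[k] is adapted so
# the residual e[k] = d[k] + H[k] * w[k] goes to zero, independently per
# harmonic -- the key idea behind suppressing each component separately.
d = np.array([1.0 + 0.5j, -0.3 + 0.8j, 0.2 - 0.1j])   # harmonics 1..3
H = np.array([0.8 - 0.2j, 0.5 + 0.4j, 0.9 + 0.1j])    # secondary paths
w = np.zeros(3, dtype=complex)
mu = 0.5                                              # step size
for _ in range(200):
    e = d + H * w                 # residual at each harmonic
    w -= mu * np.conj(H) * e      # gradient step on |e|^2
```

    Each residual contracts by the factor (1 - mu*|H[k]|^2) per step, so every harmonic converges at its own rate as long as mu*|H[k]|^2 < 2.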

  18. PUBLIC EXPOSURE TO MULTIPLE RF SOURCES IN GHANA.

    Science.gov (United States)

    Deatanyah, P; Abavare, E K K; Menyeh, A; Amoako, J K

    2018-03-16

    This paper describes an effort to respond to the suggestion in the World Health Organization (WHO) research agenda to better quantify potential exposure levels from a range of radiofrequency (RF) sources at 200 public-access locations in Ghana. Wide-band measurements were performed with a spectrum analyser and a log-periodic antenna using a three-point spatial averaging method. The overall results represented a maximum of 0.19% of the ICNIRP reference levels for public exposure. These results were generally lower than those found in some previous studies but were 58% (2.0 dB) greater than those found in similar work conducted in the USA. The major contributing sources of RF fields were identified to be FM broadcast and mobile base station sites. The three locations with the greatest measured RF fields could represent potential areas for epidemiological studies.
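
    Comparison against ICNIRP reference levels for several simultaneous sources is conventionally done by summing the power-density (squared-field) ratios over sources, with compliance requiring the sum to stay below 1. A sketch with illustrative values, not the paper's measurements:

```python
def exposure_quotient(levels):
    """levels: (measured_power_density, reference_level) pairs, one per
    source; the summed ratio must not exceed 1 for compliance."""
    return sum(s / ref for s, ref in levels)

# FM broadcast + mobile base station, illustrative values in W/m^2;
# a result like "0.19% of the reference level" corresponds to q = 0.0019.
q = exposure_quotient([(0.004, 2.0), (0.009, 10.0)])
print(q <= 1.0)  # compliant
```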

  19. Accommodating multiple illumination sources in an imaging colorimetry environment

    Science.gov (United States)

    Tobin, Kenneth W., Jr.; Goddard, James S., Jr.; Hunt, Martin A.; Hylton, Kathy W.; Karnowski, Thomas P.; Simpson, Marc L.; Richards, Roger K.; Treece, Dale A.

    2000-03-01

    Researchers at the Oak Ridge National Laboratory have been developing a method for measuring color quality in textile products using a tri-stimulus color camera system. Initial results of the Imaging Tristimulus Colorimeter (ITC) were reported during 1999. These results showed that the projection onto convex sets (POCS) approach to color estimation could be applied to complex printed patterns on textile products with high accuracy and repeatability. Image-based color sensors used for on-line measurement are not colorimetric by nature and require a non-linear transformation of the component colors based on the spectral properties of the incident illumination, imaging sensor, and the actual textile color. Our earlier work reports these results for a broad-band, smoothly varying D65 standard illuminant. To move the measurement to the on-line environment with continuously manufactured textile webs, the illumination source becomes problematic. The spectral content of these light sources varies substantially from the D65 standard illuminant and can greatly impact the measurement performance of the POCS system. Although absolute color measurements are difficult to make under different illumination, referential measurements to monitor color drift provide a useful indication of product quality. Modifications to the ITC system have been implemented to enable the study of different light sources. These results and the subsequent analysis of relative color measurements will be reported for textile products.

  20. Multiple sources of boron in urban surface waters and groundwaters

    Energy Technology Data Exchange (ETDEWEB)

    Hasenmueller, Elizabeth A., E-mail: eahasenm@wustl.edu; Criss, Robert E.

    2013-03-01

    Previous studies attribute abnormal boron (B) levels in streams and groundwaters to wastewater and fertilizer inputs. This study shows that municipal drinking water used for lawn irrigation contributes substantial non-point loads of B and other chemicals (S-species, Li, and Cu) to surface waters and shallow groundwaters in the St. Louis, Missouri, area. Background levels and potential B sources were characterized by analysis of lawn and street runoff, streams, rivers, springs, local rainfall, wastewater influent and effluent, and fertilizers. Urban surface waters and groundwaters are highly enriched in B (to 250 μg/L) compared to background levels found in rain and pristine, carbonate-hosted streams and springs (< 25 μg/L), but have similar concentrations (150 to 259 μg/L) compared to municipal drinking waters derived from the Missouri River. Other data including B/SO₄²⁻-S and B/Li ratios confirm major contributions from this source. Moreover, sequential samples of runoff collected during storms show that B concentrations decrease with increased discharge, proving that elevated B levels are not primarily derived from combined sewer overflows (CSOs) during flooding. Instead, non-point source B exhibits complex behavior depending on land use. In urban settings B is rapidly mobilized from lawns during “first flush” events, likely representing surficial salt residues from drinking water used to irrigate lawns, and is also associated with the baseflow fraction, likely derived from the shallow groundwater reservoir that over time accumulates B from drinking water that percolates into the subsurface. The opposite occurs in small rural watersheds, where B is leached from soils by recent rainfall and covaries with the event water fraction. Highlights: ► Boron sources and loads differ between urban and rural watersheds. ► Wastewaters are not the major boron source in small St. Louis, MO watersheds. ► Municipal drinking water used for lawn

  1. Some problems of neutron source multiplication method for site measurement technology in nuclear critical safety

    International Nuclear Information System (INIS)

    Shi Yongqian; Zhu Qingfu; Hu Dingsheng; He Tao; Yao Shigui; Lin Shenghuo

    2004-01-01

    The paper gives the experimental theory and method of the neutron source multiplication method for site measurement technology in nuclear criticality safety. The parameter actually measured by the source multiplication method is the subcritical-with-source neutron effective multiplication factor k_s, not the neutron effective multiplication factor k_eff. The experimental research has been done on the uranium solution nuclear critical safety experiment assembly. The k_s of different subcriticalities is measured by the neutron source multiplication experiment method. To obtain k_eff for the same subcriticalities, the reactivity coefficient per unit solution level is first measured by the period method and then multiplied by the difference between the critical and subcritical solution levels to give the reactivity of the subcritical solution level; k_eff is finally extracted from the reactivity formula. The effect on nuclear criticality safety and the difference between k_eff and k_s are discussed

  2. Multiple Sources of Prescription Payment and Risky Opioid Therapy Among Veterans.

    Science.gov (United States)

    Becker, William C; Fenton, Brenda T; Brandt, Cynthia A; Doyle, Erin L; Francis, Joseph; Goulet, Joseph L; Moore, Brent A; Torrise, Virginia; Kerns, Robert D; Kreiner, Peter W

    2017-07-01

    Opioid overdose and other related harms are a major source of morbidity and mortality among US Veterans, in part due to high-risk opioid prescribing. We sought to determine whether having multiple sources of payment for opioids-as a marker for out-of-system access-is associated with risky opioid therapy among veterans. We conducted a cross-sectional study examining the association between multiple sources of payment and risky opioid therapy among all individuals with Veterans Health Administration (VHA) payment for opioid analgesic prescriptions in Kentucky during fiscal year 2014-2015. Source of payment categories: (1) VHA only source of payment (sole source); (2) sources of payment were VHA and at least 1 cash payment [VHA+cash payment(s)] whether or not there was a third source of payment; and (3) at least one other noncash source: Medicare, Medicaid, or private insurance [VHA+noncash source(s)]. Our outcomes were 2 risky opioid therapies: combination opioid/benzodiazepine therapy and high-dose opioid therapy, defined as morphine equivalent daily dose ≥90 mg. Of the 14,795 individuals in the analytic sample, there were 81.9% in the sole source category, 6.6% in the VHA+cash payment(s) category, and 11.5% in the VHA+noncash source(s) category. In logistic regression, controlling for age and sex, persons with multiple payment sources had significantly higher odds of each risky opioid therapy, with those in the VHA+cash payment(s) group having significantly higher odds than those in the VHA+noncash source(s) group. Prescribers should consult the prescription monitoring program, as multiple payment sources increase the odds of risky opioid therapy.
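
    The high-dose threshold used above can be made concrete with a small morphine-equivalent calculation. The conversion factors below are commonly published values included only for illustration; a real analysis should use the factors specified by its protocol:

```python
# Morphine milligram equivalent (MME) factors -- illustrative values.
MME_FACTOR = {"morphine": 1.0, "hydrocodone": 1.0, "oxycodone": 1.5}

def medd(prescriptions):
    """Morphine equivalent daily dose from
    (drug, mg_per_dose, doses_per_day) tuples."""
    return sum(MME_FACTOR[d] * mg * n for d, mg, n in prescriptions)

rx = [("oxycodone", 20.0, 3), ("hydrocodone", 10.0, 2)]
total = medd(rx)              # 1.5*60 + 1.0*20 = 110 MME/day
high_dose = total >= 90.0     # flags risky high-dose therapy per the study
```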

  3. Multiple Household Water Sources and Their Use in Remote Communities With Evidence From Pacific Island Countries

    Science.gov (United States)

    Elliott, Mark; MacDonald, Morgan C.; Chan, Terence; Kearton, Annika; Shields, Katherine F.; Bartram, Jamie K.; Hadwen, Wade L.

    2017-11-01

    Global water research and monitoring typically focus on the household's "main source of drinking-water." Use of multiple water sources to meet daily household needs has been noted in many developing countries but rarely quantified or reported in detail. We gathered self-reported data using a cross-sectional survey of 405 households in eight communities of the Republic of the Marshall Islands (RMI) and five Solomon Islands (SI) communities. Over 90% of households used multiple sources, with differences in sources and uses between wet and dry seasons. Most RMI households had large rainwater tanks and rationed stored rainwater for drinking throughout the dry season, whereas most SI households collected rainwater in small pots, precluding storage across seasons. Use of a source for cooking was strongly positively correlated with use for drinking, whereas use for cooking was negatively correlated or uncorrelated with nonconsumptive uses (e.g., bathing). Dry season water uses implied greater risk of water-borne disease, with fewer (frequently zero) handwashing sources reported and more unimproved sources consumed. Use of multiple sources is fundamental to household water management and feasible to monitor using electronic survey tools. We contend that recognizing multiple water sources can greatly improve understanding of household-level and community-level climate change resilience, that use of multiple sources confounds health impact studies of water interventions, and that incorporating multiple sources into water supply interventions can yield heretofore-unrealized benefits. We propose that failure to consider multiple sources undermines the design and effectiveness of global water monitoring, data interpretation, implementation, policy, and research.

  4. A robust poverty profile for Brazil using multiple data sources

    Directory of Open Access Journals (Sweden)

    Ferreira Francisco H. G.

    2003-01-01

    Full Text Available This paper presents a poverty profile for Brazil, based on three different sources of household data for 1996. We use PPV consumption data to estimate poverty and indigence lines. "Contagem" data is used to allow for an unprecedented refinement of the country's poverty map. Poverty measures and shares are also presented for a wide range of population subgroups, based on the PNAD 1996, with new adjustments for imputed rents and spatial differences in cost of living. Robustness of the profile is verified with respect to different poverty lines, spatial price deflators, and equivalence scales. Overall poverty incidence ranges from 23% with respect to an indigence line to 45% with respect to a more generous poverty line. More importantly, however, poverty is found to vary significantly across regions and city sizes, with rural areas, small and medium towns and the metropolitan peripheries of the North and Northeast regions being poorest.

  5. Exploiting semantic linkages among multiple sources for semantic information retrieval

    Science.gov (United States)

    Li, JianQiang; Yang, Ji-Jiang; Liu, Chunchen; Zhao, Yu; Liu, Bo; Shi, Yuliang

    2014-07-01

    The vision of the Semantic Web is to build a global Web of machine-readable data to be consumed by intelligent applications. As the first step to make this vision come true, the initiative of linked open data has fostered many novel applications aimed at improving data accessibility in the public Web. By contrast, the enterprise environment is so different from the public Web that most potentially usable business information originates in an unstructured form (typically in free text), which poses a challenge for the adoption of semantic technologies in the enterprise environment. Considering that the business information in a company is highly specific and centred around a set of commonly used concepts, this paper describes a pilot study to migrate the concept of linked data into the development of a domain-specific application, i.e. the vehicle repair support system. The set of commonly used concepts, including car part names and the phenomenon terms used in car repair, are employed to build the linkage between data and documents distributed among different sources, leading to the fusion of documents and data across source boundaries. Then, we describe the approaches of semantic information retrieval to consume these linkages for value creation for companies. The experiments on two real-world data sets show that the proposed approaches outperform the best baseline by 6.3-10.8% and 6.4-11.1% in terms of top-five and top-10 precision, respectively. We believe that our pilot study can serve as an important reference for the development of similar semantic applications in an enterprise environment.

  6. Search Strategy of Detector Position For Neutron Source Multiplication Method by Using Detected-Neutron Multiplication Factor

    International Nuclear Information System (INIS)

    Endo, Tomohiro

    2011-01-01

    In this paper, an alternative definition of a neutron multiplication factor, the detected-neutron multiplication factor k_det, is introduced for the neutron source multiplication method (NSM). Using k_det, a strategy for finding an appropriate detector position for NSM is also proposed. The NSM is one of the practical subcriticality measurement techniques: it requires no special equipment other than a stationary external neutron source and an ordinary neutron detector. Additionally, the NSM is based on steady-state analysis, so the technique is well suited to quasi-real-time measurement. Correction factors play an important role in accurately estimating subcriticality from the measured neutron count rates. The present paper aims to clarify how to correct the subcriticality measured by the NSM, the physical meaning of the correction factors, and how to reduce their impact by setting the neutron detector at an appropriate position

  7. PDEPTH—A computer program for the geophysical interpretation of magnetic and gravity profiles through Fourier filtering, source-depth analysis, and forward modeling

    Science.gov (United States)

    Phillips, Jeffrey D.

    2018-01-10

    PDEPTH is an interactive, graphical computer program used to construct interpreted geological source models for observed potential-field geophysical profile data. The current version of PDEPTH has been adapted to the Windows platform from an earlier DOS-based version. The input total-field magnetic anomaly and vertical gravity anomaly profiles can be filtered to produce derivative products such as reduced-to-pole magnetic profiles, pseudogravity profiles, pseudomagnetic profiles, and upward-or-downward-continued profiles. A variety of source-location methods can be applied to the original and filtered profiles to estimate (and display on a cross section) the locations and physical properties of contacts, sheet edges, horizontal line sources, point sources, and interface surfaces. Two-and-a-half-dimensional source bodies having polygonal cross sections can be constructed using a mouse and keyboard. These bodies can then be adjusted until the calculated gravity and magnetic fields of the source bodies are close to the observed profiles. Auxiliary information such as the topographic surface, bathymetric surface, seismic basement, and geologic contact locations can be displayed on the cross section using optional input files. Test data files, used to demonstrate the source location methods in the report, and several utility programs are included.

  8. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables

  9. Lower dimensional gravity

    International Nuclear Information System (INIS)

    Brown, J.D.

    1988-01-01

    This book addresses the subject of gravity theories in two and three spacetime dimensions. The prevailing philosophy is that lower dimensional models of gravity provide a useful arena for developing new ideas and insights, which are applicable to four dimensional gravity. The first chapter consists of a comprehensive introduction to both two and three dimensional gravity, including a discussion of their basic structures. In the second chapter, the asymptotic structure of three dimensional Einstein gravity with a negative cosmological constant is analyzed. The third chapter contains a treatment of the effects of matter sources in classical two dimensional gravity. The fourth chapter gives a complete analysis of particle pair creation by electric and gravitational fields in two dimensions, and the resulting effect on the cosmological constant

  10. Gravity interpretation via EULDPH

    International Nuclear Information System (INIS)

    Ebrahimzadeh Ardestani, V.

    2003-01-01

    Euler's homogeneity equation for determining the coordinates of a source body, especially for estimating its depth (EULDPH), is discussed in this paper. The method is applied to synthetic and high-resolution real data such as gradiometric or microgravity data. Low-quality gravity data, especially in areas with complex geological structure, have rarely been used. The Bouguer gravity anomalies are computed from absolute gravity data after the required corrections, and the Bouguer anomaly is then transformed to a residual gravity anomaly. The gravity gradients are estimated from the residual anomaly values and, using EULDPH, the coordinates of the perturbing body are determined. Two field examples are presented: one in the east of Tehran (Mard Abad), where we would like to determine the location of the anomaly (hydrocarbon), and another in the south-east of Iran close to the border with Afghanistan (Nosrat Abad), where we are exploring for chromite
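    The Euler-deconvolution idea behind EULDPH can be sketched on synthetic data: for a field g that is homogeneous of degree -N about its source at r0, Euler's equation (r - r0) . grad(g) = -N g becomes a linear system for the source coordinates. The sketch below uses a buried point mass (structural index N = 2) with invented coordinates, not data from the paper:

    ```python
    # Minimal Euler-deconvolution sketch: recover the position of a buried
    # point source from its vertical-gravity anomaly on the plane z = 0.
    import numpy as np

    x0, y0, depth = 10.0, -5.0, 8.0           # true source (hypothetical units)
    x = np.linspace(-50.0, 50.0, 201)
    y = np.linspace(-50.0, 50.0, 201)
    X, Y = np.meshgrid(x, y, indexing="ij")

    dx, dy, dz = X - x0, Y - y0, depth         # vertical offset source -> plane
    rho2 = dx**2 + dy**2 + dz**2
    g = dz / rho2**1.5                         # scaled point-mass vertical attraction

    # Gradients at the observation plane: horizontal by finite differences,
    # vertical analytically for the point source
    gx = np.gradient(g, x, axis=0)
    gy = np.gradient(g, y, axis=1)
    gz = 1.0 / rho2**1.5 - 3.0 * dz**2 / rho2**2.5

    # Euler's equation rearranged:  x0*gx + y0*gy + z0*gz = x*gx + y*gy + z*gz + N*g
    N = 2.0
    A = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
    b = (X * gx + Y * gy + 0.0 * gz + N * g).ravel()   # observation height z = 0
    x0_est, y0_est, z0_est = np.linalg.lstsq(A, b, rcond=None)[0]
    print(x0_est, y0_est, z0_est)              # z0_est is negative: source below z = 0
    ```

    In a real EULDPH run the solution is computed in sliding windows over the profile, and the structural index must be chosen to match the assumed source geometry.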

  11. Joint part-of-speech and dependency projection from multiple sources

    DEFF Research Database (Denmark)

    Johannsen, Anders Trærup; Agic, Zeljko; Søgaard, Anders

    2016-01-01

    for multiple tasks from multiple source languages, relying on parallel corpora available for hundreds of languages. When training POS taggers and dependency parsers on jointly projected POS tags and syntactic dependencies using our algorithm, we obtain better performance than a standard approach on 20...

  12. Genetic diversity and antimicrobial resistance of Escherichia coli from human and animal sources uncovers multiple resistances from human sources.

    Directory of Open Access Journals (Sweden)

    A Mark Ibekwe

    Full Text Available Escherichia coli are widely used as indicators of fecal contamination, and in some cases to identify host sources of fecal contamination in surface water. Prevalence, genetic diversity and antimicrobial susceptibility were determined for 600 generic E. coli isolates obtained from surface water and sediment from creeks and channels along the middle Santa Ana River (MSAR) watershed of southern California, USA, during a 12 month study. Evaluation of E. coli populations along the creeks and channels showed that E. coli were more prevalent in sediment compared to surface water. E. coli populations were not significantly different (P = 0.05) between urban runoff sources and agricultural sources; however, E. coli genotypes determined by pulsed-field gel electrophoresis (PFGE) were less diverse in the agricultural sources than in urban runoff sources. PFGE also showed that E. coli populations in surface water were more diverse than in the sediment, suggesting isolates in sediment may be dominated by clonal populations. Twenty-four percent (144 isolates) of the 600 isolates exhibited resistance to more than one antimicrobial agent. Most multiple resistances were associated with inputs from urban runoff and involved the antimicrobials rifampicin, tetracycline, and erythromycin. The occurrence of a greater number of E. coli with multiple antibiotic resistances from urban runoff sources than agricultural sources in this watershed provides useful evidence in planning strategies for water quality management and public health protection.

  13. The effect of energy distribution of external source on source multiplication in fast assemblies

    International Nuclear Information System (INIS)

    Karam, R.A.; Vakilian, M.

    1976-02-01

    The essence of this study is the effect of the energy distribution of a source on the detection rate as a function of K effective in fast assemblies. This effectiveness, as a function of K, was studied in a fission chamber, using the ABN cross-section set and the Mach 1 code. It was found that with a source which has a fission spectrum, the reciprocal count rate versus mass relationship is linear down to a K effective of 0.59. For a thermal source, linearity was never achieved. (author)
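    The linearity being tested above is the classic inverse-multiplication (1/M) relationship: when the source effectiveness stays constant as K rises, the count rate scales like C = S/(1 - k), so 1/C is linear in k and extrapolates to zero at criticality. A thermal source in a fast assembly violates the constant-effectiveness assumption, which is why the linearity fails. An idealized point-kinetics illustration:

    ```python
    # Idealized 1/M demonstration: count rates generated from C = S/(1 - k),
    # then fitted to show that 1/C extrapolates to zero at k = 1.
    import numpy as np

    S = 1.0e4                                   # source-driven count rate scale (assumed)
    k = np.array([0.60, 0.70, 0.80, 0.90, 0.95, 0.99])
    C = S / (1.0 - k)                           # subcritical multiplication
    inv_M = 1.0 / C

    slope, intercept = np.polyfit(k, inv_M, 1)  # straight-line fit of 1/C vs k
    k_extrapolated = -intercept / slope
    print(k_extrapolated)                       # criticality at k = 1
    ```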

  14. Investigating sources and pathways of perfluoroalkyl acids (PFAAs) in aquifers in Tokyo using multiple tracers

    International Nuclear Information System (INIS)

    Kuroda, Keisuke; Murakami, Michio; Oguma, Kumiko; Takada, Hideshige; Takizawa, Satoshi

    2014-01-01

    We employed a multi-tracer approach to investigate sources and pathways of perfluoroalkyl acids (PFAAs) in urban groundwater, based on 53 groundwater samples taken from confined aquifers and unconfined aquifers in Tokyo. While the median concentrations of groundwater PFAAs were several ng/L, the maximum concentrations of perfluorooctane sulfonate (PFOS, 990 ng/L), perfluorooctanoate (PFOA, 1800 ng/L) and perfluorononanoate (PFNA, 620 ng/L) in groundwater were several times higher than those of wastewater and street runoff reported in the literature. PFAAs were more frequently detected than sewage tracers (carbamazepine and crotamiton), presumably owing to the higher persistence of PFAAs, the multiple sources of PFAAs beyond sewage (e.g., surface runoff, point sources) and the formation of PFAAs from their precursors. Use of multiple methods of source apportionment including principal component analysis–multiple linear regression (PCA–MLR) and perfluoroalkyl carboxylic acid ratio analysis highlighted sewage and point sources as the primary sources of PFAAs in the most severely polluted groundwater samples, with street runoff being a minor source (44.6% sewage, 45.7% point sources and 9.7% street runoff, by PCA–MLR). Tritium analysis indicated that, while young groundwater (recharged during or after the 1970s, when PFAAs were already in commercial use) in shallow aquifers (< 50 m depth) was naturally highly vulnerable to PFAA pollution, PFAAs were also found in old groundwater (recharged before the 1950s, when PFAAs were not in use) in deep aquifers (50–500 m depth). This study demonstrated the utility of multiple uses of tracers (pharmaceuticals and personal care products; PPCPs, tritium) and source apportionment methods in investigating sources and pathways of PFAAs in multiple aquifer systems. - Highlights: • Aquifers in Tokyo had high levels of perfluoroalkyl acids (up to 1800 ng/L). • PFAAs were more frequently detected than sewage

  15. Investigating sources and pathways of perfluoroalkyl acids (PFAAs) in aquifers in Tokyo using multiple tracers

    Energy Technology Data Exchange (ETDEWEB)

    Kuroda, Keisuke, E-mail: keisukekr@gmail.com [Department of Urban Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-8656 (Japan); Murakami, Michio [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro, Tokyo 153-8505 (Japan); Oguma, Kumiko [Department of Urban Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-8656 (Japan); Takada, Hideshige [Laboratory of Organic Geochemistry (LOG), Institute of Symbiotic Science and Technology, Tokyo University of Agriculture and Technology, Fuchu, Tokyo 183-8509 (Japan); Takizawa, Satoshi [Department of Urban Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-8656 (Japan)

    2014-08-01

    We employed a multi-tracer approach to investigate sources and pathways of perfluoroalkyl acids (PFAAs) in urban groundwater, based on 53 groundwater samples taken from confined aquifers and unconfined aquifers in Tokyo. While the median concentrations of groundwater PFAAs were several ng/L, the maximum concentrations of perfluorooctane sulfonate (PFOS, 990 ng/L), perfluorooctanoate (PFOA, 1800 ng/L) and perfluorononanoate (PFNA, 620 ng/L) in groundwater were several times higher than those of wastewater and street runoff reported in the literature. PFAAs were more frequently detected than sewage tracers (carbamazepine and crotamiton), presumably owing to the higher persistence of PFAAs, the multiple sources of PFAAs beyond sewage (e.g., surface runoff, point sources) and the formation of PFAAs from their precursors. Use of multiple methods of source apportionment including principal component analysis–multiple linear regression (PCA–MLR) and perfluoroalkyl carboxylic acid ratio analysis highlighted sewage and point sources as the primary sources of PFAAs in the most severely polluted groundwater samples, with street runoff being a minor source (44.6% sewage, 45.7% point sources and 9.7% street runoff, by PCA–MLR). Tritium analysis indicated that, while young groundwater (recharged during or after the 1970s, when PFAAs were already in commercial use) in shallow aquifers (< 50 m depth) was naturally highly vulnerable to PFAA pollution, PFAAs were also found in old groundwater (recharged before the 1950s, when PFAAs were not in use) in deep aquifers (50–500 m depth). This study demonstrated the utility of multiple uses of tracers (pharmaceuticals and personal care products; PPCPs, tritium) and source apportionment methods in investigating sources and pathways of PFAAs in multiple aquifer systems. - Highlights: • Aquifers in Tokyo had high levels of perfluoroalkyl acids (up to 1800 ng/L). • PFAAs were more frequently detected than sewage

  16. A simple preparative free-flow electrophoresis joined with gratis gravity: I. Gas cushion injector and self-balance collector instead of multiple channel pump.

    Science.gov (United States)

    Chen, Su; Palmer, James F; Zhang, Wei; Shao, Jing; Li, Si; Fan, Liu-Yin; Sun, Ren; Dong, Yu-Chao; Cao, Cheng-Xi

    2009-06-01

    This paper describes a novel free-flow electrophoresis (FFE) device that uses gratis gravity, a gas cushion injector (GCI) and a self-balance collector in place of a multiple-channel pump, for the purpose of preparative purification. The FFE was evaluated by systematic experiments. The results show that (i) even though a one-channel peristaltic pump is used to drive the background buffer, there is still a stable flow in the FFE chamber; (ii) the stable flow is induced by the gravity-induced pressure due to the difference of buffer surfaces in the GCI and the self-balance collector; (iii) the pulsed flow of background buffer induced by the peristaltic pump is greatly reduced by the GCI owing to the good compressibility of the included air; (iv) the FFE can be used for zone electrophoretic separation of amino acids; (v) simultaneous sample injection through up to 20 inlets and five- to tenfold condensation of amino acids can be achieved by combining the FFE device with the method of moving reaction boundary. To the best of the authors' knowledge, FFE has not previously been used for such separation and condensation of amino acids. The results achieved in this paper have evident significance for the development of preparative FFE.

  17. Observational constraints on the physical nature of submillimetre source multiplicity: chance projections are common

    Science.gov (United States)

    Hayward, Christopher C.; Chapman, Scott C.; Steidel, Charles C.; Golob, Anneya; Casey, Caitlin M.; Smith, Daniel J. B.; Zitrin, Adi; Blain, Andrew W.; Bremer, Malcolm N.; Chen, Chian-Chou; Coppin, Kristen E. K.; Farrah, Duncan; Ibar, Eduardo; Michałowski, Michał J.; Sawicki, Marcin; Scott, Douglas; van der Werf, Paul; Fazio, Giovanni G.; Geach, James E.; Gurwell, Mark; Petitpas, Glen; Wilner, David J.

    2018-05-01

    Interferometric observations have demonstrated that a significant fraction of single-dish submillimetre (submm) sources are blends of multiple submm galaxies (SMGs), but the nature of this multiplicity, i.e. whether the galaxies are physically associated or chance projections, has not been determined. We performed spectroscopy of 11 SMGs in six multicomponent submm sources, obtaining spectroscopic redshifts for nine of them. For an additional two component SMGs, we detected continuum emission but no obvious features. We supplement our observed sources with four single-dish submm sources from the literature. This sample allows us to statistically constrain the physical nature of single-dish submm source multiplicity for the first time. In three (3/7, or 43 +39/-33 per cent at 95 per cent confidence) of the single-dish sources for which the nature of the blending is unambiguous, the components for which spectroscopic redshifts are available are physically associated, whereas 4/7 (57 +33/-39 per cent) have at least one unassociated component. When components whose spectra exhibit continuum but no features and for which the photometric redshift is significantly different from the spectroscopic redshift of the other component are also considered, 6/9 (67 +26/-37 per cent) of the single-dish sources are comprised of at least one unassociated component SMG. The nature of the multiplicity of one single-dish source is ambiguous. We conclude that physically associated systems and chance projections both contribute to the multicomponent single-dish submm source population. This result contradicts the conventional wisdom that bright submm sources are solely a result of merger-induced starbursts, as blending of unassociated galaxies is also important.
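    The asymmetric uncertainties quoted above (e.g. 3/7 = 43 +39/-33 per cent at 95 per cent confidence) are consistent with exact binomial (Clopper-Pearson) intervals for such small samples. A stdlib-only sketch that reproduces the 3/7 case by bisecting the binomial tail probabilities:

    ```python
    # Exact (Clopper-Pearson) binomial confidence interval via bisection.
    from math import comb

    def binom_tail_ge(k, n, p):
        """P(X >= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def clopper_pearson(k, n, alpha=0.05):
        def bisect(f, target):
            lo, hi = 0.0, 1.0
            for _ in range(100):            # P(X >= k | p) is increasing in p
                mid = 0.5 * (lo + hi)
                if f(mid) < target:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)
        lower = 0.0 if k == 0 else bisect(lambda p: binom_tail_ge(k, n, p), alpha / 2)
        upper = 1.0 if k == n else bisect(lambda p: binom_tail_ge(k + 1, n, p), 1 - alpha / 2)
        return lower, upper

    low, high = clopper_pearson(3, 7)
    print(round(low, 3), round(high, 3))    # roughly 0.10 and 0.82, matching +39/-33
    ```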

  18. Massive Gravity

    OpenAIRE

    de Rham, Claudia

    2014-01-01

    We review recent progress in massive gravity. We start by showing how different theories of massive gravity emerge from a higher-dimensional theory of general relativity, leading to the Dvali–Gabadadze–Porrati model (DGP), cascading gravity, and ghost-free massive gravity. We then explore their theoretical and phenomenological consistency, proving the absence of Boulware–Deser ghosts and reviewing the Vainshtein mechanism and the cosmological solutions in these models. Finally, we present alt...

  19. Use of ultrasonic array method for positioning multiple partial discharge sources in transformer oil.

    Science.gov (United States)

    Xie, Qing; Tao, Junhan; Wang, Yongqiang; Geng, Jianghai; Cheng, Shuyi; Lü, Fangcheng

    2014-08-01

    Fast and accurate positioning of partial discharge (PD) sources in transformer oil is very important for the safe, stable operation of power systems because it allows timely elimination of insulation faults. There is usually more than one PD source once an insulation fault occurs in the transformer oil. This study, which has both theoretical and practical significance, proposes a method of identifying multiple PD sources in the transformer oil. The method combines the two-sided correlation transformation algorithm in the broadband signal focusing and the modified Gerschgorin disk estimator. The method of classification of multiple signals is used to determine the directions of arrival of signals from multiple PD sources. The ultrasonic array positioning method is based on the multi-platform direction finding and the global optimization searching. Both the 4 × 4 square planar ultrasonic sensor array and the ultrasonic array detection platform are built to test the method of identifying and positioning multiple PD sources. The obtained results verify the validity and the engineering practicability of this method.
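    The multiple-signal-classification (MUSIC) step used above to find the directions of arrival can be sketched for a plain uniform linear array. This is an illustrative narrowband far-field version with invented angles and noise levels, not the authors' two-sided correlation transformation / Gerschgorin-disk pipeline:

    ```python
    # Narrowband MUSIC sketch: two sources resolved by an 8-element,
    # half-wavelength-spaced uniform linear array.
    import numpy as np

    rng = np.random.default_rng(0)
    n_sensors, n_snapshots = 8, 400
    true_deg = np.array([-10.0, 25.0])          # hypothetical source directions

    def steering(deg):
        n = np.arange(n_sensors)[:, None]
        return np.exp(-1j * np.pi * n * np.sin(np.deg2rad(deg)))

    A = steering(true_deg)                      # 8 x 2 steering matrix
    s = rng.standard_normal((2, n_snapshots)) + 1j * rng.standard_normal((2, n_snapshots))
    noise = 0.1 * (rng.standard_normal((n_sensors, n_snapshots))
                   + 1j * rng.standard_normal((n_sensors, n_snapshots)))
    X = A @ s + noise

    R = X @ X.conj().T / n_snapshots            # sample covariance
    eigval, eigvec = np.linalg.eigh(R)          # eigenvalues ascending
    En = eigvec[:, : n_sensors - 2]             # noise subspace (2 sources assumed)

    grid = np.arange(-90.0, 90.0, 0.1)
    p_music = 1.0 / np.linalg.norm(En.conj().T @ steering(grid), axis=0) ** 2

    est = []                                     # pick the two largest, well-separated peaks
    for idx in np.argsort(p_music)[::-1]:
        if all(abs(grid[idx] - e) > 5.0 for e in est):
            est.append(grid[idx])
        if len(est) == 2:
            break
    print(sorted(est))
    ```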

  20. Multiple sclerosis: patients’ information sources and needs on disease symptoms and management

    Directory of Open Access Journals (Sweden)

    Albert I Matti

    2010-06-01

    Full Text Available Albert I Matti1, Helen McCarl2, Pamela Klaer2, Miriam C Keane1, Celia S Chen11Department of Ophthalmology, Flinders Medical Centre and Flinders University, Bedford Park, SA, Australia; 2The Multiple Sclerosis Society of South Australia and Northern Territory, Klemzig, SA, AustraliaObjective: To investigate the current information sources of patients with multiple sclerosis (MS in the early stages of their disease and to identify patients’ preferred source of information. The relative amounts of information from the different sources were also compared.Methods: Participants at a newly diagnosed information session organized by the Multiple Sclerosis Society of South Australia were invited to complete a questionnaire. Participants were asked to rate on a visual analog scale how much information they had received about MS and optic neuritis from different information sources and how much information they would like to receive from each of the sources.Results: A close to ideal amount of information is being provided by the MS society and MS specialist nurses. There is a clear deficit between what information patients are currently receiving and the amount of information they actually want from various sources. Patients wish to receive significantly more information from treating general practitioners, eye specialists, neurologists, and education sessions. Patients have identified less than adequate information received on optic neuritis from all sources.Conclusion: This study noted a clear information deficit regarding MS from all sources. This information deficit is more pronounced in relation to optic neuritis and needs to be addressed in the future.Practice implications: More patient information and counselling needs to be provided to MS patients even at early stages of their disease, especially in relation to management of disease relapse.Keywords: information sources, information needs, MS patients, optic neuritis

  1. Mobility and Sector-specific Effects of Changes in Multiple Sources ...

    African Journals Online (AJOL)

    Using the second and third Cameroon household consumption surveys, this study examined mobility and sector-specific effects of changes in multiple sources of deprivation in Cameroon. Results indicated that between 2001 and 2007, deprivations associated with human capital and labour capital reduced, while ...

  2. Transfer functions of double- and multiple-cavity Fabry-Perot filters driven by Lorentzian sources.

    Science.gov (United States)

    Marti, J; Capmany, J

    1996-12-20

    We derive expressions for the transfer functions of double- and multiple-cavity Fabry-Perot filters driven by laser sources with Lorentzian spectrum. These are of interest because of their applications in sensing and channel filtering in optical frequency-division multiplexing networks.
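    The effect of a Lorentzian source can be conveyed with a single-cavity example: the measured response is the ideal Airy transmission smeared by the source lineshape. This is a numerical sketch with invented parameters, not the closed-form double- and multiple-cavity expressions derived in the paper:

    ```python
    # Airy transmission of one Fabry-Perot cavity, convolved with a
    # normalized Lorentzian source lineshape.
    import numpy as np

    fsr = 100.0                  # free spectral range (arbitrary frequency units)
    coef_finesse = 200.0         # coefficient of finesse (assumed)
    dnu = 2.0                    # Lorentzian FWHM of the source (assumed)

    f = np.linspace(-fsr, fsr, 4001)
    airy = 1.0 / (1.0 + coef_finesse * np.sin(np.pi * f / fsr) ** 2)

    lor = (dnu / (2 * np.pi)) / (f**2 + (dnu / 2) ** 2)
    lor /= lor.sum()             # discrete normalisation so convolution preserves area

    seen = np.convolve(airy, lor, mode="same")   # source-averaged transfer function
    print(airy.max(), seen.max())                # the Lorentzian wings lower the peak
    ```

    Broadening the source relative to the cavity resonance lowers the effective peak transmission and fills in the stopband, which is why the source linewidth matters for channel filtering.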

  3. Organizational Communication in Emergencies: Using Multiple Channels and Sources to Combat Noise and Capture Attention

    Science.gov (United States)

    Stephens, Keri K.; Barrett, Ashley K.; Mahometa, Michael J.

    2013-01-01

    This study relies on information theory, social presence, and source credibility to uncover what best helps people grasp the urgency of an emergency. We surveyed a random sample of 1,318 organizational members who received multiple notifications about a large-scale emergency. We found that people who received 3 redundant messages coming through at…

  4. Reading on the World Wide Web: Dealing with conflicting information from multiple sources

    NARCIS (Netherlands)

    Van Strien, Johan; Brand-Gruwel, Saskia; Boshuizen, Els

    2011-01-01

    Van Strien, J. L. H., Brand-Gruwel, S., & Boshuizen, H. P. A. (2011, August). Reading on the World Wide Web: Dealing with conflicting information from multiple sources. Poster session presented at the biannual conference of the European Association for Research on Learning and Instruction, Exeter,

  5. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array

    Directory of Open Access Journals (Sweden)

    Yankui Zhang

    2018-05-01

    Full Text Available Direct position determination (DPD) is currently a hot topic in wireless localization research as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have an insufficient degree of freedom and limited estimation accuracy. To improve the DPD accuracy, this paper introduces a coprime array to the position model of multiple non-circular sources with a moving array. To maximize the advantages of this coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and converge the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramér–Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but can also improve the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.

  6. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array.

    Science.gov (United States)

    Zhang, Yankui; Ba, Bin; Wang, Daming; Geng, Wei; Xu, Haiyun

    2018-05-08

    Direct position determination (DPD) is currently a hot topic in wireless localization research as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have an insufficient degree of freedom and limited estimation accuracy. To improve the DPD accuracy, this paper introduces a coprime array to the position model of multiple non-circular sources with a moving array. To maximize the advantages of this coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and converge the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramér–Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but can also improve the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.
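    The degree-of-freedom advantage that motivates the coprime array can be illustrated by counting unique difference-coarray lags: a coprime geometry produces more distinct lags than a uniform linear array with the same number of physical sensors. The M = 3, N = 5 pair below is a hypothetical example, not the paper's configuration:

    ```python
    # Difference-coarray lag counts: coprime array vs same-size ULA.
    M, N = 3, 5                                   # coprime pair
    coprime = sorted(set(range(0, M * N, M)) | set(range(0, M * N, N)))
    # coprime -> [0, 3, 5, 6, 9, 10, 12] : 7 physical sensors

    ula = list(range(len(coprime)))               # ULA with the same sensor count

    def unique_lags(positions):
        return {a - b for a in positions for b in positions}

    print(len(unique_lags(coprime)), len(unique_lags(ula)))
    ```

    More unique lags means a larger virtual aperture after covariance vectorization, which is what lets the subspace methods resolve more sources than physical sensors.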

  7. A nearly cylindrically symmetric source in Brans-Dicke gravity as the generator of the rotational curves of the galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Santos, S.M. dos [Universidade Estadual Paulista Julio de Mesquita Filho-UNESP, Guaratingueta, SP (Brazil); Instituto Federal de Educacao, Ciencia e Tecnologia do Rio Grande do Sul-IFRS, Porto Alegre, RS (Brazil); Silva, J.M.H. da [Universidade Estadual Paulista Julio de Mesquita Filho-UNESP, Guaratingueta, SP (Brazil); Guimaraes, M.E.X. [Universidade Federal Fluminense-UFF, Instituto de Fisica, Niteroi, RJ (Brazil); Neto, J.L. [Universidade Federal do Rio de Janeiro-UFRJ, Instituto de Fisica, Rio de Janeiro, RJ (Brazil)

    2017-12-15

    Observations show that the velocities of stars grow by approximately 2-3 orders of magnitude when the distances from the centers of the galaxies are in the range of 0.5-82.3 kpc, before they begin to tend to a constant value. Up to now, the reason for this behavior is still a matter of debate. In this work, we propose a model which adequately describes this unusual behavior using a (nearly) cylindrically symmetric solution in the framework of a scalar-tensor-like (Brans-Dicke) theory of gravity. (orig.)

  8. A nearly cylindrically symmetric source in Brans-Dicke gravity as the generator of the rotational curves of the galaxies

    Science.gov (United States)

    dos Santos, S. Mittmann; da Silva, J. M. Hoff; Guimarães, M. E. X.; Neto, J. L.

    2017-12-01

    Observations show that the velocities of stars grow by approximately 2-3 orders of magnitude when the distances from the centers of the galaxies are in the range of 0.5-82.3 kpc, before they begin to tend to a constant value. Up to now, the reason for this behavior is still a matter of debate. In this work, we propose a model which adequately describes this unusual behavior using a (nearly) cylindrically symmetric solution in the framework of a scalar-tensor-like (Brans-Dicke) theory of gravity.

  9. Simulation of neutron multiplicity measurements using Geant4. Open source software for nuclear arms control

    Energy Technology Data Exchange (ETDEWEB)

    Kuett, Moritz

    2016-07-07

    Nuclear arms control, including nuclear safeguards and verification technologies for nuclear disarmament, typically uses software as part of many different technological applications. This thesis proposes three open source criteria for such software: users and developers should have free access to the program, access to the full source code, and the ability to publish modifications to the program. This proposition is presented and analyzed in detail, together with a description of the development of "Open Neutron Multiplicity Simulation", an open source software tool to simulate neutron multiplicity measurements. The description includes the physical background of the method, details of the developed program, and a comprehensive set of validation calculations.

  10. Multiple Speech Source Separation Using Inter-Channel Correlation and Relaxed Sparsity

    Directory of Open Access Journals (Sweden)

    Maoshen Jia

    2018-01-01

    Full Text Available In this work, a multiple speech source separation method using inter-channel correlation and relaxed sparsity is proposed. A B-format microphone with four spatially located channels is adopted because the compact size of the microphone array preserves the spatial parameter integrity of the original signal. Specifically, we first measure the proportion of overlapped components among multiple sources and find that many overlapped time-frequency (TF) components exist as the source number increases. Then, considering the relaxed sparsity of speech sources, we propose a dynamic threshold-based separation approach for sparse components, where the threshold is determined by the inter-channel correlation among the recorded signals. After conducting a statistical analysis of the number of active sources at each TF instant, a form of relaxed sparsity called the half-K assumption is proposed, under which the number of active sources in a given TF bin does not exceed half the total number of simultaneously occurring sources. By applying the half-K assumption, the non-sparse components are recovered by using the extracted sparse components as a guide, combined with vector decomposition and matrix factorization. Finally, the TF coefficients of each source are recovered by synthesizing the sparse and non-sparse components. The proposed method has been evaluated using up to six simultaneous speech sources under both anechoic and reverberant conditions. Both objective and subjective evaluations validated that the perceptual quality of the speech separated by the proposed approach outperforms that of existing blind source separation (BSS) approaches. Moreover, the method is robust across different speech signals, producing separated speech of consistently similar perceptual quality.

  11. Dilatonic black holes in gravity's rainbow with a nonlinear source: the effects of thermal fluctuations

    Energy Technology Data Exchange (ETDEWEB)

    Hendi, S.H. [Shiraz University, Physics Department and Biruni Observatory, College of Sciences, Shiraz (Iran, Islamic Republic of); Research Institute for Astronomy and Astrophysics of Maragha (RIAAM), Maragha (Iran, Islamic Republic of); Panah, B.E. [Shiraz University, Physics Department and Biruni Observatory, College of Sciences, Shiraz (Iran, Islamic Republic of); Research Institute for Astronomy and Astrophysics of Maragha (RIAAM), Maragha (Iran, Islamic Republic of); ICRANet, Pescara (Italy); Panahiyan, S. [Shiraz University, Physics Department and Biruni Observatory, College of Sciences, Shiraz (Iran, Islamic Republic of); Helmholtz-Institut Jena, Jena (Germany); Shahid Beheshti University, Physics Department, Tehran (Iran, Islamic Republic of); Momennia, M. [Shiraz University, Physics Department and Biruni Observatory, College of Sciences, Shiraz (Iran, Islamic Republic of)

    2017-09-15

    This paper investigates nonlinearly charged dilatonic black holes in the context of gravity's rainbow in two cases: (1) with the usual entropy and (2) in the presence of a first-order logarithmic correction to the entropy. First, exact black hole solutions of dilatonic Born-Infeld gravity with an energy-dependent Liouville-type potential are obtained. Then, the thermodynamic properties of the two cases are studied separately. It will be shown that although the mass, entropy, and heat capacity are modified by the presence of a first-order correction, the temperature remains independent of it. Furthermore, it will be shown that the divergences of the heat capacity, and hence the phase transition points, are also independent of the first-order correction, whereas the stability conditions are highly sensitive to variations of the correction parameter. Beyond the effects of the first-order correction, we also present a limit on the values of the dilatonic parameter and show that it is possible to recognize AdS and dS thermodynamical behaviors for two specific branches of the dilatonic parameter. In addition, the effects of the nonlinear electromagnetic field and energy functions on the thermodynamical behavior of the solutions will be highlighted, and the dependency of the critical behavior on these generalizations will be investigated. (orig.)

  12. Dilatonic black holes in gravity's rainbow with a nonlinear source: the effects of thermal fluctuations

    International Nuclear Information System (INIS)

    Hendi, S.H.; Panah, B.E.; Panahiyan, S.; Momennia, M.

    2017-01-01

    This paper investigates nonlinearly charged dilatonic black holes in the context of gravity's rainbow in two cases: (1) with the usual entropy and (2) in the presence of a first-order logarithmic correction to the entropy. First, exact black hole solutions of dilatonic Born-Infeld gravity with an energy-dependent Liouville-type potential are obtained. Then, the thermodynamic properties of the two cases are studied separately. It will be shown that although the mass, entropy, and heat capacity are modified by the presence of a first-order correction, the temperature remains independent of it. Furthermore, it will be shown that the divergences of the heat capacity, and hence the phase transition points, are also independent of the first-order correction, whereas the stability conditions are highly sensitive to variations of the correction parameter. Beyond the effects of the first-order correction, we also present a limit on the values of the dilatonic parameter and show that it is possible to recognize AdS and dS thermodynamical behaviors for two specific branches of the dilatonic parameter. In addition, the effects of the nonlinear electromagnetic field and energy functions on the thermodynamical behavior of the solutions will be highlighted, and the dependency of the critical behavior on these generalizations will be investigated. (orig.)

  13. Source location in plates based on the multiple sensors array method and wavelet analysis

    International Nuclear Information System (INIS)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon

    2014-01-01

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on the plate and the distance between the impact position and the sensor must be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained from the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is estimated experimentally using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.

  14. Source location in plates based on the multiple sensors array method and wavelet analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon [Inha University, Incheon (Korea, Republic of)

    2014-01-15

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on the plate and the distance between the impact position and the sensor must be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained from the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is estimated experimentally using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.
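Once the DOA and the time delay are known, the localization step in records 13 and 14 reduces to a polar-to-Cartesian conversion. A minimal sketch; the group velocity and delay values below are illustrative, not from the paper:

```python
import numpy as np

def impact_location(sensor_xy, doa_deg, group_velocity, time_delay):
    """Impact point = sensor position + r * (cos(theta), sin(theta)),
    where r = group_velocity * time_delay is the travel distance of the
    Lamb wave and theta is the DOA estimated by MUSIC."""
    r = group_velocity * time_delay
    theta = np.deg2rad(doa_deg)
    return np.asarray(sensor_xy, float) + r * np.array([np.cos(theta), np.sin(theta)])

# Illustrative numbers: 1500 m/s group velocity, 0.2 ms wavelet time delay
p = impact_location((0.0, 0.0), 30.0, 1500.0, 2e-4)  # 0.3 m from the sensor
```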

  15. Monte Carlo analyses of the source multiplication factor of the YALINA booster facility

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, Alberto; Gohar, Y.; Kondev, F.; Aliberti, Gerardo [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Bolshinsky, I. [Idaho National Laboratory, P. O. Box 2528, Idaho Falls, Idaho 83403 (United States); Kiyavitskaya, Hanna; Bournos, Victor; Fokov, Yury; Routkovskaya, Christina; Serafimovich, Ivan [Joint Institute for Power and Nuclear Research-Sosny, National Academy of Sciences, Minsk, acad. Krasin, 99, 220109 (Belarus)

    2008-07-01

    The multiplication factor of a subcritical assembly is affected by the energy spectrum and spatial distribution of the neutron source. In a critical assembly, neutrons emerge from the fission reactions with an average energy of approximately 2 MeV; in a deuteron accelerator driven subcritical assembly, neutrons emerge from the fusion target with a fixed energy of 2.45 or 14.1 MeV, from the Deuterium-Deuterium (D-D) and Deuterium-Tritium (D-T) reactions, respectively. This study aims at generating accurate neutronics models for the YALINA Booster facility, based on the use of different Monte Carlo neutron transport codes, at defining the facility key physical parameters, and at comparing the neutron multiplication factor for three different neutron sources: fission, D-D and D-T. The calculated values are compared with the experimental results. (authors)

  16. Monte Carlo analyses of the source multiplication factor of the YALINA booster facility

    International Nuclear Information System (INIS)

    Talamo, Alberto; Gohar, Y.; Kondev, F.; Aliberti, Gerardo; Bolshinsky, I.; Kiyavitskaya, Hanna; Bournos, Victor; Fokov, Yury; Routkovskaya, Christina; Serafimovich, Ivan

    2008-01-01

    The multiplication factor of a subcritical assembly is affected by the energy spectrum and spatial distribution of the neutron source. In a critical assembly, neutrons emerge from the fission reactions with an average energy of ∼2 MeV; in a deuteron accelerator driven subcritical assembly, neutrons emerge from the fusion target with a fixed energy of 2.45 or 14.1 MeV, from the Deuterium-Deuterium (D-D) and Deuterium-Tritium (D-T) reactions, respectively. This study aims at generating accurate neutronics models for the YALINA Booster facility, based on the use of different Monte Carlo neutron transport codes, at defining the facility key physical parameters, and at comparing the neutron multiplication factor for three different neutron sources: fission, D-D and D-T. The calculated values are compared with the experimental results. (authors)
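For context, the source-driven multiplication these analyses target follows the textbook point relation M = 1/(1 - k_s), where the source multiplication factor k_s is source-weighted and so differs between the fission, D-D, and D-T sources compared above. A sketch of that standard relation (not code from the study):

```python
def source_multiplication(k_s):
    """Total neutron multiplication M = 1 / (1 - k_s) of a source-driven
    subcritical assembly, where k_s is the source multiplication factor.
    Because k_s depends on the source energy spectrum and spatial
    distribution, fission, D-D (2.45 MeV) and D-T (14.1 MeV) sources
    generally give different values of M in the same assembly."""
    if not 0.0 <= k_s < 1.0:
        raise ValueError("assembly must be subcritical: 0 <= k_s < 1")
    return 1.0 / (1.0 - k_s)
```

For example, k_s = 0.95 gives a twentyfold multiplication of the external source neutrons.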

  17. Distributed 3D Source Localization from 2D DOA Measurements Using Multiple Linear Arrays

    Directory of Open Access Journals (Sweden)

    Antonio Canclini

    2017-01-01

    Full Text Available This manuscript addresses the problem of 3D source localization from directions of arrival (DOAs) in wireless acoustic sensor networks. In this context, multiple sensors measure the DOA of the source, and a central node combines the measurements to yield the source location estimate. Traditional approaches require 3D DOA measurements; that is, each sensor estimates the azimuth and elevation of the source by means of a microphone array, typically in a planar or spherical configuration. The proposed methodology aims at reducing the hardware and computational costs by combining measurements related to 2D DOAs estimated from linear arrays arbitrarily displaced in 3D space. Each sensor measures the DOA in the plane containing the array and the source. Measurements are then translated into an equivalent planar geometry, in which a set of coplanar equivalent arrays observe the source while preserving the original DOAs. This formulation is exploited to define a cost function whose minimization leads to the source location estimate. An extensive simulation campaign validates the proposed approach and compares its accuracy with state-of-the-art methodologies.
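The cost-function idea in the record above, intersecting bearing lines in a least-squares sense, can be illustrated in 2D. A hypothetical sketch; the paper's actual formulation works in an equivalent planar geometry built from arbitrarily oriented linear arrays:

```python
import numpy as np

def triangulate(sensors, doas_deg):
    """Least-squares intersection of bearing lines: each sensor s_i sees the
    source along unit vector u_i; minimizing the summed squared distance to
    all lines gives the linear system sum(P_i) x = sum(P_i s_i), where
    P_i = I - u_i u_i^T projects orthogonally to the bearing."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for s, th in zip(sensors, doas_deg):
        u = np.array([np.cos(np.deg2rad(th)), np.sin(np.deg2rad(th))])
        P = np.eye(2) - np.outer(u, u)
        A += P
        b += P @ np.asarray(s, float)
    return np.linalg.solve(A, b)

# Two sensors on the x-axis bearing 45 and 135 degrees intersect at (1, 1)
estimate = triangulate([(0.0, 0.0), (2.0, 0.0)], [45.0, 135.0])
```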

  18. A Monte Carlo multiple source model applied to radiosurgery narrow photon beams

    International Nuclear Information System (INIS)

    Chaves, A.; Lopes, M.C.; Alves, C.C.; Oliveira, C.; Peralta, L.; Rodrigues, P.; Trindade, A.

    2004-01-01

    Monte Carlo (MC) methods are nowadays often used in the field of radiotherapy. Through successive steps, radiation fields are simulated, producing source Phase Space Data (PSD) that enable a dose calculation with good accuracy. Narrow photon beams used in radiosurgery can also be simulated by MC codes. However, the poor efficiency in simulating these narrow photon beams produces PSD whose quality prevents calculating dose with the required accuracy. To overcome this difficulty, a multiple source model was developed that enhances the quality of the reconstructed PSD while also reducing simulation time and storage requirements. This multiple source model was based on the full MC simulation, performed with the MC code MCNP4C, of the Siemens Mevatron KD2 (6 MV mode) linear accelerator head and additional collimators. The full simulation allowed the characterization of the particles coming from the accelerator head and from the additional collimators that shape the narrow photon beams used in radiosurgery treatments. Eight relevant photon virtual sources were identified from the full characterization analysis. Spatial and energy distributions were stored in histograms for the virtual sources representing the accelerator head components and the additional collimators. The photon directions were calculated for virtual sources representing the accelerator head components, whereas for the virtual sources representing the additional collimators they were recorded in histograms. All these histograms were included in the DPM MC code and, using a sampling procedure that reconstructed the PSDs, dose distributions were calculated in a water phantom divided into 20,000 voxels of 1×1×5 mm³. The model accurately calculates dose distributions in the water phantom for all the additional collimators; for depth dose curves, associated errors at 2σ were lower than 2.5% down to a depth of 202.5 mm for all the additional collimators and for profiles at various depths, deviations between measured

  19. Gas production strategy of underground coal gasification based on multiple gas sources.

    Science.gov (United States)

    Tianhong, Duan; Zuotang, Wang; Limin, Zhou; Dongdong, Li

    2014-01-01

    To lower the stability requirements on gas production in UCG (underground coal gasification), to create better space and opportunities for the development of UCG, an emerging industry in its initial stage, and to reduce the emission of blast furnace gas, converter gas, and coke oven gas, this paper puts forward, for the first time, a new mode of utilization of multiple gas sources, mainly including ground gasifier gas, UCG gas, blast furnace gas, converter gas, and coke oven gas; the new mode was demonstrated by field tests. According to the field tests, the existing power generation technology can fully adapt to the high hydrogen content, low calorific value, and output fluctuation of the gas produced by UCG in multiple-gas-source power generation; the fluctuations are large, and air can serve as the gasifying agent. Gas production by UCG in the combined power-and-methanol mode based on multiple gas sources has a strict stability requirement. The field tests demonstrated that the fluctuations in UCG gas production can be well monitored through a quality control chart method.

  20. Gas Production Strategy of Underground Coal Gasification Based on Multiple Gas Sources

    Directory of Open Access Journals (Sweden)

    Duan Tianhong

    2014-01-01

    Full Text Available To lower the stability requirements on gas production in UCG (underground coal gasification), to create better space and opportunities for the development of UCG, an emerging industry in its initial stage, and to reduce the emission of blast furnace gas, converter gas, and coke oven gas, this paper puts forward, for the first time, a new mode of utilization of multiple gas sources, mainly including ground gasifier gas, UCG gas, blast furnace gas, converter gas, and coke oven gas; the new mode was demonstrated by field tests. According to the field tests, the existing power generation technology can fully adapt to the high hydrogen content, low calorific value, and output fluctuation of the gas produced by UCG in multiple-gas-source power generation; the fluctuations are large, and air can serve as the gasifying agent. Gas production by UCG in the combined power-and-methanol mode based on multiple gas sources has a strict stability requirement. The field tests demonstrated that the fluctuations in UCG gas production can be well monitored through a quality control chart method.
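The "quality control chart method" mentioned in records 19 and 20 is, in its simplest Shewhart form, a band of mean ± 3σ around historical production values. A minimal illustrative sketch; the paper does not specify its exact chart construction:

```python
import statistics

def control_limits(samples, k=3.0):
    """Shewhart-style lower/upper control limits: mean +/- k standard
    deviations of historical gas-production samples."""
    mu = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)
    return mu - k * sigma, mu + k * sigma

def out_of_control(samples, new_value, k=3.0):
    """Flag a new production reading that falls outside the control band,
    i.e. a fluctuation larger than ordinary process variation."""
    lcl, ucl = control_limits(samples, k)
    return not (lcl <= new_value <= ucl)
```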

  1. Nonlocal gravity

    CERN Document Server

    Mashhoon, Bahram

    2017-01-01

    Relativity theory is based on a postulate of locality, which means that the past history of the observer is not directly taken into account. This book argues that the past history should be taken into account. In this way, nonlocality, in the sense of history dependence, is introduced into relativity theory. The deep connection between inertia and gravitation suggests that gravity could be nonlocal, and in nonlocal gravity the fading gravitational memory of past events must then be taken into account. Along this line of thought, a classical nonlocal generalization of Einstein's theory of gravitation has recently been developed. A significant consequence of this theory is that the nonlocal aspect of gravity appears to simulate dark matter. According to nonlocal gravity theory, what astronomers attribute to dark matter should instead be due to the nonlocality of gravitation. Nonlocality dominates on the scale of galaxies and beyond. Memory fades with time; therefore, the nonlocal aspect of gravity becomes wea...

  2. Managing Multiple Sources of Competitive Advantage in a Complex Competitive Environment

    Directory of Open Access Journals (Sweden)

    Alexandre Howard Henry Lapersonne

    2013-12-01

    Full Text Available The aim of this article is to review the literature on the creation of sustained and temporary competitive advantage, specifically in dynamic markets, and to propose further research possibilities. After analyzing the main trends and scholars' works on the subject, it is concluded that a firm experiencing erosion of its core sources of economic rent should diversify its strategy portfolio in search of new sources of competitive advantage that can compensate for the decline in profits provoked by intensely competitive environments. This review concludes with the hypothesis that firms that have decided to enter and manage multiple competitive environments should develop a multiple-strategies framework approach. Managing this portfolio of sources of competitive advantage should allow a firm's superior economic performance to persist, through the management of the lifecycles of diverse temporary advantages and through a resilience effect, in which a very successful source of competitive advantage compensates for those that have been eroded. Additionally, the review indicates that the economies of emerging countries, such as those of the BRIC block, present a more complex competitive environment due to their history of cultural diversity, social contrasts, and frequent economic disruption, and also because of recent institutional normalization that has turned these markets toward hypercompetition. Consequently, the study of complex competition is appropriate in such environments.

  3. Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.

    Science.gov (United States)

    Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran

    2017-04-01

    Media-based event data, i.e., data compiled from reporting by media outlets, are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risks inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences is drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). Therefore, we propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and for each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories on both the causes of and the reporting on an event of interest. Simulations show that our technique regularly outperforms current strategies that neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.
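A toy calculation shows why overlapping sources help with underreporting, which is the setting the estimator above exploits. This assumes independent sources that each report a true event with a fixed probability p_i, a simplification of the paper's covariate-dependent misclassification models:

```python
def observed_rate(true_rate, report_probs):
    """Probability an event occurs AND appears in the merged data: an event
    is missed only if every source misses it, so the pooled reporting
    probability is 1 - prod(1 - p_i)."""
    miss = 1.0
    for p in report_probs:
        miss *= 1.0 - p
    return true_rate * (1.0 - miss)
```

With one 80%-reliable source, a true event rate of 0.5 yields an observed rate of 0.40; pooling a second such source raises it to 0.48, closing most of the gap to the truth.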

  4. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in, or associated with, the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the inpatient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources, for which invented codes comprised 1.7-7.4% of the prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
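The volume-weighted coverage measured in the study above can be expressed in a few lines. A sketch with hypothetical codes; the study's actual records carried NDCs or local codes mapped via formulary masters:

```python
from collections import Counter

def volume_coverage(prescription_codes, dkb_codes):
    """Fraction of total prescription volume (not of distinct codes) whose
    product code appears in a drug-knowledge-base mapping table. Weighting
    by volume matters because code usage is highly skewed."""
    counts = Counter(prescription_codes)
    total = sum(counts.values())
    covered = sum(n for code, n in counts.items() if code in dkb_codes)
    return covered / total

# Three of four prescriptions use codes known to the DKB
share = volume_coverage(["0001-1111", "0001-1111", "0002-2222", "LOCAL-9"],
                        {"0001-1111", "0002-2222"})
```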

  5. Transparent mediation-based access to multiple yeast data sources using an ontology driven interface.

    Science.gov (United States)

    Briache, Abdelaali; Marrakchi, Kamar; Kerzazi, Amine; Navas-Delgado, Ismael; Rossi Hassani, Badr D; Lairini, Khalid; Aldana-Montes, José F

    2012-01-25

    Saccharomyces cerevisiae is recognized as a model system representing a simple eukaryote whose genome can be easily manipulated. Information solicited by scientists on its biological entities (proteins, genes, RNAs, etc.) is scattered across several data sources such as SGD, Yeastract, CYGD-MIPS, BioGrid, and PhosphoGrid. Because of the heterogeneity of these sources, querying them separately and then manually combining the returned results is a complex and time-consuming task for biologists, most of whom are not bioinformatics experts. It also reduces and limits the use that can be made of the available data. To provide transparent and simultaneous access to yeast sources, we have developed YeastMed: an XML- and mediator-based system. In this paper, we present our approach in developing this system, which takes advantage of SB-KOM to perform the needed query transformations and a set of Data Services to reach the integrated data sources. The system is composed of a set of modules that depend heavily on XML and Semantic Web technologies. User queries are expressed in terms of a domain ontology through a simple form-based web interface. YeastMed is the first mediation-based system specifically for integrating yeast data sources. It was conceived mainly to help biologists find relevant data simultaneously from multiple data sources. It has a biologist-friendly interface that is easy to use. The system is available at http://www.khaos.uma.es/yeastmed/.

  6. Multiple Spectral Ratio Analyses Reveal Earthquake Source Spectra of Small Earthquakes and Moment Magnitudes of Microearthquakes

    Science.gov (United States)

    Uchide, T.; Imanishi, K.

    2016-12-01

    Spectral studies of macroscopic earthquake source parameters are helpful for characterizing the earthquake rupture process and hence for understanding earthquake source physics and fault properties. These studies require muting the wave propagation path and site effects in the spectra of seismograms to accentuate the source effect. We have recently developed the multiple spectral ratio method [Uchide and Imanishi, BSSA, 2016], employing many empirical Green's function (EGF) events to reduce errors from the choice of EGF events. This method helps us estimate source spectra more accurately, as well as moment ratios among reference and EGF events, which are useful to constrain the seismic moment of microearthquakes. First, we focus on earthquake source spectra. Source spectra have generally been thought to obey the omega-square model with a single corner frequency; however, recent studies imply the existence of another corner frequency for some earthquakes. We analyzed small shallow inland earthquakes (3.5 multiple spectral ratio analyses. For 20,000 microearthquakes in the Fukushima Hamadori and northern Ibaraki prefecture area, we found that the JMA magnitudes (Mj), based on displacement or velocity amplitude, are systematically below Mw. The slope of the Mj-Mw relation is 0.5 for Mj 5. We propose a fitting curve for the obtained relationship as Mw = (1/2)Mj + (1/2)(Mj^γ + Mcor^γ)^(1/γ) + c, where Mcor is a corner magnitude, γ determines the sharpness of the corner, and c denotes an offset. We obtained Mcor = 4.1, γ = 5.6, and c = -0.47 to fit the observations. These parameters are useful for characterizing the Mj-Mw relationship. This nonlinear relationship affects the b-value of the Gutenberg-Richter law. Quantitative discussions of b-values are affected by the definition of the magnitude used.
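The fitted Mj-Mw curve in the record above is easy to evaluate numerically; the asymptotic slopes of 0.5 (small Mj) and 1 (large Mj) follow directly from the form of the expression. A sketch using the reported parameters:

```python
def mw_from_mj(mj, mcor=4.1, gamma=5.6, c=-0.47):
    """Mw = (1/2)Mj + (1/2)(Mj^gamma + Mcor^gamma)^(1/gamma) + c.
    For Mj well below the corner magnitude Mcor the second term is nearly
    constant, so the slope tends to 0.5; well above Mcor the term grows
    like Mj/2, so the slope tends to 1. gamma sets the sharpness of the
    corner and c is an offset."""
    return 0.5 * mj + 0.5 * (mj ** gamma + mcor ** gamma) ** (1.0 / gamma) + c
```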

  7. Massive gravity from bimetric gravity

    International Nuclear Information System (INIS)

    Baccetti, Valentina; Martín-Moruno, Prado; Visser, Matt

    2013-01-01

    We discuss the subtle relationship between massive gravity and bimetric gravity, focusing particularly on the manner in which massive gravity may be viewed as a suitable limit of bimetric gravity. The limiting procedure is more delicate than currently appreciated. Specifically, this limiting procedure should not unnecessarily constrain the background metric, which must be externally specified by the theory of massive gravity itself. The fact that in bimetric theories one always has two sets of metric equations of motion continues to have an effect even in the massive gravity limit, leading to additional constraints besides the one set of equations of motion naively expected. Thus, since solutions of bimetric gravity in the limit of vanishing kinetic term are also solutions of massive gravity, but the contrary statement is not necessarily true, there is no complete continuity in the parameter space of the theory. In particular, we study the massive cosmological solutions which are continuous in the parameter space, showing that many interesting cosmologies belong to this class. (paper)

  8. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    Science.gov (United States)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  9. An efficient central DOA tracking algorithm for multiple incoherently distributed sources

    Science.gov (United States)

    Hassen, Sonia Ben; Samet, Abdelaziz

    2015-12-01

    In this paper, we develop a new tracking method for the direction of arrival (DOA) parameters assuming multiple incoherently distributed (ID) sources. The new approach is based on a simple covariance fitting optimization technique exploiting the central and noncentral moments of the source angular power densities to estimate the central DOAs. The current estimates are treated as measurements provided to the Kalman filter, which models the dynamics of the directional changes of the moving sources. Then, the covariance-fitting-based algorithm and the Kalman filtering theory are combined to formulate an adaptive tracking algorithm. Our algorithm is compared to the fast approximated power iteration-total least squares-estimation of signal parameters via rotational invariance technique (FAPI-TLS-ESPRIT) algorithm, which combines the TLS-ESPRIT method with subspace updating via the FAPI algorithm. It will be shown that the proposed algorithm offers excellent DOA tracking performance and outperforms the FAPI-TLS-ESPRIT method, especially at low signal-to-noise ratio (SNR) values. Moreover, the performance of both methods improves as the SNR increases, and this improvement is more prominent for the FAPI-TLS-ESPRIT method. However, the performance of both degrades as the number of sources increases. It will also be shown that our method depends on the form of the angular distribution function when tracking the central DOAs. Finally, it will be shown that the more widely the sources are spaced, the more accurately the proposed method tracks the central DOAs.
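A minimal sketch of the Kalman-filtering stage of such a tracker is given below. It is not the paper's covariance-fitting estimator: the per-snapshot DOA "measurements" are simulated here as noisy angles, and a generic constant-velocity state model is assumed.

```python
import numpy as np

def kalman_doa_track(measurements, dt=1.0, q=1e-4, r=0.5):
    """Track a central DOA (degrees) with a constant-velocity Kalman filter.

    `measurements` stand in for the per-snapshot DOA estimates that the
    paper obtains by covariance fitting; here they are just noisy angles.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (angle, rate)
    H = np.array([[1.0, 0.0]])              # only the angle is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    track = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new DOA measurement
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        track.append(float(x[0, 0]))
    return track

rng = np.random.default_rng(0)
true_doa = 10.0 + 0.05 * np.arange(200)          # source drifting 0.05 deg/snapshot
noisy = true_doa + rng.normal(0, 0.7, size=200)  # raw per-snapshot estimates
smoothed = kalman_doa_track(noisy)
raw_rmse = np.sqrt(np.mean((noisy - true_doa) ** 2))
kf_rmse = np.sqrt(np.mean((np.array(smoothed) - true_doa) ** 2))
print(kf_rmse < raw_rmse)  # filtering should reduce the tracking error
```

The same structure applies when the measurements come from the covariance-fitting step; only the measurement noise covariance would change.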

  10. Interpolating between random walks and optimal transportation routes: Flow with multiple sources and targets

    Science.gov (United States)

    Guex, Guillaume

    2016-05-01

    In recent articles about graphs, different models proposed a formalism to find a type of path between two nodes, the source and the target, at a crossroads between the shortest path and the random-walk path. These models include a freely adjustable parameter, allowing one to tune the behavior of the path toward randomized movements or direct routes. This article presents a natural generalization of these models, namely a model with multiple sources and targets. In this context, source nodes can be viewed as locations with a supply of a certain good (e.g. people, money, information) and target nodes as locations with a demand for the same good. An algorithm is constructed to display the flow of goods in the network between sources and targets. Again with a freely adjustable parameter, this flow can be tuned to follow routes of minimum cost, thus displaying the flow in the context of the optimal transportation problem, or, by contrast, a random flow, known to be similar to the electrical current flow if the random walk is reversible. Moreover, a source-target coupling can be retrieved from this flow, offering an optimal assignment to the transportation problem. This algorithm is described in the first part of this article and then illustrated with case studies.

  11. Differentiating between anthropogenic and geological sources of nitrate using multiple geochemical tracers

    Science.gov (United States)

    Linhoff, B.; Norton, S.; Travis, R.; Romero, Z.; Waters, B.

    2017-12-01

    Nitrate contamination of groundwater is a major problem globally, including within the Albuquerque Basin in New Mexico. Ingesting high concentrations of nitrate (> 10 mg/L as N) can lead to an increased risk of cancer and to methemoglobinemia in infants. Numerous anthropogenic sources of nitrate have been identified within the Albuquerque Basin, including fertilizers, landfills, multiple sewer pipe releases, sewer lagoons, domestic septic leach fields, and a nitric acid line outfall. Furthermore, groundwater near ephemeral streams often exhibits elevated NO3 concentrations and high NO3/Cl ratios incongruous with an anthropogenic source. These results suggest that NO3 can be concentrated through evaporation beneath ephemeral streams and mobilized via irrigation or land use change. This study seeks to use extensive geochemical analyses of groundwater and surface water to differentiate between various sources of NO3 contamination. The U.S. Geological Survey collected 54 groundwater samples from wells and six samples from ephemeral streams from within and outside areas of known nitrate contamination. To fingerprint the sources of nitrate pollution, samples were analyzed for major ions, trace metals, nutrients, dissolved gases, δ15N and δ18O in NO3, δ15N within N2 gas, and δ2H and δ18O in H2O. Furthermore, most sites were sampled for artificial sweeteners and numerous contaminants of emerging concern, including pharmaceutical drugs, caffeine, and wastewater indicators. This study will also investigate the age distribution of groundwater and the approximate age of anthropogenic NO3 contamination using 3He/4He, δ13C, 14C, and 3H, as well as pharmaceutical drugs and artificial sweeteners with known patent and U.S. Food and Drug Administration approval dates. This broad suite of analytes will be used to differentiate between naturally occurring and multiple anthropogenic NO3 sources, and to potentially determine the approximate date of NO3 contamination.

  12. Validation and calibration of structural models that combine information from multiple sources.

    Science.gov (United States)

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
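The point that calibration "can be viewed as estimation" can be illustrated with a toy example: choosing the parameter of a hypothetical exponential-decay model that best reproduces observed data in the least-squares sense. All values below are synthetic and only illustrate the idea, not any model from the article.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(10.0)
true_rate = 0.3
# "Observed" calibration targets: the true model plus measurement noise.
observed = np.exp(-true_rate * t) + rng.normal(0, 0.01, t.size)

def model(rate):
    """A toy one-parameter structural model."""
    return np.exp(-rate * t)

# Calibration as estimation: pick the parameter minimizing squared error
# between model outputs and data (here by brute-force grid search).
grid = np.linspace(0.01, 1.0, 1000)
sse = [np.sum((model(r) - observed) ** 2) for r in grid]
best = grid[int(np.argmin(sse))]
print(round(best, 1))  # prints 0.3
```

Viewing the selected parameter as an estimate is what makes identifiability questions, standard errors, and formal validation against held-out data meaningful, as the abstract argues.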

  13. Glutathione provides a source of cysteine essential for intracellular multiplication of Francisella tularensis.

    Directory of Open Access Journals (Sweden)

    Khaled Alkhuder

    2009-01-01

    Full Text Available Francisella tularensis is a highly infectious bacterium causing the zoonotic disease tularemia. Its ability to multiply and survive in macrophages is critical for its virulence. By screening a bank of HimarFT transposon mutants of the F. tularensis live vaccine strain (LVS) to isolate intracellular growth-deficient mutants, we selected one mutant in a gene encoding a putative gamma-glutamyl transpeptidase (GGT). This gene (FTL_0766) was hence designated ggt. The mutant strain showed impaired intracellular multiplication and was strongly attenuated for virulence in mice. Here we present evidence that the GGT activity of F. tularensis allows utilization of glutathione (GSH, gamma-glutamyl-cysteinyl-glycine) and gamma-glutamyl-cysteine dipeptide as cysteine sources to ensure intracellular growth. This is the first demonstration of the essential role of a nutrient acquisition system in the intracellular multiplication of F. tularensis. GSH is the most abundant source of cysteine in the host cytosol. Thus, the capacity that this intracellular bacterial pathogen has evolved to utilize the available GSH as a source of cysteine in the host cytosol constitutes a paradigm of bacteria-host adaptation.

  14. Subcritical Neutron Multiplication Measurements of HEU Using Delayed Neutrons as the Driving Source

    International Nuclear Information System (INIS)

    Hollas, C.L.; Goulding, C.A.; Myers, W.L.

    1999-01-01

    A new method for the determination of the multiplication of highly enriched uranium systems is presented. The method uses delayed neutrons to drive the HEU system. These delayed neutrons are from fission events induced by a pulsed 14-MeV neutron source. Between pulses, neutrons are detected within a medium-efficiency neutron detector using 3He ionization tubes within polyethylene enclosures. The neutron detection times are recorded relative to the initiation of the 14-MeV neutron pulse, and subsequently analyzed with the Feynman reduced variance method to extract singles, doubles and triples neutron counting rates. Measurements have been made on a set of nested hollow spheres of 93% enriched uranium, with mass values from 3.86 kg to 21.48 kg. The singles, doubles and triples counting rates for each uranium system are compared to calculations from point kinetics models of neutron multiplicity to assign multiplication values. These multiplication values are compared to those from MCNP KCODE calculations
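The Feynman reduced-variance statistic at the core of this analysis is simple to compute from gated counts. The sketch below is illustrative only: it compares a Poisson count stream with a crudely correlated one, and is not the HEU measurement or its point-kinetics fit.

```python
import numpy as np

def feynman_y(counts):
    """Feynman reduced-variance statistic Y = variance/mean - 1.

    Y is ~0 for an uncorrelated (Poisson) source and grows with the
    correlated multiplets produced by fission chains in a multiplying
    system; singles/doubles/triples rates fitted to a point-kinetics
    model are then used to assign a multiplication value.
    """
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean() - 1.0

rng = np.random.default_rng(2)
n_gates = 200_000
poisson_gates = rng.poisson(5.0, size=n_gates)  # uncorrelated source
# Crude stand-in for correlated fission chains: each "chain" deposits a
# burst of 3 counts on top of a Poisson background.
chains = rng.poisson(2.0, size=n_gates)
correlated_gates = rng.poisson(1.0, size=n_gates) + 3 * chains

print(round(feynman_y(poisson_gates), 2))   # ~0 for the Poisson stream
print(feynman_y(correlated_gates) > 1.0)    # clearly positive for bursts
```

In a real analysis the gate width is scanned and Y(t) is fitted; here a single fixed gate suffices to show the contrast between the two streams.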

  15. Development of repository-wide radionuclide transport model considering the effects of multiple sources

    International Nuclear Information System (INIS)

    Hatanaka, Koichiro; Watari, Shingo; Ijiri, Yuji

    1999-11-01

    Safety assessment of the geological isolation system under the groundwater scenario has traditionally been conducted based on the single canister configuration; the safety of the total system has then been evaluated based on dose rates obtained by multiplying the migration rates released from the engineered barrier and/or the natural barrier by dose conversion factors and by the total number of canisters disposed of in the repository. The dose conversion factors can be obtained from the biosphere analysis. In this study, we focused on the effect of multiple sources due to the disposal of canisters at different positions in the repository. When the effect of multiple sources is taken into consideration, concentration interference can take place in the repository region. Therefore, a radionuclide transport model/code considering the effect of concentration interference due to multiple sources was developed to assess the effect quantitatively. The newly developed model/code was verified through comparison with the existing radionuclide transport analysis code used in the second progress report. In addition, the effect of the concentration interference was evaluated by setting up a simple problem using the newly developed analysis code. The results show that the maximum peak value of the migration rates from the repository was about two orders of magnitude lower than that based on the single canister configuration. Since the analysis code was developed by assuming that all canisters disposed of along the one-dimensional groundwater flow contribute to the concentration interference in the repository region, this assumption should be verified by conducting two- or three-dimensional analyses considering heterogeneous geological structure as future work. (author)

  16. Gravity brake

    Science.gov (United States)

    Lujan, Richard E.

    2001-01-01

    A mechanical gravity brake that prevents hoisted loads within a shaft from free-falling when a loss of hoisting force occurs. A loss of hoist lifting force may occur in a number of situations, for example if a hoist cable were to break, the brakes were to fail on a winch, or the hoist mechanism itself were to fail. Under normal hoisting conditions, the gravity brake of the invention is subject to an upward lifting force from the hoist and a downward pulling force from a suspended load. If the lifting force should suddenly cease, the loss of differential forces on the gravity brake in free-fall is translated to extend a set of brakes against the walls of the shaft to stop the free fall descent of the gravity brake and attached load.

  17. Analogue Gravity

    Directory of Open Access Journals (Sweden)

    Barceló Carlos

    2005-12-01

    Full Text Available Analogue models of (and for) gravity have a long and distinguished history dating back to the earliest years of general relativity. In this review article we will discuss the history, aims, results, and future prospects for the various analogue models. We start the discussion by presenting a particularly simple example of an analogue model, before exploring the rich history and complex tapestry of models discussed in the literature. The last decade in particular has seen a remarkable and sustained development of analogue gravity ideas, leading to some hundreds of published articles, a workshop, two books, and this review article. Future prospects for the analogue gravity programme also look promising, both on the experimental front (where technology is rapidly advancing) and on the theoretical front (where variants of analogue models can be used as a springboard for radical attacks on the problem of quantum gravity).

  18. Quantum Gravity

    OpenAIRE

    Alvarez, Enrique

    2004-01-01

    Gravitons should have momentum just as photons do; and since graviton momentum would cause compression rather than elongation of spacetime outside of matter, it does not appear that gravitons are compatible with Schwarzschild's spacetime curvature. Also, since energy is proportional to mass, and mass is proportional to gravity, the energy of matter is proportional to gravity. The energy of matter could thus contract space within matter; and because of the inter-connectedness of space, cause the...

  19. Relaxation dynamics in the presence of pulse multiplicative noise sources with different correlation properties

    Science.gov (United States)

    Kargovsky, A. V.; Chichigina, O. A.; Anashkina, E. I.; Valenti, D.; Spagnolo, B.

    2015-10-01

    The relaxation dynamics of a system described by a Langevin equation with pulse multiplicative noise sources with different correlation properties is considered. The solution of the corresponding Fokker-Planck equation is derived for Gaussian white noise. Moreover, two pulse processes with regulated periodicity are considered as a noise source: the dead-time-distorted Poisson process and the process with fixed time intervals, which is characterized by an infinite correlation time. We find that the steady state of the system is dependent on the correlation properties of the pulse noise. An increase of the noise correlation causes the decrease of the mean value of the solution at the steady state. The analytical results are in good agreement with the numerical ones.
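A minimal numerical counterpart to such relaxation dynamics is an Euler-Maruyama integration of a Langevin equation with multiplicative noise. The sketch below uses Gaussian white noise (the case for which the paper derives the Fokker-Planck solution) rather than the pulse processes, and a generic linear drift chosen so the steady-state mean is known; it is not the paper's specific model.

```python
import numpy as np

# Euler-Maruyama simulation of dx = (mu - gamma*x) dt + sigma*x dW.
# For a linear drift, the steady-state mean is mu/gamma regardless of
# the (mean-zero) multiplicative noise strength, which gives a clean
# check on the simulation.
rng = np.random.default_rng(3)
mu, gamma, sigma = 1.0, 0.5, 0.3
dt, n_steps, n_paths = 0.01, 5_000, 2_000
x = np.full(n_paths, mu / gamma)   # start at the deterministic fixed point
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    x += (mu - gamma * x) * dt + sigma * x * dW
steady_mean = x.mean()
print(round(steady_mean, 1))  # prints 2.0
```

Replacing `dW` with a train of pulses at Poisson or fixed intervals is how one would probe the correlation-dependence reported in the abstract; the integration loop stays the same.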

  20. System and method for integrating and accessing multiple data sources within a data warehouse architecture

    Science.gov (United States)

    Musick, Charles R [Castro Valley, CA; Critchlow, Terence [Livermore, CA; Ganesh, Madhaven [San Jose, CA; Slezak, Tom [Livermore, CA; Fidelis, Krzysztof [Brentwood, CA

    2006-12-19

    A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.

  1. Prediction of the neutrons subcritical multiplication using the diffusion hybrid equation with external neutron sources

    Energy Technology Data Exchange (ETDEWEB)

    Costa da Silva, Adilson; Carvalho da Silva, Fernando [COPPE/UFRJ, Programa de Engenharia Nuclear, Caixa Postal 68509, 21941-914, Rio de Janeiro (Brazil); Senra Martinez, Aquilino, E-mail: aquilino@lmp.ufrj.br [COPPE/UFRJ, Programa de Engenharia Nuclear, Caixa Postal 68509, 21941-914, Rio de Janeiro (Brazil)

    2011-07-15

    Highlights: > We proposed a new neutron diffusion hybrid equation with external neutron source. > A coarse mesh finite difference method for the adjoint flux and reactivity calculation was developed. > 1/M curve to predict the criticality condition is used. - Abstract: We used the neutron diffusion hybrid equation, in Cartesian geometry with external neutron sources, to predict the subcritical multiplication of neutrons in a pressurized water reactor, using a 1/M curve to predict the criticality condition. A Coarse Mesh Finite Difference Method was developed for the adjoint flux calculation and to obtain the reactivity values of the reactor. The results obtained were compared with benchmark values in order to validate the methodology presented in this paper.

  2. Prediction of the neutrons subcritical multiplication using the diffusion hybrid equation with external neutron sources

    International Nuclear Information System (INIS)

    Costa da Silva, Adilson; Carvalho da Silva, Fernando; Senra Martinez, Aquilino

    2011-01-01

    Highlights: → We proposed a new neutron diffusion hybrid equation with external neutron source. → A coarse mesh finite difference method for the adjoint flux and reactivity calculation was developed. → 1/M curve to predict the criticality condition is used. - Abstract: We used the neutron diffusion hybrid equation, in Cartesian geometry with external neutron sources, to predict the subcritical multiplication of neutrons in a pressurized water reactor, using a 1/M curve to predict the criticality condition. A Coarse Mesh Finite Difference Method was developed for the adjoint flux calculation and to obtain the reactivity values of the reactor. The results obtained were compared with benchmark values in order to validate the methodology presented in this paper.
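The 1/M (inverse multiplication) extrapolation used to predict the criticality condition can be sketched with synthetic numbers. The assembly below is hypothetical, assumed to go critical at 25 kg; the numbers are not the benchmark values of the paper.

```python
import numpy as np

# As fissile mass is added, source-driven detector counts C grow like
# C0 / (1 - k_eff); plotting 1/M = C0/C against mass and extrapolating
# the line to 1/M = 0 predicts the critical mass.
mass = np.array([5.0, 10.0, 15.0, 20.0])   # kg, hypothetical loadings
k_eff = mass / 25.0                        # hypothetical: critical at 25 kg
counts = 1000.0 / (1.0 - k_eff)            # C0 = 1000 source-only counts
inv_m = 1000.0 / counts                    # inverse multiplication 1/M

slope, intercept = np.polyfit(mass, inv_m, 1)
predicted_critical_mass = -intercept / slope
print(round(predicted_critical_mass, 1))   # → 25.0
```

In practice k_eff is not exactly linear in mass, so the extrapolation is repeated as material is added and the prediction is approached conservatively.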

  3. An open-source software program for performing Bonferroni and related corrections for multiple comparisons

    Directory of Open Access Journals (Sweden)

    Kyle Lesack

    2011-01-01

    Full Text Available Increased type I error resulting from multiple statistical comparisons remains a common problem in the scientific literature. This may result in the reporting and promulgation of spurious findings. One approach to this problem is to correct groups of P-values for "family-wide significance" using a Bonferroni correction or the less conservative Bonferroni-Holm correction, or to correct for the "false discovery rate" with a Benjamini-Hochberg correction. Although several solutions are available for performing these corrections in commercially available software, there are no widely available, easy-to-use open-source programs to perform these calculations. In this paper we present an open-source program written in Python 3.2 that performs calculations for the standard Bonferroni, Bonferroni-Holm and Benjamini-Hochberg corrections.
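The three corrections are short enough to sketch in full. The implementation below follows the standard adjusted-p-value formulations; it is an independent sketch, not the program described in the paper.

```python
def bonferroni(pvals):
    """Bonferroni: multiply each p-value by the number of tests, cap at 1."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def holm(pvals):
    """Holm step-down: (m - rank) multiplier with a running maximum."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, min(1.0, (m - rank) * pvals[i]))
        adjusted[i] = running_max
    return adjusted

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg step-up: m/rank multiplier with a running minimum."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_min = 1.0
    for rank in range(m - 1, -1, -1):
        i = order[rank]
        running_min = min(running_min, min(1.0, pvals[i] * m / (rank + 1)))
        adjusted[i] = running_min
    return adjusted

p = [0.01, 0.04, 0.03, 0.005]
print([round(q, 3) for q in bonferroni(p)])          # [0.04, 0.16, 0.12, 0.02]
print([round(q, 3) for q in holm(p)])                # [0.03, 0.06, 0.06, 0.02]
print([round(q, 3) for q in benjamini_hochberg(p)])  # [0.02, 0.04, 0.04, 0.02]
```

The running max/min steps enforce the monotonicity that makes the step-down and step-up procedures valid adjusted p-values.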

  4. Joint source based analysis of multiple brain structures in studying major depressive disorder

    Science.gov (United States)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
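A minimal version of the classification experiment, a Fisher Linear Discriminant evaluated with leave-one-out cross-validation, can be sketched on synthetic features. The simulated data stand in for the pose/shape/composition sources of the study.

```python
import numpy as np

# Two synthetic classes in 3 dimensions, standing in for jSBA features.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 1.0, (30, 3)), rng.normal(1.5, 1.0, (30, 3))])
y = np.array([0] * 30 + [1] * 30)

def fld_fit(X, y):
    """Fisher Linear Discriminant: direction w = Sw^-1 (m1 - m0)."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)   # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)
    threshold = w @ (m0 + m1) / 2.0                  # midpoint decision rule
    return w, threshold

correct = 0
for i in range(len(y)):                              # leave-one-out CV
    mask = np.arange(len(y)) != i
    w, t = fld_fit(X[mask], y[mask])
    pred = int(w @ X[i] > t)
    correct += pred == y[i]
accuracy = correct / len(y)
print(accuracy > 0.7)
```

Leaving each subject out before fitting, as in the study, is what keeps the reported accuracy an honest estimate rather than a training-set score.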

  5. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young [Low-temperature Plasma Laboratory, Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 305-701 (Korea, Republic of); An, Sang-Hyuk [Agency of Defense Development, Yuseong-gu, Daejeon 305-151 (Korea, Republic of)

    2014-08-15

    In this study, we attempted to determine the feasibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to obtain the current at each coil in the 2-coil experiment. Based on the results, multiple ICP sources appear feasible, owing to the direct change of impedance with current and the saturation of impedance caused by the skin-depth effect. A helicon plasma source, however, is difficult to adapt to multiple sources because of the continual change of real impedance during mode transition and the low uniformity of the B-field confinement. As a result, it is expected that ICP can be adapted to multiple sources for large-area processes.

  6. Shape optimization of an airfoil in a BZT flow with multiple-source uncertainties

    International Nuclear Information System (INIS)

    Congedo, P.M.; Corre, C.; Martinez, J.M.

    2011-01-01

    Bethe-Zel'dovich-Thompson (BZT) fluids are characterized by negative values of the fundamental derivative of gas dynamics for a range of temperatures and pressures in the vapor phase, which leads to non-classical gas dynamic behaviors such as the disintegration of compression shocks. These non-classical phenomena can be exploited, when using these fluids in Organic Rankine Cycles (ORCs), to increase isentropic efficiency. A predictive numerical simulation of these flows must account for two main sources of physical uncertainty: the BZT fluid properties, which are often difficult to measure accurately, and the usually fluctuating turbine inlet conditions. To take full advantage of the BZT properties, the turbine geometry must also be specifically designed, keeping in mind that the geometry achieved in practice after machining always differs slightly from the theoretical shape. This paper investigates efficient procedures for performing shape optimization in a 2D BZT flow with multiple-source uncertainties (thermodynamic model, operating conditions and geometry). To demonstrate the feasibility of the proposed strategies for shape optimization in the presence of multiple-source uncertainties, a zero-incidence symmetric airfoil wave-drag minimization problem is retained as a case study. This simplified configuration encompasses most of the features associated with a turbine design problem, as far as uncertainty quantification is concerned. A preliminary analysis of the contributions to the variance of the wave-drag allows the most significant sources of uncertainty to be selected using a reduced number of flow computations. The resulting mean value and variance of the objective are next turned into metamodels. The optimal Pareto sets corresponding to the minimization of various substitute functions are obtained using a genetic algorithm as optimizer and their differences are discussed. (authors)

  7. Use of multiple data sources to estimate hepatitis C seroprevalence among prisoners: A retrospective cohort study.

    Directory of Open Access Journals (Sweden)

    Kathryn J Snow

    Full Text Available Hepatitis C is a major cause of preventable morbidity and mortality. Prisoners are a key population for hepatitis C control programs, and with the advent of highly effective therapies, prisons are increasingly important sites for hepatitis C diagnosis and treatment. Accurate estimates of hepatitis C prevalence among prisoners are needed in order to plan and resource service provision; however, many prevalence estimates are based on surveys compromised by limited and potentially biased participation. We aimed to compare estimates derived from three different data sources, and to assess whether the use of self-report as a supplementary data source may help researchers assess the risk of selection bias. We used three data sources to estimate the prevalence of hepatitis C antibodies in a large cohort of Australian prisoners: prison medical records, self-reported status during a face-to-face interview prior to release from prison, and data from a statewide notifiable conditions surveillance system. Of 1,315 participants, 33.8% had at least one indicator of hepatitis C seropositivity, but less than one third of these (9.5% of the entire cohort) were identified by all three data sources. Among participants of known status, self-report had a sensitivity of 80.1% and a positive predictive value of 97.8%. Any one data source used in isolation would have under-estimated the prevalence of hepatitis C in this cohort. Using multiple data sources in studies of hepatitis C seroprevalence among prisoners may improve case detection and help researchers assess the risk of selection bias due to non-participation in serological testing.
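The self-report performance measures quoted above follow from a standard 2x2 comparison against the combined reference. The counts below are hypothetical, chosen only so the standard formulas reproduce the reported percentages; the study's actual cell counts are not given here.

```python
# Hypothetical 2x2 cells: self-report vs combined serology/records reference.
true_pos = 358   # self-reported positive, reference positive
false_neg = 89   # not self-reported, reference positive
false_pos = 8    # self-reported positive, reference negative

# Sensitivity: share of reference-positives that self-report captures.
sensitivity = true_pos / (true_pos + false_neg)
# PPV: share of self-reported positives confirmed by the reference.
ppv = true_pos / (true_pos + false_pos)
print(round(100 * sensitivity, 1), round(100 * ppv, 1))  # → 80.1 97.8
```

The high PPV and lower sensitivity match the abstract's conclusion: self-report rarely asserts a false positive but misses a fifth of true positives, so any single source underestimates prevalence.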

  8. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis or performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system, the energy level in the X-ray spectrum to be used is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam. Higher energy levels allow penetration of thicker components made of denser materials. In response to local industry demand, and in support of on-going research activity in the area of 3D X-ray imaging for industrial inspection, the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of a large scale multiple source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available in the Institute. The system consists of a large area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open tube microfocus X-ray source and a 450 kV closed tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel, onto which high precision linear and rotary motors have been mounted to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system can currently accommodate samples up to 0.5 x 0.5 x 0.5 m in size and up to 50 kg in weight. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in the future

  9. Optimizing Irrigation Water Allocation under Multiple Sources of Uncertainty in an Arid River Basin

    Science.gov (United States)

    Wei, Y.; Tang, D.; Gao, H.; Ding, Y.

    2015-12-01

    Population growth and climate change add pressures affecting water resources management strategies for meeting demands from different economic sectors. This is especially challenging in arid regions where fresh water is limited. For instance, in the Tailanhe River Basin (Xinjiang, China), a compromise must be made between water suppliers and users during drought years. This study presents a multi-objective irrigation water allocation model to cope with water scarcity in arid river basins. To deal with the uncertainties from multiple sources in the water allocation system (e.g., variations of available water amount, crop yield, crop prices, and water price), the model employs an interval linear programming approach. The multi-objective optimization model developed in this study is characterized by integrating ecosystem service theory into water-saving measures. For evaluation purposes, the model is used to construct an optimal allocation system for irrigation areas fed by the Tailan River (Xinjiang Province, China). The objective functions to be optimized are formulated based on these irrigation areas' economic, social, and ecological benefits. The optimal irrigation water allocation plans are made under different hydroclimate conditions (wet year, normal year, and dry year), with multiple sources of uncertainty represented. The modeling tool and results are valuable for advising decision making by the local water authority and the agricultural community, especially on measures for coping with water scarcity (by incorporating uncertain factors associated with crop production planning).
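The interval-programming idea, solving the allocation at each bound of an uncertain input, can be sketched with a toy single-constraint problem. The crops, benefit coefficients, and supply interval below are hypothetical; the study's actual model is a multi-objective interval linear program.

```python
# Allocate an uncertain water supply between crops to maximize benefit.
# With one shared resource and per-crop demand caps, funding the
# highest-benefit crop first is exactly the LP optimum.
crops = [
    # (name, benefit per Mm3 of water, max water demand in Mm3)
    ("cotton", 0.9, 60.0),
    ("wheat", 0.6, 50.0),
]
supply_interval = (70.0, 100.0)  # uncertain available water, Mm3

def allocate(supply):
    """Greedy (here optimal) allocation for a given supply realization."""
    plan, remaining = {}, supply
    for name, benefit, demand in sorted(crops, key=lambda c: -c[1]):
        plan[name] = min(demand, remaining)
        remaining -= plan[name]
    total = sum(plan[n] * b for n, b, d in crops)
    return plan, total

# Interval solution: solve once at each bound of the uncertain supply,
# giving an interval of optimal plans and benefits.
for s in supply_interval:
    plan, benefit = allocate(s)
    print(s, plan, round(benefit, 1))
```

Solving at both bounds brackets the decision space; the full interval LP additionally propagates interval coefficients (prices, yields) through the objective and constraints.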

  10. Assessing regional groundwater stress for nations using multiple data sources with the groundwater footprint

    International Nuclear Information System (INIS)

    Gleeson, Tom; Wada, Yoshihide

    2013-01-01

    Groundwater is a critical resource for agricultural production, ecosystems, drinking water and industry, yet groundwater depletion is accelerating, especially in a number of agriculturally important regions. Assessing the stress of groundwater resources is crucial for science-based policy and management, yet water stress assessments have often neglected groundwater and used single data sources, which may underestimate the uncertainty of the assessment. We consistently analyze and interpret groundwater stress across whole nations using multiple data sources for the first time. We focus on two nations with the highest national groundwater abstraction rates in the world, the United States and India, and use the recently developed groundwater footprint and multiple datasets of groundwater recharge and withdrawal derived from hydrologic models and data synthesis. A minority of aquifers, mostly with known groundwater depletion, show groundwater stress regardless of the input dataset. The majority of aquifers are not stressed with any input data while less than a third are stressed for some input data. In both countries groundwater stress affects agriculturally important regions. In the United States, groundwater stress impacts a lower proportion of the national area and population, and is focused in regions with lower population and water well density compared to India. Importantly, the results indicate that the uncertainty is generally greater between datasets than within datasets and that much of the uncertainty is due to recharge estimates. Assessment of groundwater stress consistently across a nation and assessment of uncertainty using multiple datasets are critical for the development of a science-based rationale for policy and management, especially with regard to where and to what extent to focus limited research and management resources. (letter)
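The groundwater footprint underlying this assessment is, as defined by Gleeson and colleagues, GF = A * C / (R - E): aquifer area A scaled by abstraction C relative to recharge R net of the groundwater contribution to environmental flow E. The ratio GF/A > 1 then indicates stress. A sketch with illustrative values (not the study's data):

```python
def groundwater_footprint_ratio(abstraction, recharge, env_flow):
    """Return GF/A, the stress ratio; > 1 means use exceeds net recharge.

    Formula assumed from the groundwater-footprint literature; inputs in
    consistent volume-per-time units (e.g. km3/yr).
    """
    return abstraction / (recharge - env_flow)

# Illustrative aquifers: (abstraction C, recharge R, environmental flow E)
aquifers = {
    "depleting example": (30.0, 35.0, 15.0),
    "unstressed example": (5.0, 40.0, 10.0),
}
for name, (c, r, e) in aquifers.items():
    ratio = groundwater_footprint_ratio(c, r, e)
    print(name, round(ratio, 2), "stressed" if ratio > 1.0 else "ok")
```

The letter's point about uncertainty amounts to recomputing this ratio with each recharge and withdrawal dataset and checking whether the stressed/unstressed classification is robust across them.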

  11. Sources of water column methylmercury across multiple estuaries in the Northeast U.S.

    Science.gov (United States)

    Balcom, Prentiss H; Schartup, Amina T; Mason, Robert P; Chen, Celia Y

    2015-12-20

    Estuarine water column methylmercury (MeHg) is an important driver of mercury (Hg) bioaccumulation in pelagic organisms and thus it is necessary to understand the sources and processes affecting environmental levels of MeHg. Increases in water column MeHg concentrations can ultimately be transferred to fish consumed by humans, but despite this, the sources of MeHg to the estuarine water column are still poorly understood. Here we evaluate MeHg sources across 4 estuaries and 10 sampling sites and examine the distributions and partitioning of sediment and water column MeHg across a geographic range (Maine to New Jersey). Our study sites present a gradient in the concentrations of sediment, pore water and water column Hg species. Suspended particle MeHg ranged from below detection to 187 pmol g⁻¹, dissolved MeHg from 0.01 to 0.68 pM, and sediment MeHg from 0.01 to 109 pmol g⁻¹. Across multiple estuaries, dissolved MeHg correlated with Hg species in the water column, and sediment MeHg correlated with sediment total Hg (HgT). Water column MeHg did not correlate well with sediment Hg across estuaries, indicating that sediment concentrations were not a good predictor of water MeHg concentrations. This is an unexpected finding since it has been shown that MeHg production from inorganic Hg²⁺ within sediment is the primary source of MeHg to coastal waters. Additional sources of MeHg regulate water column MeHg levels in some of the shallow estuaries included in this study.

  12. The Protein Identifier Cross-Referencing (PICR) service: reconciling protein identifiers across multiple source databases

    Directory of Open Access Journals (Sweden)

    Leinonen Rasko

    2007-10-01

    Full Text Available Abstract Background Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. Results We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. Conclusion We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. The PICR

  13. Analogue Gravity

    Directory of Open Access Journals (Sweden)

    Carlos Barceló

    2011-05-01

    Full Text Available Analogue gravity is a research programme which investigates analogues of general relativistic gravitational fields within other physical systems, typically but not exclusively condensed matter systems, with the aim of gaining new insights into their corresponding problems. Analogue models of (and for) gravity have a long and distinguished history dating back to the earliest years of general relativity. In this review article we will discuss the history, aims, results, and future prospects for the various analogue models. We start the discussion by presenting a particularly simple example of an analogue model, before exploring the rich history and complex tapestry of models discussed in the literature. The last decade in particular has seen a remarkable and sustained development of analogue gravity ideas, leading to some hundreds of published articles, a workshop, two books, and this review article. Future prospects for the analogue gravity programme also look promising, both on the experimental front (where technology is rapidly advancing) and on the theoretical front (where variants of analogue models can be used as a springboard for radical attacks on the problem of quantum gravity).

  14. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    International Nuclear Information System (INIS)

    Woracek, R.; Hofmann, T.; Bulat, M.; Sales, M.; Habicht, K.; Andersen, K.; Strobl, M.

    2016-01-01

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.
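    The quoted 1.6 Å to 10 Å band can be related to time of flight, which is what a WFM chopper system slices into frames: a neutron of wavelength λ (Å) travels at roughly v ≈ 3956/λ m/s, from h/m_n ≈ 3.956 × 10⁻⁷ m² s⁻¹. A minimal sketch; the 50 m flight path is a made-up illustration, not the TBL geometry:

```python
# h / m_n: Planck constant over neutron mass, ~3.956e-7 m^2/s,
# i.e. v [m/s] ~= 3956 / (wavelength in angstrom).
H_OVER_M_N = 3.956e-7

def neutron_speed(wavelength_A):
    """Speed in m/s for a neutron of the given wavelength (angstrom)."""
    return H_OVER_M_N / (wavelength_A * 1e-10)

def time_of_flight(wavelength_A, path_m):
    """Flight time in seconds over path_m metres."""
    return path_m / neutron_speed(wavelength_A)

# Over a hypothetical 50 m flight path the band spreads out in time:
t_fast = time_of_flight(1.6, 50.0)   # ~20 ms
t_slow = time_of_flight(10.0, 50.0)  # ~126 ms
```

    The wide spread between the fastest and slowest neutrons of the band is why a long-pulse source needs pulse shaping (here, the double chopper plus WFM system) to reach a defined wavelength resolution.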

  15. Misconceptions and biases in German students' perception of multiple energy sources: implications for science education

    Science.gov (United States)

    Lee, Roh Pin

    2016-04-01

    Misconceptions and biases in energy perception could influence people's support for developments integral to the success of restructuring a nation's energy system. Science education, in equipping young adults with the cognitive skills and knowledge necessary to navigate in the confusing energy environment, could play a key role in paving the way for informed decision-making. This study examined German students' knowledge of the contribution of diverse energy sources to their nation's energy mix as well as their affective energy responses so as to identify implications for science education. Specifically, the study investigated whether and to what extent students hold mistaken beliefs about the role of multiple energy sources in their nation's energy mix, and assessed how misconceptions could act as self-generated reference points to underpin support/resistance of proposed developments. An in-depth analysis of spontaneous affective associations with five key energy sources also enabled the identification of underlying concerns driving people's energy responses and facilitated an examination of how affective perception, in acting as a heuristic, could lead to biases in energy judgment and decision-making. Finally, subgroup analysis differentiated by education and gender supported insights into a 'two culture' effect on energy perception and the challenge it poses to science education.

  16. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    Energy Technology Data Exchange (ETDEWEB)

    Woracek, R., E-mail: robin.woracek@esss.se [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Hofmann, T.; Bulat, M. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Sales, M. [Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark); Habicht, K. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Andersen, K. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Strobl, M. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark)

    2016-12-11

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.

  17. Quantum Gravity

    International Nuclear Information System (INIS)

    Giribet, G E

    2005-01-01

    Claus Kiefer presents his book, Quantum Gravity, with his hope that '[the] book will convince readers of [the] outstanding problem [of unification and quantum gravity] and encourage them to work on its solution'. With this aim, the author presents a clear exposition of the fundamental concepts of gravity and the steps towards the understanding of its quantum aspects. The main part of the text is dedicated to the analysis of standard topics in the formulation of general relativity. An analysis of the Hamiltonian formulation of general relativity and the canonical quantization of gravity is performed in detail. Chapters four, five and eight provide a pedagogical introduction to the basic concepts of gravitational physics. In particular, aspects such as the quantization of constrained systems, the role played by the quadratic constraint, the ADM decomposition, the Wheeler-DeWitt equation and the problem of time are treated in an expert and concise way. Moreover, other specific topics, such as the minisuperspace approach and the feasibility of defining extrinsic times for certain models, are discussed as well. The ninth chapter of the book is dedicated to the quantum gravitational aspects of string theory. Here, a minimalistic but clear introduction to string theory is presented, and this is actually done with emphasis on gravity. It is worth mentioning that no hard (nor explicit) computations are presented, even though the exposition covers the main features of the topic. For instance, black hole statistical physics (within the framework of string theory) is developed in a pedagogical and concise way by means of heuristic arguments. As the author asserts in the epilogue, the hope of the book is to give 'some impressions from progress' made in the study of quantum gravity since its beginning, i.e., since the end of the 1920s. In my opinion, Kiefer's book does actually achieve this goal and gives an extensive review of the subject. (book review)

  18. A multiple objective magnet sorting algorithm for the Advanced Light Source insertion devices

    International Nuclear Information System (INIS)

    Humphries, D.; Goetz, F.; Kownacki, P.; Marks, S.; Schlueter, R.

    1995-01-01

    Insertion devices for the Advanced Light Source (ALS) incorporate large numbers of permanent magnets which have a variety of magnetization orientation errors. These orientation errors can produce field errors which affect both the spectral brightness of the insertion devices and the storage ring electron beam dynamics. A perturbation study was carried out to quantify the effects of orientation errors acting in a hybrid magnetic structure. The results of this study were used to develop a multiple stage sorting algorithm which minimizes undesirable integrated field errors and essentially eliminates pole excitation errors. When applied to a measured magnet population for an existing insertion device, an order of magnitude reduction in integrated field errors was achieved while maintaining near zero pole excitation errors

  19. Use of multiple water surface flow constructed wetlands for non-point source water pollution control.

    Science.gov (United States)

    Li, Dan; Zheng, Binghui; Liu, Yan; Chu, Zhaosheng; He, Yan; Huang, Minsheng

    2018-05-02

    Multiple free water surface flow constructed wetlands (multi-FWS CWs) are a variant of conventional water treatment systems for the interception of pollutants. This review encapsulates their characteristics and applications in the field of ecological non-point source water pollution control technology. The roles of in-series design and operation parameters (hydraulic residence time, hydraulic loading rate, water depth and aspect ratio, composition of influent, and plant species) in performance intensification are also analyzed; these parameters are crucial to achieving sustainable and effective contaminant removal, especially the retention of nutrients. The mechanisms by which design and operation parameters govern the removal of nitrogen and phosphorus are also highlighted. Perspectives for further research on optimizing design/operation parameters and on advanced ecological restoration technologies are outlined to help interpret the functions of multi-FWS CWs.

  20. The test beamline of the European Spallation Source - Instrumentation development and wavelength frame multiplication

    DEFF Research Database (Denmark)

    Woracek, R.; Hofmann, T.; Bulat, M.

    2016-01-01

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.

  1. Community Response to Multiple Sound Sources: Integrating Acoustic and Contextual Approaches in the Analysis

    Directory of Open Access Journals (Sweden)

    Peter Lercher

    2017-06-01

    Full Text Available Sufficient data refer to the relevant prevalence of sound exposure by mixed traffic sources in many nations. Furthermore, consideration of the potential effects of combined sound exposure is required in legal procedures such as environmental health impact assessments. Nevertheless, current practice still uses single exposure-response functions. It is silently assumed that those standard exposure-response curves also accommodate mixed exposures, although some evidence from experimental and field studies casts doubt on this practice. The ALPNAP study population (N = 1641) contains sufficient subgroups with combinations of rail-highway, highway-main road and rail-highway-main road sound exposure. In this paper we apply a few approaches suggested in the literature to investigate exposure-response curves and their major determinants in the case of exposure to multiple traffic sources. High/moderate annoyance and full-scale mean annoyance served as outcomes. The results show several limitations of the current approaches. Even facing the inherent methodological limitations (energy-equivalent summation of sound, rating of overall annoyance), the consideration of main contextual factors jointly occurring with the sources (such as vibration, air pollution or coping activities) and judgments of the wider area soundscape increases the variance explained from up to 8% (bivariate) to up to 15% (base adjustments) and up to 55% (full contextual model). The added predictors vary significantly depending on the source combination (e.g., significant vibration effects with main road/railway, not highway). Although no significant interactions were found, the observed additive effects are of public health importance. Especially in the case of a three-source exposure situation, the overall annoyance is already high at lower levels and the contribution of the acoustic indicators is small compared with the non-acoustic and contextual predictors. Noise mapping needs to go down to

  2. Multiple Sources of Pressure for Change: The Barroso Commission and Energy Policy for an Enlarged EU

    Directory of Open Access Journals (Sweden)

    Jan Frederik Braun

    2009-11-01

    Full Text Available This article presents a preliminary analysis of how and why the role, work and status of the European Commission are changing in an enlarged European Union. It does so by focusing on multiple sources of pressure for change. These include: enlargement, new modes of governance, administrative reforms and changed leadership under Barroso. Combined, though not interlinked, these multiple sources of pressure are evidence of the increasing difficulty for the Commission to design and propose Community-wide answers to complex challenges in a more diverse Union. For this reason, the Commission under Barroso relies less on its traditional monopoly power to propose formal legislation and more on non-traditional modes of policy-making. Energy policy, especially its external dimension, constitutes a policy field that has been affected by enlargement, i.e. characterised by an increasing heterogeneity of needs and preferences among the member states. Not only does it resist Community-wide answers, it also allows the Commission, as an agent, to make use of bureaucratic drifts, i.e. exploit its strategic position in the EU’s governance system and use a range of formal and informal resources of expertise. To deliver sustainable European added value to this complex policy area, however, the Commission must focus more on pragmatic policy results by making smart use of the EU’s increasing asymmetry, diversity and subsidiarity in a bottom-up approach. A non-legislative approach can serve as a modus vivendi to keep the momentum going in the Union’s difficult struggle to establish a workable energy regime.

  3. The Use of Source-Related Strategies in Evaluating Multiple Psychology Texts: A Student-Scientist Comparison

    Science.gov (United States)

    von der Mühlen, Sarah; Richter, Tobias; Schmid, Sebastian; Schmidt, Elisabeth Marie; Berthold, Kirsten

    2016-01-01

    Multiple text comprehension can greatly benefit from paying attention to sources and from using this information for evaluating text information. Previous research based on texts from the domain of history suggests that source-related strategies are acquired as part of the discipline expertise as opposed to the spontaneous use of these strategies…

  4. Optimal planning of multiple distributed generation sources in distribution networks: A new approach

    Energy Technology Data Exchange (ETDEWEB)

    AlRashidi, M.R., E-mail: malrash2002@yahoo.com [Department of Electrical Engineering, College of Technological Studies, Public Authority for Applied Education and Training (PAAET) (Kuwait); AlHajri, M.F., E-mail: mfalhajri@yahoo.com [Department of Electrical Engineering, College of Technological Studies, Public Authority for Applied Education and Training (PAAET) (Kuwait)

    2011-10-15

    Highlights: → A new hybrid PSO for optimal DGs placement and sizing. → Statistical analysis to fine tune PSO parameters. → Novel constraint handling mechanism to handle different constraints types. - Abstract: An improved particle swarm optimization algorithm (PSO) is presented for optimal planning of multiple distributed generation sources (DG). This problem can be divided into two sub-problems: the DG optimal size (continuous optimization) and location (discrete optimization) to minimize real power losses. The proposed approach addresses the two sub-problems simultaneously using an enhanced PSO algorithm capable of handling multiple DG planning in a single run. A design of experiment is used to fine tune the proposed approach via proper analysis of PSO parameters interaction. The proposed algorithm treats the problem constraints differently by adopting a radial power flow algorithm to satisfy the equality constraints, i.e. power flows in distribution networks, while the inequality constraints are handled by making use of some of the PSO features. The proposed algorithm was tested on the practical 69-bus power distribution system. Different test cases were considered to validate the proposed approach consistency in detecting optimal or near optimal solution. Results are compared with those of Sequential Quadratic Programming.
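    The record's hybrid PSO handles siting (discrete) and sizing (continuous) together with a power-flow-based objective; as a generic illustration of the continuous half only, here is a bare-bones PSO minimizing a stand-in loss function. Everything below (parameters, bounds, objective) is a textbook sketch, not the authors' algorithm:

```python
import random

def pso(loss, dim, n_particles=20, iters=300, bounds=(-5.0, 5.0), seed=1):
    """Minimal particle swarm optimizer for a continuous objective."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # per-particle best positions
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5                 # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Stand-in objective; in the paper's setting this would be the real power
# losses returned by a radial power flow for a candidate DG sizing.
best, val = pso(lambda x: sum(xi * xi for xi in x), dim=2)
```

    In the paper's formulation the equality constraints are absorbed into the power-flow evaluation itself, which is why a plain unconstrained update like this can still be the core of the search.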

  5. Optimal planning of multiple distributed generation sources in distribution networks: A new approach

    International Nuclear Information System (INIS)

    AlRashidi, M.R.; AlHajri, M.F.

    2011-01-01

    Highlights: → A new hybrid PSO for optimal DGs placement and sizing. → Statistical analysis to fine tune PSO parameters. → Novel constraint handling mechanism to handle different constraints types. - Abstract: An improved particle swarm optimization algorithm (PSO) is presented for optimal planning of multiple distributed generation sources (DG). This problem can be divided into two sub-problems: the DG optimal size (continuous optimization) and location (discrete optimization) to minimize real power losses. The proposed approach addresses the two sub-problems simultaneously using an enhanced PSO algorithm capable of handling multiple DG planning in a single run. A design of experiment is used to fine tune the proposed approach via proper analysis of PSO parameters interaction. The proposed algorithm treats the problem constraints differently by adopting a radial power flow algorithm to satisfy the equality constraints, i.e. power flows in distribution networks, while the inequality constraints are handled by making use of some of the PSO features. The proposed algorithm was tested on the practical 69-bus power distribution system. Different test cases were considered to validate the proposed approach consistency in detecting optimal or near optimal solution. Results are compared with those of Sequential Quadratic Programming.

  6. Multiple Signal Classification Algorithm Based Electric Dipole Source Localization Method in an Underwater Environment

    Directory of Open Access Journals (Sweden)

    Yidong Xu

    2017-10-01

    Full Text Available A novel localization method based on the multiple signal classification (MUSIC) algorithm is proposed for positioning an electric dipole source in a confined underwater environment by using an electric dipole-receiving antenna array. In this method, the boundary element method (BEM) is introduced to analyze the boundary of the confined region by use of a matrix equation. The voltage of each dipole pair is used as spatial-temporal localization data, and, unlike the conventional field-based localization method, there is no need to obtain the field component in each direction, so the method can be easily implemented in practical engineering applications. Then, a global-multiple region-conjugate gradient (CG) hybrid search method is used to reduce the computational burden and to improve the operation speed. Two localization simulation models and a physical experiment are conducted. Both the simulation results and the physical experiment show accurate positioning performance, verifying the effectiveness of the proposed localization method in underwater environments.
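    The MUSIC core itself is standard: eigendecompose the sample covariance, keep the noise subspace, and scan candidate steering vectors for near-orthogonality to it. A generic narrowband sketch with a hypothetical 8-element half-wavelength linear array, not the paper's BEM-based underwater formulation:

```python
import numpy as np

def music_spectrum(X, n_sources, steering):
    """MUSIC pseudo-spectrum.

    X: (n_sensors, n_snapshots) complex data;
    steering: (n_sensors, n_angles) candidate steering vectors.
    """
    R = X @ X.conj().T / X.shape[1]         # sample covariance
    eigval, eigvec = np.linalg.eigh(R)      # eigenvalues in ascending order
    En = eigvec[:, :-n_sources]             # noise subspace (smallest eigenvalues)
    # a^H (En En^H) a per candidate angle; small where a is a true steering vector
    proj = np.abs(np.sum(np.conj(steering) * (En @ En.conj().T @ steering), axis=0))
    return 1.0 / proj                       # peaks at source directions

# Simulated data: one narrowband source at 20 degrees, light noise.
rng = np.random.default_rng(0)
n, snaps = 8, 200
angles = np.deg2rad(np.arange(-90, 91))
A = np.exp(1j * np.pi * np.outer(np.arange(n), np.sin(angles)))  # steering matrix
true_idx = int(np.argmin(np.abs(np.rad2deg(angles) - 20)))
s = rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps)
noise = 0.1 * (rng.standard_normal((n, snaps)) + 1j * rng.standard_normal((n, snaps)))
X = np.outer(A[:, true_idx], s) + noise
P = music_spectrum(X, 1, A)
est = float(np.rad2deg(angles[np.argmax(P)]))   # estimated direction, degrees
```

    The paper replaces the free-space steering model with BEM-computed dipole-pair voltages for the confined region; the subspace scan stays the same.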

  7. Simulating Gravity

    Science.gov (United States)

    Pipinos, Savas

    2010-01-01

    This article describes one classroom activity in which the author simulates Newtonian gravity and employs Euclidean geometry with the use of new technologies (NT). The prerequisites for this activity were some knowledge of the formulae for a particle's free fall in physics and, most certainly, a good understanding of the notion of similarity…

  8. Cellular gravity

    NARCIS (Netherlands)

    F.C. Gruau; J.T. Tromp (John)

    1999-01-01

    textabstractWe consider the problem of establishing gravity in cellular automata. In particular, when cellular automata states can be partitioned into empty, particle, and wall types, with the latter enclosing rectangular areas, we desire rules that will make the particles fall down and pile up on

  9. Electrical source imaging of interictal spikes using multiple sparse volumetric priors for presurgical epileptogenic focus localization

    Directory of Open Access Journals (Sweden)

    Gregor Strobbe

    2016-01-01

    Full Text Available Electrical source imaging of interictal spikes observed in EEG recordings of patients with refractory epilepsy provides useful information to localize the epileptogenic focus during the presurgical evaluation. However, the selection of the time points or time epochs of the spikes in order to estimate the origin of the activity remains a challenge. In this study, we consider a Bayesian EEG source imaging technique for distributed sources, i.e. the multiple volumetric sparse priors (MSVP) approach. The approach allows to estimate the time courses of the intensity of the sources corresponding with a specific time epoch of the spike. Based on presurgical averaged interictal spikes in six patients who were successfully treated with surgery, we estimated the time courses of the source intensities for three different time epochs: (i) an epoch starting 50 ms before the spike peak and ending at 50% of the spike peak during the rising phase of the spike, (ii) an epoch starting 50 ms before the spike peak and ending at the spike peak and (iii) an epoch containing the full spike time period starting 50 ms before the spike peak and ending 230 ms after the spike peak. To identify the primary source of the spike activity, the source with the maximum energy from 50 ms before the spike peak till 50% of the spike peak was subsequently selected for each of the time windows. For comparison, the activity at the spike peaks and at 50% of the peaks was localized using the LORETA inversion technique and an ECD approach. Both patient-specific spherical forward models and patient-specific 5-layered finite difference models were considered to evaluate the influence of the forward model. Based on the resected zones in each of the patients, extracted from post-operative MR images, we compared the distances to the resection border of the estimated activity. Using the spherical models, the distances to the resection border for the MSVP approach and each of the different time

  10. Prediction With Dimension Reduction of Multiple Molecular Data Sources for Patient Survival

    Directory of Open Access Journals (Sweden)

    Adam Kaplan

    2017-07-01

    Full Text Available Predictive modeling from high-dimensional genomic data is often preceded by a dimension reduction step, such as principal component analysis (PCA). However, the application of PCA is not straightforward for multisource data, wherein multiple sources of ‘omics data measure different but related biological components. In this article, we use recent advances in the dimension reduction of multisource data for predictive modeling. In particular, we apply exploratory results from Joint and Individual Variation Explained (JIVE), an extension of PCA for multisource data, for prediction of differing response types. We conduct simulations to illustrate the practical advantages and interpretability of our approach. As an application example, we consider predicting survival for patients with glioblastoma multiforme from 3 data sources measuring messenger RNA expression, microRNA expression, and DNA methylation. We also introduce a method to estimate JIVE scores for new samples that were not used in the initial dimension reduction and study its theoretical properties; this method is implemented in the R package R.JIVE on CRAN, in the function jive.predict.

  11. SIGMA/B, Doses in Space Vehicle for Multiple Trajectories, Various Radiation Source

    International Nuclear Information System (INIS)

    Jordan, T.M.

    2003-01-01

    1 - Description of problem or function: SIGMA/B calculates radiation dose at arbitrary points inside a space vehicle, taking into account vehicle geometry, heterogeneous placement of equipment and stores, vehicle materials, time-weighted astronaut positions and many radiation sources from mission trajectories, e.g. geomagnetically trapped protons and electrons, solar flare particles, galactic cosmic rays and their secondary radiations. The vehicle geometry, equipment and supplies, and man models are described by quadric surfaces. The irradiating flux field may be anisotropic. The code can be used to perform simultaneous dose calculations for multiple vehicle trajectories, each involving several radiation sources. Results are presented either as dose as a function of shield thickness, or the dose received through designated outer sections of the vehicle. 2 - Method of solution: Automatic sectoring of the vehicle is performed by a Simpson's rule integration over angle; the dose is computed by a numerical angular integration of the dose attenuation kernels about the dose points. The kernels are curve-fit functions constructed from input data tables. 3 - Restrictions on the complexity of the problem: The code uses variable dimensioning techniques to store data. The only restriction on problem size is the available core storage
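    The "numerical angular integration" by Simpson's rule mentioned in the method of solution is the standard composite scheme; a minimal sketch over a single polar angle (the actual SIGMA/B sectoring over vehicle geometry and kernel curve-fits is far more involved):

```python
import math

def simpson(f, a, b, n=100):
    """Composite Simpson's rule on [a, b]; n (number of panels) must be even."""
    if n % 2:
        n += 1
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * k - 1) * h) for k in range(1, n // 2 + 1))  # odd nodes
    s += 2 * sum(f(a + 2 * k * h) for k in range(1, n // 2))            # even interior
    return s * h / 3

# e.g. the solid-angle weight sin(theta) integrated over [0, pi] -> 2
val = simpson(math.sin, 0.0, math.pi)
```

    In a dose calculation the integrand would be a dose-attenuation kernel evaluated along each angular sector rather than a closed-form function.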

  12. Characterize kinematic rupture history of large earthquakes with Multiple Haskell sources

    Science.gov (United States)

    Jia, Z.; Zhan, Z.

    2017-12-01

    Earthquakes are often regarded as continuous rupture along a single fault, but the occurrence of complex large events involving multiple faults and dynamic triggering challenges this view. Such rupture complexities cause difficulties in existing finite fault inversion algorithms, because they rely on specific parameterizations and regularizations to obtain physically meaningful solutions. Furthermore, it is difficult to assess reliability and uncertainty of obtained rupture models. Here we develop a Multi-Haskell Source (MHS) method to estimate rupture process of large earthquakes as a series of sub-events of varying location, timing and directivity. Each sub-event is characterized by a Haskell rupture model with uniform dislocation and constant unilateral rupture velocity. This flexible yet simple source parameterization allows us to constrain first-order rupture complexity of large earthquakes robustly. Additionally, relatively few parameters in the inverse problem yields improved uncertainty analysis based on Markov chain Monte Carlo sampling in a Bayesian framework. Synthetic tests and application of MHS method on real earthquakes show that our method can capture major features of large earthquake rupture process, and provide information for more detailed rupture history analysis.

  13. Gravity Before Einstein and Schwinger Before Gravity

    Science.gov (United States)

    Trimble, Virginia L.

    2012-05-01

    Julian Schwinger was a child prodigy, and Albert Einstein distinctly not; Schwinger had something like 73 graduate students, and Einstein very few. But both thought gravity was important. They were not, of course, the first, nor is the disagreement on how one should think about gravity that is being highlighted here the first such dispute. The talk will explore, first, several of the earlier dichotomies: was gravity capable of action at a distance (Newton), or was a transmitting ether required (many others)? Did it act on everything or only on solids (an odd idea of the Herschels that fed into their ideas of solar structure and sunspots)? Did gravitational information require time for its transmission? Is the exponent of r precisely 2, or 2 plus a smidgeon (a suggestion by Simon Newcomb among others)? And so forth. Second, I will try to say something about Schwinger's lesser-known early work and how it might have prefigured his "source theory," beginning with "On the Interaction of Several Electrons" (the unpublished 1934 "zeroth paper," whose title somewhat reminds one of "On the Dynamics of an Asteroid"), through his days at Berkeley with Oppenheimer, Gerjuoy, and others, to his application of ideas from nuclear physics to radar and of radar engineering techniques to problems in nuclear physics. And folks who think good jobs are difficult to come by now might want to contemplate the couple of years Schwinger spent teaching elementary physics at Purdue before moving on to the MIT Rad Lab for war work.

  14. Management of Multiple Nitrogen Sources during Wine Fermentation by Saccharomyces cerevisiae.

    Science.gov (United States)

    Crépin, Lucie; Truong, Nhat My; Bloem, Audrey; Sanchez, Isabelle; Dequin, Sylvie; Camarasa, Carole

    2017-03-01

    During fermentative growth in natural and industrial environments, Saccharomyces cerevisiae must redistribute the available nitrogen from multiple exogenous sources to amino acids in order to suitably fulfill anabolic requirements. To exhaustively explore the management of this complex resource, we developed an advanced strategy based on the reconciliation of data from a set of stable isotope tracer experiments with labeled nitrogen sources. Thus, quantifying the partitioning of the N compounds through the metabolism network during fermentation, we demonstrated that, contrary to the generally accepted view, only a limited fraction of most of the consumed amino acids is directly incorporated into proteins. Moreover, substantial catabolism of these molecules allows for efficient redistribution of nitrogen, supporting the operative de novo synthesis of proteinogenic amino acids. In contrast, catabolism of consumed amino acids plays a minor role in the formation of volatile compounds. Another important feature is that the α-keto acid precursors required for the de novo syntheses originate mainly from the catabolism of sugars, with a limited contribution from the anabolism of consumed amino acids. This work provides a comprehensive view of the intracellular fate of consumed nitrogen sources and the metabolic origin of proteinogenic amino acids, highlighting a strategy of distribution of metabolic fluxes implemented by yeast as a means of adapting to environments with changing and scarce nitrogen resources. IMPORTANCE A current challenge for the wine industry, in view of the extensive competition in the worldwide market, is to meet consumer expectations regarding the sensory profile of the product while ensuring an efficient fermentation process. Understanding the intracellular fate of the nitrogen sources available in grape juice is essential to the achievement of these objectives, since nitrogen utilization affects both the fermentative activity of yeasts and the

  15. Management of Multiple Nitrogen Sources during Wine Fermentation by Saccharomyces cerevisiae

    Science.gov (United States)

    Crépin, Lucie; Truong, Nhat My; Bloem, Audrey; Sanchez, Isabelle; Dequin, Sylvie

    2017-01-01

    During fermentative growth in natural and industrial environments, Saccharomyces cerevisiae must redistribute the available nitrogen from multiple exogenous sources to amino acids in order to suitably fulfill anabolic requirements. To exhaustively explore the management of this complex resource, we developed an advanced strategy based on the reconciliation of data from a set of stable isotope tracer experiments with labeled nitrogen sources. Thus, quantifying the partitioning of the N compounds through the metabolism network during fermentation, we demonstrated that, contrary to the generally accepted view, only a limited fraction of most of the consumed amino acids is directly incorporated into proteins. Moreover, substantial catabolism of these molecules allows for efficient redistribution of nitrogen, supporting the operative de novo synthesis of proteinogenic amino acids. In contrast, catabolism of consumed amino acids plays a minor role in the formation of volatile compounds. Another important feature is that the α-keto acid precursors required for the de novo syntheses originate mainly from the catabolism of sugars, with a limited contribution from the anabolism of consumed amino acids. This work provides a comprehensive view of the intracellular fate of consumed nitrogen sources and the metabolic origin of proteinogenic amino acids, highlighting a strategy of distribution of metabolic fluxes implemented by yeast as a means of adapting to environments with changing and scarce nitrogen resources. IMPORTANCE A current challenge for the wine industry, in view of the extensive competition in the worldwide market, is to meet consumer expectations regarding the sensory profile of the product while ensuring an efficient fermentation process. Understanding the intracellular fate of the nitrogen sources available in grape juice is essential to the achievement of these objectives, since nitrogen utilization affects both the fermentative activity of yeasts and

  16. Development of a Monte Carlo multiple source model for inclusion in a dose calculation auditing tool.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Fontenot, Jonas; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core Houston (IROC-H) (formerly the Radiological Physics Center) has reported varying levels of agreement in their anthropomorphic phantom audits. There is reason to believe one source of error in this observed disagreement is the accuracy of the dose calculation algorithms and heterogeneity corrections used. To audit this component of the radiotherapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Elekta 6 MV and 10 MV therapeutic x-ray beams were commissioned based on measurement of central axis depth dose data for a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open field measurements consisting of depth dose data and dose profiles for field sizes ranging from 3 × 3 cm² to 30 × 30 cm². The models were then benchmarked against measurements in IROC-H's anthropomorphic head and neck and lung phantoms. Validation results showed 97.9% and 96.8% of depth dose data passed a ±2% Van Dyk criterion for the 6 MV and 10 MV models, respectively. Dose profile comparisons showed an average agreement, using a ±2%/2 mm criterion, of 98.0% and 99.0% for the 6 MV and 10 MV models, respectively. Phantom plan comparisons were evaluated using a ±3%/2 mm gamma criterion, and average passing rates between Monte Carlo and measurements were 87.4% and 89.9% for the 6 MV and 10 MV models, respectively. Accurate multiple source models for Elekta 6 MV and 10 MV x-ray beams have been developed for inclusion in an independent dose calculation tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
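As a sketch of the gamma-criterion comparison used in the validation, the following computes a simplified 1-D global gamma passing rate on gridded points. The depth-dose curves are illustrative, and clinical gamma tools additionally interpolate the reference distribution, which this sketch omits.

```python
import math

def gamma_pass_rate(ref, ev, dd_pct=3.0, dta_mm=2.0):
    """Fraction of evaluated points with gamma <= 1 (1-D, global normalization).

    ref, ev: lists of (position_mm, dose). For each evaluated point, gamma is
    minimized over reference grid points only (no interpolation: a sketch).
    """
    d_max = max(d for _, d in ref)                 # global dose normalization
    n_pass = 0
    for xe, de in ev:
        gamma = min(
            math.sqrt(((de - dr) / (dd_pct / 100.0 * d_max)) ** 2
                      + ((xe - xr) / dta_mm) ** 2)
            for xr, dr in ref
        )
        n_pass += gamma <= 1.0
    return n_pass / len(ev)

# Illustrative depth-dose curves: a 1% offset passes 3%/2 mm, a 10% offset fails
ref = [(x, math.exp(-0.005 * x)) for x in range(100)]
rate_ok = gamma_pass_rate(ref, [(x, 1.01 * math.exp(-0.005 * x)) for x in range(100)])
rate_bad = gamma_pass_rate(ref, [(x, 1.10 * math.exp(-0.005 * x)) for x in range(100)])
```

The ±2%/2 mm and ±3%/2 mm criteria quoted in the abstract correspond to different `dd_pct`/`dta_mm` settings of the same formula.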

  17. A Numerical Study on the Excitation of Guided Waves in Rectangular Plates Using Multiple Point Sources

    Directory of Open Access Journals (Sweden)

    Wenbo Duan

    2017-12-01

    Full Text Available Ultrasonic guided waves are widely used to inspect and monitor the structural integrity of plates and plate-like structures, such as ship hulls and large storage-tank floors. Recently, ultrasonic guided waves have also been used to remove ice and fouling from ship hulls, wind-turbine blades and aeroplane wings. In these applications, the strength of the sound source must be high for scanning a large area, or to break the bond between ice, fouling and plate substrate. More than one transducer may be used to achieve maximum sound power output. However, multiple sources can interact with each other, and form a sound field in the structure with local constructive and destructive regions. Destructive regions are weak regions and shall be avoided. When multiple transducers are used it is important that they are arranged in a particular way so that the desired wave modes can be excited to cover the whole structure. The objective of this paper is to provide a theoretical basis for generating particular wave mode patterns in finite-width rectangular plates whose length is assumed to be infinitely long with respect to its width and thickness. The wave modes have displacements in both width and thickness directions, and are thus different from the classical Lamb-type wave modes. A two-dimensional semi-analytical finite element (SAFE) method was used to study dispersion characteristics and mode shapes in the plate up to ultrasonic frequencies. The modal analysis provided information on the generation of modes suitable for a particular application. The number of point sources and direction of loading for the excitation of a few representative modes was investigated. Based on the SAFE analysis, a standard finite element modelling package, Abaqus, was used to excite the designed modes in a three-dimensional plate. The generated wave patterns in Abaqus were then compared with mode shapes predicted in the SAFE model. Good agreement was observed between the

  18. Combining multiple sources of data to inform conservation of Lesser Prairie-Chicken populations

    Science.gov (United States)

    Ross, Beth; Haukos, David A.; Hagen, Christian A.; Pitman, James

    2018-01-01

    Conservation of small populations is often based on limited data from spatially and temporally restricted studies, resulting in management actions based on an incomplete assessment of the population drivers. If fluctuations in abundance are related to changes in weather, proper management is especially important, because extreme weather events could disproportionately affect population abundance. Conservation assessments, especially for vulnerable populations, are aided by a knowledge of how extreme events influence population status and trends. Although important for conservation efforts, data may be limited for small or vulnerable populations. Integrated population models maximize information from various sources of data to yield population estimates that fully incorporate uncertainty from multiple data sources while allowing for the explicit incorporation of environmental covariates of interest. Our goal was to assess the relative influence of population drivers for the Lesser Prairie-Chicken (Tympanuchus pallidicinctus) in the core of its range, western and southern Kansas, USA. We used data from roadside lek count surveys, nest monitoring surveys, and survival data from telemetry monitoring combined with climate (Palmer drought severity index) data in an integrated population model. Our results indicate that variability in population growth rate was most influenced by variability in juvenile survival. The Palmer drought severity index had no measurable direct effects on adult survival or mean number of offspring per female; however, there were declines in population growth rate following severe drought. Because declines in population growth rate occurred at a broad spatial scale, declines in response to drought were likely due to decreases in chick and juvenile survival rather than emigration outside of the study area. Overall, our model highlights the importance of accounting for environmental and demographic sources of variability, and provides a thorough

  19. Estimating the prevalence of illicit opioid use in New York City using multiple data sources

    Directory of Open Access Journals (Sweden)

    McNeely Jennifer

    2012-06-01

    Full Text Available Abstract Background Despite concerns about its health and social consequences, little is known about the prevalence of illicit opioid use in New York City. Individuals who misuse heroin and prescription opioids are known to bear a disproportionate burden of morbidity and mortality. Service providers and public health authorities are challenged to provide appropriate interventions in the absence of basic knowledge about the size and characteristics of this population. While illicit drug users are underrepresented in population-based surveys, they may be identified in multiple administrative data sources. Methods We analyzed large datasets tracking hospital inpatient and emergency room admissions as well as drug treatment and detoxification services utilization. These were applied in combination with findings from a large general population survey and administrative records tracking prescriptions, drug overdose deaths, and correctional health services, to estimate the prevalence of heroin and non-medical prescription opioid use among New York City residents in 2006. These data were further applied to a descriptive analysis of opioid users entering drug treatment and hospital-based medical care. Results These data sources identified 126,681 cases of opioid use among New York City residents in 2006. After applying adjustment scenarios to account for potential overlap between data sources, we estimated over 92,000 individual opioid users. By contrast, just 21,600 opioid users initiated drug treatment in 2006. Opioid users represented 4% of all individuals hospitalized, and accounted for over 44,000 hospitalizations during the calendar year. Conclusions Our findings suggest that innovative approaches are needed to provide adequate services to this sizeable population of opioid users. Given the observed high rates of hospital services utilization, greater integration of drug services into medical settings could be one component of an effective approach to
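The adjustment from summed cases to unique individuals can be sketched as a simple overlap scenario. The per-source split and the 27% overlap fraction below are hypothetical assumptions chosen only to echo the reported magnitudes; they are not the study's actual record-linkage analysis.

```python
def unique_users(source_counts, overlap_fraction):
    """Deflate a summed multi-source case count by an assumed
    fraction of individuals appearing in more than one source."""
    total = sum(source_counts.values())
    return round(total * (1.0 - overlap_fraction))

# Hypothetical split of the 126,681 identified cases across sources
counts = {"hospital": 44000, "drug_treatment": 21600, "other_sources": 61081}
estimate = unique_users(counts, overlap_fraction=0.27)
```

With these assumed numbers the estimate lands just above 92,000, the order of magnitude the abstract reports; the real study evaluated multiple overlap scenarios rather than a single fixed fraction.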

  20. Quantum gravity

    International Nuclear Information System (INIS)

    Isham, C.

    1989-01-01

    Gravitational effects are seen as arising from a curvature in spacetime. This must be reconciled with gravity's apparently passive role in quantum theory to achieve a satisfactory quantum theory of gravity. The development of grand unified theories has spurred the search, with forces being of equal strength at a unification energy of 10¹⁵-10¹⁸ GeV, with the ''Planck length'', Lp ≅ 10⁻³⁵ m. Fundamental principles of general relativity and quantum mechanics are outlined. Gravitons are shown to have spin-0, as mediators of gravitational force in the classical sense, or spin-2, related to the quantisation of general relativity. Applying the ideas of supersymmetry to gravitation implies partners for the graviton, especially the massless spin-3/2 fermion called a gravitino. The concept of supersymmetric strings is introduced and discussed. (U.K.)

  1. Quantum gravity

    International Nuclear Information System (INIS)

    Markov, M.A.; West, P.C.

    1984-01-01

    This book discusses the state of the art of quantum gravity, quantum effects in cosmology, quantum black-hole physics, recent developments in supergravity, and quantum gauge theories. Topics considered include the problems of general relativity, pregeometry, complete cosmological theories, quantum fluctuations in cosmology and galaxy formation, a new inflationary universe scenario, grand unified phase transitions and the early Universe, the generalized second law of thermodynamics, vacuum polarization near black holes, the relativity of vacuum, black hole evaporations and their cosmological consequences, currents in supersymmetric theories, the Kaluza-Klein theories, gauge algebra and quantization, and twistor theory. This volume constitutes the proceedings of the Second Seminar on Quantum Gravity held in Moscow in 1981

  2. DOA Estimation of Multiple LFM Sources Using a STFT-based and FBSS-based MUSIC Algorithm

    Directory of Open Access Journals (Sweden)

    K. B. Cui

    2017-12-01

    Full Text Available Direction of arrival (DOA) estimation is an important problem in array signal processing. An effective multiple signal classification (MUSIC) method based on the short-time Fourier transform (STFT) and forward/backward spatial smoothing (FBSS) techniques is addressed for the DOA estimation problem of multiple time-frequency (t-f) joint LFM sources. Previous work in the area, e.g. the STFT-MUSIC algorithm, cannot resolve sources that are largely or completely joint in the t-f domain, because it selects only single-source t-f points. The proposed method constructs the spatial t-f distributions (STFDs) by selecting multiple-source t-f points and uses the FBSS techniques to solve the problem of rank loss. In this way, the STFT-FBSS-MUSIC algorithm can resolve largely or completely t-f joint LFM sources. In addition, the proposed algorithm has quite low computational complexity when resolving multiple LFM sources, because it reduces the number of eigendecompositions and spectrum searches. The performance of the proposed method is compared with that of existing t-f based MUSIC algorithms through computer simulations, and the results show its good performance.
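The FBSS step can be sketched as follows for a uniform linear array; the 8-element array, subarray length, and source angles are arbitrary assumptions for illustration. The point of forward/backward smoothing is to restore the covariance rank that coherent (fully correlated) sources collapse to one.

```python
import numpy as np

def fbss(R, L):
    """Forward/backward spatial smoothing of an M x M array covariance.

    Averages the K = M - L + 1 overlapping L x L subarray covariances,
    then averages with the 'backward' (exchanged, conjugated) version.
    """
    M = R.shape[0]
    K = M - L + 1
    Rf = sum(R[k:k + L, k:k + L] for k in range(K)) / K
    J = np.fliplr(np.eye(L))            # exchange matrix
    Rb = J @ Rf.conj() @ J
    return (Rf + Rb) / 2

# Two coherent plane waves on an 8-element half-wavelength ULA
M, d = 8, 0.5
steer = lambda th: np.exp(2j * np.pi * d * np.arange(M) * np.sin(th))
s = steer(0.3) + steer(-0.5)            # identical waveforms: coherent sources
R = np.outer(s, s.conj())               # rank-1 covariance without smoothing
Rs = fbss(R, L=5)
```

With the rank restored, a standard MUSIC eigendecomposition of `Rs` can again separate the two directions; in the paper the smoothing is applied to STFD matrices built from multiple-source t-f points rather than to a plain covariance.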

  3. Food ordering for children in restaurants: multiple sources of influence on decision making

    Science.gov (United States)

    Castro, Iana A; Williams, Christine B; Madanat, Hala; Pickrel, Julie L; Jun, Hee-Jin; Zive, Michelle; Gahagan, Sheila; Ayala, Guadalupe X

    2017-01-01

    Objective Restaurants are playing an increasingly important role in children’s dietary intake. Interventions to promote healthy ordering in restaurants have primarily targeted adults. Much remains unknown about how to influence ordering for and by children. Using an ecological lens, the present study sought to identify sources of influence on ordering behaviour for and by children in restaurants. Design A mixed-methods study was conducted using unobtrusive observations of dining parties with children and post-order interviews. Observational data included: child’s gender, person ordering for the child and server interactions with the dining party. Interview data included: child’s age, restaurant visit frequency, timing of child’s decision making, and factors influencing decision making. Setting Ten independent, table-service restaurants in San Diego, CA, USA participated. Subjects Complete observational and interview data were obtained from 102 dining parties with 150 children (aged 3–14 years). Results Taste preferences, family influences and menus impacted ordering. However, most children knew what they intended to order before arriving at the restaurant, especially if they dined there at least monthly. Furthermore, about one-third of children shared their meals with others and all shared meals were ordered from adult (v. children’s) menus. Parents placed most orders, although parental involvement in ordering was less frequent with older children. Servers interacted frequently with children but generally did not recommend menu items or prompt use of the children’s menu. Conclusions Interventions to promote healthy ordering should consider the multiple sources of influence that are operating when ordering for and by children in restaurants. PMID:27334904

  4. Multiple data sources improve DNA-based mark-recapture population estimates of grizzly bears.

    Science.gov (United States)

    Boulanger, John; Kendall, Katherine C; Stetz, Jeffrey B; Roon, David A; Waits, Lisette P; Paetkau, David

    2008-04-01

    A fundamental challenge to estimating population size with mark-recapture methods is heterogeneous capture probabilities and subsequent bias of population estimates. Confronting this problem usually requires substantial sampling effort that can be difficult to achieve for some species, such as carnivores. We developed a methodology that uses two data sources to deal with heterogeneity and applied this to DNA mark-recapture data from grizzly bears (Ursus arctos). We improved population estimates by incorporating additional DNA "captures" of grizzly bears obtained by collecting hair from unbaited bear rub trees concurrently with baited, grid-based, hair snag sampling. We consider a Lincoln-Petersen estimator with hair snag captures as the initial session and rub tree captures as the recapture session and develop an estimator in program MARK that treats hair snag and rub tree samples as successive sessions. Using empirical data from a large-scale project in the greater Glacier National Park, Montana, USA, area and simulation modeling we evaluate these methods and compare the results to hair-snag-only estimates. Empirical results indicate that, compared with hair-snag-only data, the joint hair-snag-rub-tree methods produce similar but more precise estimates if capture and recapture rates are reasonably high for both methods. Simulation results suggest that estimators are potentially affected by correlation of capture probabilities between sample types in the presence of heterogeneity. Overall, closed population Huggins-Pledger estimators showed the highest precision and were most robust to sparse data, heterogeneity, and capture probability correlation among sampling types. Results also indicate that these estimators can be used when a segment of the population has zero capture probability for one of the methods. We propose that this general methodology may be useful for other species in which mark-recapture data are available from multiple sources.
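The two-session estimator discussed above can be sketched with Chapman's bias-corrected form of the Lincoln-Petersen estimator; the counts below are hypothetical, not data from the Glacier National Park project.

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimator.

    n1: individuals detected in session 1 (e.g., hair-snag captures),
    n2: individuals detected in session 2 (e.g., rub-tree captures),
    m2: individuals detected in both sessions.
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical grizzly counts: 120 snag captures, 90 rub-tree captures, 30 shared
n_hat = chapman_estimate(n1=120, n2=90, m2=30)
```

The paper's Huggins-Pledger models go further by modeling heterogeneous capture probabilities per individual, but this closed-form estimator shows the basic logic of treating the two DNA sources as successive sessions.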

  5. Systematic identification of yeast cell cycle transcription factors using multiple data sources

    Directory of Open Access Journals (Sweden)

    Li Wen-Hsiung

    2008-12-01

    Full Text Available Abstract Background The eukaryotic cell cycle is a complex process and is precisely regulated at many levels. Many genes specific to the cell cycle are regulated transcriptionally and are expressed just before they are needed. To understand the cell cycle process, it is important to identify the cell cycle transcription factors (TFs) that regulate the expression of cell cycle-regulated genes. Results We developed a method to identify cell cycle TFs in yeast by integrating current ChIP-chip, mutant, transcription factor binding site (TFBS), and cell cycle gene expression data. We identified 17 cell cycle TFs, 12 of which are known cell cycle TFs, while the remaining five (Ash1, Rlm1, Ste12, Stp1, Tec1) are putative novel cell cycle TFs. For each cell cycle TF, we assigned specific cell cycle phases in which the TF functions and identified the time lag for the TF to exert regulatory effects on its target genes. We also identified 178 novel cell cycle-regulated genes, among which 59 have unknown functions, but they may now be annotated as cell cycle-regulated genes. Most of our predictions are supported by previous experimental or computational studies. Furthermore, a high-confidence TF-gene regulatory matrix is derived as a byproduct of our method. Each TF-gene regulatory relationship in this matrix is supported by at least three data sources: gene expression, TFBS, and ChIP-chip and/or mutant data. We show that our method performs better than four existing methods for identifying yeast cell cycle TFs. Finally, an application of our method to different cell cycle gene expression datasets suggests that our method is robust. Conclusion Our method is effective for identifying yeast cell cycle TFs and cell cycle-regulated genes. Many of our predictions are validated by the literature. Our study shows that integrating multiple data sources is a powerful approach to studying complex biological systems.
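The "at least three data sources" filter behind the high-confidence TF-gene matrix can be sketched as a simple evidence-set threshold; the TF-gene pairs and gene names below are hypothetical examples, not entries from the paper's matrix.

```python
def high_confidence_edges(evidence, min_sources=3):
    """Keep TF -> gene regulatory edges supported by >= min_sources
    independent data types (e.g., expression, TFBS, ChIP-chip, mutant)."""
    return {edge: srcs for edge, srcs in evidence.items()
            if len(srcs) >= min_sources}

# Hypothetical evidence sets per candidate regulatory edge
evidence = {
    ("Ash1", "geneA"): {"expression", "TFBS", "chip-chip"},
    ("Ste12", "geneB"): {"expression", "TFBS"},          # only two sources
}
edges = high_confidence_edges(evidence)
```

Requiring several independent data types per edge is what keeps the derived regulatory matrix high-confidence at the cost of coverage.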

  6. Stochastic Gravity: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Hu Bei Lok

    2008-05-01

    Full Text Available Whereas semiclassical gravity is based on the semiclassical Einstein equation with sources given by the expectation value of the stress-energy tensor of quantum fields, stochastic semiclassical gravity is based on the Einstein–Langevin equation, which has, in addition, sources due to the noise kernel. The noise kernel is the vacuum expectation value of the (operator-valued) stress-energy bitensor, which describes the fluctuations of quantum-matter fields in curved spacetimes. A new improved criterion for the validity of semiclassical gravity may also be formulated from the viewpoint of this theory. In the first part of this review we describe the fundamentals of this new theory via two approaches: the axiomatic and the functional. The axiomatic approach is useful to see the structure of the theory from the framework of semiclassical gravity, showing the link from the mean value of the stress-energy tensor to the correlation functions. The functional approach uses the Feynman–Vernon influence functional and the Schwinger–Keldysh closed-time-path effective action methods. In the second part, we describe three applications of stochastic gravity. First, we consider metric perturbations in a Minkowski spacetime, compute the two-point correlation functions of these perturbations and prove that Minkowski spacetime is a stable solution of semiclassical gravity. Second, we discuss structure formation from the stochastic-gravity viewpoint, which can go beyond the standard treatment by incorporating the full quantum effect of the inflaton fluctuations. Third, using the Einstein–Langevin equation, we discuss the backreaction of Hawking radiation and the behavior of metric fluctuations for both the quasi-equilibrium condition of a black hole in a box and the fully nonequilibrium condition of an evaporating black hole spacetime. Finally, we briefly discuss the theoretical structure of stochastic gravity in relation to quantum gravity and point out
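Schematically, the Einstein–Langevin equation described here can be written as follows (a sketch in notation common in the stochastic-gravity literature, with $\xi_{ab}$ the stochastic source whose correlator is the noise kernel):

```latex
G_{ab}[g+h] = 8\pi G \left( \langle \hat{T}_{ab}[g+h] \rangle + \xi_{ab}[g] \right),
\qquad
\langle \xi_{ab}(x)\,\xi_{cd}(y) \rangle = N_{abcd}(x,y),
```

where $N_{abcd}(x,y)$ is the noise kernel, i.e. the vacuum expectation value of the (symmetrized) stress-energy fluctuation bitensor described in the abstract, and $h$ is the metric perturbation sourced by the fluctuations.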

  7. A Tracking Analyst for large 3D spatiotemporal data from multiple sources (case study: Tracking volcanic eruptions in the atmosphere)

    Science.gov (United States)

    Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.

    2018-02-01

    This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multi-sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.
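The along-trajectory reconstruction step can be sketched as piecewise-linear interpolation in time; the trajectory samples below are hypothetical, and the TTA additionally blends this with spatial interpolation across neighboring trajectories.

```python
from bisect import bisect_right

def position_at(times, points, t):
    """Linearly interpolate a tracked parcel's position along its
    trajectory at time t. times must be sorted ascending; points are
    coordinate tuples sampled at those times."""
    if t <= times[0]:
        return points[0]
    if t >= times[-1]:
        return points[-1]
    i = bisect_right(times, t) - 1                  # segment containing t
    f = (t - times[i]) / (times[i + 1] - times[i])  # fractional position
    return tuple(p0 + f * (p1 - p0)
                 for p0, p1 in zip(points[i], points[i + 1]))

# Hypothetical ash-parcel trajectory: time (s) and (downwind, crosswind) km
traj_t = [0.0, 60.0, 120.0]
traj_xy = [(0.0, 0.0), (3.0, 1.0), (9.0, 4.0)]
pos = position_at(traj_t, traj_xy, 90.0)
```

Evaluating every tracked parcel's trajectory at a common time is what lets scatter data from different sensors be joined at any desired instant.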

  8. Development of an asymmetric multiple-position neutron source (AMPNS) method to monitor the criticality of a degraded reactor core

    International Nuclear Information System (INIS)

    Kim, S.S.; Levine, S.H.

    1985-01-01

    An analytical/experimental method has been developed to monitor the subcritical reactivity and unfold the k/sub infinity/ distribution of a degraded reactor core. The method uses several fixed neutron detectors and a Cf-252 neutron source placed sequentially in multiple positions in the core. Therefore, it is called the Asymmetric Multiple Position Neutron Source (AMPNS) method. The AMPNS method employs nucleonic codes to analyze the neutron multiplication of a Cf-252 neutron source. An optimization program, GPM, is utilized to unfold the k/sub infinity/ distribution of the degraded core, in which the desired performance measure minimizes the error between the calculated and the measured count rates of the degraded reactor core. The analytical/experimental approach is validated by performing experiments using the Penn State Breazeale TRIGA Reactor (PSBR). A significant result of this study is that it provides a method to monitor the criticality of a damaged core during the recovery period

  9. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    clusters with differential expression during the differentiation toward megakaryocyte were identified. Conclusions TRAM is designed to create, and statistically analyze, quantitative transcriptome maps, based on gene expression data from multiple sources. The release includes FileMaker Pro database management runtime application and it is freely available at http://apollo11.isto.unibo.it/software/, along with preconfigured implementations for mapping of human, mouse and zebrafish transcriptomes.

  10. High brightness fiber laser pump sources based on single emitters and multiple single emitters

    Science.gov (United States)

    Scheller, Torsten; Wagner, Lars; Wolf, Jürgen; Bonati, Guido; Dörfel, Falk; Gabler, Thomas

    2008-02-01

    Driven by the potential of the fiber laser market, the development of high-brightness pump sources has been pushed hard in recent years. The main approaches to reach the targets of this market have been direct coupling of single emitters (SE) on the one hand and beam shaping of bars and stacks on the other hand, which often incurs a higher cost per watt. Meanwhile, the power of single emitters with 100 μm emitter size for direct coupling has increased dramatically, which has also pushed a new generation of wide-stripe emitters, or multi emitters (ME), of up to 1000 μm emitter size, and "minibars" with apertures of 3 to 5 mm. The advantage of this emitter type compared to traditional bars is its scalability to power levels of 40 W to 60 W combined with a small aperture, which gives advantages when coupling into a fiber. We show concepts using these multiple single emitters for fiber-coupled systems of 25 W up to 40 W out of a 100 μm fiber of NA 0.22 with a reasonable optical efficiency. Taking into account further efficiency optimization and an increase in the power of these devices in the near future, the EUR/W ratio pushed by the fiber laser manufacturers will decrease further. Results will be shown for higher-power pump sources as well. In addition, state-of-the-art tapered fiber bundles for photonic crystal fibers are used to combine 7 (19) pump sources to output powers of 100 W (370 W) out of a 130 μm (250 μm) fiber of NA 0.6 with nominally 20 W per port. Improving those TFBs in the near future and utilizing 40 W per pump leg, an output power of even 750 W out of a 250 μm fiber of NA 0.6 will be possible. Combined counter- and co-propagated pumping of the fiber will then lead to the first 1 kW fiber laser oscillator.

  11. Synergistic effect of multiple indoor allergen sources on atopic symptoms in primary school children

    International Nuclear Information System (INIS)

    Chen, W-Y.; Tseng, H-I.; Wu, M-T.; Hung, H-C.; Wu, H-T.; Chen, H-L.; Lu, C.-C.

    2003-01-01

    Accumulating data show that the complex modern indoor environment contributes to the increasing prevalence of atopic diseases. However, the dose-response relationship between allergic symptoms and the complexity of indoor environmental allergen sources (IEAS) has not been clearly evaluated before. Therefore, we designed this study to investigate the overall effect of multiple IEAS on the appearance of asthma (AS), allergic rhinitis (AR), and eczema (EC) symptoms in 1472 primary school children. Among the various IEAS analyzed, only stuffed toys, cockroaches, and mold patches fit the model of 'more IEAS, higher odds ratio (OR) of association'. The association between IEAS and AR increased stepwise as more IEAS appeared in the environment (ORs of 1.71, 2.47, and 2.86). In AS and EC, the association was significant only when all three IEAS were present (1.42, 1.98, and 4.11 in AS; 1.40, 1.76, and 2.95 in EC). These results showed that different IEAS had a synergistic effect on their association with atopic symptoms and also suggest a dose-response relationship between the number of kinds of IEAS and the risk of appearance of atopic diseases.

  12. Integrating Multiple Data Sources for Combinatorial Marker Discovery: A Study in Tumorigenesis.

    Science.gov (United States)

    Bandyopadhyay, Sanghamitra; Mallik, Saurav

    2018-01-01

    Identification of combinatorial markers from multiple data sources is a challenging task in bioinformatics. Here, we propose a novel computational framework for identifying significant combinatorial markers ( s) using both gene expression and methylation data. The gene expression and methylation data are integrated into a single continuous data as well as a (post-discretized) boolean data based on their intrinsic (i.e., inverse) relationship. A novel combined score of methylation and expression data (viz., ) is introduced which is computed on the integrated continuous data for identifying initial non-redundant set of genes. Thereafter, (maximal) frequent closed homogeneous genesets are identified using a well-known biclustering algorithm applied on the integrated boolean data of the determined non-redundant set of genes. A novel sample-based weighted support ( ) is then proposed that is consecutively calculated on the integrated boolean data of the determined non-redundant set of genes in order to identify the non-redundant significant genesets. The top few resulting genesets are identified as potential s. Since our proposed method generates a smaller number of significant non-redundant genesets than those by other popular methods, the method is much faster than the others. Application of the proposed technique on an expression and a methylation data for Uterine tumor or Prostate Carcinoma produces a set of significant combination of markers. We expect that such a combination of markers will produce lower false positives than individual markers.
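    The integration step summarized in this record can be sketched in Python. The scoring rule below (z-scored expression minus z-scored methylation, plus a median-based discretization) is an illustrative stand-in for the paper's combined score, whose exact definition is not given in the abstract; all function and variable names are hypothetical.

    ```python
    import numpy as np

    def integrate_expression_methylation(expr, meth):
        """Integrate gene-expression and methylation matrices (genes x samples)
        into (1) a single continuous matrix and (2) a post-discretized boolean
        matrix, exploiting their inverse relationship (high expression tends to
        go with low methylation).  The z-score difference and median cutoffs
        are illustrative stand-ins for the paper's combined score."""
        def zscore(m):
            return (m - m.mean(axis=1, keepdims=True)) / m.std(axis=1, keepdims=True)
        continuous = zscore(expr) - zscore(meth)        # inverse relation: subtract
        boolean = ((expr > np.median(expr, axis=1, keepdims=True)) &
                   (meth < np.median(meth, axis=1, keepdims=True)))
        return continuous, boolean

    # Two hypothetical genes over three samples.
    expr = np.array([[1.0, 5.0, 9.0], [2.0, 2.0, 8.0]])
    meth = np.array([[0.9, 0.5, 0.1], [0.8, 0.7, 0.2]])
    cont, boo = integrate_expression_methylation(expr, meth)
    ```

    The boolean matrix is what a biclustering algorithm would then mine for frequent closed homogeneous genesets.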

  13. Use of Multiple Data Sources to Estimate the Economic Cost of Dengue Illness in Malaysia

    Science.gov (United States)

    Shepard, Donald S.; Undurraga, Eduardo A.; Lees, Rosemary Susan; Halasa, Yara; Lum, Lucy Chai See; Ng, Chiu Wan

    2012-01-01

    Dengue represents a substantial burden in many tropical and sub-tropical regions of the world. We estimated the economic burden of dengue illness in Malaysia. Information about economic burden is needed for setting health policy priorities, but accurate estimation is difficult because of incomplete data. We overcame this limitation by merging multiple data sources to refine our estimates, including an extensive literature review, discussion with experts, review of data from health and surveillance systems, and implementation of a Delphi process. Because Malaysia has a passive surveillance system, the number of dengue cases is under-reported. Using an adjusted estimate of total dengue cases, we estimated an economic burden of dengue illness of US$56 million (MYR196 million) per year, which is approximately US$2.03 (MYR7.14) per capita. The overall economic burden of dengue would be even higher if we included costs associated with dengue prevention and control, dengue surveillance, and long-term sequelae of dengue. PMID:23033404

  14. Maximum Likelihood DOA Estimation of Multiple Wideband Sources in the Presence of Nonuniform Sensor Noise

    Directory of Open Access Journals (Sweden)

    K. Yao

    2007-12-01

    Full Text Available We investigate the maximum likelihood (ML) direction-of-arrival (DOA) estimation of multiple wideband sources in the presence of unknown nonuniform sensor noise. A new closed-form expression for the direction-estimation Cramér-Rao bound (CRB) is derived. The performance of the conventional wideband uniform ML estimator under nonuniform noise is studied. In order to mitigate the performance degradation caused by the nonuniformity of the noise, a new deterministic wideband nonuniform ML DOA estimator is derived and two associated processing algorithms are proposed. The first algorithm is based on an iterative procedure which stepwise concentrates the log-likelihood function with respect to the DOAs and the noise nuisance parameters, while the second is a noniterative algorithm that maximizes the derived approximately concentrated log-likelihood function. The performance of the proposed algorithms is tested through extensive computer simulations. Simulation results show that the stepwise-concentrated ML algorithm (SC-ML) requires only a few iterations to converge, and both the SC-ML and the approximately concentrated ML algorithm (AC-ML) attain a solution close to the derived CRB at high signal-to-noise ratio.
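    As a heavily simplified illustration of the concentrated-likelihood idea, the sketch below grid-searches the ML criterion for a single narrowband source on a half-wavelength uniform linear array with a known diagonal (nonuniform) noise covariance; the paper's wideband, multi-source, unknown-noise algorithms are considerably more involved. All numbers are assumptions for the demonstration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    m, n, true_doa = 6, 200, 20.0          # sensors, snapshots, degrees

    def steering(theta_deg):
        """Steering vector for a half-wavelength-spaced ULA."""
        k = np.pi * np.sin(np.deg2rad(theta_deg))
        return np.exp(-1j * k * np.arange(m))

    # Simulate one narrowband source in nonuniform sensor noise.
    sigma2 = np.array([0.5, 2.0, 1.0, 0.1, 3.0, 0.8])     # per-sensor variances
    s = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) * np.sqrt(5.0)
    noise = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) \
            * np.sqrt(sigma2 / 2.0)[:, None]
    x = np.outer(steering(true_doa), s) + noise
    r_hat = x @ x.conj().T / n             # sample covariance

    # Concentrated ML criterion with known noise covariance Q = diag(sigma2):
    # whiten, then maximize the projection of the whitened sample covariance
    # onto the whitened steering vector.
    q_inv = np.diag(1.0 / sigma2)
    def criterion(theta_deg):
        a = steering(theta_deg)
        num = np.real(a.conj() @ q_inv @ r_hat @ q_inv @ a)
        den = np.real(a.conj() @ q_inv @ a)
        return num / den

    grid = np.arange(-90.0, 90.0, 0.25)
    est = grid[np.argmax([criterion(t) for t in grid])]
    ```

    With the noise known, whitening reduces the problem to the uniform-noise case; the paper's contribution is handling the case where `sigma2` must itself be estimated.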

  15. Journey of a Package: Category 1 Source (Co-60) Shipment with Several Border Crossings, Multiple Modes

    International Nuclear Information System (INIS)

    Gray, P. A.

    2016-01-01

    Radioactive materials (RAM) are used extensively in a vast array of industries and in an even wider breadth of applications on a truly global basis each and every day. Over the past 50 years, these applications and the quantity (activity) of RAM shipped have grown significantly, and the next 50 years are expected to show a continuing trend. The movement of these goods occurs in all regions of the world, and must therefore be conducted in a manner which will not adversely impact people or the environment. Industry and regulators have jointly met this challenge, so much so that RAM shipments are amongst the safest of any product. How has this level of performance been achieved? What is involved in shipping RAM from one corner of the world to another, often via a number of in-transit locations and often utilizing multiple modes of transport in a single shipment? This paper reviews one such journey, of Category 1 Cobalt-60 sources, as they move from point of manufacture through to point of use, including the detailed multiple-approval process, the stringent regulatory requirements in place, the extensive communications required throughout, and the practical aspects needed simply to offer such a product for sale and transport. Upon completion, the rationale for such an exemplary safety and security record will be readily apparent. (author)

  16. Freezing of enkephalinergic functions by multiple noxious foci: a source of pain sensitization?

    Directory of Open Access Journals (Sweden)

    François Cesselin

    Full Text Available BACKGROUND: The functional significance of proenkephalin systems in processing pain remains an open question and indeed is puzzling. For example, a noxious mechanical stimulus does not alter the release of Met-enkephalin-like material (MELM) from segments of the spinal cord related to the stimulated area of the body, but does increase its release from other segments. METHODOLOGY/PRINCIPAL FINDINGS: Here we show that, in the rat, a noxious mechanical stimulus applied to either the right or the left hind paw elicits a marked increase of MELM release during perifusion of either the whole spinal cord or the cervico-trigeminal area. However, these stimulatory effects were not additive and indeed disappeared completely when the right and left paws were stimulated simultaneously. CONCLUSION/SIGNIFICANCE: We conclude that, in addition to the concept of a diffuse control of the transmission of nociceptive signals through the dorsal horn, there is a diffuse control of the modulation of this transmission. The "freezing" of Met-enkephalinergic functions represents a potential source of central sensitization in the spinal cord, notably in clinical situations involving multiple painful foci, e.g. cancer with metastases, polytraumatism or rheumatoid arthritis.

  17. Use of multiple data sources to estimate the economic cost of dengue illness in Malaysia.

    Science.gov (United States)

    Shepard, Donald S; Undurraga, Eduardo A; Lees, Rosemary Susan; Halasa, Yara; Lum, Lucy Chai See; Ng, Chiu Wan

    2012-11-01

    Dengue represents a substantial burden in many tropical and sub-tropical regions of the world. We estimated the economic burden of dengue illness in Malaysia. Information about economic burden is needed for setting health policy priorities, but accurate estimation is difficult because of incomplete data. We overcame this limitation by merging multiple data sources to refine our estimates, including an extensive literature review, discussion with experts, review of data from health and surveillance systems, and implementation of a Delphi process. Because Malaysia has a passive surveillance system, the number of dengue cases is under-reported. Using an adjusted estimate of total dengue cases, we estimated an economic burden of dengue illness of US$56 million (MYR196 million) per year, which is approximately US$2.03 (MYR7.14) per capita. The overall economic burden of dengue would be even higher if we included costs associated with dengue prevention and control, dengue surveillance, and long-term sequelae of dengue.

  18. Is nonrelativistic gravity possible?

    International Nuclear Information System (INIS)

    Kocharyan, A. A.

    2009-01-01

    We study nonrelativistic gravity using the Hamiltonian formalism. For the dynamics of general relativity (relativistic gravity) the formalism is well known and called the Arnowitt-Deser-Misner (ADM) formalism. We show that if the lapse function is constrained correctly, then nonrelativistic gravity is described by a consistent Hamiltonian system. Surprisingly, nonrelativistic gravity can have solutions identical to those of relativistic gravity. In particular, the (anti-)de Sitter black holes of Einstein gravity and the IR limit of Hořava gravity are locally identical.

  19. Observations of magnetic field and TEC fluctuations caused by ionospheric responses to acoustic and gravity waves from ground-level, natural hazard sources

    Science.gov (United States)

    Inchin, P.; Zettergren, M. D.; Snively, J. B.; Komjathy, A.; Verkhoglyadova, O. P.

    2017-12-01

    Recent studies have reported magnetic field fluctuations following intense seismic hazard events [e.g. Aoyama et al., EPS, 68, 2016; Toh et al., JGR, 116, 2011]. These perturbations can be associated with ionospheric dynamo phenomena driven by seismically generated acoustic and gravity waves (AGWs). AGW-related dynamo effects can be separated from other sources of magnetic fluctuations (e.g. piezomagnetic effects, magnetospheric forcing, or Rayleigh surface waves) based on time delays from event onset (corresponding closely with travel times for AGWs from ground to the ionosphere) and spectral content measured concurrently in total electron content (TEC). Modeling studies aimed at understanding these magnetic field fluctuations have demonstrated that AGWs propagating through the conducting ionosphere can induce current densities sufficient to produce observable magnetic signatures [Zettergren and Snively, JGR, 120, 2017]. Here, we investigate the features of seismic-related magnetic field fluctuations in data and their generation via the effects of seismically forced AGWs on the ionosphere [Iyemori et al., EPS, 65, 2013; Hasbi et al., JASTP, 71, 2005]. Concurrent magnetic field and TEC data are analyzed for several events: the Chilean earthquakes of 2010 and 2015, Chile's Calbuco volcano eruption, and the Sumatran earthquake on March 28, 2005. We investigate the qualitative features of the disturbances as well as quantitative spectral and timing analysis of the data. For the Chilean earthquakes, TEC and ground-based magnetometer data reveal fluctuations in the magnetic field exhibiting 4-5 mHz frequencies, the same as in TEC. For the Calbuco volcano eruption and the Sumatran earthquake, both TEC and magnetic field perturbations exhibit frequencies of 4-5 mHz. The results are consistent with previous reports [Aoyama et al., EPS, 68, 2016; Hasbi et al., JASTP, 71, 2005; Iyemori et al., EPS, 65, 2013]. 
These observations are further interpreted through detailed numerical

  20. DUSTMS-D: DISPOSAL UNIT SOURCE TERM - MULTIPLE SPECIES - DISTRIBUTED FAILURE DATA INPUT GUIDE.

    Energy Technology Data Exchange (ETDEWEB)

    SULLIVAN, T.M.

    2006-01-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). Many of these physical processes are influenced by the design of the disposal facility (e.g., how the engineered barriers control infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This has been done and the resulting models have been incorporated into the computer code DUST-MS (Disposal Unit Source Term-Multiple Species). The DUST-MS computer code is designed to model water flow, container degradation, release of contaminants from the wasteform to the contacting solution, and transport through the subsurface media. Water flow through the facility over time is modeled using tabular input. Container degradation models include three types of failure rates: (a) instantaneous (all containers in a control volume fail at once), (b) uniformly distributed failures (containers fail at a linear rate between a specified starting and ending time), and (c) Gaussian failure rates (containers fail at a rate determined by a mean failure time, standard deviation and Gaussian distribution). 
Wasteform release models include four release mechanisms: (a) rinse with partitioning (inventory is released instantly upon container failure subject to equilibrium partitioning (sorption) with
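    The three container-failure options can be expressed as cumulative failure fractions over time. This is an illustrative reimplementation of the distributions named in the abstract, not DUST-MS's actual input format or code.

    ```python
    import math

    def failed_fraction(t, model, t_start=0.0, t_end=1.0, mean=0.0, sd=1.0):
        """Cumulative fraction of containers failed by time t under the three
        DUST-MS failure-rate options (illustrative sketch)."""
        if model == "instantaneous":          # all containers fail at t_start
            return 1.0 if t >= t_start else 0.0
        if model == "uniform":                # linear failure between start and end
            if t <= t_start:
                return 0.0
            if t >= t_end:
                return 1.0
            return (t - t_start) / (t_end - t_start)
        if model == "gaussian":               # normal CDF with given mean and sd
            return 0.5 * (1.0 + math.erf((t - mean) / (sd * math.sqrt(2.0))))
        raise ValueError(f"unknown failure model: {model}")
    ```

    For example, the uniform model reaches 50% failure at the midpoint of the failure window, and the Gaussian model reaches 50% at the mean failure time.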

  1. Effects of neutron spectrum and external neutron source on neutron multiplication parameters in accelerator-driven system

    International Nuclear Information System (INIS)

    Shahbunder, Hesham; Pyeon, Cheol Ho; Misawa, Tsuyoshi; Lim, Jae-Yong; Shiroya, Seiji

    2010-01-01

    The neutron multiplication parameters (neutron multiplication M, subcritical multiplication factor k_s, and external source efficiency φ*) play an important role in the numerical assessment and reactor power evaluation of an accelerator-driven system (ADS). These parameters can be evaluated using the measured reaction rate distribution in the subcritical system. In this study, experimental verification of this methodology is performed in various ADS cores: with a high-energy (100 MeV) proton-tungsten source in cores with hard and soft neutron spectra, and with a 14 MeV D-T neutron source in a soft-spectrum core. The comparison between measured and calculated multiplication parameters reveals a maximum relative difference in the range of 6.6-13.7%, which is attributed to the uncertainty and accuracy of the nuclear data libraries at energies above 20 MeV and is also dependent on the position of the reaction rate distribution and the count rates. The effects of different core neutron spectra and external neutron sources on the neutron multiplication parameters are discussed.
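    The parameters M and k_s are linked by the standard point-model relation M = 1/(1 − k_s): each source neutron yields, on average, a converging geometric chain of fission neutrons. A minimal sketch of the relation and its inverse (the paper evaluates these quantities from measured reaction-rate distributions, which this does not attempt):

    ```python
    def subcritical_multiplication(k_s):
        """Point-model neutron multiplication M for a subcritical system:
        each source neutron yields 1 + k_s + k_s**2 + ... = 1/(1 - k_s)
        neutrons in total."""
        if not 0.0 <= k_s < 1.0:
            raise ValueError("k_s must lie in [0, 1) for a subcritical system")
        return 1.0 / (1.0 - k_s)

    def k_s_from_multiplication(m):
        """Invert the relation to recover k_s from a measured multiplication."""
        return 1.0 - 1.0 / m
    ```

    For instance, k_s = 0.95 corresponds to a multiplication of 20, which is why deeply subcritical measurements are so sensitive to small errors in M.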

  2. Application of the modified neutron source multiplication method for a measurement of sub-criticality in AGN-201K reactor

    International Nuclear Information System (INIS)

    Myung-Hyun Kim

    2010-01-01

    Measurement of sub-criticality is a challenging and necessary task in the nuclear industry, both for nuclear criticality safety and for physics tests in nuclear power plants. A relatively new method named the Modified Neutron Source Multiplication Method (MNSM) was proposed in Japan. This method is an improvement of the traditional Neutron Source Multiplication (NSM) method, in which three correction factors are applied additionally. In this study, MNSM was tested in the calculation of rod worth using AGN-201K, an educational reactor at Kyung Hee University. For this study, a revised nuclear data library and the neutron transport code system TRANSX-PARTISN were used to calculate the correction factors for various control rod positions and source locations. Experiments were designed and performed to accentuate the errors in NSM arising from the location effects of the source and detectors. MNSM can correct these effects, but the current results showed little correction effect. (author)
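    Classical NSM infers subcriticality from the inverse count-rate ratio against a known reference state, since for a constant source the detector count rate scales as C ∝ S/(1 − k). MNSM multiplies in correction factors computed by transport calculations; in the sketch below a single illustrative `correction` multiplier stands in for MNSM's three factors.

    ```python
    def nsm_reactivity(count_ref, count_meas, rho_ref, correction=1.0):
        """Classical neutron source multiplication (NSM) estimate: negative
        reactivity scales with the inverse count-rate ratio relative to a
        reference state of known reactivity rho_ref.  The 'correction'
        multiplier is a placeholder for MNSM's correction factors."""
        return rho_ref * (count_ref / count_meas) * correction

    # Halving the count rate relative to the reference state implies a
    # twice-as-negative reactivity.
    rho = nsm_reactivity(1000.0, 500.0, -0.01)
    ```

    The location effects probed in the experiments enter precisely through the breakdown of this simple proportionality, which is what the correction factors are meant to repair.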

  3. Noncommutative gravity

    International Nuclear Information System (INIS)

    Schupp, P.

    2007-01-01

    Heuristic arguments suggest that the classical picture of smooth commutative spacetime should be replaced by some kind of quantum / noncommutative geometry at length scales and energies where quantum as well as gravitational effects are important. Motivated by this idea much research has been devoted to the study of quantum field theory on noncommutative spacetimes. More recently the focus has started to shift back to gravity in this context. We give an introductory overview to the formulation of general relativity in a noncommutative spacetime background and discuss the possibility of exact solutions. (author)

  4. Multiple uses for an old IBM-PC 486 in nuclear medicine using open source software

    International Nuclear Information System (INIS)

    Anselmi, C.E.; Anselmi, O.E.

    2002-01-01

    Multiple uses for an old IBM-PC 486 in nuclear medicine using open source software. Aim: to use a low-budget platform to (1) send patients' images from the processing workstation to the nuclear medicine information system; (2) back up acquisition data files in DICOM format on CD-ROM; (3) move data across different hospitals, allowing remote processing and reading of studies. Both nuclear medicine systems in the two hospitals are Siemens Icon workstations. Material and methods: The computer used is an IBM-PC 486, which sells for about US$70. The operating system installed is Red Hat Linux 6.2. The sending of the patients' images to the information system is performed through AppleTalk and Samba. The backup of acquisition files is performed by communication from the workstation through DICOM to the Storage Class Provider (Office Dicom Toolkit) running on the 486, and the files are later burned to CD-ROM. A similar configuration is present in the other hospital, with minor differences in processor type. Data from either hospital can be sent to the other through remote synchronization performed by Rsync. The connection between the two Linux computers is encrypted through Secure Shell (OpenSSH). All software installed on the 486 was downloaded from the internet at no cost. No software was installed on the workstations. Results: The whole system is recognized transparently by the workstation's system as a local storage disk, like the acquisition cameras or the other workstations. The transfer of images from the workstation to the information system or to a remote hospital is done the same way as copying data from the acquisition cameras in the vendor's software. When transferring large files across hospitals, the synchronization may take 1 to 3 minutes over broadband internet. The backup in DICOM format on CD-ROM allows review of patient data on any computer equipped with DICOM viewing software, as well as the re-processing of that

  5. Exploiting Deep Neural Networks and Head Movements for Robust Binaural Localization of Multiple Sources in Reverberant Environments

    DEFF Research Database (Denmark)

    Ma, Ning; May, Tobias; Brown, Guy J.

    2017-01-01

    This paper presents a novel machine-hearing system that exploits deep neural networks (DNNs) and head movements for robust binaural localization of multiple sources in reverberant environments. DNNs are used to learn the relationship between the source azimuth and binaural cues, consisting of the complete cross-correlation function (CCF) and interaural level differences (ILDs). In contrast to many previous binaural hearing systems, the proposed approach is not restricted to localization of sound sources in the frontal hemifield. Due to the similarity of binaural cues in the frontal and rear...
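    The two cues named in the abstract can be computed per time frame roughly as follows (simplified: no gammatone filterbank or multi-frame integration, which the actual system uses; the function name is assumed).

    ```python
    import numpy as np

    def binaural_cues(left, right):
        """Binaural cues for one frame: normalized cross-correlation function
        (CCF) over all integer lags, and the interaural level difference (ILD)
        in dB (positive when the right ear is louder)."""
        ccf = np.correlate(left, right, mode="full")
        ccf = ccf / (np.linalg.norm(left) * np.linalg.norm(right) + 1e-12)
        ild = 10.0 * np.log10((np.sum(right**2) + 1e-12) /
                              (np.sum(left**2) + 1e-12))
        return ccf, ild

    # Identical ear signals: zero ILD, and the CCF peaks at zero lag
    # (index len(sig) - 1 in 'full' correlation mode).
    t = np.linspace(0.0, 1.0, 512)
    sig = np.sin(2 * np.pi * 8 * t)
    ccf, ild = binaural_cues(sig, sig)
    lag_of_peak = int(np.argmax(ccf)) - (len(sig) - 1)
    ```

    A DNN trained on such cue vectors learns the azimuth mapping; head rotation then disambiguates front from rear, since the cues change in opposite directions for frontal and rear sources.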

  6. Investigation of black and brown carbon multiple-wavelength-dependent light absorption from biomass and fossil fuel combustion source emissions

    Science.gov (United States)

    Michael R. Olson; Mercedes Victoria Garcia; Michael A. Robinson; Paul Van Rooy; Mark A. Dietenberger; Michael Bergin; James Jay Schauer

    2015-01-01

    Quantification of the black carbon (BC) and brown carbon (BrC) components of source emissions is critical to understanding the impact combustion aerosols have on atmospheric light absorption. Multiple-wavelength absorption was measured from fuels including wood, agricultural biomass, coals, plant matter, and petroleum distillates in controlled combustion settings....

  7. Frequency-swept laser light source at 1050 nm with higher bandwidth due to multiple semiconductor optical amplifiers in series

    DEFF Research Database (Denmark)

    Marschall, Sebastian; Thrane, Lars; Andersen, Peter E.

    2009-01-01

    We report on the development of an all-fiber frequency-swept laser light source in the 1050 nm range based on semiconductor optical amplifiers (SOA) with improved bandwidth due to multiple gain media. It is demonstrated that even two SOAs with nearly equal gain spectra can improve the performance...

  8. Conformal Gravity

    International Nuclear Information System (INIS)

    Hooft, G.

    2012-01-01

    The dynamical degree of freedom for the gravitational force is the metric tensor, having 10 locally independent degrees of freedom (of which 4 can be used to fix the coordinate choice). In conformal gravity, we split this field into an overall scalar factor and a nine-component remainder. All unrenormalizable infinities are in this remainder, while the scalar component can be handled like any other scalar field such as the Higgs field. In this formalism, conformal symmetry is spontaneously broken. An imperative demand on any healthy quantum gravity theory is that black holes should be described as quantum systems with micro-states as dictated by the Hawking-Bekenstein theory. This requires conformal symmetry that may be broken spontaneously but not explicitly, and this means that all conformal anomalies must cancel out. Cancellation of conformal anomalies yields constraints on the matter sector as described by some universal field theory. Thus black hole physics may eventually be of help in the construction of unified field theories. (author)

  9. Southern Africa Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data base (14,559 records) was received in January 1986. Principal gravity parameters include elevation and observed gravity. The observed gravity values are...

  10. NGS Absolute Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NGS Absolute Gravity data (78 stations) was received in July 1993. Principal gravity parameters include Gravity Value, Uncertainty, and Vertical Gradient. The...

  11. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    Science.gov (United States)

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means of controlling non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulated total nitrogen (TN) loss intensity of all 38 subbasins, the spatial distribution characteristics of nitrogen loss and the critical source areas were analyzed at three time scales: yearly average, monthly average, and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences in TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At all time scales, land use type (such as farmland and forest) was the dominant factor affecting the spatial distribution of nitrogen loss, whereas precipitation and runoff affected nitrogen loss only in months without fertilization and during several storm flood processes on dates without fertilization. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.

  12. Newtonian gravity in loop quantum gravity

    OpenAIRE

    Smolin, Lee

    2010-01-01

    We apply a recent argument of Verlinde to loop quantum gravity, to conclude that Newton's law of gravity emerges in an appropriate limit and setting. This is possible because the relationship between area and entropy is realized in loop quantum gravity when boundaries are imposed on a quantum spacetime.

  13. Determination of Key Risk Supervision Areas around River-Type Water Sources Affected by Multiple Risk Sources: A Case Study of Water Sources along the Yangtze’s Nanjing Section

    Directory of Open Access Journals (Sweden)

    Qi Zhou

    2017-02-01

    Full Text Available To provide a reference for the risk management of water sources, this study screens the key risk supervision areas around river-type water sources (hereinafter referred to as the water sources) threatened by multiple fixed risk sources (the risk sources), and establishes a comprehensive methodological system. Specifically, it comprises: (1) a method of partitioning risk-source-concentrated sub-regions for screening key risk supervision areas around the water sources; (2) an approach for determining sub-regional risk indexes (SrRI), which characterize the scale of sub-regional risks considering factors such as the risk distribution intensity within sub-regions, the risk indexes of risk sources (RIRS, characterizing the risk scale of a risk source), and the number of risk sources; and (3) a method of calculating a sub-region's risk threat to the water sources (SrTWS), which considers the positional relationship between the water sources and the sub-regions as well as the SrRI, together with criteria for determining key supervision sub-regions. Favorable results were achieved by applying this methodological system to the water source perimeter sub-regions distributed along the Yangtze's Nanjing section. Results revealed that the key sub-regions needing supervision were SD16, SD06, SD21, SD26, SD15, SD03, SD02, SD32, SD10, SD11, SD14, SD05, SD27, etc., in order of criticality. The sub-region posing the greatest risk threat to the water sources was SD16, which was located in the middle reaches of the Yangtze River. In general, sub-regions along the upper Yangtze reaches posed greater threats to the water sources than the lower-reach sub-regions, other than SD26 and SD21. Upstream water sources were less subject to the threats of sub-regions than the downstream sources, other than NJ09B and NJ03.

  14. Measuring Gravity in International Trade Flows

    Directory of Open Access Journals (Sweden)

    E. Young Song

    2004-12-01

    Full Text Available The purpose of this paper is two-fold. One is to clarify the concept of gravity in international trade flows. The other is to measure the strength of gravity in international trade flows in a way that is consistent with a well-defined concept of gravity. This paper shows that the widely accepted belief that specialization is the source of gravity is not well grounded in theory. We propose to define gravity in international trade as the force that makes the market shares of an exporting country constant in all importing countries, regardless of their sizes. In a stochastic context, we should interpret it as implying that the strength of gravity increases (i) as the correlation between market shares and market sizes gets weaker and (ii) as the variance of market shares gets smaller. We estimate an empirical gravity equation thoroughly based on this definition of gravity. We find that a strong degree of gravity exists in most bilateral trade, regardless of income levels of countries, and in trade of most man
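    The two diagnostics in this definition of gravity can be computed directly from an exporter's market-share data. A minimal sketch (not the paper's estimator; data are hypothetical):

    ```python
    import numpy as np

    def gravity_diagnostics(shares, sizes):
        """Diagnostics for the paper's definition of gravity: gravity is
        stronger when (i) the correlation between an exporter's market shares
        and importer market sizes is weaker, and (ii) the variance of the
        shares is smaller."""
        shares = np.asarray(shares, dtype=float)
        sizes = np.asarray(sizes, dtype=float)
        corr = float(np.corrcoef(shares, sizes)[0, 1])
        var = float(np.var(shares, ddof=1))
        return corr, var

    # Strong gravity: near-constant shares across markets of very different size.
    corr_strong, var_strong = gravity_diagnostics([0.10, 0.11, 0.09, 0.10],
                                                  [1.0, 2.0, 3.0, 4.0])
    # Weak gravity: shares that track market size.
    corr_weak, var_weak = gravity_diagnostics([0.05, 0.10, 0.15, 0.20],
                                              [1.0, 2.0, 3.0, 4.0])
    ```

    Under this reading, perfect gravity would mean identical shares in every market (zero variance, no size correlation).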

  15. Theories of quantum gravity: Pt. 1

    International Nuclear Information System (INIS)

    Aragone, C.

    1990-01-01

    Superstrings continue to be a source of inspiration for the basic understanding of quantum gravity. They seem to provide a more fundamental arena than quantum field theory. Even though we still do not have a theory of everything, string concepts bring a new theoretical richness to research in quantum and classical gravity. Papers presented at the session on this subject are reviewed. (author)

  16. Enhanced Gravity Tractor Technique for Planetary Defense

    Science.gov (United States)

    Mazanek, Daniel D.; Reeves, David M.; Hopkins, Joshua B.; Wade, Darren W.; Tantardini, Marco; Shen, Haijun

    2015-01-01

    Given sufficient warning time, Earth-impacting asteroids and comets can be deflected with a variety of different "slow push/pull" techniques. The gravity tractor is one technique that uses the gravitational attraction of a rendezvous spacecraft to the impactor, along with a low-thrust, high-efficiency propulsion system, to provide a gradual velocity change and alter its trajectory. An innovation to this technique, known as the Enhanced Gravity Tractor (EGT), uses mass collected in-situ to augment the mass of the spacecraft, thereby greatly increasing the gravitational force between the objects. The collected material can be a single boulder, multiple boulders, regolith, or a combination of different sources. The collected mass would likely range from tens to hundreds of metric tons, depending on the size of the impactor and the warning time available. Depending on the propulsion system's capability and the mass collected, the EGT approach can reduce deflection times by a factor of 10 to 50 or more, shortening deflection campaigns from several decades to years or less and overcoming the main criticism of the traditional gravity tractor approach. Additionally, multiple spacecraft can orbit the target in formation to provide the necessary velocity change and further reduce the time needed by the EGT technique to divert hazardous asteroids and comets. The robotic segment of NASA's Asteroid Redirect Mission (ARM) will collect a multi-ton boulder from the surface of a large Near-Earth Asteroid (NEA) and will provide the first-ever demonstration of the EGT technique, validating one method of collecting in-situ mass on an asteroid of hazardous size.
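    The core effect is just Newtonian attraction with an augmented spacecraft mass: since the towing force scales linearly with spacecraft mass, collecting a 90 t boulder with a 10 t spacecraft multiplies the available force, and hence roughly divides the required deflection time, by ten. The numbers below are illustrative, not mission values.

    ```python
    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def tractor_force(m_spacecraft_kg, m_collected_kg, m_asteroid_kg, r_m):
        """Gravitational attraction between the asteroid and the spacecraft,
        with the spacecraft mass augmented by in-situ collected material."""
        return G * (m_spacecraft_kg + m_collected_kg) * m_asteroid_kg / r_m**2

    # Hypothetical numbers: 10 t spacecraft, 90 t collected boulder,
    # a small hazardous asteroid of 3e9 kg, hovering 200 m from its center.
    f_plain = tractor_force(1.0e4, 0.0, 3.0e9, 200.0)
    f_egt = tractor_force(1.0e4, 9.0e4, 3.0e9, 200.0)
    boost = f_egt / f_plain   # towing-force multiplier from the collected mass
    ```

    The linearity also shows why formation-flying multiple tractors adds further gains: forces from each spacecraft simply sum.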

  17. Real-Time Localization of Moving Dipole Sources for Tracking Multiple Free-Swimming Weakly Electric Fish

    Science.gov (United States)

    Jun, James Jaeyoon; Longtin, André; Maler, Leonard

    2013-01-01

    In order to survive, animals must quickly and accurately locate prey, predators, and conspecifics using the signals they generate. The location of a signal source can be estimated using multiple detectors and the inverse relationship between the received signal intensity (RSI) and distance, but the difficulty of source localization increases if there is an additional dependence on the orientation of the signal source. In such cases, the signal source can be approximated as an ideal dipole for simplification. Based on a theoretical model, the RSI can be directly predicted from a known dipole location; but estimating a dipole location from RSIs has no direct analytical solution. Here, we propose an efficient solution to the dipole localization problem by using a lookup table (LUT) to store RSIs predicted by our theoretically derived dipole model at many possible dipole positions and orientations. For a given set of RSIs measured at multiple detectors, our algorithm found the dipole location having the closest matching normalized RSIs from the LUT, and further refined the location at higher resolution. Studying the natural behavior of weakly electric fish (WEF) requires efficiently computing their location and the temporal pattern of their electric signals over extended periods. Our dipole localization method was successfully applied to track single or multiple freely swimming WEF in shallow water in real time, as each fish could be closely approximated by an ideal current dipole in two dimensions. Our optimized search algorithm found the animals' positions, orientations, and tail-bending angles quickly and accurately under various conditions, without the need to calibrate individual-specific parameters. Our dipole localization method is directly applicable to studying the role of active sensing during spatial navigation, or social interactions between multiple WEF. Furthermore, our method could be extended to other application areas involving dipole source
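    The LUT matching step described above can be sketched in a few lines. The detector layout, grid resolution, and simplified 2-D dipole model below are illustrative assumptions, and the higher-resolution refinement stage is omitted:

```python
import numpy as np

# Sketch of LUT-based dipole localization: precompute normalized received-
# signal intensities (RSIs) for candidate dipole states, then return the
# candidate whose normalized RSI pattern best matches a measurement.
detectors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

def dipole_rsi(pos, angle):
    """2-D ideal-dipole signal magnitude at each detector: |p . r| / r^3."""
    p = np.array([np.cos(angle), np.sin(angle)])
    r = detectors - pos
    dist = np.linalg.norm(r, axis=1)
    return np.abs(r @ p) / dist**3

# Build the LUT over a coarse grid of positions and orientations.
xs = ys = np.linspace(0.1, 0.9, 9)
angles = np.linspace(0.0, np.pi, 8, endpoint=False)
states, table = [], []
for x in xs:
    for y in ys:
        for a in angles:
            states.append((x, y, a))
            rsi = dipole_rsi(np.array([x, y]), a)
            table.append(rsi / np.linalg.norm(rsi))  # normalize out amplitude
table = np.array(table)

def localize(measured_rsi):
    """Return the LUT state with the closest normalized RSI pattern."""
    q = measured_rsi / np.linalg.norm(measured_rsi)
    return states[int(np.argmin(np.linalg.norm(table - q, axis=1)))]

# Recover a known dipole state lying on the LUT grid:
true = (xs[4], ys[2], angles[2])
est = localize(dipole_rsi(np.array([true[0], true[1]]), true[2]))
```

    Normalizing each stored pattern removes the unknown source amplitude, which is what makes a pure pattern match sufficient.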

  18. Real-Time Localization of Moving Dipole Sources for Tracking Multiple Free-Swimming Weakly Electric Fish.

    Directory of Open Access Journals (Sweden)

    James Jaeyoon Jun

    Full Text Available In order to survive, animals must quickly and accurately locate prey, predators, and conspecifics using the signals they generate. The signal source location can be estimated using multiple detectors and the inverse relationship between the received signal intensity (RSI) and the distance, but difficulty of the source localization increases if there is an additional dependence on the orientation of a signal source. In such cases, the signal source could be approximated as an ideal dipole for simplification. Based on a theoretical model, the RSI can be directly predicted from a known dipole location; but estimating a dipole location from RSIs has no direct analytical solution. Here, we propose an efficient solution to the dipole localization problem by using a lookup table (LUT) to store RSIs predicted by our theoretically derived dipole model at many possible dipole positions and orientations. For a given set of RSIs measured at multiple detectors, our algorithm found a dipole location having the closest matching normalized RSIs from the LUT, and further refined the location at higher resolution. Studying the natural behavior of weakly electric fish (WEF) requires efficiently computing their location and the temporal pattern of their electric signals over extended periods. Our dipole localization method was successfully applied to track single or multiple freely swimming WEF in shallow water in real-time, as each fish could be closely approximated by an ideal current dipole in two dimensions. Our optimized search algorithm found the animal's positions, orientations, and tail-bending angles quickly and accurately under various conditions, without the need for calibrating individual-specific parameters. Our dipole localization method is directly applicable to studying the role of active sensing during spatial navigation, or social interactions between multiple WEF. Furthermore, our method could be extended to other application areas involving dipole

  19. Estimation of Multiple Point Sources for Linear Fractional Order Systems Using Modulating Functions

    KAUST Repository

    Belkhatir, Zehor; Laleg-Kirati, Taous-Meriem

    2017-01-01

    This paper proposes an estimation algorithm for the characterization of multiple point inputs for linear fractional order systems. First, using polynomial modulating functions method and a suitable change of variables the problem of estimating

  20. Logamediate Inflation in f(T) Teleparallel Gravity

    Energy Technology Data Exchange (ETDEWEB)

    Rezazadeh, Kazem; Karami, Kayoomars [Department of Physics, University of Kurdistan, Pasdaran Street, P.O. Box 66177-15175, Sanandaj (Iran, Islamic Republic of); Abdolmaleki, Asrin, E-mail: rezazadeh86@gmail.com [Research Institute for Astronomy and Astrophysics of Maragha (RIAAM), P.O. Box 55134-441, Maragha (Iran, Islamic Republic of)

    2017-02-20

    We study logamediate inflation in the context of f(T) teleparallel gravity. f(T)-gravity is a generalization of teleparallel gravity, which is formulated on the Weitzenböck spacetime, characterized by a vanishing curvature tensor (absolute parallelism) and a non-vanishing torsion tensor. We consider an f(T)-gravity model which is sourced by a canonical scalar field. Assuming a power-law f(T) function in the action, we investigate an inflationary universe with a logamediate scale factor. Our results show that, although logamediate inflation is completely ruled out by observational data in the standard inflationary scenario based on Einstein gravity, it can be compatible with the 68% confidence limit joint region of Planck 2015 TT,TE,EE+lowP data in the framework of f(T)-gravity.

  1. Source term determination from subcritical multiplication measurements at Koral-1 reactor

    International Nuclear Information System (INIS)

    Blazquez, J.B.; Barrado, J.M.

    1978-01-01

    Using an AmBe neutron source, two independent procedures were established for the zero-power experimental fast reactor Coral-1 in order to measure the source term that appears in the point-kinetics equations. In the first, the source term is measured when the reactor is just critical with the source present, taking advantage of the wide range of the linear approach to critical for Coral-1. In the second, the measurement is made in a subcritical state using previously calibrated control rods. Several applications are also included, such as the measurement of the detector dead time, the determination of the reactivity of small samples, and the shape of the neutron importance of the source. (author)

  2. Separation of Correlated Astrophysical Sources Using Multiple-Lag Data Covariance Matrices

    Directory of Open Access Journals (Sweden)

    Baccigalupi C

    2005-01-01

    Full Text Available This paper proposes a new strategy to separate astrophysical sources that are mutually correlated. This strategy is based on second-order statistics and exploits prior information about the possible structure of the mixing matrix. Unlike ICA blind separation approaches, where the sources are assumed mutually independent and no prior knowledge of the mixing matrix is assumed, our strategy relaxes the independence assumption and performs the separation of even significantly correlated sources. Besides the mixing matrix, our strategy can also evaluate the source covariance functions at several lags. Moreover, once the mixing parameters have been identified, a simple deconvolution can be used to estimate the probability density functions of the source processes. To benchmark our algorithm, we used a database that simulates the observations expected from the instruments that will operate onboard ESA's Planck Surveyor Satellite to measure the CMB anisotropies over the whole celestial sphere.
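    The second-order statistics underlying this strategy are multichannel data covariance matrices evaluated at several lags. A minimal sketch of how such matrices are estimated from mixed observations follows; the mixing matrix, lag set, and source model are invented for illustration, and the actual separation step (jointly exploiting these matrices together with the priors on the mixing matrix) is not shown:

```python
import numpy as np

# Estimate multiple-lag data covariance matrices from a multichannel mixture
# x(t) = A s(t). These matrices are the raw material of second-order
# separation methods; A, the lag set, and the sources here are illustrative.

def lagged_covariances(x, lags):
    """x: (channels, samples). Return {lag: C_lag} with
    C_lag = E[x(t) x(t+lag)^T], estimated by averaging over t."""
    n = x.shape[1]
    x = x - x.mean(axis=1, keepdims=True)  # remove channel means
    return {L: x[:, : n - L] @ x[:, L:].T / (n - L) for L in lags}

rng = np.random.default_rng(0)
A = np.array([[1.0, 0.5], [0.3, 1.0]])            # unknown mixing matrix
s = rng.standard_normal((2, 5000))                # two source processes
s[1] = np.convolve(s[1], np.ones(5) / 5, "same")  # give one source memory
x = A @ s                                         # observed mixtures
covs = lagged_covariances(x, lags=[0, 1, 2])
```

    The nonzero lag-1 covariance contributed by the temporally correlated source is exactly the kind of structure that lets a second-order method go beyond what the zero-lag matrix alone can identify.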

  3. Reduced dose uncertainty in MRI-based polymer gel dosimetry using parallel RF transmission with multiple RF sources

    International Nuclear Information System (INIS)

    Sang-Young Kim; Jung-Hoon Lee; Jin-Young Jung; Do-Wan Lee; Seu-Ran Lee; Bo-Young Choe; Hyeon-Man Baek; Korea University of Science and Technology, Daejeon; Dae-Hyun Kim; Jung-Whan Min; Ji-Yeon Park

    2014-01-01

    In this work, we present the feasibility of using a parallel RF transmission method with multiple RF sources (MultiTransmit imaging) in polymer gel dosimetry. Image quality and B1 field homogeneity were statistically better with the MultiTransmit imaging method than with the conventional single-source RF transmission imaging method. In particular, the standard uncertainty of R2 was lower on the MultiTransmit images than on the conventional images. Furthermore, the MultiTransmit measurement showed improved dose resolution. Improved image quality and B1 homogeneity result in reduced dose uncertainty, suggesting the feasibility of MultiTransmit MR imaging in gel dosimetry. (author)

  4. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system.

    Science.gov (United States)

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

    We demonstrate that direct sequence optical code-division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes, generated by spreading coherent data pulses in time, can result from multiple reflections in the interferometers that superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of the complex amplitudes of the combined chips when the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes.

  6. Application of the Modified Source Multiplication (MSM) Technique to Subcritical Reactivity Worth Measurements in Thermal and Fast Reactor Systems

    International Nuclear Information System (INIS)

    Blaise, P.; Fougeras, Ph.; Mellier, F.

    2011-01-01

    The Amplified Source Multiplication (ASM) method and its improved Modified Source Multiplication (MSM) method have been widely used in the CEA's EOLE and MASURCA critical facilities over the past decades for the determination of reactivity worths by using fission chambers in subcritical configurations. The ASM methodology uses relatively simple relationships between count rates of efficient miniature fission chambers located in slightly subcritical reference and perturbed configurations. While this method works quite well for small reactivity variations, the raw results need to be corrected to take into account the flux perturbation at the fission chamber location. This is performed by applying to the measurement a correction factor called MSM. This paper describes in detail both methodologies, with their associated uncertainties. Applications on absorber cluster worth in the MISTRAL-4 full MOX mock-up core and the last core loaded in MASURCA show the importance of the MSM correction on raw ASM data. (authors)
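    In its simplest point-kinetics form, the ASM method infers the perturbed reactivity from the ratio of count rates in the reference and perturbed states, and the MSM factor rescales that raw result. A hedged sketch, with all numerical values (reactivities, count rates, MSM factor) purely illustrative rather than EOLE or MASURCA data:

```python
# Point-kinetics sketch of the ASM estimate with an MSM correction.
# In a source-driven subcritical core, the detector count rate C is
# approximately proportional to S / (-rho), so two states give
#   rho_pert ~= rho_ref * (C_ref / C_pert).
# All numbers below are illustrative.

def asm_reactivity(rho_ref, c_ref, c_pert, msm_factor=1.0):
    """Perturbed-state reactivity from a reference reactivity and count rates.

    msm_factor corrects the raw ASM ratio for the perturbation of the flux
    (and hence detector efficiency) at the fission-chamber location; in
    practice it is obtained from transport calculations of both states.
    """
    return rho_ref * (c_ref / c_pert) * msm_factor

rho_raw = asm_reactivity(-1.0, 5000.0, 2000.0)         # raw ASM estimate, $
rho_msm = asm_reactivity(-1.0, 5000.0, 2000.0, 1.08)   # MSM-corrected, $
worth = rho_msm - (-1.0)  # reactivity worth of the inserted absorber, $
```

    The gap between `rho_raw` and `rho_msm` is exactly the correction the paper argues must be applied before raw ASM data can be trusted for large perturbations.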

  7. The wave of the future - Searching for gravity waves

    International Nuclear Information System (INIS)

    Goldsmith, D.

    1991-01-01

    Research on gravity waves conducted by such scientists as Gamov, Wheeler, Weber, and Zel'dovich is discussed. Particular attention is given to current trends in the theoretical analysis of gravity waves carried out by theorists Kip Thorne and Leonid Grishchuk. The problems discussed include the search for gravity waves; calculation of the types of gravity waves; the possibility of detecting gravity waves from localized sources, e.g., from the collision of two black holes in a distant galaxy or the collapse of a star, through the Laser Interferometer Gravitational Wave Observatory; and the detection of primordial gravity waves from the big bang.

  8. Identification of dust storm source areas in West Asia using multiple environmental datasets.

    Science.gov (United States)

    Cao, Hui; Amiraslani, Farshad; Liu, Jian; Zhou, Na

    2015-01-01

    Sand and dust storms are common phenomena in arid and semi-arid areas. The West Asia region, especially the Tigris-Euphrates alluvial plain, has been recognized as one of the most important dust source areas in the world. In this paper, a method is applied to extract SDS (Sand and Dust Storm) sources in the West Asia region using thematic maps, climate and geography data, the HYSPLIT model, and satellite images. Of the 50 dust storms that occurred during 2000-2013 and were collected in the form of MODIS images, 27 events were used to demonstrate trajectories simulated by the HYSPLIT model. In addition, a dataset of newly released Landsat images was used as the base map for interpreting SDS source regions. As a result, six main clusters were recognized as dust source areas, of which three clusters situated in the Tigris-Euphrates plain were identified as severe SDS sources (accounting for 70% of the dust storms in this research). Another cluster, in the Sistan plain, is also a potential source area. This approach also confirmed six main paths causing dust storms. These paths are driven by the climate system, including the Siberian and Polar anticyclones, the monsoon from the Indian subcontinent, and depressions from the north of Africa. The identification of SDS source areas and paths will improve our understanding of the mechanisms and impacts of dust storms on the socio-economy and environment of the region. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Start-up Characteristics of Swallow-tailed Axial-grooved Heat Pipe under the conditions of Multiple Heat Sources

    Science.gov (United States)

    Zhang, Renping

    2017-12-01

    A mathematical model was developed for predicting the start-up characteristics of a swallow-tailed axial-grooved heat pipe under the conditions of multiple heat sources. The effects of the heat capacitance of the heat source, liquid-vapour interfacial evaporation-condensation heat transfer, and shear stress at the interface were considered in the current model. The interfacial evaporating mass flow rate is based on a kinetic analysis. Time variations of the evaporating mass rate, wall temperature, and liquid velocity are studied from start-up to steady state. The calculated results show that the wall temperature exhibits a step transition at the junction between heated and unheated regions of the evaporator. The liquid velocity changes drastically in the heated portion of the evaporator section, but varies only slightly in the portion without a heat source. When the heat capacitance of the heat source is ignored, the calculated temperature shows a quicker response; when it is taken into account, the data obtained from the proposed model agree well with the experimental results.

  10. Multiple sources of metals of mineralization in Lower Cambrian black shales of South China: Evidence from geochemical and petrographic study

    Czech Academy of Sciences Publication Activity Database

    Pašava, J.; Kříbek, B.; Vymazalová, A.; Sýkorová, Ivana; Žák, Karel; Orberger, B.

    2008-01-01

    Roč. 58, č. 1 (2008), s. 25-42 ISSN 1344-1698 R&D Projects: GA AV ČR IAA300460510 Institutional research plan: CEZ:AV0Z30460519; CEZ:AV0Z30130516 Keywords: multiple source * Cambrian Ni-Mo-polymetallic black shale * SEDEX barite deposit Subject RIV: DD - Geochemistry Impact factor: 0.377, year: 2008

  11. Basalt generation at the Apollo 12 site. Part 2: Source heterogeneity, multiple melts, and crustal contamination

    Science.gov (United States)

    Neal, Clive R.; Hacker, Matthew D.; Snyder, Gregory A.; Taylor, Lawrence A.; Liu, Yun-Gang; Schmitt, Roman A.

    1994-01-01

    The petrogenesis of Apollo 12 mare basalts has been examined with emphasis on trace-element ratios and abundances. Vitrophyric basalts were used as parental compositions for the modeling, and proportions of fractionating phases were determined using the MAGFOX program of Longhi (1991). Crystal fractionation processes within crustal and sub-crustal magma chambers are evaluated as a function of pressure. Knowledge of the fractionating phases allows trace-element variations to be considered as either source related or as a product of post-magma-generation processes. For the ilmenite and olivine basalts, trace-element variations are inherited from the source, but the pigeonite basalt data have been interpreted with open-system evolution processes through crustal assimilation. Three groups of basalts have been examined: (1) Pigeonite basalts, produced by the assimilation of lunar crustal material by a parental melt (up to 3% assimilation and 10% crystal fractionation, with an 'r' value of 0.3). (2) Ilmenite basalts, produced by variable degrees of partial melting (4-8%) of a source of olivine, pigeonite, augite, and plagioclase, brought together by overturn of the Lunar Magma Ocean (LMO) cumulate pile. After generation, which did not exhaust any of the minerals in the source, these melts experienced closed-system crystal fractionation/accumulation. (3) Olivine basalts, produced by variable degrees of partial melting (5-10%) of a source of olivine, pigeonite, and augite. After generation, again without exhausting any of the minerals in the source, these melts evolved through crystal accumulation. The evolved liquid counterparts of these cumulates have not been sampled. The source compositions for the ilmenite and olivine basalts were calculated by assuming that the vitrophyric compositions were primary and the magmas were produced by non-modal batch melting. Although the magnitude is unclear, evaluation of these source regions indicates that both be composed of early- and

  12. The Cause of Gravity

    OpenAIRE

    Byrne, Michael

    1999-01-01

    Einstein said that gravity is an acceleration like any other acceleration. But gravity causes relativistic effects at non-relativistic speeds; so gravity could have relativistic origins. And since the strong force is thought to cause most of mass, and mass is proportional to gravity; the strong force is therefore also proportional to gravity. The strong force could thus cause relativistic increases of mass through the creation of virtual gluons; along with a comparable contraction of space ar...

  13. Failures in sand in reduced gravity environments

    Science.gov (United States)

    Marshall, Jason P.; Hurley, Ryan C.; Arthur, Dan; Vlahinic, Ivan; Senatore, Carmine; Iagnemma, Karl; Trease, Brian; Andrade, José E.

    2018-04-01

    The strength of granular materials, specifically sand, is important for understanding physical phenomena on other celestial bodies. However, relatively few experiments have been conducted to determine the dependence of strength properties on gravity. In this work, we experimentally investigated relative values of strength (the peak friction angle, the residual friction angle, the angle of repose, and the peak dilatancy angle) in Earth, Martian, Lunar, and near-zero gravity. The various angles were captured in a classical passive Earth pressure experiment conducted on board a reduced-gravity flight and analyzed using digital image correlation. The data showed essentially no dependence of the peak friction angle on gravity, a decrease in the residual friction angle between Martian and Lunar gravity, no dependence of the angle of repose on gravity, and an increase in the dilation angle between Martian and Lunar gravity. Additionally, multiple flow surfaces were seen in near-zero gravity. These results highlight the importance of understanding the strength and deformation mechanisms of granular materials at different levels of gravity.

  14. FormScanner: Open-Source Solution for Grading Multiple-Choice Exams

    Science.gov (United States)

    Young, Chadwick; Lo, Glenn; Young, Kaisa; Borsetta, Alberto

    2016-01-01

    The multiple-choice exam remains a staple for many introductory physics courses. In the past, people have graded these by hand or even flaming needles. Today, one usually grades the exams with a form scanner that utilizes optical mark recognition (OMR). Several companies provide these scanners and particular forms, such as the eponymous…

  15. Isolating and Examining Sources of Suppression and Multicollinearity in Multiple Linear Regression

    Science.gov (United States)

    Beckstead, Jason W.

    2012-01-01

    The presence of suppression (and multicollinearity) in multiple regression analysis complicates interpretation of predictor-criterion relationships. The mathematical conditions that produce suppression in regression analysis have received considerable attention in the methodological literature but until now nothing in the way of an analytic…

  16. Using Multiple Sources of Information in Establishing Text Complexity. Reading Research Report. #11.03

    Science.gov (United States)

    Hiebert, Elfrieda H.

    2011-01-01

    A focus of the Common Core State Standards/English Language Arts (CCSS/ELA) is that students become increasingly more capable with complex text over their school careers. This focus has redirected attention to the measurement of text complexity. Although CCSS/ELA suggests multiple criteria for this task, the standards offer a single measure of…

  17. Noise analysis of a white-light supercontinuum light source for multiple wavelength confocal laser scanning fluorescence microscopy

    Energy Technology Data Exchange (ETDEWEB)

    McConnell, Gail [Centre for Biophotonics, Strathclyde Institute for Biomedical Sciences, University of Strathclyde, 27 Taylor Street, Glasgow, G4 0NR (United Kingdom)

    2005-08-07

    Intensity correlations of a Ti : sapphire, Kr/Ar and a white-light supercontinuum were performed to quantify the typical signal amplitude fluctuations and hence ascertain the comparative output stability of the white-light supercontinuum source for confocal laser scanning microscopy (CLSM). Intensity correlations across a two-pixel sample (n = 1000) of up to 98%, 95% and 94% were measured for the Ti : sapphire, Kr/Ar and white-light supercontinuum source, respectively. The white-light supercontinuum noise level is therefore acceptable for CLSM, with the added advantage of wider wavelength flexibility over traditional CLSM excitation sources. The relatively low-noise white-light supercontinuum was then used to perform multiple wavelength sequential CLSM of guinea pig detrusor to confirm the reliability of the system and to demonstrate system flexibility.

  18. Optimizing the regimes of the Advanced LIGO gravitational wave detector for multiple source types

    International Nuclear Information System (INIS)

    Kondrashov, I. S.; Simakov, D. A.; Khalili, F. Ya.; Danilishin, S. L.

    2008-01-01

    We developed algorithms which allow us to find regimes of the signal-recycled Fabry-Perot-Michelson interferometer [for example, the Advanced Laser Interferometric Gravitational Wave Observatory (LIGO)], optimized concurrently for two (binary inspirals + bursts) and three (binary inspirals + bursts + millisecond pulsars) types of gravitational wave sources. We show that there exists a relatively large area in the interferometer parameters space where the detector sensitivity to the first two kinds of sources differs only by a few percent from the maximal ones for each kind of source. In particular, there exists a specific regime where this difference is ≅0.5% for both of them. Furthermore, we show that even more multipurpose regimes are also possible that provide significant sensitivity gain for millisecond pulsars with only minor sensitivity degradation for binary inspirals and bursts.

  19. Multiple human schemas and the communication-information sources use: An application of Q-methodology

    Directory of Open Access Journals (Sweden)

    Mansour Shahvali

    2014-12-01

    Full Text Available This study was conducted with the aim of developing a communication and information model for greenhouse farmers in Yazd city using schema theory. Using Q methodology together with factor analysis, the variables loaded onto five schematic factors: philosophical, ideological, economic, social, and environmental-conservation beliefs. Structural equation modeling in AMOS further revealed that the philosophical, ideological, social, and economic schemas directly influence the use of personal communication-information sources, while the environmental-conservation schema influences it both directly and indirectly. More importantly, this study indicated the important role that indigenous sources play in constructing, evaluating, and retrieving environmental knowledge among respondents. The research provides a suitable context for policymakers who seek to draw up more effective and appropriate communication and information strategies to address the needs of specific target groups.

  20. Ionizing radiation sources: very diversified means, multiple applications and a changing regulatory environment. Conference proceedings

    International Nuclear Information System (INIS)

    2011-11-01

    This document brings together the available presentations given at the conference organised by the French society of radiation protection about ionizing radiation source means, applications and regulatory environment. Twenty eight presentations (slides) are compiled in this document and deal with: 1 - Overview of sources - some quantitative data from the national inventory of ionizing radiation sources (Yann Billarand, IRSN); 2 - Overview of sources (Jerome Fradin, ASN); 3 - Regulatory framework (Sylvie Rodde, ASN); 4 - Alternatives to Iridium radiography - the case of pressure devices at the manufacturing stage (Henri Walaszek, Cetim; Bruno Kowalski, Welding Institute); 5 - Dosimetric stakes of medical scanner examinations (Jean-Louis Greffe, Charleroi hospital of Medical University); 6 - The removal of ionic smoke detectors (Bruno Charpentier, ASN); 7 - Joint-activity and reciprocal liabilities - Organisation of labour risk prevention in case of companies joint-activity (Paulo Pinto, DGT); 8 - Consideration of gamma-graphic testing in the organization of a unit outage activities (Jean-Gabriel Leonard, EDF); 9 - Radiological risk control at a closed and independent work field (Stephane Sartelet, Areva); 10 - Incidents and accidents status and typology (Pascale Scanff, IRSN); 11 - Regional overview of radiation protection significant events (Philippe Menechal, ASN); 12 - Incident leading to a tritium contamination in and urban area - consequences and experience feedback (Laurence Fusil, CEA); 13 - Experience feedback - loss of sealing of a calibration source (Philippe Mougnard, Areva); 14 - Blocking incident of a 60 Co source (Bruno Delille, Salvarem); 15 - Triggering of gantry's alarm: status of findings (Philippe Prat, Syctom); 16 - Non-medical electric devices: regulatory changes (Sophie Dagois, IRSN; Jerome Fradin, ASN); 17 - Evaluation of the dose equivalent rate in pulsed fields: method proposed by the IRSN and implementation test (Laurent Donadille, IRSN

  1. Multiple information sources and consequences of conflicting information about medicine use during pregnancy: a multinational Internet-based survey.

    Science.gov (United States)

    Hämeen-Anttila, Katri; Nordeng, Hedvig; Kokki, Esa; Jyrkkä, Johanna; Lupattelli, Angela; Vainio, Kirsti; Enlund, Hannes

    2014-02-20

    A wide variety of information sources on medicines is available for pregnant women. When using multiple information sources, there is the risk that information will vary or even conflict. The objective of this multinational study was to analyze the extent to which pregnant women use multiple information sources and the consequences of conflicting information, and to investigate which maternal sociodemographic, lifestyle, and medical factors were associated with these objectives. An anonymous Internet-based questionnaire was made accessible during a period of 2 months, on 1 to 4 Internet websites used by pregnant women in 5 regions (Eastern Europe, Western Europe, Northern Europe, Americas, Australia). A total of 7092 responses were obtained (n=5090 pregnant women; n=2002 women with a child younger than 25 weeks). Descriptive statistics and logistic regression analysis were used. Of the respondents who stated that they needed information, 16.16% (655/4054) used one information source and 83.69% (3393/4054) used multiple information sources. Of respondents who used more than one information source, 22.62% (759/3355) stated that the information was conflicted. According to multivariate logistic regression analysis, factors significantly associated with experiencing conflict in medicine information included being a mother (OR 1.32, 95% CI 1.11-1.58), having university (OR 1.33, 95% CI 1.09-1.63) or other education (OR 1.49, 95% CI 1.09-2.03), residing in Eastern Europe (OR 1.52, 95% CI 1.22-1.89) or Australia (OR 2.28, 95% CI 1.42-3.67), use of 3 (OR 1.29, 95% CI 1.04-1.60) or >4 information sources (OR 1.82, 95% CI 1.49-2.23), and having ≥2 chronic diseases (OR 1.49, 95% CI 1.18-1.89). Because of conflicting information, 43.61% (331/759) decided not to use medication during pregnancy, 30.30% (230/759) sought a new information source, 32.67% (248/759) chose to rely on one source and ignore the conflicting one, 25.03% (190/759) became anxious, and 2.64% (20/759) did

  2. Research on numerical method for multiple pollution source discharge and optimal reduction program

    Science.gov (United States)

    Li, Mingchang; Dai, Mingxin; Zhou, Bin; Zou, Bin

    2018-03-01

    In this paper, an optimal pollutant-reduction program is derived using a nonlinear optimization method, the genetic algorithm. The four main rivers in Jiangsu province, China, are selected with the goal of reducing environmental pollution in the nearshore district. Dissolved inorganic nitrogen (DIN) is studied as the only pollutant. The environmental status and the water-quality standard of the nearshore district are used to determine the required reduction in the discharge of multiple river pollutant sources. The resulting reduction program provides a basis for marine environmental management.
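    A toy version of such a genetic-algorithm search, choosing per-river DIN load reductions that satisfy a concentration standard at minimum total reduction, can be sketched as follows; the response coefficients, loads, and standard are invented for illustration and are not the paper's data:

```python
import random

# Toy genetic algorithm for a multi-source discharge-reduction program:
# minimize total load reduction subject to a concentration standard at a
# control point, assuming a linear response of concentration to each load.
RESPONSE = [0.8, 0.5, 1.2, 0.6]   # concentration per unit load, 4 rivers
LOADS = [10.0, 8.0, 6.0, 12.0]    # current DIN loads
STANDARD = 14.0                   # allowed concentration at the control point

def concentration(reductions):
    return sum(r * (l - d) for r, l, d in zip(RESPONSE, LOADS, reductions))

def fitness(reductions):
    # minimize total reduction; heavy penalty if the standard is violated
    penalty = max(0.0, concentration(reductions) - STANDARD) * 1e3
    return sum(reductions) + penalty

def genetic_search(pop_size=60, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, l) for l in LOADS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 3]           # selection: keep the best third
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(u + v) / 2 for u, v in zip(a, b)]   # crossover
            i = rng.randrange(len(child))                 # mutation, clamped
            child[i] = min(LOADS[i], max(0.0, child[i] + rng.gauss(0.0, 0.5)))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = genetic_search()
```

    The penalty term steers the population toward feasible programs, after which selection pressure minimizes the total amount of discharge that must be cut.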

  3. How organic carbon derived from multiple sources contributes to carbon sequestration processes in a shallow coastal system?

    Science.gov (United States)

    Watanabe, Kenta; Kuwae, Tomohiro

    2015-04-16

    Carbon captured by marine organisms helps sequester atmospheric CO2, especially in shallow coastal ecosystems, where rates of primary production and burial of organic carbon (OC) from multiple sources are high. However, linkages between the dynamics of OC derived from multiple sources and carbon sequestration are poorly understood. We investigated the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of particulate OC (POC) and dissolved OC (DOC) in the water column and sedimentary OC using elemental, isotopic, and optical signatures in Furen Lagoon, Japan. Based on these analyses, we explored how OC from multiple sources contributes to sequestration via storage in sediments, water column sequestration, and air-sea CO2 exchanges, and analyzed how the contributions vary with salinity in a shallow seagrass meadow. The relative contribution of terrestrial POC in the water column decreased with increasing salinity, whereas autochthonous POC increased in the salinity range 10-30. Phytoplankton-derived POC dominated the water column POC (65-95%) within this salinity range; however, it was minor in the sediments (3-29%). In contrast, terrestrial and phytobenthos-derived POC were relatively minor contributors in the water column but were major contributors in the sediments (49-78% and 19-36%, respectively), indicating that terrestrial and phytobenthos-derived POC were selectively stored in the sediments. Autochthonous DOC, part of which can contribute to long-term carbon sequestration in the water column, accounted for >25% of the total water column DOC pool in the salinity range 15-30. Autochthonous OC production decreased the concentration of dissolved inorganic carbon in the water column and thereby contributed to atmospheric CO2 uptake, except in the low-salinity zone. Our results indicate that shallow coastal ecosystems function not only as transition zones between land and ocean but also as carbon sequestration filters. They

  4. Memory for Textual Conflicts Predicts Sourcing When Adolescents Read Multiple Expository Texts

    Science.gov (United States)

    Stang Lund, Elisabeth; Bråten, Ivar; Brante, Eva W.; Strømsø, Helge I.

    2017-01-01

    This study investigated whether memory for conflicting information predicted mental representation of source-content links (i.e., who said what) in a sample of 86 Norwegian adolescent readers. Participants read four texts presenting conflicting claims about sun exposure and health. With differences in gender, prior knowledge, and interest…

  5. Recent performances of the multiple charged heavy-ion source - triple mafios

    International Nuclear Information System (INIS)

    Briand, P.; Chan-tung, N.; Geller, R.; Jacquot, B.

    1977-01-01

    The principle and the characteristics of the ion source are described. We also furnish up-to-date performance data concerning ion currents and global emittances of the beam, as well as the emittances of Ar1+ to Ar10+ in the radial and axial planes. (orig./WL) [de

  6. Evaluation of Personal and Built Environment Attributes to Physical Activity: A Multilevel Analysis on Multiple Population-Based Data Sources

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2012-01-01

    Full Text Available Background. Studies have documented that built environment factors potentially promote or impede leisure time physical activity (LTPA). This study explored the relationship of multiple built environment factors and individual characteristics with LTPA. Methods. Multiple data sources were utilized, including individual-level data on health behaviors and health status from the Nevada Behavioral Risk Factor Surveillance System (BRFSS) and community-level data from different sources, including indicators for recreation facilities, safety, air quality, commute time, urbanization, population density, and land mix level. Mixed-model logistic regression and geographic information system (GIS) spatial analysis were conducted. Results. Among 6,311 respondents, 24.4% reported no LTPA engagement during the past 30 days. No engagement in LTPA was significantly associated with (1) individual factors: older age, less education, lower income, obesity, and low life satisfaction; and (2) community factors: longer commute time, higher crime rate, urban residence, and higher population density, but not with density of and distance to recreation facilities, air quality, or land mix. Conclusions. Multiple data systems, including complex population surveys and spatial analysis, are valuable tools for studies of health and the built environment.

  7. Chiral gravity, log gravity, and extremal CFT

    International Nuclear Information System (INIS)

    Maloney, Alexander; Song Wei; Strominger, Andrew

    2010-01-01

    We show that the linearizations of all exact solutions of classical chiral gravity around the AdS3 vacuum have positive energy. Nonchiral and negative-energy solutions of the linearized equations are infrared divergent at second order, and so are removed from the spectrum. In other words, chirality is confined and the equations of motion have linearization instabilities. We prove that the only stationary, axially symmetric solutions of chiral gravity are BTZ black holes, which have positive energy. It is further shown that classical log gravity--the theory with logarithmically relaxed boundary conditions--has finite asymptotic symmetry generators but is not chiral and hence may be dual at the quantum level to a logarithmic conformal field theory (CFT). Moreover, we show that log gravity contains chiral gravity within it as a decoupled charge superselection sector. We formally evaluate the Euclidean sum over geometries of chiral gravity and show that it gives precisely the holomorphic extremal CFT partition function. The modular invariance and integrality of the expansion coefficients of this partition function are consistent with the existence of an exact quantum theory of chiral gravity. We argue that the problem of quantizing chiral gravity is the holographic dual of the problem of constructing an extremal CFT, while quantizing log gravity is dual to the problem of constructing a logarithmic extremal CFT.

  8. Pesticide pollution of multiple drinking water sources in the Mekong Delta, Vietnam: evidence from two provinces.

    Science.gov (United States)

    Chau, N D G; Sebesvari, Z; Amelung, W; Renaud, F G

    2015-06-01

    Pollution of drinking water sources with agrochemicals is often a major threat to human and ecosystem health in some river deltas, where agricultural production must meet the requirements of national food security or export aspirations. This study surveyed the use of different drinking water sources and their pollution with pesticides in order to characterize potential sources of pesticide exposure in rural areas of the Mekong River delta, Vietnam. The field work comprised both household surveys and monitoring of 15 frequently used pesticide active ingredients in different water sources used for drinking (surface water, groundwater, water at public pumping stations, surface water chemically treated at household level, harvested rainwater, and bottled water). Our research also considered the surrounding land use systems as well as the cropping seasons. Improper pesticide storage and waste disposal as well as inadequate personal protection during pesticide handling and application were widespread amongst the interviewed households, with little overall risk awareness for human and environmental health. The results show that despite local differences in the amount and frequency of pesticides applied, pesticide pollution was ubiquitous. Isoprothiolane (max. concentration 8.49 μg L⁻¹), fenobucarb (max. 2.32 μg L⁻¹), and fipronil (max. 0.41 μg L⁻¹) were detected in almost all analyzed water samples (98% of all surface samples contained isoprothiolane, for instance). Other pesticides quantified comprised butachlor, pretilachlor, propiconazole, hexaconazole, difenoconazole, cypermethrin, fenoxaprop-P-ethyl, tebuconazole, trifloxystrobin, azoxystrobin, quinalphos, and thiamethoxam. Among the studied water sources, concentrations were highest in canal waters. Pesticide concentrations varied with cropping season but did not diminish through the year. Even in harvested rainwater or purchased bottled water, up to 12 different pesticides were detected at

  9. Locating non-volcanic tremor along the San Andreas Fault using a multiple array source imaging technique

    Science.gov (United States)

    Ryberg, T.; Haberland, C.H.; Fuis, G.S.; Ellsworth, W.L.; Shelly, D.R.

    2010-01-01

    Non-volcanic tremor (NVT) has been observed at several subduction zones and at the San Andreas Fault (SAF). Tremor locations are commonly derived by cross-correlating envelope-transformed seismic traces in combination with source-scanning techniques. Recently, they have also been located by relative relocation with master events, that is, low-frequency earthquakes that are part of the tremor, whose locations are derived by conventional traveltime-based methods. Here we present a method to locate the sources of NVT using an imaging approach for multiple array data. The performance of the method is checked with synthetic tests and the relocation of earthquakes. We also applied the method to tremor occurring near Cholame, California. A set of small-aperture arrays (i.e. an array consisting of arrays) installed around Cholame provided the data set for this study. We observed several tremor episodes and located tremor sources in the vicinity of the SAF. During individual tremor episodes, we observed a systematic change of source location, indicating rapid migration of the tremor source along the SAF. © 2010 The Authors, Geophysical Journal International © 2010 RAS.

  10. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    International Nuclear Information System (INIS)

    Gora, D.; Bernardini, E.; Cruz Silva, A.H.

    2011-04-01

    A method for a time-dependent search for flaring astrophysical sources that could potentially be detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term describing the contribution of many small clusters of signal-like events, the method provides an effective way of searching for weak neutrino flares over different time-scales. It is sensitive to an overall excess of events distributed over several flares that are not individually detectable. For standard cases (one flare), the discovery potential of the method is worse than that of a standard time-dependent point-source analysis with unknown flare duration, by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)

  11. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    Energy Technology Data Exchange (ETDEWEB)

    Gora, D. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institute of Nuclear Physics PAN, Cracow (Poland); Bernardini, E.; Cruz Silva, A.H. [Institute of Nuclear Physics PAN, Cracow (Poland)

    2011-04-15

    A method for a time-dependent search for flaring astrophysical sources that could potentially be detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term describing the contribution of many small clusters of signal-like events, the method provides an effective way of searching for weak neutrino flares over different time-scales. It is sensitive to an overall excess of events distributed over several flares that are not individually detectable. For standard cases (one flare), the discovery potential of the method is worse than that of a standard time-dependent point-source analysis with unknown flare duration, by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)
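    The time-clustering idea in these two records can be illustrated with a toy scan over event-bounded windows. This is not the experiments' unbinned likelihood (which weights each event by signal and background probability densities); it is a simplified Poisson counting analogue with invented event times and background rate:

```python
import math

def poisson_llr(n_obs, n_exp):
    # Log-likelihood ratio of an observed excess over a Poisson expectation
    if n_obs <= n_exp or n_exp <= 0:
        return 0.0
    return n_obs * math.log(n_obs / n_exp) - (n_obs - n_exp)

def best_clusters(times, rate_bg, max_flares=3):
    """Greedy time-clustering: repeatedly pick the event-bounded window
    with the largest excess over background, then exclude that window."""
    times = sorted(times)
    flares, used = [], []
    for _ in range(max_flares):
        best_llr, best_win = 0.0, None
        for i in range(len(times)):
            for j in range(i + 1, len(times)):
                t0, t1 = times[i], times[j]
                if any(not (t1 < a or t0 > b) for a, b in used):
                    continue  # overlaps an already-selected flare
                llr = poisson_llr(j - i + 1, rate_bg * (t1 - t0))
                if llr > best_llr:
                    best_llr, best_win = llr, (t0, t1)
        if best_win is None:
            break
        flares.append((best_llr, best_win))
        used.append(best_win)
    return flares

# Two injected flares on a sparse background (times in days, rate per day)
events = [3.0, 50.0, 20.0, 20.1, 20.2, 20.3, 70.0, 70.05, 70.1, 95.0]
for llr, (t0, t1) in best_clusters(events, rate_bg=0.05):
    print(f"flare in [{t0:.2f}, {t1:.2f}] days, LLR = {llr:.1f}")
```

    Summing the per-flare statistics, as the records describe, is what makes the search sensitive to several flares that are individually below threshold.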

  12. Using multiple isotopes to understand the source of ingredients used in golden beverages

    Science.gov (United States)

    Wynn, J. G.

    2011-12-01

    Traditionally, beer contains 4 simple ingredients: water, barley, hops and yeast. Each of these ingredients contributes some combination of a number of "traditional" stable isotopes (i.e., isotopes of H, C, O, N and S) to the final product. As an educational exercise in an "Analytical Techniques in Geology" course, a group of students analyzed the isotopic composition of the gas, liquid and solid phases of a variety of beer samples collected from throughout the world (including other beverages). The hydrogen and oxygen isotopic composition of the water closely followed the isotopic composition of local meteoric water at the source of the brewery, although there is a systematic offset from the global meteoric water line that may be due to the effects of CO2-H2O equilibration. The carbon isotopic composition of the CO2 reflected that of the solid residue (the source of carbon used as a fermentation substrate), but may potentially be modified by addition of gas-phase CO2 from an inorganic source. The carbon isotopic composition of the solid residue similarly tracks that of the fermentation substrate and may indicate that some alcohol was fermented from added sugars in some cases. The nitrogen isotopic composition of the solid residue was relatively constant and may track the source of nitrogen in the barley, hops and yeast. Each of the analytical methods used is a relatively standard technique in geological applications, making this a "fun" exercise for those involved and giving the students hands-on experience with a variety of analytes from a non-traditional sample material.
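    The "systematic offset from the global meteoric water line" mentioned above is conventionally quantified as the deuterium excess, d = δD − 8·δ¹⁸O, which equals 10‰ for water on the GMWL (δD = 8·δ¹⁸O + 10). The sample values below are invented for illustration, not the class's measurements:

```python
# Deuterium excess relative to the Global Meteoric Water Line
# (GMWL: dD = 8 * d18O + 10). Sample delta values (in permil) are invented.
def d_excess(delta_d, delta_18o):
    return delta_d - 8.0 * delta_18o

samples = {"pilsner": (-55.0, -7.8), "stout": (-38.0, -5.6)}
for name, (dd, d18o) in samples.items():
    print(name, round(d_excess(dd, d18o), 1))  # pilsner 7.4, stout 6.8
```

    Values below 10‰ for all samples would indicate a consistent offset below the GMWL, as the abstract attributes to CO2-H2O equilibration.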

  13. Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal

    OpenAIRE

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-01-01

    In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, which has oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This...

  14. Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal

    OpenAIRE

    M. Wronna; R. Omira; M. A. Baptista

    2015-01-01

    In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines – Portugal, one of the test-sites of project ASTARTE. Sines holds one of the most important deep-water ports which contains oil-bearing, petrochemical, liquid bulk, coal and container terminals. The port and its industrial infrastructures are facing the ocean southwest towards the main seismogenic sources. This work considers two different seis...

  15. Using Generalizability Theory to Disattenuate Correlation Coefficients for Multiple Sources of Measurement Error.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-05-02

    Over the years, research in the social sciences has been dominated by reporting of reliability coefficients that fail to account for key sources of measurement error. Use of these coefficients, in turn, to correct for measurement error can hinder scientific progress by misrepresenting true relationships among the underlying constructs being investigated. In the research reported here, we addressed these issues using generalizability theory (G-theory) in both traditional and new ways to account for the three key sources of measurement error (random-response, specific-factor, and transient) that affect scores from objectively scored measures. Results from 20 widely used measures of personality, self-concept, and socially desirable responding showed that conventional indices consistently misrepresented reliability and relationships among psychological constructs by failing to account for key sources of measurement error and correlated transient errors within occasions. The results further revealed that G-theory served as an effective framework for remedying these problems. We discuss possible extensions in future research and provide code from the computer package R in an online supplement to enable readers to apply the procedures we demonstrate to their own research.
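    The classical correction that G-theory generalizes here is Spearman's disattenuation formula, r_true = r_obs / sqrt(rel_x · rel_y); in the article's framework the reliabilities would be G-coefficients that account for random-response, specific-factor, and transient error rather than conventional alpha. A minimal sketch with invented numbers:

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Spearman's correction: estimate the correlation between true scores
    from the observed correlation and each measure's reliability."""
    if not (0 < rel_x <= 1 and 0 < rel_y <= 1):
        raise ValueError("reliabilities must lie in (0, 1]")
    return r_xy / math.sqrt(rel_x * rel_y)

# Observed r = 0.42 between two scales with reliability estimates 0.80, 0.70
print(round(disattenuate(0.42, 0.80, 0.70), 3))  # 0.561
```

    The article's point is that plugging in reliabilities which ignore specific-factor and transient error makes the denominator too large or too small, and hence misstates the corrected correlation.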

  16. Identification of Multiple Subtypes of Campylobacter jejuni in Chicken Meat and the Impact on Source Attribution

    Directory of Open Access Journals (Sweden)

    John A. Hudson

    2013-09-01

    Full Text Available Most source attribution studies for Campylobacter use subtyping data based on single isolates from foods and environmental sources in an attempt to draw epidemiological inferences. It has been suggested that subtyping only one Campylobacter isolate per chicken carcass incurs a risk of failing to recognise the presence of clinically relevant, but numerically infrequent, subtypes. To investigate this, between 21 and 25 Campylobacter jejuni isolates from each of ten retail chicken carcasses were subtyped by pulsed-field gel electrophoresis (PFGE) using the two restriction enzymes SmaI and KpnI. Among the 227 isolates, thirteen subtypes were identified, the most frequently occurring subtype being isolated from three carcasses. Six carcasses carried a single subtype, three carcasses carried two subtypes each and one carcass carried three subtypes. Some subtypes carried by an individual carcass were shown to be potentially clonally related. Comparison of C. jejuni subtypes from chickens with isolate subtypes from human clinical cases (n = 1248) revealed that seven of the thirteen chicken subtypes were indistinguishable from human cases. None of the numerically minor chicken subtypes were identified in the human data. Therefore, typing only one Campylobacter isolate from individual chicken carcasses may be adequate to inform Campylobacter source attribution.

  17. Multiple sources driving the organic matter dynamics in two contrasting tropical mangroves

    International Nuclear Information System (INIS)

    Ray, R.; Shahraki, M.

    2016-01-01

    In this study, we have selected two different mangroves based on their geological, hydrological and climatological variations to investigate the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of dissolved organic carbon (DOC) and particulate organic carbon (POC) in the water column and of the sedimentary OC, using elemental ratios and stable isotopes. Qeshm Island, representing the Iranian mangroves, received no attention before this study in terms of DOC and POC biogeochemistry and their sources, unlike the Sundarbans (Indian side), the world's largest mangrove system. Slightly higher DOC concentrations in the Iranian mangroves were recorded in our field campaigns between 2011 and 2014, compared to the Sundarbans (315 ± 25 μM vs. 278 ± 42 μM), owing to the longer water residence times, while a 9–10 times greater POC concentration (303 ± 37 μM, n = 82) was linked to both suspended load (345 ± 104 mg L⁻¹) and high algal production. A yearlong phytoplankton bloom in the mangrove-lined Persian Gulf was reported to be the perennial source of both POC and DOC, contributing 80–86% to the DOC and 90–98% to the POC pool. In the Sundarbans, by contrast, riverine input contributed 50–58% to the DOC pool, and POC composition was regulated by the seasonal litter fall, river discharge and phytoplankton production. Algal-derived organic matter (microphytobenthos) represented the maximum contribution (70–76%) to the sedimentary OC at Qeshm Island, while mangrove leaf litter dominated the OC pool in the Indian Sundarbans. Finally, hydrographical settings (i.e. riverine transport) appeared to be the determinant factor differentiating OM sources in the water column between the dry and wet mangroves. - Highlights: • Sources of OC have been identified and compared between two contrasting mangroves. • Phytoplankton dominated the DOC and POC pool in the Iranian mangroves. • River input contributed half of the total DOC and part of POC in the Indian

  18. Multiple sources driving the organic matter dynamics in two contrasting tropical mangroves

    Energy Technology Data Exchange (ETDEWEB)

    Ray, R., E-mail: raghab.ray@gmail.com [Institut Universitaire Européen de la Mer, UBO, UMR 6539 LEMAR, rue Dumont d'Urville, 29280 Plouzane (France); Leibniz Center for Tropical Marine Ecology, Fahrenheitstr. 6, 28359 Bremen (Germany); Shahraki, M. [Leibniz Center for Tropical Marine Ecology, Fahrenheitstr. 6, 28359 Bremen (Germany); Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, Am Handelshafen 12, 27570 Bremerhaven (Germany)

    2016-11-15

    In this study, we have selected two different mangroves based on their geological, hydrological and climatological variations to investigate the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of dissolved organic carbon (DOC) and particulate organic carbon (POC) in the water column and of the sedimentary OC, using elemental ratios and stable isotopes. Qeshm Island, representing the Iranian mangroves, received no attention before this study in terms of DOC and POC biogeochemistry and their sources, unlike the Sundarbans (Indian side), the world's largest mangrove system. Slightly higher DOC concentrations in the Iranian mangroves were recorded in our field campaigns between 2011 and 2014, compared to the Sundarbans (315 ± 25 μM vs. 278 ± 42 μM), owing to the longer water residence times, while a 9–10 times greater POC concentration (303 ± 37 μM, n = 82) was linked to both suspended load (345 ± 104 mg L⁻¹) and high algal production. A yearlong phytoplankton bloom in the mangrove-lined Persian Gulf was reported to be the perennial source of both POC and DOC, contributing 80–86% to the DOC and 90–98% to the POC pool. In the Sundarbans, by contrast, riverine input contributed 50–58% to the DOC pool, and POC composition was regulated by the seasonal litter fall, river discharge and phytoplankton production. Algal-derived organic matter (microphytobenthos) represented the maximum contribution (70–76%) to the sedimentary OC at Qeshm Island, while mangrove leaf litter dominated the OC pool in the Indian Sundarbans. Finally, hydrographical settings (i.e. riverine transport) appeared to be the determinant factor differentiating OM sources in the water column between the dry and wet mangroves. - Highlights: • Sources of OC have been identified and compared between two contrasting mangroves. • Phytoplankton dominated the DOC and POC pool in the Iranian mangroves. • River input contributed half of the total DOC and part of POC in
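    Source contributions like the 80–86% and 50–58% figures in these two records are typically obtained from an isotope mass-balance mixing model. A minimal three-end-member sketch (the δ¹³C and N/C end-member values below are invented, not the study's measurements) solves the linear system by Cramer's rule:

```python
# Three-end-member mixing: f_terr + f_benthos + f_plankton = 1, with mass
# balance on two tracers (delta-13C and N/C). End-member values are invented.
d13c = {"terr": -28.0, "benthos": -16.0, "plankton": -21.0}
nc = {"terr": 0.04, "benthos": 0.10, "plankton": 0.13}

def mix_fractions(d13c_sample, nc_sample):
    # Build the 3x3 linear system A f = b and solve it by Cramer's rule
    names = ["terr", "benthos", "plankton"]
    a = [[d13c[n] for n in names],
         [nc[n] for n in names],
         [1.0, 1.0, 1.0]]
    b = [d13c_sample, nc_sample, 1.0]
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(a)
    fracs = []
    for col in range(3):
        m = [row[:] for row in a]
        for r in range(3):
            m[r][col] = b[r]
        fracs.append(det(m) / d)
    return dict(zip(names, fracs))

print(mix_fractions(-22.0, 0.09))
```

    With two tracers, at most three sources can be resolved exactly; more sources or tracer uncertainty call for a Bayesian mixing model instead.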

  19. A modeling study of saltwater intrusion in the Andarax delta area using multiple data sources

    DEFF Research Database (Denmark)

    Antonsson, Arni Valur; Engesgaard, Peter Knudegaard; Jorreto, Sara

    In groundwater model development, construction of the conceptual model is one of the (initial and) critical aspects that determines the model reliability and applicability in terms of e.g. system (hydrogeological) understanding, groundwater quality predictions, and general use in water resources context. The validity of a conceptual model is determined by different factors, where both data quantity and quality are of crucial importance. Often, when dealing with saltwater intrusion, data is limited. Therefore, using different sources (and types) of data can be beneficial and increase…

  20. PSD Applicability Determination for Multiple Owner/Operator Point Sources Within a Single Facility

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  1. Estimation of Multiple Point Sources for Linear Fractional Order Systems Using Modulating Functions

    KAUST Repository

    Belkhatir, Zehor

    2017-06-28

    This paper proposes an estimation algorithm for the characterization of multiple point inputs for linear fractional order systems. First, using the polynomial modulating functions method and a suitable change of variables, the problem of estimating the locations and amplitudes of a multi-pointwise input is decoupled into two algebraic systems of equations. The first system is nonlinear and solves for the time locations iteratively, whereas the second system is linear and solves for the input's amplitudes. Second, closed-form formulas for both the time location and the amplitude are provided in the particular case of a single point input. Finally, numerical examples are given to illustrate the performance of the proposed technique in both noise-free and noisy cases. The joint estimation of the pointwise input and the fractional differentiation orders is also presented, and a discussion of the performance of the proposed algorithm is provided.

  2. Exploring multiple sources of climatic information within personal and medical diaries, Bombay 1799-1828

    Science.gov (United States)

    Adamson, George

    2016-04-01

    Private diaries are being recognised as an important source of information on past climatic conditions, providing place-specific, often daily records of meteorological information. As many were not intended for publication, or indeed to be read by anyone other than the author, issues of observer bias are lower than for some other types of documentary sources. This paper explores the variety of types of climatic information that can be mined from a single document or set of documents. The focus of the analysis is three private diaries and one medical diary kept by British colonists in Bombay, western India, during the first decades of the nineteenth century. The paper discusses the potential of the diaries for reconstruction of precipitation, temperature and extreme events. Ad-hoc temperature observations collected by the four observers prove particularly fruitful for reconstructing monthly extreme temperatures, with values comparable to more systematic observations collected during the period. This leads to the tentative conclusion that extreme temperatures in Bombay were around 5°C lower during the period than today, a difference likely attributable predominantly to the urban heat island effect.

  3. Testing the count rate performance of the scintillation camera by exponential attenuation: Decaying source; Multiple filters

    International Nuclear Information System (INIS)

    Adams, R.; Mena, I.

    1988-01-01

    An algorithm and two FORTRAN programs have been developed to evaluate the count rate performance of scintillation cameras from count rates reduced exponentially, either by a decaying source or by filtration. The first method is used with short-lived radionuclides such as ¹⁹¹ᵐIr or ¹⁹¹ᵐAu. The second implements a National Electrical Manufacturers Association (NEMA) protocol in which the count rate from a source of ⁹⁹ᵐTc is attenuated by a varying number of copper filters stacked over it. The count rate at each data point is corrected for deadtime loss after assigning an arbitrary deadtime (τ). A second-order polynomial is fitted to the logarithms of the net count rate values: ln(R) = A + BT + CT², where R is the net corrected count rate (cps) and T is the elapsed time (or the filter thickness in the NEMA method). Depending on C, τ is incremented or decremented iteratively, and the count rate corrections and curve fittings are repeated until C approaches zero, indicating a correct value of the deadtime (τ). The program then plots the measured count rate versus the corrected count rate values
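    The iterative τ search described above can be sketched as follows. The sketch assumes a non-paralyzable deadtime model, m = n/(1 + nτ), which the abstract does not state, and uses bisection rather than the abstract's increment/decrement loop; the synthetic source parameters are invented:

```python
import math

def fit_quadratic(ts, ys):
    # Least-squares fit y = A + B*t + C*t**2 via the 3x3 normal equations
    S = [sum(t**k for t in ts) for k in range(5)]
    Y = [sum(y * t**k for t, y in zip(ts, ys)) for k in range(3)]
    a = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(a)
    coeffs = []
    for col in range(3):
        m = [row[:] for row in a]
        for r in range(3):
            m[r][col] = Y[r]
        coeffs.append(det(m) / d)
    return coeffs  # [A, B, C]

def estimate_deadtime(ts, observed, lo=0.0, hi=8e-6, iters=60):
    # Bisect on tau: over-correction bends ln(R) upward (C > 0),
    # under-correction bends it downward (C < 0); C = 0 at the true tau.
    def curvature(tau):
        corrected = [m / (1.0 - m * tau) for m in observed]
        return fit_quadratic(ts, [math.log(r) for r in corrected])[2]
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if curvature(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Synthetic run: 2e5 cps initial true rate, 2 h half-life, tau = 5 us;
# observed rates follow the non-paralyzable model m = n / (1 + n*tau).
tau_true, lam, r0 = 5e-6, math.log(2) / 2.0, 2.0e5   # lam in 1/hours
ts = [0.25 * i for i in range(25)]                    # hours
true = [r0 * math.exp(-lam * t) for t in ts]
observed = [n / (1.0 + n * tau_true) for n in true]
print(estimate_deadtime(ts, observed))  # recovers ~5e-6 s
```

    At the correct τ the corrected decay curve is a pure exponential, so its log is linear and the quadratic term vanishes, which is exactly the stopping criterion the abstract describes.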

  4. Examining Multiple Sources of Differential Item Functioning on the Clinician & Group CAHPS® Survey

    Science.gov (United States)

    Rodriguez, Hector P; Crane, Paul K

    2011-01-01

    Objective To evaluate psychometric properties of a widely used patient experience survey. Data Sources English-language responses to the Clinician & Group Consumer Assessment of Healthcare Providers and Systems (CG-CAHPS®) survey (n = 12,244) from a 2008 quality improvement initiative involving eight southern California medical groups. Methods We used an iterative hybrid ordinal logistic regression/item response theory differential item functioning (DIF) algorithm to identify items with DIF related to patient sociodemographic characteristics, duration of the physician–patient relationship, number of physician visits, and self-rated physical and mental health. We accounted for all sources of DIF and determined its cumulative impact. Principal Findings The upper end of the CG-CAHPS® performance range is measured with low precision. With sensitive settings, some items were found to have DIF. However, overall DIF impact was negligible, as 0.14 percent of participants had salient DIF impact. Latinos who spoke predominantly English at home had the highest prevalence of salient DIF impact at 0.26 percent. Conclusions The CG-CAHPS® functions similarly across commercially insured respondents from diverse backgrounds. Consequently, previously documented racial and ethnic group differences likely reflect true differences rather than measurement bias. The impact of low precision at the upper end of the scale should be clarified. PMID:22092021

  5. Quantum W3 gravity

    International Nuclear Information System (INIS)

    Schoutens, K.; van Nieuwenhuizen, P.; State Univ. of New York, Stony Brook, NY

    1991-11-01

    We briefly review some results in the theory of quantum W3 gravity in the chiral gauge. We compare them with similar results in the analogous but simpler cases of d = 2 induced gauge theories and d = 2 induced gravity

  6. Urine specific gravity test

    Science.gov (United States)

    ... medlineplus.gov/ency/article/003587.htm Urine specific gravity test. Urine specific gravity is a laboratory test that shows the concentration ...

  7. Cadiz, California Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (32 records) were gathered by Mr. Seth I. Gutman for AridTech Inc., Denver, Colorado using a Worden Prospector gravity meter. This data base...

  8. Andes 1997 Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Central Andes gravity data (6,151 records) were compiled by Professor Gotze and the MIGRA Group. This data base was received in April, 1997. Principal gravity...

  9. DNAG Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Decade of North American Geology (DNAG) gravity grid values, spaced at 6 km, were used to produce the Gravity Anomaly Map of North America (1987; scale...

  10. Gravity wave astronomy

    International Nuclear Information System (INIS)

    Pinheiro, R.

    1979-01-01

    The properties and production of gravitational radiation are described. The prospects for its detection are considered, including the Weber apparatus and gravity-wave telescopes. Possibilities of gravity-wave astronomy are noted

  11. Northern Oklahoma Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (710 records) were compiled by Professor Ahern. This data base was received in June 1992. Principal gravity parameters include latitude,...

  12. Idaho State Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (24,284 records) were compiled by the U. S. Geological Survey. This data base was received on February 23, 1993. Principal gravity...

  13. The development of a methodology to assess population doses from multiple sources and exposure pathways of radioactivity

    International Nuclear Information System (INIS)

    Hancox, J.; Stansby, S.; Thorne, M.

    2002-01-01

    The Environment Agency (EA) has new duties in accordance with the Basic Safety Standards Directive under which it is required to ensure that doses to individuals received from exposure to anthropogenic sources of radioactivity are within defined limits. In order to assess compliance with these requirements, the EA needs to assess the doses to members of the most highly exposed population groups ('critical' groups) from all relevant potential sources of anthropogenic radioactivity and all relevant potential exposure pathways to such radioactivity. The EA has identified a need to develop a methodology for the retrospective assessment of effective doses from multiple sources of radioactive materials and exposure pathways associated with those sources. Under contract to the EA, AEA Technology has undertaken the development of a suitable methodology as part of EA R and D Project P3-070. The methodology developed under this research project has been designed to support the EA in meeting its obligations under the Euratom Basic Safety Standards Directive and is consistent with UK and international approaches to radiation dosimetry and radiological protection. The development and trial application of the methodology is described in this report

  14. The metal-organic framework MIL-53(Al) constructed from multiple metal sources: alumina, aluminum hydroxide, and boehmite.

    Science.gov (United States)

    Li, Zehua; Wu, Yi-nan; Li, Jie; Zhang, Yiming; Zou, Xin; Li, Fengting

    2015-04-27

    Three aluminum compounds, namely alumina, aluminum hydroxide, and boehmite, are probed as the metal sources for the hydrothermal synthesis of a typical metal-organic framework MIL-53(Al). The process exhibits enhanced synthetic efficiency without the generation of strongly acidic byproducts. The time-course monitoring of conversion from different aluminum sources into MIL-53(Al) is achieved by multiple characterization that reveals a similar but differentiated crystallinity, porosity, and morphology relative to typical MIL-53(Al) prepared from water-soluble aluminum salts. Moreover, the prepared MIL-53(Al) constructed with the three insoluble aluminum sources exhibit an improved thermal stability of up to nearly 600 °C and enhanced yields. Alumina and boehmite are more preferable than aluminum hydroxide in terms of product porosity, yield, and reaction time. The adsorption performances of a typical environmental endocrine disruptor, dimethyl phthalate, on the prepared MIL-53(Al) samples are also investigated. The improved structural stability of MIL-53(Al) prepared from these alternative aluminum sources enables double-enhanced adsorption performance (up to 206 mg g(-1)) relative to the conventionally obtained MIL-53(Al). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. THE USE OF MULTIPLE DATA SOURCES IN THE PROCESS OF TOPOGRAPHIC MAPS UPDATING

    Directory of Open Access Journals (Sweden)

    A. Cantemir

    2016-06-01

    Full Text Available The methods used in the process of updating maps have evolved and become more complex, especially with the development of digital technology. At the same time, the development of technology has led to an abundance of available data that can be used in the updating process. The data sources come in a great variety of forms and formats from different acquisition sensors. Satellite images provided by certain satellite missions are now available on space agency portals. Images stored in the archives of satellite missions such as Sentinel and Landsat can be downloaded free of charge. Their main advantages are the large coverage area and rather good spatial resolution, which enable the use of these images for map updating at an appropriate scale. In our study we focused our research on the 1:50,000 scale map. Globally available DEMs could represent an appropriate input for watershed delineation and stream network generation, which can be used as support for updating the hydrography thematic layer. If, in addition to remote sensing, aerial photogrammetry and LiDAR data are used, the accuracy of the data sources is enhanced. Orthophotoimages and Digital Terrain Models are the main products that can be used for feature extraction and update. On the other hand, the use of georeferenced analogue basemaps represents a significant addition to the process. Concerning thematic maps, the classic representation of the terrain by contour lines derived from the DTM remains the best method of depicting the Earth's surface on a map; nevertheless, correlation with other layers such as hydrography is mandatory.
In the context of the current national coverage of the Digital Terrain Model, one of the main concerns of the National Center of Cartography, through the Cartography and Photogrammetry Department, is the exploitation of the available data in order to update the layers of the Topographic Reference Map 1:5000, known as

  16. Strings and quantum gravity

    International Nuclear Information System (INIS)

    Vega, H.J. de

    1990-01-01

    One of the main challenges in theoretical physics today is the unification of all interactions including gravity. At present, string theories appear as the most promising candidates to achieve such a unification. However, gravity has not completely been incorporated in string theory, many technical and conceptual problems remain and a full quantum theory of gravity is still non-existent. Our aim is to properly understand strings in the context of quantum gravity. Attempts towards this are reviewed. (author)

  17. A Multiple Source Approach to Organisational Justice: The Role of the Organisation, Supervisors, Coworkers, and Customers

    Directory of Open Access Journals (Sweden)

    Agustin Molina

    2015-07-01

    Full Text Available The vast research on organisational justice has focused on the organisation and the supervisor. This study aims to further this line of research by integrating two trends within organisational justice research: the overall approach to justice perceptions and the multifoci perspective of justice judgments. Specifically, this study aims to explore the effects of two additional sources of justice, coworker-focused justice and customer-focused justice, on relevant employee outcomes (burnout, turnover intentions, job satisfaction, and workplace deviance) while controlling for the effects of organisation-focused justice and supervisor-focused justice. Given the increased importance attributed to coworkers and customers, we expect coworker-focused justice and customer-focused justice to explain incremental variance in the measured outcomes, above and beyond the effects of organisation-focused justice and supervisor-focused justice. Participants will be university students from Austria and Germany employed by service organisations. Data analysis will be conducted using structural equation modeling.

  18. High brightness--multiple beamlets source for patterned X-ray production

    Science.gov (United States)

    Leung, Ka-Ngo [Hercules, CA; Ji, Qing [Albany, CA; Barletta, William A [Oakland, CA; Jiang, Ximan [El Cerrito, CA; Ji, Lili [Albany, CA

    2009-10-27

    Techniques for controllably directing beamlets to a target substrate are disclosed. The beamlets may be either positive ions or electrons. It has been shown that beamlets may be produced with a diameter of 1 μm, with inter-aperture spacings of 12 μm. An array of such beamlets may be used for maskless lithography. By step-wise movement of the beamlets relative to the target substrate, individual devices may be directly e-beam written. Ion beams may be directly written as well. Due to the high brightness of the beamlets extracted from a multicusp source, exposure times for lithographic exposure are thought to be minimized. Alternatively, the beamlets may be electrons striking a high-Z material for X-ray production, thereafter collimated to provide patterned X-ray exposures such as those used in CAT scans. Such a device may be used for remote detection of explosives.

  19. Evidence for multiple sources of 10Be in the early solar system

    DEFF Research Database (Denmark)

    Wielandt, Daniel Kim Peel; Nagashima, Kazuhide; Krot, Alexander N.

    2012-01-01

    Beryllium-10 is a short-lived radionuclide (t 1/2 = 1.4 Myr) uniquely synthesized by spallation reactions and inferred to have been present when the solar system's oldest solids (calcium-aluminum-rich inclusions, CAIs) formed. Yet, the astrophysical site of 10Be nucleosynthesis is uncertain. We...... in the gaseous CAI-forming reservoir, or in the inclusions themselves: this indicates at least two nucleosynthetic sources of 10Be in the early solar system. The most promising locale for 10Be synthesis is close to the proto-Sun during its early mass-accreting stages, as these are thought to coincide...

  20. Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal

    Science.gov (United States)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-11-01

    In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, which has oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, maximum inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of Horseshoe and Marques de Pombal faults as the worst-case scenario, with wave heights of over 10 m, which reach the coast approximately 22 min after the rupture. It dominates the aggregate scenario by about 60 % of the impact area at the test site, considering maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km2.
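    The aggregate scenario, and the statement that one source "dominates the aggregate scenario by about 60 % of the impact area", suggest a cell-wise combination of the single-scenario hazard grids. A minimal sketch of that reading, using made-up grids and scenario labels rather than the study's actual model output:

```python
import numpy as np

# Hypothetical maximum-wave-height grids (metres) for three single scenarios,
# all co-registered on the same nested-grid cell layout.
scenario_grids = {
    "HSMPF": np.array([[10.2, 8.1], [6.4, 0.0]]),
    "GF":    np.array([[ 3.1, 2.5], [2.8, 0.0]]),
    "MPF":   np.array([[ 7.5, 6.0], [5.1, 0.0]]),
}

# Aggregate scenario: cell-wise maximum over all single scenarios.
aggregate = np.maximum.reduce(list(scenario_grids.values()))

# Fraction of the impacted area (cells with hmax > 0) in which a given
# scenario supplies the aggregate value, i.e. "dominates" the aggregate.
impacted = aggregate > 0
dominance = {name: float(np.mean(grid[impacted] == aggregate[impacted]))
             for name, grid in scenario_grids.items()}
print(dominance["HSMPF"])  # 1.0: HSMPF dominates every impacted cell here
```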

  1. Geometric Liouville gravity

    International Nuclear Information System (INIS)

    La, H.

    1992-01-01

    A new geometric formulation of Liouville gravity based on the area-preserving diffeomorphism is given and a possible alternative to reinterpret Liouville gravity is suggested, namely, a scalar field coupled to two-dimensional gravity with a curvature constraint

  2. Covariant w∞ gravity

    NARCIS (Netherlands)

    Bergshoeff, E.; Pope, C.N.; Stelle, K.S.

    1990-01-01

    We discuss the notion of higher-spin covariance in w∞ gravity. We show how a recently proposed covariant w∞ gravity action can be obtained from non-chiral w∞ gravity by making field redefinitions that introduce new gauge-field components with corresponding new gauge transformations.

  3. Induced quantum conformal gravity

    International Nuclear Information System (INIS)

    Novozhilov, Y.V.; Vassilevich, D.V.

    1988-11-01

    Quantum gravity is considered as induced by matter degrees of freedom and related to the symmetry breakdown in the low energy region of a non-Abelian gauge theory of fundamental fields. An effective action for quantum conformal gravity is derived where both the gravitational constant and conformal kinetic term are positive. Relation with induced classical gravity is established. (author). 15 refs

  4. Quantum Gravity Phenomenology

    OpenAIRE

    Amelino-Camelia, Giovanni

    2003-01-01

    Comment: 9 pages, LaTex. These notes were prepared while working on an invited contribution to the November 2003 issue of Physics World, which focused on quantum gravity. They intend to give a non-technical introduction (accessible to readers from outside quantum gravity) to "Quantum Gravity Phenomenology"

  5. Gravity is Geometry.

    Science.gov (United States)

    MacKeown, P. K.

    1984-01-01

    Clarifies two concepts of gravity--those of a fictitious force and those of how space and time may have geometry. Reviews the position of Newton's theory of gravity in the context of special relativity and considers why gravity (as distinct from electromagnetics) lends itself to Einstein's revolutionary interpretation. (JN)

  6. Bayesian inference based modelling for gene transcriptional dynamics by integrating multiple source of knowledge

    Directory of Open Access Journals (Sweden)

    Wang Shu-Qiang

    2012-07-01

    Full Text Available Abstract Background A key challenge in the post-genome era is to identify genome-wide transcriptional regulatory networks, which specify the interactions between transcription factors and their target genes. Numerous methods have been developed for reconstructing gene regulatory networks from expression data. However, most of them are based on coarse-grained qualitative models and cannot provide a quantitative view of regulatory systems. Results A binding-affinity-based regulatory model is proposed to quantify the transcriptional regulatory network. Multiple quantities, including binding affinity and the activity level of the transcription factor (TF), are incorporated into a general learning model. The sequence features of the promoter and the possible occupancy of nucleosomes are exploited to estimate the binding probability of regulators. Compared with previous models that only employ microarray data, the proposed model can bridge the gap between the relative background frequency of the observed nucleotide and the gene's transcription rate. Conclusions We test the proposed approach on two real-world microarray datasets. Experimental results show that the proposed model can effectively identify the parameters and the activity level of TF. Moreover, the kinetic parameters introduced in the proposed model can reveal more biological sense than previous models can.

  7. Wheat multiple synthetic derivatives: a new source for heat stress tolerance adaptive traits

    Science.gov (United States)

    Elbashir, Awad Ahmed Elawad; Gorafi, Yasir Serag Alnor; Tahir, Izzat Sidahmed Ali; Kim, June-Sik; Tsujimoto, Hisashi

    2017-01-01

    Heat stress is detrimental to wheat (Triticum aestivum L.) productivity. In this study, we aimed to select heat-tolerant plants from a multiple synthetic derivatives (MSD) population and evaluate their agronomic and physiological traits. We selected six tolerant plants from the population with the background of the cultivar ‘Norin 61’ (N61) and established six MNH (MSD population of N61 selected as heat stress-tolerant) lines. We grew these lines with N61 in the field and growth chamber. In the field, we used optimum and late sowings to ensure plant exposure to heat. In the growth chamber, in addition to N61, we used the heat-tolerant cultivars ‘Gelenson’ and ‘Bacanora’. We confirmed that MNH2 and MNH5 lines acquired heat tolerance. These lines had higher photosynthesis and stomata conductance and exhibited no reduction in grain yield and biomass under heat stress compared to N61. We noticed that N61 had relatively good adaptability to heat stress. Our results indicate that the MSD population includes the diversity of Aegilops tauschii and is a promising resource to uncover useful quantitative traits derived from this wild species. Selected lines could be useful for heat stress tolerance breeding. PMID:28744178

  8. Application of the Modified Source Multiplication (MSM) technique to subcritical reactivity worth measurements in thermal and fast reactor systems

    International Nuclear Information System (INIS)

    Blaise, P.; Fougeras, P.; Mellier, F.

    2009-01-01

    The Amplified Source Multiplication (ASM) method and its improved variant, the Modified Source Multiplication (MSM) method, have been widely used in the CEA's EOLE and MASURCA critical facilities over the past decades for the determination of reactivity worths by using fission chambers in subcritical configurations. They have been successfully applied to absorber (single or cluster) worth measurements in both thermal and fast spectra, and to (sodium or water) void reactivity worths. The ASM methodology, which is the basic technique to estimate a reactivity worth, uses relatively simple relationships between the count rates of efficient miniature fission chambers located in slightly subcritical reference and perturbed configurations. While this method works quite well for small reactivity variations (a few effective delayed neutron fractions), its raw results need to be corrected to take into account the flux perturbation at the fission chamber. This is performed by applying to the measurement a correction factor called MSM. Its characteristic is to take into account the local space and energy variation of the spectrum at the fission chamber, through standard perturbation theory applied to a neutron transport calculation of the perturbed configuration. The proposed paper describes both methodologies in detail, with their associated uncertainties. Applications to absorber cluster worths in the MISTRAL-4 full-MOX mock-up core and the last core loaded in MASURCA show the importance of the MSM correction on raw data. (authors)
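    The abstract gives no formulas, but the ASM/MSM logic it describes can be sketched with invented numbers: the raw ASM estimate scales a known reference reactivity by the ratio of detector count rates, and the MSM factor (which in practice comes from a perturbation-theory transport calculation of the local flux change at the chamber, not from the count rates themselves) corrects that raw value:

```python
def asm_reactivity(rho_ref, c_ref, c_pert):
    """Raw ASM estimate: in a subcritical system the detector count rate
    scales roughly inversely with reactivity, so the perturbed reactivity is
    the reference value scaled by the count-rate ratio."""
    return rho_ref * c_ref / c_pert

def msm_reactivity(rho_ref, c_ref, c_pert, f_msm):
    """MSM estimate: the raw ASM value corrected by f_msm for the local
    space/energy perturbation of the spectrum seen by the fission chamber."""
    return f_msm * asm_reactivity(rho_ref, c_ref, c_pert)

# Illustrative numbers only (not from the paper):
rho_ref = -1.5                 # $, known reference subcriticality
c_ref, c_pert = 1200.0, 400.0  # counts/s before/after inserting an absorber
raw = asm_reactivity(rho_ref, c_ref, c_pert)                  # ≈ -4.5 $
corrected = msm_reactivity(rho_ref, c_ref, c_pert, f_msm=1.08)  # ≈ -4.86 $
print(raw, corrected)
```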

  9. Detecting and accounting for multiple sources of positional variance in peak list registration analysis and spin system grouping.

    Science.gov (United States)

    Smelter, Andrey; Rouchka, Eric C; Moseley, Hunter N B

    2017-08-01

    Peak lists derived from nuclear magnetic resonance (NMR) spectra are commonly used as input data for a variety of computer-assisted and automated analyses. These include automated protein resonance assignment and protein structure calculation software tools. Prior to these analyses, peak lists must be aligned to each other and sets of related peaks must be grouped based on common chemical shift dimensions. Even when programs can perform peak grouping, they require the user to provide uniform match tolerances or use default values. However, peak grouping is further complicated by multiple sources of variance in peak position, limiting the effectiveness of grouping methods that utilize uniform match tolerances. In addition, no method currently exists for deriving peak positional variances from single peak lists for grouping peaks into spin systems, i.e. spin system grouping within a single peak list. Therefore, we developed a complementary pair of peak list registration analysis and spin system grouping algorithms designed to overcome these limitations. We have implemented these algorithms into an approach that can identify multiple dimension-specific positional variances that exist in a single peak list and group peaks from a single peak list into spin systems. The resulting software tools generate a variety of useful statistics on both a single peak list and pairwise peak list alignment, especially for quality assessment of peak list datasets. We used a range of low- and high-quality experimental solution NMR and solid-state NMR peak lists to assess the performance of our registration analysis and grouping algorithms. Analyses show that an algorithm using a single iteration and uniform match tolerances recovers only 50 to 80% of the spin systems due to the presence of multiple sources of variance. Our algorithm recovers additional spin systems by reevaluating match tolerances in multiple iterations. To facilitate evaluation of the
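    As a toy illustration of tolerance-based grouping (not the authors' variance-estimating, multi-iteration algorithm), the sketch below groups peaks from a single hypothetical peak list along one chemical-shift dimension using a uniform match tolerance, i.e. exactly the simple approach whose limitations the paper addresses:

```python
# A toy sketch: single-linkage grouping of peaks that share, within a uniform
# match tolerance, the same chemical shift in one chosen "grouping" dimension.
def group_peaks(peaks, dim, tol):
    """Sort peaks along one dimension and start a new group whenever the gap
    to the previous peak exceeds the tolerance."""
    order = sorted(peaks, key=lambda p: p[dim])
    groups, current = [], [order[0]]
    for peak in order[1:]:
        if peak[dim] - current[-1][dim] <= tol:
            current.append(peak)  # close enough: same putative spin system
        else:
            groups.append(current)
            current = [peak]
    groups.append(current)
    return groups

# Hypothetical (1H, 15N) peaks in ppm; two spin systems separated in 1H.
peaks = [(8.21, 119.2), (8.22, 125.6), (8.90, 110.4), (8.91, 118.0)]
groups = group_peaks(peaks, dim=0, tol=0.05)
print(len(groups))  # 2
```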

  10. Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal

    Science.gov (United States)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-08-01

    In this paper, we present a scenario-based approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid-bulk, coal and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawback, run-up and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario and the influence of the tide on the aggregate scenario. The results confirm the composite source of the Horseshoe and Marques de Pombal faults as the worst-case scenario; it governs the aggregate scenario with about 60 % of the impact area and inundates a total area of 3.5 km2.

  11. IPeak: An open source tool to combine results from multiple MS/MS search engines.

    Science.gov (United States)

    Wen, Bo; Du, Chaoqin; Li, Guilin; Ghali, Fawaz; Jones, Andrew R; Käll, Lukas; Xu, Shaohang; Zhou, Ruo; Ren, Zhe; Feng, Qiang; Xu, Xun; Wang, Jun

    2015-09-01

    Liquid chromatography coupled tandem mass spectrometry (LC-MS/MS) is an important technique for detecting peptides in proteomics studies. Here, we present an open source software tool, termed IPeak, a peptide identification pipeline that is designed to combine the Percolator post-processing algorithm and multi-search strategy to enhance the sensitivity of peptide identifications without compromising accuracy. IPeak provides a graphical user interface (GUI) as well as a command-line interface, which is implemented in JAVA and can work on all three major operating system platforms: Windows, Linux/Unix and OS X. IPeak has been designed to work with the mzIdentML standard from the Proteomics Standards Initiative (PSI) as an input and output, and also been fully integrated into the associated mzidLibrary project, providing access to the overall pipeline, as well as modules for calling Percolator on individual search engine result files. The integration thus enables IPeak (and Percolator) to be used in conjunction with any software packages implementing the mzIdentML data standard. IPeak is freely available and can be downloaded under an Apache 2.0 license at https://code.google.com/p/mzidentml-lib/. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Exploiting multiple sources of information in learning an artificial language: human data and modeling.

    Science.gov (United States)

    Perruchet, Pierre; Tillmann, Barbara

    2010-03-01

    This study investigates the joint influences of three factors on the discovery of new word-like units in a continuous artificial speech stream: the statistical structure of the ongoing input, the initial word-likeness of parts of the speech flow, and the contextual information provided by the earlier emergence of other word-like units. Results of an experiment conducted with adult participants show that these sources of information have strong and interactive influences on word discovery. The authors then examine the ability of different models of word segmentation to account for these results. PARSER (Perruchet & Vinter, 1998) is compared with the view that word segmentation relies on the exploitation of transitional probabilities between successive syllables, and with models based on the Minimum Description Length principle, such as INCDROP. The authors submit arguments suggesting that PARSER has the advantage of accounting for the whole pattern of data without ad hoc modifications, while relying exclusively on general-purpose learning principles. This study strengthens the growing notion that nonspecific cognitive processes, mainly based on associative learning and memory principles, are able to account for a larger part of early language acquisition than previously assumed. Copyright © 2009 Cognitive Science Society, Inc.
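    The transitional-probability account that PARSER is compared against can be sketched directly: estimate P(next syllable | current syllable) from the stream, then place word boundaries at local minima of the transitional-probability profile. The three-syllable "words" below are invented stimuli in the style of such experiments, not the study's actual material:

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Estimate forward TPs P(b | a) from syllable bigram counts."""
    bigrams = Counter(zip(syllables, syllables[1:]))
    unigrams = Counter(syllables[:-1])
    return {(a, b): n / unigrams[a] for (a, b), n in bigrams.items()}

def segment(syllables, tps):
    """Cut the stream where the incoming TP is a strict local minimum."""
    profile = [tps[(a, b)] for a, b in zip(syllables, syllables[1:])]
    words, word = [], [syllables[0]]
    for i in range(1, len(syllables)):
        left = profile[i - 2] if i >= 2 else float("inf")
        right = profile[i] if i < len(profile) else float("inf")
        if profile[i - 1] < left and profile[i - 1] < right:
            words.append("".join(word))  # boundary before syllable i
            word = []
        word.append(syllables[i])
    words.append("".join(word))
    return words

# Invented tri-syllabic words concatenated in a varied order, so within-word
# TPs are 1.0 while between-word TPs drop below 1.0.
lexicon = {"A": ["tu", "pi", "ro"], "B": ["go", "la", "bu"], "C": ["bi", "da", "ku"]}
stream = [syl for w in "ABCBACACBABC" for syl in lexicon[w]]
tps = transitional_probabilities(stream)
print(segment(stream, tps))  # recovers tupiro / golabu / bidaku tokens
```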

  13. Sequence-based analysis of the microbial composition of water kefir from multiple sources.

    Science.gov (United States)

    Marsh, Alan J; O'Sullivan, Orla; Hill, Colin; Ross, R Paul; Cotter, Paul D

    2013-11-01

    Water kefir is a water-sucrose-based beverage, fermented by a symbiosis of bacteria and yeast to produce a final product that is lightly carbonated, acidic and that has a low alcohol percentage. The microorganisms present in water kefir are introduced via water kefir grains, which consist of a polysaccharide matrix in which the microorganisms are embedded. We aimed to provide a comprehensive sequencing-based analysis of the bacterial population of water kefir beverages and grains, while providing an initial insight into the corresponding fungal population. To facilitate this objective, four water kefirs were sourced from the UK, Canada and the United States. Culture-independent, high-throughput, sequencing-based analyses revealed that the bacterial fraction of each water kefir and grain was dominated by Zymomonas, an ethanol-producing bacterium, which has not previously been detected at such a scale. The other genera detected were representatives of the lactic acid bacteria and acetic acid bacteria. Our analysis of the fungal component established that it was comprised of the genera Dekkera, Hanseniaspora, Saccharomyces, Zygosaccharomyces, Torulaspora and Lachancea. This information will assist in the ultimate identification of the microorganisms responsible for the potentially health-promoting attributes of these beverages. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  14. The role of envelope shape in the localization of multiple sound sources and echoes in the barn owl.

    Science.gov (United States)

    Baxter, Caitlin S; Nelson, Brian S; Takahashi, Terry T

    2013-02-01

    Echoes and sounds of independent origin often obscure sounds of interest, but echoes can go undetected under natural listening conditions, a perception called the precedence effect. How does the auditory system distinguish between echoes and independent sources? To investigate, we presented two broadband noises to barn owls (Tyto alba) while varying the similarity of the sounds' envelopes. The carriers of the noises were identical except for a 2- or 3-ms delay. Their onsets and offsets were also synchronized. In owls, sound localization is guided by neural activity on a topographic map of auditory space. When there are two sources concomitantly emitting sounds with overlapping amplitude spectra, space map neurons discharge when the stimulus in their receptive field is louder than the one outside it and when the averaged amplitudes of both sounds are rising. A model incorporating these features calculated the strengths of the two sources' representations on the map (B. S. Nelson and T. T. Takahashi; Neuron 67: 643-655, 2010). The target localized by the owls could be predicted from the model's output. The model also explained why the echo is not localized at short delays: when envelopes are similar, peaks in the leading sound mask corresponding peaks in the echo, weakening the echo's space map representation. When the envelopes are dissimilar, there are few or no corresponding peaks, and the owl localizes whichever source is predicted by the model to be less masked. Thus the precedence effect in the owl is a by-product of a mechanism for representing multiple sound sources on its map.

  15. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    International Nuclear Information System (INIS)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C
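    The straight-line Gaussian plume model named in the abstract has a standard closed form. The sketch below implements only its core (crosswind and vertical Gaussians with ground reflection), omitting the deposition, decay, terrain, and building-wake adjustments ANEMOS applies; the sigma-y/sigma-z curves are the Briggs rural class-D fits, used here purely as a stand-in for whatever dispersion parameterisation the code actually uses:

```python
import math

def sigma_y(x):
    """Horizontal dispersion parameter (m), Briggs rural class D fit."""
    return 0.08 * x / math.sqrt(1.0 + 0.0001 * x)

def sigma_z(x):
    """Vertical dispersion parameter (m), Briggs rural class D fit."""
    return 0.06 * x / math.sqrt(1.0 + 0.0015 * x)

def plume_concentration(q, u, h, x, y, z):
    """Gaussian plume concentration (Bq/m^3) with ground reflection, for
    source strength q (Bq/s), wind speed u (m/s), release height h (m),
    at downwind distance x, crosswind offset y, and height z (all in m)."""
    sy, sz = sigma_y(x), sigma_z(x)
    crosswind = math.exp(-y**2 / (2.0 * sy**2))
    vertical = (math.exp(-(z - h)**2 / (2.0 * sz**2)) +
                math.exp(-(z + h)**2 / (2.0 * sz**2)))  # mirror-image source
    return q / (2.0 * math.pi * u * sy * sz) * crosswind * vertical

# Ground-level centreline concentration 1 km downwind of a 50 m stack.
c = plume_concentration(q=1.0e9, u=5.0, h=50.0, x=1000.0, y=0.0, z=0.0)
print(f"{c:.3g} Bq/m^3")
```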

  16. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
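
The dispersion calculation described above can be sketched compactly. The sector-averaged Gaussian plume relation below (16 sectors of 22.5°) and the Briggs rural power-law fit used for the vertical dispersion parameter σz are a generic textbook form with invented parameter values — a sketch of the model class, not the actual ANEMOS implementation.

```python
import math

def sigma_z(x_m, a=0.06, b=0.0015):
    """Vertical dispersion parameter (m); assumed Briggs rural class-D fit."""
    return a * x_m / math.sqrt(1.0 + b * x_m)

def sector_avg_concentration(q_bq_s, u_m_s, h_m, x_m):
    """Sector-averaged ground-level air concentration (Bq/m^3) for a
    continuous elevated point release, 16-sector (22.5 degree) geometry:
        chi = 2.032 * Q * exp(-H^2 / (2 sigma_z^2)) / (sigma_z * u * x)
    """
    sz = sigma_z(x_m)
    return 2.032 * q_bq_s * math.exp(-h_m**2 / (2.0 * sz**2)) / (sz * u_m_s * x_m)

# Illustrative release: 1 Bq/s, 5 m/s wind, 50 m effective release height.
c_near = sector_avg_concentration(1.0, 5.0, 50.0, 2000.0)
c_far = sector_avg_concentration(1.0, 5.0, 50.0, 10000.0)
```

Beyond the distance of maximum ground-level impact, the sector-averaged concentration falls off with downwind distance, which is the behaviour tabulated per sector and distance in codes of this kind.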

  17. Gravity in 2+1 dimensions

    International Nuclear Information System (INIS)

    Gerbert, P.S.

    1989-01-01

    A review of 2+1-dimensional gravity is presented, together with recent results concerning the quantum scattering of Klein-Gordon and Dirac test particles in the background of point sources with and without spin. The classical theory and general features of 2+1-dimensional gravity are reviewed. The space-time in the presence of point sources is described. Classical scattering and applications to (spinning) cosmic strings are discussed. The quantum theory is considered by analysing the two-body scattering problem. The scattering of spinless particles is discussed, including spin effects. Some classifying remarks about the three-dimensional analogue of the Weyl tensor and about Chern-Simons theories of gravitation are also presented. (M.C.K.)

  18. Interrogating pollution sources in a mangrove food web using multiple stable isotopes.

    Science.gov (United States)

    Souza, Iara da C; Arrivabene, Hiulana P; Craig, Carol-Ann; Midwood, Andrew J; Thornton, Barry; Matsumoto, Silvia T; Elliott, Michael; Wunderlin, Daniel A; Monferrán, Magdalena V; Fernandes, Marisa N

    2018-06-01

    Anthropogenic activities including metal contamination create well-known problems in coastal mangrove ecosystems, but understanding and linking specific pollution sources to distinct trophic levels within these environments is challenging. This study evaluated anthropogenic impacts on two contrasting mangrove food webs, by using stable isotopes (δ13C, δ15N, 87Sr/86Sr, 206Pb/207Pb and 208Pb/207Pb) measured in sediments, mangrove trees (Rhizophora mangle, Laguncularia racemosa, Avicennia schaueriana), plankton, shrimps (Macrobranchium sp.), crabs (Aratus sp.), oysters (Crassostrea rhizophorae) and fish (Centropomus parallelus) from both areas. Strontium and Pb isotopes were also analysed in water and atmospheric particulate matter (PM). δ15N indicated that crab, shrimp and oyster are at intermediate levels within the local food web and fish, in this case C. parallelus, was confirmed at the highest trophic level. δ15N also indicates different anthropogenic pressures between both estuaries; Vitória Bay, close to intensive human activities, showed higher δ15N across the food web, apparently influenced by sewage. The 87Sr/86Sr ratio showed the primary influence of marine water throughout the entire food web. Pb isotope ratios suggest that PM is primarily influenced by metallurgical activities, with some secondary influence on mangrove plants and crabs sampled in the area adjacent to the smelting works. To our knowledge, this is the first demonstration of the effect of anthropogenic pollution (probable sewage pollution) on the isotopic fingerprint of estuarine-mangrove systems located close to a city compared to less impacted estuarine mangroves. The influence of industrial metallurgical activity detected using Pb isotopic analysis of PM and mangrove plants close to such an impacted area is also notable and illustrates the value of isotopic analysis in tracing the impact and species affected by atmospheric pollution. Copyright © 2018 Elsevier B.V.

  19. EVIDENCE FOR MULTIPLE SOURCES OF 10Be IN THE EARLY SOLAR SYSTEM

    International Nuclear Information System (INIS)

    Wielandt, Daniel; Krot, Alexander N.; Bizzarro, Martin; Nagashima, Kazuhide; Huss, Gary R.; Ivanova, Marina A.

    2012-01-01

    Beryllium-10 is a short-lived radionuclide (t1/2 = 1.4 Myr) uniquely synthesized by spallation reactions and inferred to have been present when the solar system's oldest solids (calcium-aluminum-rich inclusions, CAIs) formed. Yet, the astrophysical site of 10Be nucleosynthesis is uncertain. We report Li-Be-B isotope measurements of CAIs from CV chondrites, including CAIs that formed with the canonical 26Al/27Al ratio of ∼5 × 10−5 (canonical CAIs) and CAIs with Fractionation and Unidentified Nuclear isotope effects (FUN-CAIs) characterized by 26Al/27Al ratios much lower than the canonical value. Our measurements demonstrate the presence of four distinct fossil 10Be/9Be isochrons, lower in the FUN-CAIs than in the canonical CAIs, and variable within these classes. Given that FUN-CAI precursors escaped evaporation-recondensation prior to evaporative melting, we suggest that the 10Be/9Be ratio recorded by FUN-CAIs represents a baseline level present in presolar material inherited from the protosolar molecular cloud, generated via enhanced trapping of galactic cosmic rays. The higher and possibly variable apparent 10Be/9Be ratios of canonical CAIs reflect additional spallogenesis, either in the gaseous CAI-forming reservoir or in the inclusions themselves: this indicates at least two nucleosynthetic sources of 10Be in the early solar system. The most promising locale for 10Be synthesis is close to the proto-Sun during its early mass-accreting stages, as these are thought to coincide with periods of intense particle irradiation occurring on timescales significantly shorter than the formation interval of canonical CAIs.

  20. EarthCube Data Discovery Hub: Enhancing, Curating and Finding Data across Multiple Geoscience Data Sources.

    Science.gov (United States)

    Zaslavsky, I.; Valentine, D.; Richard, S. M.; Gupta, A.; Meier, O.; Peucker-Ehrenbrink, B.; Hudman, G.; Stocks, K. I.; Hsu, L.; Whitenack, T.; Grethe, J. S.; Ozyurt, I. B.

    2017-12-01

    EarthCube Data Discovery Hub (DDH) is an EarthCube Building Block project using technologies developed in CINERGI (Community Inventory of EarthCube Resources for Geoscience Interoperability) to enable geoscience users to explore a growing portfolio of EarthCube-created and other geoscience-related resources. Over 1 million metadata records are available for discovery through the project portal (cinergi.sdsc.edu). These records are retrieved from data facilities, including federal, state and academic sources, or contributed by geoscientists through workshops, surveys, or other channels. CINERGI metadata augmentation pipeline components 1) provide semantic enhancement based on a large ontology of geoscience terms, using text analytics to generate keywords with references to ontology classes, 2) add spatial extents based on place names found in the metadata record, and 3) add organization identifiers to the metadata. The records are indexed and can be searched via a web portal and standard search APIs. The added metadata content improves discoverability and interoperability of the registered resources. Specifically, the addition of ontology-anchored keywords enables faceted browsing and lets users navigate to datasets related by variables measured, equipment used, science domain, processes described, geospatial features studied, and other dataset characteristics that are generated by the pipeline. DDH also lets data curators access and edit the automatically generated metadata records using the CINERGI metadata editor, accept or reject the enhanced metadata content, and consider it in updating their metadata descriptions. We consider several complex data discovery workflows, in environmental seismology (quantifying sediment and water fluxes using seismic data), marine biology (determining available temperature, location, weather and bleaching characteristics of coral reefs related to measurements in a given coral reef survey), and river geochemistry (discovering
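
The semantic-enhancement step described, generating ontology-anchored keywords from free-text metadata, can be illustrated with a minimal sketch. The term-to-class mapping, record fields, and URIs below are invented for illustration; the actual CINERGI pipeline uses a large geoscience ontology and full text analytics.

```python
# Tiny illustrative term -> ontology-class mapping (invented URIs).
ONTOLOGY = {
    "seismic": "http://example.org/cinergi#Seismology",
    "temperature": "http://example.org/cinergi#Temperature",
    "coral reef": "http://example.org/cinergi#CoralReef",
}

def enhance(record):
    """Return a copy of a metadata record with ontology-anchored keywords
    added wherever a known term appears in its free-text abstract."""
    text = record.get("abstract", "").lower()
    keywords = [{"term": t, "class": c} for t, c in ONTOLOGY.items() if t in text]
    out = dict(record)
    out["enhanced_keywords"] = keywords
    return out

rec = enhance({"title": "Reef survey", "abstract": "Coral reef temperature logs."})
```

Keywords added this way can then drive faceted browsing: each ontology class becomes a facet under which the enhanced records are indexed.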

  1. Mosaic organization of the hippocampal neuroepithelium and the multiple germinal sources of dentate granule cells

    International Nuclear Information System (INIS)

    Altman, J.; Bayer, S.A.

    1990-01-01

    This study deals with the site of origin, migration, and settling of the principal cell constituents of the rat hippocampus during the embryonic period. The results indicate that the hippocampal neuroepithelium consists of three morphogenetically discrete components--the Ammonic neuroepithelium, the primary dentate neuroepithelium, and the fimbrial glioepithelium--and that these are discrete sources of the large neurons of Ammon's horn, the smaller granular neurons of the dentate gyrus, and the glial cells of the fimbria. The putative Ammonic neuroepithelium is marked in short-survival thymidine radiograms by a high level of proliferative activity and evidence of interkinetic nuclear migration from day E16 until day E19. On days E16 and E17 a diffuse band of unlabeled cells forms outside the Ammonic neuroepithelium. These postmitotic cells are considered to be stratum radiatum and stratum oriens neurons, which are produced in large numbers as early as day E15. A cell-dense layer, the incipient stratum pyramidale, begins to form on day E18 and spindle-shaped cells can be traced to it from the Ammonic neuroepithelium. This migratory band increases in size for several days, then declines, and finally disappears by day E22. It is inferred that this migration contains the pyramidal cells of Ammon's horn that are produced mostly on days E17 through E20. The putative primary dentate neuroepithelium is distinguished from the Ammonic neuroepithelium during the early phases of embryonic development by its location, shape, and cellular dynamics. It is located around a ventricular indentation, the dentate notch, contains fewer mitotic cells near the lumen of the ventricle than the Ammonic neuroepithelium, and shows a different labeling pattern both in short-survival and sequential-survival thymidine radiograms

  2. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman speaks without bias and prejudice for the public good; technical jargon with unclear definitions exists within the radioactive nomenclature; and the scientific community keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the federal and state health agencies' resources to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  3. Cr(VI) reduction capacity of activated sludge as affected by nitrogen and carbon sources, microbial acclimation and cell multiplication

    International Nuclear Information System (INIS)

    Ferro Orozco, A.M.; Contreras, E.M.; Zaritzky, N.E.

    2010-01-01

    The objectives of the present work were: (i) to analyze the capacity of activated sludge to reduce hexavalent chromium using different carbon sources as electron donors in batch reactors, (ii) to determine the relationship between biomass growth and the amount of Cr(VI) reduced, considering the effect of the nitrogen to carbon source ratio, and (iii) to determine the effect of the Cr(VI) acclimation stage on the performance of the biological chromium reduction, assessing the stability of the Cr(VI) reduction capacity of the activated sludge. The highest specific Cr(VI) removal rate (qCr) was attained with cheese whey or lactose as electron donors, decreasing in the following order: cheese whey ∼ lactose > glucose > citrate > acetate. Batch assays with different nitrogen to carbon source ratios demonstrated that biological Cr(VI) reduction is associated with the cell multiplication phase; as a result, maximum Cr(VI) removal rates occur when there is no substrate limitation. The biomass can be acclimated to the presence of Cr(VI) and generate new cells that maintain the ability to reduce chromate. Therefore, the activated sludge process could be applied to a continuous Cr(VI) removal process.

  4. NoSQL data model for semi-automatic integration of ethnomedicinal plant data from multiple sources.

    Science.gov (United States)

    Ningthoujam, Sanjoy Singh; Choudhury, Manabendra Dutta; Potsangbam, Kumar Singh; Chetia, Pankaj; Nahar, Lutfun; Sarker, Satyajit D; Basar, Norazah; Das Talukdar, Anupam

    2014-01-01

    Sharing traditional knowledge with the scientific community could refine scientific approaches to phytochemical investigation and conservation of ethnomedicinal plants. As such, integration of traditional knowledge with scientific data using a single platform for sharing is greatly needed. However, ethnomedicinal data are available in heterogeneous formats, which depend on cultural aspects, survey methodology and focus of the study. Phytochemical and bioassay data are also available from many open sources in various standards and customised formats. To design a flexible data model that could integrate both primary and curated ethnomedicinal plant data from multiple sources. The current model is based on MongoDB, one of the Not only Structured Query Language (NoSQL) databases. Although it does not contain schema, modifications were made so that the model could incorporate both standard and customised ethnomedicinal plant data format from different sources. The model presented can integrate both primary and secondary data related to ethnomedicinal plants. Accommodation of disparate data was accomplished by a feature of this database that supported a different set of fields for each document. It also allowed storage of similar data having different properties. The model presented is scalable to a highly complex level with continuing maturation of the database, and is applicable for storing, retrieving and sharing ethnomedicinal plant data. It can also serve as a flexible alternative to a relational and normalised database. Copyright © 2014 John Wiley & Sons, Ltd.
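
The flexibility described — documents in one collection carrying different sets of fields — can be illustrated with a minimal in-memory stand-in for a schema-less document collection. The class, field names, and records below are invented for illustration; the actual model uses MongoDB itself.

```python
class DocumentCollection:
    """Tiny schema-less store: each document is a dict with its own fields."""
    def __init__(self):
        self.docs = []

    def insert_one(self, doc):
        self.docs.append(dict(doc))

    def find(self, **criteria):
        return [d for d in self.docs
                if all(d.get(k) == v for k, v in criteria.items())]

plants = DocumentCollection()
# A primary survey record and a curated phytochemical record coexist in the
# same collection despite different properties -- no shared schema required.
plants.insert_one({"species": "Ocimum sanctum", "local_name": "tulsi",
                   "ailments": ["fever", "cough"]})
plants.insert_one({"species": "Ocimum sanctum",
                   "compound": "eugenol", "assay": "antimicrobial"})
```

Querying on the shared `species` field then aggregates heterogeneous records for one plant, which is the integration behaviour the model exploits.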

  5. Scales of gravity

    International Nuclear Information System (INIS)

    Dvali, Gia; Kolanovic, Marko; Nitti, Francesco; Gabadadze, Gregory

    2002-01-01

    We propose a framework in which the quantum gravity scale can be as low as 10 -3 eV. The key assumption is that the standard model ultraviolet cutoff is much higher than the quantum gravity scale. This ensures that we observe conventional weak gravity. We construct an explicit brane-world model in which the brane-localized standard model is coupled to strong 5D gravity of infinite-volume flat extra space. Because of the high ultraviolet scale, the standard model fields generate a large graviton kinetic term on the brane. This kinetic term 'shields' the standard model from the strong bulk gravity. As a result, an observer on the brane sees weak 4D gravity up to astronomically large distances beyond which gravity becomes five dimensional. Modeling quantum gravity above its scale by the closed string spectrum we show that the shielding phenomenon protects the standard model from an apparent phenomenological catastrophe due to the exponentially large number of light string states. The collider experiments, astrophysics, cosmology and gravity measurements independently point to the same lower bound on the quantum gravity scale, 10 -3 eV. For this value the model has experimental signatures both for colliders and for submillimeter gravity measurements. Black holes reveal certain interesting properties in this framework

  6. Estimating average alcohol consumption in the population using multiple sources: the case of Spain.

    Science.gov (United States)

    Sordo, Luis; Barrio, Gregorio; Bravo, María J; Villalbí, Joan R; Espelt, Albert; Neira, Montserrat; Regidor, Enrique

    2016-01-01

    National estimates on per capita alcohol consumption are provided regularly by various sources and may have validity problems, so corrections are needed for monitoring and assessment purposes. Our objectives were to compare different alcohol availability estimates for Spain, to build the best estimate (actual consumption), characterize its time trend during 2001-2011, and quantify the extent to which other estimates (coverage) approximated actual consumption. Estimates were: alcohol availability from the Spanish Tax Agency (Tax Agency availability), World Health Organization (WHO availability) and other international agencies, self-reported purchases from the Spanish Food Consumption Panel, and self-reported consumption from population surveys. Analyses included calculating: between-agency discrepancy in availability, multisource availability (correcting Tax Agency availability by underestimation of wine and cider), actual consumption (adjusting multisource availability by unrecorded alcohol consumption/purchases and alcohol losses), and coverage of selected estimates. Sensitivity analyses were undertaken. Time trends were characterized by joinpoint regression. Between-agency discrepancy in alcohol availability remained high in 2011, mainly because of wine and spirits, although some decrease was observed during the study period. The actual consumption was 9.5 l of pure alcohol/person-year in 2011, decreasing 2.3 % annually, mainly due to wine and spirits. 2011 coverage of WHO availability, Tax Agency availability, self-reported purchases, and self-reported consumption was 99.5, 99.5, 66.3, and 28.0 %, respectively, generally with downward trends (last three estimates, especially self-reported consumption). The multisource availability overestimated actual consumption by 12.3 %, mainly due to tourism imbalance. Spanish estimates of per capita alcohol consumption show considerable weaknesses. Using uncorrected estimates, especially self-reported consumption, for

  7. Contrasts between estimates of baseflow help discern multiple sources of water contributing to rivers

    Science.gov (United States)

    Cartwright, I.; Gilfedder, B.; Hofmann, H.

    2014-01-01

    This study compares baseflow estimates using chemical mass balance, local minimum methods, and recursive digital filters in the upper reaches of the Barwon River, southeast Australia. During the early stages of high-discharge events, the chemical mass balance overestimates groundwater inflows, probably due to flushing of saline water from wetlands and marshes, soils, or the unsaturated zone. Overall, however, estimates of baseflow from the local minimum and recursive digital filters are higher than those based on chemical mass balance using Cl calculated from continuous electrical conductivity measurements. Between 2001 and 2011, the baseflow contribution to the upper Barwon River calculated using chemical mass balance is between 12 and 25% of the annual discharge with a net baseflow contribution of 16% of total discharge. Recursive digital filters predict higher baseflow contributions of 19 to 52% of discharge annually with a net baseflow contribution between 2001 and 2011 of 35% of total discharge. These estimates are similar to those from the local minimum method (16 to 45% of annual discharge and 26% of total discharge). These differences most probably reflect how the different techniques characterise baseflow. The local minimum and recursive digital filters probably aggregate much of the water from delayed sources as baseflow. However, as many delayed transient water stores (such as bank return flow, floodplain storage, or interflow) are likely to be geochemically similar to surface runoff, chemical mass balance calculations aggregate them with the surface runoff component. The difference between the estimates is greatest following periods of high discharge in winter, implying that these transient stores of water feed the river for several weeks to months at that time. Cl vs. discharge variations during individual flow events also demonstrate that inflows of high-salinity older water occurs on the rising limbs of hydrographs followed by inflows of low
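
A recursive digital filter of the kind compared here can be sketched in a few lines. The single-pass, one-parameter Lyne-Hollick form below (with the conventional α = 0.925) and the flow series are a generic illustration, not the specific filter configuration or data used in the study.

```python
def baseflow_filter(q, alpha=0.925):
    """Single forward pass of the Lyne-Hollick recursive digital filter.
    Splits a streamflow series q into a baseflow series, constraining
    0 <= baseflow <= q at every step."""
    quick = 0.0
    base = []
    prev_q = q[0]
    for qt in q:
        quick = alpha * quick + 0.5 * (1.0 + alpha) * (qt - prev_q)
        quick = min(max(quick, 0.0), qt)  # keep quickflow physically sensible
        base.append(qt - quick)
        prev_q = qt
    return base

flows = [1.0, 1.2, 5.0, 9.0, 6.0, 3.5, 2.0, 1.5, 1.2, 1.1]  # invented series
base = baseflow_filter(flows)
bfi = sum(base) / sum(flows)  # baseflow index, between 0 and 1
```

Because the filter operates on discharge alone, any delayed store (bank return flow, interflow) that smooths the hydrograph is counted as baseflow — which is exactly why its estimates exceed the chemical mass balance ones in this study.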

  8. Construction of a single/multiple wavelength RZ optical pulse source at 40 GHz by use of wavelength conversion in a high-nonlinearity DSF-NOLM

    DEFF Research Database (Denmark)

    Yu, Jianjun; Yujun, Qian; Jeppesen, Palle

    2001-01-01

    A single or multiple wavelength RZ optical pulse source at 40 GHz is successfully obtained by using wavelength conversion in a nonlinear optical loop mirror consisting of high-nonlinearity dispersion-shifted fiber.

  9. Einstein gravity emerging from quantum Weyl gravity

    International Nuclear Information System (INIS)

    Zee, A.

    1983-01-01

    We advocate a conformal invariant world described by the sum of the Weyl, Dirac, and Yang-Mills action. Quantum fluctuations bring back Einstein gravity so that the long-distance phenomenology is as observed. Formulas for the induced Newton's constant and Eddington's constant are derived in quantized Weyl gravity. We show that only generalizations of the U(1) Pontrjagin four-form F∧F, with F = dC, arise in the chiral anomaly, even when coupled to gravity. The analogue of the trace anomaly for the Weyl action is structurally similar to that for the Yang-Mills action

  10. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    Science.gov (United States)

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high-throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
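
The core quantitative step — integrating the ion intensity of co-eluting heavy and light transitions and taking their ratio — can be sketched independently of the tool. The synthetic peak values below are invented for illustration; MRMer computes these areas from real mzXML chromatograms.

```python
def integrate(times, intens):
    """Trapezoidal integration of an extracted-ion chromatogram."""
    area = 0.0
    for i in range(1, len(times)):
        area += 0.5 * (intens[i] + intens[i - 1]) * (times[i] - times[i - 1])
    return area

t = [0.0, 0.5, 1.0, 1.5, 2.0]            # retention time, minutes
light = [0.0, 40.0, 100.0, 40.0, 0.0]    # light-peptide transition intensities
heavy = [0.0, 20.0, 50.0, 20.0, 0.0]     # heavy (isotope-labelled) standard

ratio = integrate(t, light) / integrate(t, heavy)
```

Because the heavy standard is spiked at known concentration, the light/heavy area ratio converts directly into an absolute abundance for the endogenous peptide.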

  11. Quantum electrodynamics of the internal source x-ray holographies: Bremsstrahlung, fluorescence, and multiple-energy x-ray holography

    International Nuclear Information System (INIS)

    Miller, G.A.; Sorensen, L.B.

    1997-01-01

    Quantum electrodynamics (QED) is used to derive the differential cross sections measured in the three new experimental internal source ensemble x-ray holographies: bremsstrahlung (BXH), fluorescence (XFH), and multiple-energy (MEXH) x-ray holography. The polarization dependence of the BXH cross section is also obtained. For BXH, we study analytically and numerically the possible effects of the virtual photons and electrons which enter QED calculations in summing over the intermediate states. For the low photon and electron energies used in the current experiments, we show that the virtual intermediate states produce only very small effects. This is because the uncertainty principle limits the distance that the virtual particles can propagate to be much shorter than the separation between the regions of high electron density in the adjacent atoms. We also find that using the asymptotic form of the scattering wave function causes about a 5 10% error for near forward scattering. copyright 1997 The American Physical Society

  12. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of the ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under the different output service schemes.

  13. Gravity Compensation Technique Uses Small dc Motor

    Science.gov (United States)

    Hollow, Richard

    1988-01-01

    Small dc servomotor powered by simple constant-current source and with suitable gearing used to cancel effect of gravity upon load. Lead-screw positioning system has load counterbalanced by small supplementary motor powered by constant current source. Motor lighter and more compact alternative to counterbalance. Used in variety of mechanical systems where load positioned or accelerated in vertical plane.

  14. Circular symmetry in topologically massive gravity

    International Nuclear Information System (INIS)

    Deser, S; Franklin, J

    2010-01-01

    We re-derive, compactly, a topologically massive gravity (TMG) decoupling theorem: source-free TMG separates into its Einstein and Cotton sectors for spaces with a hypersurface-orthogonal Killing vector, here concretely for circular symmetry. We then generalize the theorem to include matter; surprisingly, the single Killing symmetry also forces conformal invariance, requiring the sources to be null. (note)

  15. NOTE: Circular symmetry in topologically massive gravity

    Science.gov (United States)

    Deser, S.; Franklin, J.

    2010-05-01

    We re-derive, compactly, a topologically massive gravity (TMG) decoupling theorem: source-free TMG separates into its Einstein and Cotton sectors for spaces with a hypersurface-orthogonal Killing vector, here concretely for circular symmetry. We then generalize the theorem to include matter; surprisingly, the single Killing symmetry also forces conformal invariance, requiring the sources to be null.

  16. Circular symmetry in topologically massive gravity

    Energy Technology Data Exchange (ETDEWEB)

    Deser, S [Physics Department, Brandeis University, Waltham, MA 02454 (United States); Franklin, J, E-mail: deser@brandeis.ed, E-mail: jfrankli@reed.ed [Reed College, Portland, OR 97202 (United States)

    2010-05-21

    We re-derive, compactly, a topologically massive gravity (TMG) decoupling theorem: source-free TMG separates into its Einstein and Cotton sectors for spaces with a hypersurface-orthogonal Killing vector, here concretely for circular symmetry. We then generalize the theorem to include matter; surprisingly, the single Killing symmetry also forces conformal invariance, requiring the sources to be null. (note)

  17. Soft collinear effective theory for gravity

    Science.gov (United States)

    Okui, Takemichi; Yunesi, Arash

    2018-03-01

    We present how to construct a soft collinear effective theory (SCET) for gravity at the leading and next-to-leading powers from the ground up. The soft graviton theorem and decoupling of collinear gravitons at the leading power are manifest from the outset in the effective symmetries of the theory. At the next-to-leading power, certain simple structures of amplitudes, which are completely obscure in Feynman diagrams of the full theory, are also revealed, which greatly simplifies calculations. The effective Lagrangian is highly constrained by effectively multiple copies of diffeomorphism invariance that are inevitably present in gravity SCET due to mode separation, an essential ingredient of any SCET. Further explorations of effective theories of gravity with mode separation may shed light on Lagrangian-level understandings of some of the surprising properties of gravitational scattering amplitudes. A gravity SCET with an appropriate inclusion of Glauber modes may serve as a powerful tool for studying gravitational scattering in the Regge limit.

  18. On the entropy variation in the scenario of entropic gravity

    Science.gov (United States)

    Xiao, Yong; Bai, Shi-Yang

    2018-05-01

    In the scenario of entropic gravity, entropy varies as a function of the location of the matter, while the tendency to increase entropy appears as gravity. We concentrate on studying the entropy variation of a typical gravitational system with different relative positions between the mass and the gravitational source. The result is that the entropy of the system doesn't increase when the mass is displaced closer to the gravitational source. In this way it disproves the proposal of entropic gravity from thermodynamic entropy. It doesn't exclude the possibility that gravity originates from non-thermodynamic entropy like entanglement entropy.

  19. Anomalies and gravity

    International Nuclear Information System (INIS)

    Mielke, Eckehard W.

    2006-01-01

    Anomalies in Yang-Mills type gauge theories of gravity are reviewed. Particular attention is paid to the relation between the Dirac spin, the axial current j5 and the non-covariant gauge spin C. Using diagrammatic techniques, we show that only generalizations of the U(1)- Pontrjagin four-form F and F = dC arise in the chiral anomaly, even when coupled to gravity. Implications for Ashtekar's canonical approach to quantum gravity are discussed

  20. Magnetic Fields Versus Gravity

    Science.gov (United States)

    Hensley, Kerry

    2018-04-01

    polarized emission toward all three sources. By extracting the magnetic field orientations from the polarization vectors, Koch and collaborators found that the molecular cloud contains an ordered magnetic field with never-before-seen structures. Several small clumps on the perimeter of the massive star-forming cores exhibit comet-shaped magnetic field structures, which could indicate that these smaller cores are being pulled toward the more massive cores. These findings hint that the magnetic field structure can tell us about the flow of material within star-forming regions, key to understanding the nature of star formation itself. Maps of sin ω for two of the protostars (e2 and e8) and their surroundings. [Adapted from Koch et al. 2018] Guiding Star Formation: Do the magnetic fields in W51 help or hinder star formation? To explore this question, Koch and collaborators introduced the quantity sin ω, where ω is the angle between the local gravity and the local magnetic field. When the angle between gravity and the magnetic field is small (sin ω ≈ 0), the magnetic field has little effect on the collapse of the cloud. If gravity and the magnetic field are perpendicular (sin ω ≈ 1), the magnetic field can slow the infall of gas and inhibit star formation. Based on this parameter, Koch and collaborators identified narrow channels where gravity acts unimpeded by the magnetic field. These magnetic channels may funnel gas toward the dense cores and aid the star-formation process. The authors' observations demonstrate just one example of the broad realm ALMA's polarimetry capabilities have opened to discovery. These and future observations of dust polarization will continue to reveal more about the delicate magnetic structure within molecular clouds, further illuminating the role that magnetic fields play in star formation. Citation: Patrick M. Koch et al. 2018 ApJ 855 39. doi:10.3847/1538-4357/aaa4c1
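    The sin ω diagnostic described above can be computed directly from local gravity and magnetic field orientation vectors. The helper below is a hypothetical illustration of the geometry, not the authors' code:

```python
import numpy as np

def sin_omega(g_vec, b_vec):
    """Sine of the angle omega between the local gravity vector and the
    local magnetic field orientation.

    sin(omega) ~ 0: field nearly parallel to gravity, little resistance
    to collapse (a "magnetic channel").
    sin(omega) ~ 1: field perpendicular to gravity, can slow infall.
    """
    g = np.asarray(g_vec, dtype=float)
    b = np.asarray(b_vec, dtype=float)
    cross = np.linalg.norm(np.cross(g, b))
    return cross / (np.linalg.norm(g) * np.linalg.norm(b))

# Parallel vectors: unimpeded collapse along a magnetic channel.
print(sin_omega([0, 0, -1], [0, 0, -2]))  # 0.0
# Perpendicular vectors: the field can inhibit star formation.
print(sin_omega([0, 0, -1], [1, 0, 0]))   # 1.0
```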

  1. influence of gravity

    Directory of Open Access Journals (Sweden)

    Animesh Mukherjee

    1991-01-01

    Full Text Available Based upon Biot's [1965] theory of initial stresses of hydrostatic nature produced by the effect of gravity, a study is made of surface waves in higher order visco-elastic media under the influence of gravity. The equation for the wave velocity of Stonely waves in the presence of viscous and gravitational effects is obtained. This is followed by particular cases of surface waves including Rayleigh waves and Love waves in the presence of viscous and gravity effects. In all cases the wave-velocity equations are found to be in perfect agreement with the corresponding classical results when the effects of gravity and viscosity are neglected.

  2. Classical Weyl transverse gravity

    Energy Technology Data Exchange (ETDEWEB)

    Oda, Ichiro [University of the Ryukyus, Department of Physics, Faculty of Science, Nishihara, Okinawa (Japan)

    2017-05-15

    We study various classical aspects of the Weyl transverse (WTDiff) gravity in a general space-time dimension. First of all, we clarify a classical equivalence among three kinds of gravitational theories, namely the conformally invariant scalar-tensor gravity, Einstein's general relativity and the WTDiff gravity, via the gauge-fixing procedure. Secondly, we show that in the WTDiff gravity the cosmological constant is a mere integration constant, as in unimodular gravity, but it does not receive any radiative corrections, unlike in unimodular gravity. A key point in this proof is to construct a covariantly conserved energy-momentum tensor, which is achieved on the basis of this equivalence relation. Thirdly, we demonstrate that the Noether current for the Weyl transformation is identically vanishing, thereby implying that the Weyl symmetry existing in both the conformally invariant scalar-tensor gravity and the WTDiff gravity is a ''fake'' symmetry. We find it possible to extend this proof to all matter fields, i.e. the Weyl-invariant scalar, vector and spinor fields. Fourthly, it is explicitly shown that in the WTDiff gravity the Schwarzschild black hole metric and a charged black hole metric are classical solutions to the equations of motion only when they are expressed in the Cartesian coordinate system. Finally, we consider the Friedmann-Lemaitre-Robertson-Walker (FLRW) cosmology and provide some exact solutions. (orig.)

  3. TH-CD-207B-01: BEST IN PHYSICS (IMAGING): Development of High Brightness Multiple-Pixel X-Ray Source Using Oxide Coated Cathodes

    International Nuclear Information System (INIS)

    Kandlakunta, P; Pham, R; Zhang, T

    2016-01-01

    Purpose: To develop and characterize a high brightness multiple-pixel thermionic emission x-ray (MPTEX) source. Methods: Multiple-pixel x-ray sources allow for designs of novel x-ray imaging techniques, such as fixed-gantry CT, digital tomosynthesis, tetrahedron beam computed tomography, etc. We are developing a high-brightness multiple-pixel thermionic emission x-ray (MPTEX) source based on oxide coated cathodes. An oxide cathode is chosen as the electron source due to its high emission current density and low operating temperature. A MPTEX prototype has been developed which may contain up to 41 micro-rectangular oxide cathodes at 4 mm pixel spacing. Electronics hardware was developed for source control and switching. The cathode emission current was evaluated and x-ray measurements were performed to estimate the focal spot size. Results: The oxide cathodes were able to produce ∼110 mA cathode current in pulse mode, which corresponds to an emission current density of 0.55 A/cm². The maximum kVp of the MPTEX prototype is currently limited to 100 kV due to the rating of the high voltage feedthrough. Preliminary x-ray measurements estimated the focal spot size as 1.5 × 1.3 mm². Conclusion: A MPTEX source was developed with thermionic oxide coated cathodes and preliminary source characterization was successfully performed. The MPTEX source is able to produce an array of high brightness x-ray beams with a fast switching speed.
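    As a quick sanity check on the reported figures, the pulse current and emission current density together imply the cathode emitting area; the area below is an inference from those two numbers, not a value stated in the abstract:

```python
# Back-of-envelope check of the reported MPTEX cathode figures.
# The emitting area is inferred from the two reported numbers,
# not given in the abstract.
current_A = 110e-3          # ~110 mA pulse-mode cathode current
density_A_per_cm2 = 0.55    # reported emission current density, A/cm^2
implied_area_cm2 = current_A / density_A_per_cm2
print(implied_area_cm2)     # ~0.2 cm^2 per cathode
```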

  4. Multiple source genes of HAmo SINE actively expanded and ongoing retroposition in cyprinid genomes relying on its partner LINE

    Directory of Open Access Journals (Sweden)

    Gan Xiaoni

    2010-04-01

    Full Text Available Abstract Background We recently characterized HAmo SINE and its partner LINE in silver carp and bighead carp based on hybridization capture of repetitive elements from digested genomic DNA in solution using a bead-probe [1]. To reveal the distribution and evolutionary history of SINEs and LINEs in cyprinid genomes, we performed a multi-species search for HAmo SINE and its partner LINE using the bead-probe capture and internal-primer-SINE polymerase chain reaction (PCR) techniques. Results Sixty-seven full-size and 125 internal-SINE sequences (as well as 34 full-size and 9 internal sequences previously reported in bighead carp and silver carp) from 17 species of the family Cyprinidae were aligned, as well as 14 newly isolated HAmoL2 sequences. Four subfamilies (type I, II, III and IV), which were divided based on diagnostic nucleotides in the tRNA-unrelated region, expanded preferentially within a certain lineage or within the whole family of Cyprinidae as multiple active source genes. The copy numbers of HAmo SINEs were estimated to vary from 10⁴ to 10⁶ in cyprinid genomes by quantitative RT-PCR. Over one hundred type IV members were identified and characterized in the primitive cyprinid Danio rerio genome, but only tens of sequences were found to be similar to types I, II and III, since type IV was the oldest subfamily and its members dispersed in almost all investigated cyprinid fishes. To determine the taxonomic distribution of HAmo SINE, inter-primer SINE PCR was conducted in other non-cyprinid fishes; the results show that HAmo SINE-related sequences may disperse in other families of the order Cypriniformes but are absent in other orders of bony fishes: Siluriformes, Polypteriformes, Lepidosteiformes, Acipenseriformes and Osteoglossiformes. Conclusions Depending on HAmo LINE2, multiple source genes (subfamilies of HAmo SINE actively expanded and underwent retroposition in a certain lineage or within the whole family of Cyprinidae. From this

  5. Interior Alaska Bouguer Gravity Anomaly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 1 kilometer Complete Bouguer Anomaly gravity grid of interior Alaska. Only those grid cells within 10 kilometers of a gravity data point have gravity values....

  6. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of the argument to more complex systems.

  7. Generalized pure Lovelock gravity

    Science.gov (United States)

    Concha, Patrick; Rodríguez, Evelyn

    2017-11-01

    We present a generalization of the n-dimensional (pure) Lovelock Gravity theory based on an enlarged Lorentz symmetry. In particular, we propose an alternative way to introduce a cosmological term. Interestingly, we show that the usual pure Lovelock gravity is recovered in a matter-free configuration. The five- and six-dimensional cases are explicitly studied.

  8. Generalized pure Lovelock gravity

    Directory of Open Access Journals (Sweden)

    Patrick Concha

    2017-11-01

    Full Text Available We present a generalization of the n-dimensional (pure) Lovelock Gravity theory based on an enlarged Lorentz symmetry. In particular, we propose an alternative way to introduce a cosmological term. Interestingly, we show that the usual pure Lovelock gravity is recovered in a matter-free configuration. The five- and six-dimensional cases are explicitly studied.

  9. Discovering perturbation of modular structure in HIV progression by integrating multiple data sources through non-negative matrix factorization.

    Science.gov (United States)

    Ray, Sumanta; Maulik, Ujjwal

    2016-12-20

    Detecting perturbation in modular structure during HIV-1 disease progression is an important step toward understanding the stage-specific infection pattern of the HIV-1 virus in human cells. In this article, we propose a novel methodology that integrates multiple sources of biological information to identify such disruption in human gene modules during different stages of HIV-1 infection. We integrate three different types of biological information: gene expression information, protein-protein interaction (PPI) information and gene ontology (GO) information, into single gene meta-modules, through non-negative matrix factorization (NMF). Because the identified meta-modules inherit this information, detecting their perturbation reflects the changes in expression pattern, in PPI structure and in functional similarity of genes during the infection's progression. To integrate modules from the different data sources into strong meta-modules, NMF-based clustering is utilized here. Perturbation in meta-modular structure is identified by investigating the topological and intramodular properties and ranking the meta-modules with a rank aggregation algorithm. We have also analyzed the preservation structure of significant GO terms in which the human proteins of the meta-modules participate. Moreover, we have performed an analysis showing the change of co-regulation pattern of identified transcription factors (TFs) over the HIV progression stages.
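    The integration step described above can be sketched with an off-the-shelf NMF. This is a minimal illustration on synthetic, non-negative matrices stacked feature-wise, not the authors' pipeline (their method works on real expression, PPI and GO data and aggregates ranks afterwards):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_genes = 60
# Three hypothetical gene-level data sources (all non-negative),
# stacked feature-wise so one factorization spans all of them.
expr = rng.random((n_genes, 10))   # stand-in for expression features
ppi  = rng.random((n_genes, 8))    # stand-in for PPI-derived features
go   = rng.random((n_genes, 12))   # stand-in for GO-derived features
X = np.hstack([expr, ppi, go])

# W gives each gene's loading on the meta-modules; a hard assignment
# takes the strongest loading per gene.
model = NMF(n_components=4, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(X)
meta_module = W.argmax(axis=1)
print(W.shape)  # (60, 4)
```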

  10. Stochastic Gravity: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Hu Bei Lok

    2004-01-01

    Full Text Available Whereas semiclassical gravity is based on the semiclassical Einstein equation with sources given by the expectation value of the stress-energy tensor of quantum fields, stochastic semiclassical gravity is based on the Einstein-Langevin equation, which has in addition sources due to the noise kernel. The noise kernel is the vacuum expectation value of the (operator-valued) stress-energy bi-tensor, which describes the fluctuations of quantum matter fields in curved spacetimes. In the first part, we describe the fundamentals of this new theory via two approaches: the axiomatic and the functional. The axiomatic approach is useful for seeing the structure of the theory from the framework of semiclassical gravity, showing the link from the mean value of the stress-energy tensor to its correlation functions. The functional approach uses the Feynman-Vernon influence functional and the Schwinger-Keldysh closed-time-path effective action methods, which are convenient for computations. It also brings out the open-systems concepts and the statistical and stochastic contents of the theory, such as dissipation, fluctuations, noise, and decoherence. We then focus on the properties of the stress-energy bi-tensor. We obtain a general expression for the noise kernel of a quantum field defined at two distinct points in an arbitrary curved spacetime as products of covariant derivatives of the quantum field's Green function. In the second part, we describe three applications of stochastic gravity theory. First, we consider metric perturbations in a Minkowski spacetime. We offer an analytical solution of the Einstein-Langevin equation and compute the two-point correlation functions for the linearized Einstein tensor and for the metric perturbations. Second, we discuss structure formation from the stochastic gravity viewpoint, which can go beyond the standard treatment by incorporating the full quantum effect of the inflaton fluctuations. Third, we discuss the backreaction

  11. Lattice gravity and strings

    International Nuclear Information System (INIS)

    Jevicki, A.; Ninomiya, M.

    1985-01-01

    We are concerned with applications of the simplicial discretization method (Regge calculus) to two-dimensional quantum gravity, with emphasis on the physically relevant string model. Beginning with the discretization of gravity and matter, we exhibit a discrete version of the conformal trace anomaly. Proceeding to the string problem, we show how the direct approach of (finite-difference) discretization based on the Nambu action corresponds to an unsatisfactory treatment of the gravitational degrees of freedom. Based on the Regge approach, we then propose a discretization corresponding to the Polyakov string. In this context we are led to a natural geometric version of the associated Liouville model and two-dimensional gravity. (orig.)

  12. The Future of Gravity

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    Of the four fundamental forces, gravity has been studied the longest, yet gravitational physics is one of the most rapidly developing areas of science today. This talk will give a broad-brush survey of the past achievements and future prospects of general relativistic gravitational physics. Gravity is a two-frontier science, important on both the very largest and the smallest length scales considered in contemporary physics. Recent advances and future prospects will be surveyed in precision tests of general relativity, gravitational waves, black holes, cosmology and quantum gravity. The aim will be an overview of a subject that is becoming increasingly integrated with experiment and other branches of physics.

  13. Scaling in quantum gravity

    Directory of Open Access Journals (Sweden)

    J. Ambjørn

    1995-07-01

    Full Text Available The 2-point function is the natural object in quantum gravity for extracting critical behavior: The exponential falloff of the 2-point function with geodesic distance determines the fractal dimension dH of space-time. The integral of the 2-point function determines the entropy exponent γ, i.e. the fractal structure related to baby universes, while the short distance behavior of the 2-point function connects γ and dH by a quantum gravity version of Fisher's scaling relation. We verify this behavior in the case of 2d gravity by explicit calculation.

  14. Contextualized perceptions of movement as a source of expanded insight: People with multiple sclerosis' experience with physiotherapy.

    Science.gov (United States)

    Normann, Britt; Sørgaard, Knut W; Salvesen, Rolf; Moe, Siri

    2013-01-01

    Hospital outpatient clinics are an important part of the health care of people with multiple sclerosis (PwMS), yet research regarding physiotherapy in such clinics is limited. The purpose was to investigate how PwMS perceive movement during single sessions of physiotherapy in a hospital outpatient clinic, and what these experiences mean for patients' insight into their movement disturbances. Qualitative research interviews were performed with a purposive sample of 12 PwMS and supplemented with seven videotaped sessions. Content analysis was performed. The results indicate that contextualized perceptions of movement appear to be an essential source for PwMS to gain expanded insight into their individual movement disturbances, regardless of ambulatory status. The contextualization implies that perceptions of movement are integrated with the physiotherapist's explanations regarding optimizing gait and balance or other activities of daily life. Perceptions of improvement in body part movement and/or functional activities are vital to enhancing patients' understanding of their individual movement disorders, and they may provide expanded insight regarding future possibilities and limitations involving everyday tasks. The implementation of movements, which transforms the perceived improvement into self-assisted exercises, appeared to be meaningful. Contextualized perceptions of improvements in movement may strengthen the person's sense of ownership and sense of agency, and thus promote autonomy and self-encouragement. The findings underpin the importance of contextualized perceptions of movement, based on exploration of the potential for change, as an integrated part of information and communication in the health care of PwMS. Further investigations are necessary to deepen our knowledge.

  15. Glacier changes and climate trends derived from multiple sources in the data scarce Cordillera Vilcanota region, southern Peruvian Andes

    Science.gov (United States)

    Salzmann, N.; Huggel, C.; Rohrer, M.; Silverio, W.; Mark, B. G.; Burns, P.; Portocarrero, C.

    2013-01-01

    The role of glaciers as temporal water reservoirs is particularly pronounced in the (outer) tropics because of the very distinct wet/dry seasons. Rapid glacier retreat caused by climatic changes is thus a major concern, and decision makers urgently demand regional/local glacier evolution trends, ice mass estimates and runoff assessments. However, in remote mountain areas, spatial and temporal data coverage is typically very scarce, and this is further complicated by high spatial and temporal variability in regions with complex topography. Here, we present an approach for dealing with these constraints. For the Cordillera Vilcanota (southern Peruvian Andes), which is the second largest glacierized cordillera in Peru (after the Cordillera Blanca) and also comprises the Quelccaya Ice Cap, we assimilate a comprehensive multi-decadal collection of available glacier and climate data from multiple sources (satellite images, meteorological station data and climate reanalysis), and analyze them for changes in glacier area and volume and for related trends in air temperature, precipitation and, more generally, specific humidity. While we found only marginal glacier changes between 1962 and 1985, there has been a massive ice loss since 1985 (about 30% of area and about 45% of volume). These high numbers corroborate studies from other glacierized cordilleras in Peru. The climate data show overall a moderate increase in air temperature and mostly weak, non-significant trends for precipitation sums, and probably cannot fully explain the observed substantial ice loss. Therefore, the likely increase of specific humidity in the upper troposphere, where the glaciers are located, is further discussed, and we conclude that it played a major role in the observed massive ice loss of the Cordillera Vilcanota over the past decades.

  16. Glacier changes and climate trends derived from multiple sources in the data scarce Cordillera Vilcanota region, southern Peruvian Andes

    Directory of Open Access Journals (Sweden)

    N. Salzmann

    2013-01-01

    Full Text Available The role of glaciers as temporal water reservoirs is particularly pronounced in the (outer) tropics because of the very distinct wet/dry seasons. Rapid glacier retreat caused by climatic changes is thus a major concern, and decision makers urgently demand regional/local glacier evolution trends, ice mass estimates and runoff assessments. However, in remote mountain areas, spatial and temporal data coverage is typically very scarce, and this is further complicated by high spatial and temporal variability in regions with complex topography. Here, we present an approach for dealing with these constraints. For the Cordillera Vilcanota (southern Peruvian Andes), which is the second largest glacierized cordillera in Peru (after the Cordillera Blanca) and also comprises the Quelccaya Ice Cap, we assimilate a comprehensive multi-decadal collection of available glacier and climate data from multiple sources (satellite images, meteorological station data and climate reanalysis), and analyze them for changes in glacier area and volume and for related trends in air temperature, precipitation and, more generally, specific humidity. While we found only marginal glacier changes between 1962 and 1985, there has been a massive ice loss since 1985 (about 30% of area and about 45% of volume). These high numbers corroborate studies from other glacierized cordilleras in Peru. The climate data show overall a moderate increase in air temperature and mostly weak, non-significant trends for precipitation sums, and probably cannot fully explain the observed substantial ice loss. Therefore, the likely increase of specific humidity in the upper troposphere, where the glaciers are located, is further discussed, and we conclude that it played a major role in the observed massive ice loss of the Cordillera Vilcanota over the past decades.

  17. The Hidden Health and Economic Burden of Rotavirus Gastroenteritis in Malaysia: An Estimation Using Multiple Data Sources.

    Science.gov (United States)

    Loganathan, Tharani; Ng, Chiu-Wan; Lee, Way-Seah; Jit, Mark

    2016-06-01

    Rotavirus gastroenteritis (RVGE) results in substantial mortality and morbidity worldwide. However, an accurate estimation of the health and economic burden of RVGE in Malaysia covering public, private and home treatment has been lacking. Data from multiple sources were used to estimate diarrheal mortality and morbidity according to health service utilization. The proportion of this burden attributable to rotavirus was estimated from a community-based study and from a meta-analysis we conducted of primary hospital-based studies. Rotavirus incidence was determined by multiplying acute gastroenteritis incidence by estimates of the proportion of gastroenteritis attributable to rotavirus. The economic burden of rotavirus disease was estimated from the health system and societal perspectives. Annually, rotavirus results in 27 deaths, 31,000 hospitalizations, 41,000 outpatient visits and 145,000 episodes of home-treated gastroenteritis in Malaysia. We estimate an annual rotavirus incidence of 1 death per 100,000 children, and 12 hospitalizations, 16 outpatient clinic visits and 57 home-treated episodes per 1000 children under 5 years. Annually, RVGE is estimated to cost US$34 million to the healthcare provider and US$50 million to society. Productivity loss contributes almost a third of the costs to society. Publicly treated, privately treated and home-treated episodes account for 52%, 27% and 21% of total societal costs, respectively. RVGE represents a considerable health and economic burden in Malaysia. Much of the burden lies in privately or home-treated episodes and is poorly captured in previous studies. This study provides vital information for future evaluations of cost-effectiveness, which are necessary for policy-making regarding universal vaccination.
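    The incidence calculation described above (acute gastroenteritis incidence multiplied by the rotavirus-attributable proportion) can be sketched as follows; the input numbers are illustrative placeholders, not the study's data:

```python
# Sketch of the attributable-incidence step from the abstract.
# Both inputs below are hypothetical, chosen only so the product lands
# near the reported order of 12 hospitalizations per 1000 children.
age_hosp_per_1000 = 40.0   # hypothetical acute gastroenteritis hospitalization rate
rota_fraction = 0.30       # hypothetical rotavirus-attributable proportion
rota_hosp_per_1000 = age_hosp_per_1000 * rota_fraction
print(rota_hosp_per_1000)  # ~12 per 1000 children
```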

  18. Terrestrial Sagnac delay constraining modified gravity models

    Science.gov (United States)

    Karimov, R. Kh.; Izmailov, R. N.; Potapov, A. A.; Nandi, K. K.

    2018-04-01

    Modified gravity theories include f(R)-gravity models that are usually constrained by the cosmological evolutionary scenario. However, it has recently been shown that they can also be constrained by the signatures of the accretion disk around constant-Ricci-curvature Kerr-f(R0) stellar-sized black holes. Our aim here is to use another experimental fact, viz., the terrestrial Sagnac delay, to constrain the parameters of specific f(R)-gravity prescriptions. We shall assume that a Kerr-f(R0) solution asymptotically describes Earth's weak gravity near its surface. In this spacetime, we study oppositely directed light beams from a source/observer moving on non-geodesic and geodesic circular trajectories and calculate the time gap when the beams reunite. We obtain the exact time gap, called the Sagnac delay, in both cases and expand it to show how the flat-space value is corrected by the Ricci curvature, the mass and the spin of the gravitating source. Under the assumption that the magnitudes of the corrections are of the order of the residual uncertainties in the delay measurement, we derive the allowed intervals for the Ricci curvature. We conclude that the terrestrial Sagnac delay can be used to constrain the parameters of specific f(R) prescriptions. Despite using the weak-field gravity near Earth's surface, it turns out that the model parameter ranges still remain the same as those obtained from the strong-field accretion disk phenomenon.
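    For context, the flat-space value that the curvature, mass and spin terms correct is the standard Sagnac result; this is a textbook formula quoted here for orientation, not taken from the abstract:

```latex
% Standard flat-spacetime Sagnac delay for counter-propagating beams
% around a circular path of radius R (enclosed area A = \pi R^2),
% with the source/observer rotating at angular velocity \Omega:
\Delta t_{\mathrm{Sagnac}} = \frac{4 A \,\Omega}{c^{2}}
                           = \frac{4 \pi R^{2} \Omega}{c^{2}}
```

    The paper's expansion then adds model-specific corrections in the Ricci curvature R0, the mass and the spin of the source on top of this baseline.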

  19. Text-Based Argumentation with Multiple Sources: A Descriptive Study of Opportunity to Learn in Secondary English Language Arts, History, and Science

    Science.gov (United States)

    Litman, Cindy; Marple, Stacy; Greenleaf, Cynthia; Charney-Sirott, Irisa; Bolz, Michael J.; Richardson, Lisa K.; Hall, Allison H.; George, MariAnne; Goldman, Susan R.

    2017-01-01

    This study presents a descriptive analysis of 71 videotaped lessons taught by 34 highly regarded secondary English language arts, history, and science teachers, collected to inform an intervention focused on evidence-based argumentation from multiple text sources. Studying the practices of highly regarded teachers is valuable for identifying…

  20. The role of satellite altimetry in gravity field modelling in coastal areas

    DEFF Research Database (Denmark)

    Andersen, Ole Baltazar; Knudsen, Per

    2000-01-01

    global uniform gravity information with very high resolution, and these global marine gravity fields are registered on a two-by-two minute grid corresponding to 4 by 4 kilometres at the equator. In this presentation several coastal complications in deriving the marine gravity field from satellite...... altimetry will be investigated using the KMS98 gravity field. Comparison with other sources of gravity field information, like airborne and marine gravity observations, will be carried out, and two fundamentally different test areas (Azores and Skagerrak) will be studied to investigate the different role...

  1. Gravity Data for Egypt

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (71 records) were gathered by various governmental organizations (and academia) using a variety of methods. This data base was received in...

  2. New massive gravity

    NARCIS (Netherlands)

    Bergshoeff, Eric A.; Hohm, Olaf; Townsend, Paul K.

    2012-01-01

    We present a brief review of New Massive Gravity, which is a unitary theory of massive gravitons in three dimensions obtained by considering a particular combination of the Einstein-Hilbert and curvature squared terms.

  3. DMA Antarctic Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (65,164 records) were gathered by various governmental organizations (and academia) using a variety of methods. The data base was received...

  4. Gravity Data for Minnesota

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (55,907 records) were gathered by various governmental organizations (and academia) using a variety of methods. This data base was received...

  5. Stability in designer gravity

    International Nuclear Information System (INIS)

    Hertog, Thomas; Hollands, Stefan

    2005-01-01

    We study the stability of designer gravity theories, in which one considers gravity coupled to a tachyonic scalar with anti-de Sitter (AdS) boundary conditions defined by a smooth function W. We construct Hamiltonian generators of the asymptotic symmetries using the covariant phase space method of Wald et al. and find that they differ from the spinor charges except when W = 0. The positivity of the spinor charge is used to establish a lower bound on the conserved energy of any solution that satisfies boundary conditions for which W has a global minimum. A large class of designer gravity theories therefore have a stable ground state, which the AdS/CFT correspondence indicates should be the lowest-energy soliton. We make progress towards proving this by showing that minimum-energy solutions are static. The generalization of our results to designer gravity theories in higher dimensions involving several tachyonic scalars is discussed.

  6. Carroll versus Galilei gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bergshoeff, Eric [Centre for Theoretical Physics, University of Groningen,Nijenborgh 4, 9747 AG Groningen (Netherlands); Gomis, Joaquim [Departament de Física Cuàntica i Astrofísica and Institut de Ciències del Cosmos,Universitat de Barcelona,Martí i Franquès 1, E-08028 Barcelona (Spain); Rollier, Blaise [Centre for Theoretical Physics, University of Groningen,Nijenborgh 4, 9747 AG Groningen (Netherlands); Rosseel, Jan [Faculty of Physics, University of Vienna,Boltzmanngasse 5, A-1090 Vienna (Austria); Veldhuis, Tonnis ter [Centre for Theoretical Physics, University of Groningen,Nijenborgh 4, 9747 AG Groningen (Netherlands)

    2017-03-30

    We consider two distinct limits of General Relativity that, in contrast to the standard non-relativistic limit, can be taken at the level of the Einstein-Hilbert action instead of the equations of motion. One is a non-relativistic limit and leads to a so-called Galilei gravity theory; the other is an ultra-relativistic limit yielding a so-called Carroll gravity theory. We present both gravity theories in a first-order formalism and show that in both cases the equations of motion (i) lead to constraints on the geometry and (ii) are not sufficient to solve for all of the components of the connection fields in terms of the other fields. Using a second-order formalism we show that these independent components serve as Lagrange multipliers for the geometric constraints we found earlier. We point out a few noteworthy differences between Carroll and Galilei gravity and give some examples of matter couplings.

  7. Discrete quantum gravity

    International Nuclear Information System (INIS)

    Williams, Ruth M

    2006-01-01

    A review is given of a number of approaches to discrete quantum gravity, with a restriction to those likely to be relevant in four dimensions. This paper is dedicated to Rafael Sorkin on the occasion of his sixtieth birthday

  8. Analysis of Brassica oleracea early stage abiotic stress responses reveals tolerance in multiple crop types and for multiple sources of stress.

    Science.gov (United States)

    Beacham, Andrew M; Hand, Paul; Pink, David Ac; Monaghan, James M

    2017-12-01

    Brassica oleracea includes a number of important crop types such as cabbage, cauliflower, broccoli and kale. Current climate conditions and weather patterns are causing significant losses in these crops, meaning that new cultivars with improved tolerance of one or more abiotic stress types must be sought. In this study, genetically fixed B. oleracea lines belonging to a Diversity Fixed Foundation Set (DFFS) were assayed for their response to seedling stage-imposed drought, flood, salinity, heat and cold stress. Significant (P ≤ 0.05) variation in stress tolerance response was found for each stress, for each of four measured variables (relative fresh weight, relative dry weight, relative leaf number and relative plant height). Lines tolerant to multiple stresses were found to belong to several different crop types. There was no overall correlation between the responses to the different stresses. Abiotic stress tolerance was identified in multiple B. oleracea crop types, with some lines exhibiting resistance to multiple stresses. For each stress, no one crop type appeared significantly more or less tolerant than others. The results are promising for the development of more environmentally robust lines of different B. oleracea crops by identifying tolerant material and highlighting the relationship between responses to different stresses. © 2017 Society of Chemical Industry.

  9. Source apportionment of PM2.5 at multiple Northwest U.S. sites: Assessing regional winter wood smoke impacts from residential wood combustion

    Science.gov (United States)

    Kotchenruther, Robert A.

    2016-10-01

    Wood smoke from residential wood combustion is a significant source of elevated PM2.5 in many communities across the Northwest U.S. Accurate representation of residential wood combustion in source-oriented regional-scale air quality models is challenging because of multiple uncertainties. As an alternative to source-oriented modeling, this work provides, through receptor-oriented source apportionment, an assessment of winter residential wood combustion impacts at multiple Northwest U.S. locations. Source apportionment was performed on chemically speciated PM2.5 from 19 monitoring sites using the Positive Matrix Factorization (PMF) receptor model. Each site was modeled independently, but a common data preparation and modeling protocol was used so that results were as comparable as possible across sites. Model solutions had from 4 to 8 PMF factors, depending on the site. PMF factors at each site were associated with a source classification (e.g., primary wood smoke), a dominant chemical composition (e.g., ammonium nitrate), or were some mixture. Fifteen different sources or chemical compositions were identified as contributing to PM2.5 across the 19 sites. The six most common were: aged wood smoke and secondary organic carbon, motor vehicles, primary wood smoke, ammonium nitrate, ammonium sulfate, and fugitive dust. Wood smoke was identified at every site, with both aged and primary wood smoke identified at most sites. Wood smoke contributions to PM2.5 were averaged for the two winter months of December and January, the months when wood smoke in the Northwest U.S. is mainly from residential wood combustion. The total contribution of residential wood combustion, that from primary plus aged smoke, ranged from 11.4% to 92.7% of average December and January PM2.5 depending on the site, with the highest percent contributions occurring in smaller towns that have fewer expected sources of winter PM2.5. Receptor modeling at multiple sites, such as that conducted in this
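
    PMF as used above is, at its core, a non-negativity-constrained factorization of the samples-by-species data matrix into source contributions and source profiles, with each element weighted by its measurement uncertainty. The sketch below substitutes plain (unweighted) Lee-Seung multiplicative-update NMF for the actual PMF algorithm; the data, dimensions and factor count are entirely synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical speciated-PM2.5 data: 200 samples x 10 chemical species,
# generated from 3 non-negative "source" profiles plus a little noise.
true_G = rng.uniform(0, 5, size=(200, 3))   # source contributions per sample
true_F = rng.uniform(0, 1, size=(3, 10))    # source chemical profiles
X = true_G @ true_F + 0.01 * rng.uniform(size=(200, 10))

def nmf(X, k, iters=1000):
    """Unweighted NMF via Lee-Seung multiplicative updates; PMF additionally
    weights each matrix element by its measurement uncertainty."""
    n, m = X.shape
    G = rng.uniform(0.1, 1, (n, k))
    F = rng.uniform(0.1, 1, (k, m))
    for _ in range(iters):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
        G *= (X @ F.T) / (G @ (F @ F.T) + 1e-12)
    return G, F

G, F = nmf(X, k=3)
rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.4f}")
```

    The multiplicative updates preserve non-negativity of both factors by construction, which is the property that lets each factor be read as a physical source.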

  10. The earth's shape and gravity

    CERN Document Server

    Garland, G D; Wilson, J T

    2013-01-01

    The Earth's Shape and Gravity focuses on the progress of the use of geophysical methods in investigating the interior of the earth and its shape. The publication first offers information on gravity, geophysics, geodesy, and geology and gravity measurements. Discussions focus on gravity measurements and reductions, potential and equipotential surfaces, absolute and relative measurements, and gravity networks. The text then elaborates on the shape of the sea-level surface and reduction of gravity observations. The text takes a look at gravity anomalies and structures in the earth's crust; interp

  11. Streaming gravity mode instability

    International Nuclear Information System (INIS)

    Wang Shui.

    1989-05-01

    In this paper, we study the stability of a current sheet with a sheared flow in a gravitational field which is perpendicular to the magnetic field and plasma flow. This mixing mode caused by a combined role of the sheared flow and gravity is named the streaming gravity mode instability. The conditions of this mode instability are discussed for an ideal four-layer model in the incompressible limit. (author). 5 refs

  12. On higher derivative gravity

    International Nuclear Information System (INIS)

    Accioly, A.J.

    1987-01-01

    A possible classical route leading towards a general relativity theory with higher derivatives, starting, in a sense, from first principles, is analysed. A completely causal vacuum solution with the symmetries of the Goedel universe is obtained in the framework of this higher-derivative gravity. This very peculiar and rare result is the first known vacuum solution of the fourth-order gravity theory that is not a solution of the corresponding Einstein equations.(Author) [pt

  13. What Is Gravity?

    Science.gov (United States)

    Nelson, George

    2004-01-01

    Gravity is the name given to the phenomenon that any two masses, like you and the Earth, attract each other. You pull on the Earth and the Earth pulls on you by the same amount, and the two objects do not have to be touching. Gravity acts over vast distances, like the 150 million kilometers (93 million miles) between the Earth and the Sun or the billions of…

  14. Automated borehole gravity meter system

    International Nuclear Information System (INIS)

    Lautzenhiser, Th.V.; Wirtz, J.D.

    1984-01-01

    An automated borehole gravity meter system for measuring gravity within a wellbore. The gravity meter includes leveling devices for leveling the borehole gravity meter, displacement devices for applying forces to a gravity sensing device within the gravity meter to bring the gravity sensing device to a predetermined or null position. Electronic sensing and control devices are provided for (i) activating the displacement devices, (ii) sensing the forces applied to the gravity sensing device, (iii) electronically converting the values of the forces into a representation of the gravity at the location in the wellbore, and (iv) outputting such representation. The system further includes electronic control devices with the capability of correcting the representation of gravity for tidal effects, as well as, calculating and outputting the formation bulk density and/or porosity

  15. TH-CD-207B-01: BEST IN PHYSICS (IMAGING): Development of High Brightness Multiple-Pixel X-Ray Source Using Oxide Coated Cathodes

    Energy Technology Data Exchange (ETDEWEB)

    Kandlakunta, P; Pham, R; Zhang, T [Washington University School of Medicine, St. Louis, MO (United States)

    2016-06-15

    Purpose: To develop and characterize a high brightness multiple-pixel thermionic emission x-ray (MPTEX) source. Methods: Multiple-pixel x-ray sources allow for designs of novel x-ray imaging techniques, such as fixed-gantry CT, digital tomosynthesis, and tetrahedron beam computed tomography. We are developing a high-brightness multiple-pixel thermionic emission x-ray (MPTEX) source based on oxide coated cathodes. An oxide cathode is chosen as the electron source because of its high emission current density and low operating temperature. A MPTEX prototype has been developed which may contain up to 41 micro-rectangular oxide cathodes at 4 mm pixel spacing. Electronics hardware was developed for source control and switching. The cathode emission current was evaluated and x-ray measurements were performed to estimate the focal spot size. Results: The oxide cathodes were able to produce ∼110 mA cathode current in pulse mode, which corresponds to an emission current density of 0.55 A/cm{sup 2}. The maximum kVp of the MPTEX prototype is currently limited to 100 kV by the rating of the high-voltage feedthrough. Preliminary x-ray measurements estimated the focal spot size as 1.5 × 1.3 mm{sup 2}. Conclusion: A MPTEX source was developed with thermionic oxide coated cathodes and preliminary source characterization was successfully performed. The MPTEX source is able to produce an array of high brightness x-ray beams with a fast switching speed.
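
    The two quoted emission figures are linked by the definition of current density, J = I/A; a quick consistency check yields the implied emitting area per cathode (the cathode geometry itself is not stated in the abstract):

```python
# Consistency check of the quoted figures: J = I / A, so A = I / J.
I_pulse = 0.110   # A, quoted pulse-mode cathode current
J = 0.55          # A/cm^2, quoted emission current density
area_cm2 = I_pulse / J
print(f"implied emitting area per cathode: {area_cm2:.2f} cm^2")
```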

  16. Venus: radar determination of gravity potential.

    Science.gov (United States)

    Shapiro, I I; Pettengill, G H; Sherman, G N; Rogers, A E; Ingalls, R P

    1973-02-02

    We describe a method for the determination of the gravity potential of Venus from multiple-frequency radar measurements. The method is based on the strong frequency dependence of the absorption of radio waves in Venus' atmosphere. Comparison of the differing radar reflection intensities at several frequencies yields the height of the surface relative to a reference pressure contour; combination with measurements of round-trip echo delays allows the pressure, and hence the gravity potential contour, to be mapped relative to the mean planet radius. Since calibration data from other frequencies are unavailable, the absorption-sensitive Haystack Observatory data have been analyzed under the assumption of uniform surface reflectivity to yield a gravity equipotential contour for the equatorial region and a tentative upper bound of 6 x 10(-4) on the fractional difference of Venus' principal equatorial moments of inertia. The minima in the equipotential contours appear to be associated with topographic minima.
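
    The inversion can be illustrated with a toy two-frequency version of the method: if absorption scales as frequency squared and the surface reflectivity is assumed common to both frequencies, the echo-power ratio isolates the absorbing column above the surface. All constants and values below are made up for illustration; they are not the Haystack parameters.

```python
import numpy as np

# Toy model: echo power at frequency f is P(f) = R * exp(-2*alpha*f**2 * N),
# with absorption ~ f^2, a two-way path, and N the absorber column density.
alpha = 3.0e-2          # absorption per unit column per GHz^2 (hypothetical)
f1, f2 = 0.43, 7.84     # GHz, two illustrative radar frequencies
R_true, N_true = 0.12, 1.7

P1 = R_true * np.exp(-2 * alpha * f1**2 * N_true)
P2 = R_true * np.exp(-2 * alpha * f2**2 * N_true)

# Invert: the power ratio eliminates the (assumed common) reflectivity R.
N_est = np.log(P1 / P2) / (2 * alpha * (f2**2 - f1**2))
R_est = P1 * np.exp(2 * alpha * f1**2 * N_est)
print(f"recovered column N = {N_est:.3f}, reflectivity R = {R_est:.3f}")
```

    The recovered column density stands in for the surface height relative to a reference pressure contour; combining it with echo-delay topography is what lets the equipotential be mapped.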

  17. Extended Theories of Gravity

    International Nuclear Information System (INIS)

    Capozziello, Salvatore; De Laurentis, Mariafelicia

    2011-01-01

    Extended Theories of Gravity can be considered as a new paradigm to cure the shortcomings of General Relativity at infrared and ultraviolet scales. They are an approach that, by preserving the undoubtedly positive results of Einstein's theory, aims to address conceptual and experimental problems recently emerged in astrophysics, cosmology and High Energy Physics. In particular, the goal is to encompass, in a self-consistent scheme, problems like inflation, dark energy, dark matter, large scale structure and, first of all, to give at least an effective description of Quantum Gravity. We review the basic principles that any gravitational theory has to follow. The geometrical interpretation is discussed in a broad perspective in order to highlight the basic assumptions of General Relativity and its possible extensions in the general framework of gauge theories. Principles of such modifications are presented, focusing on specific classes of theories like f(R)-gravity and scalar–tensor gravity in the metric and Palatini approaches. The special role of torsion is also discussed. The conceptual features of these theories are fully explored and attention is paid to the issues of dynamical and conformal equivalence between them, considering also the initial value problem. A number of viability criteria are presented, considering the post-Newtonian and the post-Minkowskian limits. In particular, we discuss the problems of neutrino oscillations and gravitational waves in extended gravity. Finally, future perspectives of extended gravity are considered, with the possibility of going beyond a trial-and-error approach.

  18. Tunable optical frequency comb enabled scalable and cost-effective multiuser orthogonal frequency-division multiple access passive optical network with source-free optical network units.

    Science.gov (United States)

    Chen, Chen; Zhang, Chongfu; Liu, Deming; Qiu, Kun; Liu, Shuang

    2012-10-01

    We propose and experimentally demonstrate a multiuser orthogonal frequency-division multiple access passive optical network (OFDMA-PON) with source-free optical network units (ONUs), enabled by tunable optical frequency comb generation technology. By cascading a phase modulator (PM) and an intensity modulator and dynamically controlling the peak-to-peak voltage of a PM driven signal, a tunable optical frequency comb source can be generated. It is utilized to assist the configuration of a multiple source-free ONUs enhanced OFDMA-PON where simultaneous and interference-free multiuser upstream transmission over a single wavelength can be efficiently supported. The proposed multiuser OFDMA-PON is scalable and cost effective, and its feasibility is successfully verified by experiment.
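
    The comb-generation stage rests on a standard identity: sinusoidal phase modulation produces sidebands with Bessel-function amplitudes (the Jacobi-Anger expansion), so varying the drive amplitude, the peak-to-peak voltage controlled in the paper, redistributes power among the comb lines. A numerical sketch of the phase-modulator stage alone, with an illustrative modulation index:

```python
import numpy as np

# Sinusoidal phase modulation: exp(j*beta*sin(wt)) = sum_n J_n(beta)*exp(j*n*wt),
# so the optical field acquires a comb of harmonics with Bessel amplitudes.
beta = 2.0                        # modulation index (drive-voltage dependent)
t = np.arange(4096) / 4096.0      # one RF period, finely sampled
field = np.exp(1j * beta * np.sin(2 * np.pi * t))

# The DFT over one period recovers |J_n(beta)| at harmonic n.
lines = np.abs(np.fft.fft(field)) / t.size
print("comb line magnitudes |J_0..J_4|:", np.round(lines[:5], 3))
```

    Raising or lowering `beta` shifts power outward or inward across the sidebands, which is why controlling the modulator drive voltage tunes the comb.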

  19. Center of Gravity - Still Relevant After All These Years

    National Research Council Canada - National Science Library

    Fowler, Christopher

    2002-01-01

    .... Individual service parochialism further confuses the issue. This paper initially investigates possible reasons why Clausewitz chose to label the source of enemy strength or the hub of all power with center of gravity...

  20. Gravity and low-frequency geodynamics

    CERN Document Server

    Teisseyre, Roman

    1989-01-01

    This fourth volume in the series Physics and Evolution of the Earth's Interior, provides a comprehensive review of the geophysical and geodetical aspects related to gravity and low-frequency geodynamics. Such aspects include the Earth's gravity field, geoid shape theory, and low-frequency phenomena like rotation, oscillations and tides.Global-scale phenomena are treated as a response to source excitation in spherical Earth models consisting of several shells: lithosphere, mantle, core and sometimes also the inner solid core. The effect of gravitation and rotation on the Earth's shape is anal

  1. Eddington's theory of gravity and its progeny.

    Science.gov (United States)

    Bañados, Máximo; Ferreira, Pedro G

    2010-07-02

    We resurrect Eddington's proposal for the gravitational action in the presence of a cosmological constant and extend it to include matter fields. We show that the Newton-Poisson equation is modified in the presence of sources and that charged black holes show great similarities with those arising in Born-Infeld electrodynamics coupled to gravity. When we consider homogeneous and isotropic space-times, we find that there is a minimum length (and maximum density) at early times, clearly pointing to an alternative theory of the big bang. We thus argue that the modern formulation of Eddington's theory, Born-Infeld gravity, presents us with a novel, nonsingular description of the Universe.

  2. Assessment of factors which affect multiple uses of water sources at household level in rural Zimbabwe - A case study of Marondera, Murehwa and Uzumba Maramba Pfungwe districts

    Science.gov (United States)

    Katsi, Luckson; Siwadi, Japson; Guzha, Edward; Makoni, Fungai S.; Smits, Stef

    Water, with all its multiple uses, plays a pivotal role in sustaining rural livelihoods, especially those of the poor. As such, the provision of water services which go beyond domestic needs to include water for small-scale productive uses should be encouraged, to enhance people's livelihood options by making a significant contribution to household income, food security, improved nutrition and health. Combined, these multiple benefits can assist in the fight against hunger and poverty. This study was conducted in Mashonaland East province, covering Marondera, Murehwa and Uzumba Maramba Pfungwe districts in Zimbabwe, over the period December 2005-May 2006, to assess factors which affect multiple uses of water sources at household level. Participatory Rural Appraisal tools such as discussions, observations and interviews were used for data collection. The survey found that people indeed require water for productive purposes apart from domestic uses, which are often given top priority. Multiple uses of water sources at household level can be affected by, among other factors, the segmentation of water services into domestic and productive supply schemes, technology and system design, water quality and quantity, and distance to water sources. The study recommends that, for water service providers to deliver appropriate, efficient and sustainable services, they should understand and appreciate that people's water needs are integrated and are part and parcel of their multifaceted livelihood strategies.

  3. Determination of the multiplication factor and its bias by the 252Cf-source technique: A method for code benchmarking with subcritical configurations

    International Nuclear Information System (INIS)

    Perez, R.B.; Valentine, T.E.; Mihalczo, J.T.; Mattingly, J.K.

    1997-01-01

    A brief discussion of the Cf-252 source driven method for subcritical measurements serves as an introduction to the concept and use of the spectral ratio, Γ. It has also been shown that the Monte Carlo calculations of spectral densities and effective multiplication factors have as a common denominator the transport propagator. This commonality follows from the fact that the Neumann series expansion of the propagator lends itself to the Monte Carlo method. On this basis, a linear relationship between the spectral ratio and the effective multiplication factor has been shown. This relationship demonstrates the ability of subcritical measurements of the ratio of spectral densities to validate transport theory methods and cross sections.
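
    The practical use of the reported linearity is calibration: fit the Γ-to-k_eff line from configurations whose k_eff is calculated, then read off k_eff for a new subcritical measurement from its measured spectral ratio. A sketch with entirely synthetic numbers; the actual slope and intercept depend on the system, detectors and cross sections.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical calibration set: calculated k_eff for several configurations
# and the corresponding measured spectral ratio, assumed linearly related.
k_eff = np.array([0.85, 0.90, 0.93, 0.95, 0.97])
gamma = 2.0 - 1.6 * k_eff + rng.normal(0, 0.005, k_eff.size)  # synthetic

# Least-squares line, then inversion for a new subcritical measurement.
slope, intercept = np.polyfit(gamma, k_eff, 1)
gamma_new = 0.52
print("inferred k_eff:", round(slope * gamma_new + intercept, 3))
```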

  4. New Gravity Wave Treatments for GISS Climate Models

    Science.gov (United States)

    Geller, Marvin A.; Zhou, Tiehan; Ruedy, Reto; Aleinov, Igor; Nazarenko, Larissa; Tausnev, Nikolai L.; Sun, Shan; Kelley, Maxwell; Cheng, Ye

    2011-01-01

    Previous versions of GISS climate models have either used formulations of Rayleigh drag to represent unresolved gravity wave interactions with the model-resolved flow or have included a rather complicated treatment of unresolved gravity waves that, while being climate interactive, involved the specification of a relatively large number of parameters that were not well constrained by observations and also was computationally very expensive. Here, the authors introduce a relatively simple and computationally efficient specification of unresolved orographic and nonorographic gravity waves and their interaction with the resolved flow. Comparisons of the GISS model winds and temperatures with no gravity wave parameterization; with only orographic gravity wave parameterization; and with both orographic and nonorographic gravity wave parameterizations are shown to illustrate how the zonal mean winds and temperatures converge toward observations. The authors also show that the specifications of orographic and nonorographic gravity waves must be different in the Northern and Southern Hemispheres. Then results are presented where the nonorographic gravity wave sources are specified to represent sources from convection in the intertropical convergence zone and spontaneous emission from jet imbalances. Finally, a strategy to include these effects in a climate-dependent manner is suggested.

  5. On combined gravity gradient components modelling for applied geophysics

    International Nuclear Information System (INIS)

    Veryaskin, Alexey; McRae, Wayne

    2008-01-01

    Gravity gradiometry research and development has intensified in recent years to the extent that technologies providing a resolution of about 1 eotvos per 1 second average shall likely soon be available for multiple critical applications such as natural resources exploration, oil reservoir monitoring and defence establishment. Much of the content of this paper was composed a decade ago, and only minor modifications were required for the conclusions to be just as applicable today. In this paper we demonstrate how gravity gradient data can be modelled, and show some examples of how gravity gradient data can be combined in order to extract valuable information. In particular, this study demonstrates the importance of two gravity gradient components, Txz and Tyz, which, when processed together, can provide more information on subsurface density contrasts than that derived solely from the vertical gravity gradient (Tzz)
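
    The complementarity of Txz and Tyz can be illustrated with the standard free-space gradient tensor of a point mass, T_ij = GM(3 x_i x_j - r² δ_ij)/r⁵; the geometry, mass and sign convention (x east, y north, z down) below are hypothetical choices for the example.

```python
import numpy as np

G_CONST = 6.674e-11  # m^3 kg^-1 s^-2

def gradient_tensor(r_vec, mass):
    """Gravity gradient tensor T_ij = G*M*(3 x_i x_j - r^2 delta_ij)/r^5 of a
    point mass, with r_vec from observer to mass. Units 1/s^2 (1 E = 1e-9 s^-2)."""
    r_vec = np.asarray(r_vec, dtype=float)
    r = np.linalg.norm(r_vec)
    return G_CONST * mass * (3.0 * np.outer(r_vec, r_vec) - r**2 * np.eye(3)) / r**5

# Hypothetical compact excess mass: 1e9 kg, 100 m below and 50 m east of
# the observation point (x east, y north, z down).
T = gradient_tensor([50.0, 0.0, 100.0], 1e9)
Txz, Tyz, Tzz = T[0, 2], T[1, 2], T[2, 2]
print(f"Txz = {Txz*1e9:.2f} E, Tyz = {Tyz*1e9:.2f} E, Tzz = {Tzz*1e9:.2f} E")
```

    For this source, Txz is non-zero (the mass is offset east) while Tyz vanishes (no north offset), which is exactly the directional information the two horizontal-vertical components contribute beyond Tzz alone. The tensor is symmetric and trace-free, as required by Laplace's equation in free space.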

  6. Students' Consideration of Source Information during the Reading of Multiple Texts and Its Effect on Intertextual Conflict Resolution

    Science.gov (United States)

    Kobayashi, Keiichi

    2014-01-01

    This study investigated students' spontaneous use of source information for the resolution of conflicts between texts. One-hundred fifty-four undergraduate students read two conflicting explanations concerning the relationship between blood type and personality under two conditions: either one explanation with a higher credibility source and…

  7. Children's Ability to Distinguish between Memories from Multiple Sources: Implications for the Quality and Accuracy of Eyewitness Statements.

    Science.gov (United States)

    Roberts, Kim P.

    2002-01-01

    Outlines five perspectives addressing alternate aspects of the development of children's source monitoring: source-monitoring theory, fuzzy-trace theory, schema theory, person-based perspective, and mental-state reasoning model. Discusses research areas with relation to forensic developmental psychology: agent identity, prospective processing,…

  8. MEG (Magnetoencephalography) multipolar modeling of distributed sources using RAP-MUSIC (Recursively Applied and Projected Multiple Signal Characterization)

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J. C. (John C.); Baillet, S. (Sylvain); Jerbi, K. (Karim); Leahy, R. M. (Richard M.)

    2001-01-01

    We describe the use of truncated multipolar expansions for producing dynamic images of cortical neural activation from measurements of the magnetoencephalogram. We use a signal-subspace method to find the locations of a set of multipolar sources, each of which represents a region of activity in the cerebral cortex. Our method builds up an estimate of the sources in a recursive manner, i.e. we first search for point current dipoles, then magnetic dipoles, and finally first order multipoles. The dynamic behavior of these sources is then computed using a linear fit to the spatiotemporal data. The final step in the procedure is to map each of the multipolar sources into an equivalent distributed source on the cortical surface. The method is illustrated through an application to epileptic interictal MEG data.
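
    The signal-subspace recursion can be sketched in one dimension: estimate the signal subspace from the data, scan candidate source positions for the best subspace correlation, then project the found source out and repeat. The "lead field" below is a made-up scalar gain, not a multipolar MEG forward model, and all positions are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D "array": 32 sensors; a source at position p has gain vector
# g(p)_i = 1/(1 + (x_i - p)^2), a stand-in for a genuine lead field.
sensors = np.linspace(-5.0, 5.0, 32)
def gain(p):
    return 1.0 / (1.0 + (sensors - p) ** 2)

true_pos = [-2.0, 1.5]
A = np.column_stack([gain(p) for p in true_pos])
X = A @ rng.standard_normal((2, 200)) + 0.01 * rng.standard_normal((32, 200))

# Signal subspace = top-2 left singular vectors of the data matrix.
Us = np.linalg.svd(X, full_matrices=False)[0][:, :2]

grid = np.linspace(-4.5, 4.5, 181)
found = []
P = np.eye(32)                                  # projector away from found sources
for _ in range(len(true_pos)):
    # Orthonormal basis of the projected signal subspace (drop removed directions).
    Uq, sq, _ = np.linalg.svd(P @ Us, full_matrices=False)
    Up = Uq[:, sq > 0.1]
    def subcorr(p):                             # subspace correlation at p
        a = P @ gain(p)
        return np.linalg.norm(Up.T @ a) / (np.linalg.norm(a) + 1e-12)
    best = grid[int(np.argmax([subcorr(p) for p in grid]))]
    found.append(best)
    a = P @ gain(best)                          # project out the recovered source
    P = P - np.outer(a, a) / (a @ a)
print("recovered positions:", sorted(found))
```

    The "recursively applied and projected" step is the last two lines of the loop: each recovered source is removed from both the candidate gains and the signal subspace before the next scan.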

  9. A principal-component and least-squares method for allocating polycyclic aromatic hydrocarbons in sediment to multiple sources

    International Nuclear Information System (INIS)

    Burns, W.A.; Mankiewicz, P.J.; Bence, A.E.; Page, D.S.; Parker, K.R.

    1997-01-01

    A method was developed to allocate polycyclic aromatic hydrocarbons (PAHs) in sediment samples to the PAH sources from which they came. The method uses principal-component analysis to identify possible sources and a least-squares model to find the source mix that gives the best fit of 36 PAH analytes in each sample. The method identified 18 possible PAH sources in a large set of field data collected in Prince William Sound, Alaska, USA, after the 1989 Exxon Valdez oil spill, including diesel oil, diesel soot, spilled crude oil in various weathering states, natural background, creosote, and combustion products from human activities and forest fires. Spill oil was generally found to be a small increment of the natural background in subtidal sediments, whereas combustion products were often the predominant sources for subtidal PAHs near sites of past or present human activity. The method appears to be applicable to other situations, including other spills
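
    The allocation step amounts to finding a non-negative mix of candidate source profiles that best reproduces the analyte pattern of each sample. A toy version with hypothetical profiles, solved by projected gradient descent (the paper's own solver and 36-analyte profiles are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical library of 4 source profiles over 8 analytes (columns sum to 1)
# and one field sample mixed from known proportions.
profiles = rng.uniform(0.1, 1.0, size=(8, 4))
profiles /= profiles.sum(axis=0)
true_mix = np.array([5.0, 0.0, 2.0, 1.0])   # source 2 absent from this sample
sample = profiles @ true_mix

def nnls_pg(A, b, iters=50000):
    """Non-negative least squares min ||Ax - b||, x >= 0, via projected
    gradient descent with step 1/L (L = largest eigenvalue of A^T A)."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A.T @ A, 2)
    for _ in range(iters):
        x = np.maximum(0.0, x - step * (A.T @ (A @ x - b)))
    return x

mix = nnls_pg(profiles, sample)
print("estimated source mix:", np.round(mix, 3))
```

    The non-negativity constraint is what makes a zero allocation meaningful: the absent source is driven to exactly zero rather than to a small negative artifact.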

  10. Quantum Gravity Experiments

    Directory of Open Access Journals (Sweden)

    Cahill R. T.

    2015-10-01

    A new quantum gravity experiment is reported with the data confirming the generalisation of the Schrödinger equation to include the interaction of the wave function with dynamical space. Dynamical space turbulence, via this interaction process, raises and lowers the energy of the electron wave function, which is detected by observing consequent variations in the electron quantum barrier tunnelling rate in reverse-biased Zener diodes. This process has previously been reported and enabled the measurement of the speed of the dynamical space flow, which is consistent with numerous other detection experiments. The interaction process is dependent on the angle between the dynamical space flow velocity and the direction of the electron flow in the diode, and this dependence is experimentally demonstrated. This interaction process explains gravity as an emergent quantum process, so unifying quantum phenomena and gravity. Gravitational waves are easily detected.

  11. Gravity and strings

    CERN Document Server

    Ortín, Tomás

    2015-01-01

    Self-contained and comprehensive, this definitive new edition of Gravity and Strings is a unique resource for graduate students and researchers in theoretical physics. From basic differential geometry through to the construction and study of black-hole and black-brane solutions in quantum gravity - via all the intermediate stages - this book provides a complete overview of the intersection of gravity, supergravity, and superstrings. Now fully revised, this second edition covers an extensive array of topics, including new material on non-linear electric-magnetic duality, the electric-tensor formalism, matter-coupled supergravity, supersymmetric solutions, the geometries of scalar manifolds appearing in 4- and 5-dimensional supergravities, and much more. Covering reviews of important solutions and numerous solution-generating techniques, and accompanied by an exhaustive index and bibliography, this is an exceptional reference work.

  12. Solitons in Newtonian gravity

    International Nuclear Information System (INIS)

    Goetz, G.

    1988-01-01

    It is shown that the plane-wave solutions for the equations governing the motion of a self-gravitating isothermal fluid in Newtonian hydrodynamics are generated by a sine-Gordon equation which is solvable by an 'inverse scattering' transformation. A transformation procedure is outlined by means of which one can construct solutions of the gravity system out of a pair of solutions of the sine-Gordon equation, which are interrelated via an auto-Baecklund transformation. In general the solutions to the gravity system are obtained in a parametric representation in terms of characteristic coordinates. All solutions of the gravity system generated by the one-and two-soliton solutions of the sine-Gordon equation can be constructed explicitly. These might provide models for the evolution of flat structures as they are predicted to arise in the process of galaxy formation. (author)
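
    The underlying sine-Gordon solutions can be checked directly; the map from them to the self-gravitating fluid is not reproduced here, but the one-soliton (kink) solution of φ_tt − φ_xx + sin φ = 0, namely φ = 4 arctan exp(γ(x − vt)) with γ = 1/√(1 − v²), can be verified numerically by finite differences:

```python
import numpy as np

def kink(x, t, v):
    """One-soliton (kink) solution of phi_tt - phi_xx + sin(phi) = 0."""
    gamma = 1.0 / np.sqrt(1.0 - v**2)
    return 4.0 * np.arctan(np.exp(gamma * (x - v * t)))

# Evaluate the PDE residual on a grid via central finite differences.
v, h = 0.4, 1e-3
x = np.linspace(-5.0, 5.0, 401)
t = 0.7
phi_tt = (kink(x, t + h, v) - 2 * kink(x, t, v) + kink(x, t - h, v)) / h**2
phi_xx = (kink(x + h, t, v) - 2 * kink(x, t, v) + kink(x - h, t, v)) / h**2
residual = phi_tt - phi_xx + np.sin(kink(x, t, v))
print("max |PDE residual|:", np.abs(residual).max())
```

    The residual is at the level of the O(h²) discretisation error, confirming the solution; the two-soliton case can be checked the same way from its closed form.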

  13. Stochastic quantum gravity

    International Nuclear Information System (INIS)

    Rumpf, H.

    1987-01-01

    We begin with a naive application of the Parisi-Wu scheme to linearized gravity. This will lead into trouble as one peculiarity of the full theory, the indefiniteness of the Euclidean action, shows up already at this level. After discussing some proposals to overcome this problem, Minkowski space stochastic quantization will be introduced. This will still not result in an acceptable quantum theory of linearized gravity, as the Feynman propagator turns out to be non-causal. This defect will be remedied only after a careful analysis of general covariance in stochastic quantization has been performed. The analysis requires the notion of a metric on the manifold of metrics, and a natural candidate for this is singled out. With this a consistent stochastic quantization of Einstein gravity becomes possible. It is even possible, at least perturbatively, to return to the Euclidean regime. 25 refs. (Author)

  14. No slip gravity

    Science.gov (United States)

    Linder, Eric V.

    2018-03-01

    A subclass of the Horndeski modified gravity theory we call No Slip Gravity has particularly interesting properties: 1) a speed of gravitational wave propagation equal to the speed of light, 2) equality between the effective gravitational coupling strengths to matter and light, Gmatter and Glight, hence no slip between the metric potentials, yet a difference from Newton's constant, and 3) suppressed growth to give better agreement with galaxy clustering observations. We explore the characteristics and implications of this theory, and project observational constraints. We also give a simple expression for the ratio of the gravitational wave standard siren distance to the photon standard candle distance, in this theory and others, and enable a direct comparison of modified gravity in structure growth and in gravitational waves, an important crosscheck.

  15. The quantization of gravity

    CERN Document Server

    Gerhardt, Claus

    2018-01-01

    A unified quantum theory incorporating the four fundamental forces of nature is one of the major open problems in physics. The Standard Model combines electro-magnetism, the strong force and the weak force, but ignores gravity. The quantization of gravity is therefore a necessary first step to achieve a unified quantum theory. In this monograph a canonical quantization of gravity has been achieved by quantizing a geometric evolution equation resulting in a gravitational wave equation in a globally hyperbolic spacetime. Applying the technique of separation of variables we obtain eigenvalue problems for temporal and spatial self-adjoint operators where the temporal operator has a pure point spectrum with eigenvalues $\\lambda_i$ and related eigenfunctions, while, for the spatial operator, it is possible to find corresponding eigendistributions for each of the eigenvalues $\\lambda_i$, if the Cauchy hypersurface is asymptotically Euclidean or if the quantized spacetime is a black hole with a negative cosmological ...

  16. Development of multiple source data processing for structural analysis at a regional scale. [digital remote sensing in geology

    Science.gov (United States)

    Carrere, Veronique

    1990-01-01

    Various image processing techniques developed for enhancement and extraction of linear features, of interest to the structural geologist, from digital remote sensing, geologic, and gravity data, are presented. These techniques include: (1) automatic detection of linear features and construction of rose diagrams from Landsat MSS data; (2) enhancement of principal structural directions using selective filters on Landsat MSS, Spacelab panchromatic, and HCMM NIR data; (3) directional filtering of Spacelab panchromatic data using Fast Fourier Transform; (4) detection of linear/elongated zones of high thermal gradient from thermal infrared data; and (5) extraction of strong gravimetric gradients from digitized Bouguer anomaly maps. Processing results can be compared to each other through the use of a geocoded database to evaluate the structural importance of each lineament according to its depth: superficial structures in the sedimentary cover, or deeper ones affecting the basement. These image processing techniques were successfully applied to achieve a better understanding of the transition between Provence and the Pyrenees structural blocks, in southeastern France, for an improved structural interpretation of the Mediterranean region.
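
    Technique (3), directional filtering in the Fourier domain, can be sketched on a synthetic image: keep only spectral energy within an angular wedge around a chosen azimuth, suppressing lineaments of other orientations. The scene and wedge width below are illustrative, not the Spacelab processing parameters.

```python
import numpy as np

# Synthetic "scene": horizontal lineaments (stripes varying along y) plus a
# 45-degree set; both periods divide the grid so their spectra are clean.
n = 128
y, x = np.mgrid[0:n, 0:n]
img = np.sin(2 * np.pi * y / 8.0) + np.sin(2 * np.pi * (x + y) / 16.0)

# Directional filter: keep only spectral energy within +/-15 degrees of a
# chosen azimuth in the frequency plane.
F = np.fft.fftshift(np.fft.fft2(img))
fy, fx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
theta = np.degrees(np.arctan2(fy, fx)) % 180.0      # orientation of each bin

def directional(az_deg, halfwidth=15.0):
    d = np.abs((theta - az_deg + 90.0) % 180.0 - 90.0)  # angular distance
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * (d <= halfwidth))))

horizontal_only = directional(90.0)   # y-varying stripes sit at 90 degrees
print("max residual after removing the 45-degree set:",
      np.abs(horizontal_only - np.sin(2 * np.pi * y / 8.0)).max())
```

    Sweeping the azimuth and comparing the filtered outputs is one way to build the orientation statistics that feed a rose diagram.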

  17. Airborne Gravity: NGS' Gravity Data for EN08 (2013)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for New York, Vermont, New Hampshire, Massachusetts, Maine, and Canada collected in 2013 over 1 survey. This data set is part of the Gravity...

  18. Airborne Gravity: NGS' Gravity Data for TS01 (2014)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for Puerto Rico and the Virgin Islands collected in 2009 over 1 survey. This data set is part of the Gravity for the Re-definition of the...

  19. Airborne Gravity: NGS' Gravity Data for AN08 (2016)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for Alaska collected in 2016 over one survey. This data set is part of the Gravity for the Re-definition of the American Vertical Datum...

  20. Airborne Gravity: NGS' Gravity Data for CN02 (2013 & 2014)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for Nebraska collected in 2013 & 2014 over 3 surveys. This data set is part of the Gravity for the Re-definition of the American Vertical...

  1. Airborne Gravity: NGS' Gravity Data for EN01 (2011)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for New York, Canada, and Lake Ontario collected in 2011 over 1 survey. This data set is part of the Gravity for the Re-definition of the...

  2. Airborne Gravity: NGS' Gravity Data for AN03 (2010)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for Alaska collected in 2010 and 2012 over 2 surveys. This data set is part of the Gravity for the Re-definition of the American Vertical Datum...

  3. Airborne Gravity: NGS' Gravity Data for EN06 (2016)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for Maine, Canada, and the Atlantic Ocean collected in 2012 over 2 surveys. This data set is part of the Gravity for the Re-definition of the...

  4. Airborne Gravity: NGS' Gravity Data for ES01 (2013)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Airborne gravity data for Florida, the Bahamas, and the Atlantic Ocean collected in 2013 over 1 survey. This data set is part of the Gravity for the Re-definition of...

  5. A Combined Gravity Compensation Method for INS Using the Simplified Gravity Model and Gravity Database.

    Science.gov (United States)

    Zhou, Xiao; Yang, Gongliu; Wang, Jing; Wen, Zeyang

    2018-05-14

    In recent decades, gravity compensation has become an important way to reduce the position error of an inertial navigation system (INS), especially for a high-precision INS, because of the extensive application of high-precision inertial sensors (accelerometers and gyros). This paper first derives the INS solution error in the presence of gravity disturbance and simulates the results. It then proposes a combined gravity compensation method using a simplified gravity model and a gravity database. The new combined method consists of two steps. Step 1 subtracts the normal gravity using a simplified gravity model. Step 2 first obtains the gravity disturbance along the trajectory of the carrier with the help of ELM training based on measured gravity data (provided by the Institute of Geodesy and Geophysics, Chinese Academy of Sciences), and then feeds it into the error equations of the INS, which account for the gravity disturbance, to further improve the navigation accuracy. The effectiveness and feasibility of this gravity compensation method are verified through vehicle tests in two different regions: one in flat terrain with mild gravity variation and the other in complex terrain with strong gravity variation. Over the 2-h vehicle tests, positioning accuracy improved by 20% and 38%, respectively, after the gravity was compensated by the proposed method.
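Step 1 of the scheme described above can be illustrated concretely. The sketch below assumes the WGS84 Somigliana closed-form formula as the "simplified gravity model"; the paper's actual model and the ELM-based Step 2 are not reproduced here.

```python
import math

# WGS84 constants for Somigliana's closed-form normal gravity formula
GAMMA_E = 9.7803253359    # normal gravity at the equator, m/s^2
K = 0.00193185265241      # Somigliana's constant
E2 = 0.00669437999013     # first eccentricity squared

def normal_gravity(lat_deg):
    """Normal gravity on the WGS84 ellipsoid at geodetic latitude lat_deg."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return GAMMA_E * (1.0 + K * s2) / math.sqrt(1.0 - E2 * s2)

def gravity_disturbance(measured_g, lat_deg):
    """Step 1: subtract the model's normal gravity from a measured value.
    The residual disturbance is what Step 2 would estimate from a database."""
    return measured_g - normal_gravity(lat_deg)

print(normal_gravity(45.0))   # mid-latitude normal gravity, m/s^2
```

Normal gravity grows from about 9.780 m/s^2 at the equator to about 9.832 m/s^2 at the poles; the disturbance left after Step 1 is orders of magnitude smaller, which is why a database-driven Step 2 pays off for high-precision INS.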

  6. Real-time particle monitor calibration factors and PM2.5 emission factors for multiple indoor sources.

    Science.gov (United States)

    Dacunto, Philip J; Cheng, Kai-Chung; Acevedo-Bolton, Viviana; Jiang, Ruo-Ting; Klepeis, Neil E; Repace, James L; Ott, Wayne R; Hildemann, Lynn M

    2013-08-01

    Indoor sources can greatly contribute to personal exposure to particulate matter less than 2.5 μm in diameter (PM2.5). To accurately assess PM2.5 mass emission factors and concentrations, real-time particle monitors must be calibrated for individual sources. Sixty-six experiments were conducted with a common, real-time laser photometer (TSI SidePak™ Model AM510 Personal Aerosol Monitor) and a filter-based PM2.5 gravimetric sampler to quantify the monitor calibration factors (CFs), and to estimate emission factors for common indoor sources including cigarettes, incense, cooking, candles, and fireplaces. Calibration factors for these indoor sources were all significantly less than the factory-set CF of 1.0, ranging from 0.32 (cigarette smoke) to 0.70 (hamburger). Stick incense had a CF of 0.35, while fireplace emissions ranged from 0.44-0.47. Cooking source CFs ranged from 0.41 (fried bacon) to 0.65-0.70 (fried pork chops, salmon, and hamburger). The CFs of combined sources (e.g., cooking and cigarette emissions mixed) were linear combinations of the CFs of the component sources. The highest PM2.5 emission factors per time period were from burned foods and fireplaces (15-16 mg min^-1), and the lowest from cooking foods such as pizza and ground beef (0.1-0.2 mg min^-1).
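The correction and combination rules above can be sketched as follows. The mass-fraction weighting for combined sources is an illustrative assumption; the abstract states only that combined CFs are linear combinations of the component CFs.

```python
def corrected_pm25(photometer_reading, cf):
    """True PM2.5 concentration = CF x raw photometer reading.

    The factory-set CF is 1.0; source-specific CFs in the study ranged
    from 0.32 (cigarette smoke) to 0.70 (hamburger)."""
    return cf * photometer_reading

def combined_cf(cfs, mass_fractions):
    """CF of a source mixture as a linear combination of component CFs,
    here weighted by (assumed) mass fractions."""
    assert abs(sum(mass_fractions) - 1.0) < 1e-9
    return sum(c * f for c, f in zip(cfs, mass_fractions))

# A raw reading of 100 ug/m^3 from cigarette smoke (CF = 0.32)
print(corrected_pm25(100.0, 0.32))             # → 32.0
# Equal-mass mixture of cigarette (0.32) and hamburger (0.70) emissions
print(combined_cf([0.32, 0.70], [0.5, 0.5]))   # ≈ 0.51
```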

  7. Use of a Bayesian isotope mixing model to estimate proportional contributions of multiple nitrate sources in surface water

    International Nuclear Information System (INIS)

    Xue Dongmei; De Baets, Bernard; Van Cleemput, Oswald; Hennessy, Carmel; Berglund, Michael; Boeckx, Pascal

    2012-01-01

    To identify different NO3− sources in surface water and to estimate their proportional contribution to the nitrate mixture in surface water, a dual-isotope approach and a Bayesian isotope mixing model were applied to six different surface waters affected by agriculture, greenhouses in an agricultural area, and households. Annual mean δ15N–NO3− values were between 8.0 and 19.4‰, while annual mean δ18O–NO3− values ranged from 4.5 to 30.7‰. SIAR was used to estimate the proportional contribution of five potential NO3− sources (NO3− in precipitation, NO3− fertilizer, NH4+ in fertilizer and rain, soil N, and manure and sewage). SIAR showed that "manure and sewage" contributed the most, "soil N", "NO3− fertilizer" and "NH4+ in fertilizer and rain" contributed intermediate amounts, and "NO3− in precipitation" contributed the least. The SIAR output can be considered a "fingerprint" of the NO3− source contributions. However, the wide range of isotope values observed in the surface waters and in the NO3− sources limits its applicability. - Highlights: ► The dual-isotope approach (δ15N– and δ18O–NO3−) identifies dominant nitrate sources in 6 surface waters. ► The SIAR model estimates proportional contributions for 5 nitrate sources. ► SIAR is a reliable approach to assess temporal and spatial variations of different NO3− sources. ► The wide range of isotope values observed in surface water and in the nitrate sources limits its applicability. - This paper successfully applied a dual-isotope approach and a Bayesian isotopic mixing model to identify and quantify 5 potential nitrate sources in surface water.
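The mixing model behind SIAR assumes the mixture's isotopic signature is a fraction-weighted average of the source signatures, with the fractions summing to one. Below is a toy rejection-sampling sketch of that idea, not the actual SIAR algorithm; the three source signatures and the observed mixture are hypothetical numbers, not the study's data, and the real study used five sources with full isotope distributions.

```python
import random

# Hypothetical (d15N, d18O) signatures for three sources
sources = {
    "manure_sewage": (12.0, 5.0),
    "soil_N": (5.0, 4.0),
    "precipitation": (2.0, 25.0),
}
observed = (10.0, 7.0)  # hypothetical mixture signature

def uniform_simplex(n):
    """Uniform sample of n fractions summing to 1 (normalized exponentials)."""
    w = [random.expovariate(1.0) for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

def mixture(fractions):
    """Fraction-weighted average of the source signatures."""
    sigs = list(sources.values())
    return tuple(sum(f * sig[j] for f, sig in zip(fractions, sigs))
                 for j in range(2))

random.seed(1)
accepted = [f for f in (uniform_simplex(3) for _ in range(50000))
            if all(abs(m - o) < 0.5 for m, o in zip(mixture(f), observed))]

# Posterior-like mean contribution of each source
means = [sum(f[i] for f in accepted) / len(accepted) for i in range(3)]
print({name: round(m, 2) for name, m in zip(sources, means)})
```

SIAR itself fits a hierarchical Bayesian model with source variability and residual error terms; this sketch only conveys the mass-balance constraint that drives the "fingerprint".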

  8. Miniaturised Gravity Sensors for Remote Gravity Surveys.

    Science.gov (United States)

    Middlemiss, R. P.; Bramsiepe, S. G.; Hough, J.; Paul, D. J.; Rowan, S.; Samarelli, A.; Hammond, G.

    2016-12-01

    Gravimetry lets us see the world from a completely different perspective. The ability to measure tiny variations in gravitational acceleration (g) allows one to see not just the Earth's gravitational pull, but the influence of smaller objects. The more accurate the gravimeter, the smaller the objects one can see. Gravimetry has applications in many different fields: from tracking magma moving under volcanoes before eruptions, to locating hidden tunnels. The top commercial gravimeters weigh tens of kg and cost at least $100,000, limiting the situations in which they can be used. By contrast, smart phones use a MEMS (microelectromechanical system) accelerometer that can measure the orientation of the device. These are nowhere near sensitive or stable enough for gravimetry, but they are cheap, lightweight and mass-producible. At Glasgow University we have developed a MEMS device with both the stability and the sensitivity for useful gravimetric measurements. This was demonstrated by a measurement of the Earth tides - the first time this has been achieved with a MEMS sensor. A gravimeter of this size opens up the possibility of new gravity imaging modalities. Thousands of gravimeters could be networked over a survey site, storing data on an SD card or communicating wirelessly with a remote location. These devices could also be small enough to be carried by UAVs: airborne gravity surveys could be carried out at low altitude by multiple UAVs, or UAVs could be used to deliver ground-based gravimeters to remote or inaccessible locations.

  9. Surfing surface gravity waves

    Science.gov (United States)

    Pizzo, Nick

    2017-11-01

    A simple criterion for water particles to surf an underlying surface gravity wave is presented. It is found that particles travelling near the phase speed of the wave, in a geometrically confined region on the forward face of the crest, increase in speed. The criterion is derived using the equation of John (Commun. Pure Appl. Maths, vol. 6, 1953, pp. 497-503) for the motion of a zero-stress free surface under the action of gravity. As an example, a breaking water wave is theoretically and numerically examined. Implications for upper-ocean processes, for both shallow- and deep-water waves, are discussed.

  10. Towards a quantum gravity

    International Nuclear Information System (INIS)

    Romney, B.; Barrau, A.; Vidotto, F.; Le Meur, H.; Noui, K.

    2011-01-01

    The loop quantum gravity is the only theory that proposes a quantum description of space-time and therefore of gravitation. This theory predicts that space is not infinitely divisible but that is has a granular structure at the Planck scale (10 -35 m). Another feature of loop quantum gravity is that it gets rid of the Big-Bang singularity: our expanding universe may come from the bouncing of a previous contracting universe, in this theory the Big-Bang is replaced with a big bounce. The loop quantum theory predicts also the huge number of quantum states that accounts for the entropy of large black holes. (A.C.)

  11. Terrestrial gravity data analysis for interim gravity model improvement

    Science.gov (United States)

    1987-01-01

    This is the first status report for the Interim Gravity Model research effort that was started on June 30, 1986. The basic theme of this study is to develop appropriate models and adjustment procedures for estimating potential coefficients from terrestrial gravity data. The plan is to use the latest gravity data sets to produce coefficient estimates as well as to provide normal equations to NASA for use in the TOPEX/POSEIDON gravity field modeling program.

  12. Lithologic boundaries from gravity and magnetic anomalies over ...

    Indian Academy of Sciences (India)

    Pramod Kumar Yadav

    2018-03-02

    Mar 2, 2018 ... nature of causative source using Euler depth solutions and radially averaged power spectrum (RAPS). Residual anomaly maps of gravity and ... the lateral boundaries and nature of the source. It seems that the source is of ...

  13. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    Science.gov (United States)

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and requirements of the independent studies.This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  14. Multiple-source tracking: Investigating sources of pathogens, nutrients, and sediment in the Upper Little River Basin, Kentucky, water years 2013–14

    Science.gov (United States)

    Crain, Angela S.; Cherry, Mac A.; Williamson, Tanja N.; Bunch, Aubrey R.

    2017-09-20

    The South Fork Little River (SFLR) and the North Fork Little River (NFLR) are two major headwater tributaries that flow into the Little River just south of Hopkinsville, Kentucky. Both tributaries are included in those water bodies in Kentucky and across the Nation that have been reported with declining water quality. Each tributary has been listed by the Kentucky Energy and Environment Cabinet—Kentucky Division of Water in the 303(d) List of Waters for Kentucky Report to Congress as impaired by nutrients, pathogens, and sediment for contact recreation from point and nonpoint sources since 2002. In 2009, the Kentucky Energy and Environment Cabinet—Kentucky Division of Water developed a pathogen total maximum daily load (TMDL) for the Little River Basin including the SFLR and NFLR Basins. Future nutrient and suspended-sediment TMDLs are planned once nutrient criteria and suspended-sediment protocols have been developed for Kentucky. In this study, different approaches were used to identify potential sources of fecal-indicator bacteria (FIB), nitrate, and suspended sediment; to inform the TMDL process; and to aid in the implementation of effective watershed-management activities. The main focus of source identification was in the SFLR Basin.To begin understanding the potential sources of fecal contamination, samples were collected at 19 sites for densities of FIB (E. coli) in water and fluvial sediment and at 11 sites for Bacteroidales genetic markers (General AllBac, human HF183, ruminant BoBac, canid BacCan, and waterfowl GFD) during the recreational season (May through October) in 2013 and 2014. Results indicated 34 percent of all E. coli water samples (n=227 samples) did not meet the U.S. Environmental Protection Agency 2012 recommended national criteria for primary recreational waters. No criterion currently exists for E. coli in fluvial sediment. By use of the Spearman’s rank correlation test, densities of FIB in fluvial sediments were observed to have a

  15. Seasonal gravity change at Yellowstone caldera

    Science.gov (United States)

    Poland, M. P.; de Zeeuw-van Dalfsen, E.

    2017-12-01

    The driving forces behind Yellowstone's dynamic deformation, vigorous hydrothermal system, and abundant seismicity are usually ascribed to "magmatic fluids," which could refer to magma, water, volatiles, or some combination. Deformation data alone cannot distinguish the relative importance of these fluids. Gravity measurements, however, provide an indication of mass change over time and, when combined with surface displacements, can constrain the density of subsurface fluids. Unfortunately, several decades of gravity surveys at Yellowstone have yielded ambiguous results. We suspect that the difficulty in interpreting Yellowstone gravity data is due to seasonal variations in environmental conditions—especially surface and ground water. Yellowstone gravity surveys are usually carried out at the same time of year (generally late summer) to minimize the impact of seasonality. Nevertheless, surface and subsurface water levels are not likely to be constant from year to year, given annual differences in precipitation. To assess the overall magnitude of seasonal gravity changes, we conducted gravity surveys of benchmarks in and around Yellowstone caldera in May, July, August, and October 2017. Our goal was to characterize seasonal variations due to snow melt/accumulation, changes in river and lake levels, changes in groundwater levels, and changes in hydrothermal activity. We also hope to identify sites that show little variation in gravity over the course of the 2017 surveys, as these locations may be less prone to seasonal changes and more likely to detect small variations due to magmatic processes. Preliminary examination of data collected in May and July 2017 emphasizes the importance of site location relative to sources of water. For example, a site on the banks of the Yellowstone River showed a gravity increase of several hundred microgals associated with a 50 cm increase in the river level. A high-altitude site far from rivers and lakes, in contrast, showed a

  16. Laboratory experiments to test relativistic gravity

    International Nuclear Information System (INIS)

    Braginsky, V.B.; Caves, C.M.; Thorne, K.S.

    1977-01-01

    Advancing technology will soon make possible a new class of gravitation experiments: pure laboratory experiments with laboratory sources of non-Newtonian gravity and laboratory detectors. This paper proposes seven such experiments; and for each one it describes, briefly, the dominant sources of noise and the technology required. Three experiments would utilize a high-Q torque balance as the detector. They include (i) an ''Ampere-type'' experiment to measure the gravitational spin-spin coupling of two rotating bodies, (ii) a search for time changes of the gravitation constant, and (iii) a measurement of the gravity produced by magnetic stresses and energy. Three experiments would utilize a high-Q dielectric crystal as the detector. They include (i) a ''Faraday-type'' experiment to measure the ''electric-type'' gravity produced by a time-changing flux of ''magnetic-type'' gravity, (ii) a search for ''preferred-frame'' and ''preferred-orientation'' effects in gravitational coupling, and (iii) a measurement of the gravitational field produced by protons moving in a storage ring at nearly the speed of light. One experiment would use a high-Q toroidal microwave cavity as detector to search for the dragging of inertial frames by a rotating body

  17. Part 2. Development of Enhanced Statistical Methods for Assessing Health Effects Associated with an Unknown Number of Major Sources of Multiple Air Pollutants.

    Science.gov (United States)

    Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford

    2015-06-01

    A major difficulty with assessing source-specific health effects is that source-specific exposures cannot be measured directly; rather, they need to be estimated by a source-apportionment method such as multivariate receptor modeling. The uncertainty in source apportionment (uncertainty in source-specific exposure estimates and model uncertainty due to the unknown number of sources and identifiability conditions) has been largely ignored in previous studies. Also, spatial dependence of multipollutant data collected from multiple monitoring sites has not yet been incorporated into multivariate receptor modeling. The objectives of this project are (1) to develop a multipollutant approach that incorporates both sources of uncertainty in source-apportionment into the assessment of source-specific health effects and (2) to develop enhanced multivariate receptor models that can account for spatial correlations in the multipollutant data collected from multiple sites. We employed a Bayesian hierarchical modeling framework consisting of multivariate receptor models, health-effects models, and a hierarchical model on latent source contributions. For the health model, we focused on the time-series design in this project. Each combination of number of sources and identifiability conditions (additional constraints on model parameters) defines a different model. We built a set of plausible models with extensive exploratory data analyses and with information from previous studies, and then computed posterior model probability to estimate model uncertainty. Parameter estimation and model uncertainty estimation were implemented simultaneously by Markov chain Monte Carlo (MCMC) methods. We validated the methods using simulated data. We illustrated the methods using PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter) speciation data and mortality data from Phoenix, Arizona, and Houston, Texas. The Phoenix data included counts of cardiovascular deaths and daily PM2

  18. Ionizing radiation sources: very diversified means, multiple applications and a changing regulatory environment. Conference proceedings; Les sources de rayonnements ionisants: des moyens tres diversifies, des applications multiples et une reglementation en evolution. Recueil des presentations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-11-15

    This document brings together the available presentations given at the conference organised by the French society of radiation protection about ionizing radiation source means, applications and regulatory environment. Twenty eight presentations (slides) are compiled in this document and deal with: 1 - Overview of sources - some quantitative data from the national inventory of ionizing radiation sources (Yann Billarand, IRSN); 2 - Overview of sources (Jerome Fradin, ASN); 3 - Regulatory framework (Sylvie Rodde, ASN); 4 - Alternatives to Iridium radiography - the case of pressure devices at the manufacturing stage (Henri Walaszek, Cetim; Bruno Kowalski, Welding Institute); 5 - Dosimetric stakes of medical scanner examinations (Jean-Louis Greffe, Charleroi hospital of Medical University); 6 - The removal of ionic smoke detectors (Bruno Charpentier, ASN); 7 - Joint-activity and reciprocal liabilities - Organisation of labour risk prevention in case of companies joint-activity (Paulo Pinto, DGT); 8 - Consideration of gamma-graphic testing in the organization of a unit outage activities (Jean-Gabriel Leonard, EDF); 9 - Radiological risk control at a closed and independent work field (Stephane Sartelet, Areva); 10 - Incidents and accidents status and typology (Pascale Scanff, IRSN); 11 - Regional overview of radiation protection significant events (Philippe Menechal, ASN); 12 - Incident leading to a tritium contamination in and urban area - consequences and experience feedback (Laurence Fusil, CEA); 13 - Experience feedback - loss of sealing of a calibration source (Philippe Mougnard, Areva); 14 - Blocking incident of a {sup 60}Co source (Bruno Delille, Salvarem); 15 - Triggering of gantry's alarm: status of findings (Philippe Prat, Syctom); 16 - Non-medical electric devices: regulatory changes (Sophie Dagois, IRSN; Jerome Fradin, ASN); 17 - Evaluation of the dose equivalent rate in pulsed fields: method proposed by the IRSN and implementation test (Laurent Donadille

  19. Full Tensor Gradient of Simulated Gravity Data for Prospect Scale Delineation

    Directory of Open Access Journals (Sweden)

    Hendra Grandis

    2014-07-01

    Gravity gradiometry measurement allows imaging of anomalous sources in more detail than conventional gravity data. The availability of this new technique is limited to airborne gravity surveys using very specific instrumentation. In principle, the gravity gradients can be calculated from the vertical component of the gravity commonly measured in a ground-based gravity survey. We present a calculation of the full tensor gradient (FTG) of the gravity employing the Fourier transformation. The calculation was applied to synthetic data associated with a simple block model and also with a more realistic model. The latter corresponds to a 3D model in which a thin coal layer is embedded in a sedimentary environment. Our results show the utility of the FTG of the gravity for prospect scale delineation.
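The Fourier-domain calculation referred to above can be sketched with the standard wavenumber relations for a harmonic field; this is a generic spectral sketch assuming a regular, implicitly periodic grid, and the paper's implementation details may differ.

```python
import numpy as np

def ftg_from_gz(gz, dx, dy):
    """Full tensor gradient from gridded vertical gravity gz.

    With k = sqrt(kx^2 + ky^2) and G = F[gz], harmonicity gives
      F[Tzz] = k G,            F[Txz] = i kx G,        F[Tyz] = i ky G,
      F[Txx] = -kx^2 G / k,    F[Tyy] = -ky^2 G / k,   F[Txy] = -kx ky G / k.
    """
    ny, nx = gz.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    K = np.hypot(KX, KY)
    Ksafe = np.where(K == 0.0, 1.0, K)   # avoid 0/0 at the zero wavenumber
    G = np.fft.fft2(gz)

    def inv(spec):
        return np.real(np.fft.ifft2(spec))

    return {
        "zz": inv(K * G),
        "xz": inv(1j * KX * G),
        "yz": inv(1j * KY * G),
        "xx": inv(-KX**2 / Ksafe * G),
        "yy": inv(-KY**2 / Ksafe * G),
        "xy": inv(-KX * KY / Ksafe * G),
    }
```

A quick sanity check on any output is Laplace's equation: the trace Txx + Tyy + Tzz should vanish to numerical precision.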

  20. L-1 constraint in Liouville gravity

    International Nuclear Information System (INIS)

    Kitazawa, Y.

    1992-01-01

    In this paper, the authors study recursion relations among the amplitudes which involve discrete states in c = 1 Liouville gravity on the sphere. The authors find that the spin J = 1/2 discrete state gives rise to an L_-1-type recursion relation. Multiple-point correlation functions are determined recursively from fewer-point functions by this recursion relation. The authors further point out that analogs of the J = 1/2 state exist and give rise to the same L_-1-type recursion relation.

  1. Gravity Data for South America

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (152,624 records) were compiled by the University of Texas at Dallas. This data base was received in June 1992. Principal gravity parameters...

  2. Interior Alaska Gravity Station Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data total 9416 records. This data base was received in March 1997. Principal gravity parameters include Free-air Anomalies which have been...

  3. Gravity Station Data for Spain

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data total 28493 records. This data base was received in April 1997. Principal gravity parameters include Free-air Anomalies which have been...

  4. Gravity Station Data for Portugal

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data total 3064 records. This data base was received in April 1997. Principal gravity parameters include Free-air Anomalies which have been...

  5. Thermosyphon Flooding in Reduced Gravity Environments Test Results

    Science.gov (United States)

    Gibson, Marc A.; Jaworske, Donald A.; Sanzi, Jim; Ljubanovic, Damir

    2013-01-01

    The condenser flooding phenomenon associated with gravity-aided two-phase thermosyphons was studied using parabolic flights to obtain the desired reduced gravity environment (RGE). The experiment was designed and built to test a total of twelve titanium-water thermosyphons in multiple gravity environments, with the goal of developing a model that accurately explains the correlation between gravitational forces and the maximum axial heat transfer limit associated with condenser flooding. Results from laboratory testing and parabolic flights are included in this report as Part I of a two-part series. The data analysis and correlations are included in a follow-on paper.
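The gravity dependence of the condenser flooding limit is commonly captured by a Kutateladze-type counter-current flooding correlation, in which the maximum heat flux scales as g^(1/4). The generic textbook form below is an illustrative assumption, not the correlation developed in this report.

```python
def flooding_limit(g, rho_v, rho_l, h_fg, sigma, f=3.2):
    """Kutateladze-type flooding/entrainment limit, W/m^2 of vapor core:
        q_max = f * h_fg * rho_v**0.5 * (sigma * g * (rho_l - rho_v))**0.25
    g: gravity [m/s^2], rho_v/rho_l: vapor/liquid density [kg/m^3],
    h_fg: latent heat [J/kg], sigma: surface tension [N/m],
    f: empirical prefactor (~3.2 in the generic correlation)."""
    return f * h_fg * rho_v**0.5 * (sigma * g * (rho_l - rho_v))**0.25

# Approximate saturated-water properties near 100 C
props = dict(rho_v=0.60, rho_l=958.0, h_fg=2.257e6, sigma=0.0589)
q_earth = flooding_limit(9.81, **props)
q_lunar = flooding_limit(9.81 / 6.0, **props)
print(q_lunar / q_earth)   # ≈ (1/6)**0.25 ≈ 0.64
```

The g^(1/4) scaling is what parabolic-flight data of this kind would test: reducing gravity to lunar levels cuts the generic limit to roughly 64% of its 1-g value.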

  6. Improving a maximum horizontal gradient algorithm to determine geological body boundaries and fault systems based on gravity data

    Science.gov (United States)

    Van Kha, Tran; Van Vuong, Hoang; Thanh, Do Duc; Hung, Duong Quoc; Anh, Le Duc

    2018-05-01

    The maximum horizontal gradient method was first proposed by Blakely and Simpson (1986) for determining the boundaries between geological bodies with different densities. The method involves the comparison of a center point with its eight nearest neighbors in four directions within each 3 × 3 calculation grid. The horizontal location and magnitude of the maximum values are found by interpolating a second-order polynomial through the trio of points provided that the magnitude of the middle point is greater than its two nearest neighbors in one direction. In theoretical models of multiple sources, however, the above condition does not allow the maximum horizontal locations to be fully located, and it could be difficult to correlate the edges of complicated sources. In this paper, the authors propose an additional condition to identify more maximum horizontal locations within the calculation grid. This additional condition will improve the method algorithm for interpreting the boundaries of magnetic and/or gravity sources. The improved algorithm was tested on gravity models and applied to gravity data for the Phu Khanh basin on the continental shelf of the East Vietnam Sea. The results show that the additional locations of the maximum horizontal gradient could be helpful for connecting the edges of complicated source bodies.
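The 3 × 3 comparison and parabolic refinement described above can be sketched as follows. This reproduces only the original Blakely and Simpson (1986) test, without their quality index and without the additional condition proposed by the authors.

```python
import numpy as np

def hgm(g, dx=1.0, dy=1.0):
    """Horizontal gradient magnitude of a gridded field."""
    gy, gx = np.gradient(g, dy, dx)
    return np.hypot(gx, gy)

def blakely_simpson(h):
    """Locate maxima of h (e.g., the horizontal gradient magnitude):
    in each 3x3 window, compare the center with its two neighbors along
    the row, the column, and both diagonals; where the center exceeds
    both, refine the location with the vertex of the second-order
    polynomial through the trio."""
    peaks = []
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]
    ny, nx = h.shape
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            b = h[i, j]
            for di, dj in directions:
                a, c = h[i - di, j - dj], h[i + di, j + dj]
                if b > a and b > c:
                    # vertex of the parabola through (-1, a), (0, b), (1, c)
                    off = 0.5 * (a - c) / (a - 2.0 * b + c)
                    peaks.append((i + off * di, j + off * dj))
    return peaks
```

On a synthetic quadratic ridge the refined positions recover the sub-grid ridge location exactly, which is the point of the polynomial interpolation step.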

  7. Having a Lot of a Good Thing: Multiple Important Group Memberships as a Source of Self-Esteem

    Science.gov (United States)

    Jetten, Jolanda; Branscombe, Nyla R.; Haslam, S. Alexander; Haslam, Catherine; Cruwys, Tegan; Jones, Janelle M.; Cui, Lijuan; Dingle, Genevieve; Liu, James; Murphy, Sean; Thai, Anh; Walter, Zoe; Zhang, Airong

    2015-01-01

    Membership in important social groups can promote a positive identity. We propose and test an identity resource model in which personal self-esteem is boosted by membership in additional important social groups. Belonging to multiple important group memberships predicts personal self-esteem in children (Study 1a), older adults (Study 1b), and former residents of a homeless shelter (Study 1c). Study 2 shows that the effects of multiple important group memberships on personal self-esteem are not reducible to number of interpersonal ties. Studies 3a and 3b provide longitudinal evidence that multiple important group memberships predict personal self-esteem over time. Studies 4 and 5 show that collective self-esteem mediates this effect, suggesting that membership in multiple important groups boosts personal self-esteem because people take pride in, and derive meaning from, important group memberships. Discussion focuses on when and why important group memberships act as a social resource that fuels personal self-esteem. PMID:26017554

  8. Having a lot of a good thing: multiple important group memberships as a source of self-esteem.

    Directory of Open Access Journals (Sweden)

    Jolanda Jetten

    Membership in important social groups can promote a positive identity. We propose and test an identity resource model in which personal self-esteem is boosted by membership in additional important social groups. Belonging to multiple important group memberships predicts personal self-esteem in children (Study 1a), older adults (Study 1b), and former residents of a homeless shelter (Study 1c). Study 2 shows that the effects of multiple important group memberships on personal self-esteem are not reducible to number of interpersonal ties. Studies 3a and 3b provide longitudinal evidence that multiple important group memberships predict personal self-esteem over time. Studies 4 and 5 show that collective self-esteem mediates this effect, suggesting that membership in multiple important groups boosts personal self-esteem because people take pride in, and derive meaning from, important group memberships. Discussion focuses on when and why important group memberships act as a social resource that fuels personal self-esteem.

  9. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
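The double-difference idea described in this abstract can be sketched in a few lines: take the cross-delta between two correlated data sets, then an adjacent-delta along the result; post-decoding reverses both steps with a cumulative sum. This is a minimal sketch assuming 1-D integer data, not the patented apparatus; the function names are invented for illustration.

```python
import numpy as np

def double_difference(a, b):
    """Cross-delta between two correlated sets, then an adjacent-delta.

    The decorrelated output is small-valued and entropy-codes well when
    a and b are similar (e.g. adjacent spectral bands or time-shifted frames).
    """
    cross = b - a                     # cross-delta removes inter-set correlation
    return np.diff(cross, prepend=0)  # adjacent-delta removes intra-set correlation

def post_decode(a, dd):
    """Inverse: a cumulative sum undoes the adjacent-delta; adding a undoes the cross-delta."""
    return a + np.cumsum(dd)

a = np.array([10, 12, 15, 19], dtype=np.int64)   # first band
b = np.array([11, 14, 18, 23], dtype=np.int64)   # correlated second band
dd = double_difference(a, b)
assert np.array_equal(post_decode(a, dd), b)     # lossless round trip
```

The round trip is exact, so the double-difference set can be fed to either a lossless entropy coder or a lossy scheme, as the record describes.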

  10. Massive Conformal Gravity

    International Nuclear Information System (INIS)

    Faria, F. F.

    2014-01-01

    We construct a massive theory of gravity that is invariant under conformal transformations. The massive action of the theory depends on the metric tensor and a scalar field, which are considered the only field variables. We find the vacuum field equations of the theory and analyze its weak-field approximation and Newtonian limit.

  11. Colossal creations of gravity

    DEFF Research Database (Denmark)

    Skielboe, Andreas

    Gravity governs the evolution of the universe on the largest scales, and powers some of the most extreme objects at the centers of galaxies. Determining the masses and kinematics of galaxy clusters provides essential constraints on the large-scale structure of the universe, and acts as a direct probe…

  12. A Trick of Gravity

    Science.gov (United States)

    Newburgh, Ronald

    2010-01-01

    It's both surprising and rewarding when an old, standard problem reveals a subtlety that expands its pedagogic value. I realized recently that the role of gravity in the range equation for a projectile is not so simple as first appears. This realization may be completely obvious to others but was quite new to me.

  13. Discrete Lorentzian quantum gravity

    NARCIS (Netherlands)

    Loll, R.

    2000-01-01

    Just as for non-abelian gauge theories at strong coupling, discrete lattice methods are a natural tool in the study of non-perturbative quantum gravity. They have to reflect the fact that the geometric degrees of freedom are dynamical, and that therefore also the lattice theory must be formulated

  14. Loop quantum gravity

    International Nuclear Information System (INIS)

    Pullin, J.

    2015-01-01

    Loop quantum gravity is one of the approaches that are being studied to apply the rules of quantum mechanics to the gravitational field described by the theory of General Relativity. We present an introductory summary of the main ideas and recent results. (Author)

  15. A finite quantum gravity

    International Nuclear Information System (INIS)

    Meszaros, A.

    1984-05-01

    In case the graviton has a very small non-zero mass, the existence of six additional massive gravitons with very large masses leads to a finite quantum gravity. There is acausal behaviour on scales determined by the masses of the additional gravitons. (author)

  16. Venus - Ishtar gravity anomaly

    Science.gov (United States)

    Sjogren, W. L.; Bills, B. G.; Mottinger, N. A.

    1984-01-01

    The gravity anomaly associated with Ishtar Terra on Venus is characterized, comparing line-of-sight acceleration profiles derived by differentiating Pioneer Venus Orbiter Doppler residual profiles with an Airy-compensated topographic model. The results are presented in graphs and maps, confirming the preliminary findings of Phillips et al. (1979). The isostatic compensation depth is found to be 150 ± 30 km.

  17. Torsion induces gravity

    International Nuclear Information System (INIS)

    Aros, Rodrigo; Contreras, Mauricio

    2006-01-01

    In this work the Poincare-Chern-Simons and anti-de Sitter-Chern-Simons gravities are studied. For both, a solution that can be cast as a black hole with manifest torsion is found. Those solutions resemble Schwarzschild and Schwarzschild-AdS solutions, respectively.

  18. Discrete quantum gravity

    International Nuclear Information System (INIS)

    Williams, J.W.

    1992-01-01

    After a brief introduction to Regge calculus, some examples of its application in quantum gravity are described in this paper. In particular, the earliest such application, by Ponzano and Regge, is discussed in some detail, and it is shown how this leads naturally to current work on invariants of three-manifolds.

  19. Loop Quantum Gravity

    Directory of Open Access Journals (Sweden)

    Rovelli Carlo

    1998-01-01

    The problem of finding the quantum theory of the gravitational field, and thus understanding what is quantum spacetime, is still open. One of the most active of the current approaches is loop quantum gravity. Loop quantum gravity is a mathematically well-defined, non-perturbative and background independent quantization of general relativity, with its conventional matter couplings. Research in loop quantum gravity today forms a vast area, ranging from mathematical foundations to physical applications. Among the most significant results obtained are: (i) the computation of the physical spectra of geometrical quantities such as area and volume, which yields quantitative predictions on Planck-scale physics; (ii) a derivation of the Bekenstein-Hawking black hole entropy formula; (iii) an intriguing physical picture of the microstructure of quantum physical space, characterized by a polymer-like Planck scale discreteness. This discreteness emerges naturally from the quantum theory and provides a mathematically well-defined realization of Wheeler's intuition of a spacetime "foam". Long standing open problems within the approach (lack of a scalar product, over-completeness of the loop basis, implementation of reality conditions) have been fully solved. The weak part of the approach is the treatment of the dynamics: at present there exist several proposals, which are intensely debated. Here, I provide a general overview of ideas, techniques, results and open problems of this candidate theory of quantum gravity, and a guide to the relevant literature.

  20. Epidemiologic study of neural tube defects in Los Angeles County. I. Prevalence at birth based on multiple sources of case ascertainment

    Energy Technology Data Exchange (ETDEWEB)

    Sever, L.E. (Pacific Northwest Lab., Richland, WA); Sanders, M.; Monsen, R.

    1982-01-01

    Past epidemiologic studies of the neural tube defects (NTDs), anencephalus and spina bifida, have for the most part been based on single sources of case ascertainment. The present investigation attempts total ascertainment of NTD cases in the newborn population of Los Angeles County residents for the period 1966 to 1972. The design of the study, sources of data, and estimates of prevalence rates based on single and multiple sources of case ascertainment are discussed. Anencephalus cases totaled 448, spina bifida 442, and encephalocele 72, giving prevalence rates of 0.52, 0.51, and 0.08 per 1000 total births, respectively, for these neural tube defects; these rates are considered low. The Los Angeles County prevalence rates are compared with those of other recent North American studies, and support is provided for earlier suggestions of low rates on the West Coast.

  1. Merging of airborne gravity and gravity derived from satellite altimetry: Test cases along the coast of greenland

    DEFF Research Database (Denmark)

    Olesen, Arne Vestergaard; Andersen, Ole Baltazar; Tscherning, C.C.

    2002-01-01

    The National Survey and Cadastre - Denmark (KMS) has for several years produced gravity anomaly maps over the oceans derived from satellite altimetry. During the last four years, KMS has also conducted airborne gravity surveys along the coast of Greenland dedicated to complement the existing onshore gravity coverage and fill in new data in the very-near coastal area, where altimetry data may contain gross errors. The airborne surveys extend from the coastline to approximately 100 km offshore, along 6000 km of coastline. An adequate merging of these different data sources is important for the use of gravity data, especially when computing geoid models in coastal regions. The presence of reliable marine gravity data for independent control offers an opportunity to study procedures for the merging of airborne and satellite data around Greenland. Two different merging techniques, both based…

  2. Quantifying methane emission from fugitive sources by combining tracer release and downwind measurements - a sensitivity analysis based on multiple field surveys.

    Science.gov (United States)

    Mønster, Jacob G; Samuelsson, Jerker; Kjeldsen, Peter; Rella, Chris W; Scheutz, Charlotte

    2014-08-01

    Using a dual species methane/acetylene instrument based on cavity ring down spectroscopy (CRDS), the dynamic plume tracer dispersion method for quantifying the emission rate of methane was successfully tested in four measurement campaigns: (1) controlled methane and trace gas release with different trace gas configurations, (2) a landfill with unknown emission source locations, (3) a landfill with closely located emission sources, and (4) a comparison with a Fourier transform infrared spectroscopy (FTIR) instrument using multiple trace gases for source separation. The new real-time, high-precision instrument can measure methane plumes more than 1.2 km away from small sources (about 5 kg/h) in urban areas, with a measurement frequency allowing plume crossing at normal driving speed. The method can be used for quantification of total methane emissions from diffuse area sources down to 1 kg per hour and can quantify individual sources with the right choice of wind direction and road distance. The placement of the trace gas is important for obtaining correct quantification, and an uncertainty of up to 36% can be incurred when the trace gas is not co-located with the methane source. Measurements made at greater distances are less sensitive to errors in trace gas placement, and model calculations showed an uncertainty of less than 5% in both urban and open-country settings for placing the trace gas 100 m from the source, when measurements were done more than 3 km away. Using the ratio of the integrated plume concentrations of tracer gas and methane gives the most reliable results for measurements at various distances to the source, compared to the ratio of the highest concentrations in the plume, the direct concentration ratio, and a Gaussian plume model. Under suitable weather and road conditions, the CRDS system can quantify the emission from different sources located close to each other using only one kind of trace gas due to its high time resolution, while the FTIR…
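The core arithmetic of the tracer dispersion method can be illustrated with a small sketch: the methane emission rate equals the known tracer release rate scaled by the ratio of the cross-plume integrated concentrations, with a molar-mass conversion from mole ratio to mass ratio. The numbers and function name are hypothetical, and the tracer is assumed co-located with the source.

```python
M_CH4 = 16.04   # g/mol, methane
M_C2H2 = 26.04  # g/mol, acetylene tracer

def methane_emission_kg_h(tracer_release_kg_h, integral_ch4, integral_tracer):
    """Dynamic plume method: scale the known tracer release rate by the
    ratio of integrated plume concentrations (same units, e.g. ppb*m),
    converting the mole ratio to a mass ratio with the molar masses."""
    return tracer_release_kg_h * (integral_ch4 / integral_tracer) * (M_CH4 / M_C2H2)

# Hypothetical transect: the integrated CH4 plume is 8x the tracer plume,
# with acetylene released at 1.0 kg/h.
print(round(methane_emission_kg_h(1.0, 8.0, 1.0), 2))  # about 4.93 kg CH4 per hour
```

The abstract's finding that the integrated-concentration ratio beats the peak-concentration ratio corresponds to the integrals here being taken over the whole plume crossing rather than at its maximum.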

  3. Quantum Gravity Effects in Cosmology

    Directory of Open Access Journals (Sweden)

    Gu Je-An

    2018-01-01

    Within the geometrodynamic approach to quantum cosmology, we studied the quantum gravity effects in cosmology. The Gibbons-Hawking temperature is corrected by quantum gravity due to spacetime fluctuations, and the power spectrum as well as any probe field will experience the effective temperature, a quantum gravity effect.

  4. Even-dimensional topological gravity from Chern-Simons gravity

    International Nuclear Information System (INIS)

    Merino, N.; Perez, A.; Salgado, P.

    2009-01-01

    It is shown that the topological action for gravity in 2n dimensions can be obtained from the (2n+1)-dimensional Chern-Simons gravity genuinely invariant under the Poincare group. The 2n-dimensional topological gravity is described by the dynamics of the boundary of a (2n+1)-dimensional Chern-Simons gravity theory with suitable boundary conditions. The field φ^a, which is necessary to construct this type of topological gravity in even dimensions, is identified with the coset field associated with the non-linear realizations of the Poincare group ISO(d-1,1).

  5. Geodynamics implication of GPS and satellite altimeter and gravity observations to the Eastern Mediterranean

    Directory of Open Access Journals (Sweden)

    Khaled H. Zahran

    2012-06-01

    Results show important zones of mass discontinuity in this region, correlated with seismological activity, and temporal gravity variations agree with the crustal deformation obtained from GPS observations. The current study indicates that satellite gravity data are a valuable and important contemporary source of data for understanding the geodynamical behavior of the studied region.

  6. GEODYNAMIC WAVES AND GRAVITY

    Directory of Open Access Journals (Sweden)

    A. V. Vikulin

    2014-01-01

    Gravity phenomena related to the Earth's movements in the Solar System and through the Galaxy are reviewed. Such movements are manifested by geological processes on the Earth and correlate with geophysical fields of the Earth. It is concluded that geodynamic processes and the gravity phenomena (including those of cosmic nature) are related. The state of the geomedium composed of blocks is determined by stresses with force moment and by slow rotational waves that are considered as a new type of movements [Vikulin, 2008, 2010]. It is shown that the geomedium has typical rheid properties [Carey, 1954], specifically an ability to flow while being in the solid state [Leonov, 2008]. Within the framework of the rotational model with a symmetric stress tensor, which is developed by the authors [Vikulin, Ivanchin, 1998; Vikulin et al., 2012a, 2013], such movement of the geomedium may explain the energy-saturated state of the geomedium and a possibility of its movements in the form of vortex geological structures [Lee, 1928]. The article discusses the gravity wave detection method based on the concept of interactions between gravity waves and crustal blocks [Braginsky et al., 1985]. It is concluded that gravity waves can be recorded by the proposed technique that detects slow rotational waves. It is shown that geo-gravitational movements can be described both by the concept of potential with account of gravitational energy of bodies [Kondratyev, 2003] and by nonlinear physical acoustics [Gurbatov et al., 2008]. Based on the combined description of geophysical and gravitational wave movements, the authors suggest a hypothesis about the nature of spin, i.e. own moment, as a demonstration of the space-time ‘vortex’ properties.

  7. Cosmological attractors in massive gravity

    CERN Document Server

    Dubovsky, S; Tkachev, I I

    2005-01-01

    We study Lorentz-violating models of massive gravity which preserve rotations and are invariant under time-dependent shifts of the spatial coordinates. In the linear approximation the Newtonian potential in these models has an extra "confining" term proportional to the distance from the source. We argue that during cosmological expansion the Universe may be driven to an attractor point with larger symmetry which includes particular simultaneous dilatations of time and space coordinates. The confining term in the potential vanishes as one approaches the attractor. In the vicinity of the attractor the extra contribution is present in the Friedmann equation which, in a certain range of parameters, gives rise to the cosmic acceleration.

  8. GRAVITY ANOMALIES OF THE MOON

    Directory of Open Access Journals (Sweden)

    S. G. Pugacheva

    2015-01-01

    The source of gravity anomalies of the Moon are large mascons with a high mass concentration at a depth of volcanic plains and lunar Maria. New data on the gravitational field of the Moon were obtained from the two GRAIL spacecraft. The article presents data on the physical and mechanical properties of the surface soil layer of the lunar Maria and gives an assessment of the chemical composition of the soil. Heterogeneity parameters of the surface macro-relief of the lunar Maria have been calculated: albedo, soil density, average grain diameter of the particles forming the surface layer, and the volume fraction occupied by particles. It can be assumed that mascons include KREEP-rich rocks with a high content of thorium and iron oxide. The formation of mascons is connected with the intensive development of basaltic volcanism on the Moon in the early periods of its existence.

  9. Testing for multiple invasion routes and source populations for the invasive brown treesnake (Boiga irregularis) on Guam: implications for pest management

    Science.gov (United States)

    Richmond, Jonathan Q.; Wood, Dustin A.; Stanford, James W.; Fisher, Robert N.

    2014-01-01

    The brown treesnake (Boiga irregularis) population on the Pacific island of Guam has reached iconic status as one of the most destructive invasive species of modern times, yet no published works have used genetic data to identify a source population. We used DNA sequence data from multiple genetic markers and coalescent-based phylogenetic methods to place the Guam population within the broader phylogeographic context of B. irregularis across its native range and tested whether patterns of genetic variation on the island are consistent with one or multiple introductions from different source populations. We also modeled a series of demographic scenarios that differed in the effective size and duration of a population bottleneck immediately following the invasion on Guam, and measured the fit of these simulations to the observed data using approximate Bayesian computation. Our results exclude the possibility of serial introductions from different source populations, and instead verify a single origin from the Admiralty Archipelago off the north coast of Papua New Guinea. This finding is consistent with the hypothesis that B. irregularis was accidentally transported to Guam during military relocation efforts at the end of World War II. Demographic model comparisons suggest that multiple snakes were transported to Guam from the source locality, but that fewer than 10 individuals could be responsible for establishing the population. Our results also provide evidence that low genetic diversity stemming from the founder event has not been a hindrance to the ecological success of B. irregularis on Guam, and at the same time offers a unique ‘genetic opening’ to manage snake density using classical biological approaches.

  10. Pt Electrodes Enable the Formation of μ4-O Centers in MOF-5 from Multiple Oxygen Sources.

    Science.gov (United States)

    Li, Minyuan M; Dincă, Mircea

    2017-10-04

    The μ4-O2− ions in the Zn4O(O2C−)6 secondary building units of Zn4O(1,4-benzenedicarboxylate)3 (MOF-5) electrodeposited under cathodic bias can be sourced from nitrate, water, and molecular oxygen when using platinum gauze as working electrodes. The use of Zn(ClO4)2·6H2O, anhydrous Zn(NO3)2, or anhydrous Zn(CF3SO3)2 as Zn2+ sources under rigorous control of other sources of oxygen, including water and O2, confirms that the source of the μ4-O2− ions can be promiscuous. Although this finding reveals a relatively complicated manifold of electrochemical processes responsible for the crystallization of MOF-5 under cathodic bias, it further highlights the importance of hydroxide intermediates in the formation of the Zn4O(O2C-R) secondary building units in this iconic material and is illustrative of the complicated crystallization mechanisms of metal-organic frameworks in general.

  11. Design and commissioning of an aberration-corrected ultrafast spin-polarized low energy electron microscope with multiple electron sources.

    Science.gov (United States)

    Wan, Weishi; Yu, Lei; Zhu, Lin; Yang, Xiaodong; Wei, Zheng; Liu, Jefferson Zhe; Feng, Jun; Kunze, Kai; Schaff, Oliver; Tromp, Ruud; Tang, Wen-Xin

    2017-03-01

    We describe the design and commissioning of a novel aberration-corrected low energy electron microscope (AC-LEEM). A third magnetic prism array (MPA) is added to the standard AC-LEEM with two prism arrays, allowing the incorporation of an ultrafast spin-polarized electron source alongside the standard cold field emission electron source, without degrading spatial resolution. The high degree of symmetry of the AC-LEEM is utilized in the design of the electron optics of the ultrafast spin-polarized electron source, so as to minimize the deleterious effect of time broadening while maintaining full control of electron spin. A spatial resolution of 2 nm and a temporal resolution of 10 ps are expected in the future time-resolved aberration-corrected spin-polarized LEEM (TR-AC-SPLEEM). The commissioning of the three-prism AC-LEEM has been successfully finished with the cold field emission source, with a spatial resolution below 2 nm.

  12. Metastable gravity on classical defects

    International Nuclear Information System (INIS)

    Ringeval, Christophe; Rombouts, Jan-Willem

    2005-01-01

    We discuss the realization of metastable gravity on classical defects in infinite-volume extra dimensions. In dilatonic Einstein gravity, it is found that the existence of metastable gravity on the defect core requires violation of the dominant energy condition for codimension Nc = 2 defects. This is illustrated with a detailed analysis of a six-dimensional hyperstring minimally coupled to dilaton gravity. We present the general conditions under which a codimension Nc > 2 defect admits metastable modes, and find that they differ from lower codimensional models in that, under certain conditions, they do not require violation of energy conditions to support quasilocalized gravity.

  13. Quantum gravity from noncommutative spacetime

    International Nuclear Information System (INIS)

    Lee, Jungjai; Yang, Hyunseok

    2014-01-01

    We review a novel and authentic way to quantize gravity. This novel approach is based on the fact that Einstein gravity can be formulated in terms of a symplectic geometry rather than a Riemannian geometry in the context of emergent gravity. An essential step for emergent gravity is to realize the equivalence principle, the most important property in the theory of gravity (general relativity), from U(1) gauge theory on a symplectic or Poisson manifold. Through the realization of the equivalence principle, which is an intrinsic property in symplectic geometry known as the Darboux theorem or the Moser lemma, one can understand how diffeomorphism symmetry arises from noncommutative U(1) gauge theory; thus, gravity can emerge from the noncommutative electromagnetism, which is also an interacting theory. As a consequence, a background-independent quantum gravity in which the prior existence of any spacetime structure is not a priori assumed but is defined by using the fundamental ingredients in quantum gravity theory can be formulated. This scheme for quantum gravity can be used to resolve many notorious problems in theoretical physics, such as the cosmological constant problem, to understand the nature of dark energy, and to explain why gravity is so weak compared to other forces. In particular, it leads to a remarkable picture of what matter is. A matter field, such as leptons and quarks, simply arises as a stable localized geometry, which is a topological object in the defining algebra (noncommutative *-algebra) of quantum gravity.

  14. Quantum gravity from noncommutative spacetime

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jungjai [Daejin University, Pocheon (Korea, Republic of); Yang, Hyunseok [Korea Institute for Advanced Study, Seoul (Korea, Republic of)

    2014-12-15

    We review a novel and authentic way to quantize gravity. This novel approach is based on the fact that Einstein gravity can be formulated in terms of a symplectic geometry rather than a Riemannian geometry in the context of emergent gravity. An essential step for emergent gravity is to realize the equivalence principle, the most important property in the theory of gravity (general relativity), from U(1) gauge theory on a symplectic or Poisson manifold. Through the realization of the equivalence principle, which is an intrinsic property in symplectic geometry known as the Darboux theorem or the Moser lemma, one can understand how diffeomorphism symmetry arises from noncommutative U(1) gauge theory; thus, gravity can emerge from the noncommutative electromagnetism, which is also an interacting theory. As a consequence, a background-independent quantum gravity in which the prior existence of any spacetime structure is not a priori assumed but is defined by using the fundamental ingredients in quantum gravity theory can be formulated. This scheme for quantum gravity can be used to resolve many notorious problems in theoretical physics, such as the cosmological constant problem, to understand the nature of dark energy, and to explain why gravity is so weak compared to other forces. In particular, it leads to a remarkable picture of what matter is. A matter field, such as leptons and quarks, simply arises as a stable localized geometry, which is a topological object in the defining algebra (noncommutative *-algebra) of quantum gravity.

  15. The gravity field and GGOS

    DEFF Research Database (Denmark)

    Forsberg, René; Sideris, M.G.; Shum, C.K.

    2005-01-01

    The gravity field of the earth is a natural element of the Global Geodetic Observing System (GGOS). Gravity field quantities are like spatial geodetic observations of potential very high accuracy, with measurements currently at part-per-billion (ppb) accuracy, but gravity field quantities are also unique as they can be globally represented by harmonic functions (long-wavelength geopotential model primarily from satellite gravity field missions), or based on point sampling (airborne and in situ absolute and superconducting gravimetry). From a GGOS global perspective, one of the main challenges is to ensure the consistency of the global and regional geopotential and geoid models, and the temporal changes of the gravity field at large spatial scales. The International Gravity Field Service, an umbrella "level-2" IAG service (incorporating the International Gravity Bureau, International Geoid Service…

  16. Global estimates of CO sources with high resolution by adjoint inversion of multiple satellite datasets (MOPITT, AIRS, SCIAMACHY, TES

    Directory of Open Access Journals (Sweden)

    M. Kopacz

    2010-02-01

    We combine CO column measurements from the MOPITT, AIRS, SCIAMACHY, and TES satellite instruments in a full-year (May 2004–April 2005) global inversion of CO sources at 4°×5° spatial resolution and monthly temporal resolution. The inversion uses the GEOS-Chem chemical transport model (CTM) and its adjoint applied to MOPITT, AIRS, and SCIAMACHY. Observations from TES, surface sites (NOAA/GMD), and aircraft (MOZAIC) are used for evaluation of the a posteriori solution. Using GEOS-Chem as a common intercomparison platform shows global consistency between the different satellite datasets and with the in situ data. Differences can be largely explained by different averaging kernels and a priori information. The global CO emission from combustion as constrained in the inversion is 1350 Tg a−1. This is much higher than current bottom-up emission inventories. A large fraction of the correction results from a seasonal underestimate of CO sources at northern mid-latitudes in winter and suggests a larger-than-expected CO source from vehicle cold starts and residential heating. Implementing this seasonal variation of emissions solves the long-standing problem of models underestimating CO in the northern extratropics in winter-spring. A posteriori emissions also indicate a general underestimation of biomass burning in the GFED2 inventory. However, the tropical biomass burning constraints are not quantitatively consistent across the different datasets.

  17. Brain source localization: A new method based on MUltiple SIgnal Classification algorithm and spatial sparsity of the field signal for electroencephalogram measurements

    Science.gov (United States)

    Vergallo, P.; Lay-Ekuakille, A.

    2013-08-01

    Brain activity can be recorded by means of EEG (electroencephalogram) electrodes placed on the scalp of the patient. The EEG reflects the activity of groups of neurons located in the head, and the fundamental problem in neurophysiology is the identification of the sources responsible for brain activity, especially if a seizure occurs, in which case it is important to identify it. The studies conducted in order to formalize the relationship between the electromagnetic activity in the head and the recording of the generated external field allow patterns of brain activity to be known. The inverse problem, in which the underlying activity must be determined given the field sampled at different electrodes, is more difficult because the problem may not have a unique solution, or the search for the solution is made difficult by a low spatial resolution which may not allow one to distinguish between activities involving sources close to each other. Thus, sources of interest may be obscured or not detected, and a known method for the source localization problem such as MUSIC (MUltiple SIgnal Classification) could fail. Many advanced source localization techniques achieve a better resolution by exploiting sparsity: if the number of sources is small, the neural power vs. location is as a result sparse. In this work a solution based on the spatial sparsity of the field signal is presented and analyzed to improve the MUSIC method. For this purpose, it is necessary to set a priori information about the sparsity of the signal. The problem is formulated and solved using a regularization method such as Tikhonov, which calculates a solution that is the best compromise between two cost functions to minimize, one related to the fitting of the data, and another concerning the maintenance of the sparsity of the signal. First, the method is tested on simulated EEG signals obtained by the solution of the forward problem. Relative to the model considered for the head and brain sources, the result obtained allows to…
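The Tikhonov compromise described in this abstract, a solution balancing data fit against a penalty on the source vector, can be sketched as follows. The lead-field matrix and dimensions here are invented for illustration, and this sketch uses the classic quadratic penalty; a genuinely sparsity-promoting variant, as the paper's setting implies, would replace it with an L1 norm.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Minimize ||A x - b||^2 + lam * ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(0)
A = rng.standard_normal((32, 64))   # hypothetical lead-field: 32 electrodes, 64 sources
x_true = np.zeros(64)
x_true[[5, 40]] = [1.0, -0.7]       # two active (sparse) brain sources
b = A @ x_true + 0.01 * rng.standard_normal(32)  # simulated measured field
x_hat = tikhonov_solve(A, b, lam=0.1)

# The regularized solution fits the measured field without blowing up:
assert np.linalg.norm(A @ x_hat - b) < 0.5 * np.linalg.norm(b)
```

The regularization parameter `lam` is exactly the knob the abstract describes: larger values enforce the prior at the cost of data fit, smaller values do the reverse.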

  18. Gravity Responsive NADH Oxidase of the Plasma Membrane

    Science.gov (United States)

    Morre, D. James (Inventor)

    2002-01-01

    A method and apparatus for sensing gravity using an NADH oxidase of the plasma membrane which has been found to respond to unit gravity and low centrifugal g forces. The oxidation rate of NADH supplied to the NADH oxidase is measured and translated to represent the relative gravitational force exerted on the protein. The NADH oxidase of the plasma membrane may be obtained from plant or animal sources or may be produced recombinantly.

  19. Robust heart rate estimation from multiple asynchronous noisy sources using signal quality indices and a Kalman filter

    International Nuclear Information System (INIS)

    Li, Q; Mark, R G; Clifford, G D

    2008-01-01

    Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often severely corrupted by noise, artifact and missing data, which lead to large errors in the estimation of the heart rate (HR) and ABP. A robust HR estimation method is described that compensates for these problems. The method is based upon the concept of fusing multiple signal quality indices (SQIs) and HR estimates derived from multiple ECG leads and an invasive ABP waveform recorded from ICU patients. Physiological SQIs were obtained by analyzing the statistical characteristics of each waveform and their relationships to each other. HR estimates from the ECG and ABP are tracked with separate Kalman filters, using a modified update sequence based upon the individual SQIs. Data fusion of each HR estimate was then performed by weighting each estimate by the Kalman filters' SQI-modified innovations. This method was evaluated on over 6000 h of simultaneously acquired ECG and ABP from a 437-patient subset of ICU data by adding real ECG and realistic artificial ABP noise. The method provides an accurate HR estimate even in the presence of high levels of persistent noise and artifact, and during episodes of extreme bradycardia and tachycardia.
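A heavily simplified version of the fusion step described above might look like the sketch below. This is not the authors' Kalman machinery; the quadratic SQI weighting and the function name are assumptions for illustration of the idea that a low-quality channel should contribute little to the fused estimate.

```python
def fuse_hr(hr_ecg, sqi_ecg, hr_abp, sqi_abp):
    """Fuse two heart-rate estimates, down-weighting the noisier channel.

    Each estimate is weighted by the square of its signal quality index
    (SQI in [0, 1]); a channel with SQI near zero barely contributes.
    """
    w_ecg, w_abp = sqi_ecg ** 2, sqi_abp ** 2
    total = w_ecg + w_abp
    if total == 0:
        return None  # both channels unusable
    return (w_ecg * hr_ecg + w_abp * hr_abp) / total

# ECG lead corrupted by artifact (SQI 0.2) reporting a spurious 140 bpm,
# while the clean ABP channel (SQI 0.9) reports 72 bpm:
print(round(fuse_hr(140.0, 0.2, 72.0, 0.9), 1))  # fused value stays near 72
```

In the paper's actual method the weights come from the Kalman filters' SQI-modified innovations rather than from raw SQIs, but the down-weighting principle is the same.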

  20. Pseudotopological quasilocal energy of torsion gravity

    Science.gov (United States)

    Ko, Sheng-Lan; Lin, Feng-Li; Ning, Bo

    2017-08-01

    Torsion gravity is a natural extension to Einstein gravity in the presence of fermion matter sources. In this paper we adopt Wald's covariant method of calculating the Noether charge to construct the quasilocal energy of the Einstein-Cartan-fermion system, and find that its explicit expression is formally independent of the coupling constant between the torsion and axial current. This seemingly topological nature is unexpected and is reminiscent of the quantum Hall effect and topological insulators. However, a coupling dependence does arise when evaluating it on shell, and thus the situation is pseudotopological. Based on the expression for the quasilocal energy, we evaluate it for a particular solution on the entanglement wedge and find agreement with the holographic relative entropy obtained before. This shows the equivalence of these two quantities in the Einstein-Cartan-fermion system. Moreover, the quasilocal energy in this case is not always positive definite, and thus it provides an example of a swampland in torsion gravity. Based on the covariant Noether charge, we also derive the nonzero fermion effect on the Komar angular momentum. The implications of our results for future tests of torsion gravity in gravitational-wave astronomy are also discussed.

  1. Cosmological Tests of Gravity

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Extensions of Einstein’s theory of General Relativity are under investigation as a potential explanation of the accelerating expansion rate of the universe. I’ll present a cosmologist’s overview of attempts to test these ideas in an efficient and unbiased manner. I’ll start by introducing the bestiary of alternative gravity theories that have been put forward. This proliferation of models motivates us to develop model-independent, agnostic tools for comparing the theory space to cosmological data. I’ll introduce the effective field theory for cosmological perturbations, a framework designed to unify modified gravity theories in terms of a manageable set of parameters. Having outlined the formalism, I’ll talk about the current constraints on this framework, and the improvements expected from the next generation of large galaxy clustering, weak lensing and intensity mapping experiments.

  2. The relativistic gravity train

    Science.gov (United States)

    Seel, Max

    2018-05-01

    The gravity train that takes 42.2 min from any point A to any other point B that is connected by a straight-line tunnel through Earth has captured the imagination more than most other applications in calculus or introductory physics courses. Brachistochrone and, most recently, nonlinear density solutions have been discussed. Here relativistic corrections are presented. It is discussed how the corrections affect the time to fall through Earth, the Sun, a white dwarf, a neutron star, and—the ultimate limit—the difference in time measured by a moving, a stationary and the fiducial observer at infinity if the density of the sphere approaches the density of a black hole. The relativistic gravity train can serve as a problem with approximate and exact analytic solutions and as a numerical exercise in any introductory course on relativity.
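
The 42.2 min quoted above is the classical, uniform-density baseline to which the paper adds relativistic corrections; it is easy to reproduce numerically (the constants are standard reference values, not taken from the abstract):

```python
import math

# Classical gravity train through a uniform-density Earth: the motion is
# simple harmonic, and the one-way travel time equals half the orbital
# period at the surface, T = pi * sqrt(R^3 / (G*M)), for any chord.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # Earth mass, kg
R = 6.371e6     # Earth mean radius, m

T_half = math.pi * math.sqrt(R**3 / (G * M))  # one-way travel time, seconds
minutes = T_half / 60.0                       # comes out near 42.2 min
```

The half-period is independent of which chord the tunnel follows, which is why the same 42.2 min applies between any two points on the surface.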

  3. Antimatter gravity experiment

    International Nuclear Information System (INIS)

    Brown, R.E.; Camp, J.B.; Darling, T.W.

    1990-01-01

    An experiment is being developed to measure the acceleration of the antiproton in the gravitational field of the earth. Antiprotons of a few MeV from the LEAR facility at CERN will be slowed, captured, cooled to a temperature of about 10 K, and subsequently launched a few at a time into a drift tube where the effect of gravity on their motion will be determined by a time-of-flight method. Development of the experiment is proceeding at Los Alamos using normal matter. The fabrication of a drift tube that will produce a region of space in which gravity is the dominant force on moving ions is of major difficulty. This involves a study of methods of minimizing the electric fields produced by spatially varying work functions on conducting surfaces. Progress in a number of areas is described, with stress on the drift-tube development

  4. Lectures on Quantum Gravity

    CERN Document Server

    Gomberoff, Andres

    2006-01-01

    The 2002 Pan-American Advanced Studies Institute School on Quantum Gravity was held at the Centro de Estudios Cientificos (CECS), Valdivia, Chile, January 4-14, 2002. The school featured lectures by ten speakers, and was attended by nearly 70 students from over 14 countries. A primary goal was to foster interaction and communication between participants from different cultures, both in the layman’s sense of the term and in terms of approaches to quantum gravity. We hope that the links formed by students and the school will persist throughout their professional lives, continuing to promote interaction and the essential exchange of ideas that drives research forward. This volume contains improved and updated versions of the lectures given at the School. It has been prepared both as a reminder for the participants, and so that these pedagogical introductions can be made available to others who were unable to attend. We expect them to serve students of all ages well.

  5. Topics in quantum gravity

    Energy Technology Data Exchange (ETDEWEB)

    Lamon, Raphael

    2010-06-29

    Quantum gravity is an attempt to unify general relativity with quantum mechanics, the two highly successful fundamental theories of theoretical physics. The main difficulty in this unification arises from the fact that, while general relativity describes gravity as a macroscopic geometrical theory, quantum mechanics explains microscopic phenomena. As a further complication, not only do both theories describe different scales, but their philosophical ramifications and the mathematics used to describe them also differ in a dramatic way. Consequently, one possible starting point for a unification is quantum mechanics, i.e. particle physics, incorporating gravitation afterwards. This pathway has been chosen by particle physicists and led to string theory. On the other hand, loop quantum gravity (LQG) chooses the other possibility, i.e. it takes the geometrical aspects of gravity seriously and quantizes geometry. The first part of this thesis deals with a generalization of loop quantum cosmology (LQC) to toroidal topologies. LQC is a quantization of homogeneous solutions of Einstein's field equations using tools from LQG. First the general concepts of closed topologies are introduced, with special emphasis on Thurston's theorem and its consequences. It is shown that new degrees of freedom called Teichmueller parameters come into play and that their dynamics can be described by a Hamiltonian. Several numerical solutions for a toroidal universe are presented and discussed. Following the guidelines of LQG, these dynamics are rewritten using the Ashtekar variables and numerical solutions are shown. However, in order to find a suitable Hilbert space a canonical transformation must be performed. On the other hand, this transformation makes the quantization of geometrical quantities less tractable, so two different approaches are presented. It is shown that in both cases the spectrum of such geometrical operators depends on the initial value problem

  6. Tensor Galileons and gravity

    Energy Technology Data Exchange (ETDEWEB)

    Chatzistavrakidis, Athanasios [Van Swinderen Institute for Particle Physics and Gravity, University of Groningen, Nijenborgh 4, 9747 AG Groningen (Netherlands); Khoo, Fech Scen [Department of Physics and Earth Sciences, Jacobs University Bremen, Campus Ring 1, 28759 Bremen (Germany); Roest, Diederik [Van Swinderen Institute for Particle Physics and Gravity, University of Groningen, Nijenborgh 4, 9747 AG Groningen (Netherlands); Schupp, Peter [Department of Physics and Earth Sciences, Jacobs University Bremen, Campus Ring 1, 28759 Bremen (Germany)]

    2017-03-13

    The particular structure of Galileon interactions allows for higher-derivative terms while retaining second order field equations for scalar fields and Abelian p-forms. In this work we introduce an index-free formulation of these interactions in terms of two sets of Grassmannian variables. We employ this to construct Galileon interactions for mixed-symmetry tensor fields and coupled systems thereof. We argue that these tensors are the natural generalization of scalars with Galileon symmetry, similar to p-forms and scalars with a shift-symmetry. The simplest case corresponds to linearised gravity with Lovelock invariants, relating the Galileon symmetry to diffeomorphisms. Finally, we examine the coupling of a mixed-symmetry tensor to gravity, and demonstrate in an explicit example that the inclusion of appropriate counterterms retains second order field equations.

  7. Topics in quantum gravity

    International Nuclear Information System (INIS)

    Lamon, Raphael

    2010-01-01

    Quantum gravity is an attempt to unify general relativity with quantum mechanics, the two highly successful fundamental theories of theoretical physics. The main difficulty in this unification arises from the fact that, while general relativity describes gravity as a macroscopic geometrical theory, quantum mechanics explains microscopic phenomena. As a further complication, not only do both theories describe different scales, but their philosophical ramifications and the mathematics used to describe them also differ in a dramatic way. Consequently, one possible starting point for a unification is quantum mechanics, i.e. particle physics, incorporating gravitation afterwards. This pathway has been chosen by particle physicists and led to string theory. On the other hand, loop quantum gravity (LQG) chooses the other possibility, i.e. it takes the geometrical aspects of gravity seriously and quantizes geometry. The first part of this thesis deals with a generalization of loop quantum cosmology (LQC) to toroidal topologies. LQC is a quantization of homogeneous solutions of Einstein's field equations using tools from LQG. First the general concepts of closed topologies are introduced, with special emphasis on Thurston's theorem and its consequences. It is shown that new degrees of freedom called Teichmueller parameters come into play and that their dynamics can be described by a Hamiltonian. Several numerical solutions for a toroidal universe are presented and discussed. Following the guidelines of LQG, these dynamics are rewritten using the Ashtekar variables and numerical solutions are shown. However, in order to find a suitable Hilbert space a canonical transformation must be performed. On the other hand, this transformation makes the quantization of geometrical quantities less tractable, so two different approaches are presented. It is shown that in both cases the spectrum of such geometrical operators depends on the initial value problem.

  8. Simplicial quantum gravity

    International Nuclear Information System (INIS)

    Hartle, J.B.

    1985-01-01

    Simplicial approximation and the ideas associated with the Regge calculus provide a concrete way of implementing a sum over histories formulation of quantum gravity. A simplicial geometry is made up of flat simplices joined together in a prescribed way together with an assignment of lengths to their edges. A sum over simplicial geometries is a sum over the different ways the simplices can be joined together with an integral over their edge lengths. The construction of the simplicial Euclidean action for this approach to quantum general relativity is illustrated. The recovery of the diffeomorphism group in the continuum limit is discussed. Some possible classes of simplicial complexes with which to define a sum over topologies are described. In two dimensional quantum gravity it is argued that a reasonable class is the class of pseudomanifolds

  9. Instantons and gravity

    International Nuclear Information System (INIS)

    Konopleva, N.P.

    1996-01-01

    The problems of application of nonperturbative quantization methods in the theories of gauge fields and gravity are discussed. Unification of interactions is considered in the framework of the geometrical gauge field theory. The vacuum conception in the unified theory of interactions and the instantons' role in the vacuum structure are analyzed. The role of vacuum solutions of Einstein equations in the definition of the gauge field vacuum is demonstrated

  10. Gravity, Time, and Lagrangians

    Science.gov (United States)

    Huggins, Elisha

    2010-01-01

    Feynman mentioned to us that he understood a topic in physics if he could explain it to a college freshman, a high school student, or a dinner guest. Here we will discuss two topics that took us a while to get to that level. One is the relationship between gravity and time. The other is the minus sign that appears in the Lagrangian. (Why would one…

  11. Spontaneously generated gravity

    International Nuclear Information System (INIS)

    Zee, A.

    1981-01-01

    We show, following a recent suggestion of Adler, that gravity may arise as a consequence of dynamical symmetry breaking in a scale- and gauge-invariant world. Our calculation is not tied to any specific scheme of dynamical symmetry breaking. A representation for Newton's coupling constant in terms of flat-space quantities is derived. The sign of Newton's coupling constant appears to depend on infrared details of the symmetry-breaking mechanism

  12. Loop Quantum Gravity.

    Science.gov (United States)

    Rovelli, Carlo

    2008-01-01

    The problem of describing the quantum behavior of gravity, and thus understanding quantum spacetime, is still open. Loop quantum gravity is a well-developed approach to this problem. It is a mathematically well-defined background-independent quantization of general relativity, with its conventional matter couplings. Today research in loop quantum gravity forms a vast area, ranging from mathematical foundations to physical applications. Among the most significant results obtained so far are: (i) The computation of the spectra of geometrical quantities such as area and volume, which yield tentative quantitative predictions for Planck-scale physics. (ii) A physical picture of the microstructure of quantum spacetime, characterized by Planck-scale discreteness. Discreteness emerges as a standard quantum effect from the discrete spectra, and provides a mathematical realization of Wheeler's "spacetime foam" intuition. (iii) Control of spacetime singularities, such as those in the interior of black holes and the cosmological one. This, in particular, has opened up the possibility of a theoretical investigation into the very early universe and the spacetime regions beyond the Big Bang. (iv) A derivation of the Bekenstein-Hawking black-hole entropy. (v) Low-energy calculations, yielding n-point functions well defined in a background-independent context. The theory is at the roots of, or strictly related to, a number of formalisms that have been developed for describing background-independent quantum field theory, such as spin foams, group field theory, causal spin networks, and others. I give here a general overview of ideas, techniques, results and open problems of this candidate theory of quantum gravity, and a guide to the relevant literature.

  13. Loop Quantum Gravity

    Directory of Open Access Journals (Sweden)

    Rovelli Carlo

    2008-07-01

    Full Text Available The problem of describing the quantum behavior of gravity, and thus understanding quantum spacetime, is still open. Loop quantum gravity is a well-developed approach to this problem. It is a mathematically well-defined background-independent quantization of general relativity, with its conventional matter couplings. Today research in loop quantum gravity forms a vast area, ranging from mathematical foundations to physical applications. Among the most significant results obtained so far are: (i) The computation of the spectra of geometrical quantities such as area and volume, which yield tentative quantitative predictions for Planck-scale physics. (ii) A physical picture of the microstructure of quantum spacetime, characterized by Planck-scale discreteness. Discreteness emerges as a standard quantum effect from the discrete spectra, and provides a mathematical realization of Wheeler’s “spacetime foam” intuition. (iii) Control of spacetime singularities, such as those in the interior of black holes and the cosmological one. This, in particular, has opened up the possibility of a theoretical investigation into the very early universe and the spacetime regions beyond the Big Bang. (iv) A derivation of the Bekenstein–Hawking black-hole entropy. (v) Low-energy calculations, yielding n-point functions well defined in a background-independent context. The theory is at the roots of, or strictly related to, a number of formalisms that have been developed for describing background-independent quantum field theory, such as spin foams, group field theory, causal spin networks, and others. I give here a general overview of ideas, techniques, results and open problems of this candidate theory of quantum gravity, and a guide to the relevant literature.

  14. Semiclassical unimodular gravity

    International Nuclear Information System (INIS)

    Fiol, Bartomeu; Garriga, Jaume

    2010-01-01

    Classically, unimodular gravity is known to be equivalent to General Relativity (GR), except for the fact that the effective cosmological constant Λ has the status of an integration constant. Here, we explore various formulations of unimodular gravity beyond the classical limit. We first consider the non-generally covariant action formulation in which the determinant of the metric is held fixed to unity. We argue that the corresponding quantum theory is also equivalent to General Relativity for localized perturbative processes which take place in generic backgrounds of infinite volume (such as asymptotically flat spacetimes). Next, using the same action, we calculate semiclassical non-perturbative quantities, which we expect will be dominated by Euclidean instanton solutions. We derive the entropy/area ratio for cosmological and black hole horizons, finding agreement with GR for solutions in backgrounds of infinite volume, but disagreement for backgrounds with finite volume. In deriving the above results, the path integral is taken over histories with fixed 4-volume. We point out that the results are different if we allow the 4-volume of the different histories to vary over a continuum range. In this "generalized" version of unimodular gravity, one recovers the full set of Einstein's equations in the classical limit, including the trace, so Λ is no longer an integration constant. Finally, we consider the generally covariant theory due to Henneaux and Teitelboim, which is classically equivalent to unimodular gravity. In this case, the standard semiclassical GR results are recovered provided that the boundary term in the Euclidean action is chosen appropriately.

  15. Granular Superconductors and Gravity

    Science.gov (United States)

    Noever, David; Koczor, Ron

    1999-01-01

    As a Bose condensate, superconductors provide novel conditions for revisiting previously proposed couplings between electromagnetism and gravity. Strong variations in Cooper pair density, large conductivity and low magnetic permeability define superconductive and degenerate condensates without the traditional density limits imposed by the Fermi energy (approx. 10⁻⁶ g/cm³). Recent experiments have reported anomalous weight loss for a test mass suspended above a rotating Type II, YBCO superconductor, with a relatively high percentage change (0.05-2.1%) independent of the test mass' chemical composition and diamagnetic properties. A variation of 5 parts per 10⁴ was reported above a stationary (non-rotating) superconductor. In experiments using a sensitive gravimeter, bulk YBCO superconductors were stably levitated in a DC magnetic field and exposed without levitation to low-field strength AC magnetic fields. Changes in observed gravity signals were measured to be less than 2 parts in 10⁸ of the normal gravitational acceleration. Given the high sensitivity of the test, future work will examine variants on the basic magnetic behavior of granular superconductors, with particular focus on quantifying their proposed importance to gravity.

  16. Venus gravity fields

    Science.gov (United States)

    Sjogren, W. L.; Ananda, M.; Williams, B. G.; Birkeland, P. W.; Esposito, P. S.; Wimberly, R. N.; Ritke, S. J.

    1981-01-01

    Results of Pioneer Venus Orbiter observations concerning the gravity field of Venus are presented. The gravitational data was obtained from reductions of Doppler radio tracking data for the Orbiter, which is in a highly eccentric orbit with periapsis altitude varying from 145 to 180 km and nearly fixed periapsis latitude of 15 deg N. The global gravity field was obtained through the simultaneous estimation of the orbit state parameters and gravity coefficients from long-period variations in orbital element rates. The global field has been described with sixth degree and order spherical harmonic coefficients, which are capable of resolving the three major topographical features on Venus. Local anomalies have been mapped using line-of-sight accelerations derived from the Doppler residuals between 40 deg N and 10 deg S latitude at approximately 300 km spatial resolution. Gravitational data is observed to correspond to topographical data obtained by radar altimeter, with most of the gravitational anomalies about 20-30 milligals. Simulations evaluating the isostatic states of two topographic features indicate that at least partial isostasy prevails, with the possibility of complete compensation.

  17. Variable sulfur isotope composition of sulfides provide evidence for multiple sources of contamination in the Rustenburg Layered Suite, Bushveld Complex

    Science.gov (United States)

    Magalhães, Nivea; Penniston-Dorland, Sarah; Farquhar, James; Mathez, Edmond A.

    2018-06-01

    The Rustenburg Layered Suite (RLS) of the Bushveld Complex (BC) is famous for its platinum group element (PGE) ore, which is hosted in sulfides. The source of sulfur necessary to generate this type of mineralization is inferred to be the host rock of the intrusion. The RLS has a sulfur isotopic signature that indicates the presence of Archean surface-derived material (Δ³³S ≠ 0) in the magma. This signature, with an average value of Δ³³S = 0.112 ± 0.024 ‰, deviates from the expected Δ³³S value of the mantle of 0 ± 0.008 ‰. Previous work suggested that this signature is uniform throughout the RLS, which contrasts with radiogenic isotopes, which vary throughout the igneous stratigraphy of the RLS. In this study, samples from key intervals within the igneous stratigraphy were analyzed, showing that Δ³³S values vary in the same stratigraphic levels as Sr and Nd isotopes. However, the variation is not consistent; in some levels there is a positive correlation and in others a negative correlation. This observation suggests that in some cases distinct magma pulses contained assimilated sulfur from different sources. Textural analysis shows no evidence for late addition of sulfur. These results also suggest that it is unlikely that large-scale assimilation and/or efficient mixing of host rock material in a single magma chamber occurred during emplacement. The data do not uniquely identify the source of sulfur in the different layers of the RLS, but the variation in sulfur isotope composition and its relationship to radiogenic isotope data calls for a reevaluation of the models for the formation and evolution of the RLS, which has the potential to impact the knowledge of how PGE deposits form.

  18. An analysis of the vapor flow and the heat conduction through the liquid-wick and pipe wall in a heat pipe with single or multiple heat sources

    Science.gov (United States)

    Chen, Ming-Ming; Faghri, Amir

    1990-01-01

    A numerical analysis is presented for the overall performance of heat pipes with single or multiple heat sources. The analysis includes the heat conduction in the wall and liquid-wick regions as well as the compressibility effect of the vapor inside the heat pipe. The two-dimensional elliptic governing equations in conjunction with the thermodynamic equilibrium relation and appropriate boundary conditions are solved numerically. The solutions are in agreement with existing experimental data for the vapor and wall temperatures at both low and high operating temperatures.

  19. Towards single embryo transfer? Modelling clinical outcomes of potential treatment choices using multiple data sources: predictive models and patient perspectives.

    Science.gov (United States)

    Roberts, S A; McGowan, L; Hirst, W M; Brison, D R; Vail, A; Lieberman, B A

    2010-07-01

    In vitro fertilisation (IVF) treatments involve an egg retrieval process, fertilisation and culture of the resultant embryos in the laboratory, and the transfer of embryos back to the mother over one or more transfer cycles. The first transfer is usually of fresh embryos and the remainder may be cryopreserved for future frozen cycles. Most commonly in UK practice two embryos are transferred (double embryo transfer, DET). IVF techniques have led to an increase in the number of multiple births, carrying an increased risk of maternal and infant morbidity. The UK Human Fertilisation and Embryology Authority (HFEA) has adopted a multiple birth minimisation strategy. One way of achieving this would be by increased use of single embryo transfer (SET). To collate cohort data from treatment centres and the HFEA; to develop predictive models for live birth and twinning probabilities from fresh and frozen embryo transfers and predict outcomes from treatment scenarios; to understand patients' perspectives and use the modelling results to investigate the acceptability of twin reduction policies. A multidisciplinary approach was adopted, combining statistical modelling with qualitative exploration of patients' perspectives: interviews were conducted with 27 couples at various stages of IVF treatment at both UK NHS and private clinics; datasets were collated of over 90,000 patients from the HFEA registry and nearly 9000 patients from five clinics, both over the period 2000-5; models were developed to determine live birth and twin outcomes and predict the outcomes of policies for selecting patients for SET or DET in the fresh cycle following egg retrieval and fertilisation, and the predictions were used in simulations of treatments; two focus groups were convened, one NHS and one web based on a patient organisation's website, to present the results of the statistical analyses and explore potential treatment policies. The statistical analysis revealed no characteristics that

  20. Identification of multiple detrital sources for Otway Supergroup sedimentary rocks: implications for basin models and chronostratigraphic correlations

    International Nuclear Information System (INIS)

    Mitchell, M.M.

    1997-01-01

    Correlation of apatite chlorine content (wt%) with apatite fission track age (Ma) from Lower Cretaceous Otway Supergroup sediments at present-day low temperatures, allows identification of two characteristic detrital source regions. Apatites from eroded Palaeozoic basement terrains yield low Cl content (generally <0.5 wt%) and syndepositional fission track ages. Where post-depositional thermal annealing (>70 °C) has significantly reduced the fission track age, provenance information is preserved in the apatite Cl composition alone. In the Otway Supergroup, evidence for contemporaneous volcanism was found in both the Eumeralla Formation (Albian-Aptian), and Crayfish Group (Aptian-Berriasian) in samples located towards the central rift, where less sandy facies dominate. Results suggest that Crayfish Group sediments deposited along the northern margin of the basin were predominantly derived from eroding basement material, while the section located towards the central rift contains a greater proportion of volcanogenic detritus. Evidence from this study suggests that volcanogenic detritus was a distal sediment source throughout the entire early rift phase, prior to the main influx of arc-related volcanogenic material during deposition of the Eumeralla Formation. As diagenesis of volcanogenic sediments significantly reduces porosity and permeability of the sandstones, reservoir quality and petroleum potential may be significantly reduced in the Crayfish Group in deeper parts of the basin where a greater proportion of volcanogenic detritus is suggested. The results presented here provide important information regarding Lower Cretaceous Otway Basin stratigraphy and clearly indicate that this methodology may have wider application. (authors)

  1. Polar gravity fields from GOCE and airborne gravity

    DEFF Research Database (Denmark)

    Forsberg, René; Olesen, Arne Vestergaard; Yidiz, Hasan

    2011-01-01

    Airborne gravity, together with high-quality surface data and ocean satellite altimetric gravity, may supplement GOCE to make consistent, accurate high resolution global gravity field models. In the polar regions, the special challenge of the GOCE polar gap makes the error characteristics of combination models especially sensitive to the correct merging of satellite and surface data. We outline comparisons of GOCE to recent airborne gravity surveys in both the Arctic and the Antarctic. The comparison is done to new 8-month GOCE solutions, as well as to a collocation prediction from GOCE gradients in Antarctica. It is shown how the enhanced gravity field solutions improve the determination of ocean dynamic topography in both the Arctic and across the Drake Passage. For the interior of Antarctica, major airborne gravity programs are currently being carried out, and there is an urgent need...

  2. Gravity signatures of terrane accretion

    Science.gov (United States)

    Franco, Heather; Abbott, Dallas

    1999-01-01

    In modern collisional environments, accreted terranes are bracketed by forearc gravity lows, a gravitational feature which results from the abandonment of the original trench and the initiation of a new trench seaward of the accreted terrane. The size and shape of the gravity low depends on the type of accreted feature and the strength of the formerly subducting plate. Along the Central American trench, the accretion of Gorgona Island caused a seaward trench jump of 48 to 66 km. The relict trench axes show up as gravity lows behind the trench with minimum values of -78 mgal (N of Gorgona) and -49 mgal (S of Gorgona) respectively. These forearc gravity lows have little or no topographic expression. The active trench immediately seaward of these forearc gravity lows has minimum gravity values of -59 mgal (N of Gorgona) and -58 mgal (S of Gorgona), respectively. In the north, the active trench has a less pronounced gravity low than the sediment covered forearc. In the Mariana arc, two Cretaceous seamounts have been accreted to the Eocene arc. The northern seamount is most likely a large block, the southern seamount may be a thrust slice. These more recent accretion events have produced modest forearc topographic and gravity lows in comparison with the topographic and gravity lows within the active trench. However, the minimum values of the Mariana forearc gravity lows are modest only by comparison to the Mariana Trench (-216 mgal); their absolute values are more negative than at Gorgona Island (-145 to -146 mgal). We speculate that the forearc gravity lows and seaward trench jumps near Gorgona Island were produced by the accretion of a hotspot island from a strong plate. The Mariana gravity lows and seaward trench jumps (or thrust slices) were the result of breaking a relatively weak plate close to the seamount edifice. These gravity lows resulting from accretion events should be preserved in older accreted terranes.

  3. A methodology for combining multiple commercial data sources to improve measurement of the food and alcohol environment: applications of geographical information systems

    Directory of Open Access Journals (Sweden)

    Dara D. Mendez

    2014-11-01

    Full Text Available Commercial data sources have been increasingly used to measure and locate community resources. We describe a methodology for combining and comparing the differences in commercial data of the food and alcohol environment. We used data from two commercial databases (InfoUSA and Dun & Bradstreet) for 2003 and 2009 to obtain information on food and alcohol establishments and developed a matching process using computer algorithms and manual review, applying ArcGIS to geocode addresses, standard industrial classification and North American industry classification taxonomy for type of establishment, and establishment name. We constructed population- and area-based density measures (e.g. grocery stores), assessed differences across data sources, and used ArcGIS to map the densities. The matching process resulted in 8,705 and 7,078 unique establishments for 2003 and 2009, respectively. There were more establishments captured in the combined dataset than relying on one data source alone, and the additional establishments captured ranged from 1,255 to 2,752 in 2009. The correlations for the density measures between the two data sources were highest for alcohol outlets (r = 0.75 and 0.79 for per capita and area, respectively) and lowest for grocery stores/supermarkets (r = 0.32 for both). This process for applying geographical information systems to combine multiple commercial data sources and develop measures of the food and alcohol environment captured more establishments than relying on one data source alone. This replicable methodology was found to be useful for understanding the food and alcohol environment when local or public data are limited.
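    The merge-and-measure workflow this record describes can be sketched in a few lines. The field names, normalization rule, and sample records below are illustrative assumptions, not the paper's actual algorithm (which also relied on geocoding and manual review):

```python
# Hypothetical sketch: de-duplicate two commercial listings on a normalized
# name + address key, then compute a per-capita density measure.
import re

def normalize(record):
    """Build a crude match key from establishment name and street address."""
    name = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    addr = re.sub(r"[^a-z0-9]", "", record["address"].lower())
    return (name, addr)

def combine(source_a, source_b):
    """Union of two listings, keeping the first record seen for each key."""
    seen = {}
    for rec in list(source_a) + list(source_b):
        seen.setdefault(normalize(rec), rec)
    return list(seen.values())

def density_per_10k(establishments, population):
    """Population-based density: establishments per 10,000 residents."""
    return 10_000 * len(establishments) / population

a = [{"name": "Joe's Grocery", "address": "12 Main St"},
     {"name": "Corner Liquor", "address": "3 Oak Ave"}]
b = [{"name": "JOES GROCERY", "address": "12 Main St."},  # same store, different spelling
     {"name": "New Deli", "address": "7 Elm Rd"}]

combined = combine(a, b)
print(len(combined))                       # 3 unique establishments
print(density_per_10k(combined, 15_000))   # 2.0 per 10,000 residents
```

    A real matching pipeline would be far more forgiving (fuzzy string distance, geocoded coordinates), which is why the paper pairs the automated step with manual review.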

  4. Improving Realism in Reduced Gravity Simulators

    Science.gov (United States)

    Cowley, Matthew; Harvil, Lauren; Clowers, Kurt; Clark, Timothy; Rajulu, Sudhakar

    2010-01-01

    Since man was first determined to walk on the moon, simulating the lunar environment became a priority. Providing an accurate reduced gravity environment is crucial for astronaut training and hardware testing. This presentation will follow the development of reduced gravity simulators to a final comparison of environments between the currently used systems. During the Apollo program era, multiple systems were built and tested, with several NASA centers having their own unique device. These systems ranged from marionette-like suspension devices where the subject lay on his side, to pneumatically driven offloading harnesses, to parabolic flights. However, only token comparisons, if any, were made between systems. Parabolic flight allows the entire body to fall at the same rate, giving an excellent simulation of reduced gravity as far as the biomechanics and physical perceptions are concerned. While the effects are accurate, there is limited workspace, limited time, and high cost associated with these tests. With all mechanical offload systems, only the parts of the body that are actively offloaded feel any reduced gravity effects; the rest of the body still feels the full effect of gravity. The Partial Gravity System (Pogo) is the current ground-based offload system used for training and testing at the NASA Johnson Space Center. The Pogo is a pneumatic-type system that allows for offloaded motion in the z-axis and free movement in the x-axis, but has limited motion in the y-axis. The pneumatic system itself is limited by cylinder stroke length and response time. The Active Response Gravity Offload System (ARGOS) is a next-generation ground-based offload system, currently in development, that is based on modern robotic manufacturing lines. This system is projected to provide more z-axis travel and full freedom in both the x- and y-axes. Current characterization tests are underway to determine how the ground-based offloading systems perform, how they compare to parabolic

  5. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables import of coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell-el Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the points' names coding) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of points' names coding or more complex attribute table creation).

  6. A Reconstruction Method for the Estimation of Temperatures of Multiple Sources Applied for Nanoparticle-Mediated Hyperthermia

    Directory of Open Access Journals (Sweden)

    Idan Steinberg

    2018-03-01

    Full Text Available Solid malignant tumors are one of the leading causes of death worldwide. Many times complete removal is not possible and alternative methods such as focused hyperthermia are used. Precise control of the hyperthermia process is imperative for the successful application of such treatment. To that end, this research presents a fast method that enables the estimation of deep-tissue heat distribution by capturing and processing the transient temperature at the boundary based on a bio-heat transfer model. The theoretical model is rigorously developed and thoroughly validated by a series of experiments. A 10-fold improvement is demonstrated in resolution and visibility on tissue-mimicking phantoms. The inverse problem is demonstrated as well, with a successful application of the model for imaging deep-tissue embedded heat sources, thereby allowing the physician the ability to dynamically evaluate the hyperthermia treatment efficiency in real time.

  7. A Reconstruction Method for the Estimation of Temperatures of Multiple Sources Applied for Nanoparticle-Mediated Hyperthermia.

    Science.gov (United States)

    Steinberg, Idan; Tamir, Gil; Gannot, Israel

    2018-03-16

    Solid malignant tumors are one of the leading causes of death worldwide. Many times complete removal is not possible and alternative methods such as focused hyperthermia are used. Precise control of the hyperthermia process is imperative for the successful application of such treatment. To that end, this research presents a fast method that enables the estimation of deep-tissue heat distribution by capturing and processing the transient temperature at the boundary based on a bio-heat transfer model. The theoretical model is rigorously developed and thoroughly validated by a series of experiments. A 10-fold improvement is demonstrated in resolution and visibility on tissue-mimicking phantoms. The inverse problem is demonstrated as well, with a successful application of the model for imaging deep-tissue embedded heat sources, thereby allowing the physician the ability to dynamically evaluate the hyperthermia treatment efficiency in real time.

  8. Climate Narratives: Combining multiple sources of information to develop risk management strategies for a municipal water utility

    Science.gov (United States)

    Yates, D. N.; Basdekas, L.; Rajagopalan, B.; Stewart, N.

    2013-12-01

    Municipal water utilities often develop Integrated Water Resource Plans (IWRP), with the goal of providing a reliable, sustainable water supply to customers in a cost-effective manner. Colorado Springs Utilities, a five-service provider (potable and waste water, solid waste, natural gas, and electricity) in Colorado, USA, recently undertook an IWRP, where they incorporated water supply, water demand, water quality, infrastructure reliability, environmental protection, and other measures within the context of complex water rights, such as their critically important 'exchange potential'. The IWRP noted that an uncertain climate was one of the greatest sources of uncertainty to achieving a sustainable water supply for a growing community of users. We describe how historic drought, paleo-climate, and climate change projections were blended together into climate narratives that informed a suite of water resource systems models used by the utility to explore the vulnerabilities of their water systems.

  9. Multiple metabolic alterations exist in mutant PI3K cancers, but only glucose is essential as a nutrient source.

    Directory of Open Access Journals (Sweden)

    Rebecca Foster

    Full Text Available Targeting tumour metabolism is becoming a major new area of pharmaceutical endeavour. Consequently, a systematic search to define whether there are specific energy source dependencies in tumours, and how these might be dictated by upstream driving genetic mutations, is required. The PI3K-AKT-mTOR signalling pathway has a seminal role in regulating diverse cellular processes including cell proliferation and survival, but has also been associated with metabolic dysregulation. In this study, we sought to define how mutations within PIK3CA may affect the metabolic dependency of a cancer cell, using precisely engineered isogenic cell lines. Studies revealed gene expression signatures in PIK3CA mutant cells indicative of a consistent up-regulation of glycolysis. Interestingly, the genes up- and down-regulated varied between isogenic models, suggesting that the primary node of regulation is not the same between models. Additional gene expression changes were also observed, suggesting that metabolic pathways other than glycolysis, such as glutaminolysis, were also affected. Nutrient dependency studies revealed that growth of PIK3CA mutant cells is highly dependent on glucose, whereas glutamine dependency is independent of PIK3CA status. In addition, the glucose dependency exhibited by PIK3CA mutant cells could not be overridden by supplementation with other nutrients. This specific dependence on glucose for growth was further illustrated by studies evaluating the effects of targeted disruption of the glycolytic pathway using siRNA and was also found to be present across a wider panel of cancer cell lines harbouring endogenous PIK3CA mutations. In conclusion, we have found that PIK3CA mutations lead to a shift towards a highly glycolytic phenotype, and that despite suggestions that cancer cells are adept at utilising alternative nutrient sources, PIK3CA mutant cells are not able to compensate for glucose withdrawal. Understanding the metabolic

  10. Sewage pollution in urban stormwater runoff as evident from the widespread presence of multiple microbial and chemical source tracking markers.

    Science.gov (United States)

    Sidhu, J P S; Ahmed, W; Gernjak, W; Aryal, R; McCarthy, D; Palmer, A; Kolotelo, P; Toze, S

    2013-10-01

    The concurrence of human sewage contamination in urban stormwater runoff (n=23) from six urban catchments across Australia was assessed by using both microbial source tracking (MST) and chemical source tracking (CST) markers. Out of 23 stormwater samples, human adenovirus (HAv), human polyomavirus (HPv) and the sewage-associated markers Methanobrevibacter smithii nifH and Bacteroides HF183 were detected in 91%, 56%, 43% and 96% of samples, respectively. Similarly, the CST markers paracetamol (87%), salicylic acid (78%), acesulfame (96%) and caffeine (91%) were frequently detected. Twenty-one samples (91%) were positive for six to eight sewage-related MST and CST markers, and the remaining two samples were positive for five and four markers, respectively. A very good consensus (>91%) observed among the HF183, HAv, acesulfame and caffeine results suggests good predictability of the presence of HAv in samples positive for one of the three other markers. The high prevalence of HAv (91%) also suggests that other enteric viruses may be present in the stormwater samples, which may pose significant health risks. This study underscores the benefits of employing a set of MST and CST markers, which could include monitoring for HF183, adenovirus, caffeine and paracetamol, to accurately detect human sewage contamination along with credible information on the presence of human enteric viruses, which could be used for more reliable public health risk assessments. Based on the results obtained in this study, it is recommended that some degree of treatment of captured stormwater would be required if it were to be used for non-potable purposes. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.

  11. A case study of typhoon-induced gravity waves and the orographic impacts related to Typhoon Mindulle (2004) over Taiwan

    OpenAIRE

    Wu, J. F.; Xue, X. H.; Hoffmann, L.; Dou, X. K.; Li, H. M.; Chen, T. D.

    2015-01-01

    Atmospheric gravity waves (GWs) significantly influence global circulation. Deep convection, particularly that associated with typhoons, is believed to be an important source of gravity waves. Stratospheric gravity waves induced by Typhoon Mindulle (2004) were detected by the Atmospheric Infrared Sounder (AIRS). Semicircular GWs with horizontal wavelengths of 100–400 km were found over Taiwan through an inspection of AIRS radiances at 4.3 μm. Characteristics of the stratospheric gravity waves...

  12. Connecting multiple clouds and mixing real and virtual resources via the open source WNoDeS framework

    CERN Multimedia

    CERN. Geneva; Italiano, Alessandro

    2012-01-01

    In this paper we present the latest developments introduced in the WNoDeS framework (http://web.infn.it/wnodes); we will in particular describe inter-cloud connectivity, support for multiple batch systems, and coexistence of virtual and real environments on the same hardware. Specific effort has been dedicated to the work needed to deploy a "multi-sites" WNoDeS installation. The goal is to give end users the possibility to submit requests for resources using cloud interfaces on several sites in a transparent way. To this end, we will show how we have exploited already existing and deployed middleware within the framework of the IGI (Italian Grid Initiative) and EGI (European Grid Infrastructure) services. In this context, we will also describe the developments that have taken place in order to have the possibility to dynamically exploit public cloud services like Amazon EC2. The latter gives WNoDeS the capability to serve, for example, part of the user requests through external computing resources when ne...

  13. Cosmology and modifications of gravity at large distances

    International Nuclear Information System (INIS)

    Ziour, R.

    2010-01-01

    In the framework of General Relativity, the observed current acceleration of the expansion of the Universe requires the presence of a Dark Energy component, whose nature is not well understood. In order to explain the acceleration of the Universe without introducing such a tantalizing source of energy, other gravitation theories have been designed. This thesis is devoted to the study of some of these modified gravity theories, as well as to the observation methods that could constrain them. The first part of this thesis presents a review of modified gravity theories and their motivations. The second part is devoted to the study of massive gravity theories and of the so-called Vainshtein mechanism, which allows some of the solutions of Massive Gravity to strongly differ from General Relativity at cosmological scales while satisfying the experimental constraints inside the solar system. For the first time, the validity of the Vainshtein mechanism is demonstrated, through the study of specific spherically symmetric solutions. The third part deals with scalar modifications of gravity; a new model of this sort is presented, inspired by the Vainshtein mechanism in Massive Gravity. Finally, the fourth part discusses local, astrophysical and cosmological observations that might constrain modified gravity theories. (author)

  14. Uncertainty in biological monitoring: a framework for data collection and analysis to account for multiple sources of sampling bias

    Science.gov (United States)

    Ruiz-Gutierrez, Viviana; Hooten, Melvin B.; Campbell Grant, Evan H.

    2016-01-01

    Biological monitoring programmes are increasingly relying upon large volumes of citizen-science data to improve the scope and spatial coverage of information, challenging the scientific community to develop design and model-based approaches to improve inference. Recent statistical models in ecology have been developed to accommodate false-negative errors, although current work points to false-positive errors as equally important sources of bias. This is of particular concern for the success of any monitoring programme given that rates as small as 3% could lead to the overestimation of the occurrence of rare events by as much as 50%, and even small false-positive rates can severely bias estimates of occurrence dynamics. We present an integrated, computationally efficient Bayesian hierarchical model to correct for false-positive and false-negative errors in detection/non-detection data. Our model combines independent, auxiliary data sources with field observations to improve the estimation of false-positive rates, when a subset of field observations cannot be validated a posteriori or assumed as perfect. We evaluated the performance of the model across a range of occurrence rates, false-positive and false-negative errors, and quantity of auxiliary data. The model performed well under all simulated scenarios, and we were able to identify critical auxiliary data characteristics which resulted in improved inference. We applied our false-positive model to a large-scale, citizen-science monitoring programme for anurans in the north-eastern United States, using auxiliary data from an experiment designed to estimate false-positive error rates. Not correcting for false-positive rates resulted in biased estimates of occupancy in 4 of the 10 anuran species we analysed, leading to an overestimation of the average number of occupied survey routes by as much as 70%. The framework we present for data collection and analysis is able to efficiently provide reliable inference for
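    The quoted sensitivity (a 3% false-positive rate inflating the apparent occurrence of rare events by roughly 50%) follows from simple arithmetic. The sketch below illustrates that bias only; it is not the paper's Bayesian hierarchical model, and the numbers are the abstract's examples, not its data:

```python
# For a single survey with perfect detection at occupied sites, the naive
# (uncorrected) occupancy estimate is psi + (1 - psi) * p_fp, because
# unoccupied sites can still yield detections at the false-positive rate.

def apparent_occupancy(psi: float, p_fp: float) -> float:
    """Naive occupancy estimate under a per-survey false-positive rate."""
    return psi + (1.0 - psi) * p_fp

psi_true = 0.06           # a rare species occupies 6% of survey sites
p_false_positive = 0.03   # the 3% false-positive rate cited in the abstract

psi_naive = apparent_occupancy(psi_true, p_false_positive)
rel_bias = (psi_naive - psi_true) / psi_true
print(f"naive estimate {psi_naive:.4f}, relative overestimate {rel_bias:.0%}")  # ~47%
```

    The rarer the species (smaller psi), the larger the relative inflation, which is why the paper stresses correcting for false positives in citizen-science data.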

  15. Sr isotope tracing of multiple water sources in a complex river system, Noteć River, central Poland

    Energy Technology Data Exchange (ETDEWEB)

    Zieliński, Mateusz, E-mail: mateusz.zielinski@amu.edu.pl [Institute of Geoecology and Geoinformation, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland); Dopieralska, Jolanta, E-mail: dopieralska@amu.edu.pl [Poznań Science and Technology Park, Adam Mickiewicz University Foundation, Rubież 46, 61-612 Poznań (Poland); Belka, Zdzislaw, E-mail: zbelka@amu.edu.pl [Isotope Laboratory, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland); Walczak, Aleksandra, E-mail: awalczak@amu.edu.pl [Isotope Laboratory, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland); Siepak, Marcin, E-mail: siep@amu.edu.pl [Institute of Geology, Adam Mickiewicz University, Maków Polnych 16, 61-606 Poznań (Poland); Jakubowicz, Michal, E-mail: mjakub@amu.edu.pl [Institute of Geoecology and Geoinformation, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland)

    2016-04-01

    Anthropogenic impact on surface waters and other elements in the environment was investigated in the Noteć River basin in central Poland. The approach was to trace changes in the Sr isotope composition ({sup 87}Sr/{sup 86}Sr) and concentration in space and time. Systematic sampling of the river water shows a very wide range of {sup 87}Sr/{sup 86}Sr ratios, from 0.7089 to 0.7127. This strong variation, however, is restricted to the upper course of the river, whereas the water in the lower course typically shows {sup 87}Sr/{sup 86}Sr values around 0.7104–0.7105. Variations in {sup 87}Sr/{sup 86}Sr are associated with a wide range of Sr concentrations, from 0.14 to 1.32 mg/L. We find that strong variations in {sup 87}Sr/{sup 86}Sr and Sr concentrations can be accounted for by mixing of two end-members: 1) atmospheric waters charged with Sr from the near-surface weathering and wash-out of Quaternary glaciogenic deposits, and 2) waters introduced into the river from an open pit lignite mine. The first reservoir is characterized by a low Sr content and high {sup 87}Sr/{sup 86}Sr ratios, whereas mine waters display opposite characteristics. Anthropogenic pollution is also induced by extensive use of fertilizers which constitute the third source of Sr in the environment. The study has an important implication for future archeological studies in the region. It shows that the present-day Sr isotope signatures of river water, flora and fauna cannot be used unambiguously to determine the “baseline” for bioavailable {sup 87}Sr/{sup 86}Sr in the past. - Highlights: • Sr isotopes fingerprint water sources and their interactions in a complex river system. • Mine waters and fertilizers are critical anthropogenic additions in the river water. • Limited usage of environmental isotopic data in archeological studies. • Sr budget of the river is dynamic and temporary.

  16. Cosmological tests of modified gravity.

    Science.gov (United States)

    Koyama, Kazuya

    2016-04-01

    We review recent progress in the construction of modified gravity models as alternatives to dark energy as well as the development of cosmological tests of gravity. Einstein's theory of general relativity (GR) has been tested accurately within the local universe i.e. the Solar System, but this leaves the possibility open that it is not a good description of gravity at the largest scales in the Universe. This being said, the standard model of cosmology assumes GR on all scales. In 1998, astronomers made the surprising discovery that the expansion of the Universe is accelerating, not slowing down. This late-time acceleration of the Universe has become the most challenging problem in theoretical physics. Within the framework of GR, the acceleration would originate from an unknown dark energy. Alternatively, it could be that there is no dark energy and GR itself is in error on cosmological scales. In this review, we first give an overview of recent developments in modified gravity theories including f(R) gravity, braneworld gravity, Horndeski theory and massive/bigravity theory. We then focus on common properties these models share, such as screening mechanisms they use to evade the stringent Solar System tests. Once armed with a theoretical knowledge of modified gravity models, we move on to discuss how we can test modifications of gravity on cosmological scales. We present tests of gravity using linear cosmological perturbations and review the latest constraints on deviations from the standard ΛCDM model. Since screening mechanisms leave distinct signatures in the non-linear structure formation, we also review novel astrophysical tests of gravity using clusters, dwarf galaxies and stars. The last decade has seen a number of new constraints placed on gravity from astrophysical to cosmological scales. Thanks to on-going and future surveys, cosmological tests of gravity will enjoy another, possibly even more, exciting ten years.

  17. The 2008 Wells, Nevada earthquake sequence: Source constraints using calibrated multiple event relocation and InSAR

    Science.gov (United States)

    Nealy, Jennifer; Benz, Harley M.; Hayes, Gavin; Berman, Eric; Barnhart, William

    2017-01-01

    The 2008 Wells, NV earthquake represents the largest domestic event in the conterminous U.S. outside of California since the October 1983 Borah Peak earthquake in southern Idaho. We present an improved catalog, magnitude complete to 1.6, of the foreshock-aftershock sequence, supplementing the current U.S. Geological Survey (USGS) Preliminary Determination of Epicenters (PDE) catalog with 1,928 well-located events. In order to create this catalog, both subspace and kurtosis detectors are used to obtain an initial set of earthquakes and associated locations. The latter are then calibrated through the implementation of the hypocentroidal decomposition method and relocated using the BayesLoc relocation technique. We additionally perform a finite fault slip analysis of the mainshock using InSAR observations. By combining the relocated sequence with the finite fault analysis, we show that the aftershocks occur primarily updip and along the southwestern edge of the zone of maximum slip. The aftershock locations illuminate areas of post-mainshock strain increase; aftershock depths, ranging from 5 to 16 km, are consistent with InSAR imaging, which shows that the Wells earthquake was a buried source with no observable near-surface offset.

  18. Multiple energy computed tomography for neuroradiology with monochromatic x-rays from the National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Dilmanian, F.A.; Garrett, R.F.; Thomlinson, W.C.; Berman, L.E.; Chapman, L.D.; Gmuer, N.F.; Lazarz, N.M.; Moulin, H.R.; Oversluizen, T.; Slatkin, D.N.; Stojanoff, V.; Volkow, N.D.; Zeman, H.D.; Luke, P.N.; Thompson, A.C.

    1990-01-01

    Monochromatic and tunable 33--100 keV x-rays from the X17 superconducting wiggler of the National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory (BNL) will be used for computed tomography (CT) of the human head and neck. The CT configuration will be one of a fixed horizontal fan-shaped beam and a seated rotating subject. The system, which is under development, will employ a two-crystal monochromator with an energy bandwidth of about 0.1%, and a high-purity germanium linear array detector with 0.5 mm element width and 200 mm total width. Narrow energy bands not only eliminate beam hardening but are ideal for carrying out the following dual-energy methods: (a) dual-photon absorptiometry CT, which provides separate images of the low-Z and the intermediate-Z elements; and (b) K-edge subtraction CT of iodine and perhaps of heavier contrast elements. As a result, the system should provide ∼10-fold improvement in image contrast resolution and in quantitative precision over conventional CT. A prototype system for a 45 mm subject diameter will be ready in 1991, which will be used for studies with phantoms and small animals. The human imaging system will have a field of view of 200 mm. The in-plane spatial resolution in both systems will be 0.5 mm FWHM. 34 refs., 6 figs

  19. Screening the Medicines for Malaria Venture Pathogen Box across Multiple Pathogens Reclassifies Starting Points for Open-Source Drug Discovery.

    Science.gov (United States)

    Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M

    2017-09-01

    Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.

  20. Mapping of wind energy potential over the Gobi Desert in Northwest China based on multiple sources of data

    Science.gov (United States)

    Li, Li; Wang, Xinyuan; Luo, Lei; Zhao, Yanchuang; Zong, Xin; Bachagha, Nabil

    2018-06-01

    In recent years, wind energy has been a fast-growing alternative source of electrical power due to its sustainability. In this paper, the wind energy potential over the Gobi Desert in Northwest China is assessed at the patch scale using geographic information systems (GIS). Data on land cover, topography, and administrative boundaries and 11 years (2000‒2010) of wind speed measurements were collected and used to map and estimate the region's wind energy potential. Based on the results, it was found that continuous regions of geographical potential (GeoP) are located in the middle of the research area (RA), with scattered areas of similar GeoP found in other regions. The results also show that the technical potential (TecP) levels are about 1.72‒2.67 times (2.20 times on average) higher than the actual levels. It was found that the GeoP patches can be divided into four classes: unsuitable regions, suitable regions, more suitable regions, and the most suitable regions. The GeoP estimation shows that 0.41 billion kW of wind energy are potentially available in the RA. The suitable regions account for 25.49%, the more suitable regions 24.45%, and the most suitable regions for more than half of the RA. It is also shown that Xinjiang and Gansu are more suitable for wind power development than Ningxia.

  1. Gender differences in drunk driving prevalence rates and trends: a 20-year assessment using multiple sources of evidence.

    Science.gov (United States)

    Schwartz, Jennifer

    2008-09-01

    This research tracked women's and men's drunk driving rates and the DUI sex ratio in the United States from 1982-2004 using three diverse sources of evidence. Sex-specific prevalence estimates and the sex ratio are derived from official arrest statistics from the Federal Bureau of Investigation, self-reports from the Centers for Disease Control and Prevention, and traffic fatality data from the National Highway Traffic Safety Administration. Drunk driving trends were analyzed using Augmented Dickey-Fuller time series techniques. Female DUI arrest rates increased whereas male rates declined then stabilized, producing a significantly narrower sex ratio. According to self-report and traffic data, women's and men's drunk driving rates declined and the gender gap was unchanged. Women's overrepresentation in arrests relative to their share of offending began in the 1990s and accelerated in 2000. Women's arrest gains, contrasted with no systematic change in DUI behavior, and the timing of this shift suggest an increased vulnerability to arrest. More stringent laws and enforcement directed at less intoxicated offenders may inadvertently target female offending patterns.

  2. Multiple disparities in adult mortality in relation to social and health care perspective: results from different data sources.

    Science.gov (United States)

    Ranabhat, Chhabi Lal; Kim, Chun-Bae; Park, Myung-Bae; Acharaya, Sambhu

    2017-08-08

    Disparity in adult mortality (AM) with reference to social dynamics and health care has not been sufficiently examined. This study aimed to identify the gap in the understanding of AM in relation to religion, political stability, economic level, and universal health coverage (UHC). A cross-national study was performed with different sources of data, using the administrative record linkage theory. Data were compiled from the 2013 World Bank data catalogue by region, The Economist (Political instability index 2013), Stuckler David et al. (Universal health coverage, 2010), and religious categories of all UN member countries. Descriptive statistics, a t-test, an ANOVA followed by a post hoc test, and a linear regression were used where applicable. The average AM rate for males and females was 0.20 ± 0.10 and 0.14 ± 0.10, respectively. There was high disparity of AM between countries with and without UHC and between groups with low and high income. UHC and political stability would significantly reduce AMR by >0.41 in both sexes, and high economic status would reduce male AMR by 0.44 and female AMR by 0.70. It can be concluded that effective health care, UHC, and political stability significantly reduce AM.

  3. Bringing Gravity to Space

    Science.gov (United States)

    Norsk, P.; Shelhamer, M.

    2016-01-01

    This panel will present NASA's plans for ongoing and future research to define the requirements for Artificial Gravity (AG) as a countermeasure against the negative health effects of long-duration weightlessness. AG could mitigate the gravity-sensitive effects of spaceflight across a host of physiological systems. Bringing gravity to space could mitigate the sensorimotor and neuro-vestibular disturbances induced by G-transitions upon reaching a planetary body, and the cardiovascular deconditioning and musculoskeletal weakness induced by weightlessness. Of particular interest for AG during deep-space missions is mitigation of the Visual Impairment Intracranial Pressure (VIIP) syndrome that the majority of astronauts exhibit in space to varying degrees, and which presumably is associated with weightlessness-induced fluid shift from lower to upper body segments. AG could be very effective for reversing the fluid shift and thus help prevent VIIP. The first presentation by Dr. Charles will summarize some of the ground-based and (very little) space-based research that has been conducted on AG by the various space programs. Dr. Paloski will address the use of AG during deep-space exploration-class missions and describe the different AG scenarios such as intra-vehicular, part-of-vehicle, or whole-vehicle centrifugations. Dr. Clement will discuss currently planned NASA research as well as how to coordinate future activities among NASA's international partners. Dr. Barr will describe some possible future plans for using space- and ground-based partial-G analogs to define the relationship between physiological responses and G levels between 0 and 1. Finally, Dr. Stenger will summarize how the human cardiovascular system could benefit from intermittent short-radius centrifugations during long-duration missions.

  4. Is Gravity an Entropic Force?

    Directory of Open Access Journals (Sweden)

    Shan Gao

    2011-04-01

    Full Text Available The remarkable connections between gravity and thermodynamics seem to imply that gravity is not fundamental but emergent, and in particular, as Verlinde suggested, gravity is probably an entropic force. In this paper, we will argue that the idea of gravity as an entropic force is debatable. It is shown that there is no convincing analogy between gravity and entropic force in Verlinde’s example. Neither the holographic screen nor the test particle satisfies all requirements for the existence of an entropic force in a thermodynamic system. Furthermore, we show that the entropy increase of the screen is not caused by its statistical tendency to increase entropy as required by the existence of an entropic force, but is in fact caused by gravity. Therefore, Verlinde’s argument for the entropic origin of gravity is problematic. In addition, we argue that the existence of a minimum size of spacetime, together with the Heisenberg uncertainty principle in quantum theory, may imply the fundamental existence of gravity as a geometric property of spacetime. This may provide further support for the conclusion that gravity is not an entropic force.

  5. Active Response Gravity Offload System

    Science.gov (United States)

    Valle, Paul; Dungan, Larry; Cunningham, Thomas; Lieberman, Asher; Poncia, Dina

    2011-01-01

    The Active Response Gravity Offload System (ARGOS) provides the ability to simulate with one system the gravity effect of planets, moons, comets, asteroids, and microgravity, where the gravity is less than Earth's gravity. The system works by providing a constant force offload through an overhead hoist system and horizontal motion through a rail and trolley system. The facility covers a 20 by 40 ft (≈6.1 by 12.2 m) horizontal area with 15 ft (≈4.6 m) of lifting vertical range.
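
    The constant-force offload principle can be sketched numerically: the hoist carries the difference between the subject's Earth weight and its weight under the simulated gravity level. This is a simplified illustration; the function and values below are assumptions, not ARGOS specifications.

```python
# Sketch of a constant-force gravity offload (assumed simplification of
# the ARGOS principle): the overhead hoist continuously applies the
# difference between the subject's Earth weight and its weight under
# the simulated gravity level.

G_EARTH = 9.81  # m/s^2

def offload_force(mass_kg, g_target):
    """Hoist force in newtons so the net downward acceleration is g_target."""
    if not 0.0 <= g_target <= G_EARTH:
        raise ValueError("target gravity must lie between 0 and Earth gravity")
    return mass_kg * (G_EARTH - g_target)

# A 100 kg subject under simulated lunar gravity (~1.62 m/s^2):
force = offload_force(100.0, 1.62)  # -> 819.0 N carried by the hoist
```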

  6. Teleparallel equivalent of Lovelock gravity

    Science.gov (United States)

    González, P. A.; Vásquez, Yerko

    2015-12-01

    There is a growing interest in modified gravity theories based on torsion, as these theories exhibit interesting cosmological implications. In this work, inspired by the teleparallel formulation of general relativity, we present its extension to Lovelock gravity, the most natural extension of general relativity in higher-dimensional space-times. First, we review the teleparallel equivalent of general relativity and Gauss-Bonnet gravity, and then we construct the teleparallel equivalent of Lovelock gravity. In order to achieve this goal, we use the vielbein and the connection without imposing the Weitzenböck connection. Then, we extract the teleparallel formulation of the theory by requiring the curvature to vanish.

  7. Differentiation among Multiple Sources of Anthropogenic Nitrate in a Complex Groundwater System using Dual Isotope Systematics: A case study from Mortandad Canyon, New Mexico

    Science.gov (United States)

    Larson, T. E.; Perkins, G.; Longmire, P.; Heikoop, J. M.; Fessenden, J. E.; Rearick, M.; Fabyrka-Martin, J.; Chrystal, A. E.; Dale, M.; Simmons, A. M.

    2009-12-01

    The groundwater system beneath Los Alamos National Laboratory has been affected by multiple sources of anthropogenic nitrate contamination. Average NO3-N concentrations of up to 18.2±1.7 mg/L have been found in wells in the perched intermediate aquifer beneath one of the more affected sites within Mortandad Canyon. Sources of nitrate potentially reaching the alluvial and intermediate aquifers include: (1) sewage effluent, (2) neutralized nitric acid, (3) neutralized 15N-depleted nitric acid (treated waste from an experiment enriching nitric acid in 15N), and (4) natural background nitrate. Each of these sources is unique in δ18O and δ15N space. Using nitrate stable isotope ratios, a mixing model for the three anthropogenic sources of nitrate was established, after applying a linear subtraction of the background component. The spatial and temporal variability in nitrate contaminant sources through Mortandad Canyon is clearly shown in ternary plots. While microbial denitrification has been shown to change groundwater nitrate stable isotope ratios in other settings, the redox potential, relatively high dissolved oxygen content, increasing nitrate concentrations over time, and lack of observed NO2 in these wells suggest minimal changes to the stable isotope ratios have occurred. Temporal trends indicate that the earliest form of anthropogenic nitrate in this watershed was neutralized nitric acid. Alluvial wells preserve a trend of decreasing nitrate concentrations and mixing models show decreasing contributions of 15N-depleted nitric acid. Nearby intermediate wells show increasing nitrate concentrations and mixing models indicate a larger component derived from 15N-depleted nitric acid. These data indicate that the pulse of neutralized 15N-depleted nitric acid that was released into Mortandad Canyon between 1986 and 1989 has infiltrated through the alluvial aquifer and is currently affecting two intermediate wells. This hypothesis is consistent with previous
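
    A dual-isotope mixing model of the kind described above can be sketched as a small linear system: two isotope balances plus a mass balance determine the three source fractions. The endmember signatures below are invented for illustration and are not the measured Mortandad Canyon values.

```python
# Three-endmember mixing model in dual-isotope (d15N, d18O) space, of the
# kind used to apportion nitrate sources after background subtraction.
# Endmember signatures are invented for illustration only.

def mix_fractions(sample, endmembers):
    """Solve f1*E1 + f2*E2 + f3*E3 = sample with f1 + f2 + f3 = 1
    via Cramer's rule on the 3x3 isotope/mass-balance system."""
    (a1, b1), (a2, b2), (a3, b3) = endmembers
    A = [[a1, a2, a3],        # d15N balance
         [b1, b2, b3],        # d18O balance
         [1.0, 1.0, 1.0]]     # mass balance
    y = [sample[0], sample[1], 1.0]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)
    fractions = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = y[r]
        fractions.append(det3(Ac) / d)
    return fractions

# Hypothetical endmembers: sewage, nitric acid, and 15N-depleted nitric acid.
ends = [(10.0, 1.0), (0.0, 23.0), (-25.0, 23.0)]
f = mix_fractions((-2.0, 16.4), ends)  # -> fractions [0.3, 0.5, 0.2]
```

    Because the two nitric-acid endmembers here share the same δ18O, the system stays solvable only through their distinct δ15N values, mirroring why the dual-isotope approach is needed to separate the 15N-depleted source.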

  8. Development of a flattening filter free multiple source model for use as an independent, Monte Carlo, dose calculation, quality assurance tool for clinical trials.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution-submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and the heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm² to 40 × 40 cm². The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for the FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
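
    The ±2%/2 mm gamma criterion used for validation can be illustrated with a minimal one-dimensional sketch: each reference point searches the evaluated curve for the smallest combined dose-difference/distance metric, and passes when that value is ≤ 1. This is a toy illustration, not IROC-H's analysis code.

```python
# Minimal one-dimensional sketch of the gamma-index test (here +/-2%/2 mm)
# used to compare calculated and measured dose curves; a point passes when
# its gamma value is <= 1.
import math

def gamma_1d(ref, evl, spacing_mm, dose_tol=0.02, dist_mm=2.0):
    """Per-point gamma for two dose curves sampled on the same grid
    (global dose normalisation to the reference maximum)."""
    d_max = max(ref)
    gammas = []
    for i, d_ref in enumerate(ref):
        best = math.inf
        for j, d_ev in enumerate(evl):
            dose_term = (d_ev - d_ref) / (dose_tol * d_max)   # % dose difference
            dist_term = (j - i) * spacing_mm / dist_mm        # distance to agreement
            best = min(best, math.hypot(dose_term, dist_term))
        gammas.append(best)
    return gammas

ref = [1.00, 0.95, 0.88, 0.80, 0.72]
evl = [1.00, 0.94, 0.89, 0.80, 0.71]
pass_rate = sum(g <= 1.0 for g in gamma_1d(ref, evl, spacing_mm=1.0)) / len(ref)
# -> pass_rate == 1.0 for this well-matched pair
```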

  9. The gravity apple tree

    International Nuclear Information System (INIS)

    Aldama, Mariana Espinosa

    2015-01-01

    The gravity apple tree is a genealogical tree of the gravitation theories developed during the past century. The graphic representation is full of information, such as guides to heuristic principles, names of main proponents, and dates and references for the original articles (see under Supplementary Data for the graphic representation). This visual presentation and its particular classification allow a quick synthetic view of a plurality of theories, many of them well validated in the Solar System domain. Its diachronic structure organizes information in the shape of a tree following similarities through a formal concept analysis. It can be used for educational purposes or as a tool for philosophical discussion. (paper)

  10. Tests and comparisons of gravity models.

    Science.gov (United States)

    Marsh, J. G.; Douglas, B. C.

    1971-01-01

    Optical observations of the GEOS satellites were used to obtain orbital solutions with different sets of geopotential coefficients. The solutions were compared before and after modification to high order terms (necessary because of resonance) and were then analyzed by comparing subsequent observations with predicted trajectories. The most important source of error in orbit determination and prediction for the GEOS satellites is the effect of resonance found in most published sets of geopotential coefficients. Modifications to the sets yield greatly improved orbits in most cases. The results of these comparisons suggest that with the best optical tracking systems and gravity models, satellite position error due to gravity model uncertainty can reach 50-100 m during a heavily observed 5-6 day orbital arc. If resonant coefficients are estimated, the uncertainty is reduced considerably.

  11. Restricted gravity: Abelian projection of Einstein's theory

    International Nuclear Information System (INIS)

    Cho, Y.M.

    2013-01-01

    Treating Einstein's theory as a gauge theory of the Lorentz group, we decompose the gravitational connection Γμ into the restricted connection, made of the potential of the maximal Abelian subgroup H of the Lorentz group G, and the valence connection, made of the G/H part of the potential, which transforms covariantly under Lorentz gauge transformations. With this we show that Einstein's theory can be decomposed into a restricted gravity, built from the restricted connection and retaining the full Lorentz gauge invariance, with the valence connection acting as the gravitational source. The decomposition shows the existence of a restricted theory of gravitation which has the full general invariance but is much simpler than Einstein's theory. Moreover, it tells us that the restricted gravity can be written as an Abelian gauge theory.

  12. Quadratic gravity in first order formalism

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, Enrique; Anero, Jesus; Gonzalez-Martin, Sergio, E-mail: enrique.alvarez@uam.es, E-mail: jesusanero@gmail.com, E-mail: sergio.gonzalez.martin@uam.es [Departamento de Física Teórica and Instituto de Física Teórica (IFT-UAM/CSIC), Universidad Autónoma de Madrid, Cantoblanco, 28049, Madrid (Spain)

    2017-10-01

    We consider the most general action for gravity which is quadratic in curvature. In this case the first order and second order formalisms are not equivalent. This framework is a good candidate for a unitary and renormalizable theory of the gravitational field; in particular, there are no propagators falling off faster than 1/p². The drawback is of course that the parameter space of the theory is too big, so that in many cases it will be far away from a theory of gravity alone. In order to analyze this issue, the interaction between external sources was examined in some detail. We find that this interaction is conveyed mainly by propagation of the three-index connection field. At any rate the theory as it stands is in the conformal invariant phase; only when Weyl invariance is broken through the coupling to matter can an Einstein-Hilbert term (and its corresponding Planck mass scale) be generated by quantum corrections.

  13. Stochastic gravity: a primer with applications

    International Nuclear Information System (INIS)

    Hu, B L; Verdaguer, E

    2003-01-01

    Stochastic semiclassical gravity of the 1990s is a theory naturally evolved from semiclassical gravity of the 1970s and 1980s. It improves on the semiclassical Einstein equation with source given by the expectation value of the stress-energy tensor of quantum matter fields in curved spacetime by incorporating an additional source due to their fluctuations. In stochastic semiclassical gravity the main object of interest is the noise kernel, the vacuum expectation value of the (operator-valued) stress-energy bi-tensor, and the centrepiece is the (semiclassical) Einstein-Langevin equation. We describe this new theory via two approaches: the axiomatic and the functional. The axiomatic approach is useful to see the structure of the theory from the framework of semiclassical gravity, showing the link from the mean value of the energy-momentum tensor to their correlation functions. The functional approach uses the Feynman-Vernon influence functional and the Schwinger-Keldysh closed-time-path effective action methods which are convenient for computations. It also brings out the open system concepts and the statistical and stochastic contents of the theory such as dissipation, fluctuations, noise and decoherence. We then describe the applications of stochastic gravity to the backreaction problems in cosmology and black-hole physics. In the first problem, we study the backreaction of conformally coupled quantum fields in a weakly inhomogeneous cosmology. In the second problem, we study the backreaction of a thermal field in the gravitational background of a quasi-static black hole (enclosed in a box) and its fluctuations. These examples serve to illustrate closely the ideas and techniques presented in the first part. This topical review is intended as a first introduction providing readers with some basic ideas and working knowledge. Thus, we place more emphasis here on pedagogy than completeness. 
(Further discussions of ideas, issues and ongoing research topics can be found

  14. Stochastic gravity: a primer with applications

    Energy Technology Data Exchange (ETDEWEB)

    Hu, B L [Department of Physics, University of Maryland, College Park, MD 20742-4111 (United States); Verdaguer, E [Departament de Fisica Fonamental and CER en Astrofisica Fisica de Particules i Cosmologia, Universitat de Barcelona, Av. Diagonal 647, 08028 Barcelona (Spain)

    2003-03-21

    Stochastic semiclassical gravity of the 1990s is a theory naturally evolved from semiclassical gravity of the 1970s and 1980s. It improves on the semiclassical Einstein equation with source given by the expectation value of the stress-energy tensor of quantum matter fields in curved spacetime by incorporating an additional source due to their fluctuations. In stochastic semiclassical gravity the main object of interest is the noise kernel, the vacuum expectation value of the (operator-valued) stress-energy bi-tensor, and the centrepiece is the (semiclassical) Einstein-Langevin equation. We describe this new theory via two approaches: the axiomatic and the functional. The axiomatic approach is useful to see the structure of the theory from the framework of semiclassical gravity, showing the link from the mean value of the energy-momentum tensor to their correlation functions. The functional approach uses the Feynman-Vernon influence functional and the Schwinger-Keldysh closed-time-path effective action methods which are convenient for computations. It also brings out the open system concepts and the statistical and stochastic contents of the theory such as dissipation, fluctuations, noise and decoherence. We then describe the applications of stochastic gravity to the backreaction problems in cosmology and black-hole physics. In the first problem, we study the backreaction of conformally coupled quantum fields in a weakly inhomogeneous cosmology. In the second problem, we study the backreaction of a thermal field in the gravitational background of a quasi-static black hole (enclosed in a box) and its fluctuations. These examples serve to illustrate closely the ideas and techniques presented in the first part. This topical review is intended as a first introduction providing readers with some basic ideas and working knowledge. Thus, we place more emphasis here on pedagogy than completeness. 
(Further discussions of ideas, issues and ongoing research topics can be found

  15. Sequencing-Based Analysis of the Bacterial and Fungal Composition of Kefir Grains and Milks from Multiple Sources

    Science.gov (United States)

    Marsh, Alan J.; O’Sullivan, Orla; Hill, Colin; Ross, R. Paul; Cotter, Paul D.

    2013-01-01

    Kefir is a fermented milk-based beverage to which a number of health-promoting properties have been attributed. The microbes responsible for the fermentation of milk to produce kefir consist of a complex association of bacteria and yeasts, bound within a polysaccharide matrix, known as the kefir grain. The consistency of this microbial population, and that present in the resultant beverage, has been the subject of a number of previous, almost exclusively culture-based, studies which have indicated differences depending on geographical location and culture conditions. However, culture-based identification studies are limited by virtue of only detecting species with the ability to grow on the specific medium used and thus culture-independent, molecular-based techniques offer the potential for a more comprehensive analysis of such communities. Here we describe a detailed investigation of the microbial population, both bacterial and fungal, of kefir, using high-throughput sequencing to analyse 25 kefir milks and associated grains sourced from 8 geographically distinct regions. This is the first occasion that this technology has been employed to investigate the fungal component of these populations or to reveal the microbial composition of such an extensive number of kefir grains or milks. As a result several genera and species not previously identified in kefir were revealed. Our analysis shows that the bacterial populations in kefir are dominated by 2 phyla, the Firmicutes and the Proteobacteria. It was also established that the fungal populations of kefir were dominated by the genera Kazachstania, Kluyveromyces and Naumovozyma, but that a variable sub-dominant population also exists. PMID:23894461

  16. Sequencing-based analysis of the bacterial and fungal composition of kefir grains and milks from multiple sources.

    Directory of Open Access Journals (Sweden)

    Alan J Marsh

    Full Text Available Kefir is a fermented milk-based beverage to which a number of health-promoting properties have been attributed. The microbes responsible for the fermentation of milk to produce kefir consist of a complex association of bacteria and yeasts, bound within a polysaccharide matrix, known as the kefir grain. The consistency of this microbial population, and that present in the resultant beverage, has been the subject of a number of previous, almost exclusively culture-based, studies which have indicated differences depending on geographical location and culture conditions. However, culture-based identification studies are limited by virtue of only detecting species with the ability to grow on the specific medium used and thus culture-independent, molecular-based techniques offer the potential for a more comprehensive analysis of such communities. Here we describe a detailed investigation of the microbial population, both bacterial and fungal, of kefir, using high-throughput sequencing to analyse 25 kefir milks and associated grains sourced from 8 geographically distinct regions. This is the first occasion that this technology has been employed to investigate the fungal component of these populations or to reveal the microbial composition of such an extensive number of kefir grains or milks. As a result several genera and species not previously identified in kefir were revealed. Our analysis shows that the bacterial populations in kefir are dominated by 2 phyla, the Firmicutes and the Proteobacteria. It was also established that the fungal populations of kefir were dominated by the genera Kazachstania, Kluyveromyces and Naumovozyma, but that a variable sub-dominant population also exists.

  17. Contrasts between chemical and physical estimates of baseflow help discern multiple sources of water contributing to rivers

    Science.gov (United States)

    Cartwright, I.; Gilfedder, B.; Hofmann, H.

    2013-05-01

    This study compares geochemical and physical methods of estimating baseflow in the upper reaches of the Barwon River, southeast Australia. Estimates of baseflow from physical techniques such as local minima and recursive digital filters are higher than those based on chemical mass balance using continuous electrical conductivity (EC). Between 2001 and 2011 the baseflow flux calculated using chemical mass balance is between 1.8 × 10³ and 1.5 × 10⁴ ML yr⁻¹ (15 to 25% of the total discharge in any one year) whereas recursive digital filters yield baseflow fluxes of 3.6 × 10³ to 3.8 × 10⁴ ML yr⁻¹ (19 to 52% of discharge) and the local minimum method yields baseflow fluxes of 3.2 × 10³ to 2.5 × 10⁴ ML yr⁻¹ (13 to 44% of discharge). These differences most probably reflect how the different techniques characterise baseflow. Physical methods probably aggregate much of the water from delayed sources as baseflow. However, as many delayed transient water stores (such as bank return flow or floodplain storage) are likely to be geochemically similar to surface runoff, chemical mass balance calculations aggregate them with the surface runoff component. The mismatch between geochemical and physical estimates is greatest following periods of high discharge in winter, implying that these transient stores of water feed the river for several weeks to months. Consistent with these interpretations, modelling of bank storage indicates that bank return flows provide water to the river for several weeks after flood events. EC vs. discharge variations during individual flow events also imply that an inflow of low EC water stored within the banks or on the floodplain occurs as discharge falls. The joint use of physical and geochemical techniques allows a better understanding of the different components of water that contribute to river flow, which is important for the management and protection of water resources.
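
    The two families of estimates contrasted above can be sketched in a few lines: a two-component chemical mass balance on EC, and a one-pass Lyne-Hollick recursive digital filter. The endmember EC values and the discharge series below are illustrative assumptions, not Barwon River data.

```python
# Two ways of estimating baseflow, as contrasted in the study: a chemical
# mass balance on electrical conductivity (EC) and a one-pass Lyne-Hollick
# recursive digital filter. All numbers are illustrative assumptions.

def cmb_baseflow(q, ec_river, ec_runoff, ec_baseflow):
    """Two-component chemical mass balance: baseflow share of discharge q."""
    frac = (ec_river - ec_runoff) / (ec_baseflow - ec_runoff)
    return q * min(max(frac, 0.0), 1.0)  # clamp to a physical fraction

def lyne_hollick_baseflow(discharge, alpha=0.925):
    """Single forward pass of the Lyne-Hollick filter: the filter tracks
    quickflow, and baseflow is the (constrained) remainder."""
    quick = [discharge[0]]  # crude initialisation: first step all quickflow
    for k in range(1, len(discharge)):
        f = alpha * quick[-1] + 0.5 * (1 + alpha) * (discharge[k] - discharge[k - 1])
        quick.append(max(f, 0.0))
    return [q - min(f, q) for q, f in zip(discharge, quick)]

# Stream at 100 ML/day with EC 300 uS/cm, runoff 100 uS/cm, baseflow 900 uS/cm:
bf = cmb_baseflow(100.0, 300.0, 100.0, 900.0)  # -> 25.0 ML/day from baseflow
```

    The contrast in the study follows directly: the filter attributes slowly receding flow (e.g. bank return) to baseflow, while the mass balance counts only water carrying the high-EC groundwater signature.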

  18. Technical Note: A new global database of trace gases and aerosols from multiple sources of high vertical resolution measurements

    Directory of Open Access Journals (Sweden)

    G. E. Bodeker

    2008-09-01

    Full Text Available A new database of trace gases and aerosols with global coverage, derived from high vertical resolution profile measurements, has been assembled as a collection of binary data files; hereafter referred to as the "Binary DataBase of Profiles" (BDBP). Version 1.0 of the BDBP, described here, includes measurements from different satellite-based (HALOE, POAM II and III, SAGE I and II) and ground-based (ozonesonde) measurement systems. In addition to the primary product of ozone, secondary measurements of other trace gases, aerosol extinction, and temperature are included. All data are subjected to very strict quality control and for every measurement a percentage error on the measurement is included. To facilitate analyses, each measurement is added to 3 different instances (3 different grids) of the database, where measurements are indexed by: (1) geographic latitude, longitude, altitude (in 1 km steps) and time, (2) geographic latitude, longitude, pressure (at levels ~1 km apart) and time, (3) equivalent latitude, potential temperature (8 levels from 300 K to 650 K) and time.

    In contrast to existing zonal mean databases, by including a wider range of measurement sources (both satellite and ozonesonde), the BDBP is sufficiently dense to permit calculation of changes in ozone by latitude, longitude and altitude. In addition, by including other trace gases such as water vapour, this database can be used for comprehensive radiative transfer calculations. By providing the original measurements rather than derived monthly means, the BDBP is applicable to a wider range of applications than databases containing only monthly mean data. Monthly mean zonal mean ozone concentrations calculated from the BDBP are compared with the database of Randel and Wu, which has been used in many earlier analyses. As opposed to that database, which is generated from regression model fits, the BDBP uses the original (quality controlled) measurements with no smoothing applied in any
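
    The triple-grid indexing described for the BDBP can be sketched with a toy in-memory grid: each measurement is registered under a key built from its latitude band, longitude band and vertical bin. The bin sizes and dictionary layout below are assumptions for illustration, not the actual binary file format.

```python
# Toy sketch of the BDBP triple-grid indexing idea (assumed layout, not
# the actual binary format): every measurement is registered under a key
# built from coarse geographic bands and a 1 km altitude step.
import math

def altitude_grid_key(lat, lon, alt_km, lat_step=5.0, lon_step=10.0):
    """Bin a profile measurement into (lat band, lon band, 1 km altitude bin)."""
    return (math.floor(lat / lat_step),
            math.floor(lon / lon_step),
            math.floor(alt_km))

grid = {}
# An ozone value with its percentage error at 47.3 N, 8.5 E, 22.6 km:
grid.setdefault(altitude_grid_key(47.3, 8.5, 22.6), []).append((3.1e12, 5.0))
```

    The pressure and potential-temperature grids of the database would be analogous dictionaries keyed on their own vertical coordinate.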

  19. Sequencing-based analysis of the bacterial and fungal composition of kefir grains and milks from multiple sources.

    Science.gov (United States)

    Marsh, Alan J; O'Sullivan, Orla; Hill, Colin; Ross, R Paul; Cotter, Paul D

    2013-01-01

    Kefir is a fermented milk-based beverage to which a number of health-promoting properties have been attributed. The microbes responsible for the fermentation of milk to produce kefir consist of a complex association of bacteria and yeasts, bound within a polysaccharide matrix, known as the kefir grain. The consistency of this microbial population, and that present in the resultant beverage, has been the subject of a number of previous, almost exclusively culture-based, studies which have indicated differences depending on geographical location and culture conditions. However, culture-based identification studies are limited by virtue of only detecting species with the ability to grow on the specific medium used and thus culture-independent, molecular-based techniques offer the potential for a more comprehensive analysis of such communities. Here we describe a detailed investigation of the microbial population, both bacterial and fungal, of kefir, using high-throughput sequencing to analyse 25 kefir milks and associated grains sourced from 8 geographically distinct regions. This is the first occasion that this technology has been employed to investigate the fungal component of these populations or to reveal the microbial composition of such an extensive number of kefir grains or milks. As a result several genera and species not previously identified in kefir were revealed. Our analysis shows that the bacterial populations in kefir are dominated by 2 phyla, the Firmicutes and the Proteobacteria. It was also established that the fungal populations of kefir were dominated by the genera Kazachstania, Kluyveromyces and Naumovozyma, but that a variable sub-dominant population also exists.

  20. Incorporating community and multiple perspectives in the development of acceptable drinking water source protection policy in catchments facing recreation demands.

    Science.gov (United States)

    Syme, Geoffrey J; Nancarrow, Blair E

    2013-11-15

    The protection of catchment areas for drinking water quality has become an increasingly disputed issue in Australia and internationally. This is particularly the case in regard to the growing demand for nature-based and rural recreation. Currently the policy for the protection of drinking water in Western Australia is to enforce a 2 km exclusion zone, surrounded by a much larger area to which recreators have only limited and prescribed access. The debate between recreators and water management agencies has been lively, culminating in a recent state government enquiry. This paper describes the second phase of a three-phase study to develop a methodology for defensible policy formulation which accounts for the points of view of all stakeholders. We examine general community, active recreator and professional views on the current policy of catchment protection and five proposed alternatives using a social judgement theory approach. Key attitudinal determinants of the preferences for policies were identified. Overall the recreators did not support the current policy despite strong support from both the general community and the professional group. Nevertheless, it was evident that there was some support by the community for policies that would enable a slight relaxation of current recreational exclusion. It was also evident that a significant proportion of the general community were dissatisfied with current recreational opportunities and that, in future, it may be less easy to police exclusion zones even if current policy is maintained. The potential for future integration of recreational and water source protection is discussed, as well as the benefits of community research in understanding policy preferences in this regard. Copyright © 2013 Elsevier Ltd. All rights reserved.