WorldWideScience

Sample records for sources time addresses

  1. SAVAH: Source Address Validation with Host Identity Protocol

    Science.gov (United States)

    Kuptsov, Dmitriy; Gurtov, Andrei

    The explosive growth of the Internet and the lack of mechanisms that validate the authenticity of a packet's source have produced serious security and accounting issues. In this paper, we propose validating source addresses in a LAN using the Host Identity Protocol (HIP) deployed in a first-hop router. Compared to alternative solutions such as CGA, our approach is suitable for both IPv4 and IPv6. We have implemented SAVAH in Wi-Fi access points and evaluated its overhead for clients and the first-hop router.

  2. Time domain localization technique with sparsity constraint for imaging acoustic sources

    Science.gov (United States)

    Padois, Thomas; Doutres, Olivier; Sgard, Franck; Berry, Alain

    2017-09-01

    This paper addresses a source localization technique in the time domain for broadband acoustic sources. The objective is to accurately and quickly detect the position and amplitude of noise sources in workplaces in order to propose adequate noise control options and prevent workers' hearing loss or safety risks. First, the generalized cross-correlation associated with a spherical microphone array is used to generate an initial noise source map. Then a linear inverse problem is defined to improve this initial map. Commonly, the linear inverse problem is solved with an l2-regularization. In this study, two sparsity constraints are used to solve the inverse problem: the orthogonal matching pursuit and the truncated Newton interior-point method. Synthetic data are used to highlight the performance of the technique. High-resolution imaging is achieved for various acoustic source configurations. Moreover, the amplitudes of the acoustic sources are correctly estimated. A comparison of computation times shows that the technique is compatible with quasi real-time generation of noise source maps. Finally, the technique is tested with real data.
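
    One of the two sparse solvers named above, orthogonal matching pursuit, is easy to sketch. The dictionary below is an orthonormal toy (so recovery is exact), not the paper's actual beamforming operator:

```python
import numpy as np

def omp(A, y, n_nonzero):
    """Orthogonal matching pursuit: greedily pick the dictionary column
    most correlated with the residual, then re-fit by least squares."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(n_nonzero):
        # column of A most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # least-squares re-fit of the amplitudes on the selected support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - A @ x
    return x

# toy example: recover a 2-sparse source amplitude vector through an
# orthonormal dictionary (built here just so recovery is exact)
rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((40, 40)))
x_true = np.zeros(40)
x_true[[5, 17]] = [2.0, -1.5]
x_hat = omp(A, A @ x_true, n_nonzero=2)
```

    On noiseless data with a well-conditioned dictionary, the greedy selection finds the true support and the least-squares re-fit returns the exact amplitudes.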

  3. Addressing Parental Vaccine Concerns: Engagement, Balance, and Timing.

    Directory of Open Access Journals (Sweden)

    Jason M Glanz

    2015-08-01

    Full Text Available The recent United States measles epidemic has sparked another contentious national discussion about childhood vaccination. A growing number of parents are expressing concerns about the safety of vaccines, often fueled by misinformation from the internet, books, and other nonmedical sources. Many of these concerned parents are choosing to refuse or delay childhood vaccines, placing their children and surrounding communities at risk for serious diseases that are nearly 100% preventable with vaccination. Between 10% and 15% of parents are asking physicians to space out the timing of vaccines, which often poses an ethical dilemma for physicians. This trend reflects a tension between personal liberty and public health, as parents fight to control the decisions that affect the health of their children and public health officials strive to maintain high immunization rates to prevent outbreaks of vaccine-preventable diseases. Interventions to address this emerging public health issue are needed. We describe a framework by which web-based interventions can be used to help parents make evidence-based decisions about childhood vaccinations.

  4. Addressing Software Engineering Issues in Real-Time Software ...

    African Journals Online (AJOL)

    Addressing Software Engineering Issues in Real-Time Software ... systems, manufacturing process, process control, military, space exploration, and ... but also physical properties such as timeliness, Quality of Service and reliability.

  5. Absolute symbolic addressing, a structure making time-sharing easier

    International Nuclear Information System (INIS)

    Debraine, P.

    1968-08-01

    Time-sharing of computers requires a number of conditions to be met, in particular efficient dynamic loading of programs and data. This paper describes a paging method that makes linkages with a minimum of table-lookup operations. The principle is to use associative memory registers for calling blocks of physical memory, the block address being given by the concatenation of a file number (located in a base register) and a page number (located in the instruction proper). The position within the block is given by a displacement located in the instruction. A second, associated base register contains the local part (page number + displacement) of the base address. This extended base register system allows executing programs in a very large programming complex without loss of time. The addresses are fixed at assembly time and the blocks can be loaded anywhere without modification for execution. The various problems associated with the execution of complex programs are presented in this context and shown to be easily solved by the proposed system, the realization of which would be very easy starting from existing computer structures. (author) [fr]
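
    The translation scheme described above can be mimicked in a short sketch. The block size, the numeric values, and the dictionary standing in for the associative memory registers are illustrative assumptions, not the paper's hardware:

```python
# Toy model of the extended-base-register scheme: an associative memory
# maps (file number, page number) -> physical block, and the displacement
# in the instruction locates the word inside the block.
BLOCK_SIZE = 256  # assumed block size in words

class AssociativeMemory:
    def __init__(self):
        self.registers = {}  # (file_no, page_no) -> physical block index

    def load_block(self, file_no, page_no, phys_block):
        # a block can be loaded anywhere without modifying the program
        self.registers[(file_no, page_no)] = phys_block

    def translate(self, file_no, page_no, displacement):
        # block address = concatenation of the file number (base register)
        # and the page number (instruction); the displacement then selects
        # the word within the physical block
        phys_block = self.registers[(file_no, page_no)]
        return phys_block * BLOCK_SIZE + displacement

am = AssociativeMemory()
am.load_block(file_no=3, page_no=7, phys_block=42)
addr = am.translate(3, 7, 19)  # 42 * 256 + 19
```

    Because the program only ever names (file, page, displacement), relocating the block changes the associative map but not the assembled addresses, which is the point of the scheme.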

  6. MATCHING ALTERNATIVE ADDRESSES: A SEMANTIC WEB APPROACH

    Directory of Open Access Journals (Sweden)

    S. Ariannamazi

    2015-12-01

    Full Text Available The rapid development of crowd-sourcing or volunteered geographic information (VGI) provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes these datasets to become outdated in short periods of time. Changes made to spatial and aspatial attributes of features, such as names and addresses, can cause confusion or ambiguity in processes that rely on a feature's literal information, such as addressing and geocoding. VGI sources neither conform to specific vocabularies nor remain in a specific schema for long periods of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology in pursuit of an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. A vast amount of literal-spatial data demonstrates the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous generates a helpful tool based on spatial data integration and lexical annotation matching and disambiguation.
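
    Purely as a toy illustration of the annotation-matching idea (the paper envisions Semantic Web ontologies, not a hard-coded table), alternative addresses can be normalized against a small abbreviation vocabulary and compared by token overlap; the vocabulary and example strings below are invented:

```python
# hypothetical abbreviation vocabulary; a real system would draw on
# ontologies and Semantic Web standards rather than a fixed table
ABBREV = {"st": "street", "ave": "avenue", "rd": "road"}

def normalize(addr):
    # lower-case, strip punctuation, expand known abbreviations
    tokens = addr.lower().replace(",", " ").replace(".", " ").split()
    return {ABBREV.get(t, t) for t in tokens}

def similarity(a, b):
    ta, tb = normalize(a), normalize(b)
    return len(ta & tb) / len(ta | tb)  # Jaccard similarity of tokens

match = similarity("12 Main St.", "12 main street")
no_match = similarity("12 Main St.", "99 Oak Ave")
```

    Two surface forms of the same address normalize to the same token set, while unrelated addresses share no tokens; semantic matching generalizes this by reasoning over concept hierarchies instead of literal tokens.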

  7. Popular sayings that address time and PLE/PL2 teaching/learning

    Directory of Open Access Journals (Sweden)

    Danúsia Torres-dos-Santos

    2011-12-01

    Full Text Available The study of fixed expressions addresses a significant demand of L2/SL teaching: lexicon acquisition. The binomial language/culture has gained a clear shape in the context of L2/SL teaching, requiring the teacher to reflect constantly on the topics that outline the linguistic and cultural identity of the target language. Based on the assumption that time and culture are inseparable, the PFL/PL2 (Portuguese as a Foreign/Second Language) teacher has to be prepared to address the Brazilian concept of time. Relying on the notion of time language (HALL, 1996), this study seeks to identify aspects of the Brazilian temporal language. As fixed expressions are of different types, it is often difficult to maintain theoretical boundaries among them. Considering Silva's definition of proverb (1999:14-15), some Brazilian popular sayings concerning time were selected. It was found that popular wisdom has at least two types of expressions that refer to time: those related to the concept of clock time and those linked to the concept of event time (LEVINE, 1997). It is intended, therefore, that this study may assist the PFL/PL2 teacher in addressing this issue.

  8. Time-lapse controlled-source electromagnetics using interferometry

    NARCIS (Netherlands)

    Hunziker, J.W.; Slob, E.C.; Wapenaar, C.P.A.

    In time-lapse controlled-source electromagnetics, it is crucial that the source and the receivers are positioned at exactly the same location at all times of measurement. We use interferometry by multidimensional deconvolution (MDD) to overcome problems in repeatability of the source location.

  9. Collective Odor Source Estimation and Search in Time-Variant Airflow Environments Using Mobile Robots

    Science.gov (United States)

    Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming

    2011-01-01

    This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots’ search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of odor source is estimated via Bayesian rules and fuzzy inference based on a single robot’s detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection–diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method. PMID:22346650
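
    The swarm-coordination step can be sketched as a minimal particle swarm optimization. The Gaussian stand-in for the estimated probability map, the inertia and acceleration constants, and the swarm size are assumptions of this sketch, not the paper's tuning:

```python
import numpy as np

rng = np.random.default_rng(1)
source = np.array([2.0, -1.0])  # hypothetical odor-source position

def fitness(p):
    # stand-in for the estimated odor-source probability map:
    # larger values closer to the (unknown) source position
    return np.exp(-np.sum((p - source) ** 2))

n_robots, w, c1, c2 = 12, 0.6, 1.5, 1.5
x = rng.uniform(-5.0, 5.0, (n_robots, 2))  # robot positions
v = np.zeros((n_robots, 2))                # velocities
pbest = x.copy()                           # personal bests
pbest_val = np.array([fitness(p) for p in x])
gbest = pbest[np.argmax(pbest_val)].copy() # global best

for _ in range(200):
    r1, r2 = rng.random((n_robots, 1)), rng.random((n_robots, 1))
    # standard PSO velocity update: inertia + cognitive + social terms
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = np.array([fitness(p) for p in x])
    improved = val > pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmax(pbest_val)].copy()
```

    In the paper the fitness landscape is itself re-estimated from ongoing detections, so estimation and search iterate; here the map is frozen to keep the sketch short.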

  10. Time-stretch microscopy based on time-wavelength sequence reconstruction from wideband incoherent source

    International Nuclear Information System (INIS)

    Zhang, Chi; Xu, Yiqing; Wei, Xiaoming; Tsia, Kevin K.; Wong, Kenneth K. Y.

    2014-01-01

    Time-stretch microscopy has emerged as an ultrafast optical imaging concept offering an unprecedented combination of imaging speed and sensitivity. However, a dedicated wideband and coherent optical pulse source with high shot-to-shot stability has been mandatory for time-wavelength mapping, the enabling process for ultrahigh-speed wavelength-encoded image retrieval. From a practical point of view, exploring methods to relax the stringent requirements (e.g., temporal stability and coherence) on the source for time-stretch microscopy is thus of great value. In this paper, we demonstrate time-stretch microscopy by reconstructing the time-wavelength mapping sequence from a wideband incoherent source. Utilizing the time-lens focusing mechanism mediated by a narrow-band pulse source, this approach allows generation of a wideband incoherent source with the spectral efficiency enhanced by a factor of 18. As a proof-of-principle demonstration, time-stretch imaging with a scan rate as high as MHz and diffraction-limited resolution is achieved based on the wideband incoherent source. We note that the concept of time-wavelength sequence reconstruction from a wideband incoherent source can also be generalized to any high-speed optical real-time measurement where wavelength acts as the information carrier.

  11. Fission-neutrons source with fast neutron-emission timing

    Energy Technology Data Exchange (ETDEWEB)

    Rusev, G., E-mail: rusev@lanl.gov; Baramsai, B.; Bond, E.M.; Jandel, M.

    2016-05-01

    A neutron source with fast timing has been built to help with detector-response measurements. The source is based on the neutron emission from the spontaneous fission of ²⁵²Cf. The timing is provided by registering the fission fragments in a thin scintillation film with a signal rise time of 1 ns. The scintillation light output is measured by two silicon photomultipliers with a rise time of 0.5 ns. The overall time resolution of the source is 0.3 ns. The design of the source and test measurements using it are described. An example application of the source for determining the neutron/gamma pulse-shape discrimination of a stilbene crystal is given.

  12. Comparing the health impacts of different sources of energy. Keynote address

    International Nuclear Information System (INIS)

    Hamilton, L.D.

    1981-01-01

    Assessing the health impacts of different energy sources requires synthesis of research results from many different disciplines into a rational framework. Information is often scanty; qualitatively different risks, or energy systems with substantially different end uses, must be put on a common footing. Historically, institutional constraints have inhibited agencies from making the incisive comparisons necessary for formulating energy policy; this has exacerbated public controversy over appropriate energy sources. The risk assessment methods reviewed include examples drawn from the work of the Biomedical and Environmental Assessment Division at Brookhaven National Laboratory and elsewhere. Uncertainty over the mechanism and size of air pollution health damage is addressed through a probabilistic health-damage function, using sulfate-particle exposure as an indicator. This facilitates intercomparison through analysis of each step in the whole fuel cycle between a typical coal and nuclear power plant. Occupational health impacts, a significant fraction of overall damage, are illustrated by accident trends in coal mining. In broadening comparisons to include new technologies, one must include the impact of manufacturing the energy-producing device as part of an expanded fuel cycle, via input/output methods. Throughout the analysis, uncertainties must be made explicit in the results, including uncertainty of data and uncertainty in the choice of appropriate models and methods. No single method of comparative risk assessment is fully satisfactory; each has its limitations. One needs to compare several methods if decision-making is to be realistic.

  13. Comparison of source moment tensor recovered by diffraction stacking migration and source time reversal imaging

    Science.gov (United States)

    Zhang, Q.; Zhang, W.

    2017-12-01

    Diffraction stacking migration is an automatic location method widely used in microseismic monitoring of hydraulic fracturing. It stacks thousands of waveforms to enhance the signal-to-noise ratio of weak events. For surface monitoring, the diffraction stacking method suffers from polarity reversals among receivers due to the radiation pattern of the moment source. Joint determination of location and source mechanism has been proposed to overcome the polarity problem, but it significantly increases the computational cost. As an effective method to recover the source moment tensor, time reversal imaging based on the wave equation can locate a microseismic event by using interferometry on the image to extract the source position. However, time reversal imaging is very time consuming compared to diffraction stacking location because of the wave-equation simulation. In this study, we compare the image from diffraction stacking with that from time reversal imaging to check whether diffraction stacking can obtain a moment tensor similar to time reversal imaging. We found that the image produced by taking the largest imaging value at each point along the time axis does not exhibit the radiation pattern, while, at the same level of computational efficiency, the image produced for each trial origin time can generate a radiation pattern similar to the time reversal imaging procedure. Thus it is potentially possible to locate the source position by the diffraction stacking method for general moment tensor sources.
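
    A minimal, scalar version of diffraction stacking (ignoring the polarity issue the abstract raises) can be sketched as follows; the constant-velocity medium, spike-like waveforms, and receiver geometry are invented for illustration:

```python
import numpy as np

v, dt, nt = 2000.0, 0.001, 1000            # assumed velocity (m/s), sampling
receivers = np.array([0.0, 300.0, 600.0, 900.0])  # surface positions (m)
src_x, src_z, t0 = 450.0, 500.0, 0.05      # hypothetical event (m, m, s)

def traveltime(x, z, rx):
    return np.hypot(x - rx, z) / v

# synthetic traces: a unit spike at origin time + traveltime per receiver
traces = np.zeros((len(receivers), nt))
for i, rx in enumerate(receivers):
    traces[i, int(round((t0 + traveltime(src_x, src_z, rx)) / dt))] = 1.0

# diffraction stacking: for each candidate location and trial origin
# time, sum the trace amplitudes along the predicted moveout curve
best, best_val = None, -1.0
for x in np.arange(0.0, 901.0, 50.0):
    for z in np.arange(100.0, 901.0, 50.0):
        shifts = np.array([int(round(traveltime(x, z, r) / dt))
                           for r in receivers])
        for it0 in range(nt - shifts.max()):
            s = traces[np.arange(len(receivers)), it0 + shifts].sum()
            if s > best_val:
                best_val, best = s, (x, z, it0 * dt)
```

    The stack is maximal only where the predicted moveout aligns all spikes, which recovers the true location and origin time; with real moment-tensor sources the summed amplitudes carry signs, which is exactly the polarity problem discussed above.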

  14. Evaluation of risk impact of changes to Completion Times addressing model and parameter uncertainties

    International Nuclear Information System (INIS)

    Martorell, S.; Martón, I.; Villamizar, M.; Sánchez, A.I.; Carlos, S.

    2014-01-01

    This paper presents an approach, and an example of its application, for evaluating the risk impact of changes to Completion Times within the License Basis of a Nuclear Power Plant, based on the use of Probabilistic Risk Assessment and addressing the identification, treatment and analysis of uncertainties in an integrated manner. It allows full development of a three-tiered approach (Tiers 1–3) following the principles of risk-informed decision-making accounting for uncertainties, as proposed by many regulators. The Completion Time is the maximum outage time a safety-related equipment is allowed to be down, e.g. for corrective maintenance, which is established within the Limiting Conditions for Operation included in the Technical Specifications for operation of a Nuclear Power Plant. The case study focuses on a Completion Time change of the Accumulators System of a Nuclear Power Plant using a level 1 PRA, and considers several sources of model and parameter uncertainty. The results obtained show that the risk impact of the proposed CT change, including both types of epistemic uncertainty, is small compared with the current safety goals of concern to Tier 1. Concerning Tiers 2 and 3, the results show how the use of some traditional and uncertainty importance measures helps in identifying high-risk configurations that should be avoided in NPP technical specifications no matter the duration of the CT (Tier 2), and other configurations that could become part of a configuration risk management program (Tier 3). - Highlights: • New approach for evaluating the risk impact of changes to Completion Times. • Integrated treatment and analysis of model and parameter uncertainties. • PSA-based application to support risk-informed decision-making. • Measures of importance for identification of risky configurations. • Management of important safety issues to accomplish safety goals.

  15. Time-dependent source model of the Lusi mud volcano

    Science.gov (United States)

    Shirzaei, M.; Rudolph, M. L.; Manga, M.

    2014-12-01

    The Lusi mud eruption, near Sidoarjo, East Java, Indonesia, began in May 2006 and continues to erupt today. Previous analyses of surface deformation data suggested an exponential decay of the pressure in the mud source, but did not constrain the geometry and evolution of the source(s) from which the erupting mud and fluids ascend. To understand the spatiotemporal evolution of the mud and fluid sources, we apply a time-dependent inversion scheme to a densely populated InSAR time series of the surface deformation at Lusi. The SAR data set includes 50 images acquired on 3 overlapping tracks of the ALOS L-band satellite between May 2006 and April 2011. Following multitemporal analysis of this data set, the obtained surface deformation time series is inverted in a time-dependent framework to solve for the volume changes of distributed point sources in the subsurface. The volume change distribution resulting from this modeling scheme shows two zones of high volume change underneath Lusi at 0.5-1.5 km and 4-5.5 km depth, as well as another shallow zone 7 km to the west of Lusi, underneath the Wunut gas field. The cumulative volume change within the shallow source beneath Lusi is ~2-4 times larger than that of the deep source, whilst the ratio of the Lusi shallow source volume change to that of the Wunut gas field is ~1. This observation and model suggest that the Lusi shallow source played a key role in the eruption process and mud supply, but that additional fluids do ascend from depths >4 km on eruptive timescales.
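
    At each epoch, this kind of inversion reduces to a linear least-squares problem for the volume changes of fixed point sources. The sketch below assumes a Mogi-type half-space kernel and made-up geometry; it is not the actual Green's function setup used for Lusi:

```python
import numpy as np

nu = 0.25  # assumed Poisson's ratio

def mogi_uz(r, depth, dV):
    # vertical surface displacement of a point volume source (Mogi-type
    # kernel; treat the exact prefactor as an assumption of this sketch)
    R = np.sqrt(r**2 + depth**2)
    return (1 - nu) * dV * depth / (np.pi * R**3)

# two sources at fixed depths; solve for their volume changes from
# surface displacements by linear least squares, as a time-dependent
# inversion would do independently at each epoch
obs_x = np.linspace(-6000.0, 6000.0, 25)  # observation points (m)
depths = np.array([1000.0, 4500.0])       # shallow and deep source (m)
dV_true = np.array([2.0e6, 1.0e6])        # volume changes (m^3)

# design matrix: response of each observation to a unit volume change
G = np.column_stack([mogi_uz(np.abs(obs_x), d, 1.0) for d in depths])
uz = G @ dV_true                          # synthetic (noise-free) data
dV_hat, *_ = np.linalg.lstsq(G, uz, rcond=None)
```

    Because the source positions are fixed, only the design matrix columns differ between depths, and the epoch-by-epoch volume-change history follows from repeating the solve over the deformation time series.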

  16. Separation of non-stationary multi-source sound field based on the interpolated time-domain equivalent source method

    Science.gov (United States)

    Bi, Chuan-Xing; Geng, Lin; Zhang, Xiao-Zheng

    2016-05-01

    In the sound field with multiple non-stationary sources, the measured pressure is the sum of the pressures generated by all sources, and thus cannot be used directly for studying the vibration and sound radiation characteristics of every source alone. This paper proposes a separation model based on the interpolated time-domain equivalent source method (ITDESM) to separate the pressure field belonging to every source from the non-stationary multi-source sound field. In the proposed method, ITDESM is first extended to establish the relationship between the mixed time-dependent pressure and all the equivalent sources distributed on every source with known location and geometry information, and all the equivalent source strengths at each time step are solved by an iterative solving process; then, the corresponding equivalent source strengths of one interested source are used to calculate the pressure field generated by that source alone. Numerical simulation of two baffled circular pistons demonstrates that the proposed method can be effective in separating the non-stationary pressure generated by every source alone in both time and space domains. An experiment with two speakers in a semi-anechoic chamber further evidences the effectiveness of the proposed method.

  17. Blind source separation problem in GPS time series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2016-04-01

    A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered data-driven methods. A widely used technique is principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while maintaining most of the variance of the dataset explained. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ²), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decompositions to recover the original sources.
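
    The PCA step can be sketched with a synthetic two-source example; the seasonal and step-like source signals, the station count, and the noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 4.0, 400)               # time in years (synthetic)

# two synthetic deformation sources mixed into 10 "station" time series
seasonal = np.sin(2 * np.pi * t)             # annual signal
transient = np.tanh((t - 2.0) / 0.2)         # step-like co/post-seismic signal
S = np.vstack([seasonal, transient])         # sources (2 x nt)
A = rng.standard_normal((10, 2))             # mixing matrix (station responses)
X = A @ S + 0.01 * rng.standard_normal((10, t.size))

# PCA via SVD of the centered data: the leading components capture most
# of the variance, but they are only uncorrelated, not independent, so
# they generally mix the underlying sources
Xc = X - X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var_explained = s**2 / np.sum(s**2)
```

    Two components explain essentially all the variance here, yet each principal component is in general a rotation of both sources; that residual mixing is what ICA-type methods such as vbICA are designed to undo.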

  18. 3D Multi‐source Least‐squares Reverse Time Migration

    KAUST Repository

    Dai, Wei; Boonyasiriwat, Chaiwoot; Schuster, Gerard T.

    2010-01-01

    Three types of encoding functions are used: random time shift, random source polarity, and random source location selected from a pre‐designed table. Numerical tests for the 3D SEG/EAGE Overthrust model show that multi‐source LSRTM can suppress migration artifacts in the migration image and remove most of the crosstalk noise from multi‐source data.

  19. A GIS-based time-dependent seismic source modeling of Northern Iran

    Science.gov (United States)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear or fault sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 kilometers around Tehran. Previous research and reports were studied to compile an earthquake/fault catalog that is as complete as possible. All events were transformed to a uniform magnitude scale, and duplicate events and dependent shocks were removed. The completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
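
    For area sources, the time-independent frequency-magnitude relationship is the Gutenberg-Richter law, log10 N(>=M) = a - bM. A standard way to estimate the b-value from a complete catalog is Aki's maximum-likelihood formula, sketched here on synthetic magnitudes (the completeness magnitude and b-value are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
Mc, b_true = 2.5, 1.0  # assumed completeness magnitude and b-value

# Gutenberg-Richter implies magnitudes above Mc are exponentially
# distributed with rate b * ln(10); draw a synthetic catalog
M = Mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=200_000)

# Aki (1965) maximum-likelihood estimator of the b-value
b_hat = np.log10(np.e) / (M.mean() - Mc)
```

    With a declustered, complete catalog this estimator converges quickly; in practice the catalog preparation steps mentioned above (uniform magnitude scale, removal of duplicates and dependent shocks, completeness analysis) are what make it valid.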

  20. 3D Multi‐source Least‐squares Reverse Time Migration

    KAUST Repository

    Dai, Wei

    2010-10-17

    We present the theory and numerical results for least‐squares reverse time migration (LSRTM) of phase‐encoded supergathers, where each supergather is the superposition of phase‐encoded shots. Three types of encoding functions are used in this study: random time shift, random source polarity, and random source location selected from a pre‐designed table. Numerical tests for the 3D SEG/EAGE Overthrust model show that multi‐source LSRTM can suppress migration artifacts in the migration image and remove most of the crosstalk noise from multi‐source data. Empirical results suggest that multi‐source LSRTM can provide a noticeable increase in computational efficiency compared to standard RTM when the CSGs in a supergather are modeled and migrated together with a finite‐difference simulator. If the phase‐encoding functions are dynamically changed after each iteration of LSRTM, the best images are obtained. The potential drawback is that the final results are very sensitive to the accuracy of the starting model.
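
    The effect of the phase encoding can be illustrated with a toy blending experiment; the random shot records, shift range, and shot count below are arbitrary stand-ins for real common shot gathers:

```python
import numpy as np

rng = np.random.default_rng(4)
n_shots, nt, max_shift = 8, 512, 64
shots = rng.standard_normal((n_shots, nt))  # stand-in shot records

# phase encoding: a random time shift and a random polarity per shot,
# then superposition of all encoded shots into one supergather
shifts = rng.integers(0, max_shift, n_shots)
polarity = rng.choice([-1.0, 1.0], n_shots)
super_gather = np.zeros(nt + max_shift)
for i in range(n_shots):
    super_gather[shifts[i]:shifts[i] + nt] += polarity[i] * shots[i]

# decoding shot k undoes its own polarity and shift; the other shots
# remain as incoherent crosstalk rather than coherent signal
k = 3
decoded = polarity[k] * super_gather[shifts[k]:shifts[k] + nt]
crosstalk = decoded - shots[k]
```

    The decoded trace still correlates with the original shot, but it carries strong crosstalk from the other encoded shots; iterating LSRTM with re-drawn encoding functions is what averages this crosstalk down in the abstract's workflow.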

  1. Addressing the long time horizon for managing used nuclear fuel

    International Nuclear Information System (INIS)

    Hodge, R.A.

    2006-01-01

    The time horizon that must be considered in developing an approach to managing used nuclear fuel extends many thousands of years. Such a time horizon is without precedent in environmental, economic, social, technical and public policy terms. As a first step in addressing this issue, the Nuclear Waste Management Organization convened a team of 33 individuals to undertake a formal scenarios exercise. Such an exercise is a way of framing potential futures that might occur; there is no intent to predict the future. This exercise represents the first time that the scenarios technique has been used for such a long time horizon. The approach involved identifying two principal axes of potential change: (1) social-political-environmental well-being; and (2) magnitude of the used nuclear fuel challenge. Using this organizing template, four scenarios were developed reaching out 25 years, and an additional twelve were developed at 175 years, branching out from the original four. In addition, a series of sixteen possible 'end-points' was identified to span conditions 500 years out, and for 10,000 years a large number of 'what-ifs' were developed. The scenarios, end-points, and what-ifs were then used to identify a number of criteria that could be used for testing proposed management options and their capacity to deal with future conditions. This paper describes this work and the role that it has played in the deliberations of the Nuclear Waste Management Organization. (author)

  2. Uncertainty in Earthquake Source Imaging Due to Variations in Source Time Function and Earth Structure

    KAUST Repository

    Razafindrakoto, H. N. T.; Mai, Paul Martin

    2014-01-01

    One way to improve the accuracy and reliability of kinematic earthquake source imaging is to investigate the origin of uncertainty and to minimize their effects. The difficulties in kinematic source inversion arise from the nonlinearity of the problem, nonunique choices in the parameterization, and observational errors. We analyze particularly the uncertainty related to the choice of the source time function (STF) and the variability in Earth structure. We consider a synthetic data set generated from a spontaneous dynamic rupture calculation. Using Bayesian inference, we map the solution space of peak slip rate, rupture time, and rise time to characterize the kinematic rupture in terms of posterior density functions. Our test to investigate the effect of the choice of STF reveals that all three tested STFs (isosceles triangle, regularized Yoffe with acceleration time of 0.1 and 0.3 s) retrieve the patch of high slip and slip rate around the hypocenter. However, the use of an isosceles triangle as STF artificially accelerates the rupture to propagate faster than the target solution. It additionally generates an artificial linear correlation between rupture onset time and rise time. These appear to compensate for the dynamic source effects that are not included in the symmetric triangular STF. The exact rise time for the tested STFs is difficult to resolve due to the small amount of radiated seismic moment in the tail of STF. To highlight the effect of Earth structure variability, we perform inversions including the uncertainty in the wavespeed only, and variability in both wavespeed and layer depth. We find that little difference is noticeable between the resulting rupture model uncertainties from these two parameterizations. Both significantly broaden the posterior densities and cause faster rupture propagation particularly near the hypocenter due to the major velocity change at the depth where the fault is located.

  4. Three-dimensional localization of low activity gamma-ray sources in real-time scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Manish K., E-mail: mksrkf@mst.edu; Alajo, Ayodeji B.; Lee, Hyoung K.

    2016-03-21

    Radioactive source localization plays an important role in tracking radiation threats in homeland security tasks. Its real-time application requires computationally efficient and reasonably accurate algorithms, even with limited data, to support detection with minimum uncertainty. This paper describes a statistic-based grid-refinement method for backtracing the position of a gamma-ray source in a three-dimensional domain in real time. The developed algorithm used measurements from various known detector positions to localize the source. This algorithm is based on an inverse-square relationship between source intensity at a detector and the distance from the source to the detector. The domain discretization was developed and implemented in MATLAB. The algorithm was tested and verified from simulation results of an ideal case of a point source in a non-attenuating medium. Subsequently, an experimental validation of the algorithm was performed to determine the suitability of deploying this scheme in real-time scenarios. Using the measurements from five known detector positions and a measurement time of 3 min, the source position was estimated with an accuracy of approximately 53 cm. The accuracy improved and stabilized to approximately 25 cm at longer measurement times. It was concluded that the error in source localization was primarily due to detection uncertainties. In the verification and experimental validation of the algorithm, the distance between the ¹³⁷Cs source and any detector position was between 0.84 m and 1.77 m. The results were also compared with the least squares method. Since the discretization algorithm was validated with a weak source, it is expected that it can localize sources of higher activity in real time. It is believed that for the same physical placement of source and detectors, a source of approximate activity 0.61–0.92 mCi can be localized in real time with 1 s of measurement time and the same accuracy. The accuracy and computational
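
    A minimal sketch of the inverse-square, coarse-to-fine grid search (illustrative geometry, noiseless synthetic rates, and a simple refinement schedule of my own choosing; the paper's MATLAB implementation and statistical treatment are not reproduced here):

```python
import numpy as np

# Known detector positions (m) and simulated count rates from a point
# source in a non-attenuating medium: rate ∝ 1 / distance².
detectors = np.array([[0, 0, 0], [2, 0, 0], [0, 2, 0], [2, 2, 0], [1, 1, 1.5]],
                     dtype=float)
true_src = np.array([0.9, 1.3, 0.4])
rates = 5.0e4 / np.sum((detectors - true_src) ** 2, axis=1)

def misfit(p):
    # Guard against a grid point coinciding with a detector.
    d2 = np.maximum(np.sum((detectors - p) ** 2, axis=1), 1e-9)
    # Best-fit source strength for this trial position (linear least squares).
    s = np.sum(rates / d2) / np.sum(1.0 / d2 ** 2)
    return np.sum((rates - s / d2) ** 2)

# Coarse grid over the domain, refined around the best cell each pass.
center, half = np.array([1.0, 1.0, 1.0]), 1.0
for _ in range(10):
    axes = [np.linspace(c - half, c + half, 9) for c in center]
    grid = np.array(np.meshgrid(*axes)).reshape(3, -1).T
    center = grid[np.argmin([misfit(p) for p in grid])]
    half /= 2.0

print(np.round(center, 2))
```

Each pass halves the search box around the current best grid cell, so the cost stays fixed per iteration while the resolution improves geometrically, which is what makes this kind of scheme attractive for real-time use.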

  5. Solution to the monoenergetic time-dependent neutron transport equation with a time-varying source

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1986-01-01

    Even though fundamental time-dependent neutron transport problems have existed since the inception of neutron transport theory, it has only been recently that a reliable numerical solution to one of the basic problems has been obtained. Experience in generating numerical solutions to time-dependent transport equations has indicated that the multiple collision formulation is the most versatile numerical technique for model problems. The formulation coupled with a moment reconstruction of each collided flux component has led to benchmark-quality (four- to five-digit accuracy) numerical evaluation of the neutron flux in plane infinite geometry for any degree of scattering anisotropy and for both pulsed isotropic and beam sources. As will be shown in this presentation, this solution can serve as a Green's function, thus extending the previous results to more complicated source situations. Here we will be concerned with a time-varying source at the center of an infinite medium. If accurate, such solutions have both pedagogical and practical uses as benchmarks against which other more approximate solutions designed for a wider class of problems can be compared

  6. Photodetection-induced relative timing jitter in synchronized time-lens source for coherent Raman scattering microscopy

    Directory of Open Access Journals (Sweden)

    Jiaqi Wang

    2017-09-01

    The synchronized time-lens source is a novel means of generating optical pulses synchronized to mode-locked lasers, and it has found widespread application in coherent Raman scattering microscopy. Relative timing jitter between the mode-locked laser and the synchronized time-lens source is a key parameter for evaluating the synchronization performance of such laser systems. However, the origins of the relative timing jitter in such systems are not fully determined, which in turn hampers experimental efforts to optimize the synchronization performance. Here, we demonstrate, through theoretical modeling and numerical simulation, that photodetection can be one physical origin of the relative timing jitter. A comparison with the relative timing jitter due to the intrinsic timing jitter of the mode-locked laser is also presented, revealing different qualitative and quantitative behaviors. Based on the nature of this photodetection-induced timing jitter, we further propose several strategies to reduce the relative timing jitter. Our theoretical results provide guidelines for optimizing synchronization performance in experiments.

  7. Time-correlated neutron analysis of a multiplying HEU source

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.C., E-mail: Eric.Miller@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Kalter, J.M.; Lavelle, C.M. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Watson, S.M.; Kinlaw, M.T.; Chichester, D.L. [Idaho National Laboratory, Idaho Falls, ID (United States); Noonan, W.A. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States)

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated ³He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.
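
    The gap-based grouping of detection times into fission-chain bursts can be sketched as follows (synthetic arrival times and an arbitrary 1 µs gap threshold; the actual detector data, dead-time handling, and thresholds are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic detection times (s): three fission-chain bursts plus background.
burst_starts = [0.010, 0.020, 0.035]
times = np.sort(np.concatenate(
    [b + 1e-7 * rng.exponential(size=20) for b in burst_starts]  # ~100 ns clusters
    + [rng.uniform(0.0, 0.05, size=5)]                           # uncorrelated counts
))

def split_bursts(t, max_gap):
    """Group detection times into bursts separated by gaps > max_gap."""
    edges = np.where(np.diff(t) > max_gap)[0] + 1
    return np.split(t, edges)

groups = split_bursts(times, max_gap=1e-6)
bursts = [g for g in groups if g.size >= 5]   # keep multiplets only
print(len(bursts))
```

Once events are grouped this way, per-burst statistics such as the distribution of arrival times within a burst (the quantity studied above) follow directly from each group.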

  8. Time-correlated neutron analysis of a multiplying HEU source

    Science.gov (United States)

    Miller, E. C.; Kalter, J. M.; Lavelle, C. M.; Watson, S. M.; Kinlaw, M. T.; Chichester, D. L.; Noonan, W. A.

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated ³He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.

  9. Time-correlated neutron analysis of a multiplying HEU source

    International Nuclear Information System (INIS)

    Miller, E.C.; Kalter, J.M.; Lavelle, C.M.; Watson, S.M.; Kinlaw, M.T.; Chichester, D.L.; Noonan, W.A.

    2015-01-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated ³He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.

  10. ONE TIME PASSWORDLESS and IP ADDRESS AUTHENTICATION METHOD for WEB APPLICATION

    Directory of Open Access Journals (Sweden)

    Mohammad Sani Suprayogi

    2015-08-01

    Research on user authentication models, from traditional username-and-password authentication to multi-factor authentication methods, has been carried out frequently. However, these authentication models still rely on passwords, and because people have a limited capacity to remember them, lost passwords are a common risk. In addition, data theft on computer networks remains widespread, so another approach to authenticating users to a system is needed. Passwordless authentication is a model that has recently been introduced, but its implementations are still limited. This paper seeks to improve the passwordless method by adding a time limit, a session, and the IP address to user authentication. As a result, users do not need to create and remember passwords: they simply use an email service for registration and login, and the IP address guarantees that only that user can access the website service.

  11. Automatic classification of time-variable X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M. [Sydney Institute for Astronomy, School of Physics, The University of Sydney, Sydney, NSW 2006 (Australia)

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
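
    The Random Forest pipeline described above (training, 10-fold cross-validation, probabilistic output) can be sketched with scikit-learn on mock data; the feature table, labels, and numbers below are invented for illustration, not the 2XMMi-DR2 catalog.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Mock feature table: rows = variable sources, columns = time-series,
# spectral, and multi-wavelength features; labels = source classes.
n, n_classes = 300, 3
X = rng.normal(size=(n, 6))
y = rng.integers(0, n_classes, size=n)
X[:, 0] += 2.0 * y                 # make one feature informative

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation

clf.fit(X, y)
proba = clf.predict_proba(X[:5])   # per-class probabilities → probabilistic catalog
print(round(scores.mean(), 2), proba.shape)
```

The `predict_proba` output is what makes a probabilistically classified catalog possible: each source carries class membership probabilities rather than a single hard label.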

  12. Automatic classification of time-variable X-ray sources

    International Nuclear Information System (INIS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.

  13. Pulsar timing residuals due to individual non-evolving gravitational wave sources

    International Nuclear Information System (INIS)

    Tong Ming-Lei; Zhao Cheng-Shi; Yan Bao-Rong; Yang Ting-Gao; Gao Yu-Ping

    2014-01-01

    The pulsar timing residuals induced by gravitational waves from non-evolving single binary sources are affected by many parameters related to the relative positions of the pulsar and the gravitational wave sources. We will analyze the various effects due to the different parameters. The standard deviations of the timing residuals will be calculated by varying one parameter while fixing the others. The orbits of the binary sources will generally be assumed to be elliptical. The influence of different eccentricities on the pulsar timing residuals will also be studied in detail. We find that the effects of the related parameters are quite different, and some of them display certain regularities.

  14. Time-resolved X-ray studies using third generation synchrotron radiation sources

    International Nuclear Information System (INIS)

    Mills, D.M.

    1991-10-01

    The third generation, high-brilliance, hard x-ray, synchrotron radiation (SR) sources currently under construction (ESRF at Grenoble, France; APS at Argonne, Illinois; and SPring-8 at Harima, Japan) will usher in a new era of x-ray experimentation for both physical and biological sciences. One of the most exciting areas of experimentation will be the extension of x-ray scattering and diffraction techniques to the study of transient or time-evolving systems. The high repetition rate, short-pulse duration, high brilliance, and variable spectral bandwidth of these sources make them ideal for x-ray time-resolved studies. The temporal properties (bunch length, interpulse period, etc.) of these new sources will be summarized. Finally, the scientific potential and the technological challenges of time-resolved x-ray scattering from these new sources will be described. 13 refs., 4 figs

  15. Time-dependent anisotropic external sources in transient 3-D transport code TORT-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    This paper describes the implementation of a time-dependent distributed external source in TORT-TD by explicitly considering the external source in the "fixed-source" term of the implicitly time-discretised 3-D discrete ordinates transport equation. Anisotropy of the external source is represented by a spherical harmonics series expansion similar to that of the angular fluxes. The YALINA-Thermal subcritical assembly serves as a test case. The configuration with 280 fuel rods has been analysed with TORT-TD using cross sections in 18 energy groups and P1 scattering order generated by the KAPROS code system. Good agreement is achieved concerning the multiplication factor. The response of the system to an artificial time-dependent source consisting of two square-wave pulses demonstrates the time-dependent external source capability of TORT-TD. The result is physically plausible as judged from validation calculations. (orig.)

  16. Towards an accurate real-time locator of infrasonic sources

    Science.gov (United States)

    Pinsky, V.; Blom, P.; Polozov, A.; Marcillo, O.; Arrowsmith, S.; Hofstetter, A.

    2017-11-01

    Infrasonic signals propagate from an atmospheric source via media with stochastic and fast space-varying conditions. Hence, their travel time, the amplitude at sensor recordings, and even their manifestation in the so-called "shadow zones" are random. Therefore, the traditional least-squares technique for locating infrasonic sources is often not effective, and the problem of finding the best solution must be formulated in probabilistic terms. Recently, a series of papers has been published on the Bayesian Infrasonic Source Localization (BISL) method, based on the computation of the posterior probability density function (PPDF) of the source location as a convolution of the a priori probability distribution function (APDF) of the propagation model parameters with the likelihood function (LF) of the observations. The present study is devoted to the further development of BISL for higher accuracy and stability of the source location results and a lower computational load. We critically analyse previous algorithms and propose several new ones. First of all, we describe the general PPDF formulation and demonstrate that this relatively slow algorithm may be among the most accurate, provided that an adequate APDF and LF are used. Then, we suggest using summation instead of integration in the general PPDF calculation for increased robustness, which leads us to a 3D space-time optimization problem. Two different forms of APDF approximation are considered and applied to the PPDF calculation in our study. One of them, previously suggested but not yet properly used, is the so-called "celerity-range histograms" (CRHs). The other is the outcome of previous findings of a linear mean travel time for the four first infrasonic phases in overlapping consecutive distance ranges. This stochastic model is extended here to a regional distance of 1000 km, and the APDF introduced is the probabilistic form of the junction between this travel time model and range-dependent probability
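
    A highly simplified numerical sketch of the PPDF idea: a Gaussian likelihood of relative arrival times under a stochastic celerity model, evaluated on a source-location grid. The station geometry, celerity statistics, and origin-time treatment below are invented for illustration and are much cruder than the paper's CRH or travel-time-model APDFs.

```python
import numpy as np

# Station positions (km) and observed infrasound arrival times (s).
stations = np.array([[0.0, 0.0], [300.0, 0.0], [0.0, 300.0], [200.0, 250.0]])
true_src = np.array([120.0, 80.0])
celerity_mean, celerity_sd = 0.30, 0.02   # km/s, stochastic propagation model
obs = np.linalg.norm(stations - true_src, axis=1) / celerity_mean

# Posterior density over a source-location grid, removing the unknown origin
# time by referencing arrivals to the first station (a common simplification).
xs = ys = np.linspace(0.0, 300.0, 151)
gx, gy = np.meshgrid(xs, ys)
post = np.zeros_like(gx)
for i in range(gx.shape[0]):
    for j in range(gx.shape[1]):
        d = np.hypot(stations[:, 0] - gx[i, j], stations[:, 1] - gy[i, j])
        pred = d / celerity_mean
        res = (obs - obs[0]) - (pred - pred[0])       # relative travel times
        sd = pred * (celerity_sd / celerity_mean)     # travel-time spread
        sd = np.sqrt(sd**2 + sd[0]**2)                # differencing inflates noise
        post[i, j] = np.exp(-0.5 * np.sum((res / np.maximum(sd, 1e-6))**2))

best = np.unravel_index(np.argmax(post), post.shape)
print(xs[best[1]], ys[best[0]])
```

Replacing the integral over model parameters by a sum over grid cells, as the study proposes, turns this map evaluation into the 3D space-time optimization problem discussed above.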

  17. The Electromagnetic Field of Elementary Time-Dependent Toroidal Sources

    International Nuclear Information System (INIS)

    Afanas'ev, G.N.; Stepanovskij, Yu.P.

    1994-01-01

    The radiation field of toroidal-like time-dependent current configurations is investigated. Time-dependent charge-current sources are found outside of which the electromagnetic field strengths vanish but the potentials survive. This can be used to carry out time-dependent Aharonov-Bohm-like experiments and for information transfer. Using the Neumann-Helmholtz parametrization of the current density, we present the time-dependent electromagnetic field in a form convenient for applications. 17 refs

  18. Time-resolved materials science opportunities using synchrotron x-ray sources

    International Nuclear Information System (INIS)

    Larson, B.C.; Tischler, J.Z.

    1995-06-01

    The high brightness, high intensity, and pulsed time-structure of synchrotron sources provide new opportunities for time-resolved x-ray diffraction investigations. With third generation synchrotron sources coming on line, high brilliance and high brightness are now available in x-ray beams with the highest flux. In addition to the high average flux, the instantaneous flux available in synchrotron beams is greatly enhanced by the pulsed time structure, which consists of short bursts of x-rays that are separated by ∼tens to hundreds of nanoseconds. Time-resolved one- and two-dimensional position sensitive detection techniques that take advantage of synchrotron radiation for materials science x-ray diffraction investigations are presented, and time resolved materials science applications are discussed in terms of recent diffraction and spectroscopy results and materials research opportunities

  19. Effects of the airwave in time-domain marine controlled-source electromagnetics

    NARCIS (Netherlands)

    Hunziker, J.W.; Slob, E.C.; Mulder, W.

    2011-01-01

    In marine time-domain controlled-source electromagnetics (CSEM), there are two different acquisition methods: with horizontal sources for fast and simple data acquisition or with vertical sources for minimizing the effects of the airwave. Illustrations of the electric field as a function of space

  20. Blind Separation of Nonstationary Sources Based on Spatial Time-Frequency Distributions

    Directory of Open Access Journals (Sweden)

    Zhang Yimin

    2006-01-01

    Full Text Available Blind source separation (BSS based on spatial time-frequency distributions (STFDs provides improved performance over blind source separation methods based on second-order statistics, when dealing with signals that are localized in the time-frequency (t-f domain. In this paper, we propose the use of STFD matrices for both whitening and recovery of the mixing matrix, which are two stages commonly required in many BSS methods, to provide robust BSS performance to noise. In addition, a simple method is proposed to select the auto- and cross-term regions of time-frequency distribution (TFD. To further improve the BSS performance, t-f grouping techniques are introduced to reduce the number of signals under consideration, and to allow the receiver array to separate more sources than the number of array sensors, provided that the sources have disjoint t-f signatures. With the use of one or more techniques proposed in this paper, improved performance of blind separation of nonstationary signals can be achieved.

  1. Optimization of NANOGrav's time allocation for maximum sensitivity to single sources

    International Nuclear Information System (INIS)

    Christy, Brian; Anella, Ryan; Lommen, Andrea; Camuccio, Richard; Handzo, Emma; Finn, Lee Samuel

    2014-01-01

    Pulsar timing arrays (PTAs) are a collection of precisely timed millisecond pulsars (MSPs) that can search for gravitational waves (GWs) in the nanohertz frequency range by observing characteristic signatures in the timing residuals. The sensitivity of a PTA depends on the direction of the propagating GW source, the timing accuracy of the pulsars, and the allocation of the available observing time. The goal of this paper is to determine the optimal time allocation strategy among the MSPs in the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) for a single source of GW under a particular set of assumptions. We consider both an isotropic distribution of sources across the sky and a specific source in the Virgo cluster. This work improves on previous efforts by modeling the effect of intrinsic spin noise for each pulsar. We find that, in general, the array is optimized by maximizing time spent on the best-timed pulsars, with sensitivity improvements typically ranging from a factor of 1.5 to 4.

  2. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  3. About the Modeling of Radio Source Time Series as Linear Splines

    Science.gov (United States)

    Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald

    2016-12-01

    Many of the time series of radio sources observed in geodetic VLBI show variations, caused mainly by changes in source structure. However, until now it has been common practice to consider source positions as invariant, or to exclude known misbehaving sources from the datum conditions. This may lead to a degradation of the estimated parameters, as unmodeled apparent source position variations can propagate to the other parameters through the least squares adjustment. In this paper we will introduce an automated algorithm capable of parameterizing the radio source coordinates as linear splines.
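
    The parameterization of a source coordinate time series as a linear spline can be sketched as a hat-function (linear B-spline) least-squares fit; the coordinate offsets, noise level, and knot epochs below are mock values, and the paper's automated knot-placement algorithm is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Mock right-ascension offsets (µas) of one radio source over time (years).
t = np.linspace(2000.0, 2016.0, 200)
signal = np.interp(t, [2000, 2006, 2011, 2016], [0.0, 40.0, -10.0, 15.0])
obs = signal + 5.0 * rng.normal(size=t.size)

# Hat-function basis on chosen knot epochs, then linear least squares.
knots = np.array([2000.0, 2006.0, 2011.0, 2016.0])
B = np.zeros((t.size, knots.size))
for k in range(knots.size):
    e = np.zeros(knots.size)
    e[k] = 1.0
    B[:, k] = np.interp(t, knots, e)      # k-th hat function

coef, *_ = np.linalg.lstsq(B, obs, rcond=None)
print(np.round(coef))
```

The estimated coefficients are the spline values at the knots, so apparent position variations are absorbed by the spline instead of propagating into the other parameters of the adjustment.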

  4. The Advanced Photon Source injection timing system

    International Nuclear Information System (INIS)

    Lenkszus, F.R.; Laird, R.

    1995-01-01

    The Advanced Photon Source consists of five accelerators. The injection timing system provides the signals required to cause a bunch emitted from the electron gun to navigate through intermediate accelerators to a specific bucket (1 out of 1296) within the storage ring. Two linacs and a positron accumulator ring operate at 60 Hz, while a booster synchrotron ramps and injects into the storage ring at 2 Hz. The distributed, modular VME/VXI-based injection timing system is controlled by two EPICS-based input/output controllers (IOCs). Over 40 VME/VXI cards have been developed to implement the system. Card types range from 352 MHz VXI timing modules to VME-based fiber optic fanouts and logic translators/drivers. All timing is distributed with fiber optics. Timing references are derived directly from the machine low-level rf of 9.77 MHz and 352 MHz. The timing references provide triggers to programmable delay generators. Three grades of timing are provided: precision timing is derived from commercial digital delay generators, intermediate-precision timing is obtained from VXI 8-channel digital delay generators which provide timing with 25 ns peak-to-peak jitter, and modest-precision timing is provided by the APS event system. The timing system is fully integrated into the APS EPICS-based control system

  5. Timing jitter measurements at the SLC electron source

    International Nuclear Information System (INIS)

    Sodja, J.; Browne, M.J.; Clendenin, J.E.

    1989-03-01

    The SLC thermionic gun and electron source produce a beam of up to 15 × 10¹⁰ e⁻ in a single S-band bunch. A 170 keV, 2 ns FWHM pulse out of the gun is compressed by means of two subharmonic buncher cavities followed by an S-band buncher and a standard SLAC accelerating section. Ceramic gaps in the beam pipe at the output of the gun allow a measure of the beam intensity and timing. A measurement at these gaps of the timing jitter, with a resolution of <10 ps, is described. 3 refs., 5 figs

  6. Addressing the changing sources of health information in Iran

    Directory of Open Access Journals (Sweden)

    Amir Alishahi Tabriz

    2013-01-01

    Conclusion: Although radio and television remained the main sources of health information during the 8 years of the study, there is an increasing tendency to use the internet, especially among men. Policymakers should revise their broadcasting strategies based on public demand.

  7. Single sources in the low-frequency gravitational wave sky: properties and time to detection by pulsar timing arrays

    Science.gov (United States)

    Kelley, Luke Zoltan; Blecha, Laura; Hernquist, Lars; Sesana, Alberto; Taylor, Stephen R.

    2018-06-01

    We calculate the properties, occurrence rates and detection prospects of individually resolvable `single sources' in the low-frequency gravitational wave (GW) spectrum. Our simulations use the population of galaxies and massive black hole binaries from the Illustris cosmological hydrodynamic simulations, coupled to comprehensive semi-analytic models of the binary merger process. Using mock pulsar timing arrays (PTAs) with, for the first time, varying red-noise models, we calculate plausible detection prospects for GW single sources and the stochastic GW background (GWB). Contrary to previous results, we find that single sources are at least as detectable as the GW background. Using mock PTAs, we find that these `foreground' sources (also `deterministic'/`continuous') are likely to be detected with ˜20 yr total observing baselines. Detection prospects, and indeed the overall properties of single sources, are only moderately sensitive to binary evolution parameters - namely eccentricity and environmental coupling, which can lead to differences of ˜5 yr in times to detection. Red noise has a stronger effect, roughly doubling the time to detection of the foreground between a white-noise-only model (˜10-15 yr) and severe red noise (˜20-30 yr). The effect of red noise on the GWB is even stronger, suggesting that single-source detections may be more robust. We find that typical signal-to-noise ratios for the foreground peak near f = 0.1 yr⁻¹, and are much less sensitive to the continued addition of new pulsars to the PTA.

  8. Radiation Tolerant Low Power Precision Time Source, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The availability of small, low power atomic clocks is now a reality for ground-based and airborne navigation systems. Kernco's Low Power Precision Time Source...

  9. Sources of variability and systematic error in mouse timing behavior.

    Science.gov (United States)

    Gallistel, C R; King, Adam; McDonald, Robert

    2004-01-01

    In the peak procedure, starts and stops in responding bracket the target time at which food is expected. The variability in start and stop times is proportional to the target time (scalar variability), as is the systematic error in the mean center (scalar error). The authors investigated the source of the error and the variability, using head poking in the mouse, with target intervals of 5 s, 15 s, and 45 s, in the standard procedure, and in a variant with 3 different target intervals at 3 different locations in a single trial. The authors conclude that the systematic error is due to the asymmetric location of start and stop decision criteria, and the scalar variability derives primarily from sources other than memory.
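    The scalar property described above (decision-time spread and midpoint error both proportional to the target interval) can be illustrated with a toy simulation; the criterion placements and coefficient of variation below are invented for illustration, not fitted to the mouse data.

```python
import random

def simulate_peak_trial(target, cv=0.2, bias=0.05, rng=random):
    """Draw start/stop response times for one peak-procedure trial.

    Scalar property: the spread of each decision time is proportional to
    the target interval (constant coefficient of variation, cv), and the
    asymmetric placement of the start/stop criteria shifts the midpoint
    by a fixed fraction of the target. All parameter values here are
    invented for illustration, not fitted to the mouse data.
    """
    start = rng.gauss((0.70 + bias) * target, cv * target)
    stop = rng.gauss((1.30 + bias) * target, cv * target)
    return start, stop

random.seed(1)
for target in (5, 15, 45):
    mids = [sum(simulate_peak_trial(target)) / 2 for _ in range(10000)]
    mean_mid = sum(mids) / len(mids)
    sd = (sum((m - mean_mid) ** 2 for m in mids) / len(mids)) ** 0.5
    # Both the relative midpoint error and the relative spread stay
    # roughly constant across targets -- the scalar signature.
    print(target, round(mean_mid / target, 3), round(sd / target, 3))
```

    Dividing the mean midpoint and its spread by the target makes the scalar signature visible directly: the normalized values are nearly identical at 5 s, 15 s, and 45 s.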

  10. Real time source term and dose assessment

    International Nuclear Information System (INIS)

    Breznik, B.; Kovac, A.; Mlakar, P.

    2001-01-01

    The Dose Projection Programme is a tool for decision making in case of nuclear emergency. The essential input data for quick emergency evaluation in the case of a hypothetical pressurised water reactor accident are the following: source term, core damage assessment, fission product radioactivity, release source term and critical exposure pathways for an early phase of the release. A reduced number of radionuclides and simplified calculations can be used in the dose calculation algorithm. A simple expert-system personal computer programme has been developed for the Krsko Nuclear Power Plant for dose projection within a radius of a few kilometers of the pressurised water reactor in the early phase of an accident. The input data are instantaneous data of core activity, core damage indicators, release fractions, reduction factors of the release pathways, spray operation, release timing, and the dispersion coefficient. The main dose projection steps are: accurate in-core radioactivity determination using reactor power input; core damage and in-containment source term assessment based on quick indications of instrumentation or on activity analysis data; user definition of the release pathway for a typical PWR accident scenario; dose calculation performed only for the exposure pathway critical for decisions about evacuation or sheltering in the early phase of an accident. (author)
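    The calculation chain in the abstract (core activity → release fraction → pathway reduction → atmospheric dispersion → dose) can be sketched in a few lines; the nuclide, coefficients, and dose-conversion factor below are hypothetical placeholders, not Krsko plant data.

```python
def projected_dose(core_activity_bq, release_fraction, reduction_factor,
                   chi_q_s_per_m3, dose_conv_sv_per_bq_s_m3):
    """One-nuclide dose estimate following the steps in the abstract:
    source term = core activity x release fraction x pathway reduction;
    time-integrated air concentration = source term x dispersion (chi/Q);
    dose = concentration integral x dose-conversion factor."""
    source_term_bq = core_activity_bq * release_fraction * reduction_factor
    conc_integral = source_term_bq * chi_q_s_per_m3
    return conc_integral * dose_conv_sv_per_bq_s_m3

# Hypothetical iodine-like nuclide: 1e18 Bq in core, 10% released from
# the fuel, factor-10 retention in the release pathway, chi/Q of
# 1e-5 s/m3 close to the plant, and an invented conversion factor.
dose_sv = projected_dose(1e18, 0.10, 0.1, 1e-5, 1.1e-11)
print(f"{dose_sv:.2e} Sv")   # 1.10e+00 Sv
```

    In a real tool each nuclide in the reduced set would carry its own conversion factor and the contributions would be summed per exposure pathway.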

  11. Time-Reversal Study of the Hemet (CA) Tremor Source

    Science.gov (United States)

    Larmat, C. S.; Johnson, P. A.; Guyer, R. A.

    2010-12-01

    Since its first observation by Nadeau & Dolenc (2005) and Gomberg et al. (2008), tremor along the San Andreas fault system has been thought to be a probe into the frictional state of the deep part of the fault (e.g. Shelly et al., 2007). Tremor is associated with slow, otherwise deep, aseismic slip events that may be triggered by faint signals such as passing waves from remote earthquakes or solid Earth tides. Well-resolved tremor source location is key to constraining frictional models of the fault. However, tremor source location is challenging because of the high-frequency and highly scattered nature of the tremor signal, characterized by the lack of isolated phase arrivals. Time Reversal (TR) methods are emerging as a useful tool for location. The unique requirement is a good velocity model for the different time-reversed phases to arrive coherently at the source point. We present location results for a tremor source near the town of Hemet, CA, which was triggered by the 2002 M 7.9 Denali Fault earthquake (Gomberg et al., 2008) and by the 2009 M 6.9 Gulf of California earthquake. We performed TR in a volume model of 88 km (N-S) x 70 km (W-E) x 60 km (Z) using the full-wave 3D wave-propagation package SPECFEM3D (Komatitsch et al., 2002). The results for the 2009 episode indicate a deep source (at about 22 km) that is about 4 km SW of the fault surface scarp. We perform STA/LTA and correlation analysis in order to have independent confirmation of the Hemet tremor source. We gratefully acknowledge the support of the U.S. Department of Energy through the LANL/LDRD Program for this work.
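    The STA/LTA detection step mentioned above (assuming the abstract's "STA/SLA" refers to the standard short-term average/long-term average trigger) can be sketched as follows; the window lengths and the synthetic trace are illustrative only.

```python
import math
import random

def sta_lta(signal, n_sta, n_lta):
    """Classic STA/LTA detector: ratio of the short-term to the
    long-term average of signal energy. A ratio well above 1 flags
    emergent energy such as tremor. Simple moving-average version."""
    energy = [x * x for x in signal]
    ratios = []
    for i in range(n_lta, len(energy)):
        sta = sum(energy[i - n_sta:i]) / n_sta
        lta = sum(energy[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Synthetic trace: Gaussian noise with an emergent tremor-like burst.
random.seed(0)
trace = [random.gauss(0, 1) for _ in range(2000)]
for i in range(1200, 1400):
    trace[i] += 5 * math.sin(0.3 * i)

ratios = sta_lta(trace, n_sta=50, n_lta=500)
print(max(ratios) > 3)   # the burst clearly exceeds a typical trigger level
```

    Real tremor lacks impulsive onsets, which is why the abstract pairs this energy detector with correlation analysis and full-waveform time reversal rather than relying on picked arrivals.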

  12. Global threat reduction initiative efforts to address transportation challenges associated with the recovery of disused radioactive sealed sources - 10460

    International Nuclear Information System (INIS)

    Whitworth, Julie; Abeyta, Cristy L.; Griffin, Justin M.; Matzke, James L.; Pearson, Michael W.; Cuthbertson, Abigail; Rawl, Richard; Singley, Paul

    2010-01-01

    Proper disposition of disused radioactive sources is essential for their safe and secure management and necessary to preclude their use in malicious activities. Without affordable, timely transportation options, disused sealed sources remain in storage at hundreds of sites throughout the country and around the world. While secure storage is a temporary measure, the longer sources remain disused or unwanted, the greater the chance that they will become unsecured or abandoned. The Global Threat Reduction Initiative's Off-Site Source Recovery Project (GTRI/OSRP) recovers thousands of disused and unwanted sealed sources annually as part of GTRI's larger mission to reduce and protect high-risk nuclear and radiological materials located at civilian sites worldwide. Faced with decreasing availability of certified transportation containers to support movement of disused and unwanted neutron- and beta/gamma-emitting radioactive sealed sources, GTRI/OSRP has initiated actions to ensure the continued success of the project in timely recovery and management of sealed radioactive sources.
    Efforts described in this paper to enhance transportation capabilities include: • addition of authorized content to existing and planned Type B containers to support the movement of non-special form and other Type B-quantity sealed sources; • procurement of vendor services for the design, development, testing and certification of a new Type B container to support transportation of irradiators, teletherapy heads or sources removed from these devices using remote handling capabilities such as the IAEA portable hot cell facility; • expansion of the shielded Type A container inventory for transportation of gamma-emitting sources in activity ranges requiring use of shielding for conformity with transportation requirements; • approval of the S300 Type A fissile container for transport of Pu-239 sealed sources internationally; • technology transfer of

  13. Time-domain single-source integral equations for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdés, Felipe

    2013-03-01

    Single-source time-domain electric- and magnetic-field integral equations for analyzing scattering from homogeneous penetrable objects are presented. Their temporal discretization is effected by using shifted piecewise polynomial temporal basis functions and a collocation testing procedure, thus allowing for a marching-on-in-time (MOT) solution scheme. Unlike dual-source formulations, single-source equations involve space-time domain operator products, for which spatial discretization techniques developed for standalone operators do not apply. Here, the spatial discretization of the single-source time-domain integral equations is achieved by using the high-order divergence-conforming basis functions developed by Graglia alongside the high-order divergence- and quasi curl-conforming (DQCC) basis functions of Valdés. The combination of these two sets allows for a well-conditioned mapping from div- to curl-conforming function spaces that fully respects the space-mapping properties of the space-time operators involved. Numerical results corroborate the fact that the proposed procedure guarantees accuracy and stability of the MOT scheme. © 2012 IEEE.

  14. Arid landscape dynamics along a precipitation gradient: addressing vegetation - landscape structure - resource interactions at different time scales

    NARCIS (Netherlands)

    Buis, E.

    2008-01-01

    This research is entitled ‘Arid landscape dynamics along a precipitation gradient: addressing vegetation – landscape structure – resource interactions at different time scales’, with the subtitle ‘A case study for the Northern Negev Desert of Israel’. Landscape dynamics describes the

  15. Advances in high-order harmonic generation sources for time-resolved investigations

    Energy Technology Data Exchange (ETDEWEB)

    Reduzzi, Maurizio [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Carpeggiani, Paolo [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Kühn, Sergei [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Calegari, Francesca [Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Nisoli, Mauro; Stagira, Salvatore [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Vozzi, Caterina [Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Dombi, Peter [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Wigner Research Center for Physics, 1121 Budapest (Hungary); Kahaly, Subhendu [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Tzallas, Paris; Charalambidis, Dimitris [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Foundation for Research and Technology – Hellas, Institute of Electronic Structure and Lasers, P.O. Box 1527, GR-711 10 Heraklion, Crete (Greece); Varju, Katalin [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Department of Optics and Quantum Electronics, University of Szeged, Dóm tér 9, 6720 Szeged (Hungary); Osvay, Karoly [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); and others

    2015-10-15

    We review the main research directions ongoing in the development of extreme ultraviolet sources based on high-harmonic generation for the synthesis of trains and isolated attosecond pulses and their application to time-resolved spectroscopy. A few experimental and theoretical works will be discussed in connection with well-established attosecond techniques. In this context, we present the unique possibilities offered for time-resolved investigations on the attosecond timescale by the new Extreme Light Infrastructure Attosecond Light Pulse Source, which is currently under construction.

  16. Advances in high-order harmonic generation sources for time-resolved investigations

    International Nuclear Information System (INIS)

    Reduzzi, Maurizio; Carpeggiani, Paolo; Kühn, Sergei; Calegari, Francesca; Nisoli, Mauro; Stagira, Salvatore; Vozzi, Caterina; Dombi, Peter; Kahaly, Subhendu; Tzallas, Paris; Charalambidis, Dimitris; Varju, Katalin; Osvay, Karoly

    2015-01-01

    We review the main research directions ongoing in the development of extreme ultraviolet sources based on high-harmonic generation for the synthesis of trains and isolated attosecond pulses and their application to time-resolved spectroscopy. A few experimental and theoretical works will be discussed in connection with well-established attosecond techniques. In this context, we present the unique possibilities offered for time-resolved investigations on the attosecond timescale by the new Extreme Light Infrastructure Attosecond Light Pulse Source, which is currently under construction.

  17. A new time-space accounting scheme to predict stream water residence time and hydrograph source components at the watershed scale

    Science.gov (United States)

    Takahiro Sayama; Jeffrey J. McDonnell

    2009-01-01

    Hydrograph source components and stream water residence time are fundamental behavioral descriptors of watersheds but, as yet, are poorly represented in most rainfall-runoff models. We present a new time-space accounting scheme (T-SAS) to simulate the pre-event and event water fractions, mean residence time, and spatial source of streamflow at the watershed scale. We...

  18. Probing Motion of Fast Radio Burst Sources by Timing Strongly Lensed Repeaters

    Science.gov (United States)

    Dai, Liang; Lu, Wenbin

    2017-09-01

    Given the possible repetitive nature of fast radio bursts (FRBs), their cosmological origin, and their high occurrence, detection of strongly lensed sources due to intervening galaxy lenses is possible with forthcoming radio surveys. We show that if multiple images of a repeating source are resolved with VLBI, using a method independent of lens modeling, accurate timing could reveal non-uniform motion, either physical or apparent, of the emission spot. This can probe the physical nature of FRBs and their surrounding environments, constraining scenarios including orbital motion around a stellar companion if FRBs require a compact star in a special system, and jet-medium interactions for which the location of the emission spot may randomly vary. The high timing precision possible for FRBs (˜ms) compared with the typical time delays between images in galaxy lensing (≳10 days) enables the measurement of tiny fractional changes in the delays (˜10⁻⁹) and hence the detection of time-delay variations induced by relative motions between the source, the lens, and the Earth. We show that uniform cosmic peculiar velocities only cause the delay time to drift linearly, and that the effect from the Earth’s orbital motion can be accurately subtracted, thus enabling a search for non-trivial source motion. For a timing accuracy of ˜1 ms and a repetition rate (of detected bursts) of ˜0.05 per day of a single FRB source, non-uniform displacement ≳0.1-1 au of the emission spot perpendicular to the line of sight is detectable if repetitions are seen over a period of hundreds of days.
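    The fractional delay precision quoted in the abstract follows directly from the two numbers given there; a quick back-of-envelope check:

```python
# Values taken directly from the abstract: ~1 ms timing accuracy on a
# >~10 day lensing delay between images.
timing_accuracy_s = 1e-3
lens_delay_s = 10 * 86400          # 10 days in seconds

fractional_precision = timing_accuracy_s / lens_delay_s
print(f"{fractional_precision:.1e}")   # 1.2e-09 -- the ~1e-9 quoted above
```
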

  19. An Evolving Worldview: Making Open Source Easy

    Science.gov (United States)

    Rice, Z.

    2017-12-01

    NASA Worldview is an interactive interface for browsing full-resolution, global satellite imagery. Worldview supports an open data policy so that academia, private industries and the general public can use NASA's satellite data to address Earth science related issues. Worldview was open sourced in 2014. By shifting to an open source approach, the Worldview application has evolved to better serve end-users. Project developers are able to have discussions with end-users and community developers to understand issues and develop new features. Community developers are able to track upcoming features, collaborate on them and make their own contributions. Developers who discover issues are able to address those issues and submit a fix. This reduces the time it takes for a project developer to reproduce an issue or develop a new feature. Getting new developers to contribute to the project has been one of the most important and difficult aspects of open sourcing Worldview. After witnessing potential outside contributors struggle, a focus has been made on making the installation of Worldview simple to reduce the initial learning curve and make contributing code easy. One way we have addressed this is through a simplified setup process. Our setup documentation includes a set of prerequisites and a set of straightforward commands to clone, configure, install and run. This presentation will emphasize our focus to simplify and standardize Worldview's open source code so that more people are able to contribute. The more people who contribute, the better the application will become over time.

  20. Time-dependent anisotropic distributed source capability in transient 3-d transport code tort-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    The transient 3-D discrete ordinates transport code TORT-TD has been extended to account for time-dependent anisotropic distributed external sources. The extension aims at the simulation of the pulsed neutron source in the YALINA-Thermal subcritical assembly. Since feedback effects are not relevant in this zero-power configuration, this offers a unique opportunity to validate the time-dependent neutron kinetics of TORT-TD with experimental data. The extensions made in TORT-TD to incorporate a time-dependent anisotropic external source are described. The steady state of the YALINA-Thermal assembly and its response to an artificial square-wave source pulse sequence have been analysed with TORT-TD using pin-wise homogenised cross sections in 18 prompt energy groups with P₁ scattering order and 8 delayed neutron groups. The results demonstrate the applicability of TORT-TD to subcritical problems with a time-dependent external source. (authors)

  1. Improving wheat productivity through source and timing of nitrogen fertilization

    International Nuclear Information System (INIS)

    Jan, M.T.; Khan, A.; Afridi, M.Z.; Arif, M.; Khan, M.J.; Farhatullah; Jan, D.; Saeed, M.

    2011-01-01

    Efficient nitrogen (N) fertilizer management is critical for improved production of wheat (Triticum aestivum L.) and can be achieved through the source and timing of N application. Thus, an experiment was carried out at the Research Farm of KPK Agricultural University Peshawar during 2005-06 to test the effects of source and timing of N application on yield and yield components of wheat. Nitrogen sources were ammonium (NH₄) and nitrate (NO₃), applied at the rate of 100 kg ha⁻¹ at three different stages, i.e., at sowing (S1), tillering (S2) and boot stage (S3). Ammonium N increased yield components but did not affect the final grain yield. Split N application at sowing, tillering and boot stages increased productive tillers m⁻² and thousand-grain weight, whereas grain yield was higher when N was applied at the tillering and boot stages. Nitrogen fertilization increased grain yield by 20% compared to the control regardless of N application time. It was concluded from the experiment that split application of NH₄-N performed better than full-dose application and/or NO₃-N for improved wheat productivity and is thus recommended for general practice in the agro-climatic conditions of Peshawar. (author)

  2. Name-Based Address Mapping for Virtual Private Networks

    Science.gov (United States)

    Surányi, Péter; Shinjo, Yasushi; Kato, Kazuhiko

    IPv4 private addresses are commonly used in local area networks (LANs). With the increasing popularity of virtual private networks (VPNs), it has become common for a user to connect to multiple LANs at the same time. However, private address ranges for LANs frequently overlap. In such cases, existing systems do not allow the user to access the resources on all LANs at the same time. In this paper, we propose name-based address mapping for VPNs, a novel method that allows connecting to hosts through multiple VPNs at the same time, even when the address ranges of the VPNs overlap. In name-based address mapping, rather than using the IP addresses used on the LANs (the real addresses), we assign a unique virtual address to each remote host based on its domain name. The local host uses the virtual addresses to communicate with remote hosts. We have implemented name-based address mapping for layer 3 OpenVPN connections on Linux and measured its performance. The communication overhead of our system is less than 1.5% for throughput and less than 0.2 ms for each name resolution.
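    The core idea above (derive a unique local address from the remote host's domain name instead of its possibly overlapping real address) can be sketched as below. The reserved range and the hash-based allocation are this sketch's assumptions; the paper's actual allocation scheme may differ.

```python
import hashlib
import ipaddress

# Virtual range drawn from the 198.18.0.0/15 benchmarking block, chosen
# here only because it is unlikely to collide with real LAN addresses.
VIRTUAL_NET = ipaddress.ip_network("198.18.0.0/15")

def virtual_address(hostname: str) -> ipaddress.IPv4Address:
    """Deterministically map a domain name to a virtual IPv4 address.

    Two hosts that share the same real (private) address on different,
    overlapping VPNs still receive distinct local virtual addresses,
    because the mapping depends only on the name.
    """
    digest = hashlib.sha256(hostname.lower().encode()).digest()
    offset = int.from_bytes(digest[:4], "big") % VIRTUAL_NET.num_addresses
    return VIRTUAL_NET[offset]

# Same name always yields the same virtual address; different names
# (almost always) yield different ones.
print(virtual_address("db.lab-a.example"))
print(virtual_address("db.lab-b.example"))
```

    A production implementation would track assignments to resolve hash collisions and translate between virtual and real addresses at the tunnel boundary; the hash only makes this toy mapping stable without shared state.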

  3. Addressing techniques of liquid crystal displays

    CERN Document Server

    Ruckmongathan, Temkar N

    2014-01-01

    Unique reference source that can be used from the beginning to end of a design project to aid choosing an appropriate LCD addressing technique for a given application This book will be aimed at design engineers who are likely to embed LCD drivers and controllers in many systems including systems on chip. Such designers face the challenge of making the right choice of an addressing technique that will serve them with best performance at minimal cost and complexity. Readers will be able to learn about various methods available for driving matrix LCDs and the comparisons at the end of each chap

  4. System identification through nonstationary data using Time-Frequency Blind Source Separation

    Science.gov (United States)

    Guo, Yanlin; Kareem, Ahsan

    2016-06-01

    Classical output-only system identification (SI) methods are based on the assumption of stationarity of the system response. However, the measured response of buildings and bridges is usually non-stationary due to strong winds (e.g., typhoons and thunderstorms), earthquakes and time-varying vehicle motions. Accordingly, the response data may have time-varying frequency contents and/or overlapping of modal frequencies due to non-stationary colored excitation. This renders traditional methods problematic for modal separation and identification. To address these challenges, a new SI technique based on Time-Frequency Blind Source Separation (TFBSS) is proposed. By selectively utilizing "effective" information in local regions of the time-frequency plane, where only one mode contributes to energy, the proposed technique can successfully identify mode shapes and recover modal responses from the non-stationary response where the traditional SI methods often encounter difficulties. This technique can also handle response with closely spaced modes, which is a well-known challenge for the identification of large-scale structures. Based on the separated modal responses, frequency and damping can be easily identified using SI methods based on a single-degree-of-freedom (SDOF) system. In addition to the exclusive advantage of handling non-stationary data and closely spaced modes, the proposed technique also benefits from the absence of end effects and low sensitivity to noise in modal separation. The efficacy of the proposed technique is demonstrated using several simulation-based studies, and compared to the popular Second-Order Blind Identification (SOBI) scheme. It is also noted that even some non-stationary response data can be analyzed by the stationary method SOBI. This paper also delineates non-stationary cases where SOBI and the proposed scheme perform comparably and highlights cases where the proposed approach is more advantageous. Finally, the performance of the

  5. Reclaiming unused IPv4 addresses

    CERN Multimedia

    IT Department

    2016-01-01

    As many people might know, the number of IPv4 addresses is limited and almost all have been allocated (see here and here for more information).   Although CERN has been allocated some 340,000 addresses, the way these are allocated across the site is not as efficient as we would like. As we face an increasing demand for IPv4 addresses with the growth in virtual machines, the IT Department’s Communication Systems Group will be reorganising address allocation during 2016 to make more efficient use of the IPv4 address ranges that have been allocated to CERN. We aim, wherever possible, to avoid giving out fixed IP addresses, and have all devices connected to the campus network obtain an address dynamically each time they connect. As a first stage, starting in February, IP addresses that have not been used for more than 9 months will be reclaimed. No information about the devices concerned will be deleted from LANDB, but a new IP address will have to be requested if they are ever reconnected to t...

  6. Disrupting gatekeeping practices: Journalists' source selection in times of crisis.

    Science.gov (United States)

    van der Meer, Toni G L A; Verhoeven, Piet; Beentjes, Johannes W J; Vliegenthart, Rens

    2017-10-01

    As gatekeepers, journalists have the power to select the sources that get a voice in crisis coverage. The aim of this study is to find out how journalists select sources during a crisis. In a survey, journalists were asked how they assess the following sources during an organizational crisis: news agencies, an organization undergoing a crisis, and the general public. The sample consisted of 214 Dutch experienced journalists who at least once covered a crisis. Using structural equation modeling, sources' likelihood of being included in the news was predicted using five source characteristics: credibility, knowledge, willingness, timeliness, and the relationship with the journalist. Findings indicated that during a crisis, news agencies are most likely to be included in the news, followed by the public, and finally the organization. The significance of the five source characteristics is dependent on source type. For example, to be used in the news, news agencies and organizations should be mainly evaluated as knowledgeable, whereas information from the public should be both credible and timely. In addition, organizations should not be seen as too willing or too eager to communicate. The findings imply that, during a crisis, journalists remain critical gatekeepers; however, they rely mainly on familiar sources.

  7. Dual Source Time-of-flight Mass Spectrometer and Sample Handling System

    Science.gov (United States)

    Brinckerhoff, W.; Mahaffy, P.; Cornish, T.; Cheng, A.; Gorevan, S.; Niemann, H.; Harpold, D.; Rafeek, S.; Yucht, D.

    We present details of an instrument under development for potential NASA missions to planets and small bodies. The instrument comprises a dual ionization source (laser and electron impact) time-of-flight mass spectrometer (TOF-MS) and a carousel sample handling system for in situ analysis of solid materials acquired by, e.g., a coring drill. This DSTOF instrument could be deployed on a fixed lander or a rover, and has an open design that would accommodate measurements by additional instruments. The sample handling system (SHS) is based on a multi-well carousel, originally designed for Champollion/DS4. Solid samples, in the form of drill cores or as loose chips or fines, are inserted through an access port, sealed in vacuum, and transported around the carousel to a pyrolysis cell and/or directly to the TOF-MS inlet. Samples at the TOF-MS inlet are xy-addressable for laser or optical microprobe. Cups may be ejected from their holders for analyzing multiple samples or caching them for return. Samples are analyzed with laser desorption and evolved-gas/electron-impact sources. The dual ion source permits studies of elemental, isotopic, and molecular composition of unprepared samples with a single mass spectrometer. Pulsed laser desorption permits the measurement of abundance and isotope ratios of refractory elements, as well as the detection of high-mass organic molecules in solid samples. Evolved gas analysis permits similar measurements of the more volatile species in solids and aerosols. The TOF-MS is based on previous miniature prototypes at JHU/APL that feature high sensitivity and a wide mass range. The laser mode, in which the sample cup is directly below the TOF-MS inlet, permits both ablation and desorption measurements, to cover elemental and molecular species, respectively. In the evolved gas mode, sample cups are raised into a small pyrolysis cell and heated, producing a neutral gas that is electron ionized and pulsed into the TOF-MS. (Any imaging

  8. The genetic source and timing of hydrocarbon formation in gas hydrate reservoirs in Green Canyon, Block GC955

    Science.gov (United States)

    Moore, M. T.; Darrah, T.; Cook, A.; Sawyer, D.; Phillips, S.; Whyte, C. J.; Lary, B. A.

    2017-12-01

    Although large volumes of gas hydrates are known to exist along continental slopes and below permafrost, their role in the energy sector and the global carbon cycle remains uncertain. Investigations regarding the genetic source(s) (i.e., biogenic, thermogenic, mixed sources of hydrocarbon gases), the location of hydrocarbon generation, (whether hydrocarbons formed within the current reservoir formations or underwent migration), rates of clathrate formation, and the timing of natural gas formation/accumulation within clathrates are vital to evaluate economic potential and enhance our understanding of geologic processes. Previous studies addressed some of these questions through analysis of conventional hydrocarbon molecular (C1/C2+) and stable isotopic (e.g., δ13C-CH4, δ2H-CH4, δ13C-CO2) composition of gases, water chemistry and isotopes (e.g., major and trace elements, δ2H-H2O, δ18O-H2O), and dissolved inorganic carbon (δ13C-DIC) of natural gas hydrate systems to determine proportions of biogenic and thermogenic gas. However, the effects from contributions of mixing, transport/migration, methanogenesis, and oxidation in the subsurface can complicate the first-order application of these techniques. Because the original noble gas composition of a fluid is preserved independent of microbial activity, chemical reactions, or changes in oxygen fugacity, the integration of noble gas data can provide both a geochemical fingerprint for sources of fluids and an additional insight as to the uncertainty between effects of mixing versus post-genetic modification. Here, we integrate inert noble gases (He, Ne, Ar, and associated isotopes) with these conventional approaches to better constrain the source of gas hydrate formation and the residence time of fluids (porewaters and natural gases) using radiogenic 4He ingrowth techniques in cores from two boreholes collected as part of the University of Texas led UT-GOM2-01 drilling project. Pressurized cores were extracted from

  9. Disrupting gatekeeping practices: Journalists’ source selection in times of crisis

    Science.gov (United States)

    van der Meer, Toni G.L.A.; Verhoeven, Piet; Beentjes, Johannes W.J.; Vliegenthart, Rens

    2016-01-01

    As gatekeepers, journalists have the power to select the sources that get a voice in crisis coverage. The aim of this study is to find out how journalists select sources during a crisis. In a survey, journalists were asked how they assess the following sources during an organizational crisis: news agencies, an organization undergoing a crisis, and the general public. The sample consisted of 214 Dutch experienced journalists who at least once covered a crisis. Using structural equation modeling, sources’ likelihood of being included in the news was predicted using five source characteristics: credibility, knowledge, willingness, timeliness, and the relationship with the journalist. Findings indicated that during a crisis, news agencies are most likely to be included in the news, followed by the public, and finally the organization. The significance of the five source characteristics is dependent on source type. For example, to be used in the news, news agencies and organizations should be mainly evaluated as knowledgeable, whereas information from the public should be both credible and timely. In addition, organizations should not be seen as too willing or too eager to communicate. The findings imply that, during a crisis, journalists remain critical gatekeepers; however, they rely mainly on familiar sources. PMID:29278263

  10. TEMPS, 1-Group Time-Dependent Pulsed Source Neutron Transport

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1988-01-01

    1 - Description of program or function: TEMPS numerically determines the scalar flux as given by the one-group neutron transport equation with a pulsed source in an infinite medium. Standard plane, point, and line sources are considered as well as a volume source in the negative half-space in plane geometry. The angular distribution of emitted neutrons can either be isotropic or mono-directional (beam) in plane geometry, and isotropic in spherical and cylindrical geometry. A general anisotropic scattering kernel represented in terms of Legendre polynomials can be accommodated, with a time-dependent number of secondaries given by c(t) = c₀(t/t₀)^β, where β is greater than -1 and less than infinity. TEMPS is designed to provide the flux to a high degree of accuracy (4-5 digits) for use as a benchmark to which results from other numerical solutions or approximations can be compared. 2 - Method of solution: A semi-analytic method of solution is followed. The main feature of this approach is that no discretization of the transport or scattering operators is employed. The numerical solution involves the evaluation of an analytical representation of the solution by standard numerical techniques. The transport equation is first reformulated in terms of multiple collisions, with the flux represented by an infinite series of collisional components. Each component is then represented by an orthogonal Legendre series expansion in the variable x/t, where the distance x and time t are measured in terms of mean free path and mean free time, respectively. The moments in the Legendre reconstruction are found from an algebraic recursion relation obtained from a Legendre expansion in the direction variable mu. The multiple collision series is evaluated first to a prescribed relative error determined by the number of digits desired in the scalar flux. If the Legendre series fails to converge in the plane or point source case, an accelerative transformation, based on removing the

  11. Overcomplete Blind Source Separation by Combining ICA and Binary Time-Frequency Masking

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan

    2005-01-01

    a novel method for over-complete blind source separation. Two powerful source separation techniques have been combined, independent component analysis and binary time-frequency masking. Hereby, it is possible to iteratively extract each speech signal from the mixture. By using merely two microphones we...
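The binary time-frequency masking half of the method can be illustrated in isolation: assign each time-frequency cell to whichever source dominates it, then use the resulting binary mask to carve estimates out of the mixture. In the paper the dominance estimates come from ICA applied to two microphone signals; here, purely for illustration, synthetic magnitude spectrograms are given directly.

```python
import numpy as np

# Synthetic |STFT| magnitudes standing in for the ICA-derived source
# estimates; shapes and values are arbitrary illustration choices.
rng = np.random.default_rng(1)
S1 = np.abs(rng.standard_normal((64, 50)))   # source 1 spectrogram (synthetic)
S2 = np.abs(rng.standard_normal((64, 50)))   # source 2 spectrogram (synthetic)
mixture = S1 + S2                            # crude magnitude-domain mixture

mask1 = S1 > S2          # binary mask: cells where source 1 dominates
est1 = mixture * mask1   # estimate of source 1
est2 = mixture * (~mask1)
```

The masks are disjoint and exhaustive, so the two estimates partition the mixture exactly, which is the defining property of binary masking.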

  12. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P.

    2012-09-01

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
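The elementary update a BBN performs can be shown with a single-evidence Bayes rule over a few hypothetical accident states: the posterior is proportional to prior times likelihood. The state names, priors, and likelihoods below are invented for illustration and bear no relation to the actual RASTEP network, which couples many such nodes.

```python
# Hypothetical prior beliefs over plant states before any observation.
priors = {"intact": 0.90, "filtered_release": 0.08, "bypass": 0.02}
# Assumed P(high containment-pressure reading | state) for each state.
likelihood = {"intact": 0.05, "filtered_release": 0.70, "bypass": 0.90}

def update(priors, likelihood):
    """Bayes rule: posterior ∝ prior * likelihood, then normalize."""
    unnorm = {s: priors[s] * likelihood[s] for s in priors}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

posterior = update(priors, likelihood)
```

A single alarming observation shifts the belief mass toward release states; the real network chains many observations and intermediate nodes before mapping to pre-calculated source terms.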

  13. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  14. Sources management; La gestion des sources

    Energy Technology Data Exchange (ETDEWEB)

    Mansoux, H.; Gourmelon; Scanff, P.; Fournet, F. [Institut de Radioprotection et de Surete Nucleaire, 92 - Fontenay-aux-Roses (France); Murith, Ch. [Office Federal de la Sante Publique (Switzerland); Saint-Paul, N. [NOVAR, 75 - Paris (France); Colson, P. [Electricite de France (EDF/DPN), 93 - Saint-Denis (France); Jouve, A.; Feron, F. [Direction Generale de la Surete Nucleaire et de la Radioprotection, 75 - Paris (France); Haranger, D. [Electricite de France (EDF), 75 - Paris (France); Mathieu, P. [Institut Pasteur, 75 - Paris (France); Paycha, F. [CHU Louis Mourier, Unite de Medecine Nucleaire, Assistance Publique-Hopitaux de Paris, 92 - Colombes (France); Israel, S. [CEGELEC NDT (France); Auboiroux, B. [APAVE (France); Chartier, P. [DRIRE de Basse-Normandie, Div. Surete Nucleaire et Radioprotection, 14 - Caen (France)

    2005-07-01

    Organized by the technical protection section of the French Society of Radiation Protection (SFRP), these two days aimed to review the evolution of the regulations on sources of ionising radiation (sealed and unsealed radioactive sources, electric generators). They addressed all the actors concerned by the implementation of the new regulatory system in the different sectors of activity (research, medicine and industry): authorities, manufacturers and suppliers of sources, holders and users, bodies involved in the approval of sources, and carriers. (N.C.)

  15. Finite element approximation for time-dependent diffusion with measure-valued source

    Czech Academy of Sciences Publication Activity Database

    Seidman, T.; Gobbert, M.; Trott, D.; Kružík, Martin

    2012-01-01

    Roč. 122, č. 4 (2012), s. 709-723 ISSN 0029-599X R&D Projects: GA AV ČR IAA100750802 Institutional support: RVO:67985556 Keywords : measure-valued source * diffusion equation Subject RIV: BA - General Mathematics Impact factor: 1.329, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/kruzik-finite element approximation for time - dependent diffusion with measure-valued source.pdf

  16. Design of the Advanced Light Source timing system

    International Nuclear Information System (INIS)

    Fahmie, M.

    1993-05-01

    The Advanced Light Source (ALS) is a third-generation synchrotron radiation facility, and as such has several unique timing requirements. Arbitrary Storage Ring filling patterns and high single-bunch purity requirements demand a highly stable, low-jitter timing system with the flexibility to reconfigure on a pulse-to-pulse basis. This modular system utilizes a highly linear Gauss Clock with "on-the-fly" programmable setpoints to track a free-running Booster ramping magnet, and provides digitally programmable sequencing and delay for the Electron Gun, Linac, Booster Ring, and Storage Ring RF, Pulsed Magnet, and Instrumentation systems. It has proven itself over the last year of accelerator operation to be reliable and rock solid.

  17. Time collimation for elastic neutron scattering instrument at a pulsed source

    International Nuclear Information System (INIS)

    Aksenov, V.L.; Nikitenko, Yu.V.

    1996-01-01

    Conditions for carrying out elastic neutron scattering experiments using the time-of-flight technique are considered. It is shown that the employment of time dependent neutron beam collimation in the source-sample flight path increases the luminosity of the spectrometer under certain resolution restrictions. 3 refs., 8 figs

  18. SLStudio: Open-source framework for real-time structured light

    DEFF Research Database (Denmark)

    Wilm, Jakob; Olesen, Oline Vinter; Larsen, Rasmus

    2014-01-01

    An open-source framework for real-time structured light is presented. It is called "SLStudio", and enables real-time capture of metric depth images. The framework is modular, and extensible to support new algorithms for scene encoding/decoding, triangulation, and acquisition hardware. It is the aim that this software makes real-time 3D scene capture more widely accessible and serves as a foundation for new structured light scanners operating in real time, e.g., 20 depth images per second and more. The use cases for such scanners are plentiful; however, due to the computational constraints, all public implementations so far are limited to offline processing. With "SLStudio", we are making a platform available which enables researchers from many different fields to build application-specific real-time 3D scanners. The software is hosted at http://compute.dtu.dk/~jakw/slstudio.
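One common scene-encoding scheme such frameworks support is three-step phase shifting: project three sinusoidal patterns shifted by 120° and recover the wrapped phase per pixel. The sketch below shows the standard decode formula on synthetic intensities; it is a generic illustration, not SLStudio's actual API, and a real scanner would still unwrap the phase and triangulate against calibration data.

```python
import numpy as np

def decode_phase(i1, i2, i3):
    """Wrapped phase from three patterns shifted by -120°, 0°, +120°.
    Standard three-step phase-shifting formula."""
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic pixel intensities I_k = a + b*cos(phi + delta_k) for a known phase.
phi = 0.7           # ground-truth phase to recover
a, b = 0.5, 0.4     # ambient level and modulation amplitude (arbitrary)
shifts = [-2 * np.pi / 3, 0.0, 2 * np.pi / 3]
i1, i2, i3 = (a + b * np.cos(phi + s) for s in shifts)

recovered = decode_phase(i1, i2, i3)   # equals phi up to rounding
```

The identity follows from the sum formulas: √3(I1 − I3) = 3b sin φ and 2I2 − I1 − I3 = 3b cos φ, so the arctangent returns φ regardless of a and b.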

  19. THE ROLE OF NUTRITIONAL INFORMATION IN ADDRESSING ...

    African Journals Online (AJOL)

    The paper discusses the role of nutritional information for addressing under-five child malnutrition in Tanzania. The paper is based on a master's dissertation whose objective was to determine the sources of nutritional information used to provide nutritional information to mothers in Maternal and Child Health (MCH) clinics, ...

  20. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  1. A compact time-of-flight mass spectrometer for ion source characterization

    International Nuclear Information System (INIS)

    Chen, L.; Wan, X.; Jin, D. Z.; Tan, X. H.; Huang, Z. X.; Tan, G. B.

    2015-01-01

    A compact time-of-flight mass spectrometer with overall dimensions of about 413 × 250 × 414 mm, based on orthogonal injection and angle reflection, has been developed for ion source characterization. The configuration and principle of the time-of-flight mass spectrometer are introduced in this paper. The mass resolution is optimized to about 1690 (FWHM), and the ion energy detection range is tested to be between about 3 and 163 eV with the help of an electron impact ion source. High mass resolution and compact configuration make this spectrometer a valuable diagnostic for fundamental ion spectra research and for studying the mass-to-charge composition of plasmas over a wide range of parameters.
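The working principle behind any such instrument is the idealized flight-time relation t = t0 + k·√(m/z), which lets two reference ions calibrate the spectrometer and then convert measured flight times back to mass-to-charge ratios. The reference masses and flight times below are made up for illustration; they are not parameters of the instrument in the paper.

```python
import math

def calibrate(mz1, t1, mz2, t2):
    """Solve t = t0 + k*sqrt(m/z) for (t0, k) from two reference ions."""
    s1, s2 = math.sqrt(mz1), math.sqrt(mz2)
    k = (t2 - t1) / (s2 - s1)
    t0 = t1 - k * s1
    return t0, k

def mz_from_tof(t, t0, k):
    """Invert the flight-time relation to recover m/z."""
    return ((t - t0) / k) ** 2

# Hypothetical calibration: flight times (in microseconds) for m/z 18 and 28.
t0, k = calibrate(18.0, 3.30, 28.0, 4.05)
```

By construction the calibration reproduces both reference masses exactly; intermediate flight times interpolate on the √(m/z) scale.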

  2. Measurement and simulation of the time-dependent behavior of the UMER source

    International Nuclear Information System (INIS)

    Haber, I.; Feldman, D.; Fiorito, R.; Friedman, A.; Grote, D.P.; Kishek, R.A.; Quinn, B.; Reiser, M.; Rodgers, J.; O'Shea, P.G.; Stratakis, D.; Tian, K.; Vay, J.-L.; Walter, M.

    2007-01-01

    Control of the time-dependent characteristics of the beam pulse, beginning when it is born from the source, is important for obtaining adequate beam intensity on a target. Recent experimental measurements combined with the new mesh-refinement capability in WARP have improved the understanding of time-dependent beam characteristics beginning at the source, as well as the predictive ability of the simulation codes. The University of Maryland Electron Ring (UMER), because of its ease of operation and flexible diagnostics has proved particularly useful for benchmarking WARP by comparing simulation to measurement. One source of significant agreement has been in the ability of three-dimensional WARP simulations to predict the onset of virtual cathode oscillations in the vicinity of the cathode grid in the UMER gun, and the subsequent measurement of the predicted oscillations

  3. Non-uniform dwell times in line source high dose rate brachytherapy: physical and radiobiological considerations

    International Nuclear Information System (INIS)

    Jones, B.; Tan, L.T.; Freestone, G.; Bleasdale, C.; Myint, S.; Littler, J.

    1994-01-01

    The ability to vary source dwell times in high dose rate (HDR) brachytherapy allows for the use of non-uniform dwell times along a line source. This may have advantages in the radical treatment of tumours, depending on individual tumour geometry. This study investigates the potential improvements in local tumour control relative to adjacent normal tissue isoeffects when intratumour source dwell times are increased along the central portion of a line source (technique A) in radiotherapy schedules which include a relatively small component of HDR brachytherapy. Such a technique is predicted to increase local control for tumours of diameters ranging between 2 cm and 4 cm by up to 11% compared with a technique in which there are uniform dwell times along the line source (technique B). There is no difference in the local control rates for the two techniques when used to treat smaller tumours. Normal tissue doses are also modified by the technique used. Technique A produces higher normal tissue doses at points perpendicular to the centre of the line source and lower doses at points nearer the ends of the line source if the prescription point is not in the central plane of the line source. Alternatively, if the dose is prescribed at a point in the central plane of the line source, the dose at all the normal tissue points is lower when technique A is used. (author)
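The geometric effect of weighting dwell times toward the centre can be seen with a toy inverse-square dose sum over dwell positions. Real HDR planning uses the TG-43 formalism (air-kerma strength, radial dose function, anisotropy); this sketch strips all of that away, and the positions, times, and source strength are invented.

```python
# Toy inverse-square dose sum: dose(point) = sum_i S * t_i / r_i^2.
def dose(point, dwell_positions, dwell_times, strength=1.0):
    total = 0.0
    for (x, y), t in zip(dwell_positions, dwell_times):
        r2 = (point[0] - x) ** 2 + (point[1] - y) ** 2
        total += strength * t / r2
    return total

# Five dwell positions along a line source (cm), two dwell-time patterns
# with the same total time:
positions = [(z, 0.0) for z in (-2.0, -1.0, 0.0, 1.0, 2.0)]
uniform = [10.0] * 5                      # technique B: equal dwell times
weighted = [6.0, 10.0, 18.0, 10.0, 6.0]  # technique A: longer central dwells

p = (0.0, 2.0)  # point perpendicular to the centre of the line source
```

At the same total dwell time, the centre-weighted pattern delivers more dose perpendicular to the source centre, mirroring the abstract's observation about technique A.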

  4. Impact source identification in finite isotropic plates using a time-reversal method: theoretical study

    International Nuclear Information System (INIS)

    Chen, Chunlin; Yuan, Fuh-Gwo

    2010-01-01

    This paper aims to identify impact sources on plate-like structures based on the synthetic time-reversal (T-R) concept using an array of sensors. The impact source characteristics, namely, impact location and impact loading time history, are reconstructed using the invariance of time-reversal concept, reciprocal theory, and signal processing algorithms. Numerical verification for two finite isotropic plates under low and high velocity impacts is performed to demonstrate the versatility of the synthetic T-R method for impact source identification. The results show that the impact location and time history of the impact force with various shapes and frequency bands can be readily obtained with only four sensors distributed around the impact location. The effects of time duration and the inaccuracy in the estimated impact location on the accuracy of the time history of the impact force using the T-R method are investigated. Since the T-R technique retraces all the multi-paths of reflected waves from the geometrical boundaries back to the impact location, it is well suited for quantifying the impact characteristics for complex structures. In addition, this method is robust against noise and it is suggested that a small number of sensors is sufficient to quantify the impact source characteristics through simple computation; thus it holds promise for the development of passive structural health monitoring (SHM) systems for impact monitoring in near real-time
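A discrete stand-in for the synthetic time-reversal focusing step is delay-and-sum over a candidate grid: advance each recorded trace by the candidate travel time and keep the point where the pulses align. The 1D sensor geometry, wave speed, and impact coordinate below are invented, and real T-R processing back-propagates full waveforms including boundary reflections rather than single delays.

```python
import numpy as np

c = 1.0                                   # wave speed (grid units per sample)
sensors = np.array([0.0, 10.0, 20.0, 30.0])
src = 12.0                                # true impact coordinate (to recover)

n = 128
pulse = np.zeros(n)
pulse[40] = 1.0
# Each sensor records the pulse delayed by its travel time (nearest sample).
records = np.zeros((len(sensors), n))
for i, s in enumerate(sensors):
    d = int(round(abs(s - src) / c))
    records[i] = np.roll(pulse, d)

def focus(x):
    """Advance each trace by the candidate travel time and sum (T-R focusing)."""
    out = np.zeros(n)
    for i, s in enumerate(sensors):
        d = int(round(abs(s - x) / c))
        out += np.roll(records[i], -d)
    return out.max()

grid = np.arange(0.0, 31.0, 1.0)
estimate = grid[np.argmax([focus(x) for x in grid])]
```

Only at the true impact location do all four advanced traces align, so the focused amplitude reaches the number of sensors there and falls off elsewhere.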

  5. Time dependence of the field energy densities surrounding sources: Application to scalar mesons near point sources and to electromagnetic fields near molecules

    International Nuclear Information System (INIS)

    Persico, F.; Power, E.A.

    1987-01-01

    The time dependence of the dressing-undressing process, i.e., the acquiring or losing by a source of a boson field intensity and hence of a field energy density in its neighborhood, is considered by examining some simple soluble models. First, the loss of the virtual field is followed in time when a point source is suddenly decoupled from a neutral scalar meson field. Second, an initially bare point source acquires a virtual meson cloud as the coupling is switched on. The third example is that of an initially bare molecule interacting with the vacuum of the electromagnetic field to acquire a virtual photon cloud. In all three cases the dressing-undressing is shown to take place within an expanding sphere of radius r = ct centered at the source. At each point in space the energy density tends, for large times, to that of the ground state of the total system. Differences in the time dependence of the dressing between the massive scalar field and the massless electromagnetic field are discussed. The results are also briefly discussed in the light of Feinberg's ideas on the nature of half-dressed states in quantum field theory

  6. Rietveld refinement with time-of-flight powder diffraction data from pulsed neutron sources

    International Nuclear Information System (INIS)

    David, W.I.F.; Jorgensen, J.D.

    1990-10-01

    The recent development of accelerator-based pulsed neutron sources has led to the widespread use of the time-of-flight technique for neutron powder diffraction. The properties of the pulsed source make possible unusually high resolution over a wide range of d spacings, high count rates, and the ability to collect complete data at fixed scattering angles. The peak shape and other instrument characteristics can be accurately modelled, which make Rietveld refinement possible for complex structures. In this paper we briefly review the development of the Rietveld method for time-of-flight diffraction data from pulsed neutron sources and discuss the latest developments in high resolution instrumentation and advanced Rietveld analysis methods. 50 refs., 12 figs., 14 tabs
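The first step of any time-of-flight Rietveld refinement is converting flight time to d-spacing; a common calibration is the linear relation t = DIFC·d + TZERO (refinement packages often add a small quadratic DIFA term). The constants and peak positions below are invented for illustration, not those of any instrument in the review.

```python
# Linear TOF calibration t = DIFC*d + TZERO (illustrative constants).
DIFC = 5000.0   # microseconds per angstrom
TZERO = -2.5    # microseconds

def d_spacing(tof_us):
    """Convert time-of-flight (microseconds) to d-spacing (angstroms)."""
    return (tof_us - TZERO) / DIFC

def tof(d_angstrom):
    """Inverse conversion, d-spacing back to time-of-flight."""
    return DIFC * d_angstrom + TZERO

peaks_us = [9997.5, 14997.5]               # hypothetical Bragg peak positions
d_values = [d_spacing(t) for t in peaks_us]
```

With fixed scattering angles, the full d-spacing pattern is collected from a single detector bank purely through this time axis, which is what enables the wide d-range the abstract mentions.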

  7. OpenPSTD : The open source implementation of the pseudospectral time-domain method

    NARCIS (Netherlands)

    Krijnen, T.; Hornikx, M.C.J.; Borkowski, B.

    2014-01-01

    An open source implementation of the pseudospectral time-domain method for the propagation of sound is presented, which is geared towards applications in the built environment. Being a wave-based method, PSTD captures phenomena like diffraction, but maintains efficiency in processing time and memory.
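The core idea of pseudospectral methods is computing spatial derivatives in the wavenumber domain: transform, multiply by ik, transform back. The sketch below differentiates a periodic function spectrally with NumPy; it illustrates the principle only and is not OpenPSTD's actual API.

```python
import numpy as np

n = 64
L = 2.0 * np.pi
x = np.arange(n) * L / n
# Spectral wavenumbers as i*k, matching numpy's FFT frequency ordering.
k = 2.0j * np.pi * np.fft.fftfreq(n, d=L / n)

def spectral_derivative(u):
    """d/dx of a periodic sampled function via FFT (pseudospectral core)."""
    return np.real(np.fft.ifft(k * np.fft.fft(u)))

u = np.sin(x)
du = spectral_derivative(u)   # matches cos(x) to near machine precision
```

For smooth fields this converges spectrally, which is why PSTD needs only about two grid points per wavelength where low-order finite differences need many more.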

  8. Invited Article: Characterization of background sources in space-based time-of-flight mass spectrometers

    International Nuclear Information System (INIS)

    Gilbert, J. A.; Gershman, D. J.; Gloeckler, G.; Lundgren, R. A.; Zurbuchen, T. H.; Orlando, T. M.; McLain, J.; Steiger, R. von

    2014-01-01

    For instruments that use time-of-flight techniques to measure space plasma, there are common sources of background signals that evidence themselves in the data. The background from these sources may increase the complexity of data analysis and reduce the signal-to-noise response of the instrument, thereby diminishing the science value or usefulness of the data. This paper reviews several sources of background commonly found in time-of-flight mass spectrometers and illustrates their effect in actual data using examples from ACE-SWICS and MESSENGER-FIPS. Sources include penetrating particles and radiation, UV photons, energy straggling and angular scattering, electron stimulated desorption of ions, ion-induced electron emission, accidental coincidence events, and noise signatures from instrument electronics. Data signatures of these sources are shown, as well as mitigation strategies and design considerations for future instruments

  9. Convocation address.

    Science.gov (United States)

    Zakaria, R

    1996-07-01

    By means of this graduation address at the International Institute for Population Sciences (IIPS) in Bombay, the Chancellor of Urdu University voiced his concerns about overpopulation in India. During the speaker's tenure as Health Minister of Maharashtra, he implemented a sterilization incentive program that resulted in the state's having the best family planning (FP) statistics in India for almost 10 years. The incentive program, however, was misused by overenthusiastic officials in other states, with the result that the FP program was renamed the Family Welfare Programme. Population is growing in India because of improvements in health care, but the population education necessary to change fertility will require more time than the seriousness of the population problem allows. In the long term, poverty and illiteracy must be addressed to control population. In the meantime, the graduate program at the IIPS should be expanded to include an undergraduate program, marriage age laws should be enforced, and misconceptions about religious objections to FP must be addressed. India cannot afford to use the measures forwarded by developed countries to control population growth. India must integrate population control efforts with the provision of health care, because if population continues to grow in the face of reduced infant mortality and longer life expectancy, future generations will be forced to live in a state of poverty and economic degradation.

  10. Time-Dependent Moment Tensors of the First Four Source Physics Experiments (SPE) Explosions

    Science.gov (United States)

    Yang, X.

    2015-12-01

    We use mainly vertical-component geophone data within 2 km from the epicenter to invert for time-dependent moment tensors of the first four SPE explosions: SPE-1, SPE-2, SPE-3 and SPE-4Prime. We employ a one-dimensional (1D) velocity model developed from P- and Rg-wave travel times for Green's function calculations. The attenuation structure of the model is developed from P- and Rg-wave amplitudes. We select data for the inversion based on the criterion that they show travel times and amplitude behavior consistent with those predicted by the 1D model. Due to limited azimuthal coverage of the sources and the mostly vertical-component-only nature of the dataset, only long-period, diagonal components of the moment tensors are well constrained. Nevertheless, the moment tensors, particularly their isotropic components, provide reasonable estimates of the long-period source amplitudes as well as estimates of corner frequencies, albeit with larger uncertainties. The estimated corner frequencies, however, are consistent with estimates from ratios of seismogram spectra from different explosions. These long-period source amplitudes and corner frequencies cannot be fit by classical P-wave explosion source models. The results motivate the development of new P-wave source models suitable for these chemical explosions. To that end, we fit inverted moment-tensor spectra by modifying the classical explosion model using regressions of estimated source parameters. Although the number of data points used in the regression is small, the approach suggests a way for developing the new model when more data are collected.
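Fitting a source spectrum for a corner frequency can be sketched with a generic single-corner model Ω(f) = Ω0 / (1 + (f/fc)²) and a grid search in log-amplitude misfit. Both the model form and all numbers here are illustrative stand-ins, not the modified explosion model the abstract develops.

```python
import numpy as np

# Synthetic "observed" spectrum generated from the generic model itself.
f = np.linspace(0.5, 20.0, 40)          # frequency samples (Hz)
true_omega0, true_fc = 5.0, 4.0         # long-period level and corner frequency
obs = true_omega0 / (1.0 + (f / true_fc) ** 2)

def misfit(omega0, fc):
    """Sum of squared log-amplitude residuals against the observed spectrum."""
    pred = omega0 / (1.0 + (f / fc) ** 2)
    return np.sum((np.log(pred) - np.log(obs)) ** 2)

# Grid search over corner frequency with the long-period level held fixed.
grid_fc = np.arange(1.0, 10.01, 0.25)
best_fc = min(grid_fc, key=lambda fc: misfit(true_omega0, fc))
```

On real moment-tensor spectra both parameters (and any extra shape terms) would be searched jointly, and the residuals weighted by the larger high-frequency uncertainties.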

  11. Digital time stamping system based on open source technologies.

    Science.gov (United States)

    Miskinis, Rimantas; Smirnov, Dmitrij; Urba, Emilis; Burokas, Andrius; Malysko, Bogdan; Laud, Peeter; Zuliani, Francesco

    2010-03-01

    A digital time stamping system based on open source technologies (LINUX-UBUNTU, OpenTSA, OpenSSL, MySQL) is described in detail, including all important testing results. The system, called BALTICTIME, was developed under a project sponsored by the European Commission under the Program FP 6. It was designed to meet the requirements imposed on systems of legal and accountable time stamping and to be applicable to the hardware commonly used by the national time metrology laboratories. The BALTICTIME system is intended for the use of governmental and other institutions as well as personal bodies. Testing results demonstrate that the time stamps issued to the user by BALTICTIME and saved in BALTICTIME's archives (which implies that the time stamps are accountable) meet all the regulatory requirements. Moreover, the BALTICTIME in its present implementation is able to issue more than 10 digital time stamps per second. The system can be enhanced if needed. The test version of the BALTICTIME service is free and available at http://baltictime.pfi.lt:8080/btws/ and http://baltictime.lnmc.lv:8080/btws/.
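Conceptually, a time-stamp token binds a document hash to a time value under the authority's key, so anyone can later verify that the document existed at that time. The sketch below uses a shared-secret HMAC purely for illustration; a real RFC 3161 service such as BALTICTIME signs the request with X.509 certificates via OpenSSL, and the key and timestamp here are made up.

```python
import hashlib
import hmac

AUTHORITY_KEY = b"demo-key-not-secret"   # stand-in for the authority's signing key

def request_digest(document: bytes) -> str:
    """Client side: only the hash of the document is sent, not the document."""
    return hashlib.sha256(document).hexdigest()

def issue_token(digest: str, timestamp: int) -> dict:
    """Authority side: bind digest and time under the authority key."""
    payload = f"{digest}|{timestamp}".encode()
    tag = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest()
    return {"digest": digest, "time": timestamp, "tag": tag}

def verify_token(document: bytes, token: dict) -> bool:
    """Recompute the binding and compare in constant time."""
    payload = f"{request_digest(document)}|{token['time']}".encode()
    expected = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["tag"])

doc = b"contract text"
token = issue_token(request_digest(doc), 1262304000)  # arbitrary epoch time
```

Any change to the document or the recorded time breaks verification, which is the accountability property the abstract's testing addresses.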

  12. New developments with H-sources

    International Nuclear Information System (INIS)

    Sherman, Joseph D.; Rouleau, G.

    2002-01-01

    Existing spallation neutron source upgrades, planned spallation neutron sources, and high-energy accelerators for particle physics place demanding requirements on H- sources. These requirements ask for increased beam currents and duty factor (df) while generally maintaining state-of-the-art H- source emittance. A variety of H- sources are being developed to address these challenges. These include volume sources with and without the addition of cesium for enhanced H- production, increased-df cesiated H- Penning and magnetron sources, and cesiated surface-converter H- sources. Surface films of tantalum metal for enhanced volume H- production are also being studied. Innovative plasma production techniques to address the longer df requirement without sacrificing H- source reliability and lifetime will be reviewed. The physical bases, the goals, and perceived challenges will be discussed.

  13. Space-time structure of neutron and X-ray sources in a plasma focus

    International Nuclear Information System (INIS)

    Bostick, W.H.; Nardi, V.; Prior, W.

    1977-01-01

    Systematic measurements with paraffin collimators of the neutron emission intensity have been completed on a plasma focus with a 15-20 kV capacitor bank (hollow centre electrode; discharge period T ≈ 8 μs; D₂ filling at 4-8 torr). The space resolution was 1 cm or better. These data indicate that at least 70% of the total neutron yield originates within hot-plasma regions where electron beams and high-energy D beams (≳ 0.1-1 MeV) are produced. The neutron source is composed of several (≳ 1-10) space-localized sources of different intensity, each with a duration ≲ 5 ns (FWHM). Localized neutron sources and hard (≳ 100 keV) X-ray sources have the same time multiplicity and are usually distributed in two groups over a time interval 40-400 ns long. By the mode of operation used by the authors, one group of localized sources (Burst II) is observed 200-400 ns after the other group (Burst I) and its space distribution is broader than for Burst I. The maximum intensity of a localized source of neutrons in Burst I is much higher than the maximum intensity in Burst II. Secondary reactions T(D,n)⁴He (from the tritium produced only by primary reactions in the same discharge; no tritium was used in filling the discharge chamber) are observed in time coincidence with the strongest D-D neutron pulse of Burst I. The neutron signal from a localized source with high intensity has a relatively long tail of small amplitude (tail area ≲ 0.2 × peak area). This tail can be generated by the D-D reactions of the unconfined part of an ion beam in the cold plasma. Complete elimination of scattered neutrons on the detector was achieved in these measurements. (author)

  14. Real-time control using open source RTOS

    Science.gov (United States)

    Irwin, Philip C.; Johnson, Richard L., Jr.

    2002-12-01

    Complex telescope systems such as interferometers tend to rely heavily on hard real-time operating systems (RTOS). It has been standard practice at NASA's Jet Propulsion Laboratory (JPL) and many other institutions to use costly commercial RTOSs and hardware. After developing a real-time toolkit for VxWorks on the PowerPC platform (dubbed RTC), the interferometry group at JPL is porting this code to the Real-Time Application Interface (RTAI), an open source RTOS that is essentially an extension to the Linux kernel. This port has the potential to reduce software and hardware costs for future projects, while increasing the level of performance. The goals of this paper are to briefly describe the RTC toolkit, highlight the successes and pitfalls of porting the toolkit from VxWorks to Linux-RTAI, and to discuss future enhancements that will be implemented as a direct result of this port. The first port of any body of code is always the most difficult, since it uncovers the OS-specific calls and forces "red flags" into those portions of the code. For this reason, it has also been a huge benefit that the project chose a generic, platform-independent OS extension, ACE, and its CORBA counterpart, TAO. This port of RTC will pave the way for conversions to other environments, the most interesting of which is a non-real-time simulation environment currently being considered by the Space Interferometry Mission (SIM) and the Terrestrial Planet Finder (TPF) projects.

  15. Time Reversal Migration for Passive Sources Using a Maximum Variance Imaging Condition

    KAUST Repository

    Wang, H.; Alkhalifah, Tariq Ali

    2017-01-01

    The conventional time-reversal imaging approach for micro-seismic or passive source location is based on focusing the back-propagated wavefields from each recorded trace in a source image. It suffers from strong background noise and limited acquisition aperture, which may create unexpected artifacts and cause errors in the source location. To overcome this problem, we propose a new imaging condition for microseismic imaging, which is based on comparing the amplitude variance in certain windows, and use it to suppress the artifacts as well as find the right location for passive sources. Instead of simply searching for the maximum energy point in the back-propagated wavefield, we calculate the amplitude variances over a window moving along both the space and time axes to create a highly resolved passive event image. The variance operation has negligible cost compared with the forward/backward modeling operations, which shows that the maximum variance imaging condition is efficient and effective. We test our approach numerically on a simple three-layer model and on a piece of the Marmousi model as well, both of which have shown reasonably good results.

  16. Time Reversal Migration for Passive Sources Using a Maximum Variance Imaging Condition

    KAUST Repository

    Wang, H.

    2017-05-26

    The conventional time-reversal imaging approach for micro-seismic or passive source location is based on focusing the back-propagated wavefields from each recorded trace in a source image. It suffers from strong background noise and limited acquisition aperture, which may create unexpected artifacts and cause errors in the source location. To overcome this problem, we propose a new imaging condition for microseismic imaging, which is based on comparing the amplitude variance in certain windows, and use it to suppress the artifacts as well as find the right location for passive sources. Instead of simply searching for the maximum energy point in the back-propagated wavefield, we calculate the amplitude variances over a window moving along both the space and time axes to create a highly resolved passive event image. The variance operation has negligible cost compared with the forward/backward modeling operations, which shows that the maximum variance imaging condition is efficient and effective. We test our approach numerically on a simple three-layer model and on a piece of the Marmousi model as well, both of which have shown reasonably good results.
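The imaging condition itself reduces to a simple operation: slide a window over the back-propagated wavefield, compute the amplitude variance in each window, and take the maximum. The sketch below applies it to a synthetic 1D-space × time wavefield with an injected focusing event; the geometry, noise level, and window size are invented, and the real method works on wavefields from forward/backward modeling.

```python
import numpy as np

# Synthetic back-propagated wavefield: background noise plus a short burst
# of focused energy at spatial index 60 (the "source").
rng = np.random.default_rng(0)
nx, nt = 101, 400
wavefield = 0.1 * rng.standard_normal((nx, nt))
wavefield[60, 195:205] += 3.0

def variance_image(w, half=5):
    """Maximum windowed amplitude variance over time, per spatial point."""
    nx, nt = w.shape
    img = np.zeros(nx)
    for ix in range(nx):
        best = 0.0
        for it in range(half, nt - half):
            best = max(best, w[ix, it - half:it + half + 1].var())
        img[ix] = best
    return img

img = variance_image(wavefield)
source_ix = int(np.argmax(img))   # recovered source location
```

Unlike a plain maximum-energy search, the variance emphasizes windows where the amplitude changes sharply, which is what suppresses smooth, noise-driven artifacts.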

  17. The Source Signature Estimator - System Improvements and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sabel, Per; Brink, Mundy; Eidsvig, Seija; Jensen, Lars

    1998-12-31

    This presentation relates briefly to the first part of the joint project on post-survey analysis of shot-by-shot based source signature estimation. The improvements of a Source Signature Estimator system are analysed. The notional source method can give suboptimal results when the real array geometry, i.e., the actual separations between the sub-arrays of an air gun array, is not input to the notional source algorithm. This constraint has been addressed herein and was implemented for the first time in the field in summer 1997. The second part of this study will show the potential advantages for interpretation when the signature estimates are applied in the data processing. 5 refs., 1 fig.

  18. Evaluating four-dimensional time-lapse electrical resistivity tomography for monitoring DNAPL source zone remediation.

    Science.gov (United States)

    Power, Christopher; Gerhard, Jason I; Karaoulis, Marios; Tsourlos, Panagiotis; Giannopoulos, Antonios

    2014-07-01

    Practical, non-invasive tools do not currently exist for mapping the remediation of dense non-aqueous phase liquids (DNAPLs). Electrical resistivity tomography (ERT) exhibits significant potential but has not yet become a practitioner's tool due to challenges in interpreting the survey results at real sites. This study explores the effectiveness of recently developed four-dimensional (4D, i.e., 3D space plus time) time-lapse surface ERT to monitor DNAPL source zone remediation. A laboratory experiment demonstrated the approach for mapping a changing NAPL distribution over time. A recently developed DNAPL-ERT numerical model was then employed to independently simulate the experiment, providing confidence that the DNAPL-ERT model is a reliable tool for simulating real systems. The numerical model was then used to evaluate the potential for this approach at the field scale. Four DNAPL source zones, exhibiting a range of complexity, were initially simulated, followed by modeled time-lapse ERT monitoring of complete DNAPL remediation by enhanced dissolution. 4D ERT inversion provided estimates of the regions of the source zone experiencing mass reduction with time. Results show that 4D time-lapse ERT has significant potential to map both the outline and the center of mass of the evolving treated portion of the source zone to within a few meters in each direction. In addition, the technique can provide a reasonable, albeit conservative, estimate of the DNAPL volume remediated with time: 25% underestimation in the upper 2m and up to 50% underestimation at late time between 2 and 4m depth. The technique is less reliable for identifying cleanup of DNAPL stringers outside the main DNAPL body. Overall, this study demonstrates that 4D time-lapse ERT has potential for mapping where and how quickly DNAPL mass changes in real time during site remediation. Copyright © 2014 Elsevier B.V. All rights reserved.
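The time-lapse comparison at the heart of this approach can be sketched as a percent change between a baseline and a later resistivity inversion, thresholded to outline the treated region and locate its centre of mass. The grid, resistivity values, and 30% threshold below are illustrative choices, not values from the study, and real surveys compare full 3D inverted models rather than known fields.

```python
import numpy as np

# Baseline and monitor resistivity models (ohm-m) on a toy 2D grid; the
# "treatment" replaces part of the background with a conductive zone.
baseline = np.full((10, 10), 100.0)
monitor = baseline.copy()
monitor[3:6, 4:8] = 55.0

# Percent resistivity change, then a threshold mask for significant decrease.
pct_change = 100.0 * (monitor - baseline) / baseline
treated = pct_change < -30.0

# Outline and centre of mass of the detected treated region.
rows, cols = np.nonzero(treated)
centre = (rows.mean(), cols.mean())
```

In field data the mask boundary and centre carry the few-metre uncertainties the abstract reports, since inversion smears the true resistivity contrast.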

  19. Time course of effects of emotion on item memory and source memory for Chinese words.

    Science.gov (United States)

    Wang, Bo; Fu, Xiaolan

    2011-05-01

    Although many studies have investigated the effect of emotion on memory, it is unclear whether the effect of emotion extends to all aspects of an event. It is also poorly understood how the effects of emotion on item memory and source memory change over time. This study examined the time course of the effects of emotion on item memory and source memory. Participants intentionally learned a list of neutral, positive, and negative Chinese words, presented twice, and then took a free recall test, followed by recognition and source memory tests, at one of eight delayed time points. The main findings (within a time frame of 2 weeks) are: (1) Negative emotion enhances free recall, whereas there is only a trend for positive emotion to enhance free recall. In addition, negative and positive emotions reach their greatest effect on free recall at different time points. (2) Negative emotion reduces recognition, whereas positive emotion has no effect on recognition. (3) Neither positive nor negative emotion has any effect on source memory. These findings indicate that the effect of emotion does not necessarily extend to all aspects of an event and that valence is a critical modulating factor in the effect of emotion on item memory. Furthermore, emotion does not affect the time course of item memory and source memory, at least within a time frame of 2 weeks. This study has implications for establishing a theoretical model of the effect of emotion on memory. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Real-time tunability of chip-based light source enabled by microfluidic mixing

    DEFF Research Database (Denmark)

    Olsen, Brian Bilenberg; Rasmussen, Torben; Balslev, Søren

    2006-01-01

    We demonstrate real-time tunability of a chip-based liquid light source enabled by microfluidic mixing. The mixer and light source are fabricated in SU-8 which is suitable for integration in SU-8-based laboratory-on-a-chip microsystems. The tunability of the light source is achieved by changing...... the concentration of rhodamine 6G dye inside two integrated vertical resonators, since both the refractive index and the gain profile are influenced by the dye concentration. The effect on the refractive index and the gain profile of rhodamine 6G in ethanol is investigated and the continuous tuning of the laser...

  1. Separation of blended impulsive sources using an iterative estimation-subtraction algorithm

    NARCIS (Netherlands)

    Doulgeris, P.; Mahdad, A.; Blacquière, G.

    2010-01-01

    Traditional data acquisition practice dictates the existence of sufficient time intervals between the firing of sequential impulsive sources in the field. However, much attention has been drawn recently to the possibility of shooting in an overlapping fashion. Numerous publications have addressed

  2. Counting addressing method: Command addressable element and extinguishing module

    Directory of Open Access Journals (Sweden)

    Ristić Jovan D.

    2009-01-01

    Full Text Available The specific requirements that arise in addressable fire detection and alarm systems and the shortcomings of existing addressing methods are discussed. A new method of addressing detectors is proposed. The basic principles of addressing and of the response of a called element are stated. The extinguishing module is a specific subsystem in classic fire detection and alarm systems. The appearance of addressable fire detection and alarm systems did not essentially change the concept of the extinguishing module because of the long polling period of such systems. An addressable fire security system based on the counting addressing method achieves high polling rates and enables integration of the extinguishing module into the addressable system. Solutions for the command addressable element and the integrated extinguishing module are given in this paper. The counting addressing method was developed for the specific requirements of fire detection and alarm systems, yet its speed and reliability justify its use in the acquisition of data on slowly varying parameters under industrial conditions.

  3. Travel-time source-specific station correction improves location accuracy

    Science.gov (United States)

    Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo

    2013-04-01

    Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy depends on the degree of knowledge of the 3-D seismic wave velocity structure of the Earth. It is well known that modeling errors in calculated travel times may shift the computed epicenters far from the real locations, by distances even larger than the size of the statistical error ellipses, regardless of the accuracy of seismic phase arrival picks. Large mislocations of seismic events are particularly critical in the context of CTBT verification, where they may trigger a possible On-Site Inspection (OSI): the Treaty establishes that an OSI area cannot be larger than 1000 km2, and that its largest linear dimension cannot exceed 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran and Italy. We show that epicentral mislocations of the order of 10-20 km, as well as larger mislocations in hypocentral depth, calculated from a global seismic network using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
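
    The station-correction idea can be sketched in a few lines: average the travel-time residuals of well-located reference events at each station, then subtract these source-specific corrections from the picks of new events. All numbers below are hypothetical illustrations, not the study's data:

```python
import numpy as np

# Hypothetical residuals (observed minus model-predicted travel times, seconds)
# for reference events with well-constrained locations.
# residuals[i, j] = residual of reference event i at station j
residuals = np.array([
    [0.8, -0.3, 1.1],
    [0.9, -0.2, 1.3],
    [0.7, -0.4, 1.2],
])

# Source-specific station correction: mean residual per station over the
# well-located reference events in the source region.
corrections = residuals.mean(axis=0)

# A new event's picks are corrected before relocation:
new_picks = np.array([35.6, 42.1, 51.0])   # observed arrival times (s)
predicted = np.array([34.7, 42.5, 49.9])   # model-predicted times (s)
corrected_residuals = (new_picks - predicted) - corrections
print(corrections)           # systematic per-station bias
print(corrected_residuals)   # much smaller residuals after correction
```

    After subtracting the corrections, the residuals that drive the location algorithm reflect picking noise rather than systematic velocity-model error.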

  4. GANIL Workshop on Ion Sources; Journees Sources d'Ions

    Energy Technology Data Exchange (ETDEWEB)

    Leroy, Renan [Grand Accelerateur National d' Ions Lourds (GANIL), 14 - Caen (France)

    1999-07-01

    The proceedings of the GANIL Workshop on Ion Sources, held at GANIL - Caen on 18-19 March 1999, contain 13 papers aimed at improving the operation of existing sources and developing new types of sources for nuclear research and studies of ion physics. A number of reports are devoted to applications like surface treatment, ion implantation or fusion injection. The 1+ → n+ transformation, charged particle transport in ECR sources, the addition of cesium and xenon in negative ion sources, and other basic issues in ion sources are also addressed.

  5. High-resolution and super stacking of time-reversal mirrors in locating seismic sources

    KAUST Repository

    Cao, Weiping; Hanafy, Sherif M.; Schuster, Gerard T.; Zhan, Ge; Boonyasiriwat, Chaiwoot

    2011-01-01

    Time reversal mirrors can be used to backpropagate and refocus incident wavefields to their actual source location, with the subsequent benefits of imaging with high-resolution and super-stacking properties. These benefits of time reversal mirrors

  6. Multi-source least-squares reverse time migration

    KAUST Repository

    Dai, Wei

    2012-06-15

    Least-squares migration has been shown to improve image quality compared to the conventional migration method, but its computational cost is often too high to be practical. In this paper, we develop two numerical schemes to implement least-squares migration with the reverse time migration method and the blended source processing technique to increase computational efficiency. By iterative migration of supergathers, which consist of sums of many phase-encoded shots, the image quality is enhanced and the crosstalk noise associated with the encoded shots is reduced. Numerical tests on 2D HESS VTI data show that the multisource least-squares reverse time migration (LSRTM) algorithm suppresses migration artefacts, balances the amplitudes, improves image resolution and reduces crosstalk noise associated with the blended shot gathers. For this example, the multisource LSRTM is about three times faster than the conventional RTM method. For the 3D example of the SEG/EAGE salt model, with a comparable computational cost, multisource LSRTM produces images with more accurate amplitudes, better spatial resolution and fewer migration artefacts compared to conventional RTM. The empirical results suggest that multisource LSRTM can produce more accurate reflectivity images than conventional RTM does with a similar or lower computational cost. The caveat is that the LSRTM image is sensitive to large errors in the migration velocity model. © 2012 European Association of Geoscientists & Engineers.
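
    The blended-source idea behind the supergathers can be sketched as follows (a toy illustration, not the authors' LSRTM code): each shot is phase-encoded, here with a random polarity and time shift, and all encoded shots are summed into one supergather; decoding one shot recovers it plus crosstalk from the others, which iterative least-squares migration progressively suppresses.

```python
import numpy as np

rng = np.random.default_rng(0)
n_shots, n_samples = 8, 256

# Hypothetical shot gathers (one trace per shot for simplicity)
shots = rng.standard_normal((n_shots, n_samples))

# Phase encoding: random polarity (+/-1) and random time shift per shot
polarity = rng.choice([-1.0, 1.0], size=n_shots)
shift = rng.integers(0, 32, size=n_shots)

# Blend all encoded shots into a single supergather
supergather = np.zeros(n_samples)
for s in range(n_shots):
    supergather += polarity[s] * np.roll(shots[s], shift[s])

# Decoding shot 0: undo its own code; the other shots remain as crosstalk
decoded = polarity[0] * np.roll(supergather, -shift[0])
crosstalk = decoded - shots[0]
print(np.abs(crosstalk).mean())   # residual crosstalk energy
```

    Migrating one supergather costs roughly as much as migrating one shot, which is the source of the speedup reported in the abstract.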

  7. Multi-source least-squares reverse time migration

    KAUST Repository

    Dai, Wei; Fowler, Paul J.; Schuster, Gerard T.

    2012-01-01

    Least-squares migration has been shown to improve image quality compared to the conventional migration method, but its computational cost is often too high to be practical. In this paper, we develop two numerical schemes to implement least-squares migration with the reverse time migration method and the blended source processing technique to increase computational efficiency. By iterative migration of supergathers, which consist of sums of many phase-encoded shots, the image quality is enhanced and the crosstalk noise associated with the encoded shots is reduced. Numerical tests on 2D HESS VTI data show that the multisource least-squares reverse time migration (LSRTM) algorithm suppresses migration artefacts, balances the amplitudes, improves image resolution and reduces crosstalk noise associated with the blended shot gathers. For this example, the multisource LSRTM is about three times faster than the conventional RTM method. For the 3D example of the SEG/EAGE salt model, with a comparable computational cost, multisource LSRTM produces images with more accurate amplitudes, better spatial resolution and fewer migration artefacts compared to conventional RTM. The empirical results suggest that multisource LSRTM can produce more accurate reflectivity images than conventional RTM does with a similar or lower computational cost. The caveat is that the LSRTM image is sensitive to large errors in the migration velocity model. © 2012 European Association of Geoscientists & Engineers.

  8. The effect of interaural-time-difference fluctuations on apparent source width

    DEFF Research Database (Denmark)

    Käsbach, Johannes; May, Tobias; Oskarsdottir, Gudrun

    2014-01-01

    For the perception of spaciousness, the temporal fluctuations of the interaural time differences (ITDs) and interaural level differences (ILDs) provide important binaural cues. One major characteristic of spatial perception is apparent source width (ASW), which describes the perceived width of a ...

  9. openPSTD: The open source pseudospectral time-domain method for acoustic propagation

    Science.gov (United States)

    Hornikx, Maarten; Krijnen, Thomas; van Harten, Louis

    2016-06-01

    An open source implementation of the Fourier pseudospectral time-domain (PSTD) method for computing the propagation of sound is presented, geared towards applications in the built environment. Being a wave-based method, PSTD captures phenomena like diffraction, but maintains efficiency in processing time and memory usage because it allows spatial sampling close to the Nyquist criterion, keeping both the required spatial and temporal resolution coarse. In the implementation, the physical geometry is modelled as a composition of rectangular two-dimensional subdomains, initially restricting the implementation to orthogonal, two-dimensional situations. The strategy of using subdomains divides the problem domain into local subsets, which enables the simulation software to be built according to Object-Oriented Programming best practices and leaves room for further computational parallelization. The software is built using the open source components Blender, Numpy and Python, and has itself been published under an open source license. An option has been included to accelerate the calculations by a partial implementation of the code on the Graphics Processing Unit (GPU), which increases the throughput by up to fifteen times. The details of the implementation are reported, as well as the accuracy of the code.
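
    The core of a Fourier pseudospectral scheme is evaluating spatial derivatives in the wavenumber domain, which is what permits sampling close to the Nyquist criterion. A minimal one-dimensional sketch (an illustration of the principle, not the openPSTD code itself):

```python
import numpy as np

# 1D periodic grid
n = 64
L = 2 * np.pi
x = np.arange(n) * L / n
k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi   # angular wavenumbers

def spectral_derivative(f):
    """Spatial derivative via FFT: multiply by i*k in the wavenumber domain.
    Exact up to the Nyquist wavenumber for band-limited fields."""
    return np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

f = np.sin(3 * x)
df = spectral_derivative(f)                   # analytically 3*cos(3x)
err = np.max(np.abs(df - 3 * np.cos(3 * x)))
print(err)   # machine-precision accuracy on a coarse 64-point grid
```

    A finite-difference stencil of comparable accuracy would need many more points per wavelength; this is the efficiency argument made in the abstract.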

  10. An assessment of open source promotion in addressing ICT acceptance challenges in Tanzania

    CSIR Research Space (South Africa)

    Kinyondo, J

    2012-05-01

    Full Text Available challenges. This study is an assessment of such OS promotion efforts in addressing ICT acceptance challenges in Tanzania specifically. The research design includes case studies done on four OS communities using document analysis, a questionnaire, interviews...

  11. High-resolution and super stacking of time-reversal mirrors in locating seismic sources

    KAUST Repository

    Cao, Weiping

    2011-07-08

    Time reversal mirrors can be used to backpropagate and refocus incident wavefields to their actual source location, with the subsequent benefits of imaging with high-resolution and super-stacking properties. These benefits of time reversal mirrors have been previously verified with computer simulations and laboratory experiments but not with exploration-scale seismic data. We now demonstrate the high-resolution and the super-stacking properties in locating seismic sources with field seismic data that include multiple scattering. Tests on both synthetic data and field data show that a time reversal mirror has the potential to exceed the Rayleigh resolution limit by factors of 4 or more. Results also show that a time reversal mirror has a significant resilience to strong Gaussian noise and that accurate imaging of source locations from passive seismic data can be accomplished with traces having signal-to-noise ratios as low as 0.001. Synthetic tests also demonstrate that time reversal mirrors can sometimes enhance the signal by a factor proportional to the square root of the product of the number of traces, denoted as N, and the number of events in the traces. This enhancement property is denoted as super-stacking and greatly exceeds the classical signal-to-noise enhancement factor of √N. High-resolution and super-stacking are properties also enjoyed by seismic interferometry and reverse-time migration with the exact velocity model. © 2011 European Association of Geoscientists & Engineers.
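
    The classical stacking gain that the abstract contrasts with super-stacking can be illustrated numerically (a toy sketch with hypothetical numbers, not the authors' field-data workflow): averaging N traces that share a coherent signal but carry independent unit-variance noise improves the signal-to-noise ratio by roughly √N.

```python
import numpy as np

rng = np.random.default_rng(1)
n_traces, n_samples = 400, 1000

signal = np.sin(2 * np.pi * np.arange(n_samples) / 50)    # coherent event
noise = rng.standard_normal((n_traces, n_samples))        # independent noise

traces = signal + noise            # N noisy copies of the same event
stack = traces.mean(axis=0)        # plain stack

snr_single = signal.std() / noise[0].std()
snr_stack = signal.std() / (stack - signal).std()
gain = snr_stack / snr_single
print(gain, np.sqrt(n_traces))     # gain is close to sqrt(N) = 20
```
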

  12. Real-time implementation of logo detection on open source BeagleBoard

    Science.gov (United States)

    George, M.; Kehtarnavaz, N.; Estevez, L.

    2011-03-01

    This paper presents the real-time implementation of our previously developed logo detection and tracking algorithm on the open source BeagleBoard mobile platform. This platform has an OMAP processor that incorporates an ARM Cortex processor. The algorithm combines Scale Invariant Feature Transform (SIFT) with k-means clustering, online color calibration and moment invariants to robustly detect and track logos in video. Various optimization steps that are carried out to allow the real-time execution of the algorithm on BeagleBoard are discussed. The results obtained are compared to the PC real-time implementation results.
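
    The k-means clustering step named in the abstract can be sketched with plain NumPy (a toy illustration on synthetic 128-dimensional, SIFT-like descriptors; not the authors' BeagleBoard implementation):

```python
import numpy as np

def kmeans(X, centers, iters=20):
    """Minimal k-means: alternate nearest-center assignment and center update."""
    centers = centers.copy()
    for _ in range(iters):
        # assign each descriptor to its nearest cluster center
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned descriptors
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Hypothetical 128-D descriptors drawn from two well-separated logo regions
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 0.05, (50, 128)),
               rng.normal(1.0, 0.05, (50, 128))])

# Simple deterministic initialization: one seed point from each end of the set
centers, labels = kmeans(X, X[[0, len(X) - 1]])
```

    On an embedded target, the quadratic assignment step is the natural candidate for the kind of optimization the paper discusses.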

  13. Application of the unwrapped phase inversion to land data without source estimation

    KAUST Repository

    Choi, Yun Seok

    2015-08-19

    Unwrapped phase inversion with a strong damping was developed to solve the phase wrapping problem in frequency-domain waveform inversion. In this study, we apply the unwrapped phase inversion to band-limited real land data, for which the available minimum frequency is quite high. An important issue of the data is a strong ambiguity of source-ignition time (or source shift) shown in a seismogram. A source-estimation approach does not fully address the issue of source shift, since the velocity model and the source wavelet are updated simultaneously and interact with each other. We suggest a source-independent unwrapped phase inversion approach instead of relying on source-estimation from this land data. In the source-independent approach, the phase of the modeled data converges not to the exact phase value of the observed data, but to the relative phase value (or the trend of phases); thus it has the potential to solve the ambiguity of source-ignition time in a seismogram and work better than the source-estimation approach. Numerical examples show the validation of the source-independent unwrapped phase inversion, especially for land field data having an ambiguity in the source-ignition time.

  14. Real-time software for multi-isotopic source term estimation

    International Nuclear Information System (INIS)

    Goloubenkov, A.; Borodin, R.; Sohier, A.

    1996-01-01

    Consideration is given to the development of software for one of the crucial components of RODOS: assessment of the source rate (SR) from indirect measurements. Four components of the software are described in the paper. The first component is a GRID system, which allows stochastic meteorological and radioactivity fields to be prepared from measured data. The second part is a model of atmospheric transport which can be adapted to emulate practically any gamma dose/spectrum detector. The third is a method which allows space-time and quantitative discrepancies between measured and modelled data to be taken into account simultaneously. It is based on a preference scheme selected by an expert. The last component is a special optimization method for calculation of the multi-isotopic SR and its uncertainties. Results of a validation of the software using tracer experiment data and a Chernobyl source estimation for the main dose-forming isotopes are included in the paper
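
    The final step, estimating a multi-isotopic source rate from indirect measurements, can be sketched as a linear least-squares problem once the transport model supplies a source-receptor matrix. All matrices and numbers below are hypothetical illustrations, not the RODOS software:

```python
import numpy as np

# Hypothetical source-receptor matrix from an atmospheric transport model:
# A[i, j] = dose at detector i per unit release rate of isotope j
A = np.array([[0.9, 0.1],
              [0.4, 0.6],
              [0.2, 0.8]])

true_release = np.array([5.0, 2.0])   # release rates per isotope (arbitrary units)

# Detector readings: model response plus small measurement discrepancies
measured = A @ true_release + np.array([0.05, -0.02, 0.01])

# Least-squares estimate of the multi-isotopic source rate
estimate, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(estimate)   # close to the true release rates
```

    Real systems add regularization, non-negativity constraints, and uncertainty estimates on top of this basic inversion.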

  15. Presidential addresses of the Royal Society of Tropical Medicine and Hygiene: 1907–2013

    Science.gov (United States)

    Hay, Simon I.; McHugh, Gerri M.

    2013-01-01

    Presidents have been required to give an inaugural address on commencing office at the Royal Society of Tropical Medicine and Hygiene (RSTMH) since its foundation in 1907. All presidential addresses were identified, sourced and assembled into an annotated bibliography. The majority of presidential addresses have been published in Transactions of the Royal Society of Tropical Medicine and Hygiene. Unpublished and in some cases ‘lost’ contributions have now been sourced where possible and archived at the RSTMH. This unique, rich and rewarding archive provides a vista into the development of the RSTMH and the discipline of tropical medicine. The archive is freely available to all. PMID:24026462

  16. Flash X-Ray (FXR) Accelerator Optimization Electronic Time-Resolved Measurement of X-Ray Source Size

    International Nuclear Information System (INIS)

    Jacob, J; Ong, M; Wargo, P

    2005-01-01

    Lawrence Livermore National Laboratory (LLNL) is currently investigating various approaches to minimize the x-ray source size on the Flash X-Ray (FXR) linear induction accelerator in order to improve x-ray flux and increase resolution for hydrodynamic radiography experiments. In order to effectively gauge improvements to final x-ray source size, a fast, robust, and accurate system for measuring the spot size is required. Timely feedback on x-ray source size allows new and improved accelerator tunes to be deployed and optimized within the limited run-time constraints of a production facility with a busy experimental schedule; in addition, time-resolved measurement capability allows the investigation of not only the time-averaged source size, but also the evolution of the source size, centroid position, and x-ray dose throughout the 70 ns beam pulse. Combined with time-resolved measurements of electron beam parameters such as emittance, energy, and current, key limiting factors can be identified, modeled, and optimized for the best possible spot size. Roll-bar techniques are a widely used method for x-ray source size measurement, and have been the method of choice at FXR for many years. A thick bar of tungsten or other dense metal with a sharp edge is inserted into the path of the x-ray beam so as to heavily attenuate the lower half of the beam, resulting in a half-light, half-dark image as seen downstream of the roll-bar; by measuring the width of the transition from light to dark across the edge of the roll-bar, the source size can be deduced. For many years, film has been the imaging medium of choice for roll-bar measurements thanks to its high resolution, linear response, and excellent contrast ratio. Film measurements, however, are fairly cumbersome and require considerable setup and analysis time; moreover, with the continuing trend towards all-electronic measurement systems, film is becoming increasingly difficult and expensive to procure. Here, we shall
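
    The roll-bar measurement described above can be sketched numerically: the measured edge-spread function is differentiated to obtain the line-spread function, whose width gives the source size. A minimal illustration assuming a Gaussian source (all numbers hypothetical, not FXR data):

```python
import numpy as np
from math import erf

# A Gaussian source of width sigma_true blurs the ideal step edge of the
# roll-bar into an error-function transition (the edge-spread function, ESF).
sigma_true = 1.5                       # mm, source size parameter
x = np.linspace(-10, 10, 2001)         # position across the edge (mm)
esf = np.array([0.5 * (1 + erf(xi / (sigma_true * np.sqrt(2)))) for xi in x])

# Differentiate the ESF to obtain the line-spread function (LSF)
lsf = np.gradient(esf, x)

# The FWHM of the LSF recovers the source size (FWHM = 2.355 * sigma for a Gaussian)
half = lsf.max() / 2
above = x[lsf >= half]
fwhm = above[-1] - above[0]
sigma_est = fwhm / 2.355
print(sigma_est)   # recovers roughly 1.5 mm
```

    In practice the measured ESF is noisy, so the differentiation step is usually preceded by smoothing or replaced by a model fit to the edge profile.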

  17. Source Finding in the Era of the SKA (Precursors): Aegean 2.0

    Science.gov (United States)

    Hancock, Paul J.; Trott, Cathryn M.; Hurley-Walker, Natasha

    2018-03-01

    In the era of the SKA precursors, telescopes are producing deeper, larger images of the sky on increasingly small time-scales. The greater size and volume of images place an increased demand on the software that we use to create catalogues, and so our source finding algorithms need to evolve accordingly. In this paper, we discuss some of the logistical and technical challenges that result from the increased size and volume of images that are to be analysed, and demonstrate how the Aegean source finding package has evolved to address these challenges. In particular, we address the issues of source finding on spatially correlated data, and on images in which the background, noise, and point spread function vary across the sky. We also introduce the concept of forced or prioritised fitting.

  18. Sources management

    International Nuclear Information System (INIS)

    Mansoux, H.; Gourmelon; Scanff, P.; Fournet, F.; Murith, Ch.; Saint-Paul, N.; Colson, P.; Jouve, A.; Feron, F.; Haranger, D.; Mathieu, P.; Paycha, F.; Israel, S.; Auboiroux, B.; Chartier, P.

    2005-01-01

    Organized by the section of technical protection of the French society of radiation protection (S.F.R.P.), these two days aimed to review the evolution of the regulations governing sources of ionising radiation (sealed and unsealed radioactive sources, electric generators). They addressed all the actors concerned by the implementation of the new regulatory system in the different sectors of activity (research, medicine and industry): authorities, manufacturers and suppliers of sources, holders and users, bodies involved in the approval of sources, and carriers. (N.C.)

  19. The Earthquake‐Source Inversion Validation (SIV) Project

    KAUST Repository

    Mai, Paul Martin

    2016-04-27

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.

  20. The Earthquake‐Source Inversion Validation (SIV) Project

    KAUST Repository

    Mai, Paul Martin; Schorlemmer, Danijel; Page, Morgan; Ampuero, Jean‐Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran Kumar; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish Chandra; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.

  1. Using recruitment source timing and diagnosticity to enhance applicants' occupation-specific human capital.

    Science.gov (United States)

    Campion, Michael C; Ployhart, Robert E; Campion, Michael A

    2017-05-01

    [Correction Notice: An Erratum for this article was reported in Vol 102(5) of Journal of Applied Psychology (see record 2017-14296-001). In the article, the following headings were inadvertently set at the wrong level: Method, Participants and Procedure, Measures, Occupation specific human capital, Symbolic jobs, Relevant majors, Occupation-specific capital hotspots, Source timing, Source diagnosticity, Results, and Discussion. All versions of this article have been corrected.] This study proposes that reaching applicants through more diagnostic recruitment sources earlier in their educational development (e.g., in high school) can lead them to invest more in their occupation-specific human capital (OSHC), thereby making them higher quality candidates. Using a sample of 78,157 applicants applying for jobs within a desirable professional occupation in the public sector, results indicate that applicants who report hearing about the occupation earlier, and applicants who report hearing about the occupation through more diagnostic sources, have higher levels of OSHC upon application. Additionally, source timing and diagnosticity affect the likelihood of candidates applying for jobs symbolic of the occupation, selecting relevant majors, and attending educational institutions with top programs related to the occupation. These findings suggest a firm's recruiting efforts may influence applicants' OSHC investment strategies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Research of future network with multi-layer IP address

    Science.gov (United States)

    Li, Guoling; Long, Zhaohua; Wei, Ziqiang

    2018-04-01

    The shortage of IP addresses and the scalability of routing systems [1] are challenges for the Internet. Dividing existing IP addresses between identities and locations is one of the important research directions. This paper proposes a new decimal network architecture based on IPv9 [11]. The decimal network's IP addressing draws on the E.164 principle of the traditional telecommunication network: IP addresses are hierarchically divided, which helps to realize the separation of identifier and location, and forms a multi-layer IP address network structure, easing the scalability problem of the routing system while offering a way out of IPv4 address exhaustion. In addition, the DNS [10] is modified simply by adding a digital-domain function, forming a DDNS [12]. At the same time, a gateway device, the IPV9 gateway, is added; the original backbone network and user networks are unchanged.

  3. Influence of starch source in the required hydrolysis time for the ...

    African Journals Online (AJOL)

    Influence of starch source in the required hydrolysis time for the production of maltodextrins with different dextrose equivalent. José Luis Montañez Soto, Luis Medina García, José Venegas González, Aurea Bernardino Nicanor, Leopoldo González Cruz ...

  4. Multiple time-reversed guide-sources in shallow water

    Science.gov (United States)

    Gaumond, Charles F.; Fromm, David M.; Lingevitch, Joseph F.; Gauss, Roger C.; Menis, Richard

    2003-10-01

    Detection in a monostatic, broadband, active sonar system in shallow water is degraded by propagation-induced spreading. The detection improvement from multiple spatially separated guide sources (GSs) is presented as a method to mitigate this degradation. The improvement of detection by using information in a set of one-way transmissions from a variety of positions is shown using sea data. The experimental area is south of the Hudson Canyon off the coast of New Jersey. The data were taken using five elements of a time-reversing VLA. The five elements were contiguous and at midwater depth. The target and guide source were an echo repeater positioned at various ranges and at middepth. The transmitted signals were 3.0- to 3.5-kHz LFMs. The data are analyzed to show the amount of information present in the collection, a baseline probability of detection (PD) not using the collection of GS signals, and the improvement in PD from the use of various sets of GS signals. The dependence of the improvement as a function of range is also shown. [The authors acknowledge support from Dr. Jeffrey Simmen, ONR321OS, and the chief scientist Dr. Charles Holland. Work supported by ONR.]

  5. Effects of detector–source distance and detector bias voltage variations on time resolution of general purpose plastic scintillation detectors

    International Nuclear Information System (INIS)

    Ermis, E.E.; Celiktas, C.

    2012-01-01

    Effects of source-detector distance and detector bias voltage variations on the time resolution of a general purpose plastic scintillation detector such as BC400 were investigated. 133Ba and 207Bi calibration sources, with and without collimator, were used in the present work. Optimum source-detector distance and bias voltage values were determined for the best time resolution by using the leading edge timing method. The effect of collimator usage on time resolution was also investigated. - Highlights: ► Effect of the source-detector distance on time spectra was investigated. ► Effect of the detector bias voltage variations on time spectra was examined. ► Optimum detector-source distance was determined for the best time resolution. ► Optimum detector bias voltage was determined for the best time resolution. ► 133Ba and 207Bi radioisotopes were used.
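
    The leading edge timing method used in the study amounts to a threshold-crossing discriminator: the event time is the instant the pulse first exceeds a fixed threshold. A minimal sketch (pulse shape and numbers hypothetical, not the BC400 data):

```python
import numpy as np

def leading_edge_time(t, pulse, threshold):
    """Return the time at which the pulse first crosses the threshold,
    linearly interpolated between samples (leading-edge discrimination)."""
    idx = np.argmax(pulse >= threshold)           # first sample above threshold
    t0, t1 = t[idx - 1], t[idx]
    p0, p1 = pulse[idx - 1], pulse[idx]
    return t0 + (threshold - p0) * (t1 - t0) / (p1 - p0)

# Hypothetical scintillator pulse: fast rise after a 20 ns onset, slow decay
t = np.linspace(0, 100, 1001)                     # ns
pulse = np.where(t > 20,
                 (1 - np.exp(-(t - 20) / 2.0)) * np.exp(-(t - 20) / 30.0),
                 0.0)

t_cross = leading_edge_time(t, pulse, threshold=0.2)
print(t_cross)   # crossing time in ns, shortly after the 20 ns onset
```

    A known limitation of this method is amplitude walk: larger pulses cross a fixed threshold earlier, which is one reason distance and bias voltage (both of which change pulse amplitude) affect the measured time resolution.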

  6. Time-resolved far-infrared experiments at the National Synchrotron Light Source. Final report

    International Nuclear Information System (INIS)

    Tanner, D.B.; Reitze, D.H.; Carr, G.L.

    1999-01-01

    A facility for time-resolved infrared and far-infrared spectroscopy has been built and commissioned at the National Synchrotron Light Source. This facility permits the study of time-dependent phenomena over a frequency range from 2 to 8000 cm⁻¹ (0.25 meV to 1 eV). Temporal resolution is approximately 200 psec, and time-dependent phenomena in the time range out to 100 nsec can be investigated.

  7. Time-of-flight diffraction at pulsed neutron sources: An introduction to the symposium

    International Nuclear Information System (INIS)

    Jorgensen, J.D.

    1994-01-01

    In the 25 years since the first low-power demonstration experiments, pulsed neutron sources have become as productive as reactor sources for many types of diffraction experiments. The pulsed neutron sources presently operating in the United States, England, and Japan offer state-of-the-art instruments for powder and single crystal diffraction, small angle scattering, and such specialized techniques as grazing-incidence neutron reflection, as well as quasielastic and inelastic scattering. In this symposium, speakers review the latest advances in diffraction instrumentation for pulsed neutron sources and give examples of some of the important science presently being done. In this introduction to the symposium, I briefly define the basic principles of pulsed neutron sources, review their development, comment in general terms on the development of time-of-flight diffraction instrumentation for these sources, and project how this field will develop in the next ten years.

  8. An ion source for radiofrequency-pulsed glow discharge time-of-flight mass spectrometry

    International Nuclear Information System (INIS)

    González Gago, C.; Lobo, L.; Pisonero, J.; Bordel, N.; Pereiro, R.; Sanz-Medel, A.

    2012-01-01

    A Grimm-type glow discharge (GD) has been designed and constructed as an ion source for pulsed radiofrequency GD spectrometry coupled to an orthogonal time-of-flight mass spectrometer. Pulse shapes of argon species and analytes were studied as a function of the discharge conditions using a new in-house ion source (UNIOVI GD), and results were compared with a previous design (PROTOTYPE GD). Different behavior and shapes of the pulse profiles were observed for the two sources evaluated, particularly for the plasma gas ionic species detected. In the more analytically relevant region (afterglow), signals for ⁴⁰Ar⁺ with this new design were negligible, while maximum intensity was reached earlier in time for ⁴¹ArH⁺ than when using the PROTOTYPE GD. Moreover, while maximum ⁴⁰Ar⁺ signals measured along the pulse period were similar in both sources, ⁴¹ArH⁺ and ⁸⁰Ar₂⁺ signals tend to be noticeably higher using the PROTOTYPE chamber. The UNIOVI GD design was shown to be adequate for sensitive direct analysis of solid samples, offering linear calibration graphs and good crater shapes. Limits of detection (LODs) are of the same order of magnitude for both sources, although the UNIOVI source provides slightly better LODs for analytes with masses slightly higher than that of ⁴¹ArH⁺. - Highlights: ► A new RF-pulsed GD ion source (UNIOVI GD) coupled to TOFMS has been characterized. ► Linear calibration graphs and LODs in the low ppm range are achieved. ► Craters with flat bottoms and vertical walls are obtained. ► The UNIOVI source can be easily cleaned as it does not require a flow tube. ► The UNIOVI GD has a simple design and is thus easy and cheap to manufacture.

  9. TimeSet: A computer program that accesses five atomic time services on two continents

    Science.gov (United States)

    Petrakis, P. L.

    1993-01-01

    TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later, the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe (in Sweden, Austria, and Italy), making a total of five official services addressable by the program. A companion program, TimeGen, allows yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Timesetting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings 0.01-second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy, within the resolution of the computer clock, as any official atomic time source.

  10. Overview of terahertz radiation sources

    International Nuclear Information System (INIS)

    Gallerano, G.P.; Biedron, S.G.

    2004-01-01

    Although terahertz (THz) radiation was first observed about one hundred years ago, the corresponding portion of the electromagnetic spectrum has long been considered a rather poorly explored region at the boundary between the microwaves and the infrared. This situation has changed during the past ten years with the rapid development of coherent THz sources, such as solid state oscillators, quantum cascade lasers, optically pumped solid state devices and novel free electron devices, which have in turn stimulated a wide variety of applications from material science to telecommunications, from biology to biomedicine. For a comprehensive review of THz technology the reader is referred to a recent paper by P. Siegel. In this paper we focus on the development and perspectives of THz radiation sources.

  11. A Comparison between Predicted and Observed Atmospheric States and their Effects on Infrasonic Source Time Function Inversion at Source Physics Experiment 6

    Science.gov (United States)

    Aur, K. A.; Poppeliers, C.; Preston, L. A.

    2017-12-01

    The Source Physics Experiment (SPE) consists of a series of underground chemical explosions at the Nevada National Security Site (NNSS) designed to gain an improved understanding of the generation and propagation of physical signals in the near and far field. Characterizing the acoustic and infrasound source mechanism of underground explosions is of great importance to underground explosion monitoring. To this end we perform full waveform source inversion of infrasound data collected from the SPE-6 experiment at distances from 300 m to 6 km and frequencies up to 20 Hz. Our method requires estimating the state of the atmosphere at the time of each experiment, computing Green's functions through these atmospheric models, and subsequently inverting the observed data in the frequency domain to obtain a source time function. To estimate the state of the atmosphere at the time of the experiment, we utilize the Weather Research and Forecasting - Data Assimilation (WRF-DA) modeling system to derive a unified atmospheric state model by combining Global Energy and Water Cycle Experiment (GEWEX) Continental-scale International Project (GCIP) data and locally obtained sonde and surface weather observations collected at the time of the experiment. We synthesize Green's functions through these atmospheric models using Sandia's moving media acoustic propagation simulation suite (TDAAPS). These models include 3-D variations in topography, temperature, pressure, and wind. We compare inversion results using the atmospheric models derived from the unified weather models versus previous modeling results, and discuss how these differences affect computed source waveforms with respect to observed waveforms at various distances. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration.

  12. Evaluating scintillator performance in time-resolved hard X-ray studies at synchrotron light sources

    International Nuclear Information System (INIS)

    Rutherford, Michael E.; Chapman, David J.; White, Thomas G.; Drakopoulos, Michael; Rack, Alexander; Eakins, Daniel E.

    2016-01-01

    Scintillator performance in time-resolved, hard, indirect detection X-ray studies on the sub-microsecond timescale at synchrotron light sources is reviewed, modelled and examined experimentally. LYSO:Ce is found to be the only commercially available crystal suitable for these experiments. The short pulse duration, small effective source size and high flux of synchrotron radiation are ideally suited for probing a wide range of transient deformation processes in materials under extreme conditions. In this paper, the challenges of high-resolution, time-resolved indirect X-ray detection are reviewed in the context of dynamic synchrotron experiments. In particular, the discussion is targeted at two-dimensional integrating detector methods, such as those focused on dynamic radiography and diffraction experiments. The response of a scintillator to periodic synchrotron X-ray excitation is modelled and validated against experimental data collected at the Diamond Light Source (DLS) and the European Synchrotron Radiation Facility (ESRF). An upper bound on the dynamic range accessible in a time-resolved experiment for a given bunch separation is calculated for a range of scintillators. New bunch structures are suggested for DLS and ESRF using the highest-performing commercially available crystal, LYSO:Ce, allowing time-resolved experiments with an interframe time of 189 ns and a maximum dynamic range of 98 (6.6 bits).

  13. Evaluating scintillator performance in time-resolved hard X-ray studies at synchrotron light sources

    Energy Technology Data Exchange (ETDEWEB)

    Rutherford, Michael E.; Chapman, David J.; White, Thomas G. [Imperial College London, London (United Kingdom); Drakopoulos, Michael [Diamond Light Source, I12 Joint Engineering, Environmental, Processing (JEEP) Beamline, Didcot, Oxfordshire (United Kingdom); Rack, Alexander [European Synchrotron Radiation Facility, Grenoble (France); Eakins, Daniel E., E-mail: d.eakins@imperial.ac.uk [Imperial College London, London (United Kingdom)

    2016-03-24

    Scintillator performance in time-resolved, hard, indirect detection X-ray studies on the sub-microsecond timescale at synchrotron light sources is reviewed, modelled and examined experimentally. LYSO:Ce is found to be the only commercially available crystal suitable for these experiments. The short pulse duration, small effective source size and high flux of synchrotron radiation are ideally suited for probing a wide range of transient deformation processes in materials under extreme conditions. In this paper, the challenges of high-resolution, time-resolved indirect X-ray detection are reviewed in the context of dynamic synchrotron experiments. In particular, the discussion is targeted at two-dimensional integrating detector methods, such as those focused on dynamic radiography and diffraction experiments. The response of a scintillator to periodic synchrotron X-ray excitation is modelled and validated against experimental data collected at the Diamond Light Source (DLS) and the European Synchrotron Radiation Facility (ESRF). An upper bound on the dynamic range accessible in a time-resolved experiment for a given bunch separation is calculated for a range of scintillators. New bunch structures are suggested for DLS and ESRF using the highest-performing commercially available crystal, LYSO:Ce, allowing time-resolved experiments with an interframe time of 189 ns and a maximum dynamic range of 98 (6.6 bits).

  14. Integrating address geocoding, land use regression, and spatiotemporal geostatistical estimation for groundwater tetrachloroethylene.

    Science.gov (United States)

    Messier, Kyle P; Akita, Yasuyuki; Serre, Marc L

    2012-03-06

    Geographic information systems (GIS) based techniques are cost-effective and efficient methods used by state agencies and epidemiology researchers for estimating concentration and exposure. However, budget limitations have made statewide assessments of contamination difficult, especially in groundwater media. Many studies have implemented address geocoding, land use regression, and geostatistics independently, but this is the first to examine the benefits of integrating these GIS techniques to address the need for statewide exposure assessments. A novel framework for concentration exposure is introduced that integrates address geocoding, land use regression (LUR), below-detect data modeling, and Bayesian Maximum Entropy (BME). A LUR model was developed for tetrachloroethylene (PCE) that accounts for point sources and flow direction. We then integrate the LUR model into the BME method as a mean trend while also modeling below-detect data as a truncated Gaussian probability distribution function. We increase available PCE data 4.7-fold over previously available databases through multistage geocoding. The LUR model shows significant influence of dry cleaners at short ranges. The integration of the LUR model as mean trend in BME results in a 7.5% decrease in cross-validation mean square error compared to BME with a constant mean trend.

  15. Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.

    Science.gov (United States)

    Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran

    2017-04-01

    Media-based event data, i.e., data compiled from reporting by media outlets, are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risks inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences is drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). Therefore, we propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and for each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories on both the causes of and reporting on an event of interest. Simulations show that our technique regularly outperforms current strategies that neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.

  16. Time-resolved hard x-ray studies using third-generation synchrotron radiation sources (abstract)

    International Nuclear Information System (INIS)

    Mills, D.M.

    1992-01-01

    The third-generation, high-brilliance synchrotron radiation sources currently under construction will usher in a new era of x-ray research in the physical, chemical, and biological sciences. One of the most exciting areas of experimentation will be the extension of static x-ray scattering and diffraction techniques to the study of transient or time-evolving systems. The high repetition rate, short pulse duration, high brilliance, variable spectral bandwidth, and large particle beam energies of these sources make them ideal for hard x-ray, time-resolved studies. The primary focus of this presentation will be on the novel instrumentation required for time-resolved studies, such as optics which can increase the flux on the sample or disperse the x-ray beam, detectors and electronics for parallel data collection, and methods for altering the natural time structure of the radiation. This work is supported by the U.S. Department of Energy, BES-Materials Science, under Contract No. W-31-109-ENG-38.

  17. Learning to locate an odour source with a mobile robot

    OpenAIRE

    Duckett, T.; Axelsson, M.; Saffiotti, A.

    2001-01-01

    We address the problem of enabling a mobile robot to locate a stationary odour source using an electronic nose constructed from gas sensors. On the hardware side, we use a stereo nose architecture consisting of two parallel chambers, each containing an identical set of sensors. On the software side, we use a recurrent artificial neural network to learn the direction to a stationary source from a time series of sensor readings. This contrasts with previous approaches, that rely on the existenc...

  18. Source Segregation and Collection of Source-Segregated Waste

    DEFF Research Database (Denmark)

    Christensen, Thomas Højlund; Matsufuji, Y.

    2011-01-01

    ...of optimal handling of the waste. But in a few cases, the waste must also be separated at source, for example removing the protective plastic cover from a commercial advertisement received by mail prior to putting the advertisement into the waste collection bin for recyclable paper. These issues are often... in waste segregation, addressing: purpose of source segregation; segregation criteria and guidance; segregation potentials and efficiencies; and systems for collecting the segregated fractions...

  19. Contributed Review: Source-localization algorithms and applications using time of arrival and time difference of arrival measurements

    Science.gov (United States)

    Li, Xinya; Deng, Zhiqun Daniel; Rauchenstein, Lynn T.; Carlson, Thomas J.

    2016-04-01

    Locating the position of fixed or mobile sources (i.e., transmitters) based on measurements obtained from sensors (i.e., receivers) is an important research area that is attracting much interest. In this paper, we review several representative localization algorithms that use times of arrival (TOAs) and time differences of arrival (TDOAs) to achieve high signal source position estimation accuracy when a transmitter is in the line-of-sight of a receiver. Circular (TOA) and hyperbolic (TDOA) position estimation approaches both use nonlinear equations that relate the known locations of receivers and the unknown locations of transmitters. Estimation of transmitter locations from the standard nonlinear equations may not be very accurate because of receiver location errors, receiver measurement errors, and the high computational burden involved. Least-squares and maximum-likelihood algorithms have become the most popular computational approaches to transmitter location estimation. In this paper, we summarize the computational characteristics and position estimation accuracies of various positioning algorithms. By improving methods for estimating the time of arrival of transmissions at receivers and transmitter location estimation algorithms, transmitter location estimation may be applied across a range of applications and technologies such as radar, sonar, the Global Positioning System, wireless sensor networks, underwater animal tracking, mobile communications, and multimedia.
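The hyperbolic (TDOA) equations reviewed above are nonlinear in the unknown transmitter position and are typically solved by iterative least squares. Below is a minimal Gauss-Newton sketch (not code from the paper; the 2-D geometry, receiver layout, and propagation speed are illustrative assumptions) that recovers a source position from ideal TDOAs measured relative to receiver 0:

```python
import numpy as np

C = 343.0  # assumed propagation speed (m/s), e.g. sound in air

def tdoa_residuals(x, rx, tdoa):
    """Residuals of the hyperbolic equations: range differences vs. measurements."""
    d = np.linalg.norm(rx - x, axis=1)          # range from candidate source to each receiver
    return (d[1:] - d[0]) - C * tdoa

def locate_tdoa(rx, tdoa, x0, iters=50, tol=1e-9):
    """Gauss-Newton iteration for the nonlinear TDOA least-squares problem."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(rx - x, axis=1)
        u = (x - rx) / d[:, None]               # gradient of each range w.r.t. x
        J = u[1:] - u[0]                        # Jacobian of the range differences
        r = tdoa_residuals(x, rx, tdoa)
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Synthetic check: four receivers on a square, known source position
rx = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
src = np.array([30.0, 70.0])
d = np.linalg.norm(rx - src, axis=1)
tdoa = (d[1:] - d[0]) / C                       # noise-free TDOA measurements
est = locate_tdoa(rx, tdoa, x0=[50.0, 50.0])
```

With noisy measurements the same loop returns the least-squares fit; weighting the residuals by the measurement covariance gives the maximum-likelihood variant the review discusses.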

  20. Locating S-wave sources for the SPE-5 explosion using time reversal methods and a close-in, 1000 sensor network

    Science.gov (United States)

    Myers, S. C.; Pitarka, A.; Mellors, R. J.

    2016-12-01

    The Source Physics Experiment (SPE) is producing new data to study the generation of seismic waves from explosive sources. Preliminary results show that far-field S-waves are generated both within the non-elastic volume surrounding explosive sources and by P- to S-wave scattering. The relative contribution of non-elastic phenomenology and elastic-wave scattering to far-field S-waves has been debated for decades, and numerical simulations based on the SPE experiments are addressing this question. The match between observed and simulated data degrades with event-station distance and with increasing time in each seismogram. This suggests that a more accurate model of subsurface elastic properties could result in better agreement between observed and simulated seismograms. A detailed model of subsurface structure has been developed using geologic maps and the extensive database of borehole logs, but uncertainty in structural details remains high. The large-N instrument deployment during the SPE-5 experiment offers an opportunity to use time-reversal techniques to back-project the wave field into the subsurface and locate significant sources of scattered energy. The large-N deployment was nominally 1000, 5 Hz sensors (500 Z and 500 3C geophones) deployed in a roughly rectangular array to the south and east of the SPE-5 shot. Sensor spacing was nominally 50 meters in the interior portion of the array and 100 meters in the outer region, with two dense lines at 25 m spacing. The array covers the major geologic boundary between the Yucca Flat basin and the granitic Climax Stock in which the SPE experiments have been conducted. Improved mapping of subsurface scatterers is expected to result in better agreement between simulated and observed seismograms and aid in our understanding of S-wave generation from explosions. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.

  1. How to address social concerns? round table discussions during session 2 of the Fsc workshop in Canada

    International Nuclear Information System (INIS)

    Kotra, J.

    2003-01-01

    All of the round table discussion groups recognised that a variety of tools exist for addressing social concerns. Among them are tools for sharing information and specific programs offering compensation, financial and otherwise. Institutional behaviors, both general and specific, may also be modified to respond to social concerns. However, many discussants emphasised that social concerns, and effective solutions for them, when they exist, are highly site- and community-specific. Only when the sources or origins of site-specific concerns have been identified may the selection of tools be approached. Specific sources of concern seen in the Canadian case studies were discussed, as well as examples from other countries and programs. All involved the absence or erosion of trust. A deficit of trust may arise from lack of familiarity, misinformation or missing information, changing sensibilities of society over time, specific past failures of particular institutions, or inadequate general education. Virtually all of the tools discussed by the round tables for addressing social concerns were also means for building, or rebuilding, social trust. (author)

  2. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    International Nuclear Information System (INIS)

    Gora, D.; Bernardini, E.; Cruz Silva, A.H.

    2011-04-01

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)
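As a toy illustration of the unbinned-likelihood ingredient (not the paper's full time-clustering algorithm, which also scans cluster combinations and includes spatial and energy terms), the sketch below fits a time-only flare model: background uniform over the observation period, signal uniform inside a candidate flare window, and a test statistic comparing the best-fit signal strength against the background-only hypothesis. All numbers and the grid scan over the signal strength are our assumptions:

```python
import math

def flare_ts(times, t1, t2, T_obs):
    """Toy unbinned likelihood for a single flare window [t1, t2]:
    background uniform over [0, T_obs], signal uniform inside the window.
    Returns the best-fit number of signal events ns_hat and the test
    statistic TS = 2*log(L(ns_hat)/L(0)), scanning ns on a coarse grid."""
    N = len(times)
    S = [1.0 / (t2 - t1) if t1 <= t <= t2 else 0.0 for t in times]
    B = 1.0 / T_obs

    def logl(ns):
        # Per-event mixture of signal and background time PDFs
        return sum(math.log((ns / N) * s + (1.0 - ns / N) * B) for s in S)

    grid = [0.1 * i for i in range(10 * N)]     # 0, 0.1, ..., N - 0.1
    ns_hat = max(grid, key=logl)
    return ns_hat, 2.0 * (logl(ns_hat) - logl(0.0))
```

In the full method this statistic would be evaluated for many candidate windows (the time-clustering step) and the signal term would sum the contributions of several flares.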

  3. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    Energy Technology Data Exchange (ETDEWEB)

    Gora, D. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institute of Nuclear Physics PAN, Cracow (Poland); Bernardini, E.; Cruz Silva, A.H. [Institute of Nuclear Physics PAN, Cracow (Poland)

    2011-04-15

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)

  4. Studying Regional Wave Source Time Functions Using A Massive Automated EGF Deconvolution Procedure

    Science.gov (United States)

    Xie, J. "; Schaff, D. P.

    2010-12-01

    Reliably estimated source time functions (STF) from high-frequency regional waveforms, such as Lg, Pn and Pg, provide important input for seismic source studies, explosion detection, and minimization of parameter trade-off in attenuation studies. The empirical Green’s function (EGF) method can be used for estimating STF, but it requires a strict recording condition. Waveforms from pairs of events that are similar in focal mechanism, but different in magnitude must be on-scale recorded on the same stations for the method to work. Searching for such waveforms can be very time consuming, particularly for regional waves that contain complex path effects and have reduced S/N ratios due to attenuation. We have developed a massive, automated procedure to conduct inter-event waveform deconvolution calculations from many candidate event pairs. The procedure automatically evaluates the “spikiness” of the deconvolutions by calculating their “sdc”, which is defined as the peak divided by the background value. The background value is calculated as the mean absolute value of the deconvolution, excluding 10 s around the source time function. When the sdc values are about 10 or higher, the deconvolutions are found to be sufficiently spiky (pulse-like), indicating similar path Green’s functions and good estimates of the STF. We have applied this automated procedure to Lg waves and full regional wavetrains from 989 M ≥ 5 events in and around China, calculating about a million deconvolutions. Of these we found about 2700 deconvolutions with sdc greater than 9, which, if having a sufficiently broad frequency band, can be used to estimate the STF of the larger events. We are currently refining our procedure, as well as the estimated STFs. We will infer the source scaling using the STFs. We will also explore the possibility that the deconvolution procedure could complement cross-correlation in a real time event-screening process.
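The "sdc" measure described above (the deconvolution's peak divided by the mean absolute background, excluding 10 s around the source time function) is easy to state in code. The sketch below is our reading of that definition, with the excluded window centred on the peak as an assumption:

```python
import numpy as np

def sdc(decon, dt, exclude_s=10.0):
    """Spikiness of a deconvolution trace: peak amplitude divided by the mean
    absolute background, where the background excludes a window (default 10 s)
    centred on the peak, taken here as the location of the STF."""
    decon = np.asarray(decon, dtype=float)
    ipk = int(np.argmax(np.abs(decon)))
    half = int(round((exclude_s / 2.0) / dt))   # samples on each side of the peak
    mask = np.ones(decon.size, dtype=bool)
    mask[max(0, ipk - half): ipk + half + 1] = False
    background = np.mean(np.abs(decon[mask]))
    return float(np.abs(decon[ipk]) / background)
```

A deconvolution would then be accepted as pulse-like when this value is roughly 9-10 or higher, per the threshold quoted in the abstract.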

  5. Double point source W-phase inversion: Real-time implementation and automated model selection

    Science.gov (United States)

    Nealy, Jennifer; Hayes, Gavin

    2015-01-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique, which can be efficiently implemented in real-time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion with centroid locations fixed at the single source solution location can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is most appropriately described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to be able to accurately select the most appropriate model and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.
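The AIC comparison can be sketched in a few lines. Assuming Gaussian residuals, so that the log-likelihood reduces to a function of the residual sum of squares (our assumption; the abstract does not spell out its AIC form), the double point source is preferred only when its misfit reduction outweighs the penalty for its extra parameters:

```python
import math

def aic(n, rss, k):
    """AIC for a Gaussian least-squares fit with n data points, residual sum
    of squares rss, and k free parameters: n*ln(rss/n) + 2k (constant dropped)."""
    return n * math.log(rss / n) + 2 * k

def select_model(n, rss_single, k_single, rss_double, k_double):
    """Return the model (and its AIC) favored by the information criterion:
    the double point source wins only if it lowers the AIC."""
    a_single = aic(n, rss_single, k_single)
    a_double = aic(n, rss_double, k_double)
    return ("double", a_double) if a_double < a_single else ("single", a_single)
```

For example, halving the misfit at the cost of doubling the parameter count is accepted, while a marginal misfit improvement is not.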

  6. Implementation of a Network Address Translation Mechanism Over IPv6

    National Research Council Canada - National Science Library

    Baumgartner, Trevor

    2004-01-01

    ...; however, NAT provides several other benefits. NAT can be used to mask the internal IP addresses of an Intranet. IPv6, the emerging standard for Internet addressing, provides four times the number of bits for IP addressing...

  7. Real-time particle monitor calibration factors and PM2.5 emission factors for multiple indoor sources.

    Science.gov (United States)

    Dacunto, Philip J; Cheng, Kai-Chung; Acevedo-Bolton, Viviana; Jiang, Ruo-Ting; Klepeis, Neil E; Repace, James L; Ott, Wayne R; Hildemann, Lynn M

    2013-08-01

    Indoor sources can greatly contribute to personal exposure to particulate matter less than 2.5 μm in diameter (PM2.5). To accurately assess PM2.5 mass emission factors and concentrations, real-time particle monitors must be calibrated for individual sources. Sixty-six experiments were conducted with a common, real-time laser photometer (TSI SidePak™ Model AM510 Personal Aerosol Monitor) and a filter-based PM2.5 gravimetric sampler to quantify the monitor calibration factors (CFs), and to estimate emission factors for common indoor sources including cigarettes, incense, cooking, candles, and fireplaces. Calibration factors for these indoor sources were all significantly less than the factory-set CF of 1.0, ranging from 0.32 (cigarette smoke) to 0.70 (hamburger). Stick incense had a CF of 0.35, while fireplace emissions ranged from 0.44 to 0.47. Cooking source CFs ranged from 0.41 (fried bacon) to 0.65-0.70 (fried pork chops, salmon, and hamburger). The CFs of combined sources (e.g., cooking and cigarette emissions mixed) were linear combinations of the CFs of the component sources. The highest PM2.5 emission factors per unit time were from burned foods and fireplaces (15-16 mg min⁻¹), and the lowest from cooking foods such as pizza and ground beef (0.1-0.2 mg min⁻¹).
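The calibration arithmetic reported above is simple: the gravimetric-equivalent PM2.5 value is the photometer reading scaled by the source-specific CF, and the abstract notes that combined-source CFs behaved as linear combinations of the component CFs. A small sketch (the even 50/50 mix below is an illustrative assumption, not an experiment from the paper):

```python
def calibrated_pm25(reading, cf):
    """Gravimetric-equivalent PM2.5 from a real-time photometer reading."""
    return cf * reading

def combined_cf(cfs, fractions):
    """CF of a source mixture as the contribution-weighted sum of component CFs."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(c * f for c, f in zip(cfs, fractions))

# e.g. an even mix of cigarette smoke (CF 0.32) and hamburger cooking (CF 0.70)
cf_mix = combined_cf([0.32, 0.70], [0.5, 0.5])
```

Note that leaving the monitor at its factory CF of 1.0 would overstate cigarette-smoke PM2.5 by roughly a factor of three (1/0.32).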

  8. Open Source GIS based integrated watershed management

    Science.gov (United States)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  9. Moving source localization with a single hydrophone using multipath time delays in the deep ocean.

    Science.gov (United States)

    Duan, Rui; Yang, Kunde; Ma, Yuanliang; Yang, Qiulong; Li, Hui

    2014-08-01

    Localizing a source of radial movement at moderate range using a single hydrophone can be achieved in the reliable acoustic path by tracking the time delays between the direct and surface-reflected arrivals (D-SR time delays). The problem is defined as a joint estimation of the depth, initial range, and speed of the source, which are the state parameters for the extended Kalman filter (EKF). The D-SR time delays extracted from the autocorrelation functions are the measurements for the EKF. Experimental results using pseudorandom signals show that accurate localization results are achieved by offline iteration of the EKF.
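
    The tracking scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an isovelocity channel, a fixed (known) source depth, and a constant-velocity range model, with the D-SR delay computed from the image-source geometry; all numerical values (sound speed, depths, noise variances) are illustrative assumptions, and the paper additionally estimates depth as a state parameter.

```python
import numpy as np

C = 1500.0    # assumed sound speed (m/s); isovelocity water column
ZS = 100.0    # assumed (known) source depth (m)
ZR = 4000.0   # receiver depth (m), deep enough for the reliable acoustic path

def dsr_delay(r):
    """D-SR time delay at horizontal range r via the image-source model:
    surface-reflected path length minus direct path length, over sound speed."""
    direct = np.hypot(r, ZR - ZS)
    surface = np.hypot(r, ZR + ZS)
    return (surface - direct) / C

def ekf_track(delays, dt, x0, P0, q=1e-4, r_meas=1e-8):
    """EKF over state x = [range (m), radial speed (m/s)], measurements = D-SR delays (s)."""
    x, P = np.asarray(x0, float), np.asarray(P0, float)
    F = np.array([[1.0, dt], [0.0, 1.0]])                        # constant-velocity model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])  # process noise
    est = []
    for z in delays:
        x = F @ x                                # predict
        P = F @ P @ F.T + Q
        eps = 1.0                                # numerical Jacobian of h(r) = dsr_delay(r)
        H = np.array([[(dsr_delay(x[0] + eps) - dsr_delay(x[0] - eps)) / (2 * eps), 0.0]])
        S = H @ P @ H.T + r_meas                 # innovation variance
        K = P @ H.T / S                          # Kalman gain
        x = x + (K * (z - dsr_delay(x[0]))).ravel()
        P = (np.eye(2) - K @ H) @ P
        est.append(x.copy())
    return np.array(est)
```

    With noiseless simulated delays from a receding source, the filter corrects an offset initial guess and converges toward the true range and speed over a few tens of updates.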

  10. OpenPSTD : The open source pseudospectral time-domain method for acoustic propagation

    NARCIS (Netherlands)

    Hornikx, M.C.J.; Krijnen, T.F.; van Harten, L.

    2016-01-01

    An open source implementation of the Fourier pseudospectral time-domain (PSTD) method for computing the propagation of sound is presented, which is geared towards applications in the built environment. Being a wave-based method, PSTD captures phenomena like diffraction, but maintains efficiency in
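
    The core operator of a Fourier PSTD scheme, the spectral spatial derivative, can be sketched as follows. This minimal example is not taken from openPSTD; it assumes a periodic, band-limited 1-D field.

```python
import numpy as np

def spectral_derivative(f, dx):
    """Spatial derivative via the FFT: exact to machine precision for periodic,
    band-limited fields -- the core operator of Fourier PSTD schemes."""
    n = f.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(f)))
```

    In a PSTD update this operator replaces finite-difference stencils when advancing the coupled pressure and velocity equations, which is what lets the method stay accurate on coarse grids of only a few points per wavelength.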

  11. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    Science.gov (United States)

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulated total nitrogen (TN) loss intensity of all 38 subbasins, the spatial distribution characteristics of nitrogen loss and the critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to assess the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed significant spatial differences in TN loss in the Shanmei Reservoir watershed at the different time scales, with the degree of spatial differentiation of nitrogen loss in the order monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At all time scales, land use types (such as farmland and forest) were always the dominant factor affecting the spatial distribution of nitrogen loss, while precipitation and runoff affected nitrogen loss only in months without fertilization and during several storm flood processes on dates without fertilization. This was mainly due to the significant spatial variation of land use and fertilization, together with the low spatial variability of precipitation and runoff.

  12. Time-domain single-source integral equations for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdé s, Felipe; Andriulli, Francesco P.; Bagci, Hakan; Michielssen, Eric

    2013-01-01

    Single-source time-domain electric-and magnetic-field integral equations for analyzing scattering from homogeneous penetrable objects are presented. Their temporal discretization is effected by using shifted piecewise polynomial temporal basis

  13. Addressing Circuitous Currents MVDC Power Systems Protection

    Science.gov (United States)

    2017-12-31

    Addressing Circuitous Currents MVDC Power Systems Protection (grant number N00014-16-1-3113). ...efficiency. A challenge with DC distribution is electrical protection. Z-source DC breakers are an option being considered, and this work explores... Keywords: zonal distribution, electric ship.

  14. The importance of source and cue type in time-based everyday prospective memory.

    Science.gov (United States)

    Oates, Joyce M; Peynircioğlu, Zehra F

    2014-01-01

    We examined the effects of the source of a prospective memory task (provided or generated) and the type of cue (specific or general) triggering that task in everyday settings. Participants were asked to complete both generated and experimenter-provided tasks and to send a text message when each task was completed. The cue/context for the to-be-completed tasks was either a specific time or a general deadline (time-based cue), and the cue/context for the texting task was the completion of the task itself (activity-based cue). Although generated tasks were completed more often, generated cues/contexts were no more effective than provided ones in triggering the intention. Furthermore, generated tasks were completed more often when the cue/context comprised a specific time, whereas provided tasks were completed more often when the cue/context comprised a general deadline. However, texting was unaffected by the source of the cue/context. Finally, emotion modulated the effects. Results are discussed within a process-driven framework.

  15. eBooking of beam-time over internet for beamlines of Indus synchrotron radiation sources

    International Nuclear Information System (INIS)

    Jain, Alok; Verma, Rajesh; Rajan, Alpana; Modi, M.H.; Rawat, Anil

    2015-01-01

    Users from various research labs and academic institutes carry out experiments on beamlines of the two synchrotron radiation sources, Indus-1 and Indus-2, available at RRCAT, Indore. To carry out experimental work on beamlines of both synchrotron radiation sources, beam-time is booked over the Internet using a user portal designed, developed and deployed for this purpose. This portal has made the process of beam-time booking fast, hassle-free and paperless, as manual booking of beam-time for an experiment on a particular beamline is cumbersome. The portal helps the in-charges of the Indus-1 and Indus-2 beamlines keep track of users' records, work progress and other activities linked to experiments carried out on the beamlines. It is important to keep records and provide statistics about the usage of the beamlines from time to time. The user portal for e-booking of beam-time has been developed in-house using open-source software development tools. The multi-step activities of users and beamline administrators are workflow based, with seamless flow of information across various modules, and fully authenticated using a role-based mechanism for the different roles of software usage. The software has been in regular use since November 2013 and has helped beamline in-charges efficiently manage various activities related to user registration, booking of beam-time, booking of the guest house, generation of security permits, user feedback, etc. The design concept, role-based authentication mechanism and features provided by the web portal are discussed in detail in this paper. (author)

  16. A Breath of Fresh Air: Addressing Indoor Air Quality

    Science.gov (United States)

    Palliser, Janna

    2011-01-01

    Indoor air pollution refers to "chemical, biological, and physical contamination of indoor air," which may result in adverse health effects (OECD 2003). The causes, sources, and types of indoor air pollutants will be addressed in this article, as well as health effects and how to reduce exposure. Learning more about potential pollutants in home…

  17. In vivo time-gated diffuse correlation spectroscopy at quasi-null source-detector separation.

    Science.gov (United States)

    Pagliazzi, M; Sekar, S Konugolu Venkata; Di Sieno, L; Colombo, L; Durduran, T; Contini, D; Torricelli, A; Pifferi, A; Mora, A Dalla

    2018-06-01

    We demonstrate time domain diffuse correlation spectroscopy at quasi-null source-detector separation by using a fast time-gated single-photon avalanche diode without the need of time-tagging electronics. This approach allows for increased photon collection, simplified real-time instrumentation, and reduced probe dimensions. Depth discriminating, quasi-null distance measurement of blood flow in a human subject is presented. We envision the miniaturization and integration of matrices of optical sensors of increased spatial resolution and the enhancement of the contrast of local blood flow changes.

  18. A diagonal address generator for a Josephson memory circuit

    International Nuclear Information System (INIS)

    Suzuki, H.; Hasuo, S.

    1987-01-01

    The authors propose that a diagonal (D) address generator, which is useful for a single flux quantum (SFQ) memory cell in the triple-coincidence scheme, can be implemented with a full adder circuit. For the purpose of evaluating the D address generator for a 16-kbit memory circuit, a 6-bit full adder circuit using a current-steering flip-flop circuit was designed and fabricated with the lead-alloy process. Operating times for the address latch, carry generator, and sum generator were 150 ps, 250 ps/stage, and 1.4 ns, respectively. From these results, they estimate that the time necessary for diagonal signal generation is 2.8 ns

  19. Renewable energy sources - rational energy use. Enterprises - suppliers - research - consultation. BINE public information. Market leaders - addresses. Erneuerbare Energiequellen - rationelle Energieverwendung. Unternehmen - Bezugsquellen - Forschung - Beratung. BINE-Buergerinformation. Marktfuehrer-Adresshandbuch

    Energy Technology Data Exchange (ETDEWEB)

    1989-01-01

    The manual lists addresses and business information given by research institutes, companies, associations, groups, etc. in the field of renewable energy sources and rational energy use. It is intended to provide information, as comprehensive as possible, to all those who have to solve problems concerning energy conservation and environmental protection. The manual is based on a detailed questionnaire distributed by BINE (Buerger-Information Neue Energietechniken, Nachwachsende Rohstoffe, Umwelt). (UA).

  20. Addressing the Sustainability of Groundwater Extraction in California Using Hydrochronology

    Science.gov (United States)

    Moran, J. E.; Visser, A.; Singleton, M. J.; Esser, B. K.

    2017-12-01

    In urban and agricultural settings in California, intense pressure on water supplies has led to extensive managed aquifer recharge and extensive overdraft in these areas, respectively. The California Sustainable Groundwater Management Act (SGMA) includes criteria for pumping that maintains groundwater levels and basin storage, and avoids stream depletion and degradation of water quality. Most sustainability plans will likely use water level monitoring and water budget balancing based on integrated flow models as evidence of compliance. However, hydrochronology data are applicable to several of the criteria, and provide an independent method of addressing questions related to basin turnover time, recharge rate, surface water-groundwater interaction, and the age distribution at pumping wells. We have applied hydrochronology (mainly tritium-helium groundwater age dating and extrinsic tracers) in urban areas to delineate flowpaths of artificially recharged water, to identify stagnant zones bypassed by the engineered flow system, and to predict vulnerability of drinking water sources to contamination. In agricultural areas, we have applied multi-tracer hydrochronology to delineate groundwater stratigraphy, to identify paleowater, and to project future nitrate concentrations in long-screened wells. This presentation will describe examples in which groundwater dating and other tracer methods can be applied to directly address the SGMA criteria for sustainable groundwater pumping.

  1. The address in real time data driver card for the MicroMegas detector of the ATLAS muon upgrade

    International Nuclear Information System (INIS)

    Yao, L.; Polychronakos, V.; Chen, H.; Chen, K.; Xu, H.; Martoiu, S.; Felt, N.; Lazovich, T.

    2017-01-01

    The ART Data Driver Card (ADDC) will be used in the ATLAS muon upgrade to process and transmit the Address in Real Time (ART) signals, which are generated by the front-end chip (VMM) to indicate the location of the first above-threshold event. This ART signal is encoded to represent the address of the first threshold-crossing strip for trigger processing; the magnitude information is not included. The ADDC will be installed on the detector, in an environment of high radiation and magnetic field, so a custom ASIC (ART ASIC) will be used to receive the ART signals from the VMM and perform the hit-selection processing. Processed data from the ART ASIC will be transmitted off the detector to the trigger processor through a fiber connection. To evaluate the performance of the ADDC before the ART ASIC is produced, an FPGA-based prototype was built. This prototype includes most of the major components of the ADDC, with a Xilinx Artix-7 FPGA used to emulate the ART ASIC. The bench-test and integration-test results of this prototype are also described.

  2. The Source of Time-Correlated Photons at 1.064 μm and its Applications

    Directory of Open Access Journals (Sweden)

    Gostev P.P.

    2015-01-01

    The source of time-correlated photon pairs at 1064 nm is described. The source consists of a spontaneous parametric down-conversion (SPDC) generator, pumped by a cw laser operating at 532 nm, and the measuring and control appliances. One of the main parts of the electronic system is the "time-to-digital converter", which was designed and built by our group. The system allows creating and detecting correlations of photon pairs with a resolution better than 1 ns. We present the results of quantum key distribution through open air. The key length was about 5000 bits and the accuracy ~0.1%.

  3. Effects of detector-source distance and detector bias voltage variations on time resolution of general purpose plastic scintillation detectors.

    Science.gov (United States)

    Ermis, E E; Celiktas, C

    2012-12-01

    The effects of source-detector distance and detector bias voltage variations on the time resolution of a general-purpose plastic scintillation detector, BC400, were investigated. (133)Ba and (207)Bi calibration sources, with and without a collimator, were used in the present work. Optimum source-detector distance and bias voltage values were determined for the best time resolution by using the leading-edge timing method. The effect of collimator usage on time resolution was also investigated. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Estimates of Imaging Times for Conventional and Synchrotron X-Ray Sources

    CERN Document Server

    Kinney, J

    2003-01-01

    The following notes are to be taken as estimates of the time requirements for imaging NIF targets in three dimensions with absorption contrast. The estimates ignore target geometry and detector inefficiency, and focus only on the statistical question of detecting compositional (structural) differences between adjacent volume elements in the presence of noise. The basic equations, from the classic reference by Grodzins, consider imaging times in terms of the number of photons necessary to provide an image with a given resolution and noise. The time estimates, therefore, have been based on the calculated x-ray fluxes from the proposed Advanced Light Source (ALS) imaging beamline, and on the calculated flux for a tungsten-anode x-ray generator operated in point-focus mode.

  5. 3-D time-domain induced polarization tomography: a new approach based on a source current density formulation

    Science.gov (United States)

    Soueid Ahmed, A.; Revil, A.

    2018-04-01

    Induced polarization (IP) of porous rocks can be associated with a secondary source current density, which is proportional to both the intrinsic chargeability and the primary (applied) current density. This gives the possibility of reformulating the time domain induced polarization (TDIP) problem as a time-dependent self-potential-type problem. This new approach implies a change of strategy regarding data acquisition and inversion, allowing major time savings for both. For inverting TDIP data, we first retrieve the electrical resistivity distribution. Then, we use this electrical resistivity distribution to reconstruct the primary current density during the injection/retrieval of the (primary) current between the current electrodes A and B. The time-lapse secondary source current density distribution is determined given the primary source current density and a distribution of chargeability (forward modelling step). The inverse problem is linear between the secondary voltages (measured at all the electrodes) and the computed secondary source current density. A kernel matrix relating the secondary observed voltages data to the source current density model is computed once (using the electrical conductivity distribution), and then used throughout the inversion process. This recovered source current density model is in turn used to estimate the time-dependent chargeability (normalized voltages) in each cell of the domain of interest. Assuming a Cole-Cole model for simplicity, we can reconstruct the 3-D distributions of the relaxation time τ and the Cole-Cole exponent c by fitting the intrinsic chargeability decay curve to a Cole-Cole relaxation model for each cell. Two simple cases are studied in detail to explain this new approach. In the first case, we estimate the Cole-Cole parameters as well as the source current density field from a synthetic TDIP data set. Our approach is successfully able to reveal the presence of the anomaly and to invert its Cole
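
    The last step, fitting the intrinsic chargeability decay of a cell to a Cole-Cole relaxation, can be sketched as follows. This is an illustrative sketch rather than the authors' code: it uses the Mittag-Leffler series form of the time-domain (Pelton) Cole-Cole step-off decay and a simple grid search over (τ, c) with the chargeability m solved linearly; the grid ranges and sample times are assumptions.

```python
import numpy as np
from math import gamma

def cole_cole_decay(t, m, tau, c, nmax=80):
    """Step-off TDIP chargeability decay of a Pelton Cole-Cole model,
    eta(t) = m * E_c(-(t/tau)^c), via its Mittag-Leffler series.
    Accurate for moderate t/tau; tau is the relaxation time, c the exponent."""
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    for n in range(nmax):
        out += (-1.0) ** n * (t / tau) ** (n * c) / gamma(1.0 + n * c)
    return m * out

def fit_cole_cole(t, decay, taus, cs):
    """Grid search over (tau, c); chargeability m solved by linear least squares."""
    best = (np.inf, None)
    for tau in taus:
        for c in cs:
            basis = cole_cole_decay(t, 1.0, tau, c)
            m = float(basis @ decay) / float(basis @ basis)
            resid = float(((decay - m * basis) ** 2).sum())
            if resid < best[0]:
                best = (resid, (m, tau, c))
    return best[1]
```

    In practice the fit would be repeated independently for the decay curve recovered in each cell of the domain, yielding 3-D images of m, τ and c.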

  6. Modelling Choice of Information Sources

    Directory of Open Access Journals (Sweden)

    Agha Faisal Habib Pathan

    2013-04-01

    This paper addresses the significance of traveller information sources, including mono-modal and multimodal websites, for travel decisions. The research follows a decision paradigm developed earlier, involving an information acquisition process for travel choices, and identifies the abstract characteristics of new information sources that deserve further investigation (e.g. by incorporating these in models and studying their significance in model estimation). A Stated Preference experiment is developed and the utility functions are formulated by expanding the travellers' choice set to include different combinations of sources of information. In order to study the underlying choice mechanisms, the resulting variables are examined in models based on different behavioural strategies, including utility maximisation and minimising the regret associated with the foregone alternatives. This research confirmed that RRM (Random Regret Minimisation) theory can fruitfully be used and can provide important insights for behavioural studies. The study also analyses the properties of travel planning websites and establishes a link between travel choices and the content, provenance, design, presence of advertisements, and presentation of information. The results indicate that travellers give particular credence to government-owned sources and put more importance on their own previous experiences than on any other single source of information. Information from multimodal websites is more influential than that on train-only websites, which in turn is more influential than information from friends, while information from coach-only websites is the least influential. A website with short search time, specific information matching users' own criteria, and real-time information is regarded as most attractive.

  7. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side, shifting processing steps conventionally performed at the video encoder side to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  8. The recovery of a time-dependent point source in a linear transport equation: application to surface water pollution

    International Nuclear Information System (INIS)

    Hamdi, Adel

    2009-01-01

    The aim of this paper is to localize the position of a point source and recover the history of its time-dependent intensity function that is both unknown and constitutes the right-hand side of a 1D linear transport equation. Assuming that the source intensity function vanishes before reaching the final control time, we prove that recording the state with respect to the time at two observation points framing the source region leads to the identification of the source position and the recovery of its intensity function in a unique manner. Note that at least one of the two observation points should be strategic. We establish an identification method that determines quasi-explicitly the source position and transforms the task of recovering its intensity function into solving directly a well-conditioned linear system. Some numerical experiments done on a variant of the water pollution BOD model are presented

  9. Computing travel time when the exact address is unknown: a comparison of point and polygon ZIP code approximation methods.

    Science.gov (United States)

    Berke, Ethan M; Shi, Xun

    2009-04-29

    Travel time is an important metric of geographic access to health care. We compared strategies for estimating travel times when only subject ZIP code data were available. Using simulated data from New Hampshire and Arizona, we estimated travel times to the nearest cancer centers by using: 1) geometric centroids of ZIP code polygons as origins, 2) population centroids as origins, 3) service-area rings around each cancer center, assigning subjects to rings by assuming they are evenly distributed within their ZIP code, 4) service-area rings around each center, assuming the subjects follow the population distribution within the ZIP code. We used travel times based on street addresses as true values to validate the estimates. Population-based methods have smaller errors than geometry-based methods. Within categories (geometry or population), centroid and service-area methods have similar errors. Errors are smaller in urban areas than in rural areas. Population-based methods are superior to geometry-based methods, with the population centroid method appearing to be the best choice for estimating travel time. Estimates in rural areas are less reliable.
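
    The two centroid-style origin choices compared above can be illustrated with short routines. These are the generic formulas (the shoelace polygon centroid and a population-weighted mean over block points), not the authors' GIS implementation, and the coordinates are treated as planar.

```python
import numpy as np

def polygon_centroid(vertices):
    """Geometric centroid of a simple polygon (shoelace formula).
    vertices: (n, 2) array of planar coordinates in boundary order."""
    v = np.asarray(vertices, dtype=float)
    x, y = v[:, 0], v[:, 1]
    xs, ys = np.roll(x, -1), np.roll(y, -1)
    cross = x * ys - xs * y
    a = cross.sum() / 2.0                      # signed area
    cx = ((x + xs) * cross).sum() / (6.0 * a)
    cy = ((y + ys) * cross).sum() / (6.0 * a)
    return np.array([cx, cy])

def population_centroid(points, pop):
    """Population-weighted centroid from point locations (e.g. census blocks)."""
    p = np.asarray(points, dtype=float)
    w = np.asarray(pop, dtype=float)
    return (p * w[:, None]).sum(axis=0) / w.sum()
```

    The two origins coincide only when population is spread evenly over the polygon; the larger their separation, the more the geometry-based travel-time estimate can be expected to err.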

  10. A Bayesian Approach to Real-Time Earthquake Phase Association

    Science.gov (United States)

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

    Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach, based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity, has been in use now for many years. One of the most significant problems that has emerged over time with this approach is related to the extreme variations in seismic station density throughout the global seismic network. To address this problem we have developed a novel Bayesian association algorithm, which looks at the association problem as a dynamically evolving complex system of "many to many relationships". While the end result must be an array of one-to-many relations (one earthquake, many phases), during the association process the situation is quite different: both the evolving possible hypocenters and the relationships between phases and all nascent hypocenters are many to many (many earthquakes, many phases). The computational framework we are using to address this is a responsive, NoSQL graph database where the earthquake-phase associations are represented as intersecting Bayesian Learning Networks. The approach directly addresses the network inhomogeneity issue while at the same time allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.

  11. Radiation and occupational health: keynote address: the impact of radiation on health

    International Nuclear Information System (INIS)

    Shrimpton, P.C.

    1995-01-01

    This part of the address discusses the following issues: sources of exposure, effects of ionizing radiations, deterministic effects, stochastic effects, in utero exposure, and recommendations of radiation protection: principles, practices, intervention

  12. Time-of-flight small-angle scattering spectrometers on pulsed neutron sources

    International Nuclear Information System (INIS)

    Ostanevich, Yu.M.

    1987-01-01

    The operating principles, constructions, advantages and shortcomings of known time-of-flight small-angle neutron scattering (TOF SANS) spectrometers built at pulsed neutron sources are reviewed. The most important characteristics of TOF SANS instruments are a rather high luminosity and the possibility of measuring over an extremely wide range of scattering vectors in a single exposure. This is achieved by the simultaneous employment of a white beam, the TOF technique for the wavelength scan, and the commonly known angle scan. However, the electronic equipment, data-matching programs and measurement procedure necessary for accurate normalization of experimental data and their transformation into an absolute cross-section scale all become more complex compared with those of SANS instruments operating on steady-state neutron sources, where only the angle scan is used

  13. A Separation Algorithm for Sources with Temporal Structure Only Using Second-order Statistics

    Directory of Open Access Journals (Sweden)

    J.G. Wang

    2013-09-01

    Unlike conventional blind source separation (BSS), which deals with independent identically distributed (i.i.d.) sources, this paper addresses the separation of mixtures of sources with temporal structure, such as linear autocorrelations. Many sequential extraction algorithms have been reported, with inevitable accumulated errors introduced by the deflation scheme. We propose a robust separation algorithm that recovers the original sources simultaneously, through a joint diagonalizer of several averaged delayed covariance matrices at the optimal time delay and its integer multiples. The proposed algorithm is computationally simple and efficient, since it is based on second-order statistics only. Extensive simulation results confirm the validity and high performance of the algorithm. Compared with related extraction algorithms, its separation signal-to-noise ratio for a desired source can be up to 20 dB higher, and it seems rather insensitive to estimation error in the time delay.
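
    A minimal second-order-statistics separation in the same spirit can be sketched with a single delayed covariance matrix (an AMUSE-style whitening-plus-rotation step) rather than the paper's joint diagonalization over several delays; the test signals, mixing matrix and lag below are illustrative.

```python
import numpy as np

def amuse(X, tau=1):
    """Separate sources with distinct autocorrelations from instantaneous
    mixtures X (rows = sensors) using only second-order statistics."""
    X = X - X.mean(axis=1, keepdims=True)
    # 1) Whiten using the zero-lag covariance.
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = W @ X
    # 2) Eigendecompose the symmetrized covariance at time delay tau;
    #    distinct eigenvalues <=> sources separable by this delay.
    Ct = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
    Ct = (Ct + Ct.T) / 2.0
    _, U = np.linalg.eigh(Ct)
    return U.T @ W  # unmixing matrix: Y = (U.T @ W) @ X recovers the sources

# Usage: a sinusoid and a square wave, mixed by an arbitrary matrix.
t = np.arange(2000)
S = np.vstack([np.sin(2 * np.pi * 0.005 * t),
               np.sign(np.cos(2 * np.pi * 0.02 * t))])
A = np.array([[1.0, 0.6], [0.5, 1.0]])
Y = amuse(A @ S, tau=2) @ (A @ S)
```

    Averaging covariances over several delays, as the paper proposes, makes the eigen-structure (and hence the separation) robust when a single delay happens to give nearly equal eigenvalues.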

  14. Real-time source deformation modeling through GNSS permanent stations at Merapi volcano (Indonesia

    Science.gov (United States)

    Beauducel, F.; Nurnaning, A.; Iguchi, M.; Fahmi, A. A.; Nandaka, M. A.; Sumarti, S.; Subandriyo, S.; Metaxian, J. P.

    2014-12-01

    Mt. Merapi (Java, Indonesia) is one of the most active and dangerous volcanoes in the world. A first GPS repetition network was set up and periodically measured from 1993, allowing detection of a deep magma reservoir, quantification of magma flux in the conduit, and identification of shallow discontinuities around the former crater (Beauducel and Cornet, 1999; Beauducel et al., 2000, 2006). After the 2010 centennial eruption, when this network was almost completely destroyed, Indonesian and Japanese teams installed a new continuous GPS network for monitoring purposes (Iguchi et al., 2011), consisting of 3 stations located on the volcano flanks, plus a reference station at the Yogyakarta Observatory (BPPTKG). In the framework of the DOMERAPI project (2013-2016) we have completed this network with 5 additional stations, located in the summit area and around the volcano. The new stations are 1-Hz sampling GNSS (GPS + GLONASS) receivers, with near real-time data streaming to the Observatory. Automatic processing has been developed and included in the WEBOBS system (Beauducel et al., 2010), based on GIPSY software, computing precise daily moving solutions every hour and, for different time scales (2 months, 1 and 5 years), time series and velocity vectors. Real-time source modeling has also been implemented. It uses the depth-varying point source solution (Mogi, 1958; Williams and Wadge, 1998) in a systematic inverse-problem model exploration that displays location, volume variation and a 3-D probability map. The operational system should be able to better detect and estimate the location and volume variations of possible magma sources, and to follow magma transfer towards the surface. This should help monitoring and contribute to decision making during future unrest or eruptions.
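
    The point-source forward model underlying such an inversion can be sketched as follows. This is the classical Mogi (1958) surface-displacement formula for an elastic half-space, not the project's depth-varying implementation, and the parameter values in the usage example are arbitrary.

```python
import numpy as np

def mogi_displacement(r, depth, dV, nu=0.25):
    """Surface displacement of an elastic half-space due to a Mogi point source.

    r     : horizontal distance from the source axis (m)
    depth : source depth (m)
    dV    : source volume change (m^3); positive = inflation
    nu    : Poisson's ratio
    Returns (radial, vertical) displacement in metres.
    """
    R3 = (depth**2 + r**2) ** 1.5
    coeff = (1.0 - nu) / np.pi * dV
    return coeff * r / R3, coeff * depth / R3

# Usage (arbitrary values): 1e6 m^3 of inflation at 5 km depth
# produces roughly 9.5 mm of uplift directly above the source.
ur, uz = mogi_displacement(0.0, 5000.0, 1.0e6)
```

    Inverting for the source then amounts to searching over (position, depth, dV) for the model whose predicted displacements best match the GNSS time series, which is what the systematic model exploration above does.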

  15. The cohesiveness of sourcing teams

    DEFF Research Database (Denmark)

    Lidegaard, Nina

    2015-01-01

    Sourcing teams are introduced as an approach to achieving the interdepartmental integration necessary for companies to address the complexity of strategic sourcing. Companies aim at facilitating teams capable of balancing the goals and tasks of the team with departmental expectations; however...

  16. Decreasing Computational Time for VBBinaryLensing by Point Source Approximation

    Science.gov (United States)

    Tirrell, Bethany M.; Visgaitis, Tiffany A.; Bozza, Valerio

    2018-01-01

    The gravitational lens of a binary system produces a magnification map that is more intricate than that of a single-object lens. This map cannot be calculated analytically, and one must rely on computational methods to resolve it. There are generally two methods of computing the microlensed flux of a source. One is based on ray-shooting maps (Kayser, Refsdal, & Stabell 1986), while the other is based on an application of Green's theorem. This second method finds the area of an image by calculating a Riemann integral along the image contour. VBBinaryLensing is a C++ contour integration code developed by Valerio Bozza, which utilizes this method. The parameters at which the source object could be treated as a point source, in other words when the source is far enough from the caustic, were of interest in order to substantially decrease the computational time. The maximum and minimum values of the caustic curves produced were examined to determine the boundaries for which this simplification could be made. The code was then run for a number of different maps, with separation values and accuracies ranging from 10^-1 to 10^-3, to test the theoretical model and determine a safe buffer for which minimal error would be made by the approximation. The determined buffer was 1.5+5q, with q being the mass ratio. The theoretical model and the calculated points worked for all combinations of the separation values and accuracies except the map with accuracy and separation equal to 10^-3 for y1 max. An alternative approach has to be found in order to accommodate a wider range of parameters.

  17. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman speaks without bias and prejudice for the public good; technical jargon with unclear definitions exists within the radioactive nomenclature; and the scientific community keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Also of concern are the numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the resources of the federal and state health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community.

  18. Learning from hackers: open-source clinical trials.

    Science.gov (United States)

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  19. Locating the source of diffusion in complex networks by time-reversal backward spreading

    Science.gov (United States)

    Shen, Zhesi; Cao, Shinan; Wang, Wen-Xu; Di, Zengru; Stanley, H. Eugene

    2016-03-01

    Locating the source that triggers a dynamical process is a fundamental but challenging problem in complex networks, ranging from epidemic spreading in society and on the Internet to cancer metastasis in the human body. An accurate localization of the source is inherently limited by our ability to simultaneously access the information of all nodes in a large-scale complex network. This thus raises two critical questions: how do we locate the source from incomplete information and can we achieve full localization of sources at any possible location from a given set of observable nodes. Here we develop a time-reversal backward spreading algorithm to locate the source of a diffusion-like process efficiently and propose a general locatability condition. We test the algorithm by employing epidemic spreading and consensus dynamics as typical dynamical processes and apply it to the H1N1 pandemic in China. We find that the sources can be precisely located in arbitrary networks insofar as the locatability condition is assured. Our tools greatly improve our ability to locate the source of diffusion in complex networks based on limited accessibility of nodal information. Moreover, they have implications for controlling a variety of dynamical processes taking place on complex networks, such as inhibiting epidemics, slowing the spread of rumors, pollution control, and environmental protection.
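
    As a rough illustration of the idea (not the authors' implementation), the backward step can be sketched on an unweighted graph: each candidate source back-propagates the observed arrival times along shortest paths, and the best candidate is the one whose back-propagated origin times are most consistent (minimum variance):

    ```python
    from collections import deque

    def bfs_dist(adj, src):
        """Hop distances from src in an unweighted graph (adjacency dict)."""
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def locate_source(adj, arrivals):
        """Return the candidate node whose back-propagated origin times agree
        best across the observers (minimum variance). `arrivals` maps a few
        observable nodes to observed arrival times; unit speed per hop is
        assumed, standing in for the real propagation model."""
        best, best_var = None, float("inf")
        for cand in adj:
            d = bfs_dist(adj, cand)
            rev = [t - d[obs] for obs, t in arrivals.items()]
            mean = sum(rev) / len(rev)
            var = sum((x - mean) ** 2 for x in rev) / len(rev)
            if var < best_var:
                best, best_var = cand, var
        return best
    ```

    On a path graph 0-1-2-3-4 with arrivals observed at nodes 0, 3 and 4, the candidate whose reversed times collapse to a single origin time is the true source; the locatability condition in the paper characterizes when the observer set makes this answer unique.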

  20. Effects of Linear Falling Ramp Reset Pulse on Addressing Operation in AC PDP

    International Nuclear Information System (INIS)

    Liu Zujun; Liang Zhihu; Liu Chunliang; Meng Lingguo

    2006-01-01

    The effects of a linear falling ramp reset pulse on addressing operation in an alternating current plasma display panel (AC PDP) were studied. The wall charge waveforms were measured by the electrode balance method in a 12-inch coplanar AC PDP. The wall charge waveforms show the relationship between the slope ratio of the falling ramp reset pulse and the wall charges at the end of the pulse, which influence the addressing stability. The effects of the slope ratio of the linear falling ramp reset pulse on the addressing voltage and addressing time were then investigated. The experimental results show that the minimum addressing voltage increases with the slope ratio of the falling ramp reset pulse, and so does the minimum addressing time. Based on the experimental results, the optimization of the addressing time and the slope ratio of the falling ramp pulse is discussed.

  1. Using TANF to Finance Out-of-School Time Initiatives

    Science.gov (United States)

    Relave, Nanette; Flynn-Khan, Margaret

    2007-01-01

    This report addresses how the Temporary Assistance for Needy Families (TANF) program can be an important source of funding for maintaining, improving, and expanding out-of-school time initiatives. The report is designed to help policymakers and program developers understand the opportunities and challenges of effectively using TANF funding to…

  2. Time-limited effects of emotional arousal on item and source memory.

    Science.gov (United States)

    Wang, Bo; Sun, Bukuan

    2015-01-01

    Two experiments investigated the time-limited effects of emotional arousal on consolidation of item and source memory. In Experiment 1, participants memorized words (items) and the corresponding speakers (sources) and then took an immediate free recall test. Then they watched a neutral, positive, or negative video 5, 35, or 50 min after learning, and 24 hours later they took surprise memory tests. Experiment 2 was similar to Experiment 1 except that (a) a reality monitoring task was used; (b) elicitation delays of 5, 30, and 45 min were used; and (c) delayed memory tests were given 60 min after learning. Both experiments showed that, regardless of elicitation delay, emotional arousal did not enhance item recall memory. Second, both experiments showed that negative arousal enhanced delayed item recognition memory only at the medium elicitation delay, but not in the shorter or longer delays. Positive arousal enhanced performance only in Experiment 1. Third, regardless of elicitation delay, emotional arousal had little effect on source memory. These findings have implications for theories of emotion and memory, suggesting that emotion effects are contingent upon the nature of the memory task and elicitation delay.

  3. Review of Sealed Source Designs and Manufacturing Techniques Affecting Disused Source Management

    International Nuclear Information System (INIS)

    2012-10-01

    This publication presents an investigation of the influence of the design and technical features of sealed radioactive sources (SRSs) on predisposal and disposal activities when the sources become disused. The publication also addresses whether design modifications could contribute to safer and/or more efficient management of disused sources without compromising the benefits provided by the use of the sealed sources. This technical publication aims to collect information on the most typical design features and manufacturing techniques of sealed radioactive sources and examines how they affect the safe management of disused sealed radioactive sources (DSRS). The publication also aims to assist source designers and manufacturers by discussing design features that are important from the waste management point of view. It has been identified that most SRS manufacturers use similar geometries and materials for their designs and apply improved and reliable manufacturing techniques, e.g., double-encapsulation. These designs and manufacturing techniques have been proven over time to reduce contamination levels in fabrication and handling, and improve source integrity and longevity. The current source designs and materials ensure as well as possible that SRSs will maintain their integrity in use and when they become disused. No significant improvement options to current designs have been identified. However, some design considerations were identified as important to facilitate source retrieval, to increase the possibility of re-use and to ensure minimal contamination risk and radioactive waste generation at recycling. It was also concluded that legible identifying markings on a source are critical for DSRS management. The publication emphasizes the need for a common understanding of the radioactive source's recommended working life (RWL) for manufacturers and regulators. The conditions of use (COU) are important for the determination of RWL.

  4. Evaluating scintillator performance in time-resolved hard X-ray studies at synchrotron light sources.

    Science.gov (United States)

    Rutherford, Michael E; Chapman, David J; White, Thomas G; Drakopoulos, Michael; Rack, Alexander; Eakins, Daniel E

    2016-05-01

    The short pulse duration, small effective source size and high flux of synchrotron radiation are ideally suited for probing a wide range of transient deformation processes in materials under extreme conditions. In this paper, the challenges of high-resolution, time-resolved, indirect X-ray detection are reviewed in the context of dynamic synchrotron experiments. In particular, the discussion is targeted at two-dimensional integrating detector methods, such as those used in dynamic radiography and diffraction experiments. The response of a scintillator to periodic synchrotron X-ray excitation is modelled and validated against experimental data collected at the Diamond Light Source (DLS) and the European Synchrotron Radiation Facility (ESRF). An upper bound on the dynamic range accessible in a time-resolved experiment for a given bunch separation is calculated for a range of scintillators. New bunch structures are suggested for DLS and ESRF using the highest-performing commercially available crystal, LYSO:Ce, allowing time-resolved experiments with an interframe time of 189 ns and a maximum dynamic range of 98 (6.6 bits).
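
    The flavor of the dynamic-range bound can be illustrated with a toy single-exponential afterglow model (an assumption of this sketch; it ignores rise time and slow decay components, so it reproduces the order of magnitude, not the paper's exact figures):

    ```python
    import math

    def dynamic_range_bound(bunch_sep_ns, decay_time_ns):
        """Toy upper bound on single-bunch dynamic range: the ratio of the
        prompt signal to the residual afterglow of the preceding bunch,
        exp(dt / tau), for a single-exponential scintillator."""
        return math.exp(bunch_sep_ns / decay_time_ns)
    ```

    With LYSO:Ce's roughly 40 ns decay and a 189 ns bunch separation this toy model gives on the order of 10², the same order as the reported maximum of 98.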

  5. Acoustic emission source location in plates using wavelet analysis and cross time frequency spectrum.

    Science.gov (United States)

    Mostafapour, A; Davoodi, S; Ghareaghaji, M

    2014-12-01

    In this study, the theories of wavelet transform and cross time frequency spectrum (CTFS) are used to locate an AE source with frequency-varying wave velocity in plate-type structures. A rectangular array of four sensors is installed on the plate. When an impact is generated by an artificial AE source, such as the Hsu-Nielsen method of pencil lead breaking (PLB), at any position on the plate, the AE signals will be detected by the four sensors at different times. By wavelet packet decomposition, a packet of signals with a frequency range of 0.125-0.25 MHz is selected. The CTFS is calculated by the short-time Fourier transform of the cross-correlation between the considered packets captured by the AE sensors. The time delay is taken where the CTFS reaches its maximum value, and the corresponding frequency is extracted at this maximum. The resulting frequency is used, in combination with the dispersion curve, to calculate the group velocity of the wave. The resulting location error shows the high precision of the proposed algorithm. Copyright © 2014 Elsevier B.V. All rights reserved.
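
    The core time-delay step can be sketched with a plain discrete cross-correlation (the paper refines this with the STFT-based CTFS so that a frequency is extracted as well; this simplified version only finds the lag of the correlation peak):

    ```python
    def xcorr_delay(sig_a, sig_b):
        """Lag (in samples) at which sig_b best aligns with sig_a, taken at
        the peak of the discrete cross-correlation. A positive lag means
        sig_b is a delayed copy of sig_a."""
        n = len(sig_a)
        best_lag, best_val = 0, float("-inf")
        for lag in range(-(n - 1), n):
            s = sum(sig_a[i] * sig_b[i + lag]
                    for i in range(n) if 0 <= i + lag < n)
            if s > best_val:
                best_lag, best_val = lag, s
        return best_lag
    ```

    The delay in seconds is the returned lag divided by the sampling rate; combined with the group velocity from the dispersion curve, the delays from the four sensors constrain the source position.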

  6. Addressing concerns about the inclusion of premenstrual dysphoric disorder in DSM-5.

    Science.gov (United States)

    Hartlage, S Ann; Breaux, Cynthia A; Yonkers, Kimberly A

    2014-01-01

    Inclusion of premenstrual dysphoric disorder (PMDD) into the main text of the DSM has been a point of controversy for many years. The purpose of this article is to address the main concerns raised by opponents to its inclusion. Concerns are presented and countered in turn. To identify the most prevalent arguments against inclusion of PMDD, we searched MEDLINE (1966-2012), PsycINFO (1930-2012), the Internet, and reference lists of identified articles during September 1-17, 2012, using the keywords PMDD, premenstrual syndrome (PMS), DSM, DSM-5, concerns, controversy, women, political power, workforce, courts, and history. The search was restricted to English-language publications. A total of 55 articles were identified and included. The most pressing arguments against inclusion were grouped by similarity and addressed if they were reported 5 or more times. Our review of the sources yielded 38 concerns regarding PMDD; 6 concerns were reported at least 5 times and are addressed in this article. Evidence culled from historical and legal trends does not support the alleged societal use of PMS to harm women (eg, keeping women out of the workforce or using PMS against women in child custody disputes). Further, current epidemiologic research has answered all of the methodology criticisms of opponents. Studies have confirmed the existence of PMDD worldwide. The involvement of pharmaceutical companies in research has been questioned. However, irrespective of the level of association with industry, current research on PMDD has consistent results: PMDD exists in a minority of women. Historically, the pain and suffering of women have been dismissed, minimized, and negated. Similarly, women with PMDD have often had their experience invalidated. With the preponderance of evidence in its favor, PMDD has been placed in the main text of the DSM-5, opening the door for affected women to receive the attention full diagnostic status provides. © Copyright 2014 Physicians Postgraduate

  7. Assessing the Financial Benefits of Faster Development Times: The Case of Single-source Versus Multi-vendor Outsourced Biopharmaceutical Manufacturing.

    Science.gov (United States)

    DiMasi, Joseph A; Smith, Zachary; Getz, Kenneth A

    2018-05-10

    The extent to which new drug developers can benefit financially from shorter development times has implications for development efficiency and innovation incentives. We provided a real-world example of such gains by using recent estimates of drug development costs and returns. Time and fee data were obtained on 5 single-source manufacturing projects. Time and fees were modeled for these projects as if the drug substance and drug product processes had been contracted separately from 2 vendors. The multi-vendor model was taken as the base case, and financial impacts from single-source contracting were determined relative to the base case. The mean and median after-tax financial benefits of shorter development times from single-source contracting were $44.7 million and $34.9 million, respectively (2016 dollars). The after-tax increases in sponsor fees from single-source contracting were small in comparison (mean and median of $0.65 million and $0.25 million). For the data we examined, single-source contracting yielded substantial financial benefits over multi-source contracting, even after accounting for somewhat higher sponsor fees. Copyright © 2018 Elsevier HS Journals, Inc. All rights reserved.

  8. Conceptual design of a high-intensity positron source for the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Hulett, L.D.; Eberle, C.C.

    1994-12-01

    The Advanced Neutron Source (ANS) is a planned new basic and applied research facility based on a powerful steady-state research reactor that provides neutrons for measurements and experiments in the fields of materials science and engineering, biology, chemistry, materials analysis, and nuclear science. The useful neutron flux will be at least five times greater than is available in the world's best existing reactor facility. Construction of the ANS provides a unique opportunity to build a positron spectroscopy facility (PSF) with very-high-intensity beams based on the radioactive decay of a positron-generating isotope. The estimated maximum beam current is 1000 to 5000 times higher than that available at the world's best existing positron research facility. Such an improvement in beam capability, coupled with complementary detectors, will reduce experiment durations from months to less than one hour while simultaneously improving output resolution. This facility will remove the existing barriers to the routine use of positron-based analytical techniques and will be a giant step toward realization of the full potential of the application of positron spectroscopy to materials science. The ANS PSF is based on a batch cycle process using the ⁶⁴Cu isotope as the positron emitter and represents the status of the design at the end of last year. Recent work, not included in this report, has led to a proposal for placing the laboratory space for the positron experiments outside the ANS containment; however, the design of the positron source is not changed by that relocation. Hydraulic and pneumatic flight tubes transport the source material between the reactor and the positron source, where the beam is generated and conditioned. The beam is then transported through a beam pipe to one of several available detectors. The design presented here includes all systems necessary to support the positron source, but the beam pipe and detectors have not been addressed yet.

  9. Source modeling and inversion with near real-time GPS: a GITEWS perspective for Indonesia

    Science.gov (United States)

    Babeyko, A. Y.; Hoechner, A.; Sobolev, S. V.

    2010-07-01

    We present the GITEWS approach to source modeling for tsunami early warning in Indonesia. Near-field tsunamis impose special requirements on both warning time and the detail of source characterization. To meet these requirements, we employ geophysical and geological information to predefine a maximum number of rupture parameters. We discretize the tsunamigenic Sunda plate interface into an ordered grid of patches (150×25) and employ the concept of Green's functions for forward and inverse rupture modeling. Rupture Generator, a forward modeling tool, additionally employs different scaling laws and slip shape functions to construct physically reasonable source models using basic seismic information only (magnitude and epicenter location). GITEWS runs a library of semi- and fully-synthetic scenarios to be extensively employed in system testing as well as in teaching and training warning center personnel. Near real-time GPS observations are a very valuable complement to the local tsunami warning system. Their inversion provides a quick estimate (within a few minutes of an event) of the earthquake magnitude and rupture position and, given sufficient station coverage, details of the slip distribution.
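
    The Green's-function forward step is linear superposition: once per-patch Green's functions are precomputed, predicted station displacements are a matrix-vector product, which is what makes near real-time use feasible. A minimal sketch with toy dimensions (GITEWS uses a 150×25 patch grid and real elastic Green's functions):

    ```python
    def forward_displacement(green, slip):
        """Displacement at each GPS station as a linear superposition of
        per-patch Green's functions: d_k = sum_j green[k][j] * slip[j].
        `green` is a stations-by-patches matrix of unit-slip responses."""
        return [sum(g_kj * s_j for g_kj, s_j in zip(row, slip))
                for row in green]
    ```

    Inversion then amounts to solving this linear system for the slip vector in a least-squares sense from the observed GPS displacements.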

  10. Source size and time dependence of multifragmentation induced by GeV 3He beams

    International Nuclear Information System (INIS)

    Wang, G.; Kwiatkowski, K.; Bracken, D.S.; Renshaw Foxford, E.; Hsi, W.; Morley, K.B.; Viola, V.E.; Yoder, N.R.; Volant, C.; Legrain, R.; Pollacco, E.C.; Korteling, R.G.; Botvina, A.; Brzychczyk, J.; Breuer, H.

    1999-01-01

    To investigate the source size and time dependence of multifragmentation reactions, small- and large-angle relative velocity correlations between coincident complex fragments have been measured for the 1.8-4.8 GeV ³He + natAg, ¹⁹⁷Au systems. The results support an evolutionary scenario for the fragment emission process in which lighter IMFs (Z ≲ 6) are emitted from a hot, more dense source prior to breakup of an expanded residue. For the most highly excited residues, for which there is a significant yield of fragments with very soft energy spectra (E/A ≤ 3 MeV), comparisons with an N-body simulation suggest a breakup time of τ ∼ 50 fm/c for the expanded residue. Comparison of these data with both the evolutionary expanding emitting source model and the Copenhagen statistical multifragmentation model shows good agreement for heavier IMFs formed in the final breakup stage, but only the evolutionary model is successful in accounting for the lighter IMFs. Copyright 1999 The American Physical Society

  11. GANIL Workshop on Ion Sources

    International Nuclear Information System (INIS)

    Leroy, Renan

    1999-01-01

    The proceedings of the GANIL Workshop on Ion Sources, held at GANIL, Caen, on 18-19 March 1999, contain 13 papers aimed at improving existing source operation and developing new types of sources for nuclear research and studies of ion physics. A number of reports are devoted to applications such as surface treatment, ion implantation and fusion injection. The 1+→n+ transformation, charged particle transport in ECR sources, the addition of cesium and xenon in negative ion sources, and other basic issues in ion sources are also addressed.

  12. Opportunities for Neutrino Physics at the Spallation Neutron Source: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Bolozdynya, A. [Moscow Phys. Eng. Inst.; Cavanna, F. [INFN, Aquila; Efremenko, Y. [Tennessee U.; Garvey, G. T. [Los Alamos; Gudkov, V. [South Carolina U.; Hatzikoutelis, A. [Tennessee U.; Hix, W. R. [Oak Ridge; Louis, W. C. [Los Alamos; Link, J. M. [Virginia Tech.; Markoff, D. M. [North Carolina Central U.; Mills, G. B. [Los Alamos; Patton, K. [North Carolina State U.; Ray, H. [Florida U.; Scholberg, K. [Duke U.; Van de Water, R. G. [Los Alamos; Virtue, C. [Laurentian U.; White, D. H. [Los Alamos; Yen, S. [TRIUMF; Yoo, J. [Fermilab

    2012-11-01

    The Spallation Neutron Source (SNS) at Oak Ridge National Laboratory, Tennessee, provides an intense flux of neutrinos in the few tens-of-MeV range, with a sharply-pulsed timing structure that is beneficial for background rejection. In this document, the product of a workshop at the SNS in May 2012, we describe this free, high-quality stopped-pion neutrino source and outline various physics that could be done using it. We describe without prioritization some specific experimental configurations that could address these physics topics.

  13. HYSPEC : A CRYSTAL TIME OF FLIGHT HYBRID SPECTROMETER FOR THE SPALLATION NEUTRON SOURCE

    International Nuclear Information System (INIS)

    SHAPIRO, S.M.; ZALIZNYAK, I.A.

    2002-01-01

    This document lays out a proposal by the Instrument Development Team (IDT) composed of scientists from leading Universities and National Laboratories to design and build a conceptually new high-flux inelastic neutron spectrometer at the pulsed Spallation Neutron Source (SNS) at Oak Ridge. This instrument is intended to supply users of the SNS and scientific community, of which the IDT is an integral part, with a platform for ground-breaking investigations of the low-energy atomic-scale dynamical properties of crystalline solids. It is also planned that the proposed instrument will be equipped with a polarization analysis capability, therefore becoming the first polarized beam inelastic spectrometer in the SNS instrument suite, and the first successful polarized beam inelastic instrument at a pulsed spallation source worldwide. The proposed instrument is designed primarily for inelastic and elastic neutron spectroscopy of single crystals. In fact, the most informative neutron scattering studies of the dynamical properties of solids nearly always require single crystal samples, and they are almost invariably flux-limited. In addition, in measurements with polarization analysis the available flux is reduced through selection of the particular neutron polarization, which puts even more stringent limits on the feasibility of a particular experiment. To date, these investigations have mostly been carried out on crystal spectrometers at high-flux reactors, which usually employ focusing Bragg optics to concentrate the neutron beam on a typically small sample. Construction at Oak Ridge of the high-luminosity spallation neutron source, which will provide intense pulsed neutron beams with time-averaged fluxes equal to those at medium-flux reactors, opens entirely new opportunities for single crystal neutron spectroscopy. Drawing upon experience acquired during decades of studies with both crystal and time-of-flight (TOF) spectrometers, the IDT has developed a conceptual

  15. Time-resolved X-ray scattering program at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Rodricks, B.

    1994-01-01

    The Time-Resolved Scattering Program's goal is the development of instruments and techniques for time-resolved studies. This entails the development of wide bandpass and focusing optics, high-speed detectors, mechanical choppers, and components for the measurement and creation of changes in samples. Techniques being developed are pump-probe experiments, single-bunch scattering experiments, high-speed white and pink beam Laue scattering, and nanosecond to microsecond synchronization of instruments. This program will be carried out primarily from a white-beam, bend-magnet source, experimental station, 1-BM-B, that immediately follows the first optics enclosure (1-BM-A). This paper will describe the experimental station and instruments under development to carry out the program

  16. Sealed radioactive sources toolkit

    International Nuclear Information System (INIS)

    Mac Kenzie, C.

    2005-09-01

    The IAEA has developed a Sealed Radioactive Sources Toolkit to provide information to key groups about the safety and security of sealed radioactive sources. The key groups addressed are officials in government agencies, medical users, industrial users and the scrap metal industry. The general public may also benefit from an understanding of the fundamentals of radiation safety

  17. A New Method of Chinese Address Extraction Based on Address Tree Model

    Directory of Open Access Journals (Sweden)

    KANG Mengjun

    2015-01-01

    An address encodes the spatial location of an individual geographical area. In China, address planning has lagged behind the rapid development of cities, resulting in a large number of non-standard addresses. The spatial constraint relationships of the standard address model are analyzed in this paper, and a new method of standard address extraction based on a tree model is proposed, which regards topological relationships as the consistency criteria for spatial constraints. With this method, standard addresses can be extracted and errors can be excluded from non-standard addresses. Results indicate that a higher match rate can be obtained with this method.
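
    A minimal sketch of the containment idea (toy data and names; the actual model and matching rules are more elaborate): an address is standard only if each component is topologically contained in the previous one, i.e. each token is a child of the preceding node in the address tree.

    ```python
    # Toy address tree: province -> city -> district. Real trees would be
    # built from authoritative gazetteer data.
    TREE = {"Hubei": {"Wuhan": {"Qingshan": {}}}}

    def match_address(tokens, tree=TREE):
        """Walk the tree level by level; the address is standard only if
        every token is a child of the previous node (containment rule)."""
        node = tree
        for tok in tokens:
            if tok not in node:
                return False
            node = node[tok]
        return True
    ```

    An address that skips or scrambles a level fails the walk, which is how errors are excluded from non-standard addresses.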

  18. Space-time quantitative source apportionment of soil heavy metal concentration increments.

    Science.gov (United States)

    Yang, Yong; Christakos, George; Guo, Mingwu; Xiao, Lu; Huang, Wei

    2017-04-01

    Assessing the space-time trends and detecting the sources of heavy metal accumulation in soils have important consequences for the prevention and treatment of soil heavy metal pollution. In this study, we collected soil samples in the eastern part of the Qingshan district, Wuhan city, Hubei Province, China, during the period 2010-2014. The Cd, Cu, Pb and Zn concentrations in soils exhibited a significant accumulation during 2010-2014. The spatiotemporal Kriging technique, based on a quantitative characterization of soil heavy metal concentration variations in terms of non-separable variogram models, was employed to estimate the spatiotemporal soil heavy metal distribution in the study region. Our findings showed that the Cd, Cu, and Zn concentrations have an obvious incremental tendency from the southwestern to the central part of the study region, whereas the Pb concentrations exhibited an obvious tendency from the northern part to the central part of the region. Spatial overlay analysis was then used to obtain absolute and relative concentration increments of adjacent 1- or 5-year periods during 2010-2014. The spatial distribution of soil heavy metal concentration increments showed that the larger increments occurred in the center of the study region. Lastly, principal component analysis combined with the multiple linear regression method was employed to quantify the source apportionment of the soil heavy metal concentration increments in the region. Our results led to the conclusion that the sources of soil heavy metal concentration increments should be ascribed to industry, agriculture and traffic; in particular, 82.5% of the soil heavy metal concentration increment during 2010-2014 was ascribed to industrial/agricultural activity sources. Highlights: spatiotemporal Kriging (STK) and spatial overlay analysis (SOA) were used to obtain the spatial distribution of heavy metal concentration increments in soils, and PCA-MLR was used to quantify their source apportionment. Copyright © 2017
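
    The overlay step is elementwise arithmetic on two gridded concentration surfaces. A minimal sketch (hypothetical function; the real surfaces come from the Kriging estimates):

    ```python
    def concentration_increments(earlier, later):
        """Absolute and relative increments between two gridded concentration
        surfaces of the same shape, as in the spatial overlay step: for each
        cell, abs = later - earlier and rel = (later - earlier) / earlier."""
        abs_inc = [[b - a for a, b in zip(row_a, row_b)]
                   for row_a, row_b in zip(earlier, later)]
        rel_inc = [[(b - a) / a for a, b in zip(row_a, row_b)]
                   for row_a, row_b in zip(earlier, later)]
        return abs_inc, rel_inc
    ```

    The resulting increment grids, rather than the raw concentrations, are what the PCA-MLR apportionment is applied to.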

  19. Duplicate Address Detection Table in IPv6 Mobile Networks

    Science.gov (United States)

    Alisherov, Farkhod; Kim, Taihoon

    In IP networks, each computer or piece of communication equipment needs an IP address. To supply enough IP addresses, the new Internet protocol IPv6 is used in next-generation mobile communication. Although IPv6 improves on the existing IPv4 Internet protocol, its Duplicate Address Detection (DAD) mechanism may consume resources and suffer from long delays. DAD is used to ensure that an IP address is unique. When a mobile node performs an inter-domain handoff, it will first generate a new IP address and perform a DAD procedure. The DAD procedure not only wastes time but also increases the signaling load on the Internet. In this paper, the authors propose a new DAD mechanism to speed up the DAD procedure. A DAD table is created in the access or mobility routers of the IP network to record all IP addresses of the area. When a new IP address needs to perform DAD, the uniqueness of the address can be confirmed simply by searching the DAD table.
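
    A minimal sketch of the proposed lookup (names are illustrative, not from the paper): the router keeps a set of all addresses in its area, so confirming uniqueness becomes a membership test instead of a multicast probe followed by a timeout.

    ```python
    class DadTable:
        """Toy per-router DAD table: records every address in the area so
        that duplicate detection is a local lookup."""

        def __init__(self):
            self._addrs = set()

        def is_unique(self, addr):
            """True if no node in the area has claimed this address."""
            return addr not in self._addrs

        def register(self, addr):
            """Claim an address; returns False if it is already in use."""
            if addr in self._addrs:
                return False
            self._addrs.add(addr)
            return True
    ```

    During a handoff, the mobile node's tentative address is checked against the table and either registered immediately or rejected, avoiding the standard DAD wait.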

  20. Source-independent time-domain waveform inversion using convolved wavefields: Application to the encoded multisource waveform inversion

    KAUST Repository

    Choi, Yun Seok

    2011-09-01

    Full waveform inversion requires a good estimation of the source wavelet to improve our chances of a successful inversion. This is especially true for an encoded multisource time-domain implementation, which, conventionally, requires separate-source modeling, as well as the Fourier transform of wavefields. As an alternative, we have developed a source-independent time-domain waveform inversion using convolved wavefields. Specifically, the misfit function consists of the convolution of the observed wavefields with a reference trace from the modeled wavefield, plus the convolution of the modeled wavefields with a reference trace from the observed wavefield. In this case, the source wavelets of the observed and modeled wavefields are equally convolved with both terms in the misfit function, and thus the effects of the source wavelets are eliminated. Furthermore, because the modeled wavefields play the role of low-pass filtering the observed wavefields in the misfit function, the frequency-selection strategy from low to high can be adopted simply by setting the maximum frequency of the source wavelet of the modeled wavefields; thus, no filtering is required. The gradient of the misfit function is computed by back-propagating the new residual seismograms and applying the imaging condition, similar to reverse-time migration. In the synthetic data evaluations, our waveform inversion yields inverted models that are close to the true model, but demonstrates, as predicted, some limitations when random noise is added to the synthetic data. We also realized that an average of traces is a better choice for the reference trace than a single trace. © 2011 Society of Exploration Geophysicists.
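    The wavelet cancellation described above follows from the commutativity of convolution, and can be demonstrated on toy traces. The names and synthetic data below are illustrative assumptions, not the paper's code:

```python
import numpy as np

# Convolved-wavefield residual: (d_obs * ref_syn) - (d_syn * ref_obs),
# where '*' denotes convolution. Because convolution commutes, the unknown
# source wavelets of the observed and modeled data cancel.
def convolved_residual(d_obs, d_syn, ref_obs, ref_syn):
    return np.array([np.convolve(o, ref_syn) - np.convolve(s, ref_obs)
                     for o, s in zip(d_obs, d_syn)])

rng = np.random.default_rng(1)
g = rng.normal(size=(3, 32))                             # shared Green's functions
w_obs, w_syn = rng.normal(size=16), rng.normal(size=16)  # different wavelets
d_obs = np.array([np.convolve(w_obs, gi) for gi in g])
d_syn = np.array([np.convolve(w_syn, gi) for gi in g])

# With a correct earth model, the residual vanishes even though w_obs != w_syn
r = convolved_residual(d_obs, d_syn, d_obs[0], d_syn[0])
print(np.abs(r).max() < 1e-8)                            # True
```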

  1. Keynote address

    International Nuclear Information System (INIS)

    Farlinger, W.

    1997-01-01

    In this second keynote address of the conference, Mr. Farlinger, Chairman of Ontario Hydro, attempted to respond to some of the criticisms levelled at the Corporation in the course of the Macdonald Committee process. He appeared to be particularly vexed by the criticism of IPPSO, saying that, in effect, they were 'beating up on their only customer' at a time when Hydro was being pulled in several different directions and was facing pressure from a jurisdictional dispute with municipal electric utilities (MEUs). Nevertheless, he agreed with the need for restructuring. He defended Hydro by saying that the Macdonald Report in fact represented a vindication of the position Ontario Hydro had taken, particularly on such issues as open competition, customer choice, rationalization of the distribution system, and termination of Hydro's monopoly position. At the same time, he objected to the Report's assertion that dismantling the generation system into smaller units would be in the best interest of the people of Ontario. He suggested that there would be several large US utility companies willing and able to fill the vacuum if there was no large company with its head office in Ontario to stake its claim to the provincial market

  2. Winter Annual Weed Response to Nitrogen Sources and Application Timings prior to a Burndown Corn Herbicide

    Directory of Open Access Journals (Sweden)

    Kelly A. Nelson

    2015-01-01

    Autumn and early preplant N applications, sources, and placement may affect winter annual weed growth. Field research evaluated (1) the effect of different nitrogen sources applied in autumn and early preplant on total winter annual weed growth (2006–2010), and (2) strip-till and broadcast no-till N applied in autumn and early preplant on henbit (Lamium amplexicaule L.) growth (2008–2010) prior to a burndown herbicide application. Total winter annual weed biomass was greater than in the nontreated control when applying certain N sources in autumn or early preplant for no-till corn. Anhydrous ammonia had the lowest average weed density (95 weeds m−2), though results were inconsistent over the years. Winter annual weed biomass was lowest (43 g m−2) when applying 32% urea ammonium nitrate in autumn, and was similar to applying anhydrous ammonia in autumn or early preplant and to the nontreated control. Henbit biomass was 28% greater when applying N in the autumn compared to an early preplant application timing. Nitrogen placement, along with the associated tillage with strip-till placement, was important in reducing henbit biomass. Nitrogen source selection, application timing, and placement affected the impact of N on winter annual weed growth and should be considered when recommending a burndown herbicide application timing.

  3. Source detection at 100 meter standoff with a time-encoded imaging system

    International Nuclear Information System (INIS)

    Brennan, J.; Brubaker, E.; Gerling, M.; Marleau, P.; Monterial, M.

    2017-01-01

    Here, we present the design, characterization, and testing of a laboratory prototype radiological search and localization system. The system, based on time-encoded imaging, uses the attenuation signature of neutrons in time, induced by the geometrical layout and motion of the system. We have demonstrated the ability to detect a ~1 mCi 252Cf radiological source at 100 m standoff, with 90% detection efficiency and 10% false positives against background, in 12 min. The same detection efficiency is met in 15 s at a 40 m standoff, and in 1.2 s at a 20 m standoff.

  4. Real-time analysis, visualization, and steering of microtomography experiments at photon sources

    International Nuclear Information System (INIS)

    Laszewski, G. von; Insley, J.A.; Foster, I.; Bresnahan, J.; Kesselman, C.; Su, M.; Thiebaux, M.; Rivers, M.L.; Wang, S.; Tieman, B.; McNulty, I.

    2000-01-01

    A new generation of specialized scientific instruments called synchrotron light sources allow the imaging of materials at very fine scales. However, in contrast to a traditional microscope, interactive use has not previously been possible because of the large amounts of data generated and the considerable computation required to translate these data into a useful image. The authors describe a new software architecture that uses high-speed networks and supercomputers to enable quasi-real-time and hence interactive analysis of synchrotron light source data. This architecture uses technologies provided by the Globus computational grid toolkit to allow dynamic creation of a reconstruction pipeline that transfers data from a synchrotron source beamline to a preprocessing station, next to a parallel reconstruction system, and then to multiple visualization stations. Collaborative analysis tools allow multiple users to control data visualization. As a result, local and remote scientists can see and discuss preliminary results just minutes after data collection starts. The implications for more efficient use of this scarce resource and for more effective science appear tremendous

  5. An elementary solution of the Maxwell equations for a time-dependent source

    International Nuclear Information System (INIS)

    Rivera, R; Villarroel, D

    2002-01-01

    We present an elementary solution of the Maxwell equations for a time-dependent source consisting of an infinite solenoid with a current density that increases linearly with time. The geometrical symmetries and the time dependence of the current density make possible a mathematical treatment that does not involve the usual technical difficulties, thus making this presentation suitable for students who are taking a first course in electromagnetism. We also show that the electric field generated by the solenoid can be used to construct an exact solution of the relativistic equation of motion of the electron that takes into account the effect of radiation. In particular, we derive, in an almost trivial way, the formula for the radiation rate of an electron in circular motion
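    For reference, the standard quasi-static result for this configuration takes the form below. This is the textbook solution for an ideal infinite solenoid of radius R with n turns per unit length carrying I(t) = kt (SI units), not necessarily the paper's notation:

```latex
% Magnetic field of the ideal infinite solenoid with I(t) = k t
\mathbf{B} = \mu_0 n k t\,\hat{\mathbf{z}} \quad (r < R), \qquad
\mathbf{B} = \mathbf{0} \quad (r > R)
% Faraday's law, \oint \mathbf{E}\cdot d\boldsymbol{\ell} = -\,\mathrm{d}\Phi_B/\mathrm{d}t,
% then gives the induced azimuthal electric field:
E_{\varphi} = -\frac{\mu_0 n k}{2}\, r \quad (r < R), \qquad
E_{\varphi} = -\frac{\mu_0 n k}{2}\, \frac{R^{2}}{r} \quad (r > R)
```

    Because k is constant, the induced electric field is static, which is what makes the exact treatment of the electron's equation of motion tractable.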

  6. Open source development

    DEFF Research Database (Denmark)

    Ulhøi, John Parm

    2004-01-01

    This paper addresses innovations based on open source or non-proprietary knowledge. Viewed through the lens of private property theory, such agency appears to be a true anomaly. However, by a further turn of the theoretical kaleidoscope, we will show that there may be perfectly justifiable reasons...... for not regarding open source innovations as anomalies. The paper is based on three sectorial and generic cases of open source innovation, which is an offspring of contemporary theory made possible by combining elements of the model of private agency with those of the model of collective agency. In closing...

  7. Supplementary household water sources to augment potable ...

    African Journals Online (AJOL)

    This paper addresses on-site supplementary household water sources with a focus on groundwater abstraction, rainwater harvesting and greywater reuse as available non-potable water sources to residential consumers. An end-use model is presented and used to assess the theoretical impact of household water sources ...

  8. The first synchrotron infrared beamlines at the Advanced Light Source: Microspectroscopy and fast timing

    International Nuclear Information System (INIS)

    Martin, M.C.; McKinney, W.R.

    1998-05-01

    A set of new infrared (IR) beamlines on the 1.4 bending magnet port at the Advanced Light Source, LBNL, are described. Using a synchrotron as an IR source provides considerable brightness advantages, which manifests itself most beneficially when performing spectroscopy on a microscopic length scale. Beamline (BL) 1.4.3 is a dedicated microspectroscopy beamline, where the much smaller focused spot size using the synchrotron source is utilized. This enables an entirely new set of experiments to be performed where spectroscopy on a truly microscopic scale is now possible. BL 1.4.2 consists of a vacuum FTIR bench with a wide spectral range and step-scan capabilities. The fast timing is demonstrated by observing the synchrotron electron storage pattern at the ALS

  9. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    Science.gov (United States)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on the rapid estimate of P-wave magnitude, which generally contains large uncertainties and suffers from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following updates even decreased the magnitude to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes. This consequently led to underestimated tsunami heights. By using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that we would theoretically have been able to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture process. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s by using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique is helpful to reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  10. 4th generation light source instrumentation

    International Nuclear Information System (INIS)

    Lumpkin, A.

    1998-01-01

    This working group on 4th Generation Light Source (4GLS) Instrumentation was a follow-up to the opening discussion on Challenges in Beam Profiling. It was held in parallel with the Feedback Systems session. We filled the SSRL Conference Room with about 25 participants. The session opened with an introduction by Lumpkin. The target beam parameter values for a few-angstrom, self-amplified spontaneous emission (SASE) experiment and for a diffraction-limited soft x-ray storage ring source were addressed. Instrument resolution would of course need to be 2-3 times better than the value measured, if possible. The nominal targeted performance parameters are emittance (1-2π mm mrad), bunch length (100 fs), peak current (1-5 kA), beam size (10 μm), beam divergence (1 μrad), energy spread (2 × 10^-4), and beam energy (tens of GeV). These are mostly the SASE values, and the parameters for a diffraction-limited soft x-ray source would be somewhat relaxed. Beam stability and alignment specifications in the sub-micron domain for either device are anticipated

  11. The space-time outside a source of gravitational radiation: the axially symmetric null fluid

    Energy Technology Data Exchange (ETDEWEB)

    Herrera, L. [Universidad Central de Venezuela, Escuela de Fisica, Facultad de Ciencias, Caracas (Venezuela, Bolivarian Republic of); Universidad de Salamanca, Instituto Universitario de Fisica Fundamental y Matematicas, Salamanca (Spain); Di Prisco, A. [Universidad Central de Venezuela, Escuela de Fisica, Facultad de Ciencias, Caracas (Venezuela, Bolivarian Republic of); Ospino, J. [Universidad de Salamanca, Departamento de Matematica Aplicada and Instituto Universitario de Fisica Fundamental y Matematicas, Salamanca (Spain)

    2016-11-15

    We carry out a study of the exterior of an axially and reflection symmetric source of gravitational radiation. The exterior of such a source is filled with a null fluid produced by the dissipative processes inherent to the emission of gravitational radiation, thereby representing a generalization of the Vaidya metric to axially and reflection symmetric space-times. The role of the vorticity, and its relationship with the presence of gravitational radiation, is highlighted. The spherically symmetric case (Vaidya) is, asymptotically, recovered within the context of the 1 + 3 formalism. (orig.)

  12. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging of complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA, and serial links, to make the system faster and stand-alone (independent from a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the Address-Event-based network communication and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA
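    The time-multiplexing idea behind AER can be sketched in a few lines. This is a toy illustration with assumed names, not the platform's firmware: each spike travels on the shared channel as the address of the neuron that fired, so more active neurons simply occupy the bus more often.

```python
from dataclasses import dataclass

@dataclass
class AddressEvent:
    timestamp_us: int   # microsecond timestamp assigned at the sender
    address: int        # identity of the spiking neuron

def encode_spikes(spike_times_us):
    """Merge per-neuron spike trains into one chronologically ordered event stream."""
    events = [AddressEvent(t, addr)
              for addr, times in spike_times_us.items()
              for t in times]
    return sorted(events, key=lambda e: e.timestamp_us)

# Neuron 7 fires twice, neuron 3 once: the busier neuron uses the bus more.
stream = encode_spikes({7: [10, 250], 3: [120]})
print([(e.timestamp_us, e.address) for e in stream])  # [(10, 7), (120, 3), (250, 7)]
```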

  13. The Space-, Time-, and Energy-distribution of Neutrons from a Pulsed Plane Source

    Energy Technology Data Exchange (ETDEWEB)

    Claesson, Arne

    1962-05-15

    The space-, time- and energy-distribution of neutrons from a pulsed, plane, high energy source in an infinite medium is determined in a diffusion approximation. For simplicity the moderator is first assumed to be hydrogen gas but it is also shown that the method can be used for a moderator of arbitrary mass.

  14. Calibration of time of flight detectors using laser-driven neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Mirfayzi, S. R.; Kar, S., E-mail: s.kar@qub.ac.uk; Ahmed, H.; Green, A.; Alejo, A.; Jung, D. [Centre for Plasma Physics, School of Mathematics and Physics, Queen’s University Belfast, Belfast BT7 1NN (United Kingdom); Krygier, A. G.; Freeman, R. R. [Department of Physics, The Ohio State University, Columbus, Ohio 43210 (United States); Clarke, R. [Central Laser Facility, Rutherford Appleton Laboratory, Didcot, Oxfordshire OX11 0QX (United Kingdom); Fuchs, J.; Vassura, L. [LULI, Ecole Polytechnique, CNRS, Route de Saclay, 91128 Palaiseau Cedex (France); Kleinschmidt, A.; Roth, M. [Institut für Kernphysik, Technische Universität Darmstadt, Schloßgartenstrasse 9, D-64289 Darmstadt,Germany (Germany); Morrison, J. T. [Propulsion Systems Directorate, Air Force Research Lab, Wright Patterson Air Force Base, Ohio 45433 (United States); Najmudin, Z.; Nakamura, H. [Blackett Laboratory, Department of Physics, Imperial College, London SW7 2AZ (United Kingdom); Norreys, P. [Central Laser Facility, Rutherford Appleton Laboratory, Didcot, Oxfordshire OX11 0QX (United Kingdom); Department of Physics, University of Oxford, Oxford OX1 3PU (United Kingdom); Oliver, M. [Department of Physics, University of Oxford, Oxford OX1 3PU (United Kingdom); Zepf, M. [Centre for Plasma Physics, School of Mathematics and Physics, Queen’s University Belfast, Belfast BT7 1NN (United Kingdom); Helmholtz Institut Jena, D-07743 Jena (Germany); Borghesi, M. [Centre for Plasma Physics, School of Mathematics and Physics, Queen’s University Belfast, Belfast BT7 1NN (United Kingdom); Institute of Physics of the ASCR, ELI-Beamlines Project, Na Slovance 2, 18221 Prague (Czech Republic)

    2015-07-15

    Calibration of three scintillators (EJ232Q, BC422Q, and EJ410) in a time-of-flight arrangement using a laser-driven neutron source is presented. The three plastic scintillator detectors were calibrated with gamma insensitive bubble detector spectrometers, which were absolutely calibrated over a wide range of neutron energies ranging from sub-MeV to 20 MeV. A typical set of data obtained simultaneously by the detectors is shown, measuring the neutron spectrum emitted from a petawatt laser irradiated thin foil.

  15. Calibration of time of flight detectors using laser-driven neutron source

    Science.gov (United States)

    Mirfayzi, S. R.; Kar, S.; Ahmed, H.; Krygier, A. G.; Green, A.; Alejo, A.; Clarke, R.; Freeman, R. R.; Fuchs, J.; Jung, D.; Kleinschmidt, A.; Morrison, J. T.; Najmudin, Z.; Nakamura, H.; Norreys, P.; Oliver, M.; Roth, M.; Vassura, L.; Zepf, M.; Borghesi, M.

    2015-07-01

    Calibration of three scintillators (EJ232Q, BC422Q, and EJ410) in a time-of-flight arrangement using a laser-driven neutron source is presented. The three plastic scintillator detectors were calibrated with gamma insensitive bubble detector spectrometers, which were absolutely calibrated over a wide range of neutron energies ranging from sub-MeV to 20 MeV. A typical set of data obtained simultaneously by the detectors is shown, measuring the neutron spectrum emitted from a petawatt laser irradiated thin foil.

  16. Calibration of time of flight detectors using laser-driven neutron source

    International Nuclear Information System (INIS)

    Mirfayzi, S. R.; Kar, S.; Ahmed, H.; Green, A.; Alejo, A.; Jung, D.; Krygier, A. G.; Freeman, R. R.; Clarke, R.; Fuchs, J.; Vassura, L.; Kleinschmidt, A.; Roth, M.; Morrison, J. T.; Najmudin, Z.; Nakamura, H.; Norreys, P.; Oliver, M.; Zepf, M.; Borghesi, M.

    2015-01-01

    Calibration of three scintillators (EJ232Q, BC422Q, and EJ410) in a time-of-flight arrangement using a laser-driven neutron source is presented. The three plastic scintillator detectors were calibrated with gamma insensitive bubble detector spectrometers, which were absolutely calibrated over a wide range of neutron energies ranging from sub-MeV to 20 MeV. A typical set of data obtained simultaneously by the detectors is shown, measuring the neutron spectrum emitted from a petawatt laser irradiated thin foil

  17. Atmospheric Nitrogen Deposition in the Western United States: Sources, Sinks and Changes over Time

    Science.gov (United States)

    Anderson, Sarah Marie

    Anthropogenic activities have greatly modified the way nitrogen moves through the atmosphere and terrestrial and aquatic environments. Excess reactive nitrogen generated through fossil fuel combustion, industrial fixation, and intensification of agriculture is not confined to anthropogenic systems but leaks into natural ecosystems with consequences including acidification, eutrophication, and biodiversity loss. A better understanding of where excess nitrogen originates and how that changes over time is crucial to identifying when, where, and to what degree environmental impacts occur. A major route into ecosystems for excess nitrogen is through atmospheric deposition. Excess nitrogen is emitted to the atmosphere where it can be transported great distances before being deposited back to the Earth's surface. Analyzing the composition of atmospheric nitrogen deposition and biological indicators that reflect deposition can provide insight into the emission sources as well as processes and atmospheric chemistry that occur during transport and what drives variation in these sources and processes. Chapter 1 provides a review and proof of concept of lichens to act as biological indicators and how their elemental and stable isotope composition can elucidate variation in amounts and emission sources of nitrogen over space and time. Information on amounts and emission sources of nitrogen deposition helps inform natural resources and land management decisions by helping to identify potentially impacted areas and causes of those impacts. Chapter 2 demonstrates that herbaria lichen specimens and field lichen samples reflect historical changes in atmospheric nitrogen deposition from urban and agricultural sources across the western United States. 
    Nitrogen deposition increased throughout most of the 20th century because of multiple types of emission sources, until the implementation of the Clean Air Act Amendments of 1990 eventually decreased nitrogen deposition around the turn of

  18. Absolute symbolic addressing, a structure making time-sharing easier; Adressage symbolique absolu, structure facilitant le travail en partage de temps

    Energy Technology Data Exchange (ETDEWEB)

    Debraine, P [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires, services scientifiques

    1968-08-01

    Time-sharing of computers requires that a certain number of conditions be met, in particular an efficient dynamic loading of programs and data. This paper describes a paging method that makes linkages with a minimum of table-lookup operations. The principle is to use associative memory registers for calling blocks of physical memory, the block address being given by the concatenation of a file number (located in a base register) and a page number (located in the instruction proper). The position within the block is given by a displacement located in the instruction. A second, associated base register contains the local part (page number + displacement) of the base address. This extended base register system allows executing programs in a very large programming complex without loss of time. The addresses are fixed at assembly time and the blocks can be loaded anywhere without modification for execution. The various problems associated with the execution of complex programs are presented in this context and shown to be easily solved by the proposed system, the realization of which would be very easy starting from existing computer structures. (author)
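    The address formation described above can be sketched as follows. The field widths and dictionary-based associative memory are illustrative assumptions, not the paper's hardware:

```python
# A block is named by concatenating a file number (from a base register)
# with a page number (from the instruction); the associative memory maps
# that key to the block's physical origin, and the displacement locates
# the word within the block.
FILE_BITS, PAGE_BITS, DISP_BITS = 8, 8, 10

def block_key(file_no: int, page_no: int) -> int:
    """Concatenated (file, page) key used to search the associative memory."""
    return (file_no << PAGE_BITS) | page_no

def effective_address(assoc_mem: dict, file_no: int, page_no: int, disp: int) -> int:
    base = assoc_mem[block_key(file_no, page_no)]   # physical block origin
    return base + disp

# The block can be loaded anywhere: only the associative entry changes.
assoc = {block_key(3, 5): 0x4000}
print(hex(effective_address(assoc, 3, 5, 0x2A)))    # 0x402a
```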

  19. Real-time monitoring and massive inversion of source parameters of very long period seismic signals: An application to Stromboli Volcano, Italy

    Science.gov (United States)

    Auger, E.; D'Auria, L.; Martini, M.; Chouet, B.; Dawson, P.

    2006-01-01

    We present a comprehensive processing tool for the real-time analysis of the source mechanism of very long period (VLP) seismic data based on waveform inversions performed in the frequency domain for a point source. A search for the source providing the best-fitting solution is conducted over a three-dimensional grid of assumed source locations, in which the Green's functions associated with each point source are calculated by finite differences using the reciprocal relation between source and receiver. Tests performed on 62 nodes of a Linux cluster indicate that the waveform inversion and search for the best-fitting signal over 100,000 point sources require roughly 30 s of processing time for a 2-min-long record. The procedure is applied to post-processing of a data archive and to continuous automatic inversion of real-time data at Stromboli, providing insights into different modes of degassing at this volcano. Copyright 2006 by the American Geophysical Union.
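    The grid-search-plus-inversion loop described above can be sketched schematically. The toy dimensions and names below are assumptions; the actual system inverts frequency-domain VLP waveforms against finite-difference Green's functions:

```python
import numpy as np

# For each candidate point source, solve a linear waveform inversion and
# keep the source whose best-fitting mechanism minimizes the residual.
def best_source(G_all, d):
    """G_all: (n_sources, n_data, n_mech) Green's-function matrices; d: data."""
    best, best_misfit = None, np.inf
    for i, G in enumerate(G_all):
        m, *_ = np.linalg.lstsq(G, d, rcond=None)   # moment-tensor-like inversion
        misfit = np.linalg.norm(d - G @ m)
        if misfit < best_misfit:
            best, best_misfit = i, misfit
    return best, best_misfit

rng = np.random.default_rng(2)
G_all = rng.normal(size=(20, 60, 6))    # 20 candidate sources on the grid
true_m = rng.normal(size=6)
d = G_all[7] @ true_m                   # synthetic data generated by source #7
idx, misfit = best_source(G_all, d)
print(idx)                              # 7: the generating source wins
```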

  20. [Regional pilot study to evaluate the laboratory turnaround time according to the client source].

    Science.gov (United States)

    Salinas, M; López-Garrigós, M; Yago, M; Ortuño, M; Díaz, J; Marcaida, G; Chinchilla, V; Carratala, A; Aguado, C; Rodríguez-Borja, E; Laíz, B; Guaita, M; Esteban, A; Lorente, M A; Uris, J

    2011-01-01

    To show turnaround time by requesting source in eight laboratories covering eight Health Areas (2,014,475 inhabitants) of the Valencian Community (Spain). Internal Laboratory Information System (LIS) registers (test request and verification date and time) and daily LIS registers were used to design the indicators. These indicators showed the percentage of key tests requested (full blood count and serum glucose and thyrotropin) that were validated on the same day the blood was taken (inpatients and Primary Care) and/or by 12 a.m. (inpatients). Urgent (stat) key tests (serum troponin and potassium) were also registered and recorded in minutes. Registers were collected and indicators calculated automatically through a Data Warehouse application and OLAP cube software. Large turnaround time differences were observed at 12 a.m. for inpatients, and on the day of sample extraction for primary care patients. The variability in turnaround of stat tests is related to hospital size, activity, and validation by the laboratory physician. The study results show the large turnaround time disparity across eight Health Care Areas of the Valencian Community. The various requesting sources covered by the laboratories create the need for continuous process mapping and redesign, and for benchmarking studies, to achieve customer satisfaction. Copyright © 2010 SECA. Published by Elsevier Espana. All rights reserved.
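    The same-day indicator described above amounts to comparing extraction and verification dates per test. A minimal sketch, with an assumed record format rather than the laboratories' LIS schema:

```python
from datetime import datetime

# Share of key tests verified on the same day the sample was drawn.
records = [
    ("2011-03-01 08:10", "2011-03-01 13:45"),   # (extraction, verification)
    ("2011-03-01 08:30", "2011-03-02 09:00"),   # verified next day
    ("2011-03-01 09:00", "2011-03-01 11:20"),
]

def same_day_rate(rows):
    parse = lambda s: datetime.strptime(s, "%Y-%m-%d %H:%M")
    hits = sum(parse(a).date() == parse(b).date() for a, b in rows)
    return hits / len(rows)

print(round(same_day_rate(records), 3))  # 0.667
```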

  1. Point-source inversion techniques

    Science.gov (United States)

    Langston, Charles A.; Barker, Jeffrey S.; Pavlin, Gregory B.

    1982-11-01

    A variety of approaches for obtaining source parameters from waveform data using moment-tensor or dislocation point source models have been investigated and applied to long-period body and surface waves from several earthquakes. Generalized inversion techniques have been applied to data for long-period teleseismic body waves to obtain the orientation, time function and depth of the 1978 Thessaloniki, Greece, event, of the 1971 San Fernando event, and of several events associated with the 1963 induced seismicity sequence at Kariba, Africa. The generalized inversion technique and a systematic grid testing technique have also been used to place meaningful constraints on mechanisms determined from very sparse data sets; a single station with high-quality three-component waveform data is often sufficient to discriminate faulting type (e.g., strike-slip, etc.). Sparse data sets for several recent California earthquakes, for a small regional event associated with the Koyna, India, reservoir, and for several events at the Kariba reservoir have been investigated in this way. Although linearized inversion techniques using the moment-tensor model are often robust, even for sparse data sets, there are instances where the simplifying assumption of a single point source is inadequate to model the data successfully. Numerical experiments utilizing synthetic data and actual data for the 1971 San Fernando earthquake graphically demonstrate that severe problems may be encountered if source finiteness effects are ignored. These techniques are generally applicable to on-line processing of high-quality digital data, but source complexity and inadequacy of the assumed Green's functions are major problems which are yet to be fully addressed.

  2. Determining the sources of fine-grained sediment using the Sediment Source Assessment Tool (Sed_SAT)

    Science.gov (United States)

    Gorman Sanisaca, Lillian E.; Gellis, Allen C.; Lorenz, David L.

    2017-07-27

    A sound understanding of sources contributing to instream sediment flux in a watershed is important when developing total maximum daily load (TMDL) management strategies designed to reduce suspended sediment in streams. Sediment fingerprinting and sediment budget approaches are two techniques that, when used jointly, can qualify and quantify the major sources of sediment in a given watershed. The sediment fingerprinting approach uses trace element concentrations from samples in known potential source areas to determine a clear signature of each potential source. A mixing model is then used to determine the relative source contribution to the target suspended sediment samples. The computational steps required to apportion sediment for each target sample are quite involved and time intensive, a problem the Sediment Source Assessment Tool (Sed_SAT) addresses. Sed_SAT is a user-friendly statistical model that guides the user through the necessary steps to quantify the relative contributions of sediment sources in a given watershed. The model is written using the statistical software R (R Core Team, 2016b) and utilizes Microsoft Access® as a user interface, but requires no prior knowledge of R or Microsoft Access® to run the model successfully. Sed_SAT identifies outliers, corrects for differences in size and organic content in the source samples relative to the target samples, evaluates the conservative behavior of tracers used in fingerprinting by applying a “Bracket Test,” identifies tracers with the highest discriminatory power, and provides robust error analysis through a Monte Carlo simulation following the mixing model. Quantifying sediment source contributions using the sediment fingerprinting approach provides local, State, and Federal land management agencies with important information needed to implement effective strategies to reduce sediment. Sed_SAT is designed to assist these agencies in applying the sediment fingerprinting
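    The mixing model at the heart of sediment fingerprinting can be sketched with toy numbers. This is a generic illustration, not Sed_SAT's actual algorithm: it searches for nonnegative source proportions, summing to one, that best reproduce the tracer signature of a target sample.

```python
import numpy as np

def unmix(S, t, n=100):
    """S: (3 sources x n_tracers) mean tracer concentrations; t: target tracers.
    Exhaustive search over the 3-source simplex at resolution 1/n."""
    best_p, best_err = None, np.inf
    for i in range(n + 1):
        for j in range(n + 1 - i):
            p = np.array([i, j, n - i - j]) / n       # proportions sum to 1
            err = np.linalg.norm(p @ S - t)           # tracer misfit
            if err < best_err:
                best_p, best_err = p, err
    return best_p

S = np.array([[10.0, 2.0], [4.0, 8.0], [1.0, 1.0]])   # 3 sources x 2 tracers
t = 0.5 * S[0] + 0.3 * S[1] + 0.2 * S[2]              # synthetic target
print(unmix(S, t))                                    # [0.5 0.3 0.2]
```

    Sed_SAT wraps this apportionment step with outlier screening, tracer selection, and Monte Carlo error analysis, as described above.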

  3. Image Coding Based on Address Vector Quantization.

    Science.gov (United States)

    Feng, Yushu

    Image coding is finding increased application in teleconferencing, archiving, and remote sensing. This thesis investigates the potential of Vector Quantization (VQ), a relatively new source coding technique, for compression of monochromatic and color images. Extensions of the Vector Quantization technique to the Address Vector Quantization method have been investigated. In Vector Quantization, the image data to be encoded are first processed to yield a set of vectors. A codeword from the codebook which best matches the input image vector is then selected. Compression is achieved by replacing the image vector with the index of the codeword which produced the best match; only this index is sent to the channel. Reconstruction of the image is done by using a table lookup technique, where the label is simply used as an address for a table containing the representative vectors. A codebook of representative vectors (codewords) is generated using an iterative clustering algorithm such as K-means or the generalized Lloyd algorithm. A review of different Vector Quantization techniques is given in chapter 1. Chapter 2 gives an overview of codebook design methods, including the use of the Kohonen neural network for codebook design. During the encoding process, the correlation of the addresses is considered, and Address Vector Quantization is developed for color image and monochrome image coding. Address VQ, which includes static and dynamic processes, is introduced in chapter 3. In order to overcome the problems in Hierarchical VQ, Multi-layer Address Vector Quantization is proposed in chapter 4. This approach gives the same performance as the normal VQ scheme, but at about 1/2 to 1/3 the bit rate of the normal VQ method. In chapter 5, a Dynamic Finite State VQ, based on a probability transition matrix to select the best subcodebook to encode the image, is developed. In chapter 6, a new adaptive vector quantization scheme, suitable for color video coding, called "A Self-Organizing
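
    The encode/decode cycle described above (nearest-codeword search, index transmission, table-lookup reconstruction) can be sketched in a few lines. The codebook and input vectors are toy values, not taken from the thesis.

```python
import numpy as np

# A tiny illustrative codebook of 2-D representative vectors.
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])

def vq_encode(vectors, codebook):
    # Index of the nearest codeword (squared Euclidean distance);
    # only these indices would be sent over the channel.
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def vq_decode(indices, codebook):
    # Reconstruction is a pure table lookup into the shared codebook.
    return codebook[indices]

x = np.array([[0.1, -0.2], [4.8, 5.3], [0.9, 1.2]])
idx = vq_encode(x, codebook)
print(idx)                       # [0 2 1]
print(vq_decode(idx, codebook))  # nearest codewords, the lossy reconstruction
```

    Address VQ then goes further by exploiting correlation between neighboring indices, but the round trip above is the common core.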

  4. Convocation address.

    Science.gov (United States)

    Swaminathan, M S

    1998-07-01

    This address delivered to the 40th convocation of the International Institute for Population Sciences in India in 1998 opens by noting that a shortage of jobs for youth is India's most urgent problem but that the problems that attend the increasing numbers of elderly also require serious attention. The address then notes that the Earth's population is growing at an unsustainable rate while economic inequities among countries are increasing, so that, while intellectual property is becoming the most important asset in developed countries, nutritional anemia among pregnant women causes their offspring to be unable to achieve their full intellectual potential from birth. Next, the address uses a discussion of the 18th-century work on population of the Marquis de Condorcet and of Thomas Malthus to lead into a consideration of estimated increased needs of countries like India and China to import food grains in the near future. Next, the progress of demographic transition in Indian states is covered and applied to Mahbub ul Haq's measure of human deprivation developed for and applied to the region of the South Asian Association for Regional Cooperation (India, Pakistan, Bangladesh, Nepal, Sri Lanka, Bhutan, and the Maldives). The address continues by reiterating some of the major recommendations forwarded by a government of India committee charged in 1995 with drafting a national population policy. Finally, the address suggests specific actions that could be important components of the Hunger-Free India Programme and concludes that all success rests on the successful implementation of appropriate population policies.

  5. Source apportionment of the summer time carbonaceous aerosol at Nordic rural background sites

    Directory of Open Access Journals (Sweden)

    K. E. Yttri

    2011-12-01

    In the present study, natural and anthropogenic sources of particulate organic carbon (OCp) and elemental carbon (EC) have been quantified based on weekly filter samples of PM10 (particles with aerodynamic diameter <10 μm) collected at four Nordic rural background sites [Birkenes (Norway), Hyytiälä (Finland), Vavihill (Sweden), Lille Valby (Denmark)] during late summer (5 August–2 September 2009). Levels of source-specific tracers, i.e. cellulose, levoglucosan, mannitol and the 14C/12C ratio of total carbon (TC), have been used as input for source apportionment of the carbonaceous aerosol, whereas Latin Hypercube Sampling (LHS) was used to statistically treat the multitude of possible combinations resulting from this approach. The carbonaceous aerosol (here: TCp, i.e. particulate TC) was totally dominated by natural sources (69–86%), with biogenic secondary organic aerosol (BSOA) being the single most important source (48–57%). Interestingly, primary biological aerosol particles (PBAP) were the second most important source (20–32%). The anthropogenic contribution was mainly attributed to fossil fuel sources (OCff and ECff; 10–24%), whereas no more than 3–7% was explained by combustion of biomass (OCbb and ECbb) in this late summer campaign, i.e. emissions from residential wood burning and/or wild/agricultural fires. Fossil fuel sources totally dominated the ambient EC loading, which accounted for 4–12% of TCp, whereas <1.5% of EC was attributed to combustion of biomass. The carbonaceous aerosol source apportionment showed only minor variation between the four selected sites. However, Hyytiälä and Birkenes showed greater resemblance to each other, as did Lille Valby and Vavihill, the two latter being somewhat more influenced by anthropogenic sources. Ambient levels of organosulphates and nitrooxy-organosulphates in the Nordic rural
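
    As a small illustration of how the 14C/12C ratio separates fossil from non-fossil carbon in such apportionments: fossil carbon contains no 14C, so the fossil fraction follows from the measured "fraction modern" of the sample relative to a contemporary-carbon reference. The numbers below are assumed for the sketch, not the campaign's measurements.

```python
# Fossil vs. non-fossil split of total carbon from 14C measurements:
#   f_fossil = 1 - fm_sample / fm_reference
fm_sample = 0.85   # fraction modern of the TC sample (hypothetical)
fm_ref = 1.10      # reference fraction modern for contemporary carbon (hypothetical)

f_fossil = 1.0 - fm_sample / fm_ref
print(round(f_fossil, 3))   # ~0.227, i.e. ~23% fossil carbon in this example
```

    The LHS step mentioned above then propagates the uncertainty in fm_ref and the other tracer-to-source conversion factors through many such computations.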

  6. A high-intensity plasma-sputter heavy negative ion source

    International Nuclear Information System (INIS)

    Alton, G.D.; Mori, Y.; Takagi, A.; Ueno, A.; Fukumoto, S.

    1989-01-01

    A multicusp magnetic field plasma surface ion source, normally used for H⁻ ion beam formation, has been modified for the generation of high-intensity, pulsed, heavy negative ion beams suitable for a variety of uses. To date, the source has been utilized to produce mA-intensity pulsed beams of more than 24 species. A brief description of the source and basic pulsed-mode operational data (e.g., intensity versus cesium oven temperature, sputter probe voltage, and discharge pressure) are given. In addition, illustrative examples of intensity versus time and the mass distributions of ion beams extracted from a number of samples, along with emittance data, are also presented. Preliminary results obtained during dc operation of the source under low discharge power conditions suggest that sources of this type may also be used to produce high-intensity (mA) dc beams. The results of these investigations are given as well, and the technical issues that must be addressed for this mode of operation are discussed. 15 refs., 10 figs., 2 tabs

  7. Discrete-Time Domain Modelling of Voltage Source Inverters in Standalone Applications

    DEFF Research Database (Denmark)

    Federico, de Bosio; de Sousa Ribeiro, Luiz Antonio; Freijedo Fernandez, Francisco Daniel

    2017-01-01

    The decoupling of the capacitor voltage and inductor current has been shown to improve significantly the dynamic performance of voltage source inverters in standalone applications. However, the computation and PWM delays still limit the achievable bandwidth. In this paper a discrete-time domain modelling of the LC plant with consideration of delay and sample-and-hold effects on the state feedback cross-coupling decoupling is derived. From this plant formulation, current controllers with wide bandwidth and good relative stability properties are developed. Two controllers based on lead compensation...

  8. DOA Estimation of Multiple LFM Sources Using a STFT-based and FBSS-based MUSIC Algorithm

    Directory of Open Access Journals (Sweden)

    K. B. Cui

    2017-12-01

    Direction of arrival (DOA) estimation is an important problem in array signal processing. An effective multiple signal classification (MUSIC) method based on the short-time Fourier transform (STFT) and forward/backward spatial smoothing (FBSS) techniques is addressed for the DOA estimation problem of multiple time-frequency (t-f) joint LFM sources. Previous work in the area, e.g. the STFT-MUSIC algorithm, cannot resolve completely or largely t-f joint sources because it can only select single-source t-f points. The proposed method constructs the spatial t-f distributions (STFDs) by selecting multiple-source t-f points and uses the FBSS techniques to solve the problem of rank loss. In this way, the STFT-FBSS-MUSIC algorithm can resolve largely or completely t-f joint LFM sources. In addition, the proposed algorithm has quite low computational complexity when resolving multiple LFM sources because it reduces the number of feature decompositions and spectrum searches. The performance of the proposed method is compared with that of the existing t-f based MUSIC algorithms through computer simulations and the results show its good performance.
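
    A minimal conventional MUSIC sketch (narrowband, uniform linear array, stationary sources) illustrates the noise-subspace spectrum search that STFT-FBSS-MUSIC builds on. The array geometry, source angles, and noise level are assumptions for the demo, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 400                        # sensors, snapshots
angles_true = [-20.0, 30.0]          # degrees (assumed)

def steering(theta_deg):
    # Half-wavelength-spaced uniform linear array steering vector.
    k = np.pi * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(M))

A = np.column_stack([steering(a) for a in angles_true])
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = A @ S + noise

R = X @ X.conj().T / N               # sample covariance
w, V = np.linalg.eigh(R)             # eigenvalues in ascending order
En = V[:, :-2]                       # noise subspace (2 sources assumed known)

grid = np.arange(-90.0, 90.0, 0.5)
P = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(g)) ** 2 for g in grid])
loc = (P[1:-1] > P[:-2]) & (P[1:-1] > P[2:])       # local maxima of the spectrum
cand, vals = grid[1:-1][loc], P[1:-1][loc]
est = np.sort(cand[np.argsort(vals)[-2:]])
print(est)                            # expected near [-20, 30]
```

    The paper's contribution replaces the covariance R with spatial t-f distributions built from multiple-source STFT points and applies FBSS before the eigendecomposition.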

  9. Overview of an address and purpose of the workshop [ISO Workshop on address standards: Considering the issues related to an international address standard

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2008-05-01

    Full Text Available ) (ISO 19112) Precision Redirectable Standards Postal address Street delivery address Y N N Y N Y Fine Y UPU S42 PO Box or Private Bag Y N N Y Fine to Coarse Y UPU S42 Post Restante Y N N Y N Y Coarse Y UPU S42 Delivery address... (for goods, etc) Street address Y N N Y N Y Fine N Intersection address Y N N Y N Y Fine N Landmark address Y N N Y N Y Fine to Moderate N Building address Y N N Y N Y Fine N Site address Y N N Y N Y Fine to Coarse N Farm...

  10. Nitrogen Fertilizer Source, Rates, and Timing for a Cover Crop and Subsequent Cotton Crop

    Science.gov (United States)

    The objectives were to compare N fertilizer sources, rates, and time of application for a rye winter cover crop to determine optimal biomass production for conservation tillage production, compare recommended and no additional N fertilizer rates across different biomass levels for cotton, and determ...

  11. Open Source Initiative Powers Real-Time Data Streams

    Science.gov (United States)

    2014-01-01

    Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.

  12. License Address List

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Address list generated from National Saltwater Angler Registry. Used in conjunction with an address-based sample as per survey design.

  13. CHANDRA ACIS SURVEY OF X-RAY POINT SOURCES: THE SOURCE CATALOG

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Song; Liu, Jifeng; Qiu, Yanli; Bai, Yu; Yang, Huiqin; Guo, Jincheng; Zhang, Peng, E-mail: jfliu@bao.ac.cn, E-mail: songw@bao.ac.cn [Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China)

    2016-06-01

    The Chandra archival data is a valuable resource for various studies on different X-ray astronomy topics. In this paper, we utilize this wealth of information and present a uniformly processed data set, which can be used to address a wide range of scientific questions. The data analysis procedures are applied to 10,029 Advanced CCD Imaging Spectrometer observations, which produces 363,530 source detections belonging to 217,828 distinct X-ray sources. This number is twice the size of the Chandra Source Catalog (Version 1.1). The catalogs in this paper provide abundant estimates of the detected X-ray source properties, including source positions, counts, colors, fluxes, luminosities, variability statistics, etc. Cross-correlation of these objects with galaxies shows that 17,828 sources are located within the D25 isophotes of 1110 galaxies, and 7504 sources are located between the D25 and 2 D25 isophotes of 910 galaxies. Contamination analysis with the log N–log S relation indicates that 51.3% of objects within 2 D25 isophotes are truly relevant to galaxies, and the “net” source fraction increases to 58.9%, 67.3%, and 69.1% for sources with luminosities above 10^37, 10^38, and 10^39 erg s^-1, respectively. Among the possible scientific uses of this catalog, we discuss the possibility of studying intra-observation variability, inter-observation variability, and supersoft sources (SSSs). About 17,092 detected sources above 10 counts are classified as variable in individual observations with the Kolmogorov–Smirnov (K–S) criterion (P_K–S < 0.01). There are 99,647 sources observed more than once and 11,843 sources observed 10 times or more, offering us a wealth of data with which to explore the long-term variability. There are 1638 individual objects (∼2350 detections) classified as SSSs. As a quite interesting subclass, detailed studies on X-ray spectra and optical spectroscopic follow-up are needed to
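
    The K–S variability criterion can be illustrated with a self-contained sketch: a one-sample K–S statistic compares photon arrival times (rescaled to [0, 1]) with the uniform distribution expected from a constant source. The synthetic arrival times and the large-sample critical value are illustrative; this is not the catalog's pipeline code.

```python
import numpy as np

def ks_uniform(times):
    # One-sample K-S statistic against the uniform CDF on [0, 1]:
    # the maximum distance between the empirical CDF and F(t) = t.
    t = np.sort(times)
    n = len(t)
    up = np.max(np.arange(1, n + 1) / n - t)
    lo = np.max(t - np.arange(0, n) / n)
    return max(up, lo)

rng = np.random.default_rng(1)
steady = rng.uniform(0, 1, 200)                      # constant source
flaring = np.concatenate([rng.uniform(0, 1, 100),
                          rng.uniform(0.4, 0.5, 100)])  # flare mid-observation

crit = 1.63 / np.sqrt(200)   # approximate large-sample 1% critical value
print(ks_uniform(steady), ks_uniform(flaring), crit)
```

    The flaring sample's statistic lands far above the 1% critical value, which is the sense in which a detection is flagged as variable.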

  14. A time resolved microfocus XEOL facility at the Diamond Light Source

    International Nuclear Information System (INIS)

    Mosselmans, J F W; Taylor, R P; Quinn, P D; Cibin, G; Gianolio, D; Finch, A A; Sapelkin, A V

    2013-01-01

    We have constructed a Time-Resolved X-ray Excited Optical Luminescence (TR-XEOL) detection system at the Microfocus Spectroscopy beamline I18 at the Diamond Light Source. Using the synchrotron in "hybrid bunch mode", the data collection is triggered by the RF clock, and we are able to record XEOL photons with a time resolution of 6.1 ps during the 230 ns gap between the hybrid bunch and the main train of electron bunches. We can detect photons over the range 180-850 nm using a bespoke optical fibre, with X-ray excitation energies between 2 and 20 keV. We have used the system to study a range of feldspars. The detector is portable and has also been used on beamline B18 to collect Optically Determined X-ray Absorption Spectroscopy (OD-XAS) in QEXAFS mode.

  15. A time resolved microfocus XEOL facility at the Diamond Light Source

    Science.gov (United States)

    Mosselmans, J. F. W.; Taylor, R. P.; Quinn, P. D.; Finch, A. A.; Cibin, G.; Gianolio, D.; Sapelkin, A. V.

    2013-03-01

    We have constructed a Time-Resolved X-ray Excited Optical Luminescence (TR-XEOL) detection system at the Microfocus Spectroscopy beamline I18 at the Diamond Light Source. Using the synchrotron in "hybrid bunch mode", the data collection is triggered by the RF clock, and we are able to record XEOL photons with a time resolution of 6.1 ps during the 230 ns gap between the hybrid bunch and the main train of electron bunches. We can detect photons over the range 180-850 nm using a bespoke optical fibre, with X-ray excitation energies between 2 and 20 keV. We have used the system to study a range of feldspars. The detector is portable and has also been used on beamline B18 to collect Optically Determined X-ray Absorption Spectroscopy (OD-XAS) in QEXAFS mode.

  16. SU-E-T-459: Impact of Source Position and Traveling Time On HDR Skin Surface Applicator Dosimetry

    International Nuclear Information System (INIS)

    Jeong, J; Barker, C; Zaider, M; Cohen, G

    2015-01-01

    Purpose: Dosimetric discrepancies observed between measured and treatment planning system (TPS) predicted values during applicator commissioning were traced to source position uncertainty in the applicator. We quantify the dosimetric impact of this geometric uncertainty, and of the source traveling time inside the applicator, and propose corrections for clinical use. Methods: We measured the dose profiles from the Varian Leipzig-style (horizontal) HDR skin applicator using EBT3 film, a photon diode, and optically stimulated luminescence dosimeters (OSLD), and three different GammaMed HDR afterloaders. The dose profiles and depth dose of each aperture were measured at several depths (up to about 10 mm, depending on the dosimeter). The measured dose profiles were compared with Acuros-calculated profiles in the BrachyVision TPS. To assess the impact of the source position, EBT3 film measurements were performed with the applicator in facing-down and facing-up orientations. The dose with and without source traveling was measured with the diode detector using the HDR timer and the electrometer timer, respectively. Results: Depth doses measured using the three dosimeters were in good agreement, but were consistently higher than the Acuros dose calculations. Measurements with the applicator facing up were significantly lower than those in the facing-down position, with a maximum difference of about 18% at the surface, due to source sag inside the applicator. Based on the inverse-square law, the effective source sag was evaluated to be about 0.5 mm from the planned position. The additional dose from source traveling was about 2.8% for 30 seconds with a 10 Ci source, decreasing with increased dwell time and decreased source activity. Conclusion: Due to the short source-to-surface distance of the applicator, the small source sag inside the applicator has a significant dosimetric impact, which should be considered before clinical use of the applicator. Investigation of the effect for other applicators
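
    The inverse-square reasoning and the transit-dose scaling can be illustrated with simple arithmetic. The treatment distance below is a hypothetical value for the sketch; only the 0.5 mm sag and the 2.8%-per-30 s figures come from the abstract.

```python
# 1) Inverse-square impact of a 0.5 mm source sag at a short treatment distance:
d = 10.0                      # nominal source-to-surface distance, mm (assumed)
sag = 0.5                     # effective source sag, mm (from the abstract)
dose_ratio = (d / (d + sag)) ** 2
print(round(dose_ratio, 3))   # ~0.907, i.e. ~9% lower dose at this assumed distance

# 2) The transit dose is a fixed extra per treatment, so its relative
#    contribution falls as dwell time grows (2.8% of a 30 s dwell here):
transit_equiv_s = 0.028 * 30  # transit dose expressed in dwell-time seconds
for dwell_s in (30, 60, 120):
    print(dwell_s, round(100 * transit_equiv_s / dwell_s, 1), "%")
```

    At shorter distances the same 0.5 mm sag produces a proportionally larger dose change, which is why the effect matters most for surface applicators.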

  17. Communicating one's local address and emergency contact details

    CERN Multimedia

    Information Technology Department, AIS (Administrative Information Services) Group; Human Resources Department, SPS (Services, Procedures and Social) Group

    2007-01-01

    As part of the ongoing simplification of procedures and rationalisation of administrative processes, the IT, PH (Users Office) and HR Departments have developed two new EDH forms for communicating or updating one's local address and emergency contact details. This is the first time that the forms relating to an official HR procedure can be accessed on a self-service basis and directly updated by the members of personnel themselves. The information recorded remains confidential and may only be accessed by the authorised administrative services and the emergency services. Local address: Members of the personnel must declare any change in their local address (Art. R V 1.38 of the Staff Regulations). This declaration is henceforth made by directly filling out the EDH document indicated below, and without requiring any other spontaneous formality vis-à-vis the department secretariat or the Users Office. It is also possible for any member of the personnel to check whether the local address in the Organizati...

  18. Impact of the diagnostic process on the accuracy of source identification and time to antibiotics in septic emergency department patients.

    Science.gov (United States)

    Uittenbogaard, Annemieke J M; de Deckere, Ernie R J T; Sandel, Maro H; Vis, Alice; Houser, Christine M; de Groot, Bas

    2014-06-01

    Timely administration of effective antibiotics is important in sepsis management. Source-targeted antibiotics are believed to be most effective, but source identification could cause time delays. First, to describe the accuracy/time delays of a diagnostic work-up and the association with time to antibiotics in septic emergency department (ED) patients. Second, to assess the fraction in which source-targeted antibiotics could have been administered solely on the basis of patient history and physical examination. Secondary analysis of the prospective observational study on septic ED patients was carried out. The time to test result availability was associated with time to antibiotics. The accuracy of the suspected source of infection in the ED was assessed. For patients with pneumosepsis, urosepsis, and abdominal sepsis, combinations of signs and symptoms were assessed to achieve a maximal positive predictive value for the sepsis source, identifying a subset of patients in whom source-targeted antibiotics could be administered without waiting for diagnostic test results. The time to antibiotics increased by 18 (95% confidence interval: 12-24) min/h delay in test result availability (n=323). In 38-79% of patients, antibiotics were administered after additional tests, whereas the ED diagnosis was correct in 68-85% of patients. The maximal positive predictive value of signs and symptoms was 0.87 for patients with pneumosepsis and urosepsis and 0.75 for those with abdominal sepsis. Use of signs and symptoms would have led to correct ED diagnosis in 33% of patients. Diagnostic tests are associated with delayed administration of antibiotics to septic ED patients while increasing the diagnostic accuracy to only 68-85%. In one-third of septic ED patients, the choice of antibiotics could have been accurately determined solely on the basis of patient history and physical examination.

  19. Time-Dependent S_N Calculations Describing Pulsed Source Experiments at the FRO Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Bergstrom, A.; Kockum, J.; Soderberg, S. [Research Institute of National Defence, Stockholm (Sweden)

    1968-04-15

    In view of the difficulties in describing pulsed source experiments quantitatively in assemblies consisting of a fast core and a light reflector, a time-dependent S_N code has been applied to this type of assembly. The code, written for the IBM 7090 computer, divides time into short intervals and computes the flux in spherical geometry for each interval using the Carlson S_N scheme. The source term is obtained by extrapolation from two earlier time-intervals. Several problems in connection with the discretization of the time, space and energy dimensions are discussed. For the sub-critical assembly studied the treatment of the lower energy-groups is decisive for the numerical stability. A 22-group cross-section set with a low energy cut-off at 0.04 eV obtained with the SPENG programme has been used. The time intervals are varied continuously and are set proportional to the inverse of the maximum logarithmic time-derivative of the space- and energy-dependent flux, with the further restriction that they are not allowed to increase above a predetermined value. In a typical case, the intervals vary between 10^-9 and 10^-8 sec. The memory of the computer is fully exploited when 22 energy groups and 46 radial points are used. The computing time for each time-interval is about 6 sec. The code has been applied to a 3.5% sub-critical assembly consisting of a 20% enriched, spherical uranium metal core with a thick copper reflector and the calculations have been compared to experiments with good agreement. The calculations show that spectral equilibrium below 10 keV is not reached until times long compared to the usual measuring times and that the exponential decay finally reached is entirely determined by reflector properties at almost thermal energies. It is also shown that the simple one- and two-region models are inadequate in this case and that no time-independent prompt neutron life-time can be obtained from the measurements. (author)
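
    The adaptive time-step rule described (steps proportional to the inverse of the maximum logarithmic time-derivative of the flux, capped at a preset maximum) might be sketched as follows. The proportionality constant and flux values are illustrative, not the original IBM 7090 code.

```python
import numpy as np

def next_dt(flux, dflux_dt, c=0.1, dt_max=1e-8):
    # Maximum logarithmic time-derivative |d(ln phi)/dt| over all
    # space/energy points, guarded against division by zero.
    log_deriv = np.max(np.abs(dflux_dt) / np.maximum(flux, 1e-300))
    # Step inversely proportional to it, never exceeding dt_max.
    return min(c / log_deriv, dt_max)

flux = np.array([1.0, 0.5])
print(next_dt(flux, flux * -1e8))   # fast decay -> short step (1e-9 s)
print(next_dt(flux, flux * -1e5))   # slow decay -> capped at dt_max (1e-8 s)
```

    The cap plays the role of the "predetermined value" in the abstract: once the flux varies slowly, the step stops growing so the extrapolated source term stays accurate.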

  20. Addresses

    Data.gov (United States)

    Town of Chapel Hill, North Carolina — Point features representing locations of all street addresses in Orange County, NC including Chapel Hill, NC. Data maintained by Orange County, the Town of Chapel...

  1. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is the Principal Component Analysis (PCA), which allows one to reduce the dimensionality of the data space while keeping most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing on the components that they be independent. The Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series imposing a fewer number of constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
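
    The PCA step that vbICA improves upon can be sketched with synthetic station time series. The latent signals (a seasonal-like oscillation and a transient step) and the mixing matrix are invented for illustration; real analyses work on actual GPS/InSAR displacement matrices.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 300)
# Two hypothetical latent sources: a periodic signal and a transient step.
latent = np.vstack([np.sin(2 * np.pi * 3 * t),
                    np.where(t > 0.6, 1.0, 0.0)])
mixing = rng.standard_normal((6, 2))          # 6 synthetic "stations"
X = mixing @ latent + 0.05 * rng.standard_normal((6, 300))

# PCA via SVD of the centered data matrix (stations x epochs).
Xc = X - X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print(np.round(explained[:3], 3))   # first two PCs carry almost all variance
```

    PCA correctly finds that two components suffice, but its components (rows of Vt) are generally mixtures of the two latent signals; recovering the signals themselves is the BSS problem that ICA-type methods address.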

  2. "Out of Fear and into Peace": President Eisenhower's Address to the United Nations.

    Science.gov (United States)

    Mueller, Jean West; Schamel, Wynell Burroughs

    1990-01-01

    Presents a section of President Dwight D. Eisenhower's 1953 "Atoms for Peace" address to the United Nations General Assembly. Suggests using the document for classroom discussions of nuclear proliferation, emphasizing that using primary sources develops research skills, activates classroom discussions, citizenship, and creative…

  3. Enter your email-address: how German internet users manage their email addresses

    NARCIS (Netherlands)

    Utz, S.

    2004-01-01

    Writing E-mail is the most popular Internet activity. Meanwhile, many people have more than one E-mail address. The question of how people manage their E-mail addresses, more specifically, whether they use them deliberately for different purposes, is the central question of this paper. E-mail addresses

  4. Investigation of the ion beam of the Titan source by the time-of-flight mass spectrometer

    International Nuclear Information System (INIS)

    Bugaev, A.S.; Gushenets, V.V.; Nikolaev, A.G.; Yushkov, G.Yu.

    2000-01-01

    The Titan ion source generates wide-aperture beams of both gaseous and metal ions of various materials. This is made possible by combining two types of arc discharge with cold cathodes in the source discharge system. The vacuum arc, initiated between the cathode, made of the ion-forming material, and the hollow anode, is used for obtaining metal ions. The pinch-effect low-pressure arc discharge, ignited on the same hollow anode, is used for obtaining gaseous ions. The composition of ion beams generated by the Titan source is studied by means of a specially designed time-of-flight spectrometer. The spectrometer design and principle of operation are presented. The physical peculiarities of the source operation that influence the ion beam composition are discussed [ru

  5. Opening address

    International Nuclear Information System (INIS)

    Boening, K.

    2003-01-01

    The program of this 9th Meeting of the International Group on Research Reactors (IGORR) includes quite a number of fascinating new research reactor projects in France, Germany, Russia, Canada, China, Thailand, and Australia. In addition to the session about New Facilities, there are interesting sessions on the Upgrades and on the Optimization of Operation and Utilization of existing research reactors, on Secondary Neutron Sources, on Neutron Scattering applications, and on the aspects of Safety, Licensing and Decommissioning. Two particular projects of new research reactors are mentioned specially: the TRR-II project in Taiwan, which was unfortunately terminated last year after the ruling parties in the government changed to an anti-nuclear stance, and the new FRM-II in Munich, Germany, which will hopefully survive such a political change and receive its green light for nuclear start-up in the very near future. The charter of IGORR and its objectives are part of this address: The International Group on Research Reactors (IGORR) was formed to facilitate the sharing of knowledge and experience among those institutions and individuals who are actively working to design, build, and promote new research reactors or to make significant upgrades to existing facilities. The main IGORR objectives are to promote contacts between its members, to identify and discuss problems of common interest, to distribute newsletters about once or twice every year, and to organize meetings about once every one-and-a-half years

  6. A FOURIER-TRANSFORMED BREMSSTRAHLUNG FLASH MODEL FOR THE PRODUCTION OF X-RAY TIME LAGS IN ACCRETING BLACK HOLE SOURCES

    International Nuclear Information System (INIS)

    Kroon, John J.; Becker, Peter A.

    2014-01-01

    Accreting black hole sources show a wide variety of rapid time variability, including the manifestation of time lags during X-ray transients, in which a delay (phase shift) is observed between the Fourier components of the hard and soft spectra. Despite a large body of observational evidence for time lags, no fundamental physical explanation for the origin of this phenomenon has been presented. We develop a new theoretical model for the production of X-ray time lags based on an exact analytical solution for the Fourier transform describing the diffusion and Comptonization of seed photons propagating through a spherical corona. The resulting Green's function can be convolved with any source distribution to compute the associated Fourier transform and time lags, hence allowing us to explore a wide variety of injection scenarios. We show that thermal Comptonization is able to self-consistently explain both the X-ray time lags and the steady-state (quiescent) X-ray spectrum observed in the low-hard state of Cyg X-1. The reprocessing of bremsstrahlung seed photons produces X-ray time lags that diminish with increasing Fourier frequency, in agreement with the observations for a wide range of sources
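
    For context, a Fourier time lag is conventionally measured from the phase of the cross spectrum between the soft and hard light curves, divided by 2*pi*f. The following generic sketch (synthetic sinusoidal light curves, not the paper's Comptonization model) recovers an injected 50 ms lag.

```python
import numpy as np

n, dt, lag_true = 1000, 0.01, 0.05   # bins, bin width (s); hard lags soft by 50 ms
t = np.arange(n) * dt
f0 = 5.0                             # signal frequency, chosen to sit on a Fourier bin
soft = np.sin(2 * np.pi * f0 * t)
hard = np.sin(2 * np.pi * f0 * (t - lag_true))   # delayed copy

f = np.fft.rfftfreq(n, dt)
C = np.fft.rfft(soft) * np.conj(np.fft.rfft(hard))   # cross spectrum
k = np.argmax(np.abs(C))                             # dominant frequency bin (5 Hz)
lag = np.angle(C[k]) / (2 * np.pi * f[k])            # phase lag -> time lag
print(round(lag, 4))                                 # ~0.05 s
```

    A frequency-dependent lag spectrum, as discussed above, comes from evaluating this phase over many Fourier frequencies (and averaging over segments for noisy data).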

  7. Allegheny County Addressing Landmarks

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset contains address points which represent physical address locations assigned by the Allegheny County addressing authority. Data is updated by County...

  8. Allegheny County Address Points

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset contains address points which represent physical address locations assigned by the Allegheny County addressing authority. Data is updated by County...

  9. Biological effect of pulsed dose rate brachytherapy with stepping sources if short half-times of repair are present in tissues

    International Nuclear Information System (INIS)

    Fowler, Jack F.; Limbergen, Erik F.M. van

    1997-01-01

Purpose: To explore the possible increase of radiation effect in tissues irradiated by pulsed brachytherapy (PDR) for local tissue dose rates between those 'averaged over the whole pulse' and the instantaneous high dose rates close to the dwell positions. Increased effect is more likely for tissues with short half-times of repair of the order of a few minutes, similar to pulse durations. Methods and Materials: Calculations were done assuming the linear quadratic formula for radiation damage, in which only the dose-squared term is subject to exponential repair. The situation with two components of T½ is addressed. A constant overall time of 140 h and a constant total dose of 70 Gy were assumed throughout, the continuous low dose rate of 0.5 Gy/h (CLDR) providing the unitary standard effects for each PDR condition. Effects of dose rates ranging from 4 Gy/h to 120 Gy/h (HDR at 2 Gy/min) were studied, covering the gap in an earlier publication. Four schedules were examined: doses per pulse of 0.5, 1, 1.5, and 2 Gy given at repetition frequencies of 1, 2, 3, and 4 h, respectively, each with a range of assumed half-times of repair of 4 min to 1.5 h. Results are presented for late-responding tissues, the differences from CLDR being two or three times greater than for early-responding tissues and most tumors. Results: Curves are presented relating the ratio of increased biological effect (proportional to log cell kill) calculated for PDR relative to CLDR. Ratios as high as 1.5 can be found for large doses per pulse (2 Gy) if the half-time of repair in tissues is as short as a few minutes. The major influences on effect are dose per pulse, half-time of repair in tissue, and, when T½ is short, the instantaneous dose rate. Maximum ratios of PDR/CLDR occur when the dose rate is such that pulse duration is approximately equal to T½. As dose rate in the pulse is increased, a plateau of effect is reached, for most T½s, above 10 to 20 Gy/h, which is
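The linear quadratic calculation described in this abstract can be sketched numerically. The Python snippet below is a simplified illustration, not the authors' code: it uses the standard mono-exponential Lea-Catcheside protraction factor, assumes complete repair between pulses, and plugs in illustrative late-tissue coefficients (alpha = 0.15 Gy⁻¹, beta = 0.05 Gy⁻², i.e. alpha/beta = 3 Gy) together with one of the paper's schedules (2 Gy pulses every 4 h delivered at 120 Gy/h) against the 0.5 Gy/h CLDR reference.

```python
import math

def lea_catcheside(duration_h, t_half_h):
    """Dose-protraction factor g for a single continuous exposure
    with mono-exponential sublethal-damage repair."""
    mu = math.log(2) / t_half_h
    x = mu * duration_h
    if x < 1e-9:
        return 1.0  # acute-exposure limit
    return 2.0 * (x - 1.0 + math.exp(-x)) / x**2

def lq_effect(total_dose, duration_h, t_half_h, alpha=0.15, beta=0.05):
    """Biological effect E = alpha*D + beta*g*D^2 (illustrative late-tissue values)."""
    g = lea_catcheside(duration_h, t_half_h)
    return alpha * total_dose + beta * g * total_dose**2

t_half = 4.0 / 60.0  # 4-minute repair half-time, in hours

# CLDR reference: 70 Gy in 140 h at 0.5 Gy/h
e_cldr = lq_effect(70.0, 140.0, t_half)

# PDR: 2 Gy pulses every 4 h (35 pulses over 140 h), delivered at 120 Gy/h
pulse_dose, pulse_rate = 2.0, 120.0
pulse_duration = pulse_dose / pulse_rate  # 1 min per pulse
# With a 4 h gap much longer than T1/2, repair between pulses is essentially
# complete, so the pulse effects simply add.
e_pdr = 35 * lq_effect(pulse_dose, pulse_duration, t_half)

print(f"PDR/CLDR effect ratio: {e_pdr / e_cldr:.2f}")  # roughly 1.6 for these assumptions
```

Under these assumptions the ratio lands in the ~1.5 range the abstract reports for large doses per pulse combined with very short repair half-times.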

  10. MVO Automation Platform: Addressing Unmet Needs in Clinical Laboratories with Microcontrollers, 3D Printing, and Open-Source Hardware/Software.

    Science.gov (United States)

    Iglehart, Brian

    2018-05-01

    Laboratory automation improves test reproducibility, which is vital to patient care in clinical laboratories. Many small and specialty laboratories are excluded from the benefits of automation due to low sample number, cost, space, and/or lack of automation expertise. The Minimum Viable Option (MVO) automation platform was developed to address these hurdles and fulfill an unmet need. Consumer 3D printing enabled rapid iterative prototyping to allow for a variety of instrumentation and assay setups and procedures. Three MVO versions have been produced. MVOv1.1 successfully performed part of a clinical assay, and results were comparable to those of commercial automation. Raspberry Pi 3 Model B (RPI3) single-board computers with Sense Hardware Attached on Top (HAT) and Raspberry Pi Camera Module V2 hardware were remotely accessed and evaluated for their suitability to qualify the latest MVOv1.2 platform. Sense HAT temperature, barometric pressure, and relative humidity sensors were stable in climate-controlled environments and are useful in identifying appropriate laboratory spaces for automation placement. The RPI3 with camera plus digital dial indicator logged axis travel experiments. RPI3 with camera and Sense HAT as a light source showed promise when used for photometric dispensing tests. Individual well standard curves were necessary for well-to-well light and path length compensations.

  11. Detection of Spoofed MAC Addresses in 802.11 Wireless Networks

    Science.gov (United States)

    Tao, Kai; Li, Jing; Sampalli, Srinivas

    Medium Access Control (MAC) address spoofing is considered as an important first step in a hacker's attempt to launch a variety of attacks on 802.11 wireless networks. Unfortunately, MAC address spoofing is hard to detect. Most current spoofing detection systems mainly use the sequence number (SN) tracking technique, which has drawbacks. Firstly, it may lead to an increase in the number of false positives. Secondly, such techniques cannot be used in systems with wireless cards that do not follow standard 802.11 sequence number patterns. Thirdly, attackers can forge sequence numbers, thereby causing the attacks to go undetected. We present a new architecture called WISE GUARD (Wireless Security Guard) for detection of MAC address spoofing on 802.11 wireless LANs. It integrates three detection techniques - SN tracking, Operating System (OS) fingerprinting & tracking and Received Signal Strength (RSS) fingerprinting & tracking. It also includes the fingerprinting of Access Point (AP) parameters as an extension to the OS fingerprinting for detection of AP address spoofing. We have implemented WISE GUARD on a test bed using off-the-shelf wireless devices and open source drivers. Experimental results show that the new design enhances the detection effectiveness and reduces the number of false positives in comparison with current approaches.
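The sequence-number (SN) tracking technique that WISE GUARD combines with OS and RSS fingerprinting can be illustrated with a toy detector. The frame stream, MAC string, and gap threshold below are invented for illustration; the real system is far more elaborate.

```python
# 802.11 sequence numbers are 12-bit per-station counters, so a legitimate
# station's SNs advance slowly mod 4096. A spoofer interleaving its own frames
# under the same MAC produces duplicate counters, large jumps, or "backward"
# steps (which appear as gaps near 4096).
SN_MOD = 4096
MAX_FORWARD_GAP = 50  # illustrative tolerance for dropped/missed frames

def detect_spoofed_frames(frames):
    """Return indices of frames whose SN gap is suspicious for that MAC."""
    last_sn = {}
    alerts = []
    for i, (mac, sn) in enumerate(frames):
        if mac in last_sn:
            gap = (sn - last_sn[mac]) % SN_MOD  # forward distance mod 4096
            # gap == 0: duplicate counter; large gap: jump or regression
            if gap == 0 or gap > MAX_FORWARD_GAP:
                alerts.append(i)
        last_sn[mac] = sn
    return alerts

frames = [("aa:bb", 100), ("aa:bb", 101),  # legitimate station
          ("aa:bb", 900),                  # spoofer using the same MAC
          ("aa:bb", 102)]                  # legitimate frame now also flagged
print(detect_spoofed_frames(frames))  # -> [2, 3]
```

Note that the legitimate frame following the spoofed one is also flagged: exactly the false-positive problem the paper attributes to SN tracking used alone, and the motivation for fusing it with fingerprinting techniques.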

  12. Do Faith Communities Have a Role in Addressing Childhood Obesity?

    Science.gov (United States)

    Opalinski, Andra; Dyess, Susan; Grooper, Sareen

    2015-01-01

    Pediatric obesity is a multifaceted phenomenon. A partnership with faith-based communities to address the issue has been suggested. The purpose of this study was to describe the cultural beliefs of faith community leaders regarding childhood obesity and to examine attitudes about their role in addressing the issue. A qualitative descriptive design informed by ethnographic methods and triangulation of multiple data sources was utilized to assess the cultural beliefs of faith community leaders. A purposive sample of 13 leaders (nine females, four males) from seven multicultural and multigenerational local faith communities participated in the study. No more than three participants from any one faith community were enrolled in the study. Twenty-first century lifestyle challenges, accountability of behaviors (a dichotomy that fluctuated between individual responsibility to community and/or social responsibility), and the need for intentionality emerged as themes from the data. Faith community leaders envisioned a role for faith communities in addressing childhood obesity. Findings support the ongoing development of population based health promotion programs through faith community engagement. The findings provide a foundation for nurses partnering with faith communities on health promotion programs targeting childhood obesity to address family health issues in a holistic way. © 2015 Wiley Periodicals, Inc.

  13. South African address standard and initiatives towards an international address standard

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2008-10-01

Full Text Available ; visiting friends; and providing a reference context for presenting other information. The benefits of an international address standard include: enabling address interoperability across boundaries; reducing service delivery costs; enabling development...

  14. The dynamic method for time-of-flight measurement of thermal neutron spectra from pulsed sources

    International Nuclear Information System (INIS)

    Pepyolyshev, Yu.N.; Chuklyaev, S.V.; Tulaev, A.B.; Bobrakov, V.F.

    1995-01-01

A time-of-flight method for measurement of thermal neutron spectra in pulsed neutron sources with an efficiency more than 10⁵ times higher than the standard method is described. The main problems associated with the electric current technique for time-of-flight spectra measurement are examined. The methodical errors, problems of special neutron detector design and other questions are discussed. Some experimental results for spectra from the surfaces of water and solid methane moderators obtained at the IBR-2 pulsed reactor (Dubna, Russia) are presented. (orig.)
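The physics underlying any such measurement is the non-relativistic time-of-flight relation E = ½m(L/t)². A minimal Python check follows; the 10 m flight path is an assumed, illustrative value, not the IBR-2 geometry.

```python
M_NEUTRON = 1.674927e-27  # neutron mass, kg
EV = 1.602177e-19         # J per eV

def neutron_energy_mev(flight_path_m, flight_time_s):
    """Kinetic energy (meV) of a neutron from its time of flight over a known
    path length (non-relativistic, which is fine for thermal neutrons)."""
    v = flight_path_m / flight_time_s
    return 0.5 * M_NEUTRON * v**2 / EV * 1e3  # convert eV -> meV

# A thermal neutron (~2200 m/s) crosses a 10 m flight path in about 4.5 ms.
e = neutron_energy_mev(10.0, 10.0 / 2200.0)
print(f"{e:.1f} meV")  # -> 25.3 meV, the canonical thermal energy
```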

  15. Investigating scintillometer source areas

    Science.gov (United States)

    Perelet, A. O.; Ward, H. C.; Pardyjak, E.

    2017-12-01

    Scintillometry is an indirect ground-based method for measuring line-averaged surface heat and moisture fluxes on length scales of 0.5 - 10 km. These length scales are relevant to urban and other complex areas where setting up traditional instrumentation like eddy covariance is logistically difficult. In order to take full advantage of scintillometry, a better understanding of the flux source area is needed. The source area for a scintillometer is typically calculated as a convolution of point sources along the path. A weighting function is then applied along the path to compensate for a total signal contribution that is biased towards the center of the beam path, and decreasing near the beam ends. While this method of calculating the source area provides an estimate of the contribution of the total flux along the beam, there are still questions regarding the physical meaning of the weighted source area. These questions are addressed using data from an idealized experiment near the Salt Lake City International Airport in northern Utah, U.S.A. The site is a flat agricultural area consisting of two different land uses. This simple heterogeneity in the land use facilitates hypothesis testing related to source areas. Measurements were made with a two wavelength scintillometer system spanning 740 m along with three standard open-path infrared gas analyzer-based eddy-covariance stations along the beam path. This configuration allows for direct observations of fluxes along the beam and comparisons to the scintillometer average. The scintillometer system employed measures the refractive index structure parameter of air for two wavelengths of electromagnetic radiation, 880 μm and 1.86 cm to simultaneously estimate path-averaged heat and moisture fluxes, respectively. Meteorological structure parameters (CT2, Cq2, and CTq) as well as surface fluxes are compared for various amounts of source area overlap between eddy covariance and scintillometry. Additionally, surface

  16. Highly efficient sources of single indistinguishable photons

    DEFF Research Database (Denmark)

    Gregersen, Niels

    2013-01-01

Solid-state sources capable of emitting single photons on demand are of great interest in quantum information applications. Ideally, such a source should emit exactly one photon into the collection optics per trigger, the emitted photons should be indistinguishable, and the source should be electrically driven. Several design strategies addressing these requirements have been proposed. In the cavity-based source, light emission is controlled using resonant cavity quantum electrodynamics effects, whereas in the waveguide-based source, broadband electric field screening effects are employed.

  17. Time-Dependent Selection of an Optimal Set of Sources to Define a Stable Celestial Reference Frame

    Science.gov (United States)

    Le Bail, Karine; Gordon, David

    2010-01-01

    Temporal statistical position stability is required for VLBI sources to define a stable Celestial Reference Frame (CRF) and has been studied in many recent papers. This study analyzes the sources from the latest realization of the International Celestial Reference Frame (ICRF2) with the Allan variance, in addition to taking into account the apparent linear motions of the sources. Focusing on the 295 defining sources shows how they are a good compromise of different criteria, such as statistical stability and sky distribution, as well as having a sufficient number of sources, despite the fact that the most stable sources of the entire ICRF2 are mostly in the Northern Hemisphere. Nevertheless, the selection of a stable set is not unique: studying different solutions (GSF005a and AUG24 from GSFC and OPA from the Paris Observatory) over different time periods (1989.5 to 2009.5 and 1999.5 to 2009.5) leads to selections that can differ in up to 20% of the sources. Observing, recording, and network improvement are some of the causes, showing better stability for the CRF over the last decade than the last twenty years. But this may also be explained by the assumption of stationarity that is not necessarily right for some sources.
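As a rough illustration of how the Allan variance separates statistically stable sources from those with apparent linear motion, the following Python toy compares a white-noise position series with a drifting one. The data are synthetic, not ICRF2 positions, and the drift rate and window sizes are arbitrary.

```python
import random

def allan_variance(x, m):
    """Non-overlapping Allan variance of series x for an averaging window of m samples."""
    n = len(x) // m
    means = [sum(x[i * m:(i + 1) * m]) / m for i in range(n)]
    return 0.5 * sum((b - a) ** 2 for a, b in zip(means, means[1:])) / (n - 1)

random.seed(1)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]  # statistically stable source
drift = [w + 0.01 * i for i, w in enumerate(white)]    # source with apparent linear motion

for m in (4, 16, 64):
    print(m, round(allan_variance(white, m), 4), round(allan_variance(drift, m), 4))
```

For the stable series the Allan variance falls as the averaging window grows, whereas the drift makes it grow at long windows. That contrast is the kind of signature such stability analyses use to down-weight sources with non-stationary positions.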

  18. Orphan Sources. Extending Radiological Protection outside the Regulatory Framework

    Energy Technology Data Exchange (ETDEWEB)

    Eugenio Gil [Deputy Director for Emergency, Spanish Nuclear Safety Council (Spain)

    2006-07-01

Radioactive sources that are not under appropriate regulatory control (orphan sources) can result in a number of undesirable consequences, including human health impacts, socio-psychological impacts, political and economic impacts, as well as environmental impacts. Many countries are now in the process of introducing the necessary measures to regain an appropriate level of control over them. For a variety of historical and economic reasons, there could already be sources in any specific country that are not within the usual regulatory system. Some of these may be known about, others may not. Therefore a national strategy is needed to ascertain the likelihood and magnitude of the radioactive source control problem within a country and the priorities necessary to address the problems identified. A well-developed plan for improving control over all relevant radioactive sources, tailored to the national situation, will ensure optimum use of resources such as time, money and personnel. It will allow these limited resources to be allocated appropriately to ensure that control is first regained over those sources presenting the highest risks. This lecture shows a way to develop an appropriate national strategy for regaining control over orphan sources. The methodology described in this lecture is based on the IAEA Recommendations. (author)

  19. Orphan Sources. Extending Radiological Protection outside the Regulatory Framework

    International Nuclear Information System (INIS)

    Eugenio Gil

    2006-01-01

Radioactive sources that are not under appropriate regulatory control (orphan sources) can result in a number of undesirable consequences, including human health impacts, socio-psychological impacts, political and economic impacts, as well as environmental impacts. Many countries are now in the process of introducing the necessary measures to regain an appropriate level of control over them. For a variety of historical and economic reasons, there could already be sources in any specific country that are not within the usual regulatory system. Some of these may be known about, others may not. Therefore a national strategy is needed to ascertain the likelihood and magnitude of the radioactive source control problem within a country and the priorities necessary to address the problems identified. A well-developed plan for improving control over all relevant radioactive sources, tailored to the national situation, will ensure optimum use of resources such as time, money and personnel. It will allow these limited resources to be allocated appropriately to ensure that control is first regained over those sources presenting the highest risks. This lecture shows a way to develop an appropriate national strategy for regaining control over orphan sources. The methodology described in this lecture is based on the IAEA Recommendations. (author)

  20. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors.

    Science.gov (United States)

    Molina-Cantero, Alberto J; Castro-García, Juan A; Lebrato-Vázquez, Clara; Gómez-González, Isabel M; Merino-Monge, Manuel

    2018-03-29

Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-source hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of memory allocated and execution time (number of clock cycles), was analyzed on the low-cost Arduino Genuino platform. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention of using this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino.
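The stream-plus-filter-object pattern the paper describes (there, for Arduino C++) can be sketched in Python. The class name, window size, and test signal below are illustrative, not taken from the library.

```python
import math
from collections import deque

class MovingAverage:
    """Sliding-window low-pass filter: one sample in, one filtered sample out."""
    def __init__(self, window):
        self.buf = deque(maxlen=window)

    def step(self, sample):
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)

fs = 250.0                      # Hz, the biomedical sampling rate cited in the paper
filt = MovingAverage(window=8)  # 8-sample average: first spectral null near fs/8 ~ 31 Hz
out = []
for n in range(500):
    t = n / fs
    x = (math.sin(2 * math.pi * 5 * t)          # 5 Hz "physiological" signal
         + 0.5 * math.sin(2 * math.pi * 60 * t))  # 60 Hz mains interference
    out.append(filt.step(x))  # 5 Hz passes (~0.96 gain); 60 Hz is strongly attenuated
```

The design choice mirrors the paper's architecture: each processing object consumes one sample per tick, so a chain of them runs comfortably in real time at sub-250 Hz rates even on a microcontroller-class device.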

  1. Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices.

    Science.gov (United States)

    Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla

    2017-08-01

We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous and shared protocols for performance assessment of this class of devices. This compact source (12×6 mm²) has been previously developed for range-finding applications and is able to provide short, high-energy (∼100 ps, ∼0.5 nJ) optical pulses at up to 1 MHz repetition rate. Here, we start with a basic laser characterization and an analysis of the suitability of this laser for the diffuse optics application. Then, we present a TD optical system using this source and its performance both in recovering the optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system is able to detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source into a small footprint removes a key technological bottleneck that has so far hampered the realization of a miniaturized TD diffuse optics system able to compete with already established continuous-wave devices in terms of size and cost, but with wider performance potential, as demonstrated by research over the last two decades. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  2. The Chandra Source Catalog : Automated Source Correlation

    Science.gov (United States)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed a different number of times, at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).

  3. Some selection criteria for computers in real-time systems for high energy physics

    International Nuclear Information System (INIS)

    Kolpakov, I.F.

    1980-01-01

The right choice of program source is of great importance for the organization of real-time systems, as cost and reliability are decisive factors. Some selection criteria for program sources for high energy physics multiwire chamber spectrometers (MWCS) are considered in this report. MWCS's accept bits of information from event patterns. Large and small computers, microcomputers and intelligent controllers in CAMAC crates are compared with respect to the following characteristics: data exchange speed, number of addresses for peripheral devices, cost of interfacing a peripheral device, sizes of buffer and mass memory, configuration costs, and the mean time between failures (MTBF). The results of comparisons are shown by plots and histograms which allow the selection of program sources according to the above criteria. (Auth.)

  4. Sources and Timing of Sex Education: Relations with American Adolescent Sexual Attitudes and Behavior

    Science.gov (United States)

    Somers, Cheryl L.; Surmann, Amy T.

    2005-01-01

    The purpose of this study was to explore the comparative contribution that (a) multiple sources of education about sexual topics (peers, media, school and other adults), and (b) the timing of this sex education, make on American adolescent sexual attitudes and behavior. Participants were 672 ethnically and economically diverse male and female,…

  5. Open Source, Openness, and Higher Education

    Science.gov (United States)

    Wiley, David

    2006-01-01

    In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…

  6. Disambiguating past events: Accurate source memory for time and context depends on different retrieval processes.

    Science.gov (United States)

    Persson, Bjorn M; Ainge, James A; O'Connor, Akira R

    2016-07-01

    Current animal models of episodic memory are usually based on demonstrating integrated memory for what happened, where it happened, and when an event took place. These models aim to capture the testable features of the definition of human episodic memory which stresses the temporal component of the memory as a unique piece of source information that allows us to disambiguate one memory from another. Recently though, it has been suggested that a more accurate model of human episodic memory would include contextual rather than temporal source information, as humans' memory for time is relatively poor. Here, two experiments were carried out investigating human memory for temporal and contextual source information, along with the underlying dual process retrieval processes, using an immersive virtual environment paired with a 'Remember-Know' memory task. Experiment 1 (n=28) showed that contextual information could only be retrieved accurately using recollection, while temporal information could be retrieved using either recollection or familiarity. Experiment 2 (n=24), which used a more difficult task, resulting in reduced item recognition rates and therefore less potential for contamination by ceiling effects, replicated the pattern of results from Experiment 1. Dual process theory predicts that it should only be possible to retrieve source context from an event using recollection, and our results are consistent with this prediction. That temporal information can be retrieved using familiarity alone suggests that it may be incorrect to view temporal context as analogous to other typically used source contexts. This latter finding supports the alternative proposal that time since presentation may simply be reflected in the strength of memory trace at retrieval - a measure ideally suited to trace strength interrogation using familiarity, as is typically conceptualised within the dual process framework. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Contributed Review: Source-localization algorithms and applications using time of arrival and time difference of arrival measurements

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xinya [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, USA; Deng, Zhiqun Daniel [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, USA; Rauchenstein, Lynn T. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, USA; Carlson, Thomas J. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, USA

    2016-04-01

Locating the position of fixed or mobile sources (i.e., transmitters) based on measurements received from sensors is an important research area that is attracting much interest. In this paper, we present localization algorithms using time of arrival (TOA) and time difference of arrival (TDOA) measurements to achieve high accuracy under line-of-sight conditions. The circular (TOA) and hyperbolic (TDOA) location systems both use nonlinear equations that relate the locations of the sensors and tracked objects. These nonlinear equations pose accuracy challenges, because of measurement errors, and efficiency challenges, because solving them can impose a high computational burden. Least squares-based and maximum likelihood-based algorithms have become the most popular categories of location estimators. We also summarize the advantages and disadvantages of various positioning algorithms. By improving measurement techniques and localization algorithms, localization applications can be extended into the signal-processing-related domains of radar, sonar, the Global Positioning System, wireless sensor networks, underwater animal tracking, mobile communications, and multimedia.
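As a concrete instance of the least squares category of estimators the review discusses, here is a textbook linearized TOA solver in 2-D. The sensor layout and source position are invented, and the ranges are noise-free for clarity; real systems must cope with measurement error.

```python
import math

def toa_locate(sensors, ranges):
    """Least-squares source position from TOA ranges (2-D, line of sight).
    Subtracting the first circular range equation from the others removes the
    quadratic terms, leaving a linear system in (x, y)."""
    (x0, y0), r0 = sensors[0], ranges[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(sensors[1:], ranges[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(r0**2 - ri**2 + (xi**2 - x0**2) + (yi**2 - y0**2))
    # Normal equations A^T A p = A^T b, solved directly for the 2x2 case.
    ata = [[sum(a[i] * a[j] for a in rows) for j in range(2)] for i in range(2)]
    atb = [sum(a[i] * b for a, b in zip(rows, rhs)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x = (ata[1][1] * atb[0] - ata[0][1] * atb[1]) / det
    y = (ata[0][0] * atb[1] - ata[1][0] * atb[0]) / det
    return x, y

sensors = [(0, 0), (100, 0), (0, 100), (100, 100)]
source = (30.0, 70.0)
ranges = [math.dist(s, source) for s in sensors]
print(toa_locate(sensors, ranges))  # -> approximately (30.0, 70.0)
```

With noisy ranges the same normal-equations machinery yields the least-squares estimate rather than the exact position, which is where the accuracy analysis surveyed in the paper comes in.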

  8. The first synchrotron infrared beamlines at the Advanced Light Source: Spectromicroscopy and fast timing

    International Nuclear Information System (INIS)

    Martin, Michael C.; McKinney, Wayne R.

    1999-01-01

    Two recently commissioned infrared beamlines on the 1.4 bending magnet port at the Advanced Light Source, LBNL, are described. Using a synchrotron as an IR source provides three primary advantages: increased brightness, very fast light pulses, and enhanced far-IR flux. The considerable brightness advantage manifests itself most beneficially when performing spectroscopy on a microscopic length scale. Beamline (BL) 1.4.3 is a dedicated FTIR spectromicroscopy beamline, where a diffraction-limited spot size using the synchrotron source is utilized. BL 1.4.2 consists of a vacuum FTIR bench with a wide spectral range and step-scan capability. This BL makes use of the pulsed nature of the synchrotron light as well as the far-IR flux. Fast timing is demonstrated by observing the pulses from the electron bunch storage pattern at the ALS. Results from several experiments from both IR beamlines will be presented as an overview of the IR research currently being done at the ALS

  9. Quantitative Real-Time PCR Fecal Source Identification in the ...

    Science.gov (United States)

Rivers in the Tillamook Basin play a vital role in supporting a thriving dairy and cheese-making industry, as well as providing a safe water resource for local human and wildlife populations. Historical concentrations of fecal bacteria in these waters are at times too high to allow for safe use, leading to economic loss, endangerment of local wildlife, and poor conditions for recreational use. In this study, we employ host-associated qPCR methods for human (HF183/BacR287 and HumM2), ruminant (Rum2Bac), cattle (CowM2 and CowM3), canine (DG3 and DG37), and avian (GFD) fecal pollution combined with high-resolution geographic information system (GIS) land use data and general indicator bacteria measurements to elucidate water quality spatial and temporal trends. Water samples (n=584) were collected over a 1-year period at 29 sites along the Trask, Kilchis, and Tillamook rivers and tributaries (Tillamook Basin, OR). A total of 16.6% of samples (n=97) yielded E. coli levels considered impaired based on Oregon Department of Environmental Quality bacteria criteria (406 MPN/100mL). Host-associated genetic indicators were detected at frequencies of 39.2% (HF183/BacR287), 16.3% (HumM2), 74.6% (Rum2Bac), 13.0% (CowM2), 26.7% (CowM3), 19.8% (DG3), 3.2% (DG37), and 53.4% (GFD) across all water samples (n=584). Seasonal trends in avian, cattle, and human fecal pollution sources were evident over the study area. On a sample site basis, quantitative fecal source identification and

  10. Adjusting for under-identification of Aboriginal and/or Torres Strait Islander births in time series produced from birth records: Using record linkage of survey data and administrative data sources

    Directory of Open Access Journals (Sweden)

    Lawrence David

    2012-07-01

Full Text Available Background: Statistical time series derived from administrative data sets form key indicators in measuring progress in addressing disadvantage in Aboriginal and Torres Strait Islander populations in Australia. However, inconsistencies in the reporting of Indigenous status can cause difficulties in producing reliable indicators. External data sources, such as survey data, provide a means of assessing the consistency of administrative data and may be used to adjust statistics based on administrative data sources. Methods: We used record linkage between a large-scale survey (the Western Australian Aboriginal Child Health Survey) and two administrative data sources (the Western Australia (WA) Register of Births and the WA Midwives’ Notification System) to compare the degree of consistency in determining Indigenous status of children between the two sources. We then used a logistic regression model predicting probability of consistency between the two sources to estimate the probability of each record on the two administrative data sources being identified as being of Aboriginal and/or Torres Strait Islander origin in a survey. By summing these probabilities we produced model-adjusted time series of neonatal outcomes for Aboriginal and/or Torres Strait Islander births. Results: Compared to survey data, information based only on the two administrative data sources identified substantially fewer Aboriginal and/or Torres Strait Islander births. However, these births were not randomly distributed. Births of children identified as being of Aboriginal and/or Torres Strait Islander origin in the survey only were more likely to be living in urban areas, in less disadvantaged areas, and to have only one parent who identifies as being of Aboriginal and/or Torres Strait Islander origin, particularly the father. They were also more likely to have better health and wellbeing outcomes. Applying an adjustment model based on the linked survey data increased
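The adjustment step, summing model-predicted probabilities over administrative records, can be sketched as follows. The coefficients, record fields, and counts below are hypothetical placeholders for illustration, not the fitted values from the linked survey data.

```python
import math

# Hypothetical logistic coefficients for the probability that a birth *not*
# flagged as Indigenous in administrative data would be identified as
# Aboriginal and/or Torres Strait Islander in the linked survey.
COEFS = {"intercept": -2.0, "urban": 0.8, "father_indigenous": 1.5}

def p_survey_identified(record):
    """Logistic model: p = 1 / (1 + exp(-x.beta))."""
    z = COEFS["intercept"] + sum(COEFS[k] * record.get(k, 0)
                                 for k in ("urban", "father_indigenous"))
    return 1.0 / (1.0 + math.exp(-z))

# Administrative records not identified as Indigenous (toy examples).
unflagged_births = [
    {"urban": 1, "father_indigenous": 1},
    {"urban": 1, "father_indigenous": 0},
    {"urban": 0, "father_indigenous": 0},
]

# Model-adjusted series: add the summed probabilities to the administrative count,
# so each under-identified record contributes fractionally rather than 0 or 1.
admin_count = 120
adjusted_count = admin_count + sum(p_survey_identified(r) for r in unflagged_births)
print(round(adjusted_count, 2))
```

Summing probabilities rather than applying a hard classification threshold is what lets the adjusted time series reflect the systematic, non-random pattern of under-identification the paper reports.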

  11. Space-Time Dependent Transport, Activation, and Dose Rates for Radioactivated Fluids.

    Science.gov (United States)

    Gavazza, Sergio

    Two methods are developed to calculate the space- and time-dependent mass transport of radionuclides, their production and decay, and the associated dose rates generated from the radioactivated fluids flowing through pipes. The work couples space- and time-dependent phenomena, treated as only space- or time-dependent in the open literature. The transport and activation methodology (TAM) is used to numerically calculate space- and time-dependent transport and activation of radionuclides in fluids flowing through pipes exposed to radiation fields, and the volumetric radioactive sources created by radionuclide motions. The computer program Radionuclide Activation and Transport in Pipe (RNATPA1) performs the numerical calculations required in TAM. The gamma ray dose methodology (GAM) is used to numerically calculate space- and time-dependent gamma ray dose equivalent rates from the volumetric radioactive sources determined by TAM. The computer program Gamma Ray Dose Equivalent Rate (GRDOSER) performs the numerical calculations required in GAM. The scope of conditions considered by TAM and GAM herein includes (a) laminar flow in straight pipe, (b) recirculating flow schemes, (c) time-independent fluid velocity distributions, (d) space-dependent monoenergetic neutron flux distributions, (e) space- and time-dependent activation of a single parent nuclide and transport and decay of a single daughter radionuclide, and (f) assessment of space- and time-dependent gamma ray dose rates, outside the pipe, generated by the space- and time-dependent source term distributions inside of it. The methodologies, however, can easily be extended to include all the situations of interest for solving the phenomena addressed in this dissertation. Results obtained by the described calculational procedures are compared with analytical expressions.
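The kind of coupled calculation TAM performs can be illustrated with a minimal 1-D sketch: an explicit upwind scheme advects the fluid along the pipe, a daughter radionuclide is produced inside an irradiated segment, and it decays everywhere. All parameters are invented and the physics is heavily simplified (plug flow, constant parent concentration), so this is a sketch of the space/time coupling, not of RNATPA1.

```python
# Minimal 1-D sketch of coupled transport + activation + decay: fluid moves
# through a pipe at speed v; inside an irradiated segment a daughter
# radionuclide is produced at a constant volumetric rate, and it decays
# everywhere with constant lam. All values are illustrative.

nx, L = 100, 1.0           # cells, pipe length (m)
dx = L / nx
v = 0.5                    # flow speed (m/s)
lam = 0.1                  # decay constant (1/s)
prod = 1.0                 # production rate in irradiated zone (atoms/m^3/s)

def irradiated(i):
    return 0.2 <= i * dx <= 0.4   # irradiated segment of the pipe

dt = 0.5 * dx / v          # CFL-limited explicit time step
n = [0.0] * nx             # daughter concentration per cell

for _ in range(2000):      # march to steady state (explicit upwind)
    new = n[:]
    for i in range(nx):
        upwind = n[i - 1] if i > 0 else 0.0   # fresh fluid enters at x = 0
        adv = -v * (n[i] - upwind) / dx
        src = prod if irradiated(i) else 0.0
        new[i] = n[i] + dt * (adv + src - lam * n[i])
    n = new

# Activity builds up along the irradiated zone, then decays with travel time
# downstream of it; the peak sits at the zone's downstream edge.
peak = max(n)
print(round(peak, 3))
```

The steady-state peak agrees with the analytic value (prod/lam)(1 - exp(-lam * zone_length / v)) to within discretization error.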
The physics of the problems addressed by the new technique and the increased accuracy versus non-space- and time-dependent methods

  12. Addressing Thermal Model Run Time Concerns of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA)

    Science.gov (United States)

    Peabody, Hume; Guerrero, Sergio; Hawk, John; Rodriguez, Juan; McDonald, Carson; Jackson, Cliff

    2016-01-01

    The Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) utilizes an existing 2.4 m diameter Hubble-sized telescope donated from elsewhere in the federal government for near-infrared sky surveys and exoplanet searches to answer crucial questions about the universe and dark energy. The WFIRST design continues to increase in maturity, detail, and complexity with each design cycle leading to a Mission Concept Review and entrance to the Mission Formulation Phase. Each cycle has required a Structural-Thermal-Optical-Performance (STOP) analysis to ensure the design can meet the stringent pointing and stability requirements. As such, the models have also grown in size and complexity, leading to increased model run time. This paper addresses efforts to reduce the run time while still maintaining sufficient accuracy for STOP analyses. A technique was developed to identify slews between observing orientations that were sufficiently different to warrant recalculation of the environmental fluxes, reducing the total number of radiation calculation points. The inclusion of a cryocooler fluid loop in the model also forced smaller time-steps than desired, which greatly increased the overall run time. The analysis of this fluid model required mitigation to drive the run time down by solving portions of the model at different time scales. Lastly, investigations were made into the impact of the removal of small radiation couplings on run time and accuracy. Use of these techniques allowed the models to produce meaningful results within reasonable run times to meet project schedule deadlines.
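The last technique, removing small radiation couplings, trades a little accuracy for fewer couplings to evaluate. The shape of that trade-off can be shown on a toy linearized conductor network; real thermal models exchange heat radiatively with nonlinear T^4 terms, and every conductance and power below is invented.

```python
# Toy illustration (not the WFIRST model) of dropping small couplings:
# solve a linearized steady-state conductor network with and without the
# couplings below a threshold, and compare the resulting temperatures.
# G maps node pairs to illustrative conductances (W/K); Q is applied power (W).

G = {
    (0, 1): 5.0, (1, 2): 4.0, (2, 3): 6.0,
    (0, 2): 0.01, (0, 3): 0.02, (1, 3): 0.015,  # "small" couplings
}
Q = [10.0, 0.0, 0.0, 0.0]   # power into each node (sink node's Q unused)
SINK = 3                    # node 3 held at 300 K as the boundary

def solve(conductors, sweeps=20000):
    T = [300.0] * 4
    for _ in range(sweeps):           # Gauss-Seidel relaxation
        for i in range(4):
            if i == SINK:
                continue
            num, den = Q[i], 0.0
            for (a, b), g in conductors.items():
                if i in (a, b):
                    j = b if i == a else a
                    num += g * T[j]
                    den += g
            T[i] = num / den
    return T

T_full = solve(G)
T_pruned = solve({k: g for k, g in G.items() if g >= 0.1})
err = max(abs(a - b) for a, b in zip(T_full, T_pruned))
print(len(G), round(err, 4))
```

Halving the coupling count here perturbs the temperatures by well under a kelvin, which is the kind of accuracy-versus-run-time judgment the paper describes.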

  13. Real-time speckle variance swept-source optical coherence tomography using a graphics processing unit.

    Science.gov (United States)

    Lee, Kenneth K C; Mariampillai, Adrian; Yu, Joe X Z; Cadotte, David W; Wilson, Brian C; Standish, Beau A; Yang, Victor X D

    2012-07-01

    Advances in swept-source laser technology continue to increase the imaging speed of swept-source optical coherence tomography (SS-OCT) systems. These fast imaging speeds are ideal for microvascular detection schemes, such as speckle variance (SV), where interframe motion can cause severe imaging artifacts and loss of vascular contrast. However, full utilization of the laser scan speed has been hindered by the computationally intensive signal processing required by SS-OCT and SV calculations. Using a commercial graphics processing unit that has been optimized for parallel data processing, we report a complete high-speed SS-OCT platform capable of real-time data acquisition, processing, display, and saving at 108,000 lines per second. Subpixel image registration of structural images was performed in real time prior to SV calculations in order to reduce decorrelation from stationary structures induced by bulk tissue motion. The viability of the system was successfully demonstrated in a high bulk tissue motion scenario of human fingernail root imaging, where SV images (512 × 512 pixels, n = 4) were displayed at 54 frames per second.
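The speckle variance computation itself is simple: for each pixel, take the variance of intensity across a gate of N registered structural frames; moving blood decorrelates between frames (high variance) while static tissue does not. A minimal sketch on toy 2x2 frames with N = 4 (all intensity values invented):

```python
from statistics import pvariance

# Interframe speckle-variance (SV): per-pixel population variance of
# intensity across a gate of N structural frames acquired at one location.

frames = [  # four consecutive structural frames (intensity, arbitrary units)
    [[1.0, 5.0], [2.0, 2.0]],
    [[1.1, 9.0], [2.0, 2.1]],
    [[0.9, 2.0], [2.1, 2.0]],
    [[1.0, 7.0], [1.9, 1.9]],
]

rows, cols, N = 2, 2, len(frames)
sv = [[pvariance([frames[k][i][j] for k in range(N)])
       for j in range(cols)] for i in range(rows)]

# Pixel (0, 1) fluctuates strongly between frames (vessel-like); the other
# pixels barely change (static tissue), so the SV map highlights the vessel.
print(sv)
```

The GPU implementation in the paper evaluates exactly this kind of per-pixel reduction in parallel, after registering the frames to suppress bulk-motion decorrelation.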

  14. Osmium Isotope Compositions of Komatiite Sources Through Time

    Science.gov (United States)

    Walker, R. J.

    2001-12-01

    Extending Os isotopic measurements to ancient plume sources may help to constrain how and when the well-documented isotopic heterogeneities in modern systems were created. Komatiites and picrites associated with plume-related volcanism are valuable tracers of the Os isotopic composition of plumes because of their typically high Os concentrations and relatively low Re/Os. Re-Os data are now available for a variety of Phanerozoic, Proterozoic and Archean komatiites and picrites. As with modern plumes, the sources of Archean and Proterozoic komatiites exhibit a large range of initial 187Os/188Os ratios. Most komatiites are dominated by sources with chondritic Os isotopic compositions (e.g. Song La; Norseman-Wiluna; Pyke Hill; Alexo), though some (e.g. Gorgona) derive from heterogeneous sources. Of note, however, two ca. 2.7 Ga systems, Kostomuksha (Russia) and Belingwe (Zimbabwe), have initial ratios enriched by 2-3% relative to the contemporary convecting upper mantle. These results suggest that if the 187Os enrichment was due to the incorporation of minor amounts of recycled crust into the mantle source of the rocks, the crust formed very early in Earth history. Thus, the Os results could reflect derivation of melt from hybrid mantle whose composition was modified by the addition of mafic crustal material that would most likely have formed between 4.2 and 4.5 Ga. Alternately, the mantle sources of these komatiites may have derived a portion of their Os from the putative 187Os - and 186Os -enriched outer core. For this hypothesis to be applicable to Archean rocks, an inner core of sufficient mass would have to have crystallized sufficiently early in Earth history to generate an outer core with 187Os enriched by at least 3% relative to the chondritic average. 
Using the Pt-Re-Os partition coefficients espoused by our earlier work, and assuming that linear growth of the inner core started at 4.5 Ga and continued to the present, yields an outer core at 2.7 Ga with a gamma Os

  15. Multisource least-squares reverse-time migration with structure-oriented filtering

    Science.gov (United States)

    Fan, Jing-Wen; Li, Zhen-Chun; Zhang, Kai; Zhang, Min; Liu, Xue-Tong

    2016-09-01

    The technology of simultaneous-source acquisition of seismic data excited by several sources can significantly improve data collection efficiency. However, direct imaging of simultaneous-source (blended) data may introduce crosstalk noise and degrade imaging quality. To address this problem, we introduce a structure-oriented filtering operator as a preconditioner into the multisource least-squares reverse-time migration (LSRTM). The structure-oriented filtering operator is a nonstationary filter along structural trends that suppresses crosstalk noise while preserving structural information. The proposed method uses the conjugate-gradient method to minimize the mismatch between predicted and observed data, while effectively attenuating the interference noise caused by exciting several sources simultaneously. Numerical experiments using synthetic data suggest that the proposed method can suppress the crosstalk noise and produce highly accurate images.
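A schematic of the conjugate-gradient least-squares engine named above, on a toy 3x3 system. A simple three-point smoother stands in for the structure-oriented filter and is applied here to the final model; in the actual method the filter acts inside the iteration as a preconditioner. The operator A, data d, and smoother are invented stand-ins, not a migration operator.

```python
# Schematic least-squares inversion via conjugate gradients (CGLS):
# minimize ||A m - d||^2, then pass the model through a crude smoother.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def cgls(A, d, iters=10):
    """Conjugate gradients on the normal equations A^T A m = A^T d."""
    At = [list(col) for col in zip(*A)]
    m = [0.0] * len(A[0])
    r = d[:]                              # residual d - A m
    g = matvec(At, r)                     # gradient A^T r
    p = g[:]
    gg = sum(gi * gi for gi in g)
    for _ in range(iters):
        Ap = matvec(A, p)
        alpha = gg / max(sum(a * a for a in Ap), 1e-300)
        m = [mi + alpha * pi for mi, pi in zip(m, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        g = matvec(At, r)
        gg_new = sum(gi * gi for gi in g)
        beta = gg_new / max(gg, 1e-300)
        gg = gg_new
        p = [gi + beta * pi for gi, pi in zip(g, p)]
    return m

def smooth(x):
    """Three-point moving average: a crude stand-in for a structure-
    oriented filter that damps oscillatory (crosstalk-like) components."""
    n = len(x)
    return [(x[max(i - 1, 0)] + x[i] + x[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

A = [[2.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 3.0]]
d = [3.0, 1.0, 4.0]                       # consistent with the model [1, 1, 1]
m = cgls(A, d)
m_filtered = smooth(m)
print([round(v, 4) for v in m_filtered])
```

For this consistent full-rank system CGLS converges in at most three iterations, and the smoother leaves the (already smooth) exact solution unchanged.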

  16. Measurement of Dα sources for particle confinement time determination in TEXTOR

    International Nuclear Information System (INIS)

    Gray, D.S.; Boedo, J.A.; Conn, R.W.; Finken, K.H.; Mank, G.; Pospieszczyk, A.; Samm, U.

    1993-01-01

    An important quantity in the study of tokamak discharges is the global particle confinement time, defined for each ionic species i by the equation below, where N_i is the total population of the species in the plasma and S_i is the source rate (ionization rate) of the species: τ_pi = N_i/(S_i - dN_i/dt). Of particular significance is the confinement time of the main plasma component, deuterium; here, in most cases of interest, the time derivative is negligible and the confinement time is given by N/S. The deuterium content N can be estimated from the electron content, measured by interferometry, if Z_eff is known. A common method of estimating the fueling rate S is to measure the emission of Dα light from recycling neutrals in the plasma boundary, since collisional-radiative modeling has shown that, for plasma conditions typical in the tokamak edge, the rate of ionization of D atoms and the rate of emission of Dα photons are related by a factor that varies only weakly with electron density and temperature. This paper describes the use of a CCD video camera at TEXTOR for the purpose of spatially resolving the Dα light in order to measure more accurately the total emission so that τ_p can be determined reliably. (author) 5 refs., 5 figs
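The definition above is a one-line computation; with illustrative numbers (not TEXTOR measurements) for a steady flat-top phase:

```python
# Direct application of tau_pi = N_i / (S_i - dN_i/dt). During a steady
# flat-top phase dN/dt is negligible and tau_p reduces to N/S.

N = 2.0e20        # total deuterium content (particles)
S = 4.0e21        # global ionization (fueling) rate inferred from D-alpha (1/s)
dN_dt = 0.0       # steady state

tau_p = N / (S - dN_dt)
print(round(tau_p, 6))   # 0.05 s
```

The experimental burden therefore falls entirely on measuring N (interferometry plus Z_eff) and S (spatially resolved Dα emission), which is what the camera diagnostic addresses.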

  17. Safety measures to address the year 2000 issue at radioactive waste management facilities

    International Nuclear Information System (INIS)

    1999-03-01

    This report evaluates potential impacts of the date problem in computer-based systems, referred to as the year 2000 or Y2K problem, on the safety of radioactive waste management. It addresses the various types of waste, their processing, storage and disposal, decommissioning activities and sealed sources in terms of the approach to the Y2K problem, possible remediation or contingencies, and regulatory considerations. It also assesses typical processes involved in radioactive waste management for their potential to be affected by the Y2K problem, and addresses possible impacts on records and data as well as on instruments and measurements

  18. An Open Source-Based Real-Time Data Processing Architecture Framework for Manufacturing Sustainability

    Directory of Open Access Journals (Sweden)

    Muhammad Syafrudin

    2017-11-01

    Full Text Available Currently, the manufacturing industry is experiencing a data-driven revolution. The manufacturing industry comprises multiple processes that eventually generate a large amount of data. Collecting, analyzing and storing a large amount of data is one of the key elements of the smart manufacturing industry. To ensure that all processes within the manufacturing industry function smoothly, big data processing is needed. Thus, in this study an open source-based real-time data processing (OSRDP) architecture framework was proposed. The OSRDP architecture framework consists of several open source technologies, including Apache Kafka, Apache Storm and NoSQL MongoDB, that are effective and cost-efficient for real-time data processing. Several experiments and an impact analysis for manufacturing sustainability are provided. The results showed that the proposed system is capable of processing massive sensor data efficiently as the number of sensors and devices increases. In addition, data mining based on Random Forest is presented to predict the quality of products given the sensor data as the input. The Random Forest successfully classifies defect and non-defect products, and achieves high accuracy compared to other data mining algorithms. This study is expected to support management in decision-making for product quality inspection and support manufacturing sustainability.
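The dataflow of such a pipeline (broker, stream processor, store) can be mimicked with standard-library pieces. The sketch below only illustrates the producer/consumer pattern; it does not use the actual Apache Kafka, Apache Storm, or MongoDB APIs, and the sensor messages are invented.

```python
import queue
import threading

# Conceptual stand-in for the OSRDP dataflow: a producer plays the role of
# the sensor/Kafka layer, a consumer the Storm-like processing layer, and a
# dict the MongoDB-like store.

buffer = queue.Queue()          # message-broker stand-in
store = {}                      # document-store stand-in
SENTINEL = None                 # end-of-stream marker

def producer(n_readings):
    for i in range(n_readings):
        buffer.put({"sensor_id": i % 3, "value": 0.1 * i})
    buffer.put(SENTINEL)

def consumer():
    while True:
        msg = buffer.get()
        if msg is SENTINEL:
            break
        # "Processing": accumulate a running count and sum per sensor.
        entry = store.setdefault(msg["sensor_id"], {"count": 0, "total": 0.0})
        entry["count"] += 1
        entry["total"] += msg["value"]

t1 = threading.Thread(target=producer, args=(9,))
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(store)
```

In the real architecture each of these roles is a separately scalable service, which is what lets throughput keep up as sensor counts grow.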

  19. {sup 124}Sb–Be photo-neutron source for BNCT: Is it possible?

    Energy Technology Data Exchange (ETDEWEB)

    Golshanian, Mohadeseh [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of); Department of Physics, Shahrood University, Shahrood (Iran, Islamic Republic of); Rajabi, Ali Akbar [Department of Physics, Shahrood University, Shahrood (Iran, Islamic Republic of); Kasesaz, Yaser, E-mail: ykasesaz@aeoi.org.ir [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of)

    2016-11-01

    In this research a computational feasibility study has been performed on the use of a {sup 124}Sb–Be photo-neutron source for Boron Neutron Capture Therapy (BNCT) using the MCNPX Monte Carlo code. For this purpose, a special beam shaping assembly has been designed to provide an appropriate epithermal neutron beam suitable for BNCT. The final result shows that using 150 kCi of {sup 124}Sb, the epithermal neutron flux at the designed beam exit is 0.23×10{sup 9} (n/cm{sup 2} s). In-phantom dose analysis indicates that the treatment time for a brain tumor is about 40 min, which is a reasonable time. This high activity of {sup 124}Sb could be achieved using three 50 kCi rods of {sup 124}Sb, which can be produced in a research reactor. It is clear that, as this activity is several hundred times that of a typical cobalt radiotherapy source, issues related to handling, safety and security must be addressed.

  20. Time dependence of energy spectra of brachytherapy sources and its impact on the half and the tenth value layers

    International Nuclear Information System (INIS)

    Yue, Ning J.; Chen Zhe; Hearn, Robert A.; Rodgers, Joseph J.; Nath, Ravinder

    2009-01-01

    Purpose: Several factors including radionuclide purity influence the photon energy spectra from sealed brachytherapy sources. The existence of impurities and trace elements in radioactive materials as well as the substrate and encapsulation may not only alter the spectrum at a given time but also cause changes in the spectra as a function of time. The purpose of this study is to utilize a semiempirical formalism, which quantitatively incorporates this time dependence, to calculate and evaluate the shielding requirement impacts introduced by this time dependence for a 103Pd source. Methods: The formalism was used to calculate the NthVL thicknesses in lead for a 103Pd model 200 seed. Prior to 2005, the 103Pd in this source was purified to a level better than 0.006% of the total 103Pd activity, the key trace impurity consisting of 65Zn. Because 65Zn emits higher energy photons and has a much longer half-life of 244 days compared to 103Pd, its presence in 103Pd seeds led to a time dependence of the photon spectrum and other related physical quantities. This study focuses on the time dependence of the NthVL and the analysis of the corresponding shielding requirements. Results: The results indicate that the first HVL and the first TVL in lead steadily increased with time for about 200 days and then reached a plateau. The increases at the plateau were more than 1000 times the corresponding values on the zeroth day. The second and third TVLs in lead reached their plateaus in about 100 and 60 days, respectively, and the increases were about 19 and 2.33 times the corresponding values on the zeroth day, respectively. All the TVLs demonstrated a similar time dependence pattern, with substantial increases and an eventual approach to a plateau. Conclusions: The authors conclude that the time dependence of the emitted photon spectra from brachytherapy sources can introduce substantial variations in the values of the NthVL with time if certain impurities are present
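The mechanism behind the time dependence can be reproduced with two exponential decays: the 65Zn impurity (half-life about 244 days) outlives the 103Pd (half-life about 17 days), so its share of the total activity, and with it the spectral hardening that drives the HVL/TVL growth, rises for a few hundred days. The initial impurity fraction below is set at the quoted 0.006% purity limit; the half-lives are standard literature values.

```python
import math

# Relative 65Zn activity share of a 103Pd seed as a function of time.
T_PD, T_ZN = 16.99, 243.9          # half-lives in days (literature values)
A_PD0, A_ZN0 = 1.0, 6e-5           # initial activities (impurity at 0.006%)

def zn_fraction(t_days):
    a_pd = A_PD0 * math.exp(-math.log(2) * t_days / T_PD)
    a_zn = A_ZN0 * math.exp(-math.log(2) * t_days / T_ZN)
    return a_zn / (a_pd + a_zn)

for t in (0, 100, 200, 300):
    print(t, f"{zn_fraction(t):.4f}")
```

After roughly 200 days the higher-energy 65Zn photons dominate the emitted spectrum, which is consistent with the plateau behavior of the value layers reported above.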

  1. Forms of address in isiZulu

    OpenAIRE

    2014-01-01

    M.A. (African Studies) The study deals with forms of address in isiZulu. Therefore, the various aspects of speech that play roles when addressing a person, the factors affecting forms of address in isiZulu and the effect of languages such as English, Afrikaans and other African languages on the forms of address in isiZulu are of interest. Research was conducted on forms of address in isiZulu in parts of Soweto and it was discovered that forms of address are determined by different factors i...

  2. Source-independent time-domain waveform inversion using convolved wavefields: Application to the encoded multisource waveform inversion

    KAUST Repository

    Choi, Yun Seok; Alkhalifah, Tariq Ali

    2011-01-01

    Full waveform inversion requires a good estimation of the source wavelet to improve our chances of a successful inversion. This is especially true for an encoded multisource time-domain implementation, which, conventionally, requires separate

  3. Laser plasma x-ray source for ultrafast time-resolved x-ray absorption spectroscopy

    Directory of Open Access Journals (Sweden)

    L. Miaja-Avila

    2015-03-01

    Full Text Available We describe a laser-driven x-ray plasma source designed for ultrafast x-ray absorption spectroscopy. The source is comprised of a 1 kHz, 20 W, femtosecond pulsed infrared laser and a water target. We present the x-ray spectra as a function of laser energy and pulse duration. Additionally, we investigate the plasma temperature and photon flux as we vary the laser energy. We obtain a 75 μm FWHM x-ray spot size, containing ∼10^6 photons/s, by focusing the produced x-rays with a polycapillary optic. Since the acquisition of x-ray absorption spectra requires the averaging of measurements from >10^7 laser pulses, we also present data on the source stability, including single pulse measurements of the x-ray yield and the x-ray spectral shape. In single pulse measurements, the x-ray flux has a measured standard deviation of 8%, where the laser pointing is the main cause of variability. Further, we show that the variability in x-ray spectral shape from single pulses is low, thus justifying the combining of x-rays obtained from different laser pulses into a single spectrum. Finally, we show a static x-ray absorption spectrum of a ferrioxalate solution as detected by a microcalorimeter array. Altogether, our results demonstrate that this water-jet based plasma source is a suitable candidate for laboratory-based time-resolved x-ray absorption spectroscopy experiments.

  4. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
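For multi-echo T2/T2* data, the pixel-wise mapping that tools like MRmap automate amounts to fitting S(TE) = S0·exp(-TE/T2) at every pixel. A minimal single-pixel sketch using a log-linear least-squares fit; the echo times and tissue values are invented, and real data would additionally need noise handling.

```python
import math

# Mono-exponential relaxation fit: S(TE) = S0 * exp(-TE / T2).
# Taking logs makes it a straight line, so ordinary least squares applies.

def fit_t2(te_ms, signals):
    """Return (S0, T2) from a log-linear least-squares fit."""
    n = len(te_ms)
    ys = [math.log(s) for s in signals]
    mean_t = sum(te_ms) / n
    mean_y = sum(ys) / n
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(te_ms, ys))
             / sum((t - mean_t) ** 2 for t in te_ms))
    intercept = mean_y - slope * mean_t
    return math.exp(intercept), -1.0 / slope

te = [10.0, 20.0, 40.0, 80.0]                   # echo times (ms), assumed
true_s0, true_t2 = 1000.0, 45.0                 # synthetic ground truth
signal = [true_s0 * math.exp(-t / true_t2) for t in te]

s0, t2 = fit_t2(te, signal)
print(round(s0, 1), round(t2, 1))               # noise-free: recovers the inputs
```

A relaxation-time map is simply this fit repeated over every pixel of the registered source images, which is why registration of motion-corrupted series matters before fitting.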

  5. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  6. Situational Strength Cues from Social Sources at Work: Relative Importance and Mediated Effects.

    Science.gov (United States)

    Alaybek, Balca; Dalal, Reeshad S; Sheng, Zitong; Morris, Alexander G; Tomassetti, Alan J; Holland, Samantha J

    2017-01-01

    Situational strength is considered one of the most important situational forces at work because it can attenuate the personality-performance relationship. Although organizational scholars have studied the consequences of situational strength, they have paid little attention to its antecedents. To address this gap, the current study focused on situational strength cues from different social sources as antecedents of overall situational strength at work. Specifically, we examined how employees combine situational strength cues emanating from three social sources (i.e., coworkers, the immediate supervisor, and top management). Based on field theory, we hypothesized that the effect of situational strength from coworkers and immediate supervisors (i.e., proximal sources of situational strength) on employees' perceptions of overall situational strength on the job would be greater than the effect of situational strength from the top management (i.e., the distal source of situational strength). We also hypothesized that the effect of situational strength from the distal source would be mediated by the effects of situational strength from the proximal sources. Data from 363 full-time employees were collected at two time points with a cross-lagged panel design. The former hypothesis was supported for one of the two situational strength facets studied. The latter hypothesis was fully supported.

  7. Situational Strength Cues from Social Sources at Work: Relative Importance and Mediated Effects

    Directory of Open Access Journals (Sweden)

    Balca Alaybek

    2017-09-01

    Full Text Available Situational strength is considered one of the most important situational forces at work because it can attenuate the personality–performance relationship. Although organizational scholars have studied the consequences of situational strength, they have paid little attention to its antecedents. To address this gap, the current study focused on situational strength cues from different social sources as antecedents of overall situational strength at work. Specifically, we examined how employees combine situational strength cues emanating from three social sources (i.e., coworkers, the immediate supervisor, and top management). Based on field theory, we hypothesized that the effect of situational strength from coworkers and immediate supervisors (i.e., proximal sources of situational strength) on employees' perceptions of overall situational strength on the job would be greater than the effect of situational strength from the top management (i.e., the distal source of situational strength). We also hypothesized that the effect of situational strength from the distal source would be mediated by the effects of situational strength from the proximal sources. Data from 363 full-time employees were collected at two time points with a cross-lagged panel design. The former hypothesis was supported for one of the two situational strength facets studied. The latter hypothesis was fully supported.

  8. Providing cell phone numbers and email addresses to Patients: the physician's perspective

    Science.gov (United States)

    2011-01-01

    Background The provision of cell phone numbers and email addresses enhances the accessibility of medical consultations, but can add to the burden of physicians' routine clinical practice and affect their free time. The objective was to assess the attitudes of physicians to providing their telephone number or email address to patients. Methods Primary care physicians in the southern region of Israel completed a structured questionnaire that related to the study objective. Results The study population included 120 primary care physicians with a mean age of 41.2 ± 8.5, 88 of them women (73.3%). Physicians preferred to provide their cell phone number rather than their email address (P = 0.0007). They preferred to answer their cell phones only during the daytime and at predetermined times, but would answer email most hours of the day, including weekends and holidays (P = 0.001). More physicians (79.7%) would have preferred allotted time for email communication than for cell phone communication (50%). However, they felt that email communication was more likely to lead to miscommunication than telephone calls (P = 0.0001). There were no differences between male and female physicians on the provision of cell phone numbers or email addresses to patients. Older physicians were more prepared to provide cell phone numbers than younger ones (P = 0.039). Conclusions The attitude of participating physicians was to provide their cell phone number or email address to some of their patients, but most of them preferred to give out their cell phone number. PMID:21426591

  9. Providing cell phone numbers and email addresses to Patients: the physician's perspective

    Directory of Open Access Journals (Sweden)

    Freud Tamar

    2011-03-01

    Full Text Available Abstract Background The provision of cell phone numbers and email addresses enhances the accessibility of medical consultations, but can add to the burden of physicians' routine clinical practice and affect their free time. The objective was to assess the attitudes of physicians to providing their telephone number or email address to patients. Methods Primary care physicians in the southern region of Israel completed a structured questionnaire that related to the study objective. Results The study population included 120 primary care physicians with a mean age of 41.2 ± 8.5, 88 of them women (73.3%). Physicians preferred to provide their cell phone number rather than their email address (P = 0.0007). They preferred to answer their cell phones only during the daytime and at predetermined times, but would answer email most hours of the day, including weekends and holidays (P = 0.001). More physicians (79.7%) would have preferred allotted time for email communication than for cell phone communication (50%). However, they felt that email communication was more likely to lead to miscommunication than telephone calls (P = 0.0001). There were no differences between male and female physicians on the provision of cell phone numbers or email addresses to patients. Older physicians were more prepared to provide cell phone numbers than younger ones (P = 0.039). Conclusions The attitude of participating physicians was to provide their cell phone number or email address to some of their patients, but most of them preferred to give out their cell phone number.

  10. The First 100 Years of American College of Surgeons Presidential Addresses.

    Science.gov (United States)

    Ghanem, Omar M; Heitmiller, Richard F

    2016-01-01

    We reviewed the first 100 years of presidential addresses delivered at the fall congress of the American College of Surgeons (ACS). Our hypothesis was that these addresses would be an excellent indicator of the College's position on surgical policy, ethics, methods, and education. All ACS presidential addresses from 1913 to 2013 were identified through the ACS archives website. This included the presenter, title, year, and citation if published in a peer-reviewed journal. The text of each address was obtained from the ACS archives or from the listed citations. Addresses were then classified into 1 of 6 subgroups based on content: surgical credo, medical innovation, medical education, surgical history, business and legal, and personal tribute. The 100-year period was divided into five 20-year intervals, and the frequency of each category was graphed over time. There were 111 ACS presidential addresses delivered in the study period. Distribution by category was surgical credo (57%), surgical history (14%), medical innovation (10%), medical education (8%), business and legal (6%), and personal tributes (5%). The frequency of surgical credo has remained stable over time. Business and legal emerged as a new category in 1975. The other topics had low but stable frequency. ACS presidential addresses do reflect the College's position on surgical policy and practice. The College has remained consistent in serving its members, maintaining and defining the role of its organization, the qualifications for membership, and the expectations for the professional conduct of its members. Copyright © 2016. Published by Elsevier Inc.

  11. Validation of the direct analysis in real time source for use in forensic drug screening.

    Science.gov (United States)

    Steiner, Robert R; Larson, Robyn L

    2009-05-01

    The Direct Analysis in Real Time (DART) ion source is a relatively new mass spectrometry technique that is seeing widespread use in chemical analyses world-wide. DART studies include such diverse topics as analysis of flavors and fragrances, melamine in contaminated dog food, differentiation of writing inks, characterization of solid counterfeit drugs, and as a detector for planar chromatography. Validation of this new technique for the rapid screening of forensic evidence for drugs of abuse, utilizing the DART source coupled to an accurate mass time-of-flight mass spectrometer, was conducted. The study consisted of the determination of the lower limit of detection for the method, determination of selectivity and a comparison of this technique to established analytical protocols. Examples of DART spectra are included. The results of this study have allowed the Virginia Department of Forensic Science to incorporate this new technique into their analysis scheme for the screening of solid dosage forms of drugs of abuse.

  12. The Dynamic Method for Time-of-Flight Measurement of Thermal Neutron Spectra from Pulsed Sources

    International Nuclear Information System (INIS)

    Pepelyshev, Yu.N.; Tulaev, A.B.; Bobrakov, V.F.

    1994-01-01

    The time-of-flight method for the measurement of thermal neutron spectra in pulsed neutron sources with high neutron-registration efficiency, more than 10⁵ times higher than that of the traditional method, is described. The main problems connected with the electric current technique for time-of-flight spectra measurement are examined. The methodical errors, problems of a special neutron detector design, and other questions are discussed. Some experimental results, spectra from surfaces of the water and solid methane moderators, obtained at the pulsed reactor IBR-2 (Dubna, Russia), are presented. 4 refs., 5 figs
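    The time-of-flight principle behind the measurements above converts a neutron's flight time over a known path into kinetic energy. A minimal sketch of that conversion (the path length and flight time below are illustrative assumptions, not values from the record):

    ```python
    # Time-of-flight to energy: v = L / t, E = (1/2) m v^2 (non-relativistic,
    # which is fine for thermal neutrons).
    NEUTRON_MASS = 1.674927e-27  # kg (CODATA)
    EV = 1.602177e-19            # joules per eV

    def tof_energy_ev(flight_path_m: float, flight_time_s: float) -> float:
        """Kinetic energy (eV) of a neutron covering flight_path_m in flight_time_s."""
        v = flight_path_m / flight_time_s
        return 0.5 * NEUTRON_MASS * v * v / EV

    # A neutron crossing a 10 m flight path in 4.5 ms moves at ~2222 m/s,
    # i.e. roughly thermal energy:
    e = tof_energy_ev(10.0, 4.5e-3)
    print(f"{e * 1000:.1f} meV")  # → 25.8 meV
    ```

    Slower neutrons arrive later, so binning detector counts by arrival time directly yields the spectrum.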

  13. Mapping of MAC Address with Moving WiFi Scanner

    Directory of Open Access Journals (Sweden)

    Arief Hidayat

    2017-10-01

    Full Text Available Recently, WiFi has become one of the most useful technologies for detecting and counting MAC addresses. This paper describes the use of a WiFi scanner carried on a bus that circulated its route seven times. The method uses WiFi and GPS to count MAC addresses as raw data from pedestrians' smartphones, bus passengers, or WiFi devices near the bus while it travels its route. Seven processing steps are used to produce the map of WiFi data.
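    The MAC-address counting described above amounts to deduplicating addresses seen across repeated scan passes, keeping a GPS tag per device. A minimal sketch with invented scan records (this is an assumption about the approach, not the paper's implementation):

    ```python
    # Deduplicate MAC addresses across scan passes; remember the GPS fix
    # of each device's first sighting.
    from collections import OrderedDict

    def unique_devices(scans):
        """scans: iterable of (mac, lat, lon) tuples from successive passes.
        Returns {mac: (lat, lon) of first sighting}."""
        seen = OrderedDict()
        for mac, lat, lon in scans:
            seen.setdefault(mac.lower(), (lat, lon))  # MACs are case-insensitive
        return seen

    passes = [
        ("AA:BB:CC:00:11:22", -6.98, 110.41),
        ("aa:bb:cc:00:11:22", -6.99, 110.42),   # same device, later pass
        ("DE:AD:BE:EF:00:01", -6.97, 110.40),
    ]
    print(len(unique_devices(passes)))  # → 2 distinct devices
    ```

    In practice, randomized MAC addresses on modern smartphones would inflate such counts, which any real deployment has to account for.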

  14. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho; Pyeon, Cheol Ho

    2015-01-01

    In this study, a new balance equation to overcome the problems generated by the previous methods is proposed using a source-based balance equation, and a simple problem is then analyzed with the proposed method. In this study, a source-based balance equation with a time-dependent fission kernel was derived to simplify the kinetics equation. To analyze the partial variations of reactor characteristics, two representative methods were introduced in previous studies: (1) the quasi-statics method and (2) the multipoint technique. The main idea of the quasi-statics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, the time-dependent flux is first separated into shape and amplitude functions, and the shape function is calculated. It is noted that the method has good accuracy; however, it can be computationally expensive because the shape function must be fully recalculated to obtain accurate results. To improve the calculation efficiency, the multipoint method was proposed. The multipoint method is based on the classic kinetics equation, using Green's function to analyze the flight probability from region r' to r. Those previous methods have been used for reactor kinetics analysis; however, they have some limitations. First, three group variables (r_g, E_g, t_g) must be considered to solve the time-dependent balance equation, which severely limits application to large-system problems with good accuracy. Second, energy-group neutrons must be used to analyze reactor kinetics problems. In a time-dependent problem, the neutron energy distribution can change over time, which alters the group cross sections and can therefore degrade accuracy. Third, the neutrons in one space-time region continually affect other space-time regions; however, this is not properly considered in the previous methods. Using birth history of the neutron sources
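    The shape/amplitude separation underlying the quasi-statics method mentioned above is, in its standard textbook form (a general statement, not taken from this record):

    ```latex
    % Quasi-statics: factor the flux into an amplitude and a shape function,
    % with a normalization condition that makes the split unique
    \phi(\mathbf{r},E,\mathbf{\Omega},t) = A(t)\,\psi(\mathbf{r},E,\mathbf{\Omega},t),
    \qquad
    \frac{d}{dt}\int w(\mathbf{r},E,\mathbf{\Omega})\,
    \frac{\psi(\mathbf{r},E,\mathbf{\Omega},t)}{v}\,
    \mathrm{d}V\,\mathrm{d}E\,\mathrm{d}\Omega = 0
    ```

    Here $A(t)$ is the rapidly varying amplitude, $\psi$ the slowly varying shape, and $w$ a weight function, commonly the adjoint flux. The cost noted in the abstract comes from having to recompute $\psi$ whenever the shape assumption breaks down.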

  15. Estimation of the Plant Time Constant of Current-Controlled Voltage Source Converters

    DEFF Research Database (Denmark)

    Vidal, Ana; Yepes, Alejandro G.; Malvar, Jano

    2014-01-01

    Precise knowledge of the plant time constant is essential to perform a thorough analysis of the current control loop in voltage source converters (VSCs). As the loop behavior can be significantly influenced by the VSC working conditions, the effects associated to converter losses should be included...... in the model, through an equivalent series resistance. In a recent work, an algorithm to identify this parameter was developed, considering the inductance value as known and practically constant. Nevertheless, the plant inductance can also present important uncertainties with respect to the inductance...... of the VSC interface filter measured at rated conditions. This paper extends that method so that both parameters of the plant time constant (resistance and inductance) are estimated. Such enhancement is achieved through the evaluation of the closed-loop transient responses of both axes of the synchronous...

  16. Exact analytical solution of time-independent neutron transport equation, and its applications to systems with a point source

    International Nuclear Information System (INIS)

    Mikata, Y.

    2014-01-01

    Highlights: • An exact solution for the one-speed neutron transport equation is obtained. • This solution as well as its derivation are believed to be new. • Neutron flux for a purely absorbing material with a point neutron source off the origin is obtained. • Spherically as well as cylindrically piecewise constant cross sections are studied. • Neutron flux expressions for a point neutron source off the origin are believed to be new. - Abstract: An exact analytical solution of the time-independent monoenergetic neutron transport equation is obtained in this paper. The solution is applied to systems with a point source. Systematic analysis of the solution of the time-independent neutron transport equation, and its applications represent the primary goal of this paper. To the best of the author’s knowledge, certain key results on the scalar neutron flux as well as their derivations are new. As an application of these results, a scalar neutron flux for a purely absorbing medium with a spherically piecewise constant cross section and an isotropic point neutron source off the origin as well as that for a cylindrically piecewise constant cross section with a point neutron source off the origin are obtained. Both of these results are believed to be new

  17. Real time measurements of submicrometer aerosols in Seoul, Korea: Sources, characteristics, and processing of organic aerosols during winter time.

    Science.gov (United States)

    Kim, H.; Zhang, Q.

    2016-12-01

    Highly time-resolved chemical characterization of non-refractory submicrometer particulate matter (NR-PM1) was conducted in Seoul, the capital of Korea, using an Aerodyne high-resolution time-of-flight aerosol mass spectrometer (HR-ToF-AMS). The measurements were performed during winter, when persistent air quality problems associated with elevated PM concentrations were observed. The average NR-PM1 concentration was 27.5 µg m⁻³, and the average mass was dominated by organics (44%), followed by nitrate (24%) and sulfate (10%). Five distinct sources of organic aerosol (OA) were identified from positive matrix factorization (PMF) analysis of the AMS data: vehicle emissions represented by a hydrocarbon-like OA factor (HOA), cooking represented by a cooking OA factor (COA), wood combustion represented by a biomass burning OA factor (BBOA), and secondary aerosol formation in the atmosphere represented by a semi-volatile oxygenated OA factor (SVOOA) and a low-volatility oxygenated OA factor (LVOOA). These factors, on average, contributed 16, 20, 23, 15, and 26% of the total OA mass, respectively, with primary organic aerosol (POA = HOA + COA + BBOA) accounting for 59% of the OA mass. On average, both primary emissions and secondary aerosol formation are important factors affecting air quality in Seoul during winter, contributing approximately equally. However, differences in PM source fractions and properties were observed between high and low PM loading periods. For example, during stagnant periods with low wind speed (WS) (0.99 ± 0.7 m/s) and high RH (71%), high PM loadings (43.6 ± 12.4 µg m⁻³) with enhanced fractions of nitrate (27%) and SVOOA (8%) were observed, indicating a strong influence from locally generated secondary aerosol. On the other hand, when low PM loadings (12.6 ± 7.1 µg m⁻³) were observed, commonly associated with high WS (1.8 ± 1.1 m/s) and low RH (50%), the fraction of regional sources, such as sulfate (12%) and LVOOA (21
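    The positive matrix factorization used above decomposes the matrix of measured mass spectra into factor contributions and profiles. In the standard Paatero–Tapper formulation (a general statement of the method, not taken from this record):

    ```latex
    % PMF model: data = contributions x profiles + residuals, minimizing the
    % uncertainty-weighted residual sum of squares under nonnegativity
    x_{ij} = \sum_{p=1}^{P} g_{ip}\,f_{pj} + e_{ij},
    \qquad
    Q = \sum_{i}\sum_{j}\left(\frac{e_{ij}}{\sigma_{ij}}\right)^{2} \rightarrow \min,
    \qquad g_{ip} \ge 0,\; f_{pj} \ge 0
    ```

    Here $x_{ij}$ is the signal of ion $j$ at time $i$, $f_{pj}$ the mass-spectral profile of factor $p$ (e.g. HOA, COA, BBOA), $g_{ip}$ its time-varying contribution, and $\sigma_{ij}$ the measurement uncertainty. The five OA factors in the abstract correspond to choosing $P = 5$.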

  18. Interfacing Hardware Accelerators to a Time-Division Multiplexing Network-on-Chip

    DEFF Research Database (Denmark)

    Pezzarossa, Luca; Sørensen, Rasmus Bo; Schoeberl, Martin

    2015-01-01

    This paper addresses the integration of stateless hardware accelerators into time-predictable multi-core platforms based on time-division multiplexing networks-on-chip. Stateless hardware accelerators, like floating-point units, are typically attached as co-processors to individual processors in ...... implementation. The design evaluation is carried out using the open source T-CREST multi-core platform implemented on an Altera Cyclone IV FPGA. The size of the proposed design, including a floating-point accelerator, is about two-thirds of a processor....

  19. Performance analysis of an acoustic time reversal system in dynamic and random oceanic environments

    Science.gov (United States)

    Khosla, Sunny Rajendra

    This dissertation provides a theoretical framework along with specific performance predictions for an acoustic time reversal system in shallow oceanic environments. Acoustic time reversal is a robust means of retrofocusing acoustic energy, in both time and space, to the original sound-source location without any information about the acoustic environment in which it is deployed. Three performance-limiting oceanic complexities are addressed: (i) the ambient noise field, (ii) reflection and volume scattering from a deterministic soliton internal wave traveling on the thermocline between two water masses, and (iii) volume scattering from a random superposition of linear internal waves convecting a gradient in the sound speed profile. The performance analysis establishes acoustic time reversal as a promising technology for a two-way communication system in an oceanic medium. For an omni-directional noisy environment, a general formulation for the probability of retrofocusing is developed that includes the effect of the medium and accounts for the system hardware and the acoustic parameters. Monte Carlo simulations in both a free-space environment and a shallow-ocean sound-channel environment compare well with theory. A 41-element TRA spanning a shallow water depth of 60 m is predicted to return a 70% focal probability at -15 dB SNR for a source-to-array range of 6 km. Preliminary research with broadband signals suggests that they should outperform the narrowband response in both free-space and sound-channel environments. The impact of the nonlinear solitary waves is addressed using a two-path Green's function to treat the presence of a flat thermocline, and the single-scattering Born approximation to address scattering from the soliton internal wave. It is predicted that a stationary soliton located along ray turning paths between the source and the TRA can lead to both enhanced and degraded focal performance. Based on extension of previous research in wave

  20. Time-Dependent Searches for Point Sources of Neutrinos with the 40-String and 22-String Configurations of IceCube

    Science.gov (United States)

    Stamatikos, M.

    2012-01-01

    This paper presents four searches for flaring sources of neutrinos using the IceCube neutrino telescope. For the first time, a search is performed over the entire parameter space of energy, direction, and time, with sensitivity to neutrino flares from astrophysical sources lasting from 20 microseconds to a year. Searches which integrate over time are less sensitive to flares because they are affected by a larger background of atmospheric neutrinos and muons, which can be reduced by the use of additional timing information. Flaring sources considered here, such as active galactic nuclei, soft gamma-ray repeaters, and gamma-ray bursts, are promising candidate neutrino emitters. Two searches are untriggered in the sense that they look for any possible flare in the entire sky and from a predefined catalog of sources from which photon flares have been recorded. The other two searches are triggered by multi-wavelength information on flares from blazars and from a soft gamma-ray repeater. One triggered search uses lightcurves from Fermi-LAT, which provides continuous monitoring. A second triggered search uses information where the flux states have been measured only for short periods of time near the flares. The untriggered searches use data taken by 40 strings of IceCube between Apr 5, 2008 and May 20, 2009. The triggered searches also use data taken by the 22-string configuration of IceCube operating between May 31, 2007 and Apr 5, 2008. The results from all four searches are compatible with a fluctuation of the background.

  1. The maladies of water and war: addressing poor water quality in Iraq.

    Science.gov (United States)

    Zolnikov, Tara Rava

    2013-06-01

    Water is essential in providing nutrients, but contaminated water contributes to poor population health. Water quality and availability can change in unstructured situations, such as war. To develop a practical strategy to address poor water quality resulting from intermittent wars in Iraq, I reviewed information from academic sources regarding waterborne diseases, conflict and war, water quality treatment, and malnutrition. The prevalence of disease was high in impoverished, malnourished populations exposed to contaminated water sources. The data aided in developing a strategy to improve water quality in Iraq, which encompasses remineralized water from desalination plants, health care reform, monitoring and evaluation systems, and educational public health interventions.

  2. Three-dimensional imagery by encoding sources of X rays

    International Nuclear Information System (INIS)

    Magnin, Isabelle

    1987-01-01

    This research thesis addresses the theoretical and practical study of X ray coded sources, and thus notably aims at exploring whether it would be possible to transform a standard digital radiography apparatus (such as those operated in hospital radiology departments) into a low-cost three-dimensional imagery system. The author first recalls the principle of conventional tomography and attempts at its improvement, and describes imagery techniques based on the use of encoding openings and source encoding. She reports the modelling of an imagery system based on encoded sources of X rays, and addresses the original notion of the three-dimensional response of such a system. The author then addresses the reconstruction method by considering the reconstruction of a plane object, of a multi-plane object, and of a real three-dimensional object. The frequency properties and the tomographic capacities of various types of source codes are analysed. She describes a prototype tomography apparatus, and presents and discusses three-dimensional reconstructions of actual phantoms. She finally introduces a new principle of dynamic three-dimensional radiography which implements an acquisition technique by 'gating code'. The acquisition principle should allow the reconstruction of volumes animated by periodic deformations, such as the heart [fr]

  3. Synchrotron light source data book

    International Nuclear Information System (INIS)

    Murphy, J.

    1989-01-01

    The ''Synchrotron Light Source Data Book'' is, as its name implies, a collection of data on existing and planned synchrotron light sources. The intention was to provide a compendium of tools for the design of electron storage rings as synchrotron radiation sources. The slant is toward the accelerator physicist, as other booklets, such as the X-ray Data Booklet edited by D. Vaughan (LBL PUB-490), address the 'use' of synchrotron radiation. It is hoped that the booklet serves as a pocket-sized reference to facilitate back-of-the-envelope calculations. It contains some useful formulae in 'practical units' and a brief description of many of the existing and planned light source lattices

  4. Radiation sources working group summary

    International Nuclear Information System (INIS)

    Fazio, M.V.

    1998-01-01

    The Radiation Sources Working Group addressed advanced concepts for the generation of RF energy to power advanced accelerators. The focus of the working group included advanced sources and technologies above 17 GHz. The topics discussed included RF sources above 17 GHz, pulse compression techniques to achieve extreme peak power levels, components technology, technology limitations and physical limits, and other advanced concepts. RF sources included gyroklystrons, magnicons, free-electron masers, two beam accelerators, and gyroharmonic and traveling wave devices. Technology components discussed included advanced cathodes and electron guns, high temperature superconductors for producing magnetic fields, RF breakdown physics and mitigation, and phenomena that impact source design such as fatigue in resonant structures due to RF heating. New approaches for RF source diagnostics located internal to the source were discussed for detecting plasma and beam phenomena existing in high energy density electrodynamic systems in order to help elucidate the reasons for performance limitations

  5. Real-Time Implementation of Islanded Microgrid for Remote Areas

    Directory of Open Access Journals (Sweden)

    Monika Jain

    2016-01-01

    Full Text Available Islanding is a condition in which a microgrid, or a portion of the power grid consisting of distributed generation (DG) sources, a converter, and load, gets disconnected from the utility grid. Under this condition the DG sources in a microgrid must switch to a voltage control mode in order to provide constant voltage to local loads. In grid-connected mode, the microgrid works as a current controller and injects power into the main grid, depending on the power generation and local load, with suitable market policies. Providing constant voltage at a stable frequency with proper synchronization among the DGs in a microgrid is a challenge. The complexity of such a grid requires careful study and analysis before actual implementation. These challenges of the microgrid are addressed using real-time OPAL-RT simulation technology. Thus the paper describes an islanded microgrid with a master-slave controller for power balance, voltage/frequency regulation, and synchronization. Based on an advanced real-time platform named Real-Time Laboratory (RT-LAB), the impacts of the micro sources, load, and converters in an islanded microgrid are studied in this paper. The effectiveness of the proposed controller is analyzed through experimental results under balanced/unbalanced nonlinear load conditions.

  6. THE COMPACT, TIME-VARIABLE RADIO SOURCE PROJECTED INSIDE W3(OH): EVIDENCE FOR A PHOTOEVAPORATED DISK?

    International Nuclear Information System (INIS)

    Dzib, Sergio A.; Rodríguez-Garza, Carolina B.; Rodríguez, Luis F.; Kurtz, Stan E.; Loinard, Laurent; Zapata, Luis A.; Lizano, Susana

    2013-01-01

    We present new Karl G. Jansky Very Large Array (VLA) observations of the compact (∼0.″05), time-variable radio source projected near the center of the ultracompact H II region W3(OH). The analysis of our new data as well as of VLA archival observations confirms the variability of the source on timescales of years and, for a given epoch, indicates a spectral index of α = 1.3 ± 0.3 (S_ν ∝ ν^α). This spectral index and the brightness temperature of the source (∼6500 K) suggest that we are most likely detecting partially optically thick free-free radiation. The radio source is probably associated with the ionizing star of W3(OH), but an interpretation in terms of an ionized stellar wind fails because the detected flux densities are orders of magnitude larger than expected. We discuss several scenarios and tentatively propose that the radio emission could arise in a static ionized atmosphere around a fossil photoevaporated disk
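    A spectral index like the one quoted above follows directly from flux densities measured at two frequencies. A minimal two-point sketch with illustrative numbers (not the VLA measurements from the record):

    ```python
    # Spectral index alpha such that S_nu ∝ nu**alpha, from two
    # (flux density, frequency) pairs.
    import math

    def spectral_index(s1, nu1, s2, nu2):
        """alpha from flux densities s1, s2 (same units) at frequencies nu1, nu2."""
        return math.log(s2 / s1) / math.log(nu2 / nu1)

    # A source rising from 1.0 mJy at 5 GHz to 2.5 mJy at 10 GHz:
    alpha = spectral_index(1.0, 5.0, 2.5, 10.0)
    print(f"alpha = {alpha:.2f}")  # → alpha = 1.32
    ```

    A positive alpha near 1, as here, is the signature of partially optically thick free-free emission; optically thin free-free emission would instead give alpha ≈ -0.1.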

  7. Strategies for satellite-based monitoring of CO2 from distributed area and point sources

    Science.gov (United States)

    Schwandner, Florian M.; Miller, Charles E.; Duren, Riley M.; Natraj, Vijay; Eldering, Annmarie; Gunson, Michael R.; Crisp, David

    2014-05-01

    Atmospheric CO2 budgets are controlled by the strengths, as well as the spatial and temporal variabilities of CO2 sources and sinks. Natural CO2 sources and sinks are dominated by the vast areas of the oceans and the terrestrial biosphere. In contrast, anthropogenic and geogenic CO2 sources are dominated by distributed area and point sources, which may constitute as much as 70% of anthropogenic (e.g., Duren & Miller, 2012), and over 80% of geogenic emissions (Burton et al., 2013). Comprehensive assessments of CO2 budgets necessitate robust and highly accurate satellite remote sensing strategies that address the competing and often conflicting requirements for sampling over disparate space and time scales. Spatial variability: The spatial distribution of anthropogenic sources is dominated by patterns of production, storage, transport and use. In contrast, geogenic variability is almost entirely controlled by endogenic geological processes, except where surface gas permeability is modulated by soil moisture. Satellite remote sensing solutions will thus have to vary greatly in spatial coverage and resolution to address distributed area sources and point sources alike. Temporal variability: While biogenic sources are dominated by diurnal and seasonal patterns, anthropogenic sources fluctuate over a greater variety of time scales from diurnal, weekly and seasonal cycles, driven by both economic and climatic factors. Geogenic sources typically vary in time scales of days to months (geogenic sources sensu stricto are not fossil fuels but volcanoes, hydrothermal and metamorphic sources). Current ground-based monitoring networks for anthropogenic and geogenic sources record data on minute- to weekly temporal scales. Satellite remote sensing solutions would have to capture temporal variability through revisit frequency or point-and-stare strategies. Space-based remote sensing offers the potential of global coverage by a single sensor. 
However, no single combination of orbit

  8. SIMS: addressing the problem of heterogeneity in databases

    Science.gov (United States)

    Arens, Yigal

    1997-02-01

    The heterogeneity of remotely accessible databases -- with respect to contents, query language, semantics, organization, etc. -- presents serious obstacles to convenient querying. The SIMS (single interface to multiple sources) system addresses this global integration problem. It does so by defining a single language for describing the domain about which information is stored in the databases and using this language as the query language. Each database to which SIMS is to provide access is modeled using this language. The model describes a database's contents, organization, and other relevant features. SIMS uses these models, together with a planning system drawing on techniques from artificial intelligence, to decompose a given user's high-level query into a series of queries against the databases and other data manipulation steps. The retrieval plan is constructed so as to minimize data movement over the network and maximize parallelism to increase execution speed. SIMS can recover from network failures during plan execution by obtaining data from alternate sources, when possible. SIMS has been demonstrated in the domains of medical informatics and logistics, using real databases.

  9. Hard real-time quick EXAFS data acquisition with all open source software on a commodity personal computer

    International Nuclear Information System (INIS)

    So, I.; Siddons, D.P.; Caliebe, W.A.; Khalid, S.

    2007-01-01

    We describe here the data acquisition subsystem of the Quick EXAFS (QEXAFS) experiment at the National Synchrotron Light Source of Brookhaven National Laboratory. For ease of future growth and flexibility, almost all software components are open source with very active maintainers. Among them are Linux running on an x86 desktop computer, RTAI for real-time response, the COMEDI driver for the data acquisition hardware, Qt and PyQt for the graphical user interface, PyQwt for plotting, and Python for scripting. The signal (A/D) and energy-reading (IK220 encoder) devices in the PCI computer are also EPICS enabled. The control system scans the monochromator energy through a networked EPICS motor. With the real-time kernel, the system is capable of a deterministic data-sampling period of tens of microseconds with typical timing jitter of several microseconds. At the same time, Linux runs other non-real-time processes handling the user interface. A modern Qt-based controls frontend enhances productivity. The fast plotting and zooming of data in time or energy coordinates let the experimenters verify the quality of the data before detailed analysis. Python scripting is built in for automation. The typical data rate for continuous runs is around 10 MB/min

  10. "Using recruitment source timing and diagnosticity to enhance applicants' occupation-specific human capital": Correction to Campion, Ployhart, and Campion (2017).

    Science.gov (United States)

    2017-05-01

    Reports an error in "Using Recruitment Source Timing and Diagnosticity to Enhance Applicants' Occupation-Specific Human Capital" by Michael C. Campion, Robert E. Ployhart and Michael A. Campion (Journal of Applied Psychology, Advanced Online Publication, Feb 02, 2017, np). In the article, the following headings were inadvertently set at the wrong level: Method, Participants and Procedure, Measures, Occupation specific human capital, Symbolic jobs, Relevant majors, Occupation-specific capital hotspots, Source timing, Source diagnosticity, Results, and Discussion. All versions of this article have been corrected. (The following abstract of the original article appeared in record 2017-04566-001.) This study proposes that reaching applicants through more diagnostic recruitment sources earlier in their educational development (e.g., in high school) can lead them to invest more in their occupation-specific human capital (OSHC), thereby making them higher quality candidates. Using a sample of 78,157 applicants applying for jobs within a desirable professional occupation in the public sector, results indicate that applicants who report hearing about the occupation earlier, and applicants who report hearing about the occupation through more diagnostic sources, have higher levels of OSHC upon application. Additionally, source timing and diagnosticity affect the likelihood of candidates applying for jobs symbolic of the occupation, selecting relevant majors, and attending educational institutions with top programs related to the occupation. These findings suggest a firm's recruiting efforts may influence applicants' OSHC investment strategies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. Using Citizen Science and Crowdsourcing via Aurorasaurus as a Near Real Time Data Source for Space Weather Applications

    Science.gov (United States)

    MacDonald, E.; Heavner, M.; Hall, M.; Tapia, A.; Lalone, N.; Clayon, J.; Case, N.

    2014-12-01

    Aurorasaurus is on the cutting edge of space science, citizen science, and computer science simultaneously with the broad goals to develop a real-time citizen science network, educate the general public about the northern lights, and revolutionize real-time space weather nowcasting of the aurora for the public. We are currently in the first solar maximum with social media, which enables the technological roots to connect users, citizen scientists, and professionals around a shared global, rare interest. We will introduce the project which has been in a prototype mode since 2012 and recently relaunched with a new mobile and web presence and active campaigns. We will showcase the interdisciplinary advancements which include a more educated public, disaster warning system applications, and improved real-time ground truth data including photographs and observations of the Northern Lights. We will preview new data which validates the proof of concept for significant improvements in real-time space weather nowcasting. Our aim is to provide better real-time notifications of the visibility of the Northern Lights to the interested public via the combination of noisy crowd-sourced ground truth with noisy satellite-based predictions. The latter data are available now but are often delivered with significant jargon and uncertainty, thus reliable, timely interpretation of such forecasts by the public are problematic. The former data show real-time characteristic significant rises (in tweets for instance) that correlate with other non-real-time indices of auroral activity (like the Kp index). We will discuss the source of 'noise' in each data source. Using citizen science as a platform to provide a basis for deeper understanding is one goal; secondly we want to improve understanding of and appreciation for the dynamics and beauty of the Northern Lights by the public and scientists alike.

  12. The continued development of the Spallation Neutron Source external antenna H- ion source

    International Nuclear Information System (INIS)

    Welton, R. F.; Carmichael, J.; Fuga, R.; Goulding, R. H.; Han, B.; Kang, Y.; Lee, S. W.; Murray, S. N.; Pennisi, T.; Potter, K. G.; Santana, M.; Stockli, M. P.; Desai, N. J.

    2010-01-01

    The U.S. Spallation Neutron Source (SNS) is an accelerator-based, pulsed neutron-scattering facility, currently in the process of ramping up neutron production. In order to ensure that the SNS will meet its operational commitments as well as provide for future facility upgrades with high reliability, we are developing an rf-driven H⁻ ion source based on a water-cooled, ceramic aluminum nitride (AlN) plasma chamber. To date, early versions of this source have delivered up to 42 mA to the SNS front end and unanalyzed beam currents up to ∼100 mA (60 Hz, 1 ms) to the ion source test stand. This source was operated on the SNS accelerator from February to April 2009 and produced ∼35 mA (the beam current required by the ramp-up plan) with availability of ∼97%. During this run several ion source failures identified reliability issues which must be addressed before the source re-enters production: plasma ignition, antenna lifetime, magnet cooling, and cooling jacket integrity. This report discusses these issues, details proposed engineering solutions, and notes progress to date.

  13. The effect of magnetic field strength on the time evolution of high energy bremsstrahlung radiation created by an electron cyclotron resonance ion source

    Energy Technology Data Exchange (ETDEWEB)

    Ropponen, T. [Department of Physics, University of Jyvaeskylae, P.O. Box 35, FI-40014 (Finland)], E-mail: tommi.ropponen@phys.jyu.fi; Tarvainen, O. [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Jones, P.; Peura, P.; Kalvas, T. [Department of Physics, University of Jyvaeskylae, P.O. Box 35, FI-40014 (Finland); Suominen, P. [Prizztech Ltd/Magnet Technology Centre, Tiedepuisto 4, FI-28600 Pori (Finland); Koivisto, H.; Arje, J. [Department of Physics, University of Jyvaeskylae, P.O. Box 35, FI-40014 (Finland)

    2009-03-11

    An electron cyclotron resonance (ECR) ion source is one of the most widely used ion source types for high-charge-state heavy ion production. In an ECR plasma, the electrons are heated by radio frequency microwaves in order to ionize neutral gases. As a consequence, ECR heating also generates very high electron energies (up to the MeV region), which can produce a vast amount of bremsstrahlung radiation, causing problems with radiation shielding and heating the superconducting cryostat of an ECR ion source. To gain information about the time evolution of the electron energies in the ECR plasma, radial bremsstrahlung measurements were performed. The JYFL 14 GHz ECR ion source was operated in pulsed mode, and time evolution measurements were made with different axial magnetic field strengths using oxygen and argon plasmas. The bremsstrahlung data were analyzed with a time interval of 2 ms, yielding information in unprecedented detail about the time evolution of high-energy bremsstrahlung radiation from an ECR ion source. It was observed, for example, that reaching the steady-state phase of the plasma bremsstrahlung requires several hundred milliseconds, and that the steady-state time can differ between gases.

  14. Mitigation of Cognitive Bias with a Serious Game: Two Experiments Testing Feedback Timing and Source

    Science.gov (United States)

    Dunbar, Norah E.; Jensen, Matthew L.; Miller, Claude H.; Bessarabova, Elena; Lee, Yu-Hao; Wilson, Scott N.; Elizondo, Javier; Adame, Bradley J.; Valacich, Joseph; Straub, Sara; Burgoon, Judee K.; Lane, Brianna; Piercy, Cameron W.; Wilson, David; King, Shawn; Vincent, Cindy; Schuetzler, Ryan M.

    2017-01-01

    One of the benefits of using digital games for education is that games can provide feedback for learners to assess their situation and correct their mistakes. We conducted two studies to examine the effectiveness of different feedback design (timing, duration, repeats, and feedback source) in a serious game designed to teach learners about…

  15. Estimating the Seasonal Importance of Precipitation to Plant Source Water over Time and Space with Water Isotopes

    Science.gov (United States)

    Nelson, D. B.; Kahmen, A.

    2017-12-01

    The stable isotopic compositions of hydrogen and oxygen are physical properties of water molecules that can carry information on their sources or transport histories. This provides a useful tool for assessing the importance of rainfall at different times of the year for plant growth, provided that rainwater values vary over time and that waters do not partially evaporate after deposition. We tested the viability of this approach using data from samples collected at nineteen sites throughout Europe at monthly intervals over two consecutive growing seasons in 2014 and 2015. We compared isotope measurements of plant xylem water with soil water from multiple depths, and with measured and modeled precipitation isotope values. Paired analyses of oxygen and hydrogen isotope values were used to screen out a limited number of water samples that were influenced by evaporation, with the majority of all water samples indicating meteoric sources. The isotopic composition of soil and xylem waters varied over the course of an individual growing season, with many trending towards more enriched values, suggesting integration of the plant-relevant water pool at a timescale shorter than the annual mean. We then quantified how soil water residence times varied at each site by calculating the interval between measured xylem water and the most recent preceding match in modeled precipitation isotope values. Results suggest a generally increasing interval between rainfall and plant uptake throughout each year, with source water corresponding to dates in the spring, likely reflecting a combination of spring rain and mixing with winter and summer precipitation. The seasonally evolving spatial distribution of source water-precipitation lag values was then modeled as a function of location and climatology to develop continental-scale predictions. This spatial portrait of the average date for filling the plant source water pool provides insights on the seasonal importance of rainfall for plant
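
    The lag-matching step described above can be sketched as a backward search through a modeled precipitation isotope series for the most recent value matching a measured xylem sample. All δ18O values, the month grid, and the matching tolerance below are hypothetical:

```python
import numpy as np

# Hypothetical monthly modeled precipitation delta-18O values (per mil)
precip_d18o = np.array([-12.0, -11.0, -9.5, -8.0, -6.5, -5.0,
                        -4.5, -5.5, -7.0, -9.0, -10.5, -11.5])

def source_water_lag(sample_month, xylem_d18o, tol=0.5):
    """Lag (in months) to the most recent preceding precipitation month
    whose modeled d18O matches the xylem value within `tol`."""
    for m in range(sample_month, -1, -1):
        if abs(precip_d18o[m] - xylem_d18o) <= tol:
            return sample_month - m
    return None  # no match within the record

# A xylem sample taken in month 7 that isotopically resembles month-3 (spring) rain:
lag = source_water_lag(7, -8.2)
```

    A lag of several months, as here, would be read as the plant drawing on a water pool filled by spring precipitation.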

  16. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that can discern and characterize the different sources generating the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has some deficiencies. Indeed, PCA does not perform well at solving the so-called Blind Source Separation (BSS) problem, i.e., at recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
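
    vbICA itself is not sketched here; as a simpler stand-in for the blind source separation idea the abstract contrasts with PCA, the following numpy-only FastICA iteration (tanh contrast, deflation) unmixes two synthetic signals. The signals, mixing matrix, and iteration limits are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 8, n)
S = np.vstack([np.sin(2 * t),                 # smooth source
               np.sign(np.sin(3 * t))])       # square-wave source
A = np.array([[1.0, 0.5], [0.5, 1.0]])        # mixing matrix
X = A @ S                                     # observed "displacement" mixtures

# Whiten: zero mean, identity covariance
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Xw = (E.T @ X) / np.sqrt(d)[:, None]

# FastICA with a tanh contrast, one component at a time (deflation)
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        g = np.tanh(Xw.T @ w)
        w_new = (Xw * g).mean(axis=1) - (1 - g ** 2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)    # decorrelate from earlier rows
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < 1e-9
        w = w_new
        if converged:
            break
    W[i] = w

S_est = W @ Xw    # recovered sources, up to sign, scale, and order
```

    PCA applied to the same mixtures would only rotate them into uncorrelated axes; the independence criterion is what recovers the original waveforms.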

  17. New source terms and the implications for emergency planning requirements at nuclear power plants in the United State

    International Nuclear Information System (INIS)

    Kaiser, G.D.; Cheok, M.C.

    1987-01-01

    This paper begins with a brief review of current approaches to source-term-driven changes to NRC emergency planning requirements and addresses significant differences between them. Approaches by IDCOR and EPRI, industry submittals to the NRC, and alternative risk-based evaluations have been considered. Important issues are discussed, such as the role of Protective Action Guides in determining the radius of the emergency planning zone (EPZ). The significance of current trends in new source terms towards the prediction of longer warning times and longer durations of release is assessed. These trends may help to relax the current notification time requirements. Finally, the implications of apparent support in the regulations for a threshold in warning time, beyond which ad hoc protective measures are adequate, are discussed

  18. 75 FR 41790 - Address Management Services-Elimination of the Manual Card Option for Address Sequencing Services

    Science.gov (United States)

    2010-07-19

    .... The authority citation for 39 CFR Part 111 continues to read as follows: Authority: 5 U.S.C. 552(a... addresses (including rural address conversions to city-style addressing). For each 5-digit ZIP Code grouping... customer includes a rural-style address (RR/box number) in an address file submitted for sequencing, and a...

  19. Reach Address Database (RAD)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Reach Address Database (RAD) stores the reach address of each Water Program feature that has been linked to the underlying surface water features (streams,...

  20. Off-site source recovery project case study: disposal of high activity cobalt 60 sources at the Nevada test site 2008

    International Nuclear Information System (INIS)

    Cocina, Frank G.; Stewart, William C.; Wald-Hopkins, Mark; Hageman, John P.

    2009-01-01

    The Off-Site Source Recovery Project has been operating at Los Alamos National Laboratory since 1998 to address the U.S. Department of Energy responsibility for collection and management of orphaned or disused radioactive sealed sources which may represent a risk to public health and national security if not properly managed.

  1. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources

    Directory of Open Access Journals (Sweden)

    Xiang Gao

    2016-07-01

    Full Text Available This paper addresses the problem of mapping the odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To address both of these challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors, and merges sensor data gathered at different positions. First, a multi-sensor integration method, together with the path of airflow, is used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of an odor source location simulation and a real experiment are presented.
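
    The benefit of fusing concentration readings with wind direction can be illustrated with a toy plume model. This is not the authors' algorithm: the plume geometry, decay constants, grid, and detection threshold are all invented. Cross-wind position comes from a concentration-weighted centroid; along-wind position from the upwind edge of the detectable plume:

```python
import numpy as np

# Toy plume: source at (2, 3), wind blowing in +x; everything here is synthetic
src = np.array([2.0, 3.0])
xs, ys = np.meshgrid(np.arange(0, 6.5, 0.5), np.arange(0, 6.5, 0.5))
dx, dy = xs - src[0], ys - src[1]
conc = np.where(dx >= 0, np.exp(-dy ** 2 / 0.5) * np.exp(-dx / 2.0), 0.0)

# Cross-wind position: concentration-weighted centroid of all readings
y_est = (conc * ys).sum() / conc.sum()
# Along-wind position: upwind edge of sensors that detect the plume
detected = conc > 0.1 * conc.max()
x_est = xs[detected].min()
```

    Without the wind information, the centroid alone would place the source well downwind of its true position.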

  2. Quasielastic high-resolution time-of-flight spectrometers employing multi-disk chopper cascades for spallation sources

    International Nuclear Information System (INIS)

    Lechner, R.E.

    2001-01-01

    The design of multi-disk chopper time-of-flight (MTOF) spectrometers for high-resolution quasielastic and low-energy inelastic neutron scattering at spallation sources is discussed in some detail. A continuously variable energy resolution (1 μeV to 10 meV) and a large dynamic range (1 μeV to 100 meV) are outstanding features of this type of instrument, and are easily achieved at a pulsed source using state-of-the-art technology. The method of intensity-resolution optimization of MTOF spectrometers at spallation sources is treated on the basis of the requirement of using (almost) 'all the neutrons of the pulse', taking into account the constant but wavelength-dependent duration of the source pulse. It follows that the optimization procedure (which differs slightly from that employed in the steady-state source case) should give priority to the highest resolution whenever such a choice becomes necessary. This leads to long monochromator distances (L12) of the order of 50 m for achieving resolutions now available at reactor sources. A few examples of spectrometer layouts and corresponding design parameters for large-angle and small-angle quasielastic scattering instruments are given. In the latter case, higher energy resolution than for large-angle scattering is required and achieved. The use of phase-space transformers, neutron wavelength band-pass filters, and multichromatic operation for intensity-resolution optimization is discussed. This spectrometer can be designed to make full use of the pulsed-source peak flux. Therefore, and because of a number of improvements, high resolution will be available at high intensity: for any given resolution, the total intensity at the detectors, when placed at one of the planned new spallation sources (SNS, JSNS, ESS, AUSTRON), will be larger by at least three orders of magnitude than the total intensity of any of the presently existing instruments of this type in routine operation at steady

  3. A formal method for identifying distinct states of variability in time-varying sources: SGR A* as an example

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, L.; Witzel, G.; Ghez, A. M. [Department of Physics and Astronomy, University of California, Los Angeles, CA 90095-1547 (United States); Longstaff, F. A. [UCLA Anderson School of Management, University of California, Los Angeles, CA 90095-1481 (United States)

    2014-08-10

    Continuously time variable sources are often characterized by their power spectral density and flux distribution. These quantities can undergo dramatic changes over time if the underlying physical processes change. However, some changes can be subtle and not distinguishable using standard statistical approaches. Here, we report a methodology that aims to identify distinct but similar states of time variability. We apply this method to the Galactic supermassive black hole, where 2.2 μm flux is observed from a source associated with Sgr A* and where two distinct states have recently been suggested. Our approach is taken from mathematical finance and works with conditional flux density distributions that depend on the previous flux value. The discrete, unobserved (hidden) state variable is modeled as a stochastic process and the transition probabilities are inferred from the flux density time series. Using the most comprehensive data set to date, in which all Keck and a majority of the publicly available Very Large Telescope data have been merged, we show that Sgr A* is sufficiently described by a single intrinsic state. However, the observed flux densities exhibit two states: noise dominated and source dominated. Our methodology reported here will prove extremely useful to assess the effects of the putative gas cloud G2 that is on its way toward the black hole and might create a new state of variability.
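
    The core machinery, a discrete hidden state decoded from a flux-density series via transition probabilities, can be sketched with a fixed two-state Gaussian model and the Viterbi algorithm. The paper fits its model to data; here the means, widths, and transition matrix are all made up, and we merely decode a series simulated from the same model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-state flux model: 0 = noise dominated, 1 = source dominated
means, stds = np.array([1.0, 4.0]), np.array([0.3, 1.0])
A = np.array([[0.95, 0.05], [0.10, 0.90]])    # state transition probabilities
logA, logpi = np.log(A), np.log([0.5, 0.5])

# Simulate a flux-density time series from the hidden chain
T = 500
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.random() < A[states[t - 1], 1]
flux = rng.normal(means[states], stds[states])

def viterbi(x):
    """Most probable hidden-state path under the fixed two-state model."""
    logb = -0.5 * ((x[:, None] - means) / stds) ** 2 - np.log(stds)
    delta = logpi + logb[0]
    back = np.zeros((len(x), 2), dtype=int)
    for t in range(1, len(x)):
        cand = delta[:, None] + logA          # cand[i, j]: prev state i -> state j
        back[t] = cand.argmax(axis=0)
        delta = cand.max(axis=0) + logb[t]
    path = np.empty(len(x), dtype=int)
    path[-1] = delta.argmax()
    for t in range(len(x) - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

accuracy = (viterbi(flux) == states).mean()
```

    With well-separated emission distributions, as in this toy setup, the decoded path recovers the hidden states almost exactly; the paper's test is precisely whether one or two such intrinsic states are needed.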

  4. Source memory in the absence of successful cued recall.

    Science.gov (United States)

    Cook, Gabriel I; Marsh, Richard L; Hicks, Jason L

    2006-07-01

    Five experiments were conducted to address the question of whether source information can be accessed when an item itself cannot be recalled. The authors used a paired-associate learning paradigm in which cue-target word pairs were studied and target recall was requested in the presence of the cue. When target recall failed, participants were asked to make a source judgment of whether a man or a woman had spoken the unrecalled item. In 3 of the 5 experiments, source accuracy was at or very close to chance. By contrast, if cue-target pairs were studied multiple times, or participants knew in advance of learning that a predictive judgment would be required, then predictive source accuracy was well above chance. These data suggest that context information may not play a very large role in metacognitive judgments, such as feeling-of-knowing ratings or entering a tip-of-the-tongue state, without strong and specific encoding procedures. These same results also highlight the important role that item memory plays in retrieving information about the context in which an item was experienced. Copyright 2006 APA, all rights reserved.

  5. Radiochemistry methods in DOE Methods for Evaluating Environmental and Waste Management Samples: Addressing new challenges

    International Nuclear Information System (INIS)

    Fadeff, S.K.; Goheen, S.C.; Riley, R.G.

    1994-01-01

    Radiochemistry methods in Department of Energy Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) add to the repertoire of other standard methods in support of U.S. Department of Energy environmental restoration and waste management (DOE/EM) radiochemical characterization activities. Current standard sources of radiochemistry methods are not always applicable for evaluating DOE/EM samples. Examples of current sources include those provided by the US Environmental Protection Agency, the American Society for Testing and Materials, Standard Methods for the Examination of Water and Wastewater, and Environmental Measurements Laboratory Procedures Manual (HASL-300). The applicability of these methods is generally limited to specific matrices (usually water), low-level radioactive samples, and a limited number of analytes. DOE Methods complements these current standard methods by addressing the complexities of EM characterization needs. The process for determining DOE/EM radiochemistry characterization needs is discussed. In this context of DOE/EM needs, the applicability of other sources of standard radiochemistry methods is defined, and gaps in methodology are identified. Current methods in DOE Methods and the EM characterization needs they address are discussed. Sources of new methods and the methods incorporation process are discussed. The means for individuals to participate in (1) identification of DOE/EM needs, (2) the methods incorporation process, and (3) submission of new methods are identified

  6. Prediction of broadband ground-motion time histories: Hybrid low/high-frequency method with correlated random source parameters

    Science.gov (United States)

    Liu, P.; Archuleta, R.J.; Hartzell, S.H.

    2006-01-01

    We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method, we calculate low-frequency synthetics in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows source parameters to be specified independently of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can easily be modified to reflect our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and a generic velocity structure appropriate for the site's National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set. The bias and error found here for response spectral acceleration are similar to the best results that have been published by
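
    The crossover combination can be illustrated with two synthetic traces merged at 1 Hz. The sketch below uses a hard spectral split, a simplification of the paper's matched filtering, and the stand-in signals (two tones per trace, invented frequencies and amplitudes) are purely illustrative:

```python
import numpy as np

fs = 100.0                                    # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
# Stand-ins for the 3D (low-frequency) and 1D (high-frequency) synthetics
low3d  = np.sin(2 * np.pi * 0.3 * t) + 0.5 * np.sin(2 * np.pi * 5.0 * t)
high1d = 0.2 * np.sin(2 * np.pi * 0.3 * t) + np.sin(2 * np.pi * 5.0 * t)

def crossover_merge(low, high, fc=1.0):
    """Keep `low` below fc and `high` above fc (hard spectral split)."""
    freqs = np.fft.rfftfreq(len(low), 1 / fs)
    merged = np.where(freqs < fc, np.fft.rfft(low), np.fft.rfft(high))
    return np.fft.irfft(merged, n=len(low))

broadband = crossover_merge(low3d, high1d, fc=1.0)
amp = 2 * np.abs(np.fft.rfft(broadband)) / len(t)   # single-sided amplitude spectrum
```

    In the merged trace, the 0.3 Hz tone carries the low-frequency input's unit amplitude and the 5 Hz tone the high-frequency input's, confirming each band came from the intended calculation.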

  7. A Derivation of Source-based Kinetics Equation with Time Dependent Fission Kernel for Reactor Transient Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Woo, Myeong Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Pyeon, Cheol Ho [Kyoto University, Osaka (Japan)

    2015-10-15

    In this study, a new balance equation to overcome the problems generated by the previous methods is proposed using source-based balance equation. And then, a simple problem is analyzed with the proposed method. In this study, a source-based balance equation with the time dependent fission kernel was derived to simplify the kinetics equation. To analyze the partial variations of reactor characteristics, two representative methods were introduced in previous studies; (1) quasi-statics method and (2) multipoint technique. The main idea of quasistatics method is to use a low-order approximation for large integration times. To realize the quasi-statics method, first, time dependent flux is separated into the shape and amplitude functions, and shape function is calculated. It is noted that the method has a good accuracy; however, it can be expensive as a calculation cost aspect because the shape function should be fully recalculated to obtain accurate results. To improve the calculation efficiency, multipoint method was proposed. The multipoint method is based on the classic kinetics equation with using Green's function to analyze the flight probability from region r' to r. Those previous methods have been used to analyze the reactor kinetics analysis; however, the previous methods can have some limitations. First, three group variables (r{sub g}, E{sub g}, t{sub g}) should be considered to solve the time dependent balance equation. This leads a big limitation to apply large system problem with good accuracy. Second, the energy group neutrons should be used to analyze reactor kinetics problems. In time dependent problem, neutron energy distribution can be changed at different time. It can affect the change of the group cross section; therefore, it can lead the accuracy problem. Third, the neutrons in a space-time region continually affect the other space-time regions; however, it is not properly considered in the previous method. Using birth history of the

  8. Time reversal imaging, Inverse problems and Adjoint Tomography

    Science.gov (United States)

    Montagner, J.; Larmat, C. S.; Capdeville, Y.; Kawakatsu, H.; Fink, M.

    2010-12-01

    With the increasing power of computers and numerical techniques (such as spectral element methods), it is possible to address a new class of seismological problems. The propagation of seismic waves in heterogeneous media is simulated more and more accurately, and new applications are being developed, in particular time reversal methods and adjoint tomography in the three-dimensional Earth. Since the pioneering work of J. Claerbout, theorized by A. Tarantola, many similarities have been found between time-reversal methods, cross-correlation techniques, inverse problems, and adjoint tomography. By using normal mode theory, we generalize the scalar approach of Draeger and Fink (1999) and Lobkis and Weaver (2001) to the 3D elastic Earth, in order to understand the time-reversal method theoretically on the global scale. It is shown how to relate time-reversal methods, on the one hand, with auto-correlations of seismograms for source imaging and, on the other hand, with cross-correlations between receivers for structural imaging and retrieval of the Green function. Time-reversal methods have been successfully applied in the past to acoustic waves in many fields, such as medical imaging, underwater acoustics, and non-destructive testing, and to seismic waves in seismology for earthquake imaging. In the case of source imaging, time reversal techniques make possible automatic location in time and space, as well as retrieval of the focal mechanism, of earthquakes or unknown environmental sources. We present here some applications of these techniques at the global scale, on synthetic tests and on real data, such as the Sumatra-Andaman (Dec. 2004) and Haiti (Jan. 2010) earthquakes, as well as glacial earthquakes and seismic hum.
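
    The connection between cross-correlation and travel time can be seen in a toy two-receiver example: the lag of the correlation peak recovers the differential arrival time of a common transient. The pulse shape, delay, and noise level are all invented:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
# The same transient recorded at two receivers, the second delayed by 25 samples
pulse = np.exp(-0.5 * ((np.arange(n) - 400) / 10.0) ** 2)
rec1 = pulse + 0.05 * rng.normal(size=n)
rec2 = np.roll(pulse, 25) + 0.05 * rng.normal(size=n)

# The peak of the cross-correlation estimates the differential travel time
xcorr = np.correlate(rec2, rec1, mode="full")
lags = np.arange(-n + 1, n)
delay_est = lags[xcorr.argmax()]
```

    Stacking such correlations over long noise records is, in essence, how the Green function between two receivers is retrieved.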

  9. Assessment of In Situ Time Resolved Shock Experiments at Synchrotron Light Sources*

    Science.gov (United States)

    Belak, J.; Ilavsky, J.; Hessler, J. P.

    2005-07-01

    Prior to fielding in situ time-resolved experiments of shock wave loading at the Advanced Photon Source, we performed feasibility experiments assessing a single photon bunch. Using single- and poly-crystal Al, Ti, V, and Cu shocked to incipient spallation on the gas gun, samples were prepared from slices normal to the spall plane with thicknesses of 100-500 microns. In addition, single-crystal Al of thickness 500 microns was shocked to incipient spallation and soft-recovered using the LLNL e-gun mini-flyer system. The e-gun mini-flyer impacts the sample target, producing a flat-top shock transient of a few tens of ns. Here, we present results for imaging, small-angle scattering (SAS), and diffraction. In particular, there is little SAS away from the spall plane and significant SAS at the spall plane, demonstrating the presence of sub-micron voids. * Use of the Advanced Photon Source was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, under Contract No. W-31-109-Eng-38, and work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory, under Contract W-7405-Eng-48.

  10. THE STATISTICS OF RADIO ASTRONOMICAL POLARIMETRY: BRIGHT SOURCES AND HIGH TIME RESOLUTION

    International Nuclear Information System (INIS)

    Van Straten, W.

    2009-01-01

    A four-dimensional statistical description of electromagnetic radiation is developed and applied to the analysis of radio pulsar polarization. The new formalism provides an elementary statistical explanation of the modal-broadening phenomenon in single-pulse observations. It is also used to argue that the degree of polarization of giant pulses has been poorly defined in past studies. Single- and giant-pulse polarimetry typically involves sources with large flux-densities and observations with high time-resolution, factors that necessitate consideration of source-intrinsic noise and small-number statistics. Self-noise is shown to fully explain the excess polarization dispersion previously noted in single-pulse observations of bright pulsars, obviating the need for additional randomly polarized radiation. Rather, these observations are more simply interpreted as an incoherent sum of covariant, orthogonal, partially polarized modes. Based on this premise, the four-dimensional covariance matrix of the Stokes parameters may be used to derive mode-separated pulse profiles without any assumptions about the intrinsic degrees of mode polarization. Finally, utilizing the small-number statistics of the Stokes parameters, it is established that the degree of polarization of an unresolved pulse is fundamentally undefined; therefore, previous claims of highly polarized giant pulses are unsubstantiated.
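
    Two of the abstract's points can be made concrete in a few lines: the degree of polarization of a well-averaged signal is the usual Stokes ratio, while averaging only a handful of samples of completely unpolarized noise still yields a nonzero apparent polarization, the small-number bias behind the claim about giant pulses. All numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Degree of polarization from time-averaged Stokes parameters (I, Q, U, V)
I, Q, U, V = 1.0, 0.3, 0.4, 0.0
p = np.sqrt(Q ** 2 + U ** 2 + V ** 2) / I

# Small-number statistics: a few samples of unpolarized noise
# still give a strictly positive apparent degree of polarization
N = 4
q, u, v = rng.normal(0.0, 0.2, (3, N))
p_apparent = np.sqrt(q.mean() ** 2 + u.mean() ** 2 + v.mean() ** 2)
```

    The bias shrinks as N grows, which is why the degree of polarization of a single unresolved pulse (N of order one) is effectively undefined.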

  11. THE PROPER MOTIONS OF THE DOUBLE RADIO SOURCE n IN THE ORION BN/KL REGION

    International Nuclear Information System (INIS)

    Rodríguez, Luis F.; Loinard, Laurent; Zapata, Luis; Lizano, Susana; Dzib, Sergio A.; Menten, Karl M.; Gómez, Laura

    2017-01-01

    We have extended the time baseline for observations of the proper motions of radio sources in the Orion BN/KL region from 14.7 to 22.5 years. We present improved determinations for the sources BN and I. In addition, we address the proper motions of the double radio source n, which have been questioned in the literature. We confirm that all three sources are moving away at transverse velocities of tens of kilometers per second from a region in between them, where they were located about 500 years ago. Source n exhibits a new component that we interpret as due to a one-sided ejection of free–free emitting plasma that took place after 2006.36. We used the highly accurate relative proper motions between sources BN and I to determine that their closest separation took place in the year 1475 ± 6, when they were within ∼100 au or less from each other in the plane of the sky.

  12. THE PROPER MOTIONS OF THE DOUBLE RADIO SOURCE n IN THE ORION BN/KL REGION

    Energy Technology Data Exchange (ETDEWEB)

    Rodríguez, Luis F.; Loinard, Laurent; Zapata, Luis; Lizano, Susana [Instituto de Radioastronomía y Astrofísica, UNAM, Apdo. Postal 3-72 (Xangari), 58089 Morelia, Michoacán, México (Mexico); Dzib, Sergio A.; Menten, Karl M. [Max Planck Institut für Radioastronomie, Auf dem Hügel 69, D-53121 Bonn (Germany); Gómez, Laura, E-mail: l.rodriguez@crya.unam.mx [Joint ALMA Observatory, Alonso de Córdoba 3107, Vitacura, Santiago (Chile)

    2017-01-10

    We have extended the time baseline for observations of the proper motions of radio sources in the Orion BN/KL region from 14.7 to 22.5 years. We present improved determinations for the sources BN and I. In addition, we address the proper motions of the double radio source n, which have been questioned in the literature. We confirm that all three sources are moving away at transverse velocities of tens of kilometers per second from a region in between them, where they were located about 500 years ago. Source n exhibits a new component that we interpret as due to a one-sided ejection of free–free emitting plasma that took place after 2006.36. We used the highly accurate relative proper motions between sources BN and I to determine that their closest separation took place in the year 1475 ± 6, when they were within ∼100 au or less from each other in the plane of the sky.

  13. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    Segre, Daniel [Boston Univ., MA (United States)

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.
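
    The bookkeeping behind a spatially distributed simulation of this kind, nutrient diffusion on a lattice coupled to local growth, can be sketched in a few lines. This is not the COMETS code or its API: the lattice, rates, yield, and a Monod uptake term (standing in for the flux-balance solution COMETS actually computes in each box) are all invented:

```python
import numpy as np

# Toy 5x5 lattice with a biomass seed and a uniform nutrient field
biomass = np.zeros((5, 5)); biomass[2, 2] = 1.0
nutrient = np.full((5, 5), 10.0)
D, dt, vmax, km, Y = 0.1, 1.0, 0.5, 2.0, 0.3   # invented rates and yield

def step(b, s):
    """One explicit diffusion + growth update on a periodic lattice."""
    lap = (np.roll(s, 1, 0) + np.roll(s, -1, 0) +
           np.roll(s, 1, 1) + np.roll(s, -1, 1) - 4 * s)
    uptake = np.minimum(vmax * s / (km + s) * b * dt, s)  # cannot exceed local supply
    return b + Y * uptake, s + D * dt * lap - uptake

for _ in range(10):
    biomass, nutrient = step(biomass, nutrient)
```

    Over the ten steps, biomass grows at the seeded site while the nutrient field is drawn down and diffuses toward the colony, the qualitative behavior a spatial flux-balance simulation reproduces quantitatively.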

  14. What an open source clinical trial community can learn from hackers

    Science.gov (United States)

    Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico

    2014-01-01

    Summary Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248

  15. The Chandra Source Catalog: Source Variability

    Science.gov (United States)

    Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Evans, I.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to an assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
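The intra-observation test described above can be sketched in a few lines: a constant-rate source produces photon arrival times uniform over the exposure, so a Kolmogorov-Smirnov test against the uniform distribution flags variability. This is a minimal illustration on synthetic arrival times with an assumed exposure length, not CSC pipeline code.

```python
# KS test of photon arrival times against the flat light curve expected
# from a constant source.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
t_exp = 10_000.0  # exposure length in seconds (assumed)

# Constant source: arrival times uniform over the exposure.
t_const = rng.uniform(0.0, t_exp, size=500)
# Flaring source: half the events crowd into the first 10% of the exposure.
t_flare = np.concatenate([rng.uniform(0.0, 0.1 * t_exp, 250),
                          rng.uniform(0.0, t_exp, 250)])

def ks_pvalue(times, t_exp):
    """KS test of normalized arrival times against a flat light curve."""
    return stats.kstest(times / t_exp, "uniform").pvalue

p_const = ks_pvalue(t_const, t_exp)
p_flare = ks_pvalue(t_flare, t_exp)
assert p_flare < 1e-6 and p_flare < p_const  # flare detected, constant source not
```

The Kuiper variant differs only in the test statistic (it sums the maximum deviations above and below the model CDF), which makes it equally sensitive to departures anywhere in the exposure.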

  16. THE COMPACT, TIME-VARIABLE RADIO SOURCE PROJECTED INSIDE W3(OH): EVIDENCE FOR A PHOTOEVAPORATED DISK?

    Energy Technology Data Exchange (ETDEWEB)

    Dzib, Sergio A.; Rodriguez-Garza, Carolina B.; Rodriguez, Luis F.; Kurtz, Stan E.; Loinard, Laurent; Zapata, Luis A.; Lizano, Susana, E-mail: s.dzib@crya.unam.mx [Centro de Radioastronomia y Astrofisica, Universidad Nacional Autonoma de Mexico, Morelia 58089 (Mexico)

    2013-08-01

    We present new Karl G. Jansky Very Large Array (VLA) observations of the compact (≈0.″05), time-variable radio source projected near the center of the ultracompact H II region W3(OH). The analysis of our new data as well as of VLA archival observations confirms the variability of the source on timescales of years and, for a given epoch, indicates a spectral index of α = 1.3 ± 0.3 (S_ν ∝ ν^α). This spectral index and the brightness temperature of the source (≈6500 K) suggest that we are most likely detecting partially optically thick free-free radiation. The radio source is probably associated with the ionizing star of W3(OH), but an interpretation in terms of an ionized stellar wind fails because the detected flux densities are orders of magnitude larger than expected. We discuss several scenarios and tentatively propose that the radio emission could arise in a static ionized atmosphere around a fossil photoevaporated disk.
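For reference, a spectral index such as the α = 1.3 quoted above follows directly from two flux-density measurements at different frequencies; the flux values below are hypothetical, chosen only to reproduce that index.

```python
# alpha in S_nu ∝ nu^alpha from two (hypothetical) flux-density measurements
import math

def spectral_index(s1, nu1, s2, nu2):
    return math.log(s2 / s1) / math.log(nu2 / nu1)

# e.g. 1.0 mJy at 8.4 GHz rising to 2.46 mJy at 16.8 GHz
alpha = spectral_index(1.0, 8.4, 2.46, 16.8)
print(round(alpha, 1))  # → 1.3
```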

  17. Do pediatric gastroenterology doctors address pediatric obesity?

    OpenAIRE

    Batra, Suruchi; Yee, Caitlin; Diez, Bernadette; Nguyen, Nicholas; Sheridan, Michael J; Tufano, Mark; Sikka, Natalie; Townsend, Stacie; Hourigan, Suchitra

    2017-01-01

    Objectives: To assess how often obesity is acknowledged at pediatric gastroenterology outpatient visits. Methods: A retrospective chart review was performed to identify obese children seen at a gastroenterology subspecialty clinic over a 1-year period; 132 children were identified. Demographics, obesity comorbidities, reasons for referral, diagnosis of obesity, and a plan to address obesity were abstracted. Chi-square or Fisher's exact tests were used to examine statistical associations...

  18. On the Reliability of Source Time Functions Estimated Using Empirical Green's Function Methods

    Science.gov (United States)

    Gallegos, A. C.; Xie, J.; Suarez Salas, L.

    2017-12-01

    The Empirical Green's Function (EGF) method (Hartzell, 1978) has been widely used to extract source time functions (STFs). In this method, seismograms generated by collocated events with different magnitudes are deconvolved. Under the fundamental assumption that the STF of the small event is a delta function, the deconvolved Relative Source Time Function (RSTF) yields the large event's STF. While this assumption can be empirically justified by examining differences in event size and in the frequency content of the seismograms, it often lacks rigorous justification. In practice, the small event may have a finite duration, in which case the retrieved RSTF, interpreted as the large event's STF, carries a bias. In this study, we rigorously analyze this bias using synthetic waveforms generated by convolving a realistic Green's function waveform with pairs of finite-duration triangular or parabolic STFs. The RSTFs are found using a time-domain matrix deconvolution. We find that when the STFs of the smaller events are finite, the RSTFs are a series of narrow non-physical spikes. Interpreting these RSTFs as a series of high-frequency source radiations would be very misleading. The only reliable and unambiguous information we can retrieve from these RSTFs is the difference in durations and the moment ratio of the two STFs. We can apply Tikhonov smoothing to obtain a single-pulse RSTF, but its duration depends on the choice of weighting, which may be subjective. We then test the Multi-Channel Deconvolution (MCD) method (Plourde & Bostock, 2017), which assumes that both STFs have finite durations to be solved for. A concern about the MCD method is that the number of unknown parameters is larger, which would tend to make the problem rank-deficient. Because the kernel matrix depends on the STFs to be solved for under a positivity constraint, we can only estimate the rank deficiency with a semi-empirical approach. Based on the results so far, we find that the
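A time-domain matrix deconvolution of the kind mentioned above can be sketched as a Toeplitz least-squares problem with optional Tikhonov damping. This is an assumed, minimal formulation (our construction, not the authors' code), demonstrated on a synthetic pair where the small-event STF really is a delta function.

```python
# Deconvolve a "large-event" record by a "small-event" record:
# big ≈ conv(small, rstf), posed as damped normal equations.
import numpy as np
from scipy.linalg import toeplitz

def deconvolve(big, small, n_rstf, damp=0.0):
    """Solve big ≈ conv(small, rstf) for rstf of length n_rstf."""
    col = np.concatenate([small, np.zeros(len(big) - len(small))])
    G = toeplitz(col, np.zeros(n_rstf))          # convolution matrix
    A = G.T @ G + damp * np.eye(n_rstf)          # Tikhonov-damped normal equations
    return np.linalg.solve(A, G.T @ big)

# Synthetic check: triangular STF, delta-like small-event STF
stf = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
small = np.array([1.0])
big = np.convolve(small, stf)
rstf = deconvolve(big, small, len(stf))
print(np.allclose(rstf, stf))  # → True
```

When `small` has a finite duration instead of being a spike, the same solve returns the biased, spiky RSTF the abstract warns about; raising `damp` smooths it at the cost of the weighting choice being subjective.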

  19. Kurtosis based blind source extraction of complex noncircular signals with application in EEG artifact removal in real-time

    Directory of Open Access Journals (Sweden)

    Soroush eJavidi

    2011-10-01

    A new class of complex domain blind source extraction (BSE) algorithms suitable for the extraction of both circular and noncircular complex signals is proposed. This is achieved through sequential extraction based on the degree of kurtosis and in the presence of noncircular measurement noise. The existence and uniqueness analysis of the solution is followed by a study of fast-converging variants of the algorithm. The performance is first assessed through simulations on well-understood benchmark signals, followed by a case study on real-time artifact removal from EEG signals, verified using both qualitative and quantitative metrics. The results illustrate the power of the proposed approach in real-time blind extraction of general complex-valued sources.

  20. Source apportionment of size and time resolved trace elements and organic aerosols from an urban courtyard site in Switzerland

    Science.gov (United States)

    Richard, A.; Gianini, M. F. D.; Mohr, C.; Furger, M.; Bukowiecki, N.; Minguillón, M. C.; Lienemann, P.; Flechsig, U.; Appel, K.; Decarlo, P. F.; Heringa, M. F.; Chirico, R.; Baltensperger, U.; Prévôt, A. S. H.

    2011-09-01

    Time- and size-resolved data of trace elements were obtained from measurements with a rotating drum impactor (RDI) and subsequent X-ray fluorescence spectrometry. Trace elements can act as indicators for the identification of sources of particulate matter at an urban courtyard site in Switzerland. Eight different sources were identified for the three examined size ranges (PM1-0.1, PM2.5-1 and PM10-2.5): secondary sulfate, wood combustion, fireworks, road traffic, mineral dust, de-icing salt, industrial and local anthropogenic activities. The major component was secondary sulfate for the smallest size range; the road traffic factor was found in all three size ranges. This trace element analysis is complemented with data from an Aerodyne high-resolution time-of-flight aerosol mass spectrometer (AMS), assessing the PM1 fraction of organic aerosols. A separate PMF analysis revealed three factors related to three of the sources found with the RDI: oxygenated organic aerosol (OOA, related to inorganic secondary sulfate), hydrocarbon-like organic aerosol (HOA, related to road traffic) and biomass burning organic aerosol (BBOA), explaining 60 %, 22 % and 17 % of total measured organics, respectively. Since different compounds are used for the source classification, a higher percentage of the ambient PM10 mass concentration can be apportioned to sources by the combination of both methods.
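The factor-analysis step behind such source apportionment can be illustrated with a bare-bones non-negative factorization: the species-by-time data matrix is split into source profiles and time-varying contributions. Real PMF additionally weights each data point by its measurement uncertainty, which this sketch (our simplification) omits.

```python
# X (time x species) ≈ G (time x sources) @ F (sources x species), all non-negative.
# Plain multiplicative-update NMF stands in for uncertainty-weighted PMF.
import numpy as np

def nmf(X, n_sources, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    G = rng.random((X.shape[0], n_sources))
    F = rng.random((n_sources, X.shape[1]))
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)   # update profiles
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)   # update contributions
    return G, F

# Two synthetic "source profiles" mixed with random time-varying contributions
rng = np.random.default_rng(1)
F_true = np.array([[5.0, 1.0, 0.0],
                   [0.0, 1.0, 4.0]])
G_true = rng.random((50, 2))
X = G_true @ F_true
G, F = nmf(X, 2)
rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

On this exact rank-2 data the reconstruction error drops to a small fraction of the signal; on real data the number of factors is chosen by inspecting residuals and the physical plausibility of the profiles.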

  1. Speaker Introductions at Internal Medicine Grand Rounds: Forms of Address Reveal Gender Bias.

    Science.gov (United States)

    Files, Julia A; Mayer, Anita P; Ko, Marcia G; Friedrich, Patricia; Jenkins, Marjorie; Bryan, Michael J; Vegunta, Suneela; Wittich, Christopher M; Lyle, Melissa A; Melikian, Ryan; Duston, Trevor; Chang, Yu-Hui H; Hayes, Sharonne N

    2017-05-01

    Gender bias has been identified as one of the drivers of gender disparity in academic medicine. Bias may be reinforced by gender-subordinating language or differential use of formality in forms of address. Professional titles may influence the perceived expertise and authority of the referenced individual. The objective of this study is to examine how professional titles were used in same- and mixed-gender speaker introductions at Internal Medicine Grand Rounds (IMGR). A retrospective observational study of video-archived speaker introductions at consecutive IMGR was conducted at two different locations (Arizona, Minnesota) of an academic medical center. Introducers and speakers at IMGR were physician and scientist peers holding MD, PhD, or MD/PhD degrees. The primary outcome was whether or not a speaker's professional title was used in the first form of address during speaker introductions at IMGR. As a secondary outcome, we evaluated whether or not the speaker's professional title was used in any form of address during the introduction. Three hundred twenty-one forms of address were analyzed. Female introducers were more likely than male introducers to use professional titles when introducing any speaker in the first form of address (96.2% [102/106] vs. 65.6% [141/215]). Female dyads utilized a formal title in the first form of address 97.8% (45/46) of the time, compared with 72.4% (110/152) for male dyads (p = 0.007). In mixed-gender dyads where the introducer was female and the speaker male, formal titles were used 95.0% (57/60) of the time, whereas male introducers of female speakers utilized professional titles 49.2% (31/63) of the time. Women introduced by men were thus less likely to be addressed by professional title than were men introduced by men. Differential formality in speaker introductions may amplify the isolation, marginalization, and professional discomfiture expressed by women faculty in academic medicine.

  2. Impacts of Reverberation Time, Absorption Location and Background Noise on Listening Conditions in Multi Source Environment

    DEFF Research Database (Denmark)

    Saher, Konca; Rindel, Jens Holger; Nijs, Lau

    2005-01-01

    The speech transmission index (STI) needs to be improved. The impact of the reverberation time (RT), the distribution of the absorptive materials and the introduction of a screen on STI are discussed briefly. However, these objective parameters have to be assessed through subjective judgement. Auralizations of the multi source...

  3. Time-Dependent Searches for Point Sources of Neutrinos with the 40-String and 22-String Configurations of IceCube

    Science.gov (United States)

    Abbasi, R.; Abdou, Y.; Abu-Zayyad, T.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Andeen, K.; Auffenberg, J.; Bai, X.; Baker, M.; hide

    2012-01-01

    This paper presents four searches for flaring sources of neutrinos using the IceCube neutrino telescope. For the first time, a search is performed over the entire parameter space of energy, direction, and time, with sensitivity to neutrino flares lasting between 20 μs and a year from astrophysical sources. Searches that integrate over time are less sensitive to flares because they are affected by a larger background of atmospheric neutrinos and muons, which can be reduced by the use of additional timing information. Flaring sources considered here, such as active galactic nuclei, soft gamma-ray repeaters, and gamma-ray bursts, are promising candidate neutrino emitters. Two searches are "untriggered" in the sense that they look for any possible flare in the entire sky and from a predefined catalog of sources from which photon flares have been recorded. The other two searches are triggered by multi-wavelength information on flares from blazars and from a soft gamma-ray repeater. One triggered search uses light curves from Fermi-LAT, which provides continuous monitoring. A second triggered search uses information where the flux states have been measured only for short periods of time near the flares. The untriggered searches use data taken by 40 strings of IceCube between 2008 April 5 and 2009 May 20. The triggered searches also use data taken by the 22-string configuration of IceCube operating between 2007 May 31 and 2008 April 5. The results from all four searches are compatible with a fluctuation of the background.

  4. Adequately Addressing Pediatric Obesity: Challenges Faced by Primary Care Providers.

    Science.gov (United States)

    Shreve, Marilou; Scott, Allison; Vowell Johnson, Kelly

    2017-07-01

    To assess the challenges primary care providers encounter when providing counseling for pediatric patients identified as obese. A survey assessed the current challenges and barriers to the screening and treatment of pediatric obesity for providers in northwest Arkansas who provide care to families. The survey consisted of 15 Likert scale questions and 4 open-ended questions. Time, resources, comfort, and cultural issues were reported by providers as the biggest barriers in screening and the treatment of pediatric obesity. All providers reported lack of time as a barrier to providing the care needed for obese children. Cultural barriers of both the provider and client were identified as factors, which negatively affect the care and treatment of obese children. Primary care providers continue to experience challenges when addressing pediatric obesity. In this study, a lack of adequate time to address obesity was identified as the most significant current barrier and may likely be tied to physician resources. Although reimbursement for obesity is increasing, the level of reimbursement does not support the time or the resources needed to treat patients. Many providers reported their patients' cultural view of obesity influenced how they counsel their patients. Increasing providers' knowledge concerning differences in how weight is viewed or valued may assist them in the assessment and care of obese pediatric patients. The challenges identified in previous research continue to limit providers when addressing obesity. Although progress has been made regarding knowledge of guidelines, continuing effort is needed to tackle the remaining challenges. This will allow for earlier identification and intervention, resulting in improved outcomes in pediatric obesity.

  5. Modeling and Design of High-Efficiency Single-Photon Sources

    DEFF Research Database (Denmark)

    Gregersen, Niels; Nielsen, Per Kær; Mørk, Jesper

    2013-01-01

    Solid-state sources capable of emitting single photons on demand are of great interest in quantum information applications. Ideally, such a source should emit exactly one photon into the collection optics per trigger, the emitted photons should be indistinguishable, and the source should be electrically driven. Several design strategies addressing these requirements have been proposed. In the cavity-based source, light emission is controlled using resonant cavity quantum electrodynamics effects, whereas in the waveguide-based source, broadband electric field screening effects are employed...

  6. Crowd-sourcing with Uncertain Quality

    DEFF Research Database (Denmark)

    Papakonstantinou, Athanasios; Bogetoft, Peter

    This article addresses two important issues in crowd-sourcing: ex ante uncertainty about the quality and cost of different workers and strategic behaviour. We present a novel multi-dimensional auction that incentivises the workers to make partial enquiry into the task and to honestly report quality...

  7. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    Science.gov (United States)

    2016-06-01

    The Defense Advanced Research Projects Agency (DARPA), Air Force Research Laboratory (AFRL), Charles River Analytics Inc., and TopCoder, Inc. will be holding a contest to reward... (Crowd Sourced Formal Verification - Augmentation (CSFV-A), Charles River Analytics, Inc., final technical report, June 2016; approved for public release.)

  8. The Earthquake Source Inversion Validation (SIV) - Project: Summary, Status, Outlook

    Science.gov (United States)

    Mai, P. M.

    2017-12-01

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, this kinematic source inversion is ill-posed and returns non-unique solutions, as seen for instance in multiple source models for the same earthquake, obtained by different research teams, that often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversions and to understand strengths and weaknesses of various methods, the Source Inversion Validation (SIV) project developed a set of forward-modeling exercises and inversion benchmarks. Several research teams then use these validation exercises to test their codes and methods, but also to develop and benchmark new approaches. In this presentation I will summarize the SIV strategy, the existing benchmark exercises and corresponding results. Using various waveform-misfit criteria and newly developed statistical comparison tools to quantify source-model (dis)similarities, the SIV platform is able to rank solutions and identify particularly promising source inversion approaches. Existing SIV exercises (with related data and descriptions) and all computational tools remain available via the open online collaboration platform; additional exercises and benchmark tests will be uploaded once they are fully developed. I encourage source modelers to use the SIV benchmarks for developing and testing new methods. The SIV efforts have already led to several promising new techniques for tackling the earthquake-source imaging problem. I expect that future SIV benchmarks will provide further innovations and insights into earthquake source kinematics that will ultimately help to better understand the dynamics of the rupture process.
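As one example of a waveform-misfit criterion of the kind used to rank solutions (a generic choice on our part, not a specific SIV metric), an energy-normalized L2 misfit between observed and synthetic waveforms:

```python
# Normalized L2 waveform misfit: 0 for a perfect fit, 1 when the synthetic
# carries none of the observed energy.
import numpy as np

def l2_misfit(obs, syn):
    return np.linalg.norm(obs - syn) ** 2 / np.linalg.norm(obs) ** 2

t = np.linspace(0.0, 1.0, 200)
obs = np.sin(2 * np.pi * 5 * t)          # "observed" waveform
syn = 0.9 * obs                          # synthetic with a 10% amplitude error
print(round(l2_misfit(obs, syn), 3))     # → 0.01
```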

  9. RF H-minus ion source development in China spallation neutron source

    Science.gov (United States)

    Chen, W.; Ouyang, H.; Xiao, Y.; Liu, S.; Lü, Y.; Cao, X.; Huang, T.; Xue, K.

    2017-08-01

    The China Spallation Neutron Source (CSNS) phase-I project currently uses a Penning surface-plasma H- ion source, which has a lifetime of several weeks with occasional sparks between the high-voltage electrodes. To extend the lifetime of the ion source and to prepare for CSNS phase-II, we are developing an RF negative hydrogen ion source with an external antenna. The configuration of the source is similar to the DESY external-antenna ion source and the SNS ion source; however, several changes are made to improve the stability and the lifetime. Firstly, Si3N4 ceramic, with high thermal shock resistance and high thermal conductivity, is used for the plasma chamber, which can endure an average power of 2000 W. Secondly, the water-cooled antenna is brazed onto the chamber to improve the energy efficiency. Thirdly, cesium is injected directly into the plasma chamber if necessary, to simplify the design of the converter and the extraction. The area of stainless steel exposed to the plasma is minimized to reduce sputtering and degassing; instead, Mo, Ta, and Pt-coated materials face the plasma, which makes self-cleaning of the source possible.

  10. Digital intelligence sources transporter

    International Nuclear Information System (INIS)

    Zhang Zhen; Wang Renbo

    2011-01-01

    Starting from particle-ray counting, infrared data communication, real-time monitoring and alarming, and GPRS communication, this work realizes the digital management of radioactive sources and completes real-time monitoring of all aspects of their storage, transport, and use by building an intelligent radioactive-source transporter, thereby achieving reliable security supervision of radioactive sources. (authors)

  11. Three-Dimensional Passive-Source Reverse-Time Migration of Converted Waves: The Method

    Science.gov (United States)

    Li, Jiahang; Shen, Yang; Zhang, Wei

    2018-02-01

    At seismic discontinuities in the crust and mantle, part of the compressional wave energy converts to shear wave, and vice versa. These converted waves have been widely used in receiver function (RF) studies to image discontinuity structures in the Earth. While generally successful, the conventional RF method has its limitations and is suited mostly to flat or gently dipping structures. Among the efforts to overcome the limitations of the conventional RF method is the development of the wave-theory-based, passive-source reverse-time migration (PS-RTM) for imaging complex seismic discontinuities and scatterers. To date, PS-RTM has been implemented only in 2D in Cartesian coordinates for local problems and thus has limited applicability. In this paper, we introduce a 3D PS-RTM approach in spherical coordinates, which is better suited for regional and global problems. New computational procedures are developed to reduce artifacts and enhance migrated images, including back-propagating the main arrival and the coda containing the converted waves separately, using a modified Helmholtz decomposition operator to separate the P and S modes in the back-propagated wavefields, and applying an imaging condition that maintains a consistent polarity for a given velocity contrast. Our new approach allows us to use migration velocity models with realistic velocity discontinuities, improving the accuracy of the migrated images. We present several synthetic experiments to demonstrate the method, using regional and teleseismic sources. The results show that both regional and teleseismic sources can illuminate complex structures and this method is well suited for imaging dipping interfaces and sharp lateral changes in discontinuity structures.
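The Helmholtz-type mode separation mentioned above rests on the fact that the P part of a displacement field is curl-free and the S part divergence-free. A minimal 2-D finite-difference illustration (ours, not the paper's operator): for a purely P-type field built as the gradient of a scalar potential, the discrete curl vanishes while the divergence carries the signal.

```python
# Divergence picks out the P (curl-free) part; curl picks out the S part.
import numpy as np

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n)
X, Z = np.meshgrid(x, x, indexing="ij")
dx = x[1] - x[0]

# Purely P-type displacement: the gradient of a scalar potential
phi = np.sin(X) * np.cos(Z)
ux, uz = np.gradient(phi, dx, edge_order=2)

div_u = (np.gradient(ux, dx, axis=0, edge_order=2)
         + np.gradient(uz, dx, axis=1, edge_order=2))
curl_u = (np.gradient(uz, dx, axis=0, edge_order=2)
          - np.gradient(ux, dx, axis=1, edge_order=2))
# div_u is O(1); curl_u is zero up to discretization error at the edges
```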

  12. Development of Real-Time PCR to Monitor Groundwater Contaminated by Fecal Sources and Leachate from the Carcass

    Science.gov (United States)

    Park, S.; Kim, H.; Kim, M.; Lee, Y.; Han, J.

    2011-12-01

    The 2010 outbreak of foot-and-mouth disease (FMD) in South Korea led to about 4,054 burial sites to dispose of the carcasses. Potential environmental impacts of carcass leachate on groundwater have been an issue and still need to be studied. We therefore sought to develop a robust and sensitive tool to immediately determine groundwater contamination by leachate from carcass burial. For tracking both an agricultural fecal contamination source and the leachate in groundwater, competitive real-time PCR and PCR methods were developed using various PCR primer sets designed to detect the E. coli uidA gene and mtDNA (cytochrome B, cytB) of animal species such as ovine, porcine, caprine, and bovine. The designed methods were applied to track the animal species in livestock wastewater and carcass leachate under appropriate PCR or real-time PCR conditions. As a result, the mtDNA primer sets for individual (cow or pig) and multiple (cow and pig) amplification, and the E. coli uidA primers for fecal-source amplification, were specific and sensitive to the target genes. To determine the contamination source, the concentrations of amplified mtDNA and uidA were competitively quantified in livestock wastewater, carcass leachate, and groundwater. The highest concentrations of mtDNA and uidA were found in carcass leachate and livestock wastewater, respectively. Groundwater samples possibly contaminated by carcass leachate were analyzed by this assay, and it was able to identify the contamination source.
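The quantification step in real-time PCR rests on simple arithmetic: a standard curve relates the threshold cycle (Ct) to the log of starting copies, so earlier threshold crossings mean exponentially more target. The slope and intercept below are assumed values for illustration (a slope of about -3.32 corresponds to ~100% amplification efficiency), not the authors' calibration.

```python
# Read target abundance off a qPCR standard curve: Ct = m*log10(copies) + b
import math

m, b = -3.32, 38.0   # assumed slope and intercept of the standard curve

def copies_from_ct(ct):
    return 10 ** ((ct - b) / m)

# A sample crossing threshold 6.64 cycles earlier holds ~100x more target
ratio = copies_from_ct(25.0) / copies_from_ct(31.64)
print(round(ratio))  # → 100
```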

  13. Address Points - COUNTY_ADDRESS_POINTS_IDHS_IN: Address Points Maintained by County Agencies in Indiana (Indiana Department of Homeland Security, Point feature class)

    Data.gov (United States)

    NSGIC State | GIS Inventory — COUNTY_ADDRESS_POINTS_IDHS_IN is an ESRI Geodatabase point feature class that contains address points maintained by county agencies in Indiana, provided by personnel...

  14. Implications on 1 + 1 D Tsunami Runup Modeling due to Time Features of the Earthquake Source

    Science.gov (United States)

    Fuentes, M.; Riquelme, S.; Ruiz, J.; Campos, J.

    2018-04-01

    The time characteristics of the seismic source are usually neglected in tsunami modeling, due to the difference in the time scales of the two processes. Nonetheless, only a few analytical studies have attempted to explain separately the roles of the rise time and the rupture velocity. In this work, we extend an analytical 1 + 1 D solution for the shoreline motion time series from the static case to the kinematic case, by including both rise time and rupture velocity. Our results show that the static case corresponds to a limit of null rise time and infinite rupture velocity. Both parameters contribute to shifting the arrival time, but the maximum runup may be affected by very slow ruptures and long rise times. Parametric analysis reveals that runup is strictly decreasing with the rise time, while it is highly amplified in a certain range of slow rupture velocities. For even lower rupture velocities the tsunami excitation vanishes, while for larger, quicker ruptures the solution approaches the instantaneous case.

  16. Digital timing: sampling frequency, anti-aliasing filter and signal interpolation filter dependence on timing resolution

    International Nuclear Information System (INIS)

    Cho, Sanghee; Grazioso, Ron; Zhang Nan; Aykac, Mehmet; Schmand, Matthias

    2011-01-01

    The main focus of our study is to investigate how the performance of digital timing methods is affected by sampling rate, anti-aliasing and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions such as: what will be the minimum sampling frequencies? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, due to the aliasing effect, artifacts are produced in the timing resolution estimates; the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is marginally over the Nyquist rate, proper signal interpolation is important. A sharp roll-off (higher order) filter is required to separate the baseband signal from its replicates to avoid aliasing, but in return the computational cost will be higher. We demonstrated the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed that there is no significant timing resolution degradation down to a 1.3 GHz sampling frequency, and the computation requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool to check for constant timing-resolution behavior of a given timing pick-off method regardless of source location. Lastly, the performance comparison for several digital timing methods is also shown.
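The role of signal interpolation above the Nyquist rate can be sketched with band-limited (FFT/sinc) resampling: a pulse sampled just above its Nyquist rate still carries sub-sample timing information, which interpolation recovers. The sampling rate and pulse shape below are assumed for illustration, not the LSO detector waveforms of the study.

```python
# Sub-sample timing pick via band-limited interpolation of a sampled pulse.
import numpy as np
from scipy.signal import resample

fs = 4.0e9                          # 4 GS/s digitizer (assumed)
n = 64
t = np.arange(n) / fs
t_true = 15.3 / fs                  # true pulse peak, between sample points
pulse = np.exp(-0.5 * ((t - t_true) / (2.0 / fs)) ** 2)

up = 16                             # FFT/sinc interpolation factor
fine, t_fine = resample(pulse, n * up, t=t)

err_coarse = abs(t[np.argmax(pulse)] - t_true)    # pick from raw samples
err_fine = abs(t_fine[np.argmax(fine)] - t_true)  # pick from interpolated samples
print(err_fine < err_coarse)  # → True: interpolation refines the timing pick
```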

  17. Time delay estimation in a reverberant environment by low rate sampling of impulsive acoustic sources

    KAUST Repository

    Omer, Muhammad

    2012-07-01

    This paper presents a new method of time delay estimation (TDE) using low sample rates of an impulsive acoustic source in a room environment. The proposed method finds the time delay from the room impulse response (RIR) which makes it robust against room reverberations. The RIR is considered a sparse phenomenon and a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) is utilized for its estimation from the low rate sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR and their difference yields the desired time delay. Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. The performance of the proposed technique is demonstrated by numerical simulations and experimental results. © 2012 IEEE.
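For context, the baseline the proposed RIR-based approach is designed to outperform under reverberation is the classic cross-correlation time delay estimate between two microphone signals; a minimal sketch (ours, with an assumed sampling rate and a noiseless delayed copy):

```python
# Classic cross-correlation TDE: the lag maximizing the cross-correlation
# between the two microphone signals estimates the inter-microphone delay.
import numpy as np

rng = np.random.default_rng(0)
fs = 8000                        # sampling rate in Hz (assumed)
delay = 25                       # true delay in samples
s = rng.standard_normal(2048)    # impulsive/broadband source signal
x1 = s
x2 = np.concatenate([np.zeros(delay), s])[: len(s)]  # delayed copy at mic 2

corr = np.correlate(x2, x1, mode="full")
lag = np.argmax(corr) - (len(x1) - 1)
print(int(lag))  # → 25
```

In a reverberant room the correlation peak is smeared by multipath, which is precisely the regime where estimating the sparse RIR first, as the paper proposes, pays off.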

  18. Event generators for address event representation transmitters

    Science.gov (United States)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuitry (let's call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. There have been two main approaches published in the literature for implementing such "AER Generator" circuits. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event out of the chip, the rest of the neurons in the array were
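The core AER idea, that the bus carries a stream of neuron addresses whose occupancy is proportional to each neuron's activity, can be shown with a toy model (ours, not a circuit simulation): each bus slot is won by a neuron with probability proportional to its firing rate.

```python
# Toy AER event stream: busier neurons occupy more bus slots.
import numpy as np

rng = np.random.default_rng(0)
rates = np.array([1.0, 5.0, 20.0])   # firing rates of three neurons (assumed)
n_events = 5000

# Each bus slot carries the address of a neuron, chosen with prob ∝ its rate
stream = rng.choice(len(rates), size=n_events, p=rates / rates.sum())
counts = np.bincount(stream, minlength=len(rates))
print(int(counts.argmax()), int(counts.argmin()))  # → 2 0
```

The arbiter discussed in the article decides who wins a slot when two neurons fire simultaneously; in this toy model collisions are abstracted away into the random draw.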

  19. Managing the risks of legacy radioactive sources from a security perspective

    International Nuclear Information System (INIS)

    Alexander, Mark; Murray, Allan

    2008-01-01

    The safety and security risk posed by highly radioactive, long-lived sources at the end of their normal use has not been consistently well-managed in previous decades. The Brazilian Cs-137 accident in 1986 and the Thailand Co-60 accident in 2000 are prime examples of the consequences that ensue from the loss of control of highly dangerous sources after their normal use. With the new international emphasis on security of radioactive sources throughout their life cycle, there is now further incentive to address the management of risks posed by legacy, highly dangerous radioactive sources. The ANSTO South-East Asia Regional Security of Radioactive Sources (RSRS) Project has identified, and is addressing, a number of legacy situations that have arisen as a result of inadequate management practices in the past. Specific examples are provided of these legacy situations and the lessons learned for managing the consequent safety and security risk, and for future complete life-cycle management of highly radioactive sources. (author)

  20. Weak unique continuation property and a related inverse source problem for time-fractional diffusion-advection equations

    Science.gov (United States)

    Jiang, Daijun; Li, Zhiyuan; Liu, Yikan; Yamamoto, Masahiro

    2017-05-01

    In this paper, we first establish a weak unique continuation property for time-fractional diffusion-advection equations. The proof is mainly based on the Laplace transform and the unique continuation properties for elliptic and parabolic equations. The result is weaker than its parabolic counterpart in the sense that we additionally impose the homogeneous boundary condition. As a direct application, we prove the uniqueness for an inverse problem on determining the spatial component in the source term by interior measurements. Numerically, we reformulate our inverse source problem as an optimization problem, and propose an iterative thresholding algorithm. Finally, several numerical experiments are presented to show the accuracy and efficiency of the algorithm.

  1. Developing an Open Source Option for NASA Software

    Science.gov (United States)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular, we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one, the Mozilla license, for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  2. Addressing Climate Change and the Role of Technological Solutions

    Directory of Open Access Journals (Sweden)

    Stephen Axon

    2010-05-01

    Full Text Available As far as sustainability is concerned, the role of technology has always been contested. With regard to environmental degradation, technology is perceived either as part of the problem or as part of the solution. To combat the complex issues of the present time, technological solutions are expected to play a key role in mitigating and adapting to the negative impacts of climate change. The paper also discusses the role of the 2009 Copenhagen Conference in addressing climate change. Although the Copenhagen Accord is not a legally binding agreement, it is seen as a necessary first step towards a protocol that will effectively address the issue of climate change.

  3. Livermore Accelerator Source for Radionuclide Science (LASRS)

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Scott [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bleuel, Darren [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Johnson, Micah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rusnak, Brian [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Soltz, Ron [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Tonchev, Anton [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-05-05

    The Livermore Accelerator Source for Radionuclide Science (LASRS) will generate intense photon and neutron beams to address important gaps in the study of radionuclide science that directly impact Stockpile Stewardship, Nuclear Forensics, and Nuclear Material Detection. The co-location of MeV-scale neutron and photon sources with radiochemical analytics provides a unique facility to meet current and future challenges in nuclear security and nuclear science.

  4. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
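The double-difference idea can be sketched numerically. The helper names below are invented for illustration, and the real device operates on spectral-band or time-shifted data streams rather than toy arrays; the point is that the two delta stages are exactly invertible by cumulative sums.

```python
import numpy as np

def precode(band1, band2):
    """Double-difference pre-coding: take the cross-delta between two
    correlated data sets, then the adjacent-delta of that result.
    The first cross-delta sample is kept as side information so the
    transform is invertible."""
    cross = band2 - band1                      # cross-delta between sources
    double_diff = np.diff(cross)               # adjacent-delta of the cross-delta
    return band1, cross[0], double_diff

def postdecode(band1, cross0, double_diff):
    """Inverse post-decoding: cumulative sums undo the two delta stages."""
    cross = np.concatenate(([cross0], cross0 + np.cumsum(double_diff)))
    return band1 + cross

# Two correlated "bands": the double-difference values are small and
# clustered near zero, which is what makes entropy coding effective.
b1 = np.array([10, 12, 15, 15, 14], dtype=np.int64)
b2 = np.array([11, 14, 18, 17, 15], dtype=np.int64)
band1, c0, dd = precode(b1, b2)
b2_rec = postdecode(band1, c0, dd)
```

Lossless recovery of the second data set from the double-difference set mirrors the patent's distortionless (entropy-coded) path.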

  5. The safe use of radiation sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    As a means of promoting safety in the use of radiation sources, as well as encouraging consistency in regulatory control, the IAEA has from time to time organized training courses, with the co-operation of Member State governments and organizations, to inform individuals from developing countries with appropriate responsibilities about the provisions for the safe use and regulation of radiation sources. Three such courses on the safe use of radiation sources have been held both in the USA, with the co-operation of the United States Government, and in Dublin, Ireland, with the co-operation of the Irish Government. The Training Course on the Safe Use and Regulation of Radiation Sources has been successfully given to over 77 participants from over 30 countries in recent years. The course is aimed at providing a basis of radiation protection knowledge in all aspects of the uses of radiation and of the radiation sources in use today. It is the intention of this course to provide a systematic enhancement of radioisotope safety in countries with developing radiological programmes through a core group of national authorities. The IAEA's training programmes provide an excellent opportunity for direct contact with lecturers who have extensive experience in resolving issues faced by developing countries and in providing guidance documents useful in addressing their problems. This document draws on this collective experience and provides valuable technical information regarding the safety aspects of the use not only of sealed and unsealed radiation sources but also of machines that produce ionizing radiation. The first of these training courses, 'Safety and Regulation of Unsealed Sources', was held in Dublin, Ireland, June through July 1989, with the co-operation of the Nuclear Energy Board and Trinity College. This was an interregional training course; the participants came from all over the world.
The second and third interregional courses, 'Safety and Regulation

  6. The safe use of radiation sources

    International Nuclear Information System (INIS)

    1995-01-01

    As a means of promoting safety in the use of radiation sources, as well as encouraging consistency in regulatory control, the IAEA has from time to time organized training courses, with the co-operation of Member State governments and organizations, to inform individuals from developing countries with appropriate responsibilities about the provisions for the safe use and regulation of radiation sources. Three such courses on the safe use of radiation sources have been held both in the USA, with the co-operation of the United States Government, and in Dublin, Ireland, with the co-operation of the Irish Government. The Training Course on the Safe Use and Regulation of Radiation Sources has been successfully given to over 77 participants from over 30 countries in recent years. The course is aimed at providing a basis of radiation protection knowledge in all aspects of the uses of radiation and of the radiation sources in use today. It is the intention of this course to provide a systematic enhancement of radioisotope safety in countries with developing radiological programmes through a core group of national authorities. The IAEA's training programmes provide an excellent opportunity for direct contact with lecturers who have extensive experience in resolving issues faced by developing countries and in providing guidance documents useful in addressing their problems. This document draws on this collective experience and provides valuable technical information regarding the safety aspects of the use not only of sealed and unsealed radiation sources but also of machines that produce ionizing radiation. The first of these training courses, 'Safety and Regulation of Unsealed Sources', was held in Dublin, Ireland, June through July 1989, with the co-operation of the Nuclear Energy Board and Trinity College. This was an interregional training course; the participants came from all over the world.
The second and third interregional courses, 'Safety and Regulation

  7. A Novel Smart Meter Controlling System with Dynamic IP Addresses

    DEFF Research Database (Denmark)

    Manembu, Pinrolinvic; Welang, Brammy; Kalua Lapu, Aditya

    2017-01-01

    Smart meters are electronic devices for measuring energy consumption in real time. Usually, static public IP addresses are allocated to realize point-to-point (P2P) communication and remote control for smart metering systems. This, however, restricts the wide deployment of smart meters, due to the deficiency of public IP resources. This paper proposes a novel subscription-based communication architecture for the support of dynamic IP addresses and group controlling of smart meters. The paper evaluates the proposed architecture by comparing the traditional P2P architecture...
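The record's abstract is truncated, but the core idea of a subscription-based architecture can be sketched with a minimal in-process publish/subscribe broker. All names here are hypothetical, and a real deployment would use a network broker (e.g. an MQTT server); the point is that both the meter and the back end only make outbound connections to the broker, so a meter behind a dynamic IP or NAT never needs a reachable static address.

```python
from collections import defaultdict

class Broker:
    """Minimal publish/subscribe broker: messages are routed by topic,
    not by the IP address of the endpoint."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for cb in self.subscribers[topic]:
            cb(message)

broker = Broker()
log = []
broker.subscribe("meters/42/reading", log.append)       # utility back end
broker.subscribe("meters/42/cmd", log.append)           # the meter itself

broker.publish("meters/42/reading", {"kwh": 3.7})       # meter reports usage
broker.publish("meters/42/cmd", "disconnect")           # remote/group control
```

Group control falls out naturally: many meters subscribing to one shared command topic receive the same message.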

  8. Addressing Student Burnout: What Medical Schools Can Learn From Business Schools.

    Science.gov (United States)

    Pathipati, Akhilesh S; Cassel, Christine K

    2018-03-13

    Although they enter school with enthusiasm for a career in medicine, medical students in the United States subsequently report high levels of burnout and disillusionment. As medical school leaders consider how to address this problem, they can look to business schools as one source of inspiration. In this Commentary, the authors argue, based on their collective experience in both medical and business education, that medical schools can draw three lessons from business schools that can help reinvigorate students. First, medical schools should offer more opportunities and dedicated time for creative work. Engaging with diverse challenges promotes intellectual curiosity and can help students maintain perspective. Second, schools should provide more explicit training in resiliency and the management of stressful situations. Many business programs include formal training in how to cope with conflict and how to make high-stakes decisions, whereas medical students are typically expected to learn those skills on the job. Finally, medical schools should provide better guidance on practical career considerations like income, lifestyle, and financial skills. Whether in medicine or business, students benefit from open discussions about their personal and professional goals. Medical schools must ensure students have an outlet for those conversations.

  9. Port virtual addressing for PC

    International Nuclear Information System (INIS)

    Bolanos, L.; Arista, E.; Osorio Deliz, J.F.

    1997-01-01

    Instruments for nuclear signal measurement based on add-on cards for a personal computer (PC) are often designed. One then faces the problem of addressing the data input/output devices, whose level of integration or intelligence makes the use of several port addresses indispensable, while port addresses in the PC are limited. Virtual addressing offers the advantage of occupying few addresses to access many such devices. The principles of this technique and its application in a radiometric card based on programmed logic are discussed in this paper.
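Virtual addressing of this kind is commonly realized as an index/data register pair. The following software model (the class name, register count, and method names are assumptions for illustration, not the radiometric card's actual design) shows how two physical port addresses can reach many internal registers.

```python
class VirtualPortDevice:
    """Sketch of port virtual addressing: the card occupies only two
    physical port addresses, an index register and a data register, yet
    many internal registers are reachable by first writing the desired
    virtual address to the index port."""
    def __init__(self, n_registers=64):
        self.index = 0
        self.regs = [0] * n_registers

    def write_index_port(self, value):      # physical port BASE+0
        self.index = value % len(self.regs)

    def write_data_port(self, value):       # physical port BASE+1
        self.regs[self.index] = value

    def read_data_port(self):               # physical port BASE+1
        return self.regs[self.index]

dev = VirtualPortDevice()
dev.write_index_port(17)    # select internal register 17
dev.write_data_port(0xA5)   # write it through the single data port
dev.write_index_port(17)
value = dev.read_data_port()
```

Only two addresses in the PC's I/O space are consumed, regardless of how many internal registers the card implements.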

  10. International conference in Stockholm to address protection of nuclear material and radioactive sources from illicit trafficking

    International Nuclear Information System (INIS)

    2001-01-01

    The Conference will look at approaches for enhancing security of material in general as well as protecting facilities against terrorism and sabotage. More specifically it will address measures for interception and response to illicit trafficking and discuss the practices and measures currently being used to minimize the possibilities of the unauthorized removal and movement of nuclear materials and critical equipment. It will also consider the importance of closer co-operation with law enforcement authorities and intelligence agencies and the necessity of applying new technologies to this effort

  11. EUV sources for the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Damen, Marcel; Derra, Günther; Franken, Oliver; Janssen, Maurice; Jonkers, Jeroen; Klein, Jürgen; Kraus, Helmar; Krücken, Thomas; List, Andreas; Loeken, Micheal; Mader, Arnaud; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prümmer, Ralph; Rosier, Oliver; Schwabe, Stefan; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2006-03-01

    In this paper, we report on the recent progress of the Philips Extreme UV source. The Philips source concept is based on a discharge plasma ignited in a Sn vapor plume that is ablated by a laser pulse. Using rotating electrodes covered with a regenerating tin surface, the problems of electrode erosion and power scaling are fundamentally solved. Most of the work of the past year has been dedicated to developing a lamp system that operates very reliably and stably under full scanner remote control. Topics addressed were the development of the scanner interface, a dose control system, thermo-mechanical design, positional stability of the source, tin handling, and many more. The resulting EUV source, the Philips NovaTin(R) source, can operate at more than 10 kW electrical input power and delivers 200 W in-band EUV into 2π continuously. The source is very small, so nearly 100% of the EUV radiation can be collected within etendue limits. The lamp system is fully automated and can operate unattended under full scanner remote control. 500 million shots of continuous operation without interruption have been realized, and electrode lifetime is at least 2 billion shots. Three sources are currently being prepared; two of them will be integrated into the first EUV Alpha Demonstration tools of ASML. The debris problem was reduced to a level which is well acceptable for scanner operation. First, a considerable reduction of the Sn emission of the source has been realized. The debris mitigation system is based on a two-step concept using a foil-trap-based stage and a chemical cleaning stage. Both steps were improved considerably. A collector lifetime of 1 billion shots is achieved, after which a cleaning would be applied. The cleaning step has been verified to work with tolerable Sn residues. From the experimental results, a total collector lifetime of more than 10 billion shots can be expected.

  12. Inverse source problems for eddy current equations

    International Nuclear Information System (INIS)

    Rodríguez, Ana Alonso; Valli, Alberto; Camaño, Jessika

    2012-01-01

    We study the inverse source problem for the eddy current approximation of Maxwell equations. As for the full system of Maxwell equations, we show that a volume current source cannot be uniquely identified by knowledge of the tangential components of the electromagnetic fields on the boundary, and we characterize the space of non-radiating sources. On the other hand, we prove that the inverse source problem has a unique solution if the source is supported on the boundary of a subdomain or if it is the sum of a finite number of dipoles. We address the applicability of this result for the localization of brain activity from electroencephalography and magnetoencephalography measurements. (paper)

  13. Space-time dependence between energy sources and climate related energy production

    Science.gov (United States)

    Engeland, Kolbjorn; Borga, Marco; Creutin, Jean-Dominique; Ramos, Maria-Helena; Tøfte, Lena; Warland, Geir

    2014-05-01

    The European Renewable Energy Directive adopted in 2009 focuses on achieving a 20% share of renewable energy in the EU overall energy mix by 2020. A major part of renewable energy production is related to climate, called "climate related energy" (CRE) production. CRE production systems (wind, solar, and hydropower) are characterized by a large degree of intermittency and variability on both short and long time scales due to the natural variability of climate variables. The main strategies for handling the variability of CRE production include energy storage, transport, diversity and information (smart grids). The first three strategies aim to smooth out the intermittency and variability of CRE production in time and space, whereas the last aims to provide a more optimal interaction between energy production and demand, i.e. to smooth out the residual load (the difference between demand and production). In order to increase the CRE share in the electricity system, it is essential to understand the space-time co-variability between the weather variables and CRE production under both current and future climates. This study presents a review of the literature that seeks to tackle these problems. It reveals that the majority of studies deal with either a single CRE source or with the combination of two CREs, mostly wind and solar. This may be due to the fact that the most advanced countries in terms of wind equipment also have very little hydropower potential (Denmark, Ireland or the UK, for instance). Hydropower is characterized by both a large storage capacity and flexibility in electricity production, and therefore has a large potential for both balancing and storing energy from wind and solar power. Several studies look at how to better connect regions with a large share of hydropower (e.g., Scandinavia and the Alps) to regions with high shares of wind and solar power (e.g., the green battery North Sea net). 
Considering time scales, various studies consider wind

  14. Triple GEM gas detectors as real time fast neutron beam monitors for spallation neutron sources

    International Nuclear Information System (INIS)

    Murtas, F; Claps, G; Croci, G; Tardocchi, M; Pietropaolo, A; Cippo, E Perelli; Rebai, M; Gorini, G; Frost, C D; Raspino, D; Rhodes, N J; Schooneveld, E M

    2012-01-01

    A fast neutron beam monitor based on a triple Gas Electron Multiplier (GEM) detector was developed and tested for the ISIS spallation neutron source in U.K. The test on beam was performed at the VESUVIO beam line operating at ISIS. The 2D fast neutron beam footprint was recorded in real time with a spatial resolution of a few millimeters thanks to the patterned detector readout.

  15. An Adjoint Sensitivity Method Applied to Time Reverse Imaging of Tsunami Source for the 2009 Samoa Earthquake

    Science.gov (United States)

    Hossen, M. Jakir; Gusman, Aditya; Satake, Kenji; Cummins, Phil R.

    2018-01-01

    We have previously developed a tsunami source inversion method based on "Time Reverse Imaging" and demonstrated that it is computationally very efficient and can reproduce the tsunami source model with good accuracy using tsunami data of the 2011 Tohoku earthquake tsunami. In this paper, we applied this approach to the 2009 Samoa earthquake tsunami, which was triggered by a doublet earthquake consisting of both normal and thrust faulting. Our result showed that the method is quite capable of recovering the source model associated with normal and thrust faulting. We found that the inversion result is highly sensitive to some stations, which must be removed from the inversion. We applied an adjoint sensitivity method to find the optimal set of stations in order to estimate a realistic source model. We found that the inversion result improves significantly once the optimal set of stations is used. In addition, from the reconstructed source model we estimated the slip distribution of the fault, from which we successfully determined the dipping orientation of the fault plane for the normal fault earthquake. Our result suggests that the fault plane dips toward the northeast.

  16. Is prophetic discourse adequate to address global economic justice?

    Directory of Open Access Journals (Sweden)

    Piet J. Naudé

    2011-04-01

    Full Text Available This article outlined key features of prophetic discourse and investigated whether this form of moral discourse adequately addresses issues of economic injustice. It is shown that the strength of prophetic discourse is its ability to denounce instances of injustice whilst at the same time announcing a God-willed alternative future. The ‘preferential option for the poor’ in Latin American liberation theologies is treated as a case study of the influence of prophetic discourse in contexts of perceived economic injustice. Also the core weaknesses of prophetic discourse are investigated, specifically its incomplete moral argument, weak moral analyses, silence on transition measures, and its inability to take a positive stance on reforms in the system from which itself benefits. In the final section it is concluded that prophetic discourse plays an indispensable role in addressing issues of global economic justice, but – taken by itself – it is not an adequate form of moral discourse to address concrete matters of justice.

  17. A flexible analog memory address list manager for PHENIX

    International Nuclear Information System (INIS)

    Ericson, M.N.; Musrock, M.S.; Britton, C.L. Jr.; Walker, J.W.; Wintenberg, A.L.; Young, G.R.; Allen, M.D.

    1996-01-01

    A programmable analog memory address list manager has been developed for use with all analog memory-based detector subsystems of PHENIX. The unit provides simultaneous read/write control, cell write-over protection for both the Level-1 trigger decision delay and the digitization latency, and re-ordering of AMU addresses following conversion, at a beam crossing interval of 105 ns. Addresses are handled such that up to 5 Level-1 (LVL-1) events can be maintained in the AMU without write-over. Data tagging is implemented for handling overlapping and shared beam-event data packets. Full usage in all PHENIX analog memory-based detector subsystems is accomplished through detector-specific programmable parameters: the number of data samples per valid LVL-1 trigger and the sample spacing. Architectural candidates for the system are discussed with emphasis on implementation implications. Details of the design are presented, including application specifics, timing information, and test results from a full implementation using field programmable gate arrays (FPGAs).
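A purely illustrative software model of such an address list manager might look as follows. The FIFO policy, sizes, and method names are assumptions made for this sketch, not the PHENIX FPGA design; the point is the bookkeeping: free cell addresses circulate for writing, while cells belonging to a pending Level-1 event are protected from write-over until read out, with a bounded number of pending events.

```python
from collections import deque

class AddressListManager:
    """Toy AMU address list manager: free cell addresses circulate in a
    FIFO; cells claimed by an accepted Level-1 event are removed from
    the free list (write-over protection) until they are read out."""
    def __init__(self, n_cells=64, max_events=5):
        self.free = deque(range(n_cells))
        self.pending = deque()
        self.max_events = max_events

    def write_sample(self):
        addr = self.free.popleft()       # next writable cell
        self.free.append(addr)           # recycled unless an event claims it
        return addr

    def accept_event(self, addrs):
        if len(self.pending) >= self.max_events:
            return False                 # would exceed the Level-1 event depth
        for a in addrs:
            self.free.remove(a)          # protect these cells from write-over
        self.pending.append(list(addrs))
        return True

    def read_out(self):
        addrs = self.pending.popleft()   # oldest event converted first
        self.free.extend(addrs)          # release cells after conversion
        return addrs
```

A hardware list manager does the same accounting with dual-port memories and pointers rather than Python containers.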

  18. Recent innovation in microbial source tracking using bacterial real-time PCR markers in shellfish

    International Nuclear Information System (INIS)

    Mauffret, A.; Mieszkin, S.; Morizur, M.; Alfiansah, Y.; Lozach, S.; Gourmelon, M.

    2013-01-01

    Highlights: ► DNA extraction from intravalvular liquid is promising for microbial source tracking in oysters. ► Host-associated bacterial markers in shellfish digestive tissues were difficult to assess with real-time PCR. ► DNA extracts from shellfish flesh appeared to have low inhibitor levels but low marker levels. ► Protocol transfer from one shellfish species to another does not appear possible. -- Abstract: We assessed the capacity of real-time PCR markers to identify the origin of contamination in shellfish. Oyster, cockles or clams were either contaminated with fecal materials and host-associated markers designed from Bacteroidales or Catellicoccus marimammalium 16S RNA genes were extracted from their intravalvular liquid, digestive tissues or shellfish flesh. Extraction of bacterial DNA from the oyster intravalvular liquid with FastDNA spin kit for soil enabled the selected markers to be quantified in 100% of artificially contaminated samples, and the source of contamination to be identified in 13 out of 38 naturally contaminated batches from European Class B and Class C areas. However, this protocol did not enable the origin of the contamination to be identified in cockle or clam samples. Although results are promising for extracts from intravalvular liquid in oyster, it is unlikely that a single protocol could be the best across all bacterial markers and types of shellfish

  19. Transformed composite sequences for improved qubit addressing

    Science.gov (United States)

    Merrill, J. True; Doret, S. Charles; Vittorini, Grahame; Addison, J. P.; Brown, Kenneth R.

    2014-10-01

    Selective laser addressing of a single atom or atomic ion qubit can be improved using narrow-band composite pulse sequences. We describe a Lie-algebraic technique to generalize known narrow-band sequences and introduce sequences related by dilation and rotation of sequence generators. Our method improves known narrow-band sequences by decreasing both the pulse time and the residual error. Finally, we experimentally demonstrate these composite sequences using 40Ca+ ions trapped in a surface-electrode ion trap.

  20. Initial time-resolved particle beam profile measurements at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Yang, B.X.; Lumpkin, A.H.

    1995-01-01

    The commissioning of the 7-GeV Advanced Photon Source (APS) storage ring began in early 1995. Characterization of the stored particle beam properties involved time-resolved transverse and longitudinal profile measurements using optical synchrotron radiation (OSR) monitors. Early results include the observation of the beam on a single turn, measurements of the transverse beam sizes after damping using a 100 μs integration time (σx ∼ 150 ± 25 μm, σy ∼ 65 ± 25 μm, depending on vertical coupling), and measurement of the bunch length (στ ∼ 25 to 55 ps, depending on the charge per bunch). The results are consistent with specifications and predictions based on the 8.2 nm-rad natural emittance, the calculated lattice parameters, and vertical coupling less than 10%. The novel single-element focusing mirror for the photon transport line and the dual-sweep streak camera techniques, which allow turn-by-turn measurements, will also be presented. The latter measurements are believed to be the first of their kind on a storage ring in the USA.

  1. www.kernenergie.de - nuclear power has a German Internet address

    International Nuclear Information System (INIS)

    Anon.

    2000-01-01

    www.kernenergie.de is the address on the worldwide web under which the German nuclear organizations, Deutsches Atomforum (DAtF), Informationskreis Kernenergie (IK), and Kerntechnische Gesellschaft (KTG) as well as atw - internationale Zeitschrift fuer Kernenergie, and INFORUM can be reached. Extensive sources of information, discussions, on-line dictionaries, computer codes, dynamic web pages, digital documents and multimedia offerings can be called up via the portal under the individual web sites. In this way, www.kernenergie.de provides a comprehensive and up-to-date background of information about nuclear power and adjacent topics in the digital worldwide web. (orig.) [de

  2. State Legislation to Address Childhood Obesity. Program Results Brief

    Science.gov (United States)

    Fiester, Leila

    2012-01-01

    An estimated 12.5 million American children and teens are obese. Over time, the diseases and disabilities associated with obesity may undermine this population's health and result in substantial social and economic costs. Policies that address children's nutrition and physical activity are an important tool in reversing the obesity epidemic. More…

  3. Study sponsorship and the nutrition research agenda: analysis of randomized controlled trials included in systematic reviews of nutrition interventions to address obesity.

    Science.gov (United States)

    Fabbri, Alice; Chartres, Nicholas; Scrinis, Gyorgy; Bero, Lisa A

    2017-05-01

    To categorize the research topics covered by a sample of randomized controlled trials (RCT) included in systematic reviews of nutrition interventions to address obesity; to describe their funding sources; and to explore the association between funding sources and nutrition research topics. Cross-sectional study. RCT included in Cochrane Reviews of nutrition interventions to address obesity and/or overweight. Two hundred and thirteen RCT from seventeen Cochrane Reviews were included. Funding source and authors' conflicts of interest were disclosed in 82·6 and 29·6 % of the studies, respectively. RCT were more likely to test an intervention to manipulate nutrients in the context of reduced energy intake (44·2 % of studies) than food-level (11·3 %) and dietary pattern-level (0·9 %) interventions. Most of the food industry-sponsored studies focused on interventions involving manipulations of specific nutrients (66·7 %). Only 33·1 % of the industry-funded studies addressed dietary behaviours compared with 66·9 % of the non-industry-funded ones (P=0·002). The level of food processing was poorly considered across all funding sources. The predominance of RCT examining nutrient-specific questions could limit the public health relevance of rigorous evidence available for systematic reviews and dietary guidelines.

  4. Shifting nitrous oxide source/sink behaviour in a subtropical estuary revealed by automated time series observations

    Science.gov (United States)

    Reading, Michael J.; Santos, Isaac R.; Maher, Damien T.; Jeffrey, Luke C.; Tait, Douglas R.

    2017-07-01

    The oceans are a major source of the potent greenhouse gas nitrous oxide (N2O) to the atmosphere. However, little information is available on how estuaries and the coastal ocean may contribute to N2O budgets, and on the drivers of N2O in aquatic environments. This study utilised five time series stations along the freshwater to marine continuum in a sub-tropical estuary in Australia (Coffs Creek). Each time series station captured N2O, radon (222Rn, a natural submarine groundwater discharge tracer), dissolved nitrogen, and dissolved organic carbon (DOC) concentrations for a minimum of 25 h. The use of automated time series observations enabled spatial and tidal-scale variability of N2O to be captured. Groundwater was highly enriched in N2O (up to 306 nM) compared to the receiving surface water. Dissolved N2O supersaturation as high as 386% (27.4 nM) was observed in the upstream freshwater and brackish water areas, which represented only a small (∼13%) proportion of the total estuary area. A large area of N2O undersaturation (as low as 53% or 3.9 nM) was observed in the mangrove-dominated lower estuary. This undersaturated area likely resulted from N2O consumption due to nitrate/nitrite (NOx) limitation in mangrove sediments subject to shallow porewater exchange. Overall, the estuary was a minor source of N2O to the atmosphere, as the N2O sink in the lower, mangrove-dominated estuary counteracted the groundwater-dominated N2O source in the upper estuary. Average area-weighted N2O fluxes at the water-air interface approached zero (0.2-0.7 μmol m-2 d-1, depending on the piston velocity model used), and were much lower than those of nitrogen-rich Northern Hemisphere estuaries that are considered large sources of N2O to the atmosphere. This study revealed a temporally and spatially diverse estuary, with areas of N2O production and consumption related to oxygen and total dissolved nitrogen availability, submarine groundwater discharge, and uptake within mangroves.
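The source/sink bookkeeping behind these statements follows the standard air-water gas exchange relation F = k (Cw − Ceq). In the sketch below, the equilibrium concentration of ~7.1 nM is inferred from the reported 386% ↔ 27.4 nM pair, and the piston velocity value is hypothetical; since 1 nM = 1 μmol m⁻³, a k in m d⁻¹ gives fluxes directly in μmol m⁻² d⁻¹.

```python
def n2o_flux(c_water_nM, c_eq_nM, k_m_per_day):
    """Air-water N2O flux F = k * (Cw - Ceq): positive when the water is
    supersaturated (a source to the atmosphere), negative when it is
    undersaturated (a sink). 1 nM = 1 umol/m^3, so k [m/d] times the
    concentration difference [umol/m^3] yields umol m^-2 d^-1."""
    return k_m_per_day * (c_water_nM - c_eq_nM)

C_EQ = 7.1   # nM, inferred from 27.4 nM at 386% saturation (illustrative)
K = 1.0      # m/d, hypothetical piston velocity

upstream = n2o_flux(27.4, C_EQ, K)   # supersaturated upper estuary: source
mangrove = n2o_flux(3.9, C_EQ, K)    # undersaturated mangrove zone: sink
```

Area-weighting such opposing fluxes over the estuary explains how the whole-system flux can approach zero even with strong local sources and sinks.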

  5. MAP-Based Underdetermined Blind Source Separation of Convolutive Mixtures by Hierarchical Clustering and ℓ1-Norm Minimization

    Directory of Open Access Journals (Sweden)

    Kellermann Walter

    2007-01-01

    Full Text Available We address the problem of underdetermined blind source separation (BSS). While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
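The first step (estimating the mixing matrix by clustering sparse mixture samples) can be sketched as follows. This toy uses a real-valued instantaneous mixture with artificially disjoint source activity instead of the paper's complex time-frequency data, and SciPy's generic hierarchical clustering in place of the authors' specific scheme; the mixing matrix and sparsity pattern are made-up values.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
n_src, n_samp = 3, 600

# Disjoint activity: each sample carries one active source, an idealization
# of the sparsity assumed in the time-frequency domain
S = np.zeros((n_src, n_samp))
active = rng.integers(0, n_src, n_samp)
S[active, np.arange(n_samp)] = rng.laplace(size=n_samp)

A = np.array([[1.0, 0.6, -0.5],
              [0.2, 1.0,  0.8]])
A /= np.linalg.norm(A, axis=0)           # unit-norm mixing columns
X = A @ S                                # 2 mixtures of 3 sources (underdetermined)

# Discard low-energy points and project the rest onto the unit half-circle
keep = np.linalg.norm(X, axis=0) > 0.1
V = X[:, keep] / np.linalg.norm(X[:, keep], axis=0)
V = V * np.sign(V[0])                    # resolve the sign ambiguity

# Hierarchical clustering of the normalized mixture vectors; each cluster
# centroid estimates one column of the mixing matrix
labels = fcluster(linkage(V.T, method='average'), t=n_src, criterion='maxclust')
A_hat = np.stack([V[:, labels == k].mean(axis=1) for k in range(1, n_src + 1)],
                 axis=1)
A_hat /= np.linalg.norm(A_hat, axis=0)
```

With perfectly disjoint activity every normalized sample lies exactly on a mixing column, so the clusters recover the columns up to permutation and sign; the paper's real contribution is making this robust for complex-valued, only approximately sparse data.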

  6. Radiation Sources Working Group Summary Report

    International Nuclear Information System (INIS)

    Fazio, Michael V.

    1999-01-01

    The Radiation Sources Working Group addressed advanced concepts for the generation of RF energy to power advanced accelerators. The focus of the working group included advanced sources and technologies above 17 GHz. The topics discussed included RF sources above 17 GHz, pulse compression techniques to achieve extreme peak power levels, component technology, technology limitations and physical limits, and other advanced concepts. RF sources included gyroklystrons, magnicons, free-electron masers, two-beam accelerators, and gyroharmonic and traveling wave devices. Technology components discussed included advanced cathodes and electron guns, high temperature superconductors for producing magnetic fields, RF breakdown physics and mitigation, and phenomena that impact source design such as fatigue in resonant structures due to pulsed RF heating. New approaches for RF source diagnostics located internal to the source were discussed for detecting plasma and beam phenomena existing in high energy density electrodynamic systems in order to help elucidate the reasons for performance limitations.

  7. Radiation Sources Working Group Summary Report

    International Nuclear Information System (INIS)

    Fazio, M.V.

    1999-01-01

    The Radiation Sources Working Group addressed advanced concepts for the generation of RF energy to power advanced accelerators. The focus of the working group included advanced sources and technologies above 17 GHz. The topics discussed included RF sources above 17 GHz, pulse compression techniques to achieve extreme peak power levels, component technology, technology limitations and physical limits, and other advanced concepts. RF sources included gyroklystrons, magnicons, free-electron masers, two-beam accelerators, and gyroharmonic and traveling wave devices. Technology components discussed included advanced cathodes and electron guns, high temperature superconductors for producing magnetic fields, RF breakdown physics and mitigation, and phenomena that impact source design such as fatigue in resonant structures due to pulsed RF heating. New approaches for RF source diagnostics located internal to the source were discussed for detecting plasma and beam phenomena existing in high energy density electrodynamic systems in order to help elucidate the reasons for performance limitations. Copyright 1999 American Institute of Physics.

  8. Thermal modeling of multi-shape heating sources on n-layer electronic board

    Directory of Open Access Journals (Sweden)

    Monier-Vinard Eric

    2017-01-01

    Full Text Available The present work completes the toolbox of analytical solutions for resolving the steady-state temperatures of a multi-layered structure heated by one or many heat sources. The problem of heating sources with non-rectangular shapes is addressed to extend the capability of analytical approaches. Moreover, the heating sources can be located on the external surfaces of the sandwiched layers as well as embedded at the interfaces of its constitutive layers. To demonstrate its relevance, the updated analytical solution is compared with numerical simulations for a multi-layered electronic board subjected to a set of heating source configurations. The comparison shows close agreement between analytical and numerical predictions of the centroid and average temperatures. The approach establishes a kit of practical, easy-to-implement expressions that can be combined, using the superposition principle, to help electronic designers detect early in the design process component or board temperatures that exceed manufacturer limits. This allows bad design candidates to be eliminated with minimal set-up, relevant assumptions and low computation time.
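The superposition principle invoked above can be illustrated with a deliberately simple stand-in kernel: the classical steady-state point source on a semi-infinite medium, not the paper's multi-layer solution. All positions, powers, and the conductivity below are made-up values.

```python
import numpy as np

def point_source_rise(Q, k, src, pt):
    """Temperature rise (K) at pt due to a point source of power Q (W)
    on a semi-infinite medium of conductivity k (W/m/K): dT = Q/(2*pi*k*r)."""
    r = np.linalg.norm(np.asarray(pt) - np.asarray(src))
    return Q / (2.0 * np.pi * k * r)

k = 150.0                                  # silicon-like conductivity (assumed)
sources = [((0.0, 0.0, 0.0), 2.0),         # (position in m, power in W)
           ((0.002, 0.001, 0.0), 1.5)]
probe = (0.001, 0.0005, 0.0005)            # point where temperature is wanted

# Superposition: the rise from all sources is the sum of single-source rises,
# which is exactly how the paper's per-source expressions are accumulated
total = sum(point_source_rise(Q, k, pos, probe) for pos, Q in sources)
```

Because the heat equation is linear, each source configuration only ever needs the single-source kernel; the paper's contribution is the kernel itself for multi-layer boards and non-rectangular sources.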

  9. Null stream analysis of Pulsar Timing Array data: localisation of resolvable gravitational wave sources

    Science.gov (United States)

    Goldstein, Janna; Veitch, John; Sesana, Alberto; Vecchio, Alberto

    2018-04-01

    Super-massive black hole binaries are expected to produce a gravitational wave (GW) signal in the nano-Hertz frequency band which may be detected by pulsar timing arrays (PTAs) in the coming years. The signal is composed of both stochastic and individually resolvable components. Here we develop a generic Bayesian method for the analysis of resolvable sources based on the construction of `null-streams' which cancel the part of the signal held in common for each pulsar (the Earth-term). For an array of N pulsars there are N - 2 independent null-streams that cancel the GW signal from a particular sky location. This method is applied to the localisation of quasi-circular binaries undergoing adiabatic inspiral. We carry out a systematic investigation of the scaling of the localisation accuracy with signal strength and number of pulsars in the PTA. Additionally, we find that source sky localisation with the International PTA data release one is vastly superior to that achieved by its constituent regional PTAs.
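The null-stream construction has a compact linear-algebra core: with one row of antenna-pattern coefficients per GW polarization, any vector in the null space of that 2 x N matrix cancels the Earth-term from all pulsars simultaneously, leaving N - 2 signal-free streams. The sketch below uses made-up patterns, signal, and noise, not PTA data.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(1)
n_pulsars, n_samples = 5, 2048

# Hypothetical plus/cross antenna-pattern coefficients of each pulsar for
# one candidate sky location (rows: polarizations, columns: pulsars)
F = rng.standard_normal((2, n_pulsars))

# Common Earth-term signal projected through F, plus independent pulsar noise
t = np.linspace(0.0, 10.0, n_samples)
h = np.vstack([np.sin(2 * np.pi * 0.3 * t),      # h_plus
               np.cos(2 * np.pi * 0.3 * t)])     # h_cross
residuals = F.T @ h + 0.01 * rng.standard_normal((n_pulsars, n_samples))

# The null space of F gives N - 2 independent combinations that cancel the
# GW signal from that sky location exactly, whatever h_plus/h_cross are
U = null_space(F)                      # shape: (n_pulsars, n_pulsars - 2)
null_streams = U.T @ residuals         # noise-only streams
```

For the true sky location the null streams contain only noise; for a wrong location the cancellation fails, which is what makes the construction usable for localisation.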

  10. The case for open-source software in drug discovery.

    Science.gov (United States)

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  11. inaugural address

    African Journals Online (AJOL)

    While political reorientation and economic redress were of immediate concern, ... South African context, where widespread changes have been proposed for education at all ... education at school and other levels and needs to be addressed so as to ... the major national curriculum intervention in environmental education.

  12. Assembling GHERG: Could "academic crowd-sourcing" address gaps in global health estimates?

    Science.gov (United States)

    Rudan, Igor; Campbell, Harry; Marušić, Ana; Sridhar, Devi; Nair, Harish; Adeloye, Davies; Theodoratou, Evropi; Chan, Kit Yee

    2015-06-01

    In recent months, the World Health Organization (WHO), independent academic researchers, the Lancet and PLoS Medicine journals worked together to improve reporting of population health estimates. The new guidelines for accurate and transparent health estimates reporting (likely to be named GATHER), which are eagerly awaited, represent a helpful move that should benefit the field of global health metrics. Building on this progress and drawing from a tradition of Child Health Epidemiology Reference Group (CHERG)'s successful work model, we would like to propose a new initiative - "Global Health Epidemiology Reference Group" (GHERG). We see GHERG as an informal and entirely voluntary international collaboration of academic groups who are willing to contribute to improving disease burden estimates and respect the principles of the new guidelines - a form of "academic crowd-sourcing". The main focus of GHERG will be to identify the "gap areas" where not much information is available and/or where there is a lot of uncertainty present about the accuracy of the existing estimates. This approach should serve to complement the existing WHO and IHME estimates and to represent added value to both efforts.

  13. Identification of a time-varying point source in a system of two coupled linear diffusion-advection- reaction equations: application to surface water pollution

    International Nuclear Information System (INIS)

    Hamdi, Adel

    2009-01-01

    This paper deals with the identification of a point source (localization of its position and recovery of the history of its time-varying intensity function) that constitutes the right-hand side of the first equation in a system of two coupled 1D linear transport equations. Assuming that the source intensity function vanishes before reaching the final control time, we prove the identifiability of the sought point source from records of the state relative to the second coupled transport equation at two observation points framing the source region. Note that at least one of the two observation points should be strategic. We establish an identification method that uses these records to identify the source position as the root of a continuous and strictly monotonic function, whereas the source intensity function is recovered using a recursive formula without any need for an iterative process. Some numerical experiments on a variant of the surface water pollution BOD–OD coupled model are presented.
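The position-as-root idea reduces to bracketed root finding once the monotonic function is in hand. The sketch below uses a stand-in function `g` whose root is planted at a hypothetical source position; it is a placeholder for the observation-based identification function derived in the paper, which is not reproduced here.

```python
import numpy as np
from scipy.optimize import brentq

a_true = 0.37              # hypothetical source position inside (0, 1)

def g(a):
    # Stand-in for the paper's continuous, strictly monotonic identification
    # function: any increasing function vanishing at the source position
    return np.tanh(3.0 * (a - a_true))

# The two observation points frame the source region, so the root is
# bracketed and a standard bracketing solver applies
a_hat = brentq(g, 0.0, 1.0)
```

Strict monotonicity guarantees a unique root in the bracket, which is why no iterative regularization is needed for the position estimate.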

  14. Provincial panel: addressing emerging energy constraints and new strategies to meet future generation demand

    International Nuclear Information System (INIS)

    Clarkson, J.

    2006-01-01

    This paper addresses emerging energy constraints and new strategies to meet future generation demand in the Province of Manitoba. The focus is to reduce reliance on energy sources that emit greenhouse gases, such as petroleum, natural gas and coal, and to increase clean and green electricity. The current plan is to double hydro generation, achieve 1000 MW of wind power and utilize bioenergy.

  15. The ugly twins: Failed global sourcing projects and their substitutes

    NARCIS (Netherlands)

    Schiele, Holger; Horn, Philipp; Werner, Welf

    2010-01-01

    Purpose of the paper and literature addressed: Analyzing the impact of failed global sourcing projects on the entire commodity group and exploring isomorphism as potential antecedent to the observed phenomenon. The paper is embedded in the global sourcing literature, as well as isomorphism and total

  16. A modification of the Regional Nutrient Management model (ReNuMa) to identify long-term changes in riverine nitrogen sources

    Science.gov (United States)

    Hu, Minpeng; Liu, Yanmei; Wang, Jiahui; Dahlgren, Randy A.; Chen, Dingjiang

    2018-06-01

    Source apportionment is critical for guiding development of efficient watershed nitrogen (N) pollution control measures. The ReNuMa (Regional Nutrient Management) model, a semi-empirical, semi-process-oriented model with modest data requirements, has been widely used for riverine N source apportionment. However, the ReNuMa model contains limitations for addressing long-term N dynamics by ignoring temporal changes in atmospheric N deposition rates and N-leaching lag effects. This work modified the ReNuMa model by revising the source code to allow yearly changes in atmospheric N deposition and incorporation of N-leaching lag effects into N transport processes. The appropriate N-leaching lag time was determined from cross-correlation analysis between annual watershed individual N source inputs and riverine N export. Accuracy of the modified ReNuMa model was demonstrated through analysis of a 31-year water quality record (1980-2010) from the Yongan watershed in eastern China. The revisions considerably improved the accuracy (Nash-Sutcliffe coefficient increased by ∼0.2) of the modified ReNuMa model for predicting riverine N loads. The modified model explicitly identified annual and seasonal changes in contributions of various N sources (i.e., point vs. nonpoint source, surface runoff vs. groundwater) to riverine N loads as well as the fate of watershed anthropogenic N inputs. Model results were consistent with previously modeled or observed lag time length as well as changes in riverine chloride and nitrate concentrations during the low-flow regime and available N levels in agricultural soils of this watershed. The modified ReNuMa model is applicable for addressing long-term changes in riverine N sources, providing decision-makers with critical information for guiding watershed N pollution control strategies.
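The lag-selection step (cross-correlation between annual N inputs and riverine export) can be sketched on synthetic annual data. The input series, export coefficient, and the planted four-year lag below are all made-up values, not the Yongan watershed record.

```python
import numpy as np

rng = np.random.default_rng(2)
years = 31
lag_true = 4                                  # hypothetical N-leaching lag (years)

# Synthetic annual watershed N input and riverine export lagged by lag_true
n_input = rng.random(years) * 100.0
n_export = np.empty(years)
n_export[lag_true:] = 0.3 * n_input[:years - lag_true]
n_export[:lag_true] = n_export[lag_true]      # pad the first, unobserved years
n_export += rng.normal(0.0, 1.0, years)       # measurement noise

# Correlate input against export shifted by each candidate lag; the
# appropriate lag is the one maximizing the correlation coefficient
lags = range(0, 10)
corr = [np.corrcoef(n_input[:years - k], n_export[k:])[0, 1] for k in lags]
best_lag = int(np.argmax(corr))
```

The same scan applied to real annual series is what pins down the lag length that the modified model then builds into its N transport routing.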

  17. Variation-Tolerant and Low-Power Source-Synchronous Multicycle On-Chip Interconnect Scheme

    Directory of Open Access Journals (Sweden)

    Maged Ghoneima

    2007-01-01

    The proposed multicycle bus scheme also leads to significant energy savings by eliminating the power-hungry flip-flops and efficiently designing the source-synchronization overhead. Moreover, eliminating intermediate flip-flops avoids the timing overhead of the setup time, the flip-flop delay, and the single-cycle clock jitter. This delay slack can then be translated into further energy savings by downsizing the repeaters. The significant delay jitter due to capacitive coupling has been addressed and solutions are put forward to alleviate it. Circuit simulations in a 65-nm process environment indicate that energy savings of up to 20% are achievable for a 6-cycle, 9 mm long, 16-bit bus.

  18. A shift in emission time profiles of fossil fuel combustion due to energy transitions impacts source receptor matrices for air quality.

    Science.gov (United States)

    Hendriks, Carlijn; Kuenen, Jeroen; Kranenburg, Richard; Scholz, Yvonne; Schaap, Martijn

    2015-03-01

    Effective air pollution and short-lived climate forcer mitigation strategies can only be designed when the effect of emission reductions on pollutant concentrations and on health and ecosystem impacts is quantified. Within integrated assessment modeling, source-receptor relationships (SRRs) based on chemistry transport modeling are used to this end. Currently, these SRRs are made using invariant emission time profiles. The LOTOS-EUROS model, equipped with a source attribution module, was used to test this assumption for renewable energy scenarios. Renewable energy availability, and thereby fossil fuel backup, is strongly dependent on meteorological conditions. We have used the spatially and temporally explicit energy model REMix to derive time profiles for backup power generation. These time profiles were used in LOTOS-EUROS to investigate the effect of emission timing on air pollutant concentrations and SRRs. It is found that the effectiveness of emission reduction in the power sector is significantly lower when accounting for the shift in the way emissions are distributed over the year and the correlation of emissions with synoptic situations. The source-receptor relationships also changed significantly. This effect was found for both primary and secondary pollutants. Our results indicate that emission timing deserves explicit attention when assessing the impacts of system changes on air quality and climate forcing from short-lived substances.

  19. Analysis of Quasi-periodic Oscillations and Time Lag in Ultraluminous X-Ray Sources with XMM-Newton

    Energy Technology Data Exchange (ETDEWEB)

    Li, Zi-Jian; Xiao, Guang-Cheng; Zhang, Shu; Ma, Xiang; Yan, Lin-Li; Qu, Jin-Lu [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Chen, Li; Bu, Qing-Cui; Zhang, Liang, E-mail: lizijian@ihep.ac.cn, E-mail: qujl@ihep.ac.cn [Department of Astronomy, Beijing Normal University, Beijing 100875 (China)

    2017-04-10

    We investigated the power density spectrum (PDS) and time lag of ultraluminous X-ray sources (ULXs) observed by XMM-Newton. We determined the PDS for each ULX and found that five of them show intrinsic variability due to obvious quasi-periodic oscillations (QPOs) in the mHz–1 Hz range, consistent with previous reports. We further investigated these five ULXs to determine their possible time lags. The ULX QPOs exhibit a soft time lag that is linearly related to the QPO frequency. We discuss the likelihood of the ULX QPOs being type-C QPO analogs, and the time lag models. The ULXs might harbor intermediate-mass black holes if their QPOs are type-C QPO analogs. We suggest that the soft lag and the linearity may be due to reverberation.
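A standard way to measure such a lag, in line with common X-ray timing practice though not necessarily the authors' exact pipeline, is to read it from the cross-spectrum phase between two energy bands at the QPO frequency: lag = phase / (2*pi*f). The sampling rate, QPO frequency, and planted 0.25 s soft lag below are synthetic.

```python
import numpy as np

fs, n = 64.0, 4096                   # sampling rate (Hz) and number of samples
t = np.arange(n) / fs
f_qpo = 0.5                          # QPO frequency (Hz), on an exact FFT bin
lag_true = 0.25                      # soft band lags hard band by 0.25 s

rng = np.random.default_rng(3)
hard = np.sin(2 * np.pi * f_qpo * t) + 0.1 * rng.standard_normal(n)
soft = np.sin(2 * np.pi * f_qpo * (t - lag_true)) + 0.1 * rng.standard_normal(n)

# Cross spectrum between the bands; its phase at the QPO bin encodes the lag
H, S = np.fft.rfft(hard), np.fft.rfft(soft)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
k = int(np.argmin(np.abs(freqs - f_qpo)))
cross = H[k] * np.conj(S[k])         # positive phase: soft lags hard
lag_est = np.angle(cross) / (2 * np.pi * freqs[k])
```

The linear soft-lag-versus-frequency relation reported in the abstract corresponds to a cross-spectrum phase that stays roughly constant across QPO frequencies.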

  20. Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS)

    Science.gov (United States)

    Daniels, M. D.; Graves, S. J.; Vernon, F.; Kerkez, B.; Chandra, C. V.; Keiser, K.; Martin, C.

    2014-12-01

    Access, utilization and management of real-time data continue to be challenging for decision makers, as well as researchers in several scientific fields. This presentation will highlight infrastructure aimed at addressing some of the gaps in handling real-time data, particularly in increasing accessibility of these data to the scientific community through cloud services. The Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) system addresses the ever-increasing importance of real-time scientific data, particularly in mission-critical scenarios where informed decisions must be made rapidly. Advances in the distribution of real-time data are allowing many new transient phenomena in space-time to be observed; however, real-time decision-making is infeasible in many cases that require streaming scientific data, as these data are locked down and sent only to proprietary in-house tools or displays. This lack of accessibility to the broader scientific community prohibits algorithm development and workflows initiated by these data streams. As part of NSF's EarthCube initiative, CHORDS proposes to make real-time data available to the academic community via cloud services. The CHORDS infrastructure will enhance the role of real-time data within the geosciences, specifically expanding the potential of streaming data sources in enabling adaptive experimentation and real-time hypothesis testing. Adherence to community data and metadata standards will promote the integration of CHORDS real-time data with existing standards-compliant analysis, visualization and modeling tools.

  1. EEASA 2004 Keynote Address - Environment as Life:The Journey of ...

    African Journals Online (AJOL)

    EEASA 2004 Keynote Address - Environment as Life:The Journey of an ... as a biology teacher in times of political turmoil and change in South Africa. ... threaten not only the quality of human life, but also the capacity of the earth to sustain life.

  2. Iterative algorithm for joint zero diagonalization with application in blind source separation.

    Science.gov (United States)

    Zhang, Wei-Tao; Lou, Shun-Tian

    2011-07-01

    A new iterative algorithm for the nonunitary joint zero diagonalization of a set of matrices is proposed for blind source separation applications. On one hand, since the zero diagonalizer of the proposed algorithm is constructed iteratively by successive multiplications of an invertible matrix, the singular solutions that occur in the existing nonunitary iterative algorithms are naturally avoided. On the other hand, compared to the algebraic method for joint zero diagonalization, the proposed algorithm requires fewer matrices to be zero diagonalized to yield even better performance. The extension of the algorithm to the complex and nonsquare mixing cases is also addressed. Numerical simulations on both synthetic data and blind source separation using time-frequency distributions illustrate the performance of the algorithm and provide a comparison to the leading joint zero diagonalization schemes.

  3. Generative Street Addresses from Satellite Imagery

    Directory of Open Access Journals (Sweden)

    İlke Demir

    2018-03-01

    Full Text Available We describe our automatic generative algorithm to create street addresses from satellite images by learning and labeling roads, regions, and address cells. Currently, 75% of the world’s roads lack adequate street addressing systems. Recent geocoding initiatives tend to convert pure latitude and longitude information into a memorable form for unknown areas. However, settlements are identified by streets, and such addressing schemes are not coherent with the road topology. Instead, we propose a generative address design that maps the globe in accordance with streets. Our algorithm starts with extracting roads from satellite imagery by utilizing deep learning. Then, it uniquely labels the regions, roads, and structures using some graph- and proximity-based algorithms. We also extend our addressing scheme to (i) cover inaccessible areas following similar design principles; (ii) be inclusive and flexible for changes on the ground; and (iii) lead as a pioneer for a unified street-based global geodatabase. We present our results on an example of a developed city and multiple undeveloped cities. We also compare productivity on the basis of current ad hoc and new complete addresses. We conclude by contrasting our generative addresses to current industrial and open solutions.

  4. A sparse equivalent source method for near-field acoustic holography

    DEFF Research Database (Denmark)

    Fernandez Grande, Efren; Xenaki, Angeliki; Gerstoft, Peter

    2017-01-01

    and experimental results on a classical guitar and on a highly reactive dipolelike source are presented. C-ESM is valid beyond the conventional sampling limits, making wideband reconstruction possible. Spatially extended sources can also be addressed with C-ESM, although in this case the obtained solution does...

  5. A formal treatment of uncertainty sources in a level 2 PSA

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    The methodological framework of the level 2 PSA appears to be currently standardized in a formalized fashion, but there have been different opinions on the way the sources of uncertainty are characterized and treated. This is primarily because the level 2 PSA deals with complex phenomenological processes that are deterministic in nature rather than random processes, and there are no probabilistic models characterizing them clearly. As a result, the probabilistic quantification of the level 2 PSA is often subject to two sources of uncertainty: (a) incomplete modeling of accident pathways or different predictions for the behavior of phenomenological events and (b) expert-to-expert variation in estimating the occurrence probability of phenomenological events. While a clear definition of the two sources of uncertainty involved in the level 2 PSA makes it possible to treat uncertainty in a consistent manner, careless application of these different sources of uncertainty may produce different conclusions in the decision-making process. The primary purpose of this paper is to characterize typical sources of uncertainty that would often be addressed in the level 2 PSA and their impacts on the level 2 PSA risk results. An additional purpose is to give a formal approach to combining random uncertainties addressed in the level 1 PSA with subjectivistic uncertainties addressed in the level 2 PSA.

  6. K+ ion source for the heavy ion Induction Linac System Experiment ILSE

    International Nuclear Information System (INIS)

    Eylon, S.; Henestroza, E.; Chupp, W.W.; Yu, S.

    1993-05-01

    Low emittance singly charged potassium thermionic ion sources are being developed for the ILSE injector. The ILSE, now under study at LBL, will address the physics issues of particle beams in a heavy ion fusion driver scenario. The K + ion beam is emitted thermionically into a diode gap from alumina-silicate layers (zeolite) coated uniformly on a porous tungsten cup. The injector diode design requires a large diameter (4 in. to 7 in.) source able to deliver a high current (∼800 mA), low emittance (E n < .5 π mm-mr) beam. The SBTE (Single Beam Test Experiment) 120 keV gun was redesigned and modified with the aid of diode optics calculations using the EGUN code to enable the extraction of high currents of about 90 mA out of a one-inch diameter source. We report on the 1 in. source fabrication technique and performance, including total current and current density profile measurements using Faraday cups, emittance and phase space profile measurements using the double slit scanning technique, and lifetime measurements. Furthermore, we report on the extension of the fabrication technique to large diameter sources (up to 7 in.), measured ion emission performance, and measured surface temperature uniformity and heating power considerations for large sources.

  7. K+ ion source for the heavy ion induction linac system experiment ILSE

    International Nuclear Information System (INIS)

    Eylon, S.; Henestroza, E.; Chupp, W.W.; Yu, S.

    1993-01-01

    Low emittance singly charged potassium thermionic ion sources are being developed for the ILSE injector. The ILSE, now under study at LBL, will address the physics issues of particle beams in a heavy ion fusion driver scenario. The K + ion beam is emitted thermionically into a diode gap from alumino-silicate layers (zeolite) coated uniformly on a porous tungsten cup. The Injector diode design requires a large diameter (4 inches to 7 inches) source able to deliver high current (∼ 800 mA) low emittance (E n < .5 π mm-mr) beam. The SBTE (Single Beam Test Experiment) 120 keV gun was redesigned and modified with the aid of diode optics calculations using the EGUN code to enable the extraction of high currents of about 90 mA out of a one-inch diameter source. The authors report on the 1 inch source fabrication technique and performance, including total current and current density profile measurements using Faraday cups, emittance and phase space profile measurements using the double slit scanning technique, and life time measurements. Furthermore, they shall report on the extension of the fabricating technique to large diameter sources (up to 7 inches), measured ion emission performance, measured surface temperature uniformity and heating power considerations for large sources

  8. Stochastic resonance driven by time-modulated correlated coloured noise sources in a single-mode laser

    International Nuclear Information System (INIS)

    De-Yi, Chen; Li, Zhang

    2009-01-01

    This paper investigates the phenomenon of stochastic resonance in a single-mode laser driven by time-modulated correlated coloured noise sources. The power spectrum and signal-to-noise ratio R of the laser intensity are calculated in the linear approximation. The effects of the noise self-correlation times τ1, τ2 and the cross-correlation time τ3 on stochastic resonance are analysed in two ways: τ1, τ2 and τ3 are taken to be the independent variables and the parameters, respectively. The effects of the gain coefficient Γ and loss coefficient K on the stochastic resonance are also discussed. It is found that besides the presence of the standard form and the broad sense of stochastic resonance, the number of extrema in the curve of R versus K is reduced with the increase of the gain coefficient Γ.

  9. Addressing sampling bias in counting forest birds: a West African ...

    African Journals Online (AJOL)

    Addressing sampling bias in counting forest birds: a West African case study. ... result may occur because of the noise they may introduce into the analysis. ... used at all; and for all transects to reach their mid-point at the same time each day, ...

  10. Advances in surface ion suppression from RILIS: Towards the Time-of-Flight Laser Ion Source (ToF-LIS)

    CERN Document Server

    Rothe, S; Crepieux, B; Day Goodacre, T; Fedosseev, V N; Giles, T; Marsh, B A; Ramos, J P; Rossel, R E

    2016-01-01

    We present results from the development towards the Time-of-Flight Laser Ion Source (ToF-LIS) aiming for the suppression of isobaric contaminants through fast beam gating. The capability to characterize high resistance ion sources has been successfully demonstrated. A ninefold selectivity gain has been achieved through suppression of surface ionized potassium, while maintaining >90% transmission for laser-ionized gallium using a thin wall graphite ionizer cavity combined with a fast beam gate. Initial results from the investigation of glassy carbon as a potential hot cavity ion source are presented. Power-cycle tests of a newly designed mount for fragile ion source cavities indicates its capability to survive the thermal stress expected during operation in an ISOLDE target unit. Finally, we introduce fast ion beam switching at a rate of 10 kHz using the ISOLDE ion beam switchyard as a new concept for ion beam distribution and conclude by highlighting the potential applications of this ion beam multiplexing te...

  11. RSS-based localization of isotropically decaying source with unknown power and pathloss factor

    International Nuclear Information System (INIS)

    Sun, Shunyuan; Sun, Li; Ding, Zhiguo

    2016-01-01

    This paper addresses the localization of an isotropically decaying source from received signal strength (RSS) measurements collected by nearby active sensors that are position-aware and wirelessly connected, and proposes a novel iterative algorithm for RSS-based source localization that improves location accuracy and enables real-time localization and automatic monitoring of hospital patients and medical equipment in the smart hospital. In particular, we consider the general case where the source power and pathloss factor are both unknown. For this problem, we propose an iterative algorithm in which the unknown source position and the two other unknown parameters (i.e. the source power and pathloss factor) are estimated alternately based on each other, starting from a sub-optimal initial estimate of the source position obtained from the RSS measurements of a few (closest) active sensors with the largest RSS values. Analysis and simulation show that the proposed iterative algorithm guarantees global convergence to the least-squares (LS) solution; for independent and identically distributed (i.i.d.) zero-mean Gaussian RSS measurement errors, the converged localization performance attains the optimum corresponding to the Cramer–Rao lower bound (CRLB).
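The alternating structure can be sketched as follows. The log-distance propagation model, the sensor layout, and the solver choices (linear LS for power/pathloss, Nelder-Mead for position) are illustrative assumptions, not the authors' exact algorithm; all numbers are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
sensors = rng.uniform(0.0, 20.0, size=(8, 2))   # known sensor positions (m)
src_true = np.array([12.0, 7.0])                # unknowns the algorithm must
p0_true, gamma_true = -40.0, 3.0                # recover: position, power, pathloss

# RSS model (dB): rss_i = P0 - 10 * gamma * log10(d_i) + noise
d = np.linalg.norm(sensors - src_true, axis=1)
rss = p0_true - 10.0 * gamma_true * np.log10(d) + rng.normal(0.0, 0.1, len(d))

# Sub-optimal initial position: mean of the sensors with the largest RSS
pos = sensors[np.argsort(rss)[-3:]].mean(axis=0)

for _ in range(10):
    # Step 1: position fixed -> power and pathloss follow from linear LS
    di = np.linalg.norm(sensors - pos, axis=1)
    H = np.column_stack([np.ones_like(di), -10.0 * np.log10(di)])
    p0_hat, gamma_hat = np.linalg.lstsq(H, rss, rcond=None)[0]

    # Step 2: power and pathloss fixed -> refine position by nonlinear LS
    def cost(p):
        dd = np.maximum(np.linalg.norm(sensors - p, axis=1), 1e-9)
        return np.sum((rss - p0_hat + 10.0 * gamma_hat * np.log10(dd)) ** 2)

    pos = minimize(cost, pos, method='Nelder-Mead').x
```

Each half-step solves an easy subproblem given the other half, which is the sense in which the position and the (power, pathloss) pair are "estimated alternately based on each other".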

  12. Optimization of light source parameters in the photodynamic therapy of heterogeneous prostate

    International Nuclear Information System (INIS)

    Li Jun; Altschuler, Martin D; Hahn, Stephen M; Zhu, Timothy C

    2008-01-01

    The three-dimensional (3D) heterogeneous distributions of optical properties in a patient prostate can now be measured in vivo. Such data can be used to obtain a more accurate light-fluence kernel. (For specified sources and points, the kernel gives the fluence delivered to a point by a source of unit strength.) In turn, the kernel can be used to solve the inverse problem that determines the source strengths needed to deliver a prescribed photodynamic therapy (PDT) dose (or light-fluence) distribution within the prostate (assuming uniform drug concentration). We have developed and tested computational procedures to use the new heterogeneous data to optimize delivered light-fluence. New problems arise, however, in quickly obtaining an accurate kernel following the insertion of interstitial light sources and data acquisition. (1) The light-fluence kernel must be calculated in 3D and separately for each light source, which increases kernel size. (2) An accurate kernel for light scattering in a heterogeneous medium requires ray tracing and volume partitioning, thus significant calculation time. To address these problems, two different kernels were examined and compared for speed of creation and accuracy of dose. Kernels derived more quickly involve simpler algorithms. Our goal is to achieve optimal dose planning with patient-specific heterogeneous optical data applied through accurate kernels, all within clinical times. The optimization process is restricted to accepting the given (interstitially inserted) sources, and determining the best source strengths with which to obtain a prescribed dose. The Cimmino feasibility algorithm is used for this purpose. The dose distribution and source weights obtained for each kernel are analyzed. In clinical use, optimization will also be performed prior to source insertion to obtain initial source positions, source lengths and source weights, but with the assumption of homogeneous optical properties. 
For this reason, we compare the

  13. Astrometric and Timing Effects of Gravitational Waves from Localized Sources

    OpenAIRE

    Kopeikin, Sergei M.; Schafer, Gerhard; Gwinn, Carl R.; Eubanks, T. Marshall

    1998-01-01

    A consistent approach for an exhaustive solution of the problem of propagation of light rays in the field of gravitational waves emitted by a localized source of gravitational radiation is developed in the first post-Minkowskian and quadrupole approximation of General Relativity. We demonstrate that the equations of light propagation in the retarded gravitational field of an arbitrary localized source emitting quadrupolar gravitational waves can be integrated exactly. The influence of the gra...

  14. Characterization of dynamic changes of current source localization based on spatiotemporal fMRI constrained EEG source imaging

    Science.gov (United States)

    Nguyen, Thinh; Potter, Thomas; Grossman, Robert; Zhang, Yingchun

    2018-06-01

    Objective. Neuroimaging has been employed as a promising approach to advance our understanding of brain networks in both basic and clinical neuroscience. Electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) represent two neuroimaging modalities with complementary features; EEG has high temporal resolution and low spatial resolution while fMRI has high spatial resolution and low temporal resolution. Multimodal EEG inverse methods have attempted to capitalize on these properties but have been subjected to localization error. The dynamic brain transition network (DBTN) approach, a spatiotemporal fMRI constrained EEG source imaging method, has recently been developed to address these issues by solving the EEG inverse problem in a Bayesian framework, utilizing fMRI priors in a spatial and temporal variant manner. This paper presents a computer simulation study to provide a detailed characterization of the spatial and temporal accuracy of the DBTN method. Approach. Synthetic EEG data were generated in a series of computer simulations, designed to represent realistic and complex brain activity at superficial and deep sources with highly dynamical activity time-courses. The source reconstruction performance of the DBTN method was tested against the fMRI-constrained minimum norm estimates algorithm (fMRIMNE). The performances of the two inverse methods were evaluated both in terms of spatial and temporal accuracy. Main results. In comparison with the commonly used fMRIMNE method, results showed that the DBTN method produces results with increased spatial and temporal accuracy. The DBTN method also demonstrated the capability to reduce crosstalk in the reconstructed cortical time-course(s) induced by neighboring regions, mitigate depth bias and improve overall localization accuracy. Significance. The improved spatiotemporal accuracy of the reconstruction allows for an improved characterization of complex neural activity. This improvement can be

  15. Measurement of Neutron Energy Spectrum Emitted by Cf-252 Source Using Time-of-Flight Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Cheol Ho; Son, Jaebum; Kim, Tae Hoon; Lee, Sangmin; Kim, Yong-Kyun [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    The techniques proposed to detect neutrons usually require the detection of a secondary recoiling nucleus in a scintillator (or other type of detector) to indicate the rare collision of a neutron with a nucleus. This is the same basic technique, in this case detection of a recoil proton, that was used by Chadwick in the 1930s to discover and identify the neutron and determine its mass. It remains the primary technique used today for fast neutron detection, which typically involves a hydrogen-based organic plastic or liquid scintillator coupled to a photomultiplier tube. The light output from such scintillators is a function of the cross section and nuclear kinematics of the n + nucleus collision. With the exception of deuterated scintillators, the scintillator signal does not necessarily produce a distinct peak in the scintillator spectrum directly related to the incident neutron energy. Instead, neutron time-of-flight (TOF) must often be utilized to determine the neutron energy, which requires the generation of a prompt start signal from the nuclear source emitting the neutrons. This method takes advantage of the high number of prompt gamma rays. The time-of-flight method was used to measure the neutron energy spectrum emitted by the Cf-252 neutron source. A plastic scintillator, which has superior neutron/gamma-ray discrimination ability, was used as the stop-signal detector, and a liquid scintillator was used as the start-signal detector. In the experiment, the neutron and gamma-ray spectra were first measured and discriminated using the TOF method. The neutron energy spectrum was then obtained through spectrum analysis, and an equation for the neutron energy spectrum emitted by the Cf-252 source was obtained using Gaussian fitting.
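
The TOF-to-energy conversion underlying this measurement is the non-relativistic relation E = ½mv² with v = L/t. A minimal sketch (the 1 m flight path and 2 MeV energy are illustrative values, not taken from the paper):

```python
# Non-relativistic time-of-flight energy: E = (1/2) m v^2 with v = L/t.
M_N_MEV = 939.56542   # neutron rest mass energy [MeV]
C = 299_792_458.0     # speed of light [m/s]

def neutron_energy_mev(path_m, tof_s):
    """Kinetic energy in MeV from flight path [m] and time of flight [s]."""
    beta = path_m / tof_s / C
    return 0.5 * M_N_MEV * beta**2

def tof_for_energy(path_m, e_mev):
    """Inverse: flight time [s] for a given kinetic energy [MeV]."""
    beta = (2 * e_mev / M_N_MEV) ** 0.5
    return path_m / (beta * C)

# A 2 MeV fission neutron over a hypothetical 1 m flight path:
t = tof_for_energy(1.0, 2.0)     # on the order of tens of nanoseconds
e = neutron_energy_mev(1.0, t)   # round-trip check recovers 2 MeV
```

The nanosecond-scale flight times explain why a prompt gamma-ray start signal and fast scintillators are needed.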

  16. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and by limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple, open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process of conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition, this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and

  17. Bayesian Travel Time Inversion adopting Gaussian Process Regression

    Science.gov (United States)

    Mauerberger, S.; Holschneider, M.

    2017-12-01

    A major application in seismology is the determination of seismic velocity models. Travel time measurements put an integral constraint on the velocity between source and receiver. We provide insight into travel time inversion from a correlation-based Bayesian point of view, adopting the concept of Gaussian process regression to estimate a velocity model. The non-linear travel time integral is approximated by a first-order Taylor expansion. A heuristic covariance describes correlations amongst observations and the a priori model. That approach enables us to assess a proxy of the Bayesian posterior distribution at ordinary computational cost; no multidimensional numerical integration or excessive sampling is necessary. Instead of stacking the data, we suggest progressively building the posterior distribution: incorporating only a single evidence at a time accounts for the deficit of linearization. As a result, the most probable model is given by the posterior mean, whereas uncertainties are described by the posterior covariance. As a proof of concept, a synthetic, purely 1D model is addressed, with a single source accompanied by multiple receivers on top of a model comprising a discontinuity. We consider travel times of both phases - direct and reflected wave - corrupted by noise. Left and right of the interface are assumed independent, where the squared exponential kernel serves as covariance.
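
The building block of this approach is Gaussian process regression with a squared exponential kernel. A minimal, self-contained sketch of the posterior mean (the 1D data, hyper-parameters, and plain Gaussian elimination solver are illustrative assumptions, not the study's setup):

```python
import math

def sq_exp(x1, x2, sigma=1.0, ell=0.1):
    # Squared exponential covariance, as named in the abstract.
    return sigma**2 * math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def gp_posterior_mean(xs, ys, x_star, noise=1e-2):
    n = len(xs)
    # Covariance matrix K plus observation-noise variance on the diagonal.
    K = [[sq_exp(xs[i], xs[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    k_star = [sq_exp(x, x_star) for x in xs]
    # Solve K a = y by Gaussian elimination with partial pivoting
    # (fine for tiny n; real codes would use a Cholesky factorization).
    A = [row[:] + [y] for row, y in zip(K, ys)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    a = [0.0] * n
    for r in range(n - 1, -1, -1):
        a[r] = (A[r][n] - sum(A[r][c] * a[c] for c in range(r + 1, n))) / A[r][r]
    # Posterior mean: k_*^T (K + noise I)^{-1} y
    return sum(ks * ai for ks, ai in zip(k_star, a))

xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [math.sin(2 * math.pi * x) for x in xs]
m = gp_posterior_mean(xs, ys, 0.25)   # near the training value sin(pi/2) = 1
```

The "progressive" variant in the abstract amounts to folding in one observation at a time, re-linearizing around the updated posterior mean between steps.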

  18. An Efficient Addressing Scheme and Its Routing Algorithm for a Large-Scale Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Choi Jeonghee

    2008-01-01

    So far, various addressing and routing algorithms have been extensively studied for wireless sensor networks (WSNs), but many of them were limited to covering fewer than hundreds of sensor nodes. This is largely due to stringent requirements for fully distributed coordination among sensor nodes, leading to wasteful use of the available address space. As there is a growing need for large-scale WSNs, it will be extremely challenging to support more than thousands of nodes using existing standards. Moreover, it is highly unlikely that the existing standards will change, primarily due to backward compatibility issues. In response, we propose an elegant addressing scheme and its routing algorithm. While maintaining the existing address scheme, it tackles the wastage problem and requires no additional memory storage during routing. We also present an adaptive routing algorithm for location-aware applications using our addressing scheme. Through a series of simulations, we show that our approach can achieve half the routing time of the existing standard in a ZigBee network.

  19. An Efficient Addressing Scheme and Its Routing Algorithm for a Large-Scale Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Yongwan Park

    2008-12-01

    So far, various addressing and routing algorithms have been extensively studied for wireless sensor networks (WSNs), but many of them were limited to covering fewer than hundreds of sensor nodes. This is largely due to stringent requirements for fully distributed coordination among sensor nodes, leading to wasteful use of the available address space. As there is a growing need for large-scale WSNs, it will be extremely challenging to support more than thousands of nodes using existing standards. Moreover, it is highly unlikely that the existing standards will change, primarily due to backward compatibility issues. In response, we propose an elegant addressing scheme and its routing algorithm. While maintaining the existing address scheme, it tackles the wastage problem and requires no additional memory storage during routing. We also present an adaptive routing algorithm for location-aware applications using our addressing scheme. Through a series of simulations, we show that our approach can achieve half the routing time of the existing standard in a ZigBee network.

  20. Real-Time Localization of Moving Dipole Sources for Tracking Multiple Free-Swimming Weakly Electric Fish

    Science.gov (United States)

    Jun, James Jaeyoon; Longtin, André; Maler, Leonard

    2013-01-01

    In order to survive, animals must quickly and accurately locate prey, predators, and conspecifics using the signals they generate. The signal source location can be estimated using multiple detectors and the inverse relationship between the received signal intensity (RSI) and the distance, but difficulty of the source localization increases if there is an additional dependence on the orientation of a signal source. In such cases, the signal source could be approximated as an ideal dipole for simplification. Based on a theoretical model, the RSI can be directly predicted from a known dipole location; but estimating a dipole location from RSIs has no direct analytical solution. Here, we propose an efficient solution to the dipole localization problem by using a lookup table (LUT) to store RSIs predicted by our theoretically derived dipole model at many possible dipole positions and orientations. For a given set of RSIs measured at multiple detectors, our algorithm found a dipole location having the closest matching normalized RSIs from the LUT, and further refined the location at higher resolution. Studying the natural behavior of weakly electric fish (WEF) requires efficiently computing their location and the temporal pattern of their electric signals over extended periods. Our dipole localization method was successfully applied to track single or multiple freely swimming WEF in shallow water in real-time, as each fish could be closely approximated by an ideal current dipole in two dimensions. Our optimized search algorithm found the animal’s positions, orientations, and tail-bending angles quickly and accurately under various conditions, without the need for calibrating individual-specific parameters. Our dipole localization method is directly applicable to studying the role of active sensing during spatial navigation, or social interactions between multiple WEF. Furthermore, our method could be extended to other application areas involving dipole source
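
The LUT idea can be sketched in a few lines: precompute normalized RSI patterns over a grid of candidate dipole positions and orientations, then pick the nearest match. The 2D field model (reading proportional to p̂·r̂/|r|²), detector layout, and grid are illustrative assumptions, not the authors' calibrated model:

```python
import math

# Hypothetical detector positions in a 2D tank.
detectors = [(0, 0), (4, 0), (0, 4), (4, 4), (2, 5)]

def rsi(pos, theta):
    # Schematic dipole model: signal ~ (p_hat . r_hat) / |r|^2.
    out = []
    for dx, dy in detectors:
        rx, ry = dx - pos[0], dy - pos[1]
        r2 = rx * rx + ry * ry
        r = math.sqrt(r2)
        out.append((math.cos(theta) * rx + math.sin(theta) * ry) / (r * r2))
    return out

def normalize(v):
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

# Lookup table over a coarse grid of positions and orientations.
lut = []
for i in range(1, 40):
    for j in range(1, 40):
        for k in range(16):
            pos, th = (i * 0.1, j * 0.1), k * math.pi / 8
            lut.append((pos, th, normalize(rsi(pos, th))))

def locate(measured):
    # Nearest LUT entry by distance between normalized RSI vectors;
    # a real implementation would refine around this coarse match.
    m = normalize(measured)
    return min(lut, key=lambda e: sum((a - b) ** 2 for a, b in zip(e[2], m)))

true_pos, true_th = (1.3, 2.7), math.pi / 8
est_pos, est_th, _ = locate(rsi(true_pos, true_th))
```

Normalizing the RSI vectors removes the unknown dipole strength, which is why the table only needs positions and orientations.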

  1. Real-Time Localization of Moving Dipole Sources for Tracking Multiple Free-Swimming Weakly Electric Fish.

    Directory of Open Access Journals (Sweden)

    James Jaeyoon Jun

    In order to survive, animals must quickly and accurately locate prey, predators, and conspecifics using the signals they generate. The signal source location can be estimated using multiple detectors and the inverse relationship between the received signal intensity (RSI) and the distance, but difficulty of the source localization increases if there is an additional dependence on the orientation of a signal source. In such cases, the signal source could be approximated as an ideal dipole for simplification. Based on a theoretical model, the RSI can be directly predicted from a known dipole location; but estimating a dipole location from RSIs has no direct analytical solution. Here, we propose an efficient solution to the dipole localization problem by using a lookup table (LUT) to store RSIs predicted by our theoretically derived dipole model at many possible dipole positions and orientations. For a given set of RSIs measured at multiple detectors, our algorithm found a dipole location having the closest matching normalized RSIs from the LUT, and further refined the location at higher resolution. Studying the natural behavior of weakly electric fish (WEF) requires efficiently computing their location and the temporal pattern of their electric signals over extended periods. Our dipole localization method was successfully applied to track single or multiple freely swimming WEF in shallow water in real-time, as each fish could be closely approximated by an ideal current dipole in two dimensions. Our optimized search algorithm found the animal's positions, orientations, and tail-bending angles quickly and accurately under various conditions, without the need for calibrating individual-specific parameters. Our dipole localization method is directly applicable to studying the role of active sensing during spatial navigation, or social interactions between multiple WEF.
Furthermore, our method could be extended to other application areas involving dipole

  2. Time management and procrastination

    OpenAIRE

    van Eerde, W.; Mumford, M.D.; Frese, M.

    2015-01-01

    This chapter combines the topics of time management and procrastination. Time management is an overarching term derived from popular notions of how to be effective at work. Procrastination has been mainly researched from a personality perspective, addressing the emotional and psychological issues of the phenomenon in more detail. First, I describe time management and procrastination, and next I address interventions that may help people in overcoming procrastination. Studies on time management s...

  3. Source apportionment of size and time resolved trace elements and organic aerosols from an urban courtyard site in Switzerland

    Directory of Open Access Journals (Sweden)

    A. Richard

    2011-09-01

    Time- and size-resolved data on trace elements were obtained from measurements with a rotating drum impactor (RDI) and subsequent X-ray fluorescence spectrometry. Trace elements can act as indicators for the identification of sources of particulate matter <10 μm (PM10) in ambient air. Receptor modeling was performed with positive matrix factorization (PMF) on trace element data from an urban background site in Zürich, Switzerland. Eight different sources were identified across the three examined size ranges (PM1−0.1, PM2.5−1 and PM10−2.5): secondary sulfate, wood combustion, fireworks, road traffic, mineral dust, de-icing salt, industrial and local anthropogenic activities. The major component was secondary sulfate for the smallest size range; the road traffic factor was found in all three size ranges. This trace element analysis is complemented with data from an Aerodyne high-resolution time-of-flight aerosol mass spectrometer (AMS), assessing the PM1 fraction of organic aerosols. A separate PMF analysis revealed three factors related to three of the sources found with the RDI: oxygenated organic aerosol (OOA), related to inorganic secondary sulfate; hydrocarbon-like organic aerosol (HOA), related to road traffic; and biomass burning organic aerosol (BBOA), explaining 60%, 22% and 17% of total measured organics, respectively. Since different compounds are used for the source classification, a higher percentage of the ambient PM10 mass concentration can be apportioned to sources by the combination of both methods.
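
PMF factorizes the data matrix X (samples × species) into non-negative source contributions G and source profiles F, X ≈ GF. A minimal *unweighted* non-negative matrix factorization with multiplicative updates illustrates the idea; real PMF additionally weights each entry by its measurement uncertainty, and the toy data below are made up:

```python
import random

def matmul(A, B):
    n, k, m = len(A), len(B), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

def transpose(A):
    return [list(col) for col in zip(*A)]

def nmf(X, k, iters=2000, eps=1e-9):
    # Lee-Seung multiplicative updates; factors stay non-negative.
    n, m = len(X), len(X[0])
    rng = random.Random(0)
    G = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    F = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        Gt = transpose(G)
        num, den = matmul(Gt, X), matmul(matmul(Gt, G), F)
        F = [[F[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(k)]
        Ft = transpose(F)
        num, den = matmul(X, Ft), matmul(G, matmul(F, Ft))
        G = [[G[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(n)]
    return G, F

# Two hypothetical "sources" mixed into six samples of four species:
F_true = [[1.0, 0.0, 0.5, 0.2], [0.0, 1.0, 0.1, 0.7]]
G_true = [[2, 0], [0, 3], [1, 1], [2, 1], [0.5, 2], [3, 0.5]]
X = matmul(G_true, F_true)

G, F = nmf(X, 2)
R = matmul(G, F)
err = sum((X[i][j] - R[i][j]) ** 2 for i in range(6) for j in range(4))
```

The non-negativity constraint is what lets the recovered rows of F be read as chemical source profiles.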

  4. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    Science.gov (United States)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and

  5. Terror in time: extending culturomics to address basic terror management mechanisms.

    Science.gov (United States)

    Dechesne, Mark; Bandt-Law, Bryn

    2018-04-11

    Building on Google's efforts to scan millions of books, this article introduces a methodology using a database of annual word frequencies for the 40,000 most frequently occurring words in American literature between 1800 and 2009. The current paper uses this methodology to replicate and identify terror management processes in historical context. Variation in the usage frequencies of words for constructs relevant to terror management theory (e.g. death, worldview, self-esteem, relationships) is investigated over a time period of 209 years. Study 1 corroborated previous TMT findings and demonstrated that word use of constructs related to death and of constructs related to patriotism and romantic relationships significantly co-varies over time. Study 2 showed that the use of the word "death" most strongly co-varies over time with the use of medical constructs, but also co-varies with the use of constructs related to violence, relationships, religion, positive sentiment, and negative sentiment. Study 3 found that a change in the use of death-related words is associated with an increase in the use of fear-related words, but not in anxiety-related words. Results indicate that the described methodology generates valuable insights regarding terror management theory and provides new perspectives for theoretical advances.
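
The co-variation analyses above amount to correlating annual frequency series. A Pearson correlation on two synthetic stand-in series (a shared upward trend plus unrelated oscillations; not Google Books data) sketches the computation:

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient of two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

years = list(range(210))  # stand-in for 1800..2009
death = [1.0 + 0.004 * t + 0.1 * math.sin(t / 7) for t in years]
fear = [0.5 + 0.002 * t + 0.05 * math.sin(t / 11) for t in years]

r = pearson(death, fear)  # positive: the shared trend dominates
```

With real culturomics data, series are usually detrended or differenced first so that the correlation reflects year-to-year co-variation rather than a common secular trend.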

  6. Measurement and analysis of time-domain characteristics of corona-generated radio interference from a single positive corona source

    Science.gov (United States)

    Li, Xuebao; Li, Dayong; Chen, Bo; Cui, Xiang; Lu, Tiebing; Li, Yinfei

    2018-04-01

    Corona-generated electromagnetic interference, commonly known as radio interference (RI), has become a limiting factor in the design of high voltage direct current transmission lines. In this paper, a time-domain measurement system is developed to measure the time-domain characteristics of RI from a single positive corona source. In the experiments, the corona current pulses are synchronously measured through coupling capacitors, and a one-to-one relationship between the corona current pulse and the measured RI voltage pulse is observed. The statistical characteristics of the pulse parameters are analyzed, as are the correlations between the corona current pulse and the RI voltage pulse in the time domain and frequency domain. From the measured corona current pulses, the time-domain waveform of corona-generated RI is calculated on the basis of the propagation model of corona current on the conductor, the dipole model for electric field calculation, and the antenna model for induced voltage calculation. The close match between measured and simulated RI voltage waveforms demonstrates the validity of the measurement and calculation method presented in this paper, and further confirms the close correlation between corona current and corona-generated RI.

  7. Real-Time Detection of Application-Layer DDoS Attack Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tongguang Ni

    2013-01-01

    Distributed denial of service (DDoS) attacks are one of the major threats to the current Internet, and application-layer DDoS attacks utilizing legitimate HTTP requests to overwhelm victim resources are harder to detect. Consequently, neither intrusion detection systems (IDS) nor the victim server can detect the malicious packets. In this paper, a novel approach to detecting application-layer DDoS attacks is proposed based on the entropy of HTTP GET requests per source IP address (HRPI). By approximating the adaptive autoregressive (AAR) model, the HRPI time series is transformed into a multidimensional vector series. Then, a trained support vector machine (SVM) classifier is applied to identify attacks. Experiments with several databases were performed, and the results show that this approach can detect application-layer DDoS attacks effectively.
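
The HRPI feature itself is just the Shannon entropy of the source-IP distribution of GET requests in a time window. A sketch with synthetic request streams (the IP ranges and window contents are made up): a broad, legitimate client population spreads requests over many IPs (high entropy), while a small set of attacking hosts generating the same volume concentrates them (low entropy).

```python
import math
from collections import Counter

def hrpi(window_ips):
    # Shannon entropy (bits) of the per-source-IP request distribution
    # within one observation window.
    counts = Counter(window_ips)
    total = len(window_ips)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# 1500 GET requests per window: 50 even clients vs. 3 hammering hosts.
normal = [f"10.0.0.{i % 50}" for i in range(1500)]
attack = [f"10.0.0.{i % 3}" for i in range(1500)]

h_normal, h_attack = hrpi(normal), hrpi(attack)
```

In the paper's pipeline this scalar is computed per window to form a time series, which is then AAR-modelled and fed to the SVM; the entropy drop under attack is what makes the series separable.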

  8. 40 CFR 60.2991 - What incineration units must I address in my State plan?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false What incineration units must I address... and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9, 2004 Applicability of State Plans § 60.2991 What incineration units must I address in my State...

  9. Addressing chronic operational issues at the W. M. Keck Observatory

    Science.gov (United States)

    Nordin, Tom; Matsuda, Richard

    2016-07-01

    The W. M. Keck Observatory (WMKO) has a good track record of addressing large critical faults which impact observing. Our performance in tracking and correcting chronic minor faults has been mixed, yet this class of problems has a significant negative impact on scientific productivity and staff effectiveness. We have taken steps to address this shortcoming. This paper outlines the creation of a program to identify, categorize and rank these chronic operational issues, track them over time, and develop management options for their resolution. The success of the program at identifying these chronic operational issues and the advantages of dedicating observatory resources to this endeavor are presented.

  10. Harvesting Information from Heterogeneous Sources

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    The abundance of information regarding any topic makes the Internet a very good resource. Even though searching the Internet is very easy, what remains difficult is to automate the process of information extraction from the available online information, due to the lack of structure and the diversity in the sharing methods. Most of the time, information is stored in different proprietary formats, complying with different standards and protocols, which makes tasks like data mining and information harvesting very difficult. In this paper, an information harvesting tool (heteroHarvest) is presented with the objective of addressing these problems by filtering the useful information and then normalizing it into a single non-hypertext format. We also discuss state-of-the-art tools along with their shortcomings and present the results of an analysis carried out over different heterogeneous formats, along with the performance of our tool with respect to each format. Finally, the different potential applications of the proposed tool are discussed, with special emphasis on open source intelligence.

  11. Use of Sources and SNM in COE Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Archuleta, Jeffrey Christopher [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-07

    This procedure describes how to use Special Nuclear Material (SNM) and sources in order to perform measurements in a safe manner. The hazards and controls associated with these activities are addressed in this document.

  12. DOA Estimation of Audio Sources in Reverberant Environments

    DEFF Research Database (Denmark)

    Jensen, Jesper Rindom; Nielsen, Jesper Kjær; Heusdens, Richard

    2016-01-01

    Reverberation is well known to have a detrimental impact on many localization methods for audio sources. We address this problem by imposing a model for the early reflections as well as a model for the audio source itself. Using these models, we propose two iterative localization methods that estimate the direction-of-arrival (DOA) of both the direct path of the audio source and the early reflections. In these methods, the contribution of the early reflections is essentially subtracted from the signal observations before localization of the direct path component, which may reduce the estimation...

  13. 33 CFR 135.9 - Fund address.

    Science.gov (United States)

    2010-07-01

    ... FINANCIAL RESPONSIBILITY AND COMPENSATION OFFSHORE OIL POLLUTION COMPENSATION FUND General § 135.9 Fund address. The address to which correspondence relating to the Coast Guard's administration of the Fund... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Fund address. 135.9 Section 135.9...

  14. Open-source intelligence and privacy by design

    NARCIS (Netherlands)

    Koops, B.J.; Hoepman, J.H.; Leenes, R.

    2013-01-01

    As demonstrated by other papers on this issue, open-source intelligence (OSINT) by state authorities poses challenges for privacy protection and intellectual-property enforcement. A possible strategy to address these challenges is to adapt the design of OSINT tools to embed normative requirements,

  15. Compact X-ray Light Source Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Thevuthasan, Suntharampillai; Evans, James E.; Terminello, Louis J.; Koppenaal, David W.; Manke, Kristin L.; Plata, Charity

    2012-12-01

    This report, produced jointly by EMSL and FCSD, is the result of a workshop held in September 2011 that examined the utility of a compact x-ray light source (CXLS) in addressing many scientific challenges critical to advancing energy science and technology.

  16. Sustainable use of water resources with remote sensing data and Open Source software

    Directory of Open Access Journals (Sweden)

    Pasquale Nino

    2009-03-01

    Sustainable agricultural water use by means of open source decision support systems and Earth Observation data. The PLEIADeS project (funded by the European VI Framework Programme) addresses the improvement of water use and management in agriculture through innovative Information Technologies and the most recent Earth Observation (EO) methodologies. Within this framework, a tool based on Open Source software has been realized that aims to help water managers optimize water consumption. One of the key features of the system is the delivery of a near real-time irrigation schedule to farmers, produced by integrating EO-derived products and field data in a GIS environment that provides a reliable crop requirement estimation at farm level.

  18. Assessing error sources for Landsat time series analysis for tropical test sites in Viet Nam and Ethiopia

    Science.gov (United States)

    Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio

    2013-10-01

    Researchers who use remotely sensed data can spend half of their total effort analysing prior data. If this preprocessing does not match the application, the time spent on data analysis can increase considerably and can lead to inaccuracies. Despite the existence of a number of methods for preprocessing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. The requirements of mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Deforestation and forest Degradation (REDD) program make accurate reporting of the spatio-temporal properties of these changes necessary. We compared the impact of three fundamentally different radiometric preprocessing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of Breaks For Additive Season and Trend (BFAST) Monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the preprocessing methods for the forest change drivers present was assessed using recently captured ground truth and high-resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources identified haze as a major source of commission error in time series analysis.
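
    Of the three radiometric techniques compared, dark object subtraction is simple enough to sketch directly. The following is a minimal illustration of the DOS idea only, not the study's actual processing chain: the darkest pixels in a band are assumed to have near-zero reflectance, so their value estimates the additive atmospheric (haze) signal, which is subtracted from the whole band. The band values and percentile choice are hypothetical.

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.01):
    """Simple DOS: subtract the 'dark object' value (a low percentile
    of the band) so the darkest pixels map to ~zero reflectance.
    Any negative results are clipped to zero."""
    dark_value = np.percentile(band, percentile)
    return np.clip(band - dark_value, 0, None)

# Toy band: a scene with an additive haze offset of 50 counts.
rng = np.random.default_rng(0)
band = rng.integers(100, 200, size=(4, 4)).astype(float) + 50.0
corrected = dark_object_subtraction(band)
```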

  19. Addressing the nuclear misconception

    International Nuclear Information System (INIS)

    Taylor, J.J.

    1998-01-01

    There is a perception, fostered and encouraged by the anti-nuclear groups, that the nuclear industry generates large quantities of waste with no idea how to deal with it, and that it is unsafe, uneconomic, and environmentally damaging. The task is to change these perceptions by demonstrating that the industry is not a problem in itself, but in fact provides solutions to problems. This paper, while primarily concerned with waste, addresses all of these issues, as each has a bearing on the perception of the industry and therefore must be considered when addressing the issue of waste. The paper concludes that evidence exists to support the industry view, but that the mission of the industry should be to change the perception of the industry by influencing and working together with its stakeholders to address their concerns, rather than merely presenting more and more facts. (author)

  20. Risk perceptions, general environmental beliefs, and willingness to address climate change

    International Nuclear Information System (INIS)

    O'Connor, R.E.; Bord, R.J.; Fisher, A.

    1999-01-01

    The research reported here examines the relationship between risk perceptions and willingness to address climate change. The data are a national sample of 1,225 mail surveys that include measures of risk perceptions and knowledge tied to climate change, support for voluntary and government actions to address the problem, general environmental beliefs, and demographic variables. Risk perceptions matter in predicting behavioral intentions. Risk perceptions are not a surrogate for general environmental beliefs, but have their own power to account for behavioral intentions. There are four secondary conclusions. First, behavioral intentions regarding climate change are complex and intriguing. People are neither nonbelievers who will take no initiatives themselves and oppose all government efforts, nor are they believers who promise both to make personal efforts and to vote for every government proposal that promises to address climate change. Second, there are separate demographic sources for voluntary actions compared with voting intentions. Third, recognizing the causes of global warming is a powerful predictor of behavioral intentions, independent from believing that climate change will happen and have bad consequences. Finally, the success of the risk perception variables in accounting for behavioral intentions should encourage greater attention to risk perceptions as independent variables. Risk perceptions and knowledge, however, share the stage with general environmental beliefs and demographic characteristics. Although related, risk perceptions, knowledge, and general environmental beliefs are somewhat independent predictors of behavioral intentions.

  1. A flexible analog memory address list manager/controller for PHENIX

    International Nuclear Information System (INIS)

    Ericson, M.N.; Walker, J.W.; Britton, C.L.; Wintenberg, A.L.; Young, G.R.

    1995-01-01

    A programmable analog memory address list manager/controller has been developed for use with all analog memory-based detector subsystems of PHENIX. The unit provides simultaneous read/write control, cell write-over protection for both a Level-1 trigger decision delay and digitization latency, and re-ordering of AMU addresses following conversion, at a beam crossing rate of 112 ns. Addresses are handled such that up to 5 Level-1 events can be maintained in the AMU without write-over. Data tagging is implemented for handling overlapping and shared beam event data packets. Full usage in all PHENIX analog memory-based detector subsystems is accomplished through detector-specific programmable parameters: the number of data samples per valid Level-1 trigger and the sample spacing. Architectural candidates for the system are discussed with emphasis on implementation implications. Details of the design are presented, including design simulations, timing information, and test results from a full implementation using programmable logic devices.
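
    As a rough software analogy for the address-handling scheme described above (the actual unit is programmable hardware logic), the sketch below keeps a pool of free analog-memory cell addresses and protects cells captured by a Level-1 trigger from write-over until they are released after digitization. All names and sizes are illustrative.

```python
from collections import deque

class AddressListManager:
    """Toy model of an analog-memory (AMU) address list manager:
    free cell addresses are recycled for incoming samples, while
    cells captured by a Level-1 trigger are protected from
    write-over until the event has been digitized and released."""

    def __init__(self, n_cells):
        self.free = deque(range(n_cells))  # writable cell addresses
        self.pending = deque()             # protected cells, per event

    def next_write_address(self):
        """The oldest free cell is written next and re-queued for reuse."""
        addr = self.free.popleft()
        self.free.append(addr)
        return addr

    def capture_event(self, n_samples):
        """Protect the n most recently written cells for one event."""
        cells = [self.free.pop() for _ in range(n_samples)]
        self.pending.append(cells)
        return cells

    def release_event(self):
        """After digitization, return the oldest event's cells to the pool."""
        self.free.extend(self.pending.popleft())

manager = AddressListManager(8)
written = [manager.next_write_address() for _ in range(3)]
event_cells = manager.capture_event(2)
```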

  2. Environmental research at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Kemner, K.M.; Boyanov, M.I.; Eng, P.; Fenter, P.; Heald, S.; Lai, B.; Lee, S.S.; Scheckel, K.G.; Skanthakumar, S.; Sutton, S.R.; Wilson, R.E.

    2010-01-01

    Because of the importance of probing the molecular-scale chemical and physical structure of environmental samples in their natural and often hydrated state, synchrotron radiation has been a powerful tool for environmental scientists for decades. Thus, the crucial role that a highly coherent and high-brightness hard X-ray source such as the Advanced Photon Source (APS) can play in addressing many of the outstanding questions in molecular environmental science (MES) was recognized even before 'first light' at the facility. No single synchrotron-based technique or experimental approach can adequately address the tremendous temporal and spatial heterogeneities of the chemistry, physics, and biology of natural environmental samples. Thus, it is common at the APS that multiple X-ray techniques and experimental systems are employed to investigate environmental samples, often chosen for their ability to focus on solute species, plants, microbes, organics, interfacial species, or solids.

  3. A phantom for verification of dwell position and time of a high dose rate brachytherapy source

    International Nuclear Information System (INIS)

    Madebo, M.; Kron, T.; Pillainayagam, J.; Franich, R.

    2012-01-01

    Accuracy of dwell position and reproducibility of dwell time are critical in high dose rate (HDR) brachytherapy. A phantom was designed to verify dwell position and dwell time reproducibility for an Ir-192 HDR stepping source using Computed Radiography (CR). The central part of the phantom, incorporating thin alternating strips of lead and acrylic, was used to measure dwell positions. The outer part of the phantom features recesses containing different absorber materials (lead, aluminium, acrylic and polystyrene foam), and was used for determining reproducibility of dwell times. Dwell position errors of <1 mm were easily detectable using the phantom. The effect of bending a transfer tube was studied with this phantom and no change of clinical significance was observed when varying the curvature of the transfer tube in typical clinical scenarios. Changes of dwell time as low as 0.1 s, the minimum dwell time of the treatment unit, could be detected by choosing dwell times over the four materials that produce identical exposure at the CR detector.

  4. Fast temperature optimization of multi-source hyperthermia applicators with reduced-order modeling of 'virtual sources'

    International Nuclear Information System (INIS)

    Cheng, K-S; Stakhursky, Vadim; Craciunescu, Oana I; Stauffer, Paul; Dewhirst, Mark; Das, Shiva K

    2008-01-01

    The goal of this work is to build the foundation for facilitating real-time magnetic resonance image guided patient treatment for heating systems with a large number of physical sources (e.g. antennas). Achieving this goal requires knowledge of how the temperature distribution will be affected by changing each source individually, which requires time expenditure on the order of the square of the number of sources. To reduce computation time, we propose a model reduction approach that combines a smaller number of predefined source configurations (fewer than the number of actual sources) that are most likely to heat tumor. The source configurations consist of magnitude and phase source excitation values for each actual source and may be computed from a CT scan based plan or a simplified generic model of the corresponding patient anatomy. Each pre-calculated source configuration is considered a 'virtual source'. We assume that the actual best source settings can be represented effectively as weighted combinations of the virtual sources. In the context of optimization, each source configuration is treated equivalently to one physical source. This model reduction approach is tested on a patient upper-leg tumor model (with and without temperature-dependent perfusion), heated using a 140 MHz ten-antenna cylindrical mini-annular phased array. Numerical simulations demonstrate that using only a few pre-defined source configurations can achieve temperature distributions that are comparable to those from full optimizations using all physical sources. The method yields close to optimal temperature distributions when using source configurations determined from a simplified model of the tumor, even when tumor position is erroneously assumed to be ∼2.0 cm away from the actual position as often happens in practical clinical application of pre-treatment planning. The method also appears to be robust under conditions of changing, nonlinear, temperature-dependent perfusion. The
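
    The core of the model-reduction idea, expressing the full per-antenna excitation as a weighted combination of a few precomputed 'virtual sources', can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the matrix values are random stand-ins for real precomputed source configurations.

```python
import numpy as np

rng = np.random.default_rng(1)
n_antennas, n_virtual = 10, 3

# Columns are hypothetical pre-computed "virtual sources": complex
# magnitude/phase settings for all 10 physical antennas, e.g. from a
# CT-based plan or a simplified generic patient model.
V = (rng.standard_normal((n_antennas, n_virtual))
     + 1j * rng.standard_normal((n_antennas, n_virtual)))

def actual_excitation(weights):
    """Expand a small weight vector into full per-antenna settings.
    Optimization then searches over n_virtual complex weights instead
    of n_antennas individual magnitudes and phases."""
    return V @ weights

w = np.array([1.0, 0.5j, -0.25])
drive = actual_excitation(w)
```

    Treating each configuration as one equivalent source reduces the temperature-update cost, which grows with the square of the number of sources, from 10^2 to 3^2 evaluations in this toy setup.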

  5. Domestic energy sources urged as Middle East situation heats up

    International Nuclear Information System (INIS)

    Rodgers, L.M.

    1990-01-01

    This article discusses the alternatives to foreign oil as an energy source for the US in the light of the invasion of Kuwait by Iraq. Topics addressed include the responses of organizations representing various energy sources, the public response of the Department of Energy, the response of conservation advocates, and the Administration's reaction

  6. An Algorithm of Traffic Perception of DDoS Attacks against SOA Based on Time United Conditional Entropy

    Directory of Open Access Journals (Sweden)

    Yuntao Zhao

    2016-01-01

    DDoS attacks can prevent legitimate users from accessing a service by consuming the resources of the target nodes, exposing the availability of the network and its services to a significant threat. DDoS traffic perception is therefore the premise and foundation of whole-system security. In this paper, a method of DDoS traffic perception for SOA networks based on time united conditional entropy is proposed. Exploiting the many-to-one mapping between the source IP addresses and the destination IP address of a DDoS attack, the traffic characteristics of services are analyzed using conditional entropy. Introducing the time dimension gives the algorithm the ability to perceive DDoS attacks on SOA services. Simulation results show that the method can realize DDoS traffic perception by detecting abrupt variations of conditional entropy over time.
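
    A plain (not yet "time united") conditional entropy over source/destination address pairs already captures the many-to-one signature described above: H(dst | src) collapses toward zero when many sources all target one destination. The sketch below uses hypothetical addresses and omits the paper's time-dimension extension.

```python
from collections import Counter
from math import log2

def conditional_entropy(pairs):
    """H(dst | src) over (src, dst) address pairs in one time window."""
    n = len(pairs)
    src_counts = Counter(s for s, _ in pairs)
    pair_counts = Counter(pairs)
    h = 0.0
    for (s, d), c in pair_counts.items():
        h -= (c / n) * log2(c / src_counts[s])
    return h

# Normal window: each source talks to a variety of destinations.
normal = [("10.0.0.1", "198.51.100.1"), ("10.0.0.1", "198.51.100.2"),
          ("10.0.0.2", "198.51.100.3"), ("10.0.0.2", "198.51.100.4")]
# Attack window: many sources all hit one target (many-to-one mapping),
# so the conditional entropy collapses abruptly.
attack = [("10.0.0.%d" % i, "198.51.100.9") for i in range(1, 9)]
```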

  7. Realization and Addressing Analysis In Blockchain Bitcoin

    Science.gov (United States)

    Sakti Arief Daulay, Raja; Michrandi Nasution, Surya; Paryasto, Marisa W.

    2017-11-01

    This research implements and analyzes addressing in the Bitcoin blockchain, with results that bear on making Bitcoin addresses safe and on boosting their security, examining the working mechanism by which the blockchain system generates Bitcoin addresses.

  8. Locating single-point sources from arrival times containing large picking errors (LPEs): the virtual field optimization method (VFOM)

    Science.gov (United States)

    Li, Xi-Bing; Wang, Ze-Wei; Dong, Long-Jun

    2016-01-01

    Microseismic monitoring systems using local location techniques tend to be timely, automatic and stable. One basic requirement of these systems is the automatic picking of arrival times. However, arrival times generated by automated techniques always contain large picking errors (LPEs), which may make the location solution unreliable and cause the integrated system to be unstable. To overcome the LPE issue, we propose the virtual field optimization method (VFOM) for locating single-point sources. In contrast to existing approaches, the VFOM optimizes a continuous and virtually established objective function to search the space for the common intersection of the hyperboloids determined by sensor pairs, rather than the least residual between the model-calculated and measured arrivals. The results of numerical examples and on-site blasts show that the VFOM can obtain more precise and stable solutions than traditional methods when the input data contain LPEs. Furthermore, we discuss the impact of LPEs on objective functions to determine the LPE-tolerant mechanism, velocity sensitivity and stopping criteria of the VFOM. The proposed method is also capable of locating acoustic sources using passive techniques such as passive sonar detection and acoustic emission.
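
    The hyperboloid-intersection idea behind the VFOM can be illustrated with a toy 2-D example: each sensor pair's arrival-time difference constrains the source to one hyperbola, and a continuous objective summing the pairwise mismatches is minimized (here by coarse grid search). The sensor layout and wave speed are hypothetical, and the actual method's robust handling of large picking errors is omitted.

```python
import numpy as np

v = 5000.0  # assumed wave speed in m/s (hypothetical medium)
sensors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_src = np.array([30.0, 60.0])

# Synthetic arrival times at each sensor from the true source.
arrivals = np.linalg.norm(sensors - true_src, axis=1) / v

def objective(p):
    """Sum of squared mismatches between measured and modeled arrival-time
    differences over all sensor pairs; each pair constrains the source to
    one hyperbola, and the minimum lies at their common intersection."""
    r = np.linalg.norm(sensors - p, axis=1)
    err = 0.0
    for i in range(len(sensors)):
        for j in range(i + 1, len(sensors)):
            measured = arrivals[i] - arrivals[j]
            modeled = (r[i] - r[j]) / v
            err += (measured - modeled) ** 2
    return err

# Coarse grid search over the monitored area for the minimum.
grid = [(x, y) for x in range(0, 101, 5) for y in range(0, 101, 5)]
best = min(grid, key=lambda p: objective(np.array(p, dtype=float)))
```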

  9. Time-resolved X-ray diffraction with accelerator- and laser-plasma-based X-ray sources

    International Nuclear Information System (INIS)

    Nicoul, Matthieu

    2010-01-01

    Femtosecond X-ray pulses are a powerful tool to investigate atomic motions triggered by femtosecond pump pulses. This thesis is dedicated to the production of such pulses and their use in optical pump - X-ray probe measurements. It describes the laser-plasma-based sources available at the University of Duisburg-Essen, including the design, construction and characterization of a new 'modular' X-ray source dedicated to optimizing the X-ray flux onto the sample under investigation. Acoustic wave generation in femtosecond optically excited semiconductor (gallium arsenide) and metal (gold) samples was studied using the sources at the University of Duisburg-Essen. The physical response of the material was modeled by a simple strain model for the semiconductor and a pressure model for the metal, in order to gain information on the interplay of the electronic and thermal pressures arising after excitation. Whereas no reliable information could be obtained in gallium arsenide (principally due to the use of a bulk sample), the model for gold achieved very good agreement and provided useful information. The relaxation time for electron-to-lattice energy transfer was found to be (5.0±0.3) ps, and the ratio of the Grueneisen parameters was found to be γe / γi = (0.5±0.1). This thesis also describes the Sub-Picosecond Pulse Source (SPPS), an accelerator-based X-ray source that existed at the (formerly named) Stanford Linear Accelerator Center, and two measurements performed with it. The first is a detailed investigation of the softening of the A1g phonon mode launched in bismuth, as a function of excitation fluence. Detailed information concerning the new equilibrium position and phonon frequency was obtained over an extended range of laser pump fluences. The second measurement concerned the dynamics of a newly formed liquid phase following ultrafast melting in indium antimonide. The formation of the liquid phase and its development for excitations close to the

  10. Time-resolved X-ray diffraction with accelerator- and laser-plasma-based X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Nicoul, Matthieu

    2010-09-01

    Femtosecond X-ray pulses are a powerful tool to investigate atomic motions triggered by femtosecond pump pulses. This thesis is dedicated to the production of such pulses and their use in optical pump - X-ray probe measurements. It describes the laser-plasma-based sources available at the University of Duisburg-Essen, including the design, construction and characterization of a new 'modular' X-ray source dedicated to optimizing the X-ray flux onto the sample under investigation. Acoustic wave generation in femtosecond optically excited semiconductor (gallium arsenide) and metal (gold) samples was studied using the sources at the University of Duisburg-Essen. The physical response of the material was modeled by a simple strain model for the semiconductor and a pressure model for the metal, in order to gain information on the interplay of the electronic and thermal pressures arising after excitation. Whereas no reliable information could be obtained in gallium arsenide (principally due to the use of a bulk sample), the model for gold achieved very good agreement and provided useful information. The relaxation time for electron-to-lattice energy transfer was found to be (5.0±0.3) ps, and the ratio of the Grueneisen parameters was found to be γe / γi = (0.5±0.1). This thesis also describes the Sub-Picosecond Pulse Source (SPPS), an accelerator-based X-ray source that existed at the (formerly named) Stanford Linear Accelerator Center, and two measurements performed with it. The first is a detailed investigation of the softening of the A1g phonon mode launched in bismuth, as a function of excitation fluence. Detailed information concerning the new equilibrium position and phonon frequency was obtained over an extended range of laser pump fluences. The second measurement concerned the dynamics of a newly formed liquid phase following ultrafast melting in indium antimonide. The formation of the liquid phase

  11. Address Points, The Address Point layer contains an address point for almost every structure over 200 square feet and for some vacant properties. Attributes include addresses, sub-units, address use, LAT/LONG, 10-digit SDAT taxpins, political areas and more., Published in 2013, 1:2400 (1in=200ft) scale, Baltimore County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Address Points dataset current as of 2013. The Address Point layer contains an address point for almost every structure over 200 square feet and for some vacant...

  12. Governance through Economic Paradigms: Addressing Climate Change by Accounting for Health

    Directory of Open Access Journals (Sweden)

    Kristine Belesova

    2016-12-01

    Climate change is a major challenge for sustainable development, impacting human health, wellbeing, security, and livelihoods. While the post-2015 development agenda sets out action on climate change as one of the Sustainable Development Goals (SDGs), there is little provision on how this can be achieved in tandem with the desired economic progress and the required improvements in health and wellbeing. This paper examines synergies and tensions between the goals addressing climate change and economic progress. We identify reductionist approaches in economics, such as 'externalities', reliance on the metric of Gross Domestic Product, positive discount rates, and short-term profit targets, as some of the key sources of tension between these goals. Such reductionist approaches could be addressed by intersectoral governance mechanisms. Health in All Policies, health-sensitive macro-economic progress indicators, and accounting for long-term and non-monetary values are some of the approaches that could be adapted and used in governance for the SDGs. Policy framing of climate change and similar issues should facilitate the development of intersectoral governance approaches.
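
    The role of positive discount rates in the tension described above is easy to make concrete: at a conventional positive rate, a health benefit realized decades from now is worth only a small fraction of its face value today, whereas a zero rate weights future and present health equally. The figures below are illustrative only.

```python
def present_value(benefit, years, rate):
    """Discounted present value of a benefit received `years` ahead."""
    return benefit / (1.0 + rate) ** years

# A health benefit worth 100 (arbitrary units) realized 50 years from now
# shrinks to roughly a fifth of its face value at a conventional 3% rate,
# while a zero discount rate preserves it fully.
pv_discounted = present_value(100.0, 50, 0.03)
pv_zero = present_value(100.0, 50, 0.0)
```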

  13. Addressing earthquakes strong ground motion issues at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, I.G.; Silva, W.J.; Stark, C.L.; Jackson, S.; Smith, R.P.

    1991-01-01

    In the course of reassessing seismic hazards at the Idaho National Engineering Laboratory (INEL), several key issues have been raised concerning the effects of the earthquake source and site geology on potential strong ground motions that might be generated by a large earthquake. The design earthquake for the INEL is an approximate moment magnitude (Mw) 7 event that may occur on the southern portion of the Lemhi fault, a Basin and Range normal fault that is located on the northwestern boundary of the eastern Snake River Plain and the INEL, within 10 to 27 km of several major facilities. Because the locations of these facilities place them at close distances to a large earthquake and generally along strike of the causative fault, the effects of source rupture dynamics (e.g., directivity) could be critical in enhancing potential ground shaking at the INEL. An additional source issue that has been addressed is the value of stress drop to use in ground motion predictions. In terms of site geology, it has been questioned whether the interbedded volcanic stratigraphy beneath the ESRP and the INEL attenuates ground motions to a greater degree than a typical rock site in the western US. These three issues have been investigated employing a stochastic ground motion methodology which incorporates the Band-Limited-White-Noise source model for both a point source and finite fault, random vibration theory and an equivalent linear approach to model soil response.

  14. Lead-acid batteries life time prolongation in renewable energy source plants

    Directory of Open Access Journals (Sweden)

    Костянтин Ігорович Ткаченко

    2015-11-01

    Charge controllers with microprocessor control are recognized as nearly optimal devices for controlling the collection and storage of energy in batteries in power systems with renewable energy sources such as solar photovoltaic panels and wind generators. The task of the controller is charging process control: charging and discharging the batteries while providing maximum charging speed and keeping the parameters that characterize the state of the battery within certain limits, preventing overcharging, overheating and deep discharge. The ability to archive the time dependence of battery parameters is also important. This study introduces the concept of a charge controller based on the Texas Instruments MSP430G2553 microcontroller. The program stored in the microcontroller ROM provides: a charge regime (with a particular algorithm); a control and training cycle followed by charging; and a continuous charge-discharge regime to restore the battery or to study the influence of charge regime algorithms on repair effectiveness. The device can perform its functions without being connected to a personal computer, but such a connection makes it possible to observe in real time a number of discharge and charge regime parameters, as well as to read the stored data from the microcontroller flash memory and save them to the PC hard disk for further analysis. A four-stage charging algorithm with a reverse charging regime is proposed by the author, and its correctness is demonstrated.

  15. 21 CFR 1321.01 - DEA mailing addresses.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false DEA mailing addresses. 1321.01 Section 1321.01 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF JUSTICE DEA MAILING ADDRESSES § 1321.01 DEA mailing addresses. The following table provides information regarding mailing addresses to be used...

  16. IP Address Management Principles and Practice

    CERN Document Server

    Rooney, Timothy

    2010-01-01

    This book will be the first covering the subject of IP address management (IPAM). The practice of IPAM includes the application of network management disciplines to IP address space and associated network services, namely DHCP (Dynamic Host Configuration Protocol) and DNS (Domain Name System). The consequence of inaccurately configuring DHCP is that end users may not be able to obtain IP addresses to access the network. Without proper DNS configuration, usability of the network will greatly suffer as the name-to-address lookup process may fail. Imagine having to navigate to a website or send a
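
    The forward/reverse lookup coupling that makes DNS configuration part of IPAM can be illustrated with Python's standard ipaddress module, which derives the reverse-zone (PTR) record name that an address-to-name lookup would query. The addresses shown are reserved documentation examples, not real hosts.

```python
import ipaddress

# An IPAM system must keep forward (name -> address) and reverse
# (address -> name) DNS zones consistent. The standard library can
# derive the reverse-zone PTR record name for any address.
addr = ipaddress.ip_address("192.0.2.10")      # IPv4 documentation range
ptr = addr.reverse_pointer

v6 = ipaddress.ip_address("2001:db8::1")       # IPv6 documentation range
ptr6 = v6.reverse_pointer
```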

  17. Environmental impact of non-conventional energy sources

    International Nuclear Information System (INIS)

    Abbasi, S.A.; Abbasi, Naseema; Nipaney, P.C.; Ramasamy, E.V.

    1995-01-01

    Whereas global attention has always been focused on the adverse environmental impacts of conventional energy sources, only a few studies have examined the clean-environment image of non-conventional energy sources, particularly the renewable ones. The question of whether non-conventional sources are really as benign as they are made out to be is addressed in the present paper against the background of the classical paradigm developed by Lovins, which postulated the hard (malignant) and soft (benign) energy concepts in the first place. The paper then assesses the likely environmental impacts of several major non-conventional energy sources and comes up with a note of caution: in many cases the adverse impacts may not be insubstantial; indeed, in some cases they can be as strongly negative as the impacts of conventional energy sources. (author). 31 refs

  18. Real-time rockmass response from microseismics

    Energy Technology Data Exchange (ETDEWEB)

    Andrew King; Michael Lofgren; Matt van de Werken [CSIRO Exploration and Mining (Australia)

    2009-06-15

    The primary objective of this project was to develop a prototype real-time microseismic monitoring system for strata control management and forewarning of geotechnical hazards. Power and communications problems have been addressed by developing a wirelessly connected network of solar-powered acquisition nodes, one at the top of each instrumented borehole. The open-source 'earthworm' earthquake acquisition software, which can run on different hardware platforms and use different acquisition cards, was modified for use in a coal environment by developing special new arrival-picking and event-location procedures. The system was field-trialled at Moranbah North mine. The acquisition software performed well, as did wireless communications and solar power. There were issues with the acquisition hardware selected, including problems with timing synchronisation, which is essential for seismic event location. Although these were fixed during the test, different hardware is likely to be used in future installations.

  19. Science, Practitioners and Faith Communities: using TEK and Faith Knowledge to address climate issues.

    Science.gov (United States)

    Peterson, K.

    2017-12-01

    Worldview, Lifeway and Science - Communities that are tied to the land or water for their livelihood, and for whom subsistence guides their cultural lifeway, have knowledges that inform their interactions with the environment. These frameworks, sometimes called Traditional Ecological Knowledges (TEK), are based on generations of observations made and shared within lived life-environmental systems, and are tied to practitioners' broader worldviews. Subsistence communities, including Native American tribes, are well aware of the crises caused by climate change impacts. These communities are working on ways to integrate knowledge from their ancient ways with current observations and methods from Western science to implement appropriate adaptation and resilience measures. In the delta region of south Louisiana, the communities hold worldviews that blend TEK, climate science and faith-derived concepts. It is not incongruent for the communities to intertwine conversations from complex and diverse sources, including the academy, to inform their adaptation measures and their imagined solutions. Drawing on over twenty years of work with local communities, science organizations and faith institutions of the lower bayou region of Louisiana, the presenter will address the complexity of traditional communities' work with diverse sources of knowledge to guide local decision-making and to assist outside partners to more effectively address challenges associated with climate change.

  20. Tsunami forecast by joint inversion of real-time tsunami waveforms and seismic or GPS data: application to the Tohoku 2011 tsunami

    Science.gov (United States)

    Yong, Wei; Newman, Andrew V.; Hayes, Gavin P.; Titov, Vasily V.; Tang, Liujuan

    2014-01-01

    Correctly characterizing tsunami source generation is the most critical component of modern tsunami forecasting. Although difficult to quantify directly, a tsunami source can be modeled via different methods using a variety of measurements from deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some of which are available in or near real time. Here we assess the performance of different source models for the destructive 11 March 2011 Japan tsunami using model–data comparison for the generation, propagation, and inundation in the near field of Japan. This comparative study of tsunami source models addresses the advantages and limitations of different real-time measurements with potential use in early tsunami warning in the near and far field. The study highlights the critical role of deep-ocean tsunami measurements and rapid validation of the approximate tsunami source for high-quality forecasting. We show that these tsunami measurements are compatible with other real-time geodetic data, and may provide more insightful understanding of tsunami generation from earthquakes, as well as from nonseismic processes such as submarine landslide failures.

  1. BIOSENSOR TECHNOLOGY EVALUATIONS FOR REAL-TIME/SOURCE WATER PROTECTION

    Science.gov (United States)

    Recent advances in electronics and computer technology have made great strides in the field of remote sensing and biomonitoring. The quality of drinking water sources has come under closer scrutiny in recent years. Issues ranging from ecological to public health and national se...

  2. The Earthquake‐Source Inversion Validation (SIV) Project

    Science.gov (United States)

    Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.

  3. Transient soft X-ray sources

    International Nuclear Information System (INIS)

    Hayakawa, S.; Murakami, T.; Nagase, F.; Tanaka, Y.; Yamashita, K.

    1976-01-01

    A rocket observation of cosmic soft X-rays suggests the existence of transient, recurrent soft X-ray sources which were found to be variable during the flight time of the rocket. Some of the soft X-ray sources thus far reported are considered to be of this type. These sources are listed and their positions are shown. (Auth.)

  4. Optically addressed ultra-wideband phased antenna array

    Science.gov (United States)

    Bai, Jian

    Demands for high data rate and multifunctional apertures from both civilian and military users have motivated development of ultra-wideband (UWB) electrically steered phased arrays. Meanwhile, the need for large contiguous frequency is pushing operation of radio systems into the millimeter-wave (mm-wave) range. Therefore, modern radio systems require UWB performance from VHF to mm-wave. However, traditional electronic systems suffer many challenges that make achieving these requirements difficult. Several examples include: voltage-controlled oscillators (VCOs) cannot provide a tunable range of several octaves, distribution of wideband local-oscillator signals suffers high loss and dispersion through RF transmission lines, and antennas have very limited bandwidth or bulky size. Recently, RF photonics technology has drawn considerable attention because of its advantages over traditional systems, with the capability of offering extreme power efficiency, information capacity, frequency agility, and spatial beam diversity. A hybrid RF photonic communication system utilizing optical links and an RF transducer at the antenna potentially provides ultra-wideband data transmission, i.e., over 100 GHz. A successful implementation of such an optically addressed phased array requires addressing several key challenges. Photonic generation of an RF source with over a seven-octave bandwidth has been demonstrated in the last few years. However, one challenge which still remains is how to convey phased optical signals to downconversion modules and antennas. Therefore, a feed network with phase sweeping capability and low excess phase noise needs to be developed. Another key challenge is to develop an ultra-wideband array antenna. Modern frontends require antennas to be compact, planar, and low-profile in addition to possessing broad bandwidth, conforming to stringent space, weight, cost, and power constraints. To address these issues, I will study broadband and miniaturization

  5. Chairman's address

    International Nuclear Information System (INIS)

    Foster, J.S.

    1981-06-01

    There is no intrinsic demand for energy separable from supply; the demand for a resource reflects its availability. The Conservation Commission of the World Energy Conference used one study that predicted that if economic growth rates were to remain comparable to those of the last forty years, eight times the 1975 level of energy supply would be required by 2020. An alternative view suggested that if the income elasticity for energy declined with increasing economic development, there would be a fourfold increase in energy demand by 2020. Energy supply today and what it might be in 40 years is examined. The energy use of biomass will probably increase in certain areas of the world, but it is unlikely that there will be an overall growth in supply from this source. Energy production from fossil fuels may double between now and the early years of the next century, but unless the recovery rate of coal is increased drastically, growth will peak by mid-century. Hydro-electric and geothermal power may quadruple over the next forty years, but this will not represent a big increase in supply. Solar energy can make only a modest contribution in the next few decades. If there is to be a major increase in world energy supply, it must come from nuclear energy. The Conservation Commission felt that the potential exists for world energy supply to double between 1980 and 2000, with a further 50 percent growth to 2020. There will most likely be a 50 percent growth in per capita energy use around the world, with a 3 to 4 percent annual growth rate in developing countries and perhaps the same in industrialized countries for the first 20 years, but about half that growth rate towards the end of the time period.

  6. Time for a change: addressing R&D and commercialization challenges for antibacterials

    Science.gov (United States)

    Payne, David J.; Miller, Linda Federici; Findlay, David; Anderson, James; Marks, Lynn

    2015-01-01

    The antibacterial therapeutic area has been described as the perfect storm. Resistance is increasing to the point that our hospitals encounter patients infected with untreatable pathogens, the overall industry pipeline is described as dry and most multinational pharmaceutical companies have withdrawn from the area. Major contributing factors to the declining antibacterial industry pipeline include scientific challenges, clinical/regulatory hurdles and low return on investment. This paper examines these challenges and proposes approaches to address them. There is a need for a broader scientific agenda to explore new approaches to discover and develop antibacterial agents. Additionally, ideas of how industry and academia could be better integrated will be presented. While promising progress in the regulatory environment has been made, more streamlined regulatory paths are still required and the solutions will lie in global harmonization and clearly defined guidance. Creating the right incentives for antibacterial research and development is critical and a new commercial model for antibacterial agents will be proposed. One key solution to help resolve both the problem of antimicrobial resistance (AMR) and lack of new drug development are rapid, cost-effective, accurate point of care diagnostics that will transform antibacterial prescribing and enable more cost-effective and efficient antibacterial clinical trials. The challenges of AMR are too great for any one group to resolve and success will require leadership and partnerships among academia, industry and governments globally. PMID:25918443

  7. Forms, Sources and Processes of Trust

    NARCIS (Netherlands)

    Nooteboom, B.

    2006-01-01

    This chapter reviews some key points in the analysis of trust, based on Nooteboom (2002). The following questions are addressed. What can we have trust in? What is the relation between trust and control? What are the sources of trust? And what are its limits? By what process is trust built up and broken?

  8. Beamspace fast fully adaptive brain source localization for limited data sequences

    International Nuclear Information System (INIS)

    Ravan, Maryam

    2017-01-01

    In the electroencephalogram (EEG) or magnetoencephalogram (MEG) context, brain source localization methods that rely on estimating second order statistics often fail when the observations are taken over a short time interval, especially when the number of electrodes is large. To address this issue, in a previous study, we developed a multistage adaptive processing called the fast fully adaptive (FFA) approach that can significantly reduce the required sample support while still processing all available degrees of freedom (DOFs). This approach processes the observed data in stages through a decimation procedure. In this study, we introduce a new form of the FFA approach called beamspace FFA. We first divide the brain into smaller regions and transform the measured data from the source space to the beamspace in each region. The FFA approach is then applied to the beamspaced data of each region. The goal of this modification is to benefit from the reduced correlation sensitivity between sources in different brain regions. To demonstrate the performance of the beamspace FFA approach in the limited data scenario, simulation results with multiple deep and cortical sources as well as experimental results are compared with the regular FFA and the widely used FINE approaches. Both simulation and experimental results demonstrate that the beamspace FFA method can localize different types of multiple correlated brain sources at low signal-to-noise ratios more accurately with limited data. (paper)
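The per-region beamspace transformation described in the abstract can be sketched as projecting the sensor-space measurements onto a low-rank basis for each region. The SVD-based construction of the basis, the rank, and the array shapes below are illustrative assumptions, not the paper's exact recipe.

```python
import numpy as np

def beamspace_project(data, region_leadfields, rank=3):
    """Project sensor-space EEG/MEG data into a per-region beamspace.

    `region_leadfields` holds one (n_sensors x n_sources) leadfield matrix
    per brain region; the dominant left singular vectors of each leadfield
    span that region's beamspace.  Rank and the SVD construction are
    assumptions made for illustration.
    """
    projected = []
    for L in region_leadfields:
        U, _, _ = np.linalg.svd(L, full_matrices=False)
        B = U[:, :rank]                # n_sensors x rank beamspace basis
        projected.append(B.T @ data)   # rank x n_samples regional data
    return projected

# Toy example: 6 sensors, 10 time samples, two regions of 4 sources each.
rng = np.random.default_rng(0)
data = rng.standard_normal((6, 10))
leadfields = [rng.standard_normal((6, 4)) for _ in range(2)]
regional = beamspace_project(data, leadfields)
print([r.shape for r in regional])  # → [(3, 10), (3, 10)]
```

Each region's data are thus reduced from 6 sensor channels to a 3-dimensional beamspace before the FFA stages are applied.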

  9. Moderated discussion: Are we addressing the "real" issue?

    Energy Technology Data Exchange (ETDEWEB)

    Feher, M. [Atomic Energy of Canada Ltd., Chalk River, ON (Canada)]

    1997-09-01

    Session 5a was a moderated discussion on the topic of "Are we addressing the 'real' issue?" The Moderator opened the session with the following question areas to stimulate discussion. Questions Part 1: Is the concept of alarms and alarm systems still valid? Are we designing for physical features rather than information that has to be conveyed? Are we addressing the essential annunciation needs or are we attempting to implement patch-work solutions to solve specific problems? Is the design process so firmly established in organizations that a major change is required to result in different and improved approaches? Will the cost of increasing scrutiny for Software QA make advancement impossible or too costly? What is the role of the overview display in accident management and how do imbedded alarms play a role? Is the need for reliable signals adequately addressed (or can it be)? Questions Part 2: Should we include automated diagnosis and decision making with annunciation? What is the role of the operator? Is the operator someone who only follows fixed procedures, or is he/she a responsible authority, or both? Does the focus on safety-first divert attention away from other important issues, such as operational efficiency? Does the concept of "hard-wired" annunciation still apply given advancements in the reliability of computer systems? How do we shorten the design and implementation time period cost-effectively while still improving performance?

  10. Moderated discussion: Are we addressing the "real" issue?

    International Nuclear Information System (INIS)

    Feher, M.

    1997-01-01

    Session 5a was a moderated discussion on the topic of "Are we addressing the 'real' issue?" The Moderator opened the session with the following question areas to stimulate discussion. Questions Part 1: Is the concept of alarms and alarm systems still valid? Are we designing for physical features rather than information that has to be conveyed? Are we addressing the essential annunciation needs or are we attempting to implement patch-work solutions to solve specific problems? Is the design process so firmly established in organizations that a major change is required to result in different and improved approaches? Will the cost of increasing scrutiny for Software QA make advancement impossible or too costly? What is the role of the overview display in accident management and how do imbedded alarms play a role? Is the need for reliable signals adequately addressed (or can it be)? Questions Part 2: Should we include automated diagnosis and decision making with annunciation? What is the role of the operator? Is the operator someone who only follows fixed procedures, or is he/she a responsible authority, or both? Does the focus on safety-first divert attention away from other important issues, such as operational efficiency? Does the concept of "hard-wired" annunciation still apply given advancements in the reliability of computer systems? How do we shorten the design and implementation time period cost-effectively while still improving performance?

  11. Status report on the ADVANCED LIGHT SOURCE control system

    International Nuclear Information System (INIS)

    Magyary, S.; Chin, M.; Fahmie, M.; Lancaster, H.; Molinari, P.; Robb, A.; Timossi, C.; Young, J.

    1992-01-01

    This paper is a status report on the ADVANCED LIGHT SOURCE (ALS) control system. The current status, performance data, and future plans will be discussed. Manpower, scheduling, and costs issues are addressed. (author)

  12. Environmental impacts of renewable energy sources

    International Nuclear Information System (INIS)

    Abbasi, S.A.; Abbasi, N.

    1997-01-01

    The global attention has always been focused on the adverse environmental impacts of conventional energy sources. In contrast, nonconventional energy sources, particularly the renewable ones, have enjoyed a clean image vis-à-vis environmental impacts. The only major exception to this general trend has been large hydropower projects; experience has taught that they can be disastrous for the environment. The belief now is that mini-hydro and micro-hydro projects are harmless alternatives. But are renewable energy sources really as benign as is widely believed? The present essay addresses this question against the background of Lovins' classical paradigm, which postulated the hard (malignant) and soft (benign) energy concepts in the first place. It then critically evaluates the environmental impacts of major renewable energy sources and comes to the broad conclusion that renewable energy sources are not the panacea they are popularly perceived to be; indeed, in some cases their adverse environmental impacts can be as strongly negative as the impacts of conventional energy sources. The paper also dwells on the steps needed to utilize renewable energy sources without facing environmental backlashes of the type experienced with hydropower projects.

  13. Using imputation to provide location information for nongeocoded addresses.

    Directory of Open Access Journals (Sweden)

    Frank C Curriero

    2010-02-01

    Full Text Available The importance of geography as a source of variation in health research continues to receive sustained attention in the literature. The inclusion of geographic information in such research often begins by adding data to a map, which is predicated on some knowledge of location. A precise level of spatial information is conventionally achieved through geocoding, the geographic information system (GIS) process of translating mailing address information to coordinates on a map. The geocoding process is not without its limitations, though, since there is always a percentage of addresses which cannot be converted successfully (nongeocodable). This raises concerns regarding bias since traditionally the practice has been to exclude nongeocoded data records from analysis. In this manuscript we develop and evaluate a set of imputation strategies for dealing with missing spatial information from nongeocoded addresses. The strategies are developed assuming a known zip code, with increasing use of collateral information, namely the spatial distribution of the population at risk. Strategies are evaluated using prostate cancer data obtained from the Maryland Cancer Registry. We consider total case enumerations at the Census county, tract, and block group level as the outcome of interest when applying and evaluating the methods. Multiple imputation is used to provide estimated total case counts based on complete data (geocodes plus imputed nongeocodes) with a measure of uncertainty. Results indicate that the imputation strategy based on using available population-based age, gender, and race information performed the best overall at the county, tract, and block group levels. The procedure allows the potentially biased and likely underreported outcome, case enumerations based on only the geocoded records, to be presented with a statistically adjusted count (imputed count) and a measure of uncertainty that are based on all the case data, the geocodes and imputed
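The population-weighted imputation strategy described above can be sketched as repeated weighted draws over the block groups of a case's known zip code. The block-group names and population figures below are hypothetical; the study's actual weights come from census age, gender, and race counts.

```python
import random
from collections import Counter

def impute_block_groups(n_missing, pop_at_risk, n_imputations=100, seed=0):
    """Multiply impute block-group locations for nongeocoded cases.

    Each imputation assigns every nongeocoded case to a block group with
    probability proportional to its at-risk population; repeating the draw
    yields a distribution of completed counts (and hence a measure of
    uncertainty) rather than a single guess.
    """
    rng = random.Random(seed)
    groups = list(pop_at_risk)
    weights = [pop_at_risk[g] for g in groups]
    return [Counter(rng.choices(groups, weights=weights, k=n_missing))
            for _ in range(n_imputations)]

pop = {"BG-1": 1200, "BG-2": 300, "BG-3": 4500}  # hypothetical block groups
draws = impute_block_groups(n_missing=10, pop_at_risk=pop)
# mean imputed count per block group across the 100 completed data sets
avg = {g: sum(d[g] for d in draws) / len(draws) for g in pop}
```

The spread of counts across the imputed data sets provides the uncertainty attached to the statistically adjusted totals.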

  14. Novel technique for addressing streak artifact in gated dual-source MDCT angiography utilizing ECG-editing

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Laura T.; Boll, Daniel T. [Duke University Medical Center, Department of Radiology, Box 3808, Durham, NC (United States)

    2008-11-15

    Streak artifact is an important source of image degradation in computed tomographic imaging. In coronary MDCT angiography, streak artifact from pacemaker leads in the SVC can render segments of the right coronary artery uninterpretable. With current technology in clinical practice, there is no effective way to eliminate streak artifact in coronary MDCT angiography entirely. We propose a technique to minimize the impact of streak artifact in retrospectively gated coronary MDCT angiography by utilizing small shifts in the reconstruction window. In our experience, previously degraded portions of the coronary vasculature were able to be well evaluated using this technique. (orig.)

  15. Aurorasaurus Database of Real-Time, Soft-Sensor Sourced Aurora Data for Space Weather Research

    Science.gov (United States)

    Kosar, B.; MacDonald, E.; Heavner, M.

    2017-12-01

    Aurorasaurus is an innovative citizen science project focused on two fundamental objectives, i.e., collecting real-time, ground-based signals of auroral visibility from citizen scientists (soft sensors) and incorporating this new type of data into scientific investigations pertaining to the aurora. The project has been live since the Fall of 2014 and, as of Summer 2017, the database had compiled approximately 12,000 observations (5295 direct reports and 6413 verified tweets). In this presentation, we will focus on demonstrating the utility of this robust science-quality data for space weather research needs. These data scale with the size of the event and are well-suited to capture the largest, rarest events. Emerging state-of-the-art computational methods based on statistical inference such as machine learning frameworks and data-model integration methods can offer new insights that could potentially lead to better real-time assessment and space weather prediction when citizen science data are combined with traditional sources.

  16. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Directory of Open Access Journals (Sweden)

    Pierre Siohan

    2005-05-01

    Full Text Available Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.

  17. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Science.gov (United States)

    Guillemot, Christine; Siohan, Pierre

    2005-12-01

    Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.

  18. Reading handprinted addresses on IRS tax forms

    Science.gov (United States)

    Ramanaprasad, Vemulapati; Shin, Yong-Chul; Srihari, Sargur N.

    1996-03-01

    The hand-printed address recognition system described in this paper is a part of the Name and Address Block Reader (NABR) system developed by the Center of Excellence for Document Analysis and Recognition (CEDAR). NABR is currently being used by the IRS to read address blocks (hand-print as well as machine-print) on fifteen different tax forms. Although machine- print address reading was relatively straightforward, hand-print address recognition has posed some special challenges due to demands on processing speed (with an expected throughput of 8450 forms/hour) and recognition accuracy. We discuss various subsystems involved in hand- printed address recognition, including word segmentation, word recognition, digit segmentation, and digit recognition. We also describe control strategies used to make effective use of these subsystems to maximize recognition accuracy. We present system performance on 931 address blocks in recognizing various fields, such as city, state, ZIP Code, street number and name, and personal names.

  19. Identification of spikes associated with local sources in continuous time series of atmospheric CO, CO2 and CH4

    Science.gov (United States)

    El Yazidi, Abdelhadi; Ramonet, Michel; Ciais, Philippe; Broquet, Gregoire; Pison, Isabelle; Abbaris, Amara; Brunner, Dominik; Conil, Sebastien; Delmotte, Marc; Gheusi, Francois; Guerin, Frederic; Hazan, Lynn; Kachroudi, Nesrine; Kouvarakis, Giorgos; Mihalopoulos, Nikolaos; Rivier, Leonard; Serça, Dominique

    2018-03-01

    This study deals with the problem of identifying atmospheric data influenced by local emissions that can result in spikes in time series of greenhouse gases and long-lived tracer measurements. We considered three spike detection methods known as coefficient of variation (COV), robust extraction of baseline signal (REBS) and standard deviation of the background (SD) to detect and filter positive spikes in continuous greenhouse gas time series from four monitoring stations representative of the European ICOS (Integrated Carbon Observation System) Research Infrastructure network. The results of the different methods are compared to each other and against a manual detection performed by station managers. Four stations were selected as test cases to apply the spike detection methods: a continental rural tower of 100 m height in eastern France (OPE), a high-mountain observatory in the south-west of France (PDM), a regional marine background site in Crete (FKL) and a marine clean-air background site in the Southern Hemisphere on Amsterdam Island (AMS). This selection allows us to address spike detection problems in time series with different variability. Two years of continuous measurements of CO2, CH4 and CO were analysed. All methods were found to be able to detect short-term spikes (lasting from a few seconds to a few minutes) in the time series. Analysis of the results of each method leads us to exclude the COV method due to the requirement to arbitrarily specify an a priori percentage of rejected data in the time series, which may over- or underestimate the actual number of spikes. The two other methods freely determine the number of spikes for a given set of parameters, and the values of these parameters were calibrated to provide the best match with spikes known to reflect local emissions episodes that are well documented by the station managers. 
More than 96 % of the spikes manually identified by station managers were successfully detected both in the SD and the
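The standard-deviation-of-the-background (SD) method evaluated above can be sketched as a rolling test: a sample is flagged as a spike when it exceeds the mean of a trailing background window by more than k standard deviations. The window length and threshold k below are illustrative choices, not the calibrated values from the study.

```python
import statistics

def sd_spike_filter(series, window=60, k=3.0):
    """Flag indices whose value exceeds the trailing-background mean
    by more than k standard deviations.

    `window` and `k` are illustrative parameters; in the study they are
    calibrated against spikes documented by the station managers.
    """
    spikes = []
    for i in range(window, len(series)):
        background = series[i - window:i]
        mean = statistics.fmean(background)
        sd = statistics.stdev(background)
        if sd > 0 and series[i] > mean + k * sd:
            spikes.append(i)
    return spikes

# Synthetic CO2-like record (ppm): a small periodic baseline wobble
# plus one short local-emission spike at index 150.
series = [400.0 + 0.1 * ((i % 5) - 2) for i in range(200)]
series[150] += 25.0
print(sd_spike_filter(series))  # → [150]
```

Note that once a spike enters the background window it inflates the local standard deviation, which is one reason the methods' parameters need calibration against manually identified events.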

  20. Identification of spikes associated with local sources in continuous time series of atmospheric CO, CO2 and CH4

    Directory of Open Access Journals (Sweden)

    A. El Yazidi

    2018-03-01

    Full Text Available This study deals with the problem of identifying atmospheric data influenced by local emissions that can result in spikes in time series of greenhouse gases and long-lived tracer measurements. We considered three spike detection methods known as coefficient of variation (COV), robust extraction of baseline signal (REBS) and standard deviation of the background (SD) to detect and filter positive spikes in continuous greenhouse gas time series from four monitoring stations representative of the European ICOS (Integrated Carbon Observation System) Research Infrastructure network. The results of the different methods are compared to each other and against a manual detection performed by station managers. Four stations were selected as test cases to apply the spike detection methods: a continental rural tower of 100 m height in eastern France (OPE), a high-mountain observatory in the south-west of France (PDM), a regional marine background site in Crete (FKL) and a marine clean-air background site in the Southern Hemisphere on Amsterdam Island (AMS). This selection allows us to address spike detection problems in time series with different variability. Two years of continuous measurements of CO2, CH4 and CO were analysed. All methods were found to be able to detect short-term spikes (lasting from a few seconds to a few minutes) in the time series. Analysis of the results of each method leads us to exclude the COV method due to the requirement to arbitrarily specify an a priori percentage of rejected data in the time series, which may over- or underestimate the actual number of spikes. The two other methods freely determine the number of spikes for a given set of parameters, and the values of these parameters were calibrated to provide the best match with spikes known to reflect local emissions episodes that are well documented by the station managers. More than 96 % of the spikes manually identified by station managers were successfully detected both in

  1. Design and Implementation of a New Run-time Life-cycle for Interactive Public Display Applications

    OpenAIRE

    Cardoso, Jorge C. S.; Perpétua, Alice

    2015-01-01

    Public display systems are becoming increasingly complex. They are moving from passive closed systems to open interactive systems that are able to accommodate applications from several independent sources. This shift needs to be accompanied by a more flexible and powerful application management. In this paper, we propose a run-time life-cycle model for interactive public display applications that addresses several shortcomings of current display systems. Our mo...

  2. Effective source approach to self-force calculations

    International Nuclear Information System (INIS)

    Vega, Ian; Wardell, Barry; Diener, Peter

    2011-01-01

    Numerical evaluation of the self-force on a point particle is made difficult by the use of delta functions as sources. Recent methods for self-force calculations avoid delta functions altogether, using instead a finite and extended 'effective source' for a point particle. We provide a review of the general principles underlying this strategy, using the specific example of a scalar point charge moving in a black hole spacetime. We also report on two new developments: (i) the construction and evaluation of an effective source for a scalar charge moving along a generic orbit of an arbitrary spacetime, and (ii) the successful implementation of hyperboloidal slicing that significantly improves on previous treatments of boundary conditions used for effective-source-based self-force calculations. Finally, we identify some of the key issues related to the effective source approach that will need to be addressed by future work.

  3. Selective Attention in Multi-Chip Address-Event Systems

    Directory of Open Access Journals (Sweden)

    Giacomo Indiveri

    2009-06-01

    Full Text Available Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the “Selective Attention Chip” (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model.

  4. Selective attention in multi-chip address-event systems.

    Science.gov (United States)

    Bartolozzi, Chiara; Indiveri, Giacomo

    2009-01-01

    Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the "Selective Attention Chip" (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model.
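The stimulus-driven selection the SAC performs is commonly modeled as a winner-take-all competition with inhibition of return. A toy digital sketch of that model (function and parameter names are ours; the chip itself realizes the competition in analog neuromorphic VLSI):

```python
import numpy as np

def select_attention(event_addresses, n_locations, n_shifts=3):
    """Toy winner-take-all (WTA) over address-events with inhibition of
    return: the most active input wins, is attended, then suppressed so
    attention shifts to the next most salient location."""
    saliency = np.bincount(np.asarray(event_addresses),
                           minlength=n_locations).astype(float)
    attended = []
    for _ in range(n_shifts):
        winner = int(np.argmax(saliency))  # WTA competition
        attended.append(winner)
        saliency[winner] = 0.0             # inhibition of return
    return attended

events = [2, 2, 2, 5, 5, 1]                # spike addresses from a vision sensor
print(select_attention(events, n_locations=8))  # → [2, 5, 1]
```

The attended sequence scans locations in order of decreasing activity, mimicking how the SAC sequentially selects salient regions of the sensor's output.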

  5. Crystal timing offset calibration method for time of flight PET scanners

    Science.gov (United States)

    Ye, Jinghan; Song, Xiyun

    2016-03-01

    In time-of-flight (TOF) positron emission tomography (PET), precise calibration of the timing offset of each crystal of a PET scanner is essential. Conventionally, this calibration requires a specially designed tool. In this study, a method that uses a planar source to measure the crystal timing offsets (CTO) is developed. The method uses list-mode acquisitions of a planar source placed at multiple orientations inside the PET scanner field of view (FOV). The placement of the planar source in each acquisition is determined automatically from the measured data, so a fixture for exact positioning of the source is not required. The expected coincidence time difference for each detected list-mode event can be computed from the planar source placement and the detector geometry; any deviation of the measured time difference from the expected one is attributed to the CTO of the two crystals. The least-squares solution for the CTO is found iteratively from the list-mode events. The effectiveness of the calibration is demonstrated using phantom images generated by placing each list-mode event back into image space with the timing offset applied: the zigzagged outlines of the phantoms become smooth after the crystal timing calibration is applied. In conclusion, a crystal timing calibration method is developed that uses multiple list-mode acquisitions of a planar source to find the least-squares solution for the crystal timing offsets.
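The least-squares step can be sketched in a few lines. In the simplified model below (our own notation, not the paper's), each coincidence between crystals i and j contributes a residual dt − (o[i] − o[j]), where dt is the measured minus the geometrically expected time difference; the offsets are only defined up to a common constant, fixed here by a zero-mean constraint:

```python
import numpy as np

def fit_crystal_offsets(events, n_crystals, n_iter=50):
    """Iterative least-squares estimate of per-crystal timing offsets.

    events: iterable of (i, j, dt) where dt is the measured minus the
    geometrically expected coincidence time difference, modeled as
    o[i] - o[j]."""
    o = np.zeros(n_crystals)
    for _ in range(n_iter):
        grad = np.zeros(n_crystals)
        counts = np.ones(n_crystals)      # +1 guards against empty crystals
        for i, j, dt in events:
            r = dt - (o[i] - o[j])        # current residual of this event
            grad[i] += r
            grad[j] -= r
            counts[i] += 1
            counts[j] += 1
        o += grad / counts                # damped least-squares update
        o -= o.mean()                     # gauge fix: zero-mean offsets
    return o

# synthetic truth: crystal 0 fires 2 ns late, crystals 2 and 3 fire 1 ns early
true = np.array([2.0, 0.0, -1.0, -1.0])
events = [(i, j, true[i] - true[j]) for i in range(4) for j in range(4) if i != j]
est = fit_crystal_offsets(events, n_crystals=4)
print(np.allclose(est, true, atol=1e-6))  # → True
```

With noisy real data the same iteration converges to the offsets that minimize the summed squared residuals over all list-mode events.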

  6. Open source R&D - an anomaly in innovation management?

    DEFF Research Database (Denmark)

    Ulhøi, John Parm

    2004-01-01

    This paper addresses innovations based on the principle of open source or non-proprietary knowledge. Viewed through the lens of private property theory, such agency appears to be a true anomaly. However, by a further turn of the theoretical kaleidoscope, we will show that there may be perfectly justifiable reasons for not regarding open source innovations as anomalies. The paper identifies three generic cases of open source innovation, an offspring of contemporary theory made possible by combining elements of the model of private agency with those of the model of collective agency…

  7. Recharge sources and residence times of groundwater as determined by geochemical tracers in the Mayfield Area, southwestern Idaho, 2011–12

    Science.gov (United States)

    Hopkins, Candice B.

    2013-01-01

    Parties proposing residential development in the area of Mayfield, Idaho are seeking a sustainable groundwater supply. During 2011–12, the U.S. Geological Survey, in cooperation with the Idaho Department of Water Resources, used geochemical tracers in the Mayfield area to evaluate sources of aquifer recharge and differences in groundwater residence time. Fourteen groundwater wells and one surface-water site were sampled for major ion chemistry, metals, stable isotopes, and age tracers; data collected from this study were used to evaluate the sources of groundwater recharge and groundwater residence times in the area. Major ion chemistry varied along a flow path between deeper wells, suggesting an upgradient source of dilute water, and a downgradient source of more concentrated water with the geochemical signature of the Idaho Batholith. Samples from shallow wells had elevated nutrient concentrations, a more positive oxygen-18 signature, and younger carbon-14 dates than deep wells, suggesting that recharge comes from young precipitation and surface-water infiltration. Samples from deep wells generally had higher concentrations of metals typical of geothermal waters, a more negative oxygen-18 signature, and older carbon-14 values than samples from shallow wells, suggesting that recharge comes from both infiltration of meteoric water and another source. The chemistry of groundwater sampled from deep wells is somewhat similar to the chemistry in geothermal waters, suggesting that geothermal water may be a source of recharge to this aquifer. Results of NETPATH mixing models suggest that geothermal water composes 1–23 percent of water in deep wells. Chlorofluorocarbons were detected in every sample, which indicates that all groundwater samples contain at least a component of young recharge, and that groundwater is derived from multiple recharge sources. Conclusions from this study can be used to further refine conceptual hydrological models of the area.
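The 1–23 percent geothermal fractions come from mixing-model calculations; for a single conservative tracer, the two-end-member version reduces to one line. A minimal sketch (the tracer choice and concentrations below are hypothetical, not values from the report):

```python
def mixing_fraction(c_sample, c_end1, c_end2):
    """Fraction f of end-member 1 in a two-end-member conservative mix:
    c_sample = f * c_end1 + (1 - f) * c_end2, solved for f."""
    return (c_sample - c_end2) / (c_end1 - c_end2)

# hypothetical chloride concentrations (mg/L): geothermal vs. dilute meteoric water
f_geo = mixing_fraction(c_sample=12.0, c_end1=50.0, c_end2=5.0)
print(round(f_geo, 3))  # → 0.156, i.e. ~16% geothermal water
```

NETPATH generalizes this idea to many constituents simultaneously, with mass-balance constraints from mineral reactions along the flow path.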

  8. A Task-Based Needs Analysis for Australian Aboriginal Students: Going beyond the Target Situation to Address Cultural Issues

    Science.gov (United States)

    Oliver, Rhonda; Grote, Ellen; Rochecouste, Judith; Exell, Michael

    2013-01-01

    While needs analyses underpin the design of second language analytic syllabi, the methodologies undertaken are rarely examined. This paper explores the value of multiple data sources and collection methods for developing a needs analysis model to enable vocational education and training teachers to address the needs of Australian Aboriginal…

  9. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    Science.gov (United States)

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity, in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available to date are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information-gathering exercise. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
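As a hedged illustration of the screening step, a one-at-a-time perturbation ranks candidate parameters by their influence on the output (the model, parameter names, and numbers below are invented for the sketch; the paper's methodology combines screening with formal sensitivity and uncertainty analysis):

```python
def oat_screening(model, baseline, deltas):
    """One-at-a-time screening: perturb each input parameter in turn and
    rank parameters by the magnitude of the induced output change."""
    y0 = model(baseline)
    effects = {}
    for name, delta in deltas.items():
        x = dict(baseline)        # copy, then perturb one parameter
        x[name] += delta
        effects[name] = abs(model(x) - y0)
    return sorted(effects.items(), key=lambda kv: -kv[1])

# hypothetical dust-production model and normalized parameters
def dust_model(p):
    return 0.8 * p["plasma_energy"] + 0.1 * p["wall_temp"] + 0.01 * p["pressure"]

baseline = {"plasma_energy": 1.0, "wall_temp": 1.0, "pressure": 1.0}
ranking = oat_screening(dust_model, baseline, {k: 0.1 for k in baseline})
print([name for name, _ in ranking])  # → ['plasma_energy', 'wall_temp', 'pressure']
```

Parameters that fall at the bottom of such a ranking are candidates for exclusion before the more expensive sensitivity and uncertainty analyses.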

  10. Traffic, air pollution, minority and socio-economic status: addressing inequities in exposure and risk.

    Science.gov (United States)

    Pratt, Gregory C; Vadali, Monika L; Kvale, Dorian L; Ellickson, Kristie M

    2015-05-19

    Higher levels of nearby traffic increase exposure to air pollution and adversely affect health outcomes. Populations with lower socio-economic status (SES) are particularly vulnerable to stressors like air pollution. We investigated cumulative exposures and risks from traffic and from MNRiskS-modeled air pollution in multiple source categories across demographic groups. Exposures and risks, especially from on-road sources, were higher than the mean for minorities and low SES populations and lower than the mean for white and high SES populations. Owning multiple vehicles and driving alone were linked to lower household exposures and risks. Those not owning a vehicle and walking or using transit had higher household exposures and risks. These results confirm for our study location that populations on the lower end of the socio-economic spectrum and minorities are disproportionately exposed to traffic and air pollution and at higher risk for adverse health outcomes. A major source of disparities appears to be the transportation infrastructure. Those outside the urban core had lower risks but drove more, while those living nearer the urban core tended to drive less but had higher exposures and risks from on-road sources. We suggest policy considerations for addressing these inequities.

  11. Identification of Watershed-scale Critical Source Areas Using Bayesian Maximum Entropy Spatiotemporal Analysis

    Science.gov (United States)

    Roostaee, M.; Deng, Z.

    2017-12-01

    State environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairment. Spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, owing to limited monetary resources and the large number of waterbodies, monitoring stations are typically sparse, with intermittent periods of data collection. Scarcity of water quality data is therefore a major obstacle in addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as dissolved oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been on the 303(d) list for DO impairment since 2014 in Louisiana Water Quality Inventory Reports, owing to agricultural practices. The BME method provides more accurate estimates than purely spatial analysis by incorporating the space/time distribution and the uncertainty in available measured soft and hard data. The model is used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers for the watershed were then used to determine the practices and changes that led to low DO concentrations in the identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main sources contributing to low dissolved oxygen in the Turkey Creek Watershed.
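A minimal sketch of estimation at an unmonitored point in space and time: the inverse-distance weighting below shares only the space/time idea with BME, which additionally incorporates soft (uncertain) data and prior knowledge; the coordinates, ranges, and DO values are hypothetical:

```python
import numpy as np

def spacetime_idw(obs, x0, t0, range_s=10.0, range_t=5.0, power=2.0):
    """Space-time inverse-distance estimate at location x0, time t0.

    obs: list of (x, y, t, value) hard data. Spatial and temporal
    separations are made comparable by scaling with range_s and range_t."""
    num = den = 0.0
    for x, y, t, v in obs:
        d = np.hypot(x - x0[0], y - x0[1]) / range_s + abs(t - t0) / range_t
        w = 1.0 / (d ** power + 1e-12)   # epsilon avoids division by zero
        num += w * v
        den += w
    return num / den

# hypothetical DO observations: (x_km, y_km, day, mg/L)
obs = [(0, 0, 0, 6.0), (5, 0, 0, 5.0), (0, 0, 10, 4.0)]
do_est = spacetime_idw(obs, x0=(1, 0), t0=1)
print(round(do_est, 2))
```

Estimating DO this way across a grid of locations and times, then thresholding low values, is the simplest analogue of the CSA-identification step described above.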

  12. Time to address the problems at the neural interface

    Science.gov (United States)

    Durand, Dominique M.; Ghovanloo, Maysam; Krames, Elliot

    2014-04-01

    Neural engineers have made significant, if not remarkable, progress in interfacing with the nervous system in the last ten years. In particular, neuromodulation of the brain has generated significant therapeutic benefits [1-5]. EEG electrodes can be used to communicate with patients with locked-in syndrome [6]. In the central nervous system (CNS), electrode arrays placed directly over or within the cortex can record neural signals related to the intent of the subject or patient [7, 8]. A similar technology has allowed paralyzed patients to control an otherwise normal skeletal system with brain signals [9, 10]. This technology has significant potential to restore function in these and other patients with neural disorders such as stroke [11]. Although there are several multichannel arrays described in the literature, the workhorse for these cortical interfaces has been the Utah array [12]. This 100-channel electrode array has been used in most studies on animals and humans since the 1990s and is commercially available. This array and other similar microelectrode arrays can record neural signals with high quality (high signal-to-noise ratio), but these signals fade and disappear after a few months, and therefore the current technology is not reliable for extended periods of time. Therefore, despite these major advances in communicating with the brain, clinical translation cannot be implemented. The reasons for this failure are not known but clearly involve the interface between the electrode and the neural tissue. The Defense Advanced Research Projects Agency (DARPA) as well as other federal funding agencies such as the National Science Foundation (NSF) and the National Institutes of Health have provided significant financial support to investigate this problem without much success. A recent funding program from DARPA was designed to establish the failure modes in order to generate a reliable neural interface technology and again was unsuccessful at producing a robust

  13. Car indoor air pollution - analysis of potential sources

    Directory of Open Access Journals (Sweden)

    Müller Daniel

    2011-12-01

    Full Text Available The population of industrialized countries such as the United States or member states of the European Union spends more than one hour each day in vehicles. Numerous studies have addressed outdoor air pollution arising from traffic, but little is known about indoor air quality in vehicles and the influence of non-vehicle sources. The present article therefore summarizes recent studies that address, for example, particulate matter exposure. Although a large amount of data exists on outdoor air pollution, research on indoor air quality in vehicles is still limited; knowledge of non-vehicular sources in particular is missing. An understanding of the effects and interactions of, for example, tobacco smoke under realistic automobile conditions should be a goal of future work.

  14. The Chandra Source Catalog: Source Properties and Data Products

    Science.gov (United States)

    Rots, Arnold; Evans, Ian N.; Glotfelty, Kenny J.; Primini, Francis A.; Zografou, Panagoula; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is breaking new ground in several areas. There are two aspects that are of particular interest to the users: its evolution and its contents. The CSC will be a living catalog that becomes richer, bigger, and better in time while still remembering its state at each point in time. This means that users will be able to take full advantage of new additions to the catalog, while retaining the ability to back-track and return to what was extracted in the past. The CSC sheds the limitations of flat-table catalogs. Its sources will be characterized by a large number of properties, as usual, but each source will also be associated with its own specific data products, allowing users to perform mini custom analysis on the sources. Source properties fall in the spatial (position, extent), photometric (fluxes, count rates), spectral (hardness ratios, standard spectral fits), and temporal (variability probabilities) domains, and are all accompanied by error estimates. Data products cover the same coordinate space and include event lists, images, spectra, and light curves. In addition, the catalog contains data products covering complete observations: event lists, background images, exposure maps, etc. This work is supported by NASA contract NAS8-03060 (CXC).

  15. Time-sensitive remote sensing

    CERN Document Server

    Lippitt, Christopher; Coulter, Lloyd

    2015-01-01

    This book documents the state of the art in the use of remote sensing to address time-sensitive information requirements. Specifically, it brings together a group of authors who are both researchers and practitioners, who work toward or are currently using remote sensing to address time-sensitive information requirements with the goal of advancing the effective use of remote sensing to supply time-sensitive information. The book addresses the theoretical implications of time-sensitivity on the remote sensing process, assessments or descriptions of methods for expediting the delivery and improving the quality of information derived from remote sensing, and describes and analyzes time-sensitive remote sensing applications, with an emphasis on lessons learned. This book is intended for remote sensing scientists, practitioners (e.g., emergency responders or administrators of emergency response agencies), and students, but will also be of use to those seeking to understand the potential of remote sensing to addres...

  16. Enhancing source location protection in wireless sensor networks

    Science.gov (United States)

    Chen, Juan; Lin, Zhengkui; Wu, Di; Wang, Bailing

    2015-12-01

    Wireless sensor networks are widely deployed in the internet of things to monitor valuable objects. Once an object is monitored, the sensor nearest to it, known as the source, periodically informs the base station about the object's information. Attackers can therefore capture the object simply by localizing the source, and many protocols have been proposed to secure the source location. In this paper, however, we show that typical source location protection protocols generate phantom locations that are not only near the source but also highly localized; as a result, attackers can easily trace the source from these phantom locations. To address these limitations, we propose a protocol to enhance source location protection (SLE). With phantom locations far from the source and widely distributed, SLE improves source location anonymity significantly. Theoretical analysis and simulation results show that SLE provides strong source location privacy preservation, and that the average safety period increases by nearly an order of magnitude compared with existing work, at low communication cost.
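The core geometric idea, phantoms far from the source and widely dispersed, can be sketched as a directed random walk on a grid (a toy of our own construction, not the SLE protocol itself):

```python
import random

def choose_phantom(source, hops, grid=50, prev=None):
    """Pick a phantom location via a directed random walk away from the
    source, re-drawing candidates that land near the previous phantom so
    successive phantoms stay far apart and widely distributed."""
    sx, sy = source
    for _ in range(100):                       # retry budget
        dx = random.choice([-1, 1])            # random quadrant direction
        dy = random.choice([-1, 1])
        steps_x = random.randint(hops // 2, hops)
        px = min(max(sx + dx * steps_x, 0), grid - 1)
        py = min(max(sy + dy * (hops - steps_x), 0), grid - 1)
        far_from_source = abs(px - sx) + abs(py - sy) >= hops // 2
        spread_out = (prev is None or
                      abs(px - prev[0]) + abs(py - prev[1]) >= hops // 2)
        if far_from_source and spread_out:
            return (px, py)
    return (px, py)                            # fall back to last candidate

random.seed(1)
p1 = choose_phantom((25, 25), hops=12)
p2 = choose_phantom((25, 25), hops=12, prev=p1)
print(p1, p2)  # two phantoms, each far from the source and from each other
```

An adversary backtracking from such phantoms gains little information about the true source, which is the property the SLE analysis quantifies via the safety period.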

  17. Addressing Pricing Power in Integrated Delivery: The Limits of Antitrust.

    Science.gov (United States)

    Berenson, Robert

    2015-08-01

    Prices are the major driver of why the United States spends so much more on health care than other countries do. The pricing power that hospitals have garnered recently has resulted from consolidated delivery systems and concentrated markets, leading to enhanced negotiating leverage. But consolidation may be the wrong frame for viewing the problem of high and highly variable prices; many "must-have" hospitals achieve their pricing power from sources other than consolidation, for example, reputation. Further, the frame of consolidation leads to unrealistic expectations for what antitrust's role in addressing pricing power should be, especially because in the wake of two periods of merger "manias" and "frenzies" many markets already lack effective competition. It is particularly challenging for antitrust to address extant monopolies lawfully attained. New payment and delivery models being pioneered in Medicare, especially those built around accountable care organizations (ACOs), offer an opportunity to reduce pricing power, but only if they are implemented with a clear eye on the impact on prices in commercial insurance markets. This article proposes approaches that public and private payers should consider to complement the role of antitrust to assure that ACOs will actually help control costs in commercial markets as well as in Medicare and Medicaid. Copyright © 2015 by Duke University Press.

  18. Source of the Vrancea, Romania intermediate-depth earthquakes: variability test of the source time function using a small-aperture array

    International Nuclear Information System (INIS)

    Popescu, E.; Radulian, M.; Popa, M.; Placinta, A.O.; Cioflan, C. O.; Grecu, B.

    2005-01-01

    The main purpose of the present work is to investigate the possibility of detecting and calibrating the source parameters of the Vrancea intermediate-depth earthquakes using a small-aperture array, the Bucovina Seismic Array (BURAR). The BURAR array was installed in 1999 in joint cooperation between Romania and the USA. The array is situated in the northern part of Romania, in the Eastern Carpathians, at about 250 km distance from the Vrancea epicentral area, and consists of 10 stations (nine short-period and one broadband instrument installed in boreholes). For our study we selected 30 earthquakes (3.8 ≤ MD ≤ 6.0) that occurred between 2002 and 2004, including two recent Vrancea events that are among the best-recorded earthquakes on Romanian territory: September 27, 2004 (45.70°N, 26.45°E, h = 166 km, Mw = 4.7) and October 27, 2004 (45.84°N, 26.63°E, h = 105 km, Mw = 6.0). Empirical Green's function deconvolution and spectral ratio methods are applied to pairs of collocated events with similar focal mechanisms. Stability tests are performed for the retrieved source time function using the array elements. Empirical scaling and calibration relationships are also determined. Possible variation with depth along the subducting slab, in agreement with assumed differences in the seismic and tectonic regime between the upper (h = 60-110 km) and lower (h = 110-180 km) lithospheric seismically active segments, and variation in the attenuation of the seismic waves propagating toward the BURAR site, are also investigated. (authors)

  19. Leveraging Big Data Tools and Technologies: Addressing the Challenges of the Water Quality Sector

    Directory of Open Access Journals (Sweden)

    Juan Manuel Ponce Romero

    2017-11-01

    Full Text Available The water utility sector is subject to stringent legislation that seeks to address both the evolution of practices within the chemical/pharmaceutical industry and the safeguarding of environmental protection, and which is informed by stakeholder views. Growing public environmental awareness is balanced by fair apportionment of liability within the sector. This highly complex and dynamic context poses challenges for water utilities seeking to manage the diverse chemicals, arising from disparate sources, that reach Wastewater Treatment Plants, including residential, commercial, and industrial points of origin, and diffuse sources including agricultural and hard-surface water run-off. Effluents contain broad ranges of organic and inorganic compounds, herbicides, pesticides, phosphorus, pharmaceuticals, and chemicals of emerging concern. These potential pollutants can be in dissolved form or arise in association with organic matter, and the associated risks pose significant environmental challenges. This paper examines how the adoption of new Big Data tools and computational technologies can offer great advantage to the water utility sector in addressing this challenge. Big Data approaches facilitate improved understanding and insight into these challenges by industry, regulator, and public alike. We discuss how Big Data approaches can be used to improve the outputs of tools currently in use by the water industry, such as SAGIS (Source Apportionment GIS system), helping to reveal new relationships between chemicals, the environment, and human health, and in turn provide better understanding of contaminants in wastewater (origin, pathways, and persistence). We highlight how the sector can draw upon Big Data tools to add value to legacy datasets, such as the Chemicals Investigation Programme in the UK, combined with contemporary data sources, extending the lifespan of data, focusing monitoring strategies, and helping users adapt and plan more efficiently.
Despite

  20. Implementation of time-delay interferometry for LISA

    International Nuclear Information System (INIS)

    Tinto, Massimo; Shaddock, Daniel A.; Sylvestre, Julien; Armstrong, J.W.

    2003-01-01

    We discuss the baseline optical configuration for the Laser Interferometer Space Antenna (LISA) mission, in which the lasers are not free-running, but rather one of them is used as the main frequency reference generator (the master) and the remaining five as slaves, these being phase-locked to the master (the master-slave configuration). Under the condition that the frequency fluctuations due to the optical transponders can be made negligible with respect to the secondary LISA noise sources (mainly proof-mass and shot noises), we show that the entire space of interferometric combinations LISA can generate when operated with six independent lasers (the one-way method) can also be constructed with the master-slave system design. The corresponding hardware trade-off analysis for these two optical designs is presented, which indicates that the two sets of systems needed for implementing the one-way method, and the master-slave configuration, are essentially identical. Either operational mode could therefore be implemented without major implications on the hardware configuration. We then derive the required accuracies of armlength knowledge, time synchronization of the onboard clocks, sampling times and time-shifts needed for effectively implementing time-delay interferometry for LISA. We find that an armlength accuracy of about 16 meters, a synchronization accuracy of about 50 ns, and the time jitter due to a presently existing space qualified clock will allow the suppression of the frequency fluctuations of the lasers below to the level identified by the secondary noise sources. A new procedure for sampling the data in such a way to avoid the problem of having time shifts that are not integer multiples of the sampling time is also introduced, addressing one of the concerns about the implementation of time-delay interferometry