WorldWideScience

Sample records for the Garching-Bonn Deep Survey

  1. GARCH Option Valuation: Theory and Evidence

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Jacobs, Kris; Ornthanalai, Chayawat

    We survey the theory and empirical evidence on GARCH option valuation models. Our treatment includes the range of functional forms available for the volatility dynamic, multifactor models, nonnormal shock distributions, as well as the styles of pricing kernels typically used. Various strategies for empirical implementation are laid out, and we also discuss the links between GARCH and stochastic volatility models. In the appendix we provide Matlab computer code for option pricing via Monte Carlo simulation for nonaffine models as well as Fourier inversion for affine models.
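
    The abstract refers to Matlab code for option pricing via Monte Carlo simulation under nonaffine GARCH dynamics. As a rough illustration of that idea (not the authors' code), the Python sketch below prices a European call by simulating a Duan/NGARCH-style risk-neutral GARCH(1,1); all parameter values are made up.

```python
import numpy as np

def garch_mc_call_price(S0, K, r, T_days, omega, alpha, beta, lam,
                        n_paths=100_000, seed=0):
    """Monte Carlo price of a European call under a Duan-style risk-neutral
    GARCH(1,1) with an NGARCH-type variance update (illustrative only)."""
    rng = np.random.default_rng(seed)
    h = np.full(n_paths, omega / (1.0 - alpha - beta))   # start near a long-run variance level
    logS = np.full(n_paths, np.log(S0))
    for _ in range(T_days):
        z = rng.standard_normal(n_paths)
        logS += r - 0.5 * h + np.sqrt(h) * z              # risk-neutral daily log-return
        h = omega + alpha * h * (z - lam) ** 2 + beta * h # NGARCH-type variance recursion
    payoff = np.maximum(np.exp(logS) - K, 0.0)
    return np.exp(-r * T_days) * payoff.mean()

# Example with hypothetical daily parameters
print(garch_mc_call_price(S0=100, K=100, r=0.0001, T_days=60,
                          omega=1e-6, alpha=0.05, beta=0.90, lam=0.5))
```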

  2. Improving Garch Volatility Forecasts

    NARCIS (Netherlands)

    Klaassen, F.J.G.M.

    1998-01-01

    Many researchers use GARCH models to generate volatility forecasts. We show, however, that such forecasts are too variable. To correct for this, we extend the GARCH model by distinguishing two regimes with different volatility levels. GARCH effects are allowed within each regime, so that our model...

  3. TESTING GARCH-X TYPE MODELS

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Søndergaard; Rahbek, Anders

    2017-01-01

    We present novel theory for testing for reduction of GARCH-X type models with an exogenous (X) covariate to standard GARCH type models. To deal with the problems of potential nuisance parameters on the boundary of the parameter space as well as lack of identification under the null, we exploit a noticeable property of specific zero-entries in the inverse information of the GARCH-X type models. Specifically, we consider sequential testing based on two likelihood ratio tests and, as demonstrated, the structure of the inverse information implies that the proposed test depends neither on whether the nuisance parameters lie on the boundary of the parameter space, nor on lack of identification. Our general results on GARCH-X type models are applied to Gaussian-based GARCH-X models, GARCH-X models with Student's t-distributed innovations, as well as integer-valued GARCH-X (PAR-X) models.

  4. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. The most common GARCH models are presented and their properties considered, including nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...

  5. GARCH Modelling of Cryptocurrencies

    OpenAIRE

    Jeffrey Chu; Stephen Chan; Saralees Nadarajah; Joerg Osterrieder

    2017-01-01

    With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best-fitting models, forecasts and the acceptability of value-at-risk estimates.
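
    The paper fits twelve GARCH specifications per cryptocurrency and ranks them by several criteria, including information criteria. A minimal sketch of that workflow, assuming the third-party Python `arch` package is available and using simulated data in place of actual cryptocurrency returns, might look like the following (the candidate list is a small illustrative subset, not the paper's twelve models).

```python
import numpy as np
from arch import arch_model  # assumes the 'arch' package is installed

# Stand-in for one cryptocurrency's percentage log-returns (simulated here)
rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=1500)

# A few candidate specifications (illustrative subset)
candidates = {
    "GARCH-normal": dict(vol="GARCH", p=1, q=1, dist="normal"),
    "GARCH-t":      dict(vol="GARCH", p=1, q=1, dist="t"),
    "GJR-t":        dict(vol="GARCH", p=1, o=1, q=1, dist="t"),
    "EGARCH-t":     dict(vol="EGARCH", p=1, q=1, dist="t"),
}

fits = {}
for name, spec in candidates.items():
    res = arch_model(returns, mean="Constant", **spec).fit(disp="off")
    fits[name] = (res.aic, res.bic)

# Rank by AIC (one possible assessment criterion)
for name, (aic, bic) in sorted(fits.items(), key=lambda kv: kv[1][0]):
    print(f"{name:13s} AIC={aic:9.1f}  BIC={bic:9.1f}")
```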

  6. THE EFFELSBERG-BONN H I SURVEY: DATA REDUCTION

    International Nuclear Information System (INIS)

    Winkel, B.; Kalberla, P. M. W.; Kerp, J.; Floeer, L.

    2010-01-01

    Starting in winter 2008/2009, an L-band seven-feed-array receiver is being used for a 21 cm line survey performed with the 100 m telescope, the Effelsberg-Bonn H I survey (EBHIS). The EBHIS will cover the whole northern hemisphere at decl. > -5°, comprising both the galactic and extragalactic sky out to a distance of about 230 Mpc. Using state-of-the-art FPGA-based digital fast Fourier transform spectrometers, superior in dynamic range and temporal resolution to conventional correlators, allows us to apply sophisticated radio frequency interference (RFI) mitigation schemes. In this paper, the EBHIS data reduction package and first results are presented. The reduction software consists of RFI detection schemes, flux and gain-curve calibration, stray-radiation removal, baseline fitting, and finally the gridding to produce data cubes. The whole software chain has been successfully tested using multi-feed data toward many smaller test fields (1-100 deg²) and recently applied for the first time to data from two large sky areas, each covering about 2000 deg². The first large area is toward the northern galactic pole and the second one toward the northern tip of the Magellanic Leading Arm. Here, we demonstrate the data quality of EBHIS Milky Way data and give a first impression of the first data release in 2011.

  7. GARCH Modelling of Cryptocurrencies

    Directory of Open Access Journals (Sweden)

    Jeffrey Chu

    2017-10-01

    With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best-fitting models, forecasts and the acceptability of value-at-risk estimates.

  8. The Bonn Astro/Geo Correlator

    Science.gov (United States)

    Bernhart, Simone; Alef, Walter; Bertarini, Alessandra; La Porta, Laura; Muskens, Arno; Rottmann, Helge; Roy, Alan

    2013-01-01

    The Bonn Distributed FX (DiFX) correlator is a software correlator operated jointly by the Max-Planck-Institut für Radioastronomie (MPIfR), the Institut für Geodäsie und Geoinformation der Universität Bonn (IGG), and the Bundesamt für Kartographie und Geodäsie (BKG) in Frankfurt.

  9. Empirical study of the GARCH model with rational errors

    International Nuclear Information System (INIS)

    Chen, Ting Ting; Takaishi, Tetsuya

    2013-01-01

    We use the GARCH model with a fat-tailed error distribution described by a rational function and apply it to stock price data from the Tokyo Stock Exchange. To determine the model parameters we perform Bayesian inference on the model, implemented by the Metropolis-Hastings algorithm with an adaptive multi-dimensional Student's t-proposal density. In order to compare our model with the GARCH model with standard normal errors, we calculate the information criteria AIC and DIC, and find that both criteria favor the GARCH model with a rational error distribution. We also assess the accuracy of the volatility estimates using realized volatility and find that good accuracy is obtained for the GARCH model with a rational error distribution. Thus we conclude that the GARCH model with a rational error distribution is superior to the GARCH model with normal errors and can be used as an alternative to GARCH models with other fat-tailed distributions.
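
    The record describes Bayesian estimation of GARCH parameters by Metropolis-Hastings. The sketch below illustrates the general idea with a plain random-walk sampler and a Gaussian likelihood; the paper itself uses a rational error distribution and an adaptive multivariate Student-t proposal, neither of which is reproduced here. Step sizes, starting values and the flat prior are illustrative assumptions.

```python
import numpy as np

def garch_neg_loglik(params, r):
    """Negative Gaussian log-likelihood of a GARCH(1,1); +inf outside the admissible region."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

def metropolis_garch(r, n_iter=5000, seed=0):
    """Random-walk Metropolis sampler for (omega, alpha, beta) under flat priors."""
    rng = np.random.default_rng(seed)
    step = np.array([0.01, 0.02, 0.02])        # ad hoc per-parameter proposal scales
    theta = np.array([0.05, 0.05, 0.90])       # crude starting point
    logp = -garch_neg_loglik(theta, r)
    draws = np.empty((n_iter, 3))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(3)
        logp_prop = -garch_neg_loglik(prop, r)
        if np.log(rng.uniform()) < logp_prop - logp:   # Metropolis accept/reject
            theta, logp = prop, logp_prop
        draws[i] = theta
    return draws

# Illustrative run on simulated unit-variance returns (think of them as percent returns)
r = np.random.default_rng(2).standard_normal(1500)
draws = metropolis_garch(r)
print(draws[2500:].mean(axis=0))   # rough posterior means after discarding burn-in
```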

  10. Modeling energy price dynamics: GARCH versus stochastic volatility

    International Nuclear Information System (INIS)

    Chan, Joshua C.C.; Grant, Angelia L.

    2016-01-01

    We compare a number of GARCH and stochastic volatility (SV) models using nine series of oil, petroleum product and natural gas prices in a formal Bayesian model comparison exercise. The competing models include the standard models of GARCH(1,1) and SV with an AR(1) log-volatility process, as well as more flexible models with jumps, volatility in mean, leverage effects, and t distributed and moving average innovations. We find that: (1) SV models generally compare favorably to their GARCH counterparts; (2) the jump component and t distributed innovations substantially improve the performance of the standard GARCH, but are unimportant for the SV model; (3) the volatility feedback channel seems to be superfluous; (4) the moving average component markedly improves the fit of both GARCH and SV models; and (5) the leverage effect is important for modeling crude oil prices—West Texas Intermediate and Brent—but not for other energy prices. Overall, the SV model with moving average innovations is the best model for all nine series. - Highlights: • We compare a variety of GARCH and SV models for fitting nine series of energy prices. • We find that SV models generally compare favorably to their GARCH counterparts. • The SV model with moving average innovations is the best model for all nine series.

  11. Empirical investigation on modeling solar radiation series with ARMA–GARCH models

    International Nuclear Information System (INIS)

    Sun, Huaiwei; Yan, Dong; Zhao, Na; Zhou, Jianzhong

    2015-01-01

    Highlights: • Apply 6 ARMA–GARCH(-M) models to model and forecast solar radiation. • The ARMA–GARCH(-M) models produce more accurate radiation forecasts than conventional methods. • Show that ARMA–GARCH-M models are more effective for forecasting solar radiation mean and volatility. • The ARMA–EGARCH-M is robust and the ARMA–sGARCH-M is very competitive. - Abstract: Simulation of radiation is one of the most important issues in solar utilization. Time series models are useful tools for the estimation and forecasting of solar radiation series and their changes. In this paper, autoregressive moving average (ARMA) models with various generalized autoregressive conditional heteroskedasticity (GARCH) processes, namely ARMA–GARCH models, are evaluated for their effectiveness on radiation series. Six different GARCH approaches, comprising three different ARMA–GARCH models and the corresponding GARCH-in-mean (ARMA–GARCH-M) models, are applied to radiation data sets from two representative climate stations in China. Multiple evaluation metrics of modeling sufficiency are used to evaluate the performance of the models. The results show that the ARMA–GARCH(-M) models are effective for radiation series estimation. In both fitting and prediction of radiation series, the ARMA–GARCH(-M) models show better modeling sufficiency than traditional models; the ARMA–EGARCH-M model is robust at both sites and the ARMA–sGARCH-M model appears very competitive. Comparisons of statistical diagnostics and model performance clearly show that the ARMA–GARCH-M models make the mean radiation equations more sufficient. The ARMA–GARCH(-M) models are therefore recommended as the preferred method for modeling solar radiation series.

  12. Nonstationary ARCH and GARCH with t-distributed Innovations

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Søndergaard; Rahbek, Anders

    Consistency and asymptotic normality are established for the maximum likelihood estimators in the nonstationary ARCH and GARCH models with general t-distributed innovations. The results hold for joint estimation of (G)ARCH effects and the degrees of freedom parameter parametrizing the t-distribution. With T denoting sample size, classic square-root-T convergence is shown to hold, with closed-form expressions for the multivariate covariances.

  13. Forecasting Performance of Asymmetric GARCH Stock Market Volatility Models

    Directory of Open Access Journals (Sweden)

    Hojin Lee

    2009-12-01

    We investigate the asymmetry between positive and negative returns in their effect on the conditional variance of the stock market index and incorporate these characteristics to form an out-of-sample volatility forecast. Contrary to prior evidence, however, the results in this paper suggest that no asymmetric GARCH model is superior to the basic GARCH(1,1) model. It is our prior knowledge that, for equity returns, it is unlikely that positive and negative shocks have the same impact on volatility. In order to reflect this intuition, we implement three diagnostic tests for volatility models: the Sign Bias Test, the Negative Size Bias Test, and the Positive Size Bias Test, as well as tests against the alternatives of QGARCH and GJR-GARCH. The asymmetry test results indicate that the sign and the size of the unexpected return shock do not influence current volatility differently, which contradicts our presumption that there are asymmetric effects in stock market volatility. This result is in line with various diagnostic tests designed to determine whether the GARCH(1,1) volatility estimates adequately represent the data. The diagnostic tests in section 2 indicate that the GARCH(1,1) model for weekly KOSPI returns is robust to the misspecification test. We also investigate two representative asymmetric GARCH models, the QGARCH and GJR-GARCH models, for out-of-sample forecasting performance. The out-of-sample forecasting ability test reveals that no single model clearly outperforms the others. The GJR-GARCH and QGARCH models give mixed results in forecasting ability on all four criteria across all forecast horizons considered. Also, the predictive accuracy test of Diebold and Mariano, based on both absolute and squared prediction errors, suggests that the forecasts from the linear and asymmetric GARCH models need not be significantly different from each other.
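
    The abstract relies on the sign and size bias diagnostics to check whether a symmetric GARCH(1,1) misses asymmetric effects. Below is a minimal sketch of the Engle-Ng auxiliary regression behind those three tests, assuming statsmodels is available and operating on standardized residuals from some previously fitted model; the synthetic residuals are purely illustrative.

```python
import numpy as np
import statsmodels.api as sm  # assumed available for the auxiliary regression

def engle_ng_sign_bias(std_resid):
    """Engle-Ng sign and size bias tests on standardized GARCH residuals.

    Regresses z_t^2 on S_{t-1} (negative-shock dummy), S_{t-1}*z_{t-1} and
    (1 - S_{t-1})*z_{t-1}; significant coefficients point to asymmetry that
    the fitted symmetric GARCH has missed."""
    z = np.asarray(std_resid)
    z2 = z[1:] ** 2
    lag = z[:-1]
    neg = (lag < 0).astype(float)
    X = sm.add_constant(np.column_stack([neg, neg * lag, (1 - neg) * lag]))
    fit = sm.OLS(z2, X).fit()
    names = ["sign bias", "negative size bias", "positive size bias"]
    return dict(zip(names, zip(fit.tvalues[1:], fit.pvalues[1:])))

# Illustrative use with simulated (symmetric) residuals
rng = np.random.default_rng(3)
for test, (tstat, pval) in engle_ng_sign_bias(rng.standard_normal(2000)).items():
    print(f"{test:20s} t = {tstat:6.2f}  p = {pval:.3f}")
```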

  14. Analysis of Spin Financial Market by GARCH Model

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2013-01-01

    A spin model is used for simulations of financial markets. To determine return volatility in the spin financial market we use the GARCH model often used for volatility estimation in empirical finance. We apply Bayesian inference, performed by the Markov Chain Monte Carlo method, to the parameter estimation of the GARCH model. It is found that volatility determined by the GARCH model exhibits "volatility clustering", also observed in real financial markets. Using volatility determined by the GARCH model, we examine the mixture-of-distribution hypothesis (MDH) suggested for the asset return dynamics. We find that the returns standardized by volatility are approximately standard normal random variables. Moreover, we find that the absolute standardized returns show no significant autocorrelation. These findings are consistent with the view of the MDH for the return dynamics.

  15. Option valuation with the simplified component GARCH model

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.

    We introduce the Simplified Component GARCH (SC-GARCH) option pricing model, show and discuss sufficient conditions for non-negativity of the conditional variance, apply it to low-frequency and high-frequency financial data, and consider the option valuation, comparing the model performance...

  16. Stochastic GARCH dynamics describing correlations between stocks

    Science.gov (United States)

    Prat-Ortega, G.; Savel'ev, S. E.

    2014-09-01

    The ARCH and GARCH processes have been successfully used for modelling price dynamics such as stock returns or foreign exchange rates. Analysing the long range correlations between stocks, we propose a model, based on the GARCH process, which is able to describe the main characteristics of the stock price correlations, including the mean, variance, probability density distribution and the noise spectrum.

  17. Volatility estimation using a rational GARCH model

    Directory of Open Access Journals (Sweden)

    Tetsuya Takaishi

    2018-03-01

    The rational GARCH (RGARCH) model has been proposed as an alternative GARCH model that captures the asymmetric property of volatility. In addition to the previously proposed RGARCH model, we propose an alternative RGARCH model called the RGARCH-Exp model that is more stable when dealing with outliers. We measure the performance of the volatility estimation by a loss function calculated using realized volatility as a proxy for true volatility and compare the RGARCH-type models with other asymmetric models such as the EGARCH and GJR models. We conduct empirical studies of six stocks on the Tokyo Stock Exchange and find that volatility estimation using the RGARCH-type models outperforms the GARCH model and is comparable to other asymmetric GARCH models.
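
    The study scores volatility models with a loss function computed against realized volatility as a proxy for true volatility. Two standard losses used for this kind of comparison are MSE and QLIKE; the sketch below evaluates them for two hypothetical variance-forecast series. All data here are simulated and not taken from the paper.

```python
import numpy as np

def mse_loss(rv, h):
    """Mean squared error between the realized-variance proxy and the model variance."""
    return np.mean((rv - h) ** 2)

def qlike_loss(rv, h):
    """QLIKE loss, commonly used because it is robust to noise in the volatility proxy."""
    return np.mean(np.log(h) + rv / h)

# Simulated "true" variance path, a noisy realized-variance proxy, and two model forecasts
rng = np.random.default_rng(4)
true_var = 1.0 + 0.5 * np.sin(np.linspace(0, 10, 500))
rv = true_var * rng.chisquare(10, 500) / 10            # noisy realized-variance proxy
h_a = true_var * np.exp(rng.normal(0, 0.20, 500))      # stand-in for one model's forecasts
h_b = true_var * np.exp(rng.normal(0, 0.10, 500))      # stand-in for a better model's forecasts

for name, h in [("model A", h_a), ("model B", h_b)]:
    print(f"{name}: MSE={mse_loss(rv, h):.4f}  QLIKE={qlike_loss(rv, h):.4f}")
```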

  18. Bayesian Non-Parametric Mixtures of GARCH(1,1) Models

    Directory of Open Access Journals (Sweden)

    John W. Lau

    2012-01-01

    Traditional GARCH models describe volatility levels that evolve smoothly over time, generated by a single GARCH regime. However, nonstationary time series data may exhibit abrupt changes in volatility, suggesting changes in the underlying GARCH regimes. Further, the number and times of regime changes are not always obvious. This article outlines a nonparametric mixture of GARCH models that is able to estimate the number and timing of volatility regime changes by mixing over the Poisson-Kingman process. This process is a generalisation of the Dirichlet process typically used in nonparametric models for time-dependent data; it provides a richer clustering structure, and its application to time series data is novel. Inference is Bayesian, and a Markov chain Monte Carlo algorithm to explore the posterior distribution is described. The methodology is illustrated on the Standard and Poor's 500 financial index.

  19. Realized GARCH: A Complete Model of Returns and Realized Measures of Volatility

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Huang, Zhuo (Albert); Shek, Howard Howan

    GARCH models have been successful in modeling financial returns. Still, much is to be gained by incorporating a realized measure of volatility in these models. In this paper we introduce a new framework for the joint modeling of returns and realized measures of volatility. The Realized GARCH framework nests most GARCH models as special cases and is, in many ways, a natural extension of standard GARCH models. We pay special attention to linear and log-linear Realized GARCH specifications. This class of models has several attractive features. It retains the simplicity and tractability ... to latent volatility. This equation facilitates a simple modeling of the dependence between returns and future volatility that is commonly referred to as the leverage effect. An empirical application with DJIA stocks and an exchange traded index fund shows that a simple Realized GARCH structure leads...

  20. Nonlinear GARCH model and 1/f noise

    Science.gov (United States)

    Kononovicius, A.; Ruseckas, J.

    2015-06-01

    Auto-regressive conditionally heteroskedastic (ARCH) family models are still used by practitioners in business and economic policy making as conditional volatility forecasting models, and they continue to attract research interest. In this contribution we consider the well-known GARCH(1,1) process and its nonlinear modifications, reminiscent of the NGARCH model. We investigate the possibility of reproducing power-law statistics of the probability density function and power spectral density using ARCH family models. For this purpose we derive stochastic differential equations from the GARCH processes under consideration. We find the obtained equations to be similar to a general class of stochastic differential equations known to reproduce power-law statistics. We show that the linear GARCH(1,1) process has a power-law distribution, but its power spectral density is Brownian-noise-like. The nonlinear modifications, however, exhibit both a power-law distribution and a power spectral density of the 1/f^β form, including 1/f noise.
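
    The paper's analysis of the spectral properties is analytical, but the behaviour is easy to probe numerically. As a crude, purely illustrative check (parameters made up), the sketch below simulates a Gaussian GARCH(1,1), computes the periodogram of absolute returns as a volatility proxy, and fits a log-log slope over a low-frequency band to estimate the spectral exponent.

```python
import numpy as np
from scipy.signal import periodogram  # assumes SciPy is available

def simulate_garch11(n, omega, alpha, beta, seed=0):
    """Simulate a Gaussian GARCH(1,1) return series (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    h = omega / (1 - alpha - beta)        # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(h) * rng.standard_normal()
        h = omega + alpha * r[t] ** 2 + beta * h
    return r

r = simulate_garch11(100_000, omega=1e-5, alpha=0.10, beta=0.88, seed=5)
freq, psd = periodogram(np.abs(r))        # spectrum of absolute returns (volatility proxy)

# Crude log-log slope over a low-frequency band: PSD ~ 1/f^beta gives slope -beta
band = (freq > 1e-4) & (freq < 1e-2)
slope = np.polyfit(np.log(freq[band]), np.log(psd[band]), 1)[0]
print(f"estimated spectral exponent: {-slope:.2f}")
```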

  1. Volatility forecasting with the wavelet transformation algorithm GARCH model: Evidence from African stock markets

    Directory of Open Access Journals (Sweden)

    Mohd Tahir Ismail

    2016-06-01

    The daily returns of four African countries' stock market indices for the period January 2, 2000, to December 31, 2014, were employed to compare the GARCH(1,1) model and a newly proposed Maximal Overlap Discrete Wavelet Transform (MODWT)-GARCH(1,1) model. The results showed that although both models fit the returns data well, the forecast produced by the GARCH(1,1) model underestimates the observed returns, whereas the newly proposed MODWT-GARCH(1,1) model generates an accurate forecast of the observed returns. The results generally showed that the newly proposed MODWT-GARCH(1,1) model best fits the returns series for these African countries. Hence the proposed MODWT-GARCH should be applied in other contexts to further verify its validity.

  2. Photonuclear physics at the Bonn synchrotrons. Present status and future plans at the Bonn synchrotron

    International Nuclear Information System (INIS)

    Mecking, B.A.

    1983-11-01

    The activities in the field of photonuclear physics at the Bonn 500 MeV and 2.5 GeV synchrotrons are reviewed. The experiments concentrate on photodisintegration and pion-photoproduction reactions on light nuclei. (orig.)

  3. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    International Nuclear Information System (INIS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Davé, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  4. Evaluating Portfolio Value-At-Risk Using Semi-Parametric GARCH Models

    NARCIS (Netherlands)

    J.V.K. Rombouts; M.J.C.M. Verbeek (Marno)

    2009-01-01

    In this paper we examine the usefulness of multivariate semi-parametric GARCH models for evaluating the Value-at-Risk (VaR) of a portfolio with arbitrary weights. We specify and estimate several alternative multivariate GARCH models for daily returns on the S&P 500 and Nasdaq indexes.

  5. Realized Beta GARCH: A Multivariate GARCH Model with Realized Measures of Volatility and CoVolatility

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger; Voev, Valeri

    We introduce a multivariate GARCH model that utilizes and models realized measures of volatility and covolatility. The realized measures extract information contained in high-frequency data that is particularly beneficial during periods with variation in volatility and covolatility. Applying the ...

  6. Estimation of value-at-risk for energy commodities via fat-tailed GARCH models

    International Nuclear Information System (INIS)

    Hung, Jui-Cheng; Lee, Ming-Chih; Liu, Hung-Chun

    2008-01-01

    The choice of an appropriate distribution for return innovations is important in VaR applications owing to its ability to directly affect the estimation quality of the required quantiles. This study investigates the influence of fat-tailed innovation process on the performance of one-day-ahead VaR estimates using three GARCH models (GARCH-N, GARCH-t and GARCH-HT). Daily spot prices of five energy commodities (WTI crude oil, Brent crude oil, heating oil 2, propane and New York Harbor Conventional Gasoline Regular) are used to compare the accuracy and efficiency of the VaR models. Empirical results suggest that for asset returns that exhibit leptokurtic and fat-tailed features, the VaR estimates generated by the GARCH-HT models have good accuracy at both low and high confidence levels. Additionally, MRSB indicates that the GARCH-HT model is more efficient than alternatives for most cases at high confidence levels. These findings suggest that the heavy-tailed distribution is more suitable for energy commodities, particularly VaR calculation. (author)
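
    The record compares one-day-ahead VaR estimates under normal and fat-tailed innovation distributions. A minimal sketch of that calculation for a zero-mean GARCH(1,1) return, assuming SciPy is available; the parameter values, today's return and variance are all hypothetical, and the Student-t quantile is rescaled so the innovation has unit variance.

```python
import numpy as np
from scipy.stats import norm, t  # assumed available

def one_day_var(h_next, level=0.99, dist="t", nu=6.0):
    """One-day-ahead Value-at-Risk (reported as a positive loss) for a zero-mean
    GARCH return with next-day conditional variance h_next."""
    if dist == "normal":
        q = norm.ppf(1 - level)
    else:  # Student-t, rescaled to unit variance
        q = t.ppf(1 - level, df=nu) * np.sqrt((nu - 2) / nu)
    return -q * np.sqrt(h_next)

# Illustrative variance update from hypothetical GARCH(1,1) estimates
omega, alpha, beta = 1e-6, 0.08, 0.90
r_today, h_today = -0.021, 2.5e-4
h_next = omega + alpha * r_today ** 2 + beta * h_today
print(f"99% VaR (normal):  {one_day_var(h_next, dist='normal'):.4f}")
print(f"99% VaR (t, nu=6): {one_day_var(h_next, dist='t'):.4f}")
```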

  7. GARCH and Irregularly Spaced Data

    NARCIS (Netherlands)

    Meddahi, N.; Renault, E.; Werker, B.J.M.

    2003-01-01

    An exact discretization of continuous time stochastic volatility processes observed at irregularly spaced times is used to give insights on how a coherent GARCH model can be specified for such data. The relation of our approach with those in the existing literature is studied.

  8. Bitcoin, gold and the dollar: A GARCH volatility analysis

    OpenAIRE

    Dyhrberg, Anne Haubo

    2015-01-01

    This paper explores the financial asset capabilities of bitcoin using GARCH models. The initial model showed several similarities to gold and the dollar indicating hedging capabilities and advantages as a medium of exchange. The asymmetric GARCH showed that bitcoin may be useful in risk management and ideal for risk averse investors in anticipation of negative shocks to the market. Overall bitcoin has a place on the financial markets and in portfolio management as it can be classified as some...

  9. 12 May 1979: 'Atom Disaster' in Garching

    International Nuclear Information System (INIS)

    Gerwin, R.

    1979-01-01

    'Die Zeit', in its number 25 (15 June 1979), dealt with the mass media's reporting on a fire of 12 May 1979 in a laboratory of the Technical University of Munich in Garching (district of Munich). Under the headline 'After the fire in Garching - sensations at any price. The fire provoked the journalists' imagination', Robert Gerwin, the press agent of the Max Planck Society, dealt with the question of whether the TV and newspaper reports on this fire were grossly exaggerated, suggesting to readers a sensation that was no sensation at all. The press section of the Max Planck Society kindly gave its permission to print this article in 'brandschutz'. (orig./HP) [de]

  10. Estimating the Volatility of Cocoa Price Return with ARCH and GARCH Models

    Directory of Open Access Journals (Sweden)

    Lya Aklimawati

    2013-08-01

    Dynamics of market change as a result of market liberalization have an impact on agricultural commodity price fluctuations. High volatility in cocoa price movements reflects its price and market risk. Because of price and market uncertainty, market players face difficulties in making business development decisions. This research was conducted to (1) understand the characteristics of cocoa price movements in cocoa futures trading, and (2) analyze cocoa price volatility using ARCH- and GARCH-type models. The research was carried out by direct observation of the pattern of cocoa price movements in futures trading and by volatility analysis based on secondary data. The data were derived from Intercontinental Exchange (ICE) Futures U.S. reports. The analysis showed that GARCH is the best model to predict the volatility of average cocoa price returns, because it meets the criteria of three diagnostic checks: the ARCH-LM test, the residual autocorrelation test and the residual normality test. Based on the ARCH-LM test, GARCH(1,1) did not exhibit heteroscedasticity, because the p-values of the χ² statistic (0.640139) and the F-statistic (0.640449) were greater than 0.05. The residual autocorrelation test indicated that the residuals of GARCH(1,1) were random, because the Ljung-Box (LB) statistic at the 36th lag is smaller than the χ² statistic. The residual normality test concluded that the residuals of GARCH(1,1) were normally distributed, because AR(29), MA(29), RESID(-1)², and GARCH(-1) were significant at the 5% significance level. An increasing volatility value indicates high potential risk. Price risk can be reduced by managing financial instruments in futures trading, such as forward and futures contracts, and by hedging. The results also give market players an insight for decision making and for determining the timing of hedging. Key words: volatility, price, cocoa, GARCH, risk, futures trading
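
    One of the diagnostic checks named above is Engle's ARCH-LM test. A self-contained sketch of that test, written directly with NumPy and SciPy to avoid assuming any particular econometrics library, is shown below; the white-noise input is purely illustrative.

```python
import numpy as np
from scipy.stats import chi2  # assumed available

def arch_lm_test(resid, lags=4):
    """Engle's ARCH-LM test: regress squared residuals on their own lags;
    under the null of no remaining ARCH effects, n * R^2 ~ chi2(lags)."""
    e2 = np.asarray(resid) ** 2
    y = e2[lags:]
    X = np.column_stack([np.ones_like(y)] +
                        [e2[lags - k:-k] for k in range(1, lags + 1)])
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta_hat
    r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
    stat = len(y) * r2
    return stat, chi2.sf(stat, df=lags)

# Illustrative check on white-noise residuals (no ARCH effects by construction)
rng = np.random.default_rng(6)
stat, pval = arch_lm_test(rng.standard_normal(1000))
print(f"ARCH-LM statistic = {stat:.2f}, p-value = {pval:.3f}")
```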

  11. Modeling Markov Switching ARMA-GARCH Neural Networks Models and an Application to Forecasting Stock Returns

    Directory of Open Access Journals (Sweden)

    Melike Bildirici

    2014-01-01

    The study has two aims. The first aim is to propose a family of nonlinear GARCH models that incorporate fractional integration and asymmetric power properties into MS-GARCH processes. The second purpose of the study is to augment the MS-GARCH type models with artificial neural networks to benefit from their universal approximation properties and achieve improved forecasting accuracy. Therefore, the proposed Markov-switching MS-ARMA-FIGARCH, APGARCH, and FIAPGARCH processes are further augmented with MLP, Recurrent NN, and Hybrid NN type neural networks. The MS-ARMA-GARCH family and MS-ARMA-GARCH-NN family are utilized for modeling the daily stock returns in an emerging market, the Istanbul Stock Index (ISE100). Forecast accuracy is evaluated in terms of MAE, MSE, and RMSE error criteria and Diebold-Mariano equal forecast accuracy tests. The results suggest that the fractionally integrated and asymmetric power counterparts of Gray's MS-GARCH model provided promising results, while the best results are obtained for their neural network based counterparts. Further, among the models analyzed, those based on the Hybrid-MLP and Recurrent-NN, the MS-ARMA-FIAPGARCH-HybridMLP and MS-ARMA-FIAPGARCH-RNN, provided the best forecast performances over the baseline single-regime GARCH models and, further, over Gray's MS-GARCH model. Therefore, the models are promising for various economic applications.

  12. Modelling Multivariate Autoregressive Conditional Heteroskedasticity with the Double Smooth Transition Conditional Correlation GARCH Model

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we propose a multivariate GARCH model with a time-varying conditional correlation structure. The new Double Smooth Transition Conditional Correlation GARCH model extends the Smooth Transition Conditional Correlation GARCH model of Silvennoinen and Teräsvirta (2005) by including another variable according to which the correlations change smoothly between states of constant correlations. A Lagrange multiplier test is derived to test the constancy of correlations against the DSTCC-GARCH model, and another one to test for another transition in the STCC-GARCH framework. In addition, other specification tests, with the aim of aiding the model building procedure, are considered. Analytical expressions for the test statistics and the required derivatives are provided. The model is applied to a selection of world stock indices, and it is found that time is an important factor affecting...

  13. Forecasting Tehran stock exchange volatility; Markov switching GARCH approach

    Science.gov (United States)

    Abounoori, Esmaiel; Elmi, Zahra (Mila); Nademi, Younes

    2016-03-01

    This paper evaluates several GARCH models regarding their ability to forecast volatility on the Tehran Stock Exchange (TSE). These include GARCH models with both Gaussian and fat-tailed residual conditional distributions, concerning their ability to describe and forecast volatility over horizons from 1 day to 22 days. Results indicate that the AR(2)-MRSGARCH-GED model outperforms other models at the one-day horizon. The AR(2)-MRSGARCH-GED and AR(2)-MRSGARCH-t models also outperform other models at the 5-day horizon. At the 10-day horizon, three AR(2)-MRSGARCH models outperform the others. For the 22-day forecast horizon, results indicate no differences between MRSGARCH models and standard GARCH models. Regarding the risk management out-of-sample evaluation (95% VaR), a few models seem to provide reasonable and accurate VaR estimates at the 1-day horizon, with a coverage rate close to the nominal level. According to the risk management loss functions, there is not a uniformly most accurate model.

  14. RF Negative Ion Source Development at IPP Garching

    International Nuclear Information System (INIS)

    Kraus, W.; McNeely, P.; Berger, M.; Christ-Koch, S.; Falter, H. D.; Fantz, U.; Franzen, P.; Froeschle, M.; Heinemann, B.; Leyer, S.; Riedl, R.; Speth, E.; Wuenderlich, D.

    2007-01-01

    IPP Garching is heavily involved in the development of an ion source for Neutral Beam Heating of the ITER Tokamak. RF-driven ion sources have been successfully developed and are in operation on the ASDEX-Upgrade Tokamak for positive-ion-based NBH by the NB Heating group at IPP Garching. Building on this experience, an RF-driven H- ion source has been under development at IPP Garching as an alternative to the ITER reference design ion source. The number of test beds devoted to source development for ITER has increased from one (BATMAN) with the addition of two further test beds (MANITU, RADI). This paper contains descriptions of the three test beds. Results on diagnostic development using laser photodetachment and cavity ring-down spectroscopy are given for BATMAN. The latest results for long-pulse development on MANITU are presented, including the longest pulse to date (600 s). Details of the source modifications necessitated by pulses in excess of 100 s are also given. The newest test bed, RADI, is still being commissioned and only technical details of this test bed are included in this paper. The final topic of the paper is an investigation into the effects of biasing the plasma grid.

  15. Experimental investigation of photoreactions in Bonn

    International Nuclear Information System (INIS)

    Anton, G.

    1988-01-01

    In this paper the investigation of nuclear photoreactions in Bonn is reviewed. At the new stretcher ring ELSA considerable experimental improvements are possible with an energy tagged photon beam in conjunction with a large acceptance hadron detector. The future program and the new experimental facilities are described

  16. COMPARISON OF THE BLACK-SCHOLES OPTION MODEL AND THE GARCH OPTION MODEL ON THE INDONESIA STOCK EXCHANGE

    Directory of Open Access Journals (Sweden)

    Riko Hendrawan

    2017-03-01

    The purpose of this research was to compare the accuracy of the Black-Scholes option model and the GARCH option model for stock options, utilizing data from Astra, BCA, Indofood and Telkom at the Indonesia Stock Exchange. The intraday stock returns of Astra, BCA, Indofood and Telkom exhibited an overwhelming presence of volatility clustering, suggesting that the GARCH model had an effect which best corresponded with the actual price. The best model was constructed using an ARIMA model and the best lag in the GARCH model was extracted. The findings from this research showed that, comparing the average percentage mean squared errors of the GARCH option model and the Black-Scholes option model, the former was found more accurate than the latter. The GARCH model improved the average percentage mean squared errors of the Black-Scholes model: the one-month option showed a 28.10 percent improvement, the two-month option 23.30 percent and the three-month option 20 percent.

  17. Deep Water Survey Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The deep water biodiversity surveys explore and describe the biodiversity of the bathy- and bentho-pelagic nekton using Midwater and bottom trawls centered in the...

  18. Clusters, groups, and filaments in the Chandra deep field-south up to redshift 1

    International Nuclear Information System (INIS)

    Dehghan, S.; Johnston-Hollitt, M.

    2014-01-01

    We present a comprehensive structure detection analysis of the 0.3 deg² area of the MUSYC-ACES field, which covers the Chandra Deep Field-South (CDFS). Using a density-based clustering algorithm on the MUSYC and ACES photometric and spectroscopic catalogs, we find 62 overdense regions up to redshifts of 1, including clusters, groups, and filaments. We also present the detection of a relatively small void of ∼10 Mpc² at z ∼ 0.53. All structures are confirmed using the DBSCAN method, including the detection of nine structures previously reported in the literature. We present a catalog of all structures present, including their central position, mean redshift, velocity dispersions, and classification based on their morphological and spectroscopic distributions. In particular, we find 13 galaxy clusters and 6 large groups/small clusters. Comparison of these massive structures with published XMM-Newton imaging (where available) shows that 80% of these structures are associated with diffuse, soft-band (0.4-1 keV) X-ray emission, including 90% of all objects classified as clusters. The presence of soft-band X-ray emission in these massive structures (M₂₀₀ ≥ 4.9 × 10¹³ M☉) provides a strong independent confirmation of our methodology and classification scheme. In the closest two clusters identified (z < 0.13) high-quality optical imaging from the Deep2c field of the Garching-Bonn Deep Survey reveals the cD galaxies and demonstrates that they sit at the center of the detected X-ray emission. Nearly 60% of the clusters, groups, and filaments are detected in the known enhanced density regions of the CDFS at z ≅ 0.13, 0.52, 0.68, and 0.73. Additionally, all of the clusters, bar the most distant, are found in these overdense redshift regions. Many of the clusters and groups exhibit signs of ongoing formation seen in their velocity distributions, position within the detected cosmic web, and in one case through the presence of tidally disrupted central galaxies.

  19. Clusters, groups, and filaments in the Chandra deep field-south up to redshift 1

    Energy Technology Data Exchange (ETDEWEB)

    Dehghan, S.; Johnston-Hollitt, M., E-mail: siamak.dehghan@vuw.ac.nz [School of Chemical and Physical Sciences, Victoria University of Wellington, P.O. Box 600, Wellington 6140 (New Zealand)

    2014-03-01

    We present a comprehensive structure detection analysis of the 0.3 deg² area of the MUSYC-ACES field, which covers the Chandra Deep Field-South (CDFS). Using a density-based clustering algorithm on the MUSYC and ACES photometric and spectroscopic catalogs, we find 62 overdense regions up to redshifts of 1, including clusters, groups, and filaments. We also present the detection of a relatively small void of ∼10 Mpc² at z ∼ 0.53. All structures are confirmed using the DBSCAN method, including the detection of nine structures previously reported in the literature. We present a catalog of all structures present, including their central position, mean redshift, velocity dispersions, and classification based on their morphological and spectroscopic distributions. In particular, we find 13 galaxy clusters and 6 large groups/small clusters. Comparison of these massive structures with published XMM-Newton imaging (where available) shows that 80% of these structures are associated with diffuse, soft-band (0.4-1 keV) X-ray emission, including 90% of all objects classified as clusters. The presence of soft-band X-ray emission in these massive structures (M₂₀₀ ≥ 4.9 × 10¹³ M☉) provides a strong independent confirmation of our methodology and classification scheme. In the closest two clusters identified (z < 0.13) high-quality optical imaging from the Deep2c field of the Garching-Bonn Deep Survey reveals the cD galaxies and demonstrates that they sit at the center of the detected X-ray emission. Nearly 60% of the clusters, groups, and filaments are detected in the known enhanced density regions of the CDFS at z ≅ 0.13, 0.52, 0.68, and 0.73. Additionally, all of the clusters, bar the most distant, are found in these overdense redshift regions. Many of the clusters and groups exhibit signs of ongoing formation seen in their velocity distributions, position within the detected cosmic web, and in one case through the presence of tidally...

  20. Determining the Best Arch/Garch Model and Comparing JKSE with Stock Index in Developed Countries

    Directory of Open Access Journals (Sweden)

    Kharisya Ayu Effendi

    2015-09-01

    The slow movement of Indonesian economic growth in 2014 was due to several factors: internally, the high interest rates in Indonesia, and externally, the US plan to raise the Fed rate this year. However, the JKSE shows a sharply increasing trend from the beginning of 2014 until the second quarter of 2015, although it continues to fluctuate, if insignificantly. The purpose of this research is to determine the best ARCH/GARCH model for the JKSE and for stock indexes in developed countries (FTSE, Nasdaq and STI), and then to compare the JKSE with those indexes. The best ARCH/GARCH models obtained are GARCH(1,2) for the JKSE, GARCH(2,2) for the FTSE, GARCH(1,1) for the NASDAQ and GARCH(2,1) for the STI. The comparison shows that even though the JKSE fluctuates at moderate levels, its trend is upward. This differs from the other stock indexes, which fluctuate highly and tend to have a downward trend.

  1. Conditional density estimation using fuzzy GARCH models

    NARCIS (Netherlands)

    Almeida, R.J.; Bastürk, N.; Kaymak, U.; Costa Sousa, da J.M.; Kruse, R.; Berthold, M.R.; Moewes, C.; Gil, M.A.; Grzegorzewski, P.; Hryniewicz, O.

    2013-01-01

    Time series data exhibits complex behavior including non-linearity and path-dependency. This paper proposes a flexible fuzzy GARCH model that can capture different properties of data, such as skewness, fat tails and multimodality in one single model. Furthermore, additional information and

  2. International evidence on crude oil price dynamics. Applications of ARIMA-GARCH models

    International Nuclear Information System (INIS)

    Mohammadi, Hassan; Su, Lixian

    2010-01-01

    We examine the usefulness of several ARIMA-GARCH models for modeling and forecasting the conditional mean and volatility of weekly crude oil spot prices in eleven international markets over the 1/2/1997-10/3/2009 period. In particular, we investigate the out-of-sample forecasting performance of four volatility models (GARCH, EGARCH, APARCH and FIGARCH) over January 2009 to October 2009. Forecasting results are somewhat mixed, but in most cases the APARCH model outperforms the others. Also, the conditional standard deviation captures the volatility in oil returns better than the traditional conditional variance. Finally, shocks to conditional volatility dissipate at an exponential rate, which is consistent with covariance-stationary GARCH models, rather than at the slow hyperbolic rate implied by the FIGARCH alternative. (author)

  3. Modelling world gold prices and USD foreign exchange relationship using multivariate GARCH model

    Science.gov (United States)

    Ping, Pung Yean; Ahmad, Maizah Hura Binti

    2014-12-01

    The world gold price is a popular investment commodity whose series has often been modeled using univariate models. The objective of this paper is to show that there is a co-movement between the gold price and the USD foreign exchange rate. Using the effect of the USD foreign exchange rate on the gold price, a model that can be used to forecast future gold prices is developed. For this purpose, the current paper proposes a multivariate (bivariate) GARCH model. Using daily prices of both series from 01.01.2000 to 05.05.2014, a causal relation between the two series under study is found and a bivariate GARCH model is produced.

  4. Hedging with futures: an application of GARCH to european electricity markets

    OpenAIRE

    G. Zanotti; G. Gabbi; M. Geranio

    2009-01-01

    European electricity markets have been subject to a broad deregulation process in the last few decades. We analyse hedging policies implemented through different hedge ratio estimations. More specifically, we compare naïve, ordinary least squares, and GARCH conditional variance and correlation models to test whether GARCH models lead to higher variance reduction in a context of highly time-varying volatility, as is the case in electricity markets. Our results show that the choice of the hedge ratio esti...

  5. Semiparametric Inference in a GARCH-in-Mean Model

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; Dahl, Christian Møller; Iglesias, Emma M.

    A new semiparametric estimator for an empirical asset pricing model with a general nonparametric risk-return tradeoff and a GARCH process for the underlying volatility is introduced. The estimator does not rely on any initial parametric estimator of the conditional mean function, and this feature facilitates the derivation of asymptotic theory under possible nonlinearity of unspecified form of the risk-return tradeoff. Besides the nonlinear GARCH-in-mean effect, our specification accommodates exogenous regressors that are typically used as conditioning variables entering linearly in the mean equation ... with the fully parametric approach and the iterative semiparametric approach using a parametric initial estimate proposed by Conrad and Mammen (2008). An empirical application to the daily S&P 500 stock market returns suggests that the linear relation between conditional expected return and conditional...

  6. WFIRST: Science from Deep Field Surveys

    Science.gov (United States)

    Koekemoer, Anton; Foley, Ryan; WFIRST Deep Field Working Group

    2018-01-01

    WFIRST will enable deep-field imaging across much larger areas than those previously obtained with Hubble, opening up completely new areas of parameter space for extragalactic deep fields, including cosmology, supernova and galaxy evolution science. The instantaneous field of view of the Wide Field Instrument (WFI) is about 0.3 square degrees, which would, for example, yield an Ultra Deep Field (UDF) reaching similar depths at visible and near-infrared wavelengths to that obtained with Hubble, over an area about 100-200 times larger, for a comparable investment in time. Moreover, wider fields on scales of 10-20 square degrees could achieve depths comparable to large HST surveys at medium depths such as GOODS and CANDELS, and would enable multi-epoch supernova science that could be matched in area to LSST Deep Drilling fields or other large survey areas. Such fields may benefit from being placed at locations in the sky that have ancillary multi-band imaging or spectroscopy from other facilities, from the ground or in space. The WFIRST Deep Fields Working Group has been examining the science considerations for various types of deep fields that may be obtained with WFIRST, and we present here a summary of the properties of different locations in the sky that may be considered for future deep fields with WFIRST.

  7. Ranking multivariate GARCH models by problem dimension

    NARCIS (Netherlands)

    M. Caporin (Massimiliano); M.J. McAleer (Michael)

    2010-01-01

    In the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. The two most widely known and used are the Scalar BEKK model of Engle and Kroner (1995) and Ding and Engle (2001), and the DCC model of Engle (2002). Some recent research has begun to

  8. Symposium on disruptive instabilities at Garching

    International Nuclear Information System (INIS)

    Lackner, K.

    1979-01-01

    The phenomenon of disruptive instabilities was investigated with special care at IPP Garching. Following the lectures and panel sessions, it appears suitable to subdivide the disruptive phenomena into four classes: (1) internal disruptions (the so-called sawtooth oscillations); (2) the so-called reconnection disruptions; (3) large disruptions; (4) small disruptions. The four forms in which the phenomena appear are briefly explained. (GG) [de]

  9. Limit theory for the sample autocorrelations and extremes of a GARCH (1,1) process

    NARCIS (Netherlands)

    Mikosch, T; Starica, C

    2000-01-01

    The asymptotic theory for the sample autocorrelations and extremes of a GARCH(1,1) process is provided. Special attention is given to the case when the sum of the ARCH and GARCH parameters is close to 1, that is, when one is close to an infinite-variance marginal distribution. This situation has

  10. Preliminary analysis on hybrid Box-Jenkins - GARCH modeling in forecasting gold price

    Science.gov (United States)

    Yaziz, Siti Roslindar; Azizan, Noor Azlinna; Ahmad, Maizah Hura; Zakaria, Roslinazairimah; Agrawal, Manju; Boland, John

    2015-02-01

    Gold has been regarded as a valuable precious metal and the most popular commodity for a healthy investment return. Hence, the analysis and prediction of the gold price become very significant to investors. This study is a preliminary analysis of the gold price and its volatility that focuses on the performance of hybrid Box-Jenkins models together with GARCH in analyzing and forecasting the gold price. The Box-Cox formula is used as the data transformation method due to its potential for normalizing data, stabilizing variance and reducing heteroscedasticity, using a 41-year daily gold price series starting 2 January 1973. Our study indicates that the proposed hybrid ARIMA-GARCH model with t-innovations can be a new potential approach for forecasting the gold price. This finding proves the strength of GARCH in handling volatility in the gold price as well as in overcoming the non-linearity limitation of Box-Jenkins modeling.
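
    The abstract names the Box-Cox transform as the pre-processing step before the ARIMA-GARCH stage. A minimal sketch of that step, assuming SciPy is available and using a synthetic positive price series in place of the actual gold data:

```python
import numpy as np
from scipy.stats import boxcox  # assumed available

# Synthetic, strictly positive "gold-like" price path (illustrative stand-in for real data)
rng = np.random.default_rng(9)
prices = 400 * np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))

# Box-Cox transform as a variance-stabilizing step; lambda chosen by maximum likelihood
transformed, lam = boxcox(prices)
print(f"estimated Box-Cox lambda = {lam:.3f}")

# Differences of the transformed series would then feed the ARIMA-GARCH stage
transformed_returns = np.diff(transformed)
```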

  11. ITER technical advisory committee meeting at Garching

    International Nuclear Information System (INIS)

    Fujiwara, M.

    1999-01-01

    The ITER Technical Advisory Committee meeting took place on 24-27 February at the Garching Joint Work Site. According to the discussions at the ITER meeting in Yokohama in October 1998, the Technical Advisory Committee was requested to conduct a thorough review of the document 'Options for the reduced technical objectives / reduced cost ITER'

  12. Modeling Covariance Breakdowns in Multivariate GARCH

    OpenAIRE

    Jin, Xin; Maheu, John M

    2014-01-01

    This paper proposes a flexible way of modeling dynamic heterogeneous covariance breakdowns in multivariate GARCH (MGARCH) models. During periods of normal market activity, volatility dynamics are governed by an MGARCH specification. A covariance breakdown is any significant temporary deviation of the conditional covariance matrix from its implied MGARCH dynamics. This is captured through a flexible stochastic component that allows for changes in the conditional variances, covariances and impl...

  13. Assessing Efficiency of D-Vine Copula ARMA-GARCH Method in Value at Risk Forecasting: Evidence from PSE Listed Companies

    Directory of Open Access Journals (Sweden)

    Václav Klepáč

    2015-01-01

    The article points out the possibilities of using a static D-Vine copula ARMA-GARCH model for the estimation of 1-day-ahead market Value at Risk. For illustration, we use data on four companies listed on the Prague Stock Exchange from 2010 to 2014. The vine copula approach allows us to construct a high-dimensional copula, i.e. a multivariate probability distribution for the process innovations, from both elliptical and Archimedean bivariate copulas. Given the shortage of existing domestic results and comparison studies with advanced volatility-driven VaR forecasts, we backtested the D-Vine copula ARMA-GARCH model against rolling out-of-sample VaR forecasts, from October 2012 to April 2014, of chosen benchmark models, e.g. multivariate VAR-GO-GARCH and VAR-DCC-GARCH models and univariate ARMA-GARCH type models. Backtesting via the Kupiec and Christoffersen procedures suggests that model sophistication supports accuracy only in the case of univariate modeling, working with non-basic GARCH models and innovations with leptokurtic distributions. Multivariate VAR-type models and static copula vines performed worse in this backtesting comparison than the selected univariate ARMA-GARCH models, i.e. they overestimated the level of actual market risk, probably due to a hardly tractable time-varying dependence structure.
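
    The backtesting procedures named here include Kupiec's unconditional coverage test. A self-contained sketch of that test, with simulated returns and a constant hypothetical VaR line purely for illustration, is shown below; conventions (losses and VaR both expressed as positive numbers) are assumptions of the sketch, not taken from the paper.

```python
import numpy as np
from scipy.stats import chi2  # assumed available

def kupiec_pof_test(returns, var_forecasts, level=0.99):
    """Kupiec proportion-of-failures test for a VaR backtest.

    A violation occurs when the realized loss exceeds the forecast VaR
    (both as positive numbers). Under correct unconditional coverage the
    likelihood-ratio statistic is asymptotically chi2(1)."""
    losses = -np.asarray(returns)
    violations = losses > np.asarray(var_forecasts)
    n, x = len(violations), violations.sum()
    p = 1 - level
    pi_hat = x / n
    if pi_hat in (0.0, 1.0):           # degenerate case: LR statistic undefined
        return np.nan, np.nan
    lr = -2 * ((n - x) * np.log((1 - p) / (1 - pi_hat)) + x * np.log(p / pi_hat))
    return lr, chi2.sf(lr, df=1)

# Illustrative backtest: simulated returns against a constant hypothetical 99% VaR line
rng = np.random.default_rng(7)
rets = rng.standard_normal(500) * 0.01
var99 = np.full(500, 0.0233)           # roughly the 99% VaR of N(0, 0.01^2)
print(kupiec_pof_test(rets, var99))
```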

  14. Estimating Price Volatility Structure in Iran’s Meat Market: Application of General GARCH Models

    Directory of Open Access Journals (Sweden)

    Z. Rasouli Birami

    2016-10-01

    Introduction: Over the past few years, the price volatility of agricultural products and food markets has attracted the attention of many researchers and policy makers. This growing attention started with the food price crisis of 2007 and 2008, when major agricultural products faced accelerated price increases and then rapid decreases. This paper focuses on the price volatility of major commodities related to three market levels of Iran's meat market, including hay (the input level), calf and sheep (the wholesale level) and beef and mutton (the retail level). In particular, efforts are made to find more appropriate models for explaining the behavior of the volatility of the return series and to identify which return series are more volatile. The effects of good and bad news on the volatility of prices in each return series are also studied. Materials and Methods: Different GARCH-type models have been considered the best for modeling the volatility of return series. Nonlinear GARCH models were introduced to capture the effects of good and bad news separately. The paper uses several GARCH-type models, including GARCH, Exponential GARCH (EGARCH), GJR-GARCH, Threshold GARCH (TGARCH), Simple Asymmetric GARCH (SAGARCH), Power GARCH (PGARCH), Non-linear GARCH (NGARCH), Asymmetric Power GARCH (APGARCH) and Non-linear Power GARCH (NPGARCH), to model the volatility of the hay, calf, sheep, beef and mutton return series. The data on hay, calf, sheep, and beef and mutton monthly prices are published by Iran's livestock support firm. The paper uses monthly data over the sample period May 1992 to March 2014. Results and Discussion: Descriptive statistics of the studied return series show evidence of skewness and kurtosis. The results here show that all the series have fat tails. The significant p-values for the Ljung-Box Q-statistics mean that autocorrelation exists in the squared residuals. The presence of unit roots in the return series is confirmed by the

  15. Modeling returns volatility: Realized GARCH incorporating realized risk measure

    Science.gov (United States)

    Jiang, Wei; Ruan, Qingsong; Li, Jianfeng; Li, Ye

    2018-06-01

    This study applies realized GARCH models, introducing several risk measures of intraday returns into the measurement equation, to model the daily volatility of E-mini S&P 500 index futures returns. Besides using the conventional realized measures, realized volatility and the realized kernel, as our benchmarks, we also use generalized realized risk measures: the realized absolute deviation and two realized tail risk measures, realized value-at-risk and realized expected shortfall. The empirical results show that realized GARCH models using the generalized realized risk measures provide better volatility estimation in-sample and substantial improvement in volatility forecasting out-of-sample. In particular, realized expected shortfall performs best among all of the alternative realized measures. Our empirical results reveal that future volatility may be more attributable to present losses (risk measures). The results are robust to different sample estimation windows.
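
    The realized measures fed into such models are computed from intraday returns. A rough sketch of that computation is given below; the definitions are common textbook conventions and crude stand-ins, not the exact measures used in the paper, and the five-minute returns are simulated.

```python
import numpy as np

def realized_measures(intraday_returns):
    """Daily realized measures from one day's intraday returns (illustrative definitions).

    Returns realized variance, a scaled realized absolute deviation (one common
    convention), and a crude empirical lower-tail quantile as a stand-in for a
    realized tail-risk measure."""
    r = np.asarray(intraday_returns)
    m = len(r)
    rv = np.sum(r ** 2)                                  # realized variance
    rad = np.sqrt(np.pi / (2 * m)) * np.sum(np.abs(r))   # scaled realized absolute deviation
    tail = -np.quantile(r, 0.05)                         # intraday 5% quantile loss
    return rv, rad, tail

# Illustrative: 78 five-minute returns for one trading day
rng = np.random.default_rng(8)
rv, rad, tail = realized_measures(rng.normal(0, 0.001, 78))
print(f"RV={rv:.2e}  RAD={rad:.2e}  intraday 5% quantile loss={tail:.2e}")
```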

  16. Importance of the macroeconomic variables for variance prediction: A GARCH-MIDAS approach

    DEFF Research Database (Denmark)

    Asgharian, Hossein; Hou, Ai Jun; Javed, Farrukh

    2013-01-01

    This paper aims to examine the role of macroeconomic variables in forecasting the return volatility of the US stock market. We apply the GARCH-MIDAS (Mixed Data Sampling) model to examine whether information contained in macroeconomic variables can help to predict short-term and long-term components of the return variance.

  17. Polarized target physics at the Bonn electron accelerators

    International Nuclear Information System (INIS)

    Meyer, W.

    1988-12-01

    At the BONN 2.5 GeV electron synchrotron experiments with polarized nucleon targets have a long tradition. Starting with measurements of the target asymmetry in single pion photoproduction off polarized protons, resp. neutrons, the experiments have been concentrated on photodisintegration measurements of polarized deuterons. Parallel to these activities a considerable progress in the field of the target technology, e.g. cryogenics and target materials, has been made, by which all the measurements have profitted enormously. Especially the development of the new target material ammonia has allowed the first use of a polarized deuteron (ND 3 ) target in an intense electron beam. The construction of a frozen spin target, which will be used in combination with a tagged polarized photon beam, makes a new generation of polarized target experiments in photon induced reactions possible. Together with electron scattering off polarized deuterons and neutrons they will be a main activity in the physics program at the new stretcher accelerator ELSA in BONN. (orig.)

  18. From Kyoto to Bonn: implications and opportunities for renewable energy

    International Nuclear Information System (INIS)

    Pugliese, M.; Cameron, J.; Wilder, M.

    2001-01-01

    The article discusses the need for the uptake of renewable energy sources to increase to meet the commitments made in Bonn in July for compliance with the Kyoto Protocol. The article is presented under the sub-headings of: (i) the Bonn Agreement; (ii) implications and opportunities for renewable energy; (iii) the commercialisation and mainstreaming of renewable energy technologies; (iv) greenhouse gas-reducing projects (v) renewable portfolio standards and renewable certificate trading programmes; (vi) increased funding for product and technology development; (vii) emissions trading; (viii) domestic legislation and initiatives; (ix) regulatory effects in Annex I countries specifically impacting renewable energy (UK, Germany, Australia, EU Renewable Energy Law) and (x) US efforts in the absence of a national climate policy

  19. Robust Ranking of Multivariate GARCH Models by Problem Dimension

    NARCIS (Netherlands)

    M. Caporin (Massimiliano); M.J. McAleer (Michael)

    2012-01-01

    During the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. We provide an empirical comparison of alternative MGARCH models,

  20. MEMBANDINGKAN RISIKO SISTEMATIS MENGGUNAKAN CAPM-GARCH DAN CAPM-EGARCH

    Directory of Open Access Journals (Sweden)

    VIKY AMELIAH

    2017-11-01

    Full Text Available In making stock investments, investors usually pay attention to the rate of return and the risk of the investment. Risk is calculated here using the capital asset pricing model (CAPM) combined with GARCH and EGARCH volatility models. The data used in this study are secondary data in the form of daily closing prices, the JII price index and the monthly SBI rate. All data were processed using Matlab 13. The research sample consists of 6 flagship stocks for the period 2013-2017, i.e. ADHI, SMGR, UNTR, BSDE, ICBP, KLBF. The conclusion of the research is that the beta of each stock is an aggressive beta, because each beta is greater than 1. For the CAPM-GARCH and CAPM-EGARCH returns, Kalbe Farma stock (KLBF) has a small beta and a large return, meaning that the GARCH and EGARCH models equally predict that KLBF carries the least risk and the largest return among the six stocks.

  1. PERHITUNGAN NILAI BETA DARI BEBERAPA SAHAM UNGGULAN DI INDONESIA DENGAN MENGGUNAKAN METODE GARCH

    Directory of Open Access Journals (Sweden)

    NI KADEK PUSPITAYANTI

    2016-05-01

    Full Text Available The objective of investment in the capital market is to acquire dividends and capital gains. In practice, the return on investment in risky assets is uncertain. This is because of the difficulty of analyzing and predicting stock returns and losses due to the factors that affect the movement of the stock price, such as economic, political, social and security factors. A model that can be used by investors to predict expected stock returns is the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model. In this study, calculations of the beta values of some leading stocks in Indonesia using GARCH are presented. The data used are secondary daily data for a sample of 5 stocks: PT Unilever Indonesia Tbk, PT Indosat Tbk, PT Indofood Sukses Makmur Tbk, PT Telkom Indonesia Tbk and PT Holcim Indonesia Tbk. The results show that the beta values of these five stocks, estimated with the GARCH method, are greater than the market beta over the period from 23 September 2013 to 24 September 2014.

  2. On Diagnostic Checking of Vector ARMA-GARCH Models with Gaussian and Student-t Innovations

    Directory of Open Access Journals (Sweden)

    Yongning Wang

    2013-04-01

    Full Text Available This paper focuses on the diagnostic checking of vector ARMA (VARMA) models with multivariate GARCH errors. For a fitted VARMA-GARCH model with Gaussian or Student-t innovations, we derive the asymptotic distributions of autocorrelation matrices of the cross-product vector of standardized residuals. This is different from the traditional approach that employs only the squared series of standardized residuals. We then study two portmanteau statistics, called Q1(M) and Q2(M), for model checking. A residual-based bootstrap method is provided and demonstrated as an effective way to approximate the diagnostic checking statistics. Simulations are used to compare the performance of the proposed statistics with other methods available in the literature. In addition, we also investigate the effect of GARCH shocks on checking a fitted VARMA model. Empirical sizes and powers of the proposed statistics are investigated and the results suggest a procedure of using jointly Q1(M) and Q2(M) in diagnostic checking. The bivariate time series of FTSE 100 and DAX index returns is used to illustrate the performance of the proposed portmanteau statistics. The results show that it is important to consider the cross-product series of standardized residuals and GARCH effects in model checking.
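
    For comparison, the conventional check that uses only the squared standardized residuals can be reproduced in a few lines of Python; the residual series below is a simulated placeholder, not the output of a fitted VARMA-GARCH model.

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

# Placeholder standardized residuals from some fitted model
rng = np.random.default_rng(7)
std_resid = rng.standard_normal(1000)

# Ljung-Box portmanteau test on the squared standardized residuals:
# significant p-values would point to remaining ARCH structure.
print(acorr_ljungbox(std_resid ** 2, lags=[5, 10, 20]))
```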

  3. Volatility in GARCH Models of Business Tendency Index

    Science.gov (United States)

    Wahyuni, Dwi A. S.; Wage, Sutarman; Hartono, Ateng

    2018-01-01

    This paper aims to obtain a model of the business tendency index by considering a volatility factor. The volatility factor is detected by ARCH (Autoregressive Conditional Heteroscedasticity); the ARCH check was performed using the Lagrange multiplier test. The model is the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model, which is able to handle the volatility problem by incorporating past squared residuals and past residual variances.
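
    The ARCH check via the Lagrange multiplier test corresponds to the het_arch routine in statsmodels; a self-contained sketch on simulated residuals, not the business tendency index data, is:

```python
import numpy as np
from statsmodels.stats.diagnostic import het_arch

# Simulated residuals with mild ARCH(1)-type volatility clustering
rng = np.random.default_rng(3)
n = 500
resid = np.zeros(n)
for t in range(1, n):
    sigma2 = 0.2 + 0.6 * resid[t - 1] ** 2      # conditional variance recursion
    resid[t] = np.sqrt(sigma2) * rng.standard_normal()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_arch(resid, nlags=12)
print(f"ARCH-LM statistic = {lm_stat:.2f}, p-value = {lm_pvalue:.4f}")
# A small p-value rejects homoscedasticity, motivating a GARCH-type model.
```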

  4. Present status and future plans at the Bonn synchrotron

    International Nuclear Information System (INIS)

    Mecking, B.A.

    1984-01-01

    The electron stretcher ring ELSA presently under construction in Bonn and the associated experimental equipment are described. The ring will deliver electron and photon beams with close to 100% duty cycle for experiments in elementary particle and photonuclear physics. (author)

  5. An analysis of the Bonn agreement. Background information for evaluating business implications

    International Nuclear Information System (INIS)

    Torvanger, Asbjoern

    2001-01-01

    This report has been commissioned by the World Business Council for Sustainable Development and written in August 2001. The aim of the report is to present and analyze the newest developments in the climate negotiations, particularly from part two of the sixth Conference of the Parties to the Climate Convention in Bonn in July 2001, and to provide background information to evaluate what the "Bonn agreement" means for business. The report is organized as a collection of slides with supporting text explaining the background and contents of each slide. (author)

  6. The influence of freezer storage of urine samples on the BONN-Risk-Index for calcium oxalate crystallization.

    Science.gov (United States)

    Laube, Norbert; Zimmermann, Diana J

    2004-01-01

    This study was performed to quantify the effect of a 1-week freezer storage of urine on its calcium oxalate crystallization risk. Calcium oxalate is the most common urinary stone material observed in urolithiasis patients in western and affluent countries. The BONN-Risk-Index of calcium oxalate crystallization risk in human urine is determined from a crystallization experiment performed on untreated native urine samples. We tested the influence of a 1-week freezing on the BONN-Risk-Index value as well as the effect of the sample freezing on the urinary osmolality. In vitro crystallization experiments in 49 native urine samples from stone-forming and non-stone forming individuals were performed in order to determine their calcium oxalate crystallization risk according to the BONN-Risk-Index approach. Comparison of the results derived from original sample investigations with those obtained from the thawed aliquots by statistical evaluation shows that i) no significant deviation from linearity between both results exists and ii) both results are identical by statistical means. This is valid for both, the BONN-Risk-Index and the osmolality data. The differences in the BONN-Risk-Index results of both procedures of BONN-Risk-Index determination, however, exceed the clinically acceptable difference. Thus, determination of the urinary calcium oxalate crystallization risk from thawed urine samples cannot be recommended.

  7. Deep Extragalactic VIsible Legacy Survey (DEVILS): Motivation, Design and Target Catalogue

    Science.gov (United States)

    Davies, L. J. M.; Robotham, A. S. G.; Driver, S. P.; Lagos, C. P.; Cortese, L.; Mannering, E.; Foster, C.; Lidman, C.; Hashemizadeh, A.; Koushan, S.; O'Toole, S.; Baldry, I. K.; Bilicki, M.; Bland-Hawthorn, J.; Bremer, M. N.; Brown, M. J. I.; Bryant, J. J.; Catinella, B.; Croom, S. M.; Grootes, M. W.; Holwerda, B. W.; Jarvis, M. J.; Maddox, N.; Meyer, M.; Moffett, A. J.; Phillipps, S.; Taylor, E. N.; Windhorst, R. A.; Wolf, C.

    2018-06-01

    The Deep Extragalactic VIsible Legacy Survey (DEVILS) is a large spectroscopic campaign at the Anglo-Australian Telescope (AAT) aimed at bridging the near and distant Universe by producing the highest completeness survey of galaxies and groups at intermediate redshifts (0.3 < z < 1.0). Our sample consists of ˜60,000 galaxies to Y<21.2 mag, over ˜6 deg2 in three well-studied deep extragalactic fields (Cosmic Origins Survey field, COSMOS, Extended Chandra Deep Field South, ECDFS and the X-ray Multi-Mirror Mission Large-Scale Structure region, XMM-LSS - all Large Synoptic Survey Telescope deep-drill fields). This paper presents the broad experimental design of DEVILS. Our target sample has been selected from deep Visible and Infrared Survey Telescope for Astronomy (VISTA) Y-band imaging (VISTA Deep Extragalactic Observations, VIDEO and UltraVISTA), with photometry measured by PROFOUND. Photometric star/galaxy separation is done on the basis of NIR colours, and has been validated by visual inspection. To maximise our observing efficiency for faint targets we employ a redshift feedback strategy, which continually updates our target lists, feeding back the results from the previous night's observations. We also present an overview of the initial spectroscopic observations undertaken in late 2017 and early 2018.

  8. Consistency and asymptotic normality of maximum likelihood estimators of a multiplicative time-varying smooth transition correlation GARCH model

    DEFF Research Database (Denmark)

    Silvennoinen, Annestiina; Terasvirta, Timo

    A new multivariate volatility model that belongs to the family of conditional correlation GARCH models is introduced. The GARCH equations of this model contain a multiplicative deterministic component to describe long-run movements in volatility and, in addition, the correlations...

  9. Modeling the exchange rate of the euro against the dollar using the ARCH/GARCH models

    Directory of Open Access Journals (Sweden)

    Kovačević Radovan

    2016-01-01

    Full Text Available The analysis of time series with conditional heteroskedasticity (changeable time variability, conditional variance instability, the phenomenon called volatility) is the main task of ARCH and GARCH models. The aim of these models is to calculate some of the volatility indicators needed for financial decisions. This paper examines the performance of the generalized autoregressive conditional heteroscedasticity (GARCH) model in modeling the daily changes of the log exchange rate of the euro against the dollar. Several GARCH models with different numbers of parameters have been applied to model the daily log exchange rate returns of the euro. A characteristic of the estimated GARCH models is that the obtained coefficients of lagged squared residuals and the conditional variance parameters in the equation of conditional variance have strong statistical significance. The sum of these two coefficient estimates is close to unity, which is typical for GARCH models applied to data on financial asset returns. This means that shocks in the conditional variance equation will be long lasting. The large value of the sum of these two coefficients shows that high rates of positive or negative returns lead to a large forecasted value of the variance over a prolonged period. The asymmetric EGARCH (1,1) model showed the best results in modeling the euro exchange rate returns. The asymmetry term in the conditional variance equation of this model is negative and statistically significant. A negative value of this term suggests that positive shocks have less impact on the conditional variance than negative shocks. The asymmetric EGARCH (1,1) model provides evidence of a leverage effect.
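
    A minimal sketch of fitting an EGARCH(1,1) with Student-t innovations of the kind discussed above, using the Python arch package on simulated log returns standing in for the euro/dollar series:

```python
import numpy as np
from arch import arch_model

# Stand-in for daily log exchange rate returns, in per cent (simulated, not real data)
rng = np.random.default_rng(11)
log_returns = rng.standard_t(df=7, size=2500) * 0.6

# EGARCH(1,1) with Student-t innovations; the log-variance formulation means no
# positivity restrictions are needed, and the asymmetry (o) term captures leverage.
am = arch_model(log_returns, mean="Constant", vol="EGARCH", p=1, o=1, q=1, dist="t")
res = am.fit(disp="off")
print(res.summary())

# One-day-ahead conditional variance forecast
print(res.forecast(horizon=1).variance.iloc[-1])
```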

  10. Day of the week effect on the Zimbabwe Stock Exchange: A non-linear GARCH analysis

    Directory of Open Access Journals (Sweden)

    Batsirai Winmore Mazviona

    2015-11-01

    Full Text Available This study analysed the day of the week effect on the Zimbabwe Stock Exchange (ZSE) by taking into account the volatility of returns. The purpose of the study was to establish whether daily mean returns across a trading week differ from each other. We employ a non-linear approach in modelling the day of the week effects. In particular, we used the Generalised Autoregressive Conditional Heteroscedasticity (GARCH) and the Exponential GARCH (EGARCH) models. We used industrial and mining daily closing index data from 19 February 2009 to 31 December 2013. The data was retrieved from the ZSE website. EViews 7 software was utilised for data analysis. In order to test the null hypothesis of equality of daily mean returns, a Wald test was carried out. The Wald F-statistic rejected the null hypothesis of equality of mean returns for the industrial index. We found the traditional negative Monday and positive Friday effect for the industrial index in the GARCH (1,1) and EGARCH (1,1) models. The GARCH (1,1) model detected a negative Friday effect and the EGARCH (1,1) model detected a negative Wednesday effect for the mining index. We found evidence of model dependency for the mining index results.

  11. Maximum likelihood unit rooting test in the presence GARCH: A new test with increased power

    OpenAIRE

    Cook , Steve

    2008-01-01

    The literature on testing the unit root hypothesis in the presence of GARCH errors is extended. A new test based upon the combination of local-to-unity detrending and joint maximum likelihood estimation of the autoregressive parameter and GARCH process is presented. The finite sample distribution of the test is derived under alternative decisions regarding the deterministic terms employed. Using Monte Carlo simulation, the newly proposed ML t-test is shown to exhibit increased power.

  12. Computed tomography of surface related radionuclide distributions ('BONN'-tomography)

    International Nuclear Information System (INIS)

    Bockisch, A.; Koenig, R.

    1989-01-01

    A method called the 'BONN' tomography is described to produce planar projections of circular activity distributions using standard single photon emission computed tomography. The clinical value of the method is demonstrated for bone scans of the jaw, thorax, and pelvis. Numerical or projection-related problems are discussed. (orig.)

  13. The ITER Management Advisory Committee (MAC) meeting in Garching

    International Nuclear Information System (INIS)

    Yoshikawa, M.

    1999-01-01

    The ITER management advisory committee meeting was held on 22-23 July 1999 in Garching, Germany. The main topics were the ITER EDA status, task status summary and work program, joint fund, information technology needs at the ITER joint work sites, the disposition of R and D components and a schedule of ITER meetings

  14. Ranking Multivariate GARCH Models by Problem Dimension: An Empirical Evaluation

    NARCIS (Netherlands)

    M. Caporin (Massimiliano); M.J. McAleer (Michael)

    2011-01-01

    In the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of models,

  15. THE SPITZER DEEP, WIDE-FIELD SURVEY

    International Nuclear Information System (INIS)

    Ashby, M. L. N.; Brodwin, M.; Stern, D.; Griffith, R.; Eisenhardt, P.; Gorjian, V.; Kozlowski, S.; Kochanek, C. S.; Bock, J. J.; Borys, C.; Brand, K.; Grogin, N. A.; Brown, M. J. I.; Cool, R.; Cooray, A.; Croft, S.; Dey, A.; Eisenstein, D.; Gonzalez, A. H.; Ivison, R. J.

    2009-01-01

    The Spitzer Deep, Wide-Field Survey (SDWFS) is a four-epoch infrared survey of 10 deg² in the Boötes field of the NOAO Deep Wide-Field Survey using the IRAC instrument on the Spitzer Space Telescope. SDWFS, a Spitzer Cycle 4 Legacy project, occupies a unique position in the area-depth survey space defined by other Spitzer surveys. The four epochs that make up SDWFS permit, for the first time, the selection of infrared-variable and high proper motion objects over a wide field on timescales of years. Because of its large survey volume, SDWFS is sensitive to galaxies out to z ∼ 3 with relatively little impact from cosmic variance for all but the richest systems. The SDWFS data sets will thus be especially useful for characterizing galaxy evolution beyond z ∼ 1.5. This paper explains the SDWFS observing strategy and data processing, presents the SDWFS mosaics and source catalogs, and discusses some early scientific findings. The publicly released, full-depth catalogs contain 6.78, 5.23, 1.20, and 0.96 × 10⁵ distinct sources detected to the average 5σ, 4''-diameter, aperture-corrected limits of 19.77, 18.83, 16.50, and 15.82 Vega mag at 3.6, 4.5, 5.8, and 8.0 μm, respectively. The SDWFS number counts and color-color distribution are consistent with other, earlier Spitzer surveys. At the 6 minute integration time of the SDWFS IRAC imaging, >50% of isolated Faint Images of the Radio Sky at Twenty cm radio sources and >80% of on-axis XBoötes sources are detected out to 8.0 μm. Finally, we present the four highest proper motion IRAC-selected sources identified from the multi-epoch imaging, two of which are likely field brown dwarfs of mid-T spectral class.

  16. Forecasting Volatility of USD/MUR Exchange Rate using a GARCH ...

    African Journals Online (AJOL)

    that both distributions may forecast quite well with a slight advantage to the. GARCH(1 ... Financial time series tend to exhibit certain characteristic features such as volatility ... Heteroscedasticity-adjusted MAE to evaluate the forecasts. Chuanga et .... the Student's-t distribution or the GED with the following probability density.

  17. Synchrotron radiation laboratories at the Bonn electron accelerators. a status report

    Science.gov (United States)

    Hormes, J.

    1987-07-01

    At the Physikalisches Institut of the University in Bonn experiments with synchrotron radiation were carried out ever since 1962. At the moment (June 1986) all work takes place in the SR-laboratory at the 2.5 GeV synchrotron. A 3.5 GeV stretcher ring (ELSA) is under construction and will come into operation at the end of 1986. This accelerator will also run as a storage ring for synchrotron radiation experiments and a laboratory to be used at this machine is also under consideration. The SR experiments which are carried out in Bonn try to take advantage of the fact that we are still using a high energy synchrotron for our work. Besides basic research also applied work is done using synchrotron radiation even as a production tool for X-ray lithography.

  18. A 6-cm deep sky survey

    International Nuclear Information System (INIS)

    Fomalont, E.B.; Kellermann, K.I.; Wall, J.V.

    1983-01-01

    In order to extend radio source counts to lower flux density, the authors have used the VLA to survey a small region of sky at 4.885 GHz (6 cm) to a limiting flux density of 50 μJy. Details of this deep survey are given in the paper by kellermann et al. (these proceedings). In addition, they have observed 10 other nearby fields to a limiting flux density of 350 μJy in order to provide better statistics on sources of intermediate flux density. (Auth.)

  19. Bridge Structure Deformation Prediction Based on GNSS Data Using Kalman-ARIMA-GARCH Model.

    Science.gov (United States)

    Xin, Jingzhou; Zhou, Jianting; Yang, Simon X; Li, Xiaoqing; Wang, Yu

    2018-01-19

    Bridges are an essential part of the ground transportation system. Health monitoring is fundamentally important for the safety and service life of bridges. A large amount of structural information is obtained from various sensors using sensing technology, and the data processing has become a challenging issue. To improve the prediction accuracy of bridge structure deformation based on data mining and to accurately evaluate the time-varying characteristics of bridge structure performance evolution, this paper proposes a new method for bridge structure deformation prediction, which integrates the Kalman filter, autoregressive integrated moving average model (ARIMA), and generalized autoregressive conditional heteroskedasticity (GARCH). Firstly, the raw deformation data is directly pre-processed using the Kalman filter to reduce the noise. After that, the linear recursive ARIMA model is established to analyze and predict the structure deformation. Finally, the nonlinear recursive GARCH model is introduced to further improve the accuracy of the prediction. Simulation results based on measured sensor data from the Global Navigation Satellite System (GNSS) deformation monitoring system demonstrated that: (1) the Kalman filter is capable of denoising the bridge deformation monitoring data; (2) the prediction accuracy of the proposed Kalman-ARIMA-GARCH model is satisfactory, where the mean absolute error increases only from 3.402 mm to 5.847 mm with the increment of the prediction step; and (3) in comparison to the Kalman-ARIMA model, the Kalman-ARIMA-GARCH model results in superior prediction accuracy as it includes partial nonlinear characteristics (heteroscedasticity); the mean absolute error of five-step prediction using the proposed model is improved by 10.12%. This paper provides a new way for structural behavior prediction based on data processing, which can lay a foundation for the early warning of bridge health monitoring system based on sensor data using sensing
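
    The three-stage Kalman-ARIMA-GARCH pipeline described above can be prototyped compactly; the sketch below uses a simple scalar random-walk Kalman filter, statsmodels for the ARIMA step and the Python arch package for the GARCH step, with simulated deformation data standing in for the GNSS measurements.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(0)

# Simulated noisy deformation series (stand-in for GNSS monitoring data, in mm)
true_state = np.cumsum(rng.normal(scale=0.3, size=600))
observations = true_state + rng.normal(scale=2.0, size=600)

# Stage 1: scalar Kalman filter (local-level / random-walk model) to denoise
def kalman_filter(y, process_var=0.1, obs_var=4.0):
    x_hat, p = y[0], 1.0
    filtered = np.empty_like(y)
    for t, obs in enumerate(y):
        p = p + process_var                   # predict
        k = p / (p + obs_var)                 # Kalman gain
        x_hat = x_hat + k * (obs - x_hat)     # update
        p = (1.0 - k) * p
        filtered[t] = x_hat
    return filtered

denoised = kalman_filter(observations)

# Stage 2: ARIMA on the denoised series for the (linear) mean dynamics
arima_fit = ARIMA(denoised, order=(1, 1, 1)).fit()
print(arima_fit.params)

# Stage 3: GARCH(1,1) on the ARIMA residuals for the remaining heteroscedasticity
garch_fit = arch_model(arima_fit.resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")
print(garch_fit.params)

# Combined one-step-ahead prediction: ARIMA mean forecast plus GARCH variance forecast
mean_forecast = arima_fit.forecast(steps=1)
var_forecast = garch_fit.forecast(horizon=1).variance.iloc[-1, 0]
print("next-step deformation forecast (mm):", float(mean_forecast[0]),
      "with conditional variance:", float(var_forecast))
```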

  20. Bridge Structure Deformation Prediction Based on GNSS Data Using Kalman-ARIMA-GARCH Model

    Directory of Open Access Journals (Sweden)

    Jingzhou Xin

    2018-01-01

    Full Text Available Bridges are an essential part of the ground transportation system. Health monitoring is fundamentally important for the safety and service life of bridges. A large amount of structural information is obtained from various sensors using sensing technology, and the data processing has become a challenging issue. To improve the prediction accuracy of bridge structure deformation based on data mining and to accurately evaluate the time-varying characteristics of bridge structure performance evolution, this paper proposes a new method for bridge structure deformation prediction, which integrates the Kalman filter, autoregressive integrated moving average model (ARIMA), and generalized autoregressive conditional heteroskedasticity (GARCH). Firstly, the raw deformation data is directly pre-processed using the Kalman filter to reduce the noise. After that, the linear recursive ARIMA model is established to analyze and predict the structure deformation. Finally, the nonlinear recursive GARCH model is introduced to further improve the accuracy of the prediction. Simulation results based on measured sensor data from the Global Navigation Satellite System (GNSS) deformation monitoring system demonstrated that: (1) the Kalman filter is capable of denoising the bridge deformation monitoring data; (2) the prediction accuracy of the proposed Kalman-ARIMA-GARCH model is satisfactory, where the mean absolute error increases only from 3.402 mm to 5.847 mm with the increment of the prediction step; and (3) in comparison to the Kalman-ARIMA model, the Kalman-ARIMA-GARCH model results in superior prediction accuracy as it includes partial nonlinear characteristics (heteroscedasticity); the mean absolute error of five-step prediction using the proposed model is improved by 10.12%. This paper provides a new way for structural behavior prediction based on data processing, which can lay a foundation for the early warning of bridge health monitoring system based on sensor data

  1. Excited baryon program at the Bonn electron stretcher accelerator ELSA

    International Nuclear Information System (INIS)

    Menze, D.

    1989-01-01

    The Bonn electron stretcher accelerator ELSA is the first of a new generation of continuous beam machines in the GeV region. It is qualified for experiments with tagged photons and with polarized electrons on polarized nucleons to investigate the electromagnetic properties of excited baryon resonances

  2. A comparison of monthly precipitation point estimates at 6 locations in Iran using integration of soft computing methods and GARCH time series model

    Science.gov (United States)

    Mehdizadeh, Saeid; Behmanesh, Javad; Khalili, Keivan

    2017-11-01

    Precipitation plays an important role in determining the climate of a region. Precise estimation of precipitation is required to manage and plan water resources, as well as other related applications such as hydrology, climatology, meteorology and agriculture. Time series of hydrologic variables such as precipitation are composed of deterministic and stochastic parts. Despite this fact, the stochastic part of the precipitation data is not usually considered in modeling of precipitation process. As an innovation, the present study introduces three new hybrid models by integrating soft computing methods including multivariate adaptive regression splines (MARS), Bayesian networks (BN) and gene expression programming (GEP) with a time series model, namely generalized autoregressive conditional heteroscedasticity (GARCH) for modeling of the monthly precipitation. For this purpose, the deterministic (obtained by soft computing methods) and stochastic (obtained by GARCH time series model) parts are combined with each other. To carry out this research, monthly precipitation data of Babolsar, Bandar Anzali, Gorgan, Ramsar, Tehran and Urmia stations with different climates in Iran were used during the period of 1965-2014. Root mean square error (RMSE), relative root mean square error (RRMSE), mean absolute error (MAE) and determination coefficient (R2) were employed to evaluate the performance of conventional/single MARS, BN and GEP, as well as the proposed MARS-GARCH, BN-GARCH and GEP-GARCH hybrid models. It was found that the proposed novel models are more precise than single MARS, BN and GEP models. Overall, MARS-GARCH and BN-GARCH models yielded better accuracy than GEP-GARCH. The results of the present study confirmed the suitability of proposed methodology for precise modeling of precipitation.
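
    The hybrid idea, a deterministic fit combined with a GARCH model for the stochastic residuals, can be sketched as follows; a simple harmonic (seasonal) regression stands in for the MARS/BN/GEP component, and the precipitation series is simulated rather than the Iranian station data.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(5)

# Simulated monthly precipitation with an annual cycle plus noise (stand-in data, in mm)
months = np.arange(600)
precip = 60 + 30 * np.sin(2 * np.pi * months / 12) + rng.gamma(2.0, 8.0, size=600)

# Deterministic part: harmonic regression (a simple stand-in for MARS/BN/GEP)
X = np.column_stack([np.ones_like(months, dtype=float),
                     np.sin(2 * np.pi * months / 12),
                     np.cos(2 * np.pi * months / 12)])
coef, *_ = np.linalg.lstsq(X, precip, rcond=None)
deterministic = X @ coef

# Stochastic part: GARCH(1,1) on the residuals of the deterministic fit
residuals = precip - deterministic
garch_res = arch_model(residuals, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")

# Hybrid one-step-ahead estimate: deterministic prediction for the next month,
# with the GARCH forecast quantifying the conditional uncertainty around it.
next_month = len(months)
x_next = np.array([1.0, np.sin(2 * np.pi * next_month / 12), np.cos(2 * np.pi * next_month / 12)])
point = x_next @ coef
variance = garch_res.forecast(horizon=1).variance.iloc[-1, 0]
print(f"hybrid forecast: {point:.1f} mm, conditional std: {variance ** 0.5:.1f} mm")
```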

  3. Value-at-Risk for South-East Asian Stock Markets: Stochastic Volatility vs. GARCH

    Directory of Open Access Journals (Sweden)

    Paul Bui Quang

    2018-04-01

    Full Text Available This study compares the performance of several methods to calculate the Value-at-Risk of the six main ASEAN stock markets. We use filtered historical simulations, GARCH models, and stochastic volatility models. The out-of-sample performance is analyzed by various backtesting procedures. We find that simpler models fail to produce sufficient Value-at-Risk forecasts, which appears to stem from several econometric properties of the return distributions. With stochastic volatility models, we obtain better Value-at-Risk forecasts compared to GARCH. The quality varies over forecasting horizons and across markets. This indicates that, despite a regional proximity and homogeneity of the markets, index volatilities are driven by different factors.
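
    A compact sketch of the filtered historical simulation VaR mentioned above, using a GARCH(1,1) filter from the Python arch package on simulated index returns rather than the ASEAN data:

```python
import numpy as np
from arch import arch_model

# Simulated daily index returns, in per cent (stand-in for an ASEAN index)
rng = np.random.default_rng(21)
returns = rng.standard_t(df=5, size=1500) * 0.9

# Step 1: fit a GARCH(1,1) filter and extract standardized residuals
fit = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
std_resid = (returns - fit.params["mu"]) / fit.conditional_volatility

# Step 2: rescale the empirical distribution of standardized residuals by
# tomorrow's volatility forecast (filtered historical simulation)
sigma_next = np.sqrt(fit.forecast(horizon=1).variance.iloc[-1, 0])
simulated_returns = fit.params["mu"] + sigma_next * std_resid

# Step 3: the 1-day 99% VaR is the 1% quantile of the simulated distribution
var_99 = -np.quantile(simulated_returns, 0.01)
print(f"1-day 99% FHS VaR: {var_99:.2f}%")
```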

  4. Deep X-ray lithography for the fabrication of microstructures at ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Pantenburg, F.J. E-mail: pantenburg@imt.fzk.de; Mohr, J

    2001-07-21

    Two beamlines at the Electron Stretcher Accelerator (ELSA) of Bonn University are dedicated for the production of microstructures by deep X-ray lithography with synchrotron radiation. They are equipped with state-of-the-art X-ray scanners, maintained and used by Forschungszentrum Karlsruhe. Polymer microstructure heights between 30 and 3000 μm are manufactured regularly for research and industrial projects. This requires different characteristic energies. Therefore, ELSA operates routinely at 1.6, 2.3 and 2.7 GeV, for high-resolution X-ray mask fabrication, deep and ultra-deep X-ray lithography, respectively. The experimental setup, as well as the structure quality of deep and ultra deep X-ray lithographic microstructures are described.

  5. Deep X-ray lithography for the fabrication of microstructures at ELSA

    Science.gov (United States)

    Pantenburg, F. J.; Mohr, J.

    2001-07-01

    Two beamlines at the Electron Stretcher Accelerator (ELSA) of Bonn University are dedicated for the production of microstructures by deep X-ray lithography with synchrotron radiation. They are equipped with state-of-the-art X-ray scanners, maintained and used by Forschungszentrum Karlsruhe. Polymer microstructure heights between 30 and 3000 μm are manufactured regularly for research and industrial projects. This requires different characteristic energies. Therefore, ELSA operates routinely at 1.6, 2.3 and 2.7 GeV, for high-resolution X-ray mask fabrication, deep and ultra-deep X-ray lithography, respectively. The experimental setup, as well as the structure quality of deep and ultra deep X-ray lithographic microstructures are described.

  6. Deep X-ray lithography for the fabrication of microstructures at ELSA

    International Nuclear Information System (INIS)

    Pantenburg, F.J.; Mohr, J.

    2001-01-01

    Two beamlines at the Electron Stretcher Accelerator (ELSA) of Bonn University are dedicated for the production of microstructures by deep X-ray lithography with synchrotron radiation. They are equipped with state-of-the-art X-ray scanners, maintained and used by Forschungszentrum Karlsruhe. Polymer microstructure heights between 30 and 3000 μm are manufactured regularly for research and industrial projects. This requires different characteristic energies. Therefore, ELSA operates routinely at 1.6, 2.3 and 2.7 GeV, for high-resolution X-ray mask fabrication, deep and ultra-deep X-ray lithography, respectively. The experimental setup, as well as the structure quality of deep and ultra deep X-ray lithographic microstructures are described

  7. Deep X-ray lithography for the fabrication of microstructures at ELSA

    CERN Document Server

    Pantenburg, F J

    2001-01-01

    Two beamlines at the Electron Stretcher Accelerator (ELSA) of Bonn University are dedicated for the production of microstructures by deep X-ray lithography with synchrotron radiation. They are equipped with state-of-the-art X-ray scanners, maintained and used by Forschungszentrum Karlsruhe. Polymer microstructure heights between 30 and 3000 μm are manufactured regularly for research and industrial projects. This requires different characteristic energies. Therefore, ELSA operates routinely at 1.6, 2.3 and 2.7 GeV, for high-resolution X-ray mask fabrication, deep and ultra-deep X-ray lithography, respectively. The experimental setup, as well as the structure quality of deep and ultra deep X-ray lithographic microstructures are described.

  8. Empirical Results of Modeling EUR/RON Exchange Rate using ARCH, GARCH, EGARCH, TARCH and PARCH models

    Directory of Open Access Journals (Sweden)

    Andreea – Cristina PETRICĂ

    2017-03-01

    Full Text Available The aim of this study consists in examining the changes in the volatility of daily returns of the EUR/RON exchange rate using, on the one hand, symmetric GARCH models (ARCH and GARCH) and, on the other hand, asymmetric GARCH models (EGARCH, TARCH and PARCH), since the conditional variance is time-varying. The analysis takes into account daily quotations of the EUR/RON exchange rate over the period 4th January 1999 to 13th June 2016. Thus, we model heteroscedasticity by applying different specifications of GARCH models, followed by looking for significant parameters and low information criteria (minimum Akaike Information Criterion). All models are estimated using the maximum likelihood method under the assumption of several distributions of the innovation terms, such as: Normal (Gaussian) distribution, Student's t distribution, Generalized Error distribution (GED), Student's t with fixed df distribution, and GED with fixed parameter distribution. The predominant models turned out to be the EGARCH and PARCH models, and the empirical results point out that the best model for estimating daily returns of the EUR/RON exchange rate is EGARCH(2,1) with asymmetric order 2 under the assumption of Student's t distributed innovation terms. This can be explained by the fact that in the case of the EGARCH model, the restriction regarding the positivity of the conditional variance is automatically satisfied.
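
    The kind of specification search described above, several volatility models crossed with several innovation distributions and ranked by the Akaike criterion, can be automated; a minimal sketch with the Python arch package on simulated returns rather than the EUR/RON series:

```python
import itertools
import numpy as np
from arch import arch_model

rng = np.random.default_rng(9)
returns = rng.standard_t(df=4, size=2000) * 0.5   # stand-in daily returns, in per cent

# A subset of candidate volatility specifications (the arch package does not
# cover every variant named in the study)
specs = {
    "ARCH":   dict(vol="ARCH", p=1),
    "GARCH":  dict(vol="GARCH", p=1, q=1),
    "GJR":    dict(vol="GARCH", p=1, o=1, q=1),
    "EGARCH": dict(vol="EGARCH", p=1, o=1, q=1),
}
dists = ["normal", "t", "ged"]

results = []
for (name, kwargs), dist in itertools.product(specs.items(), dists):
    fit = arch_model(returns, mean="Constant", dist=dist, **kwargs).fit(disp="off")
    results.append((fit.aic, name, dist))

# Rank candidate models by the Akaike Information Criterion (lower is better)
for aic, name, dist in sorted(results)[:5]:
    print(f"{name:<7s} {dist:<7s} AIC = {aic:.1f}")
```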

  9. QR-GARCH-M Model for Risk-Return Tradeoff in U.S. Stock Returns and Business Cycles

    OpenAIRE

    Nyberg, Henri

    2010-01-01

    In the empirical finance literature findings on the risk return tradeoff in excess stock market returns are ambiguous. In this study, we develop a new QR-GARCH-M model combining a probit model for a binary business cycle indicator and a regime switching GARCH-in-mean model for excess stock market return with the business cycle indicator defining the regime. Estimation results show that there is statistically significant variation in the U.S. excess stock returns over the business cycle. Howev...

  10. Modelling of cayenne production in Central Java using ARIMA-GARCH

    Science.gov (United States)

    Tarno; Sudarno; Ispriyanti, Dwi; Suparti

    2018-05-01

    Some regencies/cities in Central Java Province are known as producers of horticultural crops in Indonesia; for example, Brebes is the largest shallot-producing area in Central Java, while others, such as Cilacap and Wonosobo, are areas of cayenne production. Currently, cayenne is a strategic commodity with a broad impact on Indonesian economic development. Modelling cayenne production is necessary to predict the supply of the commodity and meet society's needs, since fulfilment of those needs affects the stability of the commodity's price. A decrease in cayenne production raises the price of basic needs and ultimately affects the inflation level in the area. This research focused on autoregressive integrated moving average (ARIMA) modelling, considering the effect of autoregressive conditional heteroscedasticity (ARCH), to study cayenne production in Central Java. The result of the empirical study of ARIMA-GARCH modelling for cayenne production in Central Java from January 2003 to November 2015 identifies ARIMA([1,3],0,0)-GARCH(1,0) as the best model.

  11. The relationship between trading volumes, number of transactions, and stock volatility in GARCH models

    Science.gov (United States)

    Takaishi, Tetsuya; Chen, Ting Ting

    2016-08-01

    We examine the relationship between trading volumes, number of transactions, and volatility using daily stock data of the Tokyo Stock Exchange. Following the mixture of distributions hypothesis, we use trading volumes and the number of transactions as proxies for the rate of information arrivals affecting stock volatility. The impact of trading volumes or number of transactions on volatility is measured using the generalized autoregressive conditional heteroscedasticity (GARCH) model. We find that the GARCH effects, that is, the persistence of volatility, are not always removed by adding trading volumes or number of transactions, indicating that trading volumes and number of transactions do not adequately represent the rate of information arrivals.

  12. Estimation of value at risk in currency exchange rate portfolio using asymmetric GJR-GARCH Copula

    Science.gov (United States)

    Nurrahmat, Mohamad Husein; Noviyanti, Lienda; Bachrudin, Achmad

    2017-03-01

    In this study, we discuss the problem of measuring the risk of a portfolio based on value at risk (VaR) using an asymmetric GJR-GARCH copula. The approach is based on the consideration that the assumption of normality of returns over time cannot be fulfilled, and that there is non-linear correlation in the dependence structure among the variables, which leads to inaccurate VaR estimates. Moreover, the leverage effect causes an asymmetric effect on the dynamic variance and exposes a weakness of GARCH models due to their symmetric effect on the conditional variance. Asymmetric GJR-GARCH models are used to filter the margins while copulas are used to link them together into a multivariate distribution. We then use copulas to construct flexible multivariate distributions with different marginals and dependence structures, so that the portfolio joint distribution does not depend on the assumptions of normality and linear correlation. The VaR obtained by the analysis at the 95% confidence level is 0.005586. This VaR is derived from the best copula model, the Student-t copula, with marginal distributions given by the t distribution.

  13. IPP Max Planck Institute of Plasma Physics at Garching

    International Nuclear Information System (INIS)

    1979-01-01

    The cost accounting system of the IPP Max Planck Institute of Plasma Physics at Garching is described in detail, covering cost class accounting, cost centers, cost units and the resulting overall cost summary. Detailed instructions are given on the integration of this cost accounting system into the organisational structure of the IPP. (A.N.)

  14. Musik-Sammlungen. Speicher interkultureller Prozesse. Bonn, 28.9.-1.10.2005

    Czech Academy of Sciences Publication Activity Database

    Freemanová, Michaela

    2006-01-01

    Vol. 43, No. 1 (2006), pp. 98-99 ISSN 0018-7003. [Musik-Sammlungen. Speicher interkultureller Prozesse. Bonn, 28.09.2005-01.10.2005] Institutional research plan: CEZ:AV0Z90580513 Keywords: music collections Subject RIV: AL - Art, Architecture, Cultural Heritage

  15. Low-energy neutron-proton analyzing power and the new Bonn potential and Paris potential predictions

    International Nuclear Information System (INIS)

    Tornow, W.; Howell, C.R.; Roberts, M.L.; Felsher, P.D.; Chen, Z.M.; Walter, R.L.; Mertens, G.; Slaus, I.

    1988-01-01

    Instrumental asymmetries recently observed by Haeberli and co-workers limit the accuracy of neutron-proton analyzing power A_y(θ) data. These instrumental effects are discussed and calculated for previously published n-p A_y(θ) data at 16.9 MeV. To enable these calculations, the analyzing power for the ²H(d→,n)³He reaction was measured at small angles. Additional n-p A_y(θ) data at extreme backward angles, obtained via proton recoil detection, are also reported for this energy in this paper. The composite data set is compared to calculations based on the new Bonn NN potential, the Paris NN potential, and to the recent NN phase-shift solution of Arndt. In addition, a detailed comparison between A_y(θ) calculated from the new Bonn and the Paris potentials between 10 and 50 MeV is shown to reveal unexpectedly large relative differences. The experimental data in this energy range are better described by the Paris potential than by the new Bonn potential

  16. MODEL NON LINIER GARCH (NGARCH UNTUK MENGESTIMASI NILAI VALUE at RISK (VaR PADA IHSG

    Directory of Open Access Journals (Sweden)

    I KOMANG TRY BAYU MAHENDRA

    2015-06-01

    Full Text Available In investment, risk measurement is important. One risk measure is Value at Risk (VaR). There are many methods that can be used to estimate risk within the VaR framework, one of them being the Non-Linear GARCH (NGARCH) model. In this research, VaR is determined using the NGARCH model. The NGARCH model allows for asymmetric behaviour in the volatility, distinguishing "good news" (positive returns) from "bad news" (negative returns). Based on the VaR calculations, the higher the confidence level and the longer the investment period, the greater the risk. The VaR determined using the NGARCH model was smaller than that from the GARCH model.

  17. Exponential GARCH Modeling with Realized Measures of Volatility

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Huang, Zhuo

    We introduce the Realized Exponential GARCH model that can utilize multiple realized volatility measures for the modeling of a return series. The model specifies the dynamic properties of both returns and realized measures, and is characterized by a flexible modeling of the dependence between returns and volatility. We apply the model to DJIA stocks and an exchange traded fund that tracks the S&P 500 index and find that specifications with multiple realized measures dominate those that rely on a single realized measure. The empirical analysis suggests some convenient simplifications.

  18. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model.

    Science.gov (United States)

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels.
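
    The pipeline can be sketched in simplified form, with two assets, empirical marginal transforms and a Gaussian copula instead of the full ARMA-GARCH-EVT treatment of the paper; all data below are simulated.

```python
import numpy as np
from scipy.stats import norm
from arch import arch_model

rng = np.random.default_rng(13)

# Two simulated, mildly correlated return series (stand-ins for two gas price series)
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=1500)
returns = z * [0.8, 1.1]

# 1) Univariate GARCH(1,1) marginals; keep standardized residuals
fits, std_resid = [], []
for j in range(2):
    f = arch_model(returns[:, j], mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
    fits.append(f)
    std_resid.append((returns[:, j] - f.params["mu"]) / f.conditional_volatility)
std_resid = np.column_stack(std_resid)

# 2) Gaussian copula: map residuals to normal scores via the empirical CDF,
#    then estimate the copula correlation matrix
n = std_resid.shape[0]
u = (np.argsort(np.argsort(std_resid, axis=0), axis=0) + 0.5) / n
rho = np.corrcoef(norm.ppf(u), rowvar=False)

# 3) Simulate from the copula and push through the empirical marginals
sims = rng.multivariate_normal([0, 0], rho, size=50_000)
u_sim = norm.cdf(sims)
shocks = np.column_stack([np.quantile(std_resid[:, j], u_sim[:, j]) for j in range(2)])

# 4) Rescale by each asset's one-day-ahead volatility forecast; equally weighted portfolio
sigma_next = np.array([np.sqrt(f.forecast(horizon=1).variance.iloc[-1, 0]) for f in fits])
mu = np.array([f.params["mu"] for f in fits])
portfolio = (mu + shocks * sigma_next).mean(axis=1)
var_95 = -np.quantile(portfolio, 0.05)
cvar_95 = -portfolio[portfolio <= -var_95].mean()
print(f"95% VaR = {var_95:.3f}, 95% CVaR = {cvar_95:.3f}")
```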

  19. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model

    Directory of Open Access Journals (Sweden)

    Jiechen Tang

    2015-01-01

    Full Text Available This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels.

  20. Cross-Sectional Distribution of GARCH Coefficients across S&P 500 Constituents: Time-Variation over the Period 2000-2012

    OpenAIRE

    David Ardia; Lennart F. Hoogerheide

    2013-01-01

    We investigate the time-variation of the cross-sectional distribution of asymmetric GARCH model parameters over the S&P 500 constituents for the period 2000-2012. We find the following results. First, the unconditional variances in the GARCH model obviously show major time-variation, with a high level after the dot-com bubble and the highest peak in the latest financial crisis. Second, in these more volatile periods it is especially the persistence of deviations of volatility from its uncondit...

  1. Modeling rainfall-runoff relationship using multivariate GARCH model

    Science.gov (United States)

    Modarres, R.; Ouarda, T. B. M. J.

    2013-08-01

    The traditional hydrologic time series approaches are used for modeling, simulating and forecasting the conditional mean of hydrologic variables but neglect their time-varying variance, or second-order moment. This paper introduces the multivariate Generalized Autoregressive Conditional Heteroscedasticity (MGARCH) modeling approach to show how the variance-covariance relationship between hydrologic variables varies in time. These approaches are also useful to estimate the dynamic conditional correlation between hydrologic variables. To illustrate the novelty and usefulness of MGARCH models in hydrology, two major types of MGARCH models, the bivariate diagonal VECH and constant conditional correlation (CCC) models, are applied to show the variance-covariance structure and dynamic correlation in a rainfall-runoff process. The bivariate diagonal VECH-GARCH(1,1) and CCC-GARCH(1,1) models indicated both short-run and long-run persistency in the conditional variance-covariance matrix of the rainfall-runoff process. The conditional variance of rainfall appears to have a stronger persistency, especially long-run persistency, than the conditional variance of streamflow, which shows a short-lived drastic increasing pattern and a stronger short-run persistency. The conditional covariance and conditional correlation coefficients have different features for each bivariate rainfall-runoff process with different degrees of stationarity and dynamic nonlinearity. The spatial and temporal pattern of variance-covariance features may reflect the signature of different physical and hydrological variables such as drainage area, topography, soil moisture and ground water fluctuations on the strength, stationarity and nonlinearity of the conditional variance-covariance for a rainfall-runoff process.
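
    The constant conditional correlation idea can be illustrated compactly: fit univariate GARCH(1,1) models to each series and estimate a single correlation from the standardized residuals. A sketch with simulated stand-ins for the rainfall and runoff series, using the Python arch package:

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(17)

# Simulated, positively correlated stand-ins for rainfall and streamflow innovations
z = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=800)
rainfall, runoff = z[:, 0] * 3.0, z[:, 1] * 1.5

# CCC-GARCH(1,1): univariate GARCH for each series ...
fits = [arch_model(x, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
        for x in (rainfall, runoff)]
std_resid = np.column_stack([f.std_resid for f in fits])

# ... and a constant conditional correlation estimated from the standardized residuals
ccc = np.corrcoef(std_resid, rowvar=False)[0, 1]
print("persistence (alpha + beta) per series:",
      [round(f.params["alpha[1]"] + f.params["beta[1]"], 3) for f in fits])
print("constant conditional correlation:", round(ccc, 3))
```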

  2. Quasi-periodicity in deep redshift surveys

    International Nuclear Information System (INIS)

    Weygaert, R. van de

    1991-01-01

    The recent result by Broadhurst et al. (1990, Nature 343, 726), showing a striking, nearly periodic, galaxy redshift distribution in a narrow pencil-beam survey, is explained within the Voronoi cellular model of clustering of galaxies. Galaxies, whose luminosities are selected from a Schechter luminosity function, are placed randomly within the walls of this cellular model. Narrow and deep, magnitude-limited, pencil-beam surveys through these structures are simulated. Some 15 per cent of these beams show the observed regular pattern, with a spacing between the peaks of the order of 105 h⁻¹ to 150 h⁻¹ Mpc, but most pencil-beams show peaks in the redshift distribution without periodicity, so we may conclude that, even within a cellular universe, periodicity is not a common phenomenon. (author)

  3. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    International Nuclear Information System (INIS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson; Coil, Alison L.; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan Renbin; Kassin, Susan A.; Konidaris, N. P.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ∼ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = −20 at z ∼ 1 via ∼90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg² divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. A colour pre-selection allows objects at z ≳ 0.7 to be targeted ∼2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ∼ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm⁻¹ grating used for the survey delivers high spectral resolution (R ∼ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations.

  4. Inference and testing on the boundary in extended constant conditional correlation GARCH models

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Søndergaard

    2017-01-01

    We consider inference and testing in extended constant conditional correlation GARCH models in the case where the true parameter vector is a boundary point of the parameter space. This is of particular importance when testing for volatility spillovers in the model. The large-sample properties...

  5. A survey on deep learning in medical image analysis.

    Science.gov (United States)

    Litjens, Geert; Kooi, Thijs; Bejnordi, Babak Ehteshami; Setio, Arnaud Arindra Adiyoso; Ciompi, Francesco; Ghafoorian, Mohsen; van der Laak, Jeroen A W M; van Ginneken, Bram; Sánchez, Clara I

    2017-12-01

    Deep learning algorithms, in particular convolutional networks, have rapidly become a methodology of choice for analyzing medical images. This paper reviews the major deep learning concepts pertinent to medical image analysis and summarizes over 300 contributions to the field, most of which appeared in the last year. We survey the use of deep learning for image classification, object detection, segmentation, registration, and other tasks. Concise overviews are provided of studies per application area: neuro, retinal, pulmonary, digital pathology, breast, cardiac, abdominal, musculoskeletal. We end with a summary of the current state-of-the-art, a critical discussion of open challenges and directions for future research.

  6. BUDHIES: a Blind Ultra Deep HI Environmental Survey

    NARCIS (Netherlands)

    Jaffé, Y. L.; Verheijen, M. A.; Poggianti, B. M.; van Gorkom, J. H.; Deshev, B. Z.

    2014-01-01

    We present recent results from the Blind Ultra Deep HI Environmental Survey (BUDHIES), that has detected over 150 galaxies at z˜ 0.2 with the Westerbork Synthesis Radio Telescope (WSRT). Our multi-wavelength study is the first where optical properties and HI content are combined at a redshift where

  7. Modeling the stock price returns volatility using GARCH(1,1) in some Indonesia stock prices

    Science.gov (United States)

    Awalludin, S. A.; Ulfah, S.; Soro, S.

    2018-01-01

    In the financial field, volatility is one of the key variables for making appropriate decisions. Moreover, modeling volatility is needed in derivative pricing, risk management, and portfolio management. For this reason, this study presents the widely used volatility model GARCH(1,1) for estimating the volatility of daily returns of Indonesian stock prices from July 2007 to September 2015. The returns are obtained from the stock prices by differencing the log of the price from one day to the next. Parameters of the model were estimated by Maximum Likelihood Estimation. After obtaining the volatility, a natural cubic spline was employed to study the behaviour of the volatility over the period. The result shows that GARCH(1,1) indicates evidence of volatility clustering in the returns of some Indonesian stock prices.
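
    Since the parameters are estimated by maximum likelihood, a compact sketch of the Gaussian GARCH(1,1) log-likelihood and its numerical maximization, on a simulated return series rather than the Indonesian price data, is:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Simulate a GARCH(1,1) return series as a stand-in for daily log returns
n, (w_true, a_true, b_true) = 1500, (2e-6, 0.08, 0.90)
returns, s2 = np.empty(n), w_true / (1 - a_true - b_true)
for t in range(n):
    returns[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = w_true + a_true * returns[t] ** 2 + b_true * s2

def neg_log_likelihood(params, r):
    """Negative Gaussian log-likelihood of a zero-mean GARCH(1,1)."""
    omega, alpha, beta = params
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()                           # initialize with the sample variance
    for t in range(1, r.size):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r ** 2 / sigma2)

# Maximize the likelihood subject to positivity bounds on the parameters
res = minimize(neg_log_likelihood, x0=[1e-5, 0.05, 0.90], args=(returns,),
               bounds=[(1e-8, None), (0.0, 1.0), (0.0, 1.0)], method="L-BFGS-B")
omega, alpha, beta = res.x
print(f"omega={omega:.2e}, alpha={alpha:.3f}, beta={beta:.3f}, persistence={alpha + beta:.3f}")

# Conditional volatility implied by the fitted parameters (volatility clustering shows up here)
sigma2 = np.empty_like(returns)
sigma2[0] = returns.var()
for t in range(1, returns.size):
    sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
print("mean conditional volatility:", np.sqrt(sigma2).mean())
```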

  8. Radio variability in the Phoenix Deep Survey at 1.4 GHz

    Science.gov (United States)

    Hancock, P. J.; Drury, J. A.; Bell, M. E.; Murphy, T.; Gaensler, B. M.

    2016-09-01

    We use archival data from the Phoenix Deep Survey to investigate the variable radio source population above 1 mJy beam⁻¹ at 1.4 GHz. Given the similarity of this survey to other such surveys we take the opportunity to investigate the conflicting results which have appeared in the literature. Two previous surveys for variability conducted with the Very Large Array (VLA) achieved a sensitivity of 1 mJy beam⁻¹. However, one survey found an areal density of radio variables on time-scales of decades that is a factor of ˜4 times greater than a second survey which was conducted on time-scales of less than a few years. In the Phoenix deep field we measure the density of variable radio sources to be ρ = 0.98 deg⁻² on time-scales of 6 months to 8 yr. We make use of Wide-field Infrared Survey Explorer infrared cross-ids, and identify all variable sources as active galactic nuclei of some description. We suggest that the discrepancy between previous VLA results is due to the different time-scales probed by each of the surveys, and that radio variability at 1.4 GHz is greatest on time-scales of 2-5 yr.

  9. Specification and testing of Multiplicative Time-Varying GARCH models with applications

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    2017-01-01

    In this article, we develop a specification technique for building multiplicative time-varying GARCH models of Amado and Teräsvirta (2008, 2013). The variance is decomposed into an unconditional and a conditional component such that the unconditional variance component is allowed to evolve smooth...... is illustrated in practice with two real examples: an empirical application to daily exchange rate returns and another one to daily coffee futures returns....
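    For orientation, the multiplicative decomposition referred to above is commonly written as below. This is a sketch in generic notation rather than necessarily the authors' exact parameterisation: h_t is a GARCH-type conditional component and g_t is a smooth deterministic function of rescaled time t/T built from logistic transition functions G_l.

        \sigma_t^2 = h_t \, g_t, \qquad
        g_t = 1 + \sum_{l=1}^{r} \delta_l \, G_l\!\left(t/T; \gamma_l, c_l\right)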

  10. The 2016 Frontiers in Medicinal Chemistry Conference in Bonn.

    Science.gov (United States)

    Müller, Christa E; Thimm, Dominik; Baringhaus, Karl-Heinz

    2017-01-05

    Pushing the frontiers of medicinal chemistry: Christa Müller, Dominik Thimm, and Karl-Heinz Baringhaus look back at the events of the 2016 Frontiers in Medicinal Chemistry (FiMC) Conference held in Bonn, Germany. The report highlights the themes & talks in the annual conference hosted by the Joint Division of Medicinal Chemistry of the German Pharmaceutical Society (DPhG) and German Chemical Society (GDCh). It is also an invitation to the 2017 conference in Bern, Switzerland this February 12-15. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. The energy crisis and Bonn's atomic energy programme

    International Nuclear Information System (INIS)

    Steinhaus, K.; Heimbrecht, J.

    1979-01-01

    What are the background and causes of the energy crisis? In whose interest and on whose back is energy policy made in our country? Will the lights go out without nuclear power? What are the real goals and dangers of Bonn's atomic energy programme? Is coal a real alternative to nuclear power in the Federal Republic of Germany? What possibilities and requirements are there for a national and democratic energy policy in the Federal Republic of Germany? What are the central problems of the protest movement against the government's atomic energy programme? These questions, which are still at the centre of political discussion, are investigated by the authors. (orig.) [de]

  12. Computer aided control of the Bonn Penning polarized ion source

    International Nuclear Information System (INIS)

    He, N.W.; VonRossen, P.; Eversheim, P.D.; Busch, R.

    1984-01-01

    A CBM computer system is described which has been set up to control the Bonn Polarized Ion Source. The controlling program, besides setting and logging parameters, performs an optimization of the ion source output. A freely definable figure of merit, composed of the ionizer current and its variance, has proven to be an effective means of directing the source optimization. The performance reached during the first successful tests is reported.

  13. TOPAS 2 - a high-resolution tagging system at the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Rappenecker, G.

    1989-02-01

    For the SAPHIR arrangement in Bonn a high-resolution tagging system has been developed, achieving an energy resolution of 2 MeV and covering the photon energy range (0.94-0.34) E_0 (1.0 GeV ...); ... CO_2, ArCH_4 and ArC_2H_6 with regard to performance, cluster size and coincidence width. (orig.)

  14. Targeting estimation of CCC-GARCH models with infinite fourth moments

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Søndergaard

    In this paper we consider the large-sample properties of the variance targeting estimator for the multivariate extended constant conditional correlation GARCH model when the distribution of the data generating process has infinite fourth moments. Using non-standard limit theory we derive new results...... for the estimator stating that its limiting distribution is multivariate stable. The rate of consistency of the estimator is slower than √T (as obtained by the quasi-maximum likelihood estimator) and depends on the tails of the data generating process....

  15. Hot money and China's stock market volatility: Further evidence using the GARCH-MIDAS model

    Science.gov (United States)

    Wei, Yu; Yu, Qianwen; Liu, Jing; Cao, Yang

    2018-02-01

    This paper investigates the influence of hot money on the return and volatility of the Chinese stock market using a nonlinear Granger causality test and a new GARCH-class model based on mixed data sampling regression (GARCH-MIDAS). The empirical results suggest that no linear or nonlinear causality exists between the growth rate of hot money and the Chinese stock market return, implying that the Chinese stock market is not driven by hot money and vice versa. However, hot money has a significant positive impact on the long-term volatility of the Chinese stock market. Furthermore, the dependence between the long-term volatility caused by hot money and the total volatility of the Chinese stock market is time-variant, indicating that huge volatilities in the stock market are not always triggered by international speculation capital flow and that Chinese authorities should further focus on more systemic reforms in the trading rules and on effectively regulating the stock market.
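    As a rough sketch of the model class used here (generic notation following the original GARCH-MIDAS literature, which may differ from this paper's exact specification), the return variance is split into a short-run GARCH component g_{i,t} and a long-run component τ_t driven by a low-frequency covariate X (here the growth rate of hot money) through MIDAS weights φ_k(ω):

        r_{i,t} = \mu + \sqrt{\tau_t \, g_{i,t}} \, \varepsilon_{i,t}, \qquad
        g_{i,t} = (1-\alpha-\beta) + \alpha \frac{(r_{i-1,t}-\mu)^2}{\tau_t} + \beta \, g_{i-1,t}, \qquad
        \tau_t = m + \theta \sum_{k=1}^{K} \varphi_k(\omega) \, X_{t-k}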

  16. Deep-towed CSEM survey of gas hydrates in the Gulf of Mexico

    Science.gov (United States)

    Kannberg, P.; Constable, S.

    2017-12-01

    Controlled source electromagnetic (CSEM) surveys are increasingly being used to remotely detect hydrate deposits in seafloor sediments. CSEM methods are sensitive to sediment pore space resistivity, such as when electrically resistive hydrate displaces the electrically conductive pore fluid, increasing the bulk resistivity of the sediment. In July 2017, a two-week research cruise using an upgraded and expanded "Vulcan" towed receiver system collected over 250 line km of data at four sites in the Gulf of Mexico (GoM) thought to have hydrate bearing sediments. Hydrate bearing horizons at the survey sites ranged from 400-700 m below seafloor. Modeling suggested an array with source receiver offsets of up to 1600 m would be needed to properly image the deep hydrate. A deep towed electromagnetic transmitter outputting 270 Amps was towed 100 m above seafloor. Six Vulcan receivers, each recording three-axis electric field data, were towed at 200 m intervals from 600-1600 m behind the transmitter. The four sites surveyed, Walker Ridge 313, Orca Basin, Mad Dog, and Green Canyon 955, are associated with the upcoming GOM^2 coring operation scheduled for 2020. Wells at WR313 and GC955 were logged as part of a joint industry drilling project in 2009 and will be used to ground truth our inversion results. In 2008, WR313 and GC955 were surveyed using traditional CSEM seafloor receivers, accompanied by a single prototype Vulcan towed receiver. This prior survey will allow comparison of results from a seafloor receiver survey with those from a towed receiver survey. Seismic data has been collected at all the sites, which will be used to constrain inversions. In addition to the four hydrate sites surveyed, two lines were towed over Green Knoll, a deep-water salt dome located between Mad Dog and GC955. Presented here are initial results from our recent cruise.

  17. Deep 20-GHz survey of the Chandra Deep Field South and SDSS Stripe 82: source catalogue and spectral properties

    Science.gov (United States)

    Franzen, Thomas M. O.; Sadler, Elaine M.; Chhetri, Rajan; Ekers, Ronald D.; Mahony, Elizabeth K.; Murphy, Tara; Norris, Ray P.; Waldram, Elizabeth M.; Whittam, Imogen H.

    2014-04-01

    We present a source catalogue and first results from a deep, blind radio survey carried out at 20 GHz with the Australia Telescope Compact Array, with follow-up observations at 5.5, 9 and 18 GHz. The Australia Telescope 20 GHz (AT20G) deep pilot survey covers a total area of 5 deg^2 in the Chandra Deep Field South and in Stripe 82 of the Sloan Digital Sky Survey. We estimate the survey to be 90 per cent complete above 2.5 mJy. Of the 85 sources detected, 55 per cent have steep spectra (α_1.4^20 < -0.5); the steep-spectrum sources typically follow power-law spectra between 1.4 and 18 GHz, while the spectral indices of the flat- or inverted-spectrum sources tend to steepen with frequency. Among the 18 inverted-spectrum (α_1.4^20 ≥ 0.0) sources, 10 have clearly defined peaks in their spectra with α_1.4^5.5 > 0.15 and α_9^18 < -0.15. The source counts are compared with those of the Ninth Cambridge and Tenth Cambridge surveys: there is a shift towards a steeper-spectrum population when going from ˜1 Jy to ˜5 mJy, which is followed by a shift back towards a flatter-spectrum population below ˜5 mJy. The 5-GHz source-count model by Jackson & Wall, which only includes contributions from FRI and FRII sources and star-forming galaxies, does not reproduce the observed flattening of the flat-spectrum counts below ˜5 mJy. It is therefore possible that another population of sources is contributing to this effect.

  18. Deep Echo State Network (DeepESN): A Brief Survey

    OpenAIRE

    Gallicchio, Claudio; Micheli, Alessio

    2017-01-01

    The study of deep recurrent neural networks (RNNs) and, in particular, of deep Reservoir Computing (RC) is gaining an increasing research attention in the neural networks community. The recently introduced deep Echo State Network (deepESN) model opened the way to an extremely efficient approach for designing deep neural networks for temporal data. At the same time, the study of deepESNs allowed to shed light on the intrinsic properties of state dynamics developed by hierarchical compositions ...

  19. Modeling the Volatility of Exchange Rates: GARCH Models

    Directory of Open Access Journals (Sweden)

    Fahima Charef

    2017-03-01

    Full Text Available The modeling of exchange rate dynamics has long remained a central topic of financial and economic research. In our research we study the relationship between the evolution of exchange rates and macroeconomic fundamentals. Our empirical study is based on monthly series of exchange rates of the Tunisian dinar against the currencies of three major trading partners (dollar, euro, yen) and on fundamentals (the terms of trade, the inflation rate, the interest rate differential), from January 2000 to December 2014, for the case of Tunisia. We have adopted models of conditional heteroscedasticity (ARCH, GARCH, EGARCH, TGARCH). The results indicate that there is a partial relationship between the evolution of the Tunisian dinar exchange rates and macroeconomic variables.

  20. Using Garch-in-Mean Model to Investigate Volatility and Persistence at Different Frequencies for Bucharest Stock Exchange during 1997-2012

    Directory of Open Access Journals (Sweden)

    Iulian PANAIT

    2012-05-01

    Full Text Available In our paper we use data mining to compare the volatility structure of high (daily) and low (weekly, monthly) frequencies for seven Romanian companies traded on the Bucharest Stock Exchange and three market indices, during 1997-2012. For each of the 10 time series and three frequencies we fit a GARCH-in-mean model and find that persistence is more pronounced in the daily returns than in the weekly and monthly series. On the other hand, the GARCH-in-mean failed to confirm (on our data) the theoretical hypothesis that an increase in volatility leads to a rise in future returns, mainly because the variance coefficient in the mean equation of the model was not statistically significant for most of the time series analyzed and at most of the frequencies. The diagnostics that we ran in order to verify the goodness of fit of the model showed that GARCH-in-mean fitted the weekly and monthly time series well but behaved less well on the daily time series.
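    The "variance coefficient in the mean equation" mentioned above corresponds to λ in the usual GARCH(1,1)-in-mean specification, sketched here in generic notation (the exact model fitted in the paper may differ in detail):

        r_t = \mu + \lambda \sigma_t^2 + \varepsilon_t, \qquad
        \varepsilon_t = \sigma_t z_t, \qquad
        \sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2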

  1. GOOD NEWS, BAD NEWS AND GARCH EFFECTS IN STOCK RETURN DATA

    OpenAIRE

    Craig A. Depken II

    2001-01-01

    It is shown that the volume of trade can be decomposed into proportional proxies for stochastic flows of good news and bad news into the market. Positive (good) information flows are assumed to increase the price of a financial vehicle while negative (bad) information flows decrease the price. For the majority of a sample of ten split-stocks it is shown that the proposed decomposition explains more GARCH than volume itself. Using the proposed decomposition, the variance of returns for younger...

  2. CANDELS: The Cosmic Assembly Near-Infrared Deep Extragalactic Legacy Survey

    Science.gov (United States)

    Grogin, Norman A.; Koekemoer, Anton M.; Faber, S. M.; Ferguson, Henry C.; Kocevski, Dale D.; Riess, Adam G.; Acquaviva, Viviana; Alexander, David M.; Almaini, Omar; Ashby, Matthew L. N.

    2011-01-01

    The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS) is designed to document the first third of galactic evolution, from z ≈ 8 to 1.5. It will image > 250,000 distant galaxies using three separate cameras on the Hubble Space Telescope, from the mid-UV to near-IR, and will find and measure Type Ia supernovae beyond z > 1.5 to test their accuracy as standard candles for cosmology. Five premier multi-wavelength sky regions are selected, each with extensive ancillary data. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to a stellar mass of 10^9 solar masses to z ≈ 2, reaching the knee of the UV luminosity function of galaxies to z ≈ 8. The survey covers approximately 800 square arcminutes and is divided into two parts. The CANDELS/Deep survey (5σ point-source limit H = 27.7 mag) covers approx. 125 square arcminutes within GOODS-N and GOODS-S. The CANDELS/Wide survey includes GOODS and three additional fields (EGS, COSMOS, and UDS) and covers the full area to a 5σ point-source limit of H ≳ 27.0 mag. Together with the Hubble Ultra Deep Fields, the strategy creates a three-tiered "wedding cake" approach that has proven efficient for extragalactic surveys. Data from the survey are non-proprietary and are useful for a wide variety of science investigations. In this paper, we describe the basic motivations for the survey, the CANDELS team science goals and the resulting observational requirements, the field selection and geometry, and the observing design.

  3. CANDELS : THE COSMIC ASSEMBLY NEAR-INFRARED DEEP EXTRAGALACTIC LEGACY SURVEY

    NARCIS (Netherlands)

    Grogin, Norman A.; Kocevski, Dale D.; Faber, S. M.; Ferguson, Henry C.; Koekemoer, Anton M.; Riess, Adam G.; Acquaviva, Viviana; Alexander, David M.; Almaini, Omar; Ashby, Matthew L. N.; Barden, Marco; Bell, Eric F.; Bournaud, Frederic; Brown, Thomas M.; Caputi, Karina I.; Casertano, Stefano; Cassata, Paolo; Castellano, Marco; Challis, Peter; Chary, Ranga-Ram; Cheung, Edmond; Cirasuolo, Michele; Conselice, Christopher J.; Cooray, Asantha Roshan; Croton, Darren J.; Daddi, Emanuele; Dahlen, Tomas; Dave, Romeel; de Mello, Duilia F.; Dekel, Avishai; Dickinson, Mark; Dolch, Timothy; Donley, Jennifer L.; Dunlop, James S.; Dutton, Aaron A.; Elbaz, David; Fazio, Giovanni G.; Filippenko, Alexei V.; Finkelstein, Steven L.; Fontana, Adriano; Gardner, Jonathan P.; Garnavich, Peter M.; Gawiser, Eric; Giavalisco, Mauro; Grazian, Andrea; Guo, Yicheng; Hathi, Nimish P.; Haeussler, Boris; Hopkins, Philip F.; Huang, Jia-Sheng; Huang, Kuang-Han; Jha, Saurabh W.; Kartaltepe, Jeyhan S.; Kirshner, Robert P.; Koo, David C.; Lai, Kamson; Lee, Kyoung-Soo; Li, Weidong; Lotz, Jennifer M.; Lucas, Ray A.; Madau, Piero; McCarthy, Patrick J.; McGrath, Elizabeth J.; McIntosh, Daniel H.; McLure, Ross J.; Mobasher, Bahram; Moustakas, Leonidas A.; Mozena, Mark; Nandra, Kirpal; Newman, Jeffrey A.; Niemi, Sami-Matias; Noeske, Kai G.; Papovich, Casey J.; Pentericci, Laura; Pope, Alexandra; Primack, Joel R.; Rajan, Abhijith; Ravindranath, Swara; Reddy, Naveen A.; Renzini, Alvio; Rix, Hans-Walter; Robaina, Aday R.; Rodney, Steven A.; Rosario, David J.; Rosati, Piero; Salimbeni, Sara; Scarlata, Claudia; Siana, Brian; Simard, Luc; Smidt, Joseph; Somerville, Rachel S.; Spinrad, Hyron; Straughn, Amber N.; Strolger, Louis-Gregory; Telford, Olivia; Teplitz, Harry I.; Trump, Jonathan R.; van der Wel, Arjen; Villforth, Carolin; Wechsler, Risa H.; Weiner, Benjamin J.; Wiklind, Tommy; Wild, Vivienne; Wilson, Grant; Wuyts, Stijn; Yan, Hao-Jing; Yun, Min S.

    2011-01-01

    The Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) is designed to document the first third of galactic evolution, over the approximate redshift (z) range 8-1.5. It will image >250,000 distant galaxies using three separate cameras on the Hubble Space Telescope, from the

  4. Westerbork Ultra-Deep Survey of HI at z=0.2

    NARCIS (Netherlands)

    Verheijen, Marc; Deshev, Boris; van Gorkom, Jacqueline; Poggianti, Bianca; Chung, Aeree; Cybulski, Ryan; Dwarakanath, K. S.; Montero-Castano, Maria; Morrison, Glenn; Schiminovich, David; Szomoru, Arpad; Yun, Min

    2010-01-01

    In this contribution, we present some preliminary observational results from the completed ultra-deep survey of 21cm emission from neutral hydrogen at redshifts z=0.164-0.224 with the Westerbork Synthesis Radio Telescope. In two separate fields, a total of 160 individual galaxies has been detected

  5. Noise sensitivity of portfolio selection in constant conditional correlation GARCH models

    Science.gov (United States)

    Varga-Haszonits, I.; Kondor, I.

    2007-11-01

    This paper investigates the efficiency of minimum variance portfolio optimization for stock price movements following the Constant Conditional Correlation GARCH process proposed by Bollerslev. Simulations show that the quality of portfolio selection can be improved substantially by computing optimal portfolio weights from conditional covariances instead of unconditional ones. Measurement noise can be further reduced by applying some filtering method on the conditional correlation matrix (such as Random Matrix Theory based filtering). As an empirical support for the simulation results, the analysis is also carried out for a time series of S&P500 stock prices.
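    The minimum variance weights and the eigenvalue-based filtering mentioned above can be illustrated with a short Python sketch. The covariance matrix and the clipping threshold are toy placeholders, and the filter is only a crude stand-in for a full Random Matrix Theory treatment.

        import numpy as np

        def min_variance_weights(cov):
            # Global minimum-variance portfolio: w = C^{-1} 1 / (1' C^{-1} 1).
            ones = np.ones(cov.shape[0])
            w = np.linalg.solve(cov, ones)
            return w / w.sum()

        def clip_eigenvalues(corr, lam_max):
            # Replace eigenvalues below lam_max (presumed noise) by their average,
            # then renormalise so the filtered matrix keeps a unit diagonal.
            vals, vecs = np.linalg.eigh(corr)
            noise = vals < lam_max
            if noise.any():
                vals[noise] = vals[noise].mean()
            filtered = vecs @ np.diag(vals) @ vecs.T
            d = np.sqrt(np.diag(filtered))
            return filtered / np.outer(d, d)

        # Toy conditional covariance matrix for three assets (illustrative numbers only).
        cov_t = np.array([[0.040, 0.012, 0.008],
                          [0.012, 0.030, 0.010],
                          [0.008, 0.010, 0.020]])
        print(min_variance_weights(cov_t))
        corr_t = cov_t / np.outer(np.sqrt(np.diag(cov_t)), np.sqrt(np.diag(cov_t)))
        print(min_variance_weights(clip_eigenvalues(corr_t, lam_max=0.5)))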

  6. Forecasting the variance and return of Mexican financial series with symmetric GARCH models

    Directory of Open Access Journals (Sweden)

    Fátima Irina VILLALBA PADILLA

    2013-03-01

    Full Text Available The present research shows the application of generalized autoregressive conditional heteroskedasticity (GARCH) models to forecast the variance and return of the IPC, the EMBI, the weighted-average government funding rate, the FIX exchange rate and the Mexican oil reference, as important tools for investment decisions. Forecasts in-sample and out-of-sample are performed. The period covered runs from 2005 to 2011.

  7. Fiscal 1995 verification survey of geothermal exploration technology. Report on a deep geothermal resource survey; 1995 nendo chinetsu tansa gijutsu nado kensho chosa. Shinbu chinetsu shigen hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    For the purpose of reducing the risk of deep geothermal resource development, the paper investigated three factors for the formation of geothermal resource in the deep underground, that is, heat supply from heat source, supply of geothermal fluids, and the developmental status of fracture systems forming reservoir structures. The survey further clarified the status of existence of deep geothermal resource and the whole image of the geothermal system including shallow geothermal energy in order to research/study usability of deep geothermal resource. In the deep geothermal resource survey, drilling/examination were made of a deep geothermal exploration well (`WD-1,` target depth: approximately 3,000-4,000m) in the already developed area, with the aim of making rationalized promotion of the geothermal development. And the status of existence of deep geothermal resource and the whole image of the geothermal system were clarified to investigate/study usability of the geothermal system. In fiscal 1995, `WD-1` in the Kakkonda area reached a depth of 3,729m. By this, surveys were made to grasp the whole image of the shallow-deep geothermal system and to obtain basic data for researching usability of deep geothermal resource. 22 refs., 531 figs., 136 tabs.

  8. Code of Practice on the International Transboundary Movement of Radioactive Waste; Code De Bonne Pratique Sur Le Mouvement Transfrontiere International De Dechets Radioactifs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-11-03

    On 21 September 1990, the General Conference, by resolution GC(XXXIV)/RES/530, adopted a Code of Practice on the International Transboundary Movement of Radioactive Waste and requested the Director General - inter alia - to take all necessary steps to ensure wide dissemination of the Code of Practice at both the national and the international level. The Code of Practice was elaborated by a Group of Experts established pursuant to resolution GC(XXXII)/RES/490 adopted by the General Conference in 1988. The text of the Code of Practice is reproduced herewith for the information of all Member States.

  9. Measuring the Effect of Exchange Rate Movements on Stock Market Returns Volatility: GARCH Model

    Directory of Open Access Journals (Sweden)

    Abdelkadir BESSEBA

    2017-06-01

    Full Text Available This paper aims to investigate the dynamic links between exchange rate fluctuations and stock market return volatility. For this purpose, we employ a generalized autoregressive conditional heteroscedasticity (GARCH) model. Stock market return sensitivities to exchange rates are found to be strong, implying that exchange rate changes play an important role in determining the dynamics of stock market returns.

  10. On the Consequences of the U.S. Withdrawal from the Kyoto/Bonn Protocol

    International Nuclear Information System (INIS)

    Buchner, B.; Cersosimo, I.; Carraro, C.

    2001-12-01

    The US decision not to ratify the Kyoto Protocol and the recent outcomes of the Bonn and Marrakech Conferences of the Parties have important implications for both the effectiveness and the efficiency of future climate policies. Among these implications, those related with technical change and with the functioning of the international market for carbon emissions are particularly relevant, because these variables have the largest impact on the overall abatement cost to be borne by Annex B countries in the short and in the long run. This paper analyses the consequences of the US decision to withdraw from the Kyoto/Bonn Protocol both on technological innovation and on the price of emission permits (and, as a consequence, on abatement costs). A first goal is to assess the impact of the US defection on the price of permits and compliance costs when technological innovation and diffusion is taken into account (the model embodies international technological spillovers). A second goal is to understand for what reasons in the presence of endogenous and induced technical change the reduction of the price of permits is lower than in most empirical analyses recently circulated. A third goal is to assess the role of Russia in climate negotiations, its increased bargaining power and its eventual incentives to follow the US defections

  11. The CfA Einstein Observatory extended deep X-ray survey

    Science.gov (United States)

    Primini, F. A.; Murray, S. S.; Huchra, J.; Schild, R.; Burg, R.

    1991-01-01

    All IPC exposures in the Einstein Extended Deep X-ray Survey program have been reanalyzed. The current survey covers about 2.3 sq deg with a typical limiting sensitivity of about 5 x 10 to the -14th ergs/sq cm/s in the energy range from 0.8-3.5 keV. A total of 25 IPC sources are detected above a threshold of 4.5 sigma. A total of 18 are detected independently in the HRI, leading to the identification of six with stars and 11 with extragalactic objects. The remaining sources are classified as extragalactic. The population of identified extragalactic objects is dominated by QSOs, with one or two possible clusters. The basic conclusions of the original survey remain unchanged.

  12. Forecasting volatility in gold returns under the GARCH, IGARCH and FIGARCH frameworks: New evidence

    Science.gov (United States)

    Bentes, Sonia R.

    2015-11-01

    This study employs three volatility models of the GARCH family to examine the volatility behavior of gold returns. Much of the literature on this topic suggests that gold plays a fundamental role as a hedge and safe haven against adverse market conditions, which is particularly relevant in periods of high volatility. This makes understanding gold volatility important for a number of theoretical and empirical applications, namely investment valuation, portfolio selection, risk management, monetary policy-making, futures and option pricing, hedging strategies and value-at-risk (VaR) policies (e.g. Baur and Lucey (2010)). We use daily data from August 2, 1976 to February 6, 2015 and divide the full sample into two periods: the in-sample period (August 2, 1976-October 24, 2008) is used to estimate model coefficients, while the out-of-sample period (October 27, 2008-February 6, 2015) is for forecasting purposes. Specifically, we employ the GARCH(1,1), IGARCH(1,1) and FIGARCH(1, d,1) specifications. The results show that the FIGARCH(1, d,1) is the best model to capture linear dependence in the conditional variance of the gold returns as given by the information criteria. It is also found to be the best model to forecast the volatility of gold returns.
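    For reference, the three specifications compared above are nested variants of the same conditional variance recursion. The sketch below uses standard textbook forms (Baillie-Bollerslev-Mikkelsen notation for FIGARCH), which may differ slightly from the exact parameterisation estimated in the paper: GARCH(1,1) imposes no unit-root restriction, IGARCH(1,1) imposes α + β = 1, and FIGARCH(1,d,1) allows fractional integration 0 < d < 1 and hence long memory in volatility.

        \text{GARCH(1,1):}\quad \sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2
        \text{IGARCH(1,1):}\quad \text{as above with } \alpha + \beta = 1
        \text{FIGARCH(1,d,1):}\quad \sigma_t^2 = \omega + \beta \sigma_{t-1}^2 + \left[1 - \beta L - (1-\varphi L)(1-L)^d\right]\varepsilon_t^2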

  13. Survey and analysis of deep water mineral deposits using nuclear methods

    International Nuclear Information System (INIS)

    Staehle, C.M.; Noakes, J.E.; Spaulding, J.

    1991-01-01

    Present knowledge of the location, quality, quantity and recoverability of sea floor minerals is severely limited, particularly in the abyssal depths and deep water within the 200 mile Exclusive Economic Zone (EEZ) surrounding the U.S. Pacific Islands. To improve this understanding and permit exploitation of these mineral reserves much additional data is needed. This paper will discuss a sponsored program for extending existing proven nuclear survey methods currently used on the shallow continental margins of the Atlantic and Gulf of Mexico into the deeper waters of the Pacific. This nuclear technology can be readily integrated and extended to depths of 2000 m using the existing RCV-150 remotely operated vehicle (ROV) and the PISCES V manned deep submersible vehicle (DSV) operated by the University of Hawaii's Hawaii Undersea Research Laboratory (HURL). Previous papers by the authors have also proposed incorporating these nuclear analytical methods for survey of the deep ocean through the use of autonomous underwater vehicles (AUVs). Such a vehicle could extend the use of passive nuclear instrument operation, in addition to conventional analytical methods, into the abyssal depths and do so with speed and economy not otherwise possible. The natural radioactivity associated with manganese nodules and crustal deposits is sufficiently above normal background levels to allow discrimination and quantification in near real time.

  14. Global Risk Evolution and Diversification: a Copula-DCC-GARCH Model Approach

    Directory of Open Access Journals (Sweden)

    Marcelo Brutti Righi

    2012-12-01

    Full Text Available In this paper we estimate a dynamic portfolio composed by the U.S., German, British, Brazilian, Hong Kong and Australian markets, the period considered started on September 2001 and finished in September 2011. We ran the Copula-DCC-GARCH model on the daily returns conditional covariance matrix. The results allow us to conclude that there were changes in portfolio composition, occasioned by modifications in volatility and dependence between markets. The dynamic approach significantly reduced the portfolio risk if compared to the traditional static approach, especially in turbulent periods. Furthermore, we verified that the estimated copula model outperformed the conventional DCC model for the sample studied.

  15. A PIXE mini-beam setup at the Bonn cyclotron for archeometric metal analyses

    International Nuclear Information System (INIS)

    Weber, J.; Beier, T.; Diehl, U.; Lambrecht, D.; Mommsen, H.; Pantenburg, F.J.

    1990-01-01

    The exact analysis of the elemental composition at and around a soldering joint of an antique piece of jewelry can elucidate the joining technologies and give hints of the genuineness of the object. As analytical method we use PIXE, which is nondestructive, multielemental and, with fundamental parameter calculations, gives absolute concentration values. To obtain the necessary spot size of the H_2^+ beam at the Bonn cyclotron we use a piezo-controlled diaphragm, whose demagnified image is focussed on the target by two magnetic quadrupole triplets. With an electrostatic deflector the beam spot of 0.1 x 0.3 mm^2 size can be moved 2 mm in each direction on the target. With a laser beam, which simulates the ion beam, an irregularly shaped archeological object can be positioned. The laser is also used to obtain the alignment of the target surface to the ion beam direction, and the small beam size makes it easier to find a flat part on the surface of the object; both of these are important preconditions for using the fundamental parameter method. A scan over a joint of modern gold alloys demonstrates the ability to detect soldering joints. The analysis of four pieces of Roman gold jewelry found in the area of Cologne and Bonn shows examples of brazing with a solder as well as diffusion soldering. (orig.)

  16. FY 1998 report on the verification survey of geothermal exploration technology, etc. 1/2. Survey of deep geothermal resource; 1998 nendo chinetsu tansa gijutsu nado kensho chosa hokokusho. 1/2. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-01

    For the purpose of commercializing deep geothermal resource, a deep exploration well of 4000m class was drilled in the existing geothermal development area to survey the situation of deep geothermal resource existence and the availability. Concretely, the deep geothermal exploration well was drilled for study in the Kakkonda area, Shizukuishi town, Iwate prefecture, to clarify the situation of deep geothermal resource existence and the whole image of geothermal system. Consideration was made of the deep geothermal exploration method, systematization of deep high temperature drilling technology, and availability of deep geothermal resource. The results of the survey were summed up as follows: 1) general remarks; 2) deep exploration well drilling work; 3) details of the study. 1) and 2) were included in this report, and 3) in the next report. In 1), the items were as follows: the study plan/gist of study execution, the details and results of the deep geothermal resource survey, the outline of the deep exploration well drilling work, and the outline of the results of the FY 1998 study. In 2), the drilling work plan/the actual results of the drilling work were summed up. As to the results of the study, summarized were the acquisition of survey data on deep exploration well, heightening of accuracy of the deep geothermal resource exploration method, etc. (NEDO)

  17. Confirmation study of the effectiveness of prospect techniques for deep geothermal resources. Deep-seated geothermal resources survey report (Fiscal year 1993); 1993 nendo chinetsu tansa gijutsu nado kensho chosa. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Drilling and survey of deep geothermal exploration wells were conducted in order to grasp the existing situation of deep geothermal resource and the whole image of geothermal systems in the area where geothermal resource was already developed. Following fiscal 1992, the well was drilled in fiscal 1993 down to depths of 605 m to 1,505 m, and a 13-3/8 inch casing was inserted down to a depth of 1,500 m. In the drilling, four cores including oriented cores were sampled, and microscopic observation, X-ray diffraction analysis, fluid inclusion survey, core property test, etc. were conducted. In the FMI logging, detected were 273 bedding planes, 483 fractures, etc. Further made were a velocity structure survey, a gravity survey in the area of 270 km^2 including deep exploration wells, a quality survey of the Kakkonda river water, etc. As to geothermal structure models in the Kakkonda area, results of the drilling were added to prediction models before drilling deep exploration wells, but the revision was not very much. Besides, studies were made of a survey method using microearthquakes, a survey technique using resistivity, etc. 61 refs., 259 figs., 95 tabs.

  18. American Option Pricing using GARCH models and the Normal Inverse Gaussian distribution

    DEFF Research Database (Denmark)

    Stentoft, Lars Peter

    In this paper we propose a feasible way to price American options in a model with time varying volatility and conditional skewness and leptokurtosis using GARCH processes and the Normal Inverse Gaussian distribution. We show how the risk neutral dynamics can be obtained in this model, we interpret...... properties shows that there are important option pricing differences compared to the Gaussian case as well as to the symmetric special case. A large scale empirical examination shows that our model outperforms the Gaussian case for pricing options on three large US stocks as well as a major index...
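    A stripped-down illustration of GARCH-based option valuation by Monte Carlo is sketched below in Python. It prices a European call under a Gaussian GARCH(1,1) risk-neutralised with a Duan-type shift; the American exercise feature and the Normal Inverse Gaussian innovations that are central to the paper above are deliberately not reproduced, and all parameter values are placeholders.

        import numpy as np

        def garch_mc_call(S0, K, r, n_days, omega, alpha, beta, lam,
                          n_paths=100_000, seed=0):
            # European call under a Gaussian GARCH(1,1), simulated under the
            # risk-neutral measure (Duan-type local risk-neutral valuation):
            # daily log return = r - 0.5*h + sqrt(h)*z, with the variance
            # updated using the shifted innovation (z - lam).
            rng = np.random.default_rng(seed)
            h = np.full(n_paths, omega / (1.0 - alpha - beta))   # start at unconditional variance
            logS = np.full(n_paths, np.log(S0))
            for _ in range(n_days):
                z = rng.standard_normal(n_paths)
                logS += r - 0.5 * h + np.sqrt(h) * z
                h = omega + alpha * h * (z - lam) ** 2 + beta * h
            payoff = np.maximum(np.exp(logS) - K, 0.0)
            return np.exp(-r * n_days) * payoff.mean()

        # Placeholder parameters: daily risk-free rate, 60-day maturity.
        print(garch_mc_call(S0=100.0, K=100.0, r=0.0002, n_days=60,
                            omega=1e-6, alpha=0.05, beta=0.90, lam=0.5))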

  19. Individual precipitates in Al alloys probed by the Bonn positron microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Balarisi, Osman; Eich, Patrick; Haaks, Matz; Klobes, Benedikt; Korff, Bjoern; Maier, Karl; Sottong, Reinhard [Helmholtz-Institut fuer Strahlen- und Kernphysik, Nussallee 14-16, 53115 Bonn (Germany); Huehne, Sven-Martin; Mader, Werner [Institut fuer Anorganische Chemie, Roemerstrasse 164, 53117 Bonn (Germany); Staab, Torsten [Fraunhofer ISC, Neunerplatz 2, 97082 Wuerzburg (Germany)

    2010-07-01

    Positron annihilation spectroscopy (PAS) is a unique tool for the characterization of open-volume defects such as vacancies. Therefore, age hardenable Al alloys, whose decomposition is mainly driven by the vacancy mechanism of diffusion, are often characterized by PAS techniques. Nevertheless, probing the defect state of individual precipitates grown in Al alloys requires a focused positron beam and has not been carried out up to now. In this respect we present the first investigations of the defect state of individual precipitates utilizing the Bonn Positron Microprobe (BPM). Furthermore, the analysis of the experimental data has to be facilitated by theoretical calculations of the observables of positron annihilation spectroscopy.

  20. The impact of new polarization data from Bonn, Mainz and Jefferson Laboratory on γp → πN multipoles

    Energy Technology Data Exchange (ETDEWEB)

    Anisovich, A.V.; Nikonov, V.; Sarantsev, A. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik der, Bonn (Germany); PNPI, NRC "Kurchatov Institute", Gatchina (Russian Federation); Beck, R.; Gottschall, M.; Hartmann, J.; Klempt, E.; Thiel, A.; Thoma, U.; Wunderlich, Y. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik der, Bonn (Germany); Doering, M. [George Washington University, Department of Physics, Washington, DC (United States); Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Kashevarov, V.; Ostrick, M.; Tiator, L. [Institut fuer Kernphysik der Universitaet Mainz, Mainz (Germany); Meissner, Ulf G. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik der, Bonn (Germany); Universitaet Bonn, Bethe Center for Theoretical Physics, Bonn (Germany); Juelich Center for Hadron Physics, JARA FAME and JARA HPC, Forschungszentrum Juelich, Institut fuer Kernphysik, Institute for Advanced Simulation, Juelich (Germany); Roenchen, D. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik der, Bonn (Germany); Universitaet Bonn, Bethe Center for Theoretical Physics, Bonn (Germany); Strakovsky, I.; Workman, R. [George Washington University, Department of Physics, Washington, DC (United States)

    2016-09-15

    New data on pion photoproduction off the proton have been included in the partial-wave analyses Bonn-Gatchina and SAID and in the dynamical coupled-channel approach Juelich-Bonn. All reproduce the recent new data well: the double-polarization data for E, G, H, P and T in γp → π^0 p from ELSA, the beam asymmetry Σ for γp → π^0 p and π^+ n from Jefferson Laboratory, and the precise new differential cross section and beam asymmetry data Σ for γp → π^0 p from MAMI. The new fit results for the multipoles are compared with predictions not taking into account the new data. The mutual agreement is improved considerably but still far from being perfect. (orig.)

  1. Construction and test of the Bonn frozen spin target

    International Nuclear Information System (INIS)

    Dutz, H.

    1989-04-01

    For γN→πN and γd→pn scattering experiments at the PHOENICS detector, a new 'Bonn frozen spin target' (BOFROST) has been developed. The target, with a maximum volume of 30 cm^3, is cooled in a vertical 3He-4He dilution cryostat. The lowest temperature of the dilution cryostat in the frozen spin mode should be 50 mK. In a first stage, the magnet system consists of two superconducting solenoids: a polarisation magnet with a maximum field of 7 T and a homogeneity of 10^-5 over the target area, and a 'vertical holding' magnet with a maximum field in the target area of 0.57 T. This work describes the construction and set-up of the 'frozen spin target' in the laboratory and the first tests of the dilution cryostat and the superconducting magnet system. (orig.) [de]

  2. Simulation of deep one- and two-dimensional redshift surveys

    International Nuclear Information System (INIS)

    Park, Changbom; Gott, J.R. III

    1991-01-01

    We show that slice or pencil-beam redshift surveys of galaxies can be simulated in a box with non-equal sides. This method saves a lot of computer time and memory while providing essentially the same results as from whole-cube simulations. A 2457.6 h^-1 Mpc-long rod (out to a redshift z = 0.58 in two opposite directions) is simulated using the standard biased Cold Dark Matter model as an example to mimic the recent deep pencil-beam surveys by Broadhurst et al. The structures (spikes) we see in these simulated samples occur when the narrow pencil-beam pierces walls, filaments and clusters appearing randomly along the line-of-sight. We have applied a statistical test for goodness of fit to a periodic lattice to the observations and the simulations. (author)

  3. Deep Sea Coral voucher sequence dataset - Identification of deep-sea corals collected during the 2009 - 2014 West Coast Groundfish Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Data for this project resides in the West Coast Groundfish Bottom Trawl Survey Database. Deep-sea corals are often components of trawling bycatch, though their...

  4. Modeling and forecasting the volatility of Islamic unit trust in Malaysia using GARCH model

    Science.gov (United States)

    Ismail, Nuraini; Ismail, Mohd Tahir; Karim, Samsul Ariffin Abdul; Hamzah, Firdaus Mohamad

    2015-10-01

    Due to the tremendous growth of Islamic unit trust in Malaysia since it was first introduced on 12th of January 1993 through the fund named Tabung Ittikal managed by Arab-Malaysian Securities, vast studies have been done to evaluate the performance of Islamic unit trust offered in Malaysia's capital market. Most of the studies found that one of the factors that affect the performance of the fund is the volatility level. Higher volatility produces better performance of the fund. Thus, we believe that a strategy must be set up by the fund managers in order for the fund to perform better. By using a series of net asset value (NAV) data of three different types of fund namely CIMB-IDEGF, CIMB-IBGF and CIMB-ISF from a fund management company named CIMB Principal Asset Management Berhad over a six years period from 1st January 2008 until 31st December 2013, we model and forecast the volatility of these Islamic unit trusts. The study found that the best fitting models for CIMB-IDEGF, CIMB-IBGF and CIMB-ISF are ARCH(4), GARCH(3,3) and GARCH(3,1) respectively. Meanwhile, the fund that is expected to be the least volatile is CIMB-IDEGF and the fund that is expected to be the most volatile is CIMB-IBGF.

  5. Noise behavior of the Garching 30-meter prototype gravitational-wave detector

    International Nuclear Information System (INIS)

    Shoemaker, D.; Schilling, R.; Schnupp, L.; Winkler, W.; Maischberger, K.; Ruediger, A.

    1988-01-01

    The prototype gravitational-wave detector at Garching is described: in a laser-illuminated Michelson interferometer having arms 30 m in length, a folded optical path of 3 km is realized. The origin, action, and magnitude of possible noise sources are given. The agreement between the expected and measured noise is good. For a band of astrophysical interest, extending from 1 to 6 kHz, the quantum shot noise corresponding to a light power of P = 0.23 W is dominant. In terms of the dimensionless strain h, the best sensitivity in a 1-kHz bandwidth is h = 3 x 10^-18, comparable to the most sensitive Weber-bar-type antennas.

  6. Day-ahead electricity price forecasting using wavelet transform combined with ARIMA and GARCH models

    International Nuclear Information System (INIS)

    Tan, Zhongfu; Zhang, Jinliang; Xu, Jun; Wang, Jianhui

    2010-01-01

    This paper proposes a novel price forecasting method based on wavelet transform combined with ARIMA and GARCH models. By wavelet transform, the historical price series is decomposed and reconstructed into one approximation series and some detail series. Then each subseries can be separately predicted by a suitable time series model. The final forecast is obtained by composing the forecasted results of each subseries. This proposed method is examined on Spanish and PJM electricity markets and compared with some other forecasting methods. (author)
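    A hedged sketch of the decompose-forecast-recompose idea is given below in Python, assuming the PyWavelets and statsmodels packages are available. It rebuilds one subseries per wavelet level, forecasts each with an ARIMA model and sums the forecasts; the GARCH step applied to the residual variance in the paper is omitted for brevity, and the wavelet choice, model orders and toy price series are placeholders.

        import numpy as np
        import pywt
        from statsmodels.tsa.arima.model import ARIMA

        def wavelet_arima_forecast(series, horizon=24, wavelet="db4", level=3,
                                   order=(2, 0, 1)):
            # Decompose the series, rebuild one subseries per coefficient level
            # (approximation plus details), forecast each subseries with ARIMA
            # and sum the individual forecasts.
            series = np.asarray(series, dtype=float)
            coeffs = pywt.wavedec(series, wavelet, level=level)
            forecast = np.zeros(horizon)
            for i in range(len(coeffs)):
                keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
                sub = pywt.waverec(keep, wavelet)[: len(series)]
                fit = ARIMA(sub, order=order).fit()
                forecast += np.asarray(fit.forecast(steps=horizon))
            return forecast

        # Toy hourly "price" series with a daily cycle, standing in for market data.
        t = np.arange(2000)
        prices = 50 + 10 * np.sin(2 * np.pi * t / 24) + np.random.normal(0, 2, size=t.size)
        print(wavelet_arima_forecast(prices)[:5])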

  7. FY 1998 report on the verification survey of geothermal exploration technology, etc. 2/2. Survey of deep geothermal resource; 1998 nendo chinetsu tansa gijutsu nado kensho chosa hokokusho. 2/2. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-01

    For the purpose of commercializing deep geothermal resource, a deep exploration well of 4000m class was drilled in the existing geothermal development area to survey the situation of deep geothermal resource existence and the availability. Concretely, the deep geothermal exploration well was drilled for study in the Kakkonda area, Shizukuishi town, Iwate prefecture, to clarify the situation of deep geothermal resource existence and the whole image of geothermal system. Consideration was made of the deep geothermal exploration method, systematization of deep high temperature drilling technology, and availability of deep geothermal resource. The results of the survey were summed up as follows: 1) general remarks; 2) deep exploration well drilling work; 3) details of the study. This report contained 3). In 3), the items were as follows: heightening of accuracy of the deep geothermal resource exploration method, making of a geothermal model in the Kakkonda area, study of deep drilling technology, study of deep fluid utilization technology, and making of a guide for deep geothermal resource exploration/development in the Kakkonda area. As to the technology of high temperature deep geothermal well drilling, studies were made of the borehole cooling method, mud water cooling method, survey of deterioration of casing with age, etc. (NEDO)

  8. The polarized proton and deuteron beam at the Bonn isochronous cyclotron

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, K G; Enders, R; Hammon, W; Krause, K D; Lesemann, D; Scholzen, A [Bonn Univ. (F.R. Germany). Inst. fuer Angewandte Physik; Euler, K; Schueller, B [Bonn Univ. (F.R. Germany). Inst. fuer Strahlen- und Kernphysik

    1976-02-15

    The present state of the polarized proton and deuteron source at the Bonn cyclotron is described. The source, which is of the atomic beam type, gives typical ion beam intensities of 2 μA for protons and 3 μA for deuterons. The overall transmission from the source to the first stopper after extraction from the cyclotron is 3%. Target currents with an energy resolution E/ΔE = 500 are 20 nA for deuterons and 10 nA for protons. For the proton beam, a polarization P = -0.71 was measured. For the deuteron beam, a pure vector polarization P_z = -0.47 or various mixtures of vector and tensor polarization are obtained.

  9. Closed-orbit correction using the new beam position monitor electronic of Elsa Bonn

    CERN Document Server

    Dietrich, J; Keil, J

    2000-01-01

    RF and digital electronics developed at the Forschungszentrum Jülich/IKP were integrated to form the new beam position monitor (BPM) system at the Electron Stretcher Accelerator (ELSA) of the University of Bonn. With this system, the preservation of the polarization level during acceleration has recently been improved through good correction of the closed orbit. All BPM offsets relative to the magnetic quadrupole centers were determined by the method of beam-based alignment. The optics functions measured by the BPM system are in good agreement with theoretical predictions.

  10. Ettore Spalletti: Salle des départs a Garches

    Directory of Open Access Journals (Sweden)

    Andrea Dall'Asta

    2012-06-01

    Full Text Available In 1996 Ettore Spalletti re-invented the morgue of the Raymond Poincaré hospital in Garches, at the gates of Paris. It is a space free of religious symbols, in which the bodies, placed in their coffins, are exposed to the last gaze of family and friends. It is the Salle des départs, the hall of departures, where every person - Muslim, Christian, non-believer - is called to stay for the brief passage from the world of the living to that of the dead, towards a new life. Spalletti's intent is to humanize a place that helps people work through their grief, instilling peace and serenity. Starting from an anonymous space, thanks to the expressive force of the blue colour, like that of the mantle of a Madonna welcoming her children, the artist immerses us in a place that presents itself as the embodiment of purity, becoming a symbol of the promise of transcendence and of the absolute.

  11. DEEP 21 cm H I OBSERVATIONS AT z ∼ 0.1: THE PRECURSOR TO THE ARECIBO ULTRA DEEP SURVEY

    International Nuclear Information System (INIS)

    Freudling, Wolfram; Zwaan, Martin; Staveley-Smith, Lister; Meyer, Martin; Catinella, Barbara; Minchin, Robert; Calabretta, Mark; Momjian, Emmanuel; O'Neil, Karen

    2011-01-01

    The 'ALFA Ultra Deep Survey' (AUDS) is an ongoing 21 cm spectral survey with the Arecibo 305 m telescope. AUDS will be the most sensitive blind survey undertaken with Arecibo's 300 MHz Mock spectrometer. The survey searches for 21 cm H I line emission at redshifts between 0 and 0.16. The main goals of the survey are to investigate the H I content and probe the evolution of H I gas within that redshift region. In this paper, we report on a set of precursor observations with a total integration time of 53 hr. The survey detected a total of eighteen 21 cm emission lines at redshifts between 0.07 and 0.15 in a region centered around α_2000 ∼ 0^h, δ ∼ 15° 42'. The rate of detection is consistent with the one expected from the local H I mass function. The derived relative H I density at the median redshift of the survey is ρ_HI[z = 0.125] = (1.0 ± 0.3) ρ_0, where ρ_0 is the H I density at zero redshift.

  12. Microcomputer development at Bonn and its application in FADC data reduction

    International Nuclear Information System (INIS)

    Mertens, V.; Schmitt, H. von der.

    1983-04-01

    With the 16/32-bit microprocessors 68K, high CPU performance (comparable to a VAX 11/780) and large address space (several Mbytes) become available for data processing on the crate level. Some software effort is necessary to take full advantage of such devices. For the Camac environment, an ACC (auxiliary crate controller) based on the 68K has been built at Bonn. Fortran-77 with specific support of real-time applications in the crate environment is made available as portable cross software. A comprehensive compiler-writing language has been developed and employed for this purpose, offering the flexibility to adapt the compiler to specific hardware conditions. A first application of this hardware/software system in FADC data reduction is described. (orig.)

  13. Very deep IRAS survey - constraints on the evolution of starburst galaxies

    International Nuclear Information System (INIS)

    Hacking, P.; Houck, J.R.; Condon, J.J.; National Radio Astronomy Observatory, Charlottesville, VA)

    1987-01-01

    Counts of sources (primarily starburst galaxies) from a deep 60 micron IRAS survey published by Hacking and Houck (1987) are compared with four evolutionary models. The counts below 100 mJy are higher than expected if no evolution has taken place out to a redshift of approximately 0.2. Redshift measurements of the survey sources should be able to distinguish between luminosity-evolution and density-evolution models and detect as little as a 20 percent brightening or increase in density of infrared sources per billion years (H_0 = 100 km/s per Mpc). Starburst galaxies cannot account for the reported 100 micron background without extreme evolution at high redshifts. 21 references

  14. Deep fracturing of granite bodies. Literature survey, geostructural and geostatistic investigations

    International Nuclear Information System (INIS)

    Bles, J.L.; Blanchin, R.

    1986-01-01

    This report deals with investigations about deep fracturing of granite bodies, which were performed within two cost-sharing contracts between the Commission of the European Communities, the Commissariat a l'Energie Atomique and the Bureau de Recherches Geologiques et Minieres. The aim of this work was to study the evolution of fracturing in granite from the surface to larger depths, so that guidelines can be identified in order to extrapolate, at depth, the data obtained from surface investigations. These guidelines could eventually be used for feasibility studies about radioactive waste disposal. The results of structural and geostatistic investigations about the St. Sylvestre granite, as well as the literature survey about fractures encountered in two long Alpine galleries (Mont-Blanc tunnel and Arc-Isere water gallery), in the 1000 m deep borehole at Auriat, and in the Bassies granite body (Pyrenees) are presented. These results show that, for radioactive waste disposal feasibility studies: 1. The deep state of fracturing in a granite body can be estimated from results obtained at the surface; 2. Studying only the large fault network would be insufficient, both for surface investigations and for studies in deep boreholes and/or in underground galleries; 3. It is necessary to study orientations and frequencies of small fractures, so that structural mapping and statistical/geostatistical methods can be used in order to identify zones of higher and lower fracturing

  15. Simulation of deep one- and two-dimensional redshift surveys

    Science.gov (United States)

    Park, Changbom; Gott, J. Richard, III

    1991-03-01

    It is shown that slice or pencil-beam redshift surveys of galaxies can be simulated in a box with nonequal sides. This method saves a lot of computer time and memory while providing essentially the same results as from whole-cube simulations. A 2457.6 h^-1 Mpc-long rod (out to a redshift z = 0.58 in two opposite directions) is simulated using the standard biased cold dark matter model as an example to mimic the recent deep pencil-beam surveys by Broadhurst et al. (1990). The structures (spikes) seen in these simulated samples occur when the narrow pencil-beam pierces walls, filaments, and clusters appearing randomly along the line-of-sight. A statistical test for goodness of fit to a periodic lattice has been applied to the observations and the simulations. It is found that the statistical significance level (P = 15.4 percent) is not strong enough to reject the null hypothesis that the observations and the simulations were drawn at random from the same set.

  16. Measures and experiments to the reduction of mercury in the waste recycling plant Bonn. The gold amalgam procedure - Attempt of precipitation in the scrubber with TMT 15 - enrichment of the Dioxorb absorbent with activated charcoal; Massnahmen und Versuche zur Quecksilberminderung in der Abfallverwertungsanlage Bonn. Das Gold-Amalgamverfahren: Faellungsversuch im Waescher mit TMT 15 - Anreicherung des Dioxorb-Absorbens mit Aktivkohle

    Energy Technology Data Exchange (ETDEWEB)

    Heidrich, R. [Muellverwertungsanlage Bonn GmbH, Bonn (Germany)

    2007-07-01

    MVA Bonn GmbH (Bonn, Federal Republic of Germany) operates a plant for the thermal utilization of municipal waste. The author of the contribution under consideration reports on measures and experiments for the reduction of mercury in the Bonn waste processing plant. In particular, three procedures are discussed: (a) the gold amalgam procedure; (b) precipitation tests in the scrubber with TMT 15; (c) enrichment of the adsorbent Dioxorb with activated charcoal. The gold amalgam procedure is a reliable procedure for the reduction of the mercury content in exhaust gases; it is, however, also a very expensive one. The packing containers of all three procedures should be replaced gradually with new containers during the next revisions, since there is a considerable risk of substantial quantities of mercury accumulating on the packing material. A further possibility is feeding the prepared solution over a mist eliminator; here, the risk of blockage plays a substantial role. A thorough and reliable lowering of the mercury emission can be achieved by means of the activated-charcoal-containing Dioxorb.

  17. Oil shock transmission to stock market returns: Wavelet-multivariate Markov switching GARCH approach

    International Nuclear Information System (INIS)

    Jammazi, Rania

    2012-01-01

    Since oil prices are typically governed by nonlinear and chaotic behavior, it has become rather difficult to capture the dominant properties of their fluctuations. In recent years, unprecedented interest has emerged in decomposition methods that capture drifts or spikes in such data. At the same time, our understanding of the nature of crude oil price shocks and their effects on stock market returns has evolved noticeably. We build on these findings to investigate two issues that have been at the center of recent debates on the effect of crude oil shocks on the stock market returns of five developed countries (USA, UK, Japan, Germany and Canada). First, we analyze whether shocks and/or volatility emanating from two major crude oil markets are transmitted to the equity markets. We do this by applying the Haar à trous wavelet decomposition to monthly real crude oil series in a first step, and the trivariate BEKK Markov-switching GARCH model to analyze the effect of the smooth part on the degree of stock market instability in a second step. The motivation behind the use of the former method is that noise and erratic behavior, which often appear at the edge of the signal, can affect the quality of the estimated shock and thus distort the results on shock transmission to the stock market. The proposed model is able to circumvent the path dependency problem that can influence the robustness of the predictions and can provide useful information for investors and government agencies that have largely based their views on the notion that crude oil markets affect stock market returns negatively. Second, under the hypothesis of common increased volatility, we investigate whether these states happen around the identified international crises. Indeed, the results show that the Haar à trous wavelet decomposition method appears to be an important step toward improving the accuracy of the smooth signal in detecting key real crude oil volatility features. Additionally

  18. research document no. 27 bis. After the Hague, Bonn and Marrakech: the future international market for emissions permits and the issue of hot air

    International Nuclear Information System (INIS)

    Blanchard, O.; Criqui, P.; Kitous, A.

    2002-01-01

    The main objective of this paper is to assess the Bonn-Marrakech agreement in terms of abatement cost and emission trading, as compared with the initial agreement reached in Kyoto (the Kyoto Protocol). Our reference case (the Initial Deal) does not include the use of sinks credits, as the Kyoto Protocol gives neither explicit figures nor a method to estimate them. In addition, two hypothetical situations are considered. The first describes the ''missed compromise'' that could have emerged among all Parties in November 2000 in The Hague. The second is a virtual case where the US is assumed to be part of the Bonn-Marrakech Agreement, along with all the other Parties. These two cases help shed light on the potential pitfalls of the Bonn-Marrakech Agreement. In the current situation, the US is out of the negotiation process and has no emission reduction commitment. Given the projections of carbon dioxide (CO2) emissions used in this study, the Former Soviet Union countries (FSU) and the Eastern European Economies (EEE) that are part of Annex B have potentially enough Hot Air to fulfill the overall commitment of the Annex B bubble, without any domestic abatement effort from the other Annex B countries. We show that in the theoretical case where no limit would be imposed on the selling of Hot Air, the permit price according to the POLES model would be zero, as no market equilibrium could take place. This is why, next, we examine the economic impacts of restrictions on hot air trading, for FSU and EEE as well as for the other countries. We shed light on the potential market power of these countries that arises from the Bonn-Marrakech Agreement. (author)

  19. Probabilistic forecasting of the solar irradiance with recursive ARMA and GARCH models

    DEFF Research Database (Denmark)

    David, M.; Ramahatana, F.; Trombe, Pierre-Julien

    2016-01-01

    Forecasting of solar irradiance is key to increasing the penetration rate of solar energy in the energy grids. Indeed, anticipating the fluctuations of the solar resource allows better management of the means of electricity production and better operation...... sky index show some similarities with those of financial time series. The aim of this paper is to assess the performance of a combination of two linear models (ARMA and GARCH) commonly used in econometrics in order to provide probabilistic forecasts of solar irradiance. In addition, a recursive...... regarding the statistical distribution of the error, the reliability of the probabilistic forecasts is of the same order of magnitude as in other works done in the field of solar forecasting.
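
    A minimal sketch of the kind of recursive mean/variance forecast the abstract refers to is given below, assuming Python's arch package and a synthetic clear-sky index series; the AR(2)+GARCH(1,1) specification and the Gaussian quantiles are illustrative stand-ins for the ARMA/GARCH pair used in the paper.

      import numpy as np
      from arch import arch_model
      from scipy import stats

      # Hypothetical hourly clear-sky index series (illustrative only).
      rng = np.random.default_rng(1)
      kt = 0.7 + 0.2 * rng.standard_normal(1000)

      # AR(2) mean with GARCH(1,1) conditional variance (the arch package
      # exposes AR rather than full ARMA mean equations).
      am = arch_model(kt, mean='AR', lags=2, vol='GARCH', p=1, q=1, dist='normal')
      res = am.fit(disp='off')

      # One-step-ahead probabilistic forecast: mean and variance forecasts are
      # turned into quantiles under the assumed Gaussian error distribution.
      fc = res.forecast(horizon=1)
      mu = fc.mean.iloc[-1, 0]
      sigma = np.sqrt(fc.variance.iloc[-1, 0])
      print(mu + sigma * stats.norm.ppf([0.05, 0.5, 0.95]))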

  20. Photometric redshifts for the next generation of deep radio continuum surveys - I. Template fitting

    Science.gov (United States)

    Duncan, Kenneth J.; Brown, Michael J. I.; Williams, Wendy L.; Best, Philip N.; Buat, Veronique; Burgarella, Denis; Jarvis, Matt J.; Małek, Katarzyna; Oliver, S. J.; Röttgering, Huub J. A.; Smith, Daniel J. B.

    2018-01-01

    We present a study of photometric redshift performance for galaxies and active galactic nuclei detected in deep radio continuum surveys. Using two multiwavelength data sets, over the NOAO Deep Wide Field Survey Boötes and COSMOS fields, we assess photometric redshift (photo-z) performance for a sample of ∼4500 radio continuum sources with spectroscopic redshifts relative to those of ∼63 000 non-radio-detected sources in the same fields. We investigate the performance of three photometric redshift template sets as a function of redshift, radio luminosity and infrared/X-ray properties. We find that no single template library is able to provide the best performance across all subsets of the radio-detected population, with variation in the optimum template set both between subsets and between fields. Through a hierarchical Bayesian combination of the photo-z estimates from all three template sets, we are able to produce a consensus photo-z estimate that equals or improves upon the performance of any individual template set.
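
    The hierarchical Bayesian combination of several template-based estimates can be sketched as follows; the redshift grid, the individual PDFs and the "bad-estimate" fraction are hypothetical values chosen for illustration, and the snippet only conveys the general idea of mixing each PDF with a flat component before multiplying.

      import numpy as np

      zgrid = np.linspace(0.0, 6.0, 601)

      def gaussian_pdf(z0, sig):
          p = np.exp(-0.5 * ((zgrid - z0) / sig) ** 2)
          return p / np.trapz(p, zgrid)

      # Three template-set estimates for one source (hypothetical).
      pdfs = [gaussian_pdf(1.00, 0.08), gaussian_pdf(1.05, 0.10), gaussian_pdf(1.60, 0.15)]

      f_bad = 0.2                                   # prior weight of a flat "unreliable" component
      flat = np.ones_like(zgrid) / (zgrid[-1] - zgrid[0])

      # Each estimate is treated as a mixture of its own PDF and the flat
      # component; the consensus is the renormalised product of the mixtures.
      consensus = np.ones_like(zgrid)
      for p in pdfs:
          consensus *= f_bad * flat + (1.0 - f_bad) * p
      consensus /= np.trapz(consensus, zgrid)

      print(zgrid[np.argmax(consensus)])            # consensus peak redshift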

  1. THE IMPACT OF POLITICAL AND ECONOMIC NEWS ON THE EURO/RON EXCHANGE RATE: A GARCH APPROACH

    OpenAIRE

    Cristi Spulbar; Mihai Nitoi

    2012-01-01

    Within this study we try to capture the impact of political news and economic news from the euro area on the exchange rate between the Romanian currency and the euro. In order to do this we used a GARCH model. As we observed, both variables influence the exchange rate, implying a depreciation of the national currency and increased volatility. The political news and the economic news positively affect the euro/ron exchange rate volatility. The conjunction of the two factors, as has happened in the recent peri...

  2. Research document no. 27 bis. After the Hague, Bonn and Marrakech: the future international market for emissions permits and the issue of hot air; Cahier de recherche no. 27. Apres La Hague, Bonn et Marrakech: le futur marche international des permis de droits d'emissions et la question de l'air chaud

    Energy Technology Data Exchange (ETDEWEB)

    Blanchard, O.; Criqui, P.; Kitous, A

    2002-01-01

    The main objective of this paper is to assess the Bonn-Marrakech agreement in terms of abatement cost and emission trading, as compared with the initial agreement reached in Kyoto (the Kyoto Protocol). Our reference case (the Initial Deal) does not include the use of sinks credits, as the Kyoto Protocol gives neither explicit figures nor a method to estimate them. In addition, two hypothetical situations are considered. The first describes the ''missed compromise'' that could have emerged among all Parties in November 2000 in The Hague. The second is a virtual case where the US is assumed to be part of the Bonn-Marrakech Agreement, along with all the other Parties. These two cases help shed light on the potential pitfalls of the Bonn-Marrakech Agreement. In the current situation, the US is out of the negotiation process and has no emission reduction commitment. Given the projections of carbon dioxide (CO2) emissions used in this study, the Former Soviet Union countries (FSU) and the Eastern European Economies (EEE) that are part of Annex B have potentially enough Hot Air to fulfill the overall commitment of the Annex B bubble, without any domestic abatement effort from the other Annex B countries. We show that in the theoretical case where no limit would be imposed on the selling of Hot Air, the permit price according to the POLES model would be zero, as no market equilibrium could take place. This is why, next, we examine the economic impacts of restrictions on hot air trading, for FSU and EEE as well as for the other countries. We shed light on the potential market power of these countries that arises from the Bonn-Marrakech Agreement. (author)

  3. The infrared medium-deep survey. II. How to trigger radio AGNs? Hints from their environments

    Energy Technology Data Exchange (ETDEWEB)

    Karouzos, Marios; Im, Myungshin; Kim, Jae-Woo; Lee, Seong-Kook; Jeon, Yiseul; Choi, Changsu; Hong, Jueun; Hyun, Minhee; Jun, Hyunsung David; Kim, Dohyeong; Kim, Yongjung; Kim, Ji Hoon; Kim, Duho; Park, Won-Kee; Taak, Yoon Chan; Yoon, Yongmin [CEOU—Astronomy Program, Department of Physics and Astronomy, Seoul National University, Gwanak-gu, Seoul 151-742 (Korea, Republic of); Chapman, Scott [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, Nova Scotia (Canada); Pak, Soojong [School of Space Research, Kyung Hee University, Yongin-si, Gyeonggi-do 446-701 (Korea, Republic of); Edge, Alastair, E-mail: mkarouzos@astro.snu.ac.kr [Department of Physics, University of Durham, South Road, Durham, DH1 3LE (United Kingdom)

    2014-12-10

    Activity at the centers of galaxies, during which the central supermassive black hole is accreting material, is nowadays accepted to be rather ubiquitous and most probably a phase of every galaxy's evolution. It has been suggested that galactic mergers and interactions may be the culprits behind the triggering of nuclear activity. We use near-infrared data from the new Infrared Medium-Deep Survey and the Deep eXtragalactic Survey of the VIMOS-SA22 field and radio data at 1.4 GHz from the FIRST survey and a deep Very Large Array survey to study the environments of radio active galactic nuclei (AGNs) over an area of ∼25 deg² and down to a radio flux limit of 0.1 mJy and a J-band magnitude of 23 mag AB. Radio AGNs are predominantly found in environments similar to those of control galaxies at similar redshift, J-band magnitude, and (M_u − M_r) rest-frame color. However, a subpopulation of radio AGNs is found in environments up to 100 times denser than their control sources. We thus preclude merging as the dominant triggering mechanism of radio AGNs. By fitting the broadband spectral energy distribution of radio AGNs in the least and most dense environments, we find that those in the least dense environments show higher radio-loudness, higher star formation efficiencies, and higher accretion rates, typical of the so-called high-excitation radio AGNs. These differences tend to disappear at z > 1. We interpret our results in terms of a different triggering mechanism for these sources that is driven by mass loss through winds of young stars created during the observed ongoing star formation.

  4. Survey on deep learning for radiotherapy.

    Science.gov (United States)

    Meyer, Philippe; Noblet, Vincent; Mazzara, Christophe; Lallement, Alex

    2018-05-17

    More than 50% of cancer patients are treated with radiotherapy, either exclusively or in combination with other methods. The planning and delivery of radiotherapy treatment is a complex process, but can now be greatly facilitated by artificial intelligence technology. Deep learning is the fastest-growing field in artificial intelligence and has been successfully used in recent years in many domains, including medicine. In this article, we first explain the concept of deep learning, addressing it in the broader context of machine learning. The most common network architectures are presented, with a more specific focus on convolutional neural networks. We then present a review of the published works on deep learning methods that can be applied to radiotherapy, which are classified into seven categories related to the patient workflow, and can provide some insight into potential future applications. We have attempted to make this paper accessible to both radiotherapy and deep learning communities, and hope that it will inspire new collaborations between these two communities to develop dedicated radiotherapy applications.
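
    For readers unfamiliar with the convolutional networks highlighted in the review, a toy PyTorch classifier of the kind used on imaging data is sketched below; the layer sizes, input shape and task are purely illustrative and are not taken from any of the surveyed works.

      import torch
      import torch.nn as nn

      # Toy 2D CNN: two convolution blocks followed by a linear classifier.
      class TinyCNN(nn.Module):
          def __init__(self, n_classes=2):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                  nn.AdaptiveAvgPool2d(1),
              )
              self.classifier = nn.Linear(32, n_classes)

          def forward(self, x):
              return self.classifier(self.features(x).flatten(1))

      model = TinyCNN()
      dummy_slice = torch.randn(1, 1, 64, 64)   # e.g. one 64x64 image patch
      print(model(dummy_slice).shape)           # torch.Size([1, 2])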

  5. The Munich Near-Infrared Cluster Survey - IX. Galaxy evolution to z ~ 2 from optically selected catalogues

    Science.gov (United States)

    Feulner, Georg; Goranova, Yuliana; Hopp, Ulrich; Gabasch, Armin; Bender, Ralf; Botzler, Christine S.; Drory, Niv

    2007-06-01

    We present B-, R- and I-band-selected galaxy catalogues based on the Munich Near-Infrared Cluster Survey (MUNICS) which, together with the previously used K-selected sample, serve as an important probe of galaxy evolution in the redshift range 0 ≲ z ≲ 2.

  6. Long- and Short-Term Cryptocurrency Volatility Components: A GARCH-MIDAS Analysis

    Directory of Open Access Journals (Sweden)

    Christian Conrad

    2018-05-01

    We use the GARCH-MIDAS model to extract the long- and short-term volatility components of cryptocurrencies. As potential drivers of Bitcoin volatility, we consider measures of volatility and risk in the US stock market as well as a measure of global economic activity. We find that S&P 500 realized volatility has a negative and highly significant effect on long-term Bitcoin volatility. The finding is atypical for volatility co-movements across financial markets. Moreover, we find that the S&P 500 volatility risk premium has a significantly positive effect on long-term Bitcoin volatility. Finally, we find a strong positive association between the Baltic dry index and long-term Bitcoin volatility. This result shows that Bitcoin volatility is closely linked to global economic activity. Overall, our findings can be used to construct improved forecasts of long-term Bitcoin volatility.
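
    The core of the GARCH-MIDAS decomposition is a long-run component built from beta-weighted lags of a low-frequency explanatory variable; the short sketch below illustrates that weighting scheme with made-up parameter values and is not a reproduction of the paper's estimates.

      import numpy as np

      def beta_weights(K, w1=1.0, w2=5.0):
          """Beta lag weights used in MIDAS polynomials (most recent lag first)."""
          k = np.arange(1, K + 1) / (K + 1)
          phi = k ** (w1 - 1.0) * (1.0 - k) ** (w2 - 1.0)
          return phi / phi.sum()

      def midas_long_run(rv_lags, m=0.1, theta=0.3, w2=5.0):
          """Long-run variance component from K lagged realized variances."""
          K = len(rv_lags)
          return np.exp(m + theta * np.sum(beta_weights(K, w2=w2) * rv_lags))

      # Hypothetical 12 monthly realized variances (most recent first).
      rv = np.abs(np.random.default_rng(6).normal(1.0, 0.3, 12))
      print(midas_long_run(rv))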

  7. Conditional Correlation Models of Autoregressive Conditional Heteroskedasticity with Nonstationary GARCH Equations

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    In this paper we investigate the effects of careful modelling the long-run dynamics of the volatilities of stock market returns on the conditional correlation structure. To this end we allow the individual unconditional variances in Conditional Correlation GARCH models to change smoothly over time by incorporating a nonstationary component in the variance equations. The modelling technique to determine the parametric structure of this time-varying component is based on a sequence of specification Lagrange multiplier-type tests derived in Amado and Teräsvirta (2011). The variance equations combine the long-run and the short-run dynamic behaviour of the volatilities. The structure of the conditional correlation matrix is assumed to be either time independent or to vary over time. We apply our model to pairs of seven daily stock returns belonging to the S&P 500 composite index and traded at the New York Stock Exchange...

  8. Modeling climate effects on hip fracture rate by the multivariate GARCH model in Montreal region, Canada

    Science.gov (United States)

    Modarres, Reza; Ouarda, Taha B. M. J.; Vanasse, Alain; Orzanco, Maria Gabriela; Gosselin, Pierre

    2014-07-01

    Changes in extreme meteorological variables and the demographic shift towards an older population have made it important to investigate the association of climate variables and hip fracture by advanced methods in order to determine the climate variables that most affect hip fracture incidence. The nonlinear autoregressive moving average with exogenous variable-generalized autoregressive conditional heteroscedasticity (ARMAX-GARCH) and multivariate GARCH (MGARCH) time series approaches were applied to investigate the nonlinear association between the hip fracture rate in female and male patients aged 40-74 and 75+ years and climate variables in the period 1993-2004 in Montreal, Canada. The models describe 50-56 % of the daily variation in hip fracture rate and identify snow depth, air temperature, day length and air pressure as the variables influencing the time-varying mean and variance of the hip fracture rate. The conditional covariance between climate variables and hip fracture rate increases exponentially, showing that the effect of climate variables on hip fracture rate is most acute when rates are high and climate conditions are at their worst. In Montreal, climate variables, particularly snow depth and air temperature, appear to be important predictors of hip fracture incidence. The association of climate variables and hip fracture does not seem to change linearly with time, but increases exponentially under harsh climate conditions. The results of this study can be used to provide an adaptive climate-related public health program and to guide the allocation of services for avoiding hip fracture risk.

  9. Estimating 'Value at Risk' of crude oil price and its spillover effect using the GED-GARCH approach

    International Nuclear Information System (INIS)

    Fan, Ying; Wei, Yi-Ming; Zhang, Yue-Jun; Tsai, Hsien-Tang

    2008-01-01

    Estimation has been carried out using GARCH-type models, based on the Generalized Error Distribution (GED), for both the extreme downside and upside Value-at-Risks (VaR) of returns in the WTI and Brent crude oil spot markets. Furthermore, according to a new concept of Granger causality in risk, a kernel-based test is proposed to detect extreme risk spillover effect between the two oil markets. Results of an empirical study indicate that the GED-GARCH-based VaR approach appears more effective than the well-recognized HSAF (i.e. historical simulation with ARMA forecasts). Moreover, this approach is also more realistic and comprehensive than the standard normal distribution-based VaR model that is commonly used. Results reveal that there is significant two-way risk spillover effect between WTI and Brent markets. Supplementary study indicates that at the 99% confidence level, when negative market news arises that brings about a slump in oil price return, historical information on risk in the WTI market helps to forecast the Brent market. Conversely, it is not the case when positive news occurs and returns rise. Historical information on risk in the two markets can facilitate forecasts of future extreme market risks for each other. These results are valuable for anyone who needs evaluation and forecasts of the risk situation in international crude oil markets. (author)
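
    A minimal sketch of a GED-GARCH value-at-risk calculation is given below, assuming Python's arch package and synthetic returns in place of the WTI/Brent series; the rescaling of scipy's generalized-normal quantile to unit variance is part of the illustration, and the kernel-based risk spillover test is not shown.

      import numpy as np
      from arch import arch_model
      from scipy import stats

      # Hypothetical daily returns in per cent (stand-in for WTI or Brent).
      rng = np.random.default_rng(3)
      returns = rng.standard_t(df=5, size=2000)

      # GARCH(1,1) with generalized error distribution (GED) innovations.
      res = arch_model(returns, mean='Constant', vol='GARCH', p=1, q=1, dist='ged').fit(disp='off')

      # One-day-ahead 99% downside VaR: mu + sigma * (1% quantile of the
      # standardised GED). scipy's gennorm is rescaled to unit variance.
      fc = res.forecast(horizon=1)
      mu = fc.mean.iloc[-1, 0]
      sigma = np.sqrt(fc.variance.iloc[-1, 0])
      nu = res.params['nu']                                        # fitted GED shape
      q01 = stats.gennorm.ppf(0.01, nu) / np.sqrt(stats.gennorm.var(nu))
      print(-(mu + sigma * q01))                                   # 99% VaR (positive number)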

  10. The Gemini Deep Planet Survey - GDPS

    Energy Technology Data Exchange (ETDEWEB)

    Lafreniere, D; Doyon, R; Marois, C; Nadeau, D; Oppenheimer, B R; Roche, P F; Rigaut, F; Graham, J R; Jayawardhana, R; Johnstone, D; Kalas, P G; Macintosh, B; Racine, R

    2007-06-01

    We present the results of the Gemini Deep Planet Survey, a near-infrared adaptive optics search for giant planets and brown dwarfs around nearby young stars. The observations were obtained with the Altair adaptive optics system at the Gemini North telescope and angular differential imaging was used to suppress the speckle noise of the central star. Detection limits for the 85 stars observed are presented, along with a list of all faint point sources detected around them. Typically, the observations are sensitive to angular separations beyond 0.5″, with 5σ contrast sensitivities in magnitude difference at 1.6 μm of 9.6 at 0.5″, 12.9 at 1″, 15 at 2″, and 16.6 at 5″. For the typical target of the survey, a 100 Myr old K0 star located 22 pc from the Sun, the observations are sensitive enough to detect planets more massive than 2 M_Jup with a projected separation in the range 40-200 AU. Depending on the age, spectral type, and distance of the target stars, the minimum mass that could be detected with our observations can be ∼1 M_Jup. Second epoch observations of 48 stars with candidates (out of 54) have confirmed that all candidates are unrelated background stars. A detailed statistical analysis of the survey results, which provide upper limits on the fractions of stars with giant planet or low mass brown dwarf companions, is presented. Assuming a planet mass distribution dn/dm ∝ m^-1.2 and a semi-major axis distribution dn/da ∝ a^-1, the upper limits on the fraction of stars with at least one planet of mass 0.5-13 M_Jup are 0.29 for the range 10-25 AU, 0.13 for 25-50 AU, and 0.09 for 50-250 AU, with a 95% confidence level; this result is weakly dependent on the semi-major axis distribution power-law index. Without making any assumption on the mass and semi-major axis distributions, the fraction of stars with at least one brown dwarf companion having a semi-major axis in the

  11. GTC/OSIRIS SPECTROSCOPIC IDENTIFICATION OF A FAINT L SUBDWARF IN THE UKIRT INFRARED DEEP SKY SURVEY

    International Nuclear Information System (INIS)

    Lodieu, N.; Osorio, M. R. Zapatero; MartIn, E. L.; Solano, E.; Aberasturi, M.

    2010-01-01

    We present the discovery of an L subdwarf in 234 deg² common to the UK InfraRed Telescope (UKIRT) Infrared Deep Sky Survey Large Area Survey Data Release 2 and the Sloan Digital Sky Survey Data Release 3. This is the fifth L subdwarf announced to date, the first one identified in the UKIRT Infrared Deep Sky Survey, and the faintest known. The blue optical and near-infrared colors of ULAS J135058.86+081506.8 and its overall spectral energy distribution are similar to those of the known mid-L subdwarfs. Low-resolution optical (700-1000 nm) spectroscopy with the Optical System for Imaging and low Resolution Integrated Spectroscopy spectrograph on the 10.4 m Gran Telescopio de Canarias reveals that ULAS J135058.86+081506.8 exhibits a strong K I pressure-broadened line at 770 nm and a red slope longward of 800 nm, features characteristic of L-type dwarfs. From direct comparison with the four known L subdwarfs, we estimate its spectral type to be sdL4-sdL6 and derive a distance in the interval 94-170 pc. We provide a rough estimate of the space density for mid-L subdwarfs of 1.5 × 10⁻⁴ pc⁻³.

  12. The DEEP2 Galaxy Redshift Survey: The Voronoi-Delaunay Method Catalog of Galaxy Groups

    Energy Technology Data Exchange (ETDEWEB)

    Gerke, Brian F.; /UC, Berkeley; Newman, Jeffrey A.; /LBNL, NSD; Davis, Marc; /UC, Berkeley /UC, Berkeley, Astron.Dept.; Marinoni, Christian; /Brera Observ.; Yan, Renbin; Coil, Alison L.; Conroy, Charlie; Cooper, Michael C.; /UC, Berkeley, Astron.Dept.; Faber, S.M.; /Lick Observ.; Finkbeiner, Douglas P.; /Princeton U. Observ.; Guhathakurta, Puragra; /Lick Observ.; Kaiser, Nick; /Hawaii U.; Koo, David C.; Phillips, Andrew C.; /Lick Observ.; Weiner, Benjamin J.; /Maryland U.

    2012-02-14

    We use the first 25% of the DEEP2 Galaxy Redshift Survey spectroscopic data to identify groups and clusters of galaxies in redshift space. The data set contains 8370 galaxies with confirmed redshifts in the range 0.7 ≤ z ≤ 1.4, over one square degree on the sky. Groups are identified using an algorithm (the Voronoi-Delaunay Method) that has been shown to accurately reproduce the statistics of groups in simulated DEEP2-like samples. We optimize this algorithm for the DEEP2 survey by applying it to realistic mock galaxy catalogs and assessing the results using a stringent set of criteria for measuring group-finding success, which we develop and describe in detail here. We find in particular that the group-finder can successfully identify ∼78% of real groups and that ∼79% of the galaxies that are true members of groups can be identified as such. Conversely, we estimate that ∼55% of the groups we find can be definitively identified with real groups and that ∼46% of the galaxies we place into groups are interloper field galaxies. Most importantly, we find that it is possible to measure the distribution of groups in redshift and velocity dispersion, n(σ, z), to an accuracy limited by cosmic variance, for dispersions greater than 350 km s⁻¹. We anticipate that such measurements will allow strong constraints to be placed on the equation of state of the dark energy in the future. Finally, we present the first DEEP2 group catalog, which assigns 32% of the galaxies to 899 distinct groups with two or more members, 153 of which have velocity dispersions above 350 km s⁻¹. We provide locations, redshifts and properties for this high-dispersion subsample. This catalog represents the largest sample to date of spectroscopically detected groups at z ∼ 1.

  13. Fiscal 1996 verification and survey of geothermal prospecting technology etc. 1/2. Survey report on deep-seated geothermal resources; 1996 nendo chinetsu tansa gijutsu nado kensho chosa hokokusho. 1/2. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    For the purpose of reducing the risk that accompanies the exploitation of deep-seated geothermal resources, investigations are conducted into the three factors that govern the formation of geothermal resources at deep levels, that is, the supply of heat from heat sources, the supply of geothermal fluids, and the development of fracture systems contributing to the constitution of reservoir structures. In fiscal 1996, the deep exploration well WD-1a is sidetracked to penetrate a newly assigned target at the 3,000 m depth level. The survey of well geology comprises naked-eye and microscopic observation of core cuttings, the X-ray powder method, examination of fluid inclusions, whole-rock chemical analysis, isotope analysis of minerals, analysis of core fracturing, etc. Data are also collected from mud logging, a survey of water in the well before drilling, and well logging. Furthermore, pressure monitoring is conducted in order to determine the pressure interference between the deep and shallow wells that accompanies multiple discharges from the Kakkonda No. 2 unit, to determine the water permeability between the shallow and deep parts, and to characterize the anisotropy of permeability in the shallow reservoirs. (NEDO)

  14. South China Sea Tectonics and Magnetics: Constraints from IODP Expedition 349 and Deep-tow Magnetic Surveys

    Science.gov (United States)

    Lin, J.; Li, C. F.; Kulhanek, D. K.; Zhao, X.; Liu, Q.; Xu, X.; Sun, Z.; Zhu, J.

    2014-12-01

    The South China Sea (SCS) is the largest low-latitude marginal sea in the world. Its formation and evolution are linked to the complex continental-oceanic tectonic interaction of the Eurasian, Pacific, and Indo-Australian plates. Despite its relatively small size and short history, the SCS has undergone nearly a complete Wilson cycle from continental break-up to seafloor spreading to subduction. In January-March 2014, Expedition 349 of the International Ocean Discovery Program (IODP) drilled five sites in the deep basin of the SCS. Three sites (U1431, U1433, and U1434) cored into oceanic basement near the fossil spreading center on the East and Southwest Subbasins, whereas Sites U1432 and U1435 are located near the northern continent/ocean boundary of the East Subbasin. Shipboard biostratigraphy based on microfossils preserved in sediment directly above or within basement suggests that the preliminary cessation age of spreading in both the East and Southwest Subbasins is around early Miocene (16-20 Ma); however, post-cruise radiometric dating is being conducted to directly date the basement basalt in these subbasins. Prior to the IODP drilling, high-resolution near-seafloor magnetic surveys were conducted in 2012 and 2013 in the SCS with survey lines passing near the five IODP drilling sites. The deep-tow surveys revealed detailed patterns of the SCS magnetic anomalies with amplitude and spatial resolutions several times better than that of traditional sea surface measurements. Preliminary results reveal several episodes of magnetic reversal events that were not recognized by sea surface measurements. Together the IODP drilling and deep-tow magnetic surveys provide critical constraints for investigating the processes of seafloor spreading in the SCS and evolution of a mid-ocean ridge from active spreading to termination.

  15. Deep far infrared ISOPHOT survey in "Selected Area 57" - I. Observations and source counts

    DEFF Research Database (Denmark)

    Linden-Vornle, M.J.D.; Nørgaard-Nielsen, Hans Ulrik; Jørgensen, H.E.

    2000-01-01

    We present here the results of a deep survey in a 0.4 deg² blank field in Selected Area 57 conducted with the ISOPHOT instrument aboard ESA's Infrared Space Observatory (ISO) at both 60 μm and 90 μm. The resulting sky maps have a spatial resolution of 15 × 23 arcsec² per pixel, which is much...

  16. Deep, Broadband Spectral Line Surveys of Molecule-rich Interstellar Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Widicus Weaver, Susanna L.; Laas, Jacob C.; Zou, Luyao; Kroll, Jay A.; Rad, Mary L.; Hays, Brian M.; Sanders, James L.; Cross, Trevor N.; Wehres, Nadine; McGuire, Brett A. [Department of Chemistry, Emory University, Atlanta, GA 30322 (United States); Lis, Dariusz C.; Sumner, Matthew C., E-mail: susanna.widicus.weaver@emory.edu [California Institute of Technology, Cahill Center for Astronomy and Astrophysics 301-17, Pasadena, CA 91125 (United States)

    2017-09-01

    Spectral line surveys are an indispensable tool for exploring the physical and chemical evolution of astrophysical environments due to the vast amount of data that can be obtained in a relatively short amount of time. We present deep, broadband spectral line surveys of 30 interstellar clouds using two broadband λ = 1.3 mm receivers at the Caltech Submillimeter Observatory. This information can be used to probe the influence of physical environment on molecular complexity. We observed a wide variety of sources to examine the relative abundances of organic molecules as they relate to the physical properties of the source (i.e., temperature, density, dynamics, etc.). The spectra are highly sensitive, with noise levels ≤25 mK at a velocity resolution of ∼0.35 km s⁻¹. In the initial analysis presented here, column densities and rotational temperatures have been determined for the molecular species that contribute significantly to the spectral line density in this wavelength regime. We present these results and discuss their implications for complex molecule formation in the interstellar medium.

  17. ELSA - one year of experience with the Bonn Electron Stretcher Accelerator

    International Nuclear Information System (INIS)

    Althoff, K.H.; Drachenfels, W.v.; Dreist, A.; Husmann, D.; Neckenig, M.; Nuhn, H.D.; Schillo, M.; Schittko, F.J.; Wermelskirchen, C.

    1990-01-01

    One and a half years ago the Bonn Electron Stretcher Accelerator ELSA came into operation. Since then, detailed machine studies have been performed between 0.5 and 2 GeV. The control system proved to be a valuable tool for operating the machine. Injection into ELSA, including the fast extraction out of the 2.5 GeV booster synchrotron, has been investigated. The adjustment of dipoles and quadrupoles has been checked by closed orbit measurements. The slow extraction at a third-integer resonance has been studied in detail. Extraction times up to 200 msec with a duty factor of about 35% are possible. For synchrotron radiation experiments, the accumulation and storage of high currents up to 275 mA in ELSA were tested. The beam lifetime (1/e point) at 30 mA is on the order of 15 min (due to vacuum limitations). For the past year the three experiments have been supplied with external beams. (author) 3 refs., 2 figs., 1 tab

  18. Acceleration of polarized electrons in the Bonn electron-accelerator facility ELSA

    International Nuclear Information System (INIS)

    Hoffmann, M.

    2001-12-01

    The future medium-energy physics program at the electron stretcher accelerator ELSA of Bonn University relies mainly on experiments using polarized electrons in the energy range from 1 to 3.2 GeV. To prevent depolarization during acceleration in the circular accelerators, several depolarizing resonances have to be corrected for. Intrinsic resonances are compensated using two pulsed betatron tune-jump quadrupoles. The influence of imperfection resonances is successfully reduced by applying a dynamic closed orbit correction in combination with an empirical harmonic correction on the energy ramp. Both types of resonances and the correction techniques have been studied in detail. The imperfection resonances were used to calibrate the energy of the stretcher ring with high accuracy. A new technique to extract the beam with horizontally oriented polarization was successfully installed. For all energies a polarized electron beam with more than 50% polarization can now be supplied to the experiments at ELSA, which is demonstrated by measurements using a Moeller polarimeter installed in the external beamline. (orig.)

  19. SCUBA-2 Ultra Deep Imaging EAO Survey (STUDIES): Faint-end Counts at 450 μm

    NARCIS (Netherlands)

    Wang, Wei-Hao; Lin, Wei-Ching; Lim, Chen-Fatt; Smail, Ian; Chapman, Scott C.; Zheng, Xian Zhong; Shim, Hyunjin; Kodama, Tadayuki; Almaini, Omar; Ao, Yiping; Blain, Andrew W.; Bourne, Nathan; Bunker, Andrew J.; Chang, Yu-Yen; Chao, Dani C.-Y.; Chen, Chian-Chou; Clements, David L.; Conselice, Christopher J.; Cowley, William I.; Dannerbauer, Helmut; Dunlop, James S.; Geach, James E.; Goto, Tomotsugu; Jiang, Linhua; Ivison, Rob J.; Jeong, Woong-Seob; Kohno, Kotaro; Kong, Xu; Lee, Chien-Hsu; Lee, Hyung Mok; Lee, Minju; Michałowski, Michał J.; Oteo, Iván; Sawicki, Marcin; Scott, Douglas; Shu, Xin Wen; Simpson, James M.; Tee, Wei-Leong; Toba, Yoshiki; Valiante, Elisabetta; Wang, Jun-Xian; Wang, Ran; Wardlow, Julie L.

    2017-01-01

    The SCUBA-2 Ultra Deep Imaging EAO Survey (STUDIES) is a three-year JCMT Large Program aiming to reach the 450 μm confusion limit in the COSMOS-CANDELS region to study a representative sample of the high-redshift far-infrared galaxy population that gives rise to the bulk of the far-infrared

  20. International Deep Planet Survey, 317 stars to determine the wide-separated planet frequency

    Science.gov (United States)

    Galicher, R.; Marois, C.; Macintosh, B.; Zuckerman, B.; Song, I.; Barman, T.; Patience, J.

    2013-09-01

    Since 2000, more than 300 nearby young stars have been observed for the International Deep Planet Survey with adaptive optics systems at Gemini (NIRI/NICI), Keck (NIRC2), and the VLT (NaCo). Massive young AF stars were included in our sample, whereas they have generally been neglected in first-generation surveys because the contrast and target distances are less favorable for imaging substellar companions. The most significant discovery of the campaign is the now well-known HR 8799 multi-planet system. This remarkable finding allows, for the first time, an estimate of the Jovian planet population at large separations (beyond a few AU) instead of only deriving upper limits. During my presentation, I will present the survey, showing images of multiple stars and planets. I will then propose a statistical study of the observed stars, deriving constraints on the Jupiter-like planet frequency at large separations.

  1. A Dataset of Deep-Sea Fishes Surveyed by Research Vessels in the Waters around Taiwan

    Directory of Open Access Journals (Sweden)

    Kwang-Tsao Shao

    2014-12-01

    The study of deep-sea fish fauna is hampered by a lack of data due to the difficulty and high cost of its surveys and collections. Taiwan is situated along the edge of the Eurasian Plate, at the junction of three Large Marine Ecosystems or Ecoregions: the East China Sea, the South China Sea and the Philippines. As nearly two-thirds of its surrounding marine ecosystems are deep-sea environments, Taiwan is expected to hold a rich diversity of deep-sea fish. In the past, however, no research vessels were employed to collect fish data on site; only specimens caught by bottom trawling in waters hundreds of meters deep, lacking precise locality information, were collected from the Dasi and Donggang fishing harbors. Beginning in 2001, with the support of the National Science Council, research vessels were made available to take on the task of systematically collecting deep-sea fish specimens and occurrence records in the waters surrounding Taiwan. By the end of 2006, a total of 3,653 specimens, belonging to 26 orders, 88 families, 198 genera and 366 species, had been collected, together with data such as sampling-site geographical coordinates and water depth, and fish body length and weight. The information, all accessible from the "Database of Taiwan's Deep-Sea Fauna and Its Distribution" (http://deepsea.biodiv.tw/) as part of the "Fish Database of Taiwan," can benefit the study of temporal and spatial changes in the distribution and abundance of fish fauna in the context of global deep-sea biodiversity.

  2. Should investors diversify their portfolios with stocks from major trading countries? A comparative multivariate GARCH-DCC and wavelet correlation analysis

    OpenAIRE

    Dwihasri, Dhaifina; Masih, Mansur

    2015-01-01

    The existing literature has evaluated the performance of stock markets without taking into account the time-varying correlations and the different investment horizons of investors. The present paper attempts to investigate to what extent Indonesian sharia stock returns can earn portfolio diversification benefits when trading with sharia stocks from Indonesia's major trading partners (China, Japan, United States). The recent Multivariate GARCH Dynamic Conditional Correlation, the Continuou...

  3. THE DEEP2 GALAXY REDSHIFT SURVEY: THE VORONOI-DELAUNAY METHOD CATALOG OF GALAXY GROUPS

    Energy Technology Data Exchange (ETDEWEB)

    Gerke, Brian F. [KIPAC, SLAC National Accelerator Laboratory, 2575 Sand Hill Road, MS 29, Menlo Park, CA 94725 (United States); Newman, Jeffrey A. [Department of Physics and Astronomy, 3941 O' Hara Street, Pittsburgh, PA 15260 (United States); Davis, Marc [Department of Physics and Department of Astronomy, Campbell Hall, University of California-Berkeley, Berkeley, CA 94720 (United States); Coil, Alison L. [Center for Astrophysics and Space Sciences, University of California, San Diego, 9500 Gilman Drive, MC 0424, La Jolla, CA 92093 (United States); Cooper, Michael C. [Center for Galaxy Evolution, Department of Physics and Astronomy, University of California-Irvine, Irvine, CA 92697 (United States); Dutton, Aaron A. [Department of Physics and Astronomy, University of Victoria, Victoria, BC V8P 5C2 (Canada); Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C. [UCO/Lick Observatory, University of California-Santa Cruz, Santa Cruz, CA 95064 (United States); Konidaris, Nicholas; Lin, Lihwai [Astronomy Department, Caltech 249-17, Pasadena, CA 91125 (United States); Noeske, Kai [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Rosario, David J. [Max Planck Institute for Extraterrestrial Physics, Giessenbachstr. 1, 85748 Garching bei Muenchen (Germany); Weiner, Benjamin J.; Willmer, Christopher N. A. [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States); Yan, Renbin [Department of Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, ON M5S 3H4 (Canada)

    2012-05-20

    We present a public catalog of galaxy groups constructed from the spectroscopic sample of galaxies in the fourth data release from the Deep Extragalactic Evolutionary Probe 2 (DEEP2) Galaxy Redshift Survey, including the Extended Groth Strip (EGS). The catalog contains 1165 groups with two or more members in the EGS over the redshift range 0 < z < 1.5 and 1295 groups at z > 0.6 in the rest of DEEP2. Twenty-five percent of EGS galaxies and fourteen percent of high-z DEEP2 galaxies are assigned to galaxy groups. The groups were detected using the Voronoi-Delaunay method (VDM) after it has been optimized on mock DEEP2 catalogs following similar methods to those employed in Gerke et al. In the optimization effort, we have taken particular care to ensure that the mock catalogs resemble the data as closely as possible, and we have fine-tuned our methods separately on mocks constructed for the EGS and the rest of DEEP2. We have also probed the effect of the assumed cosmology on our inferred group-finding efficiency by performing our optimization on three different mock catalogs with different background cosmologies, finding large differences in the group-finding success we can achieve for these different mocks. Using the mock catalog whose background cosmology is most consistent with current data, we estimate that the DEEP2 group catalog is 72% complete and 61% pure (74% and 67% for the EGS) and that the group finder correctly classifies 70% of galaxies that truly belong to groups, with an additional 46% of interloper galaxies contaminating the catalog (66% and 43% for the EGS). We also confirm that the VDM catalog reconstructs the abundance of galaxy groups with velocity dispersions above ≈300 km s⁻¹ to an accuracy better than the sample variance, and this successful reconstruction is not strongly dependent on cosmology. This makes the DEEP2 group catalog a promising probe of the growth of cosmic structure that can potentially be used for cosmological tests.

  4. THE DEEP2 GALAXY REDSHIFT SURVEY: THE VORONOI-DELAUNAY METHOD CATALOG OF GALAXY GROUPS

    International Nuclear Information System (INIS)

    Gerke, Brian F.; Newman, Jeffrey A.; Davis, Marc; Coil, Alison L.; Cooper, Michael C.; Dutton, Aaron A.; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Konidaris, Nicholas; Lin, Lihwai; Noeske, Kai; Rosario, David J.; Weiner, Benjamin J.; Willmer, Christopher N. A.; Yan, Renbin

    2012-01-01

    We present a public catalog of galaxy groups constructed from the spectroscopic sample of galaxies in the fourth data release from the Deep Extragalactic Evolutionary Probe 2 (DEEP2) Galaxy Redshift Survey, including the Extended Groth Strip (EGS). The catalog contains 1165 groups with two or more members in the EGS over the redshift range 0 < z < 1.5 and 1295 groups at z > 0.6 in the rest of DEEP2. Twenty-five percent of EGS galaxies and fourteen percent of high-z DEEP2 galaxies are assigned to galaxy groups. The groups were detected using the Voronoi-Delaunay method (VDM) after it has been optimized on mock DEEP2 catalogs following similar methods to those employed in Gerke et al. In the optimization effort, we have taken particular care to ensure that the mock catalogs resemble the data as closely as possible, and we have fine-tuned our methods separately on mocks constructed for the EGS and the rest of DEEP2. We have also probed the effect of the assumed cosmology on our inferred group-finding efficiency by performing our optimization on three different mock catalogs with different background cosmologies, finding large differences in the group-finding success we can achieve for these different mocks. Using the mock catalog whose background cosmology is most consistent with current data, we estimate that the DEEP2 group catalog is 72% complete and 61% pure (74% and 67% for the EGS) and that the group finder correctly classifies 70% of galaxies that truly belong to groups, with an additional 46% of interloper galaxies contaminating the catalog (66% and 43% for the EGS). We also confirm that the VDM catalog reconstructs the abundance of galaxy groups with velocity dispersions above ∼300 km s⁻¹ to an accuracy better than the sample variance, and this successful reconstruction is not strongly dependent on cosmology. This makes the DEEP2 group catalog a promising probe of the growth of cosmic structure that can potentially be used for cosmological tests.

  5. IMPROVED MOCK GALAXY CATALOGS FOR THE DEEP2 GALAXY REDSHIFT SURVEY FROM SUBHALO ABUNDANCE AND ENVIRONMENT MATCHING

    Energy Technology Data Exchange (ETDEWEB)

    Gerke, Brian F.; Wechsler, Risa H.; Behroozi, Peter S. [Kavli Institute for Particle Astrophysics and Cosmology, SLAC National Accelerator Laboratory, M/S 29, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Cooper, Michael C. [Center for Galaxy Evolution, Department of Physics and Astronomy, University of California-Irvine, Irvine, CA 92697 (United States); Yan, Renbin [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Coil, Alison L., E-mail: bgerke@slac.stanford.edu [Center for Astrophysics and Space Sciences, University of California, San Diego, 9500 Gilman Dr., MC 0424, La Jolla, CA 92093 (United States)

    2013-09-15

    We develop empirical methods for modeling the galaxy population and populating cosmological N-body simulations with mock galaxies according to the observed properties of galaxies in survey data. We use these techniques to produce a new set of mock catalogs for the DEEP2 Galaxy Redshift Survey based on the output of the high-resolution Bolshoi simulation, as well as two other simulations with different cosmological parameters, all of which we release for public use. The mock-catalog creation technique uses subhalo abundance matching to assign galaxy luminosities to simulated dark-matter halos. It then adds color information to the resulting mock galaxies in a manner that depends on the local galaxy density, in order to reproduce the measured color-environment relation in the data. In the course of constructing the catalogs, we test various models for including scatter in the relation between halo mass and galaxy luminosity, within the abundance-matching framework. We find that there is no constant-scatter model that can simultaneously reproduce both the luminosity function and the autocorrelation function of DEEP2. This result has implications for galaxy-formation theory, and it restricts the range of contexts in which the mock catalogs can be usefully applied. Nevertheless, careful comparisons show that our new mock catalogs accurately reproduce a wide range of the other properties of the DEEP2 catalog, suggesting that they can be used to gain a detailed understanding of various selection effects in DEEP2.

  6. IMPROVED MOCK GALAXY CATALOGS FOR THE DEEP2 GALAXY REDSHIFT SURVEY FROM SUBHALO ABUNDANCE AND ENVIRONMENT MATCHING

    International Nuclear Information System (INIS)

    Gerke, Brian F.; Wechsler, Risa H.; Behroozi, Peter S.; Cooper, Michael C.; Yan, Renbin; Coil, Alison L.

    2013-01-01

    We develop empirical methods for modeling the galaxy population and populating cosmological N-body simulations with mock galaxies according to the observed properties of galaxies in survey data. We use these techniques to produce a new set of mock catalogs for the DEEP2 Galaxy Redshift Survey based on the output of the high-resolution Bolshoi simulation, as well as two other simulations with different cosmological parameters, all of which we release for public use. The mock-catalog creation technique uses subhalo abundance matching to assign galaxy luminosities to simulated dark-matter halos. It then adds color information to the resulting mock galaxies in a manner that depends on the local galaxy density, in order to reproduce the measured color-environment relation in the data. In the course of constructing the catalogs, we test various models for including scatter in the relation between halo mass and galaxy luminosity, within the abundance-matching framework. We find that there is no constant-scatter model that can simultaneously reproduce both the luminosity function and the autocorrelation function of DEEP2. This result has implications for galaxy-formation theory, and it restricts the range of contexts in which the mock catalogs can be usefully applied. Nevertheless, careful comparisons show that our new mock catalogs accurately reproduce a wide range of the other properties of the DEEP2 catalog, suggesting that they can be used to gain a detailed understanding of various selection effects in DEEP2
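
    The subhalo abundance matching step described in the two records above can be illustrated with a few lines of NumPy; the halo masses and luminosities below are randomly generated placeholders, and the scatter models and density-dependent colour assignment discussed in the abstract are omitted.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical (sub)halo masses from a simulation and a hypothetical
      # galaxy luminosity sample with equal abundance (illustrative only).
      halo_mass = 10.0 ** rng.uniform(10.5, 14.0, size=100000)                     # Msun/h
      luminosity = np.sort(10.0 ** rng.uniform(8.5, 11.5, size=halo_mass.size))[::-1]

      # Zero-scatter abundance matching: the i-th most massive halo receives
      # the i-th most luminous galaxy, i.e. the two ranked lists are matched
      # one-to-one at equal cumulative number density.
      order = np.argsort(halo_mass)[::-1]
      assigned_lum = np.empty_like(luminosity)
      assigned_lum[order] = luminosity

      # assigned_lum[j] is now the mock luminosity hosted by halo j; colours
      # conditioned on local density would be added in a second step.
      print(assigned_lum[:3])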

  7. The NuSTAR Extragalactic Surveys: Initial Results and Catalog from the Extended Chandra Deep Field South

    DEFF Research Database (Denmark)

    Mullaney, J. R.; Del-Moro, A.; Aird, J.

    2015-01-01

    We present the initial results and the source catalog from the Nuclear Spectroscopic Telescope Array (NuSTAR) survey of the Extended Chandra Deep Field South (hereafter, ECDFS)—currently the deepest contiguous component of the NuSTAR extragalactic survey program. The survey covers the full ≈30 arcmin × 30 arcmin area of the field... The rest-frame 10-40 keV luminosities (derived from the observed fluxes) span the range L(10-40 keV) ≈ (0.7-300) × 10^43 erg s⁻¹, sampling below the "knee" of the X-ray luminosity function out to z ~ 0.8-1. Finally, we identify one NuSTAR source that has neither a Chandra nor an XMM-Newton counterpart, but that shows evidence of nuclear activity at infrared...

  8. Fiscal 1996 verification and survey of geothermal prospecting technology etc. 2/2. Survey report on deep-seated geothermal resources; 1996 nendo chinetsu tansa gijutsu nado kensho chosa hokokusho. 2/2. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    For the purpose of reducing the risk that accompanies the exploitation of deep-seated geothermal resources, investigations are conducted into the three factors that govern the formation of geothermal resources at deep levels, that is, the supply of heat from heat sources, the supply of geothermal fluids, and the development of fracture systems contributing to the constitution of reservoir structures. In the study of deep geothermal models for the Kakkonda area, a reservoir structure model, a thermal structure model, and a geothermal fluid/hydraulic structure model are examined. Then, after studying the relations of these three structure models to fracture systems, the boundary between the geothermal fluid convection region and the thermal conduction region near the 3,100 m depth level, the existence of high-salinity fluids and the depth of gas inflow, the extents of the shallow and deep reservoirs, the trend of reduction in reservoir pressure and the anisotropy of water permeability in the shallow reservoirs, etc., an up-to-date reservoir model is constructed that incorporates all the findings obtained so far. As for guidelines for the survey and development of deep geothermal resources, it is decided that deep geothermal survey guidelines, deep fluid production guidelines, and deep well drilling guidelines be prepared and that their economic effectiveness be assessed. (NEDO)

  9. CANDELS : THE COSMIC ASSEMBLY NEAR-INFRARED DEEP EXTRAGALACTIC LEGACY SURVEY-THE HUBBLE SPACE TELESCOPE OBSERVATIONS, IMAGING DATA PRODUCTS, AND MOSAICS

    NARCIS (Netherlands)

    Koekemoer, Anton M.; Faber, S. M.; Ferguson, Henry C.; Grogin, Norman A.; Kocevski, Dale D.; Koo, David C.; Lai, Kamson; Lotz, Jennifer M.; Lucas, Ray A.; McGrath, Elizabeth J.; Ogaz, Sara; Rajan, Abhijith; Riess, Adam G.; Rodney, Steve A.; Strolger, Louis; Casertano, Stefano; Castellano, Marco; Dahlen, Tomas; Dickinson, Mark; Dolch, Timothy; Fontana, Adriano; Giavalisco, Mauro; Grazian, Andrea; Guo, Yicheng; Hathi, Nimish P.; Huang, Kuang-Han; van der Wel, Arjen; Yan, Hao-Jing; Acquaviva, Viviana; Alexander, David M.; Almaini, Omar; Ashby, Matthew L. N.; Barden, Marco; Bell, Eric F.; Bournaud, Frederic; Brown, Thomas M.; Caputi, Karina I.; Cassata, Paolo; Challis, Peter J.; Chary, Ranga-Ram; Cheung, Edmond; Cirasuolo, Michele; Conselice, Christopher J.; Cooray, Asantha Roshan; Croton, Darren J.; Daddi, Emanuele; Dave, Romeel; de Mello, Duilia F.; de Ravel, Loic; Dekel, Avishai; Donley, Jennifer L.; Dunlop, James S.; Dutton, Aaron A.; Elbaz, David; Fazio, Giovanni G.; Filippenko, Alexei V.; Finkelstein, Steven L.; Frazer, Chris; Gardner, Jonathan P.; Garnavich, Peter M.; Gawiser, Eric; Gruetzbauch, Ruth; Hartley, Will G.; Haeussler, Boris; Herrington, Jessica; Hopkins, Philip F.; Huang, Jia-Sheng; Jha, Saurabh W.; Johnson, Andrew; Kartaltepe, Jeyhan S.; Khostovan, Ali A.; Kirshner, Robert P.; Lani, Caterina; Lee, Kyoung-Soo; Li, Weidong; Madau, Piero; McCarthy, Patrick J.; McIntosh, Daniel H.; McLure, Ross J.; McPartland, Conor; Mobasher, Bahram; Moreira, Heidi; Mortlock, Alice; Moustakas, Leonidas A.; Mozena, Mark; Nandra, Kirpal; Newman, Jeffrey A.; Nielsen, Jennifer L.; Niemi, Sami; Noeske, Kai G.; Papovich, Casey J.; Pentericci, Laura; Pope, Alexandra; Primack, Joel R.; Ravindranath, Swara; Reddy, Naveen A.; Renzini, Alvio; Rix, Hans-Walter; Robaina, Aday R.; Rosario, David J.; Rosati, Piero; Salimbeni, Sara; Scarlata, Claudia; Siana, Brian; Simard, Luc; Smidt, Joseph; Snyder, Diana; Somerville, Rachel S.; Spinrad, Hyron; Straughn, Amber N.; Telford, Olivia; Teplitz, Harry I.; Trump, Jonathan R.; Vargas, Carlos; Villforth, Carolin; Wagner, Cory R.; Wandro, Pat; Wechsler, Risa H.; Weiner, Benjamin J.; Wiklind, Tommy; Wild, Vivienne; Wilson, Grant; Wuyts, Stijn; Yun, Min S.

    2011-01-01

    This paper describes the Hubble Space Telescope imaging data products and data reduction procedures for the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS). This survey is designed to document the evolution of galaxies and black holes at z ≈ 1.5-8, and to study

  10. Howard S. Becker, La Bonne focale. De l’utilité des cas particuliers en sciences sociales

    OpenAIRE

    Feryn, Mathieu

    2017-01-01

    How can we understand the long processes that we observe in the field in the course of research conducted in the humanities and social sciences? Written by Howard S. Becker, the book La Bonne focale. De l'utilité des cas particuliers en sciences sociales illustrates the way in which the field sociologist approaches his research by studying particular cases. Taking stock of his methodology and its evolution over the last thirty years, the author, on the one hand, relates his...

  11. Bonnes et mauvaises pratiques dans les corpus consacrés aux sexualités numériques.

    OpenAIRE

    Perea, François

    2015-01-01

    While sensitive corpora are characterized by the presence of personal and intimate data, the observation of online sexual transactions (masturbation forums, personal ads for casual encounters, etc.) or of pornography featuring amateur couples (an extreme form of the principle of extimacy) constitutes a privileged terrain for questioning their technical, ethical and moral aspects. While legal obligations and professional "good practices" (Baude, ...

  12. THE IMPACT OF POLITICAL AND ECONOMIC NEWS ON THE EURO/RON EXCHANGE RATE: A GARCH APPROACH

    Directory of Open Access Journals (Sweden)

    Mihai Niţoi

    2012-12-01

    Within this study we try to capture the impact of political news and economic news from the euro area on the exchange rate between the Romanian currency and the euro. In order to do this we used a GARCH model. As we observed, both variables influence the exchange rate, implying a depreciation of the national currency and increased volatility. The political news and the economic news positively affect the euro/ron exchange rate volatility. The conjunction of the two factors, as has happened in the recent period, is to be avoided because it can have financial and economic consequences at a very high cost for Romania.
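
    A simplified sketch of the kind of GARCH specification used in the study is shown below, assuming Python's arch package, synthetic euro/ron returns and a hypothetical 0/1 news dummy; here the dummy enters the mean equation only, which is a stand-in for the paper's specification rather than a reproduction of it.

      import numpy as np
      import pandas as pd
      from arch import arch_model

      # Hypothetical daily euro/ron log-returns (in per cent) and a 0/1 dummy
      # marking days with political or economic news (illustrative only).
      rng = np.random.default_rng(5)
      n = 1500
      news = rng.integers(0, 2, size=n)
      returns = 0.02 + 0.05 * news + 0.4 * rng.standard_normal(n)

      # Least-squares mean equation with the news dummy as exogenous regressor,
      # GARCH(1,1) conditional variance.
      x = pd.DataFrame({'news': news})
      res = arch_model(returns, x=x, mean='LS', vol='GARCH', p=1, q=1).fit(disp='off')
      print(res.params)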

  13. NASA and ESA astronauts visit ESO. Hubble repair team meets European astronomers in Garching.

    Science.gov (United States)

    1994-02-01

    On Wednesday, February 16, 1994, seven NASA and ESA astronauts and their spouses will spend a day at the Headquarters of the European Southern Observatory. They are the members of the STS-61 crew that successfully repaired the Hubble Space Telescope during a Space Shuttle mission in December 1993. This will be the only stop in Germany during their current tour of various European countries. ESO houses the Space Telescope European Coordinating Facility (ST/ECF), a joint venture by the European Space Agency and ESO. This group of astronomers and computer specialists provides all services needed by European astronomers for observations with the Space Telescope. Currently, the European share is about 20% of the total time available at this telescope. During this visit, a Press Conference will be held on Wednesday, February 16, 11:45 - 12:30 at the ESO Headquarters, Karl-Schwarzschild-Strasse 2, D-85748 Garching bei München. Please note that participation in this Press Conference is by invitation only. Media representatives may obtain invitations from Mrs. E. Volk, ESO Information Service at this address (Tel.: +49-89-32006276; Fax.: +49-89-3202362), until Friday, February 11, 1994. After the Press Conference, between 12:30 - 14:00, a light refreshment will be served at the ESO Headquarters to all participants. From 14:00 - 15:30, the astronauts will meet with students and teachers from the many scientific institutes in Garching in the course of an open presentation at the large lecture hall of the Physics Department of the Technical University. It is a 10 minute walk from ESO to the hall. Later the same day, the astronauts will be back at ESO for a private discussion of various space astronomy issues with their astronomer colleagues, many of whom are users of the Hubble Space Telescope, as well as ground-based telescopes at the ESO La Silla Observatory and elsewhere. The astronauts continue to Switzerland in the evening.

  14. THE SUBSTELLAR POPULATION OF σ ORIONIS: A DEEP WIDE SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Bejar, V. J. S.; Rebolo, R. [Instituto de Astrofisica de Canarias, E-38205 La Laguna, Tenerife (Spain); Zapatero Osorio, M. R.; Martin, E. L. [Centro de Astrobiologia (INTA-CSIC), Crta. Ajalvir km 4, E-28850 Torrejon de Ardoz, Madrid (Spain); Caballero, J. A.; Barrado, D. [Centro de Astrobiologia (INTA-CSIC), ESAC campus, P.O. Box 78, E-28691 Villanueva de la Canada, Madrid (Spain); Mundt, R.; Bailer-Jones, C. A. L., E-mail: vbejar@iac.es, E-mail: mosorio@cab.inta-csic.es, E-mail: ege@cab.inta-csic.es, E-mail: rrl@iac.es, E-mail: caballero@cab.inta-csic.es, E-mail: barrado@cab.inta-csic.es, E-mail: mundt@mpia.de, E-mail: calj@mpia.de [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany)

    2011-12-10

    We present a deep I, Z photometric survey covering a total area of 1.12 deg² of the σ Orionis cluster and reaching completeness magnitudes of I = 22 and Z = 21.5 mag. From I, I - Z color-magnitude diagrams we have selected 153 candidates that fit the previously known sequence of the cluster. They have magnitudes in the range I = 16-23 mag, which corresponds to a mass interval from 0.1 down to 0.008 M☉ at the most probable age of σ Orionis (2-4 Myr). Using J-band photometry, we find that 124 of the 151 candidates within the completeness of the optical survey (82%) follow the previously known infrared photometric sequence of the cluster and are probably members. We have studied the spatial distribution of the very low mass star and brown dwarf population of the cluster and found that there are objects located at distances greater than 30 arcmin to the north and west of σ Orionis that probably belong to different populations of Orion's Belt. For the 102 bona fide σ Orionis cluster member candidates, we find that the radial surface density can be represented by a decreasing exponential function (σ = σ₀ e^(-r/r₀)) with a central density of σ₀ = 0.23 ± 0.03 objects arcmin⁻² and a characteristic radius of r₀ = 9.5 ± 0.7 arcmin. From a statistical comparison with Monte Carlo simulations, we conclude that the spatial distribution of the objects located at the same distance from the center of the cluster is compatible with a Poissonian distribution and, hence, that very low mass stars and brown dwarfs are not mainly forming aggregations or sub-clusters. Using near-infrared JHK-band data from the Two Micron All Sky Survey and the UKIRT Infrared Deep Sky Survey and mid-infrared data from the Infrared Array Camera/Spitzer, we find that about 5%-9% of the brown dwarf candidates in the σ Orionis cluster have K-band excesses and 30% ± 7% of them show mid-infrared excesses at
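
    A minimal sketch of the radial-profile fit quoted above, using scipy to fit the decreasing exponential σ(r) = σ₀ e^(-r/r₀) to binned surface densities; the arrays `r_bin` and `density` are hypothetical placeholders for radial bin centres and measured source densities, not data from the survey:

        import numpy as np
        from scipy.optimize import curve_fit

        def radial_profile(r, sigma0, r0):
            # decreasing exponential surface-density profile
            return sigma0 * np.exp(-r / r0)

        # r_bin: bin centres in arcmin; density: objects per arcmin^2 (placeholder arrays)
        popt, pcov = curve_fit(radial_profile, r_bin, density, p0=[0.2, 10.0])
        sigma0, r0 = popt
        sigma0_err, r0_err = np.sqrt(np.diag(pcov))
        print(f"sigma0 = {sigma0:.2f} +/- {sigma0_err:.2f} arcmin^-2, "
              f"r0 = {r0:.1f} +/- {r0_err:.1f} arcmin")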

  15. THE TAIWAN ECDFS NEAR-INFRARED SURVEY: ULTRA-DEEP J AND K_S IMAGING IN THE EXTENDED CHANDRA DEEP FIELD-SOUTH

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Bau-Ching; Wang, Wei-Hao; Hsieh, Chih-Chiang; Lin, Lihwai; Lim, Jeremy; Ho, Paul T. P. [Institute of Astrophysics and Astronomy, Academia Sinica, P.O. Box 23-141, Taipei 106, Taiwan (China); Yan Haojing [Department of Physics and Astronomy, University of Missouri, Columbia, MO 65211 (United States)

    2012-12-15

    We present ultra-deep J and K_S imaging observations covering a 30' × 30' area of the Extended Chandra Deep Field-South (ECDFS) carried out by our Taiwan ECDFS Near-Infrared Survey (TENIS). The median 5σ limiting magnitudes for all detected objects in the ECDFS reach 24.5 and 23.9 mag (AB) for J and K_S, respectively. In the inner 400 arcmin² region where the sensitivity is more uniform, objects as faint as 25.6 and 25.0 mag are detected at 5σ. Thus, this is by far the deepest J and K_S data set available for the ECDFS. To combine TENIS with the Spitzer IRAC data for obtaining better spectral energy distributions of high-redshift objects, we developed a novel deconvolution technique (IRACLEAN) to accurately estimate the IRAC fluxes. IRACLEAN can minimize the effect of blending in the IRAC images caused by the large point-spread functions and reduce the confusion noise. We applied IRACLEAN to the images from the Spitzer IRAC/MUSYC Public Legacy in the ECDFS survey (SIMPLE) and generated a J+K_S-selected multi-wavelength catalog including the photometry of both the TENIS near-infrared and the SIMPLE IRAC data. We publicly release the data products derived from this work, including the J and K_S images and the J+K_S-selected multi-wavelength catalog.

  16. Incertidumbre, crecimiento del producto, inflación y depreciación cambiaria en México: Evidencia de modelos GARCH multivariados

    OpenAIRE

    Rodolfo Cermeño; Benjamín Oliva

    2010-01-01

    In this paper we empirically investigate the relationship among the conditional means and variances of exchange rate depreciation, inflation and output growth in Mexico using a multivariate GARCH-in-mean model (MGARCH-M). The study is performed with monthly data over the period 1993-2009. The results support the existence of a positive relationship between exchange rate depreciation and its volatility, as well as a negative effect of exchange rate uncertainty on output growth. On the other hand, no...

  17. Volatility Modeling, Seasonality and Risk-Return Relationship in GARCH-in-Mean Framework: The Case of Indian Stock and Commodity Markets

    OpenAIRE

    Brajesh Kumar; Singh, Priyanka

    2008-01-01

    This paper is based on an empirical study of volatility, risk premium and seasonality in the risk-return relation of the Indian stock and commodity markets. The investigation is conducted by means of the Generalized Autoregressive Conditional Heteroscedasticity in mean model (GARCH-in-Mean) introduced by Engle et al. (1987). A systematic approach to modelling volatility in returns is presented. Volatility clustering and asymmetry are examined for the Indian stock and commodity markets. The risk-r...

  18. Using Gaia as an Astrometric Tool for Deep Ground-based Surveys

    Science.gov (United States)

    Casetti-Dinescu, Dana I.; Girard, Terrence M.; Schriefer, Michael

    2018-04-01

    Gaia DR1 positions are used to astrometrically calibrate three epochs' worth of Subaru SuprimeCam images in the fields of globular cluster NGC 2419 and the Sextans dwarf spheroidal galaxy. Distortion-correction "maps" are constructed from a combination of offset dithers and reference to Gaia DR1. These are used to derive absolute proper motions in the field of NGC 2419. Notably, we identify the photometrically-detected Monoceros structure in the foreground of NGC 2419 as a kinematically-cold population of stars, distinct from Galactic-field stars. This project demonstrates the feasibility of combining Gaia with deep, ground-based surveys, thus extending high-quality astrometry to magnitudes beyond the limits of Gaia.
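
    A toy sketch of the general idea of anchoring ground-based astrometry to Gaia: a low-order polynomial distortion correction is solved for by least squares against matched Gaia positions. The arrays (`x_det`, `y_det`, `xi_gaia`, `eta_gaia`) are hypothetical matched detector and Gaia tangent-plane coordinates, and the quadratic form is an illustrative choice rather than the mapping actually used by the authors:

        import numpy as np

        def design_matrix(x, y):
            # 2-D quadratic polynomial terms: 1, x, y, x^2, xy, y^2
            return np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

        # x_det, y_det: measured pixel coordinates of stars; xi_gaia, eta_gaia:
        # corresponding Gaia tangent-plane coordinates (all placeholders)
        A = design_matrix(x_det, y_det)
        coeff_xi, *_ = np.linalg.lstsq(A, xi_gaia, rcond=None)
        coeff_eta, *_ = np.linalg.lstsq(A, eta_gaia, rcond=None)

        # apply the solution to map detector positions onto the Gaia frame and check residuals
        xi_pred = A @ coeff_xi
        eta_pred = A @ coeff_eta
        rms = np.sqrt(np.mean((xi_pred - xi_gaia) ** 2 + (eta_pred - eta_gaia) ** 2))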

  19. Deep learning

    CERN Document Server

    Goodfellow, Ian; Courville, Aaron

    2016-01-01

    Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language proces...

  20. Deep learning for studies of galaxy morphology

    Science.gov (United States)

    Tuccillo, D.; Huertas-Company, M.; Decencière, E.; Velasco-Forero, S.

    2017-06-01

    Establishing accurate morphological measurements of galaxies in a reasonable amount of time for future big-data surveys such as EUCLID, the Large Synoptic Survey Telescope or the Wide Field Infrared Survey Telescope is a challenge. Because of its high level of abstraction with little human intervention, deep learning appears to be a promising approach. Deep learning is a rapidly growing discipline that models high-level patterns in data as complex multilayered networks. In this work we test the ability of deep convolutional networks to provide parametric properties of Hubble Space Telescope-like galaxies (half-light radii, Sérsic indices, total flux, etc.). We simulate a set of galaxies, including the point spread function and realistic noise from the CANDELS survey, and try to recover the main galaxy parameters using deep learning. We compare the results with those obtained with GALFIT, the commonly used profile-fitting software, showing that our method obtains results at least as good as those from GALFIT but, once trained, about five hundred times faster.
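
    A minimal sketch of a convolutional network regressing structural parameters from galaxy image stamps, in the spirit of the approach described above; the architecture, stamp size and target parameters are illustrative assumptions, not the network used in this work:

        import tensorflow as tf
        from tensorflow.keras import layers, models

        N_PARAMS = 3  # e.g. half-light radius, Sersic index, total flux (assumed targets)

        model = models.Sequential([
            tf.keras.Input(shape=(64, 64, 1)),        # single-band image stamps (assumed size)
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dense(N_PARAMS),                   # linear output for regression
        ])
        model.compile(optimizer="adam", loss="mse")

        # images: (N, 64, 64, 1) simulated stamps; params: (N, N_PARAMS) true values (placeholders)
        # model.fit(images, params, epochs=20, batch_size=128, validation_split=0.1)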

  1. LUMINOUS AND HIGH STELLAR MASS CANDIDATE GALAXIES AT z ≈ 8 DISCOVERED IN THE COSMIC ASSEMBLY NEAR-INFRARED DEEP EXTRAGALACTIC LEGACY SURVEY

    International Nuclear Information System (INIS)

    Yan Haojing; Finkelstein, Steven L.; Huang, Kuang-Han; Ryan, Russell E.; Ferguson, Henry C.; Koekemoer, Anton M.; Grogin, Norman A.; Dickinson, Mark; Newman, Jeffrey A.; Somerville, Rachel S.; Davé, Romeel; Faber, S. M.; Papovich, Casey; Guo Yicheng; Giavalisco, Mauro; Lee, Kyoung-soo; Reddy, Naveen; Siana, Brian D.; Cooray, Asantha R.; Hathi, Nimish P.

    2012-01-01

    One key goal of the Hubble Space Telescope Cosmic Assembly Near-Infrared Deep Extragalactic Legacy Survey is to track galaxy evolution back to z ≈ 8. Its two-tiered "wide and deep" strategy bridges significant gaps in existing near-infrared surveys. Here we report on z ≈ 8 galaxy candidates selected as F105W-band dropouts in one of its deep fields, which covers 50.1 arcmin² to 4 ks depth in each of three near-infrared bands in the Great Observatories Origins Deep Survey southern field. Two of our candidates are ∼1 mag brighter in J than any previously known F105W-dropouts. We derive constraints on the bright end of the rest-frame ultraviolet luminosity function of galaxies at z ≈ 8, and show that the number density of such very bright objects is higher than expected from the previous Schechter luminosity function estimates at this redshift. Another two candidates are securely detected in Spitzer Infrared Array Camera images, which are the first such individual detections at z ≈ 8. Their derived stellar masses are on the order of a few × 10⁹ M☉, from which we obtain the first measurement of the high-mass end of the galaxy stellar mass function at z ≈ 8. The high number density of very luminous and very massive galaxies at z ≈ 8, if real, could imply a large stellar-to-halo mass ratio and an efficient conversion of baryons to stars at such an early time.

  2. ALFAZOA Deep HI Survey to Identify Galaxies in the ZOA 37° ≦ l ≦ 43° and -2.5° ≦ b ≦ 3°

    Science.gov (United States)

    Palencia, Kelby; Robert Minchin, Monica Sanchez, Patricia Henning , Rhys Taylor

    2018-01-01

    The Zone of Avoidance (ZOA) is the region of the sky, as viewed from the solar system, that is obscured by the disk of our Galaxy. Due to extinction and confusion in the ZOA, sources behind it appear to be blocked. This project works with data from the Arecibo ALFAZOA Deep survey to identify galaxies in the ZOA in the region 37° ≦ l ≦ 43° and -2.5° ≦ b ≦ 3°. ALFAZOA Deep surveyed part of the inner Galaxy in the ZOA for the first time and is more sensitive than the previous ALFAZOA Shallow survey. FRELLED and Miriad were used to identify and analyze the data in this region. With these data 57 sources were identified. Of these 57 sources, 51 were galaxies, 3 of which were previously known, leaving 48 new galaxies; the remaining 6 sources require follow-up. Two groups of galaxies were also identified, one lying at about 1,500-3,200 km/s and the other between 10,600-11,700 km/s in redshift. The sources in the 10,600-11,700 km/s group also need follow-up, as they lie near the part of the spectrum where the receiver signal starts to weaken.

  3. A Survey on Deep Learning in Medical Image Analysis

    NARCIS (Netherlands)

    Litjens, G.J.; Kooi, T.; Ehteshami Bejnordi, B.; Setio, A.A.A.; Ciompi, F.; Ghafoorian, M.; Laak, J.A.W.M. van der; Ginneken, B. van; Sanchez, C.I.

    2017-01-01

    Deep learning algorithms, in particular convolutional networks, have rapidly become a methodology of choice for analyzing medical images. This paper reviews the major deep learning concepts pertinent to medical image analysis and summarizes over 300 contributions to the field, most of which appeared

  4. Near-bottom Multibeam Survey Capabilities in the US National Deep Submergence Facility (Invited)

    Science.gov (United States)

    Yoerger, D. R.; McCue, S. J.; Jason; Sentry Operations Groups

    2010-12-01

    The US National Deep Submergence Facility (NDSF) provides near-bottom multibeam mapping capabilities from the autonomous underwater vehicle Sentry and the remotely operated vehicle Jason. These vehicles can be used to depths of 4500 and 6500 m, respectively. Both vehicles are equipped with Reson 7125 400 kHz multibeam sonars as well as compatible navigation equipment (inertial navigation systems, Doppler velocity logs, and acoustic navigation systems). These vehicles have produced maps of rugged Mid-Ocean Ridge terrain in the Galapagos Rift, natural oil and gas seeps off the coast of Southern California, deep coral sites in the Gulf of Mexico, and sites for the Ocean Observatories Initiative off the coast of Oregon. Multibeam surveys are conducted from heights between 20 and 80 meters, allowing the scientific user to select the tradeoff between resolution and coverage rate. In addition to conventional bathymetric mapping, the systems have been used to image methane bubble plumes from natural seeps. This talk will provide summaries of these mapping efforts and describe the data processing pipeline used to produce maps shortly after each dive. Development efforts to reduce navigational errors and reconcile discrepancies between adjacent swaths will also be described.

  5. Safe and peaceful use of nuclear energy - an IAEA perspective. Address. Deutsche Gesellschaft fuer Auswaertige Politik, Bonn, 17 April 1998

    International Nuclear Information System (INIS)

    ElBaradei, M.

    1998-01-01

    The document reproduces the text of the address given by the Director General of the IAEA at the Deutsche Gesellschaft fuer Auswaertige Politik in Bonn on 17 April 1998. After a presentation of the Agency's role in the safe and peaceful use of nuclear energy, the address gives an overview of the main issues facing nuclear energy in the following three major areas: the contribution of nuclear energy to economic and social development, nuclear safety, and verification. In the last part, the Director General makes some comments about the future.

  6. Peer-mentoring Program during the Preclinical Years of Medical School at Bonn University: a Project Description.

    Science.gov (United States)

    Lapp, Hendrik; Makowka, Philipp; Recker, Florian

    2018-01-01

    Introduction: To better prepare young medical students in a thorough and competent manner for the ever-increasing clinical, scientific and psychosocial requirements, universities should enable a close, personal transfer of experience and knowledge. Structured mentoring programs are a promising approach to incorporate clinical subjects earlier into the preclinical training. Such a mentoring program facilitates the prioritization of concepts from a broad, theory-heavy syllabus. Here we report the experiences and results of the preclinical mentoring program of Bonn University, which was introduced in the winter semester of 2012/2013. Project description: The program is characterized by the concept of peer-to-peer teaching during the preclinical semesters of medical school. Regular, voluntary course meetings with different clinical case examples provide students the opportunity to apply knowledge acquired from the basic science curricula; furthermore, a personal contact for advice and support is ensured. Thus, an informal exchange of experiences is made possible, which provides the students with motivational and learning aids, in particular for the oral examination at the end of the premedical semesters as well as for other examinations during medical school. Results: Over the course of the preceding three years the number of participants and the interest in the program grew steadily. The analysis of collected evaluations confirms very good communication between mentors and students (>80%), as well as consistently good to very good quality and usefulness of the mentors' subject-specific and other advice. The overall final evaluation of the mentoring program was always good to very good (winter semester: very good 64.8±5.0%, good 35.2±5.0%; summer semester: very good 83.9±7.5%, good 16.1±7.5%). Summary: In summary, it has been shown that the mentoring program had a positive impact on the development, education and satisfaction of students beginning

  7. Working with Research Integrity-Guidance for Research Performing Organisations: The Bonn PRINTEGER Statement.

    Science.gov (United States)

    Forsberg, Ellen-Marie; Anthun, Frank O; Bailey, Sharon; Birchley, Giles; Bout, Henriette; Casonato, Carlo; Fuster, Gloria González; Heinrichs, Bert; Horbach, Serge; Jacobsen, Ingrid Skjæggestad; Janssen, Jacques; Kaiser, Matthias; Lerouge, Inge; van der Meulen, Barend; de Rijcke, Sarah; Saretzki, Thomas; Sutrop, Margit; Tazewell, Marta; Varantola, Krista; Vie, Knut Jørgen; Zwart, Hub; Zöller, Mira

    2018-05-31

    This document presents the Bonn PRINTEGER Consensus Statement: Working with Research Integrity-Guidance for research performing organisations. The aim of the statement is to complement existing instruments by focusing specifically on institutional responsibilities for strengthening integrity. It takes into account the daily challenges and organisational contexts of most researchers. The statement intends to make research integrity challenges recognisable from the work-floor perspective, providing concrete advice on organisational measures to strengthen integrity. The statement, which was concluded February 7th 2018, provides guidance on the following key issues: § 1. Providing information about research integrity § 2. Providing education, training and mentoring § 3. Strengthening a research integrity culture § 4. Facilitating open dialogue § 5. Wise incentive management § 6. Implementing quality assurance procedures § 7. Improving the work environment and work satisfaction § 8. Increasing transparency of misconduct cases § 9. Opening up research § 10. Implementing safe and effective whistle-blowing channels § 11. Protecting the alleged perpetrators § 12. Establishing a research integrity committee and appointing an ombudsperson § 13. Making explicit the applicable standards for research integrity.

  8. White Dwarfs in the UKIRT Infrared Deep Sky Survey Data Release 9

    Science.gov (United States)

    Tremblay, P.-E.; Leggett, S. K.; Lodieu, N.; Freytag, B.; Bergeron, P.; Kalirai, J. S.; Ludwig, H.-G.

    2014-06-01

    We have identified 8 to 10 new cool white dwarfs from the Large Area Survey (LAS) Data Release 9 of the United Kingdom InfraRed Telescope (UKIRT) Infrared Deep Sky Survey (UKIDSS). The data set was paired with the Sloan Digital Sky Survey to obtain proper motions and a broad ugrizYJHK wavelength coverage. Optical spectroscopic observations were secured at Gemini Observatory and confirm the degenerate status for eight of our targets. The final sample includes two additional white dwarf candidates with no spectroscopic observations. We rely on improved one-dimensional model atmospheres and new multi-dimensional simulations with CO5BOLD to review the stellar parameters of the published LAS white dwarf sample along with our additional discoveries. Most of the new objects possess very cool atmospheres with effective temperatures below 5000 K, including two pure-hydrogen remnants with a cooling age between 8.5 and 9.0 Gyr, and tangential velocities in the range 40 km s⁻¹ ≤ v_tan ≤ 60 km s⁻¹. They are likely thick disk 10-11 Gyr old objects. In addition, we find a resolved double degenerate system with v_tan ∼ 155 km s⁻¹ and a cooling age between 3.0 and 5.0 Gyr. These white dwarfs could be disk remnants with a very high velocity or former halo G stars. We also compare the LAS sample with earlier studies of very cool degenerates and observe a similar deficit of helium-dominated atmospheres in the range 5000 < T_eff (K) < 6000. We review the possible explanations for the spectral evolution from helium-dominated toward hydrogen-rich atmospheres at low temperatures.

  9. Supermassive Black Hole Binary Candidates from the Pan-STARRS1 Medium Deep Survey

    Science.gov (United States)

    Liu, Tingting; Gezari, Suvi

    2018-01-01

    Supermassive black hole binaries (SMBHBs) should be a common product of the hierarchical growth of galaxies and are gravitational wave sources at nano-Hz frequencies. We have performed a systematic search in the Pan-STARRS1 Medium Deep Survey for periodically varying quasars, which are predicted manifestations of SMBHBs, and identified 26 candidates that vary periodically on timescales of ~300-1000 days over the 4-year baseline of MDS. We continue to monitor them with the Discovery Channel Telescope and the LCO network telescopes and are thus able to extend the baseline to 3-8 cycles and to reject false positive signals due to stochastic, ordinary quasar variability. From our imaging campaign, five candidates show persistent periodic variability and remain strong SMBHB candidates for follow-up observations. We calculate the cumulative number rate of SMBHBs and compare it with previous work. We also compare the gravitational wave strain amplitudes of the candidates with the capabilities of pulsar timing arrays and discuss the future capability to detect periodic quasars and SMBHB candidates with the Large Synoptic Survey Telescope.
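
    A minimal sketch of a periodicity search on an irregularly sampled quasar light curve using astropy's Lomb-Scargle periodogram (the actual candidate selection in this work relies on its own statistics and significance tests); `t`, `mag` and `mag_err` are hypothetical arrays of observation times, magnitudes and uncertainties, and the period window is an illustrative choice:

        import numpy as np
        from astropy.timeseries import LombScargle

        # t (days), mag, mag_err: placeholder light-curve arrays for one quasar
        ls = LombScargle(t, mag, mag_err)
        frequency, power = ls.autopower(minimum_frequency=1 / 1000.0,
                                        maximum_frequency=1 / 300.0)

        best_period = 1 / frequency[np.argmax(power)]
        fap = ls.false_alarm_probability(power.max())
        print(f"best period ~ {best_period:.0f} d, false-alarm probability = {fap:.3g}")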

  10. DM Considerations for Deep Drilling

    OpenAIRE

    Dubois-Felsmann, Gregory

    2016-01-01

    An outline of the current situation regarding the DM plans for the Deep Drilling surveys and an invitation to the community to provide feedback on what they would like to see included in the data processing and visualization of these surveys.

  11. White dwarfs in the UKIRT infrared deep sky survey data release

    International Nuclear Information System (INIS)

    Tremblay, P.-E.; Kalirai, J. S.; Leggett, S. K.; Lodieu, N.; Freytag, B.; Bergeron, P.; Ludwig, H.-G.

    2014-01-01

    We have identified 8 to 10 new cool white dwarfs from the Large Area Survey (LAS) Data Release 9 of the United Kingdom InfraRed Telescope (UKIRT) Infrared Deep Sky Survey (UKIDSS). The data set was paired with the Sloan Digital Sky Survey to obtain proper motions and a broad ugrizYJHK wavelength coverage. Optical spectroscopic observations were secured at Gemini Observatory and confirm the degenerate status for eight of our targets. The final sample includes two additional white dwarf candidates with no spectroscopic observations. We rely on improved one-dimensional model atmospheres and new multi-dimensional simulations with CO5BOLD to review the stellar parameters of the published LAS white dwarf sample along with our additional discoveries. Most of the new objects possess very cool atmospheres with effective temperatures below 5000 K, including two pure-hydrogen remnants with a cooling age between 8.5 and 9.0 Gyr, and tangential velocities in the range 40 km s⁻¹ ≤ v_tan ≤ 60 km s⁻¹. They are likely thick disk 10-11 Gyr old objects. In addition, we find a resolved double degenerate system with v_tan ∼ 155 km s⁻¹ and a cooling age between 3.0 and 5.0 Gyr. These white dwarfs could be disk remnants with a very high velocity or former halo G stars. We also compare the LAS sample with earlier studies of very cool degenerates and observe a similar deficit of helium-dominated atmospheres in the range 5000 < T_eff (K) < 6000. We review the possible explanations for the spectral evolution from helium-dominated toward hydrogen-rich atmospheres at low temperatures.

  12. The VANDELS ESO spectroscopic survey

    Science.gov (United States)

    McLure, R. J.; Pentericci, L.; Cimatti, A.; Dunlop, J. S.; Elbaz, D.; Fontana, A.; Nandra, K.; Amorin, R.; Bolzonella, M.; Bongiorno, A.; Carnall, A. C.; Castellano, M.; Cirasuolo, M.; Cucciati, O.; Cullen, F.; De Barros, S.; Finkelstein, S. L.; Fontanot, F.; Franzetti, P.; Fumana, M.; Gargiulo, A.; Garilli, B.; Guaita, L.; Hartley, W. G.; Iovino, A.; Jarvis, M. J.; Juneau, S.; Karman, W.; Maccagni, D.; Marchi, F.; Mármol-Queraltó, E.; Pompei, E.; Pozzetti, L.; Scodeggio, M.; Sommariva, V.; Talia, M.; Almaini, O.; Balestra, I.; Bardelli, S.; Bell, E. F.; Bourne, N.; Bowler, R. A. A.; Brusa, M.; Buitrago, F.; Caputi, K. I.; Cassata, P.; Charlot, S.; Citro, A.; Cresci, G.; Cristiani, S.; Curtis-Lake, E.; Dickinson, M.; Fazio, G. G.; Ferguson, H. C.; Fiore, F.; Franco, M.; Fynbo, J. P. U.; Galametz, A.; Georgakakis, A.; Giavalisco, M.; Grazian, A.; Hathi, N. P.; Jung, I.; Kim, S.; Koekemoer, A. M.; Khusanova, Y.; Le Fèvre, O.; Lotz, J. M.; Mannucci, F.; Maltby, D. T.; Matsuoka, K.; McLeod, D. J.; Mendez-Hernandez, H.; Mendez-Abreu, J.; Mignoli, M.; Moresco, M.; Mortlock, A.; Nonino, M.; Pannella, M.; Papovich, C.; Popesso, P.; Rosario, D. P.; Salvato, M.; Santini, P.; Schaerer, D.; Schreiber, C.; Stark, D. P.; Tasca, L. A. M.; Thomas, R.; Treu, T.; Vanzella, E.; Wild, V.; Williams, C. C.; Zamorani, G.; Zucca, E.

    2018-05-01

    VANDELS is a uniquely deep spectroscopic survey of high-redshift galaxies with the VIMOS spectrograph on ESO's Very Large Telescope (VLT). The survey has obtained ultra-deep optical (0.48-1.0 μm) spectroscopy of high-redshift galaxies in support of galaxy evolution studies. Observations use integration times calculated to produce an approximately constant signal-to-noise ratio (20-80 hours on source). Here we describe the survey motivation, survey design and target selection.

  13. Research on deep electromagnetic induction methods (Fy 1985)

    Energy Technology Data Exchange (ETDEWEB)

    Murakami, Hiroshi; Uchida, Toshihiro; Tanaka, Shin'ichi

    1987-06-01

    In FY 1984 the Geological Survey of Japan started research on deep electromagnetic induction methods as part of the research on deep geothermal resource prospecting technology under the Sunshine Project. This article is the report of its second fiscal year. These methods survey the resistivity structure of the deep subsurface using electromagnetic induction techniques; two are under development: the time-domain CSMT method, aimed at surveying to an estimated depth of about 5 km, and the CA method, which estimates the general structure of the earth at depths of 5 km or more. This article reports on the two methods separately. Concerning the former, useful signals were successfully received during the FY 1984 field experiment, and on that basis field experiments in a geothermal area were conducted in FY 1985, verifying its effectiveness. With regard to the latter, following FY 1984, CA observations were conducted in the northern part of the Tohoku Region and the deep resistivity structure over a wide area was surveyed. (43 figs, 1 tab, 11 refs)

  14. Fuel prices scenario generation based on a multivariate GARCH model for risk analysis in a wholesale electricity market

    International Nuclear Information System (INIS)

    Batlle, C.; Barquin, J.

    2004-01-01

    This paper presents a fuel price scenario generator within a simulation tool developed to support risk analysis in a competitive electricity environment. The tool feeds different exogenous risk factors to a wholesale electricity market model to perform a statistical analysis of the results. As the fuel price series studied, such as those for oil or gas, exhibit stochastic volatility and strong correlation among them, a multivariate Generalized Autoregressive Conditional Heteroskedastic (GARCH) model has been designed to allow the generation of future fuel price paths. The model makes use of a decomposition method to simplify the treatment of the multidimensional conditional covariance. An example of its application to real data is also presented. (author)
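
    A simplified sketch of generating correlated fuel-price scenarios from univariate GARCH(1,1) dynamics coupled through a constant correlation matrix (a constant-conditional-correlation shortcut, not the decomposition method used in the paper); all parameter values below are illustrative:

        import numpy as np

        rng = np.random.default_rng(0)

        n_steps, n_scenarios = 250, 1000
        omega = np.array([0.02, 0.03])             # GARCH(1,1) parameters per fuel (illustrative)
        alpha = np.array([0.08, 0.10])
        beta = np.array([0.90, 0.85])
        corr = np.array([[1.0, 0.6], [0.6, 1.0]])  # constant correlation between oil and gas shocks
        L = np.linalg.cholesky(corr)

        returns = np.zeros((n_scenarios, n_steps, 2))
        h = np.tile(omega / (1 - alpha - beta), (n_scenarios, 1))  # start at unconditional variance

        for t in range(n_steps):
            z = rng.standard_normal((n_scenarios, 2)) @ L.T        # correlated standard shocks
            eps = np.sqrt(h) * z
            returns[:, t, :] = eps
            h = omega + alpha * eps**2 + beta * h                  # GARCH variance recursion

        price_paths = 100.0 * np.exp(returns.cumsum(axis=1))       # price scenarios from a base of 100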

  15. Estimating risk of foreign exchange portfolio: Using VaR and CVaR based on GARCH-EVT-Copula model

    Science.gov (United States)

    Wang, Zong-Run; Chen, Xiao-Hong; Jin, Yan-Bo; Zhou, Yan-Ju

    2010-11-01

    This paper introduces the GARCH-EVT-Copula model and applies it to study the risk of a foreign exchange portfolio. Multivariate copulas, including the Gaussian, t and Clayton copulas, were used to describe the portfolio risk structure and to extend the analysis from a bivariate to an n-dimensional asset allocation problem. We apply this methodology to study the returns of a portfolio of four major foreign currencies in China, including USD, EUR, JPY and HKD. Our results suggest that the optimal investment allocations are similar across different copulas and confidence levels. In addition, we find that the optimal investment concentrates on the USD investment. Generally speaking, the t copula and the Clayton copula portray the correlation structure of multiple assets better than the Normal copula.
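
    A compressed sketch of the GARCH-EVT-Copula workflow for two currencies: GARCH filtering, a probability-integral transform of the standardized residuals (an empirical CDF stands in for the EVT tail model), a Gaussian copula in place of the t and Clayton alternatives, and a simulated one-day portfolio VaR. `ret_usd` and `ret_eur` are hypothetical pandas Series of daily log-returns, and every numeric choice is illustrative:

        import numpy as np
        from scipy import stats
        from arch import arch_model

        def garch_filter(ret):
            # fit GARCH(1,1) with Student-t innovations, return result and standardized residuals
            res = arch_model(ret * 100, vol="GARCH", p=1, q=1, dist="t").fit(disp="off")
            return res, res.std_resid.dropna().values

        res_usd, z_usd = garch_filter(ret_usd)
        res_eur, z_eur = garch_filter(ret_eur)
        n = min(len(z_usd), len(z_eur))
        z = np.column_stack([z_usd[-n:], z_eur[-n:]])

        # probability-integral transform (empirical CDF instead of an EVT tail fit)
        u = stats.rankdata(z, axis=0) / (n + 1)

        # Gaussian copula: correlation of the normal scores, then simulation
        rho = np.corrcoef(stats.norm.ppf(u), rowvar=False)
        sim = stats.norm.cdf(np.random.default_rng(1).multivariate_normal([0.0, 0.0], rho, 100_000))

        # map simulated uniforms back to standardized residuals, then to returns
        z_sim = np.column_stack([np.quantile(z[:, i], sim[:, i]) for i in range(2)])
        sigma = np.array([np.sqrt(r.forecast(horizon=1).variance.values[-1, 0]) / 100
                          for r in (res_usd, res_eur)])
        port = 0.5 * z_sim[:, 0] * sigma[0] + 0.5 * z_sim[:, 1] * sigma[1]
        var_99 = -np.quantile(port, 0.01)   # one-day 99% VaR of the equal-weighted portfolio (long)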

  16. Prediction of selected Indian stock using a partitioning–interpolation based ARIMA–GARCH model

    Directory of Open Access Journals (Sweden)

    C. Narendra Babu

    2015-07-01

    Accurate long-term prediction of time series data (TSD) is a very useful research challenge in diversified fields. As financial TSD are highly volatile, multi-step prediction of financial TSD is a major research problem in TSD mining. The two challenges encountered are maintaining high prediction accuracy and preserving the data trend across the forecast horizon. Traditional linear models such as the autoregressive integrated moving average (ARIMA) and generalized autoregressive conditional heteroscedastic (GARCH) models preserve the data trend to some extent, at the cost of prediction accuracy. Non-linear models like ANN maintain prediction accuracy by sacrificing the data trend. In this paper, a linear hybrid model which maintains prediction accuracy while preserving the data trend is proposed. A quantitative reasoning analysis justifying the accuracy of the proposed model is also presented. A moving-average (MA) filter based pre-processing step and a partitioning and interpolation (PI) technique are incorporated into the proposed model. Some existing models and the proposed model are applied to selected NSE India stock market data. Performance results show that for multi-step-ahead prediction, the proposed model outperforms the others in terms of both prediction accuracy and preservation of the data trend.
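
    A baseline sketch of the conventional ARIMA-GARCH combination on which the hybrid model above builds (ARIMA for the conditional mean, GARCH(1,1) on the ARIMA residuals); `series` is a hypothetical pandas Series of closing prices and the model orders are illustrative, not those selected in the paper:

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA
        from arch import arch_model

        # series: daily closing prices (placeholder pandas Series indexed by date)
        log_price = np.log(series)

        arima = ARIMA(log_price, order=(1, 1, 1)).fit()
        resid = arima.resid.dropna()

        garch = arch_model(resid * 100, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")

        # multi-step forecast: ARIMA supplies the mean path, GARCH the conditional variance path
        mean_forecast = arima.forecast(steps=10)
        var_forecast = garch.forecast(horizon=10).variance.iloc[-1] / 100**2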

  17. The Bonn Electron Stretcher Accelerator ELSA: Past and future

    Energy Technology Data Exchange (ETDEWEB)

    Hillert, W. [Universitaet Bonn, Physikalisches Institut, Bonn (Germany)

    2006-05-15

    In 1953, it was decided to build a 500 MeV electron synchrotron in Bonn. It came into operation in 1958, being the first alternating gradient synchrotron in Europe. After five years of performing photoproduction experiments at this accelerator, a larger 2.5 GeV electron synchrotron was built and set into operation in 1967. Both synchrotrons were running for particle physics experiments, until from 1982 to 1987 a third accelerator, the electron stretcher ring ELSA, was constructed and set up in a separate ring tunnel below the physics institute. ELSA came into operation in 1987, using the pulsed 2.5 GeV synchrotron as pre-accelerator. ELSA serves either as storage ring producing synchrotron radiation, or as post-accelerator and pulse stretcher. Applying a slow extraction close to a third integer resonance, external electron beams with energies up to 3.5 GeV and high duty factors are delivered to hadron physics experiments. Various photo- and electroproduction experiments, utilising the experimental set-ups PHOENICS, ELAN, SAPHIR, GDH and Crystal Barrel, have been carried out. During the late 1990s, a pulsed GaAs source of polarised electrons was constructed and set up at the accelerator. ELSA was upgraded in order to accelerate polarised electrons, compensating for depolarising resonances by applying the methods of fast tune jumping and harmonic closed orbit correction. With the experimental investigation of the GDH sum rule, the first experiment requiring a polarised beam and a polarised target was successfully performed at the accelerator. In the near future, the stretcher ring will be further upgraded to increase polarisation and current of the external electron beams. In addition, the aspects of an increase of the maximum energy to 5 GeV using superconducting resonators will be investigated. (orig.)

  18. The Bonn Electron Stretcher Accelerator ELSA: Past and future

    Science.gov (United States)

    Hillert, W.

    2006-05-01

    In 1953, it was decided to build a 500 MeV electron synchrotron in Bonn. It came into operation in 1958, being the first alternating gradient synchrotron in Europe. After five years of performing photoproduction experiments at this accelerator, a larger 2.5 GeV electron synchrotron was built and set into operation in 1967. Both synchrotrons were running for particle physics experiments, until from 1982 to 1987 a third accelerator, the electron stretcher ring ELSA, was constructed and set up in a separate ring tunnel below the physics institute. ELSA came into operation in 1987, using the pulsed 2.5 GeV synchrotron as pre-accelerator. ELSA serves either as storage ring producing synchrotron radiation, or as post-accelerator and pulse stretcher. Applying a slow extraction close to a third integer resonance, external electron beams with energies up to 3.5 GeV and high duty factors are delivered to hadron physics experiments. Various photo- and electroproduction experiments, utilising the experimental set-ups PHOENICS, ELAN, SAPHIR, GDH and Crystal Barrel, have been carried out. During the late 1990s, a pulsed GaAs source of polarised electrons was constructed and set up at the accelerator. ELSA was upgraded in order to accelerate polarised electrons, compensating for depolarising resonances by applying the methods of fast tune jumping and harmonic closed orbit correction. With the experimental investigation of the GDH sum rule, the first experiment requiring a polarised beam and a polarised target was successfully performed at the accelerator. In the near future, the stretcher ring will be further upgraded to increase polarisation and current of the external electron beams. In addition, the aspects of an increase of the maximum energy to 5 GeV using superconducting resonators will be investigated.

  19. The Bonn Electron Stretcher Accelerator ELSA: Past and future

    International Nuclear Information System (INIS)

    Hillert, W.

    2006-01-01

    In 1953, it was decided to build a 500 MeV electron synchrotron in Bonn. It came into operation in 1958, being the first alternating gradient synchrotron in Europe. After five years of performing photoproduction experiments at this accelerator, a larger 2.5 GeV electron synchrotron was built and set into operation in 1967. Both synchrotrons were running for particle physics experiments, until from 1982 to 1987 a third accelerator, the electron stretcher ring ELSA, was constructed and set up in a separate ring tunnel below the physics institute. ELSA came into operation in 1987, using the pulsed 2.5 GeV synchrotron as pre-accelerator. ELSA serves either as storage ring producing synchrotron radiation, or as post-accelerator and pulse stretcher. Applying a slow extraction close to a third integer resonance, external electron beams with energies up to 3.5 GeV and high duty factors are delivered to hadron physics experiments. Various photo- and electroproduction experiments, utilising the experimental set-ups PHOENICS, ELAN, SAPHIR, GDH and Crystal Barrel, have been carried out. During the late 1990s, a pulsed GaAs source of polarised electrons was constructed and set up at the accelerator. ELSA was upgraded in order to accelerate polarised electrons, compensating for depolarising resonances by applying the methods of fast tune jumping and harmonic closed orbit correction. With the experimental investigation of the GDH sum rule, the first experiment requiring a polarised beam and a polarised target was successfully performed at the accelerator. In the near future, the stretcher ring will be further upgraded to increase polarisation and current of the external electron beams. In addition, the aspects of an increase of the maximum energy to 5 GeV using superconducting resonators will be investigated. (orig.)

  20. Value-at-risk estimations of energy commodities via long-memory, asymmetry and fat-tailed GARCH models

    International Nuclear Information System (INIS)

    Aloui, Chaker; Mabrouk, Samir

    2010-01-01

    In this paper, we evaluate the value-at-risk (VaR) and the expected shortfall for some major crude oil and gas commodities for both short and long trading positions. Classical VaR estimations such as RiskMetrics were conducted, as well as extensions accounting for long-range memory, asymmetry and fat tails in energy market volatility. We computed the VaR for three ARCH/GARCH-type models: FIGARCH, FIAPARCH and HYGARCH. These models were estimated under three alternative innovation distributions: normal, Student and skewed Student. Our results show that accounting for long-range memory, fat tails and asymmetry performs better in predicting one-day-ahead VaR for both short and long trading positions. Moreover, the FIAPARCH model outperforms the other models in VaR prediction. These results have several potential implications for energy market risk quantification and hedging strategies. (author)
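
    A minimal sketch of a one-day-ahead parametric VaR from a long-memory GARCH fit, using the FIGARCH volatility model with Student-t innovations from the Python arch package (a simplification of the FIGARCH/FIAPARCH/HYGARCH comparison in the paper); `returns` is a hypothetical pandas Series of daily log-returns in percent:

        import numpy as np
        from arch import arch_model
        from scipy import stats

        # returns: daily log-returns in percent of one energy commodity (placeholder Series)
        am = arch_model(returns, mean="Constant", vol="FIGARCH", p=1, q=1, dist="t")
        res = am.fit(disp="off")

        fcast = res.forecast(horizon=1)
        sigma = np.sqrt(fcast.variance.values[-1, 0])   # one-day-ahead conditional volatility
        mu = res.params["mu"]
        nu = res.params["nu"]

        # 1% and 99% quantiles of the standardized (unit-variance) Student-t innovation
        q_low = stats.t.ppf(0.01, nu) * np.sqrt((nu - 2) / nu)
        q_high = stats.t.ppf(0.99, nu) * np.sqrt((nu - 2) / nu)
        var_long = -(mu + q_low * sigma)    # one-day 99% VaR (percent) for a long position
        var_short = mu + q_high * sigma     # one-day 99% VaR (percent) for a short position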

  1. The VANDELS ESO public spectroscopic survey

    Science.gov (United States)

    McLure, R. J.; Pentericci, L.; Cimatti, A.; Dunlop, J. S.; Elbaz, D.; Fontana, A.; Nandra, K.; Amorin, R.; Bolzonella, M.; Bongiorno, A.; Carnall, A. C.; Castellano, M.; Cirasuolo, M.; Cucciati, O.; Cullen, F.; De Barros, S.; Finkelstein, S. L.; Fontanot, F.; Franzetti, P.; Fumana, M.; Gargiulo, A.; Garilli, B.; Guaita, L.; Hartley, W. G.; Iovino, A.; Jarvis, M. J.; Juneau, S.; Karman, W.; Maccagni, D.; Marchi, F.; Mármol-Queraltó, E.; Pompei, E.; Pozzetti, L.; Scodeggio, M.; Sommariva, V.; Talia, M.; Almaini, O.; Balestra, I.; Bardelli, S.; Bell, E. F.; Bourne, N.; Bowler, R. A. A.; Brusa, M.; Buitrago, F.; Caputi, K. I.; Cassata, P.; Charlot, S.; Citro, A.; Cresci, G.; Cristiani, S.; Curtis-Lake, E.; Dickinson, M.; Fazio, G. G.; Ferguson, H. C.; Fiore, F.; Franco, M.; Fynbo, J. P. U.; Galametz, A.; Georgakakis, A.; Giavalisco, M.; Grazian, A.; Hathi, N. P.; Jung, I.; Kim, S.; Koekemoer, A. M.; Khusanova, Y.; Fèvre, O. Le; Lotz, J. M.; Mannucci, F.; Maltby, D. T.; Matsuoka, K.; McLeod, D. J.; Mendez-Hernandez, H.; Mendez-Abreu, J.; Mignoli, M.; Moresco, M.; Mortlock, A.; Nonino, M.; Pannella, M.; Papovich, C.; Popesso, P.; Rosario, D. P.; Salvato, M.; Santini, P.; Schaerer, D.; Schreiber, C.; Stark, D. P.; Tasca, L. A. M.; Thomas, R.; Treu, T.; Vanzella, E.; Wild, V.; Williams, C. C.; Zamorani, G.; Zucca, E.

    2018-05-01

    VANDELS is a uniquely deep spectroscopic survey of high-redshift galaxies with the VIMOS spectrograph on ESO's Very Large Telescope (VLT). The survey has obtained ultra-deep optical (0.48-1.0 μm) spectroscopy of high-redshift galaxies in support of galaxy evolution studies. Observations use integration times calculated to produce an approximately constant signal-to-noise ratio (20-80 hours on source). Here we describe the survey motivation, survey design and target selection.

  2. Evolution of galaxies clustering in the VIMOS-VLT Deep Survey

    International Nuclear Information System (INIS)

    Meneux, Baptiste

    2005-01-01

    Recent surveys of the Universe have highlighted the presence of structures in the matter distribution, such as filaments and voids. To study the evolution of the galaxy spatial distribution, it is necessary to know accurate galaxy positions in three-dimensional space. This thesis took place within the framework of the VIMOS-VLT Deep Survey (VVDS), whose goal is to measure some 100000 redshifts to study the formation and evolution of galaxies and large-scale structures of the Universe up to z∼5. After reviewing our knowledge of the galaxy distribution and introducing the VVDS, I present the measurement and the evolution of the real-space two-point correlation function from the first-epoch data of the VVDS, the largest sample of 10759 spectra ever acquired up to I_AB = 24. I developed a set of programs, made available to the VVDS consortium, to easily measure the clustering length of galaxies in a given redshift range, with its associated errors, correcting for the effects of the VVDS observing strategy. This tool enabled the measurement of the evolution of the real-space correlation function of the global galaxy population up to z=2. I then extended this study by dividing the full galaxy sample by spectral type and color. Finally, combining the GALEX data with the VVDS allowed me to measure the clustering of an ultraviolet-selected sample of galaxies up to z∼1. This is the first time that such measurements have been carried out over such a long range of cosmic time. The results presented in this thesis thus establish a new reference in the field. (author) [fr]

  3. Photometric redshifts for the next generation of deep radio continuum surveys - II. Gaussian processes and hybrid estimates

    Science.gov (United States)

    Duncan, Kenneth J.; Jarvis, Matt J.; Brown, Michael J. I.; Röttgering, Huub J. A.

    2018-04-01

    Building on the first paper in this series (Duncan et al. 2018), we present a study investigating the performance of Gaussian process photometric redshift (photo-z) estimates for galaxies and active galactic nuclei detected in deep radio continuum surveys. A Gaussian process redshift code is used to produce photo-z estimates targeting specific subsets of both the AGN population - infrared, X-ray and optically selected AGN - and the general galaxy population. The new estimates for the AGN population are found to perform significantly better at z > 1 than the template-based photo-z estimates presented in our previous study. Our new photo-z estimates are then combined with template estimates through hierarchical Bayesian combination to produce a hybrid consensus estimate that outperforms both of the individual methods across all source types. Photo-z estimates for radio sources that are X-ray sources or optical/IR AGN are significantly improved in comparison to previous template-only estimates - with outlier fractions and robust scatter reduced by up to a factor of ∼4. The ability of our method to combine the strengths of the two input photo-z techniques and the large improvements we observe illustrate its potential for enabling future exploitation of deep radio continuum surveys for both the study of galaxy and black hole co-evolution and for cosmological studies.
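
    A toy sketch of a Gaussian process photometric-redshift regressor of the general kind discussed above, using scikit-learn rather than the authors' own code; the feature matrix `mags` (multi-band magnitudes), the spectroscopic training redshifts `z_spec`, the new-source photometry `mags_new` and the kernel choice are hypothetical placeholders:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # mags: (N, n_bands) photometry of sources with known redshifts; z_spec: (N,) (placeholders)
        kernel = 1.0 * RBF(length_scale=np.ones(mags.shape[1])) + WhiteKernel(noise_level=0.01)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(mags, z_spec)

        # predictive mean and uncertainty for new sources (mags_new: placeholder array)
        z_phot, z_phot_err = gp.predict(mags_new, return_std=True)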

  4. Correction of the equilibrium orbit at the Bonn 3.5 GeV electron stretcher facility ELSA

    International Nuclear Information System (INIS)

    Wenzel, J.

    1990-09-01

    A beam position monitor system is being built for the Bonn electron stretcher facility ELSA. Based on the ELSA monitor system the Closed Orbit Correction Program for Interactive Tasks COCPIT has been developed. It enables an online correction of the closed orbit and is fully integrated into the ELSA operating system which allows control of all steering and diagnostic facilities of ELSA. COCPIT implements least square and harmonic correction methods with choosable harmonic components. A statistical analysis shows which correction method is best under given circumstances. Furthermore data about the current status of the monitor system as well as the corrector system enter the correction. Monitor offsets are added to the position measurements and the corrector-dipoles maximum currents are accounted for as constraints. Computer simulations prove the proper work of COCPIT and the validity of the statistical analysis. Possibilities of a future development of COCPIT are shown. (orig.) [de

  5. Four faint T dwarfs from the UKIRT Infrared Deep Sky Survey (UKIDSS) Southern Stripe

    Science.gov (United States)

    Chiu, Kuenley; Liu, Michael C.; Jiang, Linhua; Allers, Katelyn N.; Stark, Daniel P.; Bunker, Andrew; Fan, Xiaohui; Glazebrook, Karl; Dupuy, Trent J.

    2008-03-01

    We present the optical and near-infrared photometry and spectroscopy of four faint T dwarfs newly discovered from the UKIDSS first data release. The sample, drawn from an imaged area of ~136 deg² to a depth of Y = 19.9 (5σ, Vega), is located in the Sloan Digital Sky Survey (SDSS) Southern Equatorial Stripe, a region of significant future deep imaging potential. We detail the selection and followup of these objects, three of which are spectroscopically confirmed brown dwarfs ranging from type T2.5 to T7.5, and one is photometrically identified as early T. Their magnitudes range from Y = 19.01 to 19.88 with derived distances from 34 to 98 pc, making these among the coldest and faintest brown dwarfs known. The T7.5 dwarf appears to be single based on 0.05-arcsec images from Keck laser guide star adaptive optics. The sample brings the total number of T dwarfs found or confirmed by UKIDSS data in this region to nine, and we discuss the projected numbers of dwarfs in the future survey data. We estimate that ~240 early and late T dwarfs are discoverable in the UKIDSS Large Area Survey (LAS) data, falling significantly short of published model projections and suggesting that initial mass functions and/or birth rates may be at the low end of possible models. Thus, deeper optical data have good potential to exploit the UKIDSS survey depth more fully, but may still find the potential Y dwarf sample to be extremely rare.

  6. Deep learning in neural networks: an overview.

    Science.gov (United States)

    Schmidhuber, Jürgen

    2015-01-01

    In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarizes relevant work, much of it from the previous millennium. Shallow and Deep Learners are distinguished by the depth of their credit assignment paths, which are chains of possibly learnable, causal links between actions and effects. I review deep supervised learning (also recapitulating the history of backpropagation), unsupervised learning, reinforcement learning & evolutionary computation, and indirect search for short programs encoding deep and large networks.

  7. Modeling inflation rates and exchange rates in Ghana: application of multivariate GARCH models.

    Science.gov (United States)

    Nortey, Ezekiel Nn; Ngoh, Delali D; Doku-Amponsah, Kwabena; Ofori-Boateng, Kenneth

    2015-01-01

    This paper investigates the volatility and conditional relationships among inflation rates, exchange rates and interest rates, and constructs models using the multivariate GARCH DCC and BEKK specifications with Ghanaian data from January 1990 to December 2013. The study revealed that the cumulative depreciation of the cedi against the US dollar from 1990 to 2013 is 7,010.2% and the yearly weighted depreciation of the cedi against the US dollar for the period is 20.4%. There was evidence that a stable inflation rate does not imply that exchange rates and interest rates will be stable. Rather, when the cedi performs well on the forex market, inflation rates and interest rates react positively and become stable in the long run. The BEKK model is robust for modelling and forecasting the volatility of inflation rates, exchange rates and interest rates. The DCC model is robust for modelling the conditional and unconditional correlations among inflation rates, exchange rates and interest rates. The BEKK model, which forecasted high exchange rate volatility for the year 2014, is very robust for modelling the exchange rates in Ghana. The mean equation of the DCC model is also robust for forecasting inflation rates in Ghana.
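
    A bare-bones sketch of the DCC(1,1) correlation recursion referred to above, applied to GARCH-standardized residuals of two series; `z` is a hypothetical (T, 2) array of standardized residuals and the DCC parameters are fixed at illustrative values rather than estimated by maximum likelihood:

        import numpy as np

        # z: (T, 2) standardized residuals from univariate GARCH fits (placeholder)
        a, b = 0.05, 0.93                      # illustrative DCC parameters, a + b < 1
        S = np.corrcoef(z, rowvar=False)       # unconditional correlation of the residuals
        Q = S.copy()
        dyn_corr = np.empty(len(z))

        for t in range(len(z)):
            d = np.sqrt(np.diag(Q))
            dyn_corr[t] = (Q / np.outer(d, d))[0, 1]           # conditional correlation at time t
            zt = z[t][:, None]
            Q = (1 - a - b) * S + a * (zt @ zt.T) + b * Q      # DCC(1,1) update using today's shock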

  8. A deep redshift survey of field galaxies. Comments on the reality of the Butcher-Oemler effect

    Science.gov (United States)

    Koo, David C.; Kron, Richard G.

    1987-01-01

    A spectroscopic survey of over 400 field galaxies has been completed in three fields for which we have deep UBVI photographic photometry. The galaxies typically range from B=20 to 22 and possess redshifts z from 0.1 to 0.5 that are often quite spiky in distribution. Little, if any, luminosity evolution is observed up to redshifts z ≈ 0.5. By such redshifts, however, an unexpectedly large fraction of luminous galaxies has very blue intrinsic colors that suggest extensive star formation; in contrast, the reddest galaxies still have colors that match those of present-day ellipticals.

  9. Deep Mapping and Spatial Anthropology

    Directory of Open Access Journals (Sweden)

    Les Roberts

    2016-01-01

    This paper provides an introduction to the Humanities Special Issue on “Deep Mapping”. It sets out the rationale for the collection and explores the broad-ranging nature of perspectives and practices that fall within the “undisciplined” interdisciplinary domain of spatial humanities. Sketching a cross-current of ideas that have begun to coalesce around the concept of “deep mapping”, the paper argues that rather than attempting to outline a set of defining characteristics and “deep” cartographic features, a more instructive approach is to pay closer attention to the multivalent ways deep mapping is performatively put to work. Casting a critical and reflexive gaze over the developing discourse of deep mapping, it is argued that what deep mapping “is” cannot be reduced to the otherwise a-spatial and a-temporal fixity of the “deep map”. In this respect, as an undisciplined survey of this increasing expansive field of study and practice, the paper explores the ways in which deep mapping can engage broader discussion around questions of spatial anthropology.

  10. Seismic stability of the survey areas of potential sites for the deep geological repository of the spent nuclear fuel

    Science.gov (United States)

    Kaláb, Zdeněk; Šílený, Jan; Lednická, Markéta

    2017-07-01

    This paper deals with the seismic stability of the survey areas of potential sites for a deep geological repository of spent nuclear fuel in the Czech Republic. The basic source of data for historical earthquakes up to 1990 was the seismic website [1-]. The most intense earthquake described in the historical period occurred on September 15, 1590 in the Niederroesterreich region (Austria); its reported intensity is Io = 8-9. The source of the contemporary seismic data for the period from 1991 to the end of 2014 was the website [11]. Based on the databases and literature review, it may be stated that since 1900 no earthquake exceeding magnitude 5.1 has originated in the territory of the Czech Republic. In order to evaluate seismicity and to assess the impact of seismic effects at the depths of a hypothetical deep geological repository over the next time period, the neo-deterministic method was selected as an extension of the probabilistic method. Each of the seven survey areas was assessed by neo-deterministic evaluation of the seismic wave field excited by selected individual events and by determining the maximum loading. Results of the seismological database studies and of the neo-deterministic analysis of the Čihadlo locality are presented.

  11. Deep-tow magnetic survey above large exhumed mantle domains of the eastern Southwest Indian ridge

    Science.gov (United States)

    Bronner, A.; Munschy, M.; Carlut, J. H.; Searle, R. C.; Sauter, D.; Cannat, M.

    2011-12-01

    The recent discovery of a new type of seafloor, the "smooth seafloor", formed with no or very little volcanic activity along the ultra-slow spreading Southwest Indian ridge (SWIR), shows an unexpected complexity in the processes generating the oceanic lithosphere. There, detachment faulting is thought to be a mechanism for efficient exhumation of deep-seated mantle rocks. We present here a deep-tow geological-geophysical survey over smooth seafloor at the eastern SWIR (62-64°E) combining magnetic data, geology mapping from side-scan sonar images and results from dredge sampling. We introduce a new type of calibration approach for the deep-tow fluxgate magnetometer. We show that the magnetic data can be corrected for the magnetic effect of the vehicle with no recourse to its attitude (pitch, roll and heading), using only the 3 components recorded by the magnetometer and an approximation of the scalar intensity of the Earth's magnetic field. The collected dredge samples as well as the side-scan images confirm the presence of large areas of exhumed mantle-derived peridotites surrounded by a few volcanic constructions. This allows us to hypothesize that the magnetic anomalies are caused by serpentinized peridotites or magmatic intrusions. We show that the magnetic signature of the smooth seafloor is clearly weaker than that of the surrounding volcanic areas. Moreover, the calculated magnetization of a source layer as well as the comparison between deep-tow and sea-surface magnetic data argue for strong East-West variability in the distribution of the magnetized sources. This variability may result from fluid-rock interaction along the detachment faults as well as from the distribution of the volcanic material and thus questions the seafloor spreading origin of the corresponding magnetic anomalies. Finally, we provide magnetic arguments, such as the calculation of block rotation or spreading asymmetry, in order to better constrain the tectonic mechanisms that occur during the formation of this

  12. Deep-tow geophysical survey above large exhumed mantle domains of the eastern Southwest Indian ridge

    Science.gov (United States)

    Bronner, A.; Munschy, M.; Sauter, D.; Carlut, J.; Searle, R.; Cannat, M.

    2012-04-01

    The recent discovery of a new type of seafloor, the "smooth seafloor", formed with no or very little volcanic activity along the easternmost part of the ultra-slow spreading Southwest Indian ridge (SWIR), shows an unexpected complexity in the processes generating the oceanic lithosphere. There, detachment faulting is thought to be a mechanism for efficient exhumation of deep-seated mantle rocks. We present here a deep-tow geological-geophysical survey over smooth seafloor at the eastern SWIR (62-64°E) combining multibeam bathymetric data, magnetic data, geology mapping from sidescan sonar (TOBI) images and results from dredge sampling. We introduce a new type of calibration approach for the deep-tow fluxgate magnetometer. We show that the magnetic data can be corrected for the magnetic effect of the vehicle with no recourse to its attitude (pitch, roll and heading), using only the 3 components recorded by the magnetometer and an approximation of the scalar intensity of the Earth's magnetic field. The collected dredge samples as well as the sidescan sonar images confirm the presence of large areas of exhumed mantle-derived peridotites surrounded by a few volcanic constructions. We investigate the possibility that the magnetic anomalies are caused either by serpentinized peridotites and/or by magmatic intrusions. We show that the magnetic signature of the smooth seafloor is clearly weaker than that of the surrounding volcanic areas. Moreover, the calculated magnetization of a source layer as well as the comparison between deep-tow and sea-surface magnetic data argue for strong East-West variability in the distribution of the magnetized sources. This variability may result from fluid-rock interactions along the detachment faults as well as from the occurrence of small, thin volcanic patches and thus questions the seafloor spreading origin of the corresponding magnetic anomalies. Finally, we provide magnetic arguments, as calculation of block rotation or spreading asymmetry in
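
    A generic sketch of one standard way to remove a vehicle's magnetic signature using only the three measured components and a reference scalar field: a constant offset plus a linear correction matrix is fitted so that the corrected field magnitude matches the reference intensity. This illustrates the general principle, not the calibration procedure developed by the authors; `m` and `F` are hypothetical inputs:

        import numpy as np
        from scipy.optimize import least_squares

        # m: (N, 3) raw fluxgate measurements along the tow track; F: (N,) reference scalar
        # intensity of the Earth's field (e.g. from a global field model) -- both placeholders
        def residuals(p, m, F):
            b = p[:3]                       # constant vehicle offset (hard-iron term)
            A = p[3:].reshape(3, 3)         # linear correction matrix (soft iron / scale factors)
            corrected = (m - b) @ A.T
            return np.linalg.norm(corrected, axis=1) - F

        p0 = np.concatenate([np.zeros(3), np.eye(3).ravel()])
        sol = least_squares(residuals, p0, args=(m, F))
        b_hat, A_hat = sol.x[:3], sol.x[3:].reshape(3, 3)

        # apply the calibration to obtain vehicle-corrected vector data
        m_corrected = (m - b_hat) @ A_hat.T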

  13. The Sloan Digital Sky Survey COADD: 275 deg^2 of deep Sloan Digital Sky Survey imaging on Stripe 82

    International Nuclear Information System (INIS)

    Annis, James; Soares-Santos, Marcelle; Dodelson, Scott; Hao, Jiangang; Jester, Sebastian; Johnston, David E.; Kubo, Jeffrey M.; Lampeitl, Hubert; Lin, Huan; Miknaitis, Gajus; Yanny, Brian; Strauss, Michael A.; Gunn, James E.; Lupton, Robert H.; Becker, Andrew C.; Ivezić, Željko; Fan, Xiaohui; Jiang, Linhua; Seo, Hee-Jong; Simet, Melanie

    2014-01-01

    We present details of the construction and characterization of the coaddition of the Sloan Digital Sky Survey (SDSS) Stripe 82 ugriz imaging data. This survey consists of 275 deg^2 of repeated scanning by the SDSS camera over –50° ≤ α ≤ 60° and –1.25° ≤ δ ≤ +1.25° centered on the Celestial Equator. Each piece of sky has ∼20 runs contributing and thus reaches ∼2 mag fainter than the SDSS single-pass data, i.e., to r ∼ 23.5 for galaxies. We discuss the image processing of the coaddition, the modeling of the point-spread function (PSF), the calibration, and the production of standard SDSS catalogs. The data have an r-band median seeing of 1.1″ and are calibrated to ≤1%. Star color-color, number counts, and PSF size versus modeled size plots show that the modeling of the PSF is good enough for precision five-band photometry. Structure in the PSF model versus magnitude plot indicates minor PSF modeling errors, leading to misclassification of stars as galaxies, as verified using VVDS spectroscopy. There are a variety of uses for this wide-angle deep imaging data, including galactic structure, photometric redshift computation, cluster finding and cross-wavelength measurements, weak lensing cluster mass calibrations, and cosmic shear measurements.
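
    For orientation, the quoted depth gain can be compared with the idealized scaling for stacking N background-limited exposures (a rough estimate, not the paper's own calculation):

      \Delta m = 2.5\,\log_{10}\sqrt{N} = 1.25\,\log_{10} N \approx 1.6\ \mathrm{mag} \quad (N \approx 20)

    The somewhat larger ~2 mag gain reported above is plausible once seeing- and noise-weighted coaddition is taken into account.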

  14. A Global Survey and Interactive Map Suite of Deep Underground Facilities; Examples of Geotechnical and Engineering Capabilities, Achievements, Challenges: (Mines, Shafts, Tunnels, Boreholes, Sites and Underground Facilities for Nuclear Waste and Physics R&D)

    Science.gov (United States)

    Tynan, M. C.; Russell, G. P.; Perry, F.; Kelley, R.; Champenois, S. T.

    2017-12-01

    This global survey presents a synthesis of some notable geotechnical and engineering information reflected in four interactive layer maps covering selected: 1) deep mines and shafts; 2) existing, considered or planned radioactive waste management deep underground studies, sites, or disposal facilities; 3) deep, large-diameter boreholes; and 4) physics underground laboratories and facilities from around the world. These data are intended to facilitate user access to basic information and references regarding deep underground "facilities", history, activities, and plans. In general, the interactive maps and database [http://gis.inl.gov/globalsites/] provide each facility's approximate site location, geology, and engineered features (e.g. access, geometry, depth, diameter, year of operations, groundwater, lithology, host unit name and age, basin, operator, management organization, geographic data, nearby cultural features, other). Although the survey is not all-encompassing, it is a comprehensive review of many of the significant existing and historical underground facilities discussed in the literature addressing radioactive waste management and deep mined geologic disposal safety systems. The global survey is intended to support and inform: 1) interested parties and decision makers; 2) radioactive waste disposal and siting option evaluations; and 3) safety case development, serving as a communication tool applicable to any mined geologic disposal facility and as a demonstration of historical and current engineering and geotechnical capabilities available for use in deep underground facility siting, planning, construction, operations and monitoring.

  15. Strategies for restoration of deep-water coral ecosystems based on a global survey of oil and gas regulations

    Science.gov (United States)

    Cordes, E. E.; Jones, D.; Levin, L. A.

    2016-02-01

    The oil and gas industry is one of the most active agents of the global industrialization of the deep sea. The wide array of impacts following the Deepwater Horizon oil spill highlighted the need for a systematic review of existing regulations both in US waters and internationally. Within different exclusive economic zones, there are a wide variety of regulations regarding the survey of deep-water areas prior to leasing and the acceptable set-back distances from vulnerable marine ecosystems once they are discovered. There are also varying mitigation strategies for accidental release of oil and gas, including active monitoring systems, temporary closings of oil and gas production, and marine protected areas. The majority of these regulations are based on previous studies of typical impacts from oil and gas drilling, rather than accidental releases. However, the probability of an accident from standard operations increases significantly with depth. The Oil & Gas working group of the Deep Ocean Stewardship Initiative is an international partnership of scientists, managers, non-governmental organizations, and industry professionals whose goal is to review existing regulations for the oil & gas industry and produce a best practices document to advise both developed and developing nations on their regulatory structure as energy development moves into deeper waters.

  16. STIMULATION TECHNOLOGIES FOR DEEP WELL COMPLETIONS

    Energy Technology Data Exchange (ETDEWEB)

    Stephen Wolhart

    2003-06-01

    The Department of Energy (DOE) is sponsoring a Deep Trek Program targeted at improving the economics of drilling and completing deep gas wells. Under the DOE program, Pinnacle Technologies is conducting a project to evaluate the stimulation of deep wells. The objective of the project is to assess U.S. deep well drilling & stimulation activity, review rock mechanics & fracture growth in deep, high pressure/temperature wells and evaluate stimulation technology in several key deep plays. Phase 1 was recently completed and consisted of assessing deep gas well drilling activity (1995-2007) and an industry survey on deep gas well stimulation practices by region. Of the 29,000 oil, gas and dry holes drilled in 2002, about 300 were deep wells; 25% were dry, 50% were high temperature/high pressure completions and 25% were simply deep completions. South Texas has about 30% of these wells, Oklahoma 20%, Gulf of Mexico Shelf 15% and the Gulf Coast about 15%. The Rockies represent only 2% of deep drilling. Of the 60 operators who drill deep and HTHP wells, the top 20 drill almost 80% of the wells. Six operators drill half the U.S. deep wells. Deep drilling peaked at 425 wells in 1998 and fell to 250 in 1999. Drilling is expected to rise through 2004 after which drilling should cycle down as overall drilling declines.

  17. The KMOS Deep Survey: Dynamical Measurements of Star-Forming Galaxies at z ≈ 3.5

    Science.gov (United States)

    Turner, Owen; Cirasuolo, Michele; Harrison, Chris; McLure, Ross; Dunlop, James; Swinbank, Mark; Johnson, Helen; Sobral, David; Matthee, Jorryt; Sharples, Ray

    2017-07-01

    This poster presents dynamical measurements from the KMOS (K-band Multi-Object Spectrograph) Deep Survey (KDS), which comprises 78 typical star-forming galaxies at z = 3.5 in the mass range 9.0 isolated. The results suggest that the rotation-dominated galaxies in the sample are offset to lower velocities at fixed stellar mass and have higher velocity dispersions than star-forming galaxies in the local and intermediate redshift universe. Only 1/3 of the galaxies in the sample are dominated by rotation, which hints that random motions are playing an increasingly significant role in supporting the dynamical mass in these systems. When searching for evolution in scaling relations, such as the stellar mass Tully-Fisher relation, it is important to take these random motions into account.

  18. Comprehensive geophysical survey technique in exploration for deep-buried hydrothermal type uranium deposits in Xiangshan volcanic basin, China

    International Nuclear Information System (INIS)

    Ke, D.

    2014-01-01

    According to recent drilling results, uranium mineralization has been found more than 1000 m underground in the Xiangshan volcanic basin, where uranium exploration has been carried out for over 50 years. This paper presents a comprehensive geophysical survey technique, including the audio magnetotelluric (AMT) method and high-resolution ground magnetic and radon surveys, which aims to prospect for deeply buried and concealed uranium deposits in the Xiangshan volcanic basin. Based on research and application, a comprehensive geophysical technique covering data acquisition, processing and interpretation has been established. Concealed rocks and ore-controlling structures buried deeper than 1000 m can be detected using this technique. Moreover, an anti-interference technique for the AMT survey is presented, which can eliminate the interference induced by high-voltage power lines. The AMT results in the Xiangshan volcanic basin show a high-low-high resistivity pattern, which indicates three geological layers. The upper, high-resistivity layer corresponds mainly to porphyroclastic lava; the middle, low-resistivity layer is metamorphic schist or dellenite; and the lower, high-resistivity layer is inferred to be granite. The interface between the middle and lower layers is recognized as the potential zone for the occurrence of uranium deposits. Based on the correlation of resistivity and magnetic anomalies with uranium ore bodies, a model for tracing faults and interfaces between the different rocks and a model for forecasting favourable areas for uranium deposits have been established. Using the forecasting model, some significant sections for uranium deposits were delineated in the west of the Xiangshan volcanic basin. As a result, progress in uranium prospecting has been achieved: high-grade economic uranium ore bodies have been found in several boreholes located in the forecasted zones. (author)

  19. Deepwater Program: Lophelia II, continuing ecological research on deep-sea corals and deep-reef habitats in the Gulf of Mexico

    Science.gov (United States)

    Demopoulos, Amanda W.J.; Ross, Steve W.; Kellogg, Christina A.; Morrison, Cheryl L.; Nizinski, Martha S.; Prouty, Nancy G.; Bourque, Jill R.; Galkiewicz, Julie P.; Gray, Michael A.; Springmann, Marcus J.; Coykendall, D. Katharine; Miller, Andrew; Rhode, Mike; Quattrini, Andrea; Ames, Cheryl L.; Brooke, Sandra D.; McClain Counts, Jennifer; Roark, E. Brendan; Buster, Noreen A.; Phillips, Ryan M.; Frometa, Janessy

    2017-12-11

    The deep sea is a rich environment composed of diverse habitat types. While deep-sea coral habitats have been discovered within each ocean basin, knowledge about the ecology of these habitats and associated inhabitants continues to grow. This report presents information and results from the Lophelia II project that examined deep-sea coral habitats in the Gulf of Mexico. The Lophelia II project focused on Lophelia pertusa habitats along the continental slope, at depths ranging from 300 to 1,000 meters. The chapters are authored by several scientists from the U.S. Geological Survey, National Oceanic and Atmospheric Administration, University of North Carolina Wilmington, and Florida State University who examined the community ecology (from microbes to fishes), deep-sea coral age, growth, and reproduction, and population connectivity of deep-sea corals and inhabitants. Data from these studies are presented in the chapters and appendixes of the report as well as in journal publications. This study was conducted by the Ecosystems Mission Area of the U.S. Geological Survey to meet information needs identified by the Bureau of Ocean Energy Management.

  20. A new method of Debye-Scherrer pattern integration on two-dimensional detectors, demonstrated for the new structure powder diffractometer (SPODI) at the FRM-II in Garching

    CERN Document Server

    Elf, F; Artus, G R J; Roth, S

    2002-01-01

    The expected diffraction patterns of the new powder diffractometer SPODI, currently under construction at the FRM-II in Garching, will be smeared Debye-Scherrer rings as depicted by Monte Carlo (MC) simulations. To overcome this disadvantage, a concept based on the combination of MC simulations and empirical approximation methods is developed to reverse the smearing by deconvolution and then summing up along the rings, including corrections for different arc lengths, resulting in conventional one-dimensional diffraction patterns suitable for Rietveld-refinement programs without further processing. (orig.)
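
    A minimal sketch of the final integration step described above (summing along the rings with a pixel-count normalisation as a crude arc-length correction) is given below; it assumes an ideal, undistorted flat detector and omits the Monte Carlo based deconvolution of the smearing.

      import numpy as np

      def integrate_rings(image, center, n_bins=1000):
          # Bin a 2-D detector image on radius from the beam centre and average
          # within each bin, yielding a conventional 1-D diffraction pattern.
          yy, xx = np.indices(image.shape)
          r = np.hypot(xx - center[0], yy - center[1]).ravel()
          bins = np.linspace(0.0, r.max(), n_bins + 1)
          signal, _ = np.histogram(r, bins=bins, weights=image.ravel())
          npix, _ = np.histogram(r, bins=bins)
          pattern = signal / np.maximum(npix, 1)       # arc-length (pixel count) correction
          radii = 0.5 * (bins[1:] + bins[:-1])
          return radii, pattern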

  1. JAMSTEC E-library of Deep-sea Images (J-EDI) Realizes a Virtual Journey to the Earth's Unexplored Deep Ocean

    Science.gov (United States)

    Sasaki, T.; Azuma, S.; Matsuda, S.; Nagayama, A.; Ogido, M.; Saito, H.; Hanafusa, Y.

    2016-12-01

    The Japan Agency for Marine-Earth Science and Technology (JAMSTEC) archives a large amount of deep-sea research videos and photos obtained by JAMSTEC's research submersibles and camera-equipped vehicles. The web site "JAMSTEC E-library of Deep-sea Images: J-EDI" (http://www.godac.jamstec.go.jp/jedi/e/) has made videos and photos available to the public via the Internet since 2011. Users can search for target videos and photos at J-EDI by keywords, easy-to-understand icons, and dive information, because operating staff classify videos and photos by content, e.g. living organisms and geological environment, and add comments to them. Dive survey data, including videos and photos, are not only valuable academically but also helpful for education and outreach activities. To improve visibility for broader communities, this year we added new 3-dimensional display functions that synchronize various dive survey data with videos. New functions: Users can search for dive survey data on 3D maps with plotted dive points using the WebGL virtual map engine "Cesium". By selecting a dive point, users can watch deep-sea videos and photos and associated environmental data, e.g. water temperature, salinity, and rock and biological sample photos, obtained by the dive survey. Users can browse a dive track visualized in 3D virtual space using a WebGL JavaScript library. By synchronizing this virtual dive track with videos, users can watch deep-sea videos recorded at any point on a dive track. Users can play an animation in which a submersible-shaped polygon automatically traces a 3D virtual dive track while the displays of dive survey data are synchronized with the trace. Users can directly refer to additional information from other JAMSTEC data sites, such as the marine biodiversity database, marine biological sample database, rock sample database, and cruise and dive information database, on each page where a 3D virtual dive track is displayed. A 3D visualization of a dive

  2. Les situations difficiles au travail approche organisationnelle, individuelle, relationnelle, bonnes pratiques, outils d'accompagnement, illustrations cliniques

    CERN Document Server

    Poirot, Matthieu

    2013-01-01

    A difficult situation at work corresponds to an accumulation of factors that exceeds a company's collective capacity for regulation and the psychosocial resources of its people. The result is an imbalance that can put employees' health and the effectiveness of their work at risk. The factors involved may be: - organizational: marginalized teams, generational clashes, chronic loss of activity, strategic upheaval, etc.; - individual: situations of fragility, difficult personalities, professional exhaustion (burnout), etc.; - relational: hyperconflict, harassment, persecution, toxic management, etc. Illustrated with concrete situations, this book makes it possible to analyse difficult situations from these three angles and to assess their risks. It proposes tools for support and conflict resolution. Finally, it presents the good practices to put in place to prevent this type of situation. This book is intended for practitioners of...

  3. The XMM deep survey in the CDF-S. X. X-ray variability of bright sources

    Science.gov (United States)

    Falocco, S.; Paolillo, M.; Comastri, A.; Carrera, F. J.; Ranalli, P.; Iwasawa, K.; Georgantopoulos, I.; Vignali, C.; Gilli, R.

    2017-12-01

    Aims: We aim to study the variability properties of bright hard X-ray selected active galactic nuclei (AGN) in the redshift range between 0.3 and 1.6 detected in the Chandra Deep Field South (XMM-CDFS) by a long (~3 Ms) XMM observation. Methods: Taking advantage of the good count statistics in the XMM-CDFS, we search for flux and spectral variability using hardness ratio (HR) techniques. We also investigate the spectral variability of different spectral components (photon index of the power law, column density of the local absorber, and reflection intensity). The spectra were merged into six epochs (defined as adjacent observations) and into high and low flux states to understand whether the flux transitions are accompanied by spectral changes. Results: The flux variability is significant in all the sources investigated. The HRs in general are not as variable as the fluxes, in line with previous results on deep fields. Only one source displays a variable HR, anti-correlated with the flux (source 337). The spectral analysis in the available epochs confirms the steeper-when-brighter trend consistent with Comptonisation models only in this source, at the 99% confidence level. Finding this trend in one out of seven unabsorbed sources is consistent, within the statistical limits, with the 15% fraction of unabsorbed AGN in previous deep surveys. No significant variability in the column densities, nor in the Compton reflection component, has been detected across the epochs considered. The high and low states display in general different normalisations but consistent spectral properties. Conclusions: X-ray flux fluctuations are ubiquitous in AGN, though in some cases the data quality does not allow for their detection. In general, the significant flux variations are not associated with spectral variability: photon index and column densities are not significantly variable in nine out of the ten AGN over long timescales (from three to six and a half years). Photon index variability is
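
    For reference, a common definition of the hardness ratio used in such analyses (band boundaries differ between studies) is

      HR = \frac{H - S}{H + S}

    where H and S are the net counts in the hard and soft bands, so that HR runs from -1 (soft) to +1 (hard).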

  4. The Canada-France deep fields survey-II: Lyman-break galaxies and galaxy clustering at z ~ 3

    Science.gov (United States)

    Foucaud, S.; McCracken, H. J.; Le Fèvre, O.; Arnouts, S.; Brodwin, M.; Lilly, S. J.; Crampton, D.; Mellier, Y.

    2003-10-01

    We present a large sample of z ~ 3 U-band dropout galaxies extracted from the Canada-France deep fields survey (CFDF). Our catalogue covers an effective area of ~1700 arcmin^2 divided between three large, contiguous fields separated widely on the sky. To I_AB = 24.5, the survey contains 1294 Lyman-break candidates, in agreement with previous measurements by other authors, after appropriate incompleteness corrections have been applied to our data. Based on comparisons with spectroscopic observations and simulations, we estimate that our sample of Lyman-break galaxies is contaminated by stars and interlopers (lower-redshift galaxies) at no more than ~30%. We find that ω(θ) is well fitted by a power law of fixed slope, γ = 1.8, even at small θ. … University of Hawaii, and at the Cerro Tololo Inter-American Observatory and Mayall 4-meter Telescopes, divisions of the National Optical Astronomy Observatories, which are operated by the Association of Universities for Research in Astronomy, Inc. under cooperative agreement with the National Science Foundation.
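
    For context, the power-law form referred to above is conventionally written as (a standard parameterisation, not specific to this paper)

      \omega(\theta) = A_w\,\theta^{\,1-\gamma}, \qquad \xi(r) = \left(\frac{r}{r_0}\right)^{-\gamma}

    so that a fixed slope of γ = 1.8 for the spatial correlation function corresponds to ω(θ) ∝ θ^{-0.8}.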

  5. Modernization of instrumentation and control systems in nuclear power plants. Working materials. Proceedings of a specialists' meeting held in Garching, Germany, 4-7 July 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    The Specialists' Meeting on "Modernization of Instrumentation and Control Systems in Nuclear Power Plants" was organized by the IAEA (jointly by the Division of Nuclear Power and the Division of Nuclear Safety) in co-operation with the Institute for Safety Technology (ISTec) and held in Garching, Germany from 4 to 7 July 1995 (The Meeting Chairman - Dr. W. Bastl). The meeting brought together experts on power plant operation with experts on the application of today's instrumentation and control technology. In this way, a match was made between those knowing the industry needs and requirements and those knowing the potentials of the technology. Refs, figs and tabs.

  6. The Hyper Suprime-Cam SSP Survey: Overview and survey design

    Science.gov (United States)

    Aihara, Hiroaki; Arimoto, Nobuo; Armstrong, Robert; Arnouts, Stéphane; Bahcall, Neta A.; Bickerton, Steven; Bosch, James; Bundy, Kevin; Capak, Peter L.; Chan, James H. H.; Chiba, Masashi; Coupon, Jean; Egami, Eiichi; Enoki, Motohiro; Finet, Francois; Fujimori, Hiroki; Fujimoto, Seiji; Furusawa, Hisanori; Furusawa, Junko; Goto, Tomotsugu; Goulding, Andy; Greco, Johnny P.; Greene, Jenny E.; Gunn, James E.; Hamana, Takashi; Harikane, Yuichi; Hashimoto, Yasuhiro; Hattori, Takashi; Hayashi, Masao; Hayashi, Yusuke; Hełminiak, Krzysztof G.; Higuchi, Ryo; Hikage, Chiaki; Ho, Paul T. P.; Hsieh, Bau-Ching; Huang, Kuiyun; Huang, Song; Ikeda, Hiroyuki; Imanishi, Masatoshi; Inoue, Akio K.; Iwasawa, Kazushi; Iwata, Ikuru; Jaelani, Anton T.; Jian, Hung-Yu; Kamata, Yukiko; Karoji, Hiroshi; Kashikawa, Nobunari; Katayama, Nobuhiko; Kawanomoto, Satoshi; Kayo, Issha; Koda, Jin; Koike, Michitaro; Kojima, Takashi; Komiyama, Yutaka; Konno, Akira; Koshida, Shintaro; Koyama, Yusei; Kusakabe, Haruka; Leauthaud, Alexie; Lee, Chien-Hsiu; Lin, Lihwai; Lin, Yen-Ting; Lupton, Robert H.; Mandelbaum, Rachel; Matsuoka, Yoshiki; Medezinski, Elinor; Mineo, Sogo; Miyama, Shoken; Miyatake, Hironao; Miyazaki, Satoshi; Momose, Rieko; More, Anupreeta; More, Surhud; Moritani, Yuki; Moriya, Takashi J.; Morokuma, Tomoki; Mukae, Shiro; Murata, Ryoma; Murayama, Hitoshi; Nagao, Tohru; Nakata, Fumiaki; Niida, Mana; Niikura, Hiroko; Nishizawa, Atsushi J.; Obuchi, Yoshiyuki; Oguri, Masamune; Oishi, Yukie; Okabe, Nobuhiro; Okamoto, Sakurako; Okura, Yuki; Ono, Yoshiaki; Onodera, Masato; Onoue, Masafusa; Osato, Ken; Ouchi, Masami; Price, Paul A.; Pyo, Tae-Soo; Sako, Masao; Sawicki, Marcin; Shibuya, Takatoshi; Shimasaku, Kazuhiro; Shimono, Atsushi; Shirasaki, Masato; Silverman, John D.; Simet, Melanie; Speagle, Joshua; Spergel, David N.; Strauss, Michael A.; Sugahara, Yuma; Sugiyama, Naoshi; Suto, Yasushi; Suyu, Sherry H.; Suzuki, Nao; Tait, Philip J.; Takada, Masahiro; Takata, Tadafumi; Tamura, Naoyuki; Tanaka, Manobu M.; Tanaka, Masaomi; Tanaka, Masayuki; Tanaka, Yoko; Terai, Tsuyoshi; Terashima, Yuichi; Toba, Yoshiki; Tominaga, Nozomu; Toshikawa, Jun; Turner, Edwin L.; Uchida, Tomohisa; Uchiyama, Hisakazu; Umetsu, Keiichi; Uraguchi, Fumihiro; Urata, Yuji; Usuda, Tomonori; Utsumi, Yousuke; Wang, Shiang-Yu; Wang, Wei-Hao; Wong, Kenneth C.; Yabe, Kiyoto; Yamada, Yoshihiko; Yamanoi, Hitomi; Yasuda, Naoki; Yeh, Sherry; Yonehara, Atsunori; Yuma, Suraphong

    2018-01-01

    Hyper Suprime-Cam (HSC) is a wide-field imaging camera on the prime focus of the 8.2-m Subaru telescope on the summit of Mauna Kea in Hawaii. A team of scientists from Japan, Taiwan, and Princeton University is using HSC to carry out a 300-night multi-band imaging survey of the high-latitude sky. The survey includes three layers: the Wide layer will cover 1400 deg^2 in five broad bands (grizy), with a 5 σ point-source depth of r ≈ 26. The Deep layer covers a total of 26 deg^2 in four fields, going roughly a magnitude fainter, while the UltraDeep layer goes almost a magnitude fainter still in two pointings of HSC (a total of 3.5 deg^2). Here we describe the instrument, the science goals of the survey, and the survey strategy and data processing. This paper serves as an introduction to a special issue of the Publications of the Astronomical Society of Japan, which includes a large number of technical and scientific papers describing results from the early phases of this survey.

  7. The variable sky of deep synoptic surveys

    Energy Technology Data Exchange (ETDEWEB)

    Ridgway, Stephen T.; Matheson, Thomas; Mighell, Kenneth J.; Olsen, Knut A. [National Optical Astronomy Observatory, Tucson, AZ 85725 (United States); Howell, Steve B., E-mail: ridgway@noao.edu [NASA Ames Research Center, P.O. Box 1, M/S 244-30, Moffett Field, CA 94035 (United States)

    2014-11-20

    The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria—a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky—Galactic stars, quasi-stellar objects (QSOs), active galactic nuclei (AGNs), and asteroids. It is found that the Large Synoptic Survey Telescope (LSST) will be capable of discovering ∼10^5 high latitude (|b| > 20°) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20° is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100 per night within less than one year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discovery of AGNs and QSOs are each predicted to begin at ∼3000 per night and decrease by 50 times over four years. Supernovae are expected at ∼1100 per night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at >10^5 per night, and if orbital determination has a 50% success rate per epoch, they will drop below 1000 per night within two years.

  8. The variable sky of deep synoptic surveys

    International Nuclear Information System (INIS)

    Ridgway, Stephen T.; Matheson, Thomas; Mighell, Kenneth J.; Olsen, Knut A.; Howell, Steve B.

    2014-01-01

    The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria—a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky—Galactic stars, quasi-stellar objects (QSOs), active galactic nuclei (AGNs), and asteroids. It is found that the Large Synoptic Survey Telescope (LSST) will be capable of discovering ∼10^5 high latitude (|b| > 20°) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20° is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100 per night within less than one year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discovery of AGNs and QSOs are each predicted to begin at ∼3000 per night and decrease by 50 times over four years. Supernovae are expected at ∼1100 per night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at >10^5 per night, and if orbital determination has a 50% success rate per epoch, they will drop below 1000 per night within two years.

  9. Report on the results of the Sunshine Project - Verification survey for geothermal exploration technology, etc. Summary. Survey of deep geothermal resource; Chinetsu tansa gijutsu tou kensho chosa. Shinbu chinetsu shigen chosa sokatsu seika hokokusho (Yoyaku)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-03-01

    Concerning the development of deep geothermal resources, which are expected to contribute to increasing Japan's future power generation capacity, an investigational study was carried out from FY 1992 to FY 2000 and the results were summarized. The study addressed hydrothermal-convection-type deep geothermal resources with a thermally conductive heating mechanism, of which Kakkonda is typical, and included the drilling of a deep exploration well using existing technology. As a result, new information and knowledge were acquired about the deep thermal structure, reservoir structure and hydrothermal supply structure, and a deep geothermal model was constructed. Based on this model, detailed simulation became possible and a complete picture of the hydrothermal-convection-type deep geothermal resource with a thermally conductive heating mechanism was clarified. In the surface survey, observation of microearthquakes, a high-accuracy MT method, etc. were carried out, making it possible to determine the shape of a new granite body from the surface. Concerning drilling technology, a geologic stratum with a temperature over 500 degrees C was successfully drilled down to a depth of 3,729 m by prolonging bit life during drilling through the introduction of a top drive system, a closed mud cooling device, etc. (NEDO)

  10. A Survey: Time Travel in Deep Learning Space: An Introduction to Deep Learning Models and How Deep Learning Models Evolved from the Initial Ideas

    OpenAIRE

    Wang, Haohan; Raj, Bhiksha

    2015-01-01

    This report shows how deep learning has evolved. It traces back as far as the initial belief in connectionist modelling of the brain, and then looks at its early-stage realization: neural networks. With the background of neural networks, we gradually introduce how the convolutional neural network, as a representative of deep discriminative models, was developed from neural networks, together with many practical techniques that can help in the optimization of neural networks. On t...

  11. THE ALMA SPECTROSCOPIC SURVEY IN THE HUBBLE ULTRA DEEP FIELD: IMPLICATIONS FOR SPECTRAL LINE INTENSITY MAPPING AT MILLIMETER WAVELENGTHS AND CMB SPECTRAL DISTORTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Carilli, C. L.; Walter, F. [National Radio Astronomy Observatory, P.O. Box 0, Socorro, NM 87801 (United States); Chluba, J. [Jodrell Bank Centre for Astrophysics, University of Manchester, Oxford Road, M13 9PL (United Kingdom); Decarli, R. [Max-Planck Institute for Astronomy, D-69117 Heidelberg (Germany); Aravena, M. [Nucleo de Astronomia, Facultad de Ingenieria, Universidad Diego Portales, Av. Ejercito 441, Santiago (Chile); Wagg, J. [Square Kilometre Array Organisation, Lower Withington, Cheshire (United Kingdom); Popping, G. [European Southern Observatory, Karl-Schwarzschild-Strasse 2, D-85748, Garching (Germany); Cortes, P. [Joint ALMA Observatory—ESO, Av. Alonso de Cordova, 3104, Santiago (Chile); Hodge, J. [Leiden Observatory, Leiden University, Niels Bohrweg 2, NL2333 RA Leiden (Netherlands); Weiss, A. [Max-Planck-Institut für Radioastronomie, Auf dem Hügel 69, D-53121 Bonn (Germany); Bertoldi, F. [Argelander Institute for Astronomy, University of Bonn, Auf dem Hügel 71, D-53121 Bonn (Germany); Riechers, D., E-mail: ccarilli@aoc.nrao.edu [Cornell University, 220 Space Sciences Building, Ithaca, NY 14853 (United States)

    2016-12-10

    We present direct estimates of the mean sky brightness temperature in observing bands around 99 and 242 GHz due to line emission from distant galaxies. These values are calculated from the summed line emission observed in a blind, deep survey for spectral line emission from high redshift galaxies using ALMA (the ALMA spectral deep field observations “ASPECS” survey). In the 99 GHz band, the mean brightness will be dominated by rotational transitions of CO from intermediate and high redshift galaxies. In the 242 GHz band, the emission could be a combination of higher order CO lines, and possibly [C ii] 158 μm line emission from very high redshift galaxies (z ∼ 6–7). The mean line surface brightness is a quantity that is relevant to measurements of spectral distortions of the cosmic microwave background, and as a potential tool for studying large-scale structures in the early universe using intensity mapping. While the cosmic volume and the number of detections are admittedly small, this pilot survey provides a direct measure of the mean line surface brightness, independent of conversion factors, excitation, or other galaxy formation model assumptions. The mean surface brightness in the 99 GHz band is T_B = 0.94 ± 0.09 μK. In the 242 GHz band, the mean brightness is T_B = 0.55 ± 0.033 μK. These should be interpreted as lower limits on the average sky signal, since we only include lines detected individually in the blind survey, while in a low resolution intensity mapping experiment, there will also be the summed contribution from lower luminosity galaxies that cannot be detected individually in the current blind survey.

  12. Deep Learning in the Automotive Industry: Applications and Tools

    OpenAIRE

    Luckow, Andre; Cook, Matthew; Ashcraft, Nathan; Weill, Edwin; Djerekarov, Emil; Vorster, Bennie

    2017-01-01

    Deep Learning refers to a set of machine learning techniques that utilize neural networks with many hidden layers for tasks such as image classification, speech recognition, and language understanding. Deep learning has been proven to be very effective in these domains and is pervasively used by many Internet services. In this paper, we describe different automotive use cases for deep learning, in particular in the domain of computer vision. We survey the current state-of-the-art in libraries, ...

  13. Preface: Proceedings of the Colloidal Dispersions in External Fields II Conference (Bonn-Bad Godesberg, 31 March-2 April 2008)

    Science.gov (United States)

    Löwen, H.

    2008-10-01

    This special issue reflects the scientific programme of the International Colloidal Dispersions in External Fields Conference (CODEF II) that took place in Bonn-Bad Godesberg from 31 March-2 April 2008. This is the second conference in a series that started in 2004 when the first CODEF meeting was held. The proceedings of the first CODEF meeting were summarized in a previous special issue (Journal of Physics: Condensed Matter 16 (issue 38)). The present issue represents recent progress in this rapidly developing field. The CODEF meeting series is held in conjunction with the German-Dutch Transregional Collaborative Research Centre SFB TR6 with the title Physics of Colloidal Dispersions in External Fields. Scientists working within this network as well as international invited guest speakers contributed to these meetings. The contributions in this issue are organized according to the type of field applied, namely: bulk (no external field), shear flow, electric field, magnetic and laser-optical field, and confinement. We would like to thank the CODEF II sponsors (Deutsche Forschungsgemeinschaft and MWFZ Mainz) for their financial support. Furthermore, we thank IOP Publishing for their willingness to publish the proceedings of this conference as a special issue. Participants: O Alarcón-Waess (Puebla), M Allen (Coventry), J L Arauz-Lara (San Luis Potosi), L Assoud (Düsseldorf), G K Auernhammer (Mainz), R Backofen (Dresden), M Balbás-Gambra (Munich), J Bammert (Bayreuth), M Baptista (Mainz), J-L Barrat (Lyon), M Bier (Utrecht), K Binder (Mainz), R Blaak (Düsseldorf), V Blickle (Stuttgart), D Block (Kiel), S Böhm (Düsseldorf), V Botan (Mainz), J P Bouchaud (Paris), J Brader (Konstanz), G Brambilla (Montpellier), W J Briels (Enschede), M Brinkmann (Göttingen), C Brunet (Paris), H-J Butt (Mainz), M A Camargo Chaparro (Düsseldorf), R Castañeda Priego (Guanajuato), J J Cerdà Pino (Frankfurt), A Chatterji (Jülich), M Chavez Paez (San Luis Potosi), A Chremos

  14. Erbium-doped fiber lasers as deep-sea hydrophones

    International Nuclear Information System (INIS)

    Bagnoli, P.E.; Beverini, N.; Bouhadef, B.; Castorina, E.; Falchini, E.; Falciai, R.; Flaminio, V.; Maccioni, E.; Morganti, M.; Sorrentino, F.; Stefani, F.; Trono, C.

    2006-01-01

    The present work describes the development of a hydrophone prototype for deep-sea acoustic detection. The basic sensing element is a single-mode erbium-doped fiber laser. The high sensitivity of these sensors makes them particularly suitable for a wide range of deep-sea acoustic applications, including geological and marine mammal surveys and, above all, use as acoustic detectors in underwater telescopes for high-energy neutrinos.

  15. A SYSTEMATIC SEARCH FOR PERIODICALLY VARYING QUASARS IN PAN-STARRS1: AN EXTENDED BASELINE TEST IN MEDIUM DEEP SURVEY FIELD MD09

    Energy Technology Data Exchange (ETDEWEB)

    Liu, T.; Gezari, S. [Department of Astronomy, University of Maryland, College Park, MD 20742 (United States); Burgett, W. [GMTO Corp, 465 N. Halstead St, Suite 250, Pasadena, CA 91107 (United States); Chambers, K.; Hodapp, K.; Huber, M.; Kudritzki, R.-P.; Magnier, E.; Tonry, J.; Wainscoat, R.; Waters, C. [Institute for Astronomy, University of Hawaii at Manoa, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Draper, P.; Metcalfe, N., E-mail: tingting@astro.umd.edu [Department of Physics, University of Durham, South Road, Durham DH1 3LE (United Kingdom)

    2016-12-10

    We present a systematic search for periodically varying quasars and supermassive black hole binary (SMBHB) candidates in the Pan-STARRS1 (PS1) Medium Deep Survey’s MD09 field. From a color-selected sample of 670 quasars extracted from a multi-band deep-stack catalog of point sources, we locally select variable quasars and look for coherent periods with the Lomb–Scargle periodogram. Three candidates from our sample demonstrate strong variability for more than ∼3 cycles, and their PS1 light curves are well fitted to sinusoidal functions. We test the persistence of the candidates’ apparent periodic variations detected during the 4.2 years of the PS1 survey with archival photometric data from the SDSS Stripe 82 survey or new monitoring with the Large Monolithic Imager at the Discovery Channel Telescope. None of the three periodic candidates (including PSO J334.2028+1.4075) remain persistent over the extended baseline of 7–14 years, corresponding to a detection rate of <1 in 670 quasars in a search area of ≈5 deg^2. Even though SMBHBs should be a common product of the hierarchical growth of galaxies, and periodic variability in SMBHBs has been theoretically predicted, a systematic search for such signatures in a large optical survey is strongly limited by its temporal baseline and the “red noise” associated with normal quasar variability. We show that follow-up long-term monitoring (≳5 cycles) is crucial to our search for these systems.
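
    A minimal sketch of a Lomb-Scargle period search on an irregularly sampled light curve, using the astropy implementation, is given below; the toy data, baseline and frequency grid are illustrative and unrelated to the PS1 sample.

      import numpy as np
      from astropy.timeseries import LombScargle

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0.0, 4.2 * 365.25, 120))             # days, ~4.2 yr baseline
      true_period = 540.0                                           # days (toy value)
      y = 20.0 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0.0, 0.1, t.size)
      dy = np.full_like(y, 0.1)

      # search trial periods between 50 and 2000 days
      freq, power = LombScargle(t, y, dy).autopower(minimum_frequency=1 / 2000.0,
                                                    maximum_frequency=1 / 50.0)
      best_period = 1.0 / freq[np.argmax(power)]

    In practice the significance of any peak must still be assessed against the red noise of ordinary quasar variability, which is the main limitation discussed above.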

  16. Distribución condicional de los retornos de la tasa de cambio colombiana: un ejercicio empírico a partir de modelos GARCH univariados.

    Directory of Open Access Journals (Sweden)

    Santiago Gallón Gómez

    2010-05-01

    A set of multivariate GARCH models is estimated and their empirical validity compared through the calculation of the VaR measure, for the daily returns of the nominal exchange rate of the Colombian peso against the US dollar, euro, pound sterling and Japanese yen over the period 1999-2005. The comparison of the estimates of the conditional covariance matrix and the results obtained for the failure rate and the dynamic quantile test of Engle and Manganelli (2004) provide evidence in favour of the constant conditional correlation model.

  17. Survey report of NOAA Ship McArthur II cruises AR-04-04, AR-05-05 and AR-06-03: habitat classification of side scan sonar imagery in support of deep-sea coral/sponge explorations at the Olympic Coast National Marine Sanctuary

    Science.gov (United States)

    Intelmann, Steven S.; Cochrane, Guy R.; Bowlby, C. Edward; Brancato, Mary Sue; Hyland, Jeffrey

    2007-01-01

    Habitat mapping and characterization has been defined as a high-priority management issue for the Olympic Coast National Marine Sanctuary (OCNMS), especially for poorly known deep-sea habitats that may be sensitive to anthropogenic disturbance. As a result, a team of scientists from OCNMS, the National Centers for Coastal Ocean Science (NCCOS), and other partnering institutions initiated a series of surveys to assess the distribution of deep-sea coral/sponge assemblages within the sanctuary and to look for evidence of potential anthropogenic impacts in these critical habitats. Initial results indicated that remotely delineating areas of hard bottom substrate through acoustic sensing could be a useful tool to increase the efficiency and success of subsequent ROV-based surveys of the associated deep-sea fauna. Accordingly, side scan sonar surveys were conducted in May 2004, June 2005, and April 2006 aboard the NOAA Ship McArthur II to: (1) obtain additional imagery of the seafloor for broader habitat-mapping coverage of sanctuary waters, and (2) help delineate suitable deep-sea coral-sponge habitat, in areas of both high and low commercial-fishing activities, to serve as sites for surveying in more detail using an ROV on subsequent cruises. Several regions of the sea floor throughout the OCNMS were surveyed and mosaicked at 1-meter pixel resolution. Imagery from the side scan sonar mapping efforts was integrated with other complementary data from a towed camera sled, ROVs, sediment samples, and bathymetry records to describe geological and biological (where possible) aspects of habitat. Using a hierarchical deep-water marine benthic classification scheme (Greene et al. 1999), we created a preliminary map of various habitat polygon features for use in a geographical information system (GIS). This report provides a description of the mapping and groundtruthing efforts as well as results of the image classification procedure for each of the areas surveyed.

  18. The Leiden/Argentine/Bonn (LAB) Survey of Galactic HI : Final data release of the combined LDS and IAR surveys with improved stray-radiation corrections

    NARCIS (Netherlands)

    Kaberla, P.M.W.; Burton, W.B.; Hartmann, L.; Arnal, E.M.; Bajaja, E.; Morras, R.; Pöppel, W.G.L.

    2005-01-01

    We present the final data release of observations of λ21-cm emission from Galactic neutral hydrogen over the entire sky, merging the Leiden/Dwingeloo Survey (LDS: Hartmann & Burton 1997, Atlas of Galactic Neutral Hydrogen) of the sky north of δ = -30° with the Instituto Argentino de Radioastronomía

  19. Deep learning for galaxy surface brightness profile fitting

    Science.gov (United States)

    Tuccillo, D.; Huertas-Company, M.; Decencière, E.; Velasco-Forero, S.; Domínguez Sánchez, H.; Dimauro, P.

    2018-03-01

    Numerous ongoing and future large area surveys (e.g. Dark Energy Survey, EUCLID, Large Synoptic Survey Telescope, Wide Field Infrared Survey Telescope) will increase by several orders of magnitude the volume of data that can be exploited for galaxy morphology studies. The full potential of these surveys can be unlocked only with the development of automated, fast, and reliable analysis methods. In this paper, we present DeepLeGATo, a new method for 2-D photometric galaxy profile modelling, based on convolutional neural networks. Our code is trained and validated on analytic profiles (HST/CANDELS F160W filter) and it is able to retrieve the full set of parameters of one-component Sérsic models: total magnitude, effective radius, Sérsic index, and axis ratio. We show detailed comparisons between our code and GALFIT. On simulated data, our method is more accurate than GALFIT and ~3000 times faster on GPU (~50 times when running on the same CPU). On real data, DeepLeGATo trained on simulations behaves similarly to GALFIT on isolated galaxies. With a fast domain adaptation step using only 0.1-0.8 per cent of the size of the training set, our code is easily capable of reproducing the results obtained with GALFIT even in crowded regions. DeepLeGATo does not require any human intervention beyond the training step, rendering it much more automated than traditional profiling methods. The development of this method for more complex models (two-component galaxies, variable point spread function, dense sky regions) could constitute a fundamental tool in the era of big data in astronomy.
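
    As an illustration of the kind of model involved, a minimal convolutional network that regresses the four Sérsic parameters from an image stamp is sketched below in PyTorch; the stamp size and layer widths are assumptions, and this is not the DeepLeGATo architecture itself.

      import torch
      import torch.nn as nn

      class SersicCNN(nn.Module):
          # Regress (total magnitude, effective radius, Sersic index, axis ratio)
          def __init__(self, n_out=4):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
              )
              self.head = nn.Sequential(nn.Flatten(), nn.Linear(64, n_out))

          def forward(self, x):
              return self.head(self.features(x))

      model = SersicCNN()
      stamps = torch.randn(8, 1, 64, 64)       # batch of 64x64 pixel cutouts (assumed size)
      params = model(stamps)                   # shape (8, 4)

    Training such a network on simulated profiles, followed by a small domain-adaptation step on real data, mirrors the strategy summarised in the abstract.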

  20. Deep groundwater chemistry

    International Nuclear Information System (INIS)

    Wikberg, P.; Axelsen, K.; Fredlund, F.

    1987-06-01

    Starting in 1977 and continuing to the present, a number of places in Sweden have been investigated in order to collect the necessary geological, hydrogeological and chemical data needed for safety analyses of repositories in deep bedrock systems. Only crystalline rock is considered; in many cases this has been gneisses of sedimentary origin, but granites and gabbros are also represented. Core-drilled holes have been made at nine sites. Up to 15 holes may be core drilled at one site, the deepest down to 1000 m. In addition, a number of boreholes are percussion drilled at each site to depths of about 100 m. When possible, drilling water is taken from percussion-drilled holes. The first objective is to survey the hydraulic conditions. Core-drilled boreholes and sections selected for sampling of deep groundwater are summarized. (orig./HP)

  1. Survey of methods used to determine if a patient has a deep vein thrombosis: An exploratory research report.

    Science.gov (United States)

    Heick, John D; Farris, James W

    2017-09-01

    The use of evidence-based practice (EBP) is encouraged in the physical therapy profession, but integrating evidence into practice can be difficult for clinicians because of lack of time and other constraints. To survey physical therapy clinical instructors and determine the methods they use for screening for deep vein thrombosis (DVT), a type of venous thromboembolism (VTE) in the lower extremities. Exploratory survey. Twelve survey questions written specifically for this study were sent to a convenience sample of clinical instructors associated with seven universities across 43 states. Eight hundred fifty clinical instructors (22.4% response rate) completed the survey. Of those who responded, 80.5% were taught to use Homans sign to screen for a possible DVT in their entry-level education and 67.9% continued to use Homans sign in clinical practice. Regardless of post-graduate education, respondents were more likely to choose Homans sign than a clinical decision rule (CDR) to screen for a suspected DVT. Additionally, nearly two-thirds of respondents failed to correctly identify one or more of the major risk factors for developing a DVT/VTE. The response rate was 22.4% and therefore may not fully represent the population of physical therapy clinical instructors in the United States. Results from this exploratory survey indicated that approximately two-thirds of physical therapy clinical instructors used outdated DVT/VTE screening methods that they were taught in their entry-level education and nearly two-thirds did not identify the major risk factors associated with DVT/VTE. These results suggest that change is necessary in physical therapy education, clinical practice, and continuing professional development to ensure a more evidenced-based identification of DVT and VTE.

  2. HI shells in the Leiden/Argentina/Bonn HI survey

    Czech Academy of Sciences Publication Activity Database

    Ehlerová, Soňa; Palouš, Jan

    2013-01-01

    Roč. 550, February (2013), A23/1-A23/9 ISSN 0004-6361 R&D Projects: GA ČR GAP209/12/1795 Institutional support: RVO:67985815 Keywords : ISM bubbles * ISM general Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 4.479, year: 2013

  3. Optimization of the laser-induced photoemission for the production of polarized electron beams at the 50-keV source of the Bonn accelerator facility ELSA

    International Nuclear Information System (INIS)

    Gowin, M.

    2001-10-01

    Medium-energy experiments requiring circularly polarized photons (produced by bremsstrahlung of longitudinally polarized electrons) have started at the electron stretcher ELSA in Bonn. To fulfill the demands of the experiment (GDH), the laser-induced photoemission of the 50 keV electron source has been optimized. Systematic studies with a titanium-sapphire laser have been performed to optimize the pulse structure of the laser pulse and the emitted spectral width. Using a Be-InGaAs/Be-AlGaAs strained superlattice photocathode, a beam polarization of 80% with a quantum efficiency of 0.4% has been obtained while producing a space-charge-limited 100 mA beam current. (orig.)

  4. Development of Deep-tow Autonomous Cable Seismic (ACS) for Seafloor Massive Sulfides (SMSs) Exploration.

    Science.gov (United States)

    Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hitoshi; Saito, Shutaro; Lee, Sangkyun; Tara, Kenji; Kato, Masafumi; Jamali Hondori, Ehsan; Sumi, Tomonori; Kadoshima, Kazuyuki; Kose, Masami

    2017-04-01

    Within the EEZ of Japan, numerous surveys exploring ocean floor resources have been conducted. The exploration targets are gas hydrates, mineral resources (manganese, cobalt or rare earth elements) and especially seafloor massive sulphide (SMS) deposits. These resources exist in shallow subsurface areas in deep waters (>1500 m). For seismic exploration, very high resolution images are required, and these cannot be effectively obtained with conventional marine seismic techniques. Therefore we have been developing autonomous seismic survey systems which record the data close to the seafloor to preserve high-frequency seismic energy. A very high sampling rate (10 kHz) and highly accurate synchronization between the recording systems and the shot times are necessary. We adopted a Cs-based atomic clock in view of its power consumption. At first, we developed a Vertical Cable Seismic (VCS) system that uses hydrophone arrays moored vertically from the ocean bottom to record close to the target area. This system has been successfully applied to SMS exploration; specifically, it was fixed over known sites to assess the amount of reserves with the resultant 3D volume. Based on the success of VCS, we modified the VCS system for use as a more efficient deep-tow seismic survey system. Although there are other examples of deep-tow seismic systems, signal transmission cables present challenges in deep waters. We use our autonomous recording system to avoid these problems. Combined with a high-frequency piezoelectric source (Sub Bottom Profiler, SBP) that shoots automatically at a constant interval, we achieve high-resolution deep-tow seismic surveying without a data transmission/power cable to the ship. Although the data cannot be monitored in real time, the towing system becomes very simple. We have carried out a survey trial, which showed the system's utility as a high-resolution deep-tow seismic survey system. Furthermore, the frequency ranges of the deep-towed source (SBP) and the surface-towed sparker are 700-2300 Hz and 10-200 Hz

  5. NATURAL GAS RESOURCES IN DEEP SEDIMENTARY BASINS

    Energy Technology Data Exchange (ETDEWEB)

    Thaddeus S. Dyman; Troy Cook; Robert A. Crovelli; Allison A. Henry; Timothy C. Hester; Ronald C. Johnson; Michael D. Lewan; Vito F. Nuccio; James W. Schmoker; Dennis B. Riggin; Christopher J. Schenk

    2002-02-05

    From a geological perspective, deep natural gas resources are generally defined as resources occurring in reservoirs at or below 15,000 feet, whereas ultra-deep gas occurs below 25,000 feet. From an operational point of view, "deep" is often thought of in a relative sense based on the geologic and engineering knowledge of gas (and oil) resources in a particular area. Deep gas can be found in either conventionally-trapped or unconventional basin-center accumulations that are essentially large single fields having spatial dimensions often exceeding those of conventional fields. Exploration for deep conventional and unconventional basin-center natural gas resources deserves special attention because these resources are widespread and occur in diverse geologic environments. In 1995, the U.S. Geological Survey estimated that 939 Tcf of technically recoverable natural gas remained to be discovered or was part of reserve appreciation from known fields in the onshore areas and State waters of the United States. Of this USGS resource, nearly 114 trillion cubic feet (Tcf) of technically-recoverable gas remains to be discovered from deep sedimentary basins. Worldwide estimates of deep gas are also high. The U.S. Geological Survey World Petroleum Assessment 2000 Project recently estimated a world mean undiscovered conventional gas resource outside the U.S. of 844 Tcf below 4.5 km (about 15,000 feet). Less is known about the origins of deep gas than about the origins of gas at shallower depths because fewer wells have been drilled into the deeper portions of many basins. Some of the many factors contributing to the origin of deep gas include the thermal stability of methane, the role of water and non-hydrocarbon gases in natural gas generation, porosity loss with increasing thermal maturity, the kinetics of deep gas generation, thermal cracking of oil to gas, and source rock potential based on thermal maturity and kerogen type. Recent experimental simulations

  6. Application of Deep Learning in Automated Analysis of Molecular Images in Cancer: A Survey

    Science.gov (United States)

    Xue, Yong; Chen, Shihui; Liu, Yong

    2017-01-01

    Molecular imaging enables the visualization and quantitative analysis of alterations of biological processes at the molecular and/or cellular level, which is of great significance for the early detection of cancer. In recent years, deep learning has been widely used in medical imaging analysis, as it overcomes the limitations of visual assessment and traditional machine learning techniques by extracting hierarchical features with powerful representation capability. Research on cancer molecular images using deep learning techniques is also increasing rapidly. Hence, in this paper, we review the applications of deep learning in molecular imaging in terms of tumor lesion segmentation, tumor classification, and survival prediction. We also outline some future directions in which researchers may develop more powerful deep learning models for better performance in the applications in cancer molecular imaging. PMID:29114182

  7. Stochastic Plume Simulations for the Fukushima Accident and the Deep Water Horizon Oil Spill

    Science.gov (United States)

    Coelho, E.; Peggion, G.; Rowley, C.; Hogan, P.

    2012-04-01

    The Fukushima Dai-ichi power plant suffered damage leading to radioactive contamination of coastal waters. Major issues in characterizing the extent of the affected waters were a poor knowledge of the radiation released to the coastal waters and the rather complex coastal dynamics of the region, not deterministically captured by the available prediction systems. Similarly, during the Gulf of Mexico Deep Water Horizon oil platform accident in April 2010, significant amounts of oil and gas were released from the ocean floor. For this case, issues in mapping and predicting the extent of the affected waters in real time were a poor knowledge of the actual amounts of oil reaching the surface and the fact that coastal dynamics over the region were not deterministically captured by the available prediction systems. To assess the ocean regions and times that were most likely affected by these accidents while capturing the above sources of uncertainty, ensembles of the Navy Coastal Ocean Model (NCOM) were configured over the two regions (NE Japan and the Northern Gulf of Mexico). For the Fukushima case, tracers were released in each ensemble member; their locations at each instant provided reference positions of water volumes where the signature of water released from the plant could be found. For the Deep Water Horizon oil spill case, each ensemble member was coupled with a diffusion-advection solution to estimate possible scenarios of oil concentrations, using perturbed estimates of the released amounts as the source terms at the surface. Stochastic plumes were then defined using a Risk Assessment Code (RAC) analysis that assigns a number from 1 to 5 to each grid point, determined by the likelihood of having tracer particles within short ranges (for the Fukushima case), hence defining the high-risk areas and those recommended for monitoring. For the oil spill case, the RAC codes were determined by the likelihood of reaching oil concentrations as defined in the Bonn Agreement

  8. The UV galaxy luminosity function at z = 3-5 from the CFHT Legacy Survey Deep fields

    Science.gov (United States)

    van der Burg, R. F. J.; Hildebrandt, H.; Erben, T.

    2010-11-01

    Aims: We measure and study the evolution of the UV galaxy luminosity function (LF) at z = 3-5 from the largest high-redshift survey to date, the Deep part of the CFHT Legacy Survey. We also give accurate estimates of the SFR density at these redshifts. Methods: We consider ~100 000 Lyman-break galaxies at z ≈ 3.1, 3.8 and 4.8 selected from very deep ugriz images of this data set and estimate their rest-frame 1600 Å luminosity function. Due to the large survey volume, cosmic variance plays a negligible role. Furthermore, we measure the bright end of the LF with unprecedented statistical accuracy. Contamination fractions from stars and low-z galaxy interlopers are estimated from simulations. From these simulations the redshift distributions of the Lyman-break galaxies in the different samples are estimated, and those redshifts are used to choose bands and calculate k-corrections so that the LFs are compared at the same rest-frame wavelength. To correct for incompleteness, we study the detection rate of simulated galaxies injected into the images as a function of magnitude and redshift. We estimate the contribution of several systematic effects in the analysis to test the robustness of our results. Results: We find the bright end of the LF of our u-dropout sample to deviate significantly from a Schechter function. If we modify the function by a recently proposed magnification model, the fit improves. For the first time in an LBG sample, we can measure down to the density regime where magnification affects the shape of the observed LF because of the very bright and rare galaxies we are able to probe with this data set. We find an increase in the normalisation, ϕ*, of the LF by a factor of 2.5 between z ≈ 5 and z ≈ 3. The faint-end slope of the LF does not evolve significantly between z ≈ 5 and z ≈ 3. We do not find a significant evolution of the characteristic magnitude in the studied redshift interval, possibly because of insufficient knowledge of the source

  9. Deep Chandra Survey of the Small Magellanic Cloud. II. Timing Analysis of X-Ray Pulsars

    Energy Technology Data Exchange (ETDEWEB)

    Hong, JaeSub; Antoniou, Vallia; Zezas, Andreas; Drake, Jeremy J.; Plucinsky, Paul P. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Haberl, Frank [Max-Planck-Institut für extraterrestrische Physik, Giessenbach straße, D-85748 Garching (Germany); Sasaki, Manami [Friedrich-Alexander-Universität Erlangen-Nürnberg, Sternwartstrasse 7, 96049 Bamberg (Germany); Laycock, Silas, E-mail: jaesub@head.cfa.harvard.edu [Department of Physics, University of Massachusetts Lowell, MA 01854 (United States)

    2017-09-20

    We report the timing analysis results of X-ray pulsars from a recent deep Chandra survey of the Small Magellanic Cloud (SMC). We analyzed a total exposure of 1.4 Ms from 31 observations over a 1.2 deg² region in the SMC under a Chandra X-ray Visionary Program. Using the Lomb–Scargle and epoch-folding techniques, we detected periodic modulations from 20 pulsars and a new candidate pulsar. The survey also covered 11 other pulsars with no clear sign of periodic modulation. The 0.5–8 keV X-ray luminosity (L_X) of the pulsars ranges from 10³⁴ to 10³⁷ erg s⁻¹ at 60 kpc. All of the Chandra sources with L_X ≳ 4 × 10³⁵ erg s⁻¹ exhibit X-ray pulsations. The X-ray spectra of the SMC pulsars (and high-mass X-ray binaries) are in general harder than those of the SMC field population. All but SXP 8.02 can be fitted by an absorbed power-law model with a photon index of Γ ≲ 1.5. The X-ray spectrum of the known magnetar SXP 8.02 is better fitted with a two-temperature blackbody model. Newly measured pulsation periods of SXP 51.0, SXP 214, and SXP 701 are significantly different from the previous XMM-Newton and RXTE measurements. This survey provides a rich data set for energy-dependent pulse profile modeling. Six pulsars show an almost eclipse-like dip in the pulse profile. Phase-resolved spectral analysis reveals diverse spectral variations during pulsation cycles: e.g., for an absorbed power-law model, some exhibit an (anti-)correlation between absorption and X-ray flux, while others show more intrinsic spectral variation (i.e., changes in photon indices).
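
    As a rough illustration of the period-search techniques named in this record (Lomb–Scargle plus epoch folding), the following Python sketch recovers a period from simulated data; the period, sampling, and modulation amplitude are invented for the example and do not correspond to any SMC pulsar.

        import numpy as np
        from astropy.timeseries import LombScargle

        # Simulated light curve with an assumed 150 s pulse period (illustrative only).
        rng = np.random.default_rng(0)
        true_period = 150.0
        t = np.sort(rng.uniform(0, 50_000, 2000))                 # observation times [s]
        rate = 1.0 + 0.4 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.1, t.size)

        # Lomb-Scargle periodogram over a grid of trial frequencies.
        freq = np.linspace(1 / 1000, 1 / 10, 100_000)
        power = LombScargle(t, rate).power(freq)
        best_period = 1 / freq[np.argmax(power)]
        print(f"best period ~ {best_period:.1f} s")

        # Epoch folding: bin the data by phase at the candidate period to form a pulse profile.
        phase = (t / best_period) % 1.0
        bins = np.linspace(0, 1, 21)
        counts, _ = np.histogram(phase, bins=bins)
        flux, _ = np.histogram(phase, bins=bins, weights=rate)
        print("folded pulse profile:", np.round(flux / np.maximum(counts, 1), 2))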

  10. THE MULTIWAVELENGTH SURVEY BY YALE-CHILE (MUSYC): DEEP MEDIUM-BAND OPTICAL IMAGING AND HIGH-QUALITY 32-BAND PHOTOMETRIC REDSHIFTS IN THE ECDF-S

    International Nuclear Information System (INIS)

    Cardamone, Carolin N.; Van Dokkum, Pieter G.; Urry, C. Megan; Brammer, Gabriel; Taniguchi, Yoshi; Gawiser, Eric; Bond, Nicholas; Taylor, Edward; Damen, Maaike; Treister, Ezequiel; Cobb, Bethany E.; Schawinski, Kevin; Lira, Paulina; Murayama, Takashi; Saito, Tomoki; Sumikawa, Kentaro

    2010-01-01

    We present deep optical 18-medium-band photometry from the Subaru telescope over the ∼30' x 30' Extended Chandra Deep Field-South, as part of the Multiwavelength Survey by Yale-Chile (MUSYC). This field has a wealth of ground- and space-based ancillary data, and contains the GOODS-South field and the Hubble Ultra Deep Field. We combine the Subaru imaging with existing UBVRIzJHK and Spitzer IRAC images to create a uniform catalog. Detecting sources in the MUSYC 'BVR' image we find ∼40,000 galaxies with R AB 3.5. For 0.1 < z < 1.2, we find a 1σ scatter in Δz/(1 + z) of 0.007, similar to results obtained with a similar filter set in the COSMOS field. As a demonstration of the data quality, we show that the red sequence and blue cloud can be cleanly identified in rest-frame color-magnitude diagrams at 0.1 < z < 1.2. We find that ∼20% of the red sequence galaxies show evidence of dust emission at longer rest-frame wavelengths. The reduced images, photometric catalog, and photometric redshifts are provided through the public MUSYC Web site.

  11. Deep Water Coral (HB1402, EK60)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The cruise will survey and collect samples of deep-sea corals and related marine life in the canyons in the northern Gulf of Maine in U.S. and Canadian waters. The...

  12. Acceleration of polarized electrons in the Bonn electron-accelerator facility ELSA; Beschleunigung polarisierter Elektronen in der Bonner Elektronen-Beschleunigeranlage ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, M.

    2001-12-01

    The future medium energy physics program at the electron stretcher accelerator ELSA of Bonn University mainly relies on experiments using polarized electrons in the energy range from 1 to 3.2 GeV. To prevent depolarization during acceleration in the circular accelerators, several depolarizing resonances have to be corrected for. Intrinsic resonances are compensated using two pulsed betatron tune jump quadrupoles. The influence of imperfection resonances is successfully reduced by applying a dynamic closed orbit correction in combination with an empirical harmonic correction on the energy ramp. Both types of resonances and the correction techniques have been studied in detail. The imperfection resonances were used to calibrate the energy of the stretcher ring with high accuracy. A new technique to extract the beam with horizontally oriented polarization was successfully installed. For all energies a polarized electron beam with more than 50% polarization can now be supplied to the experiments at ELSA, which is demonstrated by measurements using a Moeller polarimeter installed in the external beamline. (orig.)

  13. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Ly{alpha} NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, Moire K. M. [Department of Physics, Broida Hall, Mail Code 9530, University of California, Santa Barbara, CA 93106 (United States); Dey, Arjun; Jannuzi, Buell T., E-mail: mkpresco@physics.ucsb.edu [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States)

    2012-04-01

    Giant Lyα nebulae (or Lyα 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.

  14. Acceptance and tolerability of an adjuvanted nH1N1 vaccine in HIV-infected patients in the cologne-bonn cohort

    Directory of Open Access Journals (Sweden)

    Steffens B

    2011-07-01

    Objective: To evaluate the acceptance and tolerability of the nH1N1 2009 vaccine in HIV-positive individuals. Method: 758 patients were included in this prospective study. Different study populations were formed: the Tolerability Study Group consists of HIV-infected patients who visited three outpatient clinics (Cologne, Bonn, Freiburg) during a predefined time period. Patients were offered nH1N1 vaccination. Those accepting were administered a standard dose of AS03-adjuvanted nH1N1 vaccine. Questionnaires to report side effects occurring within 7 days after immunization were handed out. In a substudy conducted during the same time period, acceptance towards immunization was recorded. This Acceptance Study Group consists of all HIV-infected patients visiting the Cologne clinic. They were offered vaccination; in case of refusal, the motivation was recorded. Results: In the Tolerability Study Group, a total of 475 patient diaries returned in the three study centres could be evaluated; 119 of those (25%) reported no side effects. The distribution of symptoms was as follows: pain 285/475 patients (60%), swelling 96 (20%), redness 54 (11%), fever 48/475 (10%), muscle/joint ache 173 (36%), headache 127 (27%), and fatigue 210 (44%). Association of side effects with clinical data was calculated for patients in Cologne and Bonn. Incidence of side effects was significantly associated with CDC stages A and B compared to C, and with a detectable viral load (> 50 copies/mL). No correlation was noted for CD4 cell count, age, gender or ethnicity. In the Acceptance Study Group, 538 HIV-infected patients were offered vaccination; 402 (75%) accepted, while 136 (25%) rejected. Main reasons for rejection were: negative media coverage (35%), indecisiveness with a preference to wait until a later date (23%), influenza not seen as a personal threat (19%), and scepticism towards immunization in general (10%). Conclusion: A total of 622 HIV-infected patients were vaccinated against nH1N1 influenza in

  15. Geological evidence for deep exploration in Xiazhuang uranium orefield and its periphery

    International Nuclear Information System (INIS)

    Feng Zhijun; Huang Hongkun; Zeng Wenwei; Wu Jiguang

    2011-01-01

    This paper first discusses the ore-controlling role of deep structure, the origin of metallogenic matter and fluid, and the relation of diabase to the silicification zone; it then summarizes the achievements of geophysical surveys and drilling, and finally analyses the potential for deep exploration in the Xiazhuang uranium orefield. (authors)

  16. Study on the Geological Structure around KURT Using a Deep Borehole Investigation

    International Nuclear Information System (INIS)

    Park, Kyung Woo; Kim, Kyung Su; Koh, Yong Kwon; Choi, Jong Won

    2010-01-01

    To characterize the geological features of the study area for high-level radioactive waste disposal research, KAERI (Korea Atomic Energy Research Institute) has been performing several geological investigations, such as geophysical surveys and borehole drilling, since 1997. In particular, the KURT (KAERI Underground Research Tunnel) was constructed in 2006 to understand the deep geological environments. Recently, a deep borehole of 500 m depth was drilled at the left research module of the KURT to confirm and validate the geological model. The objective of this research was to identify the geological structures around the KURT using the data obtained from the deep borehole investigation. To achieve this purpose, several geological investigations, such as geophysical and borehole fracture surveys, were carried out simultaneously. As a result, 7 fracture zones were identified in the deep borehole located in the KURT. As one of the important parts of the site characterization of the KURT area, the results will be used to revise the geological model of the study area.

  17. Comment la bonne Ligue sauva la monarchie. 1593 selon Nicolas Lefèvre de Lezeau

    Directory of Open Access Journals (Sweden)

    Fabrice Micallef

    2011-11-01

    Chapter III of the Vie de Marillac by Lefèvre de Lezeau is devoted to the keeper of the seals' past as a member of the Catholic League. Justifying membership of the Catholic party was no easy matter in mid-seventeenth-century France, which largely regarded the Leaguers as violent fanatics or as hypocrites sold to foreign powers, above all Spain. The rehabilitation undertaken by the author consists in arguing that Marillac belonged to a 'good League', represented in particular in the Parlement of Paris. It is this good parliamentary League that, in June 1593 and at Marillac's instigation, is said to have saved the monarchy by issuing the famous arrêt Lemaître, quashing any decision that the Estates General might take to transfer the crown of France to a foreign prince. We have chosen to study this source by asking three questions. (1) Are the events reported by the author credible? The facts appear broadly accurate, but the central point of the demonstration, namely Marillac's essential role, is so far not supported by any other source. (2) What does this document teach us about Lefèvre de Lezeau's practice as a historian? Several clues suggest that the text may originally have been a free-standing piece that was inserted late by the author into his Vie de Marillac. (3) What is the author's writing strategy? The rehabilitation of the good League is made acceptable because it is carried out in the mirror of the royalist party: like the royalist, the good Leaguer is moderate and courageous, has a sense of the state, and is a 'good Frenchman', Gallican and opposed to Spanish ambitions. Behind these consensual elements, however, the author does not question the legitimacy of the League; he even makes it the foremost instrument of Providence for pacifying France. Surreptitiously

  18. Discovering Interacting Binaries with Halpha Surveys

    NARCIS (Netherlands)

    Witham, A.; Knigge, C.; Drew, J.; Groot, P.J.; Greimel, R.; Parker, Q.

    2005-01-01

    A deep (R ~ 19.5) photographic Halpha Survey of the southern Galactic Plane was recently completed using the UK Schmidt Telescope at the AAO. In addition, we have recently started a similar, CCD-based survey of the northern Galactic Plane using the Wide Field Camera on the INT. Both surveys aim to

  19. A deep X-ray spectroscopic survey of the ESO imaging survey fields

    DEFF Research Database (Denmark)

    Nørgaard-Nielsen, Hans Ulrik; Jørgensen, H.E.; Hansen, Lene

    1998-01-01

    The deepest ROSAT surveys have shown that, in the energy range 0.5-2.0 keV, QSOs can account for ~30 per cent of the Diffuse X-ray Background (DXRB), while Narrow Emission Line Galaxies (NELG) and clusters of galaxies account for about 10 per cent each. This spectroscopic X-ray survey will provide a large, statistically complete sample of sources detected at high energies, more than an order of magnitude fainter than obtained by previous missions. The study of these sources will significantly improve our understanding not only of the origin of the DXRB, but will also provide new insight into the evolution of galaxies, clusters of galaxies and AGNs.

  20. Metas de inflação e volatilidade cambial: uma análise da experiência internacional com PAINEL-GARCH Inflation targeting and exchange rate volatility: an analysis of international experience with PANEL-GARCH

    Directory of Open Access Journals (Sweden)

    Marcos Rocha

    2011-08-01

    The adoption of inflation targeting regimes has as its corollary the operation of a floating exchange rate regime. This link leads some analysts to conclude that one of the costs of implementing inflation targeting is increased exchange rate volatility. This paper follows the suggestion of Edwards (2006) that exchange rate volatility under an inflation targeting regime should be evaluated while controlling for the effects of the prevailing exchange rate regime; the analysis should therefore focus on conditional volatility. We analyse conditional volatility by estimating an exponential PANEL-GARCH model for 20 countries that officially adopted inflation targeting. The different exchange rate regimes are included as control variables in the volatility equation, following the exchange rate regime classification proposed by Reinhart and Rogoff (2002). The results contradict many ideas entrenched in part of the literature: for emerging countries, unlike the result for developed countries, the adoption of inflation targeting, ceteris paribus, reduces the conditional volatility of the real exchange rate. Although seemingly puzzling at first, this result follows a logic developed in the paper which ultimately points to credibility problems behind the distinctive volatility behaviour of these countries. In this context, the connection between stability, fear of floating, and specific features of emerging countries, such as the size of exchange rate pass-through to prices, is highlighted.

  1. Forecasting of exported volume for brazilian fruits by time series analysis: an arima/garch approach

    Directory of Open Access Journals (Sweden)

    Abdinardo Moreira Barreto de Oliveira

    2015-06-01

    The aim of this paper is to offer econometric forecasting models for Brazilian exported fruit volumes, with a view to assisting production planning and control, and motivated by the small number of published papers dealing with this issue. To this end, ARIMA/GARCH models were used, allowing for the occurrence of a multiplicative stochastic seasonality in these series. A total of 300 observations of exported net weight (kg) between Jan/1989 and Dec/2013 were collected for the following fruits: pineapple, banana, orange, lemon, apple, papaya, mango, watermelon, melon and grape. The selection criterion was their importance in the exported fruit basket: they represented 97% of the total dollars received and 99% of the total volume sold in 2010, out of a population of about 28 kinds of exported fruits. The results showed not only a 12-month multiplicative seasonality in banana and mango, but also that the fruits fall into two groups: (1) those which are exported continuously, and (2) those which show export peaks. Regarding the quality of the models, they were considered satisfactory for six of the ten fruits analyzed. Regarding volatility, high persistence was observed in the banana and papaya series, pointing to the existence of a structural break in the time series, which could be linked to the economic crises of the last 17 years.
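
    A minimal sketch of the ARIMA/GARCH modelling step described above, on synthetic monthly data; the model orders, the synthetic series, and the choice of the statsmodels and arch Python packages are assumptions made for illustration rather than the authors' exact specification.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX
        from arch import arch_model

        # Synthetic monthly exported volume (kg) with a 12-month seasonal cycle.
        rng = np.random.default_rng(1)
        n = 300
        idx = pd.date_range("1989-01", periods=n, freq="MS")
        log_y = 13.0 + 0.3 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 0.1, n)
        y = pd.Series(np.exp(log_y), index=idx)

        # Seasonal ARIMA for the conditional mean (12-month multiplicative seasonality).
        mean_fit = SARIMAX(np.log(y), order=(1, 1, 1),
                           seasonal_order=(1, 0, 1, 12)).fit(disp=False)

        # GARCH(1,1) on the rescaled mean-model residuals for the conditional variance.
        resid = mean_fit.resid.iloc[13:] * 100       # drop burn-in, rescale for stability
        garch_fit = arch_model(resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")

        print(mean_fit.summary().tables[1])
        print(garch_fit.params)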

  2. Transmission lines and launching systems for ECRH on the garching stellarators W VIIa and W VII-AS

    International Nuclear Information System (INIS)

    Thumm, M.; Janzen, G.; Mueller, G.; Schueller, P.G.; Wilhelm, R.; Erckmann, V.

    1983-01-01

    The transmission lines and launching systems for non-ohmic plasma production and heating by ECR waves at 28 GHz (200 kW, 40 ms) and 70 GHz (200 kW, 100 ms) in the Garching Wendelstein Stellarator W VIIa and at 70 GHz (800 kW, cw) in the future Advanced Stellarator W VII-AS are described. The ECRH systems meet the requirements for neutral gas breakdown (R-wave), heating of a cold plasma (X-mode) and heating of a warm plasma (O-mode) in a combined way. Periodically modulated wall mode converters (sinusoidal m=0 radius modulation, a_0 = 31.75 mm at 28 GHz, a_0 = 13.9 mm at 70 GHz) convert the circular electric TE_0n gyrotron output mode mixture (mainly the TE_02 mode) into a pure TE_01 wave which is used for the long distance transmission in smooth overmoded waveguides (I.D. = 63.5 mm). At the converter inputs the phases between the TE_0n modes are matched by phase shifters. The measured conversion efficiency for characteristic mode mixtures (TE_02/TE_01/TE_03) at 28 GHz is about 98%. For the geometrical and electrical matching of different waveguide diameters, waveguide tapers with approximate Tschebycheff mode-conversion responses are used.

  3. A medium-deep Chandra and Subaru survey of the 13-h XMM/ROSAT deep survey area

    Science.gov (United States)

    McHardy, I. M.; Gunn, K. F.; Newsam, A. M.; Mason, K. O.; Page, M. J.; Takata, T.; Sekiguchi, K.; Sasseen, T.; Cordova, F.; Jones, L. R.; Loaring, N.

    2003-07-01

    We present the results of a Chandra ACIS-I survey of a high-latitude region at 13 h +38° which was earlier observed with ROSAT and which has recently been observed by XMM-Newton for 200 ks. XMM-Newton will provide good-quality X-ray spectra for over 200 sources with fluxes around the knee of the log N/log S, which are responsible for the bulk of the X-ray background. The main aim of the Chandra observations is to provide arcsecond, or better, positions, and hence reliable identifications, for the XMM-Newton sources. The ACIS-I observations were arranged in a mosaic of four 30-ks pointings, covering almost all of the 15-arcmin radius XMM-Newton/ROSAT field. We detect 214 Chandra sources above a Cash likelihood statistic of 25, which approximates to 5σ significance, to a limiting flux of ~1.3 × 10⁻¹⁵ erg cm⁻² s⁻¹ (0.5-7 keV). Optical counterparts are derived from a Subaru SuprimeCam image reaching to R ~ 27. The very large majority of the Chandra sources have an optical counterpart, with the distribution peaking at 23 high L_X/L_opt ratios, implying absorption at moderate redshift. Comparison with the earlier ROSAT survey shows that the accuracy of the ROSAT positions agrees very well with the predictions from simulations by McHardy et al. and that the large majority of the identifications were correct.

  4. Protective Benefits of Deep Tube Wells Against Childhood Diarrhea in Matlab, Bangladesh

    Science.gov (United States)

    Winston, Jennifer Jane; Escamilla, Veronica; Perez-Heydrich, Carolina; Carrel, Margaret; Yunus, Mohammad; Streatfield, Peter Kim

    2013-01-01

    Objectives. We investigated whether deep tube wells installed to provide arsenic-free groundwater in rural Bangladesh have the added benefit of reducing childhood diarrheal disease incidence. Methods. We recorded cases of diarrhea in children younger than 5 years in 142 villages of Matlab, Bangladesh, during monthly community health surveys in 2005 and 2006. We surveyed the location and depth of 12 018 tube wells and integrated these data with diarrhea data and other data in a geographic information system. We fit a longitudinal logistic regression model to measure the relationship between childhood diarrhea and deep tube well use. We controlled for maternal education, family wealth, year, and distance to a deep tube well. Results. Household clusters assumed to be using deep tube wells were 48.7% (95% confidence interval = 27.8%, 63.5%) less likely to have a case of childhood diarrhea than were other household clusters. Conclusions. Increased access to deep tube wells may provide dual benefits to vulnerable populations in Matlab, Bangladesh, by reducing the risk of childhood diarrheal disease and decreasing exposure to naturally occurring arsenic in groundwater. PMID:23409905
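
    The longitudinal logistic regression described here can be sketched as follows; the simulated data, the variable names (deep_well, wealth, mother_edu, dist_km), and the use of a GEE estimator from Python's statsmodels are illustrative assumptions, not the study's actual dataset or code.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Simulated panel: one row per household cluster per survey round.
        rng = np.random.default_rng(2)
        n = 5000
        df = pd.DataFrame({
            "cluster": rng.integers(0, 400, n),
            "deep_well": rng.integers(0, 2, n),
            "wealth": rng.normal(0, 1, n),
            "mother_edu": rng.integers(0, 2, n),
            "dist_km": rng.exponential(1.0, n),
            "year": rng.integers(2005, 2007, n),
        })
        true_logit = -2.0 - 0.65 * df["deep_well"] - 0.2 * df["wealth"]
        df["diarrhea"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

        # Logistic model fitted by GEE with an exchangeable working correlation,
        # accounting for repeated observations within household clusters.
        model = smf.gee(
            "diarrhea ~ deep_well + wealth + mother_edu + dist_km + C(year)",
            groups="cluster", data=df,
            family=sm.families.Binomial(),
            cov_struct=sm.cov_struct.Exchangeable(),
        )
        res = model.fit()
        print(res.summary())
        print("odds ratio for deep-well use:", np.exp(res.params["deep_well"]))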

  5. Pourquoi les filles sont si bonnes en maths et 40 autres histoires sur le cerveau de l’homme

    CERN Document Server

    Cohen, Laurent

    2012-01-01

    Why are girls so good at maths? Why do we have no memories from before the age of two? Why does having two eyes help us see in three dimensions? How does playing video games help one become an aircraft pilot? Why do we wash our hands when we feel ashamed? Is the brain of a right-wing man different from that of a left-wing man? And what does our brain do when we are doing nothing? Laurent Cohen answers all these questions with his characteristic clarity and brilliance, without forgetting humour or neglecting the latest scientific advances. Laurent Cohen is professor of neurology at the Pitié-Salpêtrière hospital (Paris VI). His previous books include L'Homme thermomètre and Pourquoi les chimpanzés ne parlent pas, both of which were great successes.

  6. Deep learning—Accelerating Next Generation Performance Analysis Systems?

    Directory of Open Access Journals (Sweden)

    Heike Brock

    2018-02-01

    Deep neural network architectures show superior performance in recognition and prediction tasks of the image, speech and natural language domains. The success of such multi-layered networks encourages their implementation in further application scenarios, such as the retrieval of relevant motion information for performance enhancement in sports. However, to date deep learning has only seldom been applied to activity recognition problems in the human motion domain. Therefore, its use for sports data analysis might remain abstract to many practitioners. This paper provides a survey on recent works in the field of high-performance motion data and examines relevant technologies for subsequent deployment in real training systems. In particular, it discusses aspects of data acquisition, processing and network modeling. The analysis suggests the advantage of deep neural networks under difficult and noisy data conditions. However, further research is necessary to confirm the benefit of deep learning for next generation performance analysis systems.

  7. Impacts of subsidized renewable electricity generation on spot market prices in Germany: evidence from a Garch model with panel data

    International Nuclear Information System (INIS)

    Pham, Thao; Lemoine, Killian

    2015-01-01

    Electricity generated by renewable energy sources creates a downward pressure on wholesale prices through the so-called 'merit order effect'. This effect tends to lower average power prices and the average market revenue that renewables producers would otherwise have received, making the integration costs of renewables very high at large penetration rates. It is therefore crucial to determine the amplitude of this merit order effect, particularly in the context of the increasing burden of renewable support policies borne by final consumers. Using hourly data for the period 2009-2012 from the German electricity wholesale market in a GARCH model under a panel data framework, we find that the wind and solar power generation injected into the German electricity network during this period induces a decrease of electricity spot prices and a slight increase of their volatility. The model-based results suggest that the merit-order effect created by renewable production ranges from 3.86 to 8.34 euro/MWh, which implies an annual consumers' surplus of 1.89 to 3.92 billion euros. However, this surplus has not been redistributed equally among the different types of electricity consumers. (authors)

  8. DeepPy: Pythonic deep learning

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo

    This technical report introduces DeepPy – a deep learning framework built on top of NumPy with GPU acceleration. DeepPy bridges the gap between high-performance neural networks and the ease of development from Python/NumPy. Users with a background in scientific computing in Python will quickly be able to understand and change the DeepPy codebase as it is mainly implemented using high-level NumPy primitives. Moreover, DeepPy supports complex network architectures by letting the user compose mathematical expressions as directed graphs. The latest version is available at http

  9. COOL WHITE DWARFS IDENTIFIED IN THE SECOND DATA RELEASE OF THE UKIRT INFRARED DEEP SKY SURVEY

    International Nuclear Information System (INIS)

    Lodieu, N.; Leggett, S. K.; Nitta, A.; Bergeron, P.

    2009-01-01

    We have paired the second data release of the Large Area Survey of the UKIRT Infrared Deep Sky Survey with the fifth data release of the Sloan Digital Sky Survey to identify 10 cool white dwarf candidates, from their photometry and astrometry. Of these 10, one was previously known to be a very cool white dwarf. We have obtained optical spectroscopy for seven of the candidates using the GMOS-N spectrograph on Gemini North, and have confirmed all seven as white dwarfs. Our photometry and astrometry indicate that the remaining two objects are also white dwarfs. The model analysis of the photometry and available spectroscopy shows that the seven confirmed new white dwarfs, and the two new likely white dwarfs, have effective temperatures in the range of T_eff = 5400-6600 K. Our analysis of the previously known white dwarf confirms that it is cool, with T_eff = 3800 K. The cooling age for this dwarf is 8.7 Gyr, while that for the nine ∼6000 K white dwarfs is 1.8-3.6 Gyr. We are unable to determine the masses of the white dwarfs from the existing data, and therefore we cannot constrain the total ages of the white dwarfs. The large cooling age for the coolest white dwarf in the sample, combined with its low estimated tangential velocity, suggests that it is an old member of the thin disk, or a member of the thick disk of the Galaxy, with an age of 10-11 Gyr. The warmer white dwarfs appear to have velocities typical of the thick disk or even halo; these may be very old remnants of low-mass stars, or they may be relatively young thin-disk objects with unusually high space motion.

  10. Leachability of bituminized radioactive waste. Literature survey

    International Nuclear Information System (INIS)

    Akimoto, Toshiyuki; Nakayama, Shinichi; Iida, Yoshihisa; Nagano, Tetsushi

    1999-02-01

    Bituminized radioactive waste that will be returned from COGEMA, France is planned to be disposed of in deep geologic repository in Japan. Data on leachability of radionuclides from bituminized waste are required for the performance assessment of the disposal. We made a literature survey on bitumen and bituminized radioactive waste, placing emphasis on leach tests and leach data in terms of geologic disposal. This survey revealed that reliable leach data on transuranium elements and data obtained under reducing conditions that is characteristic to deep underground are lacking. (author). 64 refs

  11. Annual report 1975

    International Nuclear Information System (INIS)

    1976-01-01

    A survey is presented of: a) Research activities carried out during 1975 within the framework of the scientific sections of the Institute, b) the central technical sections and their services, and c) the organizational structure. Enclosed are: 1) Report on the activities of the administrative management, the administration, and general services, 2) a list of all publications and conference reports issued by IPP Garching during 1975, and an annual report by the Institut fuer Plasmaphysik of the University of Stuttgart, an institute financed by IPP Garching. (GG) [de

  12. The SCUBA-2 Cosmology Legacy Survey: the EGS deep field - I. Deep number counts and the redshift distribution of the recovered cosmic infrared background at 450 and 850 μ m

    Science.gov (United States)

    Zavala, J. A.; Aretxaga, I.; Geach, J. E.; Hughes, D. H.; Birkinshaw, M.; Chapin, E.; Chapman, S.; Chen, Chian-Chou; Clements, D. L.; Dunlop, J. S.; Farrah, D.; Ivison, R. J.; Jenness, T.; Michałowski, M. J.; Robson, E. I.; Scott, Douglas; Simpson, J.; Spaans, M.; van der Werf, P.

    2017-01-01

    We present deep observations at 450 and 850 μm in the Extended Groth Strip field taken with the SCUBA-2 camera mounted on the James Clerk Maxwell Telescope as part of the deep SCUBA-2 Cosmology Legacy Survey (S2CLS), achieving a central instrumental depth of σ_450 = 1.2 mJy beam⁻¹ and σ_850 = 0.2 mJy beam⁻¹. We detect 57 sources at 450 μm and 90 at 850 μm with signal-to-noise ratio >3.5 over ~70 arcmin². From these detections, we derive the number counts at flux densities S_450 > 4.0 mJy and S_850 > 0.9 mJy, which represent the deepest number counts at these wavelengths derived using directly extracted sources from only blank-field observations with a single-dish telescope. Our measurements smoothly connect the gap between previous shallower blank-field single-dish observations and deep interferometric ALMA results. We estimate the contribution of our SCUBA-2 detected galaxies to the cosmic infrared background (CIB), as well as the contribution of 24 μm-selected galaxies through a stacking technique, which add a total of 0.26 ± 0.03 and 0.07 ± 0.01 MJy sr⁻¹, at 450 and 850 μm, respectively. These surface brightnesses correspond to 60 ± 20 and 50 ± 20 per cent of the total CIB measurements, where the errors are dominated by those of the total CIB. Using the photometric redshifts of the 24 μm-selected sample and the redshift distributions of the submillimetre galaxies, we find that the redshift distribution of the recovered CIB is different at each wavelength, with a peak at z ~ 1 for 450 μm and at z ~ 2 for 850 μm, consistent with previous observations and theoretical models.

  13. CMU DeepLens: deep learning for automatic image-based galaxy-galaxy strong lens finding

    Science.gov (United States)

    Lanusse, François; Ma, Quanbin; Li, Nan; Collett, Thomas E.; Li, Chun-Liang; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Póczos, Barnabás

    2018-01-01

    Galaxy-scale strong gravitational lensing can not only provide a valuable probe of the dark matter distribution of massive galaxies, but also provide valuable cosmological constraints, either by studying the population of strong lenses or by measuring time delays in lensed quasars. Due to the rarity of galaxy-scale strongly lensed systems, fast and reliable automated lens finding methods will be essential in the era of large surveys such as Large Synoptic Survey Telescope, Euclid and Wide-Field Infrared Survey Telescope. To tackle this challenge, we introduce CMU DeepLens, a new fully automated galaxy-galaxy lens finding method based on deep learning. This supervised machine learning approach does not require any tuning after the training step which only requires realistic image simulations of strongly lensed systems. We train and validate our model on a set of 20 000 LSST-like mock observations including a range of lensed systems of various sizes and signal-to-noise ratios (S/N). We find on our simulated data set that for a rejection rate of non-lenses of 99 per cent, a completeness of 90 per cent can be achieved for lenses with Einstein radii larger than 1.4 arcsec and S/N larger than 20 on individual g-band LSST exposures. Finally, we emphasize the importance of realistically complex simulations for training such machine learning methods by demonstrating that the performance of models of significantly different complexities cannot be distinguished on simpler simulations. We make our code publicly available at https://github.com/McWilliamsCenter/CMUDeepLens.
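
    To make the supervised deep-learning setup concrete, the sketch below defines a deliberately tiny convolutional lens/non-lens classifier in PyTorch and runs one training step on random tensors; the published CMU DeepLens model is a much deeper residual network trained on realistic LSST-like simulations, so this is only an illustration of the overall approach.

        import torch
        import torch.nn as nn

        # A deliberately tiny CNN classifier; input is a single-band postage stamp.
        class TinyLensFinder(nn.Module):
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.classifier = nn.Linear(64, 1)     # logit of P(lens)

            def forward(self, x):
                return self.classifier(self.features(x).flatten(1))

        # One training step on a fake batch of 45x45-pixel cutouts.
        model = TinyLensFinder()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.BCEWithLogitsLoss()

        images = torch.randn(8, 1, 45, 45)                  # simulated postage stamps
        labels = torch.randint(0, 2, (8, 1)).float()        # 1 = lens, 0 = non-lens
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
        print("batch loss:", float(loss))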

  14. Seismic stability of the survey areas of potential sites for the deep geological repository of the spent nuclear fuel

    Directory of Open Access Journals (Sweden)

    Kaláb Zdeněk

    2017-07-01

    This paper deals with the seismic stability of the survey areas of potential sites for the deep geological repository of spent nuclear fuel in the Czech Republic. The basic source of data for historical earthquakes up to 1990 was the seismic website [10]. The most intense earthquake described occurred on September 15, 1590 in the Niederroesterreich region (Austria) in the historical period; its reported intensity is Io = 8-9. The source of the contemporary seismic data for the period from 1991 to the end of 2014 was the website [11]. It may be stated, based on the databases and literature review, that in the period from 1900 no earthquake exceeding magnitude 5.1 originated in the territory of the Czech Republic.

  15. Geological aspects of a deep underground disposal facility in the Czech Republic

    International Nuclear Information System (INIS)

    Skopovy, J.; Woller, F.

    1997-01-01

    The basic requirements for the geological situation at a deep underground radioactive waste disposal site are highlighted, a survey of candidate host sites worldwide is presented, and the situation in the Czech Republic is analyzed. A 'General Project of Geological Activities Related to the Development of a Deep Underground Disposal Site for Radioactive Wastes and Spent Fuel in the Czech Republic' has been developed by the Nuclear Research Institute and approved and financed by the authorities. The Project encompasses the following stages: (i) preliminary study and research; (ii) examination of the seismicity, neotectonics, and geodynamics; (iii) search and critical assessment of archived geological information; (iv) non-destructive survey; and (v) destructive survey. The Project should take about 30 years and its scope will be updated from time to time. (P.A.)

  16. Deep-sea coral research and technology program: Alaska deep-sea coral and sponge initiative final report

    Science.gov (United States)

    Rooper, Chris; Stone, Robert P.; Etnoyer, Peter; Conrath, Christina; Reynolds, Jennifer; Greene, H. Gary; Williams, Branwen; Salgado, Enrique; Morrison, Cheryl L.; Waller, Rhian G.; Demopoulos, Amanda W.J.

    2017-01-01

    Deep-sea coral and sponge ecosystems are widespread throughout most of Alaska's marine waters. In some places, such as the central and western Aleutian Islands, deep-sea coral and sponge resources can be extremely diverse and may rank among the most abundant deep-sea coral and sponge communities in the world. Many different species of fishes and invertebrates are associated with deep-sea coral and sponge communities in Alaska. Because of their biology, these benthic invertebrates are potentially impacted by climate change and ocean acidification. Deep-sea coral and sponge ecosystems are also vulnerable to the effects of commercial fishing activities. Because of the size and scope of Alaska's continental shelf and slope, the vast majority of the area has not been visually surveyed for deep-sea corals and sponges. NOAA's Deep Sea Coral Research and Technology Program (DSCRTP) sponsored a field research program in the Alaska region between 2012–2015, referred to hereafter as the Alaska Initiative. The priorities for Alaska were derived from ongoing data needs and objectives identified by the DSCRTP, the North Pacific Fishery Management Council (NPFMC), and the Essential Fish Habitat-Environmental Impact Statement (EFH-EIS) process. This report presents the results of 15 projects conducted using DSCRTP funds from 2012-2015. Three of the projects conducted as part of the Alaska deep-sea coral and sponge initiative included dedicated at-sea cruises and fieldwork spread across multiple years. These projects were the eastern Gulf of Alaska Primnoa pacifica study, the Aleutian Islands mapping study, and the Gulf of Alaska fish productivity study. In all, there were nine separate research cruises carried out with a total of 109 at-sea days conducting research. The remaining projects either used data and samples collected by the three major fieldwork projects or were piggy-backed onto existing research programs at the Alaska Fisheries Science Center (AFSC).

  17. Stakeholder perspectives on the importance of rare-species research for deep-sea environmental management

    Science.gov (United States)

    Turner, Phillip J.; Campbell, Lisa M.; Van Dover, Cindy L.

    2017-07-01

    The apparent prevalence of rare species (rarity) in the deep sea is a concern for environmental management and conservation of biodiversity. Rare species are often considered at risk of extinction and, in terrestrial and shallow water environments, have been shown to play key roles within an ecosystem. In the deep-sea environment, current research focuses primarily on abundant species and deep-sea stakeholders are questioning the importance of rare species in ecosystem functioning. This study asks whether deep-sea stakeholders (primarily scientists) view rare-species research as a priority in guiding environmental management. Delphi methodology (i.e., an iterative survey approach) was used to understand views about whether or not 'deep-sea scientists should allocate more resources to research on rare species in the deep sea, even if this means less resources might be available for abundant-species research.' Results suggest little consensus regarding the prioritization of resources for rare-species research. From Survey 1 to Survey 3, the average participant response shifted toward a view that rare-species research is not a priority if it comes at a cost to research on abundant species. Participants pointed to the need for a balanced approach and highlighted knowledge gaps about even the most fundamental questions, including whether rare species are truly 'rare' or simply under-sampled. Participants emphasized the lack of basic biological knowledge for rare and abundant species, particularly abundant meio- and microscopic species, as well as uncertainty in the roles rare and abundant species play in ecosystem processes. Approaches that jointly consider the role of rare and abundant species in ecosystem functioning (e.g., biological trait analysis) may help to clarify the extent to which rare species need to be incorporated into deep-sea environment management in order to maintain ecosystem functioning.

  18. The MUSE Hubble Ultra Deep Field Survey. II. Spectroscopic redshifts and comparisons to color selections of high-redshift galaxies

    Science.gov (United States)

    Inami, H.; Bacon, R.; Brinchmann, J.; Richard, J.; Contini, T.; Conseil, S.; Hamer, S.; Akhlaghi, M.; Bouché, N.; Clément, B.; Desprez, G.; Drake, A. B.; Hashimoto, T.; Leclercq, F.; Maseda, M.; Michel-Dansac, L.; Paalvast, M.; Tresse, L.; Ventou, E.; Kollatschny, W.; Boogaard, L. A.; Finley, H.; Marino, R. A.; Schaye, J.; Wisotzki, L.

    2017-11-01

    We have conducted a two-layered spectroscopic survey (1' × 1' ultra deep and 3' × 3' deep regions) in the Hubble Ultra Deep Field (HUDF) with the Multi Unit Spectroscopic Explorer (MUSE). The combination of a large field of view, high sensitivity, and wide wavelength coverage provides an order of magnitude improvement in spectroscopically confirmed redshifts in the HUDF; i.e., 1206 secure spectroscopic redshifts for Hubble Space Telescope (HST) continuum selected objects, which corresponds to 15% of the total (7904). The redshift distribution extends well beyond z > 3 and to HST/F775W magnitudes as faint as ≈ 30 mag (AB, 1σ). In addition, 132 secure redshifts were obtained for sources with no HST counterparts that were discovered in the MUSE data cubes by a blind search for emission-line features. In total, we present 1338 high quality redshifts, which is a factor of eight increase compared with the previously known spectroscopic redshifts in the same field. We assessed redshifts mainly with spectral features such as [O II] at lower redshifts and Lyα at higher redshifts, and used the secure redshifts to test the color selection (dropout) diagrams of high-z galaxies. The selection condition for F336W dropouts successfully captures ≈ 80% of the targeted z ≈ 2.7 galaxies. However, for higher redshift selections (F435W, F606W, and F775W dropouts), the success rates decrease to ≈ 20-40%. We empirically redefine the selection boundaries in an attempt to improve them to ≈ 60%. The revised boundaries allow bluer colors that capture Lyα emitters with high Lyα equivalent widths falling in the broadbands used for the color-color selection. Along with this paper, we release the redshift and line flux catalog. Based on observations made with ESO telescopes at the La Silla Paranal Observatory under program IDs 094.A-0289(B), 095.A-0010(A), 096.A-0045(A) and 096.A-0045(B). MUSE Ultra Deep Field redshift catalogs (Full Table A.1) are available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http
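
    The dropout-style colour selection discussed above amounts to simple cuts in colour-colour space; the sketch below applies a toy u-band/F336W-like dropout cut to random photometry, with numerical boundaries that are placeholders rather than the revised values derived in the paper.

        import numpy as np

        def u_dropout(u, g, r):
            """Toy Lyman-break colour cut for z ~ 3 candidates (placeholder boundaries)."""
            ug = u - g
            gr = g - r
            return (ug > 1.5) & (-1.0 < gr) & (gr < 1.2) & (ug > 1.5 * gr + 0.75)

        # Random magnitudes standing in for a photometric catalog (u, g, r).
        rng = np.random.default_rng(4)
        u, g, r = rng.normal([26.5, 25.0, 24.8], 0.7, size=(1000, 3)).T
        mask = u_dropout(u, g, r)
        print(f"{mask.sum()} of {mask.size} sources pass the toy dropout cut")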

  19. A wide deep infrared look at the Pleiades with UKIDSS: new constraints on the substellar binary fraction and the low-mass initial mass function

    NARCIS (Netherlands)

    Lodieu, N.; Dobbie, P.D.; Deacon, N.R.; Hodgkin, S.T.; Hambly, N.C.; Jameson, R.F.

    2007-01-01

    We present the results of a deep wide-field near-infrared survey of 12 deg2 of the Pleiades conducted as part of the United Kingdom Infrared Telescope (UKIRT) Infrared Deep Sky Survey (UKIDSS) Galactic Cluster Survey (GCS). We have extracted over 340 high-probability proper motion (PM)

  20. Digitally Inspired Thinking: Can Social Media Lead to Deep Learning in Higher Education?

    Science.gov (United States)

    Samuels-Peretz, Debbie; Dvorkin Camiel, Lana; Teeley, Karen; Banerjee, Gouri

    2017-01-01

    In this study, students from a variety of disciplines, who were enrolled in six courses that incorporate the use of social media, were surveyed to evaluate their perception of how the integration of social-media tools supports deep approaches to learning. Students reported that social media supports deep learning both directly and indirectly,…

  1. The Hubble Space Telescope Medium Deep Survey Cluster Sample: Methodology and Data

    Science.gov (United States)

    Ostrander, E. J.; Nichol, R. C.; Ratnatunga, K. U.; Griffiths, R. E.

    1998-12-01

    We present a new, objectively selected, sample of galaxy overdensities detected in the Hubble Space Telescope Medium Deep Survey (MDS). These clusters/groups were found using an automated procedure that involved searching for statistically significant galaxy overdensities. The contrast of the clusters against the field galaxy population is increased when morphological data are used to search around bulge-dominated galaxies. In total, we present 92 overdensities above a probability threshold of 99.5%. We show, via extensive Monte Carlo simulations, that at least 60% of these overdensities are likely to be real clusters and groups and not random line-of-sight superpositions of galaxies. For each overdensity in the MDS cluster sample, we provide a richness and the average of the bulge-to-total ratio of galaxies within each system. This MDS cluster sample potentially contains some of the most distant clusters/groups ever detected, with about 25% of the overdensities having estimated redshifts z > ~0.9. We have made this sample publicly available to facilitate spectroscopic confirmation of these clusters and help more detailed studies of cluster and galaxy evolution. We also report the serendipitous discovery of a new cluster close on the sky to the rich optical cluster Cl 0016+16 at z = 0.546. This new overdensity, HST 001831+16208, may be coincident with both an X-ray source and a radio source. HST 001831+16208 is the third cluster/group discovered near Cl 0016+16 and appears to strengthen the claims of Connolly et al. of superclustering at high redshift.

  2. Turn-of-the-Month Effect on Stocks in LQ45 Index and Various Sectors in the Indonesia Stock Exchange using GARCH (p,q)

    Directory of Open Access Journals (Sweden)

    Galih Pandekar

    2012-01-01

    There are a few types of anomalies that occur in the Indonesia Stock Exchange, for example the monthly effect, day-of-the-week effect, January effect, holiday effect, and turn-of-the-month effect. The existence of these anomalies is in contrast to the efficient market hypothesis, due to a significant difference in returns during certain periods. Using time-series analysis and the GARCH(p,q) method, the existence of the turn-of-the-month effect has been found in the Jakarta Composite Index, sectoral indexes, and stocks in LQ45. The turn-of-the-month effect appears in the last two days and the four preceding days of each month. The January effect does not incite the turn-of-the-month effect. The turn-of-the-month effect appears due to an increasing volume of stocks acquired by investment managers who want their portfolio performance to look better.
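
    A compact sketch of this kind of specification, i.e. a GARCH(1,1) model whose mean equation carries a turn-of-the-month dummy; the simulated returns, the particular dummy window (first four and last two trading days of each month), and the use of the Python arch package are assumptions for illustration and may differ from the paper's exact data and definitions.

        import numpy as np
        import pandas as pd
        from arch.univariate import ARX, GARCH, Normal

        # Simulated daily index returns (in per cent) on a business-day calendar.
        rng = np.random.default_rng(3)
        dates = pd.bdate_range("2005-01-03", periods=2000)
        ret = pd.Series(rng.normal(0.03, 1.0, len(dates)), index=dates)

        # Turn-of-the-month dummy: first four and last two trading days of each month.
        grp = ret.groupby([dates.year, dates.month])
        pos = grp.cumcount()
        size = grp.transform("size")
        tom = ((pos < 4) | (pos >= size - 2)).astype(float)
        x = pd.DataFrame({"tom": tom})

        # AR(1) mean with the dummy as an exogenous regressor and GARCH(1,1) variance.
        am = ARX(ret, x=x, lags=1)
        am.volatility = GARCH(p=1, q=1)
        am.distribution = Normal()
        res = am.fit(disp="off")
        print(res.summary())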

  3. Neural network based satellite tracking for deep space applications

    Science.gov (United States)

    Amoozegar, F.; Ruggier, C.

    2003-01-01

    The objective of this paper is to provide a survey of neural network trends as applied to the tracking of spacecraft in deep space at Ka-band under various weather conditions and to examine the trade-off between tracking accuracy and communication link performance.

  4. Microbial ecology of deep-water mid-Atlantic canyons

    Science.gov (United States)

    Kellogg, Christina A.

    2011-01-01

    The research described in this fact sheet will be conducted from 2012 to 2014 as part of the U.S. Geological Survey's DISCOVRE (DIversity, Systematics, and COnnectivity of Vulnerable Reef Ecosystems) Program. This integrated, multidisciplinary effort will be investigating a variety of topics related to unique and fragile deep-sea ecosystems from the microscopic level to the ecosystem level. One goal is to improve understanding, at the microbiological scale, of the benthic communities (including corals) that reside in and around mid-Atlantic canyon habitats and their associated environments. Specific objectives include identifying and characterizing the microbial associates of deep-sea corals, characterizing the microbial biofilms on hard substrates to better determine their role in engineering the ecosystem, and adding a microbial dimension to benthic community structure and function assessments by characterizing micro-eukaryotes, bacteria, and archaea in deep-sea sediments.

  5. Modelacion del Efecto del Día de la Semana para los Indices Accionarios de Colombia mediante un Modelo STAR GARCH.

    Directory of Open Access Journals (Sweden)

    David Mauricio Rivera Palacio

    2010-05-01

    This paper studies the behaviour of the returns of the three main Colombian stock indexes: the IBB of the Bogotá Stock Exchange, the IBOMED of the Medellín Stock Exchange, and the IGBC of the Colombian Stock Exchange (Bolsa de Valores de Colombia). Using a STAR GARCH model, two extreme states or regimes are identified: in the first, index returns are low in absolute terms and the processes are stationary, while the second corresponds to large losses or gains, where the effects of shocks are permanent. Although the day-of-the-week effect differs across regimes, the results indicate that for all three indexes there is a day-of-the-week effect in the mean, and a day effect in the variance for the Bogotá Stock Exchange and the Colombian Stock Exchange. The results contradict the hypothesis of an informationally efficient stock market.

  6. Análisis de la volatilidad del índice principal del mercado bursátil mexicano, del índice de riesgo país y de la mezcla mexicana de exportación mediante un modelo GARCH trivariado asimétrico || Volatility Analysis of the Core Mexican Stock Market Index, the Country Risk Index, and the Mexican Oil Basket Using an Asymmetric Trivariate GARCH Model

    Directory of Open Access Journals (Sweden)

    Villalba Padilla, Fátima Irina

    2014-12-01

    We jointly parameterize the generalized autoregressive conditional heteroskedasticity that corresponds to the behavior of the variance of three variables: (a) the core Mexican stock market index (IPC), (b) the Emerging Markets Bond Index for Mexico (EMBI) as a country risk indicator, and (c) the Mexican three-crude oil export basket (MEZCLA). The variables are used as trend indicators of stocks, bonds and energy commodities, respectively, with the ultimate goal of forming a diversified portfolio including such assets. This paper presents the empirical results of an asymmetric trivariate econometric GARCH model. The model incorporates the covariance between the variables in order to explain their relationship, and the estimation considers the shocks generated by positive and negative innovations. The study covers the period 2002-2013.

  7. Characterization of majority and minority carrier deep levels in p-type GaN:Mg grown by molecular beam epitaxy using deep level optical spectroscopy

    International Nuclear Information System (INIS)

    Armstrong, A.; Caudill, J.; Ringel, S. A.; Corrion, A.; Poblenz, C.; Mishra, U. K.; Speck, J. S.

    2008-01-01

    Deep level defects in p-type GaN:Mg grown by molecular beam epitaxy were characterized using steady-state photocapacitance and deep level optical spectroscopy (DLOS). Low frequency capacitance measurements were used to alleviate dispersion effects stemming from the deep Mg acceptor. Use of DLOS enabled a quantitative survey of both deep acceptor and deep donor levels, the latter being particularly important due to the limited understanding of minority carrier states for p-type GaN. Simultaneous electron and hole photoemissions resulted in a convoluted deep level spectrum that was decoupled by emphasizing either majority or minority carrier optical emission through control of the thermal filling time conditions. In this manner, DLOS was able to resolve and quantify the properties of deep levels residing near both the conduction and valence bandedges in the same sample. Bandgap states through hole photoemission were observed at E_v + 3.05 eV, E_v + 3.22 eV and E_v + 3.26 eV. Additionally, DLOS revealed levels at E_c - 3.24 eV and E_c - 2.97 eV through electron emission to the conduction band, with the former attributed to the Mg acceptor itself. The detected deep donor concentration is less than 2% of the activated [Mg] and demonstrates the excellent quality of the film

  8. M DWARF ACTIVITY IN THE PAN-STARRS1 MEDIUM-DEEP SURVEY: FIRST CATALOG AND ROTATION PERIODS

    Energy Technology Data Exchange (ETDEWEB)

    Kado-Fong, E. [Department of Physics and Astronomy, Tufts University, Medford, MA 02155 (United States); Williams, P. K. G.; Berger, E. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Mann, A. W. [The University of Texas at Austin, Department of Astronomy, 2515 Speedway C1400, Austin, TX 78712 (United States); Burgett, W. S.; Chambers, K. C.; Huber, M. E.; Kaiser, N.; Kudritzki, R.-P.; Magnier, E. A.; Wainscoat, R. J.; Waters, C. [Institute for Astronomy, University of Hawaii at Manoa, Honolulu, HI 96822 (United States); Rest, A., E-mail: erin.fong@tufts.edu [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States)

    2016-12-20

    We report on an ongoing project to investigate activity in the M dwarf stellar population observed by the Pan-STARRS1 Medium-Deep Survey (PS1-MDS). Using a custom-built pipeline, we refine an initial sample of ∼4 million sources in PS1-MDS to a sample of 184,148 candidate cool stars using color cuts. Motivated by the well-known relationship between rotation and stellar activity, we use a multiband periodogram analysis and visual vetting to identify 270 sources that are likely rotating M dwarfs. We derive a new set of polynomials relating M dwarf PS1 colors to fundamental stellar parameters and use them to estimate the masses, distances, effective temperatures, and bolometric luminosities of our sample. We present a catalog containing these values, our measured rotation periods, and cross-matches to other surveys. Our final sample spans periods of ≲1–130 days in stars with estimated effective temperatures of ∼2700–4000 K. Twenty-two of our sources have X-ray cross-matches, and they are found to be relatively X-ray bright as would be expected from selection effects. Our data set provides evidence that Kepler-based searches have not been sensitive to very slowly rotating stars (P_rot ≳ 70 days), implying that the observed emergence of very slow rotators in studies of low-mass stars may be a systematic effect. We also see a lack of low-amplitude (<2%) variability in objects with intermediate (10–40 day) rotation periods, which, considered in conjunction with other observational results, may be a signpost of a loss of magnetic complexity associated with a phase of rapid spin-down in intermediate-age M dwarfs. This work represents just a first step in exploring stellar variability in data from the PS1-MDS and, in the farther future, the Large Synoptic Survey Telescope.
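    As a hedged, single-band illustration of the periodogram step described above (the PS1-MDS analysis itself uses a multiband periodogram plus visual vetting), the sketch below recovers a rotation period from sparse synthetic photometry with astropy's Lomb-Scargle implementation; all data in it are simulated.

```python
# Minimal single-band illustration of periodogram-based rotation-period
# recovery; epochs, amplitude, and noise are synthetic assumptions.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 400, 300))             # sparse survey epochs (days)
true_period = 25.0                                # days, synthetic spotted star
mag = 0.05 * np.sin(2 * np.pi * t / true_period) + 0.01 * rng.normal(size=t.size)

frequency, power = LombScargle(t, mag).autopower(
    minimum_frequency=1 / 130, maximum_frequency=1 / 1.0)
best_period = 1 / frequency[np.argmax(power)]
print(f"recovered period: {best_period:.1f} d (input {true_period} d)")
```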

  9. A Global Survey of Deep Underground Facilities; Examples of Geotechnical and Engineering Capabilities, Achievements, Challenges (Mines, Shafts, Tunnels, Boreholes, Sites and Underground Facilities for Nuclear Waste and Physics R&D): A Guide to Interactive Global Map Layers, Table Database, References and Notes

    International Nuclear Information System (INIS)

    Tynan, Mark C.; Russell, Glenn P.; Perry, Frank V.; Kelley, Richard E.; Champenois, Sean T.

    2017-01-01

    These associated tables, references, notes, and report present a synthesis of some notable geotechnical and engineering information used to create four interactive layer maps for selected: 1) deep mines and shafts; 2) existing, considered, or planned radioactive waste management deep underground studies or disposal facilities; 3) deep large-diameter boreholes; and 4) physics underground laboratories and facilities from around the world. These data are intended to facilitate user access to basic information and references regarding “deep underground” facilities, history, activities, and plans. In general, the interactive maps and database provide each facility’s approximate site location, geology, and engineered features (e.g., access, geometry, depth, diameter, year of operations, groundwater, lithology, host unit name and age, basin; operator, management organization, geographic data, nearby cultural features, other). Although the survey is not comprehensive, it is representative of many of the significant existing and historical underground facilities discussed in the literature addressing radioactive waste management and deep mined geologic disposal safety systems. The global survey is intended to support and to inform: 1) interested parties and decision makers; 2) radioactive waste disposal and siting option evaluations; and 3) safety case development applicable to any mined geologic disposal facility as a demonstration of historical and current engineering and geotechnical capabilities available for use in deep underground facility siting, planning, construction, operations and monitoring.

  10. A Global Survey of Deep Underground Facilities; Examples of Geotechnical and Engineering Capabilities, Achievements, Challenges (Mines, Shafts, Tunnels, Boreholes, Sites and Underground Facilities for Nuclear Waste and Physics R&D): A Guide to Interactive Global Map Layers, Table Database, References and Notes

    Energy Technology Data Exchange (ETDEWEB)

    Tynan, Mark C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Russell, Glenn P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Perry, Frank V. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kelley, Richard E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Champenois, Sean T. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-13

    These associated tables, references, notes, and report present a synthesis of some notable geotechnical and engineering information used to create four interactive layer maps for selected: 1) deep mines and shafts; 2) existing, considered, or planned radioactive waste management deep underground studies or disposal facilities; 3) deep large-diameter boreholes; and 4) physics underground laboratories and facilities from around the world. These data are intended to facilitate user access to basic information and references regarding “deep underground” facilities, history, activities, and plans. In general, the interactive maps and database provide each facility’s approximate site location, geology, and engineered features (e.g., access, geometry, depth, diameter, year of operations, groundwater, lithology, host unit name and age, basin; operator, management organization, geographic data, nearby cultural features, other). Although the survey is not comprehensive, it is representative of many of the significant existing and historical underground facilities discussed in the literature addressing radioactive waste management and deep mined geologic disposal safety systems. The global survey is intended to support and to inform: 1) interested parties and decision makers; 2) radioactive waste disposal and siting option evaluations; and 3) safety case development applicable to any mined geologic disposal facility as a demonstration of historical and current engineering and geotechnical capabilities available for use in deep underground facility siting, planning, construction, operations and monitoring.

  11. VME online system of the Bonn polarized nucleon targets and polarization measurements on NH3

    International Nuclear Information System (INIS)

    Thiel, W.

    1991-02-01

    The measurement of spin observables is the main purpose of the PHOENICS detector at the Bonn Electron Accelerator ELSA. Therefore a new frozen spin target was built, allowing any spin orientation by means of two perpendicular holding fields and the use of a polarizing field of up to 7 Tesla. With a vertical dilution refrigerator the polarization can be frozen at a temperature of 70 mK. This thesis describes a VME based control and monitor system for the various parts of this target. It mainly consists of a VIP processor together with different kinds of I/O and interface boards. Owing to its modular structure in hardware and software it can easily be set up to control and monitor different hardware environments. A menu and command oriented user interface running on an ATARI computer allows comfortable operation. Secondly, the new NMR system is described in detail. It is based on the Liverpool module, allowing dispersion-free detection and simple adjustment to different magnetic fields. A similar VME system takes care of all the tasks necessary for the polarization measurements. Fast optodecoupled analog I/O modules are used as an interface to the NMR hardware. Finally, the first measurements with this target are presented. Using NH3 as target material and a polarizing field of 3.5 Tesla, proton polarizations of +94% and −100% could be achieved. By lowering the magnetic field to 0.35 Tesla a superradiance effect was observed. (orig.)

  12. ALMACAL I: FIRST DUAL-BAND NUMBER COUNTS FROM A DEEP AND WIDE ALMA SUBMILLIMETER SURVEY, FREE FROM COSMIC VARIANCE

    Energy Technology Data Exchange (ETDEWEB)

    Oteo, I.; Ivison, R. J. [Institute for Astronomy, University of Edinburgh, Royal Observatory, Blackford Hill, Edinburgh EH9 3HJ UK (United Kingdom); Zwaan, M. A.; Biggs, A. D. [European Southern Observatory, Karl-Schwarzschild-Strasse 2, D-85748 Garching (Germany); Smail, I., E-mail: ivanoteogomez@gmail.com [Centre for Extragalactic Astronomy, Department of Physics, Durham University, South Road, Durham DH1 3LE UK (United Kingdom)

    2016-05-01

    We have exploited ALMA calibration observations to carry out a novel, wide, and deep submillimeter (submm) survey, ALMACAL. These calibration data comprise a large number of observations of calibrator fields in a variety of frequency bands and array configurations. By gathering together data acquired during multiple visits to many ALMA calibrators, it is possible to reach noise levels which allow the detection of faint, dusty, star-forming galaxies (DSFGs) over a significant area. In this paper, we outline our survey strategy and report the first results. We have analyzed data for 69 calibrators, reaching depths of ∼25 μJy beam^-1 at sub-arcsec resolution. Adopting a conservative approach based on ≥5σ detections, we have found 8 and 11 DSFGs in ALMA bands 6 and 7, respectively, with flux densities S_1.2mm ≥ 0.2 mJy. The faintest galaxies would have been missed by even the deepest Herschel surveys. Our cumulative number counts have been determined independently at 870 μm and 1.2 mm from a sparse sampling of the astronomical sky, and are thus relatively free of cosmic variance. The counts are lower than reported previously by a factor of at least 2×. Future analyses will yield large, secure samples of DSFGs with redshifts determined via the detection of submm spectral lines. Uniquely, our strategy then allows for morphological studies of very faint DSFGs—representative of more normal star-forming galaxies than conventional submm galaxies—in fields where self-calibration is feasible, yielding milliarcsecond spatial resolution.

  13. Deep brain stimulation for dystonia: patient selection and outcomes

    NARCIS (Netherlands)

    Speelman, J. D.; Contarino, M. F.; Schuurman, P. R.; Tijssen, M. A. J.; de Bie, R. M. A.

    2010-01-01

    In a literature survey, 341 patients with primary and 109 with secondary dystonias treated with deep brain stimulation (DBS) of the internal segment of the globus pallidus (GPi) were identified. In general, the outcomes for primary dystonias were more favourable compared to the secondary forms. For

  14. Deep brain stimulation for dystonia : Patient selection and outcomes

    NARCIS (Netherlands)

    Speelman, J. D.; Contarino, M. F.; Schuurman, P. R.; Tijssen, M. A. J.; de Bie, R. M. A.

    In a literature survey, 341 patients with primary and 109 with secondary dystonias treated with deep brain stimulation (DBS) of the internal segment of the globus pallidus (GPi) were identified. In general, the outcomes for primary dystonias were more favourable compared to the secondary forms. For

  15. Trends in Continuous Deep Sedation until Death between 2007 and 2013: A Repeated Nationwide Survey

    Science.gov (United States)

    Cohen, Joachim; Rietjens, Judith

    2016-01-01

    Background Continuous deep sedation until death is a highly debated medical practice, particularly regarding its potential to hasten death and its proper use in end-of-life care. A thorough analysis of important trends in this practice is needed to identify potentially problematic developments. This study aims to examine trends in the prevalence and practice characteristics of continuous deep sedation until death in Flanders, Belgium between 2007 and 2013, and to study variation on physicians’ degree of palliative training. Methods Population-based death certificate study in 2007 and 2013 in Flanders, Belgium. Reporting physicians received questionnaires about medical practices preceding the patient’s death. Patient characteristics, clinical characteristics (drugs used, duration, artificial nutrition/hydration, intention and consent), and palliative care training of attending physician were recorded. We posed the following question regarding continuous deep sedation: ‘Was the patient continuously and deeply sedated or kept in a coma until death by the use of one or more drugs’. Results After the initial rise of continuous deep sedation to 14.5% in 2007 (95%CI 13.1%-15.9%), its use decreased to 12.0% in 2013 (95%CI 10.9%-13.2%). Compared with 2007, in 2013 opioids were less often used as sole drug and the decision to use continuous deep sedation was more often preceded by patient request. Compared to non-experts, palliative care experts more often used benzodiazepines and less often opioids, withheld artificial nutrition/hydration more often and performed sedation more often after a request from or with the consent of the patient or family. Conclusion Worldwide, this study is the first to show a decrease in the prevalence of continuous deep sedation. Despite positive changes in performance and decision-making towards more compliance with due care requirements, there is still room for improvement in the use of recommended drugs and in the involvement of

  16. Trends in Continuous Deep Sedation until Death between 2007 and 2013: A Repeated Nationwide Survey.

    Directory of Open Access Journals (Sweden)

    Lenzo Robijn

    Full Text Available Continuous deep sedation until death is a highly debated medical practice, particularly regarding its potential to hasten death and its proper use in end-of-life care. A thorough analysis of important trends in this practice is needed to identify potentially problematic developments. This study aims to examine trends in the prevalence and practice characteristics of continuous deep sedation until death in Flanders, Belgium between 2007 and 2013, and to study variation on physicians' degree of palliative training. Population-based death certificate study in 2007 and 2013 in Flanders, Belgium. Reporting physicians received questionnaires about medical practices preceding the patient's death. Patient characteristics, clinical characteristics (drugs used, duration, artificial nutrition/hydration, intention and consent), and palliative care training of attending physician were recorded. We posed the following question regarding continuous deep sedation: 'Was the patient continuously and deeply sedated or kept in a coma until death by the use of one or more drugs'. After the initial rise of continuous deep sedation to 14.5% in 2007 (95%CI 13.1%-15.9%), its use decreased to 12.0% in 2013 (95%CI 10.9%-13.2%). Compared with 2007, in 2013 opioids were less often used as sole drug and the decision to use continuous deep sedation was more often preceded by patient request. Compared to non-experts, palliative care experts more often used benzodiazepines and less often opioids, withheld artificial nutrition/hydration more often and performed sedation more often after a request from or with the consent of the patient or family. Worldwide, this study is the first to show a decrease in the prevalence of continuous deep sedation. Despite positive changes in performance and decision-making towards more compliance with due care requirements, there is still room for improvement in the use of recommended drugs and in the involvement of patients and relatives in the

  17. The asymmetric reactions of mean and volatility of stock returns to domestic and international information based on a four-regime double-threshold GARCH model

    Science.gov (United States)

    Chen, Cathy W. S.; Yang, Ming Jing; Gerlach, Richard; Jim Lo, H.

    2006-07-01

    In this paper, we investigate the asymmetric reactions of mean and volatility of stock returns in five major markets to their own local news and the US information via linear and nonlinear models. We introduce a four-regime Double-Threshold GARCH (DTGARCH) model, which allows asymmetry in both the conditional mean and variance equations simultaneously by employing two threshold variables, to analyze the stock markets’ reactions to different types of information (good/bad news) generated from the domestic markets and the US stock market. By applying the four-regime DTGARCH model, this study finds that the interaction between the information of domestic and US stock markets leads to the asymmetric reactions of stock returns and their variability. In addition, this research also finds that the positive autocorrelation reported in the previous studies of financial markets may in fact be mis-specified, and actually due to the local market's positive response to the US stock market.
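    A heavily simplified way to see the two threshold variables at work is sketched below: yesterday's domestic and US returns (synthetic here) split the sample into four regimes, and a crude per-regime mean and volatility are reported. This is only a diagnostic illustration under assumed data, not the jointly estimated four-regime DTGARCH model of the record.

```python
# Rough sketch of the two threshold variables behind a four-regime
# double-threshold specification: lagged local news and lagged US news
# split observations into four regimes (zero thresholds are assumed;
# both return series are synthetic placeholders).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
local = pd.Series(rng.normal(0, 1.0, 2000), name="local_ret")
us = pd.Series(rng.normal(0, 1.2, 2000), name="us_ret")

regime = 2 * (local.shift(1) < 0).astype(int) + (us.shift(1) < 0).astype(int)
labels = {0: "good local / good US", 1: "good local / bad US",
          2: "bad local / good US", 3: "bad local / bad US"}

df = pd.DataFrame({"ret": local, "regime": regime.map(labels)}).iloc[1:]
# Crude diagnostic: mean and volatility of today's return by yesterday's news mix
print(df.groupby("regime")["ret"].agg(["mean", "std", "count"]))
```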

  18. The Experience of Deep Learning by Accounting Students

    Science.gov (United States)

    Turner, Martin; Baskerville, Rachel

    2013-01-01

    This study examines how to support accounting students to experience deep learning. A sample of 81 students in a third-year undergraduate accounting course was studied employing a phenomenographic research approach, using ten assessed learning tasks for each student (as well as a focus group and student surveys) to measure their experience of how…

  19. Does the Underground Sidewall Station Survey Method Meet MHSA ...

    African Journals Online (AJOL)

    Grobler, Hendrik

    The underground survey network in a deep level platinum mine in ... The time duration for peg installation during the initial phase of learning the method was ..... changes to the survey “hardware” including prisms, stems and attachment points ...

  20. Diel effects on bottom-trawl survey catch rates of shallow- and deep ...

    African Journals Online (AJOL)

    Fishing in depths shallower than 400 m outside daylight hours should therefore be avoided in order to reduce bias and ensure consistency in abundance estimates from surveys. Keywords: Benguela Current system, consistency of survey indices, efficiency of bottom-trawl surveys, negative binomial GAM, transect survey ...

  1. Comprehensive survey of deep learning in remote sensing: theories, tools, and challenges for the community

    Science.gov (United States)

    Ball, John E.; Anderson, Derek T.; Chan, Chee Seng

    2017-10-01

    In recent years, deep learning (DL), a rebranding of neural networks (NNs), has risen to the top in numerous areas, namely computer vision (CV), speech recognition, and natural language processing. Whereas remote sensing (RS) possesses a number of unique challenges, primarily related to sensors and applications, inevitably RS draws from many of the same theories as CV, e.g., statistics, fusion, and machine learning, to name a few. This means that the RS community should not only be aware of advancements such as DL, but also be leading researchers in this area. Herein, we provide the most comprehensive survey of state-of-the-art RS DL research. We also review recent new developments in the DL field that can be used in DL for RS. Namely, we focus on theories, tools, and challenges for the RS community. Specifically, we focus on unsolved challenges and opportunities as they relate to (i) inadequate data sets, (ii) human-understandable solutions for modeling physical phenomena, (iii) big data, (iv) nontraditional heterogeneous data sources, (v) DL architectures and learning algorithms for spectral, spatial, and temporal data, (vi) transfer learning, (vii) an improved theoretical understanding of DL systems, (viii) high barriers to entry, and (ix) training and optimization of DL models.

  2. A DEEP VERY LARGE ARRAY RADIO CONTINUUM SURVEY OF THE CORE AND OUTSKIRTS OF THE COMA CLUSTER

    International Nuclear Information System (INIS)

    Miller, Neal A.; Hornschemeier, Ann E.; Mobasher, Bahram

    2009-01-01

    We present deep 1.4 GHz Very Large Array radio continuum observations of two ∼0.5 deg^2 fields in the Coma cluster of galaxies. The two fields, 'Coma 1' and 'Coma 3', correspond to the cluster core and southwest infall region and were selected on account of abundant preexisting multiwavelength data. In their most sensitive regions the radio data reach 22 μJy rms per 4.''4 beam, sufficient to detect (at 5σ) Coma member galaxies with L_1.4GHz = 1.3 × 10^20 W Hz^-1. The full catalog of radio detections is presented herein and consists of 1030 sources detected at ≥5σ, 628 of which are within the combined Coma 1 and Coma 3 area. We also provide optical identifications of the radio sources using data from the Sloan Digital Sky Survey. The depth of the radio observations allows us to detect active galactic nuclei in cluster elliptical galaxies down to a limiting r-band absolute magnitude and star formation in cluster members at rates of order M_⊙ yr^-1.
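    The quoted 5σ point-source luminosity follows from the stated noise level and the luminosity-distance relation L_ν = 4π d² S_ν; the short check below reproduces it, with a Coma distance of roughly 100 Mpc taken as an assumption of the sketch rather than a value given in the record.

```python
# Back-of-the-envelope check of the quoted 5-sigma luminosity limit:
# L_nu = 4 * pi * d^2 * S_nu, with d ~ 100 Mpc assumed for Coma.
import math

rms_jy = 22e-6                       # 22 microJy rms
s_limit = 5 * rms_jy * 1e-26         # 5 sigma, in W m^-2 Hz^-1 (1 Jy = 1e-26)
d = 100 * 3.086e22                   # 100 Mpc in metres (assumed distance)
lum = 4 * math.pi * d**2 * s_limit
print(f"L_1.4GHz ~ {lum:.1e} W/Hz")  # ~1.3e20 W/Hz, matching the abstract
```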

  3. THE HOST GALAXY PROPERTIES OF VARIABILITY SELECTED AGN IN THE PAN-STARRS1 MEDIUM DEEP SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Heinis, S.; Gezari, S.; Kumar, S. [Department of Astronomy, University of Maryland, College Park, MD (United States); Burgett, W. S.; Flewelling, H.; Huber, M. E.; Kaiser, N.; Wainscoat, R. J.; Waters, C. [Institute for Astronomy, University of Hawaii at Manoa, Honolulu, HI 96822 (United States)

    2016-07-20

    We study the properties of 975 active galactic nuclei (AGNs) selected by variability in the Pan-STARRS1 Medium Deep Survey. Using complementary multi-wavelength data from the ultraviolet to the far-infrared, we use spectral energy distribution fitting to determine the AGN and host properties at z < 1 and compare to a well-matched control sample. We confirm the trend previously observed: that the variability amplitude decreases with AGN luminosity, but we also observe that the slope of this relation steepens with wavelength, resulting in a “redder when brighter” trend at low luminosities. Our results show that AGNs are hosted by more massive hosts than control sample galaxies, while the rest-frame dust-corrected NUV − r color distribution of AGN hosts is similar to control galaxies. We find a positive correlation between the AGN luminosity and star formation rate (SFR), independent of redshift. AGN hosts populate the entire range of SFRs within and outside of the Main Sequence of star-forming galaxies. Comparing the distribution of AGN hosts and control galaxies, we show that AGN hosts are less likely to be hosted by quiescent galaxies and more likely to be hosted by Main Sequence or starburst galaxies.

  4. The Pan-STARRS1 Survey Data Release

    Science.gov (United States)

    Chambers, Kenneth C.; Pan-STARRS Team

    2017-01-01

    The first Pan-STARRS1 Science Mission is complete and an initial Data Release 1, or DR1, including a database of measured attributes, stacked images, and metadata of the 3PI Survey, will be available from the STScI MAST archive. This release will contain all stationary objects with mean and stack photometry registered on the GAIA astrometric frame. The characteristics of the Pan-STARRS1 Surveys will be presented, including image quality, depth, cadence, and coverage. Measured attributes include PSF model magnitudes, aperture magnitudes, Kron magnitudes, radial moments, Petrosian magnitudes, and de Vaucouleurs, exponential, and Sersic magnitudes for extended objects. Images include total intensity, variance, and masks. An overview of both DR1 and the second data release DR2, to follow in the spring of 2017, will be presented. DR2 will add all time domain data and individual warped images. We will also report on the status of the Pan-STARRS2 Observatory and ongoing science with Pan-STARRS. The science from the PS1 surveys has included results in many fields of astronomy, from Near Earth Objects to cosmology. The Pan-STARRS1 Surveys have been made possible through contributions of the Institute for Astronomy of the University of Hawaii; the Pan-STARRS Project Office; the Max-Planck Society and its participating institutes: the Max Planck Institute for Astronomy, Heidelberg and the Max Planck Institute for Extraterrestrial Physics, Garching; The Johns Hopkins University; Durham University; the University of Edinburgh; Queen's University Belfast; the Harvard-Smithsonian Center for Astrophysics, the Las Cumbres Observatory Global Telescope Network Incorporated; the National Central University of Taiwan; the Space Telescope Science Institute; the National Aeronautics and Space Administration under Grants No. NNX08AR22G, NNX12AR65G, NNX14AM74G issued through the Planetary Science Division of the NASA Science Mission Directorate; the National Science Foundation under Grant No. AST

  5. Deep-HiTS: Rotation Invariant Convolutional Neural Network for Transient Detection

    Science.gov (United States)

    Cabrera-Vives, Guillermo; Reyes, Ignacio; Förster, Francisco; Estévez, Pablo A.; Maureira, Juan-Carlos

    2017-02-01

    We introduce Deep-HiTS, a rotation-invariant convolutional neural network (CNN) model for classifying images of transient candidates into artifacts or real sources for the High cadence Transient Survey (HiTS). CNNs have the advantage of learning the features automatically from the data while achieving high performance. We compare our CNN model against a feature engineering approach using random forests (RFs). We show that our CNN significantly outperforms the RF model, reducing the error by almost half. Furthermore, for a fixed number of approximately 2000 allowed false transient candidates per night, we are able to reduce the misclassified real transients by approximately one-fifth. To the best of our knowledge, this is the first time CNNs have been used to detect astronomical transient events. Our approach will be very useful when processing images from next generation instruments such as the Large Synoptic Survey Telescope. We have made all our code and data available to the community for the sake of allowing further developments and comparisons at https://github.com/guille-c/Deep-HiTS. Deep-HiTS is licensed under the terms of the GNU General Public License v3.0.
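    To make the rotation-invariance idea concrete, the sketch below shows a toy convolutional classifier whose prediction is averaged over the four 90° rotations of each candidate stamp, written in PyTorch. It is only an illustrative stand-in for the approach described above, not the published Deep-HiTS architecture; the stamp size and number of image channels are assumptions.

```python
# Minimal sketch of rotation-averaged CNN scoring for transient stamps.
import torch
import torch.nn as nn

class TinyTransientCNN(nn.Module):
    def __init__(self, in_channels=3, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        # Average logits over the 4 rotations so the score is unchanged
        # when the stamp is rotated by multiples of 90 degrees.
        logits = 0
        for k in range(4):
            xr = torch.rot90(x, k, dims=(2, 3))
            logits = logits + self.classifier(self.features(xr).flatten(1))
        return logits / 4

model = TinyTransientCNN()
stamps = torch.randn(8, 3, 21, 21)   # batch of 21x21 candidate cutouts (assumed size)
print(model(stamps).shape)           # torch.Size([8, 2])
```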

  6. Knowledge and practice of prophylaxis of deep venous thrombosis ...

    African Journals Online (AJOL)

    2015-09-03

    Venous thromboembolism (VTE) is an important but preventable cause of morbidity and mortality among surgical patients.

  7. Bacteria as part of bioluminescence emission at the deep ANTARES station (North-Western Mediterranean Sea) during a one-year survey

    Science.gov (United States)

    Martini, S.; Michotey, V.; Casalot, L.; Bonin, P.; Guasco, S.; Garel, M.; Tamburini, C.

    2016-10-01

    Bioluminescent bacteria have been studied during a one-year survey in 2011 at the deep ANTARES site (Northwestern Mediterranean Sea, 2000 m depth). The neutrino underwater telescope ANTARES, located at this station, has been used to record the bioluminescence at the same depth. Together with these data, environmental variables (potential temperature, salinity, nutrients, dissolved organic carbon and oxygen) have been characterized in water samples. The year 2011 was characterized by relatively stable conditions, as revealed by minor variability in the monitored oceanographic variables, by low bioluminescence and by low current speed. This suggests weak eukaryote participation and mainly non-stimulated light emission. Hence, no dense-water formation processes affected the ANTARES station during this survey. The abundance of bioluminescent bacteria belonging to the Photobacterium genus, measured by qPCR of the luxF gene, ranged from 1.4×10^2 to 7.2×10^2 genes mL^-1. Their effective activity was confirmed through mRNA luxF quantification. Our results reveal that bioluminescent bacteria appeared more active than the bacterial community as a whole, suggesting an ecological benefit of this feature, such as favoring interaction with macro-organisms. Moreover, these results show that part of the bioluminescence recorded at 2000 m depth over one year could be due to bioluminescent bacteria in stable hydrological conditions.

  8. A deep imaging survey of the Pleiades with ROSAT

    Science.gov (United States)

    Stauffer, J. R.; Caillault, J.-P.; Gagne, M.; Prosser, C. F.; Hartmann, L. W.

    1994-01-01

    We have obtained deep ROSAT images of three regions within the Pleiades open cluster. We have detected 317 X-ray sources in these ROSAT Position Sensitive Proportional Counter (PSPC) images, 171 of which we associate with certain or probable members of the Pleiades cluster. We detect nearly all Pleiades members with spectral types later than G0 and within 25 arcminutes of our three field centers where our sensitivity is highest. This has allowed us to derive for the first time the luminosity function for the G, K, and M dwarfs of an open cluster without the need to use statistical techniques to account for the presence of upper limits in the data sample. Because of our high X-ray detection frequency down to the faint limit of the optical catalog, we suspect that some of our unidentified X-ray sources are previously unknown, very low-mass members of the Pleiades. A large fraction of the Pleiades members detected with ROSAT have published rotational velocities. Plots of L_X/L_Bol versus spectroscopic rotational velocity show tightly correlated 'saturation'-type relations for stars with (B − V)_0 ≥ 0.60. For each of several color ranges, X-ray luminosities rise rapidly with increasing rotation rate until v sin i ≈ 15 km/s, and then remain essentially flat for rotation rates up to at least v sin i ≈ 100 km/s. The dispersion in rotation among low-mass stars in the Pleiades is by far the dominant contributor to the dispersion in L_X at a given mass. Only about 35% of the B, A, and early F stars in the Pleiades are detected as X-ray sources in our survey. There is no correlation between X-ray flux and rotation for these stars. The X-ray luminosity function for the early-type Pleiades stars appears to be bimodal -- with only a few exceptions, we either detect these stars at fluxes in the range found for low-mass stars or we derive X-ray limits below the level found for most Pleiades

  9. Revealing Holobiont Structure and Function of Three Red Sea Deep-Sea Corals

    KAUST Repository

    Yum, Lauren

    2014-12-01

    Deep-sea corals have long been regarded as cold-water corals; however, a reevaluation of their habitat limitations has been suggested after the discovery of deep-sea coral in the Red Sea, where temperatures exceed 20˚C. To gain further insight into the biology of deep-sea corals at these temperatures, the work in this PhD employed a holotranscriptomic approach, looking at coral animal host and bacterial symbiont gene expression in Dendrophyllia sp., Eguchipsammia fistula, and Rhizotrochus sp. sampled from the deep Red Sea. Bacterial community composition was analyzed via amplicon-based 16S surveys, and cultured bacterial strains were subjected to bioprospecting in order to gauge the pharmaceutical potential of coral-associated microbes. Coral host transcriptome data suggest that these corals can employ mitochondrial hypometabolism, anaerobic glycolysis, and surface cilia to enhance mass transport rates to manage the low oxygen and highly oligotrophic Red Sea waters. In the microbial community associated with these corals, ribokinases and retron-type reverse transcriptases are abundantly expressed. In its first application to deep-sea coral-associated microbial communities, 16S-based next-generation sequencing found that a single operational taxonomic unit can comprise the majority of sequence reads and that a large number of low-abundance populations are present, which cannot be visualized with first-generation sequencing. Bioactivity of selected bacterial isolates was surveyed across over 100 cytological parameters with high-content screening, covering several major organelles and key proteins involved in a variety of signaling cascades. Some of these cytological profiles were similar to those of several reference pharmacologically active compounds, which suggests that the bacterial isolates produce compounds with similar mechanisms of action as the reference compounds. The sum of this work offers several mechanisms by which Red Sea deep-sea corals cope with environmental

  10. A Literature Survey of Early Time Series Classification and Deep Learning

    OpenAIRE

    Santos, Tiago; Kern, Roman

    2017-01-01

    This paper provides an overview of current literature on time series classification approaches, in particular of early time series classification. A very common and effective time series classification approach is the 1-Nearest Neighbor classifier, with different distance measures such as the Euclidean or dynamic time warping distances. This paper starts by reviewing these baseline methods. More recently, with the gain in popularity in the application of deep neural networks to the field of...
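    As a concrete reference for the baseline mentioned above, the sketch below implements a plain dynamic time warping (DTW) distance and a 1-nearest-neighbour prediction in Python; the two training series and their labels are synthetic examples, not data from the surveyed literature.

```python
# Minimal DTW distance plus a 1-nearest-neighbour classifier (baseline idea).
import numpy as np

def dtw(a, b):
    """Classic O(n*m) dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def knn1_predict(query, train_series, train_labels):
    dists = [dtw(query, s) for s in train_series]
    return train_labels[int(np.argmin(dists))]

train = [np.sin(np.linspace(0, 6, 40)), np.linspace(-1, 1, 40)]   # toy classes
labels = ["periodic", "trend"]
print(knn1_predict(np.sin(np.linspace(0.3, 6.3, 35)), train, labels))
```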

  11. The luminosity function for different morphological types in the CfA Redshift Survey

    Science.gov (United States)

    Marzke, Ronald O.; Geller, Margaret J.; Huchra, John P.; Corwin, Harold G., Jr.

    1994-01-01

    We derive the luminosity function for different morphological types in the original CfA Redshift Survey (CfA1) and in the first two slices of the CfA Redshift Survey Extension (CfA2). CfA1 is a complete sample containing 2397 galaxies distributed over 2.7 steradians with m_z ≤ 14.5. The first two complete slices of CfA2 contain 1862 galaxies distributed over 0.42 steradians with m_z = 15.5. The shapes of the E-S0 and spiral luminosity functions (LF) are indistinguishable. We do not confirm the steeply decreasing faint end in the E-S0 luminosity function found by Loveday et al. for an independent sample in the southern hemisphere. We demonstrate that incomplete classification in deep redshift surveys can lead to underestimates of the faint end of the elliptical luminosity function and could be partially responsible for the difference between the CfA survey and other local field surveys. The faint end of the LF for the Magellanic spirals and irregulars is very steep. The Sm-Im luminosity function is well fit by a Schechter function with M* = −18.79, α = −1.87, and φ* = 0.6 × 10^-3 for M_z ≤ −13. These galaxies are largely responsible for the excess at the faint end of the general CfA luminosity function. The abundance of intrinsically faint, blue galaxies nearby affects the interpretation of deep number counts. The dwarf population increases the expected counts at B = 25 in a no-evolution, q_0 = 0.05 model by a factor of two over standard no-evolution estimates. These dwarfs change the expected median redshift in deep redshift surveys by less than 10 percent. Thus the steep Sm-Im LF may contribute to the reconciliation of deep number counts with deep redshift surveys.
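    The quoted fit can be written out explicitly as the Schechter function in magnitudes, φ(M) dM = 0.4 ln(10) φ* [10^{0.4(M*−M)}]^{α+1} exp(−10^{0.4(M*−M)}) dM. The short sketch below evaluates it with the Sm-Im parameters given above; treating φ* as a density per Mpc³ is an assumption of the sketch.

```python
# Evaluate the Schechter luminosity function (in magnitudes) with the
# Sm-Im parameters quoted above (phi* per Mpc^3 assumed).
import numpy as np

def schechter_mag(M, M_star=-18.79, alpha=-1.87, phi_star=0.6e-3):
    """phi(M): number density per unit absolute magnitude."""
    x = 10 ** (0.4 * (M_star - M))
    return 0.4 * np.log(10) * phi_star * x ** (alpha + 1) * np.exp(-x)

for M in (-20, -18, -16, -14):
    print(M, f"{schechter_mag(M):.2e}")
```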

  12. Why & When Deep Learning Works: Looking Inside Deep Learnings

    OpenAIRE

    Ronen, Ronny

    2017-01-01

    The Intel Collaborative Research Institute for Computational Intelligence (ICRI-CI) has been heavily supporting Machine Learning and Deep Learning research from its foundation in 2012. We have asked six leading ICRI-CI Deep Learning researchers to address the challenge of "Why & When Deep Learning works", with the goal of looking inside Deep Learning, providing insights on how deep networks function, and uncovering key observations on their expressiveness, limitations, and potential. The outp...

  13. Climate change in Latin America and the Caribbean. A review of the Bonn and Marrakech decisions and their effect on the clean development mechanism of the Kyoto protocol

    International Nuclear Information System (INIS)

    Maggiora, C. della

    2002-04-01

    The objective of this document is to present an overview of recent climate change developments, in particular with regards to carbon markets under the Clean Development Mechanism (CDM). The document is divided into three sections. The first section describes the history of the climate change negotiations. Section two presents an overview of the recent decisions adopted at the last international meetings (Bonn Agreements and Marrakech Accord), which have improved the odds of ratification of the Kyoto Protocol by 2002. The third section analyzes the carbon credit market. The first part of this section briefly presents the available information regarding real carbon credit transactions, while the second section focuses on the literature review of several theoretical models and presents the theoretical estimates of the price and size of the carbon market

  14. Infrared Faint Radio Sources in the Extended Chandra Deep Field South

    Science.gov (United States)

    Huynh, Minh T.

    2009-01-01

    Infrared-Faint Radio Sources (IFRSs) are a class of radio objects found in the Australia Telescope Large Area Survey (ATLAS) which have no observable counterpart in the Spitzer Wide-area Infrared Extragalactic Survey (SWIRE). The extended Chandra Deep Field South now has even deeper Spitzer imaging (3.6 to 70 micron) from a number of Legacy surveys. We report the detections of two IFRS sources in IRAC images. The non-detection of two other IFRSs allows us to constrain the source type. Detailed modeling of the SED of these objects shows that they are consistent with high redshift AGN (z > 2).

  15. The VIMOS Ultra Deep Survey first data release: Spectra and spectroscopic redshifts of 698 objects up to zspec ≃ 6 in CANDELS

    Science.gov (United States)

    Tasca, L. A. M.; Le Fèvre, O.; Ribeiro, B.; Thomas, R.; Moreau, C.; Cassata, P.; Garilli, B.; Le Brun, V.; Lemaux, B. C.; Maccagni, D.; Pentericci, L.; Schaerer, D.; Vanzella, E.; Zamorani, G.; Zucca, E.; Amorin, R.; Bardelli, S.; Cassarà, L. P.; Castellano, M.; Cimatti, A.; Cucciati, O.; Durkalec, A.; Fontana, A.; Giavalisco, M.; Grazian, A.; Hathi, N. P.; Ilbert, O.; Paltani, S.; Pforr, J.; Scodeggio, M.; Sommariva, V.; Talia, M.; Tresse, L.; Vergani, D.; Capak, P.; Charlot, S.; Contini, T.; de la Torre, S.; Dunlop, J.; Fotopoulou, S.; Guaita, L.; Koekemoer, A.; López-Sanjuan, C.; Mellier, Y.; Salvato, M.; Scoville, N.; Taniguchi, Y.; Wang, P. W.

    2017-04-01

    This paper describes the first data release (DR1) of the VIMOS Ultra Deep Survey (VUDS). The VUDS-DR1 is the release of all low-resolution spectroscopic data obtained in 276.9 arcmin^2 of the CANDELS-COSMOS and CANDELS-ECDFS survey areas, including accurate spectroscopic redshifts zspec and individual spectra obtained with VIMOS on the ESO-VLT. A total of 698 objects have a measured redshift, comprising 677 galaxies, two type-I AGN, and a small number (19) of contaminating stars. The targets of the spectroscopic survey are selected primarily on the basis of their photometric redshifts to ensure a broad population coverage. About 500 galaxies have zspec > 2, 48 of which have zspec > 4; the highest reliable redshifts reach beyond zspec = 6. This data set approximately doubles the number of galaxies with spectroscopic redshifts at z > 3 in these fields. We discuss the general properties of the VUDS-DR1 sample in terms of the spectroscopic redshift distribution, the distribution of Lyman-α equivalent widths, and physical properties including stellar masses M⋆ and star formation rates derived from spectral energy distribution fitting with the knowledge of zspec. We highlight the properties of the most massive star-forming galaxies, noting the wide range in spectral properties, with Lyman-α in emission or in absorption, and in imaging properties with compact, multi-component, or pair morphologies. We present the catalogue database and data products. All VUDS-DR1 data are publicly available and can be retrieved from a dedicated query-based database. Future VUDS data releases will follow this VUDS-DR1 to give access to the spectra and associated measurements of 8000 objects in the full 1 square degree of the VUDS survey. Based on data obtained with the European Southern Observatory Very Large Telescope, Paranal, Chile, under Large Program 185.A-0791. http://cesam.lam.fr/vuds

  16. Simulation calculations on the construction of the energy-tagged photon beam as well as development and test of the side drift chambers of the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Jahnen, T.

    1990-01-01

    The SAPHIR detector is being built up at the continuous photon beam of the Electron Stretcher and Accelerator ELSA in Bonn. The equipment is designed for investigations of reactions with more than two particles in the final state and for photon energies up to 3.5 GeV. A tagging system determines the energy of the bremsstrahlung photons, and a set-up of five large drift chambers measures the tracks of the charged particles. This work describes a program which was used to develop the best design of the tagging hodoscope. In a second part the tests of the planar side chambers and their evaluation are described. These measurements were carried out to fix the gas filling and the parameters of the best working point. It is shown that the chambers can reach a resolution of σ ≤ 200 μm. (orig.)

  17. ASSESSMENT OF THE DEEP SEA WRECK USS INDEPENDENCE

    Directory of Open Access Journals (Sweden)

    Lisa C. Symons

    2016-07-01

    Full Text Available As part of ongoing efforts to better understand the nature of shipwrecks in National Marine Sanctuaries which may pose some level of pollution risk, and in this case, to definitively locate what is likely the only shipwreck in a sanctuary involved in both nuclear testing and nuclear waste disposal, NOAA’s Office of National Marine Sanctuaries collaborated with NOAA’s Office of Ocean Exploration and The Boeing Company, which provided their autonomous underwater vehicle, Echo Ranger, to conduct the first deep-water archaeological survey of the scuttled aircraft carrier USS Independence in the waters of Monterey Bay National Marine Sanctuary (MBNMS in March 2015. The presence of the deep-sea scuttled radioactive aircraft carrier USS Independence off the California coast has been the source of consistent media speculation and public concern for decades. The survey confirmed that a sonar target charted at the location was Independence, and provided details on the condition of the wreck, and revealed no detectable levels of radioactivity. At the same time, new information from declassified government reports provided more detail on Independence’s use as a naval test craft for radiological decontamination as well as its use as a repository for radioactive materials at the time of its scuttling in 1951. While further surveys may reveal more, physical assessment and focused archival work has demonstrated that the level of concern and speculation of danger from either a radioactive or oil pollution threat posed may be exaggerated.

  18. Gulf of Mexico Deep-Sea Coral Ecosystem Studies, 2008-2011

    Science.gov (United States)

    Kellogg, Christina A.

    2009-01-01

    Most people are familiar with tropical coral reefs, located in warm, well-illuminated, shallow waters. However, corals also exist hundreds and even thousands of meters below the ocean surface, where it is cold and completely dark. These deep-sea corals, also known as cold-water corals, have become a topic of interest due to conservation concerns over the impacts of trawling, exploration for oil and gas, and climate change. Although the existence of these corals has been known since the 1800s, our understanding of their distribution, ecology, and biology is limited due to the technical difficulties of conducting deep-sea research. DISCOVRE (DIversity, Systematics, and COnnectivity of Vulnerable Reef Ecosystems) is a new U.S. Geological Survey (USGS) program focused on deep-water coral ecosystems in the Gulf of Mexico. This integrated, multidisciplinary, international effort investigates a variety of topics related to unique and fragile deep-sea coral ecosystems from the microscopic level to the ecosystem level, including components of microbiology, population genetics, paleoecology, food webs, taxonomy, community ecology, physical oceanography, and mapping.

  19. Student Deep Learning in Bachelor English Programs within Pakistani Universities

    Science.gov (United States)

    Tahir, Khazima

    2015-01-01

    The purpose of this study was to contrast undergraduate students' descriptions about transformational teaching practices, and student deep learning in bachelor English programs in selected universities within Pakistan. This study utilized a survey to gather responses from five hundred and twenty three students. A paired sample t test was utilized…

  20. Fast rise times and the physical mechanism of deep earthquakes

    Science.gov (United States)

    Houston, H.; Williams, Q.

    1991-01-01

    A systematic global survey of the rise times and stress drops of deep and intermediate earthquakes is reported. When the rise times are scaled to the seismic moment release of the events, their average is nearly twice as fast for events deeper than about 450 km as for shallower events.

  1. Vertical Cable Seismic Survey for SMS exploration

    Science.gov (United States)

    Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hotoshi; Mizohata, Shigeharu

    2014-05-01

    The Vertical Cable Seismic (VCS) survey is a reflection seismic method. It uses vertical hydrophone arrays moored to the seafloor to record acoustic waves generated by sea-surface, deep-towed, or ocean-bottom sources. By analyzing the reflections from below the seabed, we can image the subsurface structure. Because VCS is an efficient high-resolution 3D seismic survey method for a spatially bounded area, we proposed it for the SMS survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We have been developing the VCS survey system, including not only the data acquisition hardware but also the data processing and analysis techniques. We carried out several VCS surveys combined with surface-towed, deep-towed, and ocean-bottom sources, in water depths from 100 m to 2100 m. Through these experiments the VCS data acquisition system has been completed, but the data processing techniques are still under development. One of the most critical issues is positioning in the water: uncertainty in the positions of the source and of the hydrophones degrades the quality of the subsurface image. GPS navigation is available at the sea surface, but for a deep-towed or ocean-bottom source the accuracy of shot positions from SSBL/USBL is not sufficient for very high-resolution imaging. We have therefore developed a new approach that determines positions in the water using the travel times from the source to the VCS hydrophones. In 2013 we carried out a second VCS survey using a surface-towed high-voltage sparker and an ocean-bottom source in the Izena Cauldron, one of the most promising SMS areas around Japan. The ocean-bottom source positions estimated by this method are consistent with the VCS field records. The sparker VCS data have been processed with 3D prestack time migration (PSTM), which gives a very high-resolution 3D volume deeper than two
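    The positioning step described above amounts to a nonlinear least-squares problem. The hedged sketch below solves a toy version of it in Python: one hydrophone's position is estimated from travel times to surface shots at known locations, assuming a constant sound speed; the geometry, noise level, and solver choice are illustrative assumptions, not details taken from the record.

```python
# Toy travel-time positioning: recover a hydrophone position from
# source-to-receiver travel times, assuming a constant sound speed.
import numpy as np
from scipy.optimize import least_squares

c = 1500.0                                            # sound speed, m/s (assumed)
rng = np.random.default_rng(2)
shots = np.column_stack([rng.uniform(-800, 800, 30),
                         rng.uniform(-800, 800, 30),
                         np.zeros(30)])               # sea-surface shot points
true_phone = np.array([40.0, -25.0, -1500.0])         # "unknown" hydrophone

t_obs = np.linalg.norm(shots - true_phone, axis=1) / c
t_obs += rng.normal(0, 1e-4, t_obs.size)              # ~0.1 ms timing noise

def residuals(p):
    return np.linalg.norm(shots - p, axis=1) / c - t_obs

fit = least_squares(residuals, x0=[0.0, 0.0, -1400.0])
print("estimated hydrophone position:", np.round(fit.x, 1))
```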

  2. Root Transcriptomic Analysis Revealing the Importance of Energy Metabolism to the Development of Deep Roots in Rice (Oryza sativa L.)

    OpenAIRE

    Lou, Qiaojun; Chen, Liang; Mei, Hanwei; Xu, Kai; Wei, Haibin; Feng, Fangjun; Li, Tiemei; Pang, Xiaomeng; Shi, Caiping; Luo, Lijun; Zhong, Yang

    2017-01-01

    Drought is the most serious abiotic stress limiting rice production, and deep root is the key contributor to drought avoidance. However, the genetic mechanism regulating the development of deep roots is largely unknown. In this study, the transcriptomes of 74 root samples from 37 rice varieties, representing the extreme genotypes of shallow or deep rooting, were surveyed by RNA-seq. The 13,242 differentially expressed genes (DEGs) between deep rooting and shallow rooting varieties (H vs. L) w...

  3. UltraVISTA : a new ultra-deep near-infrared survey in COSMOS

    NARCIS (Netherlands)

    McCracken, H. J.; Milvang-Jensen, B.; Dunlop, J.; Franx, M.; Fynbo, J. P. U.; Le Fevre, O.; Holt, J.; Caputi, K. I.; Goranova, Y.; Buitrago, F.; Emerson, J. P.; Freudling, W.; Hudelot, P.; Lopez-Sanjuan, C.; Magnard, F.; Mellier, Y.; Moller, P.; Nilsson, K. K.; Sutherland, W.; Tasca, L.; Zabl, J.

    In this paper we describe the first data release of the UltraVISTA near-infrared imaging survey of the COSMOS field. We summarise the key goals and design of the survey and provide a detailed description of our data reduction techniques. We provide stacked, sky-subtracted images in YJHKs and

  4. Trouver la bonne distance : étrangère, marginale, ethnologue et parente en Corée du Sud

    Directory of Open Access Journals (Sweden)

    Élise Prébin

    2009-03-01

    Full Text Available Finding the right distance: stranger, marginal, ethnologist and relative in South Korea. This article is about my relationship to South Korean society, not only as an anthropologist but also as an adoptee of Korean origin. It recounts how several kinds of relationships had to be managed during fieldwork. On the one hand, I had to draw closer to my informants in order to carry out classic fieldwork in an unfamiliar society, to understand another culture, and to address a social-science topic: the return of overseas adoptees to their country of origin. On the other hand, I had to establish the right distance from my Korean birth family by temporarily keeping away from it. My relationship with them was unstable because it rested on a contradiction: my status as a foreigner despite the intimacy implied by blood ties. Yet this continuing relationship between my birth family and myself shed light on certain ambiguities in the status of adoptees in South Korean society that the limited time of fieldwork had not allowed me to grasp. This article therefore sketches an epistemological reflection on the relevance of biographical elements to the anthropological treatment of certain subjects. In particular, I show how giving a biological daughter who had been placed for adoption away in marriage satisfactorily resolves the ever-problematic giving in adoption.

  5. A New Infrared Color Criterion for the Selection of 0 < z < 7 AGNs: Application to Deep Fields and Implications for JWST Surveys

    Science.gov (United States)

    Messias, H.; Afonso, J.; Salvato, M.; Mobasher, B.; Hopkins, A. M.

    2012-08-01

    It is widely accepted that observations at mid-infrared (mid-IR) wavelengths enable the selection of galaxies with nuclear activity, which may not be revealed even in the deepest X-ray surveys. Many mid-IR color-color criteria have been explored to accomplish this goal and tested thoroughly in the literature. Besides missing many low-luminosity active galactic nuclei (AGNs), one of the main conclusions is that, with increasing redshift, the contamination by non-active galaxies becomes significant (especially at z ≳ 2.5). This is problematic for the study of the AGN phenomenon in the early universe, the main goal of many of the current and future deep extragalactic surveys. In this work new near- and mid-IR color diagnostics are explored, aiming for improved efficiency—better completeness and less contamination—in selecting AGNs out to very high redshifts. We restrict our study to the James Webb Space Telescope wavelength range (0.6-27 μm). The criteria are created based on the predictions by state-of-the-art galaxy and AGN templates covering a wide variety of galaxy properties, and tested against control samples with deep multi-wavelength coverage (ranging from the X-rays to radio frequencies). We show that the colors Ks - [4.5], [4.5] - [8.0], and [8.0] - [24] are ideal as AGN/non-AGN diagnostics over successive redshift ranges out to z ~ 2.5-3. However, when the source redshift is unknown, these colors should be combined. We thus develop an improved IR criterion (using Ks and IRAC bands, KI) as a new alternative at lower redshifts (reaching a 50%-90% level of successful AGN selection). We also propose KIM (using Ks, IRAC, and MIPS 24 μm bands), which aims to select AGN hosts from local distances to as far back as the end of reionization (0 ≲ z ≲ 7). Overall, KIM shows a ~30%-40% completeness and a >70%-90% level of successful AGN selection. KI and KIM are built to be reliable against a ~10%-20% error in flux, are based on existing filters, and are suitable for immediate use.

  6. Report on fiscal 2000 survey for geothermal exploration technology verification. Survey of deep-seated geothermal resources; 2000 nendo chinetsu tansa gijutsu nado kensho chosa hokokusho. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    To promote the development of deep-seated geothermal resources in a rationalized way, studies are conducted about deep-seated geothermal resource assessment techniques, development guidelines, and the like. Data were collected at the Sumikawa-Onuma district, Ogiri district, Mori district, Yanaizu-Nishiyama district, and the Onikobe district, and compiled into a database to be open to the public. Studies were made about methods for estimating parameters for deep-seated geothermal reservoirs. The resultant findings indicate that, in the Uenotai and Sumikawa-Onuma districts where geothermal reservoirs are governed mainly by a fracture network, the relaxation method and extrapolation will be effective for deep-seated reservoir temperature estimation, and the ascending current analysis method and extrapolation for permeability estimation. The findings also indicate that the expanse of deep-seated reservoirs will be suitably estimated using a method similar to that applied to shallow-seated reservoirs. In the study of the estimation of the amount of deep-seated geothermal resources, it is concluded that the simplified model A will be effective in dealing with a geothermal district where there is a well-developed fracture network and the simplified model B in dealing with a geothermal district where supply of deep-seated fluid governed by an extensive fault prevails. (NEDO)

  7. Report on fiscal 1999 survey for geothermal exploration technology verification. Survey of deep-seated geothermal resources; 1999 nendo chinetsu tansa gijutsu nado kensho chosa hokokusho. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    To promote the development of deep-seated geothermal resources in a rationalized way, studies were conducted about deep-seated geothermal resource assessment techniques, development guidelines, and the like. For the development of techniques for estimating deep-seated geothermal reservoir parameters, the Uenotai district, Akita Prefecture, and the Hatchobaru district, Oita Prefecture, were designated as model fields, and a geothermal system conceptual model was fabricated. Data of the two districts were registered in a database. Using these data, verification was performed of the validity of stochastic estimation techniques, large area flow simulation, rock/water equilibrium reaction simulation, and the like. As for the technique of deep-seated resource amount estimation, a simplified reservoir model was experimentally constructed based on parameters determined by the stochastic estimation of deep-seated reservoirs and on the conceptual model, and a method was studied for TOUGH2-based production prediction. Studies were also made about deep-seated geothermal resource development guidelines, such as exploration guidelines, exploration well boring guidelines, and geothermal fluid production guidelines. (NEDO)

  8. THE ACS NEARBY GALAXY SURVEY TREASURY

    International Nuclear Information System (INIS)

    Dalcanton, Julianne J.; Williams, Benjamin F.; Rosema, Keith; Gogarten, Stephanie M.; Christensen, Charlotte; Gilbert, Karoline; Hodge, Paul; Seth, Anil C.; Dolphin, Andrew; Holtzman, Jon; Skillman, Evan D.; Weisz, Daniel; Cole, Andrew; Girardi, Leo; Karachentsev, Igor D.; Olsen, Knut; Freeman, Ken; Gallart, Carme; Harris, Jason; De Jong, Roelof S.

    2009-01-01

    The ACS Nearby Galaxy Survey Treasury (ANGST) is a systematic survey to establish a legacy of uniform multi-color photometry of resolved stars for a volume-limited sample of nearby galaxies (D < 4 Mpc) spanning factors of ~10^4 in luminosity and star formation rate. The survey data consist of images taken with the Advanced Camera for Surveys (ACS) on the Hubble Space Telescope (HST), supplemented with archival data and new Wide Field Planetary Camera 2 (WFPC2) imaging taken after the failure of ACS. Survey images include wide field tilings covering the full radial extent of each galaxy, and single deep pointings in uncrowded regions of the most massive galaxies in the volume. The new wide field imaging in ANGST reaches median 50% completenesses of m_F475W = 28.0 mag, m_F606W = 27.3 mag, and m_F814W = 27.3 mag, several magnitudes below the tip of the red giant branch (TRGB). The deep fields reach magnitudes sufficient to fully resolve the structure in the red clump. The resulting photometric catalogs are publicly accessible and contain over 34 million photometric measurements of >14 million stars. In this paper we present the details of the sample selection, imaging, data reduction, and the resulting photometric catalogs, along with an analysis of the photometric uncertainties (systematic and random), for both ACS and WFPC2 imaging. We also present uniformly derived relative distances measured from the apparent magnitude of the TRGB.

  9. Brown dwarfs in wide-field surveys

    Directory of Open Access Journals (Sweden)

    Lodieu N.

    2011-07-01

    Full Text Available In this invited talk, I briefly summarise early photometric and proper motion surveys carried out in the nearest and youngest open clusters to introduce the motivation behind the Galactic Cluster component of the UKIRT Infrared Deep Sky Survey. Afterwards, I focus on the latest results that we obtained in the Upper Sco association and in the Pleiades. To finish, I show a comparison of the luminosity and mass functions obtained in the Upper Sco association, the Pleiades cluster, and σ Orionis from the homogeneous set of data publicly available from the Galactic Clusters Survey.

  10. Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler

    OpenAIRE

    R.Anita; V.Ganga Bharani; N.Nityanandam; Pradeep Kumar Sahoo

    2011-01-01

    The explosive growth of World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web while the deep web keeps expanding behind the scene. Deep web pages are created dynamically as a result of queries posed to specific web databases. The structure of the deep web pages makes it impossible for traditional web crawlers to access deep web contents. This paper, Deep iCrawl, gives a novel and vision-based app...

  11. DEEP NEAR-INFRARED SURVEY OF THE PIPE NEBULA. II. DATA, METHODS, AND DUST EXTINCTION MAPS

    International Nuclear Information System (INIS)

    Roman-Zuniga, Carlos G.; Alves, Joao F.; Lada, Charles J.; Lombardi, Marco

    2010-01-01

    We present a new set of high-resolution dust extinction maps of the nearby and essentially starless Pipe Nebula molecular cloud. The maps were constructed from a concerted deep near-infrared imaging survey with the ESO-VLT, ESO-NTT, CAHA 3.5 m telescopes, and 2MASS data. The new maps have a resolution three times higher than the previous extinction map of this cloud by Lombardi et al. and are able to resolve structures down to 2600 AU. We detect 244 significant extinction peaks across the cloud. These peaks have masses between 0.1 and 18.4 M_⊙, diameters between 1.2 and 5.7 × 10^4 AU (0.06 and 0.28 pc), and mean densities of about 10^4 cm^-3, all in good agreement with previous results. From the analysis of the mean surface density of companions we find a well-defined scale near 1.4 × 10^4 AU below which we detect a significant decrease in structure of the cloud. This scale is smaller than the Jeans length calculated from the mean density of the peaks. The surface density of peaks is not uniform but instead it displays clustering. Extinction peaks in the Pipe Nebula appear to have a spatial distribution similar to the stars in Taurus, suggesting that the spatial distribution of stars evolves directly from the primordial spatial distribution of high-density material.

  12. Deep and accurate near-infrared photometry of the Galactic globular cluster omega Cen .

    Science.gov (United States)

    Calamida, A.; Bono, G.; Corsi, C. E.; Stetson, P. B.; Prada Moroni, P. G.; Degl'Innocenti, S.; Marchetti, E.; Amico, P.; Ferraro, I.; Iannicola, G.; Monelli, M.; Buonanno, R.; Caputo, F.; Dall'Ora, M.; Freyhammer, L. M.; Koester, D.; Nonino, M.; Piersimoni, A. M.; Pulone, L.; Romaniello, M.

    We present deep and accurate Near-Infrared (NIR) photometry of the Galactic Globular Cluster omega Cen . Data were collected using the Multi-Conjugate Adaptive Optics Demonstrator (MAD) mounted on the VLT (ESO). We combined the NIR photometry with optical space data collected with the Advanced Camera for Surveys (ACS) for the same region of the cluster. Our deep optical-NIR CMD indicates that the spread in age among the different stellar populations in omega Cen is at most of the order of 2 Gyr.

  13. SELECTION OF BURST-LIKE TRANSIENTS AND STOCHASTIC VARIABLES USING MULTI-BAND IMAGE DIFFERENCING IN THE PAN-STARRS1 MEDIUM-DEEP SURVEY

    International Nuclear Information System (INIS)

    Kumar, S.; Gezari, S.; Heinis, S.; Chornock, R.; Berger, E.; Soderberg, A.; Stubbs, C. W.; Kirshner, R. P.; Rest, A.; Huber, M. E.; Narayan, G.; Marion, G. H.; Burgett, W. S.; Foley, R. J.; Scolnic, D.; Riess, A. G.; Lawrence, A.; Smartt, S. J.; Smith, K.; Wood-Vasey, W. M.

    2015-01-01

    We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time-series in four Pan-STARRS1 photometric bands g_P1, r_P1, i_P1, and z_P1. We use three deterministic light-curve models to fit BL transients: a Gaussian, a Gamma distribution, and an analytic supernova (SN) model, and one stochastic light-curve model, the Ornstein-Uhlenbeck process, in order to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm on these statistics, to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SV and BL occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe based on our verification sets. We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets, to
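
    The model-comparison-then-clustering idea can be illustrated with a much reduced sketch: fit a deterministic burst model and a trivial constant baseline to each simulated difference-flux series, compare corrected AIC values, and cluster the per-source statistics with K-means. The Ornstein-Uhlenbeck fit, the per-band treatment, and the cross-validation step of the actual pipeline are omitted, and all numbers are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit
from sklearn.cluster import KMeans

def gaussian_burst(t, amp, t0, width):
    """Deterministic burst-like (supernova-ish) light-curve model."""
    return amp * np.exp(-0.5 * ((t - t0) / width) ** 2)

def aicc(chi2, k, n):
    """Corrected AIC, using chi-square in place of -2 ln L (unit errors assumed)."""
    return chi2 + 2 * k + 2 * k * (k + 1) / (n - k - 1)

rng = np.random.default_rng(1)
t = np.linspace(0, 100, 60)
stats = []
for i in range(200):
    if i < 100:                                    # burst-like sources
        y = gaussian_burst(t, 5.0, 50.0, 8.0) + rng.normal(0, 1, t.size)
    else:                                          # stochastic/AGN-like proxy: random walk
        y = np.cumsum(rng.normal(0, 0.8, t.size))
    try:
        popt, _ = curve_fit(gaussian_burst, t, y,
                            p0=[y.max(), t[np.argmax(y)], 10.0], maxfev=5000)
        chi2_burst = np.sum((y - gaussian_burst(t, *popt)) ** 2)
    except RuntimeError:                           # fit failed; fall back to baseline
        chi2_burst = np.sum((y - y.mean()) ** 2)
    chi2_flat = np.sum((y - y.mean()) ** 2)
    stats.append([aicc(chi2_burst, 3, t.size) - aicc(chi2_flat, 1, t.size),
                  np.std(np.diff(y))])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(np.array(stats))
print("cluster sizes:", np.bincount(labels))
```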

  14. Larval transport modeling of deep-sea invertebrates can aid the search for undiscovered populations.

    Directory of Open Access Journals (Sweden)

    Jon M Yearsley

    Full Text Available BACKGROUND: Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. PRINCIPAL FINDINGS: In a test case presented here, using non-feeding, non-swimming (lecithotrophic) trochophore larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate 'stepping stone' populations yet to be discovered. CONCLUSIONS/SIGNIFICANCE: We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of the locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep sea, giving information on dispersal corridors and population densities in habitats that remain intrinsically difficult to assess.

  15. A SYSTEMATIC SURVEY OF PROTOCLUSTERS AT z ∼ 3–6 IN THE CFHTLS DEEP FIELDS

    Energy Technology Data Exchange (ETDEWEB)

    Toshikawa, Jun; Kashikawa, Nobunari; Furusawa, Hisanori; Tanaka, Masayuki; Niino, Yuu [Optical and Infrared Astronomy Division, National Astronomical Observatory, Mitaka, Tokyo 181-8588 (Japan); Overzier, Roderik [Observatório Nacional, Rua José Cristino, 77. CEP 20921-400, São Cristóvão, Rio de Janeiro-RJ (Brazil); Malkan, Matthew A. [Department of Physics and Astronomy, University of California, Los Angeles, CA 90095-1547 (United States); Ishikawa, Shogo; Onoue, Masafusa; Uchiyama, Hisakazu [Department of Astronomy, School of Science, Graduate University for Advanced Studies, Mitaka, Tokyo 181-8588 (Japan); Ota, Kazuaki, E-mail: jun.toshikawa@nao.ac.jp [Kavli Institute for Cosmology, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom)

    2016-08-01

    We present the discovery of three protoclusters at z ∼ 3–4 with spectroscopic confirmation in the Canada–France–Hawaii Telescope Legacy Survey Deep Fields. In these fields, we investigate the large-scale projected sky distribution of z ∼ 3–6 Lyman-break galaxies and identify 21 protocluster candidates from regions that are overdense at more than 4σ significance. Based on cosmological simulations, it is expected that more than 76% of these candidates will evolve into a galaxy cluster with a halo mass of at least 10^14 M_⊙ at z = 0. We perform follow-up spectroscopy for eight of the candidates using Subaru/FOCAS, Keck II/DEIMOS, and Gemini-N/GMOS. In total we target 462 dropout candidates and obtain 138 spectroscopic redshifts. We confirm three real protoclusters at z = 3–4 with more than five members spectroscopically identified, and find one to be an incidental overdense region caused by mere chance alignment. The other four candidate regions at z ∼ 5–6 require more spectroscopic follow-up in order to be conclusive. A z = 3.67 protocluster, which has 11 spectroscopically confirmed members, shows a remarkable core-like structure composed of a central small region (<0.5 physical Mpc) and an outskirts region (∼1.0 physical Mpc). The Lyα equivalent widths of members of the protocluster are significantly smaller than those of field galaxies at the same redshift, while there is no difference in the UV luminosity distributions. These results imply that some environmental effects start operating as early as z ∼ 4 along with the growth of the protocluster structure. This study provides an important benchmark for our analysis of protoclusters in the upcoming Subaru/HSC imaging survey and its spectroscopic follow-up with the Subaru/PFS that will detect thousands of protoclusters up to z ∼ 6.

  16. The WFCAM transit survey and cool white dwarfs

    Directory of Open Access Journals (Sweden)

    Pinfield D.

    2013-04-01

    Full Text Available We present results from our search for cool white dwarfs in the WTS (WFCAM Transit Survey). Repeat observations starting in 2007 allowed us to produce deep stacked images in J and measure proper motions. We combine this with deep optical imaging to select cool white dwarf candidates (Teff < 5000 K). About 27 cool white dwarf candidates with proper motions above 0.10 arcsec/yr were identified in one of the fields, representing 1/8th of the survey area. Follow-up spectroscopy with the 10.2 m GTC telescope at La Palma confirmed the white dwarf status of all observed candidates. On-going work is being carried out to increase the sample of cool white dwarfs, which will allow a more comprehensive study of the thick disk/halo white dwarf population.

  17. Deep-sea benthic community and environmental impact assessment at the Atlantic Frontier

    Science.gov (United States)

    Gage, John D.

    2001-05-01

    the oil industry-funded Atlantic Margin Environmental Study cruises in 1996 and 1998. A predominantly depth-related pattern in variability applies here as found elsewhere in the deep ocean, and just sufficient knowledge-based predictive power exists to make comprehensive, high-resolution grid surveys unnecessary for the purpose of broad-scale environmental assessment. But new, small-scale site surveys remain necessary because of local-scale variability. Site survey should be undertaken in the context of existing knowledge of the deep sea in the UK area of the Atlantic Frontier and beyond, and can itself usefully be structured as tests of a projection from the regional scale to reduce sampling effort. It is to the benefit of all stakeholders that environmental assessment aspires to the highest scientific standards and contributes meaningfully to context knowledge. By doing so it will reduce uncertainties in future impact assessments and hence contribute usefully to environmental risk management.

  18. 78 FR 33369 - Takes of Marine Mammals Incidental to Specified Activities; Low-Energy Marine Geophysical Survey...

    Science.gov (United States)

    2013-06-04

    ... information on the study, their modeling process of the experiment in shallow, intermediate, and deep water... element operation and that it uses shallow-water sound propagation as a proxy for deep water propagation... geophysical (i.e., seismic) survey in the deep water of the Gulf of Mexico, April to May 2013. DATES...

  19. Stellar Atmospheric Parameterization Based on Deep Learning

    Science.gov (United States)

    Pan, Ru-yang; Li, Xiang-ru

    2017-07-01

    Deep learning is a typical learning method widely studied in the fields of machine learning, pattern recognition, and artificial intelligence. This work investigates the problem of stellar atmospheric parameterization by constructing a deep neural network with five layers, with 3821-500-100-50-1 nodes in the successive layers. The proposed scheme is verified on both the real spectra measured by the Sloan Digital Sky Survey (SDSS) and the theoretical spectra computed with Kurucz's New Opacity Distribution Function (NEWODF) model, to automatically estimate three physical parameters: the effective temperature (Teff), the surface gravitational acceleration (lg g), and the metallicity [Fe/H]. The results show that the stacked-autoencoder deep neural network achieves better estimation accuracy. On the SDSS spectra, the mean absolute errors (MAEs) are 79.95 for Teff/K, 0.0058 for lg(Teff/K), 0.1706 for lg(g/(cm·s^-2)), and 0.1294 dex for [Fe/H]; on the theoretical spectra, the MAEs are 15.34 for Teff/K, 0.0011 for lg(Teff/K), 0.0214 for lg(g/(cm·s^-2)), and 0.0121 dex for [Fe/H].
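
    A bare-bones PyTorch rendering of the quoted 3821-500-100-50-1 architecture is sketched below for orientation; the stacked-autoencoder pretraining, the real SDSS/Kurucz spectra, and the paper's training details are not reproduced, and random tensors stand in for the data.

```python
import torch
import torch.nn as nn

# Feed-forward network with the quoted layer sizes; one network per output
# parameter (Teff, lg g, or [Fe/H]), matching the final layer width of 1.
net = nn.Sequential(
    nn.Linear(3821, 500), nn.ReLU(),
    nn.Linear(500, 100), nn.ReLU(),
    nn.Linear(100, 50), nn.ReLU(),
    nn.Linear(50, 1),
)

spectra = torch.randn(256, 3821)   # stand-in for normalized flux vectors
labels = torch.randn(256, 1)       # stand-in for scaled Teff (or lg g, [Fe/H]) values
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(5):             # tiny demo loop only
    opt.zero_grad()
    loss = loss_fn(net(spectra), labels)
    loss.backward()
    opt.step()
    print(epoch, float(loss))
```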

  20. Conducting health survey research in a deep rural South African community: challenges and adaptive strategies.

    Science.gov (United States)

    Casale, Marisa; Lane, Tyler; Sello, Lebo; Kuo, Caroline; Cluver, Lucie

    2013-04-24

    In many parts of the developing world, rural health requires focused policy attention, informed by reliable, representative health data. Yet there is surprisingly little published material to guide health researchers who face the unique set of hurdles associated with conducting field research in remote rural areas. In this paper we provide a detailed description of the key challenges encountered during health survey field research carried out in 2010 in a deep rural site in KwaZulu-Natal, South Africa. The aim of the field research was to collect data on the health of children aged 10 to 17 years old, and their primary adult caregivers, as part of a larger national health survey; the research was a collaboration between several South African and foreign universities, South African national government departments, and various NGO partners. In presenting each of the four fieldwork challenges encountered on this site, we describe the initial planning decisions made, the difficulties faced when implementing these in the field, and the adaptive strategies we used to respond to these challenges. We reflect on learnings of potential relevance for the research community. Our four key fieldwork challenges were scarce research capacity, staff relocation tensions, logistical constraints, and difficulties related to community buy-in. Addressing each of these obstacles required timely assessment of the situation and adaptation of field plans, in collaboration with our local NGO partner. Adaptive strategies included a greater use of local knowledge; the adoption of tribal authority boundaries as the smallest geopolitical units for sampling; a creative developmental approach to capacity building; and planned, on-going engagement with multiple community representatives. We argue that in order to maintain high scientific standards of research and manage to 'get the job done' on the ground, it is necessary to respond to fieldwork challenges that arise as a cohesive team, with timely

  1. Kincardine deep geologic repository proposal and the public

    International Nuclear Information System (INIS)

    Squire, T.

    2005-01-01

    'Full text:' In 2002, the Municipality of Kincardine and OPG signed a Memorandum of Understanding (MOU) regarding the long-term management of low and intermediate level radioactive wastes. The purpose of the MOU was for OPG, in consultation with Kincardine, to develop a plan for the long-term management of low and intermediate level waste at OPG's Western Waste Management Facility (WWMF) located on the Bruce site. An independent assessment, which included geotechnical feasibility and safety analyses, a community attitude survey and interviews with local residents, businesses and tourists, and economic modeling to determine the potential benefits and impacts, was completed in February 2004. Ultimately, Kincardine Council endorsed a resolution (Kincardine Council no. 2004-232) to: 'endorse the opinion of the Nuclear Waste Steering Committee and select the 'Deep Rock Vault' option as the preferred course of study in regards to the management of low and intermediate level radioactive waste'. The surrounding municipalities of Saugeen Shores, Brockton, Arran-Elderslie, and Huron-Kinloss expressed their support for the Deep Geologic Repository proposal. This presentation discusses the history, major steps and public processes surrounding the Kincardine Deep Geologic Repository proposal. (author)

  2. Assessing the deep drilling potential of Lago de Tota, Colombia, with a seismic survey

    Science.gov (United States)

    Bird, B. W.; Wattrus, N. J.; Fonseca, H.; Velasco, F.; Escobar, J.

    2015-12-01

    Reconciling orbital-scale patterns of inter-hemispheric South American climate during the Quaternary requires continuous, high-resolution paleoclimate records that span multiple glacial cycles from both hemispheres. Southern Andean Quaternary climates are represented by multi-proxy results from Lake Titicaca (Peru-Bolivia) spanning the last 400 ka and by pending results from the Lago Junin Drilling Project (Peru). Although Northern Andean sediment records spanning the last few million years have been retrieved from the Bogota and Fúquene Basins in the Eastern Cordillera of the Colombian Andes, climatic reconstructions based on these cores have thus far been limited to pollen-based investigations. When viewed together with the Southern Hemisphere results, these records suggest an anti-phased hemispheric climatic response during glacial cycles. In order to better assess orbital-scale climate responses, however, independent temperature and hydroclimate proxies from the Northern Hemisphere are needed in addition to vegetation histories. As part of this objective, an effort is underway to develop a paleoclimate record from Lago de Tota (3030 m asl), the largest lake in Colombia and the third largest lake in the Andes. One of 17 highland tectonic basins in Eastern Cordillera, Lago de Tota formed during Tertiary uplift that deformed pre-foreland megasequences, synrift and back-arc megasequences. The precise age and thickness of sediments in the Lago de Tota basin has not previously been established. Here, we present results from a recent single-channel seismic reflection survey collected with a small (5 cubic inch) air gun and high-resolution CHIRP sub-bottom data. With these data, we examine the depositional history and sequence stratigraphy of Lago de Tota and assess its potential as a deep drilling target.

  3. Drilling a deep geologic test well at Hilton Head Island, South Carolina

    Science.gov (United States)

    Schultz, Arthur P.; Seefelt, Ellen L.

    2011-01-01

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Health and Environmental Control (SCDHEC), is drilling a deep geologic test well at Hilton Head Island, S.C. The test well is scheduled to run between mid-March and early May 2011. When completed, the well will be about 1,000 feet deep. The purpose of this test well is to gain knowledge about the regional-scale Floridan aquifer, an important source of groundwater in the Hilton Head area. Also, cores obtained during drilling will enable geologists to study the last 60 million years of Earth history in this area.

  4. What influences intentions to request physician-assisted euthanasia or continuous deep sedation?

    Science.gov (United States)

    Scherrens, Anne-Lore; Roelands, Marc; Van den Block, Lieve; Deforche, Benedicte; Deliens, Luc; Cohen, Joachim

    2018-09-01

    The increasing prevalence of euthanasia in Belgium has been linked to changing attitudes. Using National health survey data (N = 9651), we investigated Belgian adults' intention to ask a physician for euthanasia or continuous deep sedation in the hypothetical scenario of a terminal illness and examined its connection to sociodemographic and health characteristics. Respectively, 38.3 and 25.8% could envisage asking for euthanasia and continuous deep sedation. Those with very bad to fair subjective health and with depression more likely had an intention to ask for euthanasia, which suggests need for attention in the evaluation of requests from specific patient groups.

  5. A Determination of the Intergalactic Redshift Dependent UV-Optical-NIR Photon Density Using Deep Galaxy Survey Data and the Gamma-ray Opacity of the Universe

    Science.gov (United States)

    Stecker, Floyd W.; Malkan, Matthew A.; Scully, Sean T.

    2012-01-01

    We calculate the intensity and photon spectrum of the intergalactic background light (IBL) as a function of redshift using an approach based on observational data obtained in many different wavelength bands from local to deep galaxy surveys. This allows us to obtain an empirical determination of the IBL and to quantify its observationally based uncertainties. Using our results on the IBL, we then place 68% confidence upper and lower limits on the opacity of the universe to gamma-rays, free of the theoretical assumptions that were needed for past calculations. We compare our results with measurements of the extragalactic background light and upper limits obtained from observations made by the Fermi Gamma-ray Space Telescope.

  6. NAGRA - Sites for geological repositories - Geological surveys for stage 3

    International Nuclear Information System (INIS)

    2014-01-01

    This brochure published by the Swiss National Cooperative for the Disposal of Radioactive Waste (NAGRA) examines the aims involved in the selection of sites for deep geological repositories for nuclear wastes in Switzerland. Various methods involved in their implementation are described. These include 3D-seismology, deep probe drillings, shallow drillings as well as field studies, gravimetric measurements and the study of the electrical properties of the ground and rock involved. These factors are discussed in detail. Maps are presented of the locations that are to be surveyed and details of the selected perimeters are shown. Also, the layout of a sample drilling site is presented. A timescale for the various surveys and work to be done is presented

  7. De-biased populations of Kuiper belt objects from the deep ecliptic survey

    International Nuclear Information System (INIS)

    Adams, E. R.; Benecchi, S. D.; Gulbis, A. A. S.; Elliot, J. L.; Buie, M. W.; Trilling, D. E.; Wasserman, L. H.

    2014-01-01

    The Deep Ecliptic Survey (DES) was a survey project that discovered hundreds of Kuiper Belt objects from 1998 to 2005. Extensive follow-up observations of these bodies have yielded 304 objects with well-determined orbits and dynamical classifications into one of several categories: Classical, Scattered, Centaur, or 16 mean-motion resonances with Neptune. The DES search fields are well documented, enabling us to calculate the probability on each frame of detecting an object with its particular orbital parameters and absolute magnitude at a randomized point in its orbit. The detection probabilities range from a maximum of 0.32 for the 3:2 resonant object 2002 GF32 to a minimum of 1.5 × 10^-7 for the faint Scattered object 2001 FU185. By grouping individual objects together by dynamical classes, we can estimate the distributions of four parameters that define each class: semimajor axis, eccentricity, inclination, and object size. The orbital element distributions (a, e, and i) were fit to the largest three classes (Classical, 3:2, and Scattered) using a maximum likelihood fit. Using the absolute magnitude (H magnitude) as a proxy for the object size, we fit a power law to the number of objects versus H magnitude for eight classes with at least five detected members (246 objects). The Classical objects are best fit with a power-law slope of α = 1.02 ± 0.01 (observed from 5 ≤ H ≤ 7.2). Six other dynamical classes (Scattered plus five resonances) have consistent magnitude distribution slopes with the Classicals, provided that the absolute number of objects is scaled. Scattered objects are somewhat more numerous than Classical objects, while there are only a quarter as many 3:2 objects as Classicals. The exception to the power law relation is the Centaurs, which are non-resonant objects with perihelia closer than Neptune and therefore brighter and detectable at smaller sizes. Centaurs were observed from 7.5 < H < 11, and that population is best fit by a power
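
    As a toy illustration of fitting a magnitude-distribution slope of this kind, the sketch below draws samples from a truncated N(H) ∝ 10^(αH) law and recovers α by maximum likelihood; it ignores the per-object detection probabilities that the DES analysis folds in, and all numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_like(alpha, H, Hmin, Hmax):
    """Negative log-likelihood for H drawn from p(H) ∝ 10**(alpha*H) on [Hmin, Hmax]."""
    ln10 = np.log(10.0)
    norm = (10 ** (alpha * Hmax) - 10 ** (alpha * Hmin)) / (alpha * ln10)
    return -np.sum(alpha * ln10 * H - np.log(norm))

rng = np.random.default_rng(2)
true_alpha, Hmin, Hmax = 1.0, 5.0, 7.2
# inverse-transform sampling from the truncated 10**(alpha*H) law
u = rng.uniform(size=500)
H = np.log10(u * (10 ** (true_alpha * Hmax) - 10 ** (true_alpha * Hmin))
             + 10 ** (true_alpha * Hmin)) / true_alpha

fit = minimize_scalar(neg_log_like, bounds=(0.1, 3.0), method="bounded",
                      args=(H, Hmin, Hmax))
print(f"recovered slope alpha = {fit.x:.3f} (true {true_alpha})")
```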

  8. Quantum chromodynamics and deep inelastic e - N scattering at TRISTAN

    International Nuclear Information System (INIS)

    Muta, Taizo

    1979-04-01

    An introductory survey is given on the formulation of QCD in deep inelastic lepton-hadron scatterings. Typical predictions of QCD are presented in the kinematical region of TRISTAN, including detailed descriptions of the scaling violation, QCD correction to the current algebra sum rules, problem of quark masses and higher order effects. Some suggestions for experiments at TRISTAN are made. (author)

  9. The evolution of spillover effects between oil and stock markets across multi-scales using a wavelet-based GARCH-BEKK model

    Science.gov (United States)

    Liu, Xueyong; An, Haizhong; Huang, Shupei; Wen, Shaobo

    2017-01-01

    Aiming to investigate the evolution of mean and volatility spillovers between oil and stock markets in the time and frequency dimensions, we employed WTI crude oil prices, the S&P 500 (USA) index, and the MICEX index (Russia) for the period Jan. 2003-Dec. 2014 as sample data. We first applied a wavelet-based GARCH-BEKK method to examine the spillover features in the frequency dimension. To consider the evolution of spillover effects in the time dimension at multiple scales, we then divided the full sample period into three sub-periods: a pre-crisis period, a crisis period, and a post-crisis period. The results indicate that spillover effects vary across wavelet scales in terms of strength and direction. By analyzing the time-varying linkage, we found different evolution features of the spillover effects between the oil-US stock market pair and the oil-Russia stock market pair. The spillover relationship between oil and the US stock market is shifting to the short term, while the spillover relationship between oil and the Russian stock market is extending to all time scales. This implies that the linkage between oil and the US stock market is weakening in the long term, while the linkage between oil and the Russian stock market is strengthening across all time scales. This may explain why the US and Russian stock indices showed opposite trends as the oil price fell in the post-crisis period.
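
    A highly simplified sketch of the multi-scale idea follows: decompose a return series with a discrete wavelet transform (PyWavelets) and fit a univariate GARCH(1,1) from the arch package to each reconstructed detail scale. This univariate per-series fit is only a stand-in for the bivariate GARCH-BEKK spillover estimation used in the study, and the synthetic returns are placeholders.

```python
import numpy as np
import pywt
from arch import arch_model

def detail_signal(x, wavelet="db4", level=4, keep=1):
    """Reconstruct the series keeping only one set of wavelet detail coefficients."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    kept = [c if i == keep else np.zeros_like(c) for i, c in enumerate(coeffs)]
    return pywt.waverec(kept, wavelet)[: len(x)]

rng = np.random.default_rng(3)
oil_returns = rng.normal(0, 1.5, 2000)      # placeholder for daily % returns
for keep in (1, 2, 3):                       # three of the wavelet detail bands
    d = detail_signal(oil_returns, keep=keep)
    res = arch_model(d, mean="Zero", vol="Garch", p=1, q=1).fit(disp="off")
    print(f"detail band {keep}: omega={res.params['omega']:.4f}, "
          f"alpha={res.params['alpha[1]']:.3f}, beta={res.params['beta[1]']:.3f}")
```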

  10. CANDELS: THE COSMIC ASSEMBLY NEAR-INFRARED DEEP EXTRAGALACTIC LEGACY SURVEY—THE HUBBLE SPACE TELESCOPE OBSERVATIONS, IMAGING DATA PRODUCTS, AND MOSAICS

    International Nuclear Information System (INIS)

    Koekemoer, Anton M.; Ferguson, Henry C.; Grogin, Norman A.; Lotz, Jennifer M.; Lucas, Ray A.; Ogaz, Sara; Rajan, Abhijith; Casertano, Stefano; Dahlen, Tomas; Faber, S. M.; Kocevski, Dale D.; Koo, David C.; Lai, Kamson; McGrath, Elizabeth J.; Riess, Adam G.; Rodney, Steve A.; Dolch, Timothy; Strolger, Louis; Castellano, Marco; Dickinson, Mark

    2011-01-01

    This paper describes the Hubble Space Telescope imaging data products and data reduction procedures for the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS). This survey is designed to document the evolution of galaxies and black holes at z ≈ 1.5-8, and to study Type Ia supernovae at z > 1.5. Five premier multi-wavelength sky regions are selected, each with extensive multi-wavelength observations. The primary CANDELS data consist of imaging obtained in the Wide Field Camera 3 infrared channel (WFC3/IR) and the WFC3 ultraviolet/optical channel, along with the Advanced Camera for Surveys (ACS). The CANDELS/Deep survey covers ∼125 arcmin^2 within GOODS-N and GOODS-S, while the remainder consists of the CANDELS/Wide survey, achieving a total of ∼800 arcmin^2 across GOODS and three additional fields (Extended Groth Strip, COSMOS, and Ultra-Deep Survey). We summarize the observational aspects of the survey as motivated by the scientific goals and present a detailed description of the data reduction procedures and products from the survey. Our data reduction methods utilize the most up-to-date calibration files and image combination procedures. We have paid special attention to correcting a range of instrumental effects, including charge transfer efficiency degradation for ACS, removal of electronic bias-striping present in ACS data after Servicing Mission 4, and persistence effects and other artifacts in WFC3/IR. For each field, we release mosaics for individual epochs and eventual mosaics containing data from all epochs combined, to facilitate photometric variability studies and the deepest possible photometry. A more detailed overview of the science goals and observational design of the survey are presented in a companion paper.

  11. A DEEP, WIDE-FIELD Hα SURVEY OF NEARBY CLUSTERS OF GALAXIES: DATA

    International Nuclear Information System (INIS)

    Sakai, Shoko; Kennicutt, Robert C. Jr.; Moss, Chris

    2012-01-01

    We present the results of a wide-field Hα imaging survey of eight nearby (z = 0.02-0.03) Abell clusters. We have measured Hα fluxes and equivalent widths for 465 galaxies, of which 360 are new detections. The survey was designed to obtain complete emission-line-selected inventories of star-forming galaxies in the inner regions of these clusters, extending to star formation rates below 0.1 M_⊙ yr^-1. This paper describes the observations, data processing, and source identification procedures, and presents an Hα and R-band catalog of detected cluster members and other candidates. Future papers in the series will use these data to study the completeness of spectroscopically based star formation surveys, and to quantify the effects of cluster environment on the present-day populations of star-forming galaxies. The data will also provide a valuable foundation for imaging surveys of redshifted Hα emission in more distant clusters.

  12. Using Deep Learning to Analyze the Voices of Stars.

    Science.gov (United States)

    Boudreaux, Thomas Macaulay

    2018-01-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and compare the performance of different deep learning algorithms, including Artificial Neural Networks and Convolutional Neural Networks, in classifying these synthetic data sets as either pulsators or stars not observed to vary.
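
    In the same spirit, the toy sketch below generates synthetic light curves (sinusoidal "pulsators" versus pure noise) and trains a small 1D convolutional network to separate them; the amplitudes, periods, and network layout are arbitrary choices, not those of the study.

```python
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(4)
n, length = 400, 256
t = np.linspace(0, 10, length)
X = rng.normal(0, 1, (n, length))            # all curves start as pure noise
y = np.zeros(n, dtype=np.int64)
for i in range(n // 2):                      # first half: add a coherent pulsation
    freq = rng.uniform(1.0, 5.0)
    X[i] += 2.0 * np.sin(2 * np.pi * freq * t + rng.uniform(0, 2 * np.pi))
    y[i] = 1

model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Conv1d(8, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Flatten(), nn.Linear(16 * (length // 16), 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
xb = torch.tensor(X, dtype=torch.float32).unsqueeze(1)   # shape (n, 1, length)
yb = torch.tensor(y)
for epoch in range(30):                       # small full-batch demo loop
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(xb), yb)
    loss.backward()
    opt.step()
acc = (model(xb).argmax(1) == yb).float().mean()
print(f"training accuracy: {acc:.2f}")
```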

  13. Confirmation study of the effectiveness of prospect techniques for geothermal resources. Deep-seated geothermal resources survey report (Fiscal year 1994); 1994 nendo chinetsu tansa gijutsu nado kensho chosa. Shinbu chinetsu shigen chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    Drilling and survey of deep geothermal exploration wells were conducted to grasp the existing situation of deep geothermal resource and the whole image of geothermal systems in the area where geothermal resource was already developed. In the drilling work in fiscal 1994, 4000m-class rigs and the top drive system were planned to be used for drilling of 12-1/4 inch wells, but 9-5/8 inch liners were used for drilling down to depths of 2550m or deeper since the amount of lost circulation is large and the withdrawal of devices may be very difficult. And in 8-1/2 inch wells, the well was drilled down to a depth of 2950m. As to the deep resistivity exploration technology using electromagnetic method, studies were made of a multiple-frequency array induction logging (MAIL) method, a vertical electromagnetic profiling (VEMP) method, a joint analysis method, etc. Concerning the synthetic fluid inclusion logging technology, containers were lowered into the well and a comparison was made between data of the homogenization temperature analysis of the formed fluid inclusion and those of the temperature log analysis. With relation to the making of deep geothermal structural models, revision was made according to the determination of depths of Miocene formations, Pre-tertiary formations, and the Kakkonda granite. 65 refs., 268 figs., 79 tabs.

  14. Global diversity and biogeography of deep-sea pelagic prokaryotes

    KAUST Repository

    Salazar, Guillem

    2015-08-07

    The deep-sea is the largest biome of the biosphere, and contains more than half of the whole ocean's microbes. Uncovering their general patterns of diversity and community structure at a global scale remains a great challenge, as only fragmentary information of deep-sea microbial diversity exists based on regional-scale studies. Here we report the first globally comprehensive survey of the prokaryotic communities inhabiting the bathypelagic ocean using high-throughput sequencing of the 16S rRNA gene. This work identifies the dominant prokaryotes in the pelagic deep ocean and reveals that 50% of the operational taxonomic units (OTUs) belong to previously unknown prokaryotic taxa, most of which are rare and appear in just a few samples. We show that whereas the local richness of communities is comparable to that observed in previous regional studies, the global pool of prokaryotic taxa detected is modest (∼3600 OTUs), as a high proportion of OTUs are shared among samples. The water masses appear to act as clear drivers of the geographical distribution of both particle-attached and free-living prokaryotes. In addition, we show that the deep-oceanic basins in which the bathypelagic realm is divided contain different particle-attached (but not free-living) microbial communities. The combination of the aging of the water masses and a lack of complete dispersal are identified as the main drivers for this biogeographical pattern. All together, we identify the potential of the deep ocean as a reservoir of still unknown biological diversity with a higher degree of spatial complexity than hitherto considered.

  15. Global diversity and biogeography of deep-sea pelagic prokaryotes

    KAUST Repository

    Salazar, Guillem; Cornejo-Castillo, Francisco M.; Benítez-Barrios, Verónica; Fraile-Nuez, Eugenio; Álvarez-Salgado, X. Antón; Duarte, Carlos M.; Gasol, Josep M.; Acinas, Silvia G.

    2015-01-01

    The deep-sea is the largest biome of the biosphere, and contains more than half of the whole ocean's microbes. Uncovering their general patterns of diversity and community structure at a global scale remains a great challenge, as only fragmentary information of deep-sea microbial diversity exists based on regional-scale studies. Here we report the first globally comprehensive survey of the prokaryotic communities inhabiting the bathypelagic ocean using high-throughput sequencing of the 16S rRNA gene. This work identifies the dominant prokaryotes in the pelagic deep ocean and reveals that 50% of the operational taxonomic units (OTUs) belong to previously unknown prokaryotic taxa, most of which are rare and appear in just a few samples. We show that whereas the local richness of communities is comparable to that observed in previous regional studies, the global pool of prokaryotic taxa detected is modest (∼3600 OTUs), as a high proportion of OTUs are shared among samples. The water masses appear to act as clear drivers of the geographical distribution of both particle-attached and free-living prokaryotes. In addition, we show that the deep-oceanic basins in which the bathypelagic realm is divided contain different particle-attached (but not free-living) microbial communities. The combination of the aging of the water masses and a lack of complete dispersal are identified as the main drivers for this biogeographical pattern. All together, we identify the potential of the deep ocean as a reservoir of still unknown biological diversity with a higher degree of spatial complexity than hitherto considered.

  16. The MUSE Hubble Ultra Deep Field Survey. IX. Evolution of galaxy merger fraction since z ≈ 6

    Science.gov (United States)

    Ventou, E.; Contini, T.; Bouché, N.; Epinat, B.; Brinchmann, J.; Bacon, R.; Inami, H.; Lam, D.; Drake, A.; Garel, T.; Michel-Dansac, L.; Pello, R.; Steinmetz, M.; Weilbacher, P. M.; Wisotzki, L.; Carollo, M.

    2017-11-01

    We provide, for the first time, robust observational constraints on the galaxy major merger fraction up to z ≈ 6 using spectroscopic close pair counts. Deep Multi Unit Spectroscopic Explorer (MUSE) observations in the Hubble Ultra Deep Field (HUDF) and Hubble Deep Field South (HDF-S) are used to identify 113 secure close pairs of galaxies among a parent sample of 1801 galaxies spread over a large redshift range (0.2 < z < 6). To probe any dependence on stellar mass, the sample is divided into two regimes, using either a constant separation limit of 10^9.5 M⊙ or the median value of stellar mass computed in each redshift bin. Overall, the major close pair fraction for low-mass and massive galaxies follows the same trend. These new, homogeneous, and robust estimates of the major merger fraction since z ≈ 6 are in good agreement with recent predictions of cosmological numerical simulations. Based on observations made with ESO telescopes at the La Silla-Paranal Observatory under programmes 094.A-0289(B), 095.A-0010(A), 096.A-0045(A) and 096.A-0045(B).
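
    For orientation, close-pair counting of this general kind can be sketched as below: two galaxies are flagged as a close pair when their projected separation and rest-frame velocity difference fall below chosen limits. The 25 kpc / 500 km/s thresholds and the toy catalogue are illustrative defaults, not necessarily the values adopted in the MUSE analysis.

```python
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.cosmology import Planck15

C_KMS = 299792.458

def count_close_pairs(ra_deg, dec_deg, z, rmax_kpc=25.0, dv_max_kms=500.0):
    """Count galaxy pairs closer than rmax_kpc (projected, proper) and dv_max_kms."""
    coords = SkyCoord(ra=ra_deg * u.deg, dec=dec_deg * u.deg)
    n_pairs = 0
    for i in range(len(z)):
        for j in range(i + 1, len(z)):
            zm = 0.5 * (z[i] + z[j])
            dv = C_KMS * abs(z[i] - z[j]) / (1.0 + zm)   # rest-frame velocity offset
            if dv > dv_max_kms:
                continue
            sep_arcmin = coords[i].separation(coords[j]).arcmin
            sep_kpc = sep_arcmin * Planck15.kpc_proper_per_arcmin(zm).value
            if sep_kpc < rmax_kpc:
                n_pairs += 1
    return n_pairs

rng = np.random.default_rng(5)
ra = 53.10 + rng.normal(0, 0.0005, 50)        # toy catalogue around one pointing
dec = -27.80 + rng.normal(0, 0.0005, 50)
z = 1.00 + rng.normal(0, 0.001, 50)
print(count_close_pairs(ra, dec, z), "close pairs in the toy catalogue")
```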

  17. Spatial extent and dissipation of the deep chlorophyll layer in Lake Ontario during the Lake Ontario lower foodweb assessment, 2003 and 2008

    Science.gov (United States)

    Watkins, J. M.; Weidel, Brian M.; Rudstam, L. G.; Holek, K. T.

    2014-01-01

    Increasing water clarity in Lake Ontario has led to a vertical redistribution of phytoplankton and an increased importance of the deep chlorophyll layer in overall primary productivity. We used in situ fluorometer profiles collected in lakewide surveys of Lake Ontario in 2008 to assess the spatial extent and intensity of the deep chlorophyll layer. In situ fluorometer data were corrected with extracted chlorophyll data using paired samples from Lake Ontario collected in August 2008. The deep chlorophyll layer was present offshore during the stratified conditions of late July 2008, with maximum values of 4-13 μg l^-1 corrected chlorophyll a at 10 to 17 m depth within the metalimnion. The deep chlorophyll layer was closely associated with the base of the thermocline and a subsurface maximum of dissolved oxygen, indicating the feature's importance as a growth and productivity maximum. Crucial to the deep chlorophyll layer formation, the photic zone extended deeper than the surface mixed layer in mid-summer. The layer extended through most of the offshore in July 2008, but was not present in the easternmost transect, which had a deeper surface mixed layer. By early September 2008, the lakewide deep chlorophyll layer had dissipated. A similar formation and dissipation was observed in the lakewide survey of Lake Ontario in 2003.
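
    The correction step mentioned above (calibrating raw fluorometer profiles against extracted chlorophyll from paired samples) is, in its simplest form, a linear regression; a sketch with made-up numbers is shown below.

```python
import numpy as np
from scipy.stats import linregress

fluor_signal = np.array([0.8, 1.5, 2.3, 3.1, 4.0, 5.2])    # raw fluorometer readings
extracted_chl = np.array([1.1, 2.0, 3.2, 4.1, 5.3, 7.0])   # μg/L from lab extraction

fit = linregress(fluor_signal, extracted_chl)               # calibration relation
profile = np.linspace(0.5, 6.0, 10)                         # a raw profile to correct
corrected = fit.slope * profile + fit.intercept
print(f"chl = {fit.slope:.2f} * F + {fit.intercept:.2f} (r^2 = {fit.rvalue**2:.2f})")
```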

  18. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Lyα NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    International Nuclear Information System (INIS)

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-01-01

    Giant Lyα nebulae (or Lyα 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 <~ z <~ 3 within deep broadband imaging of the ~9 deg^2 NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10^8 h_70^-3 Mpc^3, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.

  19. Research and application of soil-mercury-surveys method for locating uranium

    International Nuclear Information System (INIS)

    You Yunfei; Lu Shili; Jiao Zongrun

    1995-06-01

    A soil-Hg survey method for locating uranium ore is presented. By using a soil sampler at the drill-hole bottom, the method's ability to detect deep uranium orebodies was improved. The application of minicomputer technology to pyrolytic Hg analysis raises the degree of automation and the precision of the analysis. The optimum application condition is an orebody Hg content >1 x 10^-6. The detection depth is about 100 m. Forecasts of uranium orebodies were successful in two previously unexplored sections, the 534 and 510 mining areas, and as a result two small deposits were expanded into medium-size deposits. The method is also applicable to locating gold, silver, copper, lead-zinc, and oil-gas resources. (8 figs., 3 tabs.)

  20. Deep Incremental Boosting

    OpenAIRE

    Mosca, Alan; Magoulas, George D

    2017-01-01

    This paper introduces Deep Incremental Boosting, a new technique derived from AdaBoost, specifically adapted to work with Deep Learning methods, that reduces the required training time and improves generalisation. We draw inspiration from Transfer of Learning approaches to reduce the start-up time to training each incremental Ensemble member. We show a set of experiments that outlines some preliminary results on some common Deep Learning datasets and discuss the potential improvements Deep In...
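
    A reduced sketch of the idea follows: each boosting round warm-starts from the previous network's layers, appends one hidden block, and retrains on AdaBoost-style reweighted samples. The weight updates below follow plain discrete AdaBoost and the architecture is a toy MLP, so details will differ from the published algorithm.

```python
import copy
import torch
import torch.nn as nn

def train(model, X, y, w, epochs=30):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        per_sample = nn.functional.cross_entropy(model(X), y, reduction="none")
        (per_sample * w).mean().backward()       # weighted training loss
        opt.step()

torch.manual_seed(0)
X = torch.randn(300, 10)                          # toy two-class data
y = (X[:, 0] + X[:, 1] > 0).long()
w = torch.ones(len(y)) / len(y)                   # AdaBoost sample weights

hidden = 32
body = nn.Sequential(nn.Linear(10, hidden), nn.ReLU())
ensemble, alphas = [], []
for round_ in range(3):
    model = nn.Sequential(body, nn.Linear(hidden, 2))
    train(model, X, y, w)
    pred = model(X).argmax(1)
    err = float((w * (pred != y).float()).sum() / w.sum())
    err = min(max(err, 1e-6), 0.499)
    alpha = 0.5 * torch.log(torch.tensor((1 - err) / err))
    w = w * torch.exp(alpha * (pred != y).float())   # upweight the mistakes
    w = w / w.sum()
    ensemble.append(copy.deepcopy(model))
    alphas.append(float(alpha))
    # grow the shared body for the next round: reuse trained layers, add one block
    body = nn.Sequential(*list(body.children()),
                         nn.Linear(hidden, hidden), nn.ReLU())

votes = sum(a * nn.functional.one_hot(m(X).argmax(1), 2).float()
            for a, m in zip(alphas, ensemble))
print("ensemble accuracy:", float((votes.argmax(1) == y).float().mean()))
```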

  1. Applications of Deep Learning and Reinforcement Learning to Biological Data.

    Science.gov (United States)

    Mahmud, Mufti; Kaiser, Mohammed Shamim; Hussain, Amir; Vassanelli, Stefano

    2018-06-01

    Rapid advances in hardware-based technologies during the past decades have opened up new possibilities for life scientists to gather multimodal data in various application domains, such as omics, bioimaging, medical imaging, and (brain/body)-machine interfaces. These have generated novel opportunities for development of dedicated data-intensive machine learning techniques. In particular, recent research in deep learning (DL), reinforcement learning (RL), and their combination (deep RL) promise to revolutionize the future of artificial intelligence. The growth in computational power accompanied by faster and increased data storage, and declining computing costs have already allowed scientists in various fields to apply these techniques on data sets that were previously intractable owing to their size and complexity. This paper provides a comprehensive survey on the application of DL, RL, and deep RL techniques in mining biological data. In addition, we compare the performances of DL techniques when applied to different data sets across various application domains. Finally, we outline open issues in this challenging research area and discuss future development perspectives.

  2. Assessing Deep Sea Communities Through Seabed Imagery

    Science.gov (United States)

    Matkin, A. G.; Cross, K.; Milititsky, M.

    2016-02-01

    The deep sea still remains virtually unexplored. Human activity, such as oil and gas exploration and deep sea mining, is expanding further into the deep sea, increasing the need to survey and map extensive areas of this habitat in order to assess ecosystem health and value. The technology needed to explore this remote environment has been advancing. Seabed imagery can cover extensive areas of the seafloor and investigate areas where sampling with traditional coring methodologies is just not possible (e.g. cold water coral reefs). Remotely operated vehicles (ROVs) are an expensive option, so drop or towed camera systems can provide a more viable and affordable alternative, while still allowing for real-time control. Assessment of seabed imagery in terms of presence, abundance and density of particular species can be conducted by bringing together a variety of analytical tools for a holistic approach. Sixteen deep sea transects located offshore West Africa were investigated with a towed digital video telemetry system (DTS). Both digital stills and video footage were acquired. An extensive data set was obtained from over 13,000 usable photographs, allowing for characterisation of the different habitats present in terms of community composition and abundance. All observed fauna were identified to the lowest taxonomic level and enumerated when possible, with densities derived after the seabed area was calculated for each suitable photograph. This methodology allowed for consistent assessment of the different habitat types present, overcoming constraints such as taxa that cannot be individually enumerated (e.g. sponges, corals or bryozoans), the presence of both mobile and sessile species, and differing levels of taxonomic detail. Although this methodology will not enable a full characterisation of a deep sea community, in terms of species composition for instance, it will allow a robust assessment of large areas of the deep sea in terms of sensitive habitats present and community
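
    The density derivation described above reduces to counts per imaged area, aggregated over photographs; a tiny pandas sketch with invented column names and numbers illustrates the arithmetic.

```python
import pandas as pd

photos = pd.DataFrame({
    "transect": ["T01", "T01", "T02", "T02", "T02"],
    "taxon_count": [4, 7, 0, 3, 5],        # enumerable individuals in each photo
    "area_m2": [2.1, 1.8, 2.3, 2.0, 1.9],  # seabed area imaged by each photo
})
photos["density_per_m2"] = photos["taxon_count"] / photos["area_m2"]
print(photos.groupby("transect")["density_per_m2"].mean().round(2))
```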

  3. SELECTION OF BURST-LIKE TRANSIENTS AND STOCHASTIC VARIABLES USING MULTI-BAND IMAGE DIFFERENCING IN THE PAN-STARRS1 MEDIUM-DEEP SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, S.; Gezari, S.; Heinis, S. [Department of Astronomy, University of Maryland, Stadium Drive, College Park, MD 21224 (United States); Chornock, R.; Berger, E.; Soderberg, A.; Stubbs, C. W.; Kirshner, R. P. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Rest, A. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Huber, M. E.; Narayan, G.; Marion, G. H.; Burgett, W. S. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Foley, R. J. [Astronomy Department, University of Illinois at Urbana-Champaign, 1002 West Green Street, Urbana, IL 61801 (United States); Scolnic, D.; Riess, A. G. [Department of Physics and Astronomy, Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Lawrence, A. [Institute for Astronomy, University of Edinburgh Scottish Universities Physics Alliance, Royal Observatory, Blackford Hill, Edinburgh EH9 3HJ (United Kingdom); Smartt, S. J.; Smith, K. [Astrophysics Research Centre, School of Mathematics and Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); Wood-Vasey, W. M. [Pittsburgh Particle Physics, Astrophysics, and Cosmology Center, Department of Physics and Astronomy, University of Pittsburgh, 3941 O'Hara Street, Pittsburgh, PA 15260 (United States); and others

    2015-03-20

    We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time-series in four Pan-STARRS1 photometric bands g_P1, r_P1, i_P1, and z_P1. We use three deterministic light-curve models to fit BL transients: a Gaussian, a Gamma distribution, and an analytic supernova (SN) model, and one stochastic light-curve model, the Ornstein-Uhlenbeck process, in order to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm on these statistics, to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SV and BL occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe based on our verification sets. We combine our light-curve classifications with their nuclear or off-nuclear host

  4. Viral infections as controlling factors for the deep biosphere? (Invited)

    Science.gov (United States)

    Engelen, B.; Engelhardt, T.; Sahlberg, M.; Cypionka, H.

    2009-12-01

    The marine deep biosphere represents the largest biotope on Earth. Throughout the last years, we have obtained interesting insights into its microbial community composition. However, one component that has been completely overlooked so far is the viral inventory of deep-subsurface sediments. While viral infections were identified as having a major impact on the benthic microflora of deep-sea surface sediments (Danovaro et al. 2008), no studies have so far been performed on deep-biosphere samples. As grazers probably play only a minor role in anoxic and highly compressed deep sediments, viruses might be the main “predators” of indigenous microorganisms. Furthermore, the release of cell components, called “the viral shunt”, could have a major impact on the deep biosphere by providing labile organic compounds to non-infected microorganisms in these generally nutrient-depleted sediments. However, direct counting of viruses in sediments is highly challenging due to the small size of viruses and the high background of small particles. Even molecular surveys using “universal” PCR primers that target phage-specific genes fail due to the vast phage diversity. One solution to this problem is the lysogenic viral life cycle, as many bacteriophages integrate their DNA into the host genome. It is estimated that up to 70% of cultivated bacteria contain prophages within their genome. Therefore, culture collections (Batzke et al. 2007) represent an archive of the viral composition within the respective habitat. These prophages can be induced to become free phage particles in stimulation experiments in which the host cells are placed under stress, such as UV exposure or treatment with DNA-damaging antibiotics. The study of the viral component within the deep biosphere offers to answer the following questions: To which extent are deep-biosphere populations controlled by viral infections? What is the inter- and intra-specific diversity and the host-specific viral

  5. Deep Super Learner: A Deep Ensemble for Classification Problems

    OpenAIRE

    Young, Steven; Abdou, Tamer; Bener, Ayse

    2018-01-01

    Deep learning has become very popular for tasks such as predictive modeling and pattern recognition in handling big data. Deep learning is a powerful machine learning method that extracts lower level features and feeds them forward for the next layer to identify higher level features that improve performance. However, deep neural networks have drawbacks, which include many hyper-parameters and infinite architectures, opaqueness of results, and relatively slower convergence on smaller datase...

  6. DeepRT: deep learning for peptide retention time prediction in proteomics

    OpenAIRE

    Ma, Chunwei; Zhu, Zhiyong; Ye, Jun; Yang, Jiarui; Pei, Jianguo; Xu, Shaohang; Zhou, Ruo; Yu, Chang; Mo, Fan; Wen, Bo; Liu, Siqi

    2017-01-01

    Accurate predictions of peptide retention times (RT) in liquid chromatography have many applications in mass spectrometry-based proteomics. Herein, we present DeepRT, a deep learning based software for peptide retention time prediction. DeepRT automatically learns features directly from the peptide sequences using the deep convolutional Neural Network (CNN) and Recurrent Neural Network (RNN) model, which eliminates the need to use hand-crafted features or rules. After the feature learning, pr...
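
    A minimal PyTorch sketch of a sequence model in this spirit (integer-encoded residues, then embedding, a 1D convolution, a GRU, and a linear output for retention-time regression) is given below; the layer sizes, toy peptides, and retention times are invented, and this is not the published DeepRT network.

```python
import torch
import torch.nn as nn

AA = "ACDEFGHIKLMNPQRSTVWY"
aa_to_idx = {a: i + 1 for i, a in enumerate(AA)}   # 0 is reserved for padding

def encode(peptide, max_len=30):
    idx = [aa_to_idx[a] for a in peptide][:max_len]
    return idx + [0] * (max_len - len(idx))

class RTNet(nn.Module):
    def __init__(self, vocab=21, emb=16, conv=32, hidden=32):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb, padding_idx=0)
        self.conv = nn.Conv1d(emb, conv, kernel_size=3, padding=1)
        self.gru = nn.GRU(conv, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                                 # x: (batch, seq)
        h = self.emb(x).transpose(1, 2)                   # (batch, emb, seq)
        h = torch.relu(self.conv(h)).transpose(1, 2)      # (batch, seq, conv)
        _, last = self.gru(h)                             # (1, batch, hidden)
        return self.out(last.squeeze(0)).squeeze(-1)

peptides = ["ACDK", "LLGNR", "PEPTIDEK", "MKWVTFISLLFLFSSAYS"]
rts = torch.tensor([12.3, 25.1, 33.7, 58.2])              # made-up retention times (min)
x = torch.tensor([encode(p) for p in peptides])
model = RTNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(50):                                        # tiny demo fit
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), rts)
    loss.backward()
    opt.step()
print("predicted RTs:", model(x).detach().numpy().round(1))
```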

  7. Real-Time Visualization System for Deep-Sea Surveying

    Directory of Open Access Journals (Sweden)

    Yujie Li

    2014-01-01

    Full Text Available Remote robotic exploration holds vast potential for gaining knowledge about extreme environments that are difficult for humans to access. In the last two decades, various underwater devices have been developed for detecting mines and mine-like objects in the deep-sea environment. However, current equipment suffers from several problems, such as poor accuracy in mineral object detection, lack of real-time processing, and low resolution of underwater video frames. Consequently, underwater object recognition is a difficult task, because the physical properties of the medium seriously distort the captured video frames. In this paper, we consider the use of modern image processing methods to locate and recognize minerals at low computational cost. We first analyze recent underwater imaging models and propose a novel underwater optical imaging model that more closely follows light propagation in the underwater environment. In our imaging system, we remove electrical noise with a dual-tree complex wavelet transform, then correct the nonuniform illumination of artificial lights with a fast guided trilateral bilateral filter and recover the image color through automatic color equalization. Finally, a shape-based mineral recognition algorithm is proposed for underwater object detection. These methods are designed for real-time execution on limited-memory platforms, and our experience shows the pipeline is suitable for detecting underwater objects in practice. Initial results are presented, and experiments demonstrate the effectiveness of the proposed real-time visualization system.
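
    To make the shape of such a pipeline concrete, here is a minimal Python/OpenCV sketch. It substitutes common stand-ins for the paper's specific steps: a bilateral filter instead of the dual-tree complex wavelet denoising, CLAHE instead of the fast guided trilateral bilateral filter, a gray-world balance instead of automatic color equalization, and simple contour filtering instead of the shape-based mineral recognition. All function choices and parameter values are assumptions, not the authors' implementation.

      import cv2
      import numpy as np

      def enhance_underwater_frame(frame_bgr):
          # Denoise (stand-in for the paper's dual-tree complex wavelet step).
          denoised = cv2.bilateralFilter(frame_bgr, d=9, sigmaColor=75, sigmaSpace=75)
          # Correct non-uniform artificial lighting with CLAHE on the L channel
          # (stand-in for the guided/trilateral filtering described in the abstract).
          lab = cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB)
          l, a, b = cv2.split(lab)
          clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
          corrected = cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)
          # Simple gray-world colour balance (stand-in for automatic color equalization).
          balanced = corrected.astype(np.float32)
          balanced *= balanced.mean() / (balanced.mean(axis=(0, 1)) + 1e-6)
          return np.clip(balanced, 0, 255).astype(np.uint8)

      def find_candidate_objects(frame_bgr, min_area=200):
          # Very crude shape-based detection: Otsu threshold plus contour filtering.
          gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
          _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]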

  8. THE LUPUS TRANSIT SURVEY FOR HOT JUPITERS: RESULTS AND LESSONS

    International Nuclear Information System (INIS)

    Bayliss, Daniel D. R.; Sackett, Penny D.; Weldrake, David T. F.; Tingley, Brandon W.; Lewis, Karen M.

    2009-01-01

    We present the results of a deep, wide-field transit survey targeting 'Hot Jupiter' planets in the Lupus region of the Galactic plane conducted over 53 nights concentrated in two epochs separated by a year. Using the Australian National University 40-inch telescope at Siding Spring Observatory (SSO), the survey covered a 0.66 deg² region close to the Galactic plane (b = 11°) and monitored a total of 110,372 stars (15.0 ≤ V ≤ 22.0). Using difference imaging photometry, 16,134 light curves with a photometric precision of σ < 0.025 mag were obtained. These light curves were searched for transits, and four candidates were detected that displayed low-amplitude variability consistent with a transiting giant planet. Further investigations, including spectral typing and radial velocity measurements for some candidates, revealed that of the four, one is a true planetary companion (Lupus-TR-3), two are blended systems (Lupus-TR-1 and 4), and one is a binary (Lupus-TR-2). The results of this successful survey are instructive for optimizing the observational strategy and follow-up procedure for deep searches for transiting planets, including an upcoming survey using the SkyMapper telescope at SSO.

  9. 3D seismic surveys for shallow targets

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.C.; Stewart, R.R.; Bertram, M.B. [Calgary Univ., AB (Canada). Dept. of Geoscience, Consortium for Research in Elastic Wave Exploration Seismology

    2008-07-01

    Although 3D seismic surveys are generally used to map deep hydrocarbon plays, this study demonstrated that they can be useful for characterizing shallow targets, such as oilsands deposits. A high-resolution 3D seismic survey was undertaken to map shallow stratigraphy near Calgary, Alberta. The project demonstrated the efficacy of reflection seismic surveys for shallow targets ranging from 100 to 500 metres. The purpose of the program was to map shallow stratigraphy and structure to depths of up to 500m, and to investigate shallow aquifers in the study area. The results of the survey illustrated the opportunity that 3D seismic surveys provide for mapping shallow reflectors and the acquisition geometry needed to image them. Applications include mapping the distribution of shallow aquifers, delineating shallow coals and investigating oilsands deposits. 2 refs., 5 figs.

  10. Analyses of the deep borehole drilling status for a deep borehole disposal system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Youl; Choi, Heui Joo; Lee, Min Soo; Kim, Geon Young; Kim, Kyung Su [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The purpose of disposal of radioactive waste is not only to isolate it from humans, but also to inhibit the leakage of any radioactive materials into the accessible environment. Because of the extremely high level and long timescale of the radioactivity of HLW (high-level radioactive waste), a mined deep geological disposal concept, with a disposal depth of about 500 m below ground, is considered the safest method to isolate spent fuel or high-level radioactive waste from the human environment with the best technology available at present. As an alternative disposal concept, deep borehole disposal technology is therefore under consideration in a number of countries because of its outstanding safety and cost effectiveness. In this paper, as one of the key technologies of a deep borehole disposal system, the general status of deep drilling technologies in the oil industry, the geothermal industry and the geoscientific field is reviewed with a view to the deep borehole disposal of high-level radioactive waste. Based on the results of this review, the very preliminary applicability of deep drilling technology to deep borehole disposal, such as the relation between depth and diameter, drilling time and a feasibility classification, is analyzed.

  11. Fall 2011 Small Pelagics Survey (PC1108, EK60)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Objectives of the Fall 2011 Small Pelagics Survey were to sample the waters of the northern Gulf of Mexico less than 500 meters deep with 90-ft high opening fish...

  12. Fall 2010 Small Pelagics Survey (PC1006, EK60)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Objectives of the Fall 2010 Small Pelagics Survey were to sample the waters of the northern Gulf of Mexico less than 500 meters deep with 90-ft high opening fish...

  13. Deep Space Telecommunications

    Science.gov (United States)

    Kuiper, T. B. H.; Resch, G. M.

    2000-01-01

    The increasing load on NASA's Deep Space Network, the new capabilities for deep space missions inherent in a next-generation radio telescope, and the potential of new telescope technology for reducing construction and operation costs suggest a natural marriage between radio astronomy and deep space telecommunications in developing advanced radio telescope concepts.

  14. Construction, test, and operation of an energy-tagged photon beam at the PHOENICS experiment of the Bonn electron stretcher facility ELSA

    International Nuclear Information System (INIS)

    Detemple, P.

    1990-10-01

    This report describes the photon tagging facility of the PHOENICS experiment at the Bonn Electron Stretcher and Accelerator ELSA. The system is designed for primary electron energies up to E_0 = 1.2 GeV and covers a broad energy range of k/E_0 = 20-95% simultaneously, with an average energy resolution of Δk/E_0 = 7.5%. The tagging spectrometer consists of a single dipole magnet with a strong radial field gradient, which provides focussing properties in the deflection plane over a wide energy range. The tagging hodoscope consists of an array of 128 thin energy-defining scintillation counters and an additional set of 16 thick backing counters for improved timing resolution and background rejection. The main properties were determined experimentally. The spot size of the primary electron beam at the radiator is ≅ 1 mm (σ); the photon beam spot size at a distance of 2 m from the radiator is ≅ 2 mm. The average tagging efficiency, measured with a totally absorbing NaI detector, is about 90%. The mean photon energies corresponding to the 128 energy-defining counters were experimentally determined and found to be in good agreement with the design values. The tagging system has been used in a first experiment with the PHOENICS detector, the measurement of the differential cross section for the reaction γp → π+n, which was performed in order to test and calibrate the whole detector system. The data show good agreement with other experiments. (orig.)

  15. Greedy Deep Dictionary Learning

    OpenAIRE

    Tariyal, Snigdha; Majumdar, Angshul; Singh, Richa; Vatsa, Mayank

    2016-01-01

    In this work we propose a new deep learning tool called deep dictionary learning. Multi-level dictionaries are learnt in a greedy fashion, one layer at a time. This requires solving a simple (shallow) dictionary learning problem, the solution to this is well known. We apply the proposed technique on some benchmark deep learning datasets. We compare our results with other deep learning tools like stacked autoencoder and deep belief network; and state of the art supervised dictionary learning t...
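
    A loose sketch of the greedy layer-by-layer idea in Python, using scikit-learn's DictionaryLearning as the shallow solver at each level; the layer sizes, sparsity penalty and random toy data are assumptions for illustration, and this is not the authors' exact multi-level formulation.

      import numpy as np
      from sklearn.decomposition import DictionaryLearning

      def greedy_deep_dictionary(X, layer_sizes, alpha=1.0):
          """Learn dictionaries layer by layer; each layer's sparse codes feed the next."""
          codes, dictionaries = X, []
          for n_atoms in layer_sizes:
              dl = DictionaryLearning(n_components=n_atoms, alpha=alpha,
                                      transform_algorithm='lasso_lars', max_iter=50)
              codes = dl.fit_transform(codes)      # sparse codes become the next layer's input
              dictionaries.append(dl.components_)  # learned atoms for this layer
          return dictionaries, codes

      # Toy usage on random data standing in for a benchmark dataset.
      X = np.random.rand(200, 64)
      dicts, deep_codes = greedy_deep_dictionary(X, layer_sizes=[32, 16, 8])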

  16. Report on the survey for electrostatic discharges on Mars using NASA's Deep Space Network (DSN)

    Science.gov (United States)

    Arabshahi, S.; Majid, W.; Geldzahler, B.; Kocz, J.; Schulter, T.; White, L.

    2017-12-01

    The Martian atmosphere has strong dust activity. It is suggested that the larger regional storms are capable of producing electric fields large enough to initiate electrostatic discharges. The storms have charging processes similar to terrestrial dust devils, and hot cores and complicated vortex winds similar to terrestrial thunderstorms. However, due to uncertainties in our understanding of the electrical environment of the storms and the absence of related in-situ measurements, the existence (or non-existence) of such electrostatic discharges on the planet is yet to be confirmed. Knowledge of the electrical activity on Mars is essential for future human exploration of the planet. We have recently launched a long-term monitoring campaign at NASA's Madrid Deep Space Communication Complex (MDSCC) to search for powerful discharges on Mars. The search occurs during routine tracking of Mars-orbiting spacecraft by Deep Space Network (DSN) radio telescopes. In this presentation, we will report on the results of processing and analyzing the data from the first six months of our campaign.

  17. The HST/ACS Coma Cluster Survey : II. Data Description and Source Catalogs

    NARCIS (Netherlands)

    Hammer, Derek; Kleijn, Gijs Verdoes; Hoyos, Carlos; den Brok, Mark; Balcells, Marc; Ferguson, Henry C.; Goudfrooij, Paul; Carter, David; Guzman, Rafael; Peletier, Reynier F.; Smith, Russell J.; Graham, Alister W.; Trentham, Neil; Peng, Eric; Puzia, Thomas H.; Lucey, John R.; Jogee, Shardha; Aguerri, Alfonso L.; Batcheldor, Dan; Bridges, Terry J.; Chiboucas, Kristin; Davies, Jonathan I.; del Burgo, Carlos; Erwin, Peter; Hornschemeier, Ann; Hudson, Michael J.; Huxor, Avon; Jenkins, Leigh; Karick, Arna; Khosroshahi, Habib; Kourkchi, Ehsan; Komiyama, Yutaka; Lotz, Jennifer; Marzke, Ronald O.; Marinova, Irina; Matkovic, Ana; Merritt, David; Miller, Bryan W.; Miller, Neal A.; Mobasher, Bahram; Mouhcine, Mustapha; Okamura, Sadanori; Percival, Sue; Phillipps, Steven; Poggianti, Bianca M.; Price, James; Sharples, Ray M.; Tully, R. Brent; Valentijn, Edwin

    The Coma cluster, Abell 1656, was the target of an HST-ACS Treasury program designed for deep imaging in the F475W and F814W passbands. Although our survey was interrupted by the ACS instrument failure in early 2007, the partially completed survey still covers ~50% of the core high-density region in

  18. MBARI Mapping AUV: A High-Resolution Deep Ocean Seafloor Mapping Capability

    Science.gov (United States)

    Caress, D. W.; Kirkwood, W. J.; Thomas, H.; McEwen, R.; Henthorn, R.; McGill, P.; Thompson, D.; Sibenac, M.; Jensen, S.; Shane, F.; Hamilton, A.

    2005-05-01

    The Monterey Bay Aquarium Research Institute (MBARI) is developing an autonomous seafloor mapping capability for deep ocean science applications. The MBARI Mapping AUV is a 0.53 m (21 in) diameter, 5.1 m (16.7 ft) long, Dorado-class vehicle designed to carry four mapping sonars. The primary sensor is a 200 kHz multibeam sonar producing swath bathymetry and sidescan. In addition, the vehicle carries 100 kHz and 410 kHz chirp sidescan sonars, and a 2-16 kHz sweep chirp subbottom profiler. Navigation and attitude data are obtained from an inertial navigation system (INS) incorporating a ring laser gyro and a 300 kHz Doppler velocity log (DVL). The vehicle also includes acoustic modem, ultra-short baseline navigation, and long-baseline navigation systems. The Mapping AUV is powered by 6 kWh of Li-polymer batteries, providing an expected mission duration of 12 hours at a typical speed of 1.5 m/s. All components of the vehicle are rated to 6000 m depth, allowing MBARI to conduct high-resolution mapping of the deep-ocean seafloor. The sonar package is also mountable on ROV Ventana, allowing surveys at altitudes less than 20 m at topographically challenging sites. The vehicle was assembled and extensively tested during 2004; this year we are commencing operations for MBARI science projects while continuing the process of testing and integrating the complete suite of sensors and systems. MBARI is beginning to use this capability to observe the changing morphology of dynamic systems such as submarine canyons and active slumps, to map deep-water benthic habitats at resolutions comparable to ROV and submersible observations, to provide basemaps for ROV dives, and to provide high resolution bathymetry and subbottom profiles as part of a variety of projects requiring knowledge of the seafloor. We will present initial results from surveys in and around Monterey Canyon, including high resolution repeat surveys of four sites along the canyon axis.

  19. DeepBipolar: Identifying genomic mutations for bipolar disorder via deep learning.

    Science.gov (United States)

    Laksshman, Sundaram; Bhat, Rajendra Rana; Viswanath, Vivek; Li, Xiaolin

    2017-09-01

    Bipolar disorder, also known as manic depression, is a brain disorder that affects the brain structure of a patient. It results in extreme mood swings, severe states of depression, and overexcitement simultaneously. It is estimated that roughly 3% of the population of the United States (about 5.3 million adults) suffers from bipolar disorder. Recent research efforts like the Twin studies have demonstrated a high heritability factor for the disorder, making genomics a viable alternative for detecting and treating bipolar disorder, in addition to the conventional lengthy and costly postsymptom clinical diagnosis. Motivated by this study, leveraging several emerging deep learning algorithms, we design an end-to-end deep learning architecture (called DeepBipolar) to predict bipolar disorder based on limited genomic data. DeepBipolar adopts the Deep Convolutional Neural Network (DCNN) architecture that automatically extracts features from genotype information to predict the bipolar phenotype. We participated in the Critical Assessment of Genome Interpretation (CAGI) bipolar disorder challenge and DeepBipolar was considered the most successful by the independent assessor. In this work, we thoroughly evaluate the performance of DeepBipolar and analyze the type of signals we believe could have affected the classifier in distinguishing the case samples from the control set. © 2017 Wiley Periodicals, Inc.

  20. Deep learning? What deep learning? | Fourie | South African ...

    African Journals Online (AJOL)

    In teaching generally over the past twenty years, there has been a move towards teaching methods that encourage deep, rather than surface approaches to learning. The reason for this being that students, who adopt a deep approach to learning are considered to have learning outcomes of a better quality and desirability ...

  1. Studying dark energy with galaxy cluster surveys

    International Nuclear Information System (INIS)

    Mohr, Joseph J.; O'Shea, Brian; Evrard, August E.; Bialek, John; Haiman, Zoltan

    2003-01-01

    Galaxy cluster surveys provide a powerful means of studying the density and nature of the dark energy. The redshift distribution of detected clusters in a deep, large solid angle SZE or X-ray survey is highly sensitive to the dark energy equation of state. Accurate constraints at the 5% level on the dark energy equation of state require that systematic biases in the mass estimators be controlled to better than the ∼10% level. Observed regularity in the cluster population and the availability of multiple, independent mass estimators suggest these precise measurements are possible. Using hydrodynamical simulations that include preheating, we show that the level of preheating required to explain local galaxy cluster structure has a dramatic effect on X-ray cluster surveys, but only a mild effect on SZE surveys. This suggests that SZE surveys may be optimal for cosmology, while X-ray surveys are well suited for studies of the thermal history of the intracluster medium.

  2. Pipeline corridors through wetlands - impacts on plant communities: Deep Creek and Brandy Branch crossings, Nassau County, Florida

    Energy Technology Data Exchange (ETDEWEB)

    Shem, L.M.; Van Dyke, G.D.; Zimmerman, R.E.

    1994-12-01

    The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of surveys conducted July 14-18, 1992, at the Deep Creek and the Brandy Branch crossings of a pipeline installed during May 1991 in Nassau County, Florida. Both floodplains supported bottomland hardwood forests. The pipeline at the Deep Creek crossing was installed by means of horizontal directional drilling after the ROW had been clear-cut, while the pipeline at the Brandy Branch crossing was installed by means of conventional open trenching. Neither site was seeded or fertilized. At the time of sampling, a dense vegetative community, made up primarily of native perennial herbaceous species, occupied the ROW within the Deep Creek floodplain. The Brandy Branch ROW was vegetated by a less dense stand of primarily native perennial herbaceous plants. Plant diversity was also lower at the Brandy Branch crossing than at the Deep Creek crossing. The results suggest that some of the differences in plant communities are related to the more hydric conditions at the Brandy Branch floodplain.

  3. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows, causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  4. Cyriax's deep friction massage application parameters: Evidence from a cross-sectional study with physiotherapists.

    Science.gov (United States)

    Chaves, Paula; Simões, Daniela; Paço, Maria; Pinho, Francisco; Duarte, José Alberto; Ribeiro, Fernando

    2017-12-01

    Deep friction massage is one of several physiotherapy interventions suggested for the management of tendinopathy. The aims were to determine the prevalence of deep friction massage use in clinical practice, to characterize the application parameters used by physiotherapists, and to identify empirical model-based patterns of deep friction massage application in degenerative tendinopathy. The study was an observational, analytical, cross-sectional, national web-based survey. A total of 478 physiotherapists were selected through a snowball sampling method. The participants completed an online questionnaire about personal and professional characteristics as well as specific questions regarding the use of deep friction massage. The deep friction massage parameters used by physiotherapists were characterized as counts and proportions. Latent class analysis was used to identify the empirical model-based patterns. Crude and adjusted odds ratios and 95% confidence intervals were computed. The use of deep friction massage was reported by 88.1% of the participants; tendinopathy was the clinical condition in which it was most frequently used (84.9%) and, of these, 55.9% reported its use in degenerative tendinopathy. The "duration of application" parameters in the chronic phase and the "frequency of application" in the acute and chronic phases are those that diverge most from those recommended by the author of deep friction massage. We found a high prevalence of deep friction massage use, namely in degenerative tendinopathy. Our results have shown that the application parameters are heterogeneous and diverse. This is reflected by the identification of two application patterns, although neither is in complete agreement with Cyriax's description. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. The Horonobe Underground Research Laboratory (Tentative name) Project. A program on survey and research performed from earth surface

    International Nuclear Information System (INIS)

    2001-03-01

    The Horonobe Underground Research Laboratory (tentative name) Project, being planned at Horonobe-machi by the Japan Nuclear Cycle Development Institute (JNC), is a deep-underground research facility of the kind called for in the 'Long-term program on research, development and application of nuclear energy' (June 1994; LPNE), where research on the deep underground targeting sedimentary rocks is carried out. The plan for the Horonobe Underground Research Laboratory spans roughly 20 years from the start to the completion of its surveys and research, and is carried out in three steps: survey and research performed from the earth's surface, survey and research performed during excavation of the drifts, and survey and research performed using the drifts. The Horonobe Underground Research Laboratory is one of the deep-underground research facilities whose importance is noted in the LPNE, and it carries out research on the deep underground with sedimentary rocks as its target. Through actual tests and research in the deep strata, the laboratory is also intended to confirm the technical reliability of, and provide support for, geological disposal as presented in 'Technical reliability on geological disposal of high-level radioactive wastes - The Second Progress Report of R and D on geological disposal', compiled by JNC in November 1999. The results obtained are intended to be reflected in the disposal programme and in the safety regulation and related activities carried out by the government, together with the results of geoscientific research at the Tono Geoscience Center, of geological disposal R and D at the Tokai Works, and of international collaborations. For R and D at the Horonobe Underground Research Laboratory after 2000, the following subjects are given: 1) survey techniques for the long-term stability of the geological environment, 2) survey techniques for the geological environment, 3) engineering techniques for the engineered barrier and

  6. Predictive modeling of deep-sea fish distribution in the Azores

    Science.gov (United States)

    Parra, Hugo E.; Pham, Christopher K.; Menezes, Gui M.; Rosa, Alexandra; Tempera, Fernando; Morato, Telmo

    2017-11-01

    Understanding the link between fish and their habitat is essential for an ecosystem approach to fisheries management. However, determining such relationship is challenging, especially for deep-sea species. In this study, we applied generalized additive models (GAMs) to relate presence-absence and relative abundance data of eight economically-important fish species to environmental variables (depth, slope, aspect, substrate type, bottom temperature, salinity and oxygen saturation). We combined 13 years of catch data collected from systematic longline surveys performed across the region. Overall, presence-absence GAMs performed better than abundance models and predictions made for the observed data successfully predicted the occurrence of the eight deep-sea fish species. Depth was the most influential predictor of all fish species occurrence and abundance distributions, whereas other factors were found to be significant for some species but did not show such a clear influence. Our results predicted that despite the extensive Azores EEZ, the habitats available for the studied deep-sea fish species are highly limited and patchy, restricted to seamounts slopes and summits, offshore banks and island slopes. Despite some identified limitations, our GAMs provide an improved knowledge of the spatial distribution of these commercially important fish species in the region.
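
    As a sketch of how such a presence-absence GAM might be fitted in Python (using the pygam package as a stand-in; the study itself does not specify an implementation), the example below builds smooth terms for a few of the continuous predictors named in the abstract and a factor term for substrate type. The synthetic data, column choices and value ranges are assumptions for illustration only.

      import numpy as np
      from pygam import LogisticGAM, s, f

      # Hypothetical predictor matrix: depth, slope, bottom temperature, salinity,
      # oxygen saturation (columns 0-4) and a categorical substrate code (column 5).
      n = 500
      X = np.column_stack([
          np.random.uniform(200, 1500, n),   # depth (m)
          np.random.uniform(0, 30, n),       # slope (deg)
          np.random.uniform(4, 14, n),       # bottom temperature (deg C)
          np.random.uniform(35, 36.5, n),    # salinity
          np.random.uniform(40, 100, n),     # oxygen saturation (%)
          np.random.randint(0, 3, n),        # substrate class
      ])
      y = np.random.binomial(1, 0.3, n)      # presence/absence (synthetic)

      # Smooth terms for continuous predictors, a factor term for substrate.
      gam = LogisticGAM(s(0) + s(1) + s(2) + s(3) + s(4) + f(5)).fit(X, y)
      gam.summary()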

  7. Optical Design for a Survey X-Ray Telescope

    Science.gov (United States)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-01-01

    Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0-degree full field-of-view.

  8. Deep learning with Python

    CERN Document Server

    Chollet, Francois

    2018-01-01

    DESCRIPTION Deep learning is applicable to a widening range of artificial intelligence problems, such as image classification, speech recognition, text classification, question answering, text-to-speech, and optical character recognition. Deep Learning with Python is structured around a series of practical code examples that illustrate each new concept introduced and demonstrate best practices. By the time you reach the end of this book, you will have become a Keras expert and will be able to apply deep learning in your own projects. KEY FEATURES • Practical code examples • In-depth introduction to Keras • Teaches the difference between Deep Learning and AI ABOUT THE TECHNOLOGY Deep learning is the technology behind photo tagging systems at Facebook and Google, self-driving cars, speech recognition systems on your smartphone, and much more. AUTHOR BIO Francois Chollet is the author of Keras, one of the most widely used libraries for deep learning in Python. He has been working with deep neural ...

  9. Deep learning evaluation using deep linguistic processing

    OpenAIRE

    Kuhnle, Alexander; Copestake, Ann

    2017-01-01

    We discuss problems with the standard approaches to evaluation for tasks like visual question answering, and argue that artificial data can be used to address these as a complement to current practice. We demonstrate that with the help of existing 'deep' linguistic processing technology we are able to create challenging abstract datasets, which enable us to investigate the language understanding abilities of multimodal deep learning models in detail, as compared to a single performance value ...

  10. A MegaCam Survey of Outer Halo Satellites. I. Description of the Survey

    Science.gov (United States)

    Muñoz, Ricardo R.; Côté, Patrick; Santana, Felipe A.; Geha, Marla; Simon, Joshua D.; Oyarzún, Grecco A.; Stetson, Peter B.; Djorgovski, S. G.

    2018-06-01

    We describe a deep, systematic imaging study of satellites in the outer halo of the Milky Way. Our sample consists of 58 stellar overdensities—i.e., substructures classified as either globular clusters, classical dwarf galaxies, or ultra-faint dwarf galaxies—that are located at Galactocentric distances of R_GC ≥ 25 kpc (outer halo) and out to ∼400 kpc. This includes 44 objects for which we have acquired deep, wide-field, g- and r-band imaging with the MegaCam mosaic cameras on the 3.6 m Canada–France–Hawaii Telescope and the 6.5 m Magellan-Clay telescope. These data are supplemented by archival imaging, or published gr photometry, for an additional 14 objects, most of which were discovered recently in the Dark Energy Survey (DES). We describe the scientific motivation for our survey, including sample selection, observing strategy, data reduction pipeline, calibration procedures, and the depth and precision of the photometry. The typical 5σ point-source limiting magnitudes for our MegaCam imaging—which collectively covers an area of ≈52 deg²—are g_lim ≃ 25.6 and r_lim ≃ 25.3 AB mag. These limits are comparable to those from the coadded DES images and are roughly a half-magnitude deeper than will be reached in a single visit with the Large Synoptic Survey Telescope. Our photometric catalog thus provides the deepest and most uniform photometric database of Milky Way satellites available for the foreseeable future. In other papers in this series, we have used these data to explore the blue straggler populations in these objects, their density distributions, star formation histories, scaling relations, and possible foreground structures.

  11. Habitat Ecology Visual Surveys of Demersal Fishes and Habitats off California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Since 1992, the Habitat Ecology team has been conducting fishery independent, visual surveys of demersal fishes and associated habitats in deep water (20 to 900...

  12. Modelling the Volatility-Return Trade-off when Volatility may be Nonstationary

    DEFF Research Database (Denmark)

    Dahl, Christian Møller; Iglesias, Emma M.

    In this paper a new GARCH-M type model, denoted the GARCH-AR, is proposed. In particular, it is shown that it is possible to generate a volatility-return trade-off in a regression model simply by introducing dynamics in the standardized disturbance process. Importantly, the volatility in the GARCH......, we provide an empirical illustration showing the empirical relevance of the GARCH-AR model based on modelling a wide range of leading US stock return series....
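
    For readers unfamiliar with the mechanics, the following Python sketch simulates a plain GARCH(1,1) process, the building block that such volatility-return specifications extend; it is not the GARCH-AR model of the paper, and the parameter values are arbitrary assumptions.

      import numpy as np

      def simulate_garch11(n, omega=0.05, alpha=0.08, beta=0.90, mu=0.0, seed=0):
          """Simulate r_t = mu + e_t, e_t = sigma_t * z_t,
          sigma_t^2 = omega + alpha * e_{t-1}^2 + beta * sigma_{t-1}^2."""
          rng = np.random.default_rng(seed)
          z = rng.standard_normal(n)
          sigma2 = np.empty(n)
          e = np.empty(n)
          sigma2[0] = omega / (1.0 - alpha - beta)   # start at the unconditional variance
          e[0] = np.sqrt(sigma2[0]) * z[0]
          for t in range(1, n):
              sigma2[t] = omega + alpha * e[t - 1] ** 2 + beta * sigma2[t - 1]
              e[t] = np.sqrt(sigma2[t]) * z[t]
          return mu + e, np.sqrt(sigma2)             # returns and conditional volatility

      returns, vol = simulate_garch11(2000)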

  13. The National Deep-Sea Coral and Sponge Database: A Comprehensive Resource for United States Deep-Sea Coral and Sponge Records

    Science.gov (United States)

    Dornback, M.; Hourigan, T.; Etnoyer, P.; McGuinn, R.; Cross, S. L.

    2014-12-01

    Research on deep-sea corals has expanded rapidly over the last two decades, as scientists began to realize their value as long-lived structural components of high biodiversity habitats and archives of environmental information. The NOAA Deep Sea Coral Research and Technology Program's National Database for Deep-Sea Corals and Sponges is a comprehensive resource for georeferenced data on these organisms in U.S. waters. The National Database currently includes more than 220,000 deep-sea coral records representing approximately 880 unique species. Database records from museum archives, commercial and scientific bycatch, and from journal publications provide baseline information with relatively coarse spatial resolution dating back as far as 1842. These data are complemented by modern, in-situ submersible observations with high spatial resolution, from surveys conducted by NOAA and NOAA partners. Management of high volumes of modern high-resolution observational data can be challenging. NOAA is working with our data partners to incorporate this occurrence data into the National Database, along with images and associated information related to geoposition, time, biology, taxonomy, environment, provenance, and accuracy. NOAA is also working to link associated datasets collected by our program's research, to properly archive them to the NOAA National Data Centers, to build a robust metadata record, and to establish a standard protocol to simplify the process. Access to the National Database is provided through an online mapping portal. The map displays point-based records from the database. Records can be refined by taxon, region, time, and depth. The queries and extent used to view the map can also be used to download subsets of the database. The database, map, and website are already in use by NOAA, regional fishery management councils, and regional ocean planning bodies, but we envision it as a model that can expand to accommodate data on a global scale.

  14. DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.

    Science.gov (United States)

    Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan

    2018-04-01

    Mitotic count is a critical predictor of tumor aggressiveness in the breast cancer diagnosis. Nowadays mitosis counting is mainly performed by pathologists manually, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting the mitotic cells from histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating mitosis region when only a weak label is given (i.e., only the centroid pixel of mitosis is annotated), an elaborately designed deep detection network for localizing mitosis by using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we can achieve the highest F-score on the MITOSIS dataset from ICPR 2012 grand challenge merely using the deep detection network. For the ICPR 2014 MITOSIS dataset that only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced from the detection model. By fusing scores of the detection and verification models, we achieve the state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Deep frying

    NARCIS (Netherlands)

    Koerten, van K.N.

    2016-01-01

    Deep frying is one of the most used methods in the food processing industry. Though practically any food can be fried, French fries are probably the most well-known deep fried products. The popularity of French fries stems from their unique taste and texture, a crispy outside with a mealy soft

  16. Subsea surveying: a guide for oilmen. Deep-tow and digital techniques

    Energy Technology Data Exchange (ETDEWEB)

    Burns, F M

    1977-05-01

    Hydrodynamically stable tow fishes into which the acoustic energy source is mounted are examined. The advantages of deep-towed devices extend beyond the use of bottom and subbottom profiling systems. When used with side-scan sonar devices, any small seabed relief or feature is enhanced by towing the sonar close to the seabed. The distortion of the records produced by observing slant ranges as opposed to the true range is also reduced. The use of marine magnetometers can greatly improve detection capabilities when searching for objects which produce some magnetic disturbance. The amplitude of this magnetic 'anomaly' increases as the range between the magnetic sensor and the object decreases. Areas where digital processing of data can be of significant value include navigational positioning (both of surface and sub-surface vessels); signal processing of seismic profiling data; and data presentation of all forms.

  17. DeepPVP: phenotype-based prioritization of causative variants using deep learning

    KAUST Repository

    Boudellioua, Imene

    2018-05-02

    Background: Prioritization of variants in personal genomic data is a major challenge. Recently, computational methods that rely on comparing phenotype similarity have been shown to be useful for identifying causative variants. In these methods, pathogenicity prediction is combined with a semantic similarity measure to prioritize not only variants that are likely to be dysfunctional but also those that are likely involved in the pathogenesis of a patient's phenotype. Results: We have developed DeepPVP, a variant prioritization method that combines automated inference with deep neural networks to identify the likely causative variants in whole exome or whole genome sequence data. We demonstrate that DeepPVP performs significantly better than existing methods, including phenotype-based methods that use similar features. DeepPVP is freely available at https://github.com/bio-ontology-research-group/phenomenet-vp Conclusions: DeepPVP further improves on existing variant prioritization methods both in terms of speed and accuracy.

  18. A PILOT FOR A VERY LARGE ARRAY H I DEEP FIELD

    International Nuclear Information System (INIS)

    Fernández, Ximena; Van Gorkom, J. H.; Schiminovich, David; Hess, Kelley M.; Pisano, D. J.; Kreckel, Kathryn; Momjian, Emmanuel; Popping, Attila; Oosterloo, Tom; Chomiuk, Laura; Verheijen, M. A. W.; Henning, Patricia A.; Bershady, Matthew A.; Wilcots, Eric M.; Scoville, Nick

    2013-01-01

    High-resolution 21 cm H I deep fields provide spatially and kinematically resolved images of neutral hydrogen at different redshifts, which are key to understanding galaxy evolution across cosmic time and testing predictions of cosmological simulations. Here we present results from a pilot for an H I deep field done with the Karl G. Jansky Very Large Array (VLA). We take advantage of the newly expanded capabilities of the telescope to probe the redshift interval 0 < z < 0.193 in one observation. We observe the COSMOS field for 50 hr, which contains 413 galaxies with optical spectroscopic redshifts in the imaged field of 34' × 34' and the observed redshift interval. We have detected neutral hydrogen gas in 33 galaxies in different environments spanning the probed redshift range, including three without a previously known spectroscopic redshift. The detections have a range of H I and stellar masses, indicating the diversity of galaxies we are probing. We discuss the observations, data reduction, results, and highlight interesting detections. We find that the VLA's B-array is the ideal configuration for H I deep fields since its long spacings mitigate radio frequency interference. This pilot shows that the VLA is ready to carry out such a survey, and serves as a test for future H I deep fields planned with other Square Kilometer Array pathfinders.

  19. A PILOT FOR A VERY LARGE ARRAY H I DEEP FIELD

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Ximena; Van Gorkom, J. H.; Schiminovich, David [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Hess, Kelley M. [Department of Astronomy, Astrophysics, Cosmology and Gravity Centre, University of Cape Town, Private Bag X3, Rondebosch 7701 (South Africa); Pisano, D. J. [Department of Physics, West Virginia University, P.O. Box 6315, Morgantown, WV 26506 (United States); Kreckel, Kathryn [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Momjian, Emmanuel [National Radio Astronomy Observatory, Socorro, NM 87801 (United States); Popping, Attila [International Centre for Radio Astronomy Research (ICRAR), The University of Western Australia, 35 Stirling Hwy, Crawley, WA 6009 (Australia); Oosterloo, Tom [Netherlands Institute for Radio Astronomy (ASTRON), Postbus 2, NL-7990 AA Dwingeloo (Netherlands); Chomiuk, Laura [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States); Verheijen, M. A. W. [Kapteyn Astronomical Institute, University of Groningen, Postbus 800, NL-9700 AV Groningen (Netherlands); Henning, Patricia A. [Department of Physics and Astronomy, University of New Mexico, Albuquerque, NM 87131 (United States); Bershady, Matthew A.; Wilcots, Eric M. [Department of Astronomy, University of Wisconsin-Madison, Madison, WI 53706 (United States); Scoville, Nick, E-mail: ximena@astro.columbia.edu [Department of Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States)

    2013-06-20

    High-resolution 21 cm H I deep fields provide spatially and kinematically resolved images of neutral hydrogen at different redshifts, which are key to understanding galaxy evolution across cosmic time and testing predictions of cosmological simulations. Here we present results from a pilot for an H I deep field done with the Karl G. Jansky Very Large Array (VLA). We take advantage of the newly expanded capabilities of the telescope to probe the redshift interval 0 < z < 0.193 in one observation. We observe the COSMOS field for 50 hr, which contains 413 galaxies with optical spectroscopic redshifts in the imaged field of 34' × 34' and the observed redshift interval. We have detected neutral hydrogen gas in 33 galaxies in different environments spanning the probed redshift range, including three without a previously known spectroscopic redshift. The detections have a range of H I and stellar masses, indicating the diversity of galaxies we are probing. We discuss the observations, data reduction, results, and highlight interesting detections. We find that the VLA's B-array is the ideal configuration for H I deep fields since its long spacings mitigate radio frequency interference. This pilot shows that the VLA is ready to carry out such a survey, and serves as a test for future H I deep fields planned with other Square Kilometer Array pathfinders.

  20. Vertical Cable Seismic Survey for Hydrothermal Deposit

    Science.gov (United States)

    Asakawa, E.; Murakami, F.; Sekino, Y.; Okamoto, T.; Ishikawa, K.; Tsukahara, H.; Shimura, T.

    2012-04-01

    Vertical cable seismic is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by surface, deep-towed or ocean bottom sources. By analyzing the reflections from below the seabed, we can image the subsurface structure. This type of survey is generally called VCS (Vertical Cable Seismic). Because VCS is an efficient high-resolution 3D seismic survey method for a spatially-bounded area, we proposed the method for the hydrothermal deposit survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We are now developing a VCS system, including not only data acquisition hardware but also data processing and analysis techniques. Our first VCS survey experiment was carried out in Lake Biwa, Japan, in November 2009 as a feasibility study. Prestack depth migration was applied to the 3D VCS data to obtain a high-quality 3D depth volume. Based on the results from the feasibility study, we have developed two autonomous recording VCS systems. We then carried out a trial experiment in the open ocean at a water depth of about 400 m, followed by a second VCS survey at Iheya Knoll with a deep-towed source. In this survey, we established the procedures for the deployment and recovery of the system and examined the locations and fluctuations of the vertical cables at a water depth of around 1000 m. The acquired VCS data clearly show reflections from below the seafloor. Through the experiment, we confirmed that our VCS system works well even in the severe conditions around the locations of seafloor hydrothermal deposits. We have, however, also confirmed that uncertainty in the locations of the source and of the hydrophones can lower the quality of the subsurface image. It is therefore necessary to develop a total survey system that ensures accurate positioning and reliable deployment techniques

  1. Hot, deep origin of petroleum: deep basin evidence and application

    Science.gov (United States)

    Price, Leigh C.

    1978-01-01

    Use of the model of a hot deep origin of oil places rigid constraints on the migration and entrapment of crude oil. Specifically, oil originating from depth migrates vertically up faults and is emplaced in traps at shallower depths. Review of petroleum-producing basins worldwide shows that oil occurrence in these basins conforms to these constraints and therefore supports the hypothesis. Most of the world's oil is found in the very deepest sedimentary basins, and production over or adjacent to the deep basin is cut by or directly updip from faults dipping into the basin deep. Generally, the greater the fault throw, the greater the reserves. Fault-block highs next to deep sedimentary troughs are the best target areas by the present concept. Traps along major basin-forming faults are quite prospective. The structural style of a basin governs the distribution, types, and amounts of hydrocarbons expected and hence the exploration strategy. Production in delta depocenters (Niger) is in structures cut by or updip from major growth faults, and structures not associated with such faults are barren. Production in block fault basins is on horsts next to deep sedimentary troughs (Sirte, North Sea). In basins whose sediment thickness, structure and geologic history are known to a moderate degree, the main oil occurrences can be specifically predicted by analysis of fault systems and possible hydrocarbon migration routes. Use of the concept permits the identification of significant targets which have either been downgraded or ignored in the past, such as production in or just updip from thrust belts, stratigraphic traps over the deep basin associated with major faulting, production over the basin deep, and regional stratigraphic trapping updip from established production along major fault zones.

  2. The applications of deep neural networks to sdBV classification

    Science.gov (United States)

    Boudreaux, Thomas M.

    2017-12-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and we show that two separate paradigms of deep learning - the artificial neural network and the convolutional neural network - can both be used to classify this synthetic data effectively, and that this classification can be performed at relatively high levels of accuracy with minimal time spent adjusting network hyperparameters.
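
    A minimal sketch, in Python with tf.keras, of the convolutional-network half of such an approach: a small Conv1D classifier trained on synthetic "light curves" (sinusoids versus pure noise). The data generation, architecture and training settings are assumptions for illustration and are not the networks or dataset used in the paper.

      import numpy as np
      import tensorflow as tf

      # Synthetic "light curves": sinusoids (pulsators) vs. flat noise (non-pulsators).
      n, length = 1000, 256
      t = np.linspace(0, 1, length)
      pulsators = np.sin(2 * np.pi * np.random.uniform(5, 50, (n // 2, 1)) * t)
      non_pulsators = np.zeros((n // 2, length))
      X = np.vstack([pulsators, non_pulsators]) + 0.3 * np.random.randn(n, length)
      y = np.concatenate([np.ones(n // 2), np.zeros(n // 2)])

      model = tf.keras.Sequential([
          tf.keras.Input(shape=(length, 1)),
          tf.keras.layers.Conv1D(16, 7, activation='relu'),
          tf.keras.layers.MaxPooling1D(4),
          tf.keras.layers.Conv1D(32, 5, activation='relu'),
          tf.keras.layers.GlobalAveragePooling1D(),
          tf.keras.layers.Dense(1, activation='sigmoid'),   # pulsator probability
      ])
      model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
      model.fit(X[..., None], y, epochs=5, validation_split=0.2)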

  3. The USNO-UKIRT K-band Hemisphere Survey

    Science.gov (United States)

    Dahm, Scott; Bruursema, Justice; Munn, Jeffrey A.; Vrba, Fred J.; Dorland, Bryan; Dye, Simon; Kerr, Tom; Varricatt, Watson; Irwin, Mike; Lawrence, Andy; McLaren, Robert; Hodapp, Klaus; Hasinger, Guenther

    2018-01-01

    We present initial results from the United States Naval Observatory (USNO) and UKIRT K-band Hemisphere Survey (U2HS), currently underway using the Wide Field Camera (WFCAM) installed on UKIRT on Maunakea. U2HS is a collaborative effort undertaken by USNO, the Institute for Astronomy, University of Hawaii, the Cambridge Astronomy Survey Unit (CASU) and the Wide Field Astronomy Unit (WFAU) in Edinburgh. The principal objective of the U2HS is to provide continuous northern hemisphere K-band coverage over a declination range of δ = 0° to +60° by combining over 12,700 deg² of new imaging with the existing UKIRT Infrared Deep Sky Survey (UKIDSS) Large Area Survey (LAS), Galactic Plane Survey (GPS) and Galactic Cluster Survey (GCS). U2HS will achieve a 5-σ point source sensitivity of K ~ 18.4 mag (Vega), over three magnitudes deeper than the Two Micron All Sky Survey (2MASS). In this contribution we discuss survey design, execution, data acquisition and processing, photometric calibration and quality control. The data obtained by the U2HS will be made publicly available through the Wide Field Science Archive (WSA) maintained by the WFAU.

  4. Deep learning in bioinformatics.

    Science.gov (United States)

    Min, Seonwoo; Lee, Byunghan; Yoon, Sungroh

    2017-09-01

    In the era of big data, transformation of biomedical big data into valuable knowledge has been one of the most important challenges in bioinformatics. Deep learning has advanced rapidly since the early 2000s and now demonstrates state-of-the-art performance in various fields. Accordingly, application of deep learning in bioinformatics to gain insight from data has been emphasized in both academia and industry. Here, we review deep learning in bioinformatics, presenting examples of current research. To provide a useful and comprehensive perspective, we categorize research both by the bioinformatics domain (i.e. omics, biomedical imaging, biomedical signal processing) and deep learning architecture (i.e. deep neural networks, convolutional neural networks, recurrent neural networks, emergent architectures) and present brief descriptions of each study. Additionally, we discuss theoretical and practical issues of deep learning in bioinformatics and suggest future research directions. We believe that this review will provide valuable insights and serve as a starting point for researchers to apply deep learning approaches in their bioinformatics studies. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. The BUFFALO HST Survey

    Science.gov (United States)

    Steinhardt, Charles; Jauzac, Mathilde; Capak, Peter; Koekemoer, Anton; Oesch, Pascal; Richard, Johan; Sharon, Keren q.; BUFFALO

    2018-01-01

    Beyond Ultra-deep Frontier Fields And Legacy Observations (BUFFALO) is an astronomical survey built around the six Hubble Space Telescope (HST) Frontier Fields clusters designed to learn about early galactic assembly and clustering and prepare targets for observations with the James Webb Space Telescope. BUFFALO will place significant new constraints on how and when the most massive and luminous galaxies in the universe formed and how early galaxy formation is linked to dark matter assembly. The same data will also probe the temperature and cross section of dark matter in the massive Frontier Fields galaxy clusters, and tell us how the dark matter, cluster gas, and dynamics of the clusters influence the galaxies in and around them. These studies are possible because the Spitzer Space Telescope, Chandra X-ray Observatory, XMM-Newton, and ground based telescopes have already invested heavily in deep observations around the Frontier Fields, so that the addition of HST observations can yield significant new results.

  6. Twenty-Three High-Redshift Supernovae from the Institute for Astronomy Deep Survey: Doubling the Supernova Sample at z > 0.7

    Science.gov (United States)

    Barris, Brian J.; Tonry, John L.; Blondin, Stéphane; Challis, Peter; Chornock, Ryan; Clocchiatti, Alejandro; Filippenko, Alexei V.; Garnavich, Peter; Holland, Stephen T.; Jha, Saurabh; Kirshner, Robert P.; Krisciunas, Kevin; Leibundgut, Bruno; Li, Weidong; Matheson, Thomas; Miknaitis, Gajus; Riess, Adam G.; Schmidt, Brian P.; Smith, R. Chris; Sollerman, Jesper; Spyromilio, Jason; Stubbs, Christopher W.; Suntzeff, Nicholas B.; Aussel, Hervé; Chambers, K. C.; Connelley, M. S.; Donovan, D.; Henry, J. Patrick; Kaiser, Nick; Liu, Michael C.; Martín, Eduardo L.; Wainscoat, Richard J.

    2004-02-01

    We present photometric and spectroscopic observations of 23 high-redshift supernovae (SNe) spanning a range of z = 0.34-1.03, nine of which are unambiguously classified as Type Ia. These SNe were discovered during the IfA Deep Survey, which began in 2001 September and observed a total of 2.5 deg² to a depth of approximately m ~ 25-26 in RIZ over 9-17 visits, typically every 1-3 weeks for nearly 5 months, with additional observations continuing until 2002 April. We give a brief description of the survey motivations, observational strategy, and reduction process. This sample of 23 high-redshift SNe includes 15 at z ≥ 0.7, doubling the published number of objects at these redshifts, and indicates that the evidence for acceleration of the universe is not due to a systematic effect proportional to redshift. In combination with the recent compilation of Tonry et al. (2003), we calculate cosmological parameter density contours that are consistent with the flat universe indicated by the cosmic microwave background (Spergel et al. 2003). Adopting the constraint that Ω_total = 1.0, we obtain best-fit values of (Ω_m, Ω_Λ) = (0.33, 0.67) using 22 SNe from this survey augmented by the literature compilation. We show that using the empty-beam model for gravitational lensing does not eliminate the need for Ω_Λ > 0. Experience from this survey indicates great potential for similar large-scale surveys while also revealing the limitations of performing surveys for z > 1 SNe from the ground. CFHT: Based in part on observations obtained at the Canada-France-Hawaii Telescope (CFHT), which is operated by the National Research Council of Canada, the Institut National des Science de l'Univers of the Centre National de la Recherche Scientifique of France, and the University of Hawaii. CTIO: Based in part on observations taken at the Cerro Tololo Inter-American Observatory. Keck: Some of the data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership

  7. DeepSimulator: a deep simulator for Nanopore sequencing

    KAUST Repository

    Li, Yu

    2017-12-23

    Motivation: Oxford Nanopore sequencing is a sequencing technology that has developed rapidly in recent years. To keep pace with the explosion of the downstream data analytical tools, a versatile Nanopore sequencing simulator is needed to complement the experimental data as well as to benchmark those newly developed tools. However, all the currently available simulators are based on simple statistics of the produced reads, which have difficulty in capturing the complex nature of the Nanopore sequencing procedure, the main task of which is the generation of raw electrical current signals. Results: Here we propose a deep learning based simulator, DeepSimulator, to mimic the entire pipeline of Nanopore sequencing. Starting from a given reference genome or assembled contigs, we simulate the electrical current signals by a context-dependent deep learning model, followed by a base-calling procedure to yield simulated reads. This workflow mimics the sequencing procedure more naturally. The thorough experiments performed across four species show that the signals generated by our context-dependent model are more similar to the experimentally obtained signals than the ones generated by the official context-independent pore model. In terms of the simulated reads, we provide a parameter interface to users so that they can obtain the reads with different accuracies ranging from 83% to 97%. The reads generated by the default parameter have almost the same properties as the real data. Two case studies demonstrate the application of DeepSimulator to benefit the development of tools in de novo assembly and in low coverage SNP detection. Availability: The software can be accessed freely at: https://github.com/lykaust15/DeepSimulator.

  8. Deep learning relevance

    DEFF Research Database (Denmark)

    Lioma, Christina; Larsen, Birger; Petersen, Casper

    2016-01-01

    train a Recurrent Neural Network (RNN) on existing relevant information to that query. We then use the RNN to "deep learn" a single, synthetic, and we assume, relevant document for that query. We design a crowdsourcing experiment to assess how relevant the "deep learned" document is, compared...... to existing relevant documents. Users are shown a query and four wordclouds (of three existing relevant documents and our deep learned synthetic document). The synthetic document is ranked on average most relevant of all....
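
    The underlying idea, fitting a generative language model to known-relevant text and then sampling from it to obtain a single synthetic "relevant" document, can be sketched with a toy next-word LSTM as below. This is an illustration only, not the authors' architecture or training setup; the two-document placeholder corpus and the names (NextWordLSTM, relevant_docs) are assumptions.

      # Toy sketch of the idea (not the authors' implementation): fit a tiny
      # next-word LSTM on known-relevant text, then sample a synthetic document.
      import torch
      import torch.nn as nn

      relevant_docs = ["deep learning for information retrieval",
                       "neural ranking models estimate document relevance"]  # placeholders
      tokens = " ".join(relevant_docs).split()
      vocab = sorted(set(tokens))
      idx = {w: i for i, w in enumerate(vocab)}
      data = torch.tensor([idx[w] for w in tokens])

      class NextWordLSTM(nn.Module):
          def __init__(self, v, d=32):
              super().__init__()
              self.emb = nn.Embedding(v, d)
              self.rnn = nn.LSTM(d, d, batch_first=True)
              self.out = nn.Linear(d, v)

          def forward(self, x):
              h, _ = self.rnn(self.emb(x))
              return self.out(h)

      model = NextWordLSTM(len(vocab))
      opt = torch.optim.Adam(model.parameters(), lr=1e-2)
      x, y = data[:-1].unsqueeze(0), data[1:].unsqueeze(0)
      for _ in range(200):   # tiny training loop on the relevant text
          opt.zero_grad()
          loss = nn.functional.cross_entropy(model(x).squeeze(0), y.squeeze(0))
          loss.backward()
          opt.step()

      # Sample a short "synthetic document" word by word.
      seq, generated = data[:1].unsqueeze(0), []
      for _ in range(10):
          probs = torch.softmax(model(seq)[0, -1], dim=-1)
          nxt = torch.multinomial(probs, 1)
          generated.append(vocab[nxt.item()])
          seq = torch.cat([seq, nxt.view(1, 1)], dim=1)
      print(" ".join(generated))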

  9. The very deep hole concept - Geoscientific appraisal of conditions at great depth

    International Nuclear Information System (INIS)

    Juhlin, C.; Wallroth, T.; Smellie, J.; Leijon, B.; Eliasson, T.; Ljunggren, C.; Beswick, J.

    1998-06-01

    One of the alternative systems for disposal of high-level radioactive nuclear waste being studied by SKB is the very deep hole (2000 - 4000 m) concept. As part of SKB's research programme a study has been carried out to increase the level of knowledge on the expected geological conditions in the depth interval 1000-5000 m in older crystalline rock. As a first step, existing data from relevant areas throughout the world have been compiled. The majority of the data come from deep boreholes, mines, and surface geophysical surveys. An attempt has been made to interpret these data in an integrated manner and to develop a conceptual geological model on the conditions in the Baltic Shield down to a depth of 5 km. One of the main features of the suggested model is that the upper 1 km of crust contains significantly more open fractures than the rock below. However, hydraulically conductive fractures and fracture zones may exist at great depth. In areas of low topography active groundwater circulation is primarily limited to the upper 1 km with the water below 1 km having high salinity. The high salinity reflects the near hydraulically stagnant conditions which exist relatively shallow in areas of low topography. In areas with greater topographic relief fresh water penetrates to great depth and near stagnant conditions are first encountered much deeper. The report also covers how the studied parameters which describe the geological conditions vary with depth. A number of recommendations are made on how the presented conceptual model can be tested and improved aside from obtaining data from new boreholes. These recommendations include the following geoscientific surveys and studies: Reflection and refraction seismics for mapping discrete sub-horizontal fracture zones and the upper more fractured part of the crust; Geoelectric methods for mapping the depth to saline water; Detailed hydrogeological measurements in existing deep boreholes; Isotope studies on fracture minerals

  10. The very deep hole concept - Geoscientific appraisal of conditions at great depth

    Energy Technology Data Exchange (ETDEWEB)

    Juhlin, C. [Christopher Juhlin Consulting (Sweden); Wallroth, T. [Bergab Consulting Geologists (Sweden); Smellie, J.; Leijon, B. [Conterra AB (Sweden); Eliasson, T. [Geological Survey of Sweden (Sweden); Ljunggren, C. [Vattenfall Hydropower AB (Sweden); Beswick, J. [EDECO Petroleum Services Ltd. (United Kingdom)

    1998-06-01

    One of the alternative systems for disposal of high-level radioactive nuclear waste being studied by SKB is the very deep hole (2000 - 4000 m) concept. As part of SKB`s research programme a study has been carried out to increase the level of knowledge on the expected geological conditions in the depth interval 1000-5000 m in older crystalline rock. As a first step, existing data from relevant areas throughout the world have been compiled. The majority of the data come from deep boreholes, mines, and surface geophysical surveys. An attempt has been made to interpret these data in an integrated manner and to develop a conceptual geological model on the conditions in the Baltic Shield down to a depth of 5 km. One of the main features of the suggested model is that the upper 1 km of crust contains significantly more open fractures than the rock below. However, hydraulically conductive fractures and fracture zones may exist at great depth. In areas of low topography active groundwater circulation is primarily limited to the upper 1 km with the water below 1 km having high salinity. The high salinity reflects the near hydraulically stagnant conditions which exist relatively shallow in areas of low topography. In areas with greater topographic relief fresh water penetrates to great depth and near stagnant conditions are first encountered much deeper. The report also covers how the studied parameters which describe the geological conditions vary with depth. A number of recommendations are made on how the presented conceptual model can be tested and improved aside from obtaining data from new boreholes. These recommendations include the following geoscientific surveys and studies: Reflection and refraction seismics for mapping discrete sub-horizontal fracture zones and the upper more fractured part of the crust; Geoelectric methods for mapping the depth to saline water; Detailed hydrogeological measurements in existing deep boreholes; Isotope studies on fracture minerals

  11. Optical identifications of radio sources in the 5C 7 survey

    International Nuclear Information System (INIS)

    Perryman, M.A.C.

    1979-01-01

    An identification procedure developed for the deep radio survey 5C 6 has been refined and applied to the 5C 7 survey. Positions and finding charts are presented for candidate identifications from deep plates taken with the Palomar 48-inch Schmidt telescope. The identification statistics are in good agreement with the 5C 6 results, the accurate radio positions obtained at 1407 MHz defining a reasonably reliable and complete sample of associations with an identification rate of about 40 per cent. At 408 MHz the positional uncertainties are larger and the identifications are thus of lower reliability; the identification rate is about 20 per cent. The results are in good agreement with the assumptions that the optical identifications are coincident with the radio centroids, and that the identifications are not preferentially associated with faint clusters. (author)

  12. Recent developments in the thermophilic microbiology of deep-sea hydrothermal vents.

    Science.gov (United States)

    Miroshnichenko, Margarita L; Bonch-Osmolovskaya, Elizaveta A

    2006-04-01

    The diversity of thermophilic prokaryotes inhabiting deep-sea hot vents was actively studied over the last two decades. The ever growing interest is reflected in the exponentially increasing number of novel thermophilic genera described. The goal of this paper is to survey the progress in this field made in the years 2000-2005. In this period, representatives of several new taxa of hyperthermophilic archaea were obtained from deep-sea environments. Two of these isolates had phenotypic features new for this group of organisms: the presence of an outer cell membrane (the genus Ignicoccus) and the ability to grow anaerobically with acetate and ferric iron (the genus Geoglobus). Also, our knowledge on the diversity of thermophilic bacteria from deep-sea thermal environments extended significantly. The new bacterial isolates represented diverse bacterial divisions: the phylum Aquificae, the subclass Epsilonproteobacteria, the order Thermotogales, the families Thermodesulfobacteriaceae, Deferribacteraceae, and Thermaceae, and a novel bacterial phylum represented by the genus Caldithrix. Most of these isolates are obligate or facultative lithotrophs, oxidizing molecular hydrogen in the course of different types of anaerobic respiration or microaerobic growth. The existence and significant ecological role of some of new bacterial thermophilic isolates was initially established by molecular methods.

  13. Hydrogeochemistry of deep groundwaters of mafic and ultramafic rocks in Finland

    International Nuclear Information System (INIS)

    Ruskeeniemi, T.; Blomqvist, R.; Lindberg, A.; Ahonen, L.; Frape, S.

    1996-12-01

    The present work reports and interprets the hydrogeochemical and hydrogeological data obtained from deep groundwaters in various mafic-ultramafic formations in Finland. The work is mainly based on the results of the research project 'Geochemistry of deep groundwaters' financed by the Ministry of Trade and Industry and the Geological Survey of Finland. Five sites were selected for this study: (1) Juuka, (2) Keminmaa, (3) Maentsaelae, (4) Ranua, and (5) Ylivieska. Keminmaa and Ranua are located in Early Proterozoic layered intrusions dated at 2.44 Ga. The Juuka site lies within the massive Miihkali serpentinite, which is thought to represent the ultramafic part of a Proterozoic (1.97 Ga) ophiolite complex. The Maentsaelae gabbro represents the deep parts of the Svecofennian volcanic sequence, while the Ylivieska mafic-ultramafic intrusion is one of a group of Svecokarelian Ni-potential intrusions 1.9 Ga in age. For reference, groundwaters from four other sites are also briefly described. Three of these sites are located within the nickel mining regions of Enonkoski, Kotalahti and Vammala, while the fourth is a small Ni mineralization at Hyvelae, Noormarkku. The four reference sites are all of Svecokarelian age. (refs.)

  14. Hydrogeochemistry of deep groundwaters of mafic and ultramafic rocks in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Ruskeeniemi, T.; Blomqvist, R.; Lindberg, A.; Ahonen, L. [Geological Survey of Finland, Espoo (Finland); Frape, S. [Waterloo Univ., ON (Canada)

    1996-12-01

    The present work reports and interprets the hydrogeochemical and hydrogeological data obtained from deep groundwaters in various mafic-ultramafic formations in Finland. The work is mainly based on the results of the research project `Geochemistry of deep groundwaters` financed by the Ministry of Trade and Industry and the Geological Survey of Finland. Five sites were selected for this study: (1) Juuka, (2) Keminmaa, (3) Maentsaelae, (4) Ranua, and (5) Ylivieska. Keminmaa and Ranua are located in Early Proterozoic layered intrusions dated at 2.44 Ga. The Juuka site lies within the massive Miihkali serpentinite, which is thought to represent the ultramafic part of a Proterozoic (1.97 Ga) ophiolite complex. The Maentsaelae gabbro represents the deep parts of the Svecofennian volcanic sequence, while the Ylivieska mafic-ultramafic intrusion is one of a group of Svecokarelian Ni-potential intrusions 1.9 Ga in age. For reference, groundwaters from four other sites are also briefly described. Three of these sites are located within the nickel mining regions of Enonkoski, Kotalahti and Vammala, while the fourth is a small Ni mineralization at Hyvelae, Noormarkku. The four reference sites are all of Svecokarelian age. (refs.).

  15. AzTEC on ASTE Survey of Submillimeter Galaxies

    Science.gov (United States)

    Kohno, K.; Tamura, Y.; Hatsukade, B.; Nakanishi, K.; Iono, D.; Takata, T.; Wilson, G. W.; Yun, M. S.; Perera, T.; Austermann, J. E.; Scott, K. S.; Hughes, H.; Aretxaga, I.; Tanaka, K.; Oshima, T.; Yamaguchi, N.; Matsuo, H.; Ezawa, H.; Kawabe, R.

    2008-10-01

    We have conducted an unprecedented survey of submillimeter galaxies (SMGs) using the 144 pixel bolometer camera AzTEC mounted on the ASTE 10-m dish in Chile. We have already obtained many (>20) wide (typically 12' × 12' or wider) and deep (1 σ sensitivity of 0.5-1.0 mJy) 1.1 mm continuum images of known blank fields and over-density regions/protoclusters across a wide range of redshifts with a spatial resolution of ~30''. This has resulted in numerous (~a few hundred, almost equivalent to the total number of previously known SMGs) new and secure detections of SMGs. In this paper, we present initial results for two selected fields, SSA 22 and AKARI Deep Field South (ADF-S). A significant clustering of bright SMGs toward the density peak of LAEs is found in SSA 22. We derived the differential and cumulative number counts from the detected sources in ADF-S, which probe the faintest flux densities (down to ~1 mJy) among 1-mm blank field surveys to date.

  16. NOAA's efforts to map extent, health and condition of deep sea corals and sponges and their habitat on the banks and island slopes of Southern California

    Science.gov (United States)

    Etnoyer, P. J.; Salgado, E.; Stierhoff, K.; Wickes, L.; Nehasil, S.; Kracker, L.; Lauermann, A.; Rosen, D.; Caldow, C.

    2015-12-01

    Southern California's deep-sea corals are diverse and abundant, but subject to multiple stressors, including corallivory, ocean acidification, and commercial bottom fishing. NOAA has surveyed these habitats using a remotely operated vehicle (ROV) since 2003. The ROV was equipped with high-resolution cameras to document deep-water groundfish and their habitat in a series of research expeditions from 2003 - 2011. Recent surveys 2011-2015 focused on in-situ measures of aragonite saturation and habitat mapping in notable habitats identified in previous years. Surveys mapped abundance and diversity of fishes and corals, as well as commercial fisheries landings and frequency of fishing gear. A novel priority setting algorithm was developed to identify hotspots of diversity and fishing intensity, and to determine where future conservation efforts may be warranted. High density coral aggregations identified in these analyses were also used to guide recent multibeam mapping efforts. The maps suggest a large extent of unexplored and unprotected hard-bottom habitat in the mesophotic zone and deep-sea reaches of Channel Islands National Marine Sanctuary.

  17. The Newberry Deep Drilling Project (NDDP)

    Science.gov (United States)

    Bonneville, A.; Cladouhos, T. T.; Petty, S.; Schultz, A.; Sorle, C.; Asanuma, H.; Friðleifsson, G. Ó.; Jaupart, C. P.; Moran, S. C.; de Natale, G.

    2017-12-01

    We present the arguments to drill a deep well to the ductile/brittle transition zone (T>400°C) at Newberry Volcano, central Oregon state, U.S.A. The main research goals are related to heat and mass transfer in the crust from the point of view of natural hazards and geothermal energy: enhanced geothermal systems (EGS; supercritical and beyond-brittle), volcanic hazards, mechanisms of magmatic intrusions, geomechanics close to a magmatic system, calibration of geophysical imaging techniques, and drilling in a high temperature environment. Drilling at Newberry will bring additional information to a very promising field of research initiated by ICDP in the Deep Drilling project in Iceland with IDDP-1 on Krafla in 2009, followed by IDDP-2 on the Reykjanes ridge in 2016, and the future Japan Beyond-Brittle project and Krafla Magma Testbed. Newberry Volcano contains one of the largest geothermal heat reservoirs in the western United States, extensively studied for the last 40 years. All the knowledge and experience collected make this an excellent choice for drilling a well that will reach high temperatures at relatively shallow depths (< 5000 m). The large conductive thermal anomaly (320°C at 3000 m depth) has already been well-characterized by extensive drilling and geophysical surveys. Drilling will extend current knowledge from the existing 3000 m deep boreholes at the sites into and through the brittle-ductile transition, approaching regions of partial melt such as lateral dykes. The key scientific questions that will form the basis of a full drilling proposal were addressed during an International Continental Drilling Program (ICDP) workshop held in Bend, Oregon, in September 2017. These questions, together with the strategic plan to address them, will be presented and discussed.

  18. Deep learning in TMVA Benchmarking Benchmarking TMVA DNN Integration of a Deep Autoencoder

    CERN Document Server

    Huwiler, Marc

    2017-01-01

    The TMVA library in ROOT is dedicated to multivariate analysis, and in particular offers numerous machine learning algorithms in a standardized framework. It is widely used in High Energy Physics for data analysis, mainly to perform regression and classification. To keep up to date with the state of the art in deep learning, a new deep learning module was being developed this summer, offering deep neural networks, convolutional neural networks, and autoencoders. TMVA did not yet have any autoencoder method, and the present project consists in implementing the TMVA autoencoder class based on the deep learning module. It also includes some benchmarking performed on the actual deep neural network implementation, in comparison to the Keras framework with TensorFlow and Theano backends.
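
    For orientation, the kind of Keras reference model such a benchmark compares against is only a few lines; the sketch below is a generic illustration (the layer sizes, toy data and reconstruction-error check are arbitrary choices, not the project's actual benchmark configuration).

      # Generic Keras autoencoder of the sort used as a benchmarking reference
      # (illustrative only; sizes and data are placeholders).
      import numpy as np
      from tensorflow import keras
      from tensorflow.keras import layers

      n_features = 20
      inputs = keras.Input(shape=(n_features,))
      encoded = layers.Dense(8, activation="relu")(inputs)            # bottleneck
      decoded = layers.Dense(n_features, activation="linear")(encoded)
      autoencoder = keras.Model(inputs, decoded)
      autoencoder.compile(optimizer="adam", loss="mse")

      x = np.random.normal(size=(1000, n_features)).astype("float32")  # toy data
      autoencoder.fit(x, x, epochs=5, batch_size=64, verbose=0)
      err = np.mean((autoencoder.predict(x, verbose=0) - x) ** 2)
      print(f"mean reconstruction error: {err:.4f}")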

  19. Deep subsurface microbial processes

    Science.gov (United States)

    Lovley, D.R.; Chapelle, F.H.

    1995-01-01

    Information on the microbiology of the deep subsurface is necessary in order to understand the factors controlling the rate and extent of the microbially catalyzed redox reactions that influence the geophysical properties of these environments. Furthermore, there is an increasing threat that deep aquifers, an important drinking water resource, may be contaminated by man's activities, and there is a need to predict the extent to which microbial activity may remediate such contamination. Metabolically active microorganisms can be recovered from a diversity of deep subsurface environments. The available evidence suggests that these microorganisms are responsible for catalyzing the oxidation of organic matter coupled to a variety of electron acceptors just as microorganisms do in surface sediments, but at much slower rates. The technical difficulties in aseptically sampling deep subsurface sediments and the fact that microbial processes in laboratory incubations of deep subsurface material often do not mimic in situ processes frequently necessitate that microbial activity in the deep subsurface be inferred through nonmicrobiological analyses of ground water. These approaches include measurements of dissolved H2, which can predict the predominant microbially catalyzed redox reactions in aquifers, as well as geochemical and groundwater flow modeling, which can be used to estimate the rates of microbial processes. Microorganisms recovered from the deep subsurface have the potential to affect the fate of toxic organics and inorganic contaminants in groundwater. Microbial activity also greatly influences the chemistry of many pristine groundwaters and contributes to such phenomena as porosity development in carbonate aquifers, accumulation of undesirably high concentrations of dissolved iron, and production of methane and hydrogen sulfide. Although the last decade has seen a dramatic increase in interest in deep subsurface microbiology, in comparison with the study of

  20. Reprint of - Deep-sea coral and hardbottom habitats on the west Florida slope, eastern Gulf of Mexico

    Science.gov (United States)

    Ross, Steve W.; Rhode, Mike; Brooke, Sandra

    2017-09-01

    Until recently, benthic habitats dominated by deep-sea corals (DSC) appeared to be less extensive on the slope of the Gulf of Mexico (GOM) than in the northeast Atlantic Ocean or off the southeastern US. There are relatively few bioherms (i.e., coral-built mounds) in the northern GOM, and most DSCs are attached to existing hard substrata (e.g., authigenically formed carbonate). The primary structure-forming, DSC in the GOM is Lophelia pertusa, but structure is also provided by other living and dead scleractinians, antipatharians (black corals), octocorals (gorgonians, soft corals), hydrocorals and sponges, as well as abundant rocky substrata. The best development of DSCs in the GOM was previously documented within Viosca Knoll oil and gas lease blocks 826 and 862/906 (north-central GOM) and on the Campeche Bank (southern GOM in Mexican waters). This paper documents extensive deep reef ecosystems composed of DSC and rocky hard-bottom recently surveyed on the West Florida Slope (WFS, eastern GOM) during six research cruises (2008-2012). Using multibeam sonar, CTD casts, and video from underwater vehicles, we describe the physical and oceanographic characteristics of these deep reefs and provide size or area estimates of deep coral and hardground habitats. The multibeam sonar analyses revealed hundreds of mounds and ridges, some of which were subsequently surveyed using underwater vehicles. Mounds and ridges in <525 m depths were usually capped with living coral colonies, dominated by L. pertusa. An extensive rocky scarp, running roughly north-south for at least 229 km, supported lower abundances of scleractinian corals than the mounds and ridges, despite an abundance of settlement substrata. Areal comparisons suggested that the WFS may exceed other parts of the GOM slope in extent of living deep coral coverage and other deep-reef habitat (dead coral and rock). The complex WFS region warrants additional studies to better understand the influences of oceanography and

  1. DeepSurv: personalized treatment recommender system using a Cox proportional hazards deep neural network.

    Science.gov (United States)

    Katzman, Jared L; Shaham, Uri; Cloninger, Alexander; Bates, Jonathan; Jiang, Tingting; Kluger, Yuval

    2018-02-26

    Medical practitioners use survival models to explore and understand the relationships between patients' covariates (e.g. clinical and genetic features) and the effectiveness of various treatment options. Standard survival models like the linear Cox proportional hazards model require extensive feature engineering or prior medical knowledge to model treatment interaction at an individual level. While nonlinear survival methods, such as neural networks and survival forests, can inherently model these high-level interaction terms, they have yet to be shown as effective treatment recommender systems. We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations. We perform a number of experiments training DeepSurv on simulated and real survival data. We demonstrate that DeepSurv performs as well as or better than other state-of-the-art survival models and validate that DeepSurv successfully models increasingly complex relationships between a patient's covariates and their risk of failure. We then show how DeepSurv models the relationship between a patient's features and effectiveness of different treatment options to show how DeepSurv can be used to provide individual treatment recommendations. Finally, we train DeepSurv on real clinical studies to demonstrate how its personalized treatment recommendations would increase the survival time of a set of patients. The predictive and modeling capabilities of DeepSurv will enable medical researchers to use deep neural networks as a tool in their exploration, understanding, and prediction of the effects of a patient's characteristics on their risk of failure.
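
    The core ingredient of a Cox-style network like the one described is the negative log partial likelihood used as the training loss. The sketch below shows one common way of writing it; it is illustrative only, not the authors' released code, and the tiny linear "network" and the name cox_partial_likelihood_loss are placeholders.

      # Sketch of the negative log Cox partial likelihood that drives a
      # DeepSurv-style network (one common formulation; not the authors' code).
      import torch

      def cox_partial_likelihood_loss(risk, time, event):
          """risk:  network output h(x), shape [n]
             time:  observed time, shape [n]
             event: 1.0 if failure observed, 0.0 if censored, shape [n]"""
          order = torch.argsort(time, descending=True)    # later times first
          risk, event = risk[order], event[order]
          log_risk_set = torch.logcumsumexp(risk, dim=0)  # log-sum over each risk set
          return -torch.sum((risk - log_risk_set) * event) / event.sum().clamp(min=1.0)

      # Tiny usage example with a linear stand-in for the deep network.
      x = torch.randn(8, 3)
      net = torch.nn.Linear(3, 1)
      loss = cox_partial_likelihood_loss(net(x).squeeze(-1),
                                         torch.rand(8),
                                         torch.randint(0, 2, (8,)).float())
      loss.backward()
      print(float(loss))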

  2. Pathogenesis of deep endometriosis.

    Science.gov (United States)

    Gordts, Stephan; Koninckx, Philippe; Brosens, Ivo

    2017-12-01

    The pathophysiology of (deep) endometriosis is still unclear. As originally suggested by Cullen, change the definition "deeper than 5 mm" to "adenomyosis externa." With the discovery of the old European literature on uterine bleeding in 5%-10% of the neonates and histologic evidence that the bleeding represents decidual shedding, it is postulated/hypothesized that endometrial stem/progenitor cells, implanted in the pelvic cavity after birth, may be at the origin of adolescent and even the occasionally premenarcheal pelvic endometriosis. Endometriosis in the adolescent is characterized by angiogenic and hemorrhagic peritoneal and ovarian lesions. The development of deep endometriosis at a later age suggests that deep infiltrating endometriosis is a delayed stage of endometriosis. Another hypothesis is that the endometriotic cell has undergone genetic or epigenetic changes and those specific changes determine the development into deep endometriosis. This is compatible with the hereditary aspects, and with the clonality of deep and cystic ovarian endometriosis. It explains the predisposition and an eventual causal effect by dioxin or radiation. Specific genetic/epigenetic changes could explain the various expressions and thus typical, cystic, and deep endometriosis become three different diseases. Subtle lesions are not a disease until epi(genetic) changes occur. A classification should reflect that deep endometriosis is a specific disease. In conclusion the pathophysiology of deep endometriosis remains debated and the mechanisms of disease progression, as well as the role of genetics and epigenetics in the process, still needs to be unraveled. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  3. European survey on the opinion and use of micronutrition in age-related macular degeneration: 10 years on from the Age-Related Eye Disease Study

    Directory of Open Access Journals (Sweden)

    Aslam T

    2014-10-01

    Full Text Available Tariq Aslam,1 Cécile Delcourt,2 Frank Holz,3 Alfredo García-Layana,4 Anita Leys,5 Rufino M Silva,6 Eric Souied7 1Manchester Royal Eye Hospital, Manchester, UK; 2University of Bordeaux, Bordeaux, France; 3University of Bonn, Bonn, Germany; 4Clínica Universidad de Navarra, Pamplona, Spain; 5University Hospitals, Leuven, Belgium; 6University of Coimbra, Coimbra, Portugal; 7Université Paris Est Créteil, Créteil, France. Purpose: To evaluate ophthalmologists' opinion of, and use of, micronutritional dietary supplements 10 years after publication of the first Age-Related Eye Disease Study (AREDS) study. Methods: Participation was solicited from 4,000 European ophthalmologists. Responding physicians were screened, and those treating at least 40 patients with age-related macular degeneration (AMD) per month and prescribing nutrition supplements at least 4 times per month were admitted and completed a 40-item questionnaire. Results: The surveyed sample included 112 general ophthalmologists and 104 retinal specialists. Most nutritional supplements (46%) were initiated when early/intermediate AMD was confirmed, although 18% were initiated on confirmation of neovascular AMD. Clinical studies were well known: 90% were aware of AREDS, with 88% aware of AREDS1 and 36% aware of the as-yet-unpublished AREDS2 studies. Respondents considered lutein, zeaxanthin, zinc, omega-3, and vitamins to be the most important components of nutritional supplements, with the results of AREDS2 already having been taken into consideration by many. Ophthalmologists anticipate more scientific studies as well as improved product quality but identify cost as a barrier to wider uptake. Conclusion: Micronutrition is now part of the routine management of AMD for many ophthalmologists. Ophthalmologists choosing to use nutritional supplements are well-informed regarding current scientific studies. Keywords: age-related macular degeneration, micronutrition, nutritional

  4. DeepSpark: A Spark-Based Distributed Deep Learning Framework for Commodity Clusters

    OpenAIRE

    Kim, Hanjoo; Park, Jaehong; Jang, Jaehee; Yoon, Sungroh

    2016-01-01

    The increasing complexity of deep neural networks (DNNs) has made it challenging to exploit existing large-scale data processing pipelines for handling massive data and parameters involved in DNN training. Distributed computing platforms and GPGPU-based acceleration provide a mainstream solution to this computational challenge. In this paper, we propose DeepSpark, a distributed and parallel deep learning framework that exploits Apache Spark on commodity clusters. To support parallel operation...
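
    The data-parallel pattern such frameworks build on (broadcast the current parameters, let each worker train on its shard, then average) can be illustrated with plain PySpark, as below. This is a conceptual toy, not DeepSpark's API; the least-squares objective, shard sizes and the local_sgd helper are assumptions.

      # Toy synchronous parameter-averaging rounds on Spark (conceptual only;
      # this is the general pattern, not DeepSpark's interface).
      import numpy as np
      from pyspark import SparkContext

      def local_sgd(shard, w, lr=0.1, epochs=5):
          """One worker: a few epochs of least-squares SGD on its data shard."""
          x, y = shard
          for _ in range(epochs):
              grad = 2.0 * x.T @ (x @ w - y) / len(y)
              w = w - lr * grad
          return w

      sc = SparkContext(appName="toy-parameter-averaging")
      rng = np.random.default_rng(0)
      shards = [(rng.normal(size=(100, 3)), rng.normal(size=100)) for _ in range(4)]
      w = np.zeros(3)

      for _ in range(10):          # broadcast -> local training -> average
          w_b = sc.broadcast(w)
          updates = (sc.parallelize(shards, numSlices=len(shards))
                       .map(lambda s: local_sgd(s, w_b.value))
                       .collect())
          w = np.mean(updates, axis=0)
      sc.stop()
      print(w)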

  5. The application of the CSAMT method in the tectonic transformation of the deep-level ore exploration in the Shandongkeng area in Nanxiong basin

    International Nuclear Information System (INIS)

    Xu Zhan

    2010-01-01

    With increasing national policy support for mining exploration, uranium exploration has also entered its second 'spring'. The theme of the new round of exploration is prospecting for deep mineral resources; understanding changes in the deep structure of a mining area is therefore the prerequisite for carrying out survey work. This article briefly describes the working principle and characteristics of the CSAMT method. Its application to the tectonic transformation study for deep-level exploration in the Shandongkeng area of the Nanxiong basin shows that the method is well suited and effective for investigating deep geological targets, and it provides a design basis for deep-level mining exploration. (authors)

  6. The HST/ACS Coma Cluster Survey : VI. Colour gradients in giant and dwarf early-type galaxies

    NARCIS (Netherlands)

    den Brok, M.; Peletier, R. F.; Valentijn, E. A.; Balcells, Marc; Carter, D.; Erwin, P.; Ferguson, H. C.; Goudfrooij, P.; Graham, A. W.; Hammer, D.; Lucey, J. R.; Trentham, N.; Guzman, R.; Hoyos, C.; Kleijn, G. Verdoes; Jogee, S.; Karick, A. M.; Marinova, I.; Mouhcine, M.; Weinzirl, T.

    Using deep, high-spatial-resolution imaging from the Hubble Space Telescope/Advanced Camera for Surveys (HST/ACS) Coma Cluster Treasury Survey, we determine colour profiles of early-type galaxies in the Coma cluster. From 176 galaxies brighter than M_F814W(AB) = -15 mag that are either

  7. Making Data Mobile: The Hubble Deep Field Academy iPad app

    Science.gov (United States)

    Eisenhamer, Bonnie; Cordes, K.; Davis, S.; Eisenhamer, J.

    2013-01-01

    Many school districts are purchasing iPads for educators and students to use as learning tools in the classroom. Educators often prefer these devices to desktop and laptop computers because they offer portability and an intuitive design, while having a larger screen size when compared to smart phones. As a result, we began investigating the potential of adapting online activities for use on Apple’s iPad to enhance the dissemination and usage of these activities in instructional settings while continuing to meet educators’ needs. As a pilot effort, we are developing an iPad app for the “Hubble Deep Field Academy” - an activity that is currently available online and commonly used by middle school educators. The Hubble Deep Field Academy app features the HDF-North image while centering on the theme of how scientists use light to explore and study the universe. It also includes features such as embedded links to vocabulary, images and videos, teacher background materials, and readings about Hubble’s other deep field surveys. Our goal is to impact students’ engagement in STEM-related activities, while enhancing educators’ usage of NASA data via new and innovative media. We also hope to develop and share lessons learned with the E/PO community that can be used to support similar projects. We plan to test the Hubble Deep Field Academy app during the school year to determine if this new activity format is beneficial to the education community.

  8. College Seniors' Plans for Graduate School: Do Deep Approaches Learning and Holland Academic Environments Matter?

    Science.gov (United States)

    Rocconi, Louis M.; Ribera, Amy K.; Nelson Laird, Thomas F.

    2015-01-01

    This study examines the extent to which college seniors' plans for graduate school are related to their tendency to engage in deep approaches to learning (DAL) and their academic environments (majors) as classified by Holland type. Using data from the National Survey of Student Engagement, we analyzed responses from over 116,000 seniors attending…

  9. DeepPVP: phenotype-based prioritization of causative variants using deep learning

    KAUST Repository

    Boudellioua, Imene; Kulmanov, Maxat; Schofield, Paul N; Gkoutos, Georgios V; Hoehndorf, Robert

    2018-01-01

    phenotype-based methods that use similar features. DeepPVP is freely available at https://github.com/bio-ontology-research-group/phenomenet-vp Conclusions: DeepPVP further improves on existing variant prioritization methods both in terms of speed as well

  10. Improved evaluations and integral data testing for FENDL. Summary report of the IAEA advisory group meeting held in Garching, Germany, 12 to 16 September 1994

    International Nuclear Information System (INIS)

    Ganesan, S.

    1994-12-01

    The IAEA Nuclear Data Section, in co-operation with several national nuclear data centres and research groups, has created the first version of an internationally available Fusion Evaluated Nuclear Data Library (FENDL-1). The FENDL library has been selected to serve as a comprehensive source of processed and tested nuclear data tailored to the requirements of the Engineering and Development Activities (EDA) of the International Thermonuclear Experimental Reactor (ITER) Project and other fusion-related development projects. Within the scope of the FENDL project, the International Atomic Energy Agency performs the task of coordinating the assembling, processing and testing of a comprehensive, fusion-relevant Fusion Evaluated Nuclear Data Library with unrestricted international distribution. The present report contains the summary of the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'', 12-16 September 1994 hosted by the Max-Planck-Institut fuer Plasmaphysik, Garching, Germany. The report presents the current status of the FENDL activity and the future work plans in the form of conclusions and recommendations of the four Working Groups of the Advisory Group Meeting on (1) Basic Evaluations towards FENDL/E-2.0 for ITER Design, (2) Experimental and Calculational Benchmarks on Fusion Neutronics for FENDL Validation, (3) Production and Interfacing of FENDL Libraries to ITER Design, and, (4) Activation. (author)

  11. DeepARG: a deep learning approach for predicting antibiotic resistance genes from metagenomic data.

    Science.gov (United States)

    Arango-Argoty, Gustavo; Garner, Emily; Pruden, Amy; Heath, Lenwood S; Vikesland, Peter; Zhang, Liqing

    2018-02-01

    Growing concerns about increasing rates of antibiotic resistance call for expanded and comprehensive global monitoring. Advancing methods for monitoring of environmental media (e.g., wastewater, agricultural waste, food, and water) is especially needed for identifying potential resources of novel antibiotic resistance genes (ARGs), hot spots for gene exchange, and as pathways for the spread of ARGs and human exposure. Next-generation sequencing now enables direct access and profiling of the total metagenomic DNA pool, where ARGs are typically identified or predicted based on the "best hits" of sequence searches against existing databases. Unfortunately, this approach produces a high rate of false negatives. To address such limitations, we propose here a deep learning approach, taking into account a dissimilarity matrix created using all known categories of ARGs. Two deep learning models, DeepARG-SS and DeepARG-LS, were constructed for short read sequences and full gene length sequences, respectively. Evaluation of the deep learning models over 30 antibiotic resistance categories demonstrates that the DeepARG models can predict ARGs with both high precision (> 0.97) and recall (> 0.90). The models displayed an advantage over the typical best hit approach, yielding consistently lower false negative rates and thus higher overall recall (> 0.9). As more data become available for under-represented ARG categories, the DeepARG models' performance can be expected to be further enhanced due to the nature of the underlying neural networks. Our newly developed ARG database, DeepARG-DB, encompasses ARGs predicted with a high degree of confidence and extensive manual inspection, greatly expanding current ARG repositories. The deep learning models developed here offer more accurate antimicrobial resistance annotation relative to current bioinformatics practice. DeepARG does not require strict cutoffs, which enables identification of a much broader diversity of ARGs. The
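
    One way to picture the approach described, classifying a sequence into a resistance category from its whole profile of alignment scores against known ARG references rather than from a single best hit, is the toy dense classifier below. It is not the released DeepARG-SS/DeepARG-LS models; the feature construction, layer sizes and random placeholder data are assumptions for illustration.

      # Conceptual sketch only (not the released DeepARG models): a similarity
      # profile against known ARG references is fed to a small dense classifier.
      import numpy as np
      from tensorflow import keras
      from tensorflow.keras import layers

      n_refs, n_categories, n_train = 4000, 30, 2000   # placeholder sizes

      # Placeholder features: one (dis)similarity profile per training sequence.
      x = np.random.random((n_train, n_refs)).astype("float32")
      y = np.random.randint(0, n_categories, size=n_train)

      model = keras.Sequential([
          layers.Dense(256, activation="relu", input_shape=(n_refs,)),
          layers.Dense(128, activation="relu"),
          layers.Dense(n_categories, activation="softmax"),
      ])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      model.fit(x, y, epochs=3, batch_size=128, verbose=0)

      # Predicted resistance category for one new (placeholder) profile.
      print(int(model.predict(x[:1], verbose=0).argmax()))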

  12. Radon concentration distributions in shallow and deep groundwater around the Tachikawa fault zone.

    Science.gov (United States)

    Tsunomori, Fumiaki; Shimodate, Tomoya; Ide, Tomoki; Tanaka, Hidemi

    2017-06-01

    Groundwater radon concentrations around the Tachikawa fault zone were surveyed. The radon concentrations in shallow groundwater samples around the Tachikawa fault segment are comparable to those reported in previous studies. The characteristics of the radon concentrations on both sides of the segment are considered to have changed in response to the decrease in groundwater recharge caused by urbanization on the eastern side of the segment. The radon concentrations in deep groundwater samples collected around the Naguri and the Tachikawa fault segments are the same as those of shallow groundwater samples. However, the radon concentrations in deep groundwater samples collected from the bedrock beside the Naguri and Tachikawa fault segments are markedly higher than the radon concentrations expected from the geology of the Kanto Plain. This disparity can be explained by the development of fracture zones spreading on both sides of the two segments. The radon concentration distribution for deep groundwater samples from the Naguri and the Tachikawa fault segments suggests that a fault exists even at the southern part of the Tachikawa fault line. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Stimulation Technologies for Deep Well Completions

    Energy Technology Data Exchange (ETDEWEB)

    None

    2003-09-30

    The Department of Energy (DOE) is sponsoring the Deep Trek Program targeted at improving the economics of drilling and completing deep gas wells. Under the DOE program, Pinnacle Technologies is conducting a study to evaluate the stimulation of deep wells. The objective of the project is to assess U.S. deep well drilling & stimulation activity, review rock mechanics & fracture growth in deep, high pressure/temperature wells and evaluate stimulation technology in several key deep plays. An assessment of historical deep gas well drilling activity and forecast of future trends was completed during the first six months of the project; this segment of the project was covered in Technical Project Report No. 1. The second progress report covers the next six months of the project during which efforts were primarily split between summarizing rock mechanics and fracture growth in deep reservoirs and contacting operators about case studies of deep gas well stimulation.

  14. DeepQA: improving the estimation of single protein model quality with deep belief networks.

    Science.gov (United States)

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-12-05

    Protein quality assessment (QA) useful for ranking and selecting protein models has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method, DeepQA, based on a deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physico-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single model quality assessment and protein structure prediction. The source code, executable, documentation, and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/.

  15. The Sloan Digital Sky Survey-II Supernova Survey: Technical Summary

    Energy Technology Data Exchange (ETDEWEB)

    Frieman, Joshua A.; /Fermilab /KICP, Chicago /Chicago U., Astron. Astrophys. Ctr.; Bassett, Bruce; /Cape Town U. /South African Astron. Observ.; Becker, Andrew; /Washington; Choi, Changsu; /Seoul Natl. U.; Cinabro, David; /Wayne State U.; DeJongh, Don Frederic; /Fermilab; Depoy, Darren L.; /Ohio State U.; Doi, Mamoru; /Tokyo U.; Garnavich, Peter M.; /Notre Dame U.; Hogan, Craig J.; /Washington U., Seattle, Astron. Dept.; Holtzman, Jon; /New Mexico State U.; Im, Myungshin; /Seoul Natl. U.; Jha, Saurabh; /Stanford U., Phys. Dept.; Konishi, Kohki; /Tokyo U.; Lampeitl, Hubert; /Baltimore, Space Telescope Sci.; Marriner, John; /Fermilab; Marshall, Jennifer L.; /Ohio State U.; McGinnis,; /Fermilab; Miknaitis, Gajus; /Fermilab; Nichol, Robert C.; /Portsmouth U.; Prieto, Jose Luis; /Ohio State U. /Rochester Inst. Tech. /Stanford U., Phys. Dept. /Pennsylvania U.

    2007-09-14

    The Sloan Digital Sky Survey-II (SDSS-II) has embarked on a multi-year project to identify and measure light curves for intermediate-redshift (0.05 < z < 0.35) Type Ia supernovae (SNe Ia) using repeated five-band (ugriz) imaging over an area of 300 sq. deg. The survey region is a stripe 2.5 degrees wide centered on the celestial equator in the Southern Galactic Cap that has been imaged numerous times in earlier years, enabling construction of a deep reference image for discovery of new objects. Supernova imaging observations are being acquired between 1 September and 30 November of 2005-7. During the first two seasons, each region was imaged on average every five nights. Spectroscopic follow-up observations to determine supernova type and redshift are carried out on a large number of telescopes. In its first two three-month seasons, the survey has discovered and measured light curves for 327 spectroscopically confirmed SNe Ia, 30 probable SNe Ia, 14 confirmed SNe Ib/c, 32 confirmed SNe II, plus a large number of photometrically identified SNe Ia, 94 of which have host-galaxy spectra taken so far. This paper provides an overview of the project and briefly describes the observations completed during the first two seasons of operation.

  16. A survey of volcano deformation in the central Andes using InSAR: Evidence for deep, slow inflation

    Science.gov (United States)

    Pritchard, M. E.; Simons, M.

    2001-12-01

    We use interferometric synthetic aperture radar (InSAR) to survey about 50 volcanos of the central Andes (15-27° S) for deformation during the 1992-2000 time interval. Because of the remote location of these volcanos, the activity of most is poorly constrained. Using the ERS-1/2 C-band radars (5.6 cm), we observe good interferometric correlation south of about 21° S, but poor correlation north of that latitude, especially in southern Peru. This variation is presumably related to regional climate variations. Our survey reveals broad (tens of km), roughly axisymmetric deformation at 2 volcanic centers with no previously documented deformation. At Uturuncu volcano, in southwestern Bolivia, the deformation rate can be constrained with radar data from several satellite tracks and is about 1 cm/year between 1992 and 2000. We find a second source of volcanic deformation located between Lastarria and Cordon del Azufre volcanos near the Chile/Argentina border. There are fewer radar data to constrain the deformation in this area, but the rate is also about 1 cm/yr between 1996 and 2000. While the spatial character of the deformation field appears to be affected by atmosphere at both locations, we do not think that the entire signal is atmospheric, because the signal is observed in several interferograms and nearby edifices do not show similar patterns. The deformation signal appears to be time-variable, although it is difficult to determine whether this is due to real variations in the deformation source or atmospheric effects. We model the deformation with both a uniform point source of inflation and a tri-axial point-source ellipsoid, and compare both elastic half-space and layered-space models. We also explore the effects of local topography upon the deformation field using the method of Williams and Wadge (1998). We invert for source parameters using the global search Neighborhood Algorithm of Sambridge (1998). Preliminary results indicate that the sources at both
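
    The uniform point source of inflation mentioned above is usually the Mogi half-space approximation; in its commonly quoted volume-change form the surface uplift is u_z = (1 - nu) * dV * d / (pi * (d^2 + r^2)^(3/2)). The sketch below simply evaluates that form for placeholder source values; it is a generic illustration, not the authors' parameterization or their Neighborhood Algorithm inversion.

      # Standard point-source ("Mogi") half-space formulas in volume-change form
      # (textbook approximation; not the authors' exact model setup).
      import numpy as np

      def mogi_displacement(r, depth, dvol, nu=0.25):
          """Surface displacement at radial distance r [m] from a point source of
          volume change dvol [m^3] at depth [m]; returns (u_radial, u_vertical)."""
          R3 = (r ** 2 + depth ** 2) ** 1.5
          u_r = (1.0 - nu) / np.pi * dvol * r / R3
          u_z = (1.0 - nu) / np.pi * dvol * depth / R3
          return u_r, u_z

      # Placeholder values chosen to give roughly 1 cm of peak uplift above a deep source.
      r = np.linspace(0.0, 30e3, 7)                        # radial distance [m]
      _, uz = mogi_displacement(r, depth=17e3, dvol=1.2e7)
      print(np.round(uz * 1000.0, 2))                      # uplift in mm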

  17. The Dark Energy Survey Image Processing Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Morganson, E.; et al.

    2018-01-09

    The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a 5000 square degree survey of the southern sky in five optical bands (g,r,i,z,Y) to a depth of ~24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g,r,i,z) over 27 square degrees. DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine if they need to be retaken. Difference imaging and transient source detection are also performed in the time domain component nightly. On a bi-annual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.

  18. Stimulation Technologies for Deep Well Completions

    Energy Technology Data Exchange (ETDEWEB)

    Stephen Wolhart

    2005-06-30

    The Department of Energy (DOE) is sponsoring the Deep Trek Program targeted at improving the economics of drilling and completing deep gas wells. Under the DOE program, Pinnacle Technologies conducted a study to evaluate the stimulation of deep wells. The objective of the project was to review U.S. deep well drilling and stimulation activity, review rock mechanics and fracture growth in deep, high-pressure/temperature wells and evaluate stimulation technology in several key deep plays. This report documents results from this project.

  19. Root Transcriptomic Analysis Revealing the Importance of Energy Metabolism to the Development of Deep Roots in Rice (Oryza sativa L.)

    Directory of Open Access Journals (Sweden)

    Qiaojun Lou

    2017-07-01

    Full Text Available Drought is the most serious abiotic stress limiting rice production, and deep root is the key contributor to drought avoidance. However, the genetic mechanism regulating the development of deep roots is largely unknown. In this study, the transcriptomes of 74 root samples from 37 rice varieties, representing the extreme genotypes of shallow or deep rooting, were surveyed by RNA-seq. The 13,242 differentially expressed genes (DEGs) between deep rooting and shallow rooting varieties (H vs. L) were enriched in the pathway of genetic information processing and metabolism, while the 1,052 DEGs between the deep roots and shallow roots from each of the plants (D vs. S) were significantly enriched in metabolic pathways especially energy metabolism. Ten quantitative trait transcripts (QTTs) were identified and some were involved in energy metabolism. Forty-nine candidate DEGs were confirmed by qRT-PCR and microarray. Through weighted gene co-expression network analysis (WGCNA), we found 18 hub genes. Surprisingly, all these hub genes expressed higher in deep roots than in shallow roots, furthermore half of them functioned in energy metabolism. We also estimated that the ATP production in the deep roots was faster than shallow roots. Our results provided a lot of reliable candidate genes to improve deep rooting, and firstly highlight the importance of energy metabolism to the development of deep roots.

  20. Root Transcriptomic Analysis Revealing the Importance of Energy Metabolism to the Development of Deep Roots in Rice (Oryza sativa L.).

    Science.gov (United States)

    Lou, Qiaojun; Chen, Liang; Mei, Hanwei; Xu, Kai; Wei, Haibin; Feng, Fangjun; Li, Tiemei; Pang, Xiaomeng; Shi, Caiping; Luo, Lijun; Zhong, Yang

    2017-01-01

    Drought is the most serious abiotic stress limiting rice production, and deep root is the key contributor to drought avoidance. However, the genetic mechanism regulating the development of deep roots is largely unknown. In this study, the transcriptomes of 74 root samples from 37 rice varieties, representing the extreme genotypes of shallow or deep rooting, were surveyed by RNA-seq. The 13,242 differentially expressed genes (DEGs) between deep rooting and shallow rooting varieties (H vs. L) were enriched in the pathway of genetic information processing and metabolism, while the 1,052 DEGs between the deep roots and shallow roots from each of the plants (D vs. S) were significantly enriched in metabolic pathways especially energy metabolism. Ten quantitative trait transcripts (QTTs) were identified and some were involved in energy metabolism. Forty-nine candidate DEGs were confirmed by qRT-PCR and microarray. Through weighted gene co-expression network analysis (WGCNA), we found 18 hub genes. Surprisingly, all these hub genes expressed higher in deep roots than in shallow roots, furthermore half of them functioned in energy metabolism. We also estimated that the ATP production in the deep roots was faster than shallow roots. Our results provided a lot of reliable candidate genes to improve deep rooting, and firstly highlight the importance of energy metabolism to the development of deep roots.

  1. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    Science.gov (United States)

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.
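
    The kind of window-scoring convolutional classifier such a pipeline relies on can be sketched briefly. The toy model below is illustrative only, not DeepPicker's released architecture or its cross-molecule training procedure, and the patch size, stride and score threshold are arbitrary placeholder choices.

      # Toy CNN patch scorer for particle picking (illustrative; not DeepPicker's
      # released architecture or training procedure).
      import torch
      import torch.nn as nn

      class PatchScorer(nn.Module):
          """Scores a square micrograph patch as particle vs. background."""
          def __init__(self, patch=64):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 8, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(8, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.head = nn.Linear(16 * (patch // 4) ** 2, 1)

          def forward(self, x):
              return torch.sigmoid(self.head(self.features(x).flatten(1)))

      # Slide the scorer over a placeholder micrograph and keep high-scoring windows.
      scorer = PatchScorer()
      micrograph = torch.randn(1, 1, 512, 512)                   # stand-in image
      patches = micrograph.unfold(2, 64, 32).unfold(3, 64, 32)   # stride-32 windows
      patches = patches.contiguous().view(-1, 1, 64, 64)
      scores = scorer(patches).view(-1)
      candidates = (scores > 0.9).nonzero().flatten()
      print(f"{len(candidates)} candidate particle windows")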

  2. A SURVEY OF THE SPECIES AND DISTRIBUTION OF AEDES AT SEAPORTS AND AIRPORTS OF IRIAN JAYA

    Directory of Open Access Journals (Sweden)

    Sumengen Sumengen

    2012-09-01

    Full Text Available Although dengue haemorrhagic fever occurs throughout Southeast Asia, including Indonesia, it has never been reported in the province of Irian Jaya. The main vector of the disease is A. (Stegomya) aegypti, although A. (S.) albopictus can also act as a vector. The other most important disease that can be transmitted by A. aegypti is yellow fever, whose virus is found only in Africa and the Americas. Given the very close geographical proximity and the regular sea and air communications with countries such as the Philippines, Thailand, Singapore, Vietnam, India, Ceylon and Indonesia, both diseases could at any time infect the population of Irian Jaya. To help determine whether such infection could occur, a preliminary survey was carried out from September to December 1968 to establish which Aedes species are present and how they are distributed at the most important seaports and airports of Irian Jaya. The survey was conducted by collecting and examining mosquitoes and larvae found in every building in the seaport and airport areas. The survey found seven Aedes species: A. aegypti, A. albopictus, A. (S.) scutellaris, A. (Finlaya) koehi, A. (Ochlerotatus) vigilax, A. (S.) alboleneatus, and A. (F.) novalbitarsis. Although Van Den Assem & Bonne Wepster (1964) stated that A. aegypti had still not been found in most of Irian Jaya, in the present survey A. aegypti was found at 9 of the 11 seaports and airports examined. The appearance of A. aegypti in several towns of Irian Jaya in 1968 may have been caused by the transfer of the vector from other regions by ships and aircraft, which carry it from one area to another.

  3. DeepBase: annotation and discovery of microRNAs and other noncoding RNAs from deep-sequencing data.

    Science.gov (United States)

    Yang, Jian-Hua; Qu, Liang-Hu

    2012-01-01

    Recent advances in high-throughput deep-sequencing technology have produced large numbers of short and long RNA sequences and enabled the detection and profiling of known and novel microRNAs (miRNAs) and other noncoding RNAs (ncRNAs) at unprecedented sensitivity and depth. In this chapter, we describe the use of deepBase, a database that we have developed to integrate all public deep-sequencing data and to facilitate the comprehensive annotation and discovery of miRNAs and other ncRNAs from these data. deepBase provides an integrative, interactive, and versatile web graphical interface to evaluate miRBase-annotated miRNA genes and other known ncRNAs, explore the expression patterns of miRNAs and other ncRNAs, and discover novel miRNAs and other ncRNAs from deep-sequencing data. deepBase also provides a deepView genome browser to comparatively analyze these data at multiple levels. deepBase is available at http://deepbase.sysu.edu.cn/.

  4. Deep Borehole Disposal as an Alternative Concept to Deep Geological Disposal

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jongyoul; Lee, Minsoo; Choi, Heuijoo; Kim, Kyungsu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this paper, the general concept and key technologies for deep borehole disposal of spent fuels or HLW, as an alternative to the mined geological disposal method, were reviewed. An analysis of the distance between boreholes for the disposal of HLW was then carried out. Based on the results, a disposal area was estimated approximately and compared with that of mined geological disposal. These results will be used as input for analyses of the applicability of DBD in Korea. The disposal safety of the mined system has been demonstrated in underground research laboratories, and some advanced countries such as Finland and Sweden are implementing their disposal projects at the commercial stage. However, if spent fuels or high-level radioactive wastes can be disposed of at depths of 3-5 km in more stable rock formations, this offers several advantages. Therefore, as an alternative to the mined deep geological disposal concept (DGD), very deep borehole disposal (DBD) technology is under consideration in a number of countries on account of its outstanding safety and cost effectiveness. The key technologies, such as the drilling of large-diameter boreholes, packaging and emplacement, sealing, and performance/safety analyses, together with their challenges in the development of a deep borehole disposal system, were analyzed. In addition, a very preliminary deep borehole disposal concept, including a disposal canister concept, was developed for the nuclear environment in Korea.

  5. Deep Borehole Disposal as an Alternative Concept to Deep Geological Disposal

    International Nuclear Information System (INIS)

    Lee, Jongyoul; Lee, Minsoo; Choi, Heuijoo; Kim, Kyungsu

    2016-01-01

    In this paper, the general concept and key technologies for deep borehole disposal of spent fuels or HLW, as an alternative to the mined geological disposal method, were reviewed. An analysis of the required distance between boreholes for the disposal of HLW was then carried out; based on the results, the disposal area was estimated approximately and compared with that of mined geological disposal. These results will be used as input for analyses of the applicability of DBD in Korea. The disposal safety of the mined system has been demonstrated in underground research laboratories, and some advanced countries such as Finland and Sweden are implementing their disposal projects at the commercial stage. However, if spent fuels or high-level radioactive wastes can be disposed of at depths of 3-5 km in more stable rock formations, several advantages arise. Therefore, as an alternative to the mined deep geological disposal concept (DGD), very deep borehole disposal (DBD) technology is under consideration in a number of countries because of its outstanding safety and cost effectiveness. The key technologies, such as drilling of large-diameter boreholes, packaging and emplacement, sealing, and performance/safety analyses, and their challenges in the development of a deep borehole disposal system, were analyzed. Also, a very preliminary deep borehole disposal concept, including a disposal canister concept, was developed according to the nuclear environment in Korea.

  6. Optical colours of AGN in the Extended Chandra Deep Field South: Obscured black holes in early type galaxies

    OpenAIRE

    Rovilos, E.; Georgantopoulos, I.

    2007-01-01

    We investigate the optical colours of X-ray sources from the Extended Chandra Deep Field South (ECDFS) using photometry from the COMBO-17 survey, aiming to explore AGN - galaxy feedback models. The X-ray sources populate both the "blue" and the "red sequence" on the colour-magnitude diagram. However, sources in the "red sequence" appear systematically more obscured. HST imaging from the GEMS survey demonstrates that the nucleus does not affect significantly the observed colours, and the...

  7. AGN Populations in Large-volume X-Ray Surveys: Photometric Redshifts and Population Types Found in the Stripe 82X Survey

    Science.gov (United States)

    Ananna, Tonima Tasnin; Salvato, Mara; LaMassa, Stephanie; Urry, C. Megan; Cappelluti, Nico; Cardamone, Carolin; Civano, Francesca; Farrah, Duncan; Gilfanov, Marat; Glikman, Eilat; Hamilton, Mark; Kirkpatrick, Allison; Lanzuisi, Giorgio; Marchesi, Stefano; Merloni, Andrea; Nandra, Kirpal; Natarajan, Priyamvada; Richards, Gordon T.; Timlin, John

    2017-11-01

    Multiwavelength surveys covering large sky volumes are necessary to obtain an accurate census of rare objects such as high-luminosity and/or high-redshift active galactic nuclei (AGNs). Stripe 82X is a 31.3 deg² X-ray survey with Chandra and XMM-Newton observations overlapping the legacy Sloan Digital Sky Survey Stripe 82 field, which has a rich investment of multiwavelength coverage from the ultraviolet to the radio. The wide-area nature of this survey presents new challenges for photometric redshifts for AGNs compared to previous work on narrow-deep fields because it probes different populations of objects that need to be identified and represented in the library of templates. Here we present an updated X-ray plus multiwavelength matched catalog, including Spitzer counterparts, and estimated photometric redshifts for 5961 (96% of a total of 6181) X-ray sources that have a normalized median absolute deviation, σ_NMAD = 0.06, and an outlier fraction, η = 13.7%. The populations found in this survey and the template libraries used for photometric redshifts provide important guiding principles for upcoming large-area surveys such as eROSITA and 3XMM (in X-ray) and the Large Synoptic Survey Telescope (optical).
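
    The two quality metrics quoted for the photometric redshifts, σ_NMAD and the outlier fraction η, follow standard definitions that are easy to compute; the sketch below uses the commonly adopted conventions (1.48 times the normalized median absolute deviation of Δz/(1+z_spec), with outliers defined by |Δz|/(1+z_spec) > 0.15), which may differ in detail from the paper's exact choices.

```python
# Sketch of commonly used photo-z quality metrics; the exact convention in the paper
# may differ slightly (e.g. whether the median offset is subtracted). Inputs are
# arrays of photometric and spectroscopic redshifts.
import numpy as np

def photoz_metrics(z_phot: np.ndarray, z_spec: np.ndarray, outlier_cut: float = 0.15):
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    sigma_nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))
    outlier_fraction = np.mean(np.abs(dz) > outlier_cut)
    return sigma_nmad, outlier_fraction

# Toy example with made-up redshifts.
rng = np.random.default_rng(7)
z_spec = rng.uniform(0.1, 3.0, size=1000)
z_phot = z_spec + 0.05 * (1 + z_spec) * rng.standard_normal(1000)
sigma, eta = photoz_metrics(z_phot, z_spec)
print(f"sigma_NMAD = {sigma:.3f}, outlier fraction = {eta:.1%}")
```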

  8. New DSH planetary nebulae and candidates from optical and infrared surveys

    International Nuclear Information System (INIS)

    Kronberger, Matthias; Jacoby, George H; Alves, Filipe; Patchick, Dana; Parker, Quentin A; Bojicic, Ivan; Frew, David J; Acker, Agnes; Eigenthaler, Paul; Harmer, Dianne; Reid, Warren; Schedler, Johannes

    2016-01-01

    To date, the planetary nebula (PN) survey of the Deep Sky Hunters collaboration has led to the detection of more than 250 previously unknown candidate planetary nebulae (PNe). About 60% of them were found during the past two years and are expected to be true, likely or possible PNe because careful vetting has already thrown out more doubtful objects. The majority of the new PN candidates are located within the boundaries of the SHS and IPHAS Hα surveys and were discovered by combining MIR data from the Wide-field Infrared Survey Explorer (WISE) with optical data from the IPHAS, SHS and DSS surveys, and UV data from the Galaxy Evolution Explorer (GALEX). (paper)

  9. YouTube as a potential source of information on deep venous thrombosis.

    Science.gov (United States)

    Bademci, Mehmet Ş; Yazman, Serkan; Güneş, Tevfik; Ocakoglu, Gokhan; Tayfur, Kaptanderya; Gokalp, Orhan

    2017-09-01

    Background: No work has been reported on the use of video websites to learn about deep vein thrombosis or on their educational value. We examined the characteristics and scientific accuracy of videos related to deep vein thrombosis on YouTube. Methods: YouTube was searched without filters using the key words 'deep vein thrombosis' and 'leg vein clot' in June 2016. The videos were divided into three groups according to their scientific content, accuracy, and currency: useful, partly useful, and useless. Results: Of the 1200 videos watched, 715 (59.58%) were excluded according to the exclusion criteria. Although the largest share of the uploaded videos (22.9%, n = 111) was created by physicians, the number of views for website-based videos was significantly higher (p = 0.002). When the uploaded videos were assessed in terms of their usefulness, videos from physicians and hospitals were statistically more useful than other videos (p < 0.001). Conclusions: For videos created by medical professionals to be of higher quality, we believe they should be more up-to-date and comprehensive, and contain animations about treatment modalities and early diagnosis in particular.

  10. Deep Vein Thrombosis

    African Journals Online (AJOL)

    OWNER

    Deep Vein Thrombosis: Risk Factors and Prevention in Surgical Patients. Deep Vein ... preventable morbidity and mortality in hospitalized surgical patients. ... the elderly.3,4 It is very rare before the age ... depends on the risk level; therefore an .... but also in the post-operative period. ... is continuing uncertainty regarding.

  11. Survey of siting practices for selected management projects in seven countries

    International Nuclear Information System (INIS)

    Hardin, E.; Aahagen, H.

    1992-06-01

    This paper surveys siting practices for deep geologic disposal in seven countries, and attempts to formulate generalizations which could be useful for the upcoming review of the Swedish plan for siting the SFL repository (R and D 92). Comparison of projects in different countries is done with full appreciation of the technical, legal, and cultural differences. The seven countries were selected for experience with siting in crystalline rock, similarity of siting practices to Sweden, and the availability of published information. Local governments have demonstrated effective veto power in each of the seven countries surveyed, although this power is exercised in different ways. This paper shows how the siting strategy itself affects the ability and the inclination of localities to block the project. It shows by example that public involvement, parallel vs. sequential characterization, schedule for siting activities, and the existence of interim waste storage capability have an impact on the success of siting. The focus of this paper is deep geologic disposal. Shallow land disposal and non-radioactive wastes are not discussed in detail, with three exceptions: LLW disposal siting in the U.S. and Canada, and the SAKAB incinerator projects in Sweden. These provide insight into siting approaches and demonstrate that conclusions regarding deep geologic disposal are supported by other experience. (114 refs.) (au)

  12. XMM-Newton 13H deep field - I. X-ray sources

    Science.gov (United States)

    Loaring, N. S.; Dwelly, T.; Page, M. J.; Mason, K.; McHardy, I.; Gunn, K.; Moss, D.; Seymour, N.; Newsam, A. M.; Takata, T.; Sekguchi, K.; Sasseen, T.; Cordova, F.

    2005-10-01

    We present the results of a deep X-ray survey conducted with XMM-Newton, centred on the UK ROSAT 13H deep field area. This region covers 0.18 deg², and is the first of the two areas covered with XMM-Newton as part of an extensive multiwavelength survey designed to study the nature and evolution of the faint X-ray source population. We have produced detailed Monte Carlo simulations to obtain a quantitative characterization of the source detection procedure and to assess the reliability of the resultant sourcelist. We use the simulations to establish a likelihood threshold, above which we expect less than seven (3 per cent) of our sources to be spurious. We present the final catalogue of 225 sources. Within the central 9 arcmin, 68 per cent of source positions are accurate to 2 arcsec, making optical follow-up relatively straightforward. We construct the N(>S) relation in four energy bands: 0.2-0.5, 0.5-2, 2-5 and 5-10 keV. In all but our highest energy band we find that the source counts can be represented by a double power law with a bright-end slope consistent with the Euclidean case and a break around 10⁻¹⁴ erg cm⁻² s⁻¹. Below this flux, the counts exhibit a flattening. Our source counts reach densities of 700, 1300, 900 and 300 deg⁻² at fluxes of 4.1 × 10⁻¹⁶, 4.5 × 10⁻¹⁶, 1.1 × 10⁻¹⁵ and 5.3 × 10⁻¹⁵ erg cm⁻² s⁻¹ in the 0.2-0.5, 0.5-2, 2-5 and 5-10 keV energy bands, respectively. We have compared our source counts with those in the two Chandra deep fields and Lockman hole, and found our source counts to be amongst the highest of these fields in all energy bands. We resolve >51 per cent (>50 per cent) of the X-ray background emission in the 1-2 keV (2-5 keV) energy bands.
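
    The double power-law description of the integral source counts N(>S) mentioned above can be written as two power laws joined continuously at a break flux; the sketch below evaluates such a model with made-up parameter values (the normalization, slopes and break flux are illustrative, not the survey's fitted values).

```python
# Sketch of a broken (double) power-law integral source-count model
#   N(>S) = K * (S / S_break)**(-gamma_bright)   for S >= S_break
#   N(>S) = K * (S / S_break)**(-gamma_faint)    for S <  S_break
# matched continuously at the break flux. All parameter values are illustrative.
import numpy as np

def n_greater_than_s(s, k=500.0, s_break=1e-14, gamma_bright=1.5, gamma_faint=0.6):
    s = np.asarray(s, dtype=float)
    slope = np.where(s >= s_break, gamma_bright, gamma_faint)
    return k * (s / s_break) ** (-slope)

fluxes = np.logspace(-16, -13, 7)   # erg cm^-2 s^-1
for s, n in zip(fluxes, n_greater_than_s(fluxes)):
    print(f"S = {s:.1e}  ->  N(>S) ~ {n:.0f} deg^-2")
```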

  13. Deep-well injection of liquid radioactive waste in Russia. Present situation

    International Nuclear Information System (INIS)

    Rybalchenko, A.

    1998-01-01

    At present there are three facilities (polygons) for the deep-well injection of liquid radioactive waste in Russia, all of which were constructed in the mid-1960s. These facilities are operating successfully, and activities have started in preparation for decommissioning. Liquid radioactive waste is injected into deep porous horizons which act as 'collector layers', isolated from the surface and from groundwaters by a relatively thick sequence of rock of low permeability. The collector layers (also collector horizons) contain salt waters or fresh waters of no practical use, lying beneath the main horizons containing potable waters. Construction of facilities for the deep-well injection of liquid radioactive waste was preceded by geological surveys and investigations which substantiated the feasibility and safety of radioactive waste injection and provided initial data for facility design. Operation of the facilities was accompanied by monitoring which confirmed that the main safety requirement was satisfied, i.e. localisation of radioactive waste within specified boundaries of the geologic medium. The opinion of most specialists in the atomic power industry in Russia favours deep-well injection as a solution to the problem of liquid radioactive waste management; during the period of active operation of defence facilities (atomic power industry of the former U.S.S.R.), this disposal method prevented the impact of radioactive waste on man and the environment. The experience accumulated concerning the injection of liquid radioactive waste in Russia is of interest to scientists and engineers engaged in problems of protection and remediation of the environment in the vicinity of nuclear industry facilities, and as an example of the utilisation of the deep subsurface for solidified radioactive waste and the disposal of different types of nuclear materials. Information on the scientific principles and background for the development of facilities for the injection

  14. Assessment of deep electrical conductivity features of Northern Victoria Land (Antarctica) under other geophysical constraints

    Directory of Open Access Journals (Sweden)

    A. Caneva

    2000-06-01

    The lithospheric and crustal structure of the Victoria Land continental block (Antarctica) has been studied by geological and geophysical surveys. Among them, magnetovariational (MV) investigations have been aimed at highlighting the deep electrical conductivity patterns which contribute to the understanding of continental rifting and the tectonic setting of the region. The hypothetical event map for H linearly polarized perpendicular to the coast indicates a possible broad coast-parallel conductivity anomaly zone. Despite the coast effect, this feature could be related to the deep upper-mantle thermal anomaly leading to Cenozoic uplift of the Transantarctic Mountains rift flank. However, both the hypothetical event map polarized parallel to the coast and the induction arrows suggest that the area of enhanced conductivity may be confined to the Deep Freeze Range crustal block along the western flank of the Mesozoic Rennick Graben. We also discuss the possible association between increased conductivity over the Southern Cross block and extensive Cenozoic alkaline plutonism.

  15. A new late Bronze Age IIIb settlement with traces of iron metallurgy in the Val d'Orléans: Bonnée, Les Terres à l'Est du Bourg ('lands to the east of the town') (Centre, Loiret)

    Directory of Open Access Journals (Sweden)

    Stéphane Joly

    2011-12-01

    The recent increase in archaeological evaluations around the present-day village of Bonnée has led to the discovery of a late Bronze Age IIIb occupation (circa 900-750 BC) covering about one hectare at Les Terres à l'Est du Bourg. The limits inherent in this type of intervention restrict the interpretation of the structures and any spatial approach to this probable settlement. The joint study of the various finds, in particular the ceramics and metallurgical material, but also the daub and the fauna, attests to its material culture and its interest. The slag could be among the oldest traces of iron-working activity in the Centre region. Post-smelting activities are proven and certain forging operations are presumed. The discovery of this settlement in this sector of the Loire floodplain provides new data on land occupation and its evolution in the Val d'Orléans during these periods of transition towards the early Hallstatt.

  16. Survey of naturally occurring hazardous materials in deep geologic formations: a perspective on the relative hazard of deep burial of nuclear wastes

    International Nuclear Information System (INIS)

    Tonnessen, K.A.; Cohen, J.J.

    1977-01-01

    Hazards associated with deep burial of solidified nuclear waste are considered with reference to toxic elements in naturally occurring ore deposits. This problem is put into perspective by relating the hazard of a radioactive waste repository to that of naturally occurring geologic formations. The basis for comparison derives from a consideration of safe drinking water levels. Calculations for relative toxicity of FBR waste and light water reactor (LWR) waste in an underground repository are compared with the relative toxicity indices obtained for average concentration ore deposits. Results indicate that, over time, nuclear waste toxicity decreases to levels below those of naturally occurring hazardous materials

  17. pDeep: Predicting MS/MS Spectra of Peptides with Deep Learning.

    Science.gov (United States)

    Zhou, Xie-Xuan; Zeng, Wen-Feng; Chi, Hao; Luo, Chunjie; Liu, Chao; Zhan, Jianfeng; He, Si-Min; Zhang, Zhifei

    2017-12-05

    In tandem mass spectrometry (MS/MS)-based proteomics, search engines rely on comparison between an experimental MS/MS spectrum and the theoretical spectra of the candidate peptides. Hence, accurate prediction of the theoretical spectra of peptides appears to be particularly important. Here, we present pDeep, a deep neural network-based model for the spectrum prediction of peptides. Using the bidirectional long short-term memory (BiLSTM), pDeep can predict higher-energy collisional dissociation, electron-transfer dissociation, and electron-transfer and higher-energy collision dissociation MS/MS spectra of peptides with >0.9 median Pearson correlation coefficients. Further, we showed that the intermediate layers of the neural network could reveal physicochemical properties of amino acids, for example the similarities of fragmentation behaviors between amino acids. We also showed the potential of pDeep to distinguish extremely similar peptides (peptides that contain isobaric amino acids, for example, GG = N, AG = Q, or even I = L), which were very difficult to distinguish using traditional search engines.
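
    The core of pDeep is a bidirectional LSTM over the peptide sequence; the following is only a schematic sketch of that idea, not pDeep's actual architecture: the residue encoding, hidden size and number of predicted ion types are assumptions chosen for brevity.

```python
# Schematic sketch of a BiLSTM that maps a per-residue peptide encoding to predicted
# fragment-ion intensities for each backbone bond. The encoding, hidden size and the
# number of predicted ion types are illustrative assumptions, not pDeep's design.
import torch
import torch.nn as nn

class ToyFragmentPredictor(nn.Module):
    def __init__(self, n_aa: int = 20, hidden: int = 64, n_ion_types: int = 4):
        super().__init__()
        self.embed = nn.Embedding(n_aa, 32)
        self.bilstm = nn.LSTM(32, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_ion_types)   # e.g. b/y ions at two charges

    def forward(self, peptide: torch.Tensor) -> torch.Tensor:
        x, _ = self.bilstm(self.embed(peptide))
        # one prediction per backbone bond, i.e. per position except the last residue
        return torch.sigmoid(self.head(x[:, :-1, :]))

model = ToyFragmentPredictor()
peptide = torch.randint(0, 20, (1, 12))    # a length-12 toy peptide
intensities = model(peptide)
print(intensities.shape)                   # torch.Size([1, 11, 4])
```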

  18. Sanford Underground Research Facility - The United State's Deep Underground Research Facility

    Science.gov (United States)

    Vardiman, D.

    2012-12-01

    /LIDAR), surveying instruments, and surveying benchmarks and optical survey points. Currently an array of single and multipoint extensometers monitors the Davis Campus. A facility-wide micro seismic monitoring system is anticipated to be deployed during the latter half of 2012. This system is designed to monitor minor events initiated within the historical mined out portions of the facility. The major science programs for the coming five years consist of the MAJORANA DEMONSTRATOR (MJD) neutrinoless double beta decay experiment; the Large Underground Xenon (LUX) dark matter search, the Center for Ultralow Background Experiments at DUSEL (CUBED), numerous geoscience installations, Long-Baseline Neutrino Experiment (LBNE), a nuclear astrophysics program involving a low energy underground particle accelerator, second and third generation dark matter experiments, and additional low background counting facilities. The Sanford Lab facility is an active, U.S. based, deep underground research facility dedicated to science, affording the science community the opportunity to conduct unprecedented scientific research in a broad range of physics, biology and geoscience fields at depth. SURF is actively interested in hosting additional research collaborations and provides resources for full facility design, cost estimation, excavation, construction and support management services.

  19. HB1204: Deep-Sea Corals and Benthic Habitats in Northeast Deepwater Canyons on NOAA Ship Henry Bigelow between 20120703 and 20120718

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A multi-disciplinary team of scientists on the Henry Bigelow HB1204 mission surveyed and ground-truthed known or suspected deep-sea coral habitats associated with...

  20. Deep breathing exercises performed 2 months following cardiac surgery: a randomized controlled trial.

    Science.gov (United States)

    Westerdahl, Elisabeth; Urell, Charlotte; Jonsson, Marcus; Bryngelsson, Ing-Liss; Hedenström, Hans; Emtner, Margareta

    2014-01-01

    Postoperative breathing exercises are recommended to cardiac surgery patients. Instructions concerning how long patients should continue exercises after discharge vary, and the significance of treatment needs to be determined. Our aim was to assess the effects of home-based deep breathing exercises performed with a positive expiratory pressure device for 2 months following cardiac surgery. The study design was a prospective, single-blinded, parallel-group, randomized trial. Patients performing breathing exercises 2 months after cardiac surgery (n = 159) were compared with a control group (n = 154) performing no breathing exercises after discharge. The intervention consisted of 30 slow deep breaths performed with a positive expiratory pressure device (10-15 cm H2O), 5 times a day, during the first 2 months after surgery. The outcomes were lung function measurements, oxygen saturation, thoracic excursion mobility, subjective perception of breathing and pain, patient-perceived quality of recovery (40-Item Quality of Recovery score), health-related quality of life (36-Item Short Form Health Survey), and self-reported respiratory tract infection/pneumonia and antibiotic treatment. Two months postoperatively, the patients had significantly reduced lung function, with a mean decrease in forced expiratory volume in 1 second to 93 ± 12% (P< .001) of preoperative values. Oxygenation had returned to preoperative values, and 5 of 8 aspects in the 36-Item Short Form Health Survey were improved compared with preoperative values (P< .01). There were no significant differences between the groups in any of the measured outcomes. No significant differences in lung function, subjective perceptions, or quality of life were found between patients performing home-based deep breathing exercises and control patients 2 months after cardiac surgery.

  1. A survey on object detection in optical remote sensing images

    Science.gov (United States)

    Cheng, Gong; Han, Junwei

    2016-07-01

    Object detection in optical remote sensing images, being a fundamental but challenging problem in the field of aerial and satellite image analysis, plays an important role in a wide range of applications and has received significant attention in recent years. While numerous methods exist, a deep review of the literature concerning generic object detection is still lacking. This paper aims to provide a review of the recent progress in this field. Different from several previously published surveys that focus on a specific object class such as buildings or roads, we concentrate on more generic object categories, including, but not limited to, roads, buildings, trees, vehicles, ships, airports and urban areas. Covering about 270 publications, we survey (1) template matching-based object detection methods, (2) knowledge-based object detection methods, (3) object-based image analysis (OBIA)-based object detection methods, (4) machine learning-based object detection methods, and (5) five publicly available datasets and three standard evaluation metrics. We also discuss the challenges of current studies and propose two promising research directions, namely deep learning-based feature representation and weakly supervised learning-based geospatial object detection. It is our hope that this survey will be beneficial for researchers seeking a better understanding of this research field.
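
    As a concrete illustration of the first method family surveyed (template matching), the sketch below scores every position of an image against a template with normalized cross-correlation; the image, template and argmax readout are toy assumptions, not taken from any surveyed paper.

```python
# Minimal sketch of template matching-based detection: slide a template over an
# image and score each position with normalized cross-correlation (NCC).
import numpy as np

def ncc_map(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Return a map of NCC scores for every valid template position."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum()) + 1e-12
    out = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm + 1e-12
            out[y, x] = (p * t).sum() / denom
    return out

# Toy usage: find where a template planted in a synthetic image best matches.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
template = image[20:28, 30:38].copy()
scores = ncc_map(image, template)
y, x = np.unravel_index(scores.argmax(), scores.shape)
print(f"best match at ({y}, {x}) with NCC {scores.max():.3f}")
```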

  2. Gouge marks on deep-sea mud volcanoes in the eastern Mediterranean: Caused by Cuvier’s beaked whales?

    NARCIS (Netherlands)

    Woodside, J.M.; David, L; Frantzis, A.; Hooker, S.K.

    2006-01-01

    Enigmatic seafloor gouge marks at depths of 1700-2100 m have been observed from submersible during geological survey work studying mud volcanoes in the eastern Mediterranean Sea. The marks consist of a central groove (about 10 cm deep and 1-2 m long), superimposed on a broader bowl-shaped depression

  3. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning.

    Science.gov (United States)

    Wang, Xinggang; Yang, Wei; Weinreb, Jeffrey; Han, Juan; Li, Qiubai; Kong, Xiangchuang; Yan, Yongluan; Ke, Zan; Luo, Bo; Liu, Tao; Wang, Liang

    2017-11-13

    Prostate cancer (PCa) is a major cause of death and has been documented since ancient times, for example in imaging of an Egyptian Ptolemaic mummy. PCa detection is critical to personalized medicine and its appearance varies considerably on MRI scans. 172 patients with 2,602 morphologic images (axial 2D T2-weighted imaging) of the prostate were studied. A deep learning method using a deep convolutional neural network (DCNN) and a non-deep learning method using SIFT image features with a bag-of-words (BoW) model, a representative approach for image recognition and analysis, were used to distinguish pathologically confirmed PCa patients from patients with prostate benign conditions (BCs) such as prostatitis or benign prostatic hyperplasia (BPH). In fully automated detection of PCa patients, deep learning had a statistically higher area under the receiver operating characteristics curve (AUC) than non-deep learning (P = 0.0007); the AUC was 0.70 (95% CI 0.63-0.77) for the non-deep learning method. Our results suggest that deep learning with a DCNN is superior to non-deep learning with SIFT image features and a BoW model for fully automated differentiation of PCa patients from prostate BC patients. Our deep learning method is extensible to image modalities such as MR imaging, CT and PET of other organs.

  4. A study on site characterization of the deep geological environment around KURT

    International Nuclear Information System (INIS)

    Park, Kw; Kim, Gy; Koh, Yk; Kim, Ks; Choi, Jw

    2009-01-01

    KURT (KAERI Underground Research Tunnel) is a small-scale research tunnel which was constructed from 2005 to 2006 at the Korea Atomic Energy Research Institute (KAERI). To understand the deep geological environment around the KURT area, surface geological surveys such as lineament analysis and geophysical surveys, together with borehole investigations, were performed. For this study, a 3-dimensional geological model was constructed using the surface and borehole geological data. The regional lineaments were determined using a topographical map, and the surface geophysical survey data were collected for the geological model. In addition, statistical methods were applied to fracture data from borehole televiewer loggings to identify fracture zones in the boreholes. For hydrogeological modeling, fixed-interval hydraulic tests were carried out in all boreholes. The results of the hydraulic tests were analyzed and classified using the fracture-zone data of the geological model. As a result, the hydrogeological units were defined and the properties of each unit were assessed for the KURT area.

  5. Fiscal 1997 verification and survey of geothermal prospecting technology etc. 2/2. Survey report on deep-seated geothermal resources; 1997 nendo chinetsu tansa gijutsu nado kensho chosa hokokusho. 2/2. Shinbu chinetsu shigen chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    For the purpose of reducing the risk that accompanies the exploitation of deep-seated geothermal resources, investigations are conducted into the three factors that govern the formation of geothermal resources at deep levels, that is, the supply of heat from heat sources, the supply of geothermal fluids, and the development of fracture systems contributing to the constitution of reservoir structures. In the evaluation and study of reservoirs and the amount of resources, a reservoir simulation is conducted to grasp the characteristics of the reservoirs and the amount of resources. For this purpose, the origin and history of the Kakkonda geothermal field are studied, with special attention paid to the origin of the difference in temperature between the shallow-seated and deep-seated reservoirs, the geometry of the granite at Kakkonda, the region of recharge of meteoric water, the distribution of saline concentration in the natural state and the cause of its occurrence, the amount of supply of fluids and heat from depth to the reservoirs, etc. In the evaluation and study of the economic effectiveness of the exploitation of deep-seated geothermal resources, it is found that, if a 50 MW geothermal power station is to be built at a deep level (drilled depth of 3000 m on average) with a drilling success rate of 50%, the steam amount required at such a deep level (presumed to be 75 t/h) will be more than twice that required at a shallow level (presumed to be 35 t/h). (NEDO)

  6. VizieR Online Data Catalog: HST/ACS Coma Cluster Survey. VI. (den Brok+, 2011)

    Science.gov (United States)

    den Brok, M.; Peletier, R. F.; Valentijn, E. A.; Balcells, M.; Carter, D.; Erwin, P.; Ferguson, H. C.; Goudfrooij, P.; Graham, A. W.; Hammer, D.; Lucey, J. R.; Trentham, N.; Guzman, R.; Hoyos, C.; Verdoes Kleijn, G.; Jogee, S.; Karick, A. M.; Marinova, I.; Mouhcine, M.; Weinzirl, T.

    2018-01-01

    We have used the data from the HST/ACS Coma Cluster Survey, a deep two-passband imaging survey of the Coma cluster. A full description of the observations and data reduction can be found in Paper I (Carter et al., 2008ApJS..176..424C). We have derived colour gradients for a sample of confirmed or very likely Coma cluster members. (2 data files).

  7. Deep learning for image classification

    Science.gov (United States)

    McCoppin, Ryan; Rizki, Mateen

    2014-06-01

    This paper provides an overview of deep learning and introduces several subfields of deep learning, including a specific tutorial on convolutional neural networks. Traditional methods for learning image features are compared to deep learning techniques. In addition, we present our preliminary classification results, our basic implementation of a convolutional restricted Boltzmann machine on the Modified National Institute of Standards and Technology database (MNIST), and we explain how to use deep learning networks to assist in our development of a robust gender classification system.
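
    A minimal example of the kind of convolutional network discussed in the tutorial portion can be written in a few lines; the sketch below is not the authors' implementation (layer sizes and hyperparameters are illustrative assumptions), but it defines a small CNN for 28x28 grayscale inputs such as MNIST digits and runs one toy forward/backward pass.

```python
# A minimal convolutional network for MNIST-style 28x28 grayscale classification.
# All layer sizes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # -> 14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# Toy forward/backward pass on random data standing in for an MNIST batch.
model = SmallConvNet()
images = torch.randn(8, 1, 28, 28)       # batch of 8 grayscale 28x28 images
labels = torch.randint(0, 10, (8,))
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
print(f"toy cross-entropy loss: {loss.item():.3f}")
```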

  8. Deep learning for computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Goh, Garrett B. [Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland Washington 99354; Hodas, Nathan O. [Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland Washington 99354; Vishnu, Abhinav [Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland Washington 99354

    2017-03-08

    The rise and fall of artificial neural networks is well documented in the scientific literature of both computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on “deep” neural networks. Within the last few years, we have seen the transformative impact of deep learning in the computer science domain, notably in speech recognition and computer vision, to the extent that the majority of practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview of the theory of deep neural networks and their unique properties as compared to traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including QSAR, virtual screening, protein structure modeling, QM calculations, materials synthesis and property prediction. In reviewing the performance of deep neural networks, we observed a consistent outperformance of non-neural-network state-of-the-art models across disparate research topics, and deep neural network-based models often exceeded the “glass ceiling” expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a useful tool and may grow into a pivotal role for various challenges in the computational chemistry field.

  9. Deep learning for computational chemistry.

    Science.gov (United States)

    Goh, Garrett B; Hodas, Nathan O; Vishnu, Abhinav

    2017-06-15

    The rise and fall of artificial neural networks is well documented in the scientific literature of both computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on multilayer neural networks. Within the last few years, we have seen the transformative impact of deep learning in many domains, particularly in speech recognition and computer vision, to the extent that the majority of expert practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview of the theory of deep neural networks and their unique properties that distinguish them from traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including quantitative structure activity relationship, virtual screening, protein structure prediction, quantum chemistry, materials design, and property prediction. In reviewing the performance of deep neural networks, we observed a consistent outperformance of non-neural-network state-of-the-art models across disparate research topics, and deep neural network-based models often exceeded the "glass ceiling" expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a valuable tool for computational chemistry. © 2017 Wiley Periodicals, Inc.

  10. Recent developments in volatility modeling and applications

    Directory of Open Access Journals (Sweden)

    A. Thavaneswaran

    2006-01-01

    In financial modeling, it has repeatedly been pointed out that volatility clustering and conditional non-normality induce the leptokurtosis observed in high-frequency data. Financial time series data are not adequately modeled by the normal distribution, and empirical evidence against the normality assumption is well documented in the financial literature (see, for example, Engle (1982) and Bollerslev (1986)). An ARMA representation was used by Thavaneswaran et al. in 2005 to derive the kurtosis of various classes of GARCH models, such as power GARCH, non-Gaussian GARCH, nonstationary GARCH, and random-coefficient GARCH. Several empirical studies have shown that mixture distributions are more likely than the normal distribution to capture the heteroskedasticity observed in high-frequency data. In this paper, some results on moment properties are generalized to stationary ARMA processes with GARCH errors. Applications to volatility forecasting and option pricing are also discussed in some detail.
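
    The leptokurtosis that GARCH-type conditional heteroskedasticity induces, even with Gaussian innovations, is easy to demonstrate numerically; the sketch below simulates a Gaussian GARCH(1,1) process with arbitrary illustrative parameters and compares its sample kurtosis with the Gaussian value of 3.

```python
# Illustrative sketch: simulate a Gaussian GARCH(1,1) process
#   r_t = sigma_t * z_t,  sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
# and show that the unconditional distribution is leptokurtic (kurtosis > 3)
# even though the innovations z_t are standard normal. Parameters are assumptions.
import numpy as np

def simulate_garch11(n: int, omega: float, alpha: float, beta: float, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    r = np.zeros(n)
    sigma2 = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(sigma2) * z[t]
        sigma2 = omega + alpha * r[t] ** 2 + beta * sigma2
    return r

r = simulate_garch11(n=200_000, omega=0.05, alpha=0.10, beta=0.85)
kurtosis = np.mean((r - r.mean()) ** 4) / np.var(r) ** 2
print(f"sample kurtosis: {kurtosis:.2f} (Gaussian value is 3)")
```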

  11. Comparison of historically simulated VaR: Evidence from oil prices

    International Nuclear Information System (INIS)

    Costello, Alexandra; Asem, Ebenezer; Gardner, Eldon

    2008-01-01

    Cabedo and Moya [Cabedo, J.D., Moya, I., 2003. Estimating oil price 'Value at Risk' using the historical simulation approach. Energy Economics 25, 239-253] find that ARMA with historical simulation delivers VaR forecasts that are superior to those from GARCH. We compare the ARMA with historical simulation to the semi-parametric GARCH model proposed by Barone-Adesi et al. [Barone-Adesi, G., Giannopoulos, K., Vosper, L., 1999. VaR without correlations for portfolios of derivative securities. Journal of Futures Markets 19 (5), 583-602]. The results suggest that the semi-parametric GARCH model generates VaR forecasts that are superior to the VaR forecasts from the ARMA with historical simulation. This is due to the fact that GARCH captures volatility clustering. Our findings suggest that Cabedo and Moya's conclusion is mainly driven by the normal distributional assumption imposed on the future risk structure in the GARCH model. (author)
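
    The two approaches being compared can be contrasted with a schematic calculation; the sketch below is an illustration in the spirit of plain historical simulation and of GARCH-filtered (semi-parametric) historical simulation, not the papers' exact procedures: the GARCH parameters are fixed rather than estimated, and the toy return series is made up.

```python
# Schematic comparison of (a) plain historical-simulation VaR and (b) a GARCH-filtered
# ("semi-parametric") VaR: returns are standardized by fitted conditional volatilities,
# and the empirical quantile of the standardized residuals is rescaled by the next-day
# volatility forecast. GARCH parameters are fixed here instead of being estimated.
import numpy as np

def historical_var(returns: np.ndarray, level: float = 0.99) -> float:
    """Plain historical-simulation VaR: the empirical loss quantile."""
    return -np.quantile(returns, 1.0 - level)

def garch_filtered_var(returns: np.ndarray, omega: float, alpha: float, beta: float,
                       level: float = 0.99) -> float:
    """VaR from GARCH-standardized residuals, rescaled by the next-day volatility."""
    n = len(returns)
    sigma2 = np.empty(n + 1)
    sigma2[0] = returns.var()
    for t in range(n):
        sigma2[t + 1] = omega + alpha * returns[t] ** 2 + beta * sigma2[t]
    std_resid = returns / np.sqrt(sigma2[:n])
    q = np.quantile(std_resid, 1.0 - level)    # empirical residual quantile
    return -q * np.sqrt(sigma2[n])             # rescale by forecast volatility

# Toy data: a volatility-clustered return series (assumed, for illustration only).
rng = np.random.default_rng(1)
vol = 0.01 * np.exp(np.cumsum(0.05 * rng.standard_normal(1500)) * 0.1)
returns = vol * rng.standard_normal(1500)
print(f"historical-simulation 99% VaR: {historical_var(returns):.4f}")
print(f"GARCH-filtered 99% VaR:       {garch_filtered_var(returns, 1e-6, 0.08, 0.90):.4f}")
```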

  12. New constraints on the structure of Hess Deep from regional- and micro-bathymetry data acquired during RRS James Cook in Jan-Feb 2008 (JC021)

    Science.gov (United States)

    Shillington, D. J.; Ferrini, V. L.; MacLeod, C. J.; Teagle, D. A.; Gillis, K. M.; Cazenave, P. W.; Hurst, S. D.; Scientific Party, J.

    2008-12-01

    In January-February 2008, new geophysical and geological data were acquired in Hess Deep using the RRS James Cook and the British ROV Isis. Hess Deep provides a tectonic window into oceanic crust emplaced by fast seafloor spreading at the East Pacific Rise, thereby offering the opportunity to test competing hypotheses for oceanic crustal accretion. The goal of this cruise was to collect datasets that can constrain the structure and composition of the lower crustal section exposed in the south-facing slope of the Intrarift Ridge just north of the Deep, and thus provide insights into the emplacement of gabbroic lower crust at fast spreading rates. Additionally, the acquired datasets provide site survey data for IODP Proposal 551-Full. The following datasets were acquired during JC021: 1) a regional multibeam bathymetry survey complemented with sub-bottom profiler (SBP) data (in selected areas), 2) two micro-bathymetry surveys, and 3) seafloor rock samples acquired with an ROV. Here we present grids of regional multibeam and micro-bathymetry data following post-cruise processing. Regional multibeam bathymetry data were acquired using the hull-mounted Kongsberg Simrad EM120 system (12 kHz). These data provide new coverage of the northern flank of the rift as far east as 100°W, which shows that it comprises a series of 50- to 100-km-long en echelon segments. Both E-W and NE-SW striking features are observed in the immediate vicinity of the Deep, including in a newly covered region to the SW of the rift tip. Such features might arise from the rotation of the Galapagos microplate(s), as proposed by other authors. The ROV Isis acquired micro-bathymetry data in two areas using a Simrad SM2000 (200 kHz) multibeam sonar. Data were acquired at a nominal altitude of ~100 m and speed of 0.3 kts to facilitate high-resolution mapping of seabed features and also to permit coverage of two relatively large areas. Swath widths were ~200-350 m depending on noise and seabed characteristics.

  13. Origin and distribution of the organic matter in the distal lobe of the Congo deep-sea fan - A Rock-Eval survey

    Science.gov (United States)

    Baudin, François; Stetten, Elsa; Schnyder, Johann; Charlier, Karine; Martinez, Philippe; Dennielou, Bernard; Droz, Laurence

    2017-08-01

    The Congo River, the second largest river in the world, is a major source of organic matter for the deep Atlantic Ocean because of the connection of its estuary to the deep offshore area by a submarine canyon which feeds a vast deep-sea fan. The lobe zone of this deep-sea fan is the final receptacle of the sedimentary inputs presently channelled by the canyon and covers an area of 2500 km². The quantity and the source of organic matter preserved in recent turbiditic sediments from the distal lobe of the Congo deep-sea fan were assessed using Rock-Eval pyrolysis analyses. Six sites, located at approximately 5000 m water depth, were investigated. The mud-rich sediments of the distal lobe contain high amounts of organic matter (3.5 to 4% Corg), the origin of which is a mixture of terrestrial higher-plant debris, soil organic matter and deeply oxidized phytoplanktonic material. Although the respective contribution of terrestrial and marine sources of organic matter cannot be precisely quantified using Rock-Eval analyses, the terrestrial fraction is dominant according to similar hydrogen and oxygen indices of both suspended and bedload sediments from the Congo River and that deposited in the lobe complex. The Rock-Eval signature supports the 70% to 80% terrestrial fraction previously estimated using C/N and δ13Corg data. In the background sediment, the organic matter distribution is homogeneous at different scales, from a single turbiditic event to the entire lobe, and changes in accumulation rates only have a limited effect on the quantity and quality of the preserved organic matter. Peculiar areas with chemosynthetic bivalves and/or bacterial mats, explored using ROV Victor 6000, show a Rock-Eval signature similar to background sediment. This high organic carbon content associated with high sedimentation rates (>2 to 20 mm yr⁻¹) in the Congo distal lobe complex implies a high burial rate for organic carbon. Consequently, the Congo deep-sea fan represents an
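
    The 70% to 80% terrestrial fraction cited above comes from a two-end-member mixing argument of the kind sketched below; the δ13C end-member values used here are typical literature figures and the sample value is invented, so the numbers are for illustration only.

```python
# Illustrative two-end-member mixing calculation for the terrestrial fraction of
# sedimentary organic matter from bulk δ13Corg. End-member values are typical
# literature figures, not the values used in this study.
def terrestrial_fraction(d13c_sample: float,
                         d13c_terrestrial: float = -27.0,   # assumed C3 land-plant end member (per mil)
                         d13c_marine: float = -20.0) -> float:  # assumed marine plankton end member (per mil)
    """Fraction of terrestrial organic carbon from a linear isotope mass balance."""
    return (d13c_sample - d13c_marine) / (d13c_terrestrial - d13c_marine)

# Example: a bulk sediment value of -25.5 per mil implies ~79% terrestrial organic
# matter under these assumed end members.
print(f"terrestrial fraction: {terrestrial_fraction(-25.5):.2f}")
```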

  14. On contemporary sedimentation at the Titanic survey area

    Science.gov (United States)

    Lukashin, V. N.

    2009-12-01

    The basic parameters of the sedimentation environment are considered: the Western Boundary Deep Current that transports sedimentary material and distributes it on the survey area; the nepheloid layer, its features, and the distribution of the concentrations and particulate standing crop in it; the distribution of the horizontal and vertical fluxes of the sedimentary material; and the bottom sediments and their absolute masses. The comparison of the vertical fluxes of the particulate matter and the absolute masses of the sediments showed that the contemporary fluxes of sedimentary material to the bottom provided the distribution of the absolute masses of the sediments in the survey area during the Holocene.

  15. ITER EDA newsletter. V. 3, no. 2

    International Nuclear Information System (INIS)

    1994-02-01

    This issue of the ITER EDA (Engineering Design Activities) Newsletter contains reports on the Fifth ITER Council Meeting held in Garching, Germany, 27-28 January 1994, a visit (28 January 1994) of an international group of Harvard Fellows to the San Diego Joint Work Site, the Inauguration Ceremony of the EC-hosted ITER joint work site in Garching (28 January 1994), on an ITER Technical Meeting on Assembly and Maintenance held in Garching, Germany, January 19-26, 1994, and a report on a Technical Committee Meeting on radiation effects on in-vessel components held in Garching, Germany, November 15-19, 1993, as well as an ITER Status Report

  16. What Really is Deep Learning Doing?

    OpenAIRE

    Xiong, Chuyu

    2017-01-01

    Deep learning has achieved a great success in many areas, from computer vision to natural language processing, to game playing, and much more. Yet, what deep learning is really doing is still an open question. There are a lot of works in this direction. For example, [5] tried to explain deep learning by group renormalization, and [6] tried to explain deep learning from the view of functional approximation. In order to address this very crucial question, here we see deep learning from perspect...

  17. Taoism and Deep Ecology.

    Science.gov (United States)

    Sylvan, Richard; Bennett, David

    1988-01-01

    Contrasted are the philosophies of Deep Ecology and ancient Chinese Taoism. Discusses the cosmology, morality, lifestyle, views of power, politics, and environmental philosophies of each. Concludes that Deep Ecology could gain much from Taoism. (CW)

  18. Deep-learnt classification of light curves

    DEFF Research Database (Denmark)

    Mahabal, Ashish; Gieseke, Fabian; Pai, Akshay Sadananda Uppinakudru

    2017-01-01

    Astronomy light curves are sparse, gappy, and heteroscedastic. As a result, standard time series methods regularly used for financial and similar datasets are of little help, and astronomers are usually left to their own instruments and techniques to classify light curves. A common approach is to derive statistical features from the time series and to use machine learning methods, generally supervised, to separate objects into a few of the standard classes. In this work, we transform the time series to two-dimensional light curve representations in order to classify them using modern deep learning techniques. In particular, we show that convolutional neural network based classifiers work well for broad characterization and classification. We use labeled datasets of periodic variables from the CRTS survey and show how this opens doors for a quick classification of diverse classes with several
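
    One simple way to turn an irregularly sampled light curve into a fixed-size two-dimensional representation, in the spirit described above, is to histogram all pairwise differences in time and magnitude; the sketch below does exactly that with arbitrary bin edges and a made-up toy light curve (it is not the authors' exact mapping).

```python
# Sketch: convert an irregularly sampled light curve into a fixed-size 2D image by
# histogramming all pairwise (delta-t, delta-mag) differences, so that a standard
# convolutional network can ingest it. Bin edges and the toy light curve are assumptions.
import numpy as np

def lightcurve_to_image(times: np.ndarray, mags: np.ndarray,
                        dt_bins: np.ndarray, dm_bins: np.ndarray) -> np.ndarray:
    i, j = np.triu_indices(len(times), k=1)      # all pairs of observations
    dt = np.abs(times[j] - times[i])
    dm = np.abs(mags[j] - mags[i])
    hist, _, _ = np.histogram2d(dt, dm, bins=[dt_bins, dm_bins])
    return hist / hist.sum()                     # normalize so objects are comparable

# Toy light curve: a noisy sinusoid sampled at random epochs.
rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0, 500, size=120))       # days
m = 15.0 + 0.3 * np.sin(2 * np.pi * t / 2.7) + 0.05 * rng.standard_normal(t.size)
dt_bins = np.logspace(-1, 3, 24)                 # 0.1 to 1000 days
dm_bins = np.linspace(0, 1.0, 24)
image = lightcurve_to_image(t, m, dt_bins, dm_bins)
print(image.shape)                               # (23, 23) image ready for a CNN
```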

  19. deepTools2: a next generation web server for deep-sequencing data analysis.

    Science.gov (United States)

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. The SPHEREx All-Sky Spectral Survey

    Science.gov (United States)

    Bock, James; SPHEREx Science Team

    2018-01-01

    SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for Phase A in August 2017, is an all-sky survey satellite designed to address all three science goals in NASA's astrophysics division, with a single instrument, a wide-field spectral imager. SPHEREx will probe the physics of inflation by measuring non-Gaussianity by studying large-scale structure, surveying a large cosmological volume at low redshifts, complementing high-z surveys optimized to constrain dark energy. The origin of water and biogenic molecules will be investigated in all phases of planetary system formation - from molecular clouds to young stellar systems with protoplanetary disks - by measuring ice absorption spectra. We will chart the origin and history of galaxy formation through a deep survey mapping large-scale spatial power in two deep fields located near the ecliptic poles. Following in the tradition of all-sky missions such as IRAS, COBE and WISE, SPHEREx will be the first all-sky near-infrared spectral survey. SPHEREx will create spectra (0.75–4.2 μm at R = 41 and 4.2–5 μm at R = 135) with high sensitivity, making background-limited observations using a passively-cooled telescope with a wide field-of-view for large mapping speed. During its two-year mission, SPHEREx will produce four complete all-sky maps that will serve as a rich archive for the astronomy community. With over a billion detected galaxies, hundreds of millions of high-quality stellar and galactic spectra, and over a million ice absorption spectra, the archive will enable diverse scientific investigations including studies of young stellar systems, brown dwarfs, high-redshift quasars, galaxy clusters, the interstellar medium, asteroids and comets. All aspects of the instrument and spacecraft have high heritage. SPHEREx requires no new technologies and carries large technical and resource margins on every aspect of the design. SPHEREx is a partnership between Caltech and JPL, following the