WorldWideScience

Sample records for covariance libraries based

  1. COVFILS: 30-group covariance library based on ENDF/B-V

    International Nuclear Information System (INIS)

    Muir, D.W.; LaBauve, R.J.

    1981-03-01

    A library of 30-group cross sections and covariances called COVFILS has been prepared from ENDF/B-V data using the NJOY code system. COVFILS includes data on the total cross section, scattering cross sections, and the most important absorption cross sections for 1H, 10B, C, 16O, Cr, Fe, Ni, Cu, and Pb. This report contains detailed descriptions of various features of the library, a listing of a FORTRAN retrieval program, and 143 plots of the multigroup cross-section uncertainties and their correlations.

  2. Specifications for adjusted cross section and covariance libraries based upon CSEWG fast reactor and dosimetry benchmarks

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Marable, J.H.; Collins, P.J.; Cowan, C.L.; Peelle, R.W.; Salvatores, M.

    1979-06-01

    The present work proposes a specific plan of cross section library adjustment for fast reactor core physics analysis using information from fast reactor and dosimetry integral experiments and from differential data evaluations. This detailed exposition of the proposed approach is intended mainly to elicit review and criticism from scientists and engineers in the research, development, and design fields. This major attempt to develop useful adjusted libraries is based on the established benchmark integral data; accurate and well-documented analysis techniques; sensitivities; and quantified uncertainties for nuclear data, integral experiment measurements, and calculational methodology. The adjustments to be obtained using these specifications are intended to produce an overall improvement, in the least-squares sense, in the quality of the data libraries, so that calculations of other similar systems using the adjusted data base with any credible method will produce results without much data-related bias. The adjustments obtained should provide specific recommendations to the data evaluation program, to be weighed in the light of newer measurements, and also a vehicle for observing how the evaluation process is converging. This report specifies the calculational methodology to be used, the integral experiments to be employed initially, and the methods and integral experiment biases and uncertainties to be used. The sources of sensitivity coefficients, as well as the cross sections to be adjusted, are detailed. The formulae for sensitivity coefficients for fission spectral parameters are developed. A mathematical formulation of the least-squares adjustment problem is given, including biases and uncertainties in methods.

  3. AFCI-2.0 Neutron Cross Section Covariance Library

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Oblozinsky, P.; Mattoon, C.M.; Pigni, M.; Hoblit, S.; Mughabghab, S.F.; Sonzogni, A.; Talou, P.; Chadwick, M.B.; Hale, G.M.; Kahler, A.C.; Kawano, T.; Little, R.C.; Young, P.G.

    2011-03-01

    The cross section covariance library has been under development through a BNL-LANL collaborative effort over the last three years. The project builds on two covariance libraries developed earlier, with considerable input from BNL and LANL. In 2006, an international effort under WPEC Subgroup 26 produced the BOLNA covariance library by putting together data, often preliminary, from various sources for the most important materials for nuclear reactor technology. This was followed in 2007 by a collaborative effort of four US national laboratories to produce covariances, often of modest quality - hence the name low-fidelity - for a virtually complete set of materials included in ENDF/B-VII.0. The present project focuses on covariances of 4-5 major reaction channels for 110 materials of importance for power reactors. The work started under the Global Nuclear Energy Partnership (GNEP) in 2008, which changed to the Advanced Fuel Cycle Initiative (AFCI) in 2009. With the 2011 release, the name changed to the Covariance Multigroup Matrix for Advanced Reactor Applications (COMMARA), version 2.0. The primary purpose of the library is to provide covariances for the AFCI data adjustment project, which focuses on the needs of fast advanced burner reactors. The responsibility of BNL was defined as developing covariances for structural materials and fission products, managing the library, and coordinating the work; the responsibility of LANL was defined as covariances for light nuclei and actinides. The COMMARA-2.0 covariance library was developed by the BNL-LANL collaboration for Advanced Fuel Cycle Initiative applications over a period of three years, 2008-2010. It contains covariances for 110 materials relevant to fast reactor R&D. The library is to be used together with the ENDF/B-VII.0 central values of the latest official release of US files of evaluated neutron cross sections. The COMMARA-2.0 library contains neutron cross section covariances for 12 light nuclei (coolants and moderators), 78 structural

  4. AFCI-2.0 Neutron Cross Section Covariance Library

    International Nuclear Information System (INIS)

    Herman, M.; Oblozinsky, P.; Mattoon, C.M.; Pigni, M.; Hoblit, S.; Mughabghab, S.F.; Sonzogni, A.; Talou, P.; Chadwick, M.B.; Hale, G.M.; Kahler, A.C.; Kawano, T.; Little, R.C.; Young, P.G.

    2011-01-01

    The cross section covariance library has been under development through a BNL-LANL collaborative effort over the last three years. The project builds on two covariance libraries developed earlier, with considerable input from BNL and LANL. In 2006, an international effort under WPEC Subgroup 26 produced the BOLNA covariance library by putting together data, often preliminary, from various sources for the most important materials for nuclear reactor technology. This was followed in 2007 by a collaborative effort of four US national laboratories to produce covariances, often of modest quality - hence the name low-fidelity - for a virtually complete set of materials included in ENDF/B-VII.0. The present project focuses on covariances of 4-5 major reaction channels for 110 materials of importance for power reactors. The work started under the Global Nuclear Energy Partnership (GNEP) in 2008, which changed to the Advanced Fuel Cycle Initiative (AFCI) in 2009. With the 2011 release, the name changed to the Covariance Multigroup Matrix for Advanced Reactor Applications (COMMARA), version 2.0. The primary purpose of the library is to provide covariances for the AFCI data adjustment project, which focuses on the needs of fast advanced burner reactors. The responsibility of BNL was defined as developing covariances for structural materials and fission products, managing the library, and coordinating the work; the responsibility of LANL was defined as covariances for light nuclei and actinides. The COMMARA-2.0 covariance library was developed by the BNL-LANL collaboration for Advanced Fuel Cycle Initiative applications over a period of three years, 2008-2010. It contains covariances for 110 materials relevant to fast reactor R&D. The library is to be used together with the ENDF/B-VII.0 central values of the latest official release of US files of evaluated neutron cross sections. The COMMARA-2.0 library contains neutron cross section covariances for 12 light nuclei (coolants and moderators), 78

  5. Progress on Nuclear Data Covariances: AFCI-1.2 Covariance Library

    International Nuclear Information System (INIS)

    Oblozinsky, P.; Mattoon, C.M.; Herman, M.; Mughabghab, S.F.; Pigni, M.T.; Talou, P.; Hale, G.M.; Kahler, A.C.; Kawano, T.; Little, R.C.; Young, P.G.

    2009-01-01

    Improved neutron cross section covariances were produced for 110 materials including 12 light nuclei (coolants and moderators), 78 structural materials and fission products, and 20 actinides. The improved covariances were organized into the AFCI-1.2 covariance library in 33 energy groups, from 10^-5 eV to 19.6 MeV. BNL contributed improved covariance data for the following materials: 23Na and 55Mn, for which more detailed evaluations were done; improvements in the major structural materials 52Cr, 56Fe and 58Ni; improved estimates for the remaining structural materials and fission products; improved covariances for 14 minor actinides; and estimates of mubar covariances for 23Na and 56Fe. LANL contributed improved covariance data for 235U and 239Pu, including prompt neutron fission spectra, and a completely new evaluation for 240Pu. A new R-matrix evaluation for 16O, including mubar covariances, is nearing completion. BNL assembled the library and performed basic testing using improved procedures, including inspection of the uncertainty and correlation plots for each material. The AFCI-1.2 library was released to ANL and INL in August 2009.

  6. ORACLE: an adjusted cross-section and covariance library for fast-reactor analysis

    International Nuclear Information System (INIS)

    Yeivin, Y.; Marable, J.H.; Weisbin, C.R.; Wagschal, J.J.

    1980-01-01

    Benchmark integral-experiment values from six fast critical-reactor assemblies and two standard neutron fields are combined with corresponding calculations using group cross sections based on ENDF/B-V in a least-squares data adjustment using evaluated covariances from ENDF/B-V and supporting covariance evaluations. The purpose is to produce an adjusted cross-section and covariance library which is based on well-documented data and methods and which is suitable for fast-reactor design. By use of such a library, data- and methods-related biases of calculated performance parameters should be reduced and the uncertainties of the calculated values minimized. Consistency of the extensive data base is analyzed using the chi-square test. This adjusted library, ORACLE, will be available shortly.
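
The least-squares adjustment described above can be sketched in a few lines. The numbers below are purely hypothetical, and the formulas (gain matrix, adjusted covariance, chi-square consistency statistic) are the standard generalized-least-squares ones, not ORACLE's actual implementation:

```python
import numpy as np

# Hypothetical two-parameter, two-experiment adjustment.
x = np.array([1.00, 2.00])            # prior (differential) parameters
C = np.diag([0.04, 0.09])             # prior parameter covariance
S = np.array([[0.5, 0.3],
              [0.2, 0.7]])            # sensitivities d(integral)/d(parameter)
m = np.array([1.15, 1.60])            # measured integral values
V = np.diag([0.01, 0.01])             # integral-experiment covariance

t = S @ x                             # calculated integral values
G = S @ C @ S.T + V                   # covariance of the residual m - t
K = C @ S.T @ np.linalg.inv(G)        # least-squares gain matrix
x_adj = x + K @ (m - t)               # adjusted parameters
C_adj = C - K @ S @ C                 # adjusted (reduced) covariance

# Chi-square consistency test of the combined data base:
chi2 = float((m - t) @ np.linalg.solve(G, m - t))
```

By construction the adjusted covariance has a smaller trace than the prior one, reflecting the reduced uncertainty after adjustment.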

  7. Development of web-based user interface for evaluated covariance data files

    International Nuclear Information System (INIS)

    Togashi, Tomoaki; Kato, Kiyoshi; Suzuki, Ryusuke; Otuka, Naohiko

    2010-01-01

    We have developed a web-based interface that visualizes cross sections with their covariances compiled in the ENDF format, in order to support users of evaluated covariance data who do not have experience with NJOY calculations. The package of programs has been constructed without the aid of any existing program libraries. (author)

  8. ERRFILS: a preliminary library of 30-group multigroup covariance data for use in CTR sensitivity studies

    International Nuclear Information System (INIS)

    LaBauve, R.J.; Muir, D.W.

    1978-01-01

    A library of 30-group multigroup covariance data was prepared from preliminary ENDF/B-V data with the NJOY code. Data for Fe, Cr, Ni, 10B, C, Cu, H, and Pb are included in this library. Reactions include total cross sections, elastic and inelastic scattering cross sections, and the most important absorption cross sections. Typical data from the file are shown. 3 tables

  9. ZZ COVFILS, 30-Group Covariance Library from ENDF/B-5 for Sensitivity Studies

    International Nuclear Information System (INIS)

    Muir, D.W.

    1997-01-01

    1 - Description of program or function: Format: ENDF/B; Number of groups: 30-group covariance library; Nuclides: H-1, B-10, C, O-16, Cr, Fe, Ni, Cu, Pb. Origin: ENDF/B-V. COVFILS is a 30-group covariance library. It contains neutron cross sections and their uncertainties and correlations in multigroup form. These data can be used, in conjunction with sensitivity information, to estimate the data-related uncertainty in calculated integral quantities such as radiation damage or heating. 2 - Method of solution: COVFILS was obtained by processing evaluations from ENDF/B-V with the ERRORR module of the NJOY nuclear data processing system (LA-9303-M, Vol. 1). The group structure is the Los Alamos 30-group structure, which is listed in 'File 1' of each multigroup data set in the library.
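
Combining such a multigroup covariance library with sensitivity information amounts to the "sandwich rule": the variance of an integral response R is s^T C s, where s is the vector of group sensitivities. A minimal sketch with a synthetic 30-group covariance (all numbers invented, not COVFILS data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups = 30
A = rng.normal(size=(n_groups, n_groups))
cov = A @ A.T / n_groups          # synthetic positive-semidefinite 30x30 covariance
s = rng.normal(size=n_groups)     # sensitivity coefficients dR/dsigma_g per group

var_R = s @ cov @ s               # "sandwich rule" variance of the response R
abs_unc = np.sqrt(var_R)          # one-sigma data-related uncertainty on R
```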

  10. VITAMIN-J/COVA/EFF-3 cross-section covariance matrix library and its use to analyse benchmark experiments in sinbad database

    International Nuclear Information System (INIS)

    Kodeli, Ivan-Alexander

    2005-01-01

    The new cross-section covariance matrix library ZZ-VITAMIN-J/COVA/EFF3, intended to simplify and encourage sensitivity and uncertainty analysis, was prepared and is available from the NEA Data Bank. The library is organised in a ready-to-use form including both the covariance matrix data and processing tools:
    - Cross-section covariance matrices from the EFF-3 evaluation for five materials: 9Be, 28Si, 56Fe, 58Ni and 60Ni. Other data will be included when available.
    - FORTRAN program ANGELO-2 to extrapolate/interpolate the covariance matrices to a user-defined energy group structure.
    - FORTRAN program LAMBDA to verify the mathematical properties of the covariance matrices, such as symmetry, positive definiteness, etc.
    The preparation, testing and use of the covariance matrix library are presented. The uncertainties based on the cross-section covariance data were compared with those based on other evaluations, such as ENDF/B-VI. The collapsing procedure used in the ANGELO-2 code was compared with and validated against the one used in the NJOY system.
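
The kind of checks the LAMBDA program performs, symmetry and positive semi-definiteness, can be sketched as follows; the function and example matrices are illustrative, not LAMBDA's actual code:

```python
import numpy as np

def check_covariance(C, tol=1e-10):
    """Return (symmetric, positive_semidefinite) flags for a candidate covariance."""
    symmetric = np.allclose(C, C.T, atol=tol)
    # Symmetrize before the eigenvalue test so eigvalsh is applicable.
    min_eig = np.linalg.eigvalsh((C + C.T) / 2).min()
    return symmetric, bool(min_eig >= -tol)

good = np.array([[0.04, 0.01],
                 [0.01, 0.09]])
bad = np.array([[0.04, 0.30],
                [0.30, 0.09]])   # implied correlation > 1: not a valid covariance
```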

  11. Econometric analysis of realized covariation: high frequency based covariance, regression, and correlation in financial economics

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2004-01-01

    This paper analyses multivariate high frequency financial data using realized covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis, and covariance. It will be based on a fixed interval of time (e.g., a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions, and covariances change through time. In particular we provide confidence intervals for each of these quantities.

  12. Partial covariance based functional connectivity computation using Ledoit-Wolf covariance regularization.

    Science.gov (United States)

    Brier, Matthew R; Mitra, Anish; McCarthy, John E; Ances, Beau M; Snyder, Abraham Z

    2015-11-01

    Functional connectivity refers to shared signals among brain regions and is typically assessed in a task-free state. Functional connectivity is commonly quantified between signal pairs using Pearson correlation. However, resting-state fMRI is a multivariate process exhibiting a complicated covariance structure. Partial covariance assesses the unique variance shared between two brain regions excluding any widely shared variance, and hence is appropriate for the analysis of multivariate fMRI datasets. However, calculation of partial covariance requires inversion of the covariance matrix, which, in most functional connectivity studies, is not invertible owing to rank deficiency. Here we apply Ledoit-Wolf shrinkage (L2 regularization) to invert the high-dimensional BOLD covariance matrix. We investigate the network organization and brain-state dependence of partial covariance-based functional connectivity. Although RSNs are conventionally defined in terms of shared variance, removal of widely shared variance, surprisingly, improved the separation of RSNs in a spring-embedded graphical model. This result suggests that pair-wise unique shared variance plays a heretofore unrecognized role in RSN covariance organization. In addition, application of partial correlation to fMRI data acquired in the eyes-open vs. eyes-closed states revealed focal changes in uniquely shared variance between the thalamus and visual cortices. This result suggests that partial correlation of resting-state BOLD time series reflects functional processes in addition to structural connectivity.
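
The shrinkage-then-invert step described above can be sketched with synthetic data standing in for BOLD time series; this uses scikit-learn's LedoitWolf estimator and the standard precision-to-partial-correlation formula, not the authors' actual pipeline:

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(0)
n_timepoints, n_regions = 120, 200          # rank-deficient: regions > timepoints
X = rng.normal(size=(n_timepoints, n_regions))

cov = LedoitWolf().fit(X).covariance_       # shrunk, hence invertible, covariance
prec = np.linalg.inv(cov)                   # precision matrix

# Partial correlation between regions i and j, given all other regions:
d = np.sqrt(np.diag(prec))
partial_corr = -prec / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)
```

Without shrinkage the 200x200 sample covariance of 120 time points would be singular and the precision matrix would not exist.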

  13. Parametric Covariance Model for Horizon-Based Optical Navigation

    Science.gov (United States)

    Hikes, Jacob; Liounis, Andrew J.; Christian, John A.

    2016-01-01

    This Note presents an entirely parametric version of the covariance for horizon-based optical navigation measurements. The covariance can be written as a function of only the spacecraft position, two sensor design parameters, the illumination direction, the size of the observed planet, the size of the lit arc to be used, and the total number of observed horizon points. As a result, one may now more clearly understand the sensitivity of horizon-based optical navigation performance as a function of these key design parameters, which is insight that was obscured in previous (and nonparametric) versions of the covariance. Finally, the new parametric covariance is shown to agree with both the nonparametric analytic covariance and results from a Monte Carlo analysis.

  14. Video based object representation and classification using multiple covariance matrices.

    Science.gov (United States)

    Zhang, Yurong; Liu, Quan

    2017-01-01

    Video-based object recognition and classification has been widely studied in the computer vision and image processing areas. One main issue of this task is to develop an effective representation for video. This problem can generally be formulated as image set representation. In this paper, we present a new method called Multiple Covariance Discriminative Learning (MCDL) for the image set representation and classification problem. The core idea of MCDL is to represent an image set using multiple covariance matrices, with each covariance matrix representing one cluster of images. First, we use the Nonnegative Matrix Factorization (NMF) method to cluster the images within each image set, and then adopt Covariance Discriminative Learning on each cluster (subset) of images. Finally, we adopt KLDA and a nearest neighbor classification method for image set classification. Promising experimental results on several datasets show the effectiveness of our MCDL method.
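
The first two steps of such a pipeline (NMF-based clustering of an image set, then one covariance matrix per cluster) can be roughly sketched as follows. Random data stands in for video frames, and assigning each frame to its dominant NMF factor is a simplification, not the paper's exact clustering procedure:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
frames = rng.random((60, 64))            # 60 "frames" of 64 nonnegative features

# Cluster frames via NMF: each frame goes to its dominant latent factor.
W = NMF(n_components=3, init="random", random_state=0,
        max_iter=500).fit_transform(frames)
labels = W.argmax(axis=1)

# Represent each cluster (with at least 2 members) by its covariance matrix.
covariances = [np.cov(frames[labels == k], rowvar=False)
               for k in range(3) if (labels == k).sum() > 1]
```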

  15. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
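
The core operation in such sampling codes, drawing correlated samples consistent with a given covariance matrix via its Cholesky factor, can be sketched as follows (hypothetical parameter values, not actual resonance parameters):

```python
import numpy as np

rng = np.random.default_rng(42)
mu = np.array([10.0, 5.0, 1.0])                 # hypothetical mean parameters
C = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])              # hypothetical covariance matrix

L = np.linalg.cholesky(C)                       # C = L @ L.T
# x = mu + L z with z ~ N(0, I) gives samples with covariance C.
samples = mu + rng.standard_normal((100_000, 3)) @ L.T

sample_cov = np.cov(samples, rowvar=False)      # approximates C for large samples
```

Comparing the sample covariance with the input covariance, as above, is one of the basic validation checks such a code can perform.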

  16. Cross-covariance based global dynamic sensitivity analysis

    Science.gov (United States)

    Shi, Yan; Lu, Zhenzhou; Li, Zhao; Wu, Mengmeng

    2018-02-01

    To identify the sources of the cross-covariance of the dynamic output at each time instant, for structural systems involving both random input variables and stochastic processes, a global dynamic sensitivity (GDS) technique is proposed. The GDS considers the effect of time-history inputs on the dynamic output. In the GDS, a cross-covariance decomposition is first developed to measure the contribution of the inputs to the output at different time instants, and an integration of the cross-covariance change over a specific time interval is employed to measure the whole contribution of an input to the cross-covariance of the output. The GDS main effect indices and the GDS total effect indices can then be easily defined after the integration; they are effective in identifying, respectively, the important inputs and the non-influential inputs with respect to the cross-covariance of the output at each time instant. The established GDS analysis model has the same form as the classical ANOVA when it degenerates to the static case. After degeneration, the first-order partial effect reflects the individual effects of the inputs on the output variance, and the second-order partial effect reflects the interaction effects on the output variance, which illustrates the consistency of the proposed GDS indices with the classical variance-based sensitivity indices. A Monte Carlo simulation (MCS) procedure and a Kriging surrogate method are developed to compute the proposed GDS indices. Several examples illustrate the significance of the proposed GDS analysis technique and the effectiveness of the proposed solution.

  17. Matérn-based nonstationary cross-covariance models for global processes

    KAUST Repository

    Jun, Mikyoung

    2014-01-01

    -covariance models, based on the Matérn covariance model class, that are suitable for describing prominent nonstationary characteristics of the global processes. In particular, we seek nonstationary versions of Matérn covariance models whose smoothness parameters

  18. Robust Covariance Estimators Based on Information Divergences and Riemannian Manifold

    Directory of Open Access Journals (Sweden)

    Xiaoqiang Hua

    2018-03-01

    This paper proposes a class of covariance estimators based on information divergences in heterogeneous environments. In particular, the problem of covariance estimation is reformulated on the Riemannian manifold of Hermitian positive-definite (HPD) matrices. The means associated with information divergences are derived and used as the estimators. Without resorting to complete knowledge of the probability distribution of the sample data, the geometry of the Riemannian manifold of HPD matrices is considered in the mean estimators. Moreover, the robustness of the mean estimators is analyzed using the influence function. Simulation results indicate the robustness and superiority of an adaptive normalized matched filter with the proposed estimators compared with existing alternatives.

  19. Managing distance and covariate information with point-based clustering

    Directory of Open Access Journals (Sweden)

    Peter A. Whigham

    2016-09-01

    Background: Geographic perspectives of disease and the human condition often involve point-based observations and questions of clustering or dispersion within a spatial context. These problems involve a finite set of point observations and are constrained by a larger, but finite, set of locations where the observations could occur. Developing a rigorous method for pattern analysis in this context requires handling spatial covariates, a method for constrained finite spatial clustering, and addressing bias in geographic distance measures. An approach based on Ripley's K, applied to the problem of clustering of deliberate self-harm (DSH), is presented.
    Methods: Point-based Monte-Carlo simulation of Ripley's K, accounting for socio-economic deprivation and sources of distance measurement bias, was developed to estimate clustering of DSH at a range of spatial scales. A rotated Minkowski L1 distance metric allowed variation in physical distance and clustering to be assessed. Self-harm data were derived from an audit of 2 years' emergency hospital presentations (n = 136) in a New Zealand town (population ~50,000). The study area was defined by residential (housing) land parcels representing a finite set of possible point addresses.
    Results: Area-based deprivation was spatially correlated. Accounting for deprivation and distance bias showed evidence for clustering of DSH at spatial scales up to 500 m with a one-sided 95 % CI, suggesting that social contagion may be present for this urban cohort.
    Conclusions: Many problems involve finite locations in geographic space that require estimates of distance-based clustering at many scales. A Monte-Carlo approach to Ripley's K, incorporating covariates and models for distance bias, is crucial when assessing health-related clustering. The case study showed that social network structure defined at the neighbourhood level may account for aspects of neighbourhood clustering of DSH. Accounting for
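
A Monte-Carlo test of Ripley's K against a finite set of candidate locations, in the spirit of the Methods above, can be sketched as follows (naive edge-uncorrected K, synthetic coordinates, and no covariate or distance-bias adjustment):

```python
import numpy as np

rng = np.random.default_rng(1)
locations = rng.uniform(0, 1000, size=(500, 2))    # candidate addresses (metres)
case_idx = rng.choice(500, size=40, replace=False) # observed "cases"

def k_hat(points, r, area=1000.0 * 1000.0):
    """Naive, edge-uncorrected Ripley's K at radius r."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = (d < r).sum() - n                      # drop the n self-distances
    return area * pairs / (n * (n - 1))

r = 100.0
k_obs = k_hat(locations[case_idx], r)

# Null distribution: random relabellings over the same candidate locations.
k_null = np.array([k_hat(locations[rng.choice(500, 40, replace=False)], r)
                   for _ in range(200)])
p_value = (np.sum(k_null >= k_obs) + 1) / (len(k_null) + 1)
```

Constraining the null draws to the finite candidate set, rather than to anywhere in the study window, is what distinguishes this scheme from ordinary complete-spatial-randomness testing.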

  20. Cross-covariance functions for multivariate random fields based on latent dimensions

    KAUST Repository

    Apanasovich, T. V.; Genton, M. G.

    2010-01-01

    The problem of constructing valid parametric cross-covariance functions is challenging. We propose a simple methodology, based on latent dimensions and existing covariance models for univariate random fields, to develop flexible, interpretable

  1. Robust adaptive multichannel SAR processing based on covariance matrix reconstruction

    Science.gov (United States)

    Tan, Zhen-ya; He, Feng

    2018-04-01

    With the combination of digital beamforming (DBF) processing, multichannel synthetic aperture radar (SAR) systems in azimuth promise well in high-resolution and wide-swath imaging, whereas conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper proposes a robust adaptive multichannel SAR processing method which first utilizes the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to acquire the multichannel SAR processing filter. This novel method improves processing performance under nonuniform scattering coefficients and is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
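
The two stages described, a Capon spatial spectrum followed by covariance reconstruction by integrating over a set of directions, can be sketched for a generic uniform linear array. The synthetic data, sector, and source parameters below are illustrative, not the paper's SAR configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_ch, n_snap = 8, 200
pos = np.arange(n_ch)                          # half-wavelength element spacing

def steer(theta):
    """Unit-norm steering vector of a uniform linear array."""
    return np.exp(1j * np.pi * pos * np.sin(theta)) / np.sqrt(n_ch)

# One strong interferer at 30 degrees plus white noise.
theta_i = np.deg2rad(30.0)
sig = 3.0 * (rng.normal(size=n_snap) + 1j * rng.normal(size=n_snap))
noise = (rng.normal(size=(n_ch, n_snap))
         + 1j * rng.normal(size=(n_ch, n_snap))) / np.sqrt(2)
X = np.outer(steer(theta_i), sig) + noise
R = X @ X.conj().T / n_snap                    # sample covariance matrix
R_inv = np.linalg.inv(R)

# Capon spectrum over a sector, then covariance reconstruction by definition.
thetas = np.deg2rad(np.linspace(10.0, 50.0, 81))
R_rec = np.zeros((n_ch, n_ch), dtype=complex)
for th in thetas:
    a = steer(th)
    p = 1.0 / np.real(a.conj() @ R_inv @ a)    # Capon power estimate at angle th
    R_rec += p * np.outer(a, a.conj())
R_rec /= len(thetas)
```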

  2. The impact of covariance misspecification in group-based trajectory models for longitudinal data with non-stationary covariance structure.

    Science.gov (United States)

    Davies, Christopher E; Glonek, Gary Fv; Giles, Lynne C

    2017-08-01

    One purpose of a longitudinal study is to gain a better understanding of how an outcome of interest changes among a given population over time. In what follows, a trajectory will be taken to mean the series of measurements of the outcome variable for an individual. Group-based trajectory modelling methods seek to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Group-based trajectory models generally assume a certain structure in the covariances between measurements, for example conditional independence, homogeneous variance between groups or stationary variance over time. Violations of these assumptions could be expected to result in poor model performance. We used simulation to investigate the effect of covariance misspecification on misclassification of trajectories in commonly used models under a range of scenarios. To do this we defined a measure of performance relative to the ideal Bayesian correct classification rate. We found that the more complex models generally performed better over a range of scenarios. In particular, incorrectly specified covariance matrices could significantly bias the results but using models with a correct but more complicated than necessary covariance matrix incurred little cost.

  3. Needs of reliable nuclear data and covariance matrices for Burnup Credit in JEFF-3 library

    International Nuclear Information System (INIS)

    Chambon, A.; Santamarina, A.; Riffard, C.; Lavaud, F.; Lecarpentier, D.

    2013-01-01

    Burnup Credit (BUC) is the concept of taking credit for the reduction in reactivity of spent nuclear fuel due to its burnup. In the case of PWR-MOx spent fuel, studies pointed out that the contribution to the credit of the 15 most absorbing, stable and non-volatile fission products selected is as important as that of the actinides. In order to get a 'best estimate' value of the keff, biases in their inventory calculation and individual reactivity worth should be considered in criticality safety studies. This paper identifies the most penalizing biases towards criticality and highlights possible improvements of nuclear data for the 15 fission products (FPs) of PWR-MOx BUC. Concerning the fuel inventory, trends as a function of burnup can be derived from the experimental validation of the DARWIN-2.3 package (using the JEFF-3.1.1/SHEM library). Thanks to the BUC oscillation programme of separated FPs in the MINERVE reactor and the fully validated PIMS calculation scheme, calculation-over-experiment ratios can be accurately transposed into tendencies on the FP integral cross sections. (authors)

  4. The Performance Analysis Based on SAR Sample Covariance Matrix

    Directory of Open Access Journals (Sweden)

    Esra Erten

    2012-03-01

    Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, a statistical description of the data is almost mandatory for its utilization. The complex images acquired over natural media present in general zero-mean circular Gaussian characteristics. In this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. For practical situations, however, the covariance matrix has to be estimated using a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown to be of high relevance in different areas regarding the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has been frequently used in different applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, moving target indication, etc. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix of multi-channel SAR images is presented in a form simplified for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well.
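
The statistical behavior of the maximum sample eigenvalue is easy to explore by simulation. The sketch below draws zero-mean circular Gaussian data with identity true covariance (so the sample covariance is complex-Wishart distributed) and exposes the well-known upward bias of the largest sample eigenvalue; the channel and look counts are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(7)
p, n, trials = 3, 25, 500                      # channels, looks, realizations

max_eigs = np.empty(trials)
for t in range(trials):
    # Zero-mean circular complex Gaussian with true covariance = identity.
    z = (rng.normal(size=(p, n)) + 1j * rng.normal(size=(p, n))) / np.sqrt(2)
    S = z @ z.conj().T / n                     # sample covariance matrix
    max_eigs[t] = np.linalg.eigvalsh(S)[-1]    # largest sample eigenvalue

# The largest sample eigenvalue overestimates the true largest eigenvalue (1).
bias = max_eigs.mean() - 1.0
```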

  5. Experimental OAI-Based Digital Library Systems

    Science.gov (United States)

    Nelson, Michael L. (Editor); Maly, Kurt (Editor); Zubair, Mohammad (Editor); Rusch-Feja, Diann (Editor)

    2002-01-01

    The objective of the Open Archives Initiative (OAI) is to develop a simple, lightweight framework to facilitate the discovery of content in distributed archives (http://www.openarchives.org). The focus of the workshop held at the 5th European Conference on Research and Advanced Technology for Digital Libraries (ECDL 2001) was to bring together researchers in the area of digital libraries who are building OAI-based systems, so that they could share their experiences, the problems they are facing, and the approaches they are taking to address them. The workshop consisted of invited talks from well-established researchers working on building OAI-based digital library systems, along with short paper presentations.

  6. Spatio-Temporal Audio Enhancement Based on IAA Noise Covariance Matrix Estimates

    DEFF Research Database (Denmark)

    Nørholm, Sidsel Marie; Jensen, Jesper Rindom; Christensen, Mads Græsbøll

    2014-01-01

    A method for estimating the noise covariance matrix in a multichannel setup is proposed. The method is based on the iterative adaptive approach (IAA), which only needs short segments of data to estimate the covariance matrix. Therefore, the method can be used for fast varying signals. The method is based on an assumption of the desired signal being harmonic, which is used for estimating the noise covariance matrix from the covariance matrix of the observed signal. The noise covariance estimate is used in the linearly constrained minimum variance (LCMV) filter and compared
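
The LCMV filter mentioned above, reduced to a single distortionless constraint, can be sketched as follows; the noise covariance is synthetic and the signal direction d is assumed known, so this illustrates only the filtering step, not the IAA covariance estimation:

```python
import numpy as np

rng = np.random.default_rng(3)
n_ch = 4
A = rng.normal(size=(n_ch, n_ch))
R = A @ A.T + n_ch * np.eye(n_ch)     # synthetic noise covariance estimate (PD)

d = np.ones(n_ch) / np.sqrt(n_ch)     # assumed known desired-signal direction
R_inv = np.linalg.inv(R)
w = R_inv @ d / (d @ R_inv @ d)       # LCMV weights for the constraint w.d = 1

distortionless = w @ d                # equals 1 by construction
out_power = w @ R @ w                 # noise power at the filter output
```

Among all weight vectors satisfying the constraint, these weights minimize the output noise power; in particular they do no worse than the trivial choice w = d.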

  7. Curriculum-based neurosurgery digital library.

    Science.gov (United States)

    Langevin, Jean-Philippe; Dang, Thai; Kon, David; Sapo, Monica; Batzdorf, Ulrich; Martin, Neil

    2010-11-01

    Recent work-hour restrictions and the constantly evolving body of knowledge are challenging the current ways of teaching neurosurgery residents. Our objective was to develop a curriculum-based digital library of multimedia content to address these challenges in neurosurgery education. We used the residency program curriculum developed by the Congress of Neurological Surgeons to structure the library, and Microsoft SharePoint as the user interface. This project led to the creation of a user-friendly and searchable digital library that can be accessed remotely and throughout the hospital, including the operating rooms. The electronic format allows standardization of the content and transformation of the operating room into a classroom. This in turn facilitates the implementation of a curriculum within the training program and improves teaching efficiency. Future work will focus on evaluating the efficacy of the library as a teaching tool for residents.

  8. Cross-covariance functions for multivariate random fields based on latent dimensions

    KAUST Repository

    Apanasovich, T. V.

    2010-02-16

    The problem of constructing valid parametric cross-covariance functions is challenging. We propose a simple methodology, based on latent dimensions and existing covariance models for univariate random fields, to develop flexible, interpretable and computationally feasible classes of cross-covariance functions in closed form. We focus on spatio-temporal cross-covariance functions that can be nonseparable and asymmetric, and that can have different covariance structures, for instance different smoothness parameters, in each component. We discuss estimation of these models and perform a small simulation study to demonstrate our approach. We illustrate our methodology on a trivariate spatio-temporal pollution dataset from California and demonstrate that our cross-covariance model performs better than competing models. © 2010 Biometrika Trust.
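
    The latent-dimension idea can be illustrated with a minimal sketch: each variable gets a latent coordinate, and a valid univariate covariance evaluated on the augmented distance yields a valid joint covariance. The exponential covariance and all numbers below are illustrative choices, not the models from the paper:

```python
import numpy as np

# Latent-dimension construction (sketch): give each variable k a latent coordinate
# xi_k and evaluate a valid *univariate* covariance on the augmented distance
# sqrt((a*h)^2 + (b*dxi)^2); validity in the augmented space carries over to the
# multivariate model. The exponential covariance is used here for illustration.
def cross_cov(h, xi_i, xi_j, sigma2=1.0, a=1.0, b=2.0):
    d = np.sqrt((a * h) ** 2 + (b * (xi_i - xi_j)) ** 2)
    return sigma2 * np.exp(-d)

# Joint covariance of two variables (latent coordinates 0.0 and 0.4) observed at
# the same 20 one-dimensional sites.
sites = np.linspace(0.0, 5.0, 20)
xi = [0.0, 0.4]
H = np.abs(sites[:, None] - sites[None, :])
C = np.block([[cross_cov(H, xi[i], xi[j]) for j in range(2)] for i in range(2)])

# The construction guarantees a symmetric positive semi-definite joint covariance.
print(np.linalg.eigvalsh(C).min() > -1e-10)  # True
```

The latent separation `|xi_i - xi_j|` controls the strength of the cross-correlation between the two variables: a larger separation gives weaker cross-dependence.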

  9. Zero-base budgeting and the library.

    Science.gov (United States)

    Sargent, C W

    1978-01-01

    This paper describes the application of zero-base budgeting to libraries and the procedures involved in setting up this type of budget. It describes the "decision packages" necessary when this system is employed, how to rank the packages, and the problems related to the process. Zero-base budgeting involves the entire staff of a library, and the incentive engendered makes for a better and more realistic budget. The paper concludes with the problems one might encounter in zero-base budgeting and the major benefits of the system. PMID:626795

  10. A measure of association between vectors based on "similarity covariance"

    OpenAIRE

    Pascual-Marqui, Roberto D.; Lehmann, Dietrich; Kochi, Kieko; Kinoshita, Toshihiko; Yamada, Naoto

    2013-01-01

    The "maximum similarity correlation" definition introduced in this study is motivated by the seminal work of Szekely et al on "distance covariance" (Ann. Statist. 2007, 35: 2769-2794; Ann. Appl. Stat. 2009, 3: 1236-1265). Instead of using Euclidean distances "d" as in Szekely et al, we use "similarity", which can be defined as "exp(-d/s)", where the scaling parameter s>0 controls how rapidly the similarity falls off with distance. Scale parameters are chosen by maximizing the similarity corre...

  11. Covariance evaluation system

    International Nuclear Information System (INIS)

    Kawano, Toshihiko; Shibata, Keiichi.

    1997-09-01

    A covariance evaluation system for the evaluated nuclear data library was established. The parameter estimation method and the least-squares method with a spline function are used to generate the covariance data. Uncertainties of nuclear reaction model parameters are estimated from experimental data uncertainties; the covariance of the evaluated cross sections is then calculated by means of error propagation. The computer programs ELIESE-3, EGNASH4, ECIS, and CASTHY are used. Covariances of the 238U reaction cross sections were calculated with this system. (author)

  12. Dendrimer-based dynamic combinatorial libraries

    NARCIS (Netherlands)

    Chang, T.; Meijer, E.W.

    2005-01-01

    The aim of this project is to create water-soluble dynamic combinatorial libraries based upon dendrimer-guest complexes. The guest molecules are designed to bind to dendrimers using multiple secondary interactions, such as electrostatics and hydrogen bonding. We have been able to incorporate various guest…

  13. Sparse Covariance Matrix Estimation by DCA-Based Algorithms.

    Science.gov (United States)

    Phan, Duy Nhat; Le Thi, Hoai An; Dinh, Tao Pham

    2017-11-01

    This letter proposes a novel approach using ℓ0-norm regularization for the sparse covariance matrix estimation (SCME) problem. The objective function of the SCME problem is composed of a nonconvex part and the ℓ0 term, which is discontinuous and difficult to tackle. Appropriate DC (difference of convex functions) approximations of the ℓ0-norm are used that result in approximate SCME problems that are still nonconvex. DC programming and DCA (DC algorithms), powerful tools in the nonconvex programming framework, are investigated. Two DC formulations are proposed and corresponding DCA schemes developed. Two applications of the SCME problem are considered: classification via sparse quadratic discriminant analysis and portfolio optimization. A careful empirical experiment is performed on simulated and real data sets to study the performance of the proposed algorithms. Numerical results show their efficiency and their superiority over seven state-of-the-art methods.

  14. Pu239 Cross-Section Variations Based on Experimental Uncertainties and Covariances

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David Edward [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parsons, D. Kent [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-18

    Algorithms and software have been developed for producing variations in plutonium-239 neutron cross sections based on experimental uncertainties and covariances. The varied cross-section sets may be produced as random samples from the multivariate normal distribution defined by an experimental mean vector and covariance matrix, or they may be produced as Latin-hypercube/orthogonal-array samples (based on the same means and covariances) for use in parametrized studies. The variations obey two classes of constraints that are obligatory for cross-section sets and which put related constraints on the mean vector and covariance matrix that determine the sampling. Because the experimental means and covariances do not obey some of these constraints to sufficient precision, imposing the constraints requires modifying the experimental mean vector and covariance matrix. Modification is done with an algorithm, based on linear algebra, that minimizes changes to the means and covariances while ensuring that the operations that impose the different constraints do not conflict with each other.
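
    A minimal sketch of one ingredient, adjusting the mean vector and covariance matrix so that a linear constraint holds exactly before sampling, is given below. The three-group numbers and the single sum constraint are invented for illustration; the paper's actual constraints and data differ:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative 3-group cross-section mean vector and covariance (invented numbers,
# not real 239Pu data).
mu = np.array([1.5, 2.0, 0.5])
C = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.01]])

# Linear constraint A x = b, e.g. the partials must sum to a fixed total of 4.2.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([4.2])

# Minimal (generalized least-squares) adjustment of the mean and covariance so the
# constraint holds exactly for every sample:
K = C @ A.T @ np.linalg.inv(A @ C @ A.T)   # gain mapping the residual to updates
mu_adj = mu - K @ (A @ mu - b)
C_adj = C - K @ A @ C                      # singular along the constraint direction

# Samples drawn from the adjusted distribution satisfy A x = b up to round-off.
x = rng.multivariate_normal(mu_adj, C_adj, size=1000, check_valid="ignore")
resid = np.max(np.abs(x @ A.T - b))
print(resid)
```

The adjusted covariance is singular in the constraint direction, which is exactly what forces every sample onto the constraint surface; `check_valid="ignore"` silences the resulting semi-definiteness warning.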

  15. Covariance and sensitivity data generation at ORNL

    International Nuclear Information System (INIS)

    Leal, L. C.; Derrien, H.; Larson, N. M.; Alpan, A.

    2005-01-01

    Covariance data are required to assess uncertainties in design parameters in several nuclear applications. The error estimation of calculated quantities relies on the nuclear data uncertainty information available in the basic nuclear data libraries, such as the US Evaluated Nuclear Data Library, ENDF/B. The uncertainty files in the ENDF/B library are obtained from the analysis of experimental data and are stored as variance and covariance data. In this paper we address the generation of covariance data in the resonance region with the computer code SAMMY. SAMMY is used in the evaluation of experimental data in the resolved and unresolved resonance energy regions. The fitting of cross-section data is based on the generalised least-squares formalism (Bayesian theory) together with the resonance formalism described by R-matrix theory. Two approaches are used in SAMMY for the generation of resonance-parameter covariance data. In the evaluation process SAMMY generates a set of resonance parameters that fit the data, and it provides the resonance-parameter covariances. For resonance-parameter evaluations where no resonance-parameter covariance data are available, the alternative is an approach called 'retroactive' resonance-parameter covariance generation. In this paper, we describe the application of the retroactive covariance generation approach to the gadolinium isotopes. (authors)

  16. The Cost of Library Services: Activity-Based Costing in an Australian Academic Library.

    Science.gov (United States)

    Robinson, Peter; Ellis-Newman, Jennifer

    1998-01-01

    Explains activity-based costing (ABC), discusses the benefits of ABC to library managers, and describes the steps involved in implementing ABC in an Australian academic library. Discusses the budgeting process in universities, and considers benefits to the library. (Author/LRW)

  17. Adaptive learning with covariate shift-detection for motor imagery-based brain–computer interface

    OpenAIRE

    Raza, H; Cecotti, H; Li, Y; Prasad, G

    2015-01-01

    A common assumption in traditional supervised learning is the similar probability distribution of data between the training phase and the testing/operating phase. When transitioning from the training to testing phase, a shift in the probability distribution of input data is known as a covariate shift. Covariate shifts commonly arise in a wide range of real-world systems such as electroencephalogram-based brain–computer interfaces (BCIs). In such systems, there is a necessity for continuous mo...

  18. Condition-based inspection/replacement policies for non-monotone deteriorating systems with environmental covariates

    Energy Technology Data Exchange (ETDEWEB)

    Zhao Xuejing [Universite de Technologie de Troyes, Institut Charles Delaunay and STMR UMR CNRS 6279, 12 rue Marie Curie, 10010 Troyes (France); School of mathematics and statistics, Lanzhou University, Lanzhou 730000 (China); Fouladirad, Mitra, E-mail: mitra.fouladirad@utt.f [Universite de Technologie de Troyes, Institut Charles Delaunay and STMR UMR CNRS 6279, 12 rue Marie Curie, 10010 Troyes (France); Berenguer, Christophe [Universite de Technologie de Troyes, Institut Charles Delaunay and STMR UMR CNRS 6279, 12 rue Marie Curie, 10010 Troyes (France); Bordes, Laurent [Universite de Pau et des Pays de l' Adour, LMA UMR CNRS 5142, 64013 PAU Cedex (France)

    2010-08-15

    The aim of this paper is to discuss the problem of modelling and optimising condition-based maintenance policies for a deteriorating system in the presence of covariates. The deterioration is modelled by a non-monotone stochastic process. The covariate process is assumed to be a time-homogeneous Markov chain with finite state space. A model similar to the proportional hazards model is used to show the influence of covariates on the deterioration. In the framework of the system under consideration, an appropriate inspection/replacement policy which minimises the expected average maintenance cost is derived. The average cost under different covariate conditions and different maintenance policies is analysed through simulation experiments to compare the policies' performances.

  19. Condition-based inspection/replacement policies for non-monotone deteriorating systems with environmental covariates

    International Nuclear Information System (INIS)

    Zhao Xuejing; Fouladirad, Mitra; Berenguer, Christophe; Bordes, Laurent

    2010-01-01

    The aim of this paper is to discuss the problem of modelling and optimising condition-based maintenance policies for a deteriorating system in the presence of covariates. The deterioration is modelled by a non-monotone stochastic process. The covariate process is assumed to be a time-homogeneous Markov chain with finite state space. A model similar to the proportional hazards model is used to show the influence of covariates on the deterioration. In the framework of the system under consideration, an appropriate inspection/replacement policy which minimises the expected average maintenance cost is derived. The average cost under different covariate conditions and different maintenance policies is analysed through simulation experiments to compare the policies' performances.

  20. JDATAVIEWER – JAVA-Based Charting Library

    CERN Document Server

    Kruk, G

    2009-01-01

    The JDataViewer is a Java-based charting library developed at CERN, with powerful, extensible and easy-to-use function-editing capabilities. Function editing is heavily used in control system applications, but poorly supported in products available on the market. The JDataViewer enables adding, removing and modifying function points graphically (using a mouse) or by editing a table of values. Custom editing strategies are supported: a developer can specify an algorithm that reacts to the modification of a given point in the function by automatically adapting all other points. The library provides all typical 2D plotting types (scatter, polyline, area, bar, HiLo, contour), as well as data-point annotations and data indicators. It also supports common interactors to zoom and move the visible view, or to select and highlight function segments. A clear API is provided to configure and customize all chart elements (colors, fonts, data ranges ...) programmatically, and to integrate non-standard rendering types, inter…

  1. Matérn-based nonstationary cross-covariance models for global processes

    KAUST Repository

    Jun, Mikyoung

    2014-07-01

    Many spatial processes in environmental applications, such as climate variables and climate model errors on a global scale, exhibit complex nonstationary dependence structure, in not only their marginal covariance but also their cross-covariance. Flexible cross-covariance models for processes on a global scale are critical for an accurate description of each spatial process as well as the cross-dependences between them and also for improved predictions. We propose various ways to produce cross-covariance models, based on the Matérn covariance model class, that are suitable for describing prominent nonstationary characteristics of the global processes. In particular, we seek nonstationary versions of Matérn covariance models whose smoothness parameters vary over space, coupled with a differential-operator approach for modeling large-scale nonstationarity. We compare their performance to that of some existing models in terms of the AIC and spatial predictions in two applications: joint modeling of surface temperature and precipitation, and joint modeling of errors in climate model ensembles. © 2014 Elsevier Inc.

  2. Web-Based Instruction: A Guide for Libraries, Third Edition

    Science.gov (United States)

    Smith, Susan Sharpless

    2010-01-01

    Expanding on the popular, practical how-to guide for public, academic, school, and special libraries, technology expert Susan Sharpless Smith offers library instructors the confidence to take Web-based instruction into their own hands. Smith has thoroughly updated "Web-Based Instruction: A Guide for Libraries" to include new tools and trends,…

  3. Library-Based Learning in an Information Society.

    Science.gov (United States)

    Breivik, Patricia Senn

    1986-01-01

    The average academic library has great potential for quality nonclassroom learning benefiting students, faculty, alumni, and the local business community. The major detriments are the limited perceptions about libraries and librarians among campus administrators and faculty. Library-based learning should be planned to be assimilated into overall…

  4. Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis

    Science.gov (United States)

    George, E.; Chan, K.

    2012-09-01

    Several relationships are developed relating object size, initial covariance and range at closest approach to the probability of collision. These relationships address the following questions: - Given the objects' initial covariance and combined hard-body size, what is the maximum possible value of the probability of collision (Pc)? - Given the objects' initial covariance, what is the maximum combined hard-body radius for which the probability of collision does not exceed the tolerance limit? - Given the objects' initial covariance and the combined hard-body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit? - Given the objects' initial covariance and the miss distance, what is the maximum combined hard-body radius for which the probability of collision does not exceed the tolerance limit? The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. Applying this pre-filter to present-day catalogs with estimated covariances eliminates approximately 35% of object pairs as unable ever to conjunct with a probability of collision exceeding 1×10⁻⁶. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element-set age or quality, a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals that the screening volumes for small objects are much larger than needed, while the screening volumes for…
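
    The quantities involved can be illustrated with a brute-force sketch: the probability of collision is the integral of the combined position uncertainty (a 2-D Gaussian in the encounter plane) over the combined hard-body disc. The covariance values, hard-body radius and function name below are illustrative assumptions, and simple grid quadrature stands in for the analytic relationships developed in the paper:

```python
import numpy as np

def collision_probability(miss, sigma_x, sigma_y, hbr, n=201):
    """Pc as the integral of a zero-mean 2-D Gaussian (combined covariance in the
    encounter plane, principal axes aligned) over a disc of combined hard-body
    radius hbr centred at the miss vector; simple grid quadrature."""
    xs = np.linspace(-hbr, hbr, n)
    X, Y = np.meshgrid(xs, xs)
    inside = X ** 2 + Y ** 2 <= hbr ** 2
    pdf = np.exp(-0.5 * (((X + miss[0]) / sigma_x) ** 2
                         + ((Y + miss[1]) / sigma_y) ** 2))
    pdf /= 2.0 * np.pi * sigma_x * sigma_y
    da = (xs[1] - xs[0]) ** 2
    return float(np.sum(pdf[inside]) * da)

# Illustrative values (metres): covariance semi-axes and combined hard-body radius.
sx, sy, hbr = 200.0, 100.0, 5.0

# With the covariance fixed, Pc is largest at zero miss distance; a pre-filter can
# therefore discard a pair if even this best case stays below a tolerance (e.g. 1e-6).
pc_max = collision_probability((0.0, 0.0), sx, sy, hbr)
pc_far = collision_probability((2000.0, 0.0), sx, sy, hbr)
print(pc_max > 1e-6, pc_far < 1e-6)  # True True
```

Because the covariance here is much larger than the hard body, Pc is roughly the disc area times the Gaussian density at the miss point, which is the small-object regime where the pre-filter bites hardest.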

  5. Few group collapsing of covariance matrix data based on a conservation principle

    International Nuclear Information System (INIS)

    Hiruta, H.; Palmiotti, G.; Salvatores, M.; Arcilla, R. Jr.; Oblozinsky, P.; McKnight, R.D.

    2008-01-01

    A new algorithm for rigorous collapsing of covariance data is proposed, derived, implemented, and tested. The method is based on a conservation principle that makes it possible to preserve, in a broad energy-group structure, the uncertainty calculated in a fine energy-group structure for a specific integral parameter, using the associated sensitivity coefficients as weights.
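
    The conservation principle can be sketched directly: weighting the fine-group covariance blocks by the sensitivity coefficients and normalizing by the collapsed sensitivities reproduces the fine-group uncertainty of the integral parameter exactly. The group structure and numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Fine-group covariance (6 groups, symmetric PSD) and sensitivity coefficients of
# one integral parameter; the numbers are illustrative.
A = rng.normal(size=(6, 6))
C_fine = A @ A.T / 6.0
S_fine = np.array([0.3, -0.1, 0.5, 0.2, 0.4, -0.2])

# Collapse 6 fine groups into 2 broad groups (3 fine groups per broad group),
# weighting by the sensitivities so the integral-parameter uncertainty is conserved:
#   C_IJ = (sum_{i in I, j in J} S_i C_ij S_j) / (S_I S_J),  with S_I = sum_{i in I} S_i
groups = [[0, 1, 2], [3, 4, 5]]
S_broad = np.array([S_fine[I].sum() for I in groups])
C_broad = np.array(
    [[S_fine[I] @ C_fine[np.ix_(I, J)] @ S_fine[J] for J in groups] for I in groups]
) / np.outer(S_broad, S_broad)

# Conservation check: the variance of the integral parameter is unchanged.
var_fine = S_fine @ C_fine @ S_fine
var_broad = S_broad @ C_broad @ S_broad
print(np.isclose(var_fine, var_broad))  # True
```

Note that the collapsed matrix is exact only for the integral parameter whose sensitivities were used as weights, which is the trade-off the conservation principle accepts.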

  6. Structural covariance in the hallucinating brain: a voxel-based morphometry study

    Science.gov (United States)

    Modinos, Gemma; Vercammen, Ans; Mechelli, Andrea; Knegtering, Henderikus; McGuire, Philip K.; Aleman, André

    2009-01-01

    Background Neuroimaging studies have indicated that a number of cortical regions express altered patterns of structural covariance in schizophrenia. The relation between these alterations and specific psychotic symptoms is yet to be investigated. We used voxel-based morphometry to examine regional grey matter volumes and the structural covariance associated with severity of auditory verbal hallucinations. Methods We applied optimized voxel-based morphometry to volumetric magnetic resonance imaging data from 26 patients with medication-resistant auditory verbal hallucinations (AVHs); statistical inferences were made at p < 0.05 after correction for multiple comparisons. Results Grey matter volume in the left inferior frontal gyrus was positively correlated with severity of AVHs. Hallucination severity influenced the pattern of structural covariance between this region and the left superior/middle temporal gyri, the right inferior frontal gyrus and hippocampus, and the insula bilaterally. Limitations The results are based on self-reported severity of auditory hallucinations; complementing this with a clinician-based instrument could have made the findings more compelling. Future studies would benefit from including a measure to control for other symptoms that may covary with AVHs and for the effects of antipsychotic medication. Conclusion The results revealed that overall severity of AVHs modulated cortical intercorrelations between frontotemporal regions involved in language production and verbal monitoring, supporting the critical role of this network in the pathophysiology of hallucinations. PMID:19949723

  7. EEG-Based Emotion Recognition Using Deep Learning Network with Principal Component Based Covariate Shift Adaptation

    Directory of Open Access Journals (Sweden)

    Suwicha Jirayucharoensak

    2014-01-01

    Automatic emotion recognition is one of the most challenging tasks. To detect emotion from nonstationary EEG signals, a sophisticated learning algorithm that can represent high-level abstraction is required. This study proposes the utilization of a deep learning network (DLN) to discover unknown feature correlations between input signals that are crucial for the learning task. The DLN is implemented with a stacked autoencoder (SAE) using a hierarchical feature learning approach. The input features of the network are power spectral densities of 32-channel EEG signals from 32 subjects. To alleviate the overfitting problem, principal component analysis (PCA) is applied to extract the most important components of the initial input features. Furthermore, covariate shift adaptation of the principal components is implemented to minimize the nonstationary effect of the EEG signals. Experimental results show that the DLN is capable of classifying three different levels of valence and arousal with accuracies of 49.52% and 46.03%, respectively. Principal-component-based covariate shift adaptation enhances the respective classification accuracies by 5.55% and 6.53%. Moreover, the DLN provides better performance than SVM and naive Bayes classifiers.
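
    The PCA-plus-covariate-shift-adaptation stage can be sketched as follows. The feature values are synthetic stand-ins for the PSD features, and the adaptation shown is a simple exponential-moving-average re-centring of the principal components, one common covariate shift adaptation scheme and not necessarily the exact one used in the study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-ins for the PSD features: the test distribution is mean-shifted
# relative to training, mimicking EEG nonstationarity.
train = rng.normal(0.0, 1.0, size=(200, 10))
test = rng.normal(0.8, 1.0, size=(200, 10))

# PCA fitted on the training features (centre, eigendecompose the covariance).
mu = train.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(train, rowvar=False))
V = eigvec[:, ::-1][:, :3]          # three leading principal components

proj = (test - mu) @ V              # raw projections drift because of the shift

# Covariate shift adaptation (sketch): track the component means with an
# exponential moving average and re-centre each incoming sample.
alpha, ema = 0.05, np.zeros(3)
adapted = np.empty_like(proj)
for t, z in enumerate(proj):
    ema = (1 - alpha) * ema + alpha * z
    adapted[t] = z - ema

# After the EMA converges, the adapted components are approximately zero-mean.
print(np.linalg.norm(adapted[100:].mean(axis=0)))
```

The re-centred components can then be fed to any downstream classifier (a stacked autoencoder in the study) without the drift dominating the decision boundary.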

  8. Information provision in medical libraries: An evidence based ...

    African Journals Online (AJOL)

    The paper examined information provision in special libraries such as medical libraries. It provides an overview of evidence based practice as a concept for information provision by librarians. It specifically proffers meaning to the term evidence as used in evidence based practice and to evidence based medicine from where ...

  9. Towards Knowledge-Based Digital Libraries

    NARCIS (Netherlands)

    Feng, L.; Jeusfeld, M.A.; Hoppenbrouwers, J.

    From the standpoint of satisfying humans' information needs, current digital library (DL) systems suffer from two shortcomings: (i) inadequate high-level cognition support; (ii) inadequate knowledge-sharing facilities. In this article, we introduce a two-layered digital library…

  10. Covariance-Based Measurement Selection Criterion for Gaussian-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Fernando A. Auat Cheein

    2013-01-01

    Process modeling by means of Gaussian-based algorithms often suffers from redundant information, which usually increases the computational complexity of the estimation without significantly improving its performance. In this article, a non-arbitrary measurement selection criterion for Gaussian-based algorithms is proposed. The criterion is based on determining the most significant measurement from both an estimation-convergence perspective and the covariance matrix associated with the measurement. The selection criterion is independent of the nature of the measured variable. The criterion is used in conjunction with three Gaussian-based algorithms: the EIF (extended information filter), the EKF (extended Kalman filter) and the UKF (unscented Kalman filter), although it can also be applied to other Gaussian-based algorithms. Although this work is focused on environment modeling, the results shown herein can be applied to other implementations. Mathematical descriptions and implementation results that validate the proposal are also included in this work.
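
    A minimal sketch of a covariance-based selection rule in the Kalman-filter family: among candidate measurements, choose the one whose update yields the smallest posterior covariance trace. The function name and all numbers are illustrative, not taken from the article:

```python
import numpy as np

# Sketch: among candidate measurements, pick the one whose Kalman update most
# reduces the state covariance (smallest posterior trace).
def best_measurement(P, candidates):
    """P: prior state covariance; candidates: list of (H, R) measurement models."""
    best_idx, best_trace, best_P = -1, np.inf, None
    for k, (H, R) in enumerate(candidates):
        S = H @ P @ H.T + R                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        P_post = (np.eye(P.shape[0]) - K @ H) @ P
        if np.trace(P_post) < best_trace:
            best_idx, best_trace, best_P = k, np.trace(P_post), P_post
    return best_idx, best_P

P = np.diag([4.0, 1.0])                        # prior covariance of a 2-D state
cands = [
    (np.array([[1.0, 0.0]]), np.array([[0.5]])),   # accurate sensor on state 0
    (np.array([[1.0, 0.0]]), np.array([[5.0]])),   # noisy sensor on state 0
]
idx, P_post = best_measurement(P, cands)
print(idx)  # 0: the accurate sensor yields the smaller posterior covariance
```

Because the posterior covariance does not depend on the actual measurement value, this selection can be made before processing the data, which is what makes covariance-based criteria cheap to apply inside EIF/EKF/UKF loops.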

  11. Robust entry guidance using linear covariance-based model predictive control

    Directory of Open Access Journals (Sweden)

    Jianjun Luo

    2017-02-01

    For atmospheric entry vehicles, guidance design can be accomplished by solving an optimal control problem using optimal control theory. However, traditional design methods generally focus on nominal performance and do not consider robustness in the design process. This paper proposes a linear covariance-based model predictive control method for robust entry guidance design. First, linear covariance analysis is employed to incorporate robustness directly into the guidance design. The closed-loop covariance with the feedback-updated control command is formulated to provide the expected errors of the nominal state variables in the presence of uncertainties. Then, the closed-loop covariance is used as a component of the cost function, reducing the guidance law's sensitivity to uncertainties. After that, model predictive control is used to solve the optimal problem, and the control commands (bank angles) are calculated. Finally, a series of simulations for different missions has been completed to demonstrate high precision and robustness with respect to initial perturbations as well as uncertainties in the entry process. The 3σ confidence-region results in the presence of uncertainties show that the robustness of the guidance has been improved, and the errors of the state variables are decreased by approximately 35%.

  12. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Lei Qin

    2014-05-01

    We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to a specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated over a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up to date and is prevented from contamination even in case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences.

  13. Structural covariance and cortical reorganisation in schizophrenia: a MRI-based morphometric study.

    Science.gov (United States)

    Palaniyappan, Lena; Hodgson, Olha; Balain, Vijender; Iwabuchi, Sarina; Gowland, Penny; Liddle, Peter

    2018-05-06

    In patients with schizophrenia, distributed abnormalities are observed in grey matter volume. A recent hypothesis posits that these distributed changes are indicative of a plastic reorganisation process occurring in response to a functional defect in neuronal information transmission. We investigated the structural covariance across various brain regions in early-stage schizophrenia to determine if indeed the observed patterns of volumetric loss conform to a coordinated pattern of structural reorganisation. Structural magnetic resonance imaging scans were obtained from 40 healthy adults and 41 age, gender and parental socioeconomic status matched patients with schizophrenia. Volumes of grey matter tissue were estimated at the regional level across 90 atlas-based parcellations. Group-level structural covariance was studied using a graph theoretical framework. Patients had distributed reduction in grey matter volume, with high degree of localised covariance (clustering) compared with controls. Patients with schizophrenia had reduced centrality of anterior cingulate and insula but increased centrality of the fusiform cortex, compared with controls. Simulating targeted removal of highly central nodes resulted in significant loss of the overall covariance patterns in patients compared with controls. Regional volumetric deficits in schizophrenia are not a result of random, mutually independent processes. Our observations support the occurrence of a spatially interconnected reorganisation with the systematic de-escalation of conventional 'hub' regions. This raises the question of whether the morphological architecture in schizophrenia is primed for compensatory functions, albeit with a high risk of inefficiency.

  14. Parcellation of the human orbitofrontal cortex based on gray matter volume covariance.

    Science.gov (United States)

    Liu, Huaigui; Qin, Wen; Qi, Haotian; Jiang, Tianzi; Yu, Chunshui

    2015-02-01

    The human orbitofrontal cortex (OFC) is an enigmatic brain region that cannot be parcellated reliably using diffusion and functional magnetic resonance imaging (fMRI) because of signal dropout that results from an inherent limitation of these imaging techniques. We hypothesise that the OFC can be reliably parcellated into subregions based on gray matter volume (GMV) covariance patterns derived from artefact-free structural images. A total of 321 healthy young subjects were examined by high-resolution structural MRI. The OFC was parcellated into subregions based on GMV covariance patterns; sex and laterality differences in the GMV covariance pattern of each OFC subregion were then compared. The human OFC was parcellated into anterior (OFCa), medial (OFCm), posterior (OFCp), intermediate (OFCi), and lateral (OFCl) subregions. This parcellation scheme was validated by the same analyses of the left OFC and the bilateral OFCs in male and female subjects. Both visual observation and quantitative comparisons indicated a unique GMV covariance pattern for each OFC subregion. These OFC subregions mainly covaried with the prefrontal and temporal cortices, cingulate cortex and amygdala. In addition, the GMV correlations of most OFC subregions were similar across sex and laterality, except for a significant laterality difference in the OFCl: the right OFCl had a stronger GMV correlation with the right inferior frontal cortex. Using high-resolution structural images, we established a reliable parcellation scheme for the human OFC, which may provide an in vivo guide for subregion-level studies of this region and improve our understanding of the human OFC at the subregional level. © 2014 Wiley Periodicals, Inc.

  15. AUTOMATION BASED LIBRARY MANAGEMENT IN DEPOK PUBLIC LIBRARY IN THE CONTEXT OF RITUAL PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Rafiqa Maulidia

    2017-06-01

    Library management using manual systems is no longer adequate to handle the workload of library routines; librarians must use library automation applications. To deliver good working performance, librarians employ strategies, competences and certain habits, referred to here as ritual performance. Ritual performance is the spontaneous demonstration of competence by individuals in dealing with individuals, groups and organizations, and it comprises elements of personal ritual, work ritual, social ritual, and organizational ritual. This research focuses on automation-based library management in the context of ritual performance. The study used a qualitative approach with the case study method. The findings suggest that personal rituals reflect the personal habits librarians bring to their tasks, work rituals show responsibility towards their duties, social rituals strengthen the emotional connection between librarians and leaders, and organizational rituals indicate the involvement of librarians in contributing to decision making. The study concludes that the ritual performance of librarians at Depok Public Library gives them the skills to implement automation systems in library management, and reflects the values of responsibility, mutual trust, and mutual respect.   Key words : Library Management, Library Automation, Ritual Performance, Ritual Performance Value

  16. Application of Data Mining in Library-Based Personalized Learning

    Directory of Open Access Journals (Sweden)

    Lin Luo

    2017-12-01

    Full Text Available This paper applies the DBSCAN algorithm to library data mining in order to help teachers and students find the books they expect in the sea of the library's holdings. First, a model for applying the DBSCAN algorithm to library data mining is proposed, followed by an improvement of the DBSCAN algorithm to meet these demands. Finally, an experiment is presented to validate the algorithm. The results show that the book price and the inventory level in the library have less impact on the resulting clusters than the classification of books and the frequency of book borrowings. Library procurers should therefore base purchases and subscriptions on the results of the cluster analysis, so as to improve the hierarchy and structural distribution of library resources, making them more scientific and reasonable while also arousing readers' interest in borrowing.
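
    As a rough sketch of the core idea, the following is a minimal, textbook DBSCAN implementation in pure Python. The paper's demand-driven improvements to the algorithm are not described in enough detail to reproduce, and the two-dimensional points standing in for book features (e.g., classification code and borrowing frequency) are hypothetical.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns a list of cluster labels (-1 = noise)."""
    UNVISITED, NOISE = None, -1
    labels = [UNVISITED] * len(points)

    def neighbors(i):
        # Indices of all points within eps of point i (including i itself).
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not UNVISITED:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = NOISE          # may later become a border point
            continue
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == NOISE:
                labels[j] = cluster    # border point joins the cluster
            if labels[j] is not UNVISITED:
                continue
            labels[j] = cluster
            js = neighbors(j)
            if len(js) >= min_pts:     # core point: keep expanding
                queue.extend(js)
        cluster += 1
    return labels
```

    Points in dense neighborhoods (at least `min_pts` points within `eps`) form clusters; isolated points are labeled `-1` (noise), which in the library setting would correspond to rarely borrowed, hard-to-categorize titles.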

  17. A Standardized Generalized Dimensionality Discrepancy Measure and a Standardized Model-Based Covariance for Dimensionality Assessment for Multidimensional Models

    Science.gov (United States)

    Levy, Roy; Xu, Yuning; Yel, Nedim; Svetina, Dubravka

    2015-01-01

    The standardized generalized dimensionality discrepancy measure and the standardized model-based covariance are introduced as tools to critique dimensionality assumptions in multidimensional item response models. These tools are grounded in a covariance theory perspective and associated connections between dimensionality and local independence.…

  18. Analysis of stock investment selection based on CAPM using covariance and genetic algorithm approach

    Science.gov (United States)

    Sukono; Susanti, D.; Najmia, M.; Lesmana, E.; Napitupulu, H.; Supian, S.; Putra, A. S.

    2018-03-01

    Investment is one of the economic growth factors of countries, especially Indonesia. Stock is a liquid form of investment. In making stock investment decisions, investors need to choose stocks that can generate maximum returns at a minimum risk level. Therefore, we need to know how to allocate capital so that it gives the optimal benefit. This study discusses stock investment based on the CAPM, estimated using covariance and a genetic algorithm approach. It is assumed that the stocks analyzed follow the CAPM. The beta parameter of the CAPM equation is estimated in two ways: first by a covariance approach, and second by genetic algorithm optimization. As a numerical illustration, this paper analyzes ten stocks traded on the Indonesian capital market. The results show that estimating the beta parameters with the covariance and genetic algorithm approaches gives the same decision: six underpriced stocks with a buying decision, and four overpriced stocks with a selling decision. Based on the analysis, it can be concluded that the results can be used as a consideration for investors to buy the six underpriced stocks and sell the four overpriced stocks.
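
    The covariance estimate of beta described above reduces to a few lines. This sketch shows the covariance approach only (the genetic-algorithm variant is omitted), and the return series are made-up numbers, not the ten Indonesian stocks from the study.

```python
from statistics import mean

def beta_covariance(stock_returns, market_returns):
    """Estimate the CAPM beta as Cov(r_i, r_m) / Var(r_m)."""
    mi, mm = mean(stock_returns), mean(market_returns)
    cov = sum((s - mi) * (m - mm)
              for s, m in zip(stock_returns, market_returns))
    var = sum((m - mm) ** 2 for m in market_returns)
    return cov / var

def capm_expected_return(rf, beta, market_mean):
    """CAPM line: E[r_i] = r_f + beta * (E[r_m] - r_f)."""
    return rf + beta * (market_mean - rf)
```

    A stock is then judged underpriced (buy) when its average realized return exceeds the CAPM expected return, and overpriced (sell) otherwise.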

  19. Resampling-based methods in single and multiple testing for equality of covariance/correlation matrices.

    Science.gov (United States)

    Yang, Yang; DeGruttola, Victor

    2012-06-22

    Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients.
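
    For intuition, here is a much simpler cousin of the methods discussed: a label-permutation test of covariance homogeneity between two groups of bivariate data, using the Frobenius norm of the difference of sample covariance matrices as the statistic. This is a sketch only; the paper's approach resamples standardized residuals with robust moment estimates, which is not reproduced here.

```python
import random

def cov2(xs):
    """2x2 sample covariance matrix of a list of (x, y) pairs."""
    n = len(xs)
    mx = sum(p[0] for p in xs) / n
    my = sum(p[1] for p in xs) / n
    sxx = sum((p[0] - mx) ** 2 for p in xs) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in xs) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in xs) / (n - 1)
    return ((sxx, sxy), (sxy, syy))

def cov_diff_stat(a, b):
    """Frobenius norm of the difference of the two group covariances."""
    ca, cb = cov2(a), cov2(b)
    return sum((ca[i][j] - cb[i][j]) ** 2
               for i in range(2) for j in range(2)) ** 0.5

def permutation_test(a, b, n_perm=999, seed=0):
    """P-value for H0: equal covariance matrices, by label permutation."""
    rng = random.Random(seed)
    observed = cov_diff_stat(a, b)
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if cov_diff_stat(pooled[:len(a)], pooled[len(a):]) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

    Because group labels are exchangeable under the null hypothesis, the permuted statistics approximate the null distribution; the `(hits + 1)/(n_perm + 1)` form keeps the p-value valid.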

  20. Evaluation of computer-based library services at Kenneth Dike ...

    African Journals Online (AJOL)

    This study evaluated computer-based library services/routines at Kenneth Dike Library, University of Ibadan. Four research questions were developed and answered. A survey research design was adopted; using questionnaire as the instrument for data collection. A total of 200 respondents randomly selected from 10 ...

  1. Mixture-based combinatorial libraries from small individual peptide libraries: a case study on α1-antitrypsin deficiency.

    Science.gov (United States)

    Chang, Yi-Pin; Chu, Yen-Ho

    2014-05-16

    The design, synthesis and screening of diversity-oriented peptide libraries using a "libraries from libraries" strategy for the development of inhibitors of α1-antitrypsin deficiency are described. The main buttress of the biochemical approach presented here is the use of the well-established solid-phase split-and-mix method for the generation of mixture-based libraries. The combinatorial technique of iterative deconvolution was employed for library screening. While molecular diversity is the general consideration in combinatorial libraries, exquisite design through systematic screening of small individual libraries is a prerequisite for effective library screening and can avoid potential problems in some cases. This review will also illustrate how large peptide libraries were designed, as well as how a conformation-sensitive assay was developed based on the mechanism of the conformational disease. Finally, the combinatorially selected peptide inhibitor capable of blocking abnormal protein aggregation will be characterized by biophysical, cellular and computational methods.

  2. Data depth and rank-based tests for covariance and spectral density matrices

    KAUST Repository

    Chau, Joris

    2017-06-26

    In multivariate time series analysis, objects of primary interest to study cross-dependences in the time series are the autocovariance or spectral density matrices. Non-degenerate covariance and spectral density matrices are necessarily Hermitian and positive definite, and our primary goal is to develop new methods to analyze samples of such matrices. The main contribution of this paper is the generalization of the concept of statistical data depth for collections of covariance or spectral density matrices by exploiting the geometric properties of the space of Hermitian positive definite matrices as a Riemannian manifold. This allows one to naturally characterize most central or outlying matrices, but also provides a practical framework for rank-based hypothesis testing in the context of samples of covariance or spectral density matrices. First, the desired properties of a data depth function acting on the space of Hermitian positive definite matrices are presented. Second, we propose two computationally efficient pointwise and integrated data depth functions that satisfy each of these requirements. Several applications of the developed methodology are illustrated by the analysis of collections of spectral matrices in multivariate brain signal time series datasets.

  3. Data depth and rank-based tests for covariance and spectral density matrices

    KAUST Repository

    Chau, Joris; Ombao, Hernando; Sachs, Rainer von

    2017-01-01

    In multivariate time series analysis, objects of primary interest to study cross-dependences in the time series are the autocovariance or spectral density matrices. Non-degenerate covariance and spectral density matrices are necessarily Hermitian and positive definite, and our primary goal is to develop new methods to analyze samples of such matrices. The main contribution of this paper is the generalization of the concept of statistical data depth for collections of covariance or spectral density matrices by exploiting the geometric properties of the space of Hermitian positive definite matrices as a Riemannian manifold. This allows one to naturally characterize most central or outlying matrices, but also provides a practical framework for rank-based hypothesis testing in the context of samples of covariance or spectral density matrices. First, the desired properties of a data depth function acting on the space of Hermitian positive definite matrices are presented. Second, we propose two computationally efficient pointwise and integrated data depth functions that satisfy each of these requirements. Several applications of the developed methodology are illustrated by the analysis of collections of spectral matrices in multivariate brain signal time series datasets.

  4. Spatial Pyramid Covariance based Compact Video Code for Robust Face Retrieval in TV-series.

    Science.gov (United States)

    Li, Yan; Wang, Ruiping; Cui, Zhen; Shan, Shiguang; Chen, Xilin

    2016-10-10

    We address the problem of face video retrieval in TV-series, which searches video clips based on the presence of a specific character, given one of his/her face tracks. This is tremendously challenging because, on one hand, faces in TV-series are captured in largely uncontrolled conditions with complex appearance variations, while on the other hand the retrieval task typically needs an efficient representation with low time and space complexity. To handle this problem, we propose a compact and discriminative representation for the huge body of video data, named Compact Video Code (CVC). Our method first models the face track by its sample (i.e., frame) covariance matrix to capture the video data variations in a statistical manner. To incorporate discriminative information and obtain a more compact video signature suitable for retrieval, the high-dimensional covariance representation is further encoded as a much lower-dimensional binary vector, which finally yields the proposed CVC. Specifically, each bit of the code, i.e., each dimension of the binary vector, is produced via supervised learning in a max margin framework, which aims to strike a balance between the discriminability and stability of the code. Besides, we further extend the descriptive granularity of the covariance matrix from the traditional pixel level to the more general patch level, and proceed to propose a novel hierarchical video representation named Spatial Pyramid Covariance (SPC) along with a fast calculation method. Face retrieval experiments on two challenging TV-series video databases, i.e., the Big Bang Theory and Prison Break, demonstrate the competitiveness of the proposed CVC over state-of-the-art retrieval methods. In addition, as a general video matching algorithm, CVC is also evaluated in the traditional video face recognition task on a standard Internet database, i.e., YouTube Celebrities, showing quite promising performance by using an extremely compact code with only 128 bits.

  5. [Progress in the spectral library based protein identification strategy].

    Science.gov (United States)

    Yu, Derui; Ma, Jie; Xie, Zengyan; Bai, Mingze; Zhu, Yunping; Shu, Kunxian

    2018-04-25

    Mass spectrometry (MS) data have grown exponentially as mass spectrometry-based proteomics has developed rapidly. It is a great challenge to develop quick, accurate and repeatable methods to identify peptides and proteins. Nowadays, spectral library searching has become a mature strategy for protein identification from tandem mass spectra in proteomics: it searches the experimental spectra against a collection of confidently identified MS/MS spectra that have been observed previously, and fully utilizes the peak abundances in the spectrum, peaks from non-canonical fragment ions, and other features. This review provides a comprehensive overview of the implementation of the spectral library search strategy and its two key steps, spectral library construction and spectral library searching, and discusses the progress and challenges of the library search strategy.

  6. Evidence Based Management as a Tool for Special Libraries

    Directory of Open Access Journals (Sweden)

    Bill Fisher

    2007-12-01

    Full Text Available Objective ‐ To examine the evidence based management literature, as an example of evidence based practice, and determine how applicable evidence based management might be in the special library environment. Methods ‐ Recent general management literature and the subject‐focused literature of evidence based management were reviewed; likewise recent library/information science management literature and the subject‐focused literature of evidence based librarianship were reviewed to identify relevant examples of the introduction and use of evidence based practice in organizations. Searches were conducted in major business/management databases, major library/information science databases, and relevant Web sites, blogs and wikis. Citation searches on key articles and follow‐up searches on cited references were also conducted. Analysis of the retrieved literature was conducted to find similarities and/or differences between the management literature and the library/information science literature, especially as it related to special libraries. Results ‐ The barriers to introducing evidence based management into most organizations were found to apply to many special libraries and are similar to issues involved with evidence based practice in librarianship in general. Despite these barriers, a set of resources to assist special librarians in accessing research‐based information to help them use principles of evidence based management is identified. Conclusion ‐ While most special librarians are faced with a number of barriers to using evidence based management, resources do exist to help overcome these obstacles.

  7. Automation Based Library Management in Depok Public Library In The Context of Ritual Performance

    Directory of Open Access Journals (Sweden)

    Rafiqa Maulidia

    2018-01-01

    Full Text Available Library management using a manual system is no longer adequate to handle the workload of library routines; librarians must use library automation applications. To deliver good working performance, librarians use strategies, competences and certain habits, which are referred to as a ritual performance. The ritual performance is the spontaneous demonstration of competence by individuals in dealing with individuals, groups and organizations, and it contains elements of the personal ritual, the work ritual, the social ritual, and the organizational ritual. The research focuses on automation based library management in the context of ritual performance. This study used a qualitative approach with the case study method. The findings suggest that the personal ritual shows the personal habits of librarians in doing their tasks, the work ritual shows librarians' responsibility towards their duties, social rituals strengthen the emotional connection between librarians and leaders, and organizational rituals show the involvement of librarians in contributing to decision making. The conclusions of this study show that the librarians' ritual performance at Depok Public Library gives librarians the skills to implement automation systems in library management, and reflects the values of responsibility, mutual trust, and mutual respect.

  8. Cloud-Based DDoS HTTP Attack Detection Using Covariance Matrix Approach

    Directory of Open Access Journals (Sweden)

    Abdulaziz Aborujilah

    2017-01-01

    Full Text Available In this era of technology, cloud computing has become an essential part of the IT services used in daily life. In this regard, website hosting services are gradually moving to the cloud. This adds a new valued feature to cloud-based websites and at the same time introduces new threats to such services. A DDoS attack is one such serious threat. A covariance matrix approach is used in this article to detect such attacks. The results were encouraging, according to the confusion matrix and ROC descriptors.
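
    The article's exact statistic is not given in the abstract, so the following is only a generic covariance-matrix anomaly sketch under assumed features (requests per second and the ratio of distinct client IPs, both hypothetical): estimate a covariance matrix from normal traffic windows, then flag any window whose covariance matrix deviates from it by more than a threshold.

```python
def cov_matrix(samples):
    """Sample covariance matrix of a list of feature-vector rows."""
    n, d = len(samples), len(samples[0])
    means = [sum(row[k] for row in samples) / n for k in range(d)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j])
                 for row in samples) / (n - 1)
             for j in range(d)] for i in range(d)]

def cov_distance(a, b):
    """Frobenius norm of the difference of two covariance matrices."""
    return sum((a[i][j] - b[i][j]) ** 2
               for i in range(len(a)) for j in range(len(a))) ** 0.5

def is_attack(window, normal_cov, threshold):
    """Flag a traffic window whose covariance structure drifts too far."""
    return cov_distance(cov_matrix(window), normal_cov) > threshold
```

    A flooding attack inflates the variance of the request rate and its covariance with the IP-diversity feature, so the distance between the matrices jumps by orders of magnitude.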

  9. Linear regression based on Minimum Covariance Determinant (MCD) and TELBS methods on the productivity of phytoplankton

    Science.gov (United States)

    Gusriani, N.; Firdaniza

    2018-03-01

    The existence of outliers in multiple linear regression analysis causes the Gaussian assumption to be unfulfilled. If the least squares method is nevertheless applied to such data, it will produce a model that cannot represent most of the data. We therefore need a regression method that is robust against outliers. This paper compares the Minimum Covariance Determinant (MCD) method and the TELBS method on secondary data on the productivity of phytoplankton, which contains outliers. Based on the robust coefficient of determination, the MCD method produces a better model than the TELBS method.
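
    To make the MCD idea concrete, here is a toy sketch: among all size-h subsets of the data, keep the one whose 2x2 covariance matrix has the smallest determinant, then fit ordinary least squares on that subset. Real MCD implementations use the FAST-MCD algorithm rather than exhaustive search, the TELBS method is not shown, and the data in the example are invented, not the phytoplankton measurements.

```python
from itertools import combinations
from statistics import mean

def cov_det(points):
    """Determinant of the 2x2 sample covariance of (x, y) points."""
    n = len(points)
    mx = mean(p[0] for p in points)
    my = mean(p[1] for p in points)
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    return sxx * syy - sxy * sxy

def ols(points):
    """Slope and intercept of the least-squares line."""
    mx = mean(p[0] for p in points)
    my = mean(p[1] for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    sxx = sum((p[0] - mx) ** 2 for p in points)
    slope = sxy / sxx
    return slope, my - slope * mx

def mcd_regression(points, h):
    """Fit OLS on the size-h subset with minimal covariance determinant
    (exhaustive search; illustrative only, feasible for tiny n)."""
    best = min(combinations(points, h), key=cov_det)
    return ols(list(best))
```

    On data with two gross outliers the plain least-squares slope is dragged far from the truth, while the minimal-determinant subset recovers the clean line.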

  10. DFT-Based Closed-form Covariance Matrix and Direct Waveforms Design for MIMO Radar to Achieve Desired Beampatterns

    KAUST Repository

    Bouchoucha, Taha; Ahmed, Sajid; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2017-01-01

    optimization problems. The computational complexity of these algorithms is very high, which makes them difficult to use in practice. In this paper, to achieve the desired beampattern, a low complexity discrete-Fourier-transform based closed-form covariance

  11. Directional variance adjustment: bias reduction in covariance matrices based on factor analysis with an application to portfolio optimization.

    Directory of Open Access Journals (Sweden)

    Daniel Bartz

    Full Text Available Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock market we show that our proposed method leads to improved portfolio allocation.

  12. Directional variance adjustment: bias reduction in covariance matrices based on factor analysis with an application to portfolio optimization.

    Science.gov (United States)

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock market we show that our proposed method leads to improved portfolio allocation.

  13. Directional Variance Adjustment: Bias Reduction in Covariance Matrices Based on Factor Analysis with an Application to Portfolio Optimization

    Science.gov (United States)

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W.; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock market we show that our proposed method leads to improved portfolio allocation. PMID:23844016

  14. Fee-based services in sci-tech libraries

    CERN Document Server

    Mount, Ellis

    2013-01-01

    This timely and important book explores how fee-based services have developed in various types of sci-tech libraries. The authoritative contributors focus on the current changing financial aspects of the sci-tech library operation and clarify for the reader how these changes have brought about conditions in which traditional methods of funding are no longer adequate. What new options are open and how they are best being applied in today's sci-tech libraries is fully and clearly explained and illustrated. Topics explored include cost allocation and cost recovery, fees for computer searching, an

  15. Estimation of genetic connectedness diagnostics based on prediction errors without the prediction error variance-covariance matrix.

    Science.gov (United States)

    Holmes, John B; Dodds, Ken G; Lee, Michael A

    2017-03-02

    An important issue in genetic evaluation is the comparability of random effects (breeding values), particularly between pairs of animals in different contemporary groups. This is usually referred to as genetic connectedness. While various measures of connectedness have been proposed in the literature, there is general agreement that the most appropriate measure is some function of the prediction error variance-covariance matrix. However, obtaining the prediction error variance-covariance matrix is computationally demanding for large-scale genetic evaluations. Many alternative statistics have been proposed that avoid the computational cost of obtaining the prediction error variance-covariance matrix, such as counts of genetic links between contemporary groups, gene flow matrices, and functions of the variance-covariance matrix of estimated contemporary group fixed effects. In this paper, we show that a correction to the variance-covariance matrix of estimated contemporary group fixed effects will produce the exact prediction error variance-covariance matrix averaged by contemporary group for univariate models in the presence of single or multiple fixed effects and one random effect. We demonstrate the correction for a series of models and show that approximations to the prediction error matrix based solely on the variance-covariance matrix of estimated contemporary group fixed effects are inappropriate in certain circumstances. Our method allows for the calculation of a connectedness measure based on the prediction error variance-covariance matrix by calculating only the variance-covariance matrix of estimated fixed effects. Since the number of fixed effects in genetic evaluation is usually orders of magnitude smaller than the number of random effect levels, the computational requirements for our method should be reduced.

  16. Robust Adaptive Beamforming with Sensor Position Errors Using Weighted Subspace Fitting-Based Covariance Matrix Reconstruction.

    Science.gov (United States)

    Chen, Peng; Yang, Yixin; Wang, Yong; Ma, Yuanliang

    2018-05-08

    When sensor position errors exist, the performance of recently proposed interference-plus-noise covariance matrix (INCM)-based adaptive beamformers may be severely degraded. In this paper, we propose a weighted subspace fitting-based INCM reconstruction algorithm to overcome sensor displacement for linear arrays. By estimating the rough signal directions, we construct a novel possible mismatched steering vector (SV) set. We analyze the proximity of the signal subspace from the sample covariance matrix (SCM) and the space spanned by the possible mismatched SV set. After solving an iterative optimization problem, we reconstruct the INCM using the estimated sensor position errors. Then we estimate the SV of the desired signal by solving an optimization problem with the reconstructed INCM. The main advantage of the proposed algorithm is its robustness against SV mismatches dominated by unknown sensor position errors. Numerical examples show that even if the position errors are up to half of the assumed sensor spacing, the output signal-to-interference-plus-noise ratio is only reduced by 4 dB. Beam patterns plotted using experiment data show that the interference suppression capability of the proposed beamformer outperforms other tested beamformers.

  17. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA.

    Science.gov (United States)

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R

    2009-01-01

    With an increasingly large amount of sequences properly aligned, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diverse phylogenetic classifications. We propose a new approach that utilizes coevolutional rates among pairs of nucleotide positions using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger than a previous study and with 50% better sensitivity. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure.
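
    The study's phylogeny-weighted covariation analysis inside a relational database cannot be reproduced in a few lines, but the classic covariation score such methods build on — mutual information between two columns of a sequence alignment — can. The toy alignment columns below are fabricated for illustration.

```python
from math import log2
from collections import Counter

def mutual_information(col_i, col_j):
    """Mutual information (in bits) between two alignment columns,
    each given as a string of bases, one character per sequence."""
    n = len(col_i)
    pi = Counter(col_i)
    pj = Counter(col_j)
    pij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi
```

    A pair of columns that compensate each other (e.g., Watson-Crick partners, where G-C in one sequence becomes A-U in another) scores high, while a column that is conserved regardless of its partner scores zero.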

  18. Covariant Transform

    OpenAIRE

    Kisil, Vladimir V.

    2010-01-01

    The paper develops the theory of the covariant transform, which is inspired by the wavelet construction. It was observed that many interesting types of wavelets (or coherent states) arise from group representations which are not square integrable or vacuum vectors which are not admissible. The covariant transform extends the applicability of the popular wavelet construction to classic examples like the Hardy space H_2, Banach spaces, covariant functional calculus and many others. Keywords: Wavelets, cohe...

  19. Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2014-01-01

    Full Text Available The computational cost of blind source separation based on bio-inspired intelligence optimization is high. In order to solve this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is utilized as the objective function, and the artificial bee colony algorithm is used to optimize it. The source signal component which is separated out is then removed from the mixtures using the deflation method. All the source signals can be recovered successfully by repeating the separation process. Simulation experiments demonstrate that the proposed algorithm achieves a significant improvement in computational cost and in the quality of signal separation when compared to previous algorithms.

  20. Error estimation for ADS nuclear properties by using nuclear data covariances

    International Nuclear Information System (INIS)

    Tsujimoto, Kazufumi

    2005-01-01

    Error estimation for the nuclear properties of an accelerator-driven subcritical system due to nuclear data uncertainties was performed. An uncertainty analysis was done using sensitivity coefficients based on the generalized perturbation theory and the variance matrix data. For the major actinides and structural materials, the covariance data in the JENDL-3.3 library were used. For MA, newly evaluated covariance data were used, since there had been no reliable data in any library. (author)

  1. Identification of toxic cyclopeptides based on mass spectral library matching

    Directory of Open Access Journals (Sweden)

    Boris L. Milman

    2014-08-01

    Full Text Available To gain perspective on the use of tandem mass spectral libraries for identification of toxic cyclic peptides, a new library was built from 263 mass spectra (mainly MS2 spectra) of 59 compounds of that group, such as microcystins, amatoxins, and some related compounds. Mass spectra were extracted from the literature or specially acquired on ESI-Q-ToF and MALDI-ToF/ToF tandem instruments. ESI-MS2 product-ion mass spectra appeared to be rather close to MALDI-ToF/ToF fragment spectra, which are uncommon for mass spectral libraries. Testing of the library was based on searches in which reference spectra were in turn cross-compared. The percentage of 1st rank correct identifications (true positives) was 70% in the general case and 88–91% without including knowingly defective (‘one-dimension’) spectra as test ones. The percentage of 88–91% is the principal estimate for the overall performance of this library, which can be used as a method of choice for identification of individual cyclopeptides and also for group recognition of individual classes of such peptides. The approach to identification of cyclopeptides based on mass spectral library matching proved to be most effective for abundant toxins. That was confirmed by analysis of extracts from two cyanobacterial strains.
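
    Library matching of the kind described reduces, at its core, to comparing a query spectrum against reference spectra with a similarity score. The sketch below uses unit-m/z binning and cosine similarity — a deliberately simplified stand-in for real search engines, with made-up peak lists rather than actual microcystin or amatoxin spectra.

```python
from math import sqrt

def binned(spectrum, width=1.0):
    """Collapse a peak list [(mz, intensity), ...] into m/z bins."""
    bins = {}
    for mz, inten in spectrum:
        key = round(mz / width)
        bins[key] = bins.get(key, 0.0) + inten
    return bins

def cosine_score(spec_a, spec_b, width=1.0):
    """Cosine (normalized dot-product) similarity of two binned spectra."""
    a, b = binned(spec_a, width), binned(spec_b, width)
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def library_search(query, library, threshold=0.7):
    """Rank library entries by cosine score; return hits above threshold."""
    scores = [(name, cosine_score(query, ref)) for name, ref in library.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: -s[1])
```

    The top-ranked hit above the threshold is reported as the identification; in the paper's terms, a "1st rank correct identification" means the true compound ends up first in this list.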

  2. Library

    OpenAIRE

    Dulaney, Ronald E. Jr.

    1997-01-01

    This study began with the desire to design a public town library of the future and became a search for an inkling of what is essential to Architecture. It is murky and full of contradictions. It asks more than it proposes, and the traces of its windings are better ordered through collage than logical synthesis. This study is neither a thesis nor a synthesis. When drawing out the measure of this study it may be beneficial to state what it attempts to place at the ...

  3. Iterative Covariance-Based Removal of Time-Synchronous Artifacts: Application to Gastrointestinal Electrical Recordings.

    Science.gov (United States)

    Erickson, Jonathan C; Putney, Joy; Hilbert, Douglas; Paskaranandavadivel, Niranchan; Cheng, Leo K; O'Grady, Greg; Angeli, Timothy R

    2016-11-01

    The aim of this study was to develop, validate, and apply a fully automated method for reducing large temporally synchronous artifacts present in electrical recordings made from the gastrointestinal (GI) serosa, which are problematic for properly assessing slow wave dynamics. Such artifacts routinely arise in experimental and clinical settings from motion, switching behavior of medical instruments, or electrode array manipulation. A novel iterative Covariance-Based Reduction of Artifacts (COBRA) algorithm sequentially reduced artifact waveforms using an updating across-channel median as a noise template, scaled and subtracted from each channel based on their covariance. Application of COBRA substantially increased the signal-to-artifact ratio (12.8 ± 2.5 dB), while minimally attenuating the energy of the underlying source signal by 7.9% on average ( -11.1 ± 3.9 dB). COBRA was shown to be highly effective for aiding recovery and accurate marking of slow wave events (sensitivity = 0.90 ± 0.04; positive-predictive value = 0.74 ± 0.08) from large segments of in vivo porcine GI electrical mapping data that would otherwise be lost due to a broad range of contaminating artifact waveforms. Strongly reducing artifacts with COBRA ultimately allowed for rapid production of accurate isochronal activation maps detailing the dynamics of slow wave propagation in the porcine intestine. Such mapping studies can help characterize differences between normal and dysrhythmic events, which have been associated with GI abnormalities, such as intestinal ischemia and gastroparesis. The COBRA method may be generally applicable for removing temporally synchronous artifacts in other biosignal processing domains.
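
    A single, non-iterative pass of the idea described — the across-channel median as a noise template, scaled per channel by its covariance with the template and subtracted — can be sketched as follows. The real COBRA algorithm iterates this with further safeguards, and the three-channel toy data in the example are synthetic, not GI recordings.

```python
from statistics import median, mean

def cobra_pass(channels):
    """One COBRA-style pass: subtract the across-channel median template
    from each channel, scaled by that channel's covariance with it."""
    n = len(channels[0])
    # Noise template: across-channel median at each time sample.
    template = [median(ch[t] for ch in channels) for t in range(n)]
    tm = mean(template)
    tvar = sum((x - tm) ** 2 for x in template)
    cleaned = []
    for ch in channels:
        cm = mean(ch)
        # Per-channel scale factor: Cov(channel, template) / Var(template).
        alpha = sum((x - cm) * (y - tm)
                    for x, y in zip(ch, template)) / tvar
        cleaned.append([x - alpha * y for x, y in zip(ch, template)])
    return cleaned
```

    In the toy data below, a 100-unit artifact spike common to all channels is removed, while each channel's own offset survives the subtraction.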

  4. Structure-Based Virtual Screening of Commercially Available Compound Libraries.

    Science.gov (United States)

    Kireev, Dmitri

    2016-01-01

    Virtual screening (VS) is an efficient hit-finding tool. Its distinctive strength is that it allows one to screen compound libraries that are not available in the lab. Moreover, structure-based (SB) VS also enables an understanding of how the hit compounds bind the protein target, thus laying the groundwork for the rational hit-to-lead progression. SBVS requires a very limited experimental effort and is particularly well suited for academic labs and small biotech companies that, unlike pharmaceutical companies, do not have physical access to quality small-molecule libraries. Here, we describe SBVS of commercial compound libraries for Mer kinase inhibitors. The screening protocol relies on the docking algorithm Glide complemented by a post-docking filter based on structural protein-ligand interaction fingerprints (SPLIF).

  5. FoodPro: A Web-Based Tool for Evaluating Covariance and Correlation NMR Spectra Associated with Food Processes

    Directory of Open Access Journals (Sweden)

    Eisuke Chikayama

    2016-10-01

    Foods from agriculture and fishery products are processed using various technologies. Molecular mixture analysis during food processing has the potential to help us understand the molecular mechanisms involved, thus enabling better cooking of the analyzed foods. To date, there has been no web-based tool focusing on accumulating Nuclear Magnetic Resonance (NMR) spectra from various types of food processing. Therefore, we have developed a novel web-based tool, FoodPro, that includes a food NMR spectrum database and computes covariance and correlation spectra to tasting and hardness. As a result, FoodPro has accumulated 236 aqueous (extracted in D2O) and 131 hydrophobic (extracted in CDCl3) experimental bench-top 60-MHz NMR spectra, 1753 tastings scored by volunteers, and 139 hardness measurements recorded by a penetrometer, all placed into a core database. The database content was roughly classified into fish and vegetable groups from the viewpoint of different spectrum patterns. FoodPro can query a user food NMR spectrum, search similar NMR spectra with a specified similarity threshold, and then compute estimated tasting and hardness, covariance, and correlation spectra to tasting and hardness. Querying fish spectra exemplified specific covariance spectra to tasting and hardness, giving positive covariance for tasting at 1.31 ppm for lactate and 3.47 ppm for glucose and a positive covariance for hardness at 3.26 ppm for trimethylamine N-oxide.

  6. FoodPro: A Web-Based Tool for Evaluating Covariance and Correlation NMR Spectra Associated with Food Processes.

    Science.gov (United States)

    Chikayama, Eisuke; Yamashina, Ryo; Komatsu, Keiko; Tsuboi, Yuuri; Sakata, Kenji; Kikuchi, Jun; Sekiyama, Yasuyo

    2016-10-19

    Foods from agriculture and fishery products are processed using various technologies. Molecular mixture analysis during food processing has the potential to help us understand the molecular mechanisms involved, thus enabling better cooking of the analyzed foods. To date, there has been no web-based tool focusing on accumulating Nuclear Magnetic Resonance (NMR) spectra from various types of food processing. Therefore, we have developed a novel web-based tool, FoodPro, that includes a food NMR spectrum database and computes covariance and correlation spectra to tasting and hardness. As a result, FoodPro has accumulated 236 aqueous (extracted in D₂O) and 131 hydrophobic (extracted in CDCl₃) experimental bench-top 60-MHz NMR spectra, 1753 tastings scored by volunteers, and 139 hardness measurements recorded by a penetrometer, all placed into a core database. The database content was roughly classified into fish and vegetable groups from the viewpoint of different spectrum patterns. FoodPro can query a user food NMR spectrum, search similar NMR spectra with a specified similarity threshold, and then compute estimated tasting and hardness, covariance, and correlation spectra to tasting and hardness. Querying fish spectra exemplified specific covariance spectra to tasting and hardness, giving positive covariance for tasting at 1.31 ppm for lactate and 3.47 ppm for glucose and a positive covariance for hardness at 3.26 ppm for trimethylamine N-oxide.
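The covariance and correlation spectra described here can be computed bin-by-bin against a per-sample score; a minimal sketch (the function name and synthetic data are illustrative, not FoodPro's actual code):

```python
import numpy as np

def covariance_spectrum(spectra, scores):
    """Covariance and Pearson correlation of each spectral bin with a
    per-sample score such as tasting or hardness.

    spectra: (n_samples, n_bins) NMR intensities; scores: (n_samples,).
    """
    X = spectra - spectra.mean(axis=0)
    y = scores - scores.mean()
    cov = X.T @ y / (len(y) - 1)
    corr = cov / (spectra.std(axis=0, ddof=1) * scores.std(ddof=1))
    return cov, corr

# Demo: bin 3 rises linearly with the score; the other bins are noise.
rng = np.random.default_rng(1)
scores = rng.normal(5.0, 1.0, 30)
spectra = rng.normal(0.0, 1.0, (30, 10))
spectra[:, 3] = 2.0 * scores
cov, corr = covariance_spectrum(spectra, scores)
```

A bin whose intensity tracks the score, like lactate at 1.31 ppm for tasting in the abstract, shows up as a strong positive peak in both the covariance and correlation spectra.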

  7. Views From the Pacific--Military Base Hospital Libraries in Hawaii and Guam.

    Science.gov (United States)

    Stephenson, Priscilla L; Trafford, Mabel A; Hadley, Alice E

    2016-01-01

    Hospital libraries serving military bases offer a different perspective on library services. Two libraries located on islands in the Pacific Ocean provide services to active duty service men and women, including those deployed to other regions of the world. In addition, these hospital libraries serve service members' families living on the base, and often citizens from the surrounding communities.

  8. NParCov3: A SAS/IML Macro for Nonparametric Randomization-Based Analysis of Covariance

    Directory of Open Access Journals (Sweden)

    Richard C. Zink

    2012-07-01

    Analysis of covariance serves two important purposes in a randomized clinical trial. First, there is a reduction of variance for the treatment effect, which provides more powerful statistical tests and more precise confidence intervals. Second, it provides estimates of the treatment effect which are adjusted for random imbalances of covariates between the treatment groups. The nonparametric analysis of covariance method of Koch, Tangen, Jung, and Amara (1998) defines a very general methodology using weighted least-squares to generate covariate-adjusted treatment effects with minimal assumptions. This methodology is general in its applicability to a variety of outcomes, whether continuous, binary, ordinal, incidence density or time-to-event. Further, its use has been illustrated in many clinical trial settings, such as multi-center, dose-response and non-inferiority trials. NParCov3 is a SAS/IML macro written to conduct the nonparametric randomization-based covariance analyses of Koch et al. (1998). The software can analyze a variety of outcomes and can account for stratification. Data from multiple clinical trials are used for illustration.
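The essence of covariate adjustment, removing the part of the unadjusted difference attributable to chance baseline imbalance, can be illustrated for a single covariate (a simplified one-covariate sketch, not the weighted least-squares machinery of Koch et al. or the NParCov3 macro itself):

```python
import numpy as np

def covariate_adjusted_effect(y_t, x_t, y_c, x_c):
    """Treatment effect adjusted for one baseline covariate (sketch).

    Uses the pooled within-group slope so the treatment effect itself
    does not contaminate the covariate adjustment.
    """
    yt, yc = y_t - y_t.mean(), y_c - y_c.mean()
    xt, xc = x_t - x_t.mean(), x_c - x_c.mean()
    beta = (xt @ yt + xc @ yc) / (xt @ xt + xc @ xc)
    return (y_t.mean() - y_c.mean()) - beta * (x_t.mean() - x_c.mean())

# Demo: outcome = 2 * covariate + true effect of 1 in the treated arm;
# the arms are imbalanced on the covariate by 1 unit.
x_t = np.array([1.0, 2.0, 3.0, 4.0])
y_t = 2.0 * x_t + 1.0
x_c = np.array([0.0, 1.0, 2.0, 3.0])
y_c = 2.0 * x_c
effect = covariate_adjusted_effect(y_t, x_t, y_c, x_c)
```

In this demo the unadjusted difference in means is 3.0, of which 2.0 is attributable to the covariate imbalance; the adjusted estimate recovers the true effect of 1.0.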

  9. A Nakanishi-based model illustrating the covariant extension of the pion GPD overlap representation and its ambiguities

    Science.gov (United States)

    Chouika, N.; Mezrag, C.; Moutarde, H.; Rodríguez-Quintero, J.

    2018-05-01

    A systematic approach for the model building of Generalized Parton Distributions (GPDs), based on their overlap representation within the DGLAP kinematic region and a further covariant extension to the ERBL one, is applied to the valence-quark pion's case, using light-front wave functions inspired by the Nakanishi representation of the pion Bethe-Salpeter amplitudes (BSA). This simple but fruitful pion GPD model illustrates the general model building technique and, in addition, allows for the ambiguities related to the covariant extension, grounded on the Double Distribution (DD) representation, to be constrained by requiring a soft-pion theorem to be properly observed.

  10. A New Approach for Nuclear Data Covariance and Sensitivity Generation

    International Nuclear Information System (INIS)

    Leal, L.C.; Larson, N.M.; Derrien, H.; Kawano, T.; Chadwick, M.B.

    2005-01-01

    Covariance data are required to correctly assess uncertainties in design parameters in nuclear applications. The error estimation of calculated quantities relies on the nuclear data uncertainty information available in the basic nuclear data libraries, such as the U.S. Evaluated Nuclear Data File, ENDF/B. The uncertainty files in the ENDF/B library are obtained from the analysis of experimental data and are stored as variance and covariance data. The computer code SAMMY is used in the analysis of the experimental data in the resolved and unresolved resonance energy regions. The data fitting of cross sections is based on generalized least-squares formalism (Bayes' theory) together with the resonance formalism described by R-matrix theory. Two approaches are used in SAMMY for the generation of resonance-parameter covariance data. In the evaluation process SAMMY generates a set of resonance parameters that fit the data and, in addition, provides the resonance-parameter covariances. For existing resonance-parameter evaluations where no resonance-parameter covariance data are available, the alternative is to use an approach called the 'retroactive' resonance-parameter covariance generation. In the high-energy region the methodology for generating covariance data consists of least-squares fitting and model parameter adjustment. The least-squares fitting method calculates covariances directly from experimental data. The parameter adjustment method employs a nuclear model calculation such as the optical model and the Hauser-Feshbach model, and estimates a covariance for the nuclear model parameters. In this paper we describe the application of the retroactive method and the parameter adjustment method to generate covariance data for the gadolinium isotopes.

  11. Spatial-temporal-covariance-based modeling, analysis, and simulation of aero-optics wavefront aberrations.

    Science.gov (United States)

    Vogel, Curtis R; Tyler, Glenn A; Wittich, Donald J

    2014-07-01

    We introduce a framework for modeling, analysis, and simulation of aero-optics wavefront aberrations that is based on spatial-temporal covariance matrices extracted from wavefront sensor measurements. Within this framework, we present a quasi-homogeneous structure function to analyze nonhomogeneous, mildly anisotropic spatial random processes, and we use this structure function to show that phase aberrations arising in aero-optics are, for an important range of operating parameters, locally Kolmogorov. This strongly suggests that the d^(5/3) power law for adaptive optics (AO) deformable mirror fitting error, where d denotes actuator separation, holds for certain important aero-optics scenarios. This framework also allows us to compute bounds on AO servo lag error and predictive control error. In addition, it provides us with the means to accurately simulate AO systems for the mitigation of aero-effects, and it may provide insight into underlying physical processes associated with turbulent flow. The techniques introduced here are demonstrated using data obtained from the Airborne Aero-Optics Laboratory.

  12. Diurnal variability of CO2 flux at coastal zone of Taiwan based on eddy covariance observation

    Science.gov (United States)

    Chien, Hwa; Zhong, Yao-Zhao; Yang, Kang-Hung; Cheng, Hao-Yuan

    2018-06-01

    In this study, we employed shore-based eddy covariance systems for a continuous measurement of the coastal CO2 flux near the northwestern coast of Taiwan from 2011 to 2015. To ensure the validity of the analysis, the data was selected and filtered with a footprint model and an empirical mode decomposition method. The results indicate that the nearshore air-sea and air-land CO2 fluxes exhibited a significant diurnal variability and a substantial day-night difference. The net air-sea CO2 flux was -1.75 ± 0.98 μmol-C m⁻² s⁻¹, whereas the net air-land CO2 flux was 0.54 ± 7.35 μmol-C m⁻² s⁻¹, which indicated that in northwestern Taiwan, the coastal water acts as a sink of atmospheric CO2 but the coastal land acts as a source. The Random Forest Method was applied to hierarchize the influence of Chl-a, SST, DO, pH and U10 on air-sea CO2 fluxes. The result suggests that the strength of the diurnal air-sea CO2 flux is strongly influenced by the local wind speed.
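At its core, the eddy covariance flux is the covariance of vertical wind fluctuations with scalar concentration fluctuations; a minimal sketch with synthetic data (real processing adds coordinate rotation, detrending, footprint filtering, and density corrections):

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Flux F = mean(w'c'): covariance of vertical wind speed (w) and
    CO2 concentration (c) fluctuations about their respective means."""
    return float(np.mean((w - w.mean()) * (c - c.mean())))

# Demo: concentration anti-correlated with updrafts, i.e. a net sink.
rng = np.random.default_rng(2)
w = rng.normal(0.0, 0.4, 20000)
c = -0.5 * w + rng.normal(0.0, 0.05, 20000)
flux = eddy_covariance_flux(w, c)
```

A negative flux, as reported for the coastal water here, indicates net uptake of CO2 by the surface; a positive flux indicates a source.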

  13. New distributed fusion filtering algorithm based on covariances over sensor networks with random packet dropouts

    Science.gov (United States)

    Caballero-Águila, R.; Hermoso-Carazo, A.; Linares-Pérez, J.

    2017-07-01

    This paper studies the distributed fusion estimation problem from multisensor measured outputs perturbed by correlated noises and uncertainties modelled by random parameter matrices. Each sensor transmits its outputs to a local processor over a packet-erasure channel and, consequently, random losses may occur during transmission. Different white sequences of Bernoulli variables are introduced to model the transmission losses. For the estimation, each lost output is replaced by its estimator based on the information received previously, and only the covariances of the processes involved are used, without requiring the signal evolution model. First, a recursive algorithm for the local least-squares filters is derived by using an innovation approach. Then, the cross-correlation matrices between any two local filters are obtained. Finally, the distributed fusion filter weighted by matrices is obtained from the local filters by applying the least-squares criterion. The performance of the estimators and the influence of both sensor uncertainties and transmission losses on the estimation accuracy are analysed in a numerical example.

  14. Evidence-based Practice in libraries - Principles and discussions

    DEFF Research Database (Denmark)

    Johannsen, Carl Gustav

    2012-01-01

    The article examines problems concerning the introduction and future implementation of evidence-based practice (EBP) in libraries. It includes important conceptual distinctions and definitions, and it reviews the more controversial aspects of EBP, primarily based on experiences from Denmark. The purpose of the article is both to qualify existing scepticism and reservations and, maybe, to clarify misunderstandings and objections through the presentation of arguments and data.

  15. Toeplitz Inverse Covariance-Based Clustering of Multivariate Time Series Data

    Science.gov (United States)

    Hallac, David; Vare, Sagar; Boyd, Stephen; Leskovec, Jure

    2018-01-01

    Subsequence clustering of multivariate time series is a useful tool for discovering repeated patterns in temporal data. Once these patterns have been discovered, seemingly complicated datasets can be interpreted as a temporal sequence of only a small number of states, or clusters. For example, raw sensor data from a fitness-tracking application can be expressed as a timeline of a select few actions (i.e., walking, sitting, running). However, discovering these patterns is challenging because it requires simultaneous segmentation and clustering of the time series. Furthermore, interpreting the resulting clusters is difficult, especially when the data is high-dimensional. Here we propose a new method of model-based clustering, which we call Toeplitz Inverse Covariance-based Clustering (TICC). Each cluster in the TICC method is defined by a correlation network, or Markov random field (MRF), characterizing the interdependencies between different observations in a typical subsequence of that cluster. Based on this graphical representation, TICC simultaneously segments and clusters the time series data. We solve the TICC problem through alternating minimization, using a variation of the expectation maximization (EM) algorithm. We derive closed-form solutions to efficiently solve the two resulting subproblems in a scalable way, through dynamic programming and the alternating direction method of multipliers (ADMM), respectively. We validate our approach by comparing TICC to several state-of-the-art baselines in a series of synthetic experiments, and we then demonstrate on an automobile sensor dataset how TICC can be used to learn interpretable clusters in real-world scenarios. PMID:29770257
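The cluster-assignment half of such inverse-covariance-based clustering can be sketched by scoring each window under each cluster's Gaussian, parameterized by that cluster's inverse covariance (a simplified sketch: no Toeplitz constraint and no temporal-consistency penalty from the full TICC formulation, and the demo data are hypothetical):

```python
import numpy as np

def assign_windows(windows, means, inv_covs):
    """Assign each window to the cluster maximizing the Gaussian
    log-likelihood defined by that cluster's inverse covariance (MRF)."""
    labels = []
    for x in windows:
        scores = []
        for mu, theta in zip(means, inv_covs):
            d = x - mu
            _, logdet = np.linalg.slogdet(theta)
            scores.append(0.5 * logdet - 0.5 * d @ theta @ d)
        labels.append(int(np.argmax(scores)))
    return np.array(labels)

# Demo: two well-separated states with different precision matrices.
rng = np.random.default_rng(3)
means = [np.zeros(3), np.full(3, 4.0)]
inv_covs = [np.eye(3), 2.0 * np.eye(3)]
windows = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
                     rng.normal(4.0, 0.7, (50, 3))])
labels = assign_windows(windows, means, inv_covs)
```

In the full method this assignment step alternates with re-estimating each cluster's sparse Toeplitz inverse covariance, in an EM-like loop.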

  16. School Library Media Specialists Inform Technology Preparation of Library Science Students: An Evidence-Based Discussion

    Science.gov (United States)

    Snyder, Donna L.; Miller, Andrea L.

    2009-01-01

    What is the relative importance of current and emerging technologies in school library media programs? In order to answer this question, in Fall 2007 the authors administered a survey to 1,053 school library media specialists (SLMSs) throughout the state of Pennsylvania. As a part of the MSLS degree with Library Science K-12 certification, Clarion…

  17. Speech-Based Information Retrieval for Digital Libraries

    National Research Council Canada - National Science Library

    Oard, Douglas W

    1997-01-01

    Libraries and archives collect recorded speech and multimedia objects that contain recorded speech, and such material may comprise a substantial portion of the collection in future digital libraries...

  18. A network-based covariance test for detecting multivariate eQTL in Saccharomyces cerevisiae.

    Science.gov (United States)

    Yuan, Huili; Li, Zhenye; Tang, Nelson L S; Deng, Minghua

    2016-01-11

    Expression quantitative trait locus (eQTL) analysis has been widely used to understand how genetic variations affect gene expression in biological systems. Traditionally, eQTL are investigated in a pair-wise manner in which one SNP affects the expression of one gene; in this way, some associated markers found in GWAS have been related to disease mechanisms by eQTL studies. In real life, however, a biological process is usually carried out by a group of genes. Although some methods have been proposed to identify a group of SNPs that affect the mean of gene expression in a network, changes in the co-expression pattern have not been considered. We therefore propose a procedure and algorithm to identify a marker that affects the co-expression pattern of a pathway. Because two genes may have different correlations under different isoforms, which is hard to detect with a linear test, we also consider a nonlinear test. When we applied our method to a yeast eQTL dataset profiled under both glucose and ethanol conditions, we identified a total of 166 modules, each consisting of a group of genes and one eQTL, where the eQTL regulates the co-expression pattern of that group of genes. We found that many of these modules have biological significance. In summary, we propose a network-based covariance test to identify SNPs that affect the structure of a pathway, complemented by a nonlinear test for correlation changes that a linear test cannot detect.
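Testing for a change in co-expression pattern between genotype groups can be illustrated with a permutation test on the difference of group covariance matrices (an illustrative sketch, not the authors' statistic; the Frobenius-norm statistic and the demo data are assumptions):

```python
import numpy as np

def covariance_difference_test(X0, X1, n_perm=200, seed=0):
    """Permutation p-value for H0: both genotype groups share one
    covariance (co-expression) structure. Statistic: Frobenius norm of
    the difference of the two group covariance matrices."""
    rng = np.random.default_rng(seed)

    def stat(a, b):
        return np.linalg.norm(np.cov(a.T) - np.cov(b.T))

    obs = stat(X0, X1)
    pooled = np.vstack([X0, X1])
    n0 = len(X0)
    exceed = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        if stat(pooled[idx[:n0]], pooled[idx[n0:]]) >= obs:
            exceed += 1
    return obs, (exceed + 1) / (n_perm + 1)

# Demo: group 0 has independent genes; group 1 shares a common factor,
# so its genes are strongly co-expressed.
rng = np.random.default_rng(5)
X0 = rng.normal(0.0, 1.0, (100, 4))
X1 = rng.normal(0.0, 1.0, (100, 1)) + 0.2 * rng.normal(0.0, 1.0, (100, 4))
obs, pval = covariance_difference_test(X0, X1)
```

Here the two groups have identical means, so a mean-based eQTL test would find nothing, while the covariance statistic detects the changed co-expression structure.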

  19. Representation of physiological drought at ecosystem level based on model and eddy covariance measurements

    Science.gov (United States)

    Zhang, Y.; Novick, K. A.; Song, C.; Zhang, Q.; Hwang, T.

    2017-12-01

    Drought and heat waves are expected to increase both in frequency and amplitude, representing a major disturbance to global carbon and water cycles under future climate change. However, how these climate anomalies translate into physiological drought, or ecosystem moisture stress, is still not clear, especially under the co-limitation of soil moisture supply and atmospheric demand for water. In this study, we characterized the ecosystem-level moisture stress in a deciduous forest in the southeastern United States using the Coupled Carbon and Water (CCW) model and in-situ eddy covariance measurements. Physiologically, vapor pressure deficit (VPD), as an atmospheric water demand indicator, largely controls the openness of leaf stomata and regulates atmospheric carbon and water exchanges during periods of hydrological stress. Here, we tested three forms of VPD-related moisture scalars, i.e. exponent (K2), hyperbola (K3), and logarithm (K4), to quantify the sensitivity of light-use efficiency to VPD along different soil moisture conditions. The sensitivity indicators of K values were calibrated based on the framework of CCW using Monte Carlo simulations on the hourly scale, in which VPD and soil water content (SWC) are largely decoupled and the full carbon and water exchange information is retained. We found that the three K values show similar performance in the predictions of ecosystem-level photosynthesis and transpiration after calibration. However, all K values show consistent gradient changes along SWC, indicating that this deciduous forest is less responsive to VPD as soil moisture decreases, a phenomenon of isohydricity in which plants tend to close stomata to keep the leaf water potential constant and reduce the risk of hydraulic failure. Our study suggests that accounting for such isohydric information, or the spectrum of moisture stress along different soil moisture conditions, in models can significantly improve our ability to predict ecosystem responses to future

  20. Shrinkage covariance matrix approach based on robust trimmed mean in gene sets detection

    Science.gov (United States)

    Karjanto, Suryaefiza; Ramli, Norazan Mohamed; Ghani, Nor Azura Md; Aripin, Rasimah; Yusop, Noorezatty Mohd

    2015-02-01

    Microarray technology involves placing an orderly arrangement of thousands of gene sequences in a grid on a suitable surface. The technology has enabled novel discoveries since its development and has attracted increasing attention among researchers. The widespread adoption of microarray technology is largely due to its ability to analyze thousands of genes simultaneously, in a massively parallel manner, in one experiment. Hence, it provides valuable knowledge on gene interaction and function. A microarray data set typically consists of tens of thousands of genes (variables) from just dozens of samples, due to various constraints. Therefore, the sample covariance matrix in Hotelling's T2 statistic is not positive definite and becomes singular, so it cannot be inverted. In this research, the Hotelling's T2 statistic is combined with a shrinkage approach as an alternative estimator of the covariance matrix to detect significant gene sets. The shrinkage covariance matrix overcomes the singularity problem by converting an unbiased estimator into an improved, biased estimator of the covariance matrix. A robust trimmed mean is integrated into the shrinkage matrix to reduce the influence of outliers and consequently increase its efficiency. The performance of the proposed method is measured using several simulation designs. The results are expected to outperform existing techniques in many tested conditions.
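The shrinkage idea, pulling the singular sample covariance toward a well-conditioned diagonal target while centering on a robust trimmed mean, can be sketched as follows (a sketch with a fixed shrinkage intensity; the paper's specific estimator and tuning are not reproduced):

```python
import numpy as np
from scipy import stats

def shrinkage_covariance(X, shrink=0.5, trim=0.2):
    """Shrinkage covariance estimate centered on a trimmed mean.

    X: (n_samples, n_genes) with n_genes >> n_samples, so the plain
    sample covariance is singular. Shrinking toward the diagonal target
    yields a positive-definite, invertible estimate.
    """
    center = stats.trim_mean(X, trim, axis=0)   # robust location estimate
    Xc = X - center
    S = Xc.T @ Xc / (X.shape[0] - 1)
    target = np.diag(np.diag(S))
    return (1.0 - shrink) * S + shrink * target

# Demo: 10 samples, 30 "genes" -- the classic p >> n setting.
rng = np.random.default_rng(4)
X = rng.normal(0.0, 1.0, (10, 30))
sigma = shrinkage_covariance(X)
```

The convex combination of the positive-semidefinite sample covariance with a positive-definite diagonal target is positive definite, so it can be inverted inside a Hotelling-type statistic even when p greatly exceeds n.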

  1. DFT-Based Closed-form Covariance Matrix and Direct Waveforms Design for MIMO Radar to Achieve Desired Beampatterns

    KAUST Repository

    Bouchoucha, Taha

    2017-01-23

    In multiple-input multiple-output (MIMO) radar, appropriately correlated waveforms are designed to achieve desired transmit beampatterns. To design such waveforms, conventional MIMO radar methods use two steps. In the first step, the waveforms covariance matrix, R, is synthesized to achieve the desired beampattern. In the second step, actual waveforms are designed to realize the synthesized covariance matrix. Most of the existing methods use iterative algorithms to solve these constrained optimization problems. The computational complexity of these algorithms is very high, which makes them difficult to use in practice. In this paper, to achieve the desired beampattern, a low complexity discrete-Fourier-transform based closed-form covariance matrix design technique is introduced for a MIMO radar. The designed covariance matrix is then exploited to derive a novel closed-form algorithm to directly design the finite-alphabet constant-envelope waveforms for the desired beampattern. The proposed technique can be used to design waveforms for large antenna arrays to change the beampattern in real time. It is also shown that the number of transmitted symbols from each antenna depends on the beampattern and is less than the total number of transmit antenna elements.
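The closed-form idea, building the covariance matrix directly from DFT columns so that each selected spatial frequency steers transmit power toward a direction, can be sketched for a uniform linear array (an illustrative sketch, not the paper's algorithm; the bin selection and half-wavelength spacing are assumptions):

```python
import numpy as np

def dft_covariance(n_antennas, active_bins):
    """Hermitian PSD covariance built from unit-norm DFT columns:
    R = (1/K) * sum_k f_k f_k^H over the K selected bins."""
    F = np.fft.fft(np.eye(n_antennas)) / np.sqrt(n_antennas)
    A = F[:, active_bins]
    return A @ A.conj().T / len(active_bins)

def beampattern(R, angles_deg, spacing=0.5):
    """Transmit power a(theta)^H R a(theta) for a uniform linear array
    with the given element spacing in wavelengths."""
    n = R.shape[0]
    powers = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(2j * np.pi * spacing * np.arange(n) * np.sin(th))
        powers.append(float(np.real(a.conj() @ R @ a)))
    return np.array(powers)

R = dft_covariance(16, [2, 3])          # concentrate power in two DFT bins
bp = beampattern(R, np.arange(-90, 91))
```

Because R is a sum of rank-one outer products of unit-norm columns, it is Hermitian positive semidefinite by construction, with unit trace, so no iterative optimization is needed to make it a valid covariance.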

  2. Covariance-Based Estimation from Multisensor Delayed Measurements with Random Parameter Matrices and Correlated Noises

    Directory of Open Access Journals (Sweden)

    R. Caballero-Águila

    2014-01-01

    The optimal least-squares linear estimation problem is addressed for a class of discrete-time multisensor linear stochastic systems subject to randomly delayed measurements with different delay rates. For each sensor, a different binary sequence is used to model the delay process. The measured outputs are perturbed by both random parameter matrices and one-step autocorrelated and cross-correlated noises. Using an innovation approach, computationally simple recursive algorithms are obtained for the prediction, filtering, and smoothing problems, without requiring full knowledge of the state-space model generating the signal process, but only the information provided by the delay probabilities and the mean and covariance functions of the processes (signal, random parameter matrices, and noises) involved in the observation model. The accuracy of the estimators is measured by their error covariance matrices, which allow us to analyze the estimator performance in a numerical simulation example that illustrates the feasibility of the proposed algorithms.

  3. Use of internet library based services by the students of Imo State ...

    African Journals Online (AJOL)

    Findings show that students utilize internet-based library services in their academic work, for their intellectual development, and in communicating with their lecturers and other relations about their day-to-day information needs. It is recommended that university libraries should provide and offer internet-based library ...

  4. Activity-Based Costing in User Services of an Academic Library.

    Science.gov (United States)

    Ellis-Newman, Jennifer

    2003-01-01

    The rationale for using Activity-Based Costing (ABC) in a library is to allocate indirect costs to products and services based on the factors that most influence them. This paper discusses the benefits of ABC to library managers and explains the steps involved in implementing ABC in the user services area of an Australian academic library.…

  5. An automated procedure for covariation-based detection of RNA structure

    International Nuclear Information System (INIS)

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs
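Covariation evidence between two alignment columns is commonly scored with mutual information: columns that change together across sequences, as complementary base pairs do, score high (a generic sketch of covariation scoring, not the authors' program):

```python
from collections import Counter
from math import log2

def mutual_information(col_i, col_j):
    """Mutual information (bits) between two alignment columns, given as
    equal-length strings of residues, one character per sequence."""
    n = len(col_i)
    pi, pj = Counter(col_i), Counter(col_j)
    pij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (a, b), count in pij.items():
        p_ab = count / n
        mi += p_ab * log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

# Perfectly covarying columns (A<->U, G<->C) versus independent ones.
mi_paired = mutual_information("AGAG", "UCUC")
mi_indep = mutual_information("AAGG", "UCUC")
```

High mutual information together with Watson-Crick-complementary pairings across the sequences is the kind of covariation evidence that supports a proposed secondary- or tertiary-structure bonding.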

  6. Early selection in open-pollinated Eucalyptus families based on competition covariates

    Directory of Open Access Journals (Sweden)

    Bruno Ettore Pavan

    2014-06-01

    The objective of this work was to evaluate the influence of intergenotypic competition in open-pollinated families of Eucalyptus and its effects on early selection efficiency. Two experiments were carried out, in which the timber volume was evaluated at three ages, in a randomized complete block design. Data from the three years of evaluation (experiment 1 at 2, 4, and 7 years; experiment 2 at 2, 5, and 7 years) were analyzed using mixed models. The following were estimated: variance components, genetic parameters, selection gains, effective number, early selection efficiency, selection gain per unit time, and coincidence of selection with and without the use of competition covariates. The competition effect was nonsignificant at ages under three years, and adjustment using competition covariates was unnecessary there. Early selection for families is effective; families that have a late growth spurt are more vulnerable to competition, which markedly impairs ranking at the end of the cycle. Early selection is efficient according to all adopted criteria, and the age of around three years is the most recommended, given the high efficiency and accuracy in the indication of trees and families. The addition of competition covariates at the end of the cycle improves early selection efficiency for almost all studied criteria.

  7. An automated procedure for covariation-based detection of RNA structure

    Energy Technology Data Exchange (ETDEWEB)

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs.

  8. A Web-Based Electronic Book (e-book) Library: The netLibrary Model.

    Science.gov (United States)

    Connaway, Lynn Silipigni

    2001-01-01

    Identifies elements that are important for academic libraries to use in evaluating electronic books, including content; acquisition and collection development; software and hardware standards and protocols; digital rights management; access; archiving; privacy; the market and pricing; and enhancements and ideal features. Describes netLibrary, a…

  9. Sensitivity analysis for missing dichotomous outcome data in multi-visit randomized clinical trial with randomization-based covariance adjustment.

    Science.gov (United States)

    Li, Siying; Koch, Gary G; Preisser, John S; Lam, Diana; Sanchez-Kam, Matilde

    2017-01-01

    Dichotomous endpoints in clinical trials have only two possible outcomes, either directly or via categorization of an ordinal or continuous observation. It is common to have missing data for one or more visits during a multi-visit study. This paper presents a closed form method for sensitivity analysis of a randomized multi-visit clinical trial that possibly has missing not at random (MNAR) dichotomous data. Counts of missing data are redistributed to the favorable and unfavorable outcomes mathematically to address possibly informative missing data. Adjusted proportion estimates and their closed form covariance matrix estimates are provided. Treatment comparisons over time are addressed with Mantel-Haenszel adjustment for a stratification factor and/or randomization-based adjustment for baseline covariables. The application of such sensitivity analyses is illustrated with an example. An appendix outlines an extension of the methodology to ordinal endpoints.
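The redistribution idea can be shown for a single arm: sweep the fraction of missing outcomes assumed favorable from 0 to 1 and recompute the proportion (a one-arm sketch of the MNAR sensitivity idea; the paper's covariance adjustment and multi-visit machinery are not reproduced):

```python
def adjusted_proportion(favorable, unfavorable, missing, alpha):
    """Proportion favorable after sending a fraction `alpha` of the
    missing counts to the favorable outcome and the remainder to the
    unfavorable one."""
    n = favorable + unfavorable + missing
    return (favorable + alpha * missing) / n

# Sweep the MNAR assumption for an arm with 40 favorable, 50 unfavorable,
# and 10 missing outcomes: the estimate ranges from 0.40 to 0.50.
bounds = [adjusted_proportion(40, 50, 10, a) for a in (0.0, 0.5, 1.0)]
```

If the treatment comparison holds across the whole sweep, for both arms, the conclusion is robust to informative missingness under this class of assumptions.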

  10. ORIGEN2 libraries based on JENDL-3.2 for LWR-MOX fuels

    Energy Technology Data Exchange (ETDEWEB)

    Suyama, Kenya; Katakura, Jun-ichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Onoue, Masaaki; Matsumoto, Hideki [Mitsubishi Heavy Industries Ltd., Tokyo (Japan); Sasahara, Akihiro [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    2000-11-01

    A set of ORIGEN2 libraries for LWR MOX fuels was developed based on JENDL-3.2. The libraries were compiled with SWAT using the specifications of MOX fuels to be used in nuclear power reactors in Japan. The libraries were verified through analyses of post irradiation examinations (PIE) of fuels from a European PWR. An analysis of PIE data from a PWR in the United States was used to compare calculated and experimental results for cases in which the parameters used to generate the libraries differ from the actual irradiation conditions. These new libraries for LWR MOX fuels are packaged in ORLIBJ32, the library set released in 1999. (author)

  11. Development of a new nuclear data library based on ROOT

    Directory of Open Access Journals (Sweden)

    Park Tae-Sun

    2017-01-01

    We develop TNudy, a new C++ nuclear data library for Evaluated Nuclear Data File (ENDF) data. The main motivation for the development is to provide systematic, powerful and intuitive interfaces and functionality for browsing, visualizing and manipulating the detailed information embodied in ENDF. To achieve this aim efficiently, the TNudy project is based on the ROOT system. TNudy is still under development; its current status and future plans are presented.

  12. Study of continuous blood pressure estimation based on pulse transit time, heart rate and photoplethysmography-derived hemodynamic covariates.

    Science.gov (United States)

    Feng, Jingjie; Huang, Zhongyi; Zhou, Congcong; Ye, Xuesong

    2018-06-01

    It is widely recognized that pulse transit time (PTT) can track blood pressure (BP) over short periods of time, and hemodynamic covariates such as heart rate and stiffness index may also contribute to BP monitoring. In this paper, we derived a proportional relationship between BP and PTT⁻² and proposed an improved method adopting hemodynamic covariates in addition to PTT for continuous BP estimation. We divided 28 subjects from the Multi-parameter Intelligent Monitoring for Intensive Care database into two groups (with/without cardiovascular diseases) and utilized a machine learning strategy based on regularized linear regression (RLR) to construct BP models with different covariates for the corresponding groups. RLR was performed for individuals as the initial calibration, while a recursive least squares algorithm was employed for re-calibration. The results showed that errors of BP estimation by our method stayed within the Association for the Advancement of Medical Instrumentation limits (-0.98 ± 6.00 mmHg @ SBP, 0.02 ± 4.98 mmHg @ DBP) when the calibration interval extended to 1200-beat cardiac cycles. In comparison with two other representative studies, Chen's method remained accurate (0.32 ± 6.74 mmHg @ SBP, 0.94 ± 5.37 mmHg @ DBP) using a 400-beat calibration interval, while Poon's failed (-1.97 ± 10.59 mmHg @ SBP, 0.70 ± 4.10 mmHg @ DBP) when using a 200-beat calibration interval. With the additional hemodynamic covariates, our method improved the accuracy of PTT-based BP estimation, decreased the calibration frequency and has the potential for better continuous BP estimation.
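A minimal sketch of the modeling step on synthetic data (the paper's feature set, MIMIC data handling and recursive re-calibration are richer): ridge-regularized linear regression of systolic BP on PTT⁻² plus a heart-rate covariate. All coefficients and ranges below are invented for illustration.

```python
import numpy as np

# Synthetic data: SBP generated from a PTT^-2 term plus a heart-rate term.
rng = np.random.default_rng(0)
n = 200
ptt = rng.uniform(0.15, 0.35, n)           # pulse transit time, s (assumed range)
hr = rng.uniform(55, 95, n)                # heart rate, bpm (assumed range)
sbp = 1.5 * ptt**-2 + 0.3 * hr + 60 + rng.normal(0, 2, n)

# Ridge-regularized linear regression (closed form): w = (X'X + lam*I)^-1 X'y
X = np.column_stack([ptt**-2, hr, np.ones(n)])
lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ sbp)

pred = X @ w
rmse = np.sqrt(np.mean((sbp - pred) ** 2))
print(w, rmse)
```

With a hemodynamic covariate in the design matrix, the fit absorbs BP variation that PTT alone cannot explain, which is the mechanism the abstract credits for the reduced calibration frequency.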

  13. MCNP4c JEFF-3.1 Based Libraries. Eccolib-Jeff-3.1 libraries

    International Nuclear Information System (INIS)

    Sublet, J.Ch.

    2006-01-01

    Continuous-energy and multi-temperature MCNP ACE-type libraries, derived from the Joint Evaluated Fission and Fusion (JEFF-3.1) evaluations, have been generated using the NJOY-99.111 processing code system. They include the continuous-energy neutron JEFF-3.1/General Purpose, JEFF-3.1/Activation-Dosimetry and thermal S(α,β) JEFF-3.1/Thermal libraries and data tables. The processing steps and features are explained, together with the Quality Assurance processes and records linked to the generation of such multipurpose libraries. (author)

  14. Research on Modified Root-MUSIC Algorithm of DOA Estimation Based on Covariance Matrix Reconstruction

    Directory of Open Access Journals (Sweden)

    Changgan SHU

    2014-09-01

    In the standard root multiple signal classification (root-MUSIC) algorithm, the performance of direction-of-arrival estimation degrades, and can fail entirely, under a low signal-to-noise ratio and a small angular separation between signals. By reconstructing and weighting the covariance matrix of the received signal, the modified algorithm can provide more accurate estimation results. Computer simulation and performance analysis show that, under a low signal-to-noise ratio and strong correlation between signals, the proposed modified algorithm provides better azimuth estimation performance than the standard method.
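For reference, the standard root-MUSIC baseline that the paper modifies can be sketched as follows (half-wavelength uniform linear array, sample covariance, noise-subspace polynomial rooting). The covariance reconstruction and weighting steps of the modified algorithm are not shown; the scenario parameters are invented.

```python
import numpy as np

M, K = 8, 2                        # sensors (half-wavelength ULA), sources
thetas = np.deg2rad([-10.0, 15.0])
rng = np.random.default_rng(1)
snaps = 500
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(thetas)))
S = (rng.normal(size=(K, snaps)) + 1j * rng.normal(size=(K, snaps))) / np.sqrt(2)
N = 0.1 * (rng.normal(size=(M, snaps)) + 1j * rng.normal(size=(M, snaps)))
X = A @ S + N                      # received data

R = X @ X.conj().T / snaps         # sample covariance matrix
_, V = np.linalg.eigh(R)           # eigenvalues in ascending order
En = V[:, : M - K]                 # noise subspace
C = En @ En.conj().T

# Root-MUSIC polynomial: coefficients are the diagonal sums of C.
coeffs = np.array([np.trace(C, offset=off) for off in range(M - 1, -M, -1)])
roots = np.roots(coeffs)
inside = roots[np.abs(roots) < 1.0]
closest = inside[np.argsort(1.0 - np.abs(inside))[:K]]  # K roots nearest the circle
doa = np.sort(np.rad2deg(np.arcsin(np.angle(closest) / np.pi)))
print(doa)  # close to [-10, 15]
```

At high SNR and wide separation this baseline works well; the cited modification targets exactly the regime (low SNR, closely spaced or correlated sources) where the sample covariance above becomes unreliable.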

  15. Development of libraries for ORIGEN2 code based on JENDL-3.2

    Energy Technology Data Exchange (ETDEWEB)

    Suyama, Kenya; Katakura, Jun-ichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ishikawa, Makoto; Ohkawachi, Yasushi

    1998-03-01

    The JNDC Working Group on Nuclide Generation Evaluation has launched a project to produce libraries for the ORIGEN2 code based on the latest nuclear data library, JENDL-3.2, for current LWR and FBR fuel designs. Many of these libraries are under validation. (author)

  16. Simulation-based hypothesis testing of high dimensional means under covariance heterogeneity.

    Science.gov (United States)

    Chang, Jinyuan; Zheng, Chao; Zhou, Wen-Xin; Zhou, Wen

    2017-12-01

    In this article, we study the problem of testing the mean vectors of high dimensional data in both one-sample and two-sample cases. The proposed testing procedures employ maximum-type statistics and parametric bootstrap techniques to compute the critical values. Unlike existing tests that rely heavily on structural conditions on the unknown covariance matrices, the proposed tests allow general covariance structures of the data and therefore enjoy a wide scope of applicability in practice. To enhance the powers of the tests against sparse alternatives, we further propose two-step procedures with a preliminary feature screening step. Theoretical properties of the proposed tests are investigated. Through extensive numerical experiments on synthetic data sets and a human acute lymphoblastic leukemia gene expression data set, we illustrate the performance of the new tests and how they may assist in detecting disease-associated gene sets. The proposed methods have been implemented in the R package HDtest and are available on CRAN. © 2017, The International Biometric Society.
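The flavor of a maximum-type statistic with a parametric bootstrap critical value can be sketched as follows. This is a simplified one-sample version on synthetic null data, not the procedure implemented in HDtest; all dimensions and the bootstrap size are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 50
X = rng.normal(size=(n, p))              # data generated under H0: mean zero
xbar, s = X.mean(0), X.std(0, ddof=1)
T = np.sqrt(n) * np.max(np.abs(xbar / s))  # maximum-type test statistic

# Parametric bootstrap: draw from N(0, Sigma_hat) so the critical value
# reflects the estimated covariance, with no structural assumption on it.
Sigma = np.cov(X, rowvar=False)
L = np.linalg.cholesky(Sigma + 1e-8 * np.eye(p))
B = 500
Tb = np.empty(B)
for b in range(B):
    Z = rng.normal(size=(n, p)) @ L.T
    Tb[b] = np.sqrt(n) * np.max(np.abs(Z.mean(0) / Z.std(0, ddof=1)))
crit = np.quantile(Tb, 0.95)
print(T, crit)  # compare T to crit at the 5% level
```

Because the bootstrap samples carry the estimated covariance, the critical value adapts to arbitrary dependence among coordinates, which is the key contrast with tests that assume a specific covariance structure.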

  17. Covariance data evaluation of some experimental data for n + 65,63,NatCu

    International Nuclear Information System (INIS)

    Jia Min; Liu Jianfeng; Liu Tingjin

    2003-01-01

    The covariance data for 65Cu, 63Cu and natural Cu in the energy range from 99.5 keV to 20 MeV were evaluated using the EXPCOV and SPC codes, based on the available experimental data. The data can serve as part of the covariance file (File 33) in an evaluated library in ENDF/B-6 format for the corresponding nuclides, and can also be used as a basis for related theoretical calculations. (authors)

  18. ERC analysis: web-based inference of gene function via evolutionary rate covariation.

    Science.gov (United States)

    Wolfe, Nicholas W; Clark, Nathan L

    2015-12-01

    The recent explosion of comparative genomics data presents an unprecedented opportunity to construct gene networks via the evolutionary rate covariation (ERC) signature. ERC is used to identify genes that experienced similar evolutionary histories, and thereby draws functional associations between them. The ERC Analysis website allows researchers to exploit genome-wide datasets to infer novel genes in any biological function and to explore deep evolutionary connections between distinct pathways and complexes. The website provides five analytical methods, graphical output, statistical support and access to an increasing number of taxonomic groups. Analyses and data are available at http://csb.pitt.edu/erc_analysis/. Contact: nclark@pitt.edu. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. General Galilei Covariant Gaussian Maps

    Science.gov (United States)

    Gasbarri, Giulio; Toroš, Marko; Bassi, Angelo

    2017-09-01

    We characterize general non-Markovian Gaussian maps which are covariant under Galilean transformations. In particular, we consider translational and Galilean covariant maps and show that they reduce to the known Holevo result in the Markovian limit. We apply the results to discuss measures of macroscopicity based on classicalization maps, specifically addressing dissipation, Galilean covariance and non-Markovianity. We further suggest a possible generalization of the macroscopicity measure defined by Nimmrichter and Hornberger [Phys. Rev. Lett. 110, 16 (2013)].

  20. An estimate of the terrestrial carbon budget of Russia using inventory-based, eddy covariance and inversion methods

    Directory of Open Access Journals (Sweden)

    A. J. Dolman

    2012-12-01

    We determine the net land-to-atmosphere flux of carbon in Russia, including Ukraine, Belarus and Kazakhstan, using inventory-based, eddy covariance, and inversion methods. Our high boundary estimate is −342 Tg C yr⁻¹ from the eddy covariance method, and this is close to the upper bounds of the inventory-based Land Ecosystem Assessment and inverse model estimates. A lower boundary estimate is provided at −1350 Tg C yr⁻¹ from the inversion models. The average of the three methods is −613.5 Tg C yr⁻¹. The methane emission is estimated separately at 41.4 Tg C yr⁻¹.

    These three methods agree well within their respective error bounds. There is thus good consistency between bottom-up and top-down methods. The forests of Russia primarily cause the net atmosphere-to-land flux (−692 Tg C yr⁻¹ from the LEA). It remains remarkable, however, that the three methods provide such close estimates (−615, −662, −554 Tg C yr⁻¹) for net biome production (NBP), given the inherent uncertainties in all of the approaches. The lack of recent forest inventories, the few eddy covariance sites, the associated uncertainty in upscaling, and the undersampling of concentrations for the inversions are among the prime causes of the uncertainty. The dynamic global vegetation models (DGVMs) suggest a much lower uptake at −91 Tg C yr⁻¹, and we argue that this is caused by a high estimate of heterotrophic respiration compared to other methods.

  1. Preparation of covariance data for the fast reactor. 2

    International Nuclear Information System (INIS)

    Shibata, Keiichi; Hasagawa, Akira

    1998-03-01

    Covariance data were estimated and compiled for the neutron nuclear data of several isotopes important for fast reactor core analysis in the evaluated nuclear data library JENDL-3.2. The isotopes covered were 10-B, 11-B, 55-Mn, 240-Pu and 241-Pu. The physical quantities for which covariances were estimated are cross sections, resolved and unresolved resonance parameters, and the first-order Legendre coefficient of the elastic scattering angular distribution. The covariance estimation followed the evaluation methods of JENDL-3.2 as far as possible: when an evaluated value was based on experimental data, the error of the experimental value was calculated, and when it was based on calculation, the error of the calculated value was obtained. The results were compiled in ENDF-6 format. (G.K.)

  2. Covariance Bell inequalities

    Science.gov (United States)

    Pozsgay, Victor; Hirsch, Flavien; Branciard, Cyril; Brunner, Nicolas

    2017-12-01

    We introduce Bell inequalities based on covariance, one of the most common measures of correlation. Explicit examples are discussed, and violations in quantum theory are demonstrated. A crucial feature of these covariance Bell inequalities is their nonlinearity; this has nontrivial consequences for the derivation of their local bound, which is not reached by deterministic local correlations. For our simplest inequality, we derive analytically tight bounds for both local and quantum correlations. An interesting application of covariance Bell inequalities is that they can act as "shared randomness witnesses": specifically, the value of the Bell expression gives device-independent lower bounds on both the dimension and the entropy of the shared random variable in a local model.

  3. A problem-based learning curriculum in transition: the emerging role of the library.

    Science.gov (United States)

    Eldredge, J D

    1993-07-01

    This case study describes library education programs that serve the University of New Mexico School of Medicine, known for its innovative problem-based learning (PBL) curricular track. The paper outlines the specific library instruction techniques that are integrated into the curriculum. The adaptation of library instruction to a PBL mode of medical education, including the use of case studies, is discussed in detail. Also addressed are the planning processes for the new PBL curriculum scheduled for implementation in 1993, including the activities of library faculty and staff and the probable new role of the library in the new curriculum.

  4. Cloud-based services for your library: a LITA guide

    CERN Document Server

    Mitchell, Erik T

    2013-01-01

    By exploring specific examples of cloud computing and virtualization, this book allows libraries considering cloud computing to start their exploration of these systems with a more informed perspective.

  5. Security of Heterogeneous Content in Cloud Based Library Information Systems Using an Ontology Based Approach

    Directory of Open Access Journals (Sweden)

    Mihai DOINEA

    2014-01-01

    As in any domain that involves the use of software, library information systems take advantage of cloud computing. The paper highlights the main aspects of cloud-based systems, describing some public solutions provided by the most important players on the market. Topics related to content security in cloud-based services are tackled in order to emphasize the requirements that must be met by these types of systems. A cloud-based implementation of a Library Information System is presented, and some adjacent tools used together with it to provide digital content and metadata links are described. In a cloud-based Library Information System, security is approached by means of ontologies. Aspects such as content security in terms of digital rights are presented, and a methodology for security optimization is proposed.

  6. Fault estimation of satellite reaction wheels using covariance based adaptive unscented Kalman filter

    Science.gov (United States)

    Rahimi, Afshin; Kumar, Krishna Dev; Alighanbari, Hekmat

    2017-05-01

    Reaction wheels, as one of the most commonly used actuators in satellite attitude control systems, are prone to malfunction which could lead to catastrophic failures. Such malfunctions can be detected and addressed in time if proper analytical redundancy algorithms such as parameter estimation and control reconfiguration are employed. Major challenges in parameter estimation include speed and accuracy of the employed algorithm. This paper presents a new approach for improving parameter estimation with adaptive unscented Kalman filter. The enhancement in tracking speed of unscented Kalman filter is achieved by systematically adapting the covariance matrix to the faulty estimates using innovation and residual sequences combined with an adaptive fault annunciation scheme. The proposed approach provides the filter with the advantage of tracking sudden changes in the system non-measurable parameters accurately. Results showed successful detection of reaction wheel malfunctions without requiring a priori knowledge about system performance in the presence of abrupt, transient, intermittent, and incipient faults. Furthermore, the proposed approach resulted in superior filter performance with less mean squared errors for residuals compared to generic and adaptive unscented Kalman filters, and thus, it can be a promising method for the development of fail-safe satellites.
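The innovation-based covariance adaptation idea can be illustrated with a toy scalar Kalman filter. The paper adapts an unscented KF for reaction-wheel fault estimation; this sketch only shows the generic mechanism of re-estimating the measurement-noise covariance from a windowed innovation sequence, with invented numbers throughout.

```python
import numpy as np

# Toy setup: track a constant state with unknown measurement noise R.
rng = np.random.default_rng(3)
true_x, R_true, Q = 1.0, 0.5, 1e-4
x, P, R_hat = 0.0, 1.0, 1.0          # initial estimate, covariance, guessed R
window, innovations = 50, []

for k in range(500):
    z = true_x + rng.normal(0, np.sqrt(R_true))   # measurement
    P = P + Q                                     # predict (static state model)
    v = z - x                                     # innovation
    innovations.append(v)
    if len(innovations) > window:
        innovations.pop(0)
        # Adapt: since E[v^2] = P + R for a scalar filter, estimate
        # R from the windowed innovation variance minus the prior P.
        R_hat = max(np.mean(np.square(innovations)) - P, 1e-6)
    K = P / (P + R_hat)                           # Kalman gain
    x = x + K * v                                 # correct
    P = (1 - K) * P

print(x, R_hat)   # x near true_x, R_hat near R_true
```

The same innovation-consistency argument is what lets an adaptive filter inflate its covariance when a fault suddenly makes the innovations larger than the current matrices predict.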

  7. Triple collocation-based estimation of spatially correlated observation error covariance in remote sensing soil moisture data assimilation

    Science.gov (United States)

    Wu, Kai; Shu, Hong; Nie, Lei; Jiao, Zhenhang

    2018-01-01

    Spatially correlated errors are typically ignored in data assimilation, degenerating the observation error covariance R to a diagonal matrix. We argue that a nondiagonal R carries more observation information, making assimilation results more accurate. A method, denoted TC_Cov, was proposed for soil moisture data assimilation to estimate spatially correlated observation error covariance based on triple collocation (TC). Assimilation experiments were carried out to test the performance of TC_Cov. AMSR-E soil moisture was assimilated both with a diagonal R matrix computed using TC and with a nondiagonal R matrix estimated by the proposed TC_Cov. The ensemble Kalman filter was used as the assimilation method. Our assimilation results were validated against climate change initiative data and ground-based soil moisture measurements using the Pearson correlation coefficient and unbiased root mean square difference (ubRMSD) metrics. These experiments confirmed that diagonal-R assimilation results deteriorate when the model simulation is more accurate than the observation data. Furthermore, nondiagonal R achieved higher correlation coefficients and lower ubRMSD values than diagonal R in the experiments, demonstrating the effectiveness of TC_Cov in estimating a richly structured R in data assimilation. In sum, compared with diagonal R, nondiagonal R may relieve the detrimental effects of assimilation when simulated model results outperform the observation data.
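The triple collocation identity underlying such methods can be verified on synthetic data. This sketch shows only the basic single-point error-variance estimate (three collocated measurements with independent errors), not the spatially correlated R construction of TC_Cov; all noise levels are invented.

```python
import numpy as np

# Three independent-error measurements of the same truth; the error variance
# of x follows from cross-differences alone: Var(e_x) = Cov(x - y, x - z).
rng = np.random.default_rng(4)
n = 200_000
t = rng.normal(0.3, 0.1, n)                  # synthetic "true" soil moisture
x = t + rng.normal(0, 0.02, n)
y = t + rng.normal(0, 0.04, n)
z = t + rng.normal(0, 0.03, n)

dxy = (x - y) - np.mean(x - y)
dxz = (x - z) - np.mean(x - z)
var_x = np.mean(dxy * dxz)                   # estimate of Var(e_x)
print(np.sqrt(var_x))  # ≈ 0.02, the error std assigned to x
```

The truth term cancels in both differences, so the product isolates the error variance of x without ever observing t, which is why TC needs no ground reference.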

  8. ECNJEF1. A JEF1 based 219-group neutron cross-section library: User's manual

    International Nuclear Information System (INIS)

    Stad, R.C.L. van der; Gruppelaar, H.

    1992-07-01

    This manual describes the contents of the ECNJEF1 library, a JEF1.1 based 219-group AMPX-Master library for reactor calculations with the AMPX/SCALE system, e.g. the PASC-3 system as implemented at the Netherlands Energy Research Foundation in Petten, Netherlands. The group cross-section data were generated with NJOY and NPTXS/XLACS-2 from the AMPX system. The data in the ECNJEF1 library allow resolved-resonance treatment by NITAWL and/or unresolved-resonance self-shielding by BONAMI; these codes are based on the Nordheim and Bondarenko methods, respectively. (author). 10 refs., 7 tabs

  9. A problem-based learning curriculum in transition: the emerging role of the library.

    OpenAIRE

    Eldredge, J D

    1993-01-01

    This case study describes library education programs that serve the University of New Mexico School of Medicine, known for its innovative problem-based learning (PBL) curricular track. The paper outlines the specific library instruction techniques that are integrated into the curriculum. The adaptation of library instruction to a PBL mode of medical education, including the use of case studies, is discussed in detail. Also addressed are the planning processes for the new PBL curriculum scheduled for implementation in 1993, including the activities of library faculty and staff and the probable new role of the library in the new curriculum.

  10. ORIGEN-2 libraries based on JENDL-3.2 for PWR-MOX fuel

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, Hideki; Onoue, Masaaki; Tahara, Yoshihisa [Mitsubishi Heavy Industries Ltd., Tokyo (Japan)

    2001-08-01

    A set of ORIGEN-2 libraries for PWR MOX fuel was developed based on JENDL-3.2 in the Working Group on Evaluation of Nuclide Production, Japanese Nuclear Data Committee. The calculational model used to generate the ORIGEN-2 libraries for PWR MOX is explained here in detail. The ORIGEN-2 calculation with the new MOX library can predict nuclide contents to within 10% for U and Pu isotopes and within 20% for both minor actinides and the main FPs. (author)

  11. ENDF/B-VII.0 Based Library for Paragon - 313

    International Nuclear Information System (INIS)

    Huria, H.C.; Kucukboyaci, V.N.; Ouisloumen, M.

    2010-01-01

    A new 70-group library has been generated for the Westinghouse lattice physics code PARAGON using the ENDF/B-VII.0 nuclear data files. The new library retains the major features of the current library, including the number of energy groups and the reduction in the U-238 resonance integral. The upper bound for up-scattering effects in the new library, however, has been moved from 2.1 eV to 4.0 eV for better MOX fuel predictions. The new library has been used to analyze standard benchmarks and to compare measured and predicted parameters for different types of Westinghouse and Combustion Engineering (CE) operating reactor cores. Results indicate that the new library will not impact reactivity, power distribution, or temperature coefficient predictions over a wide range of physics design parameters, but will improve MOX core predictions. In other words, ENDF/B-VI.3 and ENDF/B-VII.0 produce similar results for reactor core calculations. (authors)

  12. Curriculum-based library instruction: from cultivating faculty relationships to assessment

    CERN Document Server

    Blevins, Amy

    2014-01-01

    Curriculum-Based Library Instruction: From Cultivating Faculty Relationships to Assessment highlights the movement beyond one-shot instruction sessions, specifically focusing on situations where academic librarians have developed curriculum based sessions and/or become involved in curriculum committees.

  13. Collection evaluation in university libraries (II). Methods based on collection use

    Directory of Open Access Journals (Sweden)

    Àngels Massísimo i Sánchez de Boado

    2004-01-01

    This is our second paper devoted to collection evaluation in university libraries. Seven methods based on collection use are described. Their advantages and disadvantages are discussed, as well as their usefulness for a range of library types.

  14. Evidence-Based Practice and School Libraries: Interconnections of Evidence, Advocacy, and Actions

    Science.gov (United States)

    Todd, Ross J.

    2015-01-01

    This author states that a professional focus on evidence based practice (EBP) for school libraries emerged from the International Association of School Librarianship conference when he presented the concept. He challenged the school library profession to actively engage in professional and reflective practices that chart, measure, document, and…

  15. The Experience of Evidence-Based Practice in an Australian Public Library: An Ethnography

    Science.gov (United States)

    Gillespie, Ann; Partridge, Helen; Bruce, Christine; Howlett, Alisa

    2016-01-01

    Introduction: This paper presents the findings from a project that investigated the lived experiences of library and information professionals in relation to evidence-based practice within an Australian public library. Method: The project employed ethnography, which allows holistic description of people's experiences within a particular community…

  16. Analysis of Environmental Friendly Library Based on the Satisfaction and Service Quality: study at Library “X”

    Science.gov (United States)

    Herdiansyah, Herdis; Satriya Utama, Andre; Safruddin; Hidayat, Heri; Gema Zuliana Irawan, Angga; Immanuel Tjandra Muliawan, R.; Mutia Pratiwi, Diana

    2017-10-01

    One of the factors that influence the development of science is the existence of libraries, in this case college libraries. A library located in a college environment aims to supply collections of literature to support the college's research and educational activities. Conceptually, every library now starts to practice environmental principles. For example, library "X", a central library, claims to be an environmentally friendly library because it practices environmentally friendly management, but it has not considered satisfaction and service quality from the users' perspective, including whether the environmentally friendly process is actually perceived by library users. Satisfaction can be seen from the comparison between the expectations and the experienced reality of library users. This paper analyzes the level of user satisfaction with library services in the campus area and the gap between the expectations and the reality felt by the users. The results show that there is a disparity between the management's aim of a sustainable and environmentally friendly library and the reality of its management, so that it has not yet satisfied the users. The largest satisfaction gap is in the library collection, with a value of 1.57, while the smallest gap is in equal service to all students, with a value of 0.67.

  17. A parameterization of observer-based controllers: Bumpless transfer by covariance interpolation

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Komareji, Mohammad

    2009-01-01

    This paper presents an algorithm to interpolate between two observer-based controllers for a linear multivariable system such that the closed loop system remains stable throughout the interpolation. The method interpolates between the inverse Lyapunov functions for the two original state feedback...

  18. MT71x: Multi-Temperature Library Based on ENDF/B-VII.1

    Energy Technology Data Exchange (ETDEWEB)

    Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parsons, Donald Kent [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gray, Mark Girard [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lee, Mary Beth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); White, Morgan Curtis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-16

    The Nuclear Data Team has released a multi-temperature transport library, MT71x, based upon ENDF/B-VII.1 with a few modifications as well as additional evaluations, for a total of 427 isotope tables. The library was processed using NJOY2012.39 into 23 temperatures. MT71x consists of two sub-libraries: MT71xMG for multigroup energy representation data and MT71xCE for continuous energy representation data. These sub-libraries are suitable for deterministic transport and Monte Carlo transport applications, respectively. The SZAs used are the same for the two sub-libraries; that is, the same SZA can be used for both. This makes comparisons between the two libraries, and between deterministic and Monte Carlo codes, straightforward. Both the multigroup energy and continuous energy libraries were verified with our checking codes checkmg and checkace (multigroup and continuous energy, respectively). An expanded suite of tests was then used for additional verification, and the library was finally validated using an extensive suite of critical benchmark models. We feel that this library is suitable for all calculations and is particularly useful for calculations sensitive to temperature effects.

  19. Heteroscedasticity resistant robust covariance matrix estimator

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2010-01-01

    Roč. 17, č. 27 (2010), s. 33-49 ISSN 1212-074X Grant - others:GA UK(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10750506 Keywords : Regression * Covariance matrix * Heteroscedasticity * Resistant Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2011/SI/visek-heteroscedasticity resistant robust covariance matrix estimator.pdf

  20. Dynamic combinatorial libraries based on hydrogen-bonded molecular boxes

    NARCIS (Netherlands)

    Kerckhoffs, J.M.C.A.; Mateos timoneda, Miguel; Reinhoudt, David; Crego Calama, Mercedes

    2007-01-01

    This article describes two different types of dynamic combinatorial libraries of host and guest molecules. The first part of this article describes the encapsulation of alizarin trimer 2 a3 by dynamic mixtures of up to twenty different self-assembled molecular receptors together with the

  1. Analysis of Different Feature Selection Criteria Based on a Covariance Convergence Perspective for a SLAM Algorithm

    Science.gov (United States)

    Auat Cheein, Fernando A.; Carelli, Ricardo

    2011-01-01

    This paper introduces several non-arbitrary feature selection techniques for a Simultaneous Localization and Mapping (SLAM) algorithm. The feature selection criteria are based on determining the most significant features from a SLAM convergence perspective. The SLAM algorithm implemented in this work is a sequential EKF (Extended Kalman Filter) SLAM. The feature selection criteria are applied at the correction stage of the SLAM algorithm, restricting the correction to the most significant features. This restriction also reduces the processing time of the SLAM algorithm. Several experiments with a mobile robot are shown in this work. The experiments concern map reconstruction and a comparison of the performance of the different proposed techniques. The experiments were carried out in an outdoor environment composed of trees, although the results shown herein are not restricted to a particular type of feature. PMID:22346568
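One simple, non-arbitrary selection criterion in this spirit ranks candidate features by the reduction in the trace of the state covariance that an EKF correction with that feature alone would produce. This is a generic sketch, not the paper's specific criteria; the Jacobians below are random stand-ins rather than a real observation model.

```python
import numpy as np

rng = np.random.default_rng(6)
P = np.diag([1.0, 1.0, 0.5])             # robot pose covariance (x, y, heading)
R = 0.1 * np.eye(2)                       # range-bearing measurement noise
features = [rng.normal(size=(2, 3)) for _ in range(5)]  # candidate Jacobians H_i

def trace_reduction(H):
    """How much a single-feature EKF update would shrink trace(P)."""
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    return np.trace(P) - np.trace((np.eye(3) - K @ H) @ P)

scores = [trace_reduction(H) for H in features]
best = int(np.argmax(scores))
print(best, scores[best])
```

Correcting with only the top-ranked features keeps most of the covariance convergence benefit while skipping the per-feature update cost for the rest, which matches the processing-time reduction reported in the abstract.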

  2. Analyses of criticality and reactivity for TRACY experiments based on JENDL-3.3 data library

    International Nuclear Information System (INIS)

    Sono, Hiroki; Miyoshi, Yoshinori; Nakajima, Ken

    2003-01-01

    The criticality and reactivity parameters employed in computational simulations of the TRACY supercritical experiments were analyzed using a recently revised nuclear data library, JENDL-3.3. The parameters based on the JENDL-3.3 library were compared to those based on two previously used libraries, JENDL-3.2 and ENDF/B-VI. The computational codes MVP, MCNP version 4C and TWOTRAN were used in the analyses. The following conclusions were obtained: (1) The computational biases of the effective neutron multiplication factor attributable to the nuclear data libraries and to the computational codes do not depend on the TRACY experimental conditions, such as fuel conditions. (2) The fractional discrepancies in the kinetic parameters and reactivity coefficients are within ∼5% between the three libraries. Comparison between calculations and measurements of the parameters suggests that the JENDL-3.3 library gives values closer to the measurements than the JENDL-3.2 and ENDF/B-VI libraries. (3) While the reactivity worth of the transient rods expressed in $ units shows ∼5% discrepancy between the three libraries, according to their respective βeff values, there is little discrepancy when it is expressed in Δk/k units. (author)

  3. Proofs of Contracted Length Non-covariance

    International Nuclear Information System (INIS)

    Strel'tsov, V.N.

    1994-01-01

    Different proofs of the non-covariance of the contracted length are discussed. The proof based on establishing the inconstancy of the interval (its dependence on velocity) seems to be the most convincing one. It is stressed that the known non-covariance of the electromagnetic field energy and momentum of a moving charge ('the 4/3 problem') is a direct consequence of the non-covariance of the contracted length. 8 refs

  4. Covariance-based synaptic plasticity in an attractor network model accounts for fast adaptation in free operant learning.

    Science.gov (United States)

    Neiman, Tal; Loewenstein, Yonatan

    2013-01-23

    In free operant experiments, subjects alternate at will between targets that yield rewards stochastically. Behavior in these experiments is typically characterized by (1) an exponential distribution of stay durations, (2) matching of the relative time spent at a target to its relative share of the total number of rewards, and (3) adaptation after a change in the reward rates that can be very fast. The neural mechanism underlying these regularities is largely unknown. Moreover, current decision-making neural network models typically aim at explaining behavior in discrete-time experiments in which a single decision is made once in every trial, making these models hard to extend to the more natural case of free operant decisions. Here we show that a model based on attractor dynamics, in which transitions are induced by noise and preference is formed via covariance-based synaptic plasticity, can account for the characteristics of behavior in free operant experiments. We compare a specific instance of such a model, in which two recurrently excited populations of neurons compete for higher activity, to the behavior of rats responding on two levers for rewarding brain stimulation on a concurrent variable interval reward schedule (Gallistel et al., 2001). We show that the model is consistent with the rats' behavior, and in particular, with the observed fast adaptation to matching behavior. Further, we show that the neural model can be reduced to a behavioral model, and we use this model to deduce a novel "conservation law," which is consistent with the behavior of the rats.
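The covariance-based plasticity idea can be sketched in a few lines: a toy two-target model (not the authors' attractor network) in which the preference for each target changes in proportion to the covariance between choice and reward. The softmax read-out, learning rates, and reward probabilities are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

w = np.zeros(2)                   # preference weights for the two targets
eta = 0.05                        # learning rate
p_reward = np.array([0.8, 0.2])   # hypothetical reward probabilities
r_bar = 0.0                       # running estimate of the mean reward

for t in range(5000):
    p = np.exp(w) / np.exp(w).sum()            # softmax choice probabilities
    a = rng.choice(2, p=p)                     # choose a target
    reward = float(rng.random() < p_reward[a])

    # covariance rule: the weight change tracks cov(choice, reward);
    # "activity" is the choice indicator minus its expected value
    activity = np.eye(2)[a] - p
    w += eta * activity * (reward - r_bar)

    r_bar += 0.01 * (reward - r_bar)           # slow mean-reward estimate
```

With these illustrative rates the preference for the richer target grows over trials; the full neural model in the paper additionally includes noise-driven transitions between attractor states.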

  5. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations.

    Science.gov (United States)

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J Harry

    2016-01-01

Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products have been released during the past few decades, but their quality might not meet requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on a modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observations were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts are needed to evaluate and improve the proposed algorithm at larger spatial scales and longer time periods, and over different land cover types.
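The EOF-based merging idea can be illustrated with a minimal sketch: two noisy stand-ins for the MOD16 and PT-JPL series are stacked, and the reconstruction from the leading EOF retains the signal shared by the two products while discarding much of the product-specific noise. The series, noise levels, and the final averaging step are invented for illustration; the paper's modified EOF analysis is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily LE series (W m^-2) at one site: an unknown "truth"
# observed by two products with independent errors.
n_days = 365
truth = 100 + 40 * np.sin(2 * np.pi * np.arange(n_days) / 365)
le_mod16 = truth + rng.normal(0, 15, n_days)   # stand-in for MOD16
le_ptjpl = truth + rng.normal(0, 15, n_days)   # stand-in for PT-JPL

# EOF analysis of the 2 x n anomaly matrix: the leading EOF captures the
# variation shared by the two products, so reconstructing from it alone
# discards much of the product-specific noise.
X = np.vstack([le_mod16, le_ptjpl])
mean = X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
merged = (mean + s[0] * U[:, :1] @ Vt[:1]).mean(axis=0)

def rmse(series):
    return float(np.sqrt(np.mean((series - truth) ** 2)))
```

In this synthetic setting the merged series tracks the truth more closely than either input, which is the qualitative behavior the paper reports at EC validation sites.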

  6. Evidence-based medicine and the development of medical libraries in China.

    Science.gov (United States)

    Huang, Michael Bailou; Cheng, Aijun; Ma, Lu

    2009-07-01

    This article elaborates on the opportunities and challenges that evidence-based medicine (EBM) has posed to the development of medical libraries and summarizes the research in the field of evidence-based medicine and achievements of EBM practice in Chinese medical libraries. Issues such as building collections of information resources, transformation of information services models, human resources management, and training of medical librarians, clinicians, and EBM users are addressed. In view of problems encountered in EBM research and practice, several suggestions are made about important roles medical libraries can play in the future development of EBM in China.

  7. PCR-based cDNA library construction: general cDNA libraries at the level of a few cells.

    OpenAIRE

    Belyavsky, A; Vinogradova, T; Rajewsky, K

    1989-01-01

    A procedure for the construction of general cDNA libraries is described which is based on the amplification of total cDNA in vitro. The first cDNA strand is synthesized from total RNA using an oligo(dT)-containing primer. After oligo(dG) tailing the total cDNA is amplified by PCR using two primers complementary to oligo(dA) and oligo(dG) ends of the cDNA. For insertion of the cDNA into a vector a controlled trimming of the 3' ends of the cDNA by Klenow enzyme was used. Starting from 10 J558L ...

  8. Optimization of reference library used in content-based medical image retrieval scheme

    International Nuclear Information System (INIS)

    Park, Sang Cheol; Sukthankar, Rahul; Mummert, Lily; Satyanarayanan, Mahadev; Zheng Bin

    2007-01-01

Building an optimal image reference library is a critical step in developing the interactive computer-aided detection and diagnosis (I-CAD) systems of medical images using content-based image retrieval (CBIR) schemes. In this study, the authors conducted two experiments to investigate (1) the relationship between I-CAD performance and size of reference library and (2) a new reference selection strategy to optimize the library and improve I-CAD performance. The authors assembled a reference library that includes 3153 regions of interest (ROI) depicting either malignant masses (1592) or CAD-cued false-positive regions (1561) and an independent testing data set including 200 masses and 200 false-positive regions. A CBIR scheme using a distance-weighted K-nearest neighbor algorithm is applied to retrieve references that are considered similar to the testing sample from the library. The area under the receiver operating characteristic curve (Az) is used as an index to evaluate the I-CAD performance. In the first experiment, the authors systematically increased reference library size and tested I-CAD performance. The result indicates that scheme performance improves initially from Az=0.715 to 0.874 and then plateaus when the library size reaches approximately half of its maximum capacity. In the second experiment, based on the hypothesis that a ROI should be removed if it performs poorly compared to a group of similar ROIs in a large and diverse reference library, the authors applied a new strategy to identify 'poorly effective' references. By removing 174 identified ROIs from the reference library, I-CAD performance significantly increases to Az=0.914 (p<0.01). The study demonstrates that increasing reference library size and removing poorly effective references can significantly improve I-CAD performance
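A distance-weighted K-nearest-neighbor retrieval score of the kind the CBIR scheme uses can be sketched as follows; the 2-D feature space, the inverse-distance weighting, and the toy reference library are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def dwknn_score(query, refs, labels, k=15, eps=1e-6):
    """Distance-weighted KNN likelihood that `query` depicts a true mass:
    retrieve the k most similar ROIs and weight their labels by inverse
    distance (illustrative variant of the scheme described in the text)."""
    d = np.linalg.norm(refs - query, axis=1)   # feature-space distances
    nearest = np.argsort(d)[:k]                # k most similar references
    w = 1.0 / (d[nearest] + eps)               # closer references weigh more
    return float(np.sum(w * labels[nearest]) / np.sum(w))

# toy reference library: 2-D features, label 1 = malignant mass,
# label 0 = CAD-cued false-positive region
rng = np.random.default_rng(2)
masses = rng.normal([2, 2], 0.5, (100, 2))
fps = rng.normal([0, 0], 0.5, (100, 2))
refs = np.vstack([masses, fps])
labels = np.r_[np.ones(100), np.zeros(100)]

score_near_mass = dwknn_score(np.array([2.0, 2.0]), refs, labels)
score_near_fp = dwknn_score(np.array([0.0, 0.0]), refs, labels)
```

Removing "poorly effective" references, as the second experiment does, amounts to pruning ROIs whose retrieved neighborhoods consistently vote against their own label.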

  9. Agricultural Library Information Retrieval Based on Improved Semantic Algorithm

    OpenAIRE

Meiling, Xie

    2014-01-01

To support users in quickly accessing the information they need from the agricultural library's vast holdings, and to improve the query service's low level of intelligence, a model for intelligent library information retrieval was constructed. The semantic web mode was introduced and the information retrieval framework was designed. The model structure consisted of three parts: information data integration, user interface and information retrieval match. The key method supporting retr...

  10. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    Science.gov (United States)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109
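The idea of predicting a required number of training samples from the quality of the covariance estimate can be illustrated with a simple Monte Carlo sketch; the quality measure used here (average relative Frobenius error of the sample covariance) is a stand-in, not the paper's specific criterion.

```python
import numpy as np

rng = np.random.default_rng(3)

def cov_estimate_error(n, dim=5, trials=200):
    """Average relative Frobenius error of the sample covariance of n
    draws from N(0, I); a stand-in quality measure for illustration."""
    eye = np.eye(dim)
    errs = []
    for _ in range(trials):
        x = rng.normal(size=(n, dim))
        s = np.cov(x, rowvar=False)
        errs.append(np.linalg.norm(s - eye) / np.linalg.norm(eye))
    return float(np.mean(errs))

# the error shrinks roughly as 1/sqrt(n); the required number of training
# samples is then the smallest n whose error meets a chosen tolerance
tol = 0.3
sizes = [20, 40, 80, 160, 320]
errors = [cov_estimate_error(n) for n in sizes]
needed = next((n for n, e in zip(sizes, errors) if e < tol), None)
```

The same logic, with the paper's criterion in place of the Frobenius error, yields a guide curve of estimate quality versus training-set size.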

  11. Optimal protein library design using recombination or point mutations based on sequence-based scoring functions.

    Science.gov (United States)

    Pantazes, Robert J; Saraf, Manish C; Maranas, Costas D

    2007-08-01

    In this paper, we introduce and test two new sequence-based protein scoring systems (i.e. S1, S2) for assessing the likelihood that a given protein hybrid will be functional. By binning together amino acids with similar properties (i.e. volume, hydrophobicity and charge) the scoring systems S1 and S2 allow for the quantification of the severity of mismatched interactions in the hybrids. The S2 scoring system is found to be able to significantly functionally enrich a cytochrome P450 library over other scoring methods. Given this scoring base, we subsequently constructed two separate optimization formulations (i.e. OPTCOMB and OPTOLIGO) for optimally designing protein combinatorial libraries involving recombination or mutations, respectively. Notably, two separate versions of OPTCOMB are generated (i.e. model M1, M2) with the latter allowing for position-dependent parental fragment skipping. Computational benchmarking results demonstrate the efficacy of models OPTCOMB and OPTOLIGO to generate high scoring libraries of a prespecified size.

  12. Collection-based analysis of selected medical libraries in the Philippines using Doody's Core Titles.

    Science.gov (United States)

    Torres, Efren

    2017-01-01

This study assessed the book collections of five selected medical libraries in the Philippines based on Doody's Essential Purchase List for basic sciences and clinical medicine, in order to compare the match and non-match titles among libraries, determine the strong and weak disciplines of each library, and explore the factors that contributed to the percentage of match and non-match titles. List checking was employed as the research method. Among the medical libraries, De La Salle Health Sciences Institute and University of Santo Tomas had the highest percentage of match titles, whereas Ateneo School of Medicine and Public Health had the lowest. University of the Philippines Manila had the highest percentage of near-match titles. De La Salle Health Sciences Institute and University of Santo Tomas had sound medical collections based on Doody's Core Titles. Collectively, the medical libraries shared common collection development priorities, as evidenced by similarities in strong areas. The library budget and the role of the library director in book selection were among the factors that could contribute to a high percentage of match titles.

  13. Investigating User Interfaces of Non-Iranian Digital Libraries based on Social Bookmarking Capabilities and Characteristics to Use by Iranian Digital Libraries

    Directory of Open Access Journals (Sweden)

    Zahra Naseri

    2016-08-01

The current study investigates the user interfaces of non-Iranian digital libraries with respect to social bookmarking capabilities and characteristics, for use by Iranian digital libraries. The research examines the characteristics and capabilities of the user interfaces of the world's top digital libraries based on social bookmarking by library users. This capability facilitates producing, identifying, organizing, and sharing content using tags. A survey method with a descriptive-analytical approach was used. The population comprised non-Iranian digital library interfaces, and the interfaces of the top ten digital libraries were selected as the sample. A researcher-made checklist was prepared based on a literature review and an investigation of four distinguished websites (LibraryThing, Delicious, Amazon, and Google Books). Face validity was evaluated against the viewpoints of 10 experts, and reliability was calculated at 0.87. The findings of this study are important for two reasons: first, they provide a comprehensive and unambiguous view of the basic capabilities and characteristics of user interfaces based on social bookmarking; second, they can provide a basis for designing digital libraries in Iran. The results showed that the majority of digital libraries around the world had not used Web 2.0 characteristics such as producing, identifying, organizing, and sharing content, with the exception of two digital libraries (Google Books and Ibiblio).

  14. Covariances for neutron cross sections calculated using a regional model based on local-model fits to experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.; Guenther, P.T.

    1983-11-01

    We suggest a procedure for estimating uncertainties in neutron cross sections calculated with a nuclear model descriptive of a specific mass region. It applies standard error propagation techniques, using a model-parameter covariance matrix. Generally, available codes do not generate covariance information in conjunction with their fitting algorithms. Therefore, we resort to estimating a relative covariance matrix a posteriori from a statistical examination of the scatter of elemental parameter values about the regional representation. We numerically demonstrate our method by considering an optical-statistical model analysis of a body of total and elastic scattering data for the light fission-fragment mass region. In this example, strong uncertainty correlations emerge and they conspire to reduce estimated errors to some 50% of those obtained from a naive uncorrelated summation in quadrature. 37 references.

  15. Covariances for neutron cross sections calculated using a regional model based on local-model fits to experimental data

    International Nuclear Information System (INIS)

    Smith, D.L.; Guenther, P.T.

    1983-11-01

    We suggest a procedure for estimating uncertainties in neutron cross sections calculated with a nuclear model descriptive of a specific mass region. It applies standard error propagation techniques, using a model-parameter covariance matrix. Generally, available codes do not generate covariance information in conjunction with their fitting algorithms. Therefore, we resort to estimating a relative covariance matrix a posteriori from a statistical examination of the scatter of elemental parameter values about the regional representation. We numerically demonstrate our method by considering an optical-statistical model analysis of a body of total and elastic scattering data for the light fission-fragment mass region. In this example, strong uncertainty correlations emerge and they conspire to reduce estimated errors to some 50% of those obtained from a naive uncorrelated summation in quadrature. 37 references
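The a posteriori procedure can be sketched numerically: a relative parameter covariance matrix is estimated from the scatter of local parameter values about the regional mean, then propagated with the standard sandwich rule. The parameter values and sensitivities below are invented; with these numbers the correlations reduce the propagated variance to below 60% of the naive quadrature sum, qualitatively echoing the ~50% reduction reported in the abstract.

```python
import numpy as np

# Local-model parameter values (invented) fitted element by element;
# rows = elements in the mass region, columns = model parameters.
P = np.array([
    [1.02, 0.95, 0.90],
    [0.98, 1.05, 1.08],
    [1.05, 0.97, 0.96],
    [0.95, 1.03, 1.06],
])

# A posteriori relative covariance matrix, estimated from the scatter
# of the local parameter values about the regional mean.
p_bar = P.mean(axis=0)
C_rel = np.cov((P / p_bar).T)

# Standard error propagation, var = s^T C s, with s the (hypothetical)
# relative sensitivities of the calculated cross section.
s = np.array([0.6, 0.3, 0.5])
var_corr = s @ C_rel @ s                      # full matrix, with correlations
var_uncorr = (s**2 * np.diag(C_rel)).sum()    # naive quadrature sum
```

Whether correlations inflate or shrink the propagated error depends on their signs relative to the sensitivities; in the paper's optical-statistical example they conspire to shrink it.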

  16. ERRORJ. Covariance processing code system for JENDL. Version 2

    International Nuclear Information System (INIS)

    Chiba, Gou

    2003-09-01

ERRORJ is a covariance processing code system for the Japanese Evaluated Nuclear Data Library (JENDL) that can produce group-averaged covariance data for use in uncertainty analyses of nuclear characteristics. ERRORJ can treat covariance data for cross sections, including resonance parameters, as well as for angular and energy distributions of secondary neutrons, which could not be handled by earlier covariance processing codes. In addition, ERRORJ can treat various forms of multi-group cross sections and produce multi-group covariance files in various formats. This document describes an outline of ERRORJ and how to use it. (author)

  17. Critical experiments analyses by using 70 energy group library based on ENDF/B-VI

    Energy Technology Data Exchange (ETDEWEB)

    Tahara, Yoshihisa; Matsumoto, Hideki [Mitsubishi Heavy Industries Ltd., Yokohama (Japan). Nuclear Energy Systems Engineering Center; Huria, H.C.; Ouisloumen, M.

    1998-03-01

The newly developed 70-group library has been validated by comparing kinf values from the continuous-energy Monte Carlo code MCNP and the two-dimensional spectrum calculation code PHOENIX-CP, which employs a Discrete Angular Flux Method based on Collision Probability. The library has also been validated against a large number of critical experiments and numerical benchmarks for assemblies with MOX and Gd fuels. (author)

  18. Generation and validation of the WIMS-D5 library based on JENDL-3.2

    International Nuclear Information System (INIS)

    Gil, Choong-Sup; Kim, Jung-Do

    2002-01-01

A WIMS-D5 library based on JENDL-3.2 has been prepared and is being tested against benchmark problems. Several sensitivity calculations were carried out to confirm the stability of the library, addressing the fission spectrum dependency, the self-shielding effects of the elastic scattering cross sections, and the self-shielding effects of the 240Pu and 242Pu capture cross sections below 4.0 eV. The results of benchmark calculations with the libraries based on JENDL-3.2, ENDF/B-VI.5, JEF-2.2, and the 1986 WIMS-D library were intercompared. The multiplication factors for the thermal lattices are slightly underpredicted by all libraries (by up to 1% with ENDF/B-VI.5). The keff values with the library based on JENDL-3.2 are slightly higher than those with ENDF/B-VI.5 and JEF-2.2. The spectral indices for the lattices with JENDL-3.2 agree with the measured quantities within the uncertainties of the experiments. The calculated amounts of some isotopes, such as 149Sm, 237Np, 238Pu, 242Cm and 243Cm, show large differences from the measured or reference values. (author)

  19. Douglass Rationalization: An Evaluation of a Team Environment and a Computer-Based Task in Academic Libraries

    Science.gov (United States)

    Denda, Kayo; Smulewitz, Gracemary

    2004-01-01

    In the contemporary library environment, the presence of the Internet and the infrastructure of the integrated library system suggest an integrated internal organization. The article describes the example of Douglass Rationalization, a team-based collaborative project to refocus the collection of Rutgers' Douglass Library, taking advantage of the…

  20. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

Based on the evaluated nuclear data library JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross-section library for the MCNP code was compiled, covering 39 nuclides from H-1 to Am-241 that are important for shielding calculations. The compilation was performed with a code system consisting of the nuclear data processing code NJOY-83 and the library compilation code MACROS. The validity of the code system and the reliability of the library were confirmed by analyzing benchmark experiments. (author)

  1. A comparison of methane emission measurements using eddy covariance and manual and automated chamber-based techniques in Tibetan Plateau alpine wetland

    International Nuclear Information System (INIS)

    Yu, Lingfei; Wang, Hao; Wang, Guangshuai; Song, Weimin; Huang, Yao; Li, Sheng-Gong; Liang, Naishen; Tang, Yanhong; He, Jin-Sheng

    2013-01-01

Comparison of different CH4 flux measurement techniques allows for an independent evaluation of the performance and reliability of those techniques. We compared three approaches to measuring CH4 fluxes in an alpine wetland: the traditional discrete Manual Static Chamber (MSC), the Continuous Automated Chamber (CAC), and the Eddy Covariance (EC) method. We found good agreement among the three methods in the seasonal CH4 flux patterns, but the diurnal patterns from the CAC and EC methods differed. While the diurnal CH4 flux variation from the CAC method was positively correlated with the soil temperature, the diurnal variation from the EC method was closely correlated with the solar radiation and net CO2 fluxes during the daytime and with the soil temperature at nighttime. The MSC method showed 25.3% and 7.6% greater CH4 fluxes than the CAC and EC methods, respectively, when measured between 09:00 h and 12:00 h. -- Highlights: •Chamber and eddy covariance methods showed similar seasonal CH4 flux patterns. •Chamber and eddy covariance methods showed different diurnal CH4 flux patterns. •Static chamber methods gave a higher magnitude of CH4 flux. -- The chamber-based methods and the eddy covariance method showed similar seasonal CH4 flux patterns, but the manual static chamber method resulted in a higher CH4 flux measurement

  2. Balancing focused combinatorial libraries based on multiple GPCR ligands

    Science.gov (United States)

    Soltanshahi, Farhad; Mansley, Tamsin E.; Choi, Sun; Clark, Robert D.

    2006-08-01

    G-Protein coupled receptors (GPCRs) are important targets for drug discovery, and combinatorial chemistry is an important tool for pharmaceutical development. The absence of detailed structural information, however, limits the kinds of combinatorial design techniques that can be applied to GPCR targets. This is particularly problematic given the current emphasis on focused combinatorial libraries. By linking an incremental construction method (OptDesign) to the very fast shape-matching capability of ChemSpace, we have created an efficient method for designing targeted sublibraries that are topomerically similar to known actives. Multi-objective scoring allows consideration of multiple queries (actives) simultaneously. This can lead to a distribution of products skewed towards one particular query structure, however, particularly when the ligands of interest are quite dissimilar to one another. A novel pivoting technique is described which makes it possible to generate promising designs even under those circumstances. The approach is illustrated by application to some serotonergic agonists and chemokine antagonists.

  3. Advanced Neutron Source Cross Section Libraries (ANSL-V): ENDF/B-V based multigroup cross-section libraries for advanced neutron source (ANS) reactor studies

    International Nuclear Information System (INIS)

    Ford, W.E. III; Arwood, J.W.; Greene, N.M.; Moses, D.L.; Petrie, L.M.; Primm, R.T. III; Slater, C.O.; Westfall, R.M.; Wright, R.Q.

    1990-09-01

    Pseudo-problem-independent, multigroup cross-section libraries were generated to support Advanced Neutron Source (ANS) Reactor design studies. The ANS is a proposed reactor which would be fueled with highly enriched uranium and cooled with heavy water. The libraries, designated ANSL-V (Advanced Neutron Source Cross Section Libraries based on ENDF/B-V), are data bases in AMPX master format for subsequent generation of problem-dependent cross-sections for use with codes such as KENO, ANISN, XSDRNPM, VENTURE, DOT, DORT, TORT, and MORSE. Included in ANSL-V are 99-group and 39-group neutron, 39-neutron-group 44-gamma-ray-group secondary gamma-ray production (SGRP), 44-group gamma-ray interaction (GRI), and coupled, 39-neutron group 44-gamma-ray group (CNG) cross-section libraries. The neutron and SGRP libraries were generated primarily from ENDF/B-V data; the GRI library was generated from DLC-99/HUGO data, which is recognized as the ENDF/B-V photon interaction data. Modules from the AMPX and NJOY systems were used to process the multigroup data. Validity of selected data from the fine- and broad-group neutron libraries was satisfactorily tested in performance parameter calculations

  4. Calculating the Fee-Based Services of Library Institutions: Theoretical Foundations and Practical Challenges

    Directory of Open Access Journals (Sweden)

    Sysіuk Svitlana V.

    2017-05-01

The article aims to highlight the features of the provision of fee-based services by library institutions, and to identify problems related to the legal and regulatory framework for their calculation and the methods of implementing it. The objective of the study is to develop recommendations for improving the calculation of fee-based library services. The theoretical foundations are systematized, and the need to develop a Provision on the procedure for fee-based services by library institutions is substantiated. Such a Provision would protect a library institution from errors in setting the fee for a paid service and would serve as an information source explaining it. The appropriateness of applying the market law of pricing based on supply and demand is substantiated. The development and improvement of accounting and calculation, taking into consideration both industry-specific and market conditions, would optimize the costs and revenues generated by the provision of fee-based services. In addition, a complex combination of calculation levers with the development of a system of internal accounting, together with the use of its methodology, provides another equally efficient way of improving the efficiency of library institutions' activity.

  5. America's Star Libraries, 2010: Top-Rated Libraries

    Science.gov (United States)

    Lyons, Ray; Lance, Keith Curry

    2010-01-01

    The "LJ" Index of Public Library Service 2010, "Library Journal"'s national rating of public libraries, identifies 258 "star" libraries. Created by Ray Lyons and Keith Curry Lance, and based on 2008 data from the IMLS, it rates 7,407 public libraries. The top libraries in each group get five, four, or three stars. All included libraries, stars or…

  6. State Virtual Libraries

    Science.gov (United States)

    Pappas, Marjorie L.

    2003-01-01

    Virtual library? Electronic library? Digital library? Online information network? These all apply to the growing number of Web-based resource collections managed by consortiums of state library entities. Some, like "INFOhio" and "KYVL" ("Kentucky Virtual Library"), have been available for a few years, but others are just starting. Searching for…

  7. Availability and accessibility of evidence-based information resources provided by medical libraries in Australia.

    Science.gov (United States)

    Ritchie, A; Sowter, B

    2000-01-01

    This article reports on the results of an exploratory survey of the availability and accessibility of evidence-based information resources provided by medical libraries in Australia. Although barriers impede access to evidence-based information for hospital clinicians, the survey revealed that Medline and Cinahl are available in over 90% of facilities. In most cases they are widely accessible via internal networks and the Internet. The Cochrane Library is available in 69% of cases. The Internet is widely accessible and most libraries provide access to some full-text, electronic journals. Strategies for overcoming restrictions and integrating information resources with clinical workflow are being pursued. State, regional and national public and private consortia are developing agreements utilising on-line technology. These could produce cost savings and more equitable access to a greater range of evidence-based resources.

  8. Parametric estimation of covariance function in Gaussian-process based Kriging models. Application to uncertainty quantification for computer experiments

    International Nuclear Information System (INIS)

    Bachoc, F.

    2013-01-01

The parametric estimation of the covariance function of a Gaussian process is studied in the framework of the Kriging model. Maximum Likelihood and Cross Validation estimators are considered. The correctly specified case, in which the covariance function of the Gaussian process does belong to the parametric set used for estimation, is first studied in an increasing-domain asymptotic framework. The sampling considered is a randomly perturbed multidimensional regular grid. Consistency and asymptotic normality are proved for the two estimators. It is then shown that strong perturbations of the regular grid are always beneficial to Maximum Likelihood estimation. The incorrectly specified case, in which the covariance function of the Gaussian process does not belong to the parametric set used for estimation, is then studied. It is shown that Cross Validation is more robust than Maximum Likelihood in this case. Finally, two applications of the Kriging model with Gaussian processes are carried out on industrial data. For a validation problem of the friction model of the thermal-hydraulic code FLICA 4, where experimental results are available, it is shown that Gaussian-process modeling of the FLICA 4 model error considerably improves its predictions. Finally, for a metamodeling problem of the GERMINAL thermal-mechanical code, the interest of the Kriging model with Gaussian processes, compared to neural network methods, is shown. (author)
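A minimal sketch of Maximum Likelihood estimation of a covariance parameter on a randomly perturbed regular grid: a 1-D exponential covariance with a known variance and an unknown range, maximized here by a simple grid search. The grid spacing, true range value, and search bounds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# 1-D randomly perturbed regular grid; exponential covariance
# k(u, v) = exp(-|u - v| / theta) with true range theta = 2.
n = 100
x = 0.5 * np.arange(n) + rng.uniform(-0.1, 0.1, n)

def cov_matrix(theta):
    return np.exp(-np.abs(x[:, None] - x[None, :]) / theta)

y = np.linalg.cholesky(cov_matrix(2.0)) @ rng.normal(size=n)  # one GP sample

def neg_log_lik(theta):
    # Gaussian negative log-likelihood (up to an additive constant)
    K = cov_matrix(theta)
    _, logdet = np.linalg.slogdet(K)
    return 0.5 * (logdet + y @ np.linalg.solve(K, y))

# Maximum Likelihood by grid search over the range parameter
thetas = np.linspace(0.5, 6.0, 56)
theta_hat = float(thetas[int(np.argmin([neg_log_lik(t) for t in thetas]))])
```

In the increasing-domain regime studied in the thesis, this estimator is consistent and asymptotically normal; the random grid perturbations help by providing pairs of closely spaced points.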

  9. An Adaptive Low-Cost INS/GNSS Tightly-Coupled Integration Architecture Based on Redundant Measurement Noise Covariance Estimation.

    Science.gov (United States)

    Li, Zheng; Zhang, Hai; Zhou, Qifan; Che, Huan

    2017-09-05

The main objective of this study is to design an adaptive Inertial Navigation System/Global Navigation Satellite System (INS/GNSS) tightly-coupled integration system that can provide more reliable navigation solutions by making full use of an adaptive Kalman filter (AKF) and a satellite selection algorithm. To achieve this goal, we develop a novel redundant measurement noise covariance estimation (RMNCE) theorem, which adaptively estimates measurement noise properties by analyzing the difference sequences of system measurements. The proposed RMNCE approach is then applied to design both a modified weighted satellite selection algorithm and a type of adaptive unscented Kalman filter (UKF) to improve the performance of the tightly-coupled integration system. In addition, an adaptive measurement noise covariance expanding algorithm is developed to mitigate outliers when facing heavy multipath and other harsh situations. Both semi-physical simulations and field experiments were conducted to evaluate the performance of the proposed architecture, and the results were compared with state-of-the-art algorithms. The results validate that the RMNCE provides a significant improvement in the measurement noise covariance estimation and that the proposed architecture can improve the accuracy and reliability of INS/GNSS tightly-coupled systems. The proposed architecture can effectively limit positioning errors under conditions of poor GNSS measurement quality and outperforms all the compared schemes.
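The core idea behind RMNCE, estimating measurement noise statistics from difference sequences, can be sketched in the scalar case: when the underlying signal varies slowly relative to the sampling rate, first differences of the measurements are noise-dominated, so var(Δz) ≈ 2R. The signal, rate, and noise level below are invented; this illustrates the principle only, not the paper's theorem.

```python
import numpy as np

rng = np.random.default_rng(5)

# Pseudorange-like measurement: a slowly varying truth plus white noise.
t = np.arange(2000) * 0.01
truth = 100.0 + 0.5 * np.sin(0.2 * t)        # smooth, slowly varying signal
sigma_true = 0.8
z = truth + rng.normal(0, sigma_true, t.size)

# First differences cancel the slow signal and leave (mostly) noise:
# z[k+1] - z[k] ~ n[k+1] - n[k], whose variance is 2R.
dz = np.diff(z)
R_hat = 0.5 * np.var(dz)                     # estimated noise variance R
sigma_hat = float(np.sqrt(R_hat))
```

An adaptive filter can then feed such an online R estimate into its measurement update instead of a fixed, possibly mis-specified noise covariance.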

  10. Brownian distance covariance

    OpenAIRE

    Székely, Gábor J.; Rizzo, Maria L.

    2010-01-01

    Distance correlation is a new class of multivariate dependence coefficients applicable to random vectors of arbitrary and not necessarily equal dimension. Distance covariance and distance correlation are analogous to product-moment covariance and correlation, but generalize and extend these classical bivariate measures of dependence. Distance correlation characterizes independence: it is zero if and only if the random vectors are independent. The notion of covariance with...
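A direct implementation of the sample distance covariance (double-centered pairwise distance matrices, V-statistic form) shows how it detects dependence that product-moment correlation misses; the data below are illustrative.

```python
import numpy as np

def dcov(x, y):
    """Sample distance covariance of two 1-D samples (Székely & Rizzo)."""
    a = np.abs(x[:, None] - x[None, :])            # pairwise distances in x
    b = np.abs(y[:, None] - y[None, :])            # pairwise distances in y
    # double-center each distance matrix
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return float(np.sqrt(np.mean(A * B)))          # nonnegative by construction

rng = np.random.default_rng(6)
x = rng.normal(size=500)
y_dep = x**2 + 0.1 * rng.normal(size=500)          # dependent yet uncorrelated
y_ind = rng.normal(size=500)                       # independent of x
```

Here `dcov(x, y_dep)` is clearly larger than `dcov(x, y_ind)`, even though the ordinary correlation of `x` and `y_dep` is near zero: distance covariance is zero (in the population) only under independence.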

  11. Research of the application of multi-group libraries based on ENDF/B-VII library in the reactor design

    International Nuclear Information System (INIS)

    Mi Aijun; Li Junjie

    2010-01-01

In this paper, multi-group libraries were constructed by processing the ENDF/B-VII incident-neutron files into a multi-group structure, and the application of these libraries in pressurized water reactor (PWR) design was studied. The construction of the multi-group library is realized using the NJOY nuclear data processing system, which can process the neutron cross-section files from ENDF format to the MATXS format required by SN codes. The two-dimensional discrete ordinates transport code DORT was used to verify the multi-group libraries and the construction method by comparing calculations for some representative benchmarks. We performed PWR shielding calculations using the multi-group libraries and studied the influence of parameters involved in the construction of the libraries, such as group structure, temperatures and weight functions, on the shielding design of the PWR. This work is preparation for the construction of the multi-group library to be used in PWR engineering shielding design. (authors)

  12. Process Fragment Libraries for Easier and Faster Development of Process-based Applications

    Directory of Open Access Journals (Sweden)

    David Schumm

    2011-01-01

    The term “process fragment” is recently gaining momentum in business process management research. We understand a process fragment as a connected and reusable process structure, which has relaxed completeness and consistency criteria compared to executable processes. We claim that process fragments allow for easier and faster development of process-based applications. As evidence for this claim we present a process fragment concept and show a sample collection of concrete, real-world process fragments. We present advanced application scenarios for using such fragments in the development of process-based applications. Process fragments are typically managed in a repository, forming a process fragment library. On top of a process fragment library from previous work, we discuss the potential impact of using process fragment libraries in cross-enterprise collaboration and application integration.

  13. Impact of the 235U Covariance Data in Benchmark Calculations

    International Nuclear Information System (INIS)

    Leal, Luiz C.; Mueller, D.; Arbanas, G.; Wiarda, D.; Derrien, H.

    2008-01-01

    The error estimation for calculated quantities relies on nuclear data uncertainty information available in the basic nuclear data libraries such as the U.S. Evaluated Nuclear Data File (ENDF/B). The uncertainty files (covariance matrices) in the ENDF/B library are generally obtained from analysis of experimental data. In the resonance region, the computer code SAMMY is used for analyses of experimental data and generation of resonance parameters. In addition to resonance parameters evaluation, SAMMY also generates resonance parameter covariance matrices (RPCM). SAMMY uses the generalized least-squares formalism (Bayes method) together with the resonance formalism (R-matrix theory) for analysis of experimental data. Two approaches are available for creation of resonance-parameter covariance data. (1) During the data-evaluation process, SAMMY generates both a set of resonance parameters that fit the experimental data and the associated resonance-parameter covariance matrix. (2) For existing resonance-parameter evaluations for which no resonance-parameter covariance data are available, SAMMY can retroactively create a resonance-parameter covariance matrix. The retroactive method was used to generate covariance data for 235U. The resulting 235U covariance matrix was then used as input to the PUFF-IV code, which processed the covariance data into multigroup form, and to the TSUNAMI code, which calculated the uncertainty in the multiplication factor due to uncertainty in the experimental cross sections. The objective of this work is to demonstrate the use of the 235U covariance data in calculations of critical benchmark systems
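The final propagation step the abstract describes (turning a multigroup covariance into an uncertainty on the multiplication factor) reduces, to first order, to the "sandwich rule" var(k)/k^2 = S C S^T, where S holds the relative sensitivities and C the relative covariance matrix. A minimal sketch with hypothetical 3-group numbers (the values are illustrative, not from the 235U evaluation):

```python
import numpy as np

# Hypothetical 3-group example: relative sensitivities of k-eff to one
# cross section, and a relative covariance matrix for that cross section.
S = np.array([0.10, 0.25, 0.05])          # (dk/k) per (dsigma/sigma), by group
C = np.array([[0.0004, 0.0002, 0.0000],
              [0.0002, 0.0009, 0.0003],
              [0.0000, 0.0003, 0.0016]])  # relative covariance matrix

var_k = S @ C @ S                          # first-order "sandwich rule"
rel_unc_k = np.sqrt(var_k)                 # relative uncertainty in k-eff
```

With these numbers the relative uncertainty in k-eff comes out just under 1%, dominated by the middle group, whose variance and sensitivity are both large.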

  14. Modeling of frequency agile devices: development of PKI neuromodeling library based on hierarchical network structure

    Science.gov (United States)

    Sanchez, P.; Hinojosa, J.; Ruiz, R.

    2005-06-01

    Recently, neuromodeling methods for microwave devices have been developed. These methods are suitable for generating models of novel devices. They allow fast and accurate simulations and optimizations. However, the development of libraries with these methods is a formidable task, since they require massive input-output data provided by an electromagnetic simulator or measurements, and repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior knowledge input (PKI) models, which take into account the characteristics common to all the models in the library, and high-level ANNs which give the library model outputs from the base PKI models. This technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with the expected ones.

  15. The value of Web-based library services at Cedars-Sinai Health System.

    Science.gov (United States)

    Halub, L P

    1999-07-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services.

  16. On the coupled use of sapflow and eddy covariance measurements: environmental impacts on the evapotranspiration of an heterogeneous - wild olives based - Sardinian ecosystem.

    Science.gov (United States)

    Curreli, Matteo; Corona, Roberto; Montaldo, Nicola; Oren, Ram

    2015-04-01

    Sapflow and eddy covariance techniques are attractive methods for evapotranspiration (ET) estimates. We demonstrated that in Mediterranean ecosystems, characterized by a heterogeneous spatial distribution of different plant functional types (PFT) such as grass and trees, the combined use of these techniques becomes essential for actual ET estimates. Indeed, during the dry summers these water-limited heterogeneous ecosystems are typically reduced to a simple dual-PFT system of drought-resistant woody vegetation and bare soil, since the grass has died. An eddy covariance micrometeorological tower has been installed over a heterogeneous ecosystem at the Orroli site in Sardinia (Italy) since 2003. The site landscape is a mixture of Mediterranean patchy vegetation types: wild olives and different shrub and herbaceous species, which die during the summer. Where the land cover is patchy and the surface fluxes from the different cover types are largely different, ET evaluation may not be robust, and the assumptions of the eddy covariance method are no longer preserved. Under these conditions the sapflow measurements, performed with thermal dissipation probes, provide robust estimates of the transpiration from woody vegetation. Through the coupled use of the sapflow sensor observations, a 2D footprint model of the eddy covariance tower, and high-resolution satellite images for estimating the footprint land cover map, the eddy covariance measurements can be correctly interpreted, and the ET components (bare soil evaporation and woody vegetation transpiration) can be separated. Based on the Granier technique, 33 thermal dissipation probes were built and 6 power regulators were assembled to provide a constant 3 V supply to the sensors. The sensors were installed at the Orroli site in 15 wild olive clumps with different characteristics in terms of tree size, exposure to wind and solar radiation, and soil depth. The sap flow sensor outputs are analyzed to estimate…
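For reference, the Granier thermal-dissipation technique converts the probe temperature difference into sap flux density through an empirical calibration; a minimal sketch (the coefficients are those of Granier's widely used original fit, quoted here as an assumption):

```python
def sap_flux_density(dT, dT_max):
    """Granier thermal-dissipation calibration: the flow index
    K = (dT_max - dT) / dT compares the current probe temperature
    difference dT with its zero-flow maximum dT_max, and
    F_d = 118.99e-6 * K**1.231 gives sap flux density in m^3 m^-2 s^-1
    (empirical coefficients from Granier's original calibration)."""
    K = (dT_max - dT) / dT
    return 118.99e-6 * K ** 1.231

# At zero flow the probe reaches its maximum temperature difference,
# so K = 0 and the computed sap flux density is zero.
f_zero = sap_flux_density(8.0, 8.0)
```

Tree-level sap flow is then obtained by multiplying the flux density by the conducting sapwood area, which is why clump characteristics such as tree size matter for scaling up.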

  17. Multivariate covariance generalized linear models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Jørgensen, Bent

    2016-01-01

    We propose a general framework for non-normal multivariate data analysis called multivariate covariance generalized linear models, designed to handle multivariate response variables, along with a wide range of temporal and spatial correlation structures defined in terms of a covariance link function combined with a matrix linear predictor involving known matrices. The models are fitted by using an efficient Newton scoring algorithm based on quasi-likelihood and Pearson estimating functions, using only second-moment assumptions. This provides a unified approach to a wide variety of types of response variables and covariance structures, including multivariate extensions… The method is motivated by three data examples that are not easily handled by existing methods. The first example concerns multivariate count data, the second involves response variables of mixed types, combined with repeated…

  18. Covariant w∞ gravity

    NARCIS (Netherlands)

    Bergshoeff, E.; Pope, C.N.; Stelle, K.S.

    1990-01-01

    We discuss the notion of higher-spin covariance in w∞ gravity. We show how a recently proposed covariant w∞ gravity action can be obtained from non-chiral w∞ gravity by making field redefinitions that introduce new gauge-field components with corresponding new gauge transformations.

  19. MOSFET-like CNFET based logic gate library for low-power application: a comparative study

    International Nuclear Information System (INIS)

    Gowri Sankar, P. A.; Udhayakumar, K.

    2014-01-01

    The next generation of logic gate devices is expected to depend upon radically new technologies, mainly due to the increasing difficulties and limitations of existing CMOS technology. MOSFET-like CNFETs should ideally be the best devices to work with for high-performance VLSI. This paper presents the results of a comprehensive comparative study of a MOSFET-like carbon nanotube field effect transistor (CNFET) technology based logic gate library for higher-speed, lower-power operation than conventional bulk CMOS libraries. It focuses on comparing four promising logic families: complementary CMOS (C-CMOS), transmission gate (TG), complementary pass logic (CPL), and Domino logic (DL) styles. Based on these logic styles, the proposed library of static and dynamic NAND-NOR logic gates, XOR, multiplexer, and full adder functions is implemented efficiently and carefully analyzed with a test bench to measure propagation delay and power dissipation as a function of supply voltage. This analysis guides the choice of logic style for low-power, high-speed applications. The proposed logic gate libraries are simulated using Synopsys HSPICE based on the standard 32 nm CNFET model. The simulation results demonstrate that it is best to use C-CMOS logic style gates implemented in CNFET technology, which are superior in performance to the other logic styles because of their low average power-delay product (PDP). The analysis also demonstrates how the optimum supply voltage varies with logic style in ultra-low-power systems. The robustness of the proposed logic gate library is also compared with conventional and state-of-the-art CMOS logic gate libraries. (semiconductor integrated circuits)

  20. Networks of myelin covariance.

    Science.gov (United States)

    Melie-Garcia, Lester; Slater, David; Ruef, Anne; Sanabria-Diaz, Gretel; Preisig, Martin; Kherif, Ferath; Draganski, Bogdan; Lutti, Antoine

    2018-04-01

    Networks of anatomical covariance have been widely used to study connectivity patterns in both normal and pathological brains based on the concurrent changes of morphometric measures (i.e., cortical thickness) between brain structures across subjects (Evans, ). However, the existence of networks of microstructural changes within brain tissue has been largely unexplored so far. In this article, we studied in vivo the concurrent myelination processes among brain anatomical structures that gathered together emerge to form nonrandom networks. We name these "networks of myelin covariance" (Myelin-Nets). The Myelin-Nets were built from quantitative Magnetization Transfer data-an in-vivo magnetic resonance imaging (MRI) marker of myelin content. The synchronicity of the variations in myelin content between anatomical regions was measured by computing the Pearson's correlation coefficient. We were especially interested in elucidating the effect of age on the topological organization of the Myelin-Nets. We therefore selected two age groups: Young-Age (20-31 years old) and Old-Age (60-71 years old) and a pool of participants from 48 to 87 years old for a Myelin-Nets aging trajectory study. We found that the topological organization of the Myelin-Nets is strongly shaped by aging processes. The global myelin correlation strength, between homologous regions and locally in different brain lobes, showed a significant dependence on age. Interestingly, we also showed that the aging process modulates the resilience of the Myelin-Nets to damage of principal network structures. In summary, this work sheds light on the organizational principles driving myelination and myelin degeneration in brain gray matter and how such patterns are modulated by aging. © 2017 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
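The construction described (correlating a regional measure across subjects and keeping strongly correlated pairs of regions as network edges) can be sketched in a few lines; a toy illustration of the general covariance-network recipe, not the authors' pipeline:

```python
import numpy as np

def covariance_network(measures, threshold=0.5):
    """Build a covariance network from a (subjects x regions) matrix of
    some regional measure (e.g., myelin content): edges connect pairs of
    regions whose values correlate across subjects above a threshold."""
    R = np.corrcoef(measures, rowvar=False)   # region-by-region Pearson r
    A = np.abs(R) >= threshold                # binary adjacency matrix
    np.fill_diagonal(A, False)                # no self-loops
    return R, A
```

Graph-theoretical quantities such as correlation strength or resilience to node removal are then computed on the adjacency matrix, and comparing them between age groups is what reveals the age modulation reported above.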

  1. Surface Conductance of Five Different Crops Based on 10 Years of Eddy-Covariance Measurements

    Directory of Open Access Journals (Sweden)

    Uwe Spank

    2016-06-01

    The Penman-Monteith (PM) equation is a state-of-the-art modelling approach to simulate evapotranspiration (ET) at site and local scale. However, its practical application is often restricted by the availability and quality of required parameters. One of these parameters is the canopy conductance. Long-term measurements of evapotranspiration by the eddy-covariance method provide an improved data basis to determine this parameter by inverse modelling. Because this approach may also include evaporation from the soil, not only the ‘actual’ canopy conductance but the whole surface conductance (g_c) is addressed. Two full cycles of crop rotation with five different crop types (winter barley, winter rape seed, winter wheat, silage maize, and spring barley) have been continuously monitored for 10 years. These data form the basis for this study. As estimates of g_c are obtained on the basis of measurements, we investigated the impact of measurement uncertainties on the obtained values of g_c. Here, two different foci were inspected in more detail. Firstly, the effect of the energy balance closure gap (EBCG) on obtained values of g_c was analysed. Secondly, the common hydrological practice of using vegetation height (h_c) to determine the period of highest plant activity (i.e., times with maximum g_c with respect to CO2 exchange and transpiration) was critically reviewed. The results showed that h_c and g_c only agree at the beginning of the growing season and increasingly differ during the rest of the growing season. Thus, the utilisation of h_c as a proxy to assess maximum g_c (g_c,max) can lead to inaccurate estimates of g_c,max, which in turn can cause serious shortcomings in simulated ET. The light use efficiency (LUE) is superior to h_c as a proxy to determine periods with maximum g_c. Based on this proxy, crop-specific estimates of g_c…
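The "inverse modelling" step mentioned above solves the Penman-Monteith equation for the surface conductance given the measured latent heat flux. A minimal sketch of that inversion (the symbol names are assumptions; units must be mutually consistent across the inputs):

```python
def surface_conductance(lam_E, delta, A, rho, cp, D, ga, gamma):
    """Invert the Penman-Monteith equation
        lam_E = (delta*A + rho*cp*D*ga) / (delta + gamma*(1 + ga/gc))
    for the surface conductance gc, where lam_E is the measured latent
    heat flux, delta the slope of the saturation vapour pressure curve,
    A the available energy, D the vapour pressure deficit, ga the
    aerodynamic conductance, and gamma the psychrometric constant."""
    return (gamma * ga * lam_E
            / (delta * A + rho * cp * D * ga - lam_E * (delta + gamma)))
```

Because the inversion is exact algebra, running the forward PM equation with a known g_c and inverting the result recovers that g_c, which is a useful round-trip check on any implementation.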

  2. Activity-Based Costing (ABC and Time-Driven Activity-Based Costing (TDABC: Applicable Methods for University Libraries?

    Directory of Open Access Journals (Sweden)

    Kate-Riin Kont

    2011-01-01

    Objective – This article provides an overview of how university libraries research and adapt new cost accounting models, such as “activity-based costing” (ABC) and “time-driven activity-based costing” (TDABC), focusing on the strengths and weaknesses of both methods to determine which of the two is suitable for application in university libraries. Methods – This paper reviews and summarizes the literature on the cost accounting and costing practices of university libraries. A brief overview of the history of cost accounting, costing, and time and motion studies in libraries is also provided. The ABC and TDABC methods, the latter designed as a revised and easier version of ABC by Kaplan and Anderson (Kaplan & Anderson, 2004) at the beginning of the 21st century, as well as the adoption and adaptation of these methods by university libraries, are described, and their strengths and weaknesses, as well as their suitability for university libraries, are analyzed. Results – Cost accounting and costing studies in libraries have a long history, the first of these dating back to 1877. The development of cost accounting and time and motion studies can be seen as a natural evolution of techniques which were created to solve management problems. The ABC method is the best-known management accounting innovation of the last 20 years, and is already widely used in university libraries around the world. However, setting up an ABC system can be very costly, and the system needs to be regularly updated, which further increases its costs. The TDABC system can not only be implemented more quickly (and thus more cheaply), but can also be updated more easily than traditional ABC, which makes TDABC the more suitable method for university libraries. Conclusion – Both methods are suitable for university libraries. However, the ABC method can only be implemented in collaboration with an accounting department. The TDABC method can be tested and implemented by…

  3. Covariant canonical quantization of fields and Bohmian mechanics

    International Nuclear Information System (INIS)

    Nikolic, H.

    2005-01-01

    We propose a manifestly covariant canonical method of field quantization based on the classical De Donder-Weyl covariant canonical formulation of field theory. Owing to covariance, the space and time arguments of fields are treated on an equal footing. To achieve both covariance and consistency with standard non-covariant canonical quantization of fields in Minkowski spacetime, it is necessary to adopt a covariant Bohmian formulation of quantum field theory. A preferred foliation of spacetime emerges dynamically owing to a purely quantum effect. The application to a simple time-reparametrization invariant system and quantum gravity is discussed and compared with the conventional non-covariant Wheeler-DeWitt approach. (orig.)

  4. ISSUES IN NEUTRON CROSS SECTION COVARIANCES

    Energy Technology Data Exchange (ETDEWEB)

    Mattoon, C.M.; Oblozinsky,P.

    2010-04-30

    We review neutron cross section covariances in both the resonance and fast neutron regions with the goal of identifying existing issues in evaluation methods and their impact on covariances. We also outline ideas for suitable covariance quality assurance procedures. We show that the topic of covariance data remains controversial, the evaluation methodologies are not fully established, and covariances produced by different approaches have an unacceptably large spread. The main controversy lies between the very low uncertainties generated by rigorous evaluation methods and the much larger uncertainties based on simple estimates from experimental data. Since the evaluators tend to trust the former, while the users tend to trust the latter, this controversy has considerable practical implications. Dedicated effort is needed to arrive at covariance evaluation methods that would resolve this issue and produce results accepted internationally both by evaluators and users.

  5. Time-Based Way Finding at the Library of Agriculture Information and Scientific Documents Center (ASIDC

    Directory of Open Access Journals (Sweden)

    Roya Pournaghi

    2017-09-01

    The illustrated maps showed that they can help users gain a better understanding of access to the library entrance and facilities, improving the library's utility and efficiency. This idea has only recently begun to be used in libraries around the world. Since the study deals with network traffic and travel times along paths with non-negative weights, Dijkstra's algorithm was used for time-based wayfinding. After creating the database, the shortest path requiring the least time could be determined.
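For reference, Dijkstra's algorithm over a graph with non-negative travel times can be sketched as follows (a generic illustration with a made-up floor plan, not the study's implementation):

```python
import heapq

def dijkstra(graph, start):
    """Shortest travel times from `start` over a graph with non-negative
    edge weights, given as {node: [(neighbor, weight), ...]}."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical floor plan: the reference desk is reached faster
# through the hall (1 + 2 min) than via the stairs (4 + 1 min).
floor = {"entrance": [("hall", 1.0), ("stairs", 4.0)],
         "hall": [("desk", 2.0)],
         "stairs": [("desk", 1.0)]}
times = dijkstra(floor, "entrance")
```

Non-negative weights are exactly the precondition the abstract alludes to: Dijkstra's greedy expansion is only correct when no edge can reduce an already-settled distance.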

  6. Expanding the range of 'druggable' targets with natural product-based libraries: an academic perspective.

    Science.gov (United States)

    Bauer, Renato A; Wurst, Jacqueline M; Tan, Derek S

    2010-06-01

    Existing drugs address a relatively narrow range of biological targets. As a result, libraries of drug-like molecules have proven ineffective against a variety of challenging targets, such as protein-protein interactions, nucleic acid complexes, and antibacterial modalities. In contrast, natural products are known to be effective at modulating such targets, and new libraries are being developed based on underrepresented scaffolds and regions of chemical space associated with natural products. This has led to several recent successes in identifying new chemical probes that address these challenging targets. Copyright 2010 Elsevier Ltd. All rights reserved.

  7. A virtualized software based on the NVIDIA cuFFT library for image denoising: performance analysis

    DEFF Research Database (Denmark)

    Galletti, Ardelio; Marcellino, Livia; Montella, Raffaele

    2017-01-01

    Generic Virtualization Service (GVirtuS) is a new solution for enabling GPGPU on virtual machines or low-powered devices. This paper focuses on the performance analysis that can be obtained using GPGPU virtualized software. Recently, GVirtuS has been extended in order to support CUDA… ancillary libraries with good results. Here, our aim is to analyze the applicability of this powerful tool to a real problem, which uses the NVIDIA cuFFT library. As a case study we consider a simple denoising algorithm, implementing virtualized GPU-parallel software based on the convolution theorem…
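The denoising case study rests on the convolution theorem: a spatial convolution becomes a pointwise product of spectra, so the filter costs two FFTs and one multiply. A CPU-side NumPy sketch of the same idea (cuFFT would supply the transforms on the GPU; the Gaussian kernel is an assumption, not the paper's filter):

```python
import numpy as np

def fft_gaussian_denoise(img, sigma=2.0):
    """Denoise via the convolution theorem: multiply the image spectrum
    by the (analytically known) spectrum of a Gaussian kernel instead of
    convolving in the spatial domain."""
    h, w = img.shape
    fy = np.fft.fftfreq(h)[:, None]      # frequencies in cycles/sample
    fx = np.fft.fftfreq(w)[None, :]
    # Fourier transform of a Gaussian of std `sigma` (in pixels):
    H = np.exp(-2.0 * np.pi ** 2 * sigma ** 2 * (fy ** 2 + fx ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))
```

Since H equals 1 at zero frequency, the filter preserves the image mean while attenuating high-frequency noise, which is easy to verify on a flat image with added noise.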

  8. Teaching Electronic Literacy A Concepts-Based Approach for School Library Media Specialists

    CERN Document Server

    Craver, Kathleen W

    1997-01-01

    School library media specialists will find this concepts-based approach to teaching electronic literacy an indispensable basic tool for instructing students and teachers. It provides step-by-step instruction on how to find and evaluate needed information from electronic databases and the Internet, how to formulate successful electronic search strategies and retrieve relevant results, and how to interpret and critically analyze search results. The chapters contain a suggested lesson plan and sample assignments for the school library media specialist to use in teaching electronic literacy skills

  9. Developing Applications in the Era of Cloud-based SaaS Library Systems

    Directory of Open Access Journals (Sweden)

    Josh Weisman

    2014-10-01

    As the move to cloud-based SaaS library systems accelerates, we must consider what it means to develop applications when the core of the system isn't under the library's control. The entire application lifecycle is changing, from development to testing to production. Developing applications for cloud solutions raises new concerns, such as security, multi-tenancy, latency, and analytics. In this article, we review the landscape and suggest a view of how to be successful for the benefit of library staff and end-users in this new reality. We discuss what kinds of APIs and protocols vendors should be supporting, and suggest how best to take advantage of the innovations being introduced.

  10. Investigating the Use of a Digital Library in an Inquiry-Based Undergraduate Geology Course

    Science.gov (United States)

    Apedoe, Xornam S.

    2007-01-01

    This paper reports the findings of a qualitative research study designed to investigate the opportunities and obstacles presented by a digital library for supporting teaching and learning in an inquiry-based undergraduate geology course. Data for this study included classroom observations and field-notes of classroom practices, questionnaires, and…

  11. Improving conditions for reuse of design solutions - by means of a context based solution library

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Grothe-Møller, Thorkild; Andreasen, Mogens Myrup

    1997-01-01

    Among the most important reasoning mechanisms in design is reasoning by analogy. One precondition for being able to reason about the properties and functionalities of a product or subsystem is that the context of the solution is known. This paper presents a computer-based solution library where…

  12. Speakeasy Studio and Cafe: Information Literacy, Web-based Library Instruction, and Technology.

    Science.gov (United States)

    Jacobs, Mark

    2001-01-01

    Discussion of academic library instruction and information literacy focuses on a Web-based program developed at Washington State University called Speakeasy Studio and Cafe that is used for bibliographic instruction. Highlights include the research process; asking the right question; and adapting to students' differing learning styles. (LRW)

  13. Comparing laser-based open- and closed-path gas analyzers to measure methane fluxes using the eddy covariance method

    Science.gov (United States)

    Detto, Matteo; Verfaillie, Joseph; Anderson, Frank; Xu, Liukang; Baldocchi, Dennis

    2011-01-01

    Closed- and open-path methane gas analyzers are used in eddy covariance systems to compare three potential methane-emitting ecosystems in the Sacramento-San Joaquin Delta (CA, USA): a rice field, a peatland pasture and a restored wetland. The study points out similarities and differences of the systems in field experiments and data processing. The closed-path system, despite a less intrusive placement with the sonic anemometer, required more care and power. In contrast, the open-path system appears more versatile for a remote and unattended experimental site. Overall, the two systems have comparable minimum detectable limits, but synchronization between wind speed and methane data, air density corrections and spectral losses have different impacts on the computed flux covariances. For the closed-path analyzer, air density effects are less important, but synchronization and spectral losses may represent a problem when fluxes are small or when an undersized pump is used. For the open-path analyzer air density corrections are greater, due to spectroscopy effects and the classic Webb–Pearman–Leuning correction. Comparison between the 30-min fluxes reveals good agreement in terms of magnitude between the open-path and closed-path flux systems. However, the scatter is large, as a consequence of the intensive data processing which both systems require.

  14. Construction of naïve camelids VHH repertoire in phage display-based library.

    Science.gov (United States)

    Sabir, Jamal S M; Atef, Ahmed; El-Domyati, Fotouh M; Edris, Sherif; Hajrah, Nahid; Alzohairy, Ahmed M; Bahieldin, Ahmed

    2014-04-01

    Camelids have unique antibodies, namely HCAbs (VHH), commercially named Nanobodies® (Nb), that are composed only of a heavy-chain homodimer. As libraries based on immunized camelids are time-consuming, costly and likely redundant for certain antigens, we describe the construction of a naïve camelid VHH library from the blood serum of non-immunized camelids with affinity in the subnanomolar range and suitable for standard immune applications. This approach is rapid and recovers a VHH repertoire with the advantages of being more diverse, non-specific and devoid of subpopulations of specific antibodies, which allows the identification of binders for any potential antigen (or pathogen). RNAs from a number of camelids from Saudi Arabia were isolated and cDNAs of the diverse vhh gene were amplified; the resulting amplicons were cloned into the phage display pSEX81 vector. The size of the library was found to be within the required range (10^7) suitable for subsequent applications in disease diagnosis and treatment. Two hundred clones were randomly selected and the inserted gene library was either estimated for redundancy or sequenced and aligned to the reference camelid vhh gene (acc. No. ADE99145). Results indicated complete non-specificity of this small library, in which no single event of redundancy was detected. These results indicate the efficacy of following this approach in order to yield a gene library large and diverse enough to secure the presence of the version encoding the required antibodies for any target antigen. This work is a first step towards the construction of phage display-based biosensors useful in the diagnosis and treatment of diseases such as tuberculosis (TB). Copyright © 2014 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  15. Assessing high affinity binding to HLA-DQ2.5 by a novel peptide library based approach

    DEFF Research Database (Denmark)

    Jüse, Ulrike; Arntzen, Magnus; Højrup, Peter

    2011-01-01

    Here we report on a novel peptide library based method for HLA class II binding motif identification. The approach is based on water soluble HLA class II molecules and soluble dedicated peptide libraries. A high number of different synthetic peptides are competing to interact with a limited amount...... library. The eluted sequences fit very well with the previously described HLA-DQ2.5 peptide binding motif. This novel method, limited by library complexity and sensitivity of mass spectrometry, allows the analysis of several thousand synthetic sequences concomitantly in a simple water soluble format....

  16. An Academic Library's Experience with Fee-Based Services.

    Science.gov (United States)

    Hornbeck, Julia W.

    1983-01-01

    Profile of fee-based information services offered by the Information Exchange Center of Georgia Institute of Technology notes history and background, document delivery to commercial clients and on-campus faculty, online and manual literature searching, staff, cost analysis, fee schedule, operating methods, client relations, marketing, and current…

  17. Academic Job Placements in Library and Information Science Field: A Case Study Performed on ALISE Web-Based Postings

    Science.gov (United States)

    Abouserie, Hossam Eldin Mohamed Refaat

    2010-01-01

    The study investigated and analyzed the state of academic web-based job announcements in the Library and Information Science field. The purpose of the study was to gain an in-depth understanding of the main characteristics and trends of the academic job market in the field. The study focused on web-based announcements as it was…

  18. Impacts of data covariances on the calculated breeding ratio for CRBRP

    International Nuclear Information System (INIS)

    Liaw, J.R.; Collins, P.J.; Henryson, H. II; Shenter, R.E.

    1983-01-01

    In order to establish confidence in the data adjustment methodology as applied to LMFBR design, and to estimate the importance of data correlations in that respect, an investigation was initiated into the impacts of data covariances on the calculated reactor performance parameters. This paper summarizes the results and findings of such an effort, specifically related to the calculation of the breeding ratio for CRBRP as an illustration. Thirty-nine integral parameters and their covariances, including k_eff and various capture and fission reaction rate ratios, from the ZEBRA-8 series and four ZPR physics benchmark assemblies were used in the least-squares fitting processes. Multigroup differential data and the sensitivity coefficients of those 39 integral parameters were generated by standard 2-D diffusion theory neutronic calculational modules at ANL. Three differential data covariance libraries, all based on ENDF/B-V evaluations, were tested in this study.
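The least-squares fitting process referred to here follows the standard generalized least-squares adjustment: the prior cross sections and their covariance are updated with integral measurements through a sensitivity matrix. A minimal sketch of the generic GLS update (not the ANL codes):

```python
import numpy as np

def gls_adjust(x, Cx, y, Cy, S):
    """Generalized least-squares data adjustment: update the cross
    section vector x (prior covariance Cx) with integral measurements y
    (covariance Cy) linked to x by the sensitivity matrix S (y ~ S @ x).
    Returns the adjusted x and its reduced covariance."""
    G = Cx @ S.T @ np.linalg.inv(S @ Cx @ S.T + Cy)   # gain matrix
    x_adj = x + G @ (y - S @ x)                        # pull toward data
    Cx_adj = Cx - G @ S @ Cx                           # uncertainty shrinks
    return x_adj, Cx_adj
```

In the scalar case with equal prior and measurement variances, the adjusted value lands halfway between prior and measurement and the variance is halved, which is a convenient sanity check.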

  19. MpTheory Java library: a multi-platform Java library for systems biology based on the Metabolic P theory.

    Science.gov (United States)

    Marchetti, Luca; Manca, Vincenzo

    2015-04-15

MpTheory Java library is an open-source project collecting a set of objects and algorithms for modeling observed dynamics by means of the Metabolic P (MP) theory, a mathematical theory introduced in 2004 for modeling biological dynamics. By means of the library, it is possible to model biological systems both at continuous and at discrete time. Moreover, the library comprises a set of regression algorithms for inferring MP models starting from time series of observations. To enhance the modeling experience, besides pure Java usage, the library can be used directly within the most popular computing environments, such as MATLAB, GNU Octave, Mathematica and R. The library is open-source and licensed under the GNU Lesser General Public License (LGPL) Version 3.0. Source code, binaries and complete documentation are available at http://mptheory.scienze.univr.it. luca.marchetti@univr.it, marchetti@cosbi.eu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Collection-based analysis of selected medical libraries in the Philippines using Doody’s Core Titles

    Directory of Open Access Journals (Sweden)

    Efren Torres Jr., MLIS

    2017-01-01

Conclusion: De La Salle Health Sciences Institute and University of Santo Tomas had sound medical collections based on Doody’s Core Titles. Collectively, the medical libraries shared common collection development priorities, as evidenced by similarities in strong areas. Library budget and the role of the library director in book selection were among the factors that could have contributed to a high percentage of matched titles.

  1. Covariant representations of nuclear *-algebras

    International Nuclear Information System (INIS)

    Moore, S.M.

    1978-01-01

Extensions of the C*-algebra theory for covariant representations to nuclear *-algebras are considered. Irreducible covariant representations are essentially unique, an invariant state produces a covariant representation with stable vacuum, and the usual relation between ergodic states and covariant representations holds. There exist construction and decomposition theorems and a possible relation between derivations and covariant representations

  2. Monitoring gas and heat emissions at Norris Geyser Basin, Yellowstone National Park, USA based on a combined eddy covariance and Multi-GAS approach

    Science.gov (United States)

    Lewicki, J. L.; Kelly, P. J.; Bergfeld, D.; Vaughan, R. G.; Lowenstern, J. B.

    2017-11-01

We quantified gas and heat emissions in an acid-sulfate, vapor-dominated area (0.04 km^2) of Norris Geyser Basin, located just north of the 0.63 Ma Yellowstone Caldera and near an area of anomalous uplift. From 14 May to 3 October 2016, an eddy covariance system measured half-hourly CO2, H2O and sensible (H) and latent (LE) heat fluxes and a Multi-GAS instrument measured (1 Hz frequency) atmospheric H2O, CO2 and H2S volumetric mixing ratios. We also measured soil CO2 fluxes using the accumulation chamber method and temperature profiles on a grid and collected fumarole gas samples for geochemical analysis. Eddy covariance CO2 fluxes ranged from -56 to 885 g m^-2 d^-1. Using wavelet analysis, average daily eddy covariance CO2 fluxes were locally correlated with average daily environmental parameters on several-day to monthly time scales. Estimates of CO2 emission rate from the study area ranged from 8.6 t d^-1 based on eddy covariance measurements to 9.8 t d^-1 based on accumulation chamber measurements. Eddy covariance water vapor fluxes ranged from 1178 to 24,600 g m^-2 d^-1. Nighttime H and LE were considered representative of hydrothermal heat fluxes and ranged from 4 to 183 and 38 to 504 W m^-2, respectively. The total hydrothermal heat emission rate (H + LE + radiant) estimated for the study area was 11.6 MW and LE contributed 69% of the output. The mean ± standard deviation of H2O, CO2 and H2S mixing ratios measured by the Multi-GAS system were 9.3 ± 3.1 parts per thousand, 467 ± 61 ppmv, and 0.5 ± 0.6 ppmv, respectively, and variations in the gas compositions were strongly correlated with diurnal variations in environmental parameters (wind speed and direction, atmospheric temperature). After removing ambient H2O and CO2, the observed variations in the Multi-GAS data could be explained by the mixing of relatively H2O-CO2-H2S-rich fumarole gases with CO2-rich and H2O-H2S-poor soil gases. The fumarole H2O/CO2 and CO2/H2S end member ratios (101.7 and 27

  3. Monitoring gas and heat emissions at Norris Geyser Basin, Yellowstone National Park, USA based on a combined eddy covariance and Multi-GAS approach

    Science.gov (United States)

    Lewicki, Jennifer L.; Kelly, Peter; Bergfeld, Deborah; Vaughan, R. Greg; Lowenstern, Jacob B.

    2017-01-01

We quantified gas and heat emissions in an acid-sulfate, vapor-dominated area (0.04 km^2) of Norris Geyser Basin, located just north of the 0.63 Ma Yellowstone Caldera and near an area of anomalous uplift. From 14 May to 3 October 2016, an eddy covariance system measured half-hourly CO2, H2O and sensible (H) and latent (LE) heat fluxes and a Multi-GAS instrument measured (1 Hz frequency) atmospheric H2O, CO2 and H2S volumetric mixing ratios. We also measured soil CO2 fluxes using the accumulation chamber method and temperature profiles on a grid and collected fumarole gas samples for geochemical analysis. Eddy covariance CO2 fluxes ranged from -56 to 885 g m^-2 d^-1. Using wavelet analysis, average daily eddy covariance CO2 fluxes were locally correlated with average daily environmental parameters on several-day to monthly time scales. Estimates of CO2 emission rate from the study area ranged from 8.6 t d^-1 based on eddy covariance measurements to 9.8 t d^-1 based on accumulation chamber measurements. Eddy covariance water vapor fluxes ranged from 1178 to 24,600 g m^-2 d^-1. Nighttime H and LE were considered representative of hydrothermal heat fluxes and ranged from 4 to 183 and 38 to 504 W m^-2, respectively. The total hydrothermal heat emission rate (H + LE + radiant) estimated for the study area was 11.6 MW and LE contributed 69% of the output. The mean ± standard deviation of H2O, CO2 and H2S mixing ratios measured by the Multi-GAS system were 9.3 ± 3.1 parts per thousand, 467 ± 61 ppmv, and 0.5 ± 0.6 ppmv, respectively, and variations in the gas compositions were strongly correlated with diurnal variations in environmental parameters (wind speed and direction, atmospheric temperature). After removing ambient H2O and CO2, the observed variations in the Multi-GAS data could be explained by the mixing of relatively H2O-CO2-H2S-rich fumarole gases with CO2-rich and H2O-H2S-poor soil gases. The
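The eddy covariance fluxes reported in this record are, at heart, covariances between fluctuations of vertical wind speed and gas density over an averaging period (Reynolds decomposition). A minimal illustrative sketch with synthetic data; the variable names and numbers are invented, not taken from the study:

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Flux as covariance of vertical wind speed w (m/s) and gas density c (g/m^3).

    Fluctuations are deviations from the averaging-period mean; the flux is the
    mean product of the fluctuations, in g m^-2 s^-1.
    """
    w_prime = w - np.mean(w)
    c_prime = c - np.mean(c)
    return np.mean(w_prime * c_prime)

# Synthetic half-hour record at 10 Hz: upward-moving air slightly enriched in gas.
rng = np.random.default_rng(0)
n = 10 * 60 * 30
w = rng.normal(0.0, 0.3, n)                  # vertical wind fluctuations, m/s
c = 800.0 + 50.0 * w + rng.normal(0, 5, n)   # gas density correlated with w, g/m^3
flux = eddy_covariance_flux(w, c)            # positive => upward (emission) flux
```

A positive value indicates a net upward (emission) flux; the field systems described above additionally apply coordinate rotation, density, and spectral corrections that this sketch omits.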

  4. Covariance Function for Nearshore Wave Assimilation Systems

    Science.gov (United States)

    2018-01-30

which is applicable for any spectral wave model. The four dimensional variational (4DVar) assimilation methods are based on the mathematical … covariance can be modeled by a parameterized Gaussian function; for nearshore wave assimilation applications, the covariance function depends primarily on…

  5. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    Science.gov (United States)

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on their structure-derived structural and physicochemical properties. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.
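One concrete ingredient of such ligand-based screens is ranking a library by fingerprint similarity to a known active. A minimal sketch using Tanimoto similarity on synthetic binary fingerprints; the library size, fingerprint length, and query are invented for illustration:

```python
import numpy as np

def tanimoto(query, library):
    """Tanimoto similarity between a binary fingerprint and each library row."""
    inter = (library & query).sum(axis=1)   # bits set in both
    union = (library | query).sum(axis=1)   # bits set in either
    return inter / union

rng = np.random.default_rng(2)
library = rng.integers(0, 2, size=(5000, 256))   # toy fingerprint library
query = library[42].copy()                       # pretend this is a known active
scores = tanimoto(query, library)
top = np.argsort(scores)[::-1][:10]              # top-10 ranked compounds
```

In a real screen the fingerprints would come from a cheminformatics toolkit and the ranking would feed a trained classifier, as the review describes; the scoring step itself is just this vectorized similarity.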

  6. The Research and Education of Evidence Based Library and Information Practice; A Narrative Review

    Directory of Open Access Journals (Sweden)

    Vahideh Zareh Gavgani

    2018-01-01

Background and Objectives: Evidence based librarianship (EBL) was defined as "use of best available evidence from qualitative and quantitative research results and rational experience and decisions acquired from the daily practice of library". However, there are controversies about whether the nature of EBL deals with library services or professional practice, and whether it needs formal education or informal continuing education is enough. To shed light on this ambiguity, the aim of this study was to find out the state of the art of EBL education in the world. Material and Methods: The study utilized library and documentation methods to investigate the academic education of EBL through a review of the available literature and websites. Results: The findings of the study revealed that evidence based librarianship does have a formal curriculum for academic education at postgraduate levels (post-master and master). It also revealed that "Evidence Based Approach" (EBA) and "Evidence Based Medicine" (EBM) were similar courses offered at Master and PhD levels. Conclusion: Based on the history and evolution of EBA, it is time to develop a formal curriculum and field of study for Evidence Based Information Practice. This study suggests establishment of the academic field of Evidence Based and Information Science to overcome the problems and limitations that library science faces in practice.

  7. Covariant Noncommutative Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Estrada-Jimenez, S [Licenciaturas en Fisica y en Matematicas, Facultad de Ingenieria, Universidad Autonoma de Chiapas Calle 4a Ote. Nte. 1428, Tuxtla Gutierrez, Chiapas (Mexico); Garcia-Compean, H [Departamento de Fisica, Centro de Investigacion y de Estudios Avanzados del IPN P.O. Box 14-740, 07000 Mexico D.F., Mexico and Centro de Investigacion y de Estudios Avanzados del IPN, Unidad Monterrey Via del Conocimiento 201, Parque de Investigacion e Innovacion Tecnologica (PIIT) Autopista nueva al Aeropuerto km 9.5, Lote 1, Manzana 29, cp. 66600 Apodaca Nuevo Leon (Mexico); Obregon, O [Instituto de Fisica de la Universidad de Guanajuato P.O. Box E-143, 37150 Leon Gto. (Mexico); Ramirez, C [Facultad de Ciencias Fisico Matematicas, Universidad Autonoma de Puebla, P.O. Box 1364, 72000 Puebla (Mexico)

    2008-07-02

    The covariant approach to noncommutative field and gauge theories is revisited. In the process the formalism is applied to field theories invariant under diffeomorphisms. Local differentiable forms are defined in this context. The lagrangian and hamiltonian formalism is consistently introduced.

  8. Covariant Noncommutative Field Theory

    International Nuclear Information System (INIS)

    Estrada-Jimenez, S.; Garcia-Compean, H.; Obregon, O.; Ramirez, C.

    2008-01-01

    The covariant approach to noncommutative field and gauge theories is revisited. In the process the formalism is applied to field theories invariant under diffeomorphisms. Local differentiable forms are defined in this context. The lagrangian and hamiltonian formalism is consistently introduced

  9. ANSL-V: ENDF/B-V based multigroup cross-section libraries for Advanced Neutron Source (ANS) reactor studies

    International Nuclear Information System (INIS)

    Ford, W.E. III; Arwood, J.W.; Greene, N.M.; Petrie, L.M.; Primm, R.T. III; Waddell, M.W.; Webster, C.C.; Westfall, R.M.; Wright, R.Q.

    1987-01-01

    Multigroup P3 neutron, P0-P3 secondary gamma ray production (SGRP), and P6 gamma ray interaction (GRI) cross section libraries have been generated to support design work on the Advanced Neutron Source (ANS) reactor. The libraries, designated ANSL-V (Advanced Neutron Source Cross-Section Libraries), are data bases in a format suitable for subsequent generation of problem dependent cross sections. The ANSL-V libraries are available on magnetic tape from the Radiation Shielding Information Center at Oak Ridge National Laboratory

  10. Covariance data processing code. ERRORJ

    International Nuclear Information System (INIS)

    Kosako, Kazuaki

    2001-01-01

    The covariance data processing code, ERRORJ, was developed to process the covariance data of JENDL-3.2. ERRORJ has the processing functions of covariance data for cross sections including resonance parameters, angular distribution and energy distribution. (author)

  11. Are your covariates under control? How normalization can re-introduce covariate effects.

    Science.gov (United States)

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
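The recommended order of operations can be sketched as follows, assuming SciPy is available; the Blom offset c = 3/8 and the toy skewed outcome are illustrative choices, not the study's data:

```python
import numpy as np
from scipy.stats import norm, rankdata

def rank_inverse_normal(y, c=3.0 / 8):
    """Rank-based inverse normal transform (Blom offset c = 3/8)."""
    r = rankdata(y)
    return norm.ppf((r - c) / (len(y) + 1 - 2 * c))

rng = np.random.default_rng(3)
covariate = rng.normal(size=500)
y = np.exp(covariate + rng.normal(size=500))   # skewed dependent variable

# Transform FIRST, then regress out the covariate from the transformed values.
y_int = rank_inverse_normal(y)
beta = np.polyfit(covariate, y_int, 1)         # OLS with intercept
residuals = y_int - np.polyval(beta, covariate)
corr = np.corrcoef(residuals, covariate)[0, 1]  # ~0: residuals uncorrelated
```

Transforming first and regressing second leaves residuals that are both normally distributed and linearly uncorrelated with the covariate, matching the study's recommendation; the reverse order re-introduces the correlation.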

  12. Template-based combinatorial enumeration of virtual compound libraries for lipids.

    Science.gov (United States)

    Sud, Manish; Fahy, Eoin; Subramaniam, Shankar

    2012-09-25

A variety of software packages are available for the combinatorial enumeration of virtual libraries for small molecules, starting from specifications of core scaffolds with attachment points and lists of R-groups as SMILES or SD files. Although SD files include atomic coordinates for core scaffolds and R-groups, it is not possible to control the 2-dimensional (2D) layout of the enumerated structures generated for virtual compound libraries, because different packages generate different 2D representations for the same structure. We have developed a software package called LipidMapsTools for the template-based combinatorial enumeration of virtual compound libraries for lipids. Virtual libraries are enumerated for the specified lipid abbreviations using matching lists of pre-defined templates and chain abbreviations, instead of core scaffolds and lists of R-groups provided by the user. 2D structures of the enumerated lipids are drawn in a specific and consistent fashion adhering to the framework for representing lipid structures proposed by the LIPID MAPS consortium. LipidMapsTools is lightweight, relatively fast and contains no external dependencies. It is an open-source package and freely available under the terms of the modified BSD license.
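The general idea of template-based enumeration, stripped of LipidMapsTools' actual template and abbreviation formats, can be sketched as a cross product of chain abbreviations over template attachment points. The toy template and chain strings below are invented and are not LIPID MAPS notation:

```python
from itertools import product

# Toy SMILES-like template with two attachment points; chain abbreviations
# map to substituent strings. Purely illustrative.
template = "C(COC(=O){sn1})OC(=O){sn2}"
chains = {"16:0": "CCCCCCCCCCCCCCCC", "18:1": "CCCCCCCC/C=C\\CCCCCCCC"}

# Enumerate every combination of chains at the two positions.
library = [
    template.format(sn1=chains[a], sn2=chains[b])
    for a, b in product(chains, repeat=2)
]
# 2 chains at 2 positions -> 4 enumerated structures
```

Because the template fixes the layout and the chains are substituted textually, every enumerated structure is drawn consistently, which is the point the abstract makes about 2D depiction.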

  13. FilTer BaSe: A web accessible chemical database for small compound libraries.

    Science.gov (United States)

    Kolte, Baban S; Londhe, Sanjay R; Solanki, Bhushan R; Gacche, Rajesh N; Meshram, Rohan J

    2018-03-01

Finding novel chemical agents for targeting disease-associated drug targets often requires screening of large numbers of new chemical libraries. In silico methods are generally implemented at the initial stages for virtual screening. Filtering of such compound libraries on physicochemical and substructure grounds is done to ensure elimination of compounds with undesired chemical properties. The filtering procedure is redundant, time-consuming and requires efficient bioinformatics/computer manpower along with high-end software involving huge capital investment, which forms a major obstacle in drug discovery projects in an academic setup. We present an open source resource, FilTer BaSe, a chemoinformatics platform (http://bioinfo.net.in/filterbase/) that hosts fully filtered, ready to use compound libraries with workable size. The resource also hosts a database that enables efficient searching of the chemical space of around 348,000 compounds on the basis of physicochemical and substructure properties. The ready to use compound libraries and database presented here are expected to lend a helping hand to new drug developers and medicinal chemists. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. The use of mobile technology in health libraries: a summary of a UK-based survey.

    Science.gov (United States)

    Chamberlain, David; Elcock, Martin; Puligari, Preeti

    2015-12-01

Health libraries have changed over the past fifteen years in the format of the information they provide, driven by developments in technology. The aim was to conduct a survey of NHS health libraries in the United Kingdom in order to summarise how mobile technologies are being used, how they are promoted and how they are delivered, highlighting good practice and solutions to issues. An online survey was carried out in 2013 and sent to academic and NHS web-based discussion lists. There were 199 responses. Most replies were from large Acute Hospital Trusts. Only 18% of respondents had conducted research into the use of mobile technology (MT) within their Trust. Forty per cent of Trusts offered clinical point-of-care tools, 29% mobile catalogues, and 30% had mobile-enabled web sites. Libraries utilised third-party partnerships rather than developing their own applications or tools. Seventy per cent of Trusts promoted new MT services via e-mail. Network restrictions were the main barrier to development, along with finance and expertise. Uptake and development of MT are sporadic and driven by individuals. There is an opportunity for collaboration and sharing of resources and expertise. There are benefits to adopting user-friendly resources. © 2015 Health Libraries Group.

  15. Affinity-based screening of combinatorial libraries using automated, serial-column chromatography

    Energy Technology Data Exchange (ETDEWEB)

Evans, D.M.; Williams, K.P.; McGuinness, B. [PerSeptive Biosystems, Framingham, MA (United States)] [and others]

    1996-04-01

The authors have developed an automated serial chromatographic technique for screening a library of compounds based upon their relative affinity for a target molecule. A "target" column containing the immobilized target molecule is set in tandem with a reversed-phase column. A combinatorial peptide library is injected onto the target column. The target-bound peptides are eluted from the first column and transferred automatically to the reversed-phase column. The target-specific peptide peaks from the reversed-phase column are identified and sequenced. Using a monoclonal antibody (3E-7) against β-endorphin as a target, we selected a single peptide with sequence YGGFL from approximately 5800 peptides present in a combinatorial library. We demonstrated the applicability of the technology towards selection of peptides with predetermined affinity for bacterial lipopolysaccharide (LPS, endotoxin). We expect that this technology will have broad applications for high throughput screening of chemical libraries or natural product extracts. 21 refs., 4 figs.

  16. ROSFOND based heating-damage cross sections sub-library: Preliminary uncertainty assessment

    International Nuclear Information System (INIS)

    Sinitsa, V.V.

    2016-01-01

The accuracy of radiation damage calculations for the most important LWR component, the reactor pressure vessel (RPV), is directly linked with the RPV End-of-Life (EoL) prediction, which is in turn connected with fundamental nuclear safety aspects and relevant economic impacts. In this connection, for nearly ten years the ENEA-Bologna Nuclear Data Group has conducted nuclear data processing and validation activities addressed to updating the specialized broad-group coupled neutron/photon working cross section libraries for shielding and radiation damage calculations through NJOY and the Bologna revised version of the SCAMPI data processing system. A number of working group-wise data libraries have been prepared and transferred to the ENEA Data Bank for dissemination. Several years ago the NRC "Kurchatov Institute" restarted the GRUCON project, originally designed to provide group constants for fast nuclear reactor calculations [12], with the aim of expanding its application area and using it in WWER safety tasks, in particular in RPV radiation damage analyses. By means of the updated GRUCON and NJOY-99 processing codes, and the calculation procedure developed in the NDG of ENEA Bologna, a sample of kerma and damage-energy point-wise data sub-libraries from different evaluated data libraries has been generated. On the basis of this sample, a quantitative assessment of kerma/dpa data precision in RPV calculations is obtained

  17. Quality Quantification of Evaluated Cross Section Covariances

    International Nuclear Information System (INIS)

    Varet, S.; Dossantos-Uzarralde, P.; Vayatis, N.

    2015-01-01

Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can differ according to the method used and the assumptions of the method, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists in defining an objective criterion. The second step is computation of the criterion. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without knowledge of the true covariance matrix. The full approach is illustrated on the 85Rb nucleus evaluations and the results are then used for a discussion on scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations
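For two zero-mean Gaussians, the Kullback-Leibler distance between an estimated covariance matrix and a reference one has the closed form KL = 0.5 * (tr(S^-1 S_hat) - k + ln det S - ln det S_hat). A sketch with invented 2x2 matrices, not the 85Rb evaluation data:

```python
import numpy as np

def kl_gaussian(sigma_est, sigma_true):
    """KL divergence N(0, sigma_est) || N(0, sigma_true) for SPD matrices."""
    k = sigma_est.shape[0]
    inv_true = np.linalg.inv(sigma_true)
    trace_term = np.trace(inv_true @ sigma_est)
    _, logdet_est = np.linalg.slogdet(sigma_est)
    _, logdet_true = np.linalg.slogdet(sigma_true)
    return 0.5 * (trace_term - k + logdet_true - logdet_est)

true_cov = np.array([[1.0, 0.3], [0.3, 1.0]])   # stand-in "true" covariance
est_cov = np.array([[1.1, 0.2], [0.2, 0.9]])    # stand-in estimated covariance
d = kl_gaussian(est_cov, true_cov)              # 0 iff the two matrices are equal
```

The paper's bootstrap step estimates this criterion without access to the true covariance; the closed form above is the quantity being estimated.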

  18. The mathematics of a successful deconvolution: a quantitative assessment of mixture-based combinatorial libraries screened against two formylpeptide receptors.

    Science.gov (United States)

    Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia

    2013-05-30

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.

  19. Time-Driven Activity-Based Costing for Inter-Library Services: A Case Study in a University

    Science.gov (United States)

    Pernot, Eli; Roodhooft, Filip; Van den Abbeele, Alexandra

    2007-01-01

    Although the true costs of inter-library loans (ILL) are unknown, universities increasingly rely on them to provide better library services at lower costs. Through a case study, we show how to perform a time-driven activity-based costing analysis of ILL and provide evidence of the benefits of such an analysis.
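Time-driven ABC needs only two estimates per resource: the capacity cost rate (cost of supplied capacity divided by practical capacity) and the unit times each activity consumes. A toy sketch with invented figures, not the case study's data:

```python
# Capacity cost rate: cost of supplied staff capacity per usable minute.
staff_cost_per_period = 60000.0       # invented cost of supplied capacity
practical_capacity_min = 100000.0     # invented usable staff minutes per period
rate = staff_cost_per_period / practical_capacity_min   # cost per minute

# Invented minutes consumed per inter-library loan, by activity.
ill_minutes = {"receive_request": 4.0, "locate_item": 10.0, "ship": 6.0}
cost_per_loan = rate * sum(ill_minutes.values())        # cost assigned per ILL
```

Scaling `cost_per_loan` by annual loan volume, and comparing against used versus practical capacity, is what lets a TDABC study expose the unused-capacity cost the abstract alludes to.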

  20. IDENTIFICATION OF ACTIVE BACTERIAL COMMUNITIES IN A MODEL DRINKING WATER BIOFILM SYSTEM USING 16S RRNA-BASED CLONE LIBRARIES

    Science.gov (United States)

    Recent phylogenetic studies have used DNA as the target molecule for the development of environmental 16S rDNA clone libraries. As DNA may persist in the environment, DNA-based libraries cannot be used to identify metabolically active bacteria in water systems. In this study, a...

  1. Growing Competition for Libraries.

    Science.gov (United States)

    Gibbons, Susan

    2001-01-01

    Describes the Questia subscription-based online academic digital books library. Highlights include weaknesses of the collection; what college students want from a library; importance of marketing; competition for traditional academic libraries that may help improve library services; and the ability of Questia to overcome barriers and…

  2. Flexible Bayesian Dynamic Modeling of Covariance and Correlation Matrices

    KAUST Repository

    Lan, Shiwei; Holbrook, Andrew; Fortin, Norbert J.; Ombao, Hernando; Shahbaba, Babak

    2017-01-01

    Modeling covariance (and correlation) matrices is a challenging problem due to the large dimensionality and positive-definiteness constraint. In this paper, we propose a novel Bayesian framework based on decomposing the covariance matrix

  3. Development and verification of a 281-group WIMS-D library based on ENDF/B-VII.1

    International Nuclear Information System (INIS)

    Dong, Zhengyun; Wu, Jun; Ma, Xubo; Yu, Hui; Chen, Yixue

    2016-01-01

Highlights: • A new WIMS-D library based on the SHEM 281 energy group structure is developed. • The method for calculating the lambda factor is illustrated and the parameters are discussed. • The results show the improvements of this library compared with other libraries. - Abstract: The WIMS-D library based on the WIMS 69 or XMAS 172 energy group structures is widely used in thermal reactor research. However, the resonance overlap effect is not taken into account in these two energy group structures, which limits the accuracy of resonance treatment. The SHEM 281 group structure was designed by the French to avoid the resonance overlap effect. In this study, a new WIMS-D library with the SHEM 281 mesh is developed by using the NJOY nuclear data processing system based on the latest evaluated nuclear data library ENDF/B-VII.1. The parameters that depend on the group structure, such as the thermal cut-off energy and the lambda factor, are discussed. The lambda factor is calculated by the Neutron Resonance Spectrum Calculation System and the effect of this factor is analyzed. The new library is verified through the analysis of various criticality benchmarks using the DRAGON code. The values of the multiplication factor are consistent with the experimental data and the results are also improved in comparison with other WIMS libraries.

  4. Covariance descriptor fusion for target detection

    Science.gov (United States)

    Cukur, Huseyin; Binol, Hamidullah; Bal, Abdullah; Yavuz, Fatih

    2016-05-01

Target detection is one of the most important topics for military and civilian applications. To address such detection tasks, hyperspectral imaging sensors provide useful image data containing both spatial and spectral information. Target detection in hyperspectral images involves various challenging scenarios. To overcome these challenges, the covariance descriptor presents many advantages. The detection capability of the conventional covariance descriptor technique can be improved by fusion methods. In this paper, hyperspectral bands are clustered according to inter-band correlation. Target detection is then realized by fusing covariance descriptor results based on the band clusters. The proposed combination technique is denoted Covariance Descriptor Fusion (CDF). The efficiency of the CDF is evaluated by applying it to hyperspectral imagery to detect man-made objects. The obtained results show that the CDF presents better performance than the conventional covariance descriptor.
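A region covariance descriptor is simply the covariance matrix of per-pixel feature vectors over a region, compared with a metric suited to symmetric positive-definite matrices. A sketch with synthetic features; the log-Euclidean distance used here is one common choice and may differ from the paper's exact pipeline:

```python
import numpy as np

def covariance_descriptor(features):
    """Covariance descriptor of a region: features is (n_pixels, n_features)."""
    return np.cov(features, rowvar=False)

def log_euclidean_distance(c1, c2):
    """Distance between SPD descriptors via their matrix logarithms."""
    def logm_spd(c):
        w, v = np.linalg.eigh(c)           # eigendecomposition of an SPD matrix
        return (v * np.log(w)) @ v.T       # log acts on the eigenvalues
    return np.linalg.norm(logm_spd(c1) - logm_spd(c2), "fro")

rng = np.random.default_rng(4)
region_a = rng.normal(size=(400, 5))          # toy per-pixel feature vectors
region_b = rng.normal(size=(400, 5)) * 2.0    # same structure, larger spread
d_ab = log_euclidean_distance(covariance_descriptor(region_a),
                              covariance_descriptor(region_b))
d_aa = log_euclidean_distance(covariance_descriptor(region_a),
                              covariance_descriptor(region_a))
```

Fusion, as the abstract describes it, would compute one such descriptor per band cluster and combine the per-cluster detection scores.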

  5. Library based x-ray scatter correction for dedicated cone beam breast CT

    International Nuclear Information System (INIS)

    Shi, Linxi; Zhu, Lei; Vedantham, Srinivasan; Karellas, Andrew

    2016-01-01

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the GEANT4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal

  6. Library based x-ray scatter correction for dedicated cone beam breast CT

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Linxi; Zhu, Lei, E-mail: leizhu@gatech.edu [Nuclear and Radiological Engineering and Medical Physics Programs, The George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States); Vedantham, Srinivasan; Karellas, Andrew [Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts 01655 (United States)

    2016-08-15

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the GEANT4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal
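The library-selection and subtraction step described in this abstract can be sketched in a few lines. The function, library layout, and numbers below are illustrative assumptions, not the authors' implementation; in particular, the real method also spatially translates the library scatter map to match the clinical projection geometry, which is omitted here.

```python
import numpy as np

# Hypothetical sketch of the library-based correction step: pick the
# precomputed scatter map whose breast diameter is closest to the estimate
# from the first-pass reconstruction, then subtract it from the measured
# projection. Names and library contents are assumptions.

def correct_projection(measured, scatter_library, estimated_diameter_cm):
    """Subtract the best-matching precomputed scatter map from a projection.

    scatter_library: dict mapping breast diameter (cm) -> 2D scatter map
    """
    # Select the library entry closest to the estimated breast size
    best = min(scatter_library, key=lambda d: abs(d - estimated_diameter_cm))
    scatter = scatter_library[best]
    # Scatter-corrected projection; clip to avoid negative intensities
    return np.clip(measured - scatter, 0.0, None)

# Toy example: uniform scatter maps precomputed for three breast sizes
library = {10.0: np.full((4, 4), 0.2),
           12.0: np.full((4, 4), 0.3),
           14.0: np.full((4, 4), 0.4)}
projection = np.full((4, 4), 1.0)
corrected = correct_projection(projection, library, 12.4)
```

Because the Monte Carlo simulation is done offline, the per-scan cost of this correction is just a lookup and a subtraction, which is the efficiency argument the abstract makes.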

  7. Econometric analysis of realised covariation: high frequency covariance, regression and correlation in financial economics

    OpenAIRE

    Ole E. Barndorff-Nielsen; Neil Shephard

    2002-01-01

    This paper analyses multivariate high frequency financial data using realised covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis and covariance. It will be based on a fixed interval of time (e.g. a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions and covariances change through time. In particular w...
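The realised covariation estimator at the heart of this analysis can be sketched as the sum of outer products of high-frequency return vectors over the fixed interval, from which realised correlations and regression coefficients follow. The toy data and variable names below are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch of realised covariation over a fixed interval (e.g. one
# trading day): sum the outer products of the intraday return vectors.

def realised_covariance(returns):
    """returns: (n_obs, n_assets) array of high-frequency returns."""
    return returns.T @ returns  # sum_i r_i r_i^T

rng = np.random.default_rng(0)
r = rng.normal(scale=1e-3, size=(500, 2))       # 500 intraday returns, 2 assets
rc = realised_covariance(r)
realised_corr = rc[0, 1] / np.sqrt(rc[0, 0] * rc[1, 1])
realised_beta = rc[0, 1] / rc[1, 1]             # regression of asset 0 on asset 1
```

Repeating this day by day gives the time series of covariances, correlations, and betas whose evolution the paper studies as the sampling frequency grows.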

  8. gLibrary/DRI: A grid-based platform to host multiple repositories for digital content

    International Nuclear Information System (INIS)

    Calanducci, A.; Gonzalez Martin, J. M.; Ramos Pollan, R.; Rubio del Solar, M.; Tcaci, S.

    2007-01-01

In this work we present the gLibrary/DRI (Digital Repositories Infrastructure) platform. gLibrary/DRI extends gLibrary, a system with an easy-to-use web front-end designed to save and organize multimedia assets on Grid-based storage resources. The main goal of the extended platform is to reduce the time and effort a repository provider must spend to get a repository deployed. This is achieved by providing a common infrastructure and a set of mechanisms (APIs and specifications) that repository providers use to define the data model, the access to the content (by navigation trees and filters), and the storage model. DRI offers a generic way to provide all this functionality; nevertheless, providers can add behaviours specific to their repositories on top of the default functions. The architecture is Grid-based (VO system, data federation and distribution, computing power, etc.). A working example based on a mammograms repository is also presented. (Author)

  9. CSBB-ConeExclusion, adapting structure based solution virtual screening to libraries on solid support.

    Science.gov (United States)

    Shave, Steven; Auer, Manfred

    2013-12-23

Combinatorial chemical libraries produced on solid support offer fast and cost-effective access to a large number of unique compounds. If such libraries are screened directly on-bead, the speed at which chemical space can be explored by chemists is much greater than that addressable using solution-based synthesis and screening methods. Solution-based screening has a large supporting body of software, such as structure-based virtual screening tools, which enables the prediction of protein-ligand complexes. Use of these techniques to predict the protein-bound complexes of compounds synthesized on solid support neglects to take into account the conjugation site on the small-molecule ligand. This may invalidate predicted binding modes, as the linker may clash with protein atoms. We present CSBB-ConeExclusion, a methodology and computer program which provides a measure of the applicability of solution dockings to solid support. Output is given in the form of statistics for each docking pose, a unique 2D visualization method which can be used to determine applicability at a glance, and automatically generated PyMol scripts allowing visualization of protein atom incursion into a defined exclusion volume. As an example, CSBB-ConeExclusion is used to determine the optimum attachment point for a purine library targeting cyclin-dependent kinase 2 (CDK2).

  10. The covariant chiral ring

    Energy Technology Data Exchange (ETDEWEB)

    Bourget, Antoine; Troost, Jan [Laboratoire de Physique Théorique, École Normale Supérieure, 24 rue Lhomond, 75005 Paris (France)

    2016-03-23

    We construct a covariant generating function for the spectrum of chiral primaries of symmetric orbifold conformal field theories with N=(4,4) supersymmetry in two dimensions. For seed target spaces K3 and T{sup 4}, the generating functions capture the SO(21) and SO(5) representation theoretic content of the chiral ring respectively. Via string dualities, we relate the transformation properties of the chiral ring under these isometries of the moduli space to the Lorentz covariance of perturbative string partition functions in flat space.

  11. Dimension from covariance matrices.

    Science.gov (United States)

    Carroll, T L; Byers, J M

    2017-02-01

    We describe a method to estimate embedding dimension from a time series. This method includes an estimate of the probability that the dimension estimate is valid. Such validity estimates are not common in algorithms for calculating the properties of dynamical systems. The algorithm described here compares the eigenvalues of covariance matrices created from an embedded signal to the eigenvalues for a covariance matrix of a Gaussian random process with the same dimension and number of points. A statistical test gives the probability that the eigenvalues for the embedded signal did not come from the Gaussian random process.
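The comparison the abstract describes can be sketched as follows: delay-embed the scalar signal, compute the eigenvalue spectrum of the embedded signal's covariance matrix, and contrast it with the spectrum of a Gaussian random process of the same dimension and length. The embedding delay, signals, and thresholds below are assumptions for illustration; the statistical validity test of the paper is not reproduced here.

```python
import numpy as np

# Sketch: a low-dimensional deterministic signal concentrates its covariance
# spectrum on a few eigenvalues, while a Gaussian process spreads it evenly.

def delay_embed(x, dim, tau=1):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def covariance_spectrum(x, dim):
    emb = delay_embed(x, dim)
    emb = emb - emb.mean(axis=0)
    return np.linalg.eigvalsh(np.cov(emb.T))[::-1]   # descending eigenvalues

rng = np.random.default_rng(1)
t = np.arange(2000)
signal = np.sin(0.05 * t)            # low-dimensional deterministic signal
noise = rng.normal(size=2000)        # Gaussian baseline of the same length
sig_spec = covariance_spectrum(signal, 5)
noise_spec = covariance_spectrum(noise, 5)
# The sine's spectrum collapses onto ~2 eigenvalues; the noise spectrum is flat
```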

  12. Development of the adjusted nuclear cross-section library based on JENDL-3.2 for large FBR

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Ishikawa, Makoto; Numata, Kazuyuki

    1999-04-01

JNC (and formerly PNC) had developed an adjusted nuclear cross-section library in which the results of the JUPITER experiments were reflected. Using this adjusted library, a distinct improvement in the accuracy of the nuclear design of FBR cores had been achieved. As recent research, JNC is developing a database of other integral data in addition to the JUPITER experiments, aiming at further improvements in accuracy and reliability. In 1991, the adjusted library based on JENDL-2, JFS-3-J2(ADJ91R), was developed, and it has been used in design research for FBRs. As an evaluated nuclear data library, however, JENDL-3.2 is now generally used. Therefore, the authors developed an adjusted library based on JENDL-3.2, called JFS-3-J3.2(ADJ98). It is known that the adjusted library based on JENDL-2 overestimated the sodium void reactivity worth by 10-20%; the adjusted library based on JENDL-3.2 is expected to solve this problem. The adjusted library JFS-3-J3.2(ADJ98) was produced with the same method as JFS-3-J2(ADJ91R) but used more integral parameters from the JUPITER experiments. This report also describes the design accuracy estimation for a 600 MWe class FBR with the adjusted library JFS-3-J3.2(ADJ98). Its main nuclear design parameters (multiplication factor, burn-up reactivity loss, breeding ratio, etc.), except the sodium void reactivity worth, calculated with JFS-3-J3.2(ADJ98) are almost the same as those predicted with JFS-3-J2(ADJ91R). As for the sodium void reactivity, JFS-3-J3.2(ADJ98) gives an estimate about 4% smaller than JFS-3-J2(ADJ91R) because of the change of the basic nuclear library from JENDL-2 to JENDL-3.2. (author)

  13. Status of CINDER and ENDF/B-V based libraries for transmutation calculations

    International Nuclear Information System (INIS)

    Wilson, W.B.; England, T.R.; LaBauve, R.J.; Battat, M.E.; Wessol, D.E.; Perry, R.T.

    1980-01-01

The CINDER codes and their data libraries are described, and their range of calculational capabilities is illustrated using documented applications. The importance of ENDF/B data and the features of the ENDF/B-IV and ENDF/B-V fission-product and actinide data files are emphasized. The actinide decay data of ENDF/B-V, augmented by additional data from available sources, are used to produce average decay energy values and neutron source values from spontaneous fission, (α,n) reactions, and delayed neutron emission for 144 actinide nuclides that are formed in reactor fuel. The status and characteristics of the CINDER-2 code are described, along with a brief description of the better-known code versions; a review of the status of new ENDF/B-V based libraries for all versions is presented

  14. A Generic High-performance GPU-based Library for PDE solvers

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

, the privilege of high-performance parallel computing is now in principle accessible for many scientific users, no matter their economic resources. Though being highly effective units, GPUs and parallel architectures in general pose challenges for software developers to utilize their efficiency. Sequential...... legacy codes are not always easily parallelized and the time spent on conversion might not pay off in the end. We present a highly generic C++ library for fast assembling of partial differential equation (PDE) solvers, aiming at utilizing the computational resources of GPUs. The library requires a minimum...... of GPU computing knowledge, while still offering the possibility to customize user-specific solvers at kernel level if desired. Spatial differential operators are based on matrix-free flexible-order finite difference approximations. These matrix-free operators minimize both memory consumption and main memory access...
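The matrix-free idea mentioned in the abstract is that a differential operator is applied as a stencil, without ever assembling a sparse matrix, which saves memory and memory traffic. The library itself is C++ and GPU-oriented; the Python fragment below is only a conceptual sketch of a second-order 1D Laplacian stencil with homogeneous Dirichlet boundaries, with all names and parameters my own.

```python
import numpy as np

def apply_laplacian(u, h):
    """Matrix-free application of the 1D second-derivative stencil."""
    out = np.zeros_like(u)  # boundary rows stay zero (Dirichlet)
    out[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2
    return out

# Verify on an eigenfunction of the operator: (sin x)'' = -sin x
n = 101
x = np.linspace(0.0, np.pi, n)
u = np.sin(x)
lap = apply_laplacian(u, x[1] - x[0])
```

On a GPU, the same stencil maps naturally onto one thread per grid point, which is why such operators suit the architectures the abstract targets.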

  15. Network Based Educational Environment How Libraries and Librarians Become Organizers of Knowledge Access and Resources

    CERN Document Server

    Pettenati, M C; Pettenati, Corrado

    2000-01-01

In this paper we highlight some important issues which will influence the redefinition of the roles and duties of libraries and librarians in a network-based educational environment. Although librarians will keep their traditional roles of faculty support, reference service, and research assistance, we identify participation in the instructional design process, support in the evaluation, development, and use of a proper authoring system, and the customization of information access as the domains in which libraries and librarians should mainly involve themselves in the near future, profiting from their expertise in information and knowledge organization in order to properly and effectively support their institutions in the use of information technology in education.

  16. Covariant Quantization with Extended BRST Symmetry

    OpenAIRE

    Geyer, B.; Gitman, D. M.; Lavrov, P. M.

    1999-01-01

A short review of covariant quantization methods based on BRST-antiBRST symmetry is given. In particular, problems of the correct definition of the Sp(2)-symmetric quantization scheme known as triplectic quantization are considered.

  17. On the Methodology to Calculate the Covariance of Estimated Resonance Parameters

    International Nuclear Information System (INIS)

    Becker, B.; Kopecky, S.; Schillebeeckx, P.

    2015-01-01

Principles to determine resonance parameters and their covariance from experimental data are discussed. Different methods to propagate the covariance of experimental parameters are compared. A full Bayesian statistical analysis reveals that the level to which the initial uncertainty of the experimental parameters propagates strongly depends on the experimental conditions. For high-precision data, the initial uncertainties of experimental parameters, such as a normalization factor, have almost no impact on the covariance of the parameters in the case of thick-sample measurements and conventional uncertainty propagation or full Bayesian analysis. The covariances derived from a full Bayesian analysis and a least-squares fit are obtained under the condition that the model describing the experimental observables is perfect. When the quality of the model cannot be verified, a more conservative method based on a renormalization of the covariance matrix is recommended to propagate fully the uncertainty of experimental systematic effects. Finally, neutron resonance transmission analysis is proposed as an accurate method to validate evaluated data libraries in the resolved resonance region
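One common form of covariance renormalization, when model imperfection is suspected, is to inflate the formal least-squares covariance by the reduced chi-square of the fit. This is a widely used convention and not necessarily the exact method of the paper; the fit model, data, and names below are assumptions for illustration.

```python
import numpy as np

def fit_line_with_renormalized_cov(x, y, sigma):
    """Weighted linear fit y = a + b*x; returns parameters and covariances."""
    A = np.column_stack([np.ones_like(x), x])
    W = np.diag(1.0 / sigma**2)
    cov = np.linalg.inv(A.T @ W @ A)          # formal least-squares covariance
    p = cov @ A.T @ W @ y
    resid = y - A @ p
    chi2_ndf = (resid @ W @ resid) / (len(x) - 2)
    # Conservative covariance: scale by reduced chi-square when it exceeds 1
    return p, cov, cov * max(chi2_ndf, 1.0)

x = np.linspace(0.0, 10.0, 20)
y = 1.0 + 2.0 * x + 0.05 * (x - 5.0) ** 2     # quadratic defect the line misses
p, cov, cov_renorm = fit_line_with_renormalized_cov(x, y, np.full(20, 0.1))
```

With a deliberately imperfect model, the renormalized covariance is substantially larger than the formal one, reflecting the systematic misfit rather than only the quoted data uncertainties.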

  18. Networks of myelin covariance

    Science.gov (United States)

    Slater, David; Ruef, Anne; Sanabria‐Diaz, Gretel; Preisig, Martin; Kherif, Ferath; Draganski, Bogdan; Lutti, Antoine

    2017-01-01

    Abstract Networks of anatomical covariance have been widely used to study connectivity patterns in both normal and pathological brains based on the concurrent changes of morphometric measures (i.e., cortical thickness) between brain structures across subjects (Evans, 2013). However, the existence of networks of microstructural changes within brain tissue has been largely unexplored so far. In this article, we studied in vivo the concurrent myelination processes among brain anatomical structures that gathered together emerge to form nonrandom networks. We name these “networks of myelin covariance” (Myelin‐Nets). The Myelin‐Nets were built from quantitative Magnetization Transfer data—an in‐vivo magnetic resonance imaging (MRI) marker of myelin content. The synchronicity of the variations in myelin content between anatomical regions was measured by computing the Pearson's correlation coefficient. We were especially interested in elucidating the effect of age on the topological organization of the Myelin‐Nets. We therefore selected two age groups: Young‐Age (20–31 years old) and Old‐Age (60–71 years old) and a pool of participants from 48 to 87 years old for a Myelin‐Nets aging trajectory study. We found that the topological organization of the Myelin‐Nets is strongly shaped by aging processes. The global myelin correlation strength, between homologous regions and locally in different brain lobes, showed a significant dependence on age. Interestingly, we also showed that the aging process modulates the resilience of the Myelin‐Nets to damage of principal network structures. In summary, this work sheds light on the organizational principles driving myelination and myelin degeneration in brain gray matter and how such patterns are modulated by aging. PMID:29271053
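The across-subject Pearson-correlation construction used for such covariance networks can be sketched briefly. The synthetic data, threshold, and variable names below are illustrative assumptions, not the study's actual pipeline or its quantitative Magnetization Transfer data.

```python
import numpy as np

def covariance_network(measures, threshold=0.5):
    """measures: (n_subjects, n_regions) array; returns (corr, adjacency)."""
    corr = np.corrcoef(measures.T)            # region-by-region Pearson r
    adj = (np.abs(corr) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)                  # no self-connections
    return corr, adj

# Toy cohort: a shared (aging-like) factor drives all regions together,
# producing the synchronous variation that forms nonrandom network edges.
rng = np.random.default_rng(2)
shared = rng.normal(size=(40, 1))                          # 40 subjects
regions = shared @ np.ones((1, 6)) + 0.5 * rng.normal(size=(40, 6))
corr, adj = covariance_network(regions)
```

Comparing such adjacency matrices between age groups is, in essence, how the abstract's age-dependent topology results are obtained.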

  19. Pedigree-based estimation of covariance between dominance deviations and additive genetic effects in closed rabbit lines considering inbreeding and using a computationally simpler equivalent model.

    Science.gov (United States)

    Fernández, E N; Legarra, A; Martínez, R; Sánchez, J P; Baselga, M

    2017-06-01

    Inbreeding generates covariances between additive and dominance effects (breeding values and dominance deviations). In this work, we developed and applied models for estimation of dominance and additive genetic variances and their covariance, a model that we call "full dominance," from pedigree and phenotypic data. Estimates with this model such as presented here are very scarce both in livestock and in wild genetics. First, we estimated pedigree-based condensed probabilities of identity using recursion. Second, we developed an equivalent linear model in which variance components can be estimated using closed-form algorithms such as REML or Gibbs sampling and existing software. Third, we present a new method to refer the estimated variance components to meaningful parameters in a particular population, i.e., final partially inbred generations as opposed to outbred base populations. We applied these developments to three closed rabbit lines (A, V and H) selected for number of weaned at the Polytechnic University of Valencia. Pedigree and phenotypes are complete and span 43, 39 and 14 generations, respectively. Estimates of broad-sense heritability are 0.07, 0.07 and 0.05 at the base versus 0.07, 0.07 and 0.09 in the final generations. Narrow-sense heritability estimates are 0.06, 0.06 and 0.02 at the base versus 0.04, 0.04 and 0.01 at the final generations. There is also a reduction in the genotypic variance due to the negative additive-dominance correlation. Thus, the contribution of dominance variation is fairly large and increases with inbreeding and (over)compensates for the loss in additive variation. In addition, estimates of the additive-dominance correlation are -0.37, -0.31 and 0.00, in agreement with the few published estimates and theoretical considerations. © 2017 Blackwell Verlag GmbH.

  20. BUGLE-96: A revised multigroup cross section library for LWR applications based on ENDF/B-VI Release 3

    International Nuclear Information System (INIS)

    White, J.E.; Ingersoll, D.T.; Slater, C.O.; Roussin, R.W.

    1996-01-01

A revised multigroup cross-section library based on ENDF/B-VI Release 3 has been produced for light water reactor shielding and reactor pressure vessel dosimetry applications. This new broad-group library, which is designated BUGLE-96, represents an improvement over the BUGLE-93 library released in February 1994 and is expected to replace the BUGLE-93 data. The cross-section processing methodology is the same as that used for producing BUGLE-93 and is consistent with ANSI/ANS 6.1.2. As an added feature, cross-section sets having upscatter data for four thermal neutron groups are included in the BUGLE-96 package available from the Radiation Shielding Information Center. The upscattering data should improve the application of this library to the calculation of more accurate thermal fluences, although more computer time will be required. The incorporation of feedback from users has resulted in a data library that addresses a wider spectrum of user needs

  1. Generalized Linear Covariance Analysis

    Science.gov (United States)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.
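The core comparison in the last sentence, linear covariance propagation versus ensemble (Monte Carlo) statistics, can be illustrated with a minimal example. This is a simplified stand-in for the authors' formulation: the dynamics, noise levels, and two-state model below are my assumptions, and solve-for/consider partitioning is omitted.

```python
import numpy as np

# Linear covariance propagation: P_{k+1} = F P F^T + Q
F = np.array([[1.0, 1.0], [0.0, 1.0]])     # position/velocity transition, dt = 1
Q = np.diag([1e-6, 1e-4])                  # process noise
P = np.diag([1.0, 0.01])                   # a priori covariance

P_lin = P.copy()
for _ in range(50):
    P_lin = F @ P_lin @ F.T + Q

# Ensemble (Monte Carlo) statistics for the same linear system
rng = np.random.default_rng(3)
x = rng.multivariate_normal([0.0, 0.0], P, size=20000)
for _ in range(50):
    w = rng.multivariate_normal([0.0, 0.0], Q, size=20000)
    x = x @ F.T + w
P_mc = np.cov(x.T)                         # should closely match P_lin
```

For a truly linear, Gaussian system the two agree to within sampling error; the value of the generalized analysis in the talk is in accounting for mismodeled noise and unestimated (consider) parameters, where the formal filter covariance and the true ensemble covariance diverge.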

  2. Development of a common nuclear group constants library system: JSSTDL-295n-104γ based on JENDL-3 nuclear data library

    International Nuclear Information System (INIS)

    Hasegawa, A.

    1992-01-01

JSSTDL-295n-104γ, a common group cross-section library system, has been developed at JAERI to be used in a fairly wide range of applications in the nuclear industry. The system is composed of a common 295-neutron-group/104-photon-group cross-section library based on the JENDL-3 nuclear data file and its utility codes. The system is targeted at criticality and shielding calculations for fast and fusion reactors using the ANISN, DOT, or MORSE codes. The specifications of the common group constants were decided in response to requests from various nuclear data users, particularly from nuclear design groups in Japan. The group structure was chosen so as to cover almost all group structures currently used in the country. The library includes self-shielding factor tables for the primary reactions, and a routine for generating macroscopic cross sections using the self-shielding factor table is also provided. Neutron cross sections and photon production cross sections are processed by the Prof. GROUCH-G/B code system, and γ-ray transport cross sections are generated by GAMLEG-JR. In this paper, the outline and present status of the JSSTDL library system are described along with two examples adopted in the JENDL-3 benchmark test. One is a shielding calculation, in which the effect of the self-shielding factor (f-table) is shown in conjunction with the analysis of the ASPIS natural-iron deep-penetration experiment: without considering the resonance self-shielding effect in the resonance energy region for resonant nuclides like iron, the calculated attenuation profile in the shields is completely misleading. The other example is fast reactor criticality calculations of very small critical assemblies with highly enriched fuel materials, in which some basic characteristics of the library are presented. (orig.)
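The f-table mechanism mentioned above scales an infinite-dilution group cross section by a self-shielding factor interpolated against the background cross section. The sketch below illustrates the idea only; the numbers, grid, and interpolation scheme are my assumptions, not the JSSTDL tables or the routine shipped with the library.

```python
import numpy as np

def effective_macro_xs(number_density, sigma_inf, f_table_sigma0, f_table_f, sigma0):
    """Sigma_eff = N * f(sigma0) * sigma_inf, with f interpolated in log10(sigma0).

    number_density in atoms/(barn*cm), cross sections in barns.
    """
    f = np.interp(np.log10(sigma0), np.log10(f_table_sigma0), f_table_f)
    return number_density * f * sigma_inf

# Toy f-table for a resonant nuclide: strong self-shielding at low sigma0
sigma0_grid = np.array([1e0, 1e1, 1e2, 1e3, 1e4])
f_grid = np.array([0.35, 0.55, 0.80, 0.95, 1.00])
sigma_eff = effective_macro_xs(8.5e-2, 2.5, sigma0_grid, f_grid, 1e2)
```

Skipping the factor f (i.e., setting f = 1) for a resonant nuclide such as iron is exactly the omission that the ASPIS example shows to be fatal for deep-penetration attenuation profiles.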

  3. Development of library documents in BRICEM on network-based environment

    International Nuclear Information System (INIS)

    Gao Renxi

    2010-01-01

With the development of the Internet, the transformation from a traditional library to a modern one is essential to the development of the BRICEM (Beijing Research Institute of Chemical Engineering and Metallurgy) Technology Library. Drawing on the situations of other libraries and the realities of BRICEM, this paper works out a tentative plan, as well as concrete measures and procedures, for digitalising and sharing online the resources of the BRICEM Technology Library. (author)

  4. Open source libraries and frameworks for mass spectrometry based proteomics: A developer's perspective☆

    Science.gov (United States)

    Perez-Riverol, Yasset; Wang, Rui; Hermjakob, Henning; Müller, Markus; Vesada, Vladimir; Vizcaíno, Juan Antonio

    2014-01-01

    Data processing, management and visualization are central and critical components of a state of the art high-throughput mass spectrometry (MS)-based proteomics experiment, and are often some of the most time-consuming steps, especially for labs without much bioinformatics support. The growing interest in the field of proteomics has triggered an increase in the development of new software libraries, including freely available and open-source software. From database search analysis to post-processing of the identification results, even though the objectives of these libraries and packages can vary significantly, they usually share a number of features. Common use cases include the handling of protein and peptide sequences, the parsing of results from various proteomics search engines output files, and the visualization of MS-related information (including mass spectra and chromatograms). In this review, we provide an overview of the existing software libraries, open-source frameworks and also, we give information on some of the freely available applications which make use of them. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. PMID:23467006

  5. EST Express: PHP/MySQL based automated annotation of ESTs from expression libraries.

    Science.gov (United States)

    Smith, Robin P; Buchser, William J; Lemmon, Marcus B; Pardinas, Jose R; Bixby, John L; Lemmon, Vance P

    2008-04-10

    Several biological techniques result in the acquisition of functional sets of cDNAs that must be sequenced and analyzed. The emergence of redundant databases such as UniGene and centralized annotation engines such as Entrez Gene has allowed the development of software that can analyze a great number of sequences in a matter of seconds. We have developed "EST Express", a suite of analytical tools that identify and annotate ESTs originating from specific mRNA populations. The software consists of a user-friendly GUI powered by PHP and MySQL that allows for online collaboration between researchers and continuity with UniGene, Entrez Gene and RefSeq. Two key features of the software include a novel, simplified Entrez Gene parser and tools to manage cDNA library sequencing projects. We have tested the software on a large data set (2,016 samples) produced by subtractive hybridization. EST Express is an open-source, cross-platform web server application that imports sequences from cDNA libraries, such as those generated through subtractive hybridization or yeast two-hybrid screens. It then provides several layers of annotation based on Entrez Gene and RefSeq to allow the user to highlight useful genes and manage cDNA library projects.

  6. EST Express: PHP/MySQL based automated annotation of ESTs from expression libraries

    Directory of Open Access Journals (Sweden)

    Pardinas Jose R

    2008-04-01

Full Text Available Abstract Background Several biological techniques result in the acquisition of functional sets of cDNAs that must be sequenced and analyzed. The emergence of redundant databases such as UniGene and centralized annotation engines such as Entrez Gene has allowed the development of software that can analyze a great number of sequences in a matter of seconds. Results We have developed "EST Express", a suite of analytical tools that identify and annotate ESTs originating from specific mRNA populations. The software consists of a user-friendly GUI powered by PHP and MySQL that allows for online collaboration between researchers and continuity with UniGene, Entrez Gene and RefSeq. Two key features of the software include a novel, simplified Entrez Gene parser and tools to manage cDNA library sequencing projects. We have tested the software on a large data set (2,016 samples) produced by subtractive hybridization. Conclusion EST Express is an open-source, cross-platform web server application that imports sequences from cDNA libraries, such as those generated through subtractive hybridization or yeast two-hybrid screens. It then provides several layers of annotation based on Entrez Gene and RefSeq to allow the user to highlight useful genes and manage cDNA library projects.

  7. Open source libraries and frameworks for mass spectrometry based proteomics: a developer's perspective.

    Science.gov (United States)

    Perez-Riverol, Yasset; Wang, Rui; Hermjakob, Henning; Müller, Markus; Vesada, Vladimir; Vizcaíno, Juan Antonio

    2014-01-01

    Data processing, management and visualization are central and critical components of a state of the art high-throughput mass spectrometry (MS)-based proteomics experiment, and are often some of the most time-consuming steps, especially for labs without much bioinformatics support. The growing interest in the field of proteomics has triggered an increase in the development of new software libraries, including freely available and open-source software. From database search analysis to post-processing of the identification results, even though the objectives of these libraries and packages can vary significantly, they usually share a number of features. Common use cases include the handling of protein and peptide sequences, the parsing of results from various proteomics search engines output files, and the visualization of MS-related information (including mass spectra and chromatograms). In this review, we provide an overview of the existing software libraries, open-source frameworks and also, we give information on some of the freely available applications which make use of them. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Key Performance Indicators in Irish Hospital Libraries: Developing Outcome-Based Metrics

    Directory of Open Access Journals (Sweden)

    Michelle Dalton

    2012-12-01

Full Text Available Objective – To develop a set of generic outcome-based performance measures for Irish hospital libraries. Methods – Various models and frameworks of performance measurement were used as a theoretical paradigm to link the impact of library services directly with measurable healthcare objectives and outcomes. Strategic objectives were identified, mapped to performance indicators, and finally translated into response choices to a single-question online survey for distribution via email. Results – The set of performance indicators represents an impact assessment tool which is easy to administer across a variety of healthcare settings. In using a model directly aligned with the mission and goals of the organization, and linked to core activities and operations in an accountable way, the indicators can also be used as a channel through which to implement action, change, and improvement. Conclusion – The indicators can be adopted at a local and potentially a national level, as both a tool for advocacy and to assess and improve service delivery at a macro level. To overcome the constraints posed by necessary simplifications, substantial further research is needed by hospital libraries to develop more sophisticated and meaningful measures of impact to further aid decision making at a micro level.

  9. MICROX-2 cross section library based on ENDF/B-VII

    International Nuclear Information System (INIS)

    Hou, J.; Ivanov, K.; Choi, H.

    2012-01-01

New cross-section libraries for the neutron transport code MICROX-2 have been generated for advanced reactor design and fuel cycle analyses. A total of 386 nuclides were processed, including 10 thermal scattering nuclides, which are available in the ENDF/B-VII release 0 nuclear data. The NJOY system and the MICROR code were used to process the nuclear data and convert them into MICROX-2 format. The energy group structure of the new library was optimized for both thermal- and fast-spectrum reactors based on the Contributon and Point-wise Cross Section Driven (CPXSD) method, resulting in a total of 1173 energy groups. A series of lattice-cell-level benchmark calculations has been performed against both experimental measurements and Monte Carlo calculations for the effective/infinite multiplication factor and reaction rate ratios. The results of the MICROX-2 calculations with the new library were consistent with those of 15 reference cases. The average errors of the infinite multiplication factor and reaction rate ratio were 0.31% δk and 1.9%, respectively. The maximum error of the reaction rate ratios was 8%, for the {sup 238}U-to-{sup 235}U fission ratio of the ZEBRA lattice, against the reference calculation done with MCNP5. (authors)

  10. Earth Observation System Flight Dynamics System Covariance Realism

    Science.gov (United States)

    Zaidi, Waqar H.; Tracewell, David

    2016-01-01

    This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.
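
    A common implementation of such a covariance realism test statistic (not necessarily the one used in this presentation) is the Mahalanobis distance of the definitive state with respect to the propagated state and covariance, gated against a chi-squared quantile. The state, covariance, and threshold below are invented for illustration:

```python
import numpy as np
from scipy.stats import chi2

def realism_statistic(x_prop, P_prop, x_def):
    """Mahalanobis-distance test statistic: (dx)^T P^-1 (dx),
    chi-squared distributed with dof = state dimension if P is realistic."""
    dx = x_def - x_prop
    return float(dx @ np.linalg.solve(P_prop, dx))

# Hypothetical 3D position example (km, km^2)
x_prop = np.array([7000.0, 0.0, 0.0])
P_prop = np.diag([0.04, 0.01, 0.01])
x_def  = np.array([7000.3, 0.05, -0.02])

eps = realism_statistic(x_prop, P_prop, x_def)
threshold = chi2.ppf(0.997, df=3)   # a 3-sigma-equivalent gate
print(eps, eps < threshold)
```

A statistic repeatedly exceeding the gate suggests the propagated covariance is too small (overconfident); statistics clustered far below it suggest it is inflated.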

  11. A data-driven and physics-based single-pass retrieval of active-passive microwave covariation and vegetation parameters for the SMAP mission

    Science.gov (United States)

    Entekhabi, D.; Jagdhuber, T.; Das, N. N.; Baur, M.; Link, M.; Piles, M.; Akbar, R.; Konings, A. G.; Mccoll, K. A.; Alemohammad, S. H.; Montzka, C.; Kunstmann, H.

    2016-12-01

    The active-passive soil moisture retrieval algorithm of NASA's SMAP mission depends on robust statistical estimation of active-passive covariation (β) and vegetation structure (Γ) parameters in order to provide reliable global measurements of soil moisture at an intermediate resolution (9 km) compared to the native resolutions of the radiometer (36 km) and radar (3 km) instruments. These parameters apply to the SMAP radiometer-radar combination over the period of record that was cut short with the end of the SMAP radar transmission. They also apply to the current SMAP radiometer and Sentinel 1A/B radar combination for high-resolution surface soil moisture mapping. However, the performance of the statistically-based approach is directly dependent on the selection of a representative time frame in which these parameters can be estimated assuming dynamic soil moisture and stationary soil roughness and vegetation cover. Here, we propose a novel, data-driven and physics-based single-pass retrieval of active-passive microwave covariation and vegetation parameters for the SMAP mission. The algorithm does not depend on time series analyses and can be applied using as little as one pair of active-passive acquisitions. The algorithm stems from the physical link between microwave emission and scattering via conservation of energy. The formulation of the emission radiative transfer is combined with the Distorted Born Approximation of radar scattering for vegetated land surfaces. The two formulations are simultaneously solved for the covariation and vegetation structure parameters. Preliminary results from SMAP active-passive observations (April 13th to July 7th 2015) compare well with the time-series statistical approach and confirm the capability of this method to estimate these parameters. Moreover, the method is not restricted to a given frequency (applies to both L-band and C-band combinations for the radar) or incidence angle (all angles and not just the fixed 40° incidence

  12. Impact of the 235U covariance data in benchmark calculations

    International Nuclear Information System (INIS)

    Leal, Luiz; Mueller, Don; Arbanas, Goran; Wiarda, Dorothea; Derrien, Herve

    2008-01-01

    The error estimation for calculated quantities relies on nuclear data uncertainty information available in the basic nuclear data libraries such as the U.S. Evaluated Nuclear Data File (ENDF/B). The uncertainty files (covariance matrices) in the ENDF/B library are generally obtained from analysis of experimental data. In the resonance region, the computer code SAMMY is used for analyses of experimental data and generation of resonance parameters. In addition to resonance parameters evaluation, SAMMY also generates resonance parameter covariance matrices (RPCM). SAMMY uses the generalized least-squares formalism (Bayes' method) together with the resonance formalism (R-matrix theory) for analysis of experimental data. Two approaches are available for creation of resonance-parameter covariance data. (1) During the data-evaluation process, SAMMY generates both a set of resonance parameters that fit the experimental data and the associated resonance-parameter covariance matrix. (2) For existing resonance-parameter evaluations for which no resonance-parameter covariance data are available, SAMMY can retroactively create a resonance-parameter covariance matrix. The retroactive method was used to generate covariance data for 235U. The resulting 235U covariance matrix was then used as input to the PUFF-IV code, which processed the covariance data into multigroup form, and to the TSUNAMI code, which calculated the uncertainty in the multiplication factor due to uncertainty in the experimental cross sections. The objective of this work is to demonstrate the use of the 235U covariance data in calculations of critical benchmark systems. (authors)
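
    The "uncertainty in the multiplication factor due to uncertainty in the cross sections" described here is conventionally obtained by the first-order sandwich rule, (dk/k)² = S C Sᵀ, where S is a vector of relative sensitivities and C a relative covariance matrix. A numerical sketch with invented 3-group numbers (not the actual 235U data):

```python
import numpy as np

# Hypothetical relative sensitivity profile dk/k per dsigma/sigma (3 groups)
S = np.array([0.10, 0.25, 0.05])

# Hypothetical relative covariance matrix of the cross sections
C = np.array([[0.0004, 0.0001, 0.0   ],
              [0.0001, 0.0009, 0.0002],
              [0.0,    0.0002, 0.0016]])

rel_var = S @ C @ S          # (dk/k)^2, first-order propagation
rel_unc = np.sqrt(rel_var)   # relative standard uncertainty in k
print(f"{100 * rel_unc:.3f}% dk/k")
```

The off-diagonal covariance terms matter: positively correlated group uncertainties add to the total rather than averaging out.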

  13. Featured Library: Parrish Library

    OpenAIRE

    Kirkwood, Hal P, Jr

    2015-01-01

    The Roland G. Parrish Library of Management & Economics is located within the Krannert School of Management at Purdue University. Between 2005 and 2007, work was completed on a white paper that focused on a student-centered vision for the Management & Economics Library. The next step was a massive collection reduction and a re-envisioning of both the services and space of the library. Thus began a 3-phase renovation from a 2-floor standard, collection-focused library into a single floor, 18,000s...

  14. Forecasting Covariance Matrices: A Mixed Frequency Approach

    DEFF Research Database (Denmark)

    Halbleib, Roxana; Voev, Valeri

    This paper proposes a new method for forecasting covariance matrices of financial returns. The model mixes volatility forecasts from a dynamic model of daily realized volatilities estimated with high-frequency data with correlation forecasts based on daily data. This new approach allows for flexible dependence patterns for volatilities and correlations, and can be applied to covariance matrices of large dimensions. The separate modeling of volatility and correlation forecasts considerably reduces the estimation and measurement error implied by the joint estimation and modeling of covariance...
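
    The decomposition described here, volatilities and correlations forecast separately and then recombined, amounts to assembling Σ = D R D with D a diagonal matrix of forecast standard deviations. A schematic numpy sketch with placeholder forecasts (the paper's actual volatility and correlation models are not reproduced here):

```python
import numpy as np

def combine_forecasts(vol_forecast, corr_forecast):
    """Assemble a covariance forecast Sigma = D R D from separately
    forecast volatilities (std devs) and a correlation matrix."""
    D = np.diag(vol_forecast)
    return D @ corr_forecast @ D

# Placeholder forecasts for 3 assets, e.g. volatilities from a model of
# realized volatility and correlations from a model of daily returns
vols = np.array([0.010, 0.015, 0.020])
R = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.0, 0.5],
              [0.2, 0.5, 1.0]])

Sigma = combine_forecasts(vols, R)
print(Sigma)
```

As long as R is a valid correlation matrix and all volatilities are positive, the assembled Sigma is symmetric positive definite by construction, which is one practical appeal of forecasting the two pieces separately.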

  15. N2O fluxes over a corn field from an open-path, laser-based eddy covariance system and static chambers

    Science.gov (United States)

    Tao, L.; Pan, D.; Gelfand, I.; Abraha, M.; Moyer, R.; Poe, A.; Sun, K.; Robertson, P.; Zondlo, M. A.

    2015-12-01

    Nitrous oxide (N2O) is an important greenhouse and ozone-depleting gas. Although much effort has been devoted to measuring N2O emissions, their spatial and temporal variability is still subject to large uncertainty. Application of the eddy covariance method to N2O emissions research would allow continuous ecosystem-level flux measurements. The caveat, however, is the need for high-precision and high-frequency measurements in the field. In this study, an open-path, quantum-cascade-laser-based eddy covariance N2O sensor has been deployed nearly continuously since May 2015 over a corn field at the W.K. Kellogg Biological Station site in SW Michigan. The field precision of the N2O sensor was assessed to be 0.1 ppbv at 10 Hz, and the total power consumption was ~40 W, allowing the system to be powered solely by solar panels. The stability of the sensor under different temperatures and humidities was tested within an environmental chamber. Spectroscopic experiments and cospectral analyses were carried out to study corrections specific to the sensor for eddy covariance techniques, including the line-broadening effect due to water vapor and high-frequency flux attenuation owing to sample path averaging. Ogive analyses indicated that the high-frequency N2O flux loss due to various damping effects was comparable to that of the CO2 flux. The detection limit of the flux was estimated to be 0.3 ng N s-1 m-2 with a flux-averaging interval of 30 minutes. The results from the EC system were also compared with ground measurements by standard static chambers (SC). Overall, more than 150 individual chamber measurements were taken within the footprint of the EC system. We found good correlation between the EC and SC methods given the spatiotemporal differences between the two techniques (R2 = 0.75). Both methods detected increased emissions during the afternoon as compared to morning and night hours. Differences between EC and SC were also studied by investigating spatial variability with a
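
    At its core, an eddy covariance flux like the one measured here is the time-averaged covariance of vertical wind fluctuations and scalar concentration fluctuations over the averaging interval. A simplified sketch with synthetic data, ignoring the density (WPL) and spectroscopic corrections the abstract discusses:

```python
import numpy as np

def ec_flux(w, c, rho_air):
    """Eddy covariance flux: mean product of fluctuations after
    block-averaged Reynolds decomposition over one averaging interval."""
    w_prime = w - w.mean()
    c_prime = c - c.mean()
    return rho_air * np.mean(w_prime * c_prime)

# Synthetic 30-min record at 10 Hz (18000 samples), hypothetical units:
# mole fraction weakly correlated with updrafts, plus instrument noise
rng = np.random.default_rng(0)
w = 0.3 * rng.standard_normal(18000)                                # m/s
c = 320e-9 + 0.05e-9 * w + 1e-10 * rng.standard_normal(18000)       # mol/mol

flux = ec_flux(w, c, rho_air=1.2)   # positive: net upward transport
print(flux)
```

The detection-limit figure quoted in the abstract reflects how small this covariance signal is relative to the uncorrelated noise floor over a 30-minute average.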

  16. New perspective in covariance evaluation for nuclear data

    International Nuclear Information System (INIS)

    Kanda, Y.

    1992-01-01

    Methods of nuclear data evaluation have been highly developed during the past decade, especially after the introduction of the concept of covariance. This makes the question of how to evaluate covariance matrices for nuclear data one of utmost importance. It can be said that covariance evaluation effectively is nuclear data evaluation itself, because the covariance matrix has a quantitatively decisive function in current evaluation methods. The covariance primarily represents experimental uncertainties. However, the correlation of individual uncertainties between different data must be taken into account, and this cannot be done without detailed physical consideration of the experimental conditions. This procedure depends on the evaluator, and so does the estimated covariance. The mathematical properties of the covariance have been intensively discussed. Its physical properties should also be studied in order to apply it to nuclear data evaluation; they are therefore reviewed in this report to provide the basis for further development of covariance applications. (orig.)

  17. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

    Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject-based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries were extracted using the Collection Evaluation application. The data were aggregated and filtered to assess how the sample's titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample's titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed that there was a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that the distribution of the aggregated collection conformed to a power-law (80/20) distribution, so that approximately 80% of the collection was represented by 20% of the subject categories. The study also found that there were significant commonalities in the types of subject categories found in the designated tiers and that it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to inform our understanding of knowledge organization for public libraries. The research shows a way forward to help link subjective decision making with a scientifically based approach to managing knowledge
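
    The 80/20 check reported in the Results can be reproduced on any holdings-by-subject table by sorting category counts and measuring the share of titles covered by the top fifth of categories. A small illustration with invented counts (not the study's data):

```python
def share_of_top_categories(counts, top_fraction=0.20):
    """Fraction of all titles accounted for by the top `top_fraction`
    of subject categories, counts sorted in descending order."""
    counts = sorted(counts, reverse=True)
    k = max(1, round(top_fraction * len(counts)))
    return sum(counts[:k]) / sum(counts)

# Invented holdings per subject category for 10 categories
holdings = [500, 300, 60, 40, 30, 25, 20, 15, 6, 4]
share = share_of_top_categories(holdings)   # top 2 of 10 categories
print(share)
```

A result near 0.8 for the top 20% of categories is the signature of the power-law concentration the study describes.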

  18. Covariant field equations in supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Vanhecke, Bram [KU Leuven, Institute for Theoretical Physics, Leuven (Belgium); Ghent University, Faculty of Physics, Gent (Belgium); Proeyen, Antoine van [KU Leuven, Institute for Theoretical Physics, Leuven (Belgium)

    2017-12-15

    Covariance is a useful property for handling supergravity theories. In this paper, we prove a covariance property of supergravity field equations: under reasonable conditions, field equations of supergravity are covariant modulo other field equations. We prove that for any supergravity there exist such covariant equations of motion, other than the regular equations of motion, that are equivalent to the latter. The relations that we find between field equations and their covariant form can be used to obtain multiplets of field equations. In practice, the covariant field equations are easily found by simply covariantizing the ordinary field equations. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  19. Covariant field equations in supergravity

    International Nuclear Information System (INIS)

    Vanhecke, Bram; Proeyen, Antoine van

    2017-01-01

    Covariance is a useful property for handling supergravity theories. In this paper, we prove a covariance property of supergravity field equations: under reasonable conditions, field equations of supergravity are covariant modulo other field equations. We prove that for any supergravity there exist such covariant equations of motion, other than the regular equations of motion, that are equivalent to the latter. The relations that we find between field equations and their covariant form can be used to obtain multiplets of field equations. In practice, the covariant field equations are easily found by simply covariantizing the ordinary field equations. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  20. Generally covariant gauge theories

    International Nuclear Information System (INIS)

    Capovilla, R.

    1992-01-01

    A new class of generally covariant gauge theories in four space-time dimensions is investigated. The field variables are taken to be a Lie algebra valued connection 1-form and a scalar density. Modulo an important degeneracy, complex [euclidean] vacuum general relativity corresponds to a special case in this class. A canonical analysis of the generally covariant gauge theories with the same gauge group as general relativity shows that they describe two degrees of freedom per space point, qualifying therefore as a new set of neighbors of general relativity. The modification of the algebra of the constraints with respect to the general relativity case is computed; this is used in addressing the question of how general relativity stands out from its neighbors. (orig.)

  1. Libraries serving dialogue

    CERN Document Server

    Dupont, Odile

    2014-01-01

    This book, based on the experiences of libraries serving interreligious dialogue, presents themes such as library tools serving dialogue between cultures, collections in dialogue, children and young adults dialoguing beyond borders, storytelling as dialogue, and librarians serving interreligious dialogue.

  2. New Neutron, Proton, and S(α,β) MCNP Data Libraries Based on ENDF/B-VII

    International Nuclear Information System (INIS)

    Little, Robert C.; Trellue, Holly R.; MacFarlane, Robert E.; Kahler, A.C.; Lee, Mary Beth; White, Morgan C.

    2008-01-01

    The general-purpose Evaluated Nuclear Data File ENDF/B-VII.0 was released in December 2006. A number of sub-libraries were included in ENDF/B-VII.0 such that data were provided for incident neutrons, photons, and charged particles. This paper describes the creation of MCNP data libraries at Los Alamos National Laboratory based on three ENDF/B-VII.0 sub-libraries: neutrons, protons, and thermal scattering. An ACE-formatted continuous-energy neutron data library called ENDF70 for MCNP has been produced. This library provides data for 390 materials at five temperatures: 293.6, 600, 900, 1200, and 2500 K. The library was processed primarily with Version 248 of NJOY99. Extensive checking and quality-assurance tests were applied to the data. Improvements to the processing code were made and certain evaluations were modified as a result of these tests. ENDF/B-VII.0 included proton evaluations for 48 target materials. Forty-seven proton evaluations (all except for 13C) were processed at room temperature and combined into the MCNP library ENDF70PROT. Neutron thermal S(α,β) scattering data exist for twenty different materials in ENDF/B-VII.0. All twenty of these evaluations were processed at all applicable temperatures (these vary for each evaluation), and combined into the MCNP library ENDF70SAB. All of these ENDF/B-VII.0 based MCNP libraries (ENDF70, ENDF70PROT, and ENDF70SAB) are available as part of the MCNP5 1.50 release. (authors)

  3. The Bayesian Covariance Lasso.

    Science.gov (United States)

    Khondker, Zakaria S; Zhu, Hongtu; Chu, Haitao; Lin, Weili; Ibrahim, Joseph G

    2013-04-01

    Estimation of sparse covariance matrices and their inverses subject to positive definiteness constraints has drawn a lot of attention in recent years. The abundance of high-dimensional data, where the sample size (n) is less than the dimension (d), requires shrinkage estimation methods, since the maximum likelihood estimator is not positive definite in this case. Furthermore, when n is larger than d but not sufficiently larger, shrinkage estimation is more stable than maximum likelihood, as it reduces the condition number of the precision matrix. Frequentist methods have utilized penalized likelihood methods, whereas Bayesian approaches rely on matrix decompositions or Wishart priors for shrinkage. In this paper we propose a new method, called the Bayesian Covariance Lasso (BCLASSO), for the shrinkage estimation of a precision (covariance) matrix. We consider a class of priors for the precision matrix that leads to the popular frequentist penalties as special cases, develop a Bayes estimator for the precision matrix, and propose an efficient sampling scheme that does not precalculate boundaries for positive definiteness. The proposed method is permutation invariant and performs shrinkage and estimation simultaneously for non-full-rank data. Simulations show that the proposed BCLASSO performs similarly to frequentist methods for non-full-rank data.
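
    BCLASSO itself requires MCMC machinery, but the core motivation stated above, that the sample covariance is singular when n < d and needs shrinkage toward a positive definite target, can be demonstrated with a simple linear shrinkage estimator (an illustration only, not the authors' method):

```python
import numpy as np

def shrink_covariance(X, alpha=0.2):
    """Linear shrinkage of the sample covariance toward a scaled identity,
    guaranteeing positive definiteness even when n < d."""
    S = np.cov(X, rowvar=False)
    d = S.shape[0]
    target = (np.trace(S) / d) * np.eye(d)
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(1)
X = rng.standard_normal((10, 25))   # n=10 samples, d=25 variables: S is rank-deficient

S = np.cov(X, rowvar=False)
S_shrunk = shrink_covariance(X)
print(np.linalg.eigvalsh(S).min())        # ~0: singular sample covariance
print(np.linalg.eigvalsh(S_shrunk).min()) # strictly positive after shrinkage
```

Shrinkage also reduces the condition number in the n > d regime, which is the stability argument made in the abstract.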

  4. Discovery of potent inhibitors of soluble epoxide hydrolase by combinatorial library design and structure-based virtual screening.

    Science.gov (United States)

    Xing, Li; McDonald, Joseph J; Kolodziej, Steve A; Kurumbail, Ravi G; Williams, Jennifer M; Warren, Chad J; O'Neal, Janet M; Skepner, Jill E; Roberds, Steven L

    2011-03-10

    Structure-based virtual screening was applied to design combinatorial libraries to discover novel and potent soluble epoxide hydrolase (sEH) inhibitors. X-ray crystal structures revealed unique interactions for a benzoxazole template in addition to the conserved hydrogen bonds with the catalytic machinery of sEH. By exploitation of the favorable binding elements, two iterations of library design based on amide coupling were employed, guided principally by the docking results of the enumerated virtual products. Biological screening of the libraries demonstrated a hit rate as high as 90%, of which over two dozen compounds were single-digit nanomolar sEH inhibitors by IC50 determination. In total, the library design and synthesis produced more than 300 submicromolar sEH inhibitors. In cellular systems, activities consistent with the biochemical measurements were demonstrated. The SAR understanding of the benzoxazole template provides valuable insights into the discovery of novel sEH inhibitors as therapeutic agents.

  5. Promoter library-based module combination (PLMC) technology for optimization of threonine biosynthesis in Corynebacterium glutamicum.

    Science.gov (United States)

    Wei, Liang; Xu, Ning; Wang, Yiran; Zhou, Wei; Han, Guoqiang; Ma, Yanhe; Liu, Jun

    2018-05-01

    Due to the lack of efficient control elements and tools, the fine-tuning of gene expression in multi-gene metabolic pathways is still a great challenge for engineering microbial cell factories, especially for the important industrial microorganism Corynebacterium glutamicum. In this study, the promoter library-based module combination (PLMC) technology was developed to efficiently optimize the expression of genes in C. glutamicum. A random promoter library was designed to contain the putative -10 (NNTANANT) and -35 (NNGNCN) consensus motifs and was refined through a three-step screening procedure, consisting of fluorescence-activated cell sorting (FACS) screening, agar plate screening, and 96-well plate screening, to yield numerous genetic control elements with different strength levels. Multiple conventional strategies were employed for further precise characterization of the promoter library, such as real-time quantitative PCR, sodium dodecyl sulfate polyacrylamide gel electrophoresis, FACS analysis, and the lacZ reporter system. These results suggested that the established promoter elements effectively regulated gene expression and showed varying strengths over a wide range. Subsequently, a multi-module combination technology was created based on the efficient promoter elements for combination and optimization of modules in multi-gene pathways. Using this technology, the threonine biosynthesis pathway was reconstructed and optimized by predictably tuning the expression of five modules in C. glutamicum. The threonine titer of the optimized strain was significantly improved to 12.8 g/L, approximately 6.1-fold higher than that of the control strain. Overall, the PLMC technology presented in this study provides a rapid and effective method for combination and optimization of multi-gene pathways in C. glutamicum.

  6. A graph-based approach to construct target-focused libraries for virtual screening.

    Science.gov (United States)

    Naderi, Misagh; Alvin, Chris; Ding, Yun; Mukhopadhyay, Supratik; Brylinski, Michal

    2016-01-01

    Due to exorbitant costs of high-throughput screening, many drug discovery projects commonly employ inexpensive virtual screening to support experimental efforts. However, the vast majority of compounds in widely used screening libraries, such as the ZINC database, will have a very low probability to exhibit the desired bioactivity for a given protein. Although combinatorial chemistry methods can be used to augment existing compound libraries with novel drug-like compounds, the broad chemical space is often too large to be explored. Consequently, the trend in library design has shifted to produce screening collections specifically tailored to modulate the function of a particular target or a protein family. Assuming that organic compounds are composed of sets of rigid fragments connected by flexible linkers, a molecule can be decomposed into its building blocks tracking their atomic connectivity. On this account, we developed eSynth, an exhaustive graph-based search algorithm to computationally synthesize new compounds by reconnecting these building blocks following their connectivity patterns. We conducted a series of benchmarking calculations against the Directory of Useful Decoys, Enhanced database. First, in a self-benchmarking test, the correctness of the algorithm is validated with the objective to recover a molecule from its building blocks. Encouragingly, eSynth can efficiently rebuild more than 80 % of active molecules from their fragment components. Next, the capability to discover novel scaffolds is assessed in a cross-benchmarking test, where eSynth successfully reconstructed 40 % of the target molecules using fragments extracted from chemically distinct compounds. Despite an enormous chemical space to be explored, eSynth is computationally efficient; half of the molecules are rebuilt in less than a second, whereas 90 % take only about a minute to be generated. eSynth can successfully reconstruct chemically feasible molecules from molecular fragments
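
    The building-block idea behind eSynth, rigid fragments exhaustively reconnected according to recorded connectivity, can be conveyed with a toy enumerator over fragment strings and linkers. This is a purely schematic sketch, not the eSynth algorithm, and the fragment strings below are invented:

```python
from itertools import permutations

def enumerate_chains(fragments, linkers):
    """Toy exhaustive 'synthesis': join every ordering of rigid fragments
    with every choice of flexible linker between them."""
    products = set()
    for order in permutations(fragments):
        for link in linkers:
            products.add(link.join(order))
    return sorted(products)

# Hypothetical SMILES-like fragment strings and linker tokens
frags = ["c1ccccc1", "C(=O)N", "CCO"]
mols = enumerate_chains(frags, linkers=["", "C", "CC"])
print(len(mols))   # 3! orderings x 3 linkers = 18 candidate strings
```

Even in this toy, the count grows factorially with the number of fragments, which is why eSynth's efficiency claims (most molecules rebuilt in under a second) matter in practice.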

  7. OpenCL-Based Linear Algebra Libraries for High-Performance Computing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite its promise, OpenCL adoption is slow, owing to a lack of libraries and tools. Vendors have shown few signs of plans to provide OpenCL libraries, and were...

  8. Lorentz Covariance of Langevin Equation

    International Nuclear Information System (INIS)

    Koide, T.; Denicol, G.S.; Kodama, T.

    2008-01-01

    Relativistic covariance of a Langevin type equation is discussed. The requirement of Lorentz invariance generates an entanglement between the force and noise terms so that the noise itself should not be a covariant quantity. (author)

  9. On an extension of covariance

    International Nuclear Information System (INIS)

    Sebestyen, A.

    1975-07-01

    The principle of covariance is extended to coordinates corresponding to internal degrees of freedom. The conditions for a system to be isolated are given. It is shown how internal forces arise in such systems. Equations for internal fields are derived. By an interpretation of the generalized coordinates based on group theory it is shown how particles of the ordinary sense enter into the model and as a simple application the gravitational interaction of two pointlike particles is considered and the shift of the perihelion is deduced. (Sz.Z.)

  10. A broad-group cross-section library based on ENDF/B-VII.0 for fast neutron dosimetry Applications

    Energy Technology Data Exchange (ETDEWEB)

    Alpan, F.A. [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States)

    2011-07-01

    A new ENDF/B-VII.0-based coupled 44-neutron, 20-gamma-ray-group cross-section library was developed to investigate the latest evaluated nuclear data file (ENDF), in comparison to ENDF/B-VI.3 used in BUGLE-96, as well as to generate an objective-specific library. The objectives selected for this work consisted of dosimetry calculations for in-vessel and ex-vessel reactor locations, iron atom displacement calculations for reactor internals and the pressure vessel, and the 58Ni(n,γ) calculation that is important for gas generation in the baffle plate. The new library was generated based on the contribution and point-wise cross-section-driven (CPXSD) methodology and was applied to one of the most widely used benchmarks, the Oak Ridge National Laboratory Pool Critical Assembly benchmark problem. In addition to the new library, BUGLE-96 and an ENDF/B-VII.0-based coupled 47-neutron, 20-gamma-ray-group cross-section library were used with both SNLRML and IRDF dosimetry cross sections to compute reaction rates. All reaction rates computed by the multigroup libraries are within ±20% of the measurement data and meet the U.S. Nuclear Regulatory Commission acceptance criterion for reactor vessel neutron exposure evaluations specified in Regulatory Guide 1.190. (authors)
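
    Broad-group libraries like this one are produced by collapsing fine-group cross sections with a weighting flux, σ_G = Σ_g σ_g φ_g / Σ_g φ_g over the fine groups g in broad group G. A toy sketch of that collapse step (invented numbers; not the CPXSD group-selection methodology itself):

```python
import numpy as np

def collapse(sigma_fine, flux_fine, group_map):
    """Flux-weighted collapse of fine-group cross sections into broad groups.
    group_map[g] gives the broad-group index of fine group g."""
    n_broad = max(group_map) + 1
    sigma_broad = np.zeros(n_broad)
    for G in range(n_broad):
        idx = [g for g, m in enumerate(group_map) if m == G]
        phi = flux_fine[idx]
        sigma_broad[G] = np.dot(sigma_fine[idx], phi) / phi.sum()
    return sigma_broad

# 6 fine groups collapsed into 2 broad groups (made-up values, barns)
sigma = np.array([2.0, 3.0, 4.0, 10.0, 12.0, 20.0])
flux  = np.array([1.0, 1.0, 2.0, 0.5, 0.3, 0.2])
sigma_broad = collapse(sigma, flux, group_map=[0, 0, 0, 1, 1, 1])
print(sigma_broad)
```

The choice of weighting spectrum and of the group boundaries is exactly what objective-driven methods such as CPXSD optimize, since a poor choice biases the broad-group reaction rates.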

  11. Potentials of fee-based library services in Nigeria: with a case report ...

    African Journals Online (AJOL)

    Libraries have traditionally been custodians of information which are provided free of charge to users. Recent decline in funds to libraries and the change in the concept of information from a free resource to a marketable resource in the information age have necessitated the re-evaluation of free services in the libraries.

  12. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.
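
    The Buckley-James step described above, replacing a censored observation by its expected failure time conditional on the censoring time, has a closed form when residuals are modeled as normal: E[Y | Y > c] = μ + σ·φ(α)/(1 − Φ(α)) with α = (c − μ)/σ. A univariate sketch of that imputation (the bivariate conditional expectations in this work are more involved, and the numbers below are invented):

```python
from scipy.stats import norm

def impute_censored(mu, sigma, c):
    """E[Y | Y > c] for Y ~ N(mu, sigma^2): a censored value is replaced
    by its conditional expectation (inverse Mills ratio formula)."""
    alpha = (c - mu) / sigma
    return mu + sigma * norm.pdf(alpha) / norm.sf(alpha)

# A survival time right-censored at 12 months under a fitted N(10, 4^2) model
y_imp = impute_censored(mu=10.0, sigma=4.0, c=12.0)
print(y_imp)   # always exceeds the censoring time c
```

Model estimation then proceeds on the revised data set of uncensored values and imputed conditional expectations, as the abstract describes.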

  13. Creating a web-based digital photographic archive: one hospital library's experience.

    Science.gov (United States)

    Marshall, Caroline; Hobbs, Janet

    2017-04-01

    Cedars-Sinai Medical Center is a nonprofit community hospital based in Los Angeles. Its history spans over 100 years, and its growth and development from the merging of 2 Jewish hospitals, Mount Sinai and Cedars of Lebanon, is also part of the history of Los Angeles. The medical library collects and maintains the hospital's photographic archive, to which retiring physicians, nurses, and an active Community Relations Department have donated photographs over the years. The collection was growing rapidly, it was impossible to display all the materials, and much of the collection was inaccessible to patrons. The authors decided to make the photographic collection more accessible to medical staff and researchers by purchasing a web-based digital archival package, Omeka. We decided what material should be digitized by analyzing archival reference requests and considering the institution's plan to create a Timeline Wall documenting and celebrating the history of Cedars-Sinai. Within 8 months, we digitized and indexed over 500 photographs. The digital archive now allows patrons and researchers to access the history of the hospital and enables the library to process archival references more efficiently.

  14. phpMs: A PHP-Based Mass Spectrometry Utilities Library.

    Science.gov (United States)

    Collins, Andrew; Jones, Andrew R

    2018-03-02

    The recent establishment of cloud computing, high-throughput networking, and more versatile web standards and browsers has led to a renewed interest in web-based applications. While traditionally big data has been the domain of optimized desktop and server applications, it is now possible to store vast amounts of data and perform the necessary calculations offsite with cloud storage and computing providers, with the results visualized in a high-quality cross-platform interface via a web browser. There are a number of emerging platforms for cloud-based mass spectrometry data analysis; however, there is limited pre-existing code accessible to web developers, especially those constrained to a shared hosting environment where Java and C applications are often forbidden by the hosting provider. To remedy this, we provide an open-source mass spectrometry library for one of the most commonly used web development languages, PHP. Our new library, phpMs, provides objects for storing and manipulating spectra and identification data as well as utilities for file reading, file writing, calculations, peptide fragmentation, and protein digestion, as well as a software interface for controlling search engines. We provide a working demonstration of some of the capabilities at http://pgb.liv.ac.uk/phpMs.
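
    The mass-calculation utilities in libraries of this kind rest on summing residue monoisotopic masses plus one water, with m/z = (M + z·m_proton)/z for protonated ions. A Python sketch of that calculation (phpMs itself is PHP, and the partial residue table here is only for illustration):

```python
# Monoisotopic residue masses (Da) for a subset of amino acids
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "P": 97.05276, "T": 101.04768,
    "D": 115.02694, "E": 129.04259, "I": 113.08406, "L": 113.08406,
}
WATER = 18.010565    # mass of H2O added for the peptide termini
PROTON = 1.007276    # mass of a proton

def peptide_mass(seq):
    """Neutral monoisotopic mass: sum of residue masses plus one water."""
    return sum(RESIDUE_MASS[aa] for aa in seq) + WATER

def mz(seq, charge):
    """m/z of the [M + zH]^z+ ion."""
    return (peptide_mass(seq) + charge * PROTON) / charge

print(round(peptide_mass("PEPTIDE"), 4))   # ~799.36 Da
```

Fragmentation utilities build on the same arithmetic, summing prefix or suffix residues (b/y ions) instead of the whole sequence.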

  15. Toward a consistency cross-check of eddy covariance flux–based and biometric estimates of ecosystem carbon balance

    DEFF Research Database (Denmark)

    Luyssaert, S.; Reichstein, M.; Schulze, E.-D.

    2009-01-01

    Quantification of an ecosystem's carbon balance and its components is pivotal for understanding both ecosystem functioning and global cycling. Several methods are being applied in parallel to estimate the different components of the CO2 balance. However, different methods are subject to different...... sources of error. Therefore, it is necessary that site level component estimates are cross-checked against each other before being reported. Here we present a two-step approach for testing the accuracy and consistency of eddy covariance–based gross primary production (GPP) and ecosystem respiration (Re...

  16. Distance covariance for stochastic processes

    DEFF Research Database (Denmark)

    Matsui, Muneya; Mikosch, Thomas Valentin; Samorodnitsky, Gennady

    2017-01-01

    The distance covariance of two random vectors is a measure of their dependence. The empirical distance covariance and correlation can be used as statistical tools for testing whether two random vectors are independent. We propose an analog of the distance covariance for two stochastic processes...
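The empirical distance covariance mentioned above has a compact closed form for samples: double-center the pairwise distance matrices of each variable and average their elementwise product. A minimal sketch (not the authors' code) for real-valued samples:

```python
import numpy as np

def dcov2(x: np.ndarray, y: np.ndarray) -> float:
    """Squared empirical distance covariance (V-statistic) of two 1-D samples."""
    a = np.abs(x[:, None] - x[None, :])   # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    # Double centering: subtract row and column means, add back the grand mean.
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    return float((A * B).mean())

def dcor(x: np.ndarray, y: np.ndarray) -> float:
    """Empirical distance correlation in [0, 1]; 0 characterizes independence."""
    v = dcov2(x, x) * dcov2(y, y)
    return float(np.sqrt(max(dcov2(x, y), 0.0) / np.sqrt(v))) if v > 0 else 0.0

rng = np.random.default_rng(0)
x = rng.normal(size=200)
print(round(dcor(x, x), 6))   # → 1.0 (a variable is perfectly dependent on itself)
```

In a permutation test one would compare the observed statistic against its distribution under random re-pairings of the two samples.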

  17. EJ2-MCNPlib. Contents of the JEF-2.2 based neutron cross-section library for MCNP4A

    International Nuclear Information System (INIS)

    Hogenbirk, A.; Oppe, J.

    1995-05-01

    In this report a description is given of the EJ2-MCNPlib library. The EJ2-MCNPlib library is to be used for reactivity/criticality calculations and general neutron/photon transport calculations with the Monte Carlo code MCNP4A. The library is based on the European JEF-2.2 nuclear data evaluation and contains data for all (i.e., 313) nuclides available in this evaluation. The cross-section data were generated using the NJOY cross-section processing code system, version 91.118. For easy reference, cross-section plots are given in this report for the total, elastic, and absorption cross sections of all nuclides in the EJ2-MCNPlib library. Furthermore, for verification purposes a graphical intercomparison is given of the results of standard benchmark calculations performed with JEF-2.2 cross-section data and with ENDF/B-V cross-section data (whenever available). 6 refs

  18. Measuring patrons' technology habits: an evidence-based approach to tailoring library services.

    Science.gov (United States)

    Wu, Jin; Chatfield, Amy J; Hughes, Annie M; Kysh, Lynn; Rosenbloom, Megan Curran

    2014-04-01

    Librarians continually integrate new technologies into library services for health sciences students. Recently published data are lacking about student ownership of technological devices, awareness of new technologies, and interest in using devices and technologies to interact with the library. A survey was implemented at seven health sciences libraries to help answer these questions. Results show that librarian assumptions about awareness of technologies are not supported, and student interest in using new technologies to interact with the library varies widely. Collecting this evidence provides useful information for successfully integrating technologies into library services.

  19. SCALE-6 Sensitivity/Uncertainty Methods and Covariance Data

    International Nuclear Information System (INIS)

    Williams, Mark L.; Rearden, Bradley T.

    2008-01-01

    Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library based on ENDF/B-VII and other nuclear data evaluations, supplemented by 'low-fidelity' approximate covariances. SCALE (Standardized Computer Analyses for Licensing Evaluation) is a modular code system developed by Oak Ridge National Laboratory (ORNL) to perform calculations for criticality safety, reactor physics, and radiation shielding applications. SCALE calculations typically use sequences that execute a predefined series of executable modules to compute particle fluxes and responses such as the critical multiplication factor. SCALE also includes modules for sensitivity and uncertainty (S/U) analysis of calculated responses. The S/U codes in SCALE are collectively referred to as TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation). SCALE-6, scheduled for release in 2008, contains significant new capabilities, including important enhancements in S/U methods and data. The main functions of TSUNAMI are to (a) compute nuclear data sensitivity coefficients and response uncertainties, (b) establish similarity between benchmark experiments and design applications, and (c) reduce uncertainty in calculated responses by consolidating integral benchmark experiments. TSUNAMI includes easy-to-use graphical user interfaces for defining problem input and viewing three-dimensional (3D) geometries, as well as an integrated plotting package.
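The response-uncertainty propagation this kind of S/U analysis performs rests on the first-order "sandwich rule": the relative variance of a response R is s^T C s, where s holds relative sensitivity coefficients and C is the relative covariance matrix of the nuclear data. A hedged numerical sketch with invented toy values (not SCALE data or TSUNAMI output):

```python
import numpy as np

# Relative sensitivities dR/R per dσ/σ for three hypothetical cross sections.
s = np.array([0.45, -0.20, 0.05])
# Toy relative covariance matrix for those same cross sections.
C = np.array([
    [0.0004, 0.0001, 0.0],
    [0.0001, 0.0009, 0.0],
    [0.0,    0.0,    0.0025],
])

rel_var = float(s @ C @ s)        # sandwich rule: relative variance of R
rel_std = float(np.sqrt(rel_var)) # relative standard deviation
print(f"relative uncertainty in R: {100 * rel_std:.3f}%")
```

The off-diagonal terms are what make processed covariance libraries (rather than bare variances) necessary: correlated data can either inflate or cancel the propagated uncertainty.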

  20. Updated Covariance Processing Capabilities in the AMPX Code System

    International Nuclear Information System (INIS)

    Wiarda, Dorothea; Dunn, Michael E.

    2007-01-01

    A concerted effort is in progress within the nuclear data community to provide new cross-section covariance data evaluations to support sensitivity/uncertainty analyses of fissionable systems. The objective of this work is to update processing capabilities of the AMPX library to process the latest Evaluated Nuclear Data File (ENDF)/B formats to generate covariance data libraries for radiation transport software such as SCALE. The module PUFF-IV was updated to allow processing of new ENDF covariance formats in the resolved resonance region. In the resolved resonance region, covariance matrices are given in terms of resonance parameters, which need to be processed into covariance matrices with respect to the group-averaged cross-section data. The parameter covariance matrix can be quite large if the evaluation has many resonances. The PUFF-IV code has recently been used to process an evaluation of 235U, which was prepared in collaboration between Oak Ridge National Laboratory and Los Alamos National Laboratory.

  1. Calculation and analysis of the source term of the reactor core based on different data libraries

    International Nuclear Information System (INIS)

    Chen Haiying; Zhang Chunming; Wang Shaowei; Lan Bing; Liu Qiaofeng; Han Jingru

    2014-01-01

    The nuclear fuel in the reactor core produces a large amount of radioactive nuclides in the fission process. ORIGEN-S can calculate the accumulation and decay of radioactive nuclides in the core by using various forms of data libraries, including the card-image library, the binary library, and the ORIGEN-S cross-section library generated by ARP through an interpolation method. In this paper, the information in each data library is described, and the reactor core inventory was calculated using the card-image library and the ARP library. The radioactivity concentration of typical nuclides as a function of fuel burnup was analyzed. The results showed that the influence of the data libraries on the calculated nuclide radioactivity varies. Compared to the card-image library, the radioactivities of a small number of nuclides calculated with the ARP library were larger, while those of 134Cs and 136Cs were smaller by about 15%. For some typical nuclides, the difference between the radioactivities calculated by the two libraries increased with deepening fuel burnup, although the ratios of the calculated radioactivities evolved differently. (authors)

  2. A jackknife approach to quantifying single-trial correlation between covariance-based metrics undefined on a single-trial basis

    NARCIS (Netherlands)

    Richter, C.G.; Thompson, W.H.; Bosman, C.A.; Fries, P.

    2015-01-01

    The quantification of covariance between neuronal activities (functional connectivity) requires the observation of correlated changes and therefore multiple observations. The strength of such neuronal correlations may itself undergo moment-by-moment fluctuations, which might e.g. lead to

  3. Changing State Digital Libraries

    Science.gov (United States)

    Pappas, Marjorie L.

    2006-01-01

    Research has shown that state virtual or digital libraries are evolving into websites that are loaded with free resources, subscription databases, and instructional tools. In this article, the author explores these evolving libraries based on the following questions: (1) How user-friendly are the state digital libraries?; (2) How do state digital…

  4. Generating "fragment-based virtual library" using pocket similarity search of ligand-receptor complexes.

    Science.gov (United States)

    Khashan, Raed S

    2015-01-01

    As the number of available ligand-receptor complexes is increasing, researchers are becoming more dedicated to mine these complexes to aid in the drug design and development process. We present free software which is developed as a tool for performing similarity search across ligand-receptor complexes for identifying binding pockets which are similar to that of a target receptor. The search is based on 3D-geometric and chemical similarity of the atoms forming the binding pocket. For each match identified, the ligand's fragment(s) corresponding to that binding pocket are extracted, thus forming a virtual library of fragments (FragVLib) that is useful for structure-based drug design. The program provides a very useful tool to explore available databases.

  5. Earth Observing System Covariance Realism

    Science.gov (United States)

    Zaidi, Waqar H.; Hejduk, Matthew D.

    2016-01-01

    The purpose of covariance realism is to properly size a primary object's covariance in order to add validity to the calculation of the probability of collision. The covariance realism technique in this paper consists of three parts: collection/calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics. An empirical cumulative distribution function (ECDF) Goodness-of-Fit (GOF) method is employed to determine whether a covariance is properly sized by comparing the empirical distribution of Mahalanobis distance calculations to the hypothesized parent 3-DoF chi-squared distribution. To realistically size a covariance for collision probability calculations, this study uses a state noise compensation algorithm that adds process noise to the definitive epoch covariance to account for uncertainty in the force model. Process noise is added until the GOF tests pass a group significance level threshold. The results of this study indicate that when outliers attributed to persistently high or extreme levels of solar activity are removed, the covariance realism compensation method produces a tuned covariance with up to 80 to 90% of the covariance propagation timespan passing the GOF tests (against a 60% minimum passing threshold), a quite satisfactory and useful result.
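The core of the test above is that squared Mahalanobis distances of 3-D errors should follow a chi-squared distribution with 3 degrees of freedom (mean 3) when the covariance is properly sized. A simplified numerical illustration (not the study's code, and using a made-up diagonal covariance) shows how an undersized covariance inflates the statistic:

```python
import numpy as np

rng = np.random.default_rng(42)
P_true = np.diag([4.0, 1.0, 0.25])   # hypothetical "true" 3x3 position covariance
errors = rng.multivariate_normal(np.zeros(3), P_true, size=20000)

def mean_mahalanobis_sq(errs: np.ndarray, P: np.ndarray) -> float:
    """Average squared Mahalanobis distance e^T P^-1 e over all error samples."""
    Pinv = np.linalg.inv(P)
    return float(np.einsum("ij,jk,ik->i", errs, Pinv, errs).mean())

print(mean_mahalanobis_sq(errors, P_true))        # ≈ 3: covariance is realistic
print(mean_mahalanobis_sq(errors, 0.5 * P_true))  # ≈ 6: undersized covariance
```

The paper's ECDF GOF test is stricter than this mean check, comparing the full empirical distribution against the chi-squared reference, but the sizing intuition is the same.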

  6. Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web.

    Science.gov (United States)

    Miller, Chase A; Anthony, Jon; Meyer, Michelle M; Marth, Gabor

    2013-02-01

    High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported.

  7. Comparison of chamber and eddy covariance-based CO2 and CH4 emission estimates in a heterogeneous grass ecosystem on peat

    International Nuclear Information System (INIS)

    Schrier-Uijl, A.P.; Berendse, F.; Veenendaal, E.M.; Kroon, P.S.; Hensen, A.; Leffelaar, P.A.

    2010-08-01

    Fluxes of methane (CH4) and carbon dioxide (CO2) estimated by empirical models based on small-scale chamber measurements were compared to large-scale eddy covariance (EC) measurements for CH4 and to a combination of EC measurements and EC-based models for CO2. The experimental area was a flat peat meadow in the Netherlands with heterogeneous source strengths for both greenhouse gases. Two scenarios were used to assess the importance of stratifying the landscape into landscape elements before up-scaling the fluxes measured by chambers to landscape scale: one took the main landscape elements into account (field, ditch edge, ditch); the other took only the field into account. Non-linear regression models were used to up-scale the chamber measurements to field emission estimates. The EC CO2 respiration estimate consisted of measured nighttime EC fluxes and modeled daytime fluxes using the Arrhenius model. The EC CH4 flux estimate was based on daily averages, and the remaining data gaps were filled by linear interpolation. The EC- and chamber-based estimates agreed well when the three landscape elements were taken into account, with 16.5% and 13.0% difference for CO2 respiration and CH4, respectively. However, the two methods differed by 31.0% and 55.1% for CO2 respiration and CH4 when only field emissions were taken into account in up-scaling chamber measurements to landscape scale. This emphasizes the importance of stratifying the landscape into landscape elements. The conclusion is that small-scale chamber measurements can be used to estimate fluxes of CO2 and CH4 at landscape scale if fluxes are scaled by the different landscape elements.
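The stratified up-scaling step itself is a simple area-weighted sum over landscape elements. A minimal sketch with invented area fractions and fluxes (not the paper's data) shows why ignoring strong but small sources such as ditches biases the landscape estimate:

```python
# Hypothetical area fractions and chamber-derived CH4 fluxes per element.
area_fraction = {"field": 0.80, "ditch_edge": 0.12, "ditch": 0.08}
ch4_flux = {"field": 0.4, "ditch_edge": 1.1, "ditch": 3.0}  # mg CH4 m-2 h-1

# Scenario 1: area-weighted sum over all landscape elements.
landscape_flux = sum(area_fraction[e] * ch4_flux[e] for e in area_fraction)
# Scenario 2: treat the whole landscape as "field".
field_only = ch4_flux["field"]

print(round(landscape_flux, 3))  # → 0.692
print(field_only)                # → 0.4, underestimating the landscape flux
```

Even though ditches cover only 8% of this toy landscape, their high source strength shifts the landscape flux well above the field-only value.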

  8. Exploring a physico-chemical multi-array explanatory model with a new multiple covariance-based technique: structural equation exploratory regression.

    Science.gov (United States)

    Bry, X; Verron, T; Cazes, P

    2009-05-29

    In this work, we consider chemical and physical variable groups describing a common set of observations (cigarettes). One of the groups, minor smoke compounds (minSC), is assumed to depend on the others (minSC predictors). PLS regression (PLSR) of minSC on the set of all predictors appears not to lead to a satisfactory analytic model, because it does not take into account the expert's knowledge. PLS path modeling (PLSPM) does not use the multidimensional structure of predictor groups. Indeed, the expert needs to separate the influence of several pre-designed predictor groups on minSC, in order to see what dimensions this influence involves. To meet these needs, we consider a multi-group component-regression model, and propose a method to extract from each group several strong uncorrelated components that fit the model. Estimation is based on a global multiple covariance criterion, used in combination with an appropriate nesting approach. Compared to PLSR and PLSPM, the structural equation exploratory regression (SEER) we propose fully uses predictor group complementarity, both conceptually and statistically, to predict the dependent group.

  9. Status of multigroup sensitivity profiles and covariance matrices available from the radiation shielding information center

    International Nuclear Information System (INIS)

    Roussin, R.W.; Drischler, J.D.; Marable, J.H.

    1980-01-01

    In recent years, multigroup sensitivity profiles and covariance matrices have been added to the Radiation Shielding Information Center's Data Library Collection (DLC). Sensitivity profiles are available in a single package, DLC-45/SENPRO, and covariance matrices are found in two packages, DLC-44/COVERX and DLC-77/COVERV. The contents of these packages are described and their availability is discussed.

  10. SG39 Deliverables. Comments on Covariance Data

    International Nuclear Information System (INIS)

    Yokoyama, Kenji

    2015-01-01

    The covariance matrix of a scattered data set, x_i (i=1,…,n), must be symmetric and positive-definite. As one of the WPEC/SG39 contributions to the SG40/CIELO project, several comments and recommendations on the covariance data are given here from the viewpoint of nuclear-data users. To make the comments concrete and useful for nuclear-data evaluators, the covariance data of the latest evaluated nuclear data libraries, JENDL-4.0 and ENDF/B-VII.1, are treated here as the representative materials. The surveyed nuclides are the five isotopes most important for fast reactor applications. The nuclides, reactions, and energy regions dealt with are as follows: Pu-239: fission (2.5∼10 keV) and capture (2.5∼10 keV); U-235: fission (500 eV∼10 keV) and capture (500 eV∼30 keV); U-238: fission (1∼10 MeV), capture (below 20 keV, 20∼150 keV), inelastic (above 100 keV), and elastic (above 20 keV); Fe-56: elastic (below 850 keV) and average scattering cosine (above 10 keV); and Na-23: capture (600 eV∼600 keV), inelastic (above 1 MeV), and elastic (around 2 keV)
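The symmetry and positive-definiteness requirement stated above is mechanically checkable. A hedged sketch of the basic sanity check a nuclear-data user might run on a processed covariance matrix (the function name is illustrative, not from any evaluation tool):

```python
import numpy as np

def is_valid_covariance(M: np.ndarray, tol: float = 1e-10) -> bool:
    """True iff M is symmetric and positive-definite."""
    if not np.allclose(M, M.T, atol=tol):
        return False                 # must be symmetric
    try:
        np.linalg.cholesky(M)        # succeeds iff M is positive-definite
        return True
    except np.linalg.LinAlgError:
        return False

good = np.array([[0.04, 0.01], [0.01, 0.09]])
bad = np.array([[0.04, 0.10], [0.10, 0.09]])   # implied correlation > 1: not PD
print(is_valid_covariance(good), is_valid_covariance(bad))  # → True False
```

The `bad` matrix is symmetric but has a negative determinant, i.e. an off-diagonal covariance larger than the geometric mean of the variances, which is exactly the kind of defect evaluators are asked to screen out.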

  11. Performance Improvement with Web Based Database on Library Information System of Smk Yadika 5

    Directory of Open Access Journals (Sweden)

    Pualam Dipa Nusantara

    2015-12-01

    The difficulty of managing data on the book collection is a problem often faced by librarians, and it affects the quality of service. The collection had been arranged and recorded in separate Word and Excel files, and transactions for borrowing and returning books had no integrated records. A library system can manage the book collection and reduce the problems library staff often experience when serving students who borrow books, in particular the frequent difficulty of tracking books that are still on loan. The system also records late fees and lost-book charges incurred by students (borrowers). The conclusion of this study is that library performance can be improved with a library system using a web database.

  12. Contributions to Large Covariance and Inverse Covariance Matrices Estimation

    OpenAIRE

    Kang, Xiaoning

    2016-01-01

    Estimation of covariance matrix and its inverse is of great importance in multivariate statistics with broad applications such as dimension reduction, portfolio optimization, linear discriminant analysis and gene expression analysis. However, accurate estimation of covariance or inverse covariance matrices is challenging due to the positive definiteness constraint and large number of parameters, especially in the high-dimensional cases. In this thesis, I develop several approaches for estimat...
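One standard remedy for the high-dimensional difficulty the abstract alludes to is linear shrinkage of the sample covariance toward a scaled identity, which restores positive definiteness when the number of variables p exceeds the number of samples n. A hedged sketch (the fixed shrinkage weight is illustrative; Ledoit-Wolf-style estimators choose it from the data):

```python
import numpy as np

def shrink_covariance(X: np.ndarray, lam: float = 0.2) -> np.ndarray:
    """Linear shrinkage: (1 - lam) * S + lam * mu * I, with mu the mean variance."""
    S = np.cov(X, rowvar=False)        # p x p sample covariance
    mu = np.trace(S) / S.shape[0]      # average variance across the p variables
    return (1 - lam) * S + lam * mu * np.eye(S.shape[0])

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 50))          # n = 20 samples, p = 50 variables
S = np.cov(X, rowvar=False)
S_shrunk = shrink_covariance(X)

print(np.linalg.eigvalsh(S).min() < 1e-8)       # sample covariance is singular
print(np.linalg.eigvalsh(S_shrunk).min() > 0)   # shrunk estimate is PD
```

Because the sample covariance has rank at most n - 1 < p here, it cannot be inverted; the identity component guarantees a strictly positive smallest eigenvalue.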

  13. Project GRACE A grid based search tool for the global digital library

    CERN Document Server

    Scholze, Frank; Vigen, Jens; Prazak, Petra; The Seventh International Conference on Electronic Theses and Dissertations

    2004-01-01

    The paper will report on the progress of an ongoing EU project called GRACE - Grid Search and Categorization Engine (http://www.grace-ist.org). The project participants are CERN, Sheffield Hallam University, Stockholm University, Stuttgart University, GL 2006 and Telecom Italia. The project started in 2002 and will finish in 2005, resulting in a Grid-based search engine that will search across a variety of content sources including a number of electronic thesis and dissertation repositories. The Open Archives Initiative (OAI) is expanding and is clearly an interesting movement for a community advocating open access to ETDs. However, the OAI approach alone may not be sufficiently scalable to achieve a truly global ETD Digital Library. Many universities simply offer their collections to the world via their local web services without being part of any federated system for archiving, and even those dissertations that are provided with OAI compliant metadata will not necessarily be picked up by a centralized OAI Ser...

  14. RINGMesh: A programming library for developing mesh-based geomodeling applications

    Science.gov (United States)

    Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume

    2017-07-01

    RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software package. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows developers to implement new geomodeling methods and to plug in external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.

  15. Covariant Lyapunov vectors

    International Nuclear Information System (INIS)

    Ginelli, Francesco; Politi, Antonio; Chaté, Hugues; Livi, Roberto

    2013-01-01

    Recent years have witnessed a growing interest in covariant Lyapunov vectors (CLVs) which span local intrinsic directions in the phase space of chaotic systems. Here, we review the basic results of ergodic theory, with a specific reference to the implications of Oseledets’ theorem for the properties of the CLVs. We then present a detailed description of a ‘dynamical’ algorithm to compute the CLVs and show that it generically converges exponentially in time. We also discuss its numerical performance and compare it with other algorithms presented in the literature. We finally illustrate how CLVs can be used to quantify deviations from hyperbolicity with reference to a dissipative system (a chain of Hénon maps) and a Hamiltonian model (a Fermi–Pasta–Ulam chain). This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical devoted to ‘Lyapunov analysis: from dynamical systems theory to applications’. (paper)

  16. Deriving covariant holographic entanglement

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Xi [School of Natural Sciences, Institute for Advanced Study, Princeton, NJ 08540 (United States); Lewkowycz, Aitor [Jadwin Hall, Princeton University, Princeton, NJ 08544 (United States); Rangamani, Mukund [Center for Quantum Mathematics and Physics (QMAP), Department of Physics, University of California, Davis, CA 95616 (United States)

    2016-11-07

    We provide a gravitational argument in favour of the covariant holographic entanglement entropy proposal. In general time-dependent states, the proposal asserts that the entanglement entropy of a region in the boundary field theory is given by a quarter of the area of a bulk extremal surface in Planck units. The main element of our discussion is an implementation of an appropriate Schwinger-Keldysh contour to obtain the reduced density matrix (and its powers) of a given region, as is relevant for the replica construction. We map this contour into the bulk gravitational theory, and argue that the saddle point solutions of these replica geometries lead to a consistent prescription for computing the field theory Rényi entropies. In the limiting case where the replica index is taken to unity, a local analysis suffices to show that these saddles lead to the extremal surfaces of interest. We also comment on various properties of holographic entanglement that follow from this construction.

  17. MCNP and MATXS cross section libraries based on JENDL-3.3

    International Nuclear Information System (INIS)

    Kosako, Kazuaki; Konno, Chikara; Fukahori, Tokio; Shibata, Keiichi

    2003-01-01

    The continuous-energy cross-section library for the Monte Carlo transport code MCNP-4C, FSXLIB-J33, has been generated from the latest version of JENDL-3.3. The multigroup cross-section library in the MATXS format, MATXS-J33, has also been generated from JENDL-3.3. Both libraries contain all nuclides in JENDL-3.3 and were processed at 300 K by the nuclear data processing system NJOY99. (author)

  18. A simple and efficient method for assembling TALE protein based on plasmid library.

    Science.gov (United States)

    Zhang, Zhiqiang; Li, Duo; Xu, Huarong; Xin, Ying; Zhang, Tingting; Ma, Lixia; Wang, Xin; Chen, Zhilong; Zhang, Zhiying

    2013-01-01

    The DNA binding domain of the transcription activator-like effectors (TALEs) from Xanthomonas sp. consists of tandem repeats that can be rearranged according to a simple cipher to target new DNA sequences with high DNA-binding specificity. This technology has been successfully applied in a variety of species for genome engineering. However, assembling long TALE tandem repeats remains a big challenge, precluding wide use of this technology. Although several new methodologies for efficiently assembling TALE repeats have been reported recently, all of them require either sophisticated facilities or skilled technicians. Here, we describe a simple and efficient method for generating customized TALE nucleases (TALENs) and TALE transcription factors (TALE-TFs) based on a TALE repeat tetramer library. The tetramer library, consisting of 256 tetramers, covers all possible combinations of 4 base pairs. A set of unique primers was designed for amplification of these tetramers. PCR products were assembled in one step of a digestion/ligation reaction. Twelve TALE constructs, including 4 TALEN pairs targeting mouse Gt(ROSA)26Sor and Mstn gene sequences as well as 4 TALE-TF constructs targeting mouse Oct4, c-Myc, Klf4, and Sox2 gene promoter sequences, were generated using our method. The construction routines took 3 days, and parallel constructions were possible. The rate of positive clones during colony PCR verification was 64% on average. Sequencing results suggested that all TALE constructs were assembled correctly with a high success rate.
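The combinatorial idea behind the tetramer library can be sketched in a few lines: 4^4 = 256 tetramers cover every 4-bp combination, so any target site whose length is a multiple of 4 decomposes into library members. The names and the decomposition helper below are hypothetical illustrations, not part of the paper's protocol:

```python
from itertools import product

# All 4-bp repeat-unit combinations: exactly 256 library members.
tetramer_library = ["".join(t) for t in product("ACGT", repeat=4)]
print(len(tetramer_library))          # → 256

def decompose(target: str) -> list:
    """Split a target site into the tetramers to assemble, 5' to 3'."""
    assert len(target) % 4 == 0, "pad or trim target to a multiple of 4 bp"
    return [target[i:i + 4] for i in range(0, len(target), 4)]

print(decompose("TGAACTGCAGGTCCAT"))  # 16-bp site → 4 tetramer units
```

This is why a pre-built 256-member library suffices: assembly reduces to picking and ligating the right 4 to 5 tetramers per target rather than building repeats one at a time.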

  19. Starting a Research Data Management Program Based in a University Library

    Science.gov (United States)

    Henderson, Margaret E.; Knott, Teresa L.

    2015-01-01

    As the need for research data management grows, many libraries are considering adding data services to help with the research mission of their institution. The VCU Libraries created a position and hired a director of research data management in September 2013. The position was new to the libraries and the university. With the backing of the library administration, a plan for building relationships with VCU faculty, researchers, students, service and resource providers, including grant administrators, was developed to educate and engage the community in data management plan writing and research data management training. PMID:25611440

  20. Starting a research data management program based in a university library.

    Science.gov (United States)

    Henderson, Margaret E; Knott, Teresa L

    2015-01-01

    As the need for research data management grows, many libraries are considering adding data services to help with the research mission of their institution. The Virginia Commonwealth University (VCU) Libraries created a position and hired a director of research data management in September 2013. The position was new to the libraries and the university. With the backing of the library administration, a plan for building relationships with VCU faculty, researchers, students, service and resource providers, including grant administrators, was developed to educate and engage the community in data management plan writing and research data management training.

  1. Research Library

    Science.gov (United States)

    Los Alamos National Laboratory Research Library: delivering essential knowledge services for national security sciences since 1947.

  2. Challenges and Opportunities for Libraries in Pakistan

    OpenAIRE

    Shafiq UR, Rehman; Pervaiz, Ahmad

    2007-01-01

    Abstract: This paper, based on a review of the literature, observation, and informal conversations, discusses various challenges regarding finance, collection development, ICTs, human resources, library education, library associations, and research & development faced by the library profession in Pakistan. The opportunities to meet these challenges are also explored. Keywords: Library challenges and opportunities (Pakistan); Librarianship (Pakistan); Library issues; Library profession in Pa...

  3. A Polymerase Chain Reaction-Based Method for Isolating Clones from a Complementary DNA Library in Sheep

    Science.gov (United States)

    Friis, Thor Einar; Stephenson, Sally; Xiao, Yin; Whitehead, Jon

    2014-01-01

    The sheep (Ovis aries) is favored by many musculoskeletal tissue engineering groups as a large animal model because of its docile temperament and ease of husbandry. The size and weight of sheep are comparable to humans, which allows for the use of implants and fixation devices used in human clinical practice. The construction of a complementary DNA (cDNA) library can capture the expression of genes in both a tissue- and time-specific manner. cDNA libraries have been a consistent source of gene discovery ever since the technology became commonplace more than three decades ago. Here, we describe the construction of a cDNA library using cells derived from sheep bones based on the pBluescript cDNA kit. Thirty clones were picked at random and sequenced. This led to the identification of a novel gene, C12orf29, which our initial experiments indicate is involved in skeletal biology. We also describe a polymerase chain reaction-based cDNA clone isolation method that allows the isolation of genes of interest from a cDNA library pool. The techniques outlined here can be applied in-house by smaller tissue engineering groups to generate tools for biomolecular research for large preclinical animal studies, and they highlight the power of standard cDNA library protocols to uncover novel genes. PMID:24447069

  4. Fast Computing for Distance Covariance

    OpenAIRE

    Huo, Xiaoming; Szekely, Gabor J.

    2014-01-01

    Distance covariance and distance correlation have been widely adopted in measuring dependence of a pair of random variables or random vectors. If the computation of distance covariance and distance correlation is implemented directly according to its definition, then its computational complexity is O($n^2$), which is a disadvantage compared to other faster methods. In this paper we show that the computation of distance covariance and distance correlation of real valued random variables can be...
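The definitional O(n²) computation that the paper improves on can be sketched as follows (a minimal illustration for real-valued samples; this is the baseline, not the paper's fast algorithm):

```python
import numpy as np

def dcov_naive(x, y):
    """Sample distance covariance computed directly from its definition,
    costing O(n^2) time and memory in the sample size n."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    a = np.abs(x[:, None] - x[None, :])  # pairwise distances within x
    b = np.abs(y[:, None] - y[None, :])  # pairwise distances within y
    # double-center both distance matrices
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return np.sqrt(np.mean(A * B))  # square root of the squared sample dCov
```

For real-valued variables the paper reduces this cost substantially; the naive version above is what such fast methods are compared against.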

  5. Investigating the use of a digital library in an inquiry-based undergraduate geology course

    Directory of Open Access Journals (Sweden)

    Xornam S. Apedoe

    2007-06-01

    Full Text Available This paper reports the findings of a qualitative research study designed to investigate the opportunities and obstacles presented by a digital library for supporting teaching and learning in an inquiry-based undergraduate geology course. Data for this study included classroom observations and field notes of classroom practices, questionnaires, and audiotapes and transcripts of interviews conducted with student and instructor participants. The findings suggest that although both the instructor and students recognized a number of opportunities presented by the digital library to support teaching and learning (e.g., access to various types of data), they encountered a number of obstacles (e.g., difficulty with the search mechanism) that discouraged them from taking advantage of the resources available. Recommendations are presented for (a) developers of digital libraries and (b) instructors wishing to integrate use of a digital library for supporting their teaching and student learning in an inquiry-based course.

  6. Multi-Target Angle Tracking Algorithm for Bistatic Multiple-Input Multiple-Output (MIMO) Radar Based on the Elements of the Covariance Matrix

    Directory of Open Access Journals (Sweden)

    Zhengyan Zhang

    2018-03-01

    Full Text Available In this paper, we consider the problem of tracking the direction of arrivals (DOA) and the direction of departure (DOD) of multiple targets for bistatic multiple-input multiple-output (MIMO) radar. A high-precision tracking algorithm for target angle is proposed. First, the linear relationship between the covariance matrix difference and the angle difference of the adjacent moment was obtained through three approximate relations. Then, the proposed algorithm obtained the relationship between the elements in the covariance matrix difference. On this basis, the performance of the algorithm was improved by averaging the covariance matrix element. Finally, the least square method was used to estimate the DOD and DOA. The algorithm realized the automatic correlation of the angle and provided better performance when compared with the adaptive asymmetric joint diagonalization (AAJD) algorithm. The simulation results demonstrated the effectiveness of the proposed algorithm. The algorithm provides the technical support for the practical application of MIMO radar.

  7. Multi-Target Angle Tracking Algorithm for Bistatic Multiple-Input Multiple-Output (MIMO) Radar Based on the Elements of the Covariance Matrix.

    Science.gov (United States)

    Zhang, Zhengyan; Zhang, Jianyun; Zhou, Qingsong; Li, Xiaobo

    2018-03-07

    In this paper, we consider the problem of tracking the direction of arrivals (DOA) and the direction of departure (DOD) of multiple targets for bistatic multiple-input multiple-output (MIMO) radar. A high-precision tracking algorithm for target angle is proposed. First, the linear relationship between the covariance matrix difference and the angle difference of the adjacent moment was obtained through three approximate relations. Then, the proposed algorithm obtained the relationship between the elements in the covariance matrix difference. On this basis, the performance of the algorithm was improved by averaging the covariance matrix element. Finally, the least square method was used to estimate the DOD and DOA. The algorithm realized the automatic correlation of the angle and provided better performance when compared with the adaptive asymmetric joint diagonalization (AAJD) algorithm. The simulation results demonstrated the effectiveness of the proposed algorithm. The algorithm provides the technical support for the practical application of MIMO radar.

  8. A library based fitting method for visual reflectance spectroscopy of human skin

    Energy Technology Data Exchange (ETDEWEB)

    Verkruysse, Wim [Beckman Laser Institute, University of California, Irvine, CA 92612 (United States); Zhang Rong [Beckman Laser Institute, University of California, Irvine, CA 92612 (United States); Choi, Bernard [Beckman Laser Institute, University of California, Irvine, CA 92612 (United States); Lucassen, Gerald [Personal Care Institute, Philips Research, Prof Holstlaan 4, Eindhoven (Netherlands); Svaasand, Lars O [Department of Physical Electronics Norwegian University of Science and Technology, N-7491 Trondheim (Norway); Nelson, J Stuart [Beckman Laser Institute, University of California, Irvine, CA 92612 (United States)

    2005-01-07

    The diffuse reflectance spectrum of human skin in the visible region (400-800 nm) contains information on the concentrations of chromophores such as melanin and haemoglobin. This information may be extracted by fitting the reflectance spectrum with an optical diffusion based analytical expression applied to a layered skin model. With the use of the analytical expression, it is assumed that light transport is dominated by scattering. For port wine stain (PWS) and highly pigmented human skin, however, this assumption may not be valid resulting in a potentially large error in visual reflectance spectroscopy (VRS). Monte Carlo based techniques can overcome this problem but are currently too computationally intensive to be combined with previously used fitting procedures. The fitting procedure presented herein is based on a library search which enables the use of accurate reflectance spectra based on forward Monte Carlo simulations or diffusion theory. This allows for accurate VRS to characterize chromophore concentrations in PWS and highly pigmented human skin. The method is demonstrated using both simulated and measured reflectance spectra. An additional advantage of the method is that the fitting procedure is very fast.

  9. A library based fitting method for visual reflectance spectroscopy of human skin

    International Nuclear Information System (INIS)

    Verkruysse, Wim; Zhang Rong; Choi, Bernard; Lucassen, Gerald; Svaasand, Lars O; Nelson, J Stuart

    2005-01-01

    The diffuse reflectance spectrum of human skin in the visible region (400-800 nm) contains information on the concentrations of chromophores such as melanin and haemoglobin. This information may be extracted by fitting the reflectance spectrum with an optical diffusion based analytical expression applied to a layered skin model. With the use of the analytical expression, it is assumed that light transport is dominated by scattering. For port wine stain (PWS) and highly pigmented human skin, however, this assumption may not be valid resulting in a potentially large error in visual reflectance spectroscopy (VRS). Monte Carlo based techniques can overcome this problem but are currently too computationally intensive to be combined with previously used fitting procedures. The fitting procedure presented herein is based on a library search which enables the use of accurate reflectance spectra based on forward Monte Carlo simulations or diffusion theory. This allows for accurate VRS to characterize chromophore concentrations in PWS and highly pigmented human skin. The method is demonstrated using both simulated and measured reflectance spectra. An additional advantage of the method is that the fitting procedure is very fast

  10. A library based fitting method for visual reflectance spectroscopy of human skin

    Science.gov (United States)

    Verkruysse, Wim; Zhang, Rong; Choi, Bernard; Lucassen, Gerald; Svaasand, Lars O.; Nelson, J. Stuart

    2005-01-01

    The diffuse reflectance spectrum of human skin in the visible region (400-800 nm) contains information on the concentrations of chromophores such as melanin and haemoglobin. This information may be extracted by fitting the reflectance spectrum with an optical diffusion based analytical expression applied to a layered skin model. With the use of the analytical expression, it is assumed that light transport is dominated by scattering. For port wine stain (PWS) and highly pigmented human skin, however, this assumption may not be valid resulting in a potentially large error in visual reflectance spectroscopy (VRS). Monte Carlo based techniques can overcome this problem but are currently too computationally intensive to be combined with previously used fitting procedures. The fitting procedure presented herein is based on a library search which enables the use of accurate reflectance spectra based on forward Monte Carlo simulations or diffusion theory. This allows for accurate VRS to characterize chromophore concentrations in PWS and highly pigmented human skin. The method is demonstrated using both simulated and measured reflectance spectra. An additional advantage of the method is that the fitting procedure is very fast.
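The library-search step common to the three records above can be illustrated with a minimal least-squares lookup. The function name, toy spectra, and parameter encoding below are hypothetical; the actual library would hold Monte Carlo or diffusion-theory reflectance spectra indexed by chromophore concentrations:

```python
import numpy as np

def library_fit(measured, library, params):
    """Pick the parameter set whose precomputed reflectance spectrum best
    matches the measured spectrum in the least-squares sense.

    `library` holds one spectrum per row (sampled at fixed wavelengths);
    `params` holds the matching chromophore parameter sets, row for row."""
    sse = np.sum((library - measured) ** 2, axis=1)  # error per library entry
    return params[np.argmin(sse)]
```

Because the expensive forward simulations are done once, up front, each fit reduces to an array search, which is consistent with the speed advantage the authors report.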

  11. Isolation of xylose isomerases by sequence- and function-based screening from a soil metagenomic library

    Directory of Open Access Journals (Sweden)

    Parachin Nádia

    2011-05-01

    Full Text Available Abstract Background Xylose isomerase (XI) catalyses the isomerisation of xylose to xylulose in bacteria and some fungi. Currently, only a limited number of XI genes have been functionally expressed in Saccharomyces cerevisiae, the microorganism of choice for lignocellulosic ethanol production. The objective of the present study was to search for novel XI genes in the vastly diverse microbial habitat present in soil. As the exploitation of microbial diversity is impaired by the inability to cultivate most soil microorganisms under standard laboratory conditions, a metagenomic approach, consisting of total DNA extraction from a given environment followed by cloning of DNA into suitable vectors, was undertaken. Results A soil metagenomic library was constructed and two screening methods based on protein sequence similarity and enzyme activity were investigated to isolate novel XI encoding genes. These two screening approaches identified the xym1 and xym2 genes, respectively. Sequence and phylogenetic analyses revealed that the genes shared 67% similarity and belonged to different bacterial groups. When xym1 and xym2 were overexpressed in a xylA-deficient Escherichia coli strain, similar growth rates to those in which the Piromyces XI gene was expressed were obtained. However, expression in S. cerevisiae resulted in only one-fourth the growth rate of that obtained for the strain expressing the Piromyces XI gene. Conclusions For the first time, the screening of a soil metagenomic library in E. coli resulted in the successful isolation of two active XIs. However, the discrepancy between XI enzyme performance in E. coli and S. cerevisiae suggests that future screening for XI activity from soil should be pursued directly using yeast as a host.

  12. A large synthetic peptide and phosphopeptide reference library for mass spectrometry–based proteomics

    NARCIS (Netherlands)

    Marx, H.; Lemeer, S.; Schliep, J.E.; Matheron, L.I.; Mohammed, S.; Cox, J.; Mann, M.; Heck, A.J.R.; Kuster, B.

    2013-01-01

    We present a peptide library and data resource of >100,000 synthetic, unmodified peptides and their phosphorylated counterparts with known sequences and phosphorylation sites. Analysis of the library by mass spectrometry yielded a data set that we used to evaluate the merits of different search

  13. Expanding the Intellectual Property Knowledge Base at University Libraries: Collaborating with Patent and Trademark Resource Centers

    Science.gov (United States)

    Wallace, Martin; Reinman, Suzanne

    2018-01-01

    Patent and Trademark Resource Centers are located in libraries throughout the U.S., with 43 being in academic libraries. With the importance of incorporating a knowledge of intellectual property (IP) and patent research in university curricula nationwide, this study developed and evaluated a partnership program to increase the understanding of IP…

  14. Evaluation of the quality of the college library websites in Iranian medical Universities based on the Stover model.

    Science.gov (United States)

    Nasajpour, Mohammad Reza; Ashrafi-Rizi, Hasan; Soleymani, Mohammad Reza; Shahrzadi, Leila; Hassanzadeh, Akbar

    2014-01-01

    Today, the websites of college and university libraries play an important role in providing the necessary services for clients. These websites not only allow the users to access different collections of library resources, but also provide them with the necessary guidance in order to use the information. The goal of this study is the quality evaluation of the college library websites in Iranian Medical Universities based on the Stover model. This study uses an analytical survey method and is an applied study. The data gathering tool is the standard checklist provided by Stover, which was modified by the researchers for this study. The statistical population is the college library websites of the Iranian Medical Universities (146 websites) and a census method was used for investigation. The data gathering method was direct access to each website, and filling of the checklist was based on the researchers' observations. Descriptive and analytical statistics (Analysis of Variance (ANOVA)) were used for data analysis with the help of the SPSS software. The findings showed that in the dimension of the quality of contents, the highest average belonged to type one universities (46.2%) and the lowest average belonged to type three universities (24.8%). In search and research capabilities, the highest average belonged to type one universities (48.2%) and the lowest average belonged to type three universities. In the dimension of facilities provided for the users, type one universities again had the highest average (37.2%), while type three universities had the lowest average (15%). In general, the library websites of type one universities had the highest quality (44.2%), while type three universities had the lowest quality (21.1%). Also, the library websites of the College of Rehabilitation and the College of Paramedics, of the Shiraz University of Medical Science, had the highest quality scores. The results showed that there was a meaningful difference between the quality

  15. Simultaneous digital quantification and fluorescence-based size characterization of massively parallel sequencing libraries.

    Science.gov (United States)

    Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H

    2013-08-01

    Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
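The size-determination half of the joint assay rests on the reported correlation between droplet fluorescence and amplicon size. A minimal calibration sketch follows; the calibration points are invented for illustration, and modeling the relation as linear is an assumption of this sketch, not a claim about the paper's actual calibration curve:

```python
import numpy as np

# Hypothetical calibration: droplet fluorescence amplitudes measured for
# amplicons of known length (bp). A linear model is fitted purely for
# illustration of the fluorescence-to-size mapping.
known_sizes_bp = np.array([150.0, 300.0, 450.0, 600.0])
fluorescence = np.array([9000.0, 7000.0, 5000.0, 3000.0])

slope, intercept = np.polyfit(fluorescence, known_sizes_bp, 1)

def estimate_size(amplitude):
    """Estimate amplicon size (bp) from a droplet's fluorescence amplitude."""
    return slope * amplitude + intercept
```

The digital quantification itself comes from counting positive droplets in the same ddPCR run, so one assay yields both the concentration and an estimate like the one above.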

  16. Comparison of WIMS results using libraries based on new evaluated data files

    International Nuclear Information System (INIS)

    Trkov, A.; Ganesan, S.; Zidi, T.

    1996-01-01

    A number of selected benchmark experiments have been modelled with the WIMS-D/4 lattice code. Calculations were performed using multigroup libraries generated from a number of newly released evaluated data files. Data processing was done with the NJOY91.38 code. Since the data processing methods were the same in all cases, the results may serve to determine the impact on integral parameters due to differences in the basic data. The calculated integral parameters were also compared to the measured values. Observed differences were small, which means that there are no essential differences between the evaluated data libraries. The results of the analysis cannot serve to discriminate in terms of quality of the data between the evaluated data libraries considered. For the test cases considered the results with the new, unadjusted libraries are at least as good as those obtained with the old, adjusted WIMS library which is supplied with the code. (author). 16 refs, 3 tabs

  17. Neutron cross section and covariance data evaluation of experimental data for {sup 27}Al

    Energy Technology Data Exchange (ETDEWEB)

    Chunjuan, Li; Jianfeng, Liu [Physics Department , Zhengzhou Univ., Zhengzhou (China); Tingjin, Liu [China Nuclear Data Center, China Inst. of Atomic Energy, Beijing (China)

    2006-07-15

    The evaluation of neutron cross section and covariance data for {sup 27}Al in the energy range from 210 keV to 20 MeV was carried out on the basis of experimental data taken mainly from the EXFOR library. After the experimental data and their errors were analyzed, selected and corrected, the SPCC code was used to fit the data and merge the covariance matrices. The evaluated neutron cross section data and covariance matrix for {sup 27}Al can be adopted for the evaluated library and can also serve as a basis for related theoretical calculations. (authors)

  18. Neutron cross section and covariance data evaluation of experimental data for 27Al

    International Nuclear Information System (INIS)

    Li Chunjuan; Liu Jianfeng; Liu Tingjin

    2006-01-01

    The evaluation of neutron cross section and covariance data for 27Al in the energy range from 210 keV to 20 MeV was carried out on the basis of experimental data taken mainly from the EXFOR library. After the experimental data and their errors were analyzed, selected and corrected, the SPCC code was used to fit the data and merge the covariance matrices. The evaluated neutron cross section data and covariance matrix for 27Al can be adopted for the evaluated library and can also serve as a basis for related theoretical calculations. (authors)

  19. Schroedinger covariance states in anisotropic waveguides

    International Nuclear Information System (INIS)

    Angelow, A.; Trifonov, D.

    1995-03-01

    In this paper, squeezed and covariance states based on the Schroedinger inequality, and their connection with other nonclassical states, are considered for the particular case of an anisotropic waveguide in LiNiO3. Here, the problem of photon creation and generation of squeezed and Schroedinger covariance states in optical waveguides is solved in two steps: 1. Quantization of the electromagnetic field is carried out in the presence of the dielectric waveguide using a normal-mode expansion. The photon creation and annihilation operators are introduced by expanding the solution A(r,t) in a series of the Sturm-Liouville mode functions. 2. In terms of these operators, the Hamiltonian of the field in a nonlinear waveguide is derived. For this Hamiltonian we construct the covariance states as stable states (with nonzero covariance) which minimize the Schroedinger uncertainty relation. The evolution of the three second moments of the operators q̂_j and p̂_j is calculated. For this Hamiltonian all three moments are expressed in terms of a single real parameter s. It is shown how the covariance, via this parameter s, depends on the waveguide profile n(x,y), on the mode distributions u_j(x,y), and on the waveguide phase mismatch Δβ. (author). 37 refs
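For reference, the Schroedinger uncertainty relation that these covariance states minimize is the standard Robertson-Schroedinger inequality (stated here from the general literature, not transcribed from the record):

```latex
% Schroedinger (Robertson--Schroedinger) inequality for position and momentum:
\sigma_{qq}\,\sigma_{pp} - \sigma_{qp}^{2} \;\ge\; \frac{\hbar^{2}}{4},
\qquad
\sigma_{qp} = \tfrac{1}{2}\,\langle \hat{q}\hat{p} + \hat{p}\hat{q} \rangle
            - \langle \hat{q} \rangle \langle \hat{p} \rangle .
```

Covariance states saturate this bound with nonzero covariance σ_qp, which is what distinguishes them from ordinary minimum-uncertainty (Heisenberg) states.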

  20. Results from the second Galaxy Serpent web-based table top exercise utilizing the concept of nuclear forensics libraries

    International Nuclear Information System (INIS)

    Borgardt, James; Canaday, Jodi; Chamberlain, David

    2017-01-01

    Galaxy Serpent is a unique, virtual, web-based international tabletop series of exercises designed to mature the concept of National Nuclear Forensics Libraries (NNFLs). Teams participating in the second version of the exercise were provided synthetic sealed radioactive source data used to compile a model NNFL which then served as a comparative instrument in hypothetical scenarios involving sources out of regulatory control, allowing teams to successfully down-select and determine whether investigated sources were consistent with holdings in their model library. The methodologies utilized and aggregate results of the exercise will be presented, along with challenges encountered and benefits realized. (author)

  1. A jackknife approach to quantifying single-trial correlation between covariance-based metrics undefined on a single-trial basis.

    Science.gov (United States)

    Richter, Craig G; Thompson, William H; Bosman, Conrado A; Fries, Pascal

    2015-07-01

    The quantification of covariance between neuronal activities (functional connectivity) requires the observation of correlated changes and therefore multiple observations. The strength of such neuronal correlations may itself undergo moment-by-moment fluctuations, which might, for example, lead to fluctuations in single-trial metrics such as reaction time (RT), or may co-fluctuate with the correlation between activity in other brain areas. Yet, quantifying the relation between moment-by-moment co-fluctuations in neuronal correlations is precluded by the fact that neuronal correlations are not defined per single observation. The proposed solution quantifies this relation by first calculating neuronal correlations for all leave-one-out subsamples (i.e. the jackknife replications of all observations) and then correlating these values. Because the correlation is calculated between jackknife replications, we address this approach as jackknife correlation (JC). First, we demonstrate the equivalence of JC to conventional correlation for simulated paired data that are defined per observation and therefore allow the calculation of conventional correlation. While the JC recovers the conventional correlation precisely, alternative approaches, like sorting-and-binning, result in detrimental effects of the analysis parameters. We then explore the case of relating two spectral correlation metrics, like coherence, that require multiple observation epochs, where the only viable alternative analysis approaches are based on some form of epoch subdivision, which results in reduced spectral resolution and poor spectral estimators. We show that JC outperforms these approaches, particularly for short epoch lengths, without sacrificing any spectral resolution. Finally, we note that the JC can be applied to relate fluctuations in any smooth metric that is not defined on single observations. Copyright © 2015. Published by Elsevier Inc.
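The leave-one-out construction described above can be sketched as follows (a minimal illustration for scalar metrics; the function name is ours, and the paper applies the idea to spectral metrics such as coherence):

```python
import numpy as np

def jackknife_correlation(x, y, metric):
    """Correlate two metrics that are undefined on single trials by computing
    them on all leave-one-out subsamples (jackknife replications of the
    observations in x and y) and correlating the replications."""
    n = len(x)
    reps_x = np.array([metric(np.delete(x, i, axis=0)) for i in range(n)])
    reps_y = np.array([metric(np.delete(y, i, axis=0)) for i in range(n)])
    # Both replication series drop the same trial i, so the sign and
    # magnitude of the conventional correlation are preserved.
    return np.corrcoef(reps_x, reps_y)[0, 1]
```

With a metric that is defined per observation (e.g. the mean), the jackknife replications are affine in the left-out value, so JC reproduces the conventional correlation exactly, mirroring the equivalence demonstrated in the paper.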

  2. Covariant electromagnetic field lines

    Science.gov (United States)

    Hadad, Y.; Cohen, E.; Kaminer, I.; Elitzur, A. C.

    2017-08-01

    Faraday introduced electric field lines as a powerful tool for understanding the electric force, and these field lines are still used today in classrooms and textbooks teaching the basics of electromagnetism within the electrostatic limit. However, despite attempts at generalizing this concept beyond the electrostatic limit, such a fully relativistic field line theory still appears to be missing. In this work, we propose such a theory and define covariant electromagnetic field lines that naturally extend electric field lines to relativistic systems and general electromagnetic fields. We derive a closed-form formula for the field lines curvature in the vicinity of a charge, and show that it is related to the world line of the charge. This demonstrates how the kinematics of a charge can be derived from the geometry of the electromagnetic field lines. Such a theory may also provide new tools in modeling and analyzing electromagnetic phenomena, and may entail new insights regarding long-standing problems such as radiation-reaction and self-force. In particular, the electromagnetic field lines curvature has the attractive property of being non-singular everywhere, thus eliminating all self-field singularities without using renormalization techniques.

  3. Fragment-Based Screening of a Natural Product Library against 62 Potential Malaria Drug Targets Employing Native Mass Spectrometry

    Science.gov (United States)

    2018-01-01

    Natural products are well known for their biological relevance, high degree of three-dimensionality, and access to areas of largely unexplored chemical space. To shape our understanding of the interaction between natural products and protein targets in the postgenomic era, we have used native mass spectrometry to investigate 62 potential protein targets for malaria using a natural-product-based fragment library. We reveal here 96 low-molecular-weight natural products identified as binding partners of 32 of the putative malarial targets. Seventy-nine (79) fragments have direct growth inhibition on Plasmodium falciparum at concentrations that are promising for the development of fragment hits against these protein targets. This adds a fragment library to the published HTS active libraries in the public domain. PMID:29436819

  4. Development of covariance data for fast reactor cores. 3

    International Nuclear Information System (INIS)

    Shibata, Keiichi; Hasegawa, Akira

    1999-03-01

    Covariances have been estimated for nuclear data contained in JENDL-3.2. As for Cr and Ni, the physical quantities for which covariances are deduced are cross sections and the first order Legendre-polynomial coefficient for the angular distribution of elastically scattered neutrons. The covariances were estimated by using the same methodology that had been used in the JENDL-3.2 evaluation in order to keep a consistency between mean values and their covariances. In a case where evaluated data were based on experimental data, the covariances were estimated from the same experimental data. For cross sections that had been evaluated by nuclear model calculations, the same model was applied to generate the covariances. The covariances obtained were compiled into ENDF-6 format files. The covariances, which had been prepared in the previous fiscal year, were re-examined, and some improvements were performed. Parts of the Fe and 235U covariances were updated. Covariances of nu-p and nu-d for 241Pu and of fission neutron spectra for 233,235,238U and 239,240Pu were newly added to the data files. (author)

  5. Ligand efficiency based approach for efficient virtual screening of compound libraries.

    Science.gov (United States)

    Ke, Yi-Yu; Coumar, Mohane Selvaraj; Shiao, Hui-Yi; Wang, Wen-Chieh; Chen, Chieh-Wen; Song, Jen-Shin; Chen, Chun-Hwa; Lin, Wen-Hsing; Wu, Szu-Huei; Hsu, John T A; Chang, Chung-Ming; Hsieh, Hsing-Pang

    2014-08-18

    Here we report for the first time the use of fit quality (FQ), a ligand efficiency (LE) based measure, for virtual screening (VS) of compound libraries. The LE based VS protocol was used to screen an in-house database of 125,000 compounds to identify aurora kinase A inhibitors. First, 20 known aurora kinase inhibitors were docked to the aurora kinase A crystal structure (PDB ID: 2W1C), and the conformations of the docked ligands were used to create a pharmacophore (PH) model. The PH model was used to screen the database compounds and rank (PH rank) them based on the predicted IC50 values. Next, LE_Scale, a weight-dependent LE function, was derived from 294 known aurora kinase inhibitors. Using the fit quality (FQ = LE/LE_Scale) score derived from the LE_Scale function, the database compounds were reranked (PH_FQ rank) and the top 151 (0.12% of database) compounds were assessed for aurora kinase A inhibition biochemically. This VS protocol led to the identification of 7 novel hits, with compound 5 showing aurora kinase A IC50 = 1.29 μM. Furthermore, testing of 5 against a panel of 31 kinases revealed that it is selective toward aurora kinases A & B, with <50% inhibition for other kinases at 10 μM concentrations, and is a suitable candidate for further development. Incorporation of the FQ score in the VS protocol not only helped identify a novel aurora kinase inhibitor, 5, but also increased the hit rate of the VS protocol by improving the enrichment factor (EF) for FQ based screening (EF = 828), compared to PH based screening (EF = 237) alone. The LE based VS protocol disclosed here could be applied to other targets for hit identification in an efficient manner. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
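The FQ = LE/LE_Scale scoring described above can be sketched as follows. The 1.37 conversion factor in LE is the standard kcal/mol-per-heavy-atom convention; the exponential form and coefficients of `le_scale` below are illustrative placeholders only, since the paper fits its own weight-dependent LE_Scale to 294 known aurora kinase inhibitors:

```python
import math

def ligand_efficiency(pIC50, heavy_atoms):
    """Standard ligand efficiency in kcal/mol per heavy atom:
    LE = 1.37 * pIC50 / HA."""
    return 1.37 * pIC50 / heavy_atoms

def le_scale(heavy_atoms, a=0.87, b=0.026):
    """Hypothetical weight-dependent LE_Scale; functional form and the
    coefficients a, b are placeholders, not the paper's fitted values."""
    return a * math.exp(-b * heavy_atoms)

def fit_quality(pIC50, heavy_atoms):
    """Fit quality FQ = LE / LE_Scale, used to rerank pharmacophore hits."""
    return ligand_efficiency(pIC50, heavy_atoms) / le_scale(heavy_atoms)
```

Because LE_Scale falls with molecular size, dividing by it keeps larger, more potent compounds from dominating the ranking purely on raw potency, which is the rationale for the improved enrichment factor reported.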

  6. A Validated MCNP(X) Cross Section Library based on JEFF 3.1

    International Nuclear Information System (INIS)

    Haeck, W.; Verboomen, B.

    2006-01-01

    ALEPH-LIB is a multi-temperature neutron transport library for standard use by MCNP(X) and ALEPH, generated with ALEPH-DLG. This is an auxiliary computer code to ALEPH, the Monte Carlo burn-up code under development at SCK-CEN in collaboration with Ghent University. ALEPH-DLG automates the entire process of generating library files with NJOY and takes care of the first requirement of a validated application library: verifying the processing. It produces tailor-made NJOY input files using data from the original ENDF file (initial temperature, whether the nuclide is fissile or has unresolved resonances, etc.). When the library files have been generated, ALEPH-DLG will also process the output from NJOY by extracting all messages and warnings. If ALEPH-DLG finds anything out of the ordinary, it will either warn the user or perform corrective actions. The temperatures included in the ALEPH-LIB library are 300, 600, 900, 1200, 1500 and 1800 K. Library files were produced for the JEF 2.2, JEFF 3.0, JEFF 3.1, JENDL 3.3 and ENDF/B-VI.8 nuclear data libraries. This will be extended with ENDF/B-VII when it becomes available. This report deals with the JEFF 3.1 files included in ALEPH-LIB that are now released by the NEA-OECD.

  7. A Validated MCNP(X) Cross Section Library based on JEFF 3.1

    Energy Technology Data Exchange (ETDEWEB)

    Haeck, W; Verboomen, B

    2006-10-15

    ALEPH-LIB is a multi-temperature neutron transport library for standard use by MCNP(X) and ALEPH, generated with ALEPH-DLG. This is an auxiliary computer code to ALEPH, the Monte Carlo burn-up code under development at SCK-CEN in collaboration with Ghent University. ALEPH-DLG automates the entire process of generating library files with NJOY and takes care of the first requirement of a validated application library: verifying the processing. It produces tailor-made NJOY input files using data from the original ENDF file (initial temperature, whether the nuclide is fissile or has unresolved resonances, etc.). When the library files have been generated, ALEPH-DLG will also process the output from NJOY by extracting all messages and warnings. If ALEPH-DLG finds anything out of the ordinary, it will either warn the user or perform corrective actions. The temperatures included in the ALEPH-LIB library are 300, 600, 900, 1200, 1500 and 1800 K. Library files were produced for the JEF 2.2, JEFF 3.0, JEFF 3.1, JENDL 3.3 and ENDF/B-VI.8 nuclear data libraries. This will be extended with ENDF/B-VII when it becomes available. This report deals with the JEFF 3.1 files included in ALEPH-LIB that are now released by the NEA-OECD.

  8. Covariation in Natural Causal Induction.

    Science.gov (United States)

    Cheng, Patricia W.; Novick, Laura R.

    1991-01-01

    Biases and models usually offered by cognitive and social psychology and by philosophy to explain causal induction are evaluated with respect to focal sets (contextually determined sets of events over which covariation is computed). A probabilistic contrast model is proposed as underlying covariation computation in natural causal induction. (SLD)

  9. 3D High Resolution Mesh Deformation Based on Multi Library Wavelet Neural Network Architecture

    Science.gov (United States)

    Dhibi, Naziha; Elkefi, Akram; Bellil, Wajdi; Amar, Chokri Ben

    2016-12-01

    This paper deals with the features of a novel technique for large Laplacian boundary deformations using estimated rotations. The proposed method is based on a Multi Library Wavelet Neural Network structure founded on several mother wavelet families (MLWNN). The objective is to align mesh features and minimize distortion with respect to a fixed feature, minimizing the sum of the distances between all corresponding vertices. The new mesh deformation method operates on a Region of Interest (ROI): our approach computes the deformed ROI, then updates and optimizes it to align mesh features based on the MLWNN and a spherical parameterization configuration. This structure has the advantage of constructing the network from several mother wavelets, so that high-dimensional problems can be solved using the mother wavelet that best models the signal. Simulation tests confirmed the robustness and speed of the developed deformation methodology: the mean-square error and the deformation ratio are low compared to other works from the state of the art, and our approach minimizes distortion with fixed features to yield a well-reconstructed object.

  10. Zero curvature conditions and conformal covariance

    International Nuclear Information System (INIS)

    Akemann, G.; Grimm, R.

    1992-05-01

    Two-dimensional zero curvature conditions were investigated in detail, with special emphasis on conformal properties, and the appearance of covariant higher order differential operators constructed in terms of a projective connection was elucidated. The analysis is based on the Kostant decomposition of simple Lie algebras in terms of representations with respect to their 'principal' SL(2) subalgebra. (author) 27 refs

  11. On superfield covariant quantization in general coordinates

    International Nuclear Information System (INIS)

    Gitman, D.M.; Moshin, P. Yu.; Tomazelli, J.L.

    2005-01-01

    We propose a natural extension of the BRST-antiBRST superfield covariant scheme in general coordinates. Thus, the coordinate dependence of the basic tensor fields and scalar density of the formalism is extended from the base supermanifold to the complete set of superfield variables. (orig.)

  12. On superfield covariant quantization in general coordinates

    Energy Technology Data Exchange (ETDEWEB)

    Gitman, D.M. [Universidade de Sao Paulo, Instituto de Fisica, Sao Paulo, S.P (Brazil); Moshin, P. Yu. [Universidade de Sao Paulo, Instituto de Fisica, Sao Paulo, S.P (Brazil); Tomsk State Pedagogical University, Tomsk (Russian Federation); Tomazelli, J.L. [UNESP, Departamento de Fisica e Quimica, Campus de Guaratingueta (Brazil)

    2005-12-01

    We propose a natural extension of the BRST-antiBRST superfield covariant scheme in general coordinates. Thus, the coordinate dependence of the basic tensor fields and scalar density of the formalism is extended from the base supermanifold to the complete set of superfield variables. (orig.)

  13. Covariant field theory of closed superstrings

    International Nuclear Information System (INIS)

    Siopsis, G.

    1989-01-01

    The authors construct covariant field theories of both type-II and heterotic strings. Toroidal compactification is also considered. The interaction vertices are based on Witten's vertex representing three strings interacting at the mid-point. For closed strings, the authors thus obtain a bilocal interaction

  14. Group cross-section processing method and common nuclear group cross-section library based on JENDL-3 nuclear data file

    International Nuclear Information System (INIS)

    Hasegawa, Akira

    1991-01-01

    A common group cross-section library has been developed at JAERI. This system is called the 'JSSTDL-295n-104γ (neutron: 295, gamma: 104) group constants library system'; it is composed of a common 295n-104γ group cross-section library based on the JENDL-3 nuclear data file and its utility codes, and is applicable to fast and fusion reactors. In this paper, the outline of the group cross-section processing adopted in the Prof. GROUCH-G/B system, which is a common step for all group cross-section library generation, is first described in detail. Next, the available group cross-section libraries developed in Japan based on JENDL-3 are briefly reviewed. Lastly, the newly developed JSSTDL library system is presented, with some special attention to the JENDL-3 data. (author)

  15. Usage Analysis for the Identification of Research Trends in Digital Libraries; Keepers of the Crumbling Culture: What Digital Preservation Can Learn from Library History; Patterns of Journal Use by Scientists through Three Evolutionary Phases; Developing a Content Management System-Based Web Site; Exploring Charging Models for Digital Cultural Heritage in Europe; Visions: The Academic Library in 2012.

    Science.gov (United States)

    Bollen, Johan; Vemulapalli, Soma Sekara; Xu, Weining; Luce, Rick; Marcum, Deanna; Friedlander, Amy; Tenopir, Carol; Grayson, Matt; Zhang, Yan; Ebuen, Mercy; King, Donald W.; Boyce, Peter; Rogers, Clare; Kirriemuir, John; Tanner, Simon; Deegan, Marilyn; Marcum, James W.

    2003-01-01

    Includes six articles that discuss use analysis and research trends in digital libraries; library history and digital preservation; journal use by scientists; a content management system-based Web site for higher education in the United Kingdom; cost studies for transitioning to digitized collections in European cultural institutions; and the…

  16. Introduction to covariant formulation of superstring (field) theory

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    The author discusses the covariant formulation of superstring theories based on BRS invariance. A new formulation of the superstring was constructed by Green and Schwarz, first in the light-cone gauge, and then a covariant action was discovered. The covariant action has an interesting geometrical interpretation; however, covariant quantization is difficult to perform because of the existence of local supersymmetries. Introducing extra variables into the action, a modified action has been proposed. However, it would be difficult to prescribe constraints to define a physical subspace, or to reproduce the correct physical spectrum. Hence the old formulation, i.e., the Neveu-Schwarz-Ramond (NSR) model, is used for covariant quantization. The author begins by quantizing the NSR model in a covariant way using BRS charges, and then discusses the field theory of the (free) superstring

  17. Comprehensive evaluation of SNP identification with the Restriction Enzyme-based Reduced Representation Library (RRL method

    Directory of Open Access Journals (Sweden)

    Du Ye

    2012-02-01

    Background: The Restriction Enzyme-based Reduced Representation Library (RRL) method represents a relatively feasible and flexible strategy for Single Nucleotide Polymorphism (SNP) identification in different species. It has the remarkable advantage of reducing the complexity of the genome by orders of magnitude. However, a comprehensive evaluation of the actual efficacy of SNP identification by this method is still unavailable. Results: To evaluate the efficacy of the Restriction Enzyme-based RRL method, we selected the Tsp 45I enzyme, which covers a 266 Mb region flanking the enzyme recognition site according to in silico simulation on the human reference genome. We then sequenced the YH RRL after Tsp 45I treatment and obtained reads of which 80.8% were mapped to the target region with a 20-fold average coverage; about 96.8% of the target region was covered by at least one read, and 257 K SNPs were identified in the region using the SOAPsnp software. Compared with whole-genome resequencing data, we observed a false discovery rate (FDR) of 13.95% and a false negative rate (FNR) of 25.90%. The concordance rate of homozygous loci was over 99.8%, but that of heterozygous loci was only 92.56%. Repeat sequences and base quality were shown to have a great effect on the accuracy of SNP calling, and SNPs in recognition sites contributed markedly to the high FNR and the low concordance rate of heterozygous loci. Our results indicated that repeat masking and highly stringent filter criteria could significantly decrease both FDR and FNR. Conclusions: This study demonstrates that the Restriction Enzyme-based RRL method is effective for SNP identification. The results highlight the biases and method-derived defects inherent in this approach and emphasize the points that warrant special attention.
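    The error rates quoted in the abstract follow the standard definitions; the counts in this sketch are invented for illustration and do not reproduce the paper's numbers.

```python
# Standard definitions of the false discovery rate (FDR) and false
# negative rate (FNR) used to benchmark a SNP call set against a
# trusted reference set (e.g. whole-genome resequencing data).
# The counts below are illustrative only.

def fdr(false_positives, true_positives):
    """Fraction of reported SNPs that are not in the reference set."""
    return false_positives / (false_positives + true_positives)

def fnr(false_negatives, true_positives):
    """Fraction of reference SNPs that the method failed to report."""
    return false_negatives / (false_negatives + true_positives)

# Hypothetical benchmark: 900 calls confirmed, 100 spurious,
# 300 reference SNPs missed.
print(round(fdr(100, 900), 3))  # 0.1
print(round(fnr(300, 900), 3))  # 0.25
```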

  18. General covariance and quantum theory

    International Nuclear Information System (INIS)

    Mashhoon, B.

    1986-01-01

    The extension of the principle of relativity to general coordinate systems is based on the hypothesis that an accelerated observer is locally equivalent to a hypothetical inertial observer with the same velocity as the noninertial observer. This hypothesis of locality is expected to be valid for classical particle phenomena as well as for classical wave phenomena but only in the short-wavelength approximation. The generally covariant theory is therefore expected to be in conflict with the quantum theory which is based on wave-particle duality. This is explicitly demonstrated for the frequency of electromagnetic radiation measured by a uniformly rotating observer. The standard Doppler formula is shown to be valid only in the geometric optics approximation. A new definition for the frequency is proposed, and the resulting formula for the frequency measured by the rotating observer is shown to be consistent with expectations based on the classical theory of electrons. A tentative quantum theory is developed on the basis of the generalization of the Bohr frequency condition to include accelerated observers. The description of the causal sequence of events is assumed to be independent of the motion of the observer. Furthermore, the quantum hypothesis is supposed to be valid for all observers. The implications of this theory are critically examined. The new formula for frequency, which is still based on the hypothesis of locality, leads to the observation of negative energy quanta by the rotating observer and is therefore in conflict with the quantum theory
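    For orientation, the "standard Doppler formula" the abstract invokes can be written in its familiar special-relativistic form (quoted here as background, not from the paper itself):

```latex
\omega' = \gamma\,\omega\left(1 - \frac{v}{c}\cos\theta\right),
\qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

    where $\theta$ is the angle between the observer's velocity $v$ and the wave vector. The abstract's point is that a relation of this kind holds for the rotating observer only in the geometric-optics (short-wavelength) limit.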

  19. WIMS/ABBN library based on the fond-2 evaluated files

    International Nuclear Information System (INIS)

    Jerdev, G.; Zabrodskaia, S.; Tsiboulia, A.; Koscheev, V.

    2001-01-01

    A description of the new WIMS/ABBN system is presented. It includes updated libraries for the WIMS/D4 code. The report also covers the sources and methods used to create them, together with an analysis and comparison of results on different tests. The second part of the work is the development of the burnup library for the WIMS system; as a result, a new burnup system was created, whose new capabilities are shown. (authors)

  20. Design of redundant array of independent DVD libraries based on iSCSI

    Science.gov (United States)

    Chen, Yupeng; Pan, Longfa

    2003-04-01

    This paper presents a new approach to realizing a redundant array of independent DVD libraries (RAID-LoIP) by using iSCSI technology and traditional RAID algorithms. Our design reaches the high performance of an optical storage system with the following features: large storage size, high access rate, random access, long distance between DVD libraries, block I/O storage, and long storage life. Our RAID-LoIP system can be a good solution for a broadcasting media asset storage system.

  1. A Fragment-Based Method of Creating Small-Molecule Libraries to Target the Aggregation of Intrinsically Disordered Proteins.

    Science.gov (United States)

    Joshi, Priyanka; Chia, Sean; Habchi, Johnny; Knowles, Tuomas P J; Dobson, Christopher M; Vendruscolo, Michele

    2016-03-14

    The aggregation process of intrinsically disordered proteins (IDPs) has been associated with a wide range of neurodegenerative disorders, including Alzheimer's and Parkinson's diseases. Currently, however, no drug in clinical use targets IDP aggregation. To facilitate drug discovery programs in this important and challenging area, we describe a fragment-based approach to generating small-molecule libraries that target specific IDPs. The method is based on the use of molecular fragments extracted from compounds reported in the literature to inhibit the aggregation of IDPs. These fragments are used to screen existing large generic libraries of small molecules to form smaller libraries specific for given IDPs. We illustrate this approach by describing three distinct small-molecule libraries targeting Aβ, tau, and α-synuclein, three IDPs implicated in Alzheimer's and Parkinson's diseases. The strategy described here offers novel opportunities for the identification of effective molecular scaffolds for drug discovery for neurodegenerative disorders and provides insights into the mechanism of small-molecule binding to IDPs.

  2. CORALINA: a universal method for the generation of gRNA libraries for CRISPR-based screening.

    Science.gov (United States)

    Köferle, Anna; Worf, Karolina; Breunig, Christopher; Baumann, Valentin; Herrero, Javier; Wiesbeck, Maximilian; Hutter, Lukas H; Götz, Magdalena; Fuchs, Christiane; Beck, Stephan; Stricker, Stefan H

    2016-11-14

    The bacterial CRISPR system is fast becoming the most popular genetic and epigenetic engineering tool due to its universal applicability and adaptability. The desire to deploy CRISPR-based methods in a large variety of species and contexts has created an urgent need for the development of easy, time- and cost-effective methods enabling large-scale screening approaches. Here we describe CORALINA (comprehensive gRNA library generation through controlled nuclease activity), a method for the generation of comprehensive gRNA libraries for CRISPR-based screens. CORALINA gRNA libraries can be derived from any source of DNA without the need of complex oligonucleotide synthesis. We show the utility of CORALINA for human and mouse genomic DNA, its reproducibility in covering the most relevant genomic features including regulatory, coding and non-coding sequences and confirm the functionality of CORALINA generated gRNAs. The simplicity and cost-effectiveness make CORALINA suitable for any experimental system. The unprecedented sequence complexities obtainable with CORALINA libraries are a necessary pre-requisite for less biased large scale genomic and epigenomic screens.

  3. ZZ-FSXLIB-JD99, MCNP nuclear data library based on JENDL Dosimetry File 99

    International Nuclear Information System (INIS)

    Shibata, Keiichi

    2007-01-01

    Description: JENDL Dosimetry File 99 processed into ACE for Monte Carlo calculations. JENDL/D-99 based MCNP library. Format: ACE. Number of groups: Continuous energy cross section library. Nuclides: 47 Nuclides and 67 reactions: Li-6 (n, triton) alpha; Li-6 alpha-production; Li-7 triton- production; B-10 (n, alpha) Li-7; B-10 alpha-production; F-19 (n, 2n) F-18; Na-23 (n, 2n) Na-22; Na-23 (n, gamma) Na-24; Mg-24 (n, p) Na-24; Al-27 (n, p) Mg-27; Al-27 (n, alpha) Na-24; P-31 (n, p) Si-31; S-32 (n, p) P-32; Sc-45 (n, gamma) Sc-46; Ti-nat (n, x) Sc-46; Ti-nat (n, x) Sc-47; Ti-nat (n, x) Sc-48; Ti-46 (n, 2n) Ti-45; Ti-46 (n, p) Sc-46; Ti-47 (n, np) Sc-46; Ti-47 (n, p) Sc-47; Ti-48 (n, np) Sc-47; Ti-48 (n, p) Sc-48; Ti-49 (n, np) Sc-48; Cr-50 (n, gamma) Cr-51; Cr-52 (n, 2n) Cr-51; Mn-55 (n, 2n) Mn-54; Mn-55 (n, gamma) Mn-56; Fe-54 (n, p) Mn-54; Fe-56 (n, p) Mn-56; Fe-57 (n, np) Mn-56; Fe-58 (n, gamma) Fe-59; Co-59 (n, 2n) Co-58; Co-59 (n, gamma) Co-60; Co-59 (n, alpha) Mn-56; Ni-58 (n, 2n) Ni-57; Ni-58 (n, p) Co-58; Ni-60 (n, p) Co-60; Cu-63 (n, 2n) Cu-62; Cu-63 (n, gamma) Cu-64; Cu-63 (n, alpha) Co-60; Cu-65 (n, 2n) Cu-64; Zn-64 (n, p) Cu-64; Y-89 (n, 2n) Y-88; Zr-90 (n, 2n) Zr-89; Nb-93 (n, n') Nb-93m; Nb-93 (n, 2n) Nb-92m; Rh-103 (n, n') Rh-103m; Ag-109 (n, gamma) Ag-110m; In-115 (n, n') In-115m; In-115 (n, gamma) In-116m; I-127 (n,2n) I-126; Eu-151 (n, gamma) Eu-152; Tm-169 (n,2n) Tm-168; Ta-181 (n, gamma) Ta-182; W-186 (n, gamma) W-187; Au-197 (n, 2n) Au-196; Au-197 (n, gamma) Au-198; Hg-199 (n, n') Hg-199m; Th-232 - fission; Th-232 (n, gamma) Th-233; U-235 - fission; U-238 - fission; U-238 (n, gamma) U-239; Np-237 - fission; Pu-239 - fission; Am-241 - fission. The data were produced on the 31 of March, 2006

  4. Libraries and Accessibility: Istanbul Public Libraries Case

    Directory of Open Access Journals (Sweden)

    Gül Yücel

    2016-12-01

    In this study, an assessment of accessibility was conducted in Istanbul public libraries within the scope of public space. Public libraries across Turkey together serve more than 20 million users, with more than one thousand branches in city centres and more than one million registered members. The building principles and standards covering subjects such as site selection, the historical and architectural character of the region, distance to the population centre, and design that allows disabled people to benefit fully from library services have been determined by regulations for the construction of new libraries. For existing libraries there are works, within the scope of the related standards, on matters such as access for the disabled and fire safety precautions. Easy access for everyone is prioritized in public libraries, which have a significant role in life-long learning. The purpose of the study is to develop solution suggestions for accessibility problems in public libraries. The study is based on visual inspection and assessments carried out within the scope of accessibility in the public libraries attached to the Library and Publications Department of the Istanbul Culture and Tourism Provincial Directorate, within the provincial borders of Istanbul. Arrangements such as reading halls, study areas, and book shelves were examined within the frame of accessible building standards; building entrances, ramps and staircases, horizontal and vertical circulation of buildings, and so on were also taken into consideration. Subjects such as reading and studying areas and book shelf arrangements were assessed for specific buildings. There are a total of 34 public libraries attached to the Istanbul Culture and Tourism Provincial Directorate, of which 20 are in the

  5. Experience in Computer-Assisted XML-Based Modelling in the Context of Libraries

    CERN Document Server

    Niinimäki, M

    2003-01-01

    In this paper, we introduce a software called Meta Data Visualisation (MDV) that (i) assists the user with a graphical user interface in the creation of his specific document types, (ii) creates a database according to these document types, (iii) allows the user to browse the database, and (iv) uses native XML presentation of the data in order to allow queries or data to be exported to other XML-based systems. We illustrate the use of MDV and XML modelling using library-related examples to build a bibliographic database. In our opinion, creating document type descriptions corresponds to conceptual and logical database design in a database design process. We consider that this design can be supported with a suitable set of tools that help the designer concentrate on conceptual issues instead of implementation issues. Our hypothesis is that using the methodology presented in this paper we can create XML databases that are useful and relevant, and with which MDV works as a user interface.
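    The kind of native-XML bibliographic record MDV manages can be sketched generically. The element names below are invented for illustration and are not MDV's actual document types.

```python
# Generic sketch of building and querying a native-XML bibliographic
# record with the standard library; the record structure is a
# hypothetical stand-in for an MDV-style document type.
import xml.etree.ElementTree as ET

record_xml = """
<record id="b001">
  <title>Experience in Computer-Assisted XML-Based Modelling</title>
  <creator>Niinimaki, M.</creator>
  <year>2003</year>
</record>
"""

record = ET.fromstring(record_xml)
# Path-style queries are what make a native XML presentation easy to
# query and to export to other XML-based systems.
print(record.get("id"))          # b001
print(record.findtext("title"))
```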

  6. Molecular Bases of PDE4D Inhibition by Memory-Enhancing GEBR Library Compounds.

    Science.gov (United States)

    Prosdocimi, Tommaso; Mollica, Luca; Donini, Stefano; Semrau, Marta S; Lucarelli, Anna Paola; Aiolfi, Egidio; Cavalli, Andrea; Storici, Paola; Alfei, Silvana; Brullo, Chiara; Bruno, Olga; Parisini, Emilio

    2018-05-01

    Selected members of the large rolipram-related GEBR family of type 4 phosphodiesterase (PDE4) inhibitors have been shown to facilitate long-term potentiation and to improve memory functions without causing emetic-like behavior in rodents. Despite their micromolar-range binding affinities and their promising pharmacological and toxicological profiles, few if any structure-activity relationship studies have been performed to elucidate the molecular bases of their action. Here, we report the crystal structure of a number of GEBR library compounds in complex with the catalytic domain of PDE4D as well as their inhibitory profiles for both the long PDE4D3 isoform and the catalytic domain alone. Furthermore, we assessed the stability of the observed ligand conformations in the context of the intact enzyme using molecular dynamics simulations. The longer and more flexible ligands appear to be capable of forming contacts with the regulatory portion of the enzyme, thus possibly allowing some degree of selectivity between the different PDE4 isoforms.

  7. Covariance Manipulation for Conjunction Assessment

    Science.gov (United States)

    Hejduk, M. D.

    2016-01-01

    The manipulation of space object covariances to try to provide additional or improved information to conjunction risk assessment is not an uncommon practice. Types of manipulation include fabricating a covariance when it is missing or unreliable to force the probability of collision (Pc) to a maximum value ('PcMax'), scaling a covariance to try to improve its realism or see the effect of covariance volatility on the calculated Pc, and constructing the equivalent of an epoch covariance at a convenient future point in the event ('covariance forecasting'). In bringing these methods to bear for Conjunction Assessment (CA) operations, however, some do not remain fully consistent with best practices for conducting risk management, some seem to be of relatively low utility, and some require additional information before they can contribute fully to risk analysis. This study describes some basic principles of modern risk management (following the Kaplan construct) and then examines the PcMax and covariance forecasting paradigms for alignment with these principles; it then further examines the expected utility of these methods in the modern CA framework. Both paradigms are found to be not without utility, but only in situations that are somewhat carefully circumscribed.
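    The effect that makes "PcMax" scaling meaningful (the collision probability peaking at an intermediate covariance size) can already be seen in a one-dimensional caricature. The Gaussian miss-distance model and the numbers below are illustrative assumptions, not the operational CA computation.

```python
# One-dimensional caricature of covariance scaling in conjunction
# assessment: for a Gaussian miss distance with fixed mean, the
# probability of being inside the hard-body radius first grows and
# then shrinks as the position uncertainty (sigma) is inflated,
# which is why a maximum probability ("PcMax") exists.
import math

def pc_1d(miss_mean, sigma, hard_body_radius):
    """P(|x| < R) for x ~ Normal(miss_mean, sigma^2)."""
    def cdf(z):
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (cdf((hard_body_radius - miss_mean) / sigma)
            - cdf((-hard_body_radius - miss_mean) / sigma))

# Illustrative values: 5 km mean miss, 1 km hard-body radius.
small  = pc_1d(5.0, 1.0, 1.0)   # tight covariance: a miss is "certain"
medium = pc_1d(5.0, 5.0, 1.0)   # inflated covariance: Pc peaks
large  = pc_1d(5.0, 50.0, 1.0)  # huge covariance: probability dilutes
print(medium > small and medium > large)  # True
```

    The peak at an intermediate scale is the quantity the PcMax paradigm reports when a reliable covariance is unavailable.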

  8. Kinematic source inversions of teleseismic data based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, O.; McDougall, D.; Mai, P. M.; Babuska, I.

    2014-12-01

    One fundamental aspect of seismic hazard mitigation is gaining a better understanding of the rupture process. Because direct observation of the relevant parameters and properties is not possible, other means such as kinematic source inversions are used instead. By constraining the spatial and temporal evolution of fault slip during an earthquake, those inversion approaches may enable valuable insights into the physics of the rupture process. However, due to the underdetermined nature of this inversion problem (i.e., inverting a kinematic source model for an extended fault based on seismic data), the provided solutions are generally non-unique. Here we present a statistical (Bayesian) inversion approach based on an open-source library for uncertainty quantification (UQ) called QUESO that was developed at ICES (UT Austin). The approach has advantages with respect to deterministic inversion approaches, as it provides not only a single (non-unique) solution but also uncertainty bounds with it. Those uncertainty bounds help to judge, qualitatively and quantitatively, how well constrained an inversion solution is and how much rupture complexity the data reliably resolve. The presented inversion scheme uses only tele-seismically recorded body waves, but future developments may lead us towards joint inversion schemes. After giving an insight into the inversion scheme itself (based on delayed rejection adaptive Metropolis, DRAM), we explore the method's resolution potential. For that, we synthetically generate tele-seismic data, add for example different levels of noise and/or change the fault plane parameterization, and then apply our inversion scheme in an attempt to extract the (known) kinematic rupture model. We conclude by inverting, as an example, real tele-seismic data of a recent large earthquake and comparing those results with deterministically derived kinematic source models provided by other research groups.
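    The DRAM sampler inside QUESO is beyond a short sketch, but the plain random-walk Metropolis step it generalizes can be shown compactly. The toy one-parameter "source model" posterior below is an assumption for illustration, not the paper's misfit function.

```python
# Plain random-walk Metropolis sketch (the baseline that delayed
# rejection adaptive Metropolis, DRAM, generalizes). A standard-normal
# log-posterior in one parameter stands in for a real kinematic
# source-model misfit.
import math
import random

def log_posterior(theta):
    return -0.5 * theta * theta  # standard normal, up to a constant

def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    theta = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples

chain = metropolis(20000)
mean = sum(chain) / len(chain)
print(abs(mean) < 0.2)  # the chain mean sits near the true value 0
```

    The sample spread of such a chain is what yields the uncertainty bounds that distinguish the Bayesian approach from a single deterministic solution.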

  9. SCHOOL COMMUNITY PERCEPTION OF LIBRARY APPS AGAINTS LIBRARY EMPOWERMENT

    Directory of Open Access Journals (Sweden)

    Achmad Riyadi Alberto

    2017-07-01

    This research is motivated by the rapid development of information and communication technology (ICT) in the library world, which allows today's libraries to develop their services into digital-based services. This study aims to find out the school community's perception of the library apps developed by Riche Cynthia Johan, Hana Silvana, and Holin Sulistyo, and its influence on library empowerment at the library of SD Laboratorium Percontohan UPI Bandung. The library apps in this research belong to the context of m-libraries: libraries that meet the needs of their users by using mobile platforms such as smartphones, computers, and other mobile devices. Empowerment of the library is the utilization of all aspects of library operation to the fullest in order to achieve the expected goals. The analysis of the school community's perception of the library apps uses the Technology Acceptance Model (TAM) and covers: ease of use, usefulness, usability, usage trends, and real-use conditions. The empowerment of the library includes the aspects: information empowerment, empowerment of learning resources, empowerment of human resources, empowerment of library facilities, and library promotion. The research method used is a descriptive method with a quantitative approach. The population and sample are the school community at SD Laboratorium Percontohan UPI Bandung; sample criteria were determined using disproportionate stratified random sampling, giving 83 respondents. Data were analyzed using simple linear regression to measure the influence of the school community's perception of the library apps on library empowerment. The analysis shows that the school community's perception of the library apps influences library empowerment at the library of SD Laboratorium Percontohan UPI Bandung, as evidenced by the library acceptance level and the improvement in library empowerment.

  10. Covariance matrices of experimental data

    International Nuclear Information System (INIS)

    Perey, F.G.

    1978-01-01

    A complete statement of the uncertainties in data is given by its covariance matrix. It is shown how the covariance matrix of data can be generated using the information available to obtain their standard deviations. Determination of resonance energies by the time-of-flight method is used as an example. The procedure for combining data when the covariance matrix is non-diagonal is given. The method is illustrated by means of examples taken from the recent literature to obtain an estimate of the energy of the first resonance in carbon and for five resonances of ²³⁸U
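    The combination procedure the abstract refers to is, in its simplest setting, a generalized least-squares average of correlated measurements of the same quantity. The two measurements and their covariance below are invented numbers for illustration.

```python
# Generalized least-squares combination of two correlated measurements
# of the same quantity: xhat = (1' Cinv y) / (1' Cinv 1), with
# variance 1 / (1' Cinv 1). The data below are illustrative only.

def combine_two(y1, y2, v1, v2, cov):
    """Combine two measurements with a non-diagonal 2x2 covariance
    [[v1, cov], [cov, v2]]."""
    det = v1 * v2 - cov * cov
    # Elements of the inverse covariance matrix.
    a, b, c = v2 / det, -cov / det, v1 / det
    weight_sum = a + 2.0 * b + c              # 1' Cinv 1
    weighted_y = (a + b) * y1 + (b + c) * y2  # 1' Cinv y
    return weighted_y / weight_sum, 1.0 / weight_sum

xhat, var = combine_two(10.0, 12.0, 1.0, 2.0, 0.5)
print(round(xhat, 6), round(var, 6))  # 10.5 0.875
```

    With a diagonal covariance (cov = 0) this reduces to the familiar inverse-variance weighted mean; the positive correlation here pulls the estimate and its variance away from that naive result.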

  11. Evaluation and processing of covariance data

    International Nuclear Information System (INIS)

    Wagner, M.

    1993-01-01

    These proceedings of a specialists' meeting on evaluation and processing of covariance data are divided into 4 parts: Part 1, needs for evaluated covariance data (2 papers); Part 2, generation of covariance data (15 papers); Part 3, processing of covariance files (2 papers); Part 4, experience in the use of evaluated covariance data (2 papers)

  12. Performance of the libraries in Isfahan University of Medical Sciences based on the EFQM model.

    Science.gov (United States)

    Karimi, Saeid; Atashpour, Bahareh; Papi, Ahmad; Nouri, Rasul; Hasanzade, Akbar

    2014-01-01

    Performance measurement is inevitable for university libraries. Hence, planning and establishing a constant and up-to-date measurement system is required for libraries, especially university libraries. Primary studies and analyses reveal that the EFQM Excellence Model has been efficient, and the administrative reform program has focused on the implementation of this model. Therefore, on the basis of these facts as well as the need for a measurement system, the researchers measured the performance of the libraries in schools and hospitals supported by Isfahan University of Medical Sciences, using the EFQM Organizational Excellence Model. This descriptive research study was carried out by a cross-sectional survey method in 2011 and included the librarians and library directors of Isfahan University of Medical Sciences (70 people). The validity of the instrument was assessed by specialists in the fields of Management and Library Science; to measure the reliability of the questionnaire, the Cronbach's alpha coefficient was computed (0.93). The t-test, ANOVA, and Spearman's rank correlation coefficient were used, and the data were analyzed with SPSS. Data analysis revealed that, among the nine dimensions of the performance measurement for the libraries under study, the highest mean score was 65.3% for the leadership dimension and the lowest were 55.1% for people and 55.1% for society results. In general, using the nine-criterion EFQM model, the average level of all dimensions was in good agreement with normal values. However, compared to the other results, the people and society results criteria scored poorly. It is recommended that an expert committee on the people and society results criteria be formed by the individuals concerned, and that various conferences and training courses be held to improve these aspects.
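    The Cronbach's alpha reliability quoted above follows the usual item-variance formula. The tiny questionnaire data in this sketch are invented, and are chosen so that the items are perfectly correlated and alpha comes out as 1.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances /
# variance of total scores). Toy data: 3 items, 4 respondents.

def sample_variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one score list per questionnaire item, each with one
    entry per respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(sample_variance(it) for it in items)
    return (k / (k - 1)) * (1.0 - item_var / sample_variance(totals))

# Perfectly correlated items give the maximum reliability of 1.
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(round(cronbach_alpha(items), 9))  # 1.0
```

    Real questionnaire items are never perfectly correlated, which is why an observed value such as the study's 0.93 is read as high but not perfect internal consistency.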

  13. ZZ ORIGEN2.2-UPJ, A complete package of ORIGEN2 libraries based on JENDL-3.2 and JENDL-3.3

    International Nuclear Information System (INIS)

    Ishikawa, Makoto; Kataoka, Masaharu; Ohkawachi, Yasushi; Ohki, Shigeo; JIN, Tomoyuki; Katakura, Jun-ich; Suyama, Kenya; Yanagisawa, Hiroshi; Matsumoto, Hideki; ONOUE, Akira; Sasahara, Akihiro

    2006-01-01

    1 - Description: ORLIBJ32 is a package of libraries for the ORIGEN2 code based on JENDL-3.2 (NEA-1642). The one-group cross-section data for PWR and BWR were compiled using burnup calculation results from the SWAT code. The FBR libraries were compiled with the analysis system used at JNC for FBR core calculations. The fission yield and decay constant data were also updated using the second version of the JNDC FP library. In ORLIBJ32, not only one-group cross-section data but also variable actinide cross-section data are prepared, using a code written in FORTRAN77; the routines should be linked to the original ORIGEN2.1 program. The LWR libraries are prepared based on the current PWR fuel assembly specification, and the FBR libraries are based on requests by Japanese FBR researchers. Before compiling the libraries, the specification of the fuel assembly was completely reviewed and evaluated by the members of the 'working group on the evaluation of the amount of isotope generation' of the Japanese Nuclear Data Committee. ORLIBJ33 is a new set of libraries based on JENDL-3.3, following the release of JENDL-3.3; the parameters used to prepare the library are the same as those of ORLIBJ32. The original version of ORLIBJ33 is coupled with ORIGEN2.1, but after the release of ORIGEN2.2 from ORNL as CCC-0371 through RSICC, several requests for a combination of ORLIBJ33 and ORIGEN2.2 were received. During the development of ORLIBJ33, released as NEA-1642, the authors found a problem in the library maker for the FBR libraries, and consequently it was revised and tested at JNC-Oarai. This package, 'ORIGEN2.2-UPJ', contains: - the updated source code of ORIGEN2.2 of CCC-0371 to use ORLIBJ32 and ORLIBJ33, - all original libraries in CCC-0371, - ORLIBJ32 from NEA-164/03 (but with revised FBR libraries), - and ORLIBJ33. In this package, decay data based on the second version of the JNDC FP library, and photon and decay data libraries based on JENDL-3.3, are also included. NLB and NLIB

  14. Testing power-law cross-correlations: Rescaled covariance test

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2013-01-01

    Roč. 86, č. 10 (2013), 418-1-418-15 ISSN 1434-6028 R&D Projects: GA ČR GA402/09/0965 Institutional support: RVO:67985556 Keywords : power-law cross-correlations * testing * long-term memory Subject RIV: AH - Economics Impact factor: 1.463, year: 2013 http://library.utia.cas.cz/separaty/2013/E/kristoufek-testing power-law cross-correlations rescaled covariance test.pdf

  15. Synthesis and systematic evaluation of dark resonance energy transfer (DRET)-based library and its application in cell imaging.

    Science.gov (United States)

    Su, Dongdong; Teoh, Chai Lean; Kang, Nam-Young; Yu, Xiaotong; Sahu, Srikanta; Chang, Young-Tae

    2015-03-01

    In this paper, we report a new strategy for constructing a dye library with large Stokes shifts. By coupling a dark donor with BODIPY acceptors of tunable high quantum yield, a novel dark resonance energy transfer (DRET)-based library, named BNM, has been synthesized. Upon excitation of the dark donor (BDN) at 490 nm, the absorbed energy is transferred to the acceptor (BDM) with high efficiency, which was tunable in a broad range from 557 nm to 716 nm, with a high quantum yield of up to 0.8. It is noteworthy to mention that the majority of the non-radiative energy loss of the donor was converted into the acceptor's fluorescence output with a minimum leak of donor emission. Fluorescence imaging tested in live cells showed that the BNM compounds are cell-permeable and can also be employed for live-cell imaging. This is a new library which can be excited through a dark donor allowing for strong fluorescence emission in a wide range of wavelengths. Thus, the BNM library is well suited for high-throughput screening or multiplex experiments in biological applications by using a single laser excitation source. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Generation of ENDF/B-IV based 35 group neutron cross-section library and its application in criticality studies

    International Nuclear Information System (INIS)

    Garg, S.B.; Sinha, A.

    1985-01-01

    A 35-group cross-section library with P3-anisotropic scattering matrices and resonance self-shielding factors has been generated from the basic ENDF/B-IV cross-section files for 57 elements. This library covers the neutron energy range from 0.005 eV to 15 MeV and is well suited for the neutronics and safety analysis of fission, fusion and hybrid systems. The library is contained in two well-known files, namely ISOTXS and BRKOXS. In order to test the efficacy of this library and to bring out the importance of resonance self-shielding, a few selected fast critical assemblies representing large dilute oxide- and carbide-fueled uranium and plutonium based systems have been analysed. These assemblies include ZPPR-2, ZPR-3-48, ZPR-3-53, ZPR-6-6A, ZPR-6-7, ZPR-9-31 and ZEBRA-2 and are amongst those recommended by the US Nuclear Data Evaluation Working Group for testing the accuracy of cross-sections. The evaluated multiplication constants of these assemblies compare favourably with those calculated by others

  17. MCNP4c JEFF-3.1 Based Libraries. Eccolib-Jeff-3.1 libraries; Les bibliotheques Eccolib-Jeff-3.1

    Energy Technology Data Exchange (ETDEWEB)

    Sublet, J.Ch

    2006-07-01

    Continuous-energy and multi-temperature MCNP ACE-type libraries, derived from the Joint European Fusion-Fission JEFF-3.1 evaluations, have been generated using the NJOY-99.111 processing code system. They include the continuous-energy neutron JEFF-3.1/General Purpose, JEFF-3.1/Activation-Dosimetry and thermal S(α,β) JEFF-3.1/Thermal libraries and data tables. The processing steps and features are explained, together with the Quality Assurance processes and records linked to the generation of such multipurpose libraries. (author)

  18. On Galilean covariant quantum mechanics

    International Nuclear Information System (INIS)

    Horzela, A.; Kapuscik, E.; Kempczynski, J.; Joint Inst. for Nuclear Research, Dubna

    1991-08-01

    A formalism exhibiting the Galilean covariance of wave mechanics is proposed. A new notion of quantum mechanical forces is introduced. The formalism is illustrated with the example of the harmonic oscillator. (author)

  19. Generation, Testing, and Validation of a WIMS-D/4 Multigroup Cross-Section Library Based on the JENDL-3.2 Nuclear Data

    International Nuclear Information System (INIS)

    Rahman, Mafizur; Takano, Hideki

    2001-01-01

    A new 69-group library of multigroup constants for the lattice code WIMS-D/4 has been generated with an improved resonance treatment, processing nuclear data from JENDL-3.2 with NJOY91.108. A parallel ENDF/B-VI based library has also been constructed for intercomparison of results. Benchmark calculations for a number of thermal reactor critical assemblies of both uranium and plutonium fuels have been performed with the code WIMS-D/4.1 using its three different libraries: the original WIMS library (NEA-0329/10) and the new ENDF/B-VI and JENDL-3.2 based libraries. The results calculated with both the ENDF and JENDL based libraries show a similar tendency and are found to be in better agreement with the experimental values. Benchmark parameters were further calculated with the comprehensive lattice code SRAC95. The results from SRAC95 and WIMS-D/4.1 (both using JENDL-3.2 based libraries) agree well with each other. The new library is also verified for its applicability to mixed-oxide cores of varying plutonium contents

  20. Propagation of nuclear data uncertainty: Exact or with covariances

    Directory of Open Access Journals (Sweden)

    van Veen D.

    2010-10-01

    Two distinct methods of propagating basic nuclear data uncertainties to large-scale systems will be presented and compared. The “Total Monte Carlo” method uses a statistical ensemble of nuclear data libraries randomly generated by means of a Monte Carlo approach with the TALYS system. These libraries are then directly used in a large number of reactor calculations (for instance with MCNP), after which the exact probability distribution for the reactor parameter is obtained. The second method makes use of available covariance files and can be done in a single reactor calculation (by using the perturbation method). In this exercise, both methods use consistent sets of data files, which implies that the covariance files used in the second method are directly obtained from the randomly generated nuclear data libraries of the first method. This is a unique and straightforward comparison, allowing one to directly apprehend the advantages and drawbacks of each method. Comparisons for different reactions and criticality-safety benchmarks from 19F to actinides will be presented. We can thus conclude whether current methods for using covariance data are good enough or not.
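    The two propagation routes in this record can be contrasted on a toy model. Everything in the sketch below is an invented stand-in: the quadratic `response` function and its two "nuclear data" parameters replace a real TALYS/MCNP pipeline; only the two propagation techniques themselves come from the record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model: a "reactor response" that depends nonlinearly on two
# nuclear-data parameters p = (p1, p2); not any actual MCNP/TALYS output.
def response(p):
    return p[0]**2 + 0.5 * p[0] * p[1]

mean = np.array([1.0, 2.0])
cov = np.array([[0.01, 0.002],
                [0.002, 0.04]])          # plays the role of a covariance file

# 1) Total Monte Carlo: sample many random "libraries", run the model each time,
#    and read the spread off the resulting distribution of responses.
samples = rng.multivariate_normal(mean, cov, size=100_000)
k = np.array([response(p) for p in samples])
mc_var = k.var()

# 2) Covariance (perturbation) method: first-order sandwich rule S @ cov @ S.T,
#    with sensitivities S obtained from one calculation per parameter.
eps = 1e-6
S = np.array([(response(mean + eps * np.eye(2)[i]) - response(mean)) / eps
              for i in range(2)])
lin_var = S @ cov @ S

print(mc_var, lin_var)
```

    For this mildly nonlinear response the two variance estimates agree closely, while the Monte Carlo route additionally yields the full probability distribution, which is the advantage the record attributes to it.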

  1. Library Standards: Evidence of Library Effectiveness and Accreditation.

    Science.gov (United States)

    Ebbinghouse, Carol

    1999-01-01

    Discusses accreditation standards for libraries based on experiences in an academic law library. Highlights include the accreditation process; the impact of distance education and remote technologies on accreditation; and a list of Internet sources of standards and information. (LRW)

  2. Computational redesign of bacterial biotin carboxylase inhibitors using structure-based virtual screening of combinatorial libraries.

    Science.gov (United States)

    Brylinski, Michal; Waldrop, Grover L

    2014-04-02

    As the spread of antibiotic-resistant bacteria steadily increases, there is an urgent need for new antibacterial agents. Because fatty acid synthesis is only used for membrane biogenesis in bacteria, the enzymes in this pathway are attractive targets for antibacterial agent development. Acetyl-CoA carboxylase catalyzes the committed and regulated step in fatty acid synthesis. In bacteria, the enzyme is composed of three distinct protein components: biotin carboxylase, biotin carboxyl carrier protein, and carboxyltransferase. Fragment-based screening revealed that an amino-oxazole inhibits biotin carboxylase activity and also exhibits antibacterial activity against Gram-negative organisms. In this report, we redesigned previously identified lead inhibitors to expand the spectrum of bacteria sensitive to the amino-oxazole derivatives by including Gram-positive species. Using 9,411 small organic building blocks, we constructed a diverse combinatorial library of 1.2×10⁸ amino-oxazole derivatives. A subset of 9×10⁶ of these compounds was subjected to structure-based virtual screening against seven biotin carboxylase isoforms using similarity-based docking by eSimDock. Potentially broad-spectrum antibiotic candidates were selected based on the consensus ranking by several scoring functions, including non-linear statistical models implemented in eSimDock and traditional molecular mechanics force fields. The analysis of binding poses of the top-ranked compounds docked to biotin carboxylase isoforms suggests that: (1) binding of the amino-oxazole anchor is stabilized by a network of hydrogen bonds to residues 201, 202 and 204; (2) halogenated aromatic moieties attached to the amino-oxazole scaffold enhance interactions with a hydrophobic pocket formed by residues 157, 169, 171 and 203; and (3) larger substituents reach deeper into the binding pocket to form additional hydrogen bonds with the side chains of residues 209 and 233. These structural insights into drug

  3. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question, which of the provided kinematic source inversion solutions is most reliable and most robust, and — more generally — how accurate are fault parameterization and solution predictions? These issues are not included in "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based a) on a forward-modeling scheme that computes synthetic (body-)waves for a given kinematic rupture model, and b) on the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library that uses MCMC algorithms and Bayes theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e. for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) is no longer attributed to a decreasing misfit. Identification of this cross-over is of importance as it reveals the resolution power of the studied data set (i.e. teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
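    The Bayesian machinery this record relies on can be conveyed with a minimal random-walk Metropolis sampler. The one-parameter sinusoidal `forward` model, the noise level, and the uniform prior below are assumptions for illustration only; they stand in for the teleseismic forward solver and the richer MCMC algorithms QUESO provides.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a kinematic forward model: one "slip" parameter
# generates a synthetic waveform (assumption, not the study's solver).
def forward(slip, t):
    return slip * np.sin(t)

t = np.linspace(0, 2 * np.pi, 50)
true_slip, sigma = 2.0, 0.1
data = forward(true_slip, t) + rng.normal(0, sigma, t.size)

def log_post(slip):
    if not 0.0 < slip < 10.0:            # uniform prior on (0, 10)
        return -np.inf
    r = data - forward(slip, t)
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis: the simplest MCMC sampler of the family used here
chain, cur = [], 1.0
lp = log_post(cur)
for _ in range(5000):
    prop = cur + rng.normal(0, 0.05)
    lpp = log_post(prop)
    if np.log(rng.uniform()) < lpp - lp:  # accept with prob. min(1, ratio)
        cur, lp = prop, lpp
    chain.append(cur)

post = np.array(chain[1000:])             # discard burn-in
print(post.mean(), post.std())
```

    With more parameters (slip per subfault, rupture time, rise time), the same accept/reject loop maps the full posterior density used to assess inversion uncertainty.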

  4. HIGH DIMENSIONAL COVARIANCE MATRIX ESTIMATION IN APPROXIMATE FACTOR MODELS.

    Science.gov (United States)

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2011-01-01

    The variance-covariance matrix plays a central role in the inferential theories of high-dimensional factor models in finance and economics. Popular regularization methods that directly exploit sparsity are not directly applicable to many financial problems. Classical methods of estimating the covariance matrices are based on strict factor models, which assume independent idiosyncratic components. This assumption, however, is restrictive in practical applications. By assuming a sparse error covariance matrix, we allow the presence of cross-sectional correlation even after taking out common factors, which enables us to combine the merits of both methods. We estimate the sparse covariance using the adaptive thresholding technique of Cai and Liu (2011), taking into account the fact that direct observations of the idiosyncratic components are unavailable. The impact of high dimensionality on covariance matrix estimation based on the factor structure is then studied.
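    The estimator sketched in this abstract, a low-rank common component recovered from factors plus a thresholded sparse residual covariance, can be illustrated in a few lines on simulated data. This is a simplified sketch: it uses plain PCA for the factor step and hard universal thresholding in place of the adaptive, entry-wise thresholding of Cai and Liu, both assumptions of this illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate p series driven by K common factors plus idiosyncratic noise
n, p, K = 500, 40, 3
B = rng.normal(size=(p, K))              # factor loadings
F = rng.normal(size=(n, K))              # factor realizations
U = rng.normal(scale=0.5, size=(n, p))   # idiosyncratic errors
X = F @ B.T + U

# Step 1: estimate the common (low-rank) part via PCA on the sample covariance
S = np.cov(X, rowvar=False)
w, V = np.linalg.eigh(S)
Vk = V[:, -K:]                           # top-K eigenvectors
common = Vk @ np.diag(w[-K:]) @ Vk.T

# Step 2: threshold the residual covariance; small off-diagonal entries are
# treated as noise and set to zero, keeping the variances on the diagonal
R = S - common
tau = 2 * np.sqrt(np.log(p) / n)         # universal threshold (assumed rate)
R_thr = np.where(np.abs(R) >= tau, R, 0.0)
np.fill_diagonal(R_thr, np.diag(R))

Sigma_hat = common + R_thr               # final factor-based estimator
```

    The key point the abstract makes survives the simplification: sparsity is imposed only on the residual after the common factors are removed, not on the covariance matrix itself.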

  5. Library Computing

    Science.gov (United States)

    Library Computing, 1985

    1985-01-01

    Special supplement to "Library Journal" and "School Library Journal" covers topics of interest to school, public, academic, and special libraries planning for automation: microcomputer use, readings in automation, online searching, databases of microcomputer software, public access to microcomputers, circulation, creating a…

  6. From cheminformatics to structure-based design: Web services and desktop applications based on the NAOMI library.

    Science.gov (United States)

    Bietz, Stefan; Inhester, Therese; Lauck, Florian; Sommer, Kai; von Behren, Mathias M; Fährrolfes, Rainer; Flachsenberg, Florian; Meyder, Agnes; Nittinger, Eva; Otto, Thomas; Hilbig, Matthias; Schomburg, Karen T; Volkamer, Andrea; Rarey, Matthias

    2017-11-10

    Nowadays, computational approaches are an integral part of life science research. Problems related to interpretation of experimental results, data analysis, or visualization tasks highly benefit from the achievements of the digital era. Simulation methods facilitate predictions of physicochemical properties and can assist in understanding macromolecular phenomena. Here, we will give an overview of the methods developed in our group that aim at supporting researchers from all life science areas. Based on state-of-the-art approaches from structural bioinformatics and cheminformatics, we provide software covering a wide range of research questions. Our all-in-one web service platform ProteinsPlus (http://proteins.plus) offers solutions for pocket and druggability prediction, hydrogen placement, structure quality assessment, ensemble generation, protein-protein interaction classification, and 2D-interaction visualization. Additionally, we provide a software package that contains tools targeting cheminformatics problems like file format conversion, molecule data set processing, SMARTS editing, fragment space enumeration, and ligand-based virtual screening. Furthermore, it also includes structural bioinformatics solutions for inverse screening, binding site alignment, and searching interaction patterns across structure libraries. The software package is available at http://software.zbh.uni-hamburg.de. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Analysis of valve failures from the NUCLARR data base

    International Nuclear Information System (INIS)

    Moore, L.M.

    1997-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) contains data on component failures with categorical and qualifying information such as component design, normal operating state, system application and safety grade information which is important to the development of risk-based component surveillance testing requirements. This report presents descriptions and results of analyses of valve component failure data and covariate information available in the document Nuclear Computerized Library for Assessing Reactor Reliability Data Manual, Part 3: Hardware Component Failure Data (NUCLARR Data Manual). Although there are substantial records on valve performance, there are many categories of the corresponding descriptors and qualifying information for which specific values are missing. Consequently, this limits the data available for analysis of covariate effects. This report presents cross tabulations by different covariate categories and limited modeling of covariate effects for data subsets with substantive non-missing covariate information

  8. Measuring Levels of Work in Academic Libraries: A Time Based Approach.

    Science.gov (United States)

    Gould, Donald P.

    1985-01-01

    Using Stratified Systems Theory, which focuses on the manager-subordinate relationship in the bureaucratic structure, a study was conducted to measure the level of responsibility in the work of 37 professional and nonprofessional positions in four academic library technical services departments. Three levels of work were measured in "time-spans of…

  9. EcoIS: An image serialization library for plot-based plant flowering phenology

    DEFF Research Database (Denmark)

    Granados, Joel A.; Bonnet, Philippe; Hansen, Lars Hostrup

    2013-01-01

    they are produced by introducing an open source Python (www.python.org) library called EcoIS that creates image series from unaligned pictures of specially equipped plots. We use EcoIS to sample flowering phenology plots in a high arctic environment and create image series that later generate phenophase counts...

  10. The design of a web – based integrated library system with internet ...

    African Journals Online (AJOL)

    Developing countries like Nigeria face a series of challenges in developing, managing and securing an Integrated Library System (ILS) in the tertiary institutions and secondary schools where they are most needed. ILSs face issues such as lack of interactivity, speed, cost, unavailability of experienced users and programmers and ...

  11. FRDS.Broker Library

    DEFF Research Database (Denmark)

    2018-01-01

    The FRDS.Broker library is a teaching-oriented implementation of the Broker architectural pattern for distributed remote method invocation. It defines the central roles of the pattern and provides implementations of those roles that are not domain/use-case specific. It provides a JSON-based (Gson library) Requestor implementation, and implementations of the ClientRequestHandler and ServerRequestHandler roles in both Java socket-based and HTTP/URI-tunneling-based variants. The latter is based upon the UniRest and Spark-Java libraries. The Broker pattern and the source code are explained

  12. ANGELO-LAMBDA, Covariance matrix interpolation and mathematical verification

    International Nuclear Information System (INIS)

    Kodeli, Ivo

    2007-01-01

    1 - Description of program or function: The codes ANGELO-2.3 and LAMBDA-2.3 are used for the interpolation of cross section covariance data from the original to a user-defined energy group structure, and for mathematical tests of the matrices, respectively. The LAMBDA-2.3 code calculates the eigenvalues of the matrices (both the original and the converted) and classifies them accordingly as positive or negative. This verification is strongly recommended before using any covariance matrices. These versions of the two codes are extended versions of the previous codes available in the package NEA-1264 - ZZ-VITAMIN-J/COVA. They were specifically developed for the purposes of the OECD LWR UAM benchmark, in particular for the processing of the ZZ-SCALE5.1/COVA-44G cross section covariance matrix library retrieved from the SCALE-5.1 package. Either the original SCALE-5.1 libraries or the libraries separated into several files by nuclide can (in principle) be processed by the ANGELO/LAMBDA codes, but the use of the one-nuclide data is strongly recommended. Due to large deviations of the correlation matrix terms from unity observed in some SCALE-5.1 covariance matrices, the previously more severe acceptance condition in the ANGELO-2.3 code was relaxed. In case the correlation coefficients exceed 1.0, only a warning message is issued, and the coefficients are replaced by 1.0. 2 - Methods: ANGELO-2.3 interpolates the covariance matrices to a union grid using flat weighting. The LAMBDA-2.3 code includes the mathematical routines to calculate the eigenvalues of the covariance matrices. 3 - Restrictions on the complexity of the problem: The algorithm used in ANGELO is relatively simple; therefore interpolations involving an energy group structure very different from the original (e.g. a large difference in the number of energy groups) may not be accurate. In particular, in the case of the MT=1018 data (fission spectra covariances) the algorithm may not be
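    The two operations this package performs, flat-weighted regrouping of a covariance matrix and an eigenvalue test, can be sketched as follows. The function names and the toy two-group data are illustrative assumptions, not the actual ANGELO/LAMBDA source.

```python
import numpy as np

def overlap_weights(src_edges, dst_edges):
    """Flat-weighting regrouping matrix: W[i, p] is the fraction of
    destination group i covered by source group p."""
    nd, ns = len(dst_edges) - 1, len(src_edges) - 1
    W = np.zeros((nd, ns))
    for i in range(nd):
        lo_i, hi_i = sorted((dst_edges[i], dst_edges[i + 1]))
        for p in range(ns):
            lo_p, hi_p = sorted((src_edges[p], src_edges[p + 1]))
            ov = max(0.0, min(hi_i, hi_p) - max(lo_i, lo_p))
            W[i, p] = ov / (hi_i - lo_i)
    return W

def interpolate_cov(C, src_edges, dst_edges):
    """Convert covariance matrix C to the destination group structure."""
    W = overlap_weights(src_edges, dst_edges)
    return W @ C @ W.T

def eigen_check(C, tol=-1e-10):
    """Eigenvalue test: a usable covariance matrix must have no eigenvalue
    meaningfully below zero (positive semi-definiteness)."""
    eig = np.linalg.eigvalsh(C)
    return eig, bool(eig.min() >= tol)

# Toy example: collapse a 2-group covariance onto a single group
C = np.array([[1.0, 0.5],
              [0.5, 1.0]])
Cc = interpolate_cov(C, [0.0, 1.0, 2.0], [0.0, 2.0])
eig, ok = eigen_check(C)
print(Cc, eig, ok)
```

    As the record warns, this simple overlap-based regrouping loses accuracy when the destination grid differs greatly from the original, which is why the eigenvalue check on the converted matrix is recommended before use.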

  13. Health sciences libraries' subscriptions to journals: expectations of general practice departments and collection-based analysis.

    Science.gov (United States)

    Barreau, David; Bouton, Céline; Renard, Vincent; Fournier, Jean-Pascal

    2018-04-01

    The aims of this study were to (i) assess the expectations of general practice departments regarding health sciences libraries' subscriptions to journals and (ii) describe the current general practice journal collections of health sciences libraries. A cross-sectional survey was distributed electronically to the thirty-five university general practice departments in France. General practice departments were asked to list ten journals to which they expected access via the subscriptions of their health sciences libraries. A ranked reference list of journals was then developed. Access to these journals was assessed through a survey sent to all health sciences libraries in France. Adequacy ratios (access/need) were calculated for each journal. All general practice departments completed the survey. The total reference list included 44 journals. This list was heterogeneous in terms of indexation/impact factor, language of publication, and scope (e.g., patient care, research, or medical education). Among the first 10 journals listed, La Revue Prescrire (96.6%), La Revue du Praticien-Médecine Générale (90.9%), the British Medical Journal (85.0%), Pédagogie Médicale (70.0%), Exercer (69.7%), and the Cochrane Database of Systematic Reviews (62.5%) had the highest adequacy ratios, whereas Family Practice (4.2%), the British Journal of General Practice (16.7%), Médecine (29.4%), and the European Journal of General Practice (33.3%) had the lowest adequacy ratios. General practice departments have heterogeneous expectations in terms of health sciences libraries' subscriptions to journals. It is important for librarians to understand the heterogeneity of these expectations, as well as local priorities, so that journal access meets users' needs.

  14. Quantum mechanical energy-based screening of combinatorially generated library of tautomers. TauTGen: a tautomer generator program.

    Science.gov (United States)

    Harańczyk, Maciej; Gutowski, Maciej

    2007-01-01

    We describe a procedure for finding low-energy tautomers of a molecule. The procedure consists of (i) combinatorial generation of a library of tautomers, (ii) screening based on the results of geometry optimization of initial structures performed at the density functional level of theory, and (iii) final refinement of geometry for the top hits at the second-order Møller-Plesset level of theory, followed by single-point energy calculations at the coupled cluster level of theory with single, double, and perturbative triple excitations. The library of initial structures of the various tautomers is generated with TauTGen, a tautomer generator program. The procedure proved successful for molecular systems for which common chemical knowledge had not been sufficient to predict the most stable structures.

  15. pyOpenMS: a Python-based interface to the OpenMS mass-spectrometry algorithm library.

    Science.gov (United States)

    Röst, Hannes L; Schmitt, Uwe; Aebersold, Ruedi; Malmström, Lars

    2014-01-01

    pyOpenMS is an open-source, Python-based interface to the C++ OpenMS library, providing facile access to a feature-rich, open-source algorithm library for MS-based proteomics analysis. It contains Python bindings that allow raw access to the data structures and algorithms implemented in OpenMS, specifically those for file access (mzXML, mzML, TraML, mzIdentML among others), basic signal processing (smoothing, filtering, de-isotoping, and peak-picking) and complex data analysis (including label-free, SILAC, iTRAQ, and SWATH analysis tools). pyOpenMS thus allows fast prototyping and efficient workflow development in a fully interactive manner (using the interactive Python interpreter) and is also ideally suited for researchers not proficient in C++. In addition, our code to wrap a complex C++ library is completely open-source, allowing other projects to create similar bindings with ease. The pyOpenMS framework is freely available at https://pypi.python.org/pypi/pyopenms while the autowrap tool to create Cython code automatically is available at https://pypi.python.org/pypi/autowrap (both released under the 3-clause BSD licence). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Multiple feature fusion via covariance matrix for visual tracking

    Science.gov (United States)

    Jin, Zefenfen; Hou, Zhiqiang; Yu, Wangsheng; Wang, Xin; Sun, Hui

    2018-04-01

    Aiming at the problem of complicated dynamic scenes in visual target tracking, a multi-feature fusion tracking algorithm based on the covariance matrix is proposed to improve the robustness of tracking. Within the framework of a quantum genetic algorithm, the region covariance descriptor is used to fuse color, edge and texture features, and a fast covariance intersection algorithm is used to update the model. The low dimension of the region covariance descriptor, the fast convergence and strong global optimization ability of the quantum genetic algorithm, and the speed of the fast covariance intersection algorithm together improve the computational efficiency of the fusion, matching and updating processes, so that the algorithm achieves fast and effective multi-feature fusion tracking. Experiments show that the proposed algorithm not only achieves fast and robust tracking but also effectively handles interference from occlusion, rotation, deformation, motion blur and so on.
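    The region covariance descriptor at the heart of this fusion scheme is compact to compute: stack one feature vector per pixel and take their sample covariance. The sketch below uses a commonly assumed feature set (pixel coordinates, intensity, gradient magnitudes) for a grayscale patch; the paper itself fuses color, edge and texture features.

```python
import numpy as np

def region_covariance(patch):
    """Region covariance descriptor of a grayscale patch.
    Per-pixel feature vector: [x, y, I, |Ix|, |Iy|] (assumed feature choice).
    Returns the 5x5 covariance of these features over the region."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]                 # pixel coordinates
    Iy, Ix = np.gradient(patch.astype(float))   # image gradients
    F = np.stack([xs.ravel(),
                  ys.ravel(),
                  patch.ravel(),
                  np.abs(Ix).ravel(),
                  np.abs(Iy).ravel()], axis=1)  # (h*w, 5) feature matrix
    return np.cov(F, rowvar=False)

# Toy usage on a synthetic 8x8 patch
patch = np.arange(64, dtype=float).reshape(8, 8)
C = region_covariance(patch)
print(C.shape)
```

    Because the descriptor is a small symmetric matrix regardless of region size, candidate regions can be matched cheaply, which is what makes it attractive for the fast tracking loop described above.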

  17. Learning based on library automation in mobile devices: The video production by students of Universidade Federal do Cariri Library Science Undergraduate Degree

    Directory of Open Access Journals (Sweden)

    David Vernon VIEIRA

    Video production for learning has become prominent over the last few years, especially where it involves the application of hardware and software for the automation of spaces. In Librarianship undergraduate degrees, the need for practical learning focused on the requirements of library automation demands that teachers develop educational content enabling students to learn through videos and thereby increase their knowledge of information technology. This paper discusses the possibilities of learning through mobile devices in education, reporting an experience with students who entered the Bachelor Degree in Library Science at the Universidade Federal do Cariri (Federal University of Cariri), in the state of Ceará, Brazil, in March 2015 (2015.1). The literature review includes articles published in scientific journals and conference proceedings, and books in English, Portuguese and Spanish. The methodology, with a quantitative and qualitative approach, comprises an exploratory study in which data were collected via an online survey about the students' experience of producing library automation videos. The learning experience of using mobile devices to record the technological environments of libraries resulted in 25 videos covering aspects of library automation, with the students participating actively in the production of the videos and their publication on the Internet.

  18. Testing of a JEF-1 based WIMS-D cross section library for migration area and k-infinity predictions for LWHCR lattices

    International Nuclear Information System (INIS)

    Pelloni, S.; Stepanek, J.

    1987-01-01

    The cell code WIMSD4 is used for the analysis of PROTEUS-LWHCR experiments. A library for this code, based on the European evaluation JEF-1, was produced at EIR using the Los Alamos NJOY system with its module WIMSR and the Canadian management code WILMA. In general, this library delivered more accurate eigenvalues and reaction rates than the WIMS-Standard and WIMS81 libraries did in comparison with experimental values from PROTEUS-LWHCR Cores 1-3. However, large discrepancies (up to about 10%) occurred between calculated migration areas (M²). Additional investigations have been undertaken to clarify this problem, since theoretical M² values are needed for deducing k-infinity in the experiments. This has been done in the context of calculations for a reference LWHCR test lattice. The following major reasons for these deviations were found. First, the self-scattering term in non-moderators (P0 matrix) in the JEF-1 library was not transport corrected. Second, the Standard and JEF-1 libraries use infinitely dilute cross sections for 238U, whereas the WIMS81 library uses fully shielded cross sections. Third, the Standard library uses the 'row' formula for the transport correction, whereas the 'inflow' formula is applied in the case of the JEF-1 and WIMS81 libraries. Lastly, oxygen and 238U scattering cross sections in the fast energy range are smaller in the case of the WIMS81 library. Differences in calculated k-infinity values between the currently used library and WIMS81 (up to 3%) come (in order of importance for the reference LWHCR lattice) mainly from resonance cross sections for 240Pu capture, 238U capture and 239Pu fission. Recommendations have been made for generating a new JEF-1 library using updated versions of WIMSR and WILMA. (author)

  19. Tapping Teen Talent in Queens: A Library-Based, LSCA-Funded Youth Development Success Story from New York.

    Science.gov (United States)

    Williams, Barbara Osborne

    1996-01-01

    Describes a program developed by the Youth Services Division at the Queens Borough Public Library's Central Library to help teenagers maximize growth opportunities, build self-esteem, and see the library as a life resource. Highlights include securing funding through LSCA (Library Services and Construction Act), recruiting participants, and…

  20. Production and testing of HENDL-2.1/CG coarse-group cross-section library based on ENDF/B-VII.0

    International Nuclear Information System (INIS)

    Xu Dezheng; He Zhaozhong; Zou Jun; Zeng Qin

    2010-01-01

    A coarse-group coupled neutron and photon (27n + 21γ) cross-section library, HENDL-2.1/CG, based on the ENDF/B-VII.0 evaluated data source, has been produced by the FDS Team at the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP). HENDL-2.1/CG, containing 350 nuclide cross-section files (from 1H to 252Cf), was generated in MATXS format with the NJOY processing system and then compiled into a coarse-group problem-dependent format using the TRANSX code. In order to verify the availability and reliability of the HENDL-2.1/CG data library, the requisite benchmark calculations were performed and compared against the HENDL-2.0/MG fine-group coupled neutron and photon (175n + 42γ) cross-section library. In general, results using the coarse-group library proved as believable as those from the fine-group library.

  1. The impact of computerisation of library operations on library ...

    African Journals Online (AJOL)

    The use of computer-based systems in libraries and information units is now a vogue. The era of manual system in library operations is on its way to extinction. Recent developments in information world tend towards a globalized information communication technology (ICT). The library as a dynamic institution cannot afford ...

  2. GLq(N)-covariant quantum algebras and covariant differential calculus

    International Nuclear Information System (INIS)

    Isaev, A.P.; Pyatov, P.N.

    1992-01-01

    GL q (N)-covariant quantum algebras with generators satisfying quadratic polynomial relations are considered. It is shown that, up to some inessential arbitrariness, there are only two kinds of such quantum algebras, namely the algebras with q-deformed commutation and q-deformed anticommutation relations. 25 refs

  3. GLq(N)-covariant quantum algebras and covariant differential calculus

    International Nuclear Information System (INIS)

    Isaev, A.P.; Pyatov, P.N.

    1993-01-01

    We consider GL q (N)-covariant quantum algebras with generators satisfying quadratic polynomial relations. We show that, up to some inessential arbitrariness, there are only two kinds of such quantum algebras, namely, the algebras with q-deformed commutation and q-deformed anticommutation relations. The connection with the bicovariant differential calculus on the linear quantum groups is discussed. (orig.)

  4. A class of covariate-dependent spatiotemporal covariance functions

    Science.gov (United States)

    Reich, Brian J; Eidsvik, Jo; Guindani, Michele; Nail, Amy J; Schmidt, Alexandra M.

    2014-01-01

    In geostatistics, it is common to model spatially distributed phenomena through an underlying stationary and isotropic spatial process. However, these assumptions are often untenable in practice because of the influence of local effects in the correlation structure. Therefore, it has been of prolonged interest in the literature to provide flexible and effective ways to model non-stationarity in the spatial effects. Arguably, due to the local nature of the problem, we might envision that the correlation structure would be highly dependent on local characteristics of the domain of study, namely the latitude, longitude and altitude of the observation sites, as well as other locally defined covariate information. In this work, we provide a flexible and computationally feasible way for allowing the correlation structure of the underlying processes to depend on local covariate information. We discuss the properties of the induced covariance functions and discuss methods to assess its dependence on local covariate information by means of a simulation study and the analysis of data observed at ozone-monitoring stations in the Southeast United States. PMID:24772199
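
    One concrete way to let the correlation structure depend on local covariates (in the spirit of the paper, though not its exact model) is a Paciorek-Schervish-type nonstationary kernel whose local length-scale is a function of a site covariate such as altitude. A hedged sketch with synthetic sites:

```python
import numpy as np

# Nonstationary covariance with covariate-dependent length-scales:
# ell(s) = exp(a + b * x(s)), plugged into a Paciorek-Schervish kernel,
# which remains a valid (positive definite) covariance function.
# All site locations and covariate values are synthetic.

def local_lengthscale(covariate, a=0.0, b=1.0):
    return np.exp(a + b * covariate)  # positive by construction

def nonstationary_cov(s, covariate, variance=1.0):
    ell = local_lengthscale(covariate)
    l2 = ell[:, None] ** 2 + ell[None, :] ** 2
    prefactor = np.sqrt(2.0 * ell[:, None] * ell[None, :] / l2)
    return variance * prefactor * np.exp(-((s[:, None] - s[None, :]) ** 2) / l2)

s = np.linspace(0.0, 10.0, 5)               # site locations
x = np.array([0.0, 0.1, 0.5, 0.9, 1.0])     # local covariate (e.g. altitude)
K = nonstationary_cov(s, x)                 # symmetric, unit diagonal, PD
```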

  5. SOA-based digital library services and composition in biomedical applications.

    Science.gov (United States)

    Zhao, Xia; Liu, Enjie; Clapworthy, Gordon J; Viceconti, Marco; Testi, Debora

    2012-06-01

    Carefully collected, high-quality data are crucial in biomedical visualization, and it is important that the user community has ready access to both this data and the high-performance computing resources needed by the complex, computational algorithms that will process it. Biological researchers generally require data, tools and algorithms from multiple providers to achieve their goals. This paper illustrates our response to the problems that result from this. The Living Human Digital Library (LHDL) project presented in this paper has taken advantage of Web Services to build a biomedical digital library infrastructure that allows clinicians and researchers not only to preserve, trace and share data resources, but also to collaborate at the data-processing level. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  6. Cosmic censorship conjecture revisited: covariantly

    International Nuclear Information System (INIS)

    Hamid, Aymen I M; Goswami, Rituparno; Maharaj, Sunil D

    2014-01-01

    In this paper we study the dynamics of the trapped region using a frame independent semi-tetrad covariant formalism for general locally rotationally symmetric (LRS) class II spacetimes. We covariantly prove some important geometrical results for the apparent horizon, and state the necessary and sufficient conditions for a singularity to be locally naked. These conditions bring out, for the first time in a quantitative and transparent manner, the importance of the Weyl curvature in deforming and delaying the trapped region during continual gravitational collapse, making the central singularity locally visible. (paper)

  7. IoT-Based Library Automation and Monitoring System: Developing an Implementation Framework

    OpenAIRE

    Bayani, Majid; Segura, Alberto; Alvarado, Marjorie; Loaiza, Mayra

    2018-01-01

    Currently, Information and Communication Technology (ICT) and related topics such as the Internet of Things (IoT) have an essential influence on all elements of human life. IoT, as a prevalent phenomenon, is transforming daily life through the smart features of Radio Frequency Identification (RFID) and Wireless Sensor Network (WSN) technologies. As IoT progresses, it has grown in size and dimension, improving many areas of society, such as the traditional library system. This...

  8. Design and implementation of universal mathematical library supporting algorithm development for FPGA based systems in high energy physics experiments

    International Nuclear Information System (INIS)

    Jalmuzna, W.

    2006-02-01

    The X-ray free-electron laser XFEL that is being planned at the DESY research center in cooperation with European partners will produce high-intensity ultra-short X-ray flashes with the properties of laser light. This new light source, which can only be described in terms of superlatives, will open up a whole range of new perspectives for the natural sciences. It could also offer very promising opportunities for industrial users. SIMCON (SIMulator and CONtroller) is a fast, low-latency digital controller dedicated to the LLRF system in the VUV FEL experiment, based on modern FPGA chips. It is being developed by the ELHEP group at the Institute of Electronic Systems, Warsaw University of Technology. The main purpose of the project is to create a controller for stabilizing the vector sum of fields in the cavities of one cryomodule in the experiment. The device can also be used as a simulator of the cavity and as a testbench for other devices. The flexibility and computational power of this device allow the implementation of fast mathematical algorithms. This paper describes the concept, implementation and tests of a universal mathematical library for FPGA algorithm implementation. It consists of many useful components, such as an IQ demodulator, a division block, and a library for complex and floating-point operations, and can substantially shorten the implementation time of many complicated algorithms. The library has already been tested using real accelerator signals, and the performance achieved is satisfactory. (Orig.)
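
    The IQ demodulator mentioned above recovers the in-phase and quadrature components of a sampled signal by mixing it with a cosine/sine reference and averaging over whole periods. A hedged floating-point sketch (SIMCON's actual fixed-point FPGA implementation is not reproduced here):

```python
import math

# Digital IQ demodulation: multiply samples by cos/sin of the reference
# phase and average. Choosing f_s = 4 * f_if and averaging over a whole
# number of periods gives exact results. Signal parameters are illustrative.

def iq_demodulate(samples, f_if, f_s):
    """Return (I, Q) components of a tone at f_if sampled at f_s."""
    n = len(samples)
    i_acc = q_acc = 0.0
    for k, x in enumerate(samples):
        phase = 2.0 * math.pi * f_if * k / f_s
        i_acc += x * math.cos(phase)
        q_acc += x * math.sin(phase)
    return 2.0 * i_acc / n, 2.0 * q_acc / n

f_s, f_if, n = 1_000_000.0, 250_000.0, 64    # 4 samples per IF period
sig = [math.cos(2 * math.pi * f_if * k / f_s + math.pi / 6) for k in range(n)]
I, Q = iq_demodulate(sig, f_if, f_s)
amplitude = math.hypot(I, Q)   # ~1.0
phase = math.atan2(-Q, I)      # ~pi/6, the tone's phase offset
```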

  9. Design and implementation of universal mathematical library supporting algorithm development for FPGA based systems in high energy physics experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jalmuzna, W.

    2006-02-15

    The X-ray free-electron laser XFEL that is being planned at the DESY research center in cooperation with European partners will produce high-intensity ultra-short X-ray flashes with the properties of laser light. This new light source, which can only be described in terms of superlatives, will open up a whole range of new perspectives for the natural sciences. It could also offer very promising opportunities for industrial users. SIMCON (SIMulator and CONtroller) is a fast, low-latency digital controller dedicated to the LLRF system in the VUV FEL experiment, based on modern FPGA chips. It is being developed by the ELHEP group at the Institute of Electronic Systems, Warsaw University of Technology. The main purpose of the project is to create a controller for stabilizing the vector sum of fields in the cavities of one cryomodule in the experiment. The device can also be used as a simulator of the cavity and as a testbench for other devices. The flexibility and computational power of this device allow the implementation of fast mathematical algorithms. This paper describes the concept, implementation and tests of a universal mathematical library for FPGA algorithm implementation. It consists of many useful components, such as an IQ demodulator, a division block, and a library for complex and floating-point operations, and can substantially shorten the implementation time of many complicated algorithms. The library has already been tested using real accelerator signals, and the performance achieved is satisfactory. (Orig.)

  10. Intracellular directed evolution of proteins from combinatorial libraries based on conditional phage replication.

    Science.gov (United States)

    Brödel, Andreas K; Jaramillo, Alfonso; Isalan, Mark

    2017-09-01

    Directed evolution is a powerful tool to improve the characteristics of biomolecules. Here we present a protocol for the intracellular evolution of proteins with distinct differences and advantages in comparison with established techniques. These include the ability to select for a particular function from a library of protein variants inside cells, minimizing undesired coevolution and propagation of nonfunctional library members, as well as allowing positive and negative selection logics using basally active promoters. A typical evolution experiment comprises the following stages: (i) preparation of a combinatorial M13 phagemid (PM) library expressing variants of the gene of interest (GOI) and preparation of the Escherichia coli host cells; (ii) multiple rounds of an intracellular selection process toward a desired activity; and (iii) characterization of the evolved target proteins. The system has been developed for the selection of new orthogonal transcription factors (TFs) but is capable of evolving any gene (or gene-circuit function) that can be linked to conditional M13 phage replication. Here we demonstrate our approach using as an example the directed evolution of the bacteriophage λ cI TF against two synthetic bidirectional promoters. The evolved TF variants enable simultaneous activation and repression against their engineered promoters and do not cross-react with the wild-type promoter, thus ensuring orthogonality. This protocol requires no special equipment, allowing synthetic biologists and general users to evolve improved biomolecules within ∼7 weeks.

  11. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    Science.gov (United States)

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
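
    The average-similarity classification examined in the study, together with a threshold for excluding poorly matched isolates, can be sketched as follows (toy fingerprints; real libraries use rep-PCR or ARA profiles):

```python
import numpy as np

# Library-based source assignment by average cosine similarity, with a
# threshold below which an isolate is left unidentified rather than
# force-classified. Fingerprints and sources are illustrative toy data.

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(isolate, library, threshold=0.5):
    """Return (source, score); source is None when below the threshold."""
    scores = {src: float(np.mean([cosine(isolate, ref) for ref in refs]))
              for src, refs in library.items()}
    best = max(scores, key=scores.get)
    return (best if scores[best] >= threshold else None), scores[best]

library = {
    "human": [np.array([1.0, 0.0, 1.0]), np.array([0.9, 0.1, 1.0])],
    "gull": [np.array([0.0, 1.0, 0.0]), np.array([0.1, 1.0, 0.1])],
}
src, score = classify(np.array([1.0, 0.0, 0.9]), library)        # "human"
unsure, _ = classify(np.array([1.0, 1.0, 1.0]), library, 0.99)   # excluded
```

    Raising the threshold trades fewer false positives for more unclassified isolates, which mirrors the paper's finding that thresholds often improve accuracy only at the cost of a substantial effective sample-size reduction.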

  12. Natural product-like virtual libraries: recursive atom-based enumeration.

    Science.gov (United States)

    Yu, Melvin J

    2011-03-28

    A new molecular enumerator is described that allows chemically and architecturally diverse sets of natural product-like and drug-like structures to be generated from a core structure as simple as a single carbon atom or as complex as a polycyclic ring system. Integrated with a rudimentary machine-learning algorithm, the enumerator has the ability to assemble biased virtual libraries enriched in compounds predicted to meet target criteria. The ability to dynamically generate relatively small focused libraries in a recursive manner could reduce the computational time and infrastructure necessary to construct and manage extremely large static libraries. Depending on enumeration conditions, natural product-like structures can be produced with a wide range of heterocyclic and alicyclic ring assemblies. Because natural products represent a proven source of validated structures for identifying and designing new drug candidates, mimicking the structural and topological diversity found in nature with a dynamic set of virtual natural product-like compounds may facilitate the creation of new ideas for novel, biologically relevant lead structures in areas of uncharted chemical space.
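
    The recursive enumeration idea (grow a library by repeatedly attaching building blocks to a core, rather than storing a static list) can be sketched generically; the strings below are plain labels, not chemically validated structures, and the attachment rule is hypothetical:

```python
# Recursive fragment attachment: each call yields the current structure and
# then recurses with every substituent appended, dynamically generating a
# combinatorial virtual library from a simple core.

def enumerate_structures(core, substituents, depth):
    """Yield all products of attaching up to `depth` substituents to `core`."""
    yield core
    if depth == 0:
        return
    for sub in substituents:
        yield from enumerate_structures(core + "-" + sub, substituents, depth - 1)

library = list(enumerate_structures("C", ["OH", "NH2"], 2))
# 1 core + 2 single attachments + 4 double attachments = 7 structures
```

    A predicate (e.g. a property filter or machine-learned score) can be tested inside the recursion to prune branches early, which is how a biased, focused library stays small relative to an exhaustive static one.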

  13. Designing a diverse high-quality library for crystallography-based FBDD screening.

    Science.gov (United States)

    Tounge, Brett A; Parker, Michael H

    2011-01-01

    A well-chosen set of fragments is able to cover a large chemical space using a small number of compounds. The actual size and makeup of the fragment set is dependent on the screening method since each technique has its own practical limits in terms of the number of compounds that can be screened and requirements for compound solubility. In this chapter, an overview of the general requirements for a fragment library is presented for different screening platforms. In the case of the FBDD work at Johnson & Johnson Pharmaceutical Research and Development, L.L.C., our main screening technology is X-ray crystallography. Since every soaked protein crystal needs to be diffracted and a protein structure determined to delineate if a fragment binds, the size of our initial screening library cannot be a rate-limiting factor. For this reason, we have chosen 900 as the appropriate primary fragment library size. To choose the best set, we have developed our own mix of simple property ("Rule of 3") and "bad" substructure filtering. While this gets one a long way in terms of limiting the fragment pool, there are still tens of thousands of compounds to choose from after this initial step. Many of the choices left at this stage are not drug-like, so we have developed an FBDD Score to help select a 900-compound set. The details of this score and the filtering are presented. Copyright © 2011 Elsevier Inc. All rights reserved.
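
    The "Rule of 3" property filter referred to above (fragment-likeness: molecular weight <= 300, <= 3 hydrogen-bond donors, <= 3 acceptors, clogP <= 3) is simple to express; descriptor values are assumed to be precomputed by a cheminformatics toolkit, and the substructure filters and FBDD Score are not reproduced here:

```python
# Minimal "Rule of 3" filter for fragment library candidates.
# Descriptor values below are illustrative, not measured data.

def passes_rule_of_three(mw, hbd, hba, clogp):
    return mw <= 300 and hbd <= 3 and hba <= 3 and clogp <= 3

candidates = [
    {"name": "frag-A", "mw": 182.2, "hbd": 1, "hba": 2, "clogp": 1.3},
    {"name": "frag-B", "mw": 412.5, "hbd": 2, "hba": 5, "clogp": 4.1},
]
kept = [c["name"] for c in candidates
        if passes_rule_of_three(c["mw"], c["hbd"], c["hba"], c["clogp"])]
# kept == ["frag-A"]
```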

  14. Fission yield covariance generation and uncertainty propagation through fission pulse decay heat calculation

    International Nuclear Information System (INIS)

    Fiorito, L.; Diez, C.J.; Cabellos, O.; Stankovskiy, A.; Van den Eynde, G.; Labeau, P.E.

    2014-01-01

    Highlights: • Fission yield data and uncertainty comparison between major nuclear data libraries. • Fission yield covariance generation through Bayesian technique. • Study of the effect of fission yield correlations on decay heat calculations. • Covariance information contribute to reduce fission pulse decay heat uncertainty. - Abstract: Fission product yields are fundamental parameters in burnup/activation calculations and the impact of their uncertainties was widely studied in the past. Evaluations of these uncertainties were released, still without covariance data. Therefore, the nuclear community expressed the need of full fission yield covariance matrices to be able to produce inventory calculation results that take into account the complete uncertainty data. State-of-the-art fission yield data and methodologies for fission yield covariance generation were researched in this work. Covariance matrices were generated and compared to the original data stored in the library. Then, we focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235 U. Calculations were carried out using different libraries and codes (ACAB and ALEPH-2) after introducing the new covariance values. Results were compared with those obtained with the uncertainty data currently provided by the libraries. The uncertainty quantification was performed first with Monte Carlo sampling and then compared with linear perturbation. Indeed, correlations between fission yields strongly affect the uncertainty of decay heat. Eventually, a sensitivity analysis of fission product yields to fission pulse decay heat was performed in order to provide a full set of the most sensitive nuclides for such a calculation
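
    The effect the abstract describes (correlations between fission yields changing the decay-heat uncertainty) can be illustrated with a toy Monte Carlo propagation; the yields, covariance and linear "decay heat" model below are invented for illustration, not evaluated data:

```python
import numpy as np

# Sample correlated fission yields from a multivariate normal built from a
# covariance matrix, push each sample through a toy linear decay-heat model,
# and compare with the variance obtained when correlations are ignored.

rng = np.random.default_rng(0)

y_mean = np.array([0.060, 0.030])            # toy cumulative yields
cov = np.array([[1.0e-6, -4.0e-7],           # toy covariance with a
                [-4.0e-7, 4.0e-7]])          # negative correlation
coeff = np.array([3.0, 7.0])                 # toy lambda_i * E_i factors

samples = rng.multivariate_normal(y_mean, cov, size=20000)
heats = samples @ coeff                      # toy decay heat per sample

var_full = coeff @ cov @ coeff               # analytic, with correlations
var_diag = coeff**2 @ np.diag(cov)           # correlations ignored
# Here the negative correlation shrinks the uncertainty: var_full < var_diag.
```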

  15. Comparison of several chemometric methods of libraries and classifiers for the analysis of expired drugs based on Raman spectra.

    Science.gov (United States)

    Gao, Qun; Liu, Yan; Li, Hao; Chen, Hui; Chai, Yifeng; Lu, Feng

    2014-06-01

    Some expired drugs are difficult to detect by conventional means. If they are repackaged and sold back into the market, they constitute a new public health challenge. For the detection of repackaged expired drugs that are still within specification, paracetamol tablets from one manufacturer were used as a model drug in this study to compare Raman spectra-based library verification and classification methods. Raman spectra of different batches of paracetamol tablets were collected, and a library including standard spectra of unexpired batches was established. The Raman spectrum of each sample was matched against the standard spectrum by cosine and correlation measures, and the average HQI between the suspicious samples and the standard spectrum was calculated. The optimum threshold values were 0.997 and 0.998, respectively, as determined by ROC analysis and four evaluation metrics, for which the accuracy was up to 97%. Three supervised classifiers, PLS-DA, SVM and k-NN, were chosen to establish two-class classification models and then compared. They were used to distinguish expired batches from an unexpired batch and to predict the suspect samples; the average accuracy was 90.12%, 96.80% and 89.37%, respectively. Different pre-processing techniques were tried: first derivative proved optimal for the library methods and max-min normalization optimal for the classifiers. The results indicated that both library and classifier methods can detect expired drugs effectively, and that they should be used complementarily in fast screening. Copyright © 2014 Elsevier B.V. All rights reserved.
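
    The correlation-based library match can be sketched with a hit quality index (HQI) computed as the squared correlation between a sample spectrum and the library reference, compared against a threshold near the values reported above. Spectra here are synthetic, not measured Raman data:

```python
import numpy as np

# HQI as squared Pearson correlation between two spectra: 1.0 means a
# perfect match; the library accepts the sample when HQI >= threshold.

def hqi(sample, reference):
    s = sample - sample.mean()
    r = reference - reference.mean()
    return float((s @ r) ** 2 / ((s @ s) * (r @ r)))

x = np.linspace(0.0, 10.0, 500)
reference = np.exp(-((x - 4.0) ** 2)) + 0.5 * np.exp(-((x - 7.0) ** 2))
sample_ok = 1.02 * reference + 0.001     # same substance, scale/offset shift
sample_bad = np.exp(-((x - 5.5) ** 2))   # a different spectrum

threshold = 0.997
accept = hqi(sample_ok, reference) >= threshold    # True
reject = hqi(sample_bad, reference) >= threshold   # False
```

    Note that pure scale and offset changes leave this HQI at 1.0, which is what makes a correlation-based index robust to intensity drift between measurements.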

  16. Partially linear varying coefficient models stratified by a functional covariate

    KAUST Repository

    Maity, Arnab; Huang, Jianhua Z.

    2012-01-01

    We consider the problem of estimation in semiparametric varying coefficient models where the covariate modifying the varying coefficients is functional and is modeled nonparametrically. We develop a kernel-based estimator of the nonparametric

  17. Generalized Extreme Value model with Cyclic Covariate Structure ...

    Indian Academy of Sciences (India)

    enhances the estimation of the return period; however, its application is ...... Cohn T A and Lins H F 2005 Nature's style: Naturally trendy; GEOPHYSICAL ..... Final non-stationary GEV models with covariate structures shortlisted based on.

  18. Covariant holography of a tachyonic accelerating universe

    Energy Technology Data Exchange (ETDEWEB)

    Rozas-Fernandez, Alberto [Consejo Superior de Investigaciones Cientificas, Instituto de Fisica Fundamental, Madrid (Spain); University of Portsmouth, Institute of Cosmology and Gravitation, Portsmouth (United Kingdom)

    2014-08-15

    We apply the holographic principle to a flat dark energy dominated Friedmann-Robertson-Walker spacetime filled with a tachyon scalar field with constant equation of state w = p/ρ, both for w > -1 and w < -1. By using a geometrical covariant procedure, which allows the construction of holographic hypersurfaces, we have obtained for each case the position of the preferred screen and have then compared these with those obtained by using the holographic dark energy model with the future event horizon as the infrared cutoff. In the phantom scenario, one of the two obtained holographic screens is placed on the big rip hypersurface, both for the covariant holographic formalism and the holographic phantom model. It is also analyzed whether the existence of these preferred screens allows a mathematically consistent formulation of fundamental theories based on the existence of an S-matrix at infinite distances. (orig.)

  19. Development of covariance capabilities in EMPIRE code

    Energy Technology Data Exchange (ETDEWEB)

    Herman,M.; Pigni, M.T.; Oblozinsky, P.; Mughabghab, S.F.; Mattoon, C.M.; Capote, R.; Cho, Young-Sik; Trkov, A.

    2008-06-24

    The nuclear reaction code EMPIRE has been extended to provide evaluation capabilities for neutron cross section covariances in the thermal, resolved resonance, unresolved resonance and fast neutron regions. The Atlas of Neutron Resonances by Mughabghab is used as a primary source of information on uncertainties at low energies. Care is taken to ensure consistency among the resonance parameter uncertainties and those for thermal cross sections. The resulting resonance parameter covariances are formatted in the ENDF-6 File 32. In the fast neutron range our methodology is based on model calculations with the code EMPIRE combined with experimental data through several available approaches. The model-based covariances can be obtained using deterministic (Kalman) or stochastic (Monte Carlo) propagation of model parameter uncertainties. We show that these two procedures yield comparable results. The Kalman filter and/or the generalized least square fitting procedures are employed to incorporate experimental information. We compare the two approaches analyzing results for the major reaction channels on {sup 89}Y. We also discuss a long-standing issue of unreasonably low uncertainties and link it to the rigidity of the model.
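
    The two propagation routes compared in the paper can be illustrated on a toy linear model, where deterministic ("sandwich") propagation of the parameter covariance through a sensitivity matrix and Monte Carlo sampling must agree up to sampling noise; the sensitivities and covariances below are invented:

```python
import numpy as np

# Deterministic propagation C_sigma = S @ C_p @ S.T versus Monte Carlo
# sampling of model parameters, for a toy 2-parameter, 2-observable model.

rng = np.random.default_rng(1)

S = np.array([[2.0, 0.5],        # sensitivities of two cross sections
              [0.3, 1.5]])       # to two model parameters
C_p = np.array([[0.04, 0.01],    # model-parameter covariance
                [0.01, 0.09]])

C_det = S @ C_p @ S.T            # sandwich rule

dp = rng.multivariate_normal(np.zeros(2), C_p, size=50000)
C_mc = np.cov(dp @ S.T, rowvar=False)   # Monte Carlo estimate
# For a linear model the two estimates coincide up to sampling noise,
# which is the regime in which the paper finds the approaches comparable.
```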

  20. Customer Satisfaction with Public Libraries.

    Science.gov (United States)

    D'Elia, George; Rodger, Eleanor Jo

    1996-01-01

    Surveys conducted in 142 urban public libraries examined customer satisfaction, comparisons with other libraries, and factors affecting satisfaction. Overall, customers were satisfied with their libraries but experienced different levels of satisfaction based on convenience, availability of materials and information, and services facilitating…

  1. Promotion: Study of the Library of the department of library and information science and book

    Directory of Open Access Journals (Sweden)

    Andreja Nagode

    2003-01-01

    Full Text Available The contribution presents basic information about academic libraries and their promotion. Librarians should have promotion knowledge since they have to promote and market their libraries. The paper presents the definition of academic libraries, their purpose, objectives and goals. Marketing and promotion in academic libraries are defined. The history of academic libraries and their promotion are described. The contribution presents results and the interpretation of the research, based on the study of users of the Library of the Department of Library and Information Science and Book studies. A new promotion plan for libraries based on the analysis of the academic library environment is introduced.

  2. Covariance matrix estimation for stationary time series

    OpenAIRE

    Xiao, Han; Wu, Wei Biao

    2011-01-01

    We obtain a sharp convergence rate for banded covariance matrix estimates of stationary processes. A precise order of magnitude is derived for the spectral radius of sample covariance matrices. We also consider a thresholded covariance matrix estimator that can better characterize sparsity if the true covariance matrix is sparse. As our main tool, we implement Toeplitz's [Math. Ann. 70 (1911) 351–376] idea and relate eigenvalues of covariance matrices to the spectral densities or Fourier transforms...
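
    The two estimators discussed (banding, which keeps only entries within a fixed distance of the diagonal, and thresholding, which zeroes small entries) can be sketched on a toy sample covariance whose true correlations decay with lag:

```python
import numpy as np

# Banded and thresholded covariance estimators applied to the sample
# covariance of synthetic data with AR(1)-style decaying correlations.

def banded(cov, k):
    """Zero all entries more than k positions from the diagonal."""
    idx = np.arange(cov.shape[0])
    return np.where(np.abs(idx[:, None] - idx[None, :]) <= k, cov, 0.0)

def thresholded(cov, t):
    """Zero off-diagonal entries smaller in magnitude than t."""
    out = np.where(np.abs(cov) >= t, cov, 0.0)
    np.fill_diagonal(out, np.diag(cov))   # always keep the diagonal
    return out

rng = np.random.default_rng(2)
n, T, phi = 6, 5000, 0.5
eps = rng.standard_normal((T, n))
x = np.empty((T, n))
x[:, 0] = eps[:, 0]
for j in range(1, n):                     # corr(x_i, x_j) ~= phi**|i-j|
    x[:, j] = phi * x[:, j - 1] + np.sqrt(1 - phi**2) * eps[:, j]

S = np.cov(x, rowvar=False)
S_band = banded(S, k=1)
S_thr = thresholded(S, t=0.1)
```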

  3. Condition Number Regularized Covariance Estimation.

    Science.gov (United States)

    Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala

    2013-06-01

    Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called "large p small n" setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumption on either the covariance matrix or its inverse is imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required.
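
    The goal of keeping the estimator well-conditioned can be made concrete by clipping the sample eigenvalues into an interval whose endpoints differ by a factor kappa, which caps the condition number; the rule below for placing the interval is a naive stand-in, not the paper's maximum-likelihood solution:

```python
import numpy as np

# Condition-number regularization sketch: eigendecompose the sample
# covariance and clip eigenvalues into [tau, kappa * tau], so the
# reconstructed estimator has condition number at most kappa.

def condition_regularize(S, kappa):
    vals, vecs = np.linalg.eigh(S)
    tau = max(vals.max() / kappa, vals.min())   # naive placement of the floor
    clipped = np.clip(vals, tau, kappa * tau)
    return vecs @ np.diag(clipped) @ vecs.T

rng = np.random.default_rng(3)
X = rng.standard_normal((20, 10))    # few samples relative to dimension
S = np.cov(X, rowvar=False)
S_reg = condition_regularize(S, kappa=10.0)

w = np.linalg.eigvalsh(S_reg)
cond = w.max() / w.min()             # <= 10 by construction
```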

  4. Condition Number Regularized Covariance Estimation*

    Science.gov (United States)

    Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala

    2012-01-01

    Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called “large p small n” setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumption on either the covariance matrix or its inverse is imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required. PMID:23730197

  5. Covariant Gauss law commutator anomaly

    International Nuclear Information System (INIS)

    Dunne, G.V.; Trugenberger, C.A.; Massachusetts Inst. of Tech., Cambridge

    1990-01-01

    Using a (fixed-time) hamiltonian formalism we derive a covariant form for the anomaly in the commutator algebra of Gauss law generators for chiral fermions interacting with a dynamical non-abelian gauge field in 3+1 dimensions. (orig.)

  6. Covariant gauges for constrained systems

    International Nuclear Information System (INIS)

    Gogilidze, S.A.; Khvedelidze, A.M.; Pervushin, V.N.

    1995-01-01

    A method of constructing an extended phase space for singular theories, which permits the consideration of covariant gauges without introducing ghost fields, is proposed. The extension of the phase space is carried out by identifying the initial theory with an equivalent theory with higher derivatives and applying the Ostrogradsky method of Hamiltonian description to it. 7 refs

  7. Uncertainty covariances in robotics applications

    International Nuclear Information System (INIS)

    Smith, D.L.

    1984-01-01

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized

  8. Using machine learning to assess covariate balance in matching studies.

    Science.gov (United States)

    Linden, Ariel; Yarnold, Paul R

    2016-12-01

    In order to assess the effectiveness of matching approaches in observational studies, investigators typically present summary statistics for each observed pre-intervention covariate, with the objective of showing that matching reduces the difference in means (or proportions) between groups to as close to zero as possible. In this paper, we introduce a new approach to distinguish between study groups based on their distributions of the covariates using a machine-learning algorithm called optimal discriminant analysis (ODA). Assessing covariate balance using ODA as compared with the conventional method has several key advantages: the ability to ascertain how individuals self-select based on optimal (maximum-accuracy) cut-points on the covariates; the application to any variable metric and number of groups; its insensitivity to skewed data or outliers; and the use of accuracy measures that can be widely applied to all analyses. Moreover, ODA accepts analytic weights, thereby extending the assessment of covariate balance to any study design where weights are used for covariate adjustment. By comparing the two approaches using empirical data, we are able to demonstrate that using measures of classification accuracy as balance diagnostics produces highly consistent results to those obtained via the conventional approach (in our matched-pairs example, ODA revealed a weak statistically significant relationship not detected by the conventional approach). Thus, investigators should consider ODA as a robust complement, or perhaps alternative, to the conventional approach for assessing covariate balance in matching studies. © 2016 John Wiley & Sons, Ltd.
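
    The core idea (judge balance by how accurately an optimal cut-point on a covariate separates the groups) can be reduced to a one-variable sketch; full ODA handles analytic weights, any variable metric and multiple groups, which this toy version does not:

```python
# Balance diagnostic via maximum-accuracy cut-point: scan every candidate
# threshold on a covariate and record the best classification accuracy
# between treated and control units. Accuracy near 0.5 suggests balance;
# accuracy well above 0.5 suggests residual imbalance on that covariate.

def best_cutpoint_accuracy(x_treated, x_control):
    """Max accuracy over all rules of the form 'x > c' (either direction)."""
    n = len(x_treated) + len(x_control)
    best = 0.5
    for c in sorted(set(x_treated) | set(x_control)):
        correct = sum(v > c for v in x_treated) + sum(v <= c for v in x_control)
        acc = correct / n
        best = max(best, acc, 1.0 - acc)   # allow either labelling direction
    return best

balanced = best_cutpoint_accuracy([1, 2, 3, 4], [1, 2, 3, 4])      # 0.5
imbalanced = best_cutpoint_accuracy([5, 6, 7, 8], [1, 2, 3, 4])    # 1.0
```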

  9. The academic library network

    Directory of Open Access Journals (Sweden)

    Jacek Wojciechowski

    2012-01-01

Full Text Available The efficiency of libraries, academic libraries in particular, necessitates organizational changes facilitating or even imposing co-operation. Any university structure needs an integrated network of libraries, with an appropriate division of work, consolidated as far as possible into medium-size or large libraries. Within such a network, a chance arises to centralize the main library processes in the main library, based on appropriate procedures; being highly specialized, it is more effective and therefore cheaper to operate, and it can co-ordinate all the more important endeavours and tasks. Hierarchically subordinated libraries can thus focus on their routine services, increasingly provided for the whole university, and can adjust to the changing requirements and demands of patrons and to the new tasks resulting from the new model of university operation. Another necessary change seems to be the universal implementation of an overall programme framework covering all services in the university’s library network.

  10. Unified accelerator libraries

    International Nuclear Information System (INIS)

    Malitsky, Nikolay; Talman, Richard

    1997-01-01

A 'Unified Accelerator Libraries' (UAL) environment is described. Its purpose is to facilitate program modularity and inter-program and inter-process communication among heterogeneous programs. The ultimate goal is to facilitate model-based control of accelerators.

  11. About the Library - Betty Petersen Memorial Library

    Science.gov (United States)

branch library of the NOAA Central Library. The library serves the NOAA Science Center in Camp Springs, Maryland. History and Mission: Betty Petersen Memorial Library began as a reading room in the NOAA Science Center; a committee of Science Center staff advises the library on all aspects of the library program.

  12. The Impact of Library Tutorials on the Information Literacy Skills of Occupational Therapy and Physical Therapy Students in an Evidence-Based Practice Course: A Rubric Assessment.

    Science.gov (United States)

    Schweikhard, April J; Hoberecht, Toni; Peterson, Alyssa; Randall, Ken

    2018-01-01

    This study measures how online library instructional tutorials implemented into an evidence-based practice course have impacted the information literacy skills of occupational and physical therapy graduate students. Through a rubric assessment of final course papers, this study compares differences in students' search strategies and cited sources pre- and post-implementation of the tutorials. The population includes 180 randomly selected graduate students from before and after the library tutorials were introduced into the course curriculum. Results indicate a statistically significant increase in components of students' searching skills and ability to find higher levels of evidence after completing the library tutorials.

  13. Covariant electrodynamics in linear media: Optical metric

    Science.gov (United States)

    Thompson, Robert T.

    2018-03-01

While the postulate of covariance of Maxwell's equations for all inertial observers led Einstein to special relativity, it was the further demand of general covariance—form invariance under general coordinate transformations, including between accelerating frames—that led to general relativity. Several lines of inquiry over the past two decades, notably the development of metamaterial-based transformation optics, have spurred a greater interest in the role of geometry and space-time covariance for electrodynamics in ponderable media. I develop a generally covariant, coordinate-free framework for electrodynamics in general dielectric media residing in curved background space-times. In particular, I derive a relation for the spatial medium parameters measured by an arbitrary timelike observer. In terms of those medium parameters I derive an explicit expression for the pseudo-Finslerian optical metric of birefringent media and show how it reduces to a pseudo-Riemannian optical metric for nonbirefringent media. This formulation provides a basis for a unified approach to ray and congruence tracing through media in curved space-times that may smoothly vary among positively refracting, negatively refracting, and vacuum.
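The nonbirefringent reduction mentioned above is closely related to Gordon's classical optical metric. As a standard illustration (conventions assumed: signature (-,+,+,+), medium 4-velocity u^mu with u_mu u^mu = -1; this is the textbook result, not the paper's general pseudo-Finslerian construction):

```latex
% Gordon optical metric for an isotropic, nonbirefringent medium of
% refractive index n moving with 4-velocity u^\mu:
\[
  \bar{g}_{\mu\nu} = g_{\mu\nu} + \left(1 - \frac{1}{n^{2}}\right) u_{\mu} u_{\nu},
  \qquad
  \bar{g}^{\mu\nu} = g^{\mu\nu} + \left(1 - n^{2}\right) u^{\mu} u^{\nu}.
\]
% Light rays in the medium follow null geodesics of \bar{g}_{\mu\nu};
% setting n = 1 recovers the background metric g_{\mu\nu}.
```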

  14. An Interactive Computer-Based Circulation System for Northwestern University: The Library Puts It to Work

    Directory of Open Access Journals (Sweden)

    Velma Veneziano

    1972-06-01

    Full Text Available Northwestern University Library's on-line circulation system has resulted in dramatic changes in practices and procedures in the Circulation Services Section. After a hectic period of implementation, the staff soon began to adjust to the system. Over the past year and a half, they have devised ways to use the system to maximum advantage, so that manual and machine systems now mesh in close harmony. Freed from time-consuming clerical chores, the staff have been challenged to use their released time to best advantage, with the result that the "service" in "Circulation Services" is much closer to being a reality.

  15. Library Use

    DEFF Research Database (Denmark)

    Konzack, Lars

    2012-01-01

A seminar paper about a survey of role-playing games in public libraries, combined with three cases and a presentation of a model.

  16. Grey zones in the diagnosis of adult migraine without aura based on the International Classification of Headache Disorders-III beta: exploring the covariates of possible migraine without aura.

    Science.gov (United States)

    Ozge, Aynur; Aydinlar, Elif; Tasdelen, Bahar

    2015-01-01

    Exploring clinical characteristics and migraine covariates may be useful in the diagnosis of migraine without aura. To evaluate the diagnostic value of the International Classification of Headache Disorders (ICHD)-III beta-based diagnosis of migraine without aura; to explore the covariates of possible migraine without aura using an analysis of grey zones in this area; and, finally, to make suggestions for the final version of the ICHD-III. A total of 1365 patients (mean [± SD] age 38.5±10.4 years, 82.8% female) diagnosed with migraine without aura according to the criteria of the ICHD-III beta were included in the present tertiary care-based retrospective study. Patients meeting all of the criteria of the ICHD-III beta were classified as having full migraine without aura, while those who did not meet one, two or ≥3 of the diagnostic criteria were classified as zones I, II and III, respectively. The diagnostic value of the clinical characteristics and covariates of migraine were determined. Full migraine without aura was evident in 25.7% of the migraineurs. A higher likelihood of zone I classification was shown for an attack lasting 4 h to 72 h (OR 1.560; P=0.002), with pulsating quality (OR 4.096; P<0.001), concomitant nausea⁄vomiting (OR 2.300; P<0.001) and photophobia⁄phonophobia (OR 4.865; P<0.001). The first-rank determinants for full migraine without aura were sleep irregularities (OR 1.596; P=0.005) and periodic vomiting (OR 1.464; P=0.026). However, even if not mentioned in ICHD-III beta, the authors determined that motion sickness, abdominal pain or infantile colic attacks in childhood, associated dizziness and osmophobia have important diagnostic value. In cases that do not fulfill all of the diagnostic criteria although they are largely consistent with the characteristics of migraine in clinical terms, the authors believe that a history of infantile colic; periodic vomiting (but not periodic vomiting syndrome); recurrent abdominal pain; the

  17. Analysis of a simplified normalized covariance measure based on binary weighting functions for predicting the intelligibility of noise-suppressed speech.

    Science.gov (United States)

    Chen, Fei; Loizou, Philipos C

    2010-12-01

    The normalized covariance measure (NCM) has been shown previously to predict reliably the intelligibility of noise-suppressed speech containing non-linear distortions. This study analyzes a simplified NCM measure that requires only a small number of bands (not necessarily contiguous) and uses simple binary (1 or 0) weighting functions. The rationale behind the use of a small number of bands is to account for the fact that the spectral information contained in contiguous or nearby bands is correlated and redundant. The modified NCM measure was evaluated with speech intelligibility scores obtained by normal-hearing listeners in 72 noisy conditions involving noise-suppressed speech corrupted by four different types of maskers (car, babble, train, and street interferences). High correlation (r = 0.8) was obtained with the modified NCM measure even when only one band was used. Further analysis revealed a masker-specific pattern of correlations when only one band was used, and bands with low correlation signified the corresponding envelopes that have been severely distorted by the noise-suppression algorithm and/or the masker. Correlation improved to r = 0.84 when only two disjoint bands (centered at 325 and 1874 Hz) were used. Even further improvements in correlation (r = 0.85) were obtained when three or four lower-frequency (<700 Hz) bands were selected.
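The band-limited, binary-weighted scheme described above can be sketched roughly as follows. This is a loose approximation under stated assumptions: a crude FFT band-pass plus magnitude envelope stands in for the usual filterbank/Hilbert envelope, correlation of envelopes stands in for the normalized covariance, and the band edges, envelope rate, and [-15, 15] dB clipping range are illustrative choices, not the paper's exact implementation:

```python
import numpy as np

def band_envelope(signal, sr, f_lo, f_hi, env_sr=50):
    """Crude band envelope: zero out FFT bins outside the band, rectify, downsample."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / sr)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0
    band = np.fft.irfft(spec, len(signal))
    env = np.abs(band)
    step = sr // env_sr
    return env[: len(env) // step * step].reshape(-1, step).mean(axis=1)

def ncm(clean, degraded, sr, bands, weights):
    """Simplified NCM with binary band weights (1 = band used, 0 = skipped)."""
    snrs = []
    for (f_lo, f_hi), w in zip(bands, weights):
        if not w:
            continue
        x = band_envelope(clean, sr, f_lo, f_hi)
        y = band_envelope(degraded, sr, f_lo, f_hi)
        r = np.corrcoef(x, y)[0, 1]               # normalized covariance of envelopes
        snr = 10 * np.log10(r**2 / (1 - r**2))    # apparent SNR in dB
        snrs.append(np.clip(snr, -15, 15))        # limit to [-15, 15] dB
    return (np.mean(snrs) + 15) / 30              # map mean apparent SNR to 0..1

# Toy usage with synthetic signals; the two bands are centred near the 325 and
# 1874 Hz values quoted in the abstract (band edges here are assumptions).
sr = 8000
rng = np.random.default_rng(1)
clean = rng.standard_normal(sr)
degraded = clean + 0.5 * rng.standard_normal(sr)
score = ncm(clean, degraded, sr, [(200, 450), (1700, 2050)], [1, 1])
print(round(score, 3))
```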

  18. CAN-SDI: experience with multi-source computer based current awareness services in the National Science Library, Ottawa.

    Science.gov (United States)

    Gaffney, I M

    1973-07-01

    CAN/SDI is Canada's national Selective Dissemination of Information Service offering a choice of nine data bases to its scientific and technical community. The system is based on central processing at the National Science Library combined with the utilization of decentralized expertise and resources for profile formulation and user education. Its greatest strength lies in its wide interdisciplinary quality. The major advantage of centralized processing of many data bases is that Canadians need learn only one method of profile formulation to access many files. A breakdown of services used confirms that a single tape service does not cover all the information requirements of most users. On the average each profile accesses approximately 1.5 data bases. Constant subscriber growth and a low cancellation rate indicate that CAN/SDI is and will continue to be an important element in Canada's information system.

  19. Graphical representation of covariant-contravariant modal formulae

    Directory of Open Access Journals (Sweden)

    Miguel Palomino

    2011-08-01

    Full Text Available Covariant-contravariant simulation is a combination of standard (covariant simulation, its contravariant counterpart and bisimulation. We have previously studied its logical characterization by means of the covariant-contravariant modal logic. Moreover, we have investigated the relationships between this model and that of modal transition systems, where two kinds of transitions (the so-called may and must transitions were combined in order to obtain a simple framework to express a notion of refinement over state-transition models. In a classic paper, Boudol and Larsen established a precise connection between the graphical approach, by means of modal transition systems, and the logical approach, based on Hennessy-Milner logic without negation, to system specification. They obtained a (graphical representation theorem proving that a formula can be represented by a term if, and only if, it is consistent and prime. We show in this paper that the formulae from the covariant-contravariant modal logic that admit a "graphical" representation by means of processes, modulo the covariant-contravariant simulation preorder, are also the consistent and prime ones. In order to obtain the desired graphical representation result, we first restrict ourselves to the case of covariant-contravariant systems without bivariant actions. Bivariant actions can be incorporated later by means of an encoding that splits each bivariant action into its covariant and its contravariant parts.

  20. CCLab--a multi-objective genetic algorithm based combinatorial library design software and an application for histone deacetylase inhibitor design.

    Science.gov (United States)

    Fang, Guanghua; Xue, Mengzhu; Su, Mingbo; Hu, Dingyu; Li, Yanlian; Xiong, Bing; Ma, Lanping; Meng, Tao; Chen, Yuelei; Li, Jingya; Li, Jia; Shen, Jingkang

    2012-07-15

The introduction of multi-objective optimization has dramatically changed virtual combinatorial library design, which can now consider many objectives simultaneously, such as synthesis cost and drug-likeness, and thus may increase the positive rate of biologically active compounds. Here we describe a software package called CCLab (Combinatorial Chemistry Laboratory) for combinatorial library design based on the multi-objective genetic algorithm. Tests of the convergence ability and of the rate of recovering building blocks from a reference library were conducted to assess the software in silico, and it was then applied to a real case of designing a 5×6 HDAC inhibitor library. Sixteen compounds in the resulting library were synthesized, and histone deacetylase (HDAC) enzymatic assays proved that 14 compounds showed inhibitory ratios of more than 50% against the 3 tested HDAC enzymes at a concentration of 20 μg/mL, with IC50 values of 3 compounds comparable to SAHA. These results demonstrate that the CCLab software can enhance the hit rate of a designed library and would be beneficial for medicinal chemists designing focused libraries in drug development (the software can be downloaded at: http://202.127.30.184:8080/drugdesign.html). Copyright © 2012 Elsevier Ltd. All rights reserved.
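The core selection step of any multi-objective GA of this kind is Pareto dominance over the objective vectors. A minimal sketch (the candidate scores and the two objectives, synthesis cost and negated drug-likeness, are hypothetical; CCLab's actual encoding and operators are not reproduced here):

```python
# Objectives are to be MINIMIZED, e.g. (predicted synthesis cost, -drug_likeness).

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scored):
    """Keep candidates not dominated by any other candidate."""
    return [a for a in scored if not any(dominates(b, a) for b in scored if b != a)]

# Hypothetical (cost, -drug_likeness) scores for five candidate sub-libraries:
candidates = [(3.0, -0.8), (2.0, -0.6), (4.0, -0.9), (2.5, -0.8), (5.0, -0.5)]
print(pareto_front(candidates))
```

The GA iterates variation (crossover/mutation of building-block choices) and selection pressure toward this non-dominated front, rather than collapsing the objectives into a single weighted score.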

  1. Structure-based virtual screening of molecular libraries as cdk2 inhibitors

    International Nuclear Information System (INIS)

    Riaz, U.; Khaleeq, M.

    2011-01-01

    CDK2 inhibitor is an important target in multiple processes associated with tumor growth and development, including proliferation, neovascularization, and metastasis. In this study, hit identification was performed by virtual screening of commercial and in-house compound libraries. Docking studies for the hits were performed, and scoring functions were used to evaluate the docking results and to rank ligand-binding affinities. Subsequently, hit optimization for potent and selective candidate CDK2 inhibitors was performed through focused library design and docking analyses. Consequently, we report that a novel compound with an IC50 value of 89 nM, representing 2-Amino-4,6-di-(4',6'-dibromophenyl)pyrimidine 1, is highly selective for CDK2 inhibitors. The docking structure of compound 1 with CDK2 inhibitor disclosed that the NH moiety and pyrimidine ring appeared to fit tightly into the hydrophobic pocket of CDK2 inhibitor. Additionally, the pyrimidine NH forms a hydrogen bond with the carboxyl group of Asp348. These results confirm the successful application of virtual screening studies in the lead discovery process, and suggest that our novel compound can be an effective CDK2 inhibitor candidate for further lead optimization. (author)

  2. Model for Sucker-Rod Pumping Unit Operating Modes Analysis Based on SimMechanics Library

    Science.gov (United States)

    Zyuzev, A. M.; Bubnov, M. V.

    2018-01-01

The article provides basic information about the development of a sucker-rod pumping unit (SRPU) model by means of the SimMechanics library in the MATLAB Simulink environment. The model is designed for developing optimal pump-productivity management algorithms; sensorless diagnostics of the plunger pump and pumpjack; acquisition of the dynamometer card and determination of the dynamic fluid level in the well; normalization of faulty unit operation before troubleshooting is performed by staff; and determination of the equilibrium ratio from energy indicators, with output of manual balancing recommendations to achieve optimal power-consumption efficiency. Particular attention is given to the application of various blocks from the SimMechanics library to capture the principal characteristics of the pumpjack construction and to obtain an adequate model. The article explains in depth the features of the developed tools for collecting and analysing simulated mechanism data. Conclusions are drawn about the practical applicability of the SRPU modelling results and about areas for further investigation.

  3. Establish an automated flow injection ESI-MS method for the screening of fragment based libraries: Application to Hsp90.

    Science.gov (United States)

    Riccardi Sirtori, Federico; Caronni, Dannica; Colombo, Maristella; Dalvit, Claudio; Paolucci, Mauro; Regazzoni, Luca; Visco, Carlo; Fogliatto, Gianpaolo

    2015-08-30

ESI-MS is a well-established technique for the study of biopolymers (nucleic acids, proteins) and their non-covalent adducts, owing to its capacity to detect ligand-target complexes in the gas phase and thus to allow inference of ligand-target binding in solution. In this article we used this approach to investigate the interaction of ligands with Heat Shock Protein 90 (Hsp90). This enzyme is a molecular chaperone involved in the folding and maturation of several proteins, and it has been subjected in recent years to intensive drug discovery efforts due to its key role in cancer. In particular, reference compounds with a broad range of dissociation constants, from 40 pM to 100 μM, were tested to assess the reliability of ESI-MS for the study of protein-ligand complexes. A good agreement was found between the values measured with a fluorescence polarization displacement assay and those determined by mass spectrometry. After this validation step we describe the setup of a medium-throughput screening method, based on ESI-MS, suitable for exploring interactions of therapeutically relevant biopolymers with chemical libraries. Our approach is based on an automated flow injection ESI-MS method (AFI-MS) and has been applied to screen the Nerviano Medical Sciences proprietary fragment library of about 2000 fragments against Hsp90. In order to discard false positive hits and to discriminate those interacting with the N-terminal ATP binding site, competition experiments were performed using a reference inhibitor. Gratifyingly, this group of hits matches the ligands previously identified by NMR FAXS techniques and confirmed by X-ray co-crystallization experiments. These results support the use of AFI-MS for the screening of medium-size libraries, including libraries of small molecules with low affinity typically used in fragment-based drug discovery. AFI-MS is a valid alternative to other techniques, with the additional opportunity to identify compounds interacting with

  4. An Emerging Theory for Evidence Based Information Literacy Instruction in School Libraries, Part 2: Building a Culture of Inquiry

    Directory of Open Access Journals (Sweden)

    Carol A. Gordon

    2009-09-01

    Full Text Available Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user‐centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information‐to‐knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction.Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory‐building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice is defined as the synthesis of authentic learning, or performance‐based assessment practices that continuously generate evidence throughout the inquiry unit of instruction and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin’s work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action

  5. Status of the JEFF data library

    International Nuclear Information System (INIS)

    Nordborg, C.

    2006-01-01

    A new improved version of the OECD Nuclear Energy Agency (NEA) co-ordinated Joint Evaluated Fission and Fusion (JEFF) data library, JEFF-3.1, was released in May 2005. It comprises a general purpose library and the following five special purpose libraries: activation; thermal scattering law; radioactive decay; fission yield; and proton library. The objective of the previous version of the library (JEFF-2.2) was to achieve improved performance for existing reactors and fuel cycles. In addition to this objective, the JEFF-3.1 library aims to provide users with data for a wider range of applications. These include innovative reactor concepts, transmutation of radioactive waste, fusion, and various other energy and non-energy related industrial applications. Initial benchmark testing has confirmed the expected very good performance of the JEFF-3.1 library. Additional benchmarking of the libraries is underway, both for the general purpose and for the special purpose libraries. A new three-year mandate to continue developing the JEFF library was recently granted by the NEA. For the next version of the library, JEFF-3.2, it is foreseen to put more effort into fission product and minor actinide evaluations, as well as the inclusion of more covariance data. (authors)

  6. Lorentz covariant canonical symplectic algorithms for dynamics of charged particles

    Science.gov (United States)

    Wang, Yulei; Liu, Jian; Qin, Hong

    2016-12-01

    In this paper, the Lorentz covariance of algorithms is introduced. Under Lorentz transformation, both the form and performance of a Lorentz covariant algorithm are invariant. To acquire the advantages of symplectic algorithms and Lorentz covariance, a general procedure for constructing Lorentz covariant canonical symplectic algorithms (LCCSAs) is provided, based on which an explicit LCCSA for dynamics of relativistic charged particles is built. LCCSA possesses Lorentz invariance as well as long-term numerical accuracy and stability, due to the preservation of a discrete symplectic structure and the Lorentz symmetry of the system. For situations with time-dependent electromagnetic fields, which are difficult to handle in traditional construction procedures of symplectic algorithms, LCCSA provides a perfect explicit canonical symplectic solution by implementing the discretization in 4-spacetime. We also show that LCCSA has built-in energy-based adaptive time steps, which can optimize the computation performance when the Lorentz factor varies.

  7. Continuous Covariate Imbalance and Conditional Power for Clinical Trial Interim Analyses

    Science.gov (United States)

    Ciolino, Jody D.; Martin, Renee' H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.

    2014-01-01

    Oftentimes valid statistical analyses for clinical trials involve adjustment for known influential covariates, regardless of imbalance observed in these covariates at baseline across treatment groups. Thus, it must be the case that valid interim analyses also properly adjust for these covariates. There are situations, however, in which covariate adjustment is not possible, not planned, or simply carries less merit as it makes inferences less generalizable and less intuitive. In this case, covariate imbalance between treatment groups can have a substantial effect on both interim and final primary outcome analyses. This paper illustrates the effect of influential continuous baseline covariate imbalance on unadjusted conditional power (CP), and thus, on trial decisions based on futility stopping bounds. The robustness of the relationship is illustrated for normal, skewed, and bimodal continuous baseline covariates that are related to a normally distributed primary outcome. Results suggest that unadjusted CP calculations in the presence of influential covariate imbalance require careful interpretation and evaluation. PMID:24607294
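The unadjusted conditional power the abstract refers to is, in its standard B-value ("current trend") form, a short calculation. A sketch under stated assumptions (one-sided alpha, normal outcome, drift estimated from the interim z-statistic; this is the generic textbook formula, not the paper's specific simulation setup, and covariate imbalance enters only through its effect on the unadjusted interim z):

```python
from statistics import NormalDist

def conditional_power(z_interim, info_frac, alpha=0.025):
    """Unadjusted conditional power under the current-trend assumption.

    B-value formulation: B(t) = Z_t * sqrt(t), drift estimated as
    theta_hat = B(t) / t, giving
      CP = 1 - Phi( (z_{1-alpha} - B(t) - theta_hat*(1-t)) / sqrt(1-t) ).
    """
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha)
    t = info_frac
    b = z_interim * t ** 0.5
    theta_hat = b / t
    return 1 - nd.cdf((z_crit - b - theta_hat * (1 - t)) / (1 - t) ** 0.5)

# Toy example (values hypothetical): halfway through the trial with a modest
# interim z-statistic, CP is low, i.e. near a futility boundary.
print(round(conditional_power(z_interim=0.5, info_frac=0.5), 3))
```

If an imbalanced influential covariate inflates or deflates the unadjusted z_interim, the CP computed this way moves with it, which is exactly why the paper urges careful interpretation of futility decisions based on it.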

  8. Group covariance and metrical theory

    International Nuclear Information System (INIS)

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidian group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  9. Optimal covariance selection for estimation using graphical models

    OpenAIRE

    Vichik, Sergey; Oshman, Yaakov

    2011-01-01

    We consider a problem encountered when trying to estimate a Gaussian random field using a distributed estimation approach based on Gaussian graphical models. Because of constraints imposed by estimation tools used in Gaussian graphical models, the a priori covariance of the random field is constrained to embed conditional independence constraints among a significant number of variables. The problem is, then: given the (unconstrained) a priori covariance of the random field, and the conditiona...
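The conditional-independence constraints mentioned above correspond to zeros in the precision (inverse covariance) matrix, which is what the graphical-model estimation tools require. A toy illustration (the 3-node chain and its numbers are hypothetical):

```python
import numpy as np

# Gaussian graphical model on a 3-node chain x1 - x2 - x3: no edge between
# x1 and x3 means they are conditionally independent given x2, i.e. the
# precision matrix has a zero at (0, 2).
prec = np.array([[ 2.0, -1.0,  0.0],   # zero at (0, 2): no x1-x3 edge
                 [-1.0,  2.5, -1.0],
                 [ 0.0, -1.0,  2.0]])

cov = np.linalg.inv(prec)

# The covariance itself is dense: x1 and x3 are marginally correlated even
# though they are conditionally independent given x2.
print("cov[0,2]  =", round(cov[0, 2], 4))
print("prec[0,2] =", prec[0, 2])
```

This is the tension the paper addresses: an arbitrary a priori covariance generally has a dense precision matrix, so it must be approximated by one whose precision carries the required zero pattern.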

  10. Abnormalities in structural covariance of cortical gyrification in schizophrenia

    OpenAIRE

    Palaniyappan, Lena; Park, Bert; Balain, Vijender; Dangi, Raj; Liddle, Peter

    2014-01-01

    The highly convoluted shape of the adult human brain results from several well-coordinated maturational events that start from embryonic development and extend through the adult life span. Disturbances in these maturational events can result in various neurological and psychiatric disorders, resulting in abnormal patterns of morphological relationship among cortical structures (structural covariance). Structural covariance can be studied using graph theory-based approaches that evaluate topol...

  11. Phenotypic covariance at species' borders.

    Science.gov (United States)

    Caley, M Julian; Cripps, Edward; Game, Edward T

    2013-05-28

Understanding the evolution of species limits is important in ecology, evolution, and conservation biology. Despite its likely importance in the evolution of these limits, little is known about phenotypic covariance in geographically marginal populations, and the degree to which it constrains, or facilitates, responses to selection. We investigated phenotypic covariance in morphological traits at species' borders by comparing phenotypic covariance matrices (P), including the degree of shared structure, the distribution of strengths of pair-wise correlations between traits, the degree of morphological integration of traits, and the ranks of matrices, between central and marginal populations of three species-pairs of coral reef fishes. Greater structural differences in P were observed between populations close to range margins and conspecific populations toward range centres, than between pairs of conspecific populations that were both more centrally located within their ranges. Approximately 80% of all pair-wise trait correlations within populations were greater in the north, but these differences were unrelated to the position of the sampled population with respect to the geographic range of the species. Neither the degree of morphological integration, nor ranks of P, indicated greater evolutionary constraint at range edges. Characteristics of P observed here provide no support for constraint contributing to the formation of these species' borders, but may instead reflect structural change in P caused by selection or drift, and their potential to evolve in the future.

  12. Identification of novel malarial cysteine protease inhibitors using structure-based virtual screening of a focused cysteine protease inhibitor library.

    Science.gov (United States)

    Shah, Falgun; Mukherjee, Prasenjit; Gut, Jiri; Legac, Jennifer; Rosenthal, Philip J; Tekwani, Babu L; Avery, Mitchell A

    2011-04-25

Malaria, in particular that caused by Plasmodium falciparum, is prevalent across the tropics, and its medicinal control is limited by widespread drug resistance. Cysteine proteases of P. falciparum, falcipain-2 (FP-2) and falcipain-3 (FP-3), are major hemoglobinases, validated as potential antimalarial drug targets. Structure-based virtual screening of a focused cysteine protease inhibitor library built with soft rather than hard electrophiles was performed against an X-ray crystal structure of FP-2 using the Glide docking program. An enrichment study was performed to select a suitable scoring function and to retrieve potential candidates against FP-2 from a large chemical database. Biological evaluation of 50 selected compounds identified 21 diverse nonpeptidic inhibitors of FP-2 with a hit rate of 42%. Atomic Fukui indices were used to predict the most electrophilic center and its electrophilicity in the identified hits. Comparison of the predicted electrophilicity of electrophiles in identified hits with those in known irreversible inhibitors suggested the soft nature of the electrophiles in the selected target compounds. The present study highlights the importance of focused libraries and enrichment studies in structure-based virtual screening. In addition, a few compounds were screened against homologous human cysteine proteases for selectivity analysis. Further evaluation of structure-activity relationships around these nonpeptidic scaffolds could help in the development of selective leads for antimalarial chemotherapy.
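Enrichment studies of the kind mentioned above are usually summarized by the enrichment factor: how over-represented known actives are near the top of a docking-ranked list relative to the database rate. A minimal sketch (the ranked list below is hypothetical):

```python
def enrichment_factor(ranked_is_active, fraction):
    """EF at a given fraction of a ranked screening list:
    (actives found / compounds selected) / (actives total / compounds total)."""
    n = len(ranked_is_active)
    k = max(1, int(n * fraction))
    found = sum(ranked_is_active[:k])
    total = sum(ranked_is_active)
    return (found / k) / (total / n)

# Hypothetical docking-ranked list (True = known FP-2 inhibitor):
ranked = [True, True, False, True, False, False, False, False, False, False]
print(enrichment_factor(ranked, 0.2))
```

An EF of 1.0 means the scoring function does no better than random; the scoring-function comparison described in the abstract amounts to picking the function with the best early enrichment.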

  13. Cloning of low dose radiation induced gene RIG1 by RACE based on non-cloned cDNA library

    International Nuclear Information System (INIS)

    Luo Ying; Sui Jianli; Tie Yi; Zhang Yuanping; Zhou Pingkun; Sun Zhixian

    2001-01-01

    Objective: To obtain full-length cDNA of radiation induced new gene RIG1 based on its EST fragment. Methods: Based on non-cloned cDNA library, enhanced nested RACE PCR and biotin-avidin labelled probe for magnetic bead purification was used to obtain full-length cDNA of RIG1. Results: About 1 kb of 3' end of RIG1 gene was successfully cloned by this set of methods and cloning of RIG1 5' end is proceeding well. Conclusion: The result is consistent with the design of experiment. This set of protocol is useful for cloning of full-length gene based on EST fragment

  14. Understanding the covariation of tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms: A population-based adult twin study.

    Science.gov (United States)

    Pinto, Rebecca; Monzani, Benedetta; Leckman, James F; Rück, Christian; Serlachius, Eva; Lichtenstein, Paul; Mataix-Cols, David

    2016-10-01

    Chronic tic disorders (TD), attention-deficit/hyperactivity-disorder (ADHD), and obsessive-compulsive disorder (OCD) frequently co-occur in clinical and epidemiological samples. Family studies have found evidence of shared familial transmission between TD and OCD, whereas the familial association between these disorders and ADHD is less clear. This study aimed to investigate to what extent liability of tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms is caused by shared or distinct genetic or environmental influences, in a large population-representative sample of Swedish adult twins (n = 21,911). Tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms showed modest, but significant covariation. Model fitting suggested a latent liability factor underlying the three phenotypes. This common factor was relatively heritable, and explained significantly less of the variance of attention-deficit/hyperactivity symptom liability. The majority of genetic variance was specific rather than shared. The greatest proportion of total variance in liability of tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms was attributed to specific non-shared environmental influences. Our findings suggest that the co-occurrence of tics and obsessive-compulsive symptoms, and to a lesser extent attention-deficit/hyperactivity symptoms, can be partly explained by shared etiological influences. However, these phenotypes do not appear to be alternative expressions of the same underlying genetic liability. Further research examining sub-dimensions of these phenotypes may serve to further clarify the association between these disorders and identify more genetically homogenous symptom subtypes. © 2016 Wiley Periodicals, Inc.

  15. Target based screening of small molecule library identifies pregnenolone, a Nrf2 agonist, as a potential radioprotector in zebrafish

    International Nuclear Information System (INIS)

    Joshi, Jayadev; Ghosh, Subhajit; Dimri, Manali; Shrivastava, Nitisha; Indracanti, Prem Kumar; Ray, Jharna

    2014-01-01

    Reactive oxygen species, cellular oxidative stress, tissue inflammation and cell death are the downstream consequences of radiation exposure, which ultimately could lead to organism death. The present study aims at identifying potential targets and screening a small molecule compound library to identify novel and effective radioprotectors. In-silico analysis of known radioprotectors revealed three main functions: antioxidant, anti-inflammation and antiapoptosis. In this study, a collection of small molecules (Johns Hopkins Clinical Compound Library, JHCCL) was screened for these different functions using the biological activity database of NCBI with the help of an in-house developed python script. Further filtering of the JHCCL was done by searching for molecules known to be active against a target of radiobiological significance, Nrf-2. Close observation of potential hits identified pregnenolone as an Nrf-2 agonist, which was further evaluated for radioprotection in the zebrafish model. Pregnenolone rendered significant protection (at 40 μM; added 1 hour prior to 20 Gy gamma radiation) in terms of damage manifestations (pericardial edema, microcephaly, micropthalmia, yolk sac resorption, curvature of spine, blood flow, body length, heart-beat, blood clot, roughness of skin) and survival advantage (60%) when compared to the irradiated control. Further, an assessment of the ability of pregnenolone to act as a neuroprotectant was also carried out using in-house developed software for assessing neuromotor functions. In comparison to the radiation-alone group, pregnenolone was found to possess significant neuroactive functions and diminished radiation-induced neuronal impairment. Overall, these results suggest that pregnenolone is an effective radioprotector which warrants further investigation for validation of its radioprotective action in higher vertebrates. Apart from that, the utility of the approach to screen the bioactivity database of various chemical compound libraries for possible

  16. Implementation of Hierarchical Authorization For A Web-Based Digital Library

    Directory of Open Access Journals (Sweden)

    Andreas Geyer-Schulz

    2007-04-01

    Access control mechanisms are needed in almost every system nowadays to control what kind of access each user has to which resources and when. On the one hand, access control systems need to be flexible to allow the definition of the access rules that are actually needed. But they must also be easy to administrate, to prevent rules from being in place without the administrator realizing it. This is particularly difficult for systems such as a digital library that require fine-grained access rules specifying access control at the document level. We present the implementation and architecture of a system that allows the definition of access rights down to the single document and user level. We use hierarchies on users and roles, hierarchies on access rights, and hierarchies on documents and document groups. These hierarchies allow a maximum of flexibility and still keep the system easy enough to administrate. Our access control system supports positive as well as negative permissions.

  17. Libraries Today, Libraries Tomorrow: Contemporary Library Practices and the Role of Library Space in the L

    Directory of Open Access Journals (Sweden)

    Ana Vogrinčič Čepič

    2013-09-01

    Purpose: The article uses sociological concepts in order to rethink the changes in library practices. Contemporary trends are discussed with regard to the changing nature of working habits, referring mostly to the new technology, and the (e)mergence of the third-space phenomenon. The author does not regard libraries only as concrete public service institutions, but rather as complex cultural forms, taking into consideration the wider social context with a stress on users’ practices in relation to space. Methodology/approach: The article is based on the (self-)observation of public library use, and on the (discourse) analysis of internal library documents (i.e., annual reports and plans) and secondary sociological literature. As such, the cultural form approach represents a classic method of the sociology of culture. Results: The study of relevant material in combination with direct personal experiences reveals socio-structural causes for the change of users’ needs and habits, and points at the difficulty of spatial redefinition of libraries as well as at the power of the discourse. Research limitations: The article is limited to an observation of users’ practices in some of the public libraries in Ljubljana and examines only a small number of annual reports; the discoveries are then further debated from the sociological perspective. Originality/practical implications: The article offers sociological insight into the current issues of library science and tries to suggest a wider explanation that could answer some of the challenges of contemporary librarianship.

  18. The LAW library

    International Nuclear Information System (INIS)

    Green, N.M.; Parks, C.V.; Arwood, J.W.

    1989-01-01

    The 238 group LAW library is a new multigroup library based on ENDF/B-V data. It contains data for 302 materials and will be distributed by the Radiation Shielding Information Center, located at Oak Ridge National Laboratory. It was generated for use in neutronics calculations required in radioactive waste analyses, though it has equal utility in any study requiring multigroup neutron cross sections

  19. Library Research Support in Queensland: A Survey

    Science.gov (United States)

    Richardson, Joanna; Nolan-Brown, Therese; Loria, Pat; Bradbury, Stephanie

    2012-01-01

    University libraries worldwide are reconceptualising the ways in which they support the research agenda in their respective institutions. This paper is based on a survey completed by member libraries of the Queensland University Libraries Office of Cooperation (QULOC), the findings of which may be informative for other university libraries. After…

  20. Development of the tool for generating ORIGEN2 library based on JENDL-3.2 for FBR

    International Nuclear Information System (INIS)

    Ohkawachi, Yasushi; Fukushima, Manabu

    1999-05-01

    ORIGEN2 is one of the most widely used burnup analysis codes in the world. This code has one-group cross section libraries compiled for various types of reactors. However, these libraries have some problems. One is that they were developed from old nuclear data libraries (ENDF/B-IV,V); the other is that the core and fuel designs from which these libraries were generated do not match the current analysis. In order to solve these problems, an analysis tool was developed for generating ORIGEN2 libraries from JENDL-3.2, considering a multi-energy neutron spectrum. Eight new libraries were prepared using this tool for the analysis of sodium-cooled FBRs. These new libraries cover eight kinds of cores in total. Seven of them are made by changing core size (small core - large core), fuel type (oxide, nitride, metal) and Pu vector as parameters. The eighth one is a Pu burner core. Burnup calculations using both the new and original libraries show large differences in buildup or depletion numbers of nuclides among the libraries. It is estimated that the analysis result is greatly influenced by the neutron spectrum used in the collapse of the cross sections. By using this tool or the new libraries, the evaluation accuracy of buildup or depletion numbers of nuclides in transmutation research on the FBR fuel cycle should improve. (author)
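The spectrum-weighted collapse at the heart of such one-group library generation can be sketched in a few lines. This is an illustrative sketch with made-up three-group numbers, not the tool described in the abstract; it shows why the choice of neutron spectrum changes the collapsed value so strongly.

```python
import numpy as np

def collapse_one_group(sigma_g, phi_g):
    """Spectrum-weighted collapse of multigroup cross sections to a
    single one-group value, as used when generating one-group burnup
    libraries from a multigroup set."""
    sigma_g = np.asarray(sigma_g, dtype=float)
    phi_g = np.asarray(phi_g, dtype=float)
    return float(np.sum(sigma_g * phi_g) / np.sum(phi_g))

# Illustrative (invented) 3-group data: a fast-reactor-like spectrum
# weights the high-energy groups far more heavily than a thermal one.
sigma = [1.2, 0.8, 30.0]          # group cross sections (barns)
phi_fast = [0.6, 0.35, 0.05]      # fast-spectrum group fluxes
phi_thermal = [0.1, 0.2, 0.7]     # thermal-spectrum group fluxes

print(collapse_one_group(sigma, phi_fast))     # ≈ 2.5 b
print(collapse_one_group(sigma, phi_thermal))  # ≈ 21.28 b
```

The same nuclide thus gets very different one-group constants depending on the core spectrum, which is exactly the effect the abstract reports for the buildup/depletion differences.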

  1. Nigerian Libraries

    African Journals Online (AJOL)

    Bridging the digital divide: the potential role of the National Library of Nigeria. Juliana Obiageri Akidi, Joy Chituru Onyenachi, 11-19 ...

  2. Library Locations

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Carnegie Library of Pittsburgh locations including address, coordinates, phone number, square footage, and standard operating hours.

  3. academic libraries

    African Journals Online (AJOL)

    Information Impact: Journal of Information and Knowledge Management

    Key words: academic libraries, open access, research, researchers, technology ... European commission (2012) reports that affordable and easy access to the results ...

  4. Non-evaluation applications for covariance matrices

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1982-05-01

    The possibility of applying covariance matrix techniques to a variety of common research problems other than formal data evaluation is demonstrated by means of several examples. These examples deal with such matters as fitting spectral data, deriving uncertainty estimates for results calculated from experimental data, obtaining the best values for plurally-measured quantities, and methods for analysis of cross section errors based on properties of the experiment. The examples deal with realistic situations encountered in the laboratory, and they are treated in sufficient detail to enable a careful reader to extrapolate the methods to related problems.
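One of the applications listed, obtaining the best value for a plurally-measured quantity, reduces to a generalized least-squares average once the measurement covariance matrix is known. A minimal sketch with invented numbers (not taken from the report):

```python
import numpy as np

def covariance_weighted_mean(y, V):
    """Best estimate of a single quantity measured several times, given
    the full covariance matrix V of the measurements (generalized least
    squares with a design matrix of ones)."""
    y = np.asarray(y, dtype=float)
    V = np.asarray(V, dtype=float)
    ones = np.ones_like(y)
    Vinv_ones = np.linalg.solve(V, ones)   # V^{-1} 1 without forming V^{-1}
    var = 1.0 / (ones @ Vinv_ones)         # variance of the estimate
    mean = var * (Vinv_ones @ y)
    return mean, var

# Two correlated measurements of the same quantity (made-up numbers):
y = [10.0, 12.0]
V = [[1.0, 0.5],
     [0.5, 4.0]]
mean, var = covariance_weighted_mean(y, V)
print(round(mean, 4), round(var, 4))  # 10.25 0.9375
```

With a diagonal V this reduces to the familiar inverse-variance weighted mean; the off-diagonal terms shift both the estimate and its variance.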

  5. Academic Libraries and Learning Support in Collaboration. Library Based Guidance for Peer Assisted Learning Leaders at Bournemouth University: Theory and Practice.

    OpenAIRE

    Parton, Steve; Fleming, Hugh

    2008-01-01

    This article begins with an overview of the University’s pioneering Peer Assisted Learning Scheme (PAL) and describes how in 2005/6, the Library became involved, collaborating with the PAL Coordinator to develop materials for use by PAL Leaders. PAL is intended to foster cross-year support between students on the same course. It encourages students to support each other and learn co-operatively under the guidance of trained students from the year above - called PAL Leaders. Two documents were...

  6. Grassland gross carbon dioxide uptake based on an improved model tree ensemble approach considering human interventions: global estimation and covariation with climate.

    Science.gov (United States)

    Liang, Wei; Lü, Yihe; Zhang, Weibin; Li, Shuai; Jin, Zhao; Ciais, Philippe; Fu, Bojie; Wang, Shuai; Yan, Jianwu; Li, Junyi; Su, Huimin

    2017-07-01

    Grassland ecosystems play a crucial role in the global carbon cycle and provide vital ecosystem services for many species. However, these low-productivity and water-limited ecosystems are sensitive and vulnerable to climate perturbations and human intervention, the latter of which is often not considered owing to a lack of spatial information on grassland management. Here, applying a model tree ensemble (MTE-GRASS) trained on local eddy covariance data and using gridded climate and management-intensity fields (grazing and cutting) as predictors, we provide a first estimate of global grassland gross primary production (GPP). GPP from our study compares well (modeling efficiency NSE = 0.85 spatially; NSE between 0.69 and 0.94 interannually) with that from flux measurements. Global grassland GPP averaged 11 ± 0.31 Pg C yr⁻¹ and exhibited a significantly increasing trend at both annual and seasonal scales, with an annual increase of 0.023 Pg C (0.2%) from 1982 to 2011. Meanwhile, we found that at both annual and seasonal scales, the trend (except for northern summer) and interannual variability of GPP are primarily driven by arid/semiarid ecosystems, the latter owing to the larger variation in precipitation. Grasslands in arid/semiarid regions have a stronger (33 g C m⁻² yr⁻¹ per 100 mm) and faster (0- to 1-month time lag) response to precipitation than those in other regions. Although spatial gradients (71%) and interannual changes (51%) in global GPP were mainly driven by precipitation, mostly in arid/semiarid climate zones, temperature and radiation together accounted for half of GPP variability in the high-latitude and cold regions. Our findings and the results of other studies suggest the overwhelming importance of arid/semiarid regions as a control on the grassland carbon cycle. Similarly, under projected future climate change, grassland ecosystems in these regions will

  7. Modeling Covariance Breakdowns in Multivariate GARCH

    OpenAIRE

    Jin, Xin; Maheu, John M

    2014-01-01

    This paper proposes a flexible way of modeling dynamic heterogeneous covariance breakdowns in multivariate GARCH (MGARCH) models. During periods of normal market activity, volatility dynamics are governed by an MGARCH specification. A covariance breakdown is any significant temporary deviation of the conditional covariance matrix from its implied MGARCH dynamics. This is captured through a flexible stochastic component that allows for changes in the conditional variances, covariances and impl...

  8. Real-time probabilistic covariance tracking with efficient model update.

    Science.gov (United States)

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile for a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but with a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL). To address the appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in a real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as the temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
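The two ingredients named in the abstract, a region covariance descriptor and a Riemannian (affine-invariant) metric between covariance matrices, can be illustrated compactly. This is a generic sketch with synthetic features, not the authors' ICTL tracker:

```python
import numpy as np

def region_covariance(features):
    """Covariance descriptor of a region: `features` is an (N, d) array
    of per-pixel feature vectors (e.g. x, y, intensity, gradients)."""
    return np.cov(np.asarray(features, dtype=float), rowvar=False)

def riemannian_distance(C1, C2):
    """Affine-invariant distance between two SPD covariance descriptors:
    sqrt(sum_i log^2 lambda_i), with lambda_i the generalized
    eigenvalues of the pair (C1, C2)."""
    lam = np.linalg.eigvals(np.linalg.solve(C1, C2))
    return float(np.sqrt(np.sum(np.log(lam.real) ** 2)))

rng = np.random.default_rng(0)
patch_a = rng.normal(size=(500, 3))   # toy per-pixel feature vectors
patch_b = patch_a * 2.0               # same patch with rescaled features
Ca, Cb = region_covariance(patch_a), region_covariance(patch_b)
# Scaling every feature by 2 scales the covariance by 4, so each
# generalized eigenvalue is 4 and the distance is sqrt(3)*log(4).
print(riemannian_distance(Ca, Cb))  # ≈ 2.401
```

The metric's affine invariance is what makes covariance descriptors robust to illumination-like feature rescalings, as the example shows.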

  9. Alterations in Anatomical Covariance in the Prematurely Born.

    Science.gov (United States)

    Scheinost, Dustin; Kwon, Soo Hyun; Lacadie, Cheryl; Vohr, Betty R; Schneider, Karen C; Papademetris, Xenophon; Constable, R Todd; Ment, Laura R

    2017-01-01

    Preterm (PT) birth results in long-term alterations in functional and structural connectivity, but the related changes in anatomical covariance are just beginning to be explored. To test the hypothesis that PT birth alters patterns of anatomical covariance, we investigated brain volumes of 25 PTs and 22 terms at young adulthood using magnetic resonance imaging. Using regional volumetrics, seed-based analyses, and whole brain graphs, we show that PT birth is associated with reduced volume in bilateral temporal and inferior frontal lobes, left caudate, left fusiform, and posterior cingulate for prematurely born subjects at young adulthood. Seed-based analyses demonstrate altered patterns of anatomical covariance for PTs compared with terms. PTs exhibit reduced covariance with R Brodmann area (BA) 47, Broca's area, and L BA 21, Wernicke's area, and white matter volume in the left prefrontal lobe, but increased covariance with R BA 47 and left cerebellum. Graph theory analyses demonstrate that measures of network complexity are significantly less robust in PTs compared with term controls. Volumes in regions showing group differences are significantly correlated with phonological awareness, the fundamental basis for reading acquisition, for the PTs. These data suggest both long-lasting and clinically significant alterations in the covariance in the PTs at young adulthood. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. DevLore - A Firmware Library and Web-Based Configuration Control Tool for Accelerator Systems Under Constant Development

    International Nuclear Information System (INIS)

    Richard Evans; Kevin Jordan; Deborah Gruber; Daniel Sexton

    2005-01-01

    The Free Electron Laser Project at Jefferson Lab is based on a comparatively small accelerator driver. As its systems continue to grow and evolve, strict configuration control has not been a programmatic goal. Conversely, as the IR-Demo FEL and the 10 kW IR FEL have been built and operated, hardware and software changes have been a regular part of the machine development process. With relatively small component counts for sub-systems, changes occur without requiring much formal documentation, and in-situ alterations are commonplace in the name of supporting operations. This paper presents an overview of the web-based software tool called DevLore, which was first developed as a library for embedded programming and then became a tremendously effective tool for tracking all changes made to the machine hardware and software

  11. Library-based discovery and characterization of daphnane diterpenes as potent and selective HIV inhibitors in Daphne gnidium.

    Science.gov (United States)

    Vidal, Vincent; Potterat, Olivier; Louvel, Séverine; Hamy, François; Mojarrab, Mahdi; Sanglier, Jean-Jacques; Klimkait, Thomas; Hamburger, Matthias

    2012-03-23

    Despite the existence of an extended armamentarium of effective synthetic drugs to treat HIV, there is a continuing need for new potent and affordable drugs. Given the successful history of natural product based drug discovery, a library of close to one thousand plant and fungal extracts was screened for antiretroviral activity. A dichloromethane extract of the aerial parts of Daphne gnidium exhibited strong antiretroviral activity and absence of cytotoxicity. With the aid of HPLC-based activity profiling, the antiviral activity could be tracked to four daphnane derivatives, namely, daphnetoxin (1), gnidicin (2), gniditrin (3), and excoecariatoxin (4). Detailed anti-HIV profiling revealed that the pure compounds were active against multidrug-resistant viruses irrespective of their cellular tropism. Mode of action studies that narrowed the site of activity to viral entry events suggested a direct interference with the expression of the two main HIV co-receptors, CCR5 and CXCR4, at the cell surface by daphnetoxin (1).

  12. A fully synthetic human Fab antibody library based on fixed VH/VL framework pairings with favorable biophysical properties

    Science.gov (United States)

    Tiller, Thomas; Schuster, Ingrid; Deppe, Dorothée; Siegers, Katja; Strohner, Ralf; Herrmann, Tanja; Berenguer, Marion; Poujol, Dominique; Stehle, Jennifer; Stark, Yvonne; Heßling, Martin; Daubert, Daniela; Felderer, Karin; Kaden, Stefan; Kölln, Johanna; Enzelberger, Markus; Urlinger, Stefanie

    2013-01-01

    This report describes the design, generation and testing of Ylanthia, a fully synthetic human Fab antibody library with 1.3E+11 clones. Ylanthia comprises 36 fixed immunoglobulin (Ig) variable heavy (VH)/variable light (VL) chain pairs, which cover a broad range of canonical complementarity-determining region (CDR) structures. The variable Ig heavy and Ig light (VH/VL) chain pairs were selected for biophysical characteristics favorable to manufacturing and development. The selection process included multiple parameters, e.g., assessment of protein expression yield, thermal stability and aggregation propensity in fragment antigen binding (Fab) and IgG1 formats, and relative Fab display rate on phage. The framework regions are fixed and the diversified CDRs were designed based on a systematic analysis of a large set of rearranged human antibody sequences. Care was taken to minimize the occurrence of potential posttranslational modification sites within the CDRs. Phage selection was performed against various antigens and unique antibodies with excellent biophysical properties were isolated. Our results confirm that quality can be built into an antibody library by prudent selection of unmodified, fully human VH/VL pairs as scaffolds. PMID:23571156

  13. A cluster-based strategy for assessing the overlap between large chemical libraries and its application to a recent acquisition.

    Science.gov (United States)

    Engels, Michael F M; Gibbs, Alan C; Jaeger, Edward P; Verbinnen, Danny; Lobanov, Victor S; Agrafiotis, Dimitris K

    2006-01-01

    We report on the structural comparison of the corporate collections of Johnson & Johnson Pharmaceutical Research & Development (JNJPRD) and 3-Dimensional Pharmaceuticals (3DP), performed in the context of the recent acquisition of 3DP by JNJPRD. The main objective of the study was to assess the druglikeness of the 3DP library and the extent to which it enriched the chemical diversity of the JNJPRD corporate collection. The two databases, at the time of acquisition, collectively contained more than 1.1 million compounds with a clearly defined structural description. The analysis was based on a clustering approach and aimed at providing an intuitive quantitative estimate and visual representation of this enrichment. A novel hierarchical clustering algorithm called divisive k-means was employed in combination with Kelley's cluster-level selection method to partition the combined data set into clusters, and the diversity contribution of each library was evaluated as a function of the relative occupancy of these clusters. Typical 3DP chemotypes enriching the diversity of the JNJPRD collection were catalogued and visualized using a modified maximum common substructure algorithm. The joint collection of JNJPRD and 3DP compounds was also compared to other databases of known medicinally active or druglike compounds. The potential of the methodology for the analysis of very large chemical databases is discussed.
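The occupancy-based diversity assessment can be illustrated with a toy version: given cluster assignments for the combined compound set, count the clusters populated by only one library versus those shared by both. The clustering step itself (divisive k-means) is omitted here, and the labels below are invented:

```python
import numpy as np

def library_overlap(cluster_ids, library_ids):
    """For a combined, clustered compound set, report per-library cluster
    occupancy: clusters containing compounds from only one library mark
    chemotypes that enrich the combined collection; clusters containing
    both indicate structural overlap."""
    cluster_ids = np.asarray(cluster_ids)
    library_ids = np.asarray(library_ids)
    exclusive = {str(lib): 0 for lib in np.unique(library_ids)}
    shared = 0
    for c in np.unique(cluster_ids):
        libs = np.unique(library_ids[cluster_ids == c])
        if len(libs) == 1:
            exclusive[str(libs[0])] += 1
        else:
            shared += 1
    return exclusive, shared

# Toy example: 5 clusters over two libraries.
clusters  = [0, 0, 1, 1, 2, 3, 3, 4]
libraries = ["JNJPRD", "JNJPRD", "JNJPRD", "3DP", "3DP",
             "3DP", "3DP", "JNJPRD"]
exclusive, shared = library_overlap(clusters, libraries)
print(exclusive, shared)  # {'3DP': 2, 'JNJPRD': 2} 1
```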

  14. Structural Analysis of Covariance and Correlation Matrices.

    Science.gov (United States)

    Joreskog, Karl G.

    1978-01-01

    A general approach to analysis of covariance structures is considered, in which the variances and covariances or correlations of the observed variables are directly expressed in terms of the parameters of interest. The statistical problems of identification, estimation and testing of such covariance or correlation structures are discussed.…

  15. Construction of covariance matrix for experimental data

    International Nuclear Information System (INIS)

    Liu Tingjin; Zhang Jianhua

    1992-01-01

    For evaluators and experimenters, the information is complete only when the covariance matrix is given. The covariance matrix of indirectly measured data has been constructed and discussed. As an example, the covariance matrix of the 23Na(n,2n) cross section is constructed. A reasonable result is obtained
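A common way such experimental covariance matrices are built is by combining an uncorrelated statistical component with a fully correlated common-normalization component. The sketch below uses invented relative errors and is not the 23Na(n,2n) evaluation itself:

```python
import numpy as np

def build_covariance(values, stat_rel, norm_rel):
    """Covariance matrix of measured cross sections combining an
    uncorrelated statistical component (diagonal) with a fully
    correlated common-normalization component (rank-one outer
    product); both components are given as relative errors."""
    values = np.asarray(values, dtype=float)
    stat = np.diag((values * np.asarray(stat_rel, dtype=float)) ** 2)
    norm = np.outer(values, values) * norm_rel ** 2
    return stat + norm

# Toy 3-point data set: 2% statistical, 3% normalization uncertainty.
y = [100.0, 120.0, 90.0]
C = build_covariance(y, [0.02, 0.02, 0.02], 0.03)
corr = C / np.sqrt(np.outer(np.diag(C), np.diag(C)))  # correlation matrix
```

The normalization term is what fills the off-diagonal entries; without it the matrix would be diagonal and the data would carry no correlation information.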

  16. SpacePy - a Python-based library of tools for the space sciences

    International Nuclear Information System (INIS)

    Morley, Steven K.; Welling, Daniel T.; Koller, Josef; Larsen, Brian A.; Henderson, Michael G.

    2010-01-01

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth the short timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. The provision of open-source tools to perform common tasks will provide openness in the
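One of the techniques mentioned, superposed epoch analysis, is simple to sketch in plain NumPy. This is a minimal illustration of the idea, not SpacePy's own API:

```python
import numpy as np

def superposed_epoch(series, epochs, window):
    """Minimal superposed-epoch analysis: stack fixed-length windows of
    `series` centred on each epoch index and average across events.
    (Illustration only; SpacePy's implementation is far richer.)"""
    series = np.asarray(series, dtype=float)
    half = window // 2
    stack = np.array([series[e - half : e + half + 1] for e in epochs])
    return stack.mean(axis=0), stack

# Synthetic signal: identical noise-free dips at three known epochs.
t = np.zeros(100)
for e in (20, 50, 80):
    t[e - 2 : e + 3] -= [1, 2, 3, 2, 1]
mean, stack = superposed_epoch(t, [20, 50, 80], window=7)
print(mean)  # dip shape recovered: [0, -1, -2, -3, -2, -1, 0]
```

Averaging across events cancels event-to-event noise while preserving the common response, which is why the technique is a staple of storm and substorm studies.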

  17. Lorentz covariant theory of gravitation

    International Nuclear Information System (INIS)

    Fagundes, H.V.

    1974-12-01

    An alternative method for the calculation of second-order effects, like the secular shift of Mercury's perihelion, is developed. This method uses the basic ideas of Thirring combined with the more mathematical approach of Feynman. In the case of a static source, the treatment used is greatly simplified. Besides, the Einstein-Infeld-Hoffmann Lagrangian for a system of two particles, and the spin-orbit and spin-spin interactions of two particles with classical spin, i.e., internal angular momentum in Møller's sense, are obtained from the Lorentz covariant theory

  18. Covariant gauges at finite temperature

    CERN Document Server

    Landshoff, Peter V

    1992-01-01

    A prescription is presented for real-time finite-temperature perturbation theory in covariant gauges, in which only the two physical degrees of freedom of the gauge-field propagator acquire thermal parts. The propagators for the unphysical degrees of freedom of the gauge field, and for the Faddeev-Popov ghost field, are independent of temperature. This prescription is applied to the calculation of the one-loop gluon self-energy and the two-loop interaction pressure, and is found to be simpler to use than the conventional one.

  19. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
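The skill scores listed are standard functions of the 2x2 contingency table (hits, false alarms, misses, correct negatives). A minimal sketch with a hypothetical event tally, not CCMC's actual tooling:

```python
def skill_scores(hits, false_alarms, misses, correct_negatives):
    """Standard 2x2 contingency-table skill scores of the kind used in
    model-data comparison challenges."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                      # probability of detection
    pofd = b / (b + d)                     # probability of false detection
    heidke = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, pofd, heidke

# Hypothetical tally over an event list: 40 hits, 10 false alarms,
# 20 misses, 130 correct negatives.
pod, pofd, heidke = skill_scores(40, 10, 20, 130)
print(round(pod, 3), round(pofd, 3), round(heidke, 3))  # 0.667 0.071 0.625
```

The Heidke score corrects for chance agreement, which is why it is reported alongside the raw detection probabilities when models are ranked across many events.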

  20. Populating a Library of Reusable H-BOMs: Assessment of a Feasible Image-Based Modeling Workflow

    Science.gov (United States)

    Santagati, C.; Lo Turco, M.; D'Agostino, G.

    2017-08-01

    The paper shows the intermediate results of a research activity aimed at populating a library of reusable Historical Building Object Models (H-BOMs) by testing a fully digital workflow that takes advantage of Structure from Motion (SfM) models and is centered on the geometrical/stylistic/material analysis of the architectural element (portal, window, altar). The aim is to find common (invariant) and uncommon (variant) features in terms of identification of architectural parts and their relationships, geometrical rules, dimensions and proportions, construction materials and units of measure, in order to model archetypal shapes from which it is possible to derive all the style variations. In this regard, a set of 14th - 16th century Gothic portals of the Catalan-Aragonese architecture in the Etnean area of Eastern Sicily has been studied and used to assess the feasibility of the identified workflow. This approach tries to answer the increasing demand for guidelines and standards in the field of Cultural Heritage conservation to create and manage semantic-aware 3D models able to include all the information (both geometrical and alphanumerical) concerning historical buildings and able to be reused in several projects.

  1. Covariance Evaluation Methodology for Neutron Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Arcilla, R.; Mattoon, C.M.; Mughabghab, S.F.; Oblozinsky, P.; Pigni, M.; Pritychenko, B.; Sonzogni, A.A.

    2008-09-01

    We present the NNDC-BNL methodology for estimating neutron cross section covariances in the thermal, resolved resonance, unresolved resonance, and fast neutron regions. The three key elements of the methodology are the Atlas of Neutron Resonances, the nuclear reaction code EMPIRE, and a Bayesian code implementing the Kalman filter concept. The covariance data processing, visualization, and distribution capabilities are integral components of the NNDC methodology. We illustrate its application with examples, including a relatively detailed evaluation of covariances for two individual nuclei and the massive production of simple covariance estimates for 307 materials. Certain peculiarities regarding the evaluation of covariances for resolved resonances, and the consistency between resonance parameter uncertainties and thermal cross section uncertainties, are also discussed.
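The Bayesian (Kalman filter) step at the core of such a covariance evaluation can be sketched as a standard linear update of model-parameter covariances against measured cross sections. This is an illustrative sketch, not the EMPIRE/KALMAN code itself; all variable names and numbers below are hypothetical.

```python
import numpy as np

# Prior parameters p0 with covariance P0 (e.g., reaction-model parameters),
# sensitivity matrix S mapping parameters to cross sections,
# measurements y with experimental covariance V.
def kalman_update(p0, P0, S, y, V):
    # Kalman gain: K = P0 S^T (S P0 S^T + V)^{-1}
    K = P0 @ S.T @ np.linalg.inv(S @ P0 @ S.T + V)
    p1 = p0 + K @ (y - S @ p0)   # updated parameter estimates
    P1 = P0 - K @ S @ P0         # updated (reduced) parameter covariance
    return p1, P1

# Toy example: two parameters, three measured cross-section points.
p0 = np.array([1.0, 0.5])
P0 = np.diag([0.04, 0.01])
S = np.array([[1.0, 0.2], [0.8, 0.5], [0.3, 1.0]])
y = np.array([1.12, 1.10, 0.82])
V = 0.01 * np.eye(3)
p1, P1 = kalman_update(p0, P0, S, y, V)
# The posterior diagonal uncertainties never exceed the prior ones.
```

The resulting cross-section covariance follows by propagating P1 through the sensitivities, S P1 S^T.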

  2. Large Covariance Estimation by Thresholding Principal Orthogonal Complements

    Science.gov (United States)

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2012-01-01

    This paper deals with the estimation of a high-dimensional covariance with a conditional sparsity structure and fast-diverging eigenvalues. By assuming sparse error covariance matrix in an approximate factor model, we allow for the presence of some cross-sectional correlation even after taking out common but unobservable factors. We introduce the Principal Orthogonal complEment Thresholding (POET) method to explore such an approximate factor structure with sparsity. The POET estimator includes the sample covariance matrix, the factor-based covariance matrix (Fan, Fan, and Lv, 2008), the thresholding estimator (Bickel and Levina, 2008) and the adaptive thresholding estimator (Cai and Liu, 2011) as specific examples. We provide mathematical insights when the factor analysis is approximately the same as the principal component analysis for high-dimensional data. The rates of convergence of the sparse residual covariance matrix and the conditional sparse covariance matrix are studied under various norms. It is shown that the impact of estimating the unknown factors vanishes as the dimensionality increases. The uniform rates of convergence for the unobserved factors and their factor loadings are derived. The asymptotic results are also verified by extensive simulation studies. Finally, a real data application on portfolio allocation is presented. PMID:24348088
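The POET construction described above can be sketched in a few lines: estimate the factor part of the covariance from the top K principal components of the sample covariance, then soft-threshold the principal orthogonal complement (the residual covariance). A minimal illustration; the toy data and variable names are ours, not the authors':

```python
import numpy as np

def poet(X, K, tau):
    """POET sketch: X is n-by-p data; keep K principal factors,
    soft-threshold off-diagonals of the residual covariance at tau."""
    S = np.cov(X, rowvar=False)                   # p x p sample covariance
    w, V = np.linalg.eigh(S)                      # ascending eigenvalues
    order = np.argsort(w)[::-1]
    w, V = w[order], V[:, order]
    low_rank = (V[:, :K] * w[:K]) @ V[:, :K].T    # factor (spiked) part
    R = S - low_rank                              # principal orthogonal complement
    off = R - np.diag(np.diag(R))
    R_thr = np.diag(np.diag(R)) + np.sign(off) * np.maximum(np.abs(off) - tau, 0.0)
    return low_rank + R_thr

# Toy approximate factor model: 2 common factors plus idiosyncratic noise.
rng = np.random.default_rng(0)
n, p, K = 200, 20, 2
B = rng.normal(size=(p, K))
F = rng.normal(size=(n, K))
X = F @ B.T + 0.5 * rng.normal(size=(n, p))
Sigma_hat = poet(X, K=2, tau=0.05)
```

With tau = 0 and K = 0 this reduces to the sample covariance, matching the paper's observation that the sample covariance and pure thresholding estimators are special cases.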

  3. Visualization and assessment of spatio-temporal covariance properties

    KAUST Repository

    Huang, Huang

    2017-11-23

    Spatio-temporal covariances are important for describing the spatio-temporal variability of underlying random fields in geostatistical data. For second-order stationary random fields, there exist subclasses of covariance functions that assume a simpler spatio-temporal dependence structure with separability and full symmetry. However, it is challenging to visualize and assess separability and full symmetry from spatio-temporal observations. In this work, we propose a functional data analysis approach that constructs test functions using the cross-covariances from time series observed at each pair of spatial locations. These test functions of temporal lags summarize the properties of separability or symmetry for the given spatial pairs. We use functional boxplots to visualize the functional median and the variability of the test functions, where the extent of departure from zero at all temporal lags indicates the degree of non-separability or asymmetry. We also develop a rank-based nonparametric testing procedure for assessing the significance of the non-separability or asymmetry. Essentially, the proposed methods only require the analysis of temporal covariance functions. Thus, a major advantage over existing approaches is that there is no need to estimate any covariance matrix for selected spatio-temporal lags. The performance of the proposed methods is examined by simulations with various commonly used spatio-temporal covariance models. To illustrate our methods in practical applications, we apply them to real datasets, including weather station data and climate model outputs.
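As a rough illustration of building a test function of temporal lag from cross-covariances, the sketch below computes an asymmetry test function f(u) = r_xy(u) - r_xy(-u) for one spatial pair; under full symmetry the cross-correlation is even in the lag, so f should hover around zero. This is our simplified reading of the idea, not the authors' procedure:

```python
import numpy as np

def asymmetry_test_function(x, y, max_lag):
    """For series x (site i) and y (site j), return f(u) = r_xy(u) - r_xy(-u)
    for u = 1..max_lag; departures from zero suggest asymmetry."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    def r(u):
        # lagged sample cross-correlation of standardized series
        if u >= 0:
            return np.mean(x[:n - u] * y[u:])
        return np.mean(x[-u:] * y[:n + u])
    return np.array([r(u) - r(-u) for u in range(1, max_lag + 1)])

# Toy pair with built-in asymmetry: y essentially leads x by one step.
rng = np.random.default_rng(1)
z = rng.normal(size=500)
x = z[:-1]
y = 0.9 * z[1:] + 0.1 * rng.normal(size=499)
f = asymmetry_test_function(x, y, max_lag=5)
# f[0] is strongly negative here, flagging the lead-lag asymmetry.
```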

  4. Covariance fitting of highly-correlated data in lattice QCD

    Science.gov (United States)

    Yoon, Boram; Jang, Yong-Chull; Jung, Chulwoo; Lee, Weonjong

    2013-07-01

    We address a frequently asked question on the covariance fitting of highly correlated data, such as our B_K data based on SU(2) staggered chiral perturbation theory. The essence of the problem is that we do not have a fitting function accurate enough to fit extremely precise data. When eigenvalues of the covariance matrix are small, even a tiny error in the fitting function yields a large chi-square value and spoils the fitting procedure. We have applied a number of prescriptions available in the market, such as the cut-off method, the modified covariance matrix method, and the Bayesian method. We also propose a brand-new method, the eigenmode shift (ES) method, which allows a full covariance fitting without modifying the covariance matrix at all. We provide a pedagogical example of data analysis in which the cut-off method manifestly fails in fitting, but the rest work well. In our case of the B_K fitting, the diagonal approximation, the cut-off method, the ES method, and the Bayesian method all work reasonably well in an engineering sense. However, the meaning of χ² is easier to interpret, in a theoretical sense, for the ES method and the Bayesian method. Hence, the ES method can be a useful alternative tool to check the systematic error caused by the covariance fitting procedure.
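The cut-off prescription mentioned above can be sketched generically: invert the data covariance only in the subspace of eigenmodes above a cutoff, so that near-singular modes cannot blow up χ². An illustrative sketch with a toy two-point dataset; the cutoff parameter `eps` and the numbers are ours:

```python
import numpy as np

def chi2_with_cutoff(residual, C, eps):
    """Correlated chi^2 using a pseudo-inverse of the covariance C in which
    eigenmodes with eigenvalue < eps * max_eigenvalue are discarded
    (a generic form of the cut-off method)."""
    w, V = np.linalg.eigh(C)
    keep = w > eps * w.max()
    Cinv = (V[:, keep] / w[keep]) @ V[:, keep].T
    return residual @ Cinv @ residual

# Highly correlated toy covariance: eigenvalues ~1.999 and ~0.001.
C = np.array([[1.0, 0.999],
              [0.999, 1.0]])
r = np.array([0.01, -0.01])             # residual aligned with the tiny mode
full = chi2_with_cutoff(r, C, eps=0.0)  # all modes kept: the tiny mode dominates
cut  = chi2_with_cutoff(r, C, eps=1e-2) # near-singular mode dropped: chi^2 collapses
```

The example makes the record's point concrete: a tiny residual along a near-singular eigenmode inflates the full-covariance χ², which is exactly the regime where the cut-off, Bayesian, and ES prescriptions differ.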

  5. Large Covariance Estimation by Thresholding Principal Orthogonal Complements.

    Science.gov (United States)

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2013-09-01

    This paper deals with the estimation of a high-dimensional covariance with a conditional sparsity structure and fast-diverging eigenvalues. By assuming sparse error covariance matrix in an approximate factor model, we allow for the presence of some cross-sectional correlation even after taking out common but unobservable factors. We introduce the Principal Orthogonal complEment Thresholding (POET) method to explore such an approximate factor structure with sparsity. The POET estimator includes the sample covariance matrix, the factor-based covariance matrix (Fan, Fan, and Lv, 2008), the thresholding estimator (Bickel and Levina, 2008) and the adaptive thresholding estimator (Cai and Liu, 2011) as specific examples. We provide mathematical insights when the factor analysis is approximately the same as the principal component analysis for high-dimensional data. The rates of convergence of the sparse residual covariance matrix and the conditional sparse covariance matrix are studied under various norms. It is shown that the impact of estimating the unknown factors vanishes as the dimensionality increases. The uniform rates of convergence for the unobserved factors and their factor loadings are derived. The asymptotic results are also verified by extensive simulation studies. Finally, a real data application on portfolio allocation is presented.

  6. Hierarchical multivariate covariance analysis of metabolic connectivity.

    Science.gov (United States)

    Carbonell, Felix; Charil, Arnaud; Zijdenbos, Alex P; Evans, Alan C; Bedell, Barry J

    2014-12-01

    Conventional brain connectivity analysis is typically based on the assessment of interregional correlations. Given that correlation coefficients are derived from both covariance and variance, group differences in covariance may be obscured by differences in the variance terms. To facilitate a comprehensive assessment of connectivity, we propose a unified statistical framework that interrogates the individual terms of the correlation coefficient. We have evaluated the utility of this method for metabolic connectivity analysis using [18F]2-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. As an illustrative example of the utility of this approach, we examined metabolic connectivity in angular gyrus and precuneus seed regions of mild cognitive impairment (MCI) subjects with low and high β-amyloid burdens. This new multivariate method allowed us to identify alterations in the metabolic connectome, which would not have been detected using classic seed-based correlation analysis. Ultimately, this novel approach should be extensible to brain network analysis and broadly applicable to other imaging modalities, such as functional magnetic resonance imaging (fMRI).
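The core observation, that Pearson's r mixes a covariance numerator with variance terms in the denominator, can be sketched by decomposing r so each term can be compared across groups separately. A toy illustration of the idea, not the authors' statistical framework:

```python
import numpy as np

def correlation_terms(a, b):
    """Split Pearson's r into its covariance and variance components,
    so group differences in each term can be examined on their own."""
    cov = np.cov(a, b)[0, 1]
    va, vb = a.var(ddof=1), b.var(ddof=1)
    return {"cov": cov, "var_a": va, "var_b": vb,
            "r": cov / np.sqrt(va * vb)}

rng = np.random.default_rng(2)
# Two "groups" sharing the same covariance but differing in variance:
# r drops in group 2 even though the covariance term is unchanged.
base = rng.normal(size=1000)
g1_a = base + 0.5 * rng.normal(size=1000)
g1_b = base + 0.5 * rng.normal(size=1000)
g2_a = g1_a
g2_b = g1_b + 2.0 * rng.normal(size=1000)   # extra variance only
t1 = correlation_terms(g1_a, g1_b)
t2 = correlation_terms(g2_a, g2_b)
```

Here a correlation-only analysis would report a group difference, while the decomposition shows the covariance term is essentially identical; that is the confound the record's framework is designed to expose.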

  7. Analysis of PROTEUS phase II experiments performed using the AARE modular system and JEF-based libraries

    International Nuclear Information System (INIS)

    Pelloni, S.; Stepanek, J.; Vontobel, P.

    1989-01-01

    The capability of the advanced analysis of reactor engineering (AARE) modular code system and JEF-1-based nuclear data libraries to analyze light water high converter reactor (LWHCR) lattices is investigated by calculating the wet and dry cells of the PROTEUS-LWHCR phase II experiment. The results are compared to those obtained using several cell codes. Main features of the AARE code system, such as the self-shielding of resonance cross sections in the whole energy range, the generation of adequate fission source spectra, and the efficiency of the elastic removal correction, are investigated. In particular, it is shown that AARE results for the k∞ void coefficient agree very well with the experiment, whereas other codes give larger deviations

  8. Validation of LWR calculation methods and JEF-1 based data libraries by TRX and BAPL critical experiments

    International Nuclear Information System (INIS)

    Pelloni, S.; Grimm, P.; Mathews, D.; Paratte, J.M.

    1989-06-01

    In this report the capability of various code systems widely used at PSI (such as WIMS-D, BOXER, and the AARE modules TRAMIX and MICROX-2 in connection with the one-dimensional transport code ONEDANT) and JEF-1 based nuclear data libraries to compute LWR lattices is analysed by comparing results from thermal reactor benchmarks TRX and BAPL with experiment and with previously published values. It is shown that with the JEF-1 evaluation eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and that all methods give reasonable results for the measured reaction rate within or not too far from the experimental uncertainty. This is consistent with previous similar studies. (author) 7 tabs., 36 refs

  9. The spectra program library: A PC based system for gamma-ray spectra analysis and INAA data reduction

    Science.gov (United States)

    Baedecker, P.A.; Grossman, J.N.

    1995-01-01

    A PC based system has been developed for the analysis of gamma-ray spectra and for the complete reduction of data from INAA experiments, including software to average the results from multiple lines and multiple countings and to produce a final report of analysis. Graphics algorithms may be called for the analysis of complex spectral features, to compare the data from alternate photopeaks and to evaluate detector performance during a given counting cycle. A database of results for control samples can be used to prepare quality control charts to evaluate long term precision and to search for systematic variations in data on reference samples as a function of time. The entire software library can be accessed through a user-friendly menu interface with internal help.

  10. Poincare covariance and κ-Minkowski spacetime

    International Nuclear Information System (INIS)

    Dabrowski, Ludwik; Piacitelli, Gherardo

    2011-01-01

    A fully Poincare covariant model is constructed as an extension of the κ-Minkowski spacetime. Covariance is implemented by a unitary representation of the Poincare group, and thus complies with the original Wigner approach to quantum symmetries. This provides yet another example (besides the DFR model), where Poincare covariance is realised a la Wigner in the presence of two characteristic dimensionful parameters: the light speed and the Planck length. In other words, a Doubly Special Relativity (DSR) framework may well be realised without deforming the meaning of 'Poincare covariance'. -- Highlights: → We construct a 4d model of noncommuting coordinates (quantum spacetime). → The coordinates are fully covariant under the undeformed Poincare group. → Covariance a la Wigner holds in presence of two dimensionful parameters. → Hence we are not forced to deform covariance (e.g. as quantum groups). → The underlying κ-Minkowski model is unphysical; covariantisation does not cure this.

  11. Initial draft of CSE-UCLA evaluation model based on weighted product in order to optimize digital library services in computer college in Bali

    Science.gov (United States)

    Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.

    2018-01-01

    The aim of this research was to create an initial design of the CSE-UCLA evaluation model modified with Weighted Product for evaluating digital library services at Computer Colleges in Bali. The method used was developmental research following the Borg and Gall design model. The result obtained from the research conducted earlier this month was a rough sketch of the Weighted Product based CSE-UCLA evaluation model; the design was able to provide a general overview of the stages of the Weighted Product based CSE-UCLA evaluation model used to optimize digital library services at the Computer Colleges in Bali.

  12. Implementation of the NMR CHEmical Shift Covariance Analysis (CHESCA): A Chemical Biologist's Approach to Allostery.

    Science.gov (United States)

    Boulton, Stephen; Selvaratnam, Rajeevan; Ahmed, Rashik; Melacini, Giuseppe

    2018-01-01

    Mapping allosteric sites is emerging as one of the central challenges in physiology, pathology, and pharmacology. Nuclear Magnetic Resonance (NMR) spectroscopy is ideally suited to map allosteric sites, given its ability to sense at atomic resolution the dynamics underlying allostery. Here, we focus specifically on the NMR CHEmical Shift Covariance Analysis (CHESCA), in which allosteric systems are interrogated through a targeted library of perturbations (e.g., mutations and/or analogs of the allosteric effector ligand). The atomic resolution readout for the response to such perturbation library is provided by NMR chemical shifts. These are then subject to statistical correlation and covariance analyses resulting in clusters of allosterically coupled residues that exhibit concerted responses to the common set of perturbations. This chapter provides a description of how each step in the CHESCA is implemented, starting from the selection of the perturbation library and ending with an overview of different clustering options.

  13. The library

    International Nuclear Information System (INIS)

    1980-01-01

    A specialized library is essential for conducting the research work of the Uranium Institute. The need was recognized at the foundation of the Institute and a full-time librarian was employed in 1976 to establish the necessary systems and begin the task of building up the collection. A brief description is given of the services offered by the library which now contains books, periodicals, pamphlets and press cuttings, focussed on uranium and nuclear energy, but embracing economics, politics, trade, legislation, geology, mining and mineral processing, environmental protection and nuclear technology. (author)

  14. Bridging the Gap of Practice and Research: A Preliminary Investigation of Evidence-based Practice for Library and Information Science Research

    Directory of Open Access Journals (Sweden)

    吳寂絹 Chi-Chuan Wu

    2015-10-01

    Full Text Available The gap between practice and research is commonly found in disciplines with both professional practitioners and academic researchers. How to bridge the gap is also a continuing concern in the field of Library and Information Studies. This article describes the recent development of Evidence-based Practice for Library and Information Science Research (EBLIP) and provides an analysis of the journal EBLIP, including its authors' backgrounds, methods, and topics. The results show that the United States and Canada are the two major contributing nations; more than 70% of first authors are librarians; 76% of the articles were contributed by a single institute, and co-authorship across national institutes was rarely seen, demonstrating local research interests; co-authoring agencies are primarily libraries; 60% of the methods employed include questionnaires, interviews, and content analysis; the coverage of topics is rather broad, with the top three categories of research topics being Information Literacy & Instruction, Information Needs & Seeking Behavior, and Reference Services / Digital Reference Services (15%, 10%, and 8%); many datasets were obtained from real library practice, and 72% of articles provide specific implications for application, which highlights the value of implementation. Many librarians have the research capability, and this article serves to introduce evidence-based research and encourage more such research in Taiwan. Hopefully it may benefit and further enhance the quality of library decision-making and the profession's image.

  15. A Framework Based on 2-D Taylor Expansion for Quantifying the Impacts of Sub-Pixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bi-Spectral Method

    Science.gov (United States)

    Zhang, Z.; Werner, F.; Cho, H. -M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2016-01-01

    The bi-spectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VIS/NIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In the literature, the retrievals of τ and re are often assumed to be independent and considered separately when investigating the impact of sub-pixel cloud reflectance variations on the bi-spectral method. As a result, the impact on τ is contributed only by the sub-pixel variation of VIS/NIR band reflectance and the impact on re only by the sub-pixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the τ and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in the VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from the Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations.
Our framework can be used

  16. A Framework Based on 2-D Taylor Expansion for Quantifying the Impacts of Subpixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bispectral Method

    Science.gov (United States)

    Zhang, Z.; Werner, F.; Cho, H.-M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, K.

    2016-01-01

    The bispectral method retrieves cloud optical thickness (t) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VIS/NIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring subpixel variations of cloud reflectances can lead to a significant bias in the retrieved t and re. In the literature, the retrievals of t and re are often assumed to be independent and considered separately when investigating the impact of subpixel cloud reflectance variations on the bispectral method. As a result, the impact on t is contributed only by the subpixel variation of VIS/NIR band reflectance and the impact on re only by the subpixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of subpixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the t and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how subpixel cloud reflectance variations impact the t and re retrievals based on the bispectral method. In particular, our framework provides a mathematical explanation of how the subpixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations. Our
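The second-order 2-D Taylor estimate at the heart of this framework can be sketched generically: the expected retrieval bias from subpixel variability is approximately 0.5*(f_vv*Var(R_vis) + 2*f_vs*Cov(R_vis, R_swir) + f_ss*Var(R_swir)), where subscripts denote partial derivatives of the retrieval function. The sketch below uses central finite differences and a stand-in retrieval function, not a real radiative-transfer inversion; all names and values are illustrative:

```python
def retrieval_bias(f, Rv, Rs, var_v, var_s, cov_vs, h=1e-4):
    """Second-order 2-D Taylor estimate of the subpixel bias
    E[f(Rv+dv, Rs+ds)] - f(Rv, Rs), with the second partial derivatives
    of the retrieval function f approximated by central differences."""
    f_vv = (f(Rv + h, Rs) - 2 * f(Rv, Rs) + f(Rv - h, Rs)) / h**2
    f_ss = (f(Rv, Rs + h) - 2 * f(Rv, Rs) + f(Rv, Rs - h)) / h**2
    f_vs = (f(Rv + h, Rs + h) - f(Rv + h, Rs - h)
            - f(Rv - h, Rs + h) + f(Rv - h, Rs - h)) / (4 * h**2)
    return 0.5 * (f_vv * var_v + 2 * f_vs * cov_vs + f_ss * var_s)

# Stand-in retrieval function (purely illustrative, not a lookup-table
# inversion): nonlinear in both band reflectances, so variance AND
# covariance terms both contribute to the bias.
f = lambda rv, rs: rv**2 / rs
bias = retrieval_bias(f, Rv=0.5, Rs=0.2, var_v=0.01, var_s=0.004, cov_vs=0.003)
# Analytically: 0.5*(10*0.01 + 2*(-25)*0.003 + 62.5*0.004) = 0.1
```

Note how the covariance term enters with its own sign: it can partially cancel or reinforce the two variance terms, which is the cross-band coupling the record highlights.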

  17. Libraries on the MOVE.

    Science.gov (United States)

    Edgar, Jim; And Others

    1986-01-01

    Presents papers from Illinois State Library and Shawnee Library System's "Libraries on the MOVE" conference focusing on how libraries can impact economic/cultural climate of an area. Topics addressed included information services of rural libraries; marketing; rural library development; library law; information access; interagency…

  18. Personal Virtual Libraries

    Science.gov (United States)

    Pappas, Marjorie L.

    2004-01-01

    Virtual libraries are becoming more and more common. Most states have a virtual library. A growing number of public libraries have a virtual presence on the Web. Virtual libraries are a growing addition to school library media collections. The next logical step would be personal virtual libraries. A personal virtual library (PVL) is a collection…

  19. America's Star Libraries

    Science.gov (United States)

    Lyons, Ray; Lance, Keith Curry

    2009-01-01

    "Library Journal"'s new national rating of public libraries, the "LJ" Index of Public Library Service, identifies 256 "star" libraries. It rates 7,115 public libraries. The top libraries in each group get five, four, or three Michelin guide-like stars. All included libraries, stars or not, can use their scores to learn from their peers and improve…

  20. COVARIANCE ASSISTED SCREENING AND ESTIMATION.

    Science.gov (United States)

    Ke, Tracy; Jin, Jiashun; Fan, Jianqing

    2014-11-01

    Consider a linear model Y = Xβ + z, where X = X_{n,p} and z ~ N(0, I_n). The vector β is unknown and it is of interest to separate its nonzero coordinates from the zero ones (i.e., variable selection). Motivated by examples in long-memory time series (Fan and Yao, 2003) and the change-point problem (Bhattacharya, 1994), we are primarily interested in the case where the Gram matrix G = X'X is non-sparse but sparsifiable by a finite-order linear filter. We focus on the regime where signals are both rare and weak, so that successful variable selection is very challenging but still possible. We approach this problem with a new procedure called Covariance Assisted Screening and Estimation (CASE). CASE first uses linear filtering to reduce the original setting to a new regression model in which the corresponding Gram (covariance) matrix is sparse. The new covariance matrix induces a sparse graph, which guides us to conduct multivariate screening without visiting all the submodels. By interacting with the signal sparsity, the graph enables us to decompose the original problem into many separated small-size subproblems (if only we knew where they are!). Linear filtering also induces a so-called problem of information leakage, which can be overcome by the newly introduced patching technique. Together, these give rise to CASE, which is a two-stage Screen and Clean (Fan and Song, 2010; Wasserman and Roeder, 2009) procedure, where we first identify candidates of these submodels by patching and screening, and then re-examine each candidate to remove false positives. For any variable-selection procedure β̂, we measure the performance by the minimax Hamming distance between the sign vectors of β̂ and β. We show that in a broad class of situations where the Gram matrix is non-sparse but sparsifiable, CASE achieves the optimal rate of convergence. The results are successfully applied to long-memory time series and the change-point model.

  1. Non-Critical Covariant Superstrings

    CERN Document Server

    Grassi, P A

    2005-01-01

    We construct a covariant description of non-critical superstrings in even dimensions. We construct explicitly supersymmetric hybrid type variables in a linear dilaton background, and study an underlying N=2 twisted superconformal algebra structure. We find similarities between non-critical superstrings in 2n+2 dimensions and critical superstrings compactified on CY_(4-n) manifolds. We study the spectrum of the non-critical strings, and in particular the Ramond-Ramond massless fields. We use the supersymmetric variables to construct the non-critical superstrings sigma-model action in curved target space backgrounds with coupling to the Ramond-Ramond fields. We consider as an example non-critical type IIA strings on AdS_2 background with Ramond-Ramond 2-form flux.

  2. Application of a new cross section library based on ENDF/B-IV to reactor core analysis

    International Nuclear Information System (INIS)

    Lima Bezerra, J. de.

    1991-04-01

    The use of the ENDF/B-IV library in the LEOPARD code for the Angra-1 reactor simulation is presented. The results are compared to those obtained using the ENDF/B-II library and show better values for the power distribution but an underestimated global reactivity as compared to experimental results. (F.E.). 1 ref, 55 figs, 1 tab

  3. An Intelligent Mobile Location-Aware Book Recommendation System that Enhances Problem-Based Learning in Libraries

    Science.gov (United States)

    Chen, Chih-Ming

    2013-01-01

    Despite rapid and continued adoption of mobile devices, few learning modes integrate with mobile technologies and libraries' environments as innovative learning modes that emphasize the key roles of libraries in facilitating learning. In addition, some education experts have claimed that transmitting knowledge to learners is not the only…

  4. Out on the Web: The Relationship between Campus Climate and GLBT-Related Web-based Resources in Academic Libraries

    Science.gov (United States)

    Ciszek, Matthew P.

    2011-01-01

    This article explores the relationship between the perceived campus environment for gay, lesbian, bisexual, and transgender (GLBT) students at colleges and universities and how academic libraries have deployed GLBT-related resources on the Web. Recommendations are made for increasing GLBT-related materials and information in academic libraries.…

  5. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  6. Application Portable Parallel Library

    Science.gov (United States)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" here also includes heterogeneous collection of networked computers.) Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.

  7. Knowledge management for libraries

    CERN Document Server

    Forrestal, Valerie

    2015-01-01

    Libraries are creating dynamic knowledge bases to capture both tacit and explicit knowledge and subject expertise for use within and beyond their organizations. In this book, readers will learn to move policies and procedures manuals online using a wiki, get the most out of Microsoft SharePoint with custom portals and Web Parts, and build an FAQ knowledge base from reference management applications such as LibAnswers. Knowledge Management for Libraries guides readers through the process of planning, developing, and launching th

  8. SU-F-T-47: MRI T2 Exclusive Based Planning Using the Endocavitary/interstitial Gynecological Benidorm Applicator: A Proposed TPS Library and Preplan Efficient Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Richart, J; Otal, A; Rodriguez, S; Santos, M [Clinica Benidorm, Benidorm, Alicante (Spain); Perez-Calatayud, J [Clinica Benidorm, Benidorm, Alicante (Spain); Hospital La Fe, Valencia (Spain)

    2016-06-15

    Purpose: ABS and GEC-ESTRO have recommended MRI T2 for image-guided brachytherapy. Recently, a new applicator (the Benidorm Template, TB) was developed in our Department (Rodriguez et al 2015). The TB is fully MRI-compatible because of its titanium needles, and it allows the use of an intrauterine tandem. TPS applicator libraries are not currently available for non-rigid applicators with an interstitial component such as the TB. The purpose of this work is to present the development of a library for the TB, together with its use in a pre-planning technique. Together, these allow a very efficient, exclusively T2 MRI-based clinical implementation of the TB. Methods: The developed library has been implemented in the Oncentra Brachytherapy TPS, version 4.3.0 (Elekta), and is now being implemented in the Sagiplan v 2.0 TPS (Eckert&Ziegler BEBIG). To model the TB, the free and open-source programs FreeCAD and MeshLab were used. The reconstruction process is based on three inserted vitamin-A pellets together with the data provided by the free length. The implemented pre-planning procedure is as follows: 1) An MRI T2 acquisition is performed with the template in place with only the vaginal cylinder (no uterine tube or needles). 2) The CTV is drawn and the required needles are selected using a Java-based application developed for this purpose. 3) A post-implant MRI T2 is performed. Results: This library procedure has been successfully applied in 25 patients to date. In this work the use of the developed library is illustrated with clinical examples. The pre-planning procedure has been applied in 6 patients to date, with significant advantages: needle depth estimation, a priori optimization of needle positions and number, time saving, etc. Conclusion: The TB library and pre-plan techniques are feasible and very efficient, and their use is illustrated in this work.

  9. SU-F-T-47: MRI T2 Exclusive Based Planning Using the Endocavitary/interstitial Gynecological Benidorm Applicator: A Proposed TPS Library and Preplan Efficient Methodology

    International Nuclear Information System (INIS)

    Richart, J; Otal, A; Rodriguez, S; Santos, M; Perez-Calatayud, J

    2016-01-01

    Purpose: The ABS and GEC-ESTRO have recommended T2 MRI for image-guided brachytherapy. Recently, a new applicator (Benidorm Template, TB) was developed in our department (Rodriguez et al 2015). The TB is fully MRI compatible because of its titanium needles, and it allows the use of an intrauterine tandem. TPS applicator libraries are not currently available for non-rigid applicators with an interstitial component such as the TB. The purpose of this work is to present the development of a library for the TB, together with its use in a pre-planning technique. Together, these allow a very efficient, exclusively T2 MRI-based planning workflow for clinical TB implementation. Methods: The developed library has been implemented in the Oncentra Brachytherapy TPS, version 4.3.0 (Elekta), and is now being implemented in the Sagiplan v2.0 TPS (Eckert & Ziegler BEBIG). To model the TB, the free and open-source packages FreeCAD and MeshLab were used. The reconstruction process is based on three inserted vitamin-A pellets together with the data provided by the free length. The implemented pre-planning procedure is as follows: 1) a T2 MRI acquisition is performed with the template in place with only the vaginal cylinder (no uterine tube or needles); 2) the CTV is drawn and the required needles are selected using a Java-based application developed in-house; and 3) a post-implant T2 MRI is performed. Results: The library procedure has so far been applied successfully in 25 patients; its use is illustrated here with clinical examples. The pre-planning procedure has so far been applied in 6 patients, with significant advantages: needle depth estimation, a priori optimization of needle positions and number, time savings, etc. Conclusion: The TB library and pre-planning techniques are feasible and very efficient, and their use is illustrated in this work.

  10. Adoption of Library 2.0 Functionalities by Academic Libraries and Users: A Knowledge Management Perspective

    Science.gov (United States)

    Kim, Yong-Mi; Abbas, June

    2010-01-01

    This study investigates the adoption of Library 2.0 functionalities by academic libraries and users through a knowledge management perspective. Based on 230 randomly selected academic library Web sites and 184 users, the authors found that RSS and blogs are widely adopted by academic libraries, while users widely utilized the bookmark function.…

  11. Beyond Traditional Literacy Instruction: Toward an Account-Based Literacy Training Curriculum in Libraries

    Science.gov (United States)

    Cirella, David

    2012-01-01

    A diverse group, account-based services include a wide variety of sites commonly used by patrons, including online shopping sites, social networks, photo- and video-sharing sites, banking and financial sites, government services, and cloud-based storage. Whether or not a piece of information is obtainable online must be considered when creating…

  12. SketchyDynamics: A Library for the Development of Physics Simulation Applications with Sketch-Based Interfaces

    Directory of Open Access Journals (Sweden)

    Abílio Costa

    2013-09-01

    Sketch-based interfaces provide a powerful, natural and intuitive way for users to interact with an application. By combining a sketch-based interface with a physically simulated environment, an application offers users the means to rapidly sketch a set of objects, as if they were doing it on a piece of paper, and see how those objects behave in a simulation. In this paper we present SketchyDynamics, a library intended to facilitate the creation of applications by rapidly providing them with a sketch-based interface and physics simulation capabilities. SketchyDynamics was designed to be versatile and customizable but also simple. In fact, a simple application in which the user draws objects that are immediately simulated, colliding with each other and reacting to the specified physical forces, can be created with only 3 lines of code. In order to validate SketchyDynamics' design choices, we also present some details of the usability evaluation conducted with a proof-of-concept prototype.
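    The abstract does not show SketchyDynamics' actual API. As a hedged, language-agnostic sketch (written here in Python, with all names hypothetical), the kind of simulation loop such a library drives for a sketched object falling under gravity might look like:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Body:
        y: float   # height above the ground (m)
        vy: float  # vertical velocity (m/s)

    def step(bodies, dt=1/60, g=-9.81, ground=0.0):
        """Advance one semi-implicit Euler step; bodies bounce off the ground."""
        for b in bodies:
            b.vy += g * dt
            b.y += b.vy * dt
            if b.y < ground:        # crude ground collision
                b.y = ground
                b.vy = -0.5 * b.vy  # restitution coefficient of 0.5

    # drop one sketched body from 1 m and simulate two seconds
    bodies = [Body(y=1.0, vy=0.0)]
    for _ in range(120):
        step(bodies)
    ```

    A real sketch-based application would add object recognition on the drawn strokes and inter-object collisions; this only illustrates the per-frame integration step.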

  13. BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.

    Science.gov (United States)

    Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph

    2015-02-21

    Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, only a few can be used for small-molecule enrichment analysis. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within metabolomics or other systems biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology, and is additionally available as a standalone library.
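    The over-representation test such tools run is commonly a hypergeometric tail test. As a hedged illustration (not BiNChE's actual implementation, and with a made-up toy population), it can be sketched as:

    ```python
    from math import comb

    def hypergeom_pvalue(k, n, K, N):
        """P(X >= k) when drawing n annotated entities from a population of N,
        of which K carry the ontology class: the usual over-representation test."""
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, min(n, K) + 1)) / comb(N, n)

    # toy example: 5 of 10 sampled molecules carry a class held by 20 of 100 overall
    p = hypergeom_pvalue(5, 10, 20, 100)
    ```

    A production tool would also correct the resulting p-values for multiple testing across the many ontology classes tested.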

  14. Covariant diagrams for one-loop matching

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Zhengkang [Michigan Center for Theoretical Physics (MCTP), University of Michigan,450 Church Street, Ann Arbor, MI 48109 (United States); Deutsches Elektronen-Synchrotron (DESY),Notkestraße 85, 22607 Hamburg (Germany)

    2017-05-30

    We present a diagrammatic formulation of recently revived covariant functional approaches to one-loop matching from an ultraviolet (UV) theory to a low-energy effective field theory. Various terms following from a covariant derivative expansion (CDE) are represented by diagrams which, unlike conventional Feynman diagrams, involve gauge-covariant quantities and are thus dubbed “covariant diagrams.” The use of covariant diagrams helps organize and simplify one-loop matching calculations, which we illustrate with examples. Of particular interest is the derivation of UV-model-independent universal results, which reduce matching calculations for specific UV models to applications of master formulas. We show how such a derivation can be done more concisely than in the previous literature, and discuss how additional structures that are not directly captured by existing universal results, including mixed heavy-light loops, open covariant derivatives, and mixed statistics, can easily be accounted for.

  15. Covariant diagrams for one-loop matching

    International Nuclear Information System (INIS)

    Zhang, Zhengkang

    2017-01-01

    We present a diagrammatic formulation of recently revived covariant functional approaches to one-loop matching from an ultraviolet (UV) theory to a low-energy effective field theory. Various terms following from a covariant derivative expansion (CDE) are represented by diagrams which, unlike conventional Feynman diagrams, involve gauge-covariant quantities and are thus dubbed “covariant diagrams.” The use of covariant diagrams helps organize and simplify one-loop matching calculations, which we illustrate with examples. Of particular interest is the derivation of UV-model-independent universal results, which reduce matching calculations for specific UV models to applications of master formulas. We show how such a derivation can be done more concisely than in the previous literature, and discuss how additional structures that are not directly captured by existing universal results, including mixed heavy-light loops, open covariant derivatives, and mixed statistics, can easily be accounted for.

  16. Improvement of covariance data for fast reactors

    International Nuclear Information System (INIS)

    Shibata, Keiichi; Hasegawa, Akira

    2000-02-01

    Over the past three years we estimated covariances of the JENDL-3.2 data for the nuclides and reactions needed to analyze fast-reactor cores, and produced covariance files. The present work was undertaken to re-examine those covariance files and to make some improvements. The covariances improved are those for the inelastic scattering cross section of 16O, the total cross section of 23Na, the fission cross section of 235U, the capture cross section of 238U, and the resolved resonance parameters of 238U. Moreover, the covariances of the 233U data were newly estimated in the present work. The covariances obtained were compiled in the ENDF-6 format. (author)
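    As a hedged illustration of how such covariance data are typically inspected (not the actual ENDF-6 processing, and with a hypothetical 3-group matrix), a group-wise covariance matrix can be split into per-group standard deviations and a correlation matrix:

    ```python
    import math

    def to_correlation(cov):
        """Split a group-wise covariance matrix into standard deviations and
        the correlation matrix, the form usually plotted from covariance files."""
        n = len(cov)
        sd = [math.sqrt(cov[i][i]) for i in range(n)]
        corr = [[cov[i][j] / (sd[i] * sd[j]) for j in range(n)]
                for i in range(n)]
        return sd, corr

    # hypothetical 3-group cross-section covariance (barns^2)
    cov = [[0.04, 0.01, 0.00],
           [0.01, 0.09, 0.03],
           [0.00, 0.03, 0.16]]
    sd, corr = to_correlation(cov)
    ```

    The diagonal of `corr` is 1 by construction, and the off-diagonal entries show how strongly uncertainties in different energy groups are correlated.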

  17. MARKETING LIBRARY SERVICES IN ACADEMIC LIBRARIES: A ...

    African Journals Online (AJOL)

    MARKETING LIBRARY SERVICES IN ACADEMIC LIBRARIES: A TOOL FOR SURVIVAL IN THE ... This article discusses the concept of marketing library and information services as an ...

  18. Scaffold architecture and pharmacophoric properties of natural products and trade drugs: application in the design of natural product-based combinatorial libraries.

    Science.gov (United States)

    Lee, M L; Schneider, G

    2001-01-01

    Natural products were analyzed to determine whether they contain appealing novel scaffold architectures for potential use in combinatorial chemistry. Ring systems were extracted and clustered on the basis of structural similarity. Several such potential scaffolds for combinatorial chemistry were identified that are not present in current trade drugs. For one of these scaffolds a virtual combinatorial library was generated. Pharmacophoric properties of natural products, trade drugs, and the virtual combinatorial library were assessed using a self-organizing map. Obviously, current trade drugs and natural products have several topological pharmacophore patterns in common. These features can be systematically explored with selected combinatorial libraries based on a combination of natural product-derived and synthetic molecular building blocks.

  19. Using Participatory and Service Design to Identify Emerging Needs and Perceptions of Library Services among Science and Engineering Researchers Based at a Satellite Campus

    Science.gov (United States)

    Johnson, Andrew; Kuglitsch, Rebecca; Bresnahan, Megan

    2015-01-01

    This study used participatory and service design methods to identify emerging research needs and existing perceptions of library services among science and engineering faculty, post-graduate, and graduate student researchers based at a satellite campus at the University of Colorado Boulder. These methods, and the results of the study, allowed us…

  20. Strategic marketing planning in library

    Directory of Open Access Journals (Sweden)

    Karmen Štular-Sotošek

    2000-01-01

    The article is based on the idea that every library can design instruments for creating events and managing the important resources of today's world, especially for managing change. This process can only be successful if libraries use adequate marketing methods. Strategic marketing planning starts with an analysis of the library's mission, its objectives, goals and corporate culture. By analysing the public environment, the competitive environment and the macro environment, libraries recognise their opportunities and threats. These analyses are the foundation for the library's definitions: What does the library represent? What does it aspire to? Which goals does it want to reach? What kind of marketing strategy will it use for its target market?