WorldWideScience

Sample records for large compound sets

  1. SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.

    Science.gov (United States)

    Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen

    2012-07-23

    We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.
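As a loose illustration of the matrix format (not the authors' algorithm), analogs that share a core scaffold can be pivoted into a SAR-table-like grid. All cores, R-groups, and activity values below are invented:

```python
# Toy sketch: organize structurally related analogs, pre-decomposed into
# (core scaffold, R-group) pairs, into a SAR-table-like matrix of
# activities. All data below are hypothetical.
from collections import defaultdict

records = [
    ("core-A", "R1=H",   6.2),
    ("core-A", "R1=Cl",  7.8),
    ("core-A", "R1=OMe", 5.1),
    ("core-B", "R1=H",   6.0),
    ("core-B", "R1=Cl",  7.5),
]

def sar_matrix(records):
    """Pivot (core, substituent, pActivity) triples into a nested dict:
    rows = cores, columns = substituents."""
    matrix = defaultdict(dict)
    for core, sub, act in records:
        matrix[core][sub] = act
    return dict(matrix)

m = sar_matrix(records)
# Matching substitutions on different cores line up in the same column,
# which is what makes the SAR pattern (here: Cl > H > OMe) easy to read off.
print(m["core-A"]["R1=Cl"])  # 7.8
```

The real method differs in that cores and substituents come from a systematic, non-R-group fragmentation, and matrices are ranked by SAR information content.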

  2. Data-Driven Derivation of an "Informer Compound Set" for Improved Selection of Active Compounds in High-Throughput Screening.

    Science.gov (United States)

    Paricharak, Shardul; IJzerman, Adriaan P; Jenkins, Jeremy L; Bender, Andreas; Nigsch, Florian

    2016-09-26

    Despite the usefulness of high-throughput screening (HTS) in drug discovery, for some systems low assay throughput or high screening cost can prohibit the screening of large numbers of compounds. In such cases, iterative cycles of screening involving active learning (AL) are employed, creating the need for smaller "informer sets" that can be routinely screened to build predictive models for selecting compounds from the screening collection for follow-up screens. Here, we present a data-driven derivation of an informer compound set with improved predictivity of active compounds in HTS, and we validate its benefit over randomly selected training sets on 46 PubChem assays comprising at least 300,000 compounds and covering a wide range of assay biology. The informer compound set showed improvements in BEDROC(α = 100), PRAUC, and ROCAUC values averaged over all assays of 0.024, 0.014, and 0.016, respectively, compared to randomly selected training sets, all with statistically significant paired t-test p-values. This approach led to a consistent improvement in hit rates in follow-up screens without compromising scaffold retrieval. The informer set is adjustable in size depending on the number of compounds one intends to screen, as performance gains are realized for sets with more than 3,000 compounds, and the set is therefore applicable to a variety of situations. Finally, our results indicate that random sampling may not adequately cover descriptor space, drawing attention to the importance of the composition of the training set for predicting actives.
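Of the metrics named above, ROCAUC is the simplest to state precisely. A minimal sketch using the rank-sum (Mann-Whitney) identity, on hypothetical scores and labels:

```python
# ROC AUC = P(a random active outranks a random inactive); ties count 0.5.
# Scores and labels below are invented for illustration.
def roc_auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
labels = [1,   1,   0,   1,   0,   0]
print(roc_auc(scores, labels))  # ≈ 0.889 (8 of 9 active/inactive pairs won)
```

BEDROC and PRAUC follow the same ranking-based pattern but weight early recognition more heavily; they are omitted here for brevity.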

  3. Strong compound-risk factors: efficient discovery through emerging patterns and contrast sets.

    Science.gov (United States)

    Li, Jinyan; Yang, Qiang

    2007-09-01

    Odds ratio (OR), relative risk (RR, also called risk ratio), and absolute risk reduction (ARR, also called risk difference) are biostatistical measures that are widely used for identifying significant risk factors in dichotomous groups of subjects. In the past, they have often been used to assess simple risk factors. In this paper, we introduce the concept of compound-risk factors to broaden the applicability of these statistical tests for assessing factor interplays. We observe that compound-risk factors with a high risk ratio or a big risk difference have a one-to-one correspondence to strong emerging patterns or strong contrast sets, two types of patterns that have been extensively studied in the data mining field. Such a relationship has been unknown to researchers in the past, and efficient algorithms for discovering strong compound-risk factors have been lacking. In this paper, we propose a theoretical framework and a new algorithm that unify the discovery of compound-risk factors that have a strong OR, risk ratio, or risk difference. Our method guarantees that all patterns meeting a certain test threshold can be efficiently discovered. Our contribution thus represents the first of its kind in linking risk ratios and ORs to pattern mining algorithms, making it possible to find compound-risk factors in large-scale data sets. In addition, we show that using compound-risk factors can improve classification accuracy in probabilistic learning algorithms on several disease data sets, because these compound-risk factors capture the interdependency between important data attributes.

  4. Analyzing large data sets acquired through telemetry from rats exposed to organophosphorous compounds: an EEG study.

    Science.gov (United States)

    de Araujo Furtado, Marcio; Zheng, Andy; Sedigh-Sarvestani, Madineh; Lumley, Lucille; Lichtenstein, Spencer; Yourick, Debra

    2009-10-30

    The organophosphorous compound soman is an acetylcholinesterase inhibitor that causes damage to the brain. Exposure to soman causes neuropathology as a result of prolonged and recurrent seizures. In the present study, long-term recordings of cortical EEG were used to develop an unbiased means to quantify measures of seizure activity in a large data set while excluding other signal types. Rats were implanted with telemetry transmitters and exposed to soman followed by treatment with therapeutics similar to those administered in the field after nerve agent exposure. EEG, activity, and temperature were recorded continuously for a minimum of 2 days pre-exposure and 15 days post-exposure. A set of automatic MATLAB algorithms has been developed to remove artifacts and measure the characteristics of long-term EEG recordings. The algorithms use short-time Fourier transforms to compute the power spectrum of the signal for 2-s intervals. The spectrum is then divided into the delta, theta, alpha, and beta frequency bands. A linear fit to the power spectrum is used to distinguish normal EEG activity from artifacts and high amplitude spike wave activity. Changes in time spent in seizure over a prolonged period are a powerful indicator of the effects of novel therapeutics against seizures. A graphical user interface has been created that simultaneously plots the raw EEG in the time domain, the power spectrum, and the wavelet transform. Motor activity and temperature are associated with EEG changes. The accuracy of this algorithm is also verified against visual inspection of video recordings up to 3 days after exposure.
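A minimal sketch of the band-power step described above, for one 2-s window of a synthetic trace. The sampling rate is an assumption, and a naive DFT stands in for MATLAB's short-time Fourier transform:

```python
# For one 2-s window, compute the power spectrum with a DFT and sum it into
# the delta/theta/alpha/beta bands. FS and the test signal are assumptions.
import cmath, math

FS = 128                      # Hz (assumed sampling rate)
N = 2 * FS                    # one 2-s window
sig = [math.sin(2 * math.pi * 6 * t / FS)          # 6 Hz (theta) component
       + 0.5 * math.sin(2 * math.pi * 20 * t / FS)  # 20 Hz (beta) component
       for t in range(N)]

def band_powers(x, fs):
    n = len(x)
    # One-sided power spectrum via a naive O(n^2) DFT (fine for a sketch).
    spec = [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))**2 for k in range(n // 2)]
    hz = fs / n               # frequency resolution of the window
    bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return {name: sum(spec[int(lo / hz):int(hi / hz)])
            for name, (lo, hi) in bands.items()}

p = band_powers(sig, FS)
# The 6 Hz component dominates, so theta power exceeds all other bands.
```

The linear-fit artifact test in the abstract would then operate on `spec` (log-power vs. log-frequency), which is omitted here.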

  5. Large Data Set Mining

    NARCIS (Netherlands)

    Leemans, I.B.; Broomhall, Susan

    2017-01-01

    Digital emotion research has yet to make history. Until now large data set mining has not been a very active field of research in early modern emotion studies. This is indeed surprising since first, the early modern field has such rich, copyright-free, digitized data sets and second, emotion studies

  6. mmpdb: An Open-Source Matched Molecular Pair Platform for Large Multiproperty Data Sets.

    Science.gov (United States)

    Dalke, Andrew; Hert, Jérôme; Kramer, Christian

    2018-05-29

    Matched molecular pair analysis (MMPA) enables the automated and systematic compilation of medicinal chemistry rules from compound/property data sets. Here we present mmpdb, an open-source matched molecular pair (MMP) platform to create, compile, store, retrieve, and use MMP rules. mmpdb is suitable for the large data sets typically found in pharmaceutical and agrochemical companies and provides new algorithms for fragment canonicalization and stereochemistry handling. The platform is written in Python and based on the RDKit toolkit. It is freely available from https://github.com/rdkit/mmpdb.

  7. Large positive magnetoresistance in intermetallic compound NdCo2Si2

    Science.gov (United States)

    Roy Chowdhury, R.; Dhara, S.; Das, I.; Bandyopadhyay, B.; Rawat, R.

    2018-04-01

    The magnetic, magneto-transport, and magnetocaloric properties of the antiferromagnetic intermetallic compound NdCo2Si2 (TN = 32 K) have been studied. The compound yields a positive magnetoresistance (MR) of about 123% at ∼5 K in an 8 T magnetic field. The MR value is significantly large vis-à-vis earlier reports of large MR in intermetallic compounds, and is possibly associated with changes in the magnetic structure of the compound. The large MR value can be explained in terms of field-induced pseudo-gaps on the Fermi surface.

  8. Combining electronic structure and many-body theory with large databases: A method for predicting the nature of 4f states in Ce compounds

    Science.gov (United States)

    Herper, H. C.; Ahmed, T.; Wills, J. M.; Di Marco, I.; Björkman, T.; Iuşan, D.; Balatsky, A. V.; Eriksson, O.

    2017-08-01

    Recent progress in materials informatics has opened up the possibility of a new approach to accessing properties of materials, in which one assays the aggregate properties of a large set of materials within the same class in addition to a detailed investigation of each compound in that class. Here we present a large-scale investigation of electronic properties and correlated magnetism in Ce-based compounds: we systematically study the electronic structure and 4f-hybridization function of a large body of Ce compounds with the goal of elucidating the nature of the 4f states and their interrelation with the measured Kondo energy in these compounds. The hybridization function has been analyzed for more than 350 data sets (part of the IMS database) of cubic Ce compounds using electronic structure theory that relies on a full-potential approach. We demonstrate that the strength of the hybridization function, evaluated in this way, allows us to draw precise conclusions about the degree of localization of the 4f states in these compounds. The theoretical results are entirely consistent with all experimental information relevant to the degree of 4f localization for all investigated materials. Furthermore, a more detailed analysis of the electronic structure and the hybridization function allows us to make precise statements about Kondo correlations in these systems. The calculated hybridization functions, together with the corresponding density of states, reproduce the expected exponential behavior of the observed Kondo temperatures and prove a consistent trend in real materials. This trend allows us to predict which systems may be correctly identified as Kondo systems. A strong anticorrelation between the size of the hybridization function and the volume of the systems has been observed. The information entropy for this set of systems is

  9. Metastrategies in large-scale bargaining settings

    NARCIS (Netherlands)

    Hennes, D.; Jong, S. de; Tuyls, K.; Gal, Y.

    2015-01-01

    This article presents novel methods for representing and analyzing a special class of multiagent bargaining settings that feature multiple players, large action spaces, and a relationship among players' goals, tasks, and resources. We show how to reduce these interactions to a set of bilateral

  10. Development and experimental test of support vector machines virtual screening method for searching Src inhibitors from large compound libraries

    Directory of Open Access Journals (Sweden)

    Han, Bucong

    2012-11-01

    Background: Src plays various roles in tumour progression, invasion, metastasis, angiogenesis, and survival. It is one of the multiple targets of multi-target kinase inhibitors in clinical use and trials for the treatment of leukemia and other cancers. These successes and the appearance of drug resistance in some patients have raised significant interest and efforts in discovering new Src inhibitors. Various in-silico methods have been used in some of these efforts. It is desirable to explore additional in-silico methods, particularly those capable of searching large compound libraries at high yields and reduced false-hit rates. Results: We evaluated support vector machines (SVM) as virtual screening tools for searching Src inhibitors from large compound libraries. SVM trained and tested on 1,703 inhibitors and 63,318 putative non-inhibitors correctly identified 93.53%-95.01% of inhibitors and 99.81%-99.90% of non-inhibitors in 5-fold cross-validation studies. SVM trained on the 1,703 inhibitors reported before 2011 and the 63,318 putative non-inhibitors correctly identified 70.45% of the 44 inhibitors reported since 2011, and predicted as inhibitors 44,843 (0.33%) of 13.56M PubChem compounds, 1,496 (0.89%) of 168K MDDR compounds, and 719 (7.73%) of 9,305 MDDR compounds similar to the known inhibitors. Conclusions: SVM showed comparable yield and reduced false-hit rates in searching large compound libraries compared to the similarity-based and other machine-learning VS methods developed from the same set of training compounds and molecular descriptors. We tested three virtual hits of the same novel scaffold from in-house chemical libraries not reported as Src inhibitors, one of which showed moderate activity. SVM may potentially be explored for searching Src inhibitors from large compound libraries at low false-hit rates.

  11. Development and experimental test of support vector machines virtual screening method for searching Src inhibitors from large compound libraries.

    Science.gov (United States)

    Han, Bucong; Ma, Xiaohua; Zhao, Ruiying; Zhang, Jingxian; Wei, Xiaona; Liu, Xianghui; Liu, Xin; Zhang, Cunlong; Tan, Chunyan; Jiang, Yuyang; Chen, Yuzong

    2012-11-23

    Src plays various roles in tumour progression, invasion, metastasis, angiogenesis, and survival. It is one of the multiple targets of multi-target kinase inhibitors in clinical use and trials for the treatment of leukemia and other cancers. These successes and the appearance of drug resistance in some patients have raised significant interest and efforts in discovering new Src inhibitors. Various in-silico methods have been used in some of these efforts. It is desirable to explore additional in-silico methods, particularly those capable of searching large compound libraries at high yields and reduced false-hit rates. We evaluated support vector machines (SVM) as virtual screening tools for searching Src inhibitors from large compound libraries. SVM trained and tested on 1,703 inhibitors and 63,318 putative non-inhibitors correctly identified 93.53%-95.01% of inhibitors and 99.81%-99.90% of non-inhibitors in 5-fold cross-validation studies. SVM trained on the 1,703 inhibitors reported before 2011 and the 63,318 putative non-inhibitors correctly identified 70.45% of the 44 inhibitors reported since 2011, and predicted as inhibitors 44,843 (0.33%) of 13.56M PubChem compounds, 1,496 (0.89%) of 168K MDDR compounds, and 719 (7.73%) of 9,305 MDDR compounds similar to the known inhibitors. SVM showed comparable yield and reduced false-hit rates in searching large compound libraries compared to the similarity-based and other machine-learning VS methods developed from the same set of training compounds and molecular descriptors. We tested three virtual hits of the same novel scaffold from in-house chemical libraries not reported as Src inhibitors, one of which showed moderate activity. SVM may potentially be explored for searching Src inhibitors from large compound libraries at low false-hit rates.
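For context, the similarity-based baseline this abstract benchmarks against can be sketched in a few lines: rank the library by maximum Tanimoto similarity to any known active. The fingerprints below are invented bit sets, not real molecular descriptors:

```python
# Minimal similarity-based virtual screen: max-Tanimoto ranking against a
# set of known actives. All fingerprints are hypothetical bit sets.
def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity of two fingerprint bit sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

actives = [{1, 4, 9, 17}, {2, 4, 9, 33}]
library = {
    "cmpd-1": {1, 4, 9, 17, 21},   # close analog of the first active
    "cmpd-2": {5, 7, 40},          # unrelated scaffold
    "cmpd-3": {2, 4, 9},           # close analog of the second active
}

ranked = sorted(library,
                key=lambda cid: max(tanimoto(library[cid], a) for a in actives),
                reverse=True)
print(ranked)  # ['cmpd-1', 'cmpd-3', 'cmpd-2']
```

An SVM screen replaces the max-similarity score with a learned decision function over the same fingerprint space, which is what gives the reduced false-hit rates reported above.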

  12. Using SETS to find minimal cut sets in large fault trees

    International Nuclear Information System (INIS)

    Worrell, R.B.; Stack, D.W.

    1978-01-01

    An efficient algebraic algorithm for finding the minimal cut sets for a large fault tree was defined and a new procedure which implements the algorithm was added to the Set Equation Transformation System (SETS). The algorithm includes the identification and separate processing of independent subtrees, the coalescing of consecutive gates of the same kind, the creation of additional independent subtrees, and the derivation of the fault tree stem equation in stages. The computer time required to determine the minimal cut sets using these techniques is shown to be substantially less than the computer time required to determine the minimal cut sets when these techniques are not employed. It is shown for a given example that the execution time required to determine the minimal cut sets can be reduced from 7,686 seconds to 7 seconds when all of these techniques are employed.
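The core algebra (expansion of AND/OR gates into cut sets, then minimization by absorption) can be sketched as follows. The fault tree is hypothetical, and the independent-subtree and gate-coalescing optimizations that SETS adds on top are omitted:

```python
# Expand a small fault tree into cut sets, then minimize by absorption
# (drop any cut set that is a superset of another). Tree is hypothetical.
from itertools import product

def cut_sets(node):
    kind = node[0]
    if kind == "basic":
        return [frozenset([node[1]])]
    child_sets = [cut_sets(c) for c in node[1]]
    if kind == "OR":      # OR: the union of the children's cut-set lists
        return [cs for sets in child_sets for cs in sets]
    # AND: one cut set per combination of the children's cut sets
    # (set union also applies the idempotent law a·a = a)
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimize(sets):
    """Absorption: keep only cut sets with no proper subset in the list."""
    return sorted({s for s in sets if not any(t < s for t in sets)},
                  key=sorted)

B = lambda name: ("basic", name)
tree = ("AND", [("OR", [B("a"), B("b")]),
                ("OR", [B("a"), B("c")])])
print(minimize(cut_sets(tree)))  # minimal cut sets: {a} and {b, c}
```

The naive expansion is exponential in tree depth, which is exactly why the staged, subtree-partitioned approach described above matters for large trees.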

  13. Looking at large data sets using binned data plots

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large-data-set problems and to contribute simple graphical methods that address some of those problems. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large sample sizes through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer guiding diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
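The binning idea itself is compact: aggregate points into counts on a regular grid so that structure survives even when n is far too large to scatter-plot. A minimal sketch on invented 2-D points:

```python
# Rectangular 2-D binning: map each point to a grid cell and count
# occupants per cell. Grid parameters and data are hypothetical.
from collections import Counter

def bin2d(points, x0, y0, step):
    """Return Counter mapping (col, row) cell indices to point counts."""
    return Counter((int((x - x0) // step), int((y - y0) // step))
                   for x, y in points)

points = [(0.1, 0.2), (0.3, 0.1), (1.4, 1.2), (1.6, 1.9), (1.8, 1.1)]
counts = bin2d(points, x0=0.0, y0=0.0, step=1.0)
print(counts)  # two points fall in cell (0, 0), three in cell (1, 1)
```

A plot then encodes each cell's count by symbol size or color, reducing millions of points to a fixed number of cells; the report's hexagonal bins follow the same principle with a different tessellation.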

  14. Fate modelling of chemical compounds with incomplete data sets

    DEFF Research Database (Denmark)

    Birkved, Morten; Heijungs, Reinout

    2011-01-01

    Impact assessment of chemical compounds in Life Cycle Impact Assessment (LCIA) and Environmental Risk Assessment (ERA) requires a vast amount of data on the properties of the chemical compounds being assessed. These data are used in multi-media fate and exposure models to calculate risk levels … in an approximate way. The idea is that not all data needed in a multi-media fate and exposure model are completely independent and equally important, but that there are physical-chemical and biological relationships between sets of chemical properties. A statistical model is constructed to underpin this assumption … and other indicators. ERA typically addresses one specific chemical, but in an LCIA the number of chemicals encountered may be quite high, up to hundreds or thousands. This study explores the development of meta-models, which are supposed to reflect the "true" multi-media fate and exposure model …

  15. Reducing Information Overload in Large Seismic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Hampton, Jeffery W.; Young, Christopher J.; Merchant, Bion J.; Carr, Dorthe B.; Aguilar-Chang, Julio

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research
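A toy version of the single-linkage agglomeration underlying such a dendrogram tool, with a plain absolute difference on invented 1-D features standing in for waveform correlation distance:

```python
# Single-linkage agglomerative clustering: repeatedly merge the two
# closest clusters until the smallest inter-cluster gap exceeds a cutoff.
# Distance here is abs(x - y) on hypothetical 1-D features, not a real
# waveform correlation measure.
def single_linkage(points, stop_dist):
    clusters = [[p] for p in points]
    while len(clusters) > 1:
        # Single linkage: cluster distance = closest pair of members.
        dist = lambda i, j: min(abs(x - y)
                                for x in clusters[i] for y in clusters[j])
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(*ij))
        if dist(i, j) > stop_dist:   # cutting the dendrogram at this height
            break
        clusters[i] += clusters.pop(j)
    return sorted(sorted(c) for c in clusters)

print(single_linkage([0.0, 0.2, 0.3, 5.0, 5.1], stop_dist=1.0))
# [[0.0, 0.2, 0.3], [5.0, 5.1]]
```

The merge order recorded along the way is exactly the dendrogram; cutting it at different heights yields the candidate cluster sets the researcher then winnows.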

  16. The behavior of intermetallic compounds at large plastic strains

    International Nuclear Information System (INIS)

    Gray, G.T.; Embury, J.D.

    1993-01-01

    This paper contains a summary of a broad study of intermetallics which includes the following materials: Ni3Al, Ti-48Al-1V, Ti-24Al-11Nb, Ti-48Al-2Cr-2Nb, and Ti-24.5Al-10.5Nb-1.5Mo. Much effort has been devoted to the study of ordered materials at modest plastic strains and the problem of premature failure. However, by utilizing stress states other than simple tension it is possible to study the deformation of intermetallic compounds up to large plastic strains and to consider the behavior of these materials in the regime where stresses approach the theoretical stress. The current work outlines studies of the work hardening rate of a number of titanium- and nickel-based intermetallic compounds deformed in compression. Attention is given to the structural basis of the sustained work hardening. The large-strain plasticity of these materials is summarized in a series of diagrams. Fracture of these materials in compression occurs via catastrophic shear at stresses of the order of E/80 (where E is the elastic modulus).

  17. Spatial occupancy models for large data sets

    Science.gov (United States)

    Johnson, Devin S.; Conn, Paul B.; Hooten, Mevin B.; Ray, Justina C.; Pond, Bruce A.

    2013-01-01

    Since its development, occupancy modeling has become a popular and useful tool for ecologists wishing to learn about the dynamics of species occurrence over time and space. Such models require presence–absence data to be collected at spatially indexed survey units. However, only recently have researchers recognized the need to correct for spatially induced overdispersion by explicitly accounting for spatial autocorrelation in occupancy probability. Previous efforts to incorporate such autocorrelation have largely focused on logit-normal formulations for occupancy, with spatial autocorrelation induced by a random effect within a hierarchical modeling framework. Although useful, computational time generally limits such an approach to relatively small data sets, and there are often problems with algorithm instability, yielding unsatisfactory results. Further, recent research has revealed a hidden form of multicollinearity in such applications, which may lead to parameter bias if not explicitly addressed. Combining several techniques, we present a unifying hierarchical spatial occupancy model specification that is particularly effective over large spatial extents. This approach employs a probit mixture framework for occupancy and can easily accommodate a reduced-dimensional spatial process to resolve issues with multicollinearity and spatial confounding while improving algorithm convergence. Using open-source software, we demonstrate this new model specification using a case study involving occupancy of caribou (Rangifer tarandus) over a set of 1080 survey units spanning a large contiguous region (108,000 km²) in northern Ontario, Canada. Overall, the combination of a more efficient specification and open-source software allows for a facile and stable implementation of spatial occupancy models for large data sets.

  18. Conceptual Design and Performance Analysis for a Large Civil Compound Helicopter

    Science.gov (United States)

    Russell, Carl; Johnson, Wayne

    2012-01-01

    A conceptual design study of a large civil compound helicopter is presented. The objective is to determine how a compound helicopter performs when compared to both a conventional helicopter and a tiltrotor using a design mission that is shorter than optimal for a tiltrotor and longer than optimal for a helicopter. The designs are generated and analyzed using conceptual design software and are further evaluated with a comprehensive rotorcraft analysis code. Multiple metrics are used to determine the suitability of each design for the given mission. Plots of various trade studies and parameter sweeps as well as comprehensive analysis results are presented. The results suggest that the compound helicopter examined for this study would not be competitive with a tiltrotor or conventional helicopter, but multiple possibilities are identified for improving the performance of the compound helicopter in future research.

  19. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. A memory-shared architecture is used to construct the similarity matrix, and a distributed system is taken for the affinity propagation algorithm because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
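The memory-shared stage can be sketched with worker threads filling rows of one shared similarity matrix. The similarity function (negative squared distance, a conventional choice for affinity propagation) and the data are placeholders, not the paper's:

```python
# Shared-memory parallel construction of a similarity matrix: each task
# fills one row of a matrix all threads can see. Data are hypothetical.
from concurrent.futures import ThreadPoolExecutor

data = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]

def neg_sq_dist(a, b):
    """Negative squared Euclidean distance, a common AP similarity."""
    return -sum((x - y) ** 2 for x, y in zip(a, b))

n = len(data)
S = [[0.0] * n for _ in range(n)]    # shared matrix, one row per task

def fill_row(i):
    for j in range(n):
        S[i][j] = neg_sq_dist(data[i], data[j])

with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(fill_row, range(n)))

print(S[0][1])  # ≈ -0.02: points 0 and 1 are near-duplicates
```

Row-wise partitioning means no two tasks write the same cell, so no locking is needed. In CPython, real speedups additionally require the distance kernel to release the GIL (e.g. a NumPy or C implementation), a detail omitted from this sketch.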

  20. Very large virtual compound spaces: construction, storage and utility in drug discovery.

    Science.gov (United States)

    Peng, Zhengwei

    2013-09-01

    Recent activities in the construction, storage, and exploration of very large virtual compound spaces are reviewed in this report. As expected, the systematic exploration of compound spaces at the highest resolution (individual atoms and bonds) is intrinsically intractable. By contrast, by staying within a finite number of reactions and a finite number of reactants or fragments, several virtual compound spaces have been constructed in a combinatorial fashion, with sizes ranging from 10^11 to 10^20 compounds. Multiple search methods have been developed to perform searches (e.g. similarity, exact, and substructure) into those compound spaces without the need for full enumeration. The up-front investment in synthetic feasibility during the construction of some of those virtual compound spaces enables wider adoption by medicinal chemists to design and synthesize important compounds for drug discovery. Recent activities in the area of exploring virtual compound spaces via the evolutionary approach based on genetic algorithms also suggest a positive shift of focus from method development to workflow, integration, and ease of use, all of which are required for this approach to be widely adopted by medicinal chemists.
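The counting argument behind such spaces is simple: a combinatorial space defined by reactions over reactant pools can be sized, and sampled, without ever enumerating it. A sketch with invented reaction and reactant counts:

```python
# Size and sample a reaction-based virtual space without enumeration.
# Each "reaction" is abstracted to a list of reactant-pool sizes; all
# counts are invented for illustration.
import math, random

reactions = [[1_000, 2_000],        # a 2-component reaction
             [5_000, 3_000],        # another 2-component reaction
             [500, 400, 300]]       # a 3-component reaction

# Total products = sum over reactions of the product of pool sizes.
space_size = sum(math.prod(pools) for pools in reactions)
print(f"{space_size:.2e}")          # ~7.7e+07 products, never enumerated

def sample_product(rng):
    """Draw one virtual compound uniformly: pick a reaction weighted by
    its sub-space size, then one reactant index from each pool."""
    weights = [math.prod(p) for p in reactions]
    r = rng.choices(range(len(reactions)), weights=weights)[0]
    return r, tuple(rng.randrange(n) for n in reactions[r])

print(sample_product(random.Random(0)))
```

Similarity and substructure searches into such spaces work on the same principle, pruning at the reaction/fragment level instead of touching individual products.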

  1. Large reversible magnetostrictive effect of MnCoSi-based compounds prepared by high-magnetic-field solidification

    Science.gov (United States)

    Hu, Q. B.; Hu, Y.; Zhang, S.; Tang, W.; He, X. J.; Li, Z.; Cao, Q. Q.; Wang, D. H.; Du, Y. W.

    2018-01-01

    The MnCoSi compound is a potential magnetostriction material, since a magnetic field can drive a metamagnetic transition in it from an antiferromagnetic phase to a high-magnetization phase, which is accompanied by a large lattice distortion. However, a large driving magnetic field, magnetic hysteresis, and poor mechanical properties seriously hinder its application for magnetostriction. By substituting Fe for Mn and introducing vacancies on the Mn sites, textured and dense Mn0.97Fe0.03CoSi and Mn0.88CoSi compounds are prepared through a high-magnetic-field solidification approach. As a result, large room-temperature and reversible magnetostriction effects are observed in these compounds at a low magnetic field. The origin of this large magnetostriction effect and potential applications are discussed.

  2. Large magnetocaloric effect of GdNiAl2 compound

    International Nuclear Information System (INIS)

    Dembele, S.N.; Ma, Z.; Shang, Y.F.; Fu, H.; Balfour, E.A.; Hadimani, R.L.; Jiles, D.C.; Teng, B.H.; Luo, Y.

    2015-01-01

    This paper presents the structure, magnetic properties, and magnetocaloric effect of the polycrystalline compound GdNiAl2. Powder X-ray diffraction (XRD) measurement and Rietveld refinement revealed that the GdNiAl2 alloy has a CuMgAl2-type phase structure with about 1 wt% GdNi2Al3 secondary phase. Magnetic measurements suggest that the compound is ferromagnetic and undergoes a second-order phase transition near 28 K. The maximum value of the magnetic entropy change reaches 16.0 J/kg K for an applied magnetic field change of 0-50 kOe, and the relative cooling power is 6.4×10² J/kg. It is a promising candidate magnetocaloric material working near the liquid hydrogen temperature (~20 K), exhibiting large relative cooling power. - Highlights: • Preferred orientation along the [010] axis was found in the GdNiAl2 compound. • The maximum magnetic entropy change and the RCP are 16.0 J/kg K and 640 J/kg, respectively, for ΔH = 50 kOe. • Relatively low rare-earth content in GdNiAl2 compared with other candidates.
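Entropy-change values such as the 16.0 J/kg K above are typically extracted from magnetization isotherms via the Maxwell relation ΔS_M(T, H) = ∫₀^H (∂M/∂T) dH'. A numerical sketch on synthetic data (not GdNiAl2 measurements), using finite differences and the trapezoidal rule:

```python
# Maxwell-relation estimate of the magnetic entropy change between two
# temperatures from M(T, H) isotherms. All numbers below are synthetic.
def delta_S(T, H, M):
    """M[i][j] = magnetization at T[i], H[j]; returns the entropy change
    between T[0] and T[1], integrated over the field grid H."""
    dMdT = [(M[1][j] - M[0][j]) / (T[1] - T[0]) for j in range(len(H))]
    # Trapezoidal integration of dM/dT over the field.
    return sum(0.5 * (dMdT[j] + dMdT[j + 1]) * (H[j + 1] - H[j])
               for j in range(len(H) - 1))

T = [26.0, 30.0]                   # K, bracketing the transition
H = [0.0, 10.0, 30.0, 50.0]        # field grid, kOe
M = [[0.0, 90.0, 110.0, 120.0],    # magnetization at 26 K (arb. units)
     [0.0, 60.0,  95.0, 110.0]]    # magnetization at 30 K
print(delta_S(T, H, M))            # negative: entropy drops on magnetizing
```

A full −ΔS_M(T) curve repeats this for many adjacent temperature pairs; unit bookkeeping (emu/g, kOe to J/kg K) is omitted here.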

  3. Large Sets in Boolean and Non-Boolean Groups and Topology

    Directory of Open Access Journals (Sweden)

    Ol’ga V. Sipacheva

    2017-10-01

    Various notions of large sets in groups, including the classical notions of thick, syndetic, and piecewise syndetic sets and the new notion of vast sets in groups, are studied with emphasis on the interplay between such sets in Boolean groups. Natural topologies closely related to vast sets are considered; as a byproduct, interesting relations between vast sets and ultrafilters are revealed.

  4. Observation of large magnetocaloric effect in equiatomic binary compound ErZn

    Directory of Open Access Journals (Sweden)

    Li, Lingwei

    2017-05-01

    The magnetism, magnetocaloric effect, and universal behaviour of the rare-earth zinc binary compound ErZn have been studied. The ErZn compound undergoes a second-order paramagnetic (PM) to ferromagnetic (FM) transition at a Curie temperature of TC ∼ 20 K. The ErZn compound exhibits a large reversible magnetocaloric effect (MCE) around its TC. The rescaled magnetic entropy change curves overlap with each other under various magnetic field changes, further confirming that ErZn undergoes a second-order phase transition. For a magnetic field change of 0-7 T, the maximum values of the magnetic entropy change (−ΔS_M^max), relative cooling power (RCP), and refrigerant capacity (RC) for ErZn are 23.3 J/kg K, 581 J/kg, and 437 J/kg, respectively.

  5. Multidimensional scaling for large genomic data sets

    Directory of Open Access Journals (Sweden)

    Lu Henry

    2008-04-01

    Full Text Available Abstract Background Multi-dimensional scaling (MDS) aims to represent high-dimensional data in a low-dimensional space while preserving the similarities between data points. This reduction in dimensionality is crucial for analyzing and revealing the genuine structure hidden in the data. For noisy data, dimension reduction can effectively reduce the effect of noise on the embedded structure. For large data sets, dimension reduction can effectively reduce information-retrieval complexity. Thus, MDS techniques are used in many applications of data mining and gene network research. However, although a number of studies have applied MDS techniques to genomics research, the number of analyzed data points was restricted by the high computational complexity of MDS. In general, a non-metric MDS method is faster than a metric MDS, but it does not preserve the true relationships. The computational complexity of most metric MDS methods is over O(N²), so it is difficult to process a data set with a large number of genes N, such as whole-genome microarray data. Results We developed a new rapid metric MDS method with a low computational complexity, making metric MDS applicable for large data sets. Computer simulation showed that the new method of split-and-combine MDS (SC-MDS) is fast, accurate and efficient. Our empirical studies using microarray data on the yeast cell cycle showed that the performance of K-means in the reduced dimensional space is similar to or slightly better than that of K-means in the original space, and the clustering results are obtained about three times faster. Our clustering results using SC-MDS are more stable than those in the original space. Hence, the proposed SC-MDS is useful for analyzing whole-genome data. Conclusion Our new method reduces the computational complexity from O(N³) to O(N) when the dimension of the feature space is far less than the number of genes N, and it successfully
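
    For context, the classical metric MDS that SC-MDS accelerates embeds points by eigendecomposition of the double-centred squared-distance matrix, at the O(N³) cost noted above. A self-contained sketch of that baseline (not the SC-MDS algorithm itself):

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (metric) MDS: embed points given a pairwise distance
    matrix D into k dimensions via eigendecomposition of the
    double-centred squared-distance matrix. The O(N^3) eigendecomposition
    is the bottleneck that split-and-combine approaches avoid."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                # Gram (inner-product) matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]              # pick the top-k components
    scale = np.sqrt(np.maximum(w[idx], 0.0))
    return V[:, idx] * scale                   # n x k embedding
```

For exact Euclidean input distances, the embedding reproduces all pairwise distances up to rotation and reflection.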

  6. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    Science.gov (United States)

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

    Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on their structure-derived structural and physicochemical descriptors. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.

  7. Raman scattering in transition metal compounds: Titanium and compounds of titanium

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, J.; Ederer, D.L.; Shu, T. [Tulane Univ., New Orleans, LA (United States)] [and others]

    1997-04-01

    The transition metal compounds form a very interesting and important set of materials. The diversity arises from the many states of ionization the transition elements may take when forming compounds. This variety provides ample opportunity for a large class of materials to have a vast range of electronic and magnetic properties. The x-ray spectroscopy of the transition elements is especially interesting because they have unfilled d bands at the bottom of the conduction band with atomic-like structure. This group embarked on a systematic study of transition metal sulfides and oxides. As an example of the type of spectra observed in some of these compounds, they have chosen to showcase the L{sub II, III} emission and Raman scattering in some titanium compounds obtained by photon excitation.

  8. Operational Aspects of Dealing with the Large BaBar Data Set

    Energy Technology Data Exchange (ETDEWEB)

    Trunov, Artem G

    2003-06-13

    To date, the BaBar experiment has stored over 0.7 PB of data in an Objectivity/DB database. Approximately half of this data set comprises simulated data, of which more than 70% has been produced at more than 20 collaborating institutes outside of SLAC. Managing such a large data set and providing access to physicists in a timely manner is a challenging and complex problem. We describe the operational aspects of managing such a large distributed data set, as well as of importing and exporting data from geographically spread BaBar collaborators. We also describe problems common to dealing with such large data sets.

  9. Effect of training data size and noise level on support vector machines virtual screening of genotoxic compounds from large compound libraries.

    Science.gov (United States)

    Kumar, Pankaj; Ma, Xiaohua; Liu, Xianghui; Jia, Jia; Bucong, Han; Xue, Ying; Li, Ze Rong; Yang, Sheng Yong; Wei, Yu Quan; Chen, Yu Zong

    2011-05-01

    Various in vitro and in silico methods have been used for drug genotoxicity tests, which show limited genotoxicity (GT+) and non-genotoxicity (GT-) identification rates. New methods and combinatorial approaches have been explored for enhanced collective identification capability. The rates of in silico methods may be further improved by significantly diversified training data enriched by the large number of recently reported GT+ and GT- compounds, but a major concern is the increased noise levels arising from high false-positive rates of in vitro data. In this work, we evaluated the effect of training data size and noise level on the performance of the support vector machine (SVM) method, which is known to tolerate high noise levels in training data. Two SVMs of different diversity/noise levels were developed and tested. H-SVM, trained with higher-diversity, higher-noise data (GT+ in any in vivo or in vitro test), outperforms L-SVM, trained with lower-noise, lower-diversity data (GT+ in in vivo or Ames test only). H-SVM trained on 4,763 GT+ compounds reported before 2008 and 8,232 GT- compounds excluding clinical trial drugs correctly identified 81.6% of the 38 GT+ compounds reported since 2008, predicted 83.1% of the 2,008 clinical trial drugs as GT-, and flagged 23.96% of 168 K MDDR and 27.23% of 17.86 M PubChem compounds as GT+. These are comparable to the 43.1-51.9% GT+ and 75-93% GT- rates of existing in silico methods, the 58.8% GT+ and 79% GT- rates of the Ames method, and the estimated percentages of 23% in vivo and 31-33% in vitro GT+ compounds in the "universe of chemicals". There is a substantial level of agreement between the H-SVM and L-SVM predicted GT+ and GT- MDDR compounds and the predictions from TOPKAT. SVM showed good potential in identifying GT+ compounds from large compound libraries based on higher-diversity, higher-noise training data.

  10. Large magnetocaloric effect of GdNiAl{sub 2} compound

    Energy Technology Data Exchange (ETDEWEB)

    Dembele, S.N.; Ma, Z.; Shang, Y.F. [School of Physical Electronics, University of Electronic Science and Technology of China, Chengdu 610054 (China); Fu, H., E-mail: fuhao@uestc.edu.cn [School of Physical Electronics, University of Electronic Science and Technology of China, Chengdu 610054 (China); Balfour, E.A. [School of Physical Electronics, University of Electronic Science and Technology of China, Chengdu 610054 (China); Hadimani, R.L.; Jiles, D.C. [Department of Electrical and Computer Engineering, Iowa State University, Ames, IA 50011 (United States); Ames Laboratory, US Department of Energy, Ames, IA 50011 (United States); Teng, B.H.; Luo, Y. [School of Physical Electronics, University of Electronic Science and Technology of China, Chengdu 610054 (China)

    2015-10-01

    This paper presents the structure, magnetic properties, and magnetocaloric effect of the polycrystalline compound GdNiAl{sub 2}. Powder X-ray diffraction (XRD) measurement and Rietveld refinement revealed that the GdNiAl{sub 2} alloy has a CuMgAl{sub 2}-type structure with about 1 wt% GdNi{sub 2}Al{sub 3} secondary phase. Magnetic measurements suggest that the compound is ferromagnetic and undergoes a second-order phase transition near 28 K. The maximum value of the magnetic entropy change reaches 16.0 J/kg K for an applied magnetic field change of 0–50 kOe, and the relative cooling power is 6.4×10{sup 2} J/kg. It is a promising candidate magnetocaloric material for working near the liquid hydrogen temperature (~20 K), exhibiting large relative cooling power. - Highlights: • Preferred orientation with axis of [010] was found in the GdNiAl{sub 2} compound. • The ΔS{sub Mmax} and the RCP are 16.0 J/kg K and 640 J/kg, respectively, for ΔH=50 kOe. • Relatively low rare-earth content in GdNiAl{sub 2} compared with other candidates.

  11. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    Science.gov (United States)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate the large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the MiniWall to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser-based, so it runs on any computer or device that can display a web page. It can also be used remotely and securely with web server software such as the Apache HTTP server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from them. This paper describes the MiniWall software and demonstrates how its different features are used to review and assimilate large data sets.

  12. Iterative dictionary construction for compression of large DNA data sets.

    Science.gov (United States)

    Kuruppu, Shanika; Beresford-Smith, Bryan; Conway, Thomas; Zobel, Justin

    2012-01-01

    Genomic repositories increasingly include individual as well as reference sequences, which tend to share long identical and near-identical strings of nucleotides. However, the sequential processing used by most compression algorithms, and the volumes of data involved, mean that these long-range repetitions are not detected. An order-insensitive, disk-based dictionary construction method can detect this repeated content and use it to compress collections of sequences. We explore a dictionary construction method that improves repeat identification in large DNA data sets. Our adaptation, COMRAD, of an existing disk-based method identifies exact repeated content in collections of sequences with similarities within and across the set of input sequences. COMRAD compresses the data over multiple passes, which is an expensive process, but allows COMRAD to compress large data sets within reasonable time and space. COMRAD allows for random access to individual sequences and subsequences without decompressing the whole data set. COMRAD has no competitor in terms of the size of data sets that it can compress (extending to many hundreds of gigabytes) and, even for smaller data sets, the results are competitive compared to alternatives; as an example, 39 S. cerevisiae genomes compressed to 0.25 bits per base.
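
    The core idea — detecting exact repeated content and encoding it as dictionary references — can be sketched in miniature (an in-memory toy with a hypothetical k-mer size, not COMRAD's multi-pass, disk-based construction):

```python
from collections import Counter

def build_dictionary(sequences, k=8, min_count=2):
    """Collect k-mers that repeat within or across sequences -- the
    exact repeated content a dictionary method such as COMRAD detects."""
    counts = Counter()
    for seq in sequences:
        for i in range(len(seq) - k + 1):
            counts[seq[i:i + k]] += 1
    return {kmer for kmer, c in counts.items() if c >= min_count}

def encode(seq, dictionary, k=8):
    """Greedy left-to-right encoding: replace dictionary k-mers with
    references, keep everything else as single-character literals."""
    out, i = [], 0
    while i < len(seq):
        kmer = seq[i:i + k]
        if kmer in dictionary:
            out.append(("ref", kmer))
            i += k
        else:
            out.append(("lit", seq[i]))
            i += 1
    return out
```

Because each token is either a literal or a dictionary reference, decoding is a simple concatenation, which is also what makes random access to subsequences cheap.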

  13. Data Programming: Creating Large Training Sets, Quickly

    Science.gov (United States)

    Ratner, Alexander; De Sa, Christopher; Wu, Sen; Selsam, Daniel; Ré, Christopher

    2018-01-01

    Large labeled training sets are the critical building blocks of supervised learning methods and are key enablers of deep learning techniques. For some applications, creating labeled training sets is the most time-consuming and expensive part of applying machine learning. We therefore propose a paradigm for the programmatic creation of training sets called data programming in which users express weak supervision strategies or domain heuristics as labeling functions, which are programs that label subsets of the data, but that are noisy and may conflict. We show that by explicitly representing this training set labeling process as a generative model, we can “denoise” the generated training set, and establish theoretically that we can recover the parameters of these generative models in a handful of settings. We then show how to modify a discriminative loss function to make it noise-aware, and demonstrate our method over a range of discriminative models including logistic regression and LSTMs. Experimentally, on the 2014 TAC-KBP Slot Filling challenge, we show that data programming would have led to a new winning score, and also show that applying data programming to an LSTM model leads to a TAC-KBP score almost 6 F1 points over a state-of-the-art LSTM baseline (and into second place in the competition). Additionally, in initial user studies we observed that data programming may be an easier way for non-experts to create machine learning models when training data is limited or unavailable. PMID:29872252
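
    The labeling-function idea can be sketched as follows; the three example LFs and the unweighted majority vote are illustrative stand-ins for the learned generative model described above:

```python
import numpy as np

def label_matrix(lfs, xs):
    """Apply labeling functions to data points. Each LF returns
    +1/-1 (a vote) or 0 (abstain), as in the data-programming setup."""
    return np.array([[lf(x) for lf in lfs] for x in xs])

def majority_label(L):
    """Denoise by unweighted majority vote -- a simplified stand-in for
    the generative model fitted over the LF outputs (sign of the vote
    sum; 0 means all-abstain or a tie)."""
    return np.sign(L.sum(axis=1))

# Hypothetical LFs for tagging strings that look like 4-digit years
lfs = [
    lambda x: 1 if x.isdigit() and len(x) == 4 else -1,
    lambda x: 1 if x.startswith(("19", "20")) else 0,
    lambda x: -1 if any(c.isalpha() for c in x) else 0,
]
```

The point of the generative-model step in the paper is precisely to do better than this naive vote by estimating each LF's accuracy and correlations.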

  14. Accelerated EM-based clustering of large data sets

    NARCIS (Netherlands)

    Verbeek, J.J.; Nunnink, J.R.J.; Vlassis, N.

    2006-01-01

    Motivated by the poor performance (linear complexity) of the EM algorithm in clustering large data sets, and inspired by the successful accelerated versions of related algorithms like k-means, we derive an accelerated variant of the EM algorithm for Gaussian mixtures that: (1) offers speedups that

  15. New set-up for high-quality soft-X-ray absorption spectroscopy of large organic molecules in the gas phase

    Energy Technology Data Exchange (ETDEWEB)

    Holch, Florian; Huebner, Dominique [Universitaet Wuerzburg, Experimentelle Physik VII and Roentgen Research Center for Complex Materials (RCCM), Am Hubland, 97074 Wuerzburg (Germany); Fink, Rainer [Universitaet Erlangen-Nuernberg, ICMM and CENEM, Egerlandstrasse 3, 91058 Erlangen (Germany); Schoell, Achim, E-mail: achim.schoell@physik.uni-wuerzburg.de [Universitaet Wuerzburg, Experimentelle Physik VII and Roentgen Research Center for Complex Materials (RCCM), Am Hubland, 97074 Wuerzburg (Germany); Umbach, Eberhard [Karlsruhe Institute of Technology, 76021 Karlsruhe (Germany)]

    2011-11-15

    Highlights: • We present a new set-up for X-ray absorption (NEXAFS) spectroscopy on large molecules in the gas phase. • The cell has a confined volume and can be heated. • The spectra can be acquired fast and are of very high quality with respect to signal-to-noise ratio and energy resolution. • This allows the analysis of spectroscopic details (e.g. solid-state effects, by comparing gas- and condensed-phase data). - Abstract: We present a new experimental set-up for the investigation of large (>128 amu) organic molecules in the gas phase by means of near-edge X-ray absorption fine structure spectroscopy in the soft X-ray range. Our approach uses a gas cell, which is sealed off against the surrounding vacuum and which can be heated above the sublimation temperature of the respective molecular compound. Using a confined volume rather than a molecular beam yields short acquisition times and intense signals due to the high molecular density, which can be tuned by the container temperature. In turn, the resulting spectra are of very high quality with respect to signal-to-noise ratio and energy resolution, which are the essential aspects for the analysis of fine spectroscopic details. Using the examples of ANQ, NTCDA, and PTCDA, specific challenges of gas-phase measurements on large organic molecules with high sublimation temperatures are addressed in detail with respect to the presented set-up, and possible ways to tackle them are outlined.

  16. Shortest triplet clustering: reconstructing large phylogenies using representative sets

    Directory of Open Access Journals (Sweden)

    Sy Vinh Le

    2005-04-01

    Full Text Available Abstract Background Understanding the evolutionary relationships among species based on their genetic information is one of the primary objectives in phylogenetic analysis. Reconstructing phylogenies for large data sets is still a challenging task in Bioinformatics. Results We propose a new distance-based clustering method, the shortest triplet clustering algorithm (STC), to reconstruct phylogenies. The main idea is the introduction of a natural definition of so-called k-representative sets. Based on k-representative sets, shortest triplets are reconstructed and serve as building blocks for the STC algorithm to agglomerate sequences for tree reconstruction in O(n²) time for n sequences. Simulations show that STC gives better topological accuracy than other tested methods that also build a first starting tree. STC appears to be a very good method to start the tree reconstruction. However, all tested methods give similar results if balanced nearest neighbor interchange (BNNI) is applied as a post-processing step. BNNI leads to an improvement in all instances. The program is available at http://www.bi.uni-duesseldorf.de/software/stc/. Conclusion The results demonstrate that the new approach efficiently reconstructs phylogenies for large data sets. We found that BNNI boosts the topological accuracy of all methods including STC; therefore, one should use BNNI as a post-processing step to get better topological accuracy.

  17. Analysis and hit filtering of a very large library of compounds screened against Mycobacterium tuberculosis.

    Science.gov (United States)

    Ekins, Sean; Kaneko, Takushi; Lipinski, Christopher A; Bradford, Justin; Dole, Krishna; Spektor, Anna; Gregory, Kellan; Blondeau, David; Ernst, Sylvia; Yang, Jeremy; Goncharoff, Nicko; Hohman, Moses M; Bunin, Barry A

    2010-11-01

    There is an urgent need for new drugs against tuberculosis, which annually claims 1.7-1.8 million lives. One approach to identify potential leads is to screen small molecules in vitro against Mycobacterium tuberculosis (Mtb). Until recently there was no central repository to collect information on the compounds screened. Consequently, it has been difficult to analyze the molecular properties of compounds that inhibit the growth of Mtb in vitro. We have collected data from publicly available sources on over 300 000 small molecules deposited in the Collaborative Drug Discovery TB Database. A cheminformatics analysis of these compounds indicates that inhibitors of the growth of Mtb have statistically higher mean logP and more rule-of-5 alerts, while also having lower HBD count, atom count and PSA (ChemAxon descriptors), compared to compounds classed as inactive. Additionally, Bayesian models for selecting Mtb-active compounds were evaluated with over 100 000 compounds and demonstrated 10-fold enrichment over random for the top-ranked 600 compounds. This represents a promising approach for finding compounds active against Mtb in whole cells screened under the same in vitro conditions. Various sets of Mtb hit molecules were also examined with filtering rules used widely in the pharmaceutical industry to identify compounds with potentially reactive moieties. We found differences between the numbers of compounds flagged by these rules in Mtb datasets, malaria hits, FDA-approved drugs and antibiotics. Combining these approaches may enable selection of compounds with an increased probability of inhibiting whole-cell Mtb activity.
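
    The "10 fold enrichment over random for the top ranked 600 compounds" reported above corresponds to the standard enrichment-factor calculation, sketched here on a hypothetical ranking:

```python
def enrichment_factor(scores, actives, top_n):
    """Enrichment over random: the number of actives recovered in the
    top_n ranked compounds divided by the number expected by chance.
    `scores` is a per-compound ranking score; `actives` holds the
    indices of the known active compounds."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    top = set(order[:top_n])
    hits = sum(1 for i in actives if i in top)
    expected = len(actives) * top_n / len(scores)  # random baseline
    return hits / expected
```

An enrichment factor of 10 for the top 600 thus means the model's top picks contain ten times as many growth inhibitors as a random selection of 600 would.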

  18. Management of a Large Qualitative Data Set: Establishing Trustworthiness of the Data

    Directory of Open Access Journals (Sweden)

    Debbie Elizabeth White RN, PhD

    2012-07-01

    Full Text Available Health services research is multifaceted and impacted by the multiple contexts and stakeholders involved. Hence, large data sets are necessary to fully understand the complex phenomena (e.g., scope of nursing practice being studied. The management of these large data sets can lead to numerous challenges in establishing trustworthiness of the study. This article reports on strategies utilized in data collection and analysis of a large qualitative study to establish trustworthiness. Specific strategies undertaken by the research team included training of interviewers and coders, variation in participant recruitment, consistency in data collection, completion of data cleaning, development of a conceptual framework for analysis, consistency in coding through regular communication and meetings between coders and key research team members, use of N6™ software to organize data, and creation of a comprehensive audit trail with internal and external audits. Finally, we make eight recommendations that will help ensure rigour for studies with large qualitative data sets: organization of the study by a single person; thorough documentation of the data collection and analysis process; attention to timelines; the use of an iterative process for data collection and analysis; internal and external audits; regular communication among the research team; adequate resources for timely completion; and time for reflection and diversion. Following these steps will enable researchers to complete a rigorous, qualitative research study when faced with large data sets to answer complex health services research questions.

  19. Large-scale exfoliation of inorganic layered compounds in aqueous surfactant solutions

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Ronan J.; King, Paul J.; Lotya, Mustafa; Wirtz, Christian; Khan, Umar; De, Sukanta; O' Neill, Arlene; Coleman, Jonathan N. [School of Physics, Trinity College Dublin, Dublin 2 (Ireland); CRANN, Trinity College Dublin, Dublin 2 (Ireland); Duesberg, Georg S. [CRANN, Trinity College Dublin, Dublin 2 (Ireland); School of Chemistry, Trinity College Dublin, Dublin 2 (Ireland); Grunlan, Jaime C.; Moriarty, Gregory [Department of Mechanical Engineering, Texas A and M University, College Station, Texas 77843 (United States); Chen, Jun [Intelligent Polymer Research Institute, ARC Centre of Excellence for Electromaterials Science, AIIM Facility, University of Wollongong, NSW 2522 (Australia); Wang, Jiazhao [Institute for Superconducting and Electronic Materials, ARC Centre of Excellence for Electromaterials Science, University of Wollongong, NSW 2522 (Australia); Minett, Andrew I. [Laboratory for Sustainable Technology, School of Chemical and Biomolecular Engineering, University of Sydney, Sydney, NSW 2006 (Australia); Nicolosi, Valeria [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom)

    2011-09-08

    A method to exfoliate MoS{sub 2} in large quantities in surfactant-water solutions is described. The layered material tends to be exfoliated as dispersions of thin, relatively defect-free flakes with lateral sizes of hundreds of nanometers. This method can be extended to a range of other layered compounds. The dispersed flakes can be mixed with nanotubes or graphene to create functional hybrid materials. (Copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  20. Differential profiling of volatile organic compound biomarker signatures utilizing a logical statistical filter-set and novel hybrid evolutionary classifiers

    Science.gov (United States)

    Grigsby, Claude C.; Zmuda, Michael A.; Boone, Derek W.; Highlander, Tyler C.; Kramer, Ryan M.; Rizki, Mateen M.

    2012-06-01

    A growing body of discoveries in molecular signatures has revealed that volatile organic compounds (VOCs), the small molecules associated with an individual's odor and breath, can be monitored to reveal the identity and presence of a unique individual, as well their overall physiological status. Given the analysis requirements for differential VOC profiling via gas chromatography/mass spectrometry, our group has developed a novel informatics platform, Metabolite Differentiation and Discovery Lab (MeDDL). In its current version, MeDDL is a comprehensive tool for time-series spectral registration and alignment, visualization, comparative analysis, and machine learning to facilitate the efficient analysis of multiple, large-scale biomarker discovery studies. The MeDDL toolset can therefore identify a large differential subset of registered peaks, where their corresponding intensities can be used as features for classification. This initial screening of peaks yields results sets that are typically too large for incorporation into a portable, electronic nose based system in addition to including VOCs that are not amenable to classification; consequently, it is also important to identify an optimal subset of these peaks to increase classification accuracy and to decrease the cost of the final system. MeDDL's learning tools include a classifier similar to a K-nearest neighbor classifier used in conjunction with a genetic algorithm (GA) that simultaneously optimizes the classifier and subset of features. The GA uses ROC curves to produce classifiers having maximal area under their ROC curve. Experimental results on over a dozen recognition problems show many examples of classifiers and feature sets that produce perfect ROC curves.
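
    The GA fitness described above — area under the ROC curve — can be computed directly from classifier scores via the rank-sum (Mann–Whitney) identity; a minimal sketch:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum identity: the
    probability that a randomly chosen positive outscores a randomly
    chosen negative (ties count half). This is the quantity a GA can
    maximise when co-optimising a classifier and its feature subset."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfect ROC curve, as reported for many of the VOC classifiers above, corresponds to an AUC of exactly 1.0.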

  1. New large solar photocatalytic plant: set-up and preliminary results.

    Science.gov (United States)

    Malato, S; Blanco, J; Vidal, A; Fernández, P; Cáceres, J; Trincado, P; Oliveira, J C; Vincent, M

    2002-04-01

    A European industrial consortium called SOLARDETOX has been created as the result of an EC-DGXII BRITE-EURAM-III-financed project on solar photocatalytic detoxification of water. The project objective was to develop a simple, efficient and commercially competitive water-treatment technology, based on compound parabolic collector (CPC) solar collectors and TiO2 photocatalysis, to make easy design and installation possible. The design, set-up and preliminary results of the main project deliverable, the first European industrial solar detoxification treatment plant, are presented. This plant has been designed for the batch treatment of 2 m3 of water with a 100 m2 collector-aperture area and aqueous aerated suspensions of polycrystalline TiO2 irradiated by sunlight. Fully automatic control reduces operation and maintenance manpower. Plant behaviour has been compared (using dichloroacetic acid and cyanide at 50 mg l(-1) initial concentration as model compounds) with the small CPC pilot plants installed at the Plataforma Solar de Almería several years ago. The first results with high-content cyanide (1 g l(-1)) wastewater are presented and the plant treatment capacity is calculated.

  2. Extraction of tacit knowledge from large ADME data sets via pairwise analysis.

    Science.gov (United States)

    Keefer, Christopher E; Chang, George; Kauffman, Gregory W

    2011-06-15

    Pharmaceutical companies routinely collect data across multiple projects for common ADME endpoints. Although at the time of collection the data is intended for use in decision making within a specific project, knowledge can be gained by data mining the entire cross-project data set for patterns of structure-activity relationships (SAR) that may be applied to any project. One such data mining method is pairwise analysis. This method has the advantage of being able to identify small structural changes that lead to significant changes in activity. In this paper, we describe the process for full pairwise analysis of our high-throughput ADME assays routinely used for compound discovery efforts at Pfizer (microsomal clearance, passive membrane permeability, P-gp efflux, and lipophilicity). We also describe multiple strategies for the application of these transforms in a prospective manner during compound design. Finally, a detailed analysis of the activity patterns in pairs of compounds that share the same molecular transformation reveals multiple types of transforms from an SAR perspective. These include bioisosteres, additives, multiplicatives, and a type we call switches as they act to either turn on or turn off an activity. Copyright © 2011 Elsevier Ltd. All rights reserved.
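
    A toy version of such pairwise analysis, assuming compounds are pre-decomposed into a common core plus a single variable R-group (a simplification standing in for real matched-molecular-pair detection over the full ADME data set):

```python
from collections import defaultdict
from itertools import combinations

def pairwise_transforms(records):
    """records: (compound_id, core, r_group, activity) tuples.
    Compounds sharing a core but differing in their R-group form a
    matched pair; the activity delta is attributed to the
    r_group_a -> r_group_b transform, averaged across all cores."""
    by_core = defaultdict(list)
    for cid, core, r, act in records:
        by_core[core].append((r, act))
    deltas = defaultdict(list)
    for members in by_core.values():
        for (ra, aa), (rb, ab) in combinations(members, 2):
            deltas[(ra, rb)].append(ab - aa)
    return {t: sum(v) / len(v) for t, v in deltas.items()}
```

Transforms with a consistently positive or negative mean delta across many cores are the bioisostere/additive/switch patterns the paper catalogues.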

  3. A full scale approximation of covariance functions for large spatial data sets

    KAUST Repository

    Sang, Huiyan

    2011-10-10

    Gaussian process models have been widely used in spatial statistics but face tremendous computational challenges for very large data sets. The model fitting and spatial prediction of such models typically require O(n 3) operations for a data set of size n. Various approximations of the covariance functions have been introduced to reduce the computational cost. However, most existing approximations cannot simultaneously capture both the large- and the small-scale spatial dependence. A new approximation scheme is developed to provide a high quality approximation to the covariance function at both the large and the small spatial scales. The new approximation is the summation of two parts: a reduced rank covariance and a compactly supported covariance obtained by tapering the covariance of the residual of the reduced rank approximation. Whereas the former part mainly captures the large-scale spatial variation, the latter part captures the small-scale, local variation that is unexplained by the former part. By combining the reduced rank representation and sparse matrix techniques, our approach allows for efficient computation for maximum likelihood estimation, spatial prediction and Bayesian inference. We illustrate the new approach with simulated and real data sets. © 2011 Royal Statistical Society.
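
    The two-part approximation can be sketched schematically: a reduced-rank term built from a small set of knot points, plus a compactly supported taper applied to the residual (the taper function and knot choice here are illustrative assumptions, not the paper's):

```python
import numpy as np

def full_scale_approx(C, knots, taper_range, coords):
    """Full-scale approximation of a covariance matrix C: a reduced-rank
    part built from knot points captures large-scale dependence; the
    residual, multiplied by a compactly supported taper, captures the
    small-scale, local variation. A schematic sketch only."""
    Ck = C[np.ix_(knots, knots)]               # knot-knot covariance
    Cs = C[:, knots]                           # data-knot covariance
    low_rank = Cs @ np.linalg.solve(Ck, Cs.T)  # reduced-rank part
    resid = C - low_rank
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    taper = np.clip(1.0 - d / taper_range, 0.0, None) ** 2  # compact support
    return low_rank + resid * taper            # sparse residual correction
```

The tapered residual is sparse (zero beyond `taper_range`), which is what enables the sparse-matrix computations for likelihood evaluation and prediction.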


  5. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.

    Science.gov (United States)

    Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2017-06-30

    Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may differ depending on the nature of the experimental protocols. Potential experimental errors in the modeling sets may therefore lead to poor QSAR models and further affect the predictions for new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, obtained by simulating experimental errors, and QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints), extensively curated both in-house and by our collaborators, to create over 1,800 QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates as the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large cross-validation prediction errors, the external predictions of new compounds did not improve. Our conclusion is that QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors, but removing those compounds via the cross-validation procedure is not a reasonable means to improve model predictivity, due to overfitting.
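
    The error-simulation protocol — randomizing the activities of a chosen fraction of the modeling set — can be sketched for a binary endpoint (a simplifying assumption; the study also used continuous endpoints):

```python
import numpy as np

def inject_label_noise(y, ratio, rng):
    """Simulate experimental error by randomizing the activities of a
    fraction `ratio` of the modeling set: sampled positions get a fresh
    uniformly random binary label, leaving the original array intact."""
    y_noisy = y.copy()
    idx = rng.choice(len(y), size=int(ratio * len(y)), replace=False)
    y_noisy[idx] = rng.integers(0, 2, size=len(idx))
    return y_noisy
```

Re-fitting the model on `y_noisy` for a grid of ratios, and scoring each fit by cross-validation, reproduces the performance-vs-error-ratio curve the study examines.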

  6. Complex magnetic properties and large magnetocaloric effects in RCoGe (R=Tb, Dy) compounds

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2018-05-01

    Complicated magnetic phase transitions and large magnetocaloric effects (MCEs) in RCoGe (R=Tb, Dy) compounds are reported in this paper. Results show that the TbCoGe compound undergoes a magnetic phase transition from antiferromagnetic to paramagnetic (AFM-PM) at TN∼16 K, which is close to the value reported by neutron diffraction. The DyCoGe compound undergoes complicated phase changes from 2 K up to 300 K. The peak at 10 K indicates a phase transition from antiferromagnetic to ferromagnetic (AFM-FM). In particular, a significant ferromagnetic to paramagnetic (FM-PM) phase transition was found at a temperature as high as 175 K, and the cusp becomes more abrupt as the magnetic field increases from 0.01 T to 0.1 T. The maximum values of the magnetic entropy change of the TbCoGe and DyCoGe compounds reach 14.5 J/kg K and 11.5 J/kg K, respectively, for a field change of 0-5 T. Additionally, considerable refrigerant capacity values of 260 J/kg and 242 J/kg are obtained, respectively, suggesting that both TbCoGe and DyCoGe compounds could be considered good candidates for low-temperature magnetic refrigeration.
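
    The magnetic entropy changes quoted here are conventionally obtained from isothermal magnetization data via the Maxwell relation; the standard form (not spelled out in the abstract) is:

```latex
\Delta S_M(T, H) = \int_0^{H} \left( \frac{\partial M(T, H')}{\partial T} \right)_{H'} \mathrm{d}H'
```

    In practice the derivative is approximated by finite differences between adjacent isothermal M-H curves, which is why the abstracts emphasize measuring magnetization at closely spaced temperatures.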

  7. Compound Decomposition in Dutch Large Vocabulary Speech Recognition

    NARCIS (Netherlands)

    Ordelman, Roeland J.F.; van Hessen, Adrianus J.; de Jong, Franciska M.G.

    2003-01-01

    This paper addresses compound splitting for Dutch in the context of broadcast news transcription. Language models were created using original text versions and text versions that were decomposed using a data-driven compound splitting algorithm. Language model performances were compared in terms of

  8. Optimizing distance-based methods for large data sets

    Science.gov (United States)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring the spatial concentration of industries have gained increasing popularity in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in O(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
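
    A minimal illustration of the memory side of this idea: pairwise distances can be accumulated directly into a fixed number of histogram bins, so memory stays O(bins) rather than O(n^2). This sketch keeps quadratic running time and assumes 2-D coordinates; it is not the paper's algorithm, only the constant-memory accumulation pattern:

```python
import numpy as np

def distance_histogram(points, n_bins=50, max_dist=None):
    """Accumulate pairwise-distance counts into a fixed set of bins,
    never materializing the full n*(n-1)/2 distance matrix."""
    pts = np.asarray(points, dtype=float)
    if max_dist is None:
        span = pts.max(axis=0) - pts.min(axis=0)
        max_dist = float(np.hypot(*span)) or 1.0  # bounding-box diagonal
    counts = np.zeros(n_bins, dtype=np.int64)
    for i in range(len(pts) - 1):
        d = np.hypot(*(pts[i + 1:] - pts[i]).T)  # distances to later points only
        hist, _ = np.histogram(d, bins=n_bins, range=(0.0, max_dist))
        counts += hist
    return counts

# four corners of the unit square: six pairwise distances
corners = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
counts = distance_histogram(corners, n_bins=4)
```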

  9. A conceptual analysis of standard setting in large-scale assessments

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1994-01-01

    Elements of arbitrariness in the standard setting process are explored, and an alternative to the use of cut scores is presented. The first part of the paper analyzes the use of cut scores in large-scale assessments, discussing three different functions: (1) cut scores define the qualifications used

  10. Secondary organic aerosol formation from a large number of reactive man-made organic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Derwent, Richard G., E-mail: r.derwent@btopenworld.com [rdscientific, Newbury, Berkshire (United Kingdom); Jenkin, Michael E. [Atmospheric Chemistry Services, Okehampton, Devon (United Kingdom); Utembe, Steven R.; Shallcross, Dudley E. [School of Chemistry, University of Bristol, Bristol (United Kingdom); Murrells, Tim P.; Passant, Neil R. [AEA Environment and Energy, Harwell International Business Centre, Oxon (United Kingdom)

    2010-07-15

    A photochemical trajectory model has been used to examine the relative propensities of a wide variety of volatile organic compounds (VOCs) emitted by human activities to form secondary organic aerosol (SOA) under one set of highly idealised conditions representing northwest Europe. This study applied a detailed speciated VOC emission inventory and the Master Chemical Mechanism version 3.1 (MCM v3.1) gas phase chemistry, coupled with an optimised representation of gas-aerosol absorptive partitioning of 365 oxygenated chemical reaction product species. In all, SOA formation was estimated from the atmospheric oxidation of 113 emitted VOCs. A number of aromatic compounds, together with some alkanes and terpenes, showed significant propensities to form SOA. When these propensities were folded into the detailed speciated emission inventory, 15 organic compounds together accounted for 97% of the SOA formation potential of UK man-made VOC emissions and 30 emission source categories accounted for 87% of this potential. After road transport and the chemical industry, SOA formation was dominated by the solvents sector, which accounted for 28% of the SOA formation potential.

  11. Visualization of diversity in large multivariate data sets.

    Science.gov (United States)

    Pham, Tuan; Hess, Rob; Ju, Crystal; Zhang, Eugene; Metoyer, Ronald

    2010-01-01

    Understanding the diversity of a set of multivariate objects is an important problem in many domains, including ecology, college admissions, investing, machine learning, and others. However, to date, very little work has been done to help users achieve this kind of understanding. Visual representation is especially appealing for this task because it offers the potential to allow users to efficiently observe the objects of interest in a direct and holistic way. Thus, in this paper, we attempt to formalize the problem of visualizing the diversity of a large (more than 1000 objects), multivariate (more than 5 attributes) data set as one worth deeper investigation by the information visualization community. In doing so, we contribute a precise definition of diversity, a set of requirements for diversity visualizations based on this definition, and a formal user study design intended to evaluate the capacity of a visual representation for communicating diversity information. Our primary contribution, however, is a visual representation, called the Diversity Map, for visualizing diversity. An evaluation of the Diversity Map using our study design shows that users can judge elements of diversity consistently and as or more accurately than when using the only other representation specifically designed to visualize diversity.

  12. Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets

    Science.gov (United States)

    Kulp, Christopher W.; Sprechini, Gene D.

    2016-01-01

    A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…

  13. Simultaneous identification of long similar substrings in large sets of sequences

    Directory of Open Access Journals (Sweden)

    Wittig Burghardt

    2007-05-01

    Background: Sequence comparison faces new challenges today, with many complete genomes and large libraries of transcripts known. Gene annotation pipelines match these sequences in order to identify genes and their alternative splice forms. However, the software currently available cannot simultaneously compare sets of sequences as large as necessary, especially if errors must be considered. Results: We therefore present a new algorithm for the identification of almost perfectly matching substrings in very large sets of sequences. Its implementation, called ClustDB, is considerably faster and can handle 16 times more data than VMATCH, the most memory-efficient exact program known today. ClustDB simultaneously generates large sets of exactly matching substrings of a given minimum length as seeds for a novel method of match extension with errors. It generates alignments of maximum length with at most a given maximum number of errors within each overlapping window of a given size. Such alignments are not optimal in the usual sense but are faster to calculate and often more appropriate than traditional alignments for genomic sequence comparison, EST and full-length cDNA matching, and genomic sequence assembly. The method is used to check the overlaps and to reveal possible assembly errors for 1377 Medicago truncatula BAC-size sequences published at http://www.medicago.org/genome/assembly_table.php?chr=1. Conclusion: The program ClustDB proves that window alignment is an efficient way to find long sequence sections of homogeneous alignment quality, as expected in the case of random errors, and to detect systematic errors resulting from sequence contamination. Such inserts are systematically overlooked in long alignments controlled only by tuning penalties for mismatches and gaps. ClustDB is freely available for academic use.
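
    The seed phase of a seed-and-extend scheme like the one described can be caricatured with a k-mer index: collect every length-k substring and keep those shared by more than one sequence. This toy function is not ClustDB's implementation, only the general seeding idea:

```python
from collections import defaultdict

def shared_kmers(seqs, k):
    """Index all length-k substrings; return those occurring in more than
    one sequence, as candidate seeds for match extension with errors."""
    index = defaultdict(set)
    for sid, s in enumerate(seqs):
        for i in range(len(s) - k + 1):
            index[s[i:i + k]].add(sid)
    return {kmer: sids for kmer, sids in index.items() if len(sids) > 1}

seeds = shared_kmers(["GATTACA", "TTACAGG", "CCCGATT"], k=4)
```

    Each surviving seed would then be extended in both directions, tolerating a bounded number of mismatches per window.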

  14. Compounding medications in a rural setting: an interprofessional perspective

    Directory of Open Access Journals (Sweden)

    Taylor S

    2018-04-01

    Selina Taylor,1 Catherine Hays,1 Beverley Glass2 1Mount Isa Centre for Rural and Remote Health, James Cook University, Mount Isa, QLD, Australia; 2College of Medicine and Dentistry, James Cook University, Townsville, QLD, Australia. Background: Interprofessional learning (IPL) that focuses on the pharmacist's role in specialty practices as part of a multidisciplinary health care team has not been explored. This study aimed to determine health care students' understanding of the role of the pharmacist in compounding medications to optimize health outcomes for patients in rural and remote health care services. Methods: Four workshops followed by focus group interviews were conducted with undergraduate pharmacy, medical, nursing, physiotherapy, dentistry, Aboriginal public health, and speech pathology students (n=15). After an introductory lecture, students working in multidisciplinary teams compounded three products. Focus groups were held at the end of the compounding workshops to explore students' understanding and perceptions of these compounding activities. Thematic analysis was undertaken on the qualitative data obtained from the focus groups. Results: Student participants responded positively both to the opportunity to undertake a compounding exercise and to being part of an interprofessional team, perceiving benefit for their future rural and remote health practice. Four major themes emerged from the qualitative analysis: improved knowledge and understanding; application to practice; interprofessional collaboration; and rural, remote, and Indigenous context. Students acknowledged that the workshops improved their understanding of the role of the pharmacist in compounding and how they, as part of a multidisciplinary team, could deliver better health outcomes for patients with special needs, especially in a rural and remote context. Conclusion: This study highlights that workshops of this nature have a role to play in developing

  15. Influences of large sets of environmental exposures on immune responses in healthy adult men.

    Science.gov (United States)

    Yi, Buqing; Rykova, Marina; Jäger, Gundula; Feuerecker, Matthias; Hörl, Marion; Matzel, Sandra; Ponomarev, Sergey; Vassilieva, Galina; Nichiporuk, Igor; Choukèr, Alexander

    2015-08-26

    Environmental factors have long been known to influence immune responses. In particular, clinical studies on the association between migration and increased risk of atopy/asthma have provided important information on the role that the large sets of environmental exposures associated with migration play in the development of allergic diseases. However, investigations of environmental effects on immune responses have mostly been limited to candidate exposures, such as air pollution; the influences of large sets of environmental exposures on immune responses are still largely unknown. A simulated 520-day Mars mission provided an opportunity to investigate this topic. Six healthy males lived in a closed habitat simulating a spacecraft for 520 days. When they exited their "spacecraft" after the mission, the scenario was similar to that of migration, involving exposure to a new set of environmental pollutants and allergens. We measured multiple immune parameters in blood samples at chosen time points after the mission. At the early adaptation stage, highly enhanced cytokine responses were observed upon ex vivo antigen stimulation. Among cell population frequencies, the subjects displayed increased neutrophils. These results presumably represent the immune changes that occur in healthy humans upon migration, indicating that large sets of environmental exposures may trigger aberrant immune activity.

  16. Settings and artefacts relevant for Doppler ultrasound in large vessel vasculitis

    DEFF Research Database (Denmark)

    Terslev, L; Diamantopoulos, A P; Døhn, U Møller

    2017-01-01

    Ultrasound is used increasingly for diagnosing large vessel vasculitis (LVV). The application of Doppler in LVV is very different from in arthritic conditions. This paper aims to explain the most important Doppler parameters, including spectral Doppler, and how the settings differ from those used...

  17. Secondary data analysis of large data sets in urology: successes and errors to avoid.

    Science.gov (United States)

    Schlomer, Bruce J; Copp, Hillary L

    2014-03-01

    Secondary data analysis is the use of data collected for research by someone other than the investigator. In the last several years there has been a dramatic increase in the number of these studies being published in urological journals and presented at urological meetings, especially those involving secondary analysis of large administrative data sets. Along with this expansion, skepticism toward secondary data analysis studies has increased among many urologists. In this narrative review we discuss the types of large data sets commonly used for secondary data analysis in urology, and the advantages and disadvantages of secondary data analysis. A literature search was performed to identify urological secondary data analysis studies published since 2008 using commonly used large data sets, and examples of high quality studies published in high impact journals are given. We outline an approach for performing a successful hypothesis or goal driven secondary data analysis study and highlight common errors to avoid. More than 350 secondary data analysis studies using large data sets have been published on urological topics since 2008, with likely many more presented at meetings but never published. Studies that were not hypothesis or goal driven have likely constituted some of these and have probably contributed to the increased skepticism toward this type of research. However, many high quality, hypothesis driven studies addressing research questions that would have been difficult to study with other methods have been performed in the last few years. Secondary data analysis is a powerful tool that can address questions which could not be adequately studied by another method. Knowledge of the limitations of secondary data analysis and of the data sets used is critical for a successful study. There are also important errors to avoid when planning and performing a secondary data analysis study. Investigators and the urological community need to strive to use

  18. Large magnetocaloric effect of NdGa compound due to successive magnetic transitions

    Science.gov (United States)

    Zheng, X. Q.; Xu, J. W.; Shao, S. H.; Zhang, H.; Zhang, J. Y.; Wang, S. G.; Xu, Z. Y.; Wang, L. C.; Chen, J.; Shen, B. G.

    2018-05-01

    The magnetic behavior and magnetocaloric effect (MCE) of the NdGa compound were studied in detail. In the temperature dependence of magnetization (M-T) curve at 0.01 T, two sharp changes were observed at 20 K (TSR) and 42 K (TC), corresponding to spin reorientation and the FM-PM transition, respectively. Isothermal magnetization curves up to 5 T were measured at different temperatures, and the magnetic entropy change (ΔSM) was calculated from the M-H data. The temperature dependence of -ΔSM for field changes of 0-2 T and 0-5 T shows two peaks, corresponding to TSR and TC, respectively. The peak values are 6.4 J/kg K and 15.5 J/kg K for the field change of 0-5 T. Since the two peaks are close, -ΔSM remains large over the whole temperature range between TSR and TC. The excellent MCE performance of the NdGa compound benefits from the existence of these two successive magnetic transitions.

  19. Security Optimization for Distributed Applications Oriented on Very Large Data Sets

    Directory of Open Access Journals (Sweden)

    Mihai DOINEA

    2010-01-01

    The paper presents the main characteristics of applications that work with very large data sets and the related security issues. The first section addresses the optimization process and how it is approached when dealing with security. The second section describes the concept of very large data set management, while in the third section the related risks are identified and classified. Finally, a security optimization schema is presented, with a cost-efficiency analysis of its feasibility. Conclusions are drawn and future approaches are identified.

  20. Polish Phoneme Statistics Obtained On Large Set Of Written Texts

    Directory of Open Access Journals (Sweden)

    Bartosz Ziółko

    2009-01-01

    Phonetic statistics were collected from several Polish corpora. The paper is a summary of the data, which are phoneme n-grams, and of some phenomena observed in the statistics. Triphone statistics concern context-dependent speech units, which play an important role in speech recognition systems and had never before been calculated for a large set of Polish written texts. The standard phonetic alphabet for Polish, SAMPA, and the methods used to provide phonetic transcriptions are described.
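
    Collecting triphone statistics of the kind described amounts to counting sliding windows of three phonemes over a transcribed corpus. A minimal sketch, with invented placeholder symbols rather than real SAMPA transcriptions:

```python
from collections import Counter

def phoneme_ngrams(transcriptions, n=3):
    """Count phoneme n-grams (n=3 yields triphones) over a corpus of
    phonetic transcriptions, each given as a list of phoneme symbols."""
    counts = Counter()
    for phones in transcriptions:
        for i in range(len(phones) - n + 1):
            counts[tuple(phones[i:i + n])] += 1
    return counts

# toy corpus; symbols are illustrative only
corpus = [["t", "S", "e", "s", "ts"], ["e", "s", "ts", "e"]]
triphones = phoneme_ngrams(corpus)
```

    Normalizing the counts by their total yields the relative triphone frequencies used in language and acoustic modeling.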

  1. Querying Large Physics Data Sets Over an Information Grid

    CERN Document Server

    Baker, N; Kovács, Z; Le Goff, J M; McClatchey, R

    2001-01-01

    Optimising use of the Web (WWW) for LHC data analysis is a complex problem and illustrates the challenges arising from the integration of and computation across massive amounts of information distributed worldwide. Finding the right piece of information can, at times, be extremely time-consuming, if not impossible. So-called Grids have been proposed to facilitate LHC computing and many groups have embarked on studies of data replication, data migration and networking philosophies. Other aspects such as the role of 'middleware' for Grids are emerging as requiring research. This paper positions the need for appropriate middleware that enables users to resolve physics queries across massive data sets. It identifies the role of meta-data for query resolution and the importance of Information Grids for high-energy physics analysis rather than just Computational or Data Grids. This paper identifies software that is being implemented at CERN to enable the querying of very large collaborating HEP data-sets, initially...

  2. Follow up: Compound data sets and software tools for chemoinformatics and medicinal chemistry applications: update and data transfer

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2014-01-01

    In 2012, we reported 30 compound data sets and/or programs developed in our laboratory in a data article and made them freely available to the scientific community to support chemoinformatics and computational medicinal chemistry applications. These data sets and computational tools were provided for download from our website. Since publication of this data article, we have generated 13 new data sets with which we further extend our collection of publicly available data and tools. Due to changes in web servers and website architectures, data accessibility has recently been limited at times. Therefore, we have also transferred our data sets and tools to a public repository to ensure full and stable accessibility. To aid in data selection, we have classified the data sets according to scientific subject areas. Herein, we describe new data sets, introduce the data organization scheme, summarize the database content and provide detailed access information in ZENODO (doi:10.5281/zenodo.8451 and doi:10.5281/zenodo.8455). PMID:25520777

  3. A scalable method for identifying frequent subtrees in sets of large phylogenetic trees.

    Science.gov (United States)

    Ramu, Avinash; Kahveci, Tamer; Burleigh, J Gordon

    2012-10-03

    We consider the problem of finding maximum frequent agreement subtrees (MFASTs) in a collection of phylogenetic trees. Existing methods for this problem often do not scale beyond data sets with around 100 taxa. Our goal is to address this problem for data sets with over a thousand taxa and hundreds of trees. We develop a heuristic solution that aims to find MFASTs in sets of many large phylogenetic trees. Our method works in multiple phases. In the first phase, it identifies small candidate subtrees from the set of input trees which serve as the seeds of larger subtrees. In the second phase, it combines these small seeds to build larger candidate MFASTs. In the final phase, it performs a post-processing step to ensure that the reported frequent agreement subtree is not contained in a larger frequent agreement subtree. We demonstrate that this heuristic can easily handle data sets with 1000 taxa, greatly extending the estimation of MFASTs beyond current methods. Although the heuristic is not guaranteed to find all MFASTs or the largest MFAST, it found the MFAST in all of our synthetic data sets where we could verify the correctness of the result. It also performed well on large empirical data sets. Its performance is robust to the number and size of the input trees. Overall, this method provides a simple and fast way to identify strongly supported subtrees within large phylogenetic hypotheses.
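
    The first phase, scoring small candidate subtrees by how often they occur across the input trees, can be caricatured by representing each tree as its set of clades (a clade being a frozenset of taxa). This is a simplification of agreement-subtree testing, and the names are illustrative:

```python
def clade_frequency(trees, clade):
    """Fraction of input trees containing the given clade; high-frequency
    clades serve as seeds for assembling larger candidate MFASTs."""
    clade = frozenset(clade)
    return sum(clade in t for t in trees) / len(trees)

# three toy trees, each reduced to its set of clades
trees = [
    {frozenset("AB"), frozenset("ABC")},
    {frozenset("AB"), frozenset("ABD")},
    {frozenset("AC"), frozenset("ABC")},
]
freq = clade_frequency(trees, "AB")
```

    Seeds whose frequency meets a chosen support threshold would then be merged and re-tested against the full trees in the later phases.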

  4. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on avoiding redundancy for users working on the same task. While this improves the effectiveness of the user work process, the underlying query processing engine is typically considered a "black box" and left unchanged. Research in multiple query processing, on the other hand, ignores the application ... to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems.

  5. Development of estrogen receptor beta binding prediction model using large sets of chemicals.

    Science.gov (United States)

    Sakkiah, Sugunadevi; Selvaraj, Chandrabose; Gong, Ping; Zhang, Chaoyang; Tong, Weida; Hong, Huixiao

    2017-11-03

    We developed an ERβ binding prediction model to facilitate identification of chemicals that specifically bind ERβ or ERα, complementing our previously developed ERα binding model. Decision Forest was used to train the ERβ binding prediction model on a large set of compounds obtained from EADB. Model performance was estimated through 1000 iterations of 5-fold cross-validation. Prediction confidence was analyzed using the predictions from the cross-validations. Informative chemical features for ERβ binding were identified through analysis of the frequency data of the chemical descriptors used in the models in the 5-fold cross-validations. 1000 permutations were conducted to assess chance correlation. The average accuracy of the 5-fold cross-validations was 93.14%, with a standard deviation of 0.64%. Prediction confidence analysis indicated that the higher the prediction confidence, the more accurate the predictions. Permutation testing revealed that the prediction model is unlikely to have been generated by chance. Eighteen informative descriptors were identified as important to ERβ binding prediction. Application of the prediction model to the data from the ToxCast project yielded a very high sensitivity of 90-92%. Our results demonstrated that ERβ binding of chemicals can be accurately predicted using the developed model. Coupled with our previously developed ERα prediction model, this model can be expected to facilitate drug development through identification of chemicals that specifically bind ERβ or ERα.
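
    The chance-correlation check via label permutation can be sketched generically. Here `score_fn` is assumed to retrain and score a model for a given label vector; the toy score below is a stand-in, not the paper's Decision Forest code:

```python
import numpy as np

def permutation_pvalue(score_fn, y, n_perm=1000, seed=0):
    """Permutation test for chance correlation: the p-value is the fraction
    of permuted-label scores that match or beat the true-label score."""
    rng = np.random.default_rng(seed)
    observed = score_fn(y)
    null = [score_fn(rng.permutation(y)) for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (1 + n_perm)

# toy score: agreement of the labels with a fixed 'prediction' vector
pred = np.array([0] * 10 + [1] * 10)
score = lambda labels: float(np.mean(labels == pred))
p = permutation_pvalue(score, pred.copy(), n_perm=200)
```

    A small p-value indicates that the real model's performance is very unlikely to arise from randomly shuffled activities.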

  6. Combining RP and SP data while accounting for large choice sets and travel mode

    DEFF Research Database (Denmark)

    Abildtrup, Jens; Olsen, Søren Bøye; Stenger, Anne

    2015-01-01

    set used for site selection modelling when the actual choice set considered is potentially large and unknown to the analyst. Easy access to forests also implies that around half of the visitors walk or bike to the forest. We apply an error-component mixed-logit model to simultaneously model the travel...

  7. A two-stage compound parabolic concentrator system with a large entrance over the exit aperture ratio

    International Nuclear Information System (INIS)

    Angelescu, Tatiana; Radu, A. A.

    2000-01-01

    Certain optical designs in the field of high energy gamma ray astronomy require that components of the Cherenkov light collected by the mirror of the telescope be concentrated on the photo-cathodes of the photomultiplier tubes with the help of light collectors having large entrance and small exit apertures. Mathematical restrictions imposed by the design of the compound parabolic concentrator (CPC) imply that, for a given cut-off angle and entrance aperture, the exit aperture of the CPC cannot be made smaller than a limit value. If this value is larger than the active diameter of the photocathode, an additional concentrator must be added to the system in order to transfer the collected light from the exit aperture of the compound parabolic concentrator to the photocathode of the photomultiplier tube. Different designs of a two-stage system composed of a hollow compound parabolic concentrator and a solid, dielectric-filled concentrator are evaluated in this paper from the point of view of optical efficiency and manufacturability. (authors)

  8. Reducing NIR prediction errors with nonlinear methods and large populations of intact compound feedstuffs

    International Nuclear Information System (INIS)

    Fernández-Ahumada, E; Gómez, A; Vallesquino, P; Guerrero, J E; Pérez-Marín, D; Garrido-Varo, A; Fearn, T

    2008-01-01

    According to the current demands of the authorities, the manufacturers and the consumers, control and assessment of the compound feed manufacturing process have become a key concern. Among other things, it must be assured that a given compound feed is correctly manufactured and labelled in terms of its ingredient composition. When near-infrared spectroscopy (NIRS) together with linear models was used for the prediction of the ingredient composition, the results were not always acceptable. Therefore, the performance of nonlinear methods has been investigated. Artificial neural networks (ANN) and least squares support vector machines (LS-SVM) have been applied to a large (N = 20 320) and heterogeneous population of non-milled feed compounds for the NIR prediction of the inclusion percentage of wheat and sunflower meal, as representatives of two different classes of ingredients. Compared to partial least squares regression, the results showed considerable reductions of standard error of prediction values for both methods and ingredients: reductions of 45% with ANN and 49% with LS-SVM for wheat, and reductions of 44% with ANN and 46% with LS-SVM for sunflower meal. These improvements, together with the ease with which NIRS technology can be implemented in the process, make it ideal for meeting the requirements of the animal feed industry.

  9. Large margin image set representation and classification

    KAUST Repository

    Wang, Jim Jing-Yan; Alzahrani, Majed A.; Gao, Xin

    2014-01-01

    In this paper, we propose a novel image set representation and classification method based on maximizing the margin of image sets. The margin of an image set is defined as the difference between the distance to its nearest image set from a different class and the distance to its nearest image set of the same class. By modeling the image sets using both their image samples and their affine hull models, and maximizing the margins of the image sets, the image set representation parameter learning problem is formulated as a minimization problem, which is further optimized by an expectation-maximization (EM) strategy with accelerated proximal gradient (APG) optimization in an iterative algorithm. To classify a given test image set, we assign it to the class that provides the largest margin. Experiments on two applications of video-sequence-based face recognition demonstrate that the proposed method significantly outperforms state-of-the-art image set classification methods in terms of both effectiveness and efficiency.
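
    The margin definition in this abstract can be written down directly. The minimum-pairwise-distance set distance below is one simple stand-in for the paper's sample- and affine-hull-based distances; function names are illustrative:

```python
import numpy as np

def set_distance(a, b):
    """Sample-based distance between two image sets: the smallest pairwise
    Euclidean distance between their feature vectors (rows)."""
    diff = a[:, None, :] - b[None, :, :]
    return np.sqrt((diff ** 2).sum(-1)).min()

def margin(query, same_class_sets, other_class_sets):
    """Margin = distance to the nearest other-class set minus distance to
    the nearest same-class set; positive means well separated."""
    d_same = min(set_distance(query, s) for s in same_class_sets)
    d_other = min(set_distance(query, s) for s in other_class_sets)
    return d_other - d_same

query = np.array([[0.0, 0.0]])
same = [np.array([[0.0, 1.0]])]    # nearest same-class set at distance 1
other = [np.array([[3.0, 0.0]])]   # nearest other-class set at distance 3
m = margin(query, same, other)
```

    The paper's learning step adjusts the representation parameters so that such margins are maximized over all training sets.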


  11. ADME evaluation in drug discovery. 1. Applications of genetic algorithms to the prediction of blood-brain partitioning of a large set of drugs.

    Science.gov (United States)

    Hou, Tingjun; Xu, Xiaojie

    2002-12-01

    In this study, the relationships between the brain-blood concentration ratios of 96 structurally diverse compounds and a large number of structurally derived descriptors were investigated. The linear models were based on molecular descriptors that can be calculated for any compound simply from a knowledge of its molecular structure. The linear correlation coefficients of the models were optimized by genetic algorithms (GAs), and the descriptors used in the linear models were automatically selected from 27 structurally derived descriptors. The GA optimizations resulted in a group of linear models with three or four molecular descriptors and good statistical significance. The change in descriptor use as the evolution proceeds demonstrates that the octanol/water partition coefficient and the partial negative solvent-accessible surface area multiplied by the negative charge are crucial to blood-brain barrier permeability. Moreover, we found that predictions using multiple QSPR models from the GA optimization gave quite good results in spite of the diversity of structures, better than predictions using the best single model. The predictions for the two external sets with 37 diverse compounds using multiple QSPR models indicate that the best linear models with four descriptors are sufficiently effective for predictive use. Considering the ease of computation of the descriptors, the linear models may be used as general utilities to screen the blood-brain barrier partitioning of drugs in a high-throughput fashion.
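
    In a GA-driven descriptor selection scheme like this, each chromosome encodes a descriptor subset and its fitness is the quality of a linear fit. The evaluation step can be sketched with an ordinary least-squares R^2; the GA loop itself (selection, crossover, mutation) is omitted and all names are illustrative:

```python
import numpy as np

def subset_r2(X, y, mask):
    """Fitness of a candidate descriptor subset: in-sample R^2 of an OLS
    model using only the descriptors flagged True in the boolean mask."""
    Xs = np.column_stack([np.ones(len(y)), X[:, mask]])  # intercept + subset
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ coef
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))
y = 2.0 * X[:, 0] - X[:, 2] + 0.5          # depends only on descriptors 0 and 2
r2_good = subset_r2(X, y, np.array([True, False, True, False, False]))
r2_bad = subset_r2(X, y, np.array([False, True, False, True, False]))
```

    A real implementation would score fitness by cross-validated rather than in-sample R^2 to avoid the overfitting that plagues subset selection.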

  12. Zebrafish Expression Ontology of Gene Sets (ZEOGS): A Tool to Analyze Enrichment of Zebrafish Anatomical Terms in Large Gene Sets

    Science.gov (United States)

    Marsico, Annalisa

    2013-01-01

The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas only very few enriched terms to none were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene

  13. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    Science.gov (United States)

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas only very few enriched terms to none were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. 
Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression
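Term overrepresentation of the kind ZEOGS reports is typically assessed with an upper-tail hypergeometric test; a minimal stdlib sketch (the gene universe and term counts below are made up, and ZEOGS's actual statistics may differ in detail):

```python
from math import comb

def enrichment_p(N, K, n, k):
    """Upper-tail hypergeometric probability P[X >= k]: the chance of
    seeing at least k term-annotated genes in an input set of n genes
    drawn from a universe of N genes, of which K carry the term."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# E.g. 8 of 20 input genes carry a term annotated to 50 of 1000 genes
p = enrichment_p(1000, 50, 20, 8)
```

A small p indicates the anatomical term is overrepresented in the input set relative to chance; in practice the p-values would also be corrected for multiple testing across terms.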

  14. Evaluation of Volatile Organic Compounds and Carbonyl Compounds Present in the Cabins of Newly Produced, Medium- and Large-Size Coaches in China

    Directory of Open Access Journals (Sweden)

    Yan-Yang Lu

    2016-06-01

An air-conditioned coach is an important form of transportation in the modern motorized society and, as a result, there is increasing concern about in-vehicle air pollution. In this study, we aimed to identify and quantify the levels of volatile organic compounds (VOCs) and carbonyl compounds (CCs) in air samples collected from the cabins of newly produced, medium- and large-size coaches. Among the identified VOCs and CCs, toluene, ethylbenzene, xylene, formaldehyde, acetaldehyde, acrolein/acetone, and isovaleraldehyde were relatively abundant in the cabins. Time was found to affect the emissions of the contaminants in the coaches. Except for benzaldehyde, valeraldehyde and benzene, the highest in-vehicle concentrations of VOCs and CCs were observed on the 15th day after the coaches came off the assembly line, and the concentrations exhibited an approximately inverted-U-shaped pattern as a function of time. Interestingly, this study also showed that the interior temperature of the coaches significantly affected VOC emissions from the interior materials, whereas the levels of CCs were mainly influenced by the relative humidity within the coaches. In China, guidelines and regulations for in-vehicle air quality assessment of coaches have not yet been issued. The results of this study provide further understanding of the in-vehicle air quality of air-conditioned coaches and can be used in the development of both specific and general rules regarding medium- and large-size coaches.

  15. Characterization of ToxCast Phase II compounds disruption of ...

    Science.gov (United States)

    The development of multi-well microelectrode array (mwMEA) systems has increased in vitro screening throughput making them an effective method to screen and prioritize large sets of compounds for potential neurotoxicity. In the present experiments, a multiplexed approach was used to determine compound effects on both neural function and cell health in primary cortical networks grown on mwMEA plates following exposure to ~1100 compounds from EPA’s Phase II ToxCast libraries. On DIV 13, baseline activity (40 min) was recorded prior to exposure to each compound at 40 µM. DMSO and the GABAA antagonist bicuculline (BIC) were included as controls on each mwMEA plate. Changes in spontaneous network activity (mean firing rate; MFR) and cell viability (lactate dehydrogenase; LDH and CellTiter Blue; CTB) were assessed within the same well following compound exposure. Activity calls (“hits”) were established using the 90th and 20th percentiles of the compound-induced change in MFR (medians of triplicates) across all tested compounds; compounds above (top 10% of compounds increasing MFR), and below (bottom 20% of compounds decreasing MFR) these thresholds, respectively were considered hits. MFR was altered beyond one of these thresholds by 322 compounds. Four compound categories accounted for 66% of the hits, including: insecticides (e.g. abamectin, lindane, prallethrin), pharmaceuticals (e.g. haloperidol, reserpine), fungicides (e.g. hexaconazole, fenamidone), and h
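The percentile-based hit-calling described above can be sketched as follows (synthetic MFR changes stand in for the screening data; the thresholds mirror the 90th/20th percentile rule):

```python
import numpy as np

rng = np.random.default_rng(1)
# Median-of-triplicate percent change in mean firing rate, one value
# per compound (synthetic; ~1100 compounds as in the ToxCast screen)
mfr_change = rng.normal(loc=0.0, scale=20.0, size=1100)

up_thr = np.percentile(mfr_change, 90)    # top 10%: MFR increases
down_thr = np.percentile(mfr_change, 20)  # bottom 20%: MFR decreases
hits = (mfr_change > up_thr) | (mfr_change < down_thr)
```

By construction roughly 30% of compounds fall outside the two thresholds; in the actual screen, 322 compounds crossed one of them.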

  16. Action recognition using mined hierarchical compound features.

    Science.gov (United States)

    Gilbert, Andrew; Illingworth, John; Bowden, Richard

    2011-05-01

    The field of Action Recognition has seen a large increase in activity in recent years. Much of the progress has been through incorporating ideas from single-frame object recognition and adapting them for temporal-based action recognition. Inspired by the success of interest points in the 2D spatial domain, their 3D (space-time) counterparts typically form the basic components used to describe actions, and in action recognition the features used are often engineered to fire sparsely. This is to ensure that the problem is tractable; however, this can sacrifice recognition accuracy as it cannot be assumed that the optimum features in terms of class discrimination are obtained from this approach. In contrast, we propose to initially use an overcomplete set of simple 2D corners in both space and time. These are grouped spatially and temporally using a hierarchical process, with an increasing search area. At each stage of the hierarchy, the most distinctive and descriptive features are learned efficiently through data mining. This allows large amounts of data to be searched for frequently reoccurring patterns of features. At each level of the hierarchy, the mined compound features become more complex, discriminative, and sparse. This results in fast, accurate recognition with real-time performance on high-resolution video. As the compound features are constructed and selected based upon their ability to discriminate, their speed and accuracy increase at each level of the hierarchy. The approach is tested on four state-of-the-art data sets, the popular KTH data set to provide a comparison with other state-of-the-art approaches, the Multi-KTH data set to illustrate performance at simultaneous multiaction classification, despite no explicit localization information provided during training. Finally, the recent Hollywood and Hollywood2 data sets provide challenging complex actions taken from commercial movie sequences. For all four data sets, the proposed hierarchical

  17. A set of triple-resonance nuclear magnetic resonance experiments for structural characterization of organophosphorus compounds in mixture samples

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, Harri, E-mail: Harri.T.Koskela@helsinki.fi [VERIFIN, University of Helsinki, P.O. Box 55, FIN-00014 Helsinki (Finland)

    2012-11-02

Highlights: • New ¹H, ¹³C, ³¹P triple-resonance NMR pulse experiments. • Analysis of organophosphorus (OP) compounds in complex matrices. • Selective extraction of ¹H, ³¹P, and ¹³C chemical shifts and connectivities. • More precise NMR identification of OP nerve agents and their degradation products. - Abstract: ¹H,¹³C correlation NMR spectroscopy utilizes J(CH) couplings in molecules, and provides important structural information from small organic molecules in the form of carbon chemical shifts and carbon-proton connectivities. The full potential of ¹H,¹³C correlation NMR spectroscopy has not been realized in Chemical Weapons Convention (CWC) related verification analyses due to the sample matrix, which usually contains a high amount of non-related compounds obscuring the correlations of the relevant compounds. Here, the results of the application of ¹H,¹³C,³¹P triple-resonance NMR spectroscopy in characterization of OP compounds related to the CWC are presented. With a set of two-dimensional triple-resonance experiments, the J(HP), J(CH) and J(PC) couplings are utilized to map the connectivities of the atoms in OP compounds and to extract the carbon chemical shift information. With the use of the proposed pulse sequences, the correlations from the OP compounds can be recorded without significant artifacts from the non-OP compound impurities in the sample. Further selectivity of the observed correlations is achieved with the application of a phosphorus band-selective pulse in the pulse sequences to assist the analysis of multiple OP compounds in mixture samples. The use of the triple-resonance experiments in the analysis of a complex sample is shown with a test mixture containing typical scheduled OP compounds, including the characteristic degradation

  18. Large and small sets with respect to homomorphisms and products of groups

    Directory of Open Access Journals (Sweden)

    Riccardo Gusso

    2002-10-01

We study the behaviour of large, small and medium subsets with respect to homomorphisms and products of groups. Then we introduce the definition of a P-small set in abelian groups and we investigate the relations between this kind of smallness and the previous one, giving some examples that distinguish them.

  19. Teaching Children to Organise and Represent Large Data Sets in a Histogram

    Science.gov (United States)

    Nisbet, Steven; Putt, Ian

    2004-01-01

    Although some bright students in primary school are able to organise numerical data into classes, most attend to the characteristics of individuals rather than the group, and "see the trees rather than the forest". How can teachers in upper primary and early high school teach students to organise large sets of data with widely varying…

  20. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  1. Using Content-Specific Lyrics to Familiar Tunes in a Large Lecture Setting

    Science.gov (United States)

    McLachlin, Derek T.

    2009-01-01

    Music can be used in lectures to increase student engagement and help students retain information. In this paper, I describe my use of biochemistry-related lyrics written to the tune of the theme to the television show, The Flintstones, in a large class setting (400-800 students). To determine student perceptions, the class was surveyed several…

  2. Combinatorial support vector machines approach for virtual screening of selective multi-target serotonin reuptake inhibitors from large compound libraries.

    Science.gov (United States)

    Shi, Z; Ma, X H; Qin, C; Jia, J; Jiang, Y Y; Tan, C Y; Chen, Y Z

    2012-02-01

Selective multi-target serotonin reuptake inhibitors enhance antidepressant efficacy. Their discovery can be facilitated by multiple methods, including in silico ones. In this study, we developed and tested an in silico method, combinatorial support vector machines (COMBI-SVMs), for virtual screening (VS) of multi-target serotonin reuptake inhibitors of seven target pairs (serotonin transporter paired with noradrenaline transporter, H(3) receptor, 5-HT(1A) receptor, 5-HT(1B) receptor, 5-HT(2C) receptor, melanocortin 4 receptor and neurokinin 1 receptor, respectively) from large compound libraries. COMBI-SVMs trained with 917-1951 individual target inhibitors correctly identified 22-83.3% (majority >31.1%) of the 6-216 dual inhibitors collected from the literature as independent testing sets. COMBI-SVMs showed moderate to good target selectivity, misclassifying as dual inhibitors 2.2-29.8% of the individual target inhibitors; the annotated targets of the virtual hits correlate with the reported effects of their predicted targets. COMBI-SVM is potentially useful for searching for selective multi-target agents without explicit knowledge of these agents. Copyright © 2011 Elsevier Inc. All rights reserved.
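A minimal sketch of the combinatorial idea, one classifier per target with dual hits defined by joint prediction (synthetic bit-vector "fingerprints" and scikit-learn's SVC stand in for the paper's molecular descriptors and SVM setup):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def fingerprints(on_bits, n):
    """Random sparse 64-bit 'fingerprints' with characteristic bits forced on."""
    fp = (rng.random((n, 64)) < 0.1).astype(float)
    fp[:, on_bits] = 1.0
    return fp

sert_actives = fingerprints([0, 1, 2], 200)    # target A inhibitors
net_actives = fingerprints([10, 11, 12], 200)  # target B inhibitors
inactives = (rng.random((400, 64)) < 0.1).astype(float)

# One SVM per target, each trained actives-versus-inactives
svm_sert = SVC(kernel="linear").fit(
    np.vstack([sert_actives, inactives]), np.r_[np.ones(200), np.zeros(400)])
svm_net = SVC(kernel="linear").fit(
    np.vstack([net_actives, inactives]), np.r_[np.ones(200), np.zeros(400)])

def predict_dual(fp):
    """A compound is a predicted dual inhibitor if both target SVMs fire."""
    return (svm_sert.predict(fp) == 1) & (svm_net.predict(fp) == 1)

dual = fingerprints([0, 1, 2, 10, 11, 12], 50)  # carries both signatures
```

Requiring agreement of the per-target models is what lets the combinatorial scheme find dual inhibitors without ever training on one.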

  3. Large data sets in finance and marketing: introduction by the special issue editor

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1998-01-01

    textabstractOn December 18 and 19 of 1997, a small conference on the "Statistical Analysis of Large Data Sets in Business Economics" was organized by the Rotterdam Institute for Business Economic Studies. Eleven presentations were delivered in plenary sessions, which were attended by about 90

  4. Thermoelectric and Structural Properties of Zr-/Hf-Based Half-Heusler Compounds Produced at a Large Scale

    Science.gov (United States)

    Zillmann, D.; Waag, A.; Peiner, E.; Feyand, M.-H.; Wolyniec, A.

    2018-02-01

The half-Heusler (HH) systems are promising candidates for thermoelectric (TE) applications since they have shown high figures of merit (zT) of ~1, which are directly related to the energy conversion efficiency. To use HH compounds for TE devices, the materials must be phase-stable at operating temperatures up to 600°C. Currently, only a few HH compositions are available in large quantities. Hence, we focus on the TE and structural properties of three commercially available Zr-/Hf-based HH compounds in this publication. In particular, we evaluate the thermal conductivities and the figures of merit and critically discuss uncertainties and error propagation in the measurements. We find thermal conductivities of less than 6.0 W K⁻¹ m⁻¹ for all investigated materials and notably high figures of merit of 0.93 and 0.60 for n- and p-type compounds, respectively, at 600°C. Additionally, our investigations reveal that the grain structures of all materials also contain secondary phases such as HfO₂, Sn-Ni and Ti-Zr-Sn-rich phases, while an additional SnO₂ phase was found following several hours of harsh heat treatment at 800°C.
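For reference, the dimensionless figure of merit quoted above is zT = S²σT/κ; a one-line helper with illustrative half-Heusler-like input values (not measurements from this paper):

```python
def figure_of_merit(seebeck, sigma, kappa, T):
    """zT = S^2 * sigma * T / kappa
    (S in V/K, sigma in S/m, kappa in W/(m K), T in K)."""
    return seebeck ** 2 * sigma * T / kappa

# Plausible n-type half-Heusler numbers at 600 C (873 K), chosen for
# illustration only: S = 220 uV/K, sigma = 1e5 S/m, kappa = 4 W/(m K)
zT = figure_of_merit(seebeck=220e-6, sigma=1.0e5, kappa=4.0, T=873.0)
```

With these inputs zT comes out near 1, the regime the abstract describes as attractive for TE devices.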

  5. Large Eddy Simulation of turbulent flows in compound channels with a finite element code

    International Nuclear Information System (INIS)

    Xavier, C.M.; Petry, A.P.; Moeller, S.V.

    2011-01-01

This paper presents a numerical investigation of the developing flow in a compound channel formed by a rectangular main channel and a gap in one of the sidewalls. A three-dimensional Large Eddy Simulation computational code with the classic Smagorinsky model is introduced, where the transient flow is modeled through the conservation equations of mass and momentum of a quasi-incompressible, isothermal continuous medium. The Finite Element Method, a Taylor-Galerkin scheme and linear hexahedral elements are applied. Numerical results for the velocity profile show the development of a shear layer, in agreement with experimental results obtained with a Pitot tube and hot wires. (author)

  6. Compound Passport Service: supporting corporate collection owners in open innovation.

    Science.gov (United States)

    Andrews, David M; Degorce, Sébastien L; Drake, David J; Gustafsson, Magnus; Higgins, Kevin M; Winter, Jon J

    2015-10-01

    A growing number of early discovery collaborative agreements are being put in place between large pharma companies and partners in which the rights for assets can reside with a partner, exclusively or jointly. Our corporate screening collection, like many others, was built on the premise that compounds generated in-house and not the subject of paper or patent disclosure were proprietary to the company. Collaborative screening arrangements and medicinal chemistry now make the origin, ownership rights and usage of compounds difficult to determine and manage. The Compound Passport Service is a dynamic database, managed and accessed through a set of reusable services that borrows from social media concepts to allow sample owners to take control of their samples in a much more active way. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Atmospheric Chemistry of Micrometeoritic Organic Compounds

    Science.gov (United States)

    Kress, M. E.; Belle, C. L.; Pevyhouse, A. R.; Iraci, L. T.

    2011-01-01

Micrometeorites ~100 µm in diameter deliver most of the Earth's annual accumulation of extraterrestrial material. These small particles are so strongly heated upon atmospheric entry that most of their volatile content is vaporized. Here we present preliminary results from two sets of experiments to investigate the fate of the organic fraction of micrometeorites. In the first set of experiments, 300 µm particles of a CM carbonaceous chondrite were subjected to flash pyrolysis, simulating atmospheric entry. In addition to CO and CO2, many organic compounds were released, including functionalized benzenes, hydrocarbons, and small polycyclic aromatic hydrocarbons. In the second set of experiments, we subjected two of these compounds to conditions that simulate the heterogeneous chemistry of Earth's upper atmosphere. We find evidence that meteor-derived compounds can follow reaction pathways leading to the formation of more complex organic compounds.

  8. The higher infinite large cardinals in set theory from their beginnings

    CERN Document Server

    Kanamori, Akihiro

    2003-01-01

The theory of large cardinals is currently a broad mainstream of modern set theory, the main area of investigation for the analysis of the relative consistency of mathematical propositions and possible new axioms for mathematics. The first of a projected multi-volume series, this book provides a comprehensive account of the theory of large cardinals from its beginnings, along with some of the direct outgrowths leading to the frontiers of contemporary research. A "genetic" approach is taken, presenting the subject in the context of its historical development. With hindsight, the consequential avenues are pursued and the most elegant or accessible expositions given. With open questions and speculations provided throughout, the reader should not only come to appreciate the scope and coherence of the overall enterprise but also become prepared to pursue research in several specific areas by studying the relevant sections.

  9. Reverse bifurcation and fractal of the compound logistic map

    Science.gov (United States)

    Wang, Xingyuan; Liang, Qingyong

    2008-07-01

The nature of the fixed points of the compound logistic map is investigated, and the boundary equation of the first bifurcation of the map in parameter space is given. Using quantitative criteria and rules for chaotic systems, the paper reveals the general features of the compound logistic map in its transition from regularity to chaos, with the following conclusions: (1) chaotic patterns of the map may emerge out of double-periodic bifurcation, and (2) chaotic crisis phenomena and reverse bifurcation are found. At the same time, we analyze the orbit of the critical point of the compound logistic map and put forward a definition of the Mandelbrot-Julia set of the compound logistic map. We generalize Welstead and Cromer's periodic scanning technology and use it to construct a series of Mandelbrot-Julia sets of the compound logistic map. We investigate the symmetry of the Mandelbrot-Julia set and study the topological inflexibility of the distribution of period regions in the Mandelbrot set, finding that the Mandelbrot set contains abundant information on the structure of Julia sets, by qualitatively constructing the whole portrait of Julia sets based on the Mandelbrot set.
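An escape-time iteration is the standard way such Mandelbrot-Julia sets are rendered; the sketch below uses the single complex logistic map z → cz(1−z) for brevity rather than the paper's compound (composed) map:

```python
import numpy as np

def escape_time(c, z0=0.5 + 0j, max_iter=60, bound=4.0):
    """Iterations before |z| exceeds the bound under z -> c*z*(1-z);
    points that never escape are assigned max_iter (inside the set)."""
    z = z0
    for n in range(max_iter):
        if abs(z) > bound:
            return n
        z = c * z * (1 - z)
    return max_iter

# Coarse Mandelbrot-style scan of the parameter plane, iterating the
# critical point z0 = 1/2 for each parameter value c
grid = [[escape_time(x + 1j * y) for x in np.linspace(-1, 4, 60)]
        for y in np.linspace(-2, 2, 40)]
```

Fixing c and scanning z0 instead yields the corresponding Julia set; the paper's periodic-scanning construction refines this by tracking the period of the attracting orbit rather than just escape.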

  10. Organic electronic devices using phthalimide compounds

    Science.gov (United States)

    Hassan, Azad M.; Thompson, Mark E.

    2010-09-07

    Organic electronic devices comprising a phthalimide compound. The phthalimide compounds disclosed herein are electron transporters with large HOMO-LUMO gaps, high triplet energies, large reduction potentials, and/or thermal and chemical stability. As such, these phthalimide compounds are suitable for use in any of various organic electronic devices, such as OLEDs and solar cells. In an OLED, the phthalimide compounds may serve various functions, such as a host in the emissive layer, as a hole blocking material, or as an electron transport material. In a solar cell, the phthalimide compounds may serve various functions, such as an exciton blocking material. Various examples of phthalimide compounds which may be suitable for use in the present invention are disclosed.

  11. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    Science.gov (United States)

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modification of the original boosting algorithm could be run in 1% of the time used with the original algorithm and with negligible differences in accuracy
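A compact sketch of the "random boosting" idea, one-marker weak learners fitted sequentially to residuals with shrinkage, each chosen from a random subset of markers (all sizes and data below are synthetic, not the 1,797-bull data set):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 500, 200                                       # animals x SNP markers
X = rng.integers(0, 3, size=(n, p)).astype(float)     # genotypes coded 0/1/2
beta = np.zeros(p)
beta[:5] = [1.0, -0.8, 0.6, 0.5, -0.4]                # a few causal markers
y = X @ beta + rng.normal(scale=1.0, size=n)

def random_boost(X, y, rounds=200, shrink=0.1, m=20):
    """Each round fits a one-marker least-squares weak learner chosen
    from a random subset of m markers and adds it with shrinkage."""
    pred = np.full(len(y), y.mean())
    resid = y - pred
    for _ in range(rounds):
        cand = rng.choice(X.shape[1], m, replace=False)
        # candidate marker best correlated with the current residuals
        j = cand[np.argmax([abs(np.corrcoef(X[:, c], resid)[0, 1])
                            for c in cand])]
        b = np.polyfit(X[:, j], resid, 1)             # slope, intercept
        step = np.polyval(b, X[:, j])
        pred += shrink * step
        resid -= shrink * step
    return pred

pred = random_boost(X, y)
```

Restricting each round to a random marker subset is what cuts the per-round search cost relative to the original boosting algorithm, at little loss in accuracy.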

  12. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    Science.gov (United States)

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    Modern animal breeding data sets are large and getting larger, due in part to recent availability of high-density SNP arrays and cheap sequencing technology. High-performance computing methods for efficient data warehousing and analysis are under development. Financial and security considerations are important when using shared clusters. Sound software engineering practices are needed, and it is better to use existing solutions when possible. Storage requirements for genotypes are modest, although full-sequence data will require greater storage capacity. Storage requirements for intermediate and results files for genetic evaluations are much greater, particularly when multiple runs must be stored for research and validation studies. The greatest gains in accuracy from genomic selection have been realized for traits of low heritability, and there is increasing interest in new health and management traits. The collection of sufficient phenotypes to produce accurate evaluations may take many years, and high-reliability proofs for older bulls are needed to estimate marker effects. Data mining algorithms applied to large data sets may help identify unexpected relationships in the data, and improved visualization tools will provide insights. Genomic selection using large data requires a lot of computing power, particularly when large fractions of the population are genotyped. Theoretical improvements have made possible the inversion of large numerator relationship matrices, permitted the solving of large systems of equations, and produced fast algorithms for variance component estimation. Recent work shows that single-step approaches combining BLUP with a genomic relationship (G) matrix have similar computational requirements to traditional BLUP, and the limiting factor is the construction and inversion of G for many genotypes. 
A naïve algorithm for creating G for 14,000 individuals required almost 24 h to run, but custom libraries and parallel computing reduced that to
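Construction and inversion of a genomic relationship matrix G, the limiting step named above, can be sketched as follows (VanRaden-style centering on a synthetic genotype matrix; the identity-blending constant is a common convention, not taken from this abstract):

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic genotype matrix: 100 animals x 500 SNPs, coded 0/1/2
M = rng.integers(0, 3, size=(100, 500)).astype(float)

p = M.mean(axis=0) / 2.0                 # observed allele frequencies
Z = M - 2.0 * p                          # center by twice the frequency
denom = 2.0 * np.sum(p * (1.0 - p))
G = Z @ Z.T / denom                      # genomic relationship matrix

# Blend with the identity so G is safely invertible, as is commonly
# done before combining with pedigree information in single-step BLUP
G_star = 0.95 * G + 0.05 * np.eye(len(G))
G_inv = np.linalg.inv(G_star)
```

G grows with the square of the number of genotyped animals, which is why custom linear-algebra libraries and parallelism pay off so dramatically at the 14,000-individual scale mentioned above.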

  13. Acid/base bifunctional carbonaceous nanomaterial with large surface area: Preparation, characterization, and adsorption properties for cationic and anionic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kai; Ma, Chun–Fang; Ling, Yuan; Li, Meng [Department of Chemistry, Faculty of Material Science and Chemistry, China University of Geosciences, Wuhan 430074 (China); Gao, Qiang, E-mail: gaoqiang@cug.edu.cn [Department of Chemistry, Faculty of Material Science and Chemistry, China University of Geosciences, Wuhan 430074 (China); Engineering Research Center of Nano-Geo Materials of Ministry of Education, China University of Geosciences, Wuhan 430074 (China); Luo, Wen–Jun, E-mail: heartnohome@yahoo.com.cn [Department of Chemistry, Faculty of Material Science and Chemistry, China University of Geosciences, Wuhan 430074 (China)

    2015-07-15

Nanostructured carbonaceous materials are extremely important in the nano field, yet developing simple, mild, and "green" methods that give such materials both a large surface area and rich surface functional groups remains a considerable challenge. Herein, a one-pot and environment-friendly method, i.e., thermal treatment (180 °C; 18 h) of water mixed with glucose and chitosan (CTS), has been proposed. The resultant carbonaceous nanomaterials were characterized by field-emission scanning electron microscopy, N₂ adsorption/desorption, Fourier transform infrared spectroscopy, X-ray photoelectron spectroscopy, and zeta-potential analysis. It was found that, in contrast to the conventional hydrothermally carbonized product from pure glucose, with low surface area (9.3 m² g⁻¹) and pore volume (0.016 cm³ g⁻¹), the CTS-added carbonaceous products showed satisfactory textural parameters (surface area and pore volume up to 254 m² g⁻¹ and 0.701 cm³ g⁻¹, respectively). Moreover, these CTS-added carbonaceous products were also found to possess both acidic (–COOH) and basic (–NH₂) groups on their surfaces. Taking advantage of the large surface area and the –COOH/–NH₂ bifunctional surface, the carbonaceous nanomaterials exhibited excellent performance in adsorbing a cationic compound (i.e., methylene blue) at pH 10 and an anionic compound (i.e., acid red 18) at pH 2. This work not only provides a simple and green route to prepare acid/base bifunctional carbonaceous nanomaterials with large surface area but also demonstrates their potential for application in adsorption. - Highlights: • A simple and green method was proposed to prepare carbon nanomaterials. • The carbon product showed an acid/base bifunctional surface with large surface area. • The carbon material could efficiently adsorb both cationic and anionic compounds.

  14. Compound Option Pricing under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Xiandong Wang

    2014-01-01

    Full Text Available Considering that the uncertainty of a financial market includes two aspects, risk and vagueness, in this paper fuzzy set theory is applied to model the imprecise input parameters (interest rate and volatility). We present the fuzzy price of a compound option by fuzzifying the interest rate and volatility in Geske’s compound option pricing formula. For each α, the α-level set of fuzzy prices is obtained according to fuzzy arithmetic and the definition of a fuzzy-valued function. We apply a defuzzification method based on the crisp possibilistic mean values of the fuzzy interest rate and fuzzy volatility to obtain the crisp possibilistic mean value of the compound option price. Finally, we present a numerical analysis to illustrate compound option pricing under a fuzzy environment.
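
    The α-cut propagation described above can be sketched numerically. As a simplified stand-in for Geske's compound option formula (whose bivariate normal terms are omitted here), the sketch below pushes α-level intervals of a triangular fuzzy interest rate and volatility through a plain Black-Scholes call; the function names and triangular parameterization are illustrative assumptions, not the paper's notation.

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function (no SciPy needed)
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    # plain Black-Scholes European call, used here as a stand-in for
    # Geske's (more involved) compound option formula
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def alpha_cut(tri, alpha):
    # alpha-level interval of a triangular fuzzy number (a, b, c)
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def fuzzy_call_interval(S, K, fuzzy_r, fuzzy_sigma, T, alpha):
    # the call price is increasing in both r and sigma, so the interval
    # endpoints of the inputs map directly to the price interval endpoints
    r_lo, r_hi = alpha_cut(fuzzy_r, alpha)
    s_lo, s_hi = alpha_cut(fuzzy_sigma, alpha)
    return (bs_call(S, K, r_lo, s_lo, T), bs_call(S, K, r_hi, s_hi, T))
```

    At α = 1 the interval collapses to the crisp price; lowering α widens the band, mirroring how the α-level sets in the paper nest inside one another.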

  15. [Indoor air pollution by volatile organic compounds in large buildings: pollution levels and remaining issues after revision of the Act on Maintenance of Sanitation in Buildings in 2002].

    Science.gov (United States)

    Sakai, Kiyoshi; Kamijima, Michihiro; Shibata, Eiji; Ohno, Hiroyuki; Nakajima, Tamie

    2010-09-01

    This study aimed to clarify indoor air pollution levels of volatile organic compounds (VOCs), especially 2-ethyl-1-hexanol (2E1H), in large buildings after revision of the Act on Maintenance of Sanitation in Buildings in 2002. We measured indoor air VOC concentrations in 57 (97%) of a total of 61 large buildings completed within one year in half of the area of Nagoya, Japan, from 2003 through 2007. Airborne concentrations of 13 carbonyl compounds were determined with diffusion samplers and high-performance liquid chromatography, and those of the other 32 VOCs with diffusion samplers and gas chromatography with a mass spectrometer. Formaldehyde was detected in all indoor air samples, but the concentrations were lower than the indoor air quality standard value set in Japan (100 microg/m3). Geometric mean concentrations of the other major VOCs, namely toluene, xylene, ethylbenzene, styrene, p-dichlorobenzene and acetaldehyde, were also low. 2E1H was found to be one of the predominant VOCs in the indoor air of large buildings. A few rooms in a small number of the buildings surveyed showed high concentrations of 2E1H, while low concentrations were observed in most rooms of those buildings as well as in other buildings. It was estimated that about 310 buildings in Japan had high indoor air pollution levels of 2E1H, with an increase over the 5 years from 2003. Indoor air pollution levels of VOCs in new large buildings are generally good, although a few rooms in a small number of buildings showed high concentrations of 2E1H, a possible causative chemical in sick building symptoms. Therefore, 2E1H needs particular attention as an important indoor air pollutant.

  16. An effective filter for IBD detection in large data sets.

    KAUST Repository

    Huang, Lin

    2014-03-25

    Identity by descent (IBD) inference is the task of computationally detecting genomic segments that are shared between individuals by means of common familial descent. Accurate IBD detection plays an important role in various genomic studies, ranging from mapping disease genes to exploring ancient population histories. The majority of recent work in the field has focused on improving the accuracy of inference, targeting shorter genomic segments that originate from a more ancient common ancestor. The accuracy of these methods, however, is achieved at the expense of high computational cost, resulting in a prohibitively long running time when applied to large cohorts. To enable the study of large cohorts, we introduce SpeeDB, a method that facilitates fast IBD detection in large unphased genotype data sets. Given a target individual and a database of individuals that potentially share IBD segments with the target, SpeeDB applies an efficient opposite-homozygous filter, which excludes chromosomal segments from the database that are highly unlikely to be IBD with the corresponding segments from the target individual. The remaining segments can then be evaluated by any IBD detection method of choice. When examining simulated individuals sharing 4 cM IBD regions, SpeeDB filtered out 99.5% of genomic regions from consideration while retaining 99% of the true IBD segments. Applying the SpeeDB filter prior to detecting IBD in simulated fourth cousins resulted in an overall running time that was 10,000x faster than inferring IBD without the filter and retained 99% of the true IBD segments in the output.
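
    The opposite-homozygous idea is simple to sketch: at any site where the target is homozygous for one allele and the database individual is homozygous for the other, the pair cannot share a haplotype there, so windows containing such a site can be discarded before running a full IBD method. A minimal sketch (the genotype encoding and window size are illustrative assumptions, not SpeeDB's actual implementation):

```python
def opposite_homozygous(g1, g2):
    # genotypes coded as alt-allele counts: 0 (hom ref), 1 (het), 2 (hom alt);
    # opposite homozygotes (0 vs 2) exclude shared descent at that site
    return (g1 == 0 and g2 == 2) or (g1 == 2 and g2 == 0)

def candidate_windows(target, other, window=50):
    """Yield (start, end) index windows with no opposite-homozygous site,
    i.e. windows that could still harbour an IBD segment and therefore
    get passed on to an exact IBD detection method."""
    assert len(target) == len(other)
    for start in range(0, len(target), window):
        end = min(start + window, len(target))
        if not any(opposite_homozygous(a, b)
                   for a, b in zip(target[start:end], other[start:end])):
            yield (start, end)
```

    Because the filter only ever discards windows that provably cannot be IBD, it trades no recall for the large speedup reported above (beyond sites affected by genotyping error, which the real method must tolerate).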

  17. An effective filter for IBD detection in large data sets.

    KAUST Repository

    Huang, Lin; Bercovici, Sivan; Rodriguez, Jesse M; Batzoglou, Serafim

    2014-01-01

    Identity by descent (IBD) inference is the task of computationally detecting genomic segments that are shared between individuals by means of common familial descent. Accurate IBD detection plays an important role in various genomic studies, ranging from mapping disease genes to exploring ancient population histories. The majority of recent work in the field has focused on improving the accuracy of inference, targeting shorter genomic segments that originate from a more ancient common ancestor. The accuracy of these methods, however, is achieved at the expense of high computational cost, resulting in a prohibitively long running time when applied to large cohorts. To enable the study of large cohorts, we introduce SpeeDB, a method that facilitates fast IBD detection in large unphased genotype data sets. Given a target individual and a database of individuals that potentially share IBD segments with the target, SpeeDB applies an efficient opposite-homozygous filter, which excludes chromosomal segments from the database that are highly unlikely to be IBD with the corresponding segments from the target individual. The remaining segments can then be evaluated by any IBD detection method of choice. When examining simulated individuals sharing 4 cM IBD regions, SpeeDB filtered out 99.5% of genomic regions from consideration while retaining 99% of the true IBD segments. Applying the SpeeDB filter prior to detecting IBD in simulated fourth cousins resulted in an overall running time that was 10,000x faster than inferring IBD without the filter and retained 99% of the true IBD segments in the output.

  18. Large Pelagic Logbook Set Survey (Vessels)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains catch and effort for fishing trips that are taken by vessels with a Federal permit issued for the swordfish and sharks under the Highly...

  19. Manufacturing and Installation of the Compound Cryogenic Distribution Line for the Large Hadron Collider

    CERN Document Server

    Riddone, G; Bouillot, A; Brodzinski, K; Dupont, M; Fathallah, M; Fournel, JL; Gitton, E; Junker, S; Moussavi, H; Parente, C

    2007-01-01

    The Large Hadron Collider (LHC) [1], currently under construction at CERN, will make use of superconducting magnets operating in superfluid helium below 2 K. A compound cryogenic distribution line (QRL) will feed the local elementary cooling loops in the cryomagnet strings with helium at different temperatures and pressures. Low heat inleak at all temperature levels is essential for the overall LHC cryogenic performance. Following a competitive tendering, CERN adjudicated the contract for the series line to Air Liquide (France) in 2001. This paper recalls the main features of the technical specification and shows the project status. The basic choices and achievements of the industrialization phase of the series production are also presented, as well as the installation issues and status.

  20. Multi-view 3D echocardiography compounding based on feature consistency

    Science.gov (United States)

    Yao, Cheng; Simpson, John M.; Schaeffter, Tobias; Penney, Graeme P.

    2011-09-01

    Echocardiography (echo) is a widely available method to obtain images of the heart; however, echo can suffer due to the presence of artefacts, high noise and a restricted field of view. One method to overcome these limitations is to use multiple images, using the 'best' parts from each image to produce a higher quality 'compounded' image. This paper describes our compounding algorithm which specifically aims to reduce the effect of echo artefacts as well as improving the signal-to-noise ratio, contrast and extending the field of view. Our method weights image information based on a local feature coherence/consistency between all the overlapping images. Validation has been carried out using phantom, volunteer and patient datasets consisting of up to ten multi-view 3D images. Multiple sets of phantom images were acquired, some directly from the phantom surface, and others by imaging through hard and soft tissue mimicking material to degrade the image quality. Our compounding method is compared to the original, uncompounded echocardiography images, and to two basic statistical compounding methods (mean and maximum). Results show that our method is able to take a set of ten images, degraded by soft and hard tissue artefacts, and produce a compounded image of equivalent quality to images acquired directly from the phantom. Our method on phantom, volunteer and patient data achieves almost the same signal-to-noise improvement as the mean method, while simultaneously almost achieving the same contrast improvement as the maximum method. We show a statistically significant improvement in image quality by using an increased number of images (ten compared to five), and visual inspection studies by three clinicians showed very strong preference for our compounded volumes in terms of overall high image quality, large field of view, high endocardial border definition and low cavity noise.
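
    The two baseline strategies the paper compares against (per-voxel mean and maximum) are easy to sketch, together with a toy consistency weighting that favours values agreeing across overlapping views. The weighting rule below (inverse distance to the per-pixel median) is an illustrative stand-in for the authors' feature-coherence measure, not their actual algorithm.

```python
def _pixelwise(images, fn):
    # apply fn to the list of co-located pixel values from every view
    rows, cols = len(images[0]), len(images[0][0])
    return [[fn([img[i][j] for img in images]) for j in range(cols)]
            for i in range(rows)]

def compound_mean(images):
    # averaging improves signal-to-noise but blurs strong features
    return _pixelwise(images, lambda v: sum(v) / len(v))

def compound_max(images):
    # maximum preserves contrast but also preserves artefacts
    return _pixelwise(images, max)

def compound_weighted(images, eps=1e-6):
    # toy consistency weighting: values close to the per-pixel median
    # (i.e. consistent across views) dominate; outliers are down-weighted
    def weighted(vals):
        med = sorted(vals)[len(vals) // 2]
        w = [1.0 / (abs(v - med) + eps) for v in vals]
        return sum(wi * v for wi, v in zip(w, vals)) / sum(w)
    return _pixelwise(images, weighted)
```

    On a pixel where one view shows an artefact (e.g. values 1, 3, 1 across three views), the mean is pulled to 5/3 and the max to 3, while the consistency-weighted value stays near the agreeing views' value of 1, which is the behaviour the paper's compounding aims for.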

  1. Multi-view 3D echocardiography compounding based on feature consistency

    International Nuclear Information System (INIS)

    Yao Cheng; Schaeffter, Tobias; Penney, Graeme P; Simpson, John M

    2011-01-01

    Echocardiography (echo) is a widely available method to obtain images of the heart; however, echo can suffer due to the presence of artefacts, high noise and a restricted field of view. One method to overcome these limitations is to use multiple images, using the 'best' parts from each image to produce a higher quality 'compounded' image. This paper describes our compounding algorithm which specifically aims to reduce the effect of echo artefacts as well as improving the signal-to-noise ratio, contrast and extending the field of view. Our method weights image information based on a local feature coherence/consistency between all the overlapping images. Validation has been carried out using phantom, volunteer and patient datasets consisting of up to ten multi-view 3D images. Multiple sets of phantom images were acquired, some directly from the phantom surface, and others by imaging through hard and soft tissue mimicking material to degrade the image quality. Our compounding method is compared to the original, uncompounded echocardiography images, and to two basic statistical compounding methods (mean and maximum). Results show that our method is able to take a set of ten images, degraded by soft and hard tissue artefacts, and produce a compounded image of equivalent quality to images acquired directly from the phantom. Our method on phantom, volunteer and patient data achieves almost the same signal-to-noise improvement as the mean method, while simultaneously almost achieving the same contrast improvement as the maximum method. We show a statistically significant improvement in image quality by using an increased number of images (ten compared to five), and visual inspection studies by three clinicians showed very strong preference for our compounded volumes in terms of overall high image quality, large field of view, high endocardial border definition and low cavity noise.

  2. A summarization approach for Affymetrix GeneChip data using a reference training set from a large, biologically diverse database

    Directory of Open Access Journals (Sweden)

    Tripputi Mark

    2006-10-01

    Full Text Available Abstract Background Many of the most popular pre-processing methods for Affymetrix expression arrays, such as RMA, gcRMA, and PLIER, simultaneously analyze data across a set of predetermined arrays to improve precision of the final measures of expression. One problem associated with these algorithms is that expression measurements for a particular sample are highly dependent on the set of samples used for normalization and results obtained by normalization with a different set may not be comparable. A related problem is that an organization producing and/or storing large amounts of data in a sequential fashion will need to either re-run the pre-processing algorithm every time an array is added or store them in batches that are pre-processed together. Furthermore, pre-processing of large numbers of arrays requires loading all the feature-level data into memory which is a difficult task even with modern computers. We utilize a scheme that produces all the information necessary for pre-processing using a very large training set that can be used for summarization of samples outside of the training set. All subsequent pre-processing tasks can be done on an individual array basis. We demonstrate the utility of this approach by defining a new version of the Robust Multi-chip Averaging (RMA algorithm which we refer to as refRMA. Results We assess performance based on multiple sets of samples processed over HG U133A Affymetrix GeneChip® arrays. We show that the refRMA workflow, when used in conjunction with a large, biologically diverse training set, results in the same general characteristics as that of RMA in its classic form when comparing overall data structure, sample-to-sample correlation, and variation. Further, we demonstrate that the refRMA workflow and reference set can be robustly applied to naïve organ types and to benchmark data where its performance indicates respectable results. 
Conclusion Our results indicate that a biologically diverse

  3. Compounding around the world.

    Science.gov (United States)

    Vail, Jane

    2008-01-01

    Pharmaceutical compounding is universal in its prevalence. Variations in disease patterns, culture, and tradition; the role of government in health care; and the availability of essential equipment and required agents shape a compounding profile unique to each country worldwide. In the following reflections, pharmacists from Argentina, Belgium, Colombia, Germany, Puerto Rico, Spain, and the United States describe their experiences in the compounding setting unique to their practice and their nation. The unifying theme in their comments is the dedication of each contributor to enabling recovery and ensuring the good health of his or her clients.

  4. Multiple testing issues in discriminating compound-related peaks and chromatograms from high frequency noise, spikes and solvent-based noise in LC-MS data sets

    NARCIS (Netherlands)

    Nyangoma, S.O.; Van Kampen, A.A.; Reijmers, T.H.; Govorukhina, N.I; van der Zee, A.G.; Billingham, I.J; Bischoff, Rainer; Jansen, R.C.

    2007-01-01

    Multiple testing issues in discriminating compound-related peaks and chromatograms from high frequency noise, spikes and solvent-based noise in LC-MS data sets.Nyangoma SO, van Kampen AA, Reijmers TH, Govorukhina NI, van der Zee AG, Billingham LJ, Bischoff R, Jansen RC. University of Birmingham.

  5. Bayesian screening for active compounds in high-dimensional chemical spaces combining property descriptors and molecular fingerprints.

    Science.gov (United States)

    Vogt, Martin; Bajorath, Jürgen

    2008-01-01

    Bayesian classifiers are increasingly being used to distinguish active from inactive compounds and search large databases for novel active molecules. We introduce an approach to directly combine the contributions of property descriptors and molecular fingerprints in the search for active compounds that is based on a Bayesian framework. Conventionally, property descriptors and fingerprints are used as alternative features for virtual screening methods. Following the approach introduced here, probability distributions of descriptor values and fingerprint bit settings are calculated for active and database molecules and the divergence between the resulting combined distributions is determined as a measure of biological activity. In test calculations on a large number of compound activity classes, this methodology was found to consistently perform better than similarity searching using fingerprints and multiple reference compounds or Bayesian screening calculations using probability distributions calculated only from property descriptors. These findings demonstrate that there is considerable synergy between different types of property descriptors and fingerprints in recognizing diverse structure-activity relationships, at least in the context of Bayesian modeling.
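
    The descriptor/fingerprint combination itself is specific to the paper, but the Bayesian treatment of fingerprint bit settings can be sketched as a naive Bayes log-odds sum over smoothed per-bit frequencies estimated from the active and database compounds. The encoding (fingerprints as lists of set-bit indices) and the Laplace smoothing constant are illustrative assumptions.

```python
import math

def bit_frequencies(fps, n_bits, pseudo=1.0):
    # Laplace-smoothed probability that each bit is set, estimated from
    # a collection of fingerprints given as lists of set-bit indices
    counts = [pseudo] * n_bits
    for fp in fps:
        for b in fp:
            counts[b] += 1
    total = len(fps) + 2 * pseudo
    return [c / total for c in counts]

def log_odds_score(fp, p_active, p_db, n_bits):
    # naive Bayes: sum per-bit log-likelihood ratios of "active" vs
    # "database" bit-setting distributions; higher means more active-like
    score = 0.0
    on = set(fp)
    for i in range(n_bits):
        if i in on:
            score += math.log(p_active[i] / p_db[i])
        else:
            score += math.log((1 - p_active[i]) / (1 - p_db[i]))
    return score
```

    Ranking a database by this score and taking the top fraction is the virtual-screening step; the paper's contribution is to build the distributions jointly over fingerprint bits and binned property descriptors rather than over bits alone.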

  6. Organolanthanoid compounds

    International Nuclear Information System (INIS)

    Schumann, H.

    1984-01-01

    Up to little more than a decade ago organolanthanoid compounds were still a curiosity. Apart from the description of an isolated number of cyclopentadienyl and indenyl derivatives, very few significant contributions had been made to this interesting sector of organometallic chemistry. However, subsequent systematic studies using modern preparative and analytical techniques, together with X-ray single crystal structure determinations, enabled the isolation and characterization of a large number of very interesting homoleptic and heteroleptic compounds in which the lanthanoid is bound to hydrogen, to substituted or unsubstituted cyclopentadienyl groups, to allyl or alkynyl groups, or even to phosphorus ylides, trimethylsilyl, and carbonylmetal groups. These compounds, which are all extremely sensitive to oxygen and water, open up new possibilities in the field of catalysis and have great potential in organic synthesis - as recent studies with pentamethylcyclopentadienyl derivatives, organolanthanoid(II) compounds, and hexamethyllanthanoid complexes have already shown. (orig.) [de

  7. History of sterile compounding in U.S. hospitals: learning from the tragic lessons of the past.

    Science.gov (United States)

    Myers, Charles E

    2013-08-15

    The evolution of sterile compounding in the context of hospital patient care, the evolution of related technology, past incidents of morbidity and mortality associated with preparations compounded in various settings, and efforts over the years to improve compounding practices are reviewed. Tightened United States Pharmacopeial Convention standards (since 2004) for sterile compounding made it difficult for hospitals to achieve all of the sterile compounding necessary for patient care. Shortages of manufactured injections added to the need for compounding. Non-hospital-based compounding pharmacies increased sterile compounding to meet the needs. Gaps in federal and state laws and regulations about compounding pharmacies led to deficiencies in their regulation. Lapses in sterility led to injuries and deaths. Perspectives offered include potential actions, including changes in practitioner education, better surveillance of sterile compounding, regulatory reforms, reexamination of the causes of drug shortages, and the development of new technologies. Over the years, there have been numerous exhortations for voluntary better performance in sterile compounding. In addition, professional leadership has been vigorous and extensive in the form of guidance, publications, education, enforceable standards, and development of various associations and organizations dealing with safe compounding practices. Yet problems continue to occur. We must engage in diligent learning from the injuries and tragedies that have occurred. Assuming that we are already doing all we can to avoid problems would be an abdication of the professional mission of pharmacists. It would be wrong thinking to assume that the recent problems in large-scale compounding pharmacies are the only problems that warrant attention. It is time for a systematic assessment of the nature and the dimensions of the problems in every type of setting where sterile compounding occurs. It also is time for some innovative

  8. Analyzing large data sets from XGC1 magnetic fusion simulations using apache spark

    Energy Technology Data Exchange (ETDEWEB)

    Churchill, R. Michael [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)

    2016-11-21

    Apache Spark is explored as a tool for analyzing large data sets from the magnetic fusion simulation code XGC1. Implementation details of Apache Spark on the NERSC Edison supercomputer are discussed, including binary file reading and parameter setup. Here, an unsupervised machine learning algorithm, k-means clustering, is applied to XGC1 particle distribution function data, showing that highly turbulent spatial regions do not have common coherent structures, but rather broad, ring-like structures in velocity space.
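
    At NERSC scale the clustering would run through Spark's MLlib, but the k-means step itself reduces to Lloyd's algorithm. A self-contained sketch on small 2-D points follows; the naive deterministic initialization (first k points) is a simplification for illustration, not what MLlib uses.

```python
def kmeans(points, k, iters=20):
    """Minimal Lloyd's algorithm; points are tuples of floats."""
    centers = list(points[:k])  # naive deterministic initialization
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared Euclidean)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        for j, members in enumerate(clusters):
            if members:  # recompute center as the cluster mean
                centers[j] = tuple(sum(xs) / len(members)
                                   for xs in zip(*members))
    return centers
```

    Applied to velocity-space distribution function samples, cluster centers like these are what reveal whether regions share coherent structures or only broad, ring-like ones.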

  9. Octopus: a platform for the virtual high-throughput screening of a pool of compounds against a set of molecular targets.

    Science.gov (United States)

    Maia, Eduardo Habib Bechelane; Campos, Vinícius Alves; Dos Reis Santos, Bianca; Costa, Marina Santos; Lima, Iann Gabriel; Greco, Sandro J; Ribeiro, Rosy I M A; Munayer, Felipe M; da Silva, Alisson Marques; Taranto, Alex Gutterres

    2017-01-01

    Octopus is an automated workflow management tool that is scalable for virtual high-throughput screening (vHTS). It integrates MOPAC2016, MGLTools, PyMOL, and AutoDock Vina. In contrast to other platforms, Octopus can perform docking simulations of an unlimited number of compounds into a set of molecular targets. After generating the ligands in a drawing package in the Protein Data Bank (PDB) format, Octopus can carry out geometry refinement using the semi-empirical method PM7 implemented in MOPAC2016. Docking simulations can be performed using AutoDock Vina and can utilize the Our Own Molecular Targets (OOMT) databank. Finally, the proposed software compiles the best binding energies into a standard table. Here, we describe two successful case studies that were verified by biological assay. In the first case study, the vHTS process was carried out for 22 (phenylamino)urea derivatives. The vHTS process identified a metalloprotease with the PDB code 1GKC as a molecular target for derivative LE&007. In a biological assay, compound LE&007 was found to inhibit 80% of the activity of this enzyme. In the second case study, compound Tx001 was submitted to the Octopus routine, and the results suggested Plasmodium falciparum ATP6 (PfATP6) as a molecular target for this compound. Following an antimalarial assay, Tx001 was found to have an inhibitory concentration (IC50) of 8.2 μM against PfATP6. These successful examples illustrate the utility of this software for finding appropriate molecular targets for compounds. Hits can then be identified and optimized as new antineoplastic and antimalarial drugs. Finally, Octopus has a friendly Linux-based user interface, and is available at www.drugdiscovery.com.br. Graphical Abstract Octopus: A platform for inverse virtual screening (IVS) to search new molecular targets for drugs.
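
    The final compilation step (collecting the best binding energy per ligand-target pair into a ranked table) can be sketched as below; the energy values in the test data are dummy numbers for illustration, not results from the paper's AutoDock Vina runs.

```python
def rank_targets(energies):
    """energies: {ligand: {target: best_binding_energy_kcal_per_mol}}.
    Returns, per ligand, (target, energy) pairs sorted most-negative
    first, i.e. strongest predicted binder on top, mimicking the
    summary table an inverse virtual screening workflow compiles."""
    return {ligand: sorted(per_target.items(), key=lambda kv: kv[1])
            for ligand, per_target in energies.items()}
```

    In the Tx001 case study, for instance, PfATP6 would appear at the top of Tx001's row whenever its docking score is the most negative among the screened targets.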

  10. CUDA based Level Set Method for 3D Reconstruction of Fishes from Large Acoustic Data

    DEFF Research Database (Denmark)

    Sharma, Ojaswa; Anton, François

    2009-01-01

    Acoustic images present views of underwater dynamics, even in high depths. With multi-beam echo sounders (SONARs), it is possible to capture series of 2D high resolution acoustic images. 3D reconstruction of the water column and subsequent estimation of fish abundance and fish species identificat...... of suppressing threshold and show its convergence as the evolution proceeds. We also present a GPU based streaming computation of the method using NVIDIA's CUDA framework to handle large volume data-sets. Our implementation is optimised for memory usage to handle large volumes....

  11. Determination of the n-octanol/water partition coefficients of weakly ionizable basic compounds by reversed-phase high-performance liquid chromatography with neutral model compounds.

    Science.gov (United States)

    Liang, Chao; Han, Shu-ying; Qiao, Jun-qin; Lian, Hong-zhen; Ge, Xin

    2014-11-01

    A strategy to utilize neutral model compounds for lipophilicity measurement of ionizable basic compounds by reversed-phase high-performance liquid chromatography is proposed in this paper. The applicability of the novel protocol was justified by theoretical derivation. Meanwhile, the linear relationships between logarithm of apparent n-octanol/water partition coefficients (logKow '') and logarithm of retention factors corresponding to the 100% aqueous fraction of mobile phase (logkw ) were established for a basic training set, a neutral training set and a mixed training set of these two. As proved in theory, the good linearity and external validation results indicated that the logKow ''-logkw relationships obtained from a neutral model training set were always reliable regardless of mobile phase pH. Afterwards, the above relationships were adopted to determine the logKow of harmaline, a weakly dissociable alkaloid. As far as we know, this is the first report on experimental logKow data for harmaline (logKow = 2.28 ± 0.08). Introducing neutral compounds into a basic model training set or using neutral model compounds alone is recommended to measure the lipophilicity of weakly ionizable basic compounds especially those with high hydrophobicity for the advantages of more suitable model compound choices and convenient mobile phase pH control. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
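
    The logKow''-logkw relationship above is an ordinary least-squares line fitted on the training set and then applied to an analyte's measured logkw. A minimal sketch follows; the training values in the test are made up for illustration and are not the paper's data.

```python
def fit_line(xs, ys):
    # ordinary least squares for logKow = slope * logkw + intercept
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict(model, logkw):
    # estimate logKow of a new compound from its extrapolated logkw
    slope, intercept = model
    return slope * logkw + intercept
```

    The paper's point is about which compounds go into `xs`/`ys`: a neutral training set keeps the calibration line valid at any mobile phase pH, which a basic training set cannot guarantee.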

  12. Xenobiotic organic compounds in wastewater

    DEFF Research Database (Denmark)

    Eriksson, Eva; Baun, Anders; Henze, Mogens

    2002-01-01

    hundred of XOCs, among them mainly originating from hygiene products: chlorophenols, detergents and phthalates. Several compounds not deriving from hygiene products were also identified, e.g. flame retardants and drugs. An environmental hazard identification showed that a large number of compounds with high

  13. Molecular descriptor data explain market prices of a large commercial chemical compound library

    Science.gov (United States)

    Polanski, Jaroslaw; Kucia, Urszula; Duszkiewicz, Roksana; Kurczyk, Agata; Magdziarz, Tomasz; Gasteiger, Johann

    2016-06-01

    The relationship between the structure and a property of a chemical compound is an essential concept in chemistry, guiding, for example, drug design. In reality, however, we also need economic considerations to fully understand the fate of drugs on the market. Here we perform, for the first time, an exploration of quantitative structure-economy relationships (QSER) for a large dataset: a commercial building-block library of over 2.2 million chemicals. This investigation provided molecular statistics showing that, on average, what we are paying for is the quantity of matter. On the other hand, the influence of synthetic availability scores is also revealed. Finally, we buy substances by looking at molecular graphs or molecular formulas; thus, molecules with a higher number of atoms look more attractive and are, on average, also more expensive. Our study shows how data binning can be used as an informative method when analyzing big data in chemistry.

  14. Radioactive decay and labeled compounds

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter on radioactive decay and labeled compounds contains numerous in-text equations and worked sample problems. Topics covered include: terms and mathematics of radioactive decay; examples of calculations; graphs of decay equations; radioactivity or activity; activity measurements; activity decay; half-life determinations; and labeled compounds. A 20-problem set is also included. 1 ref., 4 figs., 1 tab
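
    The decay mathematics the chapter covers reduces to A(t) = A0 * exp(-λt) with λ = ln 2 / T_half. A minimal worked example (the nuclide and numbers are illustrative, not taken from the chapter's problem set):

```python
import math

def decay_constant(half_life):
    # lambda = ln(2) / T_half, in inverse units of half_life
    return math.log(2) / half_life

def activity(a0, half_life, t):
    # A(t) = A0 * exp(-lambda * t), equivalently A0 * 2**(-t / T_half)
    return a0 * math.exp(-decay_constant(half_life) * t)
```

    For example, a 100 MBq source with an 8-day half-life (roughly that of I-131) decays to 50 MBq after one half-life and 25 MBq after two.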

  15. Microbial production of volatile sulphur compounds in the large intestine of pigs fed two different diets.

    Science.gov (United States)

    Poulsen, H V; Jensen, B B; Finster, K; Spence, C; Whitehead, T R; Cotta, M A; Canibe, N

    2012-07-01

      To investigate the production of volatile sulphur compounds (VSC) in the segments of the large intestine of pigs and to assess the impact of diet on this production.   Pigs were fed two diets based on either wheat and barley (STD) or wheat and dried distillers grains with solubles (DDGS). Net production of VSC and potential sulphate reduction rate (SRR) (sulphate saturated) along the large intestine were determined by means of in vitro incubations. The net production rate of hydrogen sulphide and potential SRR increased from caecum towards distal colon and were significantly higher in the STD group. Conversely, the net methanethiol production rate was significantly higher in the DDGS group, while no difference was observed for dimethyl sulphide. The number of sulphate-reducing bacteria and total bacteria were determined by quantitative PCR and showed a significant increase along the large intestine, whereas no diet-related differences were observed.   VSC net production varies widely throughout the large intestine of pigs and the microbial processes involved in this production can be affected by diet.   This first report on intestinal production of all VSC shows both spatial and dietary effects, which are relevant to both bowel disease- and odour mitigation research. © 2012 The Authors. Journal of Applied Microbiology © 2012 The Society for Applied Microbiology.

  16. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki; Beyer, Johanna; Hadwiger, Markus; Blue, Rusty; Law, Charles; Vázquez Reina, Amelio; Reid, Rollie Clay; Lichtman, Jeff W M D; Pfister, Hanspeter

    2010-01-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  17. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki

    2010-05-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  18. Impact of problem-based learning in a large classroom setting: student perception and problem-solving skills.

    Science.gov (United States)

    Klegeris, Andis; Hurren, Heather

    2011-12-01

    Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better problem-solving skills and an increase in overall motivation. However, very little research has been done on the educational benefits of PBL in a large classroom setting. Here, we describe a PBL approach (using tutorless groups) that was introduced as a supplement to standard didactic lectures in University of British Columbia Okanagan undergraduate biochemistry classes consisting of 45-85 students. PBL was chosen as an effective method to assist students in learning biochemical and physiological processes. By monitoring student attendance and using informal and formal surveys, we demonstrated that PBL has a significant positive impact on student motivation to attend and participate in the course work. Student responses indicated that PBL is superior to traditional lecture format with regard to the understanding of course content and retention of information. We also demonstrated that student problem-solving skills are significantly improved, but additional controlled studies are needed to determine how much PBL exercises contribute to this improvement. These preliminary data indicated several positive outcomes of using PBL in a large classroom setting, although further studies aimed at assessing student learning are needed to further justify implementation of this technique in courses delivered to large undergraduate classes.

  19. MUSI: an integrated system for identifying multiple specificity from very large peptide or nucleic acid data sets.

    Science.gov (United States)

    Kim, Taehyung; Tyndel, Marc S; Huang, Haiming; Sidhu, Sachdev S; Bader, Gary D; Gfeller, David; Kim, Philip M

    2012-03-01

    Peptide recognition domains and transcription factors play crucial roles in cellular signaling. They bind linear stretches of amino acids or nucleotides, respectively, with high specificity. Experimental techniques that assess the binding specificity of these domains, such as microarrays or phage display, can retrieve thousands of distinct ligands, providing detailed insight into binding specificity. In particular, the advent of next-generation sequencing has recently increased the throughput of such methods by several orders of magnitude. These advances have helped reveal the presence of distinct binding specificity classes that co-exist within a set of ligands interacting with the same target. Here, we introduce a software system called MUSI that can rapidly analyze very large data sets of binding sequences to determine the relevant binding specificity patterns. Our pipeline provides two major advances. First, it can detect previously unrecognized multiple specificity patterns in any data set. Second, it offers integrated processing of very large data sets from next-generation sequencing machines. The results are visualized as multiple sequence logos describing the different binding preferences of the protein under investigation. We demonstrate the performance of MUSI by analyzing recent phage display data for human SH3 domains as well as microarray data for mouse transcription factors.

  20. Heat-capacity analysis of a large number of A15-type compounds

    International Nuclear Information System (INIS)

    Junod, A.; Jarlborg, T.; Muller, J.

    1983-01-01

    We analyze the low- and medium-temperature specific heat of 25 samples based on eleven A15 binary compounds, with T_c's ranging from less than 0.015 to 18 K. Experimentally determined "moments" of the phonon spectra (ω̄, ω̄_2, Θ_log) are included in the analysis. Values are tabulated for T_c, λ, η, ⟨ω²⟩, N_bs(E_F), Mω̄², H_c(0), and 2δ(0)/k_B T_c. We note the following: (i) The Debye temperature is generally a bad estimate of Θ_log. (ii) λ is governed mainly by the "electronic parameter" η; λ = 0.175η (eV/Å²) ± 0.2 for all A15 compounds studied. (iii) η is proportional to the density of states at the Fermi level, and this density of states agrees well with band-structure calculations of Jarlborg in Nb-based compounds. In V-based compounds, the observed bad correlation may reflect the presence of spin fluctuations. (iv) The values for the reduced gap 2δ(0)/k_B T_c range from 3.4 to 4.9 and are correlated with T_c/Θ_log

  1. High-Throughput Screening Using iPSC-Derived Neuronal Progenitors to Identify Compounds Counteracting Epigenetic Gene Silencing in Fragile X Syndrome.

    Science.gov (United States)

    Kaufmann, Markus; Schuffenhauer, Ansgar; Fruh, Isabelle; Klein, Jessica; Thiemeyer, Anke; Rigo, Pierre; Gomez-Mancilla, Baltazar; Heidinger-Millot, Valerie; Bouwmeester, Tewis; Schopfer, Ulrich; Mueller, Matthias; Fodor, Barna D; Cobos-Correa, Amanda

    2015-10-01

    Fragile X syndrome (FXS) is the most common form of inherited mental retardation, and in most cases it is caused by epigenetic silencing of the Fmr1 gene. Today, no specific therapy exists for FXS, and current treatments are directed only at improving behavioral symptoms. Neuronal progenitors derived from FXS patient induced pluripotent stem cells (iPSCs) represent a unique model to study the disease and develop assays for large-scale drug discovery screens, since they conserve the silenced Fmr1 gene within the disease context. We have established a high-content imaging assay to run a large-scale phenotypic screen aimed at identifying compounds that reactivate the silenced Fmr1 gene. A set of 50,000 compounds was tested, including modulators of several epigenetic targets. We describe an integrated drug discovery model comprising iPSC generation, culture scale-up, quality control and screening with a very sensitive high-content imaging assay assisted by single-cell image analysis and multiparametric data analysis based on machine learning algorithms. The screening identified several compounds that induced weak expression of fragile X mental retardation protein (FMRP) and thus lays the basis for further large-scale screens to find candidate drugs or targets tackling the underlying mechanism of FXS with potential for therapeutic intervention. © 2015 Society for Laboratory Automation and Screening.

  2. The Viking viewer for connectomics: scalable multi-user annotation and summarization of large volume data sets.

    Science.gov (United States)

    Anderson, J R; Mohammed, S; Grimm, B; Jones, B W; Koshevoy, P; Tasdizen, T; Whitaker, R; Marc, R E

    2011-01-01

    Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally, annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume, and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.

  3. Rare earths in uranium compounds and important evidences for nuclear forensic purposes

    International Nuclear Information System (INIS)

    Rosa, Daniele S.; Sarkis, Jorge E.S.

    2011-01-01

    Nuclear forensics mainly focuses on nuclear or radioactive material and aims at providing indications of the intended use, the history and even the origin of the material. Uranium compounds have isotopic or chemical characteristics that provide unambiguous information concerning their origin and production process. Rare earth elements (REE) are a set of sixteen chemical elements in the periodic table, specifically the fourteen lanthanides in addition to scandium and yttrium. These elements are often found together, but in widely variable concentrations, in uncommon varieties of igneous rocks. A large amount of uranium is found in rare earth deposits and may be extracted as a by-product. Accordingly, REE in uranium compounds can be used as evidence of uranium origin. In this study, REE were determined in uranium compounds of different origin. Measurements were carried out using a high-resolution inductively coupled plasma mass spectrometer (HR-ICP-MS) Element 2, in low resolution mode (R-300). (author)

  4. A large set of potential past, present and future hydro-meteorological time series for the UK

    Science.gov (United States)

    Guillod, Benoit P.; Jones, Richard G.; Dadson, Simon J.; Coxon, Gemma; Bussi, Gianbattista; Freer, James; Kay, Alison L.; Massey, Neil R.; Sparrow, Sarah N.; Wallom, David C. H.; Allen, Myles R.; Hall, Jim W.

    2018-01-01

    Hydro-meteorological extremes such as drought and heavy precipitation can have large impacts on society and the economy. With potentially increasing risks associated with such events due to climate change, properly assessing the associated impacts and uncertainties is critical for adequate adaptation. However, the application of risk-based approaches often requires large sets of extreme events, which are not commonly available. Here, we present such a large set of hydro-meteorological time series for recent past and future conditions for the United Kingdom based on weather@home 2, a modelling framework consisting of a global climate model (GCM) driven by observed or projected sea surface temperature (SST) and sea ice which is downscaled to 25 km over the European domain by a regional climate model (RCM). Sets of 100 time series are generated for each of (i) a historical baseline (1900-2006), (ii) five near-future scenarios (2020-2049) and (iii) five far-future scenarios (2070-2099). The five scenarios in each future time slice all follow the Representative Concentration Pathway 8.5 (RCP8.5) and sample the range of sea surface temperature and sea ice changes from CMIP5 (Coupled Model Intercomparison Project Phase 5) models. Validation of the historical baseline highlights good performance for temperature and potential evaporation, but substantial seasonal biases in mean precipitation, which are corrected using a linear approach. For extremes in low precipitation over a long accumulation period ( > 3 months) and shorter-duration high precipitation (1-30 days), the time series generally represent past statistics well. Future projections show small precipitation increases in winter but large decreases in summer on average, leading to an overall drying, consistent with the most recent UK Climate Projections (UKCP09) but larger in magnitude than the latter. Both drought and high-precipitation events are projected to increase in frequency and intensity in most regions.

  5. Making sense of large data sets without annotations: analyzing age-related correlations from lung CT scans

    Science.gov (United States)

    Dicente Cid, Yashin; Mamonov, Artem; Beers, Andrew; Thomas, Armin; Kovalev, Vassili; Kalpathy-Cramer, Jayashree; Müller, Henning

    2017-03-01

    The analysis of large data sets can help to gain knowledge about specific organs or specific diseases, just as big data analysis does in many non-medical areas. This article aims to gain information from 3D volumes, namely the visual content of lung CT scans of a large number of patients. In the case of the described data set, little annotation is available on the patients, who were all part of an ongoing screening program, and besides age and gender no information on the patient and the findings was available for this work. This is a scenario that can happen regularly, as image data sets are produced and become available in increasingly large quantities but manual annotations are often not available and clinical data such as text reports are often harder to share. We extracted a set of visual features from 12,414 CT scans of 9,348 patients that had CT scans of the lung taken in the context of a national lung screening program in Belarus. Lung fields were segmented by two segmentation algorithms, and only cases where both algorithms were able to find left and right lung and had a Dice coefficient above 0.95 were analyzed. This ensures that only segmentations of good quality were used to extract features of the lung. Patients ranged in age from 0 to 106 years. Data analysis shows that age can be predicted with fairly high accuracy for persons under 15 years. Relatively good results were also obtained between 30 and 65 years, where a steady trend is seen. For young adults and older people the results are not as good, as variability is very high in these groups. Several visualizations of the data show the evolution patterns of the lung texture, size and density with age. The experiments allow learning the evolution of the lung, and the results show that even with limited metadata we can extract interesting information from large-scale visual data. These age-related changes (for example of the lung volume, the density histogram of the tissue) can also be
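
    The quality-control step described above, keeping only cases where the two lung segmentations agree with a Dice coefficient above 0.95, can be sketched as follows. The voxel sets and values here are illustrative toy data, not from the study:

    ```python
    def dice(a: set, b: set) -> float:
        """Dice coefficient between two voxel-index sets: 2|A∩B| / (|A| + |B|)."""
        if not a and not b:
            return 1.0
        return 2 * len(a & b) / (len(a) + len(b))

    # Hypothetical lung masks from two segmentation algorithms,
    # encoded as sets of flattened voxel indices.
    alg1 = set(range(0, 100))
    alg2 = set(range(2, 100))    # near-identical mask
    alg3 = set(range(50, 150))   # strongly diverging mask

    QC_THRESHOLD = 0.95          # agreement threshold used in the study

    print(dice(alg1, alg2))      # high overlap -> case kept for feature extraction
    print(dice(alg1, alg3))      # low overlap  -> case excluded
    ```

    In a real pipeline the sets would be voxel coordinates from the two segmentation masks; only scans passing the threshold would proceed to feature extraction.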

  6. Prediction of enthalpy of fusion of pure compounds using an Artificial Neural Network-Group Contribution method

    International Nuclear Information System (INIS)

    Gharagheizi, Farhad; Salehi, Gholam Reza

    2011-01-01

    Highlights: → An Artificial Neural Network-Group Contribution method is presented for prediction of enthalpy of fusion of pure compounds at their normal melting point. → Validity of the model is confirmed using a large evaluated data set containing 4157 pure compounds. → The average percent error of the model is equal to 2.65% in comparison with the experimental data. - Abstract: In this work, the Artificial Neural Network-Group Contribution (ANN-GC) method is applied to estimate the enthalpy of fusion of pure chemical compounds at their normal melting point. 4157 pure compounds from various chemical families are investigated to propose a comprehensive and predictive model. The obtained results show a Squared Correlation Coefficient (R²) of 0.999, a Root Mean Square Error of 0.82 kJ/mol, and an average absolute deviation lower than 2.65% for the estimated properties relative to existing experimental values.
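
    The three figures of merit quoted above (R², RMSE and average absolute deviation in percent) can be computed as below; the experimental and predicted enthalpy values are made-up toy numbers, not data from the paper:

    ```python
    import math

    def r_squared(y_true, y_pred):
        # coefficient of determination: 1 - SS_res / SS_tot
        mean = sum(y_true) / len(y_true)
        ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
        ss_tot = sum((t - mean) ** 2 for t in y_true)
        return 1 - ss_res / ss_tot

    def rmse(y_true, y_pred):
        # root mean square error, in the units of the property (kJ/mol here)
        return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

    def aad_percent(y_true, y_pred):
        # average absolute deviation, as a percentage of the experimental value
        return 100 * sum(abs(t - p) / abs(t) for t, p in zip(y_true, y_pred)) / len(y_true)

    # toy experimental vs predicted enthalpies of fusion (kJ/mol), illustrative only
    exp = [10.0, 15.5, 22.1, 8.4]
    pred = [10.2, 15.1, 22.6, 8.3]
    print(r_squared(exp, pred), rmse(exp, pred), aad_percent(exp, pred))
    ```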

  7. Evaluation of digital soil mapping approaches with large sets of environmental covariates

    Science.gov (United States)

    Nussbaum, Madlene; Spiess, Kay; Baltensweiler, Andri; Grob, Urs; Keller, Armin; Greiner, Lucie; Schaepman, Michael E.; Papritz, Andreas

    2018-01-01

    The spatial assessment of soil functions requires maps of basic soil properties. Unfortunately, these are either missing for many regions or are not available at the desired spatial resolution or down to the required soil depth. The field-based generation of large soil datasets and conventional soil maps remains costly. Meanwhile, legacy soil data and comprehensive sets of spatial environmental data are available for many regions. Digital soil mapping (DSM) approaches relating soil data (responses) to environmental data (covariates) face the challenge of building statistical models from large sets of covariates originating, for example, from airborne imaging spectroscopy or multi-scale terrain analysis. We evaluated six approaches for DSM in three study regions in Switzerland (Berne, Greifensee, ZH forest) by mapping the effective soil depth available to plants (SD), pH, soil organic matter (SOM), effective cation exchange capacity (ECEC), clay, silt, gravel content and fine fraction bulk density for four soil depths (totalling 48 responses). Models were built from 300-500 environmental covariates by selecting linear models through (1) grouped lasso and (2) an ad hoc stepwise procedure for robust external-drift kriging (georob). For (3) geoadditive models we selected penalized smoothing spline terms by component-wise gradient boosting (geoGAM). We further used two tree-based methods: (4) boosted regression trees (BRTs) and (5) random forest (RF). Lastly, we computed (6) weighted model averages (MAs) from the predictions obtained from methods 1-5. Lasso, georob and geoGAM successfully selected strongly reduced sets of covariates (subsets of 3-6 % of all covariates). Differences in predictive performance, tested on independent validation data, were mostly small and did not reveal a single best method for 48 responses. Nevertheless, RF was often the best among methods 1-5 (28 of 48 responses), but was outcompeted by MA for 14 of these 28 responses. 
RF tended to over
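
    The weighted model averaging used as method 6 can be sketched minimally as follows, under the assumption that each method's weight reflects its validation skill; the method names, predictions and weights below are hypothetical:

    ```python
    def model_average(predictions, weights):
        """Weighted average of per-method predictions for one soil response.

        predictions: dict mapping method name -> list of predicted values (one per site)
        weights:     dict mapping method name -> non-negative weight
        """
        total = sum(weights.values())
        n_sites = len(next(iter(predictions.values())))
        return [
            sum(weights[m] * predictions[m][i] for m in predictions) / total
            for i in range(n_sites)
        ]

    # hypothetical pH predictions at three sites from three of the methods
    preds = {
        "lasso": [5.1, 6.0, 6.8],
        "brt":   [5.3, 6.2, 6.6],
        "rf":    [5.2, 6.4, 6.7],
    }
    wts = {"lasso": 1.0, "brt": 2.0, "rf": 2.0}
    print(model_average(preds, wts))
    ```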

  8. Decomposing wage distributions on a large data set - a quantile regression analysis of the gender wage gap

    DEFF Research Database (Denmark)

    Albæk, Karsten; Brink Thomsen, Lars

    This paper presents and implements a procedure that makes it possible to decompose wage distributions on large data sets. We replace bootstrap sampling in the standard Machado-Mata procedure with ‘non-replacement subsampling’, which is more suitable for the linked employer-employee data applied in this paper. Decompositions show that most of the glass ceiling is related to segregation in the form of either composition effects or different returns to males and females. A counterfactual wage distribution without differences in the constant terms (or ‘discrimination’) implies substantial changes in gender wage differences in the lower part of the wage distribution.
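
    The sampling substitution described above, replacing bootstrap draws with non-replacement subsampling, can be sketched as follows; the population and sample sizes are illustrative, not those of the linked employer-employee data:

    ```python
    import random

    random.seed(0)
    population = list(range(1_000_000))  # stand-in for linked employer-employee records

    def bootstrap_sample(data, n):
        # standard bootstrap: draw n records WITH replacement,
        # so the same record can appear multiple times
        return [random.choice(data) for _ in range(n)]

    def subsample(data, n):
        # non-replacement subsampling: draw n distinct records
        return random.sample(data, n)

    boot = bootstrap_sample(population, 10_000)
    sub = subsample(population, 10_000)

    print(len(set(boot)))  # duplicates expected, so fewer distinct records
    print(len(set(sub)))   # always exactly 10_000 distinct records
    ```

    The practical point is that each subsample contains no repeated records, which matters when repeated observations of the same employer-employee link would distort the decomposition.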

  9. Caught you: threats to confidentiality due to the public release of large-scale genetic data sets.

    Science.gov (United States)

    Wjst, Matthias

    2010-12-29

    Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after stripping off personal identifiers. The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples or forensic testing. When merged with the primary, unidentified set, it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach. Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public.
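
    The "Netflix"-type linkage attack described above can be sketched minimally: a small named genotype panel is matched against the released anonymized set. Genotypes are coded here as minor-allele counts (0/1/2); all names, sample labels and genotypes are fabricated for illustration:

    ```python
    # Released "anonymized" study data: sample label -> genotype tuple at 8 SNPs
    anonymous_set = {
        "sample_A": (0, 1, 2, 1, 0, 2, 1, 1),
        "sample_B": (2, 2, 0, 1, 1, 0, 0, 2),
        "sample_C": (1, 0, 1, 2, 2, 1, 0, 0),
    }
    # Second, small data set that carries a personal identifier
    # (e.g. from a clinical exam or forensic testing)
    identified_set = {
        "John Doe": (2, 2, 0, 1, 1, 0, 0, 2),
    }

    def reidentify(anon, named):
        """Link anonymized samples to named ones by exact genotype match."""
        hits = {}
        for name, geno in named.items():
            for sample, anon_geno in anon.items():
                if geno == anon_geno:
                    hits[sample] = name
        return hits

    print(reidentify(anonymous_set, identified_set))
    ```

    With even a few dozen SNPs, exact genotype matches are essentially unique, which is why such a small second data set suffices to de-anonymize every sample belonging to that individual.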

  10. Caught you: threats to confidentiality due to the public release of large-scale genetic data sets

    Directory of Open Access Journals (Sweden)

    Wjst Matthias

    2010-12-01

    Background: Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after stripping off personal identifiers. Discussion: The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples or forensic testing. When merged with the primary, unidentified set, it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach. Summary: Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public.

  11. Calculations of safe collimator settings and β^{*} at the CERN Large Hadron Collider

    Directory of Open Access Journals (Sweden)

    R. Bruce

    2015-06-01

    The first run of the Large Hadron Collider (LHC) at CERN was very successful and resulted in important physics discoveries. One way of increasing the luminosity in a collider, which gave a very significant contribution to the LHC performance in the first run and can be used even if the beam intensity cannot be increased, is to decrease the transverse beam size at the interaction points by reducing the optical function β^{*}. However, when doing so, the beam becomes larger in the final focusing system, which could expose its aperture to beam losses. For the LHC, which is designed to store beams with a total energy of 362 MJ, this is critical, since the loss of even a small fraction of the beam could cause a magnet quench or even damage. Therefore, the machine aperture has to be protected by the collimation system. The settings of the collimators constrain the maximum beam size that can be tolerated and therefore impose a lower limit on β^{*}. In this paper, we present calculations to determine safe collimator settings and the resulting limit on β^{*}, based on available aperture and operational stability of the machine. Our model was used to determine the LHC configurations in 2011 and 2012 and it was found that β^{*} could be decreased significantly compared to the conservative model used in 2010. The gain in luminosity resulting from the decreased margins between collimators was more than a factor 2, and a further contribution from the use of realistic aperture estimates based on measurements was almost as large. This has played an essential role in the rapid and successful accumulation of experimental data in the LHC.

  12. Calculations of safe collimator settings and β* at the CERN Large Hadron Collider

    Science.gov (United States)

    Bruce, R.; Assmann, R. W.; Redaelli, S.

    2015-06-01

    The first run of the Large Hadron Collider (LHC) at CERN was very successful and resulted in important physics discoveries. One way of increasing the luminosity in a collider, which gave a very significant contribution to the LHC performance in the first run and can be used even if the beam intensity cannot be increased, is to decrease the transverse beam size at the interaction points by reducing the optical function β*. However, when doing so, the beam becomes larger in the final focusing system, which could expose its aperture to beam losses. For the LHC, which is designed to store beams with a total energy of 362 MJ, this is critical, since the loss of even a small fraction of the beam could cause a magnet quench or even damage. Therefore, the machine aperture has to be protected by the collimation system. The settings of the collimators constrain the maximum beam size that can be tolerated and therefore impose a lower limit on β*. In this paper, we present calculations to determine safe collimator settings and the resulting limit on β*, based on available aperture and operational stability of the machine. Our model was used to determine the LHC configurations in 2011 and 2012 and it was found that β* could be decreased significantly compared to the conservative model used in 2010. The gain in luminosity resulting from the decreased margins between collimators was more than a factor 2, and a further contribution from the use of realistic aperture estimates based on measurements was almost as large. This has played an essential role in the rapid and successful accumulation of experimental data in the LHC.

  13. Volatile organic compound emissions from Larrea tridentata (creosotebush)

    Directory of Open Access Journals (Sweden)

    A. Guenther

    2010-12-01

    We present results from the CREosote ATmosphere Interactions through Volatile Emissions (CREATIVE 2009) field study in southern Arizona aimed at quantifying emission rates of VOCs from creosotebush (Larrea tridentata) during the summer 2009 monsoon season. This species was chosen because of its vast distribution in North and South American deserts and because its resins have been reported to contain a rich set of volatile organic compounds (VOCs). While a variety of ecosystems have been investigated for VOC emissions, deserts remain essentially unstudied, partially because of their low biomass densities and water limitations. However, during the North American monsoon, a pronounced increase in rainfall from an extremely dry June (80 mm) occurs over large areas of the Sonoran desert in the southwestern United States and northwestern Mexico. We observed a strong diurnal pattern of branch emissions and ambient concentrations of an extensive suite of VOCs with maxima in early afternoon. These include VOCs typically observed in forest sites (oxygenated VOCs and volatile isoprenoids) as well as a large number of other compounds, some of which have not been previously described from any plant, including 1-chloro-2-methoxy-benzene and isobutyronitrile. We also observed emissions of aromatic compounds, generally considered to be derived from anthropogenic sources, including benzene and a broad range of phenolics. Dimethyl sulfide emissions from creosotebush were higher than reported from any previously studied plant, suggesting that terrestrial ecosystems should be reconsidered as an important source of this climatically important gas. We also present direct, primary emission measurements of isoprene and its apparent oxidation products methyl vinyl ketone, methacrolein, and 3-methyl furan (the latter three compounds are typically assumed to form from secondary reactions within the atmosphere), as well as a group of compounds considered to be fatty acid

  14. A large set of potential past, present and future hydro-meteorological time series for the UK

    Directory of Open Access Journals (Sweden)

    B. P. Guillod

    2018-01-01

    Hydro-meteorological extremes such as drought and heavy precipitation can have large impacts on society and the economy. With potentially increasing risks associated with such events due to climate change, properly assessing the associated impacts and uncertainties is critical for adequate adaptation. However, the application of risk-based approaches often requires large sets of extreme events, which are not commonly available. Here, we present such a large set of hydro-meteorological time series for recent past and future conditions for the United Kingdom based on weather@home 2, a modelling framework consisting of a global climate model (GCM) driven by observed or projected sea surface temperature (SST) and sea ice which is downscaled to 25 km over the European domain by a regional climate model (RCM). Sets of 100 time series are generated for each of (i) a historical baseline (1900–2006), (ii) five near-future scenarios (2020–2049) and (iii) five far-future scenarios (2070–2099). The five scenarios in each future time slice all follow the Representative Concentration Pathway 8.5 (RCP8.5) and sample the range of sea surface temperature and sea ice changes from CMIP5 (Coupled Model Intercomparison Project Phase 5) models. Validation of the historical baseline highlights good performance for temperature and potential evaporation, but substantial seasonal biases in mean precipitation, which are corrected using a linear approach. For extremes in low precipitation over a long accumulation period (> 3 months) and shorter-duration high precipitation (1–30 days), the time series generally represent past statistics well. Future projections show small precipitation increases in winter but large decreases in summer on average, leading to an overall drying, consistent with the most recent UK Climate Projections (UKCP09) but larger in magnitude than the latter. Both drought and high-precipitation events are projected to increase in frequency and

  15. Efficient One-click Browsing of Large Trajectory Sets

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2014-01-01

    This paper presents a novel query type called a sheaf, where users can browse trajectory data sets using a single mouse click. Sheaves are very versatile and can be used for location-based advertising, travel-time analysis, intersection analysis, and reachability analysis (isochrones). A novel in-memory trajectory index compresses the data by a factor of 12.4 and enables execution of sheaf queries in 40 ms. This is up to 2 orders of magnitude faster than existing work. We demonstrate the simplicity, versatility, and efficiency of sheaf queries using a real-world trajectory set consisting of 2.7 million

  16. Diazo compounds in continuous-flow technology.

    Science.gov (United States)

    Müller, Simon T R; Wirth, Thomas

    2015-01-01

    Diazo compounds are very versatile reagents in organic chemistry and meet the challenge of selective assembly of structurally complex molecules. Their leaving group is dinitrogen; therefore, they are very clean and atom-efficient reagents. However, diazo compounds are potentially explosive and extremely difficult to handle on an industrial scale. In this review, it is discussed how continuous-flow technology can help to make these powerful reagents accessible on a large scale. Microstructured devices can greatly improve heat transfer and help with handling dangerous reagents safely. The in situ formation and subsequent consumption of diazo compounds are discussed along with advances in handling diazomethane and ethyl diazoacetate. The potential for large-scale application of a given methodology is emphasized. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Exploration of Configuration Options for a Large Civil Compound Helicopter

    Science.gov (United States)

    Russell, Carl; Johnson, Wayne

    2013-01-01

    Multiple compound helicopter configurations are designed using a combination of rotorcraft sizing and comprehensive analysis codes. Results from both the conceptual design phase and rotor comprehensive analysis are presented. The designs are evaluated for their suitability to a short-to-medium-haul civil transport mission carrying a payload of 90 passengers. Multiple metrics are used to determine the best configuration, with heavy emphasis placed on minimizing fuel burn.

  18. High-throughput high-volume nuclear imaging for preclinical in vivo compound screening.

    Science.gov (United States)

    Macholl, Sven; Finucane, Ciara M; Hesterman, Jacob; Mather, Stephen J; Pauplis, Rachel; Scully, Deirdre; Sosabowski, Jane K; Jouannot, Erwan

    2017-12-01

    Preclinical single-photon emission computed tomography (SPECT)/CT imaging studies are hampered by low throughput, hence they are typically found within small-volume feasibility studies. Here, imaging and image analysis procedures are presented that allow profiling of a large volume of radiolabelled compounds within a reasonably short total study time. Particular emphasis was put on quality control (QC) and on fast and unbiased image analysis. 2-3 His-tagged proteins were simultaneously radiolabelled by the 99mTc-tricarbonyl methodology and injected intravenously (20 nmol/kg; 100 MBq; n = 3) into patient-derived xenograft (PDX) mouse models. Whole-body SPECT/CT images of 3 mice simultaneously were acquired 1, 4, and 24 h post-injection, extended to 48 h and/or by 0-2 h dynamic SPECT for pre-selected compounds. Organ uptake was quantified by automated multi-atlas and manual segmentations. Data were plotted automatically, quality controlled, and stored on a collaborative image management platform. Ex vivo uptake data were collected semi-automatically and analysis was performed as for the imaging data. >500 single-animal SPECT images were acquired for 25 proteins over 5 weeks, eventually generating >3500 ROIs and >1000 items of tissue data. SPECT/CT images clearly visualized uptake in tumour and other tissues even at 48 h post-injection. Intersubject uptake variability was typically 13% (coefficient of variation, COV). Imaging results correlated well with ex vivo data. The large data set of tumour, background and systemic uptake/clearance data from 75 mice for 25 compounds allows identification of compounds of interest. The number of animals required was reduced considerably by longitudinal imaging compared to dissection experiments. All experimental work and analyses were accomplished within 3 months, expected to be compatible with drug development programmes. QC along all workflow steps, blinding of the imaging contract research organization to compound properties and
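The intersubject variability figure quoted above is a coefficient of variation (COV), which is straightforward to reproduce. The organ-uptake values below are hypothetical, not data from the study:

```python
import statistics

def coefficient_of_variation(values):
    """COV (%) = sample standard deviation / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Hypothetical organ-uptake values for one compound across three animals
uptake = [4.8, 5.2, 5.0]
cov = coefficient_of_variation(uptake)
```

With n = 3 per compound, as in the study design, the sample standard deviation (n − 1 denominator) is the appropriate estimator.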

  19. Evaluation of some organic compounds on bloodstream forms of Trypanosoma cruzi

    Directory of Open Access Journals (Sweden)

    João S. Silva

    1992-09-01

    Full Text Available Accidental transmission of Chagas' disease to man by blood transfusion is a serious problem in Latin America. This paper describes the testing of several synthetic, semi-synthetic, and natural compounds for their activity against blood trypomastigotes in vitro at 4 °C. The compounds embody several types of chemical structures: benzoquinone, naphthoquinone, anthracenequinone, phenanthrenequinone, imidazole, piperazine, quinoline, xanthene, and simple benzenic and naphthalenic derivatives. Some of them are tested for the first time against Trypanosoma cruzi. The toxic effect of these compounds on this parasite was assessed in two quite distinct sets of experiments. In one set, the compounds were added to infected blood as an ethanolic solution. In this situation the most active one was a furan-1,2-naphthoquinone, in the same range as gentian violet, a new fact to be considered in the assessment of structure-activity relationships in this class of compounds. In the other set, we tentatively evaluated the biological activity of water-insoluble compounds by adding them in pure form without solvent into infected blood. In this way some appear to be very active, and it was postulated that the effectiveness of such compounds must result from interactions between them and specific blood components.

  20. QUANTITATIVE ELECTRONIC STRUCTURE - ACTIVITY RELATIONSHIP OF ANTIMALARIAL COMPOUND OF ARTEMISININ DERIVATIVES USING PRINCIPAL COMPONENT REGRESSION APPROACH

    Directory of Open Access Journals (Sweden)

    Paul Robert Martin Werfette

    2010-06-01

    Full Text Available Analysis of quantitative structure-activity relationships (QSAR) for a series of antimalarial artemisinin derivatives has been done using principal component regression. The descriptors for the QSAR study were representations of electronic structure, i.e. atomic net charges of the artemisinin skeleton calculated by the AM1 semi-empirical method. The antimalarial activity of each compound was expressed as log 1/IC50, an experimental datum. The main purpose of the principal component analysis approach is to transform a large data set of atomic net charges into a simpler data set known as latent variables. The best QSAR equation for log 1/IC50 was obtained by the regression method as a linear function of five latent variables, i.e. x1, x2, x3, x4 and x5. Keywords: QSAR, antimalarial, artemisinin, principal component regression
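The workflow described above (atomic charges → latent variables via PCA → linear regression on the scores) is principal component regression, and can be sketched with NumPy. The descriptor matrix and activity values below are randomly generated stand-ins, not the artemisinin data:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: project centered descriptors onto
    the leading principal components, then ordinary least squares on the
    component scores (the 'latent variables')."""
    x_mean = X.mean(axis=0)
    Xc = X - x_mean
    # SVD of the centered matrix: principal directions are the rows of Vt
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T            # loading matrix (descriptors x comps)
    T = Xc @ V                         # latent-variable scores
    A = np.column_stack([np.ones(len(y)), T])
    coef = np.linalg.lstsq(A, y, rcond=None)[0]
    return x_mean, V, coef

def pcr_predict(X, x_mean, V, coef):
    T = (X - x_mean) @ V
    return coef[0] + T @ coef[1:]

# Hypothetical atomic net charges (rows: compounds) and log 1/IC50 values
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 8))
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=20)
params = pcr_fit(X, y, n_components=5)
y_hat = pcr_predict(X, *params)
```

Regressing on a few leading components rather than all descriptors is what guards the model against the collinearity typical of atomic-charge descriptor sets.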

  1. A Core Set Based Large Vector-Angular Region and Margin Approach for Novelty Detection

    Directory of Open Access Journals (Sweden)

    Jiusheng Chen

    2016-01-01

    Full Text Available A large vector-angular region and margin (LARM) approach is presented for novelty detection based on imbalanced data. The key idea is to construct the largest vector-angular region in the feature space to separate normal training patterns and, meanwhile, to maximize the vector-angular margin between the surface of this optimal vector-angular region and the abnormal training patterns. In order to improve the generalization performance of LARM, the vector-angular distribution is optimized by maximizing the vector-angular mean and minimizing the vector-angular variance, which separates the normal and abnormal examples well. However, the inherent computation of the quadratic programming (QP) solver takes O(n³) training time and at least O(n²) space, which might be computationally prohibitive for large-scale problems. Using a (1+ε)- and (1−ε)-approximation algorithm, a core set based LARM algorithm is proposed for fast training of the LARM problem. Experimental results based on imbalanced data sets have validated the favorable efficiency of the proposed approach in novelty detection.
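Core-set methods of this kind are commonly built on the Badoiu–Clarkson (1+ε)-approximation for the minimum enclosing ball, which needs only O(1/ε²) iterations regardless of data-set size. The sketch below shows that generic idea on synthetic data; it is not the authors' LARM solver:

```python
import numpy as np

def meb_core_set(points, eps):
    """Badoiu-Clarkson core-set sketch for the minimum enclosing ball:
    repeatedly pull the center toward the current farthest point. After
    ~1/eps^2 iterations the radius is within (1+eps) of optimal, and the
    touched points form a small core set."""
    c = points[0].copy()
    core = {0}
    for i in range(1, int(np.ceil(1.0 / eps**2)) + 1):
        d = np.linalg.norm(points - c, axis=1)
        far = int(np.argmax(d))
        core.add(far)
        # move the center a 1/(i+1) step toward the farthest point
        c = c + (points[far] - c) / (i + 1)
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius, core

rng = np.random.default_rng(1)
pts = rng.normal(size=(500, 3))
center, r, core = meb_core_set(pts, eps=0.1)
```

The core set stays bounded by the iteration count (here at most 101 points out of 500), which is the mechanism that sidesteps the O(n³)/O(n²) cost of solving the full QP.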

  2. Exploring sets of molecules from patents and relationships to other active compounds in chemical space networks

    Science.gov (United States)

    Kunimoto, Ryo; Bajorath, Jürgen

    2017-09-01

    Patents from medicinal chemistry represent a rich source of novel compounds and activity data that appear only infrequently in the scientific literature. Moreover, patent information provides a primary focal point for drug discovery. Accordingly, text mining and image extraction approaches have become hot topics in patent analysis and repositories of patent data are being established. In this work, we have generated network representations using alternative similarity measures to systematically compare molecules from patents with other bioactive compounds, visualize similarity relationships, explore the chemical neighbourhood of patent molecules, and identify closely related compounds with different activities. The design of network representations that combine patent molecules and other bioactive compounds and view patent information in the context of current bioactive chemical space aids in the analysis of patents and further extends the use of molecular networks to explore structure-activity relationships.

  3. New Linear Partitioning Models Based on Experimental Water-Supercritical CO2 Partitioning Data of Selected Organic Compounds.

    Science.gov (United States)

    Burant, Aniela; Thompson, Christopher; Lowry, Gregory V; Karamalidis, Athanasios K

    2016-05-17

    Partitioning coefficients of organic compounds between water and supercritical CO2 (sc-CO2) are necessary to assess the risk of migration of these chemicals from subsurface CO2 storage sites. Despite the large number of potential organic contaminants, the current data set of published water-sc-CO2 partitioning coefficients is very limited. Here, the partitioning coefficients of thiophene, pyrrole, and anisole were measured in situ over a range of temperatures and pressures using a novel pressurized batch-reactor system with dual spectroscopic detectors: a near-infrared spectrometer for measuring the organic analyte in the CO2 phase and a UV detector for quantifying the analyte in the aqueous phase. Our measured partitioning coefficients followed expected trends based on volatility and aqueous solubility. The partitioning coefficients and literature data were then used to update a published poly-parameter linear free-energy relationship and to develop five new linear free-energy relationships for predicting water-sc-CO2 partitioning coefficients. Four of the models each targeted a single class of organic compounds. Unlike models that utilize Abraham solvation parameters, the new relationships use the vapor pressure and aqueous solubility of the organic compound at 25 °C, together with CO2 density, to predict partitioning coefficients over a range of temperature and pressure conditions. The compound-class models provide better estimates of partitioning behavior for compounds in their class than does the model built for the entire data set.
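A linear free-energy relationship of the general form described (log K from vapor pressure, aqueous solubility, and CO2 density) can be fit by ordinary least squares. Every number below is a hypothetical placeholder, not the measured data or the published coefficients:

```python
import numpy as np

# Hypothetical training data: each row is one (compound, T, P) condition.
# Columns: log10 vapour pressure (Pa), log10 aqueous solubility (mol/L),
# CO2 density (g/mL). All values are illustrative, not measured.
descriptors = np.array([
    [3.9, -1.4, 0.60],
    [3.1, -0.9, 0.70],
    [2.6, -1.1, 0.55],
    [4.2, -2.0, 0.65],
    [3.5, -1.6, 0.75],
])
log_k = np.array([1.8, 1.1, 0.9, 2.3, 1.7])   # illustrative log K values

# Fit log K = a + b*logPvap + c*logSw + d*rho by ordinary least squares,
# the general form of the linear free-energy relationships described above.
A = np.column_stack([np.ones(len(log_k)), descriptors])
coef = np.linalg.lstsq(A, log_k, rcond=None)[0]
predicted = A @ coef
```

Using temperature- and pressure-dependent CO2 density as a regressor is what lets a single linear model cover a range of conditions with only 25 °C compound properties.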

  4. A QSAR study of integrase strand transfer inhibitors based on a large set of pyrimidine, pyrimidone, and pyridopyrazine carboxamide derivatives

    Science.gov (United States)

    de Campos, Luana Janaína; de Melo, Eduardo Borges

    2017-08-01

    In the present study, 199 compounds derived from pyrimidine, pyrimidone and pyridopyrazine carboxamides with inhibitory activity against HIV-1 integrase were modeled. Subsequently, a multivariate QSAR study was conducted with 54 molecules, employing Ordered Predictors Selection (OPS) and Partial Least Squares (PLS) for the selection of variables and model construction, respectively. Topological, electrotopological, geometric, and molecular descriptors were used. The selected real model was robust and free from chance correlation; in addition, it demonstrated favorable internal and external statistical quality. Once statistically validated, the training model was used to predict the activity of a second data set (n = 145). The root mean square deviation (RMSD) between observed and predicted values was 0.698. Although this value falls outside the usual standards, only 15 (10.34%) of the compounds exhibited residuals larger than 1 log unit, a result considered acceptable. Results for the Williams and Euclidean applicability domains relative to the prediction showed that the predictions did not occur by extrapolation and that the model is representative of the chemical space of the test compounds.
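The two external-validation statistics used above (RMSD and the fraction of compounds with residuals above 1 log unit) can be computed directly. The observed/predicted activity values below are invented for illustration:

```python
import math

def rmsd(observed, predicted):
    """Root mean square deviation between observed and predicted activities."""
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

def outlier_fraction(observed, predicted, threshold=1.0):
    """Fraction of predictions with |residual| above `threshold` log units."""
    n = len(observed)
    return sum(abs(o - p) > threshold for o, p in zip(observed, predicted)) / n

# Hypothetical observed vs predicted log-activity values
obs = [5.0, 6.0, 7.0, 8.0]
pred = [5.5, 6.0, 8.2, 7.8]
error = rmsd(obs, pred)
frac = outlier_fraction(obs, pred)
```

Reporting both numbers together, as the abstract does, distinguishes a model with a few large outliers from one that is uniformly imprecise.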

  5. The Molecule Cloud - compact visualization of large collections of molecules

    Directory of Open Access Journals (Sweden)

    Ertl Peter

    2012-07-01

    Full Text Available Abstract Background Analysis and visualization of large collections of molecules is one of the most frequent challenges cheminformatics experts in the pharmaceutical industry are facing. Various sophisticated methods are available to perform this task, including clustering, dimensionality reduction or scaffold frequency analysis. In any case, however, viewing and analyzing large tables with molecular structures is necessary. We present a new visualization technique providing basic information about the composition of molecular data sets at a single glance. Summary A method is presented here allowing visual representation of the most common structural features of chemical databases in the form of a cloud diagram. The frequency of molecules containing a particular substructure is indicated by the size of the respective structural image. The method is useful to quickly perceive the most prominent structural features present in the data set. This approach was inspired by the popular word cloud diagrams that are used to visualize textual information in a compact form. Therefore we call this approach the “Molecule Cloud”. The method also supports visualization of additional information, for example the biological activity of molecules containing a given scaffold or the protein target class typical for particular scaffolds, by color coding. A detailed description of the algorithm is provided, allowing easy implementation of the method by any cheminformatics toolkit. The layout algorithm is available as open source Java code. Conclusions Visualization of large molecular data sets using the Molecule Cloud approach allows scientists to get information about the composition of molecular databases and their most frequent structural features easily. The method may be used in areas where analysis of large molecular collections is needed, for example processing of high throughput screening results, virtual screening or compound purchasing. Several example visualizations of large
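The frequency-to-size mapping at the heart of such cloud diagrams can be sketched without any cheminformatics toolkit: count how often each scaffold occurs, then scale the drawing size with the frequency, word-cloud style. The scaffold labels, pixel range, and square-root scaling below are hypothetical choices, not the paper's published layout algorithm:

```python
from collections import Counter

def cloud_sizes(scaffolds, min_px=20, max_px=120):
    """Map scaffold frequency to a drawing size, word-cloud style: the most
    frequent scaffold gets max_px, and sizes scale with sqrt(frequency)
    so that rare scaffolds remain legible."""
    counts = Counter(scaffolds)
    top = max(counts.values())
    return {s: min_px + (max_px - min_px) * (c / top) ** 0.5
            for s, c in counts.items()}

# Hypothetical scaffold labels extracted from a screening collection
data = ["benzene"] * 9 + ["indole"] * 4 + ["piperidine"] * 1
sizes = cloud_sizes(data)
```

Square-root scaling compresses the dynamic range, which matters in screening collections where a handful of scaffolds can outnumber the rest by orders of magnitude.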

  6. Rational Design of Glycomimetic Compounds Targeting the Saccharomyces cerevisiae Transglycosylase Gas2.

    Science.gov (United States)

    Delso, Ignacio; Valero-González, Jessika; Marca, Eduardo; Tejero, Tomás; Hurtado-Guerrero, Ramón; Merino, Pedro

    2016-02-01

    The transglycosylase Saccharomyces cerevisiae Gas2 (ScGas2) belongs to a large family of enzymes that are key players in yeast cell wall remodeling. Despite its biologic importance, no studies on the synthesis of substrate-based compounds as potential inhibitors have been reported. We have synthesized a series of docking-guided glycomimetics that were evaluated by fluorescence spectroscopy and saturation-transfer difference (STD) NMR experiments, revealing that a minimum of three glucose units linked via a β-(1,3) linkage are required for achieving molecular recognition at the binding donor site. The binding mode of our compounds is further supported by STD-NMR experiments using the active site-mutants Y107Q and Y244Q. Our results are important for both understanding of ScGas2-substrate interactions and setting up the basis for future design of glycomimetics as new antifungal agents. © 2015 John Wiley & Sons A/S.

  7. French experience with Uranium compounds: conclusions of medical working group

    International Nuclear Information System (INIS)

    Berard, P.; Mazeyrat, C.; Auriol, B.; Montegue, A.; Estrabaud, M.; Grappin, L.; Giraud, J.M.

    2002-01-01

    The authors, who represent several organisations and industrial firms, present observations conducted over some thirty years in France, including routine monitoring and special measurements following contamination by uranium compounds. They propose recommendations for the radiotoxicological monitoring of workers exposed to industrial uranium compounds, and they comment on urine and faecal collections in relation to specific exposures. Our working group, set up by the CEA Medical Adviser in 1975, consists of French specialists in uranium radiotoxicology. Their role is to propose recommendations for the monitoring of working conditions and exposed workers. The different plants chemically and metallurgically process, and machine, large quantities of uranium with various 235U enrichments. Radiotoxicological monitoring of workers exposed to uranium compounds requires examinations prescribed according to the kind of product manipulated and the industrial risk of the workplace. The range of examinations useful for this kind of monitoring includes lung monitoring, urine analyses and faecal sampling. The authors present the frequency of monitoring for routine or special conditions according to industrial exposure, the time and duration of collection of excreta (urine and faeces), the necessity of a work break, precautions for preservation of the samples, and the way excretion analyses are interpreted according to natural food intakes

  8. Large volume TENAX® extraction of the bioaccessible fraction of sediment-associated organic compounds for a subsequent effect-directed analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schwab, K.; Brack, W. [UFZ - Helmholtz Centre or Environmental Research, Leipzig (Germany). Dept. of Effect-Directed Analysis

    2007-06-15

    Background, Aim and Scope: Effect-directed analysis (EDA) is a powerful tool for the identification of key toxicants in complex environmental samples. In most cases, EDA is based on total extraction of organic contaminants, leading to an erroneous prioritization with regard to hazard and risk. Bioaccessibility-directed extraction aims to discriminate between contaminants that take part in partitioning between sediment and biota in a relevant time frame and those that are enclosed in structures that do not allow rapid desorption. Standard protocols for targeted extraction of the rapidly desorbing, and thus bioaccessible, fraction using TENAX® are based on only small amounts of sediment. In order to obtain sufficient amounts of extract for subsequent biotesting, fractionation, and structure elucidation, a large volume extraction technique needed to be developed, applying one selected extraction time and excluding toxic procedural blanks. Materials and Methods: Desorption behaviour of sediment contaminants was determined by consecutive solid-solid extraction of sediment using TENAX®, fitting a tri-compartment model to the experimental data. The time needed to remove the rapidly desorbing fraction (t_rap) was calculated in order to select a fixed extraction time for single extraction procedures. Up-scaling by about a factor of 100 provided a large volume extraction technique for EDA. Reproducibility and comparability to the small volume approach were proved. Blanks of the respective TENAX® mass were investigated using Scenedesmus vacuolatus and Artemia salina as test organisms. Results: Desorption kinetics showed that 12 to 30% of sediment-associated pollutants are available for rapid desorption. t_rap is compound-dependent and covers a range of 2 to 18 h. On that basis a fixed extraction time of 24 h was selected. Validation of the large volume approach was done by comparison to the small volume method and by reproducibility. The large volume showed a good
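A tri-compartment desorption model of the kind fitted above treats the sediment-bound contaminant as three exponentially decaying pools (rapid, slow, very slow). The pool fractions and rate constants below are hypothetical illustrations, not the study's fitted values:

```python
import math

def fraction_remaining(t, pools):
    """Tri-compartment desorption model: S(t)/S(0) as a sum of exponentially
    decaying pools, each given as (fraction, rate constant per hour)."""
    return sum(f * math.exp(-k * t) for f, k in pools)

# Hypothetical pools: rapid, slow, very slow (fractions sum to 1)
pools = [(0.25, 0.60), (0.45, 0.02), (0.30, 0.0005)]

def time_to_deplete_rapid(pools, residual=0.01):
    """Time (h) until the rapid pool retains only `residual` of its mass,
    one plausible way to define a t_rap-style cutoff."""
    f_rap, k_rap = pools[0]
    return math.log(1.0 / residual) / k_rap

t_rap = time_to_deplete_rapid(pools)
```

With these illustrative constants the rapid pool is 99% depleted after roughly 7.7 h, which shows why a fixed extraction time of 24 h comfortably covers compound-dependent t_rap values of 2 to 18 h.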

  9. Envision: An interactive system for the management and visualization of large geophysical data sets

    Science.gov (United States)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  10. On some derived compounds of fluorides of Cerium III or IV: defined compounds and non stoichiometric phases

    International Nuclear Information System (INIS)

    Besse, Jean-Pierre

    1968-01-01

    This research study addresses rare earth fluorides. It reports the preparation and study of new fluoro-cerates (IV) in order to complete the set of already known compounds (ammonium fluoro-cerate, and alkaline earth compounds), the study of binary systems of CeF3 with monovalent and divalent fluorides and of CeF3-NF2-N'F ternary systems, and the study of non-stoichiometric phases in CeF3 oxides, sulphides and selenides [fr

  11. Treatment of severe pulmonary hypertension in the setting of the large patent ductus arteriosus.

    Science.gov (United States)

    Niu, Mary C; Mallory, George B; Justino, Henri; Ruiz, Fadel E; Petit, Christopher J

    2013-05-01

    Treatment of the large patent ductus arteriosus (PDA) in the setting of pulmonary hypertension (PH) is challenging. Left patent, the large PDA can result in irreversible pulmonary vascular disease. Occlusion, however, may lead to right ventricular failure for certain patients with severe PH. Our center has adopted a staged management strategy using medical management, noninvasive imaging, and invasive cardiac catheterization to treat PH in the presence of a large PDA. This approach determines the safety of ductal closure but also leverages medical therapy to create an opportunity for safe PDA occlusion. We reviewed our experience with this approach. Patients with both severe PH and PDAs were studied. PH treatment history and hemodynamic data obtained during catheterizations were reviewed. Repeat catheterizations, echocardiograms, and clinical status at latest follow-up were also reviewed. Seven patients had both PH and large, unrestrictive PDAs. At baseline, all patients had near-systemic right ventricular pressures. Nine catheterizations were performed. Two patients underwent 2 catheterizations each due to poor initial response to balloon test occlusion. Six of 7 patients exhibited subsystemic pulmonary pressures during test occlusion and underwent successful PDA occlusion. One patient did not undergo PDA occlusion. In follow-up, 2 additional catheterizations were performed after successful PDA occlusion for subsequent hemodynamic assessment. At the latest follow-up, the 6 patients who underwent PDA occlusion are well, with continued improvement in PH. Five patients remain on PH treatment. A staged approach to PDA closure for patients with severe PH is an effective treatment paradigm. Aggressive treatment of PH creates a window of opportunity for PDA occlusion, echocardiography assists in identifying the timing for closure, and balloon test occlusion during cardiac catheterization is critical in determining safety of closure. By safely eliminating the large PDA

  12. WebViz:A Web-based Collaborative Interactive Visualization System for large-Scale Data Sets

    Science.gov (United States)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. Features in the current version include the ability for users to (1) securely login, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface.

  13. Metabolism of arachidonic acid derivatives (prostaglandins and related compounds). Radioimmunological methods to measure certain of these compounds

    International Nuclear Information System (INIS)

    Sors, Herve.

    1978-06-01

    The detection of prostaglandins, present in tissues at concentrations of about 10⁻⁷ to 10⁻¹¹ g/g and able to induce physiological effects at concentrations of the picomole order, sets the analyst a particularly difficult problem. Owing to the complexity of their metabolism, the existence of many structurally similar compounds and the low concentrations present, it is necessary to develop highly specific and sensitive methods. Suitable techniques are: the biological activity test or biotest; gas-liquid chromatography combined with mass spectrometry; and the radioimmunological method. The radioimmunological analysis procedure is developed: preparation of immunogens and immunisation; preparation of tracers; treatment of biological samples. The different radioimmunological systems are presented: determination of antiserum affinity constants; dose-response curves and sensitivities; specificities; applications to biological measurements. Some remarks are called for concerning the RIA of prostaglandins: the difficulty of obtaining antisera seems to depend on the nature of the PG; a good anti-PGB or anti-PGFα is easier to obtain than an anti-PGA or anti-PGE. The analysis of each compound implies the use of a corresponding immunoserum, and it is therefore essential to have a range of immunosera in order to study as large a number of biosynthesis derivatives as possible; too many physiological investigations are still viewed in relation to one PG only (often a primary PG) at the expense of other derivatives [fr

  14. Solubility of carbon monoxide in bio-oil compounds

    International Nuclear Information System (INIS)

    Qureshi, Muhammad Saad; Le Nedelec, Tom; Guerrero-Amaya, Hernando; Uusi-Kyyny, Petri; Richon, Dominique; Alopaeus, Ville

    2017-01-01

    Highlights: • CO solubility was measured in four bio-oil compounds using static-analytic VLE equipment. • A comparison of the performance of different EoS (PC-SAFT, SRK and PR) was made. • Modelling of polar compounds with Polar PC-SAFT was tested. • Polar PC-SAFT is not needed for weakly polar compounds (μ < 1.0 D). - Abstract: The solubility of carbon monoxide is measured in four different bio-oil compounds (furan, diacetyl, 2-methylfuran, and trans-crotonaldehyde) at temperatures (273.15, 283.15, 298.15, and 323.15 K) and pressures up to 8 MPa using a static-analytical VLE measurement method. The equipment was validated by measuring the solubility of CO2 in methanol at 298.15 K and pressures (P = 2.9–5.7 MPa); the results were compared with the abundantly available literature values. PC-SAFT, Polar PC-SAFT (PPC-SAFT), and cubic (SRK, PR) EoS, part of the commercial process simulator Aspen Plus V. 8.6, are used here for modelling purposes. The pure component parameters needed for the PC-SAFT and PPC-SAFT EoS models are regressed using the experimental liquid density and vapour pressure data of the pure components. It was observed that furan, 2-methylfuran and diacetyl, having weak dipole moments (μ < 1.0 D), could be modelled reasonably well without the addition of a polar contribution using conventional PC-SAFT, while it is recommended to use PPC-SAFT for the description of a polar compound like trans-crotonaldehyde (μ ∼ 3.67 D). It was observed that the SRK and PR EoS have similar predictive ability in comparison to PC-SAFT for mixtures of CO with the weakly polar compounds in this study. A comparison between the performances of the EoS models was made in two ways: first by setting the binary interaction parameter k_ij to zero, and second by adjusting a temperature-dependent binary interaction parameter (k_ij). However, due to the large differences between the chemical and
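A temperature-dependent binary interaction parameter of the kind adjusted above is often represented as a simple linear function of temperature, k_ij(T) = a + b·T, regressed against per-isotherm values. The k_ij values below are hypothetical placeholders, not the study's regressed parameters:

```python
# Hypothetical binary interaction parameters regressed at each isotherm
# (the study's actual values are not reproduced here).
temps = [273.15, 283.15, 298.15, 323.15]   # K
kij = [0.085, 0.080, 0.072, 0.060]

def linear_fit(x, y):
    """Closed-form least squares for k_ij(T) = a + b*T."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

a, b = linear_fit(temps, kij)
kij_310 = a + b * 310.0    # interpolated parameter at 310 K
```

Fitting one (a, b) pair across all isotherms lets the EoS interpolate k_ij smoothly between measured temperatures instead of switching between per-isotherm values.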

  15. Study on Antibiotic compounds from Pseudomonas aeruginosa NO4 Strain

    Energy Technology Data Exchange (ETDEWEB)

    Nam, Ji Young; Kim, Jin Kyu [Korea Atomic Energy Research Institute, Jeongeup (Korea, Republic of)

    2011-05-15

    As important human and veterinary medicines, antibiotics are produced and consumed in large quantities around the world. For example, more than 50 million pounds (22,000 tons) of antibiotics are produced in the U.S. each year, and annual production in Germany is about 2,000 tons. Antibiotics are low molecular weight microbial metabolites that at low concentrations inhibit the growth of other microorganisms. Resistant bacteria may also spread and become broader infection-control problems, not only within health care institutions but in communities as well. Clinically important resistant bacteria include methicillin-resistant Staphylococcus aureus (MRSA), a common cause of infection among hospitalized patients. Pseudomonas aeruginosa is a major cause of opportunistic infections among immunocompromised individuals. The spread of this organism in health care settings is often difficult to control due to the presence of multiple intrinsic and acquired mechanisms of antimicrobial resistance. In this study, we isolated a novel bacterium with strong antagonistic activity, separated antibiotic compounds from Pseudomonas sp., and analyzed the characteristics and molecular weight of the antibiotic compound.


  17. Considerations for Observational Research Using Large Data Sets in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    Jagsi, Reshma, E-mail: rjagsi@med.umich.edu [Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan (United States); Bekelman, Justin E. [Departments of Radiation Oncology and Medical Ethics and Health Policy, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania (United States); Chen, Aileen [Department of Radiation Oncology, Harvard Medical School, Boston, Massachusetts (United States); Chen, Ronald C. [Department of Radiation Oncology, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, North Carolina (United States); Hoffman, Karen [Department of Radiation Oncology, Division of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Tina Shih, Ya-Chen [Department of Medicine, Section of Hospital Medicine, The University of Chicago, Chicago, Illinois (United States); Smith, Benjamin D. [Department of Radiation Oncology, Division of Radiation Oncology, and Department of Health Services Research, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Yu, James B. [Yale School of Medicine, New Haven, Connecticut (United States)

    2014-09-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold

  18. Considerations for Observational Research Using Large Data Sets in Radiation Oncology

    International Nuclear Information System (INIS)

    Jagsi, Reshma; Bekelman, Justin E.; Chen, Aileen; Chen, Ronald C.; Hoffman, Karen; Tina Shih, Ya-Chen; Smith, Benjamin D.; Yu, James B.

    2014-01-01

    The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold

  19. A method to estimate the enthalpy of formation of organic compounds with chemical accuracy

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Meier, Robert J.; Sin, Gürkan

    2013-01-01

    through better correlation of data. For parameter estimation, a data-set containing 861 experimentally measured values of a wide variety of organic compounds (hydrocarbons, oxygenated compounds, nitrogenated compounds, multi-functional compounds, etc.) is used. The developed property model for Δf...

  20. Chemical compounds in teak

    Directory of Open Access Journals (Sweden)

    Fernanda Viana da Silva Leonardo

    2015-09-01

    Quinone compounds are largely generated in the extractive fraction of wood, within a complex and variable biological system. The literature points to applications in many segments, from the food industry to the pharmaceutical industry. Within the field of industrial wood use they are less valued, being treated only as incidental substances in the production chains of pulp, paper, charcoal, and sawn timber. In spite of their small amount compared with the chemical compounds regarded as essential, these substances have received special attention from researchers, revealing a diverse range of products for the textile, pharmaceutical, colorant, and polymer markets, in which they are being tested and employed. Quinones are found in fungi, lichens, and mostly in higher plants. Tectona grandis, commonly called teak, is able to biosynthesize anthraquinones, quinone compounds that are byproducts of its secondary metabolism. This species provides wood that is much prized in the furniture sector and can also be exploited for metabolites, supplying the market for quinone compounds and supporting the commercial development of new technologies, adding value to plantations of this species in the country.

  1. Phenolic Compounds in Brassica Vegetables

    Directory of Open Access Journals (Sweden)

    Pablo Velasco

    2010-12-01

    Phenolic compounds are a large group of phytochemicals widespread in the plant kingdom. Depending on their structure they can be classified into simple phenols, phenolic acids, hydroxycinnamic acid derivatives and flavonoids. Phenolic compounds have received considerable attention for being potentially protective factors against cancer and heart diseases, in part because of their potent antioxidative properties and their ubiquity in a wide range of commonly consumed foods of plant origin. The Brassicaceae family includes a wide range of horticultural crops, some of them with economic significance and extensively used in the diet throughout the world. The phenolic composition of Brassica vegetables has been recently investigated and, nowadays, the profile of different Brassica species is well established. Here, we review the significance of phenolic compounds as a source of beneficial compounds for human health and the influence of environmental conditions and processing mechanisms on the phenolic composition of Brassica vegetables.

  2. RADIOMETRIC NORMALIZATION OF LARGE AIRBORNE IMAGE DATA SETS ACQUIRED BY DIFFERENT SENSOR TYPES

    Directory of Open Access Journals (Sweden)

    S. Gehrke

    2016-06-01

    HxMap software. It has been successfully applied to large sets of heterogeneous imagery, including the adjustment of original sensor images prior to quality control and further processing as well as radiometric adjustment for ortho-image mosaic generation.

  3. Unusual bond paths in organolithium compounds

    International Nuclear Information System (INIS)

    Bachrach, S.M.; Ritchie, J.P.

    1986-01-01

    We have applied the topological method to a number of organolithium compounds. The wavefunctions were determined with GAUSSIAN-82 using the 3-21G basis set and fully optimized geometries. Gradient paths were obtained using the RHODER package, and critical points were located using EXTREME. These results indicate the unusual nature of organolithium compounds. The strange bond paths arise mainly from the ionic nature of the C-Li interaction. We suggest that the term "bond path" may be best suited for covalent bonds. 4 figs., 1 tab

  4. Characterization of the apoptotic response of human leukemia cells to organosulfur compounds

    International Nuclear Information System (INIS)

    Wong, W Wei-Lynn; Langler, Richard F; Penn, Linda Z; Boutros, Paul C; Wasylishen, Amanda R; Guckert, Kristal D; O'Brien, Erin M; Griffiths, Rebecca; Martirosyan, Anna R; Bros, Christina; Jurisica, Igor

    2010-01-01

    Novel therapeutic agents that selectively induce tumor cell death are urgently needed in the clinical management of cancers. Such agents would constitute effective adjuvant approaches to traditional chemotherapy regimens. Organosulfur compounds (OSCs), such as diallyl disulfide, have demonstrated anti-proliferative effects on cancer cells. We have previously shown that synthesized relatives of dysoxysulfone, a natural OSC derived from the Fijian medicinal plant, Dysoxylum richi, possess tumor-specific antiproliferative effects and are thus promising lead candidates. Because our structure-activity analyses showed that regions flanking the disulfide bond mediated specificity, we synthesized 18 novel OSCs by structural modification of the most promising dysoxysulfone derivatives. These compounds were tested for anti-proliferative and apoptotic activity in both normal and leukemic cells. Six OSCs exhibited tumor-specific killing, having no effect on normal bone marrow, and are thus candidates for future toxicity studies. We then employed mRNA expression profiling to characterize the mechanisms by which different OSCs induce apoptosis. Using Gene Ontology analysis we show that each OSC altered a unique set of pathways, and that these differences could be partially rationalized from a transcription factor binding site analysis. For example, five compounds altered genes with a large enrichment of p53 binding sites in their promoter regions (p < 0.0001). Taken together, these data establish OSCs derivatized from dysoxysulfone as a novel group of compounds for development as anti-cancer agents
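
    The binding-site enrichment reported above (p < 0.0001 for p53 sites among promoters of altered genes) is the kind of result an over-representation test yields. A minimal sketch of the standard hypergeometric tail test follows; the gene counts are invented for illustration, not taken from the study.

```python
# Hypergeometric enrichment test: probability of seeing at least k "hits"
# (e.g., genes with a p53 binding site) among n drawn genes, given K hits
# in a universe of N genes. Counts below are hypothetical.
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """One-sided tail P(X >= k) of the hypergeometric distribution."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Hypothetical: 10,000 genes, 500 with a p53 site; 100 altered genes, 20 hits
# (expected by chance: 100 * 500/10,000 = 5 hits).
p = hypergeom_enrichment_p(10_000, 500, 100, 20)
print(f"enrichment p = {p:.2e}")
```

    With 20 observed hits against 5 expected, the tail probability is far below the 0.0001 threshold quoted in the abstract.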

  5. Microbial degradation of furanic compounds : Biochemistry, genetics, and impact

    NARCIS (Netherlands)

    Wierckx, N.; Koopman, F.; Ruijssenaars, H.J.; De Winde, J.H.

    2011-01-01

    Microbial metabolism of furanic compounds, especially furfural and 5-hydroxymethylfurfural (HMF), is rapidly gaining interest in the scientific community. This interest can largely be attributed to the occurrence of toxic furanic aldehydes in lignocellulosic hydrolysates. However, these compounds

  6. Contact-based ligand-clustering approach for the identification of active compounds in virtual screening

    Directory of Open Access Journals (Sweden)

    Mantsyzov AB

    2012-09-01

    Alexey B Mantsyzov,1 Guillaume Bouvier,2 Nathalie Evrard-Todeschi,1 Gildas Bertho1 (1Université Paris Descartes, Sorbonne, Paris, France; 2Institut Pasteur, Paris, France). Abstract: Evaluation of docking results is one of the most important problems for virtual screening and in silico drug design. Modern approaches for the identification of active compounds in a large data set of docked molecules use energy scoring functions. One of the most significant general limitations of these methods is inaccurate binding-energy estimation, which results in false scoring of docked compounds. Automatic analysis of poses using self-organizing maps (AuPosSOM) represents an alternative approach for the evaluation of docking results, based on clustering compounds by the similarity of their contacts with the receptor. A scoring function was developed for the identification of active compounds in the AuPosSOM-clustered dataset. In addition, the efficiency of AuPosSOM for clustering compounds and identifying the key contacts considered important for activity was also improved. Benchmark tests for several targets revealed that, together with the developed scoring function, AuPosSOM represents a good alternative to energy-based scoring functions for the evaluation of docking results. Keywords: scoring, docking, virtual screening, CAR, AuPosSOM

  7. Global simulation of aromatic volatile organic compounds in the atmosphere

    Science.gov (United States)

    Cabrera Perez, David; Taraborrelli, Domenico; Pozzer, Andrea

    2015-04-01

    Among the large number of chemical compounds in the atmosphere, the organic group plays a key role in tropospheric chemistry. The subgroup called aromatics is of particular interest. Aromatics are the predominant trace gases in urban areas due to high emissions, primarily from vehicle exhaust and fuel evaporation. They are also present in areas where biofuel is used (i.e., residential wood burning). Emissions of aromatic compounds are a substantial fraction of the total emissions of volatile organic compounds (VOCs). The impact of aromatics on human health is very important: they not only contribute to ozone formation in the urban environment, but they are also highly toxic themselves, especially benzene, which can trigger a range of illnesses under long exposure, and nitro-phenols, which are harmful to humans and vegetation even at very low concentrations. The aim of this work is to assess the atmospheric impacts of aromatic compounds on the global scale. The main goals are: lifetime and budget estimation, mixing-ratio distributions, and the net effect on ozone production and OH loss for the most emitted aromatic compounds (benzene, toluene, xylenes, ethylbenzene, styrene and trimethylbenzenes). For this purpose, we use the numerical chemistry and climate simulation model ECHAM/MESSy Atmospheric Chemistry (EMAC) to build the global atmospheric budget for the most emitted and predominant aromatic compounds in the atmosphere. A set of emissions was prepared in order to include biomass burning, vegetation, and anthropogenic sources of aromatics in the model. A chemical mechanism based on the Master Chemical Mechanism (MCM) was developed to describe the gas-phase chemical oxidation of these aromatic compounds. The MCM has been reduced in the number of chemical equations and species in order to make it affordable in a 3D model. Additionally, other features have been added, for instance the production of HONO via ortho

  8. An expanded calibration study of the explicitly correlated CCSD(T)-F12b method using large basis set standard CCSD(T) atomization energies.

    Science.gov (United States)

    Feller, David; Peterson, Kirk A

    2013-08-28

    The effectiveness of the recently developed, explicitly correlated coupled cluster method CCSD(T)-F12b is examined in terms of its ability to reproduce atomization energies derived from complete basis set extrapolations of standard CCSD(T). Most of the standard method findings were obtained with aug-cc-pV7Z or aug-cc-pV8Z basis sets. For a few homonuclear diatomic molecules it was possible to push the basis set to the aug-cc-pV9Z level. F12b calculations were performed with the cc-pVnZ-F12 (n = D, T, Q) basis set sequence and were also extrapolated to the basis set limit using a Schwenke-style, parameterized formula. A systematic bias was observed in the F12b method with the (VTZ-F12/VQZ-F12) basis set combination. This bias resulted in the underestimation of reference values associated with small molecules (valence correlation energies < 0.5 E(h)) and an even larger overestimation of atomization energies for bigger systems. Consequently, caution should be exercised in the use of F12b for high accuracy studies. Root mean square and mean absolute deviation error metrics for this basis set combination were comparable to complete basis set values obtained with standard CCSD(T) and the aug-cc-pVDZ through aug-cc-pVQZ basis set sequence. However, the mean signed deviation was an order of magnitude larger. Problems partially due to basis set superposition error were identified with second row compounds which resulted in a weak performance for the smaller VDZ-F12/VTZ-F12 combination of basis sets.
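
    A complete-basis-set extrapolation of the kind used here can be sketched with the common two-point 1/n^3 form. Note the study uses a Schwenke-style formula, in which the mixing coefficient is fitted rather than fixed by the cardinal numbers; this generic version only illustrates the idea, and the energies below are invented, not values from the paper.

```python
# Two-point complete-basis-set (CBS) extrapolation of correlation energies,
# assuming the common form E(n) = E_CBS + A/n^3, where n is the cardinal
# number of the basis set (3 = triple-zeta, 4 = quadruple-zeta, ...).

def cbs_two_point(e_n, e_np1, n):
    """Extrapolate energies at cardinal numbers n and n+1 to the CBS limit."""
    m = n + 1
    return (e_np1 * m**3 - e_n * n**3) / (m**3 - n**3)

e_tz, e_qz = -0.3450, -0.3550        # hypothetical n = 3, 4 energies (Eh)
e_cbs = cbs_two_point(e_tz, e_qz, 3)
print(f"E_CBS = {e_cbs:.5f} Eh")
```

    The extrapolated energy always lies beyond the larger-basis value, which is why slow 1/n^3 convergence makes raw finite-basis energies insufficient for the benchmark-quality atomization energies discussed above.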

  9. Small-Molecule Compounds Exhibiting Target-Mediated Drug Disposition (TMDD): A Minireview.

    Science.gov (United States)

    An, Guohua

    2017-02-01

    Nonlinearities are commonplace in pharmacokinetics, and one special source is the saturable binding of the drug to a high-affinity, low-capacity target, a phenomenon known as target-mediated drug disposition (TMDD). Compared with large-molecule compounds undergoing TMDD, which has been well recognized due to its high prevalence, TMDD in small-molecule compounds is more counterintuitive and has not been well appreciated. With more and more potent small-molecule drugs acting on highly specific targets being developed, as well as increasingly sensitive analytical techniques becoming available, many small-molecule compounds have recently been reported to have nonlinear pharmacokinetics imparted by TMDD. To expand our current knowledge of TMDD in small-molecule compounds and increase awareness of this clinically important phenomenon, this minireview provides an overview of the small-molecule compounds that demonstrate nonlinear pharmacokinetics imparted by TMDD. The present review also summarizes the general features of TMDD in small-molecule compounds and highlights the differences between TMDD in small-molecule and large-molecule compounds. © 2016, The American College of Clinical Pharmacology.
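
    The saturable-binding nonlinearity described above can be sketched with a basic Mager/Jusko-type TMDD model: free drug C binds its target R to form a complex P that is internalized, alongside linear elimination. All rate constants and doses below are hypothetical, chosen only to make the dose-dependence visible; this is an illustrative sketch, not a model from the review.

```python
# Minimal TMDD sketch: free drug c, free target r, drug-target complex p.
# kel  - linear elimination of free drug      kon/koff - binding kinetics
# kint - internalization of the complex       ksyn/kdeg - target turnover
# Forward-Euler integration; all parameter values are hypothetical.

def simulate_tmdd(c0, dt=0.001, t_end=4.0,
                  kel=0.1, kon=10.0, koff=1.0, kint=0.5,
                  ksyn=1.0, kdeg=1.0):
    c, r, p = c0, ksyn / kdeg, 0.0       # target starts at its baseline
    for _ in range(int(round(t_end / dt))):
        bind = kon * c * r - koff * p    # net flux into the complex
        c += (-kel * c - bind) * dt
        r += (ksyn - kdeg * r - bind) * dt
        p += (bind - kint * p) * dt
    return c

# A low dose loses a far larger *fraction* of drug to target binding than a
# 10x dose that saturates the target -- the hallmark of TMDD nonlinearity.
low = simulate_tmdd(c0=1.0)
high = simulate_tmdd(c0=10.0)
print(low, high)
```

    Because the high dose saturates the low-capacity target, its remaining concentration is more than ten times the low-dose value, i.e., exposure grows more than dose-proportionally.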

  10. Veterinary Compounding: Regulation, Challenges, and Resources

    OpenAIRE

    Davidson, Gigi

    2017-01-01

    The spectrum of therapeutic need in veterinary medicine is large, and the availability of approved drug products for all veterinary species and indications is relatively small. For this reason, extemporaneous preparation, or compounding, of drugs is commonly employed to provide veterinary medical therapies. The scope of veterinary compounding is broad and focused primarily on meeting the therapeutic needs of companion animals and not food-producing animals in order to avoid human exposure to ...

  11. A Structure-Activity Relationship (SAR) Study of Neolignan Compounds with Anti-schistosomiasis Activity

    Directory of Open Access Journals (Sweden)

    Alves Claúdio N.

    2002-01-01

    A set of eighteen neolignan derivative compounds with anti-schistosomiasis activity was studied using the quantum-mechanical semi-empirical method PM3 and other theoretical methods in order to calculate selected molecular properties (variables or descriptors) to be correlated with their biological activities. Exploratory data analysis (principal component analysis (PCA) and hierarchical cluster analysis (HCA)), discriminant analysis (DA), and the Kth-nearest-neighbor (KNN) method were employed to obtain possible relationships between the calculated descriptors and the biological activities studied and to predict the anti-schistosomiasis activity of new compounds from a test set. The molecular descriptors responsible for the separation between active and inactive compounds were: hydration energy (HE), molecular refractivity (MR) and the charge on the C19 carbon atom (Q19). These descriptors give information on the kind of interaction that can occur between the compounds and their respective biological receptor. The prediction study was done with a new set of ten derivative compounds using the PCA, HCA, DA and KNN methods, and only five of them were predicted as active against schistosomiasis.
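
    The KNN step of such a workflow can be sketched in a few lines: each test compound is assigned the majority class of its k nearest training compounds in descriptor space. The descriptor values and labels below are invented for illustration; only the descriptor names (HE, MR, Q19) come from the abstract.

```python
# k-nearest-neighbour classification of compounds as active/inactive from
# molecular descriptors (HE, MR, Q19). All numeric values are hypothetical.
import math

training = [  # (HE, MR, Q19) -> class label
    ((-8.2, 95.0, -0.15), "active"),
    ((-7.9, 92.3, -0.12), "active"),
    ((-7.5, 90.1, -0.10), "active"),
    ((-3.1, 80.5,  0.05), "inactive"),
    ((-2.7, 78.9,  0.08), "inactive"),
    ((-3.8, 82.0,  0.02), "inactive"),
]

def knn_predict(x, k=3):
    """Majority vote among the k nearest training compounds (Euclidean)."""
    dists = sorted((math.dist(x, xi), label) for xi, label in training)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

print(knn_predict((-7.8, 93.0, -0.11)))  # near the "active" cluster
```

    In practice descriptors on different scales (MR spans tens of units, Q19 hundredths) would be autoscaled before computing distances, and the study combined KNN with PCA, HCA, and DA rather than using it alone.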

  12. Stock market returns and clinical trial results of investigational compounds: an event study analysis of large biopharmaceutical companies.

    Science.gov (United States)

    Hwang, Thomas J

    2013-01-01

    For biopharmaceutical companies, investments in research and development are risky, and the results from clinical trials are key inflection points in the process. Few studies have explored how and to what extent the public equity market values clinical trial results. Our study dataset matched announcements of clinical trial results for investigational compounds from January 2011 to May 2013 with daily stock market returns of large United States-listed pharmaceutical and biotechnology companies. Event study methodology was used to examine the relationship between clinical research events and changes in stock returns. We identified public announcements for clinical trials of 24 investigational compounds, including 16 (67%) positive and 8 (33%) negative events. The majority of announcements were for Phase 3 clinical trials (N = 13, 54%), and for oncologic (N = 7, 29%) and neurologic (N = 6, 24%) indications. The median cumulative abnormal returns on the day of the announcement were 0.8% (95% confidence interval [CI]: -2.3, 13.4%; P = 0.02) for positive events and -2.0% (95% CI: -9.1, 0.7%; P = 0.04) for negative events, with statistically significant differences from zero. In the day immediately following the announcement, firms with positive events were associated with stock price corrections, with median cumulative abnormal returns falling to 0.4% (95% CI: -3.8, 12.3%; P = 0.33). For firms with negative announcements, the median cumulative abnormal returns were -1.7% (95% CI: -9.5, 1.0%; P = 0.03), and remained significantly negative over the two day event window. The magnitude of abnormal returns did not differ statistically by indication, by trial phase, or between biotechnology and pharmaceutical firms. The release of clinical trial results is an economically significant event and has meaningful effects on market value for large biopharmaceutical companies. Stock return underperformance due to negative events is greater in magnitude and persists longer than
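
    The cumulative abnormal returns (CARs) reported above come from standard event-study methodology: a market model is fit over a pre-event estimation window, and abnormal returns in the event window are summed. A minimal sketch follows; all daily returns below are invented, not data from the study.

```python
# Event-study sketch: market-model abnormal returns around an announcement.
# AR_t = r_t - (alpha + beta * r_mkt_t); CAR = sum of AR_t over the window.

def ols_alpha_beta(stock, market):
    """Ordinary least squares fit of stock = alpha + beta * market."""
    n = len(stock)
    mx, my = sum(market) / n, sum(stock) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(market, stock))
    var = sum((x - mx) ** 2 for x in market)
    beta = cov / var
    return my - beta * mx, beta

# Pre-announcement estimation window (made-up daily returns):
est_stock  = [0.010, -0.004, 0.006, 0.002, -0.008, 0.012, 0.000, -0.002]
est_market = [0.008, -0.002, 0.004, 0.001, -0.006, 0.010, 0.001, -0.003]
alpha, beta = ols_alpha_beta(est_stock, est_market)

# Two-day event window: announcement day and the day after.
event_stock  = [0.041, -0.006]
event_market = [0.003,  0.001]
car = sum(r - (alpha + beta * m) for r, m in zip(event_stock, event_market))
print(f"beta = {beta:.2f}, CAR = {car:.4f}")
```

    Significance testing (the P values quoted above) would additionally compare the event-window CARs against the dispersion of abnormal returns in the estimation window.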

  13. The micro-environmental impact of volatile organic compound emissions from large-scale assemblies of people in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, Tanushree [Department of Civil & Environmental Engineering, Hanyang University, 222 Wangsimni-Ro, Seoul 04763 (Korea, Republic of); Kim, Ki-Hyun, E-mail: kkim61@hanyang.ac.kr [Department of Civil & Environmental Engineering, Hanyang University, 222 Wangsimni-Ro, Seoul 04763 (Korea, Republic of); Uchimiya, Minori [USDA-ARS Southern Regional Research Center, 1100 Robert E. Lee Boulevard, New Orleans, LA 70124 (United States); Kumar, Pawan [Department of Chemical Engineering, Indian Institute of Technology, Hauz Khas, New Delhi 11016 (India); Das, Subhasish; Bhattacharya, Satya Sundar [Soil & Agro-Bioengineering Lab, Department of Environmental Science, Tezpur University, Napaam 784028 (India); Szulejko, Jan [Department of Civil & Environmental Engineering, Hanyang University, 222 Wangsimni-Ro, Seoul 04763 (Korea, Republic of)

    2016-11-15

    Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations of air quality on such a small scale can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, and isoprene produced during concerts, movie screenings, and sport events (like the Olympics and the World Cup). This review summarizes the extent of VOC accumulation resulting from a large population in a confined area or in a small open area during sporting and other recreational activities. Apart from VOCs emitted directly from human bodies (e.g., perspiration and exhaled breath), those released indirectly from other related sources (e.g., smoking, waste disposal, discharge of food-waste, and use of personal-care products) are also discussed. Although direct and indirect emissions of VOCs from humans may constitute <1% of the global atmospheric VOC budget, unique spatiotemporal variations in VOC species within a confined space can have unforeseen impacts on the local atmosphere, leading to acute human exposure to harmful pollutants.

  14. The micro-environmental impact of volatile organic compound emissions from large-scale assemblies of people in a confined space

    International Nuclear Information System (INIS)

    Dutta, Tanushree; Kim, Ki-Hyun; Uchimiya, Minori; Kumar, Pawan; Das, Subhasish; Bhattacharya, Satya Sundar; Szulejko, Jan

    2016-01-01

    Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations of air quality on such a small scale can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, and isoprene produced during concerts, movie screenings, and sport events (like the Olympics and the World Cup). This review summarizes the extent of VOC accumulation resulting from a large population in a confined area or in a small open area during sporting and other recreational activities. Apart from VOCs emitted directly from human bodies (e.g., perspiration and exhaled breath), those released indirectly from other related sources (e.g., smoking, waste disposal, discharge of food-waste, and use of personal-care products) are also discussed. Although direct and indirect emissions of VOCs from humans may constitute <1% of the global atmospheric VOC budget, unique spatiotemporal variations in VOC species within a confined space can have unforeseen impacts on the local atmosphere, leading to acute human exposure to harmful pollutants.

  15. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of errors that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  16. Atom-type-based AI topological descriptors: application in structure-boiling point correlations of oxo organic compounds.

    Science.gov (United States)

    Ren, Biye

    2003-01-01

    Structure-boiling point relationships are studied for a series of oxo organic compounds by means of multiple linear regression (MLR) analysis. Excellent MLR models based on the recently introduced Xu index and the atom-type-based AI indices are obtained for two subsets containing 77 ethers and 107 carbonyl compounds, respectively, and for a combined set of 184 oxo compounds. The best models are tested using leave-one-out cross-validation and an external test set, respectively. The MLR model produces a correlation coefficient of r = 0.9977 and a standard error of s = 3.99 degrees C for the training set of 184 compounds, r(cv) = 0.9974 and s(cv) = 4.16 degrees C for the cross-validation set, and r(pred) = 0.9949 and s(pred) = 4.38 degrees C for the prediction set of 21 compounds. For the two subsets containing 77 ethers and 107 carbonyl compounds, the quality of the models is further improved: the standard errors are reduced to 3.30 and 3.02 degrees C, respectively. Furthermore, the results obtained from this study indicate that the boiling points of the studied oxo compounds depend predominantly on molecular size and also on individual atom types, especially oxygen heteroatoms, owing to strong polar interactions between molecules. These structure-boiling point models not only provide insights into the role of structural features in a molecule but also illustrate the usefulness of these indices in QSPR/QSAR modeling of complex compounds.
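
    The leave-one-out (LOO) cross-validation statistic s(cv) quoted above is obtained by refitting the model with each compound held out and predicting it. A minimal sketch with a single hypothetical size descriptor standing in for the Xu/AI indices; the descriptor values and boiling points below are invented for illustration.

```python
# Leave-one-out cross-validation of a simple structure-property model.
# One made-up descriptor x predicts boiling point y via y = a + b*x.

def fit(xs, ys):
    """Least-squares line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    b = num / den
    return my - b * mx, b

xs = [2.1, 2.8, 3.4, 4.0, 4.7, 5.3, 6.0]            # descriptor values
ys = [35.0, 62.0, 79.0, 98.0, 121.0, 138.0, 161.0]  # boiling points (deg C)

# Hold out each compound in turn, refit, and predict the held-out point.
residuals = []
for i in range(len(xs)):
    a, b = fit(xs[:i] + xs[i+1:], ys[:i] + ys[i+1:])
    residuals.append(ys[i] - (a + b * xs[i]))

s_cv = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
print(f"LOO standard error: {s_cv:.2f} deg C")
```

    Because each prediction is made for a compound the model never saw, s(cv) is a more honest error estimate than the training-set standard error s, which is why the study reports both.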

  17. Quantitative prediction of solvation free energy in octanol of organic compounds.

    Science.gov (United States)

    Delgado, Eduardo J; Jaña, Gonzalo A

    2009-03-01

    The free energy of solvation, ΔGS0, in octanol of organic compounds is quantitatively predicted from the molecular structure. The model, involving only three molecular descriptors, is obtained by multiple linear regression analysis from a data set of 147 compounds containing diverse organic functions, namely, halogenated and non-halogenated alkanes, alkenes, alkynes, aromatics, alcohols, aldehydes, ketones, amines, ethers and esters; covering a ΔGS0 range from about −50 to 0 kJ·mol⁻¹. The model predicts the free energy of solvation with a squared correlation coefficient of 0.93 and a standard deviation of 2.4 kJ·mol⁻¹, just marginally larger than the generally accepted value of experimental uncertainty. The involved molecular descriptors have definite physical meaning corresponding to the different intermolecular interactions occurring in the bulk liquid phase. The model is validated with an external set of 36 compounds not included in the training set.

  18. Olefination of carbonyl compounds: modern and classical methods

    Energy Technology Data Exchange (ETDEWEB)

    Korotchenko, V N; Nenajdenko, Valentine G; Balenkova, Elizabeth S [Department of Chemistry, M.V. Lomonosov Moscow State University, Moscow (Russian Federation); Shastin, Aleksey V [Institute of Problems of Chemical Physics, Russian Academy of Sciences, Chernogolovka, Moscow Region (Russian Federation)

    2004-10-31

    The published data on the methods for alkene synthesis by olefination of carbonyl compounds are generalised and systematised. The main attention is given to the use of transition metals and organoelement compounds. The review covers the data on both classical and newly developed methods that are little known to chemists at large.

  19. Olefination of carbonyl compounds: modern and classical methods

    Science.gov (United States)

    Korotchenko, V. N.; Nenajdenko, Valentine G.; Balenkova, Elizabeth S.; Shastin, Aleksey V.

    2004-10-01

    The published data on the methods for alkene synthesis by olefination of carbonyl compounds are generalised and systematised. The main attention is given to the use of transition metals and organoelement compounds. The review covers the data on both classical and newly developed methods that are little known to chemists at large.

  20. Set of CAMAC modules on the base of large integrated circuits for an accelerator synchronization system

    International Nuclear Information System (INIS)

    Glejbman, Eh.M.; Pilyar, N.V.

    1986-01-01

    Parameters of functional modules in the CAMAC standard developed for an accelerator synchronization system are presented. They comprise BZN-8K and BZ-8K digital delay circuits, a timing circuit and a pulse selection circuit. Each module uses three large-scale integrated circuits of the KR580VI53 programmable-timer type, circuits interfacing the system bus with the crate bus, data-recording control circuits, two peripheral storage devices, initial-state setting circuits, input and output shapers, and circuits for setting and clearing channel blocking.

  1. Implementation of Lifestyle Modification Program Focusing on Physical Activity and Dietary Habits in a Large Group, Community-Based Setting

    Science.gov (United States)

    Stoutenberg, Mark; Falcon, Ashley; Arheart, Kris; Stasi, Selina; Portacio, Francia; Stepanenko, Bryan; Lan, Mary L.; Castruccio-Prince, Catarina; Nackenson, Joshua

    2017-01-01

    Background: Lifestyle modification programs improve several health-related behaviors, including physical activity (PA) and nutrition. However, few of these programs have been expanded to impact a large number of individuals in one setting at one time. Therefore, the purpose of this study was to determine whether a PA- and nutrition-based lifestyle…

  2. DNMT1 is associated with cell cycle and DNA replication gene sets in diffuse large B-cell lymphoma.

    Science.gov (United States)

    Loo, Suet Kee; Ab Hamid, Suzina Sheikh; Musa, Mustaffa; Wong, Kah Keng

    2018-01-01

    Dysregulation of DNA (cytosine-5)-methyltransferase 1 (DNMT1) is associated with the pathogenesis of various types of cancer. It has previously been shown that DNMT1 is frequently expressed in diffuse large B-cell lymphoma (DLBCL); however, its functions in the disease remain to be elucidated. In this study, we obtained gene expression profiles (GEP) of a germinal center B-cell-like DLBCL (GCB-DLBCL)-derived cell line (i.e. HT) treated with shRNA targeting DNMT1 (shDNMT1), compared with HT cells treated with non-silencing (control) shRNA. Independent gene set enrichment analysis (GSEA) performed using the GEPs of shRNA-treated HT cells and of primary GCB-DLBCL cases from two publicly available datasets (i.e. GSE10846 and GSE31312) produced three separate lists of enriched gene sets for each gene set collection from the Molecular Signatures Database (MSigDB). Subsequent Venn analysis identified 268, 145 and six consensus gene sets from the C2 collection (curated gene sets), the C5 sub-collection [gene sets from gene ontology (GO) biological process ontology] and the Hallmark collection, respectively, to be enriched in positive correlation with DNMT1 expression profiles in shRNA-treated HT cells and the GSE10846 and GSE31312 datasets [false discovery rate (FDR) 0.8) with DNMT1 expression and significantly downregulated (log fold-change <-1.35; p<0.05) following DNMT1 silencing in HT cells. These results suggest the involvement of DNMT1 in the activation of cell cycle and DNA replication in DLBCL cells. Copyright © 2017 Elsevier GmbH. All rights reserved.
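    The Venn-analysis step described above, taking the consensus of gene sets enriched in all three analyses, amounts to a set intersection; the gene set names below are invented placeholders, not the actual MSigDB entries:

```python
# Consensus ("Venn") analysis of three independent GSEA result lists:
# keep only the gene sets enriched in every analysis.
ht_cells = {"CELL_CYCLE", "DNA_REPLICATION", "MYC_TARGETS", "APOPTOSIS"}
gse10846 = {"CELL_CYCLE", "DNA_REPLICATION", "E2F_TARGETS", "MYC_TARGETS"}
gse31312 = {"CELL_CYCLE", "DNA_REPLICATION", "MYC_TARGETS", "G2M_CHECKPOINT"}

consensus = ht_cells & gse10846 & gse31312   # enriched in all three
print(sorted(consensus))
# → ['CELL_CYCLE', 'DNA_REPLICATION', 'MYC_TARGETS']
```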

  3. Early repositioning through compound set enrichment analysis: a knowledge-recycling strategy.

    Science.gov (United States)

    Temesi, Gergely; Bolgár, Bence; Arany, Adám; Szalai, Csaba; Antal, Péter; Mátyus, Péter

    2014-04-01

    Despite famous serendipitous drug repositioning success stories, systematic projects have not yet delivered the expected results. However, repositioning technologies are gaining ground in different phases of routine drug development, together with new adaptive strategies. We demonstrate the power of the compound information pool, the ever-growing heterogeneous information repertoire of approved drugs and candidates as an invaluable catalyzer in this transition. Systematic, computational utilization of this information pool for candidates in early phases is an open research problem; we propose a novel application of the enrichment analysis statistical framework for fusion of this information pool, specifically for the prediction of indications. Pharmaceutical consequences are formulated for a systematic and continuous knowledge recycling strategy, utilizing this information pool throughout the drug-discovery pipeline.
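    A minimal sketch of the statistic typically underlying this kind of set enrichment analysis is the hypergeometric over-representation test; the counts below are illustrative, not from the paper:

```python
# Hypergeometric enrichment test: is a compound set over-represented
# among the compounds linked to an indication?
from math import comb

def hypergeom_pval(N, K, n, k):
    """P(X >= k) when drawing n compounds from N, of which K are in the set."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 10,000 compounds overall, 100 in the set of interest,
# 50 linked to the indication, 8 of those fall in the set.
p = hypergeom_pval(10_000, 100, 50, 8)
print(f"p = {p:.2e}")
```

    A small p-value suggests the compound set is enriched for the indication, which is the signal such a repositioning framework would fuse across information sources.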

  4. Magnetic anisotropy basis sets for epitaxial (110) and (111) REFe2 nanofilms

    International Nuclear Information System (INIS)

    Bowden, G J; Martin, K N; Fox, A; Rainford, B D; Groot, P A J de

    2008-01-01

    Magnetic anisotropy basis sets for the cubic Laves phase rare-earth intermetallic REFe2 compounds are discussed in some detail. Such compounds can be either free-standing or thin films grown in either (110) or (111) mode using molecular beam epitaxy. For the latter, it is useful to rotate to a new coordinate system where the z-axis coincides with the growth axis of the film. In this paper, three symmetry-adapted basis sets are given, for multipole moments up to n = 12. These sets can be used for free-standing compounds and for (110) and (111) epitaxial films. In addition, the distortion of REFe2 films grown on sapphire substrates is also considered. The distortions are different for the (110) and (111) films. Strain-induced harmonic sets are given for both specific and general distortions. Finally, some predictions are made concerning the preferred direction of easy magnetization in (111) molecular-beam-epitaxy-grown REFe2 films

  5. ScienceHub data set for "Detection of semi-volatile organic compounds in permeable pavement infiltrate"

    Data.gov (United States)

    U.S. Environmental Protection Agency — Observed permeable pavement infiltrate concentrations by EPA (1996) method 8270C Semivolatile Organic Compounds by Gas Chromatography/Mass Spectrometry (GC/MS) with...

  6. Discovery of novel SERCA inhibitors by virtual screening of a large compound library.

    Science.gov (United States)

    Elam, Christopher; Lape, Michael; Deye, Joel; Zultowsky, Jodie; Stanton, David T; Paula, Stefan

    2011-05-01

    Two screening protocols based on recursive partitioning and computational ligand docking methodologies, respectively, were employed for virtual screens of a compound library with 345,000 entries for novel inhibitors of the enzyme sarco/endoplasmic reticulum calcium ATPase (SERCA), a potential target for cancer chemotherapy. A total of 72 compounds that were predicted to be potential inhibitors of SERCA were tested in bioassays and 17 displayed inhibitory potencies at concentrations below 100 μM. The majority of these inhibitors were composed of two phenyl rings tethered to each other by a short link of one to three atoms. Putative interactions between SERCA and the inhibitors were identified by inspection of docking-predicted poses and some of the structural features required for effective SERCA inhibition were determined by analysis of the classification pattern employed by the recursive partitioning models. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
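    As a rough illustration of the recursive partitioning idea used in the first screening protocol, the following toy code finds the single descriptor/threshold split that best separates actives from inactives (one node of a classification tree); the descriptors and activity labels are synthetic, not SERCA data:

```python
# One node of a recursive-partitioning (decision tree) model:
# exhaustively search for the descriptor threshold minimizing
# Gini impurity between actives (label 1) and inactives (label 0).
def best_split(rows):
    """rows: list of (descriptor_vector, label). Return (feature, threshold)."""
    def gini(group):
        if not group:
            return 0.0
        p = sum(lbl for _, lbl in group) / len(group)
        return 2 * p * (1 - p)

    best = (None, None, float("inf"))
    for f in range(len(rows[0][0])):
        for x, _ in rows:
            t = x[f]
            left = [r for r in rows if r[0][f] < t]
            right = [r for r in rows if r[0][f] >= t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
            if score < best[2]:
                best = (f, t, score)
    return best[0], best[1]

# toy data: feature 0 (e.g. a size descriptor) separates the classes
rows = [([1.2, 3.0], 0), ([1.5, 9.0], 0), ([4.8, 2.5], 1), ([5.1, 7.0], 1)]
f, t = best_split(rows)
print(f"split on feature {f} at threshold {t}")
# → split on feature 0 at threshold 4.8
```

    A full recursive partitioning model applies this split search recursively to each resulting subset until the leaves are (nearly) pure.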

  7. Chemistry of tin compounds and environment

    International Nuclear Information System (INIS)

    Ali, S.; Mazhar, M.; Mahmood, S.; Bhatti, M.H.; Chaudhary, M.A.

    1997-01-01

    Of the large number of tin compounds reported in the literature, possibly only 100 are commercially important. Tin compounds are used for a wide variety of purposes: as catalysts, stabilizers for many materials including polymers, biocidal agents, bactericides, insecticides, fungicides, wood preservatives, acaricides and antifouling agents in paints, anticancer and antitumour agents, ceramic opacifiers, textile additives, in metal finishing operations, as food additives, and in electroconductive coatings. All these applications expose the environment to tin contamination. The application of organotin compounds as biocides accounts for about 30% of total tin consumption, suggesting that the main environmental effects are likely to originate from this sector. Diorganotins and monoorganotins are used mainly in the plastics industry, which is the next big source of environmental pollution. In this presentation, all environmental aspects of the use of tin compounds and the recommended preventive measures are discussed. (author)

  8. Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation

    Science.gov (United States)

    Muhammad, Syahidah; Frew, Russell; Hayman, Alan

    2015-02-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, alkanes, to generate carbon and hydrogen isotopic data of the compositions of the compounds which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA when used in tandem with multivariate statistical analysis provide a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin i.e. the very subtle differences in isotopic values between the samples.

  9. Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation

    Directory of Open Access Journals (Sweden)

    Syahidah Akmal Muhammad

    2015-02-01

    Full Text Available Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, alkanes, to generate carbon and hydrogen isotopic data of the compositions of the compounds which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA when used in tandem with multivariate statistical analysis provide a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin i.e. the very subtle differences in isotopic values between the samples.

  10. Compound-specific isotope analysis of diesel fuels in a forensic investigation.

    Science.gov (United States)

    Muhammad, Syahidah A; Frew, Russell D; Hayman, Alan R

    2015-01-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, alkanes, to generate carbon and hydrogen isotopic data of the compositions of the compounds which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ(13)C and δ(2)H of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA when used in tandem with multivariate statistical analysis provide a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin, i.e., the very subtle differences in isotopic values between the samples.
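    The multivariate comparison underlying this kind of source attribution can be caricatured as distance-based matching on per-alkane isotope values; the δ13C/δ2H numbers below are invented for illustration:

```python
# Toy source matching: each sample is a vector of per-alkane
# delta 13C and delta 2H values; the suspect sample is assigned
# to the source with the smallest Euclidean distance.
import math

samples = {
    "suspect":  [-29.1, -28.7, -122.0, -119.5],
    "source_A": [-29.0, -28.8, -121.4, -119.9],
    "source_B": [-31.6, -30.9, -140.2, -138.0],
}

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

ref = samples["suspect"]
nearest = min((k for k in samples if k != "suspect"),
              key=lambda k: dist(ref, samples[k]))
print("suspect is closest to", nearest)
# → suspect is closest to source_A
```

    Real analyses of this kind would typically standardize the variables and use PCA or hierarchical clustering rather than raw nearest-neighbour distances, but the grouping logic is the same.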

  11. Knowledge and theme discovery across very large biological data sets using distributed queries: a prototype combining unstructured and structured data.

    Directory of Open Access Journals (Sweden)

    Uma S Mudunuri

    Full Text Available As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework.
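    The MapReduce-style query pattern referred to above can be illustrated with a toy in-process example (map records to key/value pairs, then aggregate by key); the text records and gene names are invented:

```python
# Toy MapReduce over unstructured text records: the map phase emits
# (gene, 1) pairs, the reduce phase aggregates counts per gene.
from collections import defaultdict

records = [
    "BRCA1 variant observed with TP53 mutation",
    "TP53 pathway discussed in cohort study",
    "BRCA1 expression linked to outcome",
]
GENES = {"BRCA1", "TP53"}

def map_phase(record):
    return [(tok, 1) for tok in record.split() if tok in GENES]

def reduce_phase(pairs):
    counts = defaultdict(int)
    for key, value in pairs:        # shuffle/aggregate by key
        counts[key] += value
    return dict(counts)

pairs = [kv for rec in records for kv in map_phase(rec)]
print(reduce_phase(pairs))
# → {'BRCA1': 2, 'TP53': 2}
```

    In a real Hadoop deployment the map and reduce phases run on separate workers over distributed storage; the logic per record is the same.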

  12. On the enrichment of hydrophobic organic compounds in fog droplets

    Science.gov (United States)

    Valsaraj, K. T.; Thoma, G. J.; Reible, D. D.; Thibodeaux, L. J.

    The unusual degree of enrichment of hydrophobic organics in fogwater droplets reported by several investigators can be interpreted as a result of (a) the effects of temperature correction on the reported enrichment factors, (b) the effects of colloidal organic matter (both filterable and non-filterable) in fog water and (c) the effects of the large air-water interfacial adsorption of neutral hydrophobic organics on the tiny fog droplets. The enrichment factor was directly correlated to the hydrophobicity (or the activity coefficient in water) of the compounds, as indicated by their octanol-water partition constants. Compounds with large octanol-water partition coefficients (high activity coefficients in water) showed the largest enrichment. Available experimental data on the adsorption of hydrophobic compounds at the air-water interface and on colloidal organic carbon were used to show that the large specific air-water interfacial areas of fog droplets contribute significantly to the enrichment factor.
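    One simple way to formalize the interfacial-adsorption contribution described above: for a spherical droplet of radius r the area-to-volume ratio is 3/r, so an air-water interface adsorption constant K_ia (units of length) gives an apparent enrichment EF = 1 + 3·K_ia/r. The K_ia value below is hypothetical, chosen only to show how small droplets amplify the effect:

```python
# Apparent enrichment factor from air-water interfacial adsorption
# on a spherical droplet: EF = 1 + K_ia * (A/V) = 1 + 3*K_ia/r.
def enrichment_factor(k_ia_m, radius_m):
    return 1.0 + k_ia_m * 3.0 / radius_m

k_ia = 1e-6                      # m, hypothetical adsorption constant
for r_um in (1, 5, 50):          # fog droplets span roughly 1-50 micrometres
    ef = enrichment_factor(k_ia, r_um * 1e-6)
    print(f"r = {r_um:3d} um -> EF = {ef:.2f}")
```

    The trend, strong apparent enrichment for the smallest droplets and near-unity EF for large ones, mirrors the abstract's point that the large specific interfacial area of tiny fog droplets drives the enrichment of hydrophobic compounds.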

  13. Annotating gene sets by mining large literature collections with protein networks.

    Science.gov (United States)

    Wang, Sheng; Ma, Jianzhu; Yu, Michael Ku; Zheng, Fan; Huang, Edward W; Han, Jiawei; Peng, Jian; Ideker, Trey

    2018-01-01

    Analysis of patient genomes and transcriptomes routinely recognizes new gene sets associated with human disease. Here we present an integrative natural language processing system which infers common functions for a gene set through automatic mining of the scientific literature with biological networks. This system links genes with associated literature phrases and combines these links with protein interactions in a single heterogeneous network. Multiscale functional annotations are inferred based on network distances between phrases and genes and then visualized as an ontology of biological concepts. To evaluate this system, we predict functions for gene sets representing known pathways and find that our approach achieves substantial improvement over the conventional text-mining baseline method. Moreover, our system discovers novel annotations for gene sets or pathways without previously known functions. Two case studies demonstrate how the system is used in discovery of new cancer-related pathways with ontological annotations.

  14. Expatriate Compound Living: An Ethnographic Field Study

    DEFF Research Database (Denmark)

    Lauring, Jakob; Selmer, Jan

    2009-01-01

    In certain countries, closed expatriate compounds have developed. They serve to provide resident expatriates and accompanying family members with a comfortable and safe environment. Unfortunately, not much is known about compound life since associated empirical research is scarce. Through ethnographic field-work methodology, including interviews and participant observation during a period of three months, this exploratory study investigated 16 Danish business expatriates of a large Danish corporation and their families living in the same compound in Saudi Arabia. They shared their spare time and the expatriates had the same working hours in the same subsidiary. Results show that a Danish national group was established and maintained. This in-group dominated life in the compound, and at work it may have contributed to the perceptual bias and discriminatory behaviour demonstrated by the Danish expatriates…

  15. Double generalized linear compound Poisson models to insurance claims data

    DEFF Research Database (Denmark)

    Andersen, Daniel Arnfeldt; Bonat, Wagner Hugo

    2017-01-01

    This paper describes the specification, estimation and comparison of double generalized linear compound Poisson models based on the likelihood paradigm. The models are motivated by insurance applications, where the distribution of the response variable is composed of a degenerate distribution… implementation and illustrate the application of double generalized linear compound Poisson models using a data set about car insurances.

  16. Unconventional superconductivity in heavy-fermion compounds

    Energy Technology Data Exchange (ETDEWEB)

    White, B.D. [Department of Physics, University of California, San Diego, La Jolla, CA 92093 (United States); Center for Advanced Nanoscience, University of California, San Diego, La Jolla, CA 92093 (United States); Thompson, J.D. [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Maple, M.B., E-mail: mbmaple@ucsd.edu [Department of Physics, University of California, San Diego, La Jolla, CA 92093 (United States); Center for Advanced Nanoscience, University of California, San Diego, La Jolla, CA 92093 (United States)

    2015-07-15

    Highlights: • Quasiparticles in heavy-fermion compounds are much heavier than free electrons. • Superconductivity involves pairing of these massive quasiparticles. • Quasiparticle pairing mediated by magnetic or quadrupolar fluctuations. • We review the properties of superconductivity in heavy-fermion compounds. - Abstract: Over the past 35 years, research on unconventional superconductivity in heavy-fermion systems has evolved from the surprising observations of unprecedented superconducting properties in compounds that convention dictated should not superconduct at all to performing explorations of rich phase spaces in which the delicate interplay between competing ground states appears to support emergent superconducting states. In this article, we review the current understanding of superconductivity in heavy-fermion compounds and identify a set of characteristics that is common to their unconventional superconducting states. These core properties are compared with those of other classes of unconventional superconductors such as the cuprates and iron-based superconductors. We conclude by speculating on the prospects for future research in this field and how new advances might contribute towards resolving the long-standing mystery of how unconventional superconductivity works.

  17. Stress-enhanced lithiation in MAX compounds for battery applications

    KAUST Repository

    Zhu, Jiajie

    2017-07-31

    Li-ion batteries are well-established energy storage systems. Upon lithiation conventional group IVA compound anodes undergo large volume expansion and thus suffer from stress-induced performance degradation. Instead of the emerging MXene anodes fabricated by an expensive and difficult-to-control etching technique, we study the feasibility of utilizing the parent MAX compounds. We reveal that M2AC (M=Ti, V and A=Si, S) compounds repel lithiation at ambient conditions, while structural stress turns out to support lithiation, in contrast to group IVA compounds. For V2SC the Li diffusion barrier is found to be lower than reported for group IVA compound anodes, reflecting potential to achieve fast charge/discharge.

  18. Stress-enhanced lithiation in MAX compounds for battery applications

    KAUST Repository

    Zhu, Jiajie; Chroneos, Alexander; Wang, Lei; Rao, Feng; Schwingenschlögl, Udo

    2017-01-01

    Li-ion batteries are well-established energy storage systems. Upon lithiation conventional group IVA compound anodes undergo large volume expansion and thus suffer from stress-induced performance degradation. Instead of the emerging MXene anodes fabricated by an expensive and difficult-to-control etching technique, we study the feasibility of utilizing the parent MAX compounds. We reveal that M2AC (M=Ti, V and A=Si, S) compounds repel lithiation at ambient conditions, while structural stress turns out to support lithiation, in contrast to group IVA compounds. For V2SC the Li diffusion barrier is found to be lower than reported for group IVA compound anodes, reflecting potential to achieve fast charge/discharge.

  19. ToxAlerts: a Web server of structural alerts for toxic chemicals and compounds with potential adverse reactions.

    Science.gov (United States)

    Sushko, Iurii; Salmina, Elena; Potemkin, Vladimir A; Poda, Gennadiy; Tetko, Igor V

    2012-08-27

    The article presents a Web-based platform for collecting and storing toxicological structural alerts from literature and for virtual screening of chemical libraries to flag potentially toxic chemicals and compounds that can cause adverse side effects. An alert is uniquely identified by a SMARTS template, a toxicological endpoint, and a publication where the alert was described. Additionally, the system allows storing complementary information such as name, comments, and mechanism of action, as well as other data. Most importantly, the platform can be easily used for fast virtual screening of large chemical datasets, focused libraries, or newly designed compounds against the toxicological alerts, providing a detailed profile of the chemicals grouped by structural alerts and endpoints. Such a facility can be used for decision making regarding whether a compound should be tested experimentally, validated with available QSAR models, or eliminated from consideration altogether. The alert-based screening can also be helpful for an easier interpretation of more complex QSAR models. The system is publicly accessible and tightly integrated with the Online Chemical Modeling Environment (OCHEM, http://ochem.eu). The system is open and expandable: any registered OCHEM user can introduce new alerts, browse, edit alerts introduced by other users, and virtually screen his/her data sets against all or selected alerts. The user sets being passed through the structural alerts can be used at OCHEM for other typical tasks: exporting in a wide variety of formats, development of QSAR models, additional filtering by other criteria, etc. The database already contains almost 600 structural alerts for such endpoints as mutagenicity, carcinogenicity, skin sensitization, compounds that undergo metabolic activation, and compounds that form reactive metabolites and, thus, can cause adverse reactions. 
The ToxAlerts platform is accessible on the Web at http://ochem.eu/alerts, and it is constantly
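    The alert-based screening idea can be sketched as follows; real alerts are SMARTS templates matched with a chemistry toolkit, whereas this toy version uses plain substring matching on SMILES strings purely for illustration (the patterns and compounds are invented):

```python
# Toy structural-alert screen: flag compounds whose SMILES matches
# any alert pattern. NOTE: substring matching is a crude stand-in for
# real SMARTS substructure matching with a cheminformatics toolkit.
ALERTS = {
    "nitro": "[N+](=O)[O-]",   # illustrative alert pattern
}
library = {
    "cmpd_1": "c1ccccc1[N+](=O)[O-]",   # nitrobenzene
    "cmpd_2": "CCO",                     # ethanol
}

def screen(smiles_by_id):
    """Return {compound_id: [matched alert names]} for flagged compounds."""
    flags = {}
    for cid, smi in smiles_by_id.items():
        hits = [name for name, patt in ALERTS.items() if patt in smi]
        if hits:
            flags[cid] = hits
    return flags

print(screen(library))
# → {'cmpd_1': ['nitro']}
```

    The output profile, compounds grouped by the alerts they trigger, is the kind of report such a platform produces to support decisions on testing, modeling, or elimination.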

  20. PeptideNavigator: An interactive tool for exploring large and complex data sets generated during peptide-based drug design projects.

    Science.gov (United States)

    Diller, Kyle I; Bayden, Alexander S; Audie, Joseph; Diller, David J

    2018-01-01

    There is growing interest in peptide-based drug design and discovery. Due to their relatively large size, polymeric nature, and chemical complexity, the design of peptide-based drugs presents an interesting "big data" challenge. Here, we describe an interactive computational environment, PeptideNavigator, for naturally exploring the tremendous amount of information generated during a peptide drug design project. The purpose of PeptideNavigator is the presentation of large and complex experimental and computational data sets, particularly 3D data, so as to enable multidisciplinary scientists to make optimal decisions during a peptide drug discovery project. PeptideNavigator provides users with numerous viewing options, such as scatter plots, sequence views, and sequence frequency diagrams. These views allow for the collective visualization and exploration of many peptides and their properties, ultimately enabling the user to focus on a small number of peptides of interest. To drill down into the details of individual peptides, PeptideNavigator provides users with a Ramachandran plot viewer and a fully featured 3D visualization tool. Each view is linked, allowing the user to seamlessly navigate from collective views of large peptide data sets to the details of individual peptides with promising property profiles. Two case studies, based on MHC-1A activating peptides and MDM2 scaffold design, are presented to demonstrate the utility of PeptideNavigator in the context of disparate peptide-design projects. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Design of a general-purpose European compound screening library for EU-OPENSCREEN.

    Science.gov (United States)

    Horvath, Dragos; Lisurek, Michael; Rupp, Bernd; Kühne, Ronald; Specker, Edgar; von Kries, Jens; Rognan, Didier; Andersson, C David; Almqvist, Fredrik; Elofsson, Mikael; Enqvist, Per-Anders; Gustavsson, Anna-Lena; Remez, Nikita; Mestres, Jordi; Marcou, Gilles; Varnek, Alexander; Hibert, Marcel; Quintana, Jordi; Frank, Ronald

    2014-10-01

    This work describes a collaborative effort to define and apply a protocol for the rational selection of a general-purpose screening library, to be used by the screening platforms affiliated with the EU-OPENSCREEN initiative. It is designed as a standard source of compounds for primary screening against novel biological targets, at the request of research partners. Given the general nature of the potential applications of this compound collection, the focus of the selection strategy lies on ensuring chemical stability, absence of reactive compounds, screening-compliant physicochemical properties, loose compliance to drug-likeness criteria (as drug design is a major, but not exclusive application), and maximal diversity/coverage of chemical space, aimed at providing hits for a wide spectrum of druggable targets. Finally, practical availability/cost issues cannot be avoided. The main goal of this publication is to inform potential future users of this library about its conception, sources, and characteristics. The outline of the selection procedure, notably of the filtering rules designed by a large committee of European medicinal chemists and chemoinformaticians, may be of general methodological interest for the screening/medicinal chemistry community. The selection task of 200K molecules out of a pre-filtered set of 1.4M candidates was shared by five independent European research groups, each picking a subset of 40K compounds according to their own in-house methodology and expertise. An in-depth analysis of chemical space coverage of the library serves not only to characterize the collection, but also to compare the various chemoinformatics-driven selection procedures of maximal diversity sets. Compound selections contributed by various participating groups were mapped onto general-purpose self-organizing maps (SOMs) built on the basis of marketed drugs and bioactive reference molecules. 
In this way, the occupancy of chemical space by the EU-OPENSCREEN library could
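    One common chemoinformatics approach to the kind of maximal-diversity subset selection described above is MaxMin picking; the fingerprints below are toy bit-sets rather than real molecular fingerprints, and Tanimoto distance stands in for whatever metric a given group would use:

```python
# MaxMin diversity selection: greedily add the compound that is
# farthest (by Tanimoto distance) from its nearest already-picked
# neighbour, so the subset spreads out over chemical space.
def tanimoto_dist(a, b):
    inter, union = len(a & b), len(a | b)
    return 1.0 - (inter / union if union else 0.0)

def maxmin_pick(pool, n_pick):
    picked = [0]                       # seed with the first compound
    while len(picked) < n_pick:
        best = max((i for i in range(len(pool)) if i not in picked),
                   key=lambda i: min(tanimoto_dist(pool[i], pool[j])
                                     for j in picked))
        picked.append(best)
    return picked

# toy "fingerprints" as sets of on-bits
pool = [{1, 2, 3}, {1, 2, 4}, {7, 8, 9}, {7, 8, 10}, {20, 21}]
print(maxmin_pick(pool, 3))
# → [0, 2, 4]
```

    Note how the near-duplicates (indices 1 and 3) are skipped: each new pick comes from a different cluster, which is the diversity behaviour a library-selection protocol is after.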

  2. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    Energy Technology Data Exchange (ETDEWEB)

    Spackman, Peter R.; Karton, Amir, E-mail: amir.karton@uwa.edu.au [School of Chemistry and Biochemistry, The University of Western Australia, Perth, WA 6009 (Australia)

    2015-05-15

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/L^α two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis-set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol⁻¹. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality, and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol⁻¹.
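    The A + B/L^α two-point extrapolation mentioned above can be sketched directly: given energies at two cardinal numbers L, solve for the complete-basis-set (CBS) limit. The correlation energies and the exponent α = 3 below are illustrative values, not data from the study:

```python
# Two-point basis-set extrapolation E(L) = E_CBS + B / L**alpha:
# with energies at L_small and L_large, eliminate B and solve for E_CBS.
def extrapolate_cbs(e_small, e_large, l_small, l_large, alpha=3.0):
    """Solve E(L) = E_CBS + B / L**alpha from two (L, E) points."""
    denom = l_small ** -alpha - l_large ** -alpha
    b = (e_small - e_large) / denom
    return e_small - b * l_small ** -alpha

# toy CCSD correlation energies (hartree) at DZ (L=2) and TZ (L=3)
e_dz, e_tz = -0.300, -0.340
e_cbs = extrapolate_cbs(e_dz, e_tz, 2, 3, alpha=3.0)
print(f"E_CBS ≈ {e_cbs:.4f} Eh")
```

    The "system-dependent" variant discussed in the abstract replaces the fixed α with an exponent fitted per system from cheaper MP2 energies; the algebra is otherwise identical.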

  3. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    International Nuclear Information System (INIS)

    Spackman, Peter R.; Karton, Amir

    2015-01-01

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/L^α two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-square deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol –1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol –1.

  4. Immobilization of N-Heterocyclic Carbene Compounds: A Synthetic Perspective.

    Science.gov (United States)

    Zhong, Rui; Lindhorst, Anja C; Groche, Florian J; Kühn, Fritz E

    2017-02-08

    Over the course of the past 15 years the success story of N-heterocyclic carbene (NHC) compounds in organic, inorganic, and organometallic chemistry has been extended to another dimension. The immobilization of NHC compounds, undergoing continuous diversification, broadens their range of applications and leads to new solutions for challenges in catalytic and synthetic chemistry. This review intends to present a synthetic toolkit for the immobilization of NHC compounds, giving the reader an overview on synthetic techniques and strategies available in the literature. By individually summarizing and assessing the synthetic steps of the immobilization process, a comprehensive picture of the strategies and methodologies for the immobilization of NHC compounds is presented. Furthermore, the characterization of supported NHC compounds is discussed in detail in order to set up necessary criteria for an in-depth analysis of the immobilized derivatives. Finally, the catalytic applications of immobilized NHC compounds are briefly reviewed to illustrate the practical use of this technique for a broad variety of reaction types.

  5. Evolutionary Structure Prediction of Stoichiometric Compounds

    Science.gov (United States)

    Zhu, Qiang; Oganov, Artem

    2014-03-01

    In general, for a given ionic compound AmBn at ambient-pressure conditions, its stoichiometry reflects the ratio of the valence states of its chemical species (i.e., the charges of each anion and cation). However, compounds under high pressure exhibit significantly different behaviour compared with their analogues at ambient conditions. Here we developed a method, based on evolutionary algorithms, to solve the crystal structure prediction problem; it can predict both the stable compounds and their crystal structures at arbitrary P,T-conditions, given just the set of chemical elements. By applying this method to a wide range of binary ionic systems (Na-Cl, Mg-O, Xe-O, Cs-F, etc.), we discovered many compounds with entirely new stoichiometries that can become thermodynamically stable. Further electronic structure analysis of these novel compounds indicates that several factors can contribute to this extraordinary phenomenon: (1) polyatomic anions; (2) free electron localization; (3) emergence of new valence states; (4) metallization. In particular, part of the results has been confirmed by experiment, which indicates that this approach can play a crucial role in the design of new materials under extreme pressure conditions. This work is funded by DARPA (Grants No. W31P4Q1210008 and W31P4Q1310005) and NSF (EAR-1114313 and DMR-1231586).

  6. Prediction of compounds activity in nuclear receptor signaling and stress pathway assays using machine learning algorithms and low dimensional molecular descriptors

    Directory of Open Access Journals (Sweden)

    Filip eStefaniak

    2015-12-01

    Toxicity evaluation of newly synthesized or currently used compounds is one of the main challenges during product development in many areas of industry. For example, toxicity is the second most common reason - after lack of efficacy - for failure in preclinical and clinical studies of drug candidates. To avoid attrition at the late stages of the drug development process, toxicity analyses are employed at the early stages of a discovery pipeline, alongside activity and selectivity enhancement. Although many assays for screening in vitro toxicity are available, their massive application is not always time- and cost-effective. There is thus a need for fast and reliable in silico tools that can be used not only for toxicity prediction of existing compounds, but also for prioritization of compounds planned for synthesis or acquisition. Here I present the benchmark results of combinations of various attribute selection methods and machine learning algorithms and their application to the data sets of the Tox21 Data Challenge. The best performing method - Best First for attribute selection with the Rotation Forest/ADTree classifier - offers good accuracy for most tested cases. For 11 out of 12 targets, the AUROC value for the final evaluation set was ≥0.72, while for three targets the AUROC value was ≥0.80, with the average AUROC being 0.784±0.069. The use of two-dimensional descriptor sets enables fast screening and compound prioritization even for a very large database. The open source tools used in this project make the presented approach widely available and encourage the community to further improve the presented scheme.
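The AUROC values reported above can be computed from raw classifier scores without any ML library. A minimal sketch of the rank-sum (Mann-Whitney) formulation, with tie handling; the data in the test are illustrative, not from Tox21:

```python
def auroc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U) formulation.

    labels: 1 for active/toxic, 0 for inactive; scores: predicted scores
    where higher means more likely active. Tied scores get the average rank.
    """
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        # find the run of tied scores starting at position i
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0            # average 1-based rank of the run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    pos = sum(labels)
    neg = len(labels) - pos
    rank_sum = sum(r for r, lab in zip(ranks, labels) if lab == 1)
    return (rank_sum - pos * (pos + 1) / 2.0) / (pos * neg)
```

A perfect ranking gives 1.0, a reversed ranking 0.0, and fully tied scores 0.5, matching the scale on which the 0.72-0.80 figures above are reported.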

  7. Characterization of a biosurfactant produced by Pseudomonas cepacia CCT6659 in the presence of industrial wastes and its application in the biodegradation of hydrophobic compounds in soil.

    Science.gov (United States)

    Silva, Elias J; Rocha e Silva, Nathália Maria P; Rufino, Raquel D; Luna, Juliana M; Silva, Ricardo O; Sarubbo, Leonie A

    2014-05-01

    The bacterium Pseudomonas cepacia CCT6659 cultivated with 2% soybean waste frying oil and 2% corn steep liquor as substrates produced a biosurfactant with potential application in the bioremediation of soils. The biosurfactant was classified as an anionic biomolecule composed of 75% lipids and 25% carbohydrates. Characterization by proton nuclear magnetic resonance ((1)H and (13)C NMR) revealed the presence of carbonyl, olefinic and aliphatic groups, with typical spectra of lipids. Four sets of biodegradation experiments were carried out with soil contaminated by hydrophobic organic compounds amended with molasses in the presence of an indigenous consortium, as follows: Set 1-soil+bacterial cells; Set 2-soil+biosurfactant; Set 3-soil+bacterial cells+biosurfactant; and Set 4-soil without bacterial cells or biosurfactant (control). Significant oil biodegradation activity (83%) occurred in the first 10 days of the experiments when the biosurfactant and bacterial cells were used together (Set 3), while maximum degradation of the organic compounds (above 95%) was found in Sets 1-3 between 35 and 60 days. It is evident from the results that the biosurfactant alone and its producer species are both capable of promoting biodegradation to a large extent. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Matched molecular pair-based data sets for computer-aided medicinal chemistry

    Science.gov (United States)

    Bajorath, Jürgen

    2014-01-01

    Matched molecular pairs (MMPs) are widely used in medicinal chemistry to study changes in compound properties including biological activity, which are associated with well-defined structural modifications. Herein we describe up-to-date versions of three MMP-based data sets that have originated from in-house research projects. These data sets include activity cliffs, structure-activity relationship (SAR) transfer series, and second generation MMPs based upon retrosynthetic rules. The data sets have in common that they have been derived from compounds included in the ChEMBL database (release 17) for which high-confidence activity data are available. Thus, the activity data associated with MMP-based activity cliffs, SAR transfer series, and retrosynthetic MMPs cover the entire spectrum of current pharmaceutical targets. Our data sets are made freely available to the scientific community. PMID:24627802

  9. Moessbauer spectroscopy in neptunium compounds

    Energy Technology Data Exchange (ETDEWEB)

    Nakamoto, Tadahiro; Nakada, Masami; Masaki, Nobuyuki; Saeki, Masakatsu [Japan Atomic Energy Research Inst., Tokyo (Japan)

    1997-03-01

    Moessbauer effects are observable in seven actinide elements from {sup 232}Th to {sup 247}Cm, and Moessbauer spectra have been investigated mainly with {sup 237}Np and {sup 238}U for reasons of the availability and cost of materials. This report describes the fundamental characteristics of the Moessbauer spectra of {sup 237}Np and the correlation between the isomer shift and the coordination number of Np(V) compounds. The isomer shifts of Np(V) compounds tended to increase with increasing coordination number; they showed a broad distribution, as did those of Np(VI), but the {delta} values of compounds with the same coordination number fell within a narrow range. The {delta} values of Np(VI) complexes with an O{sub x} donor set suggest that the Np atom in the hydroxide (NpO{sub 2}(OH){center_dot}4H{sub 2}O) might have a pentagonal bipyramidal structure, and that pentagonal and hexagonal bipyramidal structures might coexist in, at least, the acetate and benzoate. Indeed, such coexistence has been demonstrated in the nitrate, (NpO{sub 2}){sub 2}(NO{sub 3}){sub 2}{center_dot}5H{sub 2}O. (M.N.)

  10. Rhesus monkeys (Macaca mulatta) show robust primacy and recency in memory for lists from small, but not large, image sets.

    Science.gov (United States)

    Basile, Benjamin M; Hampton, Robert R

    2010-02-01

    The combination of primacy and recency produces a U-shaped serial position curve typical of memory for lists. In humans, primacy is often thought to result from rehearsal, but there is little evidence for rehearsal in nonhumans. To further evaluate the possibility that rehearsal contributes to primacy in monkeys, we compared memory for lists of familiar stimuli (which may be easier to rehearse) to memory for unfamiliar stimuli (which are likely difficult to rehearse). Six rhesus monkeys saw lists of five images drawn from either large, medium, or small image sets. After presentation of each list, memory for one item was assessed using a serial probe recognition test. Across four experiments, we found robust primacy and recency with lists drawn from small and medium, but not large, image sets. This finding is consistent with the idea that familiar items are easier to rehearse and that rehearsal contributes to primacy, warranting further study of the possibility of rehearsal in monkeys. However, alternative interpretations are also viable and are discussed. Copyright 2009 Elsevier B.V. All rights reserved.

  11. Synergy Maps: exploring compound combinations using network-based visualization.

    Science.gov (United States)

    Lewis, Richard; Guha, Rajarshi; Korcsmaros, Tamás; Bender, Andreas

    2015-01-01

    The phenomenon of super-additivity of biological response to compounds applied jointly, termed synergy, has the potential to provide many therapeutic benefits. Therefore, high throughput screening of compound combinations has recently received a great deal of attention. Large compound libraries and the feasibility of all-pairs screening can easily generate large, information-rich datasets. Previously, these datasets have been visualized using either a heat-map or a network approach; however, these visualizations only partially represent the information encoded in the dataset. A new visualization technique for pairwise combination screening data, termed "Synergy Maps", is presented. In a Synergy Map, information about the synergistic interactions of compounds is integrated with information about their properties (chemical structure, physicochemical properties, bioactivity profiles) to produce a single visualization. As a result, the relationships between compound and combination properties may be investigated simultaneously, and may thus afford insight into the synergy observed in the screen. An interactive web app implementation, available at http://richlewis42.github.io/synergy-maps, has been developed for public use, and may find use in navigating and filtering larger scale combination datasets. This tool is applied to a recent all-pairs dataset of anti-malarials, tested against Plasmodium falciparum, and a preliminary analysis is given as an example, illustrating the disproportionate synergism of histone deacetylase inhibitors previously described in the literature, as well as suggesting new hypotheses for future investigation. Synergy Maps improve the state of the art in compound combination visualization by simultaneously representing individual compound properties and their interactions. The web-based tool allows straightforward exploration of combination data, and easier identification of correlations between compound properties and interactions.

  12. Covariance approximation for large multivariate spatial data sets with an application to multiple climate model errors

    KAUST Repository

    Sang, Huiyan

    2011-12-01

    This paper investigates the cross-correlations across multiple climate model errors. We build a Bayesian hierarchical model that accounts for the spatial dependence of individual models as well as cross-covariances across different climate models. Our method allows for a nonseparable and nonstationary cross-covariance structure. We also present a covariance approximation approach to facilitate the computation in the modeling and analysis of very large multivariate spatial data sets. The covariance approximation consists of two parts: a reduced-rank part to capture the large-scale spatial dependence, and a sparse covariance matrix to correct the small-scale dependence error induced by the reduced rank approximation. We pay special attention to the case that the second part of the approximation has a block-diagonal structure. Simulation results of model fitting and prediction show substantial improvement of the proposed approximation over the predictive process approximation and the independent blocks analysis. We then apply our computational approach to the joint statistical modeling of multiple climate model errors. © 2012 Institute of Mathematical Statistics.

  13. Experience with compound words influences their processing: An eye movement investigation with English compound words.

    Science.gov (United States)

    Juhasz, Barbara J

    2016-11-14

    Recording eye movements provides information on the time-course of word recognition during reading. Juhasz and Rayner [Juhasz, B. J., & Rayner, K. (2003). Investigating the effects of a set of intercorrelated variables on eye fixation durations in reading. Journal of Experimental Psychology: Learning, Memory and Cognition, 29, 1312-1318] examined the impact of five word recognition variables, including familiarity and age-of-acquisition (AoA), on fixation durations. All variables impacted fixation durations, but the time-course differed. However, the study focused on relatively short, morphologically simple words. Eye movements are also informative for examining the processing of morphologically complex words such as compound words. The present study further examined the time-course of lexical and semantic variables during morphological processing. A total of 120 English compound words that varied in familiarity, AoA, semantic transparency, lexeme meaning dominance, sensory experience rating (SER), and imageability were selected. The impact of these variables on fixation durations was examined when length, word frequency, and lexeme frequencies were controlled in a regression model. The most robust effects were found for familiarity and AoA, indicating that a reader's experience with compound words significantly impacts compound recognition. These results provide insight into semantic processing of morphologically complex words during reading.

  14. SAMPL4, a blind challenge for computational solvation free energies: the compounds considered

    Science.gov (United States)

    Guthrie, J. Peter

    2014-03-01

    For the fifth time I have provided a set of solvation energies (1 M gas to 1 M aqueous) for a SAMPL challenge. In this set there are 23 blind compounds and 30 supplementary compounds of related structure to one of the blind sets, but for which the solvation energy is readily available. The best current values for each compound are presented along with complete documentation of the experimental origins of the solvation energies. The calculations needed to go from reported data to solvation energies are presented, with particular attention to aspects which are new to this set. For some compounds the vapor pressures (VP) were reported for the liquid compound, which is solid at room temperature. To correct from VP(subcooled liquid) to VP(sublimation) requires ΔS(fusion), which is only known for mannitol. Estimated values were used for the others, all but one of which were benzene derivatives and expected to have very similar values. The final compound for which ΔS(fusion) was estimated was menthol, which melts at 42 °C, so that modest errors in ΔS(fusion) will have little effect. It was also necessary to look into the effects of including estimated values of ΔCp on this correction. The approximate sizes of the effects of including ΔCp in the correction from VP(subcooled liquid) to VP(sublimation) were estimated, and it was noted that inclusion of ΔCp invariably makes ΔGS more positive. To extend the set of compounds for which the solvation energy could be calculated, we explored the use of boiling point (b.p.) data from Reaxys/Beilstein as a substitute for studies of the VP as a function of temperature. B.p. data are not always reliable, so it was necessary to develop a criterion for rejecting outliers. For two compounds (chlorinated guaiacols) it became clear that inclusion represented overreach; for each there were only two independent pressure-temperature points, which is too little for a trustworthy extrapolation. For a number of compounds the extrapolation from lowest
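The subcooled-liquid-to-sublimation correction described above amounts to multiplying the liquid vapor pressure by exp(-ΔG_fus/RT), with ΔG_fus approximated as ΔS_fus·(Tm - T) when ΔCp is neglected. A minimal sketch under that assumption, with purely illustrative numbers (not values from the SAMPL4 set):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1


def vp_sublimation(vp_subcooled_liquid, dS_fus, T_m, T):
    """Correct a subcooled-liquid vapor pressure to a sublimation pressure.

    Assumes dG_fus(T) ~ dS_fus * (T_m - T), i.e. the dCp-of-fusion term is
    neglected (the abstract notes that including dCp shifts the result).
    Below the melting point dG_fus > 0, so the solid's vapor pressure is
    lower than the subcooled liquid's.
    """
    dG_fus = dS_fus * (T_m - T)                       # J/mol
    return vp_subcooled_liquid * math.exp(-dG_fus / (R * T))


# Illustrative case: dS_fus = 50 J/(mol K), melting point 315 K, at 298 K
factor = vp_sublimation(1.0, 50.0, 315.0, 298.0)
```

For these numbers the solid's vapor pressure comes out roughly 30% below the subcooled liquid's, which is the size of effect the ΔS(fusion) estimates above have to capture.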

  15. Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures

    Science.gov (United States)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.

    2016-12-01

    The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the-art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which offers several new features, including an on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
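The iteration that the massively parallel implementation accelerates is ordinary Lloyd's k-means; a serial sketch for illustration (the parallel version distributes the assignment loop, the hot spot, across cores and SIMD lanes; the naive first-k seeding here is a simplification, not the paper's initialization):

```python
def kmeans(points, k, iters=100):
    """Plain Lloyd's algorithm on tuples of floats.

    Repeatedly assign each point to its nearest centroid (squared
    Euclidean distance), then recompute centroids as cluster means,
    until the centroids stop changing.
    """
    centroids = list(points[:k])                 # naive deterministic seeding
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)  # assignment step (the hot loop)
        new = [tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:                     # converged
            break
        centroids = new
    return centroids
```

On two well-separated blobs this converges in a handful of iterations; the engineering challenge the abstract addresses is making the assignment step run at high arithmetic intensity on billions of points.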

  16. Non-compound nucleus fission in actinide and pre-actinide regions

    Indian Academy of Sciences (India)

    2015-07-22

    In this article, some of our recent results on fission fragment/product angular distributions are discussed in the context of non-compound nucleus fission. Measurement of the fission fragment angular distribution in the 28Si+176Yb reaction did not show a large contribution from non-compound nucleus fission.

  17. Compound Half-Backed Weave Design For Digital Jacquard Fabric

    Science.gov (United States)

    Zhang, Meng; Zhou, Jiu

    2017-12-01

    Based on a layered-combination design mode and compound structure, this paper presents a design method, named compound half-backed weave, intended to innovate the weave structure and surface effect of fabric. The method comprises choosing primary weaves, setting up half-backed technical points, and establishing half-backed weave databases. Fabric produced using a compound half-backed weave designed by this method exhibits a unique half-backed effect in which only half of the threads on the fabric surface remain covered by adjacent wefts. Compound half-backed weaves not only meet the design needs of jacquard fabrics with different digital images and effectively improve the efficiency of structural design, but also put forward a new theory and method for the innovative design of digital jacquard fabric.

  18. Recent N-Atom Containing Compounds from Indo-Pacific Invertebrates

    Directory of Open Access Journals (Sweden)

    Ashgan Bishara

    2010-11-01

    A large variety of unique N-atom containing compounds (alkaloids) without terrestrial counterparts have been isolated from marine invertebrates, mainly sponges and ascidians. Many of these compounds display interesting biological activities. In this report we present studies on nitrogenous compounds isolated by our group during the last few years from Indo-Pacific sponges, one ascidian and one gorgonian. The major part of the review deals with metabolites from the Madagascar sponge Fascaplysinopsis sp., namely four groups of secondary metabolites: the salarins, tulearins, taumycins and tausalarins.

  19. Tables of compound-discount interest rate multipliers for evaluating forestry investments.

    Science.gov (United States)

    Allen L. Lundgren

    1971-01-01

    Tables, prepared by computer, are presented for 10 selected compound-discount interest rate multipliers commonly used in financial analyses of forestry investments. Two sets of tables are given for each of the 10 multipliers. The first set gives multipliers for each year from 1 to 40 years; the second set gives multipliers at 5-year intervals from 5 to 160 years.
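The two basic multipliers underlying such tables are the compound-amount factor (1 + i)^n and its reciprocal, the discount factor. A minimal sketch that regenerates one table in the spirit of the paper (the rate and layout are arbitrary illustrations):

```python
def compound_amount(i, n):
    """Future value of 1 unit after n years at annual rate i: (1 + i)**n."""
    return (1.0 + i) ** n


def discount(i, n):
    """Present value of 1 unit due in n years at annual rate i."""
    return 1.0 / (1.0 + i) ** n


def table(i, start=5, stop=160, step=5):
    """Compound-amount multipliers at 5-year intervals, like the second
    set of tables described in the abstract (5 to 160 years)."""
    return {n: round(compound_amount(i, n), 4) for n in range(start, stop + 1, step)}
```

For example, at 5% the 10-year compound-amount multiplier is about 1.6289, and multiplying it by the matching discount multiplier recovers 1 exactly, which is a convenient sanity check on any generated table.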

  20. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Aim: The appropriateness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance, using cross-validation with a v-fold of 10, were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to both the training and the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to the training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
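The criterion above, that every cluster should contain both training and test compounds, can be checked mechanically once cluster labels and set assignments are available. A minimal sketch (the compound identifiers and labels in the test are illustrative, not the carboquinone data):

```python
def uncovered_clusters(cluster_of, set_of):
    """Return the clusters NOT containing both training and test compounds.

    cluster_of: dict mapping compound id -> cluster label.
    set_of:     dict mapping compound id -> "train" or "test".
    An empty result supports a proper (well-mixed) random assignment.
    """
    seen = {}
    for cid, cluster in cluster_of.items():
        seen.setdefault(cluster, set()).add(set_of[cid])
    return {cluster for cluster, sets in seen.items() if sets != {"train", "test"}}
```

This is the simple membership half of the paper's check; the other half, comparing the activity distribution across clusters, would require a statistical test on the observed activities.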

  1. Chord length distribution for a compound capsule

    International Nuclear Information System (INIS)

    Pitřík, Pavel

    2017-01-01

    Chord length distribution is an important factor in the calculation of ionisation chamber responses. This article describes Monte Carlo calculations of the chord length distribution for a non-convex compound capsule. A Monte Carlo code was set up for the generation of random chords and the calculation of their lengths, based on the input number of generations and the cavity dimensions. The code was written in JavaScript and can be executed in the majority of HTML viewers. The plot of the occurrence of chords of different lengths has 3 peaks. It was found that the compound capsule cavity cannot simply be replaced with a spherical cavity of a triangular design. Furthermore, the compound capsule cavity is directionally dependent, which must be taken into account in calculations involving non-isotropic fields of primary particles in the beam, unless equilibrium of the secondary charged particles is attained. (orig.)
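The abstract's random-chord code was written in JavaScript; the same Monte Carlo idea, reduced to a spherical cavity where the answer is known analytically, can be sketched in Python. Chords are defined here by pairs of uniform points on the surface, which is one of several possible randomness conventions and a simplification relative to the non-convex capsule:

```python
import math
import random


def random_surface_point(r, rng):
    """Uniform random point on a sphere of radius r (normalized Gaussian trick)."""
    x, y, z = rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1)
    s = math.sqrt(x * x + y * y + z * z)
    return (r * x / s, r * y / s, r * z / s)


def mean_chord_sphere(r=1.0, n=200_000, seed=1):
    """Monte Carlo mean chord length of a sphere under the endpoint convention.

    For two uniform surface points the mean separation is 4r/3. A real
    ionisation-chamber calculation would instead trace chords through the
    non-convex compound-capsule geometry and histogram their lengths
    (producing the 3-peak distribution described in the abstract).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        a = random_surface_point(r, rng)
        b = random_surface_point(r, rng)
        total += math.dist(a, b)
    return total / n
```

Convergence to the analytic 4r/3 value is a useful sanity check before applying the same sampling machinery to a geometry, like the compound capsule, that has no closed-form distribution.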

  2. Intermetallic compound development for the 21st century

    International Nuclear Information System (INIS)

    Munroe, P.R.

    2000-01-01

    Intermetallic compounds have been vigorously researched for the past twenty years. As a result of these studies, the fundamental behaviour of a number of transition metal aluminides and silicides is now well understood, and a number of alloys with commercially acceptable properties have been developed. Future challenges for these alloys, for example Ni3Al, TiAl and Fe3Al, are focused on the development of large-scale production routes. However, there remain a number of other intermetallic compounds, such as Laves phases, which exhibit some promising properties, but little is presently known about their intrinsic behaviour. For compounds such as these, more fundamental studies are required

  3. Leaf transpiration plays a role in phosphorus acquisition among a large set of chickpea genotypes.

    Science.gov (United States)

    Pang, Jiayin; Zhao, Hongxia; Bansal, Ruchi; Bohuon, Emilien; Lambers, Hans; Ryan, Megan H; Siddique, Kadambot H M

    2018-01-09

    Low availability of inorganic phosphorus (P) is considered a major constraint for crop productivity worldwide. A unique set of 266 chickpea (Cicer arietinum L.) genotypes, originating from 29 countries and with diverse genetic backgrounds, was used to study P-use efficiency. Plants were grown in pots containing sterilized river sand supplied with P at a rate of 10 μg P g-1 soil as FePO4, a poorly soluble form of P. The results showed large genotypic variation in plant growth, shoot P content, physiological P-use efficiency, and P-utilization efficiency in response to low P supply. Further investigation of a subset of 100 chickpea genotypes with contrasting growth performance showed significant differences in photosynthetic rate and photosynthetic P-use efficiency. A positive correlation was found between the leaf P concentration and the transpiration rate of the young fully expanded leaves. For the first time, our study has suggested a role of leaf transpiration in P acquisition, consistent with transpiration-driven mass flow in chickpea grown in low-P sandy soils. The identification of six genotypes with high plant growth, P-acquisition, and P-utilization efficiency suggests that the chickpea reference set can be used in breeding programmes to improve both P-acquisition and P-utilization efficiency under low-P conditions. © 2018 John Wiley & Sons Ltd.

  4. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large scale structures is considered within a model with a string on a toroidal space-time. First, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large scale distribution and with the theory of a Cantorian space-time.

  5. The use of quantum chemically derived descriptors for QSAR modelling of reductive dehalogenation of aromatic compounds

    NARCIS (Netherlands)

    Rorije E; Richter J; Peijnenburg WJGM; ECO; IHE Delft

    1994-01-01

    In this study, quantum-chemically derived parameters are developed for a limited number of halogenated aromatic compounds to model the anaerobic reductive dehalogenation reaction rate constants of these compounds. It is shown that due to the heterogeneity of the set of compounds used, no single

  6. Improved quality control of carbon-14 labelled compounds

    International Nuclear Information System (INIS)

    Leonhardt, J.W.; Fuchs, P.; Standtke, K.

    1997-01-01

    IUT Ltd is a producer of carbon-14 labelled organic compounds such as benzene, methanol, phenol, formaldehyde and Na-acetates, as well as custom-ordered compounds. The quality control of these compounds with respect to chemical purity is carried out by means of HPLC and GC-MS. Molar activity was determined by liquid scintillation counting and by HPLC equipped with a radioactivity detector. Unfortunately, the accuracy of the activity determination was only ±4% relative. This error is too high because of the large dilution factors. In view of the IUT accreditation as an analytical laboratory in Germany, the accuracy had to be improved considerably. Therefore, GC-MS determination of the molar activities of the 14C-labelled compounds is used. A special evaluation code is used to determine the enrichment values relative to the unlabelled molecules. Taking into account the results of GC-MS, the accuracy of the molar activity determination is improved to ±2%. The spectra evaluation is demonstrated and some examples are discussed

  7. Efficacy of formative evaluation using a focus group for a large classroom setting in an accelerated pharmacy program.

    Science.gov (United States)

    Nolette, Shaun; Nguyen, Alyssa; Kogan, David; Oswald, Catherine; Whittaker, Alana; Chakraborty, Arup

    2017-07-01

Formative evaluation is a process utilized to improve communication between students and faculty. This evaluation method makes it possible to address pertinent issues in a timely manner; however, implementation of formative evaluation can be a challenge, especially in a large classroom setting. Using mediated formative evaluation, the purpose of this study is to determine whether a student-based focus group is a viable option to improve efficacy of communication between an instructor and students, as well as time management, in a large classroom setting. Out of 140 total students, six students were selected to form a focus group, one from each of the six sections of the classroom. Each focus group representative was responsible for collecting all the questions from students of their corresponding sections and submitting them to the instructor two to three times a day. Responses from the instructor were either passed back to pertinent students by the focus group representatives or addressed directly with students by the instructor. This study was conducted using a fifteen-question survey after the focus group model was utilized for one month. A printed copy of the survey was distributed in the class by student investigators. Questions were of varying types, including Likert scale, yes/no, and open-ended response. One hundred forty surveys were administered, and 90 complete responses were collected. Surveys showed that 93.3% of students found that use of the focus group made them more likely to ask questions for understanding. The surveys also showed 95.5% of students found utilizing the focus group for questions allowed for better understanding of difficult concepts. General open-ended answer portions of the survey showed that most students found the focus group allowed them to ask questions more easily, since they did not feel intimidated by asking in front of the whole class. No correlation was found between demographic characteristics and survey responses. This may

  8. Magnetic properties and low-temperature large magnetocaloric effect in the antiferromagnetic HoCu{sub 0.33}Ge{sub 2} and ErCu{sub 0.25}Ge{sub 2} compounds

    Energy Technology Data Exchange (ETDEWEB)

    Gao, R.L. [School of Metallurgy and Materials Engineering, Chongqing University of Science and Technology, Chongqing 401331 (China); Xu, Z.Y., E-mail: zhyxu@nim.ac.cn [National Institute of Metrology, Beijing 100029 (China); Wang, L.C. [State Key Laboratory for Magnetism, Institute of Physics, Chinese Academy of Sciences, Beijing 100190 (China); Dong, Q.Y.; Zhang, Y. [Department of Physics, Capital Normal University, Beijing 100048 (China); Liu, F.H. [National Space Science Center, Beijing 100190 (China); Mo, Z.J. [School of material Science and Engineering, Hebei University of Technology, Tianjin 300401 (China); Niu, E. [State Key Laboratory for Magnetism, Institute of Physics, Chinese Academy of Sciences, Beijing 100190 (China); Fu, C.L.; Cai, W.; Chen, G.; Deng, X.L. [School of Metallurgy and Materials Engineering, Chongqing University of Science and Technology, Chongqing 401331 (China)

    2015-05-15

Highlights: • Antiferromagnetic material RCu{sub x}Ge{sub 2} of high purity was prepared. • Large MCE of −10.2 J/kg K and −10.5 J/kg K for RCu{sub x}Ge{sub 2} (Ho, Er) was obtained for field changes of 0–70 kOe and 0–50 kOe, respectively. • The RCu{sub x}Ge{sub 2} compounds with variable x had different transition temperatures, which made them suitable as 'table-like' magnetocaloric refrigerants. - Abstract: Magnetic properties and the magnetocaloric effect (MCE) of HoCu{sub 0.33}Ge{sub 2} and ErCu{sub 0.25}Ge{sub 2} compounds have been investigated. The compounds were determined to be antiferromagnetic with Néel temperatures T{sub N} = 9 K and 3.9 K, respectively. The critical magnetic fields for the metamagnetic transition from the antiferromagnetic to the ferromagnetic state below T{sub N} were determined to be 10 kOe for HoCu{sub 0.33}Ge{sub 2} at 5 K and 6 kOe for ErCu{sub 0.25}Ge{sub 2} at 2 K. Large MCE with maximal magnetic entropy changes (ΔS{sub M}) of −10.2 J/kg K at 10.5 K for field changes of 0–70 kOe in HoCu{sub 0.33}Ge{sub 2} and −10.5 J/kg K at 5.5 K for field changes of 0–50 kOe in ErCu{sub 0.25}Ge{sub 2} was found. The large ΔS{sub M} around T{sub N}, as well as the absence of hysteresis loss, makes the RCu{sub x}Ge{sub 2} compounds competitive candidates as low-temperature magnetic refrigerants.
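ΔS{sub M} values of this kind are typically extracted from isothermal magnetization curves via the Maxwell relation ΔS{sub M} = ∫₀^H (∂M/∂T)_H dH. A minimal numerical sketch (invented magnetization values in arbitrary units, not the measured curves; units of the result follow those of M and H):

```python
def entropy_change(T1, T2, fields, M1, M2):
    """Magnetic entropy change between temperatures T1 and T2 from the
    Maxwell relation dS_M = integral of (dM/dT)_H over H: finite
    difference in T, trapezoidal rule in H.

    M1[j] and M2[j] are magnetizations measured at field fields[j]
    for temperatures T1 and T2, respectively.
    """
    dMdT = [(m2 - m1) / (T2 - T1) for m1, m2 in zip(M1, M2)]
    ds = 0.0
    for j in range(len(fields) - 1):
        # Trapezoidal integration of dM/dT over the field interval.
        ds += 0.5 * (dMdT[j] + dMdT[j + 1]) * (fields[j + 1] - fields[j])
    return ds

# Toy isothermal magnetization data at two temperatures (arbitrary units):
# magnetization decreases with temperature, so dS_M comes out negative.
ds = entropy_change(9.0, 10.0, [0.0, 1.0, 2.0], [0.0, 1.0, 2.0], [0.0, 0.5, 1.0])
```

A negative ΔS{sub M}, as reported above, simply reflects magnetization decreasing with temperature around the transition.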

  9. Novel Visualization of Large Health Related Data Sets

    Science.gov (United States)

    2015-03-01

lower all-cause mortality.[3] While large cross-sectional studies of populations such as the National Health and Nutrition Examination Survey find a...due to impaired renal and hepatic metabolism, decreased dietary intake related to anorexia or nausea, and falsely low HbA1c secondary to uremia or... Renal Nutrition. 2009:19(1):33-37. 2014 Workshop on Visual Analytics in Healthcare

  10. Frequency and Severity of Parenteral Nutrition Medication Errors at a Large Children's Hospital After Implementation of Electronic Ordering and Compounding.

    Science.gov (United States)

    MacKay, Mark; Anderson, Collin; Boehme, Sabrina; Cash, Jared; Zobell, Jeffery

    2016-04-01

The Institute for Safe Medication Practices has stated that parenteral nutrition (PN) is considered a high-risk medication with the potential to cause harm. Three organizations, the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.), the American Society of Health-System Pharmacists, and the National Advisory Group, have published guidelines for ordering, transcribing, compounding and administering PN. These national organizations have published data on compliance with the guidelines and the risk of errors. The purpose of this article is to compare total compliance with ordering, transcription, compounding, and administration guidelines, and the error rate, at a large pediatric institution. A computerized prescriber order entry (CPOE) program was developed that incorporates dosing with soft and hard stop recommendations while simultaneously eliminating the need for paper transcription. A CPOE team prioritized and identified issues, then developed solutions and integrated innovative CPOE and automated compounding device (ACD) technologies and practice changes to minimize opportunities for medication errors in PN prescription, transcription, preparation, and administration. Thirty developmental processes were identified and integrated in the CPOE program, resulting in practices that were compliant with the A.S.P.E.N. safety consensus recommendations. Data from 7 years of development and implementation were analyzed and compared with published error rates, harm rates, and cost reductions to determine whether our process showed lower error rates than national outcomes. The CPOE program developed was in total compliance with the A.S.P.E.N. guidelines for PN. The frequency of PN medication errors at our hospital over the 7 years was 230 errors per 84,503 PN prescriptions (0.27%), compared with national data in which 74 of 4730 prescriptions (1.6%) over 1.5 years were associated with a medication error. Errors were categorized by steps in the PN process

  11. Mini-review: Molecular mechanisms of antifouling compounds

    KAUST Repository

    Qian, Pei-Yuan

    2013-04-01

    Various antifouling (AF) coatings have been developed to protect submerged surfaces by deterring the settlement of the colonizing stages of fouling organisms. A review of the literature shows that effective AF compounds with specific targets are ones often considered non-toxic. Such compounds act variously on ion channels, quorum sensing systems, neurotransmitters, production/release of adhesive, and specific enzymes that regulate energy production or primary metabolism. In contrast, AF compounds with general targets may or may not act through toxic mechanisms. These compounds affect a variety of biological activities including algal photosynthesis, energy production, stress responses, genotoxic damage, immunosuppressed protein expression, oxidation, neurotransmission, surface chemistry, the formation of biofilms, and adhesive production/release. Among all the targets, adhesive production/release is the most common, possibly due to a more extensive research effort in this area. Overall, the specific molecular targets and the molecular mechanisms of most AF compounds have not been identified. Thus, the information available is insufficient to draw firm conclusions about the types of molecular targets to be used as sensitive biomarkers for future design and screening of compounds with AF potential. In this review, the relevant advantages and disadvantages of the molecular tools available for studying the molecular targets of AF compounds are highlighted briefly and the molecular mechanisms of the AF compounds, which are largely a source of speculation in the literature, are discussed. © 2013 Copyright Taylor and Francis Group, LLC.

  12. Unpacking Noun-Noun Compounds

    DEFF Research Database (Denmark)

    Smith, Viktor; Barratt, Daniel; Zlatev, Jordan

    2014-01-01

    In two complementary experiments we took an integrated approach to a set of tightly interwoven, yet rarely combined questions concerning the spontaneous interpretation of novel (unfamiliar) noun-noun compounds (NNCs) when encountered in isolation, and possible (re)interpretations of novel as well...... concerning the relations between semantics and pragmatics, as well as system and usage, and psycholinguistic issues concerning the processing of NNCs. New insights and methodological tools are also provided for supporting future best practices in the field of food naming and labelling...

  13. A new wire fabrication processing using high Ga content Cu-Ga compound in V3Ga compound superconducting wire

    International Nuclear Information System (INIS)

    Hishinuma, Yoshimitsu; Nishimura, Arata; Kikuchi, Akihiro; Iijima, Yasuo; Takeuchi, Takao

    2007-01-01

A superconducting magnet system is one of the important components of an advanced magnetic confinement fusion reactor. It is required to provide a high magnetic field to confine and maintain a steady-state burning deuterium (D)-tritium (T) fusion plasma in the large interspace during long-term operation. Burning plasma generates 14 MeV fusion neutrons during the deuterium-tritium reaction, and these neutrons will stream and penetrate to the superconducting magnet through large ports while losing energy. Therefore, for superconducting materials in a future fusion reactor it is necessary to consider carefully not only the superconducting properties but also the neutron irradiation properties, and a 'low activation and high field superconducting magnet' will be required to realize fusion power plants beyond the International Thermonuclear Experimental Reactor (ITER). V-based superconducting materials have a much shorter decay time of induced radioactivity than Nb-based materials. We considered the V3Ga compound to be one of the most promising 'low activation and higher field superconductors' for an advanced fusion reactor. However, the present critical current density (Jc) of V3Ga compound wire is insufficient for fusion magnet applications. We therefore investigated a new-route PIT process using a high-Ga-content Cu-Ga compound in order to improve the superconducting properties of V3Ga compound wire. (author)

  14. Task-specific ionic liquids for solubilizing metal compounds

    OpenAIRE

    Thijs, Ben

    2007-01-01

    The main goal of this PhD thesis was to design new task-specific ionic liquids with the ability to dissolve metal compounds. Despite the large quantity of papers published on ionic liquids, not much is known about the mechanisms of dissolving metals in ionic liquids or about metal-containing ionic liquids. Additionally, many of the commercially available ionic liquids exhibit a very limited solubilizing power for metal compounds, although this is for many applications like electrodeposition a...

  15. Large magnetoresistance in non-magnetic silver chalcogenides and new class of magnetoresistive compounds

    Science.gov (United States)

    Saboungi, Marie-Louis; Price, David C. L.; Rosenbaum, Thomas F.; Xu, Rong; Husmann, Anke

    2001-01-01

    The heavily-doped silver chalcogenides, Ag.sub.2+.delta. Se and Ag.sub.2+.delta. Te, show magnetoresistance effects on a scale comparable to the "colossal" magnetoresistance (CMR) compounds. Hall coefficient, magnetoconductivity, and hydrostatic pressure experiments establish that elements of narrow-gap semiconductor physics apply, but both the size of the effects at room temperature and the linear field dependence down to fields of a few Oersteds are surprising new features.

  16. Common y-intercept and single compound regressions of gas-particle partitioning data vs 1/T

    Science.gov (United States)

    Pankow, James F.

Confidence intervals are placed around the log Kp vs 1/T correlation equations obtained using simple linear regressions (SLR) with the gas-particle partitioning data set of Yamasaki et al. [(1982) Env. Sci. Technol. 16, 189-194]. The compounds and groups of compounds studied include the polycyclic aromatic hydrocarbons phenanthrene + anthracene, me-phenanthrene + me-anthracene, fluoranthene, pyrene, benzo[a]fluorene + benzo[b]fluorene, chrysene + benz[a]anthracene + triphenylene, benzo[b]fluoranthene + benzo[k]fluoranthene, and benzo[a]pyrene + benzo[e]pyrene (note: me = methyl). For any given compound, at equilibrium, the partition coefficient Kp equals (F/TSP)/A, where F is the particulate-matter-associated concentration (ng m-3), A is the gas-phase concentration (ng m-3), and TSP is the concentration of particulate matter (μg m-3). At temperatures more than 10°C from the mean sampling temperature of 17°C, the confidence intervals are quite wide. Since theory predicts that similar compounds sorbing on the same particulate matter should possess very similar y-intercepts, the data set was also fitted using a special common y-intercept regression (CYIR). For most of the compounds, the CYIR equations fell inside the SLR 95% confidence intervals. The CYIR y-intercept value is -18.48, reasonably close to the type of value that can be predicted for PAH compounds. The set of CYIR regression equations is probably more reliable than the set of SLR equations. For example, the CYIR-derived desorption enthalpies are much more highly correlated with vaporization enthalpies than are the SLR-derived desorption enthalpies. It is recommended that the CYIR approach be considered whenever analysing temperature-dependent gas-particle partitioning data.
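The quantities above can be illustrated with a short sketch (hypothetical concentrations, not Yamasaki's data): compute Kp = (F/TSP)/A and fit log Kp against 1/T by ordinary least squares. The intercept of this fit is the quantity that the CYIR approach constrains to be common across similar compounds.

```python
import math

def partition_coefficient(F, A, TSP):
    """Kp = (F/TSP)/A, with F and A in ng m-3 and TSP in ug m-3."""
    return (F / TSP) / A

def fit_log_kp_vs_inv_t(temps_K, kp_values):
    """Ordinary least-squares (SLR) fit of log10(Kp) against 1/T.

    Returns (slope, intercept); the intercept is what the common
    y-intercept regression (CYIR) would share across compounds.
    """
    xs = [1.0 / t for t in temps_K]
    ys = [math.log10(kp) for kp in kp_values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical sampling data: (F, A, TSP) at three temperatures.
temps = [280.0, 290.0, 300.0]
kps = [partition_coefficient(F, A, TSP)
       for F, A, TSP in [(5.0, 1.0, 50.0), (2.5, 1.6, 50.0), (1.3, 2.6, 50.0)]]
slope, intercept = fit_log_kp_vs_inv_t(temps, kps)
```

Because partitioning to the particle phase weakens with temperature, the slope against 1/T comes out positive and the intercept negative, in qualitative agreement with the -18.48 value above.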

  17. Compound heterozygous ASPM mutations in Pakistani MCPH families

    DEFF Research Database (Denmark)

    Muhammad, Farooq; Mahmood Baig, Shahid; Hansen, Lars

    2009-01-01

Autosomal recessive primary microcephaly (MCPH) is characterized by reduced head circumference (50% of all reported families. In spite of the high frequency of MCPH in Pakistan, only one case of compound heterozygosity for mutations in ASPM has been reported to date. In this large MCPH study we...... confirmed compound heterozygosity in two and homozygous mutations in 20 families, respectively, showing that up to 10% of families with MCPH caused by ASPM are compound heterozygous. In total we identified 16 different nonsense or frameshift mutations, of which 12 were novel, thereby increasing the number...... of mutations in ASPM significantly from 35 to 47. We found no correlation between the severity of the condition and the site of truncation. We suggest that the high frequency of compound heterozygosity observed in this study be taken into consideration as part of future genetic testing and counseling...

  18. Compound-specific radiocarbon analysis - Analytical challenges and applications

    Science.gov (United States)

    Mollenhauer, G.; Rethemeyer, J.

    2009-01-01

    Within the last decades, techniques have become available that allow measurement of isotopic compositions of individual organic compounds (compound-specific isotope measurements). Most often the carbon isotopic composition of these compounds is studied, including stable carbon (δ13C) and radiocarbon (Δ14C) measurements. While compound-specific stable carbon isotope measurements are fairly simple, and well-established techniques are widely available, radiocarbon analysis of specific organic compounds is a more challenging method. Analytical challenges include difficulty obtaining adequate quantities of sample, tedious and complicated laboratory separations, the lack of authentic standards for measuring realistic processing blanks, and large uncertainties in values of Δ14C at small sample sizes. The challenges associated with sample preparation for compound-specific Δ14C measurements will be discussed in this contribution. Several years of compound-specific radiocarbon analysis have revealed that in most natural samples, purified organic compounds consist of heterogeneous mixtures of the same compound. These mixtures could derive from multiple sources, each having a different initial reservoir age but mixed in the same terminal reservoir, from a single source but mixed after deposition, or from a prokaryotic organism using variable carbon sources including mobilization of ancient carbon. These processes not only represent challenges to the interpretation of compound-specific radiocarbon data, but provide unique tools for the understanding of biogeochemical and sedimentological processes influencing the preserved organic geochemical records in marine sediments. We will discuss some examples where compound-specific radiocarbon analysis has provided new insights for the understanding of carbon source utilization and carbon cycling.

  19. Real-time spatial compound imaging improves reproducibility in evaluation of atherosclerotic carotid plaque

    DEFF Research Database (Denmark)

    Kofoed, Steen Christian; Grønholdt, Marie-Louise M.; Wilhjelm, Jens E.

    2001-01-01

The interobserver variation of the gray scale median value (GSM) for the conventional technique ranged from -32 to +20, and from -6 to +6 for compound. Likewise, the coefficient of repeatability for the GSM value was 13 for conventional imaging and 3 for compound imaging, and interobserver variation for the GSM...... value for the overlapping area was 34% and 9% for the conventional and compound techniques. In conclusion, compound imaging improves intra- and interobserver agreement and reduces interobserver variation in the GSM value in a clinical setting. (C) 2001 World Federation for Ultrasound in Medicine & Biology....

  20. Effects of flow depth and wall roughness on turbulence in compound channels

    International Nuclear Information System (INIS)

    Prinos, P.; Townsend, R.; Tavoularis, S.

    1985-01-01

Current methods for estimating discharge in compound channels often lead to large errors. The error is largely due to the momentum transfer mechanism (MTM) generated in the junction regions of the flow field (between adjacent deep and shallow zones). The MTM adversely affects system conveyance, particularly when the velocity differential between the deep and shallow zones is large. Improved prediction methods, therefore, must reflect the MTM's presence and its effect on the compound flow field. The mechanism's influence on system hydraulics is best examined by analysing the related turbulence characteristics in the junction zones of the compound section. Townsend reported increased turbulence levels in the junction region between a main channel and its shallower flood plain zone, and Elsawy, McKee and McKeogh found that observed normal turbulent stresses in a similar region were of the same order of magnitude as the apparent shear stress on the junction's vertical interface plane. The objective of the present study is to measure turbulent stresses in the junction region of a symmetrical compound open channel and to examine their dependence on relative depth and relative boundary roughness. Further details of this phase of the larger study are presented elsewhere. (author)

  1. Combining Two Large MRI Data Sets (AddNeuroMed and ADNI) Using Multivariate Data Analysis to Distinguish between Patients with Alzheimer's Disease and Healthy Controls

    DEFF Research Database (Denmark)

    Westman, Eric; Simmons, Andrew; Muehlboeck, J.-Sebastian

    2010-01-01

Background: The European Union AddNeuroMed project and the US-based Alzheimer's Disease Neuroimaging Initiative (ADNI) are two large multi-centre initiatives designed to analyse and validate biomarkers for AD. This study aims to compare and combine magnetic resonance imaging (MRI) data from the two...... study cohorts using an automated image analysis pipeline and multivariate data analysis. Methods: A total of 664 subjects were included in this study (AddNeuroMed: 126 AD, 115 CTL; ADNI: 194 AD, 229 CTL). Data acquisition for the AddNeuroMed project was set up to be compatible with the ADNI study...... used are robust and that large data sets can be combined if MRI imaging protocols are carefully aligned....

  2. Biogenic volatile organic compounds in the Earth system.

    Science.gov (United States)

    Laothawornkitkul, Jullada; Taylor, Jane E; Paul, Nigel D; Hewitt, C Nicholas

    2009-01-01

    Biogenic volatile organic compounds produced by plants are involved in plant growth, development, reproduction and defence. They also function as communication media within plant communities, between plants and between plants and insects. Because of the high chemical reactivity of many of these compounds, coupled with their large mass emission rates from vegetation into the atmosphere, they have significant effects on the chemical composition and physical characteristics of the atmosphere. Hence, biogenic volatile organic compounds mediate the relationship between the biosphere and the atmosphere. Alteration of this relationship by anthropogenically driven changes to the environment, including global climate change, may perturb these interactions and may lead to adverse and hard-to-predict consequences for the Earth system.

  3. Evaluation of Different Methods for Identification of Structural Alerts Using Chemical Ames Mutagenicity Data Set as a Benchmark.

    Science.gov (United States)

    Yang, Hongbin; Li, Jie; Wu, Zengrui; Li, Weihua; Liu, Guixia; Tang, Yun

    2017-06-19

    Identification of structural alerts for toxicity is useful in drug discovery and other fields such as environmental protection. With structural alerts, researchers can quickly identify potential toxic compounds and learn how to modify them. Hence, it is important to determine structural alerts from a large number of compounds quickly and accurately. There are already many methods reported for identification of structural alerts. However, how to evaluate those methods is a problem. In this paper, we tried to evaluate four of the methods for monosubstructure identification with three indices including accuracy rate, coverage rate, and information gain to compare their advantages and disadvantages. The Kazius' Ames mutagenicity data set was used as the benchmark, and the four methods were MoSS (graph-based), SARpy (fragment-based), and two fingerprint-based methods including Bioalerts and the fingerprint (FP) method we previously used. The results showed that Bioalerts and FP could detect key substructures with high accuracy and coverage rates because they allowed unclosed rings and wildcard atom or bond types. However, they also resulted in redundancy so that their predictive performance was not as good as that of SARpy. SARpy was competitive in predictive performance in both training set and external validation set. These results might be helpful for users to select appropriate methods and further development of methods for identification of structural alerts.
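The three evaluation indices named above can be sketched for a single candidate alert over a labelled compound set (toy data, not the Kazius set): accuracy is the toxic fraction among alert-containing compounds, coverage is the alert-containing fraction among toxic compounds, and information gain is the entropy reduction from splitting on the alert.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a binary class distribution with positive rate p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def alert_scores(has_alert, is_toxic):
    """Accuracy rate, coverage rate and information gain for one candidate alert.

    has_alert / is_toxic are parallel boolean lists over the compound set.
    """
    n = len(has_alert)
    n_alert = sum(has_alert)
    n_toxic = sum(is_toxic)
    n_both = sum(1 for a, t in zip(has_alert, is_toxic) if a and t)
    accuracy = n_both / n_alert if n_alert else 0.0   # precision of the alert
    coverage = n_both / n_toxic if n_toxic else 0.0   # recall over toxic compounds
    # Information gain: class entropy before vs after splitting on the alert.
    h0 = entropy(n_toxic / n)
    n_rest = n - n_alert
    p_pos = n_both / n_alert if n_alert else 0.0
    p_neg = (n_toxic - n_both) / n_rest if n_rest else 0.0
    h1 = (n_alert / n) * entropy(p_pos) + (n_rest / n) * entropy(p_neg)
    return accuracy, coverage, h0 - h1

# Toy set of six compounds: three carry the alert, three are mutagenic.
acc, cov, ig = alert_scores([True, True, True, False, False, False],
                            [True, True, False, False, False, True])
```

A useful alert scores high on all three indices; the redundancy noted for the fingerprint-based methods shows up as many alerts with high accuracy but overlapping coverage.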

  4. Solid fat content as a substitute for total polar compound analysis in edible oils

    Science.gov (United States)

    The solid fat contents (SFC) of heated edible oil samples were measured and found to correlate positively with total polar compounds (TPC) and inversely with triglyceride concentration. Traditional methods for determination of total polar compounds require a laboratory setting and are time intensiv...

  5. Air sparging of organic compounds in groundwater

    International Nuclear Information System (INIS)

    Hicks, P.M.

    1994-01-01

Soils and aquifers containing organic compounds have traditionally been treated by excavation and disposal of the soil and/or by pumping and treating the groundwater. These remedial options are often not practical or cost-effective. A more favorable alternative for removal of the adsorbed/dissolved organic compounds is an in situ technology. Air sparging removes volatile organic compounds from both the adsorbed and dissolved phases in the saturated zone. This technology effectively creates a crude air stripper within the aquifer, where the soil acts as the "packing". The air stream that contacts dissolved/adsorbed-phase organics in the aquifer induces volatilization. A case history illustrates the effectiveness of air sparging as a remedial technology for addressing organic compounds in soil and groundwater. The site is an operating heavy equipment manufacturing facility in central Florida. The soil and groundwater below a large building at the facility were found to contain primarily diesel-type petroleum hydrocarbons during removal of underground storage tanks. The organic compounds identified in the groundwater were benzene, toluene, ethylbenzene, and xylenes (BTEX), methyl tert-butyl ether (MTBE) and naphthalenes in concentrations related to diesel fuel

  6. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

High-throughput technologies generate considerable amounts of data, which often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting data produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in terms of input file handling provides long-term functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as Open Source software (https://github.com/pmadanecki/htdp).
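The kind of column-based filtering HTDP exposes through its GUI can be sketched in a few lines of Python (hypothetical toy table in a tab-delimited, VCF/BED-like layout; HTDP itself is a Java program and requires no coding):

```python
import csv
import io

def filter_rows(delimited_text, column, predicate, delimiter="\t"):
    """Keep rows of character-delimited column data whose value in
    `column` satisfies `predicate` (an HTDP-style filtering step)."""
    reader = csv.DictReader(io.StringIO(delimited_text), delimiter=delimiter)
    return [row for row in reader if predicate(row[column])]

# Toy variant table: keep only positions with sequencing depth >= 30.
data = "chrom\tpos\tdepth\nchr1\t100\t35\nchr1\t200\t8\nchr2\t300\t60\n"
deep = filter_rows(data, "depth", lambda v: int(v) >= 30)
```

Chaining several such predicate files is essentially what HTDP's "itemized sets of conditions from external files" automate.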

  7. Metal cluster compounds - chemistry and importance; clusters containing isolated main group element atoms, large metal cluster compounds, cluster fluxionality

    International Nuclear Information System (INIS)

    Walther, B.

    1988-01-01

This part of the review on metal cluster compounds deals with clusters containing isolated main group element atoms, with high-nuclearity clusters, and with metal cluster fluxionality. It will be obvious that main group element atoms strongly influence the geometry, stability and reactivity of the clusters. High-nuclearity clusters are of interest in their own right due to the diversity of the structures adopted, but their intermediate position between molecules and the metallic state makes them a fascinating research object too. Both these aspects of metal cluster chemistry, as well as the frequently observed ligand and core fluxionality, are related to the cluster-metal surface analogy. (author)

  8. Scalable, large area compound array refractive lens for hard X-rays

    Science.gov (United States)

    Reich, Stefan; dos Santos Rolo, Tomy; Letzel, Alexander; Baumbach, Tilo; Plech, Anton

    2018-04-01

We demonstrate the fabrication of a 2D Compound Array Refractive Lens (CARL) for multi-contrast X-ray imaging. The CARL consists of six stacked polyimide foils, each displaying a 2D array of lenses with a 65 μm pitch, aiming at sensitivity to sub-micrometer structures with (few-)micrometer resolution through phase and scattering contrast at multiple keV. The parabolic lenses are formed by indenting the foils with a paraboloid needle. The ability for fast single-exposure multi-contrast imaging is demonstrated by filming the kinetics of pulsed laser ablation in liquid. The three contrast channels, absorption, differential phase, and scattering, are imaged with a time resolution of 25 μs. By changing the sample-detector distance, it is possible to distinguish between nanoparticles and microbubbles.

  9. Promises of Machine Learning Approaches in Prediction of Absorption of Compounds.

    Science.gov (United States)

    Kumar, Rajnish; Sharma, Anju; Siddiqui, Mohammed Haris; Tiwari, Rajesh Kumar

    2018-01-01

Machine learning (ML) is one of the fastest-developing techniques for the prediction and evaluation of important pharmacokinetic properties such as absorption, distribution, metabolism and excretion. The availability of a large number of robust validation techniques for prediction models devoted to pharmacokinetics has significantly enhanced the trust and authenticity of ML approaches. Over the last decade, a series of prediction models has been generated and used for rapid screening of compounds on the basis of absorption. Prediction of absorption using ML models has great potential across the pharmaceutical industry as a non-animal alternative. However, these prediction models still fall well short of the confidence enjoyed by conventional experimental methods for estimation of drug absorption. General concerns include the selection of appropriate ML methods and validation techniques, in addition to selecting relevant descriptors and authentic data sets for the generation of prediction models. The current review explores published ML models for the prediction of absorption that use physicochemical properties as descriptors, together with their important conclusions. In addition, some critical challenges in the acceptance of ML models for absorption are also discussed. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
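As an illustration of the kind of descriptor-based classifier such models build on, here is a minimal k-nearest-neighbours sketch over invented physicochemical descriptors (logP, scaled molecular weight, H-bond donor count); real absorption models use far larger curated data sets, validated descriptors, and rigorous validation schemes:

```python
import math

def knn_predict(train, query, k=3):
    """Classify `query` (a descriptor vector) by majority vote of its k
    nearest training compounds under Euclidean distance.

    `train` is a list of (descriptor_vector, label) pairs; labels here
    are 1 = well absorbed, 0 = poorly absorbed (illustrative only).
    """
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = [label for _, label in dists[:k]]
    return int(sum(votes) > k / 2)

# Hypothetical descriptors: (logP, molecular weight / 100, H-bond donors).
train = [
    ((2.0, 2.5, 1), 1), ((1.5, 3.0, 2), 1), ((3.0, 2.0, 0), 1),
    ((-1.0, 5.0, 6), 0), ((-0.5, 6.0, 5), 0), ((0.0, 5.5, 7), 0),
]
label = knn_predict(train, (2.2, 2.4, 1))  # a lipophilic, small query compound
```

The descriptor-selection and validation concerns raised in the review apply directly: with different descriptors or an unrepresentative training set, the same algorithm gives very different predictions.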

  10. Building and calibrating a large-extent and high resolution coupled groundwater-land surface model using globally available data-sets

    Science.gov (United States)

    Sutanudjaja, E. H.; Van Beek, L. P.; de Jong, S. M.; van Geer, F.; Bierkens, M. F.

    2012-12-01

    The current generation of large-scale hydrological models generally lacks a groundwater model component simulating lateral groundwater flow. Large-scale groundwater models are rare due to a lack of hydro-geological data required for their parameterization and a lack of groundwater head data required for their calibration. In this study, we propose an approach to develop a large-extent fully-coupled land surface-groundwater model by using globally available datasets and calibrate it using a combination of discharge observations and remotely-sensed soil moisture data. The underlying objective is to devise a collection of methods that enables one to build and parameterize large-scale groundwater models in data-poor regions. The model used, PCR-GLOBWB-MOD, has a spatial resolution of 1 km x 1 km and operates on a daily basis. It consists of a single-layer MODFLOW groundwater model that is dynamically coupled to the PCR-GLOBWB land surface model. This fully-coupled model accommodates two-way interactions between surface water levels and groundwater head dynamics, as well as between upper soil moisture states and groundwater levels, including a capillary rise mechanism to sustain upper soil storage and thus to fulfill high evaporation demands (during dry conditions). As a test bed, we used the Rhine-Meuse basin, where more than 4000 groundwater head time series have been collected for validation purposes. The model was parameterized using globally available data-sets on surface elevation, drainage direction, land-cover, soil and lithology. Next, the model was calibrated using a brute force approach and massive parallel computing, i.e. by running the coupled groundwater-land surface model for more than 3000 different parameter sets. Here, we varied minimal soil moisture storage and saturated conductivities of the soil layers as well as aquifer transmissivities. Using different regularization strategies and calibration criteria we compared three calibration scenarios
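
    The brute-force calibration described above (run the coupled model for each of thousands of parameter sets, score against observations) reduces to a parameter sweep. A minimal sketch, with a toy stand-in for the coupled groundwater-land surface model and hypothetical discharge observations:

```python
# Hypothetical sketch of brute-force calibration: run the model for every
# parameter set and keep the one minimising RMSE against observed discharge.
# 'toy_model' stands in for the full coupled model run.
import itertools, math

observed = [3.0, 4.5, 6.0, 5.0]  # hypothetical discharge observations

def toy_model(conductivity, transmissivity):
    # Placeholder for a full model run: returns simulated discharge.
    return [conductivity * t + transmissivity for t in range(1, 5)]

def rmse(sim, obs):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

# Parameter space: saturated conductivity x aquifer transmissivity.
conductivities = [0.5, 1.0, 1.5]
transmissivities = [1.0, 2.0, 3.0]

best = min(itertools.product(conductivities, transmissivities),
           key=lambda p: rmse(toy_model(*p), observed))
print("best parameter set:", best)
```

    In the study itself each "model run" is a full coupled simulation, which is why massive parallel computing is needed to cover more than 3000 parameter sets.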

  11. A ranking method for the concurrent learning of compounds with various activity profiles.

    Science.gov (United States)

    Dörr, Alexander; Rosenbaum, Lars; Zell, Andreas

    2015-01-01

    In this study, we present an SVM-based ranking algorithm for the concurrent learning of compounds with different activity profiles and their varying prioritization. To this end, a specific labeling of each compound was elaborated in order to infer virtual screening models against multiple targets. We compared the method with several state-of-the-art SVM classification techniques capable of inferring multi-target screening models on three chemical data sets (cytochrome P450s, dehydrogenases, and a trypsin-like protease data set), each containing three different biological targets. The experiments show that ranking-based algorithms achieve increased performance for single- and multi-target virtual screening. Moreover, compared to other multi-target SVM methods, compounds that do not completely fulfill the desired activity profile are still ranked higher than decoys or compounds with an entirely undesired profile. SVM-based ranking methods thus constitute a valuable approach for virtual screening in multi-target drug design. Such methods are most helpful when dealing with compounds with various activity profiles and when many ligands with an already perfectly matching activity profile cannot be expected.
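
    The core of such ranking approaches is the pairwise reduction: each pair of compounds with different priorities yields one training example for a linear SVM. A minimal sketch with hypothetical feature vectors and preference scores (not the paper's actual labeling scheme):

```python
# Sketch of the pairwise reduction behind SVM-based ranking (assumed setup):
# compounds carry a preference score (2 = desired profile, 1 = partial match,
# 0 = decoy); every ordered pair with different scores yields one difference
# vector that a linear SVM could then learn to classify as "preferred".
compounds = [
    ([1.0, 0.0, 1.0], 2),  # fully matches the desired activity profile
    ([1.0, 1.0, 0.0], 1),  # partial match
    ([0.0, 0.0, 0.0], 0),  # decoy
]

def pairwise_examples(data):
    """Yield (x_i - x_j, +1) for every pair where i is preferred over j."""
    out = []
    for xi, yi in data:
        for xj, yj in data:
            if yi > yj:
                diff = [a - b for a, b in zip(xi, xj)]
                out.append((diff, +1))
    return out

pairs = pairwise_examples(compounds)
print(len(pairs), "preference pairs")
```

    This reduction is what lets partial matches end up ranked above decoys: the partial-match-over-decoy pairs are explicit training examples rather than being collapsed into a single "inactive" class.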

  12. Validation and evaluation of common large-area display set (CLADS) performance specification

    Science.gov (United States)

    Hermann, David J.; Gorenflo, Ronald L.

    1998-09-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a Common Large Area Display Set (CLADS) for use in multiple Command, Control, Communications, Computers, and Intelligence (C4I) applications that currently use 19-inch Cathode Ray Tubes (CRTs). Battelle engineers have built and fully tested pre-production prototypes of the CLADS design for AWACS, and are completing pre-production prototype displays for three other platforms simultaneously. With the CLADS design, any display technology that can be packaged to meet the form, fit, and function requirements defined by the Common Large Area Display Head Assembly (CLADHA) performance specification is a candidate for CLADS applications. This technology-independent feature reduces the risk of CLADS development, permits lifelong technology-insertion upgrades without unnecessary redesign, and addresses many of the obsolescence problems associated with COTS technology-based acquisition. Performance and environmental testing were performed on the AWACS CLADS and continue on other platforms as part of the performance specification validation process. A simulator assessment and flight assessment were successfully completed for the AWACS CLADS, and lessons learned from these assessments are being incorporated into the performance specifications. Draft CLADS specifications were released to potential display integrators and manufacturers for review in 1997, and the final version of the performance specifications is scheduled to be released to display integrators and manufacturers in May 1998. Initial USAF applications include replacements for the E-3 AWACS color monitor assembly, E-8 Joint STARS graphics display unit, and ABCCC airborne color display. Initial U.S. Navy applications include the E-2C ACIS display. For these applications, reliability and maintainability are key objectives. The common design will reduce the cost of operation and maintenance by an estimated 3.3M per year on E-3 AWACS

  13. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An

  14. Comparison of Multiple Linear Regressions and Neural Networks based QSAR models for the design of new antitubercular compounds.

    Science.gov (United States)

    Ventura, Cristina; Latino, Diogo A R S; Martins, Filomena

    2013-01-01

    The performance of two QSAR methodologies, namely Multiple Linear Regressions (MLR) and Neural Networks (NN), towards the modeling and prediction of antitubercular activity was evaluated and compared. A data set of 173 potentially active compounds belonging to the hydrazide family and represented by 96 descriptors was analyzed. Models were built with Multiple Linear Regressions (MLR), single Feed-Forward Neural Networks (FFNNs), ensembles of FFNNs, and Associative Neural Networks (AsNNs) using four different data sets and different types of descriptors. The predictive ability of the different techniques was assessed and discussed on the basis of different validation criteria; the results show, in general, a better performance of AsNNs in terms of learning ability and prediction of antitubercular behavior when compared with all other methods. MLRs have, however, the advantage of pinpointing the most relevant molecular characteristics responsible for the behavior of these compounds against Mycobacterium tuberculosis. The best results for the larger data set (94 compounds in the training set and 18 in the test set) were obtained with AsNNs using seven descriptors (R(2) of 0.874 and RMSE of 0.437, against R(2) of 0.845 and RMSE of 0.472 for MLRs, on the test set). Counter-Propagation Neural Networks (CPNNs) were trained with the same data sets and descriptors. From the scrutiny of the weight levels in each CPNN and the information retrieved from MLRs, a rational design of potentially active compounds was attempted. Two new compounds were synthesized and tested against M. tuberculosis, showing an activity close to that predicted by the majority of the models. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
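
    An MLR QSAR fit of the kind compared above can be sketched by solving the normal equations directly. The descriptor values and activities below are hypothetical, and R² and RMSE are computed the same way the abstract reports them:

```python
# Minimal sketch of a Multiple Linear Regression QSAR fit on hypothetical
# descriptor data: solve the normal equations (X^T X) b = X^T y by Gaussian
# elimination, then report R^2 and RMSE for the fitted model.
import math

# Hypothetical rows: [1 (intercept), descriptor1, descriptor2]; y = activity.
X = [[1, 1.0, 2.0], [1, 2.0, 1.0], [1, 3.0, 4.0], [1, 4.0, 3.0], [1, 5.0, 5.0]]
y = [2.0, 3.0, 5.0, 6.0, 8.0]

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Normal equations: (X^T X) b = X^T y
XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
coef = solve(XtX, Xty)

pred = [sum(c * xi for c, xi in zip(coef, row)) for row in X]
ss_res = sum((p - yi) ** 2 for p, yi in zip(pred, y))
mean_y = sum(y) / len(y)
ss_tot = sum((yi - mean_y) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot
rmse = math.sqrt(ss_res / len(y))
print(f"R^2 = {r2:.3f}, RMSE = {rmse:.3f}")
```

    The interpretability advantage the abstract attributes to MLR comes directly from `coef`: each coefficient quantifies one descriptor's contribution, something the neural network models cannot offer as readily.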

  15. HS-SPME optimization and extraction of volatile compounds from soursop (Annona muricata L.) pulp with emphasis on their characteristic impact compounds

    Directory of Open Access Journals (Sweden)

    Karen Leticia de SANTANA

    Full Text Available Abstract Aroma and taste are decisive factors in the selection of any food. The aim of this study was to extract the volatile compounds present in soursop (Annona muricata L.) pulp by the solid-phase microextraction (SPME) technique using 3 different fibers (DVB/CAR/PDMS, CAR/PDMS and PDMS/DVB). An experimental design was set up to evaluate the best extraction conditions, wherein the variables were adsorption temperature, ionic strength and pulp concentration. The separation of volatiles was performed in chromatographic columns of different polarity (polar and non-polar), while volatile compounds were identified by analysis in a high-resolution gas chromatography system coupled with mass spectrometry. The results obtained using the 3 different fibers revealed the capture of about 40 compounds. The CAR/PDMS fiber was more efficient for the capture of esters and the DVB/CAR/PDMS fiber for terpenes. The optimum conditions for capture of the highest number of volatiles on the polar column were 45 °C for extraction, 15% ionic strength and 50% pulp concentration, which resulted in the separation of 87 compounds. Among the principal character-impact compounds of soursop are (E)-2-hexenoate, methyl hexenoate and linalool.

  16. Practice settings and dentists' job satisfaction.

    Science.gov (United States)

    Lo Sasso, Anthony T; Starkel, Rebecca L; Warren, Matthew N; Guay, Albert H; Vujicic, Marko

    2015-08-01

    The nature and organization of dental practice is changing. The aim of this study was to explore how job satisfaction among dentists is associated with dental practice setting. A survey measured satisfaction with income, benefits, hours worked, clinical autonomy, work-life balance, emotional exhaustion, and overall satisfaction among dentists working in large group, small group, and solo practice settings; 2,171 dentists responded. The authors used logistic regression to measure differences in reported levels of satisfaction across practice settings. Dentists working in small group settings reported the most satisfaction overall. Dentists working in large group settings reported more satisfaction with income and benefits than dentists in solo practice, as well as having the least stress. Findings suggest possible advantages and disadvantages of working in different types of practice settings. Dentists working in different practice settings reported differences in satisfaction. These results may help dentists decide which practice setting is best for them. Copyright © 2015 American Dental Association. Published by Elsevier Inc. All rights reserved.

  17. The normal and inverse magnetocaloric effect in RCu2 (R=Tb, Dy, Ho, Er) compounds

    International Nuclear Information System (INIS)

    Zheng, X.Q.; Xu, Z.Y.; Zhang, B.; Hu, F.X.; Shen, B.G.

    2017-01-01

    Orthorhombic polycrystalline RCu2 (R = Tb, Dy, Ho and Er) compounds were synthesized, and their magnetic properties and magnetocaloric effect (MCE) were investigated in detail. All of the RCu2 compounds are antiferromagnetically (AFM) ordered. As temperature increases, RCu2 compounds undergo an AFM to AFM transition at T_t and an AFM to paramagnetic (PM) transition at T_N. Besides the normal MCE around T_N, a large inverse MCE around T_t was found in the TbCu2 compound. Under a field change of 0-7 T, the maximal value of the inverse MCE is even larger than that of the normal MCE around T_N for TbCu2. Considering both the normal and inverse MCE, TbCu2 shows the largest refrigerant capacity among the RCu2 (R = Tb, Dy, Ho and Er) compounds, indicating its potential applications in low-temperature multistage refrigeration. - Highlights: • A large inverse magnetocaloric effect is observed in the TbCu2 compound. • The AFM to AFM transition is observed in RCu2 (R = Tb, Dy, Ho, Er) compounds. • The MCE performance of the TbCu2 compound is evaluated in a more comprehensive way.

  18. Identification of compounds with binding affinity to proteins via magnetization transfer from bulk water

    International Nuclear Information System (INIS)

    Dalvit, Claudio; Pevarello, Paolo; Tato, Marco; Veronesi, Marina; Vulpetti, Anna; Sundstroem, Michael

    2000-01-01

    A powerful NMR screening methodology (WaterLOGSY), based on the transfer of magnetization from bulk water, for the identification of compounds that interact with target biomolecules (proteins, RNA and DNA fragments) is described. The method efficiently exploits the large reservoir of H2O magnetization. The high sensitivity of the technique reduces the amount of biomolecule and ligands needed for the screening, an important requirement for high-throughput screening by NMR of large libraries of compounds. Application of the method to a compound mixture against the cyclin-dependent kinase 2 (cdk2) protein is presented

  19. Large scale simulations of the mechanical properties of layered transition metal ternary compounds for fossil energy power system applications

    Energy Technology Data Exchange (ETDEWEB)

    Ching, Wai-Yim [Univ. of Missouri, Kansas City, MO (United States)

    2014-12-31

    Advanced materials with applications in extreme conditions such as high temperature, high pressure, and corrosive environments play a critical role in the development of new technologies to significantly improve the performance of different types of power plants. Materials that are currently employed in fossil energy conversion systems are typically the Ni-based alloys and stainless steels that have already reached their ultimate performance limits. Incremental improvements are unlikely to meet the more stringent requirements aimed at increased efficiency and reduced risks while addressing environmental concerns and keeping costs low. Computational studies can lead the way in the search for novel materials or for significant improvements in existing materials that can meet such requirements. Detailed computational studies with sufficient predictive power can provide an atomistic-level understanding of the key characteristics that lead to desirable properties. This project focuses on the comprehensive study of a new class of materials called MAX phases, or Mn+1AXn (M = a transition metal, A = Al or other group III, IV, and V elements, X = C or N). The MAX phases are layered transition metal carbides or nitrides with a rare combination of metallic and ceramic properties. Due to their unique structural arrangements and special types of bonding, these thermodynamically stable alloys possess some of the most outstanding properties. We used a genomic approach in screening a large number of potential MAX phases and established a database of 665 viable MAX compounds covering structural, mechanical and electronic properties, and investigated the correlations among them. This database is then used as a tool for materials informatics for further exploration of this class of intermetallic compounds.

  20. Organometallic compounds of the lanthanides, actinides and early transition metals

    Energy Technology Data Exchange (ETDEWEB)

    Cardin, D J [Trinity Coll., Dublin (Ireland); Cotton, S A [Stanground School, Peterborough (UK); Green, M [Bristol Univ. (UK); Labinger, J A [Atlantic Richfield Co., Los Angeles, CA (USA); eds.

    1985-01-01

    This book provides a reference compilation of physical and bibliographical data on over 1500 of the most important and useful organometallic compounds of the lanthanides, actinides and early transition metals, representing 38 different elements. The compounds are listed in molecular-formula order in a series of entries in dictionary format. Details of structure, physical and chemical properties, reactions and key references are clearly set out. All the data are fully indexed and a structural index is provided.

  1. From Visualisation to Data Mining with Large Data Sets

    CERN Document Server

    Adelmann, Andreas; Shalf, John M; Siegerist, Cristina

    2005-01-01

    In 3D particle simulations, the generated 6D phase space data can be very large due to the need for accurate statistics, sufficient noise attenuation in the field solver, and tracking of many turns in ring machines or accelerators. There is a need for distributed applications that allow users to peruse these extremely large remotely located datasets with the same ease as locally downloaded data. This paper will show concepts and a prototype tool to extract useful physical information out of 6D raw phase space data. ParViT allows the user to project 6D data into 3D space by selecting which dimensions will be represented spatially and which dimensions are represented as particle attributes, and supports the construction of complex transfer functions for representing the particle attributes. It also allows management of time-series data. An HDF5-based parallel-I/O library, with C++, C and Fortran bindings, simplifies the interface with a variety of codes. A number of hooks in ParViT will allow it to connect with a para...
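
    The projection step described above (three of the six phase-space dimensions become spatial axes, the rest become per-particle attributes) can be sketched as follows; the particle values and dimension names are hypothetical, not ParViT's actual data model:

```python
# Sketch of a ParViT-style 6D-to-3D projection (hypothetical data): pick
# three of the six phase-space coordinates as spatial axes and carry the
# remaining coordinates along as per-particle attributes.
DIMS = ["x", "px", "y", "py", "z", "pz"]

particles = [
    (0.1, 0.01, -0.2, 0.02, 1.0, 0.99),
    (0.3, -0.02, 0.1, 0.00, 1.1, 1.01),
]

def project(particles, spatial=("x", "y", "z")):
    """Split each 6D particle into a 3D position plus an attribute dict."""
    idx = {name: i for i, name in enumerate(DIMS)}
    out = []
    for p in particles:
        pos = tuple(p[idx[s]] for s in spatial)
        attrs = {d: p[idx[d]] for d in DIMS if d not in spatial}
        out.append((pos, attrs))
    return out

projected = project(particles, spatial=("x", "y", "z"))
print(projected[0])
```

    A visualization tool would then color or size each rendered point by one of the attribute values, e.g. the momentum components left out of the spatial selection.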

  2. AN EFFICIENT DATA MINING METHOD TO FIND FREQUENT ITEM SETS IN LARGE DATABASE USING TR- FCTM

    Directory of Open Access Journals (Sweden)

    Saravanan Suba

    2016-01-01

    Full Text Available Mining association rules in large databases is one of the most popular data mining techniques for business decision makers. Discovering frequent item sets is the core process in association rule mining. Numerous algorithms are available in the literature to find frequent patterns. Apriori and FP-tree are the most common methods for finding frequent items. Apriori finds significant frequent items using candidate generation with a large number of database scans. FP-tree uses two database scans to find significant frequent items without using candidate generation. The proposed TR-FCTM (Transaction Reduction - Frequency Count Table Method) discovers significant frequent items by generating full candidates once to form a frequency count table with one database scan. Experimental results show that TR-FCTM outperforms Apriori and FP-tree.
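
    The general idea of counting candidate itemsets in a frequency table during a single scan can be sketched as below. This is an illustrative frequent-itemset counter, not the published TR-FCTM algorithm; the transactions and support threshold are hypothetical:

```python
# Sketch of a frequency-count-table approach to frequent itemsets: one scan
# over the transactions counts every candidate itemset up to size 2, then
# frequent sets are read off the table against a minimum-support threshold.
from itertools import combinations
from collections import Counter

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]
min_support = 2

counts = Counter()
for t in transactions:                 # single database scan
    for size in (1, 2):
        for itemset in combinations(sorted(t), size):
            counts[itemset] += 1

frequent = {s: c for s, c in counts.items() if c >= min_support}
print(sorted(frequent))
```

    The trade-off this illustrates is the one the abstract describes: a single scan suffices, at the price of enumerating and counting candidates up front rather than generating them iteratively as Apriori does.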

  3. Design of Fluorescent Compounds for Scintillation Detection

    Energy Technology Data Exchange (ETDEWEB)

    Pla-Dalmau, Anna [Northern Illinois U.

    1990-01-01

    Plastic scintillation detectors for high energy physics applications require the development of new fluorescent compounds to meet the demands set by the future generation of particle accelerators such as the Superconducting Super Collider (SSC). Plastic scintillators are commonly based on a polymer matrix doped with two fluorescent compounds: the primary dopant and the wavelength shifter. Their main characteristics are fast response time and high quantum efficiency. The exposure to larger radiation doses and demands for larger light output question their survivability in future experiments. A new type of plastic scintillator - the intrinsic scintillator - has been suggested. It uses a single dopant as both primary and wavelength shifter, and should be less susceptible to radiation damage....

  4. Registering coherent change detection products associated with large image sets and long capture intervals

    Science.gov (United States)

    Perkins, David Nikolaus; Gonzales, Antonio I

    2014-04-08

    A set of co-registered coherent change detection (CCD) products is produced from a set of temporally separated synthetic aperture radar (SAR) images of a target scene. A plurality of transformations are determined, which transformations are respectively for transforming a plurality of the SAR images to a predetermined image coordinate system. The transformations are used to create, from a set of CCD products produced from the set of SAR images, a corresponding set of co-registered CCD products.
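
    The registration step reduces to applying each image's transform into the common coordinate system. A minimal sketch with hypothetical 2D affine transforms (the actual SAR imaging geometry and resampling are considerably more involved):

```python
# Sketch of co-registration (hypothetical transforms): each SAR image carries
# a 2D affine transform into a predetermined coordinate system; applying an
# image's transform to its CCD product's pixel coordinates co-registers the
# whole product set.
def apply_affine(T, point):
    """Apply a 2x3 affine transform [[a, b, tx], [c, d, ty]] to (x, y)."""
    (a, b, tx), (c, d, ty) = T
    x, y = point
    return (a * x + b * y + tx, c * x + d * y + ty)

# Hypothetical per-image transforms to the common coordinate system.
transforms = {
    "pass1": [[1.0, 0.0, 5.0], [0.0, 1.0, -3.0]],   # pure shift
    "pass2": [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0]],   # 90-degree rotation
}

# A pixel location in one image maps to its common-frame location.
print(apply_affine(transforms["pass1"], (10.0, 20.0)))
```

    Because every CCD product inherits its parent images' transforms, warping each product with the corresponding transform yields a stack of products aligned in the same frame, ready for change analysis across long capture intervals.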

  5. Antiviral lead compounds from marine sponges

    KAUST Repository

    Sagar, Sunil

    2010-10-11

    Marine sponges are currently one of the richest sources of pharmacologically active compounds found in the marine environment. These bioactive molecules are often secondary metabolites whose main function is to enable and/or modulate cellular communication and defense. They are usually produced by functional enzyme clusters in sponges and/or their associated symbiotic microorganisms. Natural product lead compounds from sponges have often been found to be promising pharmaceutical agents. Several of them have successfully been approved as antiviral agents for clinical use or have been advanced to the late stages of clinical trials. Most of these drugs are used for the treatment of human immunodeficiency virus (HIV) and herpes simplex virus (HSV). The most important antiviral lead of marine origin reported thus far is the nucleoside Ara-A (vidarabine), isolated from the sponge Tethya crypta. It inhibits viral DNA polymerase and DNA synthesis of herpes, vaccinia and varicella zoster viruses. However, due to the discovery of new types of viruses and the emergence of drug-resistant strains, it is necessary to continuously develop new antiviral lead compounds. Several sponge-derived antiviral lead compounds that are hoped to be developed as future drugs are discussed in this review. Supply problems are usually the major bottleneck in the development of these compounds as drugs during clinical trials. However, advances in the fields of metagenomics and high-throughput microbial cultivation have raised the possibility that these techniques could lead to the cost-effective large-scale production of such compounds. Perspectives on biotechnological methods with respect to marine drug development are also discussed. © 2010 by the authors; licensee MDPI.

  6. Exploring Marine Cyanobacteria for Lead Compounds of Pharmaceutical Importance

    Directory of Open Access Journals (Sweden)

    Bushra Uzair

    2012-01-01

    Full Text Available The Ocean, which is called the “mother of origin of life,” is also the source of structurally unique natural products that are mainly accumulated in living organisms. Cyanobacteria are photosynthetic prokaryotes used as food by humans. They are excellent source of vitamins and proteins vital for life. Several of these compounds show pharmacological activities and are helpful for the invention and discovery of bioactive compounds, primarily for deadly diseases like cancer, acquired immunodeficiency syndrome (AIDS, arthritis, and so forth, while other compounds have been developed as analgesics or to treat inflammation, and so forth. They produce a large variety of bioactive compounds, including substances with anticancer and antiviral activity, UV protectants, specific inhibitors of enzymes, and potent hepatotoxins and neurotoxins. Many cyanobacteria produce compounds with potent biological activities. This paper aims to showcase the structural diversity of marine cyanobacterial secondary metabolites with a comprehensive coverage of alkaloids and other applications of cyanobacteria.

  7. Relationships for the impact sensitivities of energetic C-nitro compounds based on bond dissociation energy.

    Science.gov (United States)

    Li, Jinshan

    2010-02-18

    The ZPE-corrected C-NO(2) bond dissociation energies (BDEs(ZPE)) of a series of model C-nitro compounds and 26 energetic C-nitro compounds have been calculated using density functional theory methods. Computed results show that for C-nitro compounds the UB3LYP-calculated BDE(ZPE) is smaller than the UB3P86 value using the 6-31G** basis set, and the UB3P86 BDE(ZPE) changes only slightly as the basis set varies from 6-31G** to 6-31++G**. For the series of model C-nitro compounds with different chemical skeletons, NBO analysis shows that the order of BDE(ZPE) is in line both with that of the NAO bond order and with that of the energy gap between C-NO(2) bonding and antibonding orbitals. It is found that for the energetic C-nitro compounds whose drop energies (E(dr)) are below 24.5 J a good linear correlation exists between E(dr) and BDE(ZPE), implying that these compounds ignite through the C-NO(2) dissociation mechanism. After excluding the so-called trinitrotoluene-mechanism compounds, a polynomial correlation of ln(E(dr)) with the BDE(ZPE) calculated at density functional theory levels has been established for the 18 C-NO(2) dissociation energetic C-nitro compounds.

  8. Effectively identifying compound-protein interactions by learning from positive and unlabeled examples.

    Science.gov (United States)

    Cheng, Zhanzhan; Zhou, Shuigeng; Wang, Yang; Liu, Hui; Guan, Jihong; Chen, Yi-Ping Phoebe

    2016-05-18

    Prediction of compound-protein interactions (CPIs) aims to find new compound-protein pairs in which a protein is targeted by at least one compound, a crucial step in new drug design. Currently, a number of machine learning based methods have been developed to predict new CPIs in the literature. However, as there is not yet any publicly available set of validated negative CPIs, most existing machine learning based approaches use unknown interactions (not validated CPIs), selected randomly, as negative examples to train classifiers for predicting new CPIs. Obviously, this is not quite reasonable and unavoidably impacts CPI prediction performance. In this paper, we simply take the unknown CPIs as unlabeled examples, and propose a new method called PUCPI (the abbreviation of PU learning for Compound-Protein Interaction identification) that employs biased-SVM (Support Vector Machine) to predict CPIs using only positive and unlabeled examples. PU learning is a class of learning methods that learns from positive and unlabeled (PU) samples. To the best of our knowledge, this is the first work that identifies CPIs using only positive and unlabeled examples. We first collect known CPIs as positive examples and then randomly select compound-protein pairs not in the positive set as unlabeled examples. For each CPI/compound-protein pair, we extract protein domains as protein features and compound substructures as chemical features, then take the tensor product of the corresponding compound features and protein features as the feature vector of the CPI/compound-protein pair. After that, biased-SVM is employed to train classifiers on different datasets of CPIs and compound-protein pairs. Experiments over various datasets show that our method outperforms six typical classifiers, including random forest, L1- and L2-regularized logistic regression, naive Bayes, SVM and k-nearest neighbor (kNN), and three types of existing CPI prediction models.
Source code, datasets and
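
    The feature construction described in the abstract (tensor product of compound substructure features and protein domain features) is straightforward to sketch. The fingerprints below are hypothetical, and the class-weight dict only illustrates the biased-SVM idea of penalizing errors on positives more heavily than on unlabeled examples:

```python
# Sketch of the pair-feature construction (hypothetical fingerprints): the
# tensor (outer) product of a compound's substructure fingerprint and a
# protein's domain fingerprint gives one feature vector per compound-protein
# pair.
def tensor_product(compound_fp, protein_fp):
    """Outer product of two binary fingerprints, flattened to one vector."""
    return [c * p for c in compound_fp for p in protein_fp]

compound_fp = [1, 0, 1]      # hypothetical substructure fingerprint
protein_fp = [0, 1]          # hypothetical domain fingerprint

pair_features = tensor_product(compound_fp, protein_fp)
print(pair_features)  # length = len(compound_fp) * len(protein_fp)

# Biased-SVM idea in one line: positives get a larger misclassification
# penalty than unlabeled examples.
class_weight = {"positive": 10.0, "unlabeled": 1.0}
```

    Each entry of the resulting vector corresponds to one (substructure, domain) combination, which is what lets a linear classifier associate particular chemical substructures with particular protein domains.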

  9. Orientation distribution in Bi2Te3-based compound prepared by spark plasma sintering

    International Nuclear Information System (INIS)

    Kim, K.T.; Kim, Y.H.; Lim, C.H.; Cho, D.C.; Lee, Y.S.; Lee, C.H.

    2005-01-01

    P-type Bi0.5Sb1.5Te3 compounds doped with 3 wt.% Te were fabricated by spark plasma sintering after mixing large powders (P_L) and small powders (P_S). We obtained the highest figure of merit (Z_C), 2.89 x 10^-3 /K, in the sintered compound mixed at P_L:P_S = 80:20. This resulted from the increase in orientation due to the large powders (P_L) and the reduction of pores due to the small powders (P_S). The figure of merit (Z_C) of the sintered compound using only small powders (P_S) showed a lower value, 2.67 x 10^-3 /K, compared with that of the sintered compound mixed at P_L:P_S = 80:20, due to increased electrical resistivity. (orig.)

  10. Phenolic Compounds in the Potato and Its Byproducts: An Overview

    Science.gov (United States)

    Akyol, Hazal; Riciputi, Ylenia; Capanoglu, Esra; Caboni, Maria Fiorenza; Verardo, Vito

    2016-01-01

    The potato (Solanum tuberosum L.) is a tuber that is largely used for food and is a source of different bioactive compounds such as starch, dietary fiber, amino acids, minerals, vitamins, and phenolic compounds. Phenolic compounds are synthesized by the potato plant as a protective response to bacteria, fungi, viruses, and insects. Several works have shown that these potato compounds exhibit health-promoting effects in humans. However, the use of the potato in the food industry subjects this vegetable to different processes that can alter the phenolic content. Moreover, many of these compounds with high bioactivity are located in the potato’s skin, and so are eliminated as waste. This review covers the most recent articles dealing with phenolic compounds in the potato and potato byproducts, along with the effects of harvesting, post-harvest, and technological processes. Briefly, the phenolic composition and the main extraction and determination methods are described. In addition, the “alternative” food uses and healthy properties of potato phenolic compounds are addressed. PMID:27240356

  11. Effect of Extracts and Isolated Pure Compounds of Spondias ...

    African Journals Online (AJOL)

    , with the pharmacological mechanisms responsible for these effects have remained largely unexplored. This study elucidated the neurotransmitter systems and receptors involved in the effects of extracts, and isolated compounds of Spondias ...

  12. Galaxy Evolution Insights from Spectral Modeling of Large Data Sets from the Sloan Digital Sky Survey

    Energy Technology Data Exchange (ETDEWEB)

    Hoversten, Erik A. [Johns Hopkins Univ., Baltimore, MD (United States)

    2007-10-01

    This thesis centers on the use of spectral modeling techniques on data from the Sloan Digital Sky Survey (SDSS) to gain new insights into current questions in galaxy evolution. The SDSS provides a large, uniform, high quality data set which can be exploited in a number of ways. One avenue pursued here is to use the large sample size to measure precisely the mean properties of galaxies of increasingly narrow parameter ranges. The other route taken is to look for rare objects which open up for exploration new areas in galaxy parameter space. The crux of this thesis is revisiting the classical Kennicutt method for inferring the stellar initial mass function (IMF) from the integrated light properties of galaxies. A large data set (~ 10^5 galaxies) from the SDSS DR4 is combined with more in-depth modeling and quantitative statistical analysis to search for systematic IMF variations as a function of galaxy luminosity. Galaxy Hα equivalent widths are compared to a broadband color index to constrain the IMF. It is found that for the sample as a whole the best fitting IMF power law slope above 0.5 M☉ is Γ = 1.5 ± 0.1 with the error dominated by systematics. Galaxies brighter than around M_r,0.1 = -20 (including galaxies like the Milky Way, which has M_r,0.1 ~ -21) are well fit by a universal Γ ~ 1.4 IMF, similar to the classical Salpeter slope, and smooth, exponential star formation histories (SFH). Fainter galaxies prefer steeper IMFs and the quality of the fits reveals that for these galaxies a universal IMF with smooth SFHs is actually a poor assumption. Related projects are also pursued. A targeted photometric search is conducted for strongly lensed Lyman break galaxies (LBG) similar to MS1512-cB58. The evolution of the photometric selection technique is described as are the results of spectroscopic follow-up of the best targets. The serendipitous discovery of two interesting blue compact dwarf galaxies is reported. These

  13. Quantitative Prediction of Solvation Free Energy in Octanol of Organic Compounds

    Directory of Open Access Journals (Sweden)

    Eduardo J. Delgado

    2009-03-01

Full Text Available The free energy of solvation, ΔG_S^0, in octanol of organic compounds is quantitatively predicted from the molecular structure. The model, involving only three molecular descriptors, is obtained by multiple linear regression analysis from a data set of 147 compounds containing diverse organic functions, namely, halogenated and non-halogenated alkanes, alkenes, alkynes, aromatics, alcohols, aldehydes, ketones, amines, ethers and esters; covering a ΔG_S^0 range from about –50 to 0 kJ·mol⁻¹. The model predicts the free energy of solvation with a squared correlation coefficient of 0.93 and a standard deviation of 2.4 kJ·mol⁻¹, just marginally larger than the generally accepted value of experimental uncertainty. The involved molecular descriptors have definite physical meaning corresponding to the different intermolecular interactions occurring in the bulk liquid phase. The model is validated with an external set of 36 compounds not included in the training set.
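The modeling step described above is ordinary multiple linear regression over molecular descriptors, with fit quality reported as a squared correlation coefficient and a residual standard deviation. The sketch below shows that procedure with NumPy on invented toy numbers; the three descriptor columns and all values are placeholders, since the abstract does not name the actual descriptors.

```python
import numpy as np

# Invented training data: rows are compounds, columns are three molecular
# descriptors (placeholders -- the abstract does not name the descriptors).
X = np.array([
    [1.2, 0.8, 3.1],
    [2.5, 1.1, 4.0],
    [0.7, 0.3, 2.2],
    [3.1, 1.9, 5.5],
    [1.8, 0.9, 3.6],
])
y = np.array([-12.4, -25.1, -8.0, -38.6, -18.9])  # illustrative ΔG values, kJ/mol

# Multiple linear regression via ordinary least squares: y ≈ X·b + b0
A = np.column_stack([X, np.ones(len(X))])  # append an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Goodness of fit, as reported in the abstract (R² and residual std. dev.)
y_hat = A @ coef
ss_res = float(np.sum((y - y_hat) ** 2))
ss_tot = float(np.sum((y - y.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot
std_dev = (ss_res / len(y)) ** 0.5
print(round(r_squared, 3), round(std_dev, 2))
```

In the actual study the coefficients were fitted on 147 compounds and then checked on the 36-compound external set by computing the same statistics on predictions for unseen data.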

  14. Identification of isomers of organometallic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Mbue, Sona Peter; Cho, Kwang Hwi [Dept. of Bioinformatics and Life Science, School of Systems Biomedical Science, Soongsil University,Seoul (Korea, Republic of)

    2015-06-15

The yaChI is a newly suggested chemical naming system. However, yaChI is a derivative of the IUPAC InChI with a modified algorithm that includes additional layers of chemical structure information. Consequently, a yaChI string contains more structural detail while preserving the original structure file information, and can distinctly identify very closely related compounds, reducing the chance of ambiguity in chemical compound databases as opposed to the general SMILES, InChI, and InChIKey. This study examines the relative performance of yaChI, SMILES, InChI, and InChIKey in duplication checks for isomers. For simplicity, a small data set of 28 organometallic compounds (structural isomers of Rh-containing compounds), subdivided into three major groups (A, B, and C) based on the number and type of ligands attached to the center atom, was used to study the performance of each encoding scheme in describing chemical structures. SMILES, InChI, and InChIKey were generated using Openbabel and RDKit, whereas yaChI strings were generated with an in-house program. Although the strings generated from SMILES, InChI, and InChIKey differed, they yielded only three unique chemical identifiers, one per group, indicating the presence of only three unique compounds in the study data. However, the yaChI results showed that all structures in each group are in fact unique and differ both among themselves and from those in other groups, mapping each structure to a unique identifier and giving a total of 28 unique structures in the study data. This high discriminating power justifies the accuracy and reliability of yaChI in duplication checks among closely related compounds, especially structures exhibiting stereo properties.

  15. Occurrence and fate of antibiotic, analgesic/anti-inflammatory, and antifungal compounds in five wastewater treatment processes.

    Science.gov (United States)

    Guerra, P; Kim, M; Shah, A; Alaee, M; Smyth, S A

    2014-03-01

The presence of pharmaceuticals and personal care products (PPCPs) in the aquatic environment as a result of wastewater effluent discharge is a concern in many countries. In order to expand our understanding of the occurrence and fate of PPCPs during wastewater treatment processes, 62 antibiotic, analgesic/anti-inflammatory, and antifungal compounds were analyzed in 72 liquid and 24 biosolid samples from six wastewater treatment plants (WWTPs) during the summer and winter seasons of 2010-2012. This is the first scientific study to compare five different wastewater treatment processes: facultative and aerated lagoons, chemically-enhanced primary treatment, secondary activated sludge, and advanced biological nutrient removal. PPCPs were detected in all WWTP influents at median concentrations of 1.5 to 92,000 ng/L, with no seasonal differences. PPCPs were also found in all final effluents at median levels ranging from 3.6 to 4,200 ng/L, with higher values during winter (p < 0.05). Removal efficiencies ranged between -450% and 120%, depending on the compound, WWTP type, and season. Mass balance showed that the fate of analgesic/anti-inflammatory compounds was predominantly biodegradation during biological treatment, while antibiotics and antifungal compounds were more likely to sorb to sludge. However, some PPCPs remained soluble and were detected in effluent samples. Overall, this study highlighted the occurrence and behavior of a large set of PPCPs and determined how their removal is affected by environmental/operational factors in different WWTPs. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.
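The removal efficiencies quoted above follow from a simple mass-balance ratio of influent to effluent concentrations; a minimal sketch with illustrative (not measured) numbers:

```python
def removal_efficiency(influent_ng_l, effluent_ng_l):
    """Per-compound removal efficiency in percent.

    Negative values (as low as the -450% reported) occur when the effluent
    concentration exceeds the influent one, e.g. when conjugated metabolites
    are cleaved back to the parent compound during treatment.
    """
    return 100.0 * (influent_ng_l - effluent_ng_l) / influent_ng_l

# Illustrative concentrations in ng/L:
print(removal_efficiency(1000.0, 50.0))   # well-removed compound: 95.0 %
print(removal_efficiency(200.0, 1100.0))  # apparent negative removal: -450.0 %
```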

  16. A review of phenolic compounds in oil-bearing plants: Distribution, identification and occurrence of phenolic compounds.

    Science.gov (United States)

    Alu'datt, Muhammad H; Rababah, Taha; Alhamad, Mohammad N; Al-Mahasneh, Majdi A; Almajwal, Ali; Gammoh, Sana; Ereifej, Khalil; Johargy, Ayman; Alli, Inteaz

    2017-03-01

Over the last two decades, the separation, identification and measurement of the total and individual content of phenolic compounds have been widely investigated. Recently, the presence of a wide range of phenolic compounds in oil-bearing plants has been shown to contribute to their therapeutic properties, including anti-cancer, anti-viral, anti-oxidant, hypoglycemic, hypo-lipidemic, and anti-inflammatory activities. Phenolics in oil-bearing plants are now recognized as important minor food components due to several organoleptic and health properties, and they are used as food or as sources of food ingredients. Variations in the content of phenolics in oil-bearing plants have largely been attributed to several factors, including cultivation, time of harvest and soil type. A number of authors have suggested that the presence of phenolics in extracted proteins, carbohydrates and oils may contribute to objectionable off-flavors. The objective of this study was to review the distribution, identification and occurrence of free and bound phenolic compounds in oil-bearing plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Efficient discovery of responses of proteins to compounds using active learning

    Science.gov (United States)

    2014-01-01

Background: Drug discovery and development have been aided by high-throughput screening methods that detect compound effects on a single target. However, when using focused initial screening, undesirable secondary effects are often detected late in the development process, after significant investment has been made. An alternative approach would be to screen against undesired effects early in the process, but the number of possible secondary targets makes this prohibitively expensive. Results: This paper describes methods for making this global approach practical by constructing predictive models for many target responses to many compounds and using them to guide experimentation. We demonstrate for the first time that by jointly modeling targets and compounds using descriptive features and using active machine learning methods, accurate models can be built by doing only a small fraction of possible experiments. The methods were evaluated by computational experiments using a dataset of 177 assays and 20,000 compounds constructed from the PubChem database. Conclusions: An average of nearly 60% of all hits in the dataset were found after exploring only 3% of the experimental space, which suggests that active learning can be used to enable more complete characterization of compound effects than otherwise affordable. The methods described are also likely to find widespread application outside drug discovery, such as for characterizing the effects of a large number of compounds or inhibitory RNAs on a large number of cell or tissue phenotypes. PMID:24884564
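The active-learning strategy described above can be sketched as a loop that alternates between fitting a surrogate model on the experiments run so far and greedily selecting the next batch. This toy version uses an invented ground truth and a deliberately crude surrogate, not the paper's joint target/compound feature model:

```python
import random

random.seed(0)

# Invented experimental space: (target, compound) -> hit/miss ground truth,
# standing in for the PubChem-derived assay matrix (not real data).
targets, compounds = range(8), range(50)
truth = {(t, c): random.random() < 0.1 for t in targets for c in compounds}

observed = {}  # experiments performed so far: (target, compound) -> outcome

def predict_hit_prob(t, c):
    """Crude surrogate: average outcome of past experiments sharing the
    same target or the same compound (a stand-in for the paper's model)."""
    related = [hit for (ot, oc), hit in observed.items() if ot == t or oc == c]
    return sum(related) / len(related) if related else 0.1

# Active-learning loop: each round runs the 20 unexplored experiments the
# surrogate currently ranks as most likely to be hits (greedy acquisition).
for _ in range(6):
    pool = sorted((p for p in truth if p not in observed),
                  key=lambda p: predict_hit_prob(*p), reverse=True)
    for p in pool[:20]:
        observed[p] = truth[p]  # "perform" the experiment

hits_found = sum(observed.values())
frac_explored = len(observed) / len(truth)  # 120 of 400 experiments = 30%
print(hits_found, frac_explored)
```

The paper's result corresponds to such a loop recovering a large fraction of all hits while `frac_explored` stays small; real acquisition functions also trade off exploration against exploitation rather than ranking purely greedily.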

  18. Pyrolysis mechanism of microalgae Nannochloropsis sp. based on model compounds and their interaction

    International Nuclear Information System (INIS)

    Wang, Xin; Tang, Xiaohan; Yang, Xiaoyi

    2017-01-01

Highlights: • Pyrolysis experiments were conducted with model compounds of algal components. • Interaction had little effect on the bio-crude yield in model-compound co-pyrolysis. • Some interaction pathways between microalgae components are proposed. • N-heterocyclic compounds were further pyrolysis products of Maillard reaction products. • Surfactant synthesis (lipid-amino acids and lipid-glucose) occurred between algal components. - Abstract: Pyrolysis is one of the important pathways for converting microalgae to liquid biofuels, but the key components of microalgae differ in chemical composition and structure, which presents a barrier to large-scale microalgae-based liquid biofuel application. The pyrolysis mechanism of microalgae components should therefore be studied to optimize pyrolysis process parameters. In this study, single pyrolysis and co-pyrolysis of microalgal components (the model compounds castor oil, soybean protein and glucose) were conducted to reveal interactions between them by thermogravimetric analysis and bio-crude evaluation. Castor oil (a model compound of lipid) has a higher pyrolysis temperature than the other model compounds and makes the maximum contribution to bio-crude formation. Bio-crude from soybean protein is richer in N-heterocyclic compounds as well as phenols, which could be an important aromatic hydrocarbon source in biorefineries and alternative aviation biofuel production. Potential interaction pathways based on the model compounds are proposed, including further decomposition of Maillard reaction products (MRPs) and surfactant synthesis, which indicates that glucose played an important role in the pyrolysis of the microalgal protein and lipid components. The results should provide necessary information for microalgae pyrolysis process optimization and large-scale pyrolysis reactor design.

  19. Identifying natural compounds as multi-target-directed ligands against Alzheimer's disease: an in silico approach.

    Science.gov (United States)

    Ambure, Pravin; Bhat, Jyotsna; Puzyn, Tomasz; Roy, Kunal

    2018-04-23

Alzheimer's disease (AD) is a multi-factorial disease, which can be simply outlined as an irreversible and progressive neurodegenerative disorder with an unclear root cause. It is a major cause of dementia in old aged people. In the present study, utilizing the structural and biological activity information of ligands for five important and most studied vital targets (i.e. cyclin-dependent kinase 5, β-secretase, monoamine oxidase B, glycogen synthase kinase 3β, acetylcholinesterase) that are believed to be effective against AD, we have developed five classification models using the linear discriminant analysis (LDA) technique. Considering the importance of data curation, we have given particular attention to chemical and biological data curation, which is a difficult task, especially in the case of big data sets. Thus, to ease the curation process we have designed Konstanz Information Miner (KNIME) workflows, which are made available at http://teqip.jdvu.ac.in/QSAR_Tools/ . The developed models were appropriately validated based on the predictions for experiment-derived data from test sets, as well as true external set compounds including known multi-target compounds. The domain of applicability for each classification model was checked based on a confidence estimation approach. Further, these validated models were employed for screening of natural compounds collected from the InterBioScreen natural database ( https://www.ibscreen.com/natural-compounds ). The natural compounds that were categorized as 'actives' in at least two of the five developed classification models were considered as multi-target leads, and these compounds were further screened using a drug-like filter and the molecular docking technique, and then thoroughly analyzed using molecular dynamics studies. Finally, the most potential multi-target natural compounds against AD are suggested.
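The multi-target lead selection step ('active' in at least two of the five models) is a simple consensus vote across per-target classifiers; a minimal sketch with invented compound names and predictions:

```python
# Hypothetical per-model predictions (1 = classified active) for a few
# compounds; both compound names and votes are invented for illustration.
predictions = {
    "compound_A": {"CDK5": 1, "BACE1": 1, "MAO-B": 0, "GSK3b": 1, "AChE": 0},
    "compound_B": {"CDK5": 0, "BACE1": 1, "MAO-B": 0, "GSK3b": 0, "AChE": 0},
    "compound_C": {"CDK5": 1, "BACE1": 0, "MAO-B": 1, "GSK3b": 0, "AChE": 1},
}

# Keep compounds classified 'active' by at least two of the five models,
# mirroring the multi-target lead selection step described above.
multi_target_leads = sorted(
    name for name, votes in predictions.items() if sum(votes.values()) >= 2
)
print(multi_target_leads)  # ['compound_A', 'compound_C']
```

In the study, the compounds that survive this vote are then passed through the drug-likeness filter, docking, and molecular dynamics stages.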

  20. Estimation of lower flammability limits of C-H compounds in air at atmospheric pressure, evaluation of temperature dependence and diluent effect.

    Science.gov (United States)

    Mendiburu, Andrés Z; de Carvalho, João A; Coronado, Christian R

    2015-03-21

Estimation of the lower flammability limits of C-H compounds at 25 °C and 1 atm, at moderate temperatures, and in the presence of diluents was the objective of this study. A set of 120 C-H compounds was divided into a correlation set and a prediction set of 60 compounds each. The absolute average relative error for the total set was 7.89%; for the correlation set, it was 6.09%; and for the prediction set, it was 9.68%. However, it was shown that by considering different sources of experimental data these values were reduced to 6.5% for the prediction set and to 6.29% for the total set. The method showed consistency with Le Chatelier's law for binary mixtures of C-H compounds. When tested for a temperature range from 5 °C to 100 °C, the absolute average relative errors were 2.41% for methane; 4.78% for propane; 0.29% for iso-butane and 3.86% for propylene. When nitrogen was added, the absolute average relative errors were 2.48% for methane; 5.13% for propane; 0.11% for iso-butane and 0.15% for propylene. When carbon dioxide was added, the absolute relative errors were 1.80% for methane; 5.38% for propane; 0.86% for iso-butane and 1.06% for propylene. Copyright © 2014 Elsevier B.V. All rights reserved.
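The Le Chatelier consistency check mentioned above rests on the standard mixing rule LFL_mix = 1 / Σ(y_i / LFL_i), where y_i are the fuel mole fractions summing to one and the LFLs are in vol.% fuel in air. A minimal sketch (the LFL values are approximate textbook numbers, not taken from this study):

```python
def lfl_mixture(fractions, lfls):
    """Le Chatelier's rule: LFL_mix = 1 / sum(y_i / LFL_i).

    `fractions` are the fuel mole fractions y_i (summing to 1) and `lfls`
    the pure-component lower flammability limits in vol.% fuel in air.
    """
    assert abs(sum(fractions) - 1.0) < 1e-9, "fuel fractions must sum to 1"
    return 1.0 / sum(y / l for y, l in zip(fractions, lfls))

# Approximate textbook LFLs (vol.% in air): methane ~5.0, propane ~2.1.
print(lfl_mixture([0.5, 0.5], [5.0, 2.1]))  # equimolar mixture, ~2.96 vol.%
```

A predictive method is "consistent" with this rule if its estimate for the binary mixture agrees with the value obtained by combining its pure-component estimates this way.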

  1. Isotopically modified compounds

    International Nuclear Information System (INIS)

    Kuruc, J.

    2009-01-01

    In this chapter the nomenclature of isotopically modified compounds in Slovak language is described. This chapter consists of following parts: (1) Isotopically substituted compounds; (2) Specifically isotopically labelled compounds; (3) Selectively isotopically labelled compounds; (4) Non-selectively isotopically labelled compounds; (5) Isotopically deficient compounds.

  2. Optimal adaptive normalized matched filter for large antenna arrays

    KAUST Repository

    Kammoun, Abla

    2016-09-13

This paper focuses on the problem of detecting a target in the presence of compound Gaussian clutter with unknown statistics. To this end, we focus on the design of the adaptive normalized matched filter (ANMF) detector, which uses the regularized Tyler estimator (RTE) built from N-dimensional observations x_1, · · ·, x_n in order to estimate the clutter covariance matrix. The choice of the RTE is motivated by two major attributes: first, its resilience to the presence of outliers, and second, its regularization parameter that makes it more suitable for handling the scarcity of observations. In order to facilitate the design of the ANMF detector, we consider the regime in which n and N are both large. This allows us to derive closed-form expressions for the asymptotic false alarm and detection probabilities. Based on these expressions, we propose an asymptotically optimal setting for the regularization parameter of the RTE that maximizes the asymptotic detection probability while keeping the asymptotic false alarm probability below a certain threshold. Numerical results are provided in order to illustrate the gain of the proposed detector over a recently proposed setting of the regularization parameter.
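The RTE itself is commonly computed by a fixed-point iteration that shrinks the Tyler estimator towards the identity. The sketch below implements one common form of that iteration; it is a generic illustration, not the authors' code, and the iteration count and demo data are arbitrary.

```python
import numpy as np

def regularized_tyler(X, rho, n_iter=50):
    """Regularized Tyler estimator (RTE) of the clutter scatter matrix.

    X holds n observations of dimension N in its columns; rho in (0, 1] is
    the regularization (shrinkage) parameter. This is one common form of
    the fixed-point iteration, not necessarily the paper's exact variant.
    """
    N, n = X.shape
    sigma = np.eye(N)
    for _ in range(n_iter):
        inv = np.linalg.inv(sigma)
        # Quadratic forms x_i^H sigma^{-1} x_i for every observation
        q = np.einsum("ij,jk,ki->i", X.conj().T, inv, X).real
        # Shrunken sum of normalized outer products, pulled towards I
        sigma = (1 - rho) * (N / n) * (X / q) @ X.conj().T + rho * np.eye(N)
    return sigma

# Demo on toy real-valued data (N = 2 dimensions, n = 20 observations):
X = np.vstack([np.linspace(1.0, 2.0, 20), np.linspace(-1.0, 1.0, 20)])
print(regularized_tyler(X, rho=0.5))
```

The paper's contribution is the asymptotically optimal choice of `rho` in the large-n, large-N regime, which would be plugged in where `rho=0.5` appears here.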

  3. Optimal adaptive normalized matched filter for large antenna arrays

    KAUST Repository

Kammoun, Abla; Couillet, Romain; Pascal, Frédéric; Alouini, Mohamed-Slim

    2016-01-01

This paper focuses on the problem of detecting a target in the presence of compound Gaussian clutter with unknown statistics. To this end, we focus on the design of the adaptive normalized matched filter (ANMF) detector, which uses the regularized Tyler estimator (RTE) built from N-dimensional observations x_1, · · ·, x_n in order to estimate the clutter covariance matrix. The choice of the RTE is motivated by two major attributes: first, its resilience to the presence of outliers, and second, its regularization parameter that makes it more suitable for handling the scarcity of observations. In order to facilitate the design of the ANMF detector, we consider the regime in which n and N are both large. This allows us to derive closed-form expressions for the asymptotic false alarm and detection probabilities. Based on these expressions, we propose an asymptotically optimal setting for the regularization parameter of the RTE that maximizes the asymptotic detection probability while keeping the asymptotic false alarm probability below a certain threshold. Numerical results are provided in order to illustrate the gain of the proposed detector over a recently proposed setting of the regularization parameter.

  4. Phytogenic Compounds as Alternatives to In-Feed Antibiotics: Potentials and Challenges in Application

    Directory of Open Access Journals (Sweden)

    Chengbo Yang

    2015-03-01

Full Text Available This article summarizes current experimental knowledge on the efficacy, possible mechanisms and feasibility of applying phytogenic products as feed additives for food-producing animals. Phytogenic compounds comprise a wide range of plant-derived natural bioactive compounds, of which essential oils are a major group. Numerous studies have demonstrated that phytogenic compounds have a variety of functions, including antimicrobial/antiviral, antioxidative and anti-inflammatory effects and improvement in the palatability of feed and gut development/health. However, the mechanisms underlying their functions are still largely unclear. In the past, there has been a lack of consistency in the results from both laboratory and field studies, largely due to the varied composition of products, dosages, purities and growing conditions of the animals used. The minimal inhibitory concentration (MIC) of phytogenic compounds required for controlling enteric pathogens may not guarantee the best feed intake, balanced immunity of animals and cost-effectiveness in animal production. The lipophilic nature of phytogenic compounds also presents a challenge for effective delivery to the animal gut, which can partially be resolved by microencapsulation and combination with other compounds (synergistic effect). Interestingly, the effects of phytogenic compounds on anti-inflammation, gut chemosensing and possible disruption of bacterial quorum sensing could explain a certain number of studies across different animal species reporting better production performance in animals that have received phytogenic feed additives. It is obvious that phytogenic compounds have good potential as an alternative to antibiotics in feed for food animal production, and the combination of different phytogenic compounds appears to be an approach to improve their efficacy and safety in application. It is our expectation that the recent development of high-throughput and

  5. Data Mining and Visualization of Large Human Behavior Data Sets

    DEFF Research Database (Denmark)

    Cuttone, Andrea

    and credit card transactions – have provided us new sources for studying our behavior. In particular smartphones have emerged as new tools for collecting data about human activity, thanks to their sensing capabilities and their ubiquity. This thesis investigates the question of what we can learn about human...... behavior from this rich and pervasive mobile sensing data. In the first part, we describe a large-scale data collection deployment collecting high-resolution data for over 800 students at the Technical University of Denmark using smartphones, including location, social proximity, calls and SMS. We provide...... an overview of the technical infrastructure, the experimental design, and the privacy measures. The second part investigates the usage of this mobile sensing data for understanding personal behavior. We describe two large-scale user studies on the deployment of self-tracking apps, in order to understand...

  6. Magnetocaloric effects in RTX intermetallic compounds (R = Gd–Tm, T = Fe–Cu and Pd, X = Al and Si)

    International Nuclear Information System (INIS)

    Zhang Hu; Shen Bao-Gen

    2015-01-01

The magnetocaloric effect (MCE) of RTSi and RTAl systems with R = Gd–Tm, T = Fe–Cu and Pd, which have been widely investigated in recent years, is reviewed. It is found that these RTX compounds exhibit various crystal structures and magnetic properties, which then result in different MCEs. Large MCE has been observed not only in typical ferromagnetic materials but also in antiferromagnetic materials. The magnetic properties have been studied in detail to discuss the physical mechanism of the large MCE in RTX compounds. Particularly, some RTX compounds such as ErFeSi, HoCuSi, and HoCuAl exhibit a large reversible MCE under low magnetic field change, which suggests that these compounds could be promising materials for magnetic refrigeration in a low temperature range. (topical review)

  7. Induced production of brominated aromatic compounds in the alga Ceramium tenuicorne.

    Science.gov (United States)

    Dahlgren, Elin; Enhus, Carolina; Lindqvist, Dennis; Eklund, Britta; Asplund, Lillemor

    2015-11-01

    In the Baltic Sea, high concentrations of toxic brominated aromatic compounds have been detected in all compartments of the marine food web. A growing body of evidence points towards filamentous algae as a natural producer of these chemicals. However, little is known about the effects of environmental factors and life history on algal production of brominated compounds. In this study, several congeners of methoxylated polybrominated diphenyl ethers (MeO-PBDEs), hydroxylated polybrominated diphenyl ethers (OH-PBDEs) and brominated phenols (BPs) were identified in a naturally growing filamentous red algal species (Ceramium tenuicorne) in the Baltic Sea. The identified substances displayed large seasonal variations in the alga with a concentration peak in July. Production of MeO-/OH-PBDEs and BPs by C. tenuicorne was also established in isolated clonal material grown in a controlled laboratory setting. Based on three replicates, herbivory, as well as elevated levels of light and salinity in the culture medium, significantly increased the production of 2,4,6-tribromophenol (2,4,6-TBP). Investigation of differences in production between the isomorphic female, male and diploid clonal life stages of the alga grown in the laboratory revealed a significantly higher production of 2,4,6-TBP in the brackish water female gametophytes, compared to the corresponding marine gametophytes. Even higher concentrations of 2,4,6-TBP were produced by marine male gametophytes and sporophytes.

  8. Open TG-GATEs: a large-scale toxicogenomics database

    Science.gov (United States)

    Igarashi, Yoshinobu; Nakatsu, Noriyuki; Yamashita, Tomoya; Ono, Atsushi; Ohno, Yasuo; Urushidani, Tetsuro; Yamada, Hiroshi

    2015-01-01

    Toxicogenomics focuses on assessing the safety of compounds using gene expression profiles. Gene expression signatures from large toxicogenomics databases are expected to perform better than small databases in identifying biomarkers for the prediction and evaluation of drug safety based on a compound's toxicological mechanisms in animal target organs. Over the past 10 years, the Japanese Toxicogenomics Project consortium (TGP) has been developing a large-scale toxicogenomics database consisting of data from 170 compounds (mostly drugs) with the aim of improving and enhancing drug safety assessment. Most of the data generated by the project (e.g. gene expression, pathology, lot number) are freely available to the public via Open TG-GATEs (Toxicogenomics Project-Genomics Assisted Toxicity Evaluation System). Here, we provide a comprehensive overview of the database, including both gene expression data and metadata, with a description of experimental conditions and procedures used to generate the database. Open TG-GATEs is available from http://toxico.nibio.go.jp/english/index.html. PMID:25313160

  9. Large-N in Volcano Settings: Volcanosri

    Science.gov (United States)

    Lees, J. M.; Song, W.; Xing, G.; Vick, S.; Phillips, D.

    2014-12-01

    We seek a paradigm shift in the approach we take on volcano monitoring where the compromise from high fidelity to large numbers of sensors is used to increase coverage and resolution. Accessibility, danger and the risk of equipment loss requires that we develop systems that are independent and inexpensive. Furthermore, rather than simply record data on hard disk for later analysis we desire a system that will work autonomously, capitalizing on wireless technology and in field network analysis. To this end we are currently producing a low cost seismic array which will incorporate, at the very basic level, seismological tools for first cut analysis of a volcano in crises mode. At the advanced end we expect to perform tomographic inversions in the network in near real time. Geophone (4 Hz) sensors connected to a low cost recording system will be installed on an active volcano where triggering earthquake location and velocity analysis will take place independent of human interaction. Stations are designed to be inexpensive and possibly disposable. In one of the first implementations the seismic nodes consist of an Arduino Due processor board with an attached Seismic Shield. The Arduino Due processor board contains an Atmel SAM3X8E ARM Cortex-M3 CPU. This 32 bit 84 MHz processor can filter and perform coarse seismic event detection on a 1600 sample signal in fewer than 200 milliseconds. The Seismic Shield contains a GPS module, 900 MHz high power mesh network radio, SD card, seismic amplifier, and 24 bit ADC. External sensors can be attached to either this 24-bit ADC or to the internal multichannel 12 bit ADC contained on the Arduino Due processor board. This allows the node to support attachment of multiple sensors. By utilizing a high-speed 32 bit processor complex signal processing tasks can be performed simultaneously on multiple sensors. 
Using a 10 W solar panel, a second system being developed can run autonomously and collect data on 3 channels at 100 Hz for 6 months.
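The record does not specify which trigger the node's "coarse seismic event detection" uses; a common choice for this kind of on-node triggering is a short-term-average / long-term-average (STA/LTA) ratio on signal energy, sketched here with illustrative window lengths and threshold:

```python
import math

def sta_lta_trigger(samples, sta_len=40, lta_len=400, threshold=3.0):
    """Classic STA/LTA event trigger on signal energy.

    Returns the first index where the short-term average energy exceeds
    `threshold` times the long-term average, or None if nothing triggers.
    Window lengths and threshold are illustrative guesses, not values
    taken from the record.
    """
    energy = [s * s for s in samples]
    for i in range(lta_len, len(samples) - sta_len):
        lta = sum(energy[i - lta_len:i]) / lta_len
        sta = sum(energy[i:i + sta_len]) / sta_len
        if lta > 0 and sta / lta > threshold:
            return i
    return None

# 1600-sample test signal (the record's detection window size): quiet
# background plus a burst starting at sample 800.
signal = [0.01 * math.sin(0.3 * k) for k in range(1600)]
for k in range(800, 900):
    signal[k] += 1.0
print(sta_lta_trigger(signal))  # fires as the burst enters the STA window
```

On the Cortex-M3 node, the same logic would run over fixed-point energy sums with running (rather than recomputed) window averages to stay within the 200 ms budget.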

  10. Volatile organic compounds and oxides of nitrogen. Further emission reductions

    Energy Technology Data Exchange (ETDEWEB)

Froste, H. [comp.]

    1997-12-31

    This report presents the current status in relation to achievement of the Swedish Environmental target set by Parliament to reduce emission of volatile organic compounds by 50 per cent between 1988 and 2000. It also instructed the Agency to formulate proposed measures to achieve a 50 per cent reduction of emission of nitrogen oxides between 1985 and 2005. The report presents an overall account of emission trends for volatile organic compounds (from all sectors) and nitrogen oxides (from the industry sector) and steps proposed to achieve further emission reductions. 43 refs

  11. Volatile organic compounds and oxides of nitrogen. Further emission reductions

    Energy Technology Data Exchange (ETDEWEB)

Froste, H. [comp.]

    1996-12-31

    This report presents the current status in relation to achievement of the Swedish Environmental target set by Parliament to reduce emission of volatile organic compounds by 50 per cent between 1988 and 2000. It also instructed the Agency to formulate proposed measures to achieve a 50 per cent reduction of emission of nitrogen oxides between 1985 and 2005. The report presents an overall account of emission trends for volatile organic compounds (from all sectors) and nitrogen oxides (from the industry sector) and steps proposed to achieve further emission reductions. 43 refs

  12. Correlation between octanol/water and liposome/water distribution coefficients and drug absorption of a set of pharmacologically active compounds.

    Science.gov (United States)

    Esteves, Freddy; Moutinho, Carla; Matos, Carla

    2013-06-01

    Absorption and consequent therapeutic action are key issues in the development of new drugs by the pharmaceutical industry. In this sense, different models can be used to simulate biological membranes to predict the absorption of a drug. This work compared the octanol/water and the liposome/water models. The parameters used to relate the two models were the distribution coefficients between liposomes and water and octanol and water and the fraction of drug orally absorbed. For this study, 66 drugs were collected from literature sources and divided into four groups according to charge and ionization degree: neutral; positively charged; negatively charged; and partially ionized/zwitterionic. The results show a satisfactory linear correlation between the octanol and liposome systems for the neutral (R²= 0.9324) and partially ionized compounds (R²= 0.9367), contrary to the positive (R²= 0.4684) and negatively charged compounds (R²= 0.1487). In the case of neutral drugs, results were similar in both models because of the high fraction orally absorbed. However, for the charged drugs (positively, negatively, and partially ionized/zwitterionic), the liposomal model has a more-appropriate correlation with absorption than the octanol model. These results show that the neutral compounds only interact with membranes through hydrophobic bonds, whereas charged drugs favor electrostatic interactions established with the liposomes. With this work, we concluded that liposomes may be a more-appropriate biomembrane model than octanol for charged compounds.
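The group-wise comparisons above hinge on the squared Pearson correlation between the two sets of distribution coefficients; a minimal sketch with invented log D values:

```python
def r_squared(x, y):
    """Squared Pearson correlation between two equal-length series,
    e.g. log D(octanol/water) vs log D(liposome/water) for one charge class."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Illustrative (invented) log D values for a handful of neutral compounds;
# a high R² here would mirror the strong octanol/liposome agreement the
# study reports for the neutral class.
log_d_octanol  = [1.2, 2.0, 2.8, 3.5, 0.5]
log_d_liposome = [1.4, 2.1, 2.9, 3.3, 0.8]
print(round(r_squared(log_d_octanol, log_d_liposome), 3))
```

The study's per-group R² values (0.93 for neutral vs 0.15 for negatively charged compounds) come from exactly this statistic computed class by class.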

  13. Mappability of drug-like space: towards a polypharmacologically competent map of drug-relevant compounds

    Science.gov (United States)

    Sidorov, Pavel; Gaspar, Helena; Marcou, Gilles; Varnek, Alexandre; Horvath, Dragos

    2015-12-01

    Intuitive, visual rendering—mapping—of high-dimensional chemical spaces (CS), is an important topic in chemoinformatics. Such maps were so far dedicated to specific compound collections—either limited series of known activities, or large, even exhaustive enumerations of molecules, but without associated property data. Typically, they were challenged to answer some classification problem with respect to those same molecules, admired for their aesthetical virtues and then forgotten—because they were set-specific constructs. This work wishes to address the question whether a general, compound set-independent map can be generated, and the claim of "universality" quantitatively justified, with respect to all the structure-activity information available so far—or, more realistically, an exploitable but significant fraction thereof. The "universal" CS map is expected to project molecules from the initial CS into a lower-dimensional space that is neighborhood behavior-compliant with respect to a large panel of ligand properties. Such map should be able to discriminate actives from inactives, or even support quantitative neighborhood-based, parameter-free property prediction (regression) models, for a wide panel of targets and target families. It should be polypharmacologically competent, without requiring any target-specific parameter fitting. This work describes an evolutionary growth procedure of such maps, based on generative topographic mapping, followed by the validation of their polypharmacological competence. Validation was achieved with respect to a maximum of exploitable structure-activity information, covering all of Homo sapiens proteins of the ChEMBL database, antiparasitic and antiviral data, etc. Five evolved maps satisfactorily solved hundreds of activity-based ligand classification challenges for targets, and even in vivo properties independent from training data. They also stood chemogenomics-related challenges, as cumulated responsibility

  14. Potential Signatures of Semi-volatile Compounds Associated With Nuclear Processing

    Energy Technology Data Exchange (ETDEWEB)

    Probasco, Kathleen M.; Birnbaum, Jerome C.; Maughan, A. D.

    2002-06-01

    Semi-volatile chemicals associated with nuclear processes (e.g., the reprocessing of uranium to produce plutonium for nuclear weapons, or the separation of actinides from processing waste streams) can produce sticky residues, or signatures, that attach to piping, ducting, soil, water, or other surface media. Volatile compounds, which are more suitable for electro-optical sensing, have been well studied. However, the semi-volatile compounds have not been well documented or studied. A majority of these semi-volatile chemicals are more robust than typical gaseous or liquid chemicals and can have lifetimes of several weeks, months, or years in the environment. However, large data gaps exist concerning these potential signature compounds, and more research is needed to fill these gaps so that important signature information is not overlooked or discarded. This report investigates key semi-volatile compounds associated with nuclear separations, identifies available chemical and physical properties, and discusses the degradation products that would result from hydrolysis, radiolysis and oxidation reactions on these compounds.

  15. Generalized nonimaging compound elliptical and compound hyperbolic luminaire designs for pair-overlap illumination applications.

    Science.gov (United States)

    Georlette, O; Gordon, J M

    1994-07-01

    Generalized nonimaging compound elliptical luminaires (CEL's) and compound hyperbolic luminaires (CHL's) are developed for pair-overlap illumination applications. A comprehensive analysis of CEL's and CHL's is presented. This includes the possibility of reflector truncation, as well as an extreme direction that spans the full range from positive to negative. Negative extreme direction devices have been overlooked in earlier studies and are shown to be well suited to illumination problems for which large cutoff angles are required. Flux maps can be calculated analytically without the need for computer ray tracing. It is demonstrated that, for a broad range of cutoff angles, adjacent pairs of CEL's and CHL's can generate highly uniform far-field illuminance while maintaining maximal lighting efficiency and excellent glare control. The trade-off between luminaire compactness and flux homogeneity is also illustrated. For V troughs, a special case of CHL's well suited to simple, inexpensive fabrication, we identify geometries that closely approach the performance characteristics of the optimized CEL's and CHL's.

  16. Large Scale Screening of Ethnomedicinal Plants for Identification of Potential Antibacterial Compounds

    Directory of Open Access Journals (Sweden)

    Sujogya Kumar Panda

    2016-03-01

    activity. The species listed here were shown to have anti-infective activity against both Gram-positive and Gram-negative bacteria. These results may serve as a guide for selecting plant species that could yield the highest probability of finding promising compounds responsible for the antibacterial activities against a broad spectrum of bacterial species. Further investigation of the phytochemicals from these plants will help to identify the lead compounds for drug discovery.

  17. Large Scale Screening of Ethnomedicinal Plants for Identification of Potential Antibacterial Compounds.

    Science.gov (United States)

    Panda, Sujogya Kumar; Mohanta, Yugal Kishore; Padhi, Laxmipriya; Park, Young-Hwan; Mohanta, Tapan Kumar; Bae, Hanhong

    2016-03-14

    listed here were shown to have anti-infective activity against both Gram-positive and Gram-negative bacteria. These results may serve as a guide for selecting plant species that could yield the highest probability of finding promising compounds responsible for the antibacterial activities against a broad spectrum of bacterial species. Further investigation of the phytochemicals from these plants will help to identify the lead compounds for drug discovery.

  18. Timetable-based simulation method for choice set generation in large-scale public transport networks

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Kjær; Anderson, Marie Karen; Nielsen, Otto Anker

    2016-01-01

    The composition and size of the choice sets are key to the correct estimation of and prediction by route choice models. While the existing literature has paid a great deal of attention to the generation of path choice sets for private transport problems, the same does not apply to public transport problems. This study proposes a timetable-based simulation method for generating path choice sets in a multimodal public transport network. Moreover, this study illustrates the feasibility of its implementation by applying the method to reproduce 5131 real-life trips in the Greater Copenhagen Area and to assess the choice set quality in a complex multimodal transport network. Results illustrate the applicability of the algorithm and the relevance of the utility specification chosen for the reproduction of real-life path choices. Moreover, results show that the level of stochasticity used in choice set...

  19. Phenolic Compounds Analysis of Root, Stalk, and Leaves of Nettle

    Directory of Open Access Journals (Sweden)

    Semih Otles

    2012-01-01

    Types of nettles (Urtica dioica) were collected from different regions to analyze phenolic compounds in this research. Nettles are specially grown in the coastal part. According to this kind of properties, nettle samples were collected from coastal part of (Mediterranean, Aegean, Black sea, and Marmara) Turkey. Phenolic profile, total phenol compounds, and antioxidant activities of nettle samples were analyzed. Nettles were separated to the part of root, stalk, and leaves. Then, these parts of nettle were analyzed to understand the difference of phenolic compounds and amount of them. Nettle (root, stalk and leaves) samples were analyzed by using High-Performance Liquid Chromatography with Diode-Array Detection (HPLC-DAD) to qualitative and quantitative determination of the phenolic compounds. Total phenolic components were measured by using Folin-Ciocalteu method. The antioxidant activity was measured by using DPPH (2,2-diphenyl-1-picrylhydrazyl) which is generally used for herbal samples and based on single electron transfer (SET).

  20. Phenolic compounds analysis of root, stalk, and leaves of nettle.

    Science.gov (United States)

    Otles, Semih; Yalcin, Buket

    2012-01-01

    Types of nettles (Urtica dioica) were collected from different regions to analyze phenolic compounds in this research. Nettles are specially grown in the coastal part. According to this kind of properties, nettle samples were collected from coastal part of (Mediterranean, Aegean, Black sea, and Marmara) Turkey. Phenolic profile, total phenol compounds, and antioxidant activities of nettle samples were analyzed. Nettles were separated to the part of root, stalk, and leaves. Then, these parts of nettle were analyzed to understand the difference of phenolic compounds and amount of them. Nettle (root, stalk and leaves) samples were analyzed by using High-Performance Liquid Chromatography with Diode-Array Detection (HPLC-DAD) to qualitative and quantitative determination of the phenolic compounds. Total phenolic components were measured by using Folin-Ciocalteu method. The antioxidant activity was measured by using DPPH (2,2-diphenyl-1-picrylhydrazyl) which is generally used for herbal samples and based on single electron transfer (SET).

  2. Simple multi-party set reconciliation

    DEFF Research Database (Denmark)

    Mitzenmacher, Michael; Pagh, Rasmus

    2017-01-01

     set reconciliation: two parties A1 and A2 each hold a set of keys, named S1 and S2 respectively, and the goal is for both parties to obtain S1 ∪ S2. Typically, set reconciliation is interesting algorithmically when the sets are large but the set difference |S1 − S2| + |S2 − S1| is small...
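The reconciliation goal described above can be sketched in a few lines. This is a naive illustration, not the paper's protocol: each party ships a digest of its keys and then only the keys the other side is missing, so only |S1 − S2| + |S2 − S1| actual keys travel. Practical schemes (e.g., invertible Bloom lookup tables) also compress the digest exchange down to roughly the size of the difference.

```python
def reconcile(s1, s2):
    """Naive two-party set reconciliation: exchange key digests,
    then transfer only the keys the other side is missing."""
    digests1 = {hash(k) for k in s1}  # sent from party A1 to A2
    digests2 = {hash(k) for k in s2}  # sent from party A2 to A1
    to_a1 = {k for k in s2 if hash(k) not in digests1}  # keys A1 lacks
    to_a2 = {k for k in s1 if hash(k) not in digests2}  # keys A2 lacks
    return s1 | to_a1, s2 | to_a2  # both sides now hold S1 ∪ S2

r1, r2 = reconcile({"a", "b", "c"}, {"c", "d"})
```

Only the two keys in the symmetric difference ("a"/"b" one way, "d" the other, "c" in neither payload) would be transferred in full.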

  3. Exposure to potentially toxic hydrocarbons and halocarbons released from the dialyzer and tubing set during hemodialysis.

    Science.gov (United States)

    Lee, Hyun Ji Julie; Meinardi, Simone; Pahl, Madeleine V; Vaziri, Nostratola D; Blake, Donald R

    2012-10-01

    Although much is known about the effect of chronic kidney failure and dialysis on the composition of solutes in plasma, little is known about their impact on the composition of gaseous compounds in exhaled breath. This study was designed to explore the effect of uremia and the hemodialysis (HD) procedure on the composition of exhaled breath. Breath samples were collected from 10 dialysis patients immediately before, during, and after a dialysis session. To determine the potential introduction of gaseous compounds from dialysis components, gases emitted from dialyzers, tubing sets, dialysate, and water supplies were collected. Prospective cohort study. 10 HD patients and 10 age-matched healthy individuals. Predictors include the dialyzers, tubing set, dialysate, and water supplies before, during, and after dialysis. Changes in the composition of exhaled breath. A 5-column/detector gas chromatography system was used to measure hydrocarbon, halocarbon, oxygenate, and alkyl nitrate compounds. Concentrations of 14 hydrocarbons and halocarbons in patients' breath rapidly increased after the onset of the HD treatment. All 14 compounds and 5 others not found in patients' breath were emitted from the dialyzers and tubing sets. Contrary to earlier reports, exhaled breath ethane concentrations in our dialysis patients were virtually unchanged during the HD treatment. A single-center study with a small sample size may limit the generalizability of the findings. The study documented the release of several potentially toxic hydrocarbons and halocarbons to patients from the dialyzer and tubing sets during the HD procedure. Because long-term exposure to these compounds may contribute to morbidity and mortality in the dialysis population, this issue should be considered in the manufacturing of the new generation of dialyzers and dialysis tubing sets. Copyright © 2012 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  4. Bond-based linear indices in QSAR: computational discovery of novel anti-trichomonal compounds

    Science.gov (United States)

    Marrero-Ponce, Yovani; Meneses-Marcel, Alfredo; Rivera-Borroto, Oscar M.; García-Domenech, Ramón; De Julián-Ortiz, Jesus Vicente; Montero, Alina; Escario, José Antonio; Barrio, Alicia Gómez; Pereira, David Montero; Nogal, Juan José; Grau, Ricardo; Torrens, Francisco; Vogel, Christian; Arán, Vicente J.

    2008-08-01

    Trichomonas vaginalis (Tv) is the causative agent of the most common non-viral sexually transmitted disease in women and men worldwide. Since 1959, metronidazole (MTZ) has been the drug of choice in the systemic treatment of trichomoniasis. However, resistance to MTZ in some patients and the great cost associated with the development of new trichomonacidals make it necessary to develop computational methods that shorten the drug discovery pipeline. Toward this end, bond-based linear indices, new TOMOCOMD-CARDD molecular descriptors, and linear discriminant analysis were used to discover novel trichomonacidal chemicals. The obtained models, using non-stochastic and stochastic indices, are able to correctly classify 89.01% (87.50%) and 82.42% (84.38%) of the chemicals in the training (test) sets, respectively. These results validate the models for use in ligand-based virtual screening. In addition, they show large Matthews correlation coefficients (C) of 0.78 (0.71) and 0.65 (0.65) for the training (test) sets, respectively. The results of predictions on the 10% full-out cross-validation test also evidence the robustness of the obtained models. Later, both models are applied to the virtual screening of 12 compounds already proved against Tv. As a result, they correctly classify 10 out of 12 (83.33%) and 9 out of 12 (75.00%) of the chemicals, respectively, which is the most important criterion for validating the models. Besides, these classification functions are applied to a library of seven chemicals in order to find novel antitrichomonal agents. These compounds are synthesized and tested for in vitro activity against Tv. As a result, experimental observations approached theoretical predictions, since a correct classification of 85.71% (6 out of 7) of the chemicals was obtained. Moreover, out of the seven compounds that are screened, synthesized and biologically assayed, six compounds (VA7-34, VA7-35, VA7-37, VA7-38, VA7-68, VA7-70) show
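The Matthews correlation coefficient cited above is a standard balanced measure of two-class performance. A minimal implementation from confusion-matrix counts can be sketched as follows (the counts in the usage line are illustrative, not those of the study):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts.
    Ranges from -1 (total disagreement) to +1 (perfect classification)."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0  # convention: 0 when any margin is empty

score = mcc(45, 40, 5, 10)  # illustrative counts
```

Unlike plain accuracy, MCC stays near zero for a classifier that merely predicts the majority class, which is why QSAR studies such as this one report it alongside classification rates.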

  5. Impact of Compound Hydrate Dynamics on Phase Boundary Changes

    Science.gov (United States)

    Osegovic, J. P.; Max, M. D.

    2006-12-01

    Compound hydrate reactions are affected by the local concentration of hydrate forming materials (HFM). The relationship between HFM composition and the phase boundary is as significant as temperature and pressure. Selective uptake and sequestration of preferred hydrate formers (PF) has wide ranging implications for the state and potential use of natural hydrate formation, including impact on climate. Rising mineralizing fluids of hydrate formers (such as those that occur on Earth and are postulated to exist elsewhere in the solar system) will sequester PF before methane, resulting in a positive relationship between depth and BTU content as ethane and propane are removed before methane. In industrial settings the role of preferred formers can separate gases. When depressurizing gas hydrate to release the stored gas, the hydrate initial composition will set the decomposition phase boundary because the supporting solution takes on the composition of the hydrate phase. In other settings where hydrate is formed, transported, and then dissociated, similar effects can control the process. The behavior of compound hydrate systems can primarily fit into three categories: 1) In classically closed systems, all the material that can form hydrate is isolated, such as in a sealed laboratory vessel. In such systems, formation and decomposition are reversible processes with observed hysteresis related to mass or heat transfer limitations, or the order and magnitude in which individual hydrate forming gases are taken up from the mixture and subsequently released. 2) Kinetically closed systems are exposed to a solution mass flow across a hydrate mass. These systems can have multiple P-T phase boundaries based on the local conditions at each face of the hydrate mass. A portion of hydrate that is exposed to fresh mineralizing solution will contain more preferred hydrate formers than another portion that is exposed to a partially depleted solution. Examples of kinetically closed

  6. HS-SPME analysis of volatile organic compounds of coniferous needle litter

    Science.gov (United States)

    Isidorov, V. A.; Vinogorova, V. T.; Rafałowski, K.

    The composition of the volatile emission of Scots pine (Pinus sylvestris) and spruce (Picea excelsa) litter was studied by gas chromatography-mass spectrometry (GC-MS); samples were collected by the solid-phase microextraction (SPME) method. The list of identified compounds includes over 60 organic substances of different classes. It was established that the volatile emission contains not only components of the essential oils of pine and spruce needles but also a large number of organic compounds which are probably secondary metabolites of litter-decomposing fungi. They include lower carbonyl compounds and alcohols as well as products of terpene dehydration and oxidation. These data show that the processes of litter decomposition are an important source of reactive organic compounds under the canopy of coniferous forests.

  7. Labelled compounds. (Pt. B)

    International Nuclear Information System (INIS)

    Buncel, E.; Jones, J.R.

    1991-01-01

    Since the end of World War II there has been a tremendous increase in the number of compounds that have been synthesized with radioactive or stable isotopes. They have found application in many diverse fields, so much so that hardly a single area in pure and applied science has not benefited. Not surprisingly, this has been reflected in the appearance of related publications. The early proceedings of the Symposia on Advances in Trace Methodology were soon followed by various Euratom-sponsored meetings in which methods of preparing and storing labelled compounds featured prominently. In due course a resurgence of interest in stable isotopes, brought about partly by their greater availability (and lower cost) and partly by the development of new techniques such as gas chromatography - mass spectrometry (gc-ms), led to the publication of the proceedings of several successful conferences. More recently, conferences dealing with the synthesis and applications of isotopes and isotopically labelled compounds have been established on a regular basis. In addition to the proceedings of conferences and journal publications, individuals left their mark by producing definitive texts, usually on specific nuclides. Only the classic two-volume publication of Murray and Williams (Organic syntheses with isotopes, New York 1958), now over 30 years old and out of print, attempted to do justice to several nuclides. With the large amount of work that has been undertaken since then, it seems unlikely that an updated edition could be produced. The alternative strategy was to ask scientists currently active to review specific areas, and this is the approach adopted in the present series of monographs. In this way it is intended to cover the broad advances that have been made in the synthesis and applications of isotopes and isotopically labelled compounds in the physical and biomedical sciences. (author). refs.; figs.; tabs

  8. Compound Semiconductor Radiation Detector

    International Nuclear Information System (INIS)

    Kim, Y. K.; Park, S. H.; Lee, W. G.; Ha, J. H.

    2005-01-01

    In 1945, Van Heerden measured α, β and γ radiations with a cooled AgCl crystal. It was the first radiation measurement using a compound semiconductor detector. Since then the compound semiconductor has been extensively studied as a radiation detector. Generally, radiation detectors can be divided into gas detectors, scintillators and semiconductor detectors. The semiconductor detector has advantages compared to the other radiation detectors. Since the density of a semiconductor detector is higher than that of a gas detector, the semiconductor detector can be made compact in size to measure high energy radiation. In a scintillator, the radiation is measured with a two-step process: the radiation is converted into photons inside the scintillator, which are then changed into electrons by a photo-detector. However, in a semiconductor radiation detector, the radiation is measured with a one-step process. Electron-hole pairs are generated by the radiation interaction inside the semiconductor detector, and these electrons and holes are directly collected to obtain the signal. The energy resolution of a semiconductor detector is generally better than that of a scintillator. At present, the commonly used semiconductors for radiation detectors are Si and Ge. However, these semiconductor detectors have weak points. That is, one needs thick material to measure high energy radiation because of the relatively low atomic number of the constituent material. In the Ge case, the dark current of the detector is large at room temperature because of the small band-gap energy. Recently, compound semiconductor detectors have been extensively studied to overcome these problems. In this paper, we briefly summarize recent research topics concerning compound semiconductor detectors. We also introduce the research activities of our group.

  9. PC-based support programs coupled with the sets code for large fault tree analysis

    International Nuclear Information System (INIS)

    Hioki, K.; Nakai, R.

    1989-01-01

    Power Reactor and Nuclear Fuel Development Corporation (PNC) has developed four PC programs: IEIQ (Initiating Event Identification and Quantification), MODESTY (Modular Event Description for a Variety of Systems), FAUST (Fault Summary Tables Generation Program) and ETAAS (Event Tree Analysis Assistant System). These programs prepare the input data for the SETS (Set Equation Transformation System) code and construct and quantify event trees (E/Ts) using the output of the SETS code. The capability of these programs is described and some examples of the results are presented in this paper. With these PC programs and the SETS code, PSA can now be performed with more consistency and less manpower.
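The SETS code itself manipulates Boolean equations to derive minimal cut sets from fault trees; the PC programs above prepare its input and post-process its output. As a hypothetical illustration of the underlying idea (not PNC's or SETS' actual implementation), a top-down expansion of AND/OR gates into minimal cut sets can be sketched as:

```python
from itertools import product

def cut_sets(gate, tree):
    """Expand a fault tree into its minimal cut sets.
    `tree` maps gate name -> ('AND' | 'OR', [children]);
    names absent from `tree` are basic events."""
    if gate not in tree:
        return [frozenset([gate])]  # a basic event is its own cut set
    op, children = tree[gate]
    child_sets = [cut_sets(c, tree) for c in children]
    if op == 'OR':
        # OR gate: any child's cut set fails the gate
        sets = [s for cs in child_sets for s in cs]
    else:
        # AND gate: one cut set from each child must occur together
        sets = [frozenset().union(*combo) for combo in product(*child_sets)]
    # keep only minimal sets (drop proper supersets)
    minimal = [s for s in sets if not any(t < s for t in sets)]
    return sorted(set(minimal), key=sorted)

tree = {'TOP': ('AND', ['G1', 'G2']),
        'G1': ('OR', ['A', 'B']),
        'G2': ('OR', ['A', 'C'])}
result = cut_sets('TOP', tree)  # -> [{'A'}, {'B', 'C'}]
```

The example tree shows why minimization matters: event A alone fails both OR gates, so the raw products {A, B} and {A, C} are absorbed by {A}.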

  10. Current knowledge of soft cheeses flavor and related compounds.

    Science.gov (United States)

    Sablé, S; Cottenceau, G

    1999-12-01

    Cheese aroma is the result of the perception of a large number of molecules belonging to different chemical classes. The volatile compounds involved in the soft cheese flavor have received a great deal of attention. However, there has been less work concerning the volatile compounds in the soft smear-ripened cheeses than in the mold-ripened cheeses. This paper reviews the components that contribute to the characteristic flavor in the soft cheeses such as surface-ripened, Camembert-type, and Blue cheeses. The sensory properties and quantities of the molecules in the different cheeses are discussed.

  11. Large parallel volumes of finite and compact sets in d-dimensional Euclidean space

    DEFF Research Database (Denmark)

    Kampf, Jürgen; Kiderlen, Markus

    The r-parallel volume V (Cr) of a compact subset C in d-dimensional Euclidean space is the volume of the set Cr of all points of Euclidean distance at most r > 0 from C. According to Steiner’s formula, V (Cr) is a polynomial in r when C is convex. For finite sets C satisfying a certain geometric...
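For reference, Steiner's formula mentioned above expresses the parallel volume of a convex body as a polynomial in r whose coefficients are the intrinsic volumes V_j(C):

```latex
V(C_r) \;=\; \sum_{j=0}^{d} \kappa_{d-j}\, V_j(C)\, r^{d-j},
\qquad C \subset \mathbb{R}^d \text{ convex},\; r \ge 0,
```

where κ_m is the volume of the m-dimensional unit ball, V_d(C) is the ordinary volume of C, and V_0(C) = 1 for a nonempty convex body; the j = d term recovers V(C) itself, and for a single point the formula reduces to the ball volume κ_d r^d.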

  12. ObspyDMT: a Python toolbox for retrieving and processing large seismological data sets

    Directory of Open Access Journals (Sweden)

    K. Hosseini

    2017-10-01

    We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control – routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).

  13. ObspyDMT: a Python toolbox for retrieving and processing large seismological data sets

    Science.gov (United States)

    Hosseini, Kasra; Sigloch, Karin

    2017-10-01

    We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control - routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).

  14. BitterSweetForest: A random forest based binary classifier to predict bitterness and sweetness of chemical compounds

    Science.gov (United States)

    Banerjee, Priyanka; Preissner, Robert

    2018-04-01

    The taste of chemical compounds present in food stimulates us to take in nutrients and avoid poisons. However, the perception of taste greatly depends on genetic as well as evolutionary perspectives. The aim of this work was the development and validation of a machine learning model based on molecular fingerprints to discriminate between sweet and bitter taste of molecules. BitterSweetForest is the first open access model based on a KNIME workflow that provides a platform for prediction of bitter and sweet taste of chemical compounds using molecular fingerprints and a Random Forest based classifier. The constructed model yielded an accuracy of 95% and an AUC of 0.98 in cross-validation. On an independent test set, BitterSweetForest achieved an accuracy of 96% and an AUC of 0.98 for bitter and sweet taste prediction. The constructed model was further applied to predict the bitter and sweet taste of natural compounds and approved drugs, as well as on an acute toxicity compound data set. BitterSweetForest predicted 70% of the natural product space as bitter and 10% as sweet, with confidence scores of 0.60 and above. 77% of the approved drug set was predicted as bitter and 2% as sweet, with confidence scores of 0.75 and above. Similarly, 75% of the total compounds from the acute oral toxicity class were predicted only as bitter, with a minimum confidence score of 0.75, suggesting that toxic compounds are mostly bitter. Furthermore, we applied a Bayesian based feature analysis method to discriminate the most frequently occurring chemical features between sweet and bitter compounds from the feature space of a circular fingerprint.
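The fingerprint-plus-Random-Forest idea can be illustrated with a toy, dependency-free sketch: binary "fingerprint" vectors and an ensemble of one-bit decision stumps standing in for full decision trees. The bit assignments and labels below are invented for illustration; the actual BitterSweetForest model is a KNIME workflow using real molecular fingerprints and a full Random Forest.

```python
import random

def train_forest(X, y, n_trees=25, seed=0):
    """Toy forest: each 'tree' is a stump that picks a random fingerprint
    bit and predicts the majority class among training molecules sharing
    that bit's value."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        bit = rng.randrange(len(X[0]))
        maj = {}
        for v in (0, 1):
            labels = [yi for xi, yi in zip(X, y) if xi[bit] == v]
            maj[v] = max(set(labels), key=labels.count) if labels else rng.choice(y)
        stumps.append((bit, maj))
    return stumps

def predict(stumps, x):
    """Majority vote over all stumps."""
    votes = [maj[x[bit]] for bit, maj in stumps]
    return max(set(votes), key=votes.count)

# Invented fingerprints: bit 0 marks 'bitter' scaffolds, bit 1 'sweet' ones.
X = [[1, 0], [1, 0], [0, 1], [0, 1]]
y = ['bitter', 'bitter', 'sweet', 'sweet']
stumps = train_forest(X, y)
```

Randomizing the bit (feature) each stump sees mirrors the feature subsampling that gives a real Random Forest its variance reduction.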

  15. Study of impurity composition of some compounds of refractory metals by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Kaganov, L.K.; Dzhumakulov, D.T.; Mukhamedshina, N.M.

    1994-01-01

    The compounds of refractory transition metals find wide application in all fields of engineering, in particular in microelectronics for manufacturing contact-barrier layers of the thin-film current-conducting systems of silicon devices and large and very large scale integrated circuits. Production of such materials requires analytical control methods that allow a large number of elements to be determined with high reliability. Instrumental neutron-activation techniques have been developed to determine the impurity composition of the following compounds: MoSi2, WSi2, TiB2, NbB2, TiC, NbC

  16. Quantitative analysis of different volatile organic compounds using an improved electronic nose

    International Nuclear Information System (INIS)

    Gao, Daqi; Ji, Jiuming; Gong, Jiayu; Cai, Chaoqian

    2012-01-01

    This paper sets up an improved electronic nose with an automatic sampling mode, large volumetric vapors and constant temperature for headspace vapors and gas sensor array. In order to facilitate the fast recovery and good repeatability of gas sensors, the steps taken include (A) short-time contact with odors measured; (B) long-time purification using environmental air; (C) exact calibration using clean air before sampling. We employ multiple single-output perceptrons to discriminate and quantify multiple kinds of odors. This task is first regarded as multiple two-class discrimination problems and then multiple quantification problems, and accomplished by multiple single-output perceptrons followed by multiple single-output perceptrons. The experimental results for measuring and quantifying 12 kinds of volatile organic compounds with changing concentrations show that the type of electronic nose with a hierarchical perceptron model has a simple structure, easy operation, good repeatability and good discrimination and quantification performance. (paper)
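A single-output perceptron of the kind used above for each two-class discrimination sub-problem can be sketched as follows. The sensor-response vectors and the classic perceptron learning rule are illustrative assumptions, not the paper's data or exact training scheme:

```python
import random

def train_perceptron(X, y, epochs=50, lr=0.1, seed=0):
    """Single-output perceptron for one two-class odor discrimination task.
    X: sensor-array response vectors; y: 0/1 class labels."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(len(X[0]))]
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
            err = yi - pred  # classic perceptron update: shift toward the label
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

def classify(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else 0

# Invented, linearly separable sensor responses for two odors
X = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]]
y = [0, 0, 1, 1]
w, b = train_perceptron(X, y)
```

One such unit per odor pair, as in the paper's hierarchy, turns a multi-odor problem into a bank of independent two-class discriminations, each followed by a second stage for quantification.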

  17. Quantitative analysis of different volatile organic compounds using an improved electronic nose

    Science.gov (United States)

    Gao, Daqi; Ji, Jiuming; Gong, Jiayu; Cai, Chaoqian

    2012-10-01

    This paper sets up an improved electronic nose with an automatic sampling mode, large volumetric vapors and constant temperature for headspace vapors and gas sensor array. In order to facilitate the fast recovery and good repeatability of gas sensors, the steps taken include (A) short-time contact with odors measured; (B) long-time purification using environmental air; (C) exact calibration using clean air before sampling. We employ multiple single-output perceptrons to discriminate and quantify multiple kinds of odors. This task is first regarded as multiple two-class discrimination problems and then multiple quantification problems, and accomplished by multiple single-output perceptrons followed by multiple single-output perceptrons. The experimental results for measuring and quantifying 12 kinds of volatile organic compounds with changing concentrations show that the type of electronic nose with a hierarchical perceptron model has a simple structure, easy operation, good repeatability and good discrimination and quantification performance.

  18. REVIEW PAPER-MARINE MICROBIAL BIOACTIVE COMPOUNDS

    OpenAIRE

    Kalyani. P*, Hemalatha. K. P. J

    2016-01-01

    Oceans have borne most of the biological activities on our planet. A number of biologically active compounds with varying degrees of action, such as anti-tumor, anti-cancer, anti-microtubule, anti-proliferative, cytotoxic, photo protective, as well as antibiotic and antifouling properties, have been isolated to date from marine sources. The marine environment also represents a largely unexplored source for isolation of new microbes (bacteria, fungi, actinomycetes, microalgae-cyanobacteria and...

  19. [Assessment of the relationship of properties of chemical compounds and their toxicity to a unified hygienic standardization for chemicals].

    Science.gov (United States)

    Trushkov, V F; Perminov, K A; Sapozhnikova, V V; Ignatova, O L

    2013-01-01

    The relationship between the thermodynamic properties and toxicity parameters of chemical substances was determined. The data obtained are used for toxicity evaluation and unified hygienic standardization of chemical compounds. A relationship between enthalpy and toxicity of chemical compounds has been established. Orthogonal experimental design was employed in the course of the investigations. An equation for unified hygienic standardization under combined, complex and conjoint exposure of the organism is presented, together with prospects for toxicity determination and a methodology for such unified standardization.

  20. Tracking 20 years of compound-to-target output from literature and patents.

    Directory of Open Access Journals (Sweden)

    Christopher Southan

    Full Text Available The statistics of drug development output and declining yield of approved medicines has been the subject of many recent reviews. However, assessing research productivity that feeds development is more difficult. Here we utilise an extensive database of structure-activity relationships extracted from papers and patents. We have used this database to analyse published compounds cumulatively linked to nearly 4000 protein target identifiers from multiple species over the last 20 years. The compound output increases up to 2005 followed by a decline that parallels a fall in pharmaceutical patenting. Counts of protein targets have plateaued but not fallen. We extended these results by exploring compounds and targets for one large pharmaceutical company. In addition, we examined collective time course data for six individual protease targets, including average molecular weight of the compounds. We also tracked the PubMed profile of these targets to detect signals related to changes in compound output. Our results show that research compound output had decreased 35% by 2012. The major causative factor is likely to be a contraction in the global research base due to mergers and acquisitions across the pharmaceutical industry. However, this does not rule out an increasing stringency of compound quality filtration and/or patenting cost control. The number of proteins mapped to compounds on a yearly basis shows less decline, indicating the cumulative published target capacity of global research is being sustained in the region of 300 proteins for large companies. The tracking of six individual targets shows uniquely detailed patterns not discernible from cumulative snapshots. These are interpretable in terms of events related to validation and de-risking of targets that produce detectable follow-on surges in patenting. Further analysis of the type we present here can provide unique insights into the process of drug discovery based on the data it actually

  1. Abiotic synthesis of organic compounds from carbon disulfide under hydrothermal conditions.

    Science.gov (United States)

    Rushdi, Ahmed I; Simoneit, Bernd R T

    2005-12-01

    Abiotic formation of organic compounds under hydrothermal conditions is of interest to bio-, geo-, and cosmochemists. Oceanic sulfur-rich hydrothermal systems have been proposed as settings for the abiotic synthesis of organic compounds. Carbon disulfide is a common component of magmatic and hot spring gases, and is present in marine and terrestrial hydrothermal systems. Thus, its reactivity should be considered as another carbon source in addition to carbon dioxide in reductive aqueous thermosynthesis. We have examined the formation of organic compounds in aqueous solutions of carbon disulfide and oxalic acid at 175 degrees C for 5 and 72 h. The synthesis products from carbon disulfide in acidic aqueous solutions yielded a series of organic sulfur compounds. The major compounds after 5 h of reaction included dimethyl polysulfides (54.5%), methyl perthioacetate (27.6%), dimethyl trithiocarbonate (6.8%), trithianes (2.7%), hexathiepane (1.4%), trithiolanes (0.8%), and trithiacycloheptanes (0.3%). The main compounds after 72 h of reaction consisted of trithiacycloheptanes (39.4%), pentathiepane (11.6%), tetrathiocyclooctanes (11.5%), trithiolanes (10.6%), tetrathianes (4.4%), trithianes (1.2%), dimethyl trisulfide (1.1%), and numerous minor compounds. It is concluded that the abiotic formation of aliphatic straight-chain and cyclic polysulfides is possible under hydrothermal conditions and warrants further studies.

  2. Mining collections of compounds with Screening Assistant 2.

    Science.gov (United States)

    Guilloux, Vincent Le; Arrault, Alban; Colliandre, Lionel; Bourg, Stéphane; Vayer, Philippe; Morin-Allory, Luc

    2012-08-31

    High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.
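    The "diverse subset creation" mentioned above is commonly implemented with a MaxMin picker over fingerprint dissimilarities. SA2's actual algorithm and descriptors are not documented in the abstract, so the sketch below uses invented set-based fingerprints and 1 − Tanimoto as the distance:

    ```python
    # MaxMin diversity picking: start from a seed molecule and repeatedly add
    # the candidate whose nearest already-picked neighbour is farthest away.
    # The set-based fingerprints here are toy stand-ins for real descriptors.

    def tanimoto(a, b):
        inter = len(a & b)
        return inter / (len(a) + len(b) - inter) if (a or b) else 1.0

    def maxmin_pick(fps, k, seed=0):
        picked = [seed]
        while len(picked) < k:
            # choose the candidate whose minimum distance to the picked set is maximal
            best = max((i for i in range(len(fps)) if i not in picked),
                       key=lambda i: min(1 - tanimoto(fps[i], fps[p]) for p in picked))
            picked.append(best)
        return picked

    # Four toy "fingerprints" as sets of on-bits.
    fps = [{1, 2, 3}, {1, 2, 4}, {7, 8, 9}, {7, 8, 1}]
    print(maxmin_pick(fps, 2))  # → [0, 2]: the most dissimilar pair given seed 0
    ```

    MaxMin is quadratic in the worst case; production tools use lazy evaluation or approximate neighbour search to scale to millions of compounds.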

  3. Core-level XPS studies of Ce and La intermetallic compounds and their implications for the 4f levels of Ce compounds

    International Nuclear Information System (INIS)

    Freiburg, C.; Fuggle, J.C.; Hillebrecht, F.U.; Zolnierek, Z.; Laesser, R.

    1983-01-01

    The 3d core hole X-ray photoelectron spectra (XPS) of approximately 30 intermetallic compounds of La and Ce are reported. Transitions to final states with approximately f^0, f^1 and f^2 character are observed in some Ce compounds (f^0 and f^1 for La compounds). The results are discussed in terms of current ideas on the influence of f-counts and f-level hybridization on core level lineshapes. We cannot find an explanation of the observed spectra consistent with the ''promotional model'', in which the 4f-count varies and the 4f electron was thought to be entirely promoted to the Ce 5d6s valence bands in some compounds. There may be some small charge transfer from the f level, however. In conjunction with ideas on screening processes in XPS, the observed lineshapes suggest that coupling of the 4f electrons to other states is strongest in those compounds previously thought to have f^0 character. This coupling increases despite a large increase in the Ce-Ce distance when Ce is diluted with Ni or Pd. Thus it cannot be due to direct f-f interaction and must be attributed to coupling with the other valence electrons, possibly those centred on the partner sites. (orig./EZ) [de]

  4. Utilizing Maximal Independent Sets as Dominating Sets in Scale-Free Networks

    Science.gov (United States)

    Derzsy, N.; Molnar, F., Jr.; Szymanski, B. K.; Korniss, G.

    Dominating sets provide key solution to various critical problems in networked systems, such as detecting, monitoring, or controlling the behavior of nodes. Motivated by graph theory literature [Erdos, Israel J. Math. 4, 233 (1966)], we studied maximal independent sets (MIS) as dominating sets in scale-free networks. We investigated the scaling behavior of the size of MIS in artificial scale-free networks with respect to multiple topological properties (size, average degree, power-law exponent, assortativity), evaluated its resilience to network damage resulting from random failure or targeted attack [Molnar et al., Sci. Rep. 5, 8321 (2015)], and compared its efficiency to previously proposed dominating set selection strategies. We showed that, despite its small set size, MIS provides very high resilience against network damage. Using extensive numerical analysis on both synthetic and real-world (social, biological, technological) network samples, we demonstrate that our method effectively satisfies four essential requirements of dominating sets for their practical applicability on large-scale real-world systems: 1.) small set size, 2.) minimal network information required for their construction scheme, 3.) fast and easy computational implementation, and 4.) resiliency to network damage. Supported by DARPA, DTRA, and NSF.
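    The construction behind the study is simple to state: any maximal independent set is automatically a dominating set, because a node that neither belongs to the set nor neighbours a member could still be added, contradicting maximality. A greedy sketch follows; the low-degree-first visiting order is one plausible heuristic, not the paper's exact selection strategy.

    ```python
    # Greedy construction of a maximal independent set (MIS). In any graph an
    # MIS is also a dominating set: no two chosen nodes are adjacent, and every
    # unchosen node has a chosen neighbour (otherwise it could have been added).

    def maximal_independent_set(adj):
        """adj: dict mapping node -> set of neighbour nodes."""
        mis = set()
        blocked = set()          # nodes adjacent to an already-chosen node
        # visiting low-degree nodes first tends to give a larger set;
        # any visiting order still yields a *maximal* independent set
        for node in sorted(adj, key=lambda n: len(adj[n])):
            if node not in blocked:
                mis.add(node)
                blocked |= adj[node]
        return mis

    # Star graph: the hub alone dominates everything, but low-degree-first
    # selection picks the leaves instead -- both are valid MIS.
    star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
    print(maximal_independent_set(star))  # → {1, 2, 3}
    ```

    On scale-free networks the choice of order matters for set size and resilience, which is what the study quantifies.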

  5. Prediction of human population responses to toxic compounds by a collaborative competition.

    Science.gov (United States)

    Eduati, Federica; Mangravite, Lara M; Wang, Tao; Tang, Hao; Bare, J Christopher; Huang, Ruili; Norman, Thea; Kellen, Mike; Menden, Michael P; Yang, Jichen; Zhan, Xiaowei; Zhong, Rui; Xiao, Guanghua; Xia, Menghang; Abdo, Nour; Kosyk, Oksana; Friend, Stephen; Dearry, Allen; Simeonov, Anton; Tice, Raymond R; Rusyn, Ivan; Wright, Fred A; Stolovitzky, Gustavo; Xie, Yang; Saez-Rodriguez, Julio

    2015-09-01

    The ability to computationally predict the effects of toxic compounds on humans could help address the deficiencies of current chemical safety testing. Here, we report the results from a community-based DREAM challenge to predict toxicities of environmental compounds with potential adverse health effects for human populations. We measured the cytotoxicity of 156 compounds in 884 lymphoblastoid cell lines for which genotype and transcriptional data are available as part of the Tox21 1000 Genomes Project. The challenge participants developed algorithms to predict interindividual variability of toxic response from genomic profiles and population-level cytotoxicity data from structural attributes of the compounds. 179 submitted predictions were evaluated against an experimental data set to which participants were blinded. Individual cytotoxicity predictions were better than random, with modest correlations (Pearson's r < 0.28), consistent with complex trait genomic prediction. In contrast, predictions of population-level response to different compounds were higher (r < 0.66). The results highlight the possibility of predicting health risks associated with unknown compounds, although risk estimation accuracy remains suboptimal.
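    The quoted r values are Pearson correlations between predicted and measured cytotoxicity. As a reminder of what that metric computes (the challenge's full scoring pipeline is not reproduced here, and the numbers below are invented):

    ```python
    # Plain Pearson correlation coefficient between predicted and measured
    # responses; toy values only, not data from the DREAM challenge.
    import math

    def pearson_r(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    predicted = [1.0, 2.0, 3.0, 4.0]   # hypothetical submission
    measured = [1.1, 1.9, 3.2, 3.8]    # hypothetical blinded experimental data
    print(round(pearson_r(predicted, measured), 3))  # → 0.991
    ```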

  6. Large deviations

    CERN Document Server

    Varadhan, S R S

    2016-01-01

    The theory of large deviations deals with rates at which probabilities of certain events decay as a natural parameter in the problem varies. This book, which is based on a graduate course on large deviations at the Courant Institute, focuses on three concrete sets of examples: (i) diffusions with small noise and the exit problem, (ii) large time behavior of Markov processes and their connection to the Feynman-Kac formula and the related large deviation behavior of the number of distinct sites visited by a random walk, and (iii) interacting particle systems, their scaling limits, and large deviations from their expected limits. For the most part the examples are worked out in detail, and in the process the subject of large deviations is developed. The book will give the reader a flavor of how large deviation theory can help in problems that are not posed directly in terms of large deviations. The reader is assumed to have some familiarity with probability, Markov processes, and interacting particle systems.

  7. Electrothermal adsorption and desorption of volatile organic compounds on activated carbon fiber cloth

    Energy Technology Data Exchange (ETDEWEB)

    Son, H.K. [Department of Health and Environment, Kosin University, Dong Sam Dong, Young Do Gu, Busan (Korea, Republic of); Sivakumar, S., E-mail: ssivaphd@yahoo.com [Department of Bioenvironmental Energy, College of Natural Resource and Life Science, Pusan National University, Miryang-si, Gyeongsangnam-do 627-706 (Korea, Republic of); Rood, M.J. [Department of Civil and Environmental Engineering, University of Illinois, Urbana, IL (United States); Kim, B.J. [Construction Engineering Research Laboratory, U.S. Army Engineer Research and Development Center (ERDC-CERL), Champaign, IL (United States)

    2016-01-15

    Highlights: • We study the adsorption and desorption of VOCs by an activated carbon fiber cloth. • Desorption concentration was controlled via electrothermal heating. • The desorption rate was successfully equalized and controlled by this system. - Abstract: Adsorption is an effective means to selectively remove volatile organic compounds (VOCs) from industrial gas streams and is particularly of use for gas streams that exhibit highly variable daily concentrations of VOCs. Adsorption of such gas streams by activated carbon fiber cloths (ACFCs) and subsequent controlled desorption can provide gas streams of well-defined concentration that can then be more efficiently treated by biofiltration than streams exhibiting large variability in concentration. In this study, we passed VOC-containing gas through an ACFC vessel for adsorption and then desorption in a concentration-controlled manner via electrothermal heating. Set-point concentrations (40–900 ppmv) and superficial gas velocities (6.3–9.9 m/s) were controlled by a data acquisition and control system. The results of the average VOC desorption, desorption factor and VOC in-and-out ratio were calculated and compared for various gas set-point concentrations and superficial gas velocities. Our results reveal that desorption is strongly dependent on the set-point concentration and that the VOC desorption rate can be successfully equalized and controlled via an electrothermal adsorption system.
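    The concentration-controlled desorption described above amounts to driving the outlet VOC concentration to a set-point. A minimal proportional-control sketch against a toy first-order desorption model; all constants (heater gain, vent loss, controller gain, the 400 ppmv set-point) are invented for illustration and are not taken from the authors' data acquisition and control system.

    ```python
    # Toy first-order model: heater power releases VOC into the outlet stream,
    # venting removes it; a proportional controller drives the concentration
    # toward the set-point. All constants are invented for illustration.
    setpoint = 400.0            # target outlet concentration (ppmv), hypothetical
    c, dt = 0.0, 0.1            # concentration state and time step (s)
    gain, loss, kp = 50.0, 0.05, 10.0   # heater gain, vent loss rate, P-gain

    for _ in range(2000):       # simulate 200 s of operation
        power = min(1.0, max(0.0, kp * (setpoint - c) / setpoint))  # clamp to [0, 1]
        c += dt * (gain * power - loss * c)

    print(round(c, 1))  # settles near the set-point (a proportional offset remains)
    ```

    A pure proportional controller leaves a steady-state offset; adding an integral term (PI control) would remove it.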

  8. The human volatilome: volatile organic compounds (VOCs) in exhaled breath, skin emanations, urine, feces and saliva.

    Science.gov (United States)

    Amann, Anton; Costello, Ben de Lacy; Miekisch, Wolfram; Schubert, Jochen; Buszewski, Bogusław; Pleil, Joachim; Ratcliffe, Norman; Risby, Terence

    2014-09-01

    Breath analysis is a young field of research with its roots in antiquity. Antoine Lavoisier discovered carbon dioxide in exhaled breath during the period 1777-1783, Wilhelm (Vilém) Petters discovered acetone in breath in 1857 and Johannes Müller reported the first quantitative measurements of acetone in 1898. A recent review reported 1765 volatile compounds appearing in exhaled breath, skin emanations, urine, saliva, human breast milk, blood and feces. For a large number of compounds, real-time analysis of exhaled breath or skin emanations has been performed, e.g., during exertion of effort on a stationary bicycle or during sleep. Volatile compounds in exhaled breath, which record historical exposure, are called the 'exposome'. Changes in biogenic volatile organic compound concentrations can be used to mirror metabolic or (patho)physiological processes in the whole body or blood concentrations of drugs (e.g. propofol) in clinical settings, even during artificial ventilation or during surgery. Also compounds released by bacterial strains like Pseudomonas aeruginosa or Streptococcus pneumoniae could be very interesting. Methyl methacrylate (CAS 80-62-6), for example, was observed in the headspace of Streptococcus pneumoniae in concentrations up to 1420 ppb. Fecal volatiles have been implicated in differentiating certain infectious bowel diseases such as Clostridium difficile, Campylobacter, Salmonella and Cholera. They have also been used to differentiate other non-infectious conditions such as irritable bowel syndrome and inflammatory bowel disease. In addition, alterations in urine volatiles have been used to detect urinary tract infections, bladder, prostate and other cancers. Peroxidation of lipids and other biomolecules by reactive oxygen species produces volatile compounds like ethane and 1-pentane. Noninvasive detection and therapeutic monitoring of oxidative stress would be highly desirable in autoimmunological, neurological, inflammatory diseases and cancer.

  9. Strategies for outcrossing and genetic manipulation of Drosophila compound autosome stocks.

    Science.gov (United States)

    Martins, T; Kotadia, S; Malmanche, N; Sunkel, C E; Sullivan, W

    2013-01-01

    Among all organisms, Drosophila melanogaster has the most extensive well-characterized collection of large-scale chromosome rearrangements. Compound chromosomes, rearrangements in which homologous chromosome arms share a centromere, have proven especially useful in genetic-based surveys of the entire genome. However, their potential has not been fully realized because compound autosome stocks are refractory to standard genetic manipulations: if outcrossed, they yield inviable aneuploid progeny. Here we describe two strategies, cold-shock and use of the bubR1 mutant alleles, to produce nullo gametes through nondisjunction. These gametes are complementary to the compound chromosome-bearing gametes and thus produce viable progeny. Using these techniques, we created a compound chromosome two C(2)EN stock bearing a red fluorescent protein-histone transgene, facilitating live analysis of these unusually long chromosomes.

  10. Halogenated organic compounds in archived whale oil: A pre-industrial record

    International Nuclear Information System (INIS)

    Teuten, Emma L.; Reddy, Christopher M.

    2007-01-01

    To provide additional evidence that several halogenated organic compounds (HOCs) found in environmental samples are natural and not industrially produced, we analyzed an archived whale oil sample collected in 1921 from the last voyage of the whaling ship Charles W. Morgan. This sample, which pre-dates large-scale industrial manufacture of HOCs, contained two methoxylated polybrominated diphenyl ethers (MeO-PBDEs), five halogenated methyl bipyrroles (MBPs), one halogenated dimethyl bipyrrole (DMBP), and tentatively one dimethoxylated polybrominated biphenyl (diMeO-PBB). This result indicates, at least in part, a natural source of the latter compounds. - Nine halogenated organic compounds have been detected in archived whale oil from the early 1920s

  11. Halogenated organic compounds in archived whale oil: A pre-industrial record

    Energy Technology Data Exchange (ETDEWEB)

    Teuten, Emma L. [Department of Marine Chemistry and Geochemistry, Woods Hole Oceanographic Institution, 360 Woods Hole Road, Woods Hole, MA 02543 (United States)]. E-mail: emma.teuten@plymouth.ac.uk; Reddy, Christopher M. [Department of Marine Chemistry and Geochemistry, Woods Hole Oceanographic Institution, 360 Woods Hole Road, Woods Hole, MA 02543 (United States)]. E-mail: creddy@whoi.edu

    2007-02-15

    To provide additional evidence that several halogenated organic compounds (HOCs) found in environmental samples are natural and not industrially produced, we analyzed an archived whale oil sample collected in 1921 from the last voyage of the whaling ship Charles W. Morgan. This sample, which pre-dates large-scale industrial manufacture of HOCs, contained two methoxylated polybrominated diphenyl ethers (MeO-PBDEs), five halogenated methyl bipyrroles (MBPs), one halogenated dimethyl bipyrrole (DMBP), and tentatively one dimethoxylated polybrominated biphenyl (diMeO-PBB). This result indicates, at least in part, a natural source of the latter compounds. - Nine halogenated organic compounds have been detected in archived whale oil from the early 1920s.

  12. Neuroprotective Compound from an Endophytic Fungus, Colletotrichum sp. JS-0367.

    Science.gov (United States)

    Song, Ji Hoon; Lee, Changyeol; Lee, Dahae; Kim, Soonok; Bang, Sunghee; Shin, Myoung-Sook; Lee, Jun; Kang, Ki Sung; Shim, Sang Hee

    2018-05-23

    Colletotrichum sp. JS-0367 was isolated from Morus alba (mulberry), identified, and cultured on a large scale for chemical investigation. One new anthraquinone (1) and three known anthraquinones (2-4) were isolated and identified using spectroscopic methods including 1D/2D-NMR and HRESIMS. Although the neuroprotective effects of some anthraquinones have been reported, the biological activities of the four anthraquinones isolated in this study have not been reported. Therefore, the neuroprotective effects of these compounds were determined against murine hippocampal HT22 cell death induced by glutamate. Compound 4, evariquinone, showed strong protective effects against HT22 cell death induced by glutamate by the inhibition of intracellular ROS accumulation and Ca2+ influx triggered by glutamate. Immunoblot analysis revealed that compound 4 reduced the phosphorylation of MAPKs (JNK, ERK1/2, and p38) induced by glutamate. Furthermore, compound 4 strongly attenuated glutamate-mediated apoptotic cell death.

  13. Is Performance in Task-Cuing Experiments Mediated by Task Set Selection or Associative Compound Retrieval?

    Science.gov (United States)

    Forrest, Charlotte L. D.; Monsell, Stephen; McLaren, Ian P. L.

    2014-01-01

    Task-cuing experiments are usually intended to explore control of task set. But when small stimulus sets are used, they plausibly afford learning of the response associated with a combination of cue and stimulus, without reference to tasks. In 3 experiments we presented the typical trials of a task-cuing experiment: a cue (colored shape) followed,…

  14. The algebras of large N matrix mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Halpern, M.B.; Schwartz, C.

    1999-09-16

    Extending early work, we formulate the large N matrix mechanics of general bosonic, fermionic and supersymmetric matrix models, including Matrix theory: The Hamiltonian framework of large N matrix mechanics provides a natural setting in which to study the algebras of the large N limit, including (reduced) Lie algebras, (reduced) supersymmetry algebras and free algebras. We find in particular a broad array of new free algebras which we call symmetric Cuntz algebras, interacting symmetric Cuntz algebras, symmetric Bose/Fermi/Cuntz algebras and symmetric Cuntz superalgebras, and we discuss the role of these algebras in solving the large N theory. Most important, the interacting Cuntz algebras are associated to a set of new (hidden!) local quantities which are generically conserved only at large N. A number of other new large N phenomena are also observed, including the intrinsic nonlocality of the (reduced) trace class operators of the theory and a closely related large N field identification phenomenon which is associated to another set (this time nonlocal) of new conserved quantities at large N.

  15. Minor lipophilic compounds in edible insects

    OpenAIRE

    Monika Sabolová; Anna Adámková; Lenka Kouřimská; Diana Chrpová; Jan Pánek

    2016-01-01

    Contemporary society is faced with the question of how to ensure sufficient nutrition (in both quantity and quality) for a rapidly growing population. One solution can be the consumption of edible insects, which can have very good nutritional value (dietary energy, protein, fatty acids, fiber, dietary minerals and vitamin composition). Some edible insect species, which contain a relatively large amount of fat, can have the potential to be a "good" (interesting, new) source of minor lipophilic compound...

  16. Cyanobacteria as a Source for Novel Anti-Leukemic Compounds.

    Science.gov (United States)

    Humisto, Anu; Herfindal, Lars; Jokela, Jouni; Karkman, Antti; Bjørnstad, Ronja; Choudhury, Romi R; Sivonen, Kaarina

    2016-01-01

    Cyanobacteria are an inspiring source of bioactive secondary metabolites. These bioactive agents are a diverse group of compounds varying in their bioactive targets, mechanisms of action, and chemical structures. Cyanobacteria from various environments, especially marine benthic cyanobacteria, are found to be rich sources in the search for novel bioactive compounds. Several compounds with anticancer activities have been discovered from cyanobacteria, and some of these have entered clinical trials. A variety of anticancer agents is needed to overcome the increasing challenges in cancer treatment. Different search methods are used to reveal anticancer compounds from natural products, but cell-based methods are the most common. Cyanobacterial bioactive compounds as agents against acute myeloid leukemia (AML) are not well studied. Here we examined our new results combined with previous studies of anti-leukemic compounds from cyanobacteria, with emphasis on revealing common features of strains producing such activity. We report that cyanobacteria harbor specific anti-leukemic compounds, since several studied strains induced apoptosis in AML cells but were inactive against non-malignant cells like hepatocytes. We noted that benthic strains from the Baltic Sea in particular, such as Anabaena sp., were especially potent inducers of apoptosis in AML cells. Taken together, this review and re-analysis of data demonstrates the power of maintaining large culture collections for the search for novel bioactivities, and also how anti-AML activity in cyanobacteria can be revealed by relatively simple and low-cost assays.

  17. Sanskrit Compound Processor

    Science.gov (United States)

    Kumar, Anil; Mittal, Vipul; Kulkarni, Amba

    Sanskrit is very rich in compound formation. Typically a compound does not code the relation between its components explicitly. To understand the meaning of a compound, it is necessary to identify its components, discover the relations between them and finally generate a paraphrase of the compound. In this paper, we discuss the automatic segmentation and type identification of a compound using simple statistics that result from the manually annotated data.
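    Statistical segmentation of this kind can be sketched as a dynamic program that scores candidate splits by component frequencies. The lexicon below is an invented toy, not the authors' annotated corpus, and sandhi (sound changes at component boundaries) is ignored entirely.

    ```python
    # Frequency-based compound segmentation: dynamic programming over all
    # split points, maximising the sum of log-probabilities of the components.
    # FREQ is a tiny invented lexicon for illustration only.
    import math

    FREQ = {"rama": 50, "krishna": 40, "arjuna": 30, "putra": 60, "raja": 70}
    TOTAL = sum(FREQ.values())

    def segment(word):
        # best[i] = (score, components) for the best segmentation of word[:i]
        best = {0: (0.0, [])}
        for i in range(1, len(word) + 1):
            for j in range(i):
                part = word[j:i]
                if j in best and part in FREQ:
                    score = best[j][0] + math.log(FREQ[part] / TOTAL)
                    if i not in best or score > best[i][0]:
                        best[i] = (score, best[j][1] + [part])
        return best.get(len(word), (None, []))[1]   # [] if no segmentation exists

    print(segment("rajaputra"))  # → ['raja', 'putra']
    ```

    Relation discovery and paraphrase generation, the later steps the abstract mentions, would operate on the component list this step produces.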

  18. Superconductivity in a new YBaCuO compound at 105 K

    International Nuclear Information System (INIS)

    Kirschner, I.; Bankuti, J.; Gal, M.; Torkos, K.; Solymos, K.G.; Horvath, G.

    1987-01-01

    A superconducting transition has been detected in a (Y0.8Ba0.2)(CuO4-δ)2 compound by electrical and magnetic measurements. The resistivity begins to decrease at 173 K and the zero-resistivity state sets in at 105 K.

  19. A geometrical correction for the inter- and intra-molecular basis set superposition error in Hartree-Fock and density functional theory calculations for large systems

    Science.gov (United States)

    Kruse, Holger; Grimme, Stefan

    2012-04-01

    chemistry yields MAD=0.68 kcal/mol, which represents a huge improvement over plain B3LYP/6-31G* (MAD=2.3 kcal/mol). Application of gCP-corrected B97-D3 and HF-D3 on a set of large protein-ligand complexes proves the robustness of the method. Analytical gCP gradients make optimizations of large systems feasible with small basis sets, as demonstrated for the inter-ring distances of 9-helicene and most of the complexes in Hobza's S22 test set. The method is implemented in a freely available FORTRAN program obtainable from the author's website.

  20. A geometrical correction for the inter- and intra-molecular basis set superposition error in Hartree-Fock and density functional theory calculations for large systems.

    Science.gov (United States)

    Kruse, Holger; Grimme, Stefan

    2012-04-21

    chemistry yields MAD=0.68 kcal/mol, which represents a huge improvement over plain B3LYP/6-31G* (MAD=2.3 kcal/mol). Application of gCP-corrected B97-D3 and HF-D3 on a set of large protein-ligand complexes proves the robustness of the method. Analytical gCP gradients make optimizations of large systems feasible with small basis sets, as demonstrated for the inter-ring distances of 9-helicene and most of the complexes in Hobza's S22 test set. The method is implemented in a freely available FORTRAN program obtainable from the author's website.

  1. Time series clustering in large data sets

    Directory of Open Access Journals (Sweden)

    Jiří Fejfar

    2011-01-01

    Full Text Available The clustering of time series is a widely researched area. There are many methods for dealing with this task. We are currently using the Self-organizing map (SOM) with an unsupervised learning algorithm for clustering of time series. After the first experiment (Fejfar, Weinlichová, Šťastný, 2009) it seems that the whole concept of the clustering algorithm is correct, but that we have to perform time series clustering on a much larger dataset to obtain more accurate results and to find the correlation between configured parameters and results more precisely. The second requirement arose from the need for a well-defined evaluation of results. It seems useful to use sound recordings as instances of time series again. There are many recordings to use in digital libraries, and many interesting features and patterns can be found in this area. In this experiment we are searching for recordings with a similar development of information density; this can be used for musical form investigation, cover song detection and many other applications. The objective of the presented paper is to compare clustering results made with different parameters of the feature vectors and of the SOM itself. We describe time series in a simplistic way, evaluating standard deviations for separated parts of recordings. The resulting feature vectors are clustered with the SOM in batch training mode, with different topologies varying from a few neurons to large maps. Other algorithms usable for finding similarities between time series are discussed, and finally conclusions for further research are presented. We also present an overview of the related literature and projects.
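    The feature idea, describing each recording by the standard deviations of its consecutive parts, and the SOM clustering can be sketched in miniature. The signals, segment count and map size below are invented toys; the batch training mode and map topologies studied in the paper are not reproduced.

    ```python
    # Segment-wise standard deviations as a feature vector, clustered with a
    # tiny online-trained SOM (no neighbourhood function: with a two-unit map
    # this reduces to competitive learning). Toy values throughout.
    import math, random

    def features(signal, parts=4):
        """Standard deviation of each of `parts` equal slices of the signal."""
        n = len(signal) // parts
        out = []
        for p in range(parts):
            seg = signal[p * n:(p + 1) * n]
            mean = sum(seg) / len(seg)
            out.append(math.sqrt(sum((x - mean) ** 2 for x in seg) / len(seg)))
        return out

    def train_som(vectors, units=2, epochs=100, lr=0.5):
        random.seed(0)
        w = [[random.random() for _ in vectors[0]] for _ in range(units)]
        for e in range(epochs):
            for v in vectors:
                # best-matching unit by squared Euclidean distance
                bmu = min(range(units),
                          key=lambda u: sum((a - b) ** 2 for a, b in zip(w[u], v)))
                rate = lr * (1 - e / epochs)        # decaying learning rate
                w[bmu] = [a + rate * (b - a) for a, b in zip(w[bmu], v)]
        return w

    quiet = [0.0, 0.1, -0.1, 0.05] * 8        # low-variance toy recording
    loud = [1.0, -1.0, 0.8, -0.9] * 8         # high-variance toy recording
    w = train_som([features(quiet), features(loud)])
    bmu = lambda v: min(range(len(w)), key=lambda u: sum((a - b) ** 2 for a, b in zip(w[u], v)))
    print(bmu(features(quiet)) != bmu(features(loud)))  # True: distinct map units
    ```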

  2. Analysis of pharmaceutical and other organic wastewater compounds in filtered and unfiltered water samples by gas chromatography/mass spectrometry

    Science.gov (United States)

    Zaugg, Steven D.; Phillips, Patrick J.; Smith, Steven G.

    2014-01-01

    Research on the effects of exposure of stream biota to complex mixtures of pharmaceuticals and other organic compounds associated with wastewater requires the development of additional analytical capabilities for these compounds in water samples. Two gas chromatography/mass spectrometry (GC/MS) analytical methods used at the U.S. Geological Survey National Water Quality Laboratory (NWQL) to analyze organic compounds associated with wastewater were adapted to include additional pharmaceutical and other organic compounds beginning in 2009. This report includes a description of method performance for 42 additional compounds for the filtered-water method (hereafter referred to as the filtered method) and 46 additional compounds for the unfiltered-water method (hereafter referred to as the unfiltered method). The method performance for the filtered method described in this report has been published for seven of these compounds; however, the addition of several other compounds to the filtered method and the addition of the compounds to the unfiltered method resulted in the need to document method performance for both of the modified methods. Most of these added compounds are pharmaceuticals or pharmaceutical degradates, although two nonpharmaceutical compounds are included in each method. The main pharmaceutical compound classes added to the two modified methods include muscle relaxants, opiates, analgesics, and sedatives. These types of compounds were added to the original filtered and unfiltered methods largely in response to the tentative identification of a wide range of pharmaceutical and other organic compounds in samples collected from wastewater-treatment plants. Filtered water samples are extracted by vacuum through disposable solid-phase cartridges that contain modified polystyrene-divinylbenzene resin. Unfiltered samples are extracted by using continuous liquid-liquid extraction with dichloromethane. 
The compounds of interest for filtered and unfiltered sample…

  3. Identification of a potential superhard compound ReCN

    International Nuclear Information System (INIS)

    Fan, Xiaofeng; Li, M.M.; Singh, David J.; Jiang, Qing; Zheng, W.T.

    2015-01-01

    Highlights: • We identify a new ternary compound, ReCN, with theoretical calculations. • ReCN has two stable structures, with space groups P63mc and P3m1. • Calculated electronic structures show that ReCN is a semiconductor. • ReCN is found to possess outstanding mechanical properties. • ReCN may be synthesized relatively easily. - Abstract: We identify a new ternary compound, ReCN, and characterize its properties, including structural stability and indicators of hardness, using first-principles calculations. We find that there are two stable structures with space groups P63mc (HI) and P3m1 (HII), in which there are no C–C and N–N bonds. Both structures, HI and HII, are elastically and dynamically stable. The electronic structures show that ReCN is a semiconductor, although the parent compounds, ReC2 and ReN2, are both metallic. ReCN is found to possess outstanding mechanical properties, with a large bulk modulus, a large shear modulus and excellent ideal strengths. In addition, ReCN may perhaps be synthesized relatively easily because it becomes thermodynamically stable with respect to decomposition at very low pressures.

  4. Metagenomic screening for aromatic compound-responsive transcriptional regulators.

    Directory of Open Access Journals (Sweden)

    Taku Uchiyama

    Full Text Available We applied a metagenomics approach to screen for transcriptional regulators that sense aromatic compounds. The library was constructed by cloning environmental DNA fragments into a promoter-less vector containing green fluorescent protein. Fluorescence-based screening was then performed in the presence of various aromatic compounds. A total of 12 clones were isolated that fluoresced in response to salicylate, 3-methyl catechol, 4-chlorocatechol and chlorohydroquinone. Sequence analysis revealed at least 1 putative transcriptional regulator per clone, excluding 1 clone (CHLO8F). Deletion analysis identified compound-specific transcriptional regulators; namely, 8 LysR-types, 2 two-component-types and 1 AraC-type. Of these, 9 representative clones were selected and their reaction specificities to 18 aromatic compounds were investigated. Overall, our transcriptional regulators were functionally diverse in terms of both specificity and induction rates. LysR- and AraC-type regulators had relatively narrow specificities with high induction rates (5-50-fold), whereas two-component-types had wide specificities with low induction rates (3-fold). Numerous transcriptional regulators have been deposited in sequence databases, but their functions remain largely unknown. Thus, our results add valuable information regarding the sequence-function relationship of transcriptional regulators.

  5. A Decomposition Model for HPLC-DAD Data Set and Its Solution by Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Lizhi Cui

    2014-01-01

    Full Text Available This paper proposes a separation method, based on the model of Generalized Reference Curve Measurement and the algorithm of Particle Swarm Optimization (GRCM-PSO), for High Performance Liquid Chromatography with Diode Array Detection (HPLC-DAD) data sets. Firstly, initial parameters are generated to construct reference curves for the chromatogram peaks of the compounds based on its physical principle. Secondly, a General Reference Curve Measurement (GRCM) model is designed to transform these parameters to scalar values, which indicate the fitness for all parameters. Thirdly, rough solutions are found by searching individual targets for every parameter, and reinitialization only around these rough solutions is executed. Fourthly, the Particle Swarm Optimization (PSO) algorithm is adopted to obtain the optimal parameters by minimizing the fitness of these new parameters given by the GRCM model. Finally, spectra for the compounds are estimated based on the optimal parameters and the HPLC-DAD data set. Through simulations and experiments, the following conclusions are drawn: (1) the GRCM-PSO method can separate the chromatogram peaks and spectra from the HPLC-DAD data set without knowing the number of the compounds in advance, even when severe overlap and white noise exist; (2) the GRCM-PSO method is able to handle real HPLC-DAD data sets.
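The optimization step described above can be illustrated with a generic PSO loop. The sketch below is a minimal, standard PSO applied to a toy quadratic fitness standing in for the GRCM model; the fitness function, bounds, swarm parameters and the "retention time" value are invented for illustration, not taken from the paper:

```python
import random

random.seed(0)  # reproducibility of the stochastic search

def pso_minimize(fitness, dim, bounds, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization over a box-bounded search space."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal best positions
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]     # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy stand-in for the GRCM fitness: recover a hypothetical peak parameter
# (here 3.2) from a quadratic misfit function.
best, err = pso_minimize(lambda p: (p[0] - 3.2) ** 2, dim=1, bounds=(0.0, 10.0))
```

In the paper's setting, the fitness would be the GRCM misfit between reference curves and the measured HPLC-DAD data rather than this toy quadratic.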

  6. A High-Content Live-Cell Viability Assay and Its Validation on a Diverse 12K Compound Screen.

    Science.gov (United States)

    Chiaravalli, Jeanne; Glickman, J Fraser

    2017-08-01

    We have developed a new high-content cytotoxicity assay using live cells, called "ImageTOX." We used a high-throughput fluorescence microscope system, image segmentation software, and the combination of Hoechst 33342 and SYTO 17 to simultaneously score the relative size and the intensity of the nuclei, the nuclear membrane permeability, and the cell number in a 384-well microplate format. We then performed a screen of 12,668 diverse compounds and compared the results to a standard cytotoxicity assay. The ImageTOX assay identified similar sets of compounds to the standard cytotoxicity assay, while identifying more compounds having adverse effects on cell structure, earlier in treatment time. The ImageTOX assay uses inexpensive commercially available reagents and facilitates the use of live cells in toxicity screens. Furthermore, we show that we can measure the kinetic profile of compound toxicity in a high-content, high-throughput format, following the same set of cells over an extended period of time.

  7. Biodegradation of creosote compounds: Comparison of experiments at different scales

    DEFF Research Database (Denmark)

    Broholm, K.; Arvin, Erik

    2001-01-01

    This paper compares the results of biodegradation experiments with creosote compounds performed at different scales. The experiments include field observations, field experiments, large-scale intact laboratory column experiments, model fracture experiments, and batch experiments. Most of the experiments were conducted with till or ground water from the field site at Ringe on the island of Funen. Although the experiments were conducted on different scales, they revealed that some phenomena are common across them: e.g., an extensive biodegradation potential of several of the creosote compounds, the inhibitory influence of the pyrroles on the biodegradation of benzene, and the fact that the biodegradation of benzothiophene occurs only in the presence of a primary substrate. The experiments show that some biodegradation processes of organic compounds may be common to different microorganisms.

  8. Energy gap formation mechanism through the interference phenomena of electrons in face-centered cubic elements and compounds with the emphasis on half-Heusler and Heusler compounds

    Science.gov (United States)

    Mizutani, U.; Sato, H.

    2018-05-01

    Many face-centred cubic elements and compounds with the number of atoms per unit cell N equal to 8, 12 and 16 are known to be stabilised by forming either a band gap or a pseudogap at the Fermi level. They are conveniently expressed as cF8, cF12 and cF16, respectively, in the Pearson symbol. From the cF8 family, we worked on three tetravalent elements C (diamond), Si and Ge, the SZn-type AsGa compound and NaCl-type compounds like BiLu, AsSc, etc. From the cF12 family, more than 80 compounds were selected, with a particular emphasis on ABC- and half-Heusler-type ternary equiatomic compounds. Among cF16 compounds, both the Heusler compounds ABC2 and Zintl compounds were studied. We revealed that, regardless of whether or not transition metal (TM) and/or rare-earth (RE) elements are involved as constituent elements, the energy gap formation mechanism for cF8, cF12 and cF16 compounds can be universally discussed in terms of the interference phenomenon of itinerant electrons with sets of reciprocal lattice planes with (h² + k² + l²) = 8, 11 and 12, where (h² + k² + l²) refers to the square of the critical reciprocal lattice vector of an fcc lattice in units of (2π/a)². The number of itinerant electrons per unit cell, e/uc, for all these band gap/pseudogap-bearing compounds is found to fall on a universal line called the "3/2-power law" when plotted against (h² + k² + l²) on a logarithmic scale. This proves the fulfilment of the interference condition, in conformity with other pseudogap compounds with different crystal symmetries and different sizes of the unit cell reported in the literature.
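The slope of the "3/2-power law" follows from free-electron reasoning. The derivation below is our reconstruction under the usual interference (Hume-Rothery-type) matching condition, not a quotation of the paper:

```latex
% Matching condition: the free-electron Fermi sphere touches the {hkl}
% zone planes of a cubic cell with edge a,
\[
2k_F = |G_c| = \frac{2\pi}{a}\sqrt{h^2+k^2+l^2}.
\]
% Counting free-electron states (two spins) up to k_F in a unit cell of
% volume a^3 gives
\[
e/\mathrm{uc} = \frac{a^3 k_F^3}{3\pi^2}
\;\propto\; \left(h^2+k^2+l^2\right)^{3/2},
\]
% so log(e/uc) plotted against log(h^2+k^2+l^2) is a straight line of
% slope 3/2 -- the "3/2-power law" of the abstract.
```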

  9. Statistical theory of multi-step compound and direct reactions

    International Nuclear Information System (INIS)

    Feshbach, H.; Kerman, A.; Koonin, S.

    1980-01-01

    The theory of nuclear reactions is extended so as to include a statistical treatment of multi-step processes. Two types are distinguished, the multi-step compound and the multi-step direct. The wave functions for the system are grouped according to their complexity. The multi-step direct process involves explicitly those states which are open, while the multi-step compound involves those which are bound. In addition to the random phase assumption, which is applied differently to the multi-step direct and to the multi-step compound cross-sections, it is assumed that the residual interaction will have non-vanishing matrix elements between states whose complexities differ by at most one unit. This is referred to as the chaining hypothesis. Explicit expressions for the double differential cross-section giving the angular distribution and energy spectrum are obtained for both reaction types. The statistical multi-step compound cross-sections are symmetric about 90°. The classical statistical theory of nuclear reactions is a special limiting case. The cross-section for the statistical multi-step direct reaction consists of a set of convolutions of single-step direct cross-sections. For the many-step case it is possible to derive a diffusion equation in momentum space. Application is made to the reaction 181Ta(p,n)181W using the statistical multi-step compound formalism.
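The "set of convolutions" structure mentioned in the abstract can be written schematically. This is a paraphrase of that structure, not the authors' exact expression:

```latex
% Schematic form: the N-step direct contribution is an N-fold convolution
% of one-step cross-sections over the intermediate energy losses.
\[
\frac{d\sigma^{(N)}}{dE} \;\propto\;
\underbrace{\frac{d\sigma^{(1)}}{dE} * \cdots * \frac{d\sigma^{(1)}}{dE}}_{N\ \text{factors}},
\qquad
(f*g)(E) = \int f(E')\, g(E-E')\, dE',
\]
% with the full multi-step direct spectrum obtained by summing over N.
```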

  10. Coupling Matched Molecular Pairs with Machine Learning for Virtual Compound Optimization.

    Science.gov (United States)

    Turk, Samo; Merget, Benjamin; Rippmann, Friedrich; Fulle, Simone

    2017-12-26

    Matched molecular pair (MMP) analyses are widely used in compound optimization projects to gain insights into structure-activity relationships (SAR). The analysis is traditionally done via statistical methods but can also be employed together with machine learning (ML) approaches to extrapolate to novel compounds. The MMP/ML method introduced here combines a fragment-based MMP implementation with different machine learning methods to obtain automated SAR decomposition and prediction. To test the prediction capabilities and model transferability, two different compound optimization scenarios were designed: (1) "new fragments", which occurs when exploring new fragments for a defined compound series, and (2) "new static core and transformations", which resembles, for instance, the identification of a new compound series. Very good results were achieved by all employed machine learning methods, especially for the new fragments case, but overall deep neural network models performed best, allowing reliable predictions also for the new static core and transformations scenario, where comprehensive SAR knowledge of the compound series is missing. Furthermore, we show that models trained on all available data have a higher generalizability compared to models trained on focused series and can extend beyond the chemical space covered in the training data. Thus, coupling MMP with deep neural networks provides a promising approach to make high quality predictions on various data sets and in different compound optimization scenarios.

  11. Quantitative on-line analysis of sulfur compounds in complex hydrocarbon matrices.

    Science.gov (United States)

    Djokic, Marko R; Ristic, Nenad D; Olahova, Natalia; Marin, Guy B; Van Geem, Kevin M

    2017-08-04

    An improved method for on-line measurement of sulfur-containing compounds in complex matrices is presented. The on-line system consists of a specifically designed sampling system connected to a comprehensive two-dimensional gas chromatograph (GC×GC) equipped with two capillary columns (Rtx®-1 PONA × SGE BPX50), a flame ionization detector (FID) and a sulfur chemiluminescence detector (SCD). The result is an unprecedented sensitivity down to the ppm level (1 ppm-w) for various sulfur-containing compounds in very complex hydrocarbon matrices. In addition to the GC×GC-SCD, low-molecular-weight sulfur-containing compounds such as hydrogen sulfide (H2S) and carbonyl sulfide (COS) can be analyzed using the thermal conductivity detector of a so-called refinery gas analyzer (RGA). The methodology was extensively tested on a continuous-flow pilot plant for steam cracking, in which quantification of sulfur-containing compounds in the reactor effluent was carried out using 3-chlorothiophene as internal standard. The GC×GC-FID/-SCD settings were optimized for ppm analysis of sulfur compounds in olefin-rich (ethylene- and propylene-rich) hydrocarbon matrices produced by steam cracking of petroleum feedstocks. Besides being primarily used for analysis of the hydrocarbon matrix, the FID of the GC×GC-FID/-SCD set-up serves to double-check the amount of added sulfur internal standard, which is crucial for a proper quantification of sulfur compounds. When a vacuum gas oil containing 780 ppm-w of elemental sulfur in the form of benzothiophenes and dibenzothiophenes was subjected to steam cracking, the sulfur balance was closed: 75% of the sulfur contained in the feed was converted to hydrogen sulfide, 13% to alkyl homologues of thiophene, while the remaining 12% is present in the form of alkyl homologues of benzothiophenes. The methodology can be applied to many other conversion processes which use sulfur-containing feeds, such as hydrocracking, catalytic cracking, kerogen

  12. Mapping of the Available Chemical Space versus the Chemical Universe of Lead-Like Compounds.

    Science.gov (United States)

    Lin, Arkadii; Horvath, Dragos; Afonina, Valentina; Marcou, Gilles; Reymond, Jean-Louis; Varnek, Alexandre

    2018-03-20

    This is, to our knowledge, the most comprehensive analysis to date based on generative topographic mapping (GTM) of fragment-like chemical space (40 million molecules with no more than 17 heavy atoms, both from the theoretically enumerated GDB-17 and real-world PubChem/ChEMBL databases). The challenge was to prove that a robust map of fragment-like chemical space can actually be built, in spite of a limited (≪10⁵) maximal number of compounds ("frame set") usable for fitting the GTM manifold. An evolutionary map building strategy has been updated with a "coverage check" step, which discards manifolds failing to accommodate compounds outside the frame set. The evolved map has a good propensity to separate actives from inactives for more than 20 external structure-activity sets. It was proven to properly accommodate the entire collection of 40 million compounds. Next, it served as a library comparison tool to highlight biases of real-world molecules (PubChem and ChEMBL) versus the universe of all possible species represented by FDB-17, a fragment-like subset of GDB-17 containing 10 million molecules. Specific patterns, proper to some libraries and absent from others (diversity holes), were highlighted. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A QSAR, Pharmacokinetic and Toxicological Study of New Artemisinin Compounds with Anticancer Activity

    Directory of Open Access Journals (Sweden)

    Josinete B. Vieira

    2014-07-01

    Full Text Available The Density Functional Theory (DFT) method and the 6-31G** basis set were employed to calculate the molecular properties of artemisinin and 20 derivatives with different degrees of cytotoxicity against the human hepatocellular carcinoma HepG2 line. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were employed to select the most important descriptors related to anticancer activity. The significant molecular descriptors related to the compounds with anticancer activity were ALOGPS_log, Mor29m, IC5 and the GAP energy. The Pearson correlations between activity and the most important descriptors were used to build partial least squares (PLS) and principal component regression (PCR) models. The PLS and PCR regressions were very close, with variation between PLS and PCR of R² = ±0.0106, R²adj = ±0.0125, s = ±0.0234, F(4,11) = ±12.7802, Q² = ±0.0088, SEV = ±0.0132, PRESS = ±0.4808 and SPRESS = ±0.0057. These models were used to predict the anticancer activity of eight new artemisinin compounds (test set) with unknown activity, and for these new compounds the following pharmacokinetic properties were predicted: human intestinal absorption (HIA), cellular permeability (PCaCO2), Madin-Darby canine kidney cell permeability (PMDCK), skin permeability (PSkin), plasma protein binding (PPB) and penetration of the blood-brain barrier (CBrain/Blood), as well as the toxicological properties mutagenicity and carcinogenicity. The test set showed satisfactory results for anticancer activity and for pharmacokinetic and toxicological properties for two new artemisinin compounds. Consequently, further studies need to be done to evaluate the different proposals as well as their actions, toxicity, and potential use for treatment of cancers.

  14. A core laboratory offering full evaluation of new boron compounds. A service to the BNCT community

    International Nuclear Information System (INIS)

    Zamenhof, R.G.; Patel, H.; Palmer, M.R.; Lin, H.C.; Busse, P.M.; Harling, O.; Binns, P.J.; Riley, K.J.; Bernard, J.

    2000-01-01

    A joint project by the Beth Israel Deaconess Medical Center at Harvard Medical School and The Nuclear Reactor Laboratory of the Massachusetts Institute of Technology is proposed which would provide a core laboratory for the evaluation of new boron compounds. Federal agency funding has been applied for to support such a facility. The facility's evaluation of candidate boron compounds will include: quantitative cellular boron uptake; cell survival curve analysis (using a thermal neutron beam); small or large animal pharmacokinetic analysis; macro- and micro boron distribution analysis using high-resolution autoradiography, prompt gamma analysis and ICP-AES; small or large animal in vivo tumor control studies (using thermal or epithermal neutron beams); and pharmacological in vivo toxicity evaluation. The laboratory will include small and large animal surgical facilities and resources for additional boron compound chemistry as required by the evaluation procedure. This facility will be open to the BNCT research community. (author)

  15. Mining collections of compounds with Screening Assistant 2

    Directory of Open Access Journals (Sweden)

    Guilloux Vincent

    2012-08-01

    Full Text Available Abstract Background High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. Results We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, substructure/SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Conclusions Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.

  17. Large displacement vertical translational actuator based on piezoelectric thin films.

    Science.gov (United States)

    Qiu, Zhen; Pulskamp, Jeffrey S; Lin, Xianke; Rhee, Choong-Ho; Wang, Thomas; Polcawich, Ronald G; Oldham, Kenn

    2010-07-01

    A novel vertical translational microactuator based on thin-film piezoelectric actuation is presented, using a set of four compound bend-up/bend-down unimorphs to produce translational motion of a moving platform or stage. The actuation material is a chemical-solution-deposited lead zirconate titanate (PZT) thin film. Prototype designs have shown as much as 120 μm of static displacement, with 80-90 μm displacements being typical, using four legs 920 μm long by 70 μm wide. Analytical models are presented that accurately describe nonlinear behavior in both static and dynamic operation of prototype stages when the dependence of the piezoelectric coefficients on voltage is known. Resonance of the system is observed at a frequency of 200 Hz. The large displacement and high bandwidth of the actuators at low voltage and low power levels should make them useful for a variety of optical applications, including endoscopic microscopy.

  18. Determining an Estimate of an Equivalence Relation for Moderate and Large Sized Sets

    Directory of Open Access Journals (Sweden)

    Leszek Klukowski

    2017-01-01

    Full Text Available This paper presents two approaches to determining estimates of an equivalence relation on the basis of pairwise comparisons with random errors. Obtaining such an estimate requires the solution of a discrete programming problem which minimizes the sum of the differences between the form of the relation and the comparisons. The problem is NP-hard and can be solved with the use of exact algorithms for sets of moderate size, i.e. about 50 elements. In the case of larger sets, i.e. at least 200 comparisons for each element, it is necessary to apply heuristic algorithms. The paper presents results (a statistical preprocessing) which enable us to determine the optimal or a near-optimal solution with acceptable computational cost. They include: the development of a statistical procedure producing comparisons with low probabilities of errors, and a heuristic algorithm based on such comparisons. The proposed approach guarantees the applicability of such estimators for any size of set. (original abstract)
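The general setting can be made concrete with a toy heuristic. The sketch below is not the paper's algorithm: it is a simple greedy majority-vote assignment over noisy pairwise comparisons, with the three hidden classes, the 30-element set and the 10% error rate all invented for illustration:

```python
import random

random.seed(1)

def estimate_equivalence(n, compare):
    """Greedy heuristic sketch: assign each element to the existing class whose
    members it is most often judged equivalent to (majority vote over the
    possibly erroneous pairwise comparisons), otherwise open a new class."""
    classes = []
    for i in range(n):
        best, best_share = None, 0.0
        for c in classes:
            share = sum(compare(i, j) for j in c) / len(c)
            if share > best_share:
                best, best_share = c, share
        if best is not None and best_share > 0.5:
            best.append(i)
        else:
            classes.append([i])
    return classes

truth = [k % 3 for k in range(30)]   # three hidden equivalence classes

def noisy_compare(i, j, p_err=0.1):  # comparison wrong with probability p_err
    same = truth[i] == truth[j]
    return int(same) if random.random() > p_err else int(not same)

estimate = estimate_equivalence(30, noisy_compare)
```

The paper's exact approach instead minimizes the total disagreement between the estimated relation and all comparisons, which is NP-hard; greedy schemes like this one illustrate why heuristics remain tractable for large sets.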

  19. The Periodic Table as a Part of the Periodic Table of Chemical Compounds

    OpenAIRE

    Labushev, Mikhail M.

    2011-01-01

    The numbers of natural chemical elements, minerals, inorganic and organic chemical compounds are determined by the 1, 2, 3 and 4-combinations of a set of 95, and are respectively equal to 95; 4,465; 138,415 and 3,183,545. To explain these relations, the concept of an information coefficient of proportionality is suggested as a mathematical generalization of the proportionality coefficient for any set of positive numbers. A hypothesis is suggested that the unimodal distributions of the sets of informa...
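The four counts quoted in the abstract are exactly the binomial coefficients C(95, k) for k = 1..4, which can be checked with the standard library:

```python
from math import comb

# k-combinations of a 95-element set, for k = 1, 2, 3, 4
counts = [comb(95, k) for k in (1, 2, 3, 4)]
# → [95, 4465, 138415, 3183545]
```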

  20. Comparison of confirmed inactive and randomly selected compounds as negative training examples in support vector machine-based virtual screening.

    Science.gov (United States)

    Heikamp, Kathrin; Bajorath, Jürgen

    2013-07-22

    The choice of negative training data for machine learning is a little-explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.
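The experimental design can be sketched schematically: the same actives are paired with two alternative negative training sets, and a background pool is then ranked by SVM score. All data below are synthetic Gaussian stand-ins for descriptor vectors; the class locations, sizes and descriptor dimension are invented for illustration:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)
actives = rng.normal(loc=1.0, size=(50, 10))
confirmed_inactives = rng.normal(loc=-1.0, size=(200, 10))  # assay-confirmed negatives
random_negatives = rng.normal(loc=0.0, size=(200, 10))      # random database compounds

def train_and_rank(negatives, pool):
    """Fit an SVM on actives vs the given negatives, rank the pool by score."""
    X = np.vstack([actives, negatives])
    y = np.r_[np.ones(len(actives)), np.zeros(len(negatives))]
    model = SVC(kernel="rbf", gamma="scale").fit(X, y)
    return np.argsort(-model.decision_function(pool))       # best-scored first

pool = rng.normal(size=(500, 10))                           # background "database"
rank_conf = train_and_rank(confirmed_inactives, pool)
rank_rand = train_and_rank(random_negatives, pool)
different = not np.array_equal(rank_conf, rank_rand)        # negative set changes the ranking
```

In a real benchmark, compound recall would be computed over the top-ranked fraction of each ordering, which is where the systematic influence of the negative training data shows up.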

  1. Analysis of proteinaceous antinutritional compounds in soya oilseed products

    NARCIS (Netherlands)

    Hessing, M.; Bleeker, H.; Biert, R. van; Sleijsters-Selis, H.; Duijn, G. van; Westerop, H. van; Vlooswijk, R.A.A.

    1996-01-01

    In the food and feed industry large amounts of grain legumes are used for the formulation of end products meant for human and animal nutrition. In particular the use of soybean (Glycine max) products as milk replacers in the animal diet and the problems related to antinutritional compounds present in

  2. Natural Compounds as Regulators of the Cancer Cell Metabolism

    Directory of Open Access Journals (Sweden)

    Claudia Cerella

    2013-01-01

    Full Text Available Even though altered metabolism is an "old" physiological mechanism, only recently has its targeting become a therapeutically interesting strategy, and by now it is considered an emerging hallmark of cancer. Nevertheless, very few compounds are under investigation as potential modulators of cell metabolism. Candidate agents should display selectivity of action towards cancer cells without side effects. This ideal favorable profile would perfectly overlap the requisites of new anticancer therapies and chemopreventive strategies as well. Nature represents a still largely unexplored source of bioactive molecules with therapeutic potential. Many of these compounds have already been characterized for their multiple anticancer activities. Many of them are absorbed with the diet and therefore possess a known profile in terms of tolerability and bioavailability compared to newly synthesized chemical compounds. The discovery of important cross-talks between mediators of the most therapeutically targeted aberrancies in cancer (i.e., cell proliferation, survival, and migration) and the metabolic machinery allows one to predict that many anticancer activities ascribed to a number of natural compounds may be due, in part, to their ability to modulate metabolic pathways. In this review, we attempt an overview of what is currently known about the potential of natural compounds as modulators of cancer cell metabolism.

  3. Spectrophotometric Analysis of Phenolic Compounds in Grapes and Wines.

    Science.gov (United States)

    Aleixandre-Tudo, Jose Luis; Buica, Astrid; Nieuwoudt, Helene; Aleixandre, Jose Luis; du Toit, Wessel

    2017-05-24

    Phenolic compounds are of crucial importance for red wine color and mouthfeel attributes. A large number of enzymatic and chemical reactions involving phenolic compounds take place during winemaking and aging. Despite the large number of published analytical methods for phenolic analyses, the values obtained may vary considerably. In addition, the existing scientific knowledge needs to be updated, but also critically evaluated and simplified for newcomers and wine industry partners. The most used and widely cited spectrophotometric methods for grape and wine phenolic analysis were identified through a bibliometric search using the Science Citation Index-Expanded (SCIE) database accessed through the Web of Science (WOS) platform from Thomson Reuters. The selection of spectrophotometry was based on its ease of use as a routine analytical technique. On the basis of the number of citations, as well as the advantages and disadvantages reported, the modified Somers assay appears as a multistep, simple, and robust procedure that provides a good estimation of the state of the anthocyanin equilibria. Precipitation methods for total tannin levels have also been identified as preferred protocols for these types of compounds. Good reported correlations between methods (methylcellulose precipitable vs bovine serum albumin), and between these and perceived red wine astringency, in combination with the adaptation to high-throughput format, make them suitable for routine analysis. The bovine serum albumin tannin assay also allows for the estimation of the anthocyanin content with the measurement of small and large polymeric pigments. Finally, the measurement of wine color using the CIELab space approach is also suggested as the protocol of choice, as it provides good insight into the wine's color properties.

  4. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    The development of more time-efficient, airborne geophysical data acquisition platforms (e.g. SkyTEM) has made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. Large areas are mapped together with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored…

  5. Recovery of volatile fruit juice aroma compounds by membrane technology

    DEFF Research Database (Denmark)

    Bagger-Jørgensen, Rico; Meyer, Anne S.; Pinelo, Manuel

    2011-01-01

    The influence of temperature (10–45°C), feed flow rate (300–500L/h) and sweeping gas flow rate (1.2–2m3/h) on the recovery of berry fruit juice aroma compounds by sweeping gas membrane distillation (SGMD) was examined on an aroma model solution and on black currant juice in a lab scale membrane...... distillation set up. The data were compared to recovery of the aroma compounds by vacuum membrane distillation (VMD). The flux of SGMD increased with an increase in temperature, feed flow rate or sweeping gas flow rate. Increased temperature and feed flow rate also increased the concentration factors...... the degradation of anthocyanins and polyphenolic compounds in the juice. Industrial relevanceHigh temperature evaporation is the most widely used industrial technique for aroma recovery and concentration of juices, but membrane distillation (MD) may provide for gentler aroma stripping and lower energy consumption...

  6. Simplified fate modelling in respect to ecotoxicological and human toxicological characterisation of emissions of chemical compounds

    DEFF Research Database (Denmark)

    Birkved, Morten; Heijungs, Reinout

    2011-01-01

    The impact assessment of chemical compounds in Life Cycle Impact Assessment (LCIA) and Environmental Risk Assessment (ERA) requires a vast amount of data on the properties of the chemical compounds being assessed. The purpose of the present study is to explore statistical options for reduction...... of the data demand associated with characterisation of chemical emissions in LCIA and ERA.Based on a USEtox™ characterisation factor set consisting of 3,073 data records, multi-dimensional bilinear models for emission compartment specific fate characterisation of chemical emissions were derived by application...... the independent chemical input parameters from the minimum data set, needed for characterisation in USEtox™, according to general availability, importance and relevance for fate factor prediction.Each approach (63% and 75% of the minimum data set needed for characterisation in USEtox™) yielded 66 meta...

  7. Reverse screening methods to search for the protein targets of chemopreventive compounds

    Science.gov (United States)

    Huang, Hongbin; Zhang, Guigui; Zhou, Yuquan; Lin, Chenru; Chen, Suling; Lin, Yutong; Mai, Shangkang; Huang, Zunnan

    2018-05-01

    This article is a systematic review of reverse screening methods used to search for the protein targets of chemopreventive compounds or drugs. Typical chemopreventive compounds include components of traditional Chinese medicine, natural compounds and Food and Drug Administration (FDA)-approved drugs. Such compounds are somewhat selective but are predisposed to bind multiple protein targets distributed throughout diverse signaling pathways in human cells. In contrast to conventional virtual screening, which identifies the ligands of a targeted protein from a compound database, reverse screening is used to identify the potential targets or unintended targets of a given compound from a large number of receptors by examining their known ligands or crystal structures. This method, also known as in silico or computational target fishing, is highly valuable for discovering the target receptors of query molecules from terrestrial or marine natural products, exploring the molecular mechanisms of chemopreventive compounds, finding alternative indications of existing drugs by drug repositioning, and detecting adverse drug reactions and drug toxicity. Reverse screening can be divided into three major groups: shape screening, pharmacophore screening and reverse docking. Several large software packages, such as Schrödinger and Discovery Studio; typical software/network services such as ChemMapper, PharmMapper, idTarget and INVDOCK; and practical databases of known target ligands and receptor crystal structures, such as ChEMBL, BindingDB and the Protein Data Bank (PDB), are available for use in these computational methods. Different programs, online services and databases have different applications and constraints. 
Here, we conducted a systematic analysis and multilevel classification of the computational programs, online services and compound libraries available for shape screening, pharmacophore screening and reverse docking to enable non-specialist users to quickly learn and
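The core of ligand-based reverse screening can be sketched as ranking candidate targets by the best similarity between the query compound and each target's known ligands. A hypothetical illustration with fingerprints represented as bit sets (real tools use richer 2D/3D descriptors, pharmacophore models, or docking scores):

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto coefficient between two fingerprint bit sets."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def rank_targets(query_fp, target_ligands):
    """Score each candidate target by the best similarity between the query
    and any of that target's known ligands; return targets best-first."""
    scores = {target: max(tanimoto(query_fp, fp) for fp in ligand_fps)
              for target, ligand_fps in target_ligands.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

With a query fingerprint {1, 2, 3} and a hypothetical target whose known ligand has fingerprint {1, 2, 3, 4}, the Tanimoto score is 3/4 = 0.75, so that target ranks above one whose ligands share fewer bits.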

  8. Virtual screening methods as tools for drug lead discovery from large chemical libraries.

    Science.gov (United States)

    Ma, X H; Zhu, F; Liu, X; Shi, Z; Zhang, J X; Yang, S Y; Wei, Y Q; Chen, Y Z

    2012-01-01

    Virtual screening methods have been developed and explored as useful tools for searching for drug lead compounds in chemical libraries, including the large libraries that have become publicly available. In this review, we discuss new developments in exploring virtual screening methods for enhanced performance in searching large chemical libraries, their applications in screening libraries of ~1 million or more compounds in the last five years, the difficulties in their applications, and the strategies for further improving these methods.

  9. Significance of ammonium compounds on nicotine exposure to cigarette smokers.

    NARCIS (Netherlands)

    Willems, E W; Rambali, B; Vleeming, W; Opperhuizen, Antoon; Amsterdam, J G C van

    2006-01-01

    The tobacco industry publicly contends that ammonia compounds are solely used as tobacco additive for purposes of tobacco flavoring, process conditioning and reduction of its subjective harshness and irritation. However, neither objective scientific reports, nor the contents of a large number of

  10. Spatial fingerprints of community structure in human interaction network for an extensive set of large-scale regions.

    Science.gov (United States)

    Kallus, Zsófia; Barankai, Norbert; Szüle, János; Vattay, Gábor

    2015-01-01

    Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks, represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after aggregating the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both opposing forces: the dividing pull of local conflicts and the uniting cross-cultural trends of globalization.
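Modularity clustering, as used above, searches for the partition that maximizes Newman's modularity Q. A minimal sketch of evaluating Q for a given partition of a weighted pixel graph (the optimization step itself, e.g. a greedy or Louvain-style heuristic, is omitted):

```python
from collections import defaultdict

def modularity(edges, communities):
    """Newman modularity Q = sum_c [ e_c/m - (d_c/(2m))^2 ] for an undirected
    weighted graph without self-loops.

    edges: iterable of (u, v, weight); communities: dict node -> community label.
    e_c is the intra-community edge weight, d_c the summed weighted degree of
    community c, and m the total edge weight.
    """
    m = 0.0
    degree = defaultdict(float)   # weighted degree per node
    intra = defaultdict(float)    # edge weight inside each community
    for u, v, w in edges:
        m += w
        degree[u] += w
        degree[v] += w
        if communities[u] == communities[v]:
            intra[communities[u]] += w
    comm_degree = defaultdict(float)
    for node, k in degree.items():
        comm_degree[communities[node]] += k
    return sum(intra[c] / m - (comm_degree[c] / (2 * m)) ** 2
               for c in comm_degree)
```

Two triangles joined by a single bridge edge, split into their natural two communities, give Q = 6/7 - 1/2 ≈ 0.357, while the trivial one-community partition gives Q = 0.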

  11. Ionisation detectors as monitors of toxic compounds in air

    International Nuclear Information System (INIS)

    Leonhardt, J.W.

    1994-01-01

    Beta particles cause ionisation in gas mixtures. The ions produced provide information on the concentration and identity of trace compounds in ambient air. Modern ionisation detectors use ion mobilities to monitor toxic compounds. Chemical solvents, organophosphorus compounds, PCBs and many other toxins can be detected using ion mobility detectors (IMD) in the ppb range or lower. Ion mobility detectors have large potential in industry and research because of their sensitivity, specificity, fast response and relatively low cost. Portable devices and fixed installations are possible. The paper discusses the following topics: (1) ionisation sources in IMD: 63Ni, 3H, photoionization and corona discharge, (2) basic principles of ion production, (3) ion collection in IMD, (4) design, gas supply, automatic identification and quantification of IMD data, and (5) selected applications. Advantages and problems with this new type of nuclear analytical instrument are also discussed. (author). 2 refs., 9 figs., 3 tabs

  12. Structure and properties of intermetallic ternary rare earth compounds

    International Nuclear Information System (INIS)

    Casper, Frederick

    2008-01-01

    So-called materials science is an ever-growing field of modern research. For the development of new materials, not only experimental characterization but also theoretical calculation of the electronic structure plays an important role. A class of compounds that has attracted a great deal of attention in recent years is known as REME compounds. These compounds are often referred to with RE designating a rare earth, actinide or an element from groups 1-4, M representing a late transition metal from groups 8-12, and E belonging to groups 13-15. There are more than 2000 compounds with 1:1:1 stoichiometry belonging to this class, and they offer a broad variety of different structure types. Although many REME compounds are known to exist, mostly only the structure and magnetism of these compounds have been determined. In particular, in the field of electronic and transport properties relatively few efforts have been made. The main focus in this study is on compounds crystallizing in the MgAgAs and LiGaGe structures. Both structures are found only among 18-valence-electron compounds. The f electrons are localized and therefore do not count as valence electrons. A special focus here was also on the magnetoresistance effects and spintronic properties found among the REME compounds. An examination of the following compounds was made: GdAuE (E=In,Cd,Mg), GdPdSb, GdNiSb, REAuSn (RE=Gd,Er,Tm) and RENiBi (RE=Pr,Sm,Gd-Tm,Lu). The experimental results were compared with theoretical band structure calculations. The first half-metallic ferromagnet with LiGaGe structure (GdPdSb) was found. All semiconducting REME compounds with MgAgAs structure show giant magnetoresistance (GMR) at low temperatures. The GMR is related to a metal-insulator transition, and the value of the GMR depends on the strength of the spin-orbit coupling. Inhomogeneous DyNiBi samples show a small positive MR at low temperature that depends on the amount of metallic impurities. 
At higher fields the samples show a

  13. Structure and properties of intermetallic ternary rare earth compounds

    Energy Technology Data Exchange (ETDEWEB)

    Casper, Frederick

    2008-12-17

    So-called materials science is an ever-growing field of modern research. For the development of new materials, not only experimental characterization but also theoretical calculation of the electronic structure plays an important role. A class of compounds that has attracted a great deal of attention in recent years is known as REME compounds. These compounds are often referred to with RE designating a rare earth, actinide or an element from groups 1-4, M representing a late transition metal from groups 8-12, and E belonging to groups 13-15. There are more than 2000 compounds with 1:1:1 stoichiometry belonging to this class, and they offer a broad variety of different structure types. Although many REME compounds are known to exist, mostly only the structure and magnetism of these compounds have been determined. In particular, in the field of electronic and transport properties relatively few efforts have been made. The main focus in this study is on compounds crystallizing in the MgAgAs and LiGaGe structures. Both structures are found only among 18-valence-electron compounds. The f electrons are localized and therefore do not count as valence electrons. A special focus here was also on the magnetoresistance effects and spintronic properties found among the REME compounds. An examination of the following compounds was made: GdAuE (E=In,Cd,Mg), GdPdSb, GdNiSb, REAuSn (RE=Gd,Er,Tm) and RENiBi (RE=Pr,Sm,Gd-Tm,Lu). The experimental results were compared with theoretical band structure calculations. The first half-metallic ferromagnet with LiGaGe structure (GdPdSb) was found. All semiconducting REME compounds with MgAgAs structure show giant magnetoresistance (GMR) at low temperatures. The GMR is related to a metal-insulator transition, and the value of the GMR depends on the strength of the spin-orbit coupling. Inhomogeneous DyNiBi samples show a small positive MR at low temperature that depends on the amount of metallic impurities. 
At higher fields the samples show a

  14. Two-dimensional materials from high-throughput computational exfoliation of experimentally known compounds

    Science.gov (United States)

    Mounet, Nicolas; Gibertini, Marco; Schwaller, Philippe; Campi, Davide; Merkys, Andrius; Marrazzo, Antimo; Sohier, Thibault; Castelli, Ivano Eligio; Cepellotti, Andrea; Pizzi, Giovanni; Marzari, Nicola

    2018-02-01

    Two-dimensional (2D) materials have emerged as promising candidates for next-generation electronic and optoelectronic applications. Yet, only a few dozen 2D materials have been successfully synthesized or exfoliated. Here, we search for 2D materials that can be easily exfoliated from their parent compounds. Starting from 108,423 unique, experimentally known 3D compounds, we identify a subset of 5,619 compounds that appear layered according to robust geometric and bonding criteria. High-throughput calculations using van der Waals density functional theory, validated against experimental structural data and calculated random phase approximation binding energies, further allowed the identification of 1,825 compounds that are either easily or potentially exfoliable. In particular, the subset of 1,036 easily exfoliable cases provides novel structural prototypes and simple ternary compounds as well as a large portfolio of materials to search from for optimal properties. For a subset of 258 compounds, we explore vibrational, electronic, magnetic and topological properties, identifying 56 ferromagnetic and antiferromagnetic systems, including half-metals and half-semiconductors.

  15. Improvement of a synthetic lure for Anopheles gambiae using compounds produced by human skin microbiota.

    Science.gov (United States)

    Verhulst, Niels O; Mbadi, Phoebe A; Kiss, Gabriella Bukovinszkiné; Mukabana, Wolfgang R; van Loon, Joop J A; Takken, Willem; Smallegange, Renate C

    2011-02-08

    Anopheles gambiae sensu stricto is considered to be highly anthropophilic and volatiles of human origin provide essential cues during its host-seeking behaviour. A synthetic blend of three human-derived volatiles, ammonia, lactic acid and tetradecanoic acid, attracts A. gambiae. In addition, volatiles produced by human skin bacteria are attractive to this mosquito species. The purpose of the current study was to test the effect of ten compounds present in the headspace of human bacteria on the host-seeking process of A. gambiae. The effect of each of the ten compounds on the attractiveness of a basic blend of ammonia, lactic and tetradecanoic acid to A. gambiae was examined. The host-seeking response of A. gambiae was evaluated in a laboratory set-up using a dual-port olfactometer and in a semi-field facility in Kenya using MM-X traps. Odorants were released from LDPE sachets and placed inside the olfactometer as well as in the MM-X traps. Carbon dioxide was added in the semi-field experiments, provided from pressurized cylinders or fermenting yeast. The olfactometer and semi-field set-up allowed for high-throughput testing of the compounds in blends and in multiple concentrations. Compounds with an attractive or inhibitory effect were identified in both bioassays. 3-Methyl-1-butanol was the best attractant in both set-ups and increased the attractiveness of the basic blend up to three times. 2-Phenylethanol reduced the attractiveness of the basic blend in both bioassays by more than 50%. Identification of volatiles released by human skin bacteria led to the discovery of compounds that have an impact on the host-seeking behaviour of A. gambiae. 3-Methyl-1-butanol may be used to increase mosquito trap catches, whereas 2-phenylethanol has potential as a spatial repellent. These two compounds could be applied in push-pull strategies to reduce mosquito numbers in malaria endemic areas.

  16. Offset Compound Gear Drive

    Science.gov (United States)

    Stevens, Mark A.; Handschuh, Robert F.; Lewicki, David G.

    2010-01-01

    The Offset Compound Gear Drive is an in-line, discrete, two-speed device utilizing a special offset compound gear that has both an internal tooth configuration on the input end and external tooth configuration on the output end, thus allowing it to mesh in series, simultaneously, with both a smaller external tooth input gear and a larger internal tooth output gear. This unique geometry and offset axis permits the compound gear to mesh with the smaller diameter input gear and the larger diameter output gear, both of which are on the same central, or primary, centerline. This configuration results in a compact in-line reduction gear set consisting of fewer gears and bearings than a conventional planetary gear train. Switching between the two output ratios is accomplished through a main control clutch and sprag. Power flow to the above is transmitted through concentric power paths. Low-speed operation is accomplished in two meshes. For the purpose of illustrating the low-speed output operation, the following example pitch diameters are given. A 5.0 pitch diameter (PD) input gear to 7.50 PD (internal tooth) intermediate gear (0.667 reduction mesh), and a 7.50 PD (external tooth) intermediate gear to a 10.00 PD output gear (0.750 reduction mesh). Note that it is not required that the intermediate gears on the offset axis be of the same diameter. For this example, the resultant low-speed ratio is 2:1 (output speed = 0.500; product of stage one 0.667 reduction and stage two 0.750 stage reduction). The design is not restricted to the example pitch diameters, or output ratio. From the output gear, power is transmitted through a hollow drive shaft, which, in turn, drives a sprag during which time the main clutch is disengaged.
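The worked ratio example above is simple arithmetic on pitch diameters, and can be sketched as:

```python
def mesh_ratio(driver_pd: float, driven_pd: float) -> float:
    """Speed ratio of one gear mesh: output speed = input speed * driver/driven."""
    return driver_pd / driven_pd

# Example pitch diameters from the text (input 5.0, intermediate 7.5, output 10.0)
stage1 = mesh_ratio(5.0, 7.5)    # 0.667 reduction at the input mesh
stage2 = mesh_ratio(7.5, 10.0)   # 0.750 reduction at the output mesh
overall = stage1 * stage2        # 0.500, i.e. a 2:1 low-speed ratio
```

As the text notes, the design is not restricted to these diameters; any pair of intermediate gears on the offset axis yields an overall ratio equal to the product of the two mesh ratios.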

  17. Response to a Large Polio Outbreak in a Setting of Conflict - Middle East, 2013-2015.

    Science.gov (United States)

    Mbaeyi, Chukwuma; Ryan, Michael J; Smith, Philip; Mahamud, Abdirahman; Farag, Noha; Haithami, Salah; Sharaf, Magdi; Jorba, Jaume C; Ehrhardt, Derek

    2017-03-03

    As the world advances toward the eradication of polio, outbreaks of wild poliovirus (WPV) in polio-free regions pose a substantial risk to the timeline for global eradication. Countries and regions experiencing active conflict, chronic insecurity, and large-scale displacement of persons are particularly vulnerable to outbreaks because of the disruption of health care and immunization services (1). A polio outbreak occurred in the Middle East, beginning in Syria in 2013 with subsequent spread to Iraq (2). The outbreak occurred 2 years after the onset of the Syrian civil war, resulted in 38 cases, and was the first time WPV was detected in Syria in approximately a decade (3,4). The national governments of eight countries designated the outbreak a public health emergency and collaborated with partners in the Global Polio Eradication Initiative (GPEI) to develop a multiphase outbreak response plan focused on improving the quality of acute flaccid paralysis (AFP) surveillance* and administering polio vaccines to >27 million children during multiple rounds of supplementary immunization activities (SIAs). † Successful implementation of the response plan led to containment and interruption of the outbreak within 6 months of its identification. The concerted approach adopted in response to this outbreak could serve as a model for responding to polio outbreaks in settings of conflict and political instability.

  18. Imidazopyridine Compounds Inhibit Mycobacterial Growth by Depleting ATP Levels.

    Science.gov (United States)

    O'Malley, Theresa; Alling, Torey; Early, Julie V; Wescott, Heather A; Kumar, Anuradha; Moraski, Garrett C; Miller, Marvin J; Masquelin, Thierry; Hipskind, Philip A; Parish, Tanya

    2018-06-01

    The imidazopyridines are a promising new class of antitubercular agents with potent activity in vitro and in vivo. We isolated mutants of Mycobacterium tuberculosis resistant to a representative imidazopyridine; the mutants had large shifts (>20-fold) in MIC. Whole-genome sequencing revealed mutations in Rv1339, a hypothetical protein of unknown function. We isolated mutants resistant to three further compounds from the series; resistant mutants isolated from two of the compounds had single nucleotide polymorphisms in Rv1339 and resistant mutants isolated from the third compound had single nucleotide polymorphisms in QcrB, the proposed target for the series. All the strains were resistant to two compounds, regardless of the mutation, and a strain carrying the QcrB T313I mutation was resistant to all of the imidazopyridine derivatives tested, confirming cross-resistance. By monitoring pH homeostasis and ATP generation, we confirmed that compounds from the series were targeting QcrB; imidazopyridines disrupted pH homeostasis and depleted ATP, providing further evidence of an effect on the electron transport chain. A representative compound was bacteriostatic against replicating bacteria, consistent with a mode of action against QcrB. The series had a narrow inhibitory spectrum, with no activity against other bacterial species. No synergy or antagonism was seen with other antituberculosis drugs under development. In conclusion, our data support the hypothesis that the imidazopyridine series functions by reducing ATP generation via inhibition of QcrB. Copyright © 2018 O'Malley et al.

  19. TSTA compound cryopump

    International Nuclear Information System (INIS)

    Batzer, T.H.; Patrick, R.E.; Call, W.R.

    1980-01-01

    The Tritium System Test Assembly (TSTA), at the Los Alamos Scientific Laboratory, is intended to demonstrate realistic fuel supply and cleanup scenarios for future fusion reactors. The vacuum pumps must be capable of handling large quantities of reactor exhaust gases consisting largely of mixtures of hydrogen and helium isotopes. Cryocondensing pumps will not pump helium at 4.2 K; while cryosorption pumps using molecular sieves or charcoal have good helium pumping speed, the adsorbent clogs with condensed hydrogen while pumping mixtures of both. A solution to this problem is a compound design whereby the first stage condenses the hydrogen and the second, or sorption, stage pumps the helium. The TSTA pump designed at Lawrence Livermore National Laboratory uses argon gas to cryotrap the helium in the helium-hydrogen mixture. The argon is sprayed directly onto the 4.2 K surface at a rate proportional to the helium flow rate, permitting continuous pumping of the helium-hydrogen mixtures in a single-stage pump. However, the possibility of differential desorption as a first stage in the TSTA gas separation cycle required the inclusion of a first-stage hydrogen isotope condenser. The design, performance, and operating characteristics are discussed

  20. Anti-Biofilm Compounds Derived from Marine Sponges

    Directory of Open Access Journals (Sweden)

    Christian Melander

    2011-10-01

    Full Text Available Bacterial biofilms are surface-attached communities of microorganisms that are protected by an extracellular matrix of biomolecules. In the biofilm state, bacteria are significantly more resistant to external assault, including attack by antibiotics. In their native environment, bacterial biofilms underpin costly biofouling that wreaks havoc on shipping, utilities, and offshore industry. Within a host environment, they are insensitive to antiseptics and basic host immune responses. It is estimated that up to 80% of all microbial infections are biofilm-based. Biofilm infections of indwelling medical devices are of particular concern, since once the device is colonized, infection is almost impossible to eliminate. Given the prominence of biofilms in infectious diseases, there is a notable effort towards developing small, synthetically available molecules that will modulate bacterial biofilm development and maintenance. Here, we highlight the development of small molecules that inhibit and/or disperse bacterial biofilms specifically through non-microbicidal mechanisms. Importantly, we discuss several sets of compounds derived from marine sponges that we are developing in our labs to address the persistent biofilm problem. We will discuss the discovery/synthesis of natural products and their analogues, including our marine sponge-derived compounds, and the initial adjuvant activity and toxicological screening of our novel anti-biofilm compounds.

  1. The application of FEL-EXPERT system in the interpretation of boron compounds toxicity

    International Nuclear Information System (INIS)

    Strouf, O.; Marik, V.

    1990-01-01

    The effect of substructural features of boron compounds on their toxicity (LD50, mice, i.p.) was studied using the FEL-EXPERT system developed by the Czech Technical University of Prague. A set of 108 compounds containing one or two boron atoms in their molecule was arbitrarily divided into three classes: compounds with high toxicity (LD50 < 50 mg/kg), medium toxicity (50 mg/kg ≤ LD50 < 1000 mg/kg) and low toxicity (LD50 ≥ 1000 mg/kg). The compounds were represented by 70 substructural fragments, 27 of them being "central substructures" containing boron atom(s). The inference net consisted of 118 nodes (74 of the Bayesian type), 362 production rules and 74 context links. The total classification correctness was 98%. As a case study, the classification of p-tolylboronic acid (LD50 = 520 mg/kg) and 4-carboxyphenylboronic acid (LD50 = 3838 mg/kg) was discussed. 4 figs., 2 tabs., 11 refs
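The Bayesian-type nodes of such an inference net combine fragment evidence probabilistically. As a much-simplified, hypothetical illustration of that idea (a naive Bayes classifier over binary substructural fragments, not the actual FEL-EXPERT rule base), with toy fragment names:

```python
import math
from collections import defaultdict

def train(samples):
    """samples: list of (fragment_set, toxicity_class).
    Returns class counts, per-class fragment counts, and the fragment vocabulary."""
    counts = defaultdict(int)
    frag_counts = defaultdict(lambda: defaultdict(int))
    fragments = set()
    for frags, cls in samples:
        counts[cls] += 1
        fragments |= frags
        for f in frags:
            frag_counts[cls][f] += 1
    return counts, frag_counts, fragments

def classify(frags, counts, frag_counts, fragments):
    """Pick the class with the highest log-posterior under naive Bayes,
    using Laplace smoothing for fragment likelihoods."""
    total = sum(counts.values())
    best, best_lp = None, float("-inf")
    for cls, n in counts.items():
        lp = math.log(n / total)
        for f in fragments:
            p = (frag_counts[cls][f] + 1) / (n + 2)
            lp += math.log(p if f in frags else 1 - p)
        if lp > best_lp:
            best, best_lp = cls, lp
    return best
```

Trained on a few labeled fragment sets, the classifier assigns a new compound to whichever toxicity class makes its observed fragments most probable.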

  2. Parity violation in the compound nucleus

    International Nuclear Information System (INIS)

    Mitchell, G. E.; Crawford, B. E.; Grossmann, C. A.; Lowie, L. Y.; Bowman, J. D.; Knudson, J.; Penttilae, S.; Seestrom, S. J.; Smith, D. A.; Yen, Yi-Fen; Yuan, V. W.; Delheij, P. P. J.; Haseyama, T.; Masaike, A.; Matsuda, Y.; Postma, H.; Roberson, N. R.; Sharapov, E. I.; Stephenson, S. L.

    1999-01-01

    Measurements have been performed on the helicity dependence of the neutron resonance cross section for many nuclei by our TRIPLE Collaboration. A large number of parity violations are observed. Generic enhancements amplify the signal for symmetry breaking and the stochastic properties of the compound nucleus permit the strength of the symmetry-breaking interaction to be determined without knowledge of the wave functions of individual states. A total of 15 nuclei have been analyzed with this statistical approach. The results are summarized

  3. Anaerobic catabolism of aromatic compounds: a genetic and genomic view.

    Science.gov (United States)

    Carmona, Manuel; Zamarro, María Teresa; Blázquez, Blas; Durante-Rodríguez, Gonzalo; Juárez, Javier F; Valderrama, J Andrés; Barragán, María J L; García, José Luis; Díaz, Eduardo

    2009-03-01

    Aromatic compounds belong to one of the most widely distributed classes of organic compounds in nature, and a significant number of xenobiotics belong to this family of compounds. Since many habitats containing large amounts of aromatic compounds are often anoxic, the anaerobic catabolism of aromatic compounds by microorganisms becomes crucial in biogeochemical cycles and in the sustainable development of the biosphere. The mineralization of aromatic compounds by facultative or obligate anaerobic bacteria can be coupled to anaerobic respiration with a variety of electron acceptors as well as to fermentation and anoxygenic photosynthesis. Since the redox potential of the electron-accepting system dictates the degradative strategy, there is wide biochemical diversity among anaerobic aromatic degraders. However, the genetic determinants of all these processes and the mechanisms involved in their regulation are much less studied. This review focuses on the recent findings that standard molecular biology approaches together with new high-throughput technologies (e.g., genome sequencing, transcriptomics, proteomics, and metagenomics) have provided regarding the genetics, regulation, ecophysiology, and evolution of anaerobic aromatic degradation pathways. These studies revealed that the anaerobic catabolism of aromatic compounds is more diverse and widespread than previously thought, and the complex metabolic and stress programs associated with the use of aromatic compounds under anaerobic conditions are starting to be unraveled. Anaerobic biotransformation processes based on unprecedented enzymes and pathways with novel metabolic capabilities, as well as the design of novel regulatory circuits and catabolic networks of great biotechnological potential in synthetic biology, are now feasible to approach.

  4. Analysis of Three Compounds in Flos Farfarae by Capillary Electrophoresis with Large-Volume Sample Stacking

    Directory of Open Access Journals (Sweden)

    Hai-xia Yu

    2017-01-01

    Full Text Available The aim of this study was to develop a method combining online concentration and high-efficiency capillary electrophoresis separation to analyze and detect three compounds (rutin, hyperoside, and chlorogenic acid) in Flos Farfarae. In order to obtain good resolution and enrichment, several parameters such as the choice of running buffer, pH and concentration of the running buffer, organic modifier, temperature, and separation voltage were all investigated. The optimized conditions were as follows: a buffer of 40 mM NaH2PO4-40 mM Borax-30% v/v methanol (pH 9.0); hydrodynamic sample injection of up to 4 s at 0.5 psi; and 20 kV applied voltage. A diode-array detector was used, and the detection wavelength was 364 nm. Based on peak area, higher levels of selective and sensitive improvements in analysis were observed, and about 14-, 26-, and 5-fold enrichment of rutin, hyperoside, and chlorogenic acid was achieved, respectively. This method was successfully applied to determine the three compounds in Flos Farfarae. The linear ranges of peak response versus concentration were 20–400 µg/mL, 16.5–330 µg/mL, and 25–500 µg/mL, respectively. The regression coefficients were 0.9998, 0.9999, and 0.9991, respectively.
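Calibration figures like the linear ranges and regression coefficients above come from an ordinary least-squares fit of peak response against concentration. A minimal sketch with hypothetical data (the reported regression coefficients correspond to r of the fitted line):

```python
import math

def linear_fit(x, y):
    """Ordinary least-squares line y = a*x + b.
    Returns (slope a, intercept b, correlation coefficient r)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx, sxy / math.sqrt(sxx * syy)
```

For a perfectly linear hypothetical series of standards (e.g. peak area = 2 * concentration + 5 over 20–400 µg/mL), the fit recovers the slope and intercept exactly and r = 1; real calibration data give r slightly below 1, as in the values reported above.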

  5. Testing antidepressant compounds in a neuropsychological model of drug action

    NARCIS (Netherlands)

    Cerit, Hilal

    2015-01-01

    Although much research effort has been put into the development of new antidepressant drugs, the process of developing a drug often fails at the stage of large randomized controlled trials (RCTs) in which an initially promising compound appears to lack efficacy after all. Several experimental

  6. Estimation of optical rotation of γ-alkylidenebutenolide, cyclopropylamine, cyclopropyl-methanol and cyclopropenone based compounds by a Density Functional Theory (DFT) approach.

    Science.gov (United States)

    Shahzadi, Iram; Shaukat, Aqsa; Zara, Zeenat; Irfan, Muhammad; Eliasson, Bertil; Ayub, Khurshid; Iqbal, Javed

    2017-10-01

    Computing the optical rotation of organic molecules can be a real challenge, and various theoretical approaches have been developed in this regard. A benchmark study of the optical rotation of various classes of compounds was carried out with Density Functional Theory (DFT) methods. The aim of the present research was to find the best-suited functional and basis set to estimate the optical rotations of selected compounds with respect to experimental literature values. Six DFT functionals, LSDA, BVP86, CAM-B3LYP, B3PW91, PBE, and B3LYP, were applied to 22 different compounds. Furthermore, six different basis sets, i.e., 3-21G, 6-31G, aug-cc-pVDZ, aug-cc-pVTZ, DGDZVP, and DGDZVP2, were also applied with the best-suited functional, B3LYP. After rigorous effort, it can be safely said that the best combination of functional and basis set for the estimation of optical rotation of the selected compounds is B3LYP/aug-cc-pVTZ. © 2017 Wiley Periodicals, Inc.
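Selecting a "best-suited" functional/basis-set combination amounts to comparing an error statistic against the experimental values. A sketch using mean absolute deviation on hypothetical rotation values (not data from the study):

```python
def mean_abs_error(calc, expt):
    """Mean absolute deviation between calculated and experimental rotations."""
    return sum(abs(c - e) for c, e in zip(calc, expt)) / len(expt)

# Hypothetical specific rotations (deg) for three test compounds
experimental = [12.0, -35.0, 88.0]
candidates = {
    "B3LYP/aug-cc-pVTZ": [13.1, -33.8, 86.5],
    "LSDA/3-21G": [25.0, -10.0, 60.0],
}
best = min(candidates,
           key=lambda name: mean_abs_error(candidates[name], experimental))
```

With these made-up numbers the triple-zeta B3LYP combination wins by an order of magnitude in mean absolute deviation, mirroring the kind of comparison the benchmark performs across all functional/basis-set pairs.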

  7. Organic compounds in hydraulic fracturing fluids and wastewaters: A review.

    Science.gov (United States)

    Luek, Jenna L; Gonsior, Michael

    2017-10-15

    High volume hydraulic fracturing (HVHF) of shale to stimulate the release of natural gas produces a large quantity of wastewater in the form of flowback fluids and produced water. These wastewaters are highly variable in their composition and contain a mixture of fracturing fluid additives, geogenic inorganic and organic substances, and transformation products. The qualitative and quantitative analyses of organic compounds identified in HVHF fluids, flowback fluids, and produced waters are reviewed here to communicate knowledge gaps that exist in the composition of HVHF wastewaters. In general, analyses of organic compounds have focused on those amenable to gas chromatography, focusing on volatile and semi-volatile oil and gas compounds. Studies of more polar and non-volatile organic compounds have been limited by a lack of knowledge of what compounds may be present as well as quantitative methods and standards available for analyzing these complex mixtures. Liquid chromatography paired with high-resolution mass spectrometry has been used to investigate a number of additives and will be a key tool to further research on transformation products that are increasingly solubilized through physical, chemical, and biological processes in situ and during environmental contamination events. Diverse treatments have been tested and applied to HVHF wastewaters but limited information has been published on the quantitative removal of individual organic compounds. This review focuses on recently published information on organic compounds identified in flowback fluids and produced waters from HVHF. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Intercalation compounds involving inorganic layered structures

    Directory of Open Access Journals (Sweden)

    CONSTANTINO VERA R. L.

    2000-01-01

    Two-dimensional inorganic networks can show intracrystalline reactivity, i.e., simple ions, large species such as Keggin ions, organic species, coordination compounds or organometallics can be incorporated in the interlayer region. The host-guest interaction usually causes changes in their chemical, catalytic, electronic and optical properties. The isolation of materials with interesting properties by means of soft chemistry routes has opened up the possibility of industrial and technological applications of these compounds. We have been using several synthetic approaches to intercalate porphyrins and phthalocyanines into inorganic materials: smectite clays, layered double hydroxides and layered niobates. The isolated materials have been characterized by elemental and thermal analysis, X-ray diffraction, surface area measurements, scanning electron microscopy, electronic and resonance Raman spectroscopies, and EPR. The degree of layer stacking and the charge density of the matrices, as well as their acid-base nature, were considered in our studies of the interaction between the macrocycles and the inorganic hosts.

  9. Centrifugal Fragmentation of a Dinuclear System in the Process of Its Evolution to the Compound Nucleus

    CERN Document Server

    Volkov, V V

    2005-01-01

    The physical content of centrifugal fragmentation is discussed. It is a specific nuclear process which is realized in the evolution of a dinuclear system into a compound nucleus at large angular momenta and large mass asymmetry of the system. The dinuclear system concept which describes the process of the compound nucleus formation in heavy ion reactions predicts the possibility of centrifugal fragmentation. Experimental data giving evidence of the realization of this nuclear process are given. A possible scheme of the centrifugal fragmentation model is discussed.

  10. Centrifugal fragmentation of a dinuclear system in the process of its evolution to the compound nucleus

    International Nuclear Information System (INIS)

    Volkov, V.V.

    2005-01-01

    The physical content of centrifugal fragmentation is discussed. It is a specific nuclear process which is realized in the evolution of a dinuclear system into a compound nucleus at large angular momenta and large mass asymmetry of the system. The dinuclear system concept which describes the process of the compound nucleus formation in heavy ion reactions predicts the possibility of centrifugal fragmentation. Experimental data giving evidence of the realization of this nuclear process are given. A possible scheme of the centrifugal fragmentation model is discussed.

  11. Maximum hardness and minimum polarizability principles through lattice energies of ionic compounds

    International Nuclear Information System (INIS)

    Kaya, Savaş; Kaya, Cemal; Islam, Nazmul

    2016-01-01

    The maximum hardness (MHP) and minimum polarizability (MPP) principles have been analyzed using the relationship among the lattice energies of ionic compounds and their electronegativities, chemical hardnesses and electrophilicities. The lattice energy, electronegativity, chemical hardness and electrophilicity values of the ionic compounds considered in the present study were calculated using new equations derived by some of the authors in recent years. For 4 simple reactions, the changes in hardness (Δη), polarizability (Δα) and electrophilicity index (Δω) were calculated. It is shown that the maximum hardness principle is obeyed by all of the chemical reactions, but the minimum polarizability and minimum electrophilicity principles are not valid for all of them. We also propose simple methods to compute the percentage ionic character and internuclear distances of ionic compounds. Comparative studies with experimental data sets reveal that the proposed methods for computing the percentage ionic character and internuclear distances of ionic compounds are valid.

  12. Maximum hardness and minimum polarizability principles through lattice energies of ionic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Kaya, Savaş, E-mail: savaskaya@cumhuriyet.edu.tr [Department of Chemistry, Faculty of Science, Cumhuriyet University, Sivas 58140 (Turkey); Kaya, Cemal, E-mail: kaya@cumhuriyet.edu.tr [Department of Chemistry, Faculty of Science, Cumhuriyet University, Sivas 58140 (Turkey); Islam, Nazmul, E-mail: nazmul.islam786@gmail.com [Theoretical and Computational Chemistry Research Laboratory, Department of Basic Science and Humanities/Chemistry Techno Global-Balurghat, Balurghat, D. Dinajpur 733103 (India)

    2016-03-15

    The maximum hardness (MHP) and minimum polarizability (MPP) principles have been analyzed using the relationship among the lattice energies of ionic compounds and their electronegativities, chemical hardnesses and electrophilicities. The lattice energy, electronegativity, chemical hardness and electrophilicity values of the ionic compounds considered in the present study were calculated using new equations derived by some of the authors in recent years. For 4 simple reactions, the changes in hardness (Δη), polarizability (Δα) and electrophilicity index (Δω) were calculated. It is shown that the maximum hardness principle is obeyed by all of the chemical reactions, but the minimum polarizability and minimum electrophilicity principles are not valid for all of them. We also propose simple methods to compute the percentage ionic character and internuclear distances of ionic compounds. Comparative studies with experimental data sets reveal that the proposed methods for computing the percentage ionic character and internuclear distances of ionic compounds are valid.
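    The quantities behind the MHP/MPP analysis follow from the standard conceptual-DFT finite-difference definitions (Parr-Pearson); the authors' new lattice-energy equations are not reproduced in the abstract and are not sketched here. The example below only illustrates the textbook relations, using approximate literature ionization energies and electron affinities for atomic Na and Cl:

    ```python
    # Conceptual-DFT descriptors from ionization energy I and electron affinity A
    # (finite-difference Parr-Pearson definitions; all values in eV):
    #   electronegativity  chi   = (I + A) / 2
    #   chemical hardness  eta   = (I - A) / 2
    #   electrophilicity   omega = chi**2 / (2 * eta)

    def descriptors(I, A):
        chi = (I + A) / 2.0
        eta = (I - A) / 2.0
        omega = chi ** 2 / (2.0 * eta)
        return chi, eta, omega

    # Approximate experimental I and A for atomic Na and Cl (eV).
    chi_na, eta_na, omega_na = descriptors(I=5.14, A=0.55)
    chi_cl, eta_cl, omega_cl = descriptors(I=12.97, A=3.62)

    # A reaction obeys the maximum hardness principle when the products are
    # harder than the reactants, i.e. delta_eta > 0 for reactants -> products.
    print(f"Na: chi={chi_na:.2f}, eta={eta_na:.2f}, omega={omega_na:.2f}")
    print(f"Cl: chi={chi_cl:.2f}, eta={eta_cl:.2f}, omega={omega_cl:.2f}")
    ```

    Summing such descriptors over reactants and products gives the Δη, Δα and Δω signs used to test the principles reaction by reaction.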

  13. The photoluminescence spectra of micropowder of aromatic compounds under ultraviolet laser excitation

    International Nuclear Information System (INIS)

    Rakhmatullaev, I.A.; Kurbonov, A.K. et al.; Gorelik, V.S.

    2016-01-01

    A method for the diagnostics of aromatic compounds, using novocaine, aspirin and anthracene as examples, is presented. The method is based on optical detection of photoluminescence spectra under ultraviolet laser (266 nm) excitation. Employing this method, photoluminescence spectra are obtained which allow one to establish differences in the composition and structure of the compounds. The developed method can be used to analyze the quality of a large class of luminescent bioactive structures under ultraviolet radiation. (authors)

  14. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    Science.gov (United States)

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  15. EXAFS investigations of Cu-Mg-O compound

    CERN Document Server

    Sidorenko, A F; Babanov, Y A; Naumov, S V; Samokhvalov, A A

    2001-01-01

    The interest in systems containing copper oxide is connected with the problem of high-temperature superconductivity, because of the closeness of their basic physical properties to those of the superconducting parent Cu compounds. In this work, an EXAFS study of the Cu0.2Mg0.8O compound is presented. A new iterative algorithm for solving the ill-posed problem of determining three partial pair correlation functions from a single EXAFS data set near the Cu K-edge is described. X-ray scattering results for the given sample show the presence of a single phase with the MgO structure and a lattice parameter of 4.219 A instead of 4.208 A for pure MgO. From the EXAFS investigations, we find local distortion of the lattice. We found that the short-range order differs both from that of a hypothetical alloy with the MgO structure and from that of copper oxide.

  16. The role of familiarity in associative recognition of unitized compound word pairs.

    Science.gov (United States)

    Ahmad, Fahad N; Hockley, William E

    2014-01-01

    This study examined the effect of unitization and contribution of familiarity in the recognition of word pairs. Compound words were presented as word pairs and were contrasted with noncompound word pairs in an associative recognition task. In Experiments 1 and 2, yes-no recognition hit and false-alarm rates were significantly higher for compound than for noncompound word pairs, with no difference in discrimination in both within- and between-subject comparisons. Experiment 2 also showed that item recognition was reduced for words from compound compared to noncompound word pairs, providing evidence of the unitization of the compound pairs. A two-alternative forced-choice test used in Experiments 3A and 3B provided evidence that the concordant effect for compound word pairs was largely due to familiarity. A discrimination advantage for compound word pairs was also seen in these experiments. Experiment 4A showed that a different pattern of results is seen when repeated noncompound word pairs are compared to compound word pairs. Experiment 4B showed that memory for the individual items of compound word pairs was impaired relative to items in repeated and nonrepeated noncompound word pairs, and Experiment 5 demonstrated that this effect is eliminated when the elements of compound word pairs are not unitized. The concordant pattern seen in yes-no recognition and the discrimination advantage in forced-choice recognition for compound relative to noncompound word pairs is due to greater reliance on familiarity at test when pairs are unitized.

  17. Distributed Large Independent Sets in One Round On Bounded-independence Graphs

    OpenAIRE

    Halldorsson , Magnus M.; Konrad , Christian

    2015-01-01

    We present a randomized one-round distributed algorithm with single-bit messages for the maximum independent set problem in polynomially bounded-independence graphs, with a poly-logarithmic approximation factor. Bounded-independence graphs capture various models of wireless networks such as the unit disc graph model and the quasi unit disc graph model. For instance, on unit disc graphs, our achieved approximation ratio is O((log(n)/log(log(n)))^2). A starting point of our w...
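    The algorithm itself is not spelled out in the abstract. As a generic illustration of the one-round idea (not the authors' construction, which uses single-bit messages), the sketch below has every node draw a random rank and join the independent set only if its rank beats all of its neighbors', which always yields an independent set after a single message exchange:

    ```python
    import random

    def one_round_independent_set(adj, seed=0):
        """One communication round: each node draws a random rank and joins the
        independent set iff its rank is strictly smaller than every neighbor's.
        `adj` maps node -> set of neighbors. Returns the selected node set.
        (Illustrative local-minimum rule, not the algorithm from the paper.)"""
        rng = random.Random(seed)
        rank = {v: rng.random() for v in adj}  # each node's local random draw
        return {v for v in adj if all(rank[v] < rank[u] for u in adj[v])}

    # 5-cycle example: 0-1-2-3-4-0.
    adj = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
    chosen = one_round_independent_set(adj)
    # No two chosen nodes are adjacent, by construction of the rule.
    assert all(u not in adj[v] for v in chosen for u in chosen)
    print(sorted(chosen))
    ```

    The node with the globally smallest rank always joins, so the output is never empty; how large the set is relative to the maximum independent set is exactly the approximation question the paper studies.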

  18. Characterization of volatile compounds of Mezcal, an ethnic alcoholic beverage obtained from Agave salmiana.

    Science.gov (United States)

    De León-Rodríguez, Antonio; González-Hernández, Lidia; Barba de la Rosa, Ana P; Escalante-Minakata, Pilar; López, Mercedes G

    2006-02-22

    Commercial mezcals (white, white with worm, rested, rested with worm, and aged) produced from Agave salmiana were analyzed by solid-phase microextraction-gas chromatography-mass spectrometry (SPME-GC-MS). Thirty-seven compounds were identified, and nine of them were classified as major compounds of mezcal (MCM). Saturated alcohols, ethyl acetate, ethyl 2-hydroxypropanoate, and acetic acid form the MCM group. The minor compounds of mezcal include other alcohols, aldehydes, ketones, long-chain ethyl esters, organic acids, furans, terpenes, alkenes, and alkynes. Most of the compounds found in the mezcals in this study are similar to those present in tequilas and other alcoholic beverages. However, mezcals contain unique compounds such as limonene and pentyl butanoate, which can be used as markers of the authenticity of mezcal produced from A. salmiana.

  19. How large a training set is needed to develop a classifier for microarray data?

    Science.gov (United States)

    Dobbin, Kevin K; Zhao, Yingdong; Simon, Richard M

    2008-01-01

    A common goal of gene expression microarray studies is the development of a classifier that can be used to divide patients into groups with different prognoses, or with different expected responses to a therapy. These types of classifiers are developed on a training set, which is the set of samples used to train a classifier. The question of how many samples are needed in the training set to produce a good classifier from high-dimensional microarray data is challenging. We present a model-based approach to determining the sample size required to adequately train a classifier. It is shown that sample size can be determined from three quantities: standardized fold change, class prevalence, and number of genes or features on the arrays. Numerous examples and important experimental design issues are discussed. The method is adapted to address ex post facto determination of whether the size of a training set used to develop a classifier was adequate. An interactive web site for performing the sample size calculations is provided. We showed that sample size calculations for classifier development from high-dimensional microarray data are feasible, discussed numerous important considerations, and presented examples.
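    The model-based formula itself is not given in the abstract. As a rough illustration of how training-set size interacts with standardized fold change and feature count, the pure-Python simulation below trains a nearest-centroid classifier on synthetic two-class expression data in which a handful of genes carry a fixed standardized fold change, and estimates accuracy as the training set grows (all parameter values are illustrative assumptions, not the authors' model):

    ```python
    import random

    def simulate_accuracy(n_train, n_genes=100, n_informative=10,
                          fold_change=1.5, n_test=200, reps=20, seed=0):
        """Mean test accuracy of a nearest-centroid classifier trained on
        n_train samples per class. Informative genes are shifted by
        `fold_change` standard deviations in class 1 (illustrative model)."""
        rng = random.Random(seed)

        def sample(cls):
            # Gene g is informative iff g < n_informative; unit-variance noise.
            return [rng.gauss(fold_change if cls == 1 and g < n_informative else 0.0, 1.0)
                    for g in range(n_genes)]

        correct = total = 0
        for _ in range(reps):
            # Class centroids estimated from the training samples.
            cents = []
            for cls in (0, 1):
                train = [sample(cls) for _ in range(n_train)]
                cents.append([sum(col) / n_train for col in zip(*train)])
            for _ in range(n_test):
                cls = rng.randrange(2)
                x = sample(cls)
                d = [sum((a - b) ** 2 for a, b in zip(x, c)) for c in cents]
                correct += (d[cls] == min(d))
                total += 1
        return correct / total

    # Accuracy improves as the training set grows, then saturates.
    acc_small, acc_large = simulate_accuracy(3), simulate_accuracy(30)
    print(acc_small, acc_large)
    ```

    Running such a simulation over a grid of fold changes, prevalences and gene counts traces out the learning curve whose plateau the paper's model-based sample-size calculation targets analytically.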

  20. Extreme Simplification and Rendering of Point Sets using Algebraic Multigrid

    NARCIS (Netherlands)

    Reniers, Dennie; Telea, Alexandru

    2005-01-01

    We present a novel approach for extreme simplification of point set models in the context of real-time rendering. Point sets are often rendered using simple point primitives, such as oriented discs. However efficient, simple primitives are less effective in approximating large surface areas. A large

  1. Insertion compounds of transition-metal and uranium oxides

    International Nuclear Information System (INIS)

    Chippindale, A.M.; Dickens, P.G.; Powell, A.V.

    1991-01-01

    Several transition-metal and actinide oxides, in which the metal occurs in a high oxidation state, have open covalent structures and are capable of incorporating alkali and other electropositive metals under mild conditions to form insertion compounds AxMOn. These are solids which have several features in common: (i) over a range of compositions, AxMOn exists as one or more stable or metastable phases in which the structure of the parent oxide MOn is largely retained and the insertion element A is accommodated interstitially; (ii) insertion is accompanied by a redox process, A = A+ + e-(M), in which M is reduced and the electronic properties of the parent oxide change to those typical of a mixed-valence compound; (iii) the insertion process, xA + MOn = AxMOn, can be reversed, at least to some extent, by chemical or electrochemical reaction, with retention of structure (topotactic reaction). This review concentrates on methods of synthesis, characterisation, crystal structure and thermochemistry of these insertion compounds. It updates and extends previous work. (author)

  2. Self-organising maps and correlation analysis as a tool to explore patterns in excitation-emission matrix data sets and to discriminate dissolved organic matter fluorescence components.

    Science.gov (United States)

    Ejarque-Gonzalez, Elisabet; Butturini, Andrea

    2014-01-01

    Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
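    As a reminder of the mechanics behind the method (a generic SOM training loop, not the configuration or data used in the study), a minimal 1-D map over 2-D input vectors can be sketched as follows:

    ```python
    import math
    import random

    def train_som(data, n_units=4, epochs=50, lr0=0.5, radius0=2.0, seed=0):
        """Train a 1-D self-organising map on 2-D inputs.
        Each step: find the best-matching unit (BMU), then pull the BMU and its
        grid neighbours toward the sample with a decaying learning rate/radius."""
        rng = random.Random(seed)
        weights = [[rng.random(), rng.random()] for _ in range(n_units)]
        for epoch in range(epochs):
            frac = epoch / epochs
            lr = lr0 * (1.0 - frac)                    # linear decay
            radius = max(radius0 * (1.0 - frac), 0.5)  # shrinking neighbourhood
            order = data[:]
            rng.shuffle(order)                         # random presentation order
            for x in order:
                # Best-matching unit: closest weight vector in input space.
                bmu = min(range(n_units),
                          key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
                for i in range(n_units):
                    h = math.exp(-((i - bmu) ** 2) / (2.0 * radius ** 2))
                    weights[i] = [w + lr * h * (v - w) for w, v in zip(weights[i], x)]
        return weights

    def quantization_error(data, weights):
        """Mean distance from each sample to its best-matching unit."""
        return sum(min(math.dist(x, w) for w in weights) for x in data) / len(data)

    # Two well-separated 2-D clusters; the trained map should cover both.
    rng = random.Random(1)
    data = ([[rng.gauss(0.2, 0.05), rng.gauss(0.2, 0.05)] for _ in range(50)] +
            [[rng.gauss(0.8, 0.05), rng.gauss(0.8, 0.05)] for _ in range(50)])
    w = train_som(data)
    print(quantization_error(data, w))
    ```

    In the EEM setting, each input vector would be an unfolded excitation-emission matrix rather than a 2-D point, and the trained map's component planes are what the correlation analysis then inspects.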

  3. Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data

    NARCIS (Netherlands)

    Hettne, K.M.; Boorsma, A.; Dartel, D.A. van; Goeman, J.J.; Jong, E. de; Piersma, A.H.; Stierum, R.H.; Kleinjans, J.C.; Kors, J.A.

    2013-01-01

    BACKGROUND: Availability of chemical response-specific lists of genes (gene sets) for pharmacological and/or toxic effect prediction for compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with gene set

  4. Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data

    NARCIS (Netherlands)

    Hettne, K.M.; Boorsma, A.; Dartel, van D.A.M.; Goeman, J.J.; Jong, de E.; Piersma, A.H.; Stierum, R.H.; Kleinjans, J.C.; Kors, J.A.

    2013-01-01

    Background: Availability of chemical response-specific lists of genes (gene sets) for pharmacological and/or toxic effect prediction for compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with gene set

  5. Differentiation of the molecular structure of nitro compounds as the basis for simulation of their thermal destruction processes

    Energy Technology Data Exchange (ETDEWEB)

    Korolev, V L; Pivina, Tatyana S; Sheremetev, Aleksei B [N.D.Zelinsky Institute of Organic Chemistry, Russian Academy of Sciences, Moscow (Russian Federation); Porollo, A A [University of Cincinnati, Cincinnati (United States); Petukhova, T V; Ivshin, Viktor P [Mari State University, Yoshkar-Ola (Russian Federation)

    2009-10-31

    Data from experimental and theoretical studies of the thermal decomposition of C- and N-nitro compounds of the aliphatic, alicyclic, aromatic and heteroaromatic series, which formed the basis for the development of an ab initio approach to predicting the mechanisms of thermolysis of energetic compounds, are described systematically. Relationships between the structures and thermolysis mechanisms of compounds, based on differentiation of structural fragments according to the functional surrounding of the nitro groups, are identified. Using the RRN (Recombination Reaction Network) strategy and the original CASB (Computer Assisted Structure Building) software, full reaction mechanisms for the thermal destruction of nitro compounds at different thermal decomposition levels (including extensive ones) are simulated. The full set of possible mechanisms of thermal decomposition of 38 chemically different nitro compounds is presented.

  6. Homogeneous photocatalytic reactions with organometallic and coordination compounds--perspectives for sustainable chemistry.

    Science.gov (United States)

    Hoffmann, Norbert

    2012-02-13

    Since the time of Giacomo Ciamician at the beginning of the 20th century, photochemical transformations have been recognized as contributing to sustainable chemistry. Electronic excitation significantly changes the reactivity of chemical compounds. Thus, the application of activation reagents is frequently avoided and transformations can be performed under mild conditions. Catalysis plays a central role in sustainable chemistry. Stoichiometric amounts of activation reagents are often avoided. This fact and the milder catalytic reaction conditions diminish the formation of byproducts. In the case of homogeneous catalysis, organometallic compounds are often applied. The combination of both techniques develops synergistic effects in the sense of "Green Chemistry". Herein, metal carbonyl-mediated reactions are reported. These transformations are of considerable interest for the synthesis of complex polyfunctionalized compounds. Copper(I)-catalyzed [2+2] photocycloaddition gives access to a large variety of cyclobutane derivatives. Currently, a large number of publications deal with photochemical electron-transfer-induced reactions with organometallic and coordination compounds, particularly with ruthenium complexes. Several photochemically induced oxidations can easily be performed with air or molecular oxygen when they are catalyzed with organometallic complexes. Photochemical reaction conditions also play a certain role in C-H activation with organometallic catalysts, for instance, with alkanes, although such transformations are conveniently performed with a variety of other photochemical reactions. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Just-in-Time Compound Pooling Increases Primary Screening Capacity without Compromising Screening Quality.

    Science.gov (United States)

    Elkin, L L; Harden, D G; Saldanha, S; Ferguson, H; Cheney, D L; Pieniazek, S N; Maloney, D P; Zewinski, J; O'Connell, J; Banks, M

    2015-06-01

    Compound pooling, or multiplexing more than one compound per well during primary high-throughput screening (HTS), is a controversial approach with a long history of limited success. Many issues with this approach likely arise from long-term storage of library plates containing complex mixtures of compounds at high concentrations. Due to the historical difficulties with using multiplexed library plates, primary HTS often uses a one-compound-one-well approach. However, as compound collections grow, innovative strategies are required to increase the capacity of primary screening campaigns. Toward this goal, we have developed a novel compound pooling method that increases screening capacity without compromising data quality. This method circumvents issues related to the long-term storage of complex compound mixtures by using acoustic dispensing to enable "just-in-time" compound pooling directly in the assay well immediately prior to assay. Using this method, we can pool two compounds per well, effectively doubling the capacity of a primary screen. Here, we present data from pilot studies using just-in-time pooling, as well as data from a large >2-million-compound screen using this approach. These data suggest that, for many targets, this method can be used to vastly increase screening capacity without significant reduction in the ability to detect screening hits. © 2015 Society for Laboratory Automation and Screening.

  8. Lebesgue Sets Immeasurable Existence

    Directory of Open Access Journals (Sweden)

    Diana Marginean Petrovai

    2012-12-01

    It is well known that the notions of measure and integral arose early enough, in close connection with practical problems of measuring geometric figures. The notion of measure was outlined in the early 20th century through the research of H. Lebesgue, founder of the modern theory of measure and integral. A technique of integration of functions was developed concurrently. Gradually, a specific area was formed, today called measure and integral theory. Essential contributions to building this theory were made by a large number of mathematicians: C. Carathéodory, J. Radon, O. Nikodym, S. Bochner, J. Pettis, P. Halmos and many others. In the following we present several abstract sets and classes of sets. There exist sets which are not Lebesgue measurable, and sets which are Lebesgue measurable but are not Borel measurable; hence B ⊂ L ⊂ P(X).

  9. Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption

    Science.gov (United States)

    Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane

    2014-01-01

    A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…

  10. Application of organic compounds for high-order harmonic generation of ultrashort pulses

    Science.gov (United States)

    Ganeev, R. A.

    2016-02-01

    Studies of the high-order nonlinear optical properties of a few organic compounds (polyvinyl alcohol, polyethylene, sugar, coffee, and leaf) are reported. Harmonic generation in laser-produced plasmas containing the molecules and large particles of the above materials is demonstrated. These studies showed that the harmonic distributions and harmonic cutoffs from organic compound plasmas were similar to those from graphite ablation. The characteristic feature of the observed harmonic spectra was the presence of blue-sided lobes near the lower-order harmonics.

  11. Occurrence of PCDD/F, PCB, PBDE, PFAS, and organotin compounds in fish meal, fish oil and fish feed.

    Science.gov (United States)

    Suominen, K; Hallikainen, A; Ruokojärvi, P; Airaksinen, R; Koponen, J; Rannikko, R; Kiviranta, H

    2011-10-01

    We analysed polychlorinated dibenzo-p-dioxins and furans (PCDD/F, dioxins) and polychlorinated biphenyls (PCB) in 13 fish meal, five fish oil, and seven fish feed samples. Polybrominated diphenyl ethers (PBDE), organotin compounds (OTC), and perfluoroalkylated substances (PFAS) were analysed in ten fish meal, two fish oil, and two fish feed samples. All measured TEQ concentrations of PCDD/F and PCB were below the maximum levels set by Directive 2002/32/EC. There was no correlation between concentrations of WHO-PCDD/F-TEQ and indicator PCB in our samples. The most common congeners among the PBDEs were BDE-47 and BDE-100. BDE-209 was present in five of the ten fish meals analysed. Tributyltin (TBT) was the predominant congener in all samples except three fish meals, where monobutyltin (MBT) was the major congener. Perfluorooctane sulphonate (PFOS) was the predominant congener in six of the ten fish meals analysed. There was large variation in the concentrations and congener distributions of the studied compounds between our samples. Our results underline a need to pay special attention to the origin and purity of feed raw material of marine origin. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Enantioselective column coupled electrophoresis employing large bore capillaries hyphenated with tandem mass spectrometry for ultra-trace determination of chiral compounds in complex real samples.

    Science.gov (United States)

    Piešťanský, Juraj; Maráková, Katarína; Kovaľ, Marián; Havránek, Emil; Mikuš, Peter

    2015-12-01

    A new multidimensional analytical approach for the ultra-trace determination of target chiral compounds in unpretreated complex real samples was developed in this work. The proposed analytical system provided high orthogonality due to on-line combination of three different methods (separation mechanisms), i.e. (1) isotachophoresis (ITP), (2) chiral capillary zone electrophoresis (chiral CZE), and (3) triple quadrupole mass spectrometry (QqQ MS). The ITP step, performed in a large bore capillary (800 μm), was utilized for the effective sample pretreatment (preconcentration and matrix clean-up) in a large injection volume (1-10 μL) enabling to obtain as low as ca. 80 pg/mL limits of detection for the target enantiomers in urine matrices. In the chiral CZE step, the different chiral selectors (neutral, ionizable, and permanently charged cyclodextrins) and buffer systems were tested in terms of enantioselectivity and influence on the MS detection response. The performance parameters of the optimized ITP - chiral CZE-QqQ MS method were evaluated according to the FDA guidance for bioanalytical method validation. Successful validation and application (enantioselective monitoring of renally eliminated pheniramine and its metabolite in human urine) highlighted great potential of this chiral approach in advanced enantioselective biomedical applications. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Calculations of the magnetic properties of R2M14B intermetallic compounds (R=rare earth, M=Fe, Co)

    International Nuclear Information System (INIS)

    Ito, Masaaki; Yano, Masao; Dempsey, Nora M.; Givord, Dominique

    2016-01-01

    The hard magnetic properties of “R–M–B” (R=rare earth, M=mainly Fe) magnets derive from the specific intrinsic magnetic properties encountered in Fe-rich R2M14B compounds. Exchange interactions are dominated by the 3d elements, Fe and Co, and may be modeled at the macroscopic scale with good accuracy. Based on classical formulae that relate the anisotropy coefficients to the crystalline electric field parameters and exchange interactions, a simple numerical approach is used to derive the temperature dependence of anisotropy in various R2Fe14B compounds (R=Pr, Nd, Dy). Remarkably, a unique set of crystal field parameters gives fair agreement with the experimentally measured properties of all compounds. This implies, reciprocally, that the properties of compounds that incorporate a mixture of different rare-earth elements may be predicted accurately. This is of special interest for material optimization, which often involves the partial replacement of Nd with another R element and also the substitution of Co for Fe. - Highlights: • Anisotropy constants derived from CEF parameters of R2M14B compounds (M=Fe, Co). • Anisotropy constants of all R2Fe14B compounds using a unique set of CEF parameters. • Moment non-collinearity in magnetization processes under B_app along the hard axis.

  14. Benchmarking Data Sets for the Evaluation of Virtual Ligand Screening Methods: Review and Perspectives.

    Science.gov (United States)

    Lagarde, Nathalie; Zagury, Jean-François; Montes, Matthieu

    2015-07-27

    Virtual screening methods are commonly used nowadays in drug discovery processes. However, to ensure their reliability, they have to be carefully evaluated. The evaluation of these methods is often performed retrospectively, notably by studying the enrichment of benchmarking data sets. For this purpose, numerous benchmarking data sets were developed over the years, and the resulting improvements led to the availability of high-quality benchmarking data sets. However, some points still have to be considered in the selection of the active compounds, decoys, and protein structures to obtain optimal benchmarking data sets.

  15. Organic compounds as indicators of air pollution

    DEFF Research Database (Denmark)

    Mølhave, Lars

    2003-01-01

    The most important indoor air pollutants have already been addressed with individual national guidelines or recommendations. However, an international set of guidelines or recommendations for indoor air quality (IAQ) is needed for these pollutants based on general and uniform rules for setting such standards. A major research need exists on the less adverse pollutants before recommendations or guidelines can be established. In the interim period a precaution principle should lead to an ALARA principle for these secondary causalities. It should be noted that volatile organic compound (VOC...) is an indicator for the presence of VOC indoors. The TVOC indicator can be used in relation to exposure characterization and source identification, but for VOCs only, not as an indicator of other pollutants and their health effects. In risk assessment the TVOC indicator can only be used as a screening tool and only...

  16. Organophosphorus pentavalent compounds: history, synthetic methods of preparation and application as insecticides and antitumor agents

    International Nuclear Information System (INIS)

    Santos, Viviane Martins Rebello dos; Donnici, Claudio Luis; DaCosta, Joao Batista Neves; Caixeiro, Janaina Marques Rodrigues

    2007-01-01

    This paper is a review of the history, synthesis and application of organophosphorus compounds, especially those of pentavalent phosphorus, such as phosphoramidates, phosphorothioates, phosphonates and phosphonic acids with insecticide and anticancer activities. The organophosphorus compounds with agrochemical applications show great structural variety: they include not only insecticides, but also fungicides, herbicides, and others. The large variety of commercially available organophosphorus pesticides is remarkable. Even more interesting is the high efficiency of some organophosphorus compounds as anticancer agents, such as cyclophosphamide and its derivatives. (author)

  17. Screening plant derived dietary phenolic compounds for bioactivity related to cardiovascular disease.

    Science.gov (United States)

    Croft, Kevin D; Yamashita, Yoko; O'Donoghue, Helen; Shirasaya, Daishi; Ward, Natalie C; Ashida, Hitoshi

    2018-04-01

    The potential health benefits of phenolic acids found in food and beverages have been suggested by a number of large population studies. However, the mechanism by which these compounds may exert biological effects is less well established. It is also now recognised that many complex polyphenols in the diet are metabolised to simple phenolic acids which can be taken up in the circulation. In this paper a number of selected phenolic compounds have been tested for their bioactivity in two cell culture models: the expression and activity of endothelial nitric oxide synthase (eNOS) in human aortic endothelial cells, and the uptake of glucose in muscle cells. Our data indicate that while none of the compounds tested had a significant effect on eNOS expression or activation in endothelial cells, several of the compounds increased glucose uptake in muscle cells. These compounds also enhanced the translocation of the glucose transporter GLUT4 to the plasma membrane, which may explain the observed increase in cellular glucose uptake. These results indicate that simple cell culture models may be useful to help understand the bioactivity of phenolic compounds in relation to cardiovascular protection. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. An introduction to random sets

    CERN Document Server

    Nguyen, Hung T

    2006-01-01

    The study of random sets is a large and rapidly growing area with connections to many areas of mathematics and applications in widely varying disciplines, from economics and decision theory to biostatistics and image analysis. The drawback to such diversity is that the research reports are scattered throughout the literature, with the result that in science and engineering, and even in the statistics community, the topic is not well known and much of the enormous potential of random sets remains untapped. An Introduction to Random Sets provides a friendly but solid initiation into the theory of random sets. It builds the foundation for studying random set data, which, viewed as imprecise or incomplete observations, are ubiquitous in today's technological society. The author, widely known for his best-selling A First Course in Fuzzy Logic text as well as his pioneering work in random sets, explores motivations, such as coarse data analysis and uncertainty analysis in intelligent systems, for studying random s...

  19. Large magnetoresistance in (AA')2FeReO6 double perovskites

    International Nuclear Information System (INIS)

    Teresa, J.M. de; Serrate, D.; Blasco, J.; Ibarra, M.R.; Morellon, L.

    2005-01-01

    We review the main structural, magnetic and magnetotransport properties of the intriguing (AA')2FeReO6 magnetic double perovskites. As the average cation size decreases, the crystallographic structure at room temperature evolves from cubic [(AA')2=Ba2, Ba1.5Sr0.5, BaSr, Ba0.5Sr1.5] to tetragonal [(AA')2=Sr2] and monoclinic [(AA')2=Ca0.5Sr1.5, CaSr, Ca1.5Sr0.5, Ca2]. The Curie temperature increases anomalously from ∼303 K for Ba2 to ∼522 K for Ca2, in sharp contrast with the behaviour observed in the isostructural compounds (AA')2FeMoO6. Other anomalous features of the (AA')2FeReO6 series are the large magnetic anisotropy, the large magnetoelastic coupling and the semiconducting behaviour of the monoclinic compounds. The monoclinic compounds undergo a structural/magnetic transition at TS below 125 K. Three different magnetoresistance mechanisms have been identified: the intergrain negative magnetoresistance effect, which is present across the whole series of compounds, and, in the case of the monoclinic compounds below TS, a negative magnetoresistance effect associated with the melting of the low-temperature phase and a positive magnetoresistance effect present only for (AA')2=Ca2 below T∼50 K.

  20. How to Prepare SMC and BMC-like Compounds to Perform Relevant Rheological Experiments?

    Science.gov (United States)

    Guiraud, Olivier; Dumont, Pierre J. J.; Orgéas, Laurent

    2013-04-01

    The study of the rheology of injected or compression-moulded compounds like SMC or BMC is made particularly difficult by the high content and the intricate arrangement of their fibrous reinforcement. For these two types of compounds, inappropriate rheological testing protocols and rheometers are often used, which leads to a very large scatter of the experimental data. This study describes specific sampling and specimen preparation methods, as well as dedicated rheometry devices, to test the rheology of these compounds. Following the proposed protocols, it is possible to obtain rheological measurements showing low scatter of the recorded stress values: about ±10% for SMC and about ±15% for BMC-like compounds.

  1. A biotechnological approach for the development of new antifungal compounds to protect the environment and the human health

    Directory of Open Access Journals (Sweden)

    Claudia Zani

    2015-11-01

    Full Text Available Background. In the Po Valley aflatoxins play a relevant role: the local food economy is heavily based on cereal cultivation for animal feed and human nutrition. The aims of this project are the identification of new compounds that inhibit Aspergillus proliferation, the development of new inhibitors of aflatoxin production, and the set-up of a practical screening procedure to identify the most effective and safe compounds. Design and Methods. New compounds will be synthesized as metal complexes, with molecules of natural origin as ligands and endogenous metal ions, to increase their bioavailability for the fungi. A biotechnological high-throughput screening will be set up to efficiently identify the most powerful substances. The newly synthesized compounds with effective antifungal activities will be evaluated with a battery of tests with different end-points to assess the potential toxic risk for environmental and human health. Expected impact of the study for public health. The fundamental step in the project will be the synthesis of new compounds and the study of their capability to inhibit aflatoxin biosynthesis. A new, simple, inexpensive and high-throughput method to screen the antifungal and anti-mycotoxin activity of the newly synthesized compounds will be applied. The evaluation of possible risks for humans due to toxic and genotoxic activities of the molecules will be made with a new approach using different types of cells (bacteria, plant and human cells).

  2. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    Science.gov (United States)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it. Catalogue identifier: AEQD_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 130159 No. of bytes in distributed program, including test data, etc.: 24 687 455 Distribution format: tar.gz Programming language: C++. Computer: Any computer capable of running an executable produced by a g++ compiler (4.6.3 version). Operating system: Linux, Unix OS-es. RAM: 1 073 741 824 bytes Classification: 3. Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model. Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible

  3. Chemoinformatics-assisted development of new anti-biofilm compounds

    DEFF Research Database (Denmark)

    Dürig, Anna; Kouskoumvekaki, Irene; Vejborg, Rebecca Munk

    2010-01-01

    Bacterial biofilms are associated with a large number of infections. Biofilm-dwelling bacteria are particularly resistant to antibiotics, making it hard to eradicate biofilm-associated infections. Here, we use a novel cross-disciplinary approach combining microbiology and chemoinformatics to identify new and efficient anti-biofilm drugs. We found that ellagic acid (present in green tea) significantly inhibited biofilm formation of Streptococcus dysgalactiae. Based on ellagic acid, we performed in silico screening of the Chinese Natural Product Database to predict a 2nd-generation list of compounds with similar characteristics. One of these, esculetin, proved to be more efficient in preventing biofilm formation by Staphylococcus aureus. From esculetin a 3rd-generation list of compounds was predicted. One of them, fisetin, was even better at abolishing biofilm formation than the two parent...

  4. Spatial fingerprints of community structure in human interaction network for an extensive set of large-scale regions.

    Directory of Open Access Journals (Sweden)

    Zsófia Kallus

    Full Text Available Human interaction networks inferred from country-wide telephone activity recordings were recently used to redraw political maps by projecting their topological partitions into geographical space. The results showed remarkable spatial cohesiveness of the network communities and a significant overlap between the redrawn and the administrative borders. Here we present a similar analysis based on one of the most popular online social networks, represented by the ties between more than 5.8 million of its geo-located users. The worldwide coverage of their measured activity allowed us to analyze the large-scale regional subgraphs of entire continents and an extensive set of examples for single countries. We present results for North and South America, Europe and Asia. In our analysis we used the well-established method of modularity clustering after an aggregation of the individual links into a weighted graph connecting equal-area geographical pixels. Our results show fingerprints of both opposing forces: the dividing effect of local conflicts and the uniting, cross-cultural trends of globalization.
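As an aside on the method named in the abstract, the sketch below computes the Newman-Girvan modularity score Q that modularity clustering maximizes, on a toy weighted graph standing in for the aggregated "pixel" graph. The graph, node names and partitions are invented for illustration; real analyses at this scale rely on optimized community-detection libraries.

```python
def modularity(edges, partition):
    """Newman-Girvan modularity Q of a node partition on a weighted,
    undirected graph; `edges` maps (u, v) pairs (each edge once) to weights."""
    m = sum(edges.values())                      # total edge weight
    deg = {}                                     # weighted node degrees
    for (u, v), w in edges.items():
        deg[u] = deg.get(u, 0.0) + w
        deg[v] = deg.get(v, 0.0) + w
    # fraction of edge weight falling inside communities...
    q = sum(w for (u, v), w in edges.items() if partition[u] == partition[v]) / m
    # ...minus the expected fraction under random attachment
    comm_deg = {}
    for node, d in deg.items():
        c = partition[node]
        comm_deg[c] = comm_deg.get(c, 0.0) + d
    return q - sum((d / (2 * m)) ** 2 for d in comm_deg.values())

# toy "pixel graph": two tight clusters joined by one weak cross-border link
edges = {
    (1, 2): 1.0, (1, 3): 1.0, (2, 3): 1.0,
    (4, 5): 1.0, (4, 6): 1.0, (5, 6): 1.0,
    (3, 4): 0.5,
}
two_communities = {1: "A", 2: "A", 3: "A", 4: "B", 5: "B", 6: "B"}
one_community = {n: "A" for n in range(1, 7)}
```

Splitting the toy graph at its weak link yields Q ≈ 0.42, whereas lumping all nodes into one community gives Q = 0, which is why the optimizer recovers the two clusters.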

  5. Solving large sets of coupled equations iteratively by vector processing on the CYBER 205 computer

    International Nuclear Information System (INIS)

    Tolsma, L.D.

    1985-01-01

    The set of coupled linear second-order differential equations which has to be solved for the quantum-mechanical description of inelastic scattering of atomic and nuclear particles can be rewritten as an equivalent set of coupled integral equations. When a suitable type of function is used as piecewise analytic reference solution, the integrals that arise in this set can be evaluated analytically. The set of integral equations can then be solved iteratively. For the results mentioned here, an inward-outward iteration scheme has been applied. A concept for the vectorization of coupled-channel Fortran programs, based on this integral method, is presented for use on the Cyber 205 computer. It turns out that, for two heavy-ion nuclear scattering test cases, this vector algorithm gives an overall speed-up of about a factor of 2 to 3 compared to a highly optimized scalar algorithm on a one-vector-pipeline computer.

  6. Accelerating Multiple Compound Comparison Using LINGO-Based Load-Balancing Strategies on Multi-GPUs.

    Science.gov (United States)

    Lin, Chun-Yuan; Wang, Chung-Hung; Hung, Che-Lun; Lin, Yu-Shiang

    2015-01-01

    Compound comparison is an important task in computational chemistry. From the comparison results, potential inhibitors can be found and then used in pharmaceutical experiments. The time complexity of a pairwise compound comparison is O(n^2), where n is the maximal length of the compounds. In general, the length of compounds is tens to hundreds of characters, and the computation time is small. However, more and more compounds have been synthesized and extracted, now numbering more than tens of millions. Therefore, comparing against a large set of compounds (the multiple compound comparison problem, abbreviated MCC) is still time-consuming. The intrinsic time complexity of the MCC problem is O(k^2 n^2) for k compounds of maximal length n. In this paper, we propose a GPU-based algorithm for the MCC problem, called CUDA-MCC, for single and multiple GPUs. Four LINGO-based load-balancing strategies are considered in CUDA-MCC in order to accelerate the computation among thread blocks on GPUs. CUDA-MCC was implemented in C+OpenMP+CUDA. CUDA-MCC achieved speed-ups of 45 times and 391 times over its CPU version on a single NVIDIA Tesla K20m GPU card and a dual-NVIDIA Tesla K20m GPU card, respectively, in our experiments.
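For readers unfamiliar with LINGO profiles, the sketch below implements one common variant of LINGO similarity (Tanimoto over the multiset of 4-character SMILES substrings) together with the naive O(k^2 n^2) all-pairs loop that CUDA-MCC parallelizes. The function names are our own, and the paper's exact scoring and GPU kernels may differ; the original method also canonicalizes SMILES first, which is omitted here.

```python
from collections import Counter

def lingos(smiles, q=4):
    # LINGOs: the multiset of overlapping q-character substrings of a SMILES
    # string (SMILES canonicalization/normalization omitted for brevity)
    return Counter(smiles[i:i + q] for i in range(len(smiles) - q + 1))

def lingo_tanimoto(a, b, q=4):
    # Tanimoto similarity over the two LINGO multisets
    la, lb = lingos(a, q), lingos(b, q)
    union = sum((la | lb).values())
    return sum((la & lb).values()) / union if union else 1.0

def multiple_compound_comparison(smiles_list):
    # naive O(k^2) pairwise loop over k compounds; this is the workload
    # that CUDA-MCC distributes across GPU thread blocks
    return {(i, j): lingo_tanimoto(smiles_list[i], smiles_list[j])
            for i in range(len(smiles_list))
            for j in range(i + 1, len(smiles_list))}
```

For example, phenol-like and aniline-like SMILES strings share most of their 4-character substrings and score around 0.71, while unrelated strings score near 0.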

  7. Synthesis and structural characterization of actinide oxalate compounds

    International Nuclear Information System (INIS)

    Tamain, C.

    2011-01-01

    Oxalic acid is a well-known reagent for recovering actinides, thanks to the very low solubility of An(IV) and An(III) oxalate compounds in acidic solution. Therefore, considering mixed-oxide fuel or the incorporation of minor actinides in ceramic fuel materials for transmutation, oxalic co-conversion is a convenient way to synthesize mixed oxalate compounds, precursors of oxide solid solutions. As the existing oxalate single-crystal syntheses are not adaptable to actinide-oxalate chemistry or to the handling constraints in glove boxes, several original crystal growth methods were developed. They were first validated and optimized on lanthanides and uranium before being applied to transuranium elements. The advanced investigations allowed a better understanding of the syntheses and the definition of optimized chemical conditions to promote crystal growth. These new crystal growth methods were then applied to a large number of mixed An1(IV)-An2(III) or An1(IV)-An2(IV) systems and led to the formation of the first original mixed An1(IV)-An2(III) and An1(IV)-An2(IV) oxalate single crystals. Finally, thanks to the first thorough structural characterizations of these compounds by single-crystal X-ray diffraction, EXAFS and micro-Raman spectroscopy, the particularly sparse structural database of actinide-oxalate compounds is enriched, which is essential for the nuclear fuel cycles under study. (author) [fr

  8. Patient data and patient rights: Swiss healthcare stakeholders' ethical awareness regarding large patient data sets - a qualitative study.

    Science.gov (United States)

    Mouton Dorey, Corine; Baumann, Holger; Biller-Andorno, Nikola

    2018-03-07

    There is a growing interest in aggregating more biomedical and patient data into large health data sets for research and public benefits. However, collecting and processing patient data raises new ethical issues regarding patients' rights, social justice and trust in public institutions. The aim of this empirical study is to gain an in-depth understanding of the awareness of possible ethical risks and corresponding obligations among those who are involved in projects using patient data, i.e. healthcare professionals, regulators and policy makers. We used a qualitative design to examine Swiss healthcare stakeholders' experiences and perceptions of ethical challenges with regard to patient data in real-life settings where clinical registries are sponsored, created and/or used. Semi-structured interviews were carried out with 22 participants (11 physicians, 7 policy-makers, 4 ethics committee members) between July 2014 and January 2015. The interviews were audio-recorded, transcribed, coded and analysed using a thematic method derived from Grounded Theory. All interviewees were concerned as a matter of priority with the need for legal and operating norms for the collection and use of data, whereas less interest was shown in issues regarding patient agency, the need for reciprocity, and shared governance in the management and use of clinical registries' patient data. This observed asymmetry highlights a possible tension between public and research interests on the one hand, and the recognition of patients' rights and citizens' involvement on the other. Advocating further health-related data sharing on the grounds of research and public interest, without due regard for the perspective of patients and donors, could run the risk of fostering distrust towards healthcare data collections. Ultimately, this could diminish the expected social benefits. However, rather than setting patient rights against public interest, new ethical approaches could strengthen both

  9. A Fast Logdet Divergence Based Metric Learning Algorithm for Large Data Sets Classification

    Directory of Open Access Journals (Sweden)

    Jiangyuan Mei

    2014-01-01

    the basis of classifiers, for example, the k-nearest neighbors classifier. Experiments on benchmark data sets demonstrate that the proposed algorithm compares favorably with the state-of-the-art methods.
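The truncated abstract above pairs a learned distance metric with a k-nearest-neighbors classifier. As an illustration of that final classification step only, the sketch below runs kNN under a fixed Mahalanobis matrix M; the matrix, data and labels are invented stand-ins for a metric actually learned by minimizing a LogDet divergence (the learning step itself is not shown).

```python
def mahalanobis_sq(x, y, M):
    # squared distance (x - y)^T M (x - y) under a metric matrix M
    d = [a - b for a, b in zip(x, y)]
    return sum(d[i] * M[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))

def knn_predict(x, train, M, k=3):
    # k-nearest-neighbours majority vote using the learned metric
    nearest = sorted(train, key=lambda t: mahalanobis_sq(x, t[0], M))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# a hand-set PSD matrix standing in for one learned by LogDet-divergence
# minimization: it stretches the informative first feature and shrinks
# the noisy second one
M = [[10.0, 0.0], [0.0, 0.1]]
train = [((0.0, 0.0), "active"), ((0.1, 9.0), "active"), ((0.05, -8.0), "active"),
         ((1.0, 0.0), "inactive"), ((0.9, 9.0), "inactive"), ((1.1, -8.0), "inactive")]
```

Under the plain Euclidean metric the noisy second coordinate would dominate; under M the query (0.1, 5.0) is correctly voted "active".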

  10. Studies on in vitro biostability and blood compatibility of polyurethane potting compound based on aromatic polymeric MDI for extracorporeal devices.

    Science.gov (United States)

    Hridya, V K; Jayabalan, M

    2009-12-01

    Polyurethane potting compound based on an aromatic isocyanurate of polymeric MDI, polypropylene glycol (PPG400) and trimethylol propane (TMP) has significantly favourable properties, good pot life and setting characteristics. The cured potting compound of this formulation has appreciable thermal stability and mechanical properties. The in vitro biostability of the cured potting compound has been found to be excellent, without any significant degradation in simulated physiological media and chemical environments. Studies on blood-material interaction and cytotoxicity reveal the in vitro blood compatibility and cell compatibility of this potting compound.

  11. Setting Standards and Primary School Teachers' Experiences of the Process

    Science.gov (United States)

    Scherman, Vanessa; Zimmerman, Lisa; Howie, Sarah J.; Bosker, Roel

    2014-01-01

    In South Africa, very few standard-setting exercises are carried out in education and, if they are, teachers are not involved in their execution. As a result, there is no clear understanding of what the standard is and how it was arrived at. This situation is compounded when teachers are held accountable when learners do not meet the prescribed…

  12. Daily intake estimation of phenolic compounds in the Spanish population

    Directory of Open Access Journals (Sweden)

    Inma Navarro González

    2017-12-01

    Full Text Available Introduction: Phenolic compounds are a large group of molecules present in plants with a diversity of chemical structures and biological activity. The objective of this study was to quantify the intake of phenolic compounds of the Spanish population. Material and Methods: The most consumed foods of vegetal origin in Spain were selected. These were taken from the National Survey of Spanish Dietary Intake (ENIDE) of 2011, edited by AESAN (Spanish Agency for Food Safety and Nutrition), as a basis for quantifying the intake of phenolic compounds of Spaniards using the Phenol-Explorer database. Results: This database allowed an estimate of the average daily intake of polyphenols of Spaniards, which is 1365.1 mg. Conclusions: The average intake of total polyphenols of Spaniards could have a protective effect against the mortality rate and play a preventive role in some chronic diseases, along with other healthy lifestyle habits.
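The estimation method described (combining per-food consumption with per-food polyphenol content) reduces to a weighted sum. The sketch below illustrates it with invented numbers that merely stand in for ENIDE consumption data and Phenol-Explorer content values.

```python
# hypothetical per-capita consumption (g/day, standing in for ENIDE data) and
# polyphenol contents (mg per 100 g, standing in for Phenol-Explorer values)
consumption_g_per_day = {"coffee": 200.0, "apples": 80.0, "red_wine": 50.0}
polyphenols_mg_per_100g = {"coffee": 214.0, "apples": 136.0, "red_wine": 101.0}

# intake = sum over foods of consumption x content (unit-converted to mg/day)
daily_intake_mg = sum(
    consumption_g_per_day[food] * polyphenols_mg_per_100g[food] / 100.0
    for food in consumption_g_per_day
)
```

With these illustrative figures the estimate comes to 587.3 mg/day; the study's 1365.1 mg figure results from applying the same sum over the full set of surveyed foods.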

  13. Compound-complex odontoma: A case report of a rare variant

    Directory of Open Access Journals (Sweden)

    Nishath Khanum

    2014-01-01

    Full Text Available The odontoma is a benign tumor containing all the various component tissues of the teeth. It is the most common odontogenic tumor, representing 67% of all odontogenic tumors. Odontomas are considered to be developmental anomalies (hamartomas) rather than true neoplasms. Based on the degree of morphodifferentiation, or on the basis of their resemblance to normal teeth, they are divided into compound and complex odontomas. The compound odontoma is composed of multiple, small tooth-like structures. The complex odontoma consists of a conglomerate mass of enamel and dentin, which bears no anatomic resemblance to a tooth. They are usually diagnosed on routine radiological examinations in the second decade of life and are often slow growing and non-aggressive in nature. Here, we report a case of a rare, unusually large compound-complex odontoma located in the left anterior maxilla of a 13-year-old male patient.

  14. Potential of Fruit Wastes as Natural Resources of Bioactive Compounds

    Directory of Open Access Journals (Sweden)

    Wen-Hua Ling

    2012-07-01

    Full Text Available Fruit wastes are one of the main sources of municipal waste. In order to explore the potential of fruit wastes as natural resources of bioactive compounds, the antioxidant potency and total phenolic contents (TPC) of the lipophilic and hydrophilic components in the wastes (peel and seed) of 50 fruits were systematically evaluated. The results showed that different fruit residues had diverse antioxidant potency, and the variation was very large. Furthermore, the main bioactive compounds were identified and quantified; catechin, cyanidin 3-glucoside, epicatechin, galangin, gallic acid, homogentisic acid, kaempferol, and chlorogenic acid were widely found in these residues. Notably, the values of ferric-reducing antioxidant power (FRAP), trolox equivalent antioxidant capacity (TEAC) and TPC in the residues were higher than in the pulps. The results showed that fruit residues could be inexpensive and readily available resources of bioactive compounds for use in the food and pharmaceutical industries.

  15. Neutral Red versus MTT assay of cell viability in the presence of copper compounds.

    Science.gov (United States)

    Gomez Perez, Mariela; Fourcade, Lyvia; Mateescu, Mircea Alexandru; Paquin, Joanne

    2017-10-15

    Copper is essential for numerous physiological functions, and copper compounds may display therapeutic as well as cytotoxic effects. The MTT (3-(4,5-dimethyl-2-thiazolyl)-2,5-diphenyl-2H-tetrazolium bromide) assay is a standard test largely used in cytotoxicity studies. This report shows that low micromolar levels of copper compounds such as Cu(II)Urea2, Cu(II)Ser2 and CuCl2 can interfere with the MTT assay, making detection of the formazan product of MTT reduction unreliable. Comparatively, the Neutral Red assay appears to be sensitive and shows no interference with these compounds. The alternative lactate dehydrogenase assay cannot be used because of the inhibitory effect of these copper compounds on the enzyme activity. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Set-oriented data mining in relational databases

    NARCIS (Netherlands)

    Houtsma, M.A.W.; Swami, Arun

    1995-01-01

    Data mining is an important real-life application for businesses. It is critical to find efficient ways of mining large data sets. In order to benefit from the experience with relational databases, a set-oriented approach to mining data is needed. In such an approach, the data mining operations are

  17. Vitali systems in R^n with irregular sets

    DEFF Research Database (Denmark)

    Mejlbro, Leif; Topsøe, Flemming

    1996-01-01

    Vitali type theorems are results stating that out of a given family of sets one can select pairwise disjoint sets which fill out a "large" region. Usually one works with "regular" sets such as balls. We shall establish results with sets of a more complicated geometrical structure, e.g., Cantor-like sets are allowed. The results are related to a generalisation of the classical notion of a differentiation basis. They concern real n-space R^n and Lebesgue measure.

  18. Speciation of anthropogenic emissions of non-methane volatile organic compounds: a global gridded data set for 1970–2012

    Directory of Open Access Journals (Sweden)

    G. Huang

    2017-06-01

    Full Text Available Non-methane volatile organic compounds (NMVOCs) include a large number of chemical species which differ significantly in their chemical characteristics and thus in their impacts on ozone and secondary organic aerosol formation. It is important that chemical transport models (CTMs) simulate the chemical transformation of the different NMVOC species in the troposphere consistently. In most emission inventories, however, only total NMVOC emissions are reported, which need to be decomposed into classes to fit the requirements of CTMs. For instance, the Emissions Database for Global Atmospheric Research (EDGAR) provides spatially resolved global anthropogenic emissions of total NMVOCs. In this study the EDGAR NMVOC inventory was revised and extended in time and in sectors. Moreover, the new version of the NMVOC emission data in the EDGAR database was disaggregated, on a detailed sector resolution, into individual species or species groups, thus enhancing the usability of the NMVOC emission data by the modelling community. Region- and source-specific speciation profiles of NMVOC species or species groups are compiled and mapped to EDGAR processes (detailed resolution of sectors), with corresponding quality codes specifying the quality of the mapping. Individual NMVOC species in the different profiles are aggregated into 25 species groups, in line with the common classification of the Global Emissions Initiative (GEIA). Global annual grid maps with a resolution of 0.1° × 0.1° for the period 1970–2012 are produced by sector and species. Furthermore, trends in NMVOC composition are analysed, taking road transport and residential sources in Germany and the United Kingdom (UK) as examples.
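The disaggregation step described above amounts to multiplying each sector's total NMVOC emission by the mass fractions of its speciation profile and summing per species group. A minimal sketch with invented totals and profiles (real EDGAR profiles are region- and sector-specific and cover 25 GEIA groups):

```python
def speciate(sector_totals, profiles):
    """Disaggregate total NMVOC emissions per sector into species groups by
    multiplying each sector total by its profile's mass fractions."""
    by_group = {}
    for sector, total in sector_totals.items():
        for group, fraction in profiles[sector].items():
            by_group[group] = by_group.get(group, 0.0) + total * fraction
    return by_group

# hypothetical sector totals (kt NMVOC/yr) and speciation profiles
totals = {"road_transport": 120.0, "residential": 80.0}
profiles = {
    "road_transport": {"alkanes": 0.40, "aromatics": 0.35, "alkenes": 0.25},
    "residential": {"alkanes": 0.30, "aromatics": 0.20, "oxygenated": 0.50},
}
```

Because each profile's fractions sum to one, the speciated emissions always add back up to the original totals, which is a useful mass-balance check on the mapping.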

  19. Immense Essence of Excellence: Marine Microbial Bioactive Compounds

    OpenAIRE

    Ira Bhatnagar; Se-Kwon Kim

    2010-01-01

    Oceans have borne most of the biological activities on our planet. A number of biologically active compounds with varying degrees of action, such as anti-tumor, anti-cancer, anti-microtubule, anti-proliferative, cytotoxic, photo protective, as well as antibiotic and antifouling properties, have been isolated to date from marine sources. The marine environment also represents a largely unexplored source for isolation of new microbes (bacteria, fungi, actinomycetes, microalgae-cyanobacteria and...

  20. Benchmarking of protein descriptor sets in proteochemometric modeling (part 2): modeling performance of 13 amino acid descriptor sets

    Science.gov (United States)

    2013-01-01

    Background While a large body of work exists on comparing and benchmarking descriptors of molecular structures, a similar comparison of protein descriptor sets is lacking. Hence, in the current work a total of 13 amino acid descriptor sets have been benchmarked with respect to their ability to establish bioactivity models. The descriptor sets included in the study are Z-scales (3 variants), VHSE, T-scales, ST-scales, MS-WHIM, FASGAI, BLOSUM, a novel protein descriptor set (termed ProtFP (4 variants)), and in addition we created and benchmarked three pairs of descriptor combinations. Prediction performance was evaluated in seven structure-activity benchmarks which comprise Angiotensin Converting Enzyme (ACE) dipeptidic inhibitor data, and three proteochemometric data sets, namely (1) GPCR ligands modeled against a GPCR panel, (2) enzyme inhibitors (NNRTIs) with associated bioactivities against a set of HIV enzyme mutants, and (3) enzyme inhibitors (PIs) with associated bioactivities on a large set of HIV enzyme mutants. Results The amino acid descriptor sets compared here show similar performance, while errors for individual proteins can be larger than the differences resulting from descriptor set choice ( > 0.3 log units RMSE difference and >0.7 difference in MCC). Combining different descriptor sets generally leads to better modeling performance than utilizing individual sets. The best performers were Z-scales (3) combined with ProtFP (Feature), or Z-Scales (3) combined with an average Z-Scale value for each target, while ProtFP (PCA8), ST-Scales, and ProtFP (Feature) rank last. Conclusions While amino acid descriptor sets capture different aspects of amino acids, their ability to be used for bioactivity modeling is still – on average – surprisingly similar. Still, combining sets describing complementary information consistently leads to small but consistent improvements in modeling performance (average MCC 0.01 better, average RMSE 0.01 log units lower). Finally, performance differences exist between the targets compared, thereby underlining that
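The benchmark quantifies descriptor-set quality by RMSE on predicted activities (in log units) and by the Matthews correlation coefficient (MCC) on binarized activity labels. A minimal sketch of both metrics (the paper's exact evaluation protocol is not reproduced):

```python
import math

def rmse(predicted, observed):
    # root-mean-square error of predicted vs. observed activities (log units)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

def mcc(predicted, observed):
    # Matthews correlation coefficient for binary labels (1 active, 0 inactive)
    tp = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 1)
    tn = sum(1 for p, o in zip(predicted, observed) if p == 0 and o == 0)
    fp = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 0)
    fn = sum(1 for p, o in zip(predicted, observed) if p == 0 and o == 1)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0
```

MCC ranges from -1 to 1 (1 = perfect prediction, 0 = no better than chance), which is why a difference of 0.7 between targets, as reported above, is substantial.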

  1. The formation of lithium diarylargentates from arylsilver compounds and the corresponding aryllithium compounds

    NARCIS (Netherlands)

    Blenkers, J.; Hofstee, H.K.; Boersma, J.; Kerk, G.J.M. van der

    1979-01-01

    Diarylsilverlithium compounds of the type Ar2AgLi are formed by treating arylsilver compounds with the corresponding aryllithium compounds. Cryoscopy in benzene shows that the Ar2AgLi compounds are associated into dimers. NMR spectroscopic data indicate that only one type of aryl group is present in

  2. Setting up fuel supply strategies for large-scale bio-energy projects using agricultural and forest residues. A methodology for developing countries

    International Nuclear Information System (INIS)

    Junginger, M.

    2000-08-01

    The objective of this paper is to develop a coherent methodology for setting up fuel supply strategies for large-scale biomass-conversion units. This method explicitly takes risks and uncertainties regarding availability and costs in relation to time into account. This paper aims at providing general guidelines, which are not country-specific. These guidelines cannot provide 'perfect fit' solutions, but aim to give general help to overcome barriers and to set up supply strategies. It will mainly focus on residues from the agricultural and forestry sector. This study focuses on electricity production or combined heat and power (CHP) production with plant scales between 10 and 40 MWe. This range is chosen due to the rules of economies of scale. In large-scale plants the benefits of increased efficiency outweigh increased transportation costs, allowing a lower price per kWh, which in turn may allow higher biomass costs. However, fuel-supply risks tend to get higher with increasing plant size, which makes it more important to assess them for large(r) conversion plants. Although the methodology does not focus on a specific conversion technology, it should be stressed that the technology must be able to handle a wide variety of biomass fuels with different characteristics, because many biomass residues are not available year-round and various fuels are needed for a constant supply. The methodology allows for comparing different technologies (with known investment and operation and maintenance costs from the literature) and evaluation for different fuel supply scenarios. In order to demonstrate the methodology, a case study was carried out for the north-eastern part of Thailand (Isaan), an agricultural region. The research was conducted in collaboration with the Regional Wood Energy Development Programme in Asia (RWEDP), a project of the UN Food and Agricultural Organization (FAO) in Bangkok, Thailand. In Section 2 of this paper the methodology will be presented. In Section 3 the economic

  3. An experimental study of praseodymium intermetallic compounds at low temperatures

    International Nuclear Information System (INIS)

    Greidanus, F.J.A.M.

    1982-01-01

    In this thesis the author studies the low-temperature properties of praseodymium intermetallic compounds. In chapter 2 some of the techniques used for the experiments described in the subsequent chapters are discussed. A set-up to perform specific-heat experiments below 1 K and a technique for performing magnetic susceptibility measurements below 1 K, using a superconducting quantum interference device (SQUID), are described. Chapter 3 is devoted to the theory of interacting Pr3+ ions. Both bilinear and biquadratic interactions are dealt with in a molecular-field approximation. It is shown that first- as well as second-order phase transitions can occur, depending on the nature of the ground state and on the ratio of magnetic to crystal-field interactions. In chapters 4, 5, 6 and 7 experimental results on the cubic Laves-phase compounds PrRh2, PrIr2, PrPt2, PrRu2 and PrNi2 are presented. From inelastic neutron scattering experiments the crystalline-electric-field parameters of the above compounds are determined. In chapters 5 and 6 susceptibility, neutron-diffraction, hyperfine specific-heat, low-field magnetization, pulsed-field magnetization, specific-heat and resistivity measurements are presented. In chapter 7 the specific heat and differential susceptibility of PrNi2 below 1 K are studied. Finally, in chapter 8 praseodymium intermetallic compounds with low-symmetry singlet ground states, and cubic compounds with magnetic doublet ground states, are studied. (Auth.)

  4. The volatile compound BinBase mass spectral database.

    Science.gov (United States)

    Skogerson, Kirsten; Wohlgemuth, Gert; Barupal, Dinesh K; Fiehn, Oliver

    2011-08-04

    The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement.

  5. The volatile compound BinBase mass spectral database

    Directory of Open Access Journals (Sweden)

    Barupal Dinesh K

    2011-08-01

    ://vocbinbase.fiehnlab.ucdavis.edu. Conclusions The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement.

  6. Altered transport and metabolism of phenolic compounds in obesity and diabetes: implications for functional food development and assessment

    Science.gov (United States)

    Interest in application of phenolic compounds from diet or supplements for prevention of chronic diseases has grown significantly, but efficacy of such approaches in humans is largely dependent on the bioavailability and metabolism of these compounds. While food and dietary factors have been the foc...

  7. Generation of standard gas mixtures of halogenated, aliphatic, and aromatic compounds and prediction of the individual output rates based on molecular formula and boiling point.

    Science.gov (United States)

    Thorenz, Ute R; Kundel, Michael; Müller, Lars; Hoffmann, Thorsten

    2012-11-01

    In this work, we describe a simple diffusion capillary device for the generation of various organic test gases. Using a set of basic equations, the output rate of the test gas devices can easily be predicted based only on the molecular formula and the boiling point of the compounds of interest. Since these parameters are easily accessible for a large number of potential analytes, even for those compounds which are typically not listed in physico-chemical handbooks or internet databases, the adjustment of the test gas source to the concentration range required for the individual analytical application is straightforward. The agreement of the predicted and measured values is shown to be valid for different groups of chemicals, such as halocarbons, alkanes, alkenes, and aromatic compounds, and for different dimensions of the diffusion capillaries. The limits of the predictability of the output rates are explored and observed to result in an underprediction of the output rates when very thin capillaries are used. It is demonstrated that pressure variations are responsible for the observed deviation of the output rates. To overcome the influence of pressure variations and at the same time establish a suitable test gas source for highly volatile compounds, the usability of permeation sources is also explored, for example for the generation of molecular bromine test gases.
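The basic equations are not reproduced in this record, but the standard diffusion-tube relation they build on can be sketched. The snippet below is a minimal illustration, not the authors' implementation: it estimates vapor pressure from the boiling point via Trouton's rule and the Clausius-Clapeyron equation, then applies the usual capillary emission-rate formula. The toluene-like numbers, the capillary geometry, and the diffusion coefficient D are assumed example values.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def vapor_pressure(T, Tb):
    """Rough vapor pressure (Pa) at temperature T from the normal boiling
    point Tb: Trouton's rule (dHvap ~ 88 J mol^-1 K^-1 * Tb) plugged into
    the Clausius-Clapeyron equation, assuming dHvap is constant."""
    dHvap = 88.0 * Tb
    return 101325.0 * math.exp(-dHvap / R * (1.0 / T - 1.0 / Tb))

def diffusion_output_rate(T, Tb, M, D, A, L, P=101325.0):
    """Steady-state emission rate (g s^-1) of a diffusion capillary:
    Q = D * P * M * A / (R * T * L) * ln(P / (P - p)),
    with diffusion coefficient D (m^2 s^-1), cross-section A (m^2),
    diffusion path length L (m), molar mass M (g mol^-1), total
    pressure P (Pa), and vapor pressure p estimated as above."""
    p = vapor_pressure(T, Tb)
    return D * P * M * A / (R * T * L) * math.log(P / (P - p))

# Toluene-like example (hypothetical geometry): Tb = 384 K, M = 92 g/mol,
# D ~ 8e-6 m^2/s, capillary of 1 mm diameter and 5 cm length, at 298 K.
A = math.pi * (0.5e-3) ** 2
print(f"{diffusion_output_rate(298.0, 384.0, 92.0, 8e-6, A, 0.05):.3e} g/s")
```

The predicted rate is in the tens-of-nanograms-per-second range, a plausible magnitude for such sources, but the numbers here only illustrate the functional form.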

  8. Prediction of Human Intestinal Absorption of Compounds Using Artificial Intelligence Techniques.

    Science.gov (United States)

    Kumar, Rajnish; Sharma, Anju; Siddiqui, Mohammed Haris; Tiwari, Rajesh Kumar

    2017-01-01

    Information about the pharmacokinetics of compounds is an essential component of drug design and development. Modeling pharmacokinetic properties requires identification of the factors affecting the absorption, distribution, metabolism and excretion of compounds. There have been continuous attempts to predict the intestinal absorption of compounds using various artificial intelligence methods in an effort to reduce the attrition rate of drug candidates entering preclinical and clinical trials. Currently, there are large numbers of individual predictive models available for absorption using machine learning approaches. Six artificial intelligence methods, namely Support vector machine, k-nearest neighbor, Probabilistic neural network, Artificial neural network, Partial least squares and Linear discriminant analysis, were used for prediction of the absorption of compounds. The prediction accuracy of Support vector machine, k-nearest neighbor, Probabilistic neural network, Artificial neural network, Partial least squares and Linear discriminant analysis for prediction of intestinal absorption of compounds was found to be 91.54%, 88.33%, 84.30%, 86.51%, 79.07% and 80.08%, respectively. Comparative analysis of all six prediction models suggested that the Support vector machine with a Radial basis function based kernel is comparatively better for binary classification of compounds by human intestinal absorption and may be useful at preliminary stages of drug design and development.
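As an illustration of the simplest of the six methods, a k-nearest-neighbor classifier for a binary absorbed/not-absorbed label can be written in a few lines of pure Python. This is a generic sketch, not the study's models; the two-component descriptor vectors and all their values are made up for the example.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training
    points, using Euclidean distance on the descriptor vectors."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy descriptor vectors (hypothetical values, e.g. lipophilicity and
# polar surface area scaled to [0, 1]); 1 = well absorbed, 0 = poorly.
train = [
    ((0.8, 0.2), 1), ((0.7, 0.3), 1), ((0.9, 0.1), 1),
    ((0.2, 0.9), 0), ((0.1, 0.8), 0), ((0.3, 0.7), 0),
]
print(knn_classify(train, (0.75, 0.25)))  # query near the absorbed cluster
```

In practice the descriptor vectors would have many dimensions and k would be tuned by cross-validation; the voting logic stays the same.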

  9. Publicly available models to predict normal boiling point of organic compounds

    International Nuclear Information System (INIS)

    Oprisiu, Ioana; Marcou, Gilles; Horvath, Dragos; Brunel, Damien Bernard; Rivollet, Fabien; Varnek, Alexandre

    2013-01-01

    Quantitative structure–property models to predict the normal boiling point (Tb) of organic compounds were developed using non-linear ASNNs (associative neural networks) as well as multiple linear regression – ISIDA-MLR and SQS (stochastic QSAR sampler). Models were built on a diverse set of 2098 organic compounds with Tb varying in the range of 185–491 K. In ISIDA-MLR and ASNN calculations, fragment descriptors were used, whereas fragment, FPT (fuzzy pharmacophore triplet), and ChemAxon descriptors were employed in SQS models. Prediction quality of the models has been assessed in 5-fold cross-validation. Obtained models were implemented in the on-line ISIDA predictor at http://infochim.u-strasbg.fr/webserv/VSEngine.html

  10. Destruction of organochlorated compounds and CFCs by catalytic hydrodechloration; Destruccion de compuestos organoclorados y CFCs mediante hidrodecloracion catalitica

    Energy Technology Data Exchange (ETDEWEB)

    Ordonez Garcia, S.; Sastre Andres, H.; Diez Sanz, F. V.

    1998-12-01

    The destruction of organohalogenated compounds (for example chlorinated solvents, PCBs and CFCs) is a very serious environmental problem. Catalytic hydrodechlorination has been shown to be a potentially efficient method for the destruction of these compounds. In this technique the halogenated compound reacts with hydrogen, yielding a non-chlorinated compound (environmentally harmless) and hydrogen chloride. In this article, the different set-ups and catalysts employed in catalytic hydrodechlorination are described. Finally, some applications of this technique to the treatment of industrial effluents are discussed, such as the destruction of chlorinated solvents (such as trichloroethylene or tetrachloromethane), the conversion of CFCs into HCFCs, the destruction of PCBs, and the treatment of water polluted with chlorinated pesticides. (Author) 28 refs.

  11. The Amateurs' Love Affair with Large Datasets

    Science.gov (United States)

    Price, Aaron; Jacoby, S. H.; Henden, A.

    2006-12-01

    Amateur astronomers are professionals in other areas. They bring expertise from such varied and technical careers as computer science, mathematics, engineering, and marketing. These skills, coupled with an enthusiasm for astronomy, can be used to help manage the large data sets coming online in the next decade. We will show specific examples where teams of amateurs have been involved in mining large, online data sets and have authored and published their own papers in peer-reviewed astronomical journals. Using the proposed LSST database as an example, we will outline a framework for involving amateurs in data analysis and education with large astronomical surveys.

  12. The ambient dose equivalent at flight altitudes: a fit to a large set of data using a Bayesian approach

    International Nuclear Information System (INIS)

    Wissmann, F; Reginatto, M; Moeller, T

    2010-01-01

    The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large set of data and to compare the different models. About 2500 data points measured in the periods 1997-1999 and 2003-2006 were used. Since the data cover wide ranges of barometric altitude, vertical cut-off rigidity and phases in the solar cycle 23, we developed functions which depend on these three variables. Whereas the dependence on the vertical cut-off rigidity is described by an exponential, the dependences on barometric altitude and solar activity may be approximated by linear functions in the ranges under consideration. Therefore, a simple Taylor expansion was used to define different models and to investigate the relevance of the different expansion coefficients. With the method presented here, it is possible to obtain probability distributions for each expansion coefficient and thus to extract reliable uncertainties even for the dose rate evaluated. The resulting function agrees well with new measurements made at fixed geographic positions and during long haul flights covering a wide range of latitudes.
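The fitting step can be illustrated with a toy version of such a Taylor-expansion model: linear in barometric altitude and solar activity, exponential in vertical cut-off rigidity. The sketch below uses ordinary least squares on synthetic data with made-up coefficients (the paper instead uses Bayesian statistics and Monte Carlo methods to obtain full posterior distributions for the expansion coefficients), and the rigidity decay length is held fixed so that the remaining coefficients enter linearly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measurements": barometric altitude h (km), a solar-activity
# proxy s (arbitrary units), and vertical cut-off rigidity rc (GV).
n = 500
h = rng.uniform(8.0, 12.0, n)
s = rng.uniform(0.0, 1.0, n)
rc = rng.uniform(0.0, 17.0, n)

# True model (made-up coefficients): linear in h and s, exponential in rc.
true = 1.0 + 0.4 * (h - 10.0) - 0.5 * s + 3.0 * np.exp(-rc / 5.0)
y = true + rng.normal(0.0, 0.05, n)  # add measurement noise

# Least-squares fit of the expansion coefficients; the decay length
# (5 GV here) is fixed, unlike in the full Bayesian treatment.
X = np.column_stack([np.ones(n), h - 10.0, s, np.exp(-rc / 5.0)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 2))  # should recover roughly [1.0, 0.4, -0.5, 3.0]
```

With posteriors instead of point estimates, the same design matrix yields uncertainty bands on the evaluated dose rate, which is the main benefit of the Bayesian approach described above.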

  13. Semiconducting III-V compounds

    CERN Document Server

    Hilsum, C; Henisch, Heinz R

    1961-01-01

    Semiconducting III-V Compounds deals with the properties of III-V compounds as a family of semiconducting crystals and relates these compounds to the monatomic semiconductors silicon and germanium. Emphasis is placed on physical processes that are peculiar to III-V compounds, particularly those that combine boron, aluminum, gallium, and indium with phosphorus, arsenic, and antimony (for example, indium antimonide, indium arsenide, gallium antimonide, and gallium arsenide).Comprised of eight chapters, this book begins with an assessment of the crystal structure and binding of III-V compounds, f

  14. Chemical nature and distribution of calcium compounds in radiolucent gallstones

    International Nuclear Information System (INIS)

    Agarwal, D.K.; Choudhuri, G.; Kumar, J.

    1993-01-01

    A high failure rate for radiolucent cholesterol gallstones to dissolve with oral bile acids may be due to the presence of insoluble calcium compounds. Twenty sets of radiolucent gallstones, 7-20 mm in diameter, obtained from 20 patients undergoing cholecystectomy, were cut, and the outer surface, outer rim, middle portion, and central core areas were scanned for calcium by energy-dispersive X-ray microanalysis (EDX) and scanning electron microscopy (SEM). Scrapings from the four areas of each stone were analysed by infrared spectroscopy. A sample of the crushed stone was used for chemical estimation of cholesterol. 11 of the 20 cholesterol stones showed presence of calcium by EDX; the distribution was peripheral in 5, homogeneous in 4, and central in 2. The chemical compound was calcium bilirubinate in 10 and calcium carbonate in 8 stones. Calcium compounds are present in a high proportion of radiolucent gallstones considered suitable for chemodissolution by conventional criteria. Their unrecognized presence may explain the high failure rate of such stones to respond to medical therapy. 20 refs., 3 figs

  15. Great isotope effects in compounding of sodium isotopes by macrocyclic polyether

    International Nuclear Information System (INIS)

    Knoechel, A.; Wilken, R.D.

    1978-01-01

    Isotope effects appear in the compounding of the two sodium isotopes 24Na+ and 22Na+ with macrocyclic polyethers; their magnitude was determined for the 13 best-known polyethers. A radiometric process, exploiting the different half-lives of the nuclides used, was employed. To separate the complexed and uncomplexed species, these were distributed between water and chloroform. The isotope ratio in the chloroform phase was compared with the initial isotope ratio and the separation factor determined from this. When using crown ethers, there was significant enrichment of 24Na+ (large crown ethers), up to 3.1 ± 0.4% for 18-crown-6. The remarkably high results can be correlated by Bigeleisen's theory with other chemical conditions. There is a report on the first results of transferring these conditions to the H+/T+ system. (orig.) [de]

  16. Setting Priorities For Large Research Facility Projects Supported By the National Science Foundation

    National Research Council Canada - National Science Library

    2005-01-01

    ...) level has stalled in the face of a backlog of approved but unfunded projects. Second, the rationale and criteria used to select projects and set priorities among projects for MREFC funding have not been clearly and publicly articulated...

  17. P-matrix in the quark compound bag model

    International Nuclear Information System (INIS)

    Kalashnikova, Yu.S.; Narodetskij, I.M.; Veselov, A.I.

    1983-01-01

    The meaning of the P-matrix analysis is discussed within the quark compound bag (QCB) model. The most general version of this model is considered, including arbitrary coupling between the quark and hadronic channels and arbitrary smearing of the surface interaction region. The behaviour of P-matrix poles as functions of the matching radius r0 is discussed. In conclusion, the parameters are presented of an illustrative set of NN potentials that has been obtained from the P-matrix fit to experimental data

  18. Large magnetoresistance in Er7Rh3

    International Nuclear Information System (INIS)

    Sengupta, Kaushik; Sampathkumaran, E.V.

    2005-01-01

    The compound Er7Rh3 has been known to order antiferromagnetically below TN = 14 K, and to exhibit a change in the sign of the temperature coefficient of electrical resistivity (ρ) in the paramagnetic state around 120 K. Here we report the influence of an external magnetic field (H) on the ρ(T) behavior of this compound (1.8-300 K). While the ρ behavior in the paramagnetic state, qualitatively speaking, is found to be robust to the application of H, the magnitude of the magnetoresistance (MR) is significant for moderate applications of H, even at temperatures far above TN, untypical of metallic systems. In addition, large values are observed in the magnetically ordered state. (author)

  19. Theoretical study of thermopower behavior of LaFeO3 compound in high temperature region

    Science.gov (United States)

    Singh, Saurabh; Shastri, Shivprasad S.; Pandey, Sudhir K.

    2018-04-01

    The electronic structure and thermopower (α) behavior of the LaFeO3 compound were investigated by combining ab-initio electronic structure and Boltzmann transport calculations. An LSDA plus Hubbard U (U = 5 eV) calculation on the G-type anti-ferromagnetic (AFM) configuration gives an energy gap of ~2 eV, which is very close to the experimentally reported energy gap. The calculated values of the effective mass of holes (mh*) in the valence band (VB) are found to be ~4 times the effective mass of electrons (me*) in the conduction band (CB). The large effective masses of the holes are responsible for the large and positive thermopower exhibited by this compound. The calculated values of α using the BoltzTraP code are found to be large and positive in the 300-1200 K temperature range, which is in agreement with the experimentally reported data.

  20. Rheological Properties of Rubber Compounds with Finely Divided Carbon Additives

    Science.gov (United States)

    Shashok, Zh. S.; Prokopchuk, N. R.; Vishnevskii, K. V.; Krauklis, A. V.; Borisevich, K. O.; Borisevich, I. O.

    2018-01-01

    A study has been made of the influence of three different nanomaterials, the starting material and materials functionalized by amine and oxygen-containing groups, on the properties of elastomer compositions based on rubbers for special purposes. As the elastomer matrix, use was made in one case of a rubber compound based on BNKS-18 butadiene-nitrile rubber and, in the other, of a combination of two grades of butadiene-nitrile rubber (BNKS-18 + BNKS-28 in a 50:50 ratio), which differ in the amount of bound acrylonitrile. To determine the degree of interaction between the additives and the elastomer matrix, the authors carried out multiple tests of the rubber compounds. The indices of the plastoelastic properties of the rubber compounds and the qualitative characteristics of the distribution of the filler (elastic modulus at small deformation amplitudes and shear modulus under large deformation), as well as the difference in these indices (complex dynamic modulus), have been determined.

  1. Unexpected magnetism, and transport properties in mixed lanthanide compound

    Science.gov (United States)

    Pathak, Arjun; Gschneidner, Karl, Jr.; Pecharsky, Vitalij; Ames Laboratory Team

    For intelligent materials design it is desirable to have compounds which have multiple functionalities, such as a large magnetoresistance, ferromagnetic and ferrimagnetic states, and field-induced first-order metamagnetic transitions. Here, we discuss one such example where we have combined two lanthanide elements, Pr and Er, in Pr0.6Er0.4Al2. This compound exhibits multiple functionalities in magnetic fields between 1 and 40 kOe. It undergoes only a trivial ferrimagnetism to paramagnetism transition in zero magnetic field, but Pr0.6Er0.4Al2 exhibits a large positive magnetoresistance (MR) for H ≥ 40 kOe and a small but non-negligible negative MR for lower H upon field cooling from the paramagnetic state. These phenomena are attributed to the competition between the single-ion anisotropies of the Pr and Er ions coupled with the opposite nearest-neighbor and next-nearest-neighbor exchange interactions. This work was supported by the US Department of Energy, Office of Basic Energy Science, Division of Material Sciences and Engineering. The research was performed at the Ames Laboratory. The Ames Laboratory is operated by Iowa State University for the US D.

  2. Magnetoelastic couplings in the distorted diamond-chain compound azurite

    Science.gov (United States)

    Cong, Pham Thanh; Wolf, Bernd; Manna, Rudra Sekhar; Tutsch, Ulrich; de Souza, Mariano; Brühl, Andreas; Lang, Michael

    2014-05-01

    We present results of ultrasonic measurements on a single crystal of the distorted diamond-chain compound azurite Cu3(CO3)2(OH)2. Pronounced elastic anomalies are observed in the temperature dependence of the longitudinal elastic mode c22 which can be assigned to the relevant magnetic interactions in the system and their couplings to the lattice degrees of freedom. From a semiquantitative analysis of the magnetic contribution to c22 the magnetoelastic coupling G = ∂J2/∂εb can be estimated, where J2 is the intradimer coupling constant and εb the strain along the intrachain b axis. We find an exceptionally large coupling constant of |G| ~ 3650 K, highlighting an extraordinarily strong sensitivity of J2 against changes of the b-axis lattice parameter. These results are complemented by measurements of the hydrostatic pressure dependence of J2 by means of thermal expansion and magnetic susceptibility measurements performed both at ambient and finite hydrostatic pressure. We propose that a structural peculiarity of this compound, in which Cu2O6 dimer units are incorporated in an unusually stretched manner, is responsible for the anomalously large magnetoelastic coupling.

  3. Azo compounds as a family of organic electrode materials for alkali-ion batteries.

    Science.gov (United States)

    Luo, Chao; Borodin, Oleg; Ji, Xiao; Hou, Singyuk; Gaskell, Karen J; Fan, Xiulin; Chen, Ji; Deng, Tao; Wang, Ruixing; Jiang, Jianjun; Wang, Chunsheng

    2018-02-27

    Organic compounds are desirable for sustainable Li-ion batteries (LIBs), but the poor cycle stability and low power density limit their large-scale application. Here we report a family of organic compounds containing the azo group (N=N) for reversible lithiation/delithiation. Azobenzene-4,4'-dicarboxylic acid lithium salt (ADALS), with an azo group in the center of the conjugated structure, is used as a model azo compound to investigate the electrochemical behaviors and reaction mechanism of azo compounds. In LIBs, ADALS can provide a capacity of 190 mAh g-1 at 0.5 C (corresponding to a current density of 95 mA g-1) and still retain 90%, 71%, and 56% of the capacity when the current density is increased to 2 C, 10 C, and 20 C, respectively. Moreover, ADALS retains 89% of initial capacity after 5,000 cycles at 20 C with a slow capacity decay rate of 0.0023% per cycle, representing one of the best performances among all organic compounds. Superior electrochemical behavior of ADALS is also observed in Na-ion batteries, demonstrating that azo compounds are universal electrode materials for alkali-ion batteries. The highly reversible redox chemistry of azo compounds to alkali ions was confirmed by density-functional theory (DFT) calculations. It provides opportunities for developing sustainable batteries.

  4. Performance Evaluation of Frequency Transform Based Block Classification of Compound Image Segmentation Techniques

    Science.gov (United States)

    Selwyn, Ebenezer Juliet; Florinabel, D. Jemi

    2018-04-01

    Compound image segmentation plays a vital role in the compression of computer screen images. Computer screen images are images mixed with textual, graphical, or pictorial content. In this paper, we present a comparison of two transform-based block classification methods for compound images based on metrics like speed of classification, precision and recall rate. Block-based classification approaches normally divide the compound images into fixed-size non-overlapping blocks. Then a frequency transform like the Discrete Cosine Transform (DCT) or Discrete Wavelet Transform (DWT) is applied over each block. Mean and standard deviation are computed for each 8 × 8 block and are used as a feature set to classify blocks of the compound images into text/graphics and picture/background. The classification accuracy of block-classification-based segmentation techniques is measured by evaluation metrics like precision and recall rate. Compound images with smooth backgrounds and complex-background images containing text of varying size, colour and orientation are considered for testing. Experimental evidence shows that DWT-based segmentation provides a significant improvement in recall rate and precision rate, approximately 2.3%, over DCT-based segmentation, with an increase in block classification time for both smooth and complex background images.
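The block-feature step described above can be sketched directly on raw pixels: split the image into 8 × 8 blocks, compute each block's mean and standard deviation, and threshold. The threshold value and the pixel-domain features here are simplifications for illustration; the paper derives the features from DCT/DWT coefficients of each block.

```python
import numpy as np

def classify_blocks(img, block=8, std_threshold=40.0):
    """Split a grayscale image into non-overlapping 8x8 blocks and label
    each block 'text' (high local contrast) or 'picture/background'
    (low contrast) from the block's standard deviation. The threshold
    is a made-up value chosen for this synthetic example."""
    h, w = img.shape
    labels = {}
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            blk = img[r:r + block, c:c + block].astype(float)
            labels[(r, c)] = ("text" if blk.std() > std_threshold
                              else "picture/background")
    return labels

# Synthetic 8x16 image: the left block is a hard black/white checkerboard
# (text-like edges), the right block is flat gray (background-like).
img = np.full((8, 16), 128, dtype=np.uint8)
img[:, :8] = np.indices((8, 8)).sum(axis=0) % 2 * 255
out = classify_blocks(img)
print(out[(0, 0)], out[(0, 8)])  # → text picture/background
```

Replacing `blk` with its transform coefficients before taking mean and standard deviation turns this into the DCT/DWT variants the paper compares.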

  5. Accelerating Multiple Compound Comparison Using LINGO-Based Load-Balancing Strategies on Multi-GPUs

    Directory of Open Access Journals (Sweden)

    Chun-Yuan Lin

    2015-01-01

    Compound comparison is an important task for computational chemistry. From the comparison results, potential inhibitors can be found and then used in pharmacy experiments. The time complexity of a pairwise compound comparison is O(n²), where n is the maximal length of the compounds. In general, the length of compounds is tens to hundreds, and the computation time is small. However, more and more compounds have been synthesized and extracted, now numbering more than tens of millions. Therefore, it is still time-consuming to compare a large number of compounds (seen as a multiple compound comparison problem, abbreviated MCC). The intrinsic time complexity of the MCC problem is O(k²n²) with k compounds of maximal length n. In this paper, we propose a GPU-based algorithm for the MCC problem, called CUDA-MCC, on single and multiple GPUs. Four LINGO-based load-balancing strategies are considered in CUDA-MCC in order to accelerate the computation speed among thread blocks on GPUs. CUDA-MCC was implemented in C+OpenMP+CUDA. CUDA-MCC ran 45 times and 391 times faster than its CPU version on a single NVIDIA Tesla K20m GPU card and a dual NVIDIA Tesla K20m GPU card, respectively, in the experimental results.
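The LINGO comparison itself is simple to sketch in plain Python: LINGOs are overlapping q-character substrings of a SMILES string, and two compounds are compared by a Tanimoto ratio over these substring multisets. The SMILES normalization step of real LINGO implementations is omitted here, and the example molecules are arbitrary; CUDA-MCC performs the same k × k pairwise comparison in parallel across GPU thread blocks.

```python
from collections import Counter

def lingos(smiles, q=4):
    """Multiset of overlapping q-character substrings (LINGOs) of a
    SMILES string. Real implementations also normalize the SMILES
    first (e.g. mapping ring-closure digits to '0'); skipped here."""
    return Counter(smiles[i:i + q] for i in range(len(smiles) - q + 1))

def lingo_tanimoto(a, b):
    """Tanimoto similarity of two LINGO multisets:
    |intersection| / |union| over substring counts."""
    la, lb = lingos(a), lingos(b)
    inter = sum((la & lb).values())
    union = sum((la | lb).values())
    return inter / union if union else 0.0

# Pairwise comparison of a small compound list (arbitrary SMILES).
mols = ["CCCCO", "CCCCCO", "c1ccccc1O", "c1ccccc1"]
for i in range(len(mols)):
    for j in range(i + 1, len(mols)):
        print(mols[i], mols[j], round(lingo_tanimoto(mols[i], mols[j]), 2))
```

Each pair is independent, which is what makes the O(k²) comparison grid easy to distribute across GPU thread blocks once a load-balancing strategy accounts for the varying string lengths.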

  6. Large Scale Metric Learning for Distance-Based Image Classification on Open Ended Data Sets

    NARCIS (Netherlands)

    Mensink, T.; Verbeek, J.; Perronnin, F.; Csurka, G.; Farinella, G.M.; Battiato, S.; Cipolla, R,

    2013-01-01

    Many real-life large-scale datasets are open-ended and dynamic: new images are continuously added to existing classes, new classes appear over time, and the semantics of existing classes might evolve too. Therefore, we study large-scale image classification methods that can incorporate new classes

  7. Large estragole fluxes from oil palms in Borneo

    Science.gov (United States)

    During two field campaigns (OP3 and ACES), which ran in Borneo in 2008, we measured large emissions of estragole in ambient air above oil palm canopies flower enclosures. However, we did not detect this compound at a nearby rainforest. Estragole is a known attractant of the Afric...

  8. Selenium-75-labelled folate compounds

    International Nuclear Information System (INIS)

    1974-01-01

    A saturation analysis method for folate is presented; it uses competitive reaction of the compound to be measured and of a radioactively labelled version of this compound with a reagent specific to this compound, present in insufficient quantity to combine with the whole of the compound and its labelled version; separation of the bound compound from its unbound homologue; and measurement of the radioactivity concentration in the bound compound, the unbound compound, or both. The radioactive isotope used in the labelled folate is selenium-75. [fr]

  9. Profiling of the Major Phenolic Compounds and Their Biosynthesis Genes in Sophora flavescens Aiton

    Directory of Open Access Journals (Sweden)

    Jeongyeo Lee

    2018-01-01

    Sophorae Radix (Sophora flavescens Aiton) has long been used in traditional medicine in East Asia due to the various biological activities of its secondary metabolites. The endogenous contents of phenolic compounds (phenolic acids, flavonols, and isoflavones) and of the main bioactive compounds of Sophorae Radix were analyzed by qualitative HPLC and evaluated in different organs and at different developmental stages. In total, 11 compounds were detected, and the composition of the roots and aerial parts (leaves, stems, and flowers) was significantly different. trans-Cinnamic acid and p-coumaric acid were observed only in the aerial parts. Large amounts of rutin and maackiain were detected in the roots. Four phenolic acid compounds (benzoic acid, caffeic acid, ferulic acid, and chlorogenic acid) and four flavonol compounds (kaempferol, catechin hydrate, epicatechin, and rutin) were higher in the aerial parts than in the roots. To identify putative genes involved in phenolic compound biosynthesis, a total of 41 transcripts were investigated. The expression patterns of these selected genes, as well as of the multiple isoforms of the genes, varied by organ and developmental stage, implying that they are involved in the biosynthesis of various phenolic compounds both spatially and temporally.

  10. A critical assessment of boron target compounds for boron neutron capture therapy.

    Science.gov (United States)

    Hawthorne, M Frederick; Lee, Mark W

    2003-01-01

Boron neutron capture therapy (BNCT) has undergone dramatic developments since its inception by Locher in 1936 and the development of nuclear energy during World War II. The ensuing Cold War spawned the entirely new field of polyhedral borane chemistry, rapid advances in nuclear reactor technology and a corresponding increase in the number of reactors potentially available for BNCT. This effort has been largely oriented toward the eradication of glioblastoma multiforme (GBM) and melanoma, with reduced interest in other types of malignancies. The design and synthesis of the boron-10 target compounds needed for BNCT was not channeled toward those types of compounds specifically required for GBM or melanoma. Consequently, a number of potentially useful boron agents are known which have not been biologically evaluated beyond a cursory examination, and only three boron-10 enriched target species are approved for human use following their Investigational New Drug classification by the US Food and Drug Administration: BSH, BPA and GB-10. All ongoing clinical trials with GBM and melanoma are necessarily conducted with one of these three species, most often with BPA. The further development of BNCT is presently stalled by the absence of strong support for advanced compound evaluation and compound discovery driven by recent advances in biology and chemistry. A rigorous demonstration of BNCT efficacy surpassing that of currently available protocols has yet to be achieved. This article discusses the past history of compound development, contemporary problems such as compound classification, and those problems which impede future advances. The latter include means for biological evaluation of new (and existing) boron target candidates at all stages of their development and the large-scale synthesis of boron target species for clinical trials and beyond. The future of BNCT is bright if latitude is given to the choice of clinical disease to be treated and if a recognized study

  11. Application of response surface methodology to optimise supercritical carbon dioxide extraction of volatile compounds from Crocus sativus.

    Science.gov (United States)

    Shao, Qingsong; Huang, Yuqiu; Zhou, Aicun; Guo, Haipeng; Zhang, Ailian; Wang, Yong

    2014-05-01

Crocus sativus has been used as a traditional Chinese medicine for a long time. The volatile compounds of C. sativus appear biologically active and may act as antioxidants as well as anticonvulsants, antidepressants and antitumour agents. In order to obtain the highest possible yield of essential oils from C. sativus, response surface methodology was employed to optimise the conditions of supercritical fluid carbon dioxide extraction of the volatile compounds from C. sativus. Four factors were investigated: temperature, pressure, extraction time and carbon dioxide flow rate. Furthermore, the chemical compositions of the volatile compounds extracted by supercritical fluid extraction were compared with those obtained by hydro-distillation and Soxhlet extraction. The optimum extraction conditions were found to be: temperature 44.9°C, pressure 34.9 MPa, extraction time 150.2 min and CO₂ flow rate 10.1 L h⁻¹. Under these conditions, the mean extraction yield was 10.94 g kg⁻¹. The volatile compounds extracted by supercritical fluid extraction and Soxhlet extraction contained a large amount of unsaturated fatty acids. Response surface methodology was successfully applied to optimise the supercritical fluid CO₂ extraction of the volatile compounds from C. sativus. The study showed that pressure and CO₂ flow rate had a significant effect on the volatile compound yield produced by supercritical fluid extraction. This study is beneficial for further research on large-scale operation. © 2013 Society of Chemical Industry.
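The response-surface workflow used in this record (fit a second-order model to designed experiments, then locate the stationary point) can be sketched as follows. The data, the two-factor restriction, and the "true" optimum below are synthetic illustrations, not the paper's measurements.

```python
import numpy as np

# Hedged sketch of the response-surface workflow: fit a second-order model to
# (synthetic) extraction experiments and locate the optimum. Only two of the
# paper's four factors are shown; all numbers here are invented.
rng = np.random.default_rng(0)
T = rng.uniform(35, 55, 60)   # temperature, degC
P = rng.uniform(25, 45, 60)   # pressure, MPa
# Synthetic yields peaking near T = 45 degC, P = 35 MPa (assumed truth).
y = 10.9 - 0.02 * (T - 45.0) ** 2 - 0.03 * (P - 35.0) ** 2 + rng.normal(0, 0.02, 60)

# Quadratic response surface: y ~ b0 + b1*T + b2*P + b3*T^2 + b4*P^2 + b5*T*P
X = np.column_stack([np.ones_like(T), T, P, T**2, P**2, T * P])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Coarse grid search for the stationary point (a stand-in for the usual
# analytic optimisation of the fitted quadratic).
Tg, Pg = np.meshgrid(np.linspace(35, 55, 201), np.linspace(25, 45, 201))
Xg = np.column_stack([np.ones(Tg.size), Tg.ravel(), Pg.ravel(),
                      Tg.ravel() ** 2, Pg.ravel() ** 2, Tg.ravel() * Pg.ravel()])
i = np.argmax(Xg @ beta)
T_opt, P_opt = Tg.ravel()[i], Pg.ravel()[i]
print(round(T_opt, 1), round(P_opt, 1))  # typically lands near the synthetic optimum (45, 35)
```

With real data, the central composite design and the significance tests on each coefficient would drive which terms are kept in the fitted model.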

  12. Stabilizing model predictive control : on the enlargement of the terminal set

    NARCIS (Netherlands)

    Brunner, F.D.; Lazar, M.; Allgöwer, F.

    2015-01-01

    It is well known that a large terminal set leads to a large region where the model predictive control problem is feasible without the need for a long prediction horizon. This paper proposes a new method for the enlargement of the terminal set. Different from existing approaches, the method uses the

  13. Attenuation of xenobiotic organic leachate compounds from a landfill to surface water

    DEFF Research Database (Denmark)

    Milosevic, Nemanja

… established lines of evidence of natural attenuation. The conceptual model was formulated for hydrogeology and water chemistry, providing a water flow balance and mass discharges of selected contaminants. The model was improved by analyzing in situ indicators of biodegradation, some of which were applied … groundwater was shown using multiple methods and multiple compound approaches. Concepts, tools and methods used for the degradation assessment were applied in a clay till setting with groundwater discharge into a local stream. … history, geology and hydrogeology), which together result in a virtually unique setting at each landfill site. Nevertheless, many general principles derived from research sites and case studies in homogeneous geological settings can be applied or adjusted to fit specific, complex landfill cases …
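(No edit body needed here; see reformat below.)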

  14. Social Set Visualizer (SoSeVi) II

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Vatrapu, Ravi

    2016-01-01

This paper reports the second iteration of the Social Set Visualizer (SoSeVi), a set-theoretical visual analytics dashboard of big social data. In order to further demonstrate its usefulness in large-scale visual analytics tasks of individual and collective behavior of actors in social networks, the current iteration of the Social Set Visualizer (SoSeVi), in version II, builds on recent advancements in visualizing set intersections. The development of the SoSeVi dashboard involved cutting-edge open source visual analytics libraries (D3.js) and creation of new visualizations such as of actor mobility …

  15. Particle-hole state densities for statistical multi-step compound reactions

    International Nuclear Information System (INIS)

    Oblozinsky, P.

    1986-01-01

    An analytical relation is derived for the density of particle-hole bound states applying the equidistant-spacing approximation and the Darwin-Fowler statistical method. The Pauli exclusion principle as well as the finite depth of the potential well are taken into account. The set of densities needed for calculations of multi-step compound reactions is completed by deriving the densities of accessible final states for escape and damping. (orig.)
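For orientation, the equidistant-spacing particle-hole level density is commonly written in the Williams form with a Pauli correction term. The expression below is the standard textbook result (without the finite-well-depth correction that this paper adds), not the paper's final formula:

```latex
\omega(p,h,E) \;=\; \frac{g\,\bigl(gE - A_{p,h}\bigr)^{\,n-1}}{p!\,h!\,(n-1)!},
\qquad n = p + h,
\qquad A_{p,h} = \tfrac{1}{4}\bigl(p^{2} + h^{2} + p - 3h\bigr),
```

where g is the single-particle state density, p and h count particles and holes, and A_{p,h} is the Pauli correction.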

  16. Computing Convex Coverage Sets for Faster Multi-Objective Coordination

    NARCIS (Netherlands)

    Roijers, D.M.; Whiteson, S.; Oliehoek, F.A.

    2015-01-01

    In this article, we propose new algorithms for multi-objective coordination graphs (MO-CoGs). Key to the efficiency of these algorithms is that they compute a convex coverage set (CCS) instead of a Pareto coverage set (PCS). Not only is a CCS a sufficient solution set for a large class of problems,
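The distinction between a Pareto coverage set (PCS) and a convex coverage set (CCS) can be illustrated in miniature. This is not the paper's algorithm: the value vectors below are invented, and the weight-grid pruning is a coarse stand-in for the exact linear-programming pruning such algorithms use.

```python
# Hedged sketch: for 2-objective value vectors, the CCS keeps only vectors that
# are optimal for SOME linear weighting w = (w1, 1 - w1), while the PCS keeps
# all non-dominated vectors. The CCS is always a subset of the PCS, which is
# why it can be a much smaller (yet sufficient) solution set.

def pareto_coverage_set(vectors):
    """Vectors not weakly dominated by another vector in both objectives."""
    return [v for v in vectors
            if not any(u != v and u[0] >= v[0] and u[1] >= v[1] for u in vectors)]

def convex_coverage_set(vectors, steps=1001):
    """Vectors maximising w.v for some weight w on a grid (a coarse stand-in
    for exact pruning by linear programming)."""
    ccs = set()
    for i in range(steps):
        w = i / (steps - 1)
        ccs.add(max(vectors, key=lambda v: w * v[0] + (1 - w) * v[1]))
    return sorted(ccs)

vals = [(0.0, 1.0), (0.4, 0.8), (0.5, 0.5), (0.8, 0.4), (1.0, 0.0)]
pcs = pareto_coverage_set(vals)
ccs = convex_coverage_set(vals)
print(len(pcs), len(ccs))  # (0.5, 0.5) is Pareto-optimal but never optimal for any linear weighting
```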

  17. Applicability of generic assays based on liquid chromatography–electrospray mass spectrometry to study in vitro metabolism of 55 structurally diverse compounds

    Directory of Open Access Journals (Sweden)

    Timo Rousu

    2010-08-01

Full Text Available Liquid chromatography-mass spectrometry (LC/MS) with generic gradient elution for a large number of chemically different compounds is a common approach in drug development, used to acquire a large amount of data on drug candidates in a short time frame. Analysis with non-optimised parameters, however, may lead to poor method performance for many compounds and carries a risk of losing important information. Here, generic electrospray time-of-flight (ESI-TOF) MS methods at various pH conditions were tested for 55 chemically diverse compounds (10 acids, 25 bases, 17 neutrals and 3 amphoterics), aiming to find the best analytical conditions for each compound for studies of in vitro metabolic properties in liver preparations. The effects of eluent pH and elution gradient strength on chromatographic performance and electrospray MS ionisation efficiency were examined for each compound. The data are evaluated to determine how well the best generic approach could cover the analysis of the test compounds and how many compounds would still need completely different analytical conditions. An aqueous mobile phase consisting of 0.05% acetic acid and 5 mM ammonium acetate (pH 4.4) showed the best general suitability for the analyses, with adequate performance for metabolite profiling for 41 out of 55 compounds in either positive or negative ion mode. In positive ion mode, the main limitation of performance at various pH conditions was generally not the lack of ionisation but rather poor chromatographic performance (inadequate retention or poor peak shape), suggesting that more emphasis should be put on finding conditions providing the best chromatographic performance, rather than the highest ionisation efficiency. However, a single generic approach for a large number of different compounds is not likely to produce good results for all compounds. 
Preferably, at least two or three different conditions are needed for the coverage of a larger number of structurally diverse

  18. Towards integrated environmental quality objectives for several compounds with a potential for secondary poisoning

    NARCIS (Netherlands)

    Plassche EJ van de; ACT; VW/RWS-DGW; AIDE

    1994-01-01

Values are derived which can be used to set integrated environmental quality objectives (limit and target values) for 25 compounds with a potential for secondary poisoning. First, Maximum Permissible Concentrations (MPCs) and Negligible Concentrations (NCs) are derived for water, sediment and soil

  19. Estimating the octanol/water partition coefficient for aliphatic organic compounds using semi-empirical electrotopological index.

    Science.gov (United States)

    Souza, Erica Silva; Zaramello, Laize; Kuhnen, Carlos Alberto; Junkes, Berenice da Silva; Yunes, Rosendo Augusto; Heinzen, Vilma Edite Fonseca

    2011-01-01

A new possibility for estimating the octanol/water partition coefficient (log P) was investigated using only one descriptor, the semi-empirical electrotopological index (I(SET)). The predictability of four log P calculation models was compared using a set of 131 aliphatic organic compounds from five different classes. Log P values were calculated employing atomic-contribution methods, as in the Ghose/Crippen approach and its later refinement, AlogP; using fragmental methods through the ClogP method; and employing an approach considering the whole molecule using topological indices with the MlogP method. The efficiency and applicability of the I(SET) in calculating log P were demonstrated through good statistical quality (r > 0.99; s < 0.18), high internal stability and a predictive ability for an external group of compounds of the same order as that of the widely used models based on the fragmental method, ClogP, and the atomic contribution method, AlogP, which are among the most used methods of predicting log P.
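A single-descriptor model of this kind reduces to ordinary linear regression of log P on the index, reported via the correlation coefficient r and the standard error of estimate s. The sketch below uses synthetic descriptor values and an assumed linear relation, not the authors' I(SET) data.

```python
import numpy as np

# Hedged illustration: linear regression of log P on a single descriptor.
# The descriptor values and the 0.52x - 0.8 relation are invented stand-ins.
rng = np.random.default_rng(1)
i_set = rng.uniform(1.0, 10.0, 40)                   # hypothetical descriptor values
log_p = 0.52 * i_set - 0.8 + rng.normal(0, 0.1, 40)  # assumed linear relation + noise

slope, intercept = np.polyfit(i_set, log_p, 1)
pred = slope * i_set + intercept
r = np.corrcoef(log_p, pred)[0, 1]                       # correlation coefficient
s = np.sqrt(np.sum((log_p - pred) ** 2) / (len(log_p) - 2))  # standard error of estimate
print(round(r, 3), round(s, 3))
```

The paper's quality criteria (r > 0.99, s < 0.18) correspond exactly to these two statistics, computed on its 131-compound set rather than on synthetic data.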

  20. A reference data set for validating vapor pressure measurement techniques: homologous series of polyethylene glycols

    Science.gov (United States)

    Krieger, Ulrich K.; Siegrist, Franziska; Marcolli, Claudia; Emanuelsson, Eva U.; Gøbel, Freya M.; Bilde, Merete; Marsh, Aleksandra; Reid, Jonathan P.; Huisman, Andrew J.; Riipinen, Ilona; Hyttinen, Noora; Myllys, Nanna; Kurtén, Theo; Bannan, Thomas; Percival, Carl J.; Topping, David

    2018-01-01

To predict atmospheric partitioning of organic compounds between gas and aerosol particle phase based on explicit models for gas phase chemistry, saturation vapor pressures of the compounds need to be estimated. Estimation methods based on functional group contributions require training sets of compounds with well-established saturation vapor pressures. However, vapor pressures of semivolatile and low-volatility organic molecules at atmospheric temperatures reported in the literature often differ by several orders of magnitude between measurement techniques. These discrepancies exceed the stated uncertainty of each technique, which is generally reported to be smaller than a factor of 2. At present, there is no general reference technique for measuring saturation vapor pressures of atmospherically relevant compounds with low vapor pressures at atmospheric temperatures. To address this problem, we measured vapor pressures with different techniques over a wide temperature range for intercomparison and to establish a reliable training set. We determined saturation vapor pressures for the homologous series of polyethylene glycols (H-(O-CH₂-CH₂)ₙ-OH) for n = 3 to n = 8, ranging in vapor pressure at 298 K from 10⁻⁷ to 5×10⁻² Pa, and compare them with quantum chemistry calculations. Such a homologous series provides a reference set that covers several orders of magnitude in saturation vapor pressure, allowing a critical assessment of the lower limits of detection of vapor pressures for the different techniques as well as permitting the identification of potential sources of systematic error. Also, internal consistency within the series allows outlying data to be rejected more easily. Most of the measured vapor pressures agreed within the stated uncertainty range. Deviations mostly occurred for vapor pressure values approaching the lower detection limit of a technique. The good agreement between the measurement techniques (some of which are sensitive to the mass
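Vapor pressures measured over a temperature range are commonly fitted to the integrated Clausius-Clapeyron form ln p = A - B/T and then evaluated at 298 K. The sketch below uses invented numbers, not the PEG data from this record.

```python
import numpy as np

# Hedged sketch: fit ln p = A - B/T to (synthetic) vapour-pressure data and
# extrapolate to 298 K. Temperatures and pressures below are made up.
T = np.array([310.0, 320.0, 330.0, 340.0, 350.0])      # K
p = np.array([2.1e-5, 8.9e-5, 3.3e-4, 1.1e-3, 3.2e-3])  # Pa (invented)

# Linear fit of ln p against 1/T; the slope is -B (proportional to the
# enthalpy of vaporisation), the intercept is A.
slope, intercept = np.polyfit(1.0 / T, np.log(p), 1)
p298 = np.exp(intercept + slope / 298.0)  # extrapolated saturation vapour pressure at 298 K
print(f"{p298:.2e} Pa")
```

Because 298 K lies below the measured range, the extrapolated value is smaller than any measured pressure; in practice the uncertainty of such an extrapolation is exactly what the intercomparison in this record probes.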

  1. Algorithms for detecting and analysing autocatalytic sets.

    Science.gov (United States)

    Hordijk, Wim; Smith, Joshua I; Steel, Mike

    2015-01-01

Autocatalytic sets are considered to be fundamental to the origin of life. Prior theoretical and computational work on the existence and properties of these sets has relied on a fast algorithm for detecting self-sustaining autocatalytic sets in chemical reaction systems. Here, we introduce and apply a modified version and several extensions of the basic algorithm: (i) a modification aimed at reducing the number of calls to the computationally most expensive part of the algorithm, (ii) the application of a previously introduced extension of the basic algorithm to sample the smallest possible autocatalytic sets within a reaction network, and the application of a statistical test which provides a probable lower bound on the number of such smallest sets, (iii) the introduction and application of another extension of the basic algorithm to detect autocatalytic sets in a reaction system where molecules can also inhibit (as well as catalyse) reactions, (iv) a further, more abstract, extension of the theory behind searching for autocatalytic sets. (i) The modified algorithm outperforms the original one in the number of calls to the computationally most expensive procedure, which, in some cases also leads to a significant improvement in overall running time, (ii) our statistical test provides strong support for the existence of very large numbers (even millions) of minimal autocatalytic sets in a well-studied polymer model, where these minimal sets share about half of their reactions on average, (iii) "uninhibited" autocatalytic sets can be found in reaction systems that allow inhibition, but their number and sizes depend on the level of inhibition relative to the level of catalysis. (i) Improvements in the overall running time when searching for autocatalytic sets can potentially be obtained by using a modified version of the algorithm, (ii) the existence of large numbers of minimal autocatalytic sets can have important consequences for the possible evolvability of
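The basic detection algorithm referred to here is an iterative reduction: repeatedly discard reactions that are not catalysed by, or whose reactants cannot be produced from, the food set plus the products of the surviving reactions. The sketch below is a simplified version of that idea; the toy reaction network is invented for illustration.

```python
# Hedged sketch of an iterative RAF-style reduction (simplified, not the
# paper's optimised implementation). Reactions are (reactants, products,
# catalyst) triples; the maximal self-sustaining subset survives the pruning.

def closure(food, reactions):
    """All molecules reachable from the food set via the given reactions."""
    mols = set(food)
    changed = True
    while changed:
        changed = False
        for reactants, products, _ in reactions:
            if set(reactants) <= mols and not set(products) <= mols:
                mols |= set(products)
                changed = True
    return mols

def max_raf(food, reactions):
    """Prune until every surviving reaction has supported reactants and a
    producible catalyst; the fixed point is the maximal such subset."""
    rxns = list(reactions)
    while True:
        mols = closure(food, rxns)
        kept = [r for r in rxns if set(r[0]) <= mols and r[2] in mols]
        if len(kept) == len(rxns):
            return kept
        rxns = kept

food = {"a", "b"}
reactions = [
    (("a", "b"), ("ab",), "abab"),   # catalysed by a molecule the set itself makes
    (("ab", "ab"), ("abab",), "ab"),
    (("a", "a"), ("aa",), "zz"),     # catalyst "zz" is never produced -> pruned
]
raf = max_raf(food, reactions)
print(len(raf))  # -> 2
```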

  2. Rubber compounding and processing

    CSIR Research Space (South Africa)

    John, MJ

    2014-06-01

    Full Text Available This chapter presents an overview on the compounding and processing techniques of natural rubber compounds. The introductory portion deals with different types of rubbers and principles of rubber compounding. The primary and secondary fillers used...

  3. Multi-angle compound imaging

    DEFF Research Database (Denmark)

    Jespersen, Søren Kragh; Wilhjelm, Jens Erik; Sillesen, Henrik

    1998-01-01

This paper reports on a scanning technique, denoted multi-angle compound imaging (MACI), using spatial compounding. The MACI method also contains elements of frequency compounding, as the transmit frequency is lowered for the highest beam angles in order to reduce grating lobes. Compared to conventional B-mode imaging, MACI offers better defined tissue boundaries and lower variance of the speckle pattern, resulting in an image with reduced random variations. Design and implementation of a compound imaging system is described, images of rubber tubes and porcine aorta are shown, and effects on visualization are discussed. The speckle reduction is analyzed numerically and the results are found to be in excellent agreement with existing theory. An investigation of detectability of low-contrast lesions shows significant improvements compared to conventional imaging. Finally, possibilities for improving …

  4. Social Work Involvement in Advance Care Planning: Findings from a Large Survey of Social Workers in Hospice and Palliative Care Settings.

    Science.gov (United States)

    Stein, Gary L; Cagle, John G; Christ, Grace H

    2017-03-01

    Few data are available describing the involvement and activities of social workers in advance care planning (ACP). We sought to provide data about (1) social worker involvement and leadership in ACP conversations with patients and families; and (2) the extent of functions and activities when these discussions occur. We conducted a large web-based survey of social workers employed in hospice, palliative care, and related settings to explore their role, participation, and self-rated competency in facilitating ACP discussions. Respondents were recruited through the Social Work Hospice and Palliative Care Network and the National Hospice and Palliative Care Organization. Descriptive analyses were conducted on the full sample of respondents (N = 641) and a subsample of clinical social workers (N = 456). Responses were analyzed to explore differences in ACP involvement by practice setting. Most clinical social workers (96%) reported that social workers in their department are conducting ACP discussions with patients/families. Majorities also participate in, and lead, ACP discussions (69% and 60%, respectively). Most respondents report that social workers are responsible for educating patients/families about ACP options (80%) and are the team members responsible for documenting ACP (68%). Compared with other settings, oncology and inpatient palliative care social workers were less likely to be responsible for ensuring that patients/families are informed of ACP options and documenting ACP preferences. Social workers are prominently involved in facilitating, leading, and documenting ACP discussions. Policy-makers, administrators, and providers should incorporate the vital contributions of social work professionals in policies and programs supporting ACP.

  5. FilTer BaSe: A web accessible chemical database for small compound libraries.

    Science.gov (United States)

    Kolte, Baban S; Londhe, Sanjay R; Solanki, Bhushan R; Gacche, Rajesh N; Meshram, Rohan J

    2018-03-01

Finding novel chemical agents for targeting disease-associated drug targets often requires screening of large numbers of new chemical libraries. In silico methods are generally implemented at initial stages for virtual screening. Filtering of such compound libraries on physicochemical and substructure grounds is done to ensure elimination of compounds with undesired chemical properties. The filtering procedure is redundant and time consuming, and requires efficient bioinformatics/computer manpower along with high-end software involving huge capital investment, which forms a major obstacle for drug discovery projects in an academic setup. We present an open source resource, FilTer BaSe, a chemoinformatics platform (http://bioinfo.net.in/filterbase/) that hosts fully filtered, ready-to-use compound libraries of workable size. The resource also hosts a database that enables efficient searching of the chemical space of around 348,000 compounds on the basis of physicochemical and substructure properties. The ready-to-use compound libraries and database presented here are expected to lend a helping hand to new drug developers and medicinal chemists. Copyright © 2017 Elsevier Inc. All rights reserved.
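The physicochemical filtering step such a resource automates amounts to range checks on computed properties (a Lipinski-style rule set). The sketch below is illustrative only: the property values would normally come from a chemistry toolkit, and here they are precomputed stand-ins.

```python
# Hedged sketch of Lipinski-style physicochemical filtering of a compound
# library. The rule bounds and the library entries are invented examples.
RULES = {"mw": (0, 500), "logp": (-5, 5), "hbd": (0, 5), "hba": (0, 10)}

def passes(props, rules=RULES):
    """True if every property lies inside its allowed [lo, hi] range."""
    return all(lo <= props[k] <= hi for k, (lo, hi) in rules.items())

library = [
    {"id": "cpd-1", "mw": 342.4, "logp": 2.1, "hbd": 2, "hba": 5},
    {"id": "cpd-2", "mw": 612.8, "logp": 6.3, "hbd": 4, "hba": 9},  # fails mw and logp
    {"id": "cpd-3", "mw": 188.2, "logp": 0.4, "hbd": 1, "hba": 3},
]
filtered = [c["id"] for c in library if passes(c)]
print(filtered)  # -> ['cpd-1', 'cpd-3']
```

Substructure filters (e.g. reactive-group exclusion) would be layered on top of this, which is where the toolkit dependence and compute cost described in the abstract come in.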

  6. Standard setting in medical education: fundamental concepts and emerging challenges.

    Science.gov (United States)

    Mortaz Hejri, Sara; Jalili, Mohammad

    2014-01-01

The process of determining the minimum pass level to separate competent students from those who do not perform well enough is called standard setting. A large number of methods are widely used to set cut-scores for both written and clinical examinations. There are challenging issues pertaining to any standard setting procedure, and ignoring these concerns would call into question the credibility and defensibility of the method. The goal of this review is to provide a basic understanding of the key concepts and challenges in standard setting and to offer some recommendations for overcoming the challenging issues for educators and policymakers who are dealing with decision-making in this field.
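As one concrete example of a cut-score method (chosen here for illustration; the review covers many), the Angoff procedure has each judge estimate, per item, the probability that a minimally competent examinee answers correctly; the cut score is the mean over judges of the summed ratings. All numbers below are invented.

```python
# Hedged sketch of an Angoff-style cut-score computation with invented ratings.
ratings = {  # judge -> per-item probabilities for a minimally competent examinee
    "judge1": [0.6, 0.8, 0.5, 0.9],
    "judge2": [0.7, 0.7, 0.4, 0.8],
    "judge3": [0.5, 0.9, 0.6, 0.9],
}
per_judge = [sum(r) for r in ratings.values()]       # each judge's expected score
cut_score = sum(per_judge) / len(per_judge)          # mean across judges
print(round(cut_score, 2))  # -> 2.77 (out of 4 items)
```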

  7. Experimental and computational studies of film cooling with compound angle injection

    Energy Technology Data Exchange (ETDEWEB)

Goldstein, R.J.; Eckert, E.R.G.; Patankar, S.V. [Univ. of Minnesota, Minneapolis, MN (United States)] [and others]

    1995-10-01

The thermal efficiency of gas turbine systems depends largely on the turbine inlet temperature. Recent decades have seen a steady rise in the inlet temperature and a resulting reduction in fuel consumption. At the same time, it has been necessary to employ intensive cooling of the hot components. Among various cooling methods, film cooling has become a standard method for cooling of the turbine airfoils and combustion chamber walls. The University of Minnesota program is a combined experimental and computational study of various film-cooling configurations. Whereas a large number of parameters influence film cooling processes, this research focuses on compound angle injection through a single row and through two rows of holes. Later work will investigate the value of contoured hole designs. An appreciation of the advantages of compound angle injection has grown recently with the demand for more effective cooling and with improved understanding of the flow; this project should continue to further this understanding. Approaches being applied include: (1) a new measurement system that extends the mass/heat transfer analogy to obtain both local film cooling and local mass (heat) transfer results in a single system, (2) direct measurement of three-dimensional turbulent transport in a highly-disturbed flow, (3) the use of compound angle and shaped holes to optimize film cooling performance, and (4) an exploration of anisotropy corrections to turbulence modeling of film cooling jets.

  8. Quality-control analytical methods: endotoxins: essential testing for pyrogens in the compounding laboratory, part 3: a simplified endotoxin test method for compounded sterile preparations.

    Science.gov (United States)

    Cooper, James F

    2011-01-01

The first two parts of the IJPC series on endotoxin testing explained the nature of pyrogenic contamination and described various Limulus amebocyte lysate methods for detecting and measuring endotoxin levels with the bacterial endotoxin test described in the United States Pharmacopeia. This third article in the series describes the endotoxin test that is simplest to perform for pharmacists who prefer to conduct an endotoxin assay at the time of compounding in the pharmacy setting.

  9. Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data

    NARCIS (Netherlands)

    K.M. Hettne (Kristina); J. Boorsma (Jeffrey); D.A.M. van Dartel (Dorien A M); J.J. Goeman (Jelle); E.C. de Jong (Esther); A.H. Piersma (Aldert); R.H. Stierum (Rob); J. Kleinjans (Jos); J.A. Kors (Jan)

    2013-01-01

    textabstractBackground: Availability of chemical response-specific lists of genes (gene sets) for pharmacological and/or toxic effect prediction for compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with

  10. Governing processes for reactive nitrogen compounds in the European atmosphere

    Directory of Open Access Journals (Sweden)

    O. Hertel

    2012-12-01

Full Text Available Reactive nitrogen (Nr) compounds have different fates in the atmosphere due to differences in the governing processes of physical transport, deposition and chemical transformation. Nr compounds addressed here include reduced nitrogen (NHx: ammonia (NH3) and its reaction product ammonium (NH4+)), oxidized nitrogen (NOy: nitrogen monoxide (NO) + nitrogen dioxide (NO2) and their reaction products) as well as organic nitrogen compounds (organic N). Pollution abatement strategies need to take into account the differences in the governing processes of these compounds when assessing their impact on ecosystem services, biodiversity, human health and climate. NOx (NO + NO2) emitted from traffic affects human health in urban areas, where the presence of buildings increases the residence time in streets. In urban areas this leads to enhanced exposure of the population to NOx concentrations. NOx emissions generally have little impact on nearby ecosystems because of the small dry deposition rates of NOx. These compounds need to be converted into nitric acid (HNO3) before removal through deposition is efficient. HNO3 sticks quickly to any surface and is thereby either dry deposited or incorporated into aerosols as nitrate (NO3−). In contrast to NOx compounds, NH3 has potentially high impacts on ecosystems near the main agricultural sources of NH3 because of its large ground-level concentrations along with large dry deposition rates. Aerosol-phase NH4+ and NO3− contribute significantly to background PM2.5 and PM10 (mass of aerosols with an aerodynamic diameter of less than 2.5 and 10 μm, respectively), with an impact on the radiation balance as well as potentially on human

  11. A low-cost, hands-on module to characterize antimicrobial compounds using an interdisciplinary, biophysical approach.

    Directory of Open Access Journals (Sweden)

    Karishma S Kaushik

    2015-01-01

Full Text Available We have developed a hands-on experimental module that combines biology experiments with a physics-based analytical model in order to characterize antimicrobial compounds. To understand antibiotic resistance, participants perform a disc diffusion assay to test the antimicrobial activity of different compounds and then apply a diffusion-based analytical model to gain insights into the behavior of the active antimicrobial component. In our experience, this module was robust, reproducible, and cost-effective, suggesting that it could be implemented in diverse settings such as undergraduate research, STEM (science, technology, engineering, and math) camps, school programs, and laboratory training workshops. By providing valuable interdisciplinary research experience in science outreach and education initiatives, this module addresses the paucity of structured training or education programs that integrate diverse scientific fields. Its low-cost requirements make it especially suitable for use in resource-limited settings.

  12. Rb-intercalated C60 compounds studied by Inverse Photoemission Spectroscopy

    International Nuclear Information System (INIS)

    Finazzi, M.; Brambilla, A; Biagioni, P.; Cattoni, A.; Duo, L.; Ciccacci, F.; Braicovich, L.; Giovanelli, L.; Goldoni, A.

    2004-01-01

Full text: Since the discovery of superconductivity in alkali-doped solid C60, the electronic structure of the host material (C60) and of the doped compounds (AxC60, where A is an alkali metal) has been the subject of a considerable amount of work, both theoretical and experimental. Spectroscopic investigations of the alkali-doped C60 compounds have mainly focussed on the valence states, while much less information is available on the unoccupied states. In particular, inverse photoemission data on the complete set of stable RbxC60 compounds were, so far, still missing. We have performed Inverse Photoemission (IPE) spectroscopy on RbxC60 compounds (x = 1, 3, 4, 6). IPE spectra were obtained using a band-pass photon detector (hν = 9.4 eV, FWHM = 0.7 eV) and scanning the kinetic energy of the electrons impinging on the sample. Rb was evaporated on C60 films (thickness 6-12 atomic layers) grown in situ on a Cu(100) substrate. The temperature of the substrate was kept at T = 100 °C, which is lower than the C60 sublimation temperature. The amount of Rb was checked by measuring the intensity of the C1s and Rb3d photoemission lines. After the required amount of Rb had been deposited, the samples were annealed to distil the desired stable phase

  13. High-throughput film-densitometry: An efficient approach to generate large data sets

    Energy Technology Data Exchange (ETDEWEB)

    Typke, Dieter; Nordmeyer, Robert A.; Jones, Arthur; Lee, Juyoung; Avila-Sakar, Agustin; Downing, Kenneth H.; Glaeser, Robert M.

    2004-07-14

A film-handling machine (robot) has been built which can, in conjunction with a commercially available film densitometer, exchange and digitize over 300 electron micrographs per day. Implementation of robotic film handling effectively eliminates the delay and tedium associated with digitizing images when data are initially recorded on photographic film. The modulation transfer function (MTF) of the commercially available densitometer is significantly worse than that of a high-end, scientific microdensitometer. Nevertheless, its signal-to-noise ratio (S/N) is quite excellent, allowing substantial restoration of the output to "near-to-perfect" performance. Due to the large area of the standard electron microscope film that can be digitized by the commercial densitometer (up to 10,000 × 13,680 pixels with an appropriately coded holder), automated film digitization offers a fast and inexpensive alternative to high-end CCD cameras as a means of acquiring large amounts of image data in electron microscopy.

  14. Generating mock data sets for large-scale Lyman-α forest correlation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Font-Ribera, Andreu [Institut de Ciències de l' Espai (CSIC-IEEC), Campus UAB, Fac. Ciències, torre C5 parell 2, Bellaterra, Catalonia (Spain); McDonald, Patrick [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Miralda-Escudé, Jordi, E-mail: font@ieec.uab.es, E-mail: pvmcdonald@lbl.gov, E-mail: miralda@icc.ub.edu [Institució Catalana de Recerca i Estudis Avançats, Barcelona, Catalonia (Spain)

    2012-01-01

    Massive spectroscopic surveys of high-redshift quasars yield large numbers of correlated Lyα absorption spectra that can be used to measure large-scale structure. Simulations of these surveys are required to accurately interpret the measurements of correlations and correct for systematic errors. An efficient method to generate mock realizations of Lyα forest surveys is presented which generates a field over the lines of sight to the survey sources only, instead of having to generate it over the entire three-dimensional volume of the survey. The method can be calibrated to reproduce the power spectrum and one-point distribution function of the transmitted flux fraction, as well as the redshift evolution of these quantities, and is easily used for modeling any survey systematic effects. We present an example of how these mock surveys are applied to predict the measurement errors in a survey with parameters similar to those of the BOSS quasar survey in SDSS-III.
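    The idea of generating the field only along lines of sight can be sketched with a single mock skewer: draw a correlated Gaussian field and map it to a transmitted flux fraction with a lognormal transform. This is a minimal illustration, not the paper's calibrated method; the smoothing length and the transform parameters `a` and `b` are arbitrary choices:

    ```python
    import numpy as np

    def mock_skewer(n_pix, sigma=1.0, corr_len=5, a=0.3, b=1.6, seed=0):
        """One mock Lya forest skewer: correlated Gaussian field delta along
        the line of sight, mapped to flux via F = exp(-a * exp(b * delta))."""
        rng = np.random.default_rng(seed)
        white = rng.normal(size=n_pix)
        # Impose line-of-sight correlations by Gaussian smoothing of white noise
        x = np.arange(-3 * corr_len, 3 * corr_len + 1)
        kernel = np.exp(-0.5 * (x / corr_len) ** 2)
        kernel /= np.sqrt(np.sum(kernel**2))    # preserve unit variance
        delta = sigma * np.convolve(white, kernel, mode="same")
        # Lognormal mapping guarantees a transmitted flux fraction in (0, 1)
        return np.exp(-a * np.exp(b * delta))

    flux = mock_skewer(512)
    ```

    In a full implementation the Gaussian fields on different skewers would share the survey's three-dimensional correlations, and the transform would be calibrated to the observed flux power spectrum and one-point distribution; here each piece is reduced to its simplest stand-in.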

  15. Generating mock data sets for large-scale Lyman-α forest correlation measurements

    International Nuclear Information System (INIS)

    Font-Ribera, Andreu; McDonald, Patrick; Miralda-Escudé, Jordi

    2012-01-01

    Massive spectroscopic surveys of high-redshift quasars yield large numbers of correlated Lyα absorption spectra that can be used to measure large-scale structure. Simulations of these surveys are required to accurately interpret the measurements of correlations and correct for systematic errors. An efficient method to generate mock realizations of Lyα forest surveys is presented which generates a field over the lines of sight to the survey sources only, instead of having to generate it over the entire three-dimensional volume of the survey. The method can be calibrated to reproduce the power spectrum and one-point distribution function of the transmitted flux fraction, as well as the redshift evolution of these quantities, and is easily used for modeling any survey systematic effects. We present an example of how these mock surveys are applied to predict the measurement errors in a survey with parameters similar to those of the BOSS quasar survey in SDSS-III.

  16. Phenolic Molding Compounds

    Science.gov (United States)

    Koizumi, Koji; Charles, Ted; de Keyser, Hendrik

    Phenolic molding compounds continue to exhibit well-balanced properties such as heat resistance, chemical resistance, dimensional stability, and creep resistance. They are widely applied in electrical, appliance, small engine, commutator, and automotive applications. As the focus of the automotive industry is weight reduction for greater fuel efficiency, phenolic molding compounds become appealing alternatives to metals. Current market volumes and trends, formulation components and their impact on properties, and a review of common manufacturing methods are presented. Molding processes as well as unique advanced techniques such as high temperature molding, live sprue, and injection/compression technique provide additional benefits in improving the performance characteristics of phenolic molding compounds. Of special interest are descriptions of some of the latest innovations in automotive components, such as the phenolic intake manifold and valve block for dual clutch transmissions. The chapter also characterizes the most recent developments in new materials, including long glass phenolic molding compounds and carbon fiber reinforced phenolic molding compounds exhibiting a 10-20-fold increase in Charpy impact strength when compared to short fiber filled materials. The role of fatigue testing and fatigue fracture behavior presents some insight into long-term reliability and durability of glass-filled phenolic molding compounds. A section on new technology outlines the important factors to consider in modeling phenolic parts by finite element analysis and flow simulation.

  17. Modelling of the Kinetics of Sulfur Compounds in Desulfurisation Processes Based on Industrial Plant Data

    OpenAIRE

    Krivtsova, Nadezhda Igorevna; Tataurshikov, A.; Kotkova, Elena

    2016-01-01

    Modelling of the kinetics of sulfur compounds was performed, including the kinetics of benzothiophene and dibenzothiophene homologues. The modelling is based on experimental data obtained from monitoring of an industrial hydrotreating unit. The results include the kinetic parameters of the reactions.
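    The abstract does not give the rate equations, but hydrodesulfurization kinetics for a compound class are often approximated as pseudo-first-order, C(t) = C0·exp(-kt), with k estimated from plant data. A minimal sketch under that assumption (the rate law, concentrations, and rate constant below are illustrative, not values from the paper):

    ```python
    import numpy as np

    def fit_first_order_k(t, c):
        """Estimate a pseudo-first-order rate constant k from concentration
        data, assuming C(t) = C0 * exp(-k t): linear fit of ln C versus t."""
        t = np.asarray(t, dtype=float)
        lnc = np.log(np.asarray(c, dtype=float))
        slope, intercept = np.polyfit(t, lnc, 1)
        return -slope, np.exp(intercept)        # (k, fitted C0)

    # Synthetic dibenzothiophene-like decay: C0 = 500 ppm, k = 0.8 per hour
    t = np.linspace(0.0, 4.0, 9)
    c = 500.0 * np.exp(-0.8 * t)
    k, c0 = fit_first_order_k(t, c)
    ```

    With real monitoring data one would fit each sulfur compound class separately; refractory species such as substituted dibenzothiophenes typically yield the smallest rate constants.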

  18. Natural compounds' activity against cancer stem-like or fast-cycling melanoma cells.

    Directory of Open Access Journals (Sweden)

    Malgorzata Sztiller-Sikorska

    BACKGROUND: Accumulating evidence supports the concept that melanoma is highly heterogeneous and sustained by a small subpopulation of melanoma stem-like cells. These cells are considered responsible for tumor resistance to therapies. Moreover, melanoma cells are characterized by high phenotypic plasticity. Consequently, both melanoma stem-like cells and their more differentiated progeny must be eradicated to achieve a durable cure. By reevaluating compounds in heterogeneous melanoma populations, it might be possible to select compounds with activity not only against fast-cycling cells but also against cancer stem-like cells. Natural compounds were the focus of the present study. METHODS: We analyzed 120 compounds from the Natural Products Set II to identify compounds active against melanoma populations grown in an anchorage-independent manner and enriched with cells possessing self-renewing capacity. Cell viability, cell cycle arrest, apoptosis, gene expression, clonogenic survival and label retention were analyzed. FINDINGS: Several compounds efficiently eradicated cells with clonogenic capacity, and nanaomycin A, streptonigrin and toyocamycin were effective at 0.1 µM. Other anti-clonogenic but not highly cytotoxic compounds such as bryostatin 1, siomycin A, illudin M, michellamine B and pentoxifylline markedly reduced the frequency of ABCB5 (ATP-binding cassette, sub-family B, member 5)-positive cells. In contrast, treatment with maytansine and colchicine selected for cells expressing this transporter. Maytansine, streptonigrin, toyocamycin and colchicine, even if highly cytotoxic, left a small subpopulation of slow-dividing cells unaffected. Compounds selected in the present study differentially altered the expression of the melanocyte/melanoma-specific microphthalmia-associated transcription factor (MITF) and the proto-oncogene c-MYC. CONCLUSION: Selected anti-clonogenic compounds might be further investigated as potential adjuvants

  19. Nomenclature of inorganic compounds

    International Nuclear Information System (INIS)

    1998-10-01

    This book contains eleven chapters on the nomenclature of inorganic compounds, beginning with an introduction and the general principles of compound nomenclature. It describes the grammar of nomenclature (brackets, the diagonal line, the asterisk, and affixes), elements, atoms and groups of atoms, chemical formulas, naming by stoichiometry, solids, neutral molecular compounds, ions, substituents, radicals and the names of salts, oxo acids and their anions (with an introduction to and definitions of oxo acids), coordination compounds and the symbols of stereochemistry, and boron and hydrogen compounds and related compounds.

  20. Design of a Slowed-Rotor Compound Helicopter for Future Joint Service Missions

    Science.gov (United States)

    Silva, Christopher; Yeo, Hyeonsoo; Johnson, Wayne R.

    2010-01-01

    A slowed-rotor compound helicopter has been synthesized using the NASA Design and Analysis of Rotorcraft (NDARC) conceptual design software. An overview of the design process and the capabilities of NDARC are presented. The benefits of trading rotor speed, wing-rotor lift share, and trim strategies are presented for an example