WorldWideScience

Sample records for statistical multisource-multitarget information

  1. Advances in statistical multisource-multitarget information fusion

    CERN Document Server

    Mahler, Ronald PS

    2014-01-01

This is the sequel to the 2007 Artech House bestselling title, Statistical Multisource-Multitarget Information Fusion. That earlier book was a comprehensive resource for an in-depth understanding of finite-set statistics (FISST), a unified, systematic, and Bayesian approach to information fusion. The cardinalized probability hypothesis density (CPHD) filter, which was first systematically described in the earlier book, has since become a standard multitarget detection and tracking technique, especially in research and development. Since 2007, FISST has inspired a considerable amount of research

  2. Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline.

    Science.gov (United States)

    Zhang, Jie; Li, Qingyang; Caselli, Richard J; Thompson, Paul M; Ye, Jieping; Wang, Yalin

    2017-06-01

Alzheimer's Disease (AD) is the most common type of dementia. Identifying correct biomarkers may determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied to many computer vision and biomedical informatics problems. It aims to improve the generalization performance by exploiting the shared features among different tasks. However, most of the existing algorithms are formulated as supervised learning schemes, whose drawback is either insufficient feature numbers or missing label information. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms.

  3. National Center for Multisource Information Fusion

    Science.gov (United States)

    2009-04-01

The National Center for Multisource Information Fusion (N-CMIF) research was a joint collaboration between CUBRC … (FuSIA). Prior to N-CMIF, the CUBRC/Rochester … Decision-making (INFERD) [2] is a tool developed by CUBRC and Alion Technologies under the ECCARS contract. INFERD is a JDL level 1 …

  4. Multitarget Tracking with Spatial Nonmaximum Suppressed Sensor Selection

    Directory of Open Access Journals (Sweden)

    Liang Ma

    2015-01-01

Full Text Available Multitarget tracking is one of the most important applications of sensor networks, yet it is an extremely challenging problem since multisensor multitarget tracking itself is nontrivial and the difficulty is further compounded by sensor management. Recently, the random finite set based Bayesian framework has opened doors for multitarget tracking with sensor management, which is modelled in the framework of a partially observed Markov decision process (POMDP). However, sensor management posed as a POMDP is in essence a combinatorial optimization problem, which is NP-hard and computationally unacceptable. In this paper, we propose a novel sensor selection method for multitarget tracking. We first present the sequential multi-Bernoulli filter as a centralized multisensor fusion scheme for multitarget tracking. In order to perform sensor selection, we define the hypothesis information gain (HIG) of a sensor to measure its information quantity when the sensor is selected alone. Then, we propose a spatial nonmaximum suppression approach to select sensors with respect to their locations and HIGs. Two distinct implementations are provided using greedy spatial nonmaximum suppression. Simulation results verify the effectiveness of the proposed sensor selection approach for multitarget tracking.
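The record above sketches sensor selection by greedy spatial nonmaximum suppression over hypothesis information gains (HIGs) but gives no pseudocode. The following is a minimal illustrative sketch, with hypothetical sensor positions, HIG values, and suppression radius; it is not the paper's actual formulation.

```python
import math

def select_sensors(sensors, radius):
    """Greedy spatial nonmaximum suppression: repeatedly keep the
    remaining sensor with the highest gain, then suppress all
    unselected sensors within `radius` of it."""
    remaining = sorted(sensors, key=lambda s: s["hig"], reverse=True)
    selected = []
    while remaining:
        best = remaining.pop(0)
        selected.append(best)
        remaining = [s for s in remaining
                     if math.dist(s["pos"], best["pos"]) > radius]
    return selected

# hypothetical sensor locations and HIG scores
sensors = [
    {"pos": (0, 0), "hig": 0.9},
    {"pos": (1, 0), "hig": 0.8},   # close to the first sensor -> suppressed
    {"pos": (5, 5), "hig": 0.7},
]
picked = select_sensors(sensors, radius=2.0)
```

Each round keeps the most informative remaining sensor and discards its near neighbours, so the selected set is both high-gain and spatially spread out.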

  5. Variable cycle control model for intersection based on multi-source information

    Science.gov (United States)

    Sun, Zhi-Yuan; Li, Yue; Qu, Wen-Cong; Chen, Yan-Yan

    2018-05-01

In order to improve the efficiency of traffic control systems in the era of big data, a new variable cycle control model based on multi-source information is presented for intersections in this paper. Firstly, with consideration of multi-source information, a unified framework based on a cyber-physical system is proposed. Secondly, taking into account the variable length of cells, the hysteresis phenomenon of traffic flow and the characteristics of lane groups, a Lane group-based Cell Transmission Model is established to describe the physical properties of traffic flow under different traffic signal control schemes. Thirdly, the variable cycle control problem is abstracted into a bi-level programming model. The upper-level model is put forward for cycle length optimization considering traffic capacity and delay. The lower-level model is a dynamic signal control decision model based on fairness analysis. Then, a Hybrid Intelligent Optimization Algorithm is developed to solve the proposed model. Finally, a case study shows the efficiency and applicability of the proposed model and algorithm.

  6. Mobile e-Commerce Recommendation System Based on Multi-Source Information Fusion for Sustainable e-Business

    Directory of Open Access Journals (Sweden)

    Yan Guo

    2018-01-01

Full Text Available A lack of in-depth mining of user and resource information has become the main bottleneck restricting the predictive power of recommendation systems in mobile commerce. This article provides a method which makes use of multi-source information to analyze consumers' requirements for e-commerce recommendation systems. Combined with the characteristics of mobile e-commerce, this method employs an improved radial basis function (RBF) network to determine the weights of recommendations, and an improved Dempster–Shafer theory to fuse the multi-source information. Power-spectrum estimation is then used to handle the fusion results and support decision-making. The experimental results illustrate that the traditional method is inferior to the proposed approach in terms of recommendation accuracy, simplicity, coverage rate and recall rate. These achievements can further improve recommendation systems, and promote the sustainable development of e-business.
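The fusion step above relies on an improved Dempster–Shafer rule whose details are not given in this record. As background only, the sketch below implements the classical Dempster rule of combination on hypothetical mass functions for a two-hypothesis recommendation decision.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over subsets of a frame of
    discernment using the classical Dempster rule; focal elements
    are represented as frozensets."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:  # intersecting focal elements reinforce each other
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:      # contradictory pairs contribute to conflict mass
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # renormalize surviving masses by 1 - conflict
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# hypothetical evidence about one item from two information sources
A, B = frozenset({"buy"}), frozenset({"skip"})
m_browse = {A: 0.6, B: 0.1, A | B: 0.3}   # from browsing history
m_rating = {A: 0.5, B: 0.2, A | B: 0.3}   # from ratings
fused = dempster_combine(m_browse, m_rating)
```

Masses on intersecting focal elements are multiplied and accumulated, mass lost to contradictory pairs is measured as conflict, and the surviving masses are renormalized by 1 - conflict.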

  7. Multisource feedback to graduate nurses: a multimethod study.

    Science.gov (United States)

    McPhee, Samantha; Phillips, Nicole M; Ockerby, Cherene; Hutchinson, Alison M

    2017-11-01

    (1) To explore graduate nurses' perceptions of the influence of multisource feedback on their performance and (2) to explore perceptions of Clinical Nurse Educators involved in providing feedback regarding feasibility and benefit of the approach. Graduate registered nurses are expected to provide high-quality care for patients in demanding and unpredictable clinical environments. Receiving feedback is essential to their development. Performance appraisals are a common method used to provide feedback and typically involve a single source of feedback. Alternatively, multisource feedback allows the learner to gain insight into performance from a variety of perspectives. This study explores multisource feedback in an Australian setting within the graduate nurse context. Multimethod study. Eleven graduates were given structured performance feedback from four raters: Nurse Unit Manager, Clinical Nurse Educator, preceptor and a self-appraisal. Thirteen graduates received standard single-rater appraisals. Data regarding perceptions of feedback for both groups were obtained using a questionnaire. Semistructured interviews were conducted with nurses who received multisource feedback and the educators. In total, 94% (n = 15) of survey respondents perceived feedback was important during the graduate year. Four themes emerged from interviews: informal feedback, appropriateness of raters, elements of delivery and creating an appraisal process that is 'more real'. Multisource feedback was perceived as more beneficial compared to single-rater feedback. Educators saw value in multisource feedback; however, perceived barriers were engaging raters and collating feedback. Some evidence exists to indicate that feedback from multiple sources is valued by graduates. Further research in a larger sample and with more experienced nurses is required. Evidence resulting from this study indicates that multisource feedback is valued by both graduates and educators and informs graduates

  8. Motives for Multisourcing in the IT Sector

    Directory of Open Access Journals (Sweden)

    Łoboda Barbara

    2014-10-01

Full Text Available Multisourcing is a relatively new phenomenon that emerged a decade ago as companies began developing strategies to split large information technology (IT) contracts into smaller ones. This provided the opportunity to choose best-of-breed suppliers, who were expected to collaborate to provide a seamless service to the company. Firms began to multisource when their large IT contracts did not deliver the assumed benefits. At the same time, the business environment was forcing them to be flexible, efficient and able to quickly implement new technologies. The trend to multisource has been growing, so it is worth investigating why companies prefer this form of cooperation. This topic has not previously been a subject of research.

  9. Metabolism-Activated Multitargeting (MAMUT): An Innovative Multitargeting Approach to Drug Design and Development.

    Science.gov (United States)

    Mátyus, Péter; Chai, Christina L L

    2016-06-20

    Multitargeting is a valuable concept in drug design for the development of effective drugs for the treatment of multifactorial diseases. This concept has most frequently been realized by incorporating two or more pharmacophores into a single hybrid molecule. Many such hybrids, due to the increased molecular size, exhibit unfavorable physicochemical properties leading to adverse effects and/or an inappropriate ADME (absorption, distribution, metabolism, and excretion) profile. To avoid this limitation and achieve additional therapeutic benefits, here we describe a novel multitargeting strategy based on the synergistic effects of a parent drug and its active metabolite(s). The concept of metabolism-activated multitargeting (MAMUT) is illustrated using a number of examples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Real-Time Multi-Target Localization from Unmanned Aerial Vehicles

    Directory of Open Access Journals (Sweden)

    Xuan Wang

    2016-12-01

Full Text Available In order to improve the reconnaissance efficiency of unmanned aerial vehicle (UAV) electro-optical stabilized imaging systems, a real-time multi-target localization scheme based on a UAV electro-optical stabilized imaging system is proposed. First, a target location model is studied. Then, the geodetic coordinates of multi-targets are calculated using the homogeneous coordinate transformation. On this basis, two methods which can improve the accuracy of the multi-target localization are proposed: (1) a real-time zoom lens distortion correction method; (2) a recursive least squares (RLS) filtering method based on UAV dead reckoning. The multi-target localization error model is established using Monte Carlo theory. In an actual flight, the UAV flight altitude was 1140 m. The multi-target localization results are within the range of allowable error. After applying the lens distortion correction method to a single image, the circular error probability (CEP) of the multi-target localization is reduced by 7%, and 50 targets can be located at the same time. The RLS algorithm can adaptively estimate the location data based on multiple images. Compared with multi-target localization based on a single image, the CEP of the multi-target localization using RLS is reduced by 25%. The proposed method can be implemented on a small circuit board to operate in real time. This research is expected to significantly benefit small UAVs which need multi-target geo-location functions.
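The abstract does not spell out its RLS formulation, which is tied to UAV dead reckoning. Purely as an illustration, the sketch below shows the generic scalar RLS update for repeatedly estimating a fixed coordinate from noisy fixes; the measurement values are hypothetical.

```python
class RLSEstimator:
    """Recursive least squares for a constant parameter observed
    through noisy measurements y_k = x + v_k (unit regressor,
    unit measurement-noise variance)."""
    def __init__(self, x0=0.0, p0=1e6):
        self.x = x0   # current estimate
        self.p = p0   # estimate covariance (large = uninformative prior)

    def update(self, y):
        k = self.p / (self.p + 1.0)   # RLS gain
        self.x += k * (y - self.x)    # correct estimate toward the fix
        self.p *= (1.0 - k)           # shrink covariance
        return self.x

est = RLSEstimator()
for y in [10.2, 9.8, 10.1, 9.9]:   # hypothetical east-coordinate fixes (m)
    est.update(y)
```

With an uninformative prior (large p0), the estimate converges to the running mean of the fixes, which is the least-squares solution for a constant parameter.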

  11. Mixed labelling in multitarget particle filtering

    NARCIS (Netherlands)

    Boers, Y.; Sviestins, Egils; Driessen, Hans

    2010-01-01

    The so-called mixed labelling problem inherent to a joint state multitarget particle filter implementation is treated. The mixed labelling problem would be prohibitive for track extraction from a joint state multitarget particle filter. It is shown, using the theory of Markov chains, that the mixed

  12. Multi-target molecular imaging and its progress in research and application

    International Nuclear Information System (INIS)

    Tang Ganghua

    2011-01-01

Multi-target molecular imaging (MMI) is an important field of research in molecular imaging. It includes multi-tracer multi-target molecular imaging (MTMI), fusion-molecule multi-target imaging (FMMI), coupling-molecule multi-target imaging (CMMI), and multi-target multifunctional molecular imaging (MMMI). In this paper, imaging modes of MMI are reviewed, and potential applications of positron emission tomography MMI in the near future are discussed. (author)

  13. Application of Ontology Technology in Health Statistic Data Analysis.

    Science.gov (United States)

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

Research Purpose: to establish a health management ontology for the analysis of health statistic data. Proposed Methods: this paper established a health management ontology based on the analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and the object properties and data properties were defined to establish the construction of these classes. By ontology instantiation, we can integrate multi-source heterogeneous data and enable administrators to have an overall understanding and analysis of the health statistic data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source and heterogeneous health system management data and enhancement of management efficiency.

  14. Measuring Conflict in a Multi-Source Environment as a Normal Measure

    OpenAIRE

    Wei, Pan; Ball, John E.; Anderson, Derek T.; Harsh, Archit; Archibald, Christopher

    2018-01-01

    In a multi-source environment, each source has its own credibility. If there is no external knowledge about credibility then we can use the information provided by the sources to assess their credibility. In this paper, we propose a way to measure conflict in a multi-source environment as a normal measure. We examine our algorithm using three simulated examples of increasing conflict and one experimental example. The results demonstrate that the proposed measure can represent conflict in a me...

  15. Integration of multi-source data in mineral exploration

    DEFF Research Database (Denmark)

    Conradsen, Knut; Ersbøll, Bjarne Kjær; Nielsen, Allan Aasbjerg

    1991-01-01

    This paper describes several multivariate statistical analysis applications of geochemical, geophysical, and spectral variables in mineral exploration. Mahalanobis' distance is described in some detail and based on four multisource variables this measure is applied to produce a map that gives an ...... of automatically generated linear features based on Landsat TM data. The results indicate among other things a not previously recognized subsurface continuation of an already mapped lineament....
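Mahalanobis' distance, which the record above applies to four multisource variables, is easy to illustrate in the two-variable case. The sketch below uses hypothetical geochemical background statistics and inverts the 2x2 covariance in closed form.

```python
def mahalanobis2(x, mean, cov):
    """Mahalanobis distance of a 2-variable observation from a
    background distribution with the given mean and 2x2 covariance."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    # closed-form inverse of a 2x2 matrix
    inv = ((d / det, -b / det), (-c / det, a / det))
    dx = (x[0] - mean[0], x[1] - mean[1])
    # quadratic form dx^T * inv(cov) * dx
    q = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
         + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    return q ** 0.5

# hypothetical background statistics for two geochemical variables
mean = (5.0, 20.0)
cov = ((4.0, 2.0), (2.0, 9.0))
d = mahalanobis2((9.0, 26.0), mean, cov)
```

Observations with a large distance (here sqrt(6), about 2.45) lie far from the background distribution once correlations between the variables are accounted for, and can be flagged as anomalies on a map.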

  16. Fusion-based multi-target tracking and localization for intelligent surveillance systems

    Science.gov (United States)

    Rababaah, Haroun; Shirkhodaie, Amir

    2008-04-01

In this paper, we present two approaches addressing visual target tracking and localization in complex urban environments: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for associated tracking of multi-targets of interest. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian mixture model. Foreground segmentation, on the other hand, was achieved by the connected components analysis (CCA) technique. The tracking of individual targets was estimated by fusing two sources of information: the centroid with spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image-pixel-to-world-coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs. In coarse estimation, an approximate neighborhood of the target position is estimated based on a nearest-4-neighbor method, and in fine estimation, we use Euclidean interpolation to localize the position within the estimated four neighbors. Both techniques were tested and showed reliable results for tracking and localization of targets of interest in complex urban environments.

  17. Multi-source remote sensing data management system

    International Nuclear Information System (INIS)

    Qin Kai; Zhao Yingjun; Lu Donghua; Zhang Donghui; Wu Wenhuan

    2014-01-01

In this thesis, the author explored multi-source management problems of remote sensing data. The main idea is to use the mosaic dataset model and ways of integrated display of imagery and its interpretation. Based on ArcGIS and the IMINT feature knowledge platform, the author used C# and other programming tools for development, so as to design and implement the function modules of a multi-source remote sensing data management system which is able to simply, conveniently and efficiently manage multi-source remote sensing data. (authors)

  18. Conjugate-Gradient Neural Networks in Classification of Multisource and Very-High-Dimensional Remote Sensing Data

    Science.gov (United States)

    Benediktsson, J. A.; Swain, P. H.; Ersoy, O. K.

    1993-01-01

Application of neural networks to classification of remote sensing data is discussed. Conventional two-layer backpropagation is found to give good results in classification of remote sensing data but is not efficient in training. A more efficient variant, based on conjugate-gradient optimization, is used for classification of multisource remote sensing and geographic data and very-high-dimensional data. The conjugate-gradient neural networks give excellent performance in classification of multisource data, but do not compare as well with statistical methods in classification of very-high-dimensional data.

  19. Multi-target consensus circle pursuit for multi-agent systems via a distributed multi-flocking method

    Science.gov (United States)

    Pei, Huiqin; Chen, Shiming; Lai, Qiang

    2016-12-01

This paper studies the multi-target consensus pursuit problem of multi-agent systems. To solve the problem, a distributed multi-flocking method is designed based on partial information exchange, which is employed to realise multi-target pursuit with a uniform distribution of the number of pursuing agents per dynamic target. Combined with the proposed circle formation control strategy, agents can adaptively choose a target to form different circle formation groups, accomplishing multi-target pursuit. The speed state of the pursuing agents in each group converges to the same value. A Lyapunov approach is utilised to analyse the stability of the multi-agent systems. In addition, a sufficient condition is given for achieving the dynamic target consensus pursuit, which is then analysed. Finally, simulation results verify the effectiveness of the proposed approaches.

  20. Multisource Least-squares Reverse Time Migration

    KAUST Repository

    Dai, Wei

    2012-12-01

Least-squares migration has been shown to be able to produce high quality migration images, but its computational cost is considered to be too high for practical imaging. In this dissertation, a multisource least-squares reverse time migration (LSRTM) algorithm is proposed to increase the computational efficiency by up to 10 times by utilizing the blended-source processing technique. There are three main chapters in this dissertation. In Chapter 2, the multisource LSRTM algorithm is implemented with random time-shift and random source polarity encoding functions. Numerical tests on the 2D HESS VTI data show that the multisource LSRTM algorithm suppresses migration artifacts, balances the amplitudes, improves image resolution, and reduces crosstalk noise associated with the blended shot gathers. For this example, multisource LSRTM is about three times faster than the conventional RTM method. For the 3D example of the SEG/EAGE salt model, with comparable computational cost, multisource LSRTM produces images with more accurate amplitudes, better spatial resolution, and fewer migration artifacts compared to conventional RTM. The empirical results suggest that multisource LSRTM can produce more accurate reflectivity images than conventional RTM does with similar or less computational cost. The caveat is that the LSRTM image is sensitive to large errors in the migration velocity model. In Chapter 3, the multisource LSRTM algorithm is implemented with a frequency-selection encoding strategy and applied to marine streamer data, for which traditional random encoding functions are not applicable. The frequency-selection encoding functions are delta functions in the frequency domain, so that all the encoded shots have unique non-overlapping frequency content. Therefore, the receivers can distinguish the wavefield from each shot according to the frequencies. With the frequency-selection encoding method, the computational efficiency of LSRTM is increased so that its cost is

  1. Multisource full waveform inversion of marine streamer data with frequency selection

    KAUST Repository

    Huang, Yunsong; Schuster, Gerard T.

    2013-01-01

Multisource migration with frequency selection is now extended to multisource full waveform inversion (FWI) of supergathers for marine streamer data. There are three advantages of this approach compared to conventional FWI for marine streamer data. 1. The multisource FWI method with frequency selection is computationally more efficient than conventional FWI. 2. A supergather requires more than an order of magnitude less storage than the original data. 3. Frequency selection overcomes the acquisition mismatch between the observed data and the simulated multisource supergathers for marine data. This mismatch problem has prevented the efficient application of FWI to marine geometries in the space-time domain. Preliminary results of applying multisource FWI with frequency selection to a synthetic marine data set suggest it is at least four times more efficient than standard FWI.

  2. Multi-source least-squares reverse time migration

    KAUST Repository

    Dai, Wei

    2012-06-15

Least-squares migration has been shown to improve image quality compared to the conventional migration method, but its computational cost is often too high to be practical. In this paper, we develop two numerical schemes to implement least-squares migration with the reverse time migration method and the blended source processing technique to increase computation efficiency. By iterative migration of supergathers, which consist of a sum of many phase-encoded shots, the image quality is enhanced and the crosstalk noise associated with the encoded shots is reduced. Numerical tests on 2D HESS VTI data show that the multisource least-squares reverse time migration (LSRTM) algorithm suppresses migration artefacts, balances the amplitudes, improves image resolution and reduces crosstalk noise associated with the blended shot gathers. For this example, the multisource LSRTM is about three times faster than the conventional RTM method. For the 3D example of the SEG/EAGE salt model, with a comparable computational cost, multisource LSRTM produces images with more accurate amplitudes, better spatial resolution and fewer migration artefacts compared to conventional RTM. The empirical results suggest that multisource LSRTM can produce more accurate reflectivity images than conventional RTM does with a similar or less computational cost. The caveat is that the LSRTM image is sensitive to large errors in the migration velocity model. © 2012 European Association of Geoscientists & Engineers.

  3. Multi-source least-squares reverse time migration

    KAUST Repository

    Dai, Wei; Fowler, Paul J.; Schuster, Gerard T.

    2012-01-01

Least-squares migration has been shown to improve image quality compared to the conventional migration method, but its computational cost is often too high to be practical. In this paper, we develop two numerical schemes to implement least-squares migration with the reverse time migration method and the blended source processing technique to increase computation efficiency. By iterative migration of supergathers, which consist of a sum of many phase-encoded shots, the image quality is enhanced and the crosstalk noise associated with the encoded shots is reduced. Numerical tests on 2D HESS VTI data show that the multisource least-squares reverse time migration (LSRTM) algorithm suppresses migration artefacts, balances the amplitudes, improves image resolution and reduces crosstalk noise associated with the blended shot gathers. For this example, the multisource LSRTM is about three times faster than the conventional RTM method. For the 3D example of the SEG/EAGE salt model, with a comparable computational cost, multisource LSRTM produces images with more accurate amplitudes, better spatial resolution and fewer migration artefacts compared to conventional RTM. The empirical results suggest that multisource LSRTM can produce more accurate reflectivity images than conventional RTM does with a similar or less computational cost. The caveat is that the LSRTM image is sensitive to large errors in the migration velocity model. © 2012 European Association of Geoscientists & Engineers.

  4. Field Trials of the Multi-Source Approach for Resistivity and Induced Polarization Data Acquisition

    Science.gov (United States)

    LaBrecque, D. J.; Morelli, G.; Fischanger, F.; Lamoureux, P.; Brigham, R.

    2013-12-01

Implementing systems of distributed receivers and transmitters for resistivity and induced polarization data is an almost inevitable result of the availability of wireless data communication modules and GPS modules offering precise timing and instrument locations. Such systems have a number of advantages; for example, they can be deployed around obstacles such as rivers, canyons, or mountains which would be difficult with traditional 'hard-wired' systems. However, deploying a system of identical, small, battery powered transceivers, each capable of injecting a known current and measuring the induced potential, has an additional and less obvious advantage in that multiple units can inject current simultaneously. The original purpose for using multiple simultaneous current sources (multi-source) was to increase signal levels. In traditional systems, to double the received signal you inject twice the current, which requires you to apply twice the voltage and thus four times the power. Alternatively, one approach to increasing signal levels for large-scale surveys collected using small, battery powered transceivers is to allow multiple units to transmit in parallel. In theory, using four 400 watt transmitters on separate, parallel dipoles yields roughly the same signal as a single 6400 watt transmitter. Furthermore, implementing the multi-source approach creates the opportunity to apply more complex current flow patterns than simple, parallel dipoles. For a perfect, noise-free system, the multi-source approach adds no new information to a data set that contains a comprehensive set of data collected using single sources. However, for realistic, noisy systems, it appears that multi-source data can substantially impact survey results. In preliminary model studies, the multi-source data produced such startling improvements in subsurface images that even the authors questioned their veracity. 
Between December of 2012 and July of 2013, we completed multi-source surveys at five sites

  5. Full Waveform Inversion with Multisource Frequency Selection of Marine Streamer Data

    KAUST Repository

    Huang, Yunsong

    2017-10-27

    The theory and practice of multisource full waveform inversion of marine supergathers are described with a frequency-selection strategy. The key enabling property of frequency selection is that it eliminates the crosstalk among sources, thus overcoming the aperture mismatch of marine multisource inversion. Tests on multisource full waveform inversion of synthetic marine data and Gulf of Mexico data show speedups of 4× and 8×, respectively, compared to conventional full waveform inversion.

  6. Full Waveform Inversion with Multisource Frequency Selection of Marine Streamer Data

    KAUST Repository

    Huang, Yunsong; Schuster, Gerard T.

    2017-01-01

    The theory and practice of multisource full waveform inversion of marine supergathers are described with a frequency-selection strategy. The key enabling property of frequency selection is that it eliminates the crosstalk among sources, thus overcoming the aperture mismatch of marine multisource inversion. Tests on multisource full waveform inversion of synthetic marine data and Gulf of Mexico data show speedups of 4× and 8×, respectively, compared to conventional full waveform inversion.

  7. Box-Particle Cardinality Balanced Multi-Target Multi-Bernoulli Filter

    OpenAIRE

    L. Song; X. Zhao

    2014-01-01

As a generalization of particle filtering, the box-particle filter (Box-PF) has the potential to process measurements affected by bounded errors of unknown distribution and biases. Inspired by the Box-PF, a novel implementation for multi-target tracking, called the box-particle cardinality balanced multi-target multi-Bernoulli (Box-CBMeMBer) filter, is presented in this paper. More importantly, to eliminate the negative effect of clutter in the estimation of the numbers of targets, an improved generali...

  8. Multi-targeted priming for genome-wide gene expression assays

    Directory of Open Access Journals (Sweden)

    Adomas Aleksandra B

    2010-08-01

Full Text Available Abstract. Background: Complementary approaches to assaying global gene expression are needed to assess gene expression in regions that are poorly assayed by current methodologies. A key component of nearly all gene expression assays is the reverse transcription of transcribed sequences, which has traditionally been performed by priming the poly-A tails on many of the transcribed genes in eukaryotes with oligo-dT, or by priming RNA indiscriminately with random hexamers. We designed an algorithm to find common sequence motifs that were present within most protein-coding genes of Saccharomyces cerevisiae and of Neurospora crassa, but that were not present within their ribosomal RNA or transfer RNA genes. We then experimentally tested whether degenerately priming these motifs with multi-targeted primers improved the accuracy and completeness of transcriptomic assays. Results: We discovered two multi-targeted primers that would prime a preponderance of genes in the genomes of Saccharomyces cerevisiae and Neurospora crassa while avoiding priming ribosomal RNA or transfer RNA. Examining the response of Saccharomyces cerevisiae to nitrogen deficiency and profiling Neurospora crassa early sexual development, we demonstrated that using multi-targeted primers in reverse transcription led to superior performance of microarray profiling and next-generation RNA tag sequencing. Priming with multi-targeted primers in addition to oligo-dT resulted in higher sensitivity, a larger number of well-measured genes and greater power to detect differences in gene expression. Conclusions: Our results provide the most complete and detailed expression profiles of the yeast nitrogen starvation response and N. crassa early sexual development to date. Furthermore, our multi-targeted priming methodology for genome-wide gene expression assays provides selective targeting of multiple sequences and counter-selection against undesirable sequences, facilitating a more complete and


  10. A Decision-Making Method with Grey Multi-Source Heterogeneous Data and Its Application in Green Supplier Selection.

    Science.gov (United States)

    Sun, Huifang; Dang, Yaoguo; Mao, Wenxin

    2018-03-03

    In view of the multi-attribute decision-making problem that the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, then a grey relational bi-directional projection ranking method is presented. Considering the multi-attribute multi-level decision structure and the causalities between attributes in decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method.
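The kernel and greyness degree have standard definitions in grey system theory: for an interval grey number [lo, hi], the kernel is commonly taken as the midpoint and the greyness degree as the interval width relative to the background field. A minimal sketch of the whitening step under those common definitions (not the authors' code):

```python
def kernel(lo, hi):
    # kernel of an interval grey number [lo, hi]: its midpoint
    return (lo + hi) / 2.0

def greyness(lo, hi, field_lo, field_hi):
    # greyness degree: interval width relative to the background field
    return (hi - lo) / (field_hi - field_lo)

def whiten(sequence, field):
    """Map a sequence of interval grey numbers to (kernel vector,
    greyness-degree vector). White (crisp) numbers are intervals with
    lo == hi, so their greyness degree is 0."""
    ks = [kernel(lo, hi) for lo, hi in sequence]
    gs = [greyness(lo, hi, *field) for lo, hi in sequence]
    return ks, gs
```

The paper's grey relational bi-directional projection ranking then operates on these two vectors; that step is omitted here.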

  11. The TRADEX Multitarget Tracker

    Science.gov (United States)

    Meurer, Glenn W., Jr.

    The Multitarget Tracker (MTT) is a real-time signal processing and data processing system installed in the TRADEX radar at the Kiernan Reentry Measurements Site (KREMS) on Kwajalein Atoll in the Marshall Islands. The TRADEX radar is a high-power, high-sensitivity instrumentation radar that was originally designed to track and gather signature data on a single target. The MTT is designed to detect and track as many as 63 targets within the beam of the radar. It provides data necessary for determining the angular locations and ranges of all of these targets, as well as signature data necessary for target identification. The TRADEX MTT is unique because it utilizes a large, mechanically steered, pencil-beam antenna, whereas other MTT systems generally rely on electronically steered antennas or rotating antenna platforms. The MTT system automatically processes received signals, reports targets, initiates and maintains target track files, and presents target information to the radar operators through real-time interactive graphical displays. This information is given to the KREMS Control Center and from there is made available to other systems in the test range. This article presents an overview of the TRADEX MTT system and discusses its implementation, application, and operation.

  12. Multisource waveform inversion of marine streamer data using normalized wavefield

    KAUST Repository

    Choi, Yun Seok

    2013-09-01

    Multisource full-waveform inversion based on the L1- and L2-norm objective functions cannot be applied to marine streamer data because it does not take into account the unmatched acquisition geometries between the observed and modeled data. To apply multisource full-waveform inversion to marine streamer data, we construct the L1- and L2-norm objective functions using the normalized wavefield. The new residual seismograms obtained from the L1- and L2-norms using the normalized wavefield mitigate the problem of unmatched acquisition geometries, which enables multisource full-waveform inversion to work with marine streamer data. In the new approaches using the normalized wavefield, we used the back-propagation algorithm based on the adjoint-state technique to efficiently calculate the gradients of the objective functions. Numerical examples showed that multisource full-waveform inversion using the normalized wavefield yields much better convergence for marine streamer data than conventional approaches. © 2013 Society of Exploration Geophysicists.
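The normalized-wavefield residual described here can be sketched by normalizing each trace to unit energy before differencing, so that absolute energy-level mismatches between observed and modeled data drop out. A toy illustration, assuming trace-by-trace normalization (not the authors' implementation):

```python
import numpy as np

def normalized_residual(modeled, observed):
    """L2 residual between per-trace energy-normalized wavefields.
    Rows are traces (receivers), columns are time samples."""
    def normalize(d):
        norms = np.linalg.norm(d, axis=1, keepdims=True)
        return d / np.where(norms == 0, 1.0, norms)  # guard dead traces
    return normalize(modeled) - normalize(observed)
```

Scaling a trace by any positive constant leaves the residual unchanged, which is the property that makes the objective insensitive to unmatched energy levels.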

  13. Multisource feedback analysis of pediatric outpatient teaching.

    Science.gov (United States)

    Tiao, Mao-Meng; Huang, Li-Tung; Huang, Ying-Hsien; Tang, Kuo-Shu; Chen, Chih-Jen

    2013-11-01

    This study aims to evaluate the outpatient communication skills of medical students via multisource feedback, which may be useful to map future directions in improving physician-patient communication. Family respondents of patients, a nurse, a clinical teacher, and a research assistant evaluated video-recorded medical students' interactions with outpatients by using multisource feedback questionnaires; students also assessed their own skills. The questionnaire was answered based on the video-recorded interactions between outpatients and the medical students. A total of 60 family respondents of the 60 patients completed the questionnaires, 58 (96.7%) of them agreed with the video recording. Two reasons for reluctance were "personal privacy" issues and "simply disagree" with the video recording. The average satisfaction score of the 58 students was 85.1 points, indicating students' performance was in the category between satisfied and very satisfied. The family respondents were most satisfied with the "teacher's attitude", followed by "teaching quality". In contrast, the family respondents were least satisfied with "being open to questions". Among the 6 assessment domains of communication skills, the students scored highest on "explaining" and lowest on "giving recommendations". In the detailed assessment by family respondents, the students scored lowest on "asking about life/school burden". In the multisource analysis, the nurses' mean score was much higher and the students' mean self-assessment score was lower than the average scores on all domains. The willingness and satisfaction of family respondents were high in this study. Students scored the lowest on giving recommendations to patients. Multisource feedback with video recording is useful in providing more accurate evaluation of students' communication competence and in identifying the areas of communication that require enhancement.

  14. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    Science.gov (United States)

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e., multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was used to correct the multiple surface reflectance datasets based on the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained over two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and their corresponding
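The Gaussian-based correction step, taking the fine-scale reflectance as baseline, can be illustrated as simple moment matching: shift and scale a dataset so its mean and standard deviation agree with the baseline's. This is a simplified sketch of the idea, not the published method:

```python
import numpy as np

def match_to_baseline(data, baseline):
    """Correct a reflectance dataset by matching its mean and standard
    deviation to those of the baseline (fine-scale) dataset."""
    mu_d, sd_d = data.mean(), data.std()
    mu_b, sd_b = baseline.mean(), baseline.std()
    if sd_d == 0:
        return np.full_like(data, mu_b)  # degenerate: constant input
    return (data - mu_d) * (sd_b / sd_d) + mu_b
```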

  15. The Finnish multisource national forest inventory: small-area estimation and map production

    Science.gov (United States)

    Erkki Tomppo

    2009-01-01

    A driving force motivating development of the multisource national forest inventory (MS-NFI) in connection with the Finnish national forest inventory (NFI) was the desire to obtain forest resource information for smaller areas than is possible using field data only without significantly increasing the cost of the inventory. A basic requirement for the method was that...

  16. Improved genome-scale multi-target virtual screening via a novel collaborative filtering approach to cold-start problem.

    Science.gov (United States)

    Lim, Hansaim; Gray, Paul; Xie, Lei; Poleksic, Aleksandar

    2016-12-13

    The conventional one-drug-one-gene approach has had limited success in modern drug discovery. Polypharmacology, which focuses on searching for multi-targeted drugs to perturb disease-causing networks instead of designing selective ligands to target individual proteins, has emerged as a new drug discovery paradigm. Although many methods for single-target virtual screening have been developed to improve the efficiency of drug discovery, few of these algorithms are designed for polypharmacology. Here, we present a novel theoretical framework and a corresponding algorithm for genome-scale multi-target virtual screening based on the one-class collaborative filtering technique. Our method overcomes the sparseness of the protein-chemical interaction data by means of interaction matrix weighting and dual regularization from both chemicals and proteins. While the statistical foundation behind our method is general enough to encompass genome-wide drug off-target prediction, the program is specifically tailored to find protein targets for new chemicals with little to no available interaction data. We extensively evaluate our method using a number of the most widely accepted gene-specific and cross-gene family benchmarks and demonstrate that our method outperforms other state-of-the-art algorithms for predicting the interaction of new chemicals with multiple proteins. Thus, the proposed algorithm may provide a powerful tool for multi-target drug design.
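The weighted one-class matrix-factorization idea can be sketched as gradient descent on a confidence-weighted reconstruction loss with L2 penalties on both the chemical and protein factor matrices. Note this is a deliberate simplification: the paper's dual regularization uses chemical- and protein-side similarity structure, which is replaced here by plain L2 terms.

```python
import numpy as np

def weighted_mf(R, W, rank=2, lam=0.1, lr=0.05, iters=500, seed=0):
    """Weighted one-class matrix factorization sketch: R holds observed
    chemical-protein interactions (0/1), W holds confidence weights.
    L2 regularization is applied to both factor matrices."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = 0.1 * rng.standard_normal((n, rank))   # chemical factors
    V = 0.1 * rng.standard_normal((m, rank))   # protein factors
    for _ in range(iters):
        E = W * (U @ V.T - R)                  # weighted residual
        U -= lr * (E @ V + lam * U)
        V -= lr * (E.T @ U + lam * V)
    return U @ V.T                             # predicted interaction scores
```

In the one-class setting, unobserved pairs would get low weights in W rather than being treated as confirmed negatives.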

  17. Multisource Least-squares Reverse Time Migration

    KAUST Repository

    Dai, Wei

    2012-01-01

    is implemented with random time-shift and random source polarity encoding functions. Numerical tests on the 2D HESS VTI data show that the multisource LSRTM algorithm suppresses migration artifacts, balances the amplitudes, improves image resolution, and reduces

  18. An XML schema for automated data integration in a Multi-Source Information System dedicated to end-stage renal disease.

    Science.gov (United States)

    Dufour, Eric; Ben Saïd, Mohamed; Jais, Jean Philippe; Le Mignot, Loic; Richard, Jean-Baptiste; Landais, Paul

    2009-01-01

    Data exchange and interoperability between clinical information systems represent a crucial issue in the context of patient record data collection. An XML representation schema adapted to end-stage renal disease (ESRD) patients was developed and successfully tested against patient data in the dedicated Multi-Source Information System (MSIS) active file (more than 16,000 patient records). The ESRD-XML-Schema is organized into schema subsets respecting the coherence of the clinical information and enriched with coherent data types. Tests were run against XML data files generated in conformity with the ESRD-XML-Schema. Manual tests validated the XML schema's data format and content. Programmatic tests allowed the design of generic XML parsing routines, a portable object data model representation and the implementation of automatic data-exchange flows with the MSIS database system. The ESRD-XML-Schema represents a valid framework for data exchange and supports interoperability. Its modular design offers the opportunity to simplify physicians' multiple tasks and lets them prioritize their clinical work.

  19. Benzimidazoles: an ideal privileged drug scaffold for the design of multitargeted anti-inflammatory ligands.

    Science.gov (United States)

    Kaur, Gaganpreet; Kaur, Maninder; Silakari, Om

    2014-01-01

    Recent research endeavors to discover multi-target ligands, an increasingly feasible and attractive alternative to existing mono-targeted drugs for treating the complex, multi-factorial inflammation process that underlies a plethora of debilitating health conditions. To pursue this option, exploration of a relevant chemical core scaffold is essential. The privileged benzimidazole scaffold, a historically versatile structural motif, offers a viable starting point in the search for novel multi-target ligands against the multi-factorial inflammation process since, when appropriately substituted, it can selectively modulate diverse receptors, pathways and enzymes associated with the pathogenesis of inflammation. Despite this remarkable capability, the multi-target capacity of the benzimidazole scaffold remains largely unexploited. With this in focus, the present review attempts to provide a synopsis of published research exemplifying the valuable use of the benzimidazole nucleus, and focuses on its suitability as a starting scaffold for developing multi-targeted anti-inflammatory ligands.

  20. An Automatic Multi-Target Independent Analysis Framework for Non-Planar Infrared-Visible Registration.

    Science.gov (United States)

    Sun, Xinglong; Xu, Tingfa; Zhang, Jizhou; Zhao, Zishu; Li, Yuankun

    2017-07-26

    In this paper, we propose a novel automatic multi-target registration framework for non-planar infrared-visible videos. Previous approaches usually analyzed multiple targets together and then estimated a global homography for the whole scene, however, these cannot achieve precise multi-target registration when the scenes are non-planar. Our framework is devoted to solving the problem using feature matching and multi-target tracking. The key idea is to analyze and register each target independently. We present a fast and robust feature matching strategy, where only the features on the corresponding foreground pairs are matched. Besides, new reservoirs based on the Gaussian criterion are created for all targets, and a multi-target tracking method is adopted to determine the relationships between the reservoirs and foreground blobs. With the matches in the corresponding reservoir, the homography of each target is computed according to its moving state. We tested our framework on both public near-planar and non-planar datasets. The results demonstrate that the proposed framework outperforms the state-of-the-art global registration method and the manual global registration matrix in all tested datasets.
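Registering each target independently amounts to fitting one homography per target from that target's own feature matches. A minimal sketch of homography estimation via the direct linear transform (plain DLT from four or more correspondences, without the paper's tracking, reservoirs, or robust matching):

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: homography H with dst ~ H @ src,
    estimated from >= 4 point correspondences of one target."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # the homography is the null vector of A, i.e. the last row of Vt
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Map a 2D point through a homography (homogeneous coordinates)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

A production pipeline would wrap this in RANSAC to reject bad matches; the sketch assumes clean correspondences.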

  1. Multisource Assessment of Children's Social Competence

    NARCIS (Netherlands)

    Junttila, N.; Voeten, M.J.M.; Kaukiainen, A.; Vauras, M.M.S.

    2006-01-01

    The Multisource Assessment of Social Competence Scale was developed, based on the School Social Behavior Scale and examined to test the factor pattern and the consistency of the ratings of self, peers, teachers, and parents. The findings of the confirmatory factor analysis supported a four-factor

  2. Information selection and signal probability in multisource monitoring under the influence of centrally active drugs : Phentermine versus pentobarbital

    NARCIS (Netherlands)

    Volkerts, E.R; van Laar, M.W; Verbaten, M.N; Mulder, G.; Maes, R.A A

    1996-01-01

    The present study is concerned with the relationship between drug-induced arousal shifts and sampling (monitoring) behaviour in a three-source task with a priori signal occurrence probabilities of 0.6, 0.3, and 0.1. The multisource monitoring task and procedure were adopted from Hockey (1973) who

  3. Physics Model-Based Scatter Correction in Multi-Source Interior Computed Tomography.

    Science.gov (United States)

    Gong, Hao; Li, Bin; Jia, Xun; Cao, Guohua

    2018-02-01

    Multi-source interior computed tomography (CT) has a great potential to provide ultra-fast and organ-oriented imaging at low radiation dose. However, X-ray cross scattering from multiple simultaneously activated X-ray imaging chains compromises imaging quality. Previously, we published two hardware-based scatter correction methods for multi-source interior CT. Here, we propose a software-based scatter correction method, with the benefit of requiring no hardware modifications. The new method is based on a physics model and an iterative framework. The physics model was derived analytically and was used to calculate X-ray scattering signals in both the forward direction and cross directions in multi-source interior CT. The physics model was integrated into an iterative scatter correction framework to reduce scatter artifacts. The method was applied to phantom data from both Monte Carlo simulations and physical experiments designed to emulate image acquisition in a multi-source interior CT architecture recently proposed by our team. The proposed scatter correction method reduced scatter artifacts significantly, even with only one iteration. Within a few iterations, the reconstructed images converged quickly toward the "scatter-free" reference images. After applying the scatter correction method, the maximum CT number error at the regions-of-interest (ROIs) was reduced to 46 HU in the numerical phantom dataset and 48 HU in the physical phantom dataset, respectively, and the contrast-to-noise ratio at those ROIs increased by up to 44.3% and up to 19.7%, respectively. The proposed physics model-based iterative scatter correction method could be useful for scatter correction in dual-source or multi-source CT.
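The iterative framework described here can be caricatured in one dimension: given a scatter model S, repeatedly subtract the scatter predicted from the current primary estimate, primary_{k+1} = measured - S(primary_k). The toy scatter model below is an assumption for illustration only, not the paper's physics model:

```python
import numpy as np

def iterative_scatter_correction(measured, scatter_op, iters=5):
    """Iteratively remove model-predicted scatter, starting from the
    measured signal itself: primary_{k+1} = measured - S(primary_k)."""
    primary = measured.copy()
    for _ in range(iters):
        primary = measured - scatter_op(primary)
    return primary

# Toy scatter model (assumption): scatter is 20% of the mean primary level.
def toy_scatter(primary):
    return 0.2 * primary.mean() * np.ones_like(primary)
```

Because the toy scatter operator is a contraction, the iteration converges geometrically to the true primary; the abstract's observation that one iteration already helps matches this behavior.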

  4. A modified high-intensity Cs sputter negative-ion source with multi-target mechanism

    International Nuclear Information System (INIS)

    Si Houzhi; Zhang Weizhong; Zhu Jinhau; Du Guangtian; Zhang Tiaorong; Gao Xiang

    1993-01-01

    The source is based on Middleton's high-intensity mode, but modified to a multi-target version. It is equipped with a spherical molybdenum ionizer, a 20-position target wheel and a vacuum lock for loading and unloading sample batches. A metal-ceramic bonded section protected by a specially designed labyrinth shielding system results in reliable insulation of the cathode and convenient control of cesium vapor. The latter is particularly important when an oversupply of cesium occurs. The source was developed for accelerator mass spectrometry (AMS) applications. Recently, three versions based on the prototype of the source have been successfully tested to meet different requirements: (a) Single target version, (b) multi-target version with manual sample change, and (c) multi-target version with remote control sample change. Some details of the technical and operational characteristics are presented. (orig.)

  5. Designing multi-targeted agents: An emerging anticancer drug discovery paradigm.

    Science.gov (United States)

    Fu, Rong-Geng; Sun, Yuan; Sheng, Wen-Bing; Liao, Duan-Fang

    2017-08-18

    The dominant paradigm in drug discovery is to design ligands with maximum selectivity to act on individual drug targets. With the target-based approach, many new chemical entities have been discovered, developed, and further approved as drugs. However, there are a large number of complex diseases such as cancer that cannot be effectively treated or cured only with one medicine to modulate the biological function of a single target. As simultaneous intervention of two (or multiple) cancer progression relevant targets has shown improved therapeutic efficacy, the innovation of multi-targeted drugs has become a promising and prevailing research topic and numerous multi-targeted anticancer agents are currently at various developmental stages. However, most multi-pharmacophore scaffolds are usually discovered by serendipity or screening, while rational design by combining existing pharmacophore scaffolds remains an enormous challenge. In this review, four types of multi-pharmacophore modes are discussed, and the examples from literature will be used to introduce attractive lead compounds with the capability of simultaneously interfering with different enzyme or signaling pathway of cancer progression, which will reveal the trends and insights to help the design of the next generation multi-targeted anticancer agents.

  6. Multichannel/Multisensor Signal Processing In Uncertain Environments With Application To Multitarget Tracking.

    Science.gov (United States)

    1998-05-22

    and time diversity multiaccess/multiuser digital communications and ■ multitarget tracking using multi- platform multisensor arrays. In part II focus...multiuser digital communications and multitarget tracking using multi- platform multisensor arrays. In part II focus is on maneuvering target tracking...Li< we have ■Hk;K-l,K-l,",lC-l(s) = S’fc;K-+ii-i,K+i3-i,-,K’+rM-i(x) (8) Since x{k) = W{z)w(k), using (4) it follows that w(fc) = £[=0 Wix (k - i

  7. Investigations of Orchestra Auralizations Using the Multi-Channel Multi-Source Auralization Technique

    DEFF Research Database (Denmark)

    Vigeant, Michelle; Wang, Lily M.; Rindel, Jens Holger

    2008-01-01

    Room acoustics computer modeling is a tool for generating impulse responses and auralizations from modeled spaces. The auralizations are commonly made from a single-channel anechoic recording of solo instruments. For this investigation, auralizations of an entire orchestra were created using a multi-channel multi-source auralization technique, involving individual five-channel anechoic recordings of each instrumental part of two symphonies. In the first study, these auralizations were subjectively compared to orchestra auralizations made using (a) a single omni-directional source, (b) a surface source, and (c) a single-channel multi-source method. Results show that the multi-source auralizations were rated to be more realistic than the surface source ones and to have larger source width than the single omni-directional source auralizations. No significant differences were found between...

  8. Multi-source waveform inversion of marine streamer data using the normalized wavefield

    KAUST Repository

    Choi, Yun Seok

    2012-01-01

    Even though the encoded multi-source approach dramatically reduces the computational cost of waveform inversion, it is generally not applicable to marine streamer data. This is because the simultaneous-sources modeled data cannot be muted to comply with the configuration of the marine streamer data, which causes differences in the number of stacked-traces, or energy levels, between the modeled and observed data. Since the conventional L2 norm does not account for the difference in energy levels, multi-source inversion based on the conventional L2 norm does not work for marine streamer data. In this study, we propose the L2, approximated L2, and L1 norm using the normalized wavefields for the multi-source waveform inversion of marine streamer data. Since the normalized wavefields mitigate the different energy levels between the observed and modeled wavefields, the multi-source waveform inversion using the normalized wavefields can be applied to marine streamer data. We obtain the gradient of the objective functions using the back-propagation algorithm. To conclude, the gradient of the L2 norm using the normalized wavefields is exactly the same as that of the global correlation norm. In the numerical examples, the new objective functions using the normalized wavefields generate successful results whereas conventional L2 norm does not.
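The stated equivalence between the L2 norm of normalized wavefields and the global correlation norm follows from the identity ||u/|u| - d/|d|||^2 = 2 - 2<u/|u|, d/|d|>: minimizing the one maximizes the other. A quick numerical check on toy traces:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(100)   # toy modeled trace
d = rng.standard_normal(100)   # toy observed trace

uh, dh = u / np.linalg.norm(u), d / np.linalg.norm(d)
l2 = np.sum((uh - dh) ** 2)    # L2 norm of the normalized wavefields
corr = np.dot(uh, dh)          # global correlation of normalized traces
print(abs(l2 - (2 - 2 * corr)))  # ≈ 0, up to floating-point roundoff
```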

  9. Low-communication parallel quantum multi-target preimage search

    NARCIS (Netherlands)

    Banegas, G.S.; Bernstein, D.J.; Adams, Carlisle; Camenisch, Jan

    2017-01-01

    The most important pre-quantum threat to AES-128 is the 1994 van Oorschot–Wiener "parallel rho method", a low-communication parallel pre-quantum multi-target preimage-search algorithm. This algorithm uses a mesh of p small processors, each running for approximately 2^128/pt fast steps, to
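The multi-target advantage being attacked here is easy to demonstrate at toy scale: a brute-force search that accepts a preimage of any of t targets finishes about t times faster than a single-target search; the parallel rho method then distributes that work across p processors with low communication. A toy sketch with a truncated 24-bit hash, not the actual attack:

```python
import hashlib

def find_any_preimage(targets, max_tries=1 << 20):
    """Brute-force search for a preimage of ANY target digest; with t
    targets the expected work drops by a factor of t (the effect the
    parallel rho method exploits at scale)."""
    goal = set(targets)
    for x in range(max_tries):
        msg = x.to_bytes(8, "big")
        h = hashlib.sha256(msg).digest()[:3]  # toy 24-bit hash
        if h in goal:
            return x, h
    return None
```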

  10. Physics Mining of Multi-source Data Sets, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to implement novel physics mining algorithms with analytical capabilities to derive diagnostic and prognostic numerical models from multi-source...

  11. The optimal algorithm for Multi-source RS image fusion.

    Science.gov (United States)

    Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan

    2016-01-01

    In order to solve the issue that the fusion rules of available fusion methods cannot be self-adaptively adjusted according to the subsequent processing requirements of Remote Sensing (RS) imagery, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), integrating the merits of the genetic algorithm with the advantages of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and the contrast pyramid transform as the observed operator. It then designs the objective function as a weighted sum of evaluation indices and optimizes this objective function with GSDA so as to obtain a higher-resolution RS image. The bullet points of the text are summarized as follows.
    •The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion.
    •The article presents the GSDA algorithm for self-adaptive adjustment of the fusion rules.
    •The text proposes the model operator and the observed operator as the fusion scheme for RS images based on GSDA.
    The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.

  12. Multi-source least-squares migration of marine data

    KAUST Repository

    Wang, Xin; Schuster, Gerard T.

    2012-01-01

    Kirchhoff based multi-source least-squares migration (MSLSM) is applied to marine streamer data. To suppress the crosstalk noise from the excitation of multiple sources, a dynamic encoding function (including both time-shifts and polarity changes

  13. Long-term monitoring on environmental disasters using multi-source remote sensing technique

    Science.gov (United States)

    Kuo, Y. C.; Chen, C. F.

    2017-12-01

    Environmental disasters are extreme events within the earth's system that cause deaths and injuries to humans, as well as damage and loss of valuable assets such as buildings, communication systems, farmland and forest. Disaster management requires a large amount of multi-temporal spatial data. Multi-source remote sensing data with different spatial, spectral and temporal resolutions are widely applied to environmental disaster monitoring. With multi-source and multi-temporal high-resolution images, we conduct rapid, systematic and continuous observations of economic damage and environmental disasters on earth, based on three monitoring platforms: remote sensing, UAS (Unmanned Aircraft Systems) and ground investigation. The advantages of UAS technology include great mobility, real-time availability and more flexible weather requirements. The system can produce long-term spatial distribution information on environmental disasters, obtaining high-resolution remote sensing data and field verification data in key monitoring areas. It also supports the prevention and control of ocean pollution, illegally disposed waste and pine pests at different scales. Meanwhile, digital photogrammetry can be applied, using the camera's interior and exterior orientation parameters, to produce Digital Surface Model (DSM) data. The latest terrain environment information is simulated using DSM data and can serve as a reference for future disaster recovery.

  14. Automated selection of relevant information for notification of incident cancer cases within a multisource cancer registry.

    Science.gov (United States)

    Jouhet, V; Defossez, G; Ingrand, P

    2013-01-01

    The aim of this study was to develop and evaluate a selection algorithm of relevant records for the notification of incident cases of cancer on the basis of the individual data available in a multi-source information system. This work was conducted on data for the year 2008 in the general cancer registry of the Poitou-Charentes region (France). The selection algorithm hierarchizes information according to its level of relevance for tumoral topography and tumoral morphology independently. The selected data are combined to form composite records. These records are then grouped in accordance with the notification rules of the International Agency for Research on Cancer for multiple primary cancers. The evaluation, based on recall, precision and F-measure, compared cases validated manually by the registry's physicians with tumours notified with and without record selection. The analysis involved 12,346 tumours validated among 11,971 individuals. The data used were hospital discharge data (104,474 records), pathology data (21,851 records), healthcare insurance data (7508 records) and cancer care centre data (686 records). The selection algorithm improved performance for the notification of tumour topography (F-measure 0.926 with vs. 0.857 without selection) and tumour morphology (F-measure 0.805 with vs. 0.750 without selection). These results show that selecting information according to its origin is efficient in reducing the noise generated by imprecise coding. Further research is needed to solve the semantic problems relating to the integration of heterogeneous data and the use of non-structured information.
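The recall/precision/F-measure evaluation used here is standard; treating validated and notified cases as sets of identifiers, a minimal sketch:

```python
def prf(validated, notified):
    """Precision, recall and F-measure of notified tumour records
    against manually validated cases (sets of case identifiers)."""
    tp = len(validated & notified)
    precision = tp / len(notified) if notified else 0.0
    recall = tp / len(validated) if validated else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f
```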

  15. Identification and characterization of carprofen as a multitarget fatty acid amide hydrolase/cyclooxygenase inhibitor.

    Science.gov (United States)

    Favia, Angelo D; Habrant, Damien; Scarpelli, Rita; Migliore, Marco; Albani, Clara; Bertozzi, Sine Mandrup; Dionisi, Mauro; Tarozzo, Glauco; Piomelli, Daniele; Cavalli, Andrea; De Vivo, Marco

    2012-10-25

    Pain and inflammation are major therapeutic areas for drug discovery. Current drugs for these pathologies have limited efficacy, however, and often cause a number of unwanted side effects. In the present study, we identify the nonsteroidal anti-inflammatory drug carprofen as a multitarget-directed ligand that simultaneously inhibits cyclooxygenase-1 (COX-1), COX-2, and fatty acid amide hydrolase (FAAH). Additionally, we synthesized and tested several derivatives of carprofen, sharing this multitarget activity. This may result in improved analgesic efficacy and reduced side effects (Naidu et al. J. Pharmacol. Exp. Ther. 2009, 329, 48-56; Fowler, C. J.; et al. J. Enzyme Inhib. Med. Chem. 2012, in press; Sasso et al. Pharmacol. Res. 2012, 65, 553). The new compounds are among the most potent multitarget FAAH/COX inhibitors reported so far in the literature and thus may represent promising starting points for the discovery of new analgesic and anti-inflammatory drugs.

  16. Modeling multi-source flooding disaster and developing simulation framework in Delta

    Science.gov (United States)

    Liu, Y.; Cui, X.; Zhang, W.

    2016-12-01

    Most delta regions of the world are densely populated and economically developed. However, they are highly vulnerable to multi-source flooding (upstream floods, rainstorm waterlogging and storm surge), and flood disasters in these areas have attracted considerable research attention. The Pearl River Delta urban agglomeration in south China is selected as the research area. Based on analysis of the natural and environmental characteristics of the delta urban agglomeration (remote sensing data, land use data, topographic maps, etc.), hydrological monitoring data, the uneven distribution and temporal pattern of regional rainfall, the relationship between the underlying surface and runoff parameters, and the effect of flood storage patterns, we use an automatic or semi-automatic method to divide spatial units that reflect the runoff characteristics of the urban agglomeration. We then develop a multi-model ensemble system for a changing environment, comprising an urban hydrological model, a parallel 1D/2D hydrodynamic model, a storm surge forecast model and other professional models. The system supports real-time setting of a variety of boundary conditions, fast real-time calculation, dynamic presentation of results and powerful statistical analysis, and the models can be optimized and improved through a variety of verification methods. This work was supported by the National Natural Science Foundation of China (41471427) and the Special Basic Research Key Fund for Central Public Scientific Research Institutes.

  17. Dual-acting of Hybrid Compounds - A New Dawn in the Discovery of Multi-target Drugs: Lead Generation Approaches.

    Science.gov (United States)

    Abdolmaleki, Azizeh; Ghasemi, Jahan B

    2017-01-01

    Finding high-quality starting compounds is a critical task at the beginning of the lead generation stage of multi-target drug discovery (MTDD). Designing hybrid compounds as selective multitarget chemical entities is a challenge, an opportunity and a new way to act more effectively against specific multiple targets. A hybrid molecule is formed by joining two (or more) pharmacophore groups, so these new compounds often exhibit two or more activities, acting as multi-target drugs (mt-drugs), and may show superior safety or efficacy. Integrating a range of information with sophisticated new in silico, bioinformatics, structural biology and pharmacogenomics methods can support the discovery, design and synthesis of such hybrid molecules. In this regard, medicinal chemists have followed many rational and screening approaches for lead generation in MTDD. Here, we review some popular lead generation approaches that have been used for designing multiple ligands (DMLs). This paper focuses on dual-acting chemical entities that incorporate parts of two drugs or bioactive compounds to compose hybrid molecules. It also presents key concepts and the limitations and strengths of lead generation methods, comparing the combination framework method with screening approaches, and includes a number of examples of applications of hybrid molecules in drug discovery.

  18. Multi-Target Tracking via Mixed Integer Optimization

    Science.gov (United States)

    2016-05-13

    an easily interpretable global objective function. Furthermore, we propose a greedy heuristic which quickly finds good solutions. We extend both the... heuristic and the MIO model to scenarios with missed detections and false alarms. Index Terms—optimization; multi-target tracking; data association...energy in [14] and then again as a minimization of discrete-continuous energy in [15]. These algorithms aim to more accurately represent the nature of the

  19. Sound Localization in Multisource Environments

    Science.gov (United States)

    2009-03-01

    A total of 7 paid volunteer listeners (3 males and 4 females, 20-25 years of age) participated in the experiment. All had normal hearing (i.e...effects of the loudspeaker frequency responses, and were then sent from an experimental control computer to a Mark of the Unicorn (MOTU 24 I/O) digital-to...after the overall multisource stimulus has been presented (the 'post-cue' condition). 3.2 Methods 3.2.1 Listeners Eight listeners, ranging in age from

  20. Application of Multi-Source Remote Sensing Image in Yunnan Province Grassland Resources Investigation

    Science.gov (United States)

    Li, J.; Wen, G.; Li, D.

    2018-04-01

    To master background information on the utilization and ecological condition of Yunnan Province grassland resources and to improve grassland management capacity, the Yunnan Province agriculture department carried out a grassland resource investigation in 2017. The traditional grassland resource investigation method is ground-based survey, which is time-consuming and inefficient, and especially unsuitable for large-scale and hard-to-reach areas. Remote sensing, by contrast, is low cost, wide ranging and efficient, and can objectively reflect the present state of grassland resources. It has become an indispensable grassland monitoring technology and data source, and has gained increasing recognition and application in grassland resource monitoring research. This paper studies the application of multi-source remote sensing imagery to the Yunnan Province grassland resource investigation. First, grassland thematic information is extracted and field investigation conducted through segmentation of high-spatial-resolution BJ-2 imagery. Second, grassland types are classified and grassland degradation is assessed using the high-resolution characteristics of Landsat 8 imagery. Third, a grass yield model and quality classification are obtained from the high-resolution, wide-swath characteristics of MODIS imagery combined with sample survey data. Finally, qualitative field analysis of grassland is performed using UAV remote sensing imagery. Project implementation demonstrates that multi-source remote sensing data can be applied to the grassland resource investigation in Yunnan Province and is an indispensable method.

  1. Application of multi-source waveform inversion to marine streamer data using the global correlation norm

    KAUST Repository

    Choi, Yun Seok

    2012-05-02

    Conventional multi-source waveform inversion using an objective function based on the least-square misfit cannot be applied to marine streamer acquisition data because of inconsistent acquisition geometries between observed and modelled data. To apply the multi-source waveform inversion to marine streamer data, we use the global correlation between observed and modelled data as an alternative objective function. The new residual seismogram derived from the global correlation norm attenuates modelled data not supported by the configuration of observed data and thus, can be applied to multi-source waveform inversion of marine streamer data. We also show that the global correlation norm is theoretically the same as the least-square norm of the normalized wavefield. To efficiently calculate the gradient, our method employs a back-propagation algorithm similar to reverse-time migration based on the adjoint-state of the wave equation. In numerical examples, the multi-source waveform inversion using the global correlation norm results in better inversion results for marine streamer acquisition data than the conventional approach. © 2012 European Association of Geoscientists & Engineers.
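    The stated equivalence between the global correlation norm and the least-squares norm of the normalized wavefield can be checked numerically. A minimal sketch with synthetic traces (not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)
d_obs = rng.standard_normal(200)   # synthetic observed trace
d_mod = rng.standard_normal(200)   # synthetic modelled trace

def normalize(d):
    """Scale a trace to unit L2 norm."""
    return d / np.linalg.norm(d)

# Global correlation between the normalized traces
corr = np.dot(normalize(d_obs), normalize(d_mod))

# Least-squares misfit of the normalized wavefields
l2_sq = np.sum((normalize(d_obs) - normalize(d_mod)) ** 2)

# ||a - b||^2 = 2 - 2<a, b> for unit vectors, so minimizing the normalized
# least-squares misfit is equivalent to maximizing the global correlation.
assert np.isclose(l2_sq, 2.0 - 2.0 * corr)
```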

  2. Free-time and fixed end-point multi-target optimal control theory: Application to quantum computing

    International Nuclear Information System (INIS)

    Mishima, K.; Yamashita, K.

    2011-01-01

    Graphical abstract: The two-state Deutsch-Jozsa algorithm used to demonstrate the utility of free-time and fixed end-point multi-target optimal control theory. Research highlights: → Free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) was constructed. → The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple laser pulses simultaneously. → The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. → The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. → The calculation examples show that our theory is useful for minor adjustment of the external fields. - Abstract: An extension of free-time and fixed end-point optimal control theory (FRFP-OCT) to monotonically convergent free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) is presented. The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple laser pulses simultaneously. The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. The calculation examples show that our theory is useful for minor

  3. Statistical Estimators Using Jointly Administrative and Survey Data to Produce French Structural Business Statistics

    Directory of Open Access Journals (Sweden)

    Brion Philippe

    2015-12-01

    Full Text Available Using as much administrative data as possible is a general trend among most national statistical institutes. Different kinds of administrative sources, from tax authorities or other administrative bodies, are very helpful material in the production of business statistics. However, these sources often have to be completed by information collected through statistical surveys. This article describes the way Insee has implemented such a strategy in order to produce French structural business statistics. The originality of the French procedure is that administrative and survey variables are used jointly for the same enterprises, unlike the majority of multisource systems, in which the two kinds of sources generally complement each other for different categories of units. The idea is to use, as much as possible, the richness of the administrative sources combined with the timeliness of a survey, even if the latter is conducted only on a sample of enterprises. One main issue is the classification of enterprises within the NACE nomenclature, which is a cornerstone variable in producing the breakdown of the results by industry. At a given date, two values of the corresponding code may coexist: the value of the register, not necessarily up to date, and the value resulting from the data collected via the survey, but only from a sample of enterprises. Using all this information together requires the implementation of specific statistical estimators combining some properties of the difference estimators with calibration techniques. This article presents these estimators, as well as their statistical properties, and compares them with those of other methods.
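    A difference estimator of the general kind described, combining an administrative value known for every unit with a survey value observed only on a sample, might be sketched as follows; the population, variables and weights are synthetic illustrations, not Insee's estimators:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000                                     # population of enterprises
x = rng.gamma(2.0, 50.0, size=N)             # administrative value, known for all units
y = 1.05 * x + rng.normal(0.0, 5.0, size=N)  # survey value, observed only on a sample

n = 100
sample = rng.choice(N, size=n, replace=False)
w = N / n                                    # equal-probability sampling weight

# Difference estimator: administrative total plus a weighted sample
# correction for the survey-vs-administrative discrepancy.
y_hat = x.sum() + w * (y[sample] - x[sample]).sum()
print(f"true total {y.sum():.0f}, difference estimate {y_hat:.0f}")
```

The correction term has low variance whenever the survey and administrative variables are strongly correlated, which is the motivation for combining the two sources on the same enterprises.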

  4. Compressed Sensing and Low-Rank Matrix Decomposition in Multisource Images Fusion

    Directory of Open Access Journals (Sweden)

    Kan Ren

    2014-01-01

    Full Text Available We propose a novel super-resolution multisource image fusion scheme based on compressive sensing and dictionary learning theory. Under the sparsity prior of image patches and the framework of compressive sensing theory, multisource image fusion is reduced to a signal recovery problem from the compressive measurements. A set of multiscale dictionaries is then learned from several groups of high-resolution sample image patches via a nonlinear optimization algorithm. Moreover, a new linear weighted fusion rule is proposed to obtain the high-resolution image. Experiments are conducted to investigate the performance of the proposed method, and the results demonstrate its superiority to its counterparts.

  5. A Spatial Data Infrastructure Integrating Multisource Heterogeneous Geospatial Data and Time Series: A Study Case in Agriculture

    Directory of Open Access Journals (Sweden)

    Gloria Bordogna

    2016-05-01

    Full Text Available Currently, the best practice to support land planning calls for the development of Spatial Data Infrastructures (SDI capable of integrating both geospatial datasets and time series information from multiple sources, e.g., multitemporal satellite data and Volunteered Geographic Information (VGI. This paper describes an original OGC standard interoperable SDI architecture and a geospatial data and metadata workflow for creating and managing multisource heterogeneous geospatial datasets and time series, and discusses it in the framework of the Space4Agri project study case developed to support the agricultural sector in Lombardy region, Northern Italy. The main novel contributions go beyond the application domain for which the SDI has been developed and are the following: the ingestion within an a-centric SDI, potentially distributed in several nodes on the Internet to support scalability, of products derived by processing remote sensing images, authoritative data, georeferenced in-situ measurements and voluntary information (VGI created by farmers and agronomists using an original Smart App; the workflow automation for publishing sets and time series of heterogeneous multisource geospatial data and relative web services; and, finally, the project geoportal, that can ease the analysis of the geospatial datasets and time series by providing complex intelligent spatio-temporal query and answering facilities.

  6. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    Science.gov (United States)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geoportal designed to provide online geoinformation services for government, enterprises and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geoportal, providing online geoinformation services over the internet, the e-government network and a classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data come from these nodes, and the different datasets are heterogeneous. Based on analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. The technical procedure is then developed, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrates the applicability of the principles, procedure and method of multi-source geospatial data integration.

  7. 3D Multisource Full‐Waveform Inversion using Dynamic Random Phase Encoding

    KAUST Repository

    Boonyasiriwat, Chaiwoot

    2010-10-17

    We have developed a multisource full‐waveform inversion algorithm using a dynamic phase encoding strategy with dual‐randomization—both the position and polarity of simultaneous sources are randomized and changed every iteration. The dynamic dual‐randomization is used to promote the destructive interference of crosstalk noise resulting from blending a large number of common shot gathers into a supergather. We compare our multisource algorithm with various algorithms in a numerical experiment using the 3D SEG/EAGE overthrust model and show that our algorithm provides a higher‐quality velocity tomogram than the other methods that use only monorandomization. This suggests that increasing the degree of randomness in phase encoding should improve the quality of the inversion result.
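    The dual-randomization idea can be illustrated with a toy blend: random polarities (with a random circular shift standing in for a randomized source position) make the crosstalk between different shots average toward zero as the encoding is re-randomized each iteration. A sketch under these simplifying assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(42)
n_shots, n_t = 16, 256
shots = rng.standard_normal((n_shots, n_t))   # toy common-shot gathers

def encode(shots, rng):
    """One realization of dual-randomization: random polarity per shot, plus
    a random circular shift standing in for a randomized source position."""
    polarity = rng.choice([-1.0, 1.0], size=len(shots))
    delays = rng.integers(0, 16, size=len(shots))
    blended = sum(p * np.roll(s, d) for s, p, d in zip(shots, polarity, delays))
    return blended, polarity, delays

# Crosstalk between shot 0 and the rest of the supergather, averaged over
# many re-randomized ("dynamic") encodings, tends toward zero.
xtalk = []
for _ in range(500):
    supergather, pol, dly = encode(shots, rng)
    s0 = pol[0] * np.roll(shots[0], dly[0])
    xtalk.append(np.dot(s0, supergather - s0) / n_t)   # cross-terms only
signal = np.dot(shots[0], shots[0]) / n_t              # ~1 for unit-variance traces
assert abs(np.mean(xtalk)) < 0.1 * signal
```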

  8. Dynamic statistical information theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In recent years we extended Shannon's static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state-variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information-symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density, and of Boltzmann dynamic entropy density and dynamic information density, which describe the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in state-variable space inside the systems and in coordinate space in the transmission processes, and that the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in the same spaces. Entropy and information are thus combined with the state of the systems and their law of motion. Furthermore, we presented the formulas for the two kinds of entropy production rates and information dissipation rates, and the expressions for the two kinds of drift and diffusion information flows. We proved that the two kinds of information dissipation rates (the decrease rates of the total information) are equal to their corresponding entropy production rates (the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of two kinds of dynamic mutual information and dynamic channel
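    The claim that entropy is produced under diffusive dynamics can be checked on the simplest Fokker-Planck solution, a freely diffusing Gaussian. A toy numerical sketch (not the authors' formalism; the grid and diffusion constant are arbitrary choices):

```python
import numpy as np

# Grid for the probability density
x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]

def shannon_entropy(p):
    """Discretized differential entropy -∫ p ln p dx."""
    p = np.clip(p, 1e-300, None)
    return -np.sum(p * np.log(p)) * dx

def diffusing_gaussian(x, t, D=1.0, t0=0.5):
    """Free-diffusion (heat-equation) solution with variance 2*D*(t + t0)."""
    var = 2.0 * D * (t + t0)
    return np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

entropies = [shannon_entropy(diffusing_gaussian(x, t)) for t in (0.0, 1.0, 2.0)]
# Entropy production under pure diffusion: the entropy rises monotonically.
assert entropies[0] < entropies[1] < entropies[2]
```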

  9. Fabrication and mount of a multisource irradiation systems

    International Nuclear Information System (INIS)

    Mariano-Heredia, E.

    1990-01-01

    The design of this equipment arose from the need to optimize the methods used until now at ININ for calibrating portable gamma radiation monitors for radiological protection, and from the need for a national-level reference laboratory. The equipment consists of a multisource irradiator, a system for transporting and positioning the radiation monitors, an aerial conveyor and a control panel. The multisource irradiator, a shielded container, houses five Cs-137 and Co-60 radiation sources of different activities. The transport and positioning system places the monitors to be calibrated at the required distance and height. The instrument-source distance can be selected from the control panel, and both the distance and the detector readings can be verified by means of a closed-circuit TV. The activity of the radiation sources has been characterized for each source using varying combinations of instrument-source distances, and all the calibration parameters of the radiation beam central axis are known to within 3.0 % error. (Author)

  10. A Bayesian solution to multi-target tracking problems with mixed labelling

    NARCIS (Netherlands)

    Aoki, E.H.; Boers, Y.; Svensson, Lennart; Mandal, Pranab K.; Bagchi, Arunabha

    In Multi-Target Tracking (MTT), the problem of assigning labels to tracks (track labelling) is vastly covered in literature and has been previously formulated using Bayesian recursion. However, the existing literature lacks an appropriate measure of uncertainty related to the assigned labels which

  11. A novel multitarget model of radiation-induced cell killing based on the Gaussian distribution.

    Science.gov (United States)

    Zhao, Lei; Mi, Dong; Sun, Yeqing

    2017-05-07

    The multitarget version of traditional target theory, based on the Poisson distribution, is still used to describe the dose-survival curves of cells after ionizing radiation in radiobiology and radiotherapy. However, since typical ionizing radiation damage is the result of two sequential stochastic processes, the probability distribution of the damage number per cell should follow a compound Poisson distribution, such as Neyman's distribution of type A (N. A.). Because the Gaussian distribution can be considered an approximation of the N. A. distribution in the case of high flux, a multitarget model based on the Gaussian distribution is proposed to describe cell inactivation by low linear energy transfer (LET) radiation at high dose rate. Theoretical analysis and fits to experimental data indicate that the present theory is superior to the traditional multitarget model and comparable to the Linear-Quadratic (LQ) model in describing the biological effects of low-LET radiation at high dose rate, and that the parameter ratio in the present model can serve as an alternative indicator of radiation damage and cell radiosensitivity.
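    For reference, the traditional Poisson-based multitarget survival model that the paper generalizes, alongside the LQ model, can be written down directly; the parameter values below are illustrative only, and the paper's Gaussian-based variant is not reproduced here:

```python
import numpy as np

def multitarget_survival(D, D0=1.0, n=3.0):
    """Traditional Poisson-based multitarget model: S = 1 - (1 - e^{-D/D0})^n.
    D0 is the mean lethal dose per target, n the number of targets."""
    return 1.0 - (1.0 - np.exp(-D / D0)) ** n

def lq_survival(D, alpha=0.3, beta=0.03):
    """Linear-Quadratic model: S = exp(-(alpha*D + beta*D^2))."""
    return np.exp(-(alpha * D + beta * D ** 2))

doses = np.linspace(0.0, 10.0, 6)
print("multitarget:", np.round(multitarget_survival(doses), 4))
print("LQ:         ", np.round(lq_survival(doses), 4))
```

The multitarget curve has the characteristic low-dose shoulder (zero initial slope) that the LQ model instead captures through its linear alpha term.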

  12. Random set particle filter for bearings-only multitarget tracking

    Science.gov (United States)

    Vihola, Matti

    2005-05-01

    The random set approach to multitarget tracking is a theoretically sound framework that covers joint estimation of the number of targets and the state of the targets. This paper describes a particle filter implementation of the random set multitarget filter. The contribution of this paper to the random set tracking framework is the formulation of a measurement model where each sensor report is assumed to contain at most one measurement. The implemented filter was tested in synthetic bearings-only tracking scenarios containing up to two targets in the presence of false alarms and missed measurements. The estimated target state consisted of 2D position and velocity components. The filter was capable of tracking the targets fairly well despite the missing measurements and the relatively high false alarm rates. In addition, the filter showed robustness against incorrect false alarm rate parameter values. The results obtained during the limited tests of the filter show that the random set framework has potential for challenging tracking situations. On the other hand, the computational burden of the described implementation is quite high and increases approximately linearly with the expected number of targets.

  13. Challenges with secondary use of multi-source water-quality data in the United States

    Science.gov (United States)

    Sprague, Lori A.; Oelsner, Gretchen P.; Argue, Denise M.

    2017-01-01

    Combining water-quality data from multiple sources can help counterbalance diminishing resources for stream monitoring in the United States and lead to important regional and national insights that would not otherwise be possible. Individual monitoring organizations understand their own data very well, but issues can arise when their data are combined with data from other organizations that have used different methods for reporting the same common metadata elements. Such use of multi-source data is termed “secondary use”—the use of data beyond the original intent determined by the organization that collected the data. In this study, we surveyed more than 25 million nutrient records collected by 488 organizations in the United States since 1899 to identify major inconsistencies in metadata elements that limit the secondary use of multi-source data. Nearly 14.5 million of these records had missing or ambiguous information for one or more key metadata elements, including (in decreasing order of records affected) sample fraction, chemical form, parameter name, units of measurement, precise numerical value, and remark codes. As a result, metadata harmonization to make secondary use of these multi-source data will be time consuming, expensive, and inexact. Different data users may make different assumptions about the same ambiguous data, potentially resulting in different conclusions about important environmental issues. The value of these ambiguous data is estimated at US$12 billion, a substantial collective investment by water-resource organizations in the United States. By comparison, the value of unambiguous data is estimated at US$8.2 billion. The ambiguous data could be preserved for uses beyond the original intent by developing and implementing standardized metadata practices for future and legacy water-quality data throughout the United States.

  14. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  15. Multisource waveform inversion of marine streamer data using normalized wavefield

    KAUST Repository

    Choi, Yun Seok; Alkhalifah, Tariq Ali

    2013-01-01

    Multisource full-waveform inversion based on the L1- and L2-norm objective functions cannot be applied to marine streamer data because it does not take into account the unmatched acquisition geometries between the observed and modeled data. To apply

  16. L1-norm locally linear representation regularization multi-source adaptation learning.

    Science.gov (United States)

    Tao, Jianwen; Wen, Shiting; Hu, Wenjun

    2015-09-01

    In most supervised domain adaptation learning (DAL) tasks, one has access to only a small number of labeled examples from the target domain. The success of supervised DAL in this "small sample" regime therefore requires effective use of large amounts of unlabeled data to extract information that is useful for generalization. Toward this end, we use the geometric intuition of the manifold assumption to extend the frameworks established in existing model-based DAL methods for function learning, incorporating additional information about the geometric structure of the target marginal distribution. We would like to ensure that the solution is smooth with respect to both the ambient space and the target marginal distribution. To this end, we propose a novel L1-norm locally linear representation regularization multi-source adaptation learning framework that exploits the geometry of the probability distribution, using two techniques. First, an L1-norm locally linear representation method (L1-LLR) is presented for robust graph construction, replacing the L2-norm reconstruction measure in LLE with an L1-norm one. Second, for robust graph regularization, we replace the traditional graph Laplacian regularization with our new L1-LLR graph Laplacian regularization, and thereby construct a new graph-based semi-supervised learning framework with a multi-source adaptation constraint, termed the L1-MSAL method. Moreover, to deal with nonlinear learning problems, we generalize the L1-MSAL method by mapping the input data points from the input space to a high-dimensional reproducing kernel Hilbert space (RKHS) via a nonlinear mapping. Promising experimental results have been obtained on several real-world datasets covering faces, visual video and objects.

  17. PMHT Approach for Multi-Target Multi-Sensor Sonar Tracking in Clutter.

    Science.gov (United States)

    Li, Xiaohua; Li, Yaan; Yu, Jing; Chen, Xiao; Dai, Miao

    2015-11-06

    Multi-sensor sonar tracking has many advantages, such as the potential to reduce the overall measurement uncertainty and the possibility to hide the receiver. However, the use of multi-target multi-sensor sonar tracking is challenging because of the complexity of the underwater environment, especially the low target detection probability and extremely large number of false alarms caused by reverberation. In this work, to solve the problem of multi-target multi-sensor sonar tracking in the presence of clutter, a novel probabilistic multi-hypothesis tracker (PMHT) approach based on the extended Kalman filter (EKF) and unscented Kalman filter (UKF) is proposed. The PMHT can efficiently handle the unknown measurements-to-targets and measurements-to-transmitters data association ambiguity. The EKF and UKF are used to deal with the high degree of nonlinearity in the measurement model. The simulation results show that the proposed algorithm can improve the target tracking performance in a cluttered environment greatly, and its computational load is low.

  18. SU-D-210-03: Limited-View Multi-Source Quantitative Photoacoustic Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Feng, J; Gao, H [Shanghai Jiao Tong University, Shanghai, Shanghai (China)

    2015-06-15

    Purpose: This work investigates a novel limited-view multi-source acquisition scheme for the direct and simultaneous reconstruction of optical coefficients in quantitative photoacoustic tomography (QPAT), with potentially improved signal-to-noise ratio and reduced data acquisition time. Methods: Conventional QPAT is often treated in two steps: first reconstruct the initial acoustic pressure from the full-view ultrasonic data after each optical illumination, and then quantitatively reconstruct the optical coefficients (e.g., absorption and scattering coefficients) from the initial acoustic pressure, using a multi-source or multi-wavelength scheme. With the novel limited-view multi-source scheme proposed here, we must instead consider the direct reconstruction of optical coefficients from the ultrasonic data, since the initial acoustic pressure can no longer be reconstructed as an intermediate variable from the incomplete acoustic data. In this work, based on a coupled photoacoustic forward model combining the diffusion approximation and the wave equation, we develop a limited-memory quasi-Newton method (LBFGS) for image reconstruction that uses the adjoint forward problem for fast computation of gradients. Furthermore, tensor framelet sparsity is used to improve the image reconstruction, solved by the Alternating Direction Method of Multipliers (ADMM). Results: Simulations on a modified Shepp-Logan phantom validate the feasibility of the proposed limited-view scheme and its corresponding image reconstruction algorithms. Conclusion: A limited-view multi-source QPAT scheme is proposed, i.e., partial-view acoustic data acquisition accompanying each optical illumination, followed by simultaneous rotation of both the optical sources and the ultrasonic detectors for the next illumination. Moreover, LBFGS and ADMM algorithms are developed for the direct reconstruction of optical coefficients from the

  19. Two-phase framework for optimal multi-target Lambert rendezvous

    OpenAIRE

    Bang, Jun; Ahn, Jaemyung

    2017-01-01

    This paper proposes a two-phase framework to solve an optimal multi-target Lambert rendezvous problem. The first phase solves a series of single-target rendezvous problems for all departure-arrival object pairs to generate candidate rendezvous trajectories (elementary solutions). The second phase formulates a variant of the traveling salesman problem (TSP) using the elementary solutions prepared in the first phase and determines the best rendezvous sequenc...
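
    The two-phase structure can be sketched in miniature. In the toy below, phase 1 is replaced by a random surrogate cost matrix (real elementary solutions would come from solving Lambert transfers), and phase 2 exhaustively searches visiting orders, which is tractable only for a handful of targets; the paper's TSP-variant formulation would scale better.

```python
import numpy as np
from itertools import permutations

def best_sequence(cost):
    """Phase 2: exhaustive search over visiting orders (open path from object 0)."""
    n = cost.shape[0]
    best_order, best_cost = None, np.inf
    for order in permutations(range(1, n)):
        c = cost[0, order[0]] + sum(cost[a, b] for a, b in zip(order, order[1:]))
        if c < best_cost:
            best_order, best_cost = order, c
    return best_order, best_cost

# Phase 1 surrogate: pairwise transfer costs standing in for Lambert solutions.
rng = np.random.default_rng(1)
cost = rng.uniform(1.0, 10.0, size=(5, 5))
np.fill_diagonal(cost, 0.0)
order, total = best_sequence(cost)
```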

  20. Research on multi-source image fusion technology in haze environment

    Science.gov (United States)

    Ma, GuoDong; Piao, Yan; Li, Bing

    2017-11-01

    In a haze environment, the visible image collected by a single sensor expresses the shape, color and texture details of the target well, but haze lowers its sharpness and parts of the target subject are lost. An infrared image collected by a single sensor, owing to its expression of thermal radiation and strong penetration ability, can clearly express the target subject but loses detail information. Therefore, a multi-source image fusion method is proposed to exploit their respective advantages. Firstly, an improved Dark Channel Prior algorithm is used to preprocess the hazy visible image. Secondly, an improved SURF algorithm is used to register the infrared image and the haze-free visible image. Finally, a weighted fusion algorithm based on information complementarity is used to fuse the images. Experiments show that the proposed method can improve the clarity of the visible target and highlight the occluded infrared target for target recognition.

  1. Multi-Target Screening and Experimental Validation of Natural Products from Selaginella Plants against Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Yin-Hua Deng

    2017-08-01

    Full Text Available Alzheimer's disease (AD) is a progressive and irreversible neurodegenerative disorder and the most common cause of dementia. It affects not only learning and memory but also imposes a heavy social and economic burden. Current AD treatments are mainly single-target drugs, but the complexity and multiple etiologies of AD make it difficult for such drugs to achieve the desired therapeutic effects; multi-target drugs are therefore a potentially effective strategy for AD treatment. To find multi-target active ingredients for AD treatment from Selaginella plants, we first explored the behavioral effects of total extracts (TE) from Selaginella doederleinii on AD mice using the Morris water maze test, and found that TE remarkably improved their learning and memory function. Then, multi-target SAR models for AD-related proteins were built based on Random Forest (RF) and different descriptors to preliminarily screen potential active ingredients from Selaginella. Considering the prediction outputs and the compounds available in our laboratory, 13 compounds were chosen for in vitro enzyme inhibition experiments, and 4 compounds with dual BACE1/MAO-B inhibitory activity were identified. Finally, molecular docking was applied to verify the prediction results and the enzyme inhibition experiments. Through this screening and validation process, we developed a new strategy to improve the efficiency of active-ingredient screening from trace amounts of natural products across multiple targets, and found several multi-target compounds with biological activity for the development of novel drugs for AD treatment.

  2. 3D Multisource Full‐Waveform Inversion using Dynamic Random Phase Encoding

    KAUST Repository

    Boonyasiriwat, Chaiwoot; Schuster, Gerard T.

    2010-01-01

    We have developed a multisource full-waveform inversion algorithm using a dynamic phase encoding strategy with dual randomization: both the position and polarity of the simultaneous sources are randomized and changed every iteration. The dynamic dual

  3. Relevance analysis and short-term prediction of PM2.5 concentrations in Beijing based on multi-source data

    Science.gov (United States)

    Ni, X. Y.; Huang, H.; Du, W. P.

    2017-02-01

    The PM2.5 problem is proving to be a major public crisis of great public concern, requiring an urgent response. Understanding and prediction of PM2.5 from the perspective of atmospheric dynamic theory are still limited due to the complexity of the formation and development of PM2.5. In this paper, we attempted relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed and maximum wind speed, and other pollutant concentrations, including CO, NO2, SO2 and PM10) and social media data (microblog data) was proposed, based on multivariate statistical analysis. The study found that among these factors, the average wind speed, the concentrations of CO, NO2 and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high correlation with PM2.5 concentrations. The correlation analysis was further studied with a machine learning model, the Back Propagation Neural Network (BPNN), which was found to perform better in correlation mining. Finally, an Autoregressive Integrated Moving Average (ARIMA) time series model was applied to explore short-term prediction of PM2.5, and the predicted results were in good agreement with the observed data. This study helps realize real-time monitoring, analysis and pre-warning of PM2.5, and broadens the application of big data and multi-source data mining methods.
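
    The correlation-mining step can be sketched with synthetic data. Everything below is hypothetical (the factor names, coefficients and sample values are invented for illustration); it only shows how per-factor Pearson correlations with a PM2.5 series would be computed and ranked.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 365
wind = rng.uniform(0.5, 8.0, n)                    # daily average wind speed (toy data)
co = 0.4 * (8.0 - wind) + rng.normal(0, 0.3, n)    # CO builds up in stagnant air
pm25 = 30 + 12 * co - 4 * wind + rng.normal(0, 5, n)
rain = rng.uniform(0, 20, n)                       # a weakly related factor

factors = {"CO": co, "wind": wind, "rain": rain}
# Pearson correlation of each candidate factor with the PM2.5 series.
corr = {k: float(np.corrcoef(v, pm25)[0, 1]) for k, v in factors.items()}
ranked = sorted(corr, key=lambda k: abs(corr[k]), reverse=True)
```

Ranking by absolute correlation singles out the informative factors, mirroring the paper's finding that wind speed and co-pollutant concentrations track PM2.5 most closely.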

  4. Multisource least-squares reverse-time migration with structure-oriented filtering

    Science.gov (United States)

    Fan, Jing-Wen; Li, Zhen-Chun; Zhang, Kai; Zhang, Min; Liu, Xue-Tong

    2016-09-01

    The technology of simultaneous-source acquisition, in which seismic data are excited by several sources at once, can significantly improve data collection efficiency. However, direct imaging of simultaneous-source (blended) data may introduce crosstalk noise and degrade imaging quality. To address this problem, we introduce a structure-oriented filtering operator as a preconditioner in multisource least-squares reverse-time migration (LSRTM). The structure-oriented filtering operator is a nonstationary filter along structural trends that suppresses crosstalk noise while preserving structural information. The proposed method uses the conjugate-gradient method to minimize the mismatch between predicted and observed data, while effectively attenuating the interference noise caused by exciting several sources simultaneously. Numerical experiments with synthetic data suggest that the proposed method can suppress the crosstalk noise and produce highly accurate images.

  5. Gaussian mixture probability hypothesis density filter for multipath multitarget tracking in over-the-horizon radar

    Science.gov (United States)

    Qin, Yong; Ma, Hong; Chen, Jinfeng; Cheng, Li

    2015-12-01

    Conventional multitarget tracking systems presume that each target can produce at most one measurement per scan. Due to the multiple ionospheric propagation paths in over-the-horizon radar (OTHR), this assumption is not valid. To solve this problem, this paper proposes a novel tracking algorithm for cluttered environments based on the theory of finite-set statistics (FISST), called the multipath probability hypothesis density (MP-PHD) filter. First, FISST is used to derive the update equation, and then a Gaussian mixture (GM) is introduced to derive the closed-form solution of the MP-PHD filter. Moreover, the extended Kalman filter (EKF) is used to deal with the nonlinearity of the measurement model in OTHR. Finally, simulation results are provided to demonstrate the effectiveness of the proposed filter.
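
    A minimal sketch of a single GM-PHD measurement update for scalar states is given below. It assumes a linear-Gaussian model with identity measurement matrix and a uniform clutter intensity, far simpler than the multipath OTHR model in the paper; it only illustrates how missed-detection terms and per-measurement detection components are formed in the Gaussian-mixture closed form.

```python
import numpy as np

def gm_phd_update(weights, means, covs, Z, p_d=0.9, clutter=1e-3, R=1.0):
    """One GM-PHD measurement update for scalar states with H = 1."""
    w_out = [(1 - p_d) * w for w in weights]   # missed-detection components
    m_out = list(means)
    P_out = list(covs)
    for z in Z:
        S = [P + R for P in covs]              # innovation covariances
        q = [w * np.exp(-0.5 * (z - m) ** 2 / s) / np.sqrt(2 * np.pi * s)
             for w, m, s in zip(weights, means, S)]
        norm = clutter + p_d * sum(q)          # PHD-update normalizer
        for w, m, P, s, qi in zip(weights, means, covs, S, q):
            K = P / s                          # Kalman gain
            w_out.append(p_d * qi / norm)      # detection component weight
            m_out.append(m + K * (z - m))      # Kalman-updated mean
            P_out.append((1 - K) * P)          # Kalman-updated covariance
    return w_out, m_out, P_out

# Two well-separated targets, one measurement near each.
w, m, P = gm_phd_update([1.0, 1.0], [0.0, 10.0], [1.0, 1.0], Z=[0.2, 9.8])
```

The sum of the updated weights approximates the expected target count, which is the cardinality estimate the PHD filter provides.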

  6. Multitarget botanical pharmacotherapy in major depression: a toxic brain hypothesis.

    Science.gov (United States)

    Tang, Siu W; Tang, Wayne H; Leonard, Brian E

    2017-11-01

    A significant number of patients with major depression do not respond optimally to current antidepressant drugs. As depression is likely to be a heterogeneous disorder, it is possible that existing neurotransmitter-based antidepressant drugs do not fully address other pathologies that may exist in certain cases. Biological pathologies related to depression that have been proposed and studied extensively include inflammation and immunology, hypercortisolemia, oxidative stress, and impaired angiogenesis. Such pathologies may induce neurodegeneration, which in turn causes cognitive impairment, a symptom increasingly being recognized in depression. A neurotoxic brain hypothesis unifying all these factors may explain the heterogeneity of depression as well as cognitive decline and antidepressant drug resistance in some patients. Compared with neurotransmitter-based antidepressant drugs, many botanical compounds in traditional medicine used for the treatment of depression and its related symptoms have been discovered to be anti-inflammatory, immunoregulatory, anti-infection, antioxidative, and proangiogenic. Some botanical compounds also exert actions on neurotransmission. This multitarget nature of botanical medicine may act through the amelioration of the neurotoxic brain environment in some patients resistant to neurotransmitter-based antidepressant drugs. A multitarget multidimensional approach may be a reasonable solution for patients resistant to neurotransmitter-based antidepressant drugs.

  7. Identification and characterization of carprofen as a multi-target FAAH/COX inhibitor

    Science.gov (United States)

    Favia, Angelo D.; Habrant, Damien; Scarpelli, Rita; Migliore, Marco; Albani, Clara; Bertozzi, Sine Mandrup; Dionisi, Mauro; Tarozzo, Glauco; Piomelli, Daniele; Cavalli, Andrea; De Vivo, Marco

    2013-01-01

    Pain and inflammation are major therapeutic areas for drug discovery. Current drugs for these pathologies have limited efficacy, however, and often cause a number of unwanted side effects. In the present study, we identify the non-steroidal anti-inflammatory drug carprofen as a multi-target-directed ligand that simultaneously inhibits cyclooxygenase-1 (COX-1), COX-2 and fatty acid amide hydrolase (FAAH). Additionally, we synthesized and tested several racemic derivatives of carprofen that share this multi-target activity. This may result in improved analgesic efficacy and reduced side effects (Naidu et al. (2009) J Pharmacol Exp Ther 329, 48-56; Fowler, C.J. et al. (2012) J Enzym Inhib Med Chem Jan 6; Sasso et al. (2012) Pharmacol Res 65, 553). The new compounds are among the most potent multi-target FAAH/COX inhibitors reported so far in the literature, and thus may represent promising starting points for the discovery of new analgesic and anti-inflammatory drugs. PMID:23043222

  8. Multi-target QSPR modeling for simultaneous prediction of multiple gas-phase kinetic rate constants of diverse chemicals

    Science.gov (United States)

    Basant, Nikita; Gupta, Shikha

    2018-03-01

    The reactions of molecular ozone (O3), hydroxyl (•OH) and nitrate (NO3) radicals are among the major pathways of removal of volatile organic compounds (VOCs) in the atmospheric environment. The gas-phase kinetic rate constants (kO3, kOH, kNO3) are thus important in assessing the ultimate fate and exposure risk of atmospheric VOCs. Experimental rate constants are not available for many emerging VOCs, and the computational methods reported so far address only single-target modeling. In this study, we developed a multi-target (mt) QSPR model for simultaneous prediction of multiple kinetic rate constants (kO3, kOH, kNO3) of diverse organic chemicals, considering an experimental data set of VOCs for which values of all three rate constants are available. The mt-QSPR model identified and used five descriptors related to the molecular size, degree of saturation and electron density in a molecule, which were mechanistically interpretable. These descriptors successfully predicted the three rate constants simultaneously. The model yielded high correlations (R2 = 0.874-0.924) between the experimental and simultaneously predicted endpoint rate constants (kO3, kOH, kNO3) in the test arrays for all three systems, and it passed all the stringent statistical validation tests for external predictivity. The proposed multi-target QSPR model can be used to predict the reactivity of new VOCs simultaneously for their exposure risk assessment.
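
    The idea of one model predicting several endpoints simultaneously can be sketched with ordinary least squares, which natively supports multiple response columns. The data below are synthetic (random descriptors and coefficients), not the paper's VOC data set or its five mechanistic descriptors; the point is only that a single shared fit produces all three predicted rate constants at once.

```python
import numpy as np

rng = np.random.default_rng(7)
n, d = 100, 5                            # compounds x descriptors (toy dimensions)
X = rng.standard_normal((n, d))
W_true = rng.standard_normal((d, 3))     # one column per endpoint: kO3, kOH, kNO3
Y = X @ W_true + 0.01 * rng.standard_normal((n, 3))

# One shared least-squares fit predicts all three rate constants simultaneously.
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_pred = X @ W_hat
r2 = 1 - ((Y - Y_pred) ** 2).sum(0) / ((Y - Y.mean(0)) ** 2).sum(0)
```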

  9. Linear transform of the multi-target survival curve

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J V [Cambridge Univ. (UK). Dept. of Clinical Oncology and Radiotherapeutics

    1978-07-01

    A completely linear transform of the multi-target survival curve is presented. This enables all data, including those on the shoulder region of the curve, to be analysed. The necessity to make a subjective assessment about which data points to exclude for conventional methods of analysis is, therefore, removed. The analysis has also been adapted to include a 'Pike-Alper' method of assessing dose modification factors. For the data cited this predicts compatibility with the hypothesis of a true oxygen 'dose-modification' whereas the conventional Pike-Alper analysis does not.
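
    The transform follows from the multi-target survival model S(D) = 1 - (1 - e^(-D/D0))^n: solving for the exponent gives -ln(1 - (1 - S)^(1/n)) = D/D0, which is linear in dose through the origin, so every point including the shoulder can enter a straight-line fit. A minimal numerical check, assuming illustrative values of D0 and n:

```python
import numpy as np

D0, n_targets = 2.0, 3
D = np.linspace(0.5, 12.0, 24)
S = 1.0 - (1.0 - np.exp(-D / D0)) ** n_targets   # multi-target survival curve

# Linear transform: y = -ln(1 - (1 - S)^(1/n)) = D / D0, a line through the origin.
y = -np.log(1.0 - (1.0 - S) ** (1.0 / n_targets))
slope = np.polyfit(D, y, 1)[0]
D0_est = 1.0 / slope                              # recover D0 from the fitted slope
```

With noisy survival data the same fit would use all points, including the shoulder, which is the advantage the abstract describes.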

  10. Biological evaluation and molecular docking of Rhein as a multi-targeted radiotherapy sensitization agent of nasopharyngeal carcinoma

    Science.gov (United States)

    Su, Zhengying; Tian, Wei; Li, Jing; Wang, Chunmiao; Pan, Zhiyu; Li, Danrong; Hou, Huaxin

    2017-11-01

    Radiation resistance of nasopharyngeal carcinoma (NPC) is a joint effect caused by complex molecular mechanisms, and the development of multi-target radiotherapy sensitization agents offers a promising approach to the treatment of NPC. In this work, the potential of Rhein as a multi-target radiotherapy sensitization agent was explored through computer-aided virtual screening by inverse docking. To validate the accuracy of the computational results, the radiotherapy sensitization of NPC cells by Rhein and its effects on the expression of the target proteins were evaluated by CCK8 assay and Western blotting, respectively. Our results demonstrated that Rhein possessed strong binding affinity for RAC1 and HSP90. At a non-cytotoxic concentration, Rhein had a radiosensitization effect on nasopharyngeal carcinoma CNE1 cells. After treatment with Rhein and 2 Gy radiation, the expression of RAC1 was upregulated and the expression of HSP90 was downregulated in the cells. Based on these data, Rhein is likely to become an attractive lead compound for the future design of multi-target radiotherapy sensitization agents.

  11. Detection and identification of 700 drugs by multi-target screening with a 3200 Q TRAP LC-MS/MS system and library searching.

    Science.gov (United States)

    Dresen, S; Ferreirós, N; Gnann, H; Zimmermann, R; Weinmann, W

    2010-04-01

    The multi-target screening method described in this work allows the simultaneous detection and identification of 700 drugs and metabolites in biological fluids using a hybrid triple-quadrupole linear ion trap mass spectrometer in a single analytical run. After standardization of the method, the retention times of 700 compounds were determined and transitions for each compound were selected by a "scheduled" survey MRM scan, followed by an information-dependent acquisition using the sensitive enhanced product ion scan of a Q TRAP hybrid instrument. The identification of the compounds in the samples analyzed was accomplished by searching the tandem mass spectrometry (MS/MS) spectra against the library we developed, which contains electrospray ionization-MS/MS spectra of over 1,250 compounds. The multi-target screening method together with the library was included in a software program for routine screening and quantitation to achieve automated acquisition and library searching. With the help of this software application, the time for evaluation and interpretation of the results could be drastically reduced. This new multi-target screening method has been successfully applied for the analysis of postmortem and traffic offense samples as well as proficiency testing, and complements screening with immunoassays, gas chromatography-mass spectrometry, and liquid chromatography-diode-array detection. Other possible applications are analysis in clinical toxicology (for intoxication cases), in psychiatry (antidepressants and other psychoactive drugs), and in forensic toxicology (drugs and driving, workplace drug testing, oral fluid analysis, drug-facilitated sexual assault).
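
    Library searching of MS/MS spectra is commonly done by similarity scoring of binned spectra. A minimal cosine-similarity sketch is shown below; the compound names, bin layout and threshold are invented for illustration and have nothing to do with the actual 1,250-compound library described above.

```python
import numpy as np

def match_spectrum(query, library, threshold=0.7):
    """Rank library entries by cosine similarity to a query MS/MS spectrum."""
    hits = []
    for name, spec in library.items():
        sim = float(np.dot(query, spec) /
                    (np.linalg.norm(query) * np.linalg.norm(spec)))
        if sim >= threshold:
            hits.append((name, sim))
    return sorted(hits, key=lambda h: h[1], reverse=True)

# Toy binned spectra (intensity per m/z bin); names are illustrative only.
library = {
    "drug_A": np.array([0.0, 5.0, 1.0, 0.0, 9.0]),
    "drug_B": np.array([7.0, 0.0, 0.0, 3.0, 0.0]),
}
query = np.array([0.1, 4.8, 1.2, 0.0, 8.7])   # noisy version of drug_A
hits = match_spectrum(query, library)
```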

  12. A Multi-Classification Method of Improved SVM-based Information Fusion for Traffic Parameters Forecasting

    Directory of Open Access Journals (Sweden)

    Hongzhuan Zhao

    2016-04-01

    Full Text Available With the enrichment of perception methods, a modern transportation system contains many physical objects whose states are influenced by many information factors, making it a typical Cyber-Physical System (CPS). Traffic information is therefore generally multi-sourced, heterogeneous and hierarchical. Existing research shows that accurately classifying multi-sourced traffic information during information fusion can achieve better parameter forecasting performance. To solve the problem of accurate traffic information classification, by analysing the characteristics of multi-sourced traffic information and using a redefined binary tree to overcome the shortcomings of the original Support Vector Machine (SVM) classification in information fusion, a multi-classification method using an improved SVM in information fusion for traffic parameter forecasting is proposed. An experiment was conducted to examine the performance of the proposed scheme, and the results reveal that the method yields more accurate and practical outcomes.

  13. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have recently been proposed for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but suffer from scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose informed sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
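
    The statistical core, Monte Carlo sampling of paths followed by Bayesian estimation of the reach probability, can be sketched as follows. The path sampler is a stand-in (a fixed branch probability replaces actual symbolic execution), and a uniform Beta(1, 1) prior is assumed; neither detail comes from the paper.

```python
import numpy as np
from math import sqrt

rng = np.random.default_rng(3)

def reaches_target(path_sample):
    # Stand-in for executing one sampled path; the true reach probability is 0.3.
    return path_sample < 0.3

samples = rng.uniform(0.0, 1.0, 10_000)
hits = int(sum(reaches_target(s) for s in samples))

# Beta(1, 1) prior; conjugate posterior over the probability of reaching the event.
a, b = 1 + hits, 1 + len(samples) - hits
post_mean = a / (a + b)
post_sd = sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
```

The posterior mean and standard deviation support the hypothesis tests mentioned in the abstract; informed sampling would additionally prune high-probability paths and account for them exactly.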

  14. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.

  15. Multi-source waveform inversion of marine streamer data using the normalized wavefield

    KAUST Repository

    Choi, Yun Seok; Alkhalifah, Tariq Ali

    2012-01-01

    Even though the encoded multi-source approach dramatically reduces the computational cost of waveform inversion, it is generally not applicable to marine streamer data. This is because the simultaneous-sources modeled data cannot be muted to comply

  16. Binaural segregation in multisource reverberant environments.

    Science.gov (United States)

    Roman, Nicoleta; Srinivasan, Soundararajan; Wang, DeLiang

    2006-12-01

    In a natural environment, speech signals are degraded by both reverberation and concurrent noise sources. While human listening is robust under these conditions using only two ears, current two-microphone algorithms perform poorly. The psychological process of figure-ground segregation suggests that the target signal is perceived as a foreground while the remaining stimuli are perceived as a background. Accordingly, the goal is to estimate an ideal time-frequency (T-F) binary mask, which selects the target if it is stronger than the interference in a local T-F unit. In this paper, a binaural segregation system is proposed that extracts the reverberant target signal from multisource reverberant mixtures using only the location information of the target source. The proposed system combines target cancellation through adaptive filtering with a binary decision rule to estimate the ideal T-F binary mask. The main observation in this work is that the target attenuation in a T-F unit resulting from adaptive filtering is correlated with the relative strength of target to mixture. A comprehensive evaluation shows that the proposed system yields large SNR gains. In addition, comparisons using SNR as well as automatic speech recognition measures show that this system outperforms standard two-microphone beamforming approaches and a recent binaural processor.
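
    The ideal T-F binary mask criterion described above (select a unit when the target is stronger than the interference) reduces to a thresholded local SNR comparison. A minimal sketch with toy magnitude spectrograms follows; the 0 dB local criterion is an assumption, and unlike the paper's system, which estimates the mask from adaptive-filter output, this sketch assumes target and interference are known.

```python
import numpy as np

def ideal_binary_mask(target_tf, interference_tf, lc_db=0.0):
    """Select T-F units where target energy exceeds interference by lc_db."""
    snr_db = 20 * np.log10((np.abs(target_tf) + 1e-12) /
                           (np.abs(interference_tf) + 1e-12))
    return (snr_db > lc_db).astype(float)

# Toy magnitude spectrograms (frequency bins x time frames); values illustrative.
target = np.array([[4.0, 0.1], [0.2, 3.0]])
noise  = np.array([[1.0, 2.0], [2.0, 0.5]])
mask = ideal_binary_mask(target, noise)
separated = mask * (target + noise)   # mask applied to the mixture magnitudes
```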

  17. Natural products, an important resource for discovery of multitarget drugs and functional food for regulation of hepatic glucose metabolism.

    Science.gov (United States)

    Li, Jian; Yu, Haiyang; Wang, Sijian; Wang, Wei; Chen, Qian; Ma, Yanmin; Zhang, Yi; Wang, Tao

    2018-01-01

    Imbalanced hepatic glucose homeostasis is one of the critical pathologic events in the development of metabolic syndromes (MSs). Therefore, regulation of imbalanced hepatic glucose homeostasis is important in drug development for MS treatment. In this review, we discuss the major targets that regulate hepatic glucose homeostasis in human physiologic and pathophysiologic processes, involving hepatic glucose uptake, glycolysis and glycogen synthesis, and summarize their changes in MSs. Recent literature suggests the necessity of multitarget drugs in the management of MS disorder for regulation of imbalanced glucose homeostasis in both experimental models and MS patients. Here, we highlight the potential bioactive compounds from natural products with medicinal or health care values, and focus on polypharmacologic and multitarget natural products with effects on various signaling pathways in hepatic glucose metabolism. This review shows the advantage and feasibility of discovering multicompound-multitarget drugs from natural products, and providing a new perspective of ways on drug and functional food development for MSs.

  18. Optimal sizing of a multi-source energy plant for power heat and cooling generation

    International Nuclear Information System (INIS)

    Barbieri, E.S.; Dai, Y.J.; Morini, M.; Pinelli, M.; Spina, P.R.; Sun, P.; Wang, R.Z.

    2014-01-01

    Multi-source systems for fulfilling the electric, thermal and cooling demand of a building can be based on different technologies (e.g. solar photovoltaic, solar heating, cogeneration, heat pumps, absorption chillers) which use renewable, partially renewable and fossil energy sources. One of the main issues of such multi-source systems is therefore to find the appropriate size of each technology. Moreover, building energy demands depend on the climate in which the building is located and on the characteristics of the building envelope, which also influence the optimal sizing. This paper presents an analysis of the effect of different climatic scenarios on multi-source energy plant sizing. For this purpose, a model has been developed and implemented in the Matlab® environment. The model takes into consideration the load profiles for electricity, heating and cooling over a whole year, and the performance of the energy systems is modelled through a systemic approach. The optimal sizing of the different technologies composing the multi-source energy plant is investigated using a genetic algorithm, with the goal of minimizing primary energy consumption only, since the cost of technologies and, in particular, the actual tariff and incentive scenarios depend on the specific country; moreover, economic considerations may lead to inadequate solutions in terms of primary energy consumption. As a case study, the Sino-Italian Green Energy Laboratory of Shanghai Jiao Tong University has been hypothetically located in five cities in different climatic zones, with load profiles calculated by means of a TRNSYS® model. Results show that the optimal load allocation and component sizing are strictly related to the climatic data (e.g. external air temperature and solar radiation)
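
    The sizing optimization can be sketched as a small genetic algorithm over component sizes. The objective below is an invented surrogate for primary energy (made-up demands, yields and penalty weights), not the paper's TRNSYS-based model; it only illustrates selection, arithmetic crossover and mutation over bounded sizes.

```python
import numpy as np

rng = np.random.default_rng(5)

def primary_energy(sizes):
    """Surrogate objective: fossil backup covers whatever demand the units miss."""
    pv, chp, hp = sizes
    elec_served = min(pv * 1200 + chp * 4000, 60_000)   # kWh/yr, capped at demand
    heat_served = min(chp * 5000 + hp * 3000, 80_000)
    backup = (60_000 - elec_served) * 2.5 + (80_000 - heat_served) * 1.1
    embodied = 500 * pv + 800 * chp + 300 * hp          # penalizes oversizing
    return backup + embodied

def genetic_minimize(obj, bounds, pop=40, gens=60):
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    P = rng.uniform(lo, hi, size=(pop, len(bounds)))
    for _ in range(gens):
        fit = np.array([obj(x) for x in P])
        elite = P[np.argsort(fit)[: pop // 2]]                    # selection
        parents = elite[rng.integers(0, len(elite), (pop - len(elite), 2))]
        kids = parents.mean(axis=1)                               # arithmetic crossover
        kids += rng.normal(0, 0.5, kids.shape)                    # mutation
        P = np.clip(np.vstack([elite, kids]), lo, hi)
    fit = np.array([obj(x) for x in P])
    return P[np.argmin(fit)], float(fit.min())

best, energy = genetic_minimize(primary_energy, bounds=[(0, 50), (0, 20), (0, 30)])
```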

  19. Evolving Relationship Structures in Multi-sourcing Arrangements: The Case of Mission Critical Outsourcing

    Science.gov (United States)

    Heitlager, Ilja; Helms, Remko; Brinkkemper, Sjaak

    Information Technology Outsourcing practice and research mainly consider the outsourcing phenomenon as a generic fulfilment of the IT function by external parties. Inspired by the logic of commodity, core competencies and economies of scale, assets, existing departments and IT functions are transferred to external parties. Although the generic approach might work for desktop outsourcing, where standardisation is the dominant factor, it does not work for the management of mission-critical applications. Managing mission-critical applications requires a different approach in which building relationships is critical. The relationships involve inter- and intra-organisational parties in a multi-sourcing arrangement, called an IT service chain, consisting of multiple (specialist) parties that have to collaborate closely to deliver high-quality services.

  20. WE-DE-201-08: Multi-Source Rotating Shield Brachytherapy Apparatus for Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Dadkhah, H; Wu, X [University of Iowa, Iowa City, Iowa (United States); Kim, Y; Flynn, R [University of Iowa Hospitals and Clinics, Iowa City, IA (United States)

    2016-06-15

    Purpose: To introduce a novel multi-source rotating shield brachytherapy (RSBT) apparatus for the precise simultaneous angular and linear positioning of all partially shielded 153Gd radiation sources in interstitial needles for treating prostate cancer. The mechanism is designed to lower the detrimental dose to healthy tissues, the urethra in particular, relative to conventional high-dose-rate brachytherapy (HDR-BT) techniques. Methods: Following needle implantation, the delivery system is docked to the patient template. Each needle is coupled to a multi-source afterloader catheter by a connector passing through a shaft. The shafts are rotated by translating a moving template between two stationary templates. The shaft walls and the moving template holes are threaded such that the resistive friction produced between the two parts exerts enough force on the shafts to bring about the rotation. Rotation of the shaft is then transmitted to the shielded source via several keys, so the shaft's angular position is fully correlated with the position of the moving template. The catheter angles are incremented simultaneously throughout treatment as needed, and only a single 360° rotation of all catheters is needed for a full treatment. For each rotation angle, the source depth in each needle is controlled by a multi-source afterloader, proposed as an array of belt-driven linear actuators, each of which drives a source wire. Results: Optimized treatment plans based on Monte Carlo dose calculations demonstrated that RSBT with the proposed apparatus reduced the urethral D1cc below that of conventional HDR-BT by 35% for a urethral dose gradient volume within 3 mm of the urethra surface. The treatment time to deliver 20 Gy with the multi-source RSBT apparatus using nineteen 62.4 GBq 153Gd sources is 117 min. Conclusions: The proposed RSBT delivery apparatus in conjunction with multiple nitinol catheter-mounted platinum-shielded 153Gd sources enables a mechanically feasible

  1. Tracking and evolution of irrigation triggered active landslides by multi-source high resolution DEM: The Jiaojiacun landslide group of Heifangtai (Northwest of China)

    Science.gov (United States)

    Zeng, Runqiang; Meng, Xingmin; Wang, Siyuan; Chen, Guan; Lee, Yajun; Zhang, Yi

    2014-05-01

    The construction of three large hydropower stations (Liujia, Yanguo and Bapan) has forced the affected people to relocate to Heifangtai since the 1960s. To support their living and farming, a large amount of water has been pumped from the Yellow River to Heifangtai, which has changed the former groundwater budget and led to 111 landslides in this area since 1968. To reveal the deformation process of landslides in Heifangtai, a quantitative landslide deformation analysis model based on multi-source DEM data was established using four periods of topographic maps obtained in 1970, 2001, 2010 and 2013, comprising two 1:10000 topographic maps and two 1:1000 data sets acquired with a 3D laser scanner. The study area was divided into two sections based on two distinct landslide patterns. Selected morphometric parameters (residual topographic surface and surface roughness) extracted from three typical landslides, together with statistical analysis (box plots) of their temporal variations, allowed these landslides to be reconstructed and tracked. We monitored the changing landslide boundaries, average vertical and horizontal displacement rates, and zones of uplift and subsidence, and estimated the volumes of removed and/or accumulated material. We then interpret the kinematics of the landslides using information from the high-resolution DEMs, the changing groundwater table, and the ring-shear tests and soil-water characteristic curves reported by other researchers. The results provide new insight into the use of multi-source high-resolution DEMs for monitoring irrigation-triggered landslides.
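
    The DEM-differencing computations behind such displacement and volume estimates can be sketched directly. The grids, cell size and epochs below are hypothetical; the idea is simply elevation change per cell, volumes from summing signed changes times cell area, and an average vertical rate over the observation interval.

```python
import numpy as np

# Two hypothetical co-registered DEM grids (elevations in m) with 10 m cells.
cell_area = 10.0 * 10.0                        # m^2 per cell
dem_1970 = np.array([[100.0, 101.0], [102.0, 103.0]])
dem_2013 = np.array([[ 99.0, 101.5], [100.5, 103.0]])

dz = dem_2013 - dem_1970                       # vertical change per cell (m)
subsidence = -dz[dz < 0].sum() * cell_area     # removed material volume (m^3)
uplift = dz[dz > 0].sum() * cell_area          # accumulated material volume (m^3)
mean_rate = dz.mean() / (2013 - 1970)          # average vertical rate (m/yr)
```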

  2. Combinatorial support vector machines approach for virtual screening of selective multi-target serotonin reuptake inhibitors from large compound libraries.

    Science.gov (United States)

    Shi, Z; Ma, X H; Qin, C; Jia, J; Jiang, Y Y; Tan, C Y; Chen, Y Z

    2012-02-01

    Selective multi-target serotonin reuptake inhibitors enhance antidepressant efficacy, and their discovery can be facilitated by multiple methods, including in silico ones. In this study, we developed and tested an in silico method, combinatorial support vector machines (COMBI-SVMs), for virtual screening (VS) of multi-target serotonin reuptake inhibitors of seven target pairs (the serotonin transporter paired with the noradrenaline transporter, H(3) receptor, 5-HT(1A) receptor, 5-HT(1B) receptor, 5-HT(2C) receptor, melanocortin 4 receptor and neurokinin 1 receptor, respectively) from large compound libraries. COMBI-SVMs trained with 917-1951 individual target inhibitors correctly identified 22-83.3% (majority >31.1%) of the 6-216 dual inhibitors collected from the literature as independent test sets. COMBI-SVMs showed moderate to good target selectivity, misclassifying 2.2-29.8% as dual inhibitors; the virtual hits correlate with the reported effects of their predicted targets. COMBI-SVM is potentially useful for searching for selective multi-target agents without explicit knowledge of such agents. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Multi-Targeted Antithrombotic Therapy for Total Artificial Heart Device Patients.

    Science.gov (United States)

    Ramirez, Angeleah; Riley, Jeffrey B; Joyce, Lyle D

    2016-03-01

    To prevent thrombotic or bleeding events in patients receiving a total artificial heart (TAH), antithrombotic agents have been used to avoid adverse events. The purpose of this article is to outline the adoption and results of a multi-targeted antithrombotic clinical procedure guideline (CPG) for TAH patients. Based on a literature review of TAH anticoagulation and multiple case series, a CPG was designed to prescribe the use of multiple pharmacological agents. Total blood loss, Thromboelastograph® (TEG), and platelet light-transmission aggregometry (LTA) measurements were conducted on 13 TAH patients during the first 2 weeks of support in our institution. Target values and actual medians for postimplant days 1, 3, 7, and 14 were calculated for kaolin-heparinase TEG, kaolin TEG, LTA, and estimated blood loss. Protocol guidelines were followed, and anticoagulation management reduced bleeding and prevented thrombus formation as well as thromboembolic events in TAH patients postimplantation. The patients in this study were susceptible to a variety of possible complications such as mechanical device issues, thrombotic events, infection, and bleeding. Among them all, it was clear that patients were at most risk for bleeding, particularly on postoperative days 1 through 3. However, bleeding was reduced by postoperative days 3 and 7, indicating that acceptable hemostasis was achieved with the anticoagulation protocol. The multidisciplinary, multi-targeted anticoagulation clinical procedure guideline was successful in maintaining adequate antithrombotic therapy for TAH patients.

  4. Multitarget global sensitivity analysis of n-butanol combustion.

    Science.gov (United States)

    Zhou, Dingyu D Y; Davis, Michael J; Skodje, Rex T

    2013-05-02

    A model for the combustion of butanol is studied using a recently developed theoretical method for the systematic improvement of the kinetic mechanism. The butanol mechanism includes 1446 reactions, and we demonstrate that it is straightforward and computationally feasible to implement a full global sensitivity analysis incorporating all the reactions. In addition, we extend our previous analysis of ignition-delay targets to include species targets. The combination of species and ignition targets leads to multitarget global sensitivity analysis, which allows for a more complete mechanism validation procedure than we previously implemented. The inclusion of species sensitivity analysis allows for a direct comparison between reaction pathway analysis and global sensitivity analysis.
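
The variance-based decomposition behind such a global sensitivity analysis can be sketched on a toy model; the two "rate parameters" and the additive target below are invented, a far cry from the 1446-reaction mechanism, but they show how first-order indices apportion output variance.

```python
# Toy variance-based (Sobol-type) first-order sensitivity indices on a
# full-factorial grid, for an invented two-parameter "rate" model.
from itertools import product
from statistics import mean, pvariance

def first_order_indices(model, levels_1, levels_2):
    ys = [model(k1, k2) for k1, k2 in product(levels_1, levels_2)]
    total_var = pvariance(ys)
    # Variance of the conditional means E[Y | X_i]
    s1 = pvariance([mean(model(k1, k2) for k2 in levels_2) for k1 in levels_1])
    s2 = pvariance([mean(model(k1, k2) for k1 in levels_1) for k2 in levels_2])
    return s1 / total_var, s2 / total_var

# Additive toy model: an ignition-delay-like target dominated by k1.
model = lambda k1, k2: 3.0 * k1 + 1.0 * k2
s1, s2 = first_order_indices(model, [0.0, 1.0], [0.0, 1.0])
print(round(s1, 2), round(s2, 2))  # 0.9 0.1
```

Adding species targets alongside ignition targets simply means repeating this decomposition for each output and comparing which reactions dominate each one.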

  5. Scanless multitarget-matching multiphoton excitation fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Junpeng Qiu

    2018-03-01

    Using the combination of a reflective blazed grating and a reflective phase-only diffractive spatial light modulator (SLM), scanless multitarget-matching multiphoton excitation fluorescence microscopy (SMTM-MPM) was achieved. The SLM shaped an incoming mode-locked, near-infrared Ti:sapphire laser beam into an excitation pattern with addressable shapes and sizes that matched the samples of interest in the field of view. Temporal and spatial focusing were simultaneously realized by combining an objective lens and a blazed grating. The fluorescence signal from the illuminated areas was recorded by a two-dimensional sCMOS camera. Compared with a conventional temporal-focusing multiphoton microscope, our microscope achieved more effective use of the laser power and decreased photodamage with higher axial resolution.

  6. Student Self-Assessment and Multisource Feedback Assessment: Exploring Benefits, Limitations, and Remedies

    Science.gov (United States)

    Taylor, Scott N.

    2014-01-01

    It has become common practice for management students to participate in some sort of self-assessment or multisource feedback assessment (MSF; also called 360-degree assessment or multirater assessment) during their management degree program. These assessments provide students invaluable feedback about themselves and assist students in their…

  7. Multisource inverse-geometry CT. Part I. System concept and development

    Energy Technology Data Exchange (ETDEWEB)

    De Man, Bruno, E-mail: deman@ge.com; Harrison, Dan; Yin, Zhye [CT Systems and Applications Laboratory, GE Global Research, Niskayuna, New York 12309 (United States); Uribe, Jorge [Functional Imaging Laboratory, GE Global Research, Niskayuna, New York 12309 (United States); Baek, Jongduk [School of Integrated Technology, Yonsei University, Incheon 406-840 (Korea, Republic of); Longtin, Randy; Roy, Jaydeep; Frutschy, Kristopher [Mechanical Systems Technologies, GE Global Research, Niskayuna, New York 12309 (United States); Waters, Bill [Design and Development Shops, GE Global Research, Niskayuna, New York 12309 (United States); Wilson, Colin; Inzinna, Lou; Neculaes, V. Bogdan [High Energy Physics Laboratory, GE Global Research, Niskayuna, New York 12309 (United States); Short, Jonathan [Detector Laboratory, GE Global Research, Niskayuna, New York 12309 (United States); Reynolds, Joseph [High Frequency Power Electronics Laboratory, GE Global Research, Niskayuna, New York 12309 (United States); Senzig, Bob [Molecular Imaging and Computed Tomography, GE Healthcare, Waukesha, Wisconsin 53188 (United States); Pelc, Norbert [Department of Radiology, Stanford University, Stanford, California 94305 (United States)

    2016-08-15

    Purpose: This paper presents an overview of multisource inverse-geometry computed tomography (IGCT) as well as the development of a gantry-based research prototype system. The development of the distributed x-ray source is covered in a companion paper [V. B. Neculaes et al., “Multisource inverse-geometry CT. Part II. X-ray source design and prototype,” Med. Phys. 43, 4617–4627 (2016)]. While progress updates of this development have been presented at conferences and in journal papers, this paper is the first comprehensive overview of the multisource inverse-geometry CT concept and prototype. The authors also provide a review of all previous IGCT related publications. Methods: The authors designed and implemented a gantry-based 32-source IGCT scanner with 22 cm field-of-view, 16 cm z-coverage, 1 s rotation time, 1.09 × 1.024 mm detector cell size, as low as 0.4 × 0.8 mm focal spot size and 80–140 kVp x-ray source voltage. The system is built using commercially available CT components and a custom made distributed x-ray source. The authors developed dedicated controls, calibrations, and reconstruction algorithms and evaluated the system performance using phantoms and small animals. Results: The authors performed IGCT system experiments and demonstrated tube current up to 125 mA with up to 32 focal spots. The authors measured a spatial resolution of 13 lp/cm at 5% cutoff. The scatter-to-primary ratio is estimated at 62% for a 32 cm water phantom at 140 kVp. The authors scanned several phantoms and small animals. The initial images have relatively high noise due to the low x-ray flux levels but minimal artifacts. Conclusions: IGCT has unique benefits in terms of dose-efficiency and cone-beam artifacts, but comes with challenges in terms of scattered radiation and x-ray flux limits. To the authors’ knowledge, their prototype is the first gantry-based IGCT scanner. The authors summarized the design and implementation of the scanner and presented initial imaging results.

  8. Human-Scale Sustainability Assessment of Urban Intersections Based upon Multi-Source Big Data

    Directory of Open Access Journals (Sweden)

    Yuhuan Zhang

    2017-07-01

    To evaluate the sustainability of an enormous number of urban intersections, a novel assessment model is proposed, along with an indicator system and corresponding methods to determine the indicators. Considering mainly the demands and feelings of urban residents, three aspects, safety, functionality, and image perception, are taken into account in the indicator system. Based on technologies such as street-view picture crawling, image segmentation, edge detection, and GIS spatial data analysis, a rapid automated assessment method and a corresponding multi-source database are built up to determine the indicators. The improved information entropy method is applied to obtain the entropy weight of each indicator. A case study shows the efficiency and applicability of the proposed assessment model, indicator system, and algorithm.
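
The entropy weighting step can be sketched as follows. The indicator matrix is invented, and only the standard entropy-weight calculation is shown, not the paper's specific "improvement": indicators whose values vary more across alternatives carry more information and receive larger weights.

```python
# Sketch of the entropy weight method for indicator weighting.
# rows: alternatives (e.g., intersections); columns: indicators.
import math

def entropy_weights(matrix):
    """matrix[i][j]: value of indicator j at alternative i (all > 0)."""
    n, m = len(matrix), len(matrix[0])
    weights = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Shannon entropy, normalised to [0, 1] by ln(n)
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        weights.append(max(0.0, 1.0 - e))   # divergence: 1 - entropy
    s = sum(weights)
    return [w / s for w in weights]

# Columns: an invented (safety, functionality) score per intersection.
data = [[0.9, 0.5],
        [0.1, 0.5],
        [0.5, 0.5]]
w = entropy_weights(data)
print([round(x, 3) for x in w])  # [1.0, 0.0]: the uniform column is uninformative
```

The second indicator is identical across all alternatives, so its normalized entropy is 1 and it receives zero weight, which is exactly the behavior that motivates entropy weighting for automated assessment.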

  9. Using multilevel, multisource needs assessment data for planning community interventions.

    Science.gov (United States)

    Levy, Susan R; Anderson, Emily E; Issel, L Michele; Willis, Marilyn A; Dancy, Barbara L; Jacobson, Kristin M; Fleming, Shirley G; Copper, Elizabeth S; Berrios, Nerida M; Sciammarella, Esther; Ochoa, Mónica; Hebert-Beirne, Jennifer

    2004-01-01

    African Americans and Latinos share higher rates of cardiovascular disease (CVD) and diabetes compared with Whites. These diseases have common risk factors that are amenable to primary and secondary prevention. The goal of the Chicago REACH 2010-Lawndale Health Promotion Project is to eliminate disparities related to CVD and diabetes experienced by African Americans and Latinos in two contiguous Chicago neighborhoods using a community-based prevention approach. This article shares findings from the Phase 1 participatory planning process and discusses the implications these findings and lessons learned may have for programs aiming to reduce health disparities in multiethnic communities. The triangulation of data sources from the planning phase enriched interpretation and led to more creative and feasible suggestions for programmatic interventions across the four levels of the ecological framework. Multisource data yielded useful information for program planning and a better understanding of the cultural differences and similarities between African Americans and Latinos.

  10. Multi-model for the control design and diagnosis of multi-sources ...

    African Journals Online (AJOL)

    Analytical redundancy equations are introduced for the diagnosis of hybrid electrical systems, in order to detect and isolate faults that may affect the current and voltage sensors used in the control of these systems. This paper concerns multi-source renewable energy systems.

  11. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    Science.gov (United States)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
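
The basic invariance can be checked numerically on a toy joint distribution: merging the x-states that induce identical conditionals p(y|x), i.e. passing to a sufficient statistic of X about Y, leaves I[X;Y] unchanged. The distribution below is invented for illustration.

```python
# Check on a toy joint distribution that merging x-states with identical
# conditionals p(y|x) preserves the mutual information I[X;Y].
import math

def mutual_information(joint):
    """joint[i][j] = p(x=i, y=j); returns I[X;Y] in bits."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(p * math.log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)

# x has three states, but states 0 and 1 induce the same p(y|x) = (.5, .5).
joint = [[0.2, 0.2],
         [0.1, 0.1],
         [0.3, 0.1]]
# Sufficient statistic of X about Y: merge rows 0 and 1 into one state.
merged = [[0.3, 0.3],
          [0.3, 0.1]]
print(round(mutual_information(joint), 6) ==
      round(mutual_information(merged), 6))  # True
```

The paper's contribution is the two-sided version of this: both X and Y can be trimmed to their minimal sufficient statistics simultaneously without losing mutual information.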

  12. A Multi-Disciplinary University Research Initiative in Hard and Soft Information Fusion: Overview, Research Strategies and Initial Results

    Science.gov (United States)

    2010-07-01

    The University at Buffalo (UB) Center for Multisource Information Fusion (CMIF), along with a team including the Pennsylvania State University (PSU), Iona College (Iona), and Tennessee State University, presents an overview of CMIF current research on methods for Test and Evaluation ([7], [8]) involving, for example, large-factor-space experimental design techniques ([9]).

  13. Delay Bounded Multi-Source Multicast in Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Thabo Semong

    2018-01-01

    Software-Defined Networking (SDN) is the next-generation network architecture with exciting application prospects. The control function in SDN is decoupled from the data forwarding plane, hence it provides a new centralized architecture with flexible network resource management. Although SDN is attracting much attention from both industry and research, its advantage over traditional networks has not been fully utilized. Multicast is designed to deliver content to multiple destinations. The current traffic engineering in SDN focuses mainly on unicast; however, multicast can effectively reduce network resource consumption by serving multiple clients. This paper studies a novel delay-bounded multi-source multicast SDN problem, in which, among the set of potential sources, we select a source to build the multicast tree, under the constraint that the transmission delay for every destination is bounded. This problem is more difficult than the traditional Steiner minimum tree (SMT) problem, since it needs to find a source from the set of all potential sources. We model the problem as a mixed-integer linear program (MILP) and prove its NP-hardness. To solve the problem, a delay-bounded multi-source (DBMS) scheme is proposed, which includes a DBMS algorithm to build a minimum-delay-cost DBMS-Forest. Through a MATLAB experiment, we demonstrate that DBMS is significantly more efficient and outperforms other existing algorithms in the literature.
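
The source-selection flavor of the problem can be sketched with shortest-delay paths: for each candidate source, check whether every destination is reachable within the delay bound, then keep the best feasible source. This greedy check only illustrates the constraint, not the paper's DBMS algorithm or its MILP; the graph and delays are invented.

```python
# Sketch of delay-bounded source selection over a tiny directed graph.
import heapq

def dijkstra(graph, src):
    """Shortest delays from src; graph[u] = list of (neighbor, delay)."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

def pick_source(graph, sources, dests, bound):
    """Among feasible sources, pick the one with least total delay."""
    feasible = []
    for s in sources:
        dist = dijkstra(graph, s)
        delays = [dist.get(t, float("inf")) for t in dests]
        if max(delays) <= bound:
            feasible.append((sum(delays), s))
    return min(feasible)[1] if feasible else None

graph = {"s1": [("a", 2)], "s2": [("a", 1)],
         "a": [("d1", 2), ("d2", 3)], "d1": [], "d2": []}
best = pick_source(graph, ["s1", "s2"], ["d1", "d2"], bound=4)
print(best)  # s2: the only source meeting the bound for both destinations
```

The shortest-path union used here is only a stand-in for the DBMS-Forest; the hard part the paper addresses is doing this jointly with tree construction at scale.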

  14. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; and comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.

  15. Energy Harvesting Research: The Road from Single Source to Multisource.

    Science.gov (United States)

    Bai, Yang; Jantunen, Heli; Juuti, Jari

    2018-06-07

    Energy harvesting technology may be considered an ultimate solution to replace batteries and provide a long-term power supply for wireless sensor networks. Looking back into its research history, individual energy harvesters for the conversion of single energy sources into electricity were developed first, followed by hybrid counterparts designed for use with multiple energy sources. Very recently, the concept of a truly multisource energy harvester built from only a single piece of material as the energy conversion component was proposed. This review, from the aspect of materials and device configurations, covers a wide scope in detail to give an overview of energy harvesting research. It covers single-source devices including solar, thermal, kinetic, and other types of energy harvesters; hybrid energy harvesting configurations for both single and multiple energy sources; and single-material, multisource energy harvesters. It also includes the energy conversion principles of photovoltaic, electromagnetic, piezoelectric, triboelectric, electrostatic, electrostrictive, thermoelectric, pyroelectric, magnetostrictive, and dielectric devices. This is one of the most comprehensive reviews conducted to date, focusing on the entire energy harvesting research scene and providing a guide to seeking deeper and more specific research references and resources from every corner of the scientific community. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Application of multi-source waveform inversion to marine streamer data using the global correlation norm

    KAUST Repository

    Choi, Yun Seok; Alkhalifah, Tariq Ali

    2012-01-01

    Conventional multi-source waveform inversion using an objective function based on the least-square misfit cannot be applied to marine streamer acquisition data because of inconsistent acquisition geometries between observed and modelled data.

  17. School adjustment of children in residential care: a multi-source analysis.

    Science.gov (United States)

    Martín, Eduardo; Muñoz de Bustillo, María del Carmen

    2009-11-01

    School adjustment is one of the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of Perceptive Attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado" (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.

  18. Painless, safe, and efficacious noninvasive skin tightening, body contouring, and cellulite reduction using multisource 3DEEP radiofrequency.

    Science.gov (United States)

    Harth, Yoram

    2015-03-01

    In the last decade, radiofrequency (RF) energy has proven to be safe and highly efficacious for face and neck skin tightening, body contouring, and cellulite reduction. In contrast to first-generation monopolar/bipolar and "X-Polar" RF systems, which use one RF generator connected to one or more skin electrodes, multisource radiofrequency devices use six independent RF generators, allowing efficient dermal heating to 52-55°C with no pain or risk of other side effects. In this review, the basic science and clinical results of body contouring and cellulite treatment using a multisource radiofrequency system (EndyMed PRO, EndyMed, Caesarea, Israel) will be discussed and analyzed. © 2015 Wiley Periodicals, Inc.

  19. Multi-source least-squares migration of marine data

    KAUST Repository

    Wang, Xin

    2012-11-04

    Kirchhoff based multi-source least-squares migration (MSLSM) is applied to marine streamer data. To suppress the crosstalk noise from the excitation of multiple sources, a dynamic encoding function (including both time-shifts and polarity changes) is applied to the receiver side traces. Results show that the MSLSM images are of better quality than the standard Kirchhoff migration and reverse time migration images; moreover, the migration artifacts are reduced and image resolution is significantly improved. The computational cost of MSLSM is about the same as conventional least-squares migration, but its I/O cost is significantly decreased.

  20. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    Science.gov (United States)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valued information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven model of industrial process parameters based on mechanical vibration and acoustic frequency spectra, with selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
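
The "selective" part of a SEN can be sketched with a greedy forward selection over candidate sub-models, a simple stand-in for the paper's branch-and-bound selection and adaptive weighted fusion; the sub-model outputs and targets below are invented.

```python
# Greedy selective-ensemble sketch: keep only sub-models whose inclusion
# lowers the validation error of the averaged prediction.

def rmse(pred, target):
    return (sum((p - t) ** 2 for p, t in zip(pred, target)) / len(target)) ** 0.5

def average(preds):
    return [sum(vals) / len(vals) for vals in zip(*preds)]

def selective_ensemble(candidates, target):
    chosen, best, improved = [], float("inf"), True
    while improved:
        improved = False
        for name, pred in candidates.items():
            if name in [n for n, _ in chosen]:
                continue
            err = rmse(average([p for _, p in chosen] + [pred]), target)
            if err < best:
                best, pick, improved = err, (name, pred), True
        if improved:
            chosen.append(pick)
    return [n for n, _ in chosen], best

target = [1.0, 2.0, 3.0]
candidates = {"m1": [1.1, 2.1, 3.1],   # good model, high bias
              "m2": [0.9, 1.9, 2.9],   # good model, opposite bias
              "m3": [3.0, 0.0, 5.0]}   # poor model, should be dropped
names, err = selective_ensemble(candidates, target)
print(sorted(names))  # ['m1', 'm2']: the two biases cancel, m3 is excluded
```

This illustrates why selecting sub-models beats combining all of them: the averaged pair nearly cancels its biases, while including the poor model would only raise the error.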

  1. A Student’s t Mixture Probability Hypothesis Density Filter for Multi-Target Tracking with Outliers

    Science.gov (United States)

    Liu, Zhuowei; Chen, Shuxin; Wu, Hao; He, Renke; Hao, Lin

    2018-01-01

    In multi-target tracking, outlier-corrupted process and measurement noises can severely reduce the performance of the probability hypothesis density (PHD) filter. To solve the problem, this paper proposes a novel PHD filter, called the Student's t mixture PHD (STM-PHD) filter. The proposed filter models the heavy-tailed process noise and measurement noise as Student's t distributions and approximates the multi-target intensity as a mixture of Student's t components to be propagated in time. Then, a closed-form PHD recursion is obtained based on the Student's t approximation. Our approach can make full use of the heavy-tailed characteristic of the Student's t distribution to handle situations with heavy-tailed process and measurement noises. The simulation results verify that the proposed filter can overcome the negative effect generated by outliers and maintain good tracking accuracy in the simultaneous presence of process and measurement outliers. PMID:29617348
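
The robustness argument rests on the heavy tails of the Student's t density. A quick numeric comparison using the standard density formulas (with an invented 6-sigma outlier) shows why a t-based filter does not effectively veto outlying measurements the way a Gaussian model does.

```python
# Compare the likelihood a Gaussian vs. a Student's t noise model
# assigns to a 6-sigma outlier.
import math

def gauss_pdf(x, mu=0.0, sigma=1.0):
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def student_t_pdf(x, nu=3.0, mu=0.0, sigma=1.0):
    z = (x - mu) / sigma
    c = math.gamma((nu + 1) / 2) / (
        math.gamma(nu / 2) * math.sqrt(nu * math.pi) * sigma)
    return c * (1 + z * z / nu) ** (-(nu + 1) / 2)

print(gauss_pdf(6.0))      # ~6e-9: the Gaussian all but vetoes the point
print(student_t_pdf(6.0))  # ~2e-3: the t model keeps it plausible
```

In a filter update, these likelihoods become weights, so under the t model an outlier perturbs the posterior mildly instead of dominating or being discarded.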

  2. A GIS-based multi-source and multi-box modeling approach (GMSMB) for air pollution assessment--a North American case study.

    Science.gov (United States)

    Wang, Bao-Zhen; Chen, Zhi

    2013-01-01

    This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source and multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of data required for air quality modeling, including emission sources, air quality monitoring, meteorological data, and spatial location information, is brought into an integrated modeling environment. This allows more details of the spatial variation in source distribution and meteorological conditions to be quantitatively analyzed. The developed modeling approach has been used to predict the spatial concentration distribution of four air pollutants (CO, NO(2), SO(2) and PM(2.5)) for the State of California. The modeling results are compared with the monitoring data. Good agreement is obtained, which demonstrates that the developed modeling approach can deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
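
The multi-source superposition can be sketched with the textbook ground-level Gaussian plume formula. Emission rates, wind speed, and the fixed dispersion parameters below are illustrative only; real applications grow sigma_y and sigma_z with downwind distance and stability class, and the paper's GMSMB adds the multi-box and GIS machinery on top.

```python
# Sketch: summing ground-level Gaussian plume contributions from
# multiple point sources at one receptor (wind along +x for simplicity).
import math

def plume_conc(q, u, x, y, sigma_y, sigma_z):
    """Ground-level, ground-release concentration at downwind x, crosswind y."""
    if x <= 0:
        return 0.0  # receptor upwind of the source sees nothing
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-0.5 * (y / sigma_y) ** 2))

def multi_source_conc(sources, receptor, u=3.0):
    """Superpose all point-source contributions at one receptor."""
    rx, ry = receptor
    total = 0.0
    for sx, sy, q in sources:
        total += plume_conc(q, u, rx - sx, ry - sy,
                            sigma_y=50.0, sigma_z=20.0)
    return total

sources = [(0.0, 0.0, 10.0), (500.0, 100.0, 5.0)]  # (x, y, emission rate)
conc = multi_source_conc(sources, receptor=(1000.0, 0.0))
print(conc)
```

Because the model is linear in the emission rates, the receptor concentration is exactly the sum of per-source runs, which is what makes gridded multi-source evaluation tractable.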

  3. Data association approaches in bearings-only multi-target tracking

    Science.gov (United States)

    Xu, Benlian; Wang, Zhiquan

    2008-03-01

    To meet the requirements of computational complexity and correctness of data association in multi-target tracking, two algorithms are suggested in this paper. The proposed Algorithm 1 is developed from a modified version of the dual simplex method, and it has the advantage of a direct and explicit form of the optimal solution. Algorithm 2 is based on the idea of Algorithm 1 and a rotational sort method; it not only retains the advantages of Algorithm 1 but also reduces the computational burden, with a complexity only 1/N times that of Algorithm 1. Finally, numerical analyses are carried out to evaluate the performance of the two data association algorithms.
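
The underlying assignment problem can be solved by brute force at toy scale, which makes the objective the paper's simplex-derived algorithms optimize concrete; the track-to-measurement cost matrix below is invented.

```python
# Brute-force optimal track-to-measurement assignment on a tiny cost
# matrix (what efficient data-association algorithms solve at scale).
from itertools import permutations

def best_assignment(cost):
    """Return (perm, cost): perm[i] is the measurement assigned to track i."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        c = sum(cost[i][j] for i, j in enumerate(perm))
        if c < best_cost:
            best_perm, best_cost = perm, c
    return best_perm, best_cost

cost = [[4.0, 1.0, 3.0],
        [2.0, 0.0, 5.0],
        [3.0, 2.0, 2.0]]
perm, c = best_assignment(cost)
print(perm, c)  # (1, 0, 2) with total cost 5.0
```

Brute force is O(n!), which is why simplex-based and other polynomial assignment methods matter once the number of tracks and measurements grows.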

  4. In vitro radiosensitivity of six human cell lines. A comparative study with different statistical models

    International Nuclear Information System (INIS)

    Fertil, B.; Deschavanne, P.J.; Lachet, B.; Malaise, E.P.

    1980-01-01

    The intrinsic radiosensitivity of human cell lines (five tumor and one nontransformed fibroblastic) was studied in vitro. The survival curves were fitted by the single-hit multitarget, the two-hit multitarget, the single-hit multitarget with initial slope, and the quadratic models. The accuracy of the experimental results permitted evaluation of the various fittings. Both a statistical test (comparison of the variances left unexplained by the four models) and a biological consideration (a check for independence of the fitted parameters vis-à-vis the portion of the survival curve in question) were carried out. The quadratic model came out best with each of them. It described the low-dose effects satisfactorily, revealing a single-hit lethal component. This finding, and the fact that the six survival curves displayed a continuous curvature, ruled out the adoption of the target models as well as the widely used linear regression. As calculated by the quadratic model, the parameters of the six cell lines lead to the following conclusions: (a) the intrinsic radiosensitivity varies greatly among the different cell lines; (b) the interpretation of the fibroblast survival curve is not basically different from that of the tumor cell lines; and (c) the radiosensitivity of these human cell lines is comparable to that of other mammalian cell lines.
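
The fitted models can be written explicitly in their standard forms, with D the dose, which makes the comparison concrete:

```latex
% Single-hit multitarget model (n targets, characteristic dose D_0):
S(D) = 1 - \left(1 - e^{-D/D_0}\right)^{n}

% Single-hit multitarget with initial slope (extra single-hit dose D_1):
S(D) = e^{-D/D_1}\left[1 - \left(1 - e^{-D/D_0}\right)^{n}\right]

% Quadratic (linear-quadratic) model; \alpha D is the single-hit
% lethal component noted in the abstract:
S(D) = e^{-\left(\alpha D + \beta D^{2}\right)}
```

The plain multitarget forms have zero initial slope, so a data set showing a single-hit low-dose component and continuous curvature naturally favors the quadratic model, as the abstract reports.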

  5. A 12-week clinical and instrumental study evaluating the efficacy of a multisource radiofrequency home-use device for wrinkle reduction and improvement in skin tone, skin elasticity, and dermal collagen content.

    Science.gov (United States)

    Sadick, Neil S; Harth, Yoram

    2016-12-01

    This study was performed in order to evaluate the safety and efficacy of a new handheld home-use multisource radiofrequency device on facial rejuvenation. Forty-seven male and female subjects were enrolled. All subjects received a NEWA® 3DEEP™ home-use device (EndyMed Medical, Caesarea, Israel) to be used on facial skin three times per week for the first four weeks and then reduced to two times per week for the following eight weeks. Assessments included expert clinical grading for efficacy, instrumental evaluation, image analysis, and photography. Forty-five subjects completed the study; all subjects reported the treatment to be painless with only mild erythema lasting up to 15 minutes post-treatment. No other adverse events were reported. Statistically significant improvements were noted in the appearance of marionette lines, skin brightness, elasticity, firmness, lift (facial), lift (jawline), texture/smoothness, tone, and radiance/luminosity by expert visual assessment. Statistically significant improvements in skin firmness and elasticity were found using a Cutometer MPA 580, as well as in collagen and hemoglobin content of the skin using a SIAscope. The results of this study indicate that the NEWA® multisource radiofrequency home-use device is effective in self-administered skin rejuvenation.

  6. Least-squares migration of multisource data with a deblurring filter

    KAUST Repository

    Dai, Wei; Wang, Xin; Schuster, Gerard T.

    2011-01-01

    Least-squares migration (LSM) has been shown to be able to produce high-quality migration images, but its computational cost is considered to be too high for practical imaging. We have developed a multisource least-squares migration algorithm (MLSM) to increase the computational efficiency by using the blended sources processing technique. To expedite convergence, a multisource deblurring filter is used as a preconditioner to reduce the data residual. This MLSM algorithm is applicable with Kirchhoff migration, wave-equation migration, or reverse time migration, and the gain in computational efficiency depends on the choice of migration method. Numerical results with Kirchhoff LSM on the 2D SEG/EAGE salt model show that an accurate image is obtained by migrating a supergather of 320 phase-encoded shots. When the encoding functions are the same for every iteration, the input/output cost of MLSM is reduced by 320 times. Empirical results show that the crosstalk noise introduced by blended sources is more effectively reduced when the encoding functions are changed at every iteration. The analysis of signal-to-noise ratio (S/N) suggests that not too many iterations are needed to enhance the S/N to an acceptable level. Therefore, when implemented with wave-equation migration or reverse time migration methods, the MLSM algorithm can be more efficient than the conventional migration method. © 2011 Society of Exploration Geophysicists.
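
The role of the encoding functions can be demonstrated with a toy polarity-encoding experiment: decoding a blended supergather returns the target shot plus crosstalk from the other shots, and re-drawing the encoding at each realization (analogous to changing the functions at every iteration) averages the crosstalk toward zero. The impulse traces below are invented, and only polarity encoding is shown, not time shifts or the migration itself.

```python
# Toy demo of polarity encoding in blended-source processing.
import random

def blend(shots, polarities):
    """Sum polarity-encoded shots into one blended trace."""
    return [sum(p * s[i] for p, s in zip(polarities, shots))
            for i in range(len(shots[0]))]

def decode(blended, polarity):
    """Correlate with one shot's code: that shot plus crosstalk."""
    return [polarity * v for v in blended]

random.seed(0)
shots = [[1.0, 0.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0]]  # two impulse traces
n_realizations = 500
stack = [0.0] * 4
for _ in range(n_realizations):
    pols = [random.choice([-1.0, 1.0]) for _ in shots]
    stack = [a + b / n_realizations
             for a, b in zip(stack, decode(blend(shots, pols), pols[0]))]

# The shot-1 impulse (index 0) survives; shot-2 crosstalk (index 2) decays.
print(round(stack[0], 2), abs(stack[2]) < 0.5)
```

With a fixed encoding the crosstalk term would be coherent and never average out, which is the intuition behind the observation that re-drawn encoding functions suppress crosstalk more effectively.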

  8. Multidemand Multisource Order Quantity Allocation with Multiple Transportation Alternatives

    Directory of Open Access Journals (Sweden)

    Jun Gang

    2015-01-01

    Full Text Available This paper focuses on a multidemand multisource order quantity allocation problem with multiple transportation alternatives. To solve this problem, a bilevel multiobjective programming model under a mixed uncertain environment is proposed. Two levels of decision makers are considered in the model. On the upper level, the purchaser aims to allocate order quantity to multiple suppliers for each demand node with the consideration of three objectives: total purchase cost minimization, total delay risk minimization, and total defect risk minimization. On the lower level, each supplier attempts to optimize the transportation alternatives with total transportation and penalty costs minimization as the objective. In contrast to prior studies, considering the information asymmetry in the bilevel decision, random and fuzzy random variables are used to model uncertain parameters of the construction company and the suppliers. To solve the bilevel model, a solution method based on Kuhn-Tucker conditions, sectional genetic algorithm, and fuzzy random simulation is proposed. Finally, the applicability of the proposed model and algorithm is evaluated through a practical case from a large-scale construction project. The results show that the proposed model and algorithm are efficient in dealing with practical order quantity allocation problems.
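
The upper-level purchasing decision can be caricatured with a much simpler single-objective sketch (hypothetical names and numbers; the paper's bilevel multiobjective model with fuzzy random variables is far richer): fill a demand node from the cheapest suppliers first, subject to capacity.

```python
def allocate(demand, suppliers):
    """Greedy cost-only allocation sketch: fill demand from the
    cheapest suppliers first, respecting each supplier's capacity.
    suppliers: list of (name, capacity, unit_cost) tuples."""
    plan = {}
    remaining = demand
    for name, capacity, unit_cost in sorted(suppliers, key=lambda s: s[2]):
        if remaining <= 0:
            break
        qty = min(capacity, remaining)   # take as much as possible here
        plan[name] = qty
        remaining -= qty
    return plan

# Hypothetical demand node of 100 units and three candidate suppliers
plan = allocate(100, [("S1", 60, 2.0), ("S2", 80, 1.5), ("S3", 50, 3.0)])
# cheapest supplier S2 is filled first (80 units), S1 covers the rest (20)
```

The actual model trades this single objective off against delay and defect risk and lets each supplier react on the lower level, which is why the authors need Kuhn-Tucker conditions and a genetic algorithm rather than a greedy pass.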

  9. Rating leniency and halo in multisource feedback ratings: testing cultural assumptions of power distance and individualism-collectivism.

    Science.gov (United States)

    Ng, Kok-Yee; Koh, Christine; Ang, Soon; Kennedy, Jeffrey C; Chan, Kim-Yin

    2011-09-01

    This study extends multisource feedback research by assessing the effects of rater source and raters' cultural value orientations on rating bias (leniency and halo). Using a motivational perspective of performance appraisal, the authors posit that subordinate raters, followed by peers, will exhibit more rating bias than superiors. More important, given that multisource feedback systems were premised on low power distance and individualistic cultural assumptions, the authors expect raters' power distance and individualism-collectivism orientations to moderate the effects of rater source on rating bias. Hierarchical linear modeling on data collected from 1,447 superiors, peers, and subordinates who provided developmental feedback to 172 military officers shows that (a) subordinates exhibit the most rating leniency, followed by peers and superiors; (b) subordinates demonstrate more halo than superiors and peers, whereas superiors and peers do not differ; (c) the effects of power distance on leniency and halo are stronger for subordinates than for peers and superiors; (d) the effects of collectivism on leniency were stronger for subordinates and peers than for superiors; effects on halo were stronger for subordinates than superiors, but did not differ between subordinates and peers. The present findings highlight the role of raters' cultural values in multisource feedback ratings. PsycINFO Database Record (c) 2011 APA, all rights reserved

  10. Source-independent time-domain waveform inversion using convolved wavefields: Application to the encoded multisource waveform inversion

    KAUST Repository

    Choi, Yun Seok; Alkhalifah, Tariq Ali

    2011-01-01

    Full waveform inversion requires a good estimation of the source wavelet to improve our chances of a successful inversion. This is especially true for an encoded multisource time-domain implementation, which, conventionally, requires separate

  11. Computer-aided design of multi-target ligands at A1R, A2AR and PDE10A, key proteins in neurodegenerative diseases.

    Science.gov (United States)

    Kalash, Leen; Val, Cristina; Azuaje, Jhonny; Loza, María I; Svensson, Fredrik; Zoufir, Azedine; Mervin, Lewis; Ladds, Graham; Brea, José; Glen, Robert; Sotelo, Eddy; Bender, Andreas

    2017-12-30

    Compounds designed to display polypharmacology may have utility in treating complex diseases, where activity at multiple targets is required to produce a clinical effect. In particular, suitable compounds may be useful in treating neurodegenerative diseases by promoting neuronal survival in a synergistic manner via their multi-target activity at the adenosine A1 and A2A receptors (A1R and A2AR) and phosphodiesterase 10A (PDE10A), which modulate intracellular cAMP levels. Hence, in this work we describe a computational method for the design of synthetically feasible ligands that bind to the A1 and A2A receptors and inhibit PDE10A, involving a retrosynthetic approach employing in silico target prediction and docking, which may be generally applicable to multi-target compound design at several target classes. This approach has identified 2-aminopyridine-3-carbonitriles as the first multi-target ligands at A1R, A2AR and PDE10A, by showing agreement between the ligand- and structure-based predictions at these targets. The series was synthesized via an efficient one-pot scheme and validated pharmacologically as A1R/A2AR-PDE10A ligands, with IC50 values of 2.4-10.0 μM at PDE10A and Ki values of 34-294 nM at A1R and/or A2AR. Furthermore, selectivity profiling of the synthesized 2-aminopyridine-3-carbonitriles against other subtypes of both protein families showed that the multi-target ligand 8 exhibited a minimum of twofold selectivity over all tested off-targets. In addition, both compounds 8 and 16 exhibited the desired multi-target profile, which could be considered for further functional efficacy assessment, analog modification for the improvement of selectivity towards A1R, A2AR and PDE10A collectively, and evaluation of their potential synergy in modulating cAMP levels.

  12. A multi-source dataset of urban life in the city of Milan and the Province of Trentino.

    Science.gov (United States)

    Barlacchi, Gianni; De Nadai, Marco; Larcher, Roberto; Casella, Antonio; Chitic, Cristiana; Torrisi, Giovanni; Antonelli, Fabrizio; Vespignani, Alessandro; Pentland, Alex; Lepri, Bruno

    2015-01-01

    The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others.

  13. A multi-source feedback tool for measuring a subset of Pediatrics Milestones.

    Science.gov (United States)

    Schwartz, Alan; Margolis, Melissa J; Multerer, Sara; Haftel, Hilary M; Schumacher, Daniel J

    2016-10-01

    The Pediatrics Milestones Assessment Pilot employed a new multisource feedback (MSF) instrument to assess nine Pediatrics Milestones among interns and subinterns in the inpatient context. To report validity evidence for the MSF tool for informing milestone classification decisions, we obtained MSF instruments completed by different raters per learner per rotation and present evidence for validity based on the unified validity framework. One hundred ninety-two interns and 41 subinterns at 18 Pediatrics residency programs received a total of 1084 MSF forms from faculty (40%), senior residents (34%), nurses (22%), and other staff (4%). Variance in ratings was associated primarily with rater (32%) and learner (22%). The milestone factor structure fit the data better than simpler structures. In all domains except professionalism, ratings by nurses were significantly lower than those by faculty, and ratings by other staff were significantly higher. Ratings were higher when the rater had observed the learner for longer periods and held a positive global opinion of the learner. Ratings of interns and subinterns did not differ, except for ratings by senior residents. MSF-based scales correlated with summative milestone scores. We obtained moderately reliable MSF ratings of interns and subinterns in the inpatient context to inform some milestone assignments.

  14. YBa2Cu3O(7-x) based superconducting thin films by multitarget sputtering

    International Nuclear Information System (INIS)

    Bouteloup, E.; Mercey, B.; Poullain, G.; Brousse, T.; Murray, H.; Raveau, B.

    1990-01-01

    This paper reports a new technique to prepare superconducting YBa2Cu3O(7-x) thin films. The multitarget sputtering apparatus described below allows the simultaneous and reproducible production of numerous films with a metallic composition close to Y 17%, Ba 33%, Cu 50%. Superconducting films (R = 0) at 80 K have been produced on polycrystalline zirconia substrates after a high-temperature annealing. [fr]

  15. Multi-target drugs: the trend of drug research and development.

    Science.gov (United States)

    Lu, Jin-Jian; Pan, Wei; Hu, Yuan-Jia; Wang, Yi-Tao

    2012-01-01

    Summarizing the status of drugs in the market and examining the trend of drug research and development is important in drug discovery. In this study, we compared the drug targets and the market sales of the new molecular entities approved by the U.S. Food and Drug Administration from January 2000 to December 2009. Two networks, namely, the target-target and drug-drug networks, have been set up using the network analysis tools. The multi-target drugs have much more potential, as shown by the network visualization and the market trends. We discussed the possible reasons and proposed the rational strategies for drug research and development in the future.

  16. Optimization Design and Simulation of a Multi-Source Energy Harvester Based on Solar and Radioisotope Energy Sources

    Directory of Open Access Journals (Sweden)

    Hao Li

    2016-12-01

    Full Text Available A novel multi-source energy harvester based on solar and radioisotope energy sources is designed and simulated in this work. We established the calculation formulas for the short-circuit current and open-circuit voltage, and then studied and analyzed the optimal semiconductor thickness, doping concentration, and junction depth by simulating the transport of β particles in the semiconductor material with the Monte Carlo simulation program MCNP (version 5, Radiation Safety Information Computational Center, Oak Ridge, TN, USA). In order to improve the efficiency of converting solar light energy into electric power, we adopted PC1D (version 5.9, University of New South Wales, Sydney, Australia) to optimize the parameters, and selected the best parameters for converting both the radioisotope energy and solar energy into electricity. The results concluded that the best parameters for the multi-source energy harvester are as follows: Na is 1 × 10^19 cm^-3, Nd is 3.8 × 10^16 cm^-3, the PN junction depth is 0.5 μm (using the 147Pm radioisotope source), and so on. Under these parameters, the proposed harvester can achieve a conversion efficiency of 5.05% for the 147Pm radioisotope source (with an activity of 9.25 × 10^8 Bq) and 20.8% for solar light radiation (AM1.5). Such a design and parameters are valuable for some unique micro-power fields, such as applications in space, isolated terrestrial applications, and smart dust in battlefields.
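
The reported figures allow a rough power-budget check (the mean 147Pm beta energy is our assumption, not stated in the abstract): multiplying the source activity by the mean energy per decay gives the incident beta power, and the reported efficiency then gives the electrical output.

```python
EV_TO_J = 1.602e-19  # joules per electronvolt

# Values taken from the abstract
activity_bq = 9.25e8       # 147Pm source activity, decays/s
efficiency = 0.0505        # reported betavoltaic conversion efficiency

# Assumed value (not in the abstract): mean 147Pm beta energy ~62 keV
mean_beta_ev = 62e3

p_in = activity_bq * mean_beta_ev * EV_TO_J   # incident beta power, W
p_out = efficiency * p_in                     # electrical output, W
# p_in is on the order of 9 µW, p_out a few hundred nW, i.e. the
# smart-dust / micro-power regime the abstract targets
```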

  17. Three methods of multi-source feedback compared: a plea for narrative comments and coworkers' perspectives

    NARCIS (Netherlands)

    Overeem, K.; Lombarts, M. J. M. H.; Arah, O. A.; Klazinga, N. S.; Grol, R. P. T. M.; Wollersheim, H. C.

    2010-01-01

    Doctor performance assessments based on multi-source feedback (MSF) are increasingly central in professional self-regulation. Research has shown that simple MSF is often unproductive. It has been suggested that MSF should be delivered by a facilitator and combined with a portfolio. To compare three

  18. Three methods of multi-source feedback compared: a plea for narrative comments and coworkers' perspectives.

    NARCIS (Netherlands)

    Overeem, K.; Lombarts, M.J.; Arah, O.A.; Klazinga, N.S.; Grol, R.P.T.M.; Wollersheim, H.C.H.

    2010-01-01

    BACKGROUND: Doctor performance assessments based on multi-source feedback (MSF) are increasingly central in professional self-regulation. Research has shown that simple MSF is often unproductive. It has been suggested that MSF should be delivered by a facilitator and combined with a portfolio. AIMS:

  19. Gradient-Type Magnetoelectric Current Sensor with Strong Multisource Noise Suppression.

    Science.gov (United States)

    Zhang, Mingji; Or, Siu Wing

    2018-02-14

    A novel gradient-type magnetoelectric (ME) current sensor operating in magnetic field gradient (MFG) detection and conversion mode is developed based on a pair of ME composites that have a back-to-back capacitor configuration under a baseline separation and a magnetic biasing in an electrically-shielded and mechanically-enclosed housing. The physics behind the current sensing process is the product effect of the current-induced MFG effect associated with vortex magnetic fields of current-carrying cables (i.e., MFG detection) and the MFG-induced ME effect in the ME composite pair (i.e., MFG conversion). The sensor output voltage is directly obtained from the gradient ME voltage of the ME composite pair and is calibrated against cable current to give the current sensitivity. The current sensing performance of the sensor is evaluated, both theoretically and experimentally, under multisource noises of electric fields, magnetic fields, vibrations, and thermals. The sensor combines the merits of small nonlinearity in the current-induced MFG effect with those of high sensitivity and high common-mode noise rejection rate in the MFG-induced ME effect to achieve a high current sensitivity of 0.65-12.55 mV/A in the frequency range of 10 Hz-170 kHz, a small input-output nonlinearity of <500 ppm, a small thermal drift of <0.2%/℃ in the current range of 0-20 A, and a high common-mode noise rejection rate of 17-28 dB from multisource noises.
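
The common-mode rejection underlying the gradient configuration can be sketched numerically (all field values are illustrative assumptions): the cable's vortex field differs between the two composites at the baseline separation, while a distant interference field contributes equally to both and cancels in the difference.

```python
MU0_OVER_2PI = 2e-7  # mu0 / (2*pi), in T*m/A

def cable_field(current, r):
    # Vortex field of a long straight current-carrying cable at distance r
    return MU0_OVER_2PI * current / r

current = 10.0            # cable current, A (assumed)
r1, r2 = 0.02, 0.05       # distances of the two ME composites, m (assumed)
uniform_noise = 5e-6      # far-field interference seen by both, T (assumed)

b1 = cable_field(current, r1) + uniform_noise
b2 = cable_field(current, r2) + uniform_noise

# The differential (gradient) output: the uniform noise cancels exactly,
# leaving only the current-induced field-gradient signal.
gradient_signal = b1 - b2
```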

  20. A beam optics study of a modular multi-source X-ray tube for novel computed tomography applications

    Science.gov (United States)

    Walker, Brandon J.; Radtke, Jeff; Chen, Guang-Hong; Eliceiri, Kevin W.; Mackie, Thomas R.

    2017-10-01

    A modular implementation of a scanning multi-source X-ray tube is designed for the increasing number of multi-source imaging applications in computed tomography (CT). An electron beam array coupled with an oscillating magnetic deflector is proposed as a means for producing an X-ray focal spot at any position along a line. The preliminary multi-source model includes three thermionic electron guns that are deflected in tandem by a slowly varying magnetic field and pulsed according to a scanning sequence that is dependent on the intended imaging application. Particle tracking simulations with particle dynamics analysis software demonstrate that three 100 keV electron beams are laterally swept a combined distance of 15 cm over a stationary target with an oscillating magnetic field of 102 G perpendicular to the beam axis. Beam modulation is accomplished using 25 μs pulse widths to a grid electrode with a reverse gate bias of -500 V and an extraction voltage of +1000 V. Projected focal spot diameters are approximately 1 mm for 138 mA electron beams and the stationary target stays within thermal limits for the 14 kW module. This concept could be used as a research platform for investigating high-speed stationary CT scanners, for lowering dose with virtual fan beam formation, for reducing scatter radiation in cone-beam CT, or for other industrial applications.
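
As a back-of-envelope plausibility check (our calculation, not from the paper), the gyroradius of a 100 keV electron in the quoted 102 G deflection field is on the order of the stated 15 cm sweep:

```python
import math

MC2 = 0.511e6        # electron rest energy, eV
Q = 1.602e-19        # elementary charge, C
C = 2.998e8          # speed of light, m/s

T = 100e3            # electron kinetic energy, eV (from the abstract)
B = 102e-4           # 102 gauss in tesla (from the abstract)

# Relativistic momentum: (pc)^2 = T(T + 2*mc^2)
pc_ev = math.sqrt(T * (T + 2 * MC2))   # momentum times c, in eV
p = pc_ev * Q / C                      # momentum in SI units, kg*m/s
r = p / (Q * B)                        # gyroradius, m
# r is roughly 0.11 m, the right length scale for bending the beams
# across a 15 cm lateral sweep
```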

  1. Combined analgesics in (headache) pain therapy: shotgun approach or precise multi-target therapeutics?

    Directory of Open Access Journals (Sweden)

    Fiebich Bernd L

    2011-03-01

    Full Text Available Abstract Background Pain in general and headache in particular are characterized by a change in activity in brain areas involved in pain processing. The therapeutic challenge is to identify drugs with molecular targets that restore the healthy state, resulting in meaningful pain relief or even freedom from pain. Different aspects of pain perception, i.e. sensory and affective components, also explain why there is not just one single target structure for therapeutic approaches to pain. A network of brain areas ("pain matrix") is involved in pain perception and pain control. This diversification of the pain system explains why a wide range of molecularly different substances can be used in the treatment of different pain states and why in recent years more and more studies have described a superior efficacy of a precise multi-target combination therapy compared to therapy with monotherapeutics. Discussion In this article, we discuss the available literature on the effects of several fixed-dose combinations in the treatment of headaches and discuss the evidence in support of the role of combination therapy in the pharmacotherapy of pain, particularly of headaches. The scientific rationale behind multi-target combinations is the therapeutic benefit that could not be achieved by the individual constituents: the single substances of the combinations act together additively or even multiplicatively and cooperate to achieve a completeness of the desired therapeutic effect. As an example the fixed-dose combination of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine is reviewed in detail. The major advantage of using such a fixed combination is that the active ingredients act on different but distinct molecular targets and thus are able to act on more signalling cascades involved in pain than most single analgesics without adding more side effects to the therapy. 
Summary Multitarget therapeutics like combined analgesics broaden

  2. Combined analgesics in (headache) pain therapy: shotgun approach or precise multi-target therapeutics?

    Science.gov (United States)

    2011-01-01

    Background Pain in general and headache in particular are characterized by a change in activity in brain areas involved in pain processing. The therapeutic challenge is to identify drugs with molecular targets that restore the healthy state, resulting in meaningful pain relief or even freedom from pain. Different aspects of pain perception, i.e. sensory and affective components, also explain why there is not just one single target structure for therapeutic approaches to pain. A network of brain areas ("pain matrix") is involved in pain perception and pain control. This diversification of the pain system explains why a wide range of molecularly different substances can be used in the treatment of different pain states and why in recent years more and more studies have described a superior efficacy of a precise multi-target combination therapy compared to therapy with monotherapeutics. Discussion In this article, we discuss the available literature on the effects of several fixed-dose combinations in the treatment of headaches and discuss the evidence in support of the role of combination therapy in the pharmacotherapy of pain, particularly of headaches. The scientific rationale behind multi-target combinations is the therapeutic benefit that could not be achieved by the individual constituents: the single substances of the combinations act together additively or even multiplicatively and cooperate to achieve a completeness of the desired therapeutic effect. As an example the fixed-dose combination of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine is reviewed in detail. The major advantage of using such a fixed combination is that the active ingredients act on different but distinct molecular targets and thus are able to act on more signalling cascades involved in pain than most single analgesics without adding more side effects to the therapy. 
Summary Multitarget therapeutics like combined analgesics broaden the array of therapeutic

  3. Combined analgesics in (headache) pain therapy: shotgun approach or precise multi-target therapeutics?

    Science.gov (United States)

    Straube, Andreas; Aicher, Bernhard; Fiebich, Bernd L; Haag, Gunther

    2011-03-31

    Pain in general and headache in particular are characterized by a change in activity in brain areas involved in pain processing. The therapeutic challenge is to identify drugs with molecular targets that restore the healthy state, resulting in meaningful pain relief or even freedom from pain. Different aspects of pain perception, i.e. sensory and affective components, also explain why there is not just one single target structure for therapeutic approaches to pain. A network of brain areas ("pain matrix") is involved in pain perception and pain control. This diversification of the pain system explains why a wide range of molecularly different substances can be used in the treatment of different pain states and why in recent years more and more studies have described a superior efficacy of a precise multi-target combination therapy compared to therapy with monotherapeutics. In this article, we discuss the available literature on the effects of several fixed-dose combinations in the treatment of headaches and discuss the evidence in support of the role of combination therapy in the pharmacotherapy of pain, particularly of headaches. The scientific rationale behind multi-target combinations is the therapeutic benefit that could not be achieved by the individual constituents: the single substances of the combinations act together additively or even multiplicatively and cooperate to achieve a completeness of the desired therapeutic effect. As an example the fixed-dose combination of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine is reviewed in detail. The major advantage of using such a fixed combination is that the active ingredients act on different but distinct molecular targets and thus are able to act on more signalling cascades involved in pain than most single analgesics without adding more side effects to the therapy. 
Multitarget therapeutics like combined analgesics broaden the array of therapeutic options, enable the completeness

  4. Analysis and Simulation of Multi-target Echo Signals from a Phased Array Radar

    OpenAIRE

    Jia Zhen; Zhou Rui

    2017-01-01

    The construction of digital radar simulation systems has been a research hotspot of the radar field. This paper focuses on theoretical analysis and simulation of multi-target echo signals produced in a phased array radar system, and constructs an array antenna element and a signal generation environment. The antenna element is able to simulate planar arrays and optimizes these arrays by adding window functions. And the signal environment can model and simulate radar transmission signals, rada...

  5. Multitarget transcranial direct current stimulation for freezing of gait in Parkinson's disease.

    Science.gov (United States)

    Dagan, Moria; Herman, Talia; Harrison, Rachel; Zhou, Junhong; Giladi, Nir; Ruffini, Giulio; Manor, Brad; Hausdorff, Jeffrey M

    2018-04-01

    Recent findings suggest that transcranial direct current stimulation of the primary motor cortex may ameliorate freezing of gait. However, the effects of multitarget simultaneous stimulation of motor and cognitive networks are mostly unknown. The objective of this study was to evaluate the effects of multitarget transcranial direct current stimulation of the primary motor cortex and left dorsolateral prefrontal cortex on freezing of gait and related outcomes. Twenty patients with Parkinson's disease and freezing of gait received 20 minutes of transcranial direct current stimulation on 3 separate visits. Transcranial direct current stimulation targeted the primary motor cortex and left dorsolateral prefrontal cortex simultaneously, primary motor cortex only, or sham stimulation (order randomized and double-blinded assessments). Participants completed a freezing of gait-provoking test, the Timed Up and Go, and the Stroop test before and after each transcranial direct current stimulation session. Performance on the freezing of gait-provoking test (P = 0.010), Timed Up and Go (P = 0.006), and the Stroop test (P = 0.016) improved after simultaneous stimulation of the primary motor cortex and left dorsolateral prefrontal cortex, but not after primary motor cortex only or sham stimulation. Transcranial direct current stimulation designed to simultaneously target motor and cognitive regions apparently induces immediate aftereffects in the brain that translate into reduced freezing of gait and improvements in executive function and mobility. © 2018 International Parkinson and Movement Disorder Society.

  6. Multisource least-squares migration of marine streamer and land data with frequency-division encoding

    KAUST Repository

    Huang, Yunsong; Schuster, Gerard T.

    2012-01-01

    Multisource migration of phase-encoded supergathers has shown great promise in reducing the computational cost of conventional migration. The accompanying crosstalk noise, in addition to the migration footprint, can be reduced by least-squares inversion. But the application of this approach to marine streamer data is hampered by the mismatch between the limited number of live traces/shot recorded in the field and the pervasive number of traces generated by the finite-difference modelling method. This leads to a strong mismatch in the misfit function and results in strong artefacts (crosstalk) in the multisource least-squares migration image. To eliminate this noise, we present a frequency-division multiplexing (FDM) strategy with iterative least-squares migration (ILSM) of supergathers. The key idea is, at each ILSM iteration, to assign a unique frequency band to each shot gather. In this case there is no overlap in the crosstalk spectrum of each migrated shot gather m(x, ω_i), so the spectral crosstalk product m(x, ω_i)m(x, ω_j) ∝ δ_{i,j} is zero unless i = j. Our results in applying this method to 2D marine data for a SEG/EAGE salt model show better resolved images than standard migration computed at about 1/10th of the cost. Similar results are achieved after applying this method to synthetic data for a 3D SEG/EAGE salt model, except the acquisition geometry is similar to that of a marine OBS survey. Here, the speedup of this method over conventional migration is more than 10. We conclude that multisource migration for a marine geometry can be successfully achieved by a frequency-division encoding strategy, as long as crosstalk-prone sources are segregated in their spectral content. This is both the strength and the potential limitation of this method. © 2012 European Association of Geoscientists & Engineers.
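
The orthogonality argument, that disjoint frequency bands make the spectral crosstalk product behave like δ_{i,j}, can be verified with a minimal sketch (our illustration, not the authors' code):

```python
def band_spectrum(band, n_freq):
    # Unit amplitude inside the shot's assigned band, zero elsewhere.
    return [1.0 if k in band else 0.0 for k in range(n_freq)]

n_freq = 8
bands = [{0, 1}, {2, 3}, {4, 5}]   # disjoint frequency bands, one per shot
spectra = [band_spectrum(b, n_freq) for b in bands]

def crosstalk(si, sj):
    # Spectral product summed over frequencies
    return sum(a * b for a, b in zip(si, sj))

# Behaves like delta_{i,j}: the product is nonzero only when a
# spectrum meets itself, so blended shots do not leak into each other.
```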

  7. Multisource least-squares migration of marine streamer and land data with frequency-division encoding

    KAUST Repository

    Huang, Yunsong

    2012-05-22

    Multisource migration of phase-encoded supergathers has shown great promise in reducing the computational cost of conventional migration. The accompanying crosstalk noise, in addition to the migration footprint, can be reduced by least-squares inversion. But the application of this approach to marine streamer data is hampered by the mismatch between the limited number of live traces/shot recorded in the field and the pervasive number of traces generated by the finite-difference modelling method. This leads to a strong mismatch in the misfit function and results in strong artefacts (crosstalk) in the multisource least-squares migration image. To eliminate this noise, we present a frequency-division multiplexing (FDM) strategy with iterative least-squares migration (ILSM) of supergathers. The key idea is, at each ILSM iteration, to assign a unique frequency band to each shot gather. In this case there is no overlap in the crosstalk spectrum of each migrated shot gather m(x, ω_i), so the spectral crosstalk product m(x, ω_i)m(x, ω_j) ∝ δ_{i,j} is zero unless i = j. Our results in applying this method to 2D marine data for a SEG/EAGE salt model show better resolved images than standard migration computed at about 1/10th of the cost. Similar results are achieved after applying this method to synthetic data for a 3D SEG/EAGE salt model, except the acquisition geometry is similar to that of a marine OBS survey. Here, the speedup of this method over conventional migration is more than 10. We conclude that multisource migration for a marine geometry can be successfully achieved by a frequency-division encoding strategy, as long as crosstalk-prone sources are segregated in their spectral content. This is both the strength and the potential limitation of this method. © 2012 European Association of Geoscientists & Engineers.

  8. Contribution to the theoretical study, modelling and implementation of a multi-source system belonging to a micro-grid: considerations on power quality

    International Nuclear Information System (INIS)

    Houssamo, Issam

    2012-01-01

    The objective of this thesis is to study, analyze and develop a multi-source system belonging to a DC micro-grid with consideration of some aspects of power quality. Chapter I presents the interest of the smart grid for ensuring better coordination between distributed generation and power consumption. With prediction in view, a purely experimental model of the photovoltaic source is developed and presented in Chapter II. Furthermore, in order to extract the maximum power of the photovoltaic source, a classical algorithm is improved and the extracted energy is compared with three other methods. In Chapter III, the security system elements, the electrochemical storage and the public grid, are characterized. In the case of a storage shortage, the public grid is used to supply power to the load, but also to trade back excess energy. Chapter IV presents the control of the multi-source system and its experimental validation. The energy management strategy taken into account is based on switching between the elements which secure the multi-source system. For this, priority is given to storage, characterized by its state of charge. Thanks to this strategy, the technical feasibility of the multi-source system is experimentally validated. Chapter V gives some aspects related to the improvement of power quality: for the public grid side, a resonant controller is proposed; for the DC bus side, the pulsating power is eliminated by injecting the opposite signal supplied by the electrochemical storage. (author) [fr]

  9. Analysis of multivariate stochastic signals sampled by on-line particle analyzers: Application to the quantitative assessment of occupational exposure to NOAA in multisource industrial scenarios (MSIS)

    International Nuclear Information System (INIS)

    De Ipiña, J M López; Vaquero, C; Gutierrez-Cañas, C; Pui, D Y H

    2015-01-01

    In multisource industrial scenarios (MSIS), NOAA-generating activities coexist with other productive sources of airborne particles, such as parallel manufacturing processes or electrical and diesel machinery. A distinctive characteristic of MSIS is the spatially complex distribution of aerosol sources, as well as their potential differences in dynamics, due to the feasibility of multi-task configurations at a given time. Thus, the background signal is expected to challenge the aerosol analyzers at a probably wide range of concentrations and size distributions, depending on the multisource configuration at a given time. Monitoring and prediction using statistical analysis of time series captured by on-line particle analyzers in industrial scenarios have been proven feasible in predicting PNC evolution, provided a given quality of net signals (difference between signal at source and background). However, the analysis and modelling of non-consistent time series, influenced by low levels of SNR (signal-to-noise ratio), could build a misleading basis for decision making. In this context, this work explores the use of stochastic models based on ARIMA methodology to monitor and predict exposure values (PNC). The study was carried out in an MSIS where a case study focused on the manufacture of perforated tablets of nano-TiO2 by cold pressing was performed. (paper)
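
The ARIMA-style monitoring idea can be sketched in its simplest form, an AR(1) model fitted to a synthetic particle-number-concentration series (all values assumed; the study itself applies full ARIMA methodology to measured data):

```python
import random

random.seed(1)

# Synthetic PNC series: an AR(1) process around a fixed background mean.
phi_true, mean, sigma = 0.8, 1000.0, 20.0
x = [mean]
for _ in range(500):
    x.append(mean + phi_true * (x[-1] - mean) + random.gauss(0.0, sigma))

# Least-squares estimate of the AR coefficient from the series
d = [v - mean for v in x]
phi_hat = (sum(a * b for a, b in zip(d[:-1], d[1:]))
           / sum(a * a for a in d[:-1]))

# One-step-ahead forecast of the next concentration value
forecast = mean + phi_hat * (x[-1] - mean)
```

With a reasonably long, clean series the estimated coefficient recovers the true persistence, which is the basis for predicting PNC evolution; with a low-SNR, non-consistent series the same fit would be unreliable, which is exactly the caveat the abstract raises.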

  10. Combining a leadership course and multi-source feedback has no effect on leadership skills of leaders in postgraduate medical education. An intervention study with a control group

    Directory of Open Access Journals (Sweden)

    Scherpbier Albert

    2009-12-01

    Full Text Available Abstract Background Leadership courses and multi-source feedback are widely used developmental tools for leaders in health care. On this background we aimed to study the additional effect of a leadership course following a multi-source feedback procedure, compared to multi-source feedback alone, especially regarding the development of leadership skills over time. Methods Study participants were consultants responsible for postgraduate medical education at clinical departments. Study design: pre-post measures with an intervention and a control group. The intervention was participation in a seven-day leadership course. Multi-source feedback scores from the consultants responsible for education and the respondents (heads of department, consultants and doctors in specialist training) were collected before and one year after the intervention and analysed using Mann-Whitney's U-test and multivariate analysis of variance. Results There were no differences in multi-source feedback scores at one-year follow-up compared to baseline measurements, either in the intervention or in the control group (p = 0.149). Conclusion The study indicates that a leadership course following an MSF procedure, compared to MSF alone, does not improve the leadership skills of consultants responsible for education in clinical departments. Developing leadership skills takes time, and the time frame of one year might have been too short to show improvement in the leadership skills of consultants responsible for education. Further studies are needed to investigate whether other combinations of initiatives to develop leadership might have more impact in the clinical setting.

  11. Combining a leadership course and multi-source feedback has no effect on leadership skills of leaders in postgraduate medical education. An intervention study with a control group.

    Science.gov (United States)

    Malling, Bente; Mortensen, Lene; Bonderup, Thomas; Scherpbier, Albert; Ringsted, Charlotte

    2009-12-10

    Leadership courses and multi-source feedback are widely used developmental tools for leaders in health care. On this background we aimed to study the additional effect of a leadership course following a multi-source feedback procedure, compared to multi-source feedback alone, especially regarding the development of leadership skills over time. Study participants were consultants responsible for postgraduate medical education at clinical departments. The study design was pre-post measures with an intervention and a control group. The intervention was participation in a seven-day leadership course. Multi-source feedback scores from the consultants responsible for education and the respondents (heads of department, consultants and doctors in specialist training) were collected before and one year after the intervention and analysed using Mann-Whitney's U-test and multivariate analysis of variance. There were no differences in multi-source feedback scores at one-year follow-up compared to baseline measurements, either in the intervention or in the control group (p = 0.149). The study indicates that a leadership course following an MSF procedure, compared to MSF alone, does not improve the leadership skills of consultants responsible for education in clinical departments. Developing leadership skills takes time, and the time frame of one year might have been too short to show improvement in the leadership skills of consultants responsible for education. Further studies are needed to investigate whether other combinations of initiatives to develop leadership might have more impact in the clinical setting.
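The two records above analyse pre/post multi-source feedback scores with Mann-Whitney's U-test. The statistic itself is simple to state: over all cross-group pairs, count how often a value from one group exceeds a value from the other (ties count one half). The sketch below computes it directly on hypothetical score changes; the group values are invented for illustration only, and in practice a statistics library would supply the p-value.

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U statistic for group_a vs. group_b:
    counts pairs (a, b) with a > b, plus half for each tie."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Hypothetical MSF score changes (post minus pre), intervention vs. control.
intervention = [0.1, -0.2, 0.3, 0.0, 0.2]
control = [0.0, 0.1, -0.1, 0.2, 0.1]
u = mann_whitney_u(intervention, control)
print(u)  # compare against tabulated critical values (or a normal approximation)
```

A useful sanity check is the identity U_A + U_B = n_A * n_B, which the pair-counting definition makes obvious.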

  12. LINKS: learning-based multi-source IntegratioN frameworK for Segmentation of infant brain images.

    Science.gov (United States)

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang; Gilmore, John H; Lin, Weili; Shen, Dinggang

    2015-03-01

    Segmentation of infant brain MR images is challenging due to insufficient image quality, severe partial volume effect, and ongoing maturation and myelination processes. In the first year of life, the image contrast between the white and gray matter of the infant brain undergoes dramatic changes. In particular, the image contrast is inverted around 6-8 months of age, when the white and gray matter tissues are isointense in both T1- and T2-weighted MR images and thus exhibit extremely low tissue contrast, which poses significant challenges for automated segmentation. Most previous studies used a multi-atlas label fusion strategy, which has the limitations of treating the different available image modalities equally and of often being computationally expensive. To cope with these limitations, in this paper, we propose a novel learning-based multi-source integration framework for segmentation of infant brain images. Specifically, we employ the random forest technique to effectively integrate features from multi-source images for tissue segmentation. Here, the multi-source images initially include only the multi-modality (T1, T2 and FA) images, and later also the iteratively estimated and refined tissue probability maps of gray matter, white matter, and cerebrospinal fluid. Experimental results on 119 infants show that the proposed method achieves better performance than other state-of-the-art automated segmentation methods. Further validation was performed on the MICCAI grand challenge, where the proposed method was ranked top among all competing methods. Moreover, to alleviate possible anatomical errors, our method can also be combined with an anatomically-constrained multi-atlas labeling approach to further improve segmentation accuracy. Copyright © 2014 Elsevier Inc. All rights reserved.
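The core loop described above (classify voxels from multi-modality features, then feed the estimated tissue probability maps back in as extra features) can be sketched compactly. This is not the paper's implementation: a tiny synthetic two-class "image" replaces real MRI, and a nearest-centroid soft classifier stands in for the random forest; only the feature-augmentation loop mirrors the described framework.

```python
import numpy as np

def voxel_features(modalities, prob_maps=None):
    """Stack per-voxel intensities from each modality (e.g. T1, T2, FA) and,
    in later iterations, the current tissue probability maps."""
    layers = list(modalities)
    if prob_maps is not None:
        layers += list(prob_maps)
    return np.stack([m.ravel() for m in layers], axis=1)

def soft_classify(features, labels, n_classes):
    """Toy stand-in for the random-forest classifier: soft assignment by
    inverse distance to class centroids in feature space."""
    centroids = np.array([features[labels == k].mean(axis=0) for k in range(n_classes)])
    d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    inv = 1.0 / (d + 1e-9)
    return inv / inv.sum(axis=1, keepdims=True)

# Tiny synthetic 2-class example: two "modalities" on a 4x4 image.
rng = np.random.default_rng(1)
labels = (np.arange(16) >= 8).astype(int)            # ground-truth tissue labels
t1 = labels.reshape(4, 4) + rng.normal(0, 0.1, (4, 4))
t2 = 1 - labels.reshape(4, 4) + rng.normal(0, 0.1, (4, 4))

probs = None
for _ in range(3):                                    # iterative refinement loop
    maps = None if probs is None else [p.reshape(4, 4) for p in probs.T]
    feats = voxel_features([t1, t2], maps)
    probs = soft_classify(feats, labels, 2)
pred = probs.argmax(axis=1)
print((pred == labels).mean())
```

The design point carried over from the paper is that the probability maps become additional input channels on each iteration, so context from the previous estimate informs the next one.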

  13. Quantum information theory and quantum statistics

    International Nuclear Information System (INIS)

    Petz, D.

    2008-01-01

    Based on lectures given by the author, this book focuses on providing reliable introductory explanations of key concepts of quantum information theory and quantum statistics - rather than on results. The mathematically rigorous presentation is supported by numerous examples and exercises and by an appendix summarizing the relevant aspects of linear analysis. Assuming that the reader is familiar with the content of standard undergraduate courses in quantum mechanics, probability theory, linear algebra and functional analysis, the book addresses graduate students of mathematics and physics as well as theoretical and mathematical physicists. Conceived as a primer to bridge the gap between statistical physics and quantum information, a field to which the author has contributed significantly himself, it emphasizes concepts and thorough discussions of the fundamental notions to prepare the reader for deeper studies, not least through the selection of well chosen exercises. (orig.)

  14. Statistical approach for selection of biologically informative genes.

    Science.gov (United States)

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Most gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been judged by the post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that, under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to selecting statistical techniques for identifying informative genes.
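Boot-MRMR builds on the classic maximum-relevance / minimum-redundancy (mRMR) composite score. The sketch below shows plain greedy mRMR with absolute Pearson correlation as the (hypothetical) relevance and redundancy measure; the paper's method additionally bootstraps this score, and its actual implementation is the BootMRMR R package, not this toy. The data here are synthetic.

```python
import numpy as np

def greedy_mrmr(X, y, n_select):
    """Greedy mRMR gene selection: at each step pick the gene maximizing
    (relevance to the class label) - (mean redundancy with selected genes),
    both measured as absolute Pearson correlation."""
    n_genes = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_genes)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < n_select:
        best, best_score = None, -np.inf
        for j in range(n_genes):
            if j in selected:
                continue
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Synthetic expression matrix: gene 0 is informative, gene 1 is its redundant copy.
rng = np.random.default_rng(2)
y = rng.integers(0, 2, 40).astype(float)
X = rng.normal(size=(40, 6))
X[:, 0] = y + rng.normal(0, 0.2, 40)
X[:, 1] = X[:, 0] + rng.normal(0, 0.05, 40)
print(greedy_mrmr(X, y, 2))
```

The redundancy penalty is what keeps the near-duplicate gene from being selected immediately after its twin, which pure relevance ranking would do.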

  15. Gradient-Type Magnetoelectric Current Sensor with Strong Multisource Noise Suppression

    Science.gov (United States)

    2018-01-01

    A novel gradient-type magnetoelectric (ME) current sensor operating in magnetic field gradient (MFG) detection and conversion mode is developed based on a pair of ME composites that have a back-to-back capacitor configuration under a baseline separation and a magnetic biasing in an electrically-shielded and mechanically-enclosed housing. The physics behind the current sensing process is the product effect of the current-induced MFG effect associated with the vortex magnetic fields of current-carrying cables (i.e., MFG detection) and the MFG-induced ME effect in the ME composite pair (i.e., MFG conversion). The sensor output voltage is obtained directly from the gradient ME voltage of the ME composite pair and is calibrated against cable current to give the current sensitivity. The current sensing performance of the sensor is evaluated, both theoretically and experimentally, under multisource noise from electric fields, magnetic fields, vibrations, and thermal fluctuations. The sensor combines the merits of small nonlinearity in the current-induced MFG effect with those of high sensitivity and high common-mode noise rejection rate in the MFG-induced ME effect to achieve a high current sensitivity of 0.65–12.55 mV/A in the frequency range of 10 Hz–170 kHz, a small input-output nonlinearity of <500 ppm, a small thermal drift of <0.2%/°C in the current range of 0–20 A, and a high common-mode noise rejection rate of 17–28 dB against multisource noise. PMID:29443920

  16. Multi-target camera tracking, hand-off and display LDRD 158819 final report

    International Nuclear Information System (INIS)

    Anderson, Robert J.

    2014-01-01

    Modern security control rooms gather video and sensor feeds from tens to hundreds of cameras. Advanced camera analytics can detect motion from individual video streams and convert unexpected motion into alarms, but the interpretation of these alarms depends heavily upon human operators. Unfortunately, these operators can be overwhelmed when a large number of events happen simultaneously, or lulled into complacency due to frequent false alarms. This LDRD project has focused on improving video surveillance-based security systems by changing the fundamental focus from the cameras to the targets being tracked. If properly integrated, more cameras shouldn't lead to more alarms, more monitors, more operators, and increased response latency but instead should lead to better information and more rapid response times. Over the course of the LDRD we have been developing algorithms that take live video imagery from multiple video cameras, identify individual moving targets from the background imagery, and then display the results in a single 3D interactive video. In this document we summarize the work in developing this multi-camera, multi-target system, including lessons learned, tools developed, technologies explored, and a description of current capability.

  17. Multi-target camera tracking, hand-off and display LDRD 158819 final report

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Robert J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-10-01

    Modern security control rooms gather video and sensor feeds from tens to hundreds of cameras. Advanced camera analytics can detect motion from individual video streams and convert unexpected motion into alarms, but the interpretation of these alarms depends heavily upon human operators. Unfortunately, these operators can be overwhelmed when a large number of events happen simultaneously, or lulled into complacency due to frequent false alarms. This LDRD project has focused on improving video surveillance-based security systems by changing the fundamental focus from the cameras to the targets being tracked. If properly integrated, more cameras shouldn't lead to more alarms, more monitors, more operators, and increased response latency but instead should lead to better information and more rapid response times. Over the course of the LDRD we have been developing algorithms that take live video imagery from multiple video cameras, identify individual moving targets from the background imagery, and then display the results in a single 3D interactive video. In this document we summarize the work in developing this multi-camera, multi-target system, including lessons learned, tools developed, technologies explored, and a description of current capability.

  18. Multi-Target Camera Tracking, Hand-off and Display LDRD 158819 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Robert J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Robotic and Security Systems Dept.

    2014-10-01

    Modern security control rooms gather video and sensor feeds from tens to hundreds of cameras. Advanced camera analytics can detect motion from individual video streams and convert unexpected motion into alarms, but the interpretation of these alarms depends heavily upon human operators. Unfortunately, these operators can be overwhelmed when a large number of events happen simultaneously, or lulled into complacency due to frequent false alarms. This LDRD project has focused on improving video surveillance-based security systems by changing the fundamental focus from the cameras to the targets being tracked. If properly integrated, more cameras shouldn’t lead to more alarms, more monitors, more operators, and increased response latency but instead should lead to better information and more rapid response times. For the course of the LDRD we have been developing algorithms that take live video imagery from multiple video cameras, identify individual moving targets from the background imagery, and then display the results in a single 3D interactive video. In this document we summarize the work in developing this multi-camera, multi-target system, including lessons learned, tools developed, technologies explored, and a description of current capability.

  19. Temporal and Statistical Information in Causal Structure Learning

    Science.gov (United States)

    McCormack, Teresa; Frosch, Caren; Patrick, Fiona; Lagnado, David

    2015-01-01

    Three experiments examined children's and adults' abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a 3-variable mechanical…

  20. Encryption of covert information into multiple statistical distributions

    International Nuclear Information System (INIS)

    Venkatesan, R.C.

    2007-01-01

    A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information is dependent upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model

  1. Towards Device-Independent Information Processing on General Quantum Networks

    Science.gov (United States)

    Lee, Ciarán M.; Hoban, Matty J.

    2018-01-01

    The violation of certain Bell inequalities allows for device-independent information processing secure against nonsignaling eavesdroppers. However, this only holds for the Bell network, in which two or more agents perform local measurements on a single shared source of entanglement. To overcome the practical constraints that entangled systems can only be transmitted over relatively short distances, large-scale multisource networks have been employed. Do there exist analogs of Bell inequalities for such networks, whose violation is a resource for device independence? In this Letter, the violation of recently derived polynomial Bell inequalities will be shown to allow for device independence on multisource networks, secure against nonsignaling eavesdroppers.

  2. Multi-target Wastage Phenomena on Steam Generator Tubes During an SWR Event

    International Nuclear Information System (INIS)

    Jeong, Ji Young; Kim, Jong Man; Kim, Tae Joon; Eoh, Jae Hyuk; Choi, Jong Hyeun; Lee, Yong Bum

    2011-01-01

    The Korean sodium-cooled fast reactor KALIMER-600 (Korea Advanced LIquid MEtal Reactor), with an electric output of 600 MWe, was developed. The steam generator (SG) of this system is a shell-and-tube counter-current-flow heat exchanger, vertically oriented with fixed tube-sheets. Direct heat exchange occurs between the shell-side sodium and the tube-side water in the SG unit. Feed-water enters the inlet nozzle at the lower part of the unit and flows upward along the helically coiled heat transfer tubes. The inflowing sodium is cooled down in the bundle region and then flows out through the sodium outlet nozzle at the bottom of the unit. The typical configuration of the KALIMER-600 SG is shown in Figure 1. In a steam generator, sodium and water are separated by the heat transfer tube wall, which forms a strong pressure boundary between the shell-side sodium and the tube-side water/steam. For this reason, if there is a small hole or crack, even a pin hole, in a heat transfer tube, a large amount of water/steam leaks into the liquid sodium due to the high pressure difference of more than 150 bar, and an exothermic sodium-water chemical reaction takes place as a result. This type of sodium-water reaction (SWR) has been considered one of the most important safety issues to be resolved. Previous studies clearly showed that the number of ruptured tubes during an SWR event is one of the most significant factors determining the temperature and pressure transient. Any subsequent tube rupture behavior in the vicinity of the initially postulated single ruptured tube should be evaluated by considering single- and multi-target wastage phenomena. Wastage is defined as damage to the structural material (e.g. heat transfer tubes) due to impingement of the highly corrosive reaction product.
Since the impingement may cause wastage of the neighboring heat transfer tubes, a subsequent tube failure can occur in a very short time

  3. Information Geometric Complexity of a Trivariate Gaussian Statistical Model

    Directory of Open Access Journals (Sweden)

    Domenico Felice

    2014-05-01

    Full Text Available We evaluate the information geometric complexity of entropic motion on low-dimensional Gaussian statistical manifolds in order to quantify how difficult it is to make macroscopic predictions about systems in the presence of limited information. Specifically, we observe that the complexity of such entropic inferences not only depends on the amount of available pieces of information but also on the manner in which such pieces are correlated. Finally, we uncover that, for certain correlational structures, the impossibility of reaching the most favorable configuration from an entropic inference viewpoint seems to lead to an information geometric analog of the well-known frustration effect that occurs in statistical physics.

  4. A multisource approach for coastline mapping and identification of shoreline changes

    Directory of Open Access Journals (Sweden)

    A. Zaccagnino

    2006-06-01

    Full Text Available Coastal dynamics are driven by phenomena of exogenous and endogenous nature. Characterizing the factors that influence their equilibrium, and continuous monitoring, are fundamental for effective environmental planning and management of coastal areas. In order to monitor shoreline changes, we developed a methodology based on a multisource and multitemporal approach. A database covering the Ionian coast of the Basilicata region (about 50 km) was implemented using cartographic data (IGMI data), satellite imagery (SPOT-PX/XS, Landsat-TM, Corona) and aerial data covering the period from 1949 to 2001. In particular, airborne data (1 m spatial resolution) were acquired during a specific campaign we performed in 2000 and 2001. To obtain the best performance from the available data, we applied a data fusion procedure on visible and thermal information. Different algorithms were tested, such as band ratios and clustering, for extracting the coastline. The best results from multispectral data were obtained using a threshold algorithm we devised by exploiting the green, red and NIR bands, whereas for panchromatic data we selected clustering as the most suitable method. Moreover, a GPS survey was performed to evaluate the influence of tidal effects.
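The thresholding idea described above (exploiting green, red and NIR bands to separate water from land) can be illustrated with a normalized-difference water index. This is a generic NDWI-style sketch under assumed toy reflectance values, not the authors' devised algorithm; the `red` band is kept in the signature only to mirror the bands the abstract mentions.

```python
import numpy as np

def water_mask(green, red, nir, threshold=0.0):
    """Threshold a normalized difference of green and NIR reflectance:
    water reflects green but strongly absorbs NIR, so index > threshold
    flags water pixels. (red is unused in this minimal index; a composite
    variant could include it.) The edge of the resulting mask traces the
    coastline."""
    index = (green - nir) / (green + nir + 1e-9)
    return index > threshold

# Toy 3x4 scene: left two columns water (high green, low NIR), right two land.
green = np.array([[0.30, 0.30, 0.05, 0.05]] * 3)
red   = np.array([[0.10, 0.10, 0.20, 0.20]] * 3)
nir   = np.array([[0.05, 0.05, 0.40, 0.40]] * 3)
mask = water_mask(green, red, nir)
print(mask.astype(int))
```

Extracting the coastline then reduces to tracing the boundary between True and False pixels in the mask, e.g. with a contour-following routine.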

  5. Coordination of the National Statistical System in the Information Security Context

    Directory of Open Access Journals (Sweden)

    O. H.

    2017-12-01

    Full Text Available The need for building the national statistical system (NSS) as the framework for coordination of statistical works is substantiated. The NSS is defined on the basis of a system approach. It is emphasized that the essential conditions underlying the NSS are strategic planning, reliance on internationally adopted methods and due consideration of the country-specific environment. The role of state coordination policy in organizing statistical activities within the NSS framework is highlighted, and the key objectives of an integrated national policy on coordination of statistical activities are given. Threats arising from the non-existence of an NSS in a country are shown: an “irregular” pattern of statistical activities resulting from the absence of common legal, methodological and organizational grounds; high costs of the finished information product in parallel with its low quality; and the impossibility of administering statistical information security in a coherent manner, i.e. keeping to the rules on confidentiality of data, preventing intentional distortion of information, and keeping to the rules for treatment of data constituting a state secret. An extensive review of NSS functional objectives is made: to ensure the systematic development of official statistics; to ensure confidentiality and protection of individual data; to establish interdepartmental mechanisms for control and protection of secret statistical information; and to broaden and regulate access to statistical data and their effective use. The need for creating a National Statistical Commission is grounded.

  6. Crawling and walking infants encounter objects differently in a multi-target environment.

    Science.gov (United States)

    Dosso, Jill A; Boudreau, J Paul

    2014-10-01

    From birth, infants move their bodies in order to obtain information and stimulation from their environment. Exploratory movements are important for the development of an infant's understanding of the world and are well established as being key to cognitive advances. Newly acquired motor skills increase the potential actions available to the infant. However, the way that infants employ potential actions in environments with multiple potential targets is undescribed. The current work investigated the target object selections of infants across a range of self-produced locomotor experience (11- to 14-month-old crawlers and walkers). Infants repeatedly accessed objects among pairs of objects differing in both distance and preference status, some requiring locomotion. Overall, their object actions were found to be sensitive to object preference status; however, the role of object distance in shaping object encounters was moderated by movement status. Crawlers' actions appeared opportunistic and were biased towards nearby objects while walkers' actions appeared intentional and were independent of object position. Moreover, walkers' movements favoured preferred objects more strongly for children with higher levels of self-produced locomotion experience. The multi-target experimental situation used in this work parallels conditions faced by foraging organisms, and infants' behaviours were discussed with respect to optimal foraging theory. There is a complex interplay between infants' agency, locomotor experience, and environment in shaping their motor actions. Infants' movements, in turn, determine the information and experiences offered to infants by their micro-environment.

  7. Home-based wrinkle reduction using a novel handheld multisource phase-controlled radiofrequency device.

    Science.gov (United States)

    Shemer, Avner; Levy, Hanna; Sadick, Neil S; Harth, Yoram; Dorizas, Andrew S

    2014-11-01

    In the last decade, energy-based aesthetic treatments using light, radiofrequency (RF), and ultrasound have gained scientific acceptance as safe and efficacious non-invasive treatments for aesthetic skin disorders. The phase-controlled multisource radiofrequency technology (3DEEP™), based on the simultaneous use of multiple RF generators, was proven to allow significant pigment-independent dermal heating without pain or the need for epidermal cooling. This study was performed in order to evaluate the efficacy and safety of a new handheld device delivering multisource radiofrequency to the skin for wrinkle reduction and skin tightening in the home setting. A total of 69 participants (age 54.3 years ± 8.09; age range 37-72 years) were enrolled in the study after meeting all inclusion/exclusion criteria (100%) and providing informed consent. Participants were provided with the tested device together with a user manual and treatment diary, to perform independent treatments at home for 4 weeks. The tested device (Newa™, EndyMed Medical, Cesarea, Israel) emits 12 W of 1 MHz RF energy through six electrodes arranged in a linear fashion. Independent control of RF polarity through each of the six electrodes allows a significant reduction of energy flow through the epidermis with increased dermal penetration. Participants were instructed to perform at least 5 treatments a week for one month. Four follow-up visits were scheduled (once a week) during the period of independent treatments at home, following 4 weeks of home treatments, at the 1-month follow-up visit (1 month after treatment end) and at the 3-month follow-up (3 months following treatment end). Analysis of pre- and post-treatment images was conducted by three uninvolved physicians experienced with the Fitzpatrick Wrinkle and Elastosis Scale. The Fitzpatrick Wrinkle and Elastosis score at each time point (4 weeks following home use treatments; 1-month follow-up; 3-month follow-up) was compared to baseline

  8. Annual statistical information 1996; Informe estatistico anual 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This annual statistical report aims to disseminate information about the evolution of the generation, transmission and distribution systems and about the electric power market of Parana State, Brazil, in 1996. Electric power consumption in the distribution area of the Parana Power Company (COPEL) grew by about 6.7%. Electric power production at the COPEL plants was 42.2% higher than in 1995, due to the flows observed in the Iguacu river and the long period of reduced inflows experienced by the Southern region reservoirs during the year. This report presents statistical data on the following topics: a) the electric power balance of Parana State; b) the electric power balance of COPEL - own generation, interchange, electric power requirements, direct distribution and the electric system. 6 graphs, 3 maps, 61 tabs.; e-mail: splcnmr at mail.copel.br

  9. The influence of narrative v. statistical information on perceiving vaccination risks.

    Science.gov (United States)

    Betsch, Cornelia; Ulshöfer, Corina; Renkewitz, Frank; Betsch, Tilmann

    2011-01-01

    Health-related information found on the Internet is increasing and impacts patient decision making, e.g. regarding vaccination decisions. In addition to statistical information (e.g. incidence rates of vaccine adverse events), narrative information is also widely available, such as postings on online bulletin boards. Previous research has shown that narrative information can impact treatment decisions, even when statistical information is presented concurrently. As the determinants of this effect are largely unknown, we varied features of the narratives to identify mechanisms through which narratives impact risk judgments. An online bulletin board setting provided participants with statistical information and authentic narratives about the occurrence and nonoccurrence of adverse events. Experiment 1 followed a single factorial design with 1, 2, or 4 narratives out of 10 reporting adverse events. Experiment 2 implemented a 2 (statistical risk 20% vs. 40%) × 2 (2/10 vs. 4/10 narratives reporting adverse events) × 2 (high vs. low richness) × 2 (high vs. low emotionality) between-subjects design. Dependent variables were the perceived risk of side-effects and vaccination intentions. Experiment 1 shows an inverse relation between the number of narratives reporting adverse events and vaccination intentions, which was mediated by the perceived risk of vaccinating. Experiment 2 showed a stronger influence of the number of narratives than of the statistical risk information. High (vs. low) emotionality narratives had a greater impact on the perceived risk, while richness had no effect. The number of narratives influences risk judgments and can potentially override statistical information about risk.

  10. The Reliability of Multisource Feedback in Competency-Based Assessment Programs: The Effects of Multiple Occasions and Assessor Groups

    NARCIS (Netherlands)

    Moonen-van Loon, J.M.; Overeem, K.; Govaerts, M.J.; Verhoeven, B.H.; Vleuten, C.P.M. van der; Driessen, E.W.

    2015-01-01

    PURPOSE: Residency programs around the world use multisource feedback (MSF) to evaluate learners' performance. Studies of the reliability of MSF show mixed results. This study aimed to identify the reliability of MSF as practiced across occasions with varying numbers of assessors from different

  11. SOCR data dashboard: an integrated big data archive mashing medicare, labor, census and econometric information.

    Science.gov (United States)

    Husain, Syed S; Kalinin, Alexandr; Truong, Anh; Dinov, Ivo D

    Intuitive formulation of informative and computationally-efficient queries on big and complex datasets presents a number of challenges. As data collection is increasingly streamlined and ubiquitous, data exploration, discovery and analytics get considerably harder. Exploratory querying of heterogeneous and multi-source information is both difficult and necessary to advance our knowledge about the world around us. We developed a mechanism to integrate dispersed multi-source data and serve the mashed information via human and machine interfaces in a secure, scalable manner. This process facilitates the exploration of subtle associations between variables, population strata, or clusters of data elements, which may be opaque to standard independent inspection of the individual sources. This new platform includes a device-agnostic tool (Dashboard webapp, http://socr.umich.edu/HTML5/Dashboard/) for graphical querying, navigation and exploration of the multivariate associations in complex heterogeneous datasets. The paper illustrates this core functionality and service-oriented infrastructure using healthcare data (e.g., US data from the 2010 Census, Demographic and Economic surveys, Bureau of Labor Statistics, and Center for Medicare Services) as well as Parkinson's Disease neuroimaging data. Both the back-end data archive and the front-end dashboard interfaces are continuously expanded to include additional data elements and new ways to customize the human and machine interactions. A client-side data import utility allows for easy and intuitive integration of user-supplied datasets. This completely open-science framework may be used for exploratory analytics, confirmatory analyses, meta-analyses, and education and training purposes in a wide variety of fields.

  12. Standardized Access and Processing of Multi-Source Earth Observation Time-Series Data within a Regional Data Middleware

    Science.gov (United States)

    Eberle, J.; Schmullius, C.

    2017-12-01

    Increasing archives of global satellite data present a new challenge: handling multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition, the handling of time-series data is complex, as automated processing and execution of data processing steps are needed to supply the user with the desired product for a specific area of interest. In order to simplify access to the data archives of various satellite missions and to facilitate subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized, web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis, uniform data formats and data access services are provided. Interfaces to data archives of the MODIS sensor (NASA) as well as the Landsat (USGS) and Sentinel (ESA) satellites have been integrated into the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints of time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data, and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by the OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands, focusing on automated discovery and access of Landsat and Sentinel data for local areas.

  13. Multitarget Therapeutic Leads for Alzheimer's Disease: Quinolizidinyl Derivatives of Bi- and Tricyclic Systems as Dual Inhibitors of Cholinesterases and β-Amyloid (Aβ) Aggregation.

    Science.gov (United States)

    Tonelli, Michele; Catto, Marco; Tasso, Bruno; Novelli, Federica; Canu, Caterina; Iusco, Giovanna; Pisani, Leonardo; Stradis, Angelo De; Denora, Nunzio; Sparatore, Anna; Boido, Vito; Carotti, Angelo; Sparatore, Fabio

    2015-06-01

    Multitarget therapeutic leads for Alzheimer's disease were designed on the models of compounds capable of maintaining or restoring cell protein homeostasis and of inhibiting β-amyloid (Aβ) oligomerization. Thirty-seven thioxanthen-9-one, xanthen-9-one, naphtho- and anthraquinone derivatives were tested for the direct inhibition of Aβ(1-40) aggregation and for the inhibition of electric eel acetylcholinesterase (eeAChE) and horse serum butyrylcholinesterase (hsBChE). These compounds are characterized by basic side chains, mainly quinolizidinylalkyl moieties, linked to various bi- and tricyclic (hetero)aromatic systems. With very few exceptions, these compounds displayed inhibitory activity on both AChE and BChE and on the spontaneous aggregation of β-amyloid. In most cases, IC50 values were in the low micromolar and sub-micromolar range, but some compounds even reached nanomolar potency. The time course of amyloid aggregation in the presence of the most active derivative (IC50 = 0.84 μM) revealed that these compounds might act as destabilizers of mature fibrils rather than mere inhibitors of fibrillization. Many compounds inhibited one or both cholinesterases and Aβ aggregation with similar potency, a fundamental requisite for the possible development of therapeutics exhibiting a multitarget mechanism of action. The described compounds thus represent interesting leads for the development of multitarget AD therapeutics. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Pharmacological characterization of memoquin, a multi-target compound for the treatment of Alzheimer's disease.

    Directory of Open Access Journals (Sweden)

    Valeria Capurro

    Full Text Available Alzheimer's disease (AD) is characterized by progressive loss of cognitive function, dementia and altered behavior. Over 30 million people worldwide suffer from AD and available therapies are still palliative rather than curative. Recently, Memoquin (MQ), a quinone-bearing polyamine compound, has emerged as a promising anti-AD lead candidate, mainly thanks to its multi-target profile. MQ acts as an acetylcholinesterase and β-secretase-1 inhibitor, and also possesses anti-amyloid and anti-oxidant properties. Despite this potential interest, in vivo behavioral studies with MQ have been limited. Here, we report on in vivo studies with MQ (acute and sub-chronic treatments; 7-15 mg/kg per os) carried out using two different mouse models: i) scopolamine- and ii) beta-amyloid peptide- (Aβ-) induced amnesia. Several aspects related to memory were examined using the T-maze, the Morris water maze, the novel object recognition, and the passive avoidance tasks. At the dose of 15 mg/kg, MQ was able to rescue all tested aspects of cognitive impairment including spatial, episodic, aversive, short and long-term memory in both scopolamine- and Aβ-induced amnesia models. Furthermore, when tested in primary cortical neurons, MQ was able to fully prevent the Aβ-induced neurotoxicity mediated by oxidative stress. The results support the effectiveness of MQ as a cognitive enhancer, and highlight the value of a multi-target strategy to address the complex nature of cognitive dysfunction in AD.

  15. Pharmacological characterization of memoquin, a multi-target compound for the treatment of Alzheimer's disease.

    Science.gov (United States)

    Capurro, Valeria; Busquet, Perrine; Lopes, Joao Pedro; Bertorelli, Rosalia; Tarozzo, Glauco; Bolognesi, Maria Laura; Piomelli, Daniele; Reggiani, Angelo; Cavalli, Andrea

    2013-01-01

    Alzheimer's disease (AD) is characterized by progressive loss of cognitive function, dementia and altered behavior. Over 30 million people worldwide suffer from AD and available therapies are still palliative rather than curative. Recently, Memoquin (MQ), a quinone-bearing polyamine compound, has emerged as a promising anti-AD lead candidate, mainly thanks to its multi-target profile. MQ acts as an acetylcholinesterase and β-secretase-1 inhibitor, and also possesses anti-amyloid and anti-oxidant properties. Despite this potential interest, in vivo behavioral studies with MQ have been limited. Here, we report on in vivo studies with MQ (acute and sub-chronic treatments; 7-15 mg/kg per os) carried out using two different mouse models: i) scopolamine- and ii) beta-amyloid peptide- (Aβ-) induced amnesia. Several aspects related to memory were examined using the T-maze, the Morris water maze, the novel object recognition, and the passive avoidance tasks. At the dose of 15 mg/kg, MQ was able to rescue all tested aspects of cognitive impairment including spatial, episodic, aversive, short and long-term memory in both scopolamine- and Aβ-induced amnesia models. Furthermore, when tested in primary cortical neurons, MQ was able to fully prevent the Aβ-induced neurotoxicity mediated by oxidative stress. The results support the effectiveness of MQ as a cognitive enhancer, and highlight the value of a multi-target strategy to address the complex nature of cognitive dysfunction in AD.

  16. Extending multi-tenant architectures: a database model for a multi-target support in SaaS applications

    Science.gov (United States)

    Rico, Antonio; Noguera, Manuel; Garrido, José Luis; Benghazi, Kawtar; Barjis, Joseph

    2016-05-01

    Multi-tenant architectures (MTAs) are considered a cornerstone in the success of Software as a Service as a new application distribution formula. Multi-tenancy allows multiple customers (i.e. tenants) to be consolidated into the same operational system. This way, tenants run and share the same application instance as well as costs, which are significantly reduced. Functional needs vary from one tenant to another; either companies from different sectors run different types of applications or, although deploying the same functionality, they do differ in the extent of their complexity. In any case, MTA leaves one major concern regarding the companies' data, their privacy and security, which requires special attention to the data layer. In this article, we propose an extended data model that enhances traditional MTAs in respect of this concern. This extension - called multi-target - allows MT applications to host, manage and serve multiple functionalities within the same multi-tenant (MT) environment. The practical deployment of this approach will allow SaaS vendors to target multiple markets or address different levels of functional complexity and yet commercialise just one single MT application. The applicability of the approach is demonstrated via a case study of a real multi-tenancy multi-target (MT2) implementation, called Globalgest.

  17. Multitarget Molecular Hybrids of Cinnamic Acids

    Directory of Open Access Journals (Sweden)

    Aikaterini Peperidou

    2014-12-01

    Full Text Available In an attempt to synthesize potential new multitarget agents, 11 novel hybrids incorporating cinnamic acids and paracetamol, 4-/7-hydroxycoumarin, benzocaine, p-aminophenol and m-aminophenol were synthesized. Three hybrids—2e, 2a, 2g—and 3b were found to be multifunctional agents. The hybrid 2e, derived from the phenoxyphenyl cinnamic acid and m-acetamidophenol, showed the highest lipoxygenase (LOX) inhibition and analgesic activity (IC50 = 0.34 μM and 98.1%), whereas the hybrid 3b of bromobenzyloxycinnamic acid and hymecromone simultaneously exhibited good LOX inhibitory activity (IC50 = 50 μM) and the highest anti-proteolytic activity (IC50 = 5 μM). The hybrid 2a of phenyloxyphenyl acid with paracetamol showed high analgesic activity (91%) and appears to be a promising agent for treating peripheral nerve injuries. Hybrid 2g, which has an ester and an amide bond, presents an interesting combination of anti-LOX and anti-proteolytic activity. The esters were found to be very potent, especially those derived from paracetamol and m-acetamidophenol; the amides follow. Based on 2D structure–activity relationships, it was observed that both steric and electronic parameters play major roles in the activity of these compounds. Molecular docking studies point to the fact that allosteric interactions might govern the LOX–inhibitor binding.

  18. Three-dimensional inversion of multisource array electromagnetic data

    Science.gov (United States)

    Tartaras, Efthimios

    Three-dimensional (3-D) inversion is increasingly important for the correct interpretation of geophysical data sets in complex environments. To this effect, several approximate solutions have been developed that allow the construction of relatively fast inversion schemes. One such method that is fast and provides satisfactory accuracy is the quasi-linear (QL) approximation. It has, however, the drawback that it is source-dependent and, therefore, impractical in situations where multiple transmitters in different positions are employed. I have, therefore, developed a localized form of the QL approximation that is source-independent. This so-called localized quasi-linear (LQL) approximation can have a scalar, a diagonal, or a full tensor form. Numerical examples of its comparison with the full integral equation solution, the Born approximation, and the original QL approximation are given. The objective behind developing this approximation is to use it in a fast 3-D inversion scheme appropriate for multisource array data such as those collected in airborne surveys, cross-well logging, and other similar geophysical applications. I have developed such an inversion scheme using the scalar and diagonal LQL approximation. It reduces the original nonlinear inverse electromagnetic (EM) problem to three linear inverse problems. The first of these problems is solved using a weighted regularized linear conjugate gradient method, whereas the last two are solved in the least squares sense. The algorithm I developed provides the option of obtaining either smooth or focused inversion images. I have applied the 3-D LQL inversion to synthetic 3-D EM data that simulate a helicopter-borne survey over different earth models. The results demonstrate the stability and efficiency of the method and show that the LQL approximation can be a practical solution to the problem of 3-D inversion of multisource array frequency-domain EM data. I have also applied the method to helicopter-borne EM

  19. Algorithm for Optimizing Bipolar Interconnection Weights with Applications in Associative Memories and Multitarget Classification

    Science.gov (United States)

    Chang, Shengjiang; Wong, Kwok-Wo; Zhang, Wenwei; Zhang, Yanxin

    1999-08-01

    An algorithm for optimizing a bipolar interconnection weight matrix with the Hopfield network is proposed. The effectiveness of this algorithm is demonstrated by computer simulation and optical implementation. In the optical implementation of the neural network the interconnection weights are biased to yield a nonnegative weight matrix. Moreover, a threshold subchannel is added so that the system can realize, in real time, the bipolar weighted summation in a single channel. Preliminary experimental results obtained from the applications in associative memories and multitarget classification with rotation invariance are shown.
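
The nonnegative-weight trick mentioned above (bias the bipolar weights, then subtract the bias via a threshold subchannel) can be illustrated with a plain Hebbian Hopfield memory. This sketch is a generic illustration under that reading, not the paper's weight-optimization algorithm:

```python
def store(patterns):
    """Hebbian outer-product storage of bipolar (+1/-1) patterns, zero diagonal."""
    n = len(patterns[0])
    W = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def sign(x):
    return 1 if x >= 0 else -1

def recall(W, s, steps=5):
    """Synchronous recall: repeatedly threshold the bipolar weighted sum."""
    n = len(s)
    for _ in range(steps):
        s = [sign(sum(W[i][j] * s[j] for j in range(n))) for i in range(n)]
    return s

def recall_nonneg(W, s, bias):
    """One synchronous step using only nonnegative weights W + bias.

    sum_j (W[i][j] + bias) * s[j] - bias * sum_j s[j] == sum_j W[i][j] * s[j],
    so the bipolar weighted summation is recovered by subtracting the
    threshold-subchannel term t (bias must be >= max |W[i][j]|).
    """
    n = len(s)
    t = bias * sum(s)                       # threshold subchannel
    return [sign(sum((W[i][j] + bias) * s[j] for j in range(n)) - t)
            for i in range(n)]
```

In an optical setup the nonnegative matrix and the subchannel can both be realized as intensities, which is the point of the biasing.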

  20. Information Geometry, Inference Methods and Chaotic Energy Levels Statistics

    OpenAIRE

    Cafaro, Carlo

    2008-01-01

    In this Letter, we propose a novel information-geometric characterization of chaotic (integrable) energy level statistics of a quantum antiferromagnetic Ising spin chain in a tilted (transverse) external magnetic field. Finally, we conjecture our results might find some potential physical applications in quantum energy level statistics.

  1. Collaborative filtering on a family of biological targets.

    Science.gov (United States)

    Erhan, Dumitru; L'heureux, Pierre-Jean; Yue, Shi Yi; Bengio, Yoshua

    2006-01-01

    Building a QSAR model of a new biological target for which few screening data are available is a statistical challenge. However, the new target may be part of a bigger family, for which we have more screening data. Collaborative filtering or, more generally, multi-task learning, is a machine learning approach that improves the generalization performance of an algorithm by using information from related tasks as an inductive bias. We use collaborative filtering techniques for building predictive models that link multiple targets to multiple examples. The more commonalities between the targets, the better the multi-target model that can be built. We show an example of a multi-target neural network that can use family information to produce a predictive model of an undersampled target. We also evaluate JRank, a kernel-based method designed for collaborative filtering. We show both methods' performance on compound prioritization for an HTS campaign and the underlying shared representation between targets. JRank outperformed the neural network in both the single- and multi-target models.

  2. Systems biology approaches and tools for analysis of interactomes and multi-target drugs.

    Science.gov (United States)

    Schrattenholz, André; Groebe, Karlfried; Soskic, Vukic

    2010-01-01

    diseases" remains a most pressing medical need. Currently, a change of paradigm can be observed with regard to a new interest in agents that modulate multiple targets simultaneously, essentially "dirty drugs." Targeting cellular function as a system, rather than at the level of a single target, significantly increases the size of the druggable proteome and is expected to introduce novel classes of multi-target drugs with fewer adverse effects and less toxicity. Multiple-target approaches have recently been used to design medications against atherosclerosis, cancer, depression, psychosis and neurodegenerative diseases. A focused approach towards "systemic" drugs will certainly require the development of novel computational and mathematical concepts for appropriate modelling of complex data. But the key is the extraction of relevant molecular information from biological systems by applying rigorous statistical procedures to differential proteomic analytics.

  3. Assessing the potential hydrological impact of the Gibe III Dam on Lake Turkana water level using multi-source satellite data

    Science.gov (United States)

    Velpuri, N. M.; Senay, G. B.

    2012-10-01

    Lake Turkana, the largest desert lake in the world, is fed by ungauged or poorly gauged river systems. To meet the demand of electricity in the East African region, Ethiopia is currently building the Gibe III hydroelectric dam on the Omo River, which supplies more than 80% of the inflows to Lake Turkana. On completion, the Gibe III dam will be the tallest dam in Africa with a height of 241 m. However, the nature of interactions and potential impacts of regulated inflows to Lake Turkana are not well understood due to its remote location and unavailability of reliable in situ datasets. In this study, we used 12 yr (1998-2009) of existing multi-source satellite and model-assimilated global weather data. We used a calibrated multi-source satellite data-driven water balance model for Lake Turkana that takes into account model routed runoff, lake/reservoir evapotranspiration, direct rain on lakes/reservoirs and releases from the dam to compute lake water levels. The model evaluates the impact of the Gibe III dam using three different approaches - a historical approach, a rainfall based approach, and a statistical approach to generate rainfall-runoff scenarios. All the approaches provided comparable and consistent results. Model results indicated that the hydrological impact of the Gibe III dam on Lake Turkana would vary with the magnitude and distribution of rainfall post-dam commencement. On average, the reservoir would take up to 8-10 months, after commencement, to reach a minimum operation level of 201 m depth of water. During the dam filling period, the lake level would drop up to 1-2 m (95% confidence) compared to the lake level modeled without the dam. The lake level variability caused by regulated inflows after the dam commissioning were found to be within the natural variability of the lake of 4.8 m. Moreover, modeling results indicated that the hydrological impact of the Gibe III dam would depend on the initial lake level at the time of dam commencement. 
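
The monthly water-balance bookkeeping that such a model performs can be sketched in a few lines. The function and all numbers below are illustrative assumptions, not the calibrated Lake Turkana model or its data:

```python
def update_level(level_m, inflow_mcm, rain_mm, evap_mm, release_mcm, area_km2):
    """One water-balance step; volumes in million cubic meters (mcm)."""
    # 1 mm of depth over 1 km^2 equals 0.001 mcm.
    rain_mcm = rain_mm * area_km2 * 0.001
    evap_mcm = evap_mm * area_km2 * 0.001
    net_mcm = inflow_mcm + rain_mcm - evap_mcm - release_mcm
    # 1 mcm spread over 1 km^2 raises the level by 1 m.
    return level_m + net_mcm / area_km2

# Example: evaporation exceeding inflow plus rain lowers the level slightly.
new_level = update_level(360.0, 500.0, 50.0, 200.0, 0.0, 7000.0)
```

A full model would additionally route runoff to the lake and let the surface area vary with level; this sketch only shows the unit conversions behind the level update.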

  4. The assessment of pathologists/laboratory medicine physicians through a multisource feedback tool.

    Science.gov (United States)

    Lockyer, Jocelyn M; Violato, Claudio; Fidler, Herta; Alakija, Pauline

    2009-08-01

    There is increasing interest in ensuring that physicians demonstrate the full range of Accreditation Council for Graduate Medical Education competencies. To determine whether it is possible to develop a feasible and reliable multisource feedback instrument for pathologists and laboratory medicine physicians. Surveys with 39, 30, and 22 items were developed to assess individual physicians by 8 peers, 8 referring physicians, and 8 coworkers (eg, technologists, secretaries), respectively, using 5-point scales and an unable-to-assess category. Physicians completed a self-assessment survey. Items addressed key competencies related to clinical competence, collaboration, professionalism, and communication. Data from 101 pathologists and laboratory medicine physicians were analyzed. The mean number of respondents per physician was 7.6, 7.4, and 7.6 for peers, referring physicians, and coworkers, respectively. Internal consistency reliability, measured by Cronbach's alpha, was ≥ .95 for the full scale of all instruments. Analysis indicated that the medical peer, referring physician, and coworker instruments achieved generalizability coefficients of .78, .81, and .81, respectively. Factor analysis showed 4 factors on the peer questionnaire accounted for 68.8% of the total variance: reports and clinical competency, collaboration, educational leadership, and professional behavior. For the referring physician survey, 3 factors accounted for 66.9% of the variance: professionalism, reports, and clinical competency. Two factors on the coworker questionnaire accounted for 59.9% of the total variance: communication and professionalism. It is feasible to assess this group of physicians using multisource feedback with instruments that are reliable.
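
Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from an assessors-by-items score matrix. A minimal sketch with hypothetical ratings (population variances; not the study's data or software):

```python
def cronbach_alpha(ratings):
    """Cronbach's alpha for a list of score vectors (rows = assessors,
    columns = survey items): k/(k-1) * (1 - sum(item vars) / var(totals))."""
    k = len(ratings[0])                    # number of items
    def var(xs):                           # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[j] for row in ratings]) for j in range(k)]
    total_var = var([sum(row) for row in ratings])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items give alpha = 1.0.
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Values near 1 indicate that the items measure a common construct, which is why alphas ≥ .95 support treating each survey as a single scale.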

  5. FuzzyFusion: an application architecture for multisource information fusion

    Science.gov (United States)

    Fox, Kevin L.; Henning, Ronda R.

    2009-04-01

    The correlation of information from disparate sources has long been an issue in data fusion research. Traditional data fusion addresses the correlation of information from sources as diverse as single-purpose sensors and all-source multi-media information. Information system vulnerability information is similar in its diversity of sources and content, and in the desire to draw a meaningful conclusion, namely, the security posture of the system under inspection. FuzzyFusion™, a data fusion model being applied to the computer network operations domain, is presented. This model has been successfully prototyped in an applied research environment and represents a next-generation assurance tool for system and network security.

  6. Elementary statistics for effective library and information service management

    CERN Document Server

    Egghe, Leo

    2001-01-01

    This title describes how best to use statistical data to produce professional reports on library activities. The authors cover data gathering, sampling, graphical representation of data and summary statistics from data, and also include a section on trend analysis. A full bibliography and a subject index make this a key title for any information professional.
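
The kind of summary statistics and trend analysis the book covers can be produced with the Python standard library alone. The monthly loan counts below are made up for illustration:

```python
import statistics

loans = [310, 295, 320, 340, 330, 355]   # hypothetical monthly loan counts

mean = statistics.mean(loans)            # central tendency for a report
median = statistics.median(loans)
stdev = statistics.stdev(loans)          # sample standard deviation

# Least-squares slope as a crude trend indicator (loans gained per month).
xs = list(range(len(loans)))
x_bar = statistics.mean(xs)
slope = (sum((x - x_bar) * (y - mean) for x, y in zip(xs, loans))
         / sum((x - x_bar) ** 2 for x in xs))
```

A positive slope here would support a "circulation is growing" statement in an annual report, which is exactly the style of evidence-based summary the book advocates.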

  7. Optimal path planning for video-guided smart munitions via multitarget tracking

    Science.gov (United States)

    Borkowski, Jeffrey M.; Vasquez, Juan R.

    2006-05-01

    An advent in the development of smart munitions entails autonomously modifying target selection during flight in order to maximize the value of the target being destroyed. A unique guidance law can be constructed that exploits both attribute and kinematic data obtained from an onboard video sensor. An optimal path planning algorithm has been developed with the goals of obstacle avoidance and maximizing the value of the target impacted by the munition. Target identification and classification provides a basis for target value which is used in conjunction with multi-target tracks to determine an optimal waypoint for the munition. A dynamically feasible trajectory is computed to provide constraints on the waypoint selection. Results demonstrate the ability of the autonomous system to avoid moving obstacles and revise target selection in flight.

  8. Development of geo-information data management system and application to geological disposal of high-level radioactive waste in China

    Directory of Open Access Journals (Sweden)

    Wang Peng

    2017-01-01

    Full Text Available In this paper, based on information technology, a geo-information database was established and a geo-information data management system (named HLW-GIS) was developed to facilitate the management of the multi-source and multidisciplinary data generated during the site selection process for a geological repository in China. Many important functions, such as basic create, retrieve, update, and delete operations, full-text search and download, can be performed through this management system; statistics and analysis functions for certain professional data are also provided. Finally, a few hundred gigabytes of data from numerous different disciplines were integrated, stored, and managed successfully. The management system can also provide a significant reference for the data management work of related research fields, such as the decommissioning and management of nuclear facilities, resource prospection and environmental protection.

  9. Defect inspection in hot slab surface: multi-source CCD imaging based fuzzy-rough sets method

    Science.gov (United States)

    Zhao, Liming; Zhang, Yi; Xu, Xiaodong; Xiao, Hong; Huang, Chao

    2016-09-01

    To provide an accurate surface defect inspection method and to make robust automated delineation of image regions of interest (ROIs) a reality on the production line, a multi-source CCD imaging based fuzzy-rough sets method is proposed for hot slab surface quality assessment. The applicability of the presented method and the devised system is mainly tied to surface quality inspection for strip, billet, slab and similar surfaces. In this work we take into account the complementary advantages of two common machine vision (MV) systems: line-array CCD traditional scanning imaging (LS-imaging) and area-array CCD laser three-dimensional (3D) scanning imaging (AL-imaging). By establishing a fuzzy-rough sets model in the detection system, the seeds for relative fuzzy connectedness (RFC) delineation of ROIs can be placed adaptively; the model introduces upper and lower approximation sets for ROI definition, by which the boundary region can be delineated through the RFC region competitive classification mechanism. For the first time, a multi-source CCD imaging based fuzzy-rough sets strategy is attempted for CC-slab surface defect inspection, allowing AI algorithms and powerful ROI delineation strategies to be applied automatically in the MV inspection field.

  10. Analysis of flood inundation in ungauged basins based on multi-source remote sensing data.

    Science.gov (United States)

    Gao, Wei; Shen, Qiu; Zhou, Yuehua; Li, Xin

    2018-02-09

    Floods are among the most expensive natural hazards experienced in many places of the world and can result in heavy losses of life and economic damage. The objective of this study is to analyze flood inundation in ungauged basins by performing near-real-time detection of flood extent and depth based on multi-source remote sensing data. Via spatial distribution analysis of flood extent and depth in a time series, the inundation conditions and the characteristics of the flood disaster can be captured. The results show that multi-source remote sensing data can make up for the lack of hydrological data in ungauged basins, which is helpful for reconstructing the hydrological sequence; the combination of MODIS (moderate-resolution imaging spectroradiometer) surface reflectance products and the DFO (Dartmouth Flood Observatory) flood database can achieve macro-dynamic monitoring of flood inundation in ungauged basins, and the differencing of high-resolution optical and microwave images before and after floods can then be used to calculate the flood extent and reflect spatial changes in inundation; the monitoring algorithm for flood depth combining RS and GIS is simple and can quickly calculate depth from a known flood extent obtained from remote sensing images in ungauged basins. These results can provide effective help for the disaster relief work performed by government departments.
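
The flood-depth idea in the last result reduces to a small grid operation: within the remote-sensing-derived extent, depth is the water surface elevation minus the ground elevation from a DEM. The sketch below uses plain nested lists and illustrative values, not the study's data or tooling:

```python
def flood_depth(dem, extent_mask, water_level):
    """Depth grid: water_level minus ground elevation inside the flood extent.

    dem         -- grid of ground elevations (m)
    extent_mask -- same-shaped grid, truthy where flooding was detected
    water_level -- assumed water surface elevation (m), e.g. from gauge or RS
    """
    depth = []
    for dem_row, mask_row in zip(dem, extent_mask):
        depth.append([max(water_level - ground, 0.0) if flooded else 0.0
                      for ground, flooded in zip(dem_row, mask_row)])
    return depth
```

Real workflows would use raster arrays and a spatially varying water surface, but the per-cell subtraction is the core of the RS/GIS depth algorithm described.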

  11. Multi-objective analysis of the conjunctive use of surface water and groundwater in a multisource water supply system

    Science.gov (United States)

    Vieira, João; da Conceição Cunha, Maria

    2017-04-01

    A multi-objective decision model has been developed to identify the Pareto-optimal set of management alternatives for the conjunctive use of surface water and groundwater of a multisource urban water supply system. A multi-objective evolutionary algorithm, Borg MOEA, is used to solve the multi-objective decision model. The multiple solutions can be shown to stakeholders, allowing them to choose their own solutions depending on their preferences. The multisource urban water supply system studied here depends on surface water and groundwater and is located in the Algarve region, the southernmost province of Portugal, with a typical warm Mediterranean climate. Rainfall is low, intermittent and concentrated in a short winter, followed by a long and dry period. A base population of 450 000 inhabitants and visits by more than 13 million tourists per year, mostly in summertime, make water management critical and challenging. Previous studies on single-objective optimization, after aggregating multiple objectives together, have already concluded that only an integrated and interannual water resources management perspective can be efficient for water resource allocation in this drought-prone region. A simulation model of the multisource urban water supply system, using mathematical functions to represent the water balance in the surface reservoirs, the groundwater flow in the aquifers, and the water transport in the distribution network with explicit representation of water quality, is coupled with Borg MOEA. The multi-objective problem formulation includes five objectives. Two objectives separately evaluate the water quantity and the water quality supplied for urban use over a finite time horizon, one objective calculates the operating costs, and two objectives appraise the state of the two water sources - the storage in the surface reservoir and the piezometric levels in the aquifer - at the end of the time horizon. The decision variables are the volume of withdrawals from
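
The Pareto-optimal set that an MOEA such as Borg approximates is defined by non-dominance. A minimal dominance filter for minimization objectives, as a generic illustration rather than Borg's internal archive logic:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Two incomparable trade-offs survive; dominated points are filtered out.
front = pareto_front([(1, 2), (2, 1), (2, 2), (3, 3)])
```

Presenting such a front to stakeholders, instead of a single aggregated optimum, is exactly what lets them weigh quantity, quality, cost and end-of-horizon storage according to their own preferences.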

  12. The application of quadtree algorithm for information integration in the high-level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    Gao Min; Zhong Xia; Huang Shutao

    2008-01-01

    A multi-source database for high-level radioactive waste (HLW) geological disposal aims to advance the informatization of HLW geological work. Because the integration of multi-dimensional, multi-source information and applications also involves computer software and hardware, the paper first analyzes the data resources of the Beishan area, Gansu Province. It introduces a theory and methods based on GIS technology and the open source GDAL library, and discusses the technical approach to applying the quadtree algorithm to the information resource management system for full sharing, rapid retrieval, and related tasks. A more detailed description of the characteristics of the existing data resources, the theory of spatial data retrieval algorithms, and the program design and implementation ideas is given in the paper. (authors)
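The quadtree retrieval idea referenced in this record can be sketched as follows (a minimal point quadtree in Python; the coordinates and node capacity are illustrative, not the paper's implementation, which builds on GIS tooling and GDAL):

```python
class Quadtree:
    """Minimal point quadtree for 2-D spatial retrieval (illustrative only)."""

    def __init__(self, x0, y0, x1, y1, capacity=4):
        self.bounds = (x0, y0, x1, y1)
        self.capacity = capacity
        self.points = []
        self.children = None          # four sub-quadrants once split

    def _contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x < x1 and y0 <= y < y1

    def insert(self, x, y):
        if not self._contains(x, y):
            return False
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((x, y))
                return True
            self._split()
        return any(c.insert(x, y) for c in self.children)

    def _split(self):
        x0, y0, x1, y1 = self.bounds
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        self.children = [Quadtree(x0, y0, mx, my, self.capacity),
                         Quadtree(mx, y0, x1, my, self.capacity),
                         Quadtree(x0, my, mx, y1, self.capacity),
                         Quadtree(mx, my, x1, y1, self.capacity)]
        for px, py in self.points:    # push stored points into the quadrants
            any(c.insert(px, py) for c in self.children)
        self.points = []

    def query(self, qx0, qy0, qx1, qy1):
        """Return all stored points inside the query rectangle."""
        x0, y0, x1, y1 = self.bounds
        if qx1 <= x0 or qx0 >= x1 or qy1 <= y0 or qy0 >= y1:
            return []                 # no overlap with this node
        hits = [(x, y) for x, y in self.points
                if qx0 <= x < qx1 and qy0 <= y < qy1]
        if self.children:
            for c in self.children:
                hits.extend(c.query(qx0, qy0, qx1, qy1))
        return hits

qt = Quadtree(0, 0, 100, 100)
for p in [(10, 10), (20, 20), (50, 50), (60, 60), (80, 80)]:
    qt.insert(*p)
print(sorted(qt.query(0, 0, 40, 40)))  # the two points in the lower-left region
```

Rectangular queries visit only the quadrants that overlap the query window, which is what makes quadtree indexing attractive for the rapid retrieval described above.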

  13. Pre-service primary school teachers’ knowledge of informal statistical inference

    NARCIS (Netherlands)

    de Vetten, Arjen; Schoonenboom, Judith; Keijzer, Ronald; van Oers, Bert

    2018-01-01

    The ability to reason inferentially is increasingly important in today’s society. It is hypothesized here that engaging primary school students in informal statistical inference (ISI), defined as making generalizations without the use of formal statistical tests, will help them acquire the

  14. Development and validation of a trustworthy multisource feedback instrument to support nurse appraisals.

    Science.gov (United States)

    Crossley, James G M

    2015-01-01

    Nurse appraisal is well established in the Western world because of its obvious educational advantages. Appraisal works best with many sources of information on performance. Multisource feedback (MSF) is widely used in business and in other clinical disciplines to provide such information. It has also been incorporated into nursing appraisals, but, so far, none of the instruments in use for nurses has been validated. We set out to develop an instrument aligned with the UK Knowledge and Skills Framework (KSF) and to evaluate its reliability and feasibility across a wide hospital-based nursing population. The KSF framework provided a content template. Focus groups developed an instrument based on consensus. The instrument was administered to all the nursing staff in 2 large NHS hospitals forming a single trust in London, England. We used generalizability analysis to estimate reliability; response rates and unstructured interviews to evaluate feasibility; and factor structure and correlation studies to evaluate validity. On a voluntary basis the response rate was moderate (60%). A failure to engage with information technology and employment-related concerns were commonly cited as reasons for not responding. In this population, 11 responses provided a profile with sufficient reliability to inform appraisal (G = 0.7). Performance on the instrument was closely and significantly correlated with performance on a KSF questionnaire. This is the first contemporary psychometric evaluation of an MSF instrument for nurses. MSF appears to be as valid and reliable an assessment method to inform appraisal in nurses as it is in other health professional groups. © 2015 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.
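The reported reliability figure (11 responses giving G = 0.7) follows the usual generalizability logic, sketched below (illustrative Python; the variance components are invented to reproduce G ≈ 0.7 at 11 raters, they are not taken from the study):

```python
def g_coefficient(var_person, var_error, n_raters):
    """Generalizability coefficient for the mean of n_raters ratings:
    G = var_p / (var_p + var_e / n)."""
    return var_person / (var_person + var_error / n_raters)

def raters_needed(var_person, var_error, target_g, max_raters=100):
    """Smallest number of raters whose averaged ratings reach target_g."""
    for n in range(1, max_raters + 1):
        if g_coefficient(var_person, var_error, n) >= target_g:
            return n
    return None

# Hypothetical variance components chosen so that 11 raters give G ~ 0.7
print(raters_needed(1.0, 4.714, 0.7))  # 11 with these illustrative components
```

Averaging over more raters shrinks the error term, which is why a feedback profile becomes reliable enough for appraisal only once enough responses are collected.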

  15. AVN-101: A Multi-Target Drug Candidate for the Treatment of CNS Disorders.

    Science.gov (United States)

    Ivachtchenko, Alexandre V; Lavrovsky, Yan; Okun, Ilya

    2016-05-25

    Lack of efficacy of many new highly selective and specific drug candidates in treating diseases with poorly understood or complex etiology, as are many of the central nervous system (CNS) diseases, encouraged the idea of developing multi-modal (multi-targeted) drugs. In this manuscript, we describe molecular pharmacology, in vitro ADME, pharmacokinetics in animals and humans (part of the Phase I clinical studies), bio-distribution, bioavailability, in vivo efficacy, and safety profile of the multimodal drug candidate, AVN-101. We have carried out development of a next generation drug candidate with a multi-targeted mechanism of action, to treat CNS disorders. AVN-101 is a very potent 5-HT7 receptor antagonist (Ki = 153 pM), with slightly lesser potency toward 5-HT6, 5-HT2A, and 5-HT2C receptors (Ki = 1.2-2.0 nM). AVN-101 also exhibits a rather high affinity toward histamine H1 (Ki = 0.58 nM) and adrenergic α2A, α2B, and α2C (Ki = 0.41-3.6 nM) receptors. AVN-101 shows a good oral bioavailability and facilitated brain-blood barrier permeability, low toxicity, and reasonable efficacy in animal models of CNS diseases. The Phase I clinical study indicates that AVN-101 is well tolerated when taken orally at doses of up to 20 mg daily. It does not dramatically influence plasma and urine biochemistry, nor does it prolong the QT ECG interval, thus indicating low safety concerns. The primary therapeutic area for AVN-101 to be tested in clinical trials would be Alzheimer's disease. However, due to its anxiolytic and anti-depressive activities, there is a strong rationale for also studying it in such diseases as general anxiety disorders, depression, schizophrenia, and multiple sclerosis.

  16. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Directory of Open Access Journals (Sweden)

    Qian Li

    Full Text Available BACKGROUND: Traditional virtual screening methods pay more attention to the predicted binding affinity between a drug molecule and a target related to a certain disease than to phenotypic data of the drug molecule against the disease system, and are therefore often less effective for discovering drugs to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods for computationally estimating the whole efficacy of a compound in a complex disease system are needed, given the distinct weight of different targets in a biological process and the standpoint that partial inhibition of several targets can be more efficient than complete inhibition of a single target. METHODOLOGY: We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From the results of the network efficiency calculation for the human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by the complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological processes in the human clotting cascade system. Furthermore, the method, which combines network efficiency with molecular docking scores, was applied to estimate the anticoagulant activities of a series of argatroban intermediates and eight natural products, respectively. The good correlation (r = 0.671) between the experimental data and the decrease of the network efficiency suggests that the approach could be a promising computational systems biology tool to aid identification of anticoagulant activities of compounds in drug discovery. CONCLUSIONS: This article proposes a network-based multi-target computational estimation

  17. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Science.gov (United States)

    Li, Qian; Li, Xudong; Li, Canghai; Chen, Lirong; Song, Jun; Tang, Yalin; Xu, Xiaojie

    2011-03-22

    Traditional virtual screening methods pay more attention to the predicted binding affinity between a drug molecule and a target related to a certain disease than to phenotypic data of the drug molecule against the disease system, and are therefore often less effective for discovering drugs to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods for computationally estimating the whole efficacy of a compound in a complex disease system are needed, given the distinct weight of different targets in a biological process and the standpoint that partial inhibition of several targets can be more efficient than complete inhibition of a single target. We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From the results of the network efficiency calculation for the human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by the complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological processes in the human clotting cascade system. Furthermore, the method, which combines network efficiency with molecular docking scores, was applied to estimate the anticoagulant activities of a series of argatroban intermediates and eight natural products, respectively. The good correlation (r = 0.671) between the experimental data and the decrease of the network efficiency suggests that the approach could be a promising computational systems biology tool to aid identification of anticoagulant activities of compounds in drug discovery. This article proposes a network-based multi-target computational estimation method for anticoagulant activities of compounds by
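The network efficiency analysis described in these two records can be sketched as follows (illustrative Python on a toy undirected graph; the actual analysis used the directed human clotting cascade, not this invented example):

```python
def global_efficiency(adj):
    """Average inverse shortest-path length over all ordered node pairs
    (unreachable pairs contribute zero)."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for s in nodes:
        dist = {s: 0}          # BFS for unweighted shortest paths from s
        frontier = [s]
        while frontier:
            nxt = []
            for u in frontier:
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            frontier = nxt
        total += sum(1.0 / d for node, d in dist.items() if node != s)
    return total / (n * (n - 1))

def fragility_ranking(adj):
    """Rank nodes by how much global efficiency drops when each is removed."""
    base = global_efficiency(adj)
    drop = {}
    for x in adj:
        sub = {u: [v for v in nbrs if v != x] for u, nbrs in adj.items() if u != x}
        drop[x] = base - global_efficiency(sub)
    return sorted(drop, key=drop.get, reverse=True)

# Toy undirected graph with an obvious hub node 'B'
cascade = {'A': ['B'], 'B': ['A', 'C', 'D'], 'C': ['B'], 'D': ['B']}
print(fragility_ranking(cascade))  # 'B' ranks first
```

Ranking nodes by the efficiency drop their removal causes is how the most fragile components of a network (here factor Xa and thrombin in the clotting cascade) are identified.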

  18. Information gathering for the Transportation Statistics Data Bank

    International Nuclear Information System (INIS)

    Shappert, L.B.; Mason, P.J.

    1981-10-01

    The Transportation Statistics Data Bank (TSDB) was developed in 1974 to collect information on the transport of Department of Energy (DOE) materials. This computer program may be used to provide the framework for collecting more detailed information on DOE shipments of radioactive materials. This report describes the type of information that is needed in this area and concludes that the existing system could be readily modified to collect and process it. The additional needed information, available from bills of lading and similar documents, could be gathered from DOE field offices and transferred in a standard format to the TSDB system. Costs of the system are also discussed briefly.

  19. Suitability Evaluation for Products Generation from Multisource Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Jining Yan

    2016-12-01

    Full Text Available With the arrival of the big data era in Earth observation, the remote sensing communities have accumulated a large amount of invaluable and irreplaceable data for global monitoring. These massive remote sensing data have enabled large-area, long-term time-series Earth observation and have, in particular, made standard, automated product generation more popular. However, there is more than one possible data selection for producing a certain remote sensing product, and no single remote sensor can cover such a large area at one time. Therefore, we should automatically select the best data source from redundant multisource remote sensing data, or select substitute data if data is lacking, during the generation of remote sensing products. However, the current data selection strategy mainly adopts empirical models and lacks theoretical support and quantitative analysis. Hence, comprehensively considering the spectral characteristics of ground objects and the spectral differences among remote sensors, by means of spectrum simulation and correlation analysis, we propose a suitability evaluation model for product generation. The model enables us to obtain the Production Suitability Index (PSI) of each remote sensing data source. In order to validate the proposed model, two typical value-added information products, NDVI and NDWI, and two similar or complementary remote sensors, Landsat-OLI and HJ1A-CCD1, were chosen, and verification experiments were performed. Through qualitative and quantitative analysis, the experimental results were consistent with the model calculations, strongly supporting the validity of the suitability evaluation model. The proposed production suitability evaluation model can assist with standard, automated, serialized product generation. It will play an important role in one-stop, value-added information services during the big data era of Earth observation.
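The two validation products named in this record are simple normalized band ratios, sketched below (illustrative Python; the reflectance values are invented, and actual band-to-index assignments differ between Landsat-OLI and HJ1A-CCD1 and must be looked up per sensor):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters): (Green - NIR) / (Green + NIR)."""
    return (green - nir) / (green + nir)

# Hypothetical surface reflectances for a vegetated pixel
print(ndvi(nir=0.40, red=0.10))    # high NDVI suggests dense vegetation
print(ndwi(green=0.10, nir=0.40))  # negative NDWI suggests not open water
```

Because each index depends only on a pair of spectral bands, the suitability of a sensor for producing it hinges on how well that sensor's band responses match the reference spectra, which is what the PSI quantifies.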

  20. A PZT Actuated Triple-Finger Gripper for Multi-Target Micromanipulation

    Directory of Open Access Journals (Sweden)

    Tao Chen

    2017-01-01

    Full Text Available This paper presents a triple-finger gripper driven by a piezoceramic (PZT) transducer for multi-target micromanipulation. The gripper consists of three fingers assembled on adjustable pedestals with flexible hinges for a large adjustable range. Each finger has a PZT actuator, an amplifying structure, and a changeable end effector. The moving trajectories of single and double fingers were calculated, and finite element analyses were performed to verify the reliability of the structures. In the gripping experiments, various end effectors of the fingers such as tungsten probes and fibers were tested, and different micro-objects such as hollow glass spheres and iron spheres with diameters ranging from 10 to 800 μm were picked and released. The output resolution is 145 nm/V, and the driven displacement range of the gripper is 43.4 μm. The PZT-actuated triple-finger gripper has superior adaptability, high efficiency, and a low cost.

  1. A Geospatial Information Grid Framework for Geological Survey

    OpenAIRE

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of ...

  2. Powering embedded electronics for wind turbine monitoring using multi-source energy harvesting techniques

    Science.gov (United States)

    Anton, S. R.; Taylor, S. G.; Raby, E. Y.; Farinholt, K. M.

    2013-03-01

    With a global interest in the development of clean, renewable energy, wind energy has seen steady growth over the past several years. Advances in wind turbine technology bring larger, more complex turbines and wind farms. An important issue in the development of these complex systems is the ability to monitor the state of each turbine in an effort to improve the efficiency and power generation. Wireless sensor nodes can be used to interrogate the current state and health of wind turbine structures; however, a drawback of most current wireless sensor technology is their reliance on batteries for power. Energy harvesting solutions present the ability to create autonomous power sources for small, low-power electronics through the scavenging of ambient energy; however, most conventional energy harvesting systems employ a single mode of energy conversion, and thus are highly susceptible to variations in the ambient energy. In this work, a multi-source energy harvesting system is developed to power embedded electronics for wind turbine applications in which energy can be scavenged simultaneously from several ambient energy sources. Field testing is performed on a full-size, residential scale wind turbine where both vibration and solar energy harvesting systems are utilized to power wireless sensing systems. Two wireless sensors are investigated, including the wireless impedance device (WID) sensor node, developed at Los Alamos National Laboratory (LANL), and an ultra-low power RF system-on-chip board that is the basis for an embedded wireless accelerometer node currently under development at LANL. Results indicate the ability of the multi-source harvester to successfully power both sensors.

  3. Two years of recorded data for a multisource heat pump system: A performance analysis

    International Nuclear Information System (INIS)

    Busato, F.; Lazzarin, R.M.; Noro, M.

    2013-01-01

    The concept of a low energy building in a temperate climate (according to the Köppen climate classification) is based upon the following principles: reduction of heat losses through enhanced insulation; the inclusion of heat recovery on mechanical ventilation; and the use of high efficiency heating/cooling systems integrated with renewable technologies. It is almost impossible to achieve optimum results in terms of global energy efficiency if one of these elements is omitted from the design. In 2009, a new school building, integrating these three key elements, was opened in the town of Agordo in northern Italy. The main design features of the building incorporate a well insulated envelope and a space heating and ventilation system driven by an innovative multisource heat pump system. Outdoor air is a common heat source, although it does have widely documented limitations. Heat pump systems can utilise more efficient sources than air, including those of ground heat, solar heat, and heat recovery. The installed system within the school building incorporates these three sources. A multisource system aims to enhance the performance of the heat pump, both in terms of heating capacity and overall efficiency. The present work includes evaluation and analysis of data obtained through real time monitoring of the working system in operation, for a period of approximately two heating seasons. During this time, the behaviour of the system was assessed and incorrect settings of the plant were identified and subsequently adjusted as required. The energy balance indicates that the integration of different sources not only increases the thermal performance of the system as a whole, but also optimizes the use of each source. Further savings can be obtained through correct adjustment of the set point of the indoor temperature. During the final stage of the study, the total energy consumption of the new building is calculated and compared to that of the former building that

  4. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods.

    Science.gov (United States)

    Hoak, Anthony; Medeiros, Henry; Povinelli, Richard J

    2017-03-03

    We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.
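The OSPA metric used for evaluation in this record combines a localization error with a cardinality penalty for missed or spurious targets. A brute-force sketch (illustrative Python, practical only for small sets; production code would solve the assignment with the Hungarian algorithm):

```python
import math
from itertools import permutations

def ospa(X, Y, c=10.0, p=2):
    """OSPA distance between finite point sets X and Y with cut-off c and
    order p, using brute-force assignment (fine for small sets)."""
    if len(X) > len(Y):
        X, Y = Y, X
    m, n = len(X), len(Y)
    if n == 0:
        return 0.0            # both sets empty
    d = lambda a, b: min(math.dist(a, b), c)
    best = min(
        sum(d(x, perm[i]) ** p for i, x in enumerate(X))
        for perm in permutations(Y, m)
    )
    # localization term plus cardinality penalty for the n - m unmatched points
    return ((best + c ** p * (n - m)) / n) ** (1 / p)

print(ospa([(0, 0)], [(3, 4)]))  # pure localization error: 5.0
print(ospa([], [(0, 0)]))        # pure cardinality penalty: c = 10.0
```

The cut-off c caps how much any single mismatch can contribute, so cardinality errors and gross localization errors are penalized on the same scale.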

  5. Recognition of Voice Commands by Multisource ASR and Noise Cancellation in a Smart Home Environment

    OpenAIRE

    Vacher , Michel; Lecouteux , Benjamin; Portet , François

    2012-01-01

    In this paper, we present a multisource ASR system to detect home automation orders in various everyday listening conditions in a realistic home. The system is based on a state of the art echo cancellation stage that feeds recently introduced ASR techniques. The evaluation was conducted on a realistic noisy data set acquired in a smart home where a microphone was placed near the noise source and several other microphones were placed in different rooms. This distant spe...

  6. Informing Evidence Based Decisions: Usage Statistics for Online Journal Databases

    Directory of Open Access Journals (Sweden)

    Alexei Botchkarev

    2017-06-01

    Full Text Available Abstract Objective – The primary objective was to examine online journal database usage statistics for a provincial ministry of health in the context of evidence based decision-making. In addition, the study highlights implementation of the Journal Access Centre (JAC), which is housed and powered by the Ontario Ministry of Health and Long-Term Care (MOHLTC), to inform health systems policy-making. Methods – This was a prospective case study using descriptive analysis of the JAC usage statistics of journal articles from January 2009 to September 2013. Results – JAC enables ministry employees to access approximately 12,000 journals with full-text articles. JAC usage statistics for the 2011-2012 calendar years demonstrate a steady level of activity in terms of searches, with monthly averages of 5,129. In 2009-2013, a total of 4,759 journal titles were accessed including 1,675 journals with full-text. Usage statistics demonstrate that the actual consumption was over 12,790 full-text downloaded articles or approximately 2,700 articles annually. Conclusion – JAC’s steady level of activities, revealed by the study, reflects continuous demand for JAC services and products. It testifies that access to online journal databases has become part of routine government knowledge management processes. MOHLTC’s broad area of responsibilities with dynamically changing priorities translates into the diverse information needs of its employees and a large set of required journals. Usage statistics indicate that MOHLTC information needs cannot be mapped to a reasonably compact set of “core” journals with a subsequent subscription to those.

  7. Preclinical FLT-PET and FDG-PET imaging of tumor response to the multi-targeted Aurora B kinase inhibitor, TAK-901

    International Nuclear Information System (INIS)

    Cullinane, Carleen; Waldeck, Kelly L.; Binns, David; Bogatyreva, Ekaterina; Bradley, Daniel P.; Jong, Ron de; McArthur, Grant A.; Hicks, Rodney J.

    2014-01-01

    Introduction: The Aurora kinases play a key role in mitosis and have recently been identified as attractive targets for therapeutic intervention in cancer. The aim of this study was therefore to investigate the utility of 3′-[18F]fluoro-3′-deoxythymidine (FLT) and 2-deoxy-2-[18F]fluoro-D-glucose (FDG) for assessment of tumor response to the multi-targeted Aurora B kinase inhibitor, TAK-901. Methods: Balb/c nude mice bearing HCT116 colorectal xenografts were treated with up to 30 mg/kg TAK-901 or vehicle intravenously twice daily for two days on a weekly cycle. Tumor growth was monitored by calliper measurements and PET imaging was performed at baseline and on days 4, 8, 11 and 15. Tumors were harvested at time points corresponding to days of PET imaging for analysis of ex vivo markers of cell proliferation and metabolism together with markers of Aurora B kinase inhibition including phospho-histone H3 (pHH3) and senescence associated β-galactosidase. Results: Tumor growth was inhibited by 60% on day 12 of 30 mg/kg TAK-901 therapy. FLT uptake was significantly reduced by day 4 of treatment and this corresponded with reduction in bromodeoxyuridine and pHH3 staining by immunohistochemistry. All biomarkers rebounded towards baseline levels by the commencement of the next treatment cycle, consistent with release of Aurora B kinase suppression. TAK-901 therapy had no impact on glucose metabolism as assessed by FDG uptake and GLUT1 staining by immunohistochemistry. Conclusions: FLT-PET, but not FDG-PET, is a robust non-invasive imaging biomarker of early HCT116 tumor response to the on-target effects of the multi-targeted Aurora B kinase inhibitor, TAK-901. Advances in knowledge and implications for patient care: This is the first report to demonstrate the impact of the multi-targeted Aurora B kinase inhibitor, TAK-901 on tumor FLT uptake. The findings provide a strong rationale for the evaluation of FLT-PET as an early biomarker of tumor response in the early phase

  8. Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.

    Science.gov (United States)

    Harman, Donna; And Others

    1991-01-01

    Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…
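The statistical ranking idea in this record can be illustrated with a TF-IDF score (a generic sketch with invented documents, not the prototype's actual ranking formula):

```python
import math
from collections import Counter

def tfidf_rank(query, docs):
    """Return document indices ranked by a TF-IDF weighted query score."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))                 # document frequency per term
    idf = lambda t: math.log((n + 1) / (df[t] + 1)) + 1  # smoothed IDF
    scores = []
    for i, toks in enumerate(tokenized):
        tf = Counter(toks)
        scores.append((sum(tf[t] * idf(t) for t in query.lower().split()), i))
    return [i for _, i in sorted(scores, reverse=True)]

docs = [
    "statistical ranking of documents",
    "centralized retrieval on CD-ROM",
    "ranking with statistical methods and ranking again",
]
print(tfidf_rank("statistical ranking", docs))  # [2, 0, 1]
```

Returning documents ordered by such a score, rather than by Boolean match, is what "statistical ranking" refers to in this prototype.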

  9. TimeLapseAnalyzer: Multi-target analysis for live-cell imaging and time-lapse microscopy

    DEFF Research Database (Denmark)

    Huth, Johannes; Buchholz, Malte; Kraus, Johann M.

    2011-01-01

    The direct observation of cells over time using time-lapse microscopy can provide deep insights into many important biological processes. Reliable analyses of motility, proliferation, invasive potential or mortality of cells are essential to many studies involving live cell imaging and can aid in the evaluation of such experiments. To this end, we developed TimeLapseAnalyzer. Apart from general purpose image enhancements and segmentation procedures, this extensible, self-contained, modular cross-platform package provides dedicated modalities for fast and reliable analysis of multi-target cell tracking, scratch wound healing analysis, cell counting and tube formation analysis in high throughput screening of live-cell experiments. TimeLapseAnalyzer is freely available (MATLAB, Open Source) at http://www.informatik.uniulm.de/ni/mitarbeiter/HKestler/tla.

  10. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods

    Directory of Open Access Journals (Sweden)

    Anthony Hoak

    2017-03-01

    Full Text Available We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.

  11. From Single Target to Multitarget/Network Therapeutics in Alzheimer’s Therapy

    Directory of Open Access Journals (Sweden)

    Hailin Zheng

    2014-01-01

    Full Text Available Brain network dysfunction in Alzheimer’s disease (AD) involves many proteins (enzymes), processes and pathways, which overlap and influence one another in AD pathogenesis. This complexity challenges the dominant paradigm in drug discovery of a single-target drug for a single mechanism. Although this paradigm has achieved considerable success in some particular diseases, it has failed to provide effective approaches to AD therapy. Network medicines may offer alternative hope for effective treatment of AD and other complex diseases. In contrast to the single-target drug approach, network medicines employ a holistic approach to restore network dysfunction by simultaneously targeting key components in disease networks. In this paper, we explore several drugs either in the clinic or under development for AD therapy in terms of their design strategies, diverse mechanisms of action and disease-modifying potential. These drugs act as multi-target ligands and may serve as leads for further development as network medicines.

  12. Development of a gas-jet-coupled multitarget system for multitracer production

    International Nuclear Information System (INIS)

    Haba, H.; Kaji, D.; Kanayama, Y.; Igarashi, K.; Enomoto, S.

    2005-01-01

    A new multitracer production system, which consists of a gas-jet-coupled multitarget system for short-lived radioactive tracers and a gas- and water-cooled target system for intense beam irradiations, has been installed on a beam line of the K540-MeV RIKEN Ring Cyclotron. The performance of the gas-jet system was investigated with 50 radionuclides of 18 elements produced in the 135 MeV nucleon⁻¹ ¹⁴N-induced reaction on natural Cu. The gas-jet efficiencies of the nuclides, varying from ⁶¹Cu to ²⁴Na, except for the chlorine isotopes, show a smooth variation as a function of the mass difference between a product and the target. The multitracers on the natural Ag and ¹⁹⁷Au targets were also produced by the 135 MeV nucleon⁻¹ ¹⁴N beam with an intensity of 0.7 pμA, which was more than seven times the limit of the previous system. (orig.)

  13. Multisource drug policies in Latin America: survey of 10 countries.

    Science.gov (United States)

    Homedes, Núria; Ugalde, Antonio

    2005-01-01

    Essential drug lists and generic drug policies have been promoted as strategies to improve access to pharmaceuticals and control their rapidly escalating costs. This article reports the results of a preliminary survey conducted in 10 Latin American countries. The study aimed to document the experiences of different countries in defining and implementing generic drug policies, determine the cost of registering different types of pharmaceutical products and the time needed to register them, and uncover the incentives governments have developed to promote the use of multisource drugs. The survey instrument was administered in person in Chile, Ecuador and Peru and by email in Argentina, Brazil, Bolivia, Colombia, Costa Rica, Nicaragua and Uruguay. There was a total of 22 respondents. Survey responses indicated that countries use the terms generic and bioequivalence differently. We suggest there is a need to harmonize definitions and technical concepts. PMID:15682251

  14. [A model for multi-source feedback in postgraduate medical education based on validation and best practise].

    Science.gov (United States)

    Eriksen, Gitte Valsted; Malling, Bente

    2014-04-14

    In Denmark multi-source feedback is used in formative assessment of trainees' performance regarding the roles: communicator, collaborator, professional and manager. A web-based model was developed and evaluated useful, time-effective, acceptable and feasible. The model comprises a validated questionnaire usable in all specialities, personal feedback from an educated feedback facilitator, identification of areas for improvement and a mandatory written plan for the trainees' further professional development. The model is implemented at all hospitals in the Northern Educational Region in Denmark.

  15. Reasoning about Informal Statistical Inference: One Statistician's View

    Science.gov (United States)

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  16. Use of Statistical Information for Damage Assessment of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.

    This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...

  17. MO-AB-BRA-08: A Modular Multi-Source X-Ray Tube for Novel Computed Tomography Applications

    Energy Technology Data Exchange (ETDEWEB)

    Walker, B; Radtke, J; Chen, G; Mackie, T [University of Wisconsin Madison, Madison, WI (United States); Petry, G; Swader, R; Eliceiri, K [Morgridge Institute for Research, Madison, WI (United States)

    2016-06-15

    Purpose: To develop and build a practical implementation of an x-ray line source for the rapidly increasing number of multi-source imaging applications in CT. Methods: An innovative x-ray tube was designed using CST Particle Studio, ANSYS, and SolidWorks. A slowly varying magnetic field is synchronized with microsecond gating of multiple thermionic electron sources. Electrostatic simulations were run to optimize the geometry of the optics and prevent electrode arcing. Magnetostatic simulations were used for beam deflection studies and solenoid design. Particle beam trajectories were explored with an emphasis on focusing, acceleration, deflection, and space charge effects. Thermal constraints were analyzed for both transient and steady-state regimes. Electromagnetic simulations informed the design of a prototype unit under construction. Results: Particle tracking simulations for a benchtop system demonstrate that three 80 keV electron beams can be finely controlled and swept laterally a combined distance of 15 cm over a stationary target with an oscillating magnetic field in the hundreds of gauss. The beams are pulsed according to scanning sequences developed for implementation in a mock stationary CT scanner capable of a 30 ms temporal resolution. Beam spot diameters are approximately 1 mm for 30 mA beams and the stationary target stays well within thermal limits. The relevant hardware and control circuits were developed for incorporation into a physical prototype. Conclusion: A new multi-source x-ray tube was designed in a modular form factor to push the barriers of high-speed CT and spur growth in emerging imaging applications. This technology can be used as the basis for a stationary high-speed CT scanner, a system for generating a virtual fan-beam for dose reduction, or for reducing scatter radiation in cone-beam CT utilizing a tetrahedron beam CT geometry. A 2.4 kW benchtop system is currently being built to show proof of concept for the tube.

  18. Public health information and statistics dissemination efforts for Indonesia on the Internet.

    Science.gov (United States)

    Hanani, Febiana; Kobayashi, Takashi; Jo, Eitetsu; Nakajima, Sawako; Oyama, Hiroshi

    2011-01-01

    To elucidate current issues related to health statistics dissemination efforts on the Internet in Indonesia and to propose a new dissemination website as a solution. A cross-sectional survey was conducted. Sources of statistics were identified using link relationships and Google™ search. The menus used to locate statistics, the modes of presentation and means of access to statistics, and the available statistics were assessed for each site. Assessment results were used to derive a design specification; a prototype system was developed and evaluated with a usability test. 49 sources were identified on 18 governmental, 8 international and 5 non-governmental websites. Of the 49 menus identified, 33% used non-intuitive titles that led to inefficient searches; 69% of these were on government websites. Of 31 websites, only 39% and 23% used graphs/charts and maps for presentation. Further, only 32%, 39% and 19% provided query, export and print features. While more than 50% of the sources reported morbidity, risk factor and service provision statistics, statistics dissemination in Indonesia is largely supported by non-governmental and international organizations, and the existing information may not be very useful because it is: a) not widely distributed, b) difficult to locate, and c) not effectively communicated. Actions are needed to ensure information usability, and one such action is the development of a statistics portal website.

  19. Conference: Statistical Physics and Biological Information

    International Nuclear Information System (INIS)

    Gross, David J.; Hwa, Terence

    2001-01-01

    In the spring of 2001, the Institute for Theoretical Physics ran a 6-month scientific program on Statistical Physics and Biological Information. This program was organized by Walter Fitch (UC Irvine), Terence Hwa (UC San Diego), Luca Peliti (University Federico II, Naples), Gary Stormo (Washington University School of Medicine) and Chao Tang (NEC). Overall scientific supervision was provided by David Gross, Director, ITP. The ITP has an online conference/program proceeding which consists of audio and transparencies of almost all of the talks held during this program. Over 100 talks are available on the site at http://online.kitp.ucsb.edu/online/infobio01/

  20. Automatic Matching of Multi-Source Satellite Images: A Case Study on ZY-1-02C and ETM+

    Directory of Open Access Journals (Sweden)

    Bo Wang

    2017-10-01

    Full Text Available The ever-growing number of applications for satellites is being compromised by their poor direct positioning precision. Existing orthoimages, such as enhanced thematic mapper (ETM+) orthoimages, can provide georeferences or improve the geo-referencing accuracy of satellite images, such as ZY-1-02C images that have unsatisfactory positioning precision, thus enhancing their processing efficiency and application. In this paper, a feasible image matching approach using multi-source satellite images is proposed on the basis of an experiment carried out with ZY-1-02C Level 1 images and ETM+ orthoimages. The proposed approach overcame differences in rotation angle, scale, and translation between images. The rotation and scale variances were evaluated on the basis of rational polynomial coefficients. The translation vectors were generated after blocking the overall phase correlation. Then, normalized cross-correlation and least-squares matching were applied for matching. Finally, the gross errors of the corresponding points were eliminated by local statistic vectors in a TIN structure. Experimental results showed a matching precision of less than two pixels (root-mean-square error), and comparison results indicated that the proposed method outperforms the Scale-Invariant Feature Transform (SIFT), Speeded Up Robust Features (SURF), and Affine-Scale Invariant Feature Transform (A-SIFT) in terms of reliability and efficiency.
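
    The matching pipeline above chains phase correlation, normalized cross-correlation (NCC), and least-squares refinement. As a rough illustration of the NCC step only (not the authors' implementation), the following sketch exhaustively searches a toy image for the translation that maximizes NCC:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_template(image, template):
    """Exhaustive search for the translation maximizing NCC; returns (row, col)."""
    th, tw = template.shape
    best_score, best_rc = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            score = ncc(image[r:r + th, c:c + tw], template)
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

# A patch cut from the image is recovered at its true offset.
rng = np.random.default_rng(2)
img = rng.random((30, 30))
tpl = img[5:15, 7:17].copy()
print(match_template(img, tpl))
```

    In practice the brute-force search would be restricted to a window around the translation predicted by phase correlation, which is what makes the blocked phase-correlation step in the paper worthwhile.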

  1. Development and application of a multi-targeting reference plasmid as calibrator for analysis of five genetically modified soybean events.

    Science.gov (United States)

    Pi, Liqun; Li, Xiang; Cao, Yiwei; Wang, Canhua; Pan, Liangwen; Yang, Litao

    2015-04-01

    Reference materials are important in the accurate analysis of genetically modified organism (GMO) contents in foods/feeds, and the development of novel reference plasmids is a new trend in the research of GMO reference materials. Herein, we constructed a novel multi-targeting plasmid, pSOY, which contained seven event-specific sequences of five GM soybeans (MON89788-5', A2704-12-3', A5547-127-3', DP356043-5', DP305423-3', A2704-12-5', and A5547-127-5') and the sequence of the soybean endogenous reference gene Lectin. We evaluated the specificity, limits of detection and quantification, and applicability of pSOY in both qualitative and quantitative PCR analyses. The limit of detection (LOD) was as low as 20 copies in qualitative PCR, and the limit of quantification (LOQ) in quantitative PCR was 10 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and Lectin assays were higher than 90%, and the squared regression coefficients (R(2)) were more than 0.999. The quantification bias varied from 0.21% to 19.29%, and the relative standard deviations were from 1.08% to 9.84% in simulated sample analysis. All the results demonstrated that the developed multi-targeting plasmid, pSOY, is a credible substitute for matrix reference materials, and could be used as a reliable reference calibrator in the identification and quantification of multiple GM soybean events.

  2. Fused Regression for Multi-source Gene Regulatory Network Inference.

    Directory of Open Access Journals (Sweden)

    Kari Y Lam

    2016-12-01

    Full Text Available Understanding gene regulatory networks is critical to understanding cellular differentiation and response to external stimuli. Methods for global network inference have been developed and applied to a variety of species. Most approaches consider the problem of network inference independently in each species, despite evidence that gene regulation can be conserved even in distantly related species. Further, network inference is often confined to single data types (single platforms and single cell types). We introduce a method for multi-source network inference that allows simultaneous estimation of gene regulatory networks in multiple species or biological processes through the introduction of priors based on known gene relationships such as orthology, incorporated using fused regression. This approach improves network inference performance even when orthology mapping and conservation are incomplete. We refine this method by presenting an algorithm that extracts the true conserved subnetwork from a larger set of potentially conserved interactions and demonstrate the utility of our method in cross-species network inference. Last, we demonstrate our method's utility in learning from data collected on different experimental platforms.

  3. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.

  4. Rational design, synthesis and biological screening of triazine-triazolopyrimidine hybrids as multitarget anti-Alzheimer agents.

    Science.gov (United States)

    Jameel, Ehtesham; Meena, Poonam; Maqbool, Mudasir; Kumar, Jitendra; Ahmed, Waqar; Mumtazuddin, Syed; Tiwari, Manisha; Hoda, Nasimul; Jayaram, B

    2017-08-18

    In our endeavor towards the development of potent multitarget ligands for the treatment of Alzheimer's disease, a series of triazine-triazolopyrimidine hybrids were designed, synthesized and characterized by various spectral techniques. Docking and scoring techniques were used to design the inhibitors and to display their interactions with key residues of the active site. Organic synthesis relied upon convergent synthetic routes in which mono- and di-substituted triazines were connected with triazolopyrimidine using piperazine as a linker. In total, seventeen compounds were synthesized, of which the di-substituted triazine-triazolopyrimidine derivatives 9a-d showed better acetylcholinesterase (AChE) inhibitory activity than the corresponding tri-substituted triazine-triazolopyrimidine derivatives 10a-f. Of the disubstituted triazine-triazolopyrimidine based compounds, 9a and 9b showed encouraging inhibitory activity on AChE, with IC50 values of 0.065 and 0.092 μM, respectively. Interestingly, 9a and 9b also demonstrated good inhibition selectivity towards AChE over BuChE, by ∼28-fold. Furthermore, kinetic analysis and molecular modeling studies showed that 9a and 9b target both the catalytic active site and the peripheral anionic site of AChE. In addition, these derivatives effectively modulated Aβ self-aggregation as investigated through CD spectroscopy, ThT fluorescence assay and electron microscopy. Besides, these compounds exhibited potent antioxidant (2.15 and 2.91 trolox equivalents by ORAC assay) and metal-chelating properties. In silico ADMET profiling highlighted that these novel triazine derivatives have appropriate drug-like properties and possess very low toxic effects in the preliminary pharmacokinetic study. Overall, the multitarget profile exerted by these novel triazine molecules qualifies them as potential anti-Alzheimer drug candidates in AD therapy. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  5. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    Energy Technology Data Exchange (ETDEWEB)

    Solaimani, Mohiuddin [Univ. of Texas-Dallas, Richardson, TX (United States); Iftekhar, Mohammed [Univ. of Texas-Dallas, Richardson, TX (United States); Khan, Latifur [Univ. of Texas-Dallas, Richardson, TX (United States); Thuraisingham, Bhavani [Univ. of Texas-Dallas, Richardson, TX (United States); Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to achieve higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint, with higher accuracy, by implementing a cluster-based technique to detect both sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
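
    The abstract does not reproduce the Spark implementation; a minimal single-machine sketch of the kind of window-based statistical test it describes (here an assumed trailing-window z-score, a deliberate simplification) might look like:

```python
import numpy as np
from collections import deque

def window_anomalies(stream, window=50, z_thresh=3.0):
    """Flag each point whose z-score against the trailing window exceeds z_thresh."""
    buf = deque(maxlen=window)
    flags = []
    for x in stream:
        if len(buf) == window:
            mu, sigma = np.mean(buf), np.std(buf)
            flags.append(bool(sigma > 0 and abs(x - mu) / sigma > z_thresh))
        else:
            flags.append(False)  # not enough history yet to judge
        buf.append(x)
    return flags

# A large spike in an otherwise stationary stream is flagged.
rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, 200).tolist()
data[150] = 12.0  # injected anomaly
flags = window_anomalies(data)
print(flags[150])
```

    In the distributed setting of the paper, the window statistics would be maintained per stream partition, which is the part Spark's streaming primitives provide.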

  6. Multi-sources model and control algorithm of an energy management system for light electric vehicles

    International Nuclear Information System (INIS)

    Hannan, M.A.; Azidin, F.A.; Mohamed, A.

    2012-01-01

    Highlights: ► An energy management system (EMS) is developed for a scooter under normal and heavy power load conditions. ► The battery, FC, SC, EMS, DC machine and vehicle dynamics are modeled and designed for the system. ► State-based logic control algorithms provide an efficient and feasible multi-source EMS for light electric vehicles. ► Vehicle’s speed and power are closely matched with the ECE-47 driving cycle under normal and heavy load conditions. ► Sources of energy changeover occurred at 50% of the battery state of charge level in heavy load conditions. - Abstract: This paper presents the multi-source energy models and rule-based feedback control algorithm of an energy management system (EMS) for light electric vehicles (LEVs), i.e., scooters. The multiple sources of energy, such as a battery, fuel cell (FC) and super-capacitor (SC), EMS and power controller, DC machine and vehicle dynamics are designed and modeled using MATLAB/SIMULINK. The developed control strategies continuously support the EMS of the multiple sources of energy for a scooter under normal and heavy power load conditions. The performance of the proposed system is analyzed and compared with that of the ECE-47 test drive cycle in terms of vehicle speed and load power. The results show that the designed vehicle’s speed and load power closely match those of the ECE-47 test driving cycle under normal and heavy load conditions. This study’s results suggest that the proposed control algorithm provides an efficient and feasible EMS for LEVs.

  7. Bayesian Information Criterion as an Alternative way of Statistical Inference

    Directory of Open Access Journals (Sweden)

    Nadejda Yu. Gubanova

    2012-05-01

    Full Text Available The article treats the Bayesian information criterion as an alternative to traditional methods of statistical inference based on NHST. A comparison of ANOVA and BIC results for a psychological experiment is discussed.
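
    For a least-squares fit with Gaussian errors, BIC can be written as n·ln(RSS/n) + k·ln(n), where k is the number of fitted parameters; the model with the lower BIC is preferred. A minimal sketch on illustrative data (not the article's experiment):

```python
import numpy as np

def bic_gaussian(rss, n, k):
    """BIC for a least-squares fit with Gaussian errors.

    rss: residual sum of squares, n: sample size, k: number of fitted
    parameters.  The model with the lower BIC is preferred.
    """
    return n * np.log(rss / n) + k * np.log(n)

# Hypothetical example: is a quadratic term worth its extra parameter
# on data that is truly linear?
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.1, size=50)

for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    print(degree, bic_gaussian(rss, len(x), degree + 1))
```

    Unlike an NHST p-value, the BIC difference directly compares the two models and penalizes the extra parameter through the k·ln(n) term.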

  8. Multisource inverse-geometry CT. Part II. X-ray source design and prototype

    Energy Technology Data Exchange (ETDEWEB)

    Neculaes, V. Bogdan, E-mail: neculaes@ge.com; Caiafa, Antonio; Cao, Yang; De Man, Bruno; Edic, Peter M.; Frutschy, Kristopher; Gunturi, Satish; Inzinna, Lou; Reynolds, Joseph; Vermilyea, Mark; Wagner, David; Zhang, Xi; Zou, Yun [GE Global Research, Niskayuna, New York 12309 (United States); Pelc, Norbert J. [Department of Radiology, Stanford University, Stanford, California 94305 (United States); Lounsberry, Brian [Healthcare Science Technology, GE Healthcare, West Milwaukee, Wisconsin 53219 (United States)

    2016-08-15

    Purpose: This paper summarizes the development of a high-power distributed x-ray source, or “multisource,” designed for inverse-geometry computed tomography (CT) applications [see B. De Man et al., “Multisource inverse-geometry CT. Part I. System concept and development,” Med. Phys. 43, 4607–4616 (2016)]. The paper presents the evolution of the source architecture, component design (anode, emitter, beam optics, control electronics, high voltage insulator), and experimental validation. Methods: Dispenser cathode emitters were chosen as electron sources. A modular design was adopted, with eight electron emitters (two rows of four emitters) per module, wherein tungsten targets were brazed onto copper anode blocks—one anode block per module. A specialized ceramic connector provided high voltage standoff capability and cooling oil flow to the anode. A matrix topology and low-noise electronic controls provided switching of the emitters. Results: Four modules (32 x-ray sources in two rows of 16) have been successfully integrated into a single vacuum vessel and operated on an inverse-geometry computed tomography system. Dispenser cathodes provided high beam current (>1000 mA) in pulse mode, and the electrostatic lenses focused the current beam to a small optical focal spot size (0.5 × 1.4 mm). Controlled emitter grid voltage allowed the beam current to be varied for each source, providing the ability to modulate beam current across the fan of the x-ray beam, denoted as a virtual bowtie filter. The custom designed controls achieved x-ray source switching in <1 μs. The cathode-grounded source was operated successfully up to 120 kV. Conclusions: A high-power, distributed x-ray source for inverse-geometry CT applications was successfully designed, fabricated, and operated. Future embodiments may increase the number of spots and utilize fast read out detectors to increase the x-ray flux magnitude further, while still staying within the stationary target inherent

  9. Concepts and recent advances in generalized information measures and statistics

    CERN Document Server

    Kowalski, Andres M

    2013-01-01

    Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantif

  10. A Novel Nonlinear Multitarget k-Degree Coverage Preservation Protocol in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zeyu Sun

    2016-01-01

    Full Text Available Due to the existence of a large amount of redundant data in the process of covering multiple targets, the effective coverage of the monitored region decreases, causing the network to consume more energy. To solve this problem, this paper proposes a nonlinear multitarget k-degree coverage preservation protocol (NMCP). Firstly, the affiliation between the sensor nodes and target nodes is established in the network model, and a method for calculating the coverage expectation value of the monitored region is put forward; secondly, with respect to network energy conversion, scheduling mechanisms are applied to the sensor nodes to balance the network energy and achieve different levels of network coverage quality through energy conversion between nodes. Finally, simulation results show that NMCP can improve the network lifetime by effectively reducing the number of active nodes needed to meet given coverage requirements.
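
    The abstract does not give the coverage-expectation formula itself; under the common simplifying assumption that each of n sensors covers a given target independently with probability p, the probability of k-degree coverage is just a binomial tail:

```python
import math

def k_coverage_probability(n, p, k):
    """P(at least k of n sensors cover a target), assuming each sensor
    covers it independently with probability p (binomial tail)."""
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

# With 5 sensors each covering with probability 0.5, simple 1-coverage
# is 1 - 0.5**5 = 0.96875, while 3-coverage is considerably lower.
print(k_coverage_probability(5, 0.5, 1))
print(k_coverage_probability(5, 0.5, 3))
```

    The independence assumption is ours, made for illustration; the paper's expectation value is derived from its own network model.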

  11. Blind estimation of the number of speech source in reverberant multisource scenarios based on binaural signals

    DEFF Research Database (Denmark)

    May, Tobias; van de Par, Steven

    2012-01-01

    In this paper we present a new approach for estimating the number of active speech sources in the presence of interfering noise sources and reverberation. First, a binaural front-end is used to detect the spatial positions of all active sound sources, resulting in a binary mask for each candidate...... on a support vector machine (SVM) classifier. A systematic analysis shows that the proposed algorithm is able to blindly determine the number and the corresponding spatial positions of speech sources in multisource scenarios and generalizes well to unknown acoustic conditions...

  12. Least-squares Migration and Full Waveform Inversion with Multisource Frequency Selection

    KAUST Repository

    Huang, Yunsong

    2013-09-01

    Multisource Least-Squares Migration (LSM) of phase-encoded supergathers has shown great promise in reducing the computational cost of conventional migration. But for the marine acquisition geometry this approach faces the challenge of erroneous misfit due to the mismatch between the limited number of live traces/shot recorded in the field and the pervasive number of traces generated by the finite-difference modeling method. To tackle this mismatch problem, I present a frequency selection strategy with LSM of supergathers. The key idea is, at each LSM iteration, to assign a unique frequency band to each shot gather, so that the spectral overlap among those shots—and therefore their crosstalk—is zero. Consequently, each receiver can unambiguously identify and then discount the superfluous sources—those that are not associated with the receiver in marine acquisition. To compare with standard migration, I apply the proposed method to the 2D SEG/EAGE salt model and obtain better resolved images computed at about 1/8 the cost; results for the 3D SEG/EAGE salt model, with Ocean Bottom Seismometer (OBS) survey, show a speedup of 40×. This strategy is next extended to multisource Full Waveform Inversion (FWI) of supergathers for marine streamer data, with the same advantages of computational efficiency and storage savings. In the Finite-Difference Time-Domain (FDTD) method, to mitigate spectral leakage due to delayed onsets of sine waves detected at receivers, I double the simulation time and retain only the second half of the simulated records. To compare with standard FWI, I apply the proposed method to the 2D velocity model of SEG/EAGE salt and to Gulf Of Mexico (GOM) field data, and obtain speedups of about 4× and 8×. Formulas are then derived for the resolution limits of various constituent wavepaths pertaining to FWI: diving waves, primary reflections, diffractions, and multiple reflections. They suggest that inverting multiples can provide some low and intermediate
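
    The key idea above, disjoint frequency bands per shot rotated across iterations, can be sketched as follows (an illustrative round-robin assignment, not the dissertation's actual encoding scheme):

```python
import numpy as np

def assign_frequency_bands(n_shots, n_freq_bins, iteration):
    """Deal the discrete frequency bins out to shots with zero overlap.

    The assignment is rotated at every iteration so that each shot
    eventually visits the whole spectrum.  Returns one index array per
    shot; the bands are disjoint by construction.
    """
    rotated = np.roll(np.arange(n_freq_bins), iteration)
    return [rotated[s::n_shots] for s in range(n_shots)]

# Four shots share 16 bins: the bands are disjoint and jointly exhaustive,
# so the crosstalk between any two shots' spectra is zero.
bands = assign_frequency_bands(n_shots=4, n_freq_bins=16, iteration=0)
print([b.tolist() for b in bands])
```

    Because no bin is shared, each receiver can discard energy at frequencies that do not belong to its associated source, which is the mechanism the abstract uses to discount superfluous sources in marine acquisition.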

  13. ATP as a Multi-target Danger Signal in the Brain

    Directory of Open Access Journals (Sweden)

    Ricardo J Rodrigues

    2015-04-01

    Full Text Available ATP is released in an activity-dependent manner from different cell types in the brain, fulfilling different roles: acting as a neurotransmitter and neuromodulator, mediating astrocyte-to-neuron communication, propagating astrocytic responses and formatting microglial responses. This involves the activation of different ATP P2 receptors (P2R) as well as adenosine receptors upon extracellular ATP catabolism by ecto-nucleotidases. Notably, brain noxious stimuli trigger a sustained increase of extracellular ATP, which plays a key role as a danger signal in the brain. This involves a combined action of extracellular ATP in different cell types, namely increasing the susceptibility of neurons to damage, promoting astrogliosis, and recruiting and formatting microglia to mount neuroinflammatory responses. Such actions involve the activation of different receptors, as heralded by neuroprotective effects resulting from blockade mainly of P2X7R, P2Y1R and adenosine A2A receptors (A2AR), whose hierarchy, cooperation and/or redundancy are still not resolved. These pleiotropic functions of ATP as a danger signal in brain damage prompt a therapeutic interest in multi-targeting different purinergic receptors to provide maximal opportunities for neuroprotection.

  14. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for the investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample, and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and the associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; however, in the hilus, significant directional differences were detected between groups, demonstrating the utility of this approach for the statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
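
    The descriptive Fisher statistics mentioned above, the mean direction and a concentration estimate, can be computed from a sample of unit direction vectors as follows (a generic sketch using the standard estimator kappa = (N - 1) / (N - R), not the authors' code):

```python
import numpy as np

def fisher_stats(vectors):
    """Mean direction, mean resultant length and concentration estimate
    for a sample of 3-D unit direction vectors (Fisher statistics)."""
    v = np.asarray(vectors, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)  # force unit length
    resultant = v.sum(axis=0)
    R = np.linalg.norm(resultant)  # length of the resultant vector
    n = len(v)
    mean_dir = resultant / R
    return mean_dir, R / n, (n - 1) / (n - R)

# Vectors tightly clustered about the z-axis give a mean direction near
# (0, 0, 1) and a large concentration parameter.
sample = [[0.01, 0.00, 1.0], [0.00, 0.02, 1.0],
          [-0.01, -0.01, 1.0], [0.02, -0.01, 1.0]]
mean_dir, r_bar, kappa = fisher_stats(sample)
print(mean_dir, r_bar, kappa)
```

    The mean resultant length R/N approaches 1 for tightly clustered directions and 0 for uniformly scattered ones, which is what makes it a natural dispersion summary for tissue orientation.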

  15. A Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed-building information, owing to its extreme versatility and almost all-weather, day-and-night working capability. Given that the inherent statistical distribution of speckle in SAR images has not been exploited to extract collapsed-building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution of SAR images is used to reflect the uniformity of the target and thereby extract collapsed buildings. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of object texture, but is also applicable to extracting collapsed-building information from single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyze the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is analysed, which provides decision support for data selection in collapsed-building information extraction.

  16. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    Science.gov (United States)

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    In the last decade or so, interest in adaptive design clinical trials has gradually been directed towards their use in regulatory submissions by pharmaceutical drug sponsors to evaluate investigational new drugs. Methodological advances of adaptive designs are abundant in the statistical literature since the 1970s. The adaptive design paradigm has been enthusiastically perceived to increase the efficiency and to be more cost-effective than the fixed design paradigm for drug development. Much interest in adaptive designs is in those studies with two-stages, where stage 1 is exploratory and stage 2 depends upon stage 1 results, but where the data of both stages will be combined to yield statistical evidence for use as that of a pivotal registration trial. It was not until the recent release of the US Food and Drug Administration Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics (2010) that the boundaries of flexibility for adaptive designs were specifically considered for regulatory purposes, including what are exploratory goals, and what are the goals of adequate and well-controlled (A&WC) trials (2002). The guidance carefully described these distinctions in an attempt to minimize the confusion between the goals of preliminary learning phases of drug development, which are inherently substantially uncertain, and the definitive inference-based phases of drug development. In this paper, in addition to discussing some aspects of adaptive designs in a confirmatory study setting, we underscore the value of adaptive designs when used in exploratory trials to improve planning of subsequent A&WC trials. One type of adaptation that is receiving attention is the re-estimation of the sample size during the course of the trial. We refer to this type of adaptation as an adaptive statistical information design. Specifically, a case example is used to illustrate how challenging it is to plan a confirmatory adaptive statistical information

  17. Topology of classical molecular optimal control landscapes for multi-target objectives

    Energy Technology Data Exchange (ETDEWEB)

    Joe-Wong, Carlee [Program in Applied and Computational Mathematics, Princeton University, Princeton, New Jersey 08544-1000 (United States); Ho, Tak-San; Rabitz, Herschel, E-mail: hrabitz@princeton.edu [Department of Chemistry, Princeton University, Princeton, New Jersey 08544-1009 (United States); Wu, Rebing [Department of Automation, Tsinghua University, Beijing (China)

    2015-04-21

    This paper considers laser-driven optimal control of an ensemble of non-interacting molecules whose dynamics lie in classical phase space. The molecules evolve independently under control to distinct final states. We consider a control landscape defined in terms of multi-target (MT) molecular states and analyze the landscape as a functional of the control field. The topology of the MT control landscape is assessed through its gradient and Hessian with respect to the control. Under particular assumptions, the MT control landscape is found to be free of traps that could hinder reaching the objective. The Hessian associated with an optimal control field is shown to have finite rank, indicating an inherent degree of robustness to control noise. Both the absence of traps and rank of the Hessian are shown to be analogous to the situation of specifying multiple targets for an ensemble of quantum states. Numerical simulations are presented to illustrate the classical landscape principles and further characterize the system behavior as the control field is optimized.

  18. Early detection of emerald ash borer infestation using multisourced data: a case study in the town of Oakville, Ontario, Canada

    Science.gov (United States)

    Zhang, Kongwen; Hu, Baoxin; Robinson, Justin

    2014-01-01

    The emerald ash borer (EAB) poses a significant economic and environmental threat to ash trees in southern Ontario, Canada, and the northern states of the USA. It is critical that effective technologies are urgently developed to detect, monitor, and control the spread of EAB. This paper presents a methodology using multisourced data to predict potential infestations of EAB in the town of Oakville, Ontario, Canada. The information combined in this study includes remotely sensed data, such as high spatial resolution aerial imagery, commercial ground and airborne hyperspectral data, and Google Earth imagery, in addition to nonremotely sensed data, such as archived paper maps and documents. This wide range of data provides extensive information that can be used for early detection of EAB, yet their effective employment and use remain a significant challenge. A prediction function was developed to estimate the EAB infestation states of individual ash trees using three major attributes: leaf chlorophyll content, tree crown spatial pattern, and prior knowledge. Comparison between these predicted values and a ground-based survey demonstrated an overall accuracy of 62.5%, with 22.5% omission and 18.5% commission errors.

  19. Effects of multi-source feedback on developmental plans for leaders of postgraduate medical education

    DEFF Research Database (Denmark)

    Malling, Bente; Bonderup, Thomas; Mortensen, Lene

    2009-01-01

    OBJECTIVES: Multi-source feedback (MSF) is a widely used developmental tool for leaders in organisations including those dealing with health care. This study was performed to examine the effects of an MSF process on developmental plans made by leaders of postgraduate medical education (PGME) ... for both management and leadership performance areas. The developmental plans mainly focused on management initiatives, whereas plans for the development of leadership performance were few. Areas rated low by all respondents were scarcely represented in CREs' developmental plans. CONCLUSIONS: An MSF process might in itself lead to development in administrative areas. However, MSF carried through as a single stand-alone procedure was not sufficient to foster plans for the development of leadership performance.

  20. Multi-source feature extraction and target recognition in wireless sensor networks based on adaptive distributed wavelet compression algorithms

    Science.gov (United States)

    Hortos, William S.

    2008-04-01

    Proposed distributed wavelet-based algorithms are a means to compress sensor data received at the nodes forming a wireless sensor network (WSN) by exchanging information between neighboring sensor nodes. Local collaboration among nodes compacts the measurements, yielding a reduced fused set with equivalent information at far fewer nodes. Nodes may be equipped with multiple sensor types, each capable of sensing distinct phenomena: thermal, humidity, chemical, voltage, or image signals with low or no frequency content as well as audio, seismic or video signals within defined frequency ranges. Compression of the multi-source data through wavelet-based methods, distributed at active nodes, reduces downstream processing and storage requirements along the paths to sink nodes; it also enables noise suppression and more energy-efficient query routing within the WSN. Targets are first detected by the multiple sensors; then wavelet compression and data fusion are applied to the target returns, followed by feature extraction from the reduced data; feature data are input to target recognition/classification routines; targets are tracked during their sojourns through the area monitored by the WSN. Algorithms to perform these tasks are implemented in a distributed manner, based on a partition of the WSN into clusters of nodes. In this work, a scheme of collaborative processing is applied for hierarchical data aggregation and decorrelation, based on the sensor data itself and any redundant information, enabled by a distributed, in-cluster wavelet transform with lifting that allows multiple levels of resolution. The wavelet-based compression algorithm significantly decreases RF bandwidth and other resource use in target processing tasks. Following wavelet compression, features are extracted. The objective of feature extraction is to maximize the probabilities of correct target classification based on multi-source sensor measurements, while minimizing the resource expenditures at
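    The in-cluster wavelet transform mentioned above rests on the lifting scheme: each level is built from a predict step (a neighbor predicts a sample; the residual becomes a detail coefficient) and an update step (the coarse signal preserves the running average). The following is a minimal single-level sketch of that idea, not the paper's distributed algorithm; the sensor readings are hypothetical.

```python
import numpy as np

def haar_lifting_forward(x):
    """One level of a Haar-like lifting transform:
    split -> predict (detail) -> update (approximation)."""
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    detail = odd - even            # predict: the even neighbor predicts the odd sample
    approx = even + detail / 2.0   # update: approximation keeps the pairwise mean
    return approx, detail

def haar_lifting_inverse(approx, detail):
    """Exact inverse of the forward lifting steps, applied in reverse order."""
    even = approx - detail / 2.0
    odd = even + detail
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

readings = np.array([20.1, 20.2, 20.0, 19.9, 25.0, 25.1, 20.3, 20.2])
a, d = haar_lifting_forward(readings)
# Smooth fields yield near-zero details; zeroing small details before
# transmission is the (lossy) compression step that saves RF bandwidth.
d_compressed = np.where(np.abs(d) > 0.5, d, 0.0)
recovered = haar_lifting_inverse(a, d_compressed)
```

Without thresholding, the transform is perfectly invertible; the bandwidth saving comes entirely from the thresholded details.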

  1. Statistical methods of combining information: Applications to sensor data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.

  2. An Automatic Framework Using Space-Time Processing and TR-MUSIC for Subsurface and Through-Wall Multitarget Imaging

    Directory of Open Access Journals (Sweden)

    Si-hao Tan

    2012-01-01

    We present an automatic framework combining space-time signal processing with Time Reversal electromagnetic (EM) inversion for subsurface and through-wall multitarget imaging using electromagnetic waves. This framework is composed of a frequency-wavenumber (FK) filter to suppress the direct wave and medium bounce, an FK migration algorithm to automatically estimate the number of targets and identify target regions, which can be used to reduce the computational complexity of the subsequent imaging algorithm, and an EM inversion algorithm using Time Reversal Multiple Signal Classification (TR-MUSIC) to reconstruct hidden objects. The feasibility of the framework is demonstrated with simulated data generated by GPRMAX.

  3. Can a manager have a life and a career? International and multisource perspectives on work-life balance and career advancement potential.

    Science.gov (United States)

    Lyness, Karen S; Judiesch, Michael K

    2008-07-01

    The present study was the first cross-national examination of whether managers who were perceived to be high in work-life balance were expected to be more or less likely to advance in their careers than were less balanced, more work-focused managers. Using self ratings, peer ratings, and supervisor ratings of 9,627 managers in 33 countries, the authors examined within-source and multisource relationships with multilevel analyses. The authors generally found that managers who were rated higher in work-life balance were rated higher in career advancement potential than were managers who were rated lower in work-life balance. However, national gender egalitarianism, measured with Project GLOBE scores, moderated relationships based on supervisor and self ratings, with stronger positive relationships in low egalitarian cultures. The authors also found 3-way interactions of work-life balance ratings, ratee gender, and gender egalitarianism in multisource analyses in which self balance ratings predicted supervisor and peer ratings of advancement potential. Work-life balance ratings were positively related to advancement potential ratings for women in high egalitarian cultures and men in low gender egalitarian cultures, but relationships were nonsignificant for men in high egalitarian cultures and women in low egalitarian cultures.

  4. AUTOMATIC REGISTRATION OF MULTI-SOURCE DATA USING MUTUAL INFORMATION

    Directory of Open Access Journals (Sweden)

    E. G. Parmehr

    2012-07-01

    Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imagery and 3D point clouds, is presented. Two different methods, a Gaussian Mixture Model (GMM) and Kernel Density Estimation, have been used to define the weight function of the joint probability, regardless of the modality of the data being registered. The Expectation Maximization method is then used to estimate the parameters of the GMM, and a multi-resolution strategy has been used to reduce the cost of computation. The performance of the proposed GWMI method for the registration of aerial orthoimagery and LiDAR range and intensity information has been experimentally evaluated, and the results obtained are presented.
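    As background to the MI criterion used above (though not the GWMI weighting itself), mutual information between two images can be estimated from their joint intensity histogram; registration then searches for the transform that maximizes it. A minimal sketch with synthetic data:

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Plug-in MI estimate for two co-registered images, in nats,
    computed from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of image b
    nz = pxy > 0                              # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
img = rng.random((64, 64))
# MI is high for the same scene under an intensity remapping (as between
# sensor modalities) and near zero for an unrelated image:
mi_aligned = mutual_information(img, 1.0 - img)
mi_random = mutual_information(img, rng.random((64, 64)))
```

In a registration loop, one image is warped by candidate transform parameters and the parameters maximizing the MI score are kept; the GWMI weight function of the paper modifies the joint probability term.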

  5. Laboratory-based surveillance of pertussis using multitarget real-time PCR in Japan: evidence for Bordetella pertussis infection in preteens and teens

    Directory of Open Access Journals (Sweden)

    K. Kamachi

    2015-11-01

    Between January 2013 and December 2014, we conducted laboratory-based surveillance of pertussis using multitarget real-time PCR, which discriminates among Bordetella pertussis, Bordetella parapertussis, Bordetella holmesii and Mycoplasma pneumoniae. Of 355 patients clinically diagnosed with pertussis in Japan, B. pertussis, B. parapertussis and M. pneumoniae were detected in 26% (n = 94), 1.1% (n = 4) and 0.6% (n = 2), respectively, whereas B. holmesii was not detected. It was confirmed that B. parapertussis and M. pneumoniae are also responsible for causing pertussis-like illness. The positive rates for B. pertussis ranged from 16% to 49%, depending on age. Infants aged ≤ 3 months had the highest rate (49%), and children aged 1 to 4 years had the lowest rate (16%; p < 0.01 vs. infants aged ≤ 3 months). Persons aged 10 to 14 and 15 to 19 years also showed high positive rates (29% each); these positive rates were not statistically significantly different from that of infants aged ≤ 3 months (p ≥ 0.06). Our observations indicate that, similar to infants, preteens and teens are at high risk of B. pertussis infection.

  6. APPROACH TO CONSTRUCTING 3D VIRTUAL SCENE OF IRRIGATION AREA USING MULTI-SOURCE DATA

    Directory of Open Access Journals (Sweden)

    S. Cheng

    2015-10-01

    For an irrigation area that is often complicated by various 3D artificial ground features and the natural environment, the disadvantages of traditional 2D GIS in spatial data representation, management, query, analysis and visualization are becoming more and more evident. Building a more realistic 3D virtual scene is thus especially urgent for irrigation area managers and decision makers, so that they can carry out various irrigational operations vividly and intuitively. Building on previous researchers' achievements, a simple, practical and cost-effective approach is proposed in this study, adopting 3D geographic information system (3D GIS) and remote sensing (RS) technology. Based on multi-source data such as Google Earth (GE) high-resolution remote sensing imagery, ASTER G-DEM, hydrological facility maps and so on, a 3D terrain model and ground feature models were created interactively. Both models were then rendered with texture data and integrated under the ArcGIS platform. A vivid, realistic 3D virtual scene of the irrigation area, with good visual effect and primary GIS functions for data query and analysis, was constructed. Yet there is still a long way to go towards a true 3D GIS for the irrigation area: issues of this study are discussed in depth and future research directions are pointed out at the end of the paper.

  7. Multi-targeted and aggressive treatment of patients with type 2 diabetes at high risk

    DEFF Research Database (Denmark)

    Gaede, P; Pedersen, O

    2005-01-01

    Results from many single risk factor intervention trials and the multi-targeted Steno-2 trial in the last few years have provided a strong case that management of type 2 diabetes in all age groups requires a structured and intensified approach that is far more than just glucocentric, an approach...... addressing additional cardiovascular risk factors including hypertension, dyslipidaemia, sedentary behaviour, smoking and dietary habits causing insulin resistance and pro-inflammation. This type of integrated therapy applied for almost 8 years to high-risk type 2 diabetic patients has cut the relative risk......-driven polypharmacy and simple but focused behaviour modelling with continuous education, motivation and trouble-shooting for treatment barriers identified for the patient and the care giver. It is high time we transfer these experiences and major health benefits gained in the 'green house' of controlled clinical...

  8. Statistics information of rice EST mapping results - RGP estmap2001 | LSDB Archive [Life Science Database Archive metadata]

    Lifescience Database Archive (English)

    Statistics information of rice EST mapping results (RGP estmap2001), archived in the LSDB Archive (Life Science Database Archive).

  9. Separation of non-stationary multi-source sound field based on the interpolated time-domain equivalent source method

    Science.gov (United States)

    Bi, Chuan-Xing; Geng, Lin; Zhang, Xiao-Zheng

    2016-05-01

    In the sound field with multiple non-stationary sources, the measured pressure is the sum of the pressures generated by all sources, and thus cannot be used directly for studying the vibration and sound radiation characteristics of every source alone. This paper proposes a separation model based on the interpolated time-domain equivalent source method (ITDESM) to separate the pressure field belonging to every source from the non-stationary multi-source sound field. In the proposed method, ITDESM is first extended to establish the relationship between the mixed time-dependent pressure and all the equivalent sources distributed on every source with known location and geometry information, and all the equivalent source strengths at each time step are solved by an iterative solving process; then, the corresponding equivalent source strengths of one interested source are used to calculate the pressure field generated by that source alone. Numerical simulation of two baffled circular pistons demonstrates that the proposed method can be effective in separating the non-stationary pressure generated by every source alone in both time and space domains. An experiment with two speakers in a semi-anechoic chamber further evidences the effectiveness of the proposed method.
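    The separation principle (not the ITDESM algorithm itself, which operates in the time domain with interpolation) can be illustrated with a single-frequency equivalent-source model: the mixed pressure is linear in the strengths of monopoles placed on each source of known location, so one least-squares solve recovers the per-source strengths, and the field of one source alone follows by forward propagation. All geometry and values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 2 * np.pi * 500 / 343.0                       # wavenumber at 500 Hz in air

def greens(src, mic):
    """Free-field monopole transfer matrix between source and mic positions."""
    r = np.linalg.norm(mic[:, None, :] - src[None, :, :], axis=2)
    return np.exp(-1j * k * r) / (4 * np.pi * r)

# Two sources with known locations, each modeled by one equivalent monopole.
src1 = np.array([[0.0, 0.0, 0.0]])
src2 = np.array([[0.5, 0.0, 0.0]])
mics = np.column_stack([rng.uniform(-1, 1, 16), rng.uniform(-1, 1, 16),
                        np.full(16, 0.3)])        # measurement plane at z = 0.3 m

G = np.hstack([greens(src1, mics), greens(src2, mics)])
q_true = np.array([1.0 + 0.5j, -0.8 + 0.2j])      # true source strengths
p_mixed = G @ q_true                               # measured (mixed) pressure

q_est, *_ = np.linalg.lstsq(G, p_mixed, rcond=None)
p_src1_only = greens(src1, mics) @ q_est[:1]       # field of source 1 alone
```

With more equivalent sources than measurements, or non-stationary signals as in the paper, the solve becomes ill-posed and requires the iterative, time-stepped treatment the authors describe.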

  10. Conference: Statistical Physics and Biological Information

    International Nuclear Information System (INIS)

    Gross, David J.; Hwa, Terence

    2001-01-01

    In the spring of 2001, the Institute for Theoretical Physics ran a 6-month scientific program on Statistical Physics and Biological Information. This program was organized by Walter Fitch (UC Irvine), Terence Hwa (UC San Diego), Luca Peliti (University Federico II, Naples), Gary Stormo (Washington University School of Medicine) and Chao Tang (NEC). Overall scientific supervision was provided by David Gross, Director, ITP. The ITP has an online conference/program proceeding which consists of audio and transparencies of almost all of the talks held during this program. Over 100 talks are available on the site at http://online.kitp.ucsb.edu/online/infobio01/

  11. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information.

    Science.gov (United States)

    Perlin, Mark William

    2015-01-01

    DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI(-1) value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI(-1)) values were examined and compared with corresponding log(LR) values. The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI(-1) increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. A quantitative
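    For reference, the CPI statistic examined above is conventionally computed per locus as the squared sum of the population frequencies of every allele observed in the mixture, multiplied across loci, with the inverse reported as a "one in N" figure. A sketch with hypothetical allele frequencies (not case data):

```python
def combined_probability_of_inclusion(loci_allele_freqs):
    """CPI: product over loci of (sum of observed-allele frequencies)^2.
    Any person whose genotype uses only observed alleles is 'included'."""
    cpi = 1.0
    for freqs in loci_allele_freqs:
        cpi *= sum(freqs) ** 2
    return cpi

# Hypothetical 3-locus mixture; each entry lists the population frequencies
# of the alleles observed in the mixture at that locus.
mixture = [[0.10, 0.20, 0.15], [0.05, 0.30], [0.12, 0.08, 0.25]]
cpi = combined_probability_of_inclusion(mixture)
cpi_inverse = 1.0 / cpi   # the 'one in N' statistic reported in court
```

Because each locus contributes a factor less than one, CPI⁻¹ grows with every locus tested regardless of how informative the mixture actually is, which is the law-of-large-numbers behavior the study criticizes.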

  12. Multisource Estimation of Long-term Global Terrestrial Surface Radiation

    Science.gov (United States)

    Peng, L.; Sheffield, J.

    2017-12-01

    Land surface net radiation is the essential energy source at the earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components in net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation such as global dimming or brightening have important implications for hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While large progress in measuring top of atmosphere energy budget has been made, huge discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, due to the lack of observational networks, the difficulty in measuring from space, and the uncertainty in algorithm parameters. To overcome the weakness of single source datasets, we propose a multi-source merging approach to fully utilize and combine multiple datasets of radiation components separately, as they are complementary in space and time. First, we conduct diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of multiple datasets by minimizing the variance of error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties. The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual
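    The "optimal weighted average of multiple datasets by minimizing the variance of error" step is, under independence assumptions, the classical inverse-variance (minimum-variance unbiased) combination. A sketch with hypothetical numbers, not the authors' datasets:

```python
import numpy as np

def merge_inverse_variance(estimates, variances):
    """Minimum-variance unbiased combination of independent estimates:
    weights proportional to 1/variance; the merged variance is
    1 / sum(1/variance_i), never larger than the best single input."""
    estimates = np.asarray(estimates, float)
    variances = np.asarray(variances, float)
    w = 1.0 / variances
    merged = (w * estimates).sum(axis=0) / w.sum(axis=0)
    merged_var = 1.0 / w.sum(axis=0)
    return merged, merged_var

# Hypothetical net-radiation estimates (W/m^2) at one grid cell from a
# satellite retrieval, a reanalysis, and a station-based product, with
# error variances judged against in-situ measurements such as GEBA.
est, var = merge_inverse_variance([112.0, 105.0, 118.0], [25.0, 100.0, 64.0])
```

The merged estimate leans toward the lowest-variance input, and its variance beats every individual product, which is the motivation for multi-source merging.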

  13. Epigenetic polypharmacology: from combination therapy to multitargeted drugs.

    Science.gov (United States)

    de Lera, Angel R; Ganesan, A

    The modern drug discovery process has largely focused its attention on so-called magic bullets, single chemical entities that exhibit high selectivity and potency for a particular target. This approach was based on the assumption that the deregulation of a protein was causally linked to a disease state, and that pharmacological intervention through inhibition of the deregulated target was able to restore normal cell function. However, the use of cocktails or multicomponent drugs to address several targets simultaneously is also popular for treating multifactorial diseases such as cancer and neurological disorders. We review the state of the art with such combinations that have an epigenetic target as one of their mechanisms of action. Epigenetic drug discovery is a rapidly advancing field, and drugs targeting epigenetic enzymes are in the clinic for the treatment of hematological cancers. Approved and experimental epigenetic drugs are undergoing clinical trials in combination with other therapeutic agents, via fused or linked pharmacophores, in order to benefit from the synergistic effects of polypharmacology. In addition, ligands are being discovered which, as single chemical entities, are able to modulate multiple epigenetic targets simultaneously (multitarget epigenetic drugs). These multiple ligands should in principle have a lower risk of drug-drug interactions and drug resistance compared to cocktails or multicomponent drugs. This new generation may rival the so-called magic bullets in the treatment of diseases that arise as a consequence of the deregulation of multiple signaling pathways, provided the challenge of optimizing the activities shown by the pharmacophores against the different targets is addressed.

  14. Histone deacetylase inhibitors (HDACIs): multitargeted anticancer agents.

    Science.gov (United States)

    Ververis, Katherine; Hiong, Alison; Karagiannis, Tom C; Licciardi, Paul V

    2013-01-01

    Histone deacetylase (HDAC) inhibitors are an emerging class of therapeutics with potential as anticancer drugs. The rationale for developing HDAC inhibitors (and other chromatin-modifying agents) as anticancer therapies arose from the understanding that in addition to genetic mutations, epigenetic changes such as dysregulation of HDAC enzymes can alter phenotype and gene expression, disturb homeostasis, and contribute to neoplastic growth. The family of HDAC inhibitors is large and diverse. It includes a range of naturally occurring and synthetic compounds that differ in terms of structure, function, and specificity. HDAC inhibitors have multiple cell type-specific effects in vitro and in vivo, such as growth arrest, cell differentiation, and apoptosis in malignant cells. HDAC inhibitors have the potential to be used as monotherapies or in combination with other anticancer therapies. Currently, there are two HDAC inhibitors that have received approval from the US FDA for the treatment of cutaneous T-cell lymphoma: vorinostat (suberoylanilide hydroxamic acid, Zolinza) and depsipeptide (romidepsin, Istodax). More recently, depsipeptide has also gained FDA approval for the treatment of peripheral T-cell lymphoma. Many more clinical trials assessing the effects of various HDAC inhibitors on hematological and solid malignancies are currently being conducted. Despite the proven anticancer effects of particular HDAC inhibitors against certain cancers, many aspects of HDAC enzymes and HDAC inhibitors are still not fully understood. Increasing our understanding of the effects of HDAC inhibitors, their targets and mechanisms of action will be critical for the advancement of these drugs, especially to facilitate the rational design of HDAC inhibitors that are effective as antineoplastic agents. This review will discuss the use of HDAC inhibitors as multitargeted therapies for malignancy. Further, we outline the pharmacology and mechanisms of action of HDAC inhibitors while

  15. INLINING 3D RECONSTRUCTION, MULTI-SOURCE TEXTURE MAPPING AND SEMANTIC ANALYSIS USING OBLIQUE AERIAL IMAGERY

    Directory of Open Access Journals (Sweden)

    D. Frommholz

    2016-06-01

    This paper proposes an in-line method for the simplified reconstruction of city buildings from nadir and oblique aerial images that are simultaneously used for multi-source texture mapping with minimal resampling. Further, the resulting unrectified texture atlases are analyzed for façade elements like windows to be reintegrated into the original 3D models. Tests on real-world data of Heligoland/Germany comprising more than 800 buildings exposed a median positional deviation of 0.31 m at the façades compared to the cadastral map, a correctness of 67% for the detected windows and good visual quality when rendered with GPU-based perspective correction. As part of the process, building reconstruction takes the oriented input images and transforms them into dense point clouds by semi-global matching (SGM). The point sets undergo local RANSAC-based regression and topology analysis to detect adjacent planar surfaces and determine their semantics. Based on this information the roof, wall and ground surfaces found are intersected and limited in their extension to form a closed 3D building hull. For texture mapping, the hull polygons are projected into each possible input bitmap to find suitable color sources regarding coverage and resolution. Occlusions are detected by ray-casting a full-scale digital surface model (DSM) of the scene and stored in pixel-precise visibility maps. These maps are used to derive overlap statistics and radiometric adjustment coefficients to be applied when the visible image parts for each building polygon are copied into a compact texture atlas without resampling whenever possible. The atlas bitmap is passed to a commercial object-based image analysis (OBIA) tool running a custom rule set to identify windows on the contained façade patches. Following multi-resolution segmentation and classification based on brightness and contrast differences, potential window objects are evaluated against geometric
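    The "local RANSAC-based regression" used to find planar roof and wall surfaces can be sketched as a standard RANSAC plane fit (a generic illustration on synthetic data, not the authors' implementation):

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.05, rng=None):
    """Fit a plane to a noisy point cloud by RANSAC: sample 3 points,
    build the plane through them, count inliers within `tol` of it,
    and keep the model with the most inliers."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers = np.zeros(len(points), bool)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue                          # degenerate (collinear) sample
        normal /= norm
        dist = np.abs((points - p0) @ normal)  # point-to-plane distances
        inliers = dist < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

rng = np.random.default_rng(42)
# Synthetic flat roof patch (z ~ 0) plus scattered clutter points.
roof = np.column_stack([rng.uniform(0, 10, 300), rng.uniform(0, 10, 300),
                        np.zeros(300)]) + rng.normal(0, 0.01, (300, 3))
clutter = rng.uniform(0, 10, (60, 3))
inliers = ransac_plane(np.vstack([roof, clutter]))
```

In a full pipeline the inlier set would be refined by least-squares regression and the remaining points searched for further planes; topology analysis then labels the planes as roof, wall or ground.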

  16. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    International Nuclear Information System (INIS)

    Tadaki, Kohtaro

    2010-01-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed in our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as the partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T in (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
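    For orientation, the thermodynamic quantities mentioned are defined over the halting programs of an optimal prefix-free machine $U$; as we recall them from Tadaki's formulation (notation may differ slightly from the cited papers):

```latex
Z(T) = \sum_{p \in \operatorname{dom} U} 2^{-|p|/T}, \qquad
F(T) = -T \log_2 Z(T), \qquad
E(T) = \frac{1}{Z(T)} \sum_{p \in \operatorname{dom} U} |p|\, 2^{-|p|/T}, \qquad
S(T) = \frac{E(T) - F(T)}{T}.
```

At $T = 1$ the partition function $Z(1)$ is Chaitin's halting probability $\Omega$, whose partial randomness is exactly $1$.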

  17. Multi-target activity of Hemidesmus indicus decoction against innovative HIV-1 drug targets and characterization of Lupeol mode of action.

    Science.gov (United States)

    Esposito, Francesca; Mandrone, Manuela; Del Vecchio, Claudia; Carli, Ilaria; Distinto, Simona; Corona, Angela; Lianza, Mariacaterina; Piano, Dario; Tacchini, Massimo; Maccioni, Elias; Cottiglia, Filippo; Saccon, Elisa; Poli, Ferruccio; Parolin, Cristina; Tramontano, Enzo

    2017-08-31

    Despite the availability of several anti-retrovirals, there is still an urgent need for developing novel therapeutic strategies and finding new drugs against underexplored HIV-1 targets. Among them are the HIV-1 reverse transcriptase (RT)-associated ribonuclease H (RNase H) function and the cellular α-glucosidase, involved in the control mechanisms of N-linked glycoprotein formation in the endoplasmic reticulum. It is known that many natural compounds, such as pentacyclic triterpenes, are a promising class of HIV-1 inhibitors. Hence, here we tested the pentacyclic triterpene Lupeol, showing that it inhibits the HIV-1 RT-associated RNase H function. We then performed combination studies of Lupeol and the active-site RNase H inhibitor RDS1759, and blind docking calculations, demonstrating that Lupeol binds to an HIV-1 RT allosteric pocket. On the basis of these results, and searching for a potential multitarget active drug supplement, we also investigated the anti-HIV-1 activity of Hemidesmus indicus, an Ayurvedic medicinal plant containing Lupeol. The results supported the potential of this plant as a valuable multitarget active drug source. In fact, by virtue of its numerous active metabolites, H. indicus was able to inhibit not only the RT-associated RNase H function, but also the HIV-1 RT-associated RNA-dependent DNA polymerase activity and the cellular α-glucosidase. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

    Directory of Open Access Journals (Sweden)

    Lamia Gaouar

    2018-06-01

    From the human-computer interface perspective, the challenges to be faced are related to the consideration of new, multiple interactions and the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning, ...) and the diversification of interaction devices can be seen as a factor of flexibility, albeit one introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to human-computer interfaces, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour, providing end-users with an interaction set adapted to the input/output of terminals and an optimum layout. Keywords: Model driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

  19. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    Science.gov (United States)

    2016-05-12

    Final Report: Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory. Reporting period 15-May-2014 to 14-Feb-2015; report dated 12-05-2016; distribution unlimited. Sponsoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: mathematical statistics; time series; Markov chains; random ... Three areas

  20. [Location information acquisition and sharing application design in national census of Chinese medicine resources].

    Science.gov (United States)

    Zhang, Xiao-Bo; Li, Meng; Wang, Hui; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    The literature contains much information on the distribution of Chinese herbal medicine, but, limited by the technical methods available, the origins and distributions recorded in ancient literature were described only roughly. Establishing background information on the types and distribution of Chinese medicine resources in each region is one of the main objectives of the national census of Chinese medicine resources. Following the national census technical specifications for Chinese medicine resources and pilot work experience, census teams can effectively collect the location information of traditional Chinese medicine resources using "3S" technology, computer network technology, digital camera technology, and other modern technical methods. Detailed and specific location information, such as regional differences and similarities in resource endowment and the biological characteristics and spatial distribution of resources, provides technical and data support for evaluating the accuracy and objectivity of the census data. With the support of spatial information technology, location-based statistical summarization and sharing of multi-source census data can be realized. The spatial integration, aggregation, and management of massive traditional Chinese medicine resource data and related basic data can help in mining the scientific rules of traditional Chinese medicine resources at the overall level and fully reveal their scientific connotation. Copyright© by the Chinese Pharmaceutical Association.

  1. Site-characterization information using LANDSAT satellite and other remote-sensing data: integration of remote-sensing data with geographic information systems. A case study in Pennsylvania

    International Nuclear Information System (INIS)

    Campbell, W.J.; Imhoff, M.L.; Robinson, J.; Gunther, F.; Boyd, R.; Anuta, M.

    1983-06-01

    The utility and cost effectiveness of incorporating digitized aircraft and satellite remote sensing data into a geographic information system for facility siting and environmental impact assessments was evaluated. This research focused on the evaluation of several types of multisource remotely sensed data representing a variety of spectral band widths and spatial resolution. High resolution aircraft photography, Landsat MSS, and 7 band Thematic Mapper Simulator (TMS) data were acquired, analyzed, and evaluated for their suitability as input to an operational geographic information system (GIS). 78 references, 59 figures, 74 tables

  2. Statistics for library and information services a primer for using open source R software for accessibility and visualization

    CERN Document Server

    Friedman, Alon

    2016-01-01

    Statistics for Library and Information Services, written for non-statisticians, provides logical, user-friendly, and step-by-step instructions to make statistics more accessible for students and professionals in the field of Information Science. It emphasizes concepts of statistical theory and data collection methodologies, but also extends to the topics of visualization creation and display, so that the reader will be able to better conduct statistical analysis and communicate his/her findings. The book is tailored for information science students and professionals. It has specific examples of dataset sets, scripts, design modules, data repositories, homework assignments, and a glossary lexicon that matches the field of Information Science. The textbook provides a visual road map that is customized specifically for Information Science instructors, students, and professionals regarding statistics and visualization. Each chapter in the book includes full-color illustrations on how to use R for the statistical ...

  3. Estimating Vegetation Primary Production in the Heihe River Basin of China with Multi-Source and Multi-Scale Data.

    Directory of Open Access Journals (Sweden)

    Tianxiang Cui

    Full Text Available Estimating gross primary production (GPP) and net primary production (NPP) is of great importance in studying carbon cycles. Using models driven by multi-source and multi-scale data is a promising approach to estimate GPP and NPP at regional and global scales. With a focus on data that are openly accessible, this paper presents a GPP and NPP model driven by remotely sensed data and meteorological data with spatial resolutions varying from 30 m to 0.25 degree and temporal resolutions ranging from 3 hours to 1 month, integrating remote sensing techniques and eco-physiological process theories. Our model is also designed as part of the Multi-source data Synergized Quantitative (MuSyQ) Remote Sensing Production System. In the presented MuSyQ-NPP algorithm, daily GPP for a 10-day period was calculated as the product of incident photosynthetically active radiation (PAR) and its fraction absorbed by vegetation (FPAR) using a light use efficiency (LUE) model. The autotrophic respiration (Ra) was determined using eco-physiological process theories, and the daily NPP was obtained as the balance between GPP and Ra. To test its feasibility at regional scales, our model was run over an arid and semi-arid region of the Heihe River Basin, China, to generate daily GPP and NPP during the growing season of 2012. The results indicated that both GPP and NPP exhibit clear spatial and temporal patterns in their distribution over the Heihe River Basin during the growing season due to the temperature, water, and solar influx conditions. Validated against ground-based measurements, the MODIS GPP product (MOD17A2H), and results reported in recent literature, the MuSyQ-NPP algorithm yielded an RMSE of 2.973 gC m(-2) d(-1) and an R of 0.842 against ground-based GPP, whereas MODIS GPP achieved an RMSE of 8.010 gC m(-2) d(-1) and an R of 0.682; the estimated NPP values were also well within the range of previous literature, which supports the reliability of the MuSyQ-NPP algorithm.
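The LUE bookkeeping described in the MuSyQ-NPP abstract above reduces to a small formula: GPP is absorbed PAR scaled by an efficiency term, and NPP is the balance between GPP and Ra. The sketch below is illustrative only; the coefficient values and the respiration fraction are placeholders, not the MuSyQ-NPP parameters.

```python
def gpp_lue(par, fpar, eps_max, t_scalar, w_scalar):
    """Daily GPP (gC/m2/d) as a light-use-efficiency product: absorbed
    PAR (par * fpar) times a maximum efficiency eps_max down-regulated
    by temperature and water stress scalars in [0, 1]."""
    return par * fpar * eps_max * t_scalar * w_scalar

def npp(gpp, ra):
    """NPP is the balance between GPP and autotrophic respiration Ra."""
    return gpp - ra

# Made-up forcing for one day of a 10-day period (values are not from the paper)
gpp_day = gpp_lue(par=8.0, fpar=0.6, eps_max=1.8, t_scalar=0.9, w_scalar=0.7)
ra_day = 0.45 * gpp_day  # assumed respiration fraction, for illustration only
print(round(npp(gpp_day, ra_day), 3))  # prints the resulting daily NPP
```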

  4. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf

  5. Interference contrast in multi-source few photon optics

    OpenAIRE

    Laskowski, Wieslaw; Wiesniak, Marcin; Zukowski, Marek; Bourennane, Mohamed; Weinfurter, Harald

    2009-01-01

    Many recent experiments employ several parametric down conversion (PDC) sources to get multiphoton interference. Such interference has applications in quantum information. We study here how effects due to photon statistics, misalignment, and partial distinguishability of the PDC pairs originating from different sources may lower the interference contrast in the multiphoton experiments.

  6. Toddlers favor communicatively presented information over statistical reliability in learning about artifacts.

    Directory of Open Access Journals (Sweden)

    Hanna Marno

    Full Text Available Observed associations between events can be validated by statistical information of reliability or by testament of communicative sources. We tested whether toddlers learn from their own observation of efficiency, assessed by statistical information on reliability of interventions, or from communicatively presented demonstration, when these two potential types of evidence of validity of interventions on a novel artifact are contrasted with each other. Eighteen-month-old infants observed two adults, one operating the artifact by a method that was more efficient (2/3 probability of success) than that of the other (1/3 probability of success). Compared to the Baseline condition, in which communicative signals were not employed, infants tended to choose the less reliable method to operate the artifact when this method was demonstrated in a communicative manner in the Experimental condition. This finding demonstrates that, in certain circumstances, communicative sanctioning of reliability may override statistical evidence for young learners. Such a bias can serve fast and efficient transmission of knowledge between generations.

  7. Identifying Statistical Dependence in Genomic Sequences via Mutual Information Estimates

    Directory of Open Access Journals (Sweden)

    Wojciech Szpankowski

    2007-12-01

    Full Text Available Questions of understanding and quantifying the representation and amount of information in organisms have become a central part of biological research, as they potentially hold the key to fundamental advances. In this paper, we demonstrate the use of information-theoretic tools for the task of identifying segments of biomolecules (DNA or RNA that are statistically correlated. We develop a precise and reliable methodology, based on the notion of mutual information, for finding and extracting statistical as well as structural dependencies. A simple threshold function is defined, and its use in quantifying the level of significance of dependencies between biological segments is explored. These tools are used in two specific applications. First, they are used for the identification of correlations between different parts of the maize zmSRp32 gene. There, we find significant dependencies between the 5′ untranslated region in zmSRp32 and its alternatively spliced exons. This observation may indicate the presence of as-yet unknown alternative splicing mechanisms or structural scaffolds. Second, using data from the FBI's combined DNA index system (CODIS, we demonstrate that our approach is particularly well suited for the problem of discovering short tandem repeats—an application of importance in genetic profiling.
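The plug-in mutual information estimate underlying such dependency tests can be sketched in a few lines from empirical joint and marginal frequencies; the paper's significance threshold function is not reproduced here, and the sequences below are toy data, not biological segments.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from two aligned symbol
    sequences, using empirical joint and marginal frequencies."""
    assert len(x) == len(y)
    n = len(x)
    joint = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        # p_ab * n * n / (px * py) is p(a,b) / (p(a) * p(b))
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi

# Identical segments give I = H(X) = 2 bits for a uniform 4-symbol alphabet
print(round(mutual_information("ACGT" * 8, "ACGT" * 8), 3))  # → 2.0
```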

  8. Statistical information 1971-76. From the National Institute of Radiation Protection

    International Nuclear Information System (INIS)

    1978-01-01

    This report includes statistical information about the work performed at the National Institute of Radiation Protection, Sweden, during the period 1971-1976, as well as about the different fields causing the intervention by the institute. (E.R.)

  9. Generalized information fusion and visualization using spatial voting and data modeling

    Science.gov (United States)

    Jaenisch, Holger M.; Handley, James W.

    2013-05-01

    We present a novel and innovative information fusion and visualization framework for multi-source intelligence (multiINT) data using Spatial Voting (SV) and Data Modeling. We describe how different sources of information can be converted into numerical form for further processing downstream, followed by a short description of how this information can be fused using the SV grid. As an illustrative example, we show the modeling of cyberspace as cyber layers for the purpose of tracking cyber personas. Finally we describe a path ahead for creating interactive agile networks through defender customized Cyber-cubes for network configuration and attack visualization.

  10. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    Science.gov (United States)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

    The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new and there has been a significant effort within the geospatial community to develop nested gridding standards to tackle these issues over many years. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard which has been developed under the auspices of Open Geospatial Consortium (OGC). DGGS provide a fixed areal based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principle aspect of a DGGS. Data integration, decomposition, and aggregation is optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. 
During
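The hierarchical cell indexing that makes aggregation and decomposition cheap in a DGGS can be illustrated with a toy latitude-longitude quadtree. This is not the OGC DGGS standard's tessellation, only a minimal sketch of the prefix-indexing idea: each extra level appends one base-4 digit, and prefixes of an index are its coarser parent cells.

```python
def quadtree_cell(lat, lon, level):
    """Hierarchical cell index for a point: at each refinement level the
    current cell splits into four children and one base-4 digit is
    appended. Prefixes of the index identify the coarser parent cells."""
    s, w, n, e = -90.0, -180.0, 90.0, 180.0
    digits = []
    for _ in range(level):
        mid_lat, mid_lon = (s + n) / 2, (w + e) / 2
        d = 0
        if lat >= mid_lat:
            s = mid_lat
            d += 2
        else:
            n = mid_lat
        if lon >= mid_lon:
            w = mid_lon
            d += 1
        else:
            e = mid_lon
        digits.append(str(d))
    return "".join(digits)

# A finer index shares its coarser parent index as a prefix
assert quadtree_cell(48.85, 2.35, 6).startswith(quadtree_cell(48.85, 2.35, 3))
```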

  11. Response to Nieminen et al.'s Feature Article on Executive Coaching and Facilitated Multisource Feedback: Toward Better Understanding of a Growing HRD Practice

    Science.gov (United States)

    Egan, Toby

    2013-01-01

    Executive coaching is a booming, but understudied, HRD-related practice. Given the limited number of available studies that have been deployed with rigor and systematic methods, the study by Nieminen et al. adds to our understanding of the impact of executive coaching and multisource feedback on leadership development. Explored in the context of a…

  12. A multi-target caffeine derived rhodium(I) N-heterocyclic carbene complex: evaluation of the mechanism of action.

    Science.gov (United States)

    Zhang, Jing-Jing; Muenzner, Julienne K; Abu El Maaty, Mohamed A; Karge, Bianka; Schobert, Rainer; Wölfl, Stefan; Ott, Ingo

    2016-08-16

    A rhodium(I) and a ruthenium(II) complex with a caffeine derived N-heterocyclic carbene (NHC) ligand were biologically investigated as organometallic conjugates consisting of a metal center and a naturally occurring moiety. While the ruthenium(II) complex was largely inactive, the rhodium(I) NHC complex displayed selective cytotoxicity and significant anti-metastatic and in vivo anti-vascular activities and acted as both a mammalian and an E. coli thioredoxin reductase inhibitor. In HCT-116 cells it increased the reactive oxygen species level, leading to DNA damage, and it induced cell cycle arrest, decreased the mitochondrial membrane potential, and triggered apoptosis. This rhodium(I) NHC derivative thus represents a multi-target compound with promising anti-cancer potential.

  13. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available A current problem for managers in logistics and trading companies is improving operational business performance and developing the logistics support of sales. Developing the logistics support of sales involves a set of works for the development of existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculating the required number of loading-unloading places, developing storage structures, developing pre-sales preparation zones, specifying storage types, selecting loading-unloading equipment, detailed planning of the warehouse logistics system, creating architectural-planning decisions, selecting information-processing equipment, etc. The currently used ERP and WMS systems do not solve the full list of logistics engineering problems. In this regard, developing specialized software that takes the specifics of warehouse logistics into account, and subsequently integrating this software with ERP and WMS systems, is a relevant task. In this paper we propose a system for statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning and to improve the efficiency of the operating business and the development of the logistics support of sales. The system is based on methods of statistical data processing, methods of assessment and prediction of logistics performance, methods for determining and calculating the data required for the registration, storage, and processing of metal products, and methods for planning the reconstruction and development

  14. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI^-1 value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI^-1) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI^-1 increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN
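The CPI construction the abstract critiques is simple to state: at each locus, the probability of inclusion is the squared sum of the frequencies of all alleles observed in the mixture, and CPI is the product over loci. The sketch below uses made-up allele frequencies to show the law-of-large-numbers effect, that CPI^-1 grows with the number of tested loci regardless of mixture composition.

```python
from math import prod

def cpi(locus_allele_freqs):
    """Combined probability of inclusion: per-locus PI is the squared
    sum of the frequencies of the alleles observed in the mixture;
    CPI is the product over loci. 1/CPI is the reported 'one in N'
    match statistic."""
    return prod(sum(freqs) ** 2 for freqs in locus_allele_freqs)

# Hypothetical 4-allele mixture profile repeated at 13 STR loci
# (the frequencies are invented for illustration)
loci = [[0.15, 0.10, 0.20, 0.05]] * 13
print(f"CPI^-1 ≈ {1 / cpi(loci):,.0f}")  # → CPI^-1 ≈ 67,108,864
```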

  15. Geo-Parcel Based Crop Identification by Integrating High Spatial-Temporal Resolution Imagery from Multi-Source Satellite Data

    Directory of Open Access Journals (Sweden)

    Yingpin Yang

    2017-12-01

    Full Text Available Geo-parcel based crop identification plays an important role in precision agriculture. It meets the needs of refined farmland management. This study presents an improved procedure for geo-parcel based crop identification that combines fine-resolution images and multi-source medium-resolution images. GF-2 images with a fine spatial resolution of 0.8 m provided agricultural farming plot boundaries, and GF-1 (16 m) and Landsat 8 OLI data were used to derive the geo-parcel based enhanced vegetation index (EVI) time-series. In this study, we propose a piecewise EVI time-series smoothing method to fit irregular time profiles, especially for crop rotation situations. Global EVI time-series were divided into several temporal segments, from which phenological metrics could be derived. This method was applied to Lixian, where crop rotation was the common practice of growing different types of crops, in the same plot, in sequenced seasons. After collection of phenological features and multi-temporal spectral information, Random Forest (RF) was performed to classify crop types, and the overall accuracy was 93.27%. Moreover, an analysis of feature significance showed that phenological features were of greater importance for distinguishing agricultural land cover compared to temporal spectral information. The identification results indicated that the integration of high spatial-temporal resolution imagery is promising for geo-parcel based crop identification and that the newly proposed smoothing method is effective.
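The key idea of the piecewise smoothing described above is that smoothing is applied within each temporal segment separately, so an abrupt crop change at a rotation boundary is not blurred away. The abstract does not specify the fitting function, so in this sketch a simple moving average stands in for it; the breakpoints are assumed to be given.

```python
def piecewise_smooth(evi, breakpoints, window=3):
    """Smooth an EVI time-series separately within each temporal segment
    (e.g., one segment per rotation crop), so that smoothing never
    bleeds across an abrupt crop change. A moving average stands in
    for the paper's (unspecified) fitting method."""
    bounds = [0] + list(breakpoints) + [len(evi)]
    out = []
    for lo, hi in zip(bounds, bounds[1:]):
        seg = evi[lo:hi]
        for i in range(len(seg)):
            w = seg[max(0, i - window // 2): i + window // 2 + 1]
            out.append(sum(w) / len(w))
    return out

# The rotation boundary at index 2 is preserved exactly:
print(piecewise_smooth([0, 0, 10, 10], breakpoints=[2]))  # → [0.0, 0.0, 10.0, 10.0]
```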

  16. Properties of the ZnSe/ZnTe heterojunction prepared by a multi-source evaporation of ZnTe:Sb on ZnSe single crystals

    Energy Technology Data Exchange (ETDEWEB)

    Romeo, N [Parma Univ. (Italy). Ist. di Fisica]; First, F [Uniwersytet Mikolaja Kopernika, Torun (Poland). Inst. Fizyki]; Seuret, D [Universidad de La Habana, (Cuba). Facultad de Fisica-Matematica]

    1979-07-16

    A new method is described for the preparation of a ZnSe/ZnTe heterojunction in which Sb-doped ZnTe is deposited by a multi-source apparatus on ZnSe monocrystals. The properties of the heterojunction were studied, especially the I-U characteristic, the 1/C^2 plot as a function of applied voltage, the photocurrent spectrum, and the electroluminescence spectrum.

  17. The research on data organization technology in the highway geographic information system

    Science.gov (United States)

    Tian, Zhihui; Wu, Fang; Zeng, Yuhuai

    2008-10-01

    Data are the basis of GIS. The data model and data organization of a traffic geographic information system, with characteristics such as spatial properties, multi-path networks, and linearity, have a direct impact on the efficiency and functionality of a Highway Geographic Information System (HGIS). This paper discusses the data properties of HGIS and studies and presents a multi-source spatial data model for HGIS. It also describes and verifies the linear, dynamic, and multiple-path network properties of highway geographical feature and thematic data in HGIS.

  18. Knowledge-Intensive Gathering and Integration of Statistical Information on European Fisheries

    NARCIS (Netherlands)

    Klinkert, M.; Treur, J.; Verwaart, T.; Loganantharaj, R.; Palm, G.; Ali, M.

    2000-01-01

    Gathering, maintenance, integration and presentation of statistics are major activities of the Dutch Agricultural Economics Research Institute LEI. In this paper we explore how knowledge and agent technology can be exploited to support the information gathering and integration process. In

  19. Generation and evaluation of 3D digital casts of maxillary defects based on multisource data registration: A pilot clinical study.

    Science.gov (United States)

    Ye, Hongqiang; Ma, Qijun; Hou, Yuezhong; Li, Man; Zhou, Yongsheng

    2017-12-01

    Digital techniques are not clinically applied for 1-piece maxillary prostheses containing an obturator and removable partial denture retained by the remaining teeth because of the difficulty in obtaining sufficiently accurate 3-dimensional (3D) images. The purpose of this pilot clinical study was to generate 3D digital casts of maxillary defects, including the defective region and the maxillary dentition, based on multisource data registration and to evaluate their effectiveness. Twelve participants with maxillary defects were selected. The maxillofacial region was scanned with spiral computed tomography (CT), and the maxillary arch and palate were scanned using an intraoral optical scanner. The 3D images from the CT and intraoral scanner were registered and merged to form a 3D digital cast of the maxillary defect containing the anatomic structures needed for the maxillary prosthesis. This included the defect cavity, maxillary dentition, and palate. Traditional silicone impressions were also made, and stone casts were poured. The accuracy of the digital cast in comparison with that of the stone cast was evaluated by measuring the distance between 4 anatomic landmarks. Differences and consistencies were assessed using paired Student t tests and the intraclass correlation coefficient (ICC). In 3 participants, physical resin casts were produced by rapid prototyping from digital casts. Based on the resin casts, maxillary prostheses were fabricated by using conventional methods and then evaluated in the participants to assess the clinical applicability of the digital casts. Digital casts of the maxillary defects were generated and contained all the anatomic details needed for the maxillary prosthesis. Comparing the digital and stone casts, a paired Student t test indicated that differences in the linear distances between landmarks were not statistically significant (P>.05). High ICC values (0.977 to 0.998) for the interlandmark distances further indicated the high

  20. Geo-metadata design for the GIS of the pre-selected site for China's high-level radioactive waste repository

    International Nuclear Information System (INIS)

    Zhong Xia; Wang Ju; Huang Shutao; Wang Shuhong; Gao Min

    2008-01-01

    The information system for the geological disposal of high-level radioactive waste aims at the integrated management and full application of multi-source information in research on the geological disposal of high-level radioactive waste. The establishment and operation of the system require the support of geo-metadata for this multi-source information. In this paper, on the basis of a geo-data analysis for the pre-selected site for the disposal of high-level radioactive waste, we apply existing metadata standards and research and design the content, management pattern, and application of the geo-metadata for the multi-source information. (authors)

  1. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than the number of features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For the completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.

  2. Supervised Vicarious Calibration (SVC) of Multi-Source Hyperspectral Remote-Sensing Data

    Directory of Open Access Journals (Sweden)

    Anna Brook

    2015-05-01

    Full Text Available Introduced in 2011, the supervised vicarious calibration (SVC) approach is a promising approach to radiometric calibration and atmospheric correction of airborne hyperspectral (HRS) data. This paper presents a comprehensive study by which the SVC method has been systematically examined and a complete protocol for its practical execution has been established, along with possible limitations encountered during the campaign. The technique was applied to multi-source HRS data in order to: (1) verify the at-sensor radiometric calibration and (2) obtain radiometric and atmospheric correction coefficients. Spanning two select study sites along the southeast coast of France, data were collected simultaneously by three airborne sensors (AisaDUAL, AHS and CASI-1500i) aboard two aircraft (the CASA of the National Institute for Aerospace Technology INTA ES and the DORNIER 228 of the NERC-ARSF Centre UK). The SVC ground calibration site was assembled along sand dunes near Montpellier and the thematic data were acquired from other areas in the south of France (Salon-de-Provence, Marseille, Avignon and Montpellier) on 28 October 2010 between 12:00 and 16:00 UTC. The results of this study confirm that the SVC method enables reliable inspection and, if necessary, in-situ fine radiometric recalibration of airborne hyperspectral data. Independent of sensor or platform quality, the SVC approach allows users to improve at-sensor data to obtain more accurate physical units and subsequently improved reflectance information. Flight direction was found to be important, whereas flight altitude had very low impact. The numerous rules and major outcomes of this experiment enable a new standard of atmospherically corrected data based on better radiometric output. Future research should examine the potential of SVC applied to super- and hyperspectral data obtained from on-orbit sensors.

  3. Performance of the multitarget Mikrogen Chlamydia trachomatis IgG ELISA in the prediction of tubal factor infertility (TFI) in subfertile women: Comparison with the Medac MOMP IgG ELISA plus

    NARCIS (Netherlands)

    van Ess, Eleanne F.; Ouburg, Sander; Spaargaren, Joke; Land, Jolande A.; Morre, Servaas A.

    2017-01-01

    There is a need for more accurate Chlamydia trachomatis (CT) IgG antibody tests for tubal factor infertility (TFI) diagnostics. We evaluated the predictive value for TFI of Medac ELISA plus (MOMP) and multitarget Mikrogen ELISA (MOMP-CPAF-TARP). Based on Medac ELISA plus results, 183 subfertile

  4. Advances in audio source separation and multisource audio content retrieval

    Science.gov (United States)

    Vincent, Emmanuel

    2012-06-01

    Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.

  5. Prediction of Multi-Target Networks of Neuroprotective Compounds with Entropy Indices and Synthesis, Assay, and Theoretical Study of New Asymmetric 1,2-Rasagiline Carbamates

    Directory of Open Access Journals (Sweden)

    Francisco J. Romero Durán

    2014-09-01

    Full Text Available In a multi-target complex network, the links (Lij) represent the interactions between the drug (di) and the target (tj), characterized by different experimental measures (Ki, Km, IC50, etc.) obtained in pharmacological assays under diverse boundary conditions (cj). In this work, we handle Shannon entropy measures for developing a model encompassing a multi-target network of neuroprotective/neurotoxic compounds reported in the CHEMBL database. The model correctly predicts >8300 experimental outcomes with Accuracy, Specificity, and Sensitivity above 80%–90% on training and external validation series. Indeed, the model can calculate different outcomes for >30 experimental measures in >400 different experimental protocols in relation with >150 molecular and cellular targets on 11 different organisms (including human). Hereafter, we report for the first time the synthesis, characterization, and experimental assays of a new series of chiral 1,2-rasagiline carbamate derivatives not reported in previous works. The experimental tests included: (1) assay in the absence of neurotoxic agents; (2) in the presence of glutamate; and (3) in the presence of H2O2. Lastly, we used the new Assessing Links with Moving Averages (ALMA) entropy model to predict possible outcomes for the new compounds in a high number of pharmacological tests not carried out experimentally.
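    As a hedged illustration of the entropy indices the record above refers to (the function name and the toy distributions are ours, not the paper's), the Shannon entropy of a discrete outcome distribution can be computed as:

    ```python
    import math

    def shannon_entropy(probs):
        """Shannon entropy, in bits, of a discrete probability distribution."""
        if abs(sum(probs) - 1.0) > 1e-9:
            raise ValueError("probabilities must sum to 1")
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A maximally uncertain binary outcome carries exactly 1 bit;
    # a skewed outcome carries less.
    h_fair = shannon_entropy([0.5, 0.5])
    h_skew = shannon_entropy([0.9, 0.1])
    ```

    In an ALMA-style model, such entropies of the distributions of experimental conditions and outcomes would serve as numerical descriptors (inputs) of each drug-target link, not as the classifier itself.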

  6. Modeling and Mapping of Human Source Data

    Science.gov (United States)

    2011-03-08

    Fusion Workshop hosted by the Center for Multisource Information Fusion, University at Buffalo/ CUBRC , October 2008 • The general idea is to conduct...Information Fusion Workshop (via the Center for Multisource Information Fusion, University at Buffalo/ CUBRC ) in October, 2008 at which 40 scientists from

  7. Applications of statistical physics and information theory to the analysis of DNA sequences

    Science.gov (United States)

    Grosse, Ivo

    2000-10-01

    DNA carries the genetic information of most living organisms, and the goal of genome projects is to uncover that genetic information. One basic task in the analysis of DNA sequences is the recognition of protein coding genes. Powerful computer programs for gene recognition have been developed, but most of them are based on statistical patterns that vary from species to species. In this thesis I address the question of whether there exist universal statistical patterns that are different in coding and noncoding DNA of all living species, regardless of their phylogenetic origin. In search of such species-independent patterns I study the mutual information function of genomic DNA sequences, and find that it shows persistent period-three oscillations. To understand the biological origin of the observed period-three oscillations, I compare the mutual information function of genomic DNA sequences to the mutual information function of stochastic model sequences. I find that the pseudo-exon model is able to reproduce the mutual information function of genomic DNA sequences. Moreover, I find that a generalization of the pseudo-exon model can connect the existence and the functional form of long-range correlations to the presence and the length distributions of coding and noncoding regions. Based on these theoretical studies I am able to find an information-theoretical quantity, the average mutual information (AMI), whose probability distributions are significantly different in coding and noncoding DNA, while they are almost identical in all studied species. These findings show that there exist universal statistical patterns that are different in coding and noncoding DNA of all studied species, and they suggest that the AMI may be used to identify genes in different living species, irrespective of their taxonomic origin.
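    The mutual information function at the heart of this thesis can be sketched in a few lines. The synthetic "coding-like" sequence below (phase-dependent base frequencies with a fixed random seed) is our own illustrative stand-in for genomic DNA, not the thesis's data; in it, positions three apart share the same codon phase and therefore show stronger statistical dependence than positions two apart:

    ```python
    import math
    import random
    from collections import Counter

    def mutual_information(seq, k):
        """Empirical mutual information (bits) between symbols k positions apart."""
        pairs = Counter(zip(seq, seq[k:]))
        n = sum(pairs.values())
        left, right = Counter(), Counter()
        for (a, b), c in pairs.items():
            left[a] += c
            right[b] += c
        mi = 0.0
        for (a, b), c in pairs.items():
            p_ab = c / n
            mi += p_ab * math.log2(p_ab * n * n / (left[a] * right[b]))
        return mi

    # Toy "coding-like" sequence: codon position 0 is strongly biased toward A,
    # positions 1 and 2 are uniform -- a crude caricature of the phase-specific
    # base composition of coding DNA.
    random.seed(1)
    bases = "ACGT"
    seq = "".join(
        ("A" if random.random() < 0.85 else random.choice(bases))
        if i % 3 == 0 else random.choice(bases)
        for i in range(6000)
    )

    mi_lag3 = mutual_information(seq, 3)  # same codon phase: strong dependence
    mi_lag2 = mutual_information(seq, 2)  # shifted phase: weaker dependence
    ```

    On such a sequence mi_lag3 exceeds mi_lag2, the elementary mechanism behind the period-three oscillations the abstract describes.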

  8. A flexible statistics web processing service--added value for information systems for experiment data.

    Science.gov (United States)

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-04-20

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionalities like analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the used statistical tools are hard-coded within the system. That leads to an expensive integration, substitution, or extension of tools because all changes have to be done in program code. Other systems are using generic solutions for tool integration but adapting them to another system is mostly rather extensive work. This paper shows a way to provide statistical functionality over a statistics web service, which can be easily integrated in any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture as well as the data exchange process between client and service and the adding of analysis applications to the underlying service provider are described. Furthermore a practical example demonstrates the functionality of the service.

  9. Information transport in classical statistical systems

    Science.gov (United States)

    Wetterich, C.

    2018-02-01

    For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.

  10. Online Simultaneous Hydrogen/Deuterium Exchange of Multitarget Gas-Phase Molecules by Electrospray Ionization Mass Spectrometry Coupled with Gas Chromatography.

    Science.gov (United States)

    Jeong, Eun Sook; Cha, Eunju; Cha, Sangwon; Kim, Sunghwan; Oh, Han Bin; Kwon, Oh-Seung; Lee, Jaeick

    2017-11-21

    In this study, a hydrogen/deuterium (H/D) exchange method using gas chromatography-electrospray ionization/mass spectrometry (GC-ESI/MS) was investigated for the first time as a novel tool for online H/D exchange of multitarget analytes. The GC and ESI source were combined with a homemade heated column transfer line. GC-ESI/MS-based H/D exchange occurs in an atmospheric pressure ion source as a result of reacting the gas-phase analyte eluted from the GC with charged droplets of deuterium oxide infused as the ESI spray solvent. The consumption of the deuterated solvent, at a flow rate of 2 μL min-1, was more economical than that in online H/D exchange methods reported to date. In-ESI-source H/D exchange by GC-ESI/MS was applied to 11 stimulants with secondary amino or hydroxyl groups. After H/D exchange, the spectra of the stimulants showed unexchanged, partially exchanged, and fully exchanged ions with various degrees of exchange. The relative abundances, corrected for naturally occurring isotopes, of the fully exchanged ions of the stimulants, except for etamivan, were in the range 24.3-85.5%. Methylephedrine and cyclazodone showed low H/D exchange efficiency under acidic, neutral, and basic spray solvent conditions, and etamivan, with an acidic phenolic OH group, showed no exchange. The in-ESI-source H/D exchange efficiency of GC-ESI/MS was sufficient to determine the number of hydrogens by elucidation of the fragmentation in the spectrum. Therefore, this online H/D exchange technique using GC-ESI/MS has potential as an alternative method for simultaneous H/D exchange of multitarget analytes.

  11. [Object-oriented stand type classification based on the combination of multi-source remote sensing data].

    Science.gov (United States)

    Mao, Xue Gang; Wei, Jing Yu

    2017-11-01

    The recognition of forest type is one of the key problems in forest resource monitoring. Radarsat-2 data and a QuickBird remote sensing image were used for object-based classification to study object-based forest type classification and recognition based on the combination of multi-source remote sensing data. In the process of object-based classification, three segmentation schemes (segmentation with the QuickBird remote sensing image only, segmentation with the Radarsat-2 data only, and segmentation with the combination of QuickBird and Radarsat-2) were adopted. For the three segmentation schemes, ten segmentation scale parameters were adopted (25-250, step 25), and the modified Euclidean distance 3 index was further used to evaluate the segmented results to determine the optimal segmentation scheme and segmentation scale. Based on the optimal segmented result, three forest types of Chinese fir, Masson pine and broad-leaved forest were classified and recognized using a Support Vector Machine (SVM) classifier with a Radial Basis Function (RBF) kernel according to different feature combinations of topography, height, spectrum and common features. The results showed that the combination of Radarsat-2 data and the QuickBird remote sensing image had advantages for object-based forest type classification over using Radarsat-2 data or the QuickBird remote sensing image only. The optimal scale parameter for the combined QuickBird-Radarsat-2 segmentation was 100, and at the optimal scale, the accuracy of object-based forest type classification was the highest (OA=86%, Kappa=0.86) when using all features extracted from the two data sources. This study not only provides a reference for forest type recognition using multi-source remote sensing data, but also has practical significance for forest resource investigation and monitoring.
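    The accuracy figures quoted above (OA and Kappa) are both derived from a confusion matrix. A minimal sketch of that derivation follows; the 2×2 matrix is invented for illustration and is not the study's data:

    ```python
    def accuracy_and_kappa(confusion):
        """Overall accuracy and Cohen's kappa from a square confusion matrix
        (rows = reference classes, columns = predicted classes)."""
        total = sum(sum(row) for row in confusion)
        observed = sum(confusion[i][i] for i in range(len(confusion))) / total
        # Chance agreement, from the row and column marginals
        expected = sum(
            (sum(confusion[i]) / total) * (sum(row[i] for row in confusion) / total)
            for i in range(len(confusion))
        )
        kappa = (observed - expected) / (1 - expected)
        return observed, kappa

    # Hypothetical two-class result (e.g. conifer vs. broad-leaved pixels)
    oa, kappa = accuracy_and_kappa([[40, 10], [5, 45]])
    ```

    Kappa discounts the agreement that two random labelings with the same marginals would produce, which is why it is reported alongside overall accuracy in classification studies like this one.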

  12. INFORMATION TECHNOLOGIES OF THE STATISTICAL DATA ANALYSIS WITHIN THE SYSTEM OF HIGHER PSYCHOLOGICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Svetlana V. Smirnova

    2013-01-01

    Full Text Available The features of using information technologies in applied statistics in psychology are considered in the article. Requirements for the statistical preparation of psychology students in the conditions of the information society are analyzed.

  13. Ginger augmented chemotherapy: A novel multitarget nontoxic approach for cancer management.

    Science.gov (United States)

    Saxena, Roopali; Rida, Padmashree C G; Kucuk, Omer; Aneja, Ritu

    2016-06-01

    Cancer, referred to as the 'disease of civilization', continues to haunt humanity due to its dreadful manifestations and limited success of therapeutic interventions such as chemotherapy in curing the disease. Although effective, chemotherapy has repeatedly demonstrated inadequacy in disease management due to its debilitating side effects arising from its deleterious nonspecific effects on normal healthy cells. In addition, development of chemoresistance due to mono-targeting often results in cessation of chemotherapy. This urgently demands development and implementation of multitargeted alternative therapies with mild or no side effects. One extremely promising strategy that yet remains untapped in the clinic is augmenting chemotherapy with dietary phytochemicals or extracts. Ginger, depository of numerous bioactive molecules, not only targets cancer cells but can also mitigate chemotherapy-associated side effects. Consequently, combination therapy involving ginger extract and chemotherapeutic agents may offer the advantage of being efficacious with reduced toxicity. Here we discuss the remarkable and often overlooked potential of ginger extract to manage cancer, the possibility of developing ginger-based combinational therapies, and the major roadblocks along with strategies to overcome them in clinical translation of such inventions. We are optimistic that clinical implementation of such combination regimens would be a much sought after modality in cancer management. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. The Development of Introductory Statistics Students' Informal Inferential Reasoning and Its Relationship to Formal Inferential Reasoning

    Science.gov (United States)

    Jacob, Bridgette L.

    2013-01-01

    The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…

  15. Tiny molecule, big power: Multi-target approach for curcumin in diabetic cardiomyopathy.

    Science.gov (United States)

    Karuppagounder, Vengadeshprabhu; Arumugam, Somasundaram; Giridharan, Vijayasree V; Sreedhar, Remya; Bose, Rajendran J C; Vanama, Jyothi; Palaniyandi, Suresh S; Konishi, Tetsuya; Watanabe, Kenichi; Thandavarayan, Rajarajan A

    2017-02-01

    Diabetic cardiomyopathy (DCM) is described as impaired cardiac diastolic and systolic function. Cardiovascular disease related to diabetes mellitus (DM) has become one of the major causes of death in DM patients. Mortality in these diseases is 2 to 3 times higher than in non-DM patients with cardiovascular disease. The progression of DCM and the cellular and molecular perturbations associated with the pathogenesis are complex and multifactorial. Although considerable progress has been achieved, the molecular etiologies of DCM remain poorly understood. There is an expanding need for natural antidiabetic medicines that do not cause the side effects of modern drugs. Curcumin, a pleiotropic molecule from Curcuma longa, is known to possess numerous activities such as free radical scavenging, antioxidant, antitumor, and antiinflammatory effects. Reports from preclinical and clinical findings reveal that curcumin can reverse insulin resistance, hyperglycemia, obesity, and obesity-related metabolic diseases. The current review provides an updated overview of the possible molecular mechanisms of DCM and the multitarget approach of curcumin in alleviating DCM and diabetic complications. Additionally, we mention the approaches that are currently being implemented to improve the bioavailability of this promising natural product in diabetes therapeutics. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. The Multitargets Pharmacological Mechanism of Qishenkeli Acting on the Coronary Heart Disease

    Directory of Open Access Journals (Sweden)

    Yong Wang

    2012-01-01

    Full Text Available In this paper, we present a case study of Qishenkeli (QSKL) to research TCM's underlying molecular mechanism, based on drug target prediction and analyses of TCM chemical components and following experimental validation. First, after determining the constituent compounds of QSKL, we use drugCIPHER-CS to predict their potential drug targets. These potential targets are significantly enriched with known cardiovascular disease-related drug targets. Then we find these potential drug targets are significantly enriched in the biological processes of neuroactive ligand-receptor interaction, aminoacyl-tRNA biosynthesis, calcium signaling pathway, glycine, serine and threonine metabolism, the renin-angiotensin system (RAAS), and so on. Then, an animal model of coronary heart disease (CHD) induced by left anterior descending coronary artery ligation is applied to validate the predicted pathway. The RAAS pathway is selected as an example, and the results show that QSKL has an effect on both renin and the angiotensin II receptor (AT1R), which eventually downregulates angiotensin II (AngII). Bioinformatics combined with experimental verification can provide a credible and objective method to understand the complicated multitarget mechanism of a Chinese herbal formula.

  17. Knowledge-Sharing Intention among Information Professionals in Nigeria: A Statistical Analysis

    Science.gov (United States)

    Tella, Adeyinka

    2016-01-01

    In this study, the researcher administered a survey and developed and tested a statistical model to examine the factors that determine the intention of information professionals in Nigeria to share knowledge with their colleagues. The result revealed correlations between the overall score for intending to share knowledge and other…

  18. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    Science.gov (United States)

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement the online statistical analysis function in the information system of air pollution and health impact monitoring, and to obtain data analysis results in real time. Using descriptive statistics as well as time-series analysis and multivariate regression analysis, the SQL language and visual tools to implement online statistical analysis based on database software. Generate basic statistical tables and summary tables of air pollution exposure and health impact data online; generate trend charts of each data part online with interactive connections to the database; generate export sheets that can be loaded directly into R, SAS and SPSS online. The information system of air pollution and health impact monitoring implements the statistical analysis function online, which can provide real-time analysis results to its users.

  19. Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science

    Science.gov (United States)

    Ju, Boryung; Jin, Tao

    2013-01-01

    Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…

  20. Dissemination of statistical information for purposes of efficiency management of socio-economic development

    Directory of Open Access Journals (Sweden)

    Gerasimenko S.

    2013-01-01

    Full Text Available The questions connected with facilitating public access to information about living standards are considered. In particular, it is suggested to use the information of the System of National Accounts and statistical methods. It is stressed that information about living standards should be supplemented with characteristics of the effectiveness of the management of socio-economic development.

  1. a Geodatabase for Multisource Data Applied to Cultural Heritage: the Case Study of Villa Revedin Bolasco

    Science.gov (United States)

    Guarnieri, A.; Masiero, A.; Piragnolo, M.; Pirotti, F.; Vettore, A.

    2016-06-01

    In this paper we present the results of the development of a Web-based archiving and documenting system aimed at the management of multisource and multitemporal data related to cultural heritage. As a case study we selected the building complex of Villa Revedin Bolasco in Castelfranco Veneto (Treviso, Italy) and its park. Buildings and park were built in the XIX century after several restorations of the original XIV century area. The data management system relies on a geodatabase framework, in which different kinds of datasets were stored. More specifically, the geodatabase elements consist of historical information, documents, and descriptions of artistic characteristics of the building and the park, in the form of text and images. In addition, we also used floorplans, sections and views of the outer facades of the building extracted from a TLS-based 3D model of the whole Villa. In order to manage and explore this rich dataset, we developed a geodatabase using PostgreSQL with PostGIS as a spatial plugin. The Web-GIS platform, based on the HTML5 and PHP programming languages, implements the NASA Web World Wind virtual globe, a 3D virtual globe we used to enable the navigation and interactive exploration of the park. Furthermore, through a specific timeline function, the user can explore the historical evolution of the building complex.

  3. Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology

    Directory of Open Access Journals (Sweden)

    Donald Laming

    2010-04-01

    Full Text Available This paper presents, first, a formal exploration of the relationships between information (statistically defined), statistical hypothesis testing, the use of hypothesis testing in reverse as an investigative tool, channel capacity in a communication system, uncertainty, the concept of entropy in thermodynamics, and Bayes' theorem. This exercise brings out the close mathematical interrelationships between different applications of these ideas in diverse areas of psychology. Subsequent illustrative examples are grouped under (a) the human operator as an ideal communications channel, (b) the human operator as a purely physical system, and (c) Bayes' theorem as an algorithm for combining information from different sources. Some tentative conclusions are drawn about the usefulness of information theory within these different categories. (a) The idea of the human operator as an ideal communications channel has long been abandoned, though it provides some lessons that still need to be absorbed today. (b) Treating the human operator as a purely physical system provides a platform for the quantitative exploration of many aspects of human performance by analogy with the analysis of other physical systems. (c) The use of Bayes' theorem to calculate the effects of prior probabilities and stimulus frequencies on human performance is probably misconceived, but it is difficult to obtain results precise enough to resolve this question.
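    Category (c) above, Bayes' theorem as an algorithm for combining information from different sources, can be sketched as follows; the prior, hit rates and false-alarm rates below are invented for illustration and assume conditionally independent sources:

    ```python
    def combine_evidence(prior, likelihood_pairs):
        """Posterior P(H | all evidence) under conditional independence.

        likelihood_pairs: list of (P(e | H), P(e | not H)), one pair per cue.
        """
        p_h, p_not = prior, 1.0 - prior
        for p_e_given_h, p_e_given_not in likelihood_pairs:
            p_h *= p_e_given_h
            p_not *= p_e_given_not
        return p_h / (p_h + p_not)

    # A rare signal (prior 1%) reported by two independent detectors,
    # each with a 90% hit rate and a 5% false-alarm rate.
    posterior = combine_evidence(0.01, [(0.9, 0.05), (0.9, 0.05)])
    ```

    Each source multiplies the odds by its likelihood ratio, so two moderately reliable cues can lift a 1% prior to a posterior above 75%, the kind of calculation the paper scrutinizes when applied to human performance.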

  4. Multisource Images Analysis Using Collaborative Clustering

    Directory of Open Access Journals (Sweden)

    Pierre Gançarski

    2008-04-01

    Full Text Available The development of very high-resolution (VHR) satellite imagery has produced a huge amount of data. The multiplication of satellites which embed different types of sensors provides a lot of heterogeneous images. Consequently, the image analyst often has many different images available, representing the same area of the Earth's surface. These images can be from different dates, produced by different sensors, or even at different resolutions. The lack of machine learning tools using all these representations in an overall process constrains the analyst to a sequential analysis of these various images. In order to use all the information available simultaneously, we propose a framework where different algorithms can use different views of the scene. Each one works on a different remotely sensed image and, thus, produces different and useful information. These algorithms work together in a collaborative way through an automatic and mutual refinement of their results, so that all the results have almost the same number of clusters, which are statistically similar. Finally, a unique result is produced, representing a consensus among the information obtained by each clustering method on its own image. The unified result and the complementarity of the single results (i.e., the agreement between the clustering methods as well as the disagreement) lead to a better understanding of the scene. The experiments carried out on multispectral remote sensing images have shown that this method is efficient in extracting relevant information and improving scene understanding.

  5. A quantum information approach to statistical mechanics

    International Nuclear Information System (INIS)

    Cuevas, G.

    2011-01-01

    The field of quantum information and computation harnesses and exploits the properties of quantum mechanics to perform tasks more efficiently than their classical counterparts, or that may uniquely be possible in the quantum world. Its findings and techniques have been applied to a number of fields, such as the study of entanglement in strongly correlated systems, new simulation techniques for many-body physics or, generally, to quantum optics. This thesis aims at broadening the scope of quantum information theory by applying it to problems in statistical mechanics. We focus on classical spin models, which are toy models used in a variety of systems, ranging from magnetism, neural networks, to quantum gravity. We tackle these models using quantum information tools from three different angles. First, we show how the partition function of a class of widely different classical spin models (models in different dimensions, different types of many-body interactions, different symmetries, etc) can be mapped to the partition function of a single model. We prove this by first establishing a relation between partition functions and quantum states, and then transforming the corresponding quantum states to each other. Second, we give efficient quantum algorithms to estimate the partition function of various classical spin models, such as the Ising or the Potts model. The proof is based on a relation between partition functions and quantum circuits, which allows us to determine the quantum computational complexity of the partition function by studying the corresponding quantum circuit. Finally, we outline the possibility of applying quantum information concepts and tools to certain models of discrete quantum gravity. The latter provide a natural route to generalize our results, insofar as the central quantity has the form of a partition function, and as classical spin models are used as toy models of matter. (author)
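    The central quantity in these mappings, the partition function of a classical spin model, is easy to state concretely. Below is our own brute-force toy check for the one-dimensional zero-field Ising chain against the exact transfer-matrix result, not anything from the thesis:

    ```python
    import itertools
    import math

    def ising_partition_function(n_spins, beta, J=1.0):
        """Z = sum over all spin configurations of exp(-beta * H),
        with H = -J * sum_i s_i * s_{i+1} on a periodic chain."""
        Z = 0.0
        for spins in itertools.product((-1, 1), repeat=n_spins):
            energy = -J * sum(
                spins[i] * spins[(i + 1) % n_spins] for i in range(n_spins)
            )
            Z += math.exp(-beta * energy)
        return Z

    # Transfer-matrix result for comparison:
    # Z = (2 cosh(beta*J))^N + (2 sinh(beta*J))^N
    n, beta = 8, 0.4
    z_brute = ising_partition_function(n, beta)
    z_exact = (2 * math.cosh(beta)) ** n + (2 * math.sinh(beta)) ** n
    ```

    The brute-force sum over 2^N configurations is what makes classical estimation hard and motivates the quantum algorithms the thesis describes; the transfer-matrix identity makes the 1D case an exact sanity check.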

  6. Evaluation of physicians' professional performance: An iterative development and validation study of multisource feedback instruments

    Directory of Open Access Journals (Sweden)

    Overeem Karlijn

    2012-03-01

    Full Text Available Background: There is a global need to assess physicians' professional performance in actual clinical practice. Valid and reliable instruments are necessary to support these efforts. This study focuses on the reliability and validity, the influences of some sociodemographic biasing factors, associations between self and other evaluations, and the number of evaluations needed for reliable assessment of a physician, based on the three instruments used for the multisource assessment of physicians' professional performance in the Netherlands. Methods: This observational validation study of three instruments underlying multisource feedback (MSF) was set in 26 non-academic hospitals in the Netherlands. In total, 146 hospital-based physicians took part in the study. Each physician's professional performance was assessed by peers (physician colleagues), co-workers (including nurses, secretarial assistants and other healthcare professionals) and patients. Physicians also completed a self-evaluation. Ratings of 864 peers, 894 co-workers and 1960 patients on MSF were available. We used principal components analysis and methods of classical test theory to evaluate the factor structure, reliability and validity of the instruments. We used Pearson's correlation coefficient and linear mixed models to address other objectives. Results: The peer, co-worker and patient instruments respectively had six factors, three factors and one factor with high internal consistencies (Cronbach's alpha 0.95-0.96). It appeared that only 2 percent of variance in the mean ratings could be attributed to biasing factors. Self-ratings were not correlated with peer, co-worker or patient ratings. However, ratings of peers, co-workers and patients were correlated. Five peer evaluations, five co-worker evaluations and 11 patient evaluations are required to achieve reliable results (reliability coefficient ≥ 0.70). Conclusions: The study demonstrated that the three MSF instruments produced
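    Statements like "five peer evaluations are required for a reliability coefficient ≥ 0.70" follow the usual Spearman-Brown logic for aggregating raters. A hedged sketch follows; the single-rater reliability of 0.32 is an invented illustration, not the study's estimate:

    ```python
    import math

    def spearman_brown(r_single, m):
        """Reliability of the mean of m raters, given single-rater reliability."""
        return m * r_single / (1 + (m - 1) * r_single)

    def raters_needed(r_single, target=0.70):
        """Smallest number of raters whose mean rating reaches the target reliability."""
        return math.ceil(target * (1 - r_single) / (r_single * (1 - target)))

    m = raters_needed(0.32)            # hypothetical single-rater reliability
    achieved = spearman_brown(0.32, m) # reliability of the m-rater mean
    ```

    With a single-rater reliability of 0.32, five raters push the reliability of the mean just past 0.70, which is the shape of calculation behind the evaluation counts reported above.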

  7. Deriving consumer-facing disease concepts for family health histories using multi-source sampling.

    Science.gov (United States)

    Hulse, Nathan C; Wood, Grant M; Haug, Peter J; Williams, Marc S

    2010-10-01

    The family health history has long been recognized as an effective way of understanding individuals' susceptibility to familial disease; yet electronic tools to support the capture and use of these data have been characterized as inadequate. As part of an ongoing effort to build patient-facing tools for entering detailed family health histories, we have compiled a set of concepts specific to familial disease using multi-source sampling. These concepts were abstracted by analyzing family health history data patterns in our enterprise data warehouse, collection patterns of consumer personal health records, analyses from the local state health department, a healthcare data dictionary, and concepts derived from genetic-oriented consumer education materials. Collectively, these sources yielded a set of more than 500 unique disease concepts, represented by more than 2500 synonyms for supporting patients in entering coded family health histories. We expect that these concepts will be useful in providing meaningful data and education resources for patients and providers alike.

  8. Multisource data assimilation in a Richards equation-based integrated hydrological model: a real-world application to an experimental hillslope

    Science.gov (United States)

    Camporese, M.; Botto, A.

    2017-12-01

    Data assimilation is becoming increasingly popular in hydrological and earth system modeling, as it allows for direct integration of multisource observation data in modeling predictions and uncertainty reduction. For this reason, data assimilation has been recently the focus of much attention also for integrated surface-subsurface hydrological models, whereby multiple terrestrial compartments (e.g., snow cover, surface water, groundwater) are solved simultaneously, in an attempt to tackle environmental problems in a holistic approach. Recent examples include the joint assimilation of water table, soil moisture, and river discharge measurements in catchment models of coupled surface-subsurface flow using the ensemble Kalman filter (EnKF). Although the EnKF has been specifically developed to deal with nonlinear models, integrated hydrological models based on the Richards equation still represent a challenge, due to strong nonlinearities that may significantly affect the filter performance. Thus, more studies are needed to investigate the capabilities of EnKF to correct the system state and identify parameters in cases where the unsaturated zone dynamics are dominant. Here, the model CATHY (CATchment HYdrology) is applied to reproduce the hydrological dynamics observed in an experimental hillslope, equipped with tensiometers, water content reflectometer probes, and tipping bucket flow gages to monitor the hillslope response to a series of artificial rainfall events. We assimilate pressure head, soil moisture, and subsurface outflow with EnKF in a number of assimilation scenarios and discuss the challenges, issues, and tradeoffs arising from the assimilation of multisource data in a real-world test case, with particular focus on the capability of DA to update the subsurface parameters.
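    A minimal sketch of the EnKF analysis step referred to above (the perturbed-observations form): the two-component state, single observation and all numbers are invented for illustration, and the CATHY model itself is not involved:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_state, n_obs, n_ens = 2, 1, 500

    # Forecast ensemble: columns are ensemble members
    X = np.array([[1.0], [0.0]]) + rng.normal(0.0, 1.0, (n_state, n_ens))
    H = np.array([[1.0, 0.0]])   # we observe only the first state component
    R = np.array([[0.25]])       # observation-error covariance
    y = np.array([[2.0]])        # the actual measurement

    # Perturbed observations, one per member
    Y = y + rng.normal(0.0, np.sqrt(R[0, 0]), (n_obs, n_ens))

    # Sample covariance and Kalman gain from the ensemble anomalies
    A = X - X.mean(axis=1, keepdims=True)
    P = A @ A.T / (n_ens - 1)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

    # Analysis ensemble: each member pulled toward its perturbed observation
    Xa = X + K @ (Y - H @ X)
    ```

    The analysis mean moves toward the observation and the spread of the observed component shrinks toward the observation-error level; in a Richards-equation model the same update is applied to (and stressed by) strongly nonlinear pressure-head and moisture states, which is exactly the filter-performance question the abstract raises.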

  9. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    Science.gov (United States)

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined the moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on the elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted the risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted the risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitudes toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced attitudes through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  10. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.

  11. The Need for the Dissemination of Statistical Data and Information

    Directory of Open Access Journals (Sweden)

    Anna-Alexandra Frunza

    2016-01-01

    There is an emphasis nowadays on knowledge, so access to information has increased in relevance in modern economies, which have developed their competitive advantage through their dynamic response to market changes. The effort for transparency has increased tremendously within the last decades, influenced also by the weight that digital support has provided. The need for the dissemination of statistical data and information has met new challenges in terms of aggregating the practices that both private and public organizations use in order to ensure optimum access for the end users. The article stresses some key questions that can be introduced to ease the process of collection and presentation of the results subject to dissemination.

  12. Statistical properties of quantum entanglement and information entropy

    International Nuclear Information System (INIS)

    Abdel-Aty, M.M.A.

    2007-03-01

    Key words: entropy, entanglement, atom-field interaction, trapped ions, cold atoms, information entropy. Objects of research: pure-state entanglement, entropy squeezing mazer. The aim of the work: study of new entanglement features and new measures for both pure states and mixed states of particle-field interaction, as well as the impact of information entropy on quantum information theory. Method of investigation: methods of theoretical physics and applied mathematics (statistical physics, quantum optics) are used. Results obtained and their novelty: all the results of the dissertation are new and many new features have been discovered. In particular, the most general case of pure-state entanglement has been introduced. Although various special aspects of the quantum entropy have been investigated previously, the general features of the dynamics, when a multi-level system and a common environment are considered, have not been treated before; our work therefore fills a gap in the literature. Specifically: 1) a new entanglement measure based on quantum mutual entropy (mixed-state entanglement), which we call DEM, has been introduced; 2) a new treatment of the atomic information entropy in higher-level systems has been presented, and the problem has been completely solved for the three-level system; 3) a new solution of the interaction between ultracold atoms and a cavity field has been discovered; 4) some new models of the atom-field interaction have been adopted. Practical value: the subject is theoretical in character. Application region: the results can be used in quantum computer development, as well as for further developments in quantum information and quantum communications. (author)

  13. Multi-target parallel processing approach for gene-to-structure determination of the influenza polymerase PB2 subunit.

    Science.gov (United States)

    Armour, Brianna L; Barnes, Steve R; Moen, Spencer O; Smith, Eric; Raymond, Amy C; Fairman, James W; Stewart, Lance J; Staker, Bart L; Begley, Darren W; Edwards, Thomas E; Lorimer, Donald D

    2013-06-28

    Pandemic outbreaks of highly virulent influenza strains can cause widespread morbidity and mortality in human populations worldwide. In the United States alone, an average of 41,400 deaths and 1.86 million hospitalizations are caused by influenza virus infection each year (1). Point mutations in the polymerase basic protein 2 subunit (PB2) have been linked to the adaptation of the viral infection in humans (2). Findings from such studies have revealed the biological significance of PB2 as a virulence factor, thus highlighting its potential as an antiviral drug target. The structural genomics program put forth by the National Institute of Allergy and Infectious Disease (NIAID) provides funding to Emerald Bio and three other Pacific Northwest institutions that together make up the Seattle Structural Genomics Center for Infectious Disease (SSGCID). The SSGCID is dedicated to providing the scientific community with three-dimensional protein structures of NIAID category A-C pathogens. Making such structural information available to the scientific community serves to accelerate structure-based drug design. Structure-based drug design plays an important role in drug development. Pursuing multiple targets in parallel greatly increases the chance of success for new lead discovery by targeting a pathway or an entire protein family. Emerald Bio has developed a high-throughput, multi-target parallel processing pipeline (MTPP) for gene-to-structure determination to support the consortium. Here we describe the protocols used to determine the structure of the PB2 subunit from four different influenza A strains.

  14. Research and application of information system for sandstone-type uranium exploration

    International Nuclear Information System (INIS)

    Han Shaoyang; Huang Shutao; Hou Huiqun

    2003-01-01

    The GIS (Geographical Information System) technique is applied to the exploration and evaluation of in-situ leachable sandstone-type uranium deposits, and a desktop GIS application system is created for non-GIS professionals. ArcView 3.2 is taken as the integration platform of the information system, and secondary development of the software functions is carried out in the AVENUE language provided by ArcView 3.2. According to the needs of multi-source information management and integrated evaluation, a series of new functions are appended to the basic platform through the AVENUE language, sufficiently inheriting the ArcView 3.2 software functions, and a friendly graphical user interface is also created, so that the system better implements the following functions: information query, database management, graphics editing, geologic mapping, image processing, spatial analysis, model analysis and result output. In order to better manage the large volume of borehole data and quickly realize borehole mapping, system software for borehole data management and mapping is developed on the basis of the GIS software platform. The system software has been applied to a uranium survey project in the west of the Hailaer basin. Based on a multi-source geoscience information database including geologic, geophysical, geochemical and remote sensing data, the system software has been used to perform integrated analysis of spatial data, enabling deep analysis and study of the metallogenic geologic environments of sandstone-type uranium deposits. In the Kelulun basin, weights-of-evidence analysis has been used to quantitatively predict the prospective areas of sandstone uranium deposits. The information system has also been applied to the integrated evaluation of uranium resources in the south of the Yili basin, the Songliao basin and other areas. (authors)

  15. Variational Bayesian labeled multi-Bernoulli filter with unknown sensor noise statistics

    Directory of Open Access Journals (Sweden)

    Qiu Hao

    2016-10-01

    It is difficult to build an accurate model for measurement noise covariance in complex backgrounds. For scenarios with unknown sensor noise variances, an adaptive multi-target tracking algorithm based on labeled random finite sets and variational Bayesian (VB) approximation is proposed. The variational approximation technique is introduced into the labeled multi-Bernoulli (LMB) filter to jointly estimate the states of targets and the sensor noise variances. Simulation results show that the proposed method gives unbiased estimates of cardinality and has better performance than the VB probability hypothesis density (VB-PHD) filter and the VB cardinality balanced multi-target multi-Bernoulli (VB-CBMeMBer) filter in harsh situations. The simulations also confirm the robustness of the proposed method against time-varying noise variances. The computational complexity of the proposed method is higher than that of the VB-PHD and VB-CBMeMBer filters in extreme cases, while the mean execution times of the three methods are close when targets are well separated.
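
    The full LMB machinery is beyond a short sketch, but the core VB idea used here (alternating between a Gaussian update of the state and an inverse-gamma update of the unknown measurement noise variance) can be illustrated for a single scalar track. This is a generic fixed-point sketch in the style of VB adaptive Kalman filtering, not the paper's algorithm; all parameter names are assumptions:

```python
def vb_update(m_pred, P_pred, y, alpha, beta, n_iter=5):
    """One VB measurement update for a scalar state observed directly
    (y = x + v), with unknown noise variance R ~ InverseGamma(alpha, beta).
    Alternates a Kalman update given the current estimate of R with a
    moment update of the inverse-gamma posterior."""
    alpha_post = alpha + 0.5          # one measurement adds 1/2 to the shape
    beta_post = beta
    for _ in range(n_iter):
        R_hat = beta_post / alpha_post          # current noise-variance estimate
        K = P_pred / (P_pred + R_hat)           # Kalman gain
        m = m_pred + K * (y - m_pred)           # posterior mean
        P = (1.0 - K) * P_pred                  # posterior variance
        # Update the inverse-gamma scale with the expected squared residual
        beta_post = beta + 0.5 * ((y - m) ** 2 + P)
    return m, P, alpha_post, beta_post

m, P, a_post, b_post = vb_update(m_pred=0.0, P_pred=1.0, y=1.0,
                                 alpha=2.0, beta=2.0)
```

    A few fixed-point iterations are typically enough for this coupled update to settle, which is what keeps the per-scan cost of such VB filters close to their known-noise counterparts when targets are well separated.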

  16. Complex interactions between phytochemicals. The multi-target therapeutic concept of phytotherapy.

    Science.gov (United States)

    Efferth, Thomas; Koch, Egon

    2011-01-01

    Drugs derived from natural resources represent a significant segment of the pharmaceutical market as compared to randomly synthesized compounds. It is a goal of drug development programs to design selective ligands that act on single disease targets to obtain highly effective and safe drugs with low side effects. Although this strategy was successful for many new therapies, there is a marked decline in the number of new drugs introduced into clinical practice over the past decades. One reason for this failure may be due to the fact that the pathogenesis of many diseases is rather multi-factorial in nature and not due to a single cause. Phytotherapy, whose therapeutic efficacy is based on the combined action of a mixture of constituents, offers new treatment opportunities. Because of their biological defence function, plant secondary metabolites act by targeting and disrupting the cell membrane, by binding and inhibiting specific proteins or they adhere to or intercalate into RNA or DNA. Phytotherapeutics may exhibit pharmacological effects by the synergistic or antagonistic interaction of many phytochemicals. Mechanistic reasons for interactions are bioavailability, interference with cellular transport processes, activation of pro-drugs or deactivation of active compounds to inactive metabolites, action of synergistic partners at different points of the same signalling cascade (multi-target effects) or inhibition of binding to target proteins. "-Omics" technologies and systems biology may facilitate unravelling synergistic effects of herbal mixtures.

  17. A Concept Lattice for Semantic Integration of Geo-Ontologies Based on Weight of Inclusion Degree Importance and Information Entropy

    Directory of Open Access Journals (Sweden)

    Jia Xiao

    2016-11-01

    Constructing a merged concept lattice with formal concept analysis (FCA) is an important research direction in the field of integrating multi-source geo-ontologies. Extracting essential geographical properties and reducing the concept lattice are two key points of previous research. A formal integration method is proposed to address the challenges in these two areas. We first extract essential properties from multi-source geo-ontologies and use FCA to build a merged formal context. Second, the combined importance weight of each single attribute of the formal context is calculated by introducing the inclusion degree importance from rough set theory and information entropy; a weighted formal context is then built from the merged formal context. Third, a combined weighted concept lattice is established from the weighted formal context with FCA, and the importance weight value of every concept is defined as the sum of the weights of the attributes belonging to the concept's intent. Finally, the semantic granularity of a concept is defined by its importance weight; we then gradually reduce the weighted concept lattice by setting a diminishing threshold of semantic granularity. Additionally, all of the reduced lattices are organized into a regular hierarchy structure based on the threshold of semantic granularity. A workflow is designed to demonstrate this procedure. A case study is conducted to show the feasibility and validity of this method and the procedure to integrate multi-source geo-ontologies.
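
    As an illustrative fragment only: one half of the combined weight described above, weighting each attribute of a binary formal context by the Shannon entropy of its incidence column. The inclusion degree importance term from rough set theory is not reproduced here, and the normalization is an assumption for the example:

```python
import math

def attribute_entropy_weights(context):
    """context: list of object rows over the same attributes, each row a
    list of 0/1 incidences.  Returns one weight per attribute, based on the
    Shannon entropy of that attribute's column, normalized to sum to 1."""
    n_obj = len(context)
    n_attr = len(context[0])
    entropies = []
    for j in range(n_attr):
        p = sum(row[j] for row in context) / n_obj   # P(attribute present)
        if p in (0.0, 1.0):
            entropies.append(0.0)                    # constant column: no information
        else:
            entropies.append(-(p * math.log2(p) + (1 - p) * math.log2(1 - p)))
    total = sum(entropies) or 1.0
    return [h / total for h in entropies]

ctx = [[1, 0, 1],
       [1, 1, 0],
       [1, 0, 0],
       [1, 1, 1]]
w = attribute_entropy_weights(ctx)   # first column is constant, so w[0] = 0
```

    An attribute shared by every object carries no discriminating information and gets zero weight, which is what lets the subsequent granularity threshold prune concepts built on uninformative intents.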

  18. Exploiting target amplitude information to improve multi-target tracking

    Science.gov (United States)

    Ehrman, Lisa M.; Blair, W. Dale

    2006-05-01

    Closely-spaced (but resolved) targets pose a challenge for measurement-to-track data association algorithms. Since the Mahalanobis distances between measurements collected on closely-spaced targets and tracks are similar, several elements of the corresponding kinematic measurement-to-track cost matrix are also similar. Lacking any other information on which to base assignments, it is not surprising that data association algorithms make mistakes. One ad hoc approach for mitigating this problem is to multiply the kinematic measurement-to-track likelihoods by amplitude likelihoods. However, this can actually be detrimental to the measurement-to-track association process. With that in mind, this paper pursues a rigorous treatment of the hypothesis probabilities for kinematic measurements and features. Three simple scenarios are used to demonstrate the impact of basing data association decisions on these hypothesis probabilities for Rayleigh, fixed-amplitude, and Rician targets. The first scenario assumes that the tracker carries two tracks but only one measurement is collected. This provides insight into more complex scenarios in which there are fewer measurements than tracks. The second scenario includes two measurements and one track. This extends naturally to the case with more measurements than tracks. Two measurements and two tracks are present in the third scenario, which provides insight into the performance of this method when the number of measurements equals the number of tracks. In all cases, basing data association decisions on the hypothesis probabilities leads to good results.
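
    A hypothetical sketch of the kind of cost matrix discussed in this abstract, combining a Gaussian kinematic term (via the Mahalanobis distance) with a Rayleigh amplitude term for a fluctuating target of known mean SNR. This is not the authors' formulation of the hypothesis probabilities; the function name, the Rayleigh/SNR model, and the scenario values are assumptions:

```python
import numpy as np

def association_costs(meas_pos, meas_amp, track_pos, track_cov, snr_db):
    """Negative log-likelihood cost matrix for measurement-to-track
    association.  Rows index measurements, columns index tracks."""
    snr = 10.0 ** (snr_db / 10.0)
    cost = np.zeros((len(meas_pos), len(track_pos)))
    for i in range(len(meas_pos)):
        for j in range(len(track_pos)):
            d = meas_pos[i] - track_pos[j]
            S = track_cov[j]                       # innovation covariance
            maha = d @ np.linalg.solve(S, d)       # Mahalanobis distance squared
            kin_nll = 0.5 * (maha + np.log(np.linalg.det(2 * np.pi * S)))
            # Rayleigh amplitude likelihood with scale set by 1 + mean SNR
            a = meas_amp[i]
            amp_nll = -np.log(a / (1.0 + snr)) + a ** 2 / (2.0 * (1.0 + snr))
            cost[i, j] = kin_nll + amp_nll
    return cost

C = association_costs(
    meas_pos=[np.array([0.1, 0.0])],
    meas_amp=[3.0],
    track_pos=[np.zeros(2), np.array([5.0, 5.0])],
    track_cov=[np.eye(2), np.eye(2)],
    snr_db=10.0)
```

    With a single shared SNR the amplitude term shifts every column equally; its discriminating power appears once competing hypotheses (e.g., target-originated versus clutter-originated, or targets of different SNR) assign different amplitude densities to the same measurement, which is the situation the abstract's hypothesis probabilities are built to handle.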

  19. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  20. Performance of the S-χ² Statistic for Full-Information Bifactor Models

    Science.gov (United States)

    Li, Ying; Rupp, Andre A.

    2011-01-01

    This study investigated the Type I error rate and power of the multivariate extension of the S-χ² statistic using unidimensional and multidimensional item response theory (UIRT and MIRT, respectively) models as well as full-information bifactor (FI-bifactor) models through simulation. Manipulated factors included test length, sample…

  1. Photocatalytic applications of Cr₂S₃ synthesized from single and multi-source precursors

    Energy Technology Data Exchange (ETDEWEB)

    Hussain, Wajid [Department of Chemistry, Quaid-i-Azam University, 45320, Islamabad (Pakistan); Badshah, Amin, E-mail: aminbadshah@qau.edu.pk [Department of Chemistry, Quaid-i-Azam University, 45320, Islamabad (Pakistan); Hussain, Raja Azadar; Imtiaz-ud-Din [Department of Chemistry, Quaid-i-Azam University, 45320, Islamabad (Pakistan); Aleem, Muhammad Adeel [The Pakistan Institute of Engineering and Applied Sciences (PIEAS) (Pakistan); Bahadur, Ali [Department of Chemistry, Quaid-i-Azam University, 45320, Islamabad (Pakistan); Iqbal, Shahid [School of Chemistry and Chemical Engineering, University of Chinese Academy of Sciences, Beijing, 100049 (China); Farooq, Muhammad Umar; Ali, Hassan [Department of Chemistry, Quaid-i-Azam University, 45320, Islamabad (Pakistan)

    2017-06-15

    Most materials research work is pertinent to the synthesis of transition-metal sulfide nanoparticles, but here the studies are limited to the synthesis of chromium sulfide. However, the preparation method presented in this work may be extended to other metal chalcogenide nanoparticles for various potential applications. The ligand (precursor), 1-(2-chloro-4-nitrophenyl)-3,3-chlorobenzoyl, and Cr₂S₃ have been synthesized initially from a single-source precursor and then from multi-source precursors. The target was to alter the morphologies of the nanomaterial by altering the synthetic route, and that was successfully achieved: chromium sulfide nano-rods were synthesized using single-source precursors while nanoparticles were fabricated using multi-source precursors. Characterization was carried out through ¹H and ¹³C NMR, scanning electron microscopy (SEM), transmission electron microscopy (TEM), powder X-ray diffraction (PXRD), energy dispersive X-ray spectroscopy (EDX), Raman spectroscopy, X-ray photoelectron spectroscopy (XPS) and high resolution transmission electron microscopy (HRTEM). Since our objective was to change the morphologies by changing the synthetic route, further applications were carried out only for the multi-source product, not the single-source product. Metal sulfide nanoparticles exhibit higher activity than their bulk material for the photocatalytic degradation of organic dyes under visible-light irradiation, so photocatalytic activity was successfully demonstrated under direct sunlight against five different cationic and anionic organic dyes including malachite green (MG), methylene blue (MB), rhodamine B (RhB), methyl violet (MV) and methyl orange (MO). The dyes MV, MG, MB, and RhB were almost completely decolorized by Cr₂S₃ within 110, 90, 100, and 130 minutes, respectively, except MO. - Highlights: • Synthesis of Cr₂S₃ from single and multisource precursors is

  2. Histone deacetylase inhibitors (HDACIs): multitargeted anticancer agents

    Directory of Open Access Journals (Sweden)

    Ververis K

    2013-02-01

    advancement of these drugs, especially to facilitate the rational design of HDAC inhibitors that are effective as antineoplastic agents. This review will discuss the use of HDAC inhibitors as multitargeted therapies for malignancy. Further, we outline the pharmacology and mechanisms of action of HDAC inhibitors while discussing the safety and efficacy of these compounds in clinical studies to date. Keywords: chromatin modifications, histone acetylation, histone deacetylase inhibitor, suberoylanilide hydroxamic acid, depsipeptide, entinostat

  3. From Quality to Information Quality in Official Statistics

    Directory of Open Access Journals (Sweden)

    Kenett Ron S.

    2016-12-01

    The term quality of statistical data, developed and used in official statistics and international organizations such as the International Monetary Fund (IMF) and the Organisation for Economic Co-operation and Development (OECD), refers to the usefulness of summary statistics generated by producers of official statistics. Similarly, in the context of survey quality, official agencies such as Eurostat, the National Center for Science and Engineering Statistics (NCSES), and Statistics Canada have created dimensions for evaluating the quality of a survey and its ability to report ‘accurate survey data’.

  4. Curcumin, the golden nutraceutical: multitargeting for multiple chronic diseases.

    Science.gov (United States)

    Kunnumakkara, Ajaikumar B; Bordoloi, Devivasha; Padmavathi, Ganesan; Monisha, Javadi; Roy, Nand Kishor; Prasad, Sahdeo; Aggarwal, Bharat B

    2017-06-01

    Curcumin, a yellow pigment in the Indian spice Turmeric (Curcuma longa), which is chemically known as diferuloylmethane, was first isolated exactly two centuries ago in 1815 by two German Scientists, Vogel and Pelletier. However, according to the pubmed database, the first study on its biological activity as an antibacterial agent was published in 1949 in Nature and the first clinical trial was reported in The Lancet in 1937. Although the current database indicates almost 9000 publications on curcumin, until 1990 there were less than 100 papers published on this nutraceutical. At the molecular level, this multitargeted agent has been shown to exhibit anti-inflammatory activity through the suppression of numerous cell signalling pathways including NF-κB, STAT3, Nrf2, ROS and COX-2. Numerous studies have indicated that curcumin is a highly potent antimicrobial agent and has been shown to be active against various chronic diseases including various types of cancers, diabetes, obesity, cardiovascular, pulmonary, neurological and autoimmune diseases. Furthermore, this compound has also been shown to be synergistic with other nutraceuticals such as resveratrol, piperine, catechins, quercetin and genistein. To date, over 100 different clinical trials have been completed with curcumin, which clearly show its safety, tolerability and its effectiveness against various chronic diseases in humans. However, more clinical trials in different populations are necessary to prove its potential against different chronic diseases in humans. This review's primary focus is on lessons learnt about curcumin from clinical trials. This article is part of a themed section on Principles of Pharmacological Research of Nutraceuticals. To view the other articles in this section visit http://onlinelibrary.wiley.com/doi/10.1111/bph.v174.11/issuetoc. © 2016 The British Pharmacological Society.

  5. Curcumin, the golden nutraceutical: multitargeting for multiple chronic diseases

    Science.gov (United States)

    Bordoloi, Devivasha; Padmavathi, Ganesan; Monisha, Javadi; Roy, Nand Kishor; Prasad, Sahdeo

    2016-01-01

    Curcumin, a yellow pigment in the Indian spice Turmeric (Curcuma longa), which is chemically known as diferuloylmethane, was first isolated exactly two centuries ago in 1815 by two German Scientists, Vogel and Pelletier. However, according to the pubmed database, the first study on its biological activity as an antibacterial agent was published in 1949 in Nature and the first clinical trial was reported in The Lancet in 1937. Although the current database indicates almost 9000 publications on curcumin, until 1990 there were less than 100 papers published on this nutraceutical. At the molecular level, this multitargeted agent has been shown to exhibit anti‐inflammatory activity through the suppression of numerous cell signalling pathways including NF‐κB, STAT3, Nrf2, ROS and COX‐2. Numerous studies have indicated that curcumin is a highly potent antimicrobial agent and has been shown to be active against various chronic diseases including various types of cancers, diabetes, obesity, cardiovascular, pulmonary, neurological and autoimmune diseases. Furthermore, this compound has also been shown to be synergistic with other nutraceuticals such as resveratrol, piperine, catechins, quercetin and genistein. To date, over 100 different clinical trials have been completed with curcumin, which clearly show its safety, tolerability and its effectiveness against various chronic diseases in humans. However, more clinical trials in different populations are necessary to prove its potential against different chronic diseases in humans. This review's primary focus is on lessons learnt about curcumin from clinical trials. Linked Articles This article is part of a themed section on Principles of Pharmacological Research of Nutraceuticals. To view the other articles in this section visit http://onlinelibrary.wiley.com/doi/10.1111/bph.v174.11/issuetoc PMID:27638428

  6. [Pitfalls in informed consent: a statistical analysis of malpractice law suits].

    Science.gov (United States)

    Echigo, Junko

    2014-05-01

    In medical malpractice law suits, the notion of informed consent is often relevant in assessing whether negligence can be attributed to the medical practitioner who has caused injury to a patient. Furthermore, it is not rare that courts award damages for a lack of appropriate informed consent alone. In this study, two results were arrived at from a statistical analysis of medical malpractice law suits. One, unexpectedly, was that the severity of a patient's illness made no significant difference to whether damages were awarded. The other was that cases of typical medical treatment that national medical insurance does not cover were involved significantly more often than insured treatment cases. In cases where damages were awarded, the courts required more disclosure and written documents of information by medical practitioners, especially about complications and adverse effects that the patient might suffer.

  7. [Comparison of precision in retrieving soybean leaf area index based on multi-source remote sensing data].

    Science.gov (United States)

    Gao, Lin; Li, Chang-chun; Wang, Bao-shan; Yang Gui-jun; Wang, Lei; Fu, Kui

    2016-01-01

    With the innovation of remote sensing technology, remote sensing data sources are more and more abundant. The main aim of this study was to analyze the retrieval accuracy of soybean leaf area index (LAI) based on multi-source remote sensing data, including ground hyperspectral, unmanned aerial vehicle (UAV) multispectral and Gaofen-1 (GF-1) WFV data. The ratio vegetation index (RVI), normalized difference vegetation index (NDVI), soil-adjusted vegetation index (SAVI), difference vegetation index (DVI), and triangle vegetation index (TVI) were used to establish LAI retrieval models, respectively. The models with the highest calibration accuracy were used in the validation. The capability of these three kinds of remote sensing data for LAI retrieval was assessed according to the estimation accuracy of the models. The experimental results showed that the models based on the ground hyperspectral and UAV multispectral data achieved better estimation accuracy (R² was more than 0.69 and RMSE was less than 0.4 at the 0.01 significance level) than the model based on WFV data. The RVI logarithmic model based on ground hyperspectral data was slightly superior to the NDVI linear model based on UAV multispectral data (the differences in E(A), R² and RMSE were 0.3%, 0.04 and 0.006, respectively). The models based on WFV data had the lowest estimation accuracy, with R² less than 0.30 and RMSE more than 0.70. The effects of sensor spectral response characteristics, sensor geometric location and spatial resolution on soybean LAI retrieval are discussed. The results demonstrated that ground hyperspectral data were advantageous, but not prominently so, over traditional multispectral data in soybean LAI retrieval. WFV imagery with 16 m spatial resolution could not meet the requirements of crop growth monitoring at the field scale. Under the condition of ensuring high precision in retrieving soybean LAI and working efficiently, the approach to acquiring agricultural information by UAV remote
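
    The two-band indices named in this abstract have standard textbook definitions; the sketch below computes them from red and near-infrared reflectance. TVI is omitted because several variants exist in the literature, and the SAVI soil-adjustment factor L = 0.5 is the customary default, not a value taken from the paper:

```python
import numpy as np

def vegetation_indices(nir, red):
    """Common two-band vegetation indices from red and NIR reflectance."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return {
        "RVI":  nir / red,                               # ratio vegetation index
        "NDVI": (nir - red) / (nir + red),               # normalized difference
        "SAVI": 1.5 * (nir - red) / (nir + red + 0.5),   # soil-adjusted, L = 0.5
        "DVI":  nir - red,                               # simple difference
    }

vi = vegetation_indices(nir=[0.45], red=[0.09])
```

    An LAI retrieval model of the kind compared in the study is then a regression (linear or logarithmic) of field-measured LAI on one of these index values.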

  8. Design, Synthesis and in Combo Antidiabetic Bioevaluation of Multitarget Phenylpropanoic Acids

    Directory of Open Access Journals (Sweden)

    Blanca Colín-Lozano

    2018-02-01

    We have synthesized a small series of five 3-[4-arylmethoxyphenyl]propanoic acids employing an easy and short synthetic pathway. The compounds were tested in vitro against a set of four protein targets identified as key elements in diabetes: G protein-coupled receptor 40 (GPR40), aldose reductase (AKR1B1), peroxisome proliferator-activated receptor gamma (PPARγ) and solute carrier family 2 (facilitated glucose transporter), member 4 (GLUT-4). Compound 1 displayed an EC50 value of 0.075 μM against GPR40 and was an AKR1B1 inhibitor, showing IC50 = 7.4 μM. Compounds 2 and 3 act as weak AKR1B1 inhibitors and potent GPR40 agonists, and showed an increase of 2 to 4 times in the mRNA expression of PPARγ, as well as in GLUT-4 levels. Docking studies were conducted in order to explain the polypharmacological mode of action and the binding mode of the most active molecules on these targets, showing several coincidences with co-crystal ligands. Compounds 1–3 were tested in vivo at an explorative 100 mg/kg dose, with 2 and 3 being orally active, reducing glucose levels in a non-insulin-dependent diabetes mouse model. Compounds 2 and 3 displayed robust in vitro potency and in vivo efficacy, and could be considered as promising multitarget antidiabetic candidates. This is the first report of a single molecule with polypharmacological action on these four targets.

  9. [Estimation of desert vegetation coverage based on multi-source remote sensing data].

    Science.gov (United States)

    Wan, Hong-Mei; Li, Xia; Dong, Dao-Rui

    2012-12-01

    Taking the lower reaches of the Tarim River in Xinjiang of Northwest China as the study area, and based on ground investigation and multi-source remote sensing data of different resolutions, estimation models for desert vegetation coverage were built, and the precisions of different estimation methods and models were compared. The results showed that with increasing spatial resolution of the remote sensing data, the precision of the estimation models increased. The estimation precision of the models based on the high, middle-high, and middle-low resolution remote sensing data was 89.5%, 87.0%, and 84.56%, respectively, and the precisions of the remote sensing models were higher than that of the vegetation index method. This study revealed the change patterns of the estimation precision of desert vegetation coverage based on different spatial resolution remote sensing data, and realized the quantitative conversion of the parameters and scales among the high, middle, and low spatial resolution remote sensing data of desert vegetation coverage, which would provide direct evidence for establishing and implementing a comprehensive remote sensing monitoring scheme for the ecological restoration in the study area.

  10. 75 FR 21231 - Proposed Information Collection; Comment Request; Marine Recreational Fisheries Statistics Survey

    Science.gov (United States)

    2010-04-23

    ... Collection; Comment Request; Marine Recreational Fisheries Statistics Survey AGENCY: National Oceanic and... Andrews, (301) 713-2328, ext. 148 or [email protected] . SUPPLEMENTARY INFORMATION: I. Abstract Marine recreational anglers are surveyed for catch and effort data, fish biology data, and angler socioeconomic...

  11. Entropy statistics and information theory

    NARCIS (Netherlands)

    Frenken, K.; Hanusch, H.; Pyka, A.

    2007-01-01

    Entropy measures provide important tools to indicate variety in distributions at particular moments in time (e.g., market shares) and to analyse evolutionary processes over time (e.g., technical change). Importantly, entropy statistics are well suited to decomposition analysis, which renders the
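
    The decomposition property this abstract refers to can be illustrated with a short sketch: total Shannon entropy splits exactly into a between-group term plus a share-weighted within-group term. The shares and grouping below are hypothetical, purely for illustration.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical market shares of four firms, partitioned into two groups (e.g., regions).
groups = [[0.4, 0.2], [0.3, 0.1]]
shares = [s for g in groups for s in g]

# Between-group term: entropy of the aggregate group shares.
group_totals = [sum(g) for g in groups]
h_between = shannon_entropy(group_totals)

# Within-group term: entropy of each group's normalized shares,
# weighted by that group's aggregate share.
h_within = sum(t * shannon_entropy([s / t for s in g])
               for g, t in zip(groups, group_totals))

h_total = shannon_entropy(shares)  # equals h_between + h_within exactly
```

    The identity H_total = H_between + Σ_g w_g H_g holds exactly for any grouping, which is what makes entropy statistics convenient for decomposition analysis.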

  12. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches

  13. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    Full Text Available A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model

  14. AVCRI104P3, a novel multitarget compound with cognition-enhancing and anxiolytic activities: studies in cognitively poor middle-aged mice.

    Science.gov (United States)

    Giménez-Llort, L; Ratia, M; Pérez, B; Camps, P; Muñoz-Torrero, D; Badia, A; Clos, M V

    2015-06-01

    The present work describes, for the first time, the in vivo effects of the multitarget compound AVCRI104P3, a new anticholinesterase drug with potent inhibitory effects on human AChE, human BuChE and BACE-1 activities, as well as on AChE-induced and self-induced Aβ aggregation. We characterized the behavioral effects of chronic treatment with AVCRI104P3 (0.6 μmol kg(-1), i.p., 21 days) in a sample of middle-aged (12-month-old) male 129/Sv×C57BL/6 mice with poor cognitive performance, as shown by the slow acquisition curves of saline-treated animals. In addition, a comparative assessment of cognitive and non-cognitive actions was performed using the in vitro equipotent dose of huprine X (0.12 μmol kg(-1)), a huperzine A-tacrine hybrid. The screening assessed locomotor activity, anxiety-like behaviors, cognitive function and side effects. The results on the 'acquisition' of spatial learning and memory show that AVCRI104P3 exerted pro-cognitive effects, improving both short- and long-term processes and resulting in fast and efficient acquisition of the place task in the Morris water maze. On the other hand, a removal test and a perceptual visual learning task indicated that both AChEIs improved short-term 'memory' as compared to saline-treated mice. Both drugs elicited the same response in the corner test, but only AVCRI104P3 exhibited anxiolytic-like actions in the dark/light box test. These cognitive-enhancing and anxiolytic-like effects demonstrated herein in a sample of middle-aged animals, together with the lack of adverse effects, strongly encourage further studies of AVCRI104P3 as a promising multitarget therapeutic agent for the treatment of the cholinergic dysfunction underlying natural aging and/or dementias. Copyright © 2015. Published by Elsevier B.V.

  15. Higher-Order Statistical Correlations and Mutual Information Among Particles in a Quantum Well

    Science.gov (United States)

    Yépez, V. S.; Sagar, R. P.; Laguna, H. G.

    2017-12-01

    The influence of wave function symmetry on statistical correlation is studied for the case of three non-interacting spin-free quantum particles in a unidimensional box, in position and in momentum space. The higher-order statistical correlations occurring among the three particles in this quantum system are quantified via higher-order mutual information and compared to the correlation between pairs of variables in this model, and to the correlation in the two-particle system. The results for the higher-order mutual information show that there are states where the symmetric wave functions are more correlated than the antisymmetric ones with the same quantum numbers. This holds in position as well as in momentum space. This behavior is opposite to that observed for the correlation between pairs of variables in this model, and in the two-particle system, where the antisymmetric wave functions are in general more correlated. These results are also consistent with those observed in a system of three uncoupled oscillators. The use of higher-order mutual information as a correlation measure is monitored and examined by considering a superposition of states or systems with two Slater determinants.
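
    For discrete variables, the higher-order mutual information used here reduces to an inclusion-exclusion sum over marginal entropies. A minimal sketch (using a toy XOR system, not the particle-in-a-box states of the paper) shows how three variables can be pairwise independent yet carry genuine three-way correlation:

```python
import math

def H(joint, axes):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# Toy three-variable system: X, Y uniform and independent, Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

# Pairwise mutual information I(X;Z) = H(X) + H(Z) - H(X,Z).
i_xz = H(joint, [0]) + H(joint, [2]) - H(joint, [0, 2])

# Higher-order (interaction) information via inclusion-exclusion over entropies.
i_xyz = (H(joint, [0]) + H(joint, [1]) + H(joint, [2])
         - H(joint, [0, 1]) - H(joint, [0, 2]) - H(joint, [1, 2])
         + H(joint, [0, 1, 2]))

# Every pair is independent (zero MI), yet the triple is maximally dependent:
# i_xz is 0 while i_xyz is -1 bit, a purely three-body correlation.
```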

  16. Higher-Order Statistical Correlations and Mutual Information Among Particles in a Quantum Well

    International Nuclear Information System (INIS)

    Yépez, V. S.; Sagar, R. P.; Laguna, H. G.

    2017-01-01

    The influence of wave function symmetry on statistical correlation is studied for the case of three non-interacting spin-free quantum particles in a unidimensional box, in position and in momentum space. The higher-order statistical correlations occurring among the three particles in this quantum system are quantified via higher-order mutual information and compared to the correlation between pairs of variables in this model, and to the correlation in the two-particle system. The results for the higher-order mutual information show that there are states where the symmetric wave functions are more correlated than the antisymmetric ones with the same quantum numbers. This holds in position as well as in momentum space. This behavior is opposite to that observed for the correlation between pairs of variables in this model, and in the two-particle system, where the antisymmetric wave functions are in general more correlated. These results are also consistent with those observed in a system of three uncoupled oscillators. The use of higher-order mutual information as a correlation measure is monitored and examined by considering a superposition of states or systems with two Slater determinants. (author)

  17. Novel Tacrine-Hydroxyphenylbenzimidazole hybrids as potential multitarget drug candidates for Alzheimer's disease.

    Science.gov (United States)

    Hiremathad, Asha; Keri, Rangappa S; Esteves, A Raquel; Cardoso, Sandra M; Chaves, Sílvia; Santos, M Amélia

    2018-03-25

    Alzheimer's disease (AD) is a severe age-dependent neurodegenerative disorder affecting millions of people, with no cure so far. The current treatments achieve only temporary amelioration of the cognitive symptoms. The main characteristics of patient brains include the accumulation of amyloid plaques and neurofibrillary tangles (outside and inside the neurons), but also cholinergic deficit, increased oxidative stress and dyshomeostasis of transition metal ions. Considering the multi-factorial nature of AD, we report herein the development of a novel series of potential multi-target directed drugs which, besides the capacity to recover the cholinergic neurons, can also target other AD hallmarks. The novel series of tacrine-hydroxyphenylbenzimidazole (TAC-BIM) hybrid molecules has been designed, synthesized and studied for its multiple biological activities. These agents showed improved AChE inhibitory activity (IC50 in the nanomolar range), as compared with the single drug tacrine (TAC), and also high inhibition of self-induced and Cu-induced Aβ aggregation (up to 75%). They also present moderate radical scavenging activity and metal chelating ability. In addition, neuroprotective studies revealed that all the tested compounds are able to inhibit the neurotoxicity induced by Aβ and Fe/AscH(-) in neuronal cells. Hence, structure-activity relationships are discussed for this set of hybrids, and their promise as potential anti-AD drugs is highlighted. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  18. Singly and Doubly Charged Projectile Fragments in Nucleus-Emulsion Collisions at Dubna Energy in the Framework of the Multi-Source Model

    International Nuclear Information System (INIS)

    Er-Qin, Wang; Fu-Hu, Liu; Jian-Xin, Sun; Rahim, Magda A.; Fakhraddin, S.

    2011-01-01

    The multiplicity distributions of projectile fragments emitted in interactions of different nuclei with emulsion are studied by using a multi-source model. Our calculated results show that the projectile fragments can be described by the model, with each source contributing an exponential distribution. A multi-component Erlang distribution, the weighted sum of folds of many exponential distributions, is used to describe the experimental data. The relationship between the height (or width) of the distribution and the mass of the incident projectile is investigated, as is the dependence of the projectile fragments on the target group. (nuclear physics)
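
    The folding construction described above — each source contributing an exponential, the fold of m sources giving an Erlang of order m, and a weighted sum over components giving the multi-component distribution — can be sketched with a short Monte Carlo. The weights, orders, and scale below are illustrative and not taken from the paper:

```python
import random

random.seed(7)

def erlang_sample(m, scale):
    """Folding m exponential 'sources' yields one Erlang(m, scale) variate."""
    return sum(random.expovariate(1.0 / scale) for _ in range(m))

# Multi-component mixture: each draw picks a component (a group of sources)
# by weight, then folds that many exponentials.
weights = [0.6, 0.4]   # hypothetical component weights
orders = [2, 5]        # number of exponential sources folded per component
scale = 1.0

draws = [erlang_sample(random.choices(orders, weights)[0], scale)
         for _ in range(20000)]

mean = sum(draws) / len(draws)
# Mixture mean = sum_i w_i * m_i * scale = 0.6*2 + 0.4*5 = 3.2
```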

  19. Detection of Mycoplasma genitalium in female cervical samples by Multitarget Real-Time PCR

    Directory of Open Access Journals (Sweden)

    Sabina Mahmutović-Vranić

    2007-05-01

    Full Text Available Mycoplasma genitalium (MG) is associated with a variety of urogenital infections such as non-gonococcal urethritis (NGU), endometritis and cervicitis. The objective of this study was to demonstrate and evaluate a research polymerase chain reaction (PCR) assay for the detection of MG in cervical samples of a tested population of women attending gynecology clinics in Bosnia and Herzegovina. Multitarget Real-Time (MTRT) PCR, utilizing the ABI 7900HT sequence detection system, was performed for the detection of MG. Cervical samples (N=97) from females were divided into three patient groups: Group 1: patients with known abnormal clinical cytology reports (N=34); Group 2: patients who reported a history of genitourinary infections (N=22); and Group 3: patients in neither group 1 nor 2 (N=41). Overall, 14.43% (14/97) of those tested were positive for MG. A positive sample was defined as having a cycle threshold cross point (Ct) < 40.0 with fluorescent detection comparable to the low positive control utilized during the run. This study validated the use of MTRT PCR as a reliable method for the detection of MG in clinical specimens and should facilitate large-scale screening for this organism.

  20. Multitarget-directed tricyclic pyridazinones as G protein-coupled receptor ligands and cholinesterase inhibitors.

    Science.gov (United States)

    Pau, Amedeo; Catto, Marco; Pinna, Giovanni; Frau, Simona; Murineddu, Gabriele; Asproni, Battistina; Curzu, Maria M; Pisani, Leonardo; Leonetti, Francesco; Loza, Maria Isabel; Brea, José; Pinna, Gérard A; Carotti, Angelo

    2015-06-01

    By following a multitarget ligand design approach, a library of 47 compounds was prepared, and they were tested as binders of selected G protein-coupled receptors (GPCRs) and inhibitors of acetyl and/or butyryl cholinesterase. The newly designed ligands feature pyridazinone-based tricyclic scaffolds connected through alkyl chains of variable length to proper amine moieties (e.g., substituted piperazines or piperidines) for GPCR and cholinesterase (ChE) molecular recognition. The compounds were tested at three different GPCRs, namely serotoninergic 5-HT1A, adrenergic α1A, and dopaminergic D2 receptors. Our main goal was the discovery of compounds that exhibit, in addition to ChE inhibition, antagonist activity at 5-HT1A because of its involvement in neuronal deficits typical of Alzheimer's and other neurodegenerative diseases. Ligands with nanomolar affinity for the tested GPCRs were discovered, but most of them behaved as dual antagonists of α1A and 5-HT1A receptors. Nevertheless, several compounds displaying this GPCR affinity profile also showed moderate to good inhibition of AChE and BChE, thus deserving further investigations to exploit the therapeutic potential of such unusual biological profiles. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Multitarget Effects of Danqi Pill on Global Gene Expression Changes in Myocardial Ischemia

    Directory of Open Access Journals (Sweden)

    Qiyan Wang

    2018-01-01

    Full Text Available Danqi pill (DQP) is a widely prescribed traditional Chinese medicine (TCM) in the treatment of cardiovascular diseases. The objective of this study is to systematically characterize the altered gene expression pattern induced by myocardial ischemia (MI) in a rat model and to investigate the effects of DQP on global gene expression. Global mRNA expression was measured. Differentially expressed genes among the sham group, model group, and DQP group were analyzed. The gene ontology enrichment analysis and pathway analysis of differentially expressed genes were carried out. We quantified 10,813 genes. Compared with the sham group, expressions of 339 genes were upregulated and 177 genes were downregulated in the model group. The upregulated genes were enriched in extracellular matrix organization, response to wounding, and defense response pathways. Downregulated genes were enriched in fatty acid metabolism, pyruvate metabolism, PPAR signaling pathways, and so forth. This indicated that energy metabolic disorders occurred in rats with MI. In the DQP group, expressions of genes in the altered pathways were regulated back towards normal levels. DQP reversed expression of 313 of the 516 differentially expressed genes in the model group. This study provides insight into the multitarget mechanism of TCM in the treatment of complex diseases.

  2. Analyzing Statistical Mediation with Multiple Informants: A New Approach with an Application in Clinical Psychology.

    Science.gov (United States)

    Papa, Lesther A; Litson, Kaylee; Lockhart, Ginger; Chassin, Laurie; Geiser, Christian

    2015-01-01

    Testing mediation models is critical for identifying potential variables that need to be targeted to effectively change one or more outcome variables. In addition, it is now common practice for clinicians to use multiple informant (MI) data in studies of statistical mediation. By coupling the use of MI data with statistical mediation analysis, clinical researchers can combine the benefits of both techniques. Integrating the information from MIs into a statistical mediation model creates various methodological and practical challenges. The authors review prior methodological approaches to MI mediation analysis in clinical research and propose a new latent variable approach that overcomes some limitations of prior approaches. An application of the new approach to mother, father, and child reports of impulsivity, frustration tolerance, and externalizing problems (N = 454) is presented. The results showed that frustration tolerance mediated the relationship between impulsivity and externalizing problems. The new approach allows for a more comprehensive and effective use of MI data when testing mediation models.
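
    The product-of-coefficients logic behind single-level mediation — estimate the a-path (M on X), the b-path (Y on M controlling for X), and multiply — can be sketched on simulated single-informant data. The latent-variable multiple-informant model of the paper builds on this same idea; the effect sizes below are invented for illustration:

```python
import random

random.seed(1)
n = 5000
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + random.gauss(0, 1) for xi in x]                         # a-path: X -> M
y = [0.7 * mi + 0.2 * xi + random.gauss(0, 1) for mi, xi in zip(m, x)]  # b-path plus direct effect

def cov(u, v):
    """Sample covariance (biased form; the bias cancels in the ratios below)."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

# a-path: simple regression of M on X.
a_hat = cov(m, x) / cov(x, x)

# b-path: coefficient of M in the regression of Y on (M, X), via 2x2 normal equations.
smm, sxx, smx = cov(m, m), cov(x, x), cov(m, x)
sym, syx = cov(y, m), cov(y, x)
b_hat = (sym * sxx - syx * smx) / (smm * sxx - smx * smx)

indirect = a_hat * b_hat  # product-of-coefficients estimate of the mediated effect
```

    With the generating values above, the indirect effect should estimate 0.5 × 0.7 = 0.35 up to sampling error.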

  3. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  4. Bohm's mysterious 'quantum force' and 'active information': alternative interpretation and statistical properties

    International Nuclear Information System (INIS)

    Lan, B.L.

    2001-01-01

    An alternative interpretation to Bohm's 'quantum force' and 'active information' is proposed. Numerical evidence is presented, which suggests that the time series of Bohm's 'quantum force' evaluated at the Bohmian position for non-stationary quantum states are typically non-Gaussian stable distributed with a flat power spectrum in classically chaotic Hamiltonian systems. An important implication of these statistical properties is briefly mentioned. (orig.)

  5. When Statistical Literacy Really Matters: Understanding Published Information about the HIV/AIDS Epidemic in South Africa

    Science.gov (United States)

    Hobden, Sally

    2014-01-01

    Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…

  6. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    Science.gov (United States)

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the current variant-by-variant analysis paradigm for GWAS of common variants to the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole-genome low-coverage pilot data in the 1000 Genomes Project to calculate the type I error rates and power of seven alternative statistics: the genome-information content-based statistic, the generalized T(2), the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ(2) test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type I error rates and higher power than the other six statistics in both simulated and empirical datasets.
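
    Of the seven statistics compared, the collapsing method is the simplest to sketch: each subject is reduced to a carrier/non-carrier indicator across the rare-variant sites, and the resulting 2x2 carrier-by-status table is tested. The genotype matrices below are invented toy data, not from the Dallas Heart Study:

```python
# Each row is a subject's genotype over rare-variant sites (minor-allele counts 0/1/2).
cases = [[0, 1, 0], [0, 0, 0], [1, 0, 0], [0, 0, 2], [0, 1, 1], [0, 0, 0]]
controls = [[0, 0, 0], [0, 0, 0], [0, 0, 1], [0, 0, 0], [0, 0, 0], [1, 0, 0]]

def carriers(genotypes):
    """Collapse each subject to a 0/1 indicator: carries any rare allele."""
    return sum(1 for g in genotypes if any(c > 0 for c in g))

a, b = carriers(cases), len(cases) - carriers(cases)           # cases: carrier / non-carrier
c, d = carriers(controls), len(controls) - carriers(controls)  # controls: carrier / non-carrier

# Pearson chi-square statistic for the 2x2 carrier-by-status table (1 df).
n = a + b + c + d
chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

    The collapsing step is what makes the test powerful under allelic heterogeneity, but it also illustrates the abstract's criticism: all carried variants are weighted identically regardless of genomic location.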

  7. Information geometry and sufficient statistics

    Czech Academy of Sciences Publication Activity Database

    Ay, N.; Jost, J.; Le, Hong-Van; Schwachhöfer, L.

    2015-01-01

    Roč. 162, 1-2 (2015), s. 327-364 ISSN 0178-8051 Institutional support: RVO:67985840 Keywords : Fisher quadratic form * Amari-Chentsov tensor * sufficient statistic Subject RIV: BA - General Mathematics Impact factor: 2.204, year: 2015 http://link.springer.com/article/10.1007/s00440-014-0574-8

  8. Management information system applied to radiation protection services

    International Nuclear Information System (INIS)

    Grossi, Pablo Andrade; Souza, Leonardo Soares de; Figueiredo, Geraldo Magela; Figueiredo, Arthur

    2013-01-01

    An effective management information system based on technology, information and people is necessary to improve the safety of all processes and operations subject to radiation risks. The complex, multisource information flow from all radiation protection activities in nuclear organizations requires a robust tool/system to highlight strengths and weaknesses and to identify behaviors and trends in the activities requiring radiation protection programs. Such organized and processed data are useful for successful management and for supporting human decision-making in a nuclear organization. This paper presents recent improvements to a management information system based on the radiation protection directives and regulations of the Brazilian regulatory body. This radiation protection control system is applicable to any radiation protection service or research institute subject to Brazilian nuclear regulation, and is a powerful tool for continuous management, indicating not only how the health and safety activities are going, but also why they are not going as well as planned, showing up the critical points. (author)

  9. Management information system applied to radiation protection services

    Energy Technology Data Exchange (ETDEWEB)

    Grossi, Pablo Andrade; Souza, Leonardo Soares de; Figueiredo, Geraldo Magela; Figueiredo, Arthur, E-mail: pabloag@cdtn.br, E-mail: lss@cdtn.br, E-mail: gmf@cdtn.br, E-mail: arthurqof@gmail.com [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-07-01

    An effective management information system based on technology, information and people is necessary to improve the safety of all processes and operations subject to radiation risks. The complex, multisource information flow from all radiation protection activities in nuclear organizations requires a robust tool/system to highlight strengths and weaknesses and to identify behaviors and trends in the activities requiring radiation protection programs. Such organized and processed data are useful for successful management and for supporting human decision-making in a nuclear organization. This paper presents recent improvements to a management information system based on the radiation protection directives and regulations of the Brazilian regulatory body. This radiation protection control system is applicable to any radiation protection service or research institute subject to Brazilian nuclear regulation, and is a powerful tool for continuous management, indicating not only how the health and safety activities are going, but also why they are not going as well as planned, showing up the critical points. (author)

  10. On the Estimation and Use of Statistical Modelling in Information Retrieval

    DEFF Research Database (Denmark)

    Petersen, Casper

    Automatic text processing often relies on assumptions about the distribution of some property (such as term frequency) in the data being processed. In information retrieval (IR) such assumptions may be attributed to (i) the absence of principled approaches for determining the correct statistical...... that assumptions regarding the distribution of dataset properties can be replaced with an effective, efficient and principled method for determining the best-fitting distribution, and that using this distribution can lead to improved retrieval performance....
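
    One principled route to a best-fitting distribution, in the spirit of this thesis, is per-family maximum-likelihood estimation followed by model comparison. A minimal sketch for one candidate family — the continuous power law often assumed for term frequencies — using the standard closed-form MLE; the data are synthetic, generated by inverse-transform sampling:

```python
import math
import random

def powerlaw_alpha_mle(data, xmin):
    """Continuous power-law exponent MLE: alpha = 1 + n / sum(ln(x/xmin))."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic 'term frequencies' from a power law: if u ~ Uniform(0,1), then
# xmin * (1-u)**(-1/(alpha-1)) follows p(x) ~ x**(-alpha) for x >= xmin.
random.seed(0)
true_alpha, xmin = 2.5, 1.0
sample = [xmin * (1 - random.random()) ** (-1.0 / (true_alpha - 1))
          for _ in range(20000)]

est = powerlaw_alpha_mle(sample, xmin)  # should recover the generating exponent
```

    In practice one would fit several candidate families this way and compare them by log-likelihood or a goodness-of-fit test before relying on the assumed distribution.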

  11. Examples of the Application of Nonparametric Information Geometry to Statistical Physics

    Directory of Open Access Journals (Sweden)

    Giovanni Pistone

    2013-09-01

    Full Text Available We review a nonparametric version of Amari’s information geometry in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is used to discuss typical problems in machine learning and statistical physics, such as black-box optimization, Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.
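
    Of the objects listed, the Kullback-Leibler divergence is the easiest to make concrete: for finite sample spaces it is a short computation. The two distributions below are arbitrary examples chosen only to illustrate the definition:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]

d_pq = kl_divergence(p, q)
# D(p || q) is non-negative, asymmetric, and zero only when p and q coincide.
```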

  12. Aortic Aneurysm Statistics

    Science.gov (United States)


  13. In vivo animal histology and clinical evaluation of multisource fractional radiofrequency skin resurfacing (FSR) applicator.

    Science.gov (United States)

    Sadick, Neil S; Sato, Masaki; Palmisano, Diana; Frank, Ido; Cohen, Hila; Harth, Yoram

    2011-10-01

    Acne scars are one of the most difficult disorders to treat in dermatology. The optimal treatment system will provide minimal downtime resurfacing for the epidermis and non-ablative deep volumetric heating for collagen remodeling in the dermis. A novel therapy system (EndyMed Ltd., Cesarea, Israel) uses phase-controlled multi-source radiofrequency (RF) to provide simultaneous one pulse microfractional resurfacing with simultaneous volumetric skin tightening. The study included 26 subjects (Fitzpatrick's skin type 2-5) with moderate to severe wrinkles and 4 subjects with depressed acne scars. Treatment was repeated each month up to a total of three treatment sessions. Patients' photographs were graded according to accepted scales by two uninvolved blinded evaluators. Significant reduction in the depth of wrinkles and acne scars was noted 4 weeks after therapy with further improvement at the 3-month follow-up. Our data show the histological impact and clinical beneficial effects of simultaneous RF fractional microablation and volumetric deep dermal heating for the treatment of wrinkles and acne scars.

  14. An Image Matching Algorithm Integrating Global SRTM and Image Segmentation for Multi-Source Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Xiao Ling

    2016-08-01

    Full Text Available This paper presents a novel image matching method for multi-source satellite images, which integrates global Shuttle Radar Topography Mission (SRTM) data and image segmentation to achieve robust and numerous correspondences. This method first generates the epipolar lines as a geometric constraint assisted by global SRTM data, after which the seed points are selected and matched. To produce more reliable matching results, a region segmentation-based matching propagation is proposed in this paper, whereby the region segmentations are extracted by image segmentation and are considered to be a spatial constraint. Moreover, a similarity measure integrating Distance, Angle and Normalized Cross-Correlation (DANCC), which considers geometric similarity and radiometric similarity, is introduced to find the optimal correspondences. Experiments using typical satellite images acquired from Resources Satellite-3 (ZY-3), Mapping Satellite-1, SPOT-5 and Google Earth demonstrated that the proposed method is able to produce reliable and accurate matching results.
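
    The NCC component of the DANCC measure rewards patches whose intensities agree up to an affine brightness change, which is what makes it useful across sensors with different radiometry. A minimal sketch on flattened toy patches (the distance and angle terms of DANCC are omitted):

```python
def ncc(a, b):
    """Normalized cross-correlation between two equal-size patches (flattened lists)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

patch = [10, 20, 30, 40, 50, 60, 70, 80, 90]       # toy 3x3 patch, flattened
brighter = [v * 2 + 5 for v in patch]              # same structure, affine brightness change
noise = [42, 7, 99, 3, 58, 21, 77, 12, 35]         # unrelated intensities

# NCC is invariant to affine brightness changes: identical structure scores 1.0,
# while an unrelated patch scores near zero.
```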

  15. A Novel, Multi-Target Natural Drug Candidate, Matrine, Improves Cognitive Deficits in Alzheimer's Disease Transgenic Mice by Inhibiting Aβ Aggregation and Blocking the RAGE/Aβ Axis.

    Science.gov (United States)

    Cui, Lili; Cai, Yujie; Cheng, Wanwen; Liu, Gen; Zhao, Jianghao; Cao, Hao; Tao, Hua; Wang, Yan; Yin, Mingkang; Liu, Tingting; Liu, Yu; Huang, Pengru; Liu, Zhou; Li, Keshen; Zhao, Bin

    2017-04-01

    The treatment of AD is a topic that has puzzled researchers for many years. Current mainstream theories still consider Aβ to be the most important target for the cure of AD. In this study, we attempted to explore multiple targets for AD treatments with the aim of identifying a qualified compound that could both inhibit the aggregation of Aβ and block the RAGE/Aβ axis. We believed that targeting both Aβ and RAGE might be a feasible strategy for AD treatment. A novel and small natural compound, Matrine (Mat), was identified by high-throughput screening of the main components of traditional Chinese herbs used to treat dementia. Various experimental techniques were used to evaluate the effect of Mat on these two targets both in vitro and in an AD mouse model. Mat could inhibit Aβ42-induced cytotoxicity and suppress the Aβ/RAGE signaling pathway in vitro. Additionally, the results of in vivo evaluations of the effects of Mat on the two targets were consistent with the results of our in vitro studies. Furthermore, Mat reduced proinflammatory cytokines and Aβ deposition and attenuated the memory deficits of AD transgenic mice. We believe that this novel, multi-target strategy to inhibit both Aβ and RAGE, is worthy of further exploration. Therefore, our future studies will focus on identifying even more effective multi-target compounds for the treatment of AD based on the molecular structure of Mat.

  16. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    Science.gov (United States)

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several
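
The abstract mentions methods for rebalancing imbalanced cohorts without detailing them; random oversampling of the minority class is one common technique. A minimal sketch on synthetic data (array names, sizes and features below are illustrative, not from PPMI):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy imbalanced cohort: 90 controls (label 0) vs. 10 cases (label 1)
y = np.array([0] * 90 + [1] * 10)
X = rng.normal(size=(100, 3)) + y[:, None]        # synthetic features

# Random oversampling: resample minority rows with replacement until balanced
minority = np.flatnonzero(y == 1)
extra = rng.choice(minority, size=80, replace=True)
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
print(np.bincount(y_bal))   # -> [90 90]
```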

  17. A 3D modeling approach to complex faults with multi-source data

    Science.gov (United States)

    Wu, Qiang; Xu, Hua; Zou, Xukai; Lei, Hongzhuan

    2015-04-01

    Fault modeling is a very important step in making an accurate and reliable 3D geological model. Typical existing methods demand enough fault data to construct complex fault models; however, it is well known that the available fault data are generally sparse and undersampled. In this paper, we propose a fault modeling workflow that can integrate multi-source data to construct fault models. For faults that cannot be modeled with these data, especially those that are small-scale or approximately parallel to the sections, we propose a fault deduction method that infers the hanging wall and footwall lines after displacement calculation. Moreover, a fault cutting algorithm supplements the available fault points at locations where faults cut each other. Adding fault points in poorly sampled areas not only makes fault model construction more efficient but also reduces manual intervention. By using fault-based interpolation and remeshing the horizons, an accurate 3D geological model can be constructed. The method can naturally simulate geological structures regardless of whether the available geological data are sufficient. A concrete example of applying the method in Tangshan, China, shows that it can be used in broad and complex geological areas.

  18. Multitarget Approaches to Robust Navigation

    Data.gov (United States)

    National Aeronautics and Space Administration — The performance, stability, and statistical consistency of a vehicle's navigation algorithm are vitally important to the success and safety of its mission....

  19. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever-increasing ocean of information, as everybody will agree. We build sophisticated strategies to govern this information: designing data models, developing infrastructures for data sharing, and building tools for data analysis. Statistical datasets curated by National

  20. Analyzing Statistical Mediation with Multiple Informants: A New Approach with an Application in Clinical Psychology

    Directory of Open Access Journals (Sweden)

    Lesther ePapa

    2015-11-01

    Full Text Available Testing mediation models is critical for identifying potential variables that need to be targeted to effectively change one or more outcome variables. In addition, it is now common practice for clinicians to use multiple informant (MI) data in studies of statistical mediation. By coupling the use of MI data with statistical mediation analysis, clinical researchers can combine the benefits of both techniques. Integrating the information from MIs into a statistical mediation model creates various methodological and practical challenges. The authors review prior methodological approaches to MI mediation analysis in clinical research and propose a new latent variable approach that overcomes some limitations of prior approaches. An application of the new approach to mother, father, and child reports of impulsivity, frustration tolerance, and externalizing problems (N = 454) is presented. The results showed that frustration tolerance mediated the relationship between impulsivity and externalizing problems. Advantages and limitations of the new approach are discussed. The new approach can help clinical researchers overcome limitations of prior techniques. It allows for a more comprehensive and effective use of MI data when testing mediation models.
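
The authors' latent-variable approach is more elaborate than this, but the core mediation quantity, the indirect effect as a product of path coefficients, can be sketched on simulated single-informant data (the variable names, true coefficients and noise model below are assumptions for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 454                                      # sample size matching the study
x = rng.normal(size=n)                       # impulsivity (predictor)
m = 0.5 * x + rng.normal(size=n)             # frustration tolerance (mediator)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # externalizing problems (outcome)

def slope(target, *predictors):
    """OLS coefficients (after the intercept) via least squares."""
    X = np.column_stack([np.ones(len(target)), *predictors])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return beta[1:]

a = slope(m, x)[0]          # path X -> M
b = slope(y, x, m)[1]       # path M -> Y, controlling for X
indirect = a * b            # mediated (indirect) effect, true value 0.5 * 0.4 = 0.2
print(round(indirect, 3))
```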

  1. An Online Q-learning Based Multi-Agent LFC for a Multi-Area Multi-Source Power System Including Distributed Energy Resources

    Directory of Open Access Journals (Sweden)

    H. Shayeghi

    2017-12-01

    Full Text Available This paper presents an online two-stage Q-learning based multi-agent (MA) controller for load frequency control (LFC) in an interconnected multi-area multi-source power system integrated with distributed energy resources (DERs). The proposed control strategy consists of two stages. The first stage employs a PID controller whose parameters are tuned with the sine cosine optimization (SCO) algorithm and then fixed. The second stage is a reinforcement learning (RL) based supplementary controller with a flexible structure that adaptively improves the output of the first stage based on the system's dynamical behavior. Because the RL paradigm is integrated with a PID controller, the strategy is called the RL-PID controller. The primary motivation for integrating RL with PID is to remain compatible with the local controllers already deployed in industry, reducing control effort and system cost. This control strategy combines the advantages of the PID controller with the adaptive behavior of the MA system to achieve robust performance under uncertainties caused by the stochastic power generation of DERs, changes in plant operating conditions, and physical nonlinearities of the system. The proposed decentralized controller is composed of autonomous intelligent agents, which learn the optimal control policy from interaction with the system. These agents continuously update their knowledge of the system dynamics to achieve good frequency-oscillation damping under various severe disturbances without prior knowledge of them. The result is an adaptive control structure for the LFC problem in a multi-source power system with stochastic DERs. The performance of the RL-PID controller is verified against traditional PID and fuzzy-PID controllers in a multi-area power system integrated with DERs using several performance indices.
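
The abstract does not specify the power-system model or the agents' state and action spaces, so as a hedged illustration the tabular Q-learning update such agents rely on is shown on a toy chain problem (everything below is a generic sketch, not the paper's LFC setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n_states, actions = 5, [-1, +1]        # move left / move right
goal = 4                               # reaching state 4 ends an episode
Q = np.zeros((n_states, len(actions)))
alpha, gamma, eps = 0.5, 0.9, 0.2      # learning rate, discount, exploration

for _ in range(500):
    s = 0
    while s != goal:
        # epsilon-greedy action selection
        a = rng.integers(2) if rng.random() < eps else int(Q[s].argmax())
        s2 = min(max(s + actions[a], 0), n_states - 1)
        r = 1.0 if s2 == goal else -0.1          # small step cost, goal reward
        # Q-learning temporal-difference update
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2

# Greedy policy after learning: always move right toward the goal
print([int(Q[s].argmax()) for s in range(goal)])   # -> [1, 1, 1, 1]
```

In the paper's setting the state would encode frequency-deviation measurements and the actions would adjust the supplementary signal added to the fixed PID output.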

  2. UPA-sensitive ACPP-conjugated nanoparticles for multi-targeting therapy of brain glioma.

    Science.gov (United States)

    Zhang, Bo; Zhang, Yujie; Liao, Ziwei; Jiang, Ting; Zhao, Jingjing; Tuo, Yanyan; She, Xiaojian; Shen, Shun; Chen, Jun; Zhang, Qizhi; Jiang, Xinguo; Hu, Yu; Pang, Zhiqing

    2015-01-01

    It is now well evidenced that tumor growth is the comprehensive result of multiple pathways, and that glioma parenchyma cells and stroma cells are closely associated and mutually compensatory. Therefore, drug delivery strategies targeting both simultaneously may yield more promising therapeutic benefits. In the present study, we developed a multi-targeting nanoparticle drug delivery system modified with a uPA-activated cell-penetrating peptide (ACPP) for the treatment of brain glioma (ANP). In vitro experiments demonstrated that nanoparticles (NP) decorated with a cell-penetrating peptide (CPP) or ACPP significantly improved nanoparticle uptake by C6 glioma cells and penetration into glioma spheroids compared with traditional NP, and thus enhanced the therapeutic effect of the payload when paclitaxel (PTX) was loaded. In vivo imaging revealed that ANP accumulated more specifically at the brain glioma site than NP decorated with or without CPP. Brain slices further showed that ACPP contributed to greater nanoparticle accumulation at the glioma site, and that ANP co-localized not only with glioma parenchyma cells but also with stroma cells, including neo-vascular cells and tumor-associated macrophages. The pharmacodynamics results demonstrated that ACPP significantly improved the therapeutic benefit of the nanoparticles by prolonging the survival time of glioma-bearing mice. In conclusion, the results suggest that nanoparticles modified with uPA-sensitive ACPP can reach multiple cell types in glioma tissues and provide a novel strategy for glioma-targeted therapy.

  3. Multi-Source Multi-Sensor Information Fusion

    Indian Academy of Sciences (India)

    applications in biomedical, industrial automation, aerospace systems and environmental ... performance evaluation and achievable accuracy, mainly for aerospace ... system and sensors, sensor data processing and performance requirement, ...

  4. Monoaminergic Mechanisms in Epilepsy May Offer Innovative Therapeutic Opportunity for Monoaminergic Multi-Target Drugs

    Directory of Open Access Journals (Sweden)

    Dubravka Svob Strac

    2016-11-01

    Full Text Available A large body of experimental and clinical evidence has strongly suggested that monoamines play an important role in regulating epileptogenesis, seizure susceptibility, convulsions and comorbid psychiatric disorders commonly seen in people with epilepsy. However, neither the relative significance of individual monoamines nor their interaction has yet been fully clarified due to the complexity of these neurotransmitter systems. In addition, epilepsy is diverse, with many different seizure types and epilepsy syndromes, and the role played by monoamines may vary from one condition to another. In this review, we will focus on the role of serotonin, dopamine, noradrenaline, histamine and melatonin in epilepsy. Recent experimental, clinical and genetic evidence, will be reviewed in consideration of the mutual relationship of monoamines with the other putative neurotransmitters. The complexity of epileptic pathogenesis may explain why the currently available drugs, developed according to the classic drug discovery paradigm of one-molecule-one-target, have turned out to be effective only in a percentage of people with epilepsy. Although no antiepileptic drugs currently target specifically monoaminergic systems, multi-target directed ligands acting on different monoaminergic proteins present on both neurons and glia cells may represent a new approach in the management of seizures and their generation as well as comorbid neuropsychiatric disorders.

  5. Monoaminergic Mechanisms in Epilepsy May Offer Innovative Therapeutic Opportunity for Monoaminergic Multi-Target Drugs

    Science.gov (United States)

    Svob Strac, Dubravka; Pivac, Nela; Smolders, Ilse J.; Fogel, Wieslawa A.; De Deurwaerdere, Philippe; Di Giovanni, Giuseppe

    2016-01-01

    A large body of experimental and clinical evidence has strongly suggested that monoamines play an important role in regulating epileptogenesis, seizure susceptibility, convulsions, and comorbid psychiatric disorders commonly seen in people with epilepsy (PWE). However, neither the relative significance of individual monoamines nor their interaction has yet been fully clarified due to the complexity of these neurotransmitter systems. In addition, epilepsy is diverse, with many different seizure types and epilepsy syndromes, and the role played by monoamines may vary from one condition to another. In this review, we will focus on the role of serotonin, dopamine, noradrenaline, histamine, and melatonin in epilepsy. Recent experimental, clinical, and genetic evidence will be reviewed in consideration of the mutual relationship of monoamines with the other putative neurotransmitters. The complexity of epileptic pathogenesis may explain why the currently available drugs, developed according to the classic drug discovery paradigm of “one-molecule-one-target,” have turned out to be effective only in a percentage of PWE. Although no antiepileptic drugs currently target specifically monoaminergic systems, multi-target directed ligands acting on different monoaminergic proteins, present on both neurons and glia cells, may represent a new approach in the management of seizures and their generation, as well as comorbid neuropsychiatric disorders. PMID:27891070

  6. A statistic to estimate the variance of the histogram-based mutual information estimator based on dependent pairs of observations

    NARCIS (Netherlands)

    Moddemeijer, R

    In the case of two signals with independent pairs of observations (x(n),y(n)), a statistic to estimate the variance of the histogram-based mutual information estimator has been derived earlier. We present such a statistic for dependent pairs. To derive this statistic it is necessary to avail of a
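
For context, the histogram-based (plug-in) mutual information estimator whose variance is being studied can be sketched as follows (bin count and test data below are illustrative):

```python
import numpy as np

def mi_hist(x, y, bins=16):
    """Plug-in (histogram) estimate of mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                              # joint cell probabilities
    px = pxy.sum(axis=1, keepdims=True)           # marginal of X
    py = pxy.sum(axis=0, keepdims=True)           # marginal of Y
    nz = pxy > 0                                  # treat 0*log(0) as 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=20000)
dep = mi_hist(x, x + 0.01 * rng.normal(size=20000))   # strongly dependent pair
ind = mi_hist(x, rng.normal(size=20000))              # independent pair
print(dep > ind)   # -> True
```

The estimator is biased upward for independent data (roughly proportional to the number of cells over the sample size), which is one reason variance and bias statistics such as the one in this record matter.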

  7. Statistics in Schools

    Science.gov (United States)

    Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real life data. Explore the site for standards-aligned, classroom-ready activities.

  8. GMTR: two-dimensional geo-fit multitarget retrieval model for michelson interferometer for passive atmospheric sounding/environmental satellite observations.

    Science.gov (United States)

    Carlotti, Massimo; Brizzi, Gabriele; Papandrea, Enzo; Prevedelli, Marco; Ridolfi, Marco; Dinelli, Bianca Maria; Magnani, Luca

    2006-02-01

    We present a new retrieval model designed to analyze the observations of the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), which is on board the ENVironmental SATellite (ENVISAT). The new geo-fit multitarget retrieval model (GMTR) implements the geo-fit two-dimensional inversion for the simultaneous retrieval of several targets including a set of atmospheric constituents that are not considered by the ground processor of the MIPAS experiment. We describe the innovative solutions adopted in the inversion algorithm and the main functionalities of the corresponding computer code. The performance of GMTR is compared with that of the MIPAS ground processor in terms of accuracy of the retrieval products. Furthermore, we show the capability of GMTR to resolve the horizontal structures of the atmosphere. The new retrieval model is implemented in an optimized computer code that is distributed by the European Space Agency as "open source" in a package that includes a full set of auxiliary data for the retrieval of 28 atmospheric targets.

  9. Sample Preparation Strategies for the Effective Quantitation of Hydrophilic Metabolites in Serum by Multi-Targeted HILIC-MS/MS

    Directory of Open Access Journals (Sweden)

    Elisavet Tsakelidou

    2017-03-01

    Full Text Available The effect of endogenous interferences of serum in multi-targeted metabolite profiling HILIC-MS/MS analysis was investigated by studying different sample preparation procedures. A modified QuEChERS dispersive SPE protocol, a HybridSPE protocol, and a combination of liquid extraction with protein precipitation were compared to a simple protein precipitation. Evaluation of extraction efficiency and sample clean-up was performed for all methods. SPE sorbent materials tested were found to retain hydrophilic analytes together with endogenous interferences, thus additional elution steps were needed. Liquid extraction was not shown to minimise matrix effects. In general, it was observed that a balance should be reached in terms of recovery, efficient clean-up, and sample treatment time when a wide range of metabolites are analysed. A quick step for removing phospholipids prior to the determination of hydrophilic endogenous metabolites is required, however, based on the results from the applied methods, further studies are needed to achieve high recoveries for all metabolites.

  10. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  11. Gully Erosion Mapping and Monitoring at Multiple Scales Based on Multi-Source Remote Sensing Data of the Sancha River Catchment, Northeast China

    Directory of Open Access Journals (Sweden)

    Ranghu Wang

    2016-11-01

    Full Text Available This research is focused on gully erosion mapping and monitoring at multiple spatial scales using multi-source remote sensing data of the Sancha River catchment in Northeast China, where gullies extend over a vast area. A high resolution satellite image (Pleiades 1A, 0.7 m) was used to obtain the spatial distribution of the gullies of the overall basin. Image visual interpretation with field verification was employed to map the geometric gully features and evaluate gully erosion as well as the topographic differentiation characteristics. Unmanned Aerial Vehicle (UAV) remote sensing data and the 3D photo-reconstruction method were employed for detailed gully mapping at a site scale. The results showed that: (1) the sub-meter image showed a strong ability in the recognition of various gully types and obtained satisfactory results, and the topographic factors of elevation, slope and slope aspects exerted significant influence on the gully spatial distribution at the catchment scale; and (2) at a more detailed site scale, UAV imagery combined with 3D photo-reconstruction provided a Digital Surface Model (DSM) and ortho-image at the centimeter level as well as a detailed 3D model. The resulting products revealed the area of agricultural utilization and its shaping by human agricultural activities and water erosion in detail, and also provided the gully volume. The present study indicates that using multi-source remote sensing data, including satellite and UAV imagery simultaneously, results in an effective assessment of gully erosion over multiple spatial scales. The combined approach should be continued to regularly monitor gully erosion to understand the erosion process and its relationship with the environment from a comprehensive perspective.

  12. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  13. Spatial-Temporal Analysis on Spring Festival Travel Rush in China Based on Multisource Big Data

    Directory of Open Access Journals (Sweden)

    Jiwei Li

    2016-11-01

    Full Text Available Spring Festival travel rush is a phenomenon in China in which passenger travel surges intensively over a short period around the Chinese Spring Festival. This phenomenon, peculiar to China's urbanization process, imposes a large traffic burden and brings various social problems, thereby causing widespread public concern. This study investigates the spatial-temporal characteristics of Spring Festival travel rush in 2015 through time series analysis and complex network analysis based on multisource big travel data derived from Baidu, Tencent, and Qihoo. The main results are as follows: First, big travel data of Baidu and Tencent obtained from location-based services might be more accurate and scientific than that of Qihoo. Second, two travel peaks appeared at five days before and six days after the Spring Festival, respectively, and the travel valley appeared on the Spring Festival. The Spring Festival travel network at the provincial scale did not have small-world and scale-free characteristics. Instead, the travel network showed a multicenter characteristic and a significant geographic clustering characteristic. Moreover, some travel path chains played a leading role in the network. Third, economic and social factors had more influence on the travel network than geographical location factors. The problem of Spring Festival travel rush will not be effectively improved in a short time because of unbalanced urban-rural and regional development. However, the development of the modern high-speed transport system and modern information and communication technology can alleviate the problems brought by Spring Festival travel rush. We suggest that a unified real-time traffic platform for Spring Festival travel rush should be established through the government's integration of mobile big data and the official authority data of the transportation department.

  14. INTEGRAÇÃO DE DADOS MULTIFONTES PARA MAPEAMENTOS TEMÁTICOS - INTEGRATION OF MULTISOURCES DATA FOR THEMATIC MAPPING

    Directory of Open Access Journals (Sweden)

    Maria Luiza Osório Moreira; Levindo Cardoso Medeiros; Heitor Faria da Costa

    2007-12-01

    Full Text Available The use of images is fundamental in mapping work. The composition of multisource images, addressed in this work, is based on methods previously known in the literature, such as products derived from the classic IHS transformation. Such techniques for integrating data of diverse nature have been in widespread use since the end of the 1980s and are still applied successfully today. The main objective of this work is to evaluate the integration of radar data with multispectral passive satellite data, thematic mapping and cartographic bases. Fusion techniques for data originating from different sources (multisource) have been used extensively to generate a final product of good visual quality for quantitative and qualitative analyses and for visual interpretation procedures in general; they also reduce the cost of field work. These techniques have found a wide variety of applications across the Earth Sciences disciplines.

  15. Validation of survey information on smoking and alcohol consumption against import statistics, Greenland 1993-2010.

    Science.gov (United States)

    Bjerregaard, Peter; Becker, Ulrik

    2013-01-01

    Questionnaires are widely used to obtain information on health-related behaviour, and they are more often than not the only method that can be used to assess the distribution of behaviour in subgroups of the population. No validation studies of reported consumption of tobacco or alcohol have been published from circumpolar indigenous communities. The purpose of the study is to compare information on the consumption of tobacco and alcohol obtained from 3 population surveys in Greenland with import statistics. Estimates of consumption of cigarettes and alcohol using several different survey instruments in cross-sectional population studies from 1993-1994, 1999-2001 and 2005-2010 were compared with import statistics from the same years. For cigarettes, survey results accounted for virtually the total import. Alcohol consumption was significantly under-reported with reporting completeness ranging from 40% to 51% for different estimates of habitual weekly consumption in the 3 study periods. Including an estimate of binge drinking increased the estimated total consumption to 78% of the import. Compared with import statistics, questionnaire-based population surveys capture the consumption of cigarettes well in Greenland. Consumption of alcohol is under-reported, but asking about binge episodes in addition to the usual intake considerably increased the reported intake in this population and made it more in agreement with import statistics. It is unknown to what extent these findings at the population level can be inferred to population subgroups.

  16. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  17. Information Graph Flow: A Geometric Approximation of Quantum and Statistical Systems

    Science.gov (United States)

    Vanchurin, Vitaly

    2018-05-01

    Given a quantum (or statistical) system with a very large number of degrees of freedom and a preferred tensor product factorization of the Hilbert space (or of a space of distributions) we describe how it can be approximated with a very low-dimensional field theory with geometric degrees of freedom. The geometric approximation procedure consists of three steps. The first step is to construct weighted graphs (which we call information graphs) with vertices representing subsystems (e.g., qubits or random variables) and edges representing mutual information (or the flow of information) between subsystems. The second step is to deform the adjacency matrices of the information graphs to those of a (locally) low-dimensional lattice using the graph flow equations introduced in the paper. (Note that the graph flow produces very sparse adjacency matrices and thus might also be used, for example, in machine learning or network science where the task of graph sparsification is of central importance.) The third step is to define an emergent metric and to derive an effective description of the metric and possibly other degrees of freedom. To illustrate the procedure we analyze (numerically and analytically) two information graph flows with geometric attractors (towards locally one- and two-dimensional lattices) and metric perturbations obeying a geometric flow equation. Our analysis also suggests a possible approach to (a non-perturbative) quantum gravity in which the geometry (a secondary object) emerges directly from a quantum state (a primary object) due to the flow of the information graphs.

  18. Multi-Source Cooperative Data Collection with a Mobile Sink for the Wireless Sensor Network.

    Science.gov (United States)

    Han, Changcai; Yang, Jinsheng

    2017-10-30

    The multi-source cooperation integrating distributed low-density parity-check codes is investigated to jointly collect data from multiple sensor nodes to the mobile sink in the wireless sensor network. One-round and two-round cooperative data collection schemes are proposed according to the moving trajectories of the sink node. Specifically, two sparse cooperation models are first formed based on the geographical locations of the sensor source nodes, the impairment of inter-node wireless channels and the moving trajectories of the mobile sink. Then, distributed low-density parity-check codes are devised to match the directed graphs and cooperation matrices related to the cooperation models. In the proposed schemes, each source node has quite low complexity owing to the sparse cooperation and the distributed processing. Simulation results reveal that the proposed cooperative data collection schemes obtain significant bit-error-rate performance and that the two-round cooperation exhibits better performance than the one-round scheme. The performance can be further improved when more source nodes participate in the sparse cooperation. For the two-round data collection schemes, the performance is evaluated for wireless sensor networks with different moving trajectories and varying data sizes.
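
The paper's distributed LDPC code designs are not given in the abstract; the parity-check and syndrome mechanics they build on can be illustrated with the tiny (7,4) Hamming code (a stand-in, not the paper's codes):

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j is the binary
# representation of j (least significant bit in row 0). LDPC codes use much
# larger, sparser matrices of the same kind.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(word):
    """All-zero syndrome <=> every parity check is satisfied."""
    return H @ word % 2

codeword = np.array([1, 0, 1, 1, 0, 1, 0])
print(syndrome(codeword))                  # -> [0 0 0]  (valid codeword)

corrupted = codeword.copy()
corrupted[4] ^= 1                          # flip bit 5 (1-indexed)
s = syndrome(corrupted)
pos = int(s[0] + 2 * s[1] + 4 * s[2])      # syndrome reads out the error position
print(pos)                                 # -> 5
```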

  19. Real-time multi-target ranging based on chaotic polarization laser radars in the drive-response VCSELs.

    Science.gov (United States)

    Zhong, Dongzhou; Xu, Geliang; Luo, Wei; Xiao, Zhenzhen

    2017-09-04

    According to the principle of complete chaos synchronization and the theory of Hilbert phase transformation, we propose a novel real-time multi-target ranging scheme using chaotic polarization laser radar in drive-response vertical-cavity surface-emitting lasers (VCSELs). In the scheme, to ensure that each polarization component (PC) of the master VCSEL (MVCSEL) synchronizes steadily with that of the slave VCSEL, the output x-PC and y-PC from the MVCSEL in the drive system and those in the response system are modulated by the linear electro-optic effect simultaneously. Under this condition, by simulating the influence of key system parameters on the synchronization quality and on the relative errors of two-target ranging, the relevant operating parameters can be optimized. The x-PC and the y-PC, as two chaotic radar sources, are used to perform real-time ranging of two targets. It is found that the measured distances of the two targets at arbitrary positions exhibit strong real-time stability with only slight jitter. Their resolutions reach the millimeter level, and their relative errors are less than 2.7%.
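
    The ranging step in a chaotic radar reduces to time-delay estimation: the echo is cross-correlated with the transmitted chaotic waveform, and the lag of the correlation peak gives the round-trip delay. A minimal sketch with Gaussian noise standing in for the chaotic laser waveform (the sampling rate, delay, and noise level are our assumptions):

```python
import numpy as np

def range_from_echo(tx, rx, fs, c=3.0e8):
    # Lag of the cross-correlation peak -> round-trip delay -> distance.
    corr = np.correlate(rx, tx, mode="full")
    lag = int(corr.argmax()) - (len(tx) - 1)
    return 0.5 * c * lag / fs  # halve for the round trip

rng = np.random.default_rng(1)
fs = 1.0e9                         # 1 GS/s sampling rate (assumed)
tx = rng.standard_normal(4096)     # stand-in for the chaotic x-PC waveform
delay = 200                        # samples; 200 / fs round trip -> 30 m
rx = np.concatenate([np.zeros(delay), tx])[:4096]
rx = rx + 0.1 * rng.standard_normal(4096)  # receiver noise
distance = range_from_echo(tx, rx, fs)
```

    The sharp, noise-like autocorrelation of chaotic waveforms is what makes the peak unambiguous even with two targets, where two distinct correlation peaks appear.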

  20. Multigram Synthesis and in Vivo Efficacy Studies of a Novel Multitarget Anti-Alzheimer’s Compound

    Directory of Open Access Journals (Sweden)

    Irene Sola

    2015-03-01

    Full Text Available We describe the multigram synthesis and in vivo efficacy studies of a donepezil‒huprine hybrid that has been found to display a promising in vitro multitarget profile of interest for the treatment of Alzheimer’s disease (AD). Its synthesis features as the key step a novel multigram preparative chromatographic resolution of the intermediate racemic huprine Y by chiral HPLC. Administration of this compound to transgenic CL4176 and CL2006 Caenorhabditis elegans strains expressing human Aβ42, here used as simplified animal models of AD, led to significant protection from the toxicity induced by Aβ42. However, this protective effect was not accompanied, in CL2006 worms, by a reduction of amyloid deposits. Oral administration for 3 months to transgenic APPSL mice, a well-established animal model of AD, improved short-term memory but did not alter brain levels of Aβ peptides or cortical and hippocampal amyloid plaque load. Despite the clear protective and cognitive effects of AVCRI104P4, the lack of an Aβ-lowering effect in vivo might be related to its lower in vitro potency toward Aβ aggregation and formation as compared with its higher anticholinesterase activities. Further lead optimization in this series should thus focus on improving the anti-amyloid/anticholinesterase activity ratio.

  1. Shadow detection of moving objects based on multisource information in Internet of things

    Science.gov (United States)

    Ma, Zhen; Zhang, De-gan; Chen, Jie; Hou, Yue-xian

    2017-05-01

    Moving object detection is an important part of intelligent video surveillance in the Internet of things, and detecting the shadows of moving targets is an important step within it: the accuracy of shadow detection directly affects the object detection results. Surveying the variety of existing shadow detection methods, we find that no single feature yields accurate detection results on its own. We therefore present a new shadow detection method that combines colour information, optical invariance and texture features. Through comprehensive analysis of the detection results from these three kinds of information, shadows are determined effectively. Experiments show that combining the advantages of the individual methods gives the desired results.

  2. Error related negativity and multi-source interference task in children with attention deficit hyperactivity disorder-combined type

    Directory of Open Access Journals (Sweden)

    Rosana Huerta-Albarrán

    2015-03-01

    Full Text Available Objective To compare the performance of children with attention deficit hyperactivity disorder-combined (ADHD-C) type with that of control children in the multi-source interference task (MSIT), evaluated by means of error related negativity (ERN). Method We studied 12 children with ADHD-C type with a median age of 7 years; control children were age- and gender-matched. Children performed the MSIT with simultaneous recording of the ERN. Results We found no differences in MSIT parameters between groups, and no differences in ERN variables between groups. We found a significant association of ERN amplitude with MSIT performance in children with ADHD-C type. Some correlations were in the positive direction (frequency of hits and MSIT amplitude) and others in the negative direction (frequency of errors and RT in MSIT). Conclusion Children with ADHD-C type exhibited a significant association between ERN amplitude and MSIT performance. These results underline the participation of a cingulo-fronto-parietal network and could help in the comprehension of the pathophysiological mechanisms of ADHD.

  3. The effects of statistical information on risk and ambiguity attitudes, and on rational insurance decisions

    NARCIS (Netherlands)

    P.P. Wakker (Peter); D.R.M. Timmermans (Danielle); I. Machielse (Irma)

    2007-01-01

    textabstractThis paper presents a field study into the effects of statistical information concerning risks on willingness to take insurance, with special attention being paid to the usefulness of these effects for the clients (the insured). Unlike many academic studies, we were able to use in-depth

  4. Principle of maximum Fisher information from Hardy's axioms applied to statistical systems.

    Science.gov (United States)

    Frieden, B Roy; Gatenby, Robert A

    2013-10-01

    Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N=max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N=max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I=I(max). This is important because many physical laws have been derived, assuming as a working hypothesis that I=I(max). These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations were of the De Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I=I(max) itself derives from suitably extended Hardy axioms thereby eliminates its need to be assumed in these derivations. Thus, uses of I=I(max) and EPI express physics at its most fundamental level, its axiomatic basis in math.
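
    For reference, the Fisher information being extremized is, in the scalar-parameter case (this standard definition is not spelled out in the abstract):

```latex
I(a) \;=\; \int p(x \mid a)
  \left( \frac{\partial \ln p(x \mid a)}{\partial a} \right)^{2} dx ,
\qquad I = I_{\max} .
```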

  5. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    Science.gov (United States)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

    Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively inexpensive auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique - multi-level Gaussian process regression - on the fly; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, which detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing.
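
    The gap-filling idea can be sketched in a few lines: model the discrepancy between a cheap coarse simulator and sparse fine data with a Gaussian process, then correct the coarse field everywhere. This is a minimal single-level sketch (the kernel, length scale, and toy 1-D fields are our assumptions; the paper uses multi-level GP regression):

```python
import numpy as np

def rbf(a, b, ell=0.3):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    # Posterior mean of a zero-mean GP conditioned on the training data.
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

coarse = lambda x: np.sin(2 * np.pi * x)           # cheap auxiliary simulator
fine = lambda x: np.sin(2 * np.pi * x) + 0.3 * x   # accurate but sparse "truth"

x_fine = np.linspace(0.0, 1.0, 6)     # the few fine samples that survived
x_test = np.linspace(0.0, 1.0, 50)    # locations where data went missing
# The GP models only the coarse/fine discrepancy, then corrects the coarse field.
delta = gp_predict(x_fine, fine(x_fine) - coarse(x_fine), x_test)
filled = coarse(x_test) + delta
```

    Modeling the discrepancy rather than the field itself is what lets a handful of fine samples suffice: the smooth residual is far easier to learn than the full solution.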

  6. Tennessee StreamStats: A Web-Enabled Geographic Information System Application for Automating the Retrieval and Calculation of Streamflow Statistics

    Science.gov (United States)

    Ladd, David E.; Law, George S.

    2007-01-01

    The U.S. Geological Survey (USGS) provides streamflow and other stream-related information needed to protect people and property from floods, to plan and manage water resources, and to protect water quality in the streams. Streamflow statistics provided by the USGS, such as the 100-year flood and the 7-day 10-year low flow, frequently are used by engineers, land managers, biologists, and many others to help guide decisions in their everyday work. In addition to streamflow statistics, resource managers often need to know the physical and climatic characteristics (basin characteristics) of the drainage basins for locations of interest to help them understand the mechanisms that control water availability and water quality at these locations. StreamStats is a Web-enabled geographic information system (GIS) application that makes it easy for users to obtain streamflow statistics, basin characteristics, and other information for USGS data-collection stations and for ungaged sites of interest. If a user selects the location of a data-collection station, StreamStats will provide previously published information for the station from a database. If a user selects a location where no data are available (an ungaged site), StreamStats will run a GIS program to delineate a drainage basin boundary, measure basin characteristics, and estimate streamflow statistics based on USGS streamflow prediction methods. A user can download a GIS feature class of the drainage basin boundary with attributes including the measured basin characteristics and streamflow estimates.

  7. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  8. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies.

  9. Africa-wide monitoring of small surface water bodies using multisource satellite data: a monitoring system for FEWS NET: chapter 5

    Science.gov (United States)

    Velpuri, Naga Manohar; Senay, Gabriel B.; Rowland, James; Verdin, James P.; Alemu, Henok; Melesse, Assefa M.; Abtew, Wossenu; Setegn, Shimelis G.

    2014-01-01

    Continental Africa has the highest volume of water stored in wetlands, large lakes, reservoirs, and rivers, yet it suffers from problems such as water availability and access. With climate change intensifying the hydrologic cycle and altering the distribution and frequency of rainfall, the problem of water availability and access will increase further. Famine Early Warning Systems Network (FEWS NET) funded by the United States Agency for International Development (USAID) has initiated a large-scale project to monitor small to medium surface water points in Africa. Under this project, multisource satellite data and hydrologic modeling techniques are integrated to monitor several hundreds of small to medium surface water points in Africa. This approach has been already tested to operationally monitor 41 water points in East Africa. The validation of modeled scaled depths with field-installed gauge data demonstrated the ability of the model to capture both the spatial patterns and seasonal variations. Modeled scaled estimates captured up to 60 % of the observed gauge variability with a mean root-mean-square error (RMSE) of 22 %. The data on relative water level, precipitation, and evapotranspiration (ETo) for water points in East and West Africa were modeled since 1998 and current information is being made available in near-real time. This chapter presents the approach, results from the East African study, and the first phase of expansion activities in the West Africa region. The water point monitoring network will be further expanded to cover much of sub-Saharan Africa. The goal of this study is to provide timely information on the water availability that would support already established FEWS NET activities in Africa. This chapter also presents the potential improvements in modeling approach to be implemented during future expansion in Africa.

  10. Robust fault detection for the dynamics of high-speed train with multi-source finite frequency interference.

    Science.gov (United States)

    Bai, Weiqi; Dong, Hairong; Yao, Xiuming; Ning, Bin

    2018-04-01

    This paper proposes a composite fault detection scheme for the dynamics of high-speed train (HST), using an unknown input observer-like (UIO-like) fault detection filter, in the presence of wind gust and operating noises which are modeled as disturbance generated by exogenous system and unknown multi-source disturbance within finite frequency domain. Using system input and system output measurements, the fault detection filter is designed to generate the needed residual signals. In order to decouple disturbance from residual signals without truncating the influence of faults, this paper proposes a method to partition the disturbance into two parts. One subset of the disturbance does not appear in residual dynamics, and the influence of the other subset is constrained by H ∞ performance index in a finite frequency domain. A set of detection subspaces are defined, and every different fault is assigned to its own detection subspace to guarantee the residual signals are diagonally affected promptly by the faults. Simulations are conducted to demonstrate the effectiveness and merits of the proposed method. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
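
    Schematically, an unknown-input-observer residual generator of the kind described takes the standard form (a textbook template, not the paper's exact filter):

```latex
\dot{\hat{x}} = N \hat{x} + G u + L y , \qquad r = y - C \hat{x} ,
```

    with the gains chosen so that one subset of the disturbance is exactly decoupled from the residual r, while the effect of the remaining subset on r is bounded by the H∞ performance index over the finite frequency range.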

  11. Farm Management Support on Cloud Computing Platform: A System for Cropland Monitoring Using Multi-Source Remotely Sensed Data

    Science.gov (United States)

    Coburn, C. A.; Qin, Y.; Zhang, J.; Staenz, K.

    2015-12-01

    Food security is one of the most pressing issues facing humankind. Recent estimates predict that over one billion people don't have enough food to meet their basic nutritional needs. The ability of remote sensing tools to monitor and model crop production and predict crop yield is essential for providing governments and farmers with vital information to ensure food security. Google Earth Engine (GEE) is a cloud computing platform, which integrates storage and processing algorithms for massive remotely sensed imagery and vector data sets. By providing the capabilities of storing and analyzing the data sets, it provides an ideal platform for the development of advanced analytic tools for extracting key variables used in regional and national food security systems. With the high performance computing and storing capabilities of GEE, a cloud-computing based system for near real-time crop land monitoring was developed using multi-source remotely sensed data over large areas. The system is able to process and visualize the MODIS time series NDVI profile in conjunction with Landsat 8 image segmentation for crop monitoring. With multi-temporal Landsat 8 imagery, the crop fields are extracted using the image segmentation algorithm developed by Baatz et al. [1]. The MODIS time series NDVI data are modeled by TIMESAT [2], a software package developed for analyzing time series of satellite data. Seasonality metrics of the MODIS time series data - for example, the start date of the growing season, the length of the growing season, and the NDVI peak - are obtained at the field level for evaluating crop-growth conditions. The system fuses MODIS time series NDVI data and Landsat 8 imagery to provide information on near real-time crop-growth conditions through the visualization of MODIS NDVI time series and comparison of multi-year NDVI profiles. Stakeholders, i.e., farmers and government officers, are able to obtain crop-growth information at crop-field level online.

  12. Validation of survey information on smoking and alcohol consumption against import statistics, Greenland 1993–2010

    Directory of Open Access Journals (Sweden)

    Peter Bjerregaard

    2013-03-01

    Full Text Available Background. Questionnaires are widely used to obtain information on health-related behaviour, and they are more often than not the only method that can be used to assess the distribution of behaviour in subgroups of the population. No validation studies of reported consumption of tobacco or alcohol have been published from circumpolar indigenous communities. Objective. The purpose of the study is to compare information on the consumption of tobacco and alcohol obtained from 3 population surveys in Greenland with import statistics. Design. Estimates of consumption of cigarettes and alcohol using several different survey instruments in cross-sectional population studies from 1993–1994, 1999–2001 and 2005–2010 were compared with import statistics from the same years. Results. For cigarettes, survey results accounted for virtually the total import. Alcohol consumption was significantly under-reported with reporting completeness ranging from 40% to 51% for different estimates of habitual weekly consumption in the 3 study periods. Including an estimate of binge drinking increased the estimated total consumption to 78% of the import. Conclusion. Compared with import statistics, questionnaire-based population surveys capture the consumption of cigarettes well in Greenland. Consumption of alcohol is under-reported, but asking about binge episodes in addition to the usual intake considerably increased the reported intake in this population and made it more in agreement with import statistics. It is unknown to what extent these findings at the population level can be inferred to population subgroups.
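
    The completeness figures quoted above are simple ratios of survey-reported totals to import totals; a sketch with hypothetical totals (the absolute numbers are invented, only the ratios echo the paper):

```python
def reporting_completeness(survey_total, import_total):
    # Fraction of the imported quantity accounted for by survey reports.
    return survey_total / import_total

# Hypothetical annual totals in litres of pure alcohol.
habitual = reporting_completeness(45.0, 100.0)    # habitual weekly intake only
with_binge = reporting_completeness(78.0, 100.0)  # plus binge episodes
```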

  13. Imputing historical statistics, soils information, and other land-use data to crop area

    Science.gov (United States)

    Perry, C. R., Jr.; Willis, R. W.; Lautenschlager, L.

    1982-01-01

    In foreign crop condition monitoring, satellite-acquired imagery is routinely used. To facilitate interpretation of this imagery, it is advantageous to have estimates of the crop types and their extent for small area units, i.e., grid cells on a map, each representing, at 60 deg latitude, an area nominally 25 by 25 nautical miles in size. The feasibility of imputing historical crop statistics, soils information, and other ancillary data to crop area for a province in Argentina is studied.

  14. An on-demand provision model for geospatial multisource information with active self-adaption services

    Science.gov (United States)

    Fan, Hong; Li, Huan

    2015-12-01

    Location-related data are playing an increasingly irreplaceable role in business, government and scientific research. At the same time, the amount and the variety of data are rapidly increasing. Quickly finding the required information in this rapidly growing volume of data is a challenge, as is efficiently providing different levels of geospatial data to users. This paper puts forward a data-oriented access model for geographic information science data. First, we analyze the features of GIS data, including traditional types such as vector and raster data and new types such as Volunteered Geographic Information (VGI). Taking these analyses into account, a classification scheme for geographic data is proposed and TRAFIE is introduced to describe the establishment of a multi-level model for geographic data. Based on this model, a multi-level, scalable access system for geospatial information is put forward. Users can select different levels of data according to their concrete application needs. Pull-based and push-based data access mechanisms based on this model are presented. A Service Oriented Architecture (SOA) was chosen for the data processing. The model has been demonstrated with a simulation of fire disaster data collection supporting the decision-making processes of government departments. The use case shows that this data model and data provision system are flexible and have good adaptability.

  15. Multisource radiofrequency for fractional skin resurfacing-significant reduction of wrinkles.

    Science.gov (United States)

    Dahan, Serge; Rousseaux, Isabelle; Cartier, Hugues

    2013-04-01

    Skin roughness, color change, wrinkles and skin laxity are the main characteristics of aging skin. Dermatologists and plastic surgeons look for a treatment that will provide both epidermal resurfacing for the improvement of skin roughness and deep volumetric heating that will trigger collagen remodeling in the dermis to reduce wrinkles and skin laxity. These goals should be achieved with minimal pain and downtime. The study included 10 subjects (Fitzpatrick skin type 2-3) with a Fitzpatrick wrinkle and elastosis scale of 5-8 (average 7.3). Treatment was done with the fractional skin resurfacing handpiece of the EndyMed PRO multisource radiofrequency system (EndyMed Ltd, Cesarea, Israel). Treatment was repeated each month up to a total of three treatment sessions. Patients' photographs were graded according to accepted scales by board-certified dermatologists. Patients' pain and satisfaction were scored using dedicated questionnaires. Doctors' satisfaction was also evaluated. Post-treatment skin erythema was noted in all treated patients, lasting up to 10 hours. Fifty-six percent of patients reported no pain after treatment, and the rest (44%) reported minimal pain. All patients showed a significant reduction in the Fitzpatrick wrinkle score. The average Fitzpatrick wrinkle score was 7.3 at baseline, 4.9 at 1 month after the first treatment, 4.2 at 1 month after the second treatment, and 4.1 at 1 month after the third treatment. The score was similar at 3 months after the third treatment, at 4.1. When asked at the end of the three treatment sessions, all patients answered that they would recommend the treatment to their friends (66% "definitely yes" and 33% "yes"). When asked the same question 3 months after the end of treatment, all patients (100%) answered "definitely yes".

  16. Mental Illness Statistics

    Science.gov (United States)

    Research shows that mental illnesses are common. This resource provides statistics on the prevalence of mental illnesses and on consequences such as suicide and disability.

  17. Design and Experimental Evaluation on an Advanced Multisource Energy Harvesting System for Wireless Sensor Nodes

    Directory of Open Access Journals (Sweden)

    Hao Li

    2014-01-01

    Full Text Available An effective multisource energy harvesting system is presented as a power supply for wireless sensor nodes (WSNs). The advanced system contains not only an expandable power management module, including control of the charging and discharging process of the lithium polymer battery, but also an energy harvesting system using a maximum power point tracking (MPPT) circuit with an analog driving scheme for the collection of both solar and vibration energy sources. Since the MPPT and the power management module are utilized, the system is able to achieve low power consumption. Furthermore, a supercapacitor is integrated in the system so that current fluctuations of the lithium polymer battery during the charging and discharging processes can be properly reduced. In addition, through a simple analog switch circuit with low power consumption, the proposed system can automatically switch the power supply path according to the ambient energy sources and load power. A practical WSNs platform shows that the efficiency of the energy harvesting system can reach about 75–85% through a 24-hour environmental test, which confirms that the proposed system can be used as a long-term continuous power supply for WSNs.
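
    MPPT itself can be illustrated with the classic perturb-and-observe loop: repeatedly nudge the operating voltage and reverse direction whenever the harvested power drops. This is a generic sketch (the paper implements MPPT as an analog circuit; the toy power curve with its peak at 0.7 V is our assumption):

```python
def perturb_and_observe(power_at, v0=0.5, dv=0.01, steps=200):
    # Track the maximum power point: keep stepping the voltage in the
    # direction that increased power, and reverse when power falls.
    v = v0
    p_prev = power_at(v)
    direction = 1.0
    for _ in range(steps):
        v += direction * dv
        p = power_at(v)
        if p < p_prev:
            direction = -direction  # stepped past the peak
        p_prev = p
    return v

power = lambda v: max(0.0, v * (1.4 - v))  # toy P-V curve, peak at v = 0.7
v_mpp = perturb_and_observe(power)
```

    Once converged, the operating point oscillates within one step size of the true maximum, which is the usual accuracy/agility trade-off of perturb-and-observe.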

  18. Neural correlates of interference resolution in the multi-source interference task: a meta-analysis of functional neuroimaging studies.

    Science.gov (United States)

    Deng, Yuqin; Wang, Xiaochun; Wang, Yan; Zhou, Chenglin

    2018-04-10

    Interference resolution refers to cognitive control processes enabling one to focus on task-related information while filtering out unrelated information, but the exact neural areas that underlie interference resolution in a specific cognitive task are still equivocal. The multi-source interference task (MSIT) is a well-established experimental paradigm used to evaluate interference resolution. Studies combining the MSIT with functional magnetic resonance imaging (fMRI) have shown that the MSIT evokes the dorsal anterior cingulate cortex (dACC) and cingulate-frontal-parietal cognitive-attentional networks. However, these brain areas have not been evaluated quantitatively and these findings have not been replicated. In the current study, we report the first voxel-based meta-analysis of functional brain activation associated with the MSIT, so as to identify the localization of interference resolution in this specific cognitive task. Articles on MSIT-related fMRI published between 2003 and July 2017 were eligible. The electronic databases searched included PubMed, Web of Knowledge, and Google Scholar. Differential BOLD activation patterns between the incongruent and congruent conditions were meta-analyzed in anisotropic effect-size signed differential mapping software. Robustness meta-analysis indicated that two significant activation clusters showed reliable functional activity in comparisons between incongruent and congruent conditions. The first reliable activation cluster, which included the dACC, medial prefrontal cortex, and supplementary motor area, replicated the previous MSIT-related fMRI study results. Furthermore, we found another reliable activation cluster comprising areas of the right insula, right inferior frontal gyrus, and right lenticular nucleus-putamen, which were not typically discussed in previous MSIT-related fMRI studies. The current meta-analysis thus presents reliable brain activation patterns associated with interference resolution in the MSIT.

  19. Data and Statistics: Heart Failure

    Science.gov (United States)

    A resource providing data and statistics on heart failure, together with heart disease and stroke fact sheets and links to other data resources.

  20. Harnessing the fruits of nature for the development of multi-targeted cancer therapeutics.

    Science.gov (United States)

    Sarkar, Fazlul H; Li, Yiwei

    2009-11-01

    Cancer cells exhibit deregulation in multiple cellular signaling pathways. Therefore, treatments using specific agents that target only one pathway usually fail in cancer therapy. Combination treatments using chemotherapeutic agents with distinct molecular mechanisms are considered more promising for higher efficacy; however, using multiple agents contributes to added toxicity. Emerging evidence has shown that some "natural products", such as isoflavones, indole-3-carbinol (I3C) and its in vivo dimeric product 3,3'-diindolylmethane (DIM), and curcumin among many others, have growth-inhibitory and apoptosis-inducing effects on human and animal cancer cells mediated by targeting multiple cellular signaling pathways in vitro, without causing unwanted toxicity in normal cells. Therefore, these non-toxic "natural products" from natural resources could be useful in combination with conventional chemotherapeutic agents for the treatment of human malignancies with lower toxicity and higher efficacy. In fact, increasing evidence from pre-clinical in vivo studies and clinical trials has recently shown some success in support of the use of rationally designed multi-targeted therapies for the treatment of cancers using conventional chemotherapeutic agents in combination with "natural products". These studies have provided promising results and have further opened up new avenues for cancer therapy. In this review article, we succinctly summarize the known effects of "natural products", focusing especially on isoflavones, I3C and its in vivo dimeric product DIM, and curcumin, and provide a comprehensive view of the molecular mechanisms underlying the principle of cancer therapy using combinations of "natural products" with conventional therapeutics.

  1. Geospatial Information Service System Based on GeoSOT Grid & Encoding

    Directory of Open Access Journals (Sweden)

    LI Shizhong

    2016-12-01

    Full Text Available With the rapid development of space and earth observation technology, it is important to establish a multi-source, multi-scale and unified cross-platform reference for global data. In practice, the production and maintenance of geospatial data are scattered across different units, and the standard of the data grid varies between departments and systems, resulting in a disunity of standards across different historical periods and organizations. For the geospatial information security library of the national high-resolution earth observation program, there are demands for global display, associated retrieval, template applications and other integrated services for geospatial data. Based on the GeoSOT grid and encoding theory system, a data subdivision and organization solution that manages the geospatial information security library under a globally unified grid encoding has been proposed, and system-level analyses, research and designs have been carried out. The experimental results show that data organization and management based on GeoSOT can significantly improve the overall efficiency of the geospatial information security service system.
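
The unified grid encoding described above rests on recursive quadtree subdivision of the globe, with each cell identified by a one-dimensional code whose prefix identifies its parent cell. The exact GeoSOT bit layout is defined in the GeoSOT standard; the sketch below illustrates only the general idea with a simplified Morton/Z-order interleaving of latitude and longitude bits, not the actual GeoSOT code:

```python
def grid_code(lat, lon, level):
    """Simplified hierarchical grid code (Z-order style, NOT GeoSOT's
    actual layout): at each level the cell is halved along both axes
    and one bit per axis is appended to the code."""
    lat_min, lat_max = -90.0, 90.0
    lon_min, lon_max = -180.0, 180.0
    code = 0
    for _ in range(level):
        lat_mid = (lat_min + lat_max) / 2
        lon_mid = (lon_min + lon_max) / 2
        lat_bit = 1 if lat >= lat_mid else 0
        lon_bit = 1 if lon >= lon_mid else 0
        code = (code << 2) | (lat_bit << 1) | lon_bit
        # Narrow the cell to the chosen quadrant.
        if lat_bit:
            lat_min = lat_mid
        else:
            lat_max = lat_mid
        if lon_bit:
            lon_min = lon_mid
        else:
            lon_max = lon_mid
    return code
```

Because codes of nearby points share a common prefix, associated retrieval over such a grid reduces to prefix matching on the cell codes, which is what makes a unified encoding attractive for integrated geospatial services.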

  2. Uterine Cancer Statistics

    Science.gov (United States)

    ... Uterine Cancer Statistics ... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool The Data Visualizations tool makes ...

  3. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person must be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done at various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If the minimum value for standard achievement of course competence is 65, the students' mean values are lower than the standard. The misconception results indicate which sub-topics should be given attention: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, 3) determining which concept to use in solving a problem. Statistical reasoning skill was assessed with respect to: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
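
The comparison the abstract draws (class means of roughly 49.7 and 51.8 against a mastery standard of 65) can be expressed as a gap in standard-deviation units; a minimal sketch, with the reported summary statistics hard-coded as assumptions:

```python
def gap_in_sd(mean, sd, standard=65.0):
    """Number of standard deviations by which a class mean falls
    short of the mastery standard."""
    return (standard - mean) / sd

# Reported summary statistics (mean, SD) from the study.
misconception_gap = gap_in_sd(49.7, 10.6)   # ~1.44 SDs below 65
reasoning_gap = gap_in_sd(51.8, 8.5)        # ~1.55 SDs below 65
```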

  4. Multi-Targeted Molecular Effects of Hibiscus sabdariffa Polyphenols: An Opportunity for a Global Approach to Obesity.

    Science.gov (United States)

    Herranz-López, María; Olivares-Vicente, Mariló; Encinar, José Antonio; Barrajón-Catalán, Enrique; Segura-Carretero, Antonio; Joven, Jorge; Micol, Vicente

    2017-08-20

    Improper diet can alter gene expression by breaking the energy balance equation and changing metabolic and oxidative stress biomarkers, which can result in the development of obesity-related metabolic disorders. The pleiotropic effects of dietary plant polyphenols are capable of counteracting these alterations by modulating different key molecular targets in the cell, as well as through epigenetic modifications. Hibiscus sabdariffa (HS)-derived polyphenols are known to ameliorate various obesity-related conditions. Recent evidence points to the complex nature of the underlying mechanism of action. This multi-targeted mechanism includes the regulation of energy metabolism, oxidative stress and inflammatory pathways, transcription factors, hormones and peptides, digestive enzymes, as well as epigenetic modifications. This article reviews the accumulated evidence on the multiple anti-obesity effects of HS polyphenols in cell and animal models, as well as in humans, and their putative molecular targets. In silico studies reveal the capacity of several HS polyphenols to act as putative ligands for different digestive and metabolic enzymes, which may also deserve further attention. Therefore, a global approach including integrated and networked omics techniques, virtual screening and epigenetic analysis is necessary to fully understand the molecular mechanisms of the HS polyphenols and metabolites involved, as well as their possible implications in the design of safe and effective polyphenolic formulations for obesity.

  5. Multi-Targeted Molecular Effects of Hibiscus sabdariffa Polyphenols: An Opportunity for a Global Approach to Obesity

    Science.gov (United States)

    Herranz-López, María; Olivares-Vicente, Mariló; Barrajón-Catalán, Enrique; Segura-Carretero, Antonio; Joven, Jorge; Micol, Vicente

    2017-01-01

    Improper diet can alter gene expression by breaking the energy balance equation and changing metabolic and oxidative stress biomarkers, which can result in the development of obesity-related metabolic disorders. The pleiotropic effects of dietary plant polyphenols are capable of counteracting these alterations by modulating different key molecular targets in the cell, as well as through epigenetic modifications. Hibiscus sabdariffa (HS)-derived polyphenols are known to ameliorate various obesity-related conditions. Recent evidence points to the complex nature of the underlying mechanism of action. This multi-targeted mechanism includes the regulation of energy metabolism, oxidative stress and inflammatory pathways, transcription factors, hormones and peptides, digestive enzymes, as well as epigenetic modifications. This article reviews the accumulated evidence on the multiple anti-obesity effects of HS polyphenols in cell and animal models, as well as in humans, and their putative molecular targets. In silico studies reveal the capacity of several HS polyphenols to act as putative ligands for different digestive and metabolic enzymes, which may also deserve further attention. Therefore, a global approach including integrated and networked omics techniques, virtual screening and epigenetic analysis is necessary to fully understand the molecular mechanisms of the HS polyphenols and metabolites involved, as well as their possible implications in the design of safe and effective polyphenolic formulations for obesity. PMID:28825642

  6. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  7. Cancer Data and Statistics Tools

    Science.gov (United States)

    ... Cancer Data and Statistics Tools ... Cancer Statistics Tools United States Cancer Statistics: Data Visualizations The ...

  8. Enhanced statistical damage identification using frequency-shift information with tunable piezoelectric transducer circuitry

    International Nuclear Information System (INIS)

    Zhao, J; Tang, J; Wang, K W

    2008-01-01

    The frequency-shift-based damage detection method offers advantages such as global detection capability and easy implementation, but also suffers from drawbacks that include low detection accuracy and sensitivity and the difficulty of identifying damage using a small number of measurable frequencies. Moreover, the damage detection/identification performance is inevitably affected by the uncertainty/variations in the baseline model. In this research, we investigate an enhanced statistical damage identification method using tunable piezoelectric transducer circuitry. The tunable piezoelectric transducer circuitry can substantially enrich the information on frequency shift (before and after damage occurrence). The circuitry elements, meanwhile, can be directly and accurately measured and thus can be considered uncertainty-free. A statistical damage identification algorithm is formulated which can identify both the mean and variance of the elemental property change. Our analysis indicates that the integration of the tunable piezoelectric transducer circuitry can significantly enhance the robustness of the frequency-shift-based damage identification approach under uncertainty and noise.
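
The frequency-shift principle the record builds on can be illustrated on a toy lumped-parameter model: a stiffness reduction in one element shifts the natural frequencies, and the pattern of shifts carries information about the damage. The two-DOF chain and the 10% stiffness loss below are illustrative assumptions, not the authors' model or circuitry:

```python
import numpy as np

def natural_frequencies(k1, k2, m=1.0):
    """Natural frequencies (rad/s) of a 2-DOF spring-mass chain
    wall--k1--m--k2--m (free end), from the eigenvalues of M^-1 K."""
    K = np.array([[k1 + k2, -k2],
                  [-k2,      k2]])
    M = np.eye(2) * m
    eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(eigvals.real))

healthy = natural_frequencies(100.0, 100.0)
damaged = natural_frequencies(100.0, 90.0)   # 10% stiffness loss in k2
shift = (healthy - damaged) / healthy        # relative frequency shifts
```

Both frequencies drop when k2 is reduced, but by different relative amounts; it is this pattern of shifts, enriched and made statistically robust in the paper by the tunable circuitry, that a frequency-shift method inverts to locate and quantify damage.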

  9. A Study of volumetric modulated arc therapy for stereotactic body radiation therapy in case of multi-target liver cancer using flattening filter free beam

    International Nuclear Information System (INIS)

    Yeom, Mi Sook; Yoon, In Ha; Hong, Dong Gi; Back, Geum Mun

    2015-01-01

    Stereotactic body radiation therapy (SBRT) has proved its efficacy in several patient populations with primary and metastatic limited tumors. Because the SBRT prescription dose is higher than that of conventional radiation therapy, an SBRT plan must provide effective organ at risk (OAR) protection and sufficient planning target volume (PTV) dose coverage. In particular, multi-target cases may produce excessive OAR doses and hot spots due to dose overlap. This study evaluates the usefulness of volumetric modulated arc therapy (VMAT) with a flattening filter free (FFF) beam in terms of dosimetric and technical considerations. Treatment plans were created for five patients treated on TrueBeam STx (Varian™, USA): VMAT using a 10 MV FFF beam and standard conformal radiotherapy (CRT) using a 15 MV flattening filter (FF) beam. PTV, liver, duodenum, bowel, spinal cord, esophagus and stomach doses were evaluated using the dose volume histogram (DVH). The conformity index (CI), homogeneity index (HI) and Paddick's index (PCI) for the PTV were assessed, along with total monitor units (MU) and beam-on time. Average CI, HI and PCI for the PTV were 1.381±0.028, 1.096±0.016 and 0.944±0.473 in VMAT and 1.381±0.042, 1.136±0.042 and 1.534±0.465 in CRT, respectively. OAR doses in the CRT plans were 1.8 times higher than in VMAT, whereas total MU in VMAT was 1.3 times higher than in CRT. Average beam-on time was 6.8 minutes in VMAT and 21.3 minutes in CRT. VMAT for SBRT of multi-target liver cancer using an FFF beam is an effective treatment technique in dosimetric and technical terms, and reduces intra-fraction error by shortening treatment time through the high dose rate of the FFF beam

  10. A Study of volumetric modulated arc therapy for stereotactic body radiation therapy in case of multi-target liver cancer using flattening filter free beam

    Energy Technology Data Exchange (ETDEWEB)

    Yeom, Mi Sook; Yoon, In Ha; Hong, Dong Gi; Back, Geum Mun [Dept. of Radiation Oncology, ASAN Medical Center, Seoul (Korea, Republic of)

    2015-06-15

    Stereotactic body radiation therapy (SBRT) has proved its efficacy in several patient populations with primary and metastatic limited tumors. Because the SBRT prescription dose is higher than that of conventional radiation therapy, an SBRT plan must provide effective organ at risk (OAR) protection and sufficient planning target volume (PTV) dose coverage. In particular, multi-target cases may produce excessive OAR doses and hot spots due to dose overlap. This study evaluates the usefulness of volumetric modulated arc therapy (VMAT) with a flattening filter free (FFF) beam in terms of dosimetric and technical considerations. Treatment plans were created for five patients treated on TrueBeam STx (Varian™, USA): VMAT using a 10 MV FFF beam and standard conformal radiotherapy (CRT) using a 15 MV flattening filter (FF) beam. PTV, liver, duodenum, bowel, spinal cord, esophagus and stomach doses were evaluated using the dose volume histogram (DVH). The conformity index (CI), homogeneity index (HI) and Paddick's index (PCI) for the PTV were assessed, along with total monitor units (MU) and beam-on time. Average CI, HI and PCI for the PTV were 1.381±0.028, 1.096±0.016 and 0.944±0.473 in VMAT and 1.381±0.042, 1.136±0.042 and 1.534±0.465 in CRT, respectively. OAR doses in the CRT plans were 1.8 times higher than in VMAT, whereas total MU in VMAT was 1.3 times higher than in CRT. Average beam-on time was 6.8 minutes in VMAT and 21.3 minutes in CRT. VMAT for SBRT of multi-target liver cancer using an FFF beam is an effective treatment technique in dosimetric and technical terms, and reduces intra-fraction error by shortening treatment time through the high dose rate of the FFF beam.
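
The plan-quality indices quoted in the record above (CI, HI, PCI) can be computed from the target volume (TV), the prescription isodose volume (PIV), and the dose statistics. The sketch below uses common literature definitions (an RTOG-style conformity index, a Dmax-based homogeneity index, and Paddick's conformity index); the exact definitions used by the authors may differ:

```python
def conformity_index(piv, tv):
    """RTOG-style CI: prescription isodose volume / target volume."""
    return piv / tv

def paddick_index(tv_piv, tv, piv):
    """Paddick's conformity index: TV_PIV^2 / (TV * PIV), where TV_PIV
    is the part of the target covered by the prescription isodose.
    The ideal value is 1."""
    return tv_piv ** 2 / (tv * piv)

def homogeneity_index(d_max, d_prescription):
    """Simple HI: maximum dose / prescription dose."""
    return d_max / d_prescription
```

With, say, TV = 100 cc, PIV = 120 cc and 95 cc of the target inside the prescription isodose, CI is 1.2 and Paddick's index is about 0.75; values farther from 1 indicate poorer conformity.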

  11. Reports on internet traffic statistics

    NARCIS (Netherlands)

    Hoogesteger, Martijn; de Oliveira Schmidt, R.; Sperotto, Anna; Pras, Aiko

    2013-01-01

    Internet traffic statistics can provide valuable information to network analysts and researchers about the way networks are used nowadays. In the past, such information was provided by Internet2 in a public website called Internet2 NetFlow: Weekly Reports. The website reported traffic statistics

  12. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  13. Targeting MDM2 by the small molecule RITA: towards the development of new multi-target drugs against cancer

    Directory of Open Access Journals (Sweden)

    Espinoza-Fonseca L Michel

    2005-09-01

    Full Text Available Abstract Background The use of low-molecular-weight, non-peptidic molecules that disrupt the interaction between the p53 tumor suppressor and its negative regulator MDM2 has provided a promising alternative for the treatment of different types of cancer. Among these compounds, RITA (reactivation of p53 and induction of tumor cell apoptosis) has been shown to be effective in the selective induction of apoptosis, and this effect is due to its binding to the p53 tumor suppressor. Since biological systems are highly dynamic and MDM2 may bind to different regions of p53, new alternatives should be explored. On this basis, the computational "blind docking" approach was employed in this study to see whether RITA would bind to MDM2. Results It was observed that RITA binds to the MDM2 p53 transactivation domain-binding cleft. Thus, RITA can be used as a lead compound for designing improved "multi-target" drugs. This novel strategy could provide enormous benefits to enable effective anti-cancer strategies. Conclusion This study has demonstrated that a single molecule can target at least two different proteins related to the same disease.

  14. The spread of scientific information: insights from the web usage statistics in PLoS article-level metrics.

    Directory of Open Access Journals (Sweden)

    Koon-Kiu Yan

    Full Text Available The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML). These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact.

  15. The spread of scientific information: insights from the web usage statistics in PLoS article-level metrics.

    Science.gov (United States)

    Yan, Koon-Kiu; Gerstein, Mark

    2011-01-01

    The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML). These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact.
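
The modeling step in the two records above (a random multiplicative process producing a lognormal view-count distribution) can be sketched as follows; the growth-rate distribution and parameter values are illustrative assumptions, not the fitted PLoS values:

```python
import math
import random

def simulate_views(steps=100, mu=0.02, sigma=0.1, v0=1.0, seed=0):
    """Random multiplicative process: each step multiplies the
    cumulative view count by a random factor exp(g), g ~ N(mu, sigma).
    Since log(v) is a sum of i.i.d. terms, v is lognormal by the CLT."""
    rng = random.Random(seed)
    v = v0
    for _ in range(steps):
        v *= math.exp(rng.gauss(mu, sigma))
    return v

sample = [simulate_views(seed=s) for s in range(2000)]
logs = [math.log(v) for v in sample]
mean_log = sum(logs) / len(logs)   # ~ steps * mu for a lognormal fit
```

Fitting a normal distribution to the log-counts (rather than to the counts themselves) is exactly the "long tail well-fitted by a lognormal" observation, and the per-step growth rate plays the role of the time-dependent spread rate the authors extract.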

  16. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...

  17. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  18. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  19. Segmentation of human skull in MRI using statistical shape information from CT data.

    Science.gov (United States)

    Wang, Defeng; Shi, Lin; Chu, Winnie C W; Cheng, Jack C Y; Heng, Pheng Ann

    2009-09-01

    To automatically segment the skull from the MRI data using a model-based three-dimensional segmentation scheme. This study exploited the statistical anatomy extracted from the CT data of a group of subjects by means of constructing an active shape model of the skull surfaces. To construct a reliable shape model, a novel approach was proposed to optimize the automatic landmarking on the coupled surfaces (i.e., the skull vault) by minimizing the description length that incorporated local thickness information. This model was then used to locate the skull shape in MRI of a different group of patients. Compared with performing landmarking separately on the coupled surfaces, the proposed landmarking method constructed models that had better generalization ability and specificity. The segmentation accuracies were measured by the Dice coefficient and the set difference, and compared with the method based on mathematical morphology operations. The proposed approach using the active shape model based on the statistical skull anatomy presented in the head CT data contributes to more reliable segmentation of the skull from MRI data.
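
The Dice coefficient used above to score segmentation accuracy has the standard definition 2|A∩B|/(|A|+|B|); a minimal sketch on segmentations represented as sets of voxel indices:

```python
def dice_coefficient(a, b):
    """Dice overlap between two segmentations given as sets of voxel
    indices: 2*|A & B| / (|A| + |B|). 1.0 means perfect agreement,
    0.0 means no overlap."""
    if not a and not b:
        return 1.0   # two empty segmentations agree trivially
    return 2.0 * len(a & b) / (len(a) + len(b))
```

In practice the sets would hold the voxels labeled "skull" by the automatic method and by the ground truth; the set difference reported alongside Dice in the paper captures the non-overlapping voxels instead.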

  20. Feedback data sources that inform physician self-assessment.

    Science.gov (United States)

    Lockyer, Jocelyn; Armson, Heather; Chesluk, Benjamin; Dornan, Timothy; Holmboe, Eric; Loney, Elaine; Mann, Karen; Sargeant, Joan

    2011-01-01

    Self-assessment is a process of interpreting data about one's performance and comparing it to explicit or implicit standards. To examine the external data sources physicians used to monitor themselves. Focus groups were conducted with physicians who participated in three practice improvement activities: a multisource feedback program; a program providing patient and chart audit data; and practice-based learning groups. We used grounded theory strategies to understand the external sources that stimulated self-assessment and how they worked. Data from seven focus groups (49 physicians) were analyzed. Physicians used information from structured programs, other educational activities, professional colleagues, and patients. Data were of varying quality, often from non-formal sources with implicit (not explicit) standards. Mandatory programs elicited variable responses, whereas data and activities the physicians selected themselves were more likely to be accepted. Physicians used the information to create a reference point against which they could weigh their performance using it variably depending on their personal interpretation of its accuracy, application, and utility. Physicians use and interpret data and standards of varying quality to inform self-assessment. Physicians may benefit from regular and routine feedback and guidance on how to seek out data for self-assessment.

  1. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    Full Text Available Statistical survey is an effective method of statistical investigation that involves gathering quantitative data. It is often preferred in statistical reports because of the information it yields about an entire population from observing only a part of it. Because of the information they provide, surveys are used in many research areas. In economics, they are used in decision making, in choosing competitive strategies, in the analysis of economic phenomena, and in the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling is used to analyze the parking-space situation in a given locality.
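
The simple random sampling applied to parking spaces above can be sketched generically: draw a sample without replacement, estimate the proportion of occupied spaces, and attach a standard error with the finite-population correction (the sample being a non-negligible fraction of the population). The population size and occupancy rate below are illustrative, not the paper's data:

```python
import math
import random

def estimate_proportion(population, n, seed=0):
    """Simple random sample (without replacement) of size n from a 0/1
    population; returns the sample proportion and its standard error
    with the finite-population correction."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)
    p = sum(sample) / n
    N = len(population)
    fpc = (N - n) / (N - 1)          # finite-population correction
    se = math.sqrt(fpc * p * (1 - p) / n)
    return p, se

# Illustrative population: 1000 parking spaces, 600 occupied (coded 1).
population = [1] * 600 + [0] * 400
p_hat, se = estimate_proportion(population, n=100)
```

This is exactly the appeal the abstract describes: observing 100 of 1000 spaces gives an estimate of the occupancy rate whose uncertainty is quantified, without surveying the whole locality.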

  2. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  3. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  4. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  5. HPV-Associated Cancers Statistics

    Science.gov (United States)

    ... Statistics for Other Kinds of Cancer Breast Cervical Colorectal ( ... Vaginal and Vulvar Cancer ... HPV-Associated Cancer Statistics ...

  6. 1997 statistical yearbook

    International Nuclear Information System (INIS)

    1998-01-01

    The international office of energy information and studies (Enerdata) has published the second edition of its 1997 statistical yearbook, which includes 1996 data consolidated with respect to the previous version of June 1997. The CD-ROM comprises annual worldwide petroleum, natural gas, coal and electricity statistics from 1991 to 1996, with information about production, external trade, consumption, market shares, sectoral distribution of consumption and energy balance sheets. The world is divided into 12 zones (52 countries available). It also contains energy indicators: production and consumption tendencies, supply and production structures, security of supplies, energy efficiency, and CO2 emissions. (J.S.)

  7. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  8. Assessing medical students' performance in core competencies using multiple admission programs for colleges and universities: from the perspective of multi-source feedback.

    Science.gov (United States)

    Fang, Ji-Tseng; Ko, Yu-Shien; Chien, Chu-Chun; Yu, Kuang-Hui

    2013-01-01

    Since 1994, Taiwanese medical universities have employed the multiple application method, comprising "recommendations and screening" and "admission application." The purpose of this study is to examine whether medical students admitted through different admission programs performed differently. To evaluate the six core competencies for medical students proposed by the Accreditation Council for Graduate Medical Education (ACGME), this study employed various assessment tools, including student opinion feedback, multi-source feedback (MSF), course grades, and examination results. The MSF comprises a self-assessment scale, a peer assessment scale, a nursing staff assessment scale, a visiting staff assessment scale, and a chief resident assessment scale. In the subscales, the Cronbach's alpha values were higher than 0.90, indicating good reliability. Research participants consisted of 182 students from the School of Medicine at Chang Gung University. Regarding students' average grade for the medical ethics course, the performance of students who were enrolled through school recommendations exceeded that of students who were enrolled through the National College University Entrance Examination (NCUEE) (p = 0.011), and all considered "teamwork" the most important competency. Students from different entry pipelines showed no significant difference on the "communication," "work attitude," "medical knowledge," and "teamwork" assessment scales. The improvement rate of the students enrolled through school recommendations was better than that of the students enrolled through the NCUEE in the "professional skills," "medical core competencies," "communication," and "teamwork" items of the self-assessment and peer assessment scales. However, the students enrolled through the NCUEE were better in the "professional skills," "medical core competencies," "communication," and "teamwork" items of the visiting staff assessment scale and the chief resident assessment scale. Collectively
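
    Cronbach's alpha, used in this record to gauge the reliability of each MSF subscale, is a simple function of item and total-score variances. A minimal sketch, with toy ratings rather than study data:

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha for a list of item-score columns:
        alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
        k = len(items)
        n = len(items[0])

        def var(xs):  # population variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)

        totals = [sum(col[i] for col in items) for i in range(n)]
        return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

    # Five raters scoring three fairly consistent items on a 1-5 scale:
    item1 = [4, 5, 3, 4, 5]
    item2 = [4, 4, 3, 4, 5]
    item3 = [5, 5, 3, 4, 4]
    print(round(cronbach_alpha([item1, item2, item3]), 2))  # 0.84
    ```

    Values above 0.90, as reported for the MSF subscales, indicate that the items of a subscale measure the same underlying construct very consistently.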

  9. Assessing medical students' performance in core competencies using multiple admission programs for colleges and universities: From the perspective of multi-source feedback

    Directory of Open Access Journals (Sweden)

    Ji-Tseng Fang

    2013-08-01

    Full Text Available Background: Since 1994, Taiwanese medical universities have employed the multiple application method, comprising "recommendations and screening" and "admission application." The purpose of this study is to examine whether medical students admitted through different admission programs performed differently. Methods: To evaluate the six core competencies for medical students proposed by the Accreditation Council for Graduate Medical Education (ACGME), this study employed various assessment tools, including student opinion feedback, multi-source feedback (MSF), course grades, and examination results. The MSF comprises a self-assessment scale, a peer assessment scale, a nursing staff assessment scale, a visiting staff assessment scale, and a chief resident assessment scale. In the subscales, the Cronbach's alpha values were higher than 0.90, indicating good reliability. Research participants consisted of 182 students from the School of Medicine at Chang Gung University. Results: Regarding students' average grade for the medical ethics course, the performance of students who were enrolled through school recommendations exceeded that of students who were enrolled through the National College University Entrance Examination (NCUEE) (p = 0.011), and all considered "teamwork" the most important competency. Students from different entry pipelines showed no significant difference on the "communication," "work attitude," "medical knowledge," and "teamwork" assessment scales. The improvement rate of the students enrolled through school recommendations was better than that of the students enrolled through the NCUEE in the "professional skills," "medical core competencies," "communication," and "teamwork" items of the self-assessment and peer assessment scales. However, the students enrolled through the NCUEE were better in the "professional skills," "medical core competencies," "communication," and "teamwork" items of the visiting staff assessment scale and the

  10. Statistical secrecy and multibit commitments

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Pedersen, Torben P.; Pfitzmann, Birgit

    1998-01-01

    We present and compare definitions of "statistically hiding" protocols, and we propose a novel statistically hiding commitment scheme. Informally, a protocol statistically hides a secret if a computationally unlimited adversary who conducts the protocol with the owner of the secret learns almost nothing about it. One definition is based on the L1-norm distance between probability distributions, the other on information theory. We prove that the two definitions are essentially equivalent. We also show that statistical counterparts of definitions of computational secrecy are essentially equivalent to our main definitions. Commitment schemes are an important cryptologic primitive. Their purpose is to commit one party to a certain value, while hiding this value from the other party until some later time. We present a statistically hiding commitment scheme allowing commitment to many bits...
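
    The L1-norm definition referenced above corresponds to the statistical (total variation) distance between probability distributions. A minimal illustrative sketch, with toy numbers not taken from the paper:

    ```python
    # Statistical (total variation) distance between two discrete distributions:
    # delta(P, Q) = 1/2 * sum_x |P(x) - Q(x)|.
    # Informally, a commitment is statistically hiding if the transcript
    # distributions for any two committed values are within negligible delta.

    def statistical_distance(p, q):
        """Half the L1-norm distance between two probability vectors."""
        assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(q) - 1.0) < 1e-9
        return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

    # Hypothetical transcript distributions for commitments to bit 0 vs. bit 1:
    p = [0.50, 0.25, 0.25]
    q = [0.49, 0.26, 0.25]
    print(statistical_distance(p, q))  # ~0.01: the transcripts leak very little
    ```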

  11. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  12. Towards an optimal adaptation of exposure to NOAA assessment methodology in Multi-Source Industrial Scenarios (MSIS): the challenges and the decision-making process

    Science.gov (United States)

    López de Ipiña, JM; Vaquero, C.; Gutierrez-Cañas, C.

    2017-06-01

    A progressive increase is expected in industrial processes that manufacture intermediate products (iNEPs) and end products (eNEPs) incorporating ENMs to bring about improved properties. The assessment of occupational exposure to airborne NOAA will therefore migrate from the simple and well-controlled exposure scenarios found in research laboratories and ENMs production plants using innovative production technologies to much more complex exposure scenarios located around processes for the manufacture of eNEPs that, in many cases, will be modified conventional production processes. This paper discusses some of the typical challenging situations in the process of assessing the risk of inhalation exposure to NOAA in Multi-Source Industrial Scenarios (MSIS), on the basis of the lessons learned when confronting those scenarios in the frame of several European and Spanish research projects.

  13. Structural Optimization of a High-Speed Press Considering Multi-Source Uncertainties Based on a New Heterogeneous TOPSIS

    Directory of Open Access Journals (Sweden)

    Jin Cheng

    2018-01-01

    Full Text Available In order to achieve high punching precision, good operational reliability and low manufacturing cost, the structural optimization of a high-speed press in the presence of a set of available alternatives constitutes a heterogeneous multiple-attribute decision-making (HMADM) problem involving deviation, fixation, cost and benefit attributes that can be described in various mathematical forms due to the existence of multi-source uncertainties. Such a HMADM problem cannot be easily resolved by existing methods. To overcome this difficulty, a new heterogeneous technique for order preference by similarity to an ideal solution (HTOPSIS) is proposed. A new approach to the normalization of heterogeneous attributes is proposed by integrating the possibility degree method, the relative preference relation and the attribute transformation technique. Expressions for determining the positive and negative ideal solutions corresponding to heterogeneous attributes are also developed. Finally, alternative structural configurations are ranked according to their relative closeness coefficients, and the optimal structural configuration can be determined. The validity and effectiveness of the proposed HTOPSIS are demonstrated by a numerical example. The proposed HTOPSIS can also be applied to the structural optimization of other complex equipment, because its application has no prerequisite of independence among the various attributes.
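
    The final ranking step of any TOPSIS variant, including the HTOPSIS above, computes relative closeness coefficients to positive and negative ideal solutions. A minimal sketch for ordinary crisp numeric attributes (the paper's heterogeneous normalization is more involved, and the data below are hypothetical):

    ```python
    import numpy as np

    def topsis_rank(matrix, weights, benefit):
        """Rank alternatives by relative closeness to the ideal solution.

        matrix: alternatives x attributes (crisp values); benefit[j] is True
        for benefit attributes, False for cost attributes."""
        m = np.asarray(matrix, dtype=float)
        # Vector-normalize each attribute column, then apply the weights.
        v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
        # Positive/negative ideal solutions depend on the attribute type.
        pis = np.where(benefit, v.max(axis=0), v.min(axis=0))
        nis = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - pis, axis=1)
        d_neg = np.linalg.norm(v - nis, axis=1)
        return d_neg / (d_pos + d_neg)  # closeness: 1 = ideal, 0 = anti-ideal

    # Three hypothetical press designs: [punching precision, cost]
    scores = topsis_rank([[0.9, 120.0], [0.8, 80.0], [0.6, 60.0]],
                         weights=[0.6, 0.4], benefit=[True, False])
    print(scores.argmax())  # index of the preferred design
    ```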

  14. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Full Text Available Parameter estimation with confidence intervals and statistical hypothesis testing are used in statistical analysis to draw conclusions about a population from an extracted sample. The case study presented in this paper aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained when using confidence intervals and hypothesis tests. Whereas statistical hypothesis testing only gives a "yes" or "no" answer to certain questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and qualifies findings that are "marginally significant" or "almost significant" (p very close to 0.05).
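
    The record's point, that a confidence interval conveys more than a bare reject/do-not-reject decision, especially for small samples, can be illustrated with a quick z-based sketch (illustrative numbers; known sigma is assumed for simplicity):

    ```python
    import math
    from statistics import NormalDist

    def mean_ci_and_p(sample_mean, sigma, n, mu0, alpha=0.05):
        """z-based confidence interval for the mean, and the two-sided
        p-value for H0: mu = mu0 (known sigma, for simplicity)."""
        z = NormalDist().inv_cdf(1 - alpha / 2)
        half = z * sigma / math.sqrt(n)
        ci = (sample_mean - half, sample_mean + half)
        z_stat = (sample_mean - mu0) / (sigma / math.sqrt(n))
        p = 2 * (1 - NormalDist().cdf(abs(z_stat)))
        return ci, p

    # Same observed mean, two sample sizes: the test only answers yes/no,
    # while the interval shows how much the small sample leaves undetermined.
    for n in (10, 100):
        ci, p = mean_ci_and_p(sample_mean=10.5, sigma=2.0, n=n, mu0=10.0)
        print(n, [round(b, 2) for b in ci], round(p, 3))
    ```

    With n = 10 the interval is wide and the test does not reject; with n = 100 the same observed mean yields a narrow interval excluding 10.0 and a significant p-value.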

  15. Statistical decisions under nonparametric a priori information

    International Nuclear Information System (INIS)

    Chilingaryan, A.A.

    1985-01-01

    The basic module of an applied program package for the statistical analysis of ANI experiment data is described. By means of this module, the tasks of choosing the theoretical model that most adequately fits the experimental data, selecting events of a definite type, and identifying elementary particles are carried out. To solve these problems, Bayesian rules, the leave-one-out test and KNN (K Nearest Neighbour) adaptive density estimation are utilized
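
    The KNN adaptive density estimation mentioned in this record can be sketched in a few lines; a Bayesian rule then assigns an event to the class with the highest estimated density (times prior). All names and numbers below are illustrative, not taken from the ANI package:

    ```python
    def knn_density(x, sample, k):
        """kNN (adaptive-bandwidth) density estimate in 1-D:
        f(x) ~ k / (2 * n * r_k), where r_k is the distance from x to its
        k-th nearest sample point."""
        dists = sorted(abs(x - s) for s in sample)
        r_k = dists[k - 1]
        return k / (2 * len(sample) * r_k)

    # Two simulated event classes (e.g. two particle hypotheses):
    class_a = [0.9, 1.0, 1.1, 1.2, 1.3, 1.4]
    class_b = [2.8, 3.0, 3.1, 3.2, 3.3, 3.6]

    def classify(x, k=3):
        # Bayesian rule with equal priors: pick the class of higher density.
        return "A" if knn_density(x, class_a, k) > knn_density(x, class_b, k) else "B"

    print(classify(1.15))  # A
    print(classify(3.05))  # B
    ```

    The estimate is "adaptive" because the bandwidth r_k shrinks where the data are dense and widens where they are sparse; the leave-one-out test mentioned above would re-run such a classifier once per event, each time excluding that event from the sample.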

  16. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  17. Arizona Public Library Statistics, 1999-2000.

    Science.gov (United States)

    Arizona State Dept. of Library, Archives and Public Records, Phoenix.

    These statistics were compiled from information supplied by Arizona's public libraries. The document is divided according to the following county groups: Apache, Cochise; Coconino, Gila; Graham, Greenlee, La Paz; Maricopa; Mohave, Navajo; Pima, Pinal; Santa Cruz, Yavapai; Yuma. Statistics are presented on the following: general information;…

  18. Pre-2014 mudslides at Oso revealed by InSAR and multi-source DEM analysis

    Science.gov (United States)

    Kim, J. W.; Lu, Z.; QU, F.

    2014-12-01

    A landslide is a process that results in the downward and outward movement of slope-reshaping materials, including rocks and soils, and annually causes losses of approximately $3.5 billion and tens of casualties in the United States. The 2014 Oso mudslide was an extreme event, claiming nearly 40 lives and damaging civilian properties. Landslides are often unpredictable, but in many cases catastrophic events are repetitive. Historical records at the Oso mudslide site indicate that there have been serial events over the decades, though the extent of the sliding events varied from time to time. In our study, the combination of multi-source DEMs, InSAR, and time-series InSAR analysis has enabled us to characterize the Oso mudslide. InSAR results from ALOS PALSAR show that there was no significant deformation between mid-2006 and 2011. The combination of time-series InSAR analysis and an older DEM revealed topographic changes associated with the 2006 sliding event, which is confirmed by the difference of multiple LiDAR DEMs. Precipitation and discharge measurements before the 2006 and 2014 landslide events did not exhibit extremely anomalous records, suggesting that precipitation is not the controlling factor in determining the sliding events at Oso. The lack of surface deformation during 2006-2011 and the weak correlation between precipitation and the sliding events suggest that other factors (such as porosity) might play a critical role in the run-away events at Oso and other similar landslides.

  19. Mathematical model of statistical identification of information support of road transport

    Directory of Open Access Journals (Sweden)

    V. G. Kozlov

    2016-01-01

    Full Text Available In this paper, based on the statistical identification method and using the theory of self-organizing systems, a multifactor model of the relationship between road transport and its training system is built. The background information for the model is represented by a number of parameters of average annual road transport operations and information provision, including training complex system parameters (inputs), road management parameters, and output parameters. Two criteria are specified: a model stability criterion and a correlation test. The program determines their minimum, which identifies the unique model of optimal complexity. For the predetermined number of parameters, a mathematical relationship of each output parameter with the others is established. To improve the accuracy and regularity of the forecast, interpolation nodes are allocated in the test data sequence; the other data form the training sequence. The model is decided on the principle of selection, running with a gradual complication of the mathematical description and an exhaustive search of all possible variants of the models against the specified criteria. The advantages of the proposed model are that it adequately reflects the actual process; it allows any additional input parameters to be entered and their impact on the individual output parameters of road transport to be determined; it allows the values of key parameters to be changed in turn in a certain ratio and the corresponding changes in the output parameters of road transport to be determined; and it allows the output parameters of road transport operations to be predicted.

  20. Medical facility statistics in Japan.

    Science.gov (United States)

    Hamajima, Nobuyuki; Sugimoto, Takuya; Hasebe, Ryo; Myat Cho, Su; Khaing, Moe; Kariya, Tetsuyoshi; Mon Saw, Yu; Yamamoto, Eiko

    2017-11-01

    Medical facility statistics provide essential information to policymakers, administrators, academics, and practitioners in the field of health services. In Japan, the Health Statistics Office of the Director-General for Statistics and Information Policy at the Ministry of Health, Labour and Welfare generates these statistics. Although the statistics are widely available in both Japanese and English, the methodology described in the technical reports is primarily in Japanese and is not fully described in English. This article aims to describe these processes for readers in the English-speaking world. The Health Statistics Office routinely conducts two surveys, called the Hospital Report and the Survey of Medical Institutions. The subjects of the former are all the hospitals and clinics with long-term care beds in Japan. It comprises a Patient Questionnaire focusing on the numbers of inpatients, admissions, discharges, and outpatients in one month, and an Employee Questionnaire, which asks about the number of employees as of October 1. The Survey of Medical Institutions consists of the Dynamic Survey, which focuses on the opening and closing of facilities every month, and the Static Survey, which focuses on staff, facilities, and services as of October 1, as well as the number of inpatients as of September 30 and the total number of outpatients during September. All hospitals, clinics, and dental clinics are requested to submit the Static Survey questionnaire every three years. These surveys are useful tools for collecting essential information, as well as providing occasions to implicitly inform facilities of the movements of government policy.

  1. Arizona Public Library Statistics, 2000-2001.

    Science.gov (United States)

    Elliott, Jan, Comp.

    These statistics were compiled from information supplied by Arizona's public libraries. The document is divided according to the following county groups: Apache, Cochise; Coconino, Gila; Graham, Greenlee, La Paz; Maricopa; Mohave, Navajo; Pima, Pinal; Santa Cruz, Yavapai; and Yuma. Statistics are presented on the following: general information;…

  2. Data and Statistics: Women and Heart Disease

    Science.gov (United States)

    Data and statistics on women and heart disease, including heart disease and stroke fact sheets, Coverdell Program 2012-2015 state summaries, other data resources, grantee information and online tools.

  3. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  4. Simulation Research Framework with Embedded Intelligent Algorithms for Analysis of Multi-Target, Multi-Sensor, High-Cluttered Environments

    Science.gov (United States)

    Hanlon, Nicholas P.

    nearly identical performance metrics at orders of magnitude faster execution. Second, a fuzzy inference system is presented that relieves air traffic controllers of information overload by utilizing flight plan data and radar/GPS correlation values to highlight aircraft that deviate from their intended routes. Third, a genetic algorithm optimizes sensor placement so that it is robust and capable of handling unexpected routes in the environment. Fourth, a fuzzy CUSUM algorithm more accurately detects and corrects aircraft mode changes. Finally, all the work is packaged in a holistic simulation research framework that provides evaluation and analysis of various multi-sensor, multi-target scenarios.
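
    A plain (non-fuzzy) CUSUM change detector, the building block behind the fuzzy CUSUM mentioned in this record, can be sketched as follows; the residuals and thresholds are illustrative, not from the dissertation:

    ```python
    def cusum(samples, target, drift, threshold):
        """One-sided CUSUM: accumulate positive deviations from the target
        (minus an allowed drift) and return the index where the cumulative
        sum first exceeds the threshold, or None if no change is detected."""
        s = 0.0
        for i, x in enumerate(samples):
            s = max(0.0, s + (x - target - drift))
            if s > threshold:
                return i
        return None

    # Aircraft altitude-rate residuals: steady flight, then a mode change
    # beginning at index 5; the alarm fires a couple of samples later.
    residuals = [0.1, -0.2, 0.0, 0.1, -0.1, 1.2, 1.0, 1.3, 1.1, 1.2]
    print(cusum(residuals, target=0.0, drift=0.3, threshold=2.0))  # 7
    ```

    A fuzzy variant would replace the crisp threshold comparison with membership functions, trading the hard alarm for a graded confidence in the mode change.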

  5. Localized prostate cancer treatment decision-making information online: improving its effectiveness and dissemination for nonprofit and government-supported organizations.

    Science.gov (United States)

    Silk, Kami J; Perrault, Evan K; Nazione, Samantha; Pace, Kristin; Hager, Polly; Springer, Steven

    2013-12-01

    The current study reports findings from evaluation research conducted to identify how online prostate cancer treatment decision-making information can be both improved and more effectively disseminated to those who need it most. A multi-method, multi-target approach was used, guided by McGuire's Communication Matrix Model. Focus groups (n = 31) with prostate cancer patients and their family members, and in-depth interviews with physicians (n = 8), helped inform a web survey (n = 89). Results indicated that physicians remain a key information source for medical advice and that the Internet is a primary channel used to help make informed prostate cancer treatment decisions. Participants reported a need for more accessible information related to treatment options and treatment side effects. Additionally, physicians indicated that the best way for agencies to reach them with new information to deliver to patients is by contacting them directly and meeting with them one-on-one. Advice for organizations to improve their current prostate cancer web offerings and further ways to improve information dissemination are discussed.

  6. Signal and data processing of small targets 1990; Proceedings of the Meeting, Orlando, FL, Apr. 16-18, 1990

    Science.gov (United States)

    Drummond, Oliver E.

    Various papers on signal and data processing of small targets are presented. Individual topics addressed include: clutter rejection using multispectral processing, new sensor for automatic guidance, Doppler domain localized generalized-likelihood-ratio detector, linear filter for resolution of point sources, analysis of order-statistic filters for robust detection, distribution functions for additive Gaussian and gamma noise, temperature discrimination of closely spaced objects, knowledge-based tracking algorithm, detecting and tracking low-observable targets using IR, weak target detection using the entropy concept, measurement-based neural-net multitarget tracker, object-track closed-form solution in angle space, passive-sensor data association for tracking. Also discussed are: application of MHT to dim moving targets, automatic static covariance analysis with Mathematica, neural network implementation of plot/track association, application of Bayesian networks to multitarget tracking, tracking clusters and extended objects with multiple sensors, target tracking by human operator, finite impulse response estimator, multitarget tracking using an extended Kalman filter, maneuvering target tracking using a two-state model algorithm, mixture reduction algorithms for target tracking in clutter, scene interpretation approach to high-level target tracking.

  7. Some challenges with statistical inference in adaptive designs.

    Science.gov (United States)

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have attracted a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with the added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of the sample size or related statistical information, and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.

  8. Birth Defects Data and Statistics

    Science.gov (United States)

    Read below for the latest national statistics on the occurrence of birth defects in the United States.

  9. Transportation statistics annual report 2010

    Science.gov (United States)

    2011-01-01

    The Transportation Statistics Annual Report (TSAR) presents data and information compiled by the Bureau of Transportation Statistics (BTS), a component of the U.S. Department of Transportation's (USDOT's) Research and Innovative Technology Admini...

  10. Transportation statistics annual report 2009

    Science.gov (United States)

    2009-01-01

    The Transportation Statistics Annual Report (TSAR) presents data and information selected by the Bureau of Transportation Statistics (BTS), a component of the U.S. Department of Transportation's (USDOT's) Research and Innovative Technology Administra...

  11. Order-specific fertility estimates based on perinatal statistics and statistics on out-of-hospital births

    OpenAIRE

    Kreyenfeld, Michaela; Peters, Frederik; Scholz, Rembrandt; Wlosnewski, Ines

    2014-01-01

    Until 2008, German vital statistics did not provide information on biological birth order. We have tried to close part of this gap by providing order-specific fertility rates generated from the Perinatal Statistics and statistics on out-of-hospital births for the period 2001-2008. This investigation was published in Comparative Population Studies (CPoS) (see Kreyenfeld, Scholz, Peters and Wlosnewski 2010). The CPoS paper describes how data from the Perinatal Statistics and statistics on out...

  12. Performance Assessment of Multi-Source Weighted-Ensemble Precipitation (MSWEP Product over India

    Directory of Open Access Journals (Sweden)

    Akhilesh S. Nair

    2017-01-01

    Full Text Available Error characterization is vital for the advancement of precipitation algorithms, the evaluation of numerical model outputs, and their integration in various hydro-meteorological applications. The Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) has been a benchmark for successive Global Precipitation Measurement (GPM) based products. This has given way to the evolution of many multi-satellite precipitation products. This study evaluates the performance of the newly released multi-satellite Multi-Source Weighted-Ensemble Precipitation (MSWEP) product, whose temporal variability was determined based on several data products including TMPA 3B42 RT. The evaluation was conducted over India with respect to IMD gauge-based rainfall for the pre-monsoon, monsoon, and post-monsoon seasons at daily scale for a 35-year (1979–2013) period. The rainfall climatology is examined over India and over four geographical extents within India known to be subject to uniform rainfall. A performance evaluation of the rainfall time series was carried out. In addition, the performance of the product over different rainfall classes was evaluated, along with the contribution of each class to the total rainfall. Further, the seasonal evaluation of the MSWEP product was based on the categorical and volumetric indices from the contingency table. Upon evaluation it was observed that the MSWEP product shows large errors in detecting the higher quantiles of rainfall (>75th and >95th quantiles). The MSWEP precipitation product, available at a 0.25° × 0.25° spatial resolution and daily temporal resolution, matched well with the daily IMD rainfall over India. Overall results suggest that a suitable region- and season-dependent bias correction is essential before its integration in hydrological applications. While the MSWEP was observed to perform well for daily rainfall, it suffered from poor detection capabilities for higher quantiles, making
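
    The categorical indices from a rain/no-rain contingency table mentioned in this record are standard verification scores (probability of detection, false alarm ratio, critical success index). A minimal sketch with hypothetical counts, not the study's data:

    ```python
    def categorical_scores(hits, misses, false_alarms):
        """Standard verification indices from a 2x2 contingency table:
        POD (probability of detection), FAR (false alarm ratio) and
        CSI (critical success index)."""
        pod = hits / (hits + misses)
        far = false_alarms / (hits + false_alarms)
        csi = hits / (hits + misses + false_alarms)
        return pod, far, csi

    # Hypothetical daily counts: product vs. gauge, rain/no-rain threshold.
    pod, far, csi = categorical_scores(hits=620, misses=180, false_alarms=155)
    print(round(pod, 3), round(far, 3), round(csi, 3))  # 0.775 0.2 0.649
    ```

    Volumetric variants of these indices weight each cell of the table by rain amount instead of counting occurrences, which is what exposes the poor detection of the higher quantiles noted above.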

  13. Quantum formalism for classical statistics

    Science.gov (United States)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
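
    The transport of local probabilistic information described above can be illustrated, in a simplified probabilistic reading, by the transfer matrix of a two-state (Ising-type) chain; the coupling value is illustrative:

    ```python
    import numpy as np

    # Transfer matrix for a two-state (s = ±1) chain with nearest-neighbour
    # coupling beta: T[s, s'] = exp(beta * s * s').
    beta = 0.5
    spins = np.array([1.0, -1.0])
    T = np.exp(beta * np.outer(spins, spins))

    # Boundary information: the chain starts in state +1 with certainty.
    q = np.array([1.0, 0.0])

    # Propagate the local probabilistic information site by site into the
    # bulk; the boundary memory decays towards the equal-weight distribution.
    for _ in range(20):
        q = T @ q
        q /= q.sum()          # keep it a probability vector
    print(np.round(q, 4))     # approaches [0.5, 0.5] deep in the bulk
    ```

    The decay rate is set by the ratio of the two eigenvalues of T (here tanh(beta)), which is the sense in which information from the boundary is transported, and gradually lost, between neighboring hypersurfaces.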

  14. Spina Bifida Data and Statistics

    Science.gov (United States)

    Spina bifida is a birth defect of the spine. Read below for the latest national statistics on spina bifida in the United States.

  15. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study the temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other, more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinates of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, the Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to decrease the statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-)communities.
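
    The core of TLA, regressing community dissimilarity on the (square root of the) time lag, can be sketched as follows; the community matrix below is a toy example with a built-in directional trend, not data from the study:

    ```python
    import numpy as np

    def time_lag_analysis(community):
        """Regress Euclidean dissimilarity between all pairs of sampling
        dates on the square root of their time lag; a positive slope
        indicates directional community change."""
        t = community.shape[0]
        lags, dissim = [], []
        for i in range(t):
            for j in range(i + 1, t):
                lags.append(np.sqrt(j - i))
                dissim.append(np.linalg.norm(community[i] - community[j]))
        slope, _intercept = np.polyfit(lags, dissim, 1)
        return slope

    # Toy community matrix (rows = years, columns = 4 species) with a
    # steady directional drift in abundance:
    comm = np.outer(1.0 + 0.1 * np.arange(10), np.ones(4))
    print(time_lag_analysis(comm) > 0)  # True: directional change
    ```

    The ordination alternative favored by the study would instead regress the raw species matrix on PCNM basis functions of time, which is what lets it attribute the temporal patterns to individual species.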

  16. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
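
    The idea of ranking images by quantified information content can be sketched as follows. The particular statistics chosen here (variance, gradient energy, grey-level entropy) are hypothetical illustrations; the report does not fix a specific statistic set.

```python
import numpy as np

def image_stats(img, bins=32):
    """Summary statistics that quantify the information in one image."""
    img = img.astype(float)
    gy, gx = np.gradient(img)                    # local structure
    hist, _ = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))            # grey-level diversity
    return {"variance": float(img.var()),
            "gradient_energy": float(np.mean(gx**2 + gy**2)),
            "entropy": float(entropy)}

rng = np.random.default_rng(1)
flat = np.full((64, 64), 0.5)        # uninteresting: constant image
busy = rng.random((64, 64))          # high-information: random texture
s_flat = image_stats(flat)
s_busy = image_stats(busy)
```

    Sorting a large collection by such scores surfaces candidate images of interest without inspecting each one.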

  17. Water Quality attainment Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Designated uses assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality...

  18. Water Quality Stressor Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Stressors assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality assessments...

  19. Breakthroughs in statistics

    CERN Document Server

    Johnson, Norman

    This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980's while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...

  20. Mathematical statistics and stochastic processes

    CERN Document Server

    Bosq, Denis

    2013-01-01

    Generally, books on mathematical statistics are restricted to the case of independent identically distributed random variables. In this book, however, both this case AND the case of dependent variables, i.e. statistics for discrete and continuous time processes, are studied. This second case is very important for today's practitioners. Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of theory of probability, estimation, confidence intervals, non-parametric statistics and rob

  1. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey. (eb)

  3. Knowing Where They Went: Six Years of Online Access Statistics via the Online Catalog for Federal Government Information

    Science.gov (United States)

    Brown, Christopher C.

    2011-01-01

    As federal government information is increasingly migrating to online formats, libraries are providing links to this content via URLs or persistent URLs (PURLs) in their online public access catalogs (OPACs). Clickthrough statistics that accumulated as users visited links to online content in the University of Denver's library OPAC were gathered…

  4. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    Science.gov (United States)

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only presented with an abundance of information, but also continuously confronted with the risks associated with erroneously copying another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces. PMID:21487489

  6. Validation of multisource electronic health record data: an application to blood transfusion data.

    Science.gov (United States)

    Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B

    2017-07-14

    Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps or addressing the potential influence of linking multiple sources. Therefore we developed a practical approach for validating routinely collected data from multiple sources and to apply it to a blood transfusion data warehouse to evaluate the usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) and internal consistency concepts which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to a transfusion dataset resulted in a structured overview of data validity aspects. This allowed improvement of these aspects through further processing of the data and in some cases adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products initially was 2.2% but could be improved by adjusting data extraction criteria to 0.17%. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. When the process of data validation is adopted more broadly, this contributes to increased transparency and greater reliability of research based on routinely collected electronic health records.
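
    One of the internal-consistency outcomes described above, the proportion of transfused products that cannot be linked to a corresponding issued product, can be sketched as a simple linkage check. The column name `product_id` is a hypothetical stand-in; the paper reports this proportion improving from 2.2% to 0.17% after adjusting the extraction criteria.

```python
import pandas as pd

def unlinked_proportion(transfused, issued, key="product_id"):
    """Fraction of transfused products with no matching issued product,
    via a left merge with an indicator column."""
    merged = transfused.merge(issued[[key]].drop_duplicates(),
                              on=key, how="left", indicator=True)
    return float((merged["_merge"] == "left_only").mean())

transfused = pd.DataFrame({"product_id": ["A", "B", "C", "D"]})
issued = pd.DataFrame({"product_id": ["A", "B", "C"]})
rate = unlinked_proportion(transfused, issued)   # "D" has no issued match
```

    Tracking such a metric before and after each extraction change makes the validation process quantifiable and repeatable.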

  7. A Bayesian look at the optimal track labelling problem

    NARCIS (Netherlands)

    Aoki, E.H.; Boers, Y.; Svensson, L.; Mandal, Pranab K.; Bagchi, Arunabha

    In multi-target tracking (MTT), the problem of assigning labels to tracks (track labelling) is widely covered in the literature, but its exact mathematical formulation, in terms of Bayesian statistics, has not yet been looked at in detail. Doing so, however, may help us to understand how (and when)

  8. An analysis of the Bayesian track labelling problem

    NARCIS (Netherlands)

    Aoki, E.H.; Boers, Y.; Svensson, Lennart; Mandal, Pranab K.; Bagchi, Arunabha

    In multi-target tracking (MTT), the problem of assigning labels to tracks (track labelling) is widely covered in the literature, but its exact mathematical formulation, in terms of Bayesian statistics, has not yet been looked at in detail. Doing so, however, may help us to understand how Bayes-optimal

  9. A Geospatial Information Grid Framework for Geological Survey.

    Science.gov (United States)

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper.

  10. The Concise Encyclopedia of Statistics

    CERN Document Server

    Dodge, Yadolah

    2008-01-01

    The Concise Encyclopedia of Statistics presents the essential information about statistical tests, concepts, and analytical methods in language that is accessible to practitioners and students of the vast community using statistics in medicine, engineering, physical science, life science, social science, and business/economics. The reference is alphabetically arranged to provide quick access to the fundamental tools of statistical methodology and biographies of famous statisticians. The more than 500 entries include definitions, history, mathematical details, limitations, examples, references,

  11. Learning models for multi-source integration

    Energy Technology Data Exchange (ETDEWEB)

    Tejada, S.; Knoblock, C.A.; Minton, S. [Univ. of Southern California/ISI, Marina del Rey, CA (United States)

    1996-12-31

    Because of the growing number of information sources available through the internet there are many cases in which information needed to solve a problem or answer a question is spread across several information sources. For example, when given two sources, one about comic books and the other about super heroes, you might want to ask the question "Is Spiderman a Marvel Super Hero?" This query accesses both sources; therefore, it is necessary to have information about the relationships of the data within each source and between sources to properly access and integrate the data retrieved. The SIMS information broker captures this type of information in the form of a model. All the information sources map into the model, providing the user a single interface to multiple sources.

  12. EDI Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...

  13. SESAM: a model for the calculation of radiation exposure by emission of pollutants with the exhaust air in the case of a multi-source situation

    International Nuclear Information System (INIS)

    Ehrlich, H.G.; Vogt, K.J.; Brunen, E.

    The report deals with the calculation of the individual radiation exposure in the catchment area of several nuclear emitters. A model and computer program, SESAM - Calculation of the Radiation Exposure by Emission of Pollutants with the Exhaust air in the Case of a Multi-Source Situation -, was developed which makes possible all the evaluations of long-time exposure which are relevant for the licensing process - such as the determination of the maximum individual radiation exposure to the various organs at the worst receiving point - together with the exposure of the environment by several nuclear emission sources - such as, for example, several units of a power plant facility, the various emitters of a waste management center, or even consideration of the previous exposure of a site by nuclear emission sources

  14. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    International Nuclear Information System (INIS)

    Lan, Ganhui; Tu, Yuhai

    2016-01-01

    preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also

  15. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    Science.gov (United States)

    Lan, Ganhui; Tu, Yuhai

    2016-05-01

    preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also

  17. Study on Net Primary Productivity over Complicated Mountainous Area based on Multi-Source Remote Sensing Data

    Science.gov (United States)

    Guan, X.; Shen, H.; Li, X.; Gan, W.

    2017-12-01

    Mountainous areas host approximately a quarter of the global land surface, with complex climate and ecosystem conditions. More knowledge about mountainous ecosystems could greatly advance our understanding of the global carbon cycle and climate change. Net Primary Productivity (NPP), the biomass increment of plants, is a widely used ecological indicator that can be obtained by remote sensing methods. However, limited by the characteristics of sensors, which cannot provide long-term records with enough spatial detail synchronously, mountainous NPP is far from being understood. In this study, a multi-sensor fusion framework was applied to synthesize a 1-km NPP series from 1982 to 2014 in mountainous southwest China, where elevation ranges from 76 m to 6740 m. Validation with field measurements showed that this framework greatly improved the accuracy of NPP (r=0.79, prun-off. What is more, it was indicated that the NPP variation showed three distinct stages with break-points at 1992 and 2002 over the region. The NPP in low-elevation areas varied almost three times more drastically than in high-elevation areas in all three stages, due to the much greater rate of change of precipitation. In summary, this study conducted a long-term and accurate NPP study of a poorly understood mountainous ecosystem with multi-source data; the framework and conclusions will be beneficial for further understanding of global climate change.

  18. Collaborative classification of hyperspectral and visible images with convolutional neural network

    Science.gov (United States)

    Zhang, Mengmeng; Li, Wei; Du, Qian

    2017-10-01

    Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well-known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, the convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. The experiments evaluated on two standard data sets demonstrate better classification performance offered by this framework.
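
    The decision-fusion step described above can be sketched as combining per-class scores from the two branches (CNN features on HSI, binarized statistical image features on VIS) into one decision. The weighted-average rule below is a hypothetical illustration; the paper's exact fusion rule may differ.

```python
import numpy as np

def decision_fusion(prob_hsi, prob_vis, w=0.5):
    """Decision-level fusion of two classifiers' per-pixel class
    probabilities, each of shape (n_pixels, n_classes): take a
    weighted average of the scores, then the argmax class."""
    fused = w * prob_hsi + (1.0 - w) * prob_vis
    return fused.argmax(axis=1)

# One pixel where the two branches disagree; fusion resolves the conflict.
p_hsi = np.array([[0.6, 0.4]])   # HSI branch favours class 0
p_vis = np.array([[0.3, 0.7]])   # VIS branch favours class 1
labels = decision_fusion(p_hsi, p_vis, w=0.5)
```

    Fusing at the decision level, rather than stacking raw features, is one way to sidestep the curse of dimensionality the abstract mentions, since each branch classifies in its own feature space.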

  19. Reports on internet traffic statistics

    OpenAIRE

    Hoogesteger, Martijn; de Oliveira Schmidt, R.; Sperotto, Anna; Pras, Aiko

    2013-01-01

    Internet traffic statistics can provide valuable information to network analysts and researchers about the way networks are used nowadays. In the past, such information was provided by Internet2 in a public website called Internet2 NetFlow: Weekly Reports. The website reported traffic statistics from the Abilene network on a weekly basis. At that time, the network connected 230 research institutes with a 10Gb/s link. Although these reports were limited to the behavior of Abilene's users,...

  20. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models

    Science.gov (United States)

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health fields. Analyzing the DTI profiling of drugs helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug modes of action. Therefore, it is of high importance to reliably and quickly predict the DTI profiling of drugs on a genome-scale level. Here, we develop the TargetNet server, which can make real-time DTI predictions based only on molecular structures, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct prediction models. Ensemble learning from these fingerprints was also provided to improve the prediction ability. When the user submits a molecule, the server will predict the activity of the user's molecule across 623 human proteins by the established high-quality SAR models, thus generating a DTI profiling that can be used as a feature vector of chemicals for wide applications. The 623 SAR models related to 623 human proteins were strictly evaluated and validated by several model validation strategies, resulting in AUC scores of 75-100%. We applied the generated DTI profiling to successfully predict potential targets, toxicity classification, drug-drug interactions, and drug modes of action, which sufficiently demonstrated the wide application value of the potential DTI profiling. The TargetNet webserver is designed based on the Django framework in Python, and is freely accessible at http://targetnet.scbdd.com.
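
    The core modeling idea, a Bernoulli naive Bayes classifier over binary molecular fingerprints, can be sketched as below. This is a hand-rolled minimal illustration with toy two-bit fingerprints, not TargetNet's code; the server trains one such model per each of its 623 protein targets.

```python
import numpy as np

def fit_bernoulli_nb(X, y, alpha=1.0):
    """Train a Bernoulli naive Bayes model on binary fingerprints:
    per class, a log prior plus per-bit log 'on'/'off' probabilities,
    with Laplace smoothing alpha."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        theta = (Xc.sum(axis=0) + alpha) / (len(Xc) + 2 * alpha)
        params[c] = (np.log(len(Xc) / len(X)),   # log prior
                     np.log(theta),              # log P(bit=1 | class)
                     np.log1p(-theta))           # log P(bit=0 | class)
    return params

def predict(params, x):
    """Return the class with the highest posterior log-probability."""
    scores = {c: prior + np.sum(x * lt + (1 - x) * lnt)
              for c, (prior, lt, lnt) in params.items()}
    return max(scores, key=scores.get)

# Toy data: bit 0 marks actives against the target, bit 1 marks inactives.
X = np.array([[1, 0], [1, 0], [0, 1], [0, 1]])
y = np.array([1, 1, 0, 0])
model = fit_bernoulli_nb(X, y)
```

    Applied across many targets, the vector of per-target predictions is exactly the kind of DTI profile the abstract describes using as a chemical feature vector.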

  2. Mineral statistics yearbook 1994

    International Nuclear Information System (INIS)

    1994-01-01

    A summary of mineral production in Saskatchewan was compiled and presented as a reference manual. Statistical information on fuel minerals such as crude oil, natural gas, liquefied petroleum gas, and coal, and on industrial and metallic minerals such as potash, sodium sulphate, salt, and uranium, was provided in a wide variety of tables. Production statistics and the disposition and value of sales of industrial and metallic minerals were also made available, along with statistical data on the drilling of oil and gas reservoirs and on Crown land disposition.

  3. A Three-Dimensional Target Depth-Resolution Method with a Single-Vector Sensor.

    Science.gov (United States)

    Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin

    2018-04-12

    This paper studies and verifies the target-number and category-resolution method in multi-target cases and the depth-resolution method for aerial targets. First, target depth resolution is performed using the sign distribution of the reactive component of the vertical complex acoustic intensity; the target category and number resolution in multi-target cases are then realized in combination with the bearing-time recording information, and the corresponding simulation verification is carried out. The algorithm proposed in this paper can distinguish between the single-target multi-line-spectrum case and the multi-target multi-line-spectrum case. This paper also presents an improved azimuth-estimation method for multi-target cases, which makes the estimation results more accurate. Using Monte Carlo simulation, the feasibility of the proposed target-number and category-resolution algorithm in multi-target cases is verified. In addition, by studying the field characteristics of aerial and surface targets, the simulation results verify that there is only an amplitude difference between the aerial target field and the surface target field under the same environmental parameters, so an aerial target can be treated as a special case of a surface target; aerial target category resolution can then be realized based on the sign distribution of the reactive component of the vertical acoustic intensity, thereby realizing three-dimensional target depth resolution. By processing data from a sea experiment, the feasibility of the proposed three-dimensional aerial-target depth-resolution algorithm is verified.

  4. Reducing the memory size in the study of statistical properties of the pseudo-random number generators, focused on solving problems of cryptographic information protection

    International Nuclear Information System (INIS)

    Chugunkov, I.V.

    2014-01-01

    The report describes an approach based on calculating the number of missing sets, which reduces the memory needed to implement statistical tests. Information about the estimation procedure for the test statistics derived using this approach is also provided.
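    The memory-reduction idea, counting how many patterns never occur rather than storing per-window data, can be sketched as follows. This is an illustrative reading of the abstract, not the author's actual procedure; the window length and sequence size are arbitrary:

```python
import random

def missing_patterns(bits, k):
    """Count k-bit patterns that never occur in the bit sequence.
    A flat bytearray of 2**k seen/unseen flags replaces storing every
    window, so memory is independent of the sequence length."""
    seen = bytearray(2 ** k)
    window = 0
    mask = (1 << k) - 1
    for i, b in enumerate(bits):
        window = ((window << 1) | b) & mask
        if i >= k - 1:          # first complete window at index k-1
            seen[window] = 1
    return seen.count(0)

random.seed(42)
n, k = 1 << 16, 8
bits = [random.getrandbits(1) for _ in range(n)]
m = missing_patterns(bits, k)
# For a good generator at this length, nearly all 256 patterns appear
print(m)
```

    A strongly biased generator leaves many patterns missing (an all-zeros stream leaves 255 of 256 unseen), which is what makes the missing-set count usable as a test statistic.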

  5. The Development of On-Line Statistics Program for Radiation Oncology

    International Nuclear Information System (INIS)

    Kim, Yoon Jong; Lee, Dong Hoon; Ji, Young Hoon; Lee, Dong Han; Jo, Chul Ku; Kim, Mi Sook; Ru, Sung Rul; Hong, Seung Hong

    2001-01-01

    Purpose: To develop an on-line statistics program that records radiation oncology information and shares it over the internet, thereby supplying basic reference data for administrative plans to improve radiation oncology. Materials and methods: Radiation oncology statistics had previously been collected on paper forms from about 52 hospitals. Now, the data can be entered through internet web browsers. The statistics program used the Windows NT 4.0 operating system, Internet Information Server 4.0 (IIS 4.0) as the web server, and a Microsoft Access MDB database. We used Structured Query Language (SQL), Visual Basic, VBScript, and JavaScript to display the statistics by year and by hospital. Results: The program shows the current status of manpower, research, therapy machines, techniques, brachytherapy, clinical statistics, radiation safety management, institutions, quality assurance, and radioisotopes in radiation oncology departments. The database consists of 38 input windows and 6 output windows; statistical output windows can be added continuously according to user needs. Conclusion: We have developed a statistics program that processes all of the data in radiation oncology departments for reference purposes. Users can easily enter data through internet web browsers and share the information.
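    The year-by-hospital aggregation such a system performs can be sketched with an in-memory SQLite database standing in for the Access MDB; the table schema and the numbers are hypothetical:

```python
import sqlite3

# In-memory SQLite stands in for the Access MDB described in the paper;
# the schema (hospital, year, patients) is a hypothetical simplification.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE therapy_stats (
    hospital TEXT, year INTEGER, patients INTEGER)""")
conn.executemany(
    "INSERT INTO therapy_stats VALUES (?, ?, ?)",
    [("A", 2000, 120), ("A", 2001, 150), ("B", 2000, 90), ("B", 2001, 110)])

# One "output window": patient totals per year across all hospitals
rows = conn.execute("""
    SELECT year, SUM(patients) FROM therapy_stats
    GROUP BY year ORDER BY year""").fetchall()
print(rows)  # [(2000, 210), (2001, 260)]
```

    Each additional output window corresponds to another aggregate query over the same tables, which is why new views can be added without changing the input forms.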

  6. Data and Statistics

    Science.gov (United States)

    ... Sickle cell ... 1999 through 2002. This drop coincided with the introduction in 2000 of a vaccine that protects against ...

  7. Transport statistics 1996

    CSIR Research Space (South Africa)

    Shepperson, L

    1997-12-01

    Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...

  8. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    Science.gov (United States)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological-pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, the selection should be made considering the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which exert an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value, and Gamma distributions. The methods were
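    The probability-based gap-filling idea can be sketched by fitting one of the parametric families the authors consider (here a log-normal) to the observed depths and sampling from it to fill the gaps; the series values below are invented:

```python
import math
import random
import statistics

def fill_gaps_lognormal(series, rng):
    """Fill None gaps in a rainfall series by sampling from a log-normal
    distribution fitted to the observed (non-missing, non-zero) depths.
    A sketch of the probability-based reconstruction idea only; the paper
    also considers Generalized Pareto, GEV and Gamma tails."""
    observed = [x for x in series if x is not None and x > 0]
    logs = [math.log(x) for x in observed]
    mu, sigma = statistics.fmean(logs), statistics.stdev(logs)
    return [rng.lognormvariate(mu, sigma) if x is None else x
            for x in series]

rng = random.Random(0)
series = [2.0, None, 5.5, 1.2, None, 8.0]  # rain depths, mm; None = gap
filled = fill_gaps_lognormal(series, rng)
print(filled)
```

    The choice of fitted family matters precisely because of the tails: a heavy-tailed fit occasionally fills a gap with an extreme depth, while a light-tailed fit never will.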

  9. Collective Influence of Multiple Spreaders Evaluated by Tracing Real Information Flow in Large-Scale Social Networks.

    Science.gov (United States)

    Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A

    2016-10-26

    Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called "Collective Influence (CI)" has been put forward through collective influence maximization. In contrast to heuristic methods that evaluate nodes' significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI was developed for the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from data to construct "virtual" information-spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce larger-scale information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes' importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community.
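    The CI score itself has a simple form: CI_l(i) = (k_i - 1) * sum over the frontier of the ball of radius l around node i of (k_j - 1). A sketch on a toy graph (the adjacency lists are invented):

```python
from collections import deque

def collective_influence(adj, node, ell):
    """Compute CI_l(i) = (k_i - 1) * sum_{j in dBall(i, l)} (k_j - 1),
    where dBall is the frontier at exactly distance l from node i."""
    dist = {node: 0}
    queue = deque([node])
    frontier = []
    while queue:
        u = queue.popleft()
        if dist[u] == ell:       # stop expanding at the ball's frontier
            frontier.append(u)
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    k = lambda n: len(adj[n])    # node degree
    return (k(node) - 1) * sum(k(j) - 1 for j in frontier)

# Toy graph: a 4-cycle 0-1-3-2-0 with a tail 3-4-5
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
print(collective_influence(adj, 0, 2))  # frontier of node 0 at l=2 is {3}
```

    Greedy CI maximization repeatedly removes the node with the highest score and recomputes; degree alone would miss that a modest-degree node can sit on a high-degree frontier.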

  10. Statistical Literacy in the Data Science Workplace

    Science.gov (United States)

    Grant, Robert

    2017-01-01

    Statistical literacy, the ability to understand and make use of statistical information, including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. It is essential not only for communicating with the consumers of information but also within the team. Writing from the…

  11. Improving Statistical Literacy in Schools in Australia

    OpenAIRE

    Trewin, Dennis

    2005-01-01

    We live in the information age. Statistical thinking is a life skill that all Australian children should have. The Statistical Society of Australia (SSAI) and the Australian Bureau of Statistics (ABS) have been working on a strategy to ensure Australian school children acquire a sufficient understanding and appreciation of how data can be acquired and used so they can make informed judgements in their daily lives, as children and then as adults. There is another motive for our work i...

  12. Power analysis as a tool to identify statistically informative indicators for monitoring coral reef disturbances.

    Science.gov (United States)

    Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura

    2017-07-01

    Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular monitoring on the long term, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we used 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of tested indicators were likely to detect a 10% annual decrease of values with sufficient power (>0.80). Corals generally exerted higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators in detecting anthropogenic impacts was closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.

  13. Statistical methods in nonlinear dynamics

    Indian Academy of Sciences (India)

    Sensitivity to initial conditions in nonlinear dynamical systems leads to exponential divergence of trajectories that are initially arbitrarily close, and hence to unpredictability. Statistical methods have been found to be helpful in extracting useful information about such systems. In this paper, we review briefly some statistical ...

  14. Structure-Based Design and Optimization of Multitarget-Directed 2H-Chromen-2-one Derivatives as Potent Inhibitors of Monoamine Oxidase B and Cholinesterases.

    Science.gov (United States)

    Farina, Roberta; Pisani, Leonardo; Catto, Marco; Nicolotti, Orazio; Gadaleta, Domenico; Denora, Nunzio; Soto-Otero, Ramon; Mendez-Alvarez, Estefania; Passos, Carolina S; Muncipinto, Giovanni; Altomare, Cosimo D; Nurisso, Alessandra; Carrupt, Pierre-Alain; Carotti, Angelo

    2015-07-23

    The multifactorial nature of Alzheimer's disease calls for the development of multitarget agents addressing key pathogenic processes. To this end, by following a docking-assisted hybridization strategy, a number of aminocoumarins were designed, prepared, and tested as monoamine oxidases (MAOs) and acetyl- and butyryl-cholinesterase (AChE and BChE) inhibitors. Highly flexible N-benzyl-N-alkyloxy coumarins 2-12 showed good inhibitory activities at MAO-B, AChE, and BChE but low selectivity. More rigid inhibitors, bearing meta- and para-xylyl linkers, displayed good inhibitory activities and high MAO-B selectivity. Compounds 21, 24, 37, and 39, the last two featuring an improved hydrophilic/lipophilic balance, exhibited excellent activity profiles with nanomolar inhibitory potency toward hMAO-B, high hMAO-B over hMAO-A selectivity and submicromolar potency at hAChE. Cell-based assays of BBB permeation, neurotoxicity, and neuroprotection supported the potential of compound 37 as a BBB-permeant neuroprotective agent against H2O2-induced oxidative stress with poor interaction as P-gp substrate and very low cytotoxicity.

  15. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    Science.gov (United States)

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability, which in its different forms is a subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros, even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model corrected for overdispersion and the standard negative binomial model described the probability distribution of the counts better than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models for seven of the 11 insect groups. It is concluded that excess zeros and variance heterogeneity are common phenomena in insect count data. If not properly modelled, these properties can invalidate normal-distribution assumptions, biasing the estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
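    The AIC comparison between the Poisson and negative binomial models can be sketched on invented zero-heavy counts; the method-of-moments estimates below are a simplification of a full maximum likelihood fit:

```python
import math

def poisson_loglik(data, lam):
    """Log-likelihood of counts under Poisson(lam)."""
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in data)

def negbin_loglik(data, r, p):
    """Log-likelihood under NB(r, p): Gamma(k+r)/(Gamma(r) k!) (1-p)^r p^k."""
    return sum(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(1 - p) + k * math.log(p) for k in data)

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

# Invented zero-heavy, overdispersed counts (typical of insect surveys)
data = [0, 0, 0, 0, 0, 0, 1, 1, 2, 3, 5, 9, 14, 22]
mean = sum(data) / len(data)
var = sum((k - mean) ** 2 for k in data) / (len(data) - 1)
# Method-of-moments NB estimates: p = 1 - mean/var, r = mean^2/(var - mean)
p = 1 - mean / var
r = mean ** 2 / (var - mean)

aic_pois = aic(poisson_loglik(data, mean), 1)
aic_nb = aic(negbin_loglik(data, r, p), 2)
print(aic_pois, aic_nb)  # the lower AIC wins despite NB's extra parameter
```

    With variance far above the mean, the negative binomial's extra parameter is worth its AIC penalty, which is the kind of objective-criterion selection the abstract recommends.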

  16. A Task-Oriented Disaster Information Correlation Method

    Science.gov (United States)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historic data, case data, simulated data, and disaster products. However, current data management and service systems have become increasingly inefficient due to the variety of tasks and the heterogeneity of the data. For emergency task-oriented applications, data searches rely primarily on human experience applied to simple metadata indices, whose high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service, with the objectives of 1) putting forward a disaster task ontology and a data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of the uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.
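    A drastically simplified sketch of step 3), scoring the correlation between a task and candidate datasets, using plain term overlap (Jaccard similarity) in place of the paper's ontology-based semantic mapping; all terms and dataset names are hypothetical:

```python
def task_data_correlation(task_terms, data_terms):
    """Jaccard overlap between a task's ontology terms and a dataset's
    metadata terms; a stand-in for the paper's semantic mapping."""
    task, data = set(task_terms), set(data_terms)
    return len(task & data) / len(task | data)

task = {"flood", "relief", "remote-sensing", "water-level"}
datasets = {
    "flood_extent_map": {"flood", "remote-sensing", "extent"},
    "census_2010": {"population", "census"},
}
# Rank candidate datasets by their correlation with the task
ranked = sorted(datasets,
                key=lambda d: task_data_correlation(task, datasets[d]),
                reverse=True)
print(ranked)  # ['flood_extent_map', 'census_2010']
```

    Ranking by a computed task-data correlation, rather than by manual metadata browsing, is what enables the automatic linking and active dissemination the method aims at.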

  17. Breach of belongingness: Newcomer relationship conflict, information, and task-related outcomes during organizational socialization.

    Science.gov (United States)

    Nifadkar, Sushil S; Bauer, Talya N

    2016-01-01

    Previous studies of newcomer socialization have underlined the importance of newcomers' information seeking for their adjustment to the organization, and the conflict literature has consistently reported negative effects of relationship conflict with coworkers. However, to date, no study has examined the consequences of relationship conflict for newcomers' information seeking. In this study, we examined newcomers' reactions when they have relationship conflict with their coworkers and hence cannot obtain necessary information from them. Drawing upon belongingness theory, we propose a model that moves from a breach of belongingness to its proximal and distal consequences, to newcomer information seeking, and then to task-related outcomes. In particular, we propose that two paths exist, one coworker-centric and the other supervisor-centric, that may have simultaneous yet contrasting influences on newcomer adjustment. To test our model, we employed a three-wave data collection design with egocentric and Likert-type multisource surveys among a sample of new software engineers and their supervisors working in India. This study contributes to the field by linking the literatures on relationship conflict and newcomer information seeking and by suggesting that, despite conflict with coworkers, newcomers may succeed in organizations by building relationships with and obtaining information from supervisors.

  18. Statistics II essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As the name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and they remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, the chi-square test, and time se

  19. Fisher information and statistical inference for phase-type distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Esparza, Luz Judith R; Nielsen, Bo Friis

    2011-01-01

    This paper is concerned with statistical inference for both continuous and discrete phase-type distributions. We consider maximum likelihood estimation, where traditionally the expectation-maximization (EM) algorithm has been employed. Certain numerical aspects of this method are revised and we...

  20. METHODS AND TOOLS TO DEVELOP INNOVATIVE STRATEGIC MANAGEMENT DECISIONS BASED ON THE APPLICATION OF ADVANCED INFORMATION AND COMMUNICATION TECHNOLOGIES IN THE STATISTICAL BRANCH OF UZBEKISTAN

    OpenAIRE

    Irina E. Zhukovskya

    2013-01-01

    This paper focuses on improving the statistical branch through the application of electronic document management and network information technology. As a software solution, it proposes the use of «eStat 2.0», the new software of the State Committee on Statistics of the Republic of Uzbekistan, which not only optimizes the work of statistical-sector employees but also serves as a link between all the economic entities of the national economy.