WorldWideScience

Sample records for complex event-sequence approach

  1. The sequence coding and search system: an approach for constructing and analyzing event sequences at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Mays, G.T.

    1990-01-01

    The U.S. Nuclear Regulatory Commission (NRC) has recognized the importance of the collection, assessment, and feedback of operating experience data from commercial nuclear power plants and has centralized these activities in the Office for Analysis and Evaluation of Operational Data (AEOD). Such data are essential for performing safety and reliability analyses, especially analyses of trends and patterns to identify undesirable changes in plant performance at the earliest opportunity, so that corrective measures can be implemented to preclude the occurrence of a more serious event. One of NRC's principal tools for collecting and evaluating operating experience data is the Sequence Coding and Search System (SCSS). The SCSS consists of a methodology for structuring event sequences and the requisite computer system to store and search the data. The source information for SCSS is the Licensee Event Report (LER), which is a legally required document. This paper describes the objectives of SCSS, the information it contains, and the format and approach for constructing SCSS event sequences. Examples are presented demonstrating the use of SCSS to support the analysis of LER data. The SCSS contains over 30,000 LERs describing events from 1980 through the present. Insights gained from working with a complex data system from the initial developmental stage to the point of a mature operating system are highlighted. Considerable experience has been gained in the areas of evolving and changing data requirements; staffing requirements; quality control and quality assurance procedures for addressing consistency; software/hardware considerations for developing and maintaining a complex system; documentation requirements; and end-user needs. Two other approaches for constructing and evaluating event sequences are examined: the Accident Sequence Precursor (ASP) Program, in which sequences having the potential for core damage are identified and analyzed, and the Significant Event Compilation Tree

  2. Characterization of GM events by insert knowledge adapted re-sequencing approaches.

    Science.gov (United States)

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-10-03

    Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation vector derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize unknown GM events as well.

  3. Characterization of GM events by insert knowledge adapted re-sequencing approaches

    OpenAIRE

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-01-01

    Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation vector derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-ad...

  4. The sequence coding and search system: An approach for constructing and analyzing event sequences at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Mays, G.T.

    1989-04-01

    The US Nuclear Regulatory Commission (NRC) has recognized the importance of the collection, assessment, and feedback of operating experience data from commercial nuclear power plants and has centralized these activities in the Office for Analysis and Evaluation of Operational Data (AEOD). Such data are essential for performing safety and reliability analyses, especially analyses of trends and patterns to identify undesirable changes in plant performance at the earliest opportunity, so that corrective measures can be implemented to preclude the occurrence of a more serious event. One of NRC's principal tools for collecting and evaluating operating experience data is the Sequence Coding and Search System (SCSS). The SCSS consists of a methodology for structuring event sequences and the requisite computer system to store and search the data. The source information for SCSS is the Licensee Event Report (LER), which is a legally required document. This paper describes the objectives of SCSS, the information it contains, and the format and approach for constructing SCSS event sequences. Examples are presented demonstrating the use of SCSS to support the analysis of LER data. The SCSS contains over 30,000 LERs describing events from 1980 through the present. Insights gained from working with a complex data system from the initial developmental stage to the point of a mature operating system are highlighted

  5. Sequence Synopsis: Optimize Visual Summary of Temporal Event Data.

    Science.gov (United States)

    Chen, Yuanzhe; Xu, Panpan; Ren, Liu

    2018-01-01

    Event sequence analysis plays an important role in many application domains such as customer behavior analysis, electronic health record analysis and vehicle fault diagnosis. Real-world event sequence data is often noisy and complex with high event cardinality, making it a challenging task to construct concise yet comprehensive overviews for such data. In this paper, we propose a novel visualization technique based on the minimum description length (MDL) principle to construct a coarse-level overview of event sequence data while balancing the information loss in it. The method addresses a fundamental trade-off in visualization design: reducing visual clutter vs. increasing the information content in a visualization. The method enables simultaneous sequence clustering and pattern extraction and is highly tolerant to noise such as missing or additional events in the data. Based on this approach we propose a visual analytics framework with multiple levels of detail to facilitate interactive data exploration. We demonstrate the usability and effectiveness of our approach through case studies with two real-world datasets. One dataset showcases a new application domain for event sequence visualization, i.e., fault development path analysis in vehicles for predictive maintenance. We also discuss the strengths and limitations of the proposed method based on user feedback.
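    The core MDL trade-off described in this abstract can be illustrated with a toy sketch: a candidate summary pattern is charged for its own length plus the corrections (edits) needed to reconstruct each raw sequence from it, and the pattern minimizing that total wins. Everything below (the event strings, the unit edit costs, the exhaustive search) is a hypothetical simplification, not the authors' algorithm.

```python
from itertools import product

def edit_distance(a, b):
    # Levenshtein distance via the classic one-row dynamic program.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            cur = min(dp[j] + 1,          # deletion
                      dp[j - 1] + 1,      # insertion
                      prev + (ca != cb))  # substitution / match
            prev, dp[j] = dp[j], cur
    return dp[len(b)]

def description_length(pattern, sequences):
    # MDL-style cost: the pattern's own length plus the corrections (edits)
    # needed to reconstruct each raw sequence from the pattern.
    return len(pattern) + sum(edit_distance(pattern, s) for s in sequences)

def best_summary(sequences, alphabet, max_len):
    # Exhaustive search over short candidate patterns (toy-scale only).
    candidates = (''.join(p) for n in range(1, max_len + 1)
                  for p in product(alphabet, repeat=n))
    return min(candidates, key=lambda p: description_length(p, sequences))

events = ["ABCD", "ABD", "ABCDE", "ABCD"]  # hypothetical event sequences
print(best_summary(events, "ABCDE", 4))    # prints: ABCD
```

    A longer pattern pays for itself only when it saves more correction cost than it adds in pattern cost, which is exactly the clutter-versus-information balance the abstract describes.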

  6. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing allows relevant patterns and sequences of events to be identified over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.

  7. Molecular Characterization of Transgenic Events Using Next Generation Sequencing Approach.

    Science.gov (United States)

    Guttikonda, Satish K; Marri, Pradeep; Mammadov, Jafar; Ye, Liang; Soe, Khaing; Richey, Kimberly; Cruse, James; Zhuang, Meibao; Gao, Zhifang; Evans, Clive; Rounsley, Steve; Kumpatla, Siva P

    2016-01-01

    Demand for the commercial use of genetically modified (GM) crops has been increasing in light of the projected growth of the world population to nine billion by 2050. A prerequisite of paramount importance for regulatory submissions is the rigorous safety assessment of GM crops. One of the components of safety assessment is molecular characterization at the DNA level, which helps to determine the copy number, integrity and stability of a transgene; characterize the integration site within a host genome; and confirm the absence of vector DNA. Historically, molecular characterization has been carried out using Southern blot analysis coupled with Sanger sequencing. While this is a robust approach to characterize transgenic crops, it is both time- and resource-consuming. The emergence of next-generation sequencing (NGS) technologies has provided a highly sensitive, cost- and labor-effective alternative for molecular characterization compared to traditional Southern blot analysis. Herein, we have demonstrated the successful application of both whole genome sequencing and target capture sequencing approaches for the characterization of single and stacked transgenic events and compared the results and inferences with the traditional method with respect to key criteria required for regulatory submissions.

  8. Molecular Characterization of Transgenic Events Using Next Generation Sequencing Approach.

    Directory of Open Access Journals (Sweden)

    Satish K Guttikonda

    Full Text Available Demand for the commercial use of genetically modified (GM) crops has been increasing in light of the projected growth of the world population to nine billion by 2050. A prerequisite of paramount importance for regulatory submissions is the rigorous safety assessment of GM crops. One of the components of safety assessment is molecular characterization at the DNA level, which helps to determine the copy number, integrity and stability of a transgene; characterize the integration site within a host genome; and confirm the absence of vector DNA. Historically, molecular characterization has been carried out using Southern blot analysis coupled with Sanger sequencing. While this is a robust approach to characterize transgenic crops, it is both time- and resource-consuming. The emergence of next-generation sequencing (NGS) technologies has provided a highly sensitive, cost- and labor-effective alternative for molecular characterization compared to traditional Southern blot analysis. Herein, we have demonstrated the successful application of both whole genome sequencing and target capture sequencing approaches for the characterization of single and stacked transgenic events and compared the results and inferences with the traditional method with respect to key criteria required for regulatory submissions.

  9. A symbolic dynamics approach for the complexity analysis of chaotic pseudo-random sequences

    International Nuclear Information System (INIS)

    Xiao Fanghong

    2004-01-01

    By considering a chaotic pseudo-random sequence as a symbolic sequence, the authors present a symbolic dynamics approach for the complexity analysis of chaotic pseudo-random sequences. The method is applied to the cases of the Logistic map and a one-way coupled map lattice to demonstrate how it works, and a comparison is made between it and the approximate entropy method. The results show that this method can distinguish the complexities of different chaotic pseudo-random sequences, and that it is superior to the approximate entropy method.
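    A minimal sketch of the symbolization step for the Logistic map: orbits are partitioned at the critical point into a two-letter alphabet, and the Shannon entropy of length-k words serves as a complexity measure. The threshold, sequence length, and entropy estimator below are illustrative choices, not the paper's exact method.

```python
from math import log2
from collections import Counter

def logistic_sequence(x0, n, r=4.0):
    # Iterate the Logistic map x -> r * x * (1 - x).
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def symbolize(xs, threshold=0.5):
    # Partition the state space at the critical point: 'L' below, 'R' above.
    return ''.join('R' if x >= threshold else 'L' for x in xs)

def block_entropy(symbols, k):
    # Shannon entropy (bits) of the empirical distribution of length-k
    # words, a standard symbolic-dynamics complexity measure.
    words = [symbols[i:i + k] for i in range(len(symbols) - k + 1)]
    counts = Counter(words)
    total = len(words)
    return -sum(c / total * log2(c / total) for c in counts.values())

chaotic = symbolize(logistic_sequence(0.3141, 5000))
periodic = 'LR' * 2500  # a trivially predictable symbolic sequence
print(block_entropy(chaotic, 4), block_entropy(periodic, 4))
```

    The fully chaotic orbit approaches the maximal 4 bits for length-4 words, while the periodic sequence stays near 1 bit, which is the kind of separation the complexity measure is meant to expose.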

  10. Contrasting safety assessments of a runway incursion scenario: Event sequence analysis versus multi-agent dynamic risk modelling

    International Nuclear Information System (INIS)

    Stroeve, Sybert H.; Blom, Henk A.P.; Bakker, G.J.

    2013-01-01

    In the safety literature it has been argued that, in a complex socio-technical system, safety cannot be well analysed by event sequence based approaches, but requires capturing the complex interactions and performance variability of the socio-technical system. In order to evaluate the quantitative and practical consequences of these arguments, this study compares two approaches to assessing the accident risk of an example safety-critical socio-technical system. It contrasts an event sequence based assessment with a multi-agent dynamic risk model (MA-DRM) based assessment, both of which are performed for a particular runway incursion scenario. The event sequence analysis uses the well-known event tree modelling formalism, and the MA-DRM based approach combines agent based modelling, hybrid Petri nets and rare event Monte Carlo simulation. The comparison addresses qualitative and quantitative differences in the methods, in the attained risk levels, and in the prime factors influencing the safety of the operation. The assessments show considerable differences in the accident risk implications of the performance of human operators and technical systems in the runway incursion scenario. In contrast with the event sequence based results, the MA-DRM based results show that the accident risk is not manifest from the performance of and relations between individual human operators and technical systems. Instead, the safety risk emerges from the totality of the performance and interactions in the agent based model of the safety-critical operation considered, which coincides very well with the argumentation in the safety literature.
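    The event tree side of such a comparison can be quantified generically: the probability of each end state is the product of the branch probabilities along its path. The barriers, numbers, and independence assumption below are hypothetical, not the study's actual runway incursion model.

```python
from itertools import product

# Hypothetical pivotal events for an event-tree model of an incursion-like
# scenario; each value is the probability that the barrier FAILS. All
# numbers are illustrative, and branches are assumed independent.
barriers = {
    "controller_detects": 0.1,  # P(controller fails to detect the incursion)
    "alert_issued": 0.2,        # P(no timely alert is issued)
    "pilot_reacts": 0.05,       # P(pilot fails to react in time)
}

def sequence_probability(outcome):
    # Multiply branch probabilities along one path through the tree.
    p = 1.0
    for name, fails in outcome.items():
        pf = barriers[name]
        p *= pf if fails else (1.0 - pf)
    return p

# Enumerate all end states; the accident sequence is the single path on
# which every barrier fails.
total = accident = 0.0
for combo in product([False, True], repeat=len(barriers)):
    outcome = dict(zip(barriers, combo))
    p = sequence_probability(outcome)
    total += p
    if all(combo):
        accident += p

print(accident)  # product of the three failure probabilities, 0.001
```

    The agent-based MA-DRM alternative replaces these fixed, independent branch probabilities with interacting agents whose joint behaviour is sampled by Monte Carlo simulation, which is precisely where the two approaches diverge.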

  11. A novel approach for modelling complex maintenance systems using discrete event simulation

    International Nuclear Information System (INIS)

    Alrabghi, Abdullah; Tiwari, Ashutosh

    2016-01-01

    Existing approaches for modelling maintenance rely on oversimplified assumptions which prevent them from reflecting the complexity found in industrial systems. In this paper, we propose a novel approach that enables the modelling of non-identical multi-unit systems without restrictive assumptions on the number of units or their maintenance characteristics. Modelling complex interactions between maintenance strategies and their effects on assets in the system is achieved by accessing event queues in Discrete Event Simulation (DES). The approach builds on the wide success DES has achieved in manufacturing, allowing integration with models that are closely related to maintenance, such as production and spare parts systems. Additional advantages of using DES include rapid modelling and visual interactive simulation. The proposed approach is demonstrated in a simulation based optimisation study of a published case. The current research is one of the first to optimise maintenance strategies simultaneously with their parameters while considering production dynamics and spare parts management. The findings of this research provide insights for non-conflicting objectives in maintenance systems. In addition, the proposed approach can be used to facilitate the simulation and optimisation of industrial maintenance systems. - Highlights: • This research is one of the first to optimise maintenance strategies simultaneously with their parameters. • New insights for non-conflicting objectives in maintenance systems. • The approach can be used to optimise industrial maintenance systems.
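    The event-queue idea at the heart of DES can be sketched in a few lines of stdlib Python: a priority queue orders failure and repair events in time, and non-identical units are modelled simply by giving each a different mean time between failures. All names and parameters below are illustrative, not the paper's model.

```python
import heapq
import random

def simulate(units=3, horizon=10_000.0, mtbf=100.0, mttr=8.0, seed=42):
    """Toy failure/repair simulation of a non-identical multi-unit system;
    returns each unit's estimated unavailability (downtime / horizon)."""
    rng = random.Random(seed)
    events = []  # priority queue of (time, unit, kind) tuples
    for u in range(units):
        # Stagger the mean time between failures so units are non-identical.
        heapq.heappush(events, (rng.expovariate(1.0 / (mtbf * (u + 1))), u, "fail"))
    downtime = [0.0] * units
    while events:
        t, u, kind = heapq.heappop(events)
        if t >= horizon:
            continue  # event falls outside the simulated horizon
        if kind == "fail":
            repaired_at = t + rng.expovariate(1.0 / mttr)
            downtime[u] += min(repaired_at, horizon) - t
            heapq.heappush(events, (repaired_at, u, "repair"))
        else:
            # Unit is back up: schedule its next failure.
            heapq.heappush(events, (t + rng.expovariate(1.0 / (mtbf * (u + 1))), u, "fail"))
    return [d / horizon for d in downtime]

print(simulate())  # per-unit unavailability estimates
```

    A maintenance strategy (e.g. opportunistic or preventive maintenance) would be modelled by pushing additional event types onto the same queue, which is the access-the-event-queue mechanism the abstract refers to.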

  12. Instruction sequence based non-uniform complexity classes

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2013-01-01

    We present an approach to non-uniform complexity in which single-pass instruction sequences play a key part, and answer various questions that arise from this approach. We introduce several kinds of non-uniform complexity classes. One kind includes a counterpart of the well-known non-uniform

  13. Differentially Private Event Histogram Publication on Sequences over Graphs

    Institute of Scientific and Technical Information of China (English)

    Ning Wang; Yu Gu; Jia Xu; Fang-Fang Li; Ge Yu

    2017-01-01

    The big data era is coming with strong and ever-growing demands on analyzing personal information and footprints in the cyber world. To enable such analysis without the risk of privacy leaks, differential privacy (DP) has risen quickly in recent years as the first practical privacy protection model with a rigorous theoretical guarantee. This paper discusses how to publish differentially private histograms of events in the time series domain, where sequences of personal events follow paths over graphs whose edges are events. Such individual-generated sequences commonly appear in formalized industrial workflows, online game logs, and spatial-temporal trajectories. Directly publishing the statistics of such sequences may compromise personal privacy. While existing DP mechanisms mainly target normalized domains with fixed and aligned dimensions, our problem raises new challenges because the sequences can follow arbitrary paths on the graph. To tackle the problem, we reformulate it with a three-step framework, which 1) carefully truncates the original sequences, trading off the errors introduced by truncation against those introduced by the noise added to guarantee privacy, 2) decomposes the event graph into path sub-domains based on a group of event pivots, and 3) employs a deeply optimized tree-based histogram construction approach for each sub-domain to benefit from reduced noise addition. We present a careful analysis of our framework to support thorough optimization of each step, and verify the substantial improvements of our proposals over state-of-the-art solutions.
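    The truncation and noise-addition steps of such a framework can be sketched as follows; the graph decomposition and tree-based histogram optimization are omitted, and all names, parameters, and data are illustrative rather than the paper's mechanism.

```python
import math
import random
from collections import Counter

def dp_event_histogram(sequences, max_len, epsilon, seed=7):
    """Sketch of two steps of a DP sequence-histogram release: truncate each
    personal sequence to bound its influence, then add Laplace noise to the
    counts. Illustrative only; a strict DP release would also add noise to
    the empty bins rather than publishing only observed sequences."""
    rng = random.Random(seed)

    def laplace(scale):
        # Inverse-CDF sampling of a Laplace(0, scale) variate.
        u = rng.random() - 0.5
        sign = 1.0 if u >= 0 else -1.0
        return -scale * sign * math.log(1.0 - 2.0 * abs(u))

    # Step 1: truncation bounds each person's contribution to the histogram.
    counts = Counter(tuple(s[:max_len]) for s in sequences)
    # With one truncated sequence per person, removing a person changes a
    # single count by 1, so Laplace noise of scale 1/epsilon suffices.
    return {seq: c + laplace(1.0 / epsilon) for seq, c in counts.items()}

sequences = [("login", "view", "buy"),
             ("login", "view"),
             ("login", "view", "buy", "logout")]
print(dp_event_histogram(sequences, max_len=3, epsilon=1.0))
```

    The truncation length `max_len` is exactly the knob in step 1 of the framework: shorter truncation means less sensitivity (and noise) but more information discarded from long sequences.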

  14. CATEGORIZATION OF EVENT SEQUENCES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    G.E. Ragan; P. Mecheret; D. Dexheimer

    2005-04-14

    The purposes of this analysis are: (1) Categorize (as Category 1, Category 2, or Beyond Category 2) internal event sequences that may occur before permanent closure of the repository at Yucca Mountain. (2) Categorize external event sequences that may occur before permanent closure of the repository at Yucca Mountain. This includes examining DBGM-1 seismic classifications and upgrading to DBGM-2, if appropriate, to ensure Beyond Category 2 categorization. (3) State the design and operational requirements that are invoked to make the categorization assignments valid. (4) Indicate the amount of material put at risk by Category 1 and Category 2 event sequences. (5) Estimate frequencies of Category 1 event sequences at the maximum capacity and receipt rate of the repository. (6) Distinguish occurrences associated with normal operations from event sequences. It is beyond the scope of the analysis to propose design requirements that may be required to control radiological exposure associated with normal operations. (7) Provide a convenient compilation of the results of the analysis in tabular form. The results of this analysis are used as inputs to the consequence analyses in an iterative design process that is depicted in Figure 1. Categorization of event sequences for permanent retrieval of waste from the repository is beyond the scope of this analysis. Cleanup activities that take place after an event sequence and other responses to abnormal events are also beyond the scope of the analysis.

  15. CATEGORIZATION OF EVENT SEQUENCES FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    G.E. Ragan; P. Mecheret; D. Dexheimer

    2005-01-01

    The purposes of this analysis are: (1) Categorize (as Category 1, Category 2, or Beyond Category 2) internal event sequences that may occur before permanent closure of the repository at Yucca Mountain. (2) Categorize external event sequences that may occur before permanent closure of the repository at Yucca Mountain. This includes examining DBGM-1 seismic classifications and upgrading to DBGM-2, if appropriate, to ensure Beyond Category 2 categorization. (3) State the design and operational requirements that are invoked to make the categorization assignments valid. (4) Indicate the amount of material put at risk by Category 1 and Category 2 event sequences. (5) Estimate frequencies of Category 1 event sequences at the maximum capacity and receipt rate of the repository. (6) Distinguish occurrences associated with normal operations from event sequences. It is beyond the scope of the analysis to propose design requirements that may be required to control radiological exposure associated with normal operations. (7) Provide a convenient compilation of the results of the analysis in tabular form. The results of this analysis are used as inputs to the consequence analyses in an iterative design process that is depicted in Figure 1. Categorization of event sequences for permanent retrieval of waste from the repository is beyond the scope of this analysis. Cleanup activities that take place after an event sequence and other responses to abnormal events are also beyond the scope of the analysis

  16. Iterative Calibration: A Novel Approach for Calibrating the Molecular Clock Using Complex Geological Events.

    Science.gov (United States)

    Loeza-Quintana, Tzitziki; Adamowicz, Sarah J

    2018-02-01

    During the past 50 years, the molecular clock has become one of the main tools for providing a time scale for the history of life. In the era of robust molecular evolutionary analysis, clock calibration is still one of the most basic steps needing attention. When fossil records are limited, well-dated geological events are the main resource for calibration. However, biogeographic calibrations have often been used in a simplistic manner, for example assuming simultaneous vicariant divergence of multiple sister lineages. Here, we propose a novel iterative calibration approach to define the most appropriate calibration date by seeking congruence between the dates assigned to multiple allopatric divergences and the geological history. Exploring patterns of molecular divergence in 16 trans-Bering sister clades of echinoderms, we demonstrate that the iterative calibration is predominantly advantageous when using complex geological or climatological events, such as the opening and reclosure of the Bering Strait, providing a powerful tool for clock dating that can be applied to other biogeographic calibration systems and further taxa. Using Bayesian analysis, we observed that evolutionary rate variability in the COI-5P gene is generally distributed in a clock-like fashion for Northern echinoderms. The results reveal a large range of genetic divergences, consistent with multiple pulses of trans-Bering migrations. A resulting rate of 2.8% pairwise Kimura-2-parameter sequence divergence per million years is suggested for the COI-5P gene in Northern echinoderms. Given that molecular rates may vary across latitudes and taxa, this study provides a new context for dating the evolutionary history of Arctic marine life.
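    The rate arithmetic in this abstract rests on the standard Kimura 2-parameter distance, d = -(1/2)ln(1 - 2P - Q) - (1/4)ln(1 - 2Q), where P and Q are the proportions of transitions and transversions between two aligned sequences. Below is a sketch with hypothetical sequences; only the calibrated rate of 2.8% pairwise divergence per million years is taken from the abstract.

```python
from math import log, sqrt

PURINES = {"A", "G"}

def k2p_distance(seq1, seq2):
    """Kimura 2-parameter distance between two aligned DNA sequences."""
    assert len(seq1) == len(seq2)
    n = len(seq1)
    transitions = transversions = 0
    for a, b in zip(seq1, seq2):
        if a == b:
            continue
        if (a in PURINES) == (b in PURINES):
            transitions += 1    # A<->G or C<->T
        else:
            transversions += 1  # purine <-> pyrimidine
    p, q = transitions / n, transversions / n
    # Equivalent to -(1/2)ln(1-2p-q) - (1/4)ln(1-2q).
    return -0.5 * log((1 - 2 * p - q) * sqrt(1 - 2 * q))

# Hypothetical aligned COI-5P fragments from a trans-Bering species pair.
a = "ATGGCATTACTAGGAACT"
b = "ATGACATTGCTAGGGACT"
d = k2p_distance(a, b)
# Dividing by the study's calibrated rate (2.8% pairwise per Myr) gives an
# estimated divergence time in millions of years.
print(d, d / 0.028)
```

    The iterative part of the calibration then adjusts the assumed geological date (and hence the rate) until the implied divergence times across the sister clades agree with the geological history.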

  17. Memory for sequences of events impaired in typical aging

    Science.gov (United States)

    Allen, Timothy A.; Morris, Andrea M.; Stark, Shauna M.; Fortin, Norbert J.

    2015-01-01

    Typical aging is associated with diminished episodic memory performance. To improve our understanding of the fundamental mechanisms underlying this age-related memory deficit, we previously developed an integrated, cross-species approach to link converging evidence from human and animal research. This novel approach focuses on the ability to remember sequences of events, an important feature of episodic memory. Unlike existing paradigms, this task is nonspatial, nonverbal, and can be used to isolate different cognitive processes that may be differentially affected in aging. Here, we used this task to make a comprehensive comparison of sequence memory performance between younger (18–22 yr) and older adults (62–86 yr). Specifically, participants viewed repeated sequences of six colored, fractal images and indicated whether each item was presented “in sequence” or “out of sequence.” Several out of sequence probe trials were used to provide a detailed assessment of sequence memory, including: (i) repeating an item from earlier in the sequence (“Repeats”; e.g., ABADEF), (ii) skipping ahead in the sequence (“Skips”; e.g., ABDDEF), and (iii) inserting an item from a different sequence into the same ordinal position (“Ordinal Transfers”; e.g., AB3DEF). We found that older adults performed as well as younger controls when tested on well-known and predictable sequences, but were severely impaired when tested using novel sequences. Importantly, overall sequence memory performance in older adults steadily declined with age, a decline not detected with other measures (RAVLT or BPS-O). We further characterized this deficit by showing that performance of older adults was severely impaired on specific probe trials that required detailed knowledge of the sequence (Skips and Ordinal Transfers), and was associated with a shift in their underlying mnemonic representation of the sequences. Collectively, these findings provide unambiguous evidence that the

  18. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from the colored Petri Net (CPN) models. Then the definitions of cycle event paths, sequence event paths, and key event paths are given. Thereafter, based on the statistical results of the CPN model simulations, key event paths are found by the sensitivity analysis approach. This approach focuses on the logic structures of CPN models, is reliable, and can serve as the basis of structured analysis for discrete event systems. An example of a radar model is given to characterize the application of this approach, and the results are trustworthy.

  19. Harmonic spectral components in time sequences of Markov correlated events

    Science.gov (United States)

    Mazzetti, Piero; Carbone, Anna

    2017-07-01

    The paper concerns the analysis of the conditions allowing time sequences of Markov correlated events to give rise to a line power spectrum of relevant physical interest. It is found that by specializing the Markov matrix to represent closed loop sequences of events with arbitrary distribution, generated in a steady physical condition, a large set of line spectra, covering all possible frequency values, is obtained. The amplitude of the spectral lines is given by a matrix equation based on a generalized Markov matrix involving the Fourier transforms of the distribution functions representing the time intervals between successive events of the sequence. The paper is a complement to a previous work where a general expression for the continuous power spectrum was given. In that case the Markov matrix was left in a more general form, thus preventing the possibility of finding line spectra of physical interest. The present extension is also motivated by the interest in explaining the emergence of the broad set of waves found in electro- and magneto-encephalograms, whose frequencies range from 0.5 to about 40 Hz, in terms of the effects produced by chains of firing neurons within the complex neural network of the brain. An original model based on synchronized closed loop sequences of firing neurons is proposed, and a few numerical simulations are reported as an application of the above cited equation.
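    The emergence of a line spectrum from a closed-loop event sequence can be seen in a degenerate sketch: a strictly periodic train of events yields spectral lines exactly at the repetition frequency and its harmonics, while interval distributions with spread (the Markov-correlated case analysed in the paper) broaden and reweight those lines. The code below is a toy periodogram, not the paper's matrix equation.

```python
import cmath

def power_spectrum(x):
    # Direct DFT periodogram, O(n^2); acceptable at this toy size.
    n = len(x)
    mean = sum(x) / n
    c = [v - mean for v in x]  # remove the DC component
    return [abs(sum(v * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, v in enumerate(c))) ** 2 / n
            for k in range(n // 2)]

# Degenerate closed-loop sequence: one event every `interval` time bins.
n, interval = 256, 8
train = [1.0 if i % interval == 0 else 0.0 for i in range(n)]
spec = power_spectrum(train)

# All spectral power sits on lines at multiples of n/interval = 32.
lines = [k for k in range(1, n // 2) if spec[k] > 1e-6]
print(lines)  # prints: [32, 64, 96]
```

    With random (e.g. Markov-distributed) intervals instead of a fixed one, these delta-like lines acquire finite width, which is the broadening the paper's generalized Markov matrix equation quantifies.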

  20. Accident sequence precursor events with age-related contributors

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, G.A.; Kohn, W.E.

    1995-12-31

    The Accident Sequence Precursor (ASP) Program at ORNL analyzed about 14,000 Licensee Event Reports (LERs) filed by US nuclear power plants from 1987 through 1993. There were 193 events identified as precursors to potential severe core accident sequences. These are reported in NUREG/CR-4674, Volumes 7 through 20. Under the NRC Nuclear Plant Aging Research program, the authors evaluated these events to determine the extent to which component aging played a role. Events were selected that involved age-related equipment degradation that initiated an event or contributed to an event sequence. For the 7-year period, ORNL identified 36 events that involved aging degradation as a contributor to an ASP event. Except for 1992, the percentage of age-related events within the total number of ASP events over the 7-year period (approximately 19%) appears fairly consistent. No correlation between plant age and the number of precursor events was found. A summary list of the age-related events is presented in the report.

  1. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Directory of Open Access Journals (Sweden)

    Robertas Alzbutas

    2015-01-01

    Full Text Available Analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. In currently used probabilistic safety assessment, however, this separation is an issue: the lack of treatment of dynamic interactions between the physical processes on the one hand and random events on the other leads to a limited assessment. In general, there are many mathematical modelling theories, which can be used separately or in an integrated fashion to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach to non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation in view of applying various methods to severe accident scenario simulation and uncertainty analysis. For this, and for wider analysis of accident sequences, the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and the representation of their uncertainty. The developed approach to accident modelling and analysis can be used efficiently to estimate the reliability of hybrid systems and at the same time to analyse, and possibly decrease, the uncertainty of this estimate.

  2. An efficient approach to BAC based assembly of complex genomes.

    Science.gov (United States)

    Visendi, Paul; Berkman, Paul J; Hayashi, Satomi; Golicz, Agnieszka A; Bayer, Philipp E; Ruperao, Pradeep; Hurgobin, Bhavna; Montenegro, Juan; Chan, Chon-Kit Kenneth; Staňková, Helena; Batley, Jacqueline; Šimková, Hana; Doležel, Jaroslav; Edwards, David

    2016-01-01

    There has been an exponential growth in the number of genome sequencing projects since the introduction of next generation DNA sequencing technologies. Genome projects have increasingly involved assembly of whole genome data, which produces inferior assemblies compared to traditional Sanger sequencing of genomic fragments cloned into bacterial artificial chromosomes (BACs). While whole genome shotgun sequencing using next generation sequencing (NGS) is relatively fast and inexpensive, this method is extremely challenging for highly complex genomes, where polyploidy or high repeat content confounds accurate assembly, or where a highly accurate 'gold' reference is required. Several attempts have been made to improve genome sequencing approaches by incorporating NGS methods, with variable success. We present the application of a novel BAC sequencing approach which combines indexed pools of BACs, Illumina paired read sequencing, a sequence assembler specifically designed for complex BAC assembly, and a custom bioinformatics pipeline. We demonstrate this method by sequencing and assembling BAC cloned fragments from bread wheat and sugarcane genomes. We demonstrate that our assembly approach is accurate, robust, cost effective and scalable, with applications for complete genome sequencing in large and complex genomes.

  3. Swallow Event Sequencing: Comparing Healthy Older and Younger Adults.

    Science.gov (United States)

    Herzberg, Erica G; Lazarus, Cathy L; Steele, Catriona M; Molfenter, Sonja M

    2018-04-23

    Previous research has established that a great deal of variation exists in the temporal sequence of swallowing events for healthy adults. Yet, the impact of aging on swallow event sequence is not well understood. Kendall et al. (Dysphagia 18(2):85-91, 2003) suggested there are 4 obligatory paired-event sequences in swallowing. We directly compared adherence to these sequences, as well as event latencies, and quantified the percentage of unique sequences in two samples of healthy adults: young adults and seniors (over 65). The 8 swallowing events that contribute to the sequences were reliably identified from videofluoroscopy in a sample of 23 healthy seniors (10 male, mean age 74.7) and 20 healthy young adults (10 male, mean age 31.5) with no evidence of penetration-aspiration or post-swallow residue. Chi-square analyses compared the proportions of obligatory pairs and unique sequences by age group. Compared to the older subjects, younger subjects had significantly lower adherence to two obligatory sequences: Upper Esophageal Sphincter (UES) opening occurs before (or simultaneous with) the bolus arriving at the UES, and UES maximum distention occurs before maximum pharyngeal constriction. The associated latencies were significantly different between age groups as well. Further, significantly fewer unique swallow sequences were observed in the older group (61%) compared with the young (82%) (χ² = 31.8; p < 0.001). Our findings suggest that paired swallow event sequences may not be robust across the age continuum and that variation in swallow sequences appears to decrease with aging. These findings provide normative references for comparisons to older individuals with dysphagia.
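
    The chi-square comparisons of proportions reported above can be reproduced with a hand-rolled 2×2 test statistic. A minimal sketch (the counts in the test are illustrative, not the study's raw data):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]], e.g. unique vs. repeated
    swallow sequences counted in young vs. older groups."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den
```

    The closed form n(ad − bc)² / ((a+b)(c+d)(a+c)(b+d)) is algebraically identical to summing (O − E)²/E over the four cells.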

  4. Mapping sequences by parts

    Directory of Open Access Journals (Sweden)

    Guziolowski Carito

    2007-09-01

    Full Text Available Abstract Background: We present the N-map method, a pairwise and asymmetrical approach which allows us to compare sequences by taking into account evolutionary events that produce shuffled, reversed or repeated elements. Basically, the optimal N-map of a sequence s over a sequence t is the best way of partitioning the first sequence into N parts and placing them, possibly complementary reversed, over the second sequence in order to maximize the sum of their gapless alignment scores. Results: We introduce an algorithm computing an optimal N-map with time complexity O(|s| × |t| × N), using O(|s| × |t| × N) memory space. Among all the numbers of parts taken in a reasonable range, we select the value N for which the optimal N-map has the most significant score. To evaluate this significance, we study the empirical distributions of the scores of optimal N-maps and show that they can be approximated by normal distributions with reasonable accuracy. We test the functionality of the approach over random sequences on which we apply artificial evolutionary events. Practical Application: The method is illustrated with four case studies of pairs of sequences involving non-standard evolutionary events.
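
    The N-map objective, partition s into N contiguous parts and place each part (possibly reversed) gaplessly over t to maximize the summed scores, can be illustrated with a brute-force toy. This is not the authors' O(|s| × |t| × N) algorithm, just a sketch of the objective using simple identity scoring:

```python
from functools import lru_cache

def gapless_score(part, t):
    """Best gapless placement score of `part` (or its reversal) over t:
    number of matching characters at the best offset."""
    best = 0
    for cand in (part, part[::-1]):          # allow reversed placement
        for off in range(len(t) - len(cand) + 1):
            hits = sum(a == b for a, b in zip(cand, t[off:off + len(cand)]))
            best = max(best, hits)
    return best

def optimal_nmap(s, t, N):
    """Total score of the best partition of s into N non-empty contiguous
    parts, each placed independently over t (requires len(s) >= N)."""
    @lru_cache(maxsize=None)
    def best(i, n):                           # best score for s[:i] in n parts
        if n == 0:
            return 0 if i == 0 else float("-inf")
        return max(best(j, n - 1) + gapless_score(s[j:i], t)
                   for j in range(n - 1, i))
    return best(len(s), N)
```

    For example, mapping "ABCD" onto "CDAB" with N = 2 recovers the shuffled parts "AB" and "CD" exactly, illustrating how shuffling events that defeat a single alignment are captured by the partitioned placement.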

  5. Resolving the Complexity of Human Skin Metagenomes Using Single-Molecule Sequencing

    Directory of Open Access Journals (Sweden)

    Yu-Chih Tsai

    2016-02-01

    Full Text Available Deep metagenomic shotgun sequencing has emerged as a powerful tool to interrogate composition and function of complex microbial communities. Computational approaches to assemble genome fragments have been demonstrated to be an effective tool for de novo reconstruction of genomes from these communities. However, the resultant “genomes” are typically fragmented and incomplete due to the limited ability of short-read sequence data to assemble complex or low-coverage regions. Here, we use single-molecule, real-time (SMRT) sequencing to reconstruct a high-quality, closed genome of a previously uncharacterized Corynebacterium simulans and its companion bacteriophage from a skin metagenomic sample. Considerable improvement in assembly quality occurs in hybrid approaches incorporating short-read data, with even relatively small amounts of long-read data being sufficient to improve metagenome reconstruction. Using short-read data to evaluate strain variation of this C. simulans in its skin community at single-nucleotide resolution, we observed a dominant C. simulans strain with moderate allelic heterozygosity throughout the population. We demonstrate the utility of SMRT sequencing and hybrid approaches in metagenome quantitation, reconstruction, and annotation.

  6. Resolving the Complexity of Human Skin Metagenomes Using Single-Molecule Sequencing

    Science.gov (United States)

    Tsai, Yu-Chih; Deming, Clayton; Segre, Julia A.; Kong, Heidi H.; Korlach, Jonas

    2016-01-01

    ABSTRACT Deep metagenomic shotgun sequencing has emerged as a powerful tool to interrogate composition and function of complex microbial communities. Computational approaches to assemble genome fragments have been demonstrated to be an effective tool for de novo reconstruction of genomes from these communities. However, the resultant “genomes” are typically fragmented and incomplete due to the limited ability of short-read sequence data to assemble complex or low-coverage regions. Here, we use single-molecule, real-time (SMRT) sequencing to reconstruct a high-quality, closed genome of a previously uncharacterized Corynebacterium simulans and its companion bacteriophage from a skin metagenomic sample. Considerable improvement in assembly quality occurs in hybrid approaches incorporating short-read data, with even relatively small amounts of long-read data being sufficient to improve metagenome reconstruction. Using short-read data to evaluate strain variation of this C. simulans in its skin community at single-nucleotide resolution, we observed a dominant C. simulans strain with moderate allelic heterozygosity throughout the population. We demonstrate the utility of SMRT sequencing and hybrid approaches in metagenome quantitation, reconstruction, and annotation. PMID:26861018

  7. Hidden Markov event sequence models: toward unsupervised functional MRI brain mapping.

    Science.gov (United States)

    Faisan, Sylvain; Thoraval, Laurent; Armspach, Jean-Paul; Foucher, Jack R; Metz-Lutz, Marie-Noëlle; Heitz, Fabrice

    2005-01-01

    Most methods used in functional MRI (fMRI) brain mapping require restrictive assumptions about the shape and timing of the fMRI signal in activated voxels. Consequently, fMRI data may be partially and misleadingly characterized, leading to suboptimal or invalid inference. To limit these assumptions and to capture the broad range of possible activation patterns, a novel statistical fMRI brain mapping method is proposed. It relies on hidden semi-Markov event sequence models (HSMESMs), a special class of hidden Markov models (HMMs) dedicated to the modeling and analysis of event-based random processes. Activation detection is formulated in terms of time coupling between (1) the observed sequence of hemodynamic response onset (HRO) events detected in the voxel's fMRI signal and (2) the "hidden" sequence of task-induced neural activation onset (NAO) events underlying the HROs. Both event sequences are modeled within a single HSMESM. The resulting brain activation model is trained to automatically detect neural activity embedded in the input fMRI data set under analysis. The data sets considered in this article are threefold: synthetic epoch-related, real epoch-related (auditory lexical processing task), and real event-related (oddball detection task) fMRI data sets. Synthetic data: Activation detection results demonstrate the superiority of the HSMESM mapping method with respect to a standard implementation of the statistical parametric mapping (SPM) approach. They are also very close, sometimes equivalent, to those obtained with an "ideal" implementation of SPM in which the activation patterns synthesized are reused for analysis. The HSMESM method appears clearly insensitive to timing variations of the hemodynamic response and exhibits low sensitivity to fluctuations of its shape (unsustained activation during task). 
Real epoch-related data: HSMESM activation detection results compete with those obtained with SPM, without requiring any prior definition of the expected

  8. Real-time monitoring of clinical processes using complex event processing and transition systems.

    Science.gov (United States)

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.

  9. ES-RBE Event sequence reliability Benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.E.J.

    1991-01-01

    The Event Sequence Reliability Benchmark Exercise (ES-RBE) can be considered a logical extension of the other three Reliability Benchmark Exercises: the RBE on Systems Analysis, the RBE on Common Cause Failures, and the RBE on Human Factors. The latter, constituting Activity No. 1, was concluded by the end of 1987. The ES-RBE covered the techniques currently used for analysing and quantifying sequences of events, starting from an initiating event through to various plant damage states, including analysis of various system failures and/or successes, human intervention failures and/or successes, and dependencies between systems. In this way, one aim of the ES-RBE was to integrate the experience gained in the previous exercises

  10. Study of event sequence database for a nuclear power domain

    International Nuclear Information System (INIS)

    Kusumi, Yoshiaki

    1998-01-01

    A retrieval engine developed to extract event sequences from an accident information database, using a time-series retrieval formula expressed with ordered retrieval terms, is explored. This engine outputs not only sequences that completely match the time-series retrieval formula, but also sequences that approximately match it (fuzzy retrieval). An event sequence database was constructed in which each record consists of three ordered parameters: the causal event, the process, and the result. The database was then used to assess the feasibility of the engine, and favorable results were obtained. (author)
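
    The fuzzy, ordered-term retrieval idea can be sketched as a greedy in-order match: scan the record chain for each retrieval term in turn and score the fraction of terms found in order. This is a hypothetical illustration, not the engine described in the paper (whose scoring is not specified here):

```python
def fuzzy_sequence_match(query, record):
    """Score how well the ordered retrieval terms in `query` appear, in
    order, within `record` (e.g. a causal-event / process / result chain).
    1.0 means a complete ordered match; fractions flag approximate ones."""
    matched, pos = 0, 0
    for term in query:
        try:
            pos = record.index(term, pos) + 1  # must appear after prior hit
            matched += 1
        except ValueError:
            continue   # term missing: skip it, keep scanning for later terms
    return matched / len(query)
```

    A record matching all three terms in order scores 1.0; dropping one term (or finding it out of order) lowers the score rather than rejecting the record outright, which is the spirit of the fuzzy retrieval described above.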

  11. A single-chip event sequencer and related microcontroller instrumentation for atomic physics research.

    Science.gov (United States)

    Eyler, E E

    2011-01-01

    A 16-bit digital event sequencer with 50 ns resolution and 50 ns trigger jitter is implemented by using an internal 32-bit timer on a dsPIC30F4013 microcontroller, controlled by an easily modified program written in standard C. It can accommodate hundreds of output events, and adjacent events can be spaced as closely as 1.5 μs. The microcontroller has robust 5 V inputs and outputs, allowing a direct interface to common laboratory equipment and other electronics. A USB computer interface and a pair of analog ramp outputs can be added with just two additional chips. An optional display/keypad unit allows direct interaction with the sequencer without requiring an external computer. Minor additions also allow simple realizations of other complex instruments, including a precision high-voltage ramp generator for driving spectrum analyzers or piezoelectric positioners, and a low-cost proportional integral differential controller and lock-in amplifier for laser frequency stabilization with about 100 kHz bandwidth.
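
    The firmware itself is C on a dsPIC; as a language-neutral illustration of the core idea, a time-sorted compare table fired against a free-running timer with 50 ns ticks, here is a toy Python model (the function name and the floor-quantization behavior are illustrative assumptions, not the article's code):

```python
def compile_sequence(events, tick_ns=50):
    """Toy model of a compare-match event sequencer: quantize each
    (time_ns, output_word) event to the timer resolution and sort the
    table into the order in which the timer will fire the outputs."""
    table = sorted((t_ns // tick_ns, word) for t_ns, word in events)
    # report the actual firing times after quantization to timer ticks
    return [(tick * tick_ns, word) for tick, word in table]
```

    Events programmed out of order are fired in time order, and requested times are rounded down to the 50 ns grid, the same quantization a real compare register imposes.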

  12. Application of the whole-transcriptome shotgun sequencing approach to the study of Philadelphia-positive acute lymphoblastic leukemia

    International Nuclear Information System (INIS)

    Iacobucci, I; Ferrarini, A; Sazzini, M; Giacomelli, E; Lonetti, A; Xumerle, L; Ferrari, A; Papayannidis, C; Malerba, G; Luiselli, D; Boattini, A; Garagnani, P; Vitale, A; Soverini, S; Pane, F; Baccarani, M; Delledonne, M; Martinelli, G

    2012-01-01

    Although the pathogenesis of BCR–ABL1-positive acute lymphoblastic leukemia (ALL) is mainly related to the expression of the BCR–ABL1 fusion transcript, additional cooperating genetic lesions are supposed to be involved in its development and progression. Therefore, in an attempt to investigate the complex landscape of mutations, changes in expression profiles and alternative splicing (AS) events that can be observed in such disease, the leukemia transcriptome of a BCR–ABL1-positive ALL patient at diagnosis and at relapse was sequenced using a whole-transcriptome shotgun sequencing (RNA-Seq) approach. Totals of 13.9 and 15.8 million sequence reads were generated from the de novo and relapsed samples, respectively, and aligned to the human genome reference sequence. This led to the identification of five validated missense mutations in genes involved in metabolic processes (DPEP1, TMEM46), transport (MVP), cell cycle regulation (ABL1) and catalytic activity (CTSZ), two of which resulted in acquired relapse variants. In all, 6390 and 4671 putative AS events were also detected, as well as expression levels for 18,315 and 18,795 genes, 28% of which were differentially expressed in the two disease phases. These data demonstrate that RNA-Seq is a suitable approach for identifying a wide spectrum of genetic alterations potentially involved in ALL

  13. Plastome Sequence Determination and Comparative Analysis for Members of the Lolium-Festuca Grass Species Complex

    Science.gov (United States)

    Hand, Melanie L.; Spangenberg, German C.; Forster, John W.; Cogan, Noel O. I.

    2013-01-01

    Chloroplast genome sequences are of broad significance in plant biology, due to frequent use in molecular phylogenetics, comparative genomics, population genetics, and genetic modification studies. The present study used a second-generation sequencing approach to determine and assemble the plastid genomes (plastomes) of four representatives from the agriculturally important Lolium-Festuca species complex of pasture grasses (Lolium multiflorum, Festuca pratensis, Festuca altissima, and Festuca ovina). Total cellular DNA was extracted from either roots or leaves, was sequenced, and the output was filtered for plastome-related reads. A comparison between sources revealed fewer plastome-related reads from root-derived template but an increase in incidental bacterium-derived sequences. Plastome assembly and annotation indicated high levels of sequence identity and a conserved organization and gene content between species. However, frequent deletions within the F. ovina plastome appeared to contribute to a smaller plastid genome size. Comparative analysis with complete plastome sequences from other members of the Poaceae confirmed conservation of most grass-specific features. Detailed analysis of the rbcL–psaI intergenic region, however, revealed a “hot-spot” of variation characterized by independent deletion events. The evolutionary implications of this observation are discussed. The complete plastome sequences are anticipated to provide the basis for potential organelle-specific genetic modification of pasture grasses. PMID:23550121

  14. A sequence of events across the Cretaceous-Tertiary boundary

    NARCIS (Netherlands)

    Smit, J.; Romein, A.J.T.

    1985-01-01

    The lithological and biological sequence of events across the Cretaceous-Tertiary (K/T) boundary, as developed in thick and complete landbased sections and termed the standard K/T event sequence, is also found in many DSDP cores from all over the globe. Microtektite-like spherules have been found in

  15. Sequencing Events: Exploring Art and Art Jobs.

    Science.gov (United States)

    Stephens, Pamela Geiger; Shaddix, Robin K.

    2000-01-01

    Presents an activity for upper-elementary students that correlates the actions of archaeologists, patrons, and artists with the sequencing of events in a logical order. Features ancient Egyptian art images. Discusses the preparation of materials, motivation, a pre-writing activity, and writing a story in sequence. (CMK)

  16. Automated Testing with Targeted Event Sequence Generation

    DEFF Research Database (Denmark)

    Jensen, Casper Svenning; Prasad, Mukul R.; Møller, Anders

    2013-01-01

    Automated software testing aims to detect errors by producing test inputs that cover as much of the application source code as possible. Applications for mobile devices are typically event-driven, which raises the challenge of automatically producing event sequences that result in high coverage...

  17. Service Processes as a Sequence of Events

    NARCIS (Netherlands)

    P.C. Verhoef (Peter); G. Antonides (Gerrit); A.N. de Hoog

    2002-01-01

    textabstractIn this paper the service process is considered as a sequence of events. Using theory from economics and psychology a model is formulated that explains how the utility of each event affects the overall evaluation of the service process. In this model we especially account for the

  18. Time Separation Between Events in a Sequence: a Regional Property?

    Science.gov (United States)

    Muirwood, R.; Fitzenz, D. D.

    2013-12-01

    Earthquake sequences are loosely defined as events occurring too closely in time and space to appear unrelated. Depending on the declustering method, several, all, or no event(s) after the first large event might be recognized as independent mainshocks. It can therefore be argued that a probabilistic seismic hazard assessment (PSHA, traditionally dealing with mainshocks only) might already include the ground shaking effects of such sequences. Alternatively, all but the largest event could be classified as an 'aftershock' and removed from the earthquake catalog. While in PSHA the question is only whether to keep or remove the events from the catalog, for Risk Management purposes the community response to the earthquakes, as well as insurance risk transfer mechanisms, can be profoundly affected by the actual timing of events in such a sequence. In particular, the repetition of damaging earthquakes over a period of weeks to months can lead to businesses closing and families evacuating from the region (as happened in Christchurch, New Zealand in 2011). Buildings that are damaged in the first earthquake may go on to be damaged again, even while they are being repaired. Insurance also functions around a set of critical timeframes, including the definition of a single 'event loss' for reinsurance recoveries within the 192-hour 'hours clause', the 6-18 month pace at which insurance claims are settled, and the annual renewal of insurance and reinsurance contracts. We show how temporal aspects of earthquake sequences need to be taken into account within models for Risk Management, and which time separations between events are most sensitive, both in terms of the modeled disruptions to lifelines and business activity and in the losses to different parties (such as insureds, insurers and reinsurers). We also explore the time separation between all events and between loss-causing events for a collection of sequences from across the world and we point to the need to

  19. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents' probabilities and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft getting into an icing region
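
    The Monte Carlo flavour of such an analysis can be sketched with a toy two-branch event sequence diagram whose accident branch probability depends on a fluctuating SSV; the resulting curve is a discrete risk-evolution trace, and a sharp jump in it marks a risk-critical step. This is an illustrative sketch under assumed names and a made-up branch-probability model, not the paper's method:

```python
import random

def risk_evolution(ssv_trace, p_accident, trials=20000, seed=1):
    """Toy Monte Carlo estimate of system risk evolution: for each value
    of a fluctuating system state variable (SSV), sample the accident
    branch of a two-branch event sequence diagram `trials` times and
    record the estimated accident probability at that step."""
    rng = random.Random(seed)
    return [sum(rng.random() < p_accident(ssv) for _ in range(trials)) / trials
            for ssv in ssv_trace]
```

    With, say, p_accident(ssv) = clip((ssv − 50)/25, 0, 1), the curve stays at zero while the SSV is below the threshold and rises as it drifts past it, mimicking how SSV fluctuations drive SRE.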

  20. Integrated analysis of RNA-binding protein complexes using in vitro selection and high-throughput sequencing and sequence specificity landscapes (SEQRS).

    Science.gov (United States)

    Lou, Tzu-Fang; Weidmann, Chase A; Killingsworth, Jordan; Tanaka Hall, Traci M; Goldstrohm, Aaron C; Campbell, Zachary T

    2017-04-15

    RNA-binding proteins (RBPs) collaborate to control virtually every aspect of RNA function. Tremendous progress has been made in the area of global assessment of RBP specificity using next-generation sequencing approaches both in vivo and in vitro. Understanding how protein-protein interactions enable precise combinatorial regulation of RNA remains a significant problem. Addressing this challenge requires tools that can quantitatively determine the specificities of both individual proteins and multimeric complexes in an unbiased and comprehensive way. One approach utilizes in vitro selection, high-throughput sequencing, and sequence-specificity landscapes (SEQRS). We outline a SEQRS experiment focused on obtaining the specificity of a multi-protein complex between Drosophila RBPs Pumilio (Pum) and Nanos (Nos). We discuss the necessary controls in this type of experiment and examine how the resulting data can be complemented with structural and cell-based reporter assays. Additionally, SEQRS data can be integrated with functional genomics data to uncover biological function. Finally, we propose extensions of the technique that will enhance our understanding of multi-protein regulatory complexes assembled onto RNA. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. CESAS: Computerized event sequence abstracting system outlines and applications

    International Nuclear Information System (INIS)

    Watanabe, N.; Kobayashi, K.; Fujiki, K.

    1990-01-01

    For the purpose of efficient utilization of safety-related event information on nuclear power plants, a new computer software package, CESAS, has been under development. CESAS systematically abstracts the event sequence, that is, a series of sequential and causal relationships between occurrences, from event descriptions written in natural English. The system is based on knowledge engineering techniques used in the field of natural language processing. The analytical process consists of morphemic, syntactic, semantic, and syntagmatic analyses. To date, the first version of CESAS has been developed and applied to several real event descriptions to study its feasibility. This paper describes the outline of CESAS and compares one of its analytical results with a manually extracted event sequence

  2. Sequence complexity and work extraction

    International Nuclear Information System (INIS)

    Merhav, Neri

    2015-01-01

    We consider a simplified version of a solvable model by Mandal and Jarzynski, which constructively demonstrates the interplay between work extraction and the increase of the Shannon entropy of an information reservoir which is in contact with a physical system. We extend Mandal and Jarzynski’s main findings in several directions: first, we allow sequences of correlated bits rather than just independent bits. Secondly, at least for the case of binary information, we show that, in fact, the Shannon entropy is only one measure of complexity of the information that must increase in order for work to be extracted. The extracted work can also be upper bounded in terms of the increase in other quantities that measure complexity, like the predictability of future bits from past ones. Third, we provide an extension to the case of non-binary information (i.e. a larger alphabet), and finally, we extend the scope to the case where the incoming bits (before the interaction) form an individual sequence, rather than a random one. In this case, the entropy before the interaction can be replaced by the Lempel–Ziv (LZ) complexity of the incoming sequence, a fact that gives rise to an entropic meaning of the LZ complexity, not only in information theory, but also in physics. (paper)
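
    The Lempel–Ziv complexity invoked above can be computed with the classic LZ76 parsing, which counts phrases, each phrase being the shortest substring not already seen in the preceding text. A minimal sketch:

```python
def lz76_complexity(s):
    """Number of phrases in the Lempel-Ziv (LZ76) parsing of s. Each new
    phrase extends while the candidate still occurs in the text before
    its last character; random strings yield many phrases, regular
    (compressible) strings few."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while s[i:i+l] occurs in the prior text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c
```

    For a random binary string the phrase count grows like n/log₂ n, while a constant string stays at 2, which is the sense in which the LZ complexity of the incoming bit sequence bounds the extractable work in the individual-sequence setting.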

  3. Controlling extreme events on complex networks

    Science.gov (United States)

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-08-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network ``mobile'' can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed.

  4. Application of declarative modeling approaches for external events

    International Nuclear Information System (INIS)

    Anoba, R.C.

    2005-01-01

    Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at nuclear power plants. Since the issuance of Generic Letter 88-20 and the subsequent IPE/IPEEE assessments, the NRC has issued several Regulatory Guides, such as RG 1.174, to describe the use of PSA in risk-informed regulation activities. Most PSAs have the capability to address internal events, including internal floods. As more demands are placed on using the PSA to support risk-informed applications, there has been a growing need to integrate other external events (seismic, fire, etc.) into the logic models. Most external events involve spatial dependencies and usually impact the logic models at the component level. Therefore, manual insertion of external event impacts into a complex integrated fault tree model may be too cumbersome for routine uses of the PSA. Within the past year, a declarative modeling approach has been developed to automate the injection of external events into the PSA. The intent of this paper is to introduce the concept of declarative modeling in the context of external event applications. A declarative modeling approach involves the definition of rules for injection of external event impacts into the fault tree logic. A software tool such as EPRI's XInit program can be used to interpret the pre-defined rules and automatically inject external event elements into the PSA. The injection process can easily be repeated, as required, to address plant changes, sensitivity issues, changes in boundary conditions, etc. External event elements may include fire initiating events, seismic initiating events, seismic fragilities, fire-induced hot-short events, special human failure events, etc. This approach has been applied at a number of US nuclear power plants, including a nuclear power plant in Romania. (authors)
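
    The rule-driven injection idea can be sketched on a dictionary-encoded fault tree: each rule pairs a predicate on basic events with an external-event impact, and every matching basic event is wrapped in an OR gate with that impact. The tree encoding, gate names, and rule shape below are illustrative assumptions, not the XInit rule language:

```python
import copy

def inject_external_events(fault_tree, rules):
    """Toy declarative injection. The fault tree is a dict mapping gate
    name -> {"op": "AND"/"OR", "inputs": [...]}; basic events are plain
    strings. Each rule is (predicate, external_event): matching basic
    events are replaced by OR(event, external_event)."""
    tree = copy.deepcopy(fault_tree)          # leave the base PSA model intact
    for gate, node in list(tree.items()):
        new_inputs = []
        for inp in node["inputs"]:
            for pred, ext_event in rules:
                if isinstance(inp, str) and pred(inp):
                    or_gate = f"{inp}_OR_EXT"
                    tree[or_gate] = {"op": "OR", "inputs": [inp, ext_event]}
                    inp = or_gate             # re-route the gate input
            new_inputs.append(inp)
        node["inputs"] = new_inputs
    return tree
```

    Because the base model is copied, the injection can be repeated with different rule sets, mirroring the paper's point that re-injection for plant changes or sensitivity studies is cheap once the rules are declarative.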

  5. Introduction of operator actions in the event trees

    International Nuclear Information System (INIS)

    Bars, G.; Lanore, J.M.; Villeroux, C.

    1984-11-01

    In the PRA in progress in France for a 900 MW PWR plant, an effort is being made to introduce operator actions into accident sequences. A first approach to this complex problem relies on extensive use of existing methods and knowledge in diverse fields. Identification of actions is based on the operating procedures, and in particular on the existence of special emergency procedures which define the optimal actions during severe accidents. This approach implies introducing into the event trees the notion of procedure failure. Quantification of the corresponding probabilities raises several problems, including the physics of the sequences, systems availability, and human behaviour in decision making and actions. This treatment is illustrated by the example of the small-break event tree

  6. Long-Term Recall of Event Sequences in Infancy.

    Science.gov (United States)

    Mandler, Jean M.; McDonough, Laraine

    1995-01-01

    Two experiments demonstrated that 11-month-olds can encode novel causal events from a brief period of observational learning and recall much of the information after 24 hours and after 3 months. The infants remembered more individual actions than whole sequences, but reproduced many of the events in their entirety after the long delay. (MDM)

  7. Incident sequence analysis; event trees, methods and graphical symbols

    International Nuclear Information System (INIS)

    1980-11-01

    When analyzing incident sequences, unwanted events resulting from a certain cause are looked for. Graphical symbols and explanations of graphical representations are presented. The method applies to the analysis of incident sequences in all types of facilities. By means of the incident sequence diagram, incident sequences, i.e. the logical and chronological course of repercussions initiated by the failure of a component or by an operating error, can be presented and analyzed simply and clearly

  8. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  9. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Background: The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results: We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions: Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  10. Object-Oriented Query Language For Events Detection From Images Sequences

    Science.gov (United States)

    Ganea, Ion Eugen

    2015-09-01

    This paper presents a method for representing events extracted from image sequences, together with the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented syntax used for event processing allows index classes to be instantiated in order to improve the accuracy of query results. The experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detecting incidents, the final goal of the research.
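
An object model with temporal relations between events can be illustrated as follows. This is a generic sketch, not the paper's query language: the class, fields, and the `before`/`query` helpers are invented here:

```python
# Illustrative object model for video events with a temporal relation and a
# simple selection query. All names are hypothetical, not the paper's syntax.
from dataclasses import dataclass

@dataclass
class Event:
    name: str        # event type, e.g. "pass" or "goal"
    start: float     # seconds from the beginning of the image sequence
    end: float
    objects: tuple   # ids of the salient objects involved in the event

def before(a, b):
    """Allen-style 'before' temporal relation between two events."""
    return a.end < b.start

def query(events, name, after=None):
    """Select events by type, optionally only those occurring after a given event."""
    return [e for e in events
            if e.name == name and (after is None or before(after, e))]

if __name__ == "__main__":
    events = [Event("pass", 0.0, 2.0, (1, 2)),
              Event("goal", 5.0, 6.0, (2,)),
              Event("pass", 8.0, 9.0, (3, 4))]
    # passes that happen after the goal
    print(query(events, "pass", after=events[1]))
```

A real system would index events by type and time to avoid the linear scan; the sketch only shows how storing and querying can share one object model.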

  11. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  12. The 2016-2017 Central Italy Seismic Sequence: Source Complexity Inferred from Rupture Models.

    Science.gov (United States)

    Scognamiglio, L.; Tinti, E.; Casarotti, E.; Pucci, S.; Villani, F.; Cocco, M.; Magnoni, F.; Michelini, A.

    2017-12-01

    The Apennines have been struck by several seismic sequences in recent years, showing evidence of the activation of multiple segments of normal fault systems in a variable and relatively short time span, as in the case of the 1980 Irpinia earthquake (three shocks in 40 s), the 1997 Umbria-Marche sequence (four main shocks in 18 days), and the 2009 L'Aquila earthquake, which had three segments activated within a few weeks. The 2016-2017 central Apennines seismic sequence began on August 24th with a MW 6.0 earthquake, which struck the region between Amatrice and Accumoli, causing 299 fatalities. This earthquake ruptured a nearly 20 km long normal fault and showed a quite heterogeneous slip distribution. On October 26th, another main shock (MW 5.9) occurred near Visso, extending the activated seismogenic area toward the NW. It was a double event rupturing contiguous patches on the fault segment of the normal fault system. Four days after the second main shock, on October 30th, a third earthquake (MW 6.5) occurred near Norcia, roughly midway between Accumoli and Visso. In this work we have inverted strong motion waveforms and GPS data to retrieve the source model of the MW 6.5 event with the aim of interpreting the rupture process in the framework of this complex sequence of moderate magnitude earthquakes. We noted that some preliminary attempts to model the slip distribution of the October 30th main shock using a single fault plane oriented along the Apennines did not provide convincing fits to the observed waveforms. In addition, the deformation pattern inferred from satellite observations suggested the activation of a multi-fault structure, coherent with the complexity and the extent of the geological surface deformation. We investigated the role of multi-fault ruptures and found that this event revealed an extraordinary complexity of the rupture geometry and evolution: the coseismic rupture propagated almost simultaneously on a normal fault and on a blind fault

  13. Complex Regional Pain Syndrome and Treatment Approaches

    Directory of Open Access Journals (Sweden)

    Neslihan Gokcen

    2013-08-01

    Complex Regional Pain Syndrome is a symptom complex including severe pain which is disproportionate to the initiating event. Formerly, it was known as reflex sympathetic dystrophy, Sudeck's atrophy, and algoneurodystrophy. There are two types of complex regional pain syndrome (CRPS): CRPS type 1 (reflex sympathetic dystrophy) occurs after a minor trauma of the extremities, while CRPS type 2 (causalgia) occurs following peripheral nerve injury. Diagnosis is made according to the history, symptoms, and physical findings of the patient. Patient education, physical therapy, and medical treatment are the most common treatment approaches for complex regional pain syndrome. The aim of this review is to revise the treatment options for complex regional pain syndrome, as well as to overview the new treatment approaches and options for refractory complex regional pain syndrome cases. [Archives Medical Review Journal 2013; 22(4): 514-531]

  14. Thirteen- and Sixteen-Month-Olds' Long-Term Recall of Event Sequences.

    Science.gov (United States)

    Hertsgaard, L.; Bauer, P. J.

    In two experiments, the ability of children younger than 20 months to engage in delayed ordered recall was investigated. In the first experiment, 13- and 16-month-old children were presented with 2-step event sequences and tested for recall, first, immediately following the event and second, after a one-week delay. Sequences were novel-causal,…

  15. Potential Indoor Worker Exposure From Handling Area Leakage: Example Event Sequence Frequency Analysis

    International Nuclear Information System (INIS)

    Benke, Roland R.; Adams, George R.

    2008-01-01

    potential event sequences. A hypothetical case is presented for failure of the HVAC exhaust system to provide confinement for contaminated air from otherwise normal operations. This paper presents an example calculation of frequencies for a potential event sequence involving HVAC system failure during otherwise routine wet transfer operations of spent nuclear fuel assemblies from an open container. For the simplified HVAC exhaust system model, the calculation indicated that the potential event sequence may or may not be a Category 1 event sequence, in light of current uncertainties (e.g., final HVAC system design and duration of facility operations). Categorization of potential event sequences is important because different regulatory requirements and performance objectives are specified based on the categorization of event sequences. A companion paper presents a dose calculation methodology and example calculations of indoor worker consequences for the posed example event sequence. Together, the two companion papers demonstrate capabilities for performing confirmatory calculations of frequency and consequence, which may assist the assessment of worker safety during a risk-informed regulatory review of a potential DOE license application

  16. Self-Exciting Point Process Modeling of Conversation Event Sequences

    Science.gov (United States)

    Masuda, Naoki; Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo

    Self-exciting processes of Hawkes type have been used to model various phenomena including earthquakes, neural activities, and views of online videos. Studies of temporal networks have revealed that sequences of social interevent times for individuals are highly bursty. We examine some basic properties of event sequences generated by the Hawkes self-exciting process to show that it generates bursty interevent times for a wide parameter range. Then, we fit the model to data on conversation sequences recorded in company offices in Japan. In this way, we can estimate the relative magnitude of the self-excitation, its temporal decay, and the base event rate independent of the self-excitation. These variables depend strongly on the individual. We also point out that the Hawkes model has an important limitation: the correlation in the interevent times and the burstiness cannot be independently modulated.
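
The burstiness claim is easy to check numerically. The sketch below simulates a univariate Hawkes process with an exponential kernel by Ogata's thinning method and reports the coefficient of variation (CV) of interevent times; a Poisson process has CV ≈ 1, while the self-exciting process produces CV > 1. The parameter values are illustrative, not the paper's fitted estimates:

```python
import math
import random
import statistics

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Simulate a Hawkes process with intensity
    lambda(t) = mu + sum_{t_i < t} alpha * beta * exp(-beta * (t - t_i))
    by Ogata thinning. alpha is the branching ratio (alpha < 1 required)."""
    rng = random.Random(seed)
    events, t, excite = [], 0.0, 0.0   # excite = current kernel contribution
    while True:
        lam_bar = mu + excite          # valid bound: intensity only decays
        w = rng.expovariate(lam_bar)   # candidate waiting time
        excite *= math.exp(-beta * w)  # decay of past excitation
        t += w
        if t >= t_max:
            return events
        if rng.random() * lam_bar <= mu + excite:  # thinning acceptance
            events.append(t)
            excite += alpha * beta     # intensity jump at the new event

if __name__ == "__main__":
    events = simulate_hawkes(mu=0.5, alpha=0.8, beta=2.0, t_max=2000.0, seed=1)
    iet = [b - a for a, b in zip(events, events[1:])]
    cv = statistics.stdev(iet) / statistics.mean(iet)
    print(f"{len(events)} events, interevent-time CV = {cv:.2f}")  # CV > 1: bursty
```

Lowering `alpha` toward 0 recovers near-Poisson behavior, which illustrates the abstract's point that burstiness is driven by the self-excitation term.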

  17. MinION™ nanopore sequencing of environmental metagenomes: a synthetic approach.

    Science.gov (United States)

    Brown, Bonnie L; Watson, Mick; Minot, Samuel S; Rivera, Maria C; Franklin, Rima B

    2017-03-01

    Environmental metagenomic analysis is typically accomplished by assigning taxonomy and/or function from whole genome sequencing or 16S amplicon sequences. Both of these approaches are limited, however, by read length, among other technical and biological factors. A nanopore-based sequencing platform, MinION™, produces reads that are ≥1 × 10^4 bp in length, potentially providing for more precise assignment, thereby alleviating some of the limitations inherent in determining metagenome composition from short reads. We tested the ability of sequence data produced by MinION (R7.3 flow cells) to correctly assign taxonomy in single bacterial species runs and in three types of low-complexity synthetic communities: a mixture of DNA using equal mass from four species, a community with one relatively rare (1%) and three abundant (33% each) components, and a mixture of genomic DNA from 20 bacterial strains of staggered representation. Taxonomic composition of the low-complexity communities was assessed by analyzing the MinION sequence data with three different bioinformatic approaches: Kraken, MG-RAST, and One Codex. Results: Long read sequences generated from libraries prepared from single strains using the version 5 kit and chemistry, run on the original MinION device, yielded as few as 224 to as many as 3497 bidirectional high-quality (2D) reads with an average overall study length of 6000 bp. For the single-strain analyses, assignment of reads to the correct genus by different methods ranged from 53.1% to 99.5%, assignment to the correct species ranged from 23.9% to 99.5%, and the majority of misassigned reads were to closely related organisms. A synthetic metagenome sequenced with the same setup yielded 714 high quality 2D reads of approximately 5500 bp that were up to 98% correctly assigned to the species level. Synthetic metagenome MinION libraries generated using version 6 kit and chemistry yielded from 899 to 3497 2D reads with lengths averaging 5700 bp with up

  18. A classification of event sequences in the influence network

    Science.gov (United States)

    Walsh, James Lyons; Knuth, Kevin H.

    2017-06-01

    We build on the classification in [1] of event sequences in the influence network as respecting collinearity or not, so as to determine in future work what phenomena arise in each case. Collinearity enables each observer to uniquely associate each particle event of influencing with one of the observer's own events, even in the case of events of influencing the other observer. We further classify events as to whether they are spacetime events that obey in the fine-grained case the coarse-grained conditions of [2], finding that Newton's First and Second Laws of motion are obeyed at spacetime events. A proof of Newton's Third Law under particular circumstances is also presented.

  19. Characterization of Aftershock Sequences from Large Strike-Slip Earthquakes Along Geometrically Complex Faults

    Science.gov (United States)

    Sexton, E.; Thomas, A.; Delbridge, B. G.

    2017-12-01

    Large earthquakes often exhibit complex slip distributions and occur along non-planar fault geometries, resulting in variable stress changes throughout the region of the fault hosting aftershocks. To better discern the role of geometric discontinuities on aftershock sequences, we compare areas of enhanced and reduced Coulomb failure stress and mean stress for systematic differences in the time dependence and productivity of these aftershock sequences. In strike-slip faults, releasing structures, including stepovers and bends, experience an increase in both Coulomb failure stress and mean stress during an earthquake, promoting fluid diffusion into the region and further failure. Conversely, Coulomb failure stress and mean stress decrease in restraining bends and stepovers in strike-slip faults, and fluids diffuse away from these areas, discouraging failure. We examine spatial differences in seismicity patterns along structurally complex strike-slip faults which have hosted large earthquakes, such as the 1992 Mw 7.3 Landers, the 2010 Mw 7.2 El-Mayor Cucapah, the 2014 Mw 6.0 South Napa, and the 2016 Mw 7.0 Kumamoto events. We characterize the behavior of these aftershock sequences with the Epidemic Type Aftershock-Sequence Model (ETAS). In this statistical model, the total occurrence rate of aftershocks induced by an earthquake is λ(t) = λ_0 + \\sum_{i:t_i
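
The ETAS intensity quoted above is cut off in the record. A common full form adds a modified-Omori decay and an exponential magnitude-productivity term to the background rate; the sketch below evaluates that standard form (not necessarily the exact parameterization the authors used), with purely illustrative parameter values:

```python
import math

def etas_rate(t, events, lam0, K, c, p, alpha, m_ref):
    """Conditional intensity of a common ETAS form:
    lambda(t) = lam0 + sum_{i: t_i < t} K * exp(alpha*(m_i - m_ref)) / (t - t_i + c)**p
    events: list of (t_i, m_i) pairs; lam0 is the background rate."""
    rate = lam0
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_ref)) / (t - t_i + c) ** p
    return rate

if __name__ == "__main__":
    # One Landers-like main shock at t = 0 days plus an early aftershock.
    # All parameter values are illustrative, not fitted.
    events = [(0.0, 7.3), (0.5, 5.1)]
    for t in (1.0, 10.0, 100.0):
        r = etas_rate(t, events, lam0=0.05, K=0.02, c=0.01, p=1.1,
                      alpha=1.0, m_ref=5.0)
        print(f"t = {t:6.1f} d  rate = {r:.4f} /d")
```

The printed rates decay toward the background rate `lam0` with time, which is the Omori-law behavior the statistical model captures when characterizing aftershock productivity near geometric fault complexities.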

  20. Instruction sequences and non-uniform complexity theory

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2008-01-01

    We develop theory concerning non-uniform complexity in a setting in which the notion of single-pass instruction sequence considered in program algebra is the central notion. We define counterparts of the complexity classes P/poly and NP/poly and formulate a counterpart of the complexity theoretic

  1. Complexity rating of abnormal events and operator performance

    International Nuclear Information System (INIS)

    Oeivind Braarud, Per

    1998-01-01

    The complexity of the work situation during abnormal situations is a major topic in any discussion of the safety aspects of nuclear power plants. An understanding of complexity and its impact on operator performance in abnormal situations is important. One way to enhance understanding is to look at the dimensions that constitute complexity for NPP operators, and how those dimensions can be measured. A further step is to study how dimensions of complexity of the event are related to the performance of operators. One aspect of complexity is the operator's subjective experience of the given difficulties of the event. Another related aspect of complexity is subject matter experts' ratings of the complexity of the event. A definition and a measure of this part of complexity are being investigated at the OECD Halden Reactor Project in Norway. This paper focuses on the results from a study of simulated scenarios carried out in the Halden Man-Machine Laboratory, which is a full-scope PWR simulator. Six crews of two licensed operators each performed in 16 scenarios (simulated events). Before the experiment, subject matter experts rated the complexity of the scenarios using a Complexity Profiling Questionnaire, which contains eight previously identified dimensions associated with complexity. After completing the scenarios, the operators received a questionnaire containing 39 questions about perceived complexity; this questionnaire was used to develop a measure of subjective complexity. The results from the study indicated that process experts' ratings of scenario complexity, using the Complexity Profiling Questionnaire, predicted crew performance quite well. The results further indicated that a measure of subjective complexity related to crew performance could be developed. Subjective complexity was found to be related to subjective workload. (author)

  2. An Efficient Approach to Mining Maximal Contiguous Frequent Patterns from Large DNA Sequence Databases

    Directory of Open Access Journals (Sweden)

    Md. Rezaul Karim

    2012-03-01

    Mining interesting patterns from DNA sequences is one of the most challenging tasks in bioinformatics and computational biology. Maximal contiguous frequent patterns are preferable for expressing the function and structure of DNA sequences and hence can capture the common data characteristics among related sequences. Biologists are interested in finding frequent orderly arrangements of motifs that are responsible for similar expression of a group of genes. In order to reduce mining time and complexity, however, most existing sequence mining algorithms either focus on finding short DNA sequences or require explicit specification of sequence lengths in advance. The challenge is to find longer sequences without specifying sequence lengths in advance. In this paper, we propose an efficient approach to mining maximal contiguous frequent patterns from large DNA sequence datasets. The experimental results show that our proposed approach is memory-efficient and mines maximal contiguous frequent patterns within a reasonable time.
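
The task itself can be stated as a small brute-force reference implementation. This is not the paper's memory-efficient algorithm, only a sketch that defines what "maximal contiguous frequent pattern" means: a substring occurring in at least `min_sup` sequences that is not contained in any longer frequent substring:

```python
from collections import defaultdict

def maximal_contiguous_frequent(sequences, min_sup):
    """Brute-force reference (exponentially wasteful on long sequences):
    enumerate every contiguous substring, count in how many sequences it
    occurs, keep the frequent ones, then drop any pattern that is contained
    in a longer frequent pattern."""
    support = defaultdict(int)
    for seq in sequences:
        # a set per sequence so each substring counts once per sequence
        for s in {seq[i:j] for i in range(len(seq))
                           for j in range(i + 1, len(seq) + 1)}:
            support[s] += 1
    frequent = {s for s, n in support.items() if n >= min_sup}
    return {s for s in frequent
            if not any(s != t and s in t for t in frequent)}

if __name__ == "__main__":
    # "ATCG" occurs in all three sequences and no longer substring does
    print(maximal_contiguous_frequent(["ATCGTT", "ATCGA", "CATCG"], 3))
```

Because every frequent substring of a maximal pattern is itself frequent, the mining problem reduces to finding the longest such patterns directly, which is what the paper's approach does without enumerating all substrings.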

  3. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    Science.gov (United States)

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
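
The intersection matrix at the core of the method can be sketched in a few lines. The normalization below is a plain Jaccard overlap, a simplified stand-in for the overlap measure used by ASSET; bin contents in the usage are invented:

```python
def intersection_matrix(binned):
    """binned: one set of active-neuron ids per time bin. Entry (i, j) is the
    Jaccard overlap |S_i & S_j| / |S_i | S_j| (a simplified version of the
    ASSET overlap measure). Repeated SSEs appear as off-diagonal stretches
    of high values, which the full method then detects automatically."""
    n = len(binned)
    mat = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            union = binned[i] | binned[j]
            mat[i][j] = len(binned[i] & binned[j]) / len(union) if union else 0.0
    return mat

if __name__ == "__main__":
    # bins 2-3 repeat the synchronous events seen in bins 0-1
    bins = [{1, 2, 3}, {4, 5}, {1, 2, 3}, {4, 5}]
    m = intersection_matrix(bins)
    print(m[0][2], m[1][3], m[0][1])  # 1.0 1.0 0.0
```

The repeated sequence shows up as the high-valued diagonal running through entries (0, 2) and (1, 3); ASSET's contribution is replacing visual inspection of such diagonals with a statistical clustering procedure.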

  4. Children's Representation and Imitation of Events: How Goal Organization Influences 3-Year-Old Children's Memory for Action Sequences.

    Science.gov (United States)

    Loucks, Jeff; Mutschler, Christina; Meltzoff, Andrew N

    2017-09-01

    Children's imitation of adults plays a prominent role in human cognitive development. However, few studies have investigated how children represent the complex structure of observed actions which underlies their imitation. We integrate theories of action segmentation, memory, and imitation to investigate whether children's event representation is organized according to veridical serial order or a higher level goal structure. Children were randomly assigned to learn novel event sequences either through interactive hands-on experience (Study 1) or via storybook (Study 2). Results demonstrate that children's representation of observed actions is organized according to higher level goals, even at the cost of representing the veridical temporal ordering of the sequence. We argue that prioritizing goal structure enhances event memory, and that this mental organization is a key mechanism of social-cognitive development in real-world, dynamic environments. It supports cultural learning and imitation in ecologically valid settings when social agents are multitasking and not demonstrating one isolated goal at a time. Copyright © 2016 Cognitive Science Society, Inc.

  5. A unified approach of catastrophic events

    Directory of Open Access Journals (Sweden)

    S. Nikolopoulos

    2004-01-01

    Although there is an accumulated body of theoretical, computational, and numerical work, including catastrophe theory, bifurcation theory, and stochastic and deterministic chaos theory, there is a widespread feeling that these subjects do not completely cover the physics of real catastrophic events. Recent studies have suggested that a large variety of complex processes, including earthquakes, heartbeats, and neuronal dynamics, exhibit statistical similarities. Here we study, in terms of complexity and nonlinear techniques, whether isomorphic signatures emerge that indicate the transition from the normal state to both geological and biological shocks. In the last 15 years, the study of complex systems has emerged as a recognized field in its own right, although a good definition of what a complex system is has remained elusive. A basic reason for our interest in complexity is the striking similarity in behaviour close to irreversible phase transitions among systems that are otherwise quite different in nature. It is by now recognized that pre-seismic electromagnetic time series contain valuable information about the earthquake preparation process, which cannot be extracted without the use of significant computational power, probably in connection with computer algebra techniques. This paper presents an analysis whose aim is to indicate the approach of the global instability in the pre-focal area. Non-linear characteristics are studied by applying two techniques, namely correlation dimension estimation and approximate entropy. These two non-linear techniques lead to coherent conclusions, and could cooperate with an independent fractal spectral analysis to provide a detection of the emergence of the nucleation phase of the impending catastrophic event.
In the context of similar mathematical background, it would be interesting to augment this description of pre-seismic electromagnetic anomalies in order to cover biological
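
Of the two techniques named above, approximate entropy is compact enough to sketch directly. This follows the standard Pincus definition (with the common `r = 0.2 × std` tolerance); it is a generic implementation, not the authors' code:

```python
import math
import random

def approximate_entropy(x, m=2, r=None):
    """Approximate Entropy ApEn(m, r): a regularity statistic. Lower values
    indicate more regular, predictable dynamics; a drop in ApEn is the kind
    of signature sought before a catastrophic event."""
    n = len(x)
    if r is None:                                   # common choice: 0.2 * std
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    def phi(m):
        tpl = [x[i:i + m] for i in range(n - m + 1)]
        logs = []
        for a in tpl:
            # count templates within Chebyshev distance r (self-match included)
            c = sum(1 for b in tpl
                    if max(abs(u - v) for u, v in zip(a, b)) <= r)
            logs.append(math.log(c / len(tpl)))
        return sum(logs) / len(logs)
    return phi(m) - phi(m + 1)

if __name__ == "__main__":
    periodic = [0.0, 1.0] * 50
    rng = random.Random(0)
    noisy = [rng.random() for _ in range(100)]
    print(approximate_entropy(periodic), approximate_entropy(noisy))
```

Running it on a strictly periodic signal versus uniform noise shows the expected ordering: the periodic series scores near zero, the noisy one substantially higher.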

  6. Constraining the magnitude of the largest event in a foreshock-main shock-aftershock sequence

    Science.gov (United States)

    Shcherbakov, Robert; Zhuang, Jiancang; Ogata, Yosihiko

    2018-01-01

    Extreme value statistics and Bayesian methods are used to constrain the magnitudes of the largest expected earthquakes in a sequence governed by the parametric time-dependent occurrence rate and frequency-magnitude statistics. The Bayesian predictive distribution for the magnitude of the largest event in a sequence is derived. Two types of sequences are considered, that is, the classical aftershock sequences generated by large main shocks and the aftershocks generated by large foreshocks preceding a main shock. For the former sequences, the early aftershocks during a training time interval are used to constrain the magnitude of the future extreme event during the forecasting time interval. For the latter sequences, the earthquakes preceding the main shock are used to constrain the magnitudes of the subsequent extreme events including the main shock. The analysis is applied retrospectively to past prominent earthquake sequences.
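
A stripped-down version of the underlying extreme-value calculation can be written out for fixed parameters. The sketch below combines a Poisson event count with Gutenberg-Richter magnitudes; it is a simplified, fixed-parameter stand-in for the paper's Bayesian predictive distribution, which additionally integrates over parameter and rate uncertainty. All numbers in the usage are illustrative:

```python
import math

def prob_max_exceeds(m, n_expected, b=1.0, m_min=3.0):
    """P(largest magnitude >= m) in a forecasting window, assuming a Poisson
    number of events (mean n_expected above completeness magnitude m_min)
    with Gutenberg-Richter distributed magnitudes:
    P = 1 - exp(-n_expected * 10**(-b*(m - m_min)))."""
    rate_above_m = n_expected * 10.0 ** (-b * (m - m_min))
    return 1.0 - math.exp(-rate_above_m)

if __name__ == "__main__":
    # 200 expected events above M3 during the forecast window (illustrative)
    for m in (5.0, 6.0, 7.0):
        print(f"P(M_max >= {m}) = {prob_max_exceeds(m, n_expected=200.0):.3f}")
```

Constraining the expected count `n_expected` from the early events of a sequence (the "training interval") and propagating its uncertainty is what turns this frequentist formula into the Bayesian predictive distribution derived in the paper.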

  7. Results of the event sequence reliability benchmark exercise

    International Nuclear Information System (INIS)

    Silvestri, E.

    1990-01-01

    The Event Sequence Reliability Benchmark Exercise is the fourth in a series of benchmark exercises on reliability and risk assessment, with specific reference to nuclear power plant applications, and is the logical continuation of the previous benchmark exercises on system analysis, common cause failure, and human factors. The reference plant is the nuclear power plant at Grohnde, Federal Republic of Germany, a 1300 MW PWR plant of KWU design. The specific objective of the Exercise is to model, quantify, and analyze the event sequences initiated by the occurrence of a loss of offsite power that involve the steam generator feed. The general aim is to develop a segment of a risk assessment which includes all the specific aspects and models of quantification, such as common cause failure, human factors, and system analysis, developed in the previous reliability benchmark exercises, with the addition of the specific topics of dependences between homologous components belonging to different systems featuring in a given event sequence and of uncertainty quantification, ending with an overall assessment of the state of the art in risk assessment and the relative influences of quantification problems in a general risk assessment framework. The Exercise has been carried out in two phases, both requiring modelling and quantification, with the second phase adopting more restrictive rules and fixing certain common data, as emerged necessary from the first phase. Fourteen teams have participated in the Exercise, mostly from EEC countries, with one from Sweden and one from the USA. (author)

  8. Modeling the Process of Event Sequence Data Generated for Working Condition Diagnosis

    Directory of Open Access Journals (Sweden)

    Jianwei Ding

    2015-01-01

    Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main task of surveillance focuses on analyzing these routinely collected telemetry data to help assess the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all the telemetry data and understand the working condition of the equipment without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which is capable of simulating the process by which event sequence data are generated and of depicting the working condition of equipment at runtime. With the help of the WCM, we are able to analyze how event sequence data behave in different working modes and, meanwhile, to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied the WCM to illustrative applications such as automated detection of an anomalous event sequence during the runtime of equipment. Our experimental results on real data sets demonstrate the effectiveness of the model.
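
The generate-then-diagnose idea can be illustrated with a toy generative model. This is only in the spirit of the WCM, not the paper's model: the mode names, event types, and probabilities below are all invented, and a real WCM would also model temporal structure:

```python
import math
import random

# Each working mode emits event types with its own categorical distribution
# (hypothetical values); diagnosis picks the mode maximizing the likelihood
# of an observed event sequence.
MODES = {
    "normal":   {"start": 0.4, "run": 0.5, "warn": 0.1},
    "degraded": {"start": 0.2, "run": 0.3, "warn": 0.5},
}

def generate(mode, n, seed=0):
    """Sample an event sequence of length n from the given working mode."""
    rng = random.Random(seed)
    probs = MODES[mode]
    names, weights = list(probs), list(probs.values())
    return [rng.choices(names, weights=weights)[0] for _ in range(n)]

def diagnose(events):
    """Return the mode with the highest log-likelihood for the sequence."""
    def loglik(mode):
        return sum(math.log(MODES[mode][e]) for e in events)
    return max(MODES, key=loglik)

if __name__ == "__main__":
    seq = generate("degraded", 50, seed=1)
    print(seq[:8], "->", diagnose(seq))
```

An anomaly detector in the same spirit would flag a sequence whose likelihood is low under every known mode, rather than assigning it to the best of a bad set.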

  9. Plastome Sequence Determination and Comparative Analysis for Members of the Lolium-Festuca Grass Species Complex

    OpenAIRE

    Hand, Melanie L.; Spangenberg, German C.; Forster, John W.; Cogan, Noel O. I.

    2013-01-01

    Chloroplast genome sequences are of broad significance in plant biology, due to frequent use in molecular phylogenetics, comparative genomics, population genetics, and genetic modification studies. The present study used a second-generation sequencing approach to determine and assemble the plastid genomes (plastomes) of four representatives from the agriculturally important Lolium-Festuca species complex of pasture grasses (Lolium multiflorum, Festuca pratensis, Festuca altissima, and Festuca...

  10. Evidence for a Complex Mosaic Genome Pattern in a Full-length Hepatitis C Virus Sequence

    Directory of Open Access Journals (Sweden)

    R.S. Ross

    2008-01-01

    The genome of the hepatitis C virus (HCV) exhibits a high genetic variability. This remarkable heterogeneity is mainly attributed to the gradual accumulation of mutational changes, whereas the contribution of recombination events to the evolution of HCV has so far remained controversial. While performing phylogenetic analyses including a large number of sequences deposited in GenBank, we encountered a full-length HCV sequence (AY651061) that showed evidence of inter-subtype recombination and was, therefore, subjected to a detailed analysis of its molecular structure. The obtained results indicated that AY651061 does not represent a “simple” HCV 1c isolate, but a complex 1a/1c mosaic genome, showing five putative breakpoints in the core to NS3 regions. To our knowledge, this is the first report of a mosaic HCV full-length sequence with multiple breakpoints. The molecular structure of AY651061 is reminiscent of complex homologous recombinant variants occurring among other members of the family Flaviviridae, e.g. GB virus C, dengue virus, and Japanese encephalitis virus. Our finding of a mosaic HCV sequence may have important implications for many fields of current HCV research which merit careful consideration.

  11. Characteristics of aperiodic sequence of slip events caused by interaction between seismic patches and that caused by self-organized stress heterogeneity

    Science.gov (United States)

    Kato, N.

    2017-12-01

    Numerical simulations of earthquake cycles are conducted to investigate the origin of complexity of earthquake recurrence. There are two main causes of the complexity. One is self-organized stress heterogeneity due to dynamical effects. The other is the effect of interaction between fault patches. In the model, friction on the fault is assumed to obey a rate- and state-dependent friction law. Circular patches of velocity-weakening frictional property are assumed on the fault; on the remaining areas of the fault, velocity-strengthening friction is assumed. We consider three models: a single-patch model, a two-patch model, and a three-patch model. In the first model, the dynamical effect is mainly examined; the latter two models take into consideration the effect of interaction as well as the dynamical effect. Complex multiperiodic or aperiodic sequences of slip events occur when slip behavior changes from seismic to aseismic, and when the degree of interaction between seismic patches is intermediate. The former is observed in all the models, and the latter is observed in the two-patch and three-patch models. Evolution of the spatial distribution of shear stress on the fault suggests that aperiodicity at the transition from seismic to aseismic slip is caused by self-organized stress heterogeneity. The iteration maps of recurrence intervals of slip events in aperiodic sequences are examined; they are approximately expressed by simple curves for aperiodicity at the transition from seismic to aseismic slip. In contrast, the iteration maps for aperiodic sequences caused by interaction between seismic patches are scattered and not expressed by simple curves. This result suggests that complex sequences caused by the two different mechanisms may be distinguished.

  12. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    Full Text Available The aim of this paper is to find and compare existing solutions for complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high-frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed method of decision making. The decision-making process will be described by a formal grammar. As there are many CEP solutions, we take the following characteristics into consideration: processing in real time, the ability to process high-volume data from multiple sources, platform independence, support for integration with user solutions, and an open license. First we discuss existing CEP tools and their use in practice. Then we describe the design of a method for formalizing the business rules used for decision making. Afterwards, we focus on the two platforms that seem to be the best fit for integrating our solution and list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated method for decision support.

  13. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  14. ISVASE: identification of sequence variant associated with splicing event using RNA-seq data.

    Science.gov (United States)

    Aljohi, Hasan Awad; Liu, Wanfei; Lin, Qiang; Yu, Jun; Hu, Songnian

    2017-06-28

    Exon recognition and precise, efficient splicing by the spliceosome are the key to generating mature mRNAs. About one third to one half of disease-related mutations affect RNA splicing. The software PVAAS has been developed to identify variants associated with aberrant splicing directly from RNA-seq data. However, it is based on the assumption that every annotated splicing site represents normal splicing, which is not always true. We developed ISVASE, a tool for specifically identifying sequence variants associated with splicing events (SVASE) using RNA-seq data. Compared with PVAAS, our tool has several advantages: multi-pass stringent rule-dependent and statistical filters, use of split-reads only, independent sequence variant identification in each part of a splicing junction, sequence variant detection for both known and novel splicing events, additional exon-exon junction shift event detection if known splicing events are provided, splicing signal evaluation, support for known DNA mutation and/or RNA editing data, higher precision and consistency, and short running time. Using a realistic RNA-seq dataset, we performed a case study to illustrate the functionality and effectiveness of our method. Moreover, the output of SVASEs can be used for downstream analysis such as splicing regulatory element study and sequence variant functional analysis. ISVASE is useful for researchers interested in sequence variants (DNA mutation and/or RNA editing) associated with splicing events. The package is freely available at https://sourceforge.net/projects/isvase/.

  15. Aftershock Sequences and Seismic-Like Organization of Acoustic Events Produced by a Single Propagating Crack

    Science.gov (United States)

    Alizee, D.; Bonamy, D.

    2017-12-01

    In inhomogeneous brittle solids like rocks, concrete or ceramics, one usually distinguishes nominally brittle fracture, driven by the propagation of a single crack, from quasibrittle fracture, which results from the accumulation of many microcracks. The latter goes along with intermittent sharp noise, as revealed e.g. by the acoustic emission observed in lab-scale compressive fracture experiments or, at geophysical scale, in seismic activity. In both cases, statistical analyses have revealed a complex time-energy organization into aftershock sequences obeying a range of robust empirical scaling laws (the Omori-Utsu, productivity and Bath's laws) that help carry out seismic hazard analysis and damage mitigation. These laws are usually conjectured to emerge from the collective dynamics of microcrack nucleation. In the experiments presented at AGU, we will show that such a statistical organization is not specific to quasi-brittle multicracking situations, but also rules the acoustic events produced by a single crack slowly driven in an artificial rock made of sintered polymer beads. This simpler situation has advantageous properties (statistical stationarity in particular) permitting us to uncover the origins of these seismic laws: both the productivity law and Bath's law result from the scale-free statistics of event energy, and the Omori-Utsu law results from the scale-free statistics of inter-event time. This yields analytically derived predictions on how the associated parameters are related. Surprisingly, the so-obtained relations are also compatible with observations on lab-scale compressive fracture experiments, suggesting that, in these complex multicracking situations too, the organization into aftershock sequences and the associated seismic laws are ruled by the propagation of individual microcrack fronts, and not by collective, stress-mediated microcrack nucleation. Conversely, the relations are not fulfilled in seismology signals, suggesting that

  16. Symbolic complexity for nucleotide sequences: a sign of the genome structure

    International Nuclear Information System (INIS)

    Salgado-García, R; Ugalde, E

    2016-01-01

    We introduce a method for estimating the complexity function (which counts the number of observable words of a given length) of a finite symbolic sequence, which we use to estimate the complexity function of coding DNA sequences for several species of the Hominidae family. In all cases, the obtained symbolic complexities show the same characteristic behavior: exponential growth for small word lengths, followed by linear growth for larger word lengths. The symbolic complexities of the species we consider exhibit a systematic trend in correspondence with the phylogenetic tree. Using our method, we estimate the complexity function of sequences obtained by some known evolution models, and in some cases we observe the characteristic exponential-linear growth of the Hominidae coding DNA complexity. Analysis of the symbolic complexity of sequences obtained from a specific evolution model points to the following conclusion: linear growth arises from the random duplication of large segments during the evolution of the genome, while the decrease in the overall complexity from one species to another is due to a difference in the speed of accumulation of point mutations. (paper)
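    As a sketch of the word-counting step described above: the complexity function of a symbolic sequence counts the distinct words (substrings) of each length. The function name and toy sequence below are illustrative, not the authors' implementation.

```python
def complexity_function(seq, max_len):
    """Return [c_1, ..., c_max_len], where c_n is the number of
    distinct length-n words observable in seq."""
    counts = []
    for n in range(1, max_len + 1):
        words = {seq[i:i + n] for i in range(len(seq) - n + 1)}
        counts.append(len(words))
    return counts

# For real coding-DNA sequences, the abstract reports that c_n grows
# exponentially for small n and roughly linearly for larger n.
dna = "ACGTACGTTGCAACGT"
print(complexity_function(dna, 4))
```

    Estimating the two asymptotic regimes then reduces to fitting the small-n and large-n portions of this curve separately.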

  17. Event sequence quantification for a loss of shutdown cooling accident in the GCFR

    International Nuclear Information System (INIS)

    Frank, M.; Reilly, J.

    1979-10-01

    A summary is presented of the core-wide sequence of events of a postulated total loss of forced and natural convection decay heat removal in a shutdown Gas-Cooled Fast Reactor (GCFR). It outlines the analytical methods and results for the progression of the accident sequence. This hypothetical accident proceeds in the distinct phases of cladding melting, assembly wall melting and molten steel relocation into the interassembly spacing, and fuel relocation. It identifies the key phenomena of the event sequence and the concerns and mechanisms of both recriticality and recriticality prevention

  18. A Synergistic Approach towards Autonomic Event Management in Supply Chains

    OpenAIRE

    Oberhauser, Roy

    2009-01-01

    The increasing reliance on SCs, coupled with increasing complexity, dynamism and heightened quality expectations, are necessarily reflected in the SCMS and implicitly in the need for improved SCEM to limit disruptions and achieve self-X qualities. A novel synergistic approach to SCEM, as presented in this chapter (SASCEM), leverages the computing paradigms of granular, semantic web, service-oriented, space-based, event-based, context-

  19. Event-based state estimation for a class of complex networks with time-varying delays: A comparison principle approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenbing [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Wang, Zidong [Department of Computer Science, Brunel University London, Uxbridge, Middlesex, UB8 3PH (United Kingdom); Liu, Yurong, E-mail: yrliu@yzu.edu.cn [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia); Ding, Derui [Shanghai Key Lab of Modern Optical System, Department of Control Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093 (China); Alsaadi, Fuad E. [Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2017-01-05

    The paper is concerned with the state estimation problem for a class of time-delayed complex networks with event-triggering communication protocol. A novel event generator function, which is dependent not only on the measurement output but also on a predefined positive constant, is proposed with hope to reduce the communication burden. A new concept of exponentially ultimate boundedness is provided to quantify the estimation performance. By means of the comparison principle, some sufficient conditions are obtained to guarantee that the estimation error is exponentially ultimately bounded, and then the estimator gains are obtained in terms of the solution of certain matrix inequalities. Furthermore, a rigorous proof is proposed to show that the designed triggering condition is free of the Zeno behavior. Finally, a numerical example is given to illustrate the effectiveness of the proposed event-based estimator. - Highlights: • An event-triggered estimator is designed for complex networks with time-varying delays. • A novel event generator function is proposed to reduce the communication burden. • The comparison principle is utilized to derive the sufficient conditions. • The designed triggering condition is shown to be free of the Zeno behavior.
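    A minimal sketch of an event-triggered transmission rule of the general kind described above: a node transmits only when its output drifts from the last transmitted value by more than a predefined positive constant. This simplified absolute-threshold rule and all names are illustrative assumptions, not the paper's generator function.

```python
def event_triggered(measurements, delta):
    """Return the time indices at which the sensor transmits."""
    sent = [0]                      # always transmit the initial sample
    last = measurements[0]
    for k, y in enumerate(measurements[1:], start=1):
        if abs(y - last) > delta:   # event generator: error exceeds the constant bound
            sent.append(k)
            last = y
    return sent

ys = [0.0, 0.05, 0.12, 0.4, 0.42, 0.9]
print(event_triggered(ys, 0.25))    # transmits at indices 0, 3 and 5 only
```

    Only three of the six samples cross the network, illustrating the reduced communication burden; between events the estimator holds the last received value.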

  20. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

    Full Text Available Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems. Event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are predecided by experts, increases the difficulty of effective complex event processing. It inevitably involves various types of intrinsic uncertainty, such as imprecision, fuzziness, and incompleteness, due to the limitations of human subjective judgment. D numbers are a new mathematical tool for modeling uncertainty, since they relax the condition that elements on the frame of discernment must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
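    The classical TOPSIS ranking step that such a method builds on can be sketched as follows; the D-numbers fusion that precedes it in the paper is not shown, and the numbers and names are illustrative.

```python
import math

def topsis(matrix, weights, benefit):
    """Score alternatives (rows) on criteria (columns); higher is better.
    benefit[j] is True if criterion j is to be maximized."""
    ncols = len(weights)
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncols)] for row in matrix]
    # ideal / anti-ideal points, respecting benefit vs. cost criteria
    ideal = [max(r[j] for r in v) if benefit[j] else min(r[j] for r in v)
             for j in range(ncols)]
    anti = [min(r[j] for r in v) if benefit[j] else max(r[j] for r in v)
            for j in range(ncols)]
    # relative closeness to the ideal solution
    return [math.dist(r, anti) / (math.dist(r, ideal) + math.dist(r, anti))
            for r in v]

# three candidate pattern rules scored on two equally weighted benefit criteria
scores = topsis([[7, 9], [8, 7], [9, 6]], [0.5, 0.5], [True, True])
```

    The highest closeness score identifies the preferred alternative; in a CEP engine this ranking would select among competing pattern rules.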

  1. The Use of Next Generation Sequencing and Junction Sequence Analysis Bioinformatics to Achieve Molecular Characterization of Crops Improved Through Modern Biotechnology

    Directory of Open Access Journals (Sweden)

    David Kovalic

    2012-11-01

    Full Text Available The assessment of genetically modified (GM) crops for regulatory approval currently requires a detailed molecular characterization of the DNA sequence and integrity of the transgene locus. In addition, molecular characterization is a critical component of event selection and advancement during product development. Typically, molecular characterization has relied on Southern blot analysis to establish locus and copy number, along with targeted sequencing of polymerase chain reaction products spanning any inserted DNA, to complete the characterization process. Here we describe the use of next generation (NexGen) sequencing and junction sequence analysis bioinformatics in a new method for achieving full molecular characterization of a GM event without the need for Southern blot analysis. In this study, we examine a typical GM soybean [Glycine max (L.) Merr.] line and demonstrate that this new method provides molecular characterization equivalent to the current Southern blot-based method. We also examine an event containing in vivo DNA rearrangement of multiple transfer DNA inserts to demonstrate that the new method is effective at identifying complex cases. Next generation sequencing and bioinformatics offers certain advantages over current approaches, most notably the simplicity, efficiency, and consistency of the method, and provides a viable alternative for efficiently and robustly achieving molecular characterization of GM crops.

  2. Combinatorial events of insertion sequences and ICE in Gram-negative bacteria.

    Science.gov (United States)

    Toleman, Mark A; Walsh, Timothy R

    2011-09-01

    The emergence of antibiotic and antimicrobial resistance in Gram-negative bacteria is incremental and linked to genetic elements that function in a so-called 'one-ended transposition' manner, including ISEcp1, ISCR elements and Tn3-like transposons. The power of these elements lies in their inability to consistently recognize one of their own terminal sequences, while recognizing more genetically distant surrogate sequences. This has the effect of mobilizing the DNA sequence found adjacent to their initial location. In general, resistance in Gram-negatives is closely linked to a few one-off events. These include the capture of the class 1 integron by a Tn5090-like transposon; the formation of the 3' conserved segment (3'-CS); and the fusion of the ISCR1 element to the 3'-CS. The structures formed by these rare events have been massively amplified and disseminated in Gram-negative bacteria but, hitherto, are rarely found in Gram-positives. Such events dominate current resistance gene acquisition and are instrumental in the construction of large resistance gene islands on chromosomes and plasmids. Similar combinatorial events appear to have occurred between conjugative plasmids and phages, constructing hybrid elements called integrative and conjugative elements or conjugative transposons. These elements are beginning to be closely linked to some of the more powerful resistance mechanisms such as the extended spectrum β-lactamases, metallo- and AmpC-type β-lactamases. Antibiotic resistance in Gram-negative bacteria is dominated by unusual combinatorial mistakes of insertion sequences and gene fusions which have been selected and amplified by antibiotic pressure, enabling the formation of extended resistance islands. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  3. An Autonomous System for Grouping Events in a Developing Aftershock Sequence

    Energy Technology Data Exchange (ETDEWEB)

    Harris, D. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dodge, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-03-22

    We describe a prototype detection framework that automatically clusters events in real time from a rapidly unfolding aftershock sequence. We use the fact that many aftershocks are repetitive, producing similar waveforms. By clustering events based on correlation measures of waveform similarity, the number of independent event instances that must be examined in detail by analysts may be reduced. Our system processes array data and acquires waveform templates with a short-term average (STA)/long-term average (LTA) detector operating on a beam directed at the P phases of the aftershock sequence. The templates are used to create correlation-type (subspace) detectors that sweep the subsequent data stream for occurrences of the same waveform pattern. Events are clustered by association with a particular detector. Hundreds of subspace detectors can run in this framework a hundred times faster than in real time. Nonetheless, to check the growth in the number of detectors, the framework pauses periodically and reclusters detections to reduce the number of event groups. These groups define new subspace detectors that replace the older generation of detectors. Because low-magnitude occurrences of a particular signal template may be missed by the STA/LTA detector, we advocate restarting the framework from the beginning of the sequence periodically to reprocess the entire data stream with the existing detectors. We tested the framework on 10 days of data from the Nevada Seismic Array (NVAR) covering the 2003 San Simeon earthquake. One hundred eighty-four automatically generated detectors produced 676 detections resulting in a potential reduction in analyst workload of up to 73%.
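    The template-acquisition stage can be illustrated with a bare-bones STA/LTA trigger (the subspace detectors that follow it are beyond a short sketch). The window lengths, threshold and synthetic trace below are illustrative assumptions:

```python
def sta_lta(trace, n_sta, n_lta, threshold):
    """Return indices where the short-term/long-term average ratio
    first crosses the threshold (one pick per excursion)."""
    triggers, armed = [], True
    for i in range(n_lta, len(trace)):
        sta = sum(abs(x) for x in trace[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in trace[i - n_lta:i]) / n_lta
        ratio = sta / lta if lta > 0 else 0.0
        if ratio > threshold and armed:
            triggers.append(i)      # declare a detection; a template starts here
            armed = False
        elif ratio <= threshold:
            armed = True            # re-arm once the excursion has passed
    return triggers

# synthetic beam: background noise with one burst starting at sample 100
trace = [0.1] * 100 + [5.0] * 20 + [0.1] * 100
picks = sta_lta(trace, n_sta=5, n_lta=50, threshold=4.0)
```

    Each pick would seed a waveform template from which a correlation-type (subspace) detector is built to sweep the subsequent data stream.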

  4. Prediction of accident sequence probabilities in a nuclear power plant due to earthquake events

    International Nuclear Information System (INIS)

    Hudson, J.M.; Collins, J.D.

    1980-01-01

    This paper presents a methodology to predict accident probabilities in nuclear power plants subject to earthquakes. The resulting computer program accesses response data to compute component failure probabilities using fragility functions. Using logical failure definitions for systems, and the calculated component failure probabilities, initiating event and safety system failure probabilities are synthesized. The incorporation of accident sequence expressions allows the calculation of terminal event probabilities. Accident sequences, with their occurrence probabilities, are finally coupled to a specific release category. A unique aspect of the methodology is an analytical procedure for calculating top event probabilities based on the correlated failure of primary events
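    A sketch of the fragility-function step, under the common assumption of a lognormal fragility curve; the paper's analytical treatment of correlated primary-event failures is not reproduced here (components are taken as independent), and all parameter values are illustrative.

```python
import math

def fragility(pga, median, beta):
    """P(component fails | peak ground acceleration), lognormal form with
    median capacity `median` (in g) and log-standard deviation `beta`."""
    z = math.log(pga / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def system_failure(pga, components):
    """Failure probability of a series system of independent components:
    the system fails if any component fails."""
    p_survive = 1.0
    for median, beta in components:
        p_survive *= 1.0 - fragility(pga, median, beta)
    return 1.0 - p_survive

components = [(0.6, 0.4), (0.9, 0.3)]     # (median capacity in g, beta)
p_fail = system_failure(0.5, components)  # at 0.5 g peak ground acceleration
```

    In the full methodology these probabilities feed logical failure definitions for initiating events and safety systems, which are then combined along accident sequences.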

  5. Compliance with Environmental Regulations through Complex Geo-Event Processing

    Directory of Open Access Journals (Sweden)

    Federico Herrera

    2017-11-01

    Full Text Available In a context of e-government, there are usually regulatory compliance requirements that support systems must monitor, control and enforce. These requirements may come from environmental laws and regulations that aim to protect the natural environment and mitigate the effects of pollution on human health and ecosystems. Monitoring compliance with these requirements involves processing a large volume of data from different sources, which is a major challenge. This volume is also increased by data coming from autonomous sensors (e.g. reporting carbon emissions in protected areas) and from citizens providing information in a voluntary way (e.g. reporting illegal dumping). Complex Event Processing (CEP) technologies allow processing large amounts of event data and detecting patterns in them. However, they do not provide native support for the geographic dimension of events, which is essential for monitoring requirements that apply to specific geographic areas. This paper proposes a geospatial extension for CEP that allows monitoring environmental requirements considering the geographic location of the processed data. We extend an existing platform-independent, model-driven approach for CEP, adding geographic location to events and specifying patterns using geographic operators. The use and technical feasibility of the proposal is shown through the development of a case study and the implementation of a prototype.
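    A toy illustration of the kind of geographic operator such an extension needs: a point-in-polygon predicate used to filter an event stream against a protected area. The ray-casting test, field names and thresholds below are illustrative assumptions, not the paper's platform.

```python
def inside(point, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    x, y = point
    n, hit = len(polygon), False
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # count edge crossings of a ray cast to the right of the point
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            hit = not hit
    return hit

protected_area = [(0, 0), (4, 0), (4, 4), (0, 4)]   # illustrative square region
events = [{"pos": (1, 1), "co2": 9.0}, {"pos": (7, 2), "co2": 3.0}]
# geographic pattern: high emission reported inside the protected area
alerts = [e for e in events if inside(e["pos"], protected_area) and e["co2"] > 5.0]
```

    A geospatial CEP pattern would express this same filter declaratively, with the geographic operator evaluated by the engine as events arrive.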

  6. Licensee Event Report sequence coding and search procedure workshop

    International Nuclear Information System (INIS)

    Cottrell, W.B.; Gallaher, R.B.

    1981-01-01

    Since mid-1980, the Office for Analysis and Evaluation of Operational Data (AEOD) of the Nuclear Regulatory Commission (NRC) has been developing procedures for the systematic review and analysis of Licensee Event Reports (LERs). These procedures generally address several areas of concern, including identification of significant trends and patterns, event sequence of occurrences, component failures, and system and plant effects. The AEOD and NSIC conducted a workshop on the new coding procedure at the American Museum of Science and Energy in Oak Ridge, TN, on November 24, 1980

  7. Perceived synchrony for realistic and dynamic audiovisual events.

    Science.gov (United States)

    Eg, Ragnhild; Behne, Dawn M

    2015-01-01

    In well-controlled laboratory experiments, researchers have found that humans can perceive delays between auditory and visual signals as short as 20 ms. Conversely, other experiments have shown that humans can tolerate audiovisual asynchrony that exceeds 200 ms. This seeming contradiction in human temporal sensitivity can be attributed to a number of factors such as experimental approaches and precedence of the asynchronous signals, along with the nature, duration, location, complexity and repetitiveness of the audiovisual stimuli, and even individual differences. In order to better understand how temporal integration of audiovisual events occurs in the real world, we need to close the gap between the experimental setting and the complex setting of everyday life. With this work, we aimed to contribute one brick to the bridge that will close this gap. We compared perceived synchrony for long-running and eventful audiovisual sequences to shorter sequences that contain a single audiovisual event, for three types of content: action, music, and speech. The resulting windows of temporal integration showed that participants were better at detecting asynchrony for the longer stimuli, possibly because the long-running sequences contain multiple corresponding events that offer audiovisual timing cues. Moreover, the points of subjective simultaneity differ between content types, suggesting that the nature of a visual scene could influence the temporal perception of events. An expected outcome from this type of experiment was the rich variation among participants' distributions and the derived points of subjective simultaneity. Hence, the designs of similar experiments call for more participants than traditional psychophysical studies. Heeding this caution, we conclude that existing theories on multisensory perception are ready to be tested on more natural and representative stimuli.

  8. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    Science.gov (United States)

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics-that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
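    The context-based statistics can be sketched with a first-order Markov chain plus a "maximizing" observer that always predicts the most probable successor; the transition table and symbols are illustrative, not the study's stimuli.

```python
import random

random.seed(0)

# context-based statistics: symbol probability is contingent on the
# preceding symbol
TRANSITIONS = {
    "A": {"B": 0.8, "C": 0.2},
    "B": {"C": 0.8, "A": 0.2},
    "C": {"A": 0.8, "B": 0.2},
}

def generate(n, state="A"):
    """Sample a length-n symbol sequence from the Markov chain."""
    seq = [state]
    for _ in range(n - 1):
        nxt = TRANSITIONS[state]
        state = random.choices(list(nxt), weights=list(nxt.values()))[0]
        seq.append(state)
    return seq

def predict_max(state):
    """'Maximizing' strategy: always choose the most probable successor."""
    nxt = TRANSITIONS[state]
    return max(nxt, key=nxt.get)

seq = generate(1000)
acc = sum(predict_max(a) == b for a, b in zip(seq, seq[1:])) / (len(seq) - 1)
```

    The maximizer's accuracy approaches 0.8, the ceiling for this chain, whereas probability matching would average about 0.8² + 0.2² = 0.68, illustrating why the two strategies are distinguishable in behavioral data.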

  9. Contingency Analysis of Cascading Line Outage Events

    Energy Technology Data Exchange (ETDEWEB)

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
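    The tiered view of a cascade can be illustrated with a toy redistribution model (no power-flow solution; a tripped line simply sheds its load onto its untripped neighbors). All data structures and numbers are illustrative assumptions, not the paper's SERC network model.

```python
def cascade_tiers(loads, capacity, neighbors, initial):
    """Toy cascade: lines tripped in one tier shed their load equally onto
    untripped neighbors; any line pushed over capacity trips in the next tier."""
    loads = dict(loads)
    tripped = set(initial)
    tiers = [sorted(initial)]
    while True:
        # redistribute the load of lines tripped in the last tier
        for line in tiers[-1]:
            nbrs = [n for n in neighbors[line] if n not in tripped]
            for n in nbrs:
                loads[n] += loads[line] / len(nbrs)
        nxt = sorted(l for l in loads
                     if l not in tripped and loads[l] > capacity[l])
        if not nxt:
            return tiers
        tripped |= set(nxt)
        tiers.append(nxt)

# three-line chain: tripping A overloads B, which in turn overloads C
tiers = cascade_tiers(loads={"A": 80, "B": 60, "C": 30},
                      capacity={"A": 100, "B": 100, "C": 50},
                      neighbors={"A": ["B"], "B": ["A", "C"], "C": ["B"]},
                      initial=["A"])
```

    Ranking initiating events by the depth and size of the tiers they produce is the spirit of the visualization described above.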

  10. RAVEN. Dynamic Event Tree Approach Level III Milestone

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-07-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (DPRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used DPRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed to perform two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, the control logic infrastructure is used to model stochastic events, such as component failures, and perform uncertainty propagation. Such stochastic modeling is deployed using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This report focuses on the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, a DPRA analysis, using DET, of a simplified pressurized water reactor for a Station Black-Out (SBO) scenario is presented.

  11. RAVEN: Dynamic Event Tree Approach Level III Milestone

    Energy Technology Data Exchange (ETDEWEB)

    Andrea Alfonsi; Cristian Rabiti; Diego Mandelli; Joshua Cogliati; Robert Kinoshita

    2013-07-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (DPRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used DPRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed to perform two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, the control logic infrastructure is used to model stochastic events, such as component failures, and perform uncertainty propagation. Such stochastic modeling is deployed using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This report focuses on the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, a DPRA analysis, using DET, of a simplified pressurized water reactor for a Station Black-Out (SBO) scenario is presented.

  12. Complex Sequencing Problems and Local Search Heuristics

    NARCIS (Netherlands)

    Brucker, P.; Hurink, Johann L.; Osman, I.H.; Kelly, J.P.

    1996-01-01

    Many problems can be formulated as complex sequencing problems. We present problems in flexible manufacturing that admit such a formulation and apply local search methods such as iterative improvement, simulated annealing, and tabu search to solve these problems. Computational results are reported.
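One of the local search methods mentioned, simulated annealing, can be sketched on a toy single-machine sequencing problem (minimizing total weighted completion time with a swap neighborhood). The job data are invented; this is a generic illustration, not the paper's formulation.

```python
import random, math

def cost(seq, p, w):
    """Total weighted completion time of a job sequence."""
    t, c = 0, 0
    for j in seq:
        t += p[j]          # completion time of job j
        c += w[j] * t
    return c

def anneal(p, w, steps=5000, t0=50.0, alpha=0.999, seed=1):
    """Simulated annealing with a swap neighborhood and geometric cooling."""
    rng = random.Random(seed)
    seq = list(range(len(p)))
    best, best_cost = seq[:], cost(seq, p, w)
    cur_cost, temp = best_cost, t0
    for _ in range(steps):
        i, j = rng.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]      # propose a swap move
        new_cost = cost(seq, p, w)
        if new_cost <= cur_cost or rng.random() < math.exp((cur_cost - new_cost) / temp):
            cur_cost = new_cost              # accept (always if improving)
            if new_cost < best_cost:
                best, best_cost = seq[:], new_cost
        else:
            seq[i], seq[j] = seq[j], seq[i]  # undo the rejected move
        temp *= alpha                        # cool down
    return best, best_cost

p = [3, 1, 4, 2, 5]   # hypothetical processing times
w = [2, 5, 1, 4, 3]   # hypothetical job weights
seq, c = anneal(p, w)
```

For this objective the optimum is known analytically (order jobs by increasing p/w, the weighted-shortest-processing-time rule), which makes the toy problem convenient for checking that the annealer converges.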

  13. 454 sequencing reveals extreme complexity of the class II Major Histocompatibility Complex in the collared flycatcher

    Directory of Open Access Journals (Sweden)

    Gustafsson Lars

    2010-12-01

    Full Text Available Abstract Background Because of their functional significance, the Major Histocompatibility Complex (MHC) class I and II genes have been the subject of continuous interest in the fields of ecology, evolution and conservation. In some vertebrate groups the MHC consists of multiple loci with similar alleles; therefore, the multiple loci must be genotyped simultaneously. In such complex systems, understanding of the evolutionary patterns and their causes has been limited due to challenges posed by genotyping. Results Here we used 454 amplicon sequencing to characterize MHC class IIB exon 2 variation in the collared flycatcher, an important organism in evolutionary and immuno-ecological studies. On the basis of over 152,000 sequencing reads we identified 194 putative alleles in 237 individuals. We found an extreme complexity of the MHC class IIB in the collared flycatcher, with our estimates pointing to the presence of at least nine expressed loci and a large, though difficult to estimate precisely, number of pseudogene loci. Many similar alleles occurred in the pseudogenes, indicating either a series of recent duplications or extensive concerted evolution. The expressed alleles showed unambiguous signals of historical selection and the occurrence of apparent interlocus exchange of alleles. Placing the collared flycatcher's MHC sequences in the context of passerine diversity revealed transspecific MHC class II evolution within the Muscicapidae family. Conclusions 454 amplicon sequencing is an effective tool for advancing our understanding of the MHC class II structure and evolutionary patterns in Passeriformes. We found a highly dynamic pattern of evolution of MHC class IIB genes with strong signals of selection and pronounced sequence divergence in expressed genes, in contrast to the apparent sequence homogenization in pseudogenes.
We show that next generation sequencing offers a universal, affordable method for the characterization and, in perspective

  14. Sequence Matching Analysis for Curriculum Development

    Directory of Open Access Journals (Sweden)

    Liem Yenny Bendatu

    2015-06-01

    Full Text Available Many organizations apply information technologies to support their business processes. With these technologies, the actual events are recorded and checked against a predefined process model. Conformance checking is an approach to measure the fitness and appropriateness between a process model and the actual events. However, when there are multiple events with the same timestamp, the traditional approach cannot produce such measures. This study attempts to develop a sequence matching analysis. Taking conformance checking as its basis, the proposed approach utilizes current control-flow techniques from the process mining domain. A case study in the field of educational processes has been conducted. This study also proposes a curriculum analysis framework to test the proposed approach. By considering the learning sequences of students, it produces measurements for curriculum development. Finally, the results of the proposed approach have been verified by relevant instructors for further development.
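The abstract does not spell out its matching algorithm, but one common way to score how well an observed trace follows a model sequence is a normalized edit distance, sketched below. The course names are hypothetical, and this is only one plausible fitness measure, not necessarily the one the study uses.

```python
def edit_distance(a, b):
    """Classic Levenshtein dynamic program over two event sequences."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                      # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j                      # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + 1,            # deletion
                          d[i][j - 1] + 1,            # insertion
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))  # match/substitute
    return d[m][n]

def fitness(observed, model):
    """1.0 = trace follows the model exactly; 0.0 = nothing matches."""
    if not observed and not model:
        return 1.0
    return 1.0 - edit_distance(observed, model) / max(len(observed), len(model))

# Hypothetical curriculum (model) vs. the order one student actually took.
curriculum = ["Calculus1", "Calculus2", "Statistics", "DataMining"]
student    = ["Calculus1", "Statistics", "Calculus2", "DataMining"]
f = fitness(student, curriculum)
```

Here the student swapped two courses relative to the model, costing two substitutions out of four positions, so the fitness is 0.5.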

  15. Southern-by-Sequencing: A Robust Screening Approach for Molecular Characterization of Genetically Modified Crops

    Directory of Open Access Journals (Sweden)

    Gina M. Zastrow-Hayes

    2015-03-01

    Full Text Available Molecular characterization of events is an integral part of the advancement process during genetically modified (GM) crop product development. Assessment of these events is traditionally accomplished by polymerase chain reaction (PCR) and Southern blot analyses. Southern blot analysis can be time-consuming and comparatively expensive and does not provide sequence-level detail. We have developed a sequence-based application, Southern-by-Sequencing (SbS), utilizing sequence capture coupled with next-generation sequencing (NGS) technology to replace Southern blot analysis for event selection in a high-throughput molecular characterization environment. SbS is accomplished by hybridizing indexed and pooled whole-genome DNA libraries from GM plants to biotinylated probes designed to target the sequence of transformation plasmids used to generate events within the pool. This sequence capture process enriches the sequence data obtained for targeted regions of interest (transformation plasmid DNA). Taking advantage of the DNA adjacent to the targeted bases (referred to as next-to-target sequence) that accompanies the targeted transformation plasmid sequence, the data analysis detects plasmid-to-genome and plasmid-to-plasmid junctions introduced during insertion into the plant genome. Analysis of these junction sequences provides sequence-level information as to the following: the number of insertion loci, including detection of unlinked, independently segregating, small DNA fragments; copy number; rearrangements, truncations, or deletions of the intended insertion DNA; and the presence of transformation plasmid backbone sequences. This molecular evidence from SbS analysis is used to characterize and select GM plants meeting optimal molecular characterization criteria. SbS technology has proven to be a robust event screening tool for use in a high-throughput molecular characterization environment.

  16. MANGO: a new approach to multiple sequence alignment.

    Science.gov (United States)

    Zhang, Zefeng; Lin, Hao; Li, Ming

    2007-01-01

    Multiple sequence alignment is a classical and challenging task in biological sequence analysis. The problem is NP-hard. Full dynamic programming takes too much time. The progressive alignment heuristics adopted by most state-of-the-art multiple sequence alignment programs suffer from the 'once a gap, always a gap' phenomenon. Is there a radically new way to do multiple sequence alignment? This paper introduces a novel and orthogonal multiple sequence alignment method, using multiple optimized spaced seeds and new algorithms to handle these seeds efficiently. Our new algorithm processes information from all sequences as a whole, avoiding problems caused by the popular progressive approaches. Because the optimized spaced seeds are provably significantly more sensitive than consecutive k-mers, the new approach promises to be more accurate and reliable. To validate our new approach, we have implemented MANGO: Multiple Alignment with N Gapped Oligos. Experiments were carried out on large 16S RNA benchmarks, showing that MANGO compares favorably, in both accuracy and speed, against state-of-the-art multiple sequence alignment methods, including ClustalW 1.83, MUSCLE 3.6, MAFFT 5.861, ProbConsRNA 1.11, Dialign 2.2.1, DIALIGN-T 0.2.1, T-Coffee 4.85, POA 2.0 and Kalign 2.0.
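The spaced-seed idea central to this abstract can be shown in a few lines: a seed is a pattern of "care" (1) and "don't-care" (0) positions, and two sequences hit wherever they agree on all care positions. The seed pattern and sequences below are toy examples, not MANGO's optimized seeds.

```python
def seed_hits(s1, s2, seed="1101"):
    """Return all position pairs (i, j) where s1 and s2 agree on every '1'
    (care) position of the spaced seed; '0' positions are wildcards."""
    care = [k for k, c in enumerate(seed) if c == "1"]
    L = len(seed)
    hits = []
    for i in range(len(s1) - L + 1):
        for j in range(len(s2) - L + 1):
            if all(s1[i + k] == s2[j + k] for k in care):
                hits.append((i, j))
    return hits

# A single mismatch at a don't-care position: the spaced seed still hits,
# while a contiguous 4-mer ("1111") misses.
spaced = seed_hits("ACGT", "ACAT", seed="1101")
contig = seed_hits("ACGT", "ACAT", seed="1111")
```

This tolerance of mismatches at don't-care positions is precisely why optimized spaced seeds are more sensitive than consecutive k-mers of the same weight.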

  17. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features: easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification, and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of temporal boundaries that exist between event streams and their repository to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
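A core CEP primitive, "event A followed by event B within a time window", can be sketched as below. Treating archived and live events uniformly is mimicked here simply by chaining the two iterators; the event types and timestamps are hypothetical, and this is a generic illustration rather than SCEPter's query model.

```python
import itertools
from collections import deque

def detect(events, first, second, window):
    """Yield (t1, t2) whenever an event of type `second` follows an event of
    type `first` within `window` time units. `events` is any iterable of
    (timestamp, type) pairs in time order, so an archive can be chained in
    front of a live stream and both are matched by the same logic."""
    pending = deque()  # timestamps of `first` events awaiting a match
    for t, etype in events:
        while pending and t - pending[0] > window:
            pending.popleft()            # expire events outside the window
        if etype == second:
            for t1 in pending:
                yield (t1, t)
        if etype == first:
            pending.append(t)

# Hypothetical Smart Grid events: archive (past) chained with a live stream.
archive = [(1, "voltage_sag"), (3, "breaker_trip")]
live    = [(6, "voltage_sag"), (7, "breaker_trip"), (20, "breaker_trip")]
matches = list(detect(itertools.chain(archive, live),
                      "voltage_sag", "breaker_trip", window=5))
```

The trip at t=20 produces no match because the preceding sag at t=6 has fallen outside the 5-unit window, illustrating the temporal-boundary bookkeeping the abstract alludes to.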

  18. A robust, simple genotyping-by-sequencing (GBS) approach for high diversity species.

    Directory of Open Access Journals (Sweden)

    Robert J Elshire

    Full Text Available Advances in next-generation technologies have driven the costs of DNA sequencing down to the point that genotyping-by-sequencing (GBS) is now feasible for high-diversity, large-genome species. Here, we report a procedure for constructing GBS libraries based on reducing genome complexity with restriction enzymes (REs). This approach is simple, quick, extremely specific, highly reproducible, and may reach important regions of the genome that are inaccessible to sequence capture approaches. By using methylation-sensitive REs, repetitive regions of genomes can be avoided and lower-copy regions targeted with two- to threefold higher efficiency. This tremendously simplifies computationally challenging alignment problems in species with high levels of genetic diversity. The GBS procedure is demonstrated with maize (IBM) and barley (Oregon Wolfe Barley) recombinant inbred populations, where roughly 200,000 and 25,000 sequence tags were mapped, respectively. An advantage in species like barley that lack a complete genome sequence is that a reference map need only be developed around the restriction sites, and this can be done in the process of sample genotyping. In such cases, the consensus of the read clusters across the sequence-tagged sites becomes the reference. Alternatively, for kinship analyses in the absence of a reference genome, the sequence tags can simply be treated as dominant markers. Future application of GBS to breeding, conservation, and global species and population surveys may allow plant breeders to conduct genomic selection on a novel germplasm or species without first having to develop any prior molecular tools, or conservation biologists to determine population structure without prior knowledge of the genome or diversity in the species.
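The complexity-reduction step of GBS can be approximated in silico: cut a genome at a restriction site and size-select the fragments. The sketch below uses a regex with a degenerate base to mimic an enzyme such as ApeKI (recognition site GCWGC, W = A or T); the toy genome string and size window are invented, and real library preparation of course involves adapter ligation and sequencing, not just string cutting.

```python
import re

def in_silico_digest(genome, site="GC[AT]GC", min_len=4, max_len=40):
    """Cut at each occurrence of the recognition site (regex; [AT] mimics a
    degenerate base) and keep size-selected fragments, roughly as a GBS
    library preparation would."""
    cut_positions = [m.start() for m in re.finditer(site, genome)]
    edges = [0] + cut_positions + [len(genome)]
    fragments = [genome[a:b] for a, b in zip(edges, edges[1:])]
    return [f for f in fragments if min_len <= len(f) <= max_len]

frags = in_silico_digest("TTTTGCAGCAAAAAAGCTGCCC")
```

Such a digest is also how a reference map "around the restriction sites" can be predicted when a draft genome is available.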

  19. Technical Note: A novel leaf sequencing optimization algorithm which considers previous underdose and overdose events for MLC tracking radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Wisotzky, Eric, E-mail: eric.wisotzky@charite.de, E-mail: eric.wisotzky@ipk.fraunhofer.de; O’Brien, Ricky; Keall, Paul J., E-mail: paul.keall@sydney.edu.au [Radiation Physics Laboratory, Sydney Medical School, University of Sydney, Sydney, NSW 2006 (Australia)

    2016-01-15

    Purpose: Multileaf collimator (MLC) tracking radiotherapy is complex, as the beam pattern needs to be modified due to the planned intensity modulation as well as the real-time target motion. The target motion cannot be planned; therefore, the modified beam pattern differs from the original plan and the MLC sequence needs to be recomputed online. Current MLC tracking algorithms use a greedy heuristic in that they optimize for a given time but ignore past errors. To overcome this problem, the authors have developed and improved an algorithm that minimizes large underdose and overdose regions. Additionally, previous underdose and overdose events are taken into account to avoid regions with a high number of dose events. Methods: The authors improved the existing MLC motion control algorithm by introducing a cumulative underdose/overdose map. This map represents the actual projection of the planned tumor shape and logs dose events occurring in specific regions. These events influence the dose cost calculation and reduce the recurrence of dose events in those regions. The authors studied the improvement of the new temporal optimization algorithm in terms of the L1-norm minimization of the sum of overdose and underdose, compared to not accounting for previous dose events. For evaluation, the authors simulated the delivery of 5 conformal and 14 intensity-modulated radiotherapy (IMRT) plans with 7 3D patient-measured tumor motion traces. Results: Simulations with conformal shapes showed an improvement of the L1-norm of up to 8.5% after 100 MLC modification steps. Experiments showed comparable improvements with the same type of treatment plans. Conclusions: A novel leaf sequencing optimization algorithm which considers previous dose events for MLC tracking radiotherapy has been developed and investigated. Reductions in underdose/overdose are observed for conformal and IMRT delivery.
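The cumulative-map idea can be illustrated for a single leaf pair in one dimension: brute-force all contiguous apertures, score each by overdose plus underdose, and penalize cells that have already accumulated dose errors. The target profile, history weighting, and brute-force search are all simplifications invented for this sketch, not the authors' algorithm.

```python
import numpy as np

def choose_aperture(target, history, weight=0.5):
    """Pick the contiguous open interval [a, b) for one leaf pair that
    minimizes overdose + underdose, penalizing cells that already logged
    dose events (the cumulative underdose/overdose map idea)."""
    n = len(target)
    best, best_cost = (0, 0), float("inf")
    for a in range(n + 1):
        for b in range(a, n + 1):
            open_field = np.zeros(n)
            open_field[a:b] = 1
            err = open_field - target          # +1 overdose, -1 underdose
            cost = np.sum(np.abs(err) * (1 + weight * history))
            if cost < best_cost:
                best, best_cost = (a, b), cost
    a, b = best
    open_field = np.zeros(n)
    open_field[a:b] = 1
    history += np.abs(open_field - target)     # log the new dose events
    return best

# Hypothetical shifted tumor projection for one leaf row (1 = inside target).
target = np.array([0, 1, 1, 0, 1, 1, 1, 0], float)
hist = np.zeros(8)
ap = choose_aperture(target, hist)
```

Because the target row is not contiguous, some error is unavoidable; the history map records where it landed so that subsequent control steps steer errors away from already-affected cells.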

  20. Accident Sequence Precursor Analysis for SGTR by Using Dynamic PSA Approach

    International Nuclear Information System (INIS)

    Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan

    2016-01-01

    In order to address this issue, this study suggests a sequence tree model to analyze accident sequences systematically. Using the sequence tree model, all possible scenarios that need a specific safety action to prevent core damage can be identified, and the success conditions of the safety action under complicated situations such as combined accidents can also be identified. A sequence tree is a branching model that divides plant conditions while accounting for plant dynamics. Since the sequence tree model can reflect the plant dynamics arising from the interaction of different accident timings with plant conditions, and from the interaction among operator actions, mitigation systems, and the indicators for operation, it can be used to develop a dynamic event tree model easily. The target safety action for this study is a feed-and-bleed (F and B) operation. An F and B operation directly cools down the reactor cooling system (RCS) using the primary cooling system when residual heat removal by the secondary cooling system is not available. In this study, a TLOFW accident and a TLOFW accident with LOCA were the target accidents. Based on the conventional PSA model and indicators, the sequence tree model for a TLOFW accident was developed. Based on the results of a sampling analysis and data from the conventional PSA model, the CDF caused by Sequence no. 26 can be realistically estimated. For a TLOFW accident with LOCA, second accident timings were categorized according to plant condition. Indicators were selected as branch points using the flow chart and tables, and a corresponding sequence tree model was developed. If a sampling analysis is performed, practical accident sequences can be identified based on the sequence analysis. If a realistic distribution for the variables can be obtained for sampling analysis, much more realistic accident sequences can be described. Moreover, if the initiating event frequency under a combined accident can be quantified, the sequence tree model

  1. Safety Discrete Event Models for Holonic Cyclic Manufacturing Systems

    Science.gov (United States)

    Ciufudean, Calin; Filote, Constantin

    In this paper the expression “holonic cyclic manufacturing systems” refers to complex assembly/disassembly systems or fork/join systems, kanban systems, and, in general, to any discrete event system that transforms raw material and/or components into products. Such a system is said to be cyclic if it provides the same sequence of products indefinitely. This paper considers the scheduling of holonic cyclic manufacturing systems and describes a new approach using the Petri nets formalism. We propose an approach to frame the optimum schedule of holonic cyclic manufacturing systems in order to maximize the throughput while minimizing the work in process. We also propose an algorithm to verify the optimum schedule.

  2. Internal event analysis for Laguna Verde Unit 1 Nuclear Power Plant. Accident sequence quantification and results

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1994-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant, CNSNS-TR 004, in five volumes. The reports are organized as follows: CNSNS-TR 004 Volume 1: Introduction and Methodology. CNSNS-TR 004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR 004 Volume 3: System Analysis. CNSNS-TR 004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR 005 Volume 5: Appendices A, B and C. This volume presents the development of the dependent failure analysis, the treatment of the support system dependencies, the identification of the shared-components dependencies, and the treatment of common cause failures. Also presented are the identification of the main human actions considered, along with the possible recovery actions included. The development of the data base and the assumptions and limitations in the data base are also described in this volume. The accident sequence quantification process and the resolution of the core vulnerable sequences are presented. In this volume, the source and treatment of uncertainties associated with failure rates, component unavailabilities, initiating event frequencies, and human error probabilities are also presented. Finally, the main results and conclusions of the Internal Event Analysis for the Laguna Verde Nuclear Power Plant are presented. The total core damage frequency calculated is 9.03x10^-5 per year for internal events. The most dominant accident sequences found are the transients involving the loss of offsite power, the station blackout accidents, and the anticipated transients without SCRAM (ATWS). (Author)
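The quantification step described here typically combines an initiating-event frequency with minimal cut sets of basic-event probabilities under the rare-event approximation. The sketch below shows that arithmetic with entirely hypothetical numbers; they are not Laguna Verde values.

```python
def sequence_frequency(init_freq, cut_sets, basic_event_prob):
    """Rare-event approximation: sequence frequency ~ initiating-event
    frequency times the sum of minimal-cut-set probabilities, each cut set
    being the product of its independent basic-event probabilities."""
    total = 0.0
    for cs in cut_sets:
        p = 1.0
        for be in cs:
            p *= basic_event_prob[be]
        total += p
    return init_freq * total

# Hypothetical basic events for a loss-of-onsite-AC sequence after LOOP.
probs = {"DG-A-FTS": 2e-2, "DG-B-FTS": 2e-2, "CCF-DG": 1e-3}
cuts  = [("DG-A-FTS", "DG-B-FTS"),   # independent failure of both diesels
         ("CCF-DG",)]                # common-cause failure of both
cdf = sequence_frequency(0.05, cuts, probs)   # LOOP at a hypothetical 0.05/yr
```

Note how the single common-cause cut set dominates the two independent failures, a pattern that the dependent-failure analysis in this volume is designed to capture.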

  3. De novo 454 sequencing of barcoded BAC pools for comprehensive gene survey and genome analysis in the complex genome of barley

    Directory of Open Access Journals (Sweden)

    Scholz Uwe

    2009-11-01

    Full Text Available Abstract Background De novo sequencing of the entire genome of a large complex plant such as barley (Hordeum vulgare L.) is a major challenge both in terms of experimental feasibility and costs. The emergence and breathtaking progress of next-generation sequencing technologies has put this goal into focus, and a clone-based strategy combined with the 454/Roche technology is conceivable. Results To test the feasibility, we sequenced 91 barcoded, pooled, gene-containing barley BACs using the GS FLX platform and assembled the sequences under iterative change of parameters. The BAC assemblies were characterized by an N50 of ~50 kb (N80 ~31 kb, N90 ~21 kb) and a Q40 of 94%. For ~80% of the clones, the best assemblies consisted of less than 10 contigs at 24-fold mean sequence coverage. Moreover, we show that gene-containing regions seem to assemble completely and uninterrupted, thus making the approach suitable for detecting complete and positionally anchored genes. By comparing the assemblies of four clones to their complete reference sequences generated by the Sanger method, we evaluated the distribution, quality and representativeness of the 454 sequences as well as the consistency and reliability of the assemblies. Conclusion The described multiplex 454 sequencing of barcoded BACs leads to sequence consensi highly representative of the clones. Assemblies are correct for the majority of contigs. Though the resolution of complex repetitive structures requires additional experimental efforts, our approach paves the way for a clone-based strategy for sequencing the barley genome.

  4. Structural design principles of complex bird songs: a network-based approach.

    Directory of Open Access Journals (Sweden)

    Kazutoshi Sasahara

    Full Text Available Bird songs are acoustic communication signals primarily used in male-male aggression and in male-female attraction. These are often monotonous patterns composed of a few phrases, yet some birds have extremely complex songs with a large phrase repertoire, organized in non-random fashion with discernible patterns. Since structure is typically associated with function, the structures of complex bird songs provide important clues to the evolution of animal communication systems. Here we propose an efficient network-based approach to explore structural design principles of complex bird songs, in which the song networks--transition relationships among different phrases and the related structural measures--are employed. We demonstrate how this approach works with an example using California Thrasher songs, which are sequences of highly varied phrases delivered in succession over several minutes. These songs display two distinct features: a large phrase repertoire with a 'small-world' architecture, in which subsets of phrases are highly grouped and linked with a short average path length; and a balanced transition diversity amongst phrases, in which deterministic and non-deterministic transition patterns are moderately mixed. We explore the robustness of this approach with variations in sample size and the amount of noise. Our approach enables a more quantitative study of global and local structural properties of complex bird songs than has been possible to date.
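The song-network construction described here, phrases as nodes and observed transitions as directed edges, together with one of the structural measures (average shortest path length), can be sketched as follows. The phrase sequence is a toy example, not California Thrasher data.

```python
from collections import defaultdict, deque

def song_network(phrases):
    """Directed phrase-transition graph from a song (a sequence of phrase labels)."""
    g = defaultdict(set)
    for a, b in zip(phrases, phrases[1:]):
        g[a].add(b)
    return g

def avg_path_length(g):
    """Mean shortest-path length over all reachable ordered node pairs (BFS
    from each node), one ingredient of a small-world characterization."""
    nodes = set(g) | {v for vs in g.values() for v in vs}
    total, count = 0, 0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in g.get(u, ()):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for v, d in dist.items():
            if v != s:
                total, count = total + d, count + 1
    return total / count

song = list("ABCABDAB")   # toy phrase sequence
g = song_network(song)
apl = avg_path_length(g)
```

On real songs one would also measure clustering and transition-probability entropy per phrase; the short average path length combined with high clustering is what the abstract calls a 'small-world' architecture.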

  5. Flow Cytometry-Assisted Cloning of Specific Sequence Motifs from Complex 16S rRNA Gene Libraries

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Lund; Schramm, Andreas; Bernhard, Anne E.

    2004-01-01

    A flow cytometry method was developed for rapid screening and recovery of cloned DNA containing common sequence motifs. This approach, termed fluorescence-activated cell sorting-assisted cloning, was used to recover sequences affiliated with a unique lineage within the Bacteroidetes not abundant in a clone library of environmental 16S rRNA genes.

  6. Initiating events and accidental sequences taken into account in the CAREM reactor design

    International Nuclear Information System (INIS)

    Kay, J.M.; Felizia, E.R.; Navarro, N.R.; Caruso, G.J.

    1990-01-01

    The progress made in the nuclear safety evaluation of the CAREM reactor is presented. It was carried out using Probabilistic Safety Analysis (PSA), covering the different phases of identification and resolution of initiating events and the qualitative development of event trees. The method of identification of initiating events is the Master Logic Diagram (MLD), whose deductive basis makes it appropriate for a new design like the one described. The qualitative development of the event trees associated with the identified initiating events allows identification of those accident sequences that determine the safety systems required in the reactor. (Author) [es

  7. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    National Research Council Canada - National Science Library

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  8. Sensitivity studies on the approaches for addressing multiple initiating events in fire events PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Lim, Ho Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    A single fire event within a fire compartment or a fire scenario can cause multiple initiating events (IEs). For example, a fire in a turbine building fire area can cause both a loss of main feed-water (LOMF) and a loss of off-site power (LOOP) IE. Previous domestic fire events PSAs considered only the most severe initiating event among multiple initiating events. NUREG/CR-6850 and the ANS/ASME PRA Standard require that multiple IEs be addressed in fire events PSA. In this paper, sensitivity studies on the approaches for addressing multiple IEs in the fire events PSA for Hanul Unit 3 were performed and their results are presented. From the sensitivity analysis results, we find that incorporating multiple IEs into the fire events PSA model increases the core damage frequency (CDF) and may lead to the generation of duplicate cutsets. Multiple IEs can also occur in internal flooding events or other external events such as seismic events. They should be considered in the construction of PSA models in order to realistically estimate the risk due to flooding or seismic events.

  9. Component Neural Systems for the Creation of Emotional Memories during Free Viewing of a Complex, Real-World Event.

    Science.gov (United States)

    Botzung, Anne; Labar, Kevin S; Kragel, Philip; Miles, Amanda; Rubin, David C

    2010-01-01

    To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data-driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally-relevant event sequences.

  10. Component neural systems for the creation of emotional memories during free viewing of a complex, real-world event

    Directory of Open Access Journals (Sweden)

    Anne Botzung

    2010-05-01

    Full Text Available To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data-driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan’s perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally-relevant event sequences.

  11. The Swiss-Army-Knife Approach to the Nearly Automatic Analysis for Microearthquake Sequences.

    Science.gov (United States)

    Kraft, T.; Simon, V.; Tormann, T.; Diehl, T.; Herrmann, M.

    2017-12-01

    Many Swiss earthquake sequences have been studied using relative location techniques, which often allowed the active fault planes to be constrained and shed light on the tectonic processes that drove the seismicity. Yet, in the majority of cases the number of located earthquakes was too small to infer the details of the space-time evolution of the sequences, or their statistical properties. Therefore, it has mostly been impossible to resolve clear patterns in the seismicity of individual sequences, which are needed to improve our understanding of the mechanisms behind them. Here we present a nearly automatic workflow that combines well-established seismological analysis techniques and allows us to significantly improve the completeness of detected and located earthquakes of a sequence. We start from the manually timed routine catalog of the Swiss Seismological Service (SED), which contains the larger events of a sequence. From these well-analyzed earthquakes we dynamically assemble a template set and perform a matched-filter analysis on the station with the best SNR for the sequence and a recording history of at least 10-15 years, our typical analysis period. This usually allows us to detect events several orders of magnitude below the SED catalog detection threshold. The waveform similarity of the events is then further exploited to derive accurate and consistent magnitudes. The enhanced catalog is then analyzed statistically to derive high-resolution timelines of the a- and b-values and consequently the occurrence probability of larger events. Many of the detected events are strong enough to be located using double differences. No further manual interaction is needed; we simply time-shift the arrival-time pattern of the detecting template to the associated detection. Waveform similarity assures a good approximation of the expected arrival times, which we use to calculate event-pair arrival-time differences by cross correlation. After a SNR and cycle-skipping quality
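The matched-filter step at the heart of this workflow slides a template waveform along a continuous trace and flags offsets where the normalized cross-correlation exceeds a threshold. The sketch below uses a synthetic trace with two buried copies of the template (one full-amplitude, one weaker); the signal shapes and threshold are invented for illustration.

```python
import numpy as np

def matched_filter(trace, template, threshold=0.8):
    """Return sample offsets where the normalized cross-correlation between
    the template and the trace window exceeds the detection threshold."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        if w.std() == 0:
            continue                                  # flat window: skip
        cc = np.dot(t, (w - w.mean()) / w.std()) / n  # sample correlation
        if cc >= threshold:
            hits.append(i)
    return hits

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 6 * np.pi, 60))  # toy event waveform
trace = 0.05 * rng.standard_normal(500)           # background noise
trace[100:160] += template                         # buried event
trace[300:360] += 0.3 * template                   # weaker repeat of the event
hits = matched_filter(trace, template)
```

Because the correlation is amplitude-normalized, the weak repeat is detected just as cleanly as the strong one, which is how template matching reaches events far below a catalog's detection threshold.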

  12. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Science.gov (United States)

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data are available, while for locations where data are sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in

  13. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Directory of Open Access Journals (Sweden)

    Zhong Yi Wan

    Full Text Available The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more
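    The core idea of this record, learning a data-driven correction for the mismatch between an imperfect reduced-order model and observed data, can be sketched in a few lines. This is a toy illustration under stated assumptions: a linear least-squares regressor stands in for the LSTM-RNN, and the dynamics, features, and coefficients below are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_step(x):          # "truth" dynamics (unknown to the model)
    return 0.9 * x + 0.1 * np.sin(3 * x)

def imperfect_step(x):     # reduced-order model that misses the sin term
    return 0.9 * x

# Generate a training set and the model/data mismatch on it.
xs = np.linspace(-2, 2, 200)
mismatch = true_step(xs) - imperfect_step(xs)

# Fit the correction from simple features of the current state
# (the paper instead trains an LSTM on the projected mismatch).
features = np.column_stack([xs, np.sin(3 * xs)])
coef, *_ = np.linalg.lstsq(features, mismatch, rcond=None)

def hybrid_step(x):
    # imperfect baseline + learned data-driven correction
    return imperfect_step(x) + np.array([x, np.sin(3 * x)]) @ coef

x0 = 1.3
err_model = abs(true_step(x0) - imperfect_step(x0))
err_hybrid = abs(true_step(x0) - hybrid_step(x0))
print(err_hybrid < err_model)  # the blended prediction beats the model alone
```

Where training data is sparse the learned correction is unreliable, which is why the abstract keeps the imperfect model as the baseline in those regions.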

  14. Analysis of Paks NPP Personnel Activity during Safety Related Event Sequences

    International Nuclear Information System (INIS)

    Bareith, A.; Hollo, Elod; Karsa, Z.; Nagy, S.

    1998-01-01

    Within the AGNES Project (Advanced Generic and New Evaluation of Safety), the Level-1 PSA model of the Paks NPP Unit 3 was developed in the form of a detailed event tree/fault tree structure (involving 53 initiating events, 580 event sequences, and 6300 basic events). This model gives a good basis for quantitative evaluation of the potential consequences of safety-related events that have actually occurred, i.e., for precursor event studies. To make these studies possible and efficient, the current qualitative event analysis practice should be reviewed, and a new, additional quantitative analysis procedure and system should be developed and applied. The present paper gives an overview of the method outlined for both qualitative and quantitative analyses of operator crew activity during off-normal situations. First, the operator performance experienced during past operational events is discussed. Sources of raw information, the qualitative evaluation process, the follow-up actions, as well as the documentation requirements are described. Second, the general concept of the proposed precursor event analysis is described. Types of modeled interactions and the considered performance influences are presented. The quantification of the potential consequences of the identified precursor events is based on the task-oriented, Level-1 PSA model of the plant unit. A precursor analysis system covering the evaluation of operator activities is now under development. Preliminary results gained during a case study evaluation of a past historical event are presented. (authors)

  15. Treatment of Events Representing System Success in Accident Sequences in PSA Models with ET/FT Linking

    International Nuclear Information System (INIS)

    Vrbanic, I.; Spiler, J.; Mikulicic, V.; Simic, Z.

    2002-01-01

    Treatment of events that represent systems' successes in accident sequences is a well-known issue, associated primarily with PSA models that employ the event tree/fault tree (ET/FT) linking technique. Even though it is theoretically clear, its practical implementation and usage create a number of difficulties regarding the correctness of results for certain PSA models. Strict treatment of success events would require consistent application of De Morgan's laws. However, there are several problems related to this. First, Boolean resolution of the overall model, such as the one representing occurrence of reactor core damage, becomes a very challenging task if De Morgan's rules are applied consistently at all levels. Even PSA tools of the newest generation have some problems performing such a task in a reasonable time frame. The second potential issue is related to the presence of negated basic events in minimal cutsets. If all the basic events that result from strict application of De Morgan's rules are retained in the presentation of minimal cutsets, their readability and interpretability may be severely impaired. It is also worth noting that the concept of a minimal cutset is tied to equipment failures rather than to successes. For reasons like these, various simplifications are employed in PSA models and tools when it comes to the treatment of success events in the sequences. This paper provides a discussion of major concerns associated with the treatment of success events in accident sequences of a typical PSA model. (author)

  16. RECOGNITION OF DRAINAGE TUNNELS DURING GLACIER LAKE OUTBURST EVENTS FROM TERRESTRIAL IMAGE SEQUENCES

    Directory of Open Access Journals (Sweden)

    E. Schwalbe

    2016-06-01

    Full Text Available In recent years, many glaciers all over the world have been distinctly retreating and thinning. One of the consequences of this is the increase of so-called glacier lake outburst flood events (GLOFs). The mechanisms governing such GLOF events are not yet fully understood by glaciologists. Thus, there is a demand for data and measurements that can help to understand and model the phenomena. A main issue is to obtain information about the location and formation of subglacial channels through which some lakes, dammed by a glacier, start to drain. The paper will show how photogrammetric image sequence analysis can be used to collect such data. For the purpose of detecting a subglacial tunnel, a camera was installed in a pilot study to observe the area of the Colonia Glacier (Northern Patagonian Ice Field) where it dams Lake Cachet II. To verify the hypothesis that the course of the subglacial tunnel is indicated by irregular surface motion patterns during its collapse, the camera acquired image sequences of the glacier surface during several GLOF events. Applying tracking techniques to these image sequences, surface feature motion trajectories could be obtained for a dense raster of glacier points. Since only a single camera was used for image sequence acquisition, depth information is required to scale the trajectories. Thus, for scaling and georeferencing of the measurements, a GPS-supported photogrammetric network was measured. The obtained motion fields of the Colonia Glacier deliver information about the glacier's behaviour before, during, and after a GLOF event. If the daily vertical motion of the glacier is integrated over a period of several days and projected into a satellite image, the location and shape of the drainage channel underneath the glacier becomes visible. The high temporal resolution of the motion fields may also allow for an analysis of the tunnel's dynamics in comparison to the changing

  17. Environmental Enrichment Expedites Acquisition and Improves Flexibility on a Temporal Sequencing Task in Mice

    Directory of Open Access Journals (Sweden)

    Darius Rountree-Harrison

    2018-03-01

    Full Text Available Environmental enrichment (EE), via increased opportunities for voluntary exercise, sensory stimulation and social interaction, can enhance the function of, and behaviours regulated by, cognitive circuits. Little is known, however, as to how this intervention affects performance on complex tasks that engage multiple, definable learning and memory systems. Accordingly, we utilised the Olfactory Temporal Order Discrimination (OTOD) task, which requires animals to recall and report sequence information about a series of recently encountered olfactory stimuli. This approach allowed us to compare animals raised in either enriched or standard laboratory housing conditions on a number of measures, including the acquisition of a complex discrimination task, temporal sequence recall accuracy (i.e., the ability to accurately recall a sequence of events) and acuity (i.e., the ability to resolve past events that occurred in close temporal proximity), as well as cognitive flexibility tested in the style of a rule reversal and an Intra-Dimensional Shift (IDS). We found that enrichment accelerated the acquisition of the temporal order discrimination task, although neither accuracy nor acuity was affected at asymptotic performance levels. Further, while a subtle enhancement of overall performance was detected for both rule reversal and IDS versions of the task, accelerated performance recovery could only be attributed to the shift-like contingency change. These findings suggest that EE can affect specific elements of complex, multi-faceted cognitive processes.

  18. Rare recombination events generate sequence diversity among balancer chromosomes in Drosophila melanogaster.

    Science.gov (United States)

    Miller, Danny E; Cook, Kevin R; Yeganeh Kazemi, Nazanin; Smith, Clarissa B; Cockrell, Alexandria J; Hawley, R Scott; Bergman, Casey M

    2016-03-08

    Multiply inverted balancer chromosomes that suppress exchange with their homologs are an essential part of the Drosophila melanogaster genetic toolkit. Despite their widespread use, the organization of balancer chromosomes has not been characterized at the molecular level, and the degree of sequence variation among copies of balancer chromosomes is unknown. To map inversion breakpoints and study potential diversity in descendants of a structurally identical balancer chromosome, we sequenced a panel of laboratory stocks containing the most widely used X chromosome balancer, First Multiple 7 (FM7). We mapped the locations of FM7 breakpoints to precise euchromatic coordinates and identified the flanking sequence of breakpoints in heterochromatic regions. Analysis of SNP variation revealed megabase-scale blocks of sequence divergence among currently used FM7 stocks. We present evidence that this divergence arose through rare double-crossover events that replaced a female-sterile allele of the singed gene (sn(X2)) on FM7c with a sequence from balanced chromosomes. We propose that although double-crossover events are rare in individual crosses, many FM7c chromosomes in the Bloomington Drosophila Stock Center have lost sn(X2) by this mechanism on a historical timescale. Finally, we characterize the original allele of the Bar gene (B(1)) that is carried on FM7, and validate the hypothesis that the origin and subsequent reversion of the B(1) duplication are mediated by unequal exchange. Our results reject a simple nonrecombining, clonal mode for the laboratory evolution of balancer chromosomes and have implications for how balancer chromosomes should be used in the design and interpretation of genetic experiments in Drosophila.

  19. D20S16 is a complex interspersed repeated sequence: Genetic and physical analysis of the locus

    Energy Technology Data Exchange (ETDEWEB)

    Bowden, D.W.; Krawchuk, M.D.; Howard, T.D. [Wake Forest Univ., Winston-Salem, NC (United States)] [and others]

    1995-01-20

    The genomic structure of the D20S16 locus has been evaluated using genetic and physical methods. D20S16, originally detected with the probe CRI-L1214, is a highly informative, complex restriction fragment length polymorphism consisting of two separate allelic systems. The allelic systems have the characteristics of conventional VNTR polymorphisms and are separated by recombination (θ = 0.02, Z_max = 74.82), as demonstrated in family studies. Most of these recombination events are meiotic crossovers and are maternal in origin, but two, including deletion of the locus in a cell line from a CEPH family member, occur without evidence for exchange of flanking markers. DNA sequence analysis suggests that the basis of the polymorphism is variable numbers of a 98-bp sequence tandemly repeated with 87 to 90% sequence similarity between repeats. The 98-bp repeat is a dimer of a 49-bp sequence with 45 to 98% identity between the elements. In addition, nonpolymorphic genomic sequences adjacent to the polymorphic 98-bp repeat tracts are also repeated but are not polymorphic, i.e., show no individual-to-individual variation. Restriction enzyme mapping of cosmids containing the CRI-L1214 sequence suggests that there are multiple interspersed repeats of the CRI-L1214 sequence on chromosome 20. The results of dual-color fluorescence in situ hybridization experiments with interphase nuclei are also consistent with multiple repeats of an interspersed sequence on chromosome 20. 23 refs., 6 figs.

  20. Cognitive complexity of the medical record is a risk factor for major adverse events.

    Science.gov (United States)

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
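    The record describes measuring the amount of data in a medical record, in bits, as a surrogate for patient complexity. A hypothetical sketch of that idea follows; the paper's actual metric is not specified here, so compressed size is used as a simple stand-in proxy, and the sample record texts are invented.

```python
import zlib

def ccmr_bits(record_text: str) -> int:
    # Proxy for "cognitive complexity of the medical record":
    # bits remaining after redundancy is compressed away.
    return 8 * len(zlib.compress(record_text.encode("utf-8")))

simple = "Vitals stable overnight. No new medications. Plan: discharge."
complex_ = ("ICU day 3: ventilator weaned to pressure support; norepinephrine "
            "titrated off; CRRT continued for oliguric AKI; ID recommends "
            "meropenem for MDR Klebsiella bacteremia; neurology following "
            "for myoclonus; family meeting re: goals of care documented.")

print(ccmr_bits(complex_) > ccmr_bits(simple))  # the busier record carries more bits
```

A real Complexity Ruler would need to normalize for note duplication and templated text, which a raw compression proxy only partially handles.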

  1. Survey of Applications of Complex Event Processing (CEP) in Health Domain

    Directory of Open Access Journals (Sweden)

    Nadeem Mahmood

    2017-12-01

    Full Text Available It is always difficult to manage the production of huge amounts of data coming from multiple sources and to extract meaningful information from it to make appropriate decisions. When data comes from various input resources and the required streams of events must be extracted from this complex input network, Complex Event Processing (CEP), one of the core capabilities of Business Intelligence (BI), is an appropriate solution to the problems mentioned above. Real-time processing, pattern matching, stream processing, big data management, sensor data processing, and many more are the application areas of CEP. The health domain is itself multi-dimensional, covering hospital supply chains, OPD management, disease diagnosis, in-patient and out-patient management, emergency care, etc. In this paper, the main focus is to discuss the application areas of Complex Event Processing (CEP) in the health domain using sensor devices: how CEP manipulates health data-set events coming from sensor devices, such as blood pressure, heart rate, fall detection, sugar level, temperature, or other vital signs, and how these systems respond to such events as quickly as possible. Different existing models and applications using CEP are discussed and summarized according to their characteristics.
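    In the spirit of the vital-sign examples this survey covers, a minimal CEP-style rule over an event stream can be sketched as follows. The event schema, threshold, and "two consecutive high readings" pattern are illustrative and not tied to any particular CEP engine discussed in the paper.

```python
def detect(events, threshold=120):
    """Alert when two consecutive heart-rate readings exceed the threshold."""
    alerts, streak = [], 0
    for e in events:
        if e["type"] == "heart_rate":      # the pattern is over heart-rate events only
            streak = streak + 1 if e["value"] > threshold else 0
            if streak == 2:
                alerts.append(e["t"])      # alert at the second high reading
    return alerts

stream = [
    {"t": 0, "type": "heart_rate", "value": 95},
    {"t": 1, "type": "heart_rate", "value": 130},
    {"t": 2, "type": "temperature", "value": 37.2},   # interleaved other sensor
    {"t": 3, "type": "heart_rate", "value": 140},
]
print(detect(stream))  # → [3]
```

Production CEP engines express such patterns declaratively (windows, sequences, joins) rather than in hand-written loops, but the matching semantics are the same.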

  2. Complex programmable logic device based alarm sequencer for nuclear power plants

    International Nuclear Information System (INIS)

    Khedkar, Ravindra; Solomon, J. Selva; KrishnaKumar, B.

    2001-01-01

    Complex Programmable Logic Device based Alarm Sequencer is an instrument, which detects alarms, memorizes them and displays the sequences of occurrence of alarms. It caters to sixteen alarm signals and distinguishes the sequence among any two alarms with a time resolution of 1 ms. The system described has been designed for continuous operation in process plants, nuclear power plants etc. The system has been tested and found to be working satisfactorily. (author)
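    A hypothetical software analogue of the sequencer's behavior described above (16 channels, first-occurrence latching, ordering with 1 ms resolution) can be sketched as follows; the class and method names are invented for illustration, the real device being a CPLD, not software.

```python
class AlarmSequencer:
    def __init__(self, n_channels=16):
        self.first_seen = {}          # channel -> first timestamp (ms)
        self.n = n_channels

    def input(self, channel, t_ms):
        # Latch only the first occurrence of each alarm.
        if 0 <= channel < self.n and channel not in self.first_seen:
            self.first_seen[channel] = t_ms

    def sequence(self):
        # Report alarms in order of occurrence.
        return sorted(self.first_seen, key=self.first_seen.get)

seq = AlarmSequencer()
seq.input(5, 100)
seq.input(2, 101)   # 1 ms later: resolved as second
seq.input(5, 250)   # repeat is ignored; the first occurrence is kept
print(seq.sequence())  # → [5, 2]
```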

  3. The Mw=8.8 Maule earthquake aftershock sequence, event catalog and locations

    Science.gov (United States)

    Meltzer, A.; Benz, H.; Brown, L.; Russo, R. M.; Beck, S. L.; Roecker, S. W.

    2011-12-01

    The aftershock sequence of the Mw=8.8 Maule earthquake off the coast of Chile in February 2010 is one of the best-recorded aftershock sequences from a great megathrust earthquake. Immediately following the Maule earthquake, teams of geophysicists from Chile, France, Germany, Great Britain and the United States coordinated resources to capture aftershocks and other seismic signals associated with this significant earthquake. In total, 91 broadband, 48 short-period, and 25 accelerometer stations were deployed above the rupture zone of the main shock from 33-38.5°S and from the coast to the Andean range front. In order to integrate these data into a unified catalog, the USGS National Earthquake Information Center developed procedures to use its real-time seismic monitoring system (Bulletin Hydra) to detect, associate, locate, and compute earthquake source parameters from these stations. As a first step in the process, the USGS has built a seismic catalog of all M3.5 or larger earthquakes for the time period of the main aftershock deployment from March 2010 to October 2010. The catalog includes earthquake locations, magnitudes (Ml, Mb, Mb_BB, Ms, Ms_BB, Ms_VX, Mc), associated phase readings and regional moment tensor solutions for most of the M4 or larger events. Also included in the catalog are teleseismic phases and amplitude measures, and body-wave MT and CMT solutions for the larger events, typically M5.5 and larger. Tuning of automated detection and association parameters should allow a complete catalog of events down to approximately M2.5 for this dataset of more than 164 stations. We characterize the aftershock sequence in terms of magnitude, frequency, and location over time. Using the catalog locations and travel times as a starting point, we use double-difference techniques to investigate relative locations and earthquake clustering. In addition, phase data from candidate ground truth events and modeling of surface waves can be used to calibrate the
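    One standard way to characterize the magnitude-frequency behavior of such a catalog is the maximum-likelihood b-value estimate (Aki's formula, ignoring magnitude-binning corrections): b = log10(e) / (mean(M) - Mc) for magnitudes at or above the completeness magnitude Mc. The catalog below is synthetic, not the Maule data.

```python
import math

def b_value(mags, mc):
    """Maximum-likelihood Gutenberg-Richter b-value for M >= mc."""
    above = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

catalog = [3.5, 3.6, 3.8, 4.0, 4.1, 4.4, 4.9, 5.5, 6.2]   # illustrative magnitudes
print(round(b_value(catalog, mc=3.5), 2))  # → 0.46
```

Tracking how this estimate evolves in time windows is one way to follow the decay of an aftershock sequence.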

  4. Computational complexity of algorithms for sequence comparison, short-read assembly and genome alignment.

    Science.gov (United States)

    Baichoo, Shakuntala; Ouzounis, Christos A

    A multitude of algorithms for sequence comparison, short-read assembly and whole-genome alignment have been developed in the general context of molecular biology, to support technology development for high-throughput sequencing, numerous applications in genome biology and fundamental research on comparative genomics. The computational complexity of these algorithms has been previously reported in original research papers, yet this often neglected property has not been reviewed previously in a systematic manner and for a wider audience. We provide a review of space and time complexity of key sequence analysis algorithms and highlight their properties in a comprehensive manner, in order to identify potential opportunities for further research in algorithm or data structure optimization. The complexity aspect is poised to become pivotal as we will be facing challenges related to the continuous increase of genomic data on unprecedented scales and complexity in the foreseeable future, when robust biological simulation at the cell level and above becomes a reality. Copyright © 2017 Elsevier B.V. All rights reserved.
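    One concrete instance of the quadratic time and space behavior such reviews analyze is classic dynamic-programming local alignment: scoring an m × n matrix costs O(mn) in both time and memory. A minimal Smith-Waterman sketch with illustrative unit scores:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between strings a and b."""
    m, n = len(a), len(b)
    H = [[0] * (n + 1) for _ in range(m + 1)]   # O(mn) cells of memory
    best = 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):               # O(mn) iterations of time
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0, H[i - 1][j - 1] + s,
                          H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("GATTACA", "GATTACA"))  # → 14 (seven matches at +2 each)
```

Many of the optimizations the review surveys (banding, bit-parallelism, linear-space traceback, seeded heuristics) exist precisely to beat this quadratic baseline in practice.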

  5. Local sequence alignments statistics: deviations from Gumbel statistics in the rare-event tail

    Directory of Open Access Journals (Sweden)

    Burghardt Bernd

    2007-07-01

    Full Text Available Abstract Background The optimal score for ungapped local alignments of infinitely long random sequences is known to follow a Gumbel extreme value distribution. Less is known about the important case where gaps are allowed. For this case, the distribution is only known empirically in the high-probability region, which is biologically less relevant. Results We provide a method to obtain numerically the biologically relevant rare-event tail of the distribution. The method, which has been outlined in an earlier work, is based on generating the sequences with a parametrized probability distribution, which is biased with respect to the original biological one, in the framework of Metropolis Coupled Markov Chain Monte Carlo. Here, we first present the approach in detail and evaluate the convergence of the algorithm by considering a simple test case. In the earlier work, the method was applied to just one example case. Therefore, we consider here a large set of parameters: we study the distributions for protein alignment with different substitution matrices (BLOSUM62 and PAM250) and affine gap costs with different parameter values. In the logarithmic phase (large gap costs) it was previously assumed that the Gumbel form still holds, hence the Gumbel distribution is usually used when evaluating p-values in databases. Here we show that for all cases, provided that the sequences are not too long (L ≲ 400), a "modified" Gumbel distribution, i.e. a Gumbel distribution with an additional Gaussian factor, is suitable to describe the data. We also provide a "scaling analysis" of the parameters used in the modified Gumbel distribution. Furthermore, via a comparison with BLAST parameters, we show that significance estimations change considerably when using the true distributions as presented here. Finally, we study also the distribution of the sum statistics of the k best alignments.
    Conclusion Our results show that the statistics of gapped and ungapped local
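    One plausible reading of the "modified" Gumbel form described above, a Gumbel density multiplied by an additional Gaussian factor, can be sketched numerically. The parametrization, parameter names (lam, u, sigma), and values below are illustrative assumptions; the paper fits its parameters to measured alignment-score data.

```python
import math

def gumbel_pdf(s, lam, u):
    z = lam * (s - u)
    return lam * math.exp(-z - math.exp(-z))

def modified_gumbel_pdf(s, lam, u, sigma, norm=1.0):
    # Gumbel density times an extra Gaussian factor (unnormalized sketch).
    return norm * gumbel_pdf(s, lam, u) * math.exp(-s * s / (2 * sigma * sigma))

# The Gaussian factor suppresses the rare-event tail relative to pure Gumbel,
# which is exactly where p-value estimates change the most.
ratio_near = modified_gumbel_pdf(30, 0.3, 20, 100) / gumbel_pdf(30, 0.3, 20)
ratio_far = modified_gumbel_pdf(120, 0.3, 20, 100) / gumbel_pdf(120, 0.3, 20)
print(ratio_far < ratio_near)  # → True
```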

  6. Sequence Coding and Search System for licensee event reports: code listings. Volume 2

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of this data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This system provides a structured format for detailed coding of component, system, and unit effects as well as personnel errors. The database contains all current LERs submitted by nuclear power plant utilities for events occurring since 1981 and is updated on a continual basis. Volume 2 contains all valid and acceptable codes used for searching and encoding the LER data. This volume contains updated material through amendment 1 to revision 1 of the working version of ORNL/NSIC-223, Vol. 2

  7. Automatically Identifying Fusion Events between GLUT4 Storage Vesicles and the Plasma Membrane in TIRF Microscopy Image Sequences

    Directory of Open Access Journals (Sweden)

    Jian Wu

    2015-01-01

    Full Text Available Quantitative analysis of the dynamic behavior of membrane-bound secretory vesicles has proven to be important in biological research. This paper proposes a novel approach to automatically identify the elusive fusion events between VAMP2-pHluorin labeled GLUT4 storage vesicles (GSVs) and the plasma membrane. Differentiation is implemented to detect the initiation of fusion events by modified forward subtraction of consecutive frames in the TIRFM image sequence. Spatially connected pixels in difference images brighter than a specified adaptive threshold are grouped into a distinct fusion spot. The vesicles are located at the intensity-weighted centroid of their fusion spots. To reveal the true in vivo nature of a fusion event, 2D Gaussian fitting for the fusion spot is used to derive the intensity-weighted centroid and the spot size during the fusion process. The fusion event and its termination can be determined according to the change of spot size. The method is evaluated on real experimental data with ground truth annotated by expert cell biologists. The evaluation results show that it can achieve relatively high accuracy, comparing favorably to manual analysis, yet at a small fraction of the time.
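    The detection step described above (forward subtraction of consecutive frames, adaptive thresholding, intensity-weighted centroid) can be sketched for a single candidate spot. The frames below are synthetic, the threshold rule (mean + k·std) is an illustrative stand-in for the paper's adaptive threshold, and connected-component grouping and Gaussian fitting are omitted.

```python
import numpy as np

def detect_fusion(prev, curr, k=3.0):
    """Return the intensity-weighted centroid (row, col) of a bright spot
    appearing between two consecutive frames, or None if nothing passes."""
    diff = curr.astype(float) - prev.astype(float)   # forward subtraction
    thresh = diff.mean() + k * diff.std()            # simple adaptive threshold
    mask = diff > thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    w = diff[ys, xs]                                 # intensity weights
    return (float(np.sum(ys * w) / w.sum()),
            float(np.sum(xs * w) / w.sum()))

prev = np.zeros((32, 32))
curr = np.zeros((32, 32))
curr[10:13, 20:23] = 50.0    # a bright spot appears: candidate fusion event
print(detect_fusion(prev, curr))  # → (11.0, 21.0)
```

In the paper, tracking the fitted spot size over subsequent frames is what distinguishes a true fusion event (the spot spreads, then fades) from transient noise.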

  8. Genomic Investigation Reveals Highly Conserved, Mosaic, Recombination Events Associated with Capsular Switching among Invasive Neisseria meningitidis Serogroup W Sequence Type (ST)-11 Strains.

    Science.gov (United States)

    Mustapha, Mustapha M; Marsh, Jane W; Krauland, Mary G; Fernandez, Jorge O; de Lemos, Ana Paula S; Dunning Hotopp, Julie C; Wang, Xin; Mayer, Leonard W; Lawrence, Jeffrey G; Hiller, N Luisa; Harrison, Lee H

    2016-07-03

    Neisseria meningitidis is an important cause of meningococcal disease globally. Sequence type (ST)-11 clonal complex (cc11) is a hypervirulent meningococcal lineage historically associated with serogroup C capsule and is believed to have acquired the W capsule through a C to W capsular switching event. We studied the sequence of the capsule gene cluster (cps) and adjoining genomic regions of 524 invasive W cc11 strains isolated globally. We identified recombination breakpoints corresponding to two distinct recombination events within W cc11: an 8.4-kb recombinant region, likely acquired from W cc22 and including the sialic acid/glycosyl-transferase gene csw, resulted in a C→W change in capsular phenotype, and a 13.7-kb recombinant segment, likely acquired from the Y cc23 lineage, includes 4.5 kb of cps genes and 8.2 kb downstream of the cps cluster, resulting in allelic changes in capsule translocation genes. A vast majority of W cc11 strains (497/524, 94.8%) retain both recombination events, as evidenced by sharing identical or very closely related capsular allelic profiles. These data suggest that the W cc11 capsular switch involved two separate recombination events and that current global W cc11 meningococcal disease is caused by strains bearing this mosaic capsular switch. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  9. Order and correlations in genomic DNA sequences. The spectral approach

    International Nuclear Information System (INIS)

    Lobzin, Vasilii V; Chechetkin, Vladimir R

    2000-01-01

    The structural analysis of genomic DNA sequences is discussed in the framework of the spectral approach, which is sufficiently universal due to the reciprocal correspondence and mutual complementarity of Fourier transform length scales. The spectral characteristics of random sequences of the same nucleotide composition possess the property of self-averaging for relatively short sequences of length M≥100-300. Comparison with the characteristics of random sequences determines the statistical significance of the structural features observed. Apart from traditional applications to the search for hidden periodicities, spectral methods are also efficient in studying mutual correlations in DNA sequences. By combining spectra for structure factors and correlation functions, not only integral correlations can be estimated but also their origin identified. Using the structural spectral entropy approach, the regularity of a sequence can be quantitatively assessed. A brief introduction to the problem is also presented and other major methods of DNA sequence analysis described. (reviews of topical problems)
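    The spectral approach described above can be sketched concretely: build binary indicator sequences for each nucleotide, sum their Fourier power spectra, and look for statistically significant peaks. Protein-coding DNA typically shows a peak at period 3; the toy sequence below carries an artificial strict period-3 pattern, and the self-averaging comparison against random sequences of the same composition is omitted.

```python
import numpy as np

def dna_power_spectrum(seq):
    """Summed power spectrum of the four nucleotide indicator sequences."""
    spectrum = np.zeros(len(seq))
    for base in "ACGT":
        u = np.array([1.0 if c == base else 0.0 for c in seq])
        u -= u.mean()                        # remove the DC component
        spectrum += np.abs(np.fft.fft(u)) ** 2
    return spectrum

seq = "ATGATGATGATGATGATGATGATG"             # strict period 3, length 24
S = dna_power_spectrum(seq)
peak = int(np.argmax(S[1 : len(seq) // 2 + 1])) + 1   # dominant frequency index
print(len(seq) / peak)  # period of the dominant peak → 3.0
```

For real sequences, the significance of such a peak is judged against the spectral statistics of shuffled sequences with the same nucleotide composition, as the review discusses.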

  10. Cognitive and emotional reactions to daily events: the effects of self-esteem and self-complexity.

    Science.gov (United States)

    Campbell, J D; Chew, B; Scratchley, L S

    1991-09-01

    In this article we examine the effects of self-esteem and self-complexity on cognitive appraisals of daily events and emotional lability. Subjects (n = 67) participated in a 2-week diary study; each day they made five mood ratings, described the most positive and negative events of the day, and rated these two events on six appraisal measures. Neither self-esteem nor self-complexity was related to an extremity measure of mood variability. Both traits were negatively related to measures assessing the frequency of mood change, although the effect of self-complexity dissipated when self-esteem was taken into account. Self-esteem (but not self-complexity) was also related to event appraisals: Subjects with low self-esteem rated their daily events as less positive and as having more impact on their moods. Subjects with high self-esteem made more internal, stable, global attributions for positive events than for negative events, whereas subjects low in self-esteem made similar attributions for both types of events and viewed their negative events as being more personally important than did subjects high in self-esteem. Despite these self-esteem differences in subjects' views of their daily events, naive judges (n = 63) who read the event descriptions and role-played their appraisals of them generally did not distinguish between the events that had been experienced by low self-esteem versus high self-esteem diary subjects.

  11. On the holistic approach in cellular and cancer biology: nonlinearity, complexity, and quasi-determinism of the dynamic cellular network.

    Science.gov (United States)

    Waliszewski, P; Molski, M; Konarski, J

    1998-06-01

    A keystone of the molecular reductionist approach to cellular biology is a specific deductive strategy relating genotype to phenotype, two distinct categories. This relationship is based on the assumption that the intermediary cellular network of actively transcribed genes and their regulatory elements is deterministic (i.e., a link between expression of a gene and a phenotypic trait can always be identified, and the evolution of the network in time is predetermined). However, experimental data suggest that the relationship between genotype and phenotype is nonbijective (i.e., a gene can contribute to the emergence of more than just one phenotypic trait, or a phenotypic trait can be determined by the expression of several genes). This implies nonlinearity (i.e., lack of a proportional relationship between input and outcome), complexity (i.e., emergence of a hierarchical network of multiple cross-interacting elements that is sensitive to initial conditions, possesses multiple equilibria, organizes spontaneously into different morphological patterns, and is controlled in a dispersed rather than centralized manner), and quasi-determinism (i.e., coexistence of deterministic and nondeterministic events) of the network. Nonlinearity within the space of cellular molecular events underlies the existence of a fractal structure within a number of metabolic processes and patterns of tissue growth, which is measured experimentally as a fractal dimension. Because of its complexity, the same phenotype can be associated with a number of alternative sequences of cellular events. Moreover, the primary cause initiating phenotypic evolution of cells, such as malignant transformation, can be favored probabilistically but not identified unequivocally. Thermodynamic fluctuations of energy rather than gene mutations, the material traits of the fluctuations, alter both the molecular and informational structure of the network.
Then, the interplay between deterministic chaos, complexity, self
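
The fractal dimension mentioned in this abstract can be estimated numerically by box counting. The sketch below is illustrative only (the function name, box sizes, and the test pattern are our own, not the authors' method): it covers a 2-D point set with grids of decreasing cell size and takes the slope of log(count) versus log(1/size).

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Estimate the fractal dimension of a 2-D point set by box counting."""
    counts = []
    for s in sizes:
        # Map each point to a grid cell of side length s and count occupied cells
        cells = {(int(x // s), int(y // s)) for x, y in points}
        counts.append(len(cells))
    # Slope of log(count) vs log(1/size) approximates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled unit square should give a dimension close to 2;
# a genuinely fractal growth pattern would give a non-integer slope.
rng = np.random.default_rng(0)
square = rng.random((20000, 2))
d = box_counting_dimension(square, sizes=[0.2, 0.1, 0.05, 0.025])
```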

  12. Genomic Selection in the Era of Next Generation Sequencing for Complex Traits in Plant Breeding.

    Science.gov (United States)

    Bhat, Javaid A; Ali, Sajad; Salgotra, Romesh K; Mir, Zahoor A; Dutta, Sutapa; Jadon, Vasudha; Tyagi, Anshika; Mushtaq, Muntazir; Jain, Neelu; Singh, Pradeep K; Singh, Gyanendra P; Prabhu, K V

    2016-01-01

    Genomic selection (GS) is a promising approach that exploits molecular genetic markers to design novel breeding programs and to develop new marker-based models for genetic evaluation. In plant breeding, it provides opportunities to increase the genetic gain of complex traits per unit time and cost. The cost-benefit balance is an important consideration for GS to work in crop plants. The availability of genome-wide, high-throughput, cost-effective and flexible markers with low ascertainment bias, suitable for large populations and for both model and non-model crop species with or without a reference genome sequence, is the most important factor for successful and effective implementation in crop species. These factors were the major limitations of earlier marker systems, viz. SSR and array-based markers, and were unimaginable before the availability of next-generation sequencing (NGS) technologies, which have provided novel SNP genotyping platforms, especially genotyping by sequencing. These marker technologies have changed the entire scenario of marker applications and made the use of GS routine for crop improvement in both model and non-model crop species. NGS-based genotyping has increased genomic-estimated breeding value prediction accuracies over other established marker platforms in cereals and other crop species, making GS a reality in crop breeding. To harness the full benefits of GS, however, these marker technologies must be combined with high-throughput phenotyping to achieve valuable genetic gain for complex traits. Moreover, the continuous decline in sequencing cost will make whole-genome sequencing (WGS) feasible and cost-effective for GS in the near future. Until then, targeted sequencing appears to be the more cost-effective option for large-scale marker discovery and GS, particularly for large, undecoded genomes.

  13. Genomic DNA Enrichment Using Sequence Capture Microarrays: a Novel Approach to Discover Sequence Nucleotide Polymorphisms (SNP) in Brassica napus L

    Science.gov (United States)

    Clarke, Wayne E.; Parkin, Isobel A.; Gajardo, Humberto A.; Gerhardt, Daniel J.; Higgins, Erin; Sidebottom, Christine; Sharpe, Andrew G.; Snowdon, Rod J.; Federico, Maria L.; Iniguez-Luy, Federico L.

    2013-01-01

    Targeted genomic selection methodologies, or sequence capture, allow for DNA enrichment and large-scale resequencing and characterization of natural genetic variation in species with complex genomes, such as rapeseed canola (Brassica napus L., AACC, 2n=38). The main goal of this project was to combine sequence capture with next-generation sequencing (NGS) to discover single nucleotide polymorphisms (SNPs) in specific areas of the B. napus genome historically associated (via quantitative trait loci, QTL, analysis) with traits of agronomic and nutritional importance. A 2.1-million-feature sequence capture platform was designed to interrogate DNA sequence variation across 47 specific genomic regions, representing 51.2 Mb of the Brassica A and C genomes, in ten diverse rapeseed genotypes. All ten genotypes were sequenced using the 454 Life Sciences chemistry and, to assess the effect of increased sequence depth, two genotypes were also sequenced using Illumina HiSeq chemistry. As a result, 589,367 potentially useful SNPs were identified. Analysis of sequence coverage indicated a four-fold increased representation of target regions, with 57% of the filtered SNPs falling within these regions. Sixty percent of the discovered SNPs corresponded to transitions while 40% were transversions. Interestingly, 58% of the SNPs were found in genic regions while 42% were found in intergenic regions. Further, a high percentage of genic SNPs was found in exons (65% and 64% for the A and C genomes, respectively). Two different genotyping assays were used to validate the discovered SNPs. Validation rates ranged from 61.5% to 84% of tested SNPs, underscoring the effectiveness of this SNP discovery approach. Most importantly, the discovered SNPs were associated with agronomically important regions of the B. napus genome, generating a novel data resource for research and breeding of this crop species. PMID:24312619

  14. Sequence Coding and Search System for licensee event reports: coder's manual. Volume 4

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of these data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This four-volume report documents and describes SCSS in detail. Volumes 3 and 4 provide a technical processor who is new to SCSS with the information and methodology necessary to capture descriptive data from the LER and to codify those data into a structured format; they also serve as reference material for the more experienced technical processor and contain information essential for the advanced user who must be familiar with the intricate coding techniques in order to retrieve specific details in a sequence. This volume contains updated material through Amendment 1 to Revision 1 of the working version of ORNL/NSIC-223, Vol. 4

  15. Sequence Coding and Search System for licensee event reports: coder's manual. Volume 3

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of these data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This four-volume report documents and describes SCSS in detail. Volumes 3 and 4 provide a technical processor who is new to SCSS with the information and methodology necessary to capture descriptive data from the LER and to codify those data into a structured format; they also serve as reference material for the more experienced technical processor and contain information essential for the advanced user who must be familiar with the intricate coding techniques in order to retrieve specific details in a sequence. This volume contains updated material through Amendment 1 to Revision 1 of the working version of ORNL/NSIC-223, Vol. 3

  16. A Dynamic Approach to Modeling Dependence Between Human Failure Events

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory

    2015-09-01

    In practice, most HRA methods use direct dependence from THERP, the notion that error begets error: one human failure event (HFE) may increase the likelihood of subsequent HFEs. In this paper, we approach dependence from a simulation perspective in which the effects of human errors are dynamically modeled. Three key concepts play into this modeling: (1) Errors are driven by performance shaping factors (PSFs). In this context, error propagation is not a result of the presence of an HFE yielding overall increases in subsequent HFEs; rather, it is shared PSFs that cause dependence. (2) PSFs have qualities of lag and latency. These two qualities are not currently considered in HRA methods that use PSFs. Yet, to model the effects of PSFs, it is not simply a matter of identifying the discrete effects of a particular PSF on performance: the effects of PSFs must be considered temporally, as the PSFs will have a range of effects across the event sequence. (3) Finally, there is the concept of error spilling. When PSFs are activated, they not only have temporal effects but also lateral effects on other PSFs, leading to emergent errors. This paper presents the framework for tying together these dynamic dependence concepts.
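
The PSF-driven view of dependence described in this abstract can be illustrated with a toy simulation (all names and numbers below are hypothetical, not the paper's model): a shared PSF such as stress spikes after an error and then decays over subsequent tasks, so later error probabilities are inflated without any direct error-to-error coupling.

```python
import random

def simulate_crew(n_tasks=10, base_hep=0.01, psf_boost=5.0, decay=0.5, seed=1):
    """Toy model: a shared performance shaping factor (PSF) rises after an
    error and decays over subsequent tasks (lag/latency), inflating the
    human error probability (HEP) of later tasks."""
    random.seed(seed)
    psf = 0.0          # current PSF activation level, 0 = nominal
    errors = []
    for _ in range(n_tasks):
        hep = base_hep * (1.0 + psf_boost * psf)   # PSF multiplies the nominal HEP
        err = random.random() < hep
        errors.append(err)
        if err:
            psf = 1.0     # an error activates the PSF ...
        else:
            psf *= decay  # ... which then decays task by task (latency)
    return errors

errors = simulate_crew()
```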

  17. Temporal and spatial predictability of an irrelevant event differently affect detection and memory of items in a visual sequence

    Directory of Open Access Journals (Sweden)

    Junji eOhyama

    2016-02-01

    Full Text Available We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable condition, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection reaction times were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images.

  18. Yeast genome sequencing:

    DEFF Research Database (Denmark)

    Piskur, Jure; Langkjær, Rikke Breinhold

    2004-01-01

    For decades, unicellular yeasts have been general models to help understand the eukaryotic cell and also our own biology. Recently, over a dozen yeast genomes have been sequenced, providing the basis to resolve several complex biological questions. Analysis of the novel sequence data has shown...... of closely related species helps in gene annotation and to answer how many genes there really are within the genomes. Analysis of non-coding regions among closely related species has provided an example of how to determine novel gene regulatory sequences, which were previously difficult to analyse because...... they are short and degenerate and occupy different positions. Comparative genomics helps to understand the origin of yeasts and points out crucial molecular events in yeast evolutionary history, such as whole-genome duplication and horizontal gene transfer(s). In addition, the accumulating sequence data provide...

  19. Design of Long Period Pseudo-Random Sequences from the Addition of m-Sequences over

    Directory of Open Access Journals (Sweden)

    Ren Jian

    2004-01-01

    Full Text Available Pseudo-random sequences with good correlation properties and large linear span are widely used in code division multiple access (CDMA) communication systems and in cryptology for reliable and secure information transmission. In this paper, sequences with long period, large complexity, balanced statistics, and low cross-correlation are constructed from the addition of m-sequences with pairwise-prime linear spans (AMPLS). Using m-sequences as building blocks, the proposed method proves to be an efficient and flexible approach for constructing long-period pseudo-random sequences with desirable properties from short-period sequences. Applying the proposed method to , a signal set is constructed.
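
As background for this construction, a minimal sketch of generating m-sequences with linear feedback shift registers (LFSRs) and adding two of them bitwise over GF(2). The tap choices and lengths are illustrative, not the paper's parameters.

```python
def lfsr_sequence(taps, state, length):
    """Generate a binary LFSR output sequence. `taps` are feedback tap
    positions (1-indexed); a primitive feedback polynomial yields an
    m-sequence of period 2**n - 1 for an n-bit register."""
    seq = []
    reg = list(state)
    for _ in range(length):
        seq.append(reg[-1])               # output bit
        fb = 0
        for t in taps:
            fb ^= reg[t - 1]              # XOR of tap bits
        reg = [fb] + reg[:-1]             # shift, insert feedback
    return seq

# m-sequence from a primitive degree-3 polynomial: period 7
m3 = lfsr_sequence(taps=[1, 3], state=[1, 0, 0], length=14)
# m-sequence from a primitive degree-2 polynomial: period 3 (coprime with 7)
m2 = lfsr_sequence(taps=[1, 2], state=[1, 0], length=14)
# bitwise addition over GF(2) gives a combined sequence of period lcm(7, 3) = 21
combined = [a ^ b for a, b in zip(m3, m2)]
```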

  20. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...
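
The automated error detection this thesis describes can be illustrated, in much-simplified form, by a single CEP-style rule (the rule, thresholds, and event format below are our assumptions, not the AAL implementation): raise an alert when several error events from the same source fall within a sliding time window.

```python
from collections import deque, defaultdict

def detect_bursts(events, threshold=3, window=10.0):
    """Minimal complex-event-processing rule: emit an alert when `threshold`
    error events from the same source occur within `window` seconds."""
    recent = defaultdict(deque)   # source -> timestamps of recent errors
    alerts = []
    for ts, source, level in events:
        if level != "ERROR":
            continue
        q = recent[source]
        q.append(ts)
        while q and ts - q[0] > window:   # drop events outside the window
            q.popleft()
        if len(q) >= threshold:
            alerts.append((ts, source))
            q.clear()                     # reset after firing the alert
    return alerts

# Hypothetical event stream: (timestamp, source, severity)
stream = [(0.0, "ros", "ERROR"), (1.0, "hlt", "INFO"), (2.5, "ros", "ERROR"),
          (4.0, "ros", "ERROR"), (30.0, "ros", "ERROR")]
alerts = detect_bursts(stream)
```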

  1. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

    Full Text Available Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, these complexities are inevitably enhanced, in terms of their degrees or intensities, by the usage of communication networks. Therefore, it is of interest to (1) examine how this kind of network-enhanced complexity affects the control or filtering performance and (2) develop suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances in performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, developments in performance analysis and synthesis for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated through a thorough literature review, and possible future research directions are highlighted.

  2. A Bayesian approach to unanticipated events frequency estimation in the decision making context of a nuclear research reactor facility

    International Nuclear Information System (INIS)

    Chatzidakis, S.; Staras, A.

    2013-01-01

    Highlights: • Bayes’ theorem is employed to support the decision making process in a research reactor. • The intention is to calculate parameters related to the unanticipated occurrence of events. • Frequency, posterior distribution and confidence limits are calculated. • The approach is demonstrated using two real-world numerical examples. • The approach can be used even if no failures have been observed. - Abstract: Research reactors are considered multi-tasking environments having the multiple roles of commercial, research and training facilities. Yet, reactor managers have to make decisions, frequently with high economic impact, based on little available knowledge. A systematic approach employing Bayes’ theorem is proposed to support the decision making process in a research reactor environment. This approach is characterized by low complexity, appropriate for research reactor facilities. The methodology is demonstrated through the study of two characteristic events that lead to unanticipated system shutdown, namely the de-energization of the control rod magnet and the flapper valve opening. The results obtained demonstrate the suitability of the Bayesian approach in the decision making context when unanticipated events are considered
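
A standard way to realize such a Bayesian frequency estimate is the conjugate gamma-Poisson update, which indeed yields a usable posterior even when no failures have been observed. A minimal sketch (the prior parameters and data below are illustrative, not the paper's numbers):

```python
import math

def gamma_cdf(x, shape, rate):
    """CDF of a Gamma(shape, rate) with integer shape, via the Poisson sum."""
    lam = rate * x
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(shape))

def posterior_rate(failures, exposure_years, prior_shape=1, prior_rate=1.0):
    """Conjugate gamma-Poisson update: prior Gamma(a, b) plus data (k, T)
    gives posterior Gamma(a + k, b + T). Works even when k = 0."""
    a = prior_shape + failures
    b = prior_rate + exposure_years
    mean = a / b
    # 95th percentile by bisection on the integer-shape gamma CDF
    lo, hi = 0.0, 100.0 * mean + 1.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if gamma_cdf(mid, a, b) < 0.95:
            lo = mid
        else:
            hi = mid
    return mean, hi

# e.g. 2 unanticipated shutdowns in 10 reactor-years, with a weak Gamma(1, 1) prior
mean, upper95 = posterior_rate(failures=2, exposure_years=10.0)
```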

  3. The ATLAS Event Service: A New Approach to Event Processing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00070566; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2015-01-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabi...

  4. Approaches for in silico finishing of microbial genome sequences

    Directory of Open Access Journals (Sweden)

    Frederico Schmitt Kremer

    Full Text Available Abstract The introduction of next-generation sequencing (NGS) had a significant effect on the availability of genomic information, leading to an increase in the number of sequenced genomes from a large spectrum of organisms. Unfortunately, due to the limitations implied by the short-read sequencing platforms, most of these newly sequenced genomes remained as “drafts”, incomplete representations of the whole genetic content. Previous genome sequencing studies indicated that finishing a genome sequenced by NGS, even a bacterial one, may require additional sequencing to fill the gaps, making the entire process very expensive. As such, several in silico approaches have been developed to optimize the genome assemblies and facilitate the finishing process. The present review aims to explore some free (open source, in many cases) tools that are available to facilitate genome finishing.

  5. Approaches for in silico finishing of microbial genome sequences.

    Science.gov (United States)

    Kremer, Frederico Schmitt; McBride, Alan John Alexander; Pinto, Luciano da Silva

    The introduction of next-generation sequencing (NGS) had a significant effect on the availability of genomic information, leading to an increase in the number of sequenced genomes from a large spectrum of organisms. Unfortunately, due to the limitations implied by the short-read sequencing platforms, most of these newly sequenced genomes remained as "drafts", incomplete representations of the whole genetic content. Previous genome sequencing studies indicated that finishing a genome sequenced by NGS, even a bacterial one, may require additional sequencing to fill the gaps, making the entire process very expensive. As such, several in silico approaches have been developed to optimize the genome assemblies and facilitate the finishing process. The present review aims to explore some free (open source, in many cases) tools that are available to facilitate genome finishing.

  6. ASPIE: A Framework for Active Sensing and Processing of Complex Events in the Internet of Manufacturing Things

    Directory of Open Access Journals (Sweden)

    Shaobo Li

    2018-03-01

    Full Text Available Rapid perception and processing of critical monitoring events are essential to ensure the healthy operation of Internet of Manufacturing Things (IoMT)-based manufacturing processes. In this paper, we propose a framework, the active sensing and processing architecture (ASPIE), for active sensing and processing of critical events in IoMT-based manufacturing, based on the characteristics of the IoMT architecture as well as its perception model. A relation model of complex events in manufacturing processes, together with related operators and unified XML-based semantic definitions, is developed to effectively process complex event big data. A template-based processing method for complex events is further introduced to conduct complex event matching using the Apriori frequent item mining algorithm. To evaluate the proposed models and methods, we developed a software platform based on ASPIE for a local chili sauce manufacturing company, which demonstrated the feasibility and effectiveness of the proposed methods for active perception and processing of complex events in IoMT-based manufacturing.
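
Frequent-itemset mining with Apriori, as used for the template matching step above, can be sketched as follows. This is a textbook implementation with made-up event types, not the ASPIE code: each transaction is the set of event types observed in one monitoring interval.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori: return all event sets occurring in at least
    `min_support` transactions (each transaction is a set of event types)."""
    items = sorted({e for t in transactions for e in t})
    frequent, k_sets = {}, [frozenset([i]) for i in items]
    k = 1
    while k_sets:
        # count support of each candidate of size k
        counts = {c: sum(1 for t in transactions if c <= t) for c in k_sets}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # candidates of size k+1, pruned by the Apriori property:
        # every (k)-subset of a candidate must itself be frequent
        k += 1
        cands = {a | b for a in survivors for b in survivors if len(a | b) == k}
        k_sets = [c for c in cands
                  if all(frozenset(s) in survivors for s in combinations(c, k - 1))]
    return frequent

# Hypothetical monitoring transactions (sets of co-occurring event types)
events = [{"temp_high", "pressure_drop"}, {"temp_high", "pressure_drop", "vibration"},
          {"temp_high"}, {"pressure_drop", "vibration"}]
freq = apriori(events, min_support=2)
```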

  7. Event generators for address event representation transmitters

    Science.gov (United States)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (let's call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. Two main approaches for implementing such "AER Generator" circuits have been published in the literature. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event out of the chip, the rest of the neurons in the array were
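
The row-then-column arbitration idea can be modeled behaviorally in a few lines. This is a toy software model with round-robin pointers, not Boahen's circuit: it only shows how one pending (row, column) address is selected from a set of colliding requests.

```python
def arbitrate(active, n_rows, n_cols, row_ptr=0, col_ptr=0):
    """Pick the next event to transmit from a set of pending (row, col)
    addresses: arbitrate among rows first, then among columns within the
    winning row, scanning round-robin from the given pointers."""
    pending_rows = {r for r, _ in active}
    for i in range(n_rows):
        row = (row_ptr + i) % n_rows
        if row in pending_rows:
            break
    else:
        return None   # no pending events anywhere in the array
    cols_in_row = {c for r, c in active if r == row}
    for j in range(n_cols):
        col = (col_ptr + j) % n_cols
        if col in cols_in_row:
            return (row, col)

# Three neurons request simultaneously; the arbiter serializes them
nxt = arbitrate({(2, 3), (2, 0), (5, 1)}, n_rows=8, n_cols=8)
```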

  8. A Single-Cell Biochemistry Approach Reveals PAR Complex Dynamics during Cell Polarization.

    Science.gov (United States)

    Dickinson, Daniel J; Schwager, Francoise; Pintard, Lionel; Gotta, Monica; Goldstein, Bob

    2017-08-21

    Regulated protein-protein interactions are critical for cell signaling, differentiation, and development. For the study of dynamic regulation of protein interactions in vivo, there is a need for techniques that can yield time-resolved information and probe multiple protein binding partners simultaneously, using small amounts of starting material. Here we describe a single-cell protein interaction assay. Single-cell lysates are generated at defined time points and analyzed using single-molecule pull-down, yielding information about dynamic protein complex regulation in vivo. We established the utility of this approach by studying PAR polarity proteins, which mediate polarization of many animal cell types. We uncovered striking regulation of PAR complex composition and stoichiometry during Caenorhabditis elegans zygote polarization, which takes place in less than 20 min. PAR complex dynamics are linked to the cell cycle by Polo-like kinase 1 and govern the movement of PAR proteins to establish polarity. Our results demonstrate an approach to study dynamic biochemical events in vivo. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. The analysis of a complex fire event using multispaceborne observations

    Directory of Open Access Journals (Sweden)

    Andrei Simona

    2018-01-01

    Full Text Available This study documents a complex fire event that occurred in October 2016 in a conflict area of the Middle East. Two fire outbreaks were detected by different spaceborne monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using the ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and comparison with different observational data.

  10. The analysis of a complex fire event using multispaceborne observations

    Science.gov (United States)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a conflict area of the Middle East. Two fire outbreaks were detected by different spaceborne monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using the ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and comparison with different observational data.

  11. Targeted amplicon sequencing (TAS): a scalable next-gen approach to multilocus, multitaxa phylogenetics.

    Science.gov (United States)

    Bybee, Seth M; Bracken-Grissom, Heather; Haynes, Benjamin D; Hermansen, Russell A; Byers, Robert L; Clement, Mark J; Udall, Joshua A; Wilcox, Edward R; Crandall, Keith A

    2011-01-01

    Next-gen sequencing technologies have revolutionized data collection in genetic studies and advanced genome biology to novel frontiers. However, to date, next-gen technologies have been used principally for whole genome sequencing and transcriptome sequencing. Yet many questions in population genetics and systematics rely on sequencing specific genes of known function or diversity levels. Here, we describe a targeted amplicon sequencing (TAS) approach capitalizing on next-gen capacity to sequence large numbers of targeted gene regions from a large number of samples. Our TAS approach is easily scalable, simple in execution, neither time- nor labor-intensive, relatively inexpensive, and can be applied to a broad diversity of organisms and/or genes. Our TAS approach includes a bioinformatic application, BarcodeCrucher, to take raw next-gen sequence reads and perform quality control checks and convert the data into FASTA format organized by gene and sample, ready for phylogenetic analyses. We demonstrate our approach by sequencing targeted genes of known phylogenetic utility to estimate a phylogeny for the Pancrustacea. We generated data from 44 taxa using 68 different 10-bp multiplexing identifiers. The overall quality of data produced was robust and was informative for phylogeny estimation. The potential for this method to produce copious amounts of data from a single 454 plate (e.g., 325 taxa for 24 loci) significantly reduces sequencing expenses incurred from traditional Sanger sequencing. We further discuss the advantages and disadvantages of this method, while offering suggestions to enhance the approach.
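
The demultiplexing step that a tool like BarcodeCrucher performs on raw reads can be sketched generically. The MIDs, sample names, and minimum-length filter below are invented for illustration and do not reflect that tool's actual logic.

```python
def demultiplex(reads, barcodes, min_len=50):
    """Assign raw reads to samples by their 10-bp multiplexing identifier
    (MID) prefix, trim the barcode, and drop short reads as a crude
    quality filter."""
    by_sample = {sample: [] for sample in barcodes.values()}
    unassigned = []
    for read in reads:
        sample = barcodes.get(read[:10])
        if sample is None:
            unassigned.append(read)          # MID not recognized
        elif len(read) - 10 >= min_len:
            by_sample[sample].append(read[10:])   # trim the MID before analysis
    return by_sample, unassigned

# Hypothetical MIDs and reads
mids = {"ACGTACGTAC": "taxonA", "TGCATGCATG": "taxonB"}
reads = ["ACGTACGTAC" + "A" * 60, "TGCATGCATG" + "C" * 20, "GGGGGGGGGG" + "T" * 60]
by_sample, unassigned = demultiplex(reads, mids)
```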

  12. CNE (Embalse nuclear power plant): probabilistic safety study. Electric power supply. Events sequence

    International Nuclear Information System (INIS)

    Figueroa, N.

    1987-01-01

    The plant response to the occurrence of the initiating event 'total loss of electric power supply to class IV and class III' is analyzed. This involves the study of the automatic actions of safety and process systems as well as operator actions. The probabilistic evaluation of the initiating event frequency is performed using fault-tree techniques. The frequency of 'loss of electric power supply to class IV' (λIV = 0.56/year) and the probability of failure on demand of the 'reserve' generating groups (PdIII = 6.79 x 10^-3) contribute to this frequency. As soon as the initiating event occurs, the reactor power must be reduced to 0%, the fuel must be cooled through the thermosiphon, and decay heat has to be removed. The event sequence analysis leads to the conclusion that failure to shut down the reactor with either of the shutdown systems is 'incredible' (10^-6/year). In all cases the fuel is cooled by establishing the thermosiphon, except when a substantial inventory loss exists due to a closure failure of some valve of the pressure and inventory control system. The probability of failure of decay heat removal through the steam generators is of the order of 4 x 10^-4; this removal would be assured by the emergency water system. Therefore, the frequency of the possible core-meltdown sequence is λ = 5 x 10^-9/year when the reactor does not shut down, and λ = 2 x 10^-6/year when heat removal fails. (Author)
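
The figures quoted in this abstract can be combined directly to check the reported orders of magnitude of the sequence frequencies; the sketch below simply multiplies the abstract's numbers (variable names are ours):

```python
# Frequencies and probabilities quoted in the abstract
lambda_class_iv   = 0.56      # /year, loss of class IV power supply
p_standby_fail    = 6.79e-3   # failure on demand of 'reserve' generating groups
p_no_shutdown     = 1e-6      # failure of both shutdown systems ("incredible")
p_no_heat_removal = 4e-4      # failure of decay heat removal via steam generators

# Initiating event: total loss of class IV and class III power
f_initiator = lambda_class_iv * p_standby_fail            # ~3.8e-3 /year

# Sequence frequencies: initiator frequency times conditional failure probability
f_meltdown_no_shutdown = f_initiator * p_no_shutdown      # order 1e-9 /year
f_meltdown_no_cooling  = f_initiator * p_no_heat_removal  # order 1e-6 /year
```

These products land at roughly 4e-9/year and 1.5e-6/year, matching the orders of magnitude (5 x 10^-9 and 2 x 10^-6) reported in the abstract.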

  13. A sequence-based survey of the complex structural organization of tumor genomes

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Colin; Raphael, Benjamin J.; Volik, Stanislav; Yu, Peng; Wu, Chunxiao; Huang, Guiqing; Linardopoulou, Elena V.; Trask, Barbara J.; Waldman, Frederic; Costello, Joseph; Pienta, Kenneth J.; Mills, Gordon B.; Bajsarowicz, Krystyna; Kobayashi, Yasuko; Sridharan, Shivaranjani; Paris, Pamela; Tao, Quanzhou; Aerni, Sarah J.; Brown, Raymond P.; Bashir, Ali; Gray, Joe W.; Cheng, Jan-Fang; de Jong, Pieter; Nefedov, Mikhail; Ried, Thomas; Padilla-Nash, Hesed M.; Collins, Colin C.

    2008-04-03

    The genomes of many epithelial tumors exhibit extensive chromosomal rearrangements. All classes of genome rearrangements can be identified using End Sequencing Profiling (ESP), which relies on paired-end sequencing of cloned tumor genomes. In this study, brain, breast, ovary and prostate tumors along with three breast cancer cell lines were surveyed with ESP yielding the largest available collection of sequence-ready tumor genome breakpoints and providing evidence that some rearrangements may be recurrent. Sequencing and fluorescence in situ hybridization (FISH) confirmed translocations and complex tumor genome structures that include coamplification and packaging of disparate genomic loci with associated molecular heterogeneity. Comparison of the tumor genomes suggests recurrent rearrangements. Some are likely to be novel structural polymorphisms, whereas others may be bona fide somatic rearrangements. A recurrent fusion transcript in breast tumors and a constitutional fusion transcript resulting from a segmental duplication were identified. Analysis of end sequences for single nucleotide polymorphisms (SNPs) revealed candidate somatic mutations and an elevated rate of novel SNPs in an ovarian tumor. These results suggest that the genomes of many epithelial tumors may be far more dynamic and complex than previously appreciated and that genomic fusions including fusion transcripts and proteins may be common, possibly yielding tumor-specific biomarkers and therapeutic targets.

  14. The Flushtration Count Illusion: Attribute substitution tricks our interpretation of a simple visual event sequence.

    Science.gov (United States)

    Thomas, Cyril; Didierjean, André; Kuhn, Gustav

    2018-04-17

    When faced with a difficult question, people sometimes work out an answer to a related, easier question without realizing that a substitution has taken place (e.g., Kahneman, 2011, Thinking, fast and slow. New York, Farrar, Strauss, Giroux). In two experiments, we investigated whether this attribute substitution effect can also affect the interpretation of a simple visual event sequence. We used a magic trick called the 'Flushtration Count Illusion', which involves a technique used by magicians to give the illusion of having seen multiple cards with identical backs, when in fact only the back of one card (the bottom card) is repeatedly shown. In Experiment 1, we demonstrated that most participants are susceptible to the illusion, even if they have the visual and analytical reasoning capacity to correctly process the sequence. In Experiment 2, we demonstrated that participants construct a biased and simplified representation of the Flushtration Count by substituting some attributes of the event sequence. We discuss the psychological processes underlying this attribute substitution effect. © 2018 The British Psychological Society.

  15. Development of technique for estimating primary cooling system break diameter in predicting nuclear emergency event sequence

    International Nuclear Information System (INIS)

    Tatebe, Yasumasa; Yoshida, Yoshitaka

    2012-01-01

    If an emergency event occurs in a nuclear power plant, appropriate action is selected and taken in accordance with the plant status, which changes from time to time, in order to prevent escalation and mitigate the event consequences. It is thus important to predict the event sequence and identify the plant behavior resulting from the action taken. In predicting the event sequence during a loss-of-coolant accident (LOCA), it is necessary to estimate break diameter. The conventional method for this estimation is time-consuming, since it involves multiple sensitivity analyses to determine the break diameter that is consistent with the plant behavior. To speed up the process of predicting the nuclear emergency event sequence, a new break diameter estimation technique that is applicable to pressurized water reactors was developed in this study. This technique enables the estimation of break diameter using the plant data sent from the safety parameter display system (SPDS), with focus on the depressurization rate in the reactor cooling system (RCS) during LOCA. The results of LOCA analysis, performed by varying the break diameter using the MAAP4 and RELAP5/MOD3.2 codes, confirmed that the RCS depressurization rate could be expressed by the log linear function of break diameter, except in the case of a small leak, in which RCS depressurization is affected by the coolant charging system and the high-pressure injection system. A correlation equation for break diameter estimation was developed from this function and tested for accuracy. Testing verified that the correlation equation could estimate break diameter accurately within an error of approximately 16%, even if the leak increases gradually, changing the plant status. (author)
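    The log-linear correlation described in this record can be sketched in a few lines. This is an illustrative reconstruction only: the coefficients are fitted to synthetic data, not to the MAAP4/RELAP5 results, and the variable names, units and numbers are assumptions.

```python
import numpy as np

# Sketch of the correlation idea above: model the RCS depressurization
# rate as a log-linear function of break diameter, rate = a + b*ln(D),
# then invert it to estimate D from an observed rate.
# Diameters (cm) and rates (MPa/s) below are synthetic, hypothetical data.
diameters = np.array([2.0, 5.0, 10.0, 20.0])
rates = np.array([0.05, 0.12, 0.18, 0.24])

b, a = np.polyfit(np.log(diameters), rates, 1)  # fit rate = a + b*ln(D)

def estimate_break_diameter(observed_rate):
    """Invert the fitted log-linear correlation to estimate diameter."""
    return float(np.exp((observed_rate - a) / b))

# A depressurization rate taken from SPDS-type plant data maps back
# to an estimated break diameter:
d_est = estimate_break_diameter(0.15)
```

In practice the fit would be built from code-calculated depressurization rates across a range of break diameters and then inverted against the rate observed in the plant data.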

  16. Sensitivity to structure in action sequences: An infant event-related potential study.

    Science.gov (United States)

    Monroy, Claire D; Gerson, Sarah A; Domínguez-Martínez, Estefanía; Kaduk, Katharina; Hunnius, Sabine; Reid, Vincent

    2017-05-06

    Infants are sensitive to structure and patterns within continuous streams of sensory input. This sensitivity relies on statistical learning, the ability to detect predictable regularities in spatial and temporal sequences. Recent evidence has shown that infants can detect statistical regularities in action sequences they observe, but little is known about the neural processes that give rise to this ability. In the current experiment, we combined electroencephalography (EEG) with eye-tracking to identify electrophysiological markers that indicate whether 8-11-month-old infants detect violations to learned regularities in action sequences, and to relate these markers to behavioral measures of anticipation during learning. In a learning phase, infants observed an actor performing a sequence featuring two deterministic pairs embedded within an otherwise random sequence. Thus, the first action of each pair was predictive of what would occur next. One of the pairs caused an action-effect, whereas the second did not. In a subsequent test phase, infants observed another sequence that included deviant pairs, violating the previously observed action pairs. Event-related potential (ERP) responses were analyzed and compared between the deviant and the original action pairs. Findings reveal that infants demonstrated a greater Negative central (Nc) ERP response to the deviant actions for the pair that caused the action-effect, which was consistent with their visual anticipations during the learning phase. Findings are discussed in terms of the neural and behavioral processes underlying perception and learning of structured action sequences. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. High quality maize centromere 10 sequence reveals evidence of frequent recombination events

    Directory of Open Access Journals (Sweden)

    Thomas Kai Wolfgruber

    2016-03-01

    The ancestral centromeres of maize contain long stretches of the tandemly arranged CentC repeat. The abundance of tandem DNA repeats and centromeric retrotransposons (CR) has presented a significant challenge to completely assembling centromeres using traditional sequencing methods. Here we report a nearly complete assembly of the 1.85 Mb maize centromere 10 from inbred B73 using PacBio technology and BACs from the reference genome project. The error rates estimated from overlapping BAC sequences are 7 x 10^-6 and 5 x 10^-5 for mismatches and indels, respectively. The number of gaps in the region covered by the reassembly was reduced from 140 in the reference genome to three. Three expressed genes are located between 92 and 477 kb of the inferred ancestral CentC cluster, which lies within the region of highest centromeric repeat density. The improved assembly increased the count of full-length centromeric retrotransposons from 5 to 55 and revealed a 22.7 kb segmental duplication that occurred approximately 121,000 years ago. Our analysis provides evidence of frequent recombination events in the form of partial retrotransposons, deletions within retrotransposons, chimeric retrotransposons, segmental duplications including higher order CentC repeats, a deleted CentC monomer, centromere-proximal inversions, and insertion of mitochondrial sequences. Double-strand DNA break (DSB) repair is the most plausible mechanism for these events and may be the major driver of centromere repeat evolution and diversity. This repair appears to be mediated by microhomology, suggesting that tandem repeats may have evolved to facilitate the repair of frequent DSBs in centromeres.

  18. Consolidation of Complex Events via Reinstatement in Posterior Cingulate Cortex

    Science.gov (United States)

    Keidel, James L.; Ing, Leslie P.; Horner, Aidan J.

    2015-01-01

    It is well-established that active rehearsal increases the efficacy of memory consolidation. It is also known that complex events are interpreted with reference to prior knowledge. However, comparatively little attention has been given to the neural underpinnings of these effects. In healthy adult humans, we investigated the impact of effortful, active rehearsal on memory for events by showing people several short video clips and then asking them to recall these clips, either aloud (Experiment 1) or silently while in an MRI scanner (Experiment 2). In both experiments, actively rehearsed clips were remembered in far greater detail than unrehearsed clips when tested a week later. In Experiment 1, highly similar descriptions of events were produced across retrieval trials, suggesting a degree of semanticization of the memories had taken place. In Experiment 2, spatial patterns of BOLD signal in medial temporal and posterior midline regions were correlated when encoding and rehearsing the same video. Moreover, the strength of this correlation in the posterior cingulate predicted the amount of information subsequently recalled. This is likely to reflect a strengthening of the representation of the video's content. We argue that these representations combine both new episodic information and stored semantic knowledge (or “schemas”). We therefore suggest that posterior midline structures aid consolidation by reinstating and strengthening the associations between episodic details and more generic schematic information. This leads to the creation of coherent memory representations of lifelike, complex events that are resistant to forgetting, but somewhat inflexible and semantic-like in nature. SIGNIFICANCE STATEMENT Memories are strengthened via consolidation. We investigated memory for lifelike events using video clips and showed that rehearsing their content dramatically boosts memory consolidation. Using MRI scanning, we measured patterns of brain activity while

  19. Optimization of parameter values for complex pulse sequences by simulated annealing: application to 3D MP-RAGE imaging of the brain.

    Science.gov (United States)

    Epstein, F H; Mugler, J P; Brookeman, J R

    1994-02-01

    A number of pulse sequence techniques, including magnetization-prepared gradient echo (MP-GRE), segmented GRE, and hybrid RARE, employ a relatively large number of variable pulse sequence parameters and acquire the image data during a transient signal evolution. These sequences have recently been proposed and/or used for clinical applications in the brain, spine, liver, and coronary arteries. Thus, the need for a method of deriving optimal pulse sequence parameter values for this class of sequences now exists. Due to the complexity of these sequences, conventional optimization approaches, such as applying differential calculus to signal difference equations, are inadequate. We have developed a general framework for adapting the simulated annealing algorithm to pulse sequence parameter value optimization, and applied this framework to the specific case of optimizing the white matter-gray matter signal difference for a T1-weighted variable flip angle 3D MP-RAGE sequence. Using our algorithm, the values of 35 sequence parameters, including the magnetization-preparation RF pulse flip angle and delay time, 32 flip angles in the variable flip angle gradient-echo acquisition sequence, and the magnetization recovery time, were derived. Optimized 3D MP-RAGE achieved up to a 130% increase in white matter-gray matter signal difference compared with optimized 3D RF-spoiled FLASH with the same total acquisition time. The simulated annealing approach was effective at deriving optimal parameter values for a specific 3D MP-RAGE imaging objective, and may be useful for other imaging objectives and sequences in this general class.
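    The general shape of the simulated-annealing optimization described in this record can be sketched as follows. The objective here is a toy placeholder, not the MP-RAGE signal-difference model of the paper, and the step size, cooling rate and parameter ranges are all assumptions.

```python
import math
import random

# Generic simulated-annealing skeleton of the kind described above.
# signal_difference() is a hypothetical stand-in for the simulated
# white matter-gray matter signal difference; it is NOT the pulse
# sequence model used in the study.
random.seed(0)

def signal_difference(params):
    # Toy smooth objective, maximized when every parameter equals 10.0.
    return -sum((p - 10.0) ** 2 for p in params)

def anneal(n_params=5, steps=5000, t0=1.0, cooling=0.999):
    params = [random.uniform(0.0, 20.0) for _ in range(n_params)]
    best = list(params)
    temp = t0
    for _ in range(steps):
        # Propose a random perturbation of the current parameter vector.
        candidate = [p + random.gauss(0.0, 0.5) for p in params]
        delta = signal_difference(candidate) - signal_difference(params)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability, which shrinks as the system cools.
        if delta > 0 or random.random() < math.exp(delta / temp):
            params = candidate
            if signal_difference(params) > signal_difference(best):
                best = list(params)
        temp *= cooling
    return best

best = anneal()
```

In the paper's setting the parameter vector would hold the 35 sequence parameters (preparation flip angle, delay time, the 32 acquisition flip angles and the recovery time), and the objective would be evaluated from the signal evolution model.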

  20. Greater than the sum of its parts: single-nucleus sequencing identifies convergent evolution of independent EGFR mutants in GBM.

    Science.gov (United States)

    Gini, Beatrice; Mischel, Paul S

    2014-08-01

    Single-cell sequencing approaches are needed to characterize the genomic diversity of complex tumors, shedding light on their evolutionary paths and potentially suggesting more effective therapies. In this issue of Cancer Discovery, Francis and colleagues develop a novel integrative approach to identify distinct tumor subpopulations based on joint detection of clonal and subclonal events from bulk tumor and single-nucleus whole-genome sequencing, allowing them to infer a subclonal architecture. Surprisingly, the authors identify convergent evolution of multiple, mutually exclusive, independent EGFR gain-of-function variants in a single tumor. This study demonstrates the value of integrative single-cell genomics and highlights the biologic primacy of EGFR as an actionable target in glioblastoma. ©2014 American Association for Cancer Research.

  1. A decision theoretic approach to an accident sequence: when feedwater and auxiliary feedwater fail in a nuclear power plant

    International Nuclear Information System (INIS)

    Svenson, Ola

    1998-01-01

    This study applies a decision theoretic perspective on a severe accident management sequence in a processing industry. The sequence contains loss of feedwater and auxiliary feedwater in a boiling water nuclear reactor (BWR), which necessitates manual depressurization of the reactor pressure vessel to enable low pressure cooling of the core. The sequence is fast and is a major contributor to core damage in probabilistic risk analyses (PRAs) of this kind of plant. The management of the sequence also includes important, difficult and fast human decision making. The decision theoretic perspective, which is applied to a Swedish ABB-type reactor, stresses the roles played by uncertainties about plant state, consequences of different actions and goals during the management of a severe accident sequence. Based on a theoretical analysis and empirical simulator data the human error probabilities in the PRA for the plant are considered to be too small. Recommendations for how to improve safety are given and they include full automation of the sequence, improved operator training, and/or actions to assist the operators' decision making through reduction of uncertainties, for example, concerning water/steam level for sufficient cooling, time remaining before insufficient cooling level in the tank is reached and organizational cost-benefit evaluations of the events following a false alarm depressurization as well as the events following a successful depressurization at different points in time. Finally, it is pointed out that the approach exemplified in this study is applicable to any accident scenario which includes difficult human decision making with conflicting goals, uncertain information and with very serious consequences

  2. Observing complex action sequences: The role of the fronto-parietal mirror neuron system.

    Science.gov (United States)

    Molnar-Szakacs, Istvan; Kaplan, Jonas; Greenfield, Patricia M; Iacoboni, Marco

    2006-11-15

    A fronto-parietal mirror neuron network in the human brain supports the ability to represent and understand observed actions allowing us to successfully interact with others and our environment. Using functional magnetic resonance imaging (fMRI), we wanted to investigate the response of this network in adults during observation of hierarchically organized action sequences of varying complexity that emerge at different developmental stages. We hypothesized that fronto-parietal systems may play a role in coding the hierarchical structure of object-directed actions. The observation of all action sequences recruited a common bilateral network including the fronto-parietal mirror neuron system and occipito-temporal visual motion areas. Activity in mirror neuron areas varied according to the motoric complexity of the observed actions, but not according to the developmental sequence of action structures, possibly due to the fact that our subjects were all adults. These results suggest that the mirror neuron system provides a fairly accurate simulation process of observed actions, mimicking internally the level of motoric complexity. We also discuss the results in terms of the links between mirror neurons, language development and evolution.

  3. Integrated sequence analysis. Final report

    International Nuclear Information System (INIS)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of BWR, shutdown LOCA of BWR, steam generator tube rupture of a PWR and BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence specific and more general findings. The sequence specific results are discussed together with each sequence description. The general lessons are discussed under a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally follows a discussion about the project and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  4. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of BWR, shutdown LOCA of BWR, steam generator tube rupture of a PWR and BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence specific and more general findings. The sequence specific results are discussed together with each sequence description. The general lessons are discussed under a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally follows a discussion about the project and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  5. Zseq: An Approach for Preprocessing Next-Generation Sequencing Data.

    Science.gov (United States)

    Alkhateeb, Abedalrhman; Rueda, Luis

    2017-08-01

    Next-generation sequencing technology generates a huge number of reads (short sequences), which contain a vast amount of genomic data. The sequencing process, however, comes with artifacts. Preprocessing of sequences is mandatory for further downstream analysis. We present Zseq, a linear method that identifies the most informative genomic sequences and reduces the number of biased sequences, sequence duplications, and ambiguous nucleotides. Zseq finds the complexity of the sequences by counting the number of unique k-mers in each sequence as its corresponding score and also takes into account other factors, such as ambiguous nucleotides or high GC-content percentage in k-mers. Based on a z-score threshold, Zseq sweeps through the sequences again and filters those with a z-score less than the user-defined threshold. The Zseq algorithm provides a better mapping rate; it reduces the number of ambiguous bases significantly in comparison with other methods. Evaluation of the filtered reads has been conducted by aligning the reads and assembling the transcripts using the reference genome as well as de novo assembly. The assembled transcripts show a better discriminative ability to separate cancer and normal samples in comparison with another state-of-the-art method. Moreover, de novo assembled transcripts from the reads filtered by Zseq have longer genomic sequences than other tested methods. A method for estimating the cutoff threshold using labeling rules is also introduced, with promising results.
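    The core scoring idea in this record (unique k-mer counts filtered by a z-score threshold) can be sketched as follows. This is a conceptual illustration, not the Zseq implementation; the penalties for ambiguous nucleotides and extreme GC content are omitted, and the reads and threshold are toy assumptions.

```python
from collections import Counter  # noqa: F401 (commonly paired with k-mer work)
import statistics

# Score each read by its number of unique k-mers (a simple complexity
# measure), then keep only reads whose z-score meets a user-defined
# threshold. Low-complexity reads (e.g. homopolymers) score low and
# are filtered out.

def unique_kmer_score(read, k=4):
    return len({read[i:i + k] for i in range(len(read) - k + 1)})

def filter_reads(reads, threshold=-1.0, k=4):
    scores = [unique_kmer_score(r, k) for r in reads]
    mean = statistics.fmean(scores)
    sd = statistics.pstdev(scores) or 1.0  # guard against zero spread
    return [r for r, s in zip(reads, scores)
            if (s - mean) / sd >= threshold]

reads = ["ACGTACGTACGT", "AAAAAAAAAAAA", "ACGTTGCAGCTA"]
kept = filter_reads(reads)  # the homopolymer read falls below threshold
```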

  6. InFusion: Advancing Discovery of Fusion Genes and Chimeric Transcripts from Deep RNA-Sequencing Data.

    Directory of Open Access Journals (Sweden)

    Konstantin Okonechnikov

    Analysis of fusion transcripts has become increasingly important due to their link with cancer development. Since high-throughput sequencing approaches survey fusion events exhaustively, several computational methods for the detection of gene fusions from RNA-seq data have been developed. This kind of analysis, however, is complicated by native trans-splicing events, the splicing-induced complexity of the transcriptome and biases and artefacts introduced in experiments and data analysis. There are a number of tools available for the detection of fusions from RNA-seq data; however, certain differences in specificity and sensitivity between commonly used approaches have been found. The ability to detect gene fusions of different types, including isoform fusions and fusions involving non-coding regions, has not been thoroughly studied yet. Here, we propose a novel computational toolkit called InFusion for fusion gene detection from RNA-seq data. InFusion introduces several unique features, such as discovery of fusions involving intergenic regions, and detection of anti-sense transcription in chimeric RNAs based on strand-specificity. Our approach demonstrates superior detection accuracy on simulated data and several public RNA-seq datasets. This improved performance was also evident when evaluating data from RNA deep-sequencing of two well-established prostate cancer cell lines. InFusion identified 26 novel fusion events that were validated in vitro, including alternatively spliced gene fusion isoforms and chimeric transcripts that include intergenic regions. The toolkit is freely available to download from http://bitbucket.org/kokonech/infusion.

  7. Systems genetics of complex diseases using RNA-sequencing methods

    DEFF Research Database (Denmark)

    Mazzoni, Gianluca; Kogelman, Lisette; Suravajhala, Prashanth

    2015-01-01

    Next generation sequencing technologies have enabled the generation of huge quantities of biological data, and nowadays extensive datasets at different ‘omics levels have been generated. Systems genetics is a powerful approach that allows integration of different ‘omics levels and understanding of the bio...

  8. HLA DNA sequence variation among human populations: molecular signatures of demographic and selective events.

    Directory of Open Access Journals (Sweden)

    Stéphane Buhler

    2011-02-01

    Molecular differences between HLA alleles vary up to 57 nucleotides within the peptide binding coding region of human Major Histocompatibility Complex (MHC) genes, but it is still unclear whether this variation results from a stochastic process or from selective constraints related to functional differences among HLA molecules. Although HLA alleles are generally treated as equidistant molecular units in population genetic studies, DNA sequence diversity among populations is also crucial to interpret the observed HLA polymorphism. In this study, we used a large dataset of 2,062 DNA sequences defined for the different HLA alleles to analyze nucleotide diversity of seven HLA genes in 23,500 individuals of about 200 populations spread worldwide. We first analyzed the HLA molecular structure and diversity of these populations in relation to geographic variation and we further investigated possible departures from selective neutrality through Tajima's tests and mismatch distributions. All results were compared to those obtained by classical approaches applied to HLA allele frequencies. Our study shows that the global patterns of HLA nucleotide diversity among populations are significantly correlated to geography, although in some specific cases the molecular information reveals unexpected genetic relationships. At all loci except HLA-DPB1, populations have accumulated a high proportion of very divergent alleles, suggesting an advantage of heterozygotes expressing molecularly distant HLA molecules (asymmetric overdominant selection model). However, both different intensities of selection and unequal levels of gene conversion may explain the heterogeneous mismatch distributions observed among the loci. Also, distinctive patterns of sequence divergence observed at the HLA-DPB1 locus suggest current neutrality but old selective pressures on this gene. We conclude that HLA DNA sequences advantageously complement HLA allele frequencies as a source of data used
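    The mismatch distributions used in the neutrality analyses above are simply histograms of pairwise nucleotide differences among sampled sequences; a minimal sketch (with toy sequences, not HLA data) is:

```python
from itertools import combinations
from collections import Counter

# Count nucleotide differences for every pair of aligned sequences,
# then histogram the counts: that histogram is the mismatch distribution.

def pairwise_differences(seq_a, seq_b):
    return sum(a != b for a, b in zip(seq_a, seq_b))

def mismatch_distribution(seqs):
    return Counter(pairwise_differences(a, b)
                   for a, b in combinations(seqs, 2))

seqs = ["ACGTACGT", "ACGTACGA", "ACGTTCGA", "ACGTACGT"]
dist = mismatch_distribution(seqs)  # e.g. maps "k differences" -> pair count
```

The shape of this histogram (unimodal vs. ragged) is what is compared against demographic and selective expectations.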

  9. Multilocus sequence typing scheme for the Mycobacterium abscessus complex.

    Science.gov (United States)

    Macheras, Edouard; Konjek, Julie; Roux, Anne-Laure; Thiberge, Jean-Michel; Bastian, Sylvaine; Leão, Sylvia Cardoso; Palaci, Moises; Sivadon-Tardy, Valérie; Gutierrez, Cristina; Richter, Elvira; Rüsch-Gerdes, Sabine; Pfyffer, Gaby E; Bodmer, Thomas; Jarlier, Vincent; Cambau, Emmanuelle; Brisse, Sylvain; Caro, Valérie; Rastogi, Nalin; Gaillard, Jean-Louis; Heym, Beate

    2014-01-01

    We developed a multilocus sequence typing (MLST) scheme for Mycobacterium abscessus sensu lato, based on the partial sequencing of seven housekeeping genes: argH, cya, glpK, gnd, murC, pta and purH. This scheme was used to characterize a collection of 227 isolates recovered between 1994 and 2010 in France, Germany, Switzerland and Brazil. We identified 100 different sequence types (STs), which were distributed into three groups on the tree obtained by concatenating the sequences of the seven housekeeping gene fragments (3576 bp): the M. abscessus sensu stricto group (44 STs), the "M. massiliense" group (31 STs) and the "M. bolletii" group (25 STs). SplitTree analysis showed a degree of intergroup lateral transfer. There was also evidence of lateral transfer events involving rpoB. The most prevalent STs in our collection were ST1 (CC5; 20 isolates) and ST23 (CC3; 31 isolates). Both STs were found in Europe and Brazil, and the latter was implicated in a large post-surgical procedure outbreak in Brazil. Respiratory isolates from patients with cystic fibrosis belonged to a large variety of STs; however, ST2 was predominant in this group of patients. Our MLST scheme, publicly available at www.pasteur.fr/mlst, offers investigators a valuable typing tool for M. abscessus sensu lato in future epidemiological studies throughout the world. Copyright © 2013 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.
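    The typing logic of an MLST scheme like this one can be illustrated with a small sketch. Real MLST assigns an allele number per locus and defines the ST as the combination of allele numbers; the simplification below keys STs directly on the concatenated gene fragments. The locus names follow the scheme above; the sequences and isolate names are toy data.

```python
# Simplified ST assignment: isolates with identical concatenated
# profiles across the seven housekeeping genes share a sequence type.
LOCI = ["argH", "cya", "glpK", "gnd", "murC", "pta", "purH"]

def assign_sequence_types(isolates):
    """isolates: dict isolate_name -> dict locus -> sequence string."""
    st_of_profile = {}
    result = {}
    for name, alleles in sorted(isolates.items()):
        profile = "".join(alleles[locus] for locus in LOCI)
        if profile not in st_of_profile:
            st_of_profile[profile] = len(st_of_profile) + 1  # next ST number
        result[name] = st_of_profile[profile]
    return result

isolates = {
    "A": {locus: "ACGT" for locus in LOCI},
    "B": {locus: "ACGT" for locus in LOCI},                   # same profile as A
    "C": {**{locus: "ACGT" for locus in LOCI}, "pta": "ACGA"},  # one variant allele
}
sts = assign_sequence_types(isolates)
```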

  10. A methodological approach for designing and sequencing product families in Reconfigurable Disassembly Systems

    Directory of Open Access Journals (Sweden)

    Ignacio Eguia

    2011-10-01

    Purpose: A Reconfigurable Disassembly System (RDS) represents a new paradigm of automated disassembly system that uses reconfigurable manufacturing technology for fast adaptation to changes in the quantity and mix of products to disassemble. This paper deals with a methodology for designing and sequencing product families in RDS. Design/methodology/approach: The methodology is developed in a two-phase approach, where products are first grouped into families and then families are sequenced through the RDS, computing the required machines and modules configuration for each family. Products are grouped into families based on their common features using a Hierarchical Clustering Algorithm. The optimal sequence of the product families is calculated using a Mixed-Integer Linear Programming model minimizing reconfigurability and operational costs. Findings: This paper focuses on enabling reconfigurable manufacturing technologies to attain some degree of adaptability in disassembly automation design using modular machine tools. Research limitations/implications: The MILP model proposed for the second phase is similar to the well-known Travelling Salesman Problem (TSP) and therefore its complexity grows exponentially with the number of products to disassemble. In real-world problems with a higher number of products, it may be advisable to solve the model approximately with heuristics. Practical implications: The importance of industrial recycling and remanufacturing is growing due to increasing environmental and economic pressures. Disassembly is an important part of remanufacturing systems for reuse and recycling purposes. Automatic disassembly techniques have a growing number of applications in the area of electronics, aerospace, construction and industrial equipment. In this paper, a design and scheduling approach is proposed to apply in this area. Originality/value: This paper presents a new concept called Reconfigurable Disassembly System
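    The family-sequencing step described in this record can be illustrated with a toy sketch. The paper's MILP is replaced here by brute-force enumeration over family orders, which makes the TSP-like cost of the problem explicit; the family names, module sets and cost function are assumptions, not the authors' model.

```python
from itertools import permutations

# Choose the order of product families through the RDS that minimizes
# total reconfiguration cost between consecutive families. Cost is
# modeled (hypothetically) as the number of machine modules that must
# be swapped between one family's configuration and the next.

def reconfig_cost(modules_a, modules_b):
    return len(set(modules_a) ^ set(modules_b))  # symmetric difference

def best_sequence(families):
    def total(order):
        return sum(reconfig_cost(families[a], families[b])
                   for a, b in zip(order, order[1:]))
    return min(permutations(families), key=total)

families = {
    "F1": {"drill", "gripper"},
    "F2": {"drill", "cutter"},
    "F3": {"cutter", "press"},
}
order = best_sequence(families)  # families sharing modules end up adjacent
```

Enumeration is only viable for a handful of families; beyond that, the exponential growth noted in the record is exactly why the authors resort to a MILP formulation (and suggest heuristics for larger instances).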

  11. Complex Event Detection via Multi Source Video Attributes (Open Access)

    Science.gov (United States)

    2013-10-03

    Complex Event Detection via Multi-Source Video Attributes. Zhigang Ma, Yi Yang, Zhongwen Xu, Shuicheng Yan, Nicu Sebe, Alexander G. Hauptmann. ...under its International Research Centre @ Singapore Funding Initiative and administered by the IDM Programme Office, and the Intelligence Advanced

  12. Differential beta-band event-related desynchronization during categorical action sequence planning.

    Directory of Open Access Journals (Sweden)

    Hame Park

    A primate study reported the existence of neurons from the dorso-lateral prefrontal cortex which fired prior to executing categorical action sequences. The authors suggested these activities may represent abstract level information. Here, we aimed to find the neurophysiological representation of planning categorical action sequences at the population level in healthy humans. Previous human studies have shown beta-band event-related desynchronization (ERD) during action planning in humans. Some of these studies showed different levels of ERD according to different types of action preparation. Especially, the literature suggests that variations in cognitive factors rather than physical factors (force, direction, etc.) modulate the level of beta-ERD. We hypothesized that the level of beta-band power will differ according to planning of different categorical sequences. We measured magnetoencephalography (MEG) from 22 subjects performing 11 four-sequence actions--each consisting of one or two of three simple actions--in 3 categories: 'Paired (ooxx)', 'Alternative (oxox)' and 'Repetitive (oooo)' ('o' and 'x' each denoting one of three simple actions). Time-frequency representations were calculated for each category during the planning period, and the corresponding beta-power time-courses were compared. We found beta-ERD during the planning period for all subjects, mostly in the contralateral fronto-parietal areas shortly after visual cue onset. Power increase (transient rebound) followed ERD in 20 out of 22 subjects. Amplitudes differed among categories in 20 subjects for both ERD and transient rebound. In 18 out of 20 subjects the 'Repetitive' category showed the largest ERD and rebound. The current result suggests that beta-ERD in the contralateral frontal/motor/parietal areas during planning is differentiated by the category of action sequences.

  13. BioNano genome mapping of individual chromosomes supports physical mapping and sequence assembly in complex plant genomes.

    Science.gov (United States)

    Staňková, Helena; Hastie, Alex R; Chan, Saki; Vrána, Jan; Tulpová, Zuzana; Kubaláková, Marie; Visendi, Paul; Hayashi, Satomi; Luo, Mingcheng; Batley, Jacqueline; Edwards, David; Doležel, Jaroslav; Šimková, Hana

    2016-07-01

    The assembly of a reference genome sequence of bread wheat is challenging due to its specific features such as the genome size of 17 Gbp, polyploid nature and prevalence of repetitive sequences. BAC-by-BAC sequencing based on chromosomal physical maps, adopted by the International Wheat Genome Sequencing Consortium as the key strategy, reduces problems caused by the genome complexity and polyploidy, but the repeat content still hampers the sequence assembly. Availability of a high-resolution genomic map to guide sequence scaffolding and validate physical map and sequence assemblies would be highly beneficial to obtaining an accurate and complete genome sequence. Here, we chose the short arm of chromosome 7D (7DS) as a model to demonstrate for the first time that it is possible to couple chromosome flow sorting with genome mapping in nanochannel arrays and create a de novo genome map of a wheat chromosome. We constructed a high-resolution chromosome map composed of 371 contigs with an N50 of 1.3 Mb. Long DNA molecules achieved by our approach facilitated chromosome-scale analysis of repetitive sequences and revealed a ~800-kb array of tandem repeats intractable to current DNA sequencing technologies. Anchoring 7DS sequence assemblies obtained by clone-by-clone sequencing to the 7DS genome map provided a valuable tool to improve the BAC-contig physical map and validate sequence assembly on a chromosome-arm scale. Our results indicate that creating genome maps for the whole wheat genome in a chromosome-by-chromosome manner is feasible and that they will be an affordable tool to support the production of improved pseudomolecules. © 2016 The Authors. Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.
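The genome map above is summarized by its N50 (371 contigs, N50 of 1.3 Mb). For readers unfamiliar with the statistic, here is a minimal sketch of its computation, using toy lengths rather than the 7DS data:

```python
def n50(lengths):
    """N50: the length at which the running sum of contig lengths,
    sorted longest first, first reaches half the total assembly size."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running >= total / 2:
            return length

contigs = [5, 4, 3, 2, 1]   # toy lengths; total = 15, half = 7.5
print(n50(contigs))          # 5 + 4 = 9 >= 7.5, so N50 = 4
```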

  14. Sequence Coding and Search System for licensee event reports: user's guide. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Greene, N.M.; Mays, G.T.; Johnson, M.P.

    1985-04-01

Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that nuclear power plant utilities submit to the Nuclear Regulatory Commission (NRC) contain much of this information. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. The system provides a structured format for detailed coding of component, system, and unit effects as well as personnel errors. The database contains all current LERs submitted by nuclear power plant utilities for events occurring since 1981 and is updated continually. This four-volume report documents and describes SCSS in detail. Volume 1 is a user's guide for searching the SCSS database; it incorporates material through February 1985 of the working version of ORNL/NSIC-223, Vol. 1.

  15. Source complexity of the May 20, 2012, Mw 5.9, Ferrara (Italy) event

    Directory of Open Access Journals (Sweden)

    Davide Piccinini

    2012-10-01

A Mw 3.9 foreshock on May 19, 2012, at 23:13 UTC was followed at 02:03 on May 20, 2012, by a Mw 5.9 earthquake that hit a densely populated area in the Po Plain, west of the city of Ferrara, Italy (Figure 1). Over the subsequent 13 days, six Mw > 5 events occurred; of these, the most energetic was a Mw 5.8 earthquake on May 29, 2012, 12 km WSW of the main shock. The toll of this sequence was 17 casualties, hundreds of injured, and severe damage to the historical and cultural heritage of the area. From a seismological point of view, the 2012 earthquake was not an outstanding event in its regional context. The same area was hit in 1996 by a Mw 5.4 earthquake [Selvaggi et al. 2001], and previously in 1986 and in 1967 (DBMI11 [Locati et al. 2011]). The most destructive historical event was the 1570, Imax 8 event, which struck the town of Ferrara [Guidoboni et al. 2007, Rovida et al. 2011]. The 2012 seismic sequence lasted for several weeks and probably developed on a well-known buried thrust fault [Basili et al. 2008, Toscani et al. 2009, DISS Working Group 2010], at depths between 2 km and 10-12 km. […]

  16. Sequence protein identification by randomized sequence database and transcriptome mass spectrometry (SPIDER-TMS): from manual to automatic application of a 'de novo sequencing' approach.

    Science.gov (United States)

    Pascale, Raffaella; Grossi, Gerarda; Cruciani, Gabriele; Mecca, Giansalvatore; Santoro, Donatello; Sarli Calace, Renzo; Falabella, Patrizia; Bianco, Giuliana

Sequence Protein Identification by Randomized Sequence Database and Transcriptome Mass Spectrometry (SPIDER-TMS) is a software package developed at the University of Basilicata in Potenza (Italy), designed to facilitate determination of the amino acid sequence of a peptide and unequivocal identification of proteins in a high-throughput manner, with considerable savings in time, economic resources, and expertise. The package is a valid tool for automating a de novo sequencing approach, overcoming its main limitations, and a versatile platform in the proteomic field for unequivocal identification of proteins from tandem mass spectrometry data. Its strength is that it is a user-friendly, non-statistical approach, so protein identification can be considered unambiguous.

  17. A Dynamic Intelligent Decision Approach to Dependency Modeling of Project Tasks in Complex Engineering System Optimization

    Directory of Open Access Journals (Sweden)

    Tinggui Chen

    2013-01-01

Complex engineering system optimization usually involves multiple projects or tasks. On the one hand, dependency modeling among projects or tasks highlights structures in systems and their environments, which can help in understanding the implications of connectivity for different aspects of system performance and assist in designing, optimizing, and maintaining complex systems. On the other hand, multiple projects or tasks either happen at the same time or are scheduled into a sequence in order to use common resources. In this paper, we propose a dynamic intelligent decision approach to dependency modeling of project tasks in complex engineering system optimization. The approach treats this decision process as a two-stage decision-making problem. In the first stage, a task clustering approach based on modularization is proposed to find a suitable decomposition scheme for a large-scale project. In the second stage, according to the decomposition result, a discrete artificial bee colony (ABC) algorithm inspired by the intelligent foraging behavior of honeybees is developed for the resource-constrained multiproject scheduling problem. Finally, a case study from the engineering design of a chemical processing system is used to illustrate the proposed approach.
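The second-stage optimizer named in this record is a discrete artificial bee colony algorithm. The sketch below shows only the core ABC idea, on a toy continuous minimization with a simplified loop (no onlooker phase), not the authors' discrete scheduling variant; all parameters are illustrative.

```python
import random

random.seed(0)

def abc_minimize(f, dim=2, n_sources=10, limit=20, iters=200, lo=-5.0, hi=5.0):
    """Simplified ABC: employed bees perturb food sources toward random
    partners, better sources replace worse ones (greedy), and sources
    that stagnate past `limit` trials are abandoned by scouts."""
    sources = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_sources)]
    trials = [0] * n_sources
    for _ in range(iters):
        for i, src in enumerate(sources):
            partner = sources[random.randrange(n_sources)]
            j = random.randrange(dim)
            cand = src[:]
            cand[j] += random.uniform(-1, 1) * (src[j] - partner[j])
            if f(cand) < f(src):          # greedy replacement
                sources[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > limit:         # scout: abandon a stale source
                sources[i] = [random.uniform(lo, hi) for _ in range(dim)]
                trials[i] = 0
    return min(sources, key=f)

sphere = lambda x: sum(v * v for v in x)
best = abc_minimize(sphere)
print(round(sphere(best), 4))  # close to the optimum at 0
```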

  18. Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs

    Science.gov (United States)

    Purba, J. H.

    2018-02-01

The initiating events of a nuclear power plant must be identified before probabilistic safety assessment can be applied to the plant. Various types of master logic diagrams (MLDs) have been proposed for searching for initiating events of next-generation nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis on which they are developed. This study proposes another type of MLD approach to find high temperature gas-cooled reactor (HTGR) initiating events. It consists of five functional steps, starting from the top event, which represents the final objective of the safety functions, down to the basic event, which represents the goal of the MLD development: an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirm that the proposed MLD is feasible for finding HTGR initiating events.

  19. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy.

    Science.gov (United States)

    Fidler, Andrew F; Singh, Ved P; Long, Phillip D; Dahlberg, Peter D; Engel, Gregory S

    2013-10-21

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex.

  20. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Fidler, Andrew F.; Singh, Ved P.; Engel, Gregory S. [Department of Chemistry, The Institute for Biophysical Dynamics, and The James Franck Institute, The University of Chicago, Chicago, Illinois 60637 (United States); Long, Phillip D.; Dahlberg, Peter D. [Graduate Program in the Biophysical Sciences, The University of Chicago, Chicago, Illinois 60637 (United States)

    2013-10-21

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex.

  1. Probing energy transfer events in the light harvesting complex 2 (LH2) of Rhodobacter sphaeroides with two-dimensional spectroscopy

    International Nuclear Information System (INIS)

    Fidler, Andrew F.; Singh, Ved P.; Engel, Gregory S.; Long, Phillip D.; Dahlberg, Peter D.

    2013-01-01

    Excitation energy transfer events in the photosynthetic light harvesting complex 2 (LH2) of Rhodobacter sphaeroides are investigated with polarization controlled two-dimensional electronic spectroscopy. A spectrally broadened pulse allows simultaneous measurement of the energy transfer within and between the two absorption bands at 800 nm and 850 nm. The phased all-parallel polarization two-dimensional spectra resolve the initial events of energy transfer by separating the intra-band and inter-band relaxation processes across the two-dimensional map. The internal dynamics of the 800 nm region of the spectra are resolved as a cross peak that grows in on an ultrafast time scale, reflecting energy transfer between higher lying excitations of the B850 chromophores into the B800 states. We utilize a polarization sequence designed to highlight the initial excited state dynamics which uncovers an ultrafast transfer component between the two bands that was not observed in the all-parallel polarization data. We attribute the ultrafast transfer component to energy transfer from higher energy exciton states to lower energy states of the strongly coupled B850 chromophores. Connecting the spectroscopic signature to the molecular structure, we reveal multiple relaxation pathways including a cyclic transfer of energy between the two rings of the complex

  2. Stressful life events and psychological dysfunction in complex regional pain syndrome type I

    NARCIS (Netherlands)

    Geertzen, JHB; de Bruijn-Kofman, AT; de Bruijn, HP; van de Wiel, HBM; Dijkstra, PU

    Objective: To determine to what extent stressful life events and psychological dysfunction play a role in the pathogenesis of Complex Regional Pain Syndrome type I (CRPS). Design: A comparative study between a CRPS group and a control group. Stressful life events and psychological dysfunction

  3. Sequence conservation and combinatorial complexity of Drosophila neural precursor cell enhancers

    Directory of Open Access Journals (Sweden)

    Kuzin Alexander

    2008-08-01

Background: The presence of highly conserved sequences within cis-regulatory regions can serve as a valuable starting point for elucidating the basis of enhancer function. This study focuses on regulation of gene expression during the early events of Drosophila neural development. We describe the use of EvoPrinter and cis-Decoder, a suite of interrelated phylogenetic footprinting and alignment programs, to characterize highly conserved sequences that are shared among co-regulating enhancers. Results: Analysis of in vivo characterized enhancers that drive neural precursor gene expression has revealed that they contain clusters of highly conserved sequence blocks (CSBs) made up of shorter shared sequence elements that are present in different combinations and orientations within the different co-regulating enhancers; these elements either contain known consensus transcription factor binding sites or consist of novel sequences that have not been functionally characterized. The CSBs of co-regulated enhancers share a large number of sequence elements, suggesting that a diverse repertoire of transcription factors may interact in a highly combinatorial fashion to coordinately regulate gene expression. We have used information gained from our comparative analysis to discover an enhancer that directs expression of the nervy gene in neural precursor cells of the CNS and PNS. Conclusion: The combined use of EvoPrinter and cis-Decoder has yielded important insights into the combinatorial appearance of fundamental sequence elements required for neural enhancer function. Each of the 30 enhancers examined conformed to a pattern of highly conserved blocks of sequences containing shared constituent elements. These data establish a basis for further analysis and understanding of neural enhancer function.

  4. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Science.gov (United States)

    Inagaki, Soichi; Henry, Isabelle M; Lieberman, Meric C; Comai, Luca

    2015-01-01

Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches to identifying the genomic location and structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events must be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization by providing an efficient screen for selecting promising lines.
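The junction-finding tool described in this record reduces, at its core, to classifying read pairs by where their mates align: a pair spans a T-DNA/genome junction when one mate maps to the T-DNA sequence and the other to the plant genome. A minimal sketch of that idea, with hypothetical reference names and without real BAM parsing:

```python
def spans_junction(mate1_ref, mate2_ref, tdna_refs=frozenset({"T-DNA"})):
    """True when exactly one side of the pair aligns to a T-DNA reference
    and the other aligns to a genomic reference."""
    refs = {mate1_ref, mate2_ref}
    return bool(refs & tdna_refs) and bool(refs - tdna_refs)

# Toy pairs: (reference of mate 1, reference of mate 2)
pairs = [
    ("Chr1", "T-DNA"),    # junction-spanning pair
    ("Chr2", "Chr2"),     # ordinary genomic pair
    ("T-DNA", "T-DNA"),   # internal to the insert
]
hits = [p for p in pairs if spans_junction(*p)]
print(hits)  # → [('Chr1', 'T-DNA')]
```

A real pipeline would read the mate reference fields from BAM alignments and cluster the genomic coordinates of hits to call insertion sites.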

  5. Rhipicephalus microplus dataset of nonredundant raw sequence reads from 454 GS FLX sequencing of Cot-selected (Cot = 660) genomic DNA

    Science.gov (United States)

    A reassociation kinetics-based approach was used to reduce the complexity of genomic DNA from the Deutsch laboratory strain of the cattle tick, Rhipicephalus microplus, to facilitate genome sequencing. Selected genomic DNA (Cot value = 660) was sequenced using 454 GS FLX technology, resulting in 356...

  6. Consolidation of Complex Events via Reinstatement in Posterior Cingulate Cortex.

    Science.gov (United States)

    Bird, Chris M; Keidel, James L; Ing, Leslie P; Horner, Aidan J; Burgess, Neil

    2015-10-28

It is well-established that active rehearsal increases the efficacy of memory consolidation. It is also known that complex events are interpreted with reference to prior knowledge. However, comparatively little attention has been given to the neural underpinnings of these effects. In healthy adult humans, we investigated the impact of effortful, active rehearsal on memory for events by showing people several short video clips and then asking them to recall these clips, either aloud (Experiment 1) or silently while in an MRI scanner (Experiment 2). In both experiments, actively rehearsed clips were remembered in far greater detail than unrehearsed clips when tested a week later. In Experiment 1, highly similar descriptions of events were produced across retrieval trials, suggesting that a degree of semanticization of the memories had taken place. In Experiment 2, spatial patterns of BOLD signal in medial temporal and posterior midline regions were correlated when encoding and rehearsing the same video. Moreover, the strength of this correlation in the posterior cingulate predicted the amount of information subsequently recalled. This is likely to reflect a strengthening of the representation of the video's content. We argue that these representations combine both new episodic information and stored semantic knowledge (or "schemas"). We therefore suggest that posterior midline structures aid consolidation by reinstating and strengthening the associations between episodic details and more generic schematic information. This leads to the creation of coherent memory representations of lifelike, complex events that are resistant to forgetting, but somewhat inflexible and semantic-like in nature. Copyright © 2015 Bird, Keidel et al.
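The reinstatement measure this record describes, correlating the spatial BOLD pattern at encoding with the pattern during rehearsal of the same clip, can be sketched with synthetic voxel vectors; the data, voxel count, and noise level below are invented for illustration.

```python
import numpy as np

def reinstatement(encoding, rehearsal):
    """Pearson correlation between two spatial activity patterns."""
    return float(np.corrcoef(encoding, rehearsal)[0, 1])

rng = np.random.default_rng(7)
enc = rng.normal(size=200)                       # encoding pattern (200 voxels)
same = enc + rng.normal(scale=0.5, size=200)     # rehearsing the same clip
other = rng.normal(size=200)                     # rehearsing a different clip

# Reinstatement should be higher for the same clip than a different one.
print(reinstatement(enc, same) > reinstatement(enc, other))  # → True
```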

  7. A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    International Nuclear Information System (INIS)

    George-Williams, Hindolo; Patelli, Edoardo

    2016-01-01

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system specific or limited to simple systems since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology is presented. Here, each component is defined as a Semi-Markov stochastic process and via discrete-event simulation, the operation of the system is mimicked. The principles of flow conservation are invoked to determine flow across the system for every performance level change of its components using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability. - Highlights: • A discrete event simulation model based on load flow principles. • Model does not require system path or cut sets. • Applicable to binary and multi-state systems of any topology. • Supports multiple output systems with competing demand. • Model is intuitive and generally applicable.
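A minimal sketch of the simulation idea in this record: each component alternates between performance levels at stochastically sampled times, the clock advances event by event, and system flow is re-evaluated at every state change. The two-component series system, exponential sojourn times, and rates below are illustrative assumptions; the paper's method handles arbitrary topologies via an interior-point flow solver rather than the trivial min() used here.

```python
import random

def simulate(horizon=10_000.0, rate_fail=0.01, rate_repair=0.1):
    """Two identical components in series, each alternating between full
    capacity and failure; returns the fraction of time system flow > 0."""
    caps = [100.0, 100.0]                                   # performance levels
    clocks = [random.expovariate(rate_fail) for _ in caps]  # next event times
    t, up = 0.0, 0.0
    while True:
        i = 0 if clocks[0] <= clocks[1] else 1   # next component to change state
        t_next = min(clocks[i], horizon)
        if min(caps) > 0:                        # series flow: weakest link
            up += t_next - t
        t = t_next
        if t >= horizon:
            return up / horizon
        if caps[i] > 0:                          # failure: capacity drops to zero
            caps[i] = 0.0
            clocks[i] = t + random.expovariate(rate_repair)
        else:                                    # repair: capacity restored
            caps[i] = 100.0
            clocks[i] = t + random.expovariate(rate_fail)

random.seed(1)
print(round(simulate(), 2))  # analytic steady-state availability ≈ (10/11)**2 ≈ 0.83
```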

  8. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features are the average of absolute amplitudes, the variance, the energy ratio, and the polarization rectilinearity, calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
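The moving-window input features named in this record can be sketched as follows. The window length, the STA/LTA-style energy ratio, and the synthetic trace are our assumptions; polarization rectilinearity is omitted because it requires three-component data.

```python
import numpy as np

def window_features(x, win=50):
    """Per-window features on a single-channel waveform: average absolute
    amplitude, variance, and the energy ratio of the next window to the
    current one (a simple STA/LTA-like detector input)."""
    feats = []
    for start in range(0, len(x) - 2 * win, win):
        w = x[start:start + win]
        nxt = x[start + win:start + 2 * win]
        feats.append((
            np.mean(np.abs(w)),                           # average |amplitude|
            np.var(w),                                    # variance
            np.sum(nxt ** 2) / (np.sum(w ** 2) + 1e-12),  # energy ratio
        ))
    return np.array(feats)

# Synthetic trace: Gaussian noise with a burst ("event") in the middle.
rng = np.random.default_rng(0)
x = rng.normal(0, 0.1, 1000)
x[500:550] += np.sin(np.linspace(0, 20 * np.pi, 50))
f = window_features(x)
# The energy ratio peaks at the window just before the burst (start=450).
print(int(np.argmax(f[:, 2])))  # → 9
```

In the study these features feed the hidden layer; here they simply flag where a detector should look.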

  9. Moleculo Long-Read Sequencing Facilitates Assembly and Genomic Binning from Complex Soil Metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    White, Richard Allen; Bottos, Eric M.; Roy Chowdhury, Taniya; Zucker, Jeremy D.; Brislawn, Colin J.; Nicora, Carrie D.; Fansler, Sarah J.; Glaesemann, Kurt R.; Glass, Kevin; Jansson, Janet K.; Langille, Morgan

    2016-06-28

    Soil metagenomics has been touted as the “grand challenge” for metagenomics, as the high microbial diversity and spatial heterogeneity of soils make them unamenable to current assembly platforms. Here, we aimed to improve soil metagenomic sequence assembly by applying the Moleculo synthetic long-read sequencing technology. In total, we obtained 267 Gbp of raw sequence data from a native prairie soil; these data included 109.7 Gbp of short-read data (~100 bp) from the Joint Genome Institute (JGI), an additional 87.7 Gbp of rapid-mode read data (~250 bp), plus 69.6 Gbp (>1.5 kbp) from Moleculo sequencing. The Moleculo data alone yielded over 5,600 reads of >10 kbp in length, and over 95% of the unassembled reads mapped to contigs of >1.5 kbp. Hybrid assembly of all data resulted in more than 10,000 contigs over 10 kbp in length. We mapped three replicate metatranscriptomes derived from the same parent soil to the Moleculo subassembly and found that 95% of the predicted genes, based on their assignments to Enzyme Commission (EC) numbers, were expressed. The Moleculo subassembly also enabled binning of >100 microbial genome bins. We obtained via direct binning the first complete genome, that of “Candidatus Pseudomonas sp. strain JKJ-1”, from a native soil metagenome. By mapping metatranscriptome sequence reads back to the bins, we found that several bins corresponding to low-relative-abundance Acidobacteria were highly transcriptionally active, whereas bins corresponding to high-relative-abundance Verrucomicrobia were not. These results demonstrate that Moleculo sequencing provides a significant advance for resolving complex soil microbial communities.

    IMPORTANCE: Soil microorganisms carry out key processes for life on our planet, including cycling of carbon and other nutrients and supporting growth of plants. However, there is poor molecular-level understanding of their

  10. A new approach to investigate an eruptive paroxysmal sequence using camera and strainmeter networks: Lessons from the 3-5 December 2015 activity at Etna volcano

    Science.gov (United States)

    Bonaccorso, A.; Calvari, S.

    2017-10-01

Explosive sequences are quite common at basaltic and andesitic volcanoes worldwide. Studies aimed at short-term forecasting are usually based on seismic and ground deformation measurements, which can be used to constrain the source region and quantify the magma volume involved in the eruptive process. However, for single episodes of explosive sequences, integration of camera remote sensing and geophysical data is scant in the literature, and the total volume of pyroclastic products is not determined. In this study, we calculate eruption parameters for four powerful lava fountains occurring at the main and oldest Mt. Etna summit crater, Voragine, between 3 and 5 December 2015. These episodes produced impressive eruptive columns and plume clouds, causing lapilli and ash fallout more than 100 km away. We analyse these paroxysmal events by integrating the images recorded by a network of monitoring cameras and the signals from three high-precision borehole strainmeters. From the camera images we calculated the total erupted volume of fluids (gas plus pyroclastics), inferring amounts from 1.9 × 10⁹ m³ (first event) to 0.86 × 10⁹ m³ (third event). Strain changes recorded during the first and most powerful event were used to constrain the depth of the source. The ratios of strain changes recorded at two stations during the four lava fountains were used to constrain the pyroclastic fraction for each eruptive event. The results revealed that the explosive sequence was characterized by a decreasing trend of erupted pyroclastics with time, going from 41% (first event) to 13% (fourth event) of the total erupted pyroclastic volume. Moreover, the fluid/pyroclastic volume ratio decreased markedly in the fourth and last event. To the best of our knowledge, this is the first time that erupted volumes of both fluids and pyroclastics have been estimated for an explosive sequence from a monitoring system using permanent cameras and high-precision strainmeters. During future

  11. CISAPS: Complex Informational Spectrum for the Analysis of Protein Sequences

    Directory of Open Access Journals (Sweden)

    Charalambos Chrysostomou

    2015-01-01

Complex informational spectrum analysis for protein sequences (CISAPS) and its web-based server are developed and presented. As recent studies show, use of the absolute spectrum alone in informational spectrum analysis of protein sequences is insufficient. CISAPS was therefore developed to consider and report results in three forms: the absolute, real, and imaginary spectra. Biologically relevant features, illustrated here through a case study on influenza A subtypes, can appear individually in either the real or the imaginary spectrum. As the results show, protein classes can present similarities or differences according to the features extracted by the CISAPS web server; these associations are likely related to the property that the specific amino acid index represents. In addition, various technical issues that may affect the analysis, such as zero-padding and windowing, are also addressed. CISAPS uses an expanded list of 611 unique amino acid indices, each representing a different property, to perform the analysis. This web-based server enables researchers with little knowledge of signal processing methods to apply complex informational spectrum analysis in their work.
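A minimal sketch of the informational-spectrum idea that CISAPS generalizes: encode the sequence with one numerical amino-acid index, take the DFT, and report the absolute, real, and imaginary parts. The EIIP values below cover only the residues of the toy sequence (CISAPS itself draws on 611 indices), and the sequence is invented.

```python
import numpy as np

# Commonly tabulated EIIP (electron-ion interaction potential) values
# for the residues used in the demo sequence only.
EIIP = {"A": 0.0373, "G": 0.0050, "L": 0.0000, "S": 0.0829, "V": 0.0057}

def informational_spectrum(seq, index=EIIP):
    """Return (absolute, real, imaginary) spectra of the encoded sequence."""
    signal = np.array([index[aa] for aa in seq], dtype=float)
    signal -= signal.mean()          # remove the DC component
    spec = np.fft.rfft(signal)
    return np.abs(spec), spec.real, spec.imag

absolute, real, imag = informational_spectrum("GAVLSGAVLS")
# Sanity check: |S|^2 = Re^2 + Im^2 at every frequency bin.
print(np.allclose(absolute ** 2, real ** 2 + imag ** 2))  # → True
```

The point the record makes is visible here: a feature can sit almost entirely in the real or the imaginary part while still contributing to the absolute spectrum.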

  12. Prediction of Increasing Production Activities using Combination of Query Aggregation on Complex Events Processing and Neural Network

    Directory of Open Access Journals (Sweden)

    Achmad Arwan

    2016-07-01

Productions, orders, sales, and shipments are series of interrelated events within the manufacturing industry, and these events are recorded in the event log. Complex event processing is a method used to analyze whether patterns of combinations of certain events (opportunities/threats) occur in a system, so that they can be addressed quickly and appropriately. An artificial neural network is the method used to classify production-increase activities. The series of events that cause increases in production is used as a dataset to train the weights of the neural network, which yield an activation value. The aggregated stream of events is fed into the neural network input to compute the activation value. When the activation value exceeds a specified threshold, the system emits a signal to increase production; otherwise, the system keeps monitoring events. Experimental results show that the accuracy of this method is 77% over 39 event streams. Keywords: complex event processing, event, artificial neural network, production increase prediction, process.
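The aggregation-plus-activation decision described in this record can be sketched as follows. The event types, weights, bias, and threshold are invented stand-ins for what a trained network and a real CEP query would supply.

```python
import math

WEIGHTS = {"order": 0.8, "sale": 0.6, "shipment": 0.3}  # hypothetical trained weights
BIAS = -2.0
THRESHOLD = 0.5

def activation(event_log):
    """Aggregate counts per event type (the CEP query-aggregation step),
    then pass the weighted sum through a logistic activation."""
    counts = {}
    for event in event_log:
        counts[event] = counts.get(event, 0) + 1
    z = BIAS + sum(WEIGHTS[e] * n for e, n in counts.items())
    return 1.0 / (1.0 + math.exp(-z))

def should_increase_production(event_log):
    return activation(event_log) > THRESHOLD

print(should_increase_production(["order", "order", "sale", "shipment"]))  # → True
print(should_increase_production(["shipment"]))                            # → False
```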

  13. Some critical remarks on a sequence of events interpreted to possibly originate from a decay chain of an element 120 isotope

    Energy Technology Data Exchange (ETDEWEB)

    Hessberger, F.P. [GSI - Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Helmholtz-Institut Mainz, Mainz (Germany); Ackermann, D. [GANIL, Caen (France)

    2017-06-15

    A sequence of three events observed in an irradiation of {sup 248}Cm with {sup 54}Cr at the velocity filter SHIP of the GSI - Helmholtzzentrum fuer Schwerionenforschung GmbH, 64291 Darmstadt, Germany, had been interpreted as a decay chain consisting of three α particles. On the basis of measured energies, a possible assignment to the decay of an isotope of element 120 was discussed, although it was stated that a definite assignment could not be made. A critical analysis of the data, however, shows that the reported events do not have the properties of a decay chain consisting of three α particles and (probably being terminated by) a spontaneous fission event, but that this is rather a random sequence of events. (orig.)

  14. Approaching human language with complex networks

    Science.gov (United States)

    Cong, Jin; Liu, Haitao

    2014-12-01

    The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
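    As a concrete illustration of the first line of research, a word adjacency network (a common linguistic-network construction) can be built from a token stream; the example sentence is invented:

```python
from collections import defaultdict

def adjacency_network(tokens):
    """Word adjacency network: nodes are word types, with an undirected
    edge between words that appear next to each other in the stream."""
    edges = defaultdict(set)
    for a, b in zip(tokens, tokens[1:]):
        if a != b:
            edges[a].add(b)
            edges[b].add(a)
    return edges

net = adjacency_network("the cat sat on the mat".split())
# Node degrees -- the kind of system-level quantity such studies measure.
print(sorted((w, len(nbrs)) for w, nbrs in net.items()))
```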

  15. A time warping approach to multiple sequence alignment.

    Science.gov (United States)

    Arribas-Gil, Ana; Matias, Catherine

    2017-04-25

    We propose an approach for multiple sequence alignment (MSA) derived from the dynamic time warping viewpoint and recent techniques of curve synchronization developed in the context of functional data analysis. Starting from pairwise alignments of all the sequences (viewed as paths in a certain space), we construct a median path that represents the MSA we are looking for. We establish a proof of concept that our method could be an interesting ingredient to include in refined MSA techniques. We present a simple synthetic experiment as well as the study of a benchmark dataset, together with comparisons with two widely used MSA software packages.
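    The pairwise alignments that seed the construction come from dynamic time warping; the textbook DTW cost recursion below sketches that starting ingredient (not the authors' median-path step):

```python
def dtw(a, b, dist=lambda x, y: abs(x - y)):
    """Classic dynamic time warping cost between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(a[i - 1], b[j - 1])
            # Cheapest way to reach (i, j): match, insertion, or deletion.
            D[i][j] = c + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

print(dtw([1, 2, 3], [1, 2, 2, 3]))  # 0.0: warping absorbs the repeated value
```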

  16. The Pagoda Sequence: a Ramble through Linear Complexity, Number Walls, D0L Sequences, Finite State Automata, and Aperiodic Tilings

    Directory of Open Access Journals (Sweden)

    Fred Lunnon

    2009-06-01

    Full Text Available We review the concept of the number wall as an alternative to the traditional linear complexity profile (LCP), and sketch the relationship to other topics such as linear feedback shift-register (LFSR) and context-free Lindenmayer (D0L) sequences. A remarkable ternary analogue of the Thue-Morse sequence is introduced, having deficiency 2 modulo 3, and this property is verified via the re-interpretation of the number wall as an aperiodic plane tiling.
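    For reference, the traditional linear complexity profile that the number wall offers an alternative to is computed by the Berlekamp-Massey algorithm; a standard GF(2) version (not the number-wall construction itself):

```python
def linear_complexity_gf2(bits):
    """Berlekamp-Massey over GF(2): length of the shortest LFSR that
    generates the given bit sequence (the classic LCP building block)."""
    n = len(bits)
    c, b = [0] * n, [0] * n
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        d = bits[i]                      # discrepancy
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m):   # c(x) += x^(i-m) * b(x)
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L

print(linear_complexity_gf2([0, 1, 0, 1, 0, 1]))  # 2: a period-2 LFSR suffices
```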

  17. HEURISTIC METHODS FOR TEST SEQUENCING IN TELECOMMUNICATION SATELLITES

    OpenAIRE

    Boche-Sauvan , Ludivine; Cabon , Bertrand; Huguet , Marie-José; Hébrard , Emmanuel

    2014-01-01

    Conference with proceedings and peer-review committee; international. International audience. The increasing complexity in telecommunication satellite payload design demands new test sequencing approaches to face the induced increase of validation complexity. In such a sequencing problem, each test requires specific payload equipment states at a dedicated temperature range. The industrial objectives are both to reduce the total time of tests and to keep the thermal stability of the payload. For solving ...

  18. A Macro-Observation Scheme for Abnormal Event Detection in Daily-Life Video Sequences

    Directory of Open Access Journals (Sweden)

    Chiu Wei-Yao

    2010-01-01

    Full Text Available Abstract We propose a macro-observation scheme for abnormal event detection in daily life. The proposed macro-observation representation records the time-space energy of motions of all moving objects in a scene without segmenting individual object parts. The energy history of each pixel in the scene is instantly updated with exponential weights without explicitly specifying the duration of each activity. Since possible activities in daily life are numerous and distinct from each other and not all abnormal events can be foreseen, images from a video sequence that spans sufficient repetition of normal day-to-day activities are first randomly sampled. A constrained clustering model is proposed to partition the sampled images into groups. A newly observed event that has a distinct distance from all of the cluster centroids is then classified as an anomaly. The proposed method has been evaluated on the daily work of a laboratory and on the BEHAVE benchmark dataset. The experimental results reveal that it detects abnormal events such as burglary and fighting well, as long as they last for a sufficient duration of time. The proposed method can be used as a support system for scenes that require full-time monitoring personnel.
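    The two core ingredients above, the exponentially weighted per-pixel energy history and the centroid-distance anomaly rule, can be sketched as follows; the decay factor and radius are illustrative, not the paper's parameters:

```python
def update_energy(energy, motion, alpha=0.9):
    """Exponentially weighted time-space energy per pixel:
    new = alpha * old + (1 - alpha) * current motion magnitude."""
    return [[alpha * e + (1 - alpha) * m
             for e, m in zip(erow, mrow)]
            for erow, mrow in zip(energy, motion)]

def is_anomalous(sample, centroids, radius):
    """Flag an observation whose distance to every cluster centroid
    exceeds the given radius."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return all(dist(sample, c) > radius for c in centroids)

energy = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(3):                          # three frames of constant motion
    energy = update_energy(energy, [[1.0, 0.0], [0.0, 1.0]])
print(energy[0][0])  # ~0.271; approaches 1.0 where motion persists
```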

  19. Identifying driver mutations in sequenced cancer genomes

    DEFF Research Database (Denmark)

    Raphael, Benjamin J; Dobson, Jason R; Oesper, Layla

    2014-01-01

    High-throughput DNA sequencing is revolutionizing the study of cancer and enabling the measurement of the somatic mutations that drive cancer development. However, the resulting sequencing datasets are large and complex, obscuring the clinically important mutations in a background of errors, noise … patterns of mutual exclusivity. These techniques, coupled with advances in high-throughput DNA sequencing, are enabling precision medicine approaches to the diagnosis and treatment of cancer.

  20. A Novel Approach For Event-By-Event Early Gluon Fields

    Science.gov (United States)

    Fries, Rainer J.; Rose, Steven

    2017-08-01

    We report on efforts to construct an event generator that calculates the classical gluon field generated at early times in high energy nuclear collisions. Existing approaches utilize numerical solutions of the Yang-Mills equations after the collision. In contrast we employ the analytically known recursion relation in the forward light cone. The few lowest orders are expected to lead to reliable results for times of up to the inverse saturation scale, τ0 ∼ 1/Qs. In these proceedings we sketch some calculational details, and show some preliminary results.

  1. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  2. Trajectories of Childbearing among HIV Infected Indian Women : A Sequence Analysis Approach

    NARCIS (Netherlands)

    Darak, Shrinivas; Mills, Melinda; Kulkarni, Vinay; Kulkarni, Sanjeevani; Hutter, Inge; Janssen, Fanny

    2015-01-01

    Background HIV infection closely relates to and deeply affects the reproductive career of those infected. However, little is known about the reproductive career trajectories, specifically the interaction of the timing of HIV diagnosis with the timing and sequencing of reproductive events among HIV

  3. Trajectories of childbearing among HIV infected Indian women: A sequence analysis approach

    NARCIS (Netherlands)

    Darak, S.; Mills, M.; Kulkarni, V.; Kulkarni, S.; Hutter, I.; Janssen, F.

    2015-01-01

    Background HIV infection closely relates to and deeply affects the reproductive career of those infected. However, little is known about the reproductive career trajectories, specifically the interaction of the timing of HIV diagnosis with the timing and sequencing of reproductive events among HIV

  4. Trajectories of childbearing among HIV infected Indian women : A sequence analysis approach

    NARCIS (Netherlands)

    S. Darak (Shrinivas); M. Mills (Melinda); V. Kulkarni (Vinay); S. Kulkarni (Sanjeevani); I. Hutter (Inge); F. Janssen (Fanny)

    2015-01-01

    textabstractHIV infection closely relates to and deeply affects the reproductive career of those infected. However, little is known about the reproductive career trajectories, specifically the interaction of the timing of HIV diagnosis with the timing and sequencing of reproductive events among HIV

  5. Information visualization for the Structural Complexity Management Approach

    OpenAIRE

    Maurer, Maik; Braun, Thomas; Lindemann, Udo

    2017-01-01

    The handling of complexity poses an important challenge and a success factor for product design. A considerable percentage of complexity results from dependencies between system elements – as adaptations to single system elements can cause far-reaching consequences. The Structural Complexity Management (SCM) approach provides a five-step procedure that supports users in the identification, acquisition, analysis and optimization of system dependencies. The approach covers the handling of multi...

  6. Key Issues in Modeling of Complex 3D Structures from Video Sequences

    Directory of Open Access Journals (Sweden)

    Shengyong Chen

    2012-01-01

    Full Text Available Construction of three-dimensional structures from video sequences has wide applications for intelligent video analysis. This paper summarizes the key issues of the theory and surveys the recent advances in the state of the art. Reconstruction of a scene object from video sequences often takes the basic principle of structure from motion with an uncalibrated camera. This paper lists the typical strategies and summarizes the typical solutions or algorithms for modeling of complex three-dimensional structures. Open difficult problems are also suggested for further study.

  7. Extreme flood event reconstruction spanning the last century in the El Bibane Lagoon (southeastern Tunisia): a multi-proxy approach

    Directory of Open Access Journals (Sweden)

    A. Affouri

    2017-06-01

    Full Text Available Climate models project that rising atmospheric carbon dioxide concentrations will increase the frequency and the severity of some extreme weather events. Flood events represent a major risk for populations and infrastructures settled on coastal lowlands. Recent studies of lagoon sediments have enhanced our knowledge of extreme hydrological events such as palaeo-storms and of their relation with climate change over the last millennium. However, few studies have been undertaken to reconstruct past flood events from lagoon sediments. Here, past flood activity was investigated using a multi-proxy approach combining sedimentological and geochemical analysis of surface sediments from a southeastern Tunisian catchment in order to trace the origin of sediment deposits in the El Bibane Lagoon. Three sediment sources were identified: marine, fluvial and aeolian. When applying this multi-proxy approach to core BL12-10, recovered from the El Bibane Lagoon, we can see that finer material, a high content of clay and silt, and high elemental ratios (Fe/Ca and Ti/Ca) characterise the sedimentological signature of the palaeo-flood levels identified in the lagoonal sequence. For the last century, which is the period covered by the BL12-10 short core, three palaeo-flood events were identified. The ages of these flood events were determined by 210Pb and 137Cs chronology as AD 1995 ± 6, 1970 ± 9 and 1945 ± 9. These results show a good temporal correlation with historical flood events recorded in southern Tunisia in the last century (AD 1932, 1969, 1979 and 1995). Our findings suggest that reconstruction of the history of extreme hydrological events during the upper Holocene is possible in this location through the use of sedimentary archives.
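    The geochemical flagging described above (levels enriched in the detrital ratios Fe/Ca and Ti/Ca) can be sketched as a simple down-core scan; all counts and thresholds below are invented for illustration:

```python
def flag_flood_levels(fe, ti, ca, fe_ca_thr=1.0, ti_ca_thr=0.1):
    """Flag core depths where both detrital ratios Fe/Ca and Ti/Ca
    exceed their thresholds (thresholds illustrative, not the study's)."""
    return [f / c > fe_ca_thr and t / c > ti_ca_thr
            for f, t, c in zip(fe, ti, ca)]

# Hypothetical down-core XRF counts at four depths.
fe = [120, 480, 130, 510]
ti = [8, 45, 9, 50]
ca = [400, 300, 420, 310]
print(flag_flood_levels(fe, ti, ca))  # [False, True, False, True]
```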

  8. A two-step recognition of signal sequences determines the translocation efficiency of proteins.

    Science.gov (United States)

    Belin, D; Bost, S; Vassalli, J D; Strub, K

    1996-02-01

    The cytosolic and secreted, N-glycosylated, forms of plasminogen activator inhibitor-2 (PAI-2) are generated by facultative translocation. To study the molecular events that result in the bi-topological distribution of proteins, we determined in vitro the capacities of several signal sequences to bind the signal recognition particle (SRP) during targeting, and to promote vectorial transport of murine PAI-2 (mPAI-2). Interestingly, the six signal sequences we compared (mPAI-2 and three mutated derivatives thereof, ovalbumin and preprolactin) were found to have differential activities in the two events. For example, the mPAI-2 signal sequence first binds SRP with moderate efficiency and secondly promotes the vectorial transport of only a fraction of the SRP-bound nascent chains. Our results provide evidence that the translocation efficiency of proteins can be controlled by the recognition of their signal sequences at two steps: during SRP-mediated targeting and during formation of a committed translocation complex. This second recognition may occur at several time points during the insertion/translocation step. In conclusion, signal sequences have a more complex structure than previously anticipated, allowing for multiple and independent interactions with the translocation machinery.

  9. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Directory of Open Access Journals (Sweden)

    Soichi Inagaki

    Full Text Available Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the needs for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.
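    The read-pair scan described above can be sketched with exact substring matching standing in for read alignment; the genome and T-DNA border sequences below are invented for illustration:

```python
def spans_junction(read1, read2, tdna_border, genome):
    """A pair spans a T-DNA-genome junction when one mate lies in the
    T-DNA border and the other in the host genome (exact-match sketch;
    real pipelines would use a read aligner instead)."""
    in_tdna = lambda r: r in tdna_border
    in_genome = lambda r: r in genome
    return (in_tdna(read1) and in_genome(read2)) or \
           (in_tdna(read2) and in_genome(read1))

# Invented example sequences, not real Arabidopsis or vector data.
genome = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
tdna = "TGGCAGGATATATTGTGGTGTAAACA"
print(spans_junction("GGATATATTGTGG", "ATTGTAATGGGCC", tdna, genome))  # True
```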

  10. Metaheuristic approaches to order sequencing on a unidirectional picking line

    Directory of Open Access Journals (Sweden)

    AP de Villiers

    2013-06-01

    Full Text Available In this paper the sequencing of orders on a unidirectional picking line is considered. The aim of the order sequencing is to minimise the number of cycles travelled by a picker within the picking line to complete all orders. A tabu search, simulated annealing, a genetic algorithm, generalised extremal optimisation and a random local search are presented as possible solution approaches. Computational results based on real-life data instances are presented for these metaheuristics and compared to the performance of a lower bound and the solutions used in practice. The random local search exhibits the best overall solution quality; however, the generalised extremal optimisation approach delivers comparable results in considerably shorter computational times.
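    A minimal version of the random local search (the best performer above) on a toy picking line; the cycle-count model and the order data are simplified assumptions, not the paper's exact formulation:

```python
import random

def cycles(seq, orders):
    """Cycles a picker travels on a unidirectional line: a new cycle
    starts whenever the next pick location is not ahead of the current
    position (simplified model)."""
    pos, wraps = -1, 0
    for o in seq:
        for loc in sorted(orders[o]):
            if loc <= pos:
                wraps += 1          # passed the end: start a new cycle
                pos = -1
            pos = loc
    return wraps + 1 if seq else 0

def random_local_search(orders, iters=2000, seed=0):
    """Swap two orders at random; keep the swap when the cycle count
    does not worsen (a minimal sketch of a random local search)."""
    rng = random.Random(seed)
    seq = list(range(len(orders)))
    best = cycles(seq, orders)
    for _ in range(iters):
        i, j = rng.randrange(len(seq)), rng.randrange(len(seq))
        seq[i], seq[j] = seq[j], seq[i]
        c = cycles(seq, orders)
        if c <= best:
            best = c
        else:
            seq[i], seq[j] = seq[j], seq[i]   # undo a worsening swap
    return seq, best

orders = [[5, 9], [1, 3], [2, 8]]     # pick locations per order (invented)
seq, best = random_local_search(orders)
print(seq, best)
```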

  11. Solving the Water Jugs Problem by an Integer Sequence Approach

    Science.gov (United States)

    Man, Yiu-Kwong

    2012-01-01

    In this article, we present an integer sequence approach to solve the classic water jugs problem. The solution steps can be obtained easily by additions and subtractions only, which is suitable for manual calculation or programming by computer. This approach can be introduced to secondary and undergraduate students, and also to teachers and…
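    The additions-and-subtractions idea can be illustrated as follows: repeatedly filling jug a and pouring into jug b (emptying b on overflow) makes the amount in b step through the residues (k·a) mod b. A small sketch under that assumption, tracking only the larger jug:

```python
import math

def measure(a, b, target):
    """Integer-sequence view of the water jugs problem: with jug sizes
    a < b, the amounts reachable in jug b after successive fill-and-pour
    rounds are (k*a) mod b; return the round k that reaches target."""
    if target % math.gcd(a, b) != 0 or target > b:
        return None                     # target unreachable
    amount = 0
    for k in range(1, b + 1):           # at most b distinct residues
        amount = (amount + a) % b       # add a; subtract b on overflow
        if amount == target:
            return k
    return None

print(measure(3, 5, 4))  # 3 rounds: amounts in the 5-jug go 3 -> 1 -> 4
```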

  12. Strategic Cognitive Sequencing: A Computational Cognitive Neuroscience Approach

    Directory of Open Access Journals (Sweden)

    Seth A. Herd

    2013-01-01

    Full Text Available We address strategic cognitive sequencing, the “outer loop” of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected relative to its importance for systematic reasons but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the next, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or “self-instruction”). The last shows how a constraint satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a “bridging” state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.

  13. Sequence Algebra, Sequence Decision Diagrams and Dynamic Fault Trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine B.

    2011-01-01

    Considerable attention has been focused on Dynamic Fault Trees in the past few years. By adding new gates to static (regular) Fault Trees, Dynamic Fault Trees aim to take into account dependencies among events. Merle et al. proposed recently an algebraic framework to give a formal interpretation to these gates. In this article, we extend Merle et al.'s work by adopting a slightly different perspective. We introduce Sequence Algebras that can be seen as Algebras of Basic Events, representing failures of non-repairable components. We show how to interpret Dynamic Fault Trees within this framework. Finally, we propose a new data structure to encode sets of sequences of Basic Events: Sequence Decision Diagrams. Sequence Decision Diagrams are very much inspired by Minato's Zero-Suppressed Binary Decision Diagrams. We show that all operations of Sequence Algebras can be performed on this data structure.
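    As a toy illustration of the algebra (not Rauzy's Sequence Decision Diagram encoding), sets of sequences of basic events with union and a "followed-by" product can be modelled directly on explicit sets:

```python
class SeqSet:
    """Toy model of a Sequence Algebra: a set of sequences of basic
    events, with union and a 'followed-by' product. Explicit sets only;
    the paper's contribution is the compact decision-diagram encoding."""
    def __init__(self, seqs):
        self.seqs = frozenset(tuple(s) for s in seqs)
    def union(self, other):
        return SeqSet(self.seqs | other.seqs)
    def before(self, other):
        # Every sequence of self followed by every sequence of other.
        return SeqSet(s + t for s in self.seqs for t in other.seqs)

a = SeqSet([["A"]])
b = SeqSet([["B"]])
# A PAND-like constraint "A fails before B" as the single sequence (A, B).
print(a.before(b).seqs)  # frozenset({('A', 'B')})
```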

  14. Sequence-specific capture of protein-DNA complexes for mass spectrometric protein identification.

    Directory of Open Access Journals (Sweden)

    Cheng-Hsien Wu

    Full Text Available The regulation of gene transcription is fundamental to the existence of complex multicellular organisms such as humans. Although it is widely recognized that much of gene regulation is controlled by gene-specific protein-DNA interactions, there presently exists little in the way of tools to identify proteins that interact with the genome at locations of interest. We have developed a novel strategy to address this problem, which we refer to as GENECAPP, for Global ExoNuclease-based Enrichment of Chromatin-Associated Proteins for Proteomics. In this approach, formaldehyde cross-linking is employed to covalently link DNA to its associated proteins; subsequent fragmentation of the DNA, followed by exonuclease digestion, produces a single-stranded region of the DNA that enables sequence-specific hybridization capture of the protein-DNA complex on a solid support. Mass spectrometric (MS) analysis of the captured proteins is then used for their identification and/or quantification. We show here the development and optimization of GENECAPP for an in vitro model system, comprised of the murine insulin-like growth factor-binding protein 1 (IGFBP1) promoter region and FoxO1, a member of the forkhead rhabdomyosarcoma (FoxO) subfamily of transcription factors, which binds specifically to the IGFBP1 promoter. This novel strategy provides a powerful tool for studies of protein-DNA and protein-protein interactions.

  15. A Fast Event Preprocessor and Sequencer for the Simbol-X Low Energy Detector

    Science.gov (United States)

    Schanz, T.; Tenzer, C.; Maier, D.; Kendziorra, E.; Santangelo, A.

    2009-05-01

    The Simbol-X Low Energy Detector (LED), a 128×128 pixel DEPFET (Depleted Field Effect Transistor) array, will be read out at a very high rate (8000 frames/second) and, therefore, requires a very fast on board electronics. We present an FPGA-based LED camera electronics consisting of an Event Preprocessor (EPP) for on board data preprocessing and filtering of the Simbol-X low-energy detector and a related Sequencer (SEQ) to generate the necessary signals to control the readout.

  16. A Fast Event Preprocessor and Sequencer for the Simbol-X Low Energy Detector

    International Nuclear Information System (INIS)

    Schanz, T.; Tenzer, C.; Maier, D.; Kendziorra, E.; Santangelo, A.

    2009-01-01

    The Simbol-X Low Energy Detector (LED), a 128×128 pixel DEPFET (Depleted Field Effect Transistor) array, will be read out at a very high rate (8000 frames/second) and, therefore, requires a very fast on board electronics. We present an FPGA-based LED camera electronics consisting of an Event Preprocessor (EPP) for on board data preprocessing and filtering of the Simbol-X low-energy detector and a related Sequencer (SEQ) to generate the necessary signals to control the readout.

  17. A Comparison of Molecular Typing Methods Applied to Enterobacter cloacae complex: hsp60 Sequencing, Rep-PCR, and MLST

    Directory of Open Access Journals (Sweden)

    Roberto Viau

    2017-02-01

    Full Text Available Molecular typing using repetitive sequence-based PCR (rep-PCR) and hsp60 sequencing was applied to a collection of diverse Enterobacter cloacae complex isolates. To determine the most practical method for reference laboratories, we analyzed 71 E. cloacae complex isolates from sporadic and outbreak occurrences originating from 4 geographic areas. While rep-PCR was more discriminating, hsp60 sequencing provided a broader and more objective geographical tracking method similar to multilocus sequence typing (MLST). We also suggest that MLST may have higher discriminative power than hsp60 sequencing, although rep-PCR remains the most discriminative method for local outbreak investigations. In addition, rep-PCR can be an effective and inexpensive method for local outbreak investigation.
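    The "discriminative power" compared above is conventionally quantified with the Hunter-Gaston discriminatory index (Simpson's index of diversity). The abstract gives no counts, so the type distributions below are invented:

```python
def discriminatory_index(type_counts):
    """Hunter-Gaston discriminatory index: the probability that two
    isolates drawn without replacement belong to different types --
    a standard way to compare typing methods."""
    n = sum(type_counts)
    return 1 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# Hypothetical typing results for the same 10 isolates by two methods.
print(discriminatory_index([5, 5]))            # ≈ 0.556
print(discriminatory_index([2, 2, 2, 2, 2]))   # ≈ 0.889: more discriminating
```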

  18. Towards high-throughput phenotyping of complex patterned behaviors in rodents: focus on mouse self-grooming and its sequencing.

    Science.gov (United States)

    Kyzar, Evan; Gaikwad, Siddharth; Roth, Andrew; Green, Jeremy; Pham, Mimi; Stewart, Adam; Liang, Yiqing; Kobla, Vikrant; Kalueff, Allan V

    2011-12-01

    Increasingly recognized in biological psychiatry, rodent self-grooming is a complex patterned behavior with evolutionarily conserved cephalo-caudal progression. While grooming is traditionally assessed by the latency, frequency and duration, its sequencing represents another important domain sensitive to various experimental manipulations. Such behavioral complexity requires novel objective approaches to quantify rodent grooming, in addition to time-consuming and highly variable manual observation. The present study combined modern behavior-recognition video-tracking technologies (CleverSys, Inc.) with manual observation to characterize in-depth spontaneous (novelty-induced) and artificial (water-induced) self-grooming in adult male C57BL/6J mice. We specifically focused on individual episodes of grooming (paw licking, head washing, body/leg washing, and tail/genital grooming), their duration and transitions between episodes. Overall, the frequency, duration and transitions detected using the automated approach significantly correlated with manual observations (R = 0.51-0.7, p < …) grooming, also indicating that behavior-recognition tools can be applied to characterize both the amount and sequential organization (patterning) of rodent grooming. Together with further refinement and methodological advancement, this approach will foster high-throughput neurophenotyping of grooming, with multiple applications in drug screening and testing of genetically modified animals. Copyright © 2011 Elsevier B.V. All rights reserved.
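    The episode transitions analysed above can be tallied with a simple transition counter; the bout below is a hypothetical sequence following the cephalo-caudal stages named in the abstract:

```python
from collections import Counter

def transition_counts(episodes):
    """Counts of transitions between consecutive grooming episodes --
    the 'sequencing' domain of grooming described above."""
    return Counter(zip(episodes, episodes[1:]))

# Hypothetical bout of grooming episodes (not data from the study).
bout = ["paw licking", "head washing", "body/leg washing",
        "head washing", "body/leg washing", "tail/genital grooming"]
for (src, dst), n in sorted(transition_counts(bout).items()):
    print(f"{src} -> {dst}: {n}")
```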

  19. Strategic crisis and risk communication during a prolonged natural hazard event: lessons learned from the Canterbury earthquake sequence

    Science.gov (United States)

    Wein, A. M.; Potter, S.; Becker, J.; Doyle, E. E.; Jones, J. L.

    2015-12-01

    While communication products are developed for monitoring and forecasting hazard events, less thought may have been given to crisis and risk communication plans. During larger (and rarer) events responsible science agencies may find themselves facing new and intensified demands for information and unprepared for effectively resourcing communications. In a study of the communication of aftershock information during the 2010-12 Canterbury Earthquake Sequence (New Zealand), issues are identified and implications for communication strategy noted. Communication issues during the responses included reliability and timeliness of communication channels for immediate and short decision time frames; access to scientists by those who needed information; unfamiliar emergency management frameworks; information needs of multiple audiences, audience readiness to use the information; and how best to convey empathy during traumatic events and refer to other information sources about what to do and how to cope. Other science communication challenges included meeting an increased demand for earthquake education, getting attention on aftershock forecasts; responding to rumor management; supporting uptake of information by critical infrastructure and government and for the application of scientific information in complex societal decisions; dealing with repetitive information requests; addressing diverse needs of multiple audiences for scientific information; and coordinating communications within and outside the science domain. For a science agency, a communication strategy would consider training scientists in communication, establishing relationships with university scientists and other disaster communication roles, coordinating messages, prioritizing audiences, deliberating forecasts with community leaders, identifying user needs and familiarizing them with the products ahead of time, and practicing the delivery and use of information via scenario planning and exercises.

  20. A Comprehensive Quality Evaluation System for Complex Herbal Medicine Using PacBio Sequencing, PCR-Denaturing Gradient Gel Electrophoresis, and Several Chemical Approaches

    Directory of Open Access Journals (Sweden)

    Xiasheng Zheng

    2017-09-01

    Full Text Available Herbal medicine is a major component of complementary and alternative medicine, contributing significantly to the health of many people and communities. Quality control of herbal medicine is crucial to ensure that it is safe and sound for use. Here, we investigated a comprehensive quality evaluation system for a classic herbal medicine, Danggui Buxue Formula, by applying genetic-based and analytical chemistry approaches to authenticate and evaluate the quality of its samples. For authenticity, we successfully applied two novel technologies, third-generation sequencing and PCR-DGGE (denaturing gradient gel electrophoresis), to analyze the ingredient composition of the tested samples. For quality evaluation, we used high performance liquid chromatography assays to determine the content of chemical markers to help estimate the dosage relationship between its two raw materials, plant roots of Huangqi and Danggui. A series of surveys were then conducted against several exogenous contaminations, aiming to further assess the efficacy and safety of the samples. In conclusion, the quality evaluation system demonstrated here can potentially address the authenticity, quality, and safety of herbal medicines, thus providing novel insight for enhancing their overall quality control. Highlight: We established a comprehensive quality evaluation system for herbal medicine, by combining two genetic-based approaches, third-generation sequencing and DGGE (denaturing gradient gel electrophoresis), with analytical chemistry approaches to achieve the authentication and quality connotation of the samples.

  1. A Comprehensive Quality Evaluation System for Complex Herbal Medicine Using PacBio Sequencing, PCR-Denaturing Gradient Gel Electrophoresis, and Several Chemical Approaches.

    Science.gov (United States)

    Zheng, Xiasheng; Zhang, Peng; Liao, Baosheng; Li, Jing; Liu, Xingyun; Shi, Yuhua; Cheng, Jinle; Lai, Zhitian; Xu, Jiang; Chen, Shilin

    2017-01-01

    Herbal medicine is a major component of complementary and alternative medicine, contributing significantly to the health of many people and communities. Quality control of herbal medicine is crucial to ensure that it is safe and sound for use. Here, we investigated a comprehensive quality evaluation system for a classic herbal medicine, Danggui Buxue Formula, by applying genetic-based and analytical chemistry approaches to authenticate and evaluate the quality of its samples. For authenticity, we successfully applied two novel technologies, third-generation sequencing and PCR-DGGE (denaturing gradient gel electrophoresis), to analyze the ingredient composition of the tested samples. For quality evaluation, we used high performance liquid chromatography assays to determine the content of chemical markers to help estimate the dosage relationship between its two raw materials, plant roots of Huangqi and Danggui. A series of surveys were then conducted against several exogenous contaminations, aiming to further assess the efficacy and safety of the samples. In conclusion, the quality evaluation system demonstrated here can potentially address the authenticity, quality, and safety of herbal medicines, thus providing novel insight for enhancing their overall quality control. Highlight: We established a comprehensive quality evaluation system for herbal medicine, by combining two genetic-based approaches, third-generation sequencing and DGGE (denaturing gradient gel electrophoresis), with analytical chemistry approaches to achieve the authentication and quality connotation of the samples.

  3. Pitch Sequence Complexity and Long-Term Pitcher Performance

    Directory of Open Access Journals (Sweden)

    Joel R. Bock

    2015-03-01

    Full Text Available Winning one or two games during a Major League Baseball (MLB) season is often the difference between a team advancing to post-season play, or “waiting until next year”. Technology advances have made it feasible to augment historical data with in-game contextual data to provide managers with immediate insights regarding an opponent’s next move, thereby providing a competitive edge. We developed statistical models of pitcher behavior using pitch sequences thrown during three recent MLB seasons (2011–2013). The purpose of these models was to predict the next pitch type, for each pitcher, based on data available at the immediate moment, in each at-bat. Independent models were developed for each player’s most frequent four pitches. The overall predictability of next pitch type is 74.5%. Additional analyses on pitcher predictability within specific game situations are discussed. Finally, using linear regression analysis, we show that an index of pitch sequence predictability may be used to project player performance in terms of Earned Run Average (ERA) and Fielding Independent Pitching (FIP) over a longer term. On a restricted range of the independent variable, reducing complexity in selection of pitches is correlated with higher values of both FIP and ERA for the players represented in the sample. Both models were significant at the α = 0.05 level (ERA: p = 0.022; FIP: p = 0.0114). With further development, such models may reduce risk faced by management in evaluating potential trades, or assist scouts assessing unproven emerging talent. Pitchers themselves might benefit from awareness of their individual statistical tendencies, and adapt their behavior on the mound accordingly. To our knowledge, the predictive model relating pitch-wise complexity and long-term performance appears to be novel.
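
As a sketch of the kind of predictability index regressed against ERA and FIP above, a first-order Markov model over pitch types can score how often the most frequent successor of the previous pitch is the pitch actually thrown. The pitch codes and sequence below are invented, and the paper's models also condition on game context:

```python
from collections import Counter, defaultdict

def predictability(seq):
    # Train successor counts for each pitch type (first-order Markov).
    succ = defaultdict(Counter)
    for prev, nxt in zip(seq, seq[1:]):
        succ[prev][nxt] += 1
    # Score: fraction of transitions matching the most frequent successor.
    # (In-sample for brevity; a real index would score held-out at-bats.)
    correct = sum(1 for prev, nxt in zip(seq, seq[1:])
                  if succ[prev].most_common(1)[0][0] == nxt)
    return correct / (len(seq) - 1)

pitches = ["FB", "SL", "FB", "SL", "FB", "FB", "SL", "FB"]
print(predictability(pitches))  # 6 of 7 transitions guessed correctly
```

A lower score corresponds to a less predictable, more "complex" pitch selection, which the paper links to lower ERA and FIP over a restricted range.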

  4. Genetic diagnosis of Mendelian disorders via RNA sequencing.

    Science.gov (United States)

    Kremer, Laura S; Bader, Daniel M; Mertes, Christian; Kopajtich, Robert; Pichler, Garwin; Iuso, Arcangela; Haack, Tobias B; Graf, Elisabeth; Schwarzmayr, Thomas; Terrile, Caterina; Koňaříková, Eliška; Repp, Birgit; Kastenmüller, Gabi; Adamski, Jerzy; Lichtner, Peter; Leonhardt, Christoph; Funalot, Benoit; Donati, Alice; Tiranti, Valeria; Lombes, Anne; Jardel, Claude; Gläser, Dieter; Taylor, Robert W; Ghezzi, Daniele; Mayr, Johannes A; Rötig, Agnes; Freisinger, Peter; Distelmaier, Felix; Strom, Tim M; Meitinger, Thomas; Gagneur, Julien; Prokisch, Holger

    2017-06-12

    Across a variety of Mendelian disorders, ∼50-75% of patients do not receive a genetic diagnosis by exome sequencing, indicating that disease-causing variants often lie in non-coding regions. Although genome sequencing in principle reveals all genetic variants, their sizeable number and poorer annotation make prioritization challenging. Here, we demonstrate the power of transcriptome sequencing to molecularly diagnose 10% (5 of 48) of mitochondriopathy patients and identify candidate genes for the remainder. We find a median of one aberrantly expressed gene, five aberrant splicing events and six mono-allelically expressed rare variants in patient-derived fibroblasts and establish disease-causing roles for each kind. Private exons often arise from cryptic splice sites, providing an important clue for variant prioritization. One such event is found in the complex I assembly factor TIMMDC1, establishing a novel disease-associated gene. In conclusion, our study expands the diagnostic tools for detecting non-exonic variants and provides examples of intronic loss-of-function variants with pathological relevance.
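
The aberrant-expression arm of this strategy can be caricatured as an outlier test across the patient cohort. The gene names, counts, and z-score cutoff below are purely illustrative, not the study's actual statistics or methods:

```python
import statistics

def aberrant_expression(expr, z_cut=2.5):
    """Flag (gene, sample_index) pairs whose expression deviates strongly
    from the cohort mean -- a much-simplified stand-in for the outlier
    tests applied to patient fibroblast RNA-seq."""
    hits = []
    for gene, values in expr.items():
        mu = statistics.mean(values)
        sd = statistics.stdev(values)
        for i, v in enumerate(values):
            if sd > 0 and abs(v - mu) / sd > z_cut:
                hits.append((gene, i))
    return hits

counts = {
    # Normalized expression across 10 samples; sample 9 is the outlier.
    "TIMMDC1": [98, 101, 100, 99, 102, 100, 97, 103, 100, 10],
    "ACTB":    [100, 101, 99, 100, 98, 102, 100, 99, 101, 100],
}
print(aberrant_expression(counts))  # [('TIMMDC1', 9)]
```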

  5. SEDIMENTATION AND BASIN-FILL HISTORY OF THE PLIOCENE SUCCESSION EXPOSED IN THE NORTHERN SIENA-RADICOFANI BASIN (TUSCANY, ITALY): A SEQUENCE-STRATIGRAPHIC APPROACH

    Directory of Open Access Journals (Sweden)

    IVAN MARTINI

    2017-08-01

    Full Text Available Basin-margin paralic deposits are sensitive indicators of relative sea-level changes and typically show complex stratigraphic architectures that only a facies-based sequence-stratigraphic approach, supported by detailed biostratigraphic data, can help unravel, thus providing constraints for the tectono-stratigraphic reconstructions of ancient basins. This paper presents a detailed facies analysis of Pliocene strata exposed in a marginal key-area of the northern Siena-Radicofani Basin (Tuscany, Italy), which is used as the basis for a new sequence-stratigraphic scheme of the studied area. The study reveals a more complex sedimentary history than that inferred from the recent geological maps produced as part of the Regional Cartographic Project (CARG), which are based on lithostratigraphic principles. Specifically, four sequences (S1 to S4, in upward stratigraphic order) have been recognised, each bounded by erosional unconformities and deposited within the Zanclean-early Gelasian time span. Each sequence typically comprises fluvial to open marine facies, with deposits of different sequences showing striking lithological similarities. The architecture and internal variability shown by the studied depositional sequences are typical of low-accommodation basin-margin settings, showing: (i) a poorly developed to missing record of the falling-stage systems tract; (ii) a lowstand systems tract predominantly made of fluvio-deltaic deposits; (iii) a highstand systems tract with substantial thickness variation between different sequences due to erosional processes associated with the overlying unconformity; (iv) a highly variable transgressive systems tract, ranging from elementary to parasequential organization.

  6. Post-event human decision errors: operator action tree/time reliability correlation

    International Nuclear Information System (INIS)

    Hall, R.E.; Fragola, J.; Wreathall, J.

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations
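
The time reliability correlation at the heart of this framework can be sketched as a lognormal response-time model, where the probability of a missing or wrong decision falls as the available time grows. The median time and sigma below are illustrative, not values from the report:

```python
import math

def p_no_response(t_avail, t_median, sigma=0.8):
    """Time-reliability correlation sketch: probability that the operators
    have NOT yet taken the correct action within t_avail minutes, assuming
    lognormally distributed response times."""
    z = (math.log(t_avail) - math.log(t_median)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))   # standard normal survival fn

print(p_no_response(10, 10))   # exactly 0.5 at the median response time
print(p_no_response(30, 10))   # falls sharply as more time is available
```

This captures why the approach treats available time as the dominating factor: the non-response probability is a monotone function of the ratio of available time to the median response time.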

  8. A Quantitative Accident Sequence Analysis for a VHTR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jintae; Lee, Joeun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-05-15

    In Korea, the basic design features of the VHTR are currently being discussed across various design concepts. Probabilistic risk assessment (PRA) offers a logical and structured method to assess the risks of a large and complex engineered system, such as a nuclear power plant. It will be introduced at an early stage in the design, and will be upgraded at various design and licensing stages as the design matures and the design details are defined. Risk insights to be developed from the PRA are viewed as essential to developing a design that is optimized in meeting safety objectives and in interpreting the applicability of the existing demands to the safety design approach of the VHTR. In this study, initiating events which may occur in VHTRs were selected through the master logic diagram (MLD) method. The initiating events were then grouped into four categories for the accident sequence analysis. Initiating event frequencies and safety system failure rates were calculated using reliability data obtained from the available sources and fault tree analysis. After quantification, uncertainty analysis was conducted. The SR and LR frequencies are calculated to be 7.52E-10/RY and 7.91E-16/RY, respectively, which are considerably lower than the core damage frequency of LWRs.
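
The quantification step described above multiplies an initiating-event frequency by the failure probabilities of the systems that must fail along an event-tree branch, with fault-tree gates supplying the system-level probabilities. A minimal sketch with invented numbers (not actual VHTR reliability data):

```python
from math import prod

def or_gate(failure_probs):
    """Failure probability of an OR gate: at least one input fails
    (assuming independent inputs)."""
    ok = 1.0
    for p in failure_probs:
        ok *= 1.0 - p
    return 1.0 - ok

def sequence_frequency(initiator_per_ry, branch_failures):
    """Accident-sequence point estimate: initiating-event frequency
    (per reactor-year) times the failure probabilities of the safety
    systems that fail along the branch."""
    return initiator_per_ry * prod(branch_failures)

# Hypothetical initiator at 1e-4/RY; one system modeled as an OR of two
# basic events, a second system with a flat 5e-2 failure probability.
f = sequence_frequency(1e-4, [or_gate([1e-3, 2e-3]), 5e-2])
print(f)  # ~1.5e-08 per reactor-year
```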

  9. SNPs in Multi-Species Conserved Sequences (MCS) as useful markers in association studies: a practical approach

    Directory of Open Access Journals (Sweden)

    Pericak-Vance Margaret A

    2007-08-01

    Full Text Available Abstract Background Although genes play a key role in many complex diseases, the specific genes involved in most complex diseases remain largely unidentified. Their discovery will hinge on the identification of key sequence variants that are conclusively associated with disease. While much attention has been focused on variants in protein-coding DNA, variants in noncoding regions may also play many important roles in complex disease by altering gene regulation. Since the vast majority of noncoding genomic sequence is of unknown function, this increases the challenge of identifying "functional" variants that cause disease. However, evolutionary conservation can be used as a guide to indicate regions of noncoding or coding DNA that are likely to have biological function, and thus may be more likely to harbor SNP variants with functional consequences. To help bias marker selection in favor of such variants, we devised a process that prioritizes annotated SNPs for genotyping studies based on their location within Multi-species Conserved Sequences (MCSs) and used this process to select SNPs in a region of linkage to a complex disease. This allowed us to evaluate the utility of the chosen SNPs for further association studies. Previously, a region of chromosome 1q43 was linked to Multiple Sclerosis (MS) in a genome-wide screen. We chose annotated SNPs in the region based on location within MCSs (termed MCS-SNPs). We then obtained genotypes for 478 MCS-SNPs in 989 individuals from MS families. Results Analysis of our MCS-SNP genotypes from the 1q43 region and comparison to HapMap data confirmed that annotated SNPs in MCS regions are frequently polymorphic and show subtle signatures of selective pressure, consistent with previous reports of genome-wide variation in conserved regions. We also present an online tool that allows MCS data to be directly exported to the UCSC genome browser so that MCS-SNPs can be easily identified within genomic regions of
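
The MCS-based prioritization reduces to an interval-overlap filter: keep only SNPs whose coordinates fall inside a conserved block. A sketch with made-up SNP names and coordinates:

```python
import bisect

def snps_in_mcs(snps, mcs_intervals):
    """Keep SNPs whose position falls inside a Multi-species Conserved
    Sequence interval.  mcs_intervals is a sorted list of non-overlapping
    (start, end) pairs; snps is a list of (name, position) pairs."""
    starts = [s for s, _ in mcs_intervals]
    hits = []
    for name, pos in snps:
        # Rightmost interval starting at or before pos, if any.
        i = bisect.bisect_right(starts, pos) - 1
        if i >= 0 and pos <= mcs_intervals[i][1]:
            hits.append(name)
    return hits

mcs = [(100, 200), (500, 650)]
snps = [("rs1", 150), ("rs2", 300), ("rs3", 500), ("rs4", 651)]
print(snps_in_mcs(snps, mcs))  # ['rs1', 'rs3']
```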

  10. Sequence embedding for fast construction of guide trees for multiple sequence alignment

    LENUS (Irish Health Repository)

    Blackshields, Gordon

    2010-05-14

    Abstract Background The most widely used multiple sequence alignment methods require sequences to be clustered as an initial step. Most sequence clustering methods require a full distance matrix to be computed between all pairs of sequences. This requires memory and time proportional to N^2 for N sequences. When N grows larger than 10,000 or so, this becomes increasingly prohibitive and can form a significant barrier to carrying out very large multiple alignments. Results In this paper, we have tested variations on a class of embedding methods that have been designed for clustering large numbers of complex objects where the individual distance calculations are expensive. These methods involve embedding the sequences in a space where the similarities within a set of sequences can be closely approximated without having to compute all pair-wise distances. Conclusions We show how this approach greatly reduces computation time and memory requirements for clustering large numbers of sequences and demonstrate the quality of the clusterings by benchmarking them as guide trees for multiple alignment. Source code is available for download from http://www.clustal.org/mbed.tgz.
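
The embedding idea can be sketched as follows: compute each sequence's distance to a handful of seed sequences and cluster the resulting short vectors instead of the full pairwise matrix. The k-mer distance and the naive seed choice below are simplifications of what the mBed method actually does:

```python
def kmer_distance(a, b, k=3):
    """Crude pairwise distance from shared k-mer content."""
    ka = {a[i:i + k] for i in range(len(a) - k + 1)}
    kb = {b[i:i + k] for i in range(len(b) - k + 1)}
    return 1.0 - len(ka & kb) / min(len(ka), len(kb))

def embed(seqs, n_seeds=2):
    """Embedding sketch: represent every sequence by its distances to a
    few seed sequences, so clustering needs O(N * n_seeds) distance
    computations instead of the full O(N^2) matrix.  (The real method
    chooses seeds more carefully than taking the first few.)"""
    seeds = seqs[:n_seeds]
    return [[kmer_distance(s, t) for t in seeds] for s in seqs]

vectors = embed(["ACGTACGT", "ACGTACGA", "TTTTCCCC"])
print(vectors[0][0], vectors[2][0])  # 0.0 to itself, 1.0 for unrelated
```

The short vectors can then be fed to any standard clustering routine to build the guide tree.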

  11. Lariat sequencing in a unicellular yeast identifies regulated alternative splicing of exons that are evolutionarily conserved with humans.

    Science.gov (United States)

    Awan, Ali R; Manfredo, Amanda; Pleiss, Jeffrey A

    2013-07-30

    Alternative splicing is a potent regulator of gene expression that vastly increases proteomic diversity in multicellular eukaryotes and is associated with organismal complexity. Although alternative splicing is widespread in vertebrates, little is known about the evolutionary origins of this process, in part because of the absence of phylogenetically conserved events that cross major eukaryotic clades. Here we describe a lariat-sequencing approach, which offers high sensitivity for detecting splicing events, and its application to the unicellular fungus, Schizosaccharomyces pombe, an organism that shares many of the hallmarks of alternative splicing in mammalian systems but for which no previous examples of exon-skipping had been demonstrated. Over 200 previously unannotated splicing events were identified, including examples of regulated alternative splicing. Remarkably, an evolutionary analysis of four of the exons identified here as subject to skipping in S. pombe reveals high sequence conservation and perfect length conservation with their homologs in scores of plants, animals, and fungi. Moreover, alternative splicing of two of these exons have been documented in multiple vertebrate organisms, making these the first demonstrations of identical alternative-splicing patterns in species that are separated by over 1 billion y of evolution.

  12. Evaluating Complex Healthcare Systems: A Critique of Four Approaches

    Directory of Open Access Journals (Sweden)

    Heather Boon

    2007-01-01

    Full Text Available The purpose of this paper is to bring clarity to the emerging conceptual and methodological literature that focuses on understanding and evaluating complex or ‘whole’ systems of healthcare. An international working group reviewed literature from interdisciplinary or interprofessional groups describing approaches to the evaluation of complex systems of healthcare. The following four key approaches were identified: a framework from the MRC (UK), whole systems research, whole medical systems research described by NCCAM (USA), and a model from NAFKAM (Norway). Main areas of congruence include acknowledgment of the inherent complexity of many healthcare interventions and the need to find new ways to evaluate these; the need to describe and understand the components of complex interventions in context (as they are actually practiced); the necessity of using mixed methods, including randomized clinical trials (RCTs, both explanatory and pragmatic) and qualitative approaches; the perceived benefits of a multidisciplinary team approach to research; and the understanding that methodological developments in this field can be applied to both complementary and alternative medicine (CAM) as well as conventional therapies. In contrast, the approaches differ in the following ways: the terminology used; the extent to which the approach attempts to be applicable to both CAM and conventional medical interventions; the prioritization of research questions (in order of what should be done first, especially with respect to how the ‘definitive’ RCT fits into the process of assessing complex healthcare systems); and the need for a staged approach. There appears to be a growing international understanding of the need for a new perspective on assessing complex healthcare systems.

  13. Progress in methodology for probabilistic assessment of accidents: timing of accident sequences

    International Nuclear Information System (INIS)

    Lanore, J.M.; Villeroux, C.; Bouscatie, F.; Maigret, N.

    1981-09-01

    There is an important problem with probabilistic studies of accident sequences using current event tree techniques: the method does not take into account the time dependence of real accident scenarios, involving the random behaviour of the systems (lack or delay in intervention, partial failures, repair, operator actions ...) and the correlated evolution of the physical parameters. A powerful method to perform the probabilistic treatment of these complex sequences (dynamic evolution of systems and associated physics) is Monte-Carlo simulation, very rare events being treated with the help of suitable weighting and biasing techniques. As a practical example, the accident sequences related to the loss of the residual heat removal system in a fast breeder reactor have been treated with this method.
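
The weighting and biasing the abstract mentions can be illustrated with importance sampling: to estimate a very small tail probability, sample from a distribution shifted into the rare region and correct each hit by the likelihood ratio. The toy target here is a normal tail, standing in for a rare accident-sequence outcome:

```python
import math
import random

def tail_prob_importance_sampled(threshold=6.0, n=100_000, seed=1):
    """Estimate P(X > threshold) for standard normal X by sampling from
    N(threshold, 1) and re-weighting by the likelihood ratio -- the kind
    of biasing needed when naive simulation would essentially never see
    the rare event (true value here is ~9.9e-10)."""
    rng = random.Random(seed)
    shift = threshold                # centre the proposal on the rare region
    total = 0.0
    for _ in range(n):
        y = rng.gauss(shift, 1.0)    # biased proposal draw
        if y > threshold:
            # phi(y) / phi_shifted(y) = exp(-shift*y + shift^2 / 2)
            total += math.exp(-shift * y + 0.5 * shift * shift)
    return total / n

print(tail_prob_importance_sampled())
```

Naive sampling would need on the order of 10^10 draws to see a single hit; the shifted proposal makes roughly half the draws land beyond the threshold, and the weights restore an unbiased estimate.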

  14. SBH and the integration of complementary approaches in the mapping, sequencing, and understanding of complex genomes

    Energy Technology Data Exchange (ETDEWEB)

    Drmanac, R.; Drmanac, S.; Labat, I.; Vicentic, A.; Gemmell, A.; Stavropoulos, N.; Jarvis, J.

    1992-01-01

    A variant of sequencing by hybridization (SBH) is being developed with the potential to inexpensively determine up to 100 million base pairs per year. The method comprises (1) arraying short clones in 864-well plates; (2) growth of the M13 clones or PCR of the inserts; (3) automated spotting of DNAs by corresponding pin-arrays; (4) hybridization of dotted samples with 200-3000 32P- or 33P-labeled 6- to 8-mer probes; and (5) scoring hybridization signals using storage phosphor plates. Some 200 7- to 8-mers can provide an inventory of the genes if cDNA clones are hybridized, or can define the order of 2-kb genomic clones, creating physical and structural maps with 100-bp resolution; the distribution of G+C, LINEs, SINEs, and gene families would be revealed. cDNAs that represent new genes and genomic clones in regions of interest selected by SBH can be sequenced by a gel method. Uniformly distributed clones from the previous step will be hybridized with 2000-3000 6- to 8-mers. As a result, approximately 50-60% of the genomic regions containing members of large repetitive and gene families and those families represented in GenBank would be completely sequenced. In the less redundant regions, every base pair is expected to be read with 3-4 probes, but the complete sequence cannot be reconstructed. Such partial sequences allow the inference of similarity and the recognition of coding, regulatory, and repetitive sequences, as well as study of the evolutionary processes all the way up to species delineation.
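
The combinatorial core of SBH, rebuilding a sequence from the set of short probes that hybridized, can be sketched as a walk over unique (k-1)-base overlaps. The example spectrum is invented, and the early exit on an ambiguous extension mirrors why SBH alone cannot resolve repeat-rich regions and is paired with gel sequencing:

```python
def reconstruct_from_spectrum(kmers):
    """SBH sketch: rebuild a sequence from its k-mer spectrum (a set) by
    chasing unique (k-1)-base overlaps; stops at the first ambiguity."""
    k = len(next(iter(kmers)))
    suffixes = {m[1:] for m in kmers}
    # A start k-mer is one whose prefix no other k-mer leads into.
    start = next(m for m in kmers if m[:-1] not in suffixes)
    seq, used = start, {start}
    while True:
        nxt = [m for m in kmers - used if m[:-1] == seq[-(k - 1):]]
        if len(nxt) != 1:
            return seq               # done, or ambiguous branch point
        seq += nxt[0][-1]
        used.add(nxt[0])

print(reconstruct_from_spectrum({"GAT", "ATT", "TTA", "TAC", "ACA"}))
# GATTACA
```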

  17. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data are sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability problem of a class of reaction-diffusion complex dynamical systems with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally excludes Zeno behavior. Finally, a numerical example is given to verify the obtained results.
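
The triggering idea can be illustrated on a scalar ordinary differential equation rather than the paper's reaction-diffusion setting: the controller re-samples the state only when the gap between the true state and the last sample crosses a threshold, so it updates far less often than every step. All gains, thresholds, and dynamics below are invented for the sketch:

```python
def event_triggered_run(x0=1.0, k=2.0, dt=0.01, steps=500, threshold=0.05):
    """Event-triggered control sketch on a scalar unstable plant x' = x + u:
    the feedback u = -k * x_sampled is refreshed only at events, i.e. when
    |x - x_sampled| exceeds the threshold."""
    x, x_sampled, updates = x0, x0, 0
    for _ in range(steps):
        if abs(x - x_sampled) > threshold:   # event: resample the state
            x_sampled = x
            updates += 1
        x += dt * (x - k * x_sampled)        # Euler step of x' = x + u
    return x, updates

x_final, updates = event_triggered_run()
print(round(x_final, 3), updates)  # state driven near zero, few updates
```

The state is steered into a small neighborhood of the origin while the controller updates only a few dozen times over 500 simulation steps, which is the communication saving the event-triggered scheme targets.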

  18. Mapping and characterizing N6-methyladenine in eukaryotic genomes using single molecule real-time sequencing.

    Science.gov (United States)

    Zhu, Shijia; Beaulaurier, John; Deikus, Gintaras; Wu, Tao; Strahl, Maya; Hao, Ziyang; Luo, Guanzheng; Gregory, James A; Chess, Andrew; He, Chuan; Xiao, Andrew; Sebra, Robert; Schadt, Eric E; Fang, Gang

    2018-05-15

    N6-methyladenine (m6dA) has been discovered as a novel form of DNA methylation prevalent in eukaryotes; however, methods for high-resolution mapping of m6dA events are still lacking. Single-molecule real-time (SMRT) sequencing has enabled the detection of m6dA events at single-nucleotide resolution in prokaryotic genomes, but its application to detecting m6dA in eukaryotic genomes has not been rigorously examined. Herein, we identified unique characteristics of eukaryotic m6dA methylomes that fundamentally differ from those of prokaryotes. Based on these differences, we describe the first approach for mapping m6dA events using SMRT sequencing specifically designed for the study of eukaryotic genomes, and provide appropriate strategies for designing experiments and carrying out sequencing in future studies. We apply the novel approach to study two eukaryotic genomes. For green algae, we construct the first complete genome-wide map of m6dA at single-nucleotide and single-molecule resolution. For human lymphoblastoid cells (hLCLs), joint analyses of SMRT sequencing and independent sequencing data suggest that putative m6dA events are enriched in the promoters of young, full-length LINE-1 elements (L1s). These analyses demonstrate a general method for rigorous mapping and characterization of m6dA events in eukaryotic genomes. Published by Cold Spring Harbor Laboratory Press.
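
SMRT-based m6dA detection ultimately rests on polymerase kinetics: a methylated adenine slows the polymerase, raising the inter-pulse duration (IPD) at that position relative to an unmethylated control. A toy caller with invented thresholds and data; real callers model the kinetics far more carefully:

```python
def call_m6dA(positions, min_cov=25, min_ipd_ratio=2.0):
    """Toy m6dA caller: keep adenine positions whose mean IPD ratio
    versus an unmethylated control exceeds a cutoff at adequate
    single-molecule coverage.  Each entry is (pos, base, ipd_ratio, cov)."""
    return [pos for pos, base, ipd_ratio, cov in positions
            if base == "A" and cov >= min_cov and ipd_ratio >= min_ipd_ratio]

observations = [(101, "A", 3.1, 40),   # high IPD ratio, good coverage: call
                (102, "C", 2.8, 40),   # not an adenine
                (103, "A", 1.1, 40),   # kinetics look unmethylated
                (104, "A", 2.4, 12)]   # too little coverage to trust
print(call_m6dA(observations))  # [101]
```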

  19. A Unified Approach to the Recognition of Complex Actions from Sequences of Zone-Crossings

    NARCIS (Netherlands)

    Sanromà, G.; Patino, L.; Burghouts, G.J.; Schutte, K.; Ferryman, J.

    2014-01-01

    We present a method for the recognition of complex actions. Our method combines automatic learning of simple actions and manual definition of complex actions in a single grammar. Contrary to the general trend in complex action recognition, which consists of dividing recognition into two stages, our

  20. Managing complex processing of medical image sequences by program supervision techniques

    Science.gov (United States)

    Crubezy, Monica; Aubry, Florent; Moisan, Sabine; Chameroy, Virginie; Thonnat, Monique; Di Paola, Robert

    1997-05-01

    Our objective is to offer clinicians wider access to evolving medical image processing (MIP) techniques, which are crucial to improving the assessment and quantification of physiological processes but difficult for non-specialists in MIP to handle. Based on artificial intelligence techniques, our approach consists of developing a knowledge-based program supervision system that automates the management of MIP libraries. It comprises a library of programs, a knowledge base capturing the expertise about programs and data, and a supervision engine. It selects, organizes and executes the appropriate MIP programs given a goal to achieve and a data set, with dynamic feedback based on the results obtained. It also advises users in the development of new procedures chaining MIP programs. We have experimented with the approach on an application of factor analysis of medical image sequences as a means of predicting the response of osteosarcoma to chemotherapy, with both MRI and NM dynamic image sequences. As a result, our program supervision system frees clinical end-users from performing tasks outside their competence, permitting them to concentrate on clinical issues. Our approach therefore enables better exploitation of the possibilities offered by MIP and higher-quality results, both in terms of robustness and reliability.
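
The engine's select-organize-execute behavior can be sketched as forward chaining over program descriptions: apply any program whose input data already exist until the goal datum is produced. The program names and data types below are invented, not the system's actual knowledge base:

```python
def plan(goal, available, operators):
    """Program-supervision sketch: forward-chain over (name, needs,
    produces) operator descriptions until the goal datum is in the
    data state, returning the chain of program names (or None)."""
    chain, state = [], set(available)
    while goal not in state:
        for name, needs, produces in operators:
            if needs <= state and produces not in state:
                chain.append(name)
                state.add(produces)
                break
        else:
            return None            # no applicable program: goal unreachable
    return chain

ops = [("register",        {"image_sequence"},   "aligned_sequence"),
       ("segment_roi",     {"aligned_sequence"}, "roi_curves"),
       ("factor_analysis", {"roi_curves"},       "kinetic_factors")]
print(plan("kinetic_factors", {"image_sequence"}, ops))
# ['register', 'segment_roi', 'factor_analysis']
```

A real engine would also rank alternative programs and re-plan on the feedback from intermediate results, but the chaining skeleton is the same.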

  1. Complex active regions as the main source of extreme and large solar proton events

    Science.gov (United States)

    Ishkov, V. N.

    2013-12-01

    A study of solar proton sources indicated that solar flare events responsible for ≥2000 pfu proton fluxes mostly occur in complex active regions (CARs), i.e., in transition structures between active regions and activity complexes. Different classes of similar structures and their relation to solar proton events (SPEs) and evolution, depending on the origination conditions, are considered. Arguments in favor of the fact that sunspot groups with extreme dimensions are CARs are presented. An analysis of the flare activity in a CAR resulted in the detection of "physical" boundaries, which separate magnetic structures of the same polarity and are responsible for the independent development of each structure.

  2. Viral metagenomics: Analysis of begomoviruses by Illumina high-throughput sequencing

    KAUST Repository

    Idris, Ali

    2014-03-12

    Traditional DNA sequencing methods are inefficient, lack the ability to discern the least abundant viral sequences, and are ineffective for determining the extent of variability in viral populations. Here, populations of single-stranded DNA plant begomoviral genomes and their associated beta- and alpha-satellite molecules (virus-satellite complexes) (genus, Begomovirus; family, Geminiviridae) were enriched from total nucleic acids isolated from symptomatic, field-infected plants, using rolling circle amplification (RCA). Enriched virus-satellite complexes were subjected to Illumina next-generation sequencing (NGS). The CASAVA and SeqMan NGen programs were implemented, respectively, for quality control and for de novo and reference-guided contig assembly of viral-satellite sequences. The authenticity of the begomoviral sequences, and the reproducibility of the Illumina-NGS approach for begomoviral deep sequencing projects, were validated by comparing NGS results with those obtained using traditional molecular cloning and Sanger sequencing of viral components and satellite DNAs, also enriched by RCA or amplified by polymerase chain reaction. As NGS approaches, together with advances in software development, make deep sequence coverage possible at lower cost, the approach described herein will streamline the exploration of begomovirus diversity and population structure from naturally infected plants, irrespective of viral abundance. This is the first report of the implementation of Illumina-NGS to explore the diversity and identify begomoviral-satellite SNPs directly from plants naturally infected with begomoviruses under field conditions.

  3. Viral Metagenomics: Analysis of Begomoviruses by Illumina High-Throughput Sequencing

    Directory of Open Access Journals (Sweden)

    Ali Idris

    2014-03-01

    Full Text Available Traditional DNA sequencing methods are inefficient, lack the ability to discern the least abundant viral sequences, and are ineffective for determining the extent of variability in viral populations. Here, populations of single-stranded DNA plant begomoviral genomes and their associated beta- and alpha-satellite molecules (virus-satellite complexes) (genus, Begomovirus; family, Geminiviridae) were enriched from total nucleic acids isolated from symptomatic, field-infected plants, using rolling circle amplification (RCA). Enriched virus-satellite complexes were subjected to Illumina-Next Generation Sequencing (NGS). CASAVA and SeqMan NGen programs were implemented, respectively, for quality control and for de novo and reference-guided contig assembly of viral-satellite sequences. The authenticity of the begomoviral sequences, and the reproducibility of the Illumina-NGS approach for begomoviral deep sequencing projects, were validated by comparing NGS results with those obtained using traditional molecular cloning and Sanger sequencing of viral components and satellite DNAs, also enriched by RCA or amplified by polymerase chain reaction. As NGS approaches, together with advances in software development, make deep sequence coverage possible at lower cost, the approach described herein will streamline the exploration of begomovirus diversity and population structure from naturally infected plants, irrespective of viral abundance. This is the first report of the implementation of Illumina-NGS to explore the diversity and identify begomoviral-satellite SNPs directly from plants naturally infected with begomoviruses under field conditions.

  4. The ATLAS Event Service: A new approach to event processing

    Science.gov (United States)

    Calafiura, P.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.

    2015-12-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabilities, its architecture and the highly scalable tools and technologies employed in its implementation, and its applications in ATLAS processing on HPCs, commercial cloud resources, volunteer computing, and grid resources. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.

  5. Using nanopore sequencing to get complete genomes from complex samples

    DEFF Research Database (Denmark)

    Kirkegaard, Rasmus Hansen; Karst, Søren Michael; Nielsen, Per Halkjær

    The advantages of “next generation sequencing” have come at the cost of genome finishing. The dominant sequencing technology provides short reads of 150-300 bp, which has made genome assembly very difficult, as the reads do not span important repeat regions. Genomes have thus been added...... to the databases as fragmented assemblies and not as finished contigs that resemble the chromosomes in which the DNA is organised within the cells. This is especially troublesome for genomes derived from complex metagenome sequencing. Databases with incomplete genomes can lead to false conclusions about...... the absence of genes and functional predictions of the organisms. Furthermore, it is common that repetitive elements and marker genes such as the 16S rRNA gene are missing completely from these genome bins. Using nanopore long reads, we demonstrate that it is possible to span these regions and make complete...

  6. Timing and sequencing of events marking the transition to adulthood in two informal settlements in Nairobi, Kenya.

    Science.gov (United States)

    Beguy, Donatien; Kabiru, Caroline W; Zulu, Eliya M; Ezeh, Alex C

    2011-06-01

    Young people living in poor urban informal settlements face unique challenges as they transition to adulthood. This exploratory paper uses retrospective information from the baseline survey of a 3-year prospective study to examine the timing and sequencing of four key markers (first sex, marriage, birth, and independent housing) of the transition to adulthood among 3,944 adolescents in two informal settlements in Nairobi city, Kenya. Event history analysis techniques are employed to examine the timing of the events. Results indicate that there is no significant gender difference with regard to sexual debut among adolescents. For many boys and girls, the first sexual experience occurs outside of marriage or other union. For males, the sequencing begins with first sex, followed by independent housing. Conversely, for females, the sequencing begins with first sex and then parenthood. Apart from sexual debut, the patterns of entry into union and parenthood do not differ much from what was observed for Nairobi as a whole. The space constraints that typify the two slums may have influenced the pattern of leaving home observed. We discuss these and other findings in light of their implications for young people's health and well-being in resource-poor settings in urban areas.

  7. Design of Long Period Pseudo-Random Sequences from the Addition of m-Sequences over 𝔽_p

    Directory of Open Access Journals (Sweden)

    Ren Jian

    2004-01-01

    Full Text Available Pseudo-random sequences with good correlation properties and large linear span are widely used in code division multiple access (CDMA) communication systems and in cryptology for reliable and secure information transmission. In this paper, sequences with long period, large complexity, balanced statistics, and low cross-correlation are constructed from the addition of m-sequences with pairwise-prime linear spans (AMPLS). Using m-sequences as building blocks, the proposed method proves to be an efficient and flexible approach for constructing long period pseudo-random sequences with desirable properties from short period sequences. Applying the proposed method to 𝔽_2, a signal set ((2^n − 1)(2^m − 1), (2^n + 1)(2^m + 1), (2^((n+1)/2) + 1)(2^((m+1)/2) + 1)) is constructed.
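    The AMPLS idea can be illustrated with a small sketch (not from the paper): two binary m-sequences generated by LFSRs with primitive feedback polynomials of coprime degrees 3 and 5 are added term-wise over GF(2), and the sum has period lcm(7, 31) = 217, far longer than either building block. The polynomial choices and seed are illustrative assumptions.

```python
def lfsr_mseq(fb_exponents, degree):
    """Generate one full period of a binary m-sequence.

    The recurrence is s[t+degree] = XOR of s[t+e] for e in fb_exponents,
    i.e. fb_exponents lists the lower-order exponents of a primitive
    feedback polynomial: [1, 0] encodes x^3 + x + 1.
    """
    period = 2 ** degree - 1
    s = [1] * degree                       # any nonzero seed works
    while len(s) < period:
        nxt = 0
        for e in fb_exponents:
            nxt ^= s[len(s) - degree + e]
        s.append(nxt)
    return s

# m-sequences with coprime linear spans 3 and 5 (illustrative choice)
a = lfsr_mseq([1, 0], 3)   # x^3 + x + 1, period 7
b = lfsr_mseq([2, 0], 5)   # x^5 + x^2 + 1, period 31

# Term-wise addition over GF(2) (XOR); the sum sequence has period
# lcm(7, 31) = 217, much longer than either component m-sequence.
s = [a[i % 7] ^ b[i % 31] for i in range(7 * 31)]
```

    Each component retains the m-sequence balance property (2^(n−1) ones per period), while the sum inherits the long period from the coprimality of the spans.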

  8. An Iterative Load Disaggregation Approach Based on Appliance Consumption Pattern

    Directory of Open Access Journals (Sweden)

    Huijuan Wang

    2018-04-01

    Full Text Available Non-intrusive load monitoring (NILM), monitoring single-appliance consumption levels by decomposing the aggregated energy consumption, is a novel and economical technology that benefits energy utilities and the development of energy demand management strategies. The hardware costs of high-frequency sampling and the computational complexity of the algorithms have hampered large-scale application of NILM; however, low-frequency sampling data show poor performance in event detection when multiple appliances are turned on simultaneously. In this paper, we contribute an iterative load disaggregation approach based on appliance consumption patterns (ILDACP). Our approach combines the Fuzzy C-means clustering algorithm, which provides an initial appliance operating status, and sub-sequence searching with Dynamic Time Warping, which retrieves single-appliance energy consumption based on the typical power consumption pattern. Results show that the proposed approach accurately disaggregates power consumption and is suitable for situations where different appliances are operated simultaneously. Also, the approach has lower computational complexity than the Hidden Markov Model method, and it is easy to implement in the household without installing special equipment.
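    The sub-sequence searching Dynamic Time Warping step can be sketched as follows (a minimal illustration, not the paper's implementation): an appliance's typical power pattern is aligned against any window of the aggregate load by leaving the start and end positions in the long series unconstrained. The pattern and aggregate values below are invented.

```python
import math

def subsequence_dtw(query, series):
    """Subsequence DTW: align `query` (appliance pattern) anywhere inside
    `series` (aggregate load). Returns (best_cost, end_index)."""
    n, m = len(query), len(series)
    INF = math.inf
    # D[i][j] = cost of aligning query[:i] with a subsequence ending at j
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    for j in range(m + 1):
        D[0][j] = 0.0             # free start: match may begin anywhere
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(query[i - 1] - series[j - 1])
            D[i][j] = d + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    best_j = min(range(1, m + 1), key=lambda j: D[n][j])  # free end
    return D[n][best_j], best_j - 1

# Invented kettle-like pattern (watts) hidden inside an aggregate signal
pattern = [0, 2000, 2000, 2000, 0]
aggregate = [100, 100, 100, 2100, 2050, 2100, 100, 100]
cost, end = subsequence_dtw(pattern, aggregate)
```

    Here the best alignment ends at the sample where the high-power burst returns to baseline, which is how the typical pattern localizes one appliance's contribution within the aggregate.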

  9. Versatile single-chip event sequencer for atomic physics experiments

    Science.gov (United States)

    Eyler, Edward

    2010-03-01

    A very inexpensive dsPIC microcontroller with internal 32-bit counters is used to produce a flexible timing signal generator with up to 16 TTL-compatible digital outputs, with a time resolution and accuracy of 50 ns. This time resolution is easily sufficient for event sequencing in typical experiments involving cold atoms or laser spectroscopy. This single-chip device is capable of triggered operation and can also function as a sweeping delay generator. With one additional chip it can also concurrently produce accurately timed analog ramps, and another one-chip addition allows real-time control from an external computer. Compared to an FPGA-based digital pattern generator, this design is slower but simpler and more flexible, and it can be reprogrammed using ordinary `C' code without special knowledge. I will also describe the use of the same microcontroller with additional hardware to implement a digital lock-in amplifier and PID controller for laser locking, including a simple graphics-based control unit. This work is supported in part by the NSF.

  10. Campbell's monkeys concatenate vocalizations into context-specific call sequences

    Science.gov (United States)

    Ouattara, Karim; Lemasson, Alban; Zuberbühler, Klaus

    2009-01-01

    Primate vocal behavior is often considered irrelevant in modeling human language evolution, mainly because of the caller's limited vocal control and apparent lack of intentional signaling. Here, we present the results of a long-term study on Campbell's monkeys, which has revealed an unrivaled degree of vocal complexity. Adult males produced six different loud call types, which they combined into various sequences in highly context-specific ways. We found stereotyped sequences that were strongly associated with cohesion and travel, falling trees, neighboring groups, nonpredatory animals, unspecific predatory threat, and specific predator classes. Within the responses to predators, we found that crowned eagles triggered four and leopards three different sequences, depending on how the caller learned about their presence. Callers followed a number of principles when concatenating sequences, such as nonrandom transition probabilities of call types, addition of specific calls into an existing sequence to form a different one, or recombination of two sequences to form a third one. We conclude that these primates have overcome some of the constraints of limited vocal control by combinatorial organization. As the different sequences were so tightly linked to specific external events, the Campbell's monkey call system may be the most complex example of ‘proto-syntax’ in animal communication known to date. PMID:20007377

  11. Development of a Single Locus Sequence Typing (SLST) Scheme for Typing Bacterial Species Directly from Complex Communities.

    Science.gov (United States)

    Scholz, Christian F P; Jensen, Anders

    2017-01-01

    The protocol describes a computational method to develop a Single Locus Sequence Typing (SLST) scheme for typing bacterial species. The resulting scheme can be used to type bacterial isolates as well as bacterial species directly from complex communities using next-generation sequencing technologies.
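    A toy illustration of the idea behind single-locus typing (allele sequences and type names are invented, and real SLST schemes select a maximally discriminatory locus computationally and tolerate sequencing errors): each read from a community sample is assigned a type if it contains a known allele of the typing locus.

```python
# Hypothetical allele database for one typing locus (sequences invented)
ALLELES = {
    "ACGTACGGTTAG": "ST-1",
    "ACGTACCGTTAG": "ST-2",
    "ACGAACGGTTAG": "ST-3",
}

def type_reads(reads):
    """Count sequence types among reads from a complex community.

    A read is typed when it fully contains a known allele; untyped
    reads are ignored in this sketch.
    """
    counts = {}
    for read in reads:
        for allele, st in ALLELES.items():
            if allele in read:
                counts[st] = counts.get(st, 0) + 1
                break
    return counts

reads = [
    "TTACGTACGGTTAGCA",   # contains the ST-1 allele
    "GGACGTACCGTTAGAT",   # contains the ST-2 allele
    "TTACGTACGGTTAGGG",   # ST-1 again
    "CCCCCCCCCCCCCCCC",   # no known allele
]
counts = type_reads(reads)
```

    Applied to next-generation sequencing reads of the single locus, such counts give the relative abundance of types directly from the community, without isolating strains.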

  12. Music and language perception: expectations, structural integration, and cognitive sequencing.

    Science.gov (United States)

    Tillmann, Barbara

    2012-10-01

    Music can be described as sequences of events that are structured in pitch and time. Studying music processing provides insight into how complex event sequences are learned, perceived, and represented by the brain. Given the temporal nature of sound, expectations, structural integration, and cognitive sequencing are central in music perception (i.e., which sounds are most likely to come next and at what moment should they occur?). This paper focuses on similarities in music and language cognition research, showing that music cognition research provides insight into the understanding of not only music processing but also language processing and the processing of other structured stimuli. The hypothesis of shared resources between music and language processing and of domain-general dynamic attention has motivated the development of research to test music as a means to stimulate sensory, cognitive, and motor processes. Copyright © 2012 Cognitive Science Society, Inc.

  13. Reactive Sequencing for Autonomous Navigation Evolving from Phoenix Entry, Descent, and Landing

    Science.gov (United States)

    Grasso, Christopher A.; Riedel, Joseph E.; Vaughan, Andrew T.

    2010-01-01

    Virtual Machine Language (VML) is an award-winning advanced procedural sequencing language in use on NASA deep-space missions since 1997, and was used for the successful entry, descent, and landing (EDL) of the Phoenix spacecraft onto the surface of Mars. Phoenix EDL utilized a state-oriented operations architecture which executed within the constraints of the existing VML 2.0 flight capability, compatible with the linear "land or die" nature of the mission. The intricacies of Phoenix EDL included the planned discarding of portions of the vehicle, the complex communications management for relay through on-orbit assets, the presence of temporally indeterminate physical events, and the need to rapidly catch up four days of sequencing should a reboot of the spacecraft flight computer occur shortly before atmospheric entry. These formidable operational challenges led to new techniques for packaging and coordinating reusable sequences called blocks using one-way synchronization via VML sequencing global variable events. The coordinated blocks acted as an ensemble to land the spacecraft, while individually managing various elements in as simple a fashion as possible. This paper outlines prototype VML 2.1 flight capabilities that have evolved from the one-way synchronization techniques in order to implement even more ambitious autonomous mission capabilities. Target missions for these new capabilities include autonomous touch-and-go sampling of cometary and asteroidal bodies, lunar landing of robotic missions, and ultimately landing of crewed lunar vehicles. Close proximity guidance, navigation, and control operations, on-orbit rendezvous, and descent and landing events featured in these missions require elaborate abort capability, manifesting highly non-linear scenarios that are so complex as to overtax traditional sequencing, or even the sort of one-way coordinated sequencing used during EDL. Foreseeing advanced command and control needs for small body and lunar landing...

  14. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is design...

  15. Complete genome sequence of the complex carbohydrate-degrading marine bacterium, Saccharophagus degradans strain 2-40 T.

    Directory of Open Access Journals (Sweden)

    Ronald M Weiner

    2008-05-01

    Full Text Available The marine bacterium Saccharophagus degradans strain 2-40 (Sde 2-40) is emerging as a vanguard of a recently discovered group of marine and estuarine bacteria that recycles complex polysaccharides. We report its complete genome sequence, analysis of which identifies an unusually large number of enzymes that degrade >10 complex polysaccharides. Not only is this an extraordinary range of catabolic capability, but many of the enzymes also exhibit unusual architecture, including novel combinations of catalytic and substrate-binding modules. We hypothesize that many of these features are adaptations that facilitate depolymerization of complex polysaccharides in the marine environment. This is the first sequenced genome of a marine bacterium that can degrade plant cell walls, an important component of the carbon cycle that is not well characterized in the marine environment.

  16. An approach for identification of unknown viruses using sequencing-by-hybridization.

    Science.gov (United States)

    Katoski, Sarah E; Meyer, Hermann; Ibrahim, Sofi

    2015-09-01

    Accurate identification of biological threat agents, especially RNA viruses, in clinical or environmental samples can be challenging because the concentration of viral genomic material in a given sample is usually low, viral genomic RNA is liable to degradation, and RNA viruses are extremely diverse. A two-tiered approach was used for initial identification and then full genomic characterization of 199 RNA viruses belonging to the virus families Arenaviridae, Bunyaviridae, Filoviridae, Flaviviridae, and Togaviridae. A sequencing-by-hybridization (SBH) microarray was used to tentatively identify a viral pathogen; the identity was then confirmed by guided next-generation sequencing (NGS). After optimization and evaluation of the SBH and NGS methodologies with various virus species and strains, the approach was used to test the ability to identify viruses in blinded samples. The SBH correctly identified two Ebola viruses in the blinded samples within 24 hr, and by using guided amplicon sequencing with 454 GS FLX, the identities of the viruses in both samples were confirmed. SBH provides relatively low-cost screening of biological samples against a panel of viral pathogens that can be custom-designed on a microarray. Once the identity of a virus is deduced from the highest hybridization signal on the SBH microarray, guided (amplicon) NGS sequencing can be used not only to confirm the identity of the virus but also to provide further information about the strain or isolate, including a potential genetic manipulation. This approach can be useful in situations where natural or deliberate biological threat incidents might occur and a rapid response is required. © 2015 Wiley Periodicals, Inc.

  17. The use of high-throughput DNA sequencing in the investigation of antigenic variation: application to Neisseria species.

    Directory of Open Access Journals (Sweden)

    John K Davies

    Full Text Available Antigenic variation occurs in a broad range of species. This process resembles gene conversion in that variant DNA is unidirectionally transferred from partial gene copies (or silent loci) into an expression locus. Previous studies of antigenic variation have involved the amplification and sequencing of individual genes from hundreds of colonies. Using the pilE gene from Neisseria gonorrhoeae, we have demonstrated that it is possible to use PCR amplification, followed by high-throughput DNA sequencing and a novel assembly process, to detect individual antigenic variation events. The ability to detect these events was much greater than has previously been possible. In N. gonorrhoeae most silent loci contain multiple partial gene copies. Here we show that there is a bias towards using the copy at the 3' end of the silent loci (copy 1) as the donor sequence. The pilE gene of N. gonorrhoeae and some strains of Neisseria meningitidis encode class I pilin, but strains of N. meningitidis from clonal complexes 8 and 11 encode a class II pilin. We have confirmed that the class II pili of meningococcal strain FAM18 (clonal complex 11) are non-variable, and this is also true for the class II pili of strain NMB from clonal complex 8. In addition, when a gene encoding class I pilin was moved into the meningococcal strain NMB background there was no evidence of antigenic variation. Finally, we investigated several members of the opa gene family of N. gonorrhoeae, where it has been suggested that limited variation occurs. Variation was detected in the opaK gene that is located close to pilE, but not at the opaJ gene located elsewhere on the genome. The approach described here promises to dramatically improve studies of the extent and nature of antigenic variation systems in a variety of species.

  18. Sequence comparison alignment-free approach based on suffix tree and L-words frequency.

    Science.gov (United States)

    Soares, Inês; Goios, Ana; Amorim, António

    2012-01-01

    The vast majority of methods available for sequence comparison rely on a first sequence alignment step, which requires a number of assumptions on evolutionary history and is sometimes very difficult or impossible to perform due to the abundance of gaps (insertions/deletions). In such cases, an alternative alignment-free method would prove valuable. Our method starts by computing a generalized suffix tree of all sequences, which is completed in linear time. Using this tree, the frequency of all possible words of a preset length L (L-words) in each sequence is rapidly calculated. Based on the L-word frequency profile of each sequence, a pairwise standard Euclidean distance is then computed, producing a symmetric genetic distance matrix, which can be used to generate a neighbor-joining dendrogram or a multidimensional scaling graph. We present an improvement to word-counting alignment-free approaches for sequence comparison, by determining a single optimal word length and combining suffix tree structures with the word-counting tasks. Our approach is, thus, a fast and simple application that proved to be efficient and powerful when applied to mitochondrial genomes. The algorithm was implemented in the Python language and is freely available on the web.
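    The L-word counting and distance steps can be sketched as follows (a minimal illustration using a plain dictionary rather than the paper's linear-time generalized suffix tree; function names are invented):

```python
from itertools import product
import math

def lword_profile(seq, L):
    """Relative frequency of every overlapping word of length L in `seq`."""
    counts = {}
    for i in range(len(seq) - L + 1):
        w = seq[i:i + L]
        counts[w] = counts.get(w, 0) + 1
    total = len(seq) - L + 1
    return {w: c / total for w, c in counts.items()}

def lword_distance(s1, s2, L=2, alphabet="ACGT"):
    """Standard Euclidean distance between two L-word frequency profiles,
    summed over all |alphabet|^L possible words."""
    p1, p2 = lword_profile(s1, L), lword_profile(s2, L)
    return math.sqrt(sum(
        (p1.get("".join(w), 0.0) - p2.get("".join(w), 0.0)) ** 2
        for w in product(alphabet, repeat=L)))

# identical sequences are at distance zero; disjoint word sets maximize it
d_same = lword_distance("ACGTACGT", "ACGTACGT")
d_diff = lword_distance("AAAA", "CCCC")
```

    Filling a symmetric matrix with these pairwise distances yields the input for neighbor-joining or multidimensional scaling, as described in the abstract.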

  19. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  20. A robust approach to extract biomedical events from literature.

    Science.gov (United States)

    Bui, Quoc-Chinh; Sloot, Peter M A

    2012-10-15

    The abundance of biomedical literature has attracted significant interest in novel methods to automatically extract biomedical relations from the literature. Until recently, most research was focused on extracting binary relations such as protein-protein interactions and drug-disease relations. However, these binary relations cannot fully represent the original biomedical data. Therefore, there is a need for methods that can extract fine-grained and complex relations known as biomedical events. In this article we propose a novel method to extract biomedical events from text. Our method consists of two phases. In the first phase, training data are mapped into structured representations. Based on that, templates are used to extract rules automatically. In the second phase, extraction methods are developed to process the obtained rules. When evaluated against the Genia event extraction abstract and full-text test datasets (Task 1), we obtain results with F-scores of 52.34 and 53.34, respectively, which are comparable to the state-of-the-art systems. Furthermore, our system achieves superior performance in terms of computational efficiency. Our source code is available for academic use at http://dl.dropbox.com/u/10256952/BioEvent.zip.
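    The rule-and-template idea can be caricatured in a few lines (trigger words, the pattern, and the sentence below are invented; the actual system maps training data to structured representations and derives its rules automatically):

```python
import re

# Toy trigger dictionary: surface word -> biomedical event type
TRIGGERS = {"phosphorylation": "Phosphorylation", "expression": "Gene_expression"}

def extract_events(sentence):
    """Apply a single hand-written template, '<trigger> of <theme>',
    for each known trigger word and emit structured events."""
    events = []
    for trigger, etype in TRIGGERS.items():
        m = re.search(trigger + r" of (\w+)", sentence, re.IGNORECASE)
        if m:
            events.append({"type": etype, "trigger": trigger, "theme": m.group(1)})
    return events

evs = extract_events("We observed phosphorylation of TRAF2 in these cells.")
```

    Real systems of this kind use many such templates over syntactic structures rather than raw strings, which is what allows them to capture nested, fine-grained events.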

  1. CNE (Embalse nuclear power plant): probabilistic safety study. Loss of service water. Probabilistic evaluation and analysis through events sequence

    International Nuclear Information System (INIS)

    Couto, A.J.; Perez, S.S.

    1987-01-01

    This work is part of a study of the service water systems of the Embalse nuclear power plant from a safety point of view. The faults of the high- and low-pressure service water systems that can lead to situations threatening plant safety were analyzed in a previous report. The event 'total loss of low pressure service water' causes the largest number of such conditions. This event is an operational incident that can lead to an accident situation due to faults in the required process systems or to omission of a procedure. The annual frequency of the event 'total loss of low pressure service water' is calculated. The main contribution comes from pump failure. The evaluation of the accident sequences shows that the most direct way to the liberation of fission products is the loss of the steam generators as a heat sink. The contributions from small and large LOCA and from loss of electric supply are analyzed. The sequence that leads to tritium release through boiling of the moderator is also evaluated. (Author)

  2. Understanding complex urban systems multidisciplinary approaches to modeling

    CERN Document Server

    Gurr, Jens; Schmidt, J

    2014-01-01

    Understanding Complex Urban Systems takes as its point of departure the insight that the challenges of global urbanization and the complexity of urban systems cannot be understood – let alone ‘managed’ – by sectoral and disciplinary approaches alone. But while there has recently been significant progress in broadening and refining the methodologies for the quantitative modeling of complex urban systems, in deepening the theoretical understanding of cities as complex systems, or in illuminating the implications for urban planning, there is still a lack of well-founded conceptual thinking on the methodological foundations and the strategies of modeling urban complexity across the disciplines. Bringing together experts from the fields of urban and spatial planning, ecology, urban geography, real estate analysis, organizational cybernetics, stochastic optimization, and literary studies, as well as specialists in various systems approaches and in transdisciplinary methodologies of urban analysis, the volum...

  3. Multilocus sequence typing and rtxA toxin gene sequencing analysis of Kingella kingae isolates demonstrates genetic diversity and international clones.

    Directory of Open Access Journals (Sweden)

    Romain Basmaci

    Full Text Available BACKGROUND: Kingella kingae, a normal component of the upper respiratory flora, is being increasingly recognized as an important invasive pathogen in young children. The genetic diversity of this species has not been studied. METHODS: We analyzed 103 strains from different countries and clinical origins using a new multilocus sequence typing (MLST) scheme. The putative virulence gene rtxA, encoding an RTX toxin, was also sequenced, and the experimental virulence of representative strains was assessed in a juvenile-rat model. RESULTS: Thirty-six sequence types (STs) and nine ST-complexes (STcs) were detected. The main STcs, 6, 14 and 23, comprised 23, 17 and 20 strains respectively, and were internationally distributed. rtxA sequencing results were mostly congruent with MLST and showed horizontal transfer events. Of interest, all members of the distantly related ST-6 (n = 22) and ST-5 (n = 4) harboured a 33 bp duplication or triplication in their rtxA sequence, suggesting that this genetic trait arose through selective advantage. The animal model revealed significant differences in virulence among strains of the species. CONCLUSION: MLST analysis reveals international spread of ST-complexes and will help to decipher the acquisition and evolution of virulence traits and the diversity of pathogenicity among K. kingae strains, for which an experimental animal model is now available.

  4. Genome Sequence Analysis of New Isolates of the Winona Strain of Plum pox virus and the First Definitive Evidence of Intrastrain Recombination Events.

    Science.gov (United States)

    James, Delano; Sanderson, Dan; Varga, Aniko; Sheveleva, Anna; Chirkov, Sergei

    2016-04-01

    Plum pox virus (PPV) is genetically diverse, with nine different strains identified. Mutations, indel events, and interstrain recombination events are known to contribute to the genetic diversity of PPV. This is the first report of intrastrain recombination events that contribute to PPV's genetic diversity. Fourteen isolates of the PPV strain Winona (W) were analyzed, including nine new strain W isolates sequenced completely in this study. Isolates of other strains of PPV with more than one isolate with the complete genome sequence available in GenBank were also included in this study for comparison and analysis. Five intrastrain recombination events were detected among the PPV W isolates, one among PPV C strain isolates, and one among PPV M strain isolates. Four (29%) of the PPV W isolates analyzed are recombinants, one of which (P2-1) is a mosaic, with three recombination events identified. A new interstrain recombination event was identified between a strain M isolate and a strain Rec isolate, a known recombinant. In silico recombination studies and pairwise distance analyses of PPV strain D isolates indicate that a threshold of genetic diversity exists for the detectability of recombination events, in the range of approximately 0.78×10^−2 to 1.33×10^−2 mean pairwise distance. RDP4 analyses indicate that in the case of PPV Rec isolates there may be a recombinant breakpoint distinct from the obvious transition point of strain sequences. Evidence was obtained that indicates that the frequency of PPV recombination is underestimated, which may be true for other RNA viruses where low genetic diversity exists.
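    The mean pairwise distance quoted as a detectability threshold is simply the average proportion of differing sites over all pairs of aligned sequences; a sketch with invented toy isolates whose diversity sits near the 1.33×10^−2 bound:

```python
from itertools import combinations

def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    diffs = sum(1 for x, y in zip(a, b) if x != y)
    return diffs / len(a)

def mean_pairwise_distance(seqs):
    """Mean p-distance over all unordered pairs of aligned sequences."""
    pairs = list(combinations(seqs, 2))
    return sum(p_distance(a, b) for a, b in pairs) / len(pairs)

# Three invented 100-nt aligned isolates differing at the last two sites
base = "A" * 98
isolates = [base + "AA", base + "AC", base + "CC"]
mpd = mean_pairwise_distance(isolates)   # (0.01 + 0.02 + 0.01) / 3
```

    A population whose mean pairwise distance falls below such a threshold carries too little signal for recombination-detection software to place breakpoints, which is the effect the abstract describes for PPV strain D.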

  5. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In the PSA, accident sequence quantification calculates the core damage frequency and performs importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and an efficient computer code, because the computation takes a long time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform accident sequence quantification with KIRAP. (author). 6 refs.
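    Accident sequence quantification commonly combines an initiating-event frequency with the minimal cut sets obtained from the mitigating-system fault trees; a sketch using the rare-event approximation (event names and numbers are invented, and KIRAP's actual algorithms are more involved):

```python
def cutset_probability(cutset, basic_events):
    """Probability of one minimal cut set, assuming independent basic events."""
    p = 1.0
    for e in cutset:
        p *= basic_events[e]
    return p

def sequence_frequency(init_freq, cutsets, basic_events):
    """Rare-event approximation: initiating-event frequency (per year)
    times the sum of the minimal cut set probabilities."""
    return init_freq * sum(cutset_probability(c, basic_events) for c in cutsets)

# Invented illustrative data, not from the KIRAP study
basic_events = {"PUMP-A": 1e-3, "PUMP-B": 1e-3, "VALVE-C": 5e-4}
cutsets = [["PUMP-A", "PUMP-B"], ["VALVE-C"]]
freq = sequence_frequency(0.1, cutsets, basic_events)  # initiator at 0.1/yr
```

    Summing such frequencies over all core-damage sequences gives the core damage frequency, and perturbing one basic-event probability at a time gives simple importance measures.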

  6. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data, and finally accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and performs importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because it must combine all of the event tree and fault tree models, and it requires an efficient computer code because the computation takes a long time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs

  7. Coordination Approaches for Complex Software Systems

    NARCIS (Netherlands)

    Bosse, T.; Hoogendoorn, M.; Treur, J.

    2006-01-01

    This document presents the results of a collaboration between the Vrije Universiteit Amsterdam, Department of Artificial Intelligence and Force Vision to investigate coordination approaches for complex software systems. The project was funded by Force Vision.

  8. Cranio-orbital approach for complex aneurysmal surgery.

    LENUS (Irish Health Repository)

    Kelleher, M O

    2012-02-03

    Certain aneurysms of the anterior circulation continue to offer a technical challenge for safe exposure and clipping. The purpose of this paper was to describe the cranio-orbital approach for surgical clipping of complex aneurysms and to evaluate prospectively the associated complications of this approach. Prospective audit of all patients undergoing cranio-orbital approach for aneurysm surgery from 1997 to 2004 by the senior author. Twenty-five patients, eight male and 17 female, median age of 52 years, range 28-73. All patients had a standard pterional approach supplemented by an orbital osteotomy. In the 7-year period 367 patients underwent treatment for their aneurysms (169 clipped and 198 coiled). Of the 169 patients who were operated on, 29 had a skull base approach, of which 25 were cranio-orbital. The aneurysm location was as follows: 16 middle cerebral artery (MCA), three carotid bifurcation, four anterior communicating artery (ACOMM), one ophthalmic and one basilar. There were no approach-related complications. The cranio-orbital craniotomy can be a useful adjunct in the surgical treatment of giant or complex aneurysms. It offers the following advantages over a standard pterional approach: reduces operative distance; allows easy splitting of the sylvian fissure; and provides a wide arc of exposure with multiple working corridors.

  9. Containment performance evaluation for the GESSAR-II plant for seismic initiating events

    International Nuclear Information System (INIS)

    Shiu, K.K.; Chu, T.; Ludewig, H.; Pratt, W.T.

    1986-01-01

    As a part of the overall effort undertaken by Brookhaven National Laboratory (BNL) to review the GESSAR-II probabilistic risk assessment, an independent containment performance evaluation was performed using the containment event tree approach. This evaluation focused principally on those accident sequences which are initiated by seismic events. This paper reports the findings of this study. 1 ref

  10. Methodology for time-dependent reliability analysis of accident sequences and complex reactor systems

    International Nuclear Information System (INIS)

    Paula, H.M.

    1984-01-01

    The work presented here is of direct use in probabilistic risk assessment (PRA) and is of value to utilities as well as the Nuclear Regulatory Commission (NRC). Specifically, this report presents a methodology and a computer program to calculate the expected number of occurrences for each accident sequence in an event tree. The methodology evaluates the time-dependent (instantaneous) and the average behavior of the accident sequence. The methodology accounts for standby safety system and component failures that occur (a) before they are demanded, (b) upon demand, and (c) during the mission (system operation). With respect to failures that occur during the mission, this methodology is unique in the sense that it models components that can be repaired during the mission. The expected number of system failures during the mission provides an upper bound for the probability of a system failure to run - the mission unreliability. The basic event modeling includes components that are continuously monitored, periodically tested, and those that are not tested or are otherwise nonrepairable. The computer program ASA allows practical applications of the method developed. This work represents a required extension of the presently available methodology and allows a more realistic PRA of nuclear power plants
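
    One basic-event model from this class of methods can be sketched in a few lines: the instantaneous unavailability of a periodically tested standby component, which resets at each test and averages to roughly λT/2 when λT is small. The parameter values below are illustrative, not taken from the report:

```python
from math import exp

def unavailability(lam, t, test_interval):
    """Instantaneous unavailability of a periodically tested standby
    component: probability it has failed since the last test."""
    time_since_test = t % test_interval
    return 1.0 - exp(-lam * time_since_test)

def average_unavailability(lam, test_interval, steps=10000):
    """Time-average of the sawtooth unavailability over one test cycle."""
    dt = test_interval / steps
    return sum(unavailability(lam, (i + 0.5) * dt, test_interval)
               for i in range(steps)) * dt / test_interval

lam = 1e-4   # standby failure rate per hour (hypothetical)
T = 720.0    # test interval in hours (hypothetical)
q_avg = average_unavailability(lam, T)
# For lam * T << 1 this approaches the familiar lam * T / 2 approximation
```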

  11. Rb-Sr measurements on metamorphic rocks from the Barro Alto Complex, Goias, Brazil

    International Nuclear Information System (INIS)

    Fuck, R.A.; Neves, B.B.B.; Cordani, U.G.; Kawashita, K.

    1988-01-01

    The Barro Alto Complex comprises a highly deformed and metamorphosed association of plutonic, volcanic, and sedimentary rocks exposed in a 150 x 25 km boomerang-like strip in central Goias, Brazil. It is the southernmost tip of an extensive yet discontinuous belt of granulite and amphibolite facies metamorphic rocks which includes the Niquelandia and Cana Brava complexes to the north. Two rock associations are distinguished within the granulite belt. The first one comprises a sequence of fine-grained mafic granulite, hypersthene-quartz-feldspar granulite, garnet quartzite, sillimanite-garnet-cordierite gneiss, calc-silicate rock, and magnetite-rich iron formation. The second association comprises medium- to coarse-grained mafic rocks. The medium-grade rocks of the western/northern portion (Barro Alto Complex) comprise both layered mafic rocks and a volcanic-sedimentary sequence, deformed and metamorphosed under amphibolite facies conditions. The fine-grained amphibolites form the basal part of the Juscelandia metavolcanic-sedimentary sequence. A geochronologic investigation by the Rb-Sr method has been carried out mainly on felsic rocks from the granulite belt and gneisses of the Juscelandia sequence. The analytical results for the Juscelandia sequence are presented. Isotope results for rocks from different outcrops along the gneiss layer near Juscelandia are also presented. In conclusion, Rb-Sr isotope measurements suggest that the Barro Alto rocks have undergone at least one important metamorphic event during Middle Proterozoic times, around 1300 Ma ago. During that event volcanic and sedimentary rocks of the Juscelandia sequence, as well as the underlying gabbro-anorthosite layered complex, underwent deformation and recrystallization under amphibolite facies conditions. (author)

  12. The 2008 Wells, Nevada earthquake sequence: Source constraints using calibrated multiple event relocation and InSAR

    Science.gov (United States)

    Nealy, Jennifer; Benz, Harley M.; Hayes, Gavin; Berman, Eric; Barnhart, William

    2017-01-01

    The 2008 Wells, NV earthquake represents the largest domestic event in the conterminous U.S. outside of California since the October 1983 Borah Peak earthquake in southern Idaho. We present an improved catalog, magnitude complete to 1.6, of the foreshock-aftershock sequence, supplementing the current U.S. Geological Survey (USGS) Preliminary Determination of Epicenters (PDE) catalog with 1,928 well-located events. In order to create this catalog, both subspace and kurtosis detectors are used to obtain an initial set of earthquakes and associated locations. The latter are then calibrated through the implementation of the hypocentroidal decomposition method and relocated using the BayesLoc relocation technique. We additionally perform a finite fault slip analysis of the mainshock using InSAR observations. By combining the relocated sequence with the finite fault analysis, we show that the aftershocks occur primarily updip and along the southwestern edge of the zone of maximum slip. The aftershock locations illuminate areas of post-mainshock strain increase; aftershock depths, ranging from 5 to 16 km, are consistent with InSAR imaging, which shows that the Wells earthquake was a buried source with no observable near-surface offset.

  13. Model SNP development for complex genomes based on hexaploid oat using high-throughput 454 sequencing technology

    Directory of Open Access Journals (Sweden)

    Chao Shiaoman

    2011-01-01

    Full Text Available Abstract Background Genetic markers are pivotal to modern genomics research; however, discovery and genotyping of molecular markers in oat has been hindered by the size and complexity of the genome, and by a scarcity of sequence data. The purpose of this study was to generate oat expressed sequence tag (EST) information, develop a bioinformatics pipeline for SNP discovery, and establish a method for rapid, cost-effective, and straightforward genotyping of SNP markers in complex polyploid genomes such as oat. Results Based on cDNA libraries of four cultivated oat genotypes, approximately 127,000 contigs were assembled from approximately one million Roche 454 sequence reads. Contigs were filtered through a novel bioinformatics pipeline to eliminate ambiguous polymorphism caused by subgenome homology, and 96 in silico SNPs were selected from 9,448 candidate loci for validation using high-resolution melting (HRM) analysis. Of these, 52 (54%) were polymorphic between parents of the Ogle1040 × TAM O-301 (OT) mapping population, with 48 segregating as single Mendelian loci, and 44 being placed on the existing OT linkage map. Ogle and TAM amplicons from 12 primers were sequenced for SNP validation, revealing complex polymorphism in seven amplicons but general sequence conservation within SNP loci. Whole-amplicon interrogation with HRM revealed insertions, deletions, and heterozygotes in secondary oat germplasm pools, generating multiple alleles at some primer targets. To validate marker utility, 36 SNP assays were used to evaluate the genetic diversity of 34 diverse oat genotypes. Dendrogram clusters corresponded generally to known genome composition and genetic ancestry. Conclusions The high-throughput SNP discovery pipeline presented here is a rapid and effective method for identification of polymorphic SNP alleles in the oat genome. The current-generation HRM system is a simple and highly informative platform for SNP genotyping.
These techniques provide
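
    The homoeologue-filtering idea behind such a pipeline can be sketched in a few lines. This is a simplification, not the study's actual bioinformatics pipeline, and the genotype names and thresholds are illustrative: a position is kept as a candidate SNP only if the reads within each genotype agree with one another, since a mixture of bases within one genotype of a hexaploid is more likely subgenome homology than a true polymorphism.

```python
from collections import Counter

def candidate_snp(pileups, min_depth=4, purity=0.95):
    """pileups: {genotype: [base observed at this position in each read]}.
    Returns per-genotype base calls if the position looks like a true SNP,
    else None.  A genotype with mixed bases is rejected, because in a
    polyploid that pattern suggests homoeologous (subgenome) copies."""
    calls = {}
    for geno, bases in pileups.items():
        if len(bases) < min_depth:
            return None                      # not enough coverage to call
        base, n = Counter(bases).most_common(1)[0]
        if n / len(bases) < purity:
            return None                      # intra-genotype polymorphism
        calls[geno] = base
    # Require an actual difference between genotypes
    return calls if len(set(calls.values())) > 1 else None

# Hypothetical pileup at one position for two genotypes
site = {"Ogle": list("AAAAA"), "TAMO301": list("GGGG")}
snp = candidate_snp(site)
```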

  14. State of the art and challenges in sequence based T-cell epitope prediction

    DEFF Research Database (Denmark)

    Lundegaard, Claus; Hoof, Ilka; Lund, Ole

    2010-01-01

    Sequence based T-cell epitope predictions have improved immensely in the last decade. From predictions of peptide binding to major histocompatibility complex molecules with moderate accuracy, limited allele coverage, and no good estimates of the other events in the antigen-processing pathway, the...

  15. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Science.gov (United States)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: Minimizes the utilization of individual seismic phase detections - in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation, Has a formalized framework that utilizes information from non-detecting stations, Has a formalized framework that utilizes source information, in
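
    The Bayesian combination step can be illustrated with a toy calculation (this is not the ProbDet implementation, and the likelihood ratios are invented): each station contributes the ratio of how well its pre-processed waveform supports the event hypothesis versus noise, and non-detecting stations contribute ratios below one rather than being discarded.

```python
from math import exp, log

def network_posterior(prior, station_likelihood_ratios):
    """Posterior probability of an event hypothesis given per-station
    evidence ratios P(data_i | event) / P(data_i | noise), combined in
    log-odds space for numerical stability."""
    log_odds = log(prior / (1.0 - prior))
    log_odds += sum(log(r) for r in station_likelihood_ratios)
    odds = exp(log_odds)
    return odds / (1.0 + odds)

# Hypothetical: two stations favor the event, one quiet station disfavors it
p = network_posterior(0.01, [30.0, 12.0, 0.4])
```

    The quiet station's ratio of 0.4 pulls the posterior down, which is the formalized use of non-detecting stations that the record highlights.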

  16. Sequence Comparison Alignment-Free Approach Based on Suffix Tree and L-Words Frequency

    Directory of Open Access Journals (Sweden)

    Inês Soares

    2012-01-01

    Full Text Available The vast majority of methods available for sequence comparison rely on a first sequence alignment step, which requires a number of assumptions on evolutionary history and is sometimes very difficult or impossible to perform due to the abundance of gaps (insertions/deletions. In such cases, an alternative alignment-free method would prove valuable. Our method starts by a computation of a generalized suffix tree of all sequences, which is completed in linear time. Using this tree, the frequency of all possible words with a preset length L—L-words—in each sequence is rapidly calculated. Based on the L-words frequency profile of each sequence, a pairwise standard Euclidean distance is then computed producing a symmetric genetic distance matrix, which can be used to generate a neighbor joining dendrogram or a multidimensional scaling graph. We present an improvement to word counting alignment-free approaches for sequence comparison, by determining a single optimal word length and combining suffix tree structures to the word counting tasks. Our approach is, thus, a fast and simple application that proved to be efficient and powerful when applied to mitochondrial genomes. The algorithm was implemented in Python language and is freely available on the web.
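
    The word-counting core of the method can be sketched compactly. The paper builds the L-word profiles from a generalized suffix tree for linear-time counting; a plain dictionary, as below, yields the same profiles for short toy sequences:

```python
from collections import Counter
from math import sqrt

def lword_profile(seq, L):
    """Relative frequency of every length-L substring (L-word) in seq."""
    counts = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def euclidean_distance(p, q):
    """Standard Euclidean distance between two L-word frequency profiles."""
    words = set(p) | set(q)
    return sqrt(sum((p.get(w, 0.0) - q.get(w, 0.0)) ** 2 for w in words))

# Toy sequences; real use would feed whole mitochondrial genomes
d = euclidean_distance(lword_profile("ACGTACGT", 2),
                       lword_profile("ACGTTCGT", 2))
```

    Filling a matrix of such distances for all sequence pairs gives the symmetric genetic distance matrix from which the neighbor-joining dendrogram is built.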

  17. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Han Sul; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kim, Tae Wan [Incheon National University, Incheon (Korea, Republic of)

    2017-03-15

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional, comparatively static PSA model (the so-called static PSA, or S-PSA, model). We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.
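
    The DET idea can be sketched as a branching expansion in which each branch point multiplies the path probability and low-probability paths are pruned, as DET tools commonly do. The branch points and probabilities below are illustrative only, not from the SGTR case study:

```python
def expand(sequence, prob, branch_points, threshold=1e-6):
    """Depth-first expansion of a (simplified) dynamic event tree.
    branch_points: list of (label, [(outcome, probability), ...]) in time
    order.  Paths whose probability drops below threshold are pruned."""
    if prob < threshold:
        return []
    if not branch_points:
        return [(sequence, prob)]
    label, outcomes = branch_points[0]
    paths = []
    for outcome, p in outcomes:
        paths += expand(sequence + [(label, outcome)], prob * p,
                        branch_points[1:], threshold)
    return paths

# Hypothetical SGTR-like branch points (numbers are invented)
bps = [("operator isolates faulted SG", [("success", 0.95), ("failure", 0.05)]),
       ("high-pressure injection",      [("available", 0.999), ("unavailable", 0.001)])]
paths = expand([], 1.0, bps)
```

    In a real DET the branch timings and probabilities come from a coupled plant simulation rather than a fixed list, which is what lets D-PSA capture the operator/physics interaction the abstract describes.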

  18. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    International Nuclear Information System (INIS)

    Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan

    2017-01-01

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional, comparatively static PSA model (the so-called static PSA, or S-PSA, model). We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective

  19. Pairwise Sequence Alignment Library

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-20

    Vector extensions, such as SSE, have been part of the x86 CPU since the 1990s, with applications in graphics, signal processing, and scientific applications. Although many algorithms and applications can naturally benefit from automatic vectorization techniques, there are still many that are difficult to vectorize due to their dependence on irregular data structures, dense branch operations, or data dependencies. Sequence alignment, one of the most widely used operations in bioinformatics workflows, has a computational footprint that features complex data dependencies. The trend of widening vector registers adversely affects the state-of-the-art sequence alignment algorithm based on striped data layouts. Therefore, a novel SIMD implementation of a parallel scan-based sequence alignment algorithm that can better exploit wider SIMD units was implemented as part of the Parallel Sequence Alignment Library (parasail). Parasail features: reference implementations of all known vectorized sequence alignment approaches; implementations of the Smith-Waterman (SW), semi-global (SG), and Needleman-Wunsch (NW) sequence alignment algorithms; implementations across all modern CPU instruction sets, including AVX2 and KNC; and language interfaces for C/C++ and Python.
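
    For reference, the scalar recurrence that parasail's SIMD kernels vectorize can be written in a few lines. This is the textbook Needleman-Wunsch (NW) score computation, not parasail's striped or scan implementation, and it uses a simple match/mismatch scheme rather than a substitution matrix:

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Global (NW) alignment score via the standard dynamic-programming
    recurrence; each cell depends on its left, upper, and diagonal
    neighbors, which is the data dependency that makes vectorization hard."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        H[i][0] = i * gap
    for j in range(1, cols):
        H[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(H[i - 1][j - 1] + s,   # substitution
                          H[i - 1][j] + gap,     # gap in b
                          H[i][j - 1] + gap)     # gap in a
    return H[-1][-1]

score = needleman_wunsch("GATTACA", "GCATGCU")
```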

  20. Particle tracking from image sequences of complex plasma crystals

    International Nuclear Information System (INIS)

    Hadziavdic, Vedad; Melandsoe, Frank; Hanssen, Alfred

    2006-01-01

    In order to gather information about the physics of complex plasma crystals from experimental data, particles have to be tracked through a sequence of images. An application of the Kalman filter for that purpose is presented, using a one-dimensional approximation of the particle dynamics as a model for the filter. It is shown that the Kalman filter is capable of tracking dust particles even with high levels of measurement noise. An inherent part of the Kalman filter, the innovation process, can be used to estimate values of the physical system parameters from the experimental data. The method is shown to be able to estimate the characteristic oscillation frequency from noisy data
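
    A one-dimensional constant-velocity sketch of the approach (simplified; the noise parameters and the synthetic drifting "particle" are hypothetical, not the paper's dust-particle model): the filter alternates predict and update steps, and the innovation sequence (measurement minus prediction) is the quantity mined for physical parameters.

```python
import random

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """1-D constant-velocity Kalman filter.  Returns filtered positions
    and the innovation sequence (measurement minus prediction)."""
    x, v = measurements[0], 0.0           # state estimate: position, velocity
    P = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    filtered, innovations = [], []
    for z in measurements[1:]:
        # Predict with the constant-velocity model (process noise q on diagonal)
        x, v = x + dt * v, v
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Innovation and Kalman gain (position-only measurement, noise r)
        y = z - x
        S = P[0][0] + r
        K0, K1 = P[0][0] / S, P[1][0] / S
        x, v = x + K0 * y, v + K1 * y
        P = [[(1 - K0) * P[0][0], (1 - K0) * P[0][1]],
             [P[1][0] - K1 * P[0][0], P[1][1] - K1 * P[0][1]]]
        filtered.append(x)
        innovations.append(y)
    return filtered, innovations

# Synthetic noisy track: a particle drifting at 0.1 units per frame
random.seed(0)
truth = [0.1 * t for t in range(50)]
noisy = [p + random.gauss(0.0, 0.5) for p in truth]
est, innov = kalman_track(noisy)
```

    Statistics of `innov` (its variance and autocorrelation) are what the innovation-based parameter estimation in the paper exploits.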

  1. Screening the sequence selectivity of DNA-binding molecules using a gold nanoparticle-based colorimetric approach.

    Science.gov (United States)

    Hurst, Sarah J; Han, Min Su; Lytton-Jean, Abigail K R; Mirkin, Chad A

    2007-09-15

    We have developed a novel competition assay that uses a gold nanoparticle (Au NP)-based, high-throughput colorimetric approach to screen the sequence selectivity of DNA-binding molecules. This assay hinges on the observation that the melting behavior of DNA-functionalized Au NP aggregates is sensitive to the concentration of the DNA-binding molecule in solution. When short, oligomeric hairpin DNA sequences were added to a reaction solution consisting of DNA-functionalized Au NP aggregates and DNA-binding molecules, these molecules may either bind to the Au NP aggregate interconnects or the hairpin stems based on their relative affinity for each. This relative affinity can be measured as a change in the melting temperature (Tm) of the DNA-modified Au NP aggregates in solution. As a proof of concept, we evaluated the selectivity of 4',6-diamidino-2-phenylindole (an AT-specific binder), ethidium bromide (a nonspecific binder), and chromomycin A (a GC-specific binder) for six sequences of hairpin DNA having different numbers of AT pairs in a five-base pair variable stem region. Our assay accurately and easily confirmed the known trends in selectivity for the DNA binders in question without the use of complicated instrumentation. This novel assay will be useful in assessing large libraries of potential drug candidates that work by binding DNA to form a drug/DNA complex.

  2. A simple approach to enhance the performance of complex-coefficient filter-based PLL in grid-connected applications

    DEFF Research Database (Denmark)

    Ramezani, Malek; Golestan, Saeed; Li, Shuhui

    2018-01-01

    In recent years, a large number of three-phase phase-locked loops (PLLs) have been developed. One of the most popular ones is the complex coefficient filterbased PLL (CCF-PLL). The CCFs benefit from a sequence selective filtering ability and, hence, enable the CCF-PLL to selectively reject/extract...... disturbances before the PLL control loop while maintaining an acceptable dynamic behavior. The aim of this paper is presenting a simple yet effective approach to enhance the standard CCF-PLL performance without requiring any additional computational load....

  3. Highly multiplexed targeted DNA sequencing from single nuclei.

    Science.gov (United States)

    Leung, Marco L; Wang, Yong; Kim, Charissa; Gao, Ruli; Jiang, Jerry; Sei, Emi; Navin, Nicholas E

    2016-02-01

    Single-cell DNA sequencing methods are challenged by poor physical coverage, high technical error rates and low throughput. To address these issues, we developed a single-cell DNA sequencing protocol that combines flow-sorting of single nuclei, time-limited multiple-displacement amplification (MDA), low-input library preparation, DNA barcoding, targeted capture and next-generation sequencing (NGS). This approach represents a major improvement over our previous single nucleus sequencing (SNS) Nature Protocols paper in terms of generating higher-coverage data (>90%), thereby enabling the detection of genome-wide variants in single mammalian cells at base-pair resolution. Furthermore, by pooling 48-96 single-cell libraries together for targeted capture, this approach can be used to sequence many single-cell libraries in parallel in a single reaction. This protocol greatly reduces the cost of single-cell DNA sequencing, and it can be completed in 5-6 d by advanced users. This single-cell DNA sequencing protocol has broad applications for studying rare cells and complex populations in diverse fields of biological research and medicine.

  4. Trending analysis of precursor events

    International Nuclear Information System (INIS)

    Watanabe, Norio

    1998-01-01

    The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of the potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in the annual reports. This paper describes the trends in initiating events and dominant sequences for 459 precursors identified in the ASP Program during the 1969-94 period and also discusses a comparison with dominant sequences predicted in the past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods, 1969-81, 1984-87 and 1988-94. Although different models had been used in the ASP analyses for these three periods, the distributions of precursors by dominant sequences show similar trends to each other. For example, the sequences involving loss of both main and auxiliary feedwater were identified in many PWR events and those involving loss of both high and low coolant injection were found in many BWR events. Also, it was found that these dominant sequences were comparable to those determined to be dominant in the predictions by the past PRAs. A list of the 459 precursors identified is provided in the Appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)
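
    The bookkeeping behind such trending is straightforward to sketch. Below, hypothetical precursor records (year, dominant sequence, conditional core damage probability) are rolled up into a yearly CCDP sum and per-sequence counts; the records are invented, not taken from the ASP reports:

```python
from collections import defaultdict

# Hypothetical precursor records: (year, dominant sequence, CCDP)
precursors = [
    (1985, "loss of main and auxiliary feedwater",   3e-4),
    (1985, "loss of offsite power",                  1e-4),
    (1986, "loss of main and auxiliary feedwater",   5e-5),
    (1987, "loss of high and low coolant injection", 2e-4),
]

def yearly_index(records):
    """Sum of conditional core damage probabilities per year, a common
    way to trend the annual risk significance of precursor events."""
    idx = defaultdict(float)
    for year, _, ccdp in records:
        idx[year] += ccdp
    return dict(idx)

def sequence_counts(records):
    """How often each dominant sequence appears among the precursors."""
    counts = defaultdict(int)
    for _, seq, _ in records:
        counts[seq] += 1
    return dict(counts)

idx = yearly_index(precursors)
```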

  5. Development of an Electrochemistry Teaching Sequence using a Phenomenographic Approach

    Science.gov (United States)

    Rodriguez-Velazquez, Sorangel

    the core concepts from discipline-specific models and theories serve as visual tools to describe reversible redox half-reactions at equilibrium, predict the spontaneity of the electrochemical process and explain interfacial equilibrium between redox species and electrodes in solution. The integration of physics concepts into electrochemistry instruction facilitated describing the interactions between the chemical system (e.g., redox species) and the external circuit (e.g., voltmeter). The "Two worlds" theoretical framework was chosen to anchor a robust educational design where the world of objects and events is deliberately connected to the world of theories and models. The core concepts in Marcus theory and density of states (DOS) provided the scientific foundations to connect both worlds. The design of this teaching sequence involved three phases; the selection of the content to be taught, the determination of a coherent and explicit connection among concepts and the development of educational activities to engage students in the learning process. The reduction-oxidation and electrochemistry chapters of three of the most popular general chemistry textbooks were revised in order to identify potential gaps during instruction, taking into consideration learning and teaching difficulties. The electrochemistry curriculum was decomposed into manageable sections contained in modules. Thirteen modules were developed and each module addresses specific conceptions with regard to terminology, redox reactions in electrochemical cells, and the function of the external circuit in electrochemical process. The electrochemistry teaching sequence was evaluated using a phenomenographic approach. This approach allows describing the qualitative variation in instructors' consciousness about the teaching of electrochemistry. A phenomenographic analysis revealed that the most relevant aspect of variation came from instructors' expertise. 
Participant A expertise (electrochemist) promoted in

  6. Defense-in-depth approach against a beyond design basis event

    Energy Technology Data Exchange (ETDEWEB)

    Hoang, H., E-mail: Hoa.hoang@ge.com [GE Hitachi Nuclear Energy, 1989 Little Orchard St., 95125 San Jose, California (United States)

    2013-10-15

    The US industry, with the approval of the Nuclear Regulatory Commission, is promoting an approach to add diverse and flexible mitigation strategies, or Flex, that will increase the defense-in-depth capability of nuclear power plants in the event of a beyond-design-basis event, such as the one at the Fukushima Dai-ichi station. The objective of Flex is to establish an indefinite coping capability to prevent damage to the fuel in the core and spent fuel pool, and to maintain the containment function, by utilizing installed equipment, on-site portable equipment and pre-staged off-site resources. This capability will address both an extended loss of all AC power and a loss of the ultimate heat sink, which could arise following a design basis event with additional failures, and conditions from a beyond-design-basis event. (author)

  7. Defense-in-depth approach against a beyond design basis event

    International Nuclear Information System (INIS)

    Hoang, H.

    2013-10-01

    The US industry, with the approval of the Nuclear Regulatory Commission, is promoting an approach to add diverse and flexible mitigation strategies, or Flex, that will increase the defense-in-depth capability of nuclear power plants in the event of a beyond-design-basis event, such as the one at the Fukushima Dai-ichi station. The objective of Flex is to establish an indefinite coping capability to prevent damage to the fuel in the core and spent fuel pool, and to maintain the containment function, by utilizing installed equipment, on-site portable equipment and pre-staged off-site resources. This capability will address both an extended loss of all AC power and a loss of the ultimate heat sink, which could arise following a design basis event with additional failures, and conditions from a beyond-design-basis event. (author)

  8. Event-specific qualitative and quantitative PCR detection of the GMO carnation (Dianthus caryophyllus) variety Moonlite based upon the 5'-transgene integration sequence.

    Science.gov (United States)

    Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H

    2012-04-27

    To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the biases between the observed and true values of three samples were lower than the acceptance criterion of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
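
    Quantification in such assays typically runs through a standard curve relating threshold cycle (Ct) to log copy number; GM content is then the ratio of event-specific to reference-gene copies. A generic sketch with invented dilution-series numbers (a perfectly efficient curve has a slope of about -3.32 per tenfold dilution):

```python
from math import log10

def fit_standard_curve(copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    xs = [log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate template copy number."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series with ideal amplification efficiency
dilution = [1e5, 1e4, 1e3, 1e2]
cts = [18.0, 21.32, 24.64, 27.96]
slope, intercept = fit_standard_curve(dilution, cts)

# Hypothetical sample Cts for the event-specific and reference assays
event_copies = copies_from_ct(23.0, slope, intercept)
ref_copies = copies_from_ct(19.5, slope, intercept)
gmo_percent = 100.0 * event_copies / ref_copies
```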

  9. Insights into three whole-genome duplications gleaned from the Paramecium caudatum genome sequence.

    Science.gov (United States)

    McGrath, Casey L; Gout, Jean-Francois; Doak, Thomas G; Yanagi, Akira; Lynch, Michael

    2014-08-01

    Paramecium has long been a model eukaryote. The sequence of the Paramecium tetraurelia genome reveals a history of three successive whole-genome duplications (WGDs), and the sequences of P. biaurelia and P. sexaurelia suggest that these WGDs are shared by all members of the aurelia species complex. Here, we present the genome sequence of P. caudatum, a species closely related to the P. aurelia species group. P. caudatum shares only the most ancient of the three WGDs with the aurelia complex. We found that P. caudatum maintains twice as many paralogs from this early event as the P. aurelia species, suggesting that post-WGD gene retention is influenced by subsequent WGDs and supporting the importance of selection for dosage in gene retention. The availability of P. caudatum as an outgroup allows an expanded analysis of the aurelia intermediate and recent WGD events. Both the Guanine+Cytosine (GC) content and the expression level of preduplication genes are significant predictors of duplicate retention. We find widespread asymmetrical evolution among aurelia paralogs, which is likely caused by gradual pseudogenization rather than by neofunctionalization. Finally, cases of divergent resolution of intermediate WGD duplicates between aurelia species indicate that this process acts as an ongoing reinforcement mechanism of reproductive isolation long after a WGD event. Copyright © 2014 by the Genetics Society of America.

  10. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which opens new possibilities for intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Building on basic CEP technology and predictive analytics, a networked distributed Markov decision process model with state prediction is proposed as the sequential decision model, and a Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
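Q-learning as used here is the standard tabular update Q(s,a) ← Q(s,a) + α[r + γ·max_a' Q(s',a') − Q(s,a)]. A minimal sketch on a toy traffic-signal task — the states, actions, and reward model below are illustrative assumptions, not the paper's networked distributed MDP:

```python
import random

# Toy tabular Q-learning for a traffic-signal control caricature.
# States/actions/rewards are hypothetical placeholders for illustration.

ACTIONS = ["extend_green", "switch_phase"]

def q_learning(episodes=500, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = {}  # (state, action) -> estimated value
    for _ in range(episodes):
        state = rng.choice(["light", "heavy"])   # predicted congestion state
        for _ in range(20):
            # epsilon-greedy action selection
            if rng.random() < epsilon:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q.get((state, a), 0.0))
            # toy reward: extending green helps under heavy load,
            # switching phases helps under light load
            reward = 1.0 if (state, action) in (
                ("heavy", "extend_green"), ("light", "switch_phase")) else -1.0
            next_state = rng.choice(["light", "heavy"])
            best_next = max(q.get((next_state, a), 0.0) for a in ACTIONS)
            old = q.get((state, action), 0.0)
            q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
            state = next_state
    return q

q = q_learning()
```

After training, the greedy policy per state reads off as `max(ACTIONS, key=lambda a: q[(state, a)])`.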

  11. Detecting false positive sequence homology: a machine learning approach.

    Science.gov (United States)

    Fujimoto, M Stanley; Suvorov, Anton; Jensen, Nicholas O; Clement, Mark J; Bybee, Seth M

    2016-02-24

    Accurate detection of homologous relationships among biological sequences (DNA or amino acid) across organisms is an important and often difficult task that is essential to various evolutionary studies, ranging from building phylogenies to predicting functional gene annotations. Many existing heuristic tools, most commonly based on bidirectional BLAST searches, are used to identify homologous genes and classify them into two fundamentally distinct classes: orthologs and paralogs. Because they rely only on heuristic filtering with significance-score cutoffs and offer no cluster post-processing tools, these methods can produce clusters constituting unrelated (non-homologous) sequences. Sequence data extracted from incomplete genome/transcriptome assemblies, originating from low-coverage sequencing or de novo assembly without a reference genome, are therefore susceptible to high false-positive rates of homology detection. In this paper we develop biologically informative features that can be extracted from multiple sequence alignments of putative homologous genes (orthologs and paralogs) and further utilized in the context of guided experimentation to verify false-positive outcomes. We demonstrate that our machine learning method, trained on both known homology clusters obtained from OrthoDB and randomly generated sequence alignments (non-homologs), successfully identifies apparent false positives inferred by heuristic algorithms, especially among proteomes recovered from low-coverage RNA-seq data. Approximately 42% and 25% of the putative homologies predicted by InParanoid and HaMStR, respectively, were classified as false positives on the experimental data set. Our process increases the quality of output from other clustering algorithms by providing a novel post-processing method that is both fast and efficient at removing low-quality clusters of putative homologous genes recovered by heuristic-based approaches.
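The core idea — scoring a putative homology cluster by features of its multiple sequence alignment — can be sketched with two simple features, mean pairwise identity and gap fraction. The features and thresholds below are illustrative, not the paper's trained model:

```python
# Sketch: two alignment-derived features that can separate plausible
# homolog clusters from spurious ones. Thresholds are illustrative
# assumptions, not values from the paper.

def pairwise_identity(a: str, b: str) -> float:
    """Fraction of identical residues over ungapped aligned columns."""
    cols = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    if not cols:
        return 0.0
    return sum(x == y for x, y in cols) / len(cols)

def msa_features(msa):
    seqs = list(msa)
    pids = [pairwise_identity(seqs[i], seqs[j])
            for i in range(len(seqs)) for j in range(i + 1, len(seqs))]
    gap_frac = sum(s.count("-") for s in seqs) / sum(len(s) for s in seqs)
    return sum(pids) / len(pids), gap_frac

def looks_homologous(msa, min_identity=0.4, max_gaps=0.5):
    mean_pid, gap_frac = msa_features(msa)
    return mean_pid >= min_identity and gap_frac <= max_gaps

real = ["ATGGCGT-A", "ATGGCGTTA", "ATGACGTTA"]   # coherent toy cluster
fake = ["ATGGCGTTA", "CCTTA--GG", "GGA--TTCC"]   # unrelated sequences
```

In practice such features would feed a trained classifier rather than fixed cutoffs; the fixed-threshold rule here only illustrates the feature direction.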

  12. Complex Needs or Simplistic Approaches? Homelessness Services and People with Complex Needs in Edinburgh

    Directory of Open Access Journals (Sweden)

    Manuel Macias Balda

    2016-10-01

    This research addresses how homelessness services from the statutory and voluntary sector are working for people with complex needs in the City of Edinburgh. Using a qualitative approach, it analyses service providers’ perspectives on the concept, the challenges, and what works when dealing with this group of people. It also explores the opinions of a sample of service users, categorised as having complex needs, regarding the accommodation and support they are receiving. After analysing the data, it is argued that homelessness agencies have neither an appropriate cognitive nor an appropriate institutional framework to facilitate an effective approach to working with people with complex needs. The lack of a sophisticated understanding that recognises the relational difficulties of individuals, together with the presence of structural, organisational, professional and interpersonal barriers, hinders the development of positive long-term relationships, which is considered the key factor of change. For this reason, it is recommended to address a set of factors that go beyond simplistic and linear approaches and move towards complex responses in order to tackle homelessness from a broader perspective and, ultimately, achieve social inclusion.

  13. Statistical processing of large image sequences.

    Science.gov (United States)

    Khellah, F; Fieguth, P; Murray, M J; Allen, M

    2005-01-01

    The dynamic estimation of large-scale stochastic image sequences, as frequently encountered in remote sensing, is important in a variety of scientific applications. However, the size of such images makes conventional dynamic estimation methods, for example, the Kalman and related filters, impractical. In this paper, we present an approach that emulates the Kalman filter, but with considerably reduced computational and storage requirements. Our approach is illustrated in the context of a 512 x 512 image sequence of ocean surface temperature. The static estimation step, the primary contribution here, uses a mixture of stationary models to accurately mimic the effect of a nonstationary prior, simplifying both computational complexity and modeling. Our approach provides an efficient, stable, positive-definite model which is consistent with the given correlation structure. Thus, the methods of this paper may find application in modeling and single-frame estimation.

  14. Complex analyses on clinical information systems using restricted natural language querying to resolve time-event dependencies.

    Science.gov (United States)

    Safari, Leila; Patrick, Jon D

    2018-06-01

    This paper reports on a generic framework to provide clinicians with the ability to conduct complex analyses on elaborate research topics using cascaded queries to resolve internal time-event dependencies in the research questions, as an extension to the proposed Clinical Data Analytics Language (CliniDAL). A cascaded query model is proposed to resolve internal time-event dependencies in queries, with up to five levels of criteria: a query to define subjects to be admitted into a study, followed by a query to define the time span of the experiment; three more cascaded queries can be required to define control groups, control variables and output variables, which together simulate a real scientific experiment. Depending on the complexity of the research question, the cascaded query model has the flexibility to merge some lower-level queries for simple research questions, or to add a nested query at each level to compose more complex queries. Three different scenarios (one of which contains two studies) are described and used for evaluation of the proposed solution. CliniDAL's complex analyses solution enables answering complex queries with time-event dependencies in at most a few hours, where a manual approach would take many days. An evaluation of the results of the research studies, based on a comparison between the CliniDAL and SQL solutions, reveals the high usability and efficiency of CliniDAL's solution. Copyright © 2018 Elsevier Inc. All rights reserved.
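The five-level cascade can be sketched as successive filters over a record set, each level narrowing or deriving from the previous one. The field names and criteria below are hypothetical placeholders, not CliniDAL syntax:

```python
# Sketch: the five-level cascade as chained filters over toy patient
# records. All fields and criteria are illustrative assumptions.

patients = [
    {"id": 1, "diagnosis": "sepsis", "admit_day": 3, "on_drug": True,  "age": 61, "los": 12},
    {"id": 2, "diagnosis": "sepsis", "admit_day": 9, "on_drug": False, "age": 45, "los": 7},
    {"id": 3, "diagnosis": "asthma", "admit_day": 2, "on_drug": True,  "age": 33, "los": 2},
]

# Level 1: subjects admitted into the study
subjects = [p for p in patients if p["diagnosis"] == "sepsis"]
# Level 2: time span of the experiment (admitted within the first week)
in_window = [p for p in subjects if p["admit_day"] <= 7]
# Level 3: control group (subjects not on the treatment)
controls = [p for p in subjects if not p["on_drug"]]
# Level 4: control variables (restrict a confounder such as age)
adults = [p for p in in_window if p["age"] >= 18]
# Level 5: output variable (mean length of stay of the study group)
mean_los = sum(p["los"] for p in adults) / len(adults)
```

Each level consumes the output of an earlier one, which is exactly the containment semantics the query algebra formalizes.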

  15. Harnessing Whole Genome Sequencing in Medical Mycology.

    Science.gov (United States)

    Cuomo, Christina A

    2017-01-01

    Comparative genome sequencing studies of human fungal pathogens enable identification of genes and variants associated with virulence and drug resistance. This review describes current approaches, resources, and advances in applying whole genome sequencing to study clinically important fungal pathogens. Genomes for some important fungal pathogens were only recently assembled, revealing gene family expansions in many species and extreme gene loss in one obligate species. The scale and scope of species sequenced are rapidly expanding, leveraging technological advances to assemble and annotate genomes with higher precision. By using iteratively improved reference assemblies or assemblies generated de novo for new species, recent studies have compared the sequences of isolates representing populations or clinical cohorts. Whole genome approaches provide the resolution necessary for comparison of closely related isolates, for example, in the analysis of outbreaks or of isolates sampled across time within a single host. Genomic analysis of fungal pathogens has enabled both basic research and diagnostic studies. The increased scale of sequencing can be applied across populations, and new metagenomic methods allow direct analysis of complex samples.

  16. Time fluctuation analysis of forest fire sequences

    Science.gov (United States)

    Vega Orozco, Carmen D.; Kanevski, Mikhaïl; Tonini, Marj; Golay, Jean; Pereira, Mário J. G.

    2013-04-01

    Forest fires are complex events involving both space and time fluctuations. Understanding their dynamics and pattern distribution is of great importance in order to improve resource allocation and support fire management actions at local and global levels. This study aims at characterizing the temporal fluctuations of forest fire sequences observed in Portugal, the country that holds the largest wildfire land dataset in Europe. This research applies several exploratory data analysis measures to 302,000 forest fires that occurred from 1980 to 2007. The applied clustering measures are: the Morisita clustering index, fractal and multifractal dimensions (box-counting), Ripley's K-function, the Allan Factor, and variography. These algorithms enable a global time-structural analysis describing the degree of clustering of a point pattern and defining whether the observed events occur randomly, in clusters or in a regular pattern. The considered methods are of general importance and can be used for other spatio-temporal events (i.e. crime, epidemiology, biodiversity, geomarketing, etc.). An important contribution of this research deals with the analysis and estimation of local measures of clustering that help in understanding their temporal structure. Each measure is described and executed for the raw data (forest fires geo-database), and the results are compared to reference patterns generated under the null hypothesis of randomness (Poisson processes) embedded in the same time period as the raw data. This comparison enables estimating the degree of deviation of the real data from a Poisson process. Generalizations of these clustering methods to functional measures, which take the measured phenomena into account, were also applied and adapted to detect time dependences in a measured variable (i.e. burned area). The time clustering of the raw data is compared several times with the Poisson processes at different thresholds of the measured function. Then, the clustering measure value
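One of the clustering measures listed above, the Allan Factor, is simple enough to sketch: AF(T) = ⟨(N_{k+1} − N_k)²⟩ / (2⟨N_k⟩) for event counts N_k in consecutive windows of width T. It is near 1 for a Poisson process and well above 1 for clustered data. The burst pattern below is a toy example, not the Portuguese fire record:

```python
# Sketch of the Allan Factor for a sequence of event times.
# AF ~ 1 for Poisson-like data; clustered data give AF > 1.

def allan_factor(event_times, window):
    t_max = max(event_times)
    n_windows = int(t_max // window)
    counts = [0] * n_windows
    for t in event_times:
        k = int(t // window)
        if k < n_windows:
            counts[k] += 1
    diffs = [(counts[i + 1] - counts[i]) ** 2 for i in range(n_windows - 1)]
    mean_count = sum(counts) / n_windows
    return sum(diffs) / len(diffs) / (2.0 * mean_count)

# clustered toy data: bursts of 10 events separated by quiet spells
bursts = [b * 100.0 + i for b in range(20) for i in range(10)]
af = allan_factor(bursts, window=50.0)   # well above 1 for this pattern
```

Scanning `window` over several decades of T is what produces the scaling plots used to characterize clustering across timescales.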

  17. Short template switch events explain mutation clusters in the human genome.

    Science.gov (United States)

    Löytynoja, Ari; Goldman, Nick

    2017-06-01

    Resequencing efforts are uncovering the extent of genetic variation in humans and provide data to study the evolutionary processes shaping our genome. One recurring puzzle in both intra- and inter-species studies is the high frequency of complex mutations comprising multiple nearby base substitutions or insertion-deletions. We devised a generalized mutation model of template switching during replication that extends existing models of genome rearrangement and used this to study the role of template switch events in the origin of short mutation clusters. Applied to the human genome, our model detects thousands of template switch events during the evolution of human and chimp from their common ancestor and hundreds of events between two independently sequenced human genomes. Although many of these are consistent with a template switch mechanism previously proposed for bacteria, our model also identifies new types of mutations that create short inversions, some flanked by paired inverted repeats. The local template switch process can create numerous complex mutation patterns, including hairpin loop structures, and explains multinucleotide mutations and compensatory substitutions without invoking positive selection, speculative mechanisms, or implausible coincidence. Clustered sequence differences are challenging for current mapping and variant calling methods, and we show that many erroneous variant annotations exist in human reference data. Local template switch events may have been neglected as an explanation for complex mutations because of biases in commonly used analyses. Incorporation of our model into reference-based analysis pipelines and comparisons of de novo assembled genomes will lead to improved understanding of genome variation and evolution. © 2017 Löytynoja and Goldman; Published by Cold Spring Harbor Laboratory Press.
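One signature of the template-switch mutations described above is a short variant segment that matches the reverse complement of the reference at the same position, i.e. a short inversion. A minimal sketch of flagging such candidates (the window size and toy sequences are illustrative, not the paper's generalized four-point model):

```python
# Sketch: flag candidate short inversions by checking whether the variant
# segment equals the reverse complement of the reference segment.

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(s: str) -> str:
    return s.translate(COMP)[::-1]

def candidate_inversions(ref: str, alt: str, k: int = 6):
    """Windows where alt differs from ref but equals ref's reverse complement."""
    hits = []
    for i in range(len(ref) - k + 1):
        r, a = ref[i:i + k], alt[i:i + k]
        if a != r and a == revcomp(r):
            hits.append(i)
    return hits

ref = "AACCGGTTACGTAA"
# simulate an inversion of ref[4:10] ("GGTTAC" -> "GTAACC")
alt = ref[:4] + revcomp(ref[4:10]) + ref[10:]
hits = candidate_inversions(ref, alt, k=6)
```

A full detector would also score switch points in all four template-switch configurations rather than exact-matching a single fixed window.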

  18. Pleiotropy constrains the evolution of protein but not regulatory sequences in a transcription regulatory network influencing complex social behaviours

    Directory of Open Access Journals (Sweden)

    Daria Molodtsova

    2014-12-01

    It is increasingly apparent that genes and networks that influence complex behaviour are evolutionarily conserved, which is paradoxical considering that behaviour is labile over evolutionary timescales. How does adaptive change in behaviour arise if behaviour is controlled by conserved, pleiotropic, and likely evolutionarily constrained genes? Pleiotropy and connectedness are known to constrain the general rate of protein evolution, prompting some to suggest that the evolution of complex traits, including behaviour, is fuelled by regulatory sequence evolution. However, we seldom have data on the strength of selection on mutations in coding and regulatory sequences, and this hinders our ability to study how pleiotropy influences coding and regulatory sequence evolution. Here we use population genomics to estimate the strength of selection on coding and regulatory mutations for a transcriptional regulatory network that influences complex behaviour of honey bees. We found that replacement mutations in highly connected transcription factors and target genes experience significantly stronger negative selection relative to weakly connected transcription factors and targets. Adaptively evolving proteins were significantly more likely to reside at the periphery of the regulatory network, while proteins with signs of negative selection were near the core of the network. Interestingly, connectedness and network structure had minimal influence on the strength of selection on putative regulatory sequences for both transcription factors and their targets. Our study indicates that adaptive evolution of complex behaviour can arise because of positive selection on protein-coding mutations in peripheral genes, and on regulatory sequence mutations in both transcription factors and their targets throughout the network.

  19. "Polymeromics": Mass spectrometry based strategies in polymer science toward complete sequencing approaches: a review.

    Science.gov (United States)

    Altuntaş, Esra; Schubert, Ulrich S

    2014-01-15

    Mass spectrometry (MS) is the most versatile and comprehensive method in the "OMICS" sciences (i.e. in proteomics, genomics, metabolomics and lipidomics). The applications of MS and tandem MS (MS/MS or MS(n)) provide sequence information on the full complement of biological samples in order to understand the influence of the sequences on their precise and specific functions. Nowadays, the control of polymer sequences and their accurate characterization is one of the significant challenges of current polymer science. Therefore, a similar approach can be very beneficial for characterizing and understanding the complex structures of synthetic macromolecules. MS-based strategies allow a relatively precise examination of polymeric structures (e.g. their molar mass distributions, monomer units, side-chain substituents, end-group functionalities, and copolymer compositions). Moreover, tandem MS offers accurate structural information on intricate macromolecular structures; however, it produces vast amounts of data to interpret. In the "OMICS" sciences, software applications to interpret the obtained data have developed satisfactorily (e.g. in proteomics), because it is not possible to handle manually the amount of data acquired via (tandem) MS studies on biological samples. It can be expected that special software tools will improve the interpretation of (tandem) MS output from investigations of synthetic polymers as well. Eventually, the MS/MS field will also open up for polymer scientists who are not MS specialists. In this review, we dissect the overall framework of the MS and MS/MS analysis of synthetic polymers into its key components. We discuss the fundamentals of polymer analyses as well as recent advances in the areas of tandem mass spectrometry, software developments, and the overall future perspectives on the way to polymer sequencing, one of the last Holy Grails of polymer science. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Rupture processes of the 2013-2014 Minab earthquake sequence, Iran

    Science.gov (United States)

    Kintner, Jonas A.; Ammon, Charles J.; Cleveland, K. Michael; Herman, Matthew

    2018-06-01

    We constrain epicentroid locations, magnitudes and depths of moderate-magnitude earthquakes in the 2013-2014 Minab sequence using surface-wave cross-correlations, surface-wave spectra and teleseismic body-wave modelling. We estimate precise relative locations of 54 Mw ≥ 3.8 earthquakes using 48 409 teleseismic, intermediate-period Rayleigh and Love-wave cross-correlation measurements. To reduce significant regional biases in our relative locations, we shift the relative locations to align the Mw 6.2 main-shock centroid to a location derived from an independent InSAR fault model. Our relocations suggest that the events lie along a roughly east-west trend that is consistent with the faulting geometry in the GCMT catalogue. The results support previous studies that suggest the sequence consists of left-lateral strain release, but better defines the main-shock fault length and shows that most of the Mw ≥ 5.0 aftershocks occurred on one or two similarly oriented structures. We also show that aftershock activity migrated westwards along strike, away from the main shock, suggesting that Coulomb stress transfer played a role in the fault failure. We estimate the magnitudes of the relocated events using surface-wave cross-correlation amplitudes and find good agreement with the GCMT moment magnitudes for the larger events and underestimation of small-event size by catalogue MS. In addition to clarifying details of the Minab sequence, the results demonstrate that even in tectonically complex regions, relative relocation using teleseismic surface waves greatly improves the precision of relative earthquake epicentroid locations and can facilitate detailed tectonic analyses of remote earthquake sequences.
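The basic ingredient of surface-wave relative relocation is measuring inter-event delay times by cross-correlating waveforms. A minimal brute-force sketch (real pipelines use FFT-based, normalized correlation over filtered seismograms; the pulse data here are synthetic):

```python
# Sketch: estimate the relative delay between two waveforms by finding
# the lag that maximizes their cross-correlation. Brute force over lags.

def best_lag(x, y, max_lag):
    """Lag (samples) by which y is delayed relative to x (positive = later)."""
    best, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(x[i] * y[i + lag]
                    for i in range(len(x))
                    if 0 <= i + lag < len(y))
        if score > best_score:
            best, best_score = lag, score
    return best

pulse = [0.0, 1.0, 2.0, 1.0, 0.0]
x = pulse + [0.0] * 10               # arrival near the start
y = [0.0] * 4 + pulse + [0.0] * 6    # same pulse delayed by 4 samples
lag = best_lag(x, y, max_lag=8)
```

Tens of thousands of such delay measurements, inverted jointly, yield the precise relative epicentroids described above.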

  1. Visual artificial grammar learning by rhesus macaques (Macaca mulatta): exploring the role of grammar complexity and sequence length.

    Science.gov (United States)

    Heimbauer, Lisa A; Conway, Christopher M; Christiansen, Morten H; Beran, Michael J; Owren, Michael J

    2018-03-01

    Humans and nonhuman primates can learn about the organization of stimuli in the environment using implicit sequential pattern learning capabilities. However, most previous artificial grammar learning studies with nonhuman primates have involved relatively simple grammars and short input sequences. The goal in the current experiments was to assess the learning capabilities of monkeys on an artificial grammar-learning task that was more complex than most others previously used with nonhumans. Three experiments were conducted using a joystick-based, symmetrical-response serial reaction time task in which two monkeys were exposed to grammar-generated sequences at sequence lengths of four in Experiment 1, six in Experiment 2, and eight in Experiment 3. Over time, the monkeys came to respond faster to the sequences generated from the artificial grammar compared to random versions. In a subsequent generalization phase, subjects generalized their knowledge to novel sequences, responding significantly faster to novel instances of sequences produced using the familiar grammar compared to those constructed using an unfamiliar grammar. These results reveal that rhesus monkeys can learn and generalize the statistical structure inherent in an artificial grammar that is as complex as some used with humans, for sequences up to eight items long. These findings are discussed in relation to whether or not rhesus macaques and other primate species possess implicit sequence learning abilities that are similar to those that humans draw upon to learn natural language grammar.
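An artificial grammar of the kind used here is a small finite-state machine: grammatical sequences are walks through the machine's transitions. A sketch of generating and testing sequences — the transition table is an illustrative toy, not the grammar used in the monkey experiments:

```python
import random

# Sketch: generate fixed-length sequences from a tiny finite-state
# grammar and test membership. The grammar itself is illustrative.

GRAMMAR = {          # state -> list of (symbol, next_state); None = accept
    0: [("A", 1), ("C", 2)],
    1: [("B", 2), ("D", 1)],
    2: [("E", None), ("B", 1)],
}

def generate(length, seed=0):
    rng = random.Random(seed)
    while True:  # rejection-sample until a walk accepts at exactly `length`
        state, out = 0, []
        while state is not None and len(out) < length:
            sym, state = rng.choice(GRAMMAR[state])
            out.append(sym)
        if state is None and len(out) == length:
            return "".join(out)

def is_grammatical(seq):
    states = {0}
    for sym in seq:
        states = {nxt for s in states if s is not None
                  for sy, nxt in GRAMMAR[s] if sy == sym}
        if not states:
            return False
    return None in states

s = generate(6)
```

"Random" control sequences for the serial reaction time task would be built from the same symbol set but without consulting the transition table.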

  2. From Hamiltonian chaos to complex systems a nonlinear physics approach

    CERN Document Server

    Leonetti, Marc

    2013-01-01

    From Hamiltonian Chaos to Complex Systems: A Nonlinear Physics Approach collects contributions on recent developments in non-linear dynamics and statistical physics with an emphasis on complex systems. This book provides a wide range of state-of-the-art research in these fields. The unifying aspect of this book is a demonstration of how similar tools coming from dynamical systems, nonlinear physics, and statistical dynamics can lead to a large panorama of research in various fields of physics and beyond, most notably with the perspective of application in complex systems. This book also: illustrates the broad research influence of tools coming from dynamical systems, nonlinear physics, and statistical dynamics; adopts a pedagogic approach to facilitate understanding by non-specialists and students; presents applications in complex systems; and includes 150 illustrations. From Hamiltonian Chaos to Complex Systems: A Nonlinear Physics Approach is an ideal book for graduate students and researchers working in applied...

  3. PhyloSim - Monte Carlo simulation of sequence evolution in the R statistical computing environment

    Directory of Open Access Journals (Sweden)

    Massingham Tim

    2011-04-01

    Background: The Monte Carlo simulation of sequence evolution is routinely used to assess the performance of phylogenetic inference methods and sequence alignment algorithms. Progress in the field of molecular evolution fuels the need for more realistic and hence more complex simulations, adapted to particular situations, yet current software makes unreasonable assumptions such as homogeneous substitution dynamics or a uniform distribution of indels across the simulated sequences. This calls for an extensible simulation framework written in a high-level functional language, offering new functionality and making it easy to incorporate further complexity. Results: PhyloSim is an extensible framework for the Monte Carlo simulation of sequence evolution, written in R, using the Gillespie algorithm to integrate the actions of many concurrent processes such as substitutions, insertions and deletions. Uniquely among sequence simulation tools, PhyloSim can simulate arbitrarily complex patterns of rate variation and multiple indel processes, and allows for the incorporation of selective constraints on indel events. User-defined complex patterns of mutation and selection can be easily integrated into simulations, allowing PhyloSim to be adapted to specific needs. Conclusions: Close integration with R and the wide range of features implemented offer unmatched flexibility, making it possible to simulate sequence evolution under a wide range of realistic settings. We believe that PhyloSim will be useful to future studies involving simulated alignments.
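The Gillespie algorithm mentioned above draws exponentially distributed waiting times from the total event rate, then picks which event fires. A heavily simplified sketch with a single substitution process under a Jukes-Cantor-like model — rates and the toy sequence are illustrative, and PhyloSim itself additionally handles concurrent indel processes and site-specific rates:

```python
import random

# Sketch: Gillespie-style simulation of substitutions on one sequence.
# One event type (substitution) with a uniform per-site rate.

def gillespie_substitutions(seq, rate_per_site, t_end, seed=1):
    rng = random.Random(seed)
    seq, t, n_events = list(seq), 0.0, 0
    while True:
        total_rate = rate_per_site * len(seq)
        t += rng.expovariate(total_rate)   # exponential waiting time
        if t > t_end:
            break
        i = rng.randrange(len(seq))        # site chosen uniformly
        seq[i] = rng.choice([b for b in "ACGT" if b != seq[i]])
        n_events += 1
    return "".join(seq), n_events

evolved, n = gillespie_substitutions("ACGT" * 25, rate_per_site=0.02, t_end=10.0)
```

With several concurrent processes, `total_rate` becomes the sum of all process rates and the firing process is chosen in proportion to its share, which is exactly how many-process integration works.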

  4. Next-generation phylogeography: a targeted approach for multilocus sequencing of non-model organisms.

    Directory of Open Access Journals (Sweden)

    Jonathan B Puritz

    The field of phylogeography has long since realized the need for, and utility of, incorporating nuclear DNA (nDNA) sequences into analyses. However, the use of nDNA sequence data at the population level has been hindered by technical laboratory difficulty, sequencing costs, and problematic analytical methods for dealing with genotypic sequence data, especially in non-model organisms. Here, we present a method utilizing the 454 GS-FLX Titanium pyrosequencing platform with the capacity to simultaneously sequence two species of sea star (Meridiastra calcar and Parvulastra exigua) at five different nDNA loci across 16 different populations of 20 individuals each per species. We compare results from three populations with traditional Sanger-sequencing-based methods, and demonstrate that this next-generation sequencing platform is more time and cost effective and more sensitive to rare variants than Sanger-based sequencing. A crucial advantage is that the high coverage of clonally amplified sequences simplifies haplotype determination, even in highly polymorphic species. This targeted next-generation approach can greatly increase the use of nDNA sequence loci in phylogeographic and population genetic studies by mitigating many of the time, cost, and analytical issues associated with highly polymorphic, diploid sequence markers.

  5. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran; Ovcharenko, Oleg; Peter, Daniel

    2017-01-01

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset.
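A single-hidden-layer feed-forward detector of the kind described reduces to two matrix-vector products with a nonlinearity in between. In the sketch below the waveform features and hand-set weights are illustrative assumptions (the paper's network is tuned on training data, and its input representation is not specified here):

```python
import math

# Sketch: a single-hidden-layer feed-forward detector scoring waveform
# windows. Features and weights are hand-set for illustration only.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def features(window):
    energy = sum(s * s for s in window) / len(window)
    peak = max(abs(s) for s in window)
    return [energy, peak]

# 2 inputs -> 2 hidden units -> 1 output (detection probability)
W1 = [[3.0, 1.0], [1.0, 3.0]]   # hidden-layer weights
B1 = [-2.0, -2.0]               # hidden-layer biases
W2 = [2.5, 2.5]                 # output weights
B2 = -2.0                       # output bias

def detect(window):
    x = features(window)
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, B1)]
    return sigmoid(sum(w * hi for w, hi in zip(W2, h)) + B2)

noise = [0.05, -0.04, 0.03, -0.05, 0.04, -0.03]   # quiet window
event = [0.1, -0.8, 1.5, -1.2, 0.9, -0.4]         # energetic arrival
```

Training would fit W1, B1, W2, B2 by gradient descent on labeled windows; only the forward pass is shown here.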

  6. Development of a state machine sequencer for the Keck Interferometer: evolution, development, and lessons learned using a CASE tool approach

    Science.gov (United States)

    Reder, Leonard J.; Booth, Andrew; Hsieh, Jonathan; Summers, Kellee R.

    2004-09-01

    This paper presents a discussion of the evolution of a sequencer from a simple Experimental Physics and Industrial Control System (EPICS) based sequencer into a complex implementation designed utilizing UML (Unified Modeling Language) methodologies and a Computer Aided Software Engineering (CASE) tool approach. The main purpose of the Interferometer Sequencer (called the IF Sequencer) is to provide overall control of the Keck Interferometer to enable science operations to be carried out by a single operator (and/or observer). The interferometer links the two 10 m telescopes of the W. M. Keck Observatory at Mauna Kea, Hawaii. The IF Sequencer is a high-level, multi-threaded, Harel finite-state-machine software program designed to orchestrate several lower-level hardware and software hard real-time subsystems that must perform their work in a specific and sequential order; the sequencing itself need not be done in hard real-time. Each state machine thread commands either a high-speed real-time multiple-mode embedded controller via CORBA, or slower controllers via EPICS Channel Access interfaces. The overall operation of the system is simplified by the automation. The UML is discussed and our use of it to implement the sequencer is presented. The decision to use the Rhapsody product as our CASE tool is explained and reflected upon. Most importantly, a section on lessons learned is presented, addressing the difficulty of integrating the CASE tool's automatically generated C++ code into a large control system consisting of multiple infrastructures.
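The sequencing idea — states that issue commands in a fixed, legal order — can be sketched as a minimal finite state machine. State and event names below are hypothetical placeholders, not the IF Sequencer's actual design:

```python
# Sketch: a minimal event-driven state machine. Transitions not listed
# in the table are illegal, which is how sequencing order is enforced.

TRANSITIONS = {
    ("IDLE", "start"): "SLEWING",
    ("SLEWING", "on_target"): "FRINGE_SEARCH",
    ("FRINGE_SEARCH", "fringes_found"): "TRACKING",
    ("TRACKING", "stop"): "IDLE",
}

class Sequencer:
    def __init__(self):
        self.state = "IDLE"
        self.log = []   # (from_state, event, to_state) history

    def dispatch(self, event):
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is None:
            raise ValueError(f"event {event!r} illegal in state {self.state}")
        self.log.append((self.state, event, nxt))
        self.state = nxt

seq = Sequencer()
for ev in ["start", "on_target", "fringes_found", "stop"]:
    seq.dispatch(ev)
```

A Harel statechart adds hierarchy, concurrency, and entry/exit actions on top of this flat table, which is what a CASE tool like Rhapsody generates code for.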

  7. Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2013-10-09

    Emerging Big Data applications in areas like e-commerce and energy industry require both online and on-demand queries to be performed over vast and fast data arriving as streams. These present novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high performance online query scheme which in particular deals with the velocity aspect of the 3-V’s of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over realtime data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques to achieve the interactive query processing over realtime data featured by efficient online querying, dynamic stream data persistence and on-demand access. The system architecture is presented and the current implementation status reported.

  8. Top event prevention in complex systems

    International Nuclear Information System (INIS)

    Youngblood, R.W.; Worrell, R.B.

    1995-01-01

    A key step in formulating a regulatory basis for licensing complex and potentially hazardous facilities is identification of a collection of design elements that is necessary and sufficient to achieve the desired level of protection of the public, the workers, and the environment. Here, such a collection of design elements will be called a "prevention set." At the design stage, identifying a prevention set helps to determine what elements to include in the final design. Separately, a prevention-set argument can be used to limit the scope of regulatory oversight to a subset of design elements. This step can be taken during initial review of a design, or later as part of an effort to justify relief from regulatory requirements that are burdensome but provide little risk reduction. This paper presents a systematic approach to the problem of optimally choosing a prevention set.
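One way to read "optimally choosing a prevention set" is as a minimum hitting set over the minimal cut sets of a fault tree: every cut set must contain at least one chosen design element, so no cut set can fail completely. A brute-force sketch with illustrative cut sets (not the paper's actual formulation or algorithm):

```python
from itertools import combinations

# Sketch: smallest set of design elements that "hits" every minimal cut
# set, i.e. at least one element of each cut set is protected.
# The cut sets below are illustrative toy data.

CUT_SETS = [{"pump_A", "pump_B"},
            {"pump_A", "valve_C"},
            {"power_1", "power_2"}]

def minimum_prevention_set(cut_sets):
    elements = sorted(set().union(*cut_sets))
    for size in range(1, len(elements) + 1):      # smallest first
        for combo in combinations(elements, size):
            if all(cs & set(combo) for cs in cut_sets):
                return set(combo)
    return set(elements)

prevention = minimum_prevention_set(CUT_SETS)
```

Minimum hitting set is NP-hard in general, so practical tools rely on Boolean-algebra manipulation or heuristics rather than exhaustive search; the brute force here only illustrates the objective.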

  9. Correlation in chicken between the marker LEI0258 alleles and Major Histocompatibility Complex sequences

    DEFF Research Database (Denmark)

    Chazara, Olympe; Juul-Madsen, Helle Risdahl; Chang, Chi-Seng

    Background The LEI0258 marker is located within the B region of the chicken Major Histocompatibility Complex (MHC), and is surprisingly well associated with serology. Therefore, the correlation between the LEI0258 alleles and the MHC class I and the class II alleles at the level of sequences is w...

  10. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics.

    Science.gov (United States)

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-08-01

    RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis, based on simultaneous alignment and folding, suffers from extreme time complexity of O(n⁶). Subsequently, numerous faster 'Sankoff-style' approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA that do not require sequence-based heuristics have been limited to high complexity (O(n⁴) quartic time). Breaking this barrier, we introduce the novel Sankoff-style algorithm 'sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)', which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff's original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. © The Author 2015. Published by Oxford University Press.

  11. A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment

    Science.gov (United States)

    Wu, Wenyan; Westra, Seth; Leonard, Michael

    2017-04-01

    Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood-producing mechanisms, such as extreme rainfall. Therefore, storm surge has always been a research focus in coastal flood risk assessment. Often, numerical models have been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015) and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a lot of input information, and difficulties arise when there are not sufficient data available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge based on historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine the long-term trend in the change of storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). In these studies, often only the peak of surge events is used, which results in the loss of dynamic information within a tidal cycle or surge event (i.e. a time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine the different attributes (e.g. peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High quality hourly storm surge records from 15 tide gauges around Australia were examined. It was found that there is significant location and seasonal variability in the peak and duration of storm surge events, which provides additional insights into coastal flood risk. In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability
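
The two two-parameter basis functions can be sketched as below. The exact parameterisation used in the study is not given in the abstract, so the functional forms and the peak/duration values here are illustrative stand-ins:

```python
import numpy as np

def exp_surge(t, peak, duration):
    """Two-parameter exponential basis function: symmetric rise and
    decay around the surge peak at t = 0 (decay rate tied to duration;
    an assumed form for illustration)."""
    return peak * np.exp(-np.abs(t) / (duration / 4.0))

def tri_surge(t, peak, duration):
    """Two-parameter triangular basis function: linear rise and fall,
    zero outside +/- duration/2 around the peak."""
    return peak * np.clip(1.0 - np.abs(t) / (duration / 2.0), 0.0, None)

t = np.linspace(-24, 24, 97)                      # hours around the peak
surge_e = exp_surge(t, peak=0.8, duration=12.0)   # surge height in metres
surge_t = tri_surge(t, peak=0.8, duration=12.0)
```

Because each event is reduced to two fitted parameters, large ensembles of synthetic surge time series can be simulated cheaply for joint-probability flood analysis.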

  12. A combination of LongSAGE with Solexa sequencing is well suited to explore the depth and the complexity of transcriptome

    Directory of Open Access Journals (Sweden)

    Scoté-Blachon Céline

    2008-09-01

    Full Text Available Abstract Background "Open" transcriptome analysis methods allow the study of gene expression without a priori knowledge of the transcript sequences. As of now, SAGE (Serial Analysis of Gene Expression), LongSAGE and MPSS (Massively Parallel Signature Sequencing) are the most widely used methods for "open" transcriptome analysis. Both LongSAGE and MPSS rely on the isolation of 21 bp tag sequences from each transcript. In contrast to LongSAGE, the high-throughput sequencing method used in MPSS enables the rapid sequencing of very large libraries containing several million tags, allowing deep transcriptome analysis. However, a bias in the complexity of the transcriptome representation obtained by MPSS was recently uncovered. Results In order to perform a deep analysis of the mouse hypothalamus transcriptome while avoiding the limitation introduced by MPSS, we combined LongSAGE with the Solexa sequencing technology and obtained a library of more than 11 million tags. We then compared it to a LongSAGE library of mouse hypothalamus sequenced with the Sanger method. Conclusion We found that Solexa sequencing technology combined with LongSAGE is perfectly suited for deep transcriptome analysis. In contrast to MPSS, it gives a representation of transcriptome complexity as reliable as that of a LongSAGE library sequenced by the Sanger method.

  13. Pairwise local structural alignment of RNA sequences with sequence similarity less than 40%

    DEFF Research Database (Denmark)

    Havgaard, Jakob Hull; Lyngsø, Rune B.; Stormo, Gary D.

    2005-01-01

    detect two genes with low sequence similarity, where the genes are part of a larger genomic region. Results: Here we present such an approach for pairwise local alignment which is based on FILDALIGN and the Sankoff algorithm for simultaneous structural alignment of multiple sequences. We include...... the ability to conduct mutual scans of two sequences of arbitrary length while searching for common local structural motifs of some maximum length. This drastically reduces the complexity of the algorithm. The scoring scheme includes structural parameters corresponding to those available for free energy....... The structure prediction performance for a family is typically around 0.7 using Matthews correlation coefficient. In case (2), the algorithm is successful at locating RNA families with an average sensitivity of 0.8 and a positive predictive value of 0.9 using a BLAST-like hit selection scheme. Availability...

  14. Disentangling the complex evolutionary history of the Western Palearctic blue tits (Cyanistes spp.) - phylogenomic analyses suggest radiation by multiple colonization events and subsequent isolation.

    Science.gov (United States)

    Stervander, Martin; Illera, Juan Carlos; Kvist, Laura; Barbosa, Pedro; Keehnen, Naomi P; Pruisscher, Peter; Bensch, Staffan; Hansson, Bengt

    2015-05-01

    Isolated islands and their often unique biota continue to play key roles for understanding the importance of drift, genetic variation and adaptation in the process of population differentiation and speciation. One island system that has inspired and intrigued evolutionary biologists is the blue tit complex (Cyanistes spp.) in Europe and Africa, in particular the complex evolutionary history of the multiple genetically distinct taxa of the Canary Islands. Understanding Afrocanarian colonization events is of particular importance because of recent unconventional suggestions that these island populations acted as source of the widespread population in mainland Africa. We investigated the relationship between mainland and island blue tits using a combination of Sanger sequencing at a population level (20 loci; 12 500 nucleotides) and next-generation sequencing of single population representatives (>3 200 000 nucleotides), analysed in coalescence and phylogenetic frameworks. We found (i) that Afrocanarian blue tits are monophyletic and represent four major clades, (ii) that the blue tit complex has a continental origin and that the Canary Islands were colonized three times, (iii) that all island populations have low genetic variation, indicating low long-term effective population sizes and (iv) that populations on La Palma and in Libya represent relicts of an ancestral North African population. Further, demographic reconstructions revealed (v) that the Canary Islands, conforming to traditional views, hold sink populations, which have not served as source for back colonization of the African mainland. Our study demonstrates the importance of complete taxon sampling and an extensive multimarker study design to obtain robust phylogeographical inferences. © 2015 John Wiley & Sons Ltd.

  15. Involuntary memory chaining versus event cueing: Which is a better indicator of autobiographical memory organisation?

    Science.gov (United States)

    Mace, John H; Clevinger, Amanda M; Martin, Cody

    2010-11-01

    Involuntary memory chains are spontaneous recollections of the past that occur in a sequence. Much like semantic memory priming, this memory phenomenon has provided some insights into the nature of associations in autobiographical memory. The event-cueing procedure (a laboratory-based memory sequencing task) has also provided some insights into the nature of autobiographical memory organisation. However, while both of these memory-sequencing phenomena have exhibited the same types of memory associations (conceptual associations and general-event or temporal associations), both have also produced discrepant results with respect to the relative proportions of such associations. This study investigated the possibility that the results from event cueing are artefacts of various memory production responses. Using a number of different approaches we demonstrated that these memory production responses cause overestimates of general-event association. We conclude that for this reason, the data from involuntary memory chains provide a better picture of the organisation of autobiographical memory.

  16. A study for the sequence of events (SOE) system on the nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byung Chae; Jeon, Jong Sun; Lee, Sun Sung; Lee, Kyung Ho; Lee, Byung Ju; Sohn, Kwang Young [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    It is important to identify where and why an event or a trip occurs in a Nuclear Power Plant (NPP) and to provide a proper resolution for the situation. In order to analyze the root cause of trouble after events or trips occur, a Sequence of Events (SOE) system has been adopted in Korean NPPs to acquire sequential information on where and when an event or a trip takes place. The SOE system of the UCN 3 and 4 plants, which is included in the Plant Data Acquisition System (PDAS), shares the 3205 computer and system software with PDAS. Sharing the computer H/W and S/W, however, requires a more complicated process to provide the event or trip signals due to the inherent characteristics of the shared system. Moreover, there is a high potential for collisions between synchronization signals and data transmitted to the Plant Computer System (PCS) when the synchronization signals are sent from the PCS to the three SOE processors. When such a collision happens the SOE system breaks down, making it impossible to analyze the sequence of events or trips. An independent SOE system composed of a single processor is proposed in this paper. To begin with, analyses of the hardware and software of the SOE and PDAS systems of UCN 3 and 4 were performed to identify the problems and possible resolutions. To test the new SOE system, a VMEbus, a VM30 CPU, a change-of-status I/O card and the OS-9 operating system were adopted, and the test system was verified through simulation: simulated event signals were given to the test system as inputs, and the outputs were monitored on a PC to verify that the sequential event logging function works correctly. In conclusion, this report is expected to provide the technical background for future improvement and replacement of NPP PDAS and SOE systems. 18 tabs., 33 figs., 26 refs. (Author)
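
The core of an SOE system is a scanner that records only changes of state, each with a timestamp, so the first-out signal of a trip can be reconstructed in order. A minimal sketch (point names and timestamps are hypothetical; this is not the UCN 3 and 4 design):

```python
import time

class SOELogger:
    """Minimal sequence-of-events recorder: logs a point only when its
    state changes, with a timestamp, so the chronological order of trip
    signals can be analyzed afterwards."""
    def __init__(self):
        self.last = {}   # point id -> last known state
        self.log = []    # (timestamp, point id, new state)

    def scan(self, point, state, ts=None):
        if self.last.get(point) != state:        # change of state only
            self.last[point] = state
            self.log.append((ts if ts is not None else time.time(),
                             point, state))

soe = SOELogger()
soe.scan("RCP_A_TRIP", 0, ts=0.000)   # initial state
soe.scan("RCP_A_TRIP", 1, ts=0.012)   # first-out event
soe.scan("RX_TRIP",    1, ts=0.027)   # subsequent reactor trip
soe.scan("RX_TRIP",    1, ts=0.040)   # unchanged -> not logged
```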

  17. A study for the sequence of events (SOE) system on the nuclear power plant

    International Nuclear Information System (INIS)

    Lee, Byung Chae; Jeon, Jong Sun; Lee, Sun Sung; Lee, Kyung Ho; Lee, Byung Ju; Sohn, Kwang Young

    1996-06-01

    It is important to identify where and why an event or a trip occurs in a Nuclear Power Plant (NPP) and to provide a proper resolution for the situation. In order to analyze the root cause of trouble after events or trips occur, a Sequence of Events (SOE) system has been adopted in Korean NPPs to acquire sequential information on where and when an event or a trip takes place. The SOE system of the UCN 3 and 4 plants, which is included in the Plant Data Acquisition System (PDAS), shares the 3205 computer and system software with PDAS. Sharing the computer H/W and S/W, however, requires a more complicated process to provide the event or trip signals due to the inherent characteristics of the shared system. Moreover, there is a high potential for collisions between synchronization signals and data transmitted to the Plant Computer System (PCS) when the synchronization signals are sent from the PCS to the three SOE processors. When such a collision happens the SOE system breaks down, making it impossible to analyze the sequence of events or trips. An independent SOE system composed of a single processor is proposed in this paper. To begin with, analyses of the hardware and software of the SOE and PDAS systems of UCN 3 and 4 were performed to identify the problems and possible resolutions. To test the new SOE system, a VMEbus, a VM30 CPU, a change-of-status I/O card and the OS-9 operating system were adopted, and the test system was verified through simulation: simulated event signals were given to the test system as inputs, and the outputs were monitored on a PC to verify that the sequential event logging function works correctly. In conclusion, this report is expected to provide the technical background for future improvement and replacement of NPP PDAS and SOE systems. 18 tabs., 33 figs., 26 refs. (Author)

  18. Towards a Strategic Approach to Special Events Management in the Post-9/11 World

    National Research Council Canada - National Science Library

    Jones, G. B

    2005-01-01

    ... by the Department of Homeland Security and the FBI to categorize and resource special events, and it evaluates whether the current approach to major event planning is sufficient for contemporary counterterrorism challenges...

  19. Crystal structure of importin-α complexed with a classic nuclear localization sequence obtained by oriented peptide library screening

    Energy Technology Data Exchange (ETDEWEB)

    Takeda, A.A.S.; Fontes, M.R.M. [UNESP, Universidade Estadual Paulista, Botucatu, SP (Brazil); Yang, S.N.Y. [University of Melbourne, Melbourne (Australia); Harris, J.M. [Queensland University of Technology, Brisbane (Australia); Jans, D.A. [Monash University, Clayton (Australia); Kobe, B. [University of Queensland, Brisbane, QU (Australia)

    2012-07-01

    Full text: Importin-α (Impα) plays a role in the classical nuclear import pathway, binding to cargo proteins with activities in the nucleus. Different Impα paralogs responsible for specific cargos can be found in a single organism. The cargos contain nuclear localization sequences (NLSs), which are characterized by one or two clusters of basic amino acids (monopartite and bipartite NLSs, respectively). In this work we present the crystal structure of Impα from M. musculus (residues 70-529, lacking the autoinhibitory domain) bound to an NLS peptide (pepTM). The peptide corresponds to the optimal sequence obtained by an oriented peptide library experiment designed to probe the specificity of the major NLS binding site. The peptide library used five degenerate positions and identified the sequence KKKRR as the optimal sequence for binding to this site for mouse Impα (70-529). The protein was obtained using an E. coli expression system and purified by affinity chromatography followed by ion exchange chromatography. A single crystal of the Impα-pepTM complex was grown by the hanging drop method. The data were collected using the Synchrotron Radiation Source LNLS, Brazil, and processed to 2.3 Å resolution. Molecular replacement techniques were used to determine the crystal structure. Electron density corresponding to the peptide was present in both the major and minor binding sites. The peptide binds to Impα similarly to the simian virus 40 (SV40) large tumour (T)-antigen NLS. Binding assays confirmed that the peptide binds Impα with low nM affinities. This is the first time that structural information has been linked to an oriented peptide library screening approach for importin-α; the results will contribute to understanding of the sequence determinants of classical NLSs, and may help identify as yet unidentified classical NLSs in novel proteins. (author)

  20. The influence of spherical cavity surface charge distribution on the sequence of partial discharge events

    International Nuclear Information System (INIS)

    Illias, Hazlee A; Chen, George; Lewin, Paul L

    2011-01-01

    In this work, a model representing partial discharge (PD) behaviour of a spherical cavity within a homogeneous dielectric material has been developed to study the influence of cavity surface charge distribution on the electric field distribution in both the cavity and the material itself. The charge accumulation on the cavity surface after a PD event and charge movement along the cavity wall under the influence of electric field magnitude and direction has been found to affect the electric field distribution in the whole cavity and in the material. This in turn affects the likelihood of any subsequent PD activity in the cavity and the whole sequence of PD events. The model parameters influencing cavity surface charge distribution can be readily identified; they are the cavity surface conductivity, the inception field and the extinction field. Comparison of measurement and simulation results has been undertaken to validate the model.

  1. The influence of spherical cavity surface charge distribution on the sequence of partial discharge events

    Energy Technology Data Exchange (ETDEWEB)

    Illias, Hazlee A [Department of Electrical Engineering, Faculty of Engineering, University of Malaya, 50603 Kuala Lumpur (Malaysia); Chen, George; Lewin, Paul L, E-mail: h.illias@um.edu.my [Tony Davies High Voltage Laboratory, School of Electronics and Computer Science, University of Southampton, Southampton, SO17 1BJ (United Kingdom)

    2011-06-22

    In this work, a model representing partial discharge (PD) behaviour of a spherical cavity within a homogeneous dielectric material has been developed to study the influence of cavity surface charge distribution on the electric field distribution in both the cavity and the material itself. The charge accumulation on the cavity surface after a PD event and charge movement along the cavity wall under the influence of electric field magnitude and direction has been found to affect the electric field distribution in the whole cavity and in the material. This in turn affects the likelihood of any subsequent PD activity in the cavity and the whole sequence of PD events. The model parameters influencing cavity surface charge distribution can be readily identified; they are the cavity surface conductivity, the inception field and the extinction field. Comparison of measurement and simulation results has been undertaken to validate the model.

  2. Long aftershock sequences in North China and Central US: implications for hazard assessment in mid-continents

    Science.gov (United States)

    Liu, Mian; Luo, Gang; Wang, Hui; Stein, Seth

    2014-02-01

    Because seismic activity within mid-continents is usually much lower than that along plate boundary zones, even small earthquakes can cause widespread concern, especially when these events occur in the source regions of previous large earthquakes. However, these small earthquakes may be just aftershocks that continue for decades or even longer. The recent seismicity in the Tangshan region in North China likely consists of aftershocks of the 1976 Great Tangshan earthquake. The current earthquake sequence in the New Madrid seismic zone in the central United States, which includes a cluster of M ~ 7.0 events in 1811-1812 and a number of similar events in the past millennium, is believed to result from recent fault reactivation that releases pre-stored strain energy in the crust. If so, this earthquake sequence is similar to aftershocks in that the rates of energy release should decay with time and the sequence of earthquakes will eventually end. We use simple physical analysis and numerical simulations to show that the current sequence of large earthquakes in the New Madrid fault zone is likely ending or has ended. Recognizing that mid-continental earthquakes have long aftershock sequences and complex spatiotemporal occurrences is critical to improving hazard assessments.
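
The time decay of aftershock rates invoked above is conventionally described by the modified Omori law. A sketch with purely illustrative parameter values (K, c, p are not fitted to Tangshan or New Madrid data):

```python
import numpy as np

def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate n(t) = K / (t + c)^p decays
    as a power law of time t since the mainshock. Parameter values
    here are illustrative, not fitted to any real sequence."""
    return K / (t + c) ** p

# rate of aftershocks per year at 1, 10 and 100 years after the mainshock:
years = np.array([1.0, 10.0, 100.0])
rates = omori_rate(years)
```

In mid-continents, the small p-exponent tail of such a decay means detectable aftershock activity can persist for decades to centuries, which is the crux of the hazard-assessment argument.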

  3. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit a dependency on flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both the systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with a high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.

  4. QRS complex detection based on continuous density hidden Markov models using univariate observations

    Science.gov (United States)

    Sotelo, S.; Arenas, W.; Altuve, M.

    2018-04-01

    In the electrocardiogram (ECG), the detection of QRS complexes is a fundamental step in the ECG signal processing chain, since it allows the determination of other characteristic waves of the ECG and provides information about heart rate variability. In this work, an automatic QRS complex detector based on continuous density hidden Markov models (HMMs) is proposed. HMMs were trained using univariate observation sequences taken either from QRS complexes or from their derivatives. The detection approach is based on comparison of the log-likelihood of the observation sequence with a fixed threshold. A sliding window was used to obtain the observation sequence to be evaluated by the model. The threshold was optimized using receiver operating characteristic curves. Sensitivity (Sen), specificity (Spc) and F1 score were used to evaluate the detection performance. The approach was validated using ECG recordings from the MIT-BIH Arrhythmia database. A 6-fold cross-validation shows that the best detection performance was achieved with a 2-state HMM trained with QRS complex sequences (Sen = 0.668, Spc = 0.360 and F1 = 0.309). We conclude that these univariate sequences provide enough information to characterize the QRS complex dynamics with HMMs. Future work is directed towards the use of multivariate observations to increase the detection performance.
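
The scoring step described above can be sketched with a hand-rolled forward algorithm for a 2-state Gaussian HMM. The model parameters below are invented stand-ins for ones that would be learned from QRS training windows, and the windows are toy data, not MIT-BIH samples:

```python
import numpy as np

def gaussian_loglik(x, mu, var):
    # log N(x; mu, var), broadcast over the per-state means/variances
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def forward_loglik(obs, pi, A, mu, var):
    """Log-likelihood of a 1-D observation sequence under a Gaussian
    HMM, computed with the forward algorithm in log space."""
    log_alpha = np.log(pi) + gaussian_loglik(obs[0], mu, var)
    for x in obs[1:]:
        log_alpha = (np.logaddexp.reduce(log_alpha[:, None] + np.log(A),
                                         axis=0)
                     + gaussian_loglik(x, mu, var))
    return np.logaddexp.reduce(log_alpha)

# hypothetical 2-state model: low- and high-amplitude emission states
pi  = np.array([0.5, 0.5])
A   = np.array([[0.8, 0.2], [0.2, 0.8]])
mu  = np.array([0.2, 1.0])
var = np.array([0.05, 0.05])

qrs_window      = np.array([0.1, 0.4, 1.0, 0.9, 0.2])  # QRS-like shape
artifact_window = np.array([3.0, 3.1, 2.9, 3.0, 3.2])  # unlike either state
score_qrs      = forward_loglik(qrs_window, pi, A, mu, var)
score_artifact = forward_loglik(artifact_window, pi, A, mu, var)
# a sliding window is flagged as a QRS complex when its score exceeds
# an ROC-tuned threshold; the QRS-like window scores far higher here
```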

  5. Discrimination of Rock Fracture and Blast Events Based on Signal Complexity and Machine Learning

    Directory of Open Access Journals (Sweden)

    Zilong Zhou

    2018-01-01

    Full Text Available The automatic discrimination of rock fracture and blast events is complex and challenging due to their similar waveform characteristics. To solve this problem, a new method based on signal complexity analysis and machine learning is proposed in this paper. First, the permutation entropy values of signals at different scale factors are calculated to reflect the complexity of the signals and are assembled into a feature vector set. Second, based on the feature vector set, a back-propagation neural network (BPNN) is applied as the machine learning component to establish a discriminator for rock fracture and blast events. Then, to evaluate the classification performance of the new method, the classification accuracies of a support vector machine (SVM), a naive Bayes classifier, and the new method are compared, and the receiver operating characteristic (ROC) curves are also analyzed. The results show that the new method obtains the best classification performance. In addition, the influence of the scale factor q and the number of training samples n on the discrimination results is discussed. It is found that the classification accuracy of the new method reaches its highest value when q = 8–15 or 8–20 and n = 140.
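
The feature-extraction step can be sketched as below: permutation entropy at several scale factors, assembled into a feature vector for the classifier. The order, delay and scale choices here are illustrative, not the paper's tuned values:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalised permutation entropy of a 1-D signal: the Shannon
    entropy of ordinal patterns of length `order`, scaled into [0, 1].
    Regular signals score low; irregular signals score near 1."""
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + order * delay:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n
    return -np.sum(probs * np.log(probs)) / np.log(factorial(order))

def multiscale_pe(x, order=3, scales=range(1, 6)):
    """Feature vector of permutation entropies at several scale factors
    (coarse-graining by non-overlapping means), suitable as input to a
    classifier such as a BPNN."""
    feats = []
    for s in scales:
        m = len(x) // s
        coarse = np.asarray(x[:m * s]).reshape(m, s).mean(axis=1)
        feats.append(permutation_entropy(coarse, order=order))
    return np.array(feats)

rng = np.random.default_rng(0)
noise = rng.normal(size=2000)                      # irregular waveform
tone  = np.sin(np.linspace(0, 40 * np.pi, 2000))   # regular waveform
pe_noise = permutation_entropy(noise)
pe_tone  = permutation_entropy(tone)
feats    = multiscale_pe(noise)
```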

  6. A fast one-chip event-preprocessor and sequencer for the Simbol-X Low Energy Detector

    Science.gov (United States)

    Schanz, T.; Tenzer, C.; Maier, D.; Kendziorra, E.; Santangelo, A.

    2010-12-01

    We present FPGA-based digital camera electronics consisting of an Event-Preprocessor (EPP) for on-board data preprocessing and a related Sequencer (SEQ) that generates the necessary signals to control the readout of the detector. The device was originally designed for the Simbol-X Low Energy Detector (LED). The EPP operates on 64×64 pixel images and has a real-time processing capability of more than 8000 frames per second. The already working releases of the EPP and the SEQ are now combined into one Digital-Camera-Controller-Chip (D3C).

  7. A fast one-chip event-preprocessor and sequencer for the Simbol-X Low Energy Detector

    Energy Technology Data Exchange (ETDEWEB)

    Schanz, T., E-mail: schanz@astro.uni-tuebingen.d [Kepler Center for Astro- and Particlephysics, Institut fuer Astronomie und Astrophysik Tuebingen, Sand 1, 72076 Tuebingen (Germany); Tenzer, C., E-mail: tenzer@astro.uni-tuebingen.d [Kepler Center for Astro- and Particlephysics, Institut fuer Astronomie und Astrophysik Tuebingen, Sand 1, 72076 Tuebingen (Germany); Maier, D.; Kendziorra, E.; Santangelo, A. [Kepler Center for Astro- and Particlephysics, Institut fuer Astronomie und Astrophysik Tuebingen, Sand 1, 72076 Tuebingen (Germany)

    2010-12-11

    We present FPGA-based digital camera electronics consisting of an Event-Preprocessor (EPP) for on-board data preprocessing and a related Sequencer (SEQ) that generates the necessary signals to control the readout of the detector. The device was originally designed for the Simbol-X Low Energy Detector (LED). The EPP operates on 64×64 pixel images and has a real-time processing capability of more than 8000 frames per second. The already working releases of the EPP and the SEQ are now combined into one Digital-Camera-Controller-Chip (D3C).

  8. A fast one-chip event-preprocessor and sequencer for the Simbol-X Low Energy Detector

    International Nuclear Information System (INIS)

    Schanz, T.; Tenzer, C.; Maier, D.; Kendziorra, E.; Santangelo, A.

    2010-01-01

    We present FPGA-based digital camera electronics consisting of an Event-Preprocessor (EPP) for on-board data preprocessing and a related Sequencer (SEQ) that generates the necessary signals to control the readout of the detector. The device was originally designed for the Simbol-X Low Energy Detector (LED). The EPP operates on 64×64 pixel images and has a real-time processing capability of more than 8000 frames per second. The already working releases of the EPP and the SEQ are now combined into one Digital-Camera-Controller-Chip (D3C).

  9. Evolution of Sequence Type 4821 Clonal Complex Meningococcal Strains in China from Prequinolone to Quinolone Era, 1972–2013

    Science.gov (United States)

    Guo, Qinglan; Mustapha, Mustapha M.; Chen, Mingliang; Qu, Di; Zhang, Xi; Harrison, Lee H.

    2018-01-01

    The expansion of hypervirulent sequence type 4821 clonal complex (CC4821) lineage Neisseria meningitidis bacteria has led to a shift in meningococcal disease epidemiology in China, from serogroup A (MenA) to MenC. Knowledge of the evolution and genetic origin of the emergent MenC strains is limited. In this study, we subjected 76 CC4821 isolates collected across China during 1972–1977 and 2005–2013 to phylogenetic analysis, traditional genotyping, or both. We show that successive recombination events within genes encoding surface antigens and acquisition of quinolone resistance mutations possibly played a role in the emergence of CC4821 as an epidemic clone in China. MenC and MenB CC4821 strains have spread across China and have been detected in several countries in different continents. Capsular switches involving serogroups B and C occurred among epidemic strains, raising concerns regarding possible increases in MenB disease, given that vaccines in use in China do not protect against MenB. PMID:29553310

  10. Integrating network, sequence and functional features using machine learning approaches towards identification of novel Alzheimer genes.

    Science.gov (United States)

    Jamal, Salma; Goyal, Sukriti; Shanker, Asheesh; Grover, Abhinav

    2016-10-18

    Alzheimer's disease (AD) is a complex progressive neurodegenerative disorder commonly characterized by short-term memory loss. At present, no effective therapeutic treatments exist that can completely cure this disease. The cause of Alzheimer's is still unclear; however, genetic factors are among the major contributors to AD pathogenesis, and around 70% of disease risk is assumed to be due to the large number of genes involved. Although genetic association studies have revealed a number of potential AD susceptibility genes, there remains a need to identify further AD-associated genes and therapeutic targets in order to better understand the disease-causing mechanisms of Alzheimer's and to develop effective AD therapeutics. In the present study, we have used a machine learning approach to identify candidate AD-associated genes by integrating topological properties of the genes from protein-protein interaction networks, sequence features and functional annotations. We also used a molecular docking approach to screen already known anti-Alzheimer drugs against the novel predicted probable targets of AD, and observed that an investigational drug, AL-108, had high affinity for the majority of the possible therapeutic targets. Furthermore, we performed molecular dynamics simulations and MM/GBSA calculations on the docked complexes to validate our preliminary findings. To the best of our knowledge, this is the first comprehensive study of its kind for the identification of putative Alzheimer-associated genes using machine learning approaches, and we propose that such computational studies can improve our understanding of the core etiology of AD, which could lead to the development of effective anti-Alzheimer drugs.

  11. Identification and characterization of earthquake clusters: a comparative analysis for selected sequences in Italy

    Science.gov (United States)

    Peresan, Antonella; Gentili, Stefania

    2017-04-01

    Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shock occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent Central Italy destructive earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalogue, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershock identification. 
The study shows that the data-driven approach, based on the nearest-neighbor distances, can be satisfactorily applied to decompose the seismic
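
The nearest-neighbor clustering referred to above can be sketched in a few lines. The following is a minimal illustration of the space-time-magnitude distance of Zaliapin and Ben-Zion, on which such data-driven declustering is commonly based; the parameter values (b, d) and the toy events are invented for illustration and do not reproduce the actual OGS-CRS analysis.

```python
import math

def nn_distance(parent, child, b=1.0, d=1.6):
    """Space-time-magnitude distance eta = t * r**d * 10**(-b*m).
    b and d are illustrative parameter choices, not fitted values."""
    t = child["t"] - parent["t"]  # inter-event time (e.g. years)
    if t <= 0:
        return math.inf  # only earlier events can be parents
    r = math.hypot(child["x"] - parent["x"], child["y"] - parent["y"])  # km
    return t * (r ** d) * 10 ** (-b * parent["m"])

def nearest_neighbors(events):
    """For each event, find the nearest neighbor among earlier events."""
    links = {}
    for j, child in enumerate(events):
        links[j] = min(
            ((nn_distance(p, child), i) for i, p in enumerate(events[:j])),
            default=(math.inf, None),
        )
    return links

events = [
    {"t": 0.0,  "x": 0.0,   "y": 0.0,   "m": 5.0},  # candidate mainshock
    {"t": 0.01, "x": 1.0,   "y": 0.5,   "m": 3.0},  # close in space-time -> small eta
    {"t": 2.0,  "x": 200.0, "y": 150.0, "m": 3.0},  # distant -> large eta
]
links = nearest_neighbors(events)
# A threshold eta0 then separates clustered events (eta < eta0) from background.
```

Events whose nearest-neighbor distance falls below a threshold are linked to their parent, forming the cluster trees whose largest member is taken as the mainshock.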

  12. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

    Both relative and absolute quantifications are possible in species quantification when single-copy genomic DNA is used. However, amplification of single-copy genomic DNA does not allow a limit of detection as low as that obtained from amplification of repetitive sequences. Amplification...... of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification because the number of repetitive sequences is unknown. A promising approach was developed in which data from amplification of repetitive sequences were used in relative quantification of species...... to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control....
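
The relative-quantification idea above rests on comparing qPCR signals between a species-specific assay and a reference assay. Since the specific model of Ballin et al. is truncated in this record, the sketch below only illustrates the standard efficiency-based calculation from cycle-threshold (Ct) values, with invented numbers.

```python
def relative_amount(ct_target, ct_reference, efficiency=2.0):
    """Amount of target relative to reference from qPCR Ct values.
    efficiency=2.0 assumes perfect doubling per cycle; a lower Ct
    means more starting template."""
    return efficiency ** (ct_reference - ct_target)

# Hypothetical chicken-specific repetitive-element assay vs. a
# total-DNA reference assay on the same binary chicken/pig mixture:
ratio = relative_amount(ct_target=24.0, ct_reference=22.0)  # 2**(-2) = 0.25
```

With repetitive targets the per-genome copy number enters this ratio as an unknown factor, which is exactly the complication for relative quantification that the abstract notes.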

  13. Modeling crowd behavior based on the discrete-event multiagent approach

    OpenAIRE

    Лановой, Алексей Феликсович; Лановой, Артем Алексеевич

    2014-01-01

    The crowd is a temporary, relatively unorganized group of people who are in close physical contact with each other. The individual behavior of a human outside the crowd is determined by many factors associated with his intellectual activities, but inside the crowd a person loses his identity and begins to obey much simpler laws of behavior. One approach to the construction of a multi-level model of the crowd using the discrete-event multiagent approach is described in the paper. Based on this analysi...

  14. Bayesian inference and interpretation of centroid moment tensors of the 2016 Kumamoto earthquake sequence, Kyushu, Japan

    Science.gov (United States)

    Hallo, Miroslav; Asano, Kimiyuki; Gallovič, František

    2017-09-01

    On April 16, 2016, Kumamoto prefecture in the Kyushu region, Japan, was devastated by a shallow M JMA 7.3 earthquake. The series of foreshocks started with an M JMA 6.5 foreshock 28 h before the mainshock. The foreshocks originated in the Hinagu fault zone, which intersects the mainshock's Futagawa fault zone; hence, the tectonic background of this earthquake sequence is rather complex. Here we infer centroid moment tensors (CMTs) for 11 events with M JMA between 4.8 and 6.5, using strong-motion records of the K-NET, KiK-net and F-net networks. We use the upgraded Bayesian full-waveform inversion code ISOLA-ObsPy, which takes into account the uncertainty of the velocity model. Such an approach allows us to reliably assess the uncertainty of the CMT parameters, including the centroid position. The solutions show significant systematic spatial and temporal variations throughout the sequence. Foreshocks are right-lateral, steeply dipping strike-slip events connected to the NE-SW shear zone. Those located close to the intersection of the Hinagu and Futagawa fault zones dip slightly to the ESE, while those in the southern area dip to the WNW. Contrarily, aftershocks are mostly normal dip-slip events related to the N-S extensional tectonic regime. Most of the deviatoric moment tensors contain only a minor CLVD component, which can be attributed to the velocity model uncertainty. Nevertheless, two of the CMTs involve a significant CLVD component, which may reflect a complex rupture process. Decomposition of those moment tensors into two pure shear moment tensors suggests combined right-lateral strike-slip and normal dip-slip mechanisms, consistent with the tectonic setting of the intersection of the Hinagu and Futagawa fault zones.
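
The DC/CLVD split mentioned above can be illustrated numerically. The sketch below uses the standard eigenvalue-based CLVD measure (epsilon = 0 for a pure double couple, +/-0.5 for a pure CLVD); it is a generic convention, not the specific two-shear-tensor decomposition applied by the authors.

```python
import numpy as np

def dc_clvd_fractions(M):
    """Split a symmetric deviatoric moment tensor (3x3) into
    double-couple (DC) and CLVD percentages. Illustrative sketch."""
    eig = np.linalg.eigvalsh(M)           # eigenvalues, ascending
    dev = eig - eig.mean()                # remove any isotropic part
    e_min = dev[np.argmin(np.abs(dev))]   # smallest-magnitude eigenvalue
    e_max = dev[np.argmax(np.abs(dev))]   # largest-magnitude eigenvalue
    eps = e_min / abs(e_max)              # 0 = pure DC, +/-0.5 = pure CLVD
    clvd = 200.0 * abs(eps)               # percent CLVD
    return 100.0 - clvd, clvd             # (percent DC, percent CLVD)

# Pure double couple: eigenvalues (-1, 0, 1) -> 100% DC, 0% CLVD
dc, clvd = dc_clvd_fractions(np.diag([-1.0, 0.0, 1.0]))
```

A large CLVD percentage from such a decomposition is what motivates interpreting a CMT as the sum of two pure shear sources, as in the abstract.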

  15. Fusarium diversity in soil using a specific molecular approach and a cultural approach.

    Science.gov (United States)

    Edel-Hermann, Véronique; Gautheron, Nadine; Mounier, Arnaud; Steinberg, Christian

    2015-04-01

    Fusarium species are ubiquitous in soil. They cause plant and human diseases and can produce mycotoxins. Surveys of Fusarium species diversity in environmental samples usually rely on laborious culture-based methods. In the present study, we have developed a molecular method to analyze Fusarium diversity directly from soil DNA. We designed primers targeting the translation elongation factor 1-alpha (EF-1α) gene and demonstrated their specificity toward Fusarium using a large collection of fungi. We used the specific primers to construct a clone library from three contrasting soils. Sequence analysis confirmed the specificity of the assay, with 750 clones identified as Fusarium and distributed among eight species or species complexes. The Fusarium oxysporum species complex (FOSC) was the most abundant one in the three soils, followed by the Fusarium solani species complex (FSSC). We then compared our molecular approach results with those obtained by isolating Fusarium colonies on two culture media and identifying species by sequencing part of the EF-1α gene. The 750 isolates were distributed into eight species or species complexes, with the same dominant species as with the cloning method. Sequence diversity was much higher in the clone library than in the isolate collection. The molecular approach proved to be a valuable tool to assess Fusarium diversity in environmental samples. Combined with high throughput sequencing, it will allow for in-depth analysis of large numbers of samples. Published by Elsevier B.V.

  16. Modern approaches to agent-based complex automated negotiation

    CERN Document Server

    Bai, Quan; Ito, Takayuki; Zhang, Minjie; Ren, Fenghui; Aydoğan, Reyhan; Hadfi, Rafik

    2017-01-01

    This book addresses several important aspects of complex automated negotiations and introduces a number of modern approaches for facilitating agents to conduct complex negotiations. It demonstrates that autonomous negotiation is one of the most important areas in the field of autonomous agents and multi-agent systems. Further, it presents complex automated negotiation scenarios that involve negotiation encounters that may have, for instance, a large number of agents, a large number of issues with strong interdependencies and/or real-time constraints.

  17. Probabilistic Methods for Processing High-Throughput Sequencing Signals

    DEFF Research Database (Denmark)

    Sørensen, Lasse Maretty

    High-throughput sequencing has the potential to answer many of the big questions in biology and medicine. It can be used to determine the ancestry of species, to chart complex ecosystems and to understand and diagnose disease. However, going from raw sequencing data to biological or medical insig....... By estimating the genotypes on a set of candidate variants obtained from both a standard mapping-based approach as well as de novo assemblies, we are able to find considerably more structural variation than previous studies...... for reconstructing transcript sequences from RNA sequencing data. The method is based on a novel sparse prior distribution over transcript abundances and is markedly more accurate than existing approaches. The second chapter describes a new method for calling genotypes from a fixed set of candidate variants....... The method queries the reads using a graph representation of the variants and thereby mitigates the reference bias that characterises standard genotyping methods. In the last chapter, we apply this method to call the genotypes of 50 deeply sequenced parent-offspring trios from the GenomeDenmark project...

  18. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  19. A Multifaceted Mathematical Approach for Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, F.; Anitescu, M.; Bell, J.; Brown, D.; Ferris, M.; Luskin, M.; Mehrotra, S.; Moser, B.; Pinar, A.; Tartakovsky, A.; Willcox, K.; Wright, S.; Zavala, V.

    2012-03-07

    Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.

  20. Long span DNA paired-end-tag (DNA-PET) sequencing strategy for the interrogation of genomic structural mutations and fusion-point-guided reconstruction of amplicons.

    Directory of Open Access Journals (Sweden)

    Fei Yao

    Structural variations (SVs) contribute significantly to the variability of the human genome, and extensive genomic rearrangements are a hallmark of cancer. While genomic DNA paired-end-tag (DNA-PET) sequencing is an attractive approach to identify genomic SVs, the current application of PET sequencing with short-insert-size DNA can be insufficient for the comprehensive mapping of SVs in low-complexity and repeat-rich genomic regions. We employed a recently developed procedure to generate PET sequencing data using large DNA inserts of 10-20 kb and compared their characteristics with short-insert (1 kb) libraries for their ability to identify SVs. Our results suggest that although short-insert libraries bear an advantage in identifying small deletions, they do not provide significantly better breakpoint resolution. In contrast, large inserts are superior to short inserts in providing higher physical genome coverage for the same sequencing cost and achieve greater sensitivity, in practice, for the identification of several classes of SVs, such as copy-number-neutral and complex events. Furthermore, our results confirm that large-insert libraries allow for the identification of SVs within repetitive sequences, which cannot be spanned by short inserts. This provides a key advantage in studying rearrangements in cancer, and we show how it can be used in a fusion-point-guided-concatenation algorithm to study focally amplified regions in cancer.

  1. Synthetic approaches to lanthanide complexes with tetrapyrrole type ligands

    International Nuclear Information System (INIS)

    Pushkarev, V E; Tomilova, L G; Tomilov, Yu V

    2008-01-01

    Approaches known to date for the synthesis of single-, double- and triple-decker complexes of lanthanides with phthalocyanines and their analogues are considered. Examples of the preparation of sandwich-type complexes based on other metals of the Periodic Table are given.

  2. Complex evolutionary patterns revealed by mitochondrial genomes of the domestic horse.

    Science.gov (United States)

    Ning, T; Li, J; Lin, K; Xiao, H; Wylie, S; Hua, S; Li, H; Zhang, Y-P

    2014-01-01

    The domestic horse is the most widely used and important stock and recreational animal, valued for its strength and endurance. The energy required by the domestic horse is mainly supplied by the mitochondria via oxidative phosphorylation. Thus, selection may have played an essential role in the evolution of the horse mitochondria. In addition, demographic events also affect the pattern of DNA polymorphism in mitochondria. To understand the evolutionary patterns of the mitochondria of the domestic horse, we used a deep sequencing approach to obtain the complete sequences of 15 mitochondrial genomes, as well as four mitochondrial gene sequences, ND6, ATP8, ATP6 and CYTB, collected from 509, 363, 363 and 409 domestic horses, respectively. Evidence of strong substitution rate heterogeneity was found at nonsynonymous sites across the genomes. Signatures of recent positive selection on the mtDNA of the domestic horse were detected. Specifically, five amino acids in the four mitochondrial genes were identified as targets of positive selection. Coalescent-based simulations imply that recent population expansion is the most probable explanation for the matrilineal population history of the domestic horse. Our findings reveal a complex pattern of non-neutral evolution of the mitochondrial genome in domestic horses.

  3. Acoustic sequences in non-human animals: a tutorial review and prospectus

    Science.gov (United States)

    Kershenbaum, Arik; Blumstein, Daniel T.; Roch, Marie A.; Akçay, Çağlar; Backus, Gregory; Bee, Mark A.; Bohn, Kirsten; Cao, Yan; Carter, Gerald; Cäsar, Cristiane; Coen, Michael; DeRuiter, Stacy L.; Doyle, Laurance; Edelman, Shimon; Ferrer-i-Cancho, Ramon; Freeberg, Todd M.; Garland, Ellen C.; Gustison, Morgan; Harley, Heidi E.; Huetz, Chloé; Hughes, Melissa; Bruno, Julia Hyland; Ilany, Amiyaal; Jin, Dezhe Z.; Johnson, Michael; Ju, Chenghui; Karnowski, Jeremy; Lohr, Bernard; Manser, Marta B.; McCowan, Brenda; Mercado, Eduardo; Narins, Peter M.; Piel, Alex; Rice, Megan; Salmi, Roberta; Sasahara, Kazutoshi; Sayigh, Laela; Shiu, Yu; Taylor, Charles; Vallejo, Edgar E.; Waller, Sara; Zamora-Gutierrez, Veronica

    2015-01-01

    Animal acoustic communication often takes the form of complex sequences, made up of multiple distinct acoustic units. Apart from the well-known example of birdsong, other animals such as insects, amphibians, and mammals (including bats, rodents, primates, and cetaceans) also generate complex acoustic sequences. Occasionally, such as with birdsong, the adaptive role of these sequences seems clear (e.g. mate attraction and territorial defence). More often however, researchers have only begun to characterise – let alone understand – the significance and meaning of acoustic sequences. Hypotheses abound, but there is little agreement as to how sequences should be defined and analysed. Our review aims to outline suitable methods for testing these hypotheses, and to describe the major limitations to our current and near-future knowledge on questions of acoustic sequences. This review and prospectus is the result of a collaborative effort between 43 scientists from the fields of animal behaviour, ecology and evolution, signal processing, machine learning, quantitative linguistics, and information theory, who gathered for a 2013 workshop entitled, “Analysing vocal sequences in animals”. Our goal is to present not just a review of the state of the art, but to propose a methodological framework that summarises what we suggest are the best practices for research in this field, across taxa and across disciplines. We also provide a tutorial-style introduction to some of the most promising algorithmic approaches for analysing sequences. We divide our review into three sections: identifying the distinct units of an acoustic sequence, describing the different ways that information can be contained within a sequence, and analysing the structure of that sequence. Each of these sections is further subdivided to address the key questions and approaches in that area. We propose a uniform, systematic, and comprehensive approach to studying sequences, with the goal of clarifying

  4. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.

  5. Scanning mutagenesis of the amino acid sequences flanking phosphorylation site 1 of the mitochondrial pyruvate dehydrogenase complex

    Directory of Open Access Journals (Sweden)

    Nagib Ahsan

    2012-07-01

    The mitochondrial pyruvate dehydrogenase complex is regulated by reversible seryl-phosphorylation of the E1α subunit by a dedicated, intrinsic kinase. The phospho-complex is reactivated when dephosphorylated by an intrinsic PP2C-type protein phosphatase. Both the position of the phosphorylated Ser residue and the sequences of the flanking amino acids are highly conserved. We have used the synthetic peptide-based kinase client assay plus recombinant pyruvate dehydrogenase E1α and E1α-kinase to perform scanning mutagenesis of the residues flanking the site of phosphorylation. Consistent with the results from phylogenetic analysis of the flanking sequences, the direct peptide-based kinase assays tolerated very few changes. Even conservative changes such as Leu, Ile, or Val for Met, or Glu for Asp, gave very marked reductions in phosphorylation. Overall, the results indicate that regulation of the mitochondrial pyruvate dehydrogenase complex by reversible phosphorylation is an extreme example of multiple, interdependent instances of co-evolution.

  6. SPARSE: quadratic time simultaneous alignment and folding of RNAs without sequence-based heuristics

    Science.gov (United States)

    Will, Sebastian; Otto, Christina; Miladi, Milad; Möhl, Mathias; Backofen, Rolf

    2015-01-01

    Motivation: RNA-Seq experiments have revealed a multitude of novel ncRNAs. The gold standard for their analysis based on simultaneous alignment and folding suffers from extreme time complexity of O(n^6). Subsequently, numerous faster ‘Sankoff-style’ approaches have been suggested. Commonly, the performance of such methods relies on sequence-based heuristics that restrict the search space to optimal or near-optimal sequence alignments; however, the accuracy of sequence-based methods breaks down for RNAs with sequence identities below 60%. Alignment approaches like LocARNA, which do not require sequence-based heuristics, have been limited to high complexity (≥ quartic time). Results: Breaking this barrier, we introduce the novel Sankoff-style algorithm ‘sparsified prediction and alignment of RNAs based on their structure ensembles (SPARSE)’, which runs in quadratic time without sequence-based heuristics. To achieve this low complexity, on par with sequence alignment algorithms, SPARSE features strong sparsification based on structural properties of the RNA ensembles. Following PMcomp, SPARSE gains further speed-up from lightweight energy computation. Although all existing lightweight Sankoff-style methods restrict Sankoff’s original model by disallowing loop deletions and insertions, SPARSE transfers the Sankoff algorithm to the lightweight energy model completely for the first time. Compared with LocARNA, SPARSE achieves similar alignment and better folding quality in significantly less time (speedup: 3.7). At similar run-time, it aligns low sequence identity instances substantially more accurately than RAF, which uses sequence-based heuristics. Availability and implementation: SPARSE is freely available at http://www.bioinf.uni-freiburg.de/Software/SPARSE. Contact: backofen@informatik.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25838465

  7. Development of accident sequence precursors methodologies for core damage Probabilities in NPPs

    International Nuclear Information System (INIS)

    Munoz, R.; Minguez, E.; Melendez, E.; Sanchez-Perea, M.; Izquierdo, J.M.

    1998-01-01

    Several licensing programs have focused on evaluating the importance of operating events that have occurred in NPPs. Some have addressed the dynamic aspects of the sequence of events involved, reproducing the incidents, while others are based on PSA applications to incident analysis. A method that combines the two approaches above to determine risk derives from the Integrated Safety Assessment (ISA) methodology. The dynamics of the event are followed by transient simulation in tree form, building a Setpoint or Deterministic Dynamic Event Tree (DDET). When a setpoint is reached, the actuation of a protection system is triggered, and the tree is opened into branches corresponding to the different functioning states. An engineering simulator then follows each branch with the new states. One of these states is the nominal one, which in the PSA is associated with the success criterion of the system. The probability of each sequence is calculated in parallel with the dynamics. The coupled simulation is performed by the following tools: 1. A Scheduler that drives the simulation of the different sequences and opens branches upon demand. It is the sole generator of processes during construction of the tree and distributes the computation across a distributed computing environment. 2. The Plant Simulator, which models the plant systems and the operator actions throughout a sequence. It receives the state of the equipment in each sequence and must provide information about setpoint crossings to the Scheduler. It receives decision flags to continue or stop each sequence and to send new conditions to other plant simulators. 3. The Probability Calculator, linked only to the Scheduler, which evaluates the fault trees associated with each event tree header and performs their Boolean product. (Author)
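
As a rough illustration of the Scheduler / Plant Simulator / Probability Calculator split described above, the toy sketch below branches a one-variable "plant" at each setpoint crossing and propagates sequence probabilities along each branch. All dynamics, setpoint values and branching probabilities are invented; the real ISA tools are of course far more elaborate.

```python
from dataclasses import dataclass, field

@dataclass
class Branch:
    time: float
    state: dict
    prob: float
    history: list = field(default_factory=list)

def simulate_until_setpoint(branch, dt=1.0, t_end=10.0):
    """Plant Simulator: advance the state until a setpoint is crossed
    (here: temperature reaching 100) or the mission time runs out."""
    while branch.time < t_end:
        branch.time += dt
        branch.state["T"] += branch.state["heat_rate"]
        if branch.state["T"] >= 100.0:  # setpoint crossed -> protection demand
            return "setpoint"
    return "end"

def scheduler(initial):
    """Scheduler: on each setpoint demand, open branches for the possible
    protection-system states, weighting each by its probability (the role
    of the Probability Calculator; 0.99/0.01 are illustrative values)."""
    done, queue = [], [initial]
    while queue:
        br = queue.pop()
        if simulate_until_setpoint(br) == "end":
            done.append(br)
            continue
        for outcome, p, new_rate in (("success", 0.99, -5.0),
                                     ("failure", 0.01, 2.0)):
            child = Branch(br.time, dict(br.state), br.prob * p,
                           br.history + [outcome])
            child.state["heat_rate"] = new_rate
            queue.append(child)
    return done

leaves = scheduler(Branch(0.0, {"T": 90.0, "heat_rate": 3.0}, 1.0))
# Probability mass is conserved: the leaf probabilities sum to 1.
```

Because each split conserves probability and every branch eventually reaches the mission time, the resulting DDET leaves partition the probability of the initiating event, mirroring how sequence probabilities are computed in parallel with the dynamics.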

  8. Multi-locus sequence typing provides epidemiological insights for diseased sharks infected with fungi belonging to the Fusarium solani species complex.

    Science.gov (United States)

    Desoubeaux, Guillaume; Debourgogne, Anne; Wiederhold, Nathan P; Zaffino, Marie; Sutton, Deanna; Burns, Rachel E; Frasca, Salvatore; Hyatt, Michael W; Cray, Carolyn

    2018-07-01

    Fusarium spp. are saprobic moulds that are responsible for severe opportunistic infections in humans and animals. However, epidemiological tools are needed to reliably trace the circulation of such fungal strains within medical or veterinary facilities, to recognize environmental contaminations that might lead to infection and to improve our understanding of the factors responsible for the onset of outbreaks. In this study, we used molecular genotyping to investigate clustered cases of Fusarium solani species complex (FSSC) infection that occurred in eight Sphyrnidae sharks under managed care at a public aquarium. Genetic relationships between fungal strains were determined by multi-locus sequence typing (MLST) analysis based on DNA sequencing at five loci, followed by comparison with sequences of 50 epidemiologically unrelated FSSC strains. Our genotyping approach revealed that F. keratoplasticum and F. solani haplotype 9x were most commonly isolated. In one case, the infection proved to be caused by another rare opportunistic Hypocrealean pathogen, Metarhizium robertsii. Twice, sharks proved to be infected with FSSC strains with the same MLST sequence type, supporting the hypothesis that common environmental populations of fungi existed for these sharks and suggesting the long-term persistence of the two clonal strains within the environment, perhaps in the holding pools and life support systems of the aquarium. This study highlights how molecular tools like MLST can be used to investigate outbreaks of microbiological disease. This work reinforces the need for regular control of water quality to reduce microbiological contamination due to waterborne microorganisms.

  9. Validating a mass balance accounting approach to using 7Be measurements to estimate event-based erosion rates over an extended period at the catchment scale

    Science.gov (United States)

    Porto, Paolo; Walling, Des E.; Cogliandro, Vanessa; Callegari, Giovanni

    2016-07-01

    Use of the fallout radionuclides cesium-137 and excess lead-210 offers important advantages over traditional methods of quantifying erosion and soil redistribution rates. However, both radionuclides provide information on longer-term (i.e., 50-100 years) average rates of soil redistribution. Beryllium-7, with its half-life of 53 days, can provide a basis for documenting short-term soil redistribution and it has been successfully employed in several studies. However, the approach commonly used introduces several important constraints related to the timing and duration of the study period. A new approach proposed by the authors that overcomes these constraints has been successfully validated using an erosion plot experiment undertaken in southern Italy. Here, a further validation exercise undertaken in a small (1.38 ha) catchment is reported. The catchment was instrumented to measure event sediment yields and beryllium-7 measurements were employed to document the net soil loss for a series of 13 events that occurred between November 2013 and June 2015. In the absence of significant sediment storage within the catchment's ephemeral channel system and of a significant contribution from channel erosion to the measured sediment yield, the estimates of net soil loss for the individual events could be directly compared with the measured sediment yields to validate the former. The close agreement of the two sets of values is seen as successfully validating the use of beryllium-7 measurements and the new approach to obtain estimates of net soil loss for a sequence of individual events occurring over an extended period at the scale of a small catchment.

  10. A BAC Library and Paired-PCR Approach to Mapping and Completing the Genome Sequence of Sulfolobus solfataricus P2

    DEFF Research Database (Denmark)

    She, Qunxin; Confalonieri, F.; Zivanovic, Y.

    2000-01-01

    The original strategy used in the Sulfolobus solfataricus genome project was to sequence non-overlapping, or minimally overlapping, cosmid or lambda inserts without constructing a physical map. However, after only about two thirds of the genome sequence was completed, this approach became counter-productive because there was a high sequence bias in the cosmid and lambda libraries. Therefore, a new approach was devised for linking the sequenced regions which may be generally applicable. BAC libraries were constructed and terminal sequences of the clones were determined and used for both end mapping and PCR...

  11. Employment of Near Full-Length Ribosome Gene TA-Cloning and Primer-Blast to Detect Multiple Species in a Natural Complex Microbial Community Using Species-Specific Primers Designed with Their Genome Sequences.

    Science.gov (United States)

    Zhang, Huimin; He, Hongkui; Yu, Xiujuan; Xu, Zhaohui; Zhang, Zhizhou

    2016-11-01

    It remains an unsolved problem to quantify a natural microbial community by rapidly and conveniently measuring multiple species with functional significance. The most widely used high-throughput next-generation sequencing methods mainly generate information suitable for genus-level taxonomic identification and quantification, and detection of multiple species in a complex microbial community is still heavily dependent on approaches based on near full-length ribosomal RNA gene or genome sequence information. In this study, we used near full-length rRNA gene library sequencing plus Primer-Blast to design species-specific primers based on whole microbial genome sequences. The primers were intended to be specific at the species level within the relevant microbial communities, i.e., a defined genomic background. The primers were tested with samples collected from the Daqu (also called fermentation starters) and pit mud of a traditional Chinese liquor production plant. Sixteen pairs of primers were found to be suitable for identification of individual species. Among them, seven pairs were chosen to measure the abundance of microbial species through quantitative PCR. The combination of near full-length ribosomal RNA gene library sequencing and Primer-Blast may represent a broadly useful protocol to quantify multiple species in complex microbial population samples with species-specific primers.

  12. Reliable Detection of Herpes Simplex Virus Sequence Variation by High-Throughput Resequencing.

    Science.gov (United States)

    Morse, Alison M; Calabro, Kaitlyn R; Fear, Justin M; Bloom, David C; McIntyre, Lauren M

    2017-08-16

    High-throughput sequencing (HTS) has resulted in data for a number of herpes simplex virus (HSV) laboratory strains and clinical isolates. The knowledge of these sequences has been critical for investigating viral pathogenicity. However, the assembly of complete herpesviral genomes, including HSV, is complicated due to the existence of large repeat regions and arrays of smaller reiterated sequences that are commonly found in these genomes. In addition, the inherent genetic variation in populations of isolates for viruses and other microorganisms presents an additional challenge to many existing HTS sequence assembly pipelines. Here, we evaluate two approaches for the identification of genetic variants in HSV1 strains using Illumina short read sequencing data. The first, a reference-based approach, identifies variants from reads aligned to a reference sequence and the second, a de novo assembly approach, identifies variants from reads aligned to de novo assembled consensus sequences. Of critical importance for both approaches is the reduction in the number of low complexity regions through the construction of a non-redundant reference genome. We compared variants identified in the two methods. Our results indicate that approximately 85% of variants are identified regardless of the approach. The reference-based approach to variant discovery captures an additional 15% representing variants divergent from the HSV1 reference possibly due to viral passage. Reference-based approaches are significantly less labor-intensive and identify variants across the genome where de novo assembly-based approaches are limited to regions where contigs have been successfully assembled. In addition, regions of poor quality assembly can lead to false variant identification in de novo consensus sequences. For viruses with a well-assembled reference genome, a reference-based approach is recommended.
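    The roughly 85% agreement reported above comes from intersecting the variant sets produced by the two pipelines. A minimal sketch of such a comparison, assuming each variant is represented as a (chromosome, position, ref, alt) tuple (the pipelines' actual internal representation is not given in the abstract):

```python
def concordance(ref_based, denovo_based):
    """Overlap statistics between two variant call sets; each variant is a
    hashable (chrom, pos, ref, alt) tuple."""
    a, b = set(ref_based), set(denovo_based)
    shared, union = a & b, a | b
    return {
        "shared": len(shared),       # called by both approaches
        "ref_only": len(a - b),      # e.g. regions the de novo assembly missed
        "denovo_only": len(b - a),   # e.g. assembly-specific calls
        "jaccard": len(shared) / len(union) if union else 1.0,
    }
```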

  13. Psychological distress and stressful life events in pediatric complex regional pain syndrome

    Science.gov (United States)

    Wager, Julia; Brehmer, Hannah; Hirschfeld, Gerrit; Zernikow, Boris

    2015-01-01

    BACKGROUND: There is little knowledge regarding the association between psychological factors and complex regional pain syndrome (CRPS) in children. Specifically, it is not known which factors precipitate CRPS and which result from the ongoing painful disease. OBJECTIVES: To examine symptoms of depression and anxiety as well as the experience of stressful life events in children with CRPS compared with children with chronic primary headaches and functional abdominal pain. METHODS: A retrospective chart study examined children with CRPS (n=37) who received intensive inpatient pain treatment between 2004 and 2010. They were compared with two control groups (chronic primary headaches and functional abdominal pain; each n=37), who also received intensive inpatient pain treatment. Control groups were matched with the CRPS group with regard to admission date, age and sex. Groups were compared on symptoms of depression and anxiety as well as stressful life events. RESULTS: Children with CRPS reported lower anxiety and depression scores compared with children with abdominal pain. A higher number of stressful life events before and after the onset of the pain condition was observed for children with CRPS. CONCLUSIONS: Children with CRPS are not particularly prone to symptoms of anxiety or depression. Importantly, children with CRPS experienced more stressful life events than children with chronic headaches or abdominal pain. Prospective long-term studies are needed to further explore the potential role of stressful life events in the etiology of CRPS. PMID:26035287

  14. Towards a strategic approach to special events management in the post-9/11 world

    OpenAIRE

    Jones, G. B.

    2005-01-01

    CHDS State/Local This thesis reviews background related to counterterrorism and law enforcement planning for major special events and it identifies some of the strategic issues that have emerged in special events management since the terrorist attacks of September 11, 2001. It focuses on the subjective and objective components of the systems currently used by DHS and the FBI to categorize and resource special events, and it evaluates whether the current approach to major event planning ...

  15. Research as an event: a novel approach to promote patient-focused drug development

    Directory of Open Access Journals (Sweden)

    Tsai JH

    2018-05-01

    Jui-Hua Tsai, Ellen Janssen, John FP Bridges; Department of Health Policy and Management, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA. Abstract: Patient groups are increasingly engaging in research to understand patients’ preferences and incorporate their perspectives into drug development and regulation. Several models of patient engagement have emerged, but there is little guidance on how to partner with patient groups to engage the disease community. Our group has been using an approach to engage patient groups that we call research as an event. Research as an event is a method for researchers to use a community-centered event to engage patients in their own environment at modest incremental cost. It is a pragmatic solution to address the challenges of engaging patients in research to minimize patients’ frustration, decrease the time burden, and limit the overall cost. The community, the event, and the research are the three components that constitute the research as an event framework. The community represents a disease-specific community. The event is a meeting of common interest for patients and other stakeholders, such as a patient advocacy conference. The research describes activities in engaging the community for the purpose of research. Research as an event follows a six-step approach. A case study is used to demonstrate the six steps followed by recommendations for future implementation. Keywords: patients’ perspectives, decision making, drug approval, patient engagement, patient organization, patients’ preference

  16. Discrete event simulation: Modeling simultaneous complications and outcomes

    NARCIS (Netherlands)

    Quik, E.H.; Feenstra, T.L.; Krabbe, P.F.M.

    2012-01-01

    OBJECTIVES: To present an effective and elegant model approach to deal with specific characteristics of complex modeling. METHODS: A discrete event simulation (DES) model with multiple complications and multiple outcomes that each can occur simultaneously was developed. In this DES model parameters,
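    A discrete event simulation of the kind sketched in this abstract can be built around a time-ordered event queue. The following is an illustrative sketch only, not the authors' model: the complication names, rates, and the exponential inter-event assumption are all hypothetical.

```python
import heapq
import random

def simulate(horizon, rates, seed=1):
    """Minimal event-queue DES: each complication type fires repeatedly with
    exponential inter-event times; the heap keeps events in time order, so
    complications of different types can occur arbitrarily close together."""
    rng = random.Random(seed)
    queue = []
    for name, rate in rates.items():
        heapq.heappush(queue, (rng.expovariate(rate), name))
    history = []
    while queue:
        t, name = heapq.heappop(queue)
        if t > horizon:          # earliest pending event is past the horizon
            break
        history.append((t, name))
        # reschedule the next occurrence of this complication type
        heapq.heappush(queue, (t + rng.expovariate(rates[name]), name))
    return history
```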

  17. Transforming clinical microbiology with bacterial genome sequencing.

    Science.gov (United States)

    Didelot, Xavier; Bowden, Rory; Wilson, Daniel J; Peto, Tim E A; Crook, Derrick W

    2012-09-01

    Whole-genome sequencing of bacteria has recently emerged as a cost-effective and convenient approach for addressing many microbiological questions. Here, we review the current status of clinical microbiology and how it has already begun to be transformed by using next-generation sequencing. We focus on three essential tasks: identifying the species of an isolate, testing its properties, such as resistance to antibiotics and virulence, and monitoring the emergence and spread of bacterial pathogens. We predict that the application of next-generation sequencing will soon be sufficiently fast, accurate and cheap to be used in routine clinical microbiology practice, where it could replace many complex current techniques with a single, more efficient workflow.

  18. A Metastatistical Approach to Satellite Estimates of Extreme Rainfall Events

    Science.gov (United States)

    Zorzetto, E.; Marani, M.

    2017-12-01

    The estimation of the average recurrence interval of intense rainfall events is a central issue for both hydrologic modeling and engineering design. These estimates require the inference of the properties of the right tail of the statistical distribution of precipitation, a task often performed using the Generalized Extreme Value (GEV) distribution, estimated either from samples of annual maxima (AM) or with a peaks over threshold (POT) approach. However, these approaches require long and homogeneous rainfall records, which often are not available, especially in the case of remote-sensed rainfall datasets. Here we use an alternative approach, tailored to remotely sensed rainfall estimates, based on the metastatistical extreme value distribution (MEVD), which produces estimates of rainfall extreme values based on the probability distribution function (pdf) of all measured `ordinary' rainfall events. This methodology also accounts for the interannual variations observed in the pdf of daily rainfall by integrating over the sample space of its random parameters. We illustrate the application of this framework to the TRMM Multi-satellite Precipitation Analysis rainfall dataset, where MEVD optimally exploits the relatively short datasets of satellite-sensed rainfall, while taking full advantage of its high spatial resolution and quasi-global coverage. Accuracy of TRMM precipitation estimates and scale issues are here investigated for a case study located in the Little Washita watershed, Oklahoma, using a dense network of rain gauges for independent ground validation. The methodology contributes to our understanding of the risk of extreme rainfall events, as it allows i) an optimal use of the TRMM datasets in estimating the tail of the probability distribution of daily rainfall, and ii) a global mapping of daily rainfall extremes and distributional tail properties, bridging the existing gaps in rain gauges networks.
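    The MEVD referred to above computes the annual-maximum distribution by compounding the ordinary-event distribution over all years of record. A minimal sketch, assuming Weibull-distributed ordinary daily rainfall (a common choice in the MEVD literature, not stated in this abstract) with per-year parameters already fitted:

```python
import math

def mev_cdf(x, years):
    """MEVD cdf of the annual maximum: F(x) = (1/M) * sum_j F_j(x)**n_j,
    where year j had n_j ordinary rainfall events with (assumed) Weibull
    cdf F_j.  years: list of (n_j, scale_j, shape_j) tuples."""
    total = 0.0
    for n, scale, shape in years:
        f_ordinary = 1.0 - math.exp(-((x / scale) ** shape))
        total += f_ordinary ** n
    return total / len(years)

def mev_return_level(T, years, lo=0.0, hi=2000.0, tol=1e-6):
    """Daily rainfall depth with return period T years, found by bisection
    (valid because the MEVD cdf is monotone in x)."""
    target = 1.0 - 1.0 / T
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mev_cdf(mid, years) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```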

  19. Complexity Thinking in PE: Game-Centred Approaches, Games as Complex Adaptive Systems, and Ecological Values

    Science.gov (United States)

    Storey, Brian; Butler, Joy

    2013-01-01

    Background: This article draws on the literature relating to game-centred approaches (GCAs), such as Teaching Games for Understanding, and dynamical systems views of motor learning to demonstrate a convergence of ideas around games as complex adaptive learning systems. This convergence is organized under the title "complexity thinking"…

  20. A Study on Modeling Approaches in Discrete Event Simulation Using Design Patterns

    National Research Council Canada - National Science Library

    Kim, Leng Koh

    2007-01-01

    .... This modeling paradigm encompasses several modeling approaches: the active role of events, entities as independent components, and the chaining of components to enable interactivity, which are excellent ways of building a DES system...

  1. Quantification of sequence exchange events between PMS2 and PMS2CL provides a basis for improved mutation scanning of Lynch syndrome patients.

    Science.gov (United States)

    van der Klift, Heleen M; Tops, Carli M J; Bik, Elsa C; Boogaard, Merel W; Borgstein, Anne-Marijke; Hansson, Kerstin B M; Ausems, Margreet G E M; Gomez Garcia, Encarna; Green, Andrew; Hes, Frederik J; Izatt, Louise; van Hest, Liselotte P; Alonso, Angel M; Vriends, Annette H J T; Wagner, Anja; van Zelst-Stams, Wendy A G; Vasen, Hans F A; Morreau, Hans; Devilee, Peter; Wijnen, Juul T

    2010-05-01

    Heterozygous mutations in PMS2 are involved in Lynch syndrome, whereas biallelic mutations are found in Constitutional mismatch repair-deficiency syndrome patients. Mutation detection is complicated by the occurrence of sequence exchange events between the duplicated regions of PMS2 and PMS2CL. We investigated the frequency of such events with a nonspecific polymerase chain reaction (PCR) strategy, co-amplifying both PMS2 and PMS2CL sequences. This allowed us to score ratios between gene and pseudogene-specific nucleotides at 29 PSV sites from exon 11 to the end of the gene. We found sequence transfer at all investigated PSVs from intron 12 to the 3' end of the gene in 4 to 52% of DNA samples. Overall, sequence exchange between PMS2 and PMS2CL was observed in 69% (83/120) of individuals. We demonstrate that mutation scanning with PMS2-specific PCR primers and MLPA probes, designed on PSVs, in the 3' duplicated region is unreliable, and present an RNA-based mutation detection strategy to improve reliability. Using this strategy, we found 19 different putative pathogenic PMS2 mutations. Four of these (21%) are lying in the region with frequent sequence transfer and are missed or called incorrectly as homozygous with several PSV-based mutation detection methods. (c) 2010 Wiley-Liss, Inc.

  2. Significance of flow clustering and sequencing on sediment transport: 1D sediment transport modelling

    Science.gov (United States)

    Hassan, Kazi; Allen, Deonie; Haynes, Heather

    2016-04-01

    This paper considers 1D hydraulic model data on the effect of high flow clusters and sequencing on sediment transport. Using observed flow gauge data from the River Caldew, England, a novel stochastic modelling approach was developed in order to create alternative 50 year flow sequences. Whilst the observed probability density of gauge data was preserved in all sequences, the order in which those flows occurred was varied using the output from a Hidden Markov Model (HMM) with generalised Pareto distribution (GP). In total, one hundred 50 year synthetic flow series were generated and used as the inflow boundary conditions for individual flow series model runs using the 1D sediment transport model HEC-RAS. The model routed graded sediment through the case study river reach to define the long-term morphological changes. Comparison of individual simulations provided a detailed understanding of the sensitivity of channel capacity to flow sequence. Specifically, each 50 year synthetic flow sequence was analysed using a 3-month, 6-month or 12-month rolling window approach and classified for clusters in peak discharge. As a cluster is described as a temporal grouping of flow events above a specified threshold, the threshold condition used herein is considered as a morphologically active channel forming discharge event. Thus, clusters were identified for peak discharges in excess of 10%, 20%, 50%, 100% and 150% of the 1 year Return Period (RP) event. The window of above-peak flows also required cluster definition and was tested for timeframes 1, 2, 10 and 30 days. Subsequently, clusters could be described in terms of the number of events, maximum peak flow discharge, cumulative flow discharge and skewness (i.e. a description of the flow sequence). The model output for each cluster was analysed for the cumulative flow volume and cumulative sediment transport (mass). This was then compared to the total sediment transport of a single flow event of equivalent flow volume
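    The cluster definition used above (above-threshold flow peaks grouped within a time window) can be sketched directly. The function below is an illustrative reconstruction, not the authors' code, and the example thresholds are arbitrary:

```python
def find_clusters(flows, threshold, window):
    """Group above-threshold daily flows into clusters: successive events
    no more than `window` days apart belong to the same cluster.
    Returns (n_events, peak_flow, cumulative_flow) per cluster, echoing
    the cluster descriptors used in the paper."""
    events = [(day, q) for day, q in enumerate(flows) if q > threshold]
    clusters, current = [], []
    for day, q in events:
        if current and day - current[-1][0] > window:
            clusters.append(current)   # gap too long: close the cluster
            current = []
        current.append((day, q))
    if current:
        clusters.append(current)
    return [(len(c), max(q for _, q in c), sum(q for _, q in c)) for c in clusters]
```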

  3. An AU-rich element in the 3' untranslated region of the spinach chloroplast petD gene participates in sequence-specific RNA-protein complex formation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Qiuyun; Adams, C.C.; Usack, L. [Cornell Univ., Ithaca, NY (United States)] [and others]

    1995-04-01

    In chloroplasts, the 3' untranslated regions of most mRNAs contain a stem-loop-forming inverted repeat (IR) sequence that is required for mRNA stability and correct 3'-end formation. The IR regions of several mRNAs are also known to bind chloroplast proteins, as judged from in vitro gel mobility shift and UV cross-linking assays, and these RNA-protein interactions may be involved in the regulation of chloroplast mRNA processing and/or stability. Here we describe in detail the RNA and protein components that are involved in 3' IR-containing RNA (3' IR-RNA)-protein complex formation for the spinach chloroplast petD gene, which encodes subunit IV of the cytochrome b6/f complex. We show that the complex contains 55-, 41-, and 29-kDa RNA-binding proteins (ribonucleoproteins [RNPs]). These proteins together protect a 90-nucleotide segment of RNA from RNase T1 digestion; this RNA contains the IR and downstream flanking sequences. Competition experiments using 3' IR-RNAs from the psbA or rbcL gene demonstrate that the RNPs have a strong specificity for the petD sequence. Site-directed mutagenesis was carried out to define the RNA sequence elements required for complex formation. These studies identified an 8-nucleotide AU-rich sequence downstream of the IR; mutations within this sequence had moderate to severe effects on RNA-protein complex formation. Although other similar sequences are present in the petD 3' untranslated region, only a single copy, which we have termed box II, appears to be essential for in vivo protein binding. In addition, the IR itself is necessary for optimal complex formation. These two sequence elements together with an RNP complex may direct correct 3'-end processing and/or influence the stability of petD mRNA in chloroplasts. 48 refs., 9 figs., 2 tabs.

  4. Approaches to proton single-event rate calculations

    International Nuclear Information System (INIS)

    Petersen, E.L.

    1996-01-01

    This article discusses the fundamentals of proton-induced single-event upsets and of the various methods that have been developed to calculate upset rates. Two types of approaches are used based on nuclear-reaction analysis. Several aspects can be analyzed using analytic methods, but a complete description is not available. The paper presents an analytic description for the component due to elastic-scattering recoils. There have been a number of studies made using Monte Carlo methods. These can completely describe the reaction processes, including the effect of nuclear reactions occurring outside the device-sensitive volume. They have not included the elastic-scattering processes. The article describes the semiempirical approaches that are most widely used. The quality of previous upset predictions relative to space observations is discussed and leads to comments about the desired quality of future predictions. Brief sections treat the possible testing limitation due to total ionizing dose effects, the relationship of proton and heavy-ion upsets, upsets due to direct proton ionization, and relative proton and cosmic-ray upset rates

  5. The Cladophora complex (Chlorophyta): new views based on 18S rRNA gene sequences.

    Science.gov (United States)

    Bakker, F T; Olsen, J L; Stam, W T; van den Hoek, C

    1994-12-01

    Evolutionary relationships among species traditionally ascribed to the Siphonocladales/Cladophorales have remained unclear due to a lack of phylogenetically informative characters and extensive morphological plasticity resulting in morphological convergence. This study explores some of the diversity within the generic complex Cladophora and its siphonocladalean allies. Twelve species of Cladophora representing 6 of the 11 morphological sections recognized by van den Hoek were analyzed along with 8 siphonocladalean species using 18S rRNA gene sequences. The final alignment consisted of 1460 positions containing 92 phylogenetically informative substitutions. Weighting schemes (EOR weighting, combinatorial weighting) were applied in maximum parsimony analysis to correct for substitution bias. Stem characters were weighted 0.66 relative to single-stranded characters to correct for secondary structural constraints. Both weighting approaches resulted in greater phylogenetic resolution. Results confirm that there is no basis for the independent recognition of the Cladophorales and Siphonocladales. The Siphonocladales is polyphyletic, and Cladophora is paraphyletic. All analyses support two principal lineages, of which one contains predominantly tropical members including almost all siphonocladalean taxa, while the other lineage consists of mostly warm- to cold-temperate species of Cladophora.

  6. SSFSE sequence functional MRI of the human cervical spinal cord with complex finger tapping

    International Nuclear Information System (INIS)

    Xie Chuhai; Kong Kangmei; Guan Jitian; Chen Yexi; He Jiankang; Qi Weili; Wang Xinjia; Shen Zhiwei; Wu Renhua

    2009-01-01

    Purpose: Functional MR imaging of the human cervical spinal cord was carried out on volunteers during alternated rest and a complex finger tapping task, in order to detect image intensity changes arising from neuronal activity. Methods: Functional MR imaging data using single-shot fast spin-echo sequence (SSFSE) with echo time 42.4 ms on a 1.5 T GE Clinical System were acquired in eight subjects performing a complex finger tapping task. Cervical spinal cord activation was measured both in the sagittal and transverse imaging planes. Postprocessing was performed by AFNI (Analysis of Functional Neuroimages) software system. Results: Intensity changes (5.5-7.6%) were correlated with the time course of stimulation and were consistently detected in both sagittal and transverse imaging planes of the cervical spinal cord. The activated regions localized to the ipsilateral side of the spinal cord in agreement with the neural anatomy. Conclusion: Functional MR imaging signals can be reliably detected with finger tapping activity in the human cervical spinal cord using a SSFSE sequence with 42.4 ms echo time. The anatomic location of neural activity correlates with the muscles used in the finger tapping task.

  7. Diagnostics of Primary Immunodeficiencies through Next Generation Sequencing

    Directory of Open Access Journals (Sweden)

    Vera Gallo

    2016-11-01

    Background: Recently, a growing number of novel genetic defects underlying primary immunodeficiencies (PID) have been identified, increasing the number of PIDs to more than 250 well-defined forms. Next-generation sequencing (NGS) technologies and proper filtering strategies greatly contributed to this rapid evolution, providing the possibility to rapidly and simultaneously analyze large numbers of genes or the whole exome. Objective: To evaluate the role of targeted next-generation sequencing and whole exome sequencing in the diagnosis of a case series characterized by complex or atypical clinical features suggesting a PID that is difficult to diagnose using the current diagnostic procedures. Methods: We retrospectively analyzed genetic variants identified through targeted next-generation sequencing or whole exome sequencing in 45 patients with complex PID of unknown etiology. Results: 40 variants were identified using targeted next-generation sequencing, while 5 were identified using whole exome sequencing. Newly identified genetic variants were classified into four groups: (I) variations associated with a well-defined PID; (II) variations associated with atypical features of a well-defined PID; (III) functionally relevant variations potentially involved in the immunological features; and (IV) non-diagnostic genotypes, in which the link with the phenotype is missing. We reached a conclusive genetic diagnosis in 7/45 patients (~16%). Among them, 4 patients presented with a typical well-defined PID. In the remaining 3 cases, mutations were associated with unexpected clinical features, expanding the phenotypic spectrum of typical PIDs. In addition, we identified 31 variants in 10 patients with complex phenotypes, individually not causative per se of the disorder. Conclusion: NGS technologies represent a cost-effective and rapid first-line genetic approach for the evaluation of complex PIDs. Whole exome sequencing, despite a moderately higher cost compared to targeted sequencing, is

  8. An integrated native mass spectrometry and top-down proteomics method that connects sequence to structure and function of macromolecular complexes

    Science.gov (United States)

    Li, Huilin; Nguyen, Hong Hanh; Ogorzalek Loo, Rachel R.; Campuzano, Iain D. G.; Loo, Joseph A.

    2018-02-01

    Mass spectrometry (MS) has become a crucial technique for the analysis of protein complexes. Native MS has traditionally examined protein subunit arrangements, while proteomics MS has focused on sequence identification. These two techniques are usually performed separately without taking advantage of the synergies between them. Here we describe the development of an integrated native MS and top-down proteomics method using Fourier-transform ion cyclotron resonance (FTICR) to analyse macromolecular protein complexes in a single experiment. We address previous concerns of employing FTICR MS to measure large macromolecular complexes by demonstrating the detection of complexes up to 1.8 MDa, and we demonstrate the efficacy of this technique for direct acquirement of sequence to higher-order structural information with several large complexes. We then summarize the unique functionalities of different activation/dissociation techniques. The platform expands the ability of MS to integrate proteomics and structural biology to provide insights into protein structure, function and regulation.

  9. Superresolution restoration of an image sequence: adaptive filtering approach.

    Science.gov (United States)

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method based on adaptive filtering theory for superresolution restoration of continuous image sequences. The proposed methodology suggests least squares (LS) estimators which adapt in time, based on adaptive filters, least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space and time-variant blurring and arbitrary motion, both of them assumed known. The proposed new approach is shown to be of relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.

  10. Mapping the sequence of brain events in response to disgusting food.

    Science.gov (United States)

    Pujol, Jesus; Blanco-Hinojo, Laura; Coronas, Ramón; Esteba-Castillo, Susanna; Rigla, Mercedes; Martínez-Vilavella, Gerard; Deus, Joan; Novell, Ramón; Caixàs, Assumpta

    2018-01-01

    Warning signals indicating that a food is potentially dangerous may evoke a response that is not limited to the feeling of disgust. We investigated the sequence of brain events in response to visual representations of disgusting food using a dynamic image analysis. Functional MRI was acquired in 30 healthy subjects while they were watching a movie showing disgusting food scenes interspersed with the scenes of appetizing food. Imaging analysis included the identification of the global brain response and the generation of frame-by-frame activation maps at the temporal resolution of 2 s. Robust activations were identified in brain structures conventionally associated with the experience of disgust, but our analysis also captured a variety of other brain elements showing distinct temporal evolutions. The earliest events included transient changes in the orbitofrontal cortex and visual areas, followed by a more durable engagement of the periaqueductal gray, a pivotal element in the mediation of responses to threat. A subsequent core phase was characterized by the activation of subcortical and cortical structures directly concerned not only with the emotional dimension of disgust (e.g., amygdala-hippocampus, insula), but also with the regulation of food intake (e.g., hypothalamus). In a later phase, neural excitement extended to broad cortical areas, the thalamus and cerebellum, and finally to the default mode network that signaled the progressive termination of the evoked response. The response to disgusting food representations is not limited to the emotional domain of disgust, and may sequentially involve a variety of broadly distributed brain networks. Hum Brain Mapp 39:369-380, 2018. © 2017 Wiley Periodicals, Inc.

  11. European Conference on Complex Systems

    CERN Document Server

    Pellegrini, Francesco; Caldarelli, Guido; Merelli, Emanuela

    2016-01-01

    This work contains a rigorous selection of extended contributions presented at the 2014 meeting and its satellite meetings, reflecting the scope, diversity and richness of research areas in the field, both fundamental and applied. The ECCS meeting, held under the patronage of the Complex Systems Society, is an annual event that has become the leading European conference devoted to complexity science. It offers cutting-edge research and unique opportunities to study novel scientific approaches in a multitude of application areas. ECCS'14, its eleventh occurrence, took place in Lucca, Italy. It gathered some 650 scholars representing a wide range of topics relating to complex systems research, with emphasis on interdisciplinary approaches. The editors are among the best specialists in the area. The book is of great interest to scientists, researchers and graduate students in complexity, complex systems and networks.

  12. Exact combinatorial reliability analysis of dynamic systems with sequence-dependent failures

    International Nuclear Information System (INIS)

    Xing Liudong; Shrestha, Akhilesh; Dai Yuanshun

    2011-01-01

    Many real-life fault-tolerant systems are subjected to sequence-dependent failure behavior, in which the order in which the fault events occur is important to the system reliability. Such systems can be modeled by dynamic fault trees (DFT) with priority-AND (pAND) gates. Existing approaches for the reliability analysis of systems subjected to sequence-dependent failures are typically state-space-based, simulation-based or inclusion-exclusion-based methods. Those methods either suffer from the state-space explosion problem or require long computation time especially when results with high degree of accuracy are desired. In this paper, an analytical method based on sequential binary decision diagrams is proposed. The proposed approach can analyze the exact reliability of non-repairable dynamic systems subjected to the sequence-dependent failure behavior. Also, the proposed approach is combinatorial and is applicable for analyzing systems with any arbitrary component time-to-failure distributions. The application and advantages of the proposed approach are illustrated through analysis of several examples. - Highlights: → We analyze the sequence-dependent failure behavior using combinatorial models. → The method has no limitation on the type of time-to-failure distributions. → The method is analytical and based on sequential binary decision diagrams (SBDD). → The method is computationally more efficient than existing methods.
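    The pAND gate analyzed above fires only when both inputs fail and the first input fails before the second. For independent exponentially distributed components this sequence-dependent probability has a closed form, which a numeric integration can cross-check. The sketch below illustrates the gate semantics only, not the paper's sequential-BDD algorithm:

```python
import math

def pand_exponential(lam_a, lam_b, t):
    """P(A fails before B and both fail by mission time t), independent
    exponential components:
    P = (1 - e^{-lb t}) - lb/(la+lb) * (1 - e^{-(la+lb) t})."""
    lab = lam_a + lam_b
    return (1.0 - math.exp(-lam_b * t)) - (lam_b / lab) * (1.0 - math.exp(-lab * t))

def pand_numeric(lam_a, lam_b, t, steps=20000):
    """Trapezoidal cross-check of the same probability:
    integral over [0, t] of f_B(b) * F_A(b) db."""
    h = t / steps
    def g(b):
        return lam_b * math.exp(-lam_b * b) * (1.0 - math.exp(-lam_a * b))
    s = 0.5 * (g(0.0) + g(t))
    for i in range(1, steps):
        s += g(i * h)
    return s * h
```

    A useful sanity check: the two orderings together must account for every outcome in which both components fail by t, so P(A before B) + P(B before A) equals the product of the two marginal failure probabilities.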

  13. Sequenced Integration and the Identification of a Problem-Solving Approach through a Learning Process

    Science.gov (United States)

    Cormas, Peter C.

    2016-01-01

    Preservice teachers (N = 27) in two sections of a sequenced, methodological and process integrated mathematics/science course solved a levers problem with three similar learning processes and a problem-solving approach, and identified a problem-solving approach through one different learning process. Similar learning processes used included:…

  14. Can you sequence ecology? Metagenomics of adaptive diversification.

    Science.gov (United States)

    Marx, Christopher J

    2013-01-01

    Few areas of science have benefited more from the expansion in sequencing capability than the study of microbial communities. Can sequence data, besides providing hypotheses of the functions the members possess, detect the evolutionary and ecological processes that are occurring? For example, can we determine if a species is adapting to one niche, or if it is diversifying into multiple specialists that inhabit distinct niches? Fortunately, adaptation of populations in the laboratory can serve as a model to test our ability to make such inferences about evolution and ecology from sequencing. Even adaptation to a single niche can give rise to complex temporal dynamics due to the transient presence of multiple competing lineages. If there are multiple niches, this complexity is augmented by segmentation of the population into multiple specialists that can each continue to evolve within their own niche. For a known example of parallel diversification that occurred in the laboratory, sequencing data gave surprisingly few obvious, unambiguous signs of the ecological complexity present. Whereas experimental systems are open to direct experimentation to test hypotheses of selection or ecological interaction, the difficulty in "seeing ecology" from sequencing for even such a simple system suggests translation to communities like the human microbiome will be quite challenging. This will require both improved empirical methods to enhance the depth and time resolution for the relevant polymorphisms and novel statistical approaches to rigorously examine time-series data for signs of various evolutionary and ecological phenomena within and between species.

  15. Targeted assembly of short sequence reads.

    Directory of Open Access Journals (Sweden)

    René L Warren

    Full Text Available As next-generation sequencing (NGS) production continues to increase, analysis is becoming a significant bottleneck. However, in situations where information is required only for specific sequence variants, it is not necessary to assemble or align whole genome data sets in their entirety. Rather, NGS data sets can be mined for the presence of sequence variants of interest by localized assembly, which is a faster, easier, and more accurate approach. We present TASR, a streamlined assembler that interrogates very large NGS data sets for the presence of specific variants by only considering reads within the sequence space of input target sequences provided by the user. The NGS data set is searched for reads with an exact match to all possible short words within the target sequence, and these reads are then assembled stringently to generate a consensus of the target and flanking sequence. Typically, variants of a particular locus are provided as different target sequences, and the presence of the variant in the data set being interrogated is revealed by a successful assembly outcome. However, TASR can also be used to find unknown sequences that flank a given target. We demonstrate that TASR has utility in finding or confirming genomic mutations, polymorphisms, fusions and integration events. Targeted assembly is a powerful method for interrogating large data sets for the presence of sequence variants of interest. TASR is a fast, flexible and easy to use tool for targeted assembly.
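    The read-recruitment step described in this abstract (exact matching of short words, i.e. k-mers, between reads and a target, prior to local assembly) can be sketched in a few lines. This is a hypothetical simplification for illustration only, not the TASR implementation; all names and parameters are invented.

```python
# Sketch of k-mer-based read recruitment: keep only reads that share at
# least one exact k-length word with the target sequence. In a TASR-like
# workflow, the recruited reads would then be assembled locally.

def kmers(seq, k):
    """All overlapping k-length words of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def recruit_reads(target, reads, k=15):
    """Return the reads sharing at least one exact k-mer with the target."""
    index = kmers(target, k)
    return [r for r in reads if any(w in index for w in kmers(r, k))]

target = "ACGTACGTGGTTAACCGGTTACGTACGTAA"
reads = ["GGTTAACCGGTTACGTA", "TTTTTTTTTTTTTTTTT"]
print(recruit_reads(target, reads))  # only the first read shares a 15-mer
```

    Hashing the target's k-mers up front means each read is screened in time proportional to its own length, which is what makes this kind of targeted mining cheap compared to whole-genome alignment.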

  16. High throughput sequencing and proteomics to identify immunogenic proteins of a new pathogen: the dirty genome approach.

    Science.gov (United States)

    Greub, Gilbert; Kebbi-Beghdadi, Carole; Bertelli, Claire; Collyn, François; Riederer, Beat M; Yersin, Camille; Croxatto, Antony; Raoult, Didier

    2009-12-23

    With the availability of new generation sequencing technologies, bacterial genome projects have undergone a major boost. Still, chromosome completion needs a costly and time-consuming gap closure, especially when containing highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, lack of release of genome data during gap closure stage is clearly medically counterproductive. We thus investigated the feasibility of a dirty genome approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae was retrieved even with relatively short reads from Genome Sequencer 20 and Solexa. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to elaborate the first steps of an ELISA. This work constitutes the proof of principle for a dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics to rapidly identify new immunogenic proteins useful to develop in the future specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty genome sequencing/proteomic approach may be used for any pathogen for which better diagnostics are needed. These genome sequences may also be very useful to develop DNA based diagnostic tests. All these diagnostic tools will allow further evaluations of the pathogenic potential of this obligate intracellular bacterium.

  17. Managing the Organizational and Cultural Precursors to Major Events — Recognising and Addressing Complexity

    International Nuclear Information System (INIS)

    Taylor, R. H.; Carhart, N.; May, J.; Wijk, L. G. A. van

    2016-01-01

    illustration will be given of the use of Hierarchical Process Modelling (HPM) to develop a vulnerability tool using the question sets. However, understanding the issues involved more fully requires the development of models and associated tools which recognise the complexity and interactive nature of the organizational and cultural issues involved. Various repeating patterns of system failure appear in most of the events studied. Techniques such as System Dynamics (SD) can be used to ‘map’ these processes and capture the complexity involved. This highlights interdependencies, incubating vulnerabilities and the impact of time lags within systems. Two examples will be given. In almost all of the events studied, there has been a strong disconnect between the knowledge and aspirations of senior management and those planning and carrying out operations. There has, for example, frequently been a failure to ensure that information flows up and down the management chain are effective. This has often led to conflicts between the need to maintain safety standards through exercising a cautious and questioning attitude in the light of uncertainty and the need to meet production and cost targets. Business pressures have led to shortcuts, failure to provide sufficient oversight so that leaders are aware of the true picture of process and nuclear safety at operational level (often leading to organizational ‘drift’), normalisation of risks, and the establishment of a ‘good news culture’. The development of this disconnect and its consequences have been shown to be interdependent, dynamic and complex. A second example is that of gaining a better appreciation of the deeper factors involved in managing the supply chain and, in particular, of the interface with contractors. 
Initiating projects with unclear accountabilities and to unrealistic timescales, together with a lack of clarity about the cost implications when safety-related concerns are reported and need to be addressed, have

  18. An Agent Based Software Approach towards Building Complex Systems

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Full Text Available Agent-oriented techniques represent an exciting new means of analyzing, designing and building complex software systems. They have the potential to significantly improve current practice in software engineering and to extend the range of applications that can feasibly be tackled. Yet, to date, there have been few serious attempts to cast agent systems as a software engineering paradigm. This paper seeks to rectify this omission. Specifically, points to be argued include: firstly, the conceptual apparatus of agent-oriented systems is well-suited to building software solutions for complex systems and secondly, agent-oriented approaches represent a genuine advance over the current state of the art for engineering complex systems. Following on from this view, the major issues raised by adopting an agent-oriented approach to software engineering are highlighted and discussed in this paper.

  19. Whole-genome sequencing approaches for conservation biology: Advantages, limitations and practical recommendations.

    Science.gov (United States)

    Fuentes-Pardo, Angela P; Ruzzante, Daniel E

    2017-10-01

    Whole-genome resequencing (WGR) is a powerful method for addressing fundamental evolutionary biology questions that have not been fully resolved using traditional methods. WGR includes four approaches: the sequencing of individuals to a high depth of coverage with either unresolved or resolved haplotypes, the sequencing of population genomes to a high depth by mixing equimolar amounts of unlabelled-individual DNA (Pool-seq) and the sequencing of multiple individuals from a population to a low depth (lcWGR). These techniques require the availability of a reference genome. This, along with the still high cost of shotgun sequencing and the large demand for computing resources and storage, has limited their implementation in nonmodel species with scarce genomic resources and in fields such as conservation biology. Our goal here is to describe the various WGR methods, their pros and cons and potential applications in conservation biology. WGR offers an unprecedented marker density and surveys a wide diversity of genetic variations not limited to single nucleotide polymorphisms (e.g., structural variants and mutations in regulatory elements), increasing their power for the detection of signatures of selection and local adaptation as well as for the identification of the genetic basis of phenotypic traits and diseases. Currently, though, no single WGR approach fulfils all requirements of conservation genetics, and each method has its own limitations and sources of potential bias. We discuss proposed ways to minimize such biases. We envision a not distant future where the analysis of whole genomes becomes a routine task in many nonmodel species and fields including conservation biology. © 2017 John Wiley & Sons Ltd.

  20. An evaluation of different target enrichment methods in pooled sequencing designs for complex disease association studies.

    Directory of Open Access Journals (Sweden)

    Aaron G Day-Williams

    Full Text Available Pooled sequencing can be a cost-effective approach to disease variant discovery, but its applicability in association studies remains unclear. We compare sequence enrichment methods coupled to next-generation sequencing in non-indexed pools of 1, 2, 10, 20 and 50 individuals and assess their ability to discover variants and to estimate their allele frequencies. We find that pooled resequencing is most usefully applied as a variant discovery tool due to limitations in estimating allele frequency with high enough accuracy for association studies, and that in-solution hybrid-capture performs best among the enrichment methods examined regardless of pool size.

  1. Rb-Sr geochronology from Barro Alto Complex, Goias: metamorphism evidence of high degree and continental collision around 1300 Ma ago in Central Brazil

    International Nuclear Information System (INIS)

    Fuck, R.A.; Neves, B.B.B.; Cordani, U.G.; Kawashita, K.

    1989-01-01

    Rb-Sr geochronologic investigation carried out on rocks from the Barro Alto Complex, Goias, yielded isochron ages of 1266 ± 17 Ma for felsic rocks from the granulite belt and 1330 ± 67 Ma for gneisses belonging to the Juscelandia Sequence. Rb-Sr isotope measurements suggest that Barro Alto rocks have undergone an important metamorphic event during middle Proterozoic times, around 1300 Ma ago. During that event, volcanic and sedimentary rocks of the Juscelandia Sequence, as well as the underlying gabbro-anorthosite layered complex, underwent deformation and recrystallization under amphibolite facies conditions. Deformation and metamorphism took place during the collision of two continental blocks, which resulted in a southeastward directed thrust complex, allowing the exposure of granulite slices from the middle-lower crust of the overthrusted block. (author)

  2. Complex correlation approach for high frequency financial data

    Science.gov (United States)

    Wilinski, Mateusz; Ikeda, Yuichi; Aoyama, Hideaki

    2018-02-01

    We propose a novel approach that allows the calculation of a Hilbert transform based complex correlation for unevenly spaced data. This method is especially suitable for high frequency trading data, which are of a particular interest in finance. Its most important feature is the ability to take into account lead-lag relations on different scales, without knowing them in advance. We also present results obtained with this approach while working on Tokyo Stock Exchange intraday quotations. We show that individual sectors and subsectors tend to form important market components which may follow each other with small but significant delays. These components may be recognized by analysing eigenvectors of complex correlation matrix for Nikkei 225 stocks. Interestingly, sectorial components are also found in eigenvectors corresponding to the bulk eigenvalues, traditionally treated as noise.
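    For evenly sampled series, the core idea of a Hilbert-transform-based complex correlation can be sketched as below: the modulus of the complex coefficient measures co-movement strength, while its phase captures an average lead-lag relation. The paper's contribution is the extension to unevenly spaced data, which this simplified, evenly-spaced illustration (with invented function names) does not attempt.

```python
import numpy as np

def analytic_signal(x):
    """Discrete analytic signal via the FFT (even-length input assumed)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0   # keep DC and Nyquist
    h[1:n // 2] = 2.0        # double positive frequencies, zero negatives
    return np.fft.ifft(X * h)

def complex_correlation(x, y):
    """Complex correlation of two evenly sampled, zero-mean-adjusted series:
    modulus = strength of co-movement, phase = average lead-lag angle."""
    ax = analytic_signal(x - x.mean())
    ay = analytic_signal(y - y.mean())
    num = np.mean(ax * np.conj(ay))
    return num / np.sqrt(np.mean(np.abs(ax) ** 2) * np.mean(np.abs(ay) ** 2))

t = np.linspace(0, 20 * np.pi, 4000)
c = complex_correlation(np.sin(t), np.sin(t - 0.3))  # second series lags by 0.3 rad
print(abs(c), np.angle(c))  # modulus near 1, phase near +0.3
```

    A lag of the second series shows up as a positive phase angle, which is how sectorial lead-lag structure could be read off the eigenvectors of a matrix of such coefficients.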

  3. NiMax system for hadronic event generators in HEP

    International Nuclear Information System (INIS)

    Amelin, N.S.; Komogorov, M.E.

    2001-01-01

    We have suggested a new approach to the development and use of Monte Carlo event generators in high-energy physics (HEP). It is a component approach, when a complex numerical model is composed of standard components. Our approach opens a way to organize a library of HEP model components and provides a great flexibility for the construction of very powerful and realistic numerical models. To support this approach we have designed the NiMax software system (framework) written in C++

  4. Random and externally controlled occurrences of Dansgaard–Oeschger events

    Directory of Open Access Journals (Sweden)

    J. Lohmann

    2018-05-01

    Full Text Available Dansgaard–Oeschger (DO) events constitute the most pronounced mode of centennial to millennial climate variability of the last glacial period. Since their discovery, many decades of research have been devoted to understanding the origin and nature of these rapid climate shifts. In recent years, a number of studies have appeared that report emergence of DO-type variability in fully coupled general circulation models via different mechanisms. These mechanisms result in the occurrence of DO events at varying degrees of regularity, ranging from periodic to random. When examining the full sequence of DO events as captured in the North Greenland Ice Core Project (NGRIP) ice core record, one can observe high irregularity in the timing of individual events at any stage within the last glacial period. In addition to the prevailing irregularity, certain properties of the DO event sequence, such as the average event frequency or the relative distribution of cold versus warm periods, appear to be changing throughout the glacial. By using statistical hypothesis tests on simple event models, we investigate whether the observed event sequence may have been generated by stationary random processes or rather was strongly modulated by external factors. We find that the sequence of DO warming events is consistent with a stationary random process, whereas dividing the event sequence into warming and cooling events leads to inconsistency with two independent event processes. As we include external forcing, we find a particularly good fit to the observed DO sequence in a model where the average residence time in warm periods is controlled by global ice volume and that in cold periods by boreal summer insolation.

  5. Random and externally controlled occurrences of Dansgaard-Oeschger events

    Science.gov (United States)

    Lohmann, Johannes; Ditlevsen, Peter D.

    2018-05-01

    Dansgaard-Oeschger (DO) events constitute the most pronounced mode of centennial to millennial climate variability of the last glacial period. Since their discovery, many decades of research have been devoted to understanding the origin and nature of these rapid climate shifts. In recent years, a number of studies have appeared that report emergence of DO-type variability in fully coupled general circulation models via different mechanisms. These mechanisms result in the occurrence of DO events at varying degrees of regularity, ranging from periodic to random. When examining the full sequence of DO events as captured in the North Greenland Ice Core Project (NGRIP) ice core record, one can observe high irregularity in the timing of individual events at any stage within the last glacial period. In addition to the prevailing irregularity, certain properties of the DO event sequence, such as the average event frequency or the relative distribution of cold versus warm periods, appear to be changing throughout the glacial. By using statistical hypothesis tests on simple event models, we investigate whether the observed event sequence may have been generated by stationary random processes or rather was strongly modulated by external factors. We find that the sequence of DO warming events is consistent with a stationary random process, whereas dividing the event sequence into warming and cooling events leads to inconsistency with two independent event processes. As we include external forcing, we find a particularly good fit to the observed DO sequence in a model where the average residence time in warm periods is controlled by global ice volume and that in cold periods by boreal summer insolation.
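    A minimal version of the kind of stationarity test described in these two records, checking whether an event sequence is consistent with a stationary random (Poisson) process, might look like the sketch below. The statistic (coefficient of variation of waiting times) and all names are illustrative assumptions, not the authors' actual tests.

```python
import numpy as np

rng = np.random.default_rng(42)

def poisson_cv_test(event_times, n_sim=2000):
    """Monte Carlo check that inter-event waiting times are consistent with
    a stationary Poisson process, via the coefficient of variation
    (CV = std/mean; close to 1 for exponential waits)."""
    waits = np.diff(np.sort(np.asarray(event_times, dtype=float)))
    cv_obs = waits.std() / waits.mean()
    # simulate waiting times from the stationary null hypothesis
    sims = rng.exponential(size=(n_sim, len(waits)))
    sim_cv = sims.std(axis=1) / sims.mean(axis=1)
    # two-sided p-value: how often a truly Poisson sequence is at least
    # as far from CV = 1 as the observed sequence
    p = np.mean(np.abs(sim_cv - 1.0) >= abs(cv_obs - 1.0))
    return cv_obs, p

# a synthetic stationary event sequence should generally not be rejected
events = np.cumsum(rng.exponential(size=80))
cv, p = poisson_cv_test(events)
print(f"CV = {cv:.2f}, p = {p:.3f}")
```

    A small p-value would indicate waiting times too clustered or too regular for a stationary random process, motivating externally modulated models like the ice-volume/insolation-controlled one found to fit best.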

  6. Rhythmic complexity and predictive coding: a novel approach to modeling rhythm and meter perception in music

    Science.gov (United States)

    Vuust, Peter; Witek, Maria A. G.

    2014-01-01

    Musical rhythm, consisting of apparently abstract intervals of accented temporal events, has a remarkable capacity to move our minds and bodies. How does the cognitive system enable our experiences of rhythmically complex music? In this paper, we describe some common forms of rhythmic complexity in music and propose the theory of predictive coding (PC) as a framework for understanding how rhythm and rhythmic complexity are processed in the brain. We also consider why we feel so compelled by rhythmic tension in music. First, we consider theories of rhythm and meter perception, which provide hierarchical and computational approaches to modeling. Second, we present the theory of PC, which posits a hierarchical organization of brain responses reflecting fundamental, survival-related mechanisms associated with predicting future events. According to this theory, perception and learning is manifested through the brain’s Bayesian minimization of the error between the input to the brain and the brain’s prior expectations. Third, we develop a PC model of musical rhythm, in which rhythm perception is conceptualized as an interaction between what is heard (“rhythm”) and the brain’s anticipatory structuring of music (“meter”). Finally, we review empirical studies of the neural and behavioral effects of syncopation, polyrhythm and groove, and propose how these studies can be seen as special cases of the PC theory. We argue that musical rhythm exploits the brain’s general principles of prediction and propose that pleasure and desire for sensorimotor synchronization from musical rhythm may be a result of such mechanisms. PMID:25324813

  7. Rhythmic complexity and predictive coding: A novel approach to modeling rhythm and meter perception in music

    Directory of Open Access Journals (Sweden)

    Peter eVuust

    2014-10-01

    Full Text Available Musical rhythm, consisting of apparently abstract intervals of accented temporal events, has a remarkable capacity to move our minds and bodies. How does the cognitive system enable our experiences of rhythmically complex music? In this paper, we describe some common forms of rhythmic complexity in music and propose the theory of predictive coding as a framework for understanding how rhythm and rhythmic complexity are processed in the brain. We also consider why we feel so compelled by rhythmic tension in music. First, we consider theories of rhythm and meter perception, which provide hierarchical and computational approaches to modeling. Second, we present the theory of predictive coding, which posits a hierarchical organization of brain responses reflecting fundamental, survival-related mechanisms associated with predicting future events. According to this theory, perception and learning is manifested through the brain’s Bayesian minimization of the error between the input to the brain and the brain’s prior expectations. Third, we develop a predictive coding model of musical rhythm, in which rhythm perception is conceptualized as an interaction between what is heard (‘rhythm’) and the brain’s anticipatory structuring of music (‘meter’). Finally, we review empirical studies of the neural and behavioral effects of syncopation, polyrhythm and groove, and propose how these studies can be seen as special cases of the predictive coding theory. We argue that musical rhythm exploits the brain’s general principles of prediction and propose that pleasure and desire for sensorimotor synchronization from musical rhythm may be a result of such mechanisms.

  8. Seismic sequences in the Sombrero Seismic Zone

    Science.gov (United States)

    Pulliam, J.; Huerfano, V. A.; ten Brink, U.; von Hillebrandt, C.

    2007-05-01

    The northeastern Caribbean, in the vicinity of Puerto Rico and the Virgin Islands, has a long and well-documented history of devastating earthquakes and tsunamis, including major events in 1670, 1787, 1867, 1916, 1918, and 1943. Recently, seismicity has been concentrated to the north and west of the British Virgin Islands, in the region referred to as the Sombrero Seismic Zone by the Puerto Rico Seismic Network (PRSN). In the combined seismicity catalog maintained by the PRSN, several hundred small to moderate magnitude events can be found in this region prior to 2006. However, beginning in 2006 and continuing to the present, the rate of seismicity in the Sombrero suddenly increased, and a new locus of activity developed to the east of the previous location. Accurate estimates of seismic hazard, and the tsunamigenic potential of seismic events, depend on an accurate and comprehensive understanding of how strain is being accommodated in this corner region. Are faults locked and accumulating strain for release in a major event? Or is strain being released via slip over a diffuse system of faults? A careful analysis of seismicity patterns in the Sombrero region has the potential to both identify faults and modes of failure, provided the aggregation scheme is tuned to properly identify related events. To this end, we experimented with a scheme to identify seismic sequences based on physical and temporal proximity, under the assumptions that (a) events occur on related fault systems as stress is refocused by immediately previous events and (b) such 'stress waves' die out with time, so that two events that occur on the same system within a relatively short time window can be said to have a similar 'trigger' in ways that two nearby events that occurred years apart cannot. Patterns that emerge from the identification, temporal sequence, and refined locations of such sequences of events carry information about stress accommodation that is obscured by large clouds of
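    The sequence-identification scheme sketched in this abstract, grouping events that are close in both space and time so that transitively linked events form one sequence, can be illustrated with a simple single-linkage clustering. The thresholds, field names and toy events below are hypothetical, not the PRSN's actual parameters or catalog.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float  # origin time, days
    x: float  # easting, km
    y: float  # northing, km

def link_sequences(events, d_max=20.0, t_max=30.0):
    """Group events into sequences by transitive space-time proximity:
    two events join the same sequence if one is within d_max km and
    t_max days of any member (single linkage via union-find)."""
    events = sorted(events, key=lambda e: e.t)
    parent = list(range(len(events)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(events)):
        for j in range(i + 1, len(events)):
            if events[j].t - events[i].t > t_max:
                break  # sorted by time: no later event can link to i
            if ((events[i].x - events[j].x) ** 2 +
                    (events[i].y - events[j].y) ** 2) <= d_max ** 2:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(len(events)):
        groups.setdefault(find(i), []).append(events[i])
    return list(groups.values())

evts = [Event(0, 0, 0), Event(5, 3, 4), Event(100, 0, 0)]
seqs = link_sequences(evts)
print(len(seqs))  # 2: the first two events link, the third is isolated
```

    Sorting by time lets the inner loop stop early, which matters when the catalog holds hundreds of events; the time window implements the assumption that "stress waves" die out, so nearby events years apart are not linked.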

  9. MOCASSIN-prot: A multi-objective clustering approach for protein similarity networks

    Science.gov (United States)

    Motivation: Proteins often include multiple conserved domains. Various evolutionary events including duplication and loss of domains, domain shuffling, as well as sequence divergence contribute to generating complexities in protein structures, and consequently, in their functions. The evolutionary h...

  10. Did A Planet Survive A Post-Main Sequence Evolutionary Event?

    Science.gov (United States)

    Sorber, Rebecca; Jang-Condell, Hannah; Zimmerman, Mara

    2018-06-01

    GL 86 is a star system approximately 10 pc away comprising a main-sequence K-type ~0.77 M⊙ star (GL 86A) and a ~0.49 M⊙ white dwarf companion (GL 86B). The system has a ~18.4 AU semi-major axis, an orbital period of ~353 yr, and an eccentricity of ~0.39. A 4.5 MJ planet orbits the main-sequence star with a semi-major axis of 0.113 AU and an orbital period of 15.76 days, in a near-circular orbit with an eccentricity of 0.046. If we assume that this planet formed while the white dwarf was still a main-sequence star, it would have been difficult for the planet to remain in a stable orbit during the post-main-sequence evolution of GL 86B. Planet survival through this post-main-sequence evolution will be examined by modeling with the program Mercury (Chambers 1999). Using the model, we examine the origins of the planet: whether it formed before or after the post-main-sequence evolution of GL 86B. The modeling will give us insight into the dynamical evolution of not only the binary star system but also the planet’s life cycle.

  11. Transcriptome Profiling Using Single-Molecule Direct RNA Sequencing Approach for In-depth Understanding of Genes in Secondary Metabolism Pathways of Camellia sinensis

    Directory of Open Access Journals (Sweden)

    Qingshan Xu

    2017-07-01

    Full Text Available Characteristic secondary metabolites, including flavonoids, theanine and caffeine, are important components of Camellia sinensis, and their biosynthesis has attracted widespread interest. Previous studies on the biosynthesis of these major secondary metabolites using next-generation sequencing technologies limited the accurate prediction of full-length (FL) splice isoforms. Herein, we applied single-molecule sequencing to pooled tea plant tissues, to provide a more complete transcriptome of C. sinensis. Moreover, we identified 94 FL transcripts and four alternative splicing events for enzyme-coding genes involved in the biosynthesis of flavonoids, theanine and caffeine. According to the comparison between long-read isoforms and assembled transcripts, we improved the quality and accuracy of genes sequenced by short-read next-generation sequencing technology. The resulting FL transcripts, together with the improved assembled transcripts and identified alternative splicing events, enhance our understanding of genes involved in the biosynthesis of characteristic secondary metabolites in C. sinensis.

  12. Exercise-Induced Rhabdomyolysis and Stress-Induced Malignant Hyperthermia Events, Association with Malignant Hyperthermia Susceptibility, and RYR1 Gene Sequence Variations

    Directory of Open Access Journals (Sweden)

    Antonella Carsana

    2013-01-01

    Full Text Available Exertional rhabdomyolysis (ER) and stress-induced malignant hyperthermia (MH) events are syndromes that primarily afflict military recruits in basic training and athletes. Events similar to those occurring in ER and in stress-induced MH events are triggered after exposure to anesthetic agents in MH-susceptible (MHS) patients. MH is an autosomal dominant hypermetabolic condition that occurs in genetically predisposed subjects during general anesthesia, induced by commonly used volatile anesthetics and/or the neuromuscular blocking agent succinylcholine. Triggering agents cause altered intracellular calcium regulation. Mutations in the RYR1 gene have been found in about 70% of MH families. The RYR1 gene encodes the skeletal muscle calcium release channel of the sarcoplasmic reticulum, commonly known as ryanodine receptor type 1 (RYR1). The present work reviews the documented cases of ER or of stress-induced MH events in which RYR1 sequence variations, associated, or possibly associated, with MHS status have been identified.

  13. High throughput sequencing and proteomics to identify immunogenic proteins of a new pathogen: the dirty genome approach.

    Directory of Open Access Journals (Sweden)

    Gilbert Greub

    Full Text Available BACKGROUND: With the availability of new generation sequencing technologies, bacterial genome projects have undergone a major boost. Still, chromosome completion needs a costly and time-consuming gap closure, especially when containing highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, lack of release of genome data during gap closure stage is clearly medically counterproductive. METHODS/PRINCIPAL FINDINGS: We thus investigated the feasibility of a dirty genome approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae was retrieved even with relatively short reads from Genome Sequencer 20 and Solexa. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to elaborate the first steps of an ELISA. CONCLUSIONS/SIGNIFICANCE: This work constitutes the proof of principle for a dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics to rapidly identify new immunogenic proteins useful to develop in the future specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty genome sequencing/proteomic approach may be used for any pathogen for which better diagnostics are needed. These genome sequences may also be very useful to develop DNA based diagnostic tests. All these diagnostic tools will allow further evaluations of the pathogenic potential of this obligate intracellular bacterium.

  14. Coevolutionary constraints in the sequence-space of macromolecular complexes reflect their self-assembly pathways.

    Science.gov (United States)

    Mallik, Saurav; Kundu, Sudip

    2017-07-01

    Is the order in which biomolecular subunits self-assemble into functional macromolecular complexes imprinted in their sequence-space? Here, we demonstrate that the temporal order of macromolecular complex self-assembly can be efficiently captured using the landscape of residue-level coevolutionary constraints. This predictive power of coevolutionary constraints is irrespective of the structural, functional, and phylogenetic classification of the complex and of the stoichiometry and quaternary arrangement of the constituent monomers. Combining this result with a number of structural attributes estimated from the crystal structure data, we find indications that stronger coevolutionary constraints at interfaces formed early in the assembly hierarchy probably promote coordinated fixation of mutations that leads to high-affinity binding with higher surface area, increased surface complementarity and elevated number of molecular contacts, compared to those that form late in the assembly. Proteins 2017; 85:1183-1189. © 2017 Wiley Periodicals, Inc.

  15. Structural Approaches to Sequence Evolution Molecules, Networks, Populations

    CERN Document Server

    Bastolla, Ugo; Roman, H. Eduardo; Vendruscolo, Michele

    2007-01-01

    Structural requirements constrain the evolution of biological entities at all levels, from macromolecules to their networks, right up to populations of biological organisms. Classical models of molecular evolution, however, are focused at the level of the symbols - the biological sequence - rather than that of their resulting structure. Now recent advances in understanding the thermodynamics of macromolecules, the topological properties of gene networks, the organization and mutation capabilities of genomes, and the structure of populations make it possible to incorporate these key elements into a broader and deeply interdisciplinary view of molecular evolution. This book gives an account of such a new approach, through clear tutorial contributions by leading scientists specializing in the different fields involved.

  16. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    Science.gov (United States)

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of seventeen of these objects is represented with more than 80 site-specific parameters, with about 22 that are time-dependent and result in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding in the more complex modelling approaches by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown to be capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions. 
In light of
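
    To illustrate the kind of bounding calculation a stylised biosphere model supplies, here is a minimal single-compartment sketch; the equation form and all parameter values are illustrative and are not taken from SKB's SR-Site models.

    ```python
    def compartment_inventory(release_rate, loss_rate, years, dt=1.0):
        """Forward-Euler integration of dN/dt = release_rate - loss_rate * N
        for a single well-mixed ecosystem compartment.

        Units are illustrative: release_rate in Bq/yr, loss_rate in 1/yr.
        """
        n = 0.0
        for _ in range(int(years / dt)):
            n += (release_rate - loss_rate * n) * dt
        return n

    # The inventory approaches the steady state release_rate / loss_rate
    # (here 10.0 / 0.01 = 1000 Bq), a bounding value for any more complex
    # model that adds further loss pathways.
    inventory = compartment_inventory(release_rate=10.0, loss_rate=0.01, years=1000)
    ```

    A stylised bound of this sort gives a reasonableness check against which a many-parameter landscape model can be compared.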

  17. A practical approach to screen for authorised and unauthorised genetically modified plants.

    Science.gov (United States)

    Waiblinger, Hans-Ulrich; Grohmann, Lutz; Mankertz, Joachim; Engelbert, Dirk; Pietsch, Klaus

    2010-03-01

    In routine analysis, screening methods based on real-time PCR are most commonly used for the detection of genetically modified (GM) plant material in food and feed. In this paper, it is shown that the combination of five DNA target sequences can be used as a universal screening approach for at least 81 GM plant events authorised or unauthorised for placing on the market and described in publicly available databases. Except for maize event LY038, soybean events DP-305423 and BPS-CV127-9 and cotton event 281-24-236 x 3006-210-23, at least one of the five genetic elements has been inserted in these GM plants and is targeted by this screening approach. For the detection of these sequences, fully validated real-time PCR methods have been selected. A screening table is presented that describes the presence or absence of the target sequences for most of the listed GM plants. These data have been verified either theoretically according to available databases or experimentally using available reference materials. The screening table will be updated regularly by a network of German enforcement laboratories.
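
    As a sketch of how such a screening table is used downstream, the following hypothetical Python fragment encodes a few events against illustrative screening elements and excludes events whose covered elements all tested negative. The element-to-event assignments here are placeholders, not the validated table from the paper.

    ```python
    # Hypothetical screening table: GM event -> set of screening elements it
    # contains. An empty set marks an event outside the screen's scope
    # (the abstract names LY038 as such an exception).
    SCREENING_TABLE = {
        "MON810": {"P-35S", "hsp70-intron"},
        "Bt11":   {"P-35S", "T-nos", "pat"},
        "GT73":   {"P-FMV", "ctp2-cp4epsps"},
        "LY038":  set(),
    }

    def candidate_events(pcr_results, table=SCREENING_TABLE):
        """Return events that cannot be excluded by the screening results.

        pcr_results maps element name -> True (detected) / False.
        An event remains a candidate if at least one of its listed elements
        was detected; uncovered events can never be excluded by the screen.
        """
        candidates = []
        for event, elements in table.items():
            if not elements:
                candidates.append(event)          # outside the screen's scope
            elif any(pcr_results.get(e, False) for e in elements):
                candidates.append(event)
        return candidates

    candidates = candidate_events({"P-35S": True, "T-nos": False, "P-FMV": False})
    ```

    With only P-35S detected, MON810 and Bt11 remain candidates for event-specific confirmation, GT73 is excluded, and LY038 stays listed because the screen does not cover it.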

  18. Four Essays on Family Life Events

    DEFF Research Database (Denmark)

    Loft, Lisbeth Trille Gylling

    As demographic and social trends continue to change the institution of the family, a need to reconsider the study of family life events as they unfold over the life course has emerged. To advance current knowledge of social dynamics associated with this new complexity, the point of departure of the present thesis is the way in which individual, social, and institutional contexts shape family life events. The main objective of the present thesis is twofold: to highlight the importance of how family life events are theoretically understood and methodologically approached, and to examine why social differentiation in family life events persists across institutional settings and over time. Specifically, from a life course perspective and by means of dynamic quantitative methods, three central themes are investigated: a) the importance of children’s characteristics, b) the need to link family contexts......

  19. A DEA Approach for Selecting a Bundle of Tickets for Performing Arts Events

    DEFF Research Database (Denmark)

    Baldin, Andrea

    2017-01-01

    tackle the issue of identifying the most efficient subset of the events scheduled to offer as a bundle. We formulate this problem following the choice-based network Revenue Management approach. Assuming the price as fixed on two types of events, lowbrow and highbrow, proposed by the theatre, the purchase decision is modelled on the basis of two random variables: the available time and the reservation price per performance. The super-efficiency DEA model will be implemented in order to find the most efficient combination of events to be bundled, defined as the one that offers the most favourable trade...

  20. An efficient, versatile and scalable pattern growth approach to mine frequent patterns in unaligned protein sequences.

    Science.gov (United States)

    Ye, Kai; Kosters, Walter A; Ijzerman, Adriaan P

    2007-03-15

    Pattern discovery in protein sequences is often based on multiple sequence alignments (MSA). The procedure can be computationally intensive and often requires manual adjustment, which may be particularly difficult for a set of deviating sequences. In contrast, two algorithms, PRATT2 (http://www.ebi.ac.uk/pratt/) and TEIRESIAS (http://cbcsrv.watson.ibm.com/), are used to directly identify frequent patterns from unaligned biological sequences without an attempt to align them. Here we propose a new algorithm with more efficiency and more functionality than both PRATT2 and TEIRESIAS, and discuss some of its applications to G protein-coupled receptors, a protein family of important drug targets. In this study, we designed and implemented six algorithms to mine three different pattern types from either one or two datasets using a pattern growth approach. We compared our approach to PRATT2 and TEIRESIAS in efficiency, completeness and the diversity of pattern types. Compared to PRATT2, our approach is faster, capable of processing large datasets and able to identify the so-called type III patterns. Our approach is comparable to TEIRESIAS in the discovery of the so-called type I patterns but has additional functionality such as mining the so-called type II and type III patterns and finding discriminating patterns between two datasets. The source code for pattern growth algorithms and their pseudo-code are available at http://www.liacs.nl/home/kosters/pg/.
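
    The exact-substring core of a pattern growth search can be sketched in a few lines: start from the empty pattern, extend one residue at a time, and prune any branch whose support falls below the threshold. This is only an illustration of the growth-and-prune idea under simplifying assumptions, not the authors' algorithm, which also handles wildcards and the type II/III pattern classes.

    ```python
    def frequent_patterns(seqs, min_support=2, max_len=5):
        """Depth-first pattern growth over unaligned sequences.

        Support of a pattern = number of sequences containing it as an
        exact substring. A branch is abandoned as soon as its support
        drops below min_support, since extensions can only lose support.
        """
        alphabet = sorted(set("".join(seqs)))
        results = {}

        def support(p):
            return sum(p in s for s in seqs)

        def grow(pattern):
            for ch in alphabet:
                cand = pattern + ch
                sup = support(cand)
                if sup >= min_support:
                    results[cand] = sup
                    if len(cand) < max_len:
                        grow(cand)   # extend only surviving branches

        grow("")
        return results

    pats = frequent_patterns(["ACDEF", "ACDQF", "ACDEG"], min_support=2)
    ```

    On these three toy sequences the motif "ACD" survives with support 3 and "ACDE" with support 2, while "ACDQ" is pruned; the same anti-monotonicity argument is what makes growth approaches scale to large datasets.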

  1. Genetic plasticity of the Shigella virulence plasmid is mediated by intra- and inter-molecular events between insertion sequences.

    Science.gov (United States)

    Pilla, Giulia; McVicker, Gareth; Tang, Christoph M

    2017-09-01

    Acquisition of a single copy, large virulence plasmid, pINV, led to the emergence of Shigella spp. from Escherichia coli. The plasmid encodes a Type III secretion system (T3SS) on a 30 kb pathogenicity island (PAI), and is maintained in a bacterial population through a series of toxin:antitoxin (TA) systems which mediate post-segregational killing (PSK). The T3SS imposes a significant cost on the bacterium, and strains which have lost the plasmid and/or genes encoding the T3SS grow faster than wild-type strains in the laboratory, and fail to bind the indicator dye Congo Red (CR). Our aim was to define the molecular events in Shigella flexneri that cause loss of Type III secretion (T3S), and to examine whether TA systems exert positional effects on pINV. During growth at 37°C, we found that deletions of regions of the plasmid including the PAI lead to the emergence of CR-negative colonies; deletions occur through intra-molecular recombination events between insertion sequences (ISs) flanking the PAI. Furthermore, by repositioning MvpAT (which belongs to the VapBC family of TA systems) near the PAI, we demonstrate that the location of this TA system alters the rearrangements that lead to loss of T3S, indicating that MvpAT acts both globally (by reducing loss of pINV through PSK) as well as locally (by preventing loss of adjacent sequences). During growth at environmental temperatures, we show for the first time that pINV spontaneously integrates into different sites in the chromosome, and this is mediated by inter-molecular events involving IS1294. Integration leads to reduced PAI gene expression and impaired secretion through the T3SS, while excision of pINV from the chromosome restores T3SS function. Therefore, pINV integration provides a reversible mechanism for Shigella to circumvent the metabolic burden imposed by pINV. Intra- and inter-molecular events between ISs, which are abundant in Shigella spp., mediate plasticity of S. flexneri pINV.

  2. Current understanding of the sequence of events. Overview of current understanding of accident progression at Fukushima Dai-ichi

    International Nuclear Information System (INIS)

    Gulliford, Jim

    2013-01-01

    An overview of the main sequence of events, particularly the evolution of the cores in Units 1-3, was given. The presentation is based on information provided by Dr Okajima of JAEA to the June 2012 Nuclear Science Committee meeting. During the accident, conditions at the plant were such that operators were initially unable to obtain instrument readouts from the control panel and hence could not know what condition the reactors were in (reactor power, pressure, temperature, water level and flow rate, etc.). Subsequently, as electrical power supplies were gradually restored, more data became available. In addition to the reactor data, other information from off-site measurements and from measuring stations inside the site boundary is now available, particularly for radiation dose rates in air. These types of information, combined with detailed knowledge of the plant design and operations history up to the time of the accident, are being used to construct detailed computer models which simulate the behaviour of the reactor core, pressure vessel and containment during the accident sequence. This combination of detailed design/operating data, limited measured data during the accident and computer modelling allows us to construct a fairly clear picture of the accident progression. The main sequence of events (common to Units 1, 2 and 3) is summarised. The OECD/NEA is currently coordinating an international benchmark study of the accident at Fukushima Daiichi known as the BSAF Project. The objectives of this activity are to analyse and evaluate the accident progression and improve severe accident (SA) analysis methods and models. The project provides valuable additional (and corrected) data from plant measurements as well as an improved understanding of the role played by the fuel and cladding design. Based on (limited) plant data and extensive modelling analysis, we have a detailed qualitative description of the Fukushima-Daiichi accident. Further analyses of the type

  3. Event-related frontal alpha asymmetries: electrophysiological correlates of approach motivation.

    Science.gov (United States)

    Schöne, Benjamin; Schomberg, Jessica; Gruber, Thomas; Quirin, Markus

    2016-02-01

    Over the last decades, frontal alpha asymmetries observed during resting state periods of several minutes have been used as a marker of affective-motivational states. To date, there is no evidence that alpha asymmetries can be observed in response to brief affective-motivational stimuli, as typically presented in event-related designs. As we argue, frontal alpha asymmetry might indeed be elicited by brief events if they are salient enough. In an event-related design, we used erotic pictures, i.e., highly salient incentives to elicit approach motivation, and contrasted them with pictures of dressed attractive women. As expected, we found significant alpha asymmetries for erotic pictures as compared to control pictures. Our findings suggest that the highly reactive reward system can lead to immediate, phasic changes in frontal alpha asymmetries. We discuss the findings with respect to the notion that the high salience of erotic pictures derives from their potential of satisfying an individual's need by mere visual inspection, which is not the case for pictures showing other types of motivational stimuli such as food.

  4. Assessment of Sustainability of Sports Events (Slovenia

    Directory of Open Access Journals (Sweden)

    Aleksandra Golob

    2015-04-01

    Full Text Available The events industry plays an important role in today's economy and can have a substantial impact on business success; in addition, sustainability-oriented tourism is becoming an important component of the development and planning of a tourist destination. Thus, organizing sustainability-oriented events is crucial and should focus on zero-waste event management and consider as many elements of sustainable development as possible. The same holds for organizing sports events. The aim of this paper was to find out to what level the organizers of existing sports events in Slovenia take into account different domains of sustainable development. Answering a common questionnaire, the organizers gave us feedback on four main areas: environmental, social, cultural, and economic criteria. The plan was to determine the level of sustainability of three sports events and compare them to each other according to the outstanding areas, as well as to draw attention to the importance of organizing sustainability-oriented sports events and minimizing their negative effects. Since the field of research is complex, dynamic, and interdisciplinary in character, the results were attained using the DEX software, which supports a qualitative approach and allows the modelling of complex decision-making processes with a large number of parameters and alternatives. Such a methodology enables the input of a preliminary set of sustainability criteria and can be used as support when deciding on the evaluation of the sustainability of events in general.

  5. Accidental sequences associated with the containment of the pressurized water nuclear installation - INAP

    International Nuclear Information System (INIS)

    Natacci, Faustina Beatriz; Correa, Francisco

    2002-01-01

    The analysis of accidental sequences associated with the Containment is one of the most important tasks during the development of the Probabilistic Safety Assessment (PSA) of nuclear plants, mainly because of its importance for the mitigation of the consequences of severe postulated accident initiating events. This paper presents a first approach to the Containment analysis of the INAP, identifying failures and events that can compromise its performance, and outlining accidental sequences and Containment end states. The initial plant damage states, which are the input for this study, are based on the event trees developed in the PSA level 1 for the INAP. It should be emphasized that since this PSA is still in a preliminary stage it is subject to further completion. Consequently, the Containment analysis shall also be revised in order to incorporate, as completely as possible, all initial plant damage states, the corresponding event trees, and the related Containment end states. Finally, it can be concluded that the qualitative analysis presented herein allows a concise and broad knowledge of the development of accidental sequences related to the Containment of the INAP. (author)

  6. Nature and provenance of the Beishan Complex, southernmost Central Asian Orogenic Belt

    Science.gov (United States)

    Zheng, Rongguo; Li, Jinyi; Xiao, Wenjiao; Zhang, Jin

    2018-03-01

    The ages and origins of metasedimentary rocks, which were previously mapped as Precambrian, are critical for rebuilding the orogenic process and better understanding the Phanerozoic continental growth in the Central Asian Orogenic Belt (CAOB). The Beishan Complex is widely distributed in the southern Beishan Orogenic Collage, southernmost CAOB, and its age and tectonic affinity are still controversial. The Beishan Complex was previously proposed as fragments drifted from the Tarim Craton, a Neoproterozoic block, or a Phanerozoic accretionary complex. In this study, we employ detrital zircon age spectra to constrain the ages and provenances of metasedimentary sequences of the Beishan Complex in the Chuanshanxun area. The metasedimentary rocks here are dominated by zircons with Paleoproterozoic-Mesoproterozoic ages (ca. 1160-2070 Ma), and yield two peak ages at ca. 1454 and 1760 Ma. One sample yielded a middle Permian peak age (269 Ma), which suggests that the metasedimentary sequences were deposited in the late Paleozoic. The granitoid and dioritic dykes, intruding into the metasedimentary sequences, exhibit zircon U-Pb ages of 268 and 261 Ma, respectively, which constrain the minimum depositional age of the metasedimentary sequences. Zircon U-Pb ages of amphibolite (274 and 216 Ma) indicate that they might have been affected by multi-stage metamorphic events. The Beishan Complex was not a fragment drifted from the Tarim Block or Dunhuang Block, and none of the cratons or blocks surrounding the Beishan Orogenic Collage was the sole material source of the Beishan Complex, given their clearly different age spectra. Instead, ca. 1.4 Ga marginal accretionary zones of the Columbia supercontinent might have existed in the southern CAOB, and may have provided the main source materials for the sedimentary sequences in the Beishan Complex.

  7. Analysis of the Steam Generator Tubes Rupture Initiating Event

    International Nuclear Information System (INIS)

    Trillo, A.; Minguez, E.; Munoz, R.; Melendez, E.; Sanchez-Perea, M.; Izquierd, J.M.

    1998-01-01

    In PSA studies, Event Tree-Fault Tree techniques are used to analyse the consequences associated with the evolution of an initiating event. The Event Tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the time allowed for operator actions, and does not play a central role in ET validation. The simulation of the sequence evolution can instead be performed using standard tools, helping the analyst obtain a more realistic ET. Long-existing methods and tools can be used to automate the construction of the event tree associated with a given initiator. These methods automatically construct the ET by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. Then, the sequences with and without the failure are followed. The outcome of all this is a Dynamic Event Tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the Event Tree. As in the static Event Tree, each time a system is demanded, two branches are opened: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)
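
    The branching rule described above (two branches opened at every system demand) can be sketched as a simple enumeration. The system names are illustrative generic PWR abbreviations, and the physics, branch timing and simulator coupling handled by DYLAM are omitted.

    ```python
    from itertools import product

    def dynamic_event_tree(systems):
        """Enumerate all sequences of a fixed demand order: at each
        demanded system the tree opens a success and a failure branch.

        Returns a list of sequences, each a list of (system, outcome)
        pairs; with n systems this yields 2**n end states.
        """
        sequences = []
        for outcome in product((True, False), repeat=len(systems)):
            seq = [(s, "success" if ok else "failure")
                   for s, ok in zip(systems, outcome)]
            sequences.append(seq)
        return sequences

    # Hypothetical demand order for an SGTR-like sequence.
    tree = dynamic_event_tree(["HPI", "AFW", "RHR"])
    ```

    In a real Dynamic Event Tree, each branch is followed by the plant simulator until the next demand arises, so the demand order itself depends on prior outcomes rather than being fixed as in this sketch.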

  8. Physical approach to complex systems

    Science.gov (United States)

    Kwapień, Jarosław; Drożdż, Stanisław

    2012-06-01

    Typically, complex systems are natural or social systems which consist of a large number of nonlinearly interacting elements. These systems are open, they interchange information or mass with the environment and constantly modify their internal structure and patterns of activity in the process of self-organization. As a result, they are flexible and easily adapt to variable external conditions. However, the most striking property of such systems is the existence of emergent phenomena which cannot be simply derived or predicted solely from the knowledge of the systems’ structure and the interactions among their individual elements. This property points to holistic approaches which require giving parallel descriptions of the same system on different levels of its organization. There is strong evidence, consolidated also in the present review, that different, even apparently disparate complex systems can have astonishingly similar characteristics both in their structure and in their behaviour. One can thus expect the existence of some common, universal laws that govern their properties. Physics methodology proves helpful in addressing many of the related issues. In this review, we advocate some of the computational methods which in our opinion are especially fruitful in extracting information on selected, but at the same time most representative, complex systems like the human brain, financial markets and natural language, from the time series representing the observables associated with these systems. The properties we focus on comprise the collective effects and their coexistence with noise, long-range interactions, the interplay between determinism and flexibility in evolution, scale invariance, criticality, multifractality and hierarchical structure. The methods described either originate from "hard" physics, like the random matrix theory, and were then transmitted to other fields of science via the field of complex systems research, or they originated elsewhere but

  9. Addressing complex challenges using a co-innovation approach

    NARCIS (Netherlands)

    Vereijssen, Jessica; Srinivasan, M.S.; Dirks, Sarah; Fielke, Simon; Jongmans, C.T.; Agnew, Natasha; Klerkx, Laurens; Pinxterhuis, Ina; Moore, John; Edwards, Paul; Brazendale, Rob; Botha, Neels; Turner, James A.

    2017-01-01

    Co-innovation can be effective for complex challenges – involving interactions amongst multiple stakeholders, viewpoints, perceptions, practices and interests across programmes, sectors and national systems. Approaches to challenges in the primary sector have tended to be linear, where tools and

  10. Power System Event Ranking Using a New Linear Parameter-Varying Modeling with a Wide Area Measurement System-Based Approach

    Directory of Open Access Journals (Sweden)

    Mohammad Bagher Abolhasani Jabali

    2017-07-01

    Detecting critical power system events for Dynamic Security Assessment (DSA) is required for reliability improvement. The approach proposed in this paper investigates the effects of events on dynamic behavior during the nonlinear system response, while common approaches use steady-state conditions after events. This paper presents some new and enhanced indices for event ranking based on time-domain simulation and polytopic linear parameter-varying (LPV) modeling of a power system. In the proposed approach, a polytopic LPV representation is generated via linearization about some points of the nonlinear dynamic behavior of the power system using wide-area measurement system (WAMS) concepts, and then event ranking is done based on the frequency response of the system models on the vertices. Therefore, the nonlinear behaviors of the system at the time of fault occurrence are considered for event ranking. The proposed algorithm is applied to a power system using nonlinear simulation. The comparison of the results, especially in different fault conditions, shows the advantages of the proposed approach and indices.

  11. ["Re-evaluation upon suspected event" is an approach for post-marketing clinical study: lessons from adverse drug events related to Bupleuri Radix preparations].

    Science.gov (United States)

    Wu, Shu-Xin; Sun, Hong-Feng; Yang, Xiao-Hui; Long, Hong-Zhu; Ye, Zu-Guang; Ji, Shao-Liang; Zhang, Li

    2014-08-01

    We revisited the "Xiao Chaihu Decoction event" (XCHDE) that occurred in Japan in the late 1980s and the Bupleuri Radix-related adverse drug reaction (ADR) reports in China. After careful review, comparison, analysis and evaluation, we think the interstitial pneumonitis, drug-induced liver injury (DILI) and other severe adverse drug events (ADEs), including deaths, that happened in Japan probably resulted from multiple factors, including the combined use of XCHD with interferon, Kampo usage under the guidance of modern medicine theory, and use of XCHD on the basis of disease diagnosis instead of traditional Chinese syndrome complex differentiation. There are fewer ADE case reports related to XCHD preparations in China than in Japan, mostly manifesting as skin hypersensitivity responses and profuse perspiration. The symptoms of Radix Bupleuri injection-related ADEs are mainly hypersensitivity-like responses; two cases in which intravenous infusion was used instead of intramuscular injection developed hypokalemia and renal failure, and one case died from severe hypersensitivity shock. In the Chinese literature, there is no report of the interstitial pneumonitis and DILI associated with XCHD in Japan. So far, no voluntary monitoring data or large-sample clinical research data are available. The authors elaborate the classification of "re-evaluation" and clarify that "re-evaluation upon events" includes the response to suspected safety and efficacy events. Based on the current status of clinical research on Radix Bupleuri preparations, the authors point out that post-marketing "re-evaluation upon suspected event" is not only a necessity for the continuous evaluation of the safety and efficacy of drugs, but also a necessity for providing objective clinical research data to share with international and domestic drug administrations in risk-benefit evaluation.
It is also the unavoidable pathway to culture and push the excellent species and famous brands of TCM to the international market, in

  12. Comparative transcriptome analysis within the Lolium/Festuca species complex reveals high sequence conservation

    DEFF Research Database (Denmark)

    Czaban, Adrian; Sharma, Sapna; Byrne, Stephen

    2015-01-01

    species from the Lolium-Festuca complex, ranging from 52,166 to 72,133 transcripts per assembly. We have also predicted a set of proteins and validated it with a high-confidence protein database from three closely related species (H. vulgare, B. distachyon and O. sativa). We have obtained gene family...... clusters for the four species using OrthoMCL and analyzed their inferred phylogenetic relationships. Our results indicate that VRN2 is a candidate gene for differentiating vernalization and non-vernalization types in the Lolium-Festuca complex. Grouping of the gene families based on their BLAST identity...... enabled us to divide ortholog groups into those that are very conserved and those that are more evolutionarily relaxed. The ratio of non-synonymous to synonymous substitutions enabled us to pinpoint protein sequences evolving in response to positive selection. These proteins may explain some...

  13. Transcription-based model for the induction of chromosomal exchange events by ionising radiation

    International Nuclear Information System (INIS)

    Radford, I.A.

    2003-01-01

    The mechanistic basis for chromosomal aberration formation, following exposure of mammalian cells to ionising radiation, has long been debated. Although chromosomal aberrations are probably initiated by DNA double-strand breaks (DSB), little is understood about the mechanisms that generate and modulate DNA rearrangement. Based on results from our laboratory and data from the literature, a novel model of chromosomal aberration formation has been suggested (Radford 2002). The basic postulates of this model are that: (1) DSB, primarily those involving multiple individual damage sites (i.e. complex DSB), are the critical initiating lesion; (2) only those DSB occurring in transcription units that are associated with transcription 'factories' (complexes containing multiple transcription units) induce chromosomal exchange events; (3) such DSB are brought into contact with a DNA topoisomerase I molecule through RNA polymerase II catalysed transcription and give rise to trapped DNA-topo I cleavage complexes; and (4) trapped complexes interact with another topo I molecule on a temporarily inactive transcription unit at the same transcription factory leading to DNA cleavage and subsequent strand exchange between the cleavage complexes. We have developed a method using inverse PCR that allows the detection and sequencing of putative ionising radiation-induced DNA rearrangements involving different regions of the human genome (Forrester and Radford 1998). The sequences detected by inverse PCR can provide a test of the prediction of the transcription-based model that ionising radiation-induced DNA rearrangements occur between sequences in active transcription units. Accordingly, reverse transcriptase PCR was used to determine if sequences involved in rearrangements were transcribed in the test cells. Consistent with the transcription-based model, nearly all of the sequences examined gave a positive result to reverse transcriptase PCR (Forrester and Radford unpublished)

  14. Forward Genetics by Sequencing EMS Variation-Induced Inbred Lines

    Directory of Open Access Journals (Sweden)

    Charles Addo-Quaye

    2017-02-01

    In order to leverage novel sequencing techniques for cloning genes in eukaryotic organisms with complex genomes, the false positive rate of variant discovery must be controlled for by experimental design and informatics. We sequenced five lines from three pedigrees of ethyl methanesulfonate (EMS)-mutagenized Sorghum bicolor, including a pedigree segregating a recessive dwarf mutant. Comparing the sequences of the lines, we were able to identify and eliminate error-prone positions. One genomic region contained EMS mutant alleles in dwarfs that were homozygous reference sequences in wild-type siblings and heterozygous in segregating families. This region contained a single nonsynonymous change that cosegregated with dwarfism in a validation population and caused a premature stop codon in the Sorghum ortholog encoding the gibberellic acid (GA) biosynthetic enzyme ent-kaurene oxidase. Application of exogenous GA rescued the mutant phenotype. Our method for mapping did not require outcrossing and introduced no segregation variance. This enables work when line crossing is complicated by life history, permitting gene discovery outside of genetic models. This inverts the historical approach of first using recombination to define a locus and then sequencing genes. Our formally identical approach first sequences all the genes and then seeks cosegregation with the trait. Mutagenized lines lacking obvious phenotypic alterations are available for an extension of this approach: mapping with a known marker set in a line that is phenotypically identical to the starting material for EMS mutant generation.
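
    The cosegregation filter described above reduces to a simple genotype predicate per candidate site; the genotype encoding and positions below are illustrative, not from the Sorghum data.

    ```python
    def cosegregating_sites(calls):
        """Keep variants homozygous for the EMS allele in dwarf mutants,
        homozygous reference in wild-type siblings and heterozygous in
        segregating families.

        calls maps position -> (dwarf, wildtype, segregating) genotypes,
        encoded 0 = hom-ref, 1 = het, 2 = hom-alt (illustrative encoding).
        """
        return [pos for pos, (dwarf, wt, seg) in calls.items()
                if dwarf == 2 and wt == 0 and seg == 1]

    calls = {
        101_500: (2, 0, 1),   # cosegregates with dwarfism -> candidate
        205_700: (2, 2, 2),   # identical in all lines: error-prone position
        330_900: (1, 0, 1),   # dwarf not homozygous -> discard
    }
    candidates = cosegregating_sites(calls)
    ```

    Sites like 205_700, shared across all lines, are exactly the error-prone positions that comparing multiple pedigrees lets one eliminate before seeking the causal change.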

  15. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, Hans; Michielsen, Kristel

    A discrete-event simulation approach is reviewed that does not require the knowledge of the solution of the wave equation of the whole system, yet reproduces the statistical distributions of wave theory by generating detection events one-by-one. The simulation approach is illustrated by applications

  16. A bioinformatics approach for identifying transgene insertion sites using whole genome sequencing data.

    Science.gov (United States)

    Park, Doori; Park, Su-Hyun; Ban, Yong Wook; Kim, Youn Shic; Park, Kyoung-Cheul; Kim, Nam-Soo; Kim, Ju-Kon; Choi, Ik-Young

    2017-08-15

    Genetically modified crops (GM crops) have been developed to improve the agricultural traits of modern crop cultivars. Safety assessments of GM crops are of paramount importance in research at developmental stages and before releasing transgenic plants into the marketplace. Sequencing technology is developing rapidly, with higher output and labor efficiencies, and will eventually replace existing methods for the molecular characterization of genetically modified organisms. To detect the transgene insertion locations in the three GM rice genomes, Illumina sequencing reads were mapped to both the rice genome and the plasmid sequence, and the mapped reads were then classified to characterize the junction sites between plant and transgene sequences by sequence alignment. Herein, we present a next generation sequencing (NGS)-based molecular characterization method, using transgenic rice plants SNU-Bt9-5, SNU-Bt9-30, and SNU-Bt9-109. Specifically, using bioinformatics tools, we detected the precise insertion locations and copy numbers of transfer DNA, genetic rearrangements, and the absence of backbone sequences, results which were equivalent to those obtained from Southern blot analyses. NGS methods have been suggested as an effective means of characterizing and detecting transgene insertion locations in genomes. Our results demonstrate the use of a combination of NGS technology and bioinformatics approaches that offers cost- and time-effective methods for assessing the safety of transgenic plants.
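
    The junction-read classification described above can be sketched as follows; the input format (read name mapped to the set of references it aligns to) is a deliberate simplification of real split-read alignments in a BAM file.

    ```python
    def find_junction_reads(alignments):
        """Return reads whose alignments span both the host genome and the
        transformation plasmid; such chimeric reads mark T-DNA insertion
        junctions.

        alignments maps read name -> set of reference names it maps to
        (a simplified stand-in for per-read alignment records).
        """
        return [read for read, refs in alignments.items()
                if "genome" in refs and "plasmid" in refs]

    aln = {
        "read1": {"genome"},             # ordinary genomic read
        "read2": {"genome", "plasmid"},  # chimeric: candidate junction
        "read3": {"plasmid"},            # fully within the insert
    }
    junctions = find_junction_reads(aln)
    ```

    In practice the genomic coordinates of such chimeric reads are then clustered to call the precise insertion site and to count T-DNA copies.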

  17. Behavioral Event Management: a proposal for applying the behavioral perspective to the design and organization of cultural events

    Directory of Open Access Journals (Sweden)

    Federico Brunetti

    2016-06-01

    In this paper we try to apply the main results of behavioral research to the specific field of event management. The objective of this work is therefore to identify what the behavioral approach can suggest to improve effectiveness in the management of cultural events. The underlying methodological approach followed in this paper is deductive. Within this frame, the orientation is prescriptive and aims to identify patterns of behavior to be suggested to event organizers. Following this methodology, several behavioral insights have been identified: the peak-end rule; the sequencing of pain and pleasure; anticipation; building commitment through choice; rituals; and experienced versus remembered happiness. The present work appears to be one of the first applications of the behavioral approach to cultural event management. The behavioral perspective forces us to think about an event from an attendee’s point of view, in the sense that it takes into consideration the attendee’s psychological dimension, the “experience” that the event is creating in his or her memory. Those engaged in the organization of events will thus benefit from useful knowledge to optimize the effects of their actions.

  18. Programming molecular self-assembly of intrinsically disordered proteins containing sequences of low complexity

    Science.gov (United States)

    Simon, Joseph R.; Carroll, Nick J.; Rubinstein, Michael; Chilkoti, Ashutosh; López, Gabriel P.

    2017-06-01

    Dynamic protein-rich intracellular structures that contain phase-separated intrinsically disordered proteins (IDPs) composed of sequences of low complexity (SLC) have been shown to serve a variety of important cellular functions, which include signalling, compartmentalization and stabilization. However, our understanding of these structures and our ability to synthesize models of them have been limited. We present design rules for IDPs possessing SLCs that phase separate into diverse assemblies within droplet microenvironments. Using theoretical analyses, we interpret the phase behaviour of archetypal IDP sequences and demonstrate the rational design of a vast library of multicomponent protein-rich structures that ranges from uniform nano-, meso- and microscale puncta (distinct protein droplets) to multilayered orthogonally phase-separated granular structures. The ability to predict and program IDP-rich assemblies in this fashion offers new insights into (1) genetic-to-molecular-to-macroscale relationships that encode hierarchical IDP assemblies, (2) design rules of such assemblies in cell biology and (3) molecular-level engineering of self-assembled recombinant IDP-rich materials.

  19. Experimentation on accuracy of non functional requirement prioritization approaches for different complexity projects

    Directory of Open Access Journals (Sweden)

    Raj Kumar Chopra

    2016-09-01

    Non-functional requirements must be selected for implementation together with functional requirements to enhance the success of software projects. Three approaches exist for prioritizing non-functional requirements with a suitable prioritization technique. This paper reports experiments on three versions, of differing complexity, of an industrial software project, applying the cost-value prioritization technique under each of the three approaches. The experiments analyze the accuracy of each approach and how that accuracy varies with the complexity of the software project. The results indicate that selecting non-functional requirements separately, but in accordance with functionality, has the highest accuracy of the three approaches. Like the other approaches, its accuracy decreases as software complexity increases, but the decrease is minimal.
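Cost-value prioritization, the technique named above, ranks each requirement by its relative value divided by its relative cost. A minimal sketch, with invented requirement names and value/cost figures (none of these numbers come from the paper):

```python
# Sketch of cost-value prioritization: requirements are ranked by their
# relative value divided by relative cost. Names and figures are
# illustrative, not data from the study.

def cost_value_rank(requirements):
    """Return (name, value/cost ratio) pairs sorted highest-priority first."""
    total_value = sum(v for _, v, _ in requirements)
    total_cost = sum(c for _, _, c in requirements)
    ranked = []
    for name, value, cost in requirements:
        rel_value = value / total_value
        rel_cost = cost / total_cost
        ranked.append((name, rel_value / rel_cost))
    return sorted(ranked, key=lambda item: item[1], reverse=True)

reqs = [
    ("availability", 50, 10),  # (name, estimated value, estimated cost)
    ("security", 30, 30),
    ("usability", 20, 60),
]
priorities = cost_value_rank(reqs)
```

High-ratio requirements deliver the most value per unit of implementation cost and are selected first.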

  20. Using Behavior Sequence Analysis to Map Serial Killers' Life Histories.

    Science.gov (United States)

    Keatley, David A; Golightly, Hayley; Shephard, Rebecca; Yaksic, Enzo; Reid, Sasha

    2018-03-01

    The aim of the current research was to provide a novel method for mapping the developmental sequences of serial killers' life histories. An in-depth biographical account of serial killers' lives, from birth through to conviction, was gained and analyzed using Behavior Sequence Analysis. The analyses highlight similarities in behavioral events across the serial killers' lives, indicating not only which risk factors occur, but the temporal order of these factors. Results focused on early childhood environment, indicating the role of parental abuse; behaviors and events surrounding criminal histories of serial killers, showing that many had previous convictions and were known to police for other crimes; behaviors surrounding their murders, highlighting differences in victim choice and modus operandi; and, finally, trial pleas and convictions. The present research, therefore, provides a novel approach to synthesizing large volumes of data on criminals and presenting results in accessible, understandable outcomes.
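At its core, Behavior Sequence Analysis builds on first-order transitions: how often each life event is immediately followed by each other event across the sample. A minimal sketch of that counting step, using hypothetical event labels rather than the study's data:

```python
# Minimal sketch of the transition-counting step in Behavior Sequence
# Analysis: count how often each event is immediately followed by each
# other event across life-history sequences. Event labels are
# hypothetical examples, not data from the study.
from collections import Counter

def transition_counts(sequences):
    """Count (event, next_event) pairs across all sequences."""
    counts = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    return counts

life_histories = [
    ["parental_abuse", "prior_conviction", "murder", "trial"],
    ["parental_abuse", "prior_conviction", "trial"],
]
counts = transition_counts(life_histories)
```

Frequent transitions reveal not just which risk factors occur but their typical temporal order, which is the analysis's key output.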

  1. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    Kojima, Shigeo; Onoue, Akira; Kawai, Katsunori

    1998-01-01

    This study intends to develop a more sophisticated tool that will advance the event tree method currently used in all PSA, and to focus on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic-event PSA, various end states and failure combinations must be considered for the purpose of multiple scenario construction; the analysis workload must therefore be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To realize scenario generation as a technical tool, a simulation model incorporating AI techniques and a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, were added using a human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)
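The event-sequence construction that such a generator automates can be illustrated with a classic event tree: each heading (a safety system) either succeeds or fails, and every path through the branches is one accident sequence. A toy sketch with hypothetical headings and failure probabilities (not the authors' tool or data):

```python
# Sketch of automatic event-sequence construction from an event tree:
# every combination of heading outcomes is one accident sequence with an
# associated probability. Headings and probabilities are hypothetical.
from itertools import product

def enumerate_sequences(headings):
    """Yield (outcome dict, probability) for every branch combination."""
    names = [name for name, _ in headings]
    for outcomes in product(("success", "failure"), repeat=len(headings)):
        prob = 1.0
        for (name, p_fail), outcome in zip(headings, outcomes):
            prob *= p_fail if outcome == "failure" else 1.0 - p_fail
        yield dict(zip(names, outcomes)), prob

headings = [("SI_signal", 0.01), ("aux_feedwater", 0.05)]
sequences = list(enumerate_sequences(headings))
total = sum(p for _, p in sequences)   # sequence probabilities sum to 1
```

A real generator adds the scenario-construction logic that prunes physically impossible combinations and attaches end states; the combinatorial enumeration above is the part that explodes in size and motivates automation.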

  2. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

    1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as that produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to basic event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem: maxima of 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, and 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement.
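The Monte Carlo evaluation of a top event mentioned above can be sketched directly from its minimal cut sets (the sum-of-products form a program like SETS produces): the top event occurs in a trial whenever every basic event of some cut set fails. The cut sets and probabilities below are hypothetical, and this is a generic illustration of the idea, not TEMAC's matrix method:

```python
# Sketch of Monte Carlo evaluation of a top event from its minimal cut
# sets: in each trial, sample which basic events fail, then check whether
# any cut set is fully failed. Cut sets and probabilities are hypothetical.
import random

def top_event_prob(cut_sets, event_probs, samples=20000, seed=1):
    """Estimate P(top event) = P(some cut set has all its events failed)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        failed = {e for e, p in event_probs.items() if rng.random() < p}
        if any(cut <= failed for cut in cut_sets):
            hits += 1
    return hits / samples

event_probs = {"A": 0.1, "B": 0.2, "C": 0.05}
cut_sets = [{"A", "B"}, {"C"}]
estimate = top_event_prob(cut_sets, event_probs)
# Exact value for comparison: P(A)P(B) + P(C) - P(A)P(B)P(C) = 0.069
```

Repeating the estimate while perturbing one basic-event probability at a time yields the sensitivity information such codes report.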

  3. Assessing the Diversity of Rodent-Borne Viruses: Exploring of High-Throughput Sequencing and Classical Amplification/Sequencing Approaches.

    Science.gov (United States)

    Drewes, Stephan; Straková, Petra; Drexler, Jan F; Jacob, Jens; Ulrich, Rainer G

    2017-01-01

    Rodents are distributed throughout the world and interact with humans in many ways. They provide vital ecosystem services, some species are useful models in biomedical research and some are held as pet animals. However, many rodent species can have adverse effects such as damage to crops and stored produce, and they are of health concern because of the transmission of pathogens to humans and livestock. The first rodent viruses were discovered by isolation approaches and resulted in breakthrough knowledge in immunology, molecular and cell biology, and cancer research. In addition to rodent-specific viruses, rodent-borne viruses cause a large number of zoonotic diseases. The most prominent examples are reemerging outbreaks of human hemorrhagic fever disease cases caused by arena- and hantaviruses. In addition, rodents are reservoirs for vector-borne pathogens, such as tick-borne encephalitis virus and Borrelia spp., and may carry human pathogenic agents, but likely are not involved in their transmission to humans. Today, next-generation sequencing or high-throughput sequencing (HTS) is revolutionizing the speed at which novel viruses are discovered, but other molecular approaches, such as generic RT-PCR/PCR and rolling circle amplification techniques, contribute significantly to this rapidly advancing process. However, the current knowledge still represents only the tip of the iceberg when comparing the known human viruses to those known for rodents, the mammalian taxon with the largest number of species. The diagnostic potential of HTS-based metagenomic approaches is illustrated by their use in the discovery and complete genome determination of novel borna- and adenoviruses as causative disease agents in squirrels. In conclusion, HTS, in combination with conventional RT-PCR/PCR-based approaches, has resulted in a drastically increased knowledge of the diversity of rodent viruses. Future improvements of the workflows used, including bioinformatics analysis, will further

  4. Modeling Air Traffic Situation Complexity with a Dynamic Weighted Network Approach

    Directory of Open Access Journals (Sweden)

    Hongyong Wang

    2018-01-01

    In order to address the flight delays and risks associated with the forecasted increase in air traffic, the capacity of air traffic management systems needs to be increased, based on objective measurements of traffic situation complexity. In current air traffic complexity research, no simple means is available to integrate airspace and traffic flow characteristics. In this paper, we propose a new approach for the measurement of air traffic situation complexity that considers the effects of both airspace and traffic flow and objectively quantifies air traffic situation complexity. Considering the aircraft, waypoints, and airways as nodes, and the complexity relationships among these nodes as edges, a dynamic weighted network is constructed. Air traffic situation complexity is defined as the sum of the weights of all edges in the network, and the relationships of complexity with some commonly used indices are statistically analyzed. The results indicate that the new complexity index is more accurate than traffic count and reflects the number of trajectory changes as well as high-risk situations. Additionally, analysis of potential applications reveals that this new index contributes to achieving complexity-based management, an efficient means of increasing airspace system capacity.
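The "sum of edge weights" definition above is easy to sketch. The distance-based weight kernel and the node coordinates below are invented for illustration; the paper's actual complexity relationships are more elaborate:

```python
# Sketch of the dynamic weighted network complexity measure: aircraft,
# waypoints, and airways are nodes; pairwise complexity relationships are
# weighted edges; situation complexity is the sum of all edge weights.
# The toy weight kernel and node positions are hypothetical.
import math

def edge_weight(pos_a, pos_b, horizon=50.0):
    """Closer node pairs contribute more complexity (toy linear kernel)."""
    dist = math.dist(pos_a, pos_b)
    return max(0.0, 1.0 - dist / horizon)

def situation_complexity(nodes):
    """Sum edge weights over all unordered node pairs."""
    names = list(nodes)
    total = 0.0
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            total += edge_weight(nodes[a], nodes[b])
    return total

# Two aircraft and a waypoint, positions in arbitrary distance units.
nodes = {"AC1": (0.0, 0.0), "AC2": (10.0, 0.0), "WPT": (0.0, 30.0)}
complexity = situation_complexity(nodes)
```

Because the node set and positions change each time step, the network, and hence the complexity value, is dynamic.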

  5. Discovery of rare variants via sequencing: implications for the design of complex trait association studies.

    Directory of Open Access Journals (Sweden)

    Bingshan Li

    2009-05-01

    There is strong evidence that rare variants are involved in complex disease etiology. The first step in implicating rare variants in disease etiology is their identification through sequencing in both randomly ascertained samples (e.g., the 1000 Genomes Project) and samples ascertained according to disease status. We investigated to what extent rare variants will be observed across the genome and in candidate genes in randomly ascertained samples, the magnitude of variant enrichment in diseased individuals, and biases that can occur due to how variants are discovered. Although sequencing cases can enrich for causal variants, when a gene or genes are not involved in disease etiology, limiting variant discovery to cases can lead to association studies with dramatically inflated false positive rates.

  6. An integrated PCR colony hybridization approach to screen cDNA libraries for full-length coding sequences.

    Science.gov (United States)

    Pollier, Jacob; González-Guzmán, Miguel; Ardiles-Diaz, Wilson; Geelen, Danny; Goossens, Alain

    2011-01-01

    cDNA-Amplified Fragment Length Polymorphism (cDNA-AFLP) is a commonly used technique for genome-wide expression analysis that does not require prior sequence knowledge. Typically, quantitative expression data and sequence information are obtained for a large number of differentially expressed gene tags. However, most of the gene tags do not correspond to full-length (FL) coding sequences, which is a prerequisite for subsequent functional analysis. A medium-throughput screening strategy, based on integration of polymerase chain reaction (PCR) and colony hybridization, was developed that allows in parallel screening of a cDNA library for FL clones corresponding to incomplete cDNAs. The method was applied to screen for the FL open reading frames of a selection of 163 cDNA-AFLP tags from three different medicinal plants, leading to the identification of 109 (67%) FL clones. Furthermore, the protocol allows for the use of multiple probes in a single hybridization event, thus significantly increasing the throughput when screening for rare transcripts. The presented strategy offers an efficient method for the conversion of incomplete expressed sequence tags (ESTs), such as cDNA-AFLP tags, to FL-coding sequences.

  7. Peanut (Arachis hypogaea) Expressed Sequence Tag Project: Progress and Application

    Directory of Open Access Journals (Sweden)

    Suping Feng

    2012-01-01

    Many plant ESTs have been sequenced as an alternative to whole genome sequences, including those of peanut, because of its genome size and complexity. The US peanut research community held the historic 2004 Atlanta Genomics Workshop and named the EST project a main priority. As of August 2011, the peanut research community had deposited 252,832 ESTs in the public NCBI EST database, and this resource has provided the community valuable tools and core foundations for various genome-scale experiments ahead of the whole genome sequencing project. These EST resources have been used for marker development, gene cloning, microarray gene expression and genetic map construction. The peanut EST sequence resources have thus shown a wide range of applications and fulfilled their essential role at the time of need. The EST project then contributed to a second historic event, the Peanut Genome Project 2010 Inaugural Meeting, also held in Atlanta, where it was decided to sequence the entire peanut genome. After the completion of peanut whole genome sequencing, ESTs and transcriptome data will continue to play an important role in filling knowledge gaps, identifying particular genes and exploring gene function.

  8. Distributed representations of action sequences in anterior cingulate cortex: A recurrent neural network approach.

    Science.gov (United States)

    Shahnazian, Danesh; Holroyd, Clay B

    2018-02-01

    Anterior cingulate cortex (ACC) has been the subject of intense debate over the past 2 decades, but its specific computational function remains controversial. Here we present a simple computational model of ACC that incorporates distributed representations across a network of interconnected processing units. Based on the proposal that ACC is concerned with the execution of extended, goal-directed action sequences, we trained a recurrent neural network to predict each successive step of several sequences associated with multiple tasks. In keeping with neurophysiological observations from nonhuman animals, the network yields distributed patterns of activity across ACC neurons that track the progression of each sequence, and in keeping with human neuroimaging data, the network produces discrepancy signals when any step of the sequence deviates from the predicted step. These simulations illustrate a novel approach for investigating ACC function.
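The modelling idea above, a recurrent network trained to predict each successive step of a sequence, whose prediction error then serves as a discrepancy signal for deviant steps, can be reproduced in miniature. This is a toy Elman network trained by backpropagation through time; the architecture sizes, the four-step sequence, and all settings are illustrative, not the authors' model:

```python
# Toy Elman RNN trained to predict the next step of one action sequence.
# After training, the negative log-probability of a step acts as a
# "discrepancy signal": deviant steps are more surprising than trained
# ones. All sizes and settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)
steps = [0, 1, 2, 3]                  # one goal-directed action sequence
n_sym, n_hid, lr = 4, 16, 0.1

def one_hot(i):
    v = np.zeros(n_sym)
    v[i] = 1.0
    return v

Wxh = rng.normal(0.0, 0.5, (n_hid, n_sym))   # input -> hidden
Whh = rng.normal(0.0, 0.5, (n_hid, n_hid))   # hidden -> hidden (recurrence)
Why = rng.normal(0.0, 0.5, (n_sym, n_hid))   # hidden -> next-step softmax

def forward(sequence):
    """Return hidden states and next-step probability vectors."""
    h, hs, ps = np.zeros(n_hid), [], []
    for sym in sequence[:-1]:
        h = np.tanh(Wxh @ one_hot(sym) + Whh @ h)
        z = Why @ h
        p = np.exp(z - z.max())
        ps.append(p / p.sum())
        hs.append(h)
    return hs, ps

def loss(ps, targets):
    return -sum(np.log(p[t]) for p, t in zip(ps, targets)) / len(targets)

targets = steps[1:]
initial_loss = loss(forward(steps)[1], targets)

for _ in range(500):                  # plain BPTT over the short sequence
    hs, ps = forward(steps)
    xs = [one_hot(s) for s in steps[:-1]]
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dh_next = np.zeros(n_hid)
    for t in reversed(range(len(targets))):
        dz = ps[t].copy()
        dz[targets[t]] -= 1.0         # softmax cross-entropy gradient
        dWhy += np.outer(dz, hs[t])
        draw = (Why.T @ dz + dh_next) * (1.0 - hs[t] ** 2)
        dWxh += np.outer(draw, xs[t])
        dWhh += np.outer(draw, hs[t - 1] if t > 0 else np.zeros(n_hid))
        dh_next = Whh.T @ draw
    Wxh -= lr * dWxh
    Whh -= lr * dWhh
    Why -= lr * dWhy

final_loss = loss(forward(steps)[1], targets)
ps = forward(steps)[1]
surprise_expected = -np.log(ps[1][2])  # trained step 2 after steps 0, 1
surprise_deviant = -np.log(ps[1][3])   # deviant step 3 at the same point
```

The distributed hidden states track progress through the sequence, and the surprise gap between trained and deviant steps mirrors the discrepancy signals the paper relates to human neuroimaging data.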

  9. Whole-genome sequencing of multiple myeloma from diagnosis to plasma cell leukemia reveals genomic initiating events, evolution, and clonal tides.

    Science.gov (United States)

    Egan, Jan B; Shi, Chang-Xin; Tembe, Waibhav; Christoforides, Alexis; Kurdoglu, Ahmet; Sinari, Shripad; Middha, Sumit; Asmann, Yan; Schmidt, Jessica; Braggio, Esteban; Keats, Jonathan J; Fonseca, Rafael; Bergsagel, P Leif; Craig, David W; Carpten, John D; Stewart, A Keith

    2012-08-02

    The longitudinal evolution of a myeloma genome from diagnosis to plasma cell leukemia has not previously been reported. We used whole-genome sequencing (WGS) on 4 purified tumor samples and patient germline DNA drawn over a 5-year period in a t(4;14) multiple myeloma patient. Tumor samples were acquired at diagnosis, first relapse, second relapse, and end-stage secondary plasma cell leukemia (sPCL). In addition to the t(4;14), all tumor time points also shared 10 common single-nucleotide variants (SNVs) on WGS comprising shared initiating events. Interestingly, we observed genomic sequence variants that waxed and waned with time in progressive tumors, suggesting the presence of multiple independent, yet related, clones at diagnosis that rose and fell in dominance. Five newly acquired SNVs, including truncating mutations of RB1 and ZKSCAN3, were observed only in the final sPCL sample suggesting leukemic transformation events. This longitudinal WGS characterization of the natural history of a high-risk myeloma patient demonstrated tumor heterogeneity at diagnosis with shifting dominance of tumor clones over time and has also identified potential mutations contributing to myelomagenesis as well as transformation from myeloma to overt extramedullary disease such as sPCL.

  10. Assembling large, complex environmental metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Howe, A. C. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Jansson, J. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Malfatti, S. A. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tringe, S. G. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tiedje, J. M. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Brown, C. T. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Computer Science and Engineering

    2012-12-28

    The large volumes of sequencing data required to sample complex environments deeply pose new challenges to sequence analysis approaches. De novo metagenomic assembly effectively reduces the total amount of data to be analyzed but requires significant computational resources. We apply two pre-assembly filtering approaches, digital normalization and partitioning, to make large metagenome assemblies more computationally tractable. Using a human gut mock community dataset, we demonstrate that these methods result in assemblies nearly identical to assemblies from unprocessed data. We then assemble two large soil metagenomes from matched Iowa corn and native prairie soils. The predicted functional content and phylogenetic origin of the assembled contigs indicate significant taxonomic differences despite similar function. The assembly strategies presented are generic and can be extended to any metagenome; full source code is freely available under a BSD license.
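Digital normalization, the first pre-assembly filter named above, discards a read when the median abundance of its k-mers shows that its coverage is already represented. A minimal sketch of the idea, with an exact k-mer counter and toy parameters (real implementations such as khmer use probabilistic counting and larger k):

```python
# Sketch of digital normalization: keep a read only while the median count
# of its k-mers is below a coverage cutoff, shrinking redundant data while
# retaining novel sequence. k, the cutoff, and the reads are illustrative.
from collections import Counter

def kmers(read, k):
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def digital_normalize(reads, k=4, cutoff=2):
    counts = Counter()
    kept = []
    for read in reads:
        kms = kmers(read, k)
        median = sorted(counts[km] for km in kms)[len(kms) // 2]
        if median < cutoff:          # read still adds novel coverage
            kept.append(read)
            counts.update(kms)
    return kept

# Five identical reads collapse to one; the novel read survives.
reads = ["ACGTACGTAC"] * 5 + ["TTTTGGGGCC"]
kept = digital_normalize(reads)
```

Because the decision is streaming and single-pass, the method scales to datasets far too large to assemble directly.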

  11. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)
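The conditional-probability idea behind such a dependency model can be stated in a few lines: the joint failure probability is P(A)·P(B|A), where a dependency factor raises P(B|A) above the independent value P(B). The factor and the numbers below are hypothetical, a generic illustration rather than the report's specific model:

```python
# Sketch of a conditional-probability dependency model for joint events:
# P(A and B) = P(A) * P(B|A), with P(B|A) inflated by a common-cause
# dependency factor. All numbers are hypothetical.

def joint_failure_prob(p_a, p_b, dependency=1.0):
    """P(A and B) with P(B|A) = min(1, dependency * P(B)).

    dependency = 1 reproduces independence; > 1 models common-cause
    coupling between the two components.
    """
    p_b_given_a = min(1.0, dependency * p_b)
    return p_a * p_b_given_a

independent = joint_failure_prob(0.01, 0.02)                    # 2.0e-4
common_cause = joint_failure_prob(0.01, 0.02, dependency=10.0)  # 2.0e-3
```

Even a modest dependency factor changes the joint-event probability by an order of magnitude, which is why ignoring common-cause mechanisms badly underestimates event-sequence probabilities.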

  12. Time spans and spacers : Molecular phylogenetic explorations in the Cladophora complex (Chlorophyta) from the perspective of rDNA gene and spacer sequences

    NARCIS (Netherlands)

    Bakker, Frederik Theodoor

    1995-01-01

    In this study, phylogenetic relationships among genera, species and biogeographic representatives of single Cladophora species within the Cladophorales were analyzed using rDNA gene and spacer sequences. Based on phylogenetic analysis of 18S rRNA gene sequences, the Cladophora complex is shown to be

  13. Analysis of microRNA profile of Anopheles sinensis by deep sequencing and bioinformatic approaches.

    Science.gov (United States)

    Feng, Xinyu; Zhou, Xiaojian; Zhou, Shuisen; Wang, Jingwen; Hu, Wei

    2018-03-12

    microRNAs (miRNAs) are small non-coding RNAs widely identified in many mosquitoes. They are reported to play important roles in development, differentiation and innate immunity. However, miRNAs in Anopheles sinensis, one of the Chinese malaria mosquitoes, remain largely unknown. We investigated the global miRNA expression profile of An. sinensis using Illumina HiSeq 2000 sequencing. In parallel, we applied a bioinformatic approach to identify potential miRNAs in An. sinensis, and the miRNA profiles identified by the two approaches were compared and analyzed. The selected miRNAs from the sequencing result and the bioinformatic approach were confirmed with qRT-PCR. Moreover, target prediction, GO annotation and pathway analysis were carried out to understand the role of miRNAs in An. sinensis. We identified 49 conserved miRNAs and 12 novel miRNAs by next-generation high-throughput sequencing. In contrast, 43 miRNAs were predicted by the bioinformatic approach, of which two were assigned as novel. Comparative analysis of the miRNA profiles from the two approaches showed that 21 miRNAs were shared between them. Twelve novel miRNAs did not match any known miRNAs of any organism, indicating that they are possibly species-specific. Forty miRNAs were found in many mosquito species, indicating that these miRNAs are evolutionarily conserved and may have critical roles in essential life processes. Both the selected known and novel miRNAs (asi-miR-281, asi-miR-184, asi-miR-14, asi-miR-nov5, asi-miR-nov4, asi-miR-9383, and asi-miR-2a) could be detected by quantitative real-time PCR (qRT-PCR) in the sequenced sample, and the expression patterns of these miRNAs measured by qRT-PCR were in concordance with the original miRNA sequencing data. The predicted targets for the known and the novel miRNAs covered many important biological roles and pathways, indicating the diversity of miRNA functions. We also found 21 conserved miRNAs and eight counterparts of target immune pathway genes in An. sinensis.

  14. Time to foster a rational approach to preventing cardiovascular morbid events.

    Science.gov (United States)

    Cohn, Jay N; Duprez, Daniel A

    2008-07-29

    Efforts to prevent atherosclerotic morbid events have focused primarily on risk factor prevention and intervention. These approaches, based on the statistical association of risk factors with events, have dominated clinical practice in the last generation. Because the cardiovascular abnormalities eventuating in morbid events are detectable in the arteries and heart before the development of symptomatic disease, recent efforts have focused on identifying the presence of these abnormalities as a more sensitive and specific guide to the need for therapy. Advances in noninvasive techniques for studying the vasculature and the left ventricle now provide the opportunity to use early disease rather than risk factors as the tool for clinical decision making. A disease scoring system has been developed using 10 tests of vascular and cardiac function and structure. More extensive data to confirm the sensitivity and specificity of this scoring system and to demonstrate its utility in tracking the response to therapy are needed to justify widespread application in clinical practice.

  15. Maximum likelihood sequence estimation for optical complex direct modulation.

    Science.gov (United States)

    Che, Di; Yuan, Feng; Shieh, William

    2017-04-17

    Semiconductor lasers are inherently versatile optical transmitters. Through direct modulation (DM), intensity modulation is realized by the linear mapping between the injection current and the light power, while various angle modulations are enabled by the frequency chirp. Limited by direct detection, DM lasers used to be exploited only as 1-D (intensity or angle) transmitters by suppressing or simply ignoring the other modulation. Nevertheless, through digital coherent detection, simultaneous intensity and angle modulation (namely, 2-D complex DM, CDM) can be realized by a single laser diode. The crucial technique of CDM is the joint demodulation of intensity and differential phase with maximum likelihood sequence estimation (MLSE), supported by a closed-form discrete signal approximation of the frequency chirp to characterize the MLSE transition probability. This paper proposes a statistical method for the transition probability that significantly enhances the accuracy of the chirp model. Using the statistical estimation, we demonstrate the first single-channel 100-Gb/s PAM-4 transmission over 1600-km fiber with only 10G-class DM lasers.
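The MLSE at the heart of this scheme is a Viterbi search over a trellis whose transition likelihoods depend on the previous and current symbols (the memory introduced by chirp). A minimal binary-alphabet sketch; the Gaussian transition model and its parameters are invented stand-ins, not the paper's chirp model:

```python
# Minimal sketch of maximum likelihood sequence estimation (MLSE): a
# Viterbi search over a two-symbol trellis where each transition's
# likelihood depends on the previous and current symbols. The toy
# inter-symbol model and its parameters are hypothetical.
import math

SYMBOLS = (0, 1)

def transition_mean(prev_sym, sym):
    """Expected observation for a (previous, current) symbol pair."""
    return sym + 0.3 * prev_sym        # toy memory term

def neg_log_likelihood(obs, prev_sym, sym, sigma=0.2):
    return (obs - transition_mean(prev_sym, sym)) ** 2 / (2 * sigma ** 2)

def mlse(observations):
    """Viterbi search for the most likely symbol sequence (initial prev=0)."""
    cost = {s: neg_log_likelihood(observations[0], 0, s) for s in SYMBOLS}
    path = {s: [s] for s in SYMBOLS}
    for obs in observations[1:]:
        new_cost, new_path = {}, {}
        for s in SYMBOLS:
            best_prev = min(
                SYMBOLS, key=lambda p: cost[p] + neg_log_likelihood(obs, p, s)
            )
            new_cost[s] = cost[best_prev] + neg_log_likelihood(obs, best_prev, s)
            new_path[s] = path[best_prev] + [s]
        cost, path = new_cost, new_path
    return path[min(SYMBOLS, key=cost.get)]

# Noiseless observations generated by the same toy model for 1, 0, 1, 1.
obs = [1.0, 0.3, 1.0, 1.3]
decoded = mlse(obs)
```

The paper's contribution sits in the `neg_log_likelihood` term: replacing the closed-form chirp approximation with a statistically estimated transition probability sharpens exactly this metric.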

  16. The Development of Narrative Productivity, Syntactic Complexity, Referential Cohesion and Event Content in Four- to Eight-Year-Old Finnish Children

    Science.gov (United States)

    Mäkinen, Leena; Loukusa, Soile; Nieminen, Lea; Leinonen, Eeva; Kunnari, Sari

    2014-01-01

    This study focuses on the development of narrative structure and the relationship between narrative productivity and event content. A total of 172 Finnish children aged between four and eight participated. Their picture-elicited narrations were analysed for productivity, syntactic complexity, referential cohesion and event content. Each measure…

  17. Real-time complex event processing for cloud resources

    Science.gov (United States)

    Adam, M.; Cordeiro, C.; Field, L.; Giordano, D.; Magnoni, L.

    2017-10-01

    The ongoing integration of clouds into the WLCG raises the need for detailed health and performance monitoring of the virtual resources in order to prevent problems of degraded service and interruptions due to undetected failures. When working in scale, the existing monitoring diversity can lead to a metric overflow whereby the operators need to manually collect and correlate data from several monitoring tools and frameworks, resulting in tens of different metrics to be constantly interpreted and analyzed per virtual machine. In this paper we present an ESPER based standalone application which is able to process complex monitoring events coming from various sources and automatically interpret data in order to issue alarms upon the resources’ statuses, without interfering with the actual resources and data sources. We will describe how this application has been used with both commercial and non-commercial cloud activities, allowing the operators to quickly be alarmed and react to misbehaving VMs and LHC experiments’ workflows. We will present the pattern analysis mechanisms being used, as well as the surrounding Elastic and REST API interfaces where the alarms are collected and served to users.
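The core pattern, correlating a stream of per-VM metric events inside a sliding time window and raising an alarm on a threshold crossing, can be sketched generically. Esper itself is a Java engine driven by EPL queries; the Python below is only an illustration of the windowed-aggregation idea, with hypothetical window length, threshold, and event fields:

```python
# Generic sketch of the complex-event-processing pattern described above:
# keep a sliding time window of metric events per VM and raise an alarm
# when the windowed average crosses a threshold. All parameters and event
# fields are hypothetical; this is not Esper's API.
from collections import defaultdict, deque

class CpuAlarmProcessor:
    def __init__(self, window_s=60, threshold=90.0):
        self.window_s = window_s
        self.threshold = threshold
        self.events = defaultdict(deque)   # vm -> deque of (time, cpu)

    def on_event(self, vm, timestamp, cpu):
        """Ingest one monitoring event; return an alarm dict or None."""
        q = self.events[vm]
        q.append((timestamp, cpu))
        while q and q[0][0] < timestamp - self.window_s:
            q.popleft()                    # expire events outside the window
        avg = sum(c for _, c in q) / len(q)
        if avg > self.threshold:
            return {"vm": vm, "avg_cpu": avg}
        return None

proc = CpuAlarmProcessor()
alarms = []
for t, cpu in [(0, 50.0), (20, 95.0), (40, 99.0), (70, 98.0)]:
    alarm = proc.on_event("vm-1", t, cpu)
    if alarm:
        alarms.append(alarm)
```

Because the processor only observes the event stream, it never interferes with the monitored resources or data sources, which is the standalone property the paper emphasizes.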

  18. Prevalence of transcription promoters within archaeal operons and coding sequences.

    Science.gov (United States)

    Koide, Tie; Reiss, David J; Bare, J Christopher; Pang, Wyming Lee; Facciotti, Marc T; Schmid, Amy K; Pan, Min; Marzolf, Bruz; Van, Phu T; Lo, Fang-Yin; Pratap, Abhishek; Deutsch, Eric W; Peterson, Amelia; Martin, Dan; Baliga, Nitin S

    2009-01-01

    Despite the knowledge of complex prokaryotic transcription mechanisms, generalized rules, such as the simplified organization of genes into operons with well-defined promoters and terminators, have had a significant role in systems analysis of regulatory logic in both bacteria and archaea. Here, we have investigated the prevalence of alternate regulatory mechanisms through genome-wide characterization of transcript structures of approximately 64% of all genes, including putative non-coding RNAs, in Halobacterium salinarum NRC-1. Our integrative analysis of transcriptome dynamics and protein-DNA interaction data sets showed widespread environment-dependent modulation of operon architectures, transcription initiation and termination inside coding sequences, and extensive overlap in 3' ends of transcripts for many convergently transcribed genes. A significant fraction of these alternate transcriptional events correlate to binding locations of 11 transcription factors and regulators (TFs) inside operons and annotated genes, events usually considered spurious or non-functional. Using experimental validation, we illustrate the prevalence of overlapping genomic signals in archaeal transcription, casting doubt on the general perception of rigid boundaries between coding sequences and regulatory elements.

  19. The cost of conservative synchronization in parallel discrete event simulations

    Science.gov (United States)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approaches the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.
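A synchronous conservative protocol of the kind analyzed here proceeds in rounds: all logical processes agree on a safe time window (the minimum pending event time plus the model's lookahead) and only events inside that window are processed. A toy sketch with hypothetical event payloads and lookahead, illustrating the mechanism rather than the paper's stochastic analysis:

```python
# Toy sketch of synchronous conservative synchronization: in each round,
# all logical processes compute a global safe time (minimum next event
# time + lookahead) and process only events before it. Events and the
# lookahead value are hypothetical.
import heapq

def run_synchronous(queues, lookahead, end_time):
    """queues: one pending-event list [(time, label), ...] per process."""
    for q in queues:
        heapq.heapify(q)
    processed, rounds, now = [], 0, 0.0
    while now < end_time and any(queues):
        rounds += 1
        next_times = [q[0][0] for q in queues if q]
        safe = min(next_times) + lookahead   # global safe-time barrier
        for q in queues:
            while q and q[0][0] < safe:
                processed.append(heapq.heappop(q))
        now = safe
    return processed, rounds

lp_queues = [[(1.0, "a"), (4.0, "b")], [(2.0, "c"), (9.0, "d")]]
processed, rounds = run_synchronous(lp_queues, lookahead=2.0, end_time=10.0)
```

The per-round barrier and the window computation are exactly the synchronization overheads whose per-event cost the paper shows approaching that of a serial simulation as workload grows.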

  20. PREvaIL, an integrative approach for inferring catalytic residues using sequence, structural, and network features in a machine-learning framework.

    Science.gov (United States)

    Song, Jiangning; Li, Fuyi; Takemoto, Kazuhiro; Haffari, Gholamreza; Akutsu, Tatsuya; Chou, Kuo-Chen; Webb, Geoffrey I

    2018-04-14

    Determining the catalytic residues in an enzyme is critical to our understanding of the relationship between protein sequence, structure, and function, and to enhancing our ability to design novel enzymes and their inhibitors. Although many enzymes have been sequenced, and their primary and tertiary structures determined, experimental methods for enzyme functional characterization lag behind. Because experimental methods used for identifying catalytic residues are resource- and labor-intensive, computational approaches have considerable value and are highly desirable for their ability to complement experimental studies in identifying catalytic residues and helping to bridge the sequence-structure-function gap. In this study, we describe a new computational method called PREvaIL for predicting enzyme catalytic residues. This method was developed by leveraging a comprehensive set of informative features extracted from multiple levels, including sequence, structure, and residue-contact network, in a random forest machine-learning framework. Extensive benchmarking experiments on eight different datasets based on 10-fold cross-validation and independent tests, as well as side-by-side performance comparisons with seven modern sequence- and structure-based methods, showed that PREvaIL achieved competitive predictive performance, with an area under the receiver operating characteristic curve and area under the precision-recall curve ranging from 0.896 to 0.973 and from 0.294 to 0.523, respectively. We demonstrated that this method was able to capture useful signals arising from different levels, leveraging such differential but useful types of features and allowing us to significantly improve the performance of catalytic residue prediction. We believe that this new method can be utilized as a valuable tool for both understanding the complex sequence-structure-function relationships of proteins and facilitating the characterization of novel enzymes lacking functional annotations.
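    The benchmark metric quoted above, area under the ROC curve, can be computed directly from classifier scores via the Mann-Whitney identity: the probability that a randomly chosen positive outscores a randomly chosen negative, counting ties as half. This is a generic stdlib sketch of that metric, not code from the paper.

```python
def roc_auc(pos_scores, neg_scores):
    """AUROC via the Mann-Whitney U statistic: compare every
    positive score against every negative score; ties count 0.5."""
    auc_sum = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                auc_sum += 1.0
            elif p == n:
                auc_sum += 0.5
    return auc_sum / (len(pos_scores) * len(neg_scores))
```

    Perfectly separated scores give 1.0 and indistinguishable scores give 0.5, which frames the 0.896-0.973 range reported for PREvaIL.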

  1. Full-length mRNA sequencing uncovers a widespread coupling between transcription initiation and mRNA processing.

    Science.gov (United States)

    Anvar, Seyed Yahya; Allard, Guy; Tseng, Elizabeth; Sheynkman, Gloria M; de Klerk, Eleonora; Vermaat, Martijn; Yin, Raymund H; Johansson, Hans E; Ariyurek, Yavuz; den Dunnen, Johan T; Turner, Stephen W; 't Hoen, Peter A C

    2018-03-29

    The multifaceted control of gene expression requires tight coordination of regulatory mechanisms at the transcriptional and post-transcriptional levels. Here, we studied the interdependence of transcription initiation, splicing and polyadenylation events on single mRNA molecules by full-length mRNA sequencing. In MCF-7 breast cancer cells, we find 2700 genes with interdependent alternative transcription initiation, splicing and polyadenylation events, both in proximal and distant parts of mRNA molecules, including examples of coupling between transcription start sites and polyadenylation sites. The analysis of three human primary tissues (brain, heart and liver) reveals similar patterns of interdependency between transcription initiation and mRNA processing events. We predict thousands of novel open reading frames from full-length mRNA sequences and obtain evidence for their translation by shotgun proteomics. The mapping database rescues 358 previously unassigned peptides and improves the assignment of others. By recognizing sample-specific amino-acid changes and novel splicing patterns, full-length mRNA sequencing improves proteogenomics analysis of MCF-7 cells. Our findings demonstrate that our understanding of transcriptome complexity is far from complete and provide a basis to reveal largely unresolved mechanisms that coordinate transcription initiation and mRNA processing.
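    Predicting open reading frames from a full-length transcript, in its simplest form, means scanning each forward reading frame for an ATG followed in-frame by a stop codon. The sketch below shows only that simplest form; the proteogenomic pipeline in the study is far more involved.

```python
def longest_orf(seq):
    """Return the longest open reading frame on the forward strand:
    an ATG followed in-frame by a stop codon (TAA, TAG or TGA)."""
    stops = {"TAA", "TAG", "TGA"}
    best = ""
    for frame in range(3):
        start = None  # position of the current candidate ATG
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in stops and start is not None:
                if i + 3 - start > len(best):
                    best = seq[start:i + 3]
                start = None
    return best
```

    For example, `longest_orf("CCATGAAATTTTAGCC")` finds the 12-nucleotide frame starting at the ATG in position 2.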

  2. Estimation of isolation times of the island species in the Drosophila simulans complex from multilocus DNA sequence data.

    Directory of Open Access Journals (Sweden)

    Shannon R McDermott

    2008-06-01

    The Drosophila simulans species complex continues to serve as an important model system for the study of new species formation. The complex is comprised of the cosmopolitan species, D. simulans, and two island endemics, D. mauritiana and D. sechellia. A substantial amount of effort has gone into reconstructing the natural history of the complex, in part to infer the context in which functional divergence among the species has arisen. In this regard, a key parameter to be estimated is the initial isolation time (t) of each island species. Loci in regions of low recombination have lower divergence within the complex than do other loci, yet divergence from D. melanogaster is similar for both classes. This might reflect gene flow of the low-recombination loci subsequent to initial isolation, but it might also reflect differential effects of changing population size on the two recombination classes of loci when the low-recombination loci are subject to genetic hitchhiking or pseudohitchhiking. New DNA sequence variation data for 17 loci corroborate the prior observation from 13 loci that DNA sequence divergence is reduced in genes of low recombination. Two models are presented to estimate t and other relevant parameters (substitution rate correction factors in lineages leading to the island species and, in the case of the 4-parameter model, the ratio of ancestral to extant effective population size) from the multilocus DNA sequence data. In general, it appears that both island species were isolated at about the same time, here estimated at approximately 250,000 years ago. It also appears that the difference in divergence patterns of genes in regions of low and higher recombination can be reconciled by allowing a modestly larger effective population size for the ancestral population than for extant D. simulans.

  3. C-terminal low-complexity sequence repeats of Mycobacterium smegmatis Ku modulate DNA binding.

    Science.gov (United States)

    Kushwaha, Ambuj K; Grove, Anne

    2013-01-24

    Ku protein is an integral component of the NHEJ (non-homologous end-joining) pathway of DSB (double-strand break) repair. Both eukaryotic and prokaryotic Ku homologues have been characterized and shown to bind DNA ends. A unique feature of Mycobacterium smegmatis Ku is its basic C-terminal tail that contains several lysine-rich low-complexity PAKKA repeats that are absent from homologues encoded by obligate parasitic mycobacteria. Such PAKKA repeats are also characteristic of mycobacterial Hlp (histone-like protein) for which they have been shown to confer the ability to appose DNA ends. Unexpectedly, removal of the lysine-rich extension enhances DNA-binding affinity, but an interaction between DNA and the PAKKA repeats is indicated by the observation that only full-length Ku forms multiple complexes with a short stem-loop-containing DNA previously designed to accommodate only one Ku dimer. The C-terminal extension promotes DNA end-joining by T4 DNA ligase, suggesting that the PAKKA repeats also contribute to efficient end-joining. We suggest that low-complexity lysine-rich sequences have evolved repeatedly to modulate the function of unrelated DNA-binding proteins.

  4. Quantification of sequence exchange events between PMS2 and PMS2CL provides a basis for improved mutation scanning of Lynch syndrome patients.

    NARCIS (Netherlands)

    Klift, H.M. van der; Tops, C.M.; Bik, E.C.; Boogaard, M.W.; Borgstein, A.M.; Hansson, K.B.; Ausems, M.G.E.M.; Gomez Garcia, E.; Green, A.; Hes, F.J.; Izatt, L.; Hest, L.P. van; Alonso, A.M.; Vriends, A.H.; Wagner, A.; Zelst-Stams, W.A.G. van; Vasen, H.F.; Morreau, H.; Devilee, P.; Wijnen, J.T.

    2010-01-01

    Heterozygous mutations in PMS2 are involved in Lynch syndrome, whereas biallelic mutations are found in Constitutional mismatch repair-deficiency syndrome patients. Mutation detection is complicated by the occurrence of sequence exchange events between the duplicated regions of PMS2 and PMS2CL.

  5. Dynamic anticipatory processing of hierarchical sequential events: a common role for Broca's area and ventral premotor cortex across domains?

    Science.gov (United States)

    Fiebach, Christian J; Schubotz, Ricarda I

    2006-05-01

    This paper proposes a domain-general model for the functional contribution of ventral premotor cortex (PMv) and adjacent Broca's area to perceptual, cognitive, and motor processing. We propose to understand this frontal region as a highly flexible sequence processor, with the PMv mapping sequential events onto stored structural templates and Broca's area involved in more complex, hierarchical or hypersequential processing. This proposal is supported by reference to previous functional neuroimaging studies investigating abstract sequence processing and syntactic processing.

  6. New Approaches to Attenuated Hepatitis a Vaccine Development: Cloning and Sequencing of Cell-Culture Adapted Viral cDNA.

    Science.gov (United States)

    1987-10-13

    Annual report describing the cloning and sequencing of cell-culture-adapted hepatitis A viral cDNA as an approach to developing an attenuated vaccine.

  7. Identification of DNA-binding protein target sequences by physical effective energy functions: free energy analysis of lambda repressor-DNA complexes.

    Directory of Open Access Journals (Sweden)

    Caselle Michele

    2007-09-01

    Background: Specific binding of proteins to DNA is one of the most common ways gene expression is controlled. Although general rules for DNA-protein recognition can be derived, the ambiguous and complex nature of this mechanism precludes a simple recognition code, therefore the prediction of DNA target sequences is not straightforward. DNA-protein interactions can be studied using computational methods which can complement the current experimental methods and offer some advantages. In the present work we use physical effective potentials to evaluate the DNA-protein binding affinities for the λ repressor-DNA complex for which structural and thermodynamic experimental data are available. Results: The binding free energy of two molecules can be expressed as the sum of an intermolecular energy (evaluated using a molecular mechanics force field), a solvation free energy term and an entropic term. Different solvation models are used including distance-dependent dielectric constants, solvent accessible surface tension models and the Generalized Born model. The effect of conformational sampling by Molecular Dynamics simulations on the computed binding energy is assessed; results show that this effect is in general negative and the reproduction of the experimental values worsens as the simulation time considered increases. The free energy of binding for non-specific complexes, estimated using the best energetic model, agrees with earlier theoretical suggestions. As a result of these analyses, we propose a protocol for the prediction of DNA-binding target sequences. The possibility of searching for regulatory elements within the bacteriophage λ genome using this protocol is explored. Our analysis shows good prediction capabilities, even in absence of any thermodynamic data and information on the naturally recognized sequence. Conclusion: This study supports the conclusion that physics-based methods can offer a complementary approach to experimental methods for identifying DNA-binding target sequences.

  8. ESPRIT-Forest: Parallel clustering of massive amplicon sequence data in subquadratic time.

    Science.gov (United States)

    Cai, Yunpeng; Zheng, Wei; Yao, Jin; Yang, Yujie; Mai, Volker; Mao, Qi; Sun, Yijun

    2017-04-01

    The rapid development of sequencing technology has led to an explosive accumulation of genomic sequence data. Clustering is often the first step to perform in sequence analysis, and hierarchical clustering is one of the most commonly used approaches for this purpose. However, it is currently computationally expensive to perform hierarchical clustering of extremely large sequence datasets due to its quadratic time and space complexities. In this paper we developed a new algorithm called ESPRIT-Forest for parallel hierarchical clustering of sequences. The algorithm achieves subquadratic time and space complexity and maintains a high clustering accuracy comparable to the standard method. The basic idea is to organize sequences into a pseudo-metric based partitioning tree for sub-linear time searching of nearest neighbors, and then use a new multiple-pair merging criterion to construct clusters in parallel using multiple threads. The new algorithm was tested on the human microbiome project (HMP) dataset, currently one of the largest published microbial 16S rRNA sequence datasets. Our experiment demonstrated that with the power of parallel computing it is now computationally feasible to perform hierarchical clustering analysis of tens of millions of sequences. The software is available at http://www.acsu.buffalo.edu/∼yijunsun/lab/ESPRIT-Forest.html.
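    The quadratic cost that ESPRIT-Forest avoids is easiest to see in the naive algorithm it improves on. Below is a minimal single-linkage agglomerative clustering sketch over equal-length sequences with Hamming distance; every merge rescans all cluster pairs, which is exactly the behavior the partitioning-tree search replaces. The distance threshold and toy sequences are assumptions for illustration.

```python
def hamming(a, b):
    """Mismatch count between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def single_linkage(seqs, max_dist):
    """Naive quadratic agglomerative clustering: repeatedly merge the
    closest pair of clusters until the closest pair exceeds max_dist."""
    clusters = [[s] for s in seqs]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(hamming(a, b)
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        if d > max_dist:
            break
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters

groups = single_linkage(["AAAA", "AAAT", "GGGG", "GGGC"], max_dist=1)
```

    On four toy reads this splits cleanly into two clusters; on tens of millions of reads the all-pairs scan is what makes the subquadratic nearest-neighbor search worthwhile.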

  9. Radiochemical data collected on events from which radioactivity escaped beyond the borders of the Nevada test range complex

    International Nuclear Information System (INIS)

    Hicks, H.G.

    1981-01-01

    This report identifies all nuclear events in Nevada that are known to have sent radioactivity beyond the borders of the test range complex. There have been 177 such tests, representing seven different types: nuclear detonations in the atmosphere, nuclear excavation events, nuclear safety events, underground nuclear events that inadvertently seeped or vented to the atmosphere, dispersion of plutonium and/or uranium by chemical high explosives, nuclear rocket engine tests, and nuclear ramjet engine tests. The source term for each of these events is given, together with the data base from which it was derived (except where the data are classified). The computer programs used for organizing and processing the data base and calculating radionuclide production are described and included, together with the input and output data and details of the calculations. This is the basic information needed to make computer modeling studies of the fallout from any of these 177 events.

  10. Exome localization of complex disease association signals

    Directory of Open Access Journals (Sweden)

    Lewis Cathryn M

    2011-02-01

    Background: Genome-wide association studies (GWAS) of common diseases have had a tremendous impact on genetic research over the last five years; the field is now moving from microarray-based technology towards next-generation sequencing. To evaluate the potential of association studies for complex diseases based on exome sequencing we analysed the distribution of association signal with respect to protein-coding genes based on GWAS data for seven diseases from the Wellcome Trust Case Control Consortium. Results: We find significant concentration of association signal in exons and genes for Crohn's Disease, Type 1 Diabetes and Bipolar Disorder, but also observe enrichment from up to 40 kilobases upstream to 40 kilobases downstream of protein-coding genes for Crohn's Disease and Type 1 Diabetes; the exact extent of the distribution is disease dependent. Conclusions: Our work suggests that exome sequencing may be a feasible approach to find genetic variation associated with complex disease. Extending the exome sequencing to include flanking regions therefore promises further improvement of covering disease-relevant variants.
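    The core operation in this kind of enrichment analysis is asking, for each association signal, whether it falls inside a gene body extended by a flank on both sides. A scaled-down toy sketch (positions and intervals are invented for illustration, and the flank here is 40 bases rather than the paper's 40 kilobases):

```python
def count_in_flanked_genes(snp_positions, genes, flank):
    """Count SNPs landing inside any gene interval extended by
    `flank` bases on both sides; intervals are (start, end), inclusive."""
    hits = 0
    for pos in snp_positions:
        if any(start - flank <= pos <= end + flank
               for start, end in genes):
            hits += 1
    return hits

# Toy gene intervals and SNP positions.
genes = [(1000, 2000), (10000, 12000)]
snps = [500, 1500, 2030, 5000, 9990, 13000]
hits_flanked = count_in_flanked_genes(snps, genes, flank=40)
hits_strict = count_in_flanked_genes(snps, genes, flank=0)
```

    Comparing the flanked count against the strict count is the toy analogue of the paper's observation that signal enrichment extends beyond gene boundaries.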

  11. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPPs known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained, as well as the required general setting. The SOL approach has been tested both in scientific experiments and in practice by operators of NPPs and experts from other branches of industry. (orig./CB)

  12. Random Tagging Genotyping by Sequencing (rtGBS), an Unbiased Approach to Locate Restriction Enzyme Sites across the Target Genome.

    Directory of Open Access Journals (Sweden)

    Elena Hilario

    Genotyping by sequencing (GBS) is a restriction enzyme based targeted approach developed to reduce the genome complexity and discover genetic markers when a priori sequence information is unavailable. Sufficient coverage at each locus is essential to distinguish heterozygous from homozygous sites accurately. The number of GBS samples able to be pooled in one sequencing lane is limited by the number of restriction sites present in the genome and the read depth required at each site per sample for accurate calling of single-nucleotide polymorphisms. Locus bias was observed using a slight modification of the Elshire et al. method: some restriction enzyme sites were represented in higher proportions while others were poorly represented or absent. This bias could be due to the quality of genomic DNA, the endonuclease and ligase reaction efficiency, the distance between restriction sites, the preferential amplification of small library restriction fragments, or bias towards cluster formation of small amplicons during the sequencing process. To overcome these issues, we have developed a GBS method based on randomly tagging genomic DNA (rtGBS). By randomly landing on the genome, we can, with less bias, find restriction sites that are far apart, and undetected by the standard GBS (stdGBS) method. The study comprises two types of biological replicates (six different kiwifruit plants and two independent DNA extractions per plant) and three types of technical replicates (four samples of each DNA extraction, stdGBS vs. rtGBS methods, and two independent library amplifications, each sequenced in separate lanes). A statistically significant unbiased distribution of restriction fragment sizes by rtGBS showed that this method targeted 49% (39,145) of BamH I sites shared with the reference genome, compared to only 14% (11,513) by stdGBS.
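    The restriction sites the method targets are simple string matches: BamHI recognizes GGATCC and cuts after the first G (G^GATCC). A minimal sketch of locating sites and deriving complete-digest fragment sizes on a toy sequence:

```python
def restriction_sites(seq, site="GGATCC"):
    """0-based start positions of every occurrence of the recognition
    sequence (GGATCC for BamHI)."""
    return [i for i in range(len(seq) - len(site) + 1)
            if seq[i:i + len(site)] == site]

def fragment_lengths(seq, site="GGATCC", cut_offset=1):
    """Fragment sizes after a complete digest; BamHI cuts G^GATCC,
    i.e. one base into the site on the top strand."""
    cuts = [p + cut_offset for p in restriction_sites(seq, site)]
    edges = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(edges, edges[1:])]

seq = "AAAGGATCCTTTTGGATCCAA"
```

    The fragment-size distribution computed this way is what the paper tests for bias: stdGBS over-represents some fragment sizes, while rtGBS recovers sites regardless of the spacing between them.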

  13. Prediction of Antimicrobial Peptides Based on Sequence Alignment and Support Vector Machine-Pairwise Algorithm Utilizing LZ-Complexity

    Directory of Open Access Journals (Sweden)

    Xin Yi Ng

    2015-01-01

    This study concerns an attempt to establish a new method for predicting antimicrobial peptides (AMPs), which are important to the immune system. Recently, researchers have become interested in designing alternative drugs based on AMPs because a large number of bacterial strains have become resistant to available antibiotics. However, researchers have encountered obstacles in the AMP design process, as experiments to extract AMPs from protein sequences are costly and require a long set-up time. Therefore, a computational tool for AMP prediction is needed to resolve this problem. In this study, an integrated algorithm is newly introduced to predict AMPs by integrating sequence alignment and a support vector machine (SVM)-LZ complexity pairwise algorithm. It was observed that, when all sequences in the training set are used, the sensitivity of the proposed algorithm is 95.28% in the jackknife test and 87.59% in the independent test, while the sensitivities obtained for the jackknife and independent tests are 88.74% and 78.70%, respectively, when only sequences that have less than 70% similarity are used. Applying the proposed algorithm may allow researchers to effectively predict AMPs from unknown protein peptide sequences with higher sensitivity.
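    Lempel-Ziv complexity treats a peptide as a symbol stream and counts how many previously unseen phrases are needed to spell it out; repetitive sequences need few phrases, diverse ones many. The abstract does not spell out the exact LZ formulation used, so the sketch below uses a common LZ78-style phrase count as a stand-in.

```python
def lz_phrase_count(seq):
    """LZ78-style complexity: scan left to right, cutting a new phrase
    whenever the current substring has not been seen before."""
    phrases = set()
    w = ""
    for ch in seq:
        w += ch
        if w not in phrases:
            phrases.add(w)
            w = ""          # start the next phrase
    return len(phrases)
```

    For instance, a homopolymer like "AAAA" parses into just two phrases ("A", "AA"), whereas a sequence over many amino acids parses into far more, which is the signal the SVM consumes.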

  14. Leadership within Emergent Events in Complex Systems: Micro-Enactments and the Mechanisms of Organisational Learning and Change

    Science.gov (United States)

    Hazy, James K.; Silberstang, Joyce

    2009-01-01

    One tradition within the complexity paradigm considers organisations as complex adaptive systems in which autonomous individuals interact, often in complex ways, with difficult-to-predict, non-linear outcomes. Building upon this tradition, and more specifically following the complex systems leadership theory approach, we describe the ways in which…

  15. A Systems Approach towards an Intelligent and Self-Controlling Platform for Integrated Continuous Reaction Sequences**

    Science.gov (United States)

    Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V

    2015-01-01

    Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747

  16. Situation models and memory: the effects of temporal and causal information on recall sequence.

    Science.gov (United States)

    Brownstein, Aaron L; Read, Stephen J

    2007-10-01

    Participants watched an episode of the television show Cheers on video and then reported free recall. Recall sequence followed the sequence of events in the story; if one concept was observed immediately after another, it was recalled immediately after it. We also made a causal network of the show's story and found that recall sequence followed causal links; effects were recalled immediately after their causes. Recall sequence was more likely to follow causal links than temporal sequence, and most likely to follow causal links that were temporally sequential. Results were similar at 10-minute and 1-week delayed recall. This is the most direct and detailed evidence reported on sequential effects in recall. The causal network also predicted probability of recall; concepts with more links and concepts on the main causal chain were most likely to be recalled. This extends the causal network model to more complex materials than previous research.

  17. Cloud-Assisted UAV Data Collection for Multiple Emerging Events in Distributed WSNs.

    Science.gov (United States)

    Cao, Huiru; Liu, Yongxin; Yue, Xuejun; Zhu, Wenjian

    2017-08-07

    In recent years, UAVs (Unmanned Aerial Vehicles) have been widely applied for data collection and image capture. Specifically, UAVs have been integrated with wireless sensor networks (WSNs) to create data collection platforms with high flexibility. However, most studies in this domain focus on system architecture and UAVs' flight trajectory planning, while event-related factors and other important issues are neglected. To address these challenges, we propose a cloud-assisted data gathering strategy for UAV-based WSNs in the light of emerging events. We also provide a cloud-assisted approach for deriving a UAV's optimal flying and data acquisition sequence over a WSN cluster. We validate our approach through simulations and experiments. It has been proved that our methodology outperforms conventional approaches in terms of flying time, energy consumption, and integrity of data acquisition. We also conducted a real-world experiment using a UAV to collect data wirelessly from multiple clusters of sensor nodes deployed on a farm to monitor an emerging event. Compared against the traditional method, the proposed approach requires less than half the flying time and achieves almost perfect data integrity.
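    Deriving a flying sequence over cluster heads is a shortest-tour problem. As a point of comparison for the cloud-assisted sequencing described above (whose details the abstract does not give), here is the simplest greedy baseline, a nearest-neighbor tour over invented 2-D cluster-head coordinates:

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Greedy visiting order over 2-D coordinates: always fly to the
    closest unvisited node next. A baseline, not an optimal tour."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        cur = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(cur, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Hypothetical cluster-head positions on a field.
heads = [(0, 0), (0, 10), (1, 1), (10, 1)]
order = nearest_neighbor_tour(heads)
```

    Nearest-neighbor tours can be far from optimal, which is part of why event-aware, cloud-computed sequences pay off in flying time and energy.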

  18. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative for the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have their quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, environment changing or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise failure probability distributions over lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and the effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experiences of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative for the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of the conventional fault tree analysis for nuclear power plant probabilistic safety assessment.
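    The approach maps an expert's qualitative word to a triangular fuzzy number and then derives a crisp value from it. The sketch below illustrates the mechanics with centroid defuzzification; the membership parameters in the scale are assumptions for this sketch, not the values used in the paper.

```python
# Illustrative mapping of qualitative failure possibilities to
# triangular fuzzy numbers (a, m, b). These parameters are assumed
# for the sketch, not taken from the paper.
FUZZY_SCALE = {
    "very low":  (0.0, 0.0, 0.1),
    "low":       (0.0, 0.1, 0.3),
    "medium":    (0.2, 0.5, 0.8),
    "high":      (0.7, 0.9, 1.0),
    "very high": (0.9, 1.0, 1.0),
}

def centroid(tfn):
    """Crisp value of a triangular fuzzy number via the centroid rule."""
    a, m, b = tfn
    return (a + m + b) / 3.0

def failure_possibility(word):
    """Defuzzified failure possibility for a qualitative rating."""
    return centroid(FUZZY_SCALE[word])
```

    In the paper the defuzzified possibility is further converted into a failure probability before entering the fault tree; the conversion step is omitted here.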

  19. Systemic Approach - a Complexity Management Instrument

    Directory of Open Access Journals (Sweden)

    Vadim Dumitrascu

    2006-04-01

    The systemic principle uses deduction and induction, analysis and synthesis, inference and proference, to uncover the interdependencies and inner connections that drive complex organized entities. The true value of this approach lies neither in the simplistic "input-output" models, nor in the "circular" models that fill Economics and Management handbooks and consecrate another kind of formalism, but in the constructivist-reflexive strategies used to explain economic and social structures.

  20. Genome-wide identification and characterization of Notch transcription complex-binding sequence paired sites in leukemia cells

    Science.gov (United States)

    Severson, Eric; Arnett, Kelly L.; Wang, Hongfang; Zang, Chongzhi; Taing, Len; Liu, Hudan; Pear, Warren S.; Liu, X. Shirley; Blacklow, Stephen C.; Aster, Jon C.

    2018-01-01

    Notch transcription complexes (NTCs) drive target gene expression by binding to two distinct types of genomic response elements, NTC monomer-binding sites and sequence-paired sites (SPSs) that bind NTC dimers. SPSs are conserved and are linked to the Notch-responsiveness of a few genes, but their overall contribution to Notch-dependent gene regulation is unknown. To address this issue, we determined the DNA sequence requirements for NTC dimerization using a fluorescence resonance energy transfer (FRET) assay, and applied insights from these in vitro studies to Notch-“addicted” leukemia cells. We find that SPSs contribute to the regulation of approximately a third of direct Notch target genes. While originally described in promoters, SPSs are present mainly in long-range enhancers, including an enhancer containing a newly described SPS that regulates HES5. Our work provides a general method for identifying sequence-paired sites in genome-wide data sets and highlights the widespread role of NTC dimerization in Notch-transformed leukemia cells. PMID:28465412
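    A sequence-paired site is two NTC monomer-binding motifs in head-to-head orientation separated by a short spacer, so a genome-wide scan reduces to finding a motif followed, within an allowed spacer range, by its reverse complement. The sketch below is generic; the example motif "TGGGAA" and the 15-17 bp spacer range are assumptions for illustration, not the exact SPS definition used in the paper.

```python
def revcomp(s):
    """Reverse complement of a DNA string."""
    comp = {"A": "T", "C": "G", "G": "C", "T": "A"}
    return "".join(comp[c] for c in reversed(s))

def paired_sites(seq, motif, spacer_range):
    """Head-to-head paired sites: `motif` followed, after a spacer of
    allowed length, by the reverse complement of `motif`.
    Returns (first_site_start, second_site_start, spacer) tuples."""
    rc = revcomp(motif)
    m = len(motif)
    hits = []
    for i in range(len(seq) - m + 1):
        if seq[i:i + m] != motif:
            continue
        for gap in spacer_range:
            j = i + m + gap
            if seq[j:j + m] == rc:
                hits.append((i, j, gap))
    return hits

# Toy sequence with one paired site (motif and spacer are assumed).
site = paired_sites("CC" + "TGGGAA" + "A" * 16 + "TTCCCA" + "GG",
                    "TGGGAA", range(15, 18))
```

    Applied across ChIP and enhancer intervals, this kind of scan is what lets paired sites be tallied against the full set of direct Notch targets.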

  1. Novel Complexity Indicator of Manufacturing Process Chains and Its Relations to Indirect Complexity Indicators

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2017-01-01

    Manufacturing systems can be considered as a network of machines/workstations, where parts are produced in a flow shop or job shop environment, respectively. Such a network of machines/workstations can be depicted as a graph, with machines as nodes and material flow between the nodes as links. The aim of this paper is to use sequences of operations and the machine network to measure the static complexity of manufacturing processes. To this end, existing approaches to measuring the static complexity of manufacturing systems are analyzed and subsequently compared. For this purpose, the competing complexity indicators analyzed were tested on two different manufacturing layout examples. A subsequent analysis showed the relevant potential of the proposed method.
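    The graph the paper works from can be built directly from part routings: machines become nodes, and each consecutive pair of operations in a routing adds a material-flow edge. The sketch below builds that graph and reports a degree-entropy style indicator; the entropy index is one illustrative structural measure, not the indicator proposed in the paper.

```python
import math

def build_flow_graph(operation_sequences):
    """Machines become nodes; each consecutive pair of operations in a
    part's routing adds a directed material-flow edge."""
    nodes, edges = set(), set()
    for route in operation_sequences:
        nodes.update(route)
        edges.update(zip(route, route[1:]))
    return nodes, edges

def degree_entropy(nodes, edges):
    """Shannon entropy of the out-degree distribution: one simple
    structural-complexity indicator (illustrative only)."""
    out = {n: 0 for n in nodes}
    for a, _ in edges:
        out[a] += 1
    total = sum(out.values())
    probs = [d / total for d in out.values() if d]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical routings for two parts.
routes = [["saw", "mill", "drill"], ["saw", "drill"]]
nodes, edges = build_flow_graph(routes)
```

    A pure flow shop yields a chain (low entropy), while a job shop with many alternative routings yields a denser, higher-entropy graph.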

  2. Event recognition by detrended fluctuation analysis: An application to Teide-Pico Viejo volcanic complex, Tenerife, Spain

    International Nuclear Information System (INIS)

    Del Pin, Enrico; Carniel, Roberto; Tarraga, Marta

    2008-01-01

    In this work we investigate the application of the detrended fluctuation analysis (DFA) to seismic data recorded in the island of Tenerife (Canary Islands, Spain) during the month of July 2004, in a phase of possible unrest of the Teide-Pico Viejo volcanic complex. Tectonic events recorded in the area are recognized and located by the Spanish national agency Instituto Geografico Nacional (IGN) and their catalogue is the only currently available dataset, whose completeness unfortunately suffers from the strong presence of anthropogenic noise. In this paper we propose the use of DFA to help to automatically identify events. The evaluation of this case study proves DFA to be a promising tool to be used for rapidly screening large seismic datasets and highlighting time windows with the potential presence of discrete events
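    DFA itself is short to state: integrate the mean-removed series, linearly detrend it in non-overlapping windows of each scale, and take the slope of log fluctuation versus log scale as the exponent. A self-contained sketch (stdlib only; the synthetic test signals are assumptions, not the Tenerife data):

```python
import math
import random

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: integrate the series, linearly
    detrend it in non-overlapping windows of each scale, and return the
    slope of log F(scale) versus log scale (the scaling exponent)."""
    n = len(x)
    mean = sum(x) / n
    profile, s = [], 0.0
    for v in x:                      # integrated, mean-removed profile
        s += v - mean
        profile.append(s)
    pts = []
    for w in scales:
        sq_sum, count = 0.0, 0
        for start in range(0, n - w + 1, w):
            seg = profile[start:start + w]
            tm = (w - 1) / 2.0
            sm = sum(seg) / w
            denom = sum((t - tm) ** 2 for t in range(w))
            slope = sum((t - tm) * (y - sm)
                        for t, y in enumerate(seg)) / denom
            sq_sum += sum((y - sm - slope * (t - tm)) ** 2
                          for t, y in enumerate(seg))
            count += w
        pts.append((math.log(w), 0.5 * math.log(sq_sum / count)))
    mx = sum(p for p, _ in pts) / len(pts)
    my = sum(q for _, q in pts) / len(pts)
    return (sum((p - mx) * (q - my) for p, q in pts) /
            sum((p - mx) ** 2 for p, _ in pts))

# White noise should scale with alpha near 0.5; its running sum
# (a random walk) near 1.5.
rng = random.Random(0)
noise = [rng.gauss(0, 1) for _ in range(4000)]
walk, s = [], 0.0
for v in noise:
    s += v
    walk.append(s)
scales = [8, 16, 32, 64, 128]
a_noise = dfa_alpha(noise, scales)
a_walk = dfa_alpha(walk, scales)
```

    The separation between uncorrelated noise and strongly correlated signals is what makes the exponent usable for flagging time windows that may contain discrete seismic events.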

  3. Event monitoring of parallel computations

    Directory of Open Access Journals (Sweden)

    Gruzlikov Alexander M.

    2015-06-01

    The paper considers the monitoring of parallel computations for detection of abnormal events. It is assumed that computations are organized according to an event model, and monitoring is based on specific test sequences.

  4. LOSP-initiated event tree analysis for BWR

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study for a 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated Event Tree Analysis for a typical Japanese BWR was carried out, based solely on open documents such as the 'Safety Analysis Report'. The objectives of this analysis are as follows; - to delineate core-melt accident sequences initiated by LOSP, - to evaluate the importance of core-melt accident sequences in terms of occurrence frequency, and - to develop a foundation of plant information and analytical procedures for efficiently performing the further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated Event Tree Analysis. In this analysis, two types of event trees, a Functional Event Tree and a Systemic Event Tree, were developed to delineate core-melt accident sequences and to quantify their frequencies. A Front-line System Event Tree was prepared as well to provide core-melt sequence delineation for the accident progression analysis of a Level 2 PSA to follow. Applying U.S. operational experience data such as component failure rates and a LOSP frequency, we obtained the following results; - The total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 x 10^-4 per reactor-year. - The dominant sequences are 'Loss of Decay Heat Removal' and 'Loss of Emergency Electric Power Supply', which account for more than 90% of the total core-melt frequency. In this analysis, a value of 0.13/R·Y, higher than Japanese experience would suggest, was used for the LOSP frequency, and recovery actions were not considered. In fact, however, there has been no LOSP event in Japanese nuclear power plants so far, and it is also expected that off-site power and/or the PCS would be recovered before core melt. Considering Japanese operating experience and recovery factors will reduce the total core-melt frequency to less than 10^-6 per reactor-year. (J.P.N.)
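    Quantifying an event tree comes down to enumerating branch combinations: each sequence frequency is the initiating-event frequency times the product of the chosen branch probabilities (failure p, success 1 - p). A toy two-heading sketch; the heading names and failure probabilities are illustrative numbers, not values from this study.

```python
from itertools import product

def event_tree_frequencies(init_freq, headings):
    """Enumerate all branch combinations of a simple event tree.
    `headings` maps safety-function name -> failure probability; each
    sequence frequency is init_freq times the product of its branch
    probabilities (failure p, success 1 - p)."""
    names = list(headings)
    sequences = {}
    for outcome in product([False, True], repeat=len(names)):
        freq = init_freq
        label = []
        for name, failed in zip(names, outcome):
            p = headings[name]
            freq *= p if failed else (1 - p)
            label.append(name + ("-F" if failed else "-S"))
        sequences["/".join(label)] = freq
    return sequences

# Illustrative numbers only (per reactor-year), not the study's data.
seqs = event_tree_frequencies(
    0.13, {"EmergencyPower": 1e-3, "DecayHeatRemoval": 1e-4})
```

    The sequence frequencies across all branches sum back to the initiating frequency, which is a useful sanity check when trees grow large.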

  5. Distinguishing cognitive state with multifractal complexity of hippocampal interspike interval sequences

    Directory of Open Access Journals (Sweden)

Dustin Fetterhoff

    2015-09-01

Fractality, represented as self-similar repeating patterns, is ubiquitous in nature and the brain. Dynamic patterns of hippocampal spike trains are known to exhibit multifractal properties during working memory processing; however, it is unclear whether the multifractal properties inherent to hippocampal spike trains reflect active cognitive processing. To examine this possibility, hippocampal neuronal ensembles were recorded from rats before, during and after a spatial working memory task following administration of tetrahydrocannabinol (THC), a memory-impairing component of cannabis. Multifractal detrended fluctuation analysis was performed on hippocampal interspike interval sequences to determine characteristics of monofractal long-range temporal correlations (LRTCs), quantified by the Hurst exponent, and the degree/magnitude of multifractal complexity, quantified by the width of the singularity spectrum. Our results demonstrate that multifractal firing patterns of hippocampal spike trains are a marker of functional memory processing, as they are more complex during the working memory task and significantly reduced following administration of memory-impairing THC doses. Conversely, LRTCs are largest during resting-state recordings, therefore reflecting different information compared to multifractality. In order to deepen conceptual understanding of multifractal complexity and LRTCs, these measures were compared to classical methods using hippocampal frequency content and firing variability measures. These results showed that LRTCs, multifractality, and theta rhythm represent independent processes, while delta rhythm correlated with multifractality. Taken together, these results provide a novel perspective on memory function by demonstrating that the multifractal nature of spike trains reflects hippocampal microcircuit activity that can be used to detect and quantify cognitive, physiological and pathological states.
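The monofractal half of this analysis, the Hurst exponent from detrended fluctuation analysis (DFA), can be sketched compactly; the multifractal extension repeats the same computation with q-th-order fluctuation averages. A stdlib-only illustration (the window scales and the white-noise test signal are arbitrary choices, not those of the paper):

```python
import math
import random

def _linear_detrend(seg):
    """Residuals of an ordinary least-squares linear fit over a window."""
    n = len(seg)
    sx = sum(range(n))
    sy = sum(seg)
    sxx = sum(i * i for i in range(n))
    sxy = sum(i * v for i, v in enumerate(seg))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return [v - (a + b * i) for i, v in enumerate(seg)]

def dfa_hurst(x, scales=(4, 8, 16, 32)):
    """Monofractal DFA: slope of log F(n) vs log n estimates the Hurst exponent."""
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:                      # integrate the mean-subtracted signal
        acc += v - mean
        profile.append(acc)
    logn, logf = [], []
    for n in scales:                 # RMS fluctuation after detrending each window
        sq = []
        for start in range(0, len(profile) - n + 1, n):
            sq.extend(r * r for r in _linear_detrend(profile[start:start + n]))
        logn.append(math.log(n))
        logf.append(0.5 * math.log(sum(sq) / len(sq)))
    # least-squares slope of log F against log n
    k = len(logn)
    sx, sy = sum(logn), sum(logf)
    sxx = sum(v * v for v in logn)
    sxy = sum(a * b for a, b in zip(logn, logf))
    return (k * sxy - sx * sy) / (k * sxx - sx * sx)

random.seed(0)
white_noise = [random.gauss(0, 1) for _ in range(2048)]
h = dfa_hurst(white_noise)  # uncorrelated noise should give H near 0.5
```

For the multifractal version used in the paper, the squared residuals would be averaged to the q/2 power for a range of q values, and the singularity-spectrum width derived from how the resulting exponents vary with q.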

  6. Towards the development of a DNA-sequence based approach to serotyping of Salmonella enterica

    Directory of Open Access Journals (Sweden)

    Logan Julie MJ

    2004-08-01

Background: The fliC and fljB genes in Salmonella code for the phase 1 (H1) and phase 2 (H2) flagellins, respectively, while the rfb cluster encodes the majority of enzymes for polysaccharide (O) antigen biosynthesis; together they determine the antigenic profile by which Salmonella are identified. Sequencing and characterisation of fliC were performed in the development of a molecular serotyping technique. Results: FliC sequencing of 106 strains revealed two groups: the g-complex, comprising strains exhibiting "g" or "m,t" antigenic factors, and a second, more diverse group of non-g strains. Variation in fliC was characterised and sero-specific motifs were identified. Furthermore, it was possible to identify differences in certain H antigens that are not detected by traditional serotyping. A rapid short-sequencing assay was developed to target serotype-specific sequence motifs in fliC. The assay was evaluated for identification of H1 antigens with a panel of 55 strains. Conclusion: FliC sequences were obtained for more than 100 strains comprising 29 different H1 alleles. Unique pyrosequencing profiles corresponding to the H1 component of the serotype were generated reproducibly for the 23 alleles represented in the evaluation panel. Short-read sequence assays can now be used to identify fliC alleles in approximately 97% of the 50 medically most important Salmonella in England and Wales. Capability for high-throughput testing and automation gives these assays considerable advantages over traditional methods.

  7. NiMax: a new approach to develop hadronic event generators in HEP

    International Nuclear Information System (INIS)

    Amelin, N.; Komogorov, M.

    2000-01-01

The NiMax framework is a new approach to developing, assembling and using hadronic event generators in HEP. There are several important concepts in the NiMax architecture: the components, the data file, the application domain module, the control system and the project. Here we describe these concepts, stressing their functionality

  8. Amplification and chromosomal dispersion of human endogenous retroviral sequences

    International Nuclear Information System (INIS)

    Steele, P.E.; Martin, M.A.; Rabson, A.B.; Bryan, T.; O'Brien, S.J.

    1986-01-01

    Endogenous retroviral sequences have undergone amplification events involving both viral and flanking cellular sequences. The authors cloned members of an amplified family of full-length endogenous retroviral sequences. Genomic blotting, employing a flanking cellular DNA probe derived from a member of this family, revealed a similar array of reactive bands in both humans and chimpanzees, indicating that an amplification event involving retroviral and associated cellular DNA sequences occurred before the evolutionary separation of these two primates. Southern analyses of restricted somatic cell hybrid DNA preparations suggested that endogenous retroviral segments are widely dispersed in the human genome and that amplification and dispersion events may be linked

  9. Study on the systematic approach of Markov modeling for dependability analysis of complex fault-tolerant features with voting logics

    International Nuclear Information System (INIS)

    Son, Kwang Seop; Kim, Dong Hoon; Kim, Chang Hwoi; Kang, Hyun Gook

    2016-01-01

The Markov analysis is a technique for modeling system state transitions and calculating the probability of reaching various system states. While it is a proper tool for modeling complex system designs involving timing, sequencing, repair, redundancy, and fault tolerance, as the complexity or size of the system increases, so does the number of states of interest, leading to difficulty in constructing and solving the Markov model. This paper introduces a systematic approach to Markov modeling for analyzing the dependability of a complex fault-tolerant system. The method is based on decomposition of the system into independent subsystem sets and on the system-level failure and unavailability rates of the decomposed subsystems. A Markov model for the target system is easily constructed using the system-level failure and unavailability rates of the subsystems, which can be treated separately. This approach decreases the number of states that must be considered simultaneously by building Markov models of the independent subsystems stage by stage, and it yields an exact solution for the Markov model of the whole target system. To apply this method, we construct a Markov model for the reactor protection system found in nuclear power plants, a system configured with four identical channels and various fault-tolerant architectures. The results show that the proposed method treats the complex architecture of the system efficiently while retaining the merits of the Markov model, such as time-dependent analysis and sequential process analysis. - Highlights: • A systematic approach to Markov modeling for system dependability analysis is proposed, based on independent subsystem sets and their failure and unavailability rates. • As an application example, we construct the Markov model for the digital reactor protection system configured with four identical and independent channels, and various fault-tolerant architectures. • The
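The subsystem-level quantities the method relies on can be illustrated with the smallest possible Markov model: a two-state (up/down) component whose steady-state unavailability is λ/(λ + μ). The rates below and the series combination are illustrative only, not the reactor protection system analyzed in the paper:

```python
def steady_state_unavailability(failure_rate, repair_rate):
    """Two-state (up/down) Markov model: steady-state unavailability = λ / (λ + μ)."""
    return failure_rate / (failure_rate + repair_rate)

# Two hypothetical independent subsystems (rates per hour, illustrative values)
u1 = steady_state_unavailability(failure_rate=1e-4, repair_rate=0.1)
u2 = steady_state_unavailability(failure_rate=2e-4, repair_rate=0.1)

# Combining separately solved subsystems, assuming independence:
# a series system is unavailable if either subsystem is down.
u_system = 1 - (1 - u1) * (1 - u2)
```

The paper's contribution is to make this kind of separate-then-combine treatment systematic for much larger state spaces, where solving one monolithic Markov model would be impractical.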

  10. Complex Parts, Complex Data: Why You Need to Understand What Radiation Single Event Testing Data Does and Doesn't Show and the Implications Thereof

    Science.gov (United States)

    LaBel, Kenneth A.; Berg, Melanie D.

    2015-01-01

Electronic parts (integrated circuits) have grown in complexity such that determining all failure modes and risks from single-particle event testing is impossible. In this presentation, the authors explain why this is so and provide some realism about what it means. It's all about understanding actual risks and not making assumptions.

  11. Perceptions of randomness in binary sequences: Normative, heuristic, or both?

    Science.gov (United States)

    Reimers, Stian; Donkin, Chris; Le Pelley, Mike E

    2018-03-01

When people consider a series of random binary events, such as tossing an unbiased coin and recording the sequence of heads (H) and tails (T), they tend to erroneously rate sequences with less internal structure or order (such as HTTHT) as more probable than sequences containing more structure or order (such as HHHHH). This is traditionally explained as a local representativeness effect: Participants assume that the properties of long sequences of random outcomes-such as an equal proportion of heads and tails, and little internal structure-should also apply to short sequences. However, recent theoretical work has noted that the probability of a particular sequence of, say, heads and tails of length n, occurring within a larger (>n) sequence of coin flips actually differs by sequence, so P(HHHHH) < P(HTTHT); on this account, apparently biased likelihood judgments may in fact track rational norms based on limited experience. We test these accounts. Participants in Experiment 1 rated the likelihood of occurrence for all possible strings of 4, 5, and 6 observations in a sequence of coin flips. Judgments were better explained by representativeness in alternation rate, relative proportion of heads and tails, and sequence complexity, than by objective probabilities. Experiments 2 and 3 gave similar results using incentivized binary choice procedures. Overall the evidence suggests that participants are not sensitive to variation in objective probabilities of a sub-sequence occurring; they appear to use heuristics based on several distinct forms of representativeness. Copyright © 2017 Elsevier B.V. All rights reserved.
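The objective probabilities at issue are easy to check by exhaustive enumeration: a self-overlapping pattern such as HHHHH really is less likely to occur somewhere within a longer run of flips than a low-overlap pattern such as HTTHT. A small sketch (the window length of 10 flips is an arbitrary choice):

```python
from itertools import product

def occurrence_probability(pattern, n):
    """Exact probability that `pattern` appears at least once in n fair coin flips,
    by enumerating all 2**n equally likely flip sequences."""
    hits = sum(pattern in "".join(seq) for seq in product("HT", repeat=n))
    return hits / 2 ** n

p_hhhhh = occurrence_probability("HHHHH", 10)
p_httht = occurrence_probability("HTTHT", 10)
# Self-overlapping patterns "clump" into fewer sequences, so HHHHH is rarer.
```

Here p_hhhhh works out to 112/1024, visibly below p_httht, which is the asymmetry the recent theoretical accounts point to.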

  12. A pragmatic approach to infants with Robin sequence: a retrospective cohort study and presence of a treatment algorithm.

    Science.gov (United States)

Paes, Emma C; van Nunen, Daan P F; Speleman, Lucienne; Muradin, Marvick S M; Smarius, Bram; Kon, Moshe; Mink van der Molen, Aebele B; Niers, Titia L E M; Veldhoen, Esther S; Breugem, Corstiaan C

    2015-11-01

Initial approaches to and treatments of infants with Robin sequence (RS) are diverse and inconsistent. The care of these sometimes critically ill infants involves many different medical specialties, which can make the decision process complex and difficult. To optimize the care of infants with RS, we present our institution's approach and a review of the current literature. A retrospective cohort study was conducted among 75 infants diagnosed with RS and managed at our institution in the 1996-2012 period. Additionally, the treatment regimen described in this paper was compared with recent literature on the approach to infants with RS. Forty-four infants (59%) were found to have been treated conservatively. A significantly larger proportion of nonisolated RS infants than isolated RS infants needed surgical intervention (53 vs. 25%, p = .014). A mandibular distraction was conducted in 24% (n = 18) of cases, a tracheotomy in 9% (n = 7), and a tongue-lip adhesion in 8% (n = 6). Seventy-seven percent of all infants had received temporary nasogastric tube feeding. The literature review of 31 studies showed that initial examinations and the indications for performing a surgical intervention varied and were often not clearly described. RS is a heterogeneous group with a wide spectrum of associated anomalies. As a result, the decision-making process is challenging, and a multidisciplinary approach to treatment is desirable. Current treatment options in the literature vary, and a more uniform approach is recommended. We provide a comprehensive and pragmatic approach to the analysis and treatment of infants with RS, which could serve as useful guidance for other clinics.

  13. Chemical rationale for selection of isolates for genome sequencing

    DEFF Research Database (Denmark)

    Rank, Christian; Larsen, Thomas Ostenfeld; Frisvad, Jens Christian

The advances in gene sequencing will in the near future enable researchers to affordably acquire the full genomes of handpicked isolates. We here present a method to evaluate the chemical potential of an entire species and select representatives for genome sequencing. The selection criteria for new...... strains to be sequenced can be manifold, but for studying the functional phenotype, using a metabolome-based approach offers a cheap and rapid assessment of critical strains to cover the chemical diversity. We have applied this methodology to the complex A. flavus/A. oryzae group. Though these two species...... are in principle identical, they represent two different phenotypes. This is clearly presented through a correspondence analysis of selected extrolites, in which the subtle chemical differences are visually dispersed. The results point to a handful of strains, which, if sequenced, will likely enhance our...

  14. The Iquique 2014 sequence: understanding its nucleation and propagation from the seismicity evolution

    Science.gov (United States)

    Fuenzalida, A.; Rietbrock, A.; Woollam, J.; Tavera, H.; Ruiz, S.

    2017-12-01

The Northern Chile and Southern Peru region is well known for its high seismic hazard due to the lack of recent major ruptures along long segments of the subduction interface. For this reason, the 2014 Mw 8.1 Iquique earthquake, which occurred in the Northern Chile seismic gap, was expected, and high-quality seismic and geodetic networks operating at the time of the event recorded the precursory phase of a mega-thrust event with unprecedented detail. In this study we used seismic data collected during the 2014 Iquique sequence to generate a detailed earthquake catalogue. This catalogue consists of more than 15,000 events identified in Northern Chile during the period between 1/3/14 and 31/5/14 and provides full coverage of the immediate foreshock sequence, the main-shock and the early after-shock series. The initial catalogue was obtained by automatic data processing, selecting only events with at least two associated S phases to improve the reliability of the initial locations. Subsequently, this subset of events was automatically processed again using an optimized STA/LTA triggering algorithm for both P- and S-waves, constraining the detection times by arrival times estimated at each station from the preliminary locations. Finally, all events were relocated using a recently developed 1D velocity model and associated station corrections. For events of Mw 4 or larger that occurred between 15/3/14 and 10/04/14, we estimated the regional moment tensor by full-waveform inversion. Our results confirm the seismic activation of the upper plate during the foreshock sequence, as well as highlighting crustal activity in the fore-arc during the aftershock series. The seismicity distribution was compared to previous inter-seismic coupling studies of the region, in which we observe an interplay between high- and low-coupling areas that correlates with the seismicity rate.
The spatial distribution of the seismicity and the complexities on the mechanisms observed
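The STA/LTA trigger mentioned above compares a short-term average of signal amplitude with a long-term average and declares a detection when their ratio exceeds a threshold. A toy sketch (window lengths, threshold and the synthetic trace are illustrative; production pickers operate on filtered continuous data):

```python
def sta_lta(signal, n_sta, n_lta):
    """Short-term over long-term average of absolute amplitude, one ratio per sample
    once the long-term window is full."""
    ratios = []
    for i in range(n_lta, len(signal)):
        sta = sum(abs(x) for x in signal[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in signal[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Quiet background followed by a sudden "event" onset
trace = [0.1] * 100 + [1.0] * 20
ratios = sta_lta(trace, n_sta=5, n_lta=50)
triggered = any(r > 3.0 for r in ratios)  # detection threshold of 3 is illustrative
```

The "optimized" variant used in the study constrains where along the trace such a trigger is evaluated, using arrival times predicted from the preliminary locations.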

  15. Four Essays on Family Life Events

    DEFF Research Database (Denmark)

    Loft, Lisbeth Trille Gylling

In addition, the present thesis underlines the need for an improved understanding of the role of health and caregiving as fundamental aspects of family life, and in doing so allocates increased attention to how children’s characteristics are central to family-level outcomes. Just as the lives of family......As demographic and social trends continue to change the institution of the family, a need to reconsider the study of family life events as they unfold over the life course has emerged. To advance current knowledge of social dynamics associated with this new complexity, the point of departure...... of the present thesis is the way in which individual, social, and institutional contexts shape family life events. The main objective of the present thesis is twofold: to highlight the importance of how family life events are theoretically understood and methodologically approached, and to examine why social...

  16. Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data.

    Science.gov (United States)

    Nuel, Gregory; Regad, Leslie; Martin, Juliette; Camproux, Anne-Claude

    2010-01-26

In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.). Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We aim to address this problem by deriving efficient approaches and algorithms to perform these computations, both for low- and high-complexity patterns, in the framework of homogeneous or heterogeneous Markov models. The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models; it also avoids any convolution product of the pattern distributions of the individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in complexity by taking advantage of previous computations to obtain moment generating functions efficiently. In the particular case of low- or moderate-complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms, and their relative interest in comparison with existing ones, were then tested and discussed on a toy example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets, we also compared our exact approaches to the tempting approximation that consists of concatenating the sequences in the data set into a single sequence. Our algorithms prove to be effective and able to handle real data sets with multiple sequences, as well as biological patterns of
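The core idea, Markov chain embedding via a deterministic finite automaton, can be sketched for the simplest i.i.d. (order-0) case: track the automaton state (length of the pattern prefix currently matched) jointly with the occurrence count. This toy version is far simpler than the paper's algorithms (no heterogeneous models, no moment generating functions), but it computes the same kind of exact distribution on small inputs:

```python
from collections import defaultdict

def longest_border(s):
    """Length of the longest proper prefix of s that is also a suffix."""
    for k in range(len(s) - 1, 0, -1):
        if s[-k:] == s[:k]:
            return k
    return 0

def build_transitions(pattern, alphabet):
    """delta[state][ch]: matched-prefix length after reading ch with `state` chars matched."""
    m = len(pattern)
    delta = []
    for state in range(m):
        row = {}
        for ch in alphabet:
            s = pattern[:state] + ch
            k = min(m, len(s))
            while k and s[len(s) - k:] != pattern[:k]:
                k -= 1
            row[ch] = k
        delta.append(row)
    return delta

def count_distribution(pattern, n, alphabet="HT"):
    """Exact distribution of overlapping occurrence counts of `pattern`
    in an i.i.d. uniform random sequence of length n."""
    m = len(pattern)
    delta = build_transitions(pattern, alphabet)
    restart = longest_border(pattern)   # state to resume from after a full match
    p = 1.0 / len(alphabet)
    dist = {(0, 0): 1.0}                # (automaton state, count) -> probability
    for _ in range(n):
        new = defaultdict(float)
        for (state, count), prob in dist.items():
            for ch in alphabet:
                nxt = delta[state][ch]
                if nxt == m:            # full match: increment count, follow border
                    new[(restart, count + 1)] += prob * p
                else:
                    new[(nxt, count)] += prob * p
        dist = new
    counts = defaultdict(float)
    for (state, count), prob in dist.items():
        counts[count] += prob
    return dict(counts)
```

For example, for the pattern "HH" in three fair coin flips, the sketch recovers the hand-computable distribution {0: 5/8, 1: 2/8, 2: 1/8}, counting the overlapping occurrences in HHH as two.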

  17. Environment-Gene interaction in common complex diseases: New approaches

    Directory of Open Access Journals (Sweden)

    William A. Toscano, Jr.

    2014-10-01

Approximately 100,000 different environmental chemicals, in use as high-production-volume chemicals, confront us in our daily lives. Many of the chemicals we encounter are persistent and have long half-lives in the environment and in our bodies. These compounds are referred to as persistent organic pollutants, or POPs. The total environment, however, is broader than just toxic pollutants: it includes social capital, socioeconomic status, and other factors that are not commonly considered in traditional approaches to studying environment-human interactions. The mechanism of action of environmental agents in altering the human phenotype from health to disease is more complex than once thought. The focus in public health has shifted away from the study of single-gene rare diseases and has given way to the study of multifactorial complex diseases that are common in the population. To understand common complex diseases, we need teams of scientists from different fields working together with common aims. We review some approaches for studying the action of the environment by discussing use-inspired research and transdisciplinary research approaches. The genomic era has yielded new tools for the study of gene-environment interactions, including genomics, epigenomics, and systems biology. We use environmentally driven type 2 diabetes mellitus as an example of environmental epigenomics and disease. The aim of this review is to start the conversation on how advances in biomedical science can be applied to advance public health.

  18. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  19. Cyclic Sequences, Events and Evolution of the Sino-Korean Plate,with a Discussion on the Evolution of Molar-tooth Carbonates,Phosphorites and Source Rocks

    Institute of Scientific and Technical Information of China (English)

    MENG Xianghua; GE Ming

    2003-01-01

This paper gives an account of the research that the authors conducted on the cyclic sequences, events and evolutionary history from Proterozoic to Meso-Cenozoic in the Sino-Korean plate based on the principle of the Cosmos-Earth System. The authors divided this plate into 20 super-cyclic or super-mega-cyclic periods and more than 100 Oort periods. The research focused on important sea flooding events, uplift interruption events, tilting movement events, molar-tooth carbonate events, thermal events, polarity reversal events, karst events, volcanic explosion events and storm events, as well as types of resource areas and paleotectonic evolution. By means of the isochronous theory of the Cosmos-Earth System periodicity and based on long-eccentricity periodicity, the authors elaborately studied the paleogeographic evolution of the aulacogen of the Sino-Korean plate, the oolitic beach platform formation, the development of foreland basin and continental rift valley basin, and reconstructed the evolution of tectonic paleogeography and stratigraphic framework in the Sino-Korean plate in terms of evolutionary maps. Finally, the authors gave a profound discussion on the formation and development of molar-tooth carbonates, phosphorites and source rocks.

  20. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    Raedt, H. De; Raedt, K. De; Michielsen, K.; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

In various basic experiments in quantum physics, observations are recorded event-by-event. The final outcome of such experiments can be computed according to the rules of quantum theory, but quantum theory does not describe single events. In this paper, we describe a simulation approach that does

  1. Insights into bacterioplankton community structure from Sundarbans mangrove ecoregion using Sanger and Illumina MiSeq sequencing approaches: A comparative analysis

    Directory of Open Access Journals (Sweden)

    Anwesha Ghosh

    2017-03-01

Next generation sequencing using platforms such as Illumina MiSeq provides a deeper insight into the structure and function of bacterioplankton communities in coastal ecosystems compared to traditional molecular techniques such as clone library approach which incorporates Sanger sequencing. In this study, structure of bacterioplankton communities was investigated from two stations of Sundarbans mangrove ecoregion using both Sanger and Illumina MiSeq sequencing approaches. The Illumina MiSeq data is available under the BioProject ID PRJNA35180 and Sanger sequencing data under accession numbers KX014101-KX014140 (Stn1 and KX014372-KX014410 (Stn3. Proteobacteria-, Firmicutes- and Bacteroidetes-like sequences retrieved from both approaches appeared to be abundant in the studied ecosystem. The Illumina MiSeq data (2.1 GB provided a deeper insight into the structure of bacterioplankton communities and revealed the presence of bacterial phyla such as Actinobacteria, Cyanobacteria, Tenericutes, Verrucomicrobia which were not recovered based on Sanger sequencing. A comparative analysis of bacterioplankton communities from both stations highlighted the presence of genera that appear in both stations and genera that occur exclusively in either station. However, both the Sanger sequencing and Illumina MiSeq data were coherent at broader taxonomic levels. Pseudomonas, Devosia, Hyphomonas and Erythrobacter-like sequences were the abundant bacterial genera found in the studied ecosystem. Both the sequencing methods showed broad coherence although as expected the Illumina MiSeq data helped identify rarer bacterioplankton groups and also showed the presence of unassigned OTUs indicating possible presence of novel bacterioplankton from the studied mangrove ecosystem.

  2. Application of Massively Parallel Sequencing in the Clinical Diagnostic Testing of Inherited Cardiac Conditions

    Directory of Open Access Journals (Sweden)

    Ivone U. S. Leong

    2014-06-01

Sudden cardiac death in people between the ages of 1–40 years is a devastating event and is frequently caused by several heritable cardiac disorders. These disorders include cardiac ion channelopathies, such as long QT syndrome, catecholaminergic polymorphic ventricular tachycardia and Brugada syndrome and cardiomyopathies, such as hypertrophic cardiomyopathy and arrhythmogenic right ventricular cardiomyopathy. Through careful molecular genetic evaluation of DNA from sudden death victims, the causative gene mutation can be uncovered, and the rest of the family can be screened and preventative measures implemented in at-risk individuals. The current screening approach in most diagnostic laboratories uses Sanger-based sequencing; however, this method is time consuming and labour intensive. The development of massively parallel sequencing has made it possible to produce millions of sequence reads simultaneously and is potentially an ideal approach to screen for mutations in genes that are associated with sudden cardiac death. This approach offers mutation screening at reduced cost and turnaround time. Here, we will review the current commercially available enrichment kits, massively parallel sequencing (MPS platforms, downstream data analysis and its application to sudden cardiac death in a diagnostic environment.

  3. Quantitative risk trends deriving from PSA-based event analyses. Analysis of results from U.S.NRC's accident sequence precursor program

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2004-01-01

The United States Nuclear Regulatory Commission (U.S.NRC) has been carrying out the Accident Sequence Precursor (ASP) Program to identify and categorize precursors to potential severe core damage accident sequences using the probabilistic safety assessment (PSA) technique. The ASP Program has identified many risk-significant events at U.S. nuclear power plants as precursors. Although the results from the ASP Program include valuable information that could be useful for obtaining and characterizing risk-significant insights and for monitoring risk trends in the nuclear power industry, there have been only a few attempts to determine and develop the trends using the ASP results. The present study examines and discusses quantitative risk trends at the industry level, using two indicators deriving from the results of the ASP analysis: the occurrence frequency of precursors and the annual core damage probability. It is shown that the core damage risk at U.S. nuclear power plants has been lowered and the likelihood of risk-significant events has been decreasing remarkably. As well, the present study demonstrates that the two risk indicators used here can provide quantitative information useful for examining and monitoring risk trends and/or risk characteristics in the nuclear power industry. (author)
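One simple way to form an annual risk indicator from precursor results is to aggregate the conditional core damage probabilities (CCDPs) of the year's precursors: assuming the events are independent, the probability of at least one core damage event in the year is 1 − Π(1 − pᵢ). The CCDP values below are hypothetical, not ASP results:

```python
def annual_core_damage_probability(ccdps):
    """Probability of at least one core damage event in a year, given the
    conditional core damage probabilities of that year's precursors
    (independence assumed)."""
    prod = 1.0
    for p in ccdps:
        prod *= 1.0 - p
    return 1.0 - prod

# Three hypothetical precursor CCDPs for one year
acdp = annual_core_damage_probability([1e-5, 3e-6, 2e-4])
```

For small probabilities this is approximately the plain sum of the CCDPs, which is why precursor counts and summed CCDPs track each other closely as trend indicators.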

  4. Rapid determination of anti-tuberculosis drug resistance from whole-genome sequences

    KAUST Repository

    Coll, Francesc

    2015-05-27

    Mycobacterium tuberculosis drug resistance (DR) challenges effective tuberculosis disease control. Current molecular tests examine limited numbers of mutations, and although whole genome sequencing approaches could fully characterise DR, data complexity has restricted their clinical application. A library (1,325 mutations) predictive of DR for 15 anti-tuberculosis drugs was compiled and validated for 11 of them using genomic-phenotypic data from 792 strains. A rapid online ‘TB-Profiler’ tool was developed to report DR and strain-type profiles directly from raw sequences. Using our DR mutation library, in silico diagnostic accuracy was superior to some commercial diagnostics and alternative databases. The library will facilitate sequence-based drug-susceptibility testing.
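At its core, library-based resistance prediction of this kind is a lookup of observed variants against a curated mutation-to-drug map. The sketch below is hypothetical and does not reflect the actual TB-Profiler library format, although the two mutations shown are well-known resistance markers:

```python
# Hypothetical mutation -> drug map (illustrative entries, not the 1,325-mutation library)
resistance_library = {
    ("rpoB", "S450L"): "rifampicin",
    ("katG", "S315T"): "isoniazid",
}

def predict_resistance(observed_variants):
    """Return the set of drugs for which a known resistance mutation was observed.
    `observed_variants` is a list of (gene, amino-acid change) tuples."""
    drugs = set()
    for variant in observed_variants:
        if variant in resistance_library:
            drugs.add(resistance_library[variant])
    return drugs

predicted = predict_resistance([("rpoB", "S450L"), ("gyrA", "D94G")])
```

A variant absent from the library (here the hypothetical gyrA call) simply contributes no prediction, which is why library completeness drives the in silico diagnostic accuracy the record describes.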

  5. A comparative evaluation of sequence classification programs

    Directory of Open Access Journals (Sweden)

    Bazinet Adam L

    2012-05-01

Background: A fundamental problem in modern genomics is to taxonomically or functionally classify DNA sequence fragments derived from environmental sampling (i.e., metagenomics). Several different methods have been proposed for doing this effectively and efficiently, and many have been implemented in software. In addition to varying their basic algorithmic approach to classification, some methods screen sequence reads for 'barcoding genes' like 16S rRNA, or various types of protein-coding genes. Due to the sheer number and complexity of methods, it can be difficult for a researcher to choose one that is well-suited for a particular analysis. Results: We divided the very large number of programs that have been released in recent years for solving the sequence classification problem into three main categories based on the general algorithm they use to compare a query sequence against a database of sequences. We also evaluated the performance of the leading programs in each category on data sets whose taxonomic and functional composition is known. Conclusions: We found significant variability in classification accuracy, precision, and resource consumption of sequence classification programs when used to analyze various metagenomics data sets. However, we observe some general trends and patterns that will be useful to researchers who use sequence classification programs.
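As a minimal illustration of composition-based classification, one broad algorithmic family such surveys cover, a query read can be assigned to the reference sharing the most k-mers. This toy scheme stands in for none of the evaluated programs specifically:

```python
from collections import Counter

def kmer_profile(seq, k=3):
    """Multiset of overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def classify(query, references, k=3):
    """Assign `query` to the reference (name -> sequence dict) whose k-mer
    profile shares the most k-mers with the query's profile."""
    qp = kmer_profile(query, k)

    def shared(ref_seq):
        rp = kmer_profile(ref_seq, k)
        return sum(min(qp[m], rp[m]) for m in qp)

    return max(references, key=lambda name: shared(references[name]))

# Hypothetical references and query, for demonstration only
refs = {"refA": "AAAAAAAAAAAA", "refB": "ACGTACGTACGT"}
best = classify("ACGTACG", refs)
```

Real classifiers in this family refine the same idea with much longer k-mers, taxonomic trees, and indexed databases; the other two families in the survey rely on alignment or on phylogenetic placement instead.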

  6. Learning and Recognition of a Non-conscious Sequence of Events in Human Primary Visual Cortex.

    Science.gov (United States)

    Rosenthal, Clive R; Andrews, Samantha K; Antoniades, Chrystalina A; Kennard, Christopher; Soto, David

    2016-03-21

    Human primary visual cortex (V1) has long been associated with learning simple low-level visual discriminations [1] and is classically considered outside of neural systems that support high-level cognitive behavior in contexts that differ from the original conditions of learning, such as recognition memory [2, 3]. Here, we used a novel fMRI-based dichoptic masking protocol-designed to induce activity in V1, without modulation from visual awareness-to test whether human V1 is implicated in human observers rapidly learning and then later (15-20 min) recognizing a non-conscious and complex (second-order) visuospatial sequence. Learning was associated with a change in V1 activity, as part of a temporo-occipital and basal ganglia network, which is at variance with the cortico-cerebellar network identified in prior studies of "implicit" sequence learning that involved motor responses and visible stimuli (e.g., [4]). Recognition memory was associated with V1 activity, as part of a temporo-occipital network involving the hippocampus, under conditions that were not imputable to mechanisms associated with conscious retrieval. Notably, the V1 responses during learning and recognition separately predicted non-conscious recognition memory, and functional coupling between V1 and the hippocampus was enhanced for old retrieval cues. The results provide a basis for novel hypotheses about the signals that can drive recognition memory, because these data (1) identify human V1 with a memory network that can code complex associative serial visuospatial information and support later non-conscious recognition memory-guided behavior (cf. [5]) and (2) align with mouse models of experience-dependent V1 plasticity in learning and memory [6]. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Whole exome sequencing in neurogenetic odysseys: An effective, cost- and time-saving diagnostic approach.

    Directory of Open Access Journals (Sweden)

    Marta Córdoba

    Full Text Available Diagnostic trajectories for neurogenetic disorders frequently require considerable time and resources, exposing patients and families to so-called "diagnostic odysseys". Previous studies have provided strong evidence for the increased diagnostic and clinical utility of whole-exome sequencing (WES) in medical genetics. However, specific reports assessing its utility in a setting such as ours (a neurogeneticist-led academic group serving in a low-income country) are rare. Our aim was to assess the diagnostic yield of WES in patients suspected of having a neurogenetic condition and to explore the cost-effectiveness of its implementation in a research group located in an Argentinean public hospital. This is a prospective study of the clinical utility of WES in a series of 40 consecutive patients selected from the Neurogenetic Clinic of a tertiary hospital in Argentina. We also evaluated patients retrospectively for their previous diagnostic trajectories. Diagnostic yield, clinical impact on management and the economic diagnostic burden were evaluated. We demonstrated the clinical utility of WES in our patient cohort, obtaining a diagnostic yield of 40% (95% CI, 24.8%-55.2%) among a diverse group of neurological disorders. The average age at the time of WES was 23 years (range 3-70). The mean time elapsed from symptom onset to WES was 11 years (range 3-42). The mean cost of the diagnostic workup prior to WES was USD 1646 (USD 1439 to 1853), which is 60% higher than the cost of WES in our center. WES for neurogenetics proved to be an effective, cost- and time-saving approach to the molecular diagnosis of this heterogeneous and complex group of patients.
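The reported 95% CI (24.8%-55.2% for 16 of 40 diagnoses) can be reproduced with a simple normal-approximation (Wald) interval; the abstract does not state which method the authors used, so this is only an illustrative sketch:

```python
import math

def diagnostic_yield_ci(successes, n, z=1.96):
    """Proportion with a normal-approximation (Wald) 95% confidence interval."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of a proportion
    return p, p - z * se, p + z * se

p, lo, hi = diagnostic_yield_ci(16, 40)
print(f"yield = {p:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
# yield = 40.0%, 95% CI 24.8%-55.2%
```

The Wald bounds match the abstract's figures to one decimal place, which suggests (but does not prove) this is the interval the authors computed.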

  8. Risk-oriented approach application at planning and organizing antiepidemic provision of mass events

    Directory of Open Access Journals (Sweden)

    D.V. Efremenko

    2017-03-01

    Full Text Available Mass events tend to become increasingly hazardous to population health, as they give rise to various health risks, including risks of infectious pathology. Our research goal was to develop scientifically grounded approaches to assessing and managing epidemiologic risks, and to analyze how they were applied during preparations for the Olympics-2014, the Games themselves, and other mass events held in 2014-2016. We assessed the risks of epidemiologic complications using diagnostic test systems and a new technique that allowed for the peculiarities of mass events. The technique is based on ranking infections into three potential-danger categories according to defined criteria representing quantitative and qualitative predictive parameters (predictors). Applying a risk-oriented approach and multi-factor analysis allowed us to determine the maximum possible requirements for ensuring sanitary-epidemiologic welfare for each separate nosologic form. Enhancing our laboratory base with test systems for specific indication, in line with these calculations, enabled us, on the one hand, to secure the required preparations and, on the other hand, to avoid unnecessary expenditures. To facilitate decision-making during the Olympics-2014 we used an innovative product, namely a computer program based on a geoinformation system (GIS). It simplified and accelerated information exchange within intra- and interdepartmental interaction. A "dynamic epidemiologic threshold" was calculated daily for measles, chickenpox, acute enteric infections and acute respiratory viral infections of various etiologies; if it was exceeded, or an "epidemiologic spot" became possible for one or several nosologies, an automatic warning appeared in the GIS. Planning prevention activities regarding feral herd infections and zoogenous extremely dangerous infections which were endemic
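The abstract does not specify how the "dynamic epidemiologic threshold" was computed; a common surveillance heuristic of this kind flags a day whose case count exceeds the trailing mean plus two standard deviations. A minimal sketch under that assumption (the function names and the 14-day history are hypothetical):

```python
from statistics import mean, stdev

def dynamic_threshold(history, z=2.0):
    """Daily alert threshold from a trailing window of case counts:
    mean + z sample standard deviations (a common surveillance heuristic)."""
    return mean(history) + z * stdev(history)

def check_day(history, today_count, z=2.0):
    """Return (alert, threshold) for today's count against the trailing window."""
    threshold = dynamic_threshold(history, z)
    return today_count > threshold, threshold

# Hypothetical 14-day history of daily counts for one nosologic form
history = [3, 4, 2, 5, 3, 4, 6, 3, 2, 4, 5, 3, 4, 2]
alert, thr = check_day(history, today_count=9)
print(alert)  # expected: True (9 cases exceeds the ~6.0 threshold)
```

In a GIS-backed system such as the one described, the alert flag would trigger the automatic warning for that nosology.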

  9. An Ensemble Approach for Emotion Cause Detection with Event Extraction and Multi-Kernel SVMs

    Institute of Scientific and Technical Information of China (English)

    Ruifeng Xu; Jiannan Hu; Qin Lu; Dongyin Wu; Lin Gui

    2017-01-01

    In this paper, we present a new and challenging task for emotion analysis, namely emotion cause extraction. In this task, we focus on detecting the cause of an emotion, i.e., the reason for or stimulant of an emotion, rather than on regular emotion classification or emotion component extraction. Since no open dataset for this task was available, we first designed and annotated an emotion cause dataset that follows the scheme of the W3C Emotion Markup Language. We then present an emotion cause detection method that uses an event extraction framework, in which a tree-structure-based representation method is used to represent events. Since the distribution of events in the training data is imbalanced, we propose an under-sampling-based bagging algorithm to address this problem. Even with a limited training set, the proposed approach can still extract sufficient features for analysis through a bagging of multi-kernel-based SVMs. Evaluations show that our approach achieves an F-measure 7.04% higher than the state-of-the-art methods.
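The under-sampling-based bagging idea can be sketched independently of the paper's multi-kernel SVMs: each ensemble member trains on a class-balanced resample, and predictions are combined by majority vote. In this sketch a trivial nearest-centroid learner stands in for the SVM base classifier, and all names and data are illustrative:

```python
import random
from collections import Counter

def undersample(X, y, seed):
    """Balance classes by randomly down-sampling each class to the minority size."""
    rng = random.Random(seed)
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    n_min = min(len(v) for v in by_class.values())
    Xb, yb = [], []
    for label, items in by_class.items():
        for xi in rng.sample(items, n_min):
            Xb.append(xi)
            yb.append(label)
    return Xb, yb

class NearestCentroid:
    """Stand-in base learner (the paper uses multi-kernel SVMs instead)."""
    def fit(self, X, y):
        sums = {}
        for xi, yi in zip(X, y):
            s, n = sums.get(yi, (0.0, 0))
            sums[yi] = (s + xi, n + 1)
        self.centroids = {c: s / n for c, (s, n) in sums.items()}
        return self
    def predict(self, x):
        return min(self.centroids, key=lambda c: abs(x - self.centroids[c]))

def bagging_predict(models, x):
    """Majority vote over the ensemble."""
    votes = Counter(m.predict(x) for m in models)
    return votes.most_common(1)[0][0]

# Imbalanced 1-D toy data: many negatives near 0, few positives near 10
X = [0.1, 0.3, -0.2, 0.0, 0.4, 0.2, -0.1, 9.8, 10.1]
y = [0] * 7 + [1] * 2
models = [NearestCentroid().fit(*undersample(X, y, seed)) for seed in range(5)]
print(bagging_predict(models, 9.5))  # expected: 1
```

Each resample discards different majority-class examples, so the ensemble as a whole still sees most of the majority class while every individual learner trains on balanced data.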

  10. Post-contrast T1-weighted sequences in pediatric abdominal imaging: comparative analysis of three different sequences and imaging approach

    Energy Technology Data Exchange (ETDEWEB)

    Roque, Andreia; Ramalho, Miguel; AlObaidy, Mamdoh; Heredia, Vasco; Burke, Lauren M.; De Campos, Rafael O.P.; Semelka, Richard C. [University of North Carolina at Chapel Hill, Department of Radiology, Chapel Hill, NC (United States)

    2014-10-15

    Post-contrast T1-weighted imaging is an essential component of a comprehensive pediatric abdominopelvic MR examination. However, consistently good image quality is challenging, as respiratory motion in sedated children can substantially degrade the image quality. To compare the image quality of three different post-contrast T1-weighted imaging techniques - standard three-dimensional gradient-echo (3-D-GRE), magnetization-prepared gradient-recall echo (MP-GRE) and 3-D-GRE with radial data sampling (radial 3-D-GRE) - acquired in pediatric patients younger than 5 years of age. Sixty consecutive exams performed in 51 patients (23 females, 28 males; mean age 2.5 ± 1.4 years) constituted the final study population. Thirty-nine scans were performed at 3 T and 21 scans were performed at 1.5 T. Two reviewers independently and blindly evaluated all sequences qualitatively to determine image quality and extent of artifacts. MP-GRE and radial 3-D-GRE sequences had the least respiratory motion (P < 0.0001). Standard 3-D-GRE sequences displayed the lowest average score ratings in hepatic and pancreatic edge definition, hepatic vessel clarity and overall image quality. Radial 3-D-GRE sequences showed the highest score ratings in overall image quality. Our preliminary results support the preference of fat-suppressed radial 3-D-GRE as the best post-contrast T1-weighted imaging approach for patients under the age of 5 years, when dynamic imaging is not essential. (orig.)

  11. Hydrogeological hazards and weather events: Triggering and evolution of shallow landslides

    Directory of Open Access Journals (Sweden)

    Salvatore Monteleone

    2014-06-01

    The complex nature of these instability events affecting anthropized areas does not allow site-specific approaches to defending individual assets; a more effective solution is based instead on extensive knowledge of the territory, ideally at the scale of one or several watersheds.

  12. Three Dimensional Simulation of the Baneberry Nuclear Event

    Science.gov (United States)

    Lomov, Ilya N.; Antoun, Tarabay H.; Wagoner, Jeff; Rambo, John T.

    2004-07-01

    Baneberry, a 10-kiloton nuclear event, was detonated at a depth of 278 m at the Nevada Test Site on December 18, 1970. Shortly after detonation, radioactive gases emanating from the cavity were released into the atmosphere through a shock-induced fissure near surface ground zero. Extensive geophysical investigations, coupled with a series of 1D and 2D computational studies were used to reconstruct the sequence of events that led to the catastrophic failure. However, the geological profile of the Baneberry site is complex and inherently three-dimensional, which meant that some geological features had to be simplified or ignored in the 2D simulations. This left open the possibility that features unaccounted for in the 2D simulations could have had an important influence on the eventual containment failure of the Baneberry event. This paper presents results from a high-fidelity 3D Baneberry simulation based on the most accurate geologic and geophysical data available. The results are compared with available data, and contrasted against the results of the previous 2D computational studies.

  13. Determination of haplotypes at structurally complex regions using emulsion haplotype fusion PCR.

    Science.gov (United States)

    Tyson, Jess; Armour, John A L

    2012-12-11

    Genotyping and massively-parallel sequencing projects result in a vast amount of diploid data that is only rarely resolved into its constituent haplotypes. It is nevertheless this phased information that is transmitted from one generation to the next and is most directly associated with biological function and the genetic causes of biological effects. Despite progress made in genome-wide sequencing and phasing algorithms and methods, problems assembling (and reconstructing linear haplotypes in) regions of repetitive DNA and structural variation remain. These dynamic and structurally complex regions are often poorly understood from a sequence point of view. Regions such as these that are highly similar in their sequence tend to be collapsed onto the genome assembly. This in turn means downstream determination of the true sequence haplotype in these regions poses a particular challenge. For structurally complex regions, a more focussed approach to assembling haplotypes may be required. In order to investigate reconstruction of spatial information at structurally complex regions, we have used an emulsion haplotype fusion PCR approach to reproducibly link sequences of up to 1kb in length to allow phasing of multiple variants from neighbouring loci, using allele-specific PCR and sequencing to detect the phase. By using emulsion systems linking flanking regions to amplicons within the CNV, this led to the reconstruction of a 59kb haplotype across the DEFA1A3 CNV in HapMap individuals. This study has demonstrated a novel use for emulsion haplotype fusion PCR in addressing the issue of reconstructing structural haplotypes at multiallelic copy variable regions, using the DEFA1A3 locus as an example.
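The core phasing logic, reading which alleles co-occur on the same physically fused amplicon, can be illustrated with a toy tally. This is not the authors' pipeline, only a sketch of the principle; the read tuples and the chimera-filtering threshold are invented:

```python
from collections import Counter

def phase_from_linked_reads(reads, min_support=2):
    """Infer the two haplotypes of a diploid individual from reads that
    physically link allele calls at two neighbouring sites, as fused
    amplicons do in emulsion haplotype fusion PCR. Rare allele pairings
    (below min_support) are treated as chimeric artefacts and dropped."""
    counts = Counter(reads)
    return [pair for pair, n in counts.most_common(2) if n >= min_support]

# Hypothetical fused-amplicon reads: (allele at site 1, allele at site 2).
# The single ("A", "G") read is a chimeric artefact.
reads = [("A", "T"), ("C", "G"), ("A", "T"), ("C", "G"), ("A", "G"), ("A", "T")]
print(phase_from_linked_reads(reads))  # expected: [('A', 'T'), ('C', 'G')]
```

The physical linkage is what makes the phase observable at all; without it, the same diploid genotype (A/C at site 1, T/G at site 2) is equally consistent with both possible haplotype configurations.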

  14. Determination of haplotypes at structurally complex regions using emulsion haplotype fusion PCR

    Directory of Open Access Journals (Sweden)

    Tyson Jess

    2012-12-01

    Full Text Available Abstract Background Genotyping and massively-parallel sequencing projects result in a vast amount of diploid data that is only rarely resolved into its constituent haplotypes. It is nevertheless this phased information that is transmitted from one generation to the next and is most directly associated with biological function and the genetic causes of biological effects. Despite progress made in genome-wide sequencing and phasing algorithms and methods, problems assembling (and reconstructing linear haplotypes in) regions of repetitive DNA and structural variation remain. These dynamic and structurally complex regions are often poorly understood from a sequence point of view. Regions such as these that are highly similar in their sequence tend to be collapsed onto the genome assembly. This in turn means downstream determination of the true sequence haplotype in these regions poses a particular challenge. For structurally complex regions, a more focussed approach to assembling haplotypes may be required. Results In order to investigate reconstruction of spatial information at structurally complex regions, we have used an emulsion haplotype fusion PCR approach to reproducibly link sequences of up to 1kb in length to allow phasing of multiple variants from neighbouring loci, using allele-specific PCR and sequencing to detect the phase. By using emulsion systems linking flanking regions to amplicons within the CNV, this led to the reconstruction of a 59kb haplotype across the DEFA1A3 CNV in HapMap individuals. Conclusion This study has demonstrated a novel use for emulsion haplotype fusion PCR in addressing the issue of reconstructing structural haplotypes at multiallelic copy variable regions, using the DEFA1A3 locus as an example.

  15. Sequence characterisation of deletion breakpoints in the dystrophin gene by PCR

    Energy Technology Data Exchange (ETDEWEB)

    Abbs, S.; Sandhu, S.; Bobrow, M. [Guy's Hospital, London (United Kingdom)

    1994-09-01

    Partial deletions of the dystrophin gene account for 65% of cases of Duchenne muscular dystrophy. A high proportion of these structural changes are generated by new mutational events, and lie predominantly within two "hotspot" regions, yet the underlying reasons for this are not known. We are characterizing and sequencing the regions surrounding deletion breakpoints in order to: (i) investigate the mechanisms of deletion mutation, and (ii) enable the design of PCR assays to specifically amplify mutant and normal sequences, allowing us to search for the presence of somatic mosaicism in appropriate family members. Using this approach we have been able to demonstrate the presence of somatic mosaicism in a maternal grandfather of a DMD-affected male, deleted for exons 49-50. Three deletions, namely of exons 48-49, 49-50, and 50, have been characterized using a PCR approach that avoids any cloning procedures. Breakpoints were initially localized to within regions of a few kilobases using Southern blot restriction analyses with exon-specific probes and PCR amplification of exonic and intronic loci. Sequencing was performed directly on PCR products: (i) mutant sequences were obtained from long-range or inverse-PCR across the deletion junction fragments, and (ii) normal sequences were obtained from the products of standard PCR, vectorette PCR, or inverse-PCR performed on YACs. Further characterization of intronic sequences will allow us to amplify and sequence across other deletion breakpoints and increase our knowledge of the mechanisms of mutation in the dystrophin gene.

  16. Chiron: translating nanopore raw signal directly into nucleotide sequence using deep learning

    KAUST Repository

    Teng, Haotian; Cao, Minh Duc; Hall, Michael B; Duarte, Tania; Wang, Sheng; Coin, Lachlan J M

    2018-01-01

    Sequencing by translocating DNA fragments through an array of nanopores is a rapidly maturing technology that offers faster and cheaper sequencing than other approaches. However, accurately deciphering the DNA sequence from the noisy and complex electrical signal is challenging. Here, we report Chiron, the first deep learning model to achieve end-to-end basecalling and directly translate the raw signal to DNA sequence without the error-prone segmentation step. Trained with only a small set of 4,000 reads, we show that our model provides state-of-the-art basecalling accuracy, even on previously unseen species. Chiron achieves basecalling speeds of more than 2,000 bases per second using desktop computer graphics processing units.

  17. Chiron: translating nanopore raw signal directly into nucleotide sequence using deep learning

    KAUST Repository

    Teng, Haotian

    2018-04-10

    Sequencing by translocating DNA fragments through an array of nanopores is a rapidly maturing technology that offers faster and cheaper sequencing than other approaches. However, accurately deciphering the DNA sequence from the noisy and complex electrical signal is challenging. Here, we report Chiron, the first deep learning model to achieve end-to-end basecalling and directly translate the raw signal to DNA sequence without the error-prone segmentation step. Trained with only a small set of 4,000 reads, we show that our model provides state-of-the-art basecalling accuracy, even on previously unseen species. Chiron achieves basecalling speeds of more than 2,000 bases per second using desktop computer graphics processing units.
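Chiron's segmentation-free output step is a CTC-style decoding of per-timestep label probabilities. A minimal greedy CTC decoder (not the authors' code, which pairs a CNN+RNN encoder with beam search) can be sketched as follows; the toy `logits` rows over the labels A, C, G, T, blank are invented:

```python
def ctc_greedy_decode(logits, alphabet="ACGT", blank=4):
    """Greedy CTC decoding: take the argmax label at each time step,
    collapse consecutive repeats, then drop blanks. This is what lets a
    network emit a base sequence directly from raw-signal windows,
    without an explicit event-segmentation step."""
    path = [max(range(len(frame)), key=frame.__getitem__) for frame in logits]
    decoded, prev = [], None
    for label in path:
        if label != prev and label != blank:
            decoded.append(alphabet[label])
        prev = label
    return "".join(decoded)

# Toy per-timestep probabilities over [A, C, G, T, blank]
logits = [
    [0.9, 0.0, 0.0, 0.0, 0.1],  # A
    [0.8, 0.1, 0.0, 0.0, 0.1],  # A again -> collapsed with the previous A
    [0.0, 0.0, 0.0, 0.1, 0.9],  # blank -> separates repeated bases
    [0.7, 0.1, 0.1, 0.0, 0.1],  # A after a blank -> a genuine second A
    [0.0, 0.9, 0.0, 0.0, 0.1],  # C
    [0.0, 0.0, 0.0, 0.9, 0.1],  # T
]
print(ctc_greedy_decode(logits))  # expected: AACT
```

The blank symbol is what allows homopolymer runs (here "AA") to survive repeat-collapsing, which is exactly why CTC suits basecalling, where the signal dwells on each base for a variable number of samples.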

  18. ReRep: Computational detection of repetitive sequences in genome survey sequences (GSS

    Directory of Open Access Journals (Sweden)

    Alves-Ferreira Marcelo

    2008-09-01

    Full Text Available Abstract Background Genome survey sequences (GSS) offer a preliminary global view of a genome since, unlike ESTs, they cover coding as well as non-coding DNA and include repetitive regions of the genome. A more precise estimation of the nature, quantity and variability of repetitive sequences very early in a genome sequencing project is of considerable importance, as such data strongly influence the estimation of genome coverage, library quality and progress in scaffold construction. The elimination of repetitive sequences from the initial assembly process is also important to avoid errors and unnecessary complexity. Repetitive sequences are also of interest in a variety of other studies, for instance as molecular markers. Results We designed and implemented a straightforward pipeline called ReRep, which combines bioinformatics tools for identifying repetitive structures in a GSS dataset. In a case study, we first applied the pipeline to a set of 970 GSSs sequenced in our laboratory from the human pathogen Leishmania braziliensis, the causative agent of leishmaniasis, an important public health problem in Brazil. We also verified the applicability of ReRep to new sequencing technologies using a set of 454 reads from Escherichia coli. The behaviour of several parameters in the algorithm is evaluated and suggestions are made for tuning the analysis. Conclusion The ReRep approach for identifying repetitive elements in GSS datasets proved to be straightforward and efficient. Several potential repetitive sequences were found in a L. braziliensis GSS dataset generated in our laboratory, and further validated by the analysis of a more complete genomic dataset from the EMBL and Sanger Centre databases. ReRep also identified most of the E. coli K12 repeats prior to assembly in an example dataset obtained by automated sequencing using 454 technology. The parameters controlling the algorithm behaved consistently and may be tuned to the properties
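The frequency-based core of repeat detection can be sketched with a k-mer counter: k-mers occurring far more often than single-copy coverage would predict are candidate repeats. ReRep itself combines several tools (masking, clustering, validation), so this is only an illustration of the underlying idea, with invented reads and thresholds:

```python
from collections import Counter

def repeat_kmers(reads, k=8, min_count=3):
    """Return k-mers whose frequency across the reads meets min_count,
    flagging them as candidate repetitive sequence."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return {kmer: n for kmer, n in counts.items() if n >= min_count}

# Toy reads carrying a tandem "ACGT" repeat
reads = ["ACGTACGTACGT", "ACGTACGTACGT"]
print(repeat_kmers(reads))  # expected: {'ACGT' * 2: 4}, i.e. {'ACGTACGT': 4}
```

In a real GSS project the `min_count` cutoff would be set relative to the expected genome coverage, which is precisely one of the parameters the abstract says must be tuned.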

  19. An ensemble approach to the evolution of complex systems

    Indian Academy of Sciences (India)

    2014-03-15

    Arpağ G and Erzan A 2014 An ensemble approach to the evolution of complex systems. J. Biosci. (15 March 2014).

  20. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    Science.gov (United States)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.
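The conditional probabilities described here can be estimated from a synthetic catalog by simple counting: for every qualifying trigger event, check whether a qualifying follow-on event occurs within the time window. A toy sketch (the catalog values are invented, not RSQSim output):

```python
def conditional_prob(catalog, m_trigger, m_follow, window):
    """Estimate P(an M >= m_follow event occurs within `window` time units
    after an M >= m_trigger event) by counting over a synthetic catalog
    of (time, magnitude) pairs."""
    triggers = [t for t, m in catalog if m >= m_trigger]
    followed = 0
    for t0 in triggers:
        # Does any later event fall inside the window and exceed m_follow?
        if any(t0 < t <= t0 + window and m >= m_follow for t, m in catalog):
            followed += 1
    return followed / len(triggers) if triggers else 0.0

# Hypothetical toy catalog: (time in years, magnitude)
catalog = [(1.0, 7.1), (1.2, 6.5), (5.0, 7.3), (30.0, 7.0), (30.1, 7.2)]
print(conditional_prob(catalog, m_trigger=7.0, m_follow=7.0, window=1.0))
# expected: 0.25 (only the M7.0 at t=30.0 is followed within a year)
```

With the multi-million-event catalogs described in the abstract, the same counting yields stable conditional probabilities for rare multi-event sequences that are far too infrequent to estimate from the instrumental record.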